
Limitations of Get-ChildItem and Get-WmiObject


For quite some time I have wanted to search very large file servers in a cluster for certain file extensions, with file paths that exceed 260 characters. I have tried Get-ChildItem with -Include, -Filter, and so on, but it either crashes, stops because of memory issues, or stops without any error.

I filter upfront and use the pipeline, but it still breaks somewhere along the way.
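For reference, this is the kind of streaming enumeration I have been experimenting with. It is only a minimal sketch, and it assumes PowerShell 7+, where .NET handles paths beyond 260 characters natively and EnumerationOptions can skip inaccessible folders instead of aborting:

# Assumes PowerShell 7+ (.NET Core); on Windows PowerShell 5.1 the same calls
# can throw PathTooLongException. Results stream one file at a time, so the
# memory footprint stays flat no matter how many files match.
$opts = [System.IO.EnumerationOptions]::new()
$opts.RecurseSubdirectories = $true
$opts.IgnoreInaccessible    = $true    # skip access-denied folders instead of failing

[System.IO.Directory]::EnumerateFiles('P:\', '*.pst', $opts) |
    ForEach-Object {
        $f = [System.IO.FileInfo]::new($_)
        [pscustomobject]@{
            Name          = $f.Name
            MB            = '{0:N2}' -f ($f.Length / 1MB)
            Directory     = $f.DirectoryName
            CreationTime  = $f.CreationTime
            LastAccess    = $f.LastAccessTime
            LastWriteTime = $f.LastWriteTime
        }
    } |
    Export-Csv 'c:\temp\outpst-enum.csv' -NoTypeInformation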

The Get-WmiObject script looks like this:

$query="Select * from CIM_Datafile where drive='P:' AND extension='pst'"

Get-WmiObject -query $query | 

Select-object @{Name="Name";expression={$_.FileName}},@{Name="MB";expression={"{0:N2}" -f ($_.FileSize / 1MB)}},@{Name="Directory";expression={$_.Path}},@{Name="CreationTime";expression={([WMI]"").ConvertToDateTime($_.CreationDate)}},@{Name="LastAccess";expression={([WMI]"").ConvertToDateTime($_.LastAccessed)}},@{Name="LastWriteTime";expression={([WMI]"").ConvertToDateTime($_.LastModified)}}|

export-csv 'c:\temp\outpst11--.csv' -NoType
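A variant I am considering is Get-CimInstance, the PowerShell 3+ replacement for Get-WmiObject. This is just a sketch on the assumption that the same WQL query behaves identically there; one real simplification is that the CIM cmdlets convert the DMTF date strings to DateTime automatically, which drops the ([WMI]'').ConvertToDateTime() helpers:

# Same $query as above; CIM cmdlets return CreationDate/LastAccessed/LastModified
# as DateTime objects already, so no manual conversion is needed.
Get-CimInstance -Query $query |
    Select-Object @{Name='Name';      Expression={$_.FileName}},
                  @{Name='MB';        Expression={'{0:N2}' -f ($_.FileSize / 1MB)}},
                  @{Name='Directory'; Expression={$_.Path}},
                  CreationDate, LastAccessed, LastModified |
    Export-Csv 'c:\temp\outpst-cim.csv' -NoTypeInformation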

The WMI query also stops without errors, but the data is incomplete. When you are talking about millions of files, which approach is best in terms of memory footprint, speed, and filtering?

Both methods I have used end up with partial results, so I have almost fallen back to tools like TreeSize, but I wanted to do this in PowerShell.
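One fallback short of TreeSize that I am looking at is robocopy in list-only mode: it copes with paths beyond 260 characters and streams its output line by line. A rough sketch; the destination folder is a dummy that /L never writes to, and the regex is my assumption about the line format produced by /NDL /NC /BYTES /FP:

# /L = list only (copy nothing), /E = recurse, /NJH /NJS = no header/summary,
# /NDL = no directory lines, /NC = no file class, /BYTES = sizes in bytes,
# /FP = full paths, /R:0 /W:0 = don't retry on locked files.
robocopy 'P:\' 'C:\robocopy-dummy' '*.pst' /L /E /NJH /NJS /NDL /NC /BYTES /FP /R:0 /W:0 |
    ForEach-Object {
        if ($_ -match '^\s*(\d+)\s+(\S.*)$') {   # "<size in bytes> <full path>"
            [pscustomobject]@{
                MB   = '{0:N2}' -f ([long]$Matches[1] / 1MB)
                Path = $Matches[2]
            }
        }
    } |
    Export-Csv 'c:\temp\outpst-robocopy.csv' -NoTypeInformation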

Is there any way I could achieve this within PowerShell?

