Channel: Windows PowerShell forum

putting a massive amount of data into a single variable

I'm running a command to find the amount of space .xar files (a Citrix temp file of sorts) are using on my organization's main file server. The server itself presents quite a few terabytes of data and all our users' profile folders are there, so we're talking about many thousands of folders with a whole mess of files. The command I was using is the following:

(gci w: -r -force -include *.xar | measure -sum -property Length).Sum

It took a long time to run, at least 5 hours, but it was finished when I got back to my desk this morning. Turns out we have 63 gigabytes of those .xar files.

I have a few questions:

1) Is this the best way to get that info?

2) If I need to put that command's result into a variable {$size = (gci w: -r -force ... Length).Sum} so I can use it elsewhere, am I going to kill my poor server's memory?
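On the memory question: `Measure-Object` consumes the pipeline one item at a time, so only the running sum is kept, and the variable ends up holding just a number, not the file objects. A sketch of that pattern (assuming PowerShell 3.0+ for the `-File` switch; the `W:\` path and `.xar` filter are from the question above):

```powershell
# Measure-Object streams: each FileInfo is discarded after its Length
# is added to the running total, so memory use stays flat.
$stats = Get-ChildItem -Path W:\ -Recurse -Force -File -Filter *.xar -ErrorAction SilentlyContinue |
    Measure-Object -Property Length -Sum

$size = $stats.Sum                                   # total bytes, a plain number
"{0:N2} GB across {1} files" -f ($size / 1GB), $stats.Count
```

Note `-Filter *.xar` instead of `-Include *.xar`: `-Filter` is applied by the provider during enumeration rather than after objects reach PowerShell, which typically speeds up a recursive scan like this. What would blow up memory is assigning the bare `gci` output to a variable ($files = gci ... would buffer every FileInfo object); piping into `Measure-Object` avoids that.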

Thank you


zarberg@gmail.com


