
"du -b --files0-from=-" running out of memory


I have a problem with du running out of memory.

I'm feeding it a list of NUL-separated file names on standard input,
using a command line that looks like:

  du -b --files0-from=-
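
For context, the list is produced along these lines (the find
invocation here is illustrative - the actual producer of the
NUL-separated names shouldn't matter):

  find /some/tree -type f -print0 | du -b --files0-from=-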

The problem is that when du is run this way, it leaks memory like a
sieve. I feed it about 4.7 million paths, but it eventually falls over
when it hits the 32-bit address space limit.

Now, I can understand why du -c might want to count extra hard links
to a file only once, but that requires at most a hash table of
device/inode pairs - it's hard to see why 4.7 million entries would
exhaust memory - and in any case, I'm not asking for a grand total.
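
(As a rough sanity check on that hash-table theory - a sketch using
GNU find, with an illustrative path - this counts the distinct
device/inode pairs such a table would have to hold:

  find /some/tree -printf '%D %i\n' | sort -u | wc -l

Even 4.7 million entries at a few dozen bytes each comes to well under
200 MB, nowhere near a 32-bit address space.)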

Is there any alternative, other than running e.g. xargs -0 du -b,
possibly with a high -n <count> to xargs to limit the leakage?
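
For reference, the workaround I have in mind is something like this
(the batch size is arbitrary - just large enough to amortize process
startup):

  find /some/tree -type f -print0 | xargs -0 -n 500000 du -b

Each du invocation exits after at most 500000 paths, so whatever it
leaks gets reclaimed, but hard links that straddle batches would be
counted once per batch.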

-- Barry

-- 
http://barrkel.blogspot.com/


