This is the mail archive of the binutils@sourceware.org mailing list for the binutils project.


Index Nav: [Date Index] [Subject Index] [Author Index] [Thread Index]
Message Nav: [Date Prev] [Date Next] [Thread Prev] [Thread Next]
Other format: [Raw text]

how to get "the big picture" with nm, objdump, size for object size?


Hi all!

 I am trying to decrease the object size of my C++ code with nm, size
and objdump. The 'size' command gives me
$ size my_obj.o
   text    data     bss     dec     hex filename
 143576      72     208  143856   231f0 my_obj.o
just 143 KB, which is much less than the real file size (1.6 MB).

 So, how can I find out which sections occupy so much space? I suspect
they are mostly debugging symbols and the like; how can I sum up the
numbers from the tools' output to get something close to 1.6 MB?
Googling didn't help.

 After some playing with the tools I found out that
	nm -a --demangle --print-size --size-sort -t d my_obj.o
gives quite useful information about which (weak) symbols are fat and
should therefore be eliminated from every object file (such as template
functions from the Boost library, whose code lives entirely in headers).

The first invocation alone gave me the clue to shrink my_obj.o from
2 MB to 1.6 MB (400 KB, -20%!), which is very good, but the second one
"decreased" the size by only 14 KB (not so good, hardly worth the
effort).

 I calculated the sum of the symbol sizes that nm shows with
	nm --||-- | awk '{ sum+=$2 } END { print sum }'
and it is 117 KB. I assumed there would be some proportion between
"cutting" symbols out of that 117 KB and decreasing the real object
size, but that turned out to be wrong.

Thanks in advance,
 Ilya Murav'jov
