Bug #55079: rpm: remove contents of build directory at end of %install section
Status: Closed (100% done)
Description
I've been doing some measurements of disk usage during SUSE RPM builds (of Pacific, but this should roughly apply to newer Ceph releases too). In our particular build environment, which builds everything in VMs, we see something like this:
                      Filesystem  Size  Used  Avail  Use%  Mounted on
df start of build:    /dev/vda     53G   14G    40G   25%  /
df end of build:      /dev/vda     53G   31G    23G   58%  /
df end of install:    /dev/vda     53G   39G    15G   74%  /
df before clamscan:   /dev/vda     53G   41G    13G   78%  /
df after clamscan:    /dev/vda     53G   50G   3.9G   93%  /
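(These numbers are just `df -h /` output captured at the stage boundaries; a rough sketch of where such calls could sit in the spec file follows - the placement here is purely illustrative, not necessarily how the snapshots above were collected.)

%build
df -h /          # snapshot: start of build
# ... normal cmake/make steps ...
df -h /          # snapshot: end of build

%install
# ... normal install steps into %{buildroot} ...
df -h /          # snapshot: end of install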
So after compiling everything, we've consumed about 17GB (that's all the binaries, object files and whatnot that end up in the "build" directory in the source tree). Then, after %install (which installs everything into the build root, ready to be turned into actual RPMs), we've used another 8GB. The next part - the clamscan step, one of the rpmlint checks SUSE runs - takes another 9GB, because it extracts all the built RPMs (including the debuginfo RPMs) in order to scan them.
In summary, our build worker VMs currently need a bit over 50G of disk to build Ceph.
If I add `rm -rf build` to the very end of the %install section, to get rid of the 17GB of built binaries, we go into clamscan using 24G rather than 41G, and when clamscan finishes we're using 32G. This means the peak build disk usage with that change is about 39G (the end-of-install figure), so we reduce our build workers' disk space requirements by about 11G (or roughly 20%).
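For illustration, the change would amount to something like the following at the tail of %install in ceph.spec (the directory name matches the "build" directory mentioned above; the surrounding lines are placeholders, not the actual spec contents):

%install
# ... existing install steps populating %{buildroot} ...

# Everything needed for packaging now lives in the buildroot, so the
# ~17GB of compiled objects in the source tree's build directory can go.
rm -rf build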