# Cleaning up a full Bosh director
Sometimes, you may notice that the pipeline jobs are failing with errors like:
```
Uploading release file:
  Director responded with non-successful status code '500' response ''
Exit code 1
```
or
```
Error: Failed to upload blob, code 1, output: 'Error running app - Putting dav blob 71998771-9645-4fac-94d7-1826f8b7d94b: Wrong response code: 500; body: <html>
<head><title>500 Internal Server Error</title></head>
<body bgcolor="white">
<center><h1>500 Internal Server Error</h1></center>
<hr><center>nginx</center>
</body>
</html>
', error: ''
```
This is most likely because your Bosh director's disk is full. To verify, target the environment and run:
```
~/workspace/capi-ci-private/mulan
± cs |master ✓| → bbl ssh --director --cmd "df -h"
...
Filesystem      Size  Used Avail Use% Mounted on
devtmpfs        1.8G     0  1.8G   0% /dev
tmpfs           1.9G     0  1.9G   0% /dev/shm
tmpfs           1.9G   52M  1.8G   3% /run
tmpfs           5.0M     0  5.0M   0% /run/lock
tmpfs           1.9G     0  1.9G   0% /sys/fs/cgroup
/dev/sda1       2.8G  1.5G  1.2G  58% /
/dev/sda3        33G  2.9G   29G  10% /var/vcap/data
tmpfs           1.0M   56K  968K   6% /var/vcap/data/sys/run
/dev/sdb1        63G   60G     0 100% /var/vcap/store   <-- yo dawg your disk is full
Connection to 10.0.0.6 closed.
```
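If you only care about the persistent disk, you can point `df` at a single mount point instead of reading the whole table. A minimal sketch, assuming the same `bbl` environment and the `/var/vcap/store` mount shown above:

```bash
# Show only the director's persistent disk (the one that fills up with task logs and blobs)
bbl ssh --director --cmd "df -h /var/vcap/store"
```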
The easiest way to get this back under 100% (most likely down to 99%) is to SSH to the director and delete the task logs:
```
± cs |master ✓| → bbl ssh --director
bosh/0:~$ sudo su -
bosh/0:~# cd /var/vcap/store/director/tasks
bosh/0:/var/vcap/store/director/tasks# rm -rf ./*
```
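If you would rather keep recent task logs around for debugging, a more conservative sketch (assuming GNU find on the director and the per-task directories under `/var/vcap/store/director/tasks` shown above) is to delete only the older ones:

```bash
# From a root shell on the director: remove task directories older than 7 days,
# leaving the most recent tasks available for inspection
find /var/vcap/store/director/tasks -mindepth 1 -maxdepth 1 -mtime +7 -exec rm -rf {} +
```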
Then, log off of the director and run `bosh clean-up --all` to clear up more space.
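A minimal sketch of that clean-up step, assuming you run the BOSH CLI from your workstation and have `bbl print-env` available in the same environment directory:

```bash
# Export BOSH_ENVIRONMENT, BOSH_CLIENT, etc. for this bbl environment
eval "$(bbl print-env)"

# Remove all unused releases, stemcells, and orphaned disks from the director
bosh clean-up --all
```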
After doing this on Mulan, the result was:
```
bbl ssh --director --cmd "df -h"
...
Filesystem      Size  Used Avail Use% Mounted on
devtmpfs        1.8G     0  1.8G   0% /dev
tmpfs           1.9G     0  1.9G   0% /dev/shm
tmpfs           1.9G  152M  1.7G   9% /run
tmpfs           5.0M     0  5.0M   0% /run/lock
tmpfs           1.9G     0  1.9G   0% /sys/fs/cgroup
/dev/sda1       2.8G  1.5G  1.2G  58% /
/dev/sda3        33G  3.0G   29G  10% /var/vcap/data
tmpfs           1.0M   56K  968K   6% /var/vcap/data/sys/run
/dev/sdb1        63G   27G   34G  44% /var/vcap/store   <-- yo dawg your disk is no longer full
Connection to 10.0.0.6 closed.
```