Docker container logs are quite big
So, like every good piece of software, Docker stores its containers' logs on your disk. Because my logstash container outputs a ton of logs (it collects logs from 6 servers), the log file grows to roughly the same size as the logs I receive -> 2GB of incoming logs equals 2GB of Docker logs.
The log file of a Docker container is located at /var/lib/docker/containers/*/*-json.log (you need root privileges to do anything with this file).
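For example, a quick way to see how big those files have grown is something like this (the sh -c wrapper is there so the glob is expanded with root privileges):

```bash
# List every container's JSON log file with its size.
# /var/lib/docker is not world-readable, so the glob must be expanded by a root shell.
sudo sh -c 'ls -lh /var/lib/docker/containers/*/*-json.log'
```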
I run an ELK Stack for collecting logs from multiple sources (currently there are 8 machines that send logs to my ELK Stack, with an average volume of 250 e/s). The estimated volume is about 8GB of logs per day, which doesn't seem so bad, right?
The problem is that the logstash container itself generates a huge amount of logs, because of the default logging driver and the amount of events logstash receives from the servers across our system.
This is the easy way, not the good way, to handle this kind of problem. It is not going to fix the issue permanently, because the issue is not that the log file is too big -> it is that the logging driver is configured the wrong way (so I think).
Command to check the largest files on /var:
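A common way to do that is with du and sort (just one possible command, tune the path and count as needed):

```bash
# Show the 10 largest files and directories under /var, human-readable, sorted by size
sudo du -ah /var | sort -rh | head -n 10
```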
Command to truncate the log file to zero (truncate -s 0 cuts the file down to zero bytes, emptying it without deleting it, so Docker keeps writing to the same file):
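For example, something like this (assuming the default json-file driver paths mentioned above):

```bash
# Set every container JSON log file to zero bytes without deleting it,
# so Docker keeps writing to the same file afterwards.
sudo sh -c 'truncate -s 0 /var/lib/docker/containers/*/*-json.log'
```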
This solution isn't the best, but it is the fastest, I would say.
"To update the logging driver for a container, the container has to be re-created with the desired options"
That is called log rotation, guys: you need to have a policy for your logs (maximum size, number of files, compressed format, ...). Below are some practices I found on the internet that I consider the right way to do it.
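One such practice is to set a default rotation policy for the whole Docker daemon in /etc/docker/daemon.json, so every newly created container picks it up; a sketch with example values (merge it with your existing daemon.json instead of overwriting it):

```bash
# Configure the Docker daemon with a default json-file rotation policy.
# Only containers created after the restart pick this up.
sudo tee /etc/docker/daemon.json > /dev/null <<'EOF'
{
  "log-driver": "json-file",
  "log-opts": {
    "max-size": "50m",
    "max-file": "5",
    "compress": "true"
  }
}
EOF
sudo systemctl restart docker
```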
When an application in a Docker container emits logs, they are sent to the application's stdout and stderr output streams; the Docker logging driver can access these streams and write the logs to a file.
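You can see this in action with a throwaway container that only writes to stdout (names here are purely for the demo):

```bash
# Start a container that prints to stdout every few seconds
docker run -d --name echo-test alpine sh -c 'while true; do echo hello; sleep 5; done'

# The logging driver captured the stream, so docker logs can read it back
docker logs echo-test

# And this is the json-file the driver writes to on disk
docker inspect --format '{{.LogPath}}' echo-test

# Clean up
docker rm -f echo-test
```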