To write logs in Python, most of us use the standard logging module with different handlers for output: console, file, and so on. Today I will demonstrate how to write and use a rotating file handler, which writes the log to a specific file but keeps rotating it so that it never takes more than the configured maximum amount of disk space on the machine.
Minimal Python File Rotating Handler
Okay, let's go straight to the code. This is a minimal working Python logger with a rotating file handler:
import os
import logging
from logging.handlers import RotatingFileHandler
LOG_FILE_DIRECTORY = "log"
LOG_FORMATTER = "%(asctime)s %(levelname)s %(pathname)s:%(lineno)d - %(message)s"
def ensure_directory_exist(path):
    if not os.path.exists(path):
        os.makedirs(path)
formatter = logging.Formatter(LOG_FORMATTER)
ensure_directory_exist(LOG_FILE_DIRECTORY)
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
file_path = os.path.join(LOG_FILE_DIRECTORY, 'example.rotate.handler.log')
# Create a rotating file handler with a maximum file size of 10 MB
# and a backup count of 5
file_handler = RotatingFileHandler(
    file_path,
    maxBytes=10*1024*1024,
    backupCount=5
)
file_handler.setFormatter(formatter)
logger.addHandler(file_handler)
if __name__ == "__main__":
    logger.debug("This is logger run from the minimal handler")
Create a file named minimal_logger.py with the above content and run it via python minimal_logger.py. You will see that it creates a file named log/example.rotate.handler.log with this content:
2023-05-26 16:42:08,127 DEBUG /Users/dragonknight/work/personal/file-rotating-logger/src/fetchcomp/minimal.py:37 - This is logger run from the minimal handler
Looking at the code, you can see that RotatingFileHandler takes maxBytes to specify the maximum size of each file and backupCount to define how many backup files are kept on the system, along with file_path for where the log is written. With that, we have a rotating file handler that saves logs to a file and always keeps them within the maximum capacity we predefined.
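With maxBytes of 10 MB and backupCount of 5, the handler keeps the active file plus up to five backups, so total disk usage tops out at roughly 60 MB. After rotation has happened a few times, the log directory will look roughly like this (backups get numeric suffixes, with .1 being the most recent):

log/example.rotate.handler.log      # current file, up to ~10 MB
log/example.rotate.handler.log.1    # most recent backup
log/example.rotate.handler.log.2
log/example.rotate.handler.log.3
log/example.rotate.handler.log.4
log/example.rotate.handler.log.5    # oldest backup, deleted on the next rollover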
That's about it; you can grab the snippet above and tinker with it as you like. But if you want some extra tweaks and a more robust logging setup to use in a real project, please continue reading the next steps.
Advanced settings to set up and use in a project
Improve the path format
The full pathname in the log file provides complete information about the file, but most of the time we only need the path relative to the project, so we can make the file path more compact to improve readability. The file path will change from /Users/dragonknight/work/personal/file-rotating-logger/src/fetchcomp/minimal.py:36 to src/fetchcomp/minimal.py:36.
import os
import logging
class CompactPathFormatter(logging.Formatter):
    def format(self, record):
        # Rewrite the absolute pathname as a path relative to the current
        # working directory before delegating to the default formatting.
        pathname = record.pathname
        compact_pathname = os.path.relpath(pathname, start=os.getcwd())
        record.pathname = compact_pathname
        return super().format(record)
formatter = CompactPathFormatter('%(asctime)s %(levelname)s %(pathname)s:%(lineno)d - %(message)s')
In this formatter, os.path.relpath is used to compute the relative path of the pathname with respect to the current working directory (os.getcwd()). This results in a more compact representation of the path.
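To plug it in, swap the plain logging.Formatter from the minimal example for the new class; the rest of the handler wiring stays the same. A minimal sketch, assuming the file_handler and LOG_FORMATTER names from the earlier snippet are already defined:

# Use the compact-path formatter instead of the plain logging.Formatter
formatter = CompactPathFormatter(LOG_FORMATTER)
file_handler.setFormatter(formatter)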
Add process id and logger name
We can add the process id and the logger name, right after the level name, to better understand the logging information: %(process)d [%(name)s]
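For reference, the full format string then becomes:

LOG_FORMATTER = "%(asctime)s %(levelname)s %(process)d [%(name)s] %(pathname)s:%(lineno)d - %(message)s"

With the compact path formatter from above, a log line then looks something like 2023-05-26 16:42:08,127 DEBUG 4242 [__main__] src/fetchcomp/minimal.py:37 - This is logger run from the minimal handler (the process id will of course differ on every run).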
Add more handlers
Sometimes, aside from the file handler, we also need at least a stream handler to output logs to the console.
console_handler = logging.StreamHandler()
console_handler.setFormatter(formatter)
logger.addHandler(console_handler)
With the above snippet, your logger now outputs to both the console and the file.
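Putting all the pieces together, a full setup might look roughly like the sketch below. It simply combines the snippets above into one helper; the name setup_logger is just for illustration.

import os
import logging
from logging.handlers import RotatingFileHandler

LOG_FILE_DIRECTORY = "log"
LOG_FORMATTER = "%(asctime)s %(levelname)s %(process)d [%(name)s] %(pathname)s:%(lineno)d - %(message)s"

class CompactPathFormatter(logging.Formatter):
    def format(self, record):
        # Shorten the absolute pathname to a path relative to the working directory
        record.pathname = os.path.relpath(record.pathname, start=os.getcwd())
        return super().format(record)

def setup_logger(name):
    os.makedirs(LOG_FILE_DIRECTORY, exist_ok=True)
    formatter = CompactPathFormatter(LOG_FORMATTER)

    # Rotating file handler: at most ~10 MB per file, keep 5 backups
    file_handler = RotatingFileHandler(
        os.path.join(LOG_FILE_DIRECTORY, "example.rotate.handler.log"),
        maxBytes=10 * 1024 * 1024,
        backupCount=5,
    )
    file_handler.setFormatter(formatter)

    # Stream handler so the same lines also show up on the console
    console_handler = logging.StreamHandler()
    console_handler.setFormatter(formatter)

    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    logger.addHandler(file_handler)
    logger.addHandler(console_handler)
    return logger

if __name__ == "__main__":
    logger = setup_logger(__name__)
    logger.debug("This is logger run from the combined setup")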
Conclusion
In conclusion, logging is an essential aspect of any Python application. It allows us to capture and store valuable information about the execution of our code, enabling us to diagnose issues, monitor performance, and gain insights into how our application behaves in different scenarios.
File rotating handlers are a valuable tool for managing log files effectively and efficiently. They provide a reliable mechanism for controlling log file growth, preventing excessive disk usage, and ensuring that our logging infrastructure operates smoothly.