Create a zip file of a log file using the Python logging module - python-2.7

I want to create a zip file of my log file. I have created the log file using the Python logging module and RotatingFileHandler.
import logging
from logging.handlers import RotatingFileHandler
# create a logging format
log_formatter = logging.Formatter('Date: %(asctime)s - %(message)s')
logFile = scheduler_name + "_" + scheduler_id + ".log"
# create a file handler
myhandler = RotatingFileHandler(logFile, mode='a', maxBytes=5*1024*1024,
                                backupCount=2, encoding=None, delay=0)
myhandler.setFormatter(log_formatter)
myhandler.setLevel(logging.INFO)
# add the handlers to the logger
app_log = logging.getLogger()
app_log.addHandler(myhandler)
Using that I have created the log file, and now I want to create a zip file of it using the logging module's built-in functionality.

I haven't tried it, but it should probably look something like this. Take care to generate dst_file_name dynamically, just like your 'logFile':
import os
import commands
myhandler.doRollover()
dst_file_name = 'myzip.zip'  # generate this dynamically, just like logFile
for i in range(myhandler.backupCount, 0, -1):
    # RotatingFileHandler names its backups <logFile>.1, <logFile>.2, ...
    src_file_name = "%s.%d" % (logFile, i)
    if os.path.exists(src_file_name):
        cmd = "zip %s %s" % (dst_file_name, src_file_name)
        commands.getoutput(cmd)
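If you would rather stay inside the standard library than shell out to the zip command, another option is to subclass RotatingFileHandler and compress each backup as part of the rollover itself. The class below is only a sketch of that idea (ZipRotatingFileHandler is my own name, not something provided by logging), written against the Python 2.7 handler attributes (baseFilename, backupCount, mode, _open):
import os
import zipfile
from logging.handlers import RotatingFileHandler

class ZipRotatingFileHandler(RotatingFileHandler):
    """Rotates like RotatingFileHandler, but stores the backups as .zip files."""

    def doRollover(self):
        if self.stream:
            self.stream.close()
            self.stream = None
        if self.backupCount > 0:
            # Shift older archives: mylog.log.1.zip -> mylog.log.2.zip, and so on.
            for i in range(self.backupCount - 1, 0, -1):
                sfn = "%s.%d.zip" % (self.baseFilename, i)
                dfn = "%s.%d.zip" % (self.baseFilename, i + 1)
                if os.path.exists(sfn):
                    if os.path.exists(dfn):
                        os.remove(dfn)
                    os.rename(sfn, dfn)
            # Compress the log file that just filled up into <logFile>.1.zip.
            dfn = self.baseFilename + ".1.zip"
            if os.path.exists(dfn):
                os.remove(dfn)
            with zipfile.ZipFile(dfn, "w", zipfile.ZIP_DEFLATED) as zf:
                zf.write(self.baseFilename, os.path.basename(self.baseFilename))
            os.remove(self.baseFilename)
        # Reopen a fresh, empty log file.
        self.mode = 'w'
        self.stream = self._open()
With that in place, the handler construction in the question only needs to swap RotatingFileHandler for ZipRotatingFileHandler; the formatter, level and addHandler calls stay the same.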

Related

Django: Logging to custom files every day

I'm running Django 3.1 on Docker and I want to log to different files every day. I have a couple of crons running and also Celery tasks. I don't want to log to one file, because a lot of processes would be writing to it and debugging/reading it would be difficult.
If I have cron tasks my_cron_1, my_cron_2, my_cron_3,
I want to be able to log to a file and append the date:
MyCron1_2020-12-14.log
MyCron2_2020-12-14.log
MyCron3_2020-12-14.log
MyCron1_2020-12-15.log
MyCron2_2020-12-15.log
MyCron3_2020-12-15.log
MyCron1_2020-12-16.log
MyCron2_2020-12-16.log
MyCron3_2020-12-16.log
Basically, I want to be able to pass in a name to a function that will write to a log file.
Right now I have a class MyLogger
import logging
class MyLogger:
    def __init__(self,filename):
        # Gets or creates a logger
        self._filename = filename

    def log(self,message):
        message = str(message)
        print(message)
        logger = logging.getLogger(__name__)
        # set log level
        logger.setLevel(logging.DEBUG)
        # define file handler and set formatter
        file_handler = logging.FileHandler('logs/'+self._filename+'.log')
        #formatter = logging.Formatter('%(asctime)s : %(levelname)s: %(message)s')
        formatter = logging.Formatter('%(asctime)s : %(message)s')
        file_handler.setFormatter(formatter)
        # add file handler to logger
        logger.addHandler(file_handler)
        # Logs
        logger.info(message)
I call the class like this
logger = MyLogger("FirstLogFile_2020-12-14")
logger.log("ONE")
logger1 = MyLogger("SecondLogFile_2020-12-14")
logger1.log("TWO")
FirstLogFile_2020-12-14 will have ONE and TWO, but it should only have ONE.
SecondLogFile_2020-12-14 will have TWO.
Why is this? Why are the logs being written to the incorrect file? What's wrong with my code?
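For what it's worth, the behaviour described matches how logging.getLogger works: getLogger(__name__) returns the same logger object on every call, and each call to log() attaches another FileHandler to it, so by the time logger1.log("TWO") runs the shared logger still carries the handler for the first file as well. A minimal sketch of one way to restructure the class, keyed on the filename and attaching the handler only once, could look like this (it is an illustration, not a drop-in answer):
import logging

class MyLogger:
    def __init__(self, filename):
        self._filename = filename
        # One logger per file name, configured exactly once.
        self._logger = logging.getLogger(filename)
        self._logger.setLevel(logging.DEBUG)
        if not self._logger.handlers:
            file_handler = logging.FileHandler('logs/' + filename + '.log')
            file_handler.setFormatter(logging.Formatter('%(asctime)s : %(message)s'))
            self._logger.addHandler(file_handler)

    def log(self, message):
        message = str(message)
        print(message)
        self._logger.info(message)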

reusing logging config across python modules

This snippet creates a logging filter that puts ERROR level and above into the console and DEBUG and above into the log file. What I can't seem to figure out is how to reuse that config across my various modules so that I'm writing to the same logfile, but the name correctly indicates the module that generated the message.
Thanks in advance!
import logging
default_formatter = logging.Formatter(
    "%(asctime)s:%(name)s:%(levelname)s:%(message)s")
console_handler = logging.StreamHandler()
console_handler.setLevel(logging.ERROR)
console_handler.setFormatter(default_formatter)
file_handler = logging.FileHandler("error.log", "a")
file_handler.setLevel(logging.DEBUG)
file_handler.setFormatter(default_formatter)
noralog = logging.getLogger(__name__)
noralog.setLevel(logging.DEBUG)
noralog.addHandler(console_handler)
noralog.addHandler(file_handler)
noralog.debug('PUT ME ONLY IN THE FILE')
noralog.error('STREAM AND FILE')
This seems to work, but I'm not sure it's the best solution:
#noralog.py
def setup_logging(localname):
    import logging
    default_formatter = logging.Formatter(
        "%(asctime)s:%(name)s:%(levelname)s:%(message)s")
    console_handler = logging.StreamHandler()
    console_handler.setLevel(logging.ERROR)
    console_handler.setFormatter(default_formatter)
    file_handler = logging.FileHandler("error.log", "a")
    file_handler.setLevel(logging.DEBUG)
    file_handler.setFormatter(default_formatter)
    noralog = logging.getLogger(localname)
    noralog.setLevel(logging.DEBUG)
    noralog.addHandler(console_handler)
    noralog.addHandler(file_handler)
    return noralog
#othermod.py
import noralog
ff = noralog.setup_logging(__name__)
ff.debug('PUT THIS ONLY IN FILE LOG')
ff.error('PUT THIS IN STREAM AND FILE LOG')
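A variation that avoids attaching a fresh pair of handlers for every module is to configure the handlers once, on the root logger, and have each module simply call logging.getLogger(__name__); records propagate up to the root, so %(name)s still identifies the module that produced the message. A rough sketch of that layout (the module and function names are illustrative):
# noralog.py
import logging

def setup_root_logging():
    """Configure handlers once, on the root logger."""
    default_formatter = logging.Formatter(
        "%(asctime)s:%(name)s:%(levelname)s:%(message)s")

    console_handler = logging.StreamHandler()
    console_handler.setLevel(logging.ERROR)
    console_handler.setFormatter(default_formatter)

    file_handler = logging.FileHandler("error.log", "a")
    file_handler.setLevel(logging.DEBUG)
    file_handler.setFormatter(default_formatter)

    root = logging.getLogger()
    root.setLevel(logging.DEBUG)
    root.addHandler(console_handler)
    root.addHandler(file_handler)

# main.py -- call this exactly once at startup
import noralog
noralog.setup_root_logging()

# othermod.py -- no handler setup needed here
import logging
ff = logging.getLogger(__name__)
ff.debug('PUT THIS ONLY IN FILE LOG')
ff.error('PUT THIS IN STREAM AND FILE LOG')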

define python logger in one script and use it in all other scripts

I have two python scripts and one more python file which contains common functions used by these scripts.
I want to define a logger in this common file and use it in the test scripts.
Can anyone please suggest the best way to define this logger in the common file and use it in all the other files?
Please correct if my examples are wrong. :)
Ex:
Common file: (common.py)
This file should have a logger with name "logtool" which can be used by all scripts
My test script 1: (test1.py)
logtool.info("some text")
logtool.debug("Some text")
My test script 2: (test2.py)
logtool.info("Some text")
logtool.debug("Some text")
You can have the following code in your common.py file -
import logging
logging.basicConfig(level=logging.INFO,
                    format='%(asctime)s %(name)-12s %(levelname)-8s %(message)s',
                    datefmt='%m-%d %H:%M',
                    filename='logger.log',
                    filemode='w')
logtool = logging.getLogger('MainLogs')
logtool.setLevel(logging.INFO)
# basicConfig above already routes logtool's records to logger.log via the
# root logger, so only a console handler needs to be added here.
ch = logging.StreamHandler()
ch.setLevel(logging.INFO)
logtool.addHandler(ch)
Then simply import logtool in the other scripts -
from common import logtool
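With that in place, the test scripts only need the import. A hypothetical test1.py might look like this (run_test is just an illustrative name):
# test1.py
from common import logtool

def run_test():
    logtool.info("some text")
    logtool.debug("Some text")  # dropped, since logtool's level is INFO

if __name__ == "__main__":
    run_test()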

Python logger is printing messages twice on both the PyCharm console and the log file

In similar questions the problem was caused by the logger getting configured twice. So maybe in my case it is caused by the second getLogger() call. Any ideas how I can fix it?
import logging
import logging.handlers
logger = logging.getLogger("")
logger.setLevel(logging.DEBUG)
handler = logging.handlers.RotatingFileHandler(
    "the_log.log", maxBytes=3000000, backupCount=2)
formatter = logging.Formatter(
    '[%(asctime)s] {%(filename)s:%(lineno)d} %(levelname)s - %(message)s')
handler.setFormatter(formatter)
logger.addHandler(handler)
# This is required to print messages to the console as well as the log file.
logging.getLogger().addHandler(logging.StreamHandler())
Using a config file, e.g. logging.config.fileConfig('logging.ini'):
[logger_root]
level=ERROR
handlers=stream_handler
[logger_webserver]
level=DEBUG
handlers=stream_handler
qualname=webserver
propagate=0
You have to set logger.propagate = 0 (see the Python 3 docs) when you're configuring the root logger and using non-root loggers at the same time.
I know this was asked a long time ago, but it's the top result on DuckDuckGo.
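The same idea in plain code, for anyone not using an ini file: give the application a named logger instead of the root logger and stop it from propagating, since the environment (PyCharm, IPython, another library) may already have put a StreamHandler on the root logger. A sketch that reuses the handler setup from the question:
import logging
import logging.handlers

logger = logging.getLogger("webserver")  # a named logger, not the root logger
logger.setLevel(logging.DEBUG)
logger.propagate = False                 # do not pass records up to the root logger

handler = logging.handlers.RotatingFileHandler(
    "the_log.log", maxBytes=3000000, backupCount=2)
handler.setFormatter(logging.Formatter(
    '[%(asctime)s] {%(filename)s:%(lineno)d} %(levelname)s - %(message)s'))
logger.addHandler(handler)

logger.addHandler(logging.StreamHandler())  # console output, added exactly once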

Where are python logs default stored when ran through IPython notebook?

In an IPython notebook cell I wrote:
import logging
logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger(__name__)
handler = logging.FileHandler('model.log')
handler.setLevel(logging.INFO)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
handler.setFormatter(formatter)
logger.addHandler(handler)
Notice that I am supplying a file name, but not a path.
Where could I find that log? (ran a 'find' and couldn't locate it...)
There are multiple ways to set the IPython working directory. If you don't set any of that in your IPython profile/config, environment or notebook, the log should be in your working directory. Also try $ ipython locate to print the default IPython directory path; the log may be there.
What about giving it an absolute file path to see if it works at all?
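A quick way to check where a relative filename ends up is to print the notebook's working directory, or to build an absolute path from it; a small sketch:
import os
import logging

print(os.getcwd())  # the directory a relative 'model.log' would be created in
handler = logging.FileHandler(os.path.join(os.getcwd(), 'model.log'))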
Other than that, the call to logging.basicConfig doesn't seem to do anything inside an IPython notebook:
# In:
import logging
logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger()
logger.debug('root debug test')
There's no output.
As per the docs, logging.basicConfig doesn't do anything if the root logger already has handlers configured for it. This seems to be the case here; IPython apparently already has the root logger set up. We can confirm it:
# In:
import logging
logger = logging.getLogger()
logger.handlers
# Out:
[<logging.StreamHandler at 0x106fa19d0>]
So we can try setting the root logger level manually:
import logging
logger = logging.getLogger()
logger.setLevel(logging.DEBUG)
logger.debug('root debug test')
which yields formatted output in the notebook.
Now onto setting up the file logger:
# In:
import logging
# set root logger level
root_logger = logging.getLogger()
root_logger.setLevel(logging.DEBUG)
# setup custom logger
logger = logging.getLogger(__name__)
handler = logging.FileHandler('model.log')
handler.setLevel(logging.INFO)
logger.addHandler(handler)
# log
logger.info('test info my')
which results in the output being written both to the notebook and to the model.log file, which for me is located in the directory I started IPython and the notebook from.
Mind that repeated runs of this piece of code without restarting the IPython kernel will create and attach yet another handler to the logger each time, so the number of copies of each message written to the file will grow with every run.
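A common guard in notebooks is to check the logger's existing handlers before attaching a new one, so re-running the cell doesn't stack up duplicates; a minimal sketch:
import logging

logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)

# Attach the file handler only if this logger doesn't already have one,
# so re-running the cell does not add duplicate handlers.
if not any(isinstance(h, logging.FileHandler) for h in logger.handlers):
    handler = logging.FileHandler('model.log')
    handler.setLevel(logging.INFO)
    logger.addHandler(handler)

logger.info('test info my')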
Declare the path of the log file in basicConfig like this:
log_file_path = "/your/path/"
logging.basicConfig(level=logging.DEBUG,
                    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
                    filename=log_file_path,
                    filemode='w')
You can then start logging, and you can also give the console a different log format if you want:
# define a Handler which writes DEBUG messages or higher to sys.stderr
console = logging.StreamHandler()
console.setLevel(logging.DEBUG)
# set a format which is simpler for console use
formatter = logging.Formatter('%(name)-12s: %(levelname)-8s %(message)s')
# tell the handler to use this format
console.setFormatter(formatter)
# add the handler to the root logger
logging.getLogger().addHandler(console)
logger = logging.getLogger()
et voilà.