Django server not working with AzureLogHandler (opencensus) - django

I'm trying to connect my Django project's logs to Azure Application Insights using OpenCensus. The middleware for monitoring requests works well, but I also want to send telemetry logs (not just requests) to Azure. Here is my Django LOGGING configuration:
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'verbose': {
            'format': '%(asctime)s %(levelname).3s %(process)d %(name)s : %(message)s',
        },
        'simple': {
            'format': '%(asctime)s %(levelname)-7s : %(message)s',
        },
    },
    'filters': {
        'require_debug_false': {
            '()': 'django.utils.log.RequireDebugFalse',
        }
    },
    'handlers': {
        'console': {
            'class': 'logging.StreamHandler',
            'formatter': 'verbose',
        },
        'azure': {
            'formatter': 'simple',
            'class': 'opencensus.ext.azure.log_exporter.AzureLogHandler',
            'connection_string': 'InstrumentationKey=XXXX-XXXX-XXXX-XXXX',
        },
        'mail_admins': {
            'level': 'ERROR',
            'filters': ['require_debug_false'],
            'class': 'django.utils.log.AdminEmailHandler',
        },
    },
    'loggers': {
        '': {
            'level': os.environ.get('LOGLEVEL', 'INFO'),
            'handlers': ['console', 'azure'],
        },
        'devdebug': {
            'handlers': ['console'],
            'level': 'INFO',
            'propagate': False,
        },
        'django': {
            'handlers': ['console', 'mail_admins'],
            'level': os.environ.get('LOGLEVEL', 'INFO'),
            'propagate': False,
        },
    },
}
Without the 'azure' handler in my root logger config, everything works fine. With the 'azure' handler, the server starts but doesn't respond: I am unable to connect to it. I really don't know what is happening, as it doesn't show any unusual logs (even with LOGLEVEL=DEBUG).
My handler configuration should be fine, since I do receive logs in Azure (when I run any manage.py command). Even when I run manage.py runsslserver localhost:53215, I receive logs, but it is as if the server is not running when I try to reach it.
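A sketch of how the 'azure' handler can be toggled off just for the dev server, purely to confirm that this handler is what blocks the process (it assumes the LOGGING dict above and is not a fix):
import sys

# Sketch: keep only the console handler on the root logger while the dev server runs,
# so the AzureLogHandler (which exports records from a background worker) stays out of the picture.
if 'runserver' in sys.argv or 'runsslserver' in sys.argv:
    LOGGING['loggers']['']['handlers'] = ['console']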

I have set up the logging via settings and have it working (although not using runsslserver). I have set the connection_string as an environment variable.
The problem I am having is that the 'django' logger does not work in production. All the other loggers are sending messages, but not that one. It's a classic case of "works on my machine and our dev servers but not prod"... FML!!
INTEGRATIONS = ['postgresql', 'httplib', 'logging', 'threading']
config_integration.trace_integrations(INTEGRATIONS)

# Set the AppInsights key as an env variable so it can be used by the logging system
os.environ['APPLICATIONINSIGHTS_CONNECTION_STRING'] = 'InstrumentationKey=XXXXXXXXXXXXXXXXXXXXXXX'

LOGGING = {
    'disable_existing_loggers': True,  # False  <-- if True, make sure that you have a 'django' and a '' logger
    'filters': {
        'require_debug_false': {
            '()': 'django.utils.log.RequireDebugFalse',
        },
        'require_debug_true': {
            '()': 'django.utils.log.RequireDebugTrue',
        },
    },
    'formatters': {
        'simple': {
            'format': '[%(asctime)s] %(levelname)s %(message)s',
            'datefmt': '%Y-%m-%d %H:%M:%S',
        },
        'verbose': {
            'format': '[%(asctime)s] %(levelname)s [%(name)s.%(funcName)s:%(lineno)d] %(message)s',
            'datefmt': '%Y-%m-%d %H:%M:%S',
        },
        'azure_verbose': {
            'format': '[%(asctime)s] %(levelname)s [%(name)s.%(funcName)s:%(lineno)d] %(message)s traceId=%(traceId)s spanId=%(spanId)s',
            'datefmt': '%Y-%m-%d %H:%M:%S',
        },
    },
    'handlers': {
        'console': {
            'class': 'logging.StreamHandler',
            'level': 'INFO',
            'filters': ['require_debug_true'],
            'formatter': 'verbose',
        },
        'azure': {
            'level': 'INFO',
            'filters': ['require_debug_false'],
            'class': 'opencensus.ext.azure.log_exporter.AzureLogHandler',
            'formatter': 'azure_verbose',
        },
    },
    'loggers': {
        'mylogger': {
            'handlers': ['azure', 'console'],
        },
        'django': {
            'handlers': ['azure', 'console'],
        },
        'py.warnings': {
            'handlers': ['azure', 'console'],
        },
        '': {
            'handlers': ['azure', 'console'],
        },
    },
    'version': 1,
}
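For a quick sanity check of the handler itself, something like this can be run from a view or the shell (a minimal sketch; 'mylogger' is the logger defined above, and custom_dimensions is the opencensus-ext-azure mechanism for attaching custom properties):
import logging

logger = logging.getLogger('mylogger')  # matches the 'mylogger' entry in LOGGING above
logger.warning(
    "test message for Application Insights",
    extra={'custom_dimensions': {'source': 'sanity_check'}},  # appears as customDimensions on the trace
)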

Related

How can I use a custom logging handler with django?

I'm trying to integrate logging with loki in my django app like this:
handler = logging_loki.LokiHandler(
    url="http://localhost:3100/loki/api/v1/push",
    tags={"app": "django", "env": ENV},
    version="1",
)

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'standard': {
            'format': '[%(asctime)s] {%(module)s} [%(levelname)s] - %(message)s',
            'datefmt': '%d-%m-%Y %H:%M:%S',
        },
    },
    'handlers': {
        'console': {
            'level': 'INFO',
            'class': 'logging.StreamHandler',
            'formatter': 'standard',
        }
    },
    'loggers': {
        '': {
            'handlers': ['console', handler],  # this doesn't work
            'level': 'DEBUG',
            'propagate': True,
            # 'name': ENV
        }
    }
}
What do I need to change so that Django uses this custom handler? I tried just referencing the handler object in the logging dict, but that doesn't seem to be the right approach.
I also tried this:
LOGGING_CONFIG = None
logging.config.dictConfig(LOGGING)
logging.getLogger(__name__).addHandler(handler)
but that's not sending any logs to Loki.
Try this:
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'standard': {
            'format': '[%(asctime)s] {%(module)s} [%(levelname)s] - %(message)s',
            'datefmt': '%d-%m-%Y %H:%M:%S',
        },
    },
    'handlers': {
        'console': {
            'level': 'INFO',
            'class': 'logging.StreamHandler',
            'formatter': 'standard',
        },
        'loki': {
            'level': 'INFO',
            'class': 'logging_loki.LokiHandler',
            'url': "http://localhost:3100/loki/api/v1/push",
            'tags': {"app": "django", "env": ENV},
            'version': "1",
        },
    },
    'loggers': {
        '': {
            'handlers': ['console', 'loki'],
            'level': 'DEBUG',
            'propagate': True,
        }
    }
}
Specifically, you have to:
Define your handler in the handlers dictionary of the LOGGING setting. The configuration here is used to specify the initialisation arguments for the handler.
Then assign this handler using its key (loki in the example I've given above) to the logger(s) you want it to handle.
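After that, ordinary logging calls anywhere in the app will be routed through both handlers attached to the root logger, for example (a minimal usage sketch):
import logging

logger = logging.getLogger(__name__)  # propagates up to the '' (root) logger configured above
logger.info("hello loki")             # handled by both the 'console' and 'loki' handlers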

Elastic APM: How to turn off logging for django

flushing due to time since last flush 9.060s > max_flush_time 9.060s
I'm getting tons of these messages in Django's debug output.
I tried changing to their default settings:
LOGGING = {
    'version': 1,
    'disable_existing_loggers': True,
    'formatters': {
        'verbose': {
            'format': '%(levelname)s %(asctime)s %(module)s %(process)d %(thread)d %(message)s'
        },
    },
    'handlers': {
        'elasticapm': {
            'level': 'WARNING',
            'class': 'elasticapm.contrib.django.handlers.LoggingHandler',
        },
        'console': {
            'level': 'DEBUG',
            'class': 'logging.StreamHandler',
            'formatter': 'verbose'
        }
    },
    'loggers': {
        'django.db.backends': {
            'level': 'ERROR',
            'handlers': ['console'],
            'propagate': False,
        },
        'mysite': {
            'level': 'WARNING',
            'handlers': ['elasticapm'],
            'propagate': False,
        },
        # Log errors from the Elastic APM module to the console (recommended)
        'elasticapm.errors': {
            'level': 'ERROR',
            'handlers': ['console'],
            'propagate': False,
        },
    },
}
I'm still getting lots of these logs. How do I turn this off?
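One option, sketched here rather than taken from the agent's documentation, is to raise the level on the agent's own logger namespace so its DEBUG-level flush messages are filtered out:
'loggers': {
    # ... existing loggers from above ...
    # Sketch: quiet the agent's internal chatter (the flush messages come from its own loggers)
    'elasticapm': {
        'level': 'WARNING',
        'handlers': ['console'],
        'propagate': False,
    },
},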

Django creates log file but doesn't write into it

I am trying to use the Django logger with the following settings:
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'filters': {
        'require_debug_false': {
            '()': 'django.utils.log.RequireDebugFalse',
        },
        'require_debug_true': {
            '()': 'django.utils.log.RequireDebugTrue',
        },
    },
    'formatters': {
        'simple': {
            'format': '[%(asctime)s] %(levelname)s %(message)s',
            'datefmt': '%Y-%m-%d %H:%M:%S'
        },
    },
    'handlers': {
        'logfile': {
            'level': 'INFO',
            'filters': ['require_debug_false', 'require_debug_true'],
            'class': 'logging.FileHandler',
            'filename': '/tmp/django-log.log',
            'formatter': 'simple'
        },
    },
    'loggers': {
        'user_activity': {
            'handlers': ['logfile'],
        },
    }
}
and then using it in a view like the following:
user_logger = logging.getLogger('user_activity')
...
user_logger.info("opened page")
Although Django creates the file django-log.log, it doesn't write to it.
Is there a problem with my settings?
Thanks
In the handler filters, you can use either django.utils.log.RequireDebugFalse or django.utils.log.RequireDebugTrue, not both. They are mutually exclusive.
You would also need to set the level on the logger object you get back from logging.getLogger(name).
Try this before writing to the logger:
user_logger.setLevel(logging.INFO)
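Alternatively, the level can be set declaratively in the LOGGING dict itself, keeping only one of the two debug filters (a sketch of just the relevant parts):
'handlers': {
    'logfile': {
        'level': 'INFO',
        'filters': ['require_debug_false'],  # keep only one of the two mutually exclusive filters
        'class': 'logging.FileHandler',
        'filename': '/tmp/django-log.log',
        'formatter': 'simple',
    },
},
'loggers': {
    'user_activity': {
        'handlers': ['logfile'],
        'level': 'INFO',  # without an explicit level, the logger falls back to the root's default (WARNING)
    },
},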

Configure Raven using Celery and Django without Djcelery

How do I configure Django + Raven with Celery, without using the djcelery application, which is out of date with Celery 3.1+?
The old configuration used:
INSTALLED_APPS = INSTALLED_APPS + (
    'raven.contrib.django.raven_compat',
)
You don't need djcelery with Celery 3.1+, just add raven.contrib.django.raven_compat to your INSTALLED_APPS.
INSTALLED_APPS = (
    ...
    'raven.contrib.django.raven_compat',
)

RAVEN_CONFIG = {
    'dsn': 'YOUR_DSN',
}
Here is a sample LOGGING setting:
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'filters': {
        'require_debug_false': {
            '()': 'django.utils.log.RequireDebugFalse',
        },
        'require_debug_true': {
            '()': 'django.utils.log.RequireDebugTrue',
        },
    },
    'formatters': {
        'verbose': {
            'format': '%(levelname)s %(asctime)s %(module)s %(process)d %(thread)d %(message)s',
        },
        'simple': {
            'format': '%(levelname)s %(name)s %(message)s',
        },
        'clear': {
            'format': '%(message)s',
        },
    },
    'handlers': {
        'console': {
            'level': 'DEBUG',
            'class': 'logging.StreamHandler',
            'formatter': 'clear',
            'filters': ['require_debug_true'],
        },
        'mail_admins': {
            'level': 'ERROR',
            'class': 'django.utils.log.AdminEmailHandler',
            'filters': ['require_debug_false'],
        },
        'sentry': {
            'level': 'WARNING',
            'class': 'raven.contrib.django.raven_compat.handlers.SentryHandler',
            'filters': ['require_debug_false'],
        },
    },
    'loggers': {
        'raven': {
            'level': 'DEBUG',
            'handlers': ['console'],
            'propagate': False,
        },
        'sentry.errors': {
            'level': 'DEBUG',
            'handlers': ['console'],
            'propagate': False,
        },
        'log_to_console': {
            'handlers': ['console'],
            'level': 'DEBUG',
            'propagate': False,
        },
        'log_to_sentry': {
            'handlers': ['sentry'],
            'level': 'WARNING',
        },
    },
}
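With that configuration, the logger name picks the destination, for example (a minimal usage sketch):
import logging

logging.getLogger('log_to_console').debug("local console output only")
logging.getLogger('log_to_sentry').warning("this one becomes a Sentry event")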

Django logging setting configuration issue

My logging settings are as follows:
LOGGING = {
    'version': 1,
    'disable_existing_loggers': True,
    'formatters': {
        'standard': {
            'format': "[%(asctime)s] %(levelname)s [%(name)s:%(lineno)s] %(message)s",
            'datefmt': "%d/%b/%Y %H:%M:%S"
        },
    },
    'handlers': {
        'null': {
            'level': 'DEBUG',
            'class': 'django.utils.log.NullHandler',
        },
        'logfile': {
            'level': 'DEBUG',
            'class': 'logging.handlers.RotatingFileHandler',
            'filename': LOG_ROOT + "/logfile",
            'maxBytes': 50000,
            'backupCount': 2,
            'formatter': 'standard',
        },
        'console': {
            'level': 'INFO',
            'class': 'logging.StreamHandler',
            'formatter': 'standard'
        },
    },
    'loggers': {
        'django': {
            'handlers': ['console'],
            'propagate': True,
            'level': 'WARN',
        },
        'django.db.backends': {
            'handlers': ['console'],
            'level': 'DEBUG',
            'propagate': False,
        },
        'home': {
            'handlers': ['console', 'logfile'],
            'level': 'DEBUG',
        },
    }
}
I am calling it like this:
import logging
log = logging.getLogger('home')
log.error("Hey there it works!!")
I'm getting the error No handlers could be found for logger "home".
It seems I am missing something in the configuration. I also went through past SO questions to figure it out, but couldn't.
What is wrong, any clue?
Thanks in advance.