I tried to find this, but I'm probably not searching the right way. When I run my Django app, I get a record printed to the error output every time an endpoint is called, similar to:
[20/Jun/2019 09:45:37] "GET /analyst/run_session/ HTTP/1.1" 200 1271
The problem is that I have a call set up every second to refresh data from the API, so my console is being flooded with these entries.
I have tried setting DEBUG = False and MESSAGE_LEVEL = message_constants.ERROR, but neither suppresses these entries. Is there something obvious I'm missing?
This is done via the logging configuration; you will need to override the config for the django.server logger.
For example:
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'loggers': {
        'django.server': {
            'level': 'ERROR',
        },
    },
}
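If you want to keep django.server at INFO but drop only the lines from the endpoint that is polled every second, a logging filter is another option. This is a minimal sketch, assuming the polled path is /analyst/run_session/ as in the example line above; the class name is illustrative:

```python
import logging

class SkipPollingFilter(logging.Filter):
    """Drop request-log lines for an endpoint that is polled constantly."""

    # assumption: /analyst/run_session/ is the noisy endpoint
    NOISY_PATH = '/analyst/run_session/'

    def filter(self, record):
        # runserver's per-request message contains the request line,
        # e.g. "GET /analyst/run_session/ HTTP/1.1" 200 1271
        return self.NOISY_PATH not in record.getMessage()
```

You would register it in LOGGING under 'filters' (using the '()' key to point at the class) and list it in the 'filters' of whatever handler django.server uses.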
In my django app, I have defined the logging configuration:
default_config = {
    'handlers': handlers_to_use,
    'level': 'WARN',
}
LOGGING: Dict[str, Any] = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': handler_configs,
    'root': default_config,
    'loggers': {
        '': default_config,
    },
}
So you can see I'm using both the unnamed logger '' and the root logger, which should set the default logging level to WARN. However, some packages (factory_boy and PIL) are still giving me DEBUG logs, which doesn't make sense: with WARN I should only get ERROR and WARN logs, based on the hierarchy.
How are they overriding the default? If I add factory and PIL to the list of loggers, things work correctly, but I'm wondering why neither the unnamed logger nor the root logger catches the DEBUG logs.
Any help would be greatly appreciated.
You're still getting the loggers that are defined elsewhere, because of this line:
'disable_existing_loggers': False
The other packages like factory_boy and PIL are not overriding the default; you are just not overriding them.
If you disable the existing loggers, it's then on you to define everything yourself: only what's configured in your LOGGING will apply.
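One way to pin those third-party loggers down, then, is to name them explicitly alongside your root config. The sketch below is runnable with plain logging.config (the console handler and WARNING level are illustrative):

```python
import logging
import logging.config

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'console': {'class': 'logging.StreamHandler'},
    },
    'root': {'handlers': ['console'], 'level': 'WARNING'},
    'loggers': {
        # Explicit entries re-apply your level to loggers that were
        # already configured elsewhere before dictConfig ran.
        'factory': {'level': 'WARNING'},
        'PIL': {'level': 'WARNING'},
    },
}

logging.config.dictConfig(LOGGING)
```

Because the loggers are named in the config, dictConfig resets their levels even though they already existed, which is exactly the behavior you observed when adding factory and PIL to the list.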
I want to log all database changes made from the application, not only from the Django admin. How can I achieve that? Currently we can only see the history in the Django admin for changes made through the admin interface. Do I need to define signals for this?
In settings.py, we have to enable logging for the database backend. Put this code in your settings.py:
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'console': {
            'level': 'DEBUG',
            'class': 'logging.StreamHandler',
        },
    },
    'loggers': {
        'django.db.backends': {
            'handlers': ['console'],
            'level': 'DEBUG',
            'propagate': True,
        },
    },
}
Django documentation for logging - https://docs.djangoproject.com/en/1.11/topics/logging/#django-db-backends
Django admin uses LogEntry.objects.log_action to record those history changes. There's nothing stopping you from calling that same method in your own code to record changes made elsewhere.
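A hedged sketch of what such a call could look like (the log_change helper is illustrative, not part of Django):

from django.contrib.admin.models import CHANGE, LogEntry
from django.contrib.contenttypes.models import ContentType

def log_change(user, obj, message):
    # Writes to the same django_admin_log table the admin uses,
    # so the entry shows up in the object's admin history.
    LogEntry.objects.log_action(
        user_id=user.pk,
        content_type_id=ContentType.objects.get_for_model(obj).pk,
        object_id=obj.pk,
        object_repr=str(obj),
        action_flag=CHANGE,
        change_message=message,
    )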
You can use a pre_save signal to get the object before committing to the database and then fetch the old values from the database to compare and check for changes.
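A minimal pre_save sketch along those lines (Article is a stand-in for your own model; it only prints the diff here, but you could hand it to a logger or LogEntry instead):

from django.db.models.signals import pre_save
from django.dispatch import receiver

@receiver(pre_save, sender=Article)  # Article is hypothetical
def report_field_changes(sender, instance, **kwargs):
    if instance.pk is None:
        return  # brand-new object: nothing to compare against yet
    old = sender.objects.filter(pk=instance.pk).first()
    if old is None:
        return
    for field in sender._meta.concrete_fields:
        before = getattr(old, field.attname)
        after = getattr(instance, field.attname)
        if before != after:
            print(f'{field.name}: {before!r} -> {after!r}')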
The change message can be a plain string, but the admin puts it in a JSON format so it can be translated. You can look at the source for construct_change_message in django.contrib.admin.utils to figure out that JSON format if you want to continue using it for the ManyToManyField changes, etc.
There are two types of changes possible.
If you are concerned with structural changes to the database schema, those are saved anyway in the migrations folder inside your app directory.
If you want to log DB changes in terms of entries made in the database, you might find the Python package django-audit-log useful. You can install it via pip, and once installed, you can add trackers to your models by doing something like this:
from audit_log.models.managers import AuditLog

class YourModelName(models.Model):
    # your model definition here
    audit_log = AuditLog()
You can find the docs here
Another alternative is django-reversion which allows you to do version control for model instances.
Hope this helps!
I have configured my application so it accepts requests with my FQDN in the Host: header (using ALLOWED_HOSTS). However, our backend team monitors the site with a different address (just a HTTP “ping” to check if the application is alive). I don’t need to serve any pages on this second site, however, getting e-mails about a missing ALLOWED_HOSTS entry for this particular case is a bit annoying (it comes every five minutes).
I have found several articles that discuss how to disable such notifications all at once, but I’d like to ignore this for only this one host. Is there a solution for this?
You could solve this by adding a filter to the mail_admins handler and silencing the specific django.security.DisallowedHost log record. Docs.
When an invalid host is received, a django.core.exceptions.SuspiciousOperation is raised, which is logged by the django.security.DisallowedHost logger and handled by default by the mail_admins logging handler.
By adding a django.utils.log.CallbackFilter filter to the mail_admins handler you can silence the log record you want; silencing it at the mail_admins handler avoids sending the email.
An example LOGGING settings would look like this (I will only add the relevant bits, your settings will probably have more entries):
def skip_missing_host(record):
    if record.name == 'django.security.DisallowedHost':
        # `record.msg` contents are similar to:
        # u"Invalid HTTP_HOST header: 'somehost'.
        # You may need to add u'somehost' to ALLOWED_HOSTS."
        # Add a condition to determine when the ERROR should be skipped:
        if "u'somehost'" in record.msg:
            # Returning False indicates that the record shouldn't be handled:
            return False
    return True
LOGGING = {
    'filters': {
        'skip_missing_host': {
            '()': 'django.utils.log.CallbackFilter',
            'callback': skip_missing_host,
        },
    },
    'handlers': {
        'mail_admins': {
            'level': 'ERROR',
            'filters': ['require_debug_false', 'skip_missing_host'],
            'class': 'django.utils.log.AdminEmailHandler',
        },
    },
}
Alternatively you could create your own django.security.DisallowedHost handler, which would need to be configured in the LOGGING setting.
e.g. in get_response() at https://github.com/django/django/blob/master/django/core/handlers/base.py#L133
there is a
logger.warning('Not Found: %s', request.path,
               extra={
                   'status_code': 404,
                   'request': request,
               })
...which seems to log something every time a request 404s.
This is clogging up my logs as (for instance) RSS bots crawl some old, non-working URLs on my site
I'd like to stop the logging noise, so I've tried something like the below in my LOGGING config in settings.py.
LOGGING = {
    ...
    'loggers': {
        ...
        'django.core.handlers': {
            'handlers': ['app_logs'],
            'propagate': False,
            'level': 'ERROR',
        },
    },
}
Here, I'm trying to quiet the logger.warning call by setting the log level of django.core.handlers to ERROR, but it doesn't seem to be working. Anyone know what to check or do?
Maybe I'm missing something obvious, or perhaps I'm flat out doing it wrong. Hmmm.
The correct logger name is django.request.
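So the snippet from the question should work once the logger name is swapped; a sketch, assuming the app_logs handler is defined elsewhere in your LOGGING dict:

```python
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    # ...
    'loggers': {
        # The 404 warning is emitted under 'django.request',
        # not 'django.core.handlers'.
        'django.request': {
            'handlers': ['app_logs'],
            'propagate': False,
            'level': 'ERROR',
        },
    },
}
```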
Django (the python web framework) uses python's logging system to store logs.
Is there an easy way to store log messages in a database, and then allow admin users to look over them via the web? It's the sort of thing I could write myself, but no point re-inventing the wheel. I don't want to log exceptions, but info/debug/notice type messages that I have added to the code.
Ideally I'd like to be able to store metadata about the log message as it's done (like the remote IP address, user agent, wsgi process id, etc.), and then filter / browse based on that (i.e. show me all log messages from this IP address in the last 24 hours). Has anyone done this?
Just use Sentry. Raven, its Django client, hooks into the logging framework, so in addition to collecting errors from your app, any custom log messages should show up as well.
Apart from the obvious choice of Sentry, for the sake of exercise, there is a nice blog article titled "Creating custom log handler that logs to database models in django" in the "James Lin Blog", which briefly explains how to do this using a second database, with code samples.
The code is adapted from the standard Python RotatingFileHandler:
...The RotatingFileHandler allows you to specify which file to write to and rotate files, therefore my DBHandler should also allow you to specify which model to insert to and specify expiry in settings, and best of all, on a standalone app.
This could also be easily adapted for using a single db.
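The idea reduces to subclassing logging.Handler and writing each record to a table in emit(). The sketch below uses stdlib sqlite3 so it runs standalone; in the blog post's version the insert goes through a Django model instead, and the table and column names here are illustrative:

```python
import datetime
import logging
import sqlite3

class SQLiteHandler(logging.Handler):
    """Minimal log-to-database handler; sqlite3 stands in for a Django model."""

    def __init__(self, db_path=':memory:'):
        super().__init__()
        self.conn = sqlite3.connect(db_path)
        self.conn.execute(
            'CREATE TABLE IF NOT EXISTS log_entries '
            '(created TEXT, level TEXT, logger TEXT, message TEXT)'
        )

    def emit(self, record):
        try:
            created = datetime.datetime.fromtimestamp(record.created).isoformat()
            self.conn.execute(
                'INSERT INTO log_entries VALUES (?, ?, ?, ?)',
                (created, record.levelname, record.name, record.getMessage()),
            )
            self.conn.commit()
        except Exception:
            self.handleError(record)
```

Attach it like any other handler, e.g. logging.getLogger('myapp').addHandler(SQLiteHandler('logs.db')), and then browse the table with whatever admin view or DB tool you like.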
Check out django-db-logger.
It takes less than a minute to integrate:
https://github.com/CiCiUi/django-db-logger
Try django-requests. I've tried it, and it basically just puts the request logs in a table called requests.
You can check a good solution that I posted here. You just need a connection string to connect to your database. For example, if you use MySQL, the connection string would be:
# mysqlclient
'mysql+mysqldb://username:password@host:port/database'
or
# PyMySQL
'mysql+pymysql://username:password@host:port/database'
Then you can use phpMyAdmin as a MySQL web administration tool to look over the database via a web browser, or DataGrip (my preference) to access any database remotely.
To use the handler in Django, you just need to add the handler class to the LOGGING variable in settings.py as follows:
level = 'INFO' if DEBUG else 'WARNING'  # I prefer INFO in debugging mode and WARNING in production
handler = ['log_db_handler']  # In production I rarely check the server to see console logs
if DEBUG:
    handler.append('console')

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'verbose': {
            'format': '{levelname} {message}',  # {asctime} {module} {process:d} {thread:d}
            'style': '{',
        },
    },
    'handlers': {
        'log_db_handler': {
            'level': level,
            'class': 'db_logger.handlers.DBHandler',
            'formatter': 'verbose',
        },
        'console': {
            'class': 'logging.StreamHandler',
        },
    },
    'loggers': {
        'db_log': {
            'handlers': handler,
            'level': level,
            'propagate': False,
        },
        'django': {
            'handlers': handler,
            'level': level,
            'propagate': True,
        },
        'django.request': {
            'handlers': handler,
            'level': level,
            'propagate': True,
        },
    },
}
Pay attention that 'db_logger.handlers.DBHandler' points to the handler class.