I know there is a very similar question, but that one is six years old and its answer doesn't help me. All I want to know is how to configure Django so that it logs to the console.
These are my settings:
import os

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'console': {
            'class': 'logging.StreamHandler',
        },
    },
    'loggers': {
        'django': {
            'handlers': ['console'],
            'level': os.getenv('DJANGO_LOG_LEVEL', 'INFO'),
        },
    },
}
And in my view I have this:
import logging

from django.views.generic import TemplateView

class Home(TemplateView):
    template_name = "inicio/magic_py.html"

    def get_context_data(self, **kwargs):
        logger = logging.getLogger("django")
        logger.debug("home!!!!!!")
        print("home?")
        return super().get_context_data(**kwargs)
The console doesn't show the logger.debug output; it only shows the print. What am I missing? Please help.
I use Django 1.10.
DEBUG is a lower log level than INFO, so your debug messages are being filtered out. You'll need to either lower the logger's level to DEBUG or log at logger.info() or higher.
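For example, here is a sketch of both fixes applied to the settings from the question (pick one):

import os

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'console': {'class': 'logging.StreamHandler'},
    },
    'loggers': {
        'django': {
            'handlers': ['console'],
            # Default to DEBUG so logger.debug() calls pass the threshold;
            # DJANGO_LOG_LEVEL can still raise it per environment.
            'level': os.getenv('DJANGO_LOG_LEVEL', 'DEBUG'),
        },
    },
}

# Alternatively, keep the INFO threshold and log at INFO or above in the view:
# logging.getLogger("django").info("home!!!!!!")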
I am trying to get the logs from my Django app into GCP Logging, but so far I have been unsuccessful.
Here is my logging config:
from google.cloud import logging as gcp_logging

client = gcp_logging.Client.from_service_account_json(
    json_credentials_path='logging_service_account.json')
client.setup_logging()
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'standard': {'format': '%(levelname)s : %(message)s - [in %(pathname)s:%(lineno)d]'},
        'short': {'format': '%(message)s'},
    },
    'handlers': {
        'stackdriver': {
            'formatter': 'standard',
            'class': 'google.cloud.logging.handlers.CloudLoggingHandler',
            'client': client,
        },
        'requestlogs_to_stdout': {
            'class': 'logging.StreamHandler',
            'filters': ['request_id_context'],
        },
    },
    'filters': {
        'request_id_context': {
            '()': 'requestlogs.logging.RequestIdContext',
        },
    },
    'loggers': {
        'StackDriverHandler': {
            'handlers': ['stackdriver'],
            'level': 'DEBUG',
        },
        'django.request': {
            'handlers': ['stackdriver'],
        },
        'requestlogs': {
            'handlers': ['requestlogs_to_stdout'],
            'level': 'INFO',
            'propagate': False,
        },
    },
}
I invoke the logger along these lines:
import logging

from django.http import JsonResponse
from rest_framework import generics
from rest_framework.response import Response

logger = logging.getLogger('StackDriverHandler')

class OrganisationDetail(generics.RetrieveUpdateDestroyAPIView):
    ...

    def patch(self, request, pk, format=None):
        try:
            ...
            if serializer.is_valid():
                serializer.save()
                logger.info(f"PATCH SUCCESSFUL: {serializer.data}")
                return Response(serializer.data)
            logger.warning(f"PATCH Failed: {serializer.errors}")
            return JsonResponse(serializer.errors, status=400)
        except Exception as e:
            logger.error(f"PATCH Failed with exception: {e}")
            return JsonResponse({'error': str(e)}, status=500)
In GCP, I set up a service account, enabled the Logging API, and gave the SA the write-logs and monitor-metrics permissions.
I then made a secret to hold my service account key, and in my cloud-build.yaml file I run a step like this:
- name: gcr.io/cloud-builders/gcloud
  entrypoint: 'bash'
  args: [ '-c', "gcloud secrets versions access latest --secret=<secret_name> --format='get(payload.data)' | tr '_-' '/+' | base64 -d > logging_service_account.json" ]
The above step should:
1. fetch the secret, and
2. write it to a JSON file in the app's container so that my settings.py can read it via gcp_logging.Client.from_service_account_json(json_credentials_path='logging_service_account.json').
Perhaps there is a more straightforward way to achieve this, but it feels like it should work. Any help would be much appreciated. Thanks.
After all the steps above, when I visit the Logging service in the GCP console, I only see the single entry saying my logging service account was created; none of the logs from my actual Django app appear.
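For what it's worth, a minimal standalone check can help isolate whether the client and credentials work at all, independently of the Django LOGGING dict. This is a sketch built only from the calls already shown above; sanity_check.py is a hypothetical script name:

# sanity_check.py -- hypothetical standalone script, run inside the container.
import logging

from google.cloud import logging as gcp_logging

client = gcp_logging.Client.from_service_account_json(
    json_credentials_path='logging_service_account.json')
client.setup_logging()  # attaches a Cloud Logging handler to the root logger

# If the key file and permissions are correct, this record should show up
# in GCP Logging shortly after the script runs.
logging.getLogger().info("smoke test from sanity_check.py")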
I've set up a very simple asynchronous view, but it's not working. As per the Django instructions, I want to check that it's not my middleware causing the issue. The Django docs say that the django.request logger will disclose which middleware Django has had to adapt for async; the quote from the official docs is below. I've set up the django.request logger and it logs any 4xx or 5xx errors (as expected), but that's all. Is this a mistake in the Django docs?
https://docs.djangoproject.com/en/4.0/topics/async/
You will only get the benefits of a fully-asynchronous request stack if you have no synchronous middleware loaded into your site. If there is a piece of synchronous middleware, then Django must use a thread per request to safely emulate a synchronous environment for it.
Middleware can be built to support both sync and async contexts. Some of Django’s middleware is built like this, but not all. To see what middleware Django has to adapt, you can turn on debug logging for the django.request logger and look for log messages about “Synchronous middleware … adapted”.
settings.py
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'console': {
            'level': 'DEBUG',
            'class': 'logging.StreamHandler',
        },
        'file': {
            'level': 'DEBUG',
            'class': 'logging.FileHandler',
            'filename': 'logs/debug.log',
        },
    },
    'root': {
        'handlers': ['console'],
        'level': 'WARNING',
    },
    'loggers': {
        'django.request': {
            # 'handlers': ['file', 'console'],
            'handlers': ['file'],
            'level': 'DEBUG',
            'propagate': True,
        },
    },
}
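For context, the view under test is roughly like this (a sketch; the post doesn't include the original code, so the name and body are assumptions):

# views.py -- a minimal async view of the kind the question describes.
import asyncio

from django.http import HttpResponse

async def async_home(request):
    await asyncio.sleep(0.1)  # stand-in for real async work
    return HttpResponse("hello from an async view")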
I am struggling with my Django logging configuration. I have one app called "api" and I want to save all logs from this app to a file. When I set the logger name to django everything works fine, but when I change it to my app name it doesn't.
Here is my configuration:
File structure:
email_api/
    api/
        tasks.py
    email_api/
        celery.py
        settings
    logs/
        email.log
My logging configuration:
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'file': {
            'level': 'DEBUG',
            'class': 'logging.FileHandler',
            'filename': 'logs/email.log',
        },
    },
    'loggers': {
        'api': {
            'handlers': ['file'],
            'level': 'DEBUG',
            'propagate': True,
        },
    },
}
And the tasks.py file where I log:
import logging

logger = logging.getLogger(__name__)

@app.task(bind=True, default_retry_delay=3, max_retries=3)
def send_email(self, data, email_id):
    message = create_message(data, email_id)
    try:
        logger.debug("Log Message Here")
        message.send()
    except Exception as exc:
        # the original snippet is truncated here; retrying is assumed
        raise self.retry(exc=exc)
The keys of the LOGGING['loggers'] dict are logger names. You have configured a logger named api.
In order to write to this logger, you should request it by that name:
logger = logging.getLogger('api')
...
logger.debug("Log Message Here")
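As an aside, logging.getLogger(__name__) inside api/tasks.py yields a logger named 'api.tasks', and with the standard logging module a child logger propagates its records up to the configured 'api' logger, as this sketch illustrates. If that spelling still writes nothing, something else (Celery's own logging setup, for instance) is probably interfering:

import logging

# In api/tasks.py, __name__ == 'api.tasks'. Its records propagate to the
# ancestor logger 'api', whose file handler is configured in LOGGING above,
# so this should also end up in logs/email.log.
logging.getLogger('api.tasks').debug("reaches the 'api' file handler too")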
I've configured my logging like so:
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'console': {
            'class': 'logging.StreamHandler',
        },
        'cute': {
            'class': 'logging.handlers.SocketHandler',
            'host': '127.0.0.1',
            'port': 19996,
        },
    },
    'loggers': {
        'django': {
            'handlers': ['cute'],
            'level': os.getenv('DJANGO_LOG_LEVEL', 'INFO'),
        },
    },
}
But when I try to log something, I get an error in the console:
TypeError: an integer is required (got type socket)
This seems to happen when the handler tries to pickle the log record.
What is going on, and how can I get the SocketHandler to work?
There is a bug report about this: the request object attached to the log record cannot be pickled, so the emit fails.
My loggers were like yours before I got the same error you did. Since my code contains apps that don't work with request, I partially fixed my problem by creating a logger for django.request without the socket handler:
'django.request': {
    'handlers': ['defaultfile', 'console'],
    'level': 'WARNING',
    'propagate': False,
},
However, the bug report also suggests creating a custom SocketHandler that strips the request:
from logging.handlers import SocketHandler as _SocketHandler

class DjangoSocketHandler(_SocketHandler):
    def emit(self, record):
        if hasattr(record, 'request'):
            record.request = None
        return super().emit(record)
I haven't tried this yet, but it could be a way to go.
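For completeness, wiring the custom handler into the config from the question would look something like this (a sketch; 'mysite.log_handlers' is a placeholder for wherever you put DjangoSocketHandler):

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'cute': {
            # placeholder dotted path to the DjangoSocketHandler class above
            'class': 'mysite.log_handlers.DjangoSocketHandler',
            'host': '127.0.0.1',
            'port': 19996,
        },
    },
    'loggers': {
        'django': {
            'handlers': ['cute'],
            'level': 'INFO',
        },
    },
}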
I'm new to Django and don't know much about SQL. I have made a model and it's working fine; I'm saving data through the Django admin with no issues. However, I have to save lots of data one by one. Is there a way to know what queries Django is creating when I click the save button in the Django admin? Then I could duplicate that query and attach my data to it.
I have installed django_debug_toolbar, but even that is not showing the SQL queries when I save the data. Is there a way to see the Django ORM's SQL queries WHILE saving the data? Can I log them to the terminal? I'm using PostgreSQL.
You have to configure either the PostgreSQL server or Django to log your queries.
postgresql.conf example (a threshold of 0 makes PostgreSQL log every statement):
log_min_duration_statement = 0
Django settings.py example:
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'simple': {
            'format': '%(asctime)s (%(process)d) %(levelname)s %(message)s',
        },
    },
    'handlers': {
        'console': {
            'class': 'logging.StreamHandler',
            'formatter': 'simple',
        },
        'file': {
            'class': 'logging.FileHandler',
            'filename': '/tmp/mycoolapp.log',
            'formatter': 'simple',
        },
    },
    'loggers': {
        # 'django.db.backends' is the logger that emits every executed SQL
        # statement at DEBUG level (only while settings.DEBUG is True).
        'django.db.backends': {
            'handlers': ['console', 'file'],
            'level': 'DEBUG',
        },
    },
}
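With that in place (and DEBUG = True, which django.db.backends logging requires), saving an object in the admin writes the underlying SQL to the console and the file. You can also inspect the queries afterwards, for example in python manage.py shell (MyModel and its field are hypothetical stand-ins for your own model):

from django.db import connection
from myapp.models import MyModel  # hypothetical app and model

MyModel.objects.create(name="example")  # hypothetical field
print(connection.queries[-1]['sql'])    # the INSERT that was just executed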