Redis + Docker + Django - Error 111 Connection Refused

I'm trying to use Redis as the broker for Celery in my Django project, which uses Docker Compose. I can't figure out what exactly I've done wrong: the console log messages tell me that Redis is running and accepting connections (and indeed, when I do docker ps, I can see the container running), but I still get an error about the connection being refused. I even did
docker exec -it <redis_container_name> redis-cli
ping
and saw that the response was PONG.
Here are the Celery settings in my settings.py:
BROKER_URL = 'redis://localhost:6379/0'
BROKER_TRANSPORT = 'redis'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ENABLE_UTC = True
CELERY_TIMEZONE = "UTC"
Here are the Redis container settings in my docker-compose.yml:
redis:
  image: redis
  ports:
    - "6379:6379"
I remembered to link the redis container with my web container as well. I can start up the server just fine, but I get the connection refused error when I try to upload anything to the site. What exactly is going wrong?
EDIT: I remembered to use VBoxManage to port forward such that I can go to my browser and access my site at localhost:8000, so it doesn't seem like I need to use the VM's IP instead of localhost for my settings.py.
EDIT 2: If I replace localhost in the settings with either the IP address of the docker-machine VM or the IP address of the Redis container, then I very quickly get a false success message on my website when I upload a file, but nothing actually gets uploaded. The underlying upload function, insertIntoDatabase(), uses Celery's delay().

I just had a similar problem after updating Celery from v3.1 to v4: according to this tutorial, BROKER_URL needs to be changed to CELERY_BROKER_URL in settings.py.
settings.py part
CELERY_BROKER_URL = 'redis://cache:6379/0'
CELERY_RESULT_BACKEND = 'redis://cache:6379/0'
docker-compose.yml part
version: '2'
services:
  web:
    container_name: django-container
    # ... other options ...
    depends_on:
      - cache
      - db
  cache:
    container_name: redis-container
    restart: always
    image: redis:latest
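For completeness: with Celery 4, these CELERY_-prefixed settings are usually picked up by a small celery.py module in the project package. A minimal sketch, assuming the Django project package is named core (the project name is not given in the question):

# core/celery.py - minimal Celery 4 wiring (sketch; "core" is a placeholder project name)
import os
from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "core.settings")

app = Celery("core")
# Load all CELERY_* settings from Django settings, stripping the prefix.
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()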

Is Django running in a separate container that is linked to the Redis container? If so, you should have some environment variables with the IP and port that Django should use to connect to the Redis container. Set BROKER_URL to use the Redis IP and port env vars and you should be in business. Ditto for CELERY_RESULT_BACKEND.
Reference docs for the env vars are here: Docker Compose docs
Here's some example code for how we use the automatically added env vars in one of our projects at OfferUp:
import os

BROKER_TRANSPORT = "redis"
# REDIS_PORT_6379_TCP_ADDR and REDIS_PORT_6379_TCP_PORT are injected automatically by Docker links
_REDIS_LOCATION = 'redis://{}:{}'.format(os.environ.get("REDIS_PORT_6379_TCP_ADDR"), os.environ.get("REDIS_PORT_6379_TCP_PORT"))
BROKER_URL = _REDIS_LOCATION + "/0"
CELERY_RESULT_BACKEND = _REDIS_LOCATION + "/1"
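Note that these REDIS_PORT_6379_TCP_* variables come from the legacy container links mechanism; on a current Compose network, the simpler equivalent is to address the Redis service by its service name. A sketch, assuming the service is named redis as in the question's compose file:

# Compose services resolve each other by service name on the shared network.
BROKER_URL = 'redis://redis:6379/0'
CELERY_RESULT_BACKEND = 'redis://redis:6379/1'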

Related

ReadOnlyError in Django application with Redis and Django Channels

I have a Django app using Django Channels and djangochannelsrestframework. It establishes a WebSocket connection with a ReactJS frontend. As the channel layer I use Redis, like this:
CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "channels_redis.core.RedisChannelLayer",
        "CONFIG": {
            "hosts": [("redis", 6379)],
        },
    },
}
Redis and Django run in Docker. My Redis docker-compose setup is:
redis:
  image: "redis:7.0.4-alpine"
  command: redis-server
  ports:
    - "6379:6379"
  networks:
    - nginx_network
When I run my app on the production server, everything works for 5-8 hours. But after that period, if the Django app tries to send a message over the WebSocket, it fails with the error:
ReadOnlyError at /admin/operations/operation/add/
READONLY You can't write against a read only replica.
Request Method: POST
Request URL: http://62.84.123.168/admin/operations/operation/add/
Django Version: 3.2.12
Exception Type: ReadOnlyError
Exception Value:
READONLY You can't write against a read only replica.
Exception Location: /usr/local/lib/python3.8/site-packages/channels_redis/core.py, line 673, in group_send
Python Executable: /usr/local/bin/python
Python Version: 3.8.13
Python Path:
['/opt/code',
'/usr/local/bin',
'/usr/local/lib/python38.zip',
'/usr/local/lib/python3.8',
'/usr/local/lib/python3.8/lib-dynload',
'/usr/local/lib/python3.8/site-packages']
Server time: Tue, 02 Aug 2022 08:23:18 +0300
I understand that it is somehow connected with Redis replication, but I have no idea why it fails after a period of time or how to fix it.
I had the same error; a possible solution follows.
Fix it by disabling the replica-read-only config. Add this command to your Redis service in docker-compose:

command: redis-server --appendonly yes --replica-read-only no

Then you can verify that replica-read-only is disabled by running config get replica-read-only in redis-cli; if the result is no, it was disabled successfully.
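Applied to the Redis service from the question above, the compose entry might look like this (a sketch; the image, port, and network are carried over unchanged):

redis:
  image: "redis:7.0.4-alpine"
  # --replica-read-only no stops Redis from rejecting writes if it flips to replica mode
  command: redis-server --appendonly yes --replica-read-only no
  ports:
    - "6379:6379"
  networks:
    - nginx_network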

Docker Kibana accessible on external host?

I've been trying to debug my kibana-elasticsearch service for a couple of days now.
I want to be able to access my Docker Kibana container's UI in a web browser from a separate host, but the service is not available.
It's only for exploring and testing, so I don't need any authentication on it for now. I've locked down the security group to trusted IP addresses.
Both the Kibana and Elasticsearch containers are running. I can access Kibana via localhost:5601.
After trawling through loads of posts and documentation, I know the issue is with binding the container to an IP address to make it accessible to external hosts.
My kibana.yml config file:
# Kibana is served by a back end server. This setting specifies the port to use.
#server.port: 5601
# Specifies the address to which the Kibana server will bind. IP addresses and host names are both valid values.
# The default is 'localhost', which usually means remote machines will not be able to connect.
# To allow connections from remote users, set this parameter to a non-loopback address.
server.host: "0.0.0.0"
My docker-compose file:
version: '3.6'
services:
  # creates a fluentd service with mountpoints
  fluentd:
    container_name: fluentd
    user: root
    build:
      context: .
    image: fluentd
    ports:
      - "9880:9880"
    volumes:
      # LOCAL_HOST_DIR:CONTAINER_DIR
      - /var/lib/docker/containers:/fluentd/log/containers # Example: reading docker logs
      - ./file:/fluentd/log/files/ # Example: reading logs from a file
      - ./configurations:/fluentd/etc/ # where the default config file is located
      # - ./logs:/output/ # Example: Fluentd will collect logs and store them here for demo
    logging:
      driver: "local"
  # This app sends logs to the Fluentd endpoint via HTTP
  http-myapp:
    container_name: http-myapp
    image: alpine
    volumes:
      - ./http:/app
    command: [/bin/sh, -c, "apk add --no-cache curl && chmod +x /app/http_app.sh && ./app/http_app.sh"]
  # writes test files to a local volume
  file-myapp:
    image: alpine
    container_name: log-generator
    # restart: always
    volumes:
      - ./file:/app
    command: [/bin/sh, -c, "chmod +x /app/app.sh && ./app/app.sh"]
  elasticsearch: # port 9200
    image: docker.elastic.co/elasticsearch/elasticsearch:7.13.1
    container_name: elasticsearch
    environment:
      - node.name=elasticsearch
      # - cluster.initial_master_nodes=elasticsearch
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xmx256m -Xms256m"
      - discovery.type=single-node
    volumes:
      - esdata:/usr/share/elasticsearch/data
    ulimits:
      memlock:
        soft: -1
        hard: -1
  kibana:
    image: docker.elastic.co/kibana/kibana:7.13.1
    container_name: kibana
    volumes:
      - ./kibana.yml:/usr/share/kibana/config/kibana.yml
    ports:
      - "5601:5601"
    environment:
      ELASTICSEARCH_URL: http://elasticsearch:9200
      ELASTICSEARCH_HOSTS: http://elasticsearch:9200
      SERVER_NAME: kibana
      # SERVER.HOST: "0.0.0.0"
volumes:
  esdata:
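One detail worth flagging in the compose file above: the Kibana image reads settings from environment variables with dots replaced by underscores (as SERVER_NAME already does), so the commented-out SERVER.HOST line would need to be SERVER_HOST to take effect. A sketch of that alternative, if you prefer the env var over mounting kibana.yml:

kibana:
  image: docker.elastic.co/kibana/kibana:7.13.1
  ports:
    - "5601:5601" # host_port:container_port; the host port must also be reachable from outside
  environment:
    ELASTICSEARCH_HOSTS: http://elasticsearch:9200
    SERVER_NAME: kibana
    SERVER_HOST: "0.0.0.0" # env-var form of server.host; dots become underscores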

Django + docker-compose + Celery + Redis - How to use Redis deployed on my own remote server?

I have a Django app deployed in Docker containers.
I have 3 config environments: dev, preprod and prod.
dev is my local environment (localhost); preprod and prod are remote Linux environments.
It works when using the "public" Redis server and standard config.
But I need to use our own Redis, deployed in a Docker container named redis_cont on a remote server (192.168.xx.xx).
I do not really know how to configure this, or whether it is even possible.
I would appreciate some help.
docker-compose
version: '3.7'
services:
  web:
    restart: always
    build:
      context: ./app
      dockerfile: Dockerfile.dev
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - ./app:/usr/src/app
    ports:
      - 8000:8000
    env_file:
      - ./.env.dev
    entrypoint: [ "/usr/src/app/entrypoint.dev.sh" ]
    depends_on:
      - redis
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/"]
      interval: 30s
      timeout: 10s
      retries: 50
  redis:
    container_name: redis_cont # <= container actually running on the remote Linux server
    image: "redis:alpine"
  celery:
    build:
      context: ./app
      dockerfile: Dockerfile.dev
    command: celery -A core worker -l info
    volumes:
      - ./app:/usr/src/app
    env_file:
      - ./.env.dev
    depends_on:
      - web
      - redis
  celery-beat:
    build:
      context: ./app
      dockerfile: Dockerfile.dev
    command: celery -A core beat -l info
    volumes:
      - ./app:/usr/src/app
    env_file:
      - ./.env.dev
    depends_on:
      - web
      - redis
settings.py
from celery.schedules import crontab

CELERY_BROKER_URL = 'redis://redis:6379'
CELERY_RESULT_BACKEND = 'redis://redis:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_BEAT_SCHEDULE = {
    'hello': {
        'task': 'project.tasks.hello',
        'schedule': crontab(),  # execute every minute
    },
}
Since the containers are not created by the same docker-compose project, they won't share the same network; redis_cont simply doesn't exist to the services built in the isolated network of your docker-compose.
If the Redis container is published on the remote server and is reachable at ip:port, you should be able to use that address directly in your settings.py. There is no need to add a new Redis service to your compose file.
Note
To establish a communication between services in the same docker-compose you should use the service name (web, celery-beat, etc in your case) and not the container name.
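A minimal sketch of that settings change, assuming the remote Redis is published on port 6379 at the 192.168.xx.xx address from the question (substitute the real IP, and make sure the remote compose file publishes the port with ports: - "6379:6379"):

# settings.py - point Celery at the remote Redis rather than a compose service name
CELERY_BROKER_URL = 'redis://192.168.xx.xx:6379/0'
CELERY_RESULT_BACKEND = 'redis://192.168.xx.xx:6379/0'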

Dockerize Django application with MongoDB

I am working on a Django web application that stores its data in a MongoDB database. When I run Docker using the docker-compose.yml file, it opens the login page and then gives a CSRF token error. These are the logs of the Django container:
pymongo.errors.ServerSelectionTimeoutError: localhost:27017: [Errno 111] Connection refused, Timeout: 30s, Topology Description: <TopologyDescription id: 61ad29e66ee4fa015775e4b9, topology_type: Single, servers: [<ServerDescription ('localhost', 27017) server_type: Unknown, rtt: None, error=AutoReconnect('localhost:27017: [Errno 111] Connection refused')>]>
[05/Dec/2021 21:13:23] "GET /dashboard/ HTTP/1.1" 500 94504
Content of docker-compose.yml file:
version: "3.7"
services:
mongodb_container:
image: mongo:latest
volumes:
- mongodb_data_container:/data/db
ports:
- 27017:27017
web:
build: .
command: python manage.py runserver 0.0.0.0:8000
volumes:
- django_data_container:/home/app/webapp
ports:
- "8000:8000"
links:
- mongodb_container
depends_on:
- mongodb_container
Can anyone tell me how I can make Django communicate with MongoDB using Docker?
From the error:
pymongo.errors.ServerSelectionTimeoutError: localhost:27017
we can tell that you are trying to connect to localhost on port 27017, but localhost only addresses the Django container itself. In order to connect to another container (MongoDB), change the connection string or connection configuration in the Django settings to point to that container's name. In your case, change localhost to mongodb_container.
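For example, with a plain pymongo client the change might look like this (a sketch; how the app actually builds its client depends on your settings, and "mydatabase" is a placeholder name):

from pymongo import MongoClient

# Inside the compose network, the service name resolves to the MongoDB container.
client = MongoClient("mongodb://mongodb_container:27017/")
db = client["mydatabase"]  # placeholder database name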

django-redis connection error inside docker

django views.py
import redis
import jwt
from access import utils
import os
from django.http import JsonResponse

redis_url = os.environ['REDIS_URI']
R = redis.StrictRedis(redis_url)

def set(request):
    R.set('foo', 'bar')
    return JsonResponse({"code": 200, "msg": "success"})
docker-compose
version: "3"
services:
rango:
container_name: rango
build: ./
command: python backend/manage.py runserver 0.0.0.0:8000
# command: npm start --prefix frontend/rango-frontend/
working_dir: /usr/src/rango
environment:
REDIS_URI: redis://redis_db:6379
ports:
- "8000:8000"
tty: true
links:
- elasticsearch
- node
- redis
#elastic search
elasticsearch:
image: docker.elastic.co/elasticsearch/elasticsearch:6.5.0
ports:
- "9200:9200"
#node
node:
image: node:10.13.0
#redis
redis:
image: redis
environment:
- ALLOW_EMPTY_PASSWORD=yes
ports:
- "6379:6379"
Here I am connecting to Redis from Django inside Docker, and it is raising connection refused exceptions. Please have a look at my code above.
By default, Docker Compose makes containers discoverable at a hostname identical to the service name. Your Redis service is thus discoverable via the hostname redis. However, your Django container is using the hostname redis_db.
Update your docker-compose.yml and change the REDIS_URI to reference the correct hostname:
REDIS_URI: redis://redis:6379
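As a side note, redis.StrictRedis() takes a host as its first argument, not a full URI, so the client in views.py would still be handed a URL where a hostname is expected; redis-py's from_url helper parses the URI form. A sketch of the corrected client setup, assuming the fixed REDIS_URI above:

import os
import redis

# from_url parses the redis:// URI into host, port and db for us.
R = redis.from_url(os.environ["REDIS_URI"])
R.set("foo", "bar")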