I am building a web application that requires some long-running tasks to run on AWS ECS, using Celery as a distributed task queue. The problem I am facing is that my Celery worker running on ECS is not receiving tasks from SQS, even though it appears to be connected to it.
Following are the logs from the ECS task.
/usr/local/lib/python3.8/site-packages/celery/platforms.py:797: RuntimeWarning: You're running the worker with superuser privileges: this is
absolutely not recommended!
Please specify a different user using the --uid option.
User information: uid=0 euid=0 gid=0 egid=0
warnings.warn(RuntimeWarning(ROOT_DISCOURAGED.format(
-------------- celery@ip-xxx-xxx-xxx-xxx.us-east-2.compute.internal v5.0.1 (singularity)
--- ***** -----
-- ******* ---- Linux-4.14.252-195.483.amzn2.x86_64-x86_64-with-glibc2.2.5 2021-12-14 06:39:58
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app: emptive_portal:0x7fbfda752310
- ** ---------- .> transport: sqs://XXXXXXXXXXXXXXXX:**@localhost//
- ** ---------- .> results: disabled://
- *** --- * --- .> concurrency: 2 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> emptive-celery2.fifo exchange=sync(direct) key=emptive-celery.fifo
[tasks]
. import_export_celery.tasks.run_export_job
. import_export_celery.tasks.run_import_job
[Errno 2] No such file or directory: 'seq_tokens/emptive-staging_web_sequence_token.txt'
[Errno 2] No such file or directory: 'seq_tokens/emptive-staging_web_sequence_token.txt'
2021-12-14 06:39:58 [INFO] Connected to sqs://XXXXXXXXXXXXXXXXX:**@localhost//
[Errno 2] No such file or directory: 'seq_tokens/emptive-staging_web_sequence_token.txt'
[Errno 2] No such file or directory: 'seq_tokens/emptive-staging_web_sequence_token.txt'
2021-12-14 06:39:58 [INFO] celery@ip-xxx-xxx-xxx-xxx.us-east-2.compute.internal ready.
To be noted, I have run the same container that I deployed to ECS locally, on the same machine as the Django webserver that sends the tasks. That Celery worker has no problem receiving tasks.
I have also tried giving ecsTaskExecutionRole full permissions on SQS, but that doesn't seem to affect anything. Any help would be appreciated.
EDIT: I forgot to include my Celery broker config from Django's settings.py:
AWS_ACCESS_KEY_ID = os.environ.get('AWS_ACCESS_KEY_ID')
AWS_SECRET_ACCESS_KEY = os.environ.get('AWS_SECRET_ACCESS_KEY')
# SQS CONFIG
CELERY_BROKER_URL = "sqs://%s:%s#" % (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
BROKER_TRANSPORT_OPTIONS = {
    'region': 'us-east-2',
    'polling_interval': 20,
}
CELERY_RESULT_BACKEND = None
CELERY_ENABLE_REMOTE_CONTROL = False
CELERY_SEND_EVENTS = False
So I finally fixed this. The issue was a silly oversight on my part: I just had to replace BROKER_TRANSPORT_OPTIONS with CELERY_BROKER_TRANSPORT_OPTIONS in the Celery config.
New config:
AWS_ACCESS_KEY_ID = os.environ.get('AWS_ACCESS_KEY_ID')
AWS_SECRET_ACCESS_KEY = os.environ.get('AWS_SECRET_ACCESS_KEY')
# SQS CONFIG
CELERY_BROKER_URL = "sqs://%s:%s#" % (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_BROKER_TRANSPORT_OPTIONS = {
    'region': 'us-east-2',
    'polling_interval': 20,
}
CELERY_RESULT_BACKEND = None
CELERY_ENABLE_REMOTE_CONTROL = False
CELERY_SEND_EVENTS = False
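For context on why the rename works: with the standard Django integration, the Celery app loads its configuration via config_from_object('django.conf:settings', namespace='CELERY'), and with that namespace only settings carrying the CELERY_ prefix are read. An unprefixed BROKER_TRANSPORT_OPTIONS is silently ignored, so the region and polling options never reached the worker. A minimal sketch of that bootstrap, assuming the project module is emptive_portal (as the worker banner suggests):

# celery.py - the standard Django/Celery bootstrap (a sketch; module name assumed)
import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'emptive_portal.settings')

app = Celery('emptive_portal')

# With namespace='CELERY', only CELERY_-prefixed settings are loaded:
# CELERY_BROKER_TRANSPORT_OPTIONS becomes broker_transport_options, while a
# bare BROKER_TRANSPORT_OPTIONS is ignored without any warning.
app.config_from_object('django.conf:settings', namespace='CELERY')

app.autodiscover_tasks()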
I have Django 3.2.7, Celery 5.2.1, redis 3.5.3.
I have the following Celery settings (REDIS_PASSWORD is an env variable):
CELERY_BROKER_URL = f'redis://:{REDIS_PASSWORD}@redis:6379/4'
CELERY_BROKER_TRANSPORT_OPTIONS = {'visibility_timeout': 3600}
CELERY_RESULT_BACKEND = f'redis://:{REDIS_PASSWORD}@redis:6379/1'
CELERY_ACCEPT_CONTENT = ['application/json']
But when I start my docker-compose app, it shows me:
celery | --- ***** -----
celery | -- ******* ---- Linux-5.11.0-34-generic-x86_64-with-glibc2.31 2021-09-16 10:20:11
celery | - *** --- * ---
celery | - ** ---------- [config]
celery | - ** ---------- .> app: project:0x7f5cd0df0880
celery | - ** ---------- .> transport: redis://redis:6379/0 <==== NO CHANGES HERE
celery | - ** ---------- .> results: redis://:**#redis:6379/1
celery | - *** --- * --- .> concurrency: 16 (prefork)
celery | -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
celery | --- ***** -----
How can I make the worker use my configured broker URL?
The problem was solved by removing the environment params from the docker-compose file.
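A likely explanation (an assumption, since the compose file isn't shown): an environment: entry in docker-compose was injecting a different broker URL that the settings code read instead of the hard-coded value, leaving the worker pointed at redis://redis:6379/0 from the environment rather than the /4 database in settings.py. When the banner disagrees with settings.py like this, it helps to print what Celery actually resolved. A quick check from a Django shell (hypothetical module path project.celery; adjust to your layout):

# Inspect the configuration the Celery app actually resolved.
from project.celery import app  # hypothetical import path

print(app.conf.broker_url)                 # what the worker will connect to
print(app.conf.broker_transport_options)   # e.g. {'visibility_timeout': 3600}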
I'm using Django with celery:
In celery.py I have:
app.conf.beat_schedule = {
    'send_notes_email': {
        'task': 'send_notes_email_async',
        'schedule': crontab(minute=3),
    },
}
The task is defined as:
@app.task(bind=True, name='send_notes_email_async', max_retries=3)
def send_notes_email_async(self):
    print('a')
    send_notes_email()
Celery is working and recognizes the task, but it doesn't trigger after 3 minutes.
There are also no errors.
Redis reports keys.
Celery tasks work fine when run without the schedule.
-------------- celery@WIN-U5VDSO5V1CK v4.2.1 (windowlicker)
---- **** -----
--- * *** * -- Windows-7-6.1.7601-SP1
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app: PH:0x3d2d128
- ** ---------- .> transport: redis://localhost:6379//
- ** ---------- .> results: redis://localhost:6379/
- *** --- * --- .> concurrency: 2 (eventlet)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
[tasks]
. send_notes_email_async
Your beat schedule configuration is wrong: crontab(minute=3) means "execute the task at minute 3 of every hour" (01:03, 02:03, 03:03, 04:03, ...).
This is what you want, which executes the task every 3 minutes:
app.conf.beat_schedule = {
    'send_notes_email': {
        'task': 'send_notes_email_async',
        'schedule': crontab(minute='*/3'),
    },
}
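Also remember that the schedule only fires if a beat process is running alongside the worker (celery beat, or the worker's -B flag). And if the intent is simply "every 3 minutes", a plain timedelta works too and avoids the crontab quoting. A minimal sketch:

# Equivalent schedule using a timedelta instead of a crontab expression.
from datetime import timedelta

app.conf.beat_schedule = {
    'send_notes_email': {
        'task': 'send_notes_email_async',
        'schedule': timedelta(minutes=3),  # a plain float in seconds (180.0) also works
    },
}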
I'm having trouble starting a Celery worker on an Elastic Beanstalk instance. After a couple of seconds, it just quits unexpectedly with no error. The instance should have enough RAM. I'm attaching the worker's output at log level debug (sensitive information replaced by **). Any guidance would be super helpful. Thanks.
(venv) [ec2-user@ip-** app]$ celery -A app worker -l debug
[12/Mar/2018 10:18:29] DEBUG [raven.contrib.django.client.DjangoClient:265] Configuring Raven for host: <raven.conf.remote.RemoteConfig object at 0x7f556309e940>
-------------- celery@ip-** v4.1.0 (latentcall)
---- **** -----
--- * *** * -- Linux-4.9.75-25.55.amzn1.x86_64-x86_64-with-glibc2.3.4 2018-03-12 10:18:29
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app: app:0x7f556998edd8
- ** ---------- .> transport: sqs://**:**@localhost//
- ** ---------- .> results:
- *** --- * --- .> concurrency: 1 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
[tasks]
. celery.accumulate
. celery.backend_cleanup
. celery.chain
. celery.chord
. celery.chord_unlock
. celery.chunks
. celery.group
. celery.map
. celery.starmap
. common.tasks.send_templated_email
. orders.tasks.import_orders_from_all_companies
[2018-03-12 10:18:29,783: DEBUG/MainProcess] Setting config variable for region to 'eu-central-1'
[2018-03-12 10:18:29,784: DEBUG/MainProcess] Loading variable profile from defaults.
[2018-03-12 10:18:29,784: DEBUG/MainProcess] Loading variable config_file from defaults.
[2018-03-12 10:18:29,784: DEBUG/MainProcess] Loading variable credentials_file from defaults.
[2018-03-12 10:18:29,784: DEBUG/MainProcess] Loading variable data_path from defaults.
[2018-03-12 10:18:29,785: DEBUG/MainProcess] Loading variable region from instance vars with value 'eu-central-1'.
[2018-03-12 10:18:29,785: DEBUG/MainProcess] Loading variable profile from defaults.
[2018-03-12 10:18:29,785: DEBUG/MainProcess] Loading variable ca_bundle from defaults.
[2018-03-12 10:18:29,786: DEBUG/MainProcess] Loading variable profile from defaults.
[2018-03-12 10:18:29,786: DEBUG/MainProcess] Loading variable api_versions from defaults.
[2018-03-12 10:18:29,786: DEBUG/MainProcess] Loading JSON file: /opt/python/run/venv/local/lib/python3.6/site-packages/botocore/data/endpoints.json
[2018-03-12 10:18:29,790: DEBUG/MainProcess] Loading variable profile from defaults.
[2018-03-12 10:18:29,790: DEBUG/MainProcess] Event choose-service-name: calling handler <function handle_service_name_alias at 0x7f55633fe048>
[2018-03-12 10:18:29,793: DEBUG/MainProcess] Loading JSON file: /opt/python/run/venv/local/lib/python3.6/site-packages/botocore/data/sqs/2012-11-05/service-2.json
[2018-03-12 10:18:29,796: DEBUG/MainProcess] Event creating-client-class.sqs: calling handler <function add_generate_presigned_url at 0x7f5563448d90>
[2018-03-12 10:18:29,797: DEBUG/MainProcess] The s3 config key is not a dictionary type, ignoring its value of: None
[2018-03-12 10:18:29,802: DEBUG/MainProcess] Setting sqs timeout as (60, 60)
[2018-03-12 10:18:29,802: DEBUG/MainProcess] Loading JSON file: /opt/python/run/venv/local/lib/python3.6/site-packages/botocore/data/_retry.json
[2018-03-12 10:18:29,803: DEBUG/MainProcess] Registering retry handlers for service: sqs
[2018-03-12 10:18:29,803: DEBUG/MainProcess] Event before-parameter-build.sqs.ListQueues: calling handler <function generate_idempotent_uuid at 0x7f5563406378>
[2018-03-12 10:18:29,804: DEBUG/MainProcess] Making request for OperationModel(name=ListQueues) (verify_ssl=True) with params: {'url_path': '/', 'query_string': '', 'method': 'POST', 'headers': {'Content-Type': 'application/x-www-form-urlencoded; charset=utf-8', 'User-Agent': 'Boto3/1.5.16 Python/3.6.2 Linux/4.9.75-25.55.amzn1.x86_64 Botocore/1.8.30'}, 'body': {'Action': 'ListQueues', 'Version': '2012-11-05', 'QueueNamePrefix': ''}, 'url': 'https://eu-central-1.queue.amazonaws.com/', 'context': {'client_region': 'eu-central-1', 'client_config': <botocore.config.Config object at 0x7f5562069f28>, 'has_streaming_input': False, 'auth_type': None}}
[2018-03-12 10:18:29,804: DEBUG/MainProcess] Event request-created.sqs.ListQueues: calling handler <bound method RequestSigner.handler of <botocore.signers.RequestSigner object at 0x7f5562069cf8>>
[2018-03-12 10:18:29,804: DEBUG/MainProcess] Event choose-signer.sqs.ListQueues: calling handler <function set_operation_specific_signer at 0x7f5563406268>
[2018-03-12 10:18:29,805: DEBUG/MainProcess] Calculating signature using v4 auth.
[2018-03-12 10:18:29,805: DEBUG/MainProcess] CanonicalRequest:
POST
/
content-type:application/x-www-form-urlencoded; charset=utf-8
host:eu-central-1.queue.amazonaws.com
x-amz-date:20180312T091829Z
content-type;host;x-amz-date
**
[2018-03-12 10:18:29,805: DEBUG/MainProcess] StringToSign:
AWS4-HMAC-SHA256
20180312T091829Z
20180312/eu-central-1/sqs/aws4_request
**
[2018-03-12 10:18:29,806: DEBUG/MainProcess] Signature:
**
[2018-03-12 10:18:29,806: DEBUG/MainProcess] Sending http request: <PreparedRequest [POST]>
[2018-03-12 10:18:29,807: INFO/MainProcess] Starting new HTTPS connection (1): eu-central-1.queue.amazonaws.com
[2018-03-12 10:18:29,839: DEBUG/MainProcess] "POST / HTTP/1.1" 200 409
[2018-03-12 10:18:29,840: DEBUG/MainProcess] Response headers: {'server': 'Server', 'date': 'Mon, 12 Mar 2018 09:18:29 GMT', 'content-type': 'text/xml', 'content-length': '409', 'connection': 'keep-alive', 'x-amzn-requestid': '**'}
[2018-03-12 10:18:29,840: DEBUG/MainProcess] Response body:
b'<?xml version="1.0"?><ListQueuesResponse xmlns="http://queue.amazonaws.com/doc/2012-11-05/"><ListQueuesResult><QueueUrl>**</QueueUrl><QueueUrl>**</QueueUrl></ListQueuesResult><ResponseMetadata><RequestId>**</RequestId></ResponseMetadata></ListQueuesResponse>'
[2018-03-12 10:18:29,841: DEBUG/MainProcess] Event needs-retry.sqs.ListQueues: calling handler <botocore.retryhandler.RetryHandler object at 0x7f556201c390>
[2018-03-12 10:18:29,841: DEBUG/MainProcess] No retry needed.
[2018-03-12 10:18:29,841: INFO/MainProcess] Connected to sqs://**:**@localhost//
[2018-03-12 10:18:29,850: DEBUG/MainProcess] Setting config variable for region to 'eu-central-1'
[2018-03-12 10:18:29,850: DEBUG/MainProcess] Loading variable profile from defaults.
[2018-03-12 10:18:29,851: DEBUG/MainProcess] Loading variable config_file from defaults.
[2018-03-12 10:18:29,851: DEBUG/MainProcess] Loading variable credentials_file from defaults.
[2018-03-12 10:18:29,851: DEBUG/MainProcess] Loading variable data_path from defaults.
[2018-03-12 10:18:29,852: DEBUG/MainProcess] Loading variable region from instance vars with value 'eu-central-1'.
[2018-03-12 10:18:29,852: DEBUG/MainProcess] Loading variable profile from defaults.
[2018-03-12 10:18:29,852: DEBUG/MainProcess] Loading variable ca_bundle from defaults.
[2018-03-12 10:18:29,852: DEBUG/MainProcess] Loading variable profile from defaults.
[2018-03-12 10:18:29,852: DEBUG/MainProcess] Loading variable api_versions from defaults.
[2018-03-12 10:18:29,853: DEBUG/MainProcess] Loading JSON file: /opt/python/run/venv/local/lib/python3.6/site-packages/botocore/data/endpoints.json
[2018-03-12 10:18:29,857: DEBUG/MainProcess] Loading variable profile from defaults.
[2018-03-12 10:18:29,857: DEBUG/MainProcess] Event choose-service-name: calling handler <function handle_service_name_alias at 0x7f55633fe048>
[2018-03-12 10:18:29,861: DEBUG/MainProcess] Loading JSON file: /opt/python/run/venv/local/lib/python3.6/site-packages/botocore/data/sqs/2012-11-05/service-2.json
[2018-03-12 10:18:29,863: DEBUG/MainProcess] Event creating-client-class.sqs: calling handler <function add_generate_presigned_url at 0x7f5563448d90>
[2018-03-12 10:18:29,863: DEBUG/MainProcess] The s3 config key is not a dictionary type, ignoring its value of: None
[2018-03-12 10:18:29,865: DEBUG/MainProcess] Setting sqs timeout as (60, 60)
[2018-03-12 10:18:29,865: DEBUG/MainProcess] Loading JSON file: /opt/python/run/venv/local/lib/python3.6/site-packages/botocore/data/_retry.json
[2018-03-12 10:18:29,866: DEBUG/MainProcess] Registering retry handlers for service: sqs
[2018-03-12 10:18:29,866: DEBUG/MainProcess] Event before-parameter-build.sqs.ListQueues: calling handler <function generate_idempotent_uuid at 0x7f5563406378>
[2018-03-12 10:18:29,866: DEBUG/MainProcess] Making request for OperationModel(name=ListQueues) (verify_ssl=True) with params: {'url_path': '/', 'query_string': '', 'method': 'POST', 'headers': {'Content-Type': 'application/x-www-form-urlencoded; charset=utf-8', 'User-Agent': 'Boto3/1.5.16 Python/3.6.2 Linux/4.9.75-25.55.amzn1.x86_64 Botocore/1.8.30'}, 'body': {'Action': 'ListQueues', 'Version': '2012-11-05', 'QueueNamePrefix': ''}, 'url': 'https://eu-central-1.queue.amazonaws.com/', 'context': {'client_region': 'eu-central-1', 'client_config': <botocore.config.Config object at 0x7f5561acfbe0>, 'has_streaming_input': False, 'auth_type': None}}
[2018-03-12 10:18:29,867: DEBUG/MainProcess] Event request-created.sqs.ListQueues: calling handler <bound method RequestSigner.handler of <botocore.signers.RequestSigner object at 0x7f5561acfb38>>
[2018-03-12 10:18:29,867: DEBUG/MainProcess] Event choose-signer.sqs.ListQueues: calling handler <function set_operation_specific_signer at 0x7f5563406268>
[2018-03-12 10:18:29,867: DEBUG/MainProcess] Calculating signature using v4 auth.
[2018-03-12 10:18:29,868: DEBUG/MainProcess] CanonicalRequest:
POST
/
content-type:application/x-www-form-urlencoded; charset=utf-8
host:eu-central-1.queue.amazonaws.com
x-amz-date:20180312T091829Z
content-type;host;x-amz-date
**
[2018-03-12 10:18:29,868: DEBUG/MainProcess] StringToSign:
AWS4-HMAC-SHA256
20180312T091829Z
20180312/eu-central-1/sqs/aws4_request
**
[2018-03-12 10:18:29,868: DEBUG/MainProcess] Signature:
**
[2018-03-12 10:18:29,869: DEBUG/MainProcess] Sending http request: <PreparedRequest [POST]>
[2018-03-12 10:18:29,869: INFO/MainProcess] Starting new HTTPS connection (1): eu-central-1.queue.amazonaws.com
[2018-03-12 10:18:29,895: DEBUG/MainProcess] "POST / HTTP/1.1" 200 409
[2018-03-12 10:18:29,895: DEBUG/MainProcess] Response headers: {'server': 'Server', 'date': 'Mon, 12 Mar 2018 09:18:29 GMT', 'content-type': 'text/xml', 'content-length': '409', 'connection': 'keep-alive', 'x-amzn-requestid': '5b132733-22e0-5567-8762-74136ac526ec'}
[2018-03-12 10:18:29,896: DEBUG/MainProcess] Response body:
b'<?xml version="1.0"?><ListQueuesResponse xmlns="http://queue.amazonaws.com/doc/2012-11-05/"><ListQueuesResult><QueueUrl>**</QueueUrl><QueueUrl>**</QueueUrl></ListQueuesResult><ResponseMetadata><RequestId>5b132733-22e0-5567-8762-74136ac526ec</RequestId></ResponseMetadata></ListQueuesResponse>'
[2018-03-12 10:18:29,896: DEBUG/MainProcess] Event needs-retry.sqs.ListQueues: calling handler <botocore.retryhandler.RetryHandler object at 0x7f5561a763c8>
[2018-03-12 10:18:29,896: DEBUG/MainProcess] No retry needed.
[2018-03-12 10:18:29,899: DEBUG/MainProcess] Event before-parameter-build.sqs.GetQueueAttributes: calling handler <function generate_idempotent_uuid at 0x7f5563406378>
[2018-03-12 10:18:29,900: DEBUG/MainProcess] Making request for OperationModel(name=GetQueueAttributes) (verify_ssl=True) with params: {'url_path': '/', 'query_string': '', 'method': 'POST', 'headers': {'Content-Type': 'application/x-www-form-urlencoded; charset=utf-8', 'User-Agent': 'Boto3/1.5.16 Python/3.6.2 Linux/4.9.75-25.55.amzn1.x86_64 Botocore/1.8.30'}, 'body': {'Action': 'GetQueueAttributes', 'Version': '2012-11-05', 'QueueUrl': '**', 'AttributeName.1': 'ApproximateNumberOfMessages'}, 'url': 'https://eu-central-1.queue.amazonaws.com/', 'context': {'client_region': 'eu-central-1', 'client_config': <botocore.config.Config object at 0x7f5562069f28>, 'has_streaming_input': False, 'auth_type': None}}
[2018-03-12 10:18:29,900: DEBUG/MainProcess] Event request-created.sqs.GetQueueAttributes: calling handler <bound method RequestSigner.handler of <botocore.signers.RequestSigner object at 0x7f5562069cf8>>
[2018-03-12 10:18:29,900: DEBUG/MainProcess] Event choose-signer.sqs.GetQueueAttributes: calling handler <function set_operation_specific_signer at 0x7f5563406268>
[2018-03-12 10:18:29,901: DEBUG/MainProcess] Calculating signature using v4 auth.
[2018-03-12 10:18:29,901: DEBUG/MainProcess] CanonicalRequest:
POST
/
content-type:application/x-www-form-urlencoded; charset=utf-8
host:eu-central-1.queue.amazonaws.com
x-amz-date:20180312T091829Z
content-type;host;x-amz-date
**
[2018-03-12 10:18:29,901: DEBUG/MainProcess] StringToSign:
AWS4-HMAC-SHA256
20180312T091829Z
20180312/eu-central-1/sqs/aws4_request
**
[2018-03-12 10:18:29,901: DEBUG/MainProcess] Signature:
9fb0d1ad68b5d25bf148cc11857b1e1083418557229ca2c47e8b525b54880b74
[2018-03-12 10:18:29,902: DEBUG/MainProcess] Sending http request: <PreparedRequest [POST]>
[2018-03-12 10:18:29,910: DEBUG/MainProcess] "POST / HTTP/1.1" 200 357
[2018-03-12 10:18:29,911: DEBUG/MainProcess] Response headers: {'server': 'Server', 'date': 'Mon, 12 Mar 2018 09:18:29 GMT', 'content-type': 'text/xml', 'content-length': '357', 'connection': 'keep-alive', 'x-amzn-requestid': '9aa38f1d-25e5-576d-a9a9-dc3d6dc029a0'}
[2018-03-12 10:18:29,911: DEBUG/MainProcess] Response body:
b'<?xml version="1.0"?><GetQueueAttributesResponse xmlns="http://queue.amazonaws.com/doc/2012-11-05/"><GetQueueAttributesResult><Attribute><Name>ApproximateNumberOfMessages</Name><Value>0</Value></Attribute></GetQueueAttributesResult><ResponseMetadata><RequestId>**</RequestId></ResponseMetadata></GetQueueAttributesResponse>'
[2018-03-12 10:18:29,912: DEBUG/MainProcess] Event needs-retry.sqs.GetQueueAttributes: calling handler <botocore.retryhandler.RetryHandler object at 0x7f556201c390>
[2018-03-12 10:18:29,912: DEBUG/MainProcess] No retry needed.
[2018-03-12 10:18:29,921: DEBUG/MainProcess] Canceling task consumer...
[2018-03-12 10:18:30,926: DEBUG/MainProcess] Canceling task consumer...
[2018-03-12 10:18:30,926: DEBUG/MainProcess] Closing consumer channel...
[2018-03-12 10:18:30,926: DEBUG/MainProcess] removing tasks from inqueue until task handler finished
I have solved the issue. The Amazon instance required PyCurl and some additional packages to connect properly to SQS.
This assumes you already have a script file that runs the celery daemon (run_supervised_celeryd.sh).
Here is my EB config file:
packages:
  yum:
    libjpeg-turbo-devel: []
    libpng-devel: []
    libcurl-devel: []

container_commands:
  01_migrate:
    command: "django-admin.py migrate --noinput"
    leader_only: true
  02_collectstatic:
    command: "python manage.py collectstatic --noinput"
  03_pycurl:
    command: 'source /opt/python/run/venv/bin/activate && pip3 install /usr/local/share/pycurl-7.43.0.tar.gz --global-option="--with-nss" --upgrade'
  04_celery_tasks_run:
    command: "/opt/elasticbeanstalk/hooks/appdeploy/post/run_supervised_celeryd.sh"
    leader_only: true

files:
  "/usr/local/share/pycurl-7.43.0.tar.gz":
    mode: "000644"
    owner: root
    group: root
    source: https://pypi.python.org/packages/source/p/pycurl/pycurl-7.43.0.tar.gz
Also add the env variable PYCURL_SSL_LIBRARY="nss".
All of the listed settings together solved the issue.
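One way to confirm the PyCurl build picked up the intended TLS backend: pycurl embeds the name of the SSL library it was compiled against in its version string. A quick check, run inside the same virtualenv (a sketch):

# pycurl.version names the compiled-in TLS backend, e.g. "... NSS/3.28.3 ..."
import pycurl

print(pycurl.version)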
I'm trying to get Django working with django-celery. I created my project in a virtualenv, used the template https://github.com/xenith/django-base-template, and have the following versions:
Django==1.6.5
celery==3.1.11
django-celery==3.1.10
My Celery settings in settings/local.py:
import djcelery
djcelery.setup_loader()
BROKER_URL = 'amqp://django:pas****@10.0.1.17:5672/myvhost'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
CELERY_RESULT_BACKEND='djcelery.backends.database:DatabaseBackend'
CELERY_ALWAYS_EAGER = True
I have one periodic task, "Update_All_Feeds". When I start celery beat, everything seems to work fine and the task is executed every 10 seconds:
python manage.py celery beat
celery beat v3.1.11 (Cipater) is starting.
__ - ... __ - _
Configuration ->
. broker -> amqp://django@10.0.1.17:5672/myvhost
. loader -> djcelery.loaders.DjangoLoader
. scheduler -> djcelery.schedulers.DatabaseScheduler
. logfile -> [stderr]#%INFO
. maxinterval -> now (0s)
[2014-05-31 20:53:16,544: INFO/MainProcess] beat: Starting...
[2014-05-31 20:53:16,544: INFO/MainProcess] Writing entries...
[2014-05-31 20:53:16,669: INFO/MainProcess] Scheduler: Sending due task Update_All_Feeds (update_all_feeds)
[2014-05-31 20:53:19,031: WARNING/MainProcess] /home/phosting/python/django/polskifeed/local/lib/python2.7/site-packages/django/db/models/fields/__init__.py:903: RuntimeWarning: DateTimeField FeedItem.pub_date received a naive datetime (2014-05-31 19:21:49) while time zone support is active.
RuntimeWarning)
[2014-05-31 20:53:19,081: INFO/MainProcess] Writing entries...
[2014-05-31 20:53:26,675: INFO/MainProcess] Scheduler: Sending due task Update_All_Feeds (update_all_feeds)
[2014-05-31 20:53:36,682: INFO/MainProcess] Scheduler: Sending due task Update_All_Feeds (update_all_feeds)
[2014-05-31 20:53:46,688: INFO/MainProcess] Scheduler: Sending due task Update_All_Feeds (update_all_feeds)
[2014-05-31 20:53:56,695: INFO/MainProcess] Scheduler: Sending due task Update_All_Feeds (update_all_feeds)
But starting this with celeryd does nothing:
python manage.py celeryd -l DEBUG
-------------- celery@czterykaty v3.1.11 (Cipater)
---- **** -----
--- * *** * -- Linux-2.6.32-bpo.5-xen-amd64-x86_64-with-debian-7.0
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app: default:0x13bb510 (djcelery.loaders.DjangoLoader)
- ** ---------- .> transport: amqp://django@10.0.1.17:5672/myvhost
- ** ---------- .> results: djcelery.backends.database:DatabaseBackend
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ----
--- ***** ----- [queues]
-------------- .> celery exchange=celery(direct) key=celery
[tasks]
. celery.backend_cleanup
. celery.chain
. celery.chord
. celery.chord_unlock
. celery.chunks
. celery.group
. celery.map
. celery.starmap
. update_all_feeds
. update_feeds
[2014-05-31 20:58:55,353: DEBUG/MainProcess] | Worker: Starting Hub
[2014-05-31 20:58:55,354: DEBUG/MainProcess] ^-- substep ok
[2014-05-31 20:58:55,354: DEBUG/MainProcess] | Worker: Starting Pool
[2014-05-31 20:58:55,672: DEBUG/MainProcess] ^-- substep ok
[2014-05-31 20:58:55,675: DEBUG/MainProcess] | Worker: Starting Consumer
[2014-05-31 20:58:55,676: DEBUG/MainProcess] | Consumer: Starting Connection
[2014-05-31 20:58:55,739: DEBUG/MainProcess] Start from server, version: 0.9, properties: {u'information': u'Licensed under the MPL. See http://www.rabbitmq.com/', u'product': u'RabbitMQ', u'copyright': u'Copyright (C) 2007-2014 GoPivotal, Inc.', u'capabilities': {u'exchange_exchange_bindings': True, u'connection.blocked': True, u'authentication_failure_close': True, u'basic.nack': True, u'per_consumer_qos': True, u'consumer_priorities': True, u'consumer_cancel_notify': True, u'publisher_confirms': True}, u'cluster_name': u'rabbit@czterykaty.luser.nl', u'platform': u'Erlang/OTP', u'version': u'3.3.1'}, mechanisms: [u'PLAIN', u'AMQPLAIN'], locales: [u'en_US']
[2014-05-31 20:58:55,741: DEBUG/MainProcess] Open OK!
[2014-05-31 20:58:55,744: INFO/MainProcess] Connected to amqp://django@10.0.1.17:5672/myvhost
[2014-05-31 20:58:55,744: DEBUG/MainProcess] ^-- substep ok
[2014-05-31 20:58:55,744: DEBUG/MainProcess] | Consumer: Starting Events
[2014-05-31 20:58:55,791: DEBUG/MainProcess] Start from server, version: 0.9, properties: {u'information': u'Licensed under the MPL. See http://www.rabbitmq.com/', u'product': u'RabbitMQ', u'copyright': u'Copyright (C) 2007-2014 GoPivotal, Inc.', u'capabilities': {u'exchange_exchange_bindings': True, u'connection.blocked': True, u'authentication_failure_close': True, u'basic.nack': True, u'per_consumer_qos': True, u'consumer_priorities': True, u'consumer_cancel_notify': True, u'publisher_confirms': True}, u'cluster_name': u'rabbit@czterykaty.luser.nl', u'platform': u'Erlang/OTP', u'version': u'3.3.1'}, mechanisms: [u'PLAIN', u'AMQPLAIN'], locales: [u'en_US']
[2014-05-31 20:58:55,795: DEBUG/MainProcess] Open OK!
[2014-05-31 20:58:55,797: DEBUG/MainProcess] using channel_id: 1
[2014-05-31 20:58:55,800: DEBUG/MainProcess] Channel open
[2014-05-31 20:58:55,802: DEBUG/MainProcess] ^-- substep ok
[2014-05-31 20:58:55,802: DEBUG/MainProcess] | Consumer: Starting Mingle
[2014-05-31 20:58:55,803: INFO/MainProcess] mingle: searching for neighbors
[2014-05-31 20:58:55,805: DEBUG/MainProcess] using channel_id: 1
[2014-05-31 20:58:55,807: DEBUG/MainProcess] Channel open
[2014-05-31 20:58:57,266: INFO/MainProcess] mingle: all alone
[2014-05-31 20:58:57,266: DEBUG/MainProcess] ^-- substep ok
[2014-05-31 20:58:57,267: DEBUG/MainProcess] | Consumer: Starting Gossip
[2014-05-31 20:58:57,268: DEBUG/MainProcess] using channel_id: 2
[2014-05-31 20:58:57,270: DEBUG/MainProcess] Channel open
[2014-05-31 20:58:57,282: DEBUG/MainProcess] ^-- substep ok
[2014-05-31 20:58:57,282: DEBUG/MainProcess] | Consumer: Starting Heart
[2014-05-31 20:58:57,285: DEBUG/MainProcess] ^-- substep ok
[2014-05-31 20:58:57,286: DEBUG/MainProcess] | Consumer: Starting Tasks
[2014-05-31 20:58:57,299: DEBUG/MainProcess] ^-- substep ok
[2014-05-31 20:58:57,299: DEBUG/MainProcess] | Consumer: Starting Control
[2014-05-31 20:58:57,300: DEBUG/MainProcess] using channel_id: 3
[2014-05-31 20:58:57,303: DEBUG/MainProcess] Channel open
[2014-05-31 20:58:57,311: DEBUG/MainProcess] ^-- substep ok
[2014-05-31 20:58:57,311: DEBUG/MainProcess] | Consumer: Starting event loop
[2014-05-31 20:58:57,315: WARNING/MainProcess] celery@czterykaty ready.
[2014-05-31 20:58:57,316: DEBUG/MainProcess] | Worker: Hub.register Pool...
[2014-05-31 20:58:57,317: DEBUG/MainProcess] basic.qos: prefetch_count->16
Periodic tasks are set up through the djcelery admin interface, and the task list there is empty. This is my first experience with celery and django-celery, so I'm not sure what is wrong.
I managed to get this working.
I adjusted my settings in settings/local.py to the following. Note that the old config also set CELERY_ALWAYS_EAGER = True, which makes apply_async execute tasks synchronously in the calling process instead of publishing them to the broker, and is likely why the celeryd worker never received anything:
import djcelery
djcelery.setup_loader()
BROKER_URL = 'amqp://django:django123@10.0.1.17:5672/myvhost'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
I started celeryd with the -E and -B flags:
python manage.py celeryd -l INFO -E -B
And for monitoring events, it was necessary to start celerycam:
python manage.py celerycam