My task:

@shared_task
def my_test():
    # Scheduled below to run every 10 seconds
    UserStatisticStatus.objects.filter(id=1).update(loot_boxes=+234)
    print('Hello from celery')

app.conf.beat_schedule = {
    'my-task-every-10-seconds': {
        'task': 'user_statistic_status.tasks.my_test',
        'schedule': timedelta(seconds=10)
    }
}
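As an aside, update(loot_boxes=+234) assigns the literal value 234 (the unary plus is a no-op); it does not increment. If the intent is to add 234 to the stored value, here is a minimal sketch using Django's F() expression (the model import path is assumed from the task module name):

from django.db.models import F

from user_statistic_status.models import UserStatisticStatus  # assumed location

# Atomically add 234 to the current value instead of overwriting it.
UserStatisticStatus.objects.filter(id=1).update(loot_boxes=F('loot_boxes') + 234)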
My settings:

if 'RDS_DB_NAME' in os.environ:
    CELERY_BROKER_URL = 'redis://<myurl>/0'
    CELERY_ACCEPT_CONTENT = ['application/json']
    CELERY_RESULT_SERIALIZER = 'json'
    CELERY_TASK_SERIALIZER = 'json'
    CELERY_BROKER_TRANSPORT_OPTIONS = {
        'region': 'eu-central-1',
        'polling_interval': 20,
    }
    CELERY_RESULT_BACKEND = 'redis://<myurl>/1'
    CELERY_ENABLE_REMOTE_CONTROL = False
    CELERY_SEND_EVENTS = False
    CELERY_TASK_ROUTES = {
        'my_test': {'queue': 'default'},
    }
My celery.py:

import os
from celery import Celery
from project import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')

app = Celery('celery_app')
app.conf.task_routes = {
    'my_test': {'queue': 'default'},
}
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
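Two side notes on this celery.py, offered as a hedged cleanup rather than the fix: project.settings is imported before DJANGO_SETTINGS_MODULE is set, and since Celery 3.1 autodiscover_tasks() needs no argument when used with Django. A sketch:

import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')

app = Celery('celery_app')
app.config_from_object('django.conf:settings', namespace='CELERY')
# Reads INSTALLED_APPS lazily from the Django settings; no settings import needed.
app.autodiscover_tasks()

The worker and beat log from this setup: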
[2023-01-06 09:13:32,581: DEBUG/MainProcess] | Worker: Starting Beat
[2023-01-06 09:13:32,584: DEBUG/MainProcess] ^-- substep ok
[2023-01-06 09:13:32,585: DEBUG/MainProcess] | Worker: Starting Hub
[2023-01-06 09:13:32,585: DEBUG/MainProcess] ^-- substep ok
[2023-01-06 09:13:32,586: DEBUG/MainProcess] | Worker: Starting Pool
[2023-01-06 09:13:32,856: DEBUG/MainProcess] ^-- substep ok
[2023-01-06 09:13:32,864: DEBUG/MainProcess] | Worker: Starting Consumer
[2023-01-06 09:13:32,864: DEBUG/MainProcess] | Consumer: Starting Connection
[2023-01-06 09:13:32,901: INFO/Beat] beat: Starting...
[2023-01-06 09:13:32,972: DEBUG/Beat] Current schedule:
<ScheduleEntry: my-task-every-10-seconds user_statistic_status.tasks.my_test() <freq: 10.00 seconds>
[2023-01-06 09:13:32,972: DEBUG/Beat] beat: Ticking with max interval->5.00 minutes
[2023-01-06 09:13:32,973: DEBUG/Beat] beat: Waking up in 9.99 seconds.
[2023-01-06 09:13:42,969: DEBUG/Beat] beat: Synchronizing schedule...
It gets stuck here and the task never executes!
The worker connects and lists the tasks correctly, and beat starts too, but nothing happens. I've tested it against a local Redis server and everything works fine.
Any help will be much appreciated.
Thank you
I have a Django project with Celery integrated using Redis.
My Celery worker works perfectly in local development, and now I'm deploying to production.
Before daemonizing the process I want to see how Celery behaves on the server. The thing is, celery beat sends the tasks correctly every minute (as scheduled), but the worker doesn't seem to receive them every time. Sometimes it takes 4-5 minutes until the task is received and processed. How is that possible? I have tried debugging, but there is very little information to go on.
See my setup:
settings.py
from celery.schedules import crontab  # import needed for the schedule below

CELERY_TIMEZONE = 'Europe/Warsaw'
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'

# Other Celery settings
CELERY_BEAT_SCHEDULE = {
    'task-number-one': {
        'task': 'predict_assistance.alerts.tasks.check_measures',
        'schedule': crontab(minute='*/1'),
    },
}
tasks.py

from __future__ import absolute_import, unicode_literals
from celery import shared_task

@shared_task()
def check_measures():
    print('doing something')
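(The production logs below also contain a 'doing something logger' line, so the deployed task presumably logs as well. For completeness, a sketch using Celery's per-task logger, with the extra line assumed from the logs:)

from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)

@shared_task()
def check_measures():
    print('doing something')
    logger.info('doing something logger')  # matches the line seen in production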
celery.py

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'config.settings.local')

app = Celery('predict_assistance')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
Here are my logs in production:
[2020-03-11 16:09:00,028: INFO/Beat] Scheduler: Sending due task task-number-one (predict_assistance.alerts.tasks.check_measures)
[2020-03-11 16:09:00,038: INFO/MainProcess] Received task: predict_assistance.alerts.tasks.check_measures[86f5c999-a53c-44dc-b568-00d924b5da9e]
[2020-03-11 16:09:00,046: WARNING/ForkPoolWorker-3] doing something
[2020-03-11 16:09:00,047: INFO/ForkPoolWorker-3] predict_assistance.alerts.tasks.check_measures[86f5c999-a53c-44dc-b568-00d924b5da9e]: doing something logger
[2020-03-11 16:09:00,204: INFO/ForkPoolWorker-3] Task predict_assistance.alerts.tasks.check_measures[86f5c999-a53c-44dc-b568-00d924b5da9e] succeeded in 0.16194193065166473s: None
[2020-03-11 16:10:00,049: INFO/Beat] Scheduler: Sending due task task-number-one (predict_assistance.alerts.tasks.check_measures)
[2020-03-11 16:10:00,062: INFO/MainProcess] Received task: predict_assistance.alerts.tasks.check_measures[c7786f38-793f-45e6-abb2-1c901e345e8f]
[2020-03-11 16:10:00,072: WARNING/ForkPoolWorker-3] doing something
[2020-03-11 16:10:00,073: INFO/ForkPoolWorker-3] predict_assistance.alerts.tasks.check_measures[c7786f38-793f-45e6-abb2-1c901e345e8f]: doing something logger
[2020-03-11 16:10:00,242: INFO/ForkPoolWorker-3] Task predict_assistance.alerts.tasks.check_measures[c7786f38-793f-45e6-abb2-1c901e345e8f] succeeded in 0.17491870187222958s: None
[2020-03-11 16:11:00,054: INFO/Beat] Scheduler: Sending due task task-number-one (predict_assistance.alerts.tasks.check_measures)
[2020-03-11 16:12:00,032: INFO/Beat] Scheduler: Sending due task task-number-one (predict_assistance.alerts.tasks.check_measures)
[2020-03-11 16:13:00,035: INFO/Beat] Scheduler: Sending due task task-number-one (predict_assistance.alerts.tasks.check_measures)
[2020-03-11 16:14:00,046: INFO/Beat] Scheduler: Sending due task task-number-one (predict_assistance.alerts.tasks.check_measures)
[2020-03-11 16:14:00,053: INFO/MainProcess] Received task: predict_assistance.alerts.tasks.check_measures[e0b3ef2b-ba15-421c-9a0f-0ef9f3ebb22a]
[2020-03-11 16:14:00,065: WARNING/ForkPoolWorker-3] doing something
[2020-03-11 16:14:00,066: INFO/ForkPoolWorker-3] predict_assistance.alerts.tasks.check_measures[e0b3ef2b-ba15-421c-9a0f-0ef9f3ebb22a]: doing something logger
[2020-03-11 16:14:00,247: INFO/ForkPoolWorker-3] Task predict_assistance.alerts.tasks.check_measures[e0b3ef2b-ba15-421c-9a0f-0ef9f3ebb22a] succeeded in 0.1897202990949154s: None
Do you have any idea why this is happening?
Thanks in advance
As suggested in the comments, the solution was to switch from the redis-server service to the rabbitmq-server service.
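For reference, a minimal sketch of pointing the same settings at a default local RabbitMQ install (guest/guest on the default vhost; adjust for a real server):

# settings.py -- only the broker needs to change; results can stay in Redis.
CELERY_BROKER_URL = 'amqp://guest:guest@localhost:5672//'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'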
I am running into the issue that, when I set a timezone inside my celery.py, tasks run every chance they get, i.e. not according to schedule.
This is the output:
scheduler_1 | [2018-11-29 11:00:09,186: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,199: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,204: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,210: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,220: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,228: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,231: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,236: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,239: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,247: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
scheduler_1 | [2018-11-29 11:00:09,250: INFO/MainProcess] Scheduler: Sending due task Suppliers (biko.supplier.tasks.pull_supplier_data)
My celery.py:

import os

from celery import Celery
from celery.schedules import crontab
from django.conf import settings

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings")

# app = Celery('biko')
task_apps = [
    'biko.supplier.tasks',
    'biko.common.tasks',
    'biko.commerce.tasks',
    'biko.shop.tasks'
]

app = Celery('biko', include=task_apps)
app.config_from_object('django.conf:settings')
app.conf.timezone = 'Europe/Amsterdam'
app.autodiscover_tasks()

app.conf.ONCE = {
    'backend': 'celery_once.backends.Redis',
    'settings': {
        'url': 'redis://' + os.getenv('REDIS_HOST'),
        'blocking': True,
        'default_timeout': 60 * 60,
        'blocking_timeout': 86400
    }
}
When I remove the app.conf.timezone line, everything works fine.
My Django settings regarding timezone:
LANGUAGE_CODE = 'en-us'
TIME_ZONE = 'UTC'
USE_I18N = True
USE_L10N = True
USE_TZ = True
Any ideas what causes these issues?
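One hedged guess: Django is on UTC (TIME_ZONE = 'UTC', USE_TZ = True) while Celery is told Europe/Amsterdam, and older Celery/beat versions are known to misread the last-run timestamps in an existing celerybeat-schedule file after a timezone change, treating every entry as overdue. A sketch of keeping the two aligned:

# celery.py -- keep Celery's clock consistent with Django's.
app.conf.enable_utc = True
app.conf.timezone = 'UTC'  # mirror Django's TIME_ZONE instead of overriding it

After changing timezone settings, also delete the stale celerybeat-schedule file inside the scheduler_1 container so beat does not reuse run times recorded under the old timezone.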
I am using the Cloud9 online IDE for Python development.
Here's my code:
from celery import Celery
from celery.schedules import crontab
from datetime import timedelta

RESULT_URL = 'mongodb://********'
BROKER_URL = 'redis://*********'

app = Celery('tasks', backend=RESULT_URL, broker=BROKER_URL)

CELERY_TIMEZONE = 'UTC'

CELERYBEAT_SCHEDULE = {
    'add-every-30-seconds': {
        'task': 'tasks.add',
        'schedule': timedelta(seconds=30),
        'args': (16, 16)
    },
}

@app.task
def add(x, y):
    print(x + y)
    return x + y
And I am starting it with the command:
celery -A tasks worker --loglevel=info --beat
Celery starts OK, but all activity stops there. Manually invoked tasks work fine.
Here's the console log:
[2015-04-16 07:53:30,954: INFO/Beat] beat: Starting...
[2015-04-16 07:53:32,696: INFO/MainProcess] Connected to redis://*******
[2015-04-16 07:53:34,722: INFO/MainProcess] mingle: searching for neighbors
[2015-04-16 07:53:37,685: INFO/MainProcess] mingle: all alone
[2015-04-16 07:53:40,343: WARNING/MainProcess] celery@*****-demo-project-563148 ready.
I am using the RedisLabs free tier as the broker and a self-hosted Mongo as the result backend. Where am I going wrong?
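A likely cause, hedged since the full file may differ: CELERY_TIMEZONE and CELERYBEAT_SCHEDULE here are plain module-level variables that are never handed to the app, so beat starts with an empty schedule (note the log shows no 'Sending due task' lines). A minimal sketch of attaching the schedule, keeping the Celery 3.x setting names from the question:

# Either assign through app.conf after creating the app ...
app.conf.CELERY_TIMEZONE = 'UTC'
app.conf.CELERYBEAT_SCHEDULE = {
    'add-every-30-seconds': {
        'task': 'tasks.add',
        'schedule': timedelta(seconds=30),
        'args': (16, 16)
    },
}

# ... or keep the module-level variables and load this module's
# upper-case names in one call:
# app.config_from_object(__name__)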
I'm using Amazon Web Services, and I'm trying to make Celery work.
I tried to do what the documentation says (http://docs.celeryproject.org/en/latest/django/first-steps-with-django.html#configuring-your-django-project-to-use-celery), but when I start a celery worker and try to call a task, nothing happens, as if there were no task.
Here's my settings.py file:
import os
import djcelery
import djkombu
import sys
import tasks
sys.path.append(os.getcwd())
djcelery.setup_loader()
# Django settings for analogg project.
DEBUG = False
TEMPLATE_DEBUG = DEBUG
ADMINS = (
    ('Nikita', 'bodnarnikita@gmail.com'),
)
MANAGERS = ADMINS
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',  # Add 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
        'NAME': 'analoggdb',  # Or path to database file if using sqlite3.
        'USER': '',  # Not used with sqlite3.
        'PASSWORD': '',  # Not used with sqlite3.
        'HOST': '',  # Set to empty string for localhost. Not used with sqlite3.
        'PORT': '',  # Set to empty string for default. Not used with sqlite3.
    }
}
# Local time zone for this installation. Choices can be found here:
# http://en.wikipedia.org/wiki/List_of_tz_zones_by_name
# although not all choices may be available on all operating systems.
# On Unix systems, a value of None will cause Django to use the same
# timezone as the operating system.
# If running in a Windows environment this must be set to the same as your
# system time zone.
TIME_ZONE = 'America/Chicago'
# Language code for this installation. All choices can be found here:
# http://www.i18nguy.com/unicode/language-identifiers.html
LANGUAGE_CODE = 'en-us'
SITE_ID = 1
# If you set this to False, Django will make some optimizations so as not
# to load the internationalization machinery.
USE_I18N = True
# If you set this to False, Django will not format dates, numbers and
# calendars according to the current locale.
USE_L10N = True
# If you set this to False, Django will not use timezone-aware datetimes.
USE_TZ = True
# Absolute filesystem path to the directory that will hold user-uploaded files.
# Example: "/home/media/media.lawrence.com/media/"
MEDIA_ROOT = ''
# URL that handles the media served from MEDIA_ROOT. Make sure to use a
# trailing slash.
# Examples: "http://media.lawrence.com/media/", "http://example.com/media/"
MEDIA_URL = '/static/'
# Absolute path to the directory static files should be collected to.
# Don't put anything in this directory yourself; store your static files
# in apps' "static/" subdirectories and in STATICFILES_DIRS.
# Example: "/home/media/media.lawrence.com/static/"
SITE_ROOT = os.path.dirname(os.path.realpath(__file__))
STATIC_ROOT = os.path.join(SITE_ROOT, 'static')
# URL prefix for static files.
# Example: "http://media.lawrence.com/static/"
STATIC_URL = 'http://analogg.info/static/'
# Additional locations of static files
STATICFILES_DIRS = (
    # Put strings here, like "/home/html/static" or "C:/www/django/static".
    # Always use forward slashes, even on Windows.
    # Don't forget to use absolute paths, not relative paths.
)
# List of finder classes that know how to find static files in
# various locations.
STATICFILES_FINDERS = (
    'django.contrib.staticfiles.finders.FileSystemFinder',
    'django.contrib.staticfiles.finders.AppDirectoriesFinder',
    'django.contrib.staticfiles.finders.DefaultStorageFinder',
)
ADMIN_MEDIA_PREFIX = '/static/admin/'
# List of callables that know how to import templates from various sources.
TEMPLATE_LOADERS = (
    'django.template.loaders.filesystem.Loader',
    'django.template.loaders.app_directories.Loader',
    # 'django.template.loaders.eggs.Loader',
)
MIDDLEWARE_CLASSES = (
    'django.middleware.common.CommonMiddleware',
    'django.contrib.sessions.middleware.SessionMiddleware',
    'django.middleware.csrf.CsrfViewMiddleware',
    'django.contrib.auth.middleware.AuthenticationMiddleware',
    'django.contrib.messages.middleware.MessageMiddleware',
    # Uncomment the next line for simple clickjacking protection:
    # 'django.middleware.clickjacking.XFrameOptionsMiddleware',
)
ROOT_URLCONF = 'analogg.urls'
# Python dotted path to the WSGI application used by Django's runserver.
WSGI_APPLICATION = 'analogg.wsgi.application'
TEMPLATE_DIRS = (
    "/templates/",
    "/home/ubuntu/analogg/templates",
    # Put strings here, like "/home/html/django_templates" or "C:/www/django/templates".
    # Always use forward slashes, even on Windows.
    # Don't forget to use absolute paths, not relative paths.
)
INSTALLED_APPS = (
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.sites',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    # Uncomment the next line to enable the admin:
    'django.contrib.admin',
    # Uncomment the next line to enable admin documentation:
    'django.contrib.admindocs',
)
INSTALLED_APPS += ('djcelery', )
#INSTALLED_APPS += ('djkombu', )
CELERYBEAT_SCHEDULER = "djcelery.schedulers.DatabaseScheduler"
CELERY_SEND_TASK_ERROR_EMAILS = True
CELERY_DISABLE_RATE_LIMITS = True
CELERY_IMPORTS = ("analogg.tasks", )
#BROKER_URL = 'amqp://guest:guest@localhost:5672/'
#BROKER_BACKEND = "djkombu.transport.DatabaseTransport"
# A sample logging configuration. The only tangible logging
# performed by this configuration is to send an email to
# the site admins on every HTTP 500 error when DEBUG=False.
# See http://docs.djangoproject.com/en/dev/topics/logging for
# more details on how to customize your logging configuration.
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'filters': {
        'require_debug_false': {
            '()': 'django.utils.log.RequireDebugFalse'
        }
    },
    'handlers': {
        'mail_admins': {
            'level': 'ERROR',
            'filters': ['require_debug_false'],
            'class': 'django.utils.log.AdminEmailHandler'
        }
    },
    'loggers': {
        'django.request': {
            'handlers': ['mail_admins'],
            'level': 'ERROR',
            'propagate': True,
        },
    }
}
This is my views.py file:
# -*- coding: utf-8 -*-
from django.http import HttpResponseRedirect, HttpResponse
from django.template import Context, loader, RequestContext
from django.shortcuts import render_to_response
from tasks import parsesob, add
from parsersob import parser
import os, random, string
def index(request):
    html = "<html><body>Hello, world!</body></html>"
    add.apply_async(args=[id, '23bnn'])
    return HttpResponse(html)
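(A hedged aside: this call passes two positional arguments, while the task below is defined as add(id, a1, a2), so the worker would raise a TypeError once the message is consumed; a call matching the signature would look like add.apply_async(args=[1, 'foo', 'bar']), with hypothetical values.)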
This is my tasks.py file:
from celery.decorators import task
from celery.task.schedules import crontab
from parsersob import parser
#from parsersob2 import parser2
from celery import Celery

#celery = Celery('tasks', broker='amqp://rabbitmqprelaunch9176@ip-10-117-81-80//')

@task
def add(id, a1, a2):
    f = open('add.txt', 'w')
    g = a1 + a2
    f.write(g)
    f.close()
And this is my celeryconfig.py file, located in the same folder as tasks.py, settings.py and views.py:
#BROKER_URL = 'amqp://'
#CELERY_RESULT_BACKEND = 'analoggdb'
#BROKER_URL = "amqp://guest:guest#localhost:5672//"
CELERY_IMPORTS = ("analogg.tasks", )
#CELERY_RESULT_DBURI = "sqlite:///analoggdb.db"
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Europe/Moscow'
CELERY_ENABLE_UTC = True
I'm a newbie, so could anyone please tell me what to do to make it work?
Any help will be greatly appreciated.
UPD:
ubuntu@ip-10-117-81-80:/var/log/apache2$ sudo rabbitmqctl status
Status of node 'rabbit@ip-10-117-81-80' ...
[{pid,849},
{running_applications,[{rabbit,"RabbitMQ","2.7.1"},
{os_mon,"CPO CXC 138 46","2.2.7"},
{sasl,"SASL CXC 138 11","2.1.10"},
{mnesia,"MNESIA CXC 138 12","4.5"},
{stdlib,"ERTS CXC 138 10","1.17.5"},
{kernel,"ERTS CXC 138 10","2.14.5"}]},
{os,{unix,linux}},
{erlang_version,"Erlang R14B04 (erts-5.8.5) [source] [64-bit] [rq:1] [async-threads:30] [kernel-poll:true]\n"},
{memory,[{total,25494568},
{processes,11083752},
{processes_used,11077008},
{system,14410816},
{atom,1124433},
{atom_used,1120234},
{binary,89696},
{code,11134393},
{ets,752120}]},
{vm_memory_high_watermark,0.3999999990304762},
{vm_memory_limit,247544217}]
...done.
The output of running celeryd:
ubuntu@ip-10-117-81-80:~/analogg$ python manage.py celeryd worker --loglevel=info
-------------- celery@ip-10-117-81-80 v3.0.3 (Chiastic Slide)
---- **** -----
--- * *** * -- [Configuration]
-- * - **** --- . broker: amqp://guest@localhost:5672//
- ** ---------- . app: default:0x13c2a90 (djcelery.loaders.DjangoLoader)
- ** ---------- . concurrency: 1 (processes)
- ** ---------- . events: OFF (enable -E to monitor this worker)
- ** ----------
- *** --- * --- [Queues]
-- ******* ---- . celery: exchange:celery(direct) binding:celery
--- ***** -----
[Tasks]
. analogg.tasks.add
. analogg.tasks.parsesob
[2012-07-22 06:24:30,336: WARNING/MainProcess] celery@ip-10-117-81-80 has started.
UPD:
[Sun Jul 22 08:10:58 2012] [error] [client 37.110.0.126] mod_wsgi (pid=7411): Exception occurred processing WSGI script '/home/ubuntu/analogg/apache/django.wsgi'.
[Sun Jul 22 08:10:58 2012] [error] [client 37.110.0.126] Traceback (most recent call last):
[Sun Jul 22 08:10:58 2012] [error] [client 37.110.0.126] File "/usr/local/lib/python2.7/dist-packages/django/core/handlers/wsgi.py", line 241, in __call__
[Sun Jul 22 08:10:58 2012] [error] [client 37.110.0.126] response = self.get_response(request)
[Sun Jul 22 08:10:58 2012] [error] [client 37.110.0.126] File "/usr/local/lib/python2.7/dist-packages/django/core/handlers/base.py", line 82, in get_response
[Sun Jul 22 08:10:58 2012] [error] [client 37.110.0.126] urlconf = settings.ROOT_URLCONF
[Sun Jul 22 08:10:58 2012] [error] [client 37.110.0.126] File "/usr/local/lib/python2.7/dist-packages/django/utils/functional.py", line 185, in inner
[Sun Jul 22 08:10:58 2012] [error] [client 37.110.0.126] return func(self._wrapped, *args)
[Sun Jul 22 08:10:58 2012] [error] [client 37.110.0.126] AttributeError: 'Settings' object has no attribute 'ROOT_URLCONF'
[Sun Jul 22 08:10:58 2012] [error] [client 37.110.0.126] mod_wsgi (pid=7377): Exception occurred processing WSGI script '/home/ubuntu/analogg/apache/django.wsgi'.
[Sun Jul 22 08:10:58 2012] [error] [client 37.110.0.126] Traceback (most recent call last):
[Sun Jul 22 08:10:58 2012] [error] [client 37.110.0.126] File "/usr/local/lib/python2.7/dist-packages/django/core/handlers/wsgi.py", line 241, in __call__
[Sun Jul 22 08:10:58 2012] [error] [client 37.110.0.126] response = self.get_response(request)
[Sun Jul 22 08:10:58 2012] [error] [client 37.110.0.126] File "/usr/local/lib/python2.7/dist-packages/django/core/handlers/base.py", line 82, in get_response
[Sun Jul 22 08:10:58 2012] [error] [client 37.110.0.126] urlconf = settings.ROOT_URLCONF
[Sun Jul 22 08:10:58 2012] [error] [client 37.110.0.126] File "/usr/local/lib/python2.7/dist-packages/django/utils/functional.py", line 185, in inner
[Sun Jul 22 08:10:58 2012] [error] [client 37.110.0.126] return func(self._wrapped, *args)
[Sun Jul 22 08:10:58 2012] [error] [client 37.110.0.126] AttributeError: 'Settings' object has no attribute 'ROOT_URLCONF'
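Worth noting, hedged since only fragments are shown: with djcelery.setup_loader(), Celery reads its configuration from Django's settings.py, so a standalone celeryconfig.py sitting next to it is never loaded by manage.py celeryd. And since every BROKER_URL above is commented out, the worker falls back to the default amqp://guest@localhost:5672// (visible in the startup banner above). A sketch of declaring the broker where djcelery will actually find it:

# settings.py -- djcelery takes Celery config from Django settings,
# not from a separate celeryconfig.py.
BROKER_URL = 'amqp://guest:guest@localhost:5672//'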
I'm not quite sure what the problem is here, but here is an example config that works:
This is located at the end of settings.py:
#CELERY CONFIG
import djcelery
djcelery.setup_loader()
BROKER_HOST = "Host"
BROKER_PORT = "Port"
BROKER_USER = "User"
BROKER_PASSWORD = "Pass"
BROKER_VHOST = "VHost"
CELERY_DEFAULT_QUEUE = "some_queue"
CELERY_DEFAULT_EXCHANGE = "some_exc"
The broker's user/pass/vhost are configured using the rabbitmqctl tool.
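For example, creating those credentials on the RabbitMQ side (placeholder names matching the config above):

rabbitmqctl add_user User Pass
rabbitmqctl add_vhost VHost
rabbitmqctl set_permissions -p VHost User ".*" ".*" ".*"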
You can trace the queue/exchange with
rabbitmqadmin get queue="some_queue"
NOTE: It is very useful to track the logs in /var/log/rabbitmq/rabbit@localhost.log or whatever the log file is. This will show you if there is a problem with the connection to RabbitMQ, like wrong credentials, permissions, etc.