I installed django-eventstream and followed the guide from the GitHub repo, but I hit a wall when I try to execute a send_event() call from the shell.
I rewrote asgi.py:
application = ProtocolTypeRouter({
"http": URLRouter([
path("msgs/", AuthMiddlewareStack(URLRouter(django_eventstream.routing.urlpatterns)), { "format-channels": ["test"] }),
re_path(r"", get_asgi_application()),
]),
})
and I created the entries in urls.py:
urlpatterns = [path("msgs/", include(django_eventstream.urls), {'format-channels': ["test"]})]
I also added the static resources to my static directory and I can access them on any page that includes my base.html. But when I open the local msgs channel at 127.0.0.1/msgs/ I only see:
When opening a shell in a second terminal and using send_event("test", "message", {"text": "hello"}) I see no indication whatsoever that anything is sent out to the frontend. The JS sample code from the repo also fetches nothing on the main page when sending out data in a loop started with Django (using the excellent scheduler huey here):
@periodic_task(crontab(minute="*/1"))
def test():
send_event('test', 'message', {'text': 'hello world'})
print("hello world...")
I see the output of print() but not the result of send_event() on the frontend. Also, there is no indication that my scheduled or manually run send_event() calls result in anything on the /msgs/ channel; the timing does not fit, and it looks more like the stream is just regularly printing keep-alives with no data in them.
The JS console output on the frontend:
var es = new ReconnectingEventSource('/msgs/');
es.addEventListener('message', function (e) {console.log(e.data);}, false);
es.addEventListener('stream-reset', function (e) {}, false);
output for es:
s {CONNECTING: 0, OPEN: 1, CLOSED: 2, _configuration: undefined, withCredentials: false, …}
Maybe I'm misunderstanding the purpose. I am looking for a solution to replace the console.log(errors) calls in my app as I get it ready for production. I went through the steps to add Sentry to my Expo managed workflow project. It seemed to work: it connects and it runs Sentry.init() just fine. But when I try to use Sentry.captureException('some exception') in place of a console.log, the app crashes and I get "Sentry.captureException is not a function". It logs THAT error in my Sentry console, so the error is being passed to my Sentry project. But according to the documentation I'm following, Sentry.captureException should be a valid function. What am I missing? I have also tried Sentry.captureMessage with the same result.
App.js
import * as Sentry from 'sentry-expo';
Sentry.init({
dsn: 'xxxx',
enableInExpoDevelopment: true,
debug: true, // Sentry will try to print out useful debugging information if something goes wrong with sending an event. Set this to `false` in production.
});
enableScreens();
export default function App() {
return (
<NavigationContainer>
<AuthNavigator />
</NavigationContainer>
)
}
app.json Expo hooks:
"hooks": {
"postPublish": [
{
"file": "sentry-expo/upload-sourcemaps",
"config": {
"organization": "myorganization",
"project": "myproject",
"authToken": "xxxx"
}
}
]
}
You should use Sentry.Native.captureException('message') - as per these docs.
Also note that sentry-expo supports TypeScript, so if you were to use TypeScript for your project you would get autocompletion and an error when calling the wrong method.
I am looking to restore local/dev Auth0 functionality to a Flask app that I recently updated from Python 2.7 to Python 3 (v3.8.6). The Auth0 authorize_access_token call is now failing on my local development server, but it still works on the deployed staging site. I have not made any changes to this code or to my Auth0 settings.
Error Message:
File "/Users/h/.local/share/virtualenvs/stf-hashhere/lib/python3.8/site-packages/authlib/integrations/base_client/base_app.py", line 126, in _retrieve_oauth2_access_token_params
raise MismatchingStateError()
authlib.integrations.base_client.errors.MismatchingStateError: mismatching_state: CSRF Warning! State not equal in request and response.
Code:
def create_app(test_config=None):
# Factory to create and configure the app
app = Flask(
__name__,
static_folder='../www/static',
static_url_path='/static',
template_folder='../www/static/dist',
instance_relative_config=True,
)
oauth = OAuth(app)
app.secret_key = app.config['SESSION_KEY']
auth0_base = 'https://{}'.format(app.config['AUTH0_API_AUDIENCE'])
auth0 = oauth.register(
'auth0',
client_id=app.config['AUTH0_CLIENT_ID'],
client_secret=app.config['AUTH0_CLIENT_SECRET'],
api_base_url=auth0_base,
access_token_url='{}/oauth/token'.format(auth0_base),
authorize_url='{}/authorize'.format(auth0_base),
client_kwargs={
'scope': 'openid profile email',
},
)
@app.route('/earlybird')
def login():
return auth0.authorize_redirect(redirect_uri=app.config['AUTH0_CALLBACK_URL'])
@app.route('/auth/callback')
def callback_handling():
auth0.authorize_access_token()
return redirect('/profile')
The registered auth0 client looks like this:
{'framework': <authlib.integrations.flask_client.integration.FlaskIntegration object at 0x110be03a0>, 'name': 'auth0', 'client_id': '<client>', 'client_secret': 'secret', 'request_token_url': None, 'request_token_params': None, 'access_token_url': 'https://smalltradeflora.auth0.com/oauth/token', 'access_token_params': None, 'authorize_url': 'https://smalltradeflora.auth0.com/authorize', 'authorize_params': None, 'api_base_url': 'https://smalltradeflora.auth0.com', 'client_kwargs': {'scope': 'openid profile email'}, 'compliance_fix': None, 'client_auth_methods': None, '_fetch_token': None, '_update_token': None, '_user_agent': 'Authlib/0.15.2 (+https://authlib.org/)', '_server_metadata_url': None, 'server_metadata': {'refresh_token_url': None, 'refresh_token_params': None}, '_fetch_request_token': None, '_save_request_token': None}
http://flora.loc:5000/auth/callback is my Allowed Callback URL as well as my app.config['AUTH0_CALLBACK_URL'].
I have tried:
Verifying config variables
Adding a SESSION_NAME, then app.config.SESSION_COOKIE_NAME, per this SO thread
Using url_for('callback_handling', _external=True) to ensure alignment with the callback
Verifying that the AUTH0 params do not need to be typed as bytes (the u'' transition is the only top-level visible change from 2.7 in these lines of code)
Running from http://127.0.0.1:5000 (same port)
I've noticed that @lepture also notes in this thread:
In Authlib 0.9 the session key for state has changed.
But I don't yet understand how, or if, this applies to my needed code adjustments.
I ended up rewriting my app factory completely, building iteratively from the downloaded OAuth for Python Web App sample. The two main differences between the working and non-working versions of the code are:
Using localhost:5000 as my localhost address. My mapped flora.loc base will not work. Notes:
I am still not sure why mapped hostnames do not function here (logging the referring URL showed flora.loc as expected, but there must be something about the resolution I haven't yet caught)
as always, ensure that your expected address is also listed in your Auth0 app dashboard.
The audience parameter now needs a prepended https:// to resolve successfully.
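For illustration, a minimal sketch of the login route with both fixes applied, reusing the config names from the question; the '/userinfo' audience value follows the Auth0 Python sample's convention and is an assumption here:
from flask import url_for

@app.route('/earlybird')
def login():
    return auth0.authorize_redirect(
        # resolves to http://localhost:5000/auth/callback when served from localhost
        redirect_uri=url_for('callback_handling', _external=True),
        # note the prepended https:// scheme; a bare domain will not resolve
        audience='https://{}/userinfo'.format(app.config['AUTH0_API_AUDIENCE']),
    )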
I need to continuously get data from a MySQL database that receives data at an update frequency of around 200 ms, and continuously update a text field on my dashboard with the latest value. My dashboard is built on Django.
I have read a lot about Channels, but all the tutorials are about chat applications. I know that I need to implement WebSockets, which will basically keep an open connection and receive the data. With a chat application that makes sense, but I haven't come across anything that talks about a MySQL database.
I also read about mysql-events. Since the data that lands in the table comes from an external sensor, I don't understand how I can monitor a table inside Django, i.e. whenever a new row is added to the table, I need to fetch that newly inserted row based on a column value.
Any ideas on how to go about it? I have gone through a lot of articles and I couldn't find anything specific to this requirement.
Thanks to Timothee Legros' answer, which helped me move along in the right direction.
Everywhere on the internet it says that Django Channels is/can be used for real-time applications, but nowhere does it talk about the exact implementation (other than chat applications).
I used Celery, Django Channels and Celery Beat to accomplish the task, and it works as expected.
There are three parts to it: setting up Channels, creating a Celery task and calling it periodically (with the help of Celery Beat), and then sending that task's output to Channels so that it can relay the data to the websocket.
Channels
I followed the original tutorial on the Channels website and built on that.
routing.py
from django.urls import re_path
from . import consumers
websocket_urlpatterns = [
re_path(r'ws/chat/(?P<room_name>\w+)/$', consumers.ChatConsumer),
re_path(r'ws/realtimeupdate/$', consumers.RealTimeConsumer),
]
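A caveat in case you are on Channels 3 or newer: routes there must point at an ASGI application rather than a consumer class, so each entry needs .as_asgi(), along these lines:
websocket_urlpatterns = [
    re_path(r'ws/chat/(?P<room_name>\w+)/$', consumers.ChatConsumer.as_asgi()),
    re_path(r'ws/realtimeupdate/$', consumers.RealTimeConsumer.as_asgi()),
]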
consumers.py
import json

from channels.generic.websocket import AsyncWebsocketConsumer

class RealTimeConsumer(AsyncWebsocketConsumer):
async def connect(self):
self.channel_group_name = 'core-realtime-data'
# Join room group
await self.channel_layer.group_add(
self.channel_group_name,
self.channel_name
)
await self.accept()
async def disconnect(self, close_code):
# Leave room group
await self.channel_layer.group_discard(
self.channel_group_name,
self.channel_name
)
# Receive message from WebSocket
async def receive(self, text_data):
print(text_data)
pass
async def loc_message(self, event):
# print(event)
message_trans = event['message_trans']
message_tag = event['message_tag']
# print("sending data to websocket")
await self.send(text_data=json.dumps({
'message_trans': message_trans,
'message_tag': message_tag
}))
This class will basically send data to the websocket once it receives it. The two files above are specific to the app.
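Before wiring up Celery, you can sanity-check this consumer from a Django shell (python manage.py shell) by pushing a message into the group by hand; the empty JSON arrays are placeholder payloads:
from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer

channel_layer = get_channel_layer()
async_to_sync(channel_layer.group_send)('core-realtime-data', {
    'type': 'loc_message',
    'message_trans': '[]',  # placeholder payload
    'message_tag': '[]',
})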
Now we will set up Celery.
In the project's base directory, where the settings file resides, we need to create three files:
celery.py - this initializes Celery.
routing.py - this routes the Channels websocket addresses.
tasks.py - this is where we define the task.
celery.py
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj_name.settings')
app = Celery('proj_name', backend='redis://localhost', broker='redis://localhost/')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
print(f'Request: {self.request!r}')
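The Celery/Django integration docs also have you load this app in the project's __init__.py (next to settings.py) so that shared tasks pick it up when Django starts; a minimal version:
from .celery import app as celery_app

__all__ = ('celery_app',)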
routing.py
from channels.auth import AuthMiddlewareStack
from channels.routing import ProtocolTypeRouter, URLRouter
from app_name import routing
application = ProtocolTypeRouter({
# (http->django views is added by default)
'websocket': AuthMiddlewareStack(
URLRouter(
routing.websocket_urlpatterns
)
),
})
tasks.py
import time

from asgiref.sync import async_to_sync
from celery import shared_task
from channels.layers import get_channel_layer
from django.core import serializers

from .models import CustomModel_1, CustomModel_2  # adjust to wherever these models live

@shared_task(name='realtime_task')
def RealTimeTask():
    time_s = time.time()
    result_trans = CustomModel_1.objects.all()
    result_tag = CustomModel_2.objects.all()
    result_trans_json = serializers.serialize('json', result_trans)
    result_tag_json = serializers.serialize('json', result_tag)
    channel_layer = get_channel_layer()
    # 'type' selects the consumer handler; the other keys must match what
    # loc_message() reads in the consumer above
    message = {'type': 'loc_message',
               'message_trans': result_trans_json,
               'message_tag': result_tag_json}
    async_to_sync(channel_layer.group_send)('core-realtime-data', message)
    print(time.time() - time_s)
After it completes, the task sends the result back to Channels, which in turn relays it to the websocket.
settings.py
# Channels
CHANNEL_LAYERS = {
'default': {
'BACKEND': 'channels_redis.core.RedisChannelLayer',
'CONFIG': {
"hosts": [('127.0.0.1', 6379)],
},
},
}
CELERY_BEAT_SCHEDULE = {
'task-real': {
'task': 'realtime_task',
'schedule': 1 # the task will run every second
},
}
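With the schedule in place and a Redis server running, the worker and the beat scheduler still need to be started in separate terminals alongside the Django dev server; with the proj_name from celery.py above, the standard invocations are:
celery -A proj_name worker -l info
celery -A proj_name beat -l info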
Now the only thing left is to create a websocket in the JavaScript file and start listening on it.
//Create web socket to receive data
const chatSocket = new WebSocket(
'ws://'
+ window.location.host
+ '/ws/realtimeupdate'
+ '/'
);
chatSocket.onmessage = function(e) {
const data = JSON.parse(e.data);
console.log(e.data + '\n');
// For trans
var arrayOfObjects = JSON.parse(data.message_trans);
//Do your thing
//For tags
var arrayOfObjects_tag = JSON.parse(data.message_tag);
//Do your thing
};
chatSocket.onclose = function(e) {
console.error('Chat socket closed unexpectedly');
};
To answer the MySQL usage: I am inserting data into the MySQL database from an external sensor, and in tasks.py I am querying the table using the Django ORM.
Overall, it does the intended work: populate a real-time dashboard with real-time data from MySQL. I am sure there might be different and better approaches to it; please let me know about them.
Your best bet, if you need to constantly query your SQL database, would be to use Celery (or Dramatiq, which is simpler/easier but less battle-tested) in combination with Django Channels.
Celery allows you to create workers (kind of like background processes) that you can send tasks (functions) to. When a worker receives a task, it executes it. All of this is done in the background. From the task that the worker is executing, you can actually send data back through a websocket directly from the worker. This only works if you have Django Channels + channel layers enabled, because when you enable channel layers, each consumer instance created when you open a channel/websocket gets a name that you can pass to the worker so that it knows which websocket to send the query data back to.
Here is what the flow of this process would look like:
Client requests to connect to your websocket
A consumer instance is created, and with it a specific name for it
The consumer instance accepts the connection
The consumer triggers the celery task and passes the name along
The worker begins polling your SQL database every X seconds
When the worker finds a new entry, it uses the name it was given and sends the new entry back through the websocket
I suggest reading the Django Channels documentation on consumers and channel layers, as well as Celery or Dramatiq tutorials, to understand how those work. For all of this to work you will also have to learn about Redis and a message queue service such as RabbitMQ. There is just too much to put in a simple answer, but I can provide more information if you have specific questions.
Edit:
Get a Redis server set up on your machine. If you are on Windows like me, you have to install WSL 2 and Ubuntu from the Windows Store (free). This link can walk you through it.
Get a RabbitMQ server set up. Follow their tutorial.
Enable Django Channels and channel layers, then set Redis as your default Channels backend.
Set up Dramatiq or Celery. I prefer Dramatiq, as it is basically a new and improved version of Celery, albeit less popular. It is much easier to set up and use. This is the GitHub repo for django-dramatiq and it will walk you through how to set it up. Note that, just as you launch your Django server with python manage.py runserver, you have to launch Dramatiq workers with python manage.py rundramatiq before testing your website.
Create a tasks.py file in your Django app and inside of it implement the code that checks the MySQL database for new entries. If you haven't figured that out already, here is the link to get started with that. In your tasks file you should have a function with the dramatiq.actor decorator on top, so that Dramatiq knows the function is a task; a sketch of such a task follows the consumer walkthrough below.
Build a django-channels consumer to handle WebSocket connections and allow you to send data through the WebSocket connection. This is what a standard consumer would look like:
from asgiref.sync import sync_to_async
from channels.generic.websocket import AsyncJsonWebsocketConsumer

from .tasks import SQLData  # the dramatiq task from the previous step

class AsyncDashboardConsumer(AsyncJsonWebsocketConsumer):
    async def connect(self):
        await self.accept()

    async def disconnect(self, code):
        await self.close()

    async def receive_json(self, text_data=None, bytes_data=None, **kwargs):
        someData = text_data['someData']
        someOtherData = text_data['someOtherData']
        if 'execute_getMySQLdata' in text_data['function']:
            await self.getData(someData, someOtherData)

    async def sendDataToClient(self, event):
        await self.send(text_data=event['text'])

    async def getData(self, someData, someOtherData):
        # wrap the callable (not its result) and await it; .send() only
        # enqueues the task for a worker
        await sync_to_async(SQLData.send)(self.channel_name, someData, someOtherData)
The connect function is called when the client attempts to connect to the WebSocket URL that your routing file (in step 2) points at this consumer.
The receive_json function is called whenever the client sends data to your Django server.
The getData function is called from the receive_json function and sends a message to start the dramatiq task you created earlier to check the SQL db. Note that when you send the message you must pass in self.channel_name, as the worker uses that channel_name to send data back through the WebSocket directly from the dramatiq task.
The sendDataToClient function is used when you send data back to the client; it is the handler selected by the message's 'type' key.
To send data from the task you created earlier use this: async_to_sync(channel_layer.send)(channelName, {'type': 'sendDataToClient', 'text': jsonPayload}). Notice how you pass the channelName as well as the name of the handler defined in your consumer.
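Putting that together, here is a minimal sketch of the polling task referenced above; the SQLData name matches the consumer snippet, while SensorReading, the query, and the 5-second interval are placeholders for your own model and logic:
import time

import dramatiq
from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer
from django.core import serializers

from .models import SensorReading  # hypothetical model holding the sensor rows

@dramatiq.actor
def SQLData(channel_name, someData, someOtherData):
    # someData / someOtherData are whatever filters the client sent
    channel_layer = get_channel_layer()
    last_pk = 0
    while True:
        # look for rows inserted since the last poll
        new_rows = SensorReading.objects.filter(pk__gt=last_pk)
        if new_rows:
            last_pk = new_rows.latest('pk').pk
            payload = serializers.serialize('json', new_rows)
            # send straight back through the websocket this task was started from
            async_to_sync(channel_layer.send)(
                channel_name, {'type': 'sendDataToClient', 'text': payload})
        time.sleep(5)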
Finally, this is what the javascript on the client side would look like:
let socket = new WebSocket("wss://javascript.info/article/websocket/demo/hello");
socket.onopen = function(e) {
alert("[open] Connection established");
alert("Sending to server");
socket.send("My name is John");
};
socket.onmessage = function(event) {
alert(`[message] Data received from server: ${event.data}`);
};
socket.onclose = function(event) {
if (event.wasClean) {
alert(`[close] Connection closed cleanly, code=${event.code} reason=${event.reason}`);
} else {
// e.g. server process killed or network down
// event.code is usually 1006 in this case
alert('[close] Connection died');
}
};
socket.onerror = function(error) {
alert(`[error] ${error.message}`);
};
This code came directly from this JavaScript WebSocket walkthrough.
This is how a basic web application with background workers would continually update information in real time. There are probably other ways of doing this without background workers, but since you want to get information as fast as possible, as soon as it arrives, it is better to have a background process that is continually checking for updates. On another note, the code above means that separate connections to the database are opened for each new client that connects, but you can easily take advantage of django-channels groups and have one connection to your database that then just sends to all clients in certain groups.
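A minimal sketch of that group variant, where 'dashboard' is an arbitrary group name: the consumer joins the group on connect, and a single task broadcasts to everyone in it:
# in the consumer:
async def connect(self):
    await self.channel_layer.group_add('dashboard', self.channel_name)
    await self.accept()

# in the task, instead of channel_layer.send:
async_to_sync(channel_layer.group_send)(
    'dashboard', {'type': 'sendDataToClient', 'text': jsonPayload})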
Build a microservice for WebSocket connections
Another way to implement such a feature is to build a standalone WebSocket microservice.
A monolith architecture isn't what you need here. Every WebSocket will open a connection to Django (which will sit behind a reverse proxy and app server, e.g. NGINX and Gunicorn). If your client opens two tabs in the browser, you will get two connections, and so on.
My recommendation is to adjust the tech stack (yes, I'm a huge fan of Django, but there are many cool solutions for building WS):
Use Starlette, a production-ready framework with built-in WebSockets: https://www.starlette.io/websockets/
Use uvicorn.workers.UvicornWorker for Gunicorn to manage your ASGI application: this is only one line, like gunicorn -w 4 -k uvicorn.workers.UvicornWorker --log-level warning example:app
Handle your WebSocket connections there and use the examples to request updates from the database: https://www.starlette.io/database/
Use super simple JavaScript code to open the connection on the client side and listen for updates.
So your models, templates, and views will be managed by Django.
Your WebSocket connections will be managed by Starlette in a natively async way.
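To give a feel for it, a minimal sketch of such a WebSocket endpoint in Starlette; the route path, the fetch_latest_rows helper, and the 1-second interval are placeholders, not part of Starlette itself:
import asyncio

from starlette.applications import Starlette
from starlette.routing import WebSocketRoute

async def fetch_latest_rows():
    # placeholder: query your database here (see the Starlette database docs)
    return {'value': 42}

async def ws_updates(websocket):
    await websocket.accept()
    try:
        while True:
            await websocket.send_json(await fetch_latest_rows())
            await asyncio.sleep(1)  # arbitrary polling interval
    finally:
        await websocket.close()

app = Starlette(routes=[WebSocketRoute('/ws/updates', ws_updates)])
This app is exactly what the gunicorn -w 4 -k uvicorn.workers.UvicornWorker example:app invocation above would serve.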
If you're interested in such an option I can make detailed instructions.
I have a website written on Flask, and I would like to update it when answers to a Google Form has been submitted.
More precisely, I have already associated the form with a Google spreadsheet and I can read that spreadsheet from Flask, but the missing key component is how to trigger the website to update its content when new answers have been submitted to the form.
What would be the best way to do this?
Webhook solution:
Google Forms:
Enter the Google Forms editor
Click 3 vertical dots next to profile picture, and select 'script editor'
Customize this snippet to your webhook URL and set a custom token (this is not really secure, but better than nothing).
function onFormSubmit(e) {
const url = "https://example.com/webhook";
var options = {
"method": "post",
"headers": {
"Content-Type": "application/json"
},
"payload": JSON.stringify({"token": "sometokenheretocheckonbackend"})
};
UrlFetchApp.fetch(url, options);
}
(A dialog may pop up where you have to approve connecting to an unauthorized service.)
Handling on the Flask side:
from http import HTTPStatus
from flask import (
    abort,
    jsonify,
    request
)

@blueprint.route('/webhook', methods=['POST'])
def handle_webhook():
payload = request.get_json()
if payload.get('token') != "sometokenheretocheckonbackend":
abort(HTTPStatus.UNAUTHORIZED)
# Update your content
return jsonify({'success': True}), HTTPStatus.OK
Periodic updates (alternative solution):
I would consider launching a daemon Thread that periodically updates this content. This is obviously not as elegant, but it should work quite stably and wouldn't be much more demanding for the server if the content update procedure is reasonably lightweight.
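A minimal sketch of that approach, assuming a refresh_content() function that re-reads the spreadsheet, with an arbitrary 60-second interval:
import threading
import time

def poll_form_answers():
    while True:
        refresh_content()  # re-read the spreadsheet and update the cached content
        time.sleep(60)  # polling interval is arbitrary

updater = threading.Thread(target=poll_form_answers, daemon=True)
updater.start()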
You could create a Form Submit trigger to run a Google Apps Script function that calls out to your Flask site and triggers the update.
https://developers.google.com/apps-script/guides/triggers/installable
Is there some way to send push notifications using Parse from a web service built in Django? I mean, I have a dashboard built in Django; in this dashboard I can set up some parameters, and when I create, for example, a news item, this has to be notified to the user through a push notification. How can I achieve this?
See this page: https://www.parse.com/docs/rest/guide. You should try making a request similar to this every time you create a news item:
import json,httplib
connection = httplib.HTTPSConnection('api.parse.com', 443)
connection.connect()
connection.request('POST', '/1/classes/GameScore', json.dumps({
"score": 1337,
"playerName": "Sean Plott",
"cheatMode": False
}), {
"X-Parse-Application-Id": "${APPLICATION_ID}",
"X-Parse-REST-API-Key": "${REST_API_KEY}",
"Content-Type": "application/json"
})
results = json.loads(connection.getresponse().read())
print results
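Note that the sample above creates a GameScore object, which is the generic REST example from that guide; actual push notifications go to the /1/push endpoint instead. A Python 3 sketch of that, assuming channel-based targeting and a subscribed "news" channel:
import http.client
import json

connection = http.client.HTTPSConnection('api.parse.com', 443)
connection.connect()
connection.request('POST', '/1/push', json.dumps({
    "channels": ["news"],  # clients subscribed to this channel receive the push
    "data": {"alert": "A news item was just published."}
}), {
    "X-Parse-Application-Id": "${APPLICATION_ID}",
    "X-Parse-REST-API-Key": "${REST_API_KEY}",
    "Content-Type": "application/json"
})
result = json.loads(connection.getresponse().read())
print(result)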