I want to keep my market model updated via the websocket stream.
I have a platform model that has many markets.
When the user first requests the model, it is retrieved from the backend database. I then want to update it with the websocket data.
How do I update different values in the model? I can't figure out how to filter the hasMany relationship by market name and then set the values. Maybe there's an easier way to go about it that I'm not seeing.
It's actually pretty simple - just make sure you have these things set up:
you'll want your websocket to send JSON data to Ember, using the same JSON format your serializers already expect (JSON:API, for example)
when you establish your websocket connection on the Ember side of things, you'll want an event handler for handling received messages.
that event handler will use store.pushPayload to add/update the model in the store (which means your websocket code needs access to the store).
an example:
// some controller.js
import Controller from '@ember/controller';
import myAwesomeWebSocketStuff from 'lib/websocket';

export default class extends Controller {
  init() {
    super.init(...arguments);
    const socket = myAwesomeWebSocketStuff(this.store);
    this.set('socket', socket);
  }

  willDestroy() {
    super.willDestroy(...arguments);
    this.get('socket').disconnect();
  }
}
and then in lib/websocket.js
import SomeWebSocketLibrary from 'some-library';

export default function(store) {
  const url = 'wss://example.com/stream'; // wherever your stream lives
  const socket = new SomeWebSocketLibrary(url);
  socket.connect();
  socket.on('receive', data => store.pushPayload(data));
  return socket;
}
I'm trying to get a working connection between a NextJS application and my Algorand wallet (Pera) using WalletConnect. I am able to connect, but the NextJS application won't send any metadata like dApp name. Is there something wrong with my code?
import WalletConnect from "@walletconnect/client";
import QRCodeModal from "algorand-walletconnect-qrcode-modal";

export default function Index(props) {
  // Create a connector
  const connector = new WalletConnect({
    bridge: "https://bridge.walletconnect.org", // Required
    qrcodeModal: QRCodeModal,
    clientMeta: {
      description: "WalletConnect NodeJS Client",
      url: "https://nodejs.org/en/",
      icons: ["https://nodejs.org/static/images/logo.svg"],
      name: "WalletConnect"
    }
  });

  // Create a function to connect
  let connectWallet = () => {
    if (!connector.connected) {
      connector.createSession();
    }
    // ... Event subscriptions down here ...
  };
And I call the connectWallet function from a simple onClick
  return (
    <div>
      {/* Add button to call connectWallet */}
      <button onClick={() => connectWallet()}>Connect Wallet</button>
    </div>
  );
From what I understand, it should show the clientMeta data I send to the connector, but it just shows empty strings and no image on the Pera wallet app.
The WalletConnect documentation for Pera Wallet does not seem to indicate support for clientMeta, unfortunately.
See https://github.com/algorandfoundation/ARCs/blob/main/ARCs/arc-0025.md and https://developer.algorand.org/docs/get-details/walletconnect/
However, it should still display the right URL.
You can compare what you see with https://algorand.github.io/walletconnect-example-dapp/ (that displays the URL https://algorand.github.io)
Small note: in general, you may get faster answers by posting Algorand-related questions on https://forum.algorand.org
First off, if you're not familiar with change streams, please read this.
It seems, when using lb to scaffold applications, that a change-stream endpoint is automatically created for models. I have already successfully implemented a change stream where, on submitting a new model instance to my Statement model, the changes are sent to all connected clients in real time. This works great.
Except it only sends the modelInstance of the Statement model. I need to know a bit about the user that submitted the statement as well. Since Statement has a hasOne relationship with my user model, I would normally make my query with an include filter. But I'm not making a query here... that's not how change streams work. The node server sends the information to the client without any query for that information being sent first.
My question is: how can I hook the outgoing change stream in the Statement model so that I can pull in the needed data from the user model? Something like:
module.exports = function(Statement) {
  Statement.hookChangeStream(function(ctx, statementInstance, cb) {
    const myUser = Statement.app.models.myUser;
    myUser.findOne({ 'where': { 'id': statementInstance.userId } }, function(err, userInstance) {
      if (err) return cb(err);
      // strip sensitive data from user model
      const cleanUserInstance = someCleanerFunc(userInstance);
      // add cleaned myUser modelInstance to Statement modelInstance
      statementInstance.user = cleanUserInstance;
      cb(null, true);
    });
  });
};
Can this be done? If so, how?
I have a long-running celery task which iterates over an array of items and performs some actions.
The task should somehow report back which item it is currently processing, so the end user is aware of the task's progress.
At the moment my django app and celery sit together on one server, so I am able to use Django's models to report the status, but I am planning to add more workers which are away from Django, so they can't reach the DB.
Right now I see a few solutions:
Store intermediate results manually using some storage, like redis or mongodb, making them available over the network. This worries me a little bit because if, for example, I use redis, then I have to keep the code on the Django side reading the status and the Celery task writing the status in sync, so they use the same keys.
Report status back to Django from celery using REST calls, like PUT http://django.com/api/task/123/items_processed (see the sketch after this list).
Maybe use the Celery event system and create events like Item processed, on which django updates the counter.
Create a separate worker which runs on a server with django and holds a task which only increases the items-processed count, so when the task is done with an item it issues increase_messages_proceeded_count.delay(task_id).
Are there any other solutions, or hidden problems with the ones I mentioned?
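To make option 2 concrete, here is a rough sketch of what the reporting could look like from inside the task (the endpoint URL and payload shape are made up for illustration):

import requests
from celery import shared_task

@shared_task(bind=True)
def process_items(self, items):
    for i, item in enumerate(items, start=1):
        # ... do the actual work on `item` here ...
        # Report progress back to Django; URL and payload are hypothetical.
        requests.put(
            "http://django.example.com/api/task/%s/items_processed" % self.request.id,
            json={"items_processed": i},
            timeout=5,
        )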
There are probably many ways to achieve your goal, but here is how I would do it.
Inside your long-running celery task, set the progress using Django's caching framework:
from django.core.cache import cache

@app.task(bind=True)
def long_running_task(self, *args, **kwargs):
    key = "my_task: %s" % self.request.id
    ...
    # do whatever you need to do and set the progress
    # using cache:
    cache.set(key, progress, timeout=60 * 60)  # use whatever timeout works for you
    ...
Then all you have to do is make a recurring AJAX GET request with that key and retrieve the progress from the cache. Something along these lines:
import json

from django.core.cache import cache
from django.http import HttpResponse

def task_progress_view(request, *args, **kwargs):
    key = request.GET.get('task_key')
    progress = cache.get(key)
    return HttpResponse(content=json.dumps({'progress': progress}),
                        content_type="application/json; charset=utf-8")
One caveat though: if you are running your server as multiple processes, make sure you are using a cache backend that is shared across processes, such as memcached, because Django's default local-memory cache is per-process and will be inconsistent among them. Also, I probably wouldn't use celery's task_id as a key, but it is sufficient for demonstration purposes.
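For instance, a minimal shared-cache configuration in settings.py could look like this (host and port are placeholders; PyMemcacheCache is the memcached backend on Django 3.2+, older versions use MemcachedCache):

# settings.py -- a cache that all of your server processes can reach
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.memcached.PyMemcacheCache",
        "LOCATION": "127.0.0.1:11211",
    }
}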
Take a look at flower - a real-time monitor and web admin for the Celery distributed task queue:
https://github.com/mher/flower#api
http://flower.readthedocs.org/en/latest/api.html#get--api-tasks
You need it for presentation, right? Flower works with websockets.
For instance - receive task completion events in real time (taken from the official docs):
var ws = new WebSocket('ws://localhost:5555/api/task/events/task-succeeded/');
ws.onmessage = function (event) {
    console.log(event.data);
};
You would likely need to work with tasks ('ws://localhost:5555/api/tasks/').
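If polling is acceptable, you could also hit Flower's REST API instead of using websockets; a rough sketch, assuming Flower is running locally on its default port:

import requests

# List the tasks Flower knows about and print their states.
resp = requests.get("http://localhost:5555/api/tasks")
for task_id, info in resp.json().items():
    print(task_id, info.get("state"))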
I hope this helps.
Simplest:
Your tasks and django app already share access to one or two data stores: the broker and the results backend (if you're using one that is different from the broker).
You can simply put some data into one or other of these data stores that indicates which item the task is currently processing.
e.g. if using redis, simply have a key 'task-currently-processing' and store the data relevant to the item currently being processed in there.
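A minimal sketch of that idea with redis-py (the key name and payload are arbitrary; the only requirement is that the Celery task and the Django side agree on them):

import json
import redis

r = redis.Redis(host="localhost", port=6379)

# In the Celery task, after finishing each item:
r.set("task-currently-processing", json.dumps({"item": "item-42", "index": 7}))

# On the Django side, when reporting progress:
raw = r.get("task-currently-processing")
progress = json.loads(raw) if raw else None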
You can use something like Swampdragon to reach the user from the Celery instance (you have to be able to reach it from the client though, and take care not to run afoul of CORS). It can be latched onto the counter, not the model itself.
lehins' solution looks good if you don't mind your clients repeatedly polling your backend. That may be fine but it gets expensive as the number of clients grows.
Artur Barseghyan's solution is suitable if you only need the task lifecycle events generated by Celery's internal machinery.
Alternatively, you can use Django Channels and WebSockets to push updates to clients in real-time. Setup is pretty straightforward.
Add channels to your INSTALLED_APPS and set up a channel layer. E.g., using a Redis backend:
CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "channels_redis.core.RedisChannelLayer",
        "CONFIG": {
            "hosts": [("redis", 6379)],
        },
    },
}
Create an event consumer. This will receive events from Channels and push them via WebSockets to the client. For instance:
import json

from asgiref.sync import async_to_sync
from channels.generic.websocket import WebsocketConsumer

class TaskConsumer(WebsocketConsumer):
    def connect(self):
        self.task_id = self.scope['url_route']['kwargs']['task_id']  # your task's identifier
        async_to_sync(self.channel_layer.group_add)(f"tasks-{self.task_id}", self.channel_name)
        self.accept()

    def disconnect(self, code):
        async_to_sync(self.channel_layer.group_discard)(f"tasks-{self.task_id}", self.channel_name)

    def item_processed(self, event):
        item = event['item']
        self.send(text_data=json.dumps(item))
Push events from your Celery tasks like this:
from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer

...

async_to_sync(get_channel_layer().group_send)(f"tasks-{task.task_id}", {
    'type': 'item_processed',
    'item': item,
})
You can also write an async consumer and/or invoke group_send asynchronously. In either case you no longer need the async_to_sync wrapper.
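For reference, a minimal async version of the same consumer might look like this:

import json

from channels.generic.websocket import AsyncWebsocketConsumer

class AsyncTaskConsumer(AsyncWebsocketConsumer):
    async def connect(self):
        self.task_id = self.scope['url_route']['kwargs']['task_id']
        await self.channel_layer.group_add(f"tasks-{self.task_id}", self.channel_name)
        await self.accept()

    async def disconnect(self, code):
        await self.channel_layer.group_discard(f"tasks-{self.task_id}", self.channel_name)

    async def item_processed(self, event):
        # Invoked for group messages with type 'item_processed'.
        await self.send(text_data=json.dumps(event['item']))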
Add websocket_urlpatterns to your urls.py:
from django.urls import path

websocket_urlpatterns = [
    path('ws/tasks/<task_id>/', TaskConsumer.as_asgi()),
]
Finally, to consume events from JavaScript in your client, you can do something like this:
let task_id = 123;
let protocol = location.protocol === 'https:' ? 'wss://' : 'ws://';
let socket = new WebSocket(`${protocol}${window.location.host}/ws/tasks/${task_id}/`);

socket.onmessage = function(event) {
    let data = JSON.parse(event.data);
    let item = data.item;
    // do something with the item (e.g., push it into your state container)
};
I am trying to wrap my head around Live Streaming with Server-Sent Events in Rails. I have a Rake task listening for file changes which adds records to the database. Once added, I would like to send an SSE to the frontend.
But, the model can't send events to the frontend, the controller is responsible for that. How do I tell my controller a new record was added to the database?
My (broken) solution so far: use an EventBus with an after_save callback in the model that announces the changes and asks the controller to listen for these messages:
require 'reloader/sse'

class SseController < ApplicationController
  include ActionController::Live

  def index
    response.headers['Content-Type'] = 'text/event-stream'
    sse = Reloader::SSE.new(response.stream)
    EventBus.subscribe(:added) do |payload|
      sse.write(payload) # write the announced payload to the stream
    end
  rescue IOError
  ensure
    sse.close
  end
end
I think my request ends before the event is received, meaning it will never end up in the subscribe block. Is this the right approach, and if so, what am I missing?
SOLN 1: you can use the rest_client gem to send a request to your controller from the after_save callback in your model.
SOLN 2: why don't you call a model method in your controller which creates the database records and then handles further action based on whether the record was created or not?
I have an Ember app with a login form which returns the current user in JSON format after successful login.
Using createRecord sets the returned JSON attributes directly on the model; for instance, is_private becomes user.is_private rather than user.get('isPrivate').
How do I load the user model so that the attributes are set correctly and I don't have to re-fetch it using the id?
As of a few days ago, in Ember Data 1.0 beta you can use pushPayload to load data directly into the store - for example, if you get data pushed to your app through WebSockets (we use the Heroku add-on Pusher). You can call it on the store (source) directly and it will pass the payload through the appropriate serializer:
var postsJSON = {
  posts: [
    { id: 1, post_title: "Great post" }
  ]
};

this.store.pushPayload('post', postsJSON);
NOTE that it will not currently load a singular object (i.e. post: {id: 1, post_title: "First!"}); you need to format it as plural with an array.
DS.RESTSerializer has pushPayload as well (source), in which case you need to pass it the store instead.
I highly encourage reading the source code before using, as it looks like the implementation of it will be revisited.
Supposedly, the official way to do this is using adapter.load, as described in this thread:
Loading Data
Previously, some features of the store, such as load(), assumed a single adapter.
If you want to load data from your backend without the application asking for it (for example, through a WebSockets stream), use this API:
store.adapterForType(App.Person).load(store, App.Person, payload);
This API will also handle sideloaded and embedded data. We plan to add a more convenient version of this API in the future.
But unfortunately, it doesn't handle sideloaded data, despite what the documentation claims. I personally use something like the following, which is based on how find(ID) is implemented:
var id = json["person"]["id"];
var store = DS.get("defaultStore");
var adapter = store.adapterForType(App.Person);
adapter.didFindRecord(store, App.Person, json, id);
var person = App.Person.find(id);
Note that this code assumes JSON in the same format that find(ID) expects to receive from the server, as documented in the RESTAdapter guide:
{
  person: {
    id: 1,
    is_private: false,
    projects: [3]
  },
  projects: [
    { id: 3, name: "FooReader" }
  ]
}
This will apply any transformations you've configured using keyForAttributeName (such as mapping is_private to isPrivate), and it will handle sideloaded records. I'm not sure if this is a best practice, but it works quite well.
How about store.push('user', userJSON)?
http://emberjs.com/guides/models/pushing-records-into-the-store/#toc_pushing-records
None of the answers above worked for me. The only thing that worked was:
this.store.buildRecord(this.store.modelFor('person'), data.id, data)