Elasticsearch self.published? - elasticsearch-rails

I am using the elasticsearch-rails gem. For my site I need to create custom callbacks: https://github.com/elastic/elasticsearch-rails/tree/master/elasticsearch-model#custom-callbacks
But I am really confused by one thing: what does "if self.published?" mean in this code?
I tried to use this in my model:
after_commit on: [:update] do
  # "place" was undefined inside the model's own callback; use self instead
  self.__elasticsearch__.update_document if self.published?
end
But when I inspect a model in the console I see self.published? => false, and I don't know what that means.

From the elasticsearch-rails documentation:
"For ActiveRecord-based models, use the after_commit callback to protect your data against inconsistencies caused by transaction rollbacks."
In other words, it makes sure the change has been committed to the database before it is synced to the Elasticsearch server.
As for self.published?: ActiveRecord automatically generates a published? predicate method for a boolean published column, so the guard means "only push the update to Elasticsearch if this record is marked as published". Seeing self.published? => false simply means that record's published attribute is false (or nil).

Related

How to handle network errors when saving to Django Models

I have a Django .save() call that runs in a loop n times.
My concern is how to guard against network errors during saving: some entries could be saved while others aren't, with no way of telling which.
What is the best way to make sure the whole operation completes?
Here's a sample of my code:
# SAVE DEBIT ENTRIES
for i in range(len(debit_journals)):
    # UPDATE JOURNAL RECORD
    debit_journals[i].approval_no = journal_transaction_id
    debit_journals[i].approval_status = 'approved'
    debit_journals[i].save()
Either use bulk_create / bulk_update to execute a single DB query, or use transaction.atomic as a decorator on your function, so that any error during a save rolls the database back to the state it was in before the function ran.
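To illustrate the all-or-nothing behaviour that transaction.atomic gives you, here is a minimal sketch using plain sqlite3 from the standard library (the journal table and the simulated error are made up for this example; in Django you would get the same rollback by wrapping the loop in transaction.atomic):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE journal (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO journal (status) VALUES (?)", [("pending",)] * 3)
conn.commit()

try:
    # Using the connection as a context manager wraps the body in one
    # transaction: commit on success, rollback on any exception.
    with conn:
        conn.execute("UPDATE journal SET status = 'approved'")
        raise RuntimeError("simulated network error mid-batch")
except RuntimeError:
    pass

# The UPDATE was rolled back, so no row was left half-approved.
approved = conn.execute(
    "SELECT COUNT(*) FROM journal WHERE status = 'approved'"
).fetchone()[0]
print(approved)  # 0
```

Either every row is approved or none is; there is no partial state for a caller to guess about.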
Try something like the following (I assume your model is named DebitJournal and debit_journals is a list):
for debit_journal in debit_journals:
    debit_journal.approval_no = journal_transaction_id
    debit_journal.approval_status = 'approved'

# a single UPDATE query, issued once, outside the loop
DebitJournal.objects.bulk_update(debit_journals, ["approval_no", "approval_status"])
If debit_journals is a QuerySet you can also try:
debit_journals.update(approval_no=journal_transaction_id, approval_status='approved')
It depends on what you call a network error: one between the user and the Django application, or one between the Django application and the database. If it's only between the user and the app, note that once the request has been received correctly, the objects will be created even if the user loses the connection afterwards; the user might not get the response, but the objects will still exist.
If the error is between the database and the Django application, some objects might still be created before the error occurs.
Usually, if you want all-or-nothing behaviour, you should use manual transactions as described here: https://docs.djangoproject.com/en/4.1/topics/db/transactions/
Note that if the creation takes really long you might hit the request timeout. If it takes more than a few seconds, consider making it a background task, with the request only creating the task.
See Python task queue alternatives and frameworks for third-party solutions.

How could I keep the synchronization within elasticsearch-rails

I have existing data in MySQL.
I imported all records from MySQL into Elasticsearch with a rake task.
However, I would like to know how to keep them synchronized when I delete, update, or insert a record within Rails.
How can Rails propagate those modifications to Elasticsearch? Thanks.
model
require 'elasticsearch/model'

class Job < ActiveRecord::Base
  attr_accessor :deadline_asap

  include Elasticsearch::Model
  include Elasticsearch::Model::Callbacks
end
Rake task
For initial ES indexing
task :import_all => :environment do
  Job.__elasticsearch__.create_index! force: true
  # create_index! only creates the index; import bulk-indexes the records
  Job.import
end
It should already be synchronized: adding
include Elasticsearch::Model::Callbacks
to your model ensures that any modification (create, update, or destroy) calls the Elasticsearch API.
To check this, just modify a record and run your ES search; the results should include the freshly updated model.
Update: here is the Automatic Callbacks documentation in the elasticsearch-model gem.

How can I write data about process assignees to database

I use Camunda 7.2.0 and I'm not very experienced with it. I'm trying to write data to a database about users who have done something with a process instance (I'm using the REST services), so I can build reports later. The problem is that I don't know how to trigger my REST call (which sends information about the current user and assignee to the database) when a user assigns a task to somebody else or claims a task for himself. I can see that the Camunda engine sends a request like:
link: engine/engine/default/task/5f965ab7-e74b-11e4-a710-0050568b5c8a/assignee
post: {"userId":"Tom"}
As a partial solution I can think of creating a global variable "currentUser" and, on form load, checking whether the user differs from the current one; if so, run the REST call and change the variable. But this solution doesn't look correct to me. Is there a better way to do it? Thanks in advance.
You could use a task listener that updates your data whenever the assignee of a task changes. If you want this behaviour for every task, you can define a global task listener.

Sails API (Ember frontend) rejecting changes

I'm getting started with Ember (CLI) and am trying to create a simple CRUD setup, using Sails as my API. Ember loads all the data as expected, but won't save any changes. I've made some action buttons to toggle booleans and increment a counter, but the values just revert. However, Sails' default updatedAt attribute does get set to the date of the attempted update.
It seems GET and DELETE work fine; am I perhaps missing a step for PUT and POST to work properly?
I'm using mphasize's sails-ember-blueprints, but haven't read about much trouble with them.
It turns out that I had some issues with plurality: the model name in the JSON payload was singular while the backend expected it plural. It took a little rejigging, but this was only a simple test project.

How to get django-simple-history work with Tastypie?

I need to store the full history of changes made to objects. I find django-simple-history very appealing, but it does not work with django-tastypie. If I update an object by sending data to the API with PUT, the object is updated fine but no history record is created. If I change the objects manually via ./manage.py shell, everything works. It looks like Tastypie is bypassing signals or something.
Any ideas how I could get this to work as expected?
Without seeing your code I'm going to attempt to solve this one analytically.
Looking at django-simple-history, the project does indeed create history objects in post_save/post_delete signal receivers and provides access to them through a custom models.Manager subclass.
It looks to me like what TastyPie saves is a ModelResource, not your actual Model instance. This proxy is aware of the ORM and performs all the queries on it.
So what I think happens in simple_history/models.py is that the contribute_to_class method declares models.signals.class_prepared.connect(self.finalize, sender=cls), but this signal never fires, since TastyPie does not initialize an instance of the model...
This seems strange and I cannot understand why TastyPie would do that; or maybe I'm misunderstanding something? Why don't you open an issue on the GitHub repo?
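As background on the mechanism under discussion: django-simple-history writes its history rows from a post_save receiver, so history appears only on code paths that actually call Model.save() (which sends the signal); bulk or raw updates bypass it. A minimal plain-Python stand-in for that dispatch (the Signal class and all names here are illustrative, not Django's real implementation):

```python
# A tiny stand-in for Django's signal dispatch, to show why history
# entries appear only on paths that send post_save (i.e. Model.save())
# and not on bulk or raw updates, which skip the signal entirely.
class Signal:
    def __init__(self):
        self.receivers = []

    def connect(self, receiver):
        self.receivers.append(receiver)

    def send(self, instance):
        for receiver in self.receivers:
            receiver(instance)

post_save = Signal()
history = []

# django-simple-history connects a receiver like this for each model:
post_save.connect(lambda obj: history.append(dict(obj)))

record = {"id": 1, "status": "draft"}
record["status"] = "published"
post_save.send(record)          # save() sends post_save -> history row written
print(len(history))             # 1

record["status"] = "archived"   # a bulk UPDATE changes data without save(),
                                # so no signal fires and no history row appears
print(len(history))             # still 1
```

So the thing to verify is whether the code path Tastypie takes on PUT actually calls the instance's save() method; if it does, the post_save receiver should fire, and the bug lies elsewhere.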