Best way to globally use a single file across a network - web-services

I am creating an app for a company. This company has one server with multiple databases; each company location uses its own database on this single server, and no location can see another location's database.
They all want a dictionary for adding words through spellcheck. These words will be saved to a lexicon file.
As the programmer, I want this lexicon file to reside on the server, with a copy deployed to the client machines on program startup. My question is: what would be the best option for getting the newly added words back into this parent lexicon file and then subsequently updating the clients on a file-changed event?
Would a web service with a FileSystemWatcher work? Or should I just add the words to each database table, parse them out into a lexicon file, and deploy it to the client machines every time an update occurs?
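The second option (storing added words in a database table and regenerating the lexicon file from it) can be sketched in plain Python. This is only a sketch: the `custom_words` table, column name, and file names are hypothetical, and SQLite stands in for whatever server database is actually in use.

```python
import sqlite3

def export_lexicon(db_path, lexicon_path):
    """Regenerate the lexicon file from the words table.

    The 'custom_words' table and its column name are hypothetical;
    substitute your real schema and database driver.
    """
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT DISTINCT word FROM custom_words ORDER BY word"
        ).fetchall()
    finally:
        conn.close()
    # One word per line, the usual layout for spellcheck lexicon files.
    with open(lexicon_path, "w", encoding="utf-8") as f:
        for (word,) in rows:
            f.write(word + "\n")

# Demo: build a tiny words table, then export it to a lexicon file.
conn = sqlite3.connect("words.db")
conn.execute("CREATE TABLE IF NOT EXISTS custom_words (word TEXT)")
conn.executemany("INSERT INTO custom_words VALUES (?)",
                 [("lexicon",), ("spellcheck",)])
conn.commit()
conn.close()
export_lexicon("words.db", "lexicon.txt")
```

The regenerated file could then be pushed to (or pulled by) the clients whenever an update occurs, which keeps the database as the single source of truth.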

Related

what is the best method to initialize or store a lookup dictionary that will be used in django views

I'm reviving an old Django 1.2 app; most of the steps have already been taken.
I have views in my Django app that will reference a simple dictionary of only about 1,300 key-value pairs.
Basically the view will query the dictionary a few hundred to a few thousand times for user-supplied values. The dictionary data may change twice a year or so.
FWIW: Django served by Gunicorn, db = Postgres, Apache as proxy, no Redis available on the server yet.
I thought of a few options here:
A table in the database that will be queried, letting caching do its job (at the expense of a few hundred SQL queries).
Simply define the dictionary in the settings file (ugly, and how many times is it read? Every time you do a 'from django.conf import settings'?). This is how it was coded in the Django 1.2 predecessor of this app many years ago.
Read a tab-delimited file using Pandas in the Django settings and make it available; the advantage is that I can do some Pandas magic in the view. (How efficient is this? Will the file be read many times for different users, or just once during server startup?)
Prepopulate a Redis cache from a file as part of the startup process (complicates things on the server side, and we want to keep it simple, but it's fast).
Read the items from a tab-delimited file in the view itself (my least favorite option, since it seems rather slow).
What are your thoughts on this? Any other options?
Let me give a few options, from simple to more involved:
Hold it in memory
Basic flat file
SQLite file
Redis
DB
I wouldn't bring Redis in for 1,300 key-value pairs that don't even get mutated all that much.
I would put a file alongside the code that gets slurped into memory at startup, or do a single SQL query that grabs the entire thing at startup and keeps it in memory to use throughout the application.
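The "slurp it into memory at startup" approach can be sketched as a module that loads the file once at import time. The file name and layout (one tab-delimited key/value pair per line) are assumptions; the demo creates the file itself so the sketch is self-contained.

```python
import csv

# Create a tiny demo lookup file (in real life this ships with the code).
with open("lookup.tsv", "w", encoding="utf-8") as f:
    f.write("NL\tNetherlands\nBE\tBelgium\n")

def _load(path):
    """Read a tab-delimited key<TAB>value file into a dict."""
    with open(path, newline="", encoding="utf-8") as f:
        return dict(csv.reader(f, delimiter="\t"))

# Module-level constant: in a Django app this line runs once per worker
# process when the module is first imported, not once per request.
LOOKUP = _load("lookup.tsv")

print(LOOKUP["NL"])  # -> Netherlands
```

Because Python caches imported modules, a view that does `from lookups import LOOKUP` pays the file-read cost once per Gunicorn worker process, after which every lookup is an in-memory dict access.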

DB2 - Read Write Locks

I am working on a web application which involves inventory management. My application uses DB2 as the database.
1) In my application, there is a module which just inserts records. This can happen at any time, since the records are entered by customers.
2) There is another stand-alone module which reads and updates the records entered. This module never inserts records; it only updates existing ones. It is scheduled to run once an hour.
My question is: can the second module read and update records without issue while the first module is inserting a record at the same time? I am not referring to the record being entered at that moment, but to the other records in the table that need processing. (Bottom line: when the first module inserts data, can my second module read and update data in separate rows of the same table at the same time?)
I am very new to DB2 and have heard about locking in DB2, which is why I raised this question.
Some additional information about my application: both modules are written in Java, the second module is a Spring Boot application, and the operating system is Windows.
Thank you in advance.

How to automatically update tabular data or database

I'm looking for an approach to handle updating some tabular data (i.e. a txt file or a database) periodically (e.g. once a day). The GUI should be able to access this data at any time. I'm OK with storing the data on the local host or on a server, but for testing I will start with a local PC as the host. The naive approach is to update the data every time the user opens the GUI. This works just fine, but in the future I need the data to be updated automatically. What is the right approach for this issue?
If you are storing the data in a .txt file you can use QFileSystemWatcher, which emits the fileChanged signal whenever that particular file is changed. Based on this signal you can update your GUI.
It should be possible for databases too.
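If Qt is not in the picture, the same idea can be approximated in plain Python by polling the file's modification time (QFileSystemWatcher does this more efficiently via OS notifications). This is only a stand-in sketch; the file name, callback, and `max_polls` parameter are all inventions for the demo.

```python
import os
import threading
import time

def watch_file(path, on_change, interval=1.0, max_polls=None):
    """Poll `path` and call `on_change(path)` whenever its mtime changes.

    A crude stand-in for QFileSystemWatcher's fileChanged signal;
    `max_polls` exists only so the demo below terminates.
    """
    last = os.path.getmtime(path)
    polls = 0
    while max_polls is None or polls < max_polls:
        time.sleep(interval)
        polls += 1
        current = os.path.getmtime(path)
        if current != last:
            last = current
            on_change(path)

# Demo: watch a file on a background thread and record change events.
changes = []
with open("data.txt", "w") as f:
    f.write("v1")
t = threading.Thread(target=watch_file, args=("data.txt", changes.append),
                     kwargs={"interval": 0.2, "max_polls": 10})
t.start()
time.sleep(0.5)
# Touch the file with a fixed, different mtime so the demo is
# deterministic even on filesystems with coarse timestamp resolution.
os.utime("data.txt", (0, 0))
t.join()
print(changes)
```

In a GUI the `on_change` callback would refresh the displayed table; with Qt, connecting QFileSystemWatcher's fileChanged signal to a slot replaces the polling loop entirely.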

Django processes concurrency

I am running a Django app with 2 processes (Apache + mod_wsgi).
When a certain view is called, the contents of a folder are read, and the process adds entries to my database based on which files in the folder are new or updated.
When two such views execute at the same time, both see the new file and both want to create a new entry. I cannot manage to have only one of them write the new entry.
I tried to use select_for_update with transaction.atomic() and get_or_create, but without any success (maybe I used them wrongly?).
What is the proper way of locking to avoid writing an entry with the same content twice with get_or_create?
I ended up enforcing uniqueness at the database (model) level and catching the resulting IntegrityError in the code.
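That pattern, a unique constraint plus catching the resulting error, can be sketched outside Django with the stdlib sqlite3 module. The `entry` table and `add_entry` helper are hypothetical; in a Django model you would declare the field with `unique=True` (or a `UniqueConstraint` in `Meta`) and catch `django.db.IntegrityError` the same way.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entry (path TEXT UNIQUE)")

def add_entry(path):
    """Insert once; a concurrent duplicate raises IntegrityError."""
    try:
        with conn:  # commits on success, rolls back on error
            conn.execute("INSERT INTO entry (path) VALUES (?)", (path,))
        return True   # this process created the entry
    except sqlite3.IntegrityError:
        return False  # the row already exists: another process won the race

print(add_entry("new_file.txt"))  # True
print(add_entry("new_file.txt"))  # False: duplicate rejected by the DB
```

The point of pushing the check down to the database is that the constraint is enforced atomically by the DB itself, so it holds across any number of application processes, which application-level checks like a plain get-then-create cannot guarantee.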

How to modify a scope in Sync Framework?

I am new to using sync framework and need help in fixing an issue.
The system we built is a Windows-based application. Each user has their own database locally; at the end of the day they sync their database to the remote DB server when they are within the network.
I added two new columns to an existing table. The scope definition seems to be updated in my local database, but when I try to sync with my remote DB server it says it could not find the _bulkinsert stored procedure and errors out.
When I checked my remote DB server, I could see the new columns in the table, but I don't see any of the stored procedures, and the scope_config table does not have the new columns in it.
Does the remote server need to have the stored procedures, or will updating the scope_config table do?
Have you provisioned your remote DB server? If you're not finding the Sync Fx related objects, then it's not provisioned.
Likewise, Sync Fx does not support schema synchronisation, and there's nothing in the API to allow you to alter the scope definition either.
It's either you drop and re-create the scope and re-sync, or you hack your way into the Sync Fx scope definition metadata.