Localforage: switching between user-defined stores

I would like to allow a user to switch between different stores so that data stays isolated. For example, a user might switch between several sets of data contained in a "March2020" store or a "September2019" one.
I would also like the previously used store to be the default when the program starts.
The problem I am facing is that you cannot change the config once you have made a call to localforage, and I need to make such a call to retrieve the previous store:
const conf = localforage.config();

function getCurrentStore() {
    console.log(conf.name, conf.storeName);
    localforage.getItem('semYear').then(function(newStore) { // this would get the current store name
        localforage.config({
            storeName: newStore,
        });
        console.log("New store is: ", conf.name, conf.storeName);
    }).catch(function(err) {
        console.log(err);
    });
}
Since I have already called "getItem", this does not change the store.
I could append the month and year to each database in the store (e.g. "addresses-March2020" or "nameData-March2020"), but that seems horribly clunky.
I also tried
let currentMonth = getUserInput();

currentMonth = localforage.createInstance({
    name: "nameHere"
});
But this just thinks I am trying to redefine the variable "currentMonth".
So what is the best way to allow a user to switch between different sets of data? I am sure I am missing something basic, but I cannot work it out, and it is important that I am able to prevent databases with the same name from contaminating each other.
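For reference, localforage's createInstance API is designed for exactly this kind of isolation. A minimal sketch of the pattern (the store names, the "appMeta" instance, and the function names here are illustrative assumptions, not tested code):

// Sketch only: createInstance() returns a separate localforage instance
// whose data is isolated by its (name, storeName) pair. The "meta"
// instance is an assumed convention for remembering the last-used store.
const meta = localforage.createInstance({ name: "appMeta" });

function openStore(storeName) {
    // Each data set gets its own instance; no global config() switching.
    return localforage.createInstance({
        name: "appData",
        storeName: storeName, // e.g. "March2020" or "September2019"
    });
}

async function startup() {
    // Fall back to a default store if none was saved yet.
    const saved = (await meta.getItem("currentStore")) || "March2020";
    return openStore(saved);
}

async function switchStore(storeName) {
    await meta.setItem("currentStore", storeName);
    return openStore(storeName);
}

Because each instance is independent, two stores containing the same key names cannot contaminate each other.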

Related

Ember.js computed getter making requests to the backend multiple times, causing an infinite loop

I have a table where basically every row has a getter that makes a backend request via the store service. Somehow, when there is one row it works as expected, but when there are multiple rows it always tries to recalculate the getter, which sends infinite requests to the backend. I am using a Glimmer component.
I cannot use a model relation on the Ember side at this point; there is a deep chain on the backend side. That's why I am making backend requests.
get <function_name>() {
  return this.store.query('<desired_model_name>', {
    <dependent1_id>: <dependent1_id_from_args>,
    <dependent2_id>: <dependent2_id_from_args>
  });
}
I fixed this problem by using the constructor. But do you have any idea why this getter recalculates all the time? The dependent ids are constant.
The weird thing is that when the result is an empty array ([]), it does not recalculate every time. Yet even when the query results are the same, it still tries to recalculate every time, making infinite requests to the backend.
But do you have any idea why this getter recalculates all the time?
When something like this happens, it's because you're reading @tracked data that is changed later (maybe when the query finishes).
Because getters are re-run on every access, you'll want to put @cached on top of it:
// cached is available in ember-source 4.1+
// or as early as 3.13 via polyfill:
// https://github.com/ember-polyfills/ember-cached-decorator-polyfill
import { cached } from '@glimmer/tracking';

// ...

@cached
get <function_name>() {
  return this.store.query(/* ... */);
}
This ensures a stable object reference on the getter, so that the body of the getter only re-evaluates when tracked data accessed within the getter changes.
The weird thing is that when the result is an empty array ([]), it does not recalculate every time. Yet even when the query results are the same, it still tries to recalculate every time, making infinite requests to the backend.
Given this observation, it's possible that when the query finishes, it changes tracked data that it itself consumed during the initial render, in which case you'd still have an infinite loop, even with @cached (because tracked data that was accessed during render is changing).
Getting around that in a getter is fairly hard.
Using a constructor is an OK solution for getting your initial data, but it means you opt out of reactive updates for your query (if you need those, for example if the query changes or anything).
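For illustration, a minimal sketch of that constructor approach in a Glimmer component (the component, model, and argument names here are made up; only the shape matters):

import Component from '@glimmer/component';
import { tracked } from '@glimmer/tracking';
import { inject as service } from '@ember/service';

export default class RowComponent extends Component {
  @service store;
  @tracked records;

  constructor(owner, args) {
    super(owner, args);
    // The query runs exactly once per component instance, so no tracked
    // read inside a getter can re-trigger it. The trade-off: the data
    // will not refetch if the args change later.
    this.store
      .query('some-model', { parentId: this.args.parentId })
      .then((records) => (this.records = records));
  }
}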
If you're using ember-source 3.25+ and you want something a little easier to work with, maybe ember-data-resources suits your needs.
The above code would become:
import { query } from 'ember-data-resources';

// ...

// in the class body
data = query(this, 'model name', () => ({ /* query stuff */ }));
The ember-data-resources docs cover the details.
This builds off some primitives from ember-resources, which implement the Resource pattern; Resources will be making a strong appearance in the next edition of Ember.

Joining a stream against a "table" in Dataflow

Let me use a slightly contrived example to explain what I'm trying to do. Imagine I have a stream of trades coming in, with the stock symbol, share count, and price: { symbol = "GOOG", count = 30, price = 200 }. I want to enrich these events with the name of the stock, in this case "Google".
For this purpose I want to, inside Dataflow, maintain a "table" of symbol->name mappings that is updated by a PCollection<KV<String, String>>, and join my stream of trades with this table, yielding e.g. a PCollection<KV<Trade, String>>.
This seems like a thoroughly fundamental use case for stream processing applications, yet I'm having a hard time figuring out how to accomplish this in Dataflow. I know it's possible in Kafka Streams.
Note that I do not want to use an external database for the lookups – I need to solve this problem inside Dataflow or switch to Kafka Streams.
I'm going to describe two options: one using side inputs, which should work with the current version of Dataflow (1.X), and one using state within a DoFn, which should be part of the upcoming Dataflow (2.X).
Solution for Dataflow 1.X, using side inputs
The general idea here is to use a map-valued side input to make the symbol->name mapping available to all the workers.
This table will need to live in the global window (so nothing ever ages out), will need to trigger on every element (or as often as you want new updates to be produced), and will need to accumulate elements across all firings. It will also need some logic to take the latest name for each symbol.
The downside to this solution is that the entire lookup table will be regenerated every time a new entry comes in, and it will not be immediately pushed to all workers; rather, each worker will get the new mapping "at some point" in the future.
At a high level, this pipeline might look something like this (I haven't tested this code, so there may be some typos):
PCollection<KV<Symbol, Name>> symbolToNameInput = ...;

final PCollectionView<Map<Symbol, Iterable<Name>>> symbolToNames = symbolToNameInput
    .apply(Window.<KV<Symbol, Name>>into(new GlobalWindows())
        .triggering(Repeatedly.forever(AfterProcessingTime
            .pastFirstElementInPane()
            .plusDelayOf(Duration.standardMinutes(5))))
        .accumulatingFiredPanes())
    .apply(View.<Symbol, Name>asMultimap());
Note that we had to use View.asMultimap here. This means that we actually build up all the names for every symbol; when we look things up, we'll need to make sure to take the latest name in the iterable.
PCollection<Detail> symbolDetails = ...;

symbolDetails
    .apply(ParDo.withSideInputs(symbolToNames).of(new DoFn<Detail, AugmentedDetails>() {
      @Override
      public void processElement(ProcessContext c) {
        Iterable<Name> names = c.sideInput(symbolToNames).get(c.element().symbol());
        Name name = chooseName(names);
        c.output(augmentDetails(c.element(), name));
      }
    }));
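The chooseName helper isn't defined above; assuming each Name carries some ordering field such as an update timestamp (an assumption, since the answer doesn't specify one), a minimal version might be:

// Hypothetical helper: pick the most recently updated name out of all
// names accumulated for a symbol. Assumes Name exposes getTimestamp(),
// which is not part of the original answer.
private static Name chooseName(Iterable<Name> names) {
  Name latest = null;
  for (Name candidate : names) {
    if (latest == null || candidate.getTimestamp() > latest.getTimestamp()) {
      latest = candidate;
    }
  }
  return latest;
}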
Solution for Dataflow 2.X, using the State API
This solution uses a new feature that will be part of the upcoming Dataflow 2.0 release. It is not yet part of the preview releases (currently Dataflow 2.0-beta1) but you can watch the release notes to see when it is available.
The general idea is that keyed state allows us to store some values associated with the specific key. In this case, we're going to remember the latest "name" value we've seen.
Before running the stateful DoFn, we're going to wrap each element into a common element type (a NameOrDetails object). That would look something like the following:
// Convert SymbolToName entries to KV<Symbol, NameOrDetails>
PCollection<KV<Symbol, NameOrDetails>> left = symbolToName
    .apply(ParDo.of(new DoFn<SymbolToName, KV<Symbol, NameOrDetails>>() {
      @ProcessElement
      public void processElement(ProcessContext c) {
        SymbolToName e = c.element();
        c.output(KV.of(e.getSymbol(), NameOrDetails.name(e.getName())));
      }
    }));
// Convert detailed entries to KV<Symbol, NameOrDetails>
PCollection<KV<Symbol, NameOrDetails>> right = details
    .apply(ParDo.of(new DoFn<Details, KV<Symbol, NameOrDetails>>() {
      @ProcessElement
      public void processElement(ProcessContext c) {
        Details e = c.element();
        c.output(KV.of(e.getSymbol(), NameOrDetails.details(e)));
      }
    }));
// Flatten the two streams together and run the stateful DoFn
PCollectionList.of(left).and(right)
    .apply(Flatten.pCollections())
    .apply(ParDo.of(new DoFn<KV<Symbol, NameOrDetails>, AugmentedDetails>() {
      @StateId("name")
      private final StateSpec<ValueState<String>> nameSpec =
          StateSpecs.value(StringUtf8Coder.of());

      @ProcessElement
      public void processElement(ProcessContext c,
          @StateId("name") ValueState<String> nameState) {
        NameOrDetails e = c.element().getValue();
        if (e.isName()) {
          nameState.write(e.getName());
        } else {
          String name = nameState.read();
          if (name == null) {
            // Use the symbol if we haven't received a mapping yet.
            name = c.element().getKey();
          }
          c.output(e.getDetails().withName(name));
        }
      }
    }));
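NameOrDetails is not an SDK type; the answer assumes it exists. A minimal hand-rolled tagged union consistent with the usage above might look like this (a sketch; a real pipeline would also need a Coder registered for it):

// Minimal tagged union carrying either a name or a Details payload.
public class NameOrDetails implements Serializable {
  private final String name;     // non-null for "name" elements
  private final Details details; // non-null for "details" elements

  private NameOrDetails(String name, Details details) {
    this.name = name;
    this.details = details;
  }

  public static NameOrDetails name(String name) {
    return new NameOrDetails(name, null);
  }

  public static NameOrDetails details(Details details) {
    return new NameOrDetails(null, details);
  }

  public boolean isName() { return name != null; }
  public String getName() { return name; }
  public Details getDetails() { return details; }
}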

LoopBack operation hook: add filter to count API

I need to intercept my LoopBack queries before they hit my MongoDB in order to add additional filters, for example to limit the objects to what the user has access to.
I can successfully update the query in the access operation hook to add filters to GET /Applications, where Applications is my object. However, this fails to work for GET /Applications/count.
The command runs with a 200, but it returns zero results, even though I'm adding the exact same filters. There must be something different about count that I'm missing. The ctx object seems to have a ton of functions/objects in it. I'm only touching the query property, but there must be something else I need to do.
Any ideas? Thank you, Dan
Could you please share your access hook observer's implementation? I tried it on a sample app, and the following access hook works as expected for /api/Books/count:
module.exports = function(Book) {
  Book.observe('access', function logQuery(ctx, next) {
    ctx.query.where.id = 2; // changing filter value for where
    console.log('Accessing %s matching %j', ctx.Model.modelName, ctx.query.where);
    next();
  });
};
Verify that you're modifying the query property of the context (see the access hook documentation).
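One guess at why /count behaves differently: depending on the request, ctx.query or ctx.query.where may be undefined when no filter was passed, so writing into it blindly can misfire. A defensive variant of the hook above (getCurrentUserId is a hypothetical helper standing in for however you resolve the current user):

module.exports = function(Book) {
  Book.observe('access', function limitToUser(ctx, next) {
    // ctx.query and ctx.query.where are not guaranteed to exist when
    // the request carries no filter (e.g. a bare GET /Books/count).
    ctx.query = ctx.query || {};
    ctx.query.where = ctx.query.where || {};
    ctx.query.where.ownerId = getCurrentUserId(ctx); // hypothetical helper
    next();
  });
};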
Hope that helps.

clarification of Ember's this.get() method

This is more of a general question than anything specific, but I'm new to Ember and don't really understand when and how to use Ember's this.get('foo') (and similarly bar.get('foo')).
For example, in my route I have a user object on which there is a property called credits:
user = this.store.find('user', userId)
console.log(user)
credits = user.get('credits')
console.log(credits)
My console.log shows me that user.content._data.credits has a value, and that there are methods called get content and, more specifically, get credits. However, console.logging credits always returns undefined.
If I set the user as a model, though, using this.get('user.credits') in my controller works fine.
I've read the docs about the advantages .get offers with computed properties, but could anyone concisely explain some ground rules for when to use this.get('foo') vs. bar.get('foo'), and why it works in some places but not others?
Thanks!
You always need to use Em.get and Em.set for getting and setting properties of an Ember.Object. That's the basic rule. Without it you may run into a variety of bugs in observers, rendering, and other places.
There is a misunderstanding of the operation flow in your code: this.store.find always returns a promise object, not the actual data that you requested. In detail:
user = this.store.find('user', userId) // user - Em.RSVP.Promise object
console.log(user) // logs the Em.RSVP.Promise object
credits = user.get('credits') // gets property 'credits' of the Em.RSVP.Promise object (user)
console.log(credits) // always logs `undefined` because there is no property called 'credits' on the Em.RSVP.Promise prototype
We must rely on the async nature of the promise and rewrite the code like this:
this.store.find('user', userId).then(function(user) {
console.log(user) // logs the App.UserModel object with actual data
credits = user.get('credits') // gets property 'credits' of the App.UserModel instance (user)
console.log(credits) // logs real data from the model
});
There is another important part of getting properties from a model object if you're using ember-data as the data layer: you need to declare all the fields of the model that you wish to get afterwards.
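For example, a sketch of such a declaration in the pre-Octane style the code above uses (the attribute type is an assumption):

// app/models/user.js -- 'credits' must be declared here, or
// user.get('credits') will return undefined even if the server sent it.
App.UserModel = DS.Model.extend({
  credits: DS.attr('number')
});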

How to manually set a primary key in Doctrine2

I am importing data into a new Symfony2 project using Doctrine2 ORM.
All new records should have an auto-generated primary key. However, for my import, I would like to preserve the existing primary keys.
I am using this as my Entity configuration:
type: entity
id:
  id:
    type: integer
    generator: { strategy: AUTO }
I have also created a setter for the id field in my entity class.
However, when I persist and flush this entity to the database, the key I manually set is not preserved.
What is the best workaround or solution for this?
The following answer is not mine but the OP's; it was originally posted in the question. I've moved it into this community wiki answer.
I stored a reference to the Connection object and used that to manually insert rows and update relations. This avoids the persister and identity generators altogether. It is also possible to use the Connection to wrap all of this work in a transaction.
Once you have executed the insert statements, you may then update the relations.
This is a good solution because it avoids any potential problems you may experience when swapping out your configuration on a live server.
In your init function:
// Get the Connection
$this->connection = $this->getContainer()->get('doctrine')->getEntityManager()->getConnection();
In your main body:
// Loop over my array of old data adding records
$this->connection->beginTransaction();
foreach(array_slice($records, 1) as $record)
{
    $this->addRecord($records[0], $record);
}
try
{
    $this->connection->commit();
}
catch(Exception $e)
{
    $output->writeln($e->getMessage());
    $this->connection->rollBack();
    exit(1);
}
Create this function:
// Add a record to the database using Connection
protected function addRecord($columns, $oldRecord)
{
    // Insert data into Record table
    $record = array();
    foreach($columns as $key => $column)
    {
        $record[$column] = $oldRecord[$key];
    }
    $record['id'] = $record['rkey'];

    // Insert the data
    $this->connection->insert('Record', $record);
}
You've likely already considered this, but my approach would be to set the generator strategy to 'none' for the import, so that you can manually set the existing ids in your client code. Then, once the import is complete, change the generator strategy back to 'auto' to let the RDBMS take over from there. A conditional can determine whether the id setter is invoked. Good luck, and let us know what you end up deciding to use.
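In mapping terms, that temporary switch would just change the strategy in the YAML shown earlier (a sketch; NONE is Doctrine's assigned-identifier strategy):

type: entity
id:
  id:
    type: integer
    generator: { strategy: NONE }

With strategy NONE, Doctrine expects the application to assign the id before persisting, which is exactly what the import needs; switching back to AUTO afterwards restores database-generated keys.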