Oracle APEX page item set in an Ajax callback does not persist

I have the following code in my ajax callback (PL/SQL):
:P1_CNT := TO_NUMBER(:P1_CNT) + 1;
apex_util.set_session_state
(p_name => 'P1_CNT'
,p_value => :P1_CNT
);
This seems to work just fine. But then a process is called that checks that page item, and the page item comes up as 0 even though the callback set it to 1. How can I fix that?
The code that calls the Ajax callback is the JavaScript below, executed from a custom dynamic action:
for (var i = 0; i < records.length; i++) {
  apex.server.process(
    "my_ajax_callback",
    { x01: records[i][1] },
    { type: 'GET', dataType: 'text', success: function(text) {} }
  );
}
apex.page.submit( 'COMPLETE_PROCESS_RECORDS' );
Where COMPLETE_PROCESS_RECORDS is the process that executes once all the records in the loop have been processed by the Ajax callback. The Ajax callback evaluates each record passed to it, processing some and discarding others. P1_CNT is incremented every time a record is processed further.

There's server-side (PL/SQL in this case) and client-side (JavaScript) code. For the server to get values from the client side, you have to send them in when calling the Ajax callback. That's what the pData parameter is for:
https://docs.oracle.com/en/database/oracle/application-express/19.2/aexjs/apex.server.html#.process
You can access the values you send to the server-side code in different ways depending on how you send them in. For example, if you send in a value with x01, you can refer to it in your PL/SQL code with apex_application.g_x01.
Of course, sometimes you need to get values from the server-side to the client-side. For this, typically you'd send an HTTP response from your PL/SQL code. Here's an example that sends a JSON object:
apex_json.open_object();
apex_json.write('hello', 'world');
apex_json.close_object();
You would then need to update your client-side code to look at and use the HTTP response to map the values to whatever part of the page/DOM you need.
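For example, a rough sketch of the client side, assuming the callback writes the JSON object above (the x01 value and the reuse of P1_CNT here are only illustrative placeholders):
apex.server.process(
  "my_ajax_callback",
  { x01: someValue },                 // placeholder; available as apex_application.g_x01 in PL/SQL
  { dataType: 'json',                 // parse the HTTP response as JSON
    success: function(pData) {
      // pData is the object written by apex_json, e.g. { "hello": "world" }
      apex.item("P1_CNT").setValue(pData.hello);  // map a returned value onto a page item
    }
  }
);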
This is so typical that the APEX team made it very simple if you're using the Dynamic Action framework instead of raw JavaScript. There's an action named Execute PL/SQL that has attributes named Items to Submit and Items to Return that can do the heavy lifting for you.

Related

LoopBack operation hook: add filter to count API

I need to intercept my LoopBack queries before they hit MongoDB so I can add additional filters, for example, to limit the objects to what the user has access to.
I can successfully update the query in the access operation hook to add filters to GET /Applications, where Applications is my model. However, this fails to work for GET /Applications/count.
The request returns a 200, but it returns zero results, even though I'm adding the exact same filters. There must be something different about count that I'm missing. The ctx object seems to have a ton of functions/objects in it. I'm only touching the query property, but there must be something else I need to do.
Any ideas? Thank you, Dan
Could you please share your access hook observer's implementation? I tried it on a sample app, and the following access hook works as expected for /api/Books/count:
module.exports = function(Book) {
  Book.observe('access', function logQuery(ctx, next) {
    ctx.query.where.id = 2; // changing the filter value for the where clause
    console.log('Accessing %s matching %j', ctx.Model.modelName, ctx.query.where);
    next();
  });
};
Verify that you're modifying the query property of the context (see the access hook documentation).
Hope that helps.

Django view called twice in Firefox

EDIT: This question fixed my issue: "What happens when no response is received for a request? I'm seeing retries"
I have a weird issue. I have a Django application. The frontend has two forms, which I submit together via the same Ajax POST method:
// code to get and format the form data into the format I need in the backend
var webURL = "showFormOutput";
$.post(webURL, dataToSend, callback)
  .fail(function(jqXHR, textStatus, errorThrown) {
    console.log("Status: " + textStatus);
    console.log("Error: " + errorThrown);
  });
In the Django view corresponding to that URL, I do the following:
1. Print a message "View function" to log that I am in the view.
2. Parse the POST data and put it into the appropriate variables. Pass these variables to a Python function in a separate file; let us call this function getQueryResults.
3. Create two Postgres queries (one for each form) using these variables. Print a log message "Query created".
4. Use Python's threading module to execute these two queries. I run threading.Thread(target=threadFunction, args=(query,)).start() twice to create two separate threads, one per query.
5. In threadFunction, log "Query result ready" when the Postgres query returns.
This is a long-running query - it takes up to ten minutes.
In Chrome, this works perfectly. However, in Firefox (and Safari too I think), I get the following issues:
When using Firefox, I can see in pg_stat_activity that two queries are running, as expected. However, after about 5 minutes, there are FOUR queries running, i.e. two more copies of the SAME queries have been started.
This is reflected in my logs also: I get the messages "View function" and "Query created" printed AGAIN. This means that somehow the execution restarted from the view. This happens even after increasing http.response.timeout to 30000.
Sometimes, Firefox just goes to the fail() case of the $.post and textStatus just says "Error" and errorThrown is blank. Sometimes, it prints my query results as expected but it waits for the second set of queries to finish. I thought it might be an issue with my Python threads but it makes no sense to have the whole view executed again since the threads never call the view anywhere! On the client side, the POST request doesn't seem to be sent again.
Note: I am using Django's filesystem-based cache to cache the query results. Could this be an issue? The second, duplicate set of queries turns up even before the first set returns results, so I doubt it.
Has anyone got any idea why using Firefox would result in my view being called twice?

Ember Data has duplicate records

In my app, a user can create a message and send it. When the user sends the message, the message is created with createRecord and the server replies with 201 Created if successful.
Also, the user can get messages from other users through a websocket. When a message arrives, I push it into the store with pushPayload:
var parsedData = JSON.parse(data);
this.store.pushPayload('message', parsedData);
The problem is, when a user sends a message and saves it, they also get it back from the websocket, and even though both objects have the same id, the store ends up with duplicate messages.
How can I tell the store that when I push or save something with the same id as an already existing element, it should overwrite it?
Simply perform a check to see whether the model is already in the store before adding it:
var parsedData = JSON.parse(data);
if (this.store.hasRecordForId('typeOfYourRecord', parsedData.id)) {
  // logic you want to run when the model is already in the store
  var existingItem = this.store.find('typeOfYourRecord', parsedData.id);
  // perform updates using the returned data here
} else {
  this.store.pushPayload('message', parsedData);
}
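Depending on your Ember Data version, store.find returns a promise, so the update branch above might look roughly like this (the parsedData.message shape and the attribute copy are assumptions, not from the original answer):
this.store.find('message', parsedData.id).then(function(existing) {
  // hypothetical: copy the incoming payload's attributes onto the already-loaded record
  existing.setProperties(parsedData.message);
});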
The only method I found to avoid this problem is to run my update in a new runloop. If the delay in ms is long enough, the problem won't occur.
It seems that receiving the update from the websocket and the response to the save request at nearly the same time creates a race condition in Ember Data.
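A rough sketch of that workaround, assuming the websocket handler above and an arbitrary 500 ms delay:
var parsedData = JSON.parse(data);
// schedule the push in a later runloop so it doesn't race with the save's response
Ember.run.later(this, function() {
  this.store.pushPayload('message', parsedData);
}, 500);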

Facebook JavaScript API: run a line of code once async calls are completed

I have a piece of code which makes multiple nested calls to FB.api to retrieve certain information. Eventually, it creates an object called "myfriends" and stores my desired information in that object.
What I want to do is to use that object, after it is filled in with data (i.e. after all the async calls are done), to run something else. In other words, I need a way for my code to know that those calls are complete. How can I do that?
Use the 'myfriends' object after the async request has completed.
Example:
FB.api('/me', function(response) {
  alert('Your name is ' + response.name);
  // use the 'myfriends' object here
});
I ended up using callback functions. The other problem I had was that my inner API call was in a loop; I ended up using an asynchronous looping function. This combination solved my problem.
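One common way to know that every call in a loop has finished is a simple pending counter; this is a rough sketch of that idea, not necessarily the asynchronous looping helper used above (friendIds, the requested paths, and onAllDone are hypothetical placeholders):
var myfriends = {};
var friendIds = ['123', '456', '789']; // hypothetical list driving the loop
var pending = friendIds.length;

friendIds.forEach(function(id) {
  FB.api('/' + id, function(response) {
    myfriends[id] = response; // store this friend's data
    pending -= 1;
    if (pending === 0) {
      onAllDone(myfriends);   // all async calls have completed
    }
  });
});

function onAllDone(friends) {
  console.log('All FB.api calls finished', friends);
}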

How to make ArrayController ignore DS.Store's transient record

I have a list of clients displayed through a ClientsController; its content is set to Client.find(), i.e. a RecordArray. The user creates a new client through a ClientController whose content is set to Client.createRecord() in the route handler.
All works fine; however, while the user fills in the client creation form, the clients list gets updated with the new client record, the one created in the route handler.
What's the best way to make the RecordArray/store unaware of the new record until the record is saved?
UPDATE:
I ended up filtering the list based on the record's status:
{{#unless item.isNew}} Display the list {{/unless}}
UPDATE 2:
Here's an alternative way using filter; however, the store has to be loaded first through the find method. App.Client.find().filter() doesn't seem to behave the same way as the two methods do when called separately.
// Load the store first
App.Client.find();

var clients = App.Client.filter(function(client) {
  console.info(client.get('name') + ' ' + client.get('isNew'));
  return !client.get('isNew');
});

controller.set('content', clients);
A few ways to go about this:
First, it's very messy for a route/state that deals with a list of clients to have to go out of its way to filter out junk left over from another unrelated state (i.e. the newClient state). I think it'd be way better for you to delete the junk record before leaving the newClient state, along the lines of:
if (client.get("isNew")) {
  client.deleteRecord();
}
This will make sure it doesn't creep into the clientIndex route, or any other client list route that shouldn't have to put in extra work to filter out junk records. This code would ideally sit in the exit function of your newClient route so it can delete the record before the router transitions to another state that'll call Client.find().
But there's an even better, idiomatic solution: https://gist.github.com/4512271
(not sure which version of the router you're using but this is applicable to both)
The solution is to use transactions: instead of calling createRecord() directly on Client, call createRecord() on the transaction, so that the new client record is associated with that transaction. Then all you need to do is call transaction.rollback() in exit -- you don't even need to call isNew on anything; if the client record was saved, it obviously won't be rolled back.
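A minimal sketch of that idea, assuming the pre-1.0 Ember Data transaction API (App.Client and these route hooks are illustrative, not taken from the gist):
enter: function(router) {
  // create the record inside a transaction instead of directly on the store
  this.tx = router.get('store').transaction();
  this.newClient = this.tx.createRecord(App.Client, {});
},

exit: function(router) {
  // throws away the record if it was never committed; does nothing if it was saved
  this.tx.rollback();
}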
This is also a useful pattern for editing records: first, create a transaction in the enter state and add the record to it, e.g.
enter: function(router, client) {
  this.tx = router.get("store").transaction();
  this.tx.add(client);
},
then the same sort of thing on the exit state:
exit: function(router, client) {
  this.tx.rollback();
},
This way, if the user completes the form and submits to the server, rollback will correctly/conveniently do nothing. And if the user edits some of the form fields but then backs out halfway through, your exit callback will revert the unsaved changes, so you don't end up with some dirty zombie client popping up in your clientIndex route's display with its unsaved changes.
Not 100% sure, but could you try to set the content of ClientsController with:
Client.filter(function(client) {
  return !client.get('isNew');
});
EDIT: In order to make this work, you have to first load the store with Client.find().