How to stop a page refresh on an event

I'm thinking about a script which refreshes a (Chrome) browser page every few seconds (there are many add-ons on the internet which do just that), but the script should also stop refreshing the page when the browser senses an incoming call (via WebRTC) and a box to answer the call is displayed in the browser's corner. How do I handle such an event and stop refreshing the page?
Here is a simple JavaScript snippet that refreshes the page every ten minutes:
<script type="text/javascript">
    // Reload the page every 10 minutes (600,000 ms).
    var timeout = setTimeout(function () { location.reload(true); }, 600000);

    // Push the next reload out to 20 minutes (1,200,000 ms).
    function resetTimeout() {
        clearTimeout(timeout);
        timeout = setTimeout(function () { location.reload(true); }, 1200000);
    }
</script>
I'm thinking about a workaround.
How can I do this: if I click a specific button on the page, refresh after 20 minutes instead (the resetTimeout function; let's say calls won't be longer than that), and after those 20 minutes go back to refreshing every 10 minutes?
And how do I incorporate this script into the browser for a particular web page?

OK, I installed Chrome Injector, and for the particular page I need to write a simple script like this:
var timeout = setTimeout("location.reload(true);", 10000);
function resetTimeout() {
    clearTimeout(timeout);
    timeout = setTimeout(10000000);
}
function resetTimeout2() {
    clearTimeout(timeout);
    timeout = setTimeout(10000);
}
$('.button.animated.fadeIn.btn.btn-sm.btn-primary.pull-left.call').click(resetTimeout());
$('.button.animated.fadeIn.btn.btn-sm.btn-danger.pull-left.disabled.hangup').click(resetTimeout2());
The idea: refresh the page every 6000000 ms; if the answer button is pressed, refresh every 10000000 ms instead; if hangup is pressed, go back to refreshing every 6000000 ms. I just need help with the JavaScript syntax.
When I press the .call button, resetTimeout() does not work; the page still refreshes immediately.
button.animated.fadeIn.btn.btn-sm.btn-primary.pull-left.call and the other one are the class attributes of the answer and hangup button elements.
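A minimal sketch of what the corrected injector script might look like, assuming jQuery is already available on the page (the selectors and intervals are copied from the snippet above and may need adjusting). The two fixes are that setTimeout needs a callback, and .click() needs a function reference rather than the result of calling the function:

var timeout = setTimeout(function () { location.reload(true); }, 10000);

// Answer pressed: switch to the long interval so the call isn't interrupted.
function resetTimeout() {
    clearTimeout(timeout);
    timeout = setTimeout(function () { location.reload(true); }, 10000000);
}

// Hangup pressed: go back to the short refresh interval.
function resetTimeout2() {
    clearTimeout(timeout);
    timeout = setTimeout(function () { location.reload(true); }, 10000);
}

// Pass the function references; resetTimeout() with parentheses runs immediately on page load.
$('.button.animated.fadeIn.btn.btn-sm.btn-primary.pull-left.call').click(resetTimeout);
$('.button.animated.fadeIn.btn.btn-sm.btn-danger.pull-left.disabled.hangup').click(resetTimeout2);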

Related

PWA: how to refresh content every time the app is opened

I created a PWA app which sends an API call to my domotic server and prints the response on the home page (e.g. outside temperature and vacuum robot status).
While all the data gets refreshed at the very first app opening, if I minimize the app without completely shutting it off I get no data refresh at all.
I was wondering how to force a refresh every time the app gets re-opened, without having to do it manually (no pull-down to refresh, no refresh button).
I found the solution myself by adding the following code in the service worker:
self.addEventListener('visibilitychange', function() {
    if (document.visibilityState === 'visible') {
        console.log('APP resumed');
        window.location.reload();
    }
});
Here is the solution that works.
You can place this code wherever you have access to the window object:
window.addEventListener("visibilitychange", function () {
    console.log("Visibility changed");
    if (document.visibilityState === "visible") {
        console.log("APP resumed");
        window.location.reload();
    }
});
Keep in mind that a forced reload every time the user swipes between apps may hurt the user experience or cause data loss.
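If a full reload turns out to be too disruptive, a softer variant is to re-fetch only the data when the app becomes visible. This is just a sketch; the /api/status endpoint and the response handling are hypothetical placeholders for whatever call the home page already makes to the domotic server:

document.addEventListener("visibilitychange", function () {
    if (document.visibilityState === "visible") {
        // Re-fetch the data instead of reloading the whole app shell.
        fetch("/api/status")
            .then(function (res) { return res.json(); })
            .then(function (data) {
                // update the temperature / robot status widgets here
            });
    }
});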

Google App Engine - http request/response

I have a Java web app hosted on Google App Engine (GAE). The user clicks a button and gets a data table with 100 rows. At the bottom of the page there is a "Make Web service calls" button. Clicking on that, the application takes one row at a time and makes a third-party web-service call using the URLConnection class. That part is working fine.
However, since there is a 60-second limit on the HttpRequest/Response cycle, not all 100 transactions go through; the timeout happens around row 50 or so.
How do I create a loop and send the web-service calls without the user having to click the "Make Web service calls" button more than once?
Is there a way to stop the loop before 60 seconds and then start again without committing the HttpResponse? (I don't want to use an asynchronous Google backend.)
Also, does GAE support file upload (to get the 100 rows from a file instead of a database)?
Thank you.
Adding some code as per the comments:
URL url = new URL(urlString);
HttpURLConnection connection = (HttpURLConnection) url.openConnection();
connection.setDoOutput(true);
connection.setRequestMethod("POST");
connection.setConnectTimeout(35000);
connection.setRequestProperty("Accept-Language", "en-US,en;q=0.5");
connection.setRequestProperty("Authorization", encodedCredentials);

// Send post request
DataOutputStream wr = new DataOutputStream(connection.getOutputStream());
wr.writeBytes(submitRequest);
It all depends on what happens with the results of these calls.
If results are not returned to a UI, there is no need to block it. You can use Tasks API to create 100 tasks and return a response to a user. This will take a few seconds at most. The additional benefit is that you can make up to 10 calls in parallel by using tasks.
If results have to be returned to a user, you can still use up to 10 threads to process as many requests in parallel as possible. Hopefully this will bring your time under 1 minute, but you cannot guarantee it, since you depend on responses from third-party resources which may be unavailable at the moment. You will have to implement your own retry mechanism.
Also note that users are not accustomed to waiting several minutes for a website to respond. You may consider a different approach, where the user is notified after the last request is processed, without blocking your client code.
And yes, you can load data from files on App Engine.
Try using asynchronous urlfetch calls:
LinkedList<Future<HTTPResponse>> futures = new LinkedList<>();

// Start all the requests
for (URL url : urls) {
    HTTPRequest request = new HTTPRequest(url, HTTPMethod.POST);
    request.setPayload(...);
    futures.add(urlfetchservice.fetchAsync(request));
}

// Collect all the results
for (Future<HTTPResponse> future : futures) {
    HTTPResponse response = future.get();
    // Do something with the response
}

Gracefully terminate a request based service on server

In our web application, each HTTP request triggers a lot of computation on the back end. The time to produce the output can vary from 10 seconds to 1 hour. In the meantime, while it is being computed, "Waiting..." is shown on the website for the respective user.
But it can happen that a user abandons the request partway through. What can be done on the back end so that the computation can be stopped midway to save resources? What tactics can be applied here?
And preferably, instead of killing the thread directly, a graceful termination policy would be ideal.
I'm not sure if this fits your scenario, but here is how I have tackled this issue in the past. We were generating PDF reports for a web app. Most reports could be generated in under 5 seconds, but some would take up to an hour.
When the user clicks the generate button, we redirect them to a "Generating..." dialog screen which has a sort of progress bar and a Cancel button. This also launches the generate process on the server in a separate thread (we have a worker pool). The browser then polls the server regularly via AJAX to check on the progress (either updating the progress bar or redirecting to the display page when finished).
The synchronization on the server between the generating process and the AJAX process was done via a process synchronization object. The sync-obj was a very simple class instance which could be retrieved quickly from any thread at any time via some unique string.
Both processes could update this shared sync-obj. As the report generated, the repgen thread would update the sync-obj, and the AJAX thread would pass the progress on to the browser. If the user clicked the Cancel button, the AJAX thread would set the "cancel" flag in the sync-obj and the repgen thread would pick that up and break out of the generate loop.
Clearly the responsiveness of the whole process depends a lot on how frequently the repgen thread checks the sync-obj and that often comes down to how the individual report was coded.
Finally, to answer your question: if the user gets bored, goes "back" and clicks the generate button again, we do not cancel the first report and start a second, but rather recognise that it is the same report (and the same sync-obj id) and just let it continue. However, if that does not suit your scenario, then starting a new generate process could cancel the first one in the same manner that the user can via the Cancel button.
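For the browser side of the pattern described above, the polling loop might look roughly like this (a sketch only, assuming jQuery; the /report/status and /report/cancel endpoints, the job id, and the response fields are hypothetical placeholders for whatever your server exposes):

var jobId = "abc123";  // hypothetical id of the generate job / sync-obj
var pollTimer = setInterval(function () {
    $.getJSON("/report/status", { id: jobId }, function (status) {
        if (status.finished) {
            clearInterval(pollTimer);
            window.location = status.resultUrl;          // redirect to the finished report
        } else {
            $("#progress").text(status.percent + "%");   // update the progress bar
        }
    });
}, 2000);

// The Cancel button asks the server to set the "cancel" flag in the sync-obj.
$("#cancel").click(function () {
    clearInterval(pollTimer);
    $.post("/report/cancel", { id: jobId });
});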

On server processing

I have a web application that will be doing some processing with submitted data. Instead of making the user wait for what will take at least a few seconds (maybe up to a few minutes during heavy load), I would like to know if there is some way, within ColdFusion, to have the processing just occur on the server.
Basically, the data would be passed to the server, and then the user would be redirected back to the main page to allow them to do other things, but not necessarily be able to see the results right away. Meanwhile, the processing of the data would take place on the server, and be entered into the database when complete.
Is this even possible within ColdFusion, or would I need to look into using code that would receive the data and process it as a separate program?
ColdFusion 8 introduced the cfthread tag which may assist you.
<cfthread
    name="thread name"                             (required)
    action="run"                                   (optional)
    priority="NORMAL|HIGH|LOW"                     (optional)
    zero or more application-specific attributes>
    Thread code
</cfthread>
To do this reliably, you can use a database table as a job queue. When the user submits the data, you insert a record into the database indicating there is some work to be done. Then you create a scheduled task in the CF Administrator that polls a script which gets the next job from the queue and does the processing you describe. When complete, it can update the database record, and you can then alert your user that their job is complete.
Make sense?
Another option that will possibly work for you is to use AJAX to post the data to the server. This is a pretty easy method to use, since you can use pretty much the exact same CF code that you have now and instead only need to modify the form submitting page (and you could even use some unobtrusive javascript techniques to have this degrade gracefully if javascript isn't present).
Here's an example using jQuery and BlockUI that will work for unobtrusively-submitting any form on your page in a background thread:
<script>
$(function () {
    $("form").on("submit", function (e) {
        var f = $(this);
        e.preventDefault();
        $.ajax({
            method: f.attr("method"),
            url: f.attr("action"),
            data: f.serialize(),
            beforeSend: function (jqXHR, settings) {
                f.block({ message: "Loading..." });   // BlockUI: overlay just this form
            },
            complete: function (jqXHR, textStatus) {
                f.unblock();
            },
            success: function (data, textStatus, jqXHR) {
                // do something useful with the response
            },
            error: function (jqXHR, textStatus, errorThrown) {
                // report the error
            }
        });
    });
});
</script>
You should combine all three of these answers to give yourself a complete solution.
Use CF Thread to "kick off" the work.
Add a record to the DB to tell you the process is underway.
Use Ajax to check the DB record to see if the work is complete. When your thread completes, update the record; Ajax finds the work complete and you display some message or indicator on the user's screen so they can go on to step 2 or whatever.
So each of these answers holds a clue to a complete solution.
Not sure if this should be an answer or a comment (since I'm not adding anything new here).
We use a CF event gateway for this. The user submits a file via a web form and the event gateway monitors that upload directory. Based on the file name, the gateway knows how it should process the file into the database. This way the only real delay the user faces is the time for the file to actually transfer from their machine up to the server. We, however, have no need to inform the user of any status related to the process, though I could easily see how to work that in if we did.

Can we renew a session in ColdFusion?

I am storing 5-6 variable values in my session. Any suggestions on how I can renew my session struct when it's about to expire? I am using ColdFusion 8.
Thanks!!
Use AJAX to ping the server to keep the session alive
Or just extend the session timeout timespan.
Any call to a CFM page from that session will cause the session to be extended. What I have seen done is a JS timer that runs and ends shortly before the session expires. When the timer runs out, it triggers a popup that loads a non-CFM page (basic HTML) with a message that the session is ending soon, asking the user if they'd like to continue it.
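A rough sketch of that timer, assuming a 30-minute session timeout and a hypothetical session-warning.html page (both are placeholder values, not taken from the question):

// Warn the user 5 minutes before a 30-minute session would expire.
setTimeout(function () {
    // Load a plain HTML page (not a .cfm page, so the session itself is not touched).
    window.open("session-warning.html", "sessionWarning", "width=400,height=200");
}, 25 * 60 * 1000);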
Here's an idea of what the Ajax to automatically ping the server could look like (as suggested in Henry's answer):
// This function keeps the session alive as long as the page is still open.
// Whenever the session nears expiration, it automatically extends it.
function autoExtendSession(days, hours, mins, secs) {
    var milliseconds = (days * 86400000) + (hours * 3600000) + (mins * 60000) + (secs * 1000);
    // Once we are 98% of the way to the timeout, the heartbeat is sent.
    // This way the timeout should never actually be reached.
    setTimeout(function () {
        $.post("server/heartbeat.cfm");
        console.log("Heartbeat sent to the server.");
        // Start another timer.
        autoExtendSession(days, hours, mins, secs);
    }, milliseconds - (milliseconds * 0.02));
}
The heartbeat.cfm page doesn't actually have to contain anything; the server will renew the session when the $.post hits it, whether or not it has content.
The exact way of doing what you are asking for is to push the session data into the database when onSessionEnd fires and restore it on the next onSessionStart. To know which data entries to read, you can put a cookie in the user's browser with a unique identifier (for example, a salted and encrypted id of that entry), "Remember me" style.
You could try setting your session timeout to something small, say 5min.
Then when someone authenticates, extend the session timeout to something larger, 30min.
And if they sign out, drop it back down.
E.g. configure your CF Administrator with a 5-minute session timeout.
On sign in:
<cfscript>
// extend session timeout to 1800 seconds (30min)
session.SetMaxInactiveInterval( javaCast( 'long', 1800 ) );
</cfscript>
On sign out:
<cfscript>
// shrink session timeout to 300 seconds (5min)
session.SetMaxInactiveInterval( javaCast( 'long', 300 ) );
</cfscript>
The session hangs around for another 5 minutes and then is cleaned up.
Unless you continue using the site, in which case each page request would give you a further 5min.