I have ~40 achievements in my Facebook application. I'm still in a dev environment performing some tests with achievements, deleting all of them and recreating them in a batch with a different URL but the same content.
But most of the time, it appears that I cannot recreate the deleted achievements with the Graph API until I've re-scraped the URLs with the debugger. For 40 achievements, that takes a lot of time!
I understand that the achievements are cached on Facebook's side, and the debugger allows us to bypass the cache. Is there any automated way to "refresh" the cache for 40 achievements at the same time?
Thank you for your help.
You can run an API call to the Facebook scraper in your code. It won't remove the need to re-scrape, but it will at least automate it for you.
You simply make a call to:
https://developers.facebook.com/tools/lint/?url={YOUR_URL}&format=json
This performs the same action as manually debugging the page, so it will force a rescrape.
Source: https://developers.facebook.com/docs/opengraphprotocol/#edit
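For your 40 achievements, you could automate this with a short script. Here is a minimal sketch, assuming the requests library is available and that ACHIEVEMENT_URLS is a hypothetical placeholder for your own list of achievement URLs:

import urllib.parse
import requests

# Hypothetical placeholder: your ~40 achievement URLs
ACHIEVEMENT_URLS = [
    "http://example.com/achievements/1",
    "http://example.com/achievements/2",
]

for url in ACHIEVEMENT_URLS:
    # Same endpoint as above; hitting it forces Facebook to re-scrape the URL
    lint_url = "https://developers.facebook.com/tools/lint/?url=%s&format=json" % urllib.parse.quote(url, safe="")
    response = requests.get(lint_url)
    print(url, response.status_code)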
When running a query via the Web Console, a notification appears.
However, I feel that it stays on screen too long, and it blocks my view and gets in the way of other tasks.
How do I configure how long notifications in the Web Console remain visible before they automatically disappear?
At the moment, this is not a user-configurable feature, but the project is open to requested changes via GitHub issues.
Several of the GCP accounts I use display a message after logging in:
Refresh the page?
Now that you’ve upgraded, we need to refresh the page so you can take advantage of the new capabilities of your account. Do you want to refresh the page now, or do it yourself later?
Does anyone else see a similar message? I wonder what kind of upgrade it relates to; I don't remember making any changes to the account recently. Hitting "OK, refresh now" doesn't produce any visible changes. It also seems there is no way to make this message disappear: acknowledging or rejecting it will still trigger the popup on the next login.
[popup screenshot]
I guess the developers/engineers the project is shared with are making some changes. In my observation, whenever you make a change in GCP yourself, the page reloads automatically or the change takes effect in the background. But when several people are working on the same project at the same time and one user changes a shared resource, the other users may need to reload the site for the change to take effect.
I would suggest you contact the other developers/engineers you are sharing the project with and check whether they made changes to the project.
Hope this helps. Cheers :)
I have a site where it takes a few seconds to generate a page, due to having a crappy server. People visiting it spam the refresh button. The problem is that the threads that are already generating the page don't stop, so you end up with 2-3 threads generating the same page, with all but one result being discarded. Is there any way to check whether a page load is no longer needed so I can abort it?
Not that I'm aware of. Even if you could detect it from the user's browser, I don't think Apache can be notified, and even if it could, interrupting a running Django thread isn't an easy solution. Have you looked into caching your pages? That should give you the biggest performance improvement, especially if generating pages is your bottleneck.
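As a minimal sketch of what that could look like with Django's per-view cache (slow_page and generate_expensive_page are hypothetical stand-ins for your own view and generation code):

from django.http import HttpResponse
from django.views.decorators.cache import cache_page

@cache_page(60)  # serve the cached response for 60 seconds
def slow_page(request):
    # The expensive generation only runs on a cache miss, so
    # refresh-spamming mostly hits the cache instead of spawning
    # more generation work.
    html = generate_expensive_page()  # hypothetical expensive function
    return HttpResponse(html)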
I'm building a site with Django that lets users move content around between a bunch of photo services. As you can imagine, the application does a lot of API hits.
For example: a user connects Picasa, Flickr, Photobucket, and Facebook to their account. Now we need to pull content from 4 different APIs to keep this user's data up to date.
Right now I have a function that updates each API, and I run them all simultaneously via threading. (All the APIs that are not enabled return False on their second line, so no, it's not much overhead to run them all.)
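Roughly, the fan-out looks like the sketch below (the per-service update functions are hypothetical stand-ins for mine):

import threading

def update_all(user):
    # One update function per connected service; disabled services
    # return early, so running all of them costs little.
    updaters = [update_picasa, update_flickr, update_photobucket, update_facebook]
    threads = [threading.Thread(target=fn, args=(user,)) for fn in updaters]
    for t in threads:
        t.start()
    for t in threads:
        t.join()  # wait until every service has been refreshed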
Here is my question:
What is the best strategy for keeping content up to date using these APIs?
I have two ideas that might work:
Update the APIs periodically (like a cron job), and whatever we have at the time is what the user gets (see the sketch at the end of the question).
benefits:
It's easy and simple to implement.
We'll always have pretty good data when a user loads their first page.
pitfalls:
We have to do API hits all the time for users that are not active, which wastes a lot of bandwidth.
It will probably make the API providers unhappy.
Trigger the updates when the user logs in (on a page load)
benefits:
We save a bunch of bandwidth and run less risk of pissing off the API providers.
It doesn't require NEARLY the same amount of resources on our servers.
pitfalls:
We either have to do the update asynchronously (and won't have anything on first login), or...
The first page will take a very long time to load because we're getting all the API data (I've measured 26 seconds this way).
Edit: the design is very light; it has only two images, an external CSS file, and two external JavaScript files.
Also, the 26-second number comes from the Firebug network monitor running on a machine on the same LAN as the server.
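For reference, the first option would look roughly like a Django management command run from cron; this is only a sketch, with update_all standing in for my threaded per-service updaters and the app name being hypothetical:

# appname/management/commands/refresh_content.py
from django.contrib.auth.models import User
from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = "Refresh API content for every user (run periodically from cron)"

    def handle(self, *args, **options):
        for user in User.objects.all():
            update_all(user)  # the threaded fan-out shown earlier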
Personally, I would opt for the second method you mention. The first time a user logs in, you can query each of the services asynchronously, showing some kind of activity/status bar while the processes are running. You can then populate the page as you get the results back from each of the services.
You can then cache the results of those calls per user so that you don't have to call the APIs each time.
That lightens the load on your servers, loads your page fast, and provides the user with some indication of activity (along with incremental updates to the page as their content loads). I think those add up to the best user experience you can provide.
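A minimal sketch of that approach, assuming Django's cache framework is configured and SERVICE_FETCHERS is a hypothetical mapping from service name to fetch function:

import threading

from django.contrib.auth.signals import user_logged_in
from django.core.cache import cache
from django.dispatch import receiver

@receiver(user_logged_in)
def refresh_apis_on_login(sender, request, user, **kwargs):
    # Kick the refresh off in the background so the login itself stays fast
    t = threading.Thread(target=refresh_user_content, args=(user,))
    t.daemon = True
    t.start()

def refresh_user_content(user):
    for service, fetch in SERVICE_FETCHERS.items():  # hypothetical mapping
        data = fetch(user)
        # Cache per user and service so page views read from the cache
        # instead of hitting the APIs every time
        cache.set("content:%s:%s" % (user.pk, service), data, 60 * 60)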
I want to track and analyze web page load times on my user's systems.
I ran across this article http://www.panalysis.com/tracking-webpage-load-times.php that uses Google Analytics to track pages, but it's too coarse for my needs.
Are there any sites out there that specifically let you track page load times using a JavaScript snippet you embed in your web pages?
Ideally the snippet would look like this:
var startTime = new Date();
// code to load the tracker
window.onload = function() {
    // <customer id> stays a placeholder; document.location.pathname is the page path
    loadTimeTracker.sendData(<customer id>, document.location.pathname, new Date() - startTime);
};
Are you looking to send data back to a server? If not, there are lots of tools to track this sort of thing; I know the Firebug extension for Firefox does.
If you are looking to send the data back to a server, doing it purely with JavaScript will have some drawbacks: it won't include page rendering times, only the time from when the first line of JavaScript was received until the page finished loading, which may skew your data.
Jiffy is pretty nice and open source.
Gomez has a service that tracks how long your website takes to load. It doesn't use any JavaScript as far as I know.
Another good resource is http://webpagetest.org/. It allows you to test the load time manually, but offers a lot of analysis of your page. Latency, time to first byte, assets, DNS lookups, etc. Great resource.
Enable trace on your page; that might be a good start for analyzing how much time your page takes to load.