Will using canonical AMP make it faster even for desktop visitors?

If you convert your site or web property (app, or whatever you want to call it) to go full AMP, i.e. you no longer keep a canonical version and a separate AMP version, will that site end up on Google's fast CDN, as is the case when you add AMP to your regular website?
I know https://www.ampproject.org/ is canonical AMP, but it doesn't address whether this makes the site get stored in the Google CDN and served from it when visiting from a desktop, or whether it is served from its own host only, as is the case when you use a separate AMP version for mobile.

Related

Allow large file upload from the browser while navigating to another page

I'm building a website with Django 1.11 with a fairly simple JavaScript/HTML/CSS front end (no framework like Vue.js). I have a full page reload on each navigation, which is fine for my use case.
For convenience, I serve my website from App Engine Standard and it's going well so far. Now, I need my users to be able to upload files (up to 300MB in size). Due to App Engine's limitation on request size (32MB), I'm using signed URLs so I can send these files directly from my client's JavaScript to Cloud Storage.
Due to the size of the files, the upload may take some time, but I can't navigate to another page since that may cancel the upload. I understand that for a case like this a client-side single-page app (in Vue.js, for example) would be appropriate, but is there a way to achieve this with my current setup without rewriting my whole website (possibly with Vue.js and a Django REST API)?
Any suggestions would be much appreciated.
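
For reference, generating such a signed URL on the Django side might look roughly like this (a minimal sketch assuming the google-cloud-storage client library; the bucket name and query parameters are hypothetical):

```python
# Django view that hands the browser a signed URL for a direct upload
# to Cloud Storage, bypassing App Engine's 32MB request limit.
from datetime import timedelta

from django.http import JsonResponse
from google.cloud import storage


def get_upload_url(request):
    client = storage.Client()
    bucket = client.bucket("my-upload-bucket")  # hypothetical bucket name
    blob = bucket.blob(request.GET["filename"])

    url = blob.generate_signed_url(
        version="v4",
        expiration=timedelta(minutes=30),
        method="PUT",
        content_type=request.GET.get("content_type", "application/octet-stream"),
    )
    return JsonResponse({"url": url})
```

The browser's JavaScript then PUTs the file to that URL directly.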

How to reduce your browser fingerprint, for privacy and for web scraping

You can disable cookies and change your IP 500 times, but can't anyone just track you through fingerprinting?
You could disable Java and Flash, though that would break the page and make you stand out anyway.
You could use Tor, but I think if you use Tor you get blacklisted from some sites instantly.
What's the workaround? Using Chrome is a big no-no. Internet Explorer maybe, and Firefox perhaps…
Are there any apps that deal with this? Or do you just design a good web scraper, get an IP and cross your fingers?
I realize the average site is not going to implement all these features, but I am asking how one would work around a site that is extremely vigilant.
There are two types of browser fingerprinting:
1. Static fingerprinting - can identify browsers (and probably operating systems) based solely on details of their requests: the order and capitalization of HTTP headers, browser-specific headers, etc.
One small aspect is described here: https://gwillem.gitlab.io/2017/05/02/http-header-order-is-important/
As this can be done without any JavaScript, I guess scrapy is identifiable this way.
How to get around this?
As mentioned in the article above, you need to exactly emulate a particular browser's fingerprint by emulating its headers' order and capitalization (and it has to match the user agent, of course).
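For example, a rough Python sketch of forcing one browser's header order and capitalization (the header values are illustrative, and whether the exact order survives end-to-end depends on the HTTP stack, so verify the result against a request-inspection endpoint):

```python
# Emulate a specific browser's header order and capitalization.
from collections import OrderedDict

import requests

session = requests.Session()
session.headers.clear()  # drop requests' own defaults so only our headers are sent
session.headers.update(OrderedDict([
    # Order and capitalization copied from a real Firefox request (values are examples)
    ("Host", "example.com"),
    ("User-Agent", "Mozilla/5.0 (Windows NT 10.0; rv:60.0) Gecko/20100101 Firefox/60.0"),
    ("Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"),
    ("Accept-Language", "en-US,en;q=0.5"),
    ("Accept-Encoding", "gzip, deflate"),
    ("Connection", "keep-alive"),
    ("Upgrade-Insecure-Requests", "1"),
]))

response = session.get("https://example.com/")
```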
2. Dynamic fingerprinting - uses JavaScript to collect data on installed plugins, plugin versions, etc. As Granitosaurus wrote, that won't be triggered by scrapy, but sites that use fingerprinting for scraping protection will block the scraper if it doesn't send any data from their fingerprinting module.
As this type of fingerprinting yields many more dimensions, it can be used to identify particular users with high reliability (over 90%).
You can find a good example of how this is done here: https://github.com/Valve/fingerprintjs2
How to get around this?
- Use a lot of different real browsers for scraping (for example through Selenium; not PhantomJS, it can be detected).
- Randomize these browsers' settings and installed plugins (ideally using different versions).
- When scraping, rotate these browser instances instead of rotating IPs (each browser instance should keep its IP over its lifetime).
- If one of the instances is "burnt", replace it with a new instance that has a fresh IP and a randomized browser fingerprint.
As you'll need many browsers, this has to be done in an automated way, of course - see the sketch below.
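A condensed sketch of that rotation scheme, using Selenium with Firefox (the proxy pool and user-agent list here are placeholders; a real setup would randomize far more dimensions such as screen size, fonts and plugin versions):

```python
# Keep a pool of real browser instances, each pinned to one proxy IP,
# and rotate the instances rather than the IPs.
import random

from selenium import webdriver

PROXIES = ["10.0.0.1:8080", "10.0.0.2:8080"]  # hypothetical pool, one IP per browser
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; rv:60.0) Gecko/20100101 Firefox/60.0",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.13; rv:60.0) Gecko/20100101 Firefox/60.0",
]

def new_browser(proxy):
    """Launch a Firefox instance with a randomized user agent and a fixed proxy."""
    profile = webdriver.FirefoxProfile()
    profile.set_preference("general.useragent.override", random.choice(USER_AGENTS))
    host, port = proxy.split(":")
    profile.set_preference("network.proxy.type", 1)  # manual proxy configuration
    profile.set_preference("network.proxy.http", host)
    profile.set_preference("network.proxy.http_port", int(port))
    return webdriver.Firefox(firefox_profile=profile)

browsers = [new_browser(p) for p in PROXIES]

def scrape(url):
    driver = random.choice(browsers)  # rotate browser instances, not IPs
    driver.get(url)
    return driver.page_source
```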
Resetting cookies sounds like a good idea at first, but if the fingerprinting system is worth its salt it won't need cookies to identify each of these machines reliably.

Cloning PyQt app in django framework

I've designed a desktop app using the PyQt GUI toolkit and now I need to embed this app in my Django website. Do I need to clone it using Django's own logic, or is there a way to get it onto the website using some interface? I need this to work on my website the same way it works on the desktop. Do I need to find Django packages to remake it for the web, or is there a way to simplify the task?
Please help.
I'm not aware of any libraries to port a PyQt desktop app to a Django webapp. Django certainly does nothing to enable this one way or another. I think you'll find that you have to rewrite it for the web. Django is a great framework, and depending on the complexity of your app, it might not be too difficult. If you haven't done much with web development, there is a lot to learn!
If it seemed like common sense to you that you should be able to run a desktop app as a webapp, consider this:
Almost all web communication that you likely encounter is done via HTTP. HTTP is a protocol for passing data between servers and clients (often, browsers). What this means is that any communication that takes place must be resolved into discrete chunks. Consider an example flow:
1. You go to Google in your browser.
2. Your browser then hits a DNS server (or cache) that resolves the name google.com to some IP address.
3. Cool, now your browser makes a request to that IP address and says "get me some stuff".
4. Google decides to send you back a minimal amount of HTML and lots of minified JavaScript in the page.
5. Your browser realizes that there are some image links in the HTML, so it makes additional requests to Google to get each of the images so that it can display them.
6. Now all the content is loaded in your browser, so it starts to execute the JavaScript code, and that code needs some more data from Google, so it starts sending requests too.
This is just a small example of how fundamentally differently a web application operates from a desktop application. In a desktop app you have the added convenience that an operation doesn't need to be "packaged up" and sent, then have an action taken, and so on (unless you're using a messaging architecture, but that's relatively uncommon outside of enterprise apps).
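To make the contrast concrete, here is a rough sketch (all names hypothetical) of the same "look up a record and show it" action, first as a PyQt slot and then as the stateless Django view it would have to become:

```python
# Desktop: the handler runs in-process and updates the widget directly.
def on_search_clicked(self):
    record = self.database.find(self.search_box.text())
    self.result_label.setText(record.summary)


# Web: each interaction becomes a discrete HTTP request/response pair.
from django.http import JsonResponse

def search_view(request):
    record = find_record(request.GET["q"])  # hypothetical lookup helper
    return JsonResponse({"summary": record.summary})
```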

REST Server, Delphi and Web Services - Advice needed

I am looking for advice on how best to approach a new project I need to develop. From the outset I must add that I have zero experience with web development on any level.
What I need to do is provide a web interface through the browser which will communicate with a server back end. The data retrieved will be sourced either from a DB or from another source: an external device which the server itself will communicate with via IP. The data retrieved from the external device will always be a string of some length n (non-Unicode), and the DB data will mostly be strings and numbers with the odd blob thrown in (storing a picture). The communication will always go from the client (web browser) to the server; I don't believe the server would need to initiate the communication.
I have Delphi XE, so I started looking at using a REST server for communication, and that seems OK. However, from what I can see, I need to create HTML web pages to "render" the data in the web browser. Is that true? Can I use the IntraWeb (IW) components with a REST server? If so, I'm not sure how to get the data to/from the browser UI. Am I better off investigating Ruby on Rails, perhaps? From what I read on a different thread here, it's based on MVC and some other areas which I feel would fit, design-wise, how I would create the application (I was planning on basing the app on the MVP or a similar design pattern).
I think REST makes the most sense, so if the IW components can't be used, are there any third-party products I can use which would let me design "pretty" HTML UIs? Given that I don't know JavaScript, would that be a stumbling block with REST too?
Thanks, and hopefully I have provided enough information.
Jason
Will a human being be responsible for typing the data retrieved from your external device into a web page?
If so, and you have no web development experience, IntraWeb is definitely the way to go for Delphi programmers wanting to build a web application without learning new skills. For additional components to create a prettier UI, I suggest TMS Software's IntraWeb Component Pack Pro.
If you don't need a human being to manually type in this data, then you don't need IntraWeb at all. Instead you would write a client application which presumably interrogates your external device for the data and then transmits it to the REST server. Look at the documentation you used to build your REST server; it should have a section on how to build a REST client.
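That client could be as simple as the following sketch (illustrative Python; the device address, port, payload and endpoint are all hypothetical, and you would likely write the same thing in Delphi):

```python
# Read the fixed-format string from the external device over IP
# and push it to the REST server.
import socket

import requests

def read_device(host, port=9000):
    with socket.create_connection((host, port), timeout=10) as conn:
        return conn.recv(4096).decode("latin-1")  # device sends a non-Unicode string

reading = read_device("192.168.1.50")  # hypothetical device address
requests.post("http://myserver/api/readings",  # hypothetical endpoint
              json={"value": reading}, timeout=30)
```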
You can build an ISAPI module with Delphi that does the job, or include an HTTP server right into your executable with Indy, ICS or Synapse.
ISAPI will give you the freedom to choose Apache or IIS and all their power. An embedded HTTP server will give you a nice small application in which you control all aspects of how it works.
Yes, go with REST, as it is simple and clean. All you need is to think through and design the API (the functions your server will support). You can bind the APIs to the URL schema, thus following the REST principle. I would do it simply like this:
1. A client makes a request; you show some form of GUI (load or render an HTML page, possibly with JavaScript).
2. The user takes an action; you call the appropriate API (or the user calls it directly).
3. Show the user the result.
4. Guide the user through a series of API calls until the final result is produced (sketched below).
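Purely as an illustration of binding API functions to a URL schema (sketched in Python/Flask rather than Delphi; the endpoints and helpers are made up):

```python
from flask import Flask, jsonify

app = Flask(__name__)

def read_device(device_id):
    # stand-in for the real IP communication with the external device
    return "EXAMPLE-READING-STRING"

@app.route("/devices/<device_id>/reading")
def get_reading(device_id):
    return jsonify({"device": device_id, "value": read_device(device_id)})

@app.route("/records/<int:record_id>")
def get_record(record_id):
    # would load strings/numbers (and the odd picture blob) from the DB here
    return jsonify({"id": record_id, "name": "example"})

if __name__ == "__main__":
    app.run()
```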
You can use plain HTML and then add JavaScript if needed (jQuery), or you can use Ext JS from Sencha, which makes building a nice GUI a lot easier and is very well structured.
I would not use any "WYSIWYG" web tools. Plain old HTML written by your favorite editor is still the king in my opinion.

Can two different browsers share one cookie?

My requirement is pretty interesting: I want to maintain one cookie between two different browsers for the same domain.
So let's say I create a cookie named "mydata" with the value "hiscal" from IE; if I then browse the same website from Firefox and try to read the cookie "mydata", the system should give me the value "hiscal".
But this does not happen in the general case,
so can anyone tell me how I can share a cookie between two different browsers (clients) for the same domain?
Thanks,
Hiscal
You can build a cookie proxy by creating a Flash application and using Shared Objects (SO = Flash cookies) to store data.
Any browser with Flash installed could retrieve the information stored in the SO.
But, it's an ugly workaround.
Just don't share cookies... and find another way to build your website/app.
Every browser maintains its own cookies, so in general, no, this is not possible.
With a lot of hard work you could, in theory, write an application that sits on the client computer, looks at all the locations where the different browsers store cookies, parses the different cookie formats, synchronises them and then writes them back out.
That would be error-prone and will break as soon as a browser changes how it works with cookies (not to mention that some browsers secure their cookies, so you won't be able to get to them in the first place).
In my opinion, this is not practical and I wouldn't even try.
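To illustrate why: even reading just one browser's store ties you to that browser's current on-disk format. A sketch for Firefox's cookies.sqlite (the schema and file location vary by browser and version, which is exactly the maintenance problem described above):

```python
# Read cookies for one domain from a Firefox profile's cookies.sqlite.
import sqlite3

def firefox_cookies(profile_path, domain):
    db = sqlite3.connect(profile_path + "/cookies.sqlite")
    rows = db.execute(
        "SELECT name, value FROM moz_cookies WHERE host LIKE ?",
        ("%" + domain,),
    )
    return dict(rows.fetchall())
```

Every other browser needs its own format-specific reader, and a writer on top of that.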
Use YUI's storage utility and force it to use the SWF storage engine.
All computers and browsers would still have to have Flash installed, but you wouldn't have to write your own Flash app; you would benefit from using the one maintained by the YUI team.
As others have said, this is not very portable, but in a controlled environment, it might work for you.
Cookies can be shared with other data storage through browser extensions. Maybe in Flash or Google Gears you can maintain a shared DB between browsers, but it needs to be installed in both of them, of course.
Edit:
In Google Gears you can't. Maybe you should write a self-made extension... or use some user-login system where the data sits on the server.
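A minimal sketch of that last option in Django (the profile model and field are hypothetical): key the data to the logged-in user on the server, and any browser will see the same value after login.

```python
# Store "mydata" against the user account instead of in a browser cookie.
from django.contrib.auth.decorators import login_required
from django.http import HttpResponse

@login_required
def get_mydata(request):
    # request.user.profile assumes a one-to-one Profile model (hypothetical)
    return HttpResponse(request.user.profile.mydata)

@login_required
def set_mydata(request):
    request.user.profile.mydata = request.POST["value"]
    request.user.profile.save()
    return HttpResponse("ok")
```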