Django: add an exemption to the Same Origin Policy (only for one site)

I am getting errors thrown from googleads.g.doubleclick.net when I try to load Google ads on my site through plain HTML.
Blocked a frame with origin "https://googleads.g.doubleclick.net" from accessing a frame with origin "https://example.com". Protocols, domains, and ports must match.
Oddly enough, I have a section of my site where I add some ads through JavaScript, and that section does not throw any errors.
I read about adding a crossdomain.xml file to the site root and I tried that (and also tried serving it directly with NGINX), but that does not work either.
Is there any way to add an exception to Django's CSRF rules, or any other way to get around this? It is driving me nuts. This error is only thrown in Safari (I have only tried Safari and Chrome), but it adds a LOT to the data transfer for loading the page and I do not want things to be slowed down.

This has nothing to do with CSRF; rather, it is the browser's same-origin policy security restriction, which you can address by implementing CORS and sending the appropriate headers.
You can use django-cors-headers to help with this.
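If you go the django-cors-headers route, the configuration lives in settings.py and looks roughly like the sketch below. The origin listed is just the one from the error above, and note that older versions of the package call the last setting CORS_ORIGIN_WHITELIST instead of CORS_ALLOWED_ORIGINS.

# settings.py (sketch; assumes django-cors-headers is installed)
INSTALLED_APPS = [
    # ... your existing apps ...
    "corsheaders",
]

MIDDLEWARE = [
    "corsheaders.middleware.CorsMiddleware",      # place as high as possible,
    "django.middleware.common.CommonMiddleware",  # in particular above CommonMiddleware
    # ... the rest of your middleware ...
]

# Only relax the policy for the one extra origin you need
CORS_ALLOWED_ORIGINS = [
    "https://googleads.g.doubleclick.net",
]

Keep in mind that CORS headers govern cross-origin HTTP requests made to your server; they do not, by themselves, give a frame from another origin access to your page or vice versa.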

Related

Do CORS restrictions apply to browser windows as well? HTML editor: 127.0.0.1:5000, image editor: 127.0.0.1:8000. Sending image results back causes a CORS error

I have an app on 127.0.0.1:5000 that edits a page (HTML code).
If I need to edit a picture on that page using a specialized editor, I select the picture and then fire off a call to 127.0.0.1:8000/picture_editor?picture_url=127.0.0.1:5000/static/uploads/picture.jpg
All good so far: I am able to edit the picture, and I have code that should send the results back to the parent window and integrate the changes in the editor.
The problem is that this triggers a CORS (cross-origin resource sharing) security exception and the call does not complete. Here is the error:
svg-editor.html?picture_url=http://127.0.0.1:5000/static/uploads/picture.jpg&width=225&height=276:64 Uncaught DOMException: Blocked a frame with origin "http://localhost:8000" from accessing a cross-origin frame.
What are my options to deal with this? Is there any way to deal with this? This is not really site-to-site CORS but rather the browser not allowing communication between two windows that belong to different sites (although only the port differs).
My app is a Flask application and I already enabled CORS there:
from flask import Flask
from flask_cors import CORS
app = Flask(__name__)
cors = CORS(app, resources={r"*": {"origins": "*"}})
But the browser is still reporting the above error.
Yes, this is exactly what the browser's same-origin policy is about: it does not allow code in a browser window loaded from one site to interact with code in another window that was loaded from another site, and the CORS headers you send from Flask do not lift that restriction on direct window-to-window access.
As far as my problem goes, I found that the editor has an ES6 version that can be loaded without running the Node server (in my case, the server running on port 8000).
Toying with the CORS settings for Flask and Node.js proved insufficient for Flask (the configuration above did not solve my problem) and too difficult for me to do on Node.js, which I know nothing about.

Cookie “PHPSESSID” will be soon treated as cross-site cookie against <file> because the scheme does not match

I've just noticed my console is littered with this warning, appearing for every single linked resource. This includes all referenced CSS files, JavaScript files, SVG images, and even URLs from AJAX calls (which respond with JSON). But not images.
The warning, for example in case of a style.css file, will say:
Cookie “PHPSESSID” will be soon treated as cross-site cookie against “http://localhost/style.css” because the scheme does not match.
But the scheme doesn't match what? The document's? Because that one does match.
The URL of my site is http://localhost/.
The site and its resources are all on http (no https on localhost)
The domain name is definitely not different, because everything is referenced relative to the domain name (meaning the file paths start with a slash, e.g. href="/style.css").
The Network inspector just reports a green 200 OK response, showing everything as normal.
It's only Mozilla Firefox that is complaining about this. Chromium seems to not be concerned by anything. I don't have any browser add-ons. The warnings seem to originate from the browser, and each warning links to view the corresponding file source in Debugger.
Why is this appearing?
Exactly the same thing was happening to me. The issue was that Firefox keeps showing me cookies of different websites hosted on the same URL, "localhost:port number", stored inside the browser's memory.
In my case, I have two projects configured to run at http://localhost:62601. When I run the first project, it saves its cookie in the browser's memory. When I run the second project at the same URL, that cookie is also available inside the second project's console.
What you can do is delete all of the cookies from the browser.
Paramjot Singh's answer is correct and got me most of the way to where I needed to be. I also wasted a lot of time staring at those warnings.
But to clarify a little, you don't have to delete ALL of your cookies to resolve this. In Firefox, you can delete individual site cookies, which will keep your settings on other sites.
To do so, click the hamburger menu in the top right, then Options -> Privacy & Security (or Settings -> Privacy & Security).
From here, scroll down about half-way and find Cookies and Site Data. Don't click Clear Data. Instead, click Manage Data. Then search for the site you are having the notices on, highlight it, and click Remove Selected.
Simple, I know, but I made the mistake of clearing everything the first time - maybe this will prevent someone from doing the same.
The warning is given because, according to MDN web docs:
Standards related to the Cookie SameSite attribute recently changed such that:
The cookie-sending behaviour if SameSite is not specified is SameSite=Lax. Previously the default was that cookies were sent for all requests.
Cookies with SameSite=None must now also specify the Secure attribute (they require a secure context/HTTPS).
This indicates that a secure context (HTTPS) is required in order to allow cross-site cookies by setting SameSite=None; Secure on the cookie.
According to Mozilla, you should explicitly communicate the intended SameSite policy for your cookie (rather than relying on browsers to apply SameSite=Lax automatically), otherwise you might get a warning like this:
Cookie “myCookie” has “SameSite” policy set to “Lax” because it is missing a “SameSite” attribute, and “SameSite=Lax” is the default value for this attribute.
The suggestion to simply delete the localhost cookies does not actually solve the problem. The solution is to properly set the SameSite attribute on the cookies being set by the server, and to use HTTPS if needed.
Firefox is not the only browser making these changes. Apparently the version of Chrome I am using (84.0.4147.125) has already implemented the changes, as I got a similar message in its console.
The previously mentioned MDN article and this article by Mike Conca have great information about changes to SameSite cookie behavior.
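To make that concrete (the question above concerns PHP's session cookie, so treat this as a sketch of the general server-side fix rather than the asker's exact setup): the response ultimately has to send the cookie with both attributes over HTTPS, i.e. something like Set-Cookie: PHPSESSID=...; SameSite=None; Secure. In a Django project, which is the stack used elsewhere on this page, the equivalent settings would be:

# settings.py -- sketch; the string value 'None' requires Django 3.1+ and an HTTPS-served site
SESSION_COOKIE_SAMESITE = "None"  # explicitly allow the session cookie in cross-site contexts
SESSION_COOKIE_SECURE = True      # SameSite=None is only honoured together with Secure
CSRF_COOKIE_SAMESITE = "None"     # same treatment for the CSRF cookie, if it must travel cross-site
CSRF_COOKIE_SECURE = True

PHP exposes the same two knobs for its session cookie through the session.cookie_samesite and session.cookie_secure ini directives (PHP 7.3+).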
I guess you are using WAMP or LAMP etc. The first thing you need to do is enable SSL on WAMP, since you will find many references saying you need to adjust the cookie settings to SameSite=None; Secure, and that entails your local connection being secure. There are instructions at https://articlebin.michaelmilette.com/how-to-add-ssl-https-to-wampserver/ as well as some YouTube videos.
The important thing to note is that when creating the SSL certificate you should use SHA-256 as the signature algorithm, as SHA-1 is now deprecated and will throw another warning.
There is a good explanation of SameSite cookies on https://web.dev/samesite-cookies-explained/
I was struggling with the same issue and solved it by making sure the Apache 2.4 headers module was enabled and then adding one line of configuration:
Header always edit Set-Cookie ^(.*)$ $1;HttpOnly;Secure
I wasted lots of time staring at the same sets of warnings in the Inspector until it dawned on me that the cookies were persisting and needed purging.
Apparently Chrome was going to introduce the new rules by now, but COVID-19 meant a lot of websites might have been broken while people worked from home. The major browsers are working together on the SameSite attribute, so it will be in force soon.

GET request 200 OK but 'failed to load response data' for links

I made a personal website (http://www.soyoungpark.online) using a domain bought from GoDaddy and hosted on AWS S3. I set up everything and thought things were working until I put a simple link to my LinkedIn profile. When I check the network panel, I see that the status code is 200 OK, but the response contains nothing. The code itself doesn't seem to be problematic; it is simply an <a> tag with an href of the desired link. So I am guessing something could be wrong with my AWS S3 settings? Anyone with similar experience?
It's likely that these services include a response header called X-Frame-Options that, for security, prevents their pages from being loaded within another site:
The X-Frame-Options HTTP response header can be used to indicate whether or not a browser should be allowed to render a page in a <frame>, <iframe> or <object> . Sites can use this to avoid clickjacking attacks, by ensuring that their content is not embedded into other sites. Source: X-Frame-Options
This does look to be the case when attempting to view LinkedIn per your example:
Refused to display 'https://www.linkedin.com/in/exampleuser' in a frame because it set 'X-Frame-Options' to 'sameorigin'.
That said, applying a target attribute to each link so that it opens in a new tab or window should allow these outside services to be navigated to.
e.g:
<a href="https://www.linkedin.com/in/exampleuser" target="_blank">

YouTube not able to play on my django-heroku app, giving me a mixed content error message

I tried to view YouTube videos on my app and it didn't work. I checked the console and got this error message:
Mixed Content: The page at 'https://hispanicheights.herokuapp.com/blog/youtube-video/'
was loaded over HTTPS, but requested an insecure script
'http://content.jwplatform.com/libraries/WQWJdvRx.js'.
This request has been blocked; the content must be served over HTTPS.
Is there a way around this or is this just the situation until I get a paid account with a domain?
This has nothing to do with Heroku, paid plans or not. It is simply that you are linking to an http resource inside a page that is served over https; since that potentially sidesteps the man-in-the-middle protection that HTTPS gives you, modern browsers forbid it.
The solution is to serve all your dependent scripts via HTTPS as well: in this case, reference the JW Player library as https://content.jwplatform.com/libraries/WQWJdvRx.js (the same URL with an https scheme).

OAuthException (#368) The action attempted has been deemed abusive or is otherwise disallowed

I'm trying to post to my wall or to the walls of some of my friends using the Graph API. I granted all the permissions this application needs and allow them when I make the request from my page, and I have a valid access token, but this exception still occurs and nothing is posted. My POST request looks fine and the permissions are granted. What do I need to do to show Facebook that I'm not an abusive person? The last thing I did was dig into my application's Auth Dialog to set all the permissions I need there and to explain why I need them.
I would be very grateful if you could tell me what is going on and point me in the right direction to fix this problem.
I had the same problem. I figured out that Facebook was refusing my short links, which makes me a bit mad, but I get the point because it's possible for short links to be used to promote malicious content. So if you have short links as part of your test, replace them with the full URL.
I believe this message is encountered for one of two reasons:
Your post contains malicious links
You are trying to make a POST request over a non-https connection.
The second one is not confirmed, but I have seen that behavior: while the same code in my Heroku-hosted app worked fine, it gave this #368 error on my 000webhost-hosted .tk domain, which wasn't secured by SSL.
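To illustrate the second point, here is a minimal sketch of publishing a post over HTTPS with Python's requests library. The token and message are placeholders, and it assumes an access token that actually carries the publishing permissions the Graph API requires, so treat it as an outline rather than a guaranteed-working call:

import requests

ACCESS_TOKEN = "your-valid-access-token"  # placeholder, not a real token

# Note the https:// scheme; the same call over plain http is what appeared
# to trigger the #368 error in the scenario described above.
response = requests.post(
    "https://graph.facebook.com/me/feed",
    data={
        "message": "Hello from the Graph API",
        "access_token": ACCESS_TOKEN,
    },
)
print(response.status_code, response.json())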
Just in case anyone is still struggling with this: the problem occurs when you put URLs or "action links" that are not in your own app domain. If you really need to post a link to an external page, you'll have to post to your app first, then redirect from there using a script or something. Hope that helps.
Also, it's better in my opinion to use HTTPS links, as I've sometimes seen behaviour where HTTP links would be rejected, but that's intermittent.
I started noticing that recently as well when running my unit tests. One of the tests I run is submitting a link that I know Facebook has blocked to verify that I handle the error correctly. I used to get this error:
Warning: This Message Contains Blocked Content: Some content in this message has been reported as abusive by Facebook...
But starting on July 4th, I started receiving this error instead:
(#368) The action attempted has been deemed abusive or is otherwise disallowed
Both errors indicate that Facebook doesn't like what you're publishing.