Sorry if this is too basic a question for this forum, but it's been playing on my mind for a while, so I tried it out. I used the IMDb App ID with a spoofed URL, image, etc., and it came back with an error message saying I wasn't allowed to do that. Good. I tried the same thing with my own App ID and it sailed straight through without a hitch! The spoofed wall post appeared as if it came from my app! It could have been absolutely anything! Porn, a phishing attack, you name it!
So my question is: what have I missed? How come only IMDb is allowed to use their App ID, but any Tom, Dick or Harry can use mine?!
There are two things you can do to secure your apps against this, if you're worried.
Turn on the "Stream post URL security" setting in your app's developer settings (under Advanced > Migrations). This will prevent stream posts made with your App ID from linking to anything other than your app's Connect or Canvas URLs.
Edit the Server Whitelist (under Advanced > Security settings) to include only your app's server IP addresses. Then only API requests coming from those IPs will be accepted.
Related
We have been working on a gaming website. Recently, while making note of our major traffic sources, I noticed a website that turned out to be a carbon copy of ours. It uses our logo; everything is the same as ours, but with a different domain name. And it can't simply be that their domain name points at ours, because in several places the links look like ccwebsite/our-links, and that website even links to some of our images as ccwebsite/our-images.
What has happened? How could they have done that? What can I do to stop it?
There are a number of things they might have done to copy your site, including but not limited to:
Using a tool to scrape a complete copy of your site and placing it on their server
Pointing their DNS name at your site
Manually re-creating your site as their own
Responding to requests to their site by scraping yours in real time and returning that as the response
etc.
What can I do to stop this?
Not a whole lot. You can try to prevent direct linking to your content by requiring a Referer header on requests for your images and other resources, so that requests have to come from pages you serve. But 1) that header can be faked, and 2) not all browsers send it, so you'd break a small percentage of legitimate users. This also won't stop anybody from copying your content, only from "deep linking" to it.
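For illustration, here's a rough sketch of that Referer check as middleware, assuming a Node/Express + TypeScript stack (the /images path and oursite.example.com domain are made up):

```typescript
import express from "express";

const app = express();

// Reject image requests whose Referer points somewhere other than our own
// pages. Requests with no Referer at all are allowed through, since some
// browsers and extensions strip the header for legitimate users.
app.use("/images", (req, res, next) => {
  const referer = req.get("Referer");
  if (!referer || referer.startsWith("https://oursite.example.com/")) {
    return next();
  }
  res.sendStatus(403);
});

app.use("/images", express.static("images"));
app.listen(3000);
```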
Ultimately, by having a website you are exposing its information to the internet; on a technical level, anybody can get it. If some information should be private, you can put it behind a login or other authorization measures. But if the information is publicly available, then anybody can copy it.
"Stopping this" is more of a legal/jurisdictional/interpersonal concern than a technical one I'm afraid. And Stack Overflow isn't in a position to offer that sort of advice.
You could run your site with some lightweight authentication. Just issue a cookie passively when they pull a page, and require the cookie to get access to resources. If a user visits your site and then the parallel site, they'll still be able to get in, but if a user only knows about the parallel site and has never visited the real site, they will just see a crap ton of broken links and images. This could be enough to discourage your doppelganger from keeping his site up.
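A minimal sketch of that passive-cookie gate, again assuming Express + TypeScript (the cookie name and paths are invented):

```typescript
import express from "express";
import cookieParser from "cookie-parser";

const app = express();
app.use(cookieParser());

// Pages passively hand out the cookie...
app.get("/", (_req, res) => {
  res.cookie("visited", "1", { httpOnly: true });
  res.send('<img src="/assets/logo.png">');
});

// ...and resources require it. Visitors who only ever see the copycat's
// pages never get the cookie, so they see broken images and links.
app.use("/assets", (req, res, next) => {
  if (req.cookies.visited === "1") return next();
  res.sendStatus(403);
});
app.use("/assets", express.static("assets"));
app.listen(3000);
```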
Another (similar but more complex) option is to implement a CSRF mitigation. Even though this isn't a CSRF situation, the same mitigation will work. Essentially you'd issue a cookie as described above, but in addition insert the cookie value in the URLs for everything and require them to match. This requires a bit more work (you'll need a filter or module inserted into the pipeline) but will keep out everybody except your own users.
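And a sketch of that token-matching variant (names invented; in a real app you'd rewrite every resource URL in the pipeline, not just one):

```typescript
import express from "express";
import cookieParser from "cookie-parser";
import crypto from "crypto";

const app = express();
app.use(cookieParser());

// Issue the token as a cookie AND bake it into every resource URL we emit.
app.get("/", (req, res) => {
  const token = req.cookies.tok ?? crypto.randomBytes(16).toString("hex");
  res.cookie("tok", token, { httpOnly: true });
  res.send(`<img src="/assets/logo.png?tok=${token}">`);
});

// Serve resources only when the URL token matches the cookie. HTML copied
// to another server carries stale tokens that won't match its visitors'
// cookies.
app.use("/assets", (req, res, next) => {
  if (req.query.tok && req.query.tok === req.cookies.tok) return next();
  res.sendStatus(403);
});
app.use("/assets", express.static("assets"));
app.listen(3000);
```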
I would like to track what websites my site's visitors go to after they leave.
Would it be possible to place a cookie on their browser when they visit my site, so that later, if they go to Facebook.com or stackoverflow.com, my cookie will retrieve the browser's URL data and send it back to my server?
I could then look at this data and know that my visitors had gone to Facebook.com and stackoverflow.com after they left my site.
Is this possible using cookies?
Thanks for the help.
No. Cookies are not executed or anything. They are just dumb bits of data.
You would need to be able to execute code on the page they are visiting afterwards.
What I presume you are really asking is how to track your outbound links.
This is mainly done with JavaScript: you need to intercept click events on outbound anchor links and send an event notification as described here, or use the hitCallback method, before completing the redirection to the external website. For Google Analytics, see the documentation. Or you could do it via a custom JS implementation that sends the info back to your server instead.
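For the custom route, a browser-side sketch might look like this (the /track endpoint is hypothetical; navigator.sendBeacon is used so the request survives the page unload):

```typescript
// Intercept clicks on outbound anchors and fire a beacon before the
// browser navigates away.
document.addEventListener("click", (event) => {
  const anchor = (event.target as Element).closest("a");
  if (!anchor || anchor.host === location.host) return; // internal link
  navigator.sendBeacon(
    "/track",
    JSON.stringify({ to: anchor.href, from: location.href })
  );
  // No preventDefault needed: the beacon is queued even as the page unloads.
});
```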
Alternatively, you could replace all outbound links in your HTML source on the server side, pointing every link at your own server first and then redirecting to the external site. But using redirects for this purpose is not really recommended unless you are an ad network or a search engine company that requires such a method.
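If you go the redirect route anyway, the endpoint could look like this sketch (Express + TypeScript; /out and the whitelist are invented, and validating the target keeps the endpoint from becoming an open redirect):

```typescript
import express from "express";

const app = express();
const ALLOWED = ["facebook.com", "stackoverflow.com"]; // example whitelist

// Outbound links in your HTML become /out?url=<encoded target>.
app.get("/out", (req, res) => {
  const target = String(req.query.url ?? "");
  let host: string;
  try {
    host = new URL(target).hostname;
  } catch {
    return res.sendStatus(400); // not a parseable URL
  }
  if (!ALLOWED.some((d) => host === d || host.endsWith("." + d))) {
    return res.sendStatus(400); // not a whitelisted destination
  }
  console.log("outbound click:", target); // your real tracking goes here
  res.redirect(302, target);
});
app.listen(3000);
```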
Lastly, there is the HTML5 'ping' attribute. But as of this writing the feature has either been removed or not yet fully implemented across browsers.
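If you do experiment with it, the markup and a receiving endpoint might look like this sketch (browsers that support ping send a POST; the Ping-From header may be withheld cross-origin):

```typescript
// Markup on your pages:
//   <a href="https://example.com/somewhere" ping="/ping">external link</a>
import express from "express";

const app = express();

// The browser POSTs here when a pinged link is clicked.
app.post("/ping", (req, res) => {
  console.log("clicked:", req.get("Ping-To"), "from:", req.get("Ping-From"));
  res.sendStatus(204); // no response body needed
});
app.listen(3000);
```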
But whichever method you use, you can't track where your visitors go beyond the first level of outbound links from your site.
I've been reading a lot on this topic for some time now, and the information on the web about it is quite confusing.
Use case:
A user visits site hostA.com, which includes a "tracking pixel" from site hostB.com.
The user has never visited hostB.com before. hostB.com sets a cookie with the response.
Which browsers will accept and save the cookie? (I read that Safari does not; is this true, and does it apply only to iframes?)
How do default security settings affect this?
Is it enough to set a "correct" P3P header? (It seems to help with some versions of Internet Explorer.)
Is there a common, accepted solution to this challenge?
Or is the only way to use Flash LSOs?
Thanks
stephan
You can only set cookies for the domain the page came from (your site). Third-party cookies are up to the user to control through their browser settings; I never allow them in the Internet zone, precisely because of the "tracking pixel" scenario you describe.
I am trying to reference my .asmx web services in .NET, but my server is not exposed to the internet. When I enter the following address I get the message below. What's the reason for not being able to see the directory? Am I missing something in my IIS configuration? Am I missing anything in my permissions? For reference, I have other folders with web services and I get the same issue with them.
When I log in to the server I do so with my Windows user and password (I am using Windows authentication). I should mention that when I enter the URL I get a popup asking for my user ID and password, but it doesn't seem to be able to validate them, since it keeps asking a couple of times. Let me know if you need more information to address this issue.
http://appsvr02/Inetpub/wwwroot/DevWebApi/
Internet Explorer cannot display the webpage
Try http://appsvr02/DevWebApi.
You shouldn't need to specify the full path, as IIS serves from wwwroot by default.
I don't know what I did, but I was able to see it this time. Thank you all for your input. This is how the path to my server shows now: http://appsvr02/DevWeApi/Owner.asmx
We have several websites on different domains and I'd like to be able to track users' movements on these sites.
Obviously cookies are not feasible, because they don't cross domain boundaries.
I could look at a combination of IP address and User Agent, but there are some cases where that does not work.
I don't want to use flash or other plugins.
Any ideas? Or am I doomed to rely on the IP/User_Agent combination?
You can designate one domain or subdomain to tracking and have it serve a 1x1 pixel image which you include in all pages you would like to track. Serve a cookie with the image, look at the tracking domain's server logs, voilà.
This solution requires no JavaScript. Note, though, that from the embedding page's point of view the pixel's cookie is a third-party cookie, so users who disable third-party cookies will block it.
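A minimal sketch of such a pixel endpoint, assuming Express + TypeScript and a made-up tracker.example.com domain (the GIF bytes are the standard 1x1 transparent image):

```typescript
import express from "express";
import cookieParser from "cookie-parser";
import crypto from "crypto";

const app = express();
app.use(cookieParser());

// The smallest legal transparent GIF, base64-encoded.
const PIXEL = Buffer.from(
  "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7",
  "base64"
);

app.get("/t.gif", (req, res) => {
  // One ID per browser; the Referer header tells you which page (and
  // therefore which of your domains) embedded the pixel.
  const id = req.cookies.vid ?? crypto.randomUUID();
  res.cookie("vid", id, { maxAge: 2 * 365 * 24 * 3600 * 1000 });
  console.log(new Date().toISOString(), id, req.get("Referer") ?? "-");
  res.type("gif").send(PIXEL);
});
app.listen(80);
```

Each page on every site you want to track then embeds `<img src="http://tracker.example.com/t.gif" width="1" height="1" alt="">`.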
First, let's make sure the user agent is sending cookies at all:
If getCookie("c") == null, then setCookie("c", "anyValue") and let the request finish (i.e., wait for the next request).
Let's call our tracker cookie uaid. Then:
On GET http://child.com/any-page, if getCookie("c") is not null and getCookie("uaid") is null, redirect to http://parent.com/give-me-a-uaid?returnTo=http://child.com/any-page
On http://parent.com/give-me-a-uaid, check for the cookie uaid. If it doesn't exist, create it and add it to the response; if it does, read its value.
Redirect to http://child.com/any-page?uaid=valueOfParentsUAIDCookie
child.com sets its own uaid cookie to valueOfParentsUAIDCookie
Redirect to http://child.com/any-page
And of course, you are validating input and white-listing your redirect URLs :) A concrete sketch of both ends follows below.
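Here is that hedged sketch in Express + TypeScript (domains and cookie names follow the pseudocode above and are illustrative only):

```typescript
import express from "express";
import cookieParser from "cookie-parser";
import crypto from "crypto";

// --- parent.com ---
const parent = express();
parent.use(cookieParser());

parent.get("/give-me-a-uaid", (req, res) => {
  const returnTo = String(req.query.returnTo ?? "");
  // White-list the redirect target, as warned above.
  if (!returnTo.startsWith("http://child.com/")) return res.sendStatus(400);
  const uaid = req.cookies.uaid ?? crypto.randomUUID();
  res.cookie("uaid", uaid, { httpOnly: true });
  res.redirect(`${returnTo}?uaid=${encodeURIComponent(uaid)}`);
});

// --- child.com ---
const child = express();
child.use(cookieParser());

child.get("/any-page", (req, res) => {
  if (typeof req.query.uaid === "string") {
    // Coming back from parent.com: persist the shared ID, clean the URL.
    res.cookie("uaid", req.query.uaid, { httpOnly: true });
    return res.redirect("/any-page");
  }
  if (!req.cookies.c) {
    // First visit: probe whether cookies work at all.
    res.cookie("c", "anyValue");
    return res.send("page content");
  }
  if (!req.cookies.uaid) {
    // Cookies work but no shared ID yet: bounce through parent.com.
    const back = encodeURIComponent("http://child.com/any-page");
    return res.redirect(`http://parent.com/give-me-a-uaid?returnTo=${back}`);
  }
  res.send("page content"); // tracked: req.cookies.uaid identifies the user
});

parent.listen(80);
// child.listen(80); // on a separate host in practice
```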
This question is closely related to the Question Accessing Domain Cookies within an iFrame on Internet Explorer.
For Internet Explorer I need to take P3P policies into account and set an additional P3P HTTP header to allow images to set cookies across domain borders. Then I can use simon's suggestion.
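Concretely, that means sending a P3P compact-policy header alongside the pixel's Set-Cookie, something like this sketch (the CP tokens are illustrative, not a statement of a real privacy policy):

```typescript
import express from "express";

const app = express();
const PIXEL = Buffer.from(
  "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7",
  "base64"
);

app.get("/pixel.gif", (_req, res) => {
  // Older IE versions accept the third-party cookie once a compact
  // policy accompanies it.
  res.set("P3P", 'CP="NOI DSP COR NID CUR ADM"'); // example tokens only
  res.cookie("track", "visitor-id", { httpOnly: true });
  res.type("gif").send(PIXEL);
});
app.listen(80);
```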
You can follow the same concept used by Google Analytics: inject JavaScript into the pages you want to track.
You don't give any context for your situation, just the basic problem, so it's difficult to give an answer that clearly fits. However, here are some techniques/mechanisms for passing information from one page to another, regardless of the domains involved:
include a hyperlink to a 1x1-pixel transparent GIF image (sometimes called a "beacon")
rely on referrer information in HTTP request headers to identify the page a hyperlink is on
include extra parameters in hyperlinks to the other site, assuming you run both sites
buy the services of a company like Akamai to do user tracking for you
possibly use a cross-domain cookie mechanism in the future, if that standard is ever approved
Which technique fits really comes down to whether or not you can place your software on all of the sites (servers) the user will visit that you have an interest in.