I was wondering if anyone has any links on how to implement Akamai's EdgeScape solution to get the zip code? I tried scouring the web for some sort of documentation from Akamai, but couldn't find any docs online, so I thought I would ask here before contacting them.
If you have an Akamai account and access to the control panel (https://control.akamai.com/), here is a document where you will find the information you need: https://control.akamai.com/dl/customers/ESCAPE/EdgeScape_users_guide.pdf
This sounds like an apples-and-oranges question. If you're using a CDN, by design a percentage of requests that would normally be directed at your web server will be offloaded by the CDN. For the requests that do make it through, Akamai can be configured to provide the "True IP" of the client if you prefer.
As of 04/12 this is configured by adding the optional "Edge Services General" feature to your config, then enabling the "True Client IP Header".
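For what it's worth, here is a minimal PHP sketch of reading those values once the headers are enabled. This is not Akamai documentation: the exact header names depend on your configuration, and the X-Akamai-Edgescape name and key=value format below are assumptions based on a typical setup.

<?php
// "True-Client-IP" is the conventional name for the header enabled above.
$clientIp = $_SERVER['HTTP_TRUE_CLIENT_IP'] ?? $_SERVER['REMOTE_ADDR'];

// EdgeScape data (including zip) is typically delivered in a separate header
// whose name and format you configure; both are assumed here.
$geo = [];
foreach (explode(',', $_SERVER['HTTP_X_AKAMAI_EDGESCAPE'] ?? '') as $pair) {
    if (strpos($pair, '=') !== false) {
        [$key, $value] = explode('=', $pair, 2);
        $geo[trim($key)] = trim($value);
    }
}
$zip = $geo['zip'] ?? null;   // e.g. "94107" when EdgeScape can resolve it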
As a bonus, if you're a Rails shop I'd suggest changing the name of the header to "Client-IP". If you do so, Rails will automatically use this header to determine the real IP of the user. This works as of 3.2.x, as documented in ActionDispatch::RemoteIp.
Note: Rails prepends HTTP_ to the header name :)
My domain, fishercoder.com, is registered with AWS Route 53.
Now I'd like to configure Google My Business to use this domain.
I searched Google's documentation and found that they offer clear instructions on how to purchase a new domain through them; for third-party domains they list instructions for GoDaddy, eNom, and Network Solutions, but none for AWS Route 53.
I thought it might be similar, so I tried to replicate those steps in the AWS Route 53 console, but had no luck.
Could anyone share any ideas on how to achieve this?
More details:
Right now, when people search "fisher coder", this page shows up: https://ibb.co/pRWjRc9. If they click Website, it takes them to the default Google My Business website, which is not what I want; I'd like it to point to my own domain, fishercoder.com.
Thanks!
You can do it, but it's not a pretty solution. Not only that, but a Google My Business site (I assume this is what you mean) is so basic that it's not a good website replacement at all. It's worth setting up because it's free, but other than that it's meant to keep people in Google, not to help you. You can only map a custom domain bought through the option Google gives you in the Business Site settings.
Here’s how you do it:
Buy a domain wherever you prefer (I like Namecheap but Google Domains is also a good option).
Forward the domain to the Google Sites URL (many registrars will allow you to do this for free).
That’s it!
It's neither pretty nor ideal, because the URL visitors see will still be the original Google Site URL, and they won't stay on your custom domain at all.
So, simple description: if someone types in http://customdomain.com they will get forwarded to your Google URL and remain on that URL. It essentially just forwards to your Google Site, that’s it.
AWS Route 53 also gives you an option to forward a domain: https://aws.amazon.com/premiumsupport/knowledge-center/redirect-domain-route-53/
All of this is based on my own experimentation and on the link below.
Reference: https://www.quora.com/How-can-I-attach-a-custom-domain-to-a-Google-Sites-website
We have been working on a gaming website. Recently, while reviewing our major traffic sources, I noticed a website that is a carbon copy of ours. It uses our logo and everything else is identical, just under a different domain name. It seems that domain name is pointing at our domain, because in several places the links look like ccwebsite/our-links; that website even links to some of our images as ccwebsite/our-images.
What has happened? How could they have done that? What can I do to stop this?
There are a number of things they might have done to copy your site, including but not limited to:
Using a tool to scrape a complete copy of your site and place it on their server
Pointing their DNS name at your server
Manually re-creating your site as their own
Responding to requests to their site by scraping yours in real time and returning that as the response
etc.
What can I do to stop this?
Not a whole lot. You can try to prevent direct linking to your content by requiring a referrer header for your images and other resources, so that requests have to come from pages you serve, but (1) those headers can be faked and (2) not all browsers send them, so you'd break a small percentage of legitimate users. This also won't stop anybody from copying content, just from "deep linking" to it.
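If you want to try the referrer check anyway, here is a minimal PHP sketch; the host name, script layout, and image directory are placeholders, not anything from your setup.

<?php
// Minimal referrer gate, assuming images are served through this script and
// that www.example.com stands in for your real host name.
$referer = $_SERVER['HTTP_REFERER'] ?? '';
$host    = parse_url($referer, PHP_URL_HOST);

if ($host !== 'www.example.com') {
    // Missing or foreign referrer: refuse. Note this also blocks the small
    // share of legitimate users whose browsers strip the Referer header.
    http_response_code(403);
    exit;
}

$file = basename($_GET['file'] ?? '');
$path = __DIR__ . '/images/' . $file;
if ($file === '' || !is_file($path)) {
    http_response_code(404);
    exit;
}
header('Content-Type: image/png');  // adjust per file type in real use
readfile($path);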
Ultimately, by having a website you are exposing that information to the internet. On a technical level anybody can get that information. If some information should be private you can secure that information behind a login or other authorization measures. But if the information is publicly available then anybody can copy it.
"Stopping this" is more of a legal/jurisdictional/interpersonal concern than a technical one I'm afraid. And Stack Overflow isn't in a position to offer that sort of advice.
You could run your site with some lightweight authentication. Just issue a cookie passively when they pull a page, and require the cookie to get access to resources. If a user visits your site and then the parallel site, they'll still be able to get in, but if a user only knows about the parallel site and has never visited the real site, they will just see a crap ton of broken links and images. This could be enough to discourage your doppelganger from keeping his site up.
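A rough PHP sketch of that idea, with the cookie name and lifetime chosen arbitrarily:

<?php
// --- on every normal HTML page: passively hand out a cookie ---
if (!isset($_COOKIE['site_visit'])) {
    setcookie('site_visit', '1', time() + 86400, '/');
}

// --- in front of images and other resources (separate script): require it ---
// Visitors who only ever see the copycat site never receive the cookie,
// so these requests fail and the clone fills up with broken assets.
if (!isset($_COOKIE['site_visit'])) {
    http_response_code(403);
    exit;
}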
Another (similar but more complex) option is to implement a CSRF mitigation. Even though this isn't a CSRF situation, the same mitigation will work. Essentially you'd issue a cookie as described above, but in addition insert the cookie value in the URLs for everything and require them to match. This requires a bit more work (you'll need a filter or module inserted into the pipeline) but will keep out everybody except your own users.
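A sketch of that variant in PHP; the parameter name t and the session key are illustrative, not part of any standard API.

<?php
session_start();

// Issue a per-visitor token alongside the session cookie.
if (empty($_SESSION['link_token'])) {
    $_SESSION['link_token'] = bin2hex(random_bytes(16));
}
$token = $_SESSION['link_token'];

// In a resource handler, require the URL token to match the cookie-backed one.
// A clone serving stale HTML carries tokens that never match its visitors' sessions.
$t = $_GET['t'] ?? '';
if (!is_string($t) || $t === '' || !hash_equals($token, $t)) {
    http_response_code(403);
    exit;
}

// In your own page templates, append the token to every resource URL, for example
// '<img src="image.php?file=logo.png&t=' . htmlspecialchars($token) . '">'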
The content of my site depends on cookies in the request, and when Google's crawler visits my site it doesn't index much content, because it doesn't have the specific cookies in its requests.
Is it possible to set up some rule so that when the crawler is crawling my site it uses the specific cookies?
Googlebot does not honor cookies on purpose: it has to "see" what anybody else will see on your website, the "lowest common denominator" if you will; otherwise search results would be meaningless to an unknown number of searchers.
Please google for "Googlebot cookies" to get pointed to discussions and documentation about search engines, how they work, and why they work the way they do; one solution to your problem might be to implement a "first click free" rule.
Yes, the Google crawler has the word "Googlebot" in its User-Agent request header. Simply check for that, but be warned that people can spoof this to get access to your site's content as well. As curiousguys stated in the comments, this is generally looked down upon and probably against Google's TOS.
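A quick PHP sketch of the User-Agent check, plus the stricter reverse-DNS verification Google suggests for telling a real Googlebot from a spoofed one. The variable names are illustrative.

<?php
$ua = $_SERVER['HTTP_USER_AGENT'] ?? '';
$claimsGooglebot = stripos($ua, 'Googlebot') !== false;

$verified = false;
if ($claimsGooglebot) {
    // Reverse-then-forward DNS check: the host must be under googlebot.com
    // or google.com and must resolve back to the same IP address.
    $ip   = $_SERVER['REMOTE_ADDR'];
    $host = gethostbyaddr($ip);
    $verified = $host !== false
        && preg_match('/\.(googlebot|google)\.com$/', $host)
        && gethostbyname($host) === $ip;
}

// $verified can then relax the cookie requirement for the crawler, keeping in
// mind that serving the bot different content may violate Google's guidelines.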
I am working on a website that generates traffic for partner sites. When a partner site's logo is clicked on our site, we open the partner site in a page that contains our basic header with the partner site inside an iframe. Earlier we simply opened the partner site in a new window. All cool so far.
Most partner sites use Google Analytics to track the traffic that we send them, and soon after we started opening sites within an iframe, our partners reported that Google Analytics no longer tracks the data (or tracks only a fraction of it).
I have done my fair share of homework/research across the googleverse and found the known issue with Google Analytics (and cookies in general) across domains and iframes.
I am trying to resolve this issue and the only solution that has been referenced is the use of P3P headers.
First, where do the P3P headers go? In my site's pages or the partner sites' pages? Since we have many partner sites (big and small), it won't be practical if the solution is to put tags in each of these sites. I can easily have them added to the page that contains the iframe.
Among the various P3P header generators, is there a reliable one that you recommend?
Is there any way around this issue? I really need to open the sites in iframes and obviously the partner sites really need to track the traffic.
Thank you for the help.
Unfortunately, both you and the partner site need to set the headers.
Alternatives:
If you do not want the partner site to set headers, one option is to lower the security level (in IE) or grant access to 3rd party cookies (in FF) in the browser settings. Every client has to do this, so this may not be an attractive solution.
Use localStorage (an HTML5 feature; browsers that support localStorage allow both the site and the iframe's content to access what is stored there). This may not be feasible in the short term, as it requires both you and your partner sites to implement saving/reading information to/from localStorage, and not every browser supports it (older IE versions especially).
To add a basic policy header (ideally you should generate your own policy, which is straightforward; see item 2 below):
In PHP, add this line:
<?php header('P3P: CP="CAO PSA OUR"'); ?>
In ASP.NET:
HttpContext.Current.Response.AddHeader("p3p", "CP=\"CAO PSA OUR\"");
In HTML pages:
<meta http-equiv="P3P" content='CP="CAO PSA OUR"'>
Regarding your other concerns:
1) P3P headers refer to the HTTP header that delivers something called a compact policy to the browser. Without such a policy in place, IE (most notably) and other browsers will block access to third-party cookies (a term that covers the iframe's cookies) to protect users' privacy.
As far as Google Analytics goes, both your site and the partner site still need to configure cross-domain tracking as outlined in their documentation.
2) You can use this basic policy header (which is enough to fix iFrame's cookies):
P3P: CP="CAO PSA OUR"
or generate your own. If you're not sure what those terms mean, see this.
To generate such a policy, you can use online editors such as p3pedit.com or IBM's tool, which present a set of questions and let you supply answers. This makes it easy to quickly generate a policy. You can generate the policy XML, the compact policy, and more.
3) You can try the two alternatives mentioned above.
Steps to add the policy to your entire site
Generate a compact policy (using one of the tools mentioned earlier) or use the basic policy
In IIS, right-click the desired page, directory, or site, and then click Properties.
On the HTTP Headers tab, click Add.
In the Custom Header Name field, type P3P.
In the Custom Header Value field, enter your Compact P3P Policy (or the basic one from above) and then click OK.
In Apache, a mod_headers line like this will do:
Header append P3P "CP=\"CAO PSA OUR\""
Hope this helps.
We have several websites on different domains and I'd like to be able to track users' movements on these sites.
Obviously cookies are not feasible, because they don't cross domain borders.
I could look at a combination of IP address and User Agent, but there are some cases where that does not work.
I don't want to use flash or other plugins.
Any ideas? Or am I doomed to rely on the IP/User_Agent combination?
You can designate one domain or subdomain for tracking and have it serve a 1x1 pixel image which you include in all pages you would like to track. Serve a cookie with the image, look at the tracking domain's server logs, voilà.
This solution requires no JavaScript; note, though, that the cookie set by the tracking domain is a third-party cookie on the pages that embed the pixel, so browsers that block third-party cookies won't send it.
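A minimal PHP sketch of such a pixel endpoint on the tracking domain; the file name, cookie name, and logging target are placeholders.

<?php
// pixel.php on the dedicated tracking domain, e.g. track.example.com.
// Each tracked page embeds: <img src="https://track.example.com/pixel.php" width="1" height="1">
if (empty($_COOKIE['track_id'])) {
    $trackId = bin2hex(random_bytes(16));
    setcookie('track_id', $trackId, time() + 31536000, '/');
} else {
    $trackId = $_COOKIE['track_id'];
}

// The Referer header tells you which page embedded the pixel; log it however you like.
error_log(sprintf('%s %s %s', date('c'), $trackId, $_SERVER['HTTP_REFERER'] ?? '-'));

// Return a 1x1 transparent GIF and keep it out of caches.
header('Content-Type: image/gif');
header('Cache-Control: no-store');
echo base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');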
First, let's make sure the user agent is sending cookies:
If getCookie("c") == null then setCookie("c", "anyValue")
Then let the request finish (aka wait for next request)
Let's call our tracker cookie uaid.
If GET http://child.com/any-page and getCookie("c") is not null and getCookie("uaid") is null...
Redirect to http://parent.com/give-me-a-uaid?returnTo=http://child.com/any-page
On http://parent.com/give-me-a-uaid, check for cookie uaid
If not exists, create it and add it to response. If it exists, get its value.
Redirect to http://child.com/any-page?uaid=valueOfParentsUAIDCookie
Child.com sets cookie uaid with valueOfParentsUAIDCookie
Redirect to http://child.com/any-page
And of course, you are validating input, and white-listing your redirect URLs :)
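A PHP sketch of the parent-domain step under the flow above; parent.com, child.com, the script name, and the cookie name are all placeholders taken from that description.

<?php
// give-me-a-uaid.php on parent.com
$returnTo = $_GET['returnTo'] ?? '';

// Whitelist the redirect target, as stressed above.
$host = parse_url($returnTo, PHP_URL_HOST);
if (!in_array($host, ['child.com', 'www.child.com'], true)) {
    http_response_code(400);
    exit('Invalid returnTo');
}

// Reuse the parent's uaid cookie if present, otherwise mint a new one.
$uaid = $_COOKIE['uaid'] ?? bin2hex(random_bytes(16));
setcookie('uaid', $uaid, time() + 31536000, '/');

// Send the visitor back to the child page with the uaid attached; the child
// then stores it in its own first-party cookie and strips it from the URL.
$sep = (strpos($returnTo, '?') === false) ? '?' : '&';
header('Location: ' . $returnTo . $sep . 'uaid=' . urlencode($uaid), true, 302);
exit;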
This question is closely related to the question Accessing Domain Cookies within an iFrame on Internet Explorer.
For Internet Explorer I need to take P3P Policies into account and set an additional P3P HTTP-Header to allow images to set cookies across domain borders. Then I can use simon's suggestion.
You can follow the same concept used by Google Analytics: injecting JavaScript into the pages you want to track.
You do not give any context for your situation, just the basic problem, so it is difficult to give an answer that clearly fits. However, here are some techniques/mechanisms for passing information from one page to another, regardless of what domain is involved.
include a hyperlink to a 1x1 transparent GIF image (sometimes called a "beacon")
rely on referrer information in HTTP request headers to identify the page the hyperlink is on
include extra parameters in hyperlinks to the other site, assuming you run both sites (see the sketch at the end of this answer)
buy the services of a company like Akamai to do user tracking for you
possibly use a cross-domain cookie mechanism in the future, if a standard is ever approved
Which technique to use really comes down to whether you can place software on all of the sites (servers) of interest that the user will visit, or whether you cannot place your software on all of them.
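As a concrete example of the third item (extra parameters in hyperlinks), here is a PHP sketch assuming both sites are yours; the vid parameter name and site-b.example.com host are arbitrary.

<?php
// --- on site A: tag outbound links to site B with the visitor's ID ---
session_start();
$_SESSION['vid'] = $_SESSION['vid'] ?? bin2hex(random_bytes(8));
$link = 'https://site-b.example.com/landing?vid=' . urlencode($_SESSION['vid']);
echo '<a href="' . htmlspecialchars($link) . '">Visit our other site</a>';

// --- on site B: pick the ID up and store it in a first-party cookie ---
if (isset($_GET['vid']) && is_string($_GET['vid'])) {
    setcookie('vid', $_GET['vid'], time() + 31536000, '/');
}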