Functioning domain even when the server is off - web-services

Something might be wrong with my domain name. When I visit the URL, it still functions correctly and shows the default webpage, even though the server has been off for weeks now.

You may have already tried this, but the two things I would check are:
You could be viewing a cached version of the page. This is the most obvious explanation but I'm guessing you cleared your browser cache.
You might be viewing a copy on a backup server. That'll depend on whether your NS records have an alternate server specified that isn't off.
Are you sure your server is off? What response do you get when you ping it via the domain name? This is basic troubleshooting, so you may have already tried these things; if you did, what you encountered may help in diagnosing the cause.
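If it helps, here is a minimal sketch (in Python, using only the standard library) of the kind of check described above: resolve the domain and see whether anything actually answers at that address. The domain and expected IP below are placeholders.

```python
import socket

# Placeholders: substitute your own domain and the IP your server should have.
DOMAIN = "example.com"
EXPECTED_IP = "203.0.113.10"

# 1) What does the domain currently resolve to?
resolved_ip = socket.gethostbyname(DOMAIN)
print(f"{DOMAIN} resolves to {resolved_ip}")
if resolved_ip != EXPECTED_IP:
    print("DNS points somewhere else - possibly a backup or parked server.")

# 2) Does anything answer on port 80 at that address?
try:
    with socket.create_connection((resolved_ip, 80), timeout=5):
        print("Something is listening on port 80 - the page is served from there.")
except OSError:
    print("Nothing answered on port 80; you may be looking at a cached copy.")
```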

Related

Google Ad Manager adverts not showing on specific web page, but correct across rest of site

I've been running Google Ad Manager on a client's website - https://www.thewire.co.uk - since 2012 without issue, until July 2020 when adverts suddenly stopped appearing on the home page, though they still appear on the rest of the site. I did a deep dive into the issue at the time and could find no clear problem or solution. It hasn't been an issue for most of the last year, but now we need to resolve it.
What's odd is that the adverts appear if I visit the same page via a different URL, e.g. https://www.thewire.co.uk/?foo=bar (the query string is ignored by the server) or https://www.thewire.co.uk/home/ (which serves the same page from a different URI).
All the adverts we serve are line items we load into Ad Manager directly - we don't use AdSense. We have some in-house adverts which are set to deliver when specific slots aren't filled, so, for instance, there is a leaderboard at the top of every page which should always have an advert. MPUs in the right-hand column are set to collapse if no advert is picked.
When I check delivery using the Ad Manager debug tool, I am informed the adverts are delivering correctly, and the adverts show. But in normal viewing they do not.
This is happening consistently across browsers, devices, with and without ad blockers installed, and on 'vanilla' systems on new ISP networks across several countries, which I believe rules out any limits that might be created by cookies, IP addresses or location.
I've scoured the Google Ad Manager settings in case some form of serving block was in place, but I've found nothing, and no notifications of any limitations on our home page URL, so I'm completely stumped as to why.
I've checked the Google tags and embed codes on our site and they are all OK - they are pulled from a template which is used consistently across the site, and they obviously work when I look at the same HTML in a different context, which leads me to believe the issue is some block in place for the specific https://www.thewire.co.uk URL on Google's side.
Has anyone had a similar issue and has advice on how to resolve it, or can anyone point me to somewhere at Google where I can get specific support for this issue? Searches through the help pages and support forums have turned up nothing.
There are numerous workarounds I could apply (e.g. an HTTP 301 or 302 redirect from / to /home/), but I really want to solve the problem, not work around it only for it to arise again.
There's a previous topic on Stack Overflow - Google Ads not showing on my home page - but it shows no resolution. So I'm posting this, which may be the same issue, in case my necromancy on that one doesn't work. The solutions linked in the comments on that post are either outdated or don't work.
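For what it's worth, the redirect workaround mentioned above is only a few lines. This is a minimal sketch in Flask, purely for illustration (the actual site presumably runs on a different stack; the /home/ path is taken from the description above):

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical workaround: send requests for the bare home page to /home/,
# which serves the same content but isn't affected by the ad-serving issue.
@app.route("/")
def root():
    # 302 (temporary) rather than 301, so search engines don't permanently
    # re-index the home page under the alternate URL.
    return redirect("/home/", code=302)
```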

A static website on AWS not accessible

Somewhat curious about how to make a website on AWS, yesterday I followed this document:
https://aws.amazon.com/getting-started/projects/host-static-website/?c_1
in order to get started with something simple.
I clicked the button Get Started with the Implementation Guide and found myself here:
https://docs.aws.amazon.com/AmazonS3/latest/dev/website-hosting-custom-domain-walkthrough.html
It went pretty well, except that even today I still can't access the site with the expected URL (http://example.com).
For the sake of simplicity I decided to leave alone http://www.example.com for the time being.
Since Step 2.5: Test Your Endpoint and Redirect could be performed without any problem,
I suspect that something went wrong when performing Step 3: Create and Configure Amazon Route 53 Hosted Zone.
I did not find the explanations in the guide very clear, but I did what made sense to me, based on what I could see on the screen and on my previous experience in similar cases with other providers (other than AWS).
Has anyone tried this before and has something to point out?
For reference, here is the kind of display I see in Google Chrome:
This site can’t be reached
example.info’s server IP address could not be found.
Did you mean http://example.com/?
Search Google for example info
ERR_NAME_NOT_RESOLVED
In case something similar happens to someone else, here is what I did.
I finally solved the problem by following Step 5: Route DNS Traffic for Your Domain to Your Website Bucket
of this document:
https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/getting-started.html#getting-started-create-alias
and creating an Alias record.
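If it helps anyone scripting the same fix, here is a rough sketch of creating that Alias record with boto3. All identifiers are placeholders: your own hosted zone ID, your domain, and the S3 website-endpoint hosted zone ID for your bucket's region (the value shown is the one documented for us-east-1).

```python
import boto3

route53 = boto3.client("route53")

# Placeholders - substitute your own values.
HOSTED_ZONE_ID = "ZXXXXXXXXXXXXX"           # your Route 53 hosted zone
DOMAIN = "example.com."                      # the apex domain, with trailing dot
S3_WEBSITE_ENDPOINT = "s3-website-us-east-1.amazonaws.com"
S3_WEBSITE_ZONE_ID = "Z3AQBSTGFYJSTF"        # S3 website hosted zone for us-east-1

route53.change_resource_record_sets(
    HostedZoneId=HOSTED_ZONE_ID,
    ChangeBatch={
        "Comment": "Alias record pointing the apex domain at the S3 website endpoint",
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": DOMAIN,
                "Type": "A",
                "AliasTarget": {
                    "HostedZoneId": S3_WEBSITE_ZONE_ID,
                    "DNSName": S3_WEBSITE_ENDPOINT,
                    "EvaluateTargetHealth": False,
                },
            },
        }],
    },
)
```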

Trying to understand an apparent web tracking malfunction

I use Safari and Firefox. Since Safari doesn't offer the cookie handling I prefer, I frequently invoke a separate tool to clear out most of the cookies. Neither Kayak nor Google is on my exceptions list. A couple of times recently, I cleaned out the cache, local storage, Flash data, etc. to get rid of the so-called zombie cookies (while no WebKit clients were running). I was in the USA when I did this.
In spite of this, every time I access a Google site it redirects to google.es, and every time I access Kayak.com it redirects to kayak.co.uk.
I understand about browser fingerprinting, but I have occasionally changed a few things that affect that. And even if that weren't enough, there is nothing to hint that they have identified me specifically. I would expect that, in the absence of any completely unique identifier, they would assume the country of the IP address(es), which for over four weeks have been Comcast and AT&T (and various public WiFi sites).
It's not a login issue: I don't have a login for Kayak, and I never log in to Google unless absolutely necessary.
With Kayak, I changed the setting (menu in the lower right corner of the search page) to USA/dollars, but as soon as I go to another site without deleting cookies and then come back, it's on UK/pounds again.
What might be the cause of this? There are a few other sites behaving similarly.
I think I understand it now. Someone correct me if I missed something.
It was not cookies, tracking, or a redirect.
It was the browser "remembering" that I had visited kayak.co.uk and google.es while over there and "fixing" the URL for me.
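One way to confirm that conclusion is to fetch the site outside the browser and see whether the server itself issues any redirect. A rough sketch using Python's requests library (the URL is just an example):

```python
import requests

# Fetch the site outside the browser, so history/autocomplete can't interfere.
resp = requests.get("https://www.google.com/", allow_redirects=True, timeout=10)

# Show any redirects the server itself issued.
for hop in resp.history:
    print(hop.status_code, hop.headers.get("Location"))
print("Final URL:", resp.url)

# If the final URL is still google.com, the server is not redirecting you to
# google.es - the country-specific address is coming from the browser itself.
```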

Parallel website running alongside my original website

We have been working on a gaming website. Recently, while making note of our major traffic sources, I noticed a website that turned out to be a carbon copy of ours. It uses our logo, and everything is the same as ours, but it has a different domain name. It can't simply be that their domain name points to ours, because in several places the links look like ccwebsite/our-links, and that website even links to some images as ccwebsite/our-images.
What has happened? How could they have done that? What can I do to stop this?
There are a number of things they might have done to copy your site, including but not limited to:
Using a tool to scrape a complete copy of your site and placing it on their server
Pointing their DNS name at your site
Manually re-creating your site as their own
Responding to requests to their site by scraping yours in real time and returning that as the response
etc.
What can I do to stop this?
Not a whole lot. You can try to prevent direct linking to your content by requiring referrer headers for your images and other resources, so that requests need to come from pages you serve, but 1) those can be faked, and 2) not all browsers send them, so you'd break a small percentage of legitimate users. This also won't stop anybody from copying content, just from "deep linking" to it.
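As an illustration of that referrer check, here is a sketch in Flask; the hostname and paths are made up, and as noted above this is a deterrent rather than real protection:

```python
from flask import Flask, request, abort, send_from_directory

app = Flask(__name__)
ALLOWED_HOST = "www.example.com"  # hypothetical: your own site's hostname

@app.route("/images/<path:filename>")
def serve_image(filename):
    referrer = request.headers.get("Referer", "")
    # Only serve images when the request claims to come from one of our own
    # pages. An absent Referer is allowed through, since many browsers and
    # privacy tools omit it for legitimate visitors.
    if referrer and ALLOWED_HOST not in referrer:
        abort(403)
    return send_from_directory("static/images", filename)
```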
Ultimately, by having a website you are exposing that information to the internet. On a technical level anybody can get that information. If some information should be private you can secure that information behind a login or other authorization measures. But if the information is publicly available then anybody can copy it.
"Stopping this" is more of a legal/jurisdictional/interpersonal concern than a technical one I'm afraid. And Stack Overflow isn't in a position to offer that sort of advice.
You could run your site with some lightweight authentication. Just issue a cookie passively when they pull a page, and require the cookie to get access to resources. If a user visits your site and then the parallel site, they'll still be able to get in, but if a user only knows about the parallel site and has never visited the real site, they will just see a crap ton of broken links and images. This could be enough to discourage your doppelganger from keeping his site up.
Another (similar but more complex) option is to implement a CSRF mitigation. Even though this isn't a CSRF situation, the same mitigation will work. Essentially you'd issue a cookie as described above, but in addition insert the cookie value in the URLs for everything and require them to match. This requires a bit more work (you'll need a filter or module inserted into the pipeline) but will keep out everybody except your own users.
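Here is a rough sketch of that cookie-plus-token idea, again in Flask with made-up names. A token is issued passively on the entry page, embedded in resource URLs, and required to match the visitor's cookie, so URLs copied onto the mirror site won't work for visitors who never came through your own pages:

```python
import secrets
from flask import Flask, request, abort, make_response, render_template_string

app = Flask(__name__)
COOKIE_NAME = "site_token"  # hypothetical cookie name

@app.route("/")
def index():
    # Passively issue a random token and embed it in links to protected
    # resources, so the URL and the cookie have to travel together.
    token = request.cookies.get(COOKIE_NAME) or secrets.token_urlsafe(16)
    page = render_template_string('<img src="/img/logo.png?t={{ t }}">', t=token)
    resp = make_response(page)
    resp.set_cookie(COOKIE_NAME, token)
    return resp

@app.route("/img/<path:filename>")
def protected_image(filename):
    # Serve the resource only when the token in the URL matches the cookie.
    if request.args.get("t") != request.cookies.get(COOKIE_NAME):
        abort(403)
    return app.send_static_file(f"img/{filename}")
```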

Problems with sessions and ColdFusion only in IE6

We have a strange problem when implementing sessions with ColdFusion in IE6.
After logging in, and after a refresh of the page, all the session variables are lost.
Is it some kind of configuration in the Administrator? Could you give me some troubleshooting tips for this issue?
Thanks in advance!
I've seen this before, but it's been a long time. I remember creating a test page to dump out the cookies for the site and they'd change on every refresh. I don't remember if we ever found a solution. I want to say that the issue cleared up after another update from Microsoft, but it was so long ago I honestly don't remember.
What's happening is you're getting new values for the CFID and/or CFTOKEN cookies that CF creates and uses to keep track of the browser's state. (The web is by its nature stateless, but that's not very helpful when you need to do transactions.)
Here are some of the possible issues I've seen other people mention:
Inconsistently using www.domain.com and domain.com. The site may work either way, but unless you're using domain cookies the cookies will care
Privacy settings in IE being too restrictive
Special characters in the domain name (underscore is mentioned specifically)
Lack of a P3P policy on the web server (back to the privacy settings); see the sketch below
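On that last point, the usual fix is to send a P3P compact-policy header with every response. How you add it depends on your web server or application; as an illustration only, here is what it might look like in a small Python (Flask) app, with placeholder policy tokens you'd replace with ones reflecting your actual policy:

```python
from flask import Flask

app = Flask(__name__)

@app.after_request
def add_p3p_header(response):
    # IE6's default privacy settings can reject session cookies from sites
    # that send no P3P compact policy; adding one often resolves that.
    # The tokens below are placeholders, not a statement of a real policy.
    response.headers["P3P"] = 'CP="CAO PSA OUR"'
    return response
```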
As for solutions, have you tried using J2EE session variables? Some people have had success with those in solving this.
I think it may be because your IE6 is not accepting cookies.