Is it possible to bypass X-Content-Type-Options: nosniff with just a proxy? (Burp)

I'm trying to find vulnerabilities in my friend's website using just the Community Edition of Burp. I found the admin page and was able to change the response status from 302 to 200. But there's still the issue of the X-Content-Type-Options: nosniff header. I can't get past a bunch of different pages because of it. Any ideas on how I can bypass this?
I tried just deleting the header line completely, which obviously didn't work. I know it probably sounds stupid, but I genuinely don't know what I can do to get around this.

Related

"Deceptive site ahead" warning in Google Chrome when I tried to open the ngrok URL

I recently came across this issue when I exposed a port via ngrok.
I simply forwarded it, but when I tried to open the ngrok URL I got a "Deceptive site ahead" warning.
It was a Django server with GraphQL and I wanted to test GraphiQL. (This point might not be necessary for the reader, but more info is always better than none.)
So the solution I found was to click on the red empty area and type "thisisunsafe" (without the quotes, of course).
PS: I searched for a solution but couldn't find one, so I hope this helps others who are looking for the same thing.
Another workaround that I found is using that same URL in an incognito window. I'm not sure why the security is more lax there...but it works.

Django detective spots POST requests

I installed django-detective last night, and today there are many weird POST requests in it.
It looks like somebody is trying to post stuff like this on my website:
/wp-admin/admin-post.phpnd_stats_value_import_settings=home[nd_stats_option_value]https://dodgylookingsite.com/pret
Please do not go to this link, as I think it might be hosting malware!
What does this mean?
How can I prevent this from happening?
Thank you
Apparently these requests probe for known WordPress vulnerabilities (hence the wp-admin path).
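If the noise bothers you, you can short-circuit such probes before they ever reach your views. Here is a minimal sketch of a Django middleware that does this; the path list, class name, and module name are assumptions for illustration, not something django-detective provides, so adjust them to your site:

# middleware.py (hypothetical module): drop requests for paths a Django
# site will never serve, such as WordPress admin URLs.
from django.http import HttpResponseNotFound

PROBE_PREFIXES = ("/wp-admin/", "/wp-login.php", "/xmlrpc.php")  # assumed list

class BlockProbesMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        # Answer obvious WordPress probes with a plain 404 instead of
        # letting them reach your URL resolver and your error logs.
        if request.path.startswith(PROBE_PREFIXES):
            return HttpResponseNotFound()
        return self.get_response(request)

Register the class in the MIDDLEWARE setting to activate it.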

Disabling or refusing to accept cookies in Moodle

I'm having trouble working with Moodle. I've installed it successfully, but after I filled in the installation form I get this error:
"Firefox has detected that the server is redirecting the request for this address in a way that will never complete. This problem can sometimes be caused by disabling or refusing to accept cookies."
I've tried again, but it has said the same thing for two days now. I've tried removing cookies, but it still doesn't work.
There is an easy way.
In the file
\moodle\admin\index.php
search for the following redirect and comment it out, like so:
//redirect("index.php?sessionstarted=1&lang=$CFG->lang");
This trick worked for me.

Is someone trying to hack my Django website?

I have a website that I built using Django. Using the settings.py file, I send myself error messages that are generated from the site, partly so that I can see if I made any errors.
From time to time I get rather strange errors, and they mostly seem to be around the same area of the site (where I wrote a little tutorial explaining how I set up a Django Blog Engine).
The errors I'm getting all look like something I could have caused with a typo.
For example, these two errors are very close together. I never had an 'x' or 'post' as a variable on those pages.
'/blog_engine/page/step-10-sub-templates/{{+x.get_absolute_url+}}/'
'/blog_engine/page/step-10-sub-templates/{{+post.get_absolute_url+}}/'
The user agent is:
'HTTP_USER_AGENT': 'Mozilla/5.0 (compatible; Purebot/1.1; +http://www.puritysearch.net/)',
Which I take it is a scraper bot, but I can't figure out what they would be able to get with this kind of attack.
At the risk of sounding stupid, what should I do? Is it a hack attempt or are they simply trying to copy my site?
Edit: I'll follow the advice already given, but I'm really curious as to why someone would run a script like this. Are they just trying to copy the site? It isn't hitting admin pages or even any of the forms. It seems like a harmless (aside from potential plagiarism) attempt to dig in and find content?
From your USER_AGENT info it looks like this is a web spider from puritysearch.net.
What I suggest is that you put a CAPTCHA on your website. Program it to trigger when something tries to access, say, 10 pages in 10 seconds (hardly any human would do this; work out a trigger criterion that suits your own traffic).
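A rough sketch of such a trigger, assuming a Django site and Django's cache framework (the threshold, the cache key format, and the function name are made up for illustration):

# Hypothetical helper: count requests per client IP in a rolling 10-second
# window and flag the client for a CAPTCHA once it exceeds 10 pages.
from django.core.cache import cache

LIMIT = 10    # pages
WINDOW = 10   # seconds

def should_show_captcha(request):
    ip = request.META.get("REMOTE_ADDR", "unknown")
    key = f"hits:{ip}"
    # add() only creates the key if it does not already exist, so the
    # counter resets itself every WINDOW seconds.
    if cache.add(key, 1, timeout=WINDOW):
        return False
    try:
        return cache.incr(key) > LIMIT
    except ValueError:
        # Key expired between the two calls; treat it as a fresh window.
        return False

Call it early in your views (or from a middleware) and serve the CAPTCHA page whenever it returns True.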
Also, maintain a robots.txt file, which most well-behaved crawlers honor. State your rules there; for instance, you can tell crawlers to keep off certain busy sections of your site.
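For example, a robots.txt along these lines (the paths are placeholders, not taken from the site in question) keeps compliant crawlers out of the busy sections while leaving the rest indexable:

User-agent: *
Disallow: /search/
Disallow: /accounts/
Crawl-delay: 10

Note that robots.txt is only advisory; a bot that chooses to ignore it will not be stopped by it.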
If the problem persists, you might want to contact that site's system admin and try to figure out what's going on.
This way you are not blocking crawlers completely (they are needed for your website to become popular), and at the same time you make sure your users get a fast experience on your site.
Project Honey Pot has this bot listed as malicious: http://www.projecthoneypot.org/ip_174.133.177.66 (check the comments there). What you should probably do is ban that IP and/or user agent.
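If you want to apply the ban inside Django itself rather than at the firewall, a small check like this is enough. The IP comes from the Project Honey Pot listing and the agent substring from the user agent in the question; the function name and the 403 response are assumptions:

# Hypothetical check, called from a middleware or at the top of a view:
# refuse requests from the listed IP or from any "Purebot" user agent.
from django.http import HttpResponseForbidden

BANNED_IPS = {"174.133.177.66"}
BANNED_AGENT_SUBSTRINGS = ("Purebot",)

def reject_banned(request):
    ip = request.META.get("REMOTE_ADDR", "")
    agent = request.META.get("HTTP_USER_AGENT", "")
    if ip in BANNED_IPS or any(s in agent for s in BANNED_AGENT_SUBSTRINGS):
        return HttpResponseForbidden()
    return None  # caller continues handling the request as usual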

Is it possible to be attacked with XSS on a static page (i.e. without PHP)?

A client I'm working for has mysteriously ended up with some malicious scripting on their site. I'm a little baffled, however, because the site is static and not dynamically generated: no PHP, Rails, etc. At the bottom of the page, though, somebody has inserted a new tag and a script. When I opened the file on the webserver, stripped the malicious stuff, and re-uploaded it, the script was still there. How is this possible? And more importantly, how can I combat this?
EDIT:
To make it weirder, I just noticed the script only shows up in the source if the page is accessed directly as 'domain.com/index.html' but not as just 'domain.com'.
EDIT2:
At any rate, I found a PHP file (x76x09.php) sitting on the web server that must have been updating the HTML file despite my attempts to strip out the script. I'm currently in the clear, but I do have to do some work to make sure rogue files don't just appear again and cause problems. If anyone has any suggestions on this, feel free to leave a comment; otherwise, thanks for the help everyone! It was very much appreciated!
No, it's not possible unless someone has access to your files. So in your case, someone has access to your files.
Edit: It's best to ask on serverfault.com about what to do when a server is compromised, but for a start:
change your shell passwords
have a look at /var/log/messages for login attempts
finger root
have a look at the last modification time of those files (a quick way to check this in bulk is sketched after this list)
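A quick way to do that last check in bulk, assuming a Unix host (the webroot path and the time window are placeholders):

# Hypothetical helper: list files under the webroot modified in the last
# few days, so injected or regenerated files stand out.
import os
import time

WEBROOT = "/var/www/html"   # assumed path; use your real document root
DAYS = 7

cutoff = time.time() - DAYS * 86400
for dirpath, _dirnames, filenames in os.walk(WEBROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            mtime = os.stat(path).st_mtime
        except OSError:
            continue
        if mtime > cutoff:
            print(time.ctime(mtime), path)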
There is also a high probability that the files were altered via HTTP, using a vulnerability in a software component you use together with the static files.
To the point about the site not having pages executing on the server, XSS is absolutely still possible using a DOM based attack. Usually this will relate to JavaScript execution outputting content to the page. Just last week WhiteHat Security had an XSS vulnerability identified on a purely “static” page.
It may well be that the attack vector relates to file-level access, but I suggest it's also worthwhile taking a look at what's going on JS-wise.
You should probably talk to your hosting company about this. Also, check that your file permissions aren't more lenient than they should be for your particular environment.
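One way to check for the permissions point on a Unix host is a short script like this sketch (the webroot path is a placeholder):

# Hypothetical helper: flag world-writable files under the webroot, which
# any local process (or compromised script) could modify.
import os
import stat

WEBROOT = "/var/www/html"   # assumed path; use your real document root

for dirpath, _dirnames, filenames in os.walk(WEBROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            mode = os.stat(path).st_mode
        except OSError:
            continue
        if mode & stat.S_IWOTH:
            print(oct(mode & 0o777), path)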
That's happened to me before; it happens if they get your FTP details. So whoever did it obviously got hold of your FTP details somehow.
The best thing to do is change your password and contact your web hosting company to figure out a better solution.
Unfortunately, FTP isn't the most secure...