django-detective spots suspicious POST requests

I installed django-detective last night, and today it is full of weird POST requests.
It looks like somebody is trying to POST things like this to my website:
/wp-admin/admin-post.phpnd_stats_value_import_settings=home[nd_stats_option_value]https://dodgylookingsite.com/pret
Please do not go to this link, as I think it might be hosting malware!
What does this mean?
How can I prevent this from happening?
Thank you

Apparently this is an automated bot scanning for known WordPress vulnerabilities (hence the /wp-admin/ path). Bots probe virtually every public site this way; since your site runs Django rather than WordPress, these requests simply 404 and do no harm. You can't stop bots from sending requests, but you can block or ignore them.
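If you want to cut the log noise, one option is to short-circuit these probes before they reach the rest of the stack. Here is a minimal sketch of a Django middleware that does that; the module name and prefix list are illustrative assumptions, not part of django-detective:

    # middleware.py -- hypothetical module in your project
    from django.http import HttpResponseNotFound

    # Path prefixes commonly probed by WordPress vulnerability scanners;
    # extend the list based on what shows up in your logs.
    PROBE_PREFIXES = ('/wp-admin/', '/wp-login.php', '/xmlrpc.php', '/wp-content/')

    class BlockWordPressProbesMiddleware:
        def __init__(self, get_response):
            self.get_response = get_response

        def __call__(self, request):
            # str.startswith accepts a tuple of prefixes.
            if request.path.startswith(PROBE_PREFIXES):
                return HttpResponseNotFound()
            return self.get_response(request)

Add it near the top of MIDDLEWARE in settings.py. Blocking at the web server (nginx/Apache) or firewall is cheaper still, since the request never reaches Django at all.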

Related

Is it possible to bypass X-Content-Type-Options: nosniff with just a proxy?

I'm trying to find vulnerabilities in my friend's website using just the Community Edition of Burp. I found the admin page and was able to change the response status from a 302 to a 200. But there's still the issue of the X-Content-Type-Options: nosniff header. I can't get past a bunch of different pages because of it. Any ideas on how I can bypass this?
I tried just deleting the header line completely, which obviously didn't work. I know it probably sounds stupid, but I genuinely don't know what I can do to get around it.

DjangoCMS and two-factor-auth

I'm trying to use django-two-factor-auth with django CMS. I saw it can be hooked into views or even called through request.user, but I'm having a bad time figuring out how I can actually do that, given that in django CMS, from what I've gathered so far, there are no views I can touch.
I looked at the example app on Heroku for django-two-factor-auth, but it doesn't give me enough hints.
Has anyone faced this before?
Thanks
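For what it's worth, django-two-factor-auth hooks in at the URL level rather than through individual views, so it can sit alongside django CMS's catch-all URL pattern. A minimal sketch of a project-level urls.py, assuming the usual cms.urls catch-all:

    # urls.py (project level) -- a sketch, not an official django CMS recipe
    from django.urls import include, path
    from two_factor.urls import urlpatterns as tf_urls

    urlpatterns = [
        # Two-factor login/account views, matched before the CMS catch-all.
        path('', include(tf_urls)),
        # django CMS claims every remaining URL.
        path('', include('cms.urls')),
    ]

Setting LOGIN_URL = 'two_factor:login' in settings.py should then route the CMS's login redirects through the 2FA flow.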

django-localeurl together with FORCE_SCRIPT_NAME not working

I'm currently working on a site where multiple locales will be served under different URLs using django-localeurl. localeurl has always worked for me when the site is served directly at the top level, but this time around I have to use settings.FORCE_SCRIPT_NAME because the site needs to be served under a sub-path.
The problem is that when users enter the site, they are redirected to http://www.example.com/en/ and not http://www.example.com/site/en/ as they should be. Serving the site under http://www.example.com/site/ works perfectly when I disable localeurl.
Any suggestions as to how I could fix this would be greatly appreciated, as I'm close to tearing my hair out any second now!
There is an open ticket for this issue in the localeurl tracker. It also has a proposed patch that fixes it.
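For reference, the combination being described looks roughly like this in settings.py (the /site prefix is an illustrative value; localeurl's middleware generally needs to run early in the stack):

    # settings.py -- the setup described above (illustrative values)
    FORCE_SCRIPT_NAME = '/site'  # site lives under http://www.example.com/site/

    MIDDLEWARE_CLASSES = (
        'localeurl.middleware.LocaleURLMiddleware',  # runs before the rest
        # ... the remaining middleware ...
    )

    INSTALLED_APPS = (
        'localeurl',
        # ...
    )

Without the patch, localeurl appears to build its locale redirect from the URL path alone, dropping the script prefix, which would explain the /en/ vs. /site/en/ symptom.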

Is someone trying to hack my Django website?

I have a website that I built using Django. Through settings.py, I have the site email me the error reports it generates, partly so that I can see if I made any errors.
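The setup is essentially Django's standard error reporting, something like this (values are placeholders):

    # settings.py -- Django's built-in error emails (placeholder values)
    DEBUG = False  # error reports are only mailed when DEBUG is False

    ADMINS = [('Me', 'me@example.com')]   # recipients of 500 error reports
    SERVER_EMAIL = 'django@example.com'   # From: address used for the reports
    EMAIL_HOST = 'smtp.example.com'       # mail server that delivers them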
From time to time I get rather strange errors, and they mostly seem to be around the same area of the site (where I wrote a little tutorial trying to explain how I set up a Django blog engine).
The errors I'm getting all look like something I could have caused with a typo.
For example, these two errors are very close together. I never had an 'x' or 'post' as a variable on those pages.
'/blog_engine/page/step-10-sub-templates/{{+x.get_absolute_url+}}/'
'/blog_engine/page/step-10-sub-templates/{{+post.get_absolute_url+}}/'
The user agent is:
'HTTP_USER_AGENT': 'Mozilla/5.0 (compatible; Purebot/1.1; +http://www.puritysearch.net/)',
Which I take to be a scraper bot, but I can't figure out what they would be able to get with this kind of attack.
At the risk of sounding stupid, what should I do? Is it a hack attempt or are they simply trying to copy my site?
Edit: I'll follow the advice already given, but I'm really curious as to why someone would run a script like this. Are they just trying to copy the site? It isn't hitting admin pages or even any of the forms. It seems like a harmless (aside from potential plagiarism) attempt to dig in and find content?
From your USER_AGENT info it looks like this is a web spider from puritysearch.net.
What I suggest you do is put a CAPTCHA on your website. Program it to trigger when something tries to access 10 pages in 10 seconds (hardly any human would do that), or work out a better criterion for triggering your CAPTCHA.
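A rough sketch of that trigger at the Django layer, counting requests per IP in the cache (the threshold and the 429 response are illustrative; a real version would redirect to a CAPTCHA challenge instead):

    # captcha_trigger.py -- hypothetical middleware sketch
    from django.core.cache import cache
    from django.http import HttpResponse

    WINDOW_SECONDS = 10  # size of the counting window
    MAX_REQUESTS = 10    # requests allowed per window

    class CaptchaTriggerMiddleware:
        def __init__(self, get_response):
            self.get_response = get_response

        def __call__(self, request):
            key = 'hits:%s' % request.META.get('REMOTE_ADDR', 'unknown')
            cache.add(key, 0, timeout=WINDOW_SECONDS)  # create counter if absent
            try:
                hits = cache.incr(key)                 # atomic increment
            except ValueError:                         # key expired in between
                hits = 1
            if hits > MAX_REQUESTS:
                # A real implementation would redirect to a CAPTCHA page here.
                return HttpResponse('Too many requests.', status=429)
            return self.get_response(request)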
Also, maintain a robots.txt file, which most well-behaved crawlers honor. State your rules in robots.txt; you can tell crawlers to keep off certain busy sections of your site, and so on.
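For example (the bot name and paths are taken from the logs above; adjust to taste):

    # robots.txt, served from the site root
    User-agent: Purebot
    Disallow: /

    User-agent: *
    Disallow: /blog_engine/page/

Note that robots.txt is purely advisory; malicious bots are free to ignore it.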
If the problem persists, you might want to contact that site's system admin and try to figure out what's going on.
This way you will not be completely blocking crawlers (which you need for your website to become popular), and at the same time you make sure your users get a fast experience on your site.
Project Honey Pot has this bot listed as malicious: http://www.projecthoneypot.org/ip_174.133.177.66 (check the comments there). What you should probably do is ban that IP and/or User-Agent.
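At the Django layer, that ban could look something like this (the IP and agent substring come from the report and the logged user agent; blocking at the web server or firewall is cheaper):

    # banlist.py -- hypothetical middleware sketch
    from django.http import HttpResponseForbidden

    BANNED_IPS = {'174.133.177.66'}          # from the Project Honey Pot report
    BANNED_AGENT_SUBSTRINGS = ('Purebot',)   # from the logged user agent

    class BanListMiddleware:
        def __init__(self, get_response):
            self.get_response = get_response

        def __call__(self, request):
            ip = request.META.get('REMOTE_ADDR', '')
            agent = request.META.get('HTTP_USER_AGENT', '')
            if ip in BANNED_IPS or any(s in agent for s in BANNED_AGENT_SUBSTRINGS):
                return HttpResponseForbidden()
            return self.get_response(request)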

Is it possible to be attacked with XSS on a static page (i.e. without PHP)?

A client I'm working for has mysteriously ended up with some malicious scripting on their site. I'm a little baffled, however, because the site is static and not dynamically generated: no PHP, Rails, etc. At the bottom of the page, though, somebody had opened a new tag and added a script. When I opened the file on the webserver, stripped out the malicious code, and re-uploaded it, the script was still there. How is this possible? And more importantly, how can I combat this?
EDIT:
To make it weirder, I just noticed the script only shows up in the source if the page is accessed directly as 'domain.com/index.html' but not as just 'domain.com'.
EDIT2:
At any rate, I found a PHP file (x76x09.php) sitting on the web server that must have been updating the HTML file despite my attempts to strip out the script. I'm currently in the clear, but I do have to do some work to make sure rogue files don't just appear again and cause problems. If anyone has any suggestions on this, feel free to leave a comment; otherwise, thanks for the help everyone! It was very much appreciated!
No, it's not possible unless someone has access to your files. So in your case, someone has access to your files.
Edit: It's best to ask on serverfault.com about what to do when a server has been compromised, but as a start:
change your shell passwords
have a look at /var/log/messages for login attempts
run finger root to check when root last logged in
check the last-modification times of those files
There is also a high probability that the files were altered via HTTP, by exploiting a vulnerability in a software component you run alongside the static files.
To the point about the site not having pages executing on the server, XSS is absolutely still possible using a DOM based attack. Usually this will relate to JavaScript execution outputting content to the page. Just last week WhiteHat Security had an XSS vulnerability identified on a purely “static” page.
It may well be that the attack vector relates to file-level access, but I suggest it's also worthwhile taking a look at what's going on JS-wise.
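To illustrate the DOM-based case with a contrived example (not necessarily the vector in this incident): a purely static page can still execute attacker-supplied markup if its own JavaScript writes URL data into the document.

    <!-- static.html: no server-side code, yet exploitable -->
    <script>
      // Writes the URL fragment straight into the page, so visiting
      //   static.html#<img src=x onerror=alert(1)>
      // runs the attacker's payload in the visitor's browser.
      document.write(decodeURIComponent(location.hash.slice(1)));
    </script>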
You should probably talk to your hosting company about this. Also, check that your file permissions aren't more lenient than they should be for your particular environment.
That's happened to me before; it happens when someone gets hold of your FTP details. So whoever did it obviously got hold of your FTP credentials somehow.
The best thing to do is change your password and contact your web hosting company to figure out a better solution.
Unfortunately, FTP isn't the most secure protocol...