wampserver homepage only visible from localhost - wamp

Recently I used WAMP server to set up a server environment on a Windows machine. Everything works great, but I have one small problem: everyone can access the wampserver homepage, and from there they can see the other webpages hosted on the same server, the server file system, etc.
The URLs of the webpages have the following format: hostname/project1, hostname/project2... The main problem is that anyone can see all of the hosted projects just by browsing to the hostname, because that leads to the wampserver homepage. I would prefer that this homepage be accessible only from localhost on the Windows host. Is there any way to do that? I'm guessing that I will need to modify some parameters in configuration files, but I have no idea which ones...

If you intend to block outside access to all sites hosted on this computer, you can do this in your main Apache configuration file at <installation drive>/wamp/bin/apache/Apache<version number>/conf/httpd.conf. .htaccess is more for per-site configuration, though it will certainly work if you put it in the main www directory.
To disallow outside access to the www folder (open by default), find the part of the Apache configuration file (path shown above) that looks like:
<Directory "<installation drive>/wamp/www">
# There will be comments here and some options like FollowSymLinks and AllowOverride
Order Allow,Deny
Allow from all
</Directory>
And change it to:
<Directory "<installation drive>/wamp/www">
# There will be comments here and some options like FollowSymLinks and AllowOverride
Order Deny,Allow
Deny from all
Allow from 127.0.0.1
</Directory>
If your goal is not to block outside access to all of your sites, it would help to know more about your set up. And if your goal is only to block the 'localhost' page and still allow access to, say, 'localhost/site1' then this question may be a duplicate of this.
Edit:
As you point out, there is not a good resolution for the question I linked. Assuming you have your public sites set up as virtual hosts in a sub folder of the webroot like:
|-wamp_root
  |-www
    |-vhosts
      |-public_site_1
      |-public_site_2
Then you can go back into your httpd.conf and add this below your /wamp/www/ rule:
<Directory "<installation drive>/wamp/www/vhosts/">
# There will be comments here and some options like FollowSymLinks and AllowOverride
Order Allow,Deny
Allow from all
</Directory>
This will allow anything in the www folder to be accessed only locally, while anything in the vhosts subfolder remains accessible from outside. Remember to restart Apache whenever you change this file.
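As a side note, newer WAMP releases ship Apache 2.4, where the same split is usually written with the Require directives instead of Order/Deny/Allow. A minimal sketch, assuming the same default layout (the paths are placeholders as above):
<Directory "<installation drive>/wamp/www">
# Require local matches 127.0.0.1, ::1 and the server's own address,
# so only the machine itself sees the homepage and anything else in www
Require local
</Directory>
<Directory "<installation drive>/wamp/www/vhosts/">
# Public sites under vhosts stay reachable from outside
Require all granted
</Directory>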

It should be possible to block other users using the Windows firewall.
You could also use a .htaccess file like this one:
Order deny,allow
Deny from all
Allow from 127.0.0.1
You will have to make sure that AllowOverride is set to All in the Apache configuration and that the .htaccess will be applied to all subdirectories too; otherwise your projects will still be available.
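A minimal sketch of the relevant httpd.conf stanza, assuming the common c:/wamp/www document root (adjust the path to your install); this is what lets the .htaccess above apply to www and, through inheritance, to every project subdirectory:
<Directory "c:/wamp/www">
Options Indexes FollowSymLinks
# Must be All (or at least include Limit) so the Order/Deny/Allow lines
# in the .htaccess are honoured
AllowOverride All
</Directory>
Restart Apache after editing httpd.conf; changes to the .htaccess itself take effect immediately.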

It appears (after a bit of head-scratching myself) that the answer to this question was simple.
In the Windows taskbar, left-click the WAMP icon, then click 'Put Offline'.
It doesn't appear to take the entire webserver offline, just the root homepage; anything you've configured in your httpd.conf file to be accessible externally still stands and remains reachable.
NOTE: The default VHOSTs are still reachable though, PHPINFO and PHPMYADMIN for example!

It is not difficult.
Edit the index file with Notepad++.
Find the line containing &projectContents and change &projectContents to &project---Contents; the project title then disappears.

Related

How to restrict apache to not send any csv,txt,log,sh file in response

What I want here is that, if I create any csv, log, txt, or sh file in public or any other folder inside /var/www/ and forget to remove it, and an attacker then hits the valid file path or URL, Apache should not send that csv, log, txt, or sh file in response.
I changed my apache2.conf as below, but when I hit a URL like
mysite.com/website_content/assets/transactions.csv
it still sends me the file in response, and I can download it.
<FilesMatch "\.(csv|sh|log|txt)$">
Require all denied
</FilesMatch>
I want Apache to not send any csv, docx, doc, sh, log, or txt file in response. Any help would make my life easy.
My team leader solved this issue efficiently with:
<Files ~ "\.csv$">
Order allow,deny
Deny from all
</Files>
I then changed it to this to block multiple file types:
<Files ~ "\.(csv|sh|txt|doc|docx|zip|gz|bz|py|xlsx|db|pdf|crt|info|key|rb)$">
Order allow,deny
Deny from all
</Files>
Apache now returns 403 Forbidden whenever anyone tries to access those file types.
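For what it's worth, the Order/Deny/Allow syntax above is the Apache 2.2 style (on 2.4 it only works through mod_access_compat). On a plain Apache 2.4 setup the equivalent uses Require, which is essentially what the question attempted; a minimal sketch, assuming the block sits in the virtual host or a <Directory /var/www/> section and Apache is reloaded afterwards:
<FilesMatch "\.(csv|sh|txt|doc|docx|zip|gz|bz|py|xlsx|db|pdf|crt|info|key|rb)$">
# Refuse to serve these extensions anywhere below the enclosing context
Require all denied
</FilesMatch>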

Can I use a regex or string replace with Apache VirtualDocumentRoot?

I have an Apache configuration which is something like this:
<VirtualHost *:80>
ServerAlias *.example.com
VirtualDocumentRoot /var/www/%1
<Directory /var/www/>
Options -Indexes +FollowSymLinks +MultiViews
AllowOverride All
Require all granted
</Directory>
</VirtualHost>
The idea is that it serves any subdomain request from a directory with the same name. For instance, the docroot for http://beta-a.example.com becomes /var/www/beta-a.
This works fine.
My question is this: Is there any way to modify the extracted part of the request based on some logic? Ideally a regex. For example, I'd like to take all requests like:
http://beta-a.example.com
http://beta-b.example.com
http://beta-c.example.com
And remove everything after and including the - so that the docroot would become /var/www/beta. Basically, I'd like to find some way to have alternate host names that get served from the same docroot. I know of the rule:
%N.M insert (part of) the name
But this requires that I specify an explicit length and does not seem to allow any application of logic for the extracted substring.
Although not exactly what I am looking for, I'll offer a workaround which I'm using here. I can add an additional subdomain in the second position which accomplishes almost the same thing. So using these:
http://beta.a.example.com
http://beta.b.example.com
http://beta.c.example.com
All the above will be served from docroot /var/www/beta.
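That workaround needs no change to the configuration above: mod_vhost_alias splits the hostname on dots, so %1 is simply its first part. A short sketch of the mapping with the existing directive:
# %1 = first dot-separated part of the requested hostname
VirtualDocumentRoot /var/www/%1
# beta.a.example.com -> /var/www/beta
# beta.b.example.com -> /var/www/beta
# beta-a.example.com -> /var/www/beta-a (the '-' is not a separator for %N)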

Hosting a wordpress blog parallel with a django app

I would like to host a blog at a subfolder of my domain, which is covered by a django app. I'm most of the way there, but I need some help getting over the finish line, as it were.
Currently, if I go to domain.com, the django app is served correctly. If I go to domain.com/blog/, the blog is served correctly. However, if I go to domain.com/blog (note the missing trailing slash), the urlconf returns a URL not found error.
I've tried a couple of things, including:
Reordering the Alias, Directory, and WSGIScriptAlias statements in my Apache configuration
Having the django urlconf trap the domain.com/blog condition and redirect to domain.com/blog/ (probably unsurprisingly causing an infinite loop of redirects)
What are my next steps?
Here is the relevant part of my Apache conf:
Alias /blog/ /var/www/blog/
<Directory /var/www/blog/>
AllowOverride All
Order deny,allow
Allow from all
</Directory>
I haven't used Apache in years, but try aliasing just /blog instead of /blog/. The problem currently is that Apache is not catching it, so it's being passed to Django. If that doesn't work, you might also try setting up a 301 redirect in your Apache conf to redirect to the slash version, thereby avoiding Django altogether.
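A minimal sketch of both suggestions, assuming the paths from the question (untested; pick one or the other):
# Option 1: alias without the trailing slash so Apache also catches /blog
Alias /blog /var/www/blog
<Directory /var/www/blog>
AllowOverride All
Order deny,allow
Allow from all
</Directory>
# Option 2: keep the original Alias /blog/ and 301-redirect the slash-less
# URL with mod_alias before it ever reaches Django
RedirectMatch 301 ^/blog$ /blog/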

Virtual Hosting in Apache with Regular Expressions

I have seen before that, to avoid having to restart the Apache service whenever a new virtual host is added, you can use regular expressions to set up a virtual host. I have a server where sites are added and removed fairly often, and I would like to do the same.
All directories for the sites are in the following format: /var/www/{domain-of-site}/www. So I need to match the regular expression "var/www/([A-Za-z0-9.]){1,}/www" to get both the directory and the domain name.
Is this really possible in Apache2? If so, what would a basic setup look like?
I do this on my dev machine. You need to enable mod_vhost_alias.
Then in your vhosts file, add:
VirtualDocumentRoot "/var/www/%-1.0s/%-2.0/public_html"
<Directory "/var/www">
Options Indexes FollowSymLinks
AllowOverride All
Order allow,deny
Allow from all
</Directory>
This will point http://mydevproject.client to /var/www/clients/mydevproject/public_html
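For the directory layout in the question (/var/www/{domain-of-site}/www) a regular expression is not actually needed; %0 is the whole hostname, so a minimal sketch along the same lines would be:
# Take the virtual host name from the Host header rather than ServerName
UseCanonicalName Off
VirtualDocumentRoot "/var/www/%0/www"
# example.com -> /var/www/example.com/www
# sub.example.org -> /var/www/sub.example.org/www
New site directories become reachable as soon as they exist, which is what avoids the restart whenever a site is added.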

Use .htaccess to restrict external access to my Intranet

I'm sure this is possible, but it's beyond my meager abilities with .htaccess files.
We have an internal PHP app that we use; we have basic security internally but don't need to worry too much. I would like to make it available online for use when staff are out and about, and I would like to have additional security based on .htaccess or htpasswd files.
Is it possible to write an .htaccess file that does the following?
If the user is accessing from office.mydomain.com it means they are internal (office.mydomain.com resolves to an internal IP like 192.168.22.22), so allow unimpeded access.
If the user is accessing from outside, it will be external.myoffice.com; in this case, as an added bit of security, I would like to use .htaccess and a password file to require the user to enter an Apache password.
Can anyone tell me how to write this in an .htaccess file?
Update: Thanks for all the answers, I have posted what worked for me as an answer to help others.
You can use
RewriteCond %{REMOTE_ADDR} !^192\.168\.
to specify the condition of an external IP, or use
RewriteCond %{REMOTE_ADDR} ^192\.168\.
for the condition of a local IP.
You will just have to integrate these into your existing htaccess rules in a sensible way.
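As an illustration of how the condition might slot in, a minimal sketch that simply refuses any request not coming from the 192.168 range (the htpasswd-based answer further down is the more flexible option):
RewriteEngine On
# Anything outside 192.168.x.x gets a 403 Forbidden
RewriteCond %{REMOTE_ADDR} !^192\.168\.
RewriteRule ^ - [F]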
I think this does what you want:
http://codesanity.net/2009/11/conditional-htpasswd-multienvironment-setups/
http://tomschlick.com/2009/11/08/conditional-htpasswd-multi-environments/
Correct address for the resource as of 2022/01/15:
https://tomschlick.com/conditional-htpasswd-multi-environments/
Here you go
order deny,allow
allow from 192.168.22.0/255.255.255.0
deny from all
You can use a subnet mask to make sure the visitors are from the same network. If you need to address another network, just use those IPs (as the server sees them).
To complete this answer, the following works.
# allow everything if it's on a certain host
SetEnvIf HOST "^www.mysite.com" external_url
SetEnvIf HOST "^localhost" local_url
Order Deny,Allow
AuthName "Restricted Area"
AuthType Basic
AuthUserFile path/to/your/.htpasswd
AuthGroupFile /
Require valid-user
#Allow valid-user
Deny from all
Allow from env=external_url
Allow from env=local_url
Satisfy any
This pops up a Restricted Area login box if you visit via www.mysite.com, but shows no login prompt at all if you are coming in locally.