How to get the $_SERVER['DOCUMENT_ROOT'] in coldfusion? - coldfusion

In LAMP, I get the root of the website with
$_SERVER['DOCUMENT_ROOT']
as in /some/path/public_html
How do I get this, in coldfusion?
Update: I find that on this server, '/' is the doc root (not the root of the whole file system, as in LAMP). Is that reliable?

ColdFusion really has no idea where the 'web root' is, nor should it really matter. I often architect applications where all my controllers, services and files live outside of the web root. In the cases where I actually need to know the web root, I create ColdFusion mappings in my Application.cfc that look like this:
rootDir = getDirectoryFromPath(getCurrentTemplatePath());
this.mappings[ "/www" ] = rootDir;
this.mappings[ "/services" ] = rootDir & '../services';
This creates two ColdFusion mappings, named www and services, pointing respectively to the 'web root' (my www directory) and to the services directory above the web root. My directory structure would look something like this:
Site Directory
    wwwapp
    services
    www (web root)
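Putting the pieces together, a minimal Application.cfc for that layout might look like the following sketch (the application name and the expandPath() usage are illustrative assumptions, not from the original answer):

```cfml
// Application.cfc -- minimal sketch for the directory layout above
component {
    this.name = "myApp";  // assumed application name

    // Directory containing this Application.cfc (the www web root)
    rootDir = getDirectoryFromPath( getCurrentTemplatePath() );

    // The mappings from the answer: /www -> web root, /services -> sibling dir
    this.mappings[ "/www" ]      = rootDir;
    this.mappings[ "/services" ] = rootDir & "../services";
}
```

Elsewhere in the app, expandPath( "/services" ) or a CFC path like services.SomeService then resolves through the mapping rather than through the web server's document root.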

I normally use
ExpandPath('/')
but I strongly suggest checking it on a server-by-server basis.
Also, please note that this function returns the path with a trailing slash:
/var/www/html/
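If you need the same shape of value as $_SERVER['DOCUMENT_ROOT'] (which has no trailing slash), you can strip it; a small sketch (on Windows the trailing character would be a backslash instead):

```cfml
<cfset docRoot = ExpandPath( "/" )>
<!--- e.g. /var/www/html/ -- note the trailing slash --->
<cfif Len( docRoot ) GT 1 AND Right( docRoot, 1 ) EQ "/">
    <cfset docRoot = Left( docRoot, Len( docRoot ) - 1 )>
</cfif>
<!--- docRoot is now /var/www/html, like DOCUMENT_ROOT in LAMP --->
```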

Related

Nuxt SSG app not routing correctly on google cloud bucket, goes to dir/index.html on reload

I followed this tutorial on how to host server-side generated sites on Google Cloud buckets. The issue I am having is that the Nuxt app works fine when routing internally, but when I reload a route, the site is redirected to that route plus a /index.html at the end.
This breaks things, as it conflicts with Nuxt routing. I set index.html as my entry point via
gsutil web set -m index.html -e 404.html gs://my-static-assets
but it seems to assume that to be the case always. I don't have this problem using Netlify.
According to the tutorial you're following, you're redirected to /route/index.html because of your MainPageSuffix setting of index.html: when you request, for example, https://example.com/route, the service looks for https://example.com/route/index.html.
To fix this, I suggest that you include an index.html file in each route's subdirectory.
For further info on this, check this post, which describes a similar issue.
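As a sketch of the index.html suggestion above: nuxt generate already writes an index.html into each route's subdirectory of dist/, so syncing that whole tree to the bucket covers every route (the dist directory name and these exact commands are assumptions based on a default Nuxt setup):

```shell
# Build the static site; Nuxt emits dist/<route>/index.html for each route
npx nuxt generate

# Upload the whole tree so every route directory carries its own index.html
gsutil -m rsync -r dist gs://my-static-assets
```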

Are there alternate locations for the robots.txt file or configurations that control it?

We have a website with a number of applications configured as sub-sites. Currently none of the sites are being indexed properly by google. I'm assuming that this is due to the robots.txt file which contains:
User-Agent: *
Disallow: /
I can view the robots.txt file by adding the file name to the URL of the root site (example.com/robots.txt), but when searching the actual web server directory there is no robots.txt file. I have tried putting a new robots.txt file in the root directory, but it has no effect.
The only thing that has come up when I've tried to search for this issue is an IIS Search Engine Optimization feature, which we do not have installed. Is there some kind of server setting or policy that is generating the file automatically? We have access to the web server, but it was set up and is controlled by another department.
VM server
Windows Server 2012 R2 Standard
IIS 8.5
Turns out that our website is being directed through the Azure AD Application Proxy. According to this doc, web crawler robots are blocked by the aforementioned default robots.txt reply: https://learn.microsoft.com/en-us/azure/active-directory/manage-apps/application-proxy-security.

Browserify + WebStorm debug breaks routing in React-Router v4 BrowserRouter

I am writing a single-page app with React for educational purposes. My React-Router v4 BrowserRouter handles client-side routing correctly on CodeSandbox but not locally; in this case, the local server is the WebStorm built-in dev server. HashRouter works locally but BrowserRouter does not.
Functioning properly: https://codesandbox.io/s/j71nwp9469
You are likely serving your app on the built-in web server (localhost:63342), right? The internal web server returns 404 for 'absolute' URLs (the ones starting with a slash) because it serves files from localhost:port/project_name and not from localhost:port. That's why you have to make sure to change all URLs from absolute to relative ones.
There is no way to configure the internal web server to use the project root as the server document root. But you can configure it to use URLs like http://<host name>:<port>, where 'host name' is a name specified in your hosts file, like 127.0.0.1 myhostName. See https://youtrack.jetbrains.com/issue/WEB-8988#comment=27-577559.
The solution was to understand how push-state routing and the History API work. When serving single-page applications that use the HTML5 History API, requests must be proxied through the index page.
The WebStorm dev server is not expected to include this feature, so the mention of WebStorm in this thread was a mistake.
There are multiple libraries of fewer than 20 lines that do this for us, or it can easily be hand-coded.

lighttpd mod_rewrite to remote URLs for certain file extensions

I am trying to set up a local development lighttpd server with all the PHP, HTML and CSS files of my project. I want lighttpd to serve those files locally, but for any other file extension, fetch the file remotely from a remote website via a URL.
I have tried the following code in the lighttpd.conf file:
url.rewrite = ( "^(.*)$" => "http://<remote site>/" )
and I have also tried
url.rewrite = ( "^(.*)$" => "http://<remote site>/$1" )
This was a test to redirect everything; however, both of these return a 404 for any URL I try, so something is already wrong, and that is stopping me going any further.
Can someone give me some help with this?
The option you're probably looking for is a redirect via mod_redirect, written as url.redirect. (url.rewrite rewrites the path internally within the same server, so it cannot send a request to a remote host; that's why your rules return 404.)
url.redirect tells the browser to go to the new URL for the resource. This slows things down slightly, as the browser has to make a second request to the actual location of the resource.
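A sketch of that approach in lighttpd.conf (assumes mod_redirect is available; the extension list and the <remote site> placeholder are taken from the question):

```
# Load mod_redirect alongside the other modules
server.modules += ( "mod_redirect" )

# Requests that are NOT .php/.html/.css get redirected to the remote host
$HTTP["url"] !~ "\.(php|html|css)$" {
    url.redirect = ( "^/(.*)$" => "http://<remote site>/$1" )
}
```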

ColdFusion built in web server, how to test from a domain

So I want to do this:
myapp.com -> localhost (ColdFusion local web server)
I can currently only do this:
localhost/myapp_folder/
Any ideas on how to achieve this using built in web server?
One solution is to add
127.0.0.1 myapp.com
to your Windows hosts file (C:\Windows\System32\drivers\etc\hosts).
(You'll need administrator rights to edit it, so right-click Notepad, choose 'Run as administrator', then browse to the file and edit it that way.)
This makes all requests on your local machine for myapp.com point to 127.0.0.1 (i.e. localhost). Don't forget to remove the entry when you're done testing, as it will obviously stop you from seeing the real site once it goes live.
This will allow you to browse to myapp.com/myapp_folder/, but if you want to map /myapp_folder/ to the root of the myapp.com domain, you'll have to use something more sophisticated like Apache or IIS in front of the built-in web server.
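If you do put Apache in front, a minimal reverse-proxy sketch could map the domain root to the folder (port 8500 is the ColdFusion built-in server's default; the paths here are assumptions):

```
# Requires mod_proxy and mod_proxy_http
<VirtualHost *:80>
    ServerName myapp.com
    ProxyPass        "/" "http://localhost:8500/myapp_folder/"
    ProxyPassReverse "/" "http://localhost:8500/myapp_folder/"
</VirtualHost>
```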