EC2 web application folder structure - amazon-web-services

I have a web application which is currently working fine on my local machine and I am now trying to get it to work on EC2.
I transferred the index.php file into the folder /var/www and I am able to access it by visiting my Elastic IP (for example, http://123.45.678.910/).
The trouble is that I also added a folder named restAPI to /var/www, which in turn contains several files. When I try to access restAPI/index.php by going to http://123.45.678.910/var/www/restAPI/index.php, I get a 404 error.

There are two things at play here:
The file system path
The URL path
If you're running an Amazon Linux image, your web content should be deployed inside /var/www/html -- as is the case with just about every reasonable Linux installation.
If your index page is stored at /var/www/html/index.php, then your URL will be http://123.45.678.910/index.php.
If you're trying to access http://123.45.678.910/var/www/restAPI/index.php, the server appends that URL path to its document root and looks for /var/www/html/var/www/restAPI/index.php, so that URL would only work if you had uploaded your file to that doubled path. Once restAPI sits inside /var/www/html, the correct URL is http://123.45.678.910/restAPI/index.php.
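Purely as an illustration of how a web server resolves a request path against its document root (this is just the idea sketched in Python, not anything Apache actually runs):

import os.path

# Illustration only: the path part of the URL is appended to the document root.
DOCUMENT_ROOT = "/var/www/html"

def resolve(url_path):
    return os.path.join(DOCUMENT_ROOT, url_path.lstrip("/"))

print(resolve("/restAPI/index.php"))          # /var/www/html/restAPI/index.php  <- what you want
print(resolve("/var/www/restAPI/index.php"))  # /var/www/html/var/www/restAPI/index.php  <- the 404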
Make sense?

Related

How to download files onto personal computer from EC2 without knowing file name?

I want to use Selenium to log into a private database and download some files. I can already do this with a Python script that launches a new Chrome window (via Selenium) and automatically downloads the files I need locally.
When the Python script is run, it launches a Google Chrome window, and Selenium does some automatic clicking in it to download the files.
Now, I want to deploy my code to a web application so that I have a website online for others to use. I have my script on an Amazon EC2 instance and I call/invoke my script whenever a user on my website clicks a button. However, the files are downloaded onto the EC2 instance. I need these files to be downloaded on the person's personal computer after he clicks my button on my website.
Is there a way to achieve this, for example by redirecting the downloads? The file names are not known ahead of time.
In summary, I have a script (which downloads files) on EC2 that is invoked when my button on my website is clicked. But I need the downloaded files to go onto the user's computer, not the EC2 instance/terminal.
Thank you in advance!
However, the files are downloaded onto the EC2 instance. I need these files to be downloaded on the person's personal computer
No, there is no way to redirect downloads from an EC2 instance, in the same way that you can't redirect downloads from any other machine outside of AWS.
When you download a file, you download it to the machine that requests the download.
Perhaps try returning the download URL to the UI in some way and triggering the download from the user's machine (if the URL does not need credentials). Or download the file on the instance, re-upload it to S3, and create a pre-signed URL that you can return to the UI.
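For the S3 route, a rough sketch in Python with boto3 (the bucket name, key, and local path below are made-up placeholders, and it assumes the instance has credentials allowing s3:PutObject and s3:GetObject on that bucket):

import boto3

# After Selenium has saved the file on the EC2 instance, upload it to S3
# and hand the browser a time-limited link.
s3 = boto3.client("s3")
bucket = "example-download-bucket"          # placeholder bucket name
key = "downloads/some-report.pdf"           # placeholder object key

s3.upload_file("/tmp/some-report.pdf", bucket, key)

url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": bucket, "Key": key},
    ExpiresIn=3600,  # link stays valid for one hour
)
# Return `url` in your HTTP response; the front end can then redirect to it
# (or set it as a link's href) to trigger the download on the user's machine.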

Unable to use gatsby website in offline mode

I've built a Gatsby website, but when I try to use it offline (by directly loading index.html into my browser), it fails to load the files in the assets folder, and links to other pages fail.
Running on Windows:
After installing Gatsby, I did the following:
gatsby new sample
cd sample
gatsby build
Then I went to File Explorer, opened the sample/dist directory, and double-clicked index.html (Chrome is my default browser, but IE behaves the same).
The result is a half-loaded web page that is missing the style sheets, JavaScript, and images, and the links are broken.
For instance, the "about" link on the first page points to "D:/about" instead of ".\about.html".
Is there any way to make Gatsby create a truly offline website?
I've built a Gatsby website, but when I try to use it offline (by directly loading index.html into my browser), it fails to load the files in the assets folder, and links to other pages fail.
Gatsby will create a React app for you when it is built, and because most React apps use client-side routing, your links won't work with file:// URLs.
After installing the Gatsby CLI and building your site with gatsby build, you should run gatsby serve, which serves up index.html with a static file server on your own machine.
See a similar answer about create-react-app here
Try using gatsby serve from the root of your project. gatsby serve spins up a web server to serve your production build.
Look it up in the Gatsby CLI docs on their site.
Gatsby isn't really set up to do that, unfortunately. It's a site generator, not a page generator, and it expects to live on a server. The result is that while the files are static, the navigation isn't.
Because I spent some time experimenting, this is what DOESN'T work:
Setting . as the pathPrefix in gatsby-config.js. Gatsby lets you set a path prefix, which it then prepends to all generated URLs. Unfortunately, this prefix always gets "absolutized" somehow; . gets converted to /., for example.
Setting the absolute path of the file on disk as pathPrefix. Same as above - file:///path/to/file doesn't survive the build (results in file:/) and /path/to/file breaks the JavaScript.
Setting the pathPrefix to a bogus value like /NOTAPREFIX and search-replacing it in the generated files. Breaks the JavaScript, again.
Something I haven't tried, but which might get you somewhere, would be to disable the Single Page App functionality. It's reportedly possible (or maybe with this plugin?), but there are no good step-by-step instructions anywhere.

Writing files to AWS Elastic Beanstalk

I have developed a web application with a Node.js backend on a local machine using the WebStorm IDE. Among other things, the application creates a new unique folder and writes 7 JavaScript (.js) files to this folder at runtime each time a user requests a new account. Everything works properly in the local development environment.
When the application is uploaded and deployed to AWS Elastic Beanstalk, and a new user is requested through the application web page, I receive the following 404 (Not Found) error in the browser's Developer JavaScript Console. The same error appears for each of the 7 files. The number 1541877962401 is the unique folder name generated by the application when a user requests a new account, and user.js is one of the 7 JavaScript files copied to this newly created folder. savedUser is an existing folder in the file structure and is not created during runtime.
GET http://sowtest082-env.stsvxa672t.us-east-1.elasticbeanstalk.com/savedUser/1541877962401/user.js net::ERR_ABORTED 404 (Not Found)
I am guessing that the application does not have the correct permissions to create the folder and/or files in AWS?
Within the context of the Elastic Beanstalk environment, what is the best method to create a folder and copy JavaScript files to it during runtime? It is fairly easy to restructure the location of the folder and the files within it in the development environment so as to match the AWS EB environment. Re-engineering the application to store the contents of these files in an AWS MySQL database engine would take a lot more work.
By the way, these 7 files are not tmp files and they are not config files...
Thanks...

Django - Access and save files to remote server

I am currently developing an application using Django.
What I'm trying to achieve is to have a remote server that will host configuration files. Those files are going to be numerous but quite small.
The configuration of my setup is the following: at the address 172.x.x.51 I have my Django app running with uwsgi, and at 172.x.x.52 I have my nginx service connected to my uwsgi instance.
What I would like is to host the files on the nginx server.
Inside the application, I will need to access the files and save them (they are calculated from data in the database, so there's no need for a file upload).
I looked at the documentation and found that I can use a Custom Storage System. The thing is, I don't think that's what I need, because I want to store the files the way it's done by default; I would just like to define the place where Django should write them.
Would it be better if I stored them in the media folder on my nginx instance? How would I tell Django to look for the files on the nginx instance? On the server where nginx is hosted, I already host my static files and that works.
This isn't really a question about Django. Storage backends are for file uploads, but as you say, you're not doing that.
You need some way of allowing your Django instance on *.51 to write to your nginx server on *.52. This might be via SSH/SCP, or by sharing directories over NFS, for example. Then you can simply save the files over that protocol to the relevant place, from where nginx can serve them.
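As one possibility, here is a minimal sketch of the SSH route using Python's paramiko; the host, username, key path, and target directory are placeholders, and it assumes key-based SSH access from the Django box to the nginx box is already set up and that the target directory is the one nginx serves as /media/:

import io
import paramiko

def save_to_nginx_host(filename, content):
    # Placeholder host, user, and key; adjust to your own setup.
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect("172.x.x.52", username="deploy",
                key_filename="/home/django/.ssh/id_rsa")
    try:
        sftp = ssh.open_sftp()
        # Write the generated content straight into the directory nginx serves.
        sftp.putfo(io.BytesIO(content.encode("utf-8")),
                   "/var/www/media/configs/" + filename)
        sftp.close()
    finally:
        ssh.close()

You would call this from the view or task that generates the configuration, instead of saving locally on the Django host.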

Upload php application to amazon?

I have a .php application that I would like to be able to run from any browser.
This application sends push notifications to mobile phones using SSL.
I have a website on Amazon, and I have changed the index.html file to my file index.php (I did that in my website bucket).
Now it's not working, and I guess that's because a .php application needs to use another service.
My question is: which Amazon service do I have to use to be able to upload and run my .php file as quickly as possible? (The file uses a certificate .mem file.)
I could see you have a PHP SDK, and I can figure out if I need that. There are lots of guides there.
AWS S3 is for static files only. It will serve your PHP as if it were simply a file; it does not execute programs.
Your choices for PHP on AWS are to use Elastic Beanstalk or to configure your own EC2 instance.
There are many nuances to this, especially with the EC2 approach.