Website deployed on AWS EC2 using FileZilla isn't showing live changes

My organization is using AWS and I connected to it using FileZilla.
Server folder path:
Source
-> Website Folder
---> Client
------> All the files from create-react-app are visible here
Specifically: build, node_modules, public, src, and other files including a Makefile
On my local computer I ran npm install to pull in the dependencies, then made some changes, and when I finished I ran "npm run build" and copied all the folders over, including the new build folder.
I dragged and dropped them into the FileZilla window.
Now when I check the live website, it still shows the old content even though it is updated in my local build. I don't understand.
Is there another way to deploy this properly? I don't see why I can't drag and drop the source files over. These are just static changes.
Things I tried:
Clearing the cache
Using a different browser
Incognito mode
Reconnecting to the server via FileZilla

I figured it out. It only took forever.
The solution was to connect with an SSH client and transfer the files over directly.
I used PuTTY to do this.
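For anyone who prefers the command line, a minimal sketch of the same idea with OpenSSH (the key path, host, and web root below are assumptions, so substitute your own):

# Copy the fresh build to the instance over SSH (PuTTY users can use pscp the same way):
scp -r -i ~/.ssh/my-key.pem build ec2-user@ec2-12-34-56-78.compute-1.amazonaws.com:/var/www/html/
# rsync only transfers files that changed, which helps avoid stale or partial uploads:
rsync -avz -e "ssh -i ~/.ssh/my-key.pem" build/ ec2-user@ec2-12-34-56-78.compute-1.amazonaws.com:/var/www/html/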

Related

Nexus 3.5.1 proxies nothing but Maven metadata files from a snapshot repo

I have upgraded the Nexus repository from 2.x to 3.x along the following path:
2.4.14 -> 3.4.0 -> 3.5.1
All Nexus services run in Docker with the data directory mapped from the host. For all services I use the default sonatype/nexus or sonatype/nexus3 containers. The Nexus web interface sits behind nginx with simple reverse proxying.
I use the Nexus service with the boot-cj tool (with no credentials), which manages dependencies the same way as Maven. The tool first downloads nexus-maven.xml with the relevant SHA-1 files and then tries to download the JARs. This worked fine with every 2.x version I had.
I created a proxy repository against the remote sonatype-snapshots repo. When I start compilation I get a "Could not find artifact" error. I found that the metadata files are cached, but none of the POMs and JARs are.
I have tried to fix it by cleaning the cache with the clean_cache file trick and, more crudely, with rm -rfv /srv/nexus3/nexus-data/cache/*, with no success. There are no logs about any error. I have also checked manually that the required artifact exists in the remote repository. The more obvious Rebuild index button gave no solution either. I do not think it is a problem with nginx, but who knows? Leaving it overnight to run the scheduled tasks did not help either.
The expected artifact is org.eclipse.rdf4j:rdf4j:pom:2.3-20170901.145510-11.
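One way to narrow down where the request dies (a diagnostic sketch, with the Nexus host and repository name as assumptions) is to request the same POM from the proxy and from the upstream directly:

# HEAD request against the Nexus proxy; a 404 here with a 200 upstream points at the proxy, not nginx:
curl -I https://nexus.example.com/repository/sonatype-snapshots/org/eclipse/rdf4j/rdf4j/2.3-SNAPSHOT/rdf4j-2.3-20170901.145510-11.pom
# The same artifact straight from the upstream snapshots repository:
curl -I https://oss.sonatype.org/content/repositories/snapshots/org/eclipse/rdf4j/rdf4j/2.3-SNAPSHOT/rdf4j-2.3-20170901.145510-11.pom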

How can I deploy my Cloud Code to AWS Elastic Beanstalk? (Parse Server)

I am struggling with how to upload the Cloud Code files I had on Parse.com to my Parse Server hosted on AWS EB.
So far I have:
A Parse Server hosted on AWS EB. To host it on AWS I used the orange Deploy button, which makes everything easier by sparing you from installing Parse Server locally and uploading it to AWS afterwards.
An iOS app written in Objective-C, connected to the Parse Server and working perfectly.
Parse Dashboard running locally on my Mac, connected to the Parse Server on AWS.
The only thing I still need is to upload all my Cloud Code files to the Parse Server. How can I do this? I have searched Google, Stack Overflow, etc. without success. There is some information, but it's unclear. Thanks in advance.
Finally, thanks to Ran Hassid, I now have a fully functional Parse Server on AWS with Cloud Code. For those in the same situation I was in, here is the answer to my question:
1. Go to this link and follow all the steps. (When I asked the question, the information at that AWS link wasn't as clear as it is now; they have since improved the explanations.)
2. After you finish all the steps from the link, you will have a working Parse Server on AWS.
3. Now the Cloud Code part. Create a folder on your Mac or PC wherever you like, say on the desktop, and call it Parse Server AWS (you can call it whatever you want).
4. Install the EB CLI, the command-line interface used from Terminal (on Mac) or the equivalent on Windows to work with the Parse Server you just set up on AWS (similar to Cloud Code with the Parse CLI). The easy way to install it is to run:
brew install awsebcli
5. Open Terminal on your Mac (or the equivalent on Windows) and go to the folder you just created in step 3.
6. Run the next command. It will ask you to select the location of your Parse Server, and then its name.
eb init
7. Now run this command. It will download all the files of your Parse Server from AWS into the folder you are in.
eb labs download
8. Finally, you will have a folder called cloud where you can put all your Cloud Code files.
9. When you finish, just run:
eb deploy
10. Now you have your Parse Server with all your Cloud Code files working on AWS.
11. Whenever you need to change your Cloud Code files, just edit the local files inside the folder created in step 3 and run the command from step 9 again, exactly as you used to do with the parse deploy command (see the sketch after this list).
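To make that update cycle concrete, a minimal sketch (the folder name is just the example from step 3):

cd ~/Desktop/"Parse Server AWS"   # the folder created in step 3
# ...edit your files under cloud/ ...
eb deploy                         # push the change to Elastic Beanstalk (step 9)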
Hopefully this information helps many people as much as it helped me.
Happy coding!
parse-server Cloud Code is a bit different from Parse.com Cloud Code. On Parse.com we use the Parse CLI to modify and deploy our Cloud Code (parse deploy ...). In parse-server, your Cloud Code lives under the following path of your Parse project: ./cloud/main.js. So your Cloud Code entry point is the main.js file, which by default is located under the cloud folder of your Parse project. If you really want, you can change this path, but to keep it simple use the default location.
Now about deployment: in parse-server you need to redeploy your Parse Server whenever you modify your Cloud Code. Another option is to edit your Cloud Code remotely, but in my view it's better to redeploy.
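As an illustration of that default layout (the function below is hypothetical, written in the pre-3.0 parse-server callback style that matches this era):

# Create the default Cloud Code entry point and redeploy:
mkdir -p cloud
cat > cloud/main.js <<'EOF'
// Minimal cloud function; a client calls it with Parse.Cloud.run("hello").
Parse.Cloud.define("hello", function(request, response) {
  response.success("Hello from Cloud Code");
});
EOF
eb deploy   # redeploy so parse-server reloads ./cloud/main.js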

Rebuild files in Ember CLI without running the server

I am planning to move from plain EmberJS to Ember CLI, but I have a small problem. Is it possible to run only the file watcher instead of ember serve, which also runs a local server? Since I run my PHP backend on Google App Engine, I already have a local Python HTTP server running on localhost:8080; I do not need another one running on localhost:4200.
If I don't run ember serve, my local changes in the development environment won't get rebuilt. Is there a better way of doing this? Is it possible to use the assets in the app folder when running in the development environment, and use the dist folder for staging/live environments?
As mentioned in the guide, you can use the build command with the --watch flag.
ember build --watch
That will keep rebuilding your changes but not actually run the server.
As for your second question:
Is it possible to use assets in the app folder when running in development environment? and use dist folder for staging/live environments?
I don't believe so. You can change the output-path property in your .ember-cli config file, but you can't have one that's specific to a certain environment. You could always write a quick script to move the files though. :)
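For instance, ember build also accepts an --output-path flag, so one such approach (the paths here are assumptions) is to build each environment into its own directory:

# Rebuild on every change into the docroot the existing server on :8080 already serves:
ember build --watch --environment=development --output-path=../backend/static/
# One-off production build into dist/ for staging/live:
ember build --environment=production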

Deployment with WebStorm (JetBrains) sends only one file

I configured the deployment settings in WebStorm to point to a location on my VPS. When I try to deploy, WebStorm sends only one file: index.html.
I want to send all files (or just the recently updated ones, as with version control).
I've checked the configuration but can't find any setting that lets me choose which files to send.
How do I send all files to the VPS?
Select the files and/or folders you want to deploy in the Project Files tool window (Alt+1). Right-click and select Deployment -> Sync with Deployed to .... This checks the files and folders for differences against the remote host. Continuing through this dialog will upload (and download) the latest changes.
Be wary, this might take a while over FTP when processing large projects.

Joomla Backup to run Locally

Can a downloaded backup of a Joomla website be run on localhost? If yes, what changes do I need to make locally?
I placed the unzipped backup folder under C:\xampp\htdocs\joomla. Now when I try to open http://localhost/joomla it gives me the following error.
Database Error: Unable to connect to the Database: Could not connect to MySQL.
Please note that I have not made any changes to the backup; I just placed the unzipped backup, downloaded via FTP, into htdocs.
You will also need to download a copy of the website's database and install it locally. If this is new to you, check out Akeeba Backup; it will simplify your life when moving sites between different servers. It bundles the database and site files into a nice, neat zipped package for you.
Chances are the database settings are incorrect. Unless you have created the exact same database name and user on localhost, you'll need to edit configuration.php to straighten that out.
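In practice that means importing the database dump and pointing configuration.php at the local copy. A rough sketch, assuming a dump file shipped with the backup and XAMPP's default passwordless root user:

# Create a local database and import the dump that came with the backup:
mysql -u root -e "CREATE DATABASE joomla_local"
mysql -u root joomla_local < joomla-backup.sql
# Then edit C:\xampp\htdocs\joomla\configuration.php to match, for example:
#   public $host = 'localhost';
#   public $user = 'root';
#   public $password = '';
#   public $db = 'joomla_local';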