How to split Git Client/Server for updating/uploading on AWS

I have a project with two folders: Server and Client.
When I push/commit to Git, both folders are updated.
Now I want to upload my server to AWS and build a pipeline. But with the Git connection as it is, both Server and Client get uploaded. How can I change this so that only the Server folder is uploaded?
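One common approach (not from the question; the branch and remote names below are examples) is to publish a branch that contains only the Server folder using git subtree split, and point the AWS pipeline's source stage at that branch:

```bash
# Create a branch whose root is the contents of the Server folder.
# "server-only" and "origin" are example names, not part of the original setup.
git subtree split --prefix=Server -b server-only
git push origin server-only
```

Alternatively, if the pipeline uses CodeBuild, the buildspec can cd into Server and declare only that folder as the build artifact, so the Client folder never reaches the deploy stage.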

Related

Website deployed on AWS EC2 Using FileZilla isn't showing live changes

My organization is using AWS and I connected to it using FileZilla.
Server folder path:
Source
-> Website Folder
---> Client
------> All the create-react-app files are visible here, specifically:
build, node_modules, public, src, and other files including a Makefile
On my local computer I ran npm install to get the latest dependencies, then made some changes, and when I finished I ran npm run build and copied over all folders, including the new build folder.
I used drag and drop into the FileZilla window.
Now when I check the live website for changes, it still shows the old information, even though it's updated in my local build. I don't understand.
Is there another way to deploy this properly? I don't see why I can't drag and drop the source files over; these are just static changes.
Things I tried:
Clearing the cache
Using a different browser
Incognito mode
Reconnecting to the server via FileZilla
I figured it out. It only took forever.
The solution was to connect with an SSH client and transfer the files over directly.
I used PuTTY to do this.
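For reference, the same "build locally, then push over SSH" step can also be done with scp; the key file, hostname, and target path below are placeholders rather than the asker's actual values:

```bash
# Build the React app locally, then copy the output to the EC2 instance over SSH.
# Key file, host, and remote path are hypothetical examples.
npm run build
scp -i ~/.ssh/my-key.pem -r build/* ec2-user@<ec2-public-dns>:/path/to/Website/Client/build/
```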

AWS Amplify - How to Export and Download Video Files like in XAMPP/Apache

I have a web app that lets me export videos in MP4 format. I can set this up locally using the XAMPP Control Panel: open it as Administrator, install Apache, and finally place the local files in the htdocs folder. How can I set this up from my GitHub repo on AWS Amplify? Or is it easier to use Heroku?
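If the exported videos are just static files, Amplify Hosting serves whatever the connected GitHub repo's build step publishes as artifacts, much like files dropped into htdocs. A minimal amplify.yml sketch, assuming the MP4 files sit in a public/ folder (a placeholder name) and no build step is needed:

```yaml
# amplify.yml sketch; "public" as the output directory is an assumption.
version: 1
frontend:
  phases:
    build:
      commands:
        - echo "no build step, publishing static files as-is"
  artifacts:
    baseDirectory: public
    files:
      - '**/*'
```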

I want to configure AWS S3 on my django project but many of my client's media files are on my pythonanywhere server already

I want to integrate AWS S3 into my Django project, and I have integrated it successfully, but the problem is that many of my client's media files are already on the PythonAnywhere server. Now that I have switched to AWS S3, upcoming media files should be stored in S3. What happens now is that new files are stored correctly and I can open them, but when I open the previous files, Django (going by the media file URL) looks for them in AWS S3, where they don't exist. So I need Django to look for media files where they are actually stored: in the server directory for the previous files and in S3 for the new ones.
Hope that makes sense.
This is my AWS S3 configuration file for my Django project.
This is my Django settings file, mainly the parts that deal with media files.
If you want to move the static files that were uploaded before you configured AWS S3 with Django, you can run this command on your command line: python manage.py collectstatic.
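For context, a minimal django-storages setup that sends new media uploads to S3 usually looks like the sketch below; the bucket name and region are placeholders, not values from the asker's configuration:

```python
# settings.py sketch (assumes django-storages and boto3 are installed;
# the bucket and region are placeholders, not the asker's real config).
INSTALLED_APPS = [
    # ... the usual Django apps ...
    "storages",
]

AWS_STORAGE_BUCKET_NAME = "example-bucket"
AWS_S3_REGION_NAME = "eu-west-1"

# New uploads are written to S3; files uploaded before this change stay on the
# old server's disk, which is why their URLs no longer resolve once MEDIA_URL
# points at the bucket. (On Django 4.2+ the STORAGES setting supersedes
# DEFAULT_FILE_STORAGE.)
DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
MEDIA_URL = f"https://{AWS_STORAGE_BUCKET_NAME}.s3.amazonaws.com/"
```

One option for the older files is to copy them from the PythonAnywhere media directory into the same bucket (for example with aws s3 sync), so the existing database paths resolve against S3 as well.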

How to download updated sqlite Database from AWS EB? (Django app)

I have deployed my Django app on AWS Elastic Beanstalk, using the SQLite database. The site has been running for a while.
Now I have made some changes to the website and want to deploy again, while keeping the changes the database has gone through in the meantime (from form inputs).
But I am not able to find a way to download the updated database or any files. The only version of the app I am able to download is the one I uploaded when I first deployed.
How can I download the currently running app version and deploy again with the same database?
You can use scp to connect to the remote server and download all your files.
As commented, it is better to use dumpdata and then transfer that file (instead of the database or the full site).
However, if the Python version on the remote system is not the same as the one your Django site uses, dumpdata may not work (this was my case), so I downloaded the whole site folder back to my local machine and ran dumpdata there.
On AWS Elastic Beanstalk, a Python app's files live in /opt/python/current/app.
Just use scp -r to download the whole site locally.
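Concretely, that download-then-dump approach might look like this; the key file and hostname are placeholders:

```bash
# Copy the deployed app (including its SQLite file) from the EB EC2 instance.
# The key file and host below are hypothetical examples.
scp -r -i ~/.ssh/eb-key.pem ec2-user@<instance-public-dns>:/opt/python/current/app ./app-backup

# Then export the data with Django from the downloaded copy.
cd app-backup
python manage.py dumpdata > data.json
```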
I believe your local SQLite db file is being packaged during eb deploy. If that is the case, then all you need to do is not include the db.sqlite file in your deployments (see the sketch below).
Move the db.sqlite file to S3 or an EBS persistent volume so that it persists across deployments.
Use .ebextensions to run the database migration in AWS on top of that persistent db file, just the way Django recommends.
To be on the safe side, you can SSH into your EB environment (which is an EC2 instance) and download the db.sqlite file first, just in case.
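A minimal way to keep the local SQLite file out of the deployment bundle, assuming the EB CLI is used, is an .ebignore file at the project root (the filename db.sqlite3 is an example; match whatever your settings point to):

```
# .ebignore -- files listed here are excluded from "eb deploy" bundles
db.sqlite3
```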

Deployment with WebStorm (JetBrains) - sends only one file

I configured the deployment settings in WebStorm to point to a location on my VPS. When I try to deploy to the VPS, WebStorm sends only one file: index.html.
I want to send all files (or only recently updated ones, as in version control).
I've checked the configuration but can't find any setting that lets me choose which files to send.
How do I send all files to the VPS?
Select the files and/or folders you want to deploy in Project Files (Alt+1). Right-click and select Deployment -> Sync with Deployed to .... This checks the files and folders for differences against the remote host; continuing with this dialog will upload (and download) the latest changes.
Be aware that this might take a while over FTP on large projects.