How to copy files from a local machine to a DigitalOcean droplet? - digital-ocean

I have a folder that contains many files, and I want to copy it to a DigitalOcean droplet from my local Ubuntu terminal.

You can zip all the files into a single archive to make the process easier, then use SCP:
scp files.zip remote_username@IP-ADDRESS:/remote/directory/tocopy
I hope this is just a copy process and not a deployment; SCP is not recommended for deployments.
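A minimal end-to-end sketch of that approach, assuming the folder is called myfolder and that /remote/directory/tocopy already exists on the droplet (both names are placeholders):
zip -r files.zip myfolder
scp files.zip remote_username@IP-ADDRESS:/remote/directory/tocopy
ssh remote_username@IP-ADDRESS "cd /remote/directory/tocopy && unzip files.zip"
Alternatively, scp -r myfolder remote_username@IP-ADDRESS:/remote/directory/tocopy copies the folder as-is without zipping it first.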

Related

Is it possible to download file or folders from AWS Linux EC2 instance to local Windows 10 machine?

I have a Linux EC2 instance on AWS, and my local machine runs Windows 10 (64-bit).
I want to download some files or folders from the EC2 instance to my local Windows machine.
I am not sure whether this is possible; if yes, how can I do it?
Thanks.
I tried this and it worked for me.
Download MobaXterm (https://mobaxterm.mobatek.net/); it's an enhanced terminal for Windows.
You can link your EC2 instance directly via SSH; it's pretty simple to set up. Just follow the instructions they've given. Once linked, it's super easy to export, import, and create files and folders, all via MobaXterm.
Here is the command to copy a folder from the AWS Linux machine to Windows.
First you need to install PuTTY (putty-64bit-0.74-installer.msi) on your Windows machine.
The command is as follows; it will copy the folder (e.g. DokerAutomationResult) from the AWS Linux machine to the Windows machine:
pscp -r ubuntu@xx.xxx.xx.xx:/home/ubuntu/DokerAutomationResult ./
[pscp -r ubuntu@(ipAddress):(locationOfLinuxFolder) (locationToCopyInWindows)]
For better understanding:
https://www.youtube.com/watch?v=Sc0f-sxDJy0&ab_channel=Liv4IT
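For the opposite direction, pushing a folder from Windows up to the Linux instance, a hedged sketch (the local path below is a placeholder of mine):
pscp -r C:\myfolder ubuntu@xx.xxx.xx.xx:/home/ubuntu/
Add -i C:\path\to\key.ppk if your instance requires a PuTTY-format private key.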
Yes, it is possible to download files from an EC2 Linux instance to your local system.
You can use scp -i key user@ip-address:/path/to/file . to download the file you want.
The trailing . will download the file into your current directory on the local system.
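A minimal sketch of that command, assuming a key file mykey.pem and a remote file at /home/ubuntu/results.txt (both names are placeholders):
scp -i mykey.pem ubuntu@xx.xxx.xx.xx:/home/ubuntu/results.txt .
Add -r after scp to download a whole folder instead of a single file.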

Google Cloud: how to copy files from VM Instance to computer desktop?

On SSH, I've tried using gsutil -m cp filename* Desktop to copy a file from the VM Instance to my computer desktop, like Google Cloud's own example in its documentation. I got a message saying that the file was copied successfully, but no mention of downloading anything, and I don't see the relevant file on my desktop. I've tried specifying the full desktop address instead of just 'Desktop', but SSH does not recognize the address.
Is there a way I can directly download files from the VM Instance to my desktop without having to go through a Google Cloud bucket?
In my opinion this is the easiest way: download directly to the local machine in a few steps.
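One way to copy directly to the local machine (a hedged suggestion; not necessarily the method this answer had in mind) is to run gcloud compute scp from your local terminal rather than inside the browser SSH session; the instance name, zone, and file name below are placeholders:
gcloud compute scp my-instance:~/filename.txt ~/Desktop/ --zone=us-central1-a
This copies the file straight to your desktop without going through a Cloud Storage bucket.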

Is there a way to open and modify a python file in a virtual machine of google cloud platform?

I just started to use the cloud to do training for my deep learning program. For now, every time I modify my local .py file, I have to remove the old one on the remote virtual machine in Google Cloud Platform and upload the new one. I am just curious whether there is a way that I can actually open the .py file on the remote virtual machine through the command line? That would be much more efficient.
Thank you very much!
To edit a file on a machine you can SSH into, there are many potential solutions:
Use scp to copy files, e.g. scp mylocalfile my-user@my-host-address:/myfolder (a concrete editing sketch follows this list).
Use SSH mounting solutions: How do you edit files over SSH?
Edit using nano your-file-to-edit (my favorite): https://www.howtogeek.com/howto/42980/the-beginners-guide-to-nano-the-linux-command-line-text-editor/
Edit using vi or vim: http://vim.wikia.com/wiki/Tutorial
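One concrete way to edit the remote .py file in place without copying it back and forth (an addition of mine; the user, host, and path are placeholders) is vim's built-in netrw support for scp URLs, run from your local machine:
vim scp://my-user@my-host-address//home/my-user/train.py
Note the double slash, which makes the remote path absolute; vim fetches the file over SSH, lets you edit it, and writes it back when you save.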

Clone a Cloud9 workspace into an SSH enabled setup

My question concerns migrating a web app built in Flask to a Droplet.
I have built the app in a private workspace, using the Flask template in C9. Now that it's been developed, I want to set up the app on a DigitalOcean droplet. Connecting a brand new C9 SSH workspace to a droplet is fine; however, the SSH workspace by default does not have many of the dependencies (MySQL, Flask), and this is a major pain. I'm using a droplet with Ubuntu NodeJS 6.9.5 on 14.04.
Is there a way to have my existing private dev workspace cloned to an SSH workspace? It would be even better to have a single workspace maintained that syncs to the other, should I wish to take down the droplet for any reason.
Thanks.
Why not just image the disk (a digital copy of the working C9 machine) to your new workspace, then use rsync for incremental syncs? To pull a raw image of the droplet's disk over SSH:
ssh username@your-droplet-IP "dd if=/dev/sda bs=512 conv=noerror,sync" > myworking-droplet.iso
Now that you have a full copy of the working droplet as an image (.iso) file,
you can copy it to any hard disk of your choice and boot into it.
Image your Cloud9 into USB
If your C9 image is smaller than 30 GB, you can copy it onto a USB drive and maintain a locally bootable copy; that is, you can image it onto any USB drive larger than the ISO and make it bootable. To achieve this:
Mount a USB drive bigger than your ISO.
Check your mounted flash drive's location with df -h or lsblk -a.
Image your ISO file to the USB drive with:
dd if=myworking-droplet.iso of=/dev/sdb bs=512k conv=noerror,sync
Once finished, remove your drive, plug it into another PC, change the boot order to the USB drive, and boot into your C9 locally.
Sync Your Cloud9 workspace directory to another box
It gets better: if you just want to sync your C9 workspace directory over an SSH connection to another box, use:
`rsync -azHAXxP -e 'ssh -p22' username@your-droplet-IP:/path/to/your/workspace .`
The above will clone your workspace directory over to your current box.
NOTE: The dot at the end of the command is necessary.
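If you later want to push local changes back up to the droplet (my addition, not part of the original answer), swap the source and destination:
rsync -azHAXxP -e 'ssh -p22' ./workspace/ username@your-droplet-IP:/path/to/your/workspace/
The trailing slashes tell rsync to sync the contents of the directories rather than nesting one inside the other.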
Mount your Cloud9 workspace to your local machine
You can also mount your remote workspace's file system on your local computer, so you can make changes on the fly and treat your droplet as local storage. This lets you work on the same workspace locally, without having to log into your Cloud9 every time you want to make changes or work on your project. To achieve this:
Install SSHFS, if not already installed, with:
sudo apt-get install sshfs
Create a local directory in which to mount the droplet's file system.
sudo mkdir /mnt/mydroplet
Mount your droplet with:
sudo sshfs -o allow_other,defer_permissions \
root@xxx.xxx.xxx.xxx:/ /mnt/mydroplet
If your droplet is configured for login via ssh key authorization, use this:
`sudo sshfs -o allow_other,defer_permissions,IdentityFile=~/.ssh/id_rsa \
root@xxx.xxx.xxx.xxx:/ /mnt/mydroplet`
Now you can work with files on your droplet as if it were a physical device attached to your local machine.
Testing your mount from your local machine:
`cd /mnt/mydroplet`
Create a test file:
touch TestFile.php
Now log into your Cloud9 and verify that you can access and modify the newly created file, i.e. TestFile.php.
Now you can code locally and it immediately appears on your C9, or upload files from your local machine to your Cloud9 by copying them into your locally mounted directory.
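When you are done working (an addition of mine, not part of the original answer), you can detach the mount with:
sudo umount /mnt/mydroplet
or, on Linux, fusermount -u /mnt/mydroplet.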

Sharing a directory from the VM to the host

I use config.vm.synced_folder to sync folders from the host to the VM, but I'd like to sync in the other direction. Is this possible using Vagrant/VirtualBox?
By default, Vagrant uses VirtualBox's vboxsf to sync folders between host and guest.
It is two-way, so if you make changes to the files in /vagrant_data in the guest, it changes the corresponding files in the host's directory. You don't need to configure anything separately for the other direction.
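A quick way to see the two-way behaviour for yourself (a sketch assuming the default /vagrant share is mounted; the file name is a placeholder):
vagrant ssh -c "touch /vagrant/created_in_guest.txt"
ls created_in_guest.txt
The second command runs on the host in the project directory and should list the file that was just created from inside the guest.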
Other options to sync files:
rsync
sshfs
NFS
See more => Synced Folders
If I understand correctly, you're looking to create a shared folder for Vagrant where files are being added from the guest machine and should show up in the host machine?
If that's the case, you're still going to have to create the host folder. I'm afraid Vagrant won't create the directory for you from a config.vm.synced_folder line in your Vagrantfile, but it will work fantastically once the host directory exists.
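A minimal sketch of that setup, assuming your Vagrantfile contains a line such as config.vm.synced_folder "./shared", "/home/vagrant/shared" (both paths are placeholders of mine):
mkdir -p ./shared
vagrant reload
After the reload, files written to /home/vagrant/shared inside the guest appear in ./shared on the host, and vice versa.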