How can we move or copy a document file from one server to another server's S3 bucket? I have the details of both servers. Can we move or copy it using code, for example in PHP?
It appears that you are asking how to upload a file to an Amazon S3 bucket.
The simplest way is to use the AWS Command-Line Interface (CLI):
aws s3 cp foo.txt s3://mybucket/foo.txt
If you wish to use this via PHP, then you can use the AWS SDK for PHP. It has a PutObject() command that adds an object to an Amazon S3 bucket.
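For reference, here is a minimal sketch using the AWS SDK for PHP v3; the bucket name, region, and file path are placeholders you would replace with your own values:

<?php
// Minimal sketch: upload a local file to S3 with the AWS SDK for PHP v3
// (installed via Composer: composer require aws/aws-sdk-php).
// The bucket name, region and file path are placeholders.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',   // region of the destination bucket
    // Credentials are picked up from the environment, ~/.aws/credentials or an IAM role.
]);

$s3->putObject([
    'Bucket'     => 'mybucket',
    'Key'        => 'foo.txt',
    'SourceFile' => '/path/to/foo.txt',   // local file to upload
]);

The SDK reads the same credentials configuration as the CLI, so whatever you set up for aws s3 cp will also work here.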
I have a presigned URL for a file in a vendor's S3 bucket. I want to copy that file into my own bucket. I'd rather not copy it to the machine I'm running the copy from. My thought was to use the CLI s3 sync or cp commands to copy the file from one bucket to another. But those commands require s3:// URLs, not https://.
I tried converting the HTTP URL by replacing "https://bucketname.s3.region.amazonaws.com" with "s3://bucketname", but that gives an Access Denied error with s3 sync and a Bad Request with s3 cp. Is there any way to do this, or do I need to download it locally with HTTP, then upload to my bucket with the CLI?
The problem here is that you need to authenticate against two different accounts: the source to read and the destination to write. If you had access to both, i.e. if the credentials you use to read could also write to your own bucket, you would be able to cut out the middleman.
That's not the case here, so your best bet is to download it first, then authenticate with your own account and put the object there.
Amazon S3 has a built-in CopyObject command that can read from one S3 bucket and write to another without needing to download the data. To use this command, you require credentials that have GetObject permission on the source bucket and PutObject permission on the destination bucket. The credentials themselves can be issued by either the AWS account that owns the source bucket or the AWS account that owns the destination bucket. Thus, you would need to work with the admins who control the 'other' AWS account.
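As a rough sketch, a server-side copy with the AWS SDK for PHP v3 could look like the following; the bucket names, keys, and region are placeholders, and the credentials in use must carry the two permissions described above:

<?php
// Rough sketch: server-side CopyObject with the AWS SDK for PHP v3.
// S3 copies the data internally, so nothing is downloaded to your machine.
// Bucket names, keys and region below are placeholders.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',   // region of the destination bucket
]);

$s3->copyObject([
    'Bucket'     => 'destination-bucket',
    'Key'        => 'path/to/file.txt',
    'CopySource' => 'source-bucket/path/to/file.txt',  // "source-bucket/key"
]);

Either account can issue the credentials, as long as the bucket policies on both sides allow those two actions for them.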
If this is too difficult and your only way of accessing the source object is via a pre-signed URL, then you cannot use the CopyObject command. Instead, you would need to download the source file and then separately upload it to Amazon S3.
I have done a number of searches and can't seem to work out whether this is doable at all.
I have a data logger that has an FTP-push function. The FTP-push function has the following settings:
FTP server
Port
Upload directory
User name
Password
In general, I understand that a FileZilla client (I have a Pro edition) is able to drop files into my AWS S3 bucket, and I have done this successfully on my local PC.
Is it possible to remove the FileZilla client requirement and input my S3 information directly into my data logger? Something like the diagram below:
Data logger ----FTP----> S3 bucket
If not, what will be the most sensible method to have my data logger JSON files drop into AWS S3 via FTP?
Frankly, you'd be better off with:
Logging to local files
Using a schedule to copy the log files to Amazon S3 using the aws s3 sync command
The schedule could be triggered by cron (Linux) or a Scheduled Task (Windows).
Amazon recently added FTP support to AWS Transfer Family. This provides an integration with Amazon S3 via FTP without setting up any additional infrastructure; however, you should review the pricing before committing to it.
As an alternative, you could create an intermediary server that syncs between itself and Amazon S3 using the CLI command aws s3 sync.
I have a workflow need. I have a customer that does not want to deal with the S3 folders where we drop their files; they want us to send the files directly to their SFTP account. When I unload files from my backend, they are automatically written to S3 by AWS services. As this is a one-time request per customer, I don't wish to set up an automated transfer in a Lambda function or bash script, nor do I wish to go through the hassle of copying the file to my local server only to post it to the SFTP site. I would prefer to just right-click the file and choose to transfer it to an SFTP location. Does anyone know if AWS has any plans to add file transfer protocol support (SFTP, FTP, etc.) to the S3 console UI?
What would be even better is if AWS S3 allowed all files dropped into an S3 bucket location to be automatically transferred to a defined SFTP location, for the scenario where the customer never wishes to deal with S3 but we need to use it.
Given the current capabilities of Amazon S3, automating the sending of files from Amazon S3 to an SFTP target would require the use of an AWS Lambda function.
There are a few ways to do this. Since you are looking for the easiest way, I would suggest installing s3fs-fuse on a Linux server; this enables you to mount S3 as a file system. You can mount it directly on the SFTP server and copy the files locally. Below is a URL describing S3 as a file system:
https://cloud.netapp.com/blog/amazon-s3-as-a-file-system
The other method would be to use the AWS CLI to do a recursive copy; this involves installing the AWS CLI and generating API keys. Below is an example of the command:
aws s3 cp s3://mybucket/ ./localdir/ --recursive
You can revoke the API keys once you are done with the transfer!
I am making a cron job on an AWS server, and I have a file-handling function which creates a JSON file. I already have Amazon S3 storage, and I want my JSON file saved inside it. How can I do it? I tried to locate the directory for Amazon S3 using FileZilla but found nothing. Thank you!
You have to put another command into your cron job.
After you create the JSON file, you have to use the AWS CLI to upload it to S3 storage.
Here is how to install it: installation guide.
After it's set up, you can use the aws s3 command to upload the file. Have a look at the S3 upload command documentation for more information.
This is the command you need to add:
aws s3 cp ./yourfile.json s3://your-bucket-name/
I'm using AWS to host a static website. Unfortunately, it's very tedious to upload the directory to S3. Is there any way to streamline the process?
Have you considered using the AWS CLI (AWS Command Line Interface) to interact with AWS services and resources?
Once you install and configure the AWS CLI, all you need to do to update the site is:
aws s3 sync /local/dev/site s3://my-website-bucket
This way you can continue developing the static site locally, and a simple aws s3 sync call will pick up the files which have changed since the last sync and upload them to S3 without any fuss.
To make the newly created objects public (if this is not already done using a bucket policy):
aws s3 sync /local/dev/site s3://my-website-bucket --acl public-read
The best part is that multipart upload is built in. Additionally, you can sync back from S3 to local (the reverse direction).