I am getting a permission denied error while taking a backup using pg_basebackup.
/usr/pgsql-11/bin/pg_basebackup -h127.0.0.1 -U thbbackup -D backup -Ft -z -P
Password:
238546/238575 kB (99%), 1/1 tablespace
pg_basebackup: could not get write-ahead log end position from server: ERROR: could not open file "./.postgresql.conf.swp": Permission denied
pg_basebackup: removing data directory "backup"
You have probably left the file postgresql.conf open in a text editor (vim). If you open the conf file again, the editor should complain that it is already open, so you can just delete the .swp file; it is a temporary file anyway.
"When you edit a file in Vim, you have probably noticed the (temporary) .swp file that gets created. By default it'll be in the same location as the file that you are editing (although you can change this). The swap file contains the info about changes made to the file (or buffer)."
In this case it looks like a swap file from an open editor, or one left behind by a previous session. In general, Postgres needs ownership of all files in the data directory for a pg_basebackup to succeed. I have seen this failure on files with root:root or other ownership residing in the data directory. After running chown postgres:postgres [filename] on the target files, pg_basebackup should be able to run successfully.
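For example, on a stock PostgreSQL 11 layout the fix might look like this (a sketch; the data directory path /var/lib/pgsql/11/data is an assumption, adjust to your cluster):
ls -la /var/lib/pgsql/11/data/   # look for files not owned by postgres:postgres
chown postgres:postgres /var/lib/pgsql/11/data/.postgresql.conf.swp   # give ownership back to postgres
rm /var/lib/pgsql/11/data/.postgresql.conf.swp   # or simply remove the orphaned swap file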
Related
I'm trying to SCP some files from a remote server. The command I'm using is:
scp -r -i ~/path/to/key ubuntu@address:/home/ubuntu/analysis .
I receive an error:
scp: /home/ubuntu/analysis: No such file or directory.
If I scp another file in my home directory, it does work, e.g.
scp -r -i ~/path/to/key ubuntu@address:/home/ubuntu/.viminfo .
If I create a new file, e.g. with touch new_file.txt, I also cannot download that file.
The permissions and owners for .viminfo and the directory analysis are standard.
Why isn't the SCP working? I have been able to download files from this server before, but something has changed.
Quite confusing - any advice would be appreciated!
Thanks!
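One way to narrow this down is to ask the remote side what it actually sees. A diagnostic sketch, assuming the same key and address as above:
ssh -i ~/path/to/key ubuntu@address 'echo $HOME; ls -ld /home/ubuntu/analysis'   # confirm the login home and that the path exists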
I am trying to copy the data from the file store to a Google Cloud bucket.
This is the command I am using:
gsutil rsync -r /fileserver/demo/dir1 gs://corp-bucket
corp-bucket: Name of my bucket
/fileserver/demo/dir1: Mount point directory (this directory contains the data of the file store)
This command works fine the first time: it copies the data from the directory /fileserver/demo/dir1 to the cloud bucket. But when I delete the data from the cloud bucket and run the same command again, without any changes, I get this error:
cannot open path of the current working directory: Permission denied
NOTE: If I make even a small change to a file in /fileserver/demo/dir1 and run the above command, it works fine again. My question is why it does not work without any changes, and whether there is a way to copy the files without modifying them.
Thanks.
You may be hitting limitation #2 of gsutil rsync: "The gsutil rsync command considers only the live object version in the source and destination buckets". You can exclude /dir1... with the -x pattern and still let rsync do the cleanup work as part of the regular sync process.
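The -x flag takes a Python regular expression matched against paths relative to the source; a sketch, with the excluded subpath name purely illustrative:
gsutil rsync -r -x 'some_subdir/.*' /fileserver/demo/dir1 gs://corp-bucket   # sync everything except paths matching the pattern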
Another way to copy those files is to use cp with the -r option to copy recursively, instead of rsync.
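A minimal sketch of that alternative, reusing the paths from the question (the top-level -m flag just parallelizes the copy):
gsutil -m cp -r /fileserver/demo/dir1 gs://corp-bucket   # recursive copy; never deletes anything at the destination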
I am trying to download a copy of my mysql history to keep on my local drive as a safeguard.
Once selected, a dropdown menu appears
And I am prompted to enter the file path for the download
But after trying all the variations I can think of, I keep receiving the following error message:
Download File means that you are downloading a file from the VM to your local computer. Therefore the expected path is a file on the VM.
If instead you want to upload c:\test.txt to your VM, select Upload File and then enter c:\test.txt. The file will be uploaded to your home directory on the VM.
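For the MySQL history from the original question, the value to enter in the Download File prompt is therefore a path on the VM; the mysql client keeps its history in the home directory by default (the user name here is a placeholder):
/home/YOUR_USER/.mysql_history   # default location of the mysql client history file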
I am trying to upload file from my laptop to ec2 instance.
I am trying with this:
$ scp -i ec2_instance_key.pem ~/WebstormProjects/RESTAPI/config.js ec2-user@ec2-xxx.eu-west-x.compute.amazonaws.com:~/data/
When I run the command, the terminal responds with:
scp: /home/ec2-user/data/: Is a directory
But when I run this in the EC2 terminal:
$ cd /home/ec2-user/data
And it responds with "no such file or directory".
And scp copies the file onto my laptop again, into the directory where ec2_instance_key.pem is.
What is the problem?
Thank you very much.
The probable cause is:
There is no directory named data in the home directory of ec2-user,
OR a file named data was mistakenly created in the home directory while doing the scp.
Solution:
Check whether a file named data exists in the home directory of ec2-user.
Move the file ~/data to some other name (if it exists).
Create a directory named data in the home directory of ec2-user.
Give proper access permissions to the newly created data directory (chmod 755 ~/data; a directory needs the execute bit to be traversable).
Try uploading the file again by using the following command:
scp -i ec2_instance_key.pem ~/WebstormProjects/RESTAPI/config.js ec2-user@ec2-xxx.eu-west-x.compute.amazonaws.com:~/data
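Run together on the instance, those checks might look like this (a sketch; the ~/data path comes from the question):
ls -ld ~/data        # is "data" a directory, a stray file, or missing entirely?
mv ~/data ~/data.bak # if it turned out to be a file, move it aside first
mkdir -p ~/data      # create the target directory
chmod 755 ~/data     # the execute bit is what lets scp traverse into it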
I downloaded a .pem file while launching a t2.small instance. When I try to connect via ssh, it says there is no such file in the directory, but I am sure the .pem file is in the directory.
$ ls
mypemfile.pem
$ chmod 400 mypemfile.pem
chmod: cannot access ‘mypemfile.pem’: No such file or directory
$ ssh -i "mypemfile.pem" root@x.x.x.xx
Warning: Identity file mypemfile.pem not accessible: No such file or directory.
Permission denied (publickey).
How can I track down this issue? Any solutions?
Note: I created instance from AMI image shared by another account.
The problem is almost certainly that there is no file named exactly "mypemfile.pem".
Recheck that the file exists; if it does, try renaming it, or make a copy of it and try again with the newly created file.
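A quick way to verify is to let the shell print the exact stored name, since browsers and the Finder sometimes append an extension or hide part of it (a diagnostic sketch):
ls -la                      # list everything, including names that only look like mypemfile.pem
find . -name 'mypemfile*'   # print the exact stored name, e.g. mypemfile.pem.txt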
Hope it helps.
I figured it out on Mac. This is what I had to do: when I created the private key, my Mac saved it as whatever.pem.txt, so in order to connect to the AWS instance, just add the .txt extension to the filename in whatever the AWS instructions tell you to run. For example:
chmod 400 yourfile.pem.txt
ssh -i "yourfile.pem.txt" ubuntu@ecX-XX-XX-XXX-XXX.compute-1.amazonaws.com
This is for Mac users.
Best,