Upload image at OpenCart

move_uploaded_file(): Unable to move '/tmp/phptOjhzD' to '/var/www/html/garageluxe/upload/image/catalog/demo/manufacturer/GLB6147_1.jpg' in /var/www/html/garageluxe/upload/admin/controller/common/filemanager.php on line 277
This error appeared in the error log.

1- Check the image folder and sub-folder permissions; for testing, set them to 777 (revert to a safer value such as 755 afterwards)
2- Check the DIR_IMAGE path in the config.php file
3- Check the tmp folder and remove old data from it
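The three checks above can be sketched as a quick diagnostic. This is a hypothetical helper (not part of OpenCart), it assumes a Python interpreter on the server, and the garageluxe path is copied from the error message above; adjust it to your install.

```python
import os
import tempfile

def diagnose_upload_target(image_dir):
    """Return a list of problems that would make move_uploaded_file() fail."""
    problems = []
    if not os.path.isdir(image_dir):
        # Step 2: a wrong DIR_IMAGE in config.php shows up as a missing directory
        problems.append("%s does not exist; check DIR_IMAGE in config.php" % image_dir)
    elif not os.access(image_dir, os.W_OK | os.X_OK):
        # Step 1: the web server user cannot create files here
        problems.append("%s is not writable by this process" % image_dir)
    if not os.access(tempfile.gettempdir(), os.W_OK):
        # Step 3: PHP cannot stage uploads if the tmp directory is unwritable
        problems.append("the tmp directory is not writable")
    return problems

# Prints [] when every check passes on this host
print(diagnose_upload_target("/var/www/html/garageluxe/upload/image"))
```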

could not open file "./.postgresql.conf.swp": Permission denied

I am getting a permission denied error while taking a backup using pg_basebackup.
/usr/pgsql-11/bin/pg_basebackup -h127.0.0.1 -U thbbackup -D backup -Ft -z -P
Password:
238546/238575 kB (99%), 1/1 tablespace
pg_basebackup: could not get write-ahead log end position from server: ERROR: could not open file "./.postgresql.conf.swp": Permission denied
pg_basebackup: removing data directory "backup"
You have probably left the file postgresql.conf open in a text editor (vim). If you open the conf file again, the editor will complain that it is already open. You can simply delete the .swp file; it is only a temporary file anyway.
"When you edit a file in Vim, you have probably noticed the (temporary) .swp file that gets created. By default it'll be in the same location as the file that you are editing (although you can change this). The swap file contains the info about changes made to the file (or buffer)."
In this case it looks like a swap file from an open editor, or one previously orphaned. In general, Postgres needs ownership of all files in the data directory for pg_basebackup. I have seen this failure on files with root:root or other ownership residing in the data directory. After running chown postgres:postgres [filename] on the offending files, pg_basebackup should run successfully.
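The ownership rule described above can be sketched as a small audit script. files_not_owned_by is a hypothetical helper, not a Postgres tool, and it assumes a Unix host (it uses the pwd module).

```python
import os
import pwd

def files_not_owned_by(data_dir, user):
    """Walk data_dir and list files whose owner differs from `user`.

    Every path returned is a candidate for
    `chown postgres:postgres <path>` before re-running pg_basebackup.
    """
    uid = pwd.getpwnam(user).pw_uid
    offenders = []
    for root, _dirs, files in os.walk(data_dir):
        for name in files:
            path = os.path.join(root, name)
            if os.stat(path).st_uid != uid:
                offenders.append(path)
    return offenders
```

Run against the data directory, for example files_not_owned_by('/var/lib/pgsql/11/data', 'postgres') (the path assumes the default PGDG layout), it would have flagged the stray .postgresql.conf.swp file.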

Download File - Google Cloud VM

I am trying to download a copy of my mysql history to keep on my local drive as a safeguard. When I select Download File, a dropdown menu appears and I am prompted to enter the file path for the download. But after all the variations I can think of, I keep receiving an error message.
Download File means that you are downloading a file from the VM to your local computer, so the expected path is a file on the VM.
If instead you want to upload c:\test.txt to your VM, select Upload File and then enter c:\test.txt. The file will be uploaded to your home directory on the VM.

Python: permission denied when extracting tar.gz

I would like to extract all the tar.gz files inside a folder, but I am getting [Errno 13] Permission denied. I have been through different posts related to the problem but nothing helps. Even extracting a specific member of the tar.gz gives the same error. Can someone help with what could be wrong?
import tarfile

# Use a context manager so the archive is closed even if extraction fails
with tarfile.open(fname, "r:gz") as tar:
    tar.extractall()
Are you running this as a local user? Is this running on Unix/Linux? Does the account running the python script have the appropriate rights to the folder you are attempting to write to?
This happens when you do not have permission to write to the tmp folder on your system.
Make a temporary directory in your home directory:
mkdir tmp_local
Then point the tmp directory at your local folder with the following command:
export TMPDIR='/local_home/ah32097/tmp_local'
After this you can pip install the Python package in tar.gz format directly.
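Putting the pieces together, here is a minimal sketch that extracts every tar.gz in a folder into a destination you know is writable. extract_all is a hypothetical helper; the explicit destination argument is the point, since extracting into an unwritable working directory is what raises Errno 13.

```python
import os
import tarfile

def extract_all(folder, dest):
    """Extract every .tar.gz found in `folder` into `dest`."""
    os.makedirs(dest, exist_ok=True)  # fails early if `dest` cannot be created
    for name in sorted(os.listdir(folder)):
        if name.endswith(".tar.gz"):
            with tarfile.open(os.path.join(folder, name), "r:gz") as tar:
                tar.extractall(dest)
```

Calling it with a destination under your home directory, e.g. extract_all('.', os.path.expanduser('~/extracted')), sidesteps the permission error when the current directory is not writable.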

Downloading folder from GCS to local directory

I'm trying to download a folder from my Cloud Storage bucket to a local directory using the command gsutil cp -r gs://bucket/my_folder . but it shows OSError: Access is denied. Any idea how to get around this problem?
I can reproduce this error if I do not have permissions to create LOCAL_DEST_DIR on my local machine.
$ gsutil cp -r gs://BUCKET_NAME/DIR_IN_BUCKET LOCAL_DEST_DIR
Copying gs://BUCKET_NAME/DIR_IN_BUCKET/FILE...
OSError: Permission denied.
Please check that you have permission to create a file or directory in your current working directory.
You can run touch test-file.text to verify if you're able to create files in the current directory.
If you're on linux/*nix/mac, usually you will have full permissions to create files and directories in your $HOME directory, so you can try running the gsutil command in that directory.
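The `touch` test above can also be scripted: actually creating and removing a probe file is a more faithful check than inspecting permission bits. can_create_files is a hypothetical helper.

```python
import os
import uuid

def can_create_files(path="."):
    """Try to create (and remove) a file in `path`, like `touch` would."""
    probe = os.path.join(path, "perm-probe-" + uuid.uuid4().hex)
    try:
        open(probe, "w").close()
    except OSError:
        return False
    os.remove(probe)
    return True
```

If can_create_files() returns False in your current directory, cd to $HOME (or pass a writable LOCAL_DEST_DIR) before re-running the gsutil cp command.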

Not able to upload files to AWS server root folder, getting 'file permission error'

I'm trying to upload a Google webmaster file to the AWS root folder with FileZilla, and I'm getting the following error. My root folder permission is set to 755, but I'm still getting a file permission error.
You are not authorized to access this directory. Ask your administrator/Root user to run this command.
sudo chown -R USER_NAME:USER_NAME FOLDER_PATH
USER_NAME: Your username
FOLDER_PATH: Folder path
I faced the same issue earlier and this is how I sorted it out.
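To see whether chown is actually needed, you can first check who owns the target folder. owner_of is a hypothetical helper and assumes a Unix host.

```python
import os
import pwd

def owner_of(path):
    """Return the username that owns `path`."""
    return pwd.getpwuid(os.stat(path).st_uid).pw_name
```

If owner_of('/var/www/html') does not match the user FileZilla logs in with (often ec2-user or ubuntu on AWS, though the exact name depends on your AMI), uploads will fail until the directory is chown'ed as shown above.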