I have a CentOS server running WHM/cPanel sites, and a lot of these sites run WordPress. I would like an automated backup that zips the uploads folder of any account that has one and moves the zip into a backup directory. I know there are other solutions, but most of them want you to back up the entire WordPress site; I only need to back up the uploads.
I'm not very good with .sh scripts and have already spent a lot of time trying to figure this out, but I can't find any examples similar enough for me to be successful. The main problem I have is naming the zip after the account.
Location of most upload folders (user1 being the account that changes):
/home/user1/public_html/wp-content/uploads
/home/user2/public_html/wp-content/uploads
/home/user3/public_html/wp-content/uploads
Script example:
find /home/ -type d -name uploads -exec sh -c 'zip -r /backup/uploads/alluploads.zip `dirname $0`/`basename $0`' {} \;
The problem with this approach is that it writes everything to one single zip file. How can I alter this so that each user's uploads are saved to their own user1-uploads.zip file?
I've played around with exec and sed but I can't seem to figure it out. This is the best I got - just trying to get it to echo the name - but it's not right. Sorry, I'm terrible with regex:
find /home/ -type d -name uploads -exec sh -c 'string=`dirname $0` echo $string | sed `/\/home\/\(.*\)\/new\/wp-content\/uploads/`'{} \;
Would appreciate any help or directions on how to fix this. Thanks!
You can use Bash's globbing with * to expand all the different user directories.
for uploads in /home/*/public_html/wp-content/uploads; do
    IFS=/ read -r _ _ user _ <<<"$uploads"
    zip -r "/backup/uploads/${user}-uploads.zip" "$uploads"
done
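If you plan to run this from cron, a self-contained wrapper might look like the sketch below; the script path, the mkdir, and the existence check are my additions, not part of the answer above:
#!/bin/bash
# Hypothetical wrapper, saved e.g. as /usr/local/bin/backup-uploads.sh
mkdir -p /backup/uploads                 # make sure the target directory exists
for uploads in /home/*/public_html/wp-content/uploads; do
    [ -d "$uploads" ] || continue        # skip the literal pattern if the glob matched nothing
    IFS=/ read -r _ _ user _ <<<"$uploads"
    zip -r "/backup/uploads/${user}-uploads.zip" "$uploads"
done
A crontab entry such as 0 2 * * * /usr/local/bin/backup-uploads.sh would then take the backups nightly at 02:00.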
A couple of solutions come to mind. You could loop through the user directories:
cd /home
for udir in *; do
    find "/home/$udir" -type d -name uploads -exec sh -c 'zip -r /backup/uploads/'"$udir"'-uploads.zip "$0"' {} \;
done
or use cut to get the username component of the path (the third /-separated field, since the path begins with a slash):
find /home/ -type d -name uploads -exec sh -c 'zip -r "/backup/uploads/$(echo "$0" | cut -d/ -f3)-uploads.zip" "$0"' {} \;
Although both of these would run into issues if a user has more than one directory named uploads anywhere under their home.
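One way around that (a sketch, assuming every uploads folder sits at the standard WordPress depth shown in the question) is to pin the search to that exact path, so stray uploads directories elsewhere are never matched:
find /home/*/public_html/wp-content -maxdepth 1 -type d -name uploads -exec sh -c 'zip -r "/backup/uploads/$(echo "$0" | cut -d/ -f3)-uploads.zip" "$0"' {} \;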
You might use tr to simply replace the path separator with another character so you end up with a separate zip for each uploads directory:
find /home/ -type d -name uploads -exec sh -c 'zip -r "/backup/uploads/$(echo "$0" | cut -c2- | tr / -).zip" "$0"' {} \;
so you would end up with files named /backup/uploads/home-user-public_html-wp-content-uploads.zip and /backup/uploads/home-user-wip-uploads.zip instead of one zip overwriting the other (the cut -c2- strips the leading slash so the names don't begin with a dash).
I'm trying to remove all of the folder meta files from a Unity project in the git repo my team is using. Other members don't delete the meta file associated with a folder they deleted/emptied, and it propagates to everyone else. It's a minor annoyance that shouldn't need to be seen, so I've added this to the .gitignore:
*.meta
!*.*.meta
and now need to remove only the folder metas. I'd rather remove the metas now than wait for them to appear and have git remove them later. I'm using Git Bash on Windows and have tried the following commands to find just the folder metas:
find . -name '*.meta' > test.txt #returns folders and files
find . -regex '.*\.meta' > test.txt #again folders and files
find . -regex '\.[^\.]{0,}\.meta' > test.txt #nothing
find . -regex '\.[^.]{0,}\.meta' > test.txt #nothing
find . -regex '\.{2}' > test.txt #nothing
find . -regex '(\..*){2}' > test.txt #nothing
I know regex is interpreted differently per program/language, but the following will produce the results I want in Notepad++, and I'm not sure how to translate it for git or Git Bash:
^.*/[^.]{0,}\.meta$
by capturing the lines (file paths from the root of the repo) that end with /<foldername>.meta, since I realized some folders contain a '.' in their name.
Once this is figured out, I need to go line by line and git rm the files.
NOTE
I can also run:
^.*/.*?\..*?\.meta$\n
and replace with nothing to delete all of the file metas from the folders-and-files result, then use what's left to get all of the folder metas, but I'd also like to know how to avoid needing Notepad++ as an extra step.
To confine the results to indexed files, use git ls-files, the Swiss Army knife of index-aware file listing; git update-index is the core-command index munger:
git ls-files -i -x '*.meta' -x '!*.*.meta' | git update-index --force-remove --stdin
This will remove the files from your index but leave them in the work tree.
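Since the files stay in the work tree, the removal still needs to be committed; something along these lines (the commit message is just an example) finishes the job:
git commit -m "Remove folder .meta files"
After that, the .gitignore rules from the question keep the folder metas from being re-added.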
It's easier to express with two conditions just like in .gitignore. Match *.meta but exclude *.*.meta:
find . -name '*.meta' ! -name '*.*.meta'
Use -exec to run the command of your choice on the matched files. {} is a placeholder for the file names, and ';' marks the end of the -exec command (weird syntax, but it's useful if you append other expressions after the -exec ... ';').
find . -name '*.meta' ! -name '*.*.meta' -exec git rm {} ';'
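If there are many matches, the + terminator batches the file names into as few git rm invocations as possible instead of running one per file:
find . -name '*.meta' ! -name '*.*.meta' -exec git rm {} +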
Please help me out here:
I'm using the below command to search and replace strings in files in a directory (including sub-directories):
find . -type f -exec perl -api -e 's/\b(?!00)[A-Z0-9]{6,}/dummy/g' {} \;
What I want is this: after it performs the above operation on a file, move that file to another folder and then work on the next file.
Any help is appreciated.
Thanks
You could try this:
find . -type f -exec perl -api -e 's/\b(?!00)[A-Z0-9]{6,}/dummy/g' {} \; -exec mv {} /to/this/directory \;
After the first -exec predicate completes successfully, find will run the next -exec. This answer to a related question will give you a bit more information.
What I want is this: after it performs the above operation on a file, move that file to another folder and then work on the next file.
You can do:
while IFS= read -rd '' file; do
    mkdir -p "/dest/$(dirname "$file")"   # recreate the source subdirectory under /dest
    perl -pe 's/\b(?!00)[A-Z0-9]{6,}/dummy/g' "$file" > "/dest/$file"
done < <(find . -type f -print0)
This will also take care of files with whitespace and special characters in their names.
I wanted to continue a project I haven't touched in a while, and came across this error when executing
php bin/console doctrine:generate:entities SalonBundle (it's run from a shell, so it uses the PHP CLI)
Generating entities for bundle "SalonBundle"
> backing up Salon.php to Salon.php~
> generating SalonBundle\Entity\Salon
[Symfony\Component\Debug\Exception\ContextErrorException]
Warning: chmod(): Operation not permitted
doctrine:generate:entities [--path PATH] [--no-backup] [-h|--help] [-q|--quiet] [-v|vv|vvv|--verbose] [-V|--version] [--ansi] [--no-ansi] [-n|--no-interaction] [-e|--env ENV] [--no-debug] [--] <command> <name>
To begin with, I'm not sure why Symfony tries a chmod.
All files are owned by root:www-data
File permissions are rw-rw-r--
My user is in group www-data
Uploading, creating files, copying, moving, etc. all work fine
The permissions are set via a script which runs the following commands.
$targetDir is the path passed as an argument to the script.
chown -R root:www-data "$targetDir"
find "$targetDir" -type d -exec chmod ug+rwx "{}" \;
find "$targetDir" -type f -exec chmod ug+rw "{}" \;
find "$targetDir" -type d -exec chmod g+s "{}" \;
find "$targetDir" -type d -exec setfacl -m g:www-data:rwx,d:g:www-data:rwx "{}" \;
find "$targetDir" -type f -exec setfacl -m g:www-data:rw- "{}" \;
Just added -vvv to the command line as suggested by someone and got this:
Exception trace:
() at /var/www/3DH/salon/vendor/doctrine/orm/lib/Doctrine/ORM/Tools/EntityGenerator.php:392
Symfony\Component\Debug\ErrorHandler->handleError() at n/a:n/a
chmod() at /var/www/3DH/salon/vendor/doctrine/orm/lib/Doctrine/ORM/Tools/EntityGenerator.php:392
Doctrine\ORM\Tools\EntityGenerator->writeEntityClass() at /var/www/3DH/salon/vendor/doctrine/orm/lib/Doctrine/ORM/Tools/EntityGenerator.php:347
Doctrine\ORM\Tools\EntityGenerator->generate() at /var/www/3DH/salon/vendor/doctrine/doctrine-bundle/Command/GenerateEntitiesDoctrineCommand.php:133
Doctrine\Bundle\DoctrineBundle\Command\GenerateEntitiesDoctrineCommand->execute() at /var/www/3DH/salon/vendor/symfony/symfony/src/Symfony/Component/Console/Command/Command.php:256
Symfony\Component\Console\Command\Command->run() at /var/www/3DH/salon/vendor/symfony/symfony/src/Symfony/Component/Console/Application.php:837
Symfony\Component\Console\Application->doRunCommand() at /var/www/3DH/salon/vendor/symfony/symfony/src/Symfony/Component/Console/Application.php:187
Symfony\Component\Console\Application->doRun() at /var/www/3DH/salon/vendor/symfony/symfony/src/Symfony/Bundle/FrameworkBundle/Console/Application.php:80
Symfony\Bundle\FrameworkBundle\Console\Application->doRun() at /var/www/3DH/salon/vendor/symfony/symfony/src/Symfony/Component/Console/Application.php:118
Symfony\Component\Console\Application->run() at /var/www/3DH/salon/bin/console:27
This topic provides no solutions.
This solution doesn't apply to me, as I don't use Vagrant (it's not installed).
Well, I figured out the problem.
I would say I'm stupid in the first place, I guess...
@John Smith: You weren't far off...
In fact, my problem is just logic...
Trying to chmod a file owned by root as a regular user is impossible...
And that's normal.
Every file generated in the src/ folder is created by the files (let's call them tools) in /vendor/doctrine/orm/lib/Doctrine/ORM/Tools/.
And of course those files are owned by the user who runs these tools (twig, entity, form, etc.).
When generating a file, these tools try to chmod the new file in case we were dumb enough not to set our folder & file permissions correctly.
In my case, as I had done a pass of my webperms script, all files were owned by root:www-data.
So of course, when I tried to generate entities, it failed because I wasn't the file owner anymore.
There are currently two solutions to this problem:
Change ownership of the src/ files to user:www-data (see the command below)
Comment out the chmod command in the tools located in /vendor/doctrine/orm/lib/Doctrine/ORM/Tools/
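For the first option, a one-liner like this should do it (a sketch; run it from the project root and substitute your own account for $USER if needed):
sudo chown -R "$USER":www-data src/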
I'm not sure how to contact the Symfony team, but I would suggest adding an if condition, depending on a Symfony parameter, that would allow us to bypass this chmod.
I am trying to strip all '?' characters from file names in a given directory, which has subdirectories that in turn contain subdirectories of their own. I've tried using a simple Perl regex script with system calls, but it fails to recurse into each subdirectory, and doing it manually would waste too much time. How can I solve my problem?
You can use the find command to search for filenames containing '?' and then use its -exec option to run a script which removes the '?' characters from the filename. Consider this script, which you could save to /usr/local/bin/rename.sh, for example (remember to give it +x permission):
#!/bin/sh
mv "$1" "$(echo "$1" | tr -d '?')"
Then this will do the job:
find . -name "*\?*" -exec rename.sh {} \;
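One caveat: the tr above strips '?' from the whole path, so a file inside a directory whose name also contains '?' would be moved to a parent path that doesn't exist yet. A variant of the script that rewrites only the last path component avoids that; this is a sketch of mine, not part of the original answer:
#!/bin/sh
# strip '?' from the final path component only, leaving parent directories alone
dir=$(dirname "$1")
base=$(basename "$1" | tr -d '?')
mv "$1" "$dir/$base"
Pair it with find's -depth option, which visits a directory's contents before the directory itself, so renaming a directory doesn't invalidate paths find has yet to visit:
find . -depth -name "*\?*" -exec /usr/local/bin/rename.sh {} \;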
Try this:
find -name '*\?*' -exec prename 's/\?//g' {} +
See https://metacpan.org/module/RMBARKER/File-Rename-0.06/rename.PL (this is the default rename command on Ubuntu distros)
Find all the names containing '?' and remove the '?' characters from each of them. The -exec option could be used as well, although it would normally require an additional script (see the inline variant below).
for f in $(find "$dir" -name '*\?*' -a -type f); do
    mv "$f" "${f//\?/}"
done
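For what it's worth, -exec doesn't actually need a separate script file; an inline bash -c body does the same job and also survives whitespace in file names (the _ placeholder fills $0):
find "$dir" -type f -name '*\?*' -exec bash -c 'mv "$1" "${1//\?/}"' _ {} \;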
I have a folder which contains ~50 text files (PHP) and hundreds of images. I would like to move all the images to a subfolder, and update the PHP files so any reference to those images points to the new subfolder.
I know I can move all the images quite easily (mv *.jpg /image, mv *.gif /image, etc.), but I don't know how to go about updating all the text files - I assume a regex has to be created to match all the images in a file, and then somehow the new directory has to be prefixed to the image file name? Is this best done with a shell script? Any help is appreciated (server is Linux/CentOS 5)
Thanks!
sed with the -i switch is probably what you're looking for. -i tells sed to edit the file in-place.
Something like this should work:
find /my/php/location -name '*.php' -print0 | xargs -0 sed -i -e 's,/old/location/,/new/location/,g'
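For the question's actual case, the old and new locations would be the image references themselves; for instance (header.jpg and the images/ folder are hypothetical names, adjust to taste):
find . -name '*.php' -print0 | xargs -0 sed -i -e 's,src="header.jpg",src="images/header.jpg",g'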
You could do it like this:
#!/bin/sh
for f in *.jpg *.png *.gif; do
    mv "$f" gfx/
    for p in *.txt; do
        sed -i.bak "s,$f,gfx/$f,g" "$p"
    done
done
It finds all jpg/png/gif files and moves them to the "gfx" subfolder, then for each txt file (or whatever kind of file you want it edited in) it uses "sed" in-place to alter the path.
Btw. it will create backup copies of the edited files with the extra extension '.bak'. This can be avoided by omitting the '.bak' part in the script (i.e. using plain -i).
This will move all images to a subdir called 'images' and then change only links to image files by adding 'images/' just before the basename.
mkdir images
mv -f *.{jpg,gif,png,jpeg} images/
sed -i 's%[^/"'\'']\+\.\(gif\|jpg\|jpeg\|png\)%images/\0%g' *.php
If you have thousands of files, you may need to use find and xargs. It's a bit slower:
find . -regex '.*\.\(gif\|jpg\|png\|jpeg\)' -exec mv {} images/ \;
find . -name \*.php -print0 | \
xargs -0 sed -i 's%[^/"'\'']\+\.\(gif\|jpg\|jpeg\|png\)%images/\0%g'
Caution: it will also rewrite the paths of images referenced by remote URLs. Also, make sure you have a full backup of your directory first; PHP syntax and variable names might cause problems.