CHMOD 777 specific file extensions in Terminal - regex

I am trying to CHMOD 777 all .doc files on my Mac. Is there a way through Terminal that I could do this?
Update: Thanks for the responses. I thought this was the way to change permissions on Word .doc files. I have 2 users on a Mac that share a folder, but when one creates a .doc file the other gets only read permission. I want both of them, or everyone (it doesn't matter), to have read & write access. I also want to go back and retroactively make all the past docs, some of which user A has read & write permissions for and some of which user B has read & write permissions for, readable and writable by both of them. Is there another way to do this? From what I can tell, this is a Mac permissions issue, nothing in Word. I thought chmod 777 in Terminal was the way to do this.

Use find and xargs or the -exec option of find.
/home/erjablow> find . -name "*.doc" | xargs -n 3 chmod a+rwx
# Find the names of all "doc" files beneath the current directory.
# Gather them up 3 at a time, and execute
# chmod a+rwx file1 file2 file3
# on each set.
Now, find is a nasty utility, and hard to use. It can do ridiculous things, and one misspelling can ruin one's day.
Why do you want to make all doc files unprotected?

Don't.
If you really want to do this:
find / -x -name '*.doc' -type f -print0 | xargs -0 chmod 777
I have not tested this. I don't guarantee that the Mac versions of the find and xargs commands support all these options; I'm used to the GNU findutils versions that are common on Linux systems.
The find command starts at the root directory, limiting itself (-x) to the current filesystem, selects files whose names end in .doc, and prints them separated by null characters. The latter is necessary in case you have files or directories whose names contain spaces. The -x option is equivalent to the -xdev option in some other versions of find.
If you want to match files whose names end in .Doc, .DOC, etc., use -iname '*.doc' rather than -name '*.doc'.
The xargs command takes this null-delimited list of files and executes chmod 777 on each of them, breaking the list into chunks that won't overflow chmod's command line.
Let me emphasize again that:
I have not tested the above command. I don't currently have access to a MacOS system; I've just read the relevant man pages online. If you want to do a dry run, insert echo between xargs -0 and chmod; xargs will then print the chmod commands it would execute instead of running them (expect a lot of output with very long lines). Or use xargs -0 -n 1 echo chmod 777 to produce one line per file.
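For example, the dry run would look like this (same untested caveat as above; it only prints, nothing is changed):
find / -x -name '*.doc' -type f -print0 | xargs -0 echo chmod 777
# prints the chmod commands that would be run, without running them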
Doing this in the first place is a bad idea. 777 gives read, write, and execute permission to every user on the system. .doc files are not executables; giving them execute permission makes no sense on a Unix-like system. 644 is likely to be a more sensible value (read/write for the owner, read-only for everyone else).
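If you follow that advice, the safer variant would be (again untested; 644 = rw-r--r--):
find / -x -name '*.doc' -type f -print0 | xargs -0 chmod 644
# read/write for the owner, read-only for group and others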

I agree with the previous response that find is nasty. And piping it to another utility can be tricky. If I had to do this, I'd specify the file type and use find's exec function instead of xargs.
find / -x -type f -iname '*.doc' -exec chmod 777 {} \;
-x will search only the local Macintosh HD and exclude any mounted volumes
You may want to consider other criteria to refine the search, like a different starting point or excluding particular folders and/or file extensions
find / -x -not \( -path "*/some_sub-folder*" -or -path "*/some_other_sub-folder*" \) -type f \( -iname '*.doc' -or -iname '*.docx' \) -exec chmod 777 {} \;
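To test either command first, put echo in front of chmod so find only prints what it would change (a dry-run sketch):
find / -x -type f -iname '*.doc' -exec echo chmod 777 {} \;
# one chmod command printed per matched file; nothing is modified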
Or if you're set on using find|xargs, stick with the -print0 and xargs -0 pairing from the previous answer rather than plain -print; -print separates files with newlines and breaks on names containing spaces.
Either way, back up before doing this and test carefully. Good luck, HTH.

Related

doctrine:generate:entities chmod operation not permitted

I wanted to continue a project I haven't touched in a while, and came across this error when executing
php bin/console doctrine:generate:entities SalonBundle (it's run from the shell, so it uses the PHP CLI)
Generating entities for bundle "SalonBundle"
> backing up Salon.php to Salon.php~
> generating SalonBundle\Entity\Salon
[Symfony\Component\Debug\Exception\ContextErrorException]
Warning: chmod(): Operation not permitted
doctrine:generate:entities [--path PATH] [--no-backup] [-h|--help] [-q|--quiet] [-v|vv|vvv|--verbose] [-V|--version] [--ansi] [--no-ansi] [-n|--no-interaction] [-e|--env ENV] [--no-debug] [--] <command> <name>
To begin with, I'm not sure why Symfony tries a chmod
All files are owned by root:www-data
File permissions are rw-rw-r--
My user is in group www-data
Uploading, creating files, copying, moving, etc. all work fine
The permissions are set via a script which run the following commands
$targetDir is the path passed as an argument to the script.
chown -R root:www-data $targetDir
find $targetDir -type d -exec chmod ug+rwx "{}" \;
find $targetDir -type f -exec chmod ug+rw "{}" \;
find $targetDir -type d -exec chmod g+s "{}" \;
find $targetDir -type d -exec setfacl -m g:www-data:rwx,d:g:www-data:rwx "{}" \;
find $targetDir -type f -exec setfacl -m g:www-data:rw- "{}" \;
Just added -vvv to the command line as suggested by someone and got this:
Exception trace:
() at /var/www/3DH/salon/vendor/doctrine/orm/lib/Doctrine/ORM/Tools/EntityGenerator.php:392
Symfony\Component\Debug\ErrorHandler->handleError() at n/a:n/a
chmod() at /var/www/3DH/salon/vendor/doctrine/orm/lib/Doctrine/ORM/Tools/EntityGenerator.php:392
Doctrine\ORM\Tools\EntityGenerator->writeEntityClass() at /var/www/3DH/salon/vendor/doctrine/orm/lib/Doctrine/ORM/Tools/EntityGenerator.php:347
Doctrine\ORM\Tools\EntityGenerator->generate() at /var/www/3DH/salon/vendor/doctrine/doctrine-bundle/Command/GenerateEntitiesDoctrineCommand.php:133
Doctrine\Bundle\DoctrineBundle\Command\GenerateEntitiesDoctrineCommand->execute() at /var/www/3DH/salon/vendor/symfony/symfony/src/Symfony/Component/Console/Command/Command.php:256
Symfony\Component\Console\Command\Command->run() at /var/www/3DH/salon/vendor/symfony/symfony/src/Symfony/Component/Console/Application.php:837
Symfony\Component\Console\Application->doRunCommand() at /var/www/3DH/salon/vendor/symfony/symfony/src/Symfony/Component/Console/Application.php:187
Symfony\Component\Console\Application->doRun() at /var/www/3DH/salon/vendor/symfony/symfony/src/Symfony/Bundle/FrameworkBundle/Console/Application.php:80
Symfony\Bundle\FrameworkBundle\Console\Application->doRun() at /var/www/3DH/salon/vendor/symfony/symfony/src/Symfony/Component/Console/Application.php:118
Symfony\Component\Console\Application->run() at /var/www/3DH/salon/bin/console:27
This topic provides no solutions
This solution doesn't apply to me, as I don't use Vagrant (it's not installed)
Well, I understood the problem.
I would say I'm stupid in the first place I guess...
@John Smith: You weren't far off...
In fact, my problem is just logic...
Trying to chmod a file owned by root with a regular user is impossible...
And that's normal
Every file generated in the src/ folder is made by the files (let's call them tools) in /vendor/doctrine/orm/lib/Doctrine/ORM/Tools/
And of course the generated files are owned by the user who runs these tools (twig, entity, form, etc.)
When generating a file, these tools try to chmod the new file in case we were dumb enough not to set our folder & file permissions correctly.
In my case, as I had done a pass of my webperms script, all files were owned by root:www-data.
So of course, when I tried to generate entities, it failed because I was no longer the file owner.
There are currently two solutions to this problem:
Change the src/ files' ownership to user:www-data (see the sketch after this list)
Comment out the chmod call in the tools located in /vendor/doctrine/orm/lib/Doctrine/ORM/Tools/
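For the first option, something like this would do it (a sketch only; myuser is a hypothetical login name, substitute your own, and run from the project root):
sudo chown -R myuser:www-data src/
# keeps the www-data group, so the web server retains the access granted by the ACLs above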
I'm not sure how to contact the Symfony team, but I would suggest adding an if condition, dependent on a Symfony parameter, that would allow us to bypass this chmod.

How to Execute Python File In Unix Find Command

Okay, so let's say that I am in the main directory of my computer. How can I search for a file.py and execute it with Unix in one line? Two lines is okay, but we are assuming we do not know the file path.
It's a simple question, but I am unable to find an answer.
Updated
Per kojiro's comment, a better method is to use the -exec argument to find.
$ find ./ -name 'file.py' -exec python '{}' \;
The manpage for find explains its usage better than I can; see here under -exec command ;. In short, it will call command once for each result, with any arguments up to the \;, and with '{}' replaced by the file path of the result.
Also in the man page for find, it's worth looking at the notes relating to the -print and -print0 flags if you're using the below approach.
Original Answer
Does something like the following do what you want?
$ cd /path/to/dir/
$ find ./ -name 'file.py' | xargs -L 1 python
which is a pretty useful pattern where
find ./ -name 'file.py'
will list all the paths to files with names matching file.py in the current directory or any subdirectory.
Pipe the output of this into xargs, which passes each line from its stdin as an argument to the program given to it, in this case python. However, we want to execute python once for every line given to xargs; from the Wikipedia article for xargs:
one can also invoke a command for each line of input at a time with -L 1
However, this will match all files under the current path that are named 'file.py'. You can probably limit this to the first result with a flag to find if you want.
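If you do want just the first match, GNU find (and some BSD versions) has a -quit action that stops the search after the first hit; a sketch, assuming your find supports it:
python "$(find . -name 'file.py' -print -quit)"
# -print -quit prints the first matching path and stops searching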

Use [msys] bash to remove all files whose name matches a pattern, regardless of file-name letter-case

I need a way to clean up a directory, which is populated with C/C++ built-files (.o, .a, .EXE, .OBJ, .LIB, etc.) produced by (1) some tools which always create files having UPPER-CASE names, and (2) other tools which always create lower-case file names. (I have no control over the tools.)
I need to do this from a MinGW 'msys' bash.exe shell script (or bash command prompt). I understand piping (|), but haven't come up with the right combination of exec's yet. I have successfully filtered the file names, using commands like this example:
ls | grep '.\.[eE][xX][eE]'
to list all files having any case-combination of letters in the file-extension--this example gets all the executable (e.g. ".EXE") files.
(I'll be doing similar for .o, .a, .OBJ, .LIB, .lib, .MAP, etc., which all share the same directory as the C/C++ source files. I don't want to delete the source files, only the built-files. And yes, I probably should rework the directory structure, to use a separate directory for the built-files [only], but that will take time, and I need a quick solution now.)
How can I merge the above command with "something" else (e.g., like the 'rm -f' command???), to carry this the one step further, to actually delete [only] those filtered-out files from the current directory? (I'm hopeful for a solution which does not require a temporary file to hold the filtered file names.)
Adding this answer because the accepted answer suggests practices which are not recommended in actual scripts. (Please don't feel bad; I was also on that track once.)
Parsing ls output is a NO-NO! See http://mywiki.wooledge.org/ParsingLs for more detailed explanation on why.
In short, ls separates the filenames with newlines, which can be present in a filename itself. (Plus, ls does not handle other special characters properly; ls prints its output in human-readable form.) In unix/linux, it's perfectly valid to have a newline in a filename.
A unix filename cannot contain a NULL character, though. Hence the command below should work.
find /path/to/some/directory -iname '*.exe' -print0 | xargs -0 rm -f
find: is a tool used to, well, find files matching the required pattern/criterion.
-iname: search using particular names, case insensitive. Note that the argument to -iname is a wildcard pattern, not a regex.
-print0: Print the file names separated by NULL character.
xargs: Takes the input from stdin & runs the command supplied (rm -f in this case) on it. The input is separated by whitespace by default.
-0 specifies that the input is separated by the null character instead.
Or, an even better approach:
find /path/to/some/directory -iname '*.exe' -delete
-delete is a built-in feature of find, which deletes the files found with the pattern.
Note that if you want to do some other operation, like moving the files to a particular directory, you'd need to use the first option with xargs.
Finally, this command find /path/to/some/directory -iname '*.exe' -delete would recursively find the *.exe files/directories. You can restrict the search to the current directory with -maxdepth 1, and restrict the file type to simple files (not directories, pipes, etc.) using -type f. Check the manual link I provided for more details.
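Combining those restrictions, for instance (a sketch using the options just described; -maxdepth is a GNU/BSD extension rather than POSIX):
find /path/to/some/directory -maxdepth 1 -type f -iname '*.exe' -delete
# deletes regular files with a .exe extension (any case) in that directory only, no recursion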
Is this what you mean?
rm -f `ls | grep '.\.[eE][xX][eE]'`
but usually your "ls | grep ..." output will have some other fields that you have to strip out such as date etc., so you might just want to output the file name itself.
try something like:
rm -f `ls | grep '.\.[eE][xX][eE]' | awk '{print $9}'`
where your file name is the 9th field, as in:
-rwxr-xr-x 1 Administrators None 283 Jul 2 2014 search.exe
You can use the following command:
ls | grep '.\.[eE][xX][eE]' | xargs rm -f
Use of "xargs" would turn standard input ( in this case output of the previous command) as arguments for "rm -f" command.

How to Recursively Remove Files of a Certain Type

I misread the gzip documentation, and now I have to remove a ton of ".gz" files from many directories inside one another. I tried using 'find' to locate all .gz files. However, whenever there's a file with a space in the name, rm interprets that as another file. And whenever there's a dash, rm interprets that as a new flag. I decided to use 'sed' to replace the spaces with "\ " and the space-dashes with "\ -", and here's what I came up with.
find . -type f -name '*.gz' | sed -r 's/\ /\\ /g' | sed -r 's/\ -/ \\-/g'
When I run the find/sed query on a file that, for example, has a name of "Test - File - for - show.gz", I get the output
./Test\ \-\ File\ \-\ for\ \-\ show.gz
Which appears to be acceptable for rm, but when I run
rm $(find . -type f -name '*.gz'...)
I get
rm: cannot remove './Test\\': No such file or directory
rm: cannot remove '\\-\\': No such file or directory
rm: cannot remove 'File\\': No such file or directory
rm: cannot remove '\\-\\': No such file or directory
...
I haven't made extensive use of sed, so I have to assume I'm doing something wrong with the regular expressions. If you know what I'm doing wrong, or if you have a better solution, please tell me.
Adding backslashes before spaces protects the spaces against expansion in shell source code. But the output of a command in a command substitution does not undergo shell parsing, it only undergoes wildcard expansion and field splitting. Adding backslashes before spaces doesn't protect them against field splitting.
Adding backslashes before dashes is completely useless since it's rm that interprets dashes as special, and it doesn't interpret backslashes as special.
The output of find is ambiguous in general — file names can contain newlines, so you can't use a newline as a file name separator. Parsing the output of find is usually broken unless you're dealing with file names in a known, restricted character set, and it's often not the simplest method anyway.
find has a built-in way to execute external programs: the -exec action. There's no parsing going on, so this isn't subject to any problem with special characters in file names. (A path beginning with - could still be interpreted as an option, but all paths begin with . since that's the directory being traversed.)
find . -type f -name '*.gz' -exec rm {} +
Many find implementations (Linux, Cygwin, BSD) can delete files without invoking an external utility:
find . -type f -name '*.gz' -delete
See Why does my shell script choke on whitespace or other special characters? for more information on writing robust shell scripts.
There is no need to pipe to sed, etc. Instead, you can make use of the -exec flag of find, which allows you to execute a command on each of the results.
For example, for your case this would work:
find . -type f -name '*.gz' -exec rm {} \;
which is approximately the same as:
find . -type f -name '*.gz' -exec rm {} +
The last one batches many results into a single rm invocation instead of running rm once per result, which makes it faster.
From man find:
-exec command ;
Execute command; true if 0 status is returned. All following
arguments to find are taken to be arguments to the command until an
argument consisting of `;' is encountered. The string `{}' is
replaced by the current file name being processed everywhere it occurs
in the arguments to the command, not just in arguments where it is
alone, as in some versions of find. Both of these constructions
might need to be escaped (with a `\') or quoted to protect them from
expansion by the shell. See the EXAMPLES section for examples of the
use of the -exec option. The specified command is run once for
each matched file. The command is executed in the starting directory.
There are unavoidable security problems surrounding use of the -exec
action; you should use the -execdir option instead.

Remove duplicate filename extensions

I have thousands of files named something like filename.gz.gz.gz.gz.gz.gz.gz.gz.gz.gz.gz
I am using the find command like this, find . -name "*.gz*", to locate these files, and want to either use -exec or pipe to xargs with some magic command to clean up this mess, so that I end up with filename.gz.
Someone please help me come up with this magic command that would remove the unneeded instances of .gz. I tried experimenting with sed 's/\.gz//' and sed 's/(\.gz)//' but they do not seem to work (or, to be more honest, I am not very familiar with sed). I do not have to use sed, by the way; any solution that would help solve this problem would be welcome :-)
One way with find and awk:
find $(pwd) -name '*.gz'|awk '{n=$0;sub(/(\.gz)+$/,".gz",n);print "mv",$0,n}'|sh
Note:
I assume there are no special chars (like spaces) in your filenames. If there were, you would need to quote the filenames in the mv command.
I added $(pwd) to get the absolute path of each found name.
You can remove the trailing |sh to check the generated mv ... commands and verify they are correct.
If everything looks good, add the |sh back to execute the mv commands.
You may use
ls a.gz.gz.gz |sed -r 's/(\.gz)+/.gz/'
or without the -r (extended regex) flag:
ls a.gz.gz.gz |sed 's/\(\.gz\)\+/.gz/'
ls *.gz | perl -ne '/((.*?.gz).*)/; print "mv $1 $2\n"'
It will print shell commands to rename your files; it won't execute those commands, so it is safe. To execute them, you can save them to a file and run it, or simply pipe to the shell:
ls *.gz | ... | sh
sed is great for replacing text inside files, but for renaming you can use bash string substitution:
for file in *.gz.gz; do
  mv "${file}" "${file%%.*}.gz"
done
This might work for you (GNU sed):
printf '%s\n' *.gz | sed -r 's/^([^.]*)(\.gz){2,}$/mv -v & \1\2/e'
find . -name "*.gz.gz" |
while read f; do echo mv "$f" "$(sed -r 's/(\.gz)+$/.gz/' <<<"$f")"; done
This only previews the renaming (mv) command; remove the echo to perform actual renaming.
Processes matching files in the current directory tree, as in the OP (and not just files located directly in the current directory).
Limits matching to files that end in at least 2 .gz extensions (so as not to needlessly process files that end in just one).
When determining the new name with sed, makes sure that substring .gz doesn't just match anywhere in the filename, but only as part of a contiguous sequence of .gz extensions at the end of the filename.
Handles filenames with special chars. such as embedded spaces correctly (with the exception of filenames with embedded newlines.)
Using bash string substitution:
for f in *.gz.gz; do
  mv "$f" "${f%%.gz.gz*}.gz"
done
This is a slight modification of jaypal's nice answer (which would fail if any of your files had a period as part of its name, such as foo.c.gz.gz). (Mine is not perfect, either.) Note the use of double quotes, which protect against filenames with "bad" characters, such as spaces or stars.
If you wish to use find to process an entire directory tree, the variant is:
find . -name \*.gz.gz |
while read f; do
  mv "$f" "${f%%.gz.gz*}.gz"
done
And if you are fussy and need to handle filenames with embedded newlines, change the while read to while IFS= read -r -d $'\0', and add a -print0 to find; see How do I use a for-each loop to iterate over file paths output by the find utility in the shell / Bash?.
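Putting those changes together, a sketch of the newline-safe variant (assumes bash and a find that supports -print0):
find . -name '*.gz.gz' -print0 |
while IFS= read -r -d $'\0' f; do
  mv "$f" "${f%%.gz.gz*}.gz"
done
# NUL-delimited output plus IFS= read -r -d $'\0' keeps even filenames with embedded newlines intact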
But is this renaming a good idea? How was your filename.gz.gz created? gzip has guards against accidentally doing so. If you circumvent these via something like gzip -c $1 > $1.gz, buried in some script, then renaming these files will give you grief.
Another way with rename:
find . -iname '*.gz.gz' -exec rename -n 's/(\.\w+)\1+$/$1/' {} +
When happy with the results remove -n (dry-run) option.