How to Execute Python File In Unix Find Command - python-2.7

Okay. Let's say that I am in the main directory of my computer. How can I search for a file.py and execute it with Unix in one line? Two lines is okay, but we are assuming we do not know the file path.
It's a simple question, but I am unable to find an answer.

Updated
Per kojiro's comment, a better method is to use the -exec argument to find.
$ find ./ -name 'file.py' -exec python '{}' \;
The manpage for find explains its usage better than I can; see the entry for -exec command ;. In short, it will call command once for each result, passing any arguments up to the \; and replacing '{}' with the file path of the result.
Also in the man page for find, it's worth looking at the notes relating to the -print and -print0 flags if you're using the below approach.
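For example, if any matched file names could contain spaces, a null-delimited variant of that approach is safer. A sketch, assuming your find and xargs support -print0 and -0 (GNU and BSD both do):
$ find ./ -name 'file.py' -print0 | xargs -0 -n 1 python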
Original Answer
Does something like the following do what you want?
$ cd /path/to/dir/
$ find ./ -name 'file.py' | xargs -L 1 python
which is a pretty useful pattern where
find ./ -name 'file.py'
will list all the paths to files with names matching file.py in the current directory or any subdirectory.
Pipe the output of this into xargs, which passes each line from its stdin as an argument to the program given to it (in this case, python). However, we want to execute python once for every line given to xargs; from the Wikipedia article for xargs:
one can also invoke a command for each line of input at a time with -L 1
However, this will match all files under the current path that are named 'file.py'. You can probably limit this to the first result with a flag to find if you want.
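For instance, GNU find (and recent BSD find) can stop after the first match with -quit. A sketch, assuming your find supports that flag:
$ find ./ -name 'file.py' -print -quit | xargs -L 1 python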

Related

Using xargs, eval, and mv ensemble

I've been using the command line more frequently lately to increase my proficiency. I've created a .txt file containing URLs for libraries that I'd like to download. I batch-downloaded these files using
$ cat downloads.txt | xargs wget
When using the wget command I didn't specify a destination directory. I'd like to move each of the files that I've just downloaded into a directory called "vendor".
For the record, it has occurred to me that if I ran...
$ open .
...I could drag-and-drop these files into the desired directory. But in my opinion that would defeat the purpose of this exercise.
Now that I have the files in my cwd, I'd like to be able to target them and move them into the "vendor" directory.
As a side-question: Is there a useful way to print the most recently created files to STDOUT? Currently, I can grab the filenames from the URLs within downloads.txt pretty simply using the following pipeline and Perl script...
$ cat downloads.txt | perl -n -e 'if (/(?<=\/)([-.a-z]+)$/) { print $1 . "\n" }'
This will produce...
react.js
redux.js
react-dom.js
expect.js
...which is great, as these are the files that I intended to target. I'd like to transform each of these lines into a command within a pipeline that resembles this...
$ mv {./,./vendor/}<filename>
... where <filename> is "react.js" then "redux.js", and so forth.
I figure that I may be able to accomplish this using some combination of xargs, eval, and mv. This is where my bash skills drop off.
Just to reiterate, I'm aware that the method in which I am approaching this problem is neither simple nor ideal. This is intentionally a convoluted exercise intended to stretch my bash knowledge.
Is there anyone who knows how I can use xargs, eval, and mv to accomplish this goal?
Thank you!
xargs -l -a downloads.txt basename | xargs -i mv {} ./vendor
How this works: The first instance of xargs reads the file names from downloads.txt and calls basename for each of these file names individually (alternatively, you could use basename -a). These basenames are then piped to another instance of xargs, which uses the arguments to call mv, replacing the string {} with the current argument.
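In GNU xargs, -l and -i are deprecated spellings of -L 1 and -I. A roughly equivalent sketch with the newer flags, assuming the URLs in downloads.txt contain no spaces:
xargs -L 1 -a downloads.txt basename | xargs -I {} mv {} ./vendor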
mv $(basename -a $(<downloads.txt)) ./vendor
How this works: Since you want to move all the files into the same directory, you can use a single call to mv. The command substitution ($(...)) inserts the output of the command basename -a, which, in turn, reads its arguments from the file.

Expand command line exclude pattern with zsh

I'm trying to pass a complicated regex as an ignore pattern. I want to ignore all subfolders of locales/ except locales/US/en/*. I may need to fallback to using a .agignore file, but I'm trying to avoid that.
I'm using silver searcher (similar to Ack, Grep). I use zsh in my terminal.
This works really well and ignores all locale subfolders except locales/US:
ag -g "" --ignore locales/^US/ | fzf
I also want to ignore all locales/US/* except for locales/US/en
What I want is this, but it does not work.
ag -g "" --ignore locales/^US/^en | fzf
Thoughts?
Add multiple --ignore options. For instance:
ag -g "" --ignore locales/^US/ --ignore locales/US/^en
The following can work as well:
find locales/* -maxdepth 0 -name 'US' -prune -o -exec rm -rf '{}' ';'
Man Pages Documentation
-prune
True; if the file is a directory, do not descend into it. If -depth is given, false; no effect. Because -delete implies -depth, you cannot usefully use -prune and -delete together.
-prune lets you filter directories out of your results.
-exec command ;
Execute command; true if 0 status is returned. All following arguments to find are taken to be arguments to the command until an argument consisting of ';' is encountered. The string '{}' is replaced by the current file name being processed everywhere it occurs in the arguments to the command, not just in arguments where it is alone, as in some versions of find. Both of these constructions might need to be escaped (with a '\') or quoted to protect them from expansion by the shell. See the EXAMPLES section for examples of the use of the -exec option. The specified command is run once for each matched file. The command is executed in the starting directory. There are unavoidable security problems surrounding use of the -exec action; you should use the -execdir option instead.
-exec lets you execute a command on any results find returns.
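For comparison, if the goal were only to skip locales/US while listing files (rather than deleting its siblings), -prune is typically combined with -o -print. A sketch using the locales layout from the question:
find locales -path 'locales/US' -prune -o -type f -print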

How to use grep to find in a directory by a regex?

I tried
grep -R '.*invalidTemplateName.*' -regex './online_admin/.*/UTF-8/.*'
to find all occurrences of possible matches of the '.*invalidTemplateName.*' regex within the directory regex pattern './online_admin/.*/UTF-8/.*', but it doesn't work. I got the message:
grep: ./online_admin/.*/UTF-8/.*: No such file or directory
If I use
grep -R '.*invalidTemplateName.*' .
it looks in all subdirectories of the current directory, which is overwhelming. How can I specify a directory pattern in grep? Is it possible?
Find might be a better choice here:
find ./online_admin/*/UTF-8/* -type f -exec grep -H "invalidTemplateName" {} \;
Find will locate all files in the locations you want, including subdirectories of UTF-8, and then execute grep on each file. The -H argument ensures the filename will be printed along with the match. If you want only the filename, use the -l switch instead.
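For example, to print only the names of the matching files, something along these lines should work (a sketch; the + terminator batches many files into a single grep call):
find ./online_admin/*/UTF-8/* -type f -exec grep -l "invalidTemplateName" {} +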
With find you could do something like this:
find /abs/path/to/directory -maxdepth 1 -name '*invalidTemplateName*'
Using the -name argument you can directly filter by name. The filter string takes shell-style wildcards, not a regex.
Using the -maxdepth argument you can specify how many levels of directories to descend. 1 means to look only in /abs/path/to/directory itself; 2 means to look in /abs/path/to/directory and in its immediate subdirectories as well.
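Since -name only tests the file name itself, another way to restrict the search to the directory pattern from the question is find's -path test combined with grep. A sketch, untested against the real layout:
find ./online_admin -path '*/UTF-8/*' -type f -exec grep -H "invalidTemplateName" {} +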

How to Recursively Remove Files of a Certain Type

I misread the gzip documentation, and now I have to remove a ton of ".gz" files from many directories inside one another. I tried using 'find' to locate all .gz files. However, whenever there's a file with a space in the name, rm interprets that as another file. And whenever there's a dash, rm interprets that as a new flag. I decided to use 'sed' to replace the spaces with "\ " and the space-dashes with "\ -", and here's what I came up with.
find . -type f -name '*.gz' | sed -r 's/\ /\\ /g' | sed -r 's/\ -/ \\-/g'
When I run the find/sed query on a file that, for example, has a name of "Test - File - for - show.gz", I get the output
./Test\ \-\ File\ \-\ for\ \-\ show.gz
Which appears to be acceptable for rm, but when I run
rm $(find . -type f -name '*.gz'...)
I get
rm: cannot remove './Test\\': No such file or directory
rm: cannot remove '\\-\\': No such file or directory
rm: cannot remove 'File\\': No such file or directory
rm: cannot remove '\\-\\': No such file or directory
...
I haven't made extensive use of sed, so I have to assume I'm doing something wrong with the regular expressions. If you know what I'm doing wrong, or if you have a better solution, please tell me.
Adding backslashes before spaces protects the spaces against expansion in shell source code. But the output of a command in a command substitution does not undergo shell parsing, it only undergoes wildcard expansion and field splitting. Adding backslashes before spaces doesn't protect them against field splitting.
Adding backslashes before dashes is completely useless since it's rm that interprets dashes as special, and it doesn't interpret backslashes as special.
The output of find is ambiguous in general — file names can contain newlines, so you can't use a newline as a file name separator. Parsing the output of find is usually broken unless you're dealing with file names in a known, restricted character set, and it's often not the simplest method anyway.
find has a built-in way to execute external programs: the -exec action. There's no parsing going on, so this isn't subject to any problem with special characters in file names. (A path beginning with - could still be interpreted as an option, but all paths begin with . since that's the directory being traversed.)
find . -type f -name '*.gz' -exec rm {} +
Many find implementations (Linux, Cygwin, BSD) can delete files without invoking an external utility:
find . -type f -name '*.gz' -delete
See Why does my shell script choke on whitespace or other special characters? for more information on writing robust shell scripts.
There is no need to pipe to sed, etc. Instead, you can make use of the -exec action of find, which allows you to execute a command on each of the results.
For example, for your case this would work:
find . -type f -name '*.gz' -exec rm {} \;
which is approximately the same as:
find . -type f -name '*.gz' -exec rm {} +
The second form passes as many results as possible to a single rm invocation instead of running rm once per file, which makes it faster.
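One way to see the difference is to substitute echo for rm, e.g. (a sketch; the .gz files are whatever find happens to match):
find . -type f -name '*.gz' -exec echo rm {} \;   # one line per file
find . -type f -name '*.gz' -exec echo rm {} +    # many files per line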
From man find:
-exec command ;
Execute command; true if 0 status is returned. All following
arguments to find are taken to be arguments to the command until an
argument consisting of `;' is encountered. The string `{}' is
replaced by the current file name being processed everywhere it occurs
in the arguments to the command, not just in arguments where it is
alone, as in some versions of find. Both of these constructions
might need to be escaped (with a `\') or quoted to protect them from
expansion by the shell. See the EXAMPLES section for examples of the
use of the -exec option. The specified command is run once for
each matched file. The command is executed in the starting directory.
There are unavoidable security problems surrounding use of the -exec
action; you should use the -execdir option instead.

CHMOD 777 specific file extensions in Terminal

I am trying to chmod 777 all .doc files on my Mac. Is there a way through Terminal that I could do this?
Thanks for the responses. I thought this was the way to change permissions on Word doc files. I have 2 users on a Mac that share a folder. But when one creates a doc file, the other just has read permissions. I want both of them, or everyone, to have read and write access; it doesn't matter. I also want to go back and retroactively make all the past docs, some of which user A has read & write permissions for and some of which user B has read & write permissions for, readable and writeable by both of them. Is there another way to do this? From what I can tell, this is a Mac permissions issue, nothing in Word. I thought chmod 777 in Terminal was the way to do this.
Use find and xargs or the -exec option of find.
/home/erjablow> find . -name "*.doc" | xargs -n 3 chmod a+rwx
# Find the names of all "doc" files beneath the current directory.
# Gather them up 3 at a time, and execute
# chmod a+rwx file1 file2 file3
# on each set.
Now, find is a nasty utility, and hard to use. It can do ridiculous things, and one misspelling can ruin one's day.
Why do you want to make all doc files unprotected?
Don't.
If you really want to do this:
find / -x -name '*.doc' -type f -print0 | xargs -0 chmod 777
I have not tested this. I don't guarantee that the Mac versions of the find and xargs commands support all these options; I'm used to the GNU findutils versions that are common on Linux systems.
The find command starts at the root directory, limiting itself (-x) to the current filesystem, selects files whose names end in .doc, and prints them separated by null characters. The latter is necessary in case you have files or directories whose names contain spaces. The -x option is equivalent to the -xdev option in some other versions of find.
If you want to match files whose names end in .Doc, .DOC, etc., use -iname '*.doc' rather than -name '*.doc'.
The xargs command takes this null-delimited list of files and executes chmod 777 on each of them, breaking the list into chunks that won't overflow chmod's command line.
Let me emphasize again that:
I have not tested the above command. I don't currently have access to a MacOS system; I've just read the relevant man pages online. If you want to do a dry run, insert echo after xargs -0 (i.e. xargs -0 echo chmod 777); it will print the commands that it would execute (expect a lot of output with very long lines). Or use xargs -0 -n 1 echo chmod 777 to produce one line for each file (see the sketch after these notes).
Doing this in the first place is a bad idea. 777 gives read, write, and execute permission to every user on the system. .doc files are not executables; giving them execute permission makes no sense on a Unix-like system. 644 is likely to be a more sensible value (read/write for the owner, read-only for everyone else).
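Putting those two notes together, a more cautious sketch (still untested on macOS; same caveats as above) would be a dry run with echo first, then chmod 644:
find / -x -name '*.doc' -type f -print0 | xargs -0 echo chmod 644
find / -x -name '*.doc' -type f -print0 | xargs -0 chmod 644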
I agree with the previous response that find is nasty, and piping it to another utility can be tricky. If I had to do this, I'd specify the file type and use find's -exec action instead of xargs.
find / -x -type f -iname '*.doc' -exec chmod 777 {} \;
-x will search only the local Macintosh HD and exclude any mounted volumes
You may want to consider other criteria to refine the search, like a different starting point or excluding particular folders and/or file extensions
find / -x -not \( -path "*/some_sub-folder*" -or -path "*/some_other_sub-folder*" \) -type f \( -iname '*.doc' -or -iname '*.docx' \) -exec chmod 777 {} \;
Or if you're set on using find | xargs, stick with -print0 paired with xargs -0 as in the previous answer; plain -print output is split on whitespace by xargs, so it breaks on file names containing spaces.
Either way, back up before doing this and test carefully. Good luck, HTH.