So I want to change the name of a specific folder recursively. However, that folder isn't always at the same depth or position. I want to change the folder name from variables to constant.
The variables folders might be located at depth 2, 3, 4, 5, 6, and so on; I do not know the depth in advance.
It might be
/var/me/variables/.../.../
or
/var/me/..../..../.../variables/...
or
/var/variables/..../variables/.../../variables/
What I want again is, WHEREVER there is a folder called variables, change its name to constant
I tried the following command, but it doesn't work:
find var -type d -exec echo `echo "{}" | sed 's/variables/constant/g'` \;
any help would be appreciated.
Thank you!!
This is fun! Here are my two cents:
find . -depth -type d -name variables -execdir mv -T {} constant \;
Rationale:
-depth avoids changing a path find later descends into; probably can be omitted.
-execdir avoids the need to play games with entire paths, so we can operate only on the directory basename
passing the -T option to mv makes it bail out if a directory named constant already exists
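As a quick sanity check, the same command can be dry-run by prefixing mv with echo, which just prints the renames that would happen (a sketch, assuming the tree lives under var as in the question):

# Dry run: print the rename commands instead of executing them
find var -depth -type d -name variables -execdir echo mv -T {} constant \;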
You actually need to call mv, not echo, and the sed substitution has to run for each found path inside the -exec, not once via backticks before find even starts:
find var -type d -name "*variables*" -exec sh -c 'mv "$1" "$(printf "%s" "$1" | sed "s/variables/constant/g")"' _ {} \;
The above assumes you have directories with the string "variables" somewhere in their names. If you are after directories which are called precisely "variables", then it can be a bit simpler; with -execdir the rename happens inside each directory's parent:
find var -depth -type d -name variables -execdir mv {} constant \;
Try this find command:
find var -name "*variables*" -type d -exec bash -c 'mv "$1" "${1//variables/constant}"' - '{}' \;
Try using find with rename (the Perl version that takes an s/// expression):
find /var -type d -name 'variables' -exec rename 's/variables/constant/' {} \;
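If your rename is the Perl one, a dry run with -n and -execdir keeps the substitution confined to the directory's own name rather than the full path (a sketch):

# Dry run: -n prints the planned renames without performing them
find /var -depth -type d -name 'variables' -execdir rename -n 's/variables/constant/' {} \;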
Related
I've been searching for a long time and can't find an answer that works. I have a list with partial filenames (the first few letters of the filenames). If I place the file names individually as follows it works:
find ~/directory/to/search -name "filename*" -print -exec cp '{}' ~/directory/to/copyto \;
If I try to include the list in this scenario it does not:
cat ~/directory/List.txt | while read line
do
echo "Text read from file - $line"
find ~/directory/to/search -name "$line*" -type f
done
neither does this:
cat ~/directory/List.txt | while read line
do
echo "Text read from file - $line"
find ~/directory/to/search -name "$line&*" -type f
done
Ultimately, I'd like to add:
-exec cp '{}' ~/directory/to/copy/to \;
And copy over all files matching the find criteria.
I've tried using grep, but the files are huge so it would take forever. I've tried all sorts of combinations of find, xargs, cp, grep, and regex from previous searches, with no luck.
Is the only solution to write a long script with a bunch of if-then statements? I've been using Linux, but it would be cool to use it on a Mac as well.
Here is a crude attempt at getting away with a single find invocation.
predicates=()
or=''
while read -r line; do
    predicates+=($or -name "$line*")
    or="-o"
done < ~/directory/List.txt

find ~/directory/to/search -type f \( "${predicates[@]}" \) \
    -exec cp -t ~/directory/to/copy/to {} +
The array functionality requires an extended shell (Bash, ksh, etc.); it won't work with a plain /bin/sh.
cp -t is a GNU extension; if you don't have it, fall back to your original -exec cp {} dir \;, though it will be less efficient. Some old versions of find also don't support -exec ... +.
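For reference, here is a minimal sketch of that fallback, using the same predicates array but one cp invocation per matched file:

# Portable fallback for systems without GNU cp's -t option
find ~/directory/to/search -type f \( "${predicates[@]}" \) \
    -exec cp '{}' ~/directory/to/copy/to \;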
I have a list of 20 files; 10 of them already have 1970-01-01- at the beginning of the name and 10 do not (the remaining ones all start with a lowercase letter).
My task was to prepend the epoch date to the files that do not already have it. Using bash, the code below works, but I could not solve it with a regular expression, for example using rename; I had to extract the basename and then mv. A more elegant solution would use just one pipe instead of two.
Works
find ./ -regex './[a-z].*' | xargs -I {} basename {} | xargs -I {} mv {} 1970-01-01-{}
Hence I'm looking for a solution with just one xargs or -exec.
You can just use a single rename command:
rename -n 's/^([a-z])/1970-01-01-$1/' *
Assuming you're operating on all the files present in the current directory.
Note that the -n flag (dry run) will only show the intended actions of the rename command but won't actually rename any files.
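Once the dry run looks right, the same command without -n performs the renames:

# Actually prepend the prefix (no -n)
rename 's/^([a-z])/1970-01-01-$1/' *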
If you want to combine it with find, then use -execdir (the names arrive as ./name, so the prefix is inserted after the leading ./):
find . -maxdepth 1 -type f -name '[a-z]*' -execdir rename -n 's:^\./:./1970-01-01-:' {} +
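If the Perl rename isn't installed, a plain-bash sketch under the same assumption (files sitting in the current directory) does the same job:

# Prefix every file whose name starts with a lowercase letter; -i asks before overwriting
for f in [a-z]*; do
    [ -f "$f" ] && mv -i -- "$f" "1970-01-01-$f"
done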
I always prefer readable code over short code.
r() {
    base=$(basename "$1")
    dir=$(dirname "$1")
    if [[ "$base" =~ ^1970-01-01- ]]
    then
        : "ignore, already has correct prefix"
    else
        echo mv "$1" "$dir/1970-01-01-$base"
    fi
}
export -f r
find . -type f -exec bash -c 'r "$1"' _ {} \;
This also just prints out what would have been done (for testing). Remove the echo before the mv to do the real thing.
Mind that the mv will overwrite existing files (if there is a ./a/b/c and an ./a/b/1970-01-01-c already). Use the -i option of mv to be safe from this.
I want to change the file names of a certain type of file. It should run recursively.
I have almost got it, but I don't know how to work with the parameter {} (the absolute path).
find $PWD -type f -name '*.jpg' -exec echo " {} " \;
For example, I want to change the extensions using regular expressions and not the rename command. I sometimes need the new name to pass it to a function, so rename is not applicable here. It should be possible to work with the parameter like in a for loop with the variable $each:
for each in /* do echo "${each\./\}.png" done
How can I apply regex on parameter {}, like here: "${each \ . / \ }.png"?
I found a workaround for basename misbehaving when used with find:
find $PWD -type f -name '*.jpg' -exec sh -c 'echo "$(basename "$0" .jpg).png"' {} \;
sh forces the following command to be interpreted by your /bin/sh. The -c option takes the command as a string; the arguments that follow become the positional parameters, so here $0 is the path that find substitutes for {}.
If you have the following files:
/home/username/image1.jpg
/home/username/Documents/image2.jpg
This will output:
image1.png
image2.png
EDIT
If you want to keep the full path, you can use this:
find $PWD -type f -name '*.jpg' -exec sh -c 'echo "${0%%.jpg}".png' {} \;
This will output:
/home/username/image1.png
/home/username/Documents/image2.png
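To go from printing the new name to actually renaming the files, the same parameter expansion can feed mv directly (a sketch built on the command above; try it with echo mv first):

# Rename every .jpg to .png in place, keeping the directory part
find "$PWD" -type f -name '*.jpg' -exec sh -c 'mv "$0" "${0%.jpg}.png"' {} \;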
I need to go recursively through directories. The first argument must be the directory to start from, and the second argument is a regex which describes the name of the file.
ex. ./myscript.sh directory "regex"
While the script recursively goes through directories and files, it must use wc -l to count lines in the files whose names match the regex.
How can I use find with -exec to do that? Or is there maybe some other way to do it? Please help.
Thanks
Yes, you can use find:
$ find DIR -iname "regex" -type f -exec wc -l '{}' \;
Or, if you want to count the total number of lines in all files:
$ find DIR -iname "regex" -type f -exec wc -l '{}' \; | awk '{ SUM += $1 } END { print SUM }'
Your script would then look like:
#!/bin/bash
# $1 - name of the directory - first argument
# $2 - regex - second argument
if [ $# -lt 2 ]; then
    echo "Usage: ./myscript.sh DIR \"REGEX\""
    exit 1
fi

find "$1" -iname "$2" -type f -exec wc -l '{}' \;
Edit: if you need fancier regular expressions, use -regextype posix-extended and -regex instead of -iname, as noted by @sudo_O in his answer.
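For reference, a minimal sketch of that -regex variant (GNU find; -regex matches the whole path, so the pattern needs a leading .*/):

# Count lines in files whose names match an extended regex passed as the second argument
find "$1" -regextype posix-extended -regex ".*/$2" -type f -exec wc -l '{}' \;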
The thing is, I have a custom bash script that performs specific actions on a specified folder, with the syntax being:
myscript $1 $2 $3
Where $1 is the name of the folder and $2, $3 other necessary numeric arguments for the job.
Let's say I have a directory with many folders, for instance:
a
b
c
d
ref-1
ref-2
ref-3
I only need to execute the script on the folders not starting with ref-. As it would cost me a lot of effort to run the script manually for each folder, I want to figure out how I can use regular expressions (or inverse matching like "extglob") to limit it to just the desired folders, and for each one execute the script with the given $2 and $3 arguments as well.
You can use $(ls -d */ | grep -v ^ref-), for instance.
This should do it for you -
The -type d option limits the find search to directories. Providing a pattern with -not -name excludes the directories that match it. The -exec option runs your shell script for every directory that has been found; {} is replaced by each directory's path and \; terminates the command, so the script runs once per directory. -depth 1 (BSD find; use -maxdepth 1 with GNU find) restricts the matches to entries one level below the starting point.
find . -not -name ref\* -not -name .\* -type d -depth 1 -exec ./script.sh {} \;
UPDATE:
My bad, I forgot about handling the two user-driven parameters. I may not be an expert, but the easiest way I can think of is sending the find result to a file and then running a loop over that file.
Something like this -
[jaypal~/Temp]$ find . -not -name ref\* -not -name .\* -type d -depth 1 > dirlist
[jaypal~/Temp]$ while read -r i; do ./script.sh "$i" param1 param2; done < dirlist
Alternate way to pass the user-driven parameters (as suggested by @dma_k):
[jaypal~/Temp]$ find . -not -name ref\* -not -name .\* -type d -depth 1 | xargs -I {} ./script.sh {} param1 param2
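A variant that skips the temporary file and the pipe entirely, letting find invoke the script itself (GNU find syntax shown; with BSD find keep -depth 1 as above):

# Run the script once per top-level non-ref- directory, passing the two extra arguments
find . -maxdepth 1 -type d -not -name 'ref*' -not -name '.*' -exec ./script.sh {} param1 param2 \;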
This might work for you:
ls !(ref-?)
or
ls !(ref*)
or
ls +({a..d})
or just
ls ?
EDIT:
The script could then be run so:
for first in !(ref-?); do script.sh $first $second $third; done
Where second and third are the numeric arguments.
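Note that the !( ) patterns above rely on bash's extglob shell option, which may need to be enabled first:

# Enable extended globbing so !(pattern) works, then run the loop
shopt -s extglob
for first in !(ref-?); do script.sh "$first" "$second" "$third"; done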
How about ls | grep yourRegexHere
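For completeness, a sketch of how such a filtered listing could drive the script together with its two numeric arguments (arg2 and arg3 are placeholders; assumes folder names contain no newlines):

# Feed each non-ref- folder to the script along with the two numeric arguments
ls -d */ | grep -v '^ref-' | while read -r dir; do
    ./myscript "$dir" "$arg2" "$arg3"
done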