I wrote a shell script, and a portion of it failed with a "too many arguments" error:
if [ -f ABC_DEF_*.* ]; then
What I want to do is test whether any files match the pattern, but the shell complains that there are too many arguments. In the directory there are 20 such files.
Could it be that the shell expanded the wildcard and turned ABC_DEF_*.* into a list of 20 filenames?
If so, how can I resolve this?
The problem is that you cannot use [ -f <more than one argument> ]. It doesn't even make sense: should it return true when all the files exist, or when at least one file exists?
If you want to test for existence, you can count the matches: NUM=$(ls <pattern> | wc -l)
You can use [ -f <filename with wildcard> ]: it returns true when exactly one file matches. The caveat is that when more than one file matches, the expansion gives the test too many arguments, so it throws the 'too many arguments' error and the statements in the if block are not executed. Negating the test ([ ! -f <filename with wildcard> ]) does not avoid the error when multiple files match.
Having said that, your solution works for the OP's desired behavior (and doesn't reverse the logic when the error occurs).
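As an alternative to counting ls output (which is fragile with unusual filenames), a loop over the glob itself handles any number of matches. A minimal sketch; the setup lines just create sample files in a scratch directory so it runs standalone:

```shell
# Demonstration setup (hypothetical names): create two matching files.
cd "$(mktemp -d)" || exit 1
touch ABC_DEF_1.txt ABC_DEF_2.txt

# The actual test: iterate over the glob and check the first match.
# Works for any number of matches, unlike [ -f ABC_DEF_*.* ].
found=false
for f in ABC_DEF_*.*; do
    if [ -f "$f" ]; then
        found=true
        break
    fi
done
echo "found=$found"
```

If nothing matches, the pattern stays literal, the -f test fails, and found remains false, so no special-casing is needed.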
I would like to automate zsh's installation and configuration, and I am currently unable to check whether a file whose name begins with '~' exists and is not empty.
zsh_rcfile='~/.zshrc'
set -- $zsh_rcfile
if [ -s "${zsh_rcfile}" ]; then
printf "Zsh is already configured."
fi
When I execute the script in the terminal, no error is returned, but no output is produced either.
I tried hard-coding the pathname and using the variable without the curly braces, but the result is the same.
I also tried not using the set command (which prevents nasty surprises with empty names or names beginning with a dash).
The 'if' statement works without the tilde symbol (i.e. ~), but that would be an inferior solution, as I need to automate other processes that traverse the system tree (not just the 'home' partition).
Can anyone help me achieve the comparison against a path beginning with '~'?
N.B.: I'm using zsh 5.7.1 (x86_64-debian-linux-gnu).
Tilde expansion isn't performed on the result of a parameter expansion. You want to leave the ~ unquoted so that it is expanded when you define the parameter.
% zsh_rcfile='~/.zshrc'
% print $zsh_rcfile
~/.zshrc
% zsh_rcfile=~/.zshrc
% print $zsh_rcfile
/Users/<user>/.zshrc
The -s operator returns false if the file doesn't exist in the first place (which makes sense, since a nonexistent file is trivially empty).
-s file
true if file exists and has size greater than zero.
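Putting the answer together, the original snippet works once the tilde is left unquoted in the assignment. A sketch; whether the message actually prints depends on ~/.zshrc existing and being non-empty:

```shell
# Corrected snippet from the question: the tilde is unquoted in the
# assignment, so it expands to $HOME at that point.
zsh_rcfile=~/.zshrc
if [ -s "$zsh_rcfile" ]; then
    printf 'Zsh is already configured.\n'
fi
```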
I think I have uncovered an error in grep. If I run this grep statement against a db log on the command line it runs fine.
grep "Query Executed in [[:digit:]]\{5\}.\?" db.log
I get this result:
Query Executed in 19699.188 ms;"select distinct * from /xyztable.....
when I run it in a script
LONG_QUERY=`grep "Query Executed in [[:digit:]]\{5\}.\?" db.log`
the asterisk in the result is replaced with a list of all files in the current directory.
echo $LONG_QUERY
Result:
Query Executed in 19699.188 ms; "select distinct <list of files in
current directory> from /xyztable.....
Has anyone seen this behavior?
This is not an error in grep. This is an error in your understanding of how scripts are interpreted.
If I write in a script:
echo *
I will get a list of filenames, because an unquoted, unescaped, asterisk is interpreted by the shell (not grep, but /bin/bash or /bin/sh or whatever shell you use) as a request to substitute filenames matching the pattern '*', which is to say all of them.
If I write in a script:
echo "*"
I will get a single '*', because it was in a quoted string.
If I write:
STAR="*"
echo $STAR
I will get filenames again, because I quoted the star while assigning it to a variable, but then when I substituted the variable into the command it became unquoted.
If I write:
STAR="*"
echo "$STAR"
I will get a single star, because double quotes allow variable interpolation but prevent the interpolated result from being glob-expanded.
You are using backquotes - that is, ` characters - around a command. That captures the output of the command into a variable.
I would suggest that if you are going to be echoing the results of the command, and little else, you should just redirect the results into a file. (After all, what are you going to do when your LONG_QUERY contains 10,000 lines of output because your log file got really full?)
Barring that, at the very least do echo "$LONG_QUERY" (in double quotes).
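To make that concrete, here is a self-contained sketch; the log line and filenames are hypothetical stand-ins for the OP's db.log:

```shell
# Reproduce the situation with a one-line log file in a scratch directory.
cd "$(mktemp -d)" || exit 1
printf '%s\n' 'Query Executed in 19699.188 ms;"select distinct * from t' > db.log
touch some_file.txt   # a file that an unquoted * would expand to

LONG_QUERY=$(grep "Query Executed in [[:digit:]]\{5\}" db.log)
echo "$LONG_QUERY"    # quoted: the * survives intact
```

With the unquoted `echo $LONG_QUERY`, the * in the captured line would be glob-expanded into some_file.txt.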
Can a bash/shell expert help me with this? Each time I use Adobe's PDF software to split a large PDF file (say its name is X.pdf) into separate pages, where each page becomes one PDF file, it creates files with this pattern:
"X 1.pdf"
"X 2.pdf"
"X 3.pdf" etc...
The file name "X" above is the original file name, which can be anything. It then adds one space after the name, then the page number. Page numbers always start from 1 and up to how many pages. There is no option in adobe PDF to change this.
I need to run a shell command to simply remove/strip out all the "X " part, and just leave the digits, like this
1.pdf
2.pdf
3.pdf
....
100.pdf ...etc..
Not being good at pattern matching, I'm not sure what regular expression I need.
I know I need something like
for i in *.pdf; do mv "$i" ........; done
And it is the ....... part I do not know how to do.
This only needs to run on Linux/Unix system.
Use sed:
for i in *.pdf; do mv "$i" "$(sed 's/.*[[:blank:]]//' <<< "$i")"; done
It would also be simple with rename:
rename 's/.*\s//' *.pdf
You can remove everything up to (including) the last space in the variable with this:
${i##* }
That's "star space" after the double hash, meaning "anything followed by space". ${i#* } would remove up to the first space.
So run this to check:
for i in *.pdf; do echo mv -i -- "$i" "${i##* }" ; done
and remove the echo if it looks good. The -i suggested by Gordon Davisson will prompt you before overwriting, and -- signifies end of options, which prevents things from blowing up if you ever have filenames starting with -.
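A quick illustration of the expansion on a hypothetical filename:

```shell
# "My Long Report 12.pdf" stands in for one of the splitter's output files.
i='My Long Report 12.pdf'
echo "${i##* }"   # longest prefix matching "* " is removed -> 12.pdf
```

Because ## matches the longest prefix, every word before the final space is stripped, which is exactly what the "X " pattern needs.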
If you just want to do bulk renaming of files (or directories) and don't mind using external tools, then here's mine: rnm
The command to do what you want would be:
rnm -rs '/.*\s//' *.pdf
.*\s selects everything up to and including the last whitespace and replaces it with an empty string.
Note:
It doesn't overwrite any existing files (throws warning if it finds an existing file with the target name).
And this operation is failsafe. You can get back the changes made by last rnm command with rnm -u.
Here's the documentation for rnm.
This question pertains to renaming files in a directory on a Linux system wherein the affected files appear in this general format:
index.html?p=155
index.html?page_id=10
index.html?author=2&paged=5
index.html?feed=rss2&tag=search-engine
index.html?tag=social-media
Might there be a shell level "rename" command that I can use to replace the question marks (?) with an underscore (_) in each file within a directory?
Thank you in advance for any advice or information!
I would prefer the rename command myself, though sometimes a roll-your-own for loop can be more targeted.
Note: rename takes a Perl substitution expression (sed-like s/// syntax) as its first argument and filenames as the remaining arguments. The proper call to use would be:
rename 's/\?/_/' index*
because an unescaped ? in the regular expression means 0 or 1 of the preceding character, so it must be written as \?.
This is also easier to toss into a find command which can operate recursively, etc:
find . -name index.html* -exec rename 's/\?/_/' {} +
for file in index.html\?*; do
new=${file/\?/_} # Substitute underscore for ?
mv "$file" "$new" # Rename the file
done
See the Parameter Expansion section of the bash man page for information on the substitution syntax used.
You can use the rename command.
rename '?' '_' *
The first argument is the expression you want to replace, the second is the string to replace it with, and the final argument is the selection of files to apply it to (all in the current directory, in this case).
See the man page for more details: http://ss64.com/bash/rename.html
I'm writing a program, foo, in C++. It's typically invoked on the command line like this:
foo *.txt
My main() receives the arguments in the normal way. On many systems, argv[1] is literally *.txt, and I have to call system routines to do the wildcard expansion. On Unix systems, however, the shell expands the wildcard before invoking my program, and all of the matching filenames will be in argv.
Suppose I wanted to add a switch to foo that causes it to recurse into subdirectories.
foo -a *.txt
would process all text files in the current directory and all of its subdirectories.
I don't see how this is done, since, by the time my program gets a chance to see the -a, the shell has already done the expansion and the user's *.txt input is lost. Yet there are common Unix programs that work this way. How do they do it?
In Unix land, how can I control the wildcard expansion?
(Recursing through subdirectories is just one example. Ideally, I'm trying to understand the general solution to controlling the wildcard expansion.)
Your program has no influence over the shell's command-line expansion. Which program will be called is determined only after all the expansion is done, so it's already too late to change anything about the expansion programmatically.
The user calling your program, on the other hand, can write whatever command line they like. Shells allow you to easily prevent wildcard expansion, usually by putting the argument in single quotes:
program -a '*.txt'
If your program is called like that it will receive two parameters -a and *.txt.
On Unix, you should just leave it to the user to manually prevent wildcard expansion if it is not desired.
As the other answers said, the shell does the wildcard expansion - and you stop it from doing so by enclosing arguments in quotes.
Note that options -R and -r are usually used to indicate recursive - see cp, ls, etc for examples.
Assuming you organize things appropriately so that wildcards are passed to your program as wildcards and you want to do recursion, then POSIX provides routines to help:
nftw - file tree walk (recursive access).
fnmatch, glob, wordexp - to do filename matching and expansion
There is also ftw, which is very similar to nftw but it is marked 'obsolescent' so new code should not use it.
Adrian asked:
But I can say ls -R *.txt without single quotes and get a recursive listing. How does that work?
To adapt the question to a convenient location on my computer, let's review:
$ ls -F | grep '^m'
makefile
mapmain.pl
minimac.group
minimac.passwd
minimac_13.terminal
mkmax.sql.bz2
mte/
$ ls -R1 m*
makefile
mapmain.pl
minimac.group
minimac.passwd
minimac_13.terminal
mkmax.sql.bz2
mte:
multithread.ec
multithread.ec.original
multithread2.ec
$
So, I have a sub-directory 'mte' that contains three files. And I have six files with names that start 'm'.
When I type 'ls -R1 m*', the shell notes the metacharacter '*' and uses its equivalent of glob() or wordexp() to expand that into the list of names:
makefile
mapmain.pl
minimac.group
minimac.passwd
minimac_13.terminal
mkmax.sql.bz2
mte
Then the shell arranges to run '/bin/ls' with 9 arguments (the program name, the option -R1, and the 7 file names), plus a terminating null pointer.
The ls command notes the options (recursive and single-column output), and gets to work.
The first 6 names (as it happens) are simple files, so there is nothing recursive to do.
The last name is a directory, so ls prints its name and its contents, invoking its equivalent of nftw() to do the job.
At this point, it is done.
This uncontrived example doesn't show what happens when there are multiple directories, and so the description above over-simplifies the processing.
Specifically, ls processes the non-directory names first, and then processes the directory names in alphabetic order (by default), and does a depth-first scan of each directory.
foo -a '*.txt'
Part of the shell's job (on Unix) is to expand command line wildcard arguments. You prevent this with quotes.
Also, on Unix systems, the "find" command does what you want:
find . -name '*.txt'
will list all files recursively from the current directory down.
Thus, you could do
foo `find . -name '*.txt'`
I wanted to point out another way to turn off wildcard expansion: you can tell your shell to stop expanding wildcards with the noglob option.
With bash use set -o noglob:
> touch a b c
> echo *
a b c
> set -o noglob
> echo *
*
And with csh, use set noglob:
> echo *
a b c
> set noglob
> echo *
*
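In POSIX sh (and bash), set -f is an equivalent short spelling, with set +f to turn globbing back on:

```shell
# Scratch directory with three files, so the glob has something to match.
cd "$(mktemp -d)" || exit 1
touch a b c

set -f
echo *        # globbing off: prints a literal *
set +f
echo *        # globbing back on: prints a b c
```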