Given the script below, I would like to avoid executing the pipeline if a file does not exist.
How can I exit the script straight away if the .txt file is not found?
ls */*.txt | grep ab1 | awk '{print $1, $1}' | sed "s/\"/\"\"/g" | xargs -L1 mv
You can check for file existence with something like:
if [[ -f x.txt ]] ; then
    echo file exists.
fi
To exit if it doesn't, something like this would suffice:
if [[ ! -f x.txt ]] ; then
    echo 'File "x.txt" is not there, aborting.'
    exit
fi
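As a minimal sketch of wiring that guard in front of your pipeline (x.txt here is just a stand-in for whatever file you actually need to exist, and exit 1 is used so callers can see that the script failed):

#!/usr/bin/env bash
# Abort early if the required file is missing ...
if [[ ! -f x.txt ]] ; then
    echo 'File "x.txt" is not there, aborting.' >&2
    exit 1
fi
# ... otherwise run the original pipeline unchanged.
ls */*.txt | grep ab1 | awk '{print $1, $1}' | sed "s/\"/\"\"/g" | xargs -L1 mv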
The -f <file> test is only one of the many conditional expressions you can use. If you look at the bash man page under CONDITIONAL EXPRESSIONS, you'll see a whole host of them.
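For instance, two of the other standard tests (the directory and file names here are purely illustrative):

if [[ -d backups ]] ; then        # true if "backups" exists and is a directory
    echo 'backups is a directory.'
fi
if [[ -s results.txt ]] ; then    # true if "results.txt" exists and is non-empty
    echo 'results.txt has content.'
fi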
If (as stated in a question update) you wish to check whether a wildcard results in any files, you can simply expand it, throwing away the errors. If there are none, you'll end up with an empty string, which can be detected with -z:
if [[ -z "$(ls -1 */*.txt 2>/dev/null | grep ab1)" ]] ; then
    echo 'There are no "*/*.txt" files matching "ab1".'
    exit
fi
Note that I've used -1 to force one file per line, even though Linux ls does that by default if the output device is not a terminal (from memory). That's just in case you try this on a machine that doesn't force one per line in that situation.
Keep in mind, however, that if you have spaces in your file names, using ls and then awk to extract column 1 is not going to work too well. For example, the file abc ab1.txt will result in the extraction of only the abc bit.
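A quick way to see that failure mode for yourself (the file name is made up for the demonstration):

touch 'abc ab1.txt'
ls -1 *.txt | grep ab1 | awk '{print $1}'    # prints "abc", not "abc ab1.txt"

Anything built from that output (such as the mv in your pipeline) would then operate on a file called abc, which doesn't exist.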
Using find with -print0, combined with xargs with -0, is the usual way to properly process files which may have "special" characters in them. There are many other options you can give to find to ensure only the files required are processed, such as -maxdepth to limit how far down the directory tree you go, and -name to properly filter file names.
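As a sketch of what that might look like here, assuming ab1 appears in the file name itself, and keeping echo in place as a dry run (remove it once the output looks right). The doubled {} simply mirrors your awk '{print $1, $1}', so substitute the real destination as needed:

# Find *.txt files one level down whose names contain "ab1" and hand them
# to mv NUL-separated, so spaces and quotes in the names survive intact.
find . -mindepth 2 -maxdepth 2 -type f -name '*ab1*.txt' -print0 |
    xargs -0 -I{} echo mv {} {}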
However, if you know that you will never have these types of files, it's probably okay to use the ls solution; just make sure you're comfortable with its shortcomings.