I'm working with a command-line utility that requires passing the name of a file to write output to, e.g.
foo -o output.txt
The only thing it writes to stdout is a message that indicates that it ran successfully. I'd like to be able to pipe everything that is written to output.txt to another command-line utility. My motivation is that output.txt will end up being a 40 GB file that I don't need to keep, and I'd rather pipe the streams than work on massive files in a stepwise manner.
Is there any way in this scenario to pipe the real output (i.e. what goes to output.txt) to another command? Can I somehow magically pass stdout as the file argument?
Named pipes work fine, but bash process substitution gives you a nicer, more direct syntax, with the added benefit of not leaving behind a permanent named pipe that must later be deleted (behind the scenes, bash wires the substitution up with a temporary named pipe or a /dev/fd file descriptor):
foo -o >(other command)
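For comparison, here is a minimal sketch of the named-pipe approach mentioned above; foo stands in for your utility and wc -c is a placeholder consumer, so substitute your real commands:

# Sketch of the equivalent named-pipe approach. "foo" and "wc -c"
# are placeholders; substitute your real producer and consumer.
mkfifo /tmp/foo_output        # create the named pipe
wc -c < /tmp/foo_output &     # start the consumer reading from it
foo -o /tmp/foo_output        # foo writes into the pipe, not a real file
wait                          # wait for the backgrounded consumer to finish
rm /tmp/foo_output            # the manual cleanup that process substitution avoids

This makes the trade-off concrete: the named pipe works, but you have to create it, background the reader, and delete it afterwards, all of which process substitution handles for you.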
Also, if you want to pipe the output to another command and save a copy to a file at the same time, you can do this:
foo -o >(tee output.txt) | other command
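One caveat with that last form: foo's own success message on stdout gets piped to the downstream command along with the data coming out of tee. If that interleaving is a problem, one option is to move the consumer inside the substitution and discard foo's stdout; this is a sketch, with other_command standing in for your real consumer:

# Save output.txt via tee, feed the data to the consumer, and send foo's
# success message to /dev/null so it doesn't mix into the data stream.
foo -o >(tee output.txt | other_command) > /dev/null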