When running some tests, I need to run a series of commands. It would be extremely useful to me, and save me a lot of time, if there was a way to do all of these things:
- Run the command I need to run
- Redirect all the output from the command to a specified file
- Include the original command in the specified file
- Print the output from the original command in the terminal
People have suggested using tee to me which does a great job of printing to terminal as well as sending to a file but doesn't include the original command. What I'd like to end up with is a file where the first line is the command I ran, and then below that is the output from the command.
Someone suggested this:
echo "ls -l" | xargs -I{} bash -c "echo >> output.file; eval {} >> output.file"
But this neither prints the output in the terminal nor includes the original command in the file.
I'd appreciate any ideas.
It's tee you're searching for.
ls -l | tee outfile
prints the output of ls -l to stdout (i.e. the terminal) and saves it in the file outfile at the same time. But: it writes the command name neither to stdout nor to the file. To achieve that, just echo the command name before running the command and pipe both outputs to tee:
( echo "ls -l" && ls -l ) | tee outfile
That's cumbersome to type, so why not define a function?
both(){ ( echo "$@" && "$@" ) | tee outfile ;}
After that you can just run
both ls -l
to get the desired result. Put the function in your ~/.bashrc to have it defined in every new terminal.
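To see what this produces, here's a quick sketch using echo hello as a stand-in command (the file name outfile is the one from the function above):

```shell
# The "both" function from above: echo the command line, run it,
# and send both through tee so they reach the terminal and the file.
both(){ ( echo "$@" && "$@" ) | tee outfile ;}

both echo hello
# Terminal shows:
#   echo hello
#   hello
# and outfile contains the same two lines.
```

The first line of outfile is the command as typed; everything below is its output, which is exactly the file layout asked for.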
If you want to specify the output file as the first argument, as in
both output ls -l
instead make it:
both(){ ( echo "${@:2}" && "${@:2}" ) | tee "$1" ;}
If you don't want the output file to be overwritten but rather appended to, add the -a option to tee.
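Combining the two variants, a version that takes the file as the first argument and appends on every call (via tee -a) might look like this; bear in mind "${@:2}" is a bash-specific expansion meaning "all arguments after the first":

```shell
# Bash sketch: first argument is the log file, the rest is the command.
# tee -a appends, so repeated runs accumulate in the same file.
both(){ ( echo "${@:2}" && "${@:2}" ) | tee -a "$1" ;}

both log.txt echo first
both log.txt echo second
# log.txt now contains four lines:
#   echo first
#   first
#   echo second
#   second
```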