How do I apply a shell command to each line of a command's output?

Suppose I have some output from a command (such as ls -1):

a
b
c
d
e
...

I want to apply a command (say echo) to each one, in turn. E.g.

echo a
echo b
echo c
echo d
echo e
...

What's the easiest way to do that in bash?

It's probably easiest to use xargs. In your case:

ls -1 | xargs -L1 echo

The -L1 option makes xargs invoke the command once per input line. From the man page of xargs:

-L number
    Call utility for every number non-empty lines read. 
    A line ending with a space continues to the next non-empty line. [...]
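To illustrate what -L1 changes (using printf here to stand in for the command producing the output):

```shell
# Without -L1, xargs packs as many arguments as possible into one call:
printf '%s\n' a b c | xargs echo        # one echo invocation: "a b c"

# With -L1, the command is invoked once per input line:
printf '%s\n' a b c | xargs -L1 echo    # three invocations: "a", "b", "c"
```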

You can process each line with a read loop:

ls -1 | while read -r line ; do echo "$line" ; done

(Use read -r and quote "$line" so backslashes and whitespace survive intact.)
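A quick check of the read-loop approach (printf stands in for the command's output here):

```shell
# Lines with embedded spaces stay intact when read -r and "$line" are used:
printf 'a b\nc\n' | while read -r line; do
    echo "[$line]"
done
# -> [a b]
# -> [c]
```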

Or you can pipe the output to sed to build a command from each line. Note that this only prints the resulting command lines; pipe them into sh to actually execute them:

ls -1 | sed 's/^\(.*\)$/echo \1/'
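A small sketch of how the sed-built commands are printed versus executed (printf stands in for ls -1):

```shell
# sed only produces the command text:
printf '%s\n' a b | sed 's/^\(.*\)$/echo \1/'        # prints: echo a / echo b

# Piping that text into sh runs the commands:
printf '%s\n' a b | sed 's/^\(.*\)$/echo \1/' | sh   # prints: a / b
```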

You can use a for loop:

for file in * ; do
   echo "$file"
done
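One advantage of the glob loop worth noting: the shell hands each filename to the loop as a single word, so names containing spaces are handled safely (unlike iterating over the output of ls). A sketch using a throwaway directory:

```shell
# Create a scratch directory with a space in one filename:
dir=$(mktemp -d)
touch "$dir/a b.txt" "$dir/c.txt"

# The glob yields each name whole, spaces and all:
for file in "$dir"/*; do
    echo "$file"
done

rm -r "$dir"
```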

Note that if the command in question accepts multiple arguments, using xargs is almost always more efficient, since it spawns the utility once (or a handful of times) rather than once per line.
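For example (a small sketch, with seq standing in for the real command):

```shell
# All five lines are passed as arguments to a single echo invocation:
seq 5 | xargs echo      # prints: "1 2 3 4 5"
```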

You actually can use sed to do it, provided it is GNU sed.

... | sed 's/match/command \0/e'

How it works:

  1. Substitute match with command match
  2. On substitution execute command
  3. Replace substituted line with command output.
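A minimal sketch of those three steps, assuming GNU sed (the e flag is a GNU extension; BSD sed does not support it):

```shell
# Each line is rewritten to "echo <line>", executed, and replaced
# by the command's output:
printf 'hello\nworld\n' | sed 's/.*/echo &/e'
```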

for s in $(cmd); do echo "$s"; done

Note that this iterates over whitespace-separated words, not lines, so items containing spaces will be split.

If cmd has a large output:

cmd | xargs -L1 echo
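Unlike the word-splitting for loop above, xargs -L1 keeps each line's words together (printf stands in for cmd here):

```shell
# One invocation per line, with the whole line passed as arguments:
printf 'one two\nthree\n' | xargs -L1 echo   # -> "one two", then "three"
```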

Comments
  • ls -1 may be an example here but it is important to remember that it is not good to parse the output of ls. See: mywiki.wooledge.org/ParsingLs
  • ls automatically does -1 in a pipe.
  • @Dennis, doesn't look like it: ls | xargs -L2 echo and ls -1 | xargs -L2 echo give two different outputs. The former being all on one line.
  • @Alex: I get the same output.
  • xargs can run only executable files not shell functions or shell built-in commands. For the former the best solution is probably the one with read in a loop.
  • I wish this answer included an explanation of what -L1 is for.
  • The sed command doesn't seem to work: sh: cho: not found a sh: cho: not found Looks like it's taking the e in echo to be a sed command or something.
  • +1 for the while loop. cmd1 | while read line; do cmd2 $line; done. Or while read line; do cmd2 $line; done < <(cmd1) which doesn't create a subshell. This is the simplified version of your sed command: sed 's/.*/echo &/'
  • @Alex: change the double quotes to single quotes.
  • Quote the "$line" in the while loop, in order to avoid word splitting.
  • Try using read -r line to prevent read messing with escaped characters. For example echo '"a \"nested\" quote"' | while read line; do echo "$line"; done gives "a "nested" quote", which has lost its escaping. If we do echo '"a \"nested\" quote"' | while read -r line; do echo "$line"; done we get "a \"nested\" quote" as expected. See wiki.bash-hackers.org/commands/builtin/read
  • It's worth describing the proper/safe use of xargs, ie. printf '%s\0' * | xargs -0 ... -- otherwise, it's quite unsafe with filenames with whitespace, quotes, etc.
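A quick sketch of that NUL-delimited pattern (with made-up strings in place of real filenames):

```shell
# NUL delimiters make any string safe to pass through xargs,
# including ones containing spaces and quotes:
printf '%s\0' 'a b' "it's" | xargs -0 -n1 echo
# -> a b
# -> it's
```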