270

I am trying to get bash to process data from stdin that gets piped into it, but no luck. What I mean is that none of the following work:

echo "hello world" | test=($(< /dev/stdin)); echo test=$test
test=
echo "hello world" | read test; echo test=$test
test=
echo "hello world" | test=`cat`; echo test=$test
test=

where I want the output to be test=hello world. I've tried putting "" quotes around "$test", but that doesn't work either.

7
  • 1
Your example echo "hello world" | read test; echo test=$test worked fine for me; result: test=hello world. What environment are you running this under? I'm using bash 4.2. Commented Jul 21, 2012 at 14:43
  • Do you want multiple lines in a single read? Your example only shows one line, but the problem description is unclear. Commented Oct 4, 2012 at 14:08
  • 3
@alex.pilon, I'm running Bash version 4.2.25, and his example does not work for me either. Maybe it's a matter of a Bash runtime option or environment variable? I've noticed the example does not work with sh either, so maybe Bash is trying to be compatible with sh? Commented Jul 15, 2014 at 23:45
  • 2
    @Hibou57 - I tried this again in bash 4.3.25 and it no longer works. My memory is fuzzy on this and I'm not sure what I may have done to get it to work. Commented Oct 24, 2014 at 18:20
  • 2
@Hibou57 @alex.pilon the last cmd in a pipe can affect the vars in bash >= 4.2 with shopt -s lastpipe -- tldp.org/LDP/abs/html/bashver4.html#LASTPIPEOPT Commented May 27, 2016 at 12:49

17 Answers 17

214

Use

IFS= read var << EOF
$(foo)
EOF

You can trick read into accepting from a pipe like this:

echo "hello world" | { read test; echo test=$test; } 

or even write a function like this:

read_from_pipe() { read "$@" <&0; } 

But there's no point - your variable assignments may not last! A pipeline may spawn a subshell, where the environment is inherited by value, not by reference. This is why read doesn't bother with input from a pipe - it's undefined.
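A quick sketch of that subshell behavior (assuming bash without the lastpipe option; variable names are just for illustration):

```shell
#!/usr/bin/env bash
# read succeeds, but it runs in a pipeline subshell, so the
# assignment is gone once the pipeline finishes:
echo "hello world" | read test
echo "after pipe: test='$test'"     # prints: after pipe: test=''

# Grouping the consumer with read keeps both in the same subshell,
# so the variable is usable inside the group:
echo "hello world" | { read test; echo "in group: test='$test'"; }
```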

FYI, http://www.etalabs.net/sh_tricks.html is a nifty collection of the cruft necessary to fight the oddities and incompatibilities of bourne shells, sh.


10 Comments

You can make the assignment last by doing this instead: test=`echo "hello world" | { read test; echo $test; }`
can I ask why you used {} instead of () in grouping those two commands?
The trick is not in making read take input from the pipe, but in using the variable in the same shell that executes the read.
Got bash permission denied when trying to use this solution. My case is quite different, but I couldn't find an answer for it anywhere. What worked for me was (a different example but similar usage): pip install -U echo $(ls -t *.py | head -1). In case someone ever has a similar problem and stumbles upon this answer like me.
134

if you want to read in lots of data and work on each line separately you could use something like this:

cat myFile | while read x ; do echo $x ; done 

if you want to split the lines up into multiple words you can use multiple variables in place of x like this:

cat myFile | while read x y ; do echo $y $x ; done 

alternatively:

while read x y ; do echo $y $x ; done < myFile 

But as soon as you start to want to do anything really clever with this sort of thing you're better going for some scripting language like perl where you could try something like this:

perl -ane 'print "$F[0]\n"' < myFile 

There's a fairly steep learning curve with perl (or I guess any of these languages) but you'll find it a lot easier in the long run if you want to do anything but the simplest of scripts. I'd recommend the Perl Cookbook and, of course, The Perl Programming Language by Larry Wall et al.

1 Comment

"alternatively" is the correct way. No UUoC and no subshell. See BashFAQ/024.
67

This is another option

$ read test < <(echo hello world)
$ echo $test
hello world

2 Comments

The significant advantage that <(..) has over $(..) is that <(..) returns each line to the caller as soon as the command that it executes makes it available. $(..), however, waits for the command to complete and generate all of its output before it makes any output available to the caller.
This is called process substitution. pass v1.7.4 uses it to generate passwords from /dev/urandom.
60

read won't read from a pipe (or possibly the result is lost because the pipe creates a subshell). You can, however, use a here string in Bash:

$ read a b c <<< $(echo 1 2 3)
$ echo $a $b $c
1 2 3

But see @chepner's answer for information about lastpipe.

3 Comments

Nice and simple one-liner, easy to understand. This answer needs more upvotes.
<<< adds a line feed, which may not be desired
@LoganMzz: True, but read consumes it because it's the default delimiter. As a result, the variable doesn't contain it. Note for comparison that this also does not result in the variable containing a newline even though echo outputs one: d=$(echo "foo") because command substitution removes trailing newlines.
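Both behaviors from the comment above can be demonstrated directly (a bash sketch; the variable names are just for illustration):

```shell
#!/usr/bin/env bash
# Command substitution strips all trailing newlines:
d=$(printf 'foo\n\n')
[ "$d" = "foo" ] && echo "trailing newlines stripped"

# read consumes the newline that <<< appends, so none is stored:
read v <<< "hello"
[ "$v" = "hello" ] && echo "no newline stored in v"
```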
52

I'm no expert in Bash, but I wonder why this hasn't been proposed:

stdin=$(cat)
echo "$stdin"

One-liner proof that it works for me:

$ fortune | eval 'stdin=$(cat); echo "$stdin"' 

10 Comments

That's probably because "read" is a bash command, and cat is a separate binary that will be launched in a subprocess, so it's less efficient.
sometimes simplicity and clarity trump efficiency :)
definitely the most straight-forward answer
@djanowski but it's not necessarily the expected behaviour of a given script. If only there was a way to handle the absence of stdin gracefully, and fall back to 'regular' behaviour if it's not present. This post almost has it - it accepts either arguments or stdin. The only thing missing is being able to provide a usage helper if neither are present.
While searching for alternatives to the accepted answer, I decided to go with something similar to this answer for my use case :) ${@:-$(cat)}
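The ${@:-$(cat)} idiom from the last comment can be sketched as a tiny script (the file name args-or-stdin.sh is hypothetical): use the arguments if any were given, otherwise fall back to reading stdin.

```shell
#!/bin/sh
# Fall back to stdin only when no arguments were passed,
# so the script never blocks when invoked with arguments.
input="${@:-$(cat)}"
echo "input=$input"
```

Usage: sh args-or-stdin.sh hello prints input=hello, while echo piped | sh args-or-stdin.sh prints input=piped.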
34

bash 4.2 introduces the lastpipe option, which allows your code to work as written, by executing the last command in a pipeline in the current shell, rather than a subshell.

shopt -s lastpipe
echo "hello world" | read test; echo test=$test

2 Comments

ah! so good, this. if testing in an interactive shell, also: "set +m" (not required in an .sh script)
This is a long-standing peeve for me. You can add shopt -s lastpipe and set +m (or set -o monitor) to ~/.bashrc or ~/.bash_aliases, but there is a bug where set +m won't stick. I found a workaround by adding it to PROMPT_COMMAND: export PROMPT_COMMAND='set +m' (askubuntu.com/questions/1395963/…)
31

A smart script that can both read data from PIPE and command line arguments:

#!/bin/bash
if [[ -p /dev/stdin ]]; then
    PIPE=$(cat -)
    echo "PIPE=$PIPE"
fi
echo "ARGS=$@"

Output:

$ bash test arg1 arg2
ARGS=arg1 arg2
$ echo pipe_data1 | bash test arg1 arg2
PIPE=pipe_data1
ARGS=arg1 arg2

Explanation: When a script receives data via a pipe, /dev/stdin (or /proc/self/fd/0) will be a symlink to a pipe.

/proc/self/fd/0 -> pipe:[155938] 

If not, it will point to the current terminal:

/proc/self/fd/0 -> /dev/pts/5 

The bash [[ -p ]] test can check whether it is a pipe or not.

cat - reads from stdin.

If we use cat - when there is no stdin, it will wait forever; that is why we put it inside the if condition.
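On Linux the symlink can be inspected directly; a small sketch (the exact targets differ per session):

```shell
#!/bin/bash
# readlink shows what fd 0 currently points at:
echo piped | readlink /proc/self/fd/0    # e.g. pipe:[155938]
readlink /proc/self/fd/0 < /dev/null     # /dev/null here; a tty when interactive
```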

1 Comment

You can also use /dev/stdin which is a link to /proc/self/fd/0
15

The syntax for an implicit pipe from a shell command into a bash variable is

var=$(command) 

or

var=`command` 

In your examples, you are piping data to an assignment statement, which does not expect any input.

1 Comment

Because $() can be nested easily. Think in JAVA_DIR=$(dirname $(readlink -f $(which java))), and try it with `. You will need to escape three times!
14

In my eyes the best way to read from stdin in bash is the following one, which also lets you work on the lines before the input ends:

while read LINE; do
    echo $LINE
done < /dev/stdin

1 Comment

I almost went crazy before finding this. Thanks a lot for sharing!
11

Because I fell for it, I would like to drop a note. I found this thread because I had to rewrite an old sh script to be POSIX compatible. This basically means circumventing the pipe/subshell problem introduced by POSIX by rewriting code like this:

some_command | read a b c 

into:

read a b c << EOF
$(some_command)
EOF

And code like this:

some_command | while read a b c; do # something done 

into:

while read a b c; do
    # something
done << EOF
$(some_command)
EOF

But the latter does not behave the same on empty input. With the old notation the while loop is not entered on empty input, but in POSIX notation it is! I think it's due to the newline before EOF, which cannot be omitted. The POSIX code which behaves more like the old notation looks like this:

while read a b c; do
    case $a in ("") break; esac
    # something
done << EOF
$(some_command)
EOF

In most cases this should be good enough. But unfortunately it still does not behave exactly like the old notation if some_command prints an empty line. In the old notation the while body is executed, while in POSIX notation we break in front of the body.

An approach to fix this might look like this:

while read a b c; do
    case $a in ("something_guaranteed_not_to_be_printed_by_some_command") break; esac
    # something
done << EOF
$(some_command)
echo "something_guaranteed_not_to_be_printed_by_some_command"
EOF

1 Comment

[ -n "$a" ] || break should also work – but the problem about missing actual empty lines remains
10

The first attempt was pretty close. This variation should work:

echo "hello world" | { test=$(< /dev/stdin); echo "test=$test"; }; 

and the output is:

test=hello world

You need braces after the pipe to enclose the assignment to test and the echo.

Without the braces, the assignment to test (after the pipe) is in one shell, and the echo "test=$test" is in a separate shell which doesn't know about that assignment. That's why you were getting "test=" in the output instead of "test=hello world".

3 Comments

@KevinBuchs, the OP's question is how to get "bash to process data from stdin that gets piped into". And OP was on the right track, just missing the curly braces around the two shell commands. When you use a pipeline, the shell forks itself to execute each part, but since variable assignments are only for the current shell (or sub-processes if you export), when the sub-process exits the variable assignment is lost. The curly braces put the shell commands after the pipe into a single shell so the assignment is usable. Maybe you have a different question you want to ask?
You are right, my mistake. Sorry.
Best answer. Summarized in this short description: you have a [bash] array of data returned by some script and want to repeat the same action on each member of that array. Wasted 1 hour solving this.
7

Piping something into an expression involving an assignment doesn't behave like that.

Instead, try:

test=$(echo "hello world"); echo test=$test 

Comments

5

The following code:

echo "hello world" | ( test=($(< /dev/stdin)); echo test=$test ) 

will work too, but it will open another new sub-shell after the pipe, where

echo "hello world" | { test=($(< /dev/stdin)); echo test=$test; } 

won't.


I had to disable job control to make use of chepnars' method (I was running this command from terminal):

set +m; shopt -s lastpipe
echo "hello world" | read test; echo test=$test
echo "hello world" | test="$(</dev/stdin)"; echo test=$test

Bash Manual says:

lastpipe

If set, and job control is not active, the shell runs the last command of a pipeline not executed in the background in the current shell environment.

Note: job control is turned off by default in a non-interactive shell and thus you don't need the set +m inside a script.
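So inside a script, a minimal sketch needs only lastpipe (assuming bash >= 4.2):

```shell
#!/usr/bin/env bash
# Non-interactive shells have job control off already, so
# lastpipe alone makes the pipeline's read run in this shell.
shopt -s lastpipe
echo "hello world" | read test
echo "test=$test"                 # prints: test=hello world
```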

Comments

1

I think you were trying to write a shell script which could take input from stdin, but while trying to do it inline, you got lost creating that test= variable. I think it does not make much sense to do it inline, and that's why it does not work the way you expect.

I was trying to reduce

$( ... | head -n $X | tail -n 1 ) 

to get a specific line from various input, so I could type...

cat program_file.c | line 34 

So I needed a small shell program able to read from stdin, like you do.

22:14 ~ $ cat ~/bin/line
#!/bin/sh
if [ $# -ne 1 ]; then
    echo enter a line number to display; exit
fi
cat | head -n $1 | tail -n 1
22:16 ~ $

there you go.

Comments

1

The question is how to catch output from a command to save in variable(s) for use later in a script. I might repeat some earlier answers, but I'll try to line up all the answers I can think of to compare and comment on, so bear with me.

The intuitive construct

echo test | read x
echo x=$x

is valid in the Korn shell because ksh implements the last command in a piped series as part of the current shell, i.e. the preceding pipe commands are subshells. In contrast, other shells run all piped commands as subshells, including the last. This is the exact reason I prefer ksh. But when having to cope with other shells, bash for example, another construct must be used.

To catch 1 value this construct is viable:

x=$(echo test)
echo x=$x

But that only caters for 1 value to be collected for later use.

To catch more values this construct is useful and works in bash and ksh:

read x y <<< $(echo test again)
echo x=$x y=$y

There is a variant which I have noticed work in bash but not in ksh:

read x y < <(echo test again)
echo x=$x y=$y

The <<< $(...) is a here-string variant which gives all the meta handling of a standard command line. < <(...) is an input redirection from process substitution.

I use "<<< $(" in all my scripts now because it seems the most portable construct between shell variants. I have a tool set I carry around on jobs in any Unix flavor.

Of course there is the universally viable but crude solution:

command-1 | { command-2; echo "x=test; y=again" > file.tmp; chmod 700 file.tmp; }
. ./file.tmp
rm file.tmp
echo x=$x y=$y

2 Comments

This is a very good explanatory answer but I can't seem to get the multiple-value construct working. The first variable gets populated by the first line of my command's output but the second one stays empty. Any idea where I could be going wrong?
@HashimAziz - you might want to include the shell code you were using so we can help you.
0

I wanted something similar - a function that parses a string that can be passed as a parameter or piped.

I came up with a solution as below (works as #!/bin/sh and as #!/bin/bash)

#!/bin/sh
set -eu

my_func() {
    local content=""
    # if the first param is an empty string or is not set
    if [ -z ${1+x} ]; then
        # read content from a pipe if passed or from a user input if not passed
        while read line; do content="${content}$line"; done < /dev/stdin
    # first param was set (it may be an empty string)
    else
        content="$1"
    fi
    echo "Content: '$content'"
}

printf "0. $(my_func "")\n"
printf "1. $(my_func "one")\n"
printf "2. $(echo "two" | my_func)\n"
printf "3. $(my_func)\n"
printf "End\n"

Outputs:

0. Content: ''
1. Content: 'one'
2. Content: 'two'
typed text
3. Content: 'typed text'
End

For the last case (3.) you need to type, hit enter and CTRL+D to end the input.

Comments

-3

How about this:

echo "hello world" | echo test=$(cat) 

5 Comments

Basically a dupe from my answer below.
Maybe, I would argue mine is slightly cleaner and closer to the code posted in the original question.
@KevinBuchs As far as I can tell, this doesn't assign any variables that persist after completion. Did you try it?
Please ignore my (now deleted) comment. I was mistaken.
This echos an assignment (without actually escaping the content to be eval-safe), but it doesn't perform an assignment in such a way as to make the variable available later in the script.
