Bug #14413
`-n` and `-p` flags break when stdout is closed
Status: Closed
Description
Ruby generally works well within a pipeline, and the `-n` and `-p` flags are incredibly useful. However, it is common practice to use programs like `head` and `sed`, which close the pipe once they have done their job. This is convenient because it often limits an expensive amount of output to a workable subset: I can work out the pipeline on a limited subset of the input, and then remove the limiting command.
However, Ruby explodes with ``-e:1:in `write': Broken pipe @ io_write - (Errno::EPIPE)`` when it writes the current line to stdout. When running in a line-oriented mode and stdout closes, I think Ruby should exit successfully (so it doesn't break bash scripts that have `pipefail` set) and silently (no error message), hence marking this a bug report rather than a feature request.
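To make the request concrete, this is roughly the behaviour I have in mind, sketched as the code a `-n` user effectively has to write by hand today (the explicit loop and the `rescue` wrapper are my illustration, not anything Ruby currently does):

```ruby
# Hand-rolled equivalent of `ruby -ne 'print unless /JSX|tokenize/'`,
# with the EPIPE handling this report asks for added explicitly.
begin
  while line = gets
    print line unless line =~ /JSX|tokenize/
  end
rescue Errno::EPIPE
  # stdout's reader (e.g. `head`) has gone away: stop quietly and
  # report success so `set -o pipefail` scripts are not broken.
  exit 0
end
```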
I've attached a screenshot showing that every other program I tried behaves this way:
```sh
git clone https://github.com/jquery/esprima
cd esprima/test/fixtures/
ls | head -1 # ls
find . -type f -name '*json' | head -1 # find
find . -type f -name '*json' | head -1 | xargs cat | jq . | head -1 # jq
find . -type f -name '*json' | grep -v JSX | grep -v tokenize | head -1 # grep
find . -type f -name '*json' | sed -E '/JSX|tokenize/d' | head -1 # sed
find . -type f -name '*json' | awk '/JSX|tokenize/ { next }; { print }' | head -1 # awk
find . -type f -name '*json' | perl -e 'while(<>) { /JSX|tokenize/ || print }' | head -1 # perl
find . -type f -name '*json' | ruby -ne 'print unless /JSX|tokenize/' | head -1 # ruby :(
```
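In the meantime, a workaround I'm aware of (not part of the report itself, only a sketch) is to restore the operating system's default SIGPIPE disposition inside the one-liner, so Ruby dies quietly like the other tools instead of raising `Errno::EPIPE`:

```sh
# Restore SIG_DFL for SIGPIPE in a BEGIN block, then filter as before.
find . -type f -name '*json' \
  | ruby -ne 'BEGIN { Signal.trap("PIPE", "SYSTEM_DEFAULT") }; print unless /JSX|tokenize/' \
  | head -1
```

Note that a process killed by SIGPIPE still exits non-zero, so this silences the error message but is not quite the clean `exit 0` this report asks for.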
Files