Very simple foreach line alias to xargs - is it useful?
I created a simple alias for xargs, with the intent to pipe into it when needed. It simply runs a command for each line of its input. My question to you: is this useful, or are there better ways of doing it? This is just a bit of brainstorming, basically. Maybe I have a knot in my head.
# Pipe each line and execute a command. The "{}" will be replaced by the line.
# Example:
# find . -maxdepth 2 -type f -name 'M*' | foreach grep "USB" {}
alias foreach='xargs -d "\n" -I{}'
For commands that already operate on every line from stdin, this isn't very useful. But in other cases, it might be. A more simplified (and admittedly useless) usage example would be:
find . -maxdepth 1 | foreach echo "File" {}
It's important to use {} as a placeholder for the "current line" being processed. What do you think about the usefulness? Do you have any ideas for how to use it?
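For a self-contained test (no files needed), here is the expanded form of the alias fed two literal lines; note that aliases only expand in interactive shells, so in a script you'd write the underlying xargs invocation, and that -d is a GNU xargs option:

```shell
# Expanded form of the foreach alias: run echo once per input line,
# substituting {} with that line
printf 'alpha\nbeta\n' | xargs -d "\n" -I{} echo "Line:" {}
# → Line: alpha
#   Line: beta
```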
Don't use ls if you want to get filenames, it does a bunch of stuff to them. Use a shell glob or find.
Also, because filenames can contain newlines, if you want to loop over them it's best to use one of these:
for x in *; do do_stuff "$x"; done # can be any shell glob, not just *
find . -exec do_stuff {} \;
find . -print0 | xargs -0 do_stuff # not POSIX but widely supported
find . -print0 | xargs -0 -n1 do_stuff # same, but for single arg command
When reading newline-delimited stuff with while read, you want to use IFS= read -r, so leading/trailing whitespace is preserved and backslashes aren't interpreted.
Those find and ls commands were just to illustrate how the alias itself works, to give a sense of it. I'm aware of those issues with filenames. It's not about ls or find here.
Yeah, sorry then. It would be good not to use ls in your example though; someone who doesn't know about that might read this discussion and think it's reasonable.
As for your original question, having foreach as a personal alias is fine. I wouldn't use it in any script, though: anyone else reading the script probably already knows xargs, so your foreach would just be more confusing to a potential reader, I think.
Some additional thoughts to be aware of, from looking closer at each line (previously I had just glanced over it).
This point doesn't directly affect your example, but I want to make you aware of something I've fallen into myself. It's one of those Bash quirks. Other shells might handle it differently; I'm only speaking about Bash here. For a regular for loop over files, it's important to note that if no file matches, the variable is set to the pattern itself. For example, with for x in *.png; do, if no .png file is found, then x is set to the literal string *.png. Depending on what you do in the loop, this could be catastrophic. But Bash has an option for exactly this: shopt -s nullglob. With that option set, a pattern that matches nothing expands to nothing, so the loop body simply never runs. More about Bash options: https://www.gnu.org/software/bash/manual/html_node/The-Shopt-Builtin.html
for x in *.abcdefg; do echo "$x"; done # prints "*.abcdefg" literally (assuming nothing matches)
shopt -s nullglob
for x in *.abcdefg; do echo "$x"; done # prints nothing
BTW, one can also read line by line without cat, by redirecting the file directly:
while IFS= read -r line; do echo "Line: ${line}"; done < filenames.txt
Great minds, lol. I have almost the exact same command set up as a little script. Mine has an extra modification for my use case, and I named mine iter, but foreach is a good name for it too.
A bit of a tangent, but I almost never use xargs in the shell anymore, and instead use while read line; do SOMETHING "$line"; done, because xargs doesn't have access to the shell's local variables, aliases, or functions.
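A quick way to see that difference: a shell function (here a made-up shout) is visible inside a while read loop, but not to the process xargs spawns, unless you use a Bash-specific workaround like export -f:

```shell
#!/usr/bin/env bash
# A local shell function
shout() { printf '%s!\n' "$1"; }

# The while read loop runs in the shell, so it can call shout
printf 'hi\nho\n' | while IFS= read -r line; do shout "$line"; done
# → hi!
#   ho!

# xargs execs a separate process, so a plain 'shout' would fail there.
# One Bash-specific workaround: export the function, then call it via bash -c
export -f shout
printf 'hi\nho\n' | xargs -d '\n' -I{} bash -c 'shout "$1"' _ {}
```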
Good point! A while loop is probably more flexible here and easier to extend too. I will experiment a bit more, and maybe I'll switch to a while read loop implemented as a Bash function.
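One possible shape for such a function (just a sketch, keeping the alias's {} placeholder convention; unlike the xargs version, it can also call shell functions and aliases):

```shell
# Hypothetical function version of foreach: runs the given command once
# per input line, substituting {} with that line in every argument.
foreach() {
    local line arg args
    while IFS= read -r line; do
        args=()
        for arg in "$@"; do
            args+=("${arg//\{\}/$line}")   # replace {} inside each argument
        done
        "${args[@]}"
    done
}
```

Usage mirrors the alias: printf 'a\nb\n' | foreach echo "File" {}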
True that the loop is easier to work with, though you can still pass args/env into the subshell, and xargs' -P is one of my favorites, depending on the task (it may not be desired in this case). Sometimes I've done both: echo assembled commands in a loop or find -exec, sanity-check the output, then pipe it to xargs ... bash -c to run in parallel.
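A sketch of that two-step pattern (GNU xargs assumed for -d and -P; the echo jobs are stand-ins for real work):

```shell
# Step 1: assemble the commands in a loop and eyeball the output first
for n in 1 2 3; do
    printf 'echo "job %s done"\n' "$n"
done

# Step 2: pipe the same lines to xargs to actually run them.
# -d '\n' makes each line one argument, -n1 hands bash -c one command
# per invocation, and -P2 runs up to two jobs in parallel.
for n in 1 2 3; do
    printf 'echo "job %s done"\n' "$n"
done | xargs -d '\n' -n1 -P2 bash -c
```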