# find with grep
# + concatenates results and runs the command once, which is faster
find . -name "*.txt" -exec grep -l "somename" '{}' '+'
# ';' runs the command once per result
find . -name "*.txt" -exec basename '{}' ';' | column
# case insensitive
find -iname "SoMeNaMe.TxT"
# file or dir
find -type f
find -type d
# by file owner
find -user Bob
# by group
find -group wheel
# by permission
find -perm 777
# find by size
find -size +1G
It's useful to be able to do this without additional tools (and the general command setup has more applications than shown here), but in practice, ease of use and performance often make a difference.
If you have a very large directory tree, find will still descend into and test every file, even under paths that -path doesn't match, which makes it take longer to complete. Combine -o and -prune to skip those subtrees entirely.
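A minimal sketch of the -o/-prune idiom (the directory names here are just for illustration): -prune cuts off the matched subtree before find reads it, and -o continues with the real test on everything else.

```shell
# build a small demo tree: .git/ should be skipped, src/ should be searched
mkdir -p demo/.git demo/src
touch demo/.git/ignored.txt demo/src/wanted.txt

# without -prune, find would descend into demo/.git and test every file there;
# with -prune, the whole subtree is cut off and never read
find demo -path demo/.git -prune -o -name "*.txt" -print
# prints only demo/src/wanted.txt
```

Note the explicit -print at the end: without it, find would also print the pruned directory itself, because the default print applies to both branches of the -o.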