Basically another shell scripting language. But unlike most alternatives like Csh or Fish, it can compile back to Bash. I'm a bit conflicted at the moment, but the fact that it compiles back to Bash is what makes it interesting, so I'll keep an eye on this. That said, it makes the produced Bash code less readable than handwritten Bash, if that is the end goal.
I wish this nonsense of piping a shell script from the internet directly into Bash would stop. It's a bad idea because of the security concerns. This install.sh script uses eval and will even run curl itself to download amber and install it from this URL:
url="https://github.com/Ph0enixKM/${__0_name}/releases/download/${__2_tag}/amber_${os}_${arch}"
...
echo "Please make sure that root user can access /opt directory.";
And all of this while requiring root access.
I am not a fan of this kind of distribution and installation. Why not provide a normal manual installation process and link to the project's releases page: https://github.com/Ph0enixKM/Amber/releases BTW, it's a Rust application, so one could build it with Cargo, for those who have it installed.
I wish this nonsense of piping a shell script from the internet directly into Bash would stop. It's a bad idea, because of security concerns.
I would encourage you to actually think about whether or not this is really true, rather than just parroting what other people say.
See if you can think of an exploit I can perform if you pipe my install script to bash, but can't do if you download a tarball of my program and run it.
while requiring root access
Again, think of an exploit I can do if you give me root, but can't do if you run my program without root.
(Though I agree in this case it is stupid that it has to be installed in /opt; it should definitely install to your home dir like most modern languages - Go, Rust, etc.)
I would encourage you to actually think about whether or not this is really true, rather than just parroting what other people say.
I would encourage you to read up on the issue before thinking they haven't.
See if you can think of an exploit I can perform if you pipe my install script to bash, but can't do if you download a tarball of my program and run it.
It is also terrible conditioning to pipe stuff to bash, because it's the equivalent of "just execute this .exe, bro". Sure, right now it's GitHub, but there are curl|bash installs on other websites too.
Additionally, a tarball allows one to install a program later with no network access, which allows reproducible builds. curl|bash is not reproducible.
It is absolutely possible for the server serving a bash script to know whether it is being piped into bash, purely from the timing of the downloaded chunks. A server could start serving a different file halfway through if it detects it is being executed directly. This is not a theoretical situation, by the way; it has been done. At least when you download the script first, you know what you'll be running. Same for a source tarball. That's my main gripe with this piping stuff: it assumes you don't even care about security.
I mean, you can always just download the script, investigate it yourself, and run it locally. I'd even argue it's actually better than most installers.
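The "download, inspect, then run" workflow the parent describes can be sketched like this (the installer contents and checksum file here are placeholders standing in for what you'd actually fetch with curl):

```shell
#!/bin/bash
# Stand-in for: curl -fsSL "$url" -o install.sh
printf '%s\n' 'echo hello from installer' > install.sh

# Verify against a published checksum before running (in reality you'd
# compare with the checksum the project publishes alongside the release):
sha256sum install.sh > install.sh.sha256
sha256sum -c install.sh.sha256        # prints "install.sh: OK"

# Inspect it (interactively you'd open it in less or an editor):
cat install.sh

# Only after inspecting, execute it:
bash install.sh                        # prints "hello from installer"
```

Nothing fancy, but unlike curl|bash, what you inspected is exactly what runs.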
Install scripts are just the Linux versions of installer exes. Hard and annoying to read, probably deviating from standard behaviour, not documenting everything, probably being bound to specific distros and standards without checks, assuming stuff way too many times.
Why does the generated bash look like that? Is this somehow safer than a more straightforward bash if, or does it just generate needlessly complicated bash?
I doubt the goal is to produce easily understood bash, otherwise you'd just write bash to begin with. It's probably more similar to a typescript transpiler that takes in a language with different goals and outputs something the interpreter can execute quickly (no comment on how optimized this thing is).
Especially as Bash can do that anyway with if [ "${__0_age}" -lt 18 ], for example, which would be straightforward. Bash also supports wildcard comparison, regex comparison, and can transform variables with variable substitution, so using these features would help in writing better Bash. The less readable output is expected, though, for any code-to-code transcompiler; it's just not optimal in this case.
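For concreteness, the builtin comparisons mentioned above look like this (values are made up for illustration):

```shell
#!/bin/bash
age=17
name="amber-lang"

# Numeric comparison with the [ builtin:
[ "$age" -lt 18 ] && echo "minor"                 # prints "minor"

# Wildcard (glob) comparison with [[ ]]:
[[ "$name" == amber-* ]] && echo "glob match"     # prints "glob match"

# Regex comparison with [[ =~ ]]:
[[ "$name" =~ ^amber-[a-z]+$ ]] && echo "regex match"

# Variable substitution (replace "lang" with "script"):
echo "${name/lang/script}"                        # prints "amber-script"
```

All of these run in the shell itself, with no subshells or external programs.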
It's probably just easier to do all arithmetic in bc so that there's no need to analyze expressions for Bash support and have two separate arithmetic codegen paths.
As someone who has done way too much shell scripting, the example on their website just looks bad, if I'm being honest.
I wrote a simple test script that compares the example output from this script to how I would write the same if statement in pure bash.
Here's the script:
#!/bin/bash
age=3
[ "$(printf "%s < 18\n" "$age" | bc -l | sed '/\./ s/\.\{0,1\} 0\{1,\}$//')" != 0 ] && echo hi
# (( "$age" < 18 )) && echo hi
Comment out the line you don't want to test, then run hyperfine ./script
I found that the amber version takes ~2ms per run while my version takes ~800 microseconds, making the amber version roughly 2.5x slower.
The reason the amber version is so slow is because:
a) it uses 4 subshells (3 for the pipes, and 1 for the $() syntax)
b) it uses external programs (bc, sed) as opposed to using builtins (such as the (( )), [[ ]], or [ ] builtins)
I decided to download amber and try out some programs myself.
And I actually facepalmed, because instead of directly accessing the first item, it first creates a new array and then accesses the first item of that array. Maybe there's a reason for this, but I don't know what that reason would be.
So now we have 1000 items in our array. I benchmarked this against a version that doesn't create a new array.
Not creating a new array is about 600 microseconds faster (1.7ms for the amber version, 1.1ms for my version).
I wrote another simple amber program that sums the items in a list
let items = [1, 2, 3, 10]
let x = 0
loop i in items {
    x += i
}
echo x
which compiles to
__AMBER_ARRAY_0=(1 2 3 10);
__0_items=("${__AMBER_ARRAY_0[@]}");
__1_x=0;
for i in "${__0_items[@]}"
do
__1_x=$(echo ${__1_x} '+' ${i} | bc -l | sed '/\./ s/\.\{0,1\}0\{1,\}$//')
done;
echo ${__1_x}
This compiled version takes about 5.7ms to run, so I wrote my own version:
arr=(1 2 3 10)
x=0
for i in "${arr[@]}"; do
    x=$((x + i))
done
printf "%s\n" "$x"
This version takes about 900 microseconds to run, making the amber version roughly 6x slower.
Amber does support one thing that bash doesn't, though (which is probably the cause of all these slow constructs): float arithmetic, which is pretty cool.
However, if I'm being honest, I rarely use float arithmetic in bash, and when I do I just call out to bc, which is good enough. (That is what amber does too, but for integers as well.)
I don't get the point of this language. In my opinion there are only a couple of reasons bash should be chosen for something:
a) if you're just gonna hack some short script together quickly. or
b) something that uses lots of external programs, such as a build or install script.
for the latter case, amber might be useful, but it will make your install/build script hard to read and slower.
Lastly, I don't think amber will make anything easier until they have a standard library of functions.
The power of bash comes from the fact that it's easy to pipe text from one text manipulation tool to another, the difficulty comes from learning how each of those individual tools works, and how to chain them together effectively.
Until amber has a good standard library, with good data/text manipulation tools, amber doesn't solve that.
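The kind of tool-chaining the parent means, where the power (and the learning curve) lives, looks like this classic word-frequency one-liner (input text is made up):

```shell
#!/bin/bash
# Count word frequencies by chaining standard text tools:
# split into one word per line, sort, count duplicates, rank by count.
printf 'foo bar foo baz foo bar\n' \
  | tr ' ' '\n' \
  | sort \
  | uniq -c \
  | sort -rn \
  | head -3
# prints:
#   3 foo
#   2 bar
#   1 baz
```

None of the individual tools is hard; knowing that this particular chain exists is the hard-won part a standard library would have to replace.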
This is the complete review write-up I love to see; let's not get into the buzzword bingo, just give me real-world examples and comparisons. Thanks for doing the real work!
Compiling to bash seems awesome, but on the other hand I don't think anyone other than the person who wrote it in amber will run a bash file that looks like machine-generated gibberish on their machine.
I don't think anyone other than the person who wrote it in amber will run a bash file that looks like machine-generated gibberish on their machine.
Lol I barely want to run (or read) human generated bash, machine generated bash sounds like a new fresh hell that I don't wanna touch with a ten foot pole.
I'm very suspicious of the use cases for this. If the compiled bash code is unreadable, then what's the point of compiling to bash instead of machine code like normal? It might be nice if you're using it as your daily shell, but if someone sent me "compiled" bash code I wouldn't touch it. My general philosophy is: if your bash script gets too long, move it to Python.
The only example I can think of is for generating massive install.sh
As a long-time bash, awk and sed scripter who knows he'll probably get downvoted into oblivion for this, my recommendation: learn PowerShell.
It's open-source and completely cross-platform (I use it on Macs, Linux and Windows machines) and you don't know what you're missing until you try a fully object-oriented scripting language and shell. No more parsing text: built-in support for scalars, arrays, hash maps/associative arrays, and more complex types like version numbers, IP addresses, synchronized dictionaries, and basically anything available in .NET. Read and write CSV, JSON and XML natively and simply. Built-in support for regular expressions throughout, web service calls, remote script execution, parallel and asynchronous jobs, and lots and lots of libraries for all kinds of things.
Seriously, I know it's popular and often deserved to hate on Microsoft, but PowerShell is a kick-ass, cross-platform, open-source, modern shell done right, even if it does have a dumb name IMO. Once you start learning it you won't want to go back to any other.
As someone who spent 2 years learning and writing PowerShell for work... it's... okay. Way easier to make stuff work than bash, and it gets really powerful when you build libraries for it. But I prefer Python and Go for building scripts and small apps.
I do. Currently I use it mostly for personal stuff as part of my time spent on production support. Importing data from queries, exporting spreadsheets, reading complex json data and extracting needed info, etc. In the past when I was on DevOps used it with Jenkins and various automation processes, and I've used it as a developer to create test environments and test data.
Bash is one of the most used shell languages: it's installed on almost all Linux and Mac systems and can also be used on Windows. Almost no one likes writing it, as it is convoluted and really, really hard to read and write.
There are many replacement languages for it, but using them is troublesome because of incompatibilities.
Amber is compiled, which solves the compatibility problem, and the language itself seems very readable. On top of that, it has most features that modern programmers need.
Does Bash support those? I think the idea is that it's basically Bash, as if written by a sane person. So it supports the same features as Bash but without the army of footguns.
A language being compiled should be able to support higher-level language concepts than what the target supports natively. That's how compiling works in the first place.
If you can use an alternative then do that. This is for situations where you can't use an alternative or don't want users to have to install anything else.
I checked the docs, and I'm a bit confused about one thing. They show that you can capture the stdout of a command into a variable, but they never show stderr being captured. How would that work?
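I can't say how (or whether) Amber exposes this, but in the plain Bash it compiles to, stderr capture is just a matter of redirections. A minimal sketch, with a made-up function standing in for any command:

```shell
#!/bin/bash
# A stand-in command that writes to both streams:
cmd() {
  echo "to stdout"
  echo "to stderr" >&2
}

# Capture only stdout (stderr is discarded here):
out=$(cmd 2>/dev/null)

# Capture only stderr: move stderr into stdout's place first,
# then send the original stdout away:
err=$(cmd 2>&1 >/dev/null)

# Capture both streams combined:
both=$(cmd 2>&1)

echo "out=$out"   # prints "out=to stdout"
echo "err=$err"   # prints "err=to stderr"
```

The ordering in `2>&1 >/dev/null` matters: redirections are applied left to right, so stderr is duplicated onto the old stdout before stdout is redirected away.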