The most interesting part here, I find, is the cost analysis. I was quite surprised to see that training it on current hardware would have cost a third of what it did back when they actually trained it. That's roughly a 3x improvement in a year to a year and a half. I wonder whether this trend will continue.
I have been out of the ML world for a bit (like 6 months lol...) and I already feel way out of date. It seems like I should pick up the Vicuna LLM. I didn't want to touch LLaMA initially due to the legal problems with it; I thought that would be an issue for a while, and then they went and solved it. Somehow I even missed the news of it, most likely due to the enormous amount of news coming from the ML world (I might need a model to summarize it). Anyways, thanks for the article. I know what to do this weekend.