Researchers upend AI status quo by eliminating matrix multiplication in LLMs
(arstechnica.com)
Running AI models without floating point matrix math could mean far less power consumption.
The technique has not yet been peer-reviewed
Let's see, then.
Yeah, I'm not exactly holding my breath.
The peer review process is way too noisy to be meaningful. Better to just read it and judge the work for yourself.
https://arxiv.org/abs/2406.02528
Or wait for follow-up work that can corroborate their claims.
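For context on what the linked paper is claiming: the core idea is to constrain weights to ternary values {-1, 0, +1}, so that what is normally a floating-point matrix multiply collapses into additions and subtractions. A minimal sketch of that equivalence (shapes, names, and the random weights are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(4).astype(np.float32)         # input activations
W = rng.integers(-1, 2, size=(3, 4)).astype(np.int8)  # ternary weights in {-1, 0, +1}

# Conventional path: floating-point matrix multiply.
y_matmul = W.astype(np.float32) @ x

# "MatMul-free" path: per weight, accumulate +x, -x, or nothing.
y_addsub = np.zeros(3, dtype=np.float32)
for i in range(3):
    for j in range(4):
        if W[i, j] == 1:
            y_addsub[i] += x[j]
        elif W[i, j] == -1:
            y_addsub[i] -= x[j]

assert np.allclose(y_matmul, y_addsub)
```

Since every partial product is just a copy, a negation, or a skip, the multiply units (and much of their power draw) drop out of the hardware path entirely, which is where the efficiency claim comes from.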