All comments (217)
4GB of RAM: load a model into llama.cpp
*Explodes*

    Apple be like: our 4GB is like 16GB from others

        That's right. Price-wise.