Google quietly released an app that lets you download and run AI models locally
A gallery that showcases on-device ML/GenAI use cases and allows people to try and use models locally. - google-ai-edge/gallery

You're viewing a single thread.
Why would I use this over Ollama?
Ollama can’t run on Android.
That's fair, but I think I'd rather self-host an Ollama server and connect to it with an Android client in that case. Much better performance.
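For anyone wondering what the client side of that setup looks like, here is a minimal Kotlin sketch, assuming an Ollama server already running on the LAN (started with `OLLAMA_HOST=0.0.0.0` so it accepts connections from other devices) and a model already pulled there. The `192.168.1.50` address and `llama3.2` model name are placeholders for your own setup.

```kotlin
import java.net.HttpURLConnection
import java.net.URL
import org.json.JSONObject

// Sends a prompt to a self-hosted Ollama server over its HTTP API
// (Ollama listens on port 11434 by default) and returns the reply.
fun askOllama(prompt: String): String {
    val url = URL("http://192.168.1.50:11434/api/generate") // placeholder LAN address
    val body = JSONObject()
        .put("model", "llama3.2") // placeholder: any model pulled on the server
        .put("prompt", prompt)
        .put("stream", false)     // request one complete JSON object, not a stream
        .toString()

    val conn = url.openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.doOutput = true
    conn.setRequestProperty("Content-Type", "application/json")
    conn.outputStream.use { it.write(body.toByteArray()) }

    // The non-streaming reply is a JSON object whose "response" field
    // holds the generated text.
    val reply = conn.inputStream.bufferedReader().use { it.readText() }
    return JSONObject(reply).getString("response")
}
```

On Android this would need to run off the main thread (e.g. in a coroutine on Dispatchers.IO), and the app needs the INTERNET permission; error handling is omitted for brevity.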
Yes, that's my setup. But this will be useful for cases where the internet connection isn't reliable.
How does Ollama compare to GPT models? I used the paid tier for work and I'm curious how this stacks up.
It's decent, with the DeepSeek model anyway. It's not as fast and has a lower parameter count, though. You might just need to try it and see whether it fits your needs.
You can use it in Termux.
Has this actually been done? If so, I assume it would only be able to use the CPU.
Yeah, I have it in Termux. Ollama is in the Termux package repos. The speed it generates at does feel like CPU speed, but I'm not sure.
Is there any useful model you can run on a phone?
Llama.cpp (which Ollama runs on) can run on a phone, and many phone chat apps can use it.
Try PocketPal instead.