Running an LLM on a phone will absolutely destroy your battery life. It's also imperative that you understand that the comfort of AI is bought with the killing of innocents (through the expediency of climate catastrophe, and the exploitation of the planet and the poorest people on it).
I think using AI to experiment on a home server which already exists wouldn't be problematic IN A VACUUM, but you would still be normalizing use of tech which is morally corrupt.
I am a fan of LLMs and what they can do, and as such have a server specifically for running AI models. However, I've been reading "Atlas of AI" by Kate Crawford, and you're right. So much of the data they're trained on is inherently harmful or was taken without consent. Even with the more ethical datasets it's probably not great, considering the sheer quantity of data needed to make even a simple LLM.
I still like using it for simple code generation (this is just a hobby to me, so vibe coding isn't a problem in my scenario) and corporate tone policing. And I tell people nonstop that it's worthless outside of these use cases, and maybe as a search engine, but I recommend Wikipedia as a better start almost every time.
It very much depends on your phone hardware: RAM limits how big the models can be, and the CPU determines how fast you'll get replies. I've successfully run 4B models on my 8GB RAM phone, but since it's the usual server-and-client setup, which needs full internet access due to the lack of granular network permissions on Android (even all-in-one setups need open ports to connect to themselves), I prefer a proper home server. Which, with a cheap graphics card, is indescribably faster and more capable.
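To get a rough feel for why a 4B model fits in 8GB of phone RAM, you can do the back-of-the-envelope math: weights take roughly (parameter count × bits per parameter ÷ 8) bytes, plus some overhead for the KV cache and runtime. The 1 GB overhead figure below is an illustrative assumption, not a measured number:

```python
def model_ram_gb(params_billion: float, bits_per_param: float,
                 overhead_gb: float = 1.0) -> float:
    """Rough RAM estimate: quantized weights plus a guessed fixed
    overhead for KV cache and runtime buffers."""
    weight_bytes = params_billion * 1e9 * bits_per_param / 8
    return weight_bytes / 1e9 + overhead_gb

# A 4B model at 4-bit quantization: ~2 GB of weights + ~1 GB overhead,
# which is why it squeezes onto an 8 GB phone alongside Android itself.
print(round(model_ram_gb(4, 4), 1))   # ~3.0 GB
# The same model in unquantized fp16 wouldn't fit comfortably:
print(round(model_ram_gb(4, 16), 1))  # ~9.0 GB
```

This also shows why a cheap GPU in a home server changes things: even 8GB of VRAM comfortably holds a quantized 7B-8B model with room for context.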
You want to run it on the phone itself? I don't think any phone would be good enough for that. The issue with AI assistants isn't just privacy. It's also the resource consumption (and of course the stolen content). It's so high that only the big companies with huge server farms can manage it.
If you just want a voice assistant for simple commands, I've heard of an open source local assistant called Dicio. But I don't think you can talk to it like ChatGPT or something.
I've successfully run small-scale LLMs on my phone; slow, but very doable. I run my main AI system on an older, midrange gaming PC. No problems at all.
Dicio is a pre-programmed assistant, which one can talk to if one has speech recognition software installed. It has a preset list of tasks it can do; in my experience it's not really comparable to how LLMs work.
I don't recommend it. I ran local AI on my phone before (iPhone, but same difference), and just asking it stuff makes it warm to the touch. The battery also takes a hit.
It also messes up multitasking, since it uses up most of the memory, which kills background apps. Phones weren't designed for this.
The best way is to host it on an actual dedicated machine that can be accessed remotely.
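With the dedicated-machine setup, the phone just needs an HTTP client pointed at the server. Assuming something like Ollama's `/api/generate` endpoint on its default port 11434 (the hostname and model tag below are placeholders for illustration, not a tested config), the client side is just a small JSON POST:

```python
import json
from urllib import request

# Placeholder address for a home server on the LAN; substitute your own.
SERVER = "http://homeserver.local:11434"

def build_prompt_request(model: str, prompt: str) -> request.Request:
    """Build (but don't send) the JSON POST a phone app would make
    against an Ollama-style API."""
    body = json.dumps({"model": model, "prompt": prompt,
                       "stream": False}).encode()
    return request.Request(
        f"{SERVER}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_prompt_request("llama3.2:3b", "Summarize this note for me.")
# Actually sending it is one line: request.urlopen(req)
# (omitted here since it needs a live server on the network).
print(req.full_url)
```

Any app that speaks an OpenAI-compatible or Ollama-style API can then use the server, so the phone does none of the heavy lifting.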
I have PocketPal set up on my Pixel with GrapheneOS and it's pretty awesome. I agree that AI is inherently bad, considering the environmental impact and the amount of illegally taken data needed to train AI.