Idea: Selfhosted Interface for LLMs
  • @mulcahey @not_IO LOL - it's not even a 3070 - I'm running it on an NVIDIA GeForce GTX 1660.

    Not the snappiest thing, but it definitely works well for my needs. Complex queries cause alllll the fans to go off.

  • Idea: Selfhosted Interface for LLMs
  • @mulcahey @not_IO The client is the browser in this case. But there's an API, so... sky's the limit.
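For context on "there's an API": Open WebUI exposes an OpenAI-compatible chat completions endpoint, so a script can reuse the same request shape as the cloud APIs. A minimal sketch, assuming the server is reachable at http://localhost:3000 and using an API key generated in the Open WebUI settings (host, key, and model name below are placeholders):

```python
import json
import urllib.request


def build_chat_request(model: str, prompt: str) -> dict:
    # OpenAI-style payload; Open WebUI accepts the same shape.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def ask(base_url: str, api_key: str, model: str, prompt: str) -> str:
    # POST to the OpenAI-compatible endpoint exposed by Open WebUI.
    req = urllib.request.Request(
        f"{base_url}/api/chat/completions",
        data=json.dumps(build_chat_request(model, prompt)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Usage would be something like `ask("http://localhost:3000", "sk-...", "qwen2.5:7b", "Hello!")`, with the key and model swapped for your own.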

  • Idea: Selfhosted Interface for LLMs
  • @mulcahey @not_IO Take a look at: https://openwebui.com/

    I'm running the docker container on my home server behind my firewall on an - I think - 6070 GPU (may be a 3070 - it's one of the cheaper, older ones).

    It's a wee bit slower than one of the cloud based LLMs, but my data stays local. I like the Qwen model.
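For anyone wanting to replicate the setup described above, the Open WebUI container can be started roughly like this (the port, volume name, and GPU flags are illustrative; check the project's install docs for your hardware):

```shell
# Pull and run Open WebUI, persisting its data in a named volume.
# The :cuda tag plus --gpus all enables NVIDIA acceleration (requires
# the NVIDIA Container Toolkit); drop both for CPU-only operation.
docker run -d \
  --name open-webui \
  --gpus all \
  -p 3000:8080 \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:cuda
```

The web interface is then available on the LAN at port 3000, which matches the "browser as client, behind my firewall" arrangement described here.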

  • Backfilling Conversations: Two Major Approaches
  • @julian Ah... that actually may make more sense - thanks.

    I'm working on my own AP implementation and hadn't yet run into this issue, so I'd been going on assumptions.

    Time to upgrade!

  • Backfilling Conversations: Two Major Approaches
  • @julian Quick, somewhat unrelated note - I follow you on Mastodon and see your posts with the HTML tags showing. Is NodeBB escaping those tags prior to sending out the AP message?
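For context on why those tags would show: the `content` field of an ActivityPub Note is expected to carry raw HTML, which receiving servers like Mastodon sanitize and render. If the sending server entity-escapes the markup first, clients display the tags as literal text. A quick illustration (the sample string is made up):

```python
import html

# What a renderer expects in an ActivityPub Note's `content` field:
rendered = "<p>Hello, fediverse!</p>"

# If the sender escapes the HTML before serializing, the receiver
# gets entities instead of markup...
escaped = html.escape(rendered)
print(escaped)  # &lt;p&gt;Hello, fediverse!&lt;/p&gt;

# ...and Mastodon shows the tags as visible text, not a paragraph.
```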

  • Consider SQLite
  • @colonial @Sibbo I'm actually glad I did read the page itself - it's clearly satire, making fun of how "sacred" others seem to hold their codes of conduct/ethics. I'm glad I read through that - I see no problems with it or in using SQLite.

  • What's something you learned in your job that can be used in a solarpunk lifestyle?
  • @j_roby @x_cell This is a good one. Any tips and tricks to share? Resources to point to? (About growing food the way you describe - got the cannabis part down ;-) )

  • robz Rob Zazueta @toot.robzazueta.com

    He's just this guy, you know?

    Posts 0
    Comments 7