Stubsack: weekly thread for sneers not worth an entire post, week ending 9th February 2025
  • Thanks for the replies. I guess the "good" was vague on purpose, to see how people interpret it...

    This popped up on one of my feeds today and I saved it (can't remember from where). It's relevant to the above, so I'm sharing it here: https://oneproject.org/ai-commons/ (AI Commons: nourishing alternatives to Big Tech monoculture).

    They talk about AI for good; at one point they mention that the term is sometimes used just for marketing.

  • OpenAI Furious DeepSeek Might Have Stolen All the Data OpenAI Stole From Us
  • Knowledge distillation is training a smaller model to mimic the outputs of a larger model. You don't need the same training set that was used to train the larger model (the whole internet, or whatever they used for ChatGPT); you can use a smaller transfer set instead.

    Here's a reference: Hinton, G., Vinyals, O., and Dean, J. "Distilling the Knowledge in a Neural Network." arXiv preprint arXiv:1503.02531 (2015). https://arxiv.org/pdf/1503.02531
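
    As a rough illustration, here's a minimal sketch of the distillation loss from that paper, assuming PyTorch; the function name and the hyperparameter values are just illustrative:

    ```python
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
        # Soft targets: match the teacher's temperature-softened output distribution.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)  # rescale by T^2 so gradient magnitudes stay comparable, as in the paper
        # Hard targets: ordinary cross-entropy against the true labels, if available.
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1 - alpha) * hard
    ```

    The teacher logits come from running the larger model over the transfer set; if the transfer set is unlabelled, you drop the hard-label term (alpha=1.0).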

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 8th December 2024
  • Hi, I'm new here. I mean, I've been reading but I haven't commented before.

    I'm sure you all know how cheap labour is used to label data for training "AI" systems, but I just came across this video and wanted to share it. Apologies if it has already been posted: "Training AI takes heavy toll on Kenyans working for $2 an hour".
