
  • This is a threat to any neural network that is constantly being trained. Hell, it's even a problem with our brain's NN. We just call it "believing your own bullshit" or "getting high on your own supply".

    The issue with NNs looking for cures for diseases (or anything that can't be trained off of the internet) is that they are basically out of training data. They'll need orders of magnitude more to get better, and we just don't have it. We haven't figured out a way to learn from less data, and there's no real movement on that front either.

    What we have right now is essentially the culmination of research going on since the 1960s, finally realized once we figured out the following (see the sketch after this list):

    • We can map our NN variables to a matrix
    • We can use linear algebra to optimize the loss on that matrix
    • We can leverage video cards to crunch the linear algebra
    • We have the largest data set ever created (the internet) to push our loss lower than ever before
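
    To make that concrete, here's a minimal sketch (the data and names are made up for illustration): the "network" is just a weight matrix W, the forward pass and the gradient are plain linear algebra, and swapping numpy for a GPU-backed library is exactly what the video cards buy you.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data set: 256 samples, 8 input features, 1 target value.
    X = rng.normal(size=(256, 8))
    true_W = rng.normal(size=(8, 1))
    y = X @ true_W + 0.01 * rng.normal(size=(256, 1))

    # "Map our NN variables to a matrix": every learnable parameter lives in W.
    W = np.zeros((8, 1))

    # "Use linear algebra to optimize the loss": plain gradient descent on MSE.
    for step in range(500):
        pred = X @ W                          # forward pass = one matrix multiply
        loss = np.mean((pred - y) ** 2)       # mean squared error
        grad = 2 * X.T @ (pred - y) / len(X)  # the gradient is also linear algebra
        W -= 0.1 * grad                       # step downhill; the loss shrinks

    print(f"final loss: {loss:.6f}")
    ```

    On a GPU the exact same matrix multiplies run massively in parallel, which is why the video-card point matters at all.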
  • Honestly the apps on my phone that do this are amazing. I bought an adapter that adds a 1/4” and an 1/8” jack so I can listen to it through headphones, and it’s beyond anything we had just a few years ago.

  • We just retired my MBP from 2014 this year. Ten years of use out of a laptop is crazy. We could have kept it going with a battery swap, but we opted for a new Air, which is super nice.

    Apple is expensive, but you get what you pay for.