Posts: 1 · Comments: 1,827 · Joined: 2 yr. ago

  • No, the old model does not have the training data. It only has "model weights". You can conceptualize those as the abstract rules that the old model learned when it read the training data. By design, they are not supposed to memorize their training data.

    To outperform the old model, the new model needs more than what the old model learned. It needs the primary sources, i.e., the training data itself, which is what is going to be deleted.
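    A toy illustration of the weights-vs-data distinction (purely hypothetical, not any real model): fitting a line by least squares turns 100 data points into just two numbers, and deleting the data afterwards leaves those numbers intact but unrecoverable back into the original points.

    ```python
    import random

    random.seed(0)

    # "Training data": 100 noisy samples of y = 3x + 1
    data = [(x, 3 * x + 1 + random.uniform(-0.1, 0.1)) for x in range(100)]

    # Fit a line by least squares -- the two fitted numbers are the "weights".
    n = len(data)
    sx = sum(x for x, _ in data)
    sy = sum(y for _, y in data)
    sxx = sum(x * x for x, _ in data)
    sxy = sum(x * y for x, y in data)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n

    # The weights summarize the data (slope ~ 3, intercept ~ 1),
    # but the 100 individual points cannot be reconstructed from them.
    del data  # deleting the training data does not touch the weights
    ```

    The same asymmetry holds at scale: weights are a lossy summary of the data, which is why a successor model cannot recover the deleted training set from them.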

  • It's true that a new model can be initialized from an older one, but it will never outperform the older one unless it is given actual training data (not necessarily the same training data used previously).

    Kind of like how you can learn ancient history from your grandmother, but you will never know more ancient history than your grandmother unless you do some independent reading.
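    The warm-start point can be sketched with a toy one-parameter model (illustrative only; the setup and names are assumptions, not how any production system works): the new model can copy the old model's weight, but gradient descent only moves that weight if there is data to compute gradients from.

    ```python
    def train(w, data, lr=0.01, steps=200):
        """Plain gradient descent on mean squared error for y = w * x."""
        for _ in range(steps):
            grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
            w -= lr * grad
        return w

    # Toy training data for the target w = 2.0
    data = [(x, 2.0 * x) for x in [-2, -1, 0, 1, 2]]

    old_w = train(0.0, data)   # old model, trained from scratch on the data
    new_w = old_w              # new model initialized from the old weights
    # With no training data there is nothing to compute a gradient from,
    # so new_w stays exactly old_w. Only fresh data moves it further:
    improved_w = train(new_w, data)
    ```

    Initializing from `old_w` saves the new model from starting at zero, but any further improvement comes entirely from the training loop over actual data.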

  • News @lemmy.world

    Biden sets out new Israeli proposal to end war in Gaza