Posts: 0 · Comments: 162 · Joined: 2 yr. ago

  • No, you are correct. Hinton began researching ReLUs in 2010, and his students Alex Krizhevsky and Ilya Sutskever used them to train a much deeper network (AlexNet) that won the 2012 ILSVRC. AlexNet was so groundbreaking because it brought all of the gradient optimization improvements (SGD with momentum, as popularized by Schmidhuber, and dropout), better activation functions (ReLU), a deeper network (8 layers), supervised training on very large datasets (necessary to learn good general-purpose convolutional kernels), and GPU acceleration together in a single approach (a rough sketch of those ingredients is at the end of this comment).

    NNs, and specifically CNNs, won out because they learned image feature representations that were more expressive than the hand-crafted features of competing algorithms. The proof was in the vastly better performance: it was a major jump at a time when performance on the ILSVRC was becoming saturated. Nobody was making anywhere near +10% improvements on that challenge back then; it blew everybody out of the water and made NNs and deep learning impossible to ignore.

    Edit: to accentuate the point about datasets and GPUs, the original AlexNet developers really struggled to train their model on the GPUs available at the time. The model was too big, so they had to split it across two GPUs to make it work; they were some of the first researchers to train large CNNs on GPUs. Without large datasets like the ILSVRC they would not have been able to learn good deep hierarchies of convolutions, and without better GPUs they wouldn't have been able to make AlexNet sufficiently large or deep. Training AlexNet on CPU alone for the ILSVRC was out of the question; it would have taken months of full-tilt, nonstop compute for a single training run. It took more than these two things, as detailed above, but removing those two barriers really allowed CNNs and deep learning to take off. Much of the underlying NN and optimization theory had been around for decades.
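
    To make that list of ingredients concrete, here is a minimal, hypothetical PyTorch-style sketch (not the original AlexNet code, and much smaller than the real 8-layer, two-GPU model) showing ReLU activations, dropout, and SGD with momentum wired together:

      import torch
      import torch.nn as nn

      # Tiny AlexNet-flavored CNN: conv layers with ReLU, pooling, dropout before the classifier.
      # Purely illustrative; the real model was 8 layers split across two GPUs.
      model = nn.Sequential(
          nn.Conv2d(3, 64, kernel_size=11, stride=4, padding=2), nn.ReLU(),
          nn.MaxPool2d(kernel_size=3, stride=2),
          nn.Conv2d(64, 192, kernel_size=5, padding=2), nn.ReLU(),
          nn.MaxPool2d(kernel_size=3, stride=2),
          nn.Flatten(),
          nn.Dropout(p=0.5),
          nn.Linear(192 * 13 * 13, 1000),  # 1000 ImageNet classes
      )

      # SGD with momentum, the optimizer family used for AlexNet-style training.
      optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9, weight_decay=5e-4)

      # One training step on a dummy 224x224 batch (real training needs ImageNet-scale data and a GPU).
      images, labels = torch.randn(8, 3, 224, 224), torch.randint(0, 1000, (8,))
      loss = nn.CrossEntropyLoss()(model(images), labels)
      optimizer.zero_grad()
      loss.backward()
      optimizer.step()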

  • Before AlexNet, SVMs were the best algorithms around. LeNet was the only comparable success story for NNs back then, and it was largely seen as limited to MNIST digits because deeper networks were too hard to train. People used HOG+SVM, SIFT, SURF, ORB, older Haar / Viola-Jones features, template matching, random forests, Hough transforms, sliding windows, deformable part models… so many techniques that were made obsolete once the first deep networks became viable (a rough sketch of the classic HOG+SVM pipeline is at the end of this comment).

    The problem is that your schooling was correct at the time, but the march of research progress eventually brought 1) large, million-image supervised datasets (ImageNet) and 2) larger, faster GPUs with more on-card memory.

    It was taken as fact back in ~2010 that SVMs were superior to NNs in nearly every respect.

    Source: started a PhD in computer vision in 2012
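
    For anyone curious what those pre-deep-learning pipelines looked like, here is a minimal, hypothetical sketch (assuming scikit-image and scikit-learn, with made-up data standing in for a real dataset) of the HOG+SVM recipe: hand-crafted features, then a learned linear classifier on top:

      import numpy as np
      from skimage.feature import hog    # hand-crafted HOG descriptor
      from sklearn.svm import LinearSVC  # linear SVM classifier

      # Made-up stand-in data: 200 grayscale 64x64 crops with binary labels
      # (think "pedestrian" vs "background"); real work used curated datasets.
      images = np.random.rand(200, 64, 64)
      labels = np.random.randint(0, 2, size=200)

      # Fixed, hand-designed feature extraction: one HOG vector per image.
      features = np.array([
          hog(img, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
          for img in images
      ])

      # The only learned part is the linear SVM on top of the fixed descriptors,
      # in contrast to a CNN, which learns the feature representation itself.
      clf = LinearSVC(C=1.0).fit(features, labels)
      print("train accuracy:", clf.score(features, labels))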

  • For those who don’t know what this is from, it’s “Inside,” Bo Burnham’s 2021 Netflix special that came out during the COVID lockdown.

    If you haven’t watched it, go do that immediately. Also, a bit of a warning: it deals with some heavy themes that were very relevant a year or so into lockdown. I rewatched it recently and had to fight off some PTSD feelings.

  • I was asleep in bed one night many years ago when a loud bang startled me awake. I thought for sure somebody was trying to break in through a window or something. Went out, checked the house: nothing amiss. Went outside, checked around for any sign of something being off: not a thing. I then sat in my living room for 30 minutes trying to decide whether it had been a dream or whether it would happen again. Still nothing. I went back to sleep.

    The next morning I made coffee and sat down at my MacBook Pro. It was an older model, but I immediately noticed something was wrong: the entire laptop was sitting slightly elevated off the table. The battery had apparently swollen in the middle of the night, with enough force to tear the bottom aluminum cover loose at three of the screw holes. Found the source of the noise, and immediately took the laptop in for disposal.

    I was very lucky it didn’t catch fire, although that would have quickly solved my midnight mystery.

  • Their stance is that Trump came out of his two impeachments politically stronger than he went into them, and that unless ~~14~~ 20 GOP Senators suddenly decide to switch their allegiance, the act of impeachment is practically useless.

    Edit: I fucked up the math, thanks commenter

  • Permanently Deleted

  • That would be a pretty big security hole in iOS if it were allowed, but it isn’t. Notifications and other system UI elements are rendered on top of the underlying app, which cannot access or see the full screen’s canvas. You can see a practical implementation of this kind of “snapshot” testing in code:

    https://github.com/uber/ios-snapshot-test-case

  • Something that was said to me long ago by somebody I admired, back when I was a fuckwit teenager: “if you’re not embarrassed of the person you were last year, you’re not growing enough.” That has stuck with me for decades. I find it still applies, and I’m scared of the day I realize it doesn’t.