AIfu
That's gold. I like it.
Damn, HP doesn't mess around. I'm going to stop trashing them around the office.
And by costlier he means power consumption. Ten thousand times more powerful! Still can't do basic math without handing it off to a calculator.
I was about to laugh about 2020 being cyberpunk, but come to think of it, 2020 was the most cyberpunk year so far, with everyone stuck inside doing everything on the internet.
I will never get tired of that saltman pic.
probably just a pun on https://en.wikipedia.org/wiki/Fotomat
Looking forward to the LLM vs. LLM PRs with hundreds of back-and-forth commit / request-changes / commit cycles. Most of it just flipping a field between final and not final.
Wait, is this how Those People claim that Copilot actually “improved their productivity”? They just don’t fucking read what the machine output?
Yes, that's exactly what it is. That and boilerplate, but it probably makes all kinds of errors that they don't notice, because the build didn't fail.
My first memory of programming was spending hours with my older cousin typing a BASIC program into his VIC-20, from a magazine where the last, like, ten pages were nothing but hexadecimal numbers. It ended up being a Robotron clone. We played it for a while, then turned off the computer and it was gone.
I loved making maps for Q3. I made so many of them, and some even got into rotation on some servers. The simplicity was perfect for someone like me, just brushes and shaders. When UT2K4 came out I decided to try to make maps for that, but everything was intricate 3D models, which I couldn't do, so I gave up and went back to Q3.
I even made some maps for Q1, but I think I spent most of my time trying to make mods. My favorite was an inferno gun from Battletech: Crescent Hawk's Inception, which used the rocket launcher to shoot a fireball that would stick to the target and do big damage in ticks.
Well, they used to have 700 customers.
Skimmed the paper, but I don't see the part where the game engine was being played. They trained an "agent" to play Doom using ViZDoom, and trained the diffusion model on the agent's "trajectories". But I didn't see anything about giving the agent the output of the diffusion model for its gameplay, or the diffusion model reacting to input.
It seems like it was able to generate the Doom video based on a given trajectory, with the assumption that the trajectory could have been real-time human input? That's the best I can come up with. And the experiment was just some people watching video clips, which doesn't track with the claims at all.
They added those at my work and they are terrible. A picture of the company CEO standing in front of a screen with text on it announcing a major milestone? "Man in front of a screen." You could get more information from the image filename.
It turns out AI really is going to try to wipe out the humans. Just, you know, indirectly, through human activity trying to make it happen.
Where's the part where they have people play with the game engine? Isn't that what they're supposedly running, a game engine? Sounds like what they really managed to do was recreate video of someone playing Doom, which is a yawn.
Huh. That sounds exactly like those work from home scams. Surely an AI company wouldn't do such a thing. Right?
The only bright spot is that the new educational AI model doesn’t exist yet and there’s plenty of time for the whole project to go sideways before launch.
Do you really think that will stop them?
it cannot handle subtraction going negative
hi, another lurker from reddit here. maybe i'll try posting this time around