The programming described in the article is spectacular too. Imagine working with 68 KB of space. I got to talk to someone who worked on the team once, which was probably the culmination of my life.
I started programming as a kid way back in the ZX Spectrum days, and that machine had even less memory than that.
You can get a surprisingly large amount of functionality if you're hand-coding assembly (I actually made a Minesweeper clone for the Spectrum like that).
Even nowadays, there is the whole domain of microcontrollers, some of which are insanely tiny (for example, the ATtiny202, which has 2 KB of flash and 128 bytes of RAM), and you can get a surprising amount of functionality even in C, since modern C compilers are extremely efficient.
(That said, the 202 is the extreme low end and barely useful, but I do have an automated plant-watering system I designed, complete with low-battery detection and signalling, running on an ATtiny45, an older chip with twice as much flash and RAM.)
In my experience, if there is no UI on a screen, you can do a ton of functionality in very little memory by using a compiled language that can optimize for size (like C) and not dragging in a ton of oversized libraries as dependencies. Graphical elements tend to use quite a bit of memory, and if you're doing animation you also need an in-memory buffer the size of the video memory to get double-buffering for smoothness; that buffer alone can add up to a lot of memory, depending on resolution and bytes per pixel. Meanwhile, there are quite complex functional elements out there (like full TCP/IP stacks) that fit in a few KB of memory.
I saw an interesting video about the first drone that flew on Mars. They programmed the flights in advance and it then executed them autonomously. I think that is even more impressive, since it would not have been possible to intervene if something went wrong. By the time the data was received, the drone had already landed.
Snapdragon took care of image processing, guidance processing, and storing flight data—with readings 500 times a second—while the microcontroller was in charge of navigation and running the helicopter’s motors.
It's kinda mind-blowing that the same hardware from my trusty S5 (currently gathering dust in a drawer somewhere, rip) powered the flight of a drone on Mars.
All of that with no GPS to get the drone's location. They relied on a camera under the drone, basically acting like an optical mouse sensor, to track its position.
I am not downplaying the supreme engineering of the Mars rover team, especially because there is no GPS on Mars, but DJI has pre-programmed drone flights, called missions, that work with their consumer drones.
Well, they used a camera to track features on the ground for navigating, but then flew into a sandy area with too few features and crashed xD. To be fair, it was never intended to leave the initially planned area in the first place. And they made the most of it: the drone is now a lil weather station reporting temperature and pressure.
The rovers themselves have to do the entire landing (called EDL in NASA-speak) autonomously. The process takes 11 minutes, so, likewise, by the time we hear the report that EDL has started, the rover is already on the ground.
Logically, given that we are still getting transmissions from the remote vehicles, either there are no aliens shooting back, the aliens have lousy aim or really bad weapons, or they've long destroyed those vehicles and what we're receiving are fake transmissions from the aliens.
So it is indeed possible that the aliens are shooting back but we can't tell from this side.
It's like playing Age of Empires over dialup. One minute you're happily building a little army and keeping your farms going. Then some asshole with cable internet comes along and faster than you can blink, your army is destroyed, villagers murdered, and your city burned to the ground.
I never figured out how, but it tended to feel impossibly early in the game too, as if the opponent had already been developing their economy for at least as long as I had before the game had even started.
It depends on what type of game you are playing, and how good the game's lag compensation is. I've played games just fine with a ping as high as 200ms.
I gave up sim racing online after a crash and seeing the other player's replay of it. I didn't think I was at fault, but because of the lag, I totally was.
My ping from Australia to Europe was just too high to ensure others could have a safe race. When everyone else has 20-40 ms ping and I'm racing at 150+ ms, it's just too much lag to be safe on the track.
It would only be comparable if NASA scientists were racing against someone else controlling another vehicle up there with lower ping.
P.S. I'm not saying it isn't challenging - it surely is, but it's like connecting to your home computer over a shitty connection to play a single player game.
I guess I don't really play any games where that tiny difference in ping time matters that much... That's less time than most people can even measure without tools.
I think over 100 ms it starts making your off-GCDs clip in FFXIV, and affects stuff like getting 6 hits in properly for Wildfire on MCH (probably some other stuff too). It's not unplayable, but it is frustrating and distracting, so it can cause an ADHD person like me significant stress when doing content like Savage raids and Ultimates.
As an amateur, in a fast-paced shooter, against an equally skilled player, it went from a fair match with equal pings to one player dominating the other at 100 vs. 70 ms ping.
Back in my day, controls not matching the gameplay was kind of built in as part of the challenge: Frog Master, Dragon's Lair, Space Ace. Probably others I'm forgetting.