Wireless has a lower minimum latency than wired. That's why trading houses set up microwave relay towers from Chicago to NYC, to achieve the lowest possible latency for their trades between the two markets.
Wired gives better stability due to almost zero interference noise. The primary cause of sucky WiFi speed/stability is having too many other people's routers nearby.
Ehhh... not quite. There's evidence that signals in copper run closer to the speed of light (aka c) than in fiber. Light through glass travels at around 2/3 c, making fiber the slowest option.
Wireless technically runs at the speed of light; the atmosphere slows it a tiny bit below c, but it's as close as we can get.
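To put rough numbers on that, here's a back-of-the-envelope sketch of one-way propagation delay for the Chicago-NYC case. The ~1,150 km straight-line distance and the velocity factors (near-c for microwave through air, ~2/3 c for light in glass) are illustrative assumptions, not measured figures:

```python
# One-way propagation delay over a straight-line path.
# Distance and velocity factors are ballpark assumptions.
C = 299_792_458          # speed of light in vacuum, m/s
DISTANCE_M = 1_150_000   # rough Chicago -> NYC great-circle distance

def one_way_delay_ms(distance_m: float, velocity_factor: float) -> float:
    """Delay in milliseconds for a signal moving at velocity_factor * c."""
    return distance_m / (C * velocity_factor) * 1000

air_ms = one_way_delay_ms(DISTANCE_M, 0.9997)  # microwave through air, ~c
fiber_ms = one_way_delay_ms(DISTANCE_M, 0.67)  # light in glass, ~2/3 c

print(f"microwave: {air_ms:.2f} ms, fiber: {fiber_ms:.2f} ms")
```

That works out to roughly 3.8 ms through the air versus roughly 5.7 ms through fiber, and real fiber routes are longer than the straight line, so the gap for the traders is even bigger in practice.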
There's also a long-running argument among physicists and electrician YouTubers about the speed of electricity through a wire. I don't fully understand the conclusions; they were articulated quite well by the YouTubers, it just didn't stick in my brain. The premise: how fast would a lightbulb illuminate if it had one light-second of pure copper (or superconducting) wire between the power source and the bulb, with little to no resistance? It's interesting but nuanced and complex.
Wifi, being EM waves (same as light), should run the fastest, with copper ethernet close behind and fiber dragging its heels at 2/3 c. However, in practical applications, wifi has more to overcome since it's a shared medium. Copper and fiber have a dedicated medium, so they have no competition in signaling; wifi needs to contend with everything from other wifi networks, to spurious emissions from other frequencies, even background cosmic radiation, as well as itself (it's half duplex).

Because of all that, you generally end up with wifi in last place: it has so many protections and checks that it delays itself to ensure its transmission will be received intact. The packets are generally larger and take longer to get started, so all the additional (mostly artificial) slowdowns make it slower.

However, if you use a pair of highly directional antennas on different but otherwise equivalent frequencies for send/receive, cut out a lot of the other factors by designing the system well, then disable most of the protections because they're not needed by design, it will be faster, at least in terms of latency, than fiber or copper in almost every case.
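For a feel of where those "mostly artificial" slowdowns come from, here's a toy sketch of the CSMA/CA medium-access wait that sits in front of every wifi frame: an interframe gap plus a random backoff whose window doubles after each failed attempt. The timing constants loosely follow common 802.11 OFDM values, but this is an illustration, not a simulator:

```python
import random

# Toy CSMA/CA access-delay sketch. SIFS/slot/CWmin loosely follow
# 802.11 OFDM PHY values; treat the exact numbers as assumptions.
SIFS_US = 16
SLOT_US = 9
DIFS_US = SIFS_US + 2 * SLOT_US   # gap the radio must see idle: 34 us
CW_MIN = 15                       # initial contention window
CW_MAX = 1023

def access_delay_us(retries: int) -> int:
    """Delay before one transmit attempt, after `retries` failures."""
    cw = min((CW_MIN + 1) * 2**retries - 1, CW_MAX)  # window doubles
    backoff_slots = random.randint(0, cw)            # random draw
    return DIFS_US + backoff_slots * SLOT_US

print(access_delay_us(0))  # fresh frame: 34 us + up to 15 slots
print(access_delay_us(4))  # after 4 collisions: up to 255 slots
```

Every node on the channel runs this dance independently, which is why adding nodes (not just networks) inflates latency even when total throughput demand is low.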
Since designing a multi-access system that doesn't need wifi's protections is borderline impossible, this is limited to very controlled point-to-point systems where both ends are tightly constrained.
So the argument "wifi has a lower minimum latency" is correct, but irrelevant in 99.99% of use cases. Copper is easier and cheaper than fiber and actually propagates faster, but it's only viable for short runs, up to 100m in most cases. Fiber, while "slow" at 2/3 c, is better for longer distances since there's far less line loss across the glass per foot.
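The line-loss gap is dramatic enough that a trivial calculation shows why copper tops out around 100m while fiber spans tens of kilometers. The dB-per-distance figures below are ballpark assumptions (Cat6-class twisted pair at high frequency vs single-mode fiber at 1550 nm), so take the exact numbers loosely:

```python
# Rough attenuation comparison; the per-km figures are ballpark
# assumptions, not measurements from a spec sheet.
COPPER_DB_PER_KM = 200.0   # ~20 dB per 100 m of twisted pair
FIBER_DB_PER_KM = 0.2      # single-mode fiber @ 1550 nm

def loss_db(length_km: float, db_per_km: float) -> float:
    """Total signal attenuation over a run of the given length."""
    return length_km * db_per_km

print(loss_db(0.1, COPPER_DB_PER_KM))  # copper at 100 m: already ~20 dB
print(loss_db(40.0, FIBER_DB_PER_KM))  # fiber at 40 km: only ~8 dB
```

In other words, a 40 km fiber run loses less signal than 100m of copper, which is the whole reason fiber wins long-haul despite its slower propagation.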
This is a very deep topic and I'm no physicist, but I've been endlessly fascinated by this issue for a very long time. The information here is the result of my research over many years. I still consider fiber to be the gold standard of data communication, ethernet to be next-best and overall best for relatively short connections, and wireless to be dead last due to all the challenges it faces that are not easily overcome.
Is the notebook or desktop wifi NIC and antenna important or only the router? Because when I had a shitty laptop a few years back the latency sucked ass, both at home and at my university (where I hope they had good network components but idk)
With wifi, everything is important, even the number of people connected on your channel... not the number of wifi networks on the channel, but the number of total nodes using the same channel. The AP hardware factors in, your wifi card (client) factors in, even drivers and other things can factor in. So do the band (2.4/5/6 GHz), non-wifi traffic, spurious emissions from harmonic frequencies, even electrical noise from gadgets and other devices nearby. You can even factor in distance to the AP and cosmic background noise.
On top of that, it's half duplex, so only one node can successfully transmit at a time. So it interferes with itself.
It's a complete mess of unknowns and unknowable things, unless you have a very good spectrum analyser to look into it.
IMO, this is what makes WiFi so terrible. There are simply too many factors that can be slowing you down, most of which you can't see and aren't obvious.
Your experience varies massively depending on your RF environment. In my suburban neighborhood, I’m getting a stable 3.4ms to my router. The same hardware when I was in a dense urban environment was around 11ms. I’ve never looked at retry counters, but if I had to guess, I’m getting close to zero right now, but was getting considerably higher in a dense area.