We have introduced IBM Condor, a 1,121 superconducting qubit quantum processor based on our cross-resonance gate technology. Condor pushes the limits of scale and yield in chip design with a 50% increase in qubit density, advances in qubit fabrication and laminate size, and includes over a mile of high-density cryogenic flex IO wiring within a single dilution refrigerator.
So, it sounds like this is actually another fridge-sized system.
I may be mistaken, but the fridge-sized copper monstrosity is the system that cools the quantum chip, so unless they've miniaturized the cooling system, that hasn't changed.
From what I remember, the chip itself is pretty small; the size is all due to the cooling hardware.
Also keep in mind you've probably seen a development version of a quantum computer, where things are set up to be easily accessible for fixing and tinkering, without much regard for size or efficient use of space.
It's faster than Moore's law, but I don't know whether it can be sustained.
For years, IBM has been following a quantum-computing road map that roughly doubled the number of qubits every year. The chip unveiled on 4 December, called Condor, has 1,121 superconducting qubits arranged in a honeycomb pattern. It follows on from its other record-setting, bird-named machines, including a 127-qubit chip in 2021 and a 433-qubit one last year.
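For a rough sense of "faster than Moore's law," you can back the year-over-year growth out of the counts in that paragraph. A quick sanity check (my arithmetic, not IBM's):

```python
# Year-over-year growth implied by IBM's published qubit counts.
counts = {2021: 127, 2022: 433, 2023: 1121}
years = sorted(counts)
for prev, cur in zip(years, years[1:]):
    print(f"{prev} -> {cur}: {counts[cur] / counts[prev]:.1f}x")
# Moore's law is roughly 2x every two years, i.e. ~1.4x per year,
# so ~2.6-3.4x per year is well ahead of that pace (for now).
```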
For now they're only being used for research purposes, for example simulating quantum effects in many-body physics and implementing error correction for future quantum computers. Any real applications still need some time, but the pace of development is really quite something.
Currently, there's basically only one real-world application we know of for sure: factoring numbers into their prime factors. And we can't even know for sure whether there will be more.
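For the curious: the quantum part of Shor's algorithm only finds the period r of f(x) = a^x mod N. Turning that period into prime factors is plain classical math, which a sketch like this can show (the toy numbers are mine):

```python
from math import gcd

def factors_from_period(N, a, r):
    """Classical post-processing step of Shor's algorithm.

    Given N to factor, a random base a, and the period r of
    f(x) = a^x mod N (the part the quantum computer finds),
    recover nontrivial factors of N when r is even and
    a^(r/2) != -1 (mod N); otherwise retry with a new a.
    """
    if r % 2 != 0:
        return None  # odd period: bad luck, pick another a
    half = pow(a, r // 2, N)
    if half == N - 1:
        return None  # a^(r/2) == -1 mod N: also retry
    p, q = gcd(half - 1, N), gcd(half + 1, N)
    if p * q == N and 1 < p < N:
        return p, q
    return None

# Toy example: N = 15, a = 7 has period r = 4 (7^4 = 2401 = 160*15 + 1)
print(factors_from_period(15, 7, 4))  # -> (3, 5)
```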
Wasn't there a study suggesting that, with the current approach of averaging noisy results down to a few finite states, they might never be able to do the thing they were developed for: cracking passwords?
If by "cracking passwords" you mean reversing password hashes in a database, quantum computers aren't going to make a big dent there. The standard industry ways of doing that wouldn't be affected much by QCs. Breaking encryption, OTOH, with QCs is a concern, but also vastly overrated. It would take orders of magnitude more qubits to pull off than what's been worked on so far, and it may not be feasible to juggle that many qubits in a state of superposition.
I get really annoyed when people focus on breaking encryption with QCs. They are far more interesting and useful than that.
QC could make logistics more efficient. Have you ever seen photos of someone unpacking a giant Amazon box holding one little microSD card? Amazon isn't dumb about these things, but our best methods of packing an entire truck are educated guesses. Exact packing algorithms would take too long to calculate how to perfectly pack it, so they come up with a solution that seems OK, and that leads to a few "filler" boxes that are unnecessarily large, among other inefficiencies. QC could potentially solve this problem without taking the age of the universe to come up with a solution.
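For reference, here's the kind of fast classical heuristic used today: a first-fit-decreasing sketch I put together (toy sizes, and whether quantum computers will actually beat heuristics like this at scale is still an open question):

```python
def first_fit_decreasing(items, bin_size):
    """Classical bin-packing heuristic: sort items largest-first, put
    each into the first bin it fits in, open a new bin otherwise.
    Fast, but can use noticeably more bins than the true optimum."""
    bins = []
    for item in sorted(items, reverse=True):
        for b in bins:
            if sum(b) + item <= bin_size:
                b.append(item)
                break
        else:
            bins.append([item])
    return bins

print(first_fit_decreasing([7, 5, 4, 4, 3, 2, 2, 1], bin_size=10))
# -> [[7, 3], [5, 4, 1], [4, 2, 2]]  (3 bins; happens to be optimal here)
```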
The order in which that truck delivers those packages can also be made more efficient with QC.
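Route ordering is basically the traveling-salesman problem, and today that also gets fast approximations rather than exact answers. A greedy nearest-neighbor sketch with made-up coordinates:

```python
def nearest_neighbor_route(depot, stops):
    """Greedy route heuristic: from the current location, always drive
    to the closest remaining stop. Not optimal, but quick; exact TSP
    search blows up combinatorially as stops are added."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    route, here, todo = [depot], depot, set(stops)
    while todo:
        here = min(todo, key=lambda s: dist(here, s))
        todo.remove(here)
        route.append(here)
    return route

print(nearest_neighbor_route((0, 0), [(2, 3), (5, 1), (1, 1), (4, 4)]))
# -> [(0, 0), (1, 1), (2, 3), (4, 4), (5, 1)]
```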
Then there are molecular simulations, which hold the promise of medications that are more effective, more likely to pass trials, and have fewer side effects. These could be done far faster on a QC.
I don't think the claim is that the only use of quantum computers is password cracking, rather that it's one of the types of workloads that's much easier on a quantum computer.
Joke's on them, now they have to pay my electricity bill.
They've already developed quantum-proof encryption algorithms, something something lattices something something. Anyway, as long as the website has been updated to use the new algorithms you'll be okay. You may just have to change your password one time in case it was compromised under the old encryption scheme.
Post-quantum cryptography is under development and is slowly being introduced in order to evaluate it and to prevent store-now-decrypt-later attacks... but this is generally in more niche applications. SSH recently adopted post-quantum cryptography for key exchange, but it uses a hybrid approach with traditional cryptography in case the post-quantum stuff proves to be not as strong as we think... Signal is experimenting with post-quantum stuff as well. As far as I know, though, post-quantum cryptography hasn't seen wide deployment, and I don't think any of it is used with HTTPS yet (certainly not commonly, anyway).

Depending on what you care about, this could be a problem. If you just care that nobody else can authenticate as you, then yeah, once everything is moved over to post-quantum stuff you can just change all your passwords and hopefully you'll be good... If you care that the data is private, then this is a big problem: in theory somebody could scrape all of the messages you've sent and the contents of everything that you've done on the web (probably government agencies and not normal people, but maybe this information later gets leaked to the public too). This could also be a problem for authentication; for instance, if you've ever logged into your bank account you've probably seen your account and routing numbers, which somebody could take and use to transfer money, in theory.
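The hybrid idea itself is easy to sketch: derive the session key from both shared secrets, so an attacker has to break the classical scheme and the post-quantum one. A toy illustration, assuming the Python cryptography package for the classical half; the post-quantum half here is a placeholder stand-in, not a real KEM:

```python
import hashlib
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

def hybrid_session_key(classical_secret: bytes, pq_secret: bytes) -> bytes:
    # Mix BOTH secrets into the key, so it stays safe as long as
    # at least one of the two exchanges remains unbroken.
    return hashlib.sha256(classical_secret + pq_secret).digest()

# Classical half: an ordinary X25519 key exchange.
alice, bob = X25519PrivateKey.generate(), X25519PrivateKey.generate()
classical_secret = alice.exchange(bob.public_key())

# Post-quantum half: placeholder bytes standing in for a real KEM
# shared secret (e.g. from Kyber/ML-KEM); NOT actual cryptography.
pq_secret = b"\x00" * 32

session_key = hybrid_session_key(classical_secret, pq_secret)
print(session_key.hex())
```

As I understand it, that's roughly the shape of what OpenSSH's hybrid key exchange does, except with real post-quantum material feeding the second input.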
It's also worth noting that, as far as I know, we don't actually know for certain that the post-quantum cryptography we've developed is secure. I think all we know is that it isn't vulnerable to Shor's algorithm, but there could be other exploits we don't know about. The same is of course true for all the cryptography we use today. We don't actually know how hard factoring into primes is, for instance, but those algorithms have been in use for a long time and under a lot of scrutiny, so we have more confidence in them.
It takes about a billion qubits to break 2048-bit RSA, so it'll be a while. I saw something recently about reducing that to about 20 million qubits, but that's still a while off.
IIRC, the machines out there advertising far bigger qubit counts aren't really comparable, either. They use a ton of qubits, expecting a lot of them to fall out of superposition, but hoping enough survive to get a useful result. IBM's approach is to try to get the most out of every qubit. Either approach is valid, but IBM's 1,121 qubits can't be directly compared to the much larger counts quoted elsewhere.
It's worth noting that the laser was much the same way. It was described early on as a solution in search of a problem, and lasers have had an incredible impact on technology.
It's actually impossible to do a direct comparison of flops to what I guess we'd call "quflops," since the algorithms aren't directly comparable. Quantum computers are good at quantum algorithms, which can do operations in a single time step that a classical computer couldn't; likewise, simulating a classical computer on a quantum computer would be very resource-intensive.
I think it's closer to 20,000,000, and that's beyond Noisy Intermediate-Scale Quantum (NISQ) computing, meaning modern chips would need to double or quadruple their qubit counts for error detection and error correction in order to run even basic algorithms. That's not to mention that they'd need to stay supercooled for up to eight hours, holding superposition without decohering into their ground states, in order to run Shor's algorithm.
TL;DR: We need a more than 20,000x improvement in qubit count, plus better tech, to break RSA, but this is a good step forward!
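Where that 20,000x figure comes from, roughly, using the commonly cited ~20 million noisy-qubit estimate for RSA-2048:

```python
# Back-of-the-envelope for the "20,000x" figure above.
estimated_qubits_needed = 20_000_000  # widely cited estimate for RSA-2048
condor_qubits = 1_121
print(f"scale-up needed: ~{estimated_qubits_needed / condor_qubits:,.0f}x")
# -> scale-up needed: ~17,841x
```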
So, basically, we're still in the ENIAC stage of quantum computers. They're cool and all, and can do some awesome stuff, but are nowhere near the potential they could reach.