This quantum computer built on server racks paves the way to bigger machines

A Canadian startup called Xanadu has built a new quantum computer it says can be easily scaled up to achieve the computational power needed to tackle scientific challenges ranging from drug discovery to more energy-efficient machine learning.

Aurora is a “photonic” quantum computer, which means it crunches numbers using photonic qubits—information encoded in light. In practice, this means combining and recombining laser beams on multiple chips using lenses, fibers, and other optics according to an algorithm. Xanadu’s computer is designed in such a way that the answer to an algorithm it executes corresponds to the final number of photons in each laser beam. This approach differs from the one used by Google and IBM, which involves encoding information in properties of superconducting circuits.

Aurora has a modular design consisting of four similar units, each installed in a standard server rack that is slightly taller and wider than the average human. To make a useful quantum computer, “you copy and paste a thousand of these things and network them together,” says Christian Weedbrook, the CEO and founder of the company. Ultimately, Xanadu envisions a quantum computer as a specialized data center, consisting of rows upon rows of these servers. This contrasts with the industry’s earlier conception of a specialized chip within a supercomputer, much like a GPU.

But this work, which the company published last week in Nature, is just a first step toward that vision. Aurora used 35 chips to construct a total of 12 quantum bits, or qubits. Any useful applications of quantum computing proposed to date will require at least thousands of qubits, or possibly a million. By comparison, Google’s quantum computer Willow, which debuted last year, has 105 qubits (all built on a single chip), and IBM’s Condor has 1,121.

Devesh Tiwari, a quantum computing researcher at Northeastern University, describes Xanadu’s progress with a hotel-building analogy. “They have built a room, and I’m sure they can build multiple rooms,” he says. “But I don’t know if they can build it floor by floor.” Still, he says, the work is “very promising.” Xanadu’s 12 qubits may seem like a paltry number next to IBM’s 1,121, but Tiwari says this doesn’t mean that quantum computers based on photonics are running behind. In his opinion, the number of qubits reflects the amount of investment more than it does the technology’s promise.

Photonic quantum computers offer several design advantages. The qubits are less sensitive to environmental noise, says Tiwari, which makes it easier to get them to retain information for longer. It is also relatively straightforward to connect photonic quantum computers via conventional fiber optics, because they already use light to encode information. Networking quantum computers together is key to the industry’s vision of a “quantum internet” where different quantum devices talk to each other. Aurora’s servers also don’t need to be kept as cool as superconducting quantum computers, says Weedbrook, so they don’t require as much cryogenic technology. The server racks operate at room temperature, although photon-counting detectors still need to be cryogenically cooled in another room.

Xanadu is not the only company pursuing photonic quantum computers; others include PsiQuantum in the US and Quandela in France. Other groups are using materials like neutral atoms and ions to construct their quantum systems.

https://lnkd.in/g467ftCT
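To make the programming model above concrete, here is a minimal sketch using Strawberry Fields, Xanadu's open-source Python library for photonic circuits. The specific gates and parameters are illustrative assumptions for a two-mode toy circuit, not Aurora's actual program; the point is that the "answer" is read out as the photon count in each optical mode, just as the article describes.

```python
# Minimal photonic-circuit sketch using Xanadu's open-source
# Strawberry Fields library (pip install strawberryfields).
# Gates and parameters are illustrative only, not Aurora's circuit.
import strawberryfields as sf
from strawberryfields.ops import Sgate, BSgate, MeasureFock

prog = sf.Program(2)  # two optical modes ("laser beams")

with prog.context as q:
    Sgate(0.5) | q[0]                 # squeeze light into mode 0
    BSgate(0.4, 0.0) | (q[0], q[1])   # beam splitter mixes the two modes
    MeasureFock() | q                 # count photons in each mode

# Simulate on the Fock backend; the result is the final photon
# number per mode, which encodes the output of the computation.
eng = sf.Engine("fock", backend_options={"cutoff_dim": 5})
result = eng.run(prog)
print(result.samples)  # e.g. [[0, 1]]: photon counts per mode
```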
More Relevant Posts
-
the POWERFUL computers of the future:

Google's quantum computer, known as Sycamore, represents a significant milestone in the field of quantum computing. Developed by Google AI Quantum, the team dedicated to advancing quantum algorithms and technology, Sycamore made headlines in October 2019 when Google announced it had achieved what is known as "quantum supremacy." This term describes the ability of a quantum computer to solve a problem that is practically impossible for classical computers. Quantum supremacy was demonstrated when Sycamore completed a complex sampling task in 200 seconds—a task that, according to Google, would take the world's most powerful supercomputer approximately 10,000 years to solve.

The core of this achievement lies in Sycamore's architecture, which includes 54 qubits, or quantum bits. However, because one qubit was not functioning correctly, the historic computation was carried out with 53 qubits. These qubits are made from superconducting loops that can exist in multiple states simultaneously, unlike traditional bits, which are either 0 or 1. This allows quantum computers to process a vast number of possibilities simultaneously. Sycamore manipulates its qubits using finely tuned microwave pulses, achieving low error rates in quantum gates—essential for reliable quantum operations.

The specific task Sycamore completed involved sampling the outputs of randomly chosen quantum circuits (in effect, generating vast numbers of random bitstrings) to demonstrate the processor's capabilities. While this task was not directly useful beyond proving the concept, it showcased the potential for future applications. Quantum computers like Sycamore could eventually revolutionize fields by performing complex calculations that are currently unfeasible, such as simulating molecular structures for drug discovery, optimizing large systems for logistics, or cracking codes that are secure under today's encryption standards.

As quantum technology continues to develop, the implications of this quantum leap by Google's Sycamore will likely be far-reaching, paving the way for new breakthroughs in multiple scientific and technological arenas. #quantumcomputing
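For readers who want to poke at the idea, here is a toy version of the sampling task in Cirq, Google's open-source Python library. Two qubits stand in for Sycamore's 53, and the gate set and circuit depth are arbitrary assumptions; the point is the workflow: build a random circuit, sample it many times, and examine the distribution of output bitstrings.

```python
# Toy version of random circuit sampling using Google's open-source
# Cirq library (pip install cirq). Sycamore did this with 53 qubits;
# two are enough to show the idea.
import random
import cirq

qubits = cirq.LineQubit.range(2)
circuit = cirq.Circuit()
for _ in range(5):  # a few layers of randomly chosen single-qubit gates
    for q in qubits:
        circuit.append(random.choice([cirq.X, cirq.Y, cirq.T, cirq.H])(q))
    circuit.append(cirq.CZ(qubits[0], qubits[1]))  # entangling layer
circuit.append(cirq.measure(*qubits, key="m"))

# Sample the circuit repeatedly and collect the bitstring distribution.
result = cirq.Simulator().run(circuit, repetitions=1000)
print(result.histogram(key="m"))  # counts per output bitstring
```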
-
Google's Quantum Computer Chip Completes a Benchmark Computation in 5 Minutes. 👊

Why is it buzzing around? 🤔 Let's take a quick scan.

🌟 Five years ago, Google introduced a chip that could solve a complex computational problem in 200 seconds.
🌟 In contrast, classical supercomputers would have taken 10,000 years to solve the same problem.
🌟 It has improved every year since.
🌟 Recently, Google showcased a quantum AI chip, Willow, which has shown double exponential growth in computing power.
🌟 Now, it can solve complex computations in 5 minutes—problems that would take a classical computer 10 septillion years (10 to the power 25) to process.

But how does it work? 🤔
🌟 Our traditional computers operate using memory that stores information in the form of bits.
🌟 A bit has two possible values: 0 or 1. It is realized by a transistor, which acts as a switch that is either on or off.
🌟 In quantum computing, we use a quantum bit (qubit), which can simultaneously represent both values—0 and 1—thanks to a principle called superposition.
🌟 When hundreds of qubits work together, advanced algorithms can solve far more complex problems much faster than conventional computers.

Is it ready for use? 🤔 Not yet.
🌟 Google has made progress in error correction, but quantum results are noisy: many runs are needed to extract a usable signal, which is then checked against classical computation.
🌟 Sometimes this takes over a million runs, but advances have cut the total time to mere minutes—still significantly faster than classical computers.
🌟 Initially, companies used RCS (Random Circuit Sampling) to compare quantum processors with the best classical algorithms running on supercomputers. Error rates are decreasing over time.

How far are we from using these chips? 🤔
🌟 It will likely take 3 to 4 years before these chips can be trained to solve complex problems.
🌟 Early commercial applications are still at least half a decade away.

Interesting times ahead. 🫡
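The bit-versus-qubit point above can be seen in a few lines of NumPy: a qubit's state is a two-component vector of complex amplitudes, and a Hadamard gate puts it into an equal superposition of 0 and 1. This is a generic textbook sketch, not Willow-specific code.

```python
# Plain-NumPy sketch of the bit-vs-qubit idea.
# A qubit's state is a 2-component complex vector; the Hadamard gate
# puts it into an equal superposition of 0 and 1.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)        # classical-like "0" state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

state = H @ ket0             # superposition: (|0> + |1>) / sqrt(2)
probs = np.abs(state) ** 2   # Born rule: measurement probabilities
print(probs)                 # [0.5 0.5]: equally likely to read 0 or 1
```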
-
What happens when computers begin to solve problems of their own creation, problems which humans can't even begin to understand? That future may be closer than you think...

There's been a recent breakthrough in quantum computing by Microsoft and Quantinuum: they have overcome a major hurdle in error correction. The researchers ran 14,000 experiments without a single error by using logical qubits on ion-trap hardware.

What does this mean? Essentially, this advance could help us create quantum computers that are more reliable and less prone to mistakes. These next-generation computers could unlock solutions to problems we currently consider impossible. What came before the universe? What's inside a black hole? Is time travel possible? Those answers are closer than ever!

#quantumcomputing #microsoft #technology #futureoftechnology #AI
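As a rough intuition for what a logical qubit buys you, here is a classical 3-bit repetition-code sketch in Python. It is far simpler than the quantum error-correcting codes Microsoft and Quantinuum actually used (quantum states cannot simply be copied), but it shows the core idea: spread one unit of information across several physical carriers, then use redundancy to detect and correct errors. The 5% physical error rate is an illustrative assumption.

```python
# Classical 3-bit repetition-code analogy for error correction:
# one logical bit is spread across three physical bits, and a
# majority vote corrects any single bit flip.
import random

def encode(bit):
    return [bit, bit, bit]          # one logical bit -> three physical bits

def noisy_channel(bits, p=0.05):
    # flip each physical bit independently with probability p
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    return int(sum(bits) >= 2)      # majority vote

trials = 100_000
failures = sum(decode(noisy_channel(encode(1))) != 1 for _ in range(trials))
print(f"logical error rate: {failures / trials:.4%}")
# ~0.7% logical error rate vs 5% per physical bit: redundancy wins
```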
-
Why do we need error-corrected quantum computers to be fast? Because error correction, while critical for quantum computing, introduces delays that directly limit performance. Let's break this down:

1. Quantum operations are inherently slow. Superconducting qubits (one of the fastest technologies) take about 1 microsecond to measure. By comparison, classical computers operate at sub-nanosecond speeds, a difference of over 1,000x.

2. Error correction adds complexity. Error correction doesn't just involve running quantum operations. It requires constant measurements to detect and correct errors. These measurements feed into classical decoders, like AlphaQubit, which analyze errors and decide on corrective actions.

3. Decoder latency becomes a bottleneck. Even if qubits are fast, the classical decoder must process measurement data in real time. For instance, Google's quantum error decoder introduces delays of 50–100 microseconds. As the quantum system scales (e.g., to larger qubit lattices), this delay could grow, further slowing down operations.

4. Real-time decoding is crucial for scalability. Fault-tolerant quantum computing requires scaling up from logical qubits (error-corrected units) to large systems capable of handling practical problems. Every additional delay in decoding slows down the entire system, making it impractical for meaningful computations.

Why does speed matter for error-corrected quantum computers?
- Fighting decoherence: Qubits are fragile and lose their quantum state (decoherence) quickly. Delays increase the risk of losing information.
- Efficiency at scale: Larger quantum systems need faster operations to maintain coherence across thousands or millions of qubits.
- Scientific practicality: A slow fault-tolerant quantum computer would struggle to outperform classical supercomputers in real-world tasks.

Key takeaway: Building fault-tolerant quantum computers isn't just about improving accuracy. It's about eliminating speed bottlenecks in qubit operation, error correction, and decoding. Advances like real-time decoding (e.g., AlphaQubit) are a step forward, but achieving practical, large-scale quantum computing demands further breakthroughs in both quantum and classical processing speeds.
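The latency argument in points 1-3 is easy to put into numbers. The back-of-envelope Python below uses only the figures quoted above (1 microsecond measurement, 50-100 microsecond decoder delay); taking the midpoint of the decoder range is an assumption for illustration.

```python
# Back-of-envelope arithmetic for the decoder bottleneck described above.
# Figures come from the post itself; they are illustrative, not vendor specs.
measure_us = 1.0    # superconducting qubit measurement time (microseconds)
decoder_us = 75.0   # midpoint of the quoted 50-100 us decoder delay

# If decoding cannot keep pace, each error-correction cycle is gated
# by the classical decoder rather than by the quantum hardware:
cycle_us = measure_us + decoder_us
cycles_per_second = 1e6 / cycle_us
print(f"~{cycles_per_second:,.0f} error-correction cycles per second")  # ~13,000

# Compare with a decoder that keeps pace with measurement:
ideal_cycles_per_second = 1e6 / measure_us
slowdown = ideal_cycles_per_second / cycles_per_second
print(f"decoder latency costs a ~{slowdown:.0f}x slowdown")  # ~76x
```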
-
More physical qubits, improved calibration, machine learning, and improved device fabrication have done it for Google's 'Willow' quantum chip. The extreme sensitivity of quantum states creates far more errors than in classical computing. That now looks to be changing, opening up a realm of capabilities for solving super complex problems. https://lnkd.in/gcMJNU5B
-
Google has unveiled its latest quantum chip, Willow, claiming significant advancements in quantum computing. While the company avoids explicitly claiming quantum supremacy, it highlights Willow's ability to perform certain calculations exponentially faster than classical computers. However, the significance of these claims is subject to debate, as other companies like IBM and Honeywell use different metrics to assess quantum computer performance. A more substantial breakthrough lies in Willow's ability to reduce errors as the number of qubits increases, bringing the industry closer to building practical, large-scale quantum computers. This achievement positions Google as a leader in quantum computing, paving the way for potential real-world applications in the future. #Google #TechGiants #chips #chipmaker #CPU #CPUs #Processors #semiconductor #semiconductors #semiconductorindustry #semiconductormanufacturing #quantumcomputing #technology #AI #artificialintelligence #innovation #futureoftech
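The "reduce errors as the number of qubits increases" result refers to below-threshold error correction: in the standard surface-code picture, adding qubits to grow the code distance suppresses the logical error rate by a roughly constant factor per step. The sketch below is illustrative only; the base error rate is made up, and the suppression factor of about 2 per distance step is only in the ballpark of what Google reported for Willow.

```python
# Illustrative sketch of below-threshold error suppression: growing the
# surface-code distance d shrinks the logical error rate by a roughly
# constant factor (lam) per distance-2 step. lam ~= 2 is in the ballpark
# Google reported for Willow; the base rate is an assumption.
base_error = 1e-2   # assumed logical error rate at distance 3
lam = 2.0           # suppression factor per distance-2 increase

for d in (3, 5, 7):
    data_qubits = d * d   # a distance-d surface-code patch uses ~d^2 data qubits
    logical_error = base_error / lam ** ((d - 3) / 2)
    print(f"d={d}: ~{data_qubits} data qubits, logical error ~{logical_error:.1e}")
# More qubits -> lower logical error rate, the opposite of the naive
# expectation that more hardware just means more noise.
```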
-
First Multiuse Optical Quantum Computer Comes to Japan

Riken's new machine has computing power equivalent to 1,000 qubits.

By Tim Hornyak, a Tokyo-based journalist and the author of Loving the Machine: The Art and Science of Japanese Robots.

[Image: This light modulation module is at the heart of Japan's new general-purpose optical quantum computer. Takahiro Kashiwazaki et al./NTT]

Quantum computers could in principle solve some complex mathematical problems that would take far too long on a regular, classical computer. However, efforts to make them practical and easily scalable have been stymied by the inherent instability of quantum states. Researchers in Japan recently developed an optical quantum computer that can be used for a variety of applications, a feature they say makes it the first general-purpose optical quantum computer in the world. While purpose-built optical quantum machines have been available for years, a general-purpose one has long been a goal of the industry.

"The previous optical quantum computers are purpose-specific devices, such as a boson sampling machine and small-scale quantum computers with around 10 qubits," says Hidehiro Yonezawa, team leader of the optical quantum control research team at the Riken Center for Quantum Computing. "Our quantum computer is a flexibly programmable quantum computer with a hundred analog quantum inputs."

The machine employs photons instead of superconducting electronic circuits, the preferred approach of Google and IBM, among others. Because it doesn't use superconductors, the computer operates at near-room temperature, doesn't need a cooling system, and can easily be scaled, according to the researchers from Riken (Japan's largest research institute), Nippon Telegraph and Telephone Corp. (its largest telecom), and the cloud computing platform Fixstars Amplify.

Most quantum computers are measured in terms of qubits, or quantum bits. While classical bits are either 0 or 1, qubits can have a value of either, or a bit of both simultaneously. Like classical computer bits, photons can have two states, a horizontal and a vertical polarization, but they can also exist in a superposition of those states, somewhere between horizontal and vertical.

https://lnkd.in/dgUWubWY
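The polarization picture in the last paragraph maps directly onto a two-component state vector. Below is a short NumPy sketch of that textbook idea; it is a generic illustration of a polarization qubit, not code for Riken's machine, and the polarization angle is an arbitrary assumption.

```python
# NumPy sketch of the polarization qubit described above: horizontal
# and vertical polarization play the roles of 0 and 1, and a photon
# can sit in any superposition between them.
import numpy as np

H_pol = np.array([1, 0], dtype=complex)   # horizontal -> "0"
V_pol = np.array([0, 1], dtype=complex)   # vertical   -> "1"

theta = np.pi / 8   # photon's polarization angle (illustrative)
photon = np.cos(theta) * H_pol + np.sin(theta) * V_pol  # superposition

# Measuring in the H/V basis gives probabilistic outcomes (Born rule):
p_horizontal = abs(np.vdot(H_pol, photon)) ** 2
p_vertical = abs(np.vdot(V_pol, photon)) ** 2
print(f"P(H) = {p_horizontal:.3f}, P(V) = {p_vertical:.3f}")  # 0.854, 0.146
```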
-
Google vs IBM

Last year, Google Quantum AI released its latest breakthrough in error-corrected quantum computing, which uses the "surface code" to group physical qubits into logical qubits. Recently, however, IBM released a rival to the surface code, called QLDPC (quantum low-density parity-check) codes, which may outperform it by requiring fewer qubits. This could be a game-changer, given that large-scale quantum computers don't yet exist and qubit resources are precious.

While the surface code is well understood and readily adapted to superconducting qubits, QLDPC demands more complex qubit connectivity and hasn't been explored as thoroughly. Different hardware designs, including ultracold atoms that allow flexible qubit connections, further complicate the race to achieve practical quantum error correction. Ultimately, whether Google's surface-code success is future-proof depends on how new codes and hardware innovations evolve in this competitive race.

#google #IBM #AI #quantum
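The "fewer qubits" claim can be made concrete with rough published numbers: a distance-d surface code spends roughly 2d² physical qubits per logical qubit, while IBM's flagship QLDPC construction (the [[144, 12, 12]] "gross code") packs 12 logical qubits into 288 physical ones. The quick Python comparison below is illustrative; real overheads depend on many hardware details.

```python
# Rough qubit-economy comparison behind the "fewer qubits" claim.
# Surface code: ~2*d^2 physical qubits (data + check) per logical qubit.
# IBM's "gross code" is a [[144, 12, 12]] QLDPC code: 288 physical
# qubits (144 data + 144 check) encode 12 logical qubits at distance 12.
d = 12                                   # code distance (for a like-for-like comparison)
surface_per_logical = 2 * d * d          # one logical qubit costs ~288 physical qubits
qldpc_total, qldpc_logical = 288, 12

print(f"surface code, d={d}: ~{surface_per_logical} physical qubits / logical qubit")
print(f"gross code:          {qldpc_total // qldpc_logical} physical qubits / logical qubit")
# ~288 vs 24: roughly a 12x saving in this illustrative example
```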
-
What is quantum computing?

Quantum computing is an emergent field of cutting-edge computer science harnessing the unique qualities of quantum mechanics to solve problems beyond the ability of even the most powerful classical computers. The field of quantum computing contains a range of disciplines, including quantum hardware and quantum algorithms. While still in development, quantum technology will soon be able to solve complex problems that supercomputers can't solve, or can't solve fast enough.

By taking advantage of quantum physics, fully realized quantum computers would be able to process massively complicated problems orders of magnitude faster than modern machines. For a quantum computer, challenges that might take a classical computer thousands of years to complete might be reduced to a matter of minutes.

The study of subatomic particles, also known as quantum mechanics, reveals unique and fundamental natural principles. Quantum computers harness these fundamental phenomena to compute probabilistically and quantum mechanically.

Four key principles of quantum mechanics

Understanding quantum computing requires understanding these four key principles of quantum mechanics:

- Superposition: the state in which a quantum particle or system can represent not just one possibility, but a combination of multiple possibilities.
- Entanglement: the process in which multiple quantum particles become correlated more strongly than regular probability allows.
- Decoherence: the process in which quantum particles and systems can decay, collapse, or change, converting into single states measurable by classical physics.
- Interference: the phenomenon in which quantum amplitudes can reinforce or cancel one another, making some outcomes more likely and others less likely.

Qubits

While classical computers rely on binary bits (zeros and ones) to store and process data, quantum computers can encode even more data at once using quantum bits, or qubits, in superposition. A qubit can behave like a bit and store either a zero or a one, but it can also be a weighted combination of zero and one at the same time. When combined, qubits in superposition scale exponentially: two qubits can represent four states at once, three can represent eight, and four can represent sixteen.

https://lnkd.in/dnFNK9Kz IBM #Quantum #Computing
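The exponential scaling in the last paragraph is easy to verify with a state vector: n qubits require 2^n complex amplitudes. The NumPy sketch below builds the textbook two-qubit Bell state (a Hadamard followed by a CNOT) to show superposition and entanglement together, then prints the amplitude counts; it is a generic illustration, not tied to any vendor's hardware.

```python
# NumPy sketch of the scaling described above: n qubits need 2**n
# complex amplitudes. Two qubits (4 amplitudes) are driven into an
# entangled Bell state with a Hadamard followed by a CNOT.
import numpy as np

ket00 = np.zeros(4, dtype=complex)
ket00[0] = 1                                      # start in |00>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                    # control = first qubit
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = CNOT @ np.kron(H, I) @ ket00              # (|00> + |11>) / sqrt(2)
print(np.round(state, 3))   # amplitudes of |00>, |01>, |10>, |11>

for n in (2, 3, 4, 20):
    print(f"{n} qubits -> {2**n} amplitudes")     # exponential growth
```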