In 2019, researchers at Tohoku University in Japan and Purdue University in the United States conducted a proof-of-concept experiment that used “probabilistic bits” instead of qubits to attack integer factorization, a problem that is hard for classical computers. The research, published in the journal Nature, demonstrated the feasibility of a device called a “probabilistic computer.” The authors say this machine can solve some problems usually thought to require a quantum computer, yet its construction requirements are far less demanding (it operates at room temperature), so it may be easier to build; they call the probabilistic bit the “poor man’s qubit.” Recently, the authors published a short article in IEEE Spectrum introducing the basic principles of such a computer in plain language.

In recent years, as Moore’s Law has run out of steam, high hopes have been placed on quantum computers. Many commentators have pointed out that if engineers can build practical quantum computers, the way humans compute will shift fundamentally.

However, there is an important “if” to this assertion.

In theory, quantum computers are promising, but building a practical one requires overcoming enormous hurdles. Some skeptics argue that the technical difficulties mean a universal quantum computer may not be buildable in the foreseeable future. Others are more optimistic, predicting that the vision will be realized in only 5-10 years; many of them work at technology giants such as Google, IBM, and Intel that are building quantum computers.

But even if the quantum computing industry moves far more slowly than its proponents expect, one thing seems certain: quantum computing has inspired a deeper understanding of the role probability plays in computing systems, just as the late physicist Richard Feynman anticipated when he revived the idea in the 1980s.

It was this understanding that we sought when we started our work on probabilistic bits (p-bits) in 2012. The name “probabilistic bit” is modeled on the quantum bit (qubit); Feynman had viewed the probabilistic computer as a counterpoint to the quantum computer he envisioned. So we asked ourselves: how could we build one?

A magnet with two directions of magnetization can store one bit; early computers used this principle in magnetic-core memory. However, such magnets are hard to miniaturize, because the smaller a magnet is, the more unstable its magnetization becomes.

In our 2019 Nature paper, we turned this instability, normally a defect, to our advantage, implementing p-bits with small, unstable magnets. With the help of researchers at Tohoku University in Japan, we built a probabilistic computer with eight p-bits.

Paper link:

https://www.nature.com/articles/s41586-019-1557-9

Magnet-based p-bits are not strictly necessary for building a probabilistic computer. In fact, we had earlier built one that realized p-bits with complex electronic circuits generating pseudo-random sequences from deterministic bits, and companies such as Fujitsu have started selling similar probabilistic computers. But with unstable magnets as the basic building block, a p-bit can be implemented with a few transistors instead of thousands, which makes large probabilistic computers feasible.

In such a computer, a system of p-bits evolves from an initial state to a final state and passes through one of many possible intermediate states. Which path the computer takes is entirely a matter of chance, and each path has a certain probability. Add up the probabilities of all possible paths and you get the total probability of reaching a given end state.

A quantum computer does something similar, but it uses qubits instead of probability bits. This means that each path here has what physicists call a probability amplitude, which can be negative. More precisely, it is a complex number with both real and imaginary parts.

In a quantum computer, to determine the overall probability of getting from some initial state to a final state, you first add the amplitudes of all possible paths to obtain the amplitude of the final state. This final amplitude is also a complex number; squaring its magnitude then gives the actual probability, a number between 0 and 1.

In a nutshell, this is the key difference between probabilistic and quantum computers. The former adds up all probabilities, the latter adds up the complex probability amplitudes.

This difference matters a great deal. A probability is a positive number no greater than 1, so adding an extra path can only increase the final probability. A probability amplitude, however, is complex, which means an extra path may cancel an existing one. It’s as if some path had a negative probability.
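The contrast between summing probabilities and summing amplitudes can be made concrete in a few lines of Python. The path values below are made up purely for illustration:

```python
# Probabilistic computer: each path contributes a real probability in [0, 1].
p_paths = [0.25, 0.25]
print(sum(p_paths))             # 0.5 -- adding a path only increases the total

# Quantum computer: each path contributes a complex probability amplitude.
# Opposite-phase amplitudes cancel, as if one path had negative probability.
a_paths = [0.5 + 0.0j, -0.5 + 0.0j]
print(abs(sum(a_paths)) ** 2)   # 0.0 -- the paths interfere destructively

# Same-phase amplitudes reinforce instead:
a_paths = [0.5 + 0.0j, 0.5 + 0.0j]
print(abs(sum(a_paths)) ** 2)   # 1.0
```

The magnitude-squared step is what converts a final amplitude back into an ordinary probability between 0 and 1.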

The power of quantum computing comes directly from this ability to make probabilities negative. Well-known algorithms such as Shor’s for integer factorization and Grover’s for database search carefully orchestrate the available intermediate paths to ensure that those leading to wrong outputs cancel out while those leading to the correct answer add up.

However, the realization of this power comes at a price. The qubits that carry these complex amplitudes must be carefully protected from environmental influences. This usually requires extremely low temperatures. By contrast, probabilistic computers can be created at room temperature using simpler techniques. But such calculations lack the magic of negative probabilities, and thus only work for algorithms that don’t require path cancellation.

Simulating a quantum computer with probabilistic bits is theoretically possible, but it’s not a practical strategy. Nonetheless, probabilistic computers can provide significant speedups on many important problems compared to deterministic computers, which is why we are so interested in building such computers.

How does a probabilistic computer work? Its principles differ markedly from those of the digital systems we use every day, and even most computer engineering students know little about them, so we decided to explain the topic as a conversation.

The characters’ names are taken from Galileo’s Dialogue Concerning the Two Chief World Systems, a cleverly written book that unfolds as a conversation among three characters: Simplicio (a geocentric Aristotelian), Salviati (a heliocentric Copernican), and Sagredo (an erudite observer neutral in the debate). In that dialogue, Salviati systematically refutes Simplicio’s claims and arrives at Galileo’s conclusion that the earth revolves around the sun. Sagredo ultimately agrees that the wise Salviati (Galileo’s stand-in in the book) is right and Aristotle wrong, and the three retire to enjoy a meal and wine.

The dialogue in this article also unfolds among these three, with the characters’ roles slightly changed. Salviati conveys the authors’ knowledge and opinions; Sagredo can be seen as the reader (you); Simplicio is just a passerby. The three meet on a plane.

Here is the dialogue part:

**What is a “probabilistic computer”**

Sagredo: I see you are reading IEEE magazine, are you an electrical engineer?

Salviati: Yeah, I study computing.

Sagredo: Have you been up to anything interesting lately?

Salviati: My colleagues and I are working on a new computational approach. You know, all our electronic devices, such as cell phones, are built from circuits in which each input has a corresponding output: feed in 5 and 6, and the device gives their product, 30. But we’ve built a circuit that works in reverse: given the number 30, the device gives you all the input combinations, such as 5 and 6, 15 and 2, 10 and 3, and 30 and 1.

Sagredo: Sounds interesting. But what is this for?

Salviati: It has many uses, because many problems that are easy in one direction are hard in reverse; multiplication, for example, is much easier than factoring. A schoolchild can quickly work out that 771 × 85 equals 65,535, but breaking 65,535 back into 771 × 85 is not so easy, and finding the other combinations (such as 257 × 255) is harder still.
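This asymmetry can be checked directly: multiplying is a single operation, while naive factoring by trial division must try many candidate divisors. A minimal sketch:

```python
def factor_pairs(n):
    """Find all factor pairs of n by trial division -- the 'reverse'
    direction takes many steps, unlike a single forward multiply."""
    pairs = []
    d = 1
    while d * d <= n:
        if n % d == 0:
            pairs.append((d, n // d))
        d += 1
    return pairs

print(771 * 85)              # forward: one multiplication -> 65535
print(factor_pairs(65535))   # reverse: trial division up to sqrt(n),
                             # yielding (85, 771), (255, 257), and more
```

Real factoring algorithms are far more sophisticated, but the gap between the two directions is the essential point.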

Sagredo: I see. But I heard that computers can beat Go masters now, so it’s not difficult to solve this kind of problem, right?

Salviati: Yes, today’s digital computers can beat Go masters, but it is less well known that they consume around 10 megawatts of electricity to do so, while a human Go master needs only 10 to 20 watts. There’s a lot of interest in reducing the energy consumption of complex computations, and we think the “reverse computation” we’re working on can help achieve that.

Sagredo: It should be difficult for you to explain your design philosophy to a novice like me, right?

Salviati: It does take a little more time, I need to draw a few pictures. (Salviati sees an unused napkin next to him.) Excuse me, can I use your napkin?

Simplicio: No problem.

Salviati: (Salviati puts down the tray table and starts drawing.) You see, in a digital computer everything is represented by bits (0s and 1s), which in turn can be represented by physical entities with two states, such as magnets.

Engineers build complex circuits to perform specific operations. For example, we can construct a circuit to do one-bit binary multiplication: the output bit (call it C) is either 0 or 1, equal to the product of the input bits A and B.

Sagredo: How is this different from your reverse circuit?

Salviati: We build circuits with p-bits, which are neither 0 nor 1, but fluctuate rapidly between the two, being 0 half the time and 1 half the time.

Sagredo: So what’s the use of that? These bits carry no information at all.

Salviati: Yes, but these bits are useful if we make them communicate with each other. You see, if they don’t communicate with each other, they fluctuate between 0 and 1 independently. We can draw a histogram like this to represent the probabilities of all combinations of A, B, and C. Each of the eight possibilities is equally likely.

Salviati: Now suppose A, B, and C can communicate with each other and like to listen to and imitate one another. Then if A becomes 1, B and C follow and become 1; if A becomes 0, B and C follow and become 0. Drawing the histogram again, we see that only two peaks remain.

At this point, our little p-bit magnets are still fluctuating, but they now fluctuate in lockstep.
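Salviati’s two histograms can be sketched in a few lines of Python. This is a minimal sketch: the “perfect imitation” coupling below, where B and C simply copy A, is an idealization of strong positive communication, not the actual magnet physics:

```python
import random
from collections import Counter

random.seed(0)

def sample(coupled, n=10000):
    """Histogram of (A, B, C) over n observations of three p-bits."""
    counts = Counter()
    for _ in range(n):
        a = random.randint(0, 1)          # A always flips freely
        if coupled:
            # B and C "listen to and imitate" A (idealized coupling)
            b, c = a, a
        else:
            b = random.randint(0, 1)      # all three flip independently
            c = random.randint(0, 1)
        counts[(a, b, c)] += 1
    return counts

print(sample(coupled=False))  # all eight combinations, roughly equal
print(sample(coupled=True))   # only two peaks: (0,0,0) and (1,1,1)
```

The uncoupled run reproduces the flat eight-bar histogram; the coupled run collapses it to the two peaks of one big, correlated “magnet.”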

Sagredo: It’s like you have a big magnet that fluctuates between 0 and 1, but that big magnet doesn’t seem to be doing much.

Salviati: Exactly. With strong enough positive communication we effectively get one big magnet. For this to be useful, we must design the communication between the p-bits cleverly so that exactly the desired set of peaks appears.

As an example, if we want to implement a 1-bit multiplier, we only need 4 of the 8 peaks to appear. For {AB→C}, what we want to see is: {00→0}, {01→0}, {10→0}, {11→1}.

If this can be achieved by carefully designing the communication between the p-bits, we have the reverse circuit described earlier.

Sagredo: How is this done?

Salviati: Let the three magnets move freely among the four possibilities: {00→0}, {01→0}, {10→0}, {11→1}.

But if we forcibly lock the A and B magnets to 0, only one choice remains: {00→0}; that is, C can only be 0.

Sagredo: It’s like a multiplier operating in forward mode: 0 × 0 = 0, right?

Salviati: Yes. If we want to run in reverse mode, we can instead lock C to 0. The system then fluctuates among three options: {00→0}, {01→0}, {10→0}. This is the inverse multiplier: given an output of 0, the system tells us there are three possible inputs, 0 × 0, 0 × 1, and 1 × 0.
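The free-running, forward, and reverse modes Salviati describes can be sketched as Gibbs sampling over three coupled p-bits. The couplings below are hand-tuned illustrative values, not taken from the paper; they are chosen so that the four most probable states are exactly the AND truth table {00→0, 01→0, 10→0, 11→1}, and the update rule is the standard Boltzmann-machine form used in the p-bit literature:

```python
import math
import random
from collections import Counter

random.seed(1)

# Bipolar p-bits s in {-1, +1}; the 0/1 of the dialogue map to -1/+1.
# Illustrative biases h and couplings J whose four most probable
# states realize C = A AND B.
h = {"A": 1.0, "B": 1.0, "C": -2.0}
J = {("A", "B"): -1.0, ("A", "C"): 2.0, ("B", "C"): 2.0}
beta = 1.5  # inverse "temperature": higher -> sharper peaks

def coupling(i, j):
    return J.get((i, j)) or J.get((j, i)) or 0.0

def gibbs_sample(clamp, sweeps=20000):
    s = {i: random.choice([-1, 1]) for i in "ABC"}
    s.update(clamp)                     # lock the chosen p-bits
    counts = Counter()
    for _ in range(sweeps):
        for i in "ABC":
            if i in clamp:
                continue                # clamped p-bits don't fluctuate
            inp = h[i] + sum(coupling(i, j) * s[j] for j in "ABC" if j != i)
            # p-bit update: becomes +1 with probability sigmoid(2*beta*input)
            s[i] = 1 if random.random() < 1 / (1 + math.exp(-2 * beta * inp)) else -1
        counts[tuple(max(0, s[i]) for i in "ABC")] += 1
    return counts

# Free-running: four peaks, the AND truth table
print(gibbs_sample(clamp={}).most_common(4))
# Forward mode: lock A = B = 0 -> C settles to 0
print(gibbs_sample(clamp={"A": -1, "B": -1}).most_common(1))
# Reverse mode: lock C = 0 -> the inputs fluctuate over 00, 01, 10
print(gibbs_sample(clamp={"C": -1}).most_common(3))
```

Clamping is exactly the “locking” of the dialogue: held p-bits are simply skipped in the update loop, and the rest of the network fluctuates only over configurations consistent with them.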

Sagredo: I see. But how do you engineer this magical communication between your p-bits? In other words, how do you know what kind of communication to design?

Salviati: There are well-established methods for determining what kind of communication is needed to create a desired set of peaks.

Sagredo: You’re evasive. Based on what you said earlier, I thought it was your own idea, which is why I was so excited.

Salviati: Actually, that part is well known, at least for some applications. Some companies are building probabilistic computers that use ordinary hardware and random number generators to simulate the p-bit fluctuations I just described. But doing so wastes a lot of energy and would quickly drain a laptop’s battery. Our circuit does the same thing with just three transistors and a special hardware element whose intrinsic physics generates the random numbers.

Sagredo: Can you tell us about this particular component?

Salviati: We used something called a magnetic tunnel junction to build a neat device that lets p-bits communicate very easily. Its output, V_out, fluctuates: if the input V_in is 0, V_out is 1 half the time and 0 half the time. If V_in is positive, V_out is more likely to be 0; if V_in is negative, V_out is more likely to be 1. Holding V_in strongly positive or negative “locks” the output to one state.

This is how each p-bit listens to other p-bits, via the input voltage V_in, and “speaks” via the output voltage V_out. For example, p-bit A can talk to p-bit B by feeding A’s output into B’s input. We built our reverse circuit from such devices. So far we haven’t done anything earth-shattering: these are proofs of concept. But we have shown that such devices can be built with existing advanced technology, and one day the same technology could be used to build huge circuits that solve real-world problems.

Sagredo: What are the real-world problems?

Salviati: Such as optimization problems, where you need to find a configuration that minimizes some cost function.

People solve optimization problems every day, like finding the best order to deliver a set of packages so that the courier travels the shortest distance. Such problems can be mapped onto the basic architecture we use; each problem requires a specific pattern of connections. Once we work out the pattern and wire the p-bits accordingly, the circuit gives the answer in the form of peaks over configurations.
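As a toy illustration of mapping an optimization problem onto fluctuating p-bits, the sketch below anneals a tiny Ising-style cost function by gradually “freezing” the fluctuations. The couplings and schedule are illustrative assumptions, not from the paper:

```python
import math
import random

random.seed(3)

# Toy problem: find spins s_i in {-1, +1} minimizing
# E(s) = sum of J_ij * s_i * s_j. With these (illustrative)
# antiferromagnetic chain couplings, the minimum is an
# alternating pattern with energy -3.
J = {(0, 1): 1.0, (1, 2): 1.0, (2, 3): 1.0}

def energy(s):
    return sum(w * s[i] * s[j] for (i, j), w in J.items())

def neighbors(i):
    for (a, b), w in J.items():
        if a == i:
            yield b, w
        elif b == i:
            yield a, w

def anneal(n=4, sweeps=2000):
    s = [random.choice([-1, 1]) for _ in range(n)]
    for t in range(sweeps):
        beta = 0.1 + 3.0 * t / sweeps   # slowly reduce the fluctuations
        for i in range(n):
            inp = -sum(w * s[j] for j, w in neighbors(i))  # net input to p-bit i
            s[i] = 1 if random.random() < 1 / (1 + math.exp(-2 * beta * inp)) else -1
    return s

s = anneal()
print(s, energy(s))   # with high probability an alternating pattern, energy -3.0
```

Early on, high “temperature” (low beta) lets the p-bits explore configurations; as beta rises, the network settles into a peak at a low-cost configuration, which is the answer read out from the hardware.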

Sagredo: Okay, you sparked my interest in this system. But we’re about to land, is there any other way I can learn about your research?

Salviati: We recently published an article on how to build a p-bit computer that can calculate the degree of genetic relatedness between relatives, you can take a look.

Article link:

https://spectrum.ieee.org/computing/hardware/waiting-for-quantum-computing-try-probabilistic-computing

**What new progress has the researchers made in the past two years?**

Since demonstrating hardware for a probabilistic computer in 2019, research teams at institutions including Purdue University have also used existing silicon technology to emulate a probabilistic computer with thousands of p-bits on conventional hardware publicly available through Amazon Web Services (related link:

https://ieeexplore.ieee.org/abstract/document/9173656).

In addition, the researchers have published several papers on integrating the individual hardware components, aiming to scale to larger systems while ensuring energy efficiency from the start (related link:

https://onlinelibrary.wiley.com/doi/abs/10.1002/adma.201906021).

“The jury is still out on the best implementation of the p-bit. But we’ve shown what works, and finding the best implementation is only a matter of time,” said a Purdue professor of electrical and computer engineering.

Probabilistic computing research at Purdue is part of a project called Purdue-P. The name might suggest that these researchers are the only ones working on probabilistic computing, but groups elsewhere in the world are pursuing similar techniques with different materials and paradigms.

Kerem Camsari, a former postdoctoral researcher at Purdue University, said, “As a field, we look at computing problems that we can’t solve on our own. At the same time, we are also asking: we have digital computing and quantum computing, so what else is there? Actually, at a higher level, many things can be called ‘probabilistic computing’.”
