
Standard quantum limit. Physicist: squeezed light will help LIGO overcome the quantum limit

A stationary light flux incident on a photodiode generates pairs of charge carriers as independent random events. Such a photon-conversion process is called a Poisson process. If over a period of time an optical energy E equal to the average falls on the photodiode, then it should be expected that, on average, N charge-carrier pairs will be created, where

N = ηE / (hf).

Here, as before, η is the quantum efficiency of the interaction and hf is the photon energy. Due to the statistical nature of the interaction of photons with the photoconductor, the actual number of charge-carrier pairs generated by each optical pulse varies around the average value N. The probability that exactly k charge-carrier pairs are created is given by the Poisson probability distribution

P(k) = (N^k / k!) · e^(−N).

In this case the mean-square deviation from the average (the variance) of the number of pairs is also equal to N.

In an ideal communication system this randomness in the number of generated charge-carrier pairs is the only source of noise. In such a system, optical energy is received and charge carriers are generated only when a 1 is transmitted. If the receiver is sensitive enough to detect a single electron-hole pair produced by the light, the decision threshold can be set at this level, and there is no error in transmitting a 0, because no energy is received and no signal is generated. An error is recorded only when the optical energy corresponding to a 1 incident on the photodetector generates no charge carriers at all instead of the expected number N. Recall that 0 and 1 are transmitted with the same probability [see (15.1.3)].

Using the Poisson distribution, we find that the error probability is

P_e = (1/2)·P(0) = (1/2)·e^(−N).

To obtain an error probability P_e ≤ 10^(−9) it is necessary to require (1/2)·e^(−N) ≤ 10^(−9); therefore

N ≥ ln(5·10^8) ≈ 20.

In this case the minimum average power at the input of the photodetector is

P_min = N·hf·B/2,

where B is the bit rate (the factor 1/2 appears because zeros carry no energy).

The value found characterizes the absolute quantum limit of detectability. Comparison of these figures with the values obtained in practice, mentioned earlier, shows that amplifier noise in practical communication systems degrades their sensitivity, so that the required level of received power is almost two orders of magnitude higher than this quantum limit. It is probably more convenient to express the result as the average received energy per transmitted bit: if 0 and 1 are equally likely, then at the quantum limit of detection there are on average about 10 received photons per bit.
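For readers who want to check the arithmetic, here is a minimal numeric sketch of the argument above (my illustration; it assumes the ideal on-off system just described, with an error only when a transmitted 1 yields zero carrier pairs, and a target error probability of 10^(−9)):

```python
import math

# Ideal on-off receiver: an error occurs only when a transmitted "1"
# produces zero electron-hole pairs. With 0 and 1 equally probable,
# P_error = 0.5 * P(0) = 0.5 * exp(-N) for a Poisson count with mean N.
target_error = 1e-9                    # assumed required error probability
N = math.log(0.5 / target_error)       # smallest admissible mean count per "1"

print(f"carrier pairs per '1' pulse: N >= {N:.1f}")               # about 20
print(f"average photons per bit (zeros carry none): {N/2:.1f}")   # about 10
```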

One of the main characteristics of an optical radiation receiver is its sensitivity, i.e. the minimum detectable power of the optical signal at which the specified values of the signal-to-noise ratio or error probability are still achieved.

Let us define the minimum detectable power (MDP) of the optical signal, corresponding to the minimum sensitivity threshold of the photodetector in the absence of noise and distortion, i.e. under conditions of ideal reception.

The symbol "1" corresponds to the transmission of an optical study pulse with a duration τ , whose energy at the receiver input is equal to E in, symbol "0" - zero value of optical energy. When the photodetector is irradiated with an optical energy flux E in generated electron-hole pairs - charge carriers. This is an independent random process for which the average number of emerging pairs of charge carriers is determined by the formula

The number of generated charge-carrier pairs obeys the Poisson probability distribution, i.e.

P(k) = (N^k / k!) · e^(−N). (20.7)

Let us assume that the generation of even one pair of carriers is enough to register an optical radiation pulse, i.e. to receive a "1". Under this assumption an error means that not a single pair of charge carriers appears. The probability of such an event can be found from formula (20.7) by setting k = 0. Then

P_err = P(0) = e^(−N). (20.8)

If we set P_err = P(0) = 10^(−9), we obtain N ≈ 21. This means that the energy received in the optical pulse must equal the energy of 21 photons; i.e. to ensure an error probability no worse than 10^(−9), it follows from (20.6)–(20.8) that E_in ≥ 21·hf/η.

This is the minimum allowable receiver sensitivity for ideal reception, and the requirement of 21 photons for each received pulse at P_err = 10^(−9) is a fundamental limit inherent in any physically realizable photodetector; it is called the quantum limit of detection.

It corresponds to the minimum average power of an optical signal of duration τ = 1/B, where B is the information transmission rate:

P_min = E_in/τ = 21·hf·B/η, (20.9)

which is called the minimum detectable power.

From (20.3), taking into account (20.9), it follows that the MDP of the optical signal satisfies

(20.10)

Inequality (20.10) determines, all other things being equal, the minimum sensitivity threshold, or MDP, of a photodetector.
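As a rough numerical illustration of how (20.9) translates into a power level, here is a short sketch; the bit rate, wavelength and unit quantum efficiency are assumed example values, not taken from the text:

```python
h = 6.626e-34       # Planck constant, J*s
c = 2.998e8         # speed of light, m/s
N = 21              # photons per "1" pulse for P_err = 1e-9, from (20.8)
eta = 1.0           # quantum efficiency (ideal photodetector assumed)
B = 1e9             # bit rate, bit/s (assumed example value)
lam = 1.55e-6       # wavelength, m (assumed example value)

hf = h * c / lam                 # photon energy
P_min = N * hf * B / eta         # power during a "1" pulse, as in (20.9)

print(f"photon energy hf = {hf:.3e} J")
print(f"minimum detectable power ~ {P_min * 1e9:.2f} nW")   # about 2.7 nW
```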

In addition to the quantum limit of detection there are other factors limiting the MDP: thermal noise, dark-current and shot noise, and intersymbol interference. The fundamental difference of these factors is that their influence can be minimized by making the equipment more sophisticated and by using appropriate transmission and reception methods.



Control questions

1. Interference affecting the optical signal.

2. OLT and factors influencing its structure.

3. Digital repeater (diagram and principle of operation).

4. Digital regenerator (scheme and principle of operation).

5. Functions of digital repeaters and their classification.

6. Types of analog OLT repeaters.

7. AOLT repeaters of the first type.

8. AOLT repeaters of the second and third types.

9. Main noise sources of a POM with an LD.

10. Main noise sources of a POM with an LED.

11. Methods for reducing noise in a POM with an LD.

12. OLT noise sources.

13. Calculation of the regenerator error probability and noise margin.

14. Minimum detectable power; the quantum detection limit of a photodetector.

Caution: quantum mechanics below!

The standard quantum limit (SQL) is a concept from quantum mechanics. This is the name for the limitation on the accuracy of measurements that are carried out repeatedly or over a long time. A good example, which also suits our case, is measuring the distance to a certain mass with the highest possible accuracy. A laser beam is used for the measurement. Knowing the wavelength of the laser and the initial phase of the wave, and measuring the phase of the returned beam, we can calculate the exact distance it has traveled. Unfortunately, the pressure of the beam on the body causes perturbations in it at the quantum level (quantum shot fluctuations). The more precisely the coordinate must be measured, the more powerful the laser beam needed, and the greater these fluctuations become. Such quantum noise creates a measurement error.

In fact, the SQL is a consequence of a fundamental prohibition of quantum physics - the Heisenberg uncertainty principle. The uncertainty principle states that when two complementary quantities are measured simultaneously, the product of the errors cannot be less than a certain constant. Roughly speaking, the more accurately we measure the velocity of a quantum particle, the less accurately we can determine its position, and vice versa. It is important to note that the limitations on measurement accuracy imposed by the SQL are more severe than those of the Heisenberg uncertainty principle. Bypassing the latter is impossible in principle without destroying the foundations of all quantum mechanics.
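In the usual textbook notation (the article itself gives no formulas), the uncertainty relation and the standard quantum limit for repeatedly monitoring the position of a free mass m over a time τ are commonly written, up to factors of order unity, as

$$\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}, \qquad \Delta x_{\mathrm{SQL}} \;\sim\; \sqrt{\frac{\hbar \tau}{m}}.$$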

A way to circumvent the limitation of the standard quantum limit was proposed at the American gravitational-wave detector LIGO. The search for gravitational waves is one of the most important tasks of modern physics, but so far it has not been possible to register them because of the insufficient sensitivity of existing equipment. The LIGO installation is conceptually very simple. It consists of two vacuum tunnels meeting at a right angle. Laser beams pass along the pipes, and mirrors are installed at their far ends. It is the distance to these mirrors that is measured by the laser, as described above. Of particular importance is the recombination of the laser beams returning from the mirrors: they interfere, and because of this the beams either reinforce or weaken each other. The degree of interference depends on the phase of the beams, and hence on the path they have traveled. In theory such a device should register the change in the distances between the mirrors as a gravitational wave passes through the installation, but in practice the accuracy of the interferometer is still too low.

About a quarter of a century ago it was proposed to use so-called squeezed states of light to get around the SQL. They were first obtained in 1985, but only recently was the idea put into practice. Most light sources, including lasers, are not capable of producing such radiation; however, with the help of special crystals physicists have learned to obtain light in a squeezed state. A laser beam passing through such a crystal undergoes spontaneous parametric scattering. In other words, some photons turn from a single quantum into a pair of entangled particles.

Scientists have demonstrated that the use of quantum-correlated photons can reduce the measurement error below the standard quantum limit. Unfortunately, without special knowledge it is very difficult to understand (and even more so to explain) exactly how this happens, but the behavior of the entangled photons reduces precisely the quantum shot noise mentioned at the beginning.

The researchers emphasize that the changes they made have significantly increased the sensitivity of the gravitational wave detector in the frequency range from 50 to 300 hertz, which is of particular interest to astrophysicists. It is in this range that, according to the theory, waves should be emitted during the merger of massive objects: neutron stars or black holes.

We invite you to watch and study a series of popular-science videos called Beyond the Quantum Limit. These video lessons show how a group of independent researchers decided to become acquainted in more detail with the report on the primordial AllatRa physics, and to check all the information it contains.

The fact is that modern science already has a significant amount of research data on the nature of the world around us. For example, new elementary particles and chemical elements have been discovered, and the discreteness of the absorption and emission of energy has been revealed. Thanks to the results of modern science, we have the opportunity to check the information from the report in more detail.

At the same time, thanks to improved research methods, an ever larger number of unexplained phenomena are being revealed, and unexpected results, facts and anomalies are being found that do not fit into the framework of generally accepted models, theories and hypotheses.

The AllatRa report offers answers to unresolved questions of physics. Does modern science have anything comparable today? Let us see; in any case, it is interesting to understand the essence of the information provided.

Elementary particles and the golden ratio

The authors did a good job and explained the golden ratio in quantum physics very clearly. Quantum physics is an interesting branch of science. The structure of elementary particles and of the Po particles is described in an engaging way, as are the neutron, the electron, the proton and the photon. The information is genuinely interesting, bearing in mind that this is just one of the hypotheses.

Amazing beta decay and electron capture

To date there are a number of scientific theories on the structure and interaction of elementary particles. In this episode of the Beyond the Quantum Limit program, another alternative theory-hypothesis about the nature of elementary particles is considered, and two formulas for nuclear reactions, namely beta decay and electron capture, are tested.

Analysis of formulas for decay and interaction of elementary particles

Golden section and spiral tracks of elementary particles

One of the founders of quantum information theory, Corresponding Member of the Russian Academy of Sciences Alexander Holevo, believes that we may have approached the boundaries of knowledge.

The quantum computer is one of the most discussed topics in science. Unfortunately, so far things have not progressed beyond individual experiments, which are being conducted in many countries of the world, including Russia, although their results are promising.

In parallel, and with much greater success, quantum cryptography systems are being created. Such systems are already at the stage of pilot implementation.

The very idea of the possibility of creating a quantum computer and quantum cryptography systems is based on quantum information theory. One of its founders is Alexander Holevo, a Russian mathematician, Corresponding Member of the Russian Academy of Sciences, and Head of the Department of Probability Theory and Mathematical Statistics at the Steklov Mathematical Institute of the RAS. In 2016 he received the Shannon Award, the most prestigious prize in the field of information theory, given by the Institute of Electrical and Electronics Engineers (IEEE). Back in 1973 Holevo formulated and proved the theorem that now bears his name and formed the basis of quantum cryptography: it sets an upper limit on the amount of information that can be extracted from quantum states.

- You formulated your most famous theorem in 1973. As far as I remember, words like "quantum information theory" were not heard in the public space at that time. Why did it interest you?

Indeed, at that time, and for some time afterwards, it was not heard in the public space, but in the scientific literature of the 1960s and early 1970s publications began to appear devoted to the question of what fundamental restrictions the quantum nature of the information carrier (for example, the laser radiation field) imposes on the transmission of information. The question of fundamental limitations arose not by chance, almost immediately after the creation of the foundations of information theory by Claude Shannon. By the way, 2016 marks the 100th anniversary of his birth, and his famous work on information theory appeared in 1948. Already in the 1950s experts began to think about quantum limitations. One of the first was an article by Dennis Gabor (who received the Nobel Prize for the invention of holography). He posed the following question: what fundamental restrictions does the quantum nature of the electromagnetic field impose on the transmission and reproduction of information? After all, the electromagnetic field is the main carrier of information, whether in the form of light, radio waves or other frequencies.

If there is a communication channel, which is considered as a quantum one, then the Shannon amount of classical information that can be transmitted over such a channel is limited from above by some very specific value

After that, physical works on this topic began to appear. Then it was called not quantum information theory, but Quantum Communication, that is, the quantum theory of message transmission. Of the domestic scientists who were already interested in this issue, I would name Ruslan Leontyevich Stratonovich. He was a prominent specialist in statistical thermodynamics, who also wrote on these topics.

In the late 1960s I defended my Ph.D. thesis on the mathematical statistics of random processes, began to think about what to do next, and came across works on this subject. I saw that this was a huge field of activity if, on the one hand, one approached these problems from the point of view of the mathematical foundations of quantum theory and, on the other hand, used what I knew about mathematical statistics. This synthesis proved very fruitful.

The essence of the theorem I proved in 1973 is as follows: if there is a communication channel which is considered as a quantum one, then the Shannon amount of classical information that can be transmitted through such a channel is bounded from above by a very specific value, which later came to be called the χ-quantity (chi quantity). Essentially, all communication channels are quantum; only in most cases can their "quantumness" be neglected. But if the noise temperature in the channel is very low or the signal is very weak (for example, a signal from a distant star or a gravitational wave), then it becomes necessary to take into account the quantum mechanical errors that arise due to the presence of quantum noise.
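In its standard textbook form (the interview does not write it out), the bound reads as follows: if classical messages x are sent with probabilities p_x encoded in quantum states ρ_x, then for any measurement at the channel output the extracted Shannon information I satisfies

$$ I \;\le\; \chi \;=\; S\Big(\sum_x p_x \rho_x\Big) \;-\; \sum_x p_x S(\rho_x), $$

where S(ρ) = −Tr(ρ log ρ) is the von Neumann entropy.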

- Limited from above, that is, we are talking about the maximum amount of transmitted information?

Yes, about the maximum amount of information. I took up this question because it was essentially a mathematical problem. Physicists had guessed the existence of such an inequality; it was formulated as a conjecture and circulated in this capacity for at least a decade, maybe more. I could not find counterexamples, and the proof did not come easily, so I decided to tackle it. The first step was to formulate the conjecture mathematically, in order to actually prove it as a theorem. After that another couple of years passed, until the insight somehow came to me in the subway. The result is this inequality. And in 1996 I managed to show that this upper bound is achievable in the limit of very long messages, that is, it gives the channel capacity.

It is important that this upper bound on the information does not depend on how the output is measured. This bound has, in particular, found important applications in quantum cryptography. If there is a secret communication channel and some intruder tries to eavesdrop on it (such an intruder is usually called Eve, from the English word eavesdropper), then it is not known exactly how Eve is eavesdropping. But the amount of information that she nevertheless manages to steal is bounded from above by this absolute value, which does not depend on the method of measurement. Knowledge of this value is used to enhance the secrecy of the transmission.

- Information can be understood both from a mathematical and physical point of view. What is the difference?

In the mathematical theory of information it is not the content that matters but the quantity. From this point of view the method of physical realization of the information is indifferent - whether it is an image, music or text. What matters is how much memory this information occupies in digital form, and how it can best be encoded, usually in binary form, because for classical information this is the most convenient digital representation. The amount of such information is measured in binary units - bits. If information is unified in this way, this opens up opportunities for a unified approach that does not depend on the nature of the information carrier, as long as we consider only "classical" carriers.
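As a toy illustration of measuring information in bits (my example, not the interviewee's): the minimal average number of bits per symbol needed to encode a source is its Shannon entropy, which depends only on the symbol probabilities, not on what the symbols mean.

```python
import math

def entropy_bits(probs):
    """Shannon entropy: minimal average bits per symbol for a memoryless source."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A skewed 4-symbol source compresses below 2 bits per symbol;
# a uniform one does not.
print(entropy_bits([0.5, 0.25, 0.125, 0.125]))  # 1.75
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))   # 2.0
```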

A distinctive property of quantum information is the impossibility of its "cloning". In other words, the laws of quantum mechanics forbid "quantum xerox". This, in particular, makes quantum information a suitable medium for transmitting secret data.

However, the transition to quantum carriers - photons, electrons, atoms - opens up fundamentally new possibilities, and this is one of the main messages of quantum information theory. A new kind of information arises - quantum information, whose unit of measurement is the quantum bit, the qubit. In this sense, "information is physical," as Rolf Landauer, one of the founding fathers of quantum information theory, said. A distinctive property of quantum information is the impossibility of "cloning" it. In other words, the laws of quantum mechanics forbid a "quantum Xerox". This, in particular, makes quantum information a suitable medium for transferring secret data.

It must be said that our compatriot Vladimir Aleksandrovich Kotelnikov had his say in information theory before Shannon. Back in 1933 he published the famous sampling theorem in the "Materials for the First All-Union Congress on the Reconstruction of Communications". The significance of this theorem is that it allows continuous information - an analog signal - to be converted into discrete form (samples). In our country work in this area was surrounded by great secrecy, so Kotelnikov's work did not receive the same resonance as Shannon's, and in the West it was generally unknown until a certain moment. But in the late 1990s the Institute of Electrical and Electronics Engineers, the IEEE, awarded Kotelnikov its highest award, the Alexander Graham Bell Medal, and the German Eduard Rhein Foundation gave him its award for fundamental research, namely for the sampling theorem.
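In its standard formulation (not quoted in the interview), the Kotelnikov-Shannon sampling theorem states that a signal x(t) containing no frequencies above W is completely recovered from its samples taken every 1/(2W) seconds:

$$ x(t) \;=\; \sum_{n=-\infty}^{\infty} x\!\left(\frac{n}{2W}\right) \frac{\sin\big(\pi(2Wt - n)\big)}{\pi(2Wt - n)}. $$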

- And for some reason, Kotelnikov was so little remembered even among us ...

His work was classified. In particular, Kotelnikov did a lot in the field of government communications, deep space communications. By the way, Vladimir Alexandrovich was also interested in questions of interpretation of quantum mechanics, he has works on this topic.

Shannon became famous for his 1948 paper on information theory. But his first famous work on the use of the algebra of logic and Boolean functions, that is, functions of binary variables for the analysis and synthesis of electrical circuits (relay, switching circuits), was written back in 1937, when he was a student at the Massachusetts Institute of Technology. It is sometimes called the most outstanding thesis work of the twentieth century.

It was a revolutionary idea which, however, was in the air at the time. And here Shannon had a predecessor, the Soviet physicist Viktor Shestakov. He worked at the Faculty of Physics of Moscow State University and proposed the use of binary, and more generally multi-valued, logic for the analysis and synthesis of electrical circuits as early as 1934. He defended his dissertation then, but did not publish his research immediately, since it was believed that the important thing was to get the result, and publication could wait. In the end he published his work only in 1941, after Shannon.

Interestingly, at that time, in the 1940s and 1950s, it turned out so well: everything that made it possible to develop information theory and ensure its technical implementation appeared almost simultaneously.

Indeed, at the end of the war electronic computers appeared. Then, almost simultaneously with the publication of Shannon's article, the transistor was invented. If not for this discovery, and if technological progress had slowed down in this respect, the ideas of information theory would not have found application for a long time, because it was difficult to implement them on huge cabinets of vacuum tubes that heated up and required a Niagara of water for cooling. Everything came together. We can say that these ideas arose very much on time.



Shannon received a degree in mathematics and at the same time a degree in electrical engineering. He knew as much mathematics as an engineer needs, and at the same time he had an amazing engineering and mathematical intuition. The significance of Shannon's work for mathematics was recognized in the Soviet Union by Andrey Kolmogorov and his school, while some Western mathematicians treated Shannon's work rather arrogantly. They criticized him for not writing rigorously and for certain mathematical flaws, although by and large he had no serious flaws, and his intuition was completely unmistakable. If he claimed something, he usually did not write out the general conditions under which it was true, but a professional mathematician, having worked hard, could always find exact formulations and proofs for which the corresponding result would be rigorous. As a rule, these were very new and deep ideas with global implications. In this respect he has even been compared with Newton and Einstein. Thus the theoretical foundations were laid for the information age that began in the middle of the twentieth century.

- In your works you write about the connection of such properties of the quantum world as "complementarity" and "entanglement" with information. Please explain.

These are two basic, fundamental properties that distinguish the quantum world from the classical one. Complementarity in quantum mechanics consists in the fact that there are aspects of a quantum mechanical phenomenon or object that both pertain to this object but cannot be fixed exactly at the same time. For example, if the position of a quantum particle is sharply defined, then its momentum is blurred, and vice versa. And this concerns not only coordinates and momentum. As Niels Bohr pointed out, complementarity is not only a property of quantum mechanical systems; it manifests itself in biological and social systems as well. In 1961 Bohr's remarkable collection of articles "Atomic Physics and Human Knowledge" was published in Russian translation. It speaks, for example, of the complementarity between reflection and action, where reflection is the analogue of position and action is the analogue of momentum. We know very well that there are people of action and there are people of reflection, and it is difficult to combine the two in one person. There are some fundamental limits that do not allow these properties to be combined. Mathematically, complementarity is expressed in the fact that quantum quantities are described by non-commuting objects, matrices or operators. The result of their multiplication depends on the order of the factors. If we measure first one quantity, then another, and then do it in the opposite order, we will get different results. This is a consequence of complementarity, and nothing like this exists in the classical description of the world, if by this we understand, say, Kolmogorov's probability theory. There, in whatever order the random variables are measured, they will have the same joint distribution. Mathematically, this is a consequence of the fact that random variables are represented not by matrices but by functions, which commute under multiplication.
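A minimal numerical illustration of this non-commutativity (my sketch, using the Pauli matrices that represent spin measurements along the x and z axes):

```python
import numpy as np

# Pauli matrices: observables for spin along x and along z.
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

# The order of the factors matters: the two products differ,
# so the commutator is nonzero (it equals -2i * sigma_y).
print(sigma_x @ sigma_z)                        # [[0, -1], [1, 0]]
print(sigma_z @ sigma_x)                        # [[0, 1], [-1, 0]]
print(sigma_x @ sigma_z - sigma_z @ sigma_x)    # nonzero commutator
```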

Shannon received a degree in mathematics and at the same time a degree in electrical engineering. He knew mathematics as much as an engineer needs, and at the same time he had an amazing engineering and mathematical intuition.

How does this affect information theory?

The most important consequence of complementarity is that if you measure one quantity, you perturb its complement. This works, for example, in quantum cryptography. If there was unauthorized interference in the communication channel, it must necessarily manifest itself. On this principle...

- Is information security built?

Yes, one of the "quantum" ways to protect information is based precisely on the property of complementarity.

The second method uses entanglement. Entanglement is another fundamental property of quantum systems that has no classical analogue. It refers to composite systems. While complementarity manifests itself even for a single system, the entanglement property indicates a connection between the parts of a composite system. These parts can be spatially separated, but if they are in an entangled quantum state, then a somewhat mysterious connection arises between their internal properties - the so-called quantum pseudo-telepathy. By measuring one subsystem you can somehow influence the other, and instantly, but influence it in a very subtle way. The measure of such entanglement is determined by the Einstein-Podolsky-Rosen correlation. It is stronger than any classical correlation, but it does not contradict the theory of relativity, which prohibits the transmission of information at a speed greater than the speed of light. Information cannot be transmitted this way, but the correlation can be captured and used. The second class of cryptographic protocols is based precisely on the creation and use of entanglement between the participants in the protocol.

- If someone interferes, then because of the entanglement, you can find out about it?

If we interfere with one, the other will inevitably feel it.

- Entanglement is probably the transfer of something? Any transmission occurs through something. What is the mechanism of entanglement?

I would not talk about the entanglement mechanism. This is a property of quantum mechanical description. If you accept this description, then entanglement follows from it. How is the interaction usually transmitted? With some particles. In this case, there are no such particles.

But there are experiments that confirm the existence of this property. In the 1960s, Irish physicist John Bell developed an important inequality that allows one to experimentally determine whether quantum entanglement exists at large distances. Such experiments were carried out, and the presence of entanglement was confirmed experimentally.
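A short numerical sketch of what such an experiment tests (my illustration): in the CHSH form of Bell's inequality, any classical (local hidden-variable) model gives a combination of correlations no larger than 2, while the entangled singlet state of two qubits reaches 2√2 ≈ 2.83.

```python
import numpy as np

# Singlet state of two qubits: (|01> - |10>) / sqrt(2).
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def spin(angle):
    """Spin observable along a direction in the x-z plane."""
    return np.array([[np.cos(angle), np.sin(angle)],
                     [np.sin(angle), -np.cos(angle)]], dtype=complex)

def corr(a, b):
    """Correlation <A(a) x B(b)> in the singlet state."""
    op = np.kron(spin(a), spin(b))
    return np.real(psi.conj() @ op @ psi)

# Standard CHSH measurement angles.
a1, a2, b1, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = abs(corr(a1, b1) + corr(a2, b1) + corr(a2, b2) - corr(a1, b2))
print(S)   # 2*sqrt(2) ~ 2.83, above the classical bound of 2
```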

If you want to create a consistent system of axioms for a sufficiently meaningful mathematical theory, then it will always be incomplete in the sense that there is a sentence in it, the truth or falsity of which is unprovable

The phenomenon of entanglement is indeed very counterintuitive. Its quantum mechanical explanation was not accepted by some prominent physicists, such as Einstein, De Broglie, Schrödinger... They did not accept the probabilistic interpretation of quantum mechanics, which is also associated with the phenomenon of entanglement, and believed that there must be some "deeper" theory that would allow describing the results of quantum mechanical experiments, in particular the presence of entanglement "realistically" as, say, classical field theory describes electromagnetic phenomena.

Then it would be possible to harmoniously combine this property with the theory of relativity and even with the general theory of relativity. At present, this is perhaps the most profound problem of theoretical physics: how to reconcile quantum mechanics with the requirements of the general theory of relativity. Quantum field theory agrees with special relativity at the cost of making corrections (renormalizations) such as the subtraction of an "infinite constant". A completely mathematically consistent unified theory still does not exist, and attempts to build it so far run into a dead end. The two fundamental theories that emerged at the beginning of the twentieth century, quantum theory and relativity, have not yet been fully brought together.

- Thinking is also a form of information processing. What is the connection between thinking and information theory?

2015 marked the bicentenary of George Boole, the mathematician who discovered the calculus of functions of binary variables, as well as the algebra of logic. He suggested assigning the value "0" to a false statement and the value "1" to a true statement, and showed that the laws of logic are perfectly described by the corresponding algebra. It must be said that it was his desire to understand the laws of human thinking that gave the impetus for this discovery. As his biographers write, when he was a young man he had a mystical revelation and felt that he should devote himself to uncovering the laws of human thinking. He wrote two important books that were not really in demand at the time; his discoveries found wide application only in the twentieth century.
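A tiny illustration of Boole's idea (my own sketch): over the values {0, 1}, conjunction becomes multiplication, negation becomes subtraction from 1, and the other logical operations follow.

```python
# Boole's algebra of logic over {0, 1}:
# AND is multiplication, NOT is 1 - x, OR follows by De Morgan's law.
def AND(x, y): return x * y
def NOT(x):    return 1 - x
def OR(x, y):  return NOT(AND(NOT(x), NOT(y)))   # equals x + y - x*y

for x in (0, 1):
    for y in (0, 1):
        print(x, y, "AND:", AND(x, y), "OR:", OR(x, y))
```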

- In a certain sense, the algebra of logic, in fact, demonstrates the connection between thinking and mathematics?

You could say so. But if we talk about the connection between thinking and mathematics, then in the twentieth century the most impressive achievement - one that points to deep internal contradictions or paradoxes embedded in the laws of human thinking - was the work of Kurt Gödel, who put an end to David Hilbert's utopian and overly optimistic idea of axiomatizing all of mathematics. From Gödel's results it follows, in particular, that such a goal is in principle unattainable. If you want to create a consistent system of axioms for a sufficiently rich mathematical theory, it will always be incomplete in the sense that there is a sentence in it whose truth or falsity is unprovable. One can see in this a distant parallel with the principle of complementarity in quantum theory, which also speaks of the incompatibility of certain properties: completeness and consistency turn out to be mutually complementary. If we draw this parallel further, we may come to an idea that will perhaps seem seditious to modern science: knowledge has limits. "Humble yourself, proud man," as Fyodor Mikhailovich Dostoevsky said. The electron, of course, is inexhaustible, but knowledge has limits because of the finiteness of the mental apparatus that a person possesses. Yes, we still do not fully know all the possibilities, but already somewhere, in some respects, we are apparently approaching the boundaries. Perhaps that is why the problem of creating a scalable quantum computer is so difficult.

The electron, of course, is inexhaustible, but knowledge has limits due to the finiteness of the mental apparatus that a person possesses. Yes, we still do not fully know all the possibilities, but already somewhere, in some aspects, apparently, we are approaching the borders

- Maybe the point is not just that the possibilities of human thinking are insufficient, but that the world as such is arranged in so internally contradictory a way that it cannot be known?

Only the future can show this. In a sense this is true, and it is clearly seen in the example of public life: how many attempts there have been to build a harmonious society, and although they led to new development - unfortunately, at the cost of great efforts and sacrifices - a harmonious society was never created. This internal contradiction, of course, is present in our world. However, as dialectics teaches, contradictions and the negation of negation are the source of development. Incidentally, a certain dialectic is also present in quantum theory.

Of course, what I am saying now contradicts the existing historical optimism, roughly speaking, that it is possible to construct a “theory of everything” and explain everything.

Ludwig Faddeev, as he said in an interview with me, is a supporter of the point of view that sooner or later such a theory will arise.

This view is probably based on an extrapolation of the ideas of the Age of Enlightenment, which culminated in the unprecedented scientific and technological breakthrough of the twentieth century. But reality constantly confronts us with the fact that science can do a lot, yet is still not omnipotent. The situation in which different fragments of reality are successfully described by different mathematical models, consistent with each other only in limiting regimes, may be inherent in the very nature of things.

- You mentioned the quantum computer. But its idea was born on the basis of quantum information theory ...

The idea of efficient quantum computing was expressed by Yuri Ivanovich Manin in 1980. Richard Feynman wrote an article in 1982 in which he asked: since the simulation of complex quantum systems, such as sufficiently large molecules, takes up more and more space and time on conventional computers, could quantum systems themselves be used to simulate quantum systems?

- Based on the fact that the complexity of a quantum system is adequate to the complexity of the problem?

Approximately so. Then the ideas of quantum cryptography appeared, and the idea of ​​a quantum computer sounded most loudly after Peter Shor proposed an algorithm for factoring a large composite natural number based on the idea of ​​quantum parallelism. Why did it cause such an uproar? The assumption of the complexity of solving such a problem underlies modern public key encryption systems, which are widely used, in particular, on the Internet. Such complexity does not allow, even with a supercomputer, to break the cipher in any foreseeable time. At the same time, Shor's algorithm makes it possible to solve this problem in an acceptable time (on the order of several days). This, as it were, created a potential threat to the entire Internet system and everything that uses such encryption systems. On the other hand, it has been shown that quantum cryptography methods are not hackable even with the help of a quantum computer, that is, they are physically secure.
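A toy sketch (my own, with deliberately tiny numbers) of why fast factoring undermines RSA-style public-key encryption: whoever factors the public modulus immediately recovers the private key. Shor's algorithm would perform the factoring step efficiently for moduli thousands of bits long.

```python
# Toy RSA with a tiny modulus (real keys use thousands of bits).
p, q = 61, 53                 # secret primes
n = p * q                     # public modulus
e = 17                        # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (needs Python 3.8+)

message = 1234
cipher = pow(message, e, n)   # anyone can encrypt with (n, e)

# An attacker who factors n (trivial here, hard classically for large n,
# easy for a quantum computer running Shor's algorithm) rebuilds d:
p2 = next(k for k in range(2, n) if n % k == 0)
q2 = n // p2
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
print(pow(cipher, d2, n))     # 1234 - the plaintext is recovered
```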

Another important discovery was that quantum error-correcting codes could be constructed, just as in classical information theory. Why is digital information stored with such high quality? Because there are codes that correct errors. You can scratch a CD and it will still play the recording correctly, without distortion, thanks to these correcting codes.
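The simplest classical example of the idea is the three-fold repetition code: each bit is sent three times and decoded by majority vote, so any single flipped bit is corrected. Here is a minimal sketch (mine; the codes actually used on CDs, such as Reed-Solomon codes, are far more elaborate):

```python
def encode(bits):
    # Repeat every bit three times.
    return [b for b in bits for _ in range(3)]

def decode(coded):
    # Majority vote over each group of three corrects any single flip.
    return [int(sum(coded[i:i + 3]) >= 2) for i in range(0, len(coded), 3)]

data = [1, 0, 1, 1]
sent = encode(data)
sent[4] ^= 1                  # one bit corrupted, say by a scratch
print(decode(sent) == data)   # True: the error has been corrected
```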

A similar but much more sophisticated construction has been proposed for quantum devices. Moreover, it has been theoretically proven that if the probability of failures does not exceed a certain threshold, then almost any circuit performing a quantum computation can be made fault-tolerant by adding special blocks that correct not only the errors of the computation but also the errors arising in the correcting blocks themselves.

It is possible that the most promising way is to create not a large quantum processor, but a hybrid device in which several qubits interact with a classical computer.

When experimenters began to work on implementing the ideas of quantum informatics, the difficulties in the way of their realization became clear. A quantum computer must consist of a large number of qubits - quantum memory cells - and quantum logic elements that perform operations on them. Our physicist Alexei Ustinov realized a superconducting qubit in 2015. Now there are circuits of dozens of qubits, and Google promises to build a computing device of 50 qubits in 2017. At this stage it is important that physicists are successfully mastering innovative experimental methods that allow them "to measure and purposefully manipulate individual quantum systems" (the Nobel Prize in Physics 2012). Chemists who create molecular machines are moving in the same direction (the Nobel Prize in Chemistry 2016).

The practical implementation of quantum computing and the other ideas of quantum informatics is a promising task, and physicists and experimenters are working hard on it. But until there is a technological breakthrough like the invention of the transistor, there will be no quantum technologies that can be reproduced massively and relatively cheaply, like the production of integrated circuits. If, for the manufacture of a classic personal computer, one could buy parts in a store and solder the electronic circuits in a garage, with a quantum computer that will not work.

It is possible that the most promising way is to create not a large quantum processor, but a hybrid device in which several qubits interact with a classical computer.

Perhaps the human brain is a similar hybrid computer. In the popular book by the English physicist Roger Penrose, "The Emperor's New Mind", the author expresses the opinion that there are biophysical mechanisms in the brain capable of performing quantum computations, although not everyone shares this opinion. The renowned Swiss theorist Klaus Hepp says he cannot imagine a wet and warm brain performing quantum operations. On the other hand, Yuri Manin, who has already been mentioned, admits that the brain is a large classical computer containing a quantum chip responsible for intuition and other creative tasks - and also, probably, for "free will", since in quantum mechanics randomness is inherent in principle, in the very nature of things.

Unlike conventional (secret-key) systems, systems that allow the open part of the key to be transmitted over an insecure communication channel are called public key systems. In such systems the public (encryption) key differs from the private (decryption) key, which is why they are also called asymmetric or two-key systems.
