Understanding Quantum Computing

#Quantum

IEEE North Jersey MTT/AP Chapter Co-Sponsors the MTT-S Technical Webinar


Understanding Quantum Computing – Part 1 (Register now)



  Date and Time

  • Date: 06 Jun 2023
  • Time: 12:00 PM to 01:00 PM
  • All times are (GMT-05:00) US/Eastern
  • Add Event to Calendar

  Location

  • Virtual attendance; access information is available on the event page

  Hosts

  • Contact Event Hosts: Ajay Poddar (akpoddar@ieee.org), Edip Niver (edip.niver@njit.edu), Anisha Apte (anisha_apte@ieee.org)
  • Co-sponsored by IEEE North Jersey Section

  Registration

  • See the registration link above


  Speakers

Dr. Abbas Omar of the Otto-von-Guericke University of Magdeburg

Topic:

Understanding Quantum Computing - Part 1

Classical computers are based on the binary coding of digital data. The binary codes are processed by applying Boolean algebra using digital electrical circuits known as logic gates, which are simple configurations of transistors and other circuit elements (resistors, capacitors, etc.). The operation of the logic gates is governed by Circuit Theory, a simplified form of Classical Electromagnetics.
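
As a minimal illustration (not part of the talk itself), the Boolean-algebra view of logic gates can be sketched in a few lines of Python; the NAND gate alone suffices to build the others:

    # Sketch: a classical bit is 0 or 1; logic gates are Boolean functions on bits.
    # NAND is universal: every other gate can be composed from it.

    def NAND(a: int, b: int) -> int:
        return 1 - (a & b)

    def NOT(a: int) -> int:
        return NAND(a, a)

    def AND(a: int, b: int) -> int:
        return NOT(NAND(a, b))

    def OR(a: int, b: int) -> int:
        return NAND(NOT(a), NOT(b))

    def XOR(a: int, b: int) -> int:
        return AND(OR(a, b), NAND(a, b))

    for a in (0, 1):
        for b in (0, 1):
            print(f"{a} {b} -> AND={AND(a, b)} OR={OR(a, b)} XOR={XOR(a, b)}")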

In the 1960s an observation was made that is nowadays known as "Moore's Law". It predicts a doubling of the number of components (mainly transistors) in an integrated circuit every two years. Extrapolating this trend leads to transistors of molecular scale (state-of-the-art transistors are already only a few atoms across), at which point Circuit Theory is no longer a valid mathematical tool for describing the performance of logic gates. This motivated a number of scientists in the early 1980s to think about replacing these classical-computer building blocks with what are called Quantum Gates. The operation of the latter is fully governed by the laws of Quantum Mechanics. Computers whose building blocks are quantum gates are called Quantum Computers.
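
For a rough sense of the trend, the doubling rule can be written as N(t) = N0 · 2^((t − t0)/2). The short sketch below assumes the Intel 4004 (about 2,300 transistors, 1971) as a starting point; that figure is an illustrative assumption, not a number from the abstract:

    # Moore's Law sketch: component count doubles roughly every two years.
    # The starting point (Intel 4004, ~2,300 transistors, 1971) is an
    # illustrative assumption, not a figure from the talk abstract.

    def moore(year: int, n0: float = 2300.0, year0: int = 1971, period: float = 2.0) -> float:
        """Predicted component count N(t) = N0 * 2**((t - t0) / period)."""
        return n0 * 2.0 ** ((year - year0) / period)

    for year in (1971, 1991, 2011, 2023):
        print(year, f"{moore(year):.2e} transistors")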

As the classical laws (of either mechanics or electromagnetics) are a limiting case of the more comprehensive quantum ones, it has become evident that quantum computers can implement algorithms that are not available on their classical counterparts. This has encouraged scientists to develop such algorithms and research-and-development facilities to build corresponding hardware realizations; prominent examples are the machines developed by IBM and Google. Most approaches used for explaining quantum computing rely on the highly academic concepts of Quantum Mechanics. The terminology is not easily understandable by the majority of interested audiences, who have only a general knowledge of the subject. In some cases this creates the exaggerated perception that this "magic thing" can solve all computational problems much more efficiently and much faster than a classical computer.

In this talk, only graduate-level mathematical tools and terminology are used to explain the concepts underlying quantum computing and the functions of the corresponding hardware, the Quantum Computer. The main differences between classical and quantum mechanics will be concisely reviewed. Quantum dynamic variables, like position and momentum in mechanical systems as well as voltage and current in electrical circuits, are shown to behave as random rather than deterministic signals, whose attributes can be used to carry information. It will also be demonstrated that these random signals can be stored and processed in what are called Qubits, the quantum counterparts of classical Bits. The hardware realization of qubits in the form of superconducting circuits, the Transmons, will be explained. Other realizations of qubits, such as trapped ions and quantum dots, will not be considered.
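
A minimal numerical sketch (our own, not from the talk) of why a qubit reads out as a random rather than deterministic signal: a state alpha|0> + beta|1> yields the classical bit 0 or 1 with probabilities |alpha|^2 and |beta|^2.

    import numpy as np

    # Sketch (illustrative, not from the talk): a qubit state is a normalized
    # complex 2-vector alpha|0> + beta|1>.  Reading it out gives a random bit:
    # 0 with probability |alpha|^2, 1 with probability |beta|^2.

    rng = np.random.default_rng(seed=0)

    state = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)  # equal superposition

    def measure(state: np.ndarray, shots: int = 10_000) -> np.ndarray:
        p1 = abs(state[1]) ** 2                # probability of reading out a 1
        return (rng.random(shots) < p1).astype(int)

    samples = measure(state)
    print("fraction of 1s over 10,000 shots:", samples.mean())  # about 0.5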

Because dynamic systems can never be fully isolated from their surroundings, thermal noise and the quantum dynamic variables, both being random signals, interact with each other. This noise corruption takes a different form than for deterministic signals: it deteriorates an essential statistical attribute of interacting quantum dynamic variables known as Coherency. The latter is a sort of "memory" that enables different quantum dynamic variables to "remember" each other. Using quantum systems for encoding and processing information therefore requires cooling the systems down to very near absolute zero temperature (0 K) in order to reduce the impact of noise on the coherency. Coding and processing errors due to deteriorated coherency may also require error-correction techniques similar to those known from Channel Coding.
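
The cooling requirement can be made concrete with a short back-of-the-envelope calculation; the transition frequency of about 5 GHz assumed below is a typical transmon value chosen for illustration, not a number from the abstract.

    import math
    from scipy.constants import h, k   # Planck and Boltzmann constants

    # Back-of-the-envelope sketch (our numbers, not from the abstract):
    # compare the thermal energy k*T with the energy h*f of one qubit excitation.
    f = 5e9                       # assumed transmon transition frequency, Hz
    E_qubit = h * f               # energy quantum of the qubit

    T_equal = E_qubit / k         # temperature at which k*T equals h*f
    print(f"k*T = h*f at about {T_equal * 1e3:.0f} mK")          # roughly 240 mK

    T_op = 0.015                  # typical dilution-refrigerator base temperature, K
    occupation = math.exp(-E_qubit / (k * T_op))
    print(f"Boltzmann factor exp(-hf/kT) at 15 mK: {occupation:.1e}")  # thermal excitations are rare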

Biography:

Dr. Omar is a Professor Emeritus at the Otto-von-Guericke University of Magdeburg in Germany. He received the B.Sc., M.Sc., and Doktor-Ing. degrees in electrical engineering in 1978, 1982, and 1986, respectively. He has been a professor of electrical engineering since 1990 and was director of the Chair of Microwave and Communication Engineering at the Otto-von-Guericke University of Magdeburg, Germany, from 1998 until his retirement in 2020. He joined the Petroleum Institute in Abu Dhabi as a Distinguished Professor in 2012 and 2013, organizing research activities for the oil and gas industry in that region. In 2014 and 2015 he chaired the Electrical and Computer Engineering Department at the University of Akron, Ohio, USA. Dr. Omar has authored and co-authored more than 480 technical papers extending over a wide spectrum of research areas. His current research and teaching fields cover the health aspects of millimeter-wave radiation, quantum computing, phased arrays and beamforming for massive MIMO, and magnetic resonance imaging. In the past he has also covered other disciplines, including microwave and acoustic imaging, microwave and millimeter-wave material characterization, indoor positioning, subsurface tomography, ground penetrating radar, and field-theoretical modeling of microwave systems and components. Dr. Omar is an IEEE Fellow.

Address: Germany