Quantum Computing

Quantum computers can process massive and complex datasets more efficiently than classical computers. Quantum computing could change the world - but right now, its future remains uncertain.

Quantum computers use the fundamentals of quantum mechanics to speed up the solving of complex computations. Those computations often involve an enormous number of variables, and the potential applications span industries from genomics to finance.

Interested in learning more about quantum computing? Find links to helpful resources on quantum programming, quantum algorithms, and more.

The Agenda:

This article covers Quantum, Quantum Computing, Qubits, Superposition, and Entanglement.

What Is Quantum?

In 1900, physicist Max Planck proposed that at the subatomic level, energy is contained in tiny discrete packets called quanta, which behave as both waves and particles depending on their environment. Quantum theory builds on the observation that at any point in time, these particles could be in any of several states and may change their state.

What Is Quantum Computing?

Quantum computing is the use of quantum-mechanical phenomena such as superposition and entanglement to perform computation. Computers that perform quantum computations are known as quantum computers. Quantum computers are believed to be able to solve certain computational problems, such as integer factorization (which underlies RSA encryption), substantially faster than classical computers. The study of quantum computing is a subfield of quantum information science.

This new generation of computers uses knowledge of quantum mechanics - the area of physics that studies atomic and subatomic particles - to overcome the limitations of classical computing. Although in practice quantum computing still faces serious problems with scalability and decoherence, it makes it possible to perform many operations simultaneously and sidesteps the tunneling effect that limits transistor miniaturization at the nanometre scale.

What Is Qubit?

The classical computers we use today work on chips that process all data using bits, each of which takes one of two values - 0 or 1. Even the most complex data or algorithm you input gets broken down into sequences of these 0s and 1s.
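As a minimal illustration (not from the article), here is how a single character is ultimately just a string of 0s and 1s to a classical machine:

```python
# Every piece of classical data reduces to bits. Here the character 'Q'
# (ASCII code 81) is shown as its 8-bit binary representation.
bits = format(ord("Q"), "08b")
print(bits)  # 01010001
```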

Quantum computing, on the other hand, uses the unit 'qubits', short for quantum bits. Physically, a qubit can be realized by a quantum particle such as an electron or a proton in an atom.

Unlike a classical bit, a qubit can be in a combination of both 0 and 1 at the same time.

In a classical (or conventional) computer, information is stored as bits; in a quantum computer, it is stored as qubits (quantum bits).

Classical bits: a register of n bits holds exactly one of its 2^n possible values at any moment.

Qubits: n qubits can exist in a superposition of all 2^n basis states at once - for n = 10, that is 2¹⁰ = 1024 states.
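The exponential scaling above can be checked directly. This sketch (an illustration, not from the article) counts the states involved for n = 10:

```python
# A register of n classical bits holds exactly ONE of 2**n possible
# values at any moment; describing the state of n qubits requires
# 2**n complex amplitudes, all carried simultaneously.
n = 10

possible_values = 2 ** n   # values an n-bit register can take (one at a time)
amplitudes = 2 ** n        # amplitudes in an n-qubit superposition

print(possible_values)  # 1024
print(amplitudes)       # 1024
```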


Superposition and entanglement are the key principles of quantum mechanics.

What Is Quantum Superposition?

These quantum particles, or qubits, may exist as both 0 and 1 at the same time - a phenomenon known as superposition. Essentially, this means that a particle can exist in multiple quantum states at once, but when we observe it, i.e. try to measure its state, the superposition collapses and the particle settles into a single definite state.
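The idea can be sketched in a few lines of plain Python (a simplified simulation, not from the article): a qubit is modeled as two amplitudes, one for |0⟩ and one for |1⟩, and a Hadamard gate turns |0⟩ into an equal superposition.

```python
import math

# Model one qubit as a pair of amplitudes (alpha for |0>, beta for |1>).
# The Hadamard gate maps |0> to an equal superposition of |0> and |1>.
def hadamard(alpha, beta):
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

# Start in the definite state |0>: amplitude 1 for |0>, 0 for |1>.
alpha, beta = hadamard(1.0, 0.0)

# Born rule: the probability of each measurement outcome is |amplitude|**2.
p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
print(round(p0, 3), round(p1, 3))  # 0.5 0.5 -- equal chance of reading 0 or 1
```

Measuring such a qubit yields 0 or 1 with equal probability, and after the measurement the superposition is gone.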


What Is Quantum Entanglement?

Entanglement occurs when two qubits in superposition become correlated with one another, i.e. the state of one depends on the state of the other.

Researchers can generate pairs of qubits that are “entangled,” which means the two members of a pair exist in a single quantum state. Changing the state of one of the qubits will instantaneously change the state of the other one in a predictable way. This happens even if they are separated by very long distances.
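Extending the same state-vector simulation idea (again an illustration, not from the article), the standard recipe for an entangled pair - a Hadamard followed by a CNOT, producing the Bell state (|00⟩ + |11⟩)/√2 - shows the correlation directly:

```python
import math

# Two-qubit state vector: amplitudes for |00>, |01>, |10>, |11>,
# indexed as 2*q0 + q1.
s = 1 / math.sqrt(2)

# Hadamard on qubit 0 takes |00> to (|00> + |10>)/sqrt(2).
state = [s, 0.0, s, 0.0]

# CNOT (control = qubit 0, target = qubit 1) swaps |10> <-> |11>,
# yielding the Bell state (|00> + |11>)/sqrt(2).
state[2], state[3] = state[3], state[2]

probs = [round(abs(a) ** 2, 3) for a in state]
print(probs)  # [0.5, 0.0, 0.0, 0.5] -- the qubits always agree:
              # both measure 0 or both measure 1, never a mix.
```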

Nobody really knows quite how or why entanglement works. It even baffled Einstein, who famously described it as “spooky action at a distance.” But it’s key to the power of quantum computers. In a conventional computer, doubling the number of bits doubles its processing power. But thanks to entanglement, adding extra qubits to a quantum machine produces an exponential increase in its number-crunching ability.

Quantum computers harness entangled qubits in a kind of quantum daisy chain to work their magic. The machines’ ability to speed up calculations using specially designed quantum algorithms is why there’s so much buzz about their potential.

That’s the good news. (Video: https://www.youtube.com/watch?v=tafGL02EUOA)

Quantum Computing Companies

Several companies have released quantum computing frameworks for developers:

1. IBM - Qiskit

2. Microsoft - Quantum Development Kit

3. D-Wave - Ocean

4. Xanadu - PennyLane

To simulate qubits on classical hardware, we need large amounts of RAM, because the memory required grows exponentially with the number of qubits.
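A rough back-of-the-envelope calculation (an illustration, not from the article) shows why: a full n-qubit state is 2^n complex amplitudes, and at 16 bytes per complex number the memory doubles with every added qubit.

```python
# Memory needed to hold the full state vector of n simulated qubits,
# assuming 16 bytes per amplitude (two 64-bit floats per complex number).
def sim_memory_bytes(n_qubits):
    return (2 ** n_qubits) * 16

print(sim_memory_bytes(10))  # 16384 bytes, i.e. 16 KB
print(sim_memory_bytes(30))  # about 17 GB -- already a large workstation
print(sim_memory_bytes(50))  # about 18 PB -- far beyond any machine's RAM
```

This is why classical simulators top out at a few dozen qubits, while the frameworks above also target real quantum hardware.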

Microsoft launched the preview version of a new programming language for quantum computing called Q#. The industry giant also launched a quantum simulator that developers can use to test and debug their quantum algorithms.

The language and simulator were announced in September. The then-unnamed language was intended to bring traditional programming concepts - functions, variables, and branches, along with a syntax-highlighted development environment complete with quantum debugger - to quantum computing, a field that has hitherto built algorithms from wiring…

Example: Accessing Q# Algorithm Code from C#

A simple Q# program can demonstrate quantum behavior that is distinct from the behavior of a classical computer.

The workflow has two steps: first, create the quantum algorithm in Q#; then, invoke that Q# code from a C# host program.

CTO @ Preline, Inc.