Computing methodologies, driven by rapid proliferation and the need for greater concurrency, have evolved over the years from parallel and distributed computing to the latest technologies such as cloud computing. With the emergence of quantum computing, a paradigm shift has been witnessed in the form of exponential speedup, optimization of problem solutions, and the solvability of a few classically unsolvable problems. This article presents observations on executing basic quantum operations that play a vital role in quantum computing. The results and analysis demonstrate how the parameters of time, deviation, and shots significantly affect the outcomes of executed quantum circuits, providing a reference point for the community engaged in designing quantum circuits, protocols, algorithms, and quantum hardware to leverage quantum characteristics and properties. By means of experimental use cases, the study provides a rigorous review of qubit behavior and of standard and entangled qubit measurement, with practical results obtained on a simulator and on real quantum hardware for variable qubit configurations using the selected platform of IBM Q Experience. With the paradigm shift from classical to quantum computing, this article provides basic but vital observations that can be used by anyone intending to build quantum applications.
KEYWORDS
absolute deviation analysis, Bell state measurement, noise model, open-source quantum software projects, quantum measurement, transpiling time
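As a concrete illustration of the kind of basic operation studied in this article, the following is a minimal sketch, assuming the open-source Qiskit SDK (the toolkit behind IBM Q Experience) and its Aer simulator are installed; it is not the authors' exact experimental setup. It prepares a two-qubit Bell state, measures both qubits, and runs the circuit for a configurable number of shots, the execution parameter whose effect on measurement outcomes the study analyzes.

```python
# Minimal sketch: prepare and measure a Bell state on a local simulator.
# Assumes Qiskit with the Aer backend (pip install qiskit qiskit-aer).
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

# Two-qubit Bell state (|00> + |11>)/sqrt(2)
qc = QuantumCircuit(2, 2)
qc.h(0)                        # Hadamard puts qubit 0 into superposition
qc.cx(0, 1)                    # CNOT entangles qubit 0 with qubit 1
qc.measure([0, 1], [0, 1])     # measure both qubits into classical bits

backend = AerSimulator()
compiled = transpile(qc, backend)   # transpiling step; its time is one studied parameter
result = backend.run(compiled, shots=1024).result()
print(result.get_counts())          # ideally close to a 50/50 split of '00' and '11'
```

On an ideal simulator the counts deviate from the exact 50/50 split only through shot noise, so increasing the shot count shrinks the observed deviation; on real hardware, gate and readout errors add a further systematic deviation, which is the contrast the article examines.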
INTRODUCTION
The abacus was the first step toward building computing hardware. Calculators such as the arithmometer remained a fascination after the 1820s, and their potential was harnessed for commercial use. The need to perform complex calculations and construct mathematical tables, such as logarithmic tables, further led to the making of the difference engine by Charles Babbage, which later gave way to a more advanced computing machine: the analytical engine, a general-purpose, fully program-controlled, automatic mechanical digital computer. [1][2][3] It is important to realize that the advancements in computational hardware were driven by commercial or military requirements. First-generation classical computers had vacuum tubes at the heart of their construction; these were replaced by transistors, integrated circuits, and very large scale integrated (VLSI) circuits to enhance the computing power of a system and reduce hardware size, enabling larger volumes of information/data to be processed in less time and space. [4][5][6][7][8] Standalone computation systems were further integrated into networks to provide computational concurrency, enabling the handling of large data volumes and the performance of complex computations, as computing methodologies evolved from serial computing to distributed computing [9] and parallel computing. [10] Over the recent decades, with the increase in data volumes, concepts like data mining, big data analytics, and artificial intelligence (AI) emerged, and the demand for extremely fast processing speeds an...