The laptops, cell phones, and internet applications commonplace in our daily lives are all rooted in the idea of zeros and ones, in bits. This foundational element originated from the combination of mathematics and Claude Shannon's Theory of Information. Coupled with the 50-year legacy of Moore's Law, the bit has propelled the digitization of our world.

In recent years, artificial intelligence systems, merging neuron-inspired biology with information, have achieved superhuman accuracy on a range of narrow classification tasks by learning from labelled data. Advancing from Narrow AI to Broad AI will require unifying learning and reasoning through neuro-symbolic systems, resulting in a form of AI that can perform multiple tasks, operate across multiple domains, and learn from small quantities of multi-modal input data.

Finally, the union of physics and information led to the emergence of Quantum Information Theory and the development of the quantum bit, the qubit, which forms the basis of quantum computers. We have built the first programmable quantum computers, and although the technology is still in its early days, these systems offer the potential to solve problems that even the most powerful classical computers cannot.

The future of computing will look fundamentally different from its past. It will not be based on more and cheaper bits alone; rather, it will be built upon bits + neurons + qubits. This future will enable the next generation of intelligent, mission-critical systems and accelerate the pace of science-driven discovery.