Big Data

Quantum Computing, Artificial Intelligence (AI) and Solving the Impossible

By Bill Schmarzo, CTO, Dell EMC Services (aka “Dean of Big Data”) | August 16, 2017

In the blog “From Autonomous to Smart:  Importance of Artificial Intelligence,” we discussed the two critical artificial intelligence (AI) challenges in creating “smart” edge devices:

  • Artificial Intelligence Challenge #1: How do the Artificial Intelligence algorithms handle the unexpected, such as flash flooding, terrorist attacks, earthquakes, tornadoes, police car chases, emergency vehicles, blown tires, a child chasing a ball into the street, etc.?
  • Artificial Intelligence Challenge #2: The more complex the problem state, the more data storage (to retain known state history) and CPU processing power (to find the optimal or best solution) are required in the edge devices in order to create “smart.”

In the blog “Transforming from Autonomous to Smart: Reinforcement Learning Basics,” we talked about how Moore’s Law isn’t going to bail us out, because the problem space is getting more complex, even for relatively simple environments like playing checkers and chess:

  • Checkers has 500 billion billion (that’s right, billion twice) possible board moves. That’s 500,000,000,000,000,000,000 possible moves (that’s 20 zeros).
  • The number of moves in a game of chess is a minimum of 10^120 (that’s a 1 followed by 120 zeros).

By the way, the commonly accepted answer for the number of particles in the observable universe is 10^80.
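To put these magnitudes side by side, here is a small illustrative Python sketch (Python’s integers are arbitrary-precision, so these numbers can be computed exactly):

```python
# Rough state-space sizes mentioned above.
checkers_positions = 5 * 10**20   # "500 billion billion" possible checkers board moves
chess_moves = 10**120             # conservative lower bound for chess
universe_particles = 10**80       # commonly cited particle count of the observable universe

# The chess game tree dwarfs even the particle count of the universe.
ratio = chess_moves // universe_particles
print(f"Chess exceeds the universe's particle count by a factor of 10^{len(str(ratio)) - 1}")
# prints: Chess exceeds the universe's particle count by a factor of 10^40
```

In other words, even if every particle in the observable universe were a computer, brute-force enumeration of chess would still be hopeless, which is the point of the Dante quote below.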

“Abandon all hope, ye who enter here.”

Maybe our only hope to solve these uber-complex, life-impacting analytic problems like autonomous vehicles, smart cities and precision medicine lies in a new approach – quantum computing.

What is Quantum Computing?

Quantum computing differs from traditional binary computing in that it takes advantage of the strange ability of subatomic particles to exist in more than one state at a time (it’s like your children, whom you can both love and hate at the same time). In classical digital computing, a bit is a single piece of information that can exist in two states – 1 or 0. Quantum computing uses quantum bits, or “qubits,” instead. A qubit can be thought of as a point anywhere on an imaginary sphere[1].

The secret to Quantum computing’s “weird” behavior lies in the quantum-mechanical phenomena of superposition and entanglement.

  • Quantum superposition states that any two (or more) quantum states can be added together (“superposed”) and the result will be another valid quantum state (much as two colliding waves in physics can create a bigger wave, cancel each other out, or create something in between). Conventional digital computing requires that data be encoded into binary digits (bits), each of which is always in one of two definite states (0 or 1). A quantum computer, however, can represent 0, 1, or a “superposition” of both 0 and 1 at the same time.
  • Quantum entanglement is a physical phenomenon that occurs when pairs or groups of particles are generated or interact in ways such that the quantum state of each particle cannot be described independently of the others, even when the particles are separated by a large distance—instead, a quantum state must be described for the system as a whole.

Superposition and entanglement are the breakthroughs behind quantum computing. As a result, three qubits can represent all eight possible values simultaneously, which means a quantum computer can perform certain operations orders of magnitude faster, and with less energy, than a digital computer.
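The “three qubits represent eight values” claim can be made concrete with a toy classical simulation, which also shows why simulating qubits on a digital computer gets exponentially expensive. This is a minimal sketch assuming NumPy; a real quantum computer does not work by multiplying matrices like this, of course:

```python
import numpy as np

# An n-qubit register is described by a vector of 2**n complex amplitudes.
n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0  # start in the definite state |000>

# A Hadamard gate puts a single qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Applying H to every qubit (Kronecker product builds the 3-qubit operator)
# leaves the register in an equal superposition of all 2**3 = 8 basis states.
full_H = H
for _ in range(n - 1):
    full_H = np.kron(full_H, H)
state = full_H @ state

print(len(state))                                    # prints: 8
print(np.allclose(np.abs(state)**2, 1 / 2**n))       # prints: True (each outcome equally likely)
```

Note that the classical simulation needs 2^n amplitudes: at n = 50 qubits that is already about 10^15 numbers, which is exactly the wall that makes native quantum hardware interesting.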

Quantum Computing and Analytics

One important area where quantum computing is expected to have a dramatic impact is in improving reinforcement learning’s ability to process an exponentially wider range of operating variables in real time, which is vital for automated cars and smart entities like factories and hospitals. As an example, Google has built a quantum computer that is reportedly 100 million times faster than any of today’s machines. This quantum computer could complete within seconds a calculation that might take a digital computer 10,000 years.

This new age of “Quantum Artificial Intelligence” is particularly important for the machine learning problems that today are too hard or too complex for digital computers to solve, such as:

  • Safer airplanes—Lockheed Martin plans to test airplane designs that are currently too complex for classical computers.
  • Detect cancer earlier—Computational models will help determine how diseases develop.
  • Help automobiles drive themselves—Google is using a quantum computer to design software that can distinguish cars from landmarks.
  • Reduce weather-related deaths—Precision forecasting will give people more time to take cover.
  • Cut back on travel time—Sophisticated analysis of traffic patterns in the air and on the ground will forestall bottlenecks and snarls.
  • Develop more effective, personalized drugs—By mapping amino acids, for example, or analyzing DNA-sequencing data, doctors will discover and design superior drug-based treatments.

I didn’t see space travel or time travel on that list, but hopefully that’s just a bit later on the solution roadmap.

Maybe hope is just around the corner…or maybe that’s an autonomous vehicle plowing around that corner right at my self-driving dreams.

Additional Sources:

How quantum effects could improve artificial intelligence

How will Quantum Computing Impact Artificial Intelligence?

9 Ways Quantum Computing Will Change Everything

Google’s new quantum computer could launch artificial intelligence arms race


[1] “Inside the weird world of quantum computers”


About Bill Schmarzo

CTO, Dell EMC Services (aka “Dean of Big Data”)

Bill Schmarzo, author of “Big Data: Understanding How Data Powers Big Business” and “Big Data MBA: Driving Business Strategies with Data Science”, is responsible for setting strategy and defining the Big Data service offerings for Dell EMC’s Big Data Practice. As a CTO within Dell EMC’s 2,000+ person consulting organization, he works with organizations to identify where and how to start their big data journeys. He’s written white papers, is an avid blogger and is a frequent speaker on the use of Big Data and data science to power an organization’s key business initiatives. He is a University of San Francisco School of Management (SOM) Executive Fellow where he teaches the “Big Data MBA” course. Bill also just completed a research paper on “Determining The Economic Value of Data”. Onalytica recently ranked Bill as #4 Big Data Influencer worldwide.

Bill has over three decades of experience in data warehousing, BI and analytics. Bill authored the Vision Workshop methodology that links an organization’s strategic business initiatives with their supporting data and analytic requirements. Bill serves on the City of San Jose’s Technology Innovation Board, and on the faculties of The Data Warehouse Institute and Strata.

Previously, Bill was vice president of Analytics at Yahoo where he was responsible for the development of Yahoo’s Advertiser and Website analytics products, including the delivery of “actionable insights” through a holistic user experience. Before that, Bill oversaw the Analytic Applications business unit at Business Objects, including the development, marketing and sales of their industry-defining analytic applications.

Bill holds a Master of Business Administration from the University of Iowa and a Bachelor of Science degree in Mathematics, Computer Science and Business Administration from Coe College.
