Quantum Consumer Electronics

(Please note: this article is not an official Intel communication. All thoughts and observations in this piece are entirely my own, as is the responsibility for any technical inaccuracies.)

Every January, several thousand exhibitors and more than 170,000 attendees converge on Las Vegas, Nevada, to gaze at the coming wave of incrementally improved gadgets. More than 4,000 companies showcase their newest lineups of products to restock the shelves of big-box retailers like Best Buy and Walmart over the coming months.

So when I heard that I would be responsible for demoing Intel’s recently announced quantum computing chip, “Tangle Lake”, and the self-learning neuromorphic computing architecture code-named “Loihi”, I was both excited and perplexed.

You see, bringing a quantum processor to a consumer electronics show is like bringing a Falcon Heavy to an auto show, or a gun to a knife fight! Except this fight is about bold innovation, and Intel was armed with the most exotic computing technologies ever fabricated!

Technology enthusiasts from around the world woke up to a new paradigm of human ingenuity. During CES 2018, I must have spoken with hundreds of passionate technologists, corporate executives, and engineers who struggled to add a whole new set of scientific concepts to their technical vocabulary.

I also initially struggled to internalize some of the technical concepts; however, I am fortunate to have founded The Machine Learning Society, a community of 5,000+ technical experts, many of whom are neuroscientists, quantum physicists, and friends.

A huge thanks to PSS for selecting me to present, and to the extraordinary Intel Labs team for patiently educating me on the applications of this revolutionary technology.

Over the course of the show, several themes dominated the conversation, so I am publishing a collection of those exchanges for everyone who could not attend this year.

Q&A 1: The Profit

Q: Attendee- How much are the quantum computing chips selling for? I would like to buy several units for crypto mining.

A: Me- Sorry, this technology is about 5–10 years away from commercialization. The temperatures required to maintain superposition are nearly 250 times colder than deep space, and the hardware will likely require institutional supervision to keep running.
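To put that figure in context (my own back-of-the-envelope numbers, not an Intel spec): superconducting qubits are typically held in a dilution refrigerator at around 10 millikelvin, while the cosmic microwave background that pervades deep space sits at roughly 2.7 kelvin.

$$\frac{T_{\text{deep space}}}{T_{\text{qubit}}} \approx \frac{2.7\ \text{K}}{0.010\ \text{K}} \approx 270$$

That ratio is roughly where the “250 times colder” comparison comes from, and it is also why this is laboratory equipment rather than something you rack next to a mining rig.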

Q&A 2: The Digital Hypochondriac

Q: Attendee- So what kind of encryption can this chip crack? Won’t it make cryptocurrency and blockchain technology obsolete?

A: Me- I imagine that sufficiently powerful future processors will be able to break many of today’s most secure cryptographic standards, particularly the public-key schemes in wide use. I am also confident that many brilliant mathematicians and cryptographers are working on novel, quantum-resistant encryption schemes.

Regarding the security of cryptocurrency and blockchains, quantum computing does indeed pose a potential existential threat. However, as with most engineering problems, a solution is already taking shape: an entire branch of research into post-quantum cryptography has emerged to address it.
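To give a rough sense of why the threat is uneven (a toy sketch of my own, not anything specific to Tangle Lake): Shor’s algorithm breaks today’s RSA and elliptic-curve public keys outright, while Grover’s algorithm “only” gives a quadratic speedup against symmetric ciphers and hash functions, effectively halving their security level. The little Python sketch below illustrates that halving.

```python
# Toy illustration, not an attack: how Grover's quadratic speedup
# changes the effective strength of an n-bit symmetric key.

def classical_bruteforce_ops(key_bits: int) -> int:
    """Worst-case classical key search: about 2^n trial decryptions."""
    return 2 ** key_bits

def grover_search_ops(key_bits: int) -> int:
    """Grover's algorithm needs on the order of 2^(n/2) quantum queries."""
    return 2 ** (key_bits // 2)

for bits in (128, 256):
    classical = classical_bruteforce_ops(bits)
    quantum = grover_search_ops(bits)
    print(f"AES-{bits}: ~2^{classical.bit_length() - 1} classical ops "
          f"vs ~2^{quantum.bit_length() - 1} Grover queries")
```

That asymmetry is why the usual guidance is to double symmetric key sizes but replace public-key schemes altogether, which is exactly what efforts like NIST’s post-quantum cryptography standardization are working toward.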

Q&A 3: The Relativist

Q: Attendee- How does Intel’s Neuromorphic computing chip compare to Nvidia’s new AI chip? Which one is better?

A: Me- It doesn’t! Each architecture has strengths and weaknesses across a variety of data processing tasks.

Q: But, which one is better?

A: Me- What’s better, a sports car, an SUV, or a yacht? Obviously a yacht! But in all seriousness, each has its own unique use case. The recent explosion of purpose-built integrated circuits (microchips) has shifted the conversation away from relative comparisons based on ambiguous benchmarks. The real questions are: what kind of computational problem are you trying to solve, and at what energy efficiency? I imagine that the age of the one-size-fits-all, general-purpose microchip is coming to an end. In the coming years, we will see a host of exotic new architectures entering the market to serve narrowly defined processing requirements.

Q&A 4: The Movie Buff

Q: Attendee- When will the robots kill us all?

A: Me- 2024…

Serious A: Me — Hollywood's lack of imagination has created a heartbreaking trope that needs to be retired from our collective consciousness. Improvements in artificial intelligence will fundamentally transform our civilization and help solve the intractable problems that have plagued our species for millennia. Our cities will live and breathe through trillions of connected devices that create unprecedented wealth and abundance. If we are lucky, we will come together, engineer away future mass extinctions, and perhaps even earn the designation of a Type I civilization.

Q&A 5: The Futurist

Q: Attendee- Will we have quantum computers in our pockets one day?

A: Me- This is my favorite question because it is inherently optimistic. Unfortunately, it also stems from an innocent unfamiliarity with how fundamentally quantum computation differs from the conventional, general-purpose processing our pocket devices do today. Quantum chips will likely exploit atomic and subatomic interactions to perform complex molecular modeling, protein folding, and real-time simulations of fluid dynamics, as well as to aid the development of metamaterials and new energy paradigms.

Final thoughts

The five archetypes above were selected to highlight some of the emotions that surface when new and seemingly unrelatable technologies are introduced. These emotions stem from a fear of change and of the unknown. This revolution also entails a certain loss of control and, for the first time in history, an existential threat to the collective intelligence of our species.

When it comes to AI and Skynet, that scenario is a demonstration of human imagination, not a dystopian future. The irony is that the sooner we relinquish control to more dependable systems of intelligence (autonomous driving, robotic surgery, equitable distribution of resources), the sooner we can regain control of our humanity.

If you liked this article, please visit The Machine Learning Society website to learn more about how our global community of scientists, engineers, and artificial intelligence experts is transforming science, technology, and culture.


Want to read more? Check out our other articles.