KDnuggets Home » News » 2017 » Feb » Opinions, Interviews » Why Go Long on Artificial Intelligence? ( 17:n07 )

Why Go Long on Artificial Intelligence?


We are now at the right place and time for AI to be the set of technology advancements that can help us solve challenges where answers reside in data. While we have already seen a few AI bull and bear markets since the 1950s, this time it’s different. If I and others are right, the implications are immensely valuable for all.



By Nathan Benaich, investing in a future of technology.

For those out there who know me, it’ll be no surprise to learn that I’m going long on the transformative power of artificial intelligence (AI). Since 2013, I’ve spent most of my energy studying, researching, investing (e.g. Mapillary, Numerai, Ravelin) and building AI communities (AI Summit 2015 and 2016, LondonAI meetup), with a mission to accelerate its real-world applications. I am passionate about seeking out and bringing technology advancements to markets that can enable us to solve the high-value (and often complex) problems we face in business and society. Importantly, this includes problems that were previously intractable from either a technical or commercial standpoint.

Here is why I’m going long on AI:

  • AI and related technologies are the next major technology enablers, just like mobile, social and cloud before them. In fact, AI is the next electricity, transforming one industry after another, forever.
  • AI will become ubiquitous and deeply integrated into the fabric of software and will be used every day. Similar to how most of today’s applications are cloud-native from day one, those of tomorrow will be AI-driven. Foundational AI components are becoming increasingly available in the form of open source libraries, frameworks, APIs and entire platforms. These developments create layers of abstraction that empower non-expert developers to build AI-native applications.
  • In doing so, AI will improve performance, cost efficiency and overall experience for creators and users alike by several orders of magnitude.
  • AI will provide accelerating returns to those who successfully deploy it. This is because the act of users engaging with an AI system in the wild enables the system to optimise its own performance against predetermined (or learned) goals. The intrinsic accelerating returns of AI also mean that those who fail to get with the program will be left chasing an ever-widening gap with their competitors.
  • We therefore observe the formation of a positive feedback loop from building, testing and retuning AI systems that yields the optimal experience for end users. This, in turn, drives stickier habits, as AI systems become simply too good to give up. This is best experienced today with recommendation systems (e.g. Spotify, Pandora, YouTube, Netflix) that uncannily improve as they learn our preferences.
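The feedback loop in the last bullet can be made concrete with a toy example. The sketch below is my own illustration (it is not from the original piece, and real recommenders are far more sophisticated): an epsilon-greedy bandit, one of the simplest "learn from user engagement" loops. The system recommends items, observes whether users engage, and shifts future recommendations toward what works. The engagement rates are invented for illustration.

```python
import random

def simulate(true_rates, epsilon=0.1, rounds=10_000, seed=0):
    """Epsilon-greedy bandit: performance improves as users interact."""
    rng = random.Random(seed)
    counts = [0] * len(true_rates)    # times each item was recommended
    values = [0.0] * len(true_rates)  # running mean of observed engagement
    total_reward = 0
    for _ in range(rounds):
        if rng.random() < epsilon:
            # Explore: occasionally try a random item.
            arm = rng.randrange(len(true_rates))
        else:
            # Exploit: recommend the item with the best estimate so far.
            arm = max(range(len(true_rates)), key=values.__getitem__)
        # Simulated user feedback: 1 if the user engaged, else 0.
        r = 1 if rng.random() < true_rates[arm] else 0
        counts[arm] += 1
        values[arm] += (r - values[arm]) / counts[arm]  # incremental mean
        total_reward += r
    return values, total_reward

# Three hypothetical items whose true engagement rates the system never sees.
values, reward = simulate([0.2, 0.5, 0.8])
# The estimated rates converge toward the true ones, so later
# recommendations are better than early ones.
```

In a real recommender the "arms" would be items or whole models and the reward a click or a listen, but the loop structure (act, observe, update, act better) is exactly the self-optimising dynamic the bullet describes.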

For those who work in the industry, this comes as no surprise. For laymen and public markets, however, it’s only in the last 12 months that the veil over AI’s potential has been somewhat lifted. Indeed, not a day goes by in which AI doesn’t somehow make headlines, with media mentions up by nearly 200% YoY. This is driven in part by breakthroughs on well-known benchmarks for speech, language and image classification, the open-sourcing of machine learning frameworks by Google, Microsoft, Facebook and Baidu, and major acquisitions.


Media attention for selected keywords over the last five years plotted on the same y-axis scale

Another way of looking at this hype wave is to track the share price of NVIDIA, the leading designer of graphics processing units (GPUs). In 2012, University of Toronto researchers developed a then state-of-the-art convolutional neural network (CNN) that achieved record-breaking performance on a large-scale image classification task. This feat was made possible, in no small part, because the authors optimised their network (henceforth known as ‘AlexNet’) for parallel training and inference on two NVIDIA GPUs.

Since then, NVIDIA GPUs along with their parallel computing platform and programming model (CUDA) have veritably become the shovels for the AI gold rush. The dramatic increase in parallelizable computing power has enabled developers to train deep, data-hungry architectures faster than ever before, whether they are neural network or reinforcement learning models. The result? We’ve achieved incredible breakthroughs in environment perception, autonomy, robotics, machine translation, speech recognition and dialogue, search, image and video super-resolution, and many more to come.

It’s therefore not surprising that NVIDIA delivered a 225% total return in 2016, making it last year’s best-performing stock in the S&P 500. A truly incredible feat. Note the red box in the graph below, which outlines the +30% jump the stock recorded on the day of its fiscal Q3 2017 earnings call, 10th November 2016. That’s right, a 30% uplift or, put another way, +$11bn added to the company’s market capitalisation in a single day. In my eyes, this is the moment public markets awoke to the future potential of NVIDIA and, by close association, the potential that AI holds. One could posit that the delta between the expectations of public market investors and the reality of AI’s impact on the real world narrowed in a meaningful way.


A scorching year for NVIDIA (share price % change plotted vs. S&P 500) (Google Finance)

It’s not often that several major macro and micro factors align to poise a technology for such a significant impact. Today we have access to immense compute resources and the means to capture and process data in search of answers to complex questions. We have simulation environments in which to rapidly iterate the development of new AI models. Researchers are publishing new model architectures and training methodologies, while squeezing more performance from existing models. In fact, the output of the AI research community is incredible: registrations for Neural Information Processing Systems (NIPS), the field’s most significant conference, shot through the roof and sold out for the first time this year, with 5,362 attendees versus 3,651 the year before. What’s more, the resources to conduct experiments, build, deploy and scale AI technology are rapidly being democratised. Finally, significant capital and talent are flowing into private companies to enable new and bold ideas to take flight.

The time is now for AI. We’re poised to see real value creation in 2017 and beyond. Watch this space!

Sign up to my newsletter covering AI news and analysis from the tech world, research lab and private/public company market.

I’d love to hear your thoughts - ping me on Twitter (@nathanbenaich).

Thanks to my friend and LondonAI partner in crime, Alexandre Flamant, for proofreading this piece.

Disclosure: I currently hold a (microscopic) long position in NVIDIA.

Bio: Nathan Benaich is an investor and technologist accelerating the future of #AI. Former research scientist, photographer, perpetual foodie.

Original. Reposted with permission.
