About Kevin Vu
Kevin Vu manages the Exxact Corp blog and works with many of its talented authors, who write about different aspects of deep learning.
Kevin Vu Posts (31)
- GPT-2 vs GPT-3: The OpenAI Showdown - 17 Feb 2021
Thanks to the diversity of its training dataset, GPT-2 can generate adequate text across a variety of domains. GPT-2 has 10x the parameters of its predecessor, GPT, and was trained on 10x the data.
- Vision Transformers: Natural Language Processing (NLP) Increases Efficiency and Model Generality - 02 Feb 2021
Why do we hear so little about transformer models applied to computer vision tasks? What about attention in computer vision networks?
- Top 5 Artificial Intelligence (AI) Trends for 2021 - 21 Jan 2021
From voice and language driven AI to healthcare, cybersecurity and beyond, these are some of the key AI trends for 2021.
- A Friendly Introduction to Graph Neural Networks - 30 Nov 2020
Despite being a potentially confusing topic, graph neural networks can be distilled into a handful of simple concepts. Read on to find out more.
- Compute Goes Brrr: Revisiting Sutton’s Bitter Lesson for AI - 19 Nov 2020
"It's just about having more compute." Wait, is that really all there is to AI? As Richard Sutton's 'bitter lesson' sinks in for more AI researchers, a debate has emerged over a potentially more subtle relationship between advancements driven by ever-more-clever algorithms and those driven by massively scaled computational power.
- Building Neural Networks with PyTorch in Google Colab - 30 Oct 2020
Combining PyTorch and Google's cloud-based Colab notebook environment can be a good solution for building neural networks with free access to GPUs. This article demonstrates how to do just that.
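As a rough sketch of the kind of workflow the post describes (the layer sizes, batch shape, and `model` name here are illustrative assumptions, not taken from the article):

```python
import torch
import torch.nn as nn

# A minimal feed-forward network, as one might define in a Colab notebook.
model = nn.Sequential(
    nn.Linear(4, 16),
    nn.ReLU(),
    nn.Linear(16, 2),
)

# Colab exposes a free GPU when one is attached; fall back to CPU otherwise.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

# Forward pass on a random batch of 8 samples with 4 features each.
x = torch.randn(8, 4, device=device)
out = model(x)
print(out.shape)  # torch.Size([8, 2])
```

The same code runs unchanged on CPU or GPU, which is part of what makes the PyTorch-plus-Colab combination convenient for experimentation.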
- Autograd: The Best Machine Learning Library You’re Not Using? - 16 Sep 2020
If there is a Python library that is emblematic of the simplicity, flexibility, and utility of differentiable programming, it has to be Autograd.
- A Deep Dive Into the Transformer Architecture – The Development of Transformer Models - 24 Aug 2020
Although transformers for NLP were introduced only a few years ago, they have already had a major impact on fields ranging from reinforcement learning to chemistry. Now is the time to understand the inner workings of the transformer architecture and build the intuition you need to work effectively with these powerful tools.
- Exploring GPT-3: A New Breakthrough in Language Generation - 10 Aug 2020
GPT-3 is the largest natural language processing (NLP) transformer released to date, eclipsing Microsoft Research's Turing-NLG, the previous record holder at 17B parameters, by roughly a factor of ten. This has resulted in an explosion of demos: some good, some bad, all interesting.
- Deep Learning for Signal Processing: What You Need to Know - 27 Jul 2020
Signal Processing is a branch of electrical engineering that models and analyzes data representations of physical events. It is at the core of the digital world. And now, signal processing is starting to make some waves in deep learning.