- GPT-2 vs GPT-3: The OpenAI Showdown - Feb 17, 2021.
Thanks to the diversity of its training dataset, GPT-2 can generate adequate text across a variety of domains. GPT-2 has 10x the parameters and was trained on 10x the data of its predecessor, GPT.
GPT-2, GPT-3, Natural Language Generation, NLP, OpenAI, Transformer
- Innovating versus Doing: NLP and CORD19 - Jun 30, 2020.
How I learned to trust the process and find value in the road most traveled.
Coronavirus, Data Science, Data Visualization, GPT-2, IBM, LDA, NLP, Topic Modeling
- GPT-3, a giant step for Deep Learning and NLP? - Jun 9, 2020.
OpenAI recently announced GPT-3, the successor to its language model GPT-2 and now the largest model trained so far, with 175 billion parameters. Training a language model this large has its merits and limitations, so this article covers some of its most interesting and important aspects.
AI, Deep Learning, GPT-2, GPT-3, NLP, OpenAI
- 20 AI, Data Science, Machine Learning Terms You Need to Know in 2020 (Part 2) - Mar 2, 2020.
We explain important AI, ML, Data Science terms you should know in 2020, including Double Descent, Ethics in AI, Explainability (Explainable AI), Full Stack Data Science, Geospatial, GPT-2, NLG (Natural Language Generation), PyTorch, Reinforcement Learning, and Transformer Architecture.
AI, Data Science, Explainability, Geospatial, GPT-2, Key Terms, Machine Learning, Natural Language Generation, Reinforcement Learning, Transformer
- What just happened in the world of AI? - Dec 12, 2019.
The speed at which AI advanced and made news during 2019 makes it imperative to step back and place these events in order and perspective. It is important to separate the interest that any one advancement initially attracts from its actual gravity and its consequential influence on the field. This review unfolds the parallel threads of these AI stories over the year and isolates their significance.
AI, AutoML, Bias, Deep Learning, DeepMind, GANs, GPT-2, NLP, OpenAI, Reinforcement Learning, Trends
- Deploying a pretrained GPT-2 model on AWS - Dec 12, 2019.
This post summarizes my recent detour into NLP, describing how I exposed a Hugging Face pre-trained language model (LM) through an AWS-based web application.
AWS, Deployment, GPT-2, Natural Language Generation, NLP
- Examining the Transformer Architecture: The OpenAI GPT-2 Controversy - Jun 20, 2019.
GPT-2 is a generative model, created by OpenAI, trained on 40 GB of Internet text to predict the next word. OpenAI found this model to be SO good that they did not release the fully trained version, citing concerns about malicious applications of the technology.
AI, Architecture, GPT-2, NLP, OpenAI, Transformer
- What Does GPT-2 Think About the AI Arms Race? - Apr 1, 2019.
It may be April first, but that doesn't mean you will necessarily be fooled by GPT-2's views on the AI arms race. Why not have a read, for fun and to see what the language generation model is capable of?
AI, GPT-2, Natural Language Generation, NLP
- OpenAI’s GPT-2: the model, the hype, and the controversy - Mar 4, 2019.
OpenAI recently released a very large language model called GPT-2. Controversially, they decided not to release the data or the parameters of their biggest model, citing concerns about potential abuse. Read this researcher's take on the issue.
AI, Ethics, GPT-2, Hype, NLP, OpenAI