- GPT-2 vs GPT-3: The OpenAI Showdown - Feb 17, 2021.
Thanks to the diversity of its training dataset, GPT-2 can generate adequate text across a variety of domains. GPT-2 has roughly 10x the parameters of, and was trained on roughly 10x the data of, its predecessor, GPT.
- Six Times Bigger than GPT-3: Inside Google’s TRILLION Parameter Switch Transformer Model - Jan 25, 2021.
Google’s Switch Transformer model could be the next breakthrough in this area of deep learning.
- Top 5 Artificial Intelligence (AI) Trends for 2021 - Jan 21, 2021.
From voice and language driven AI to healthcare, cybersecurity and beyond, these are some of the key AI trends for 2021.
- Main 2020 Developments and Key 2021 Trends in AI, Data Science, Machine Learning Technology - Dec 9, 2020.
Our panel of leading experts reviews 2020's main developments and examines the key trends in AI, Data Science, Machine Learning, and Deep Learning technology.
- Must-read NLP and Deep Learning articles for Data Scientists - Aug 21, 2020.
NLP and deep learning continue to advance, nearly on a daily basis. Check out these recent must-read guides, feature articles, and other resources to keep you on top of the latest advancements and ahead of the curve.
- KDnuggets™ News 20:n31, Aug 12: Data Science Skills: Have vs Want: Vote in the New Poll; Netflix Polynote is a New Open Source Framework to Build Better Data Science Notebooks - Aug 12, 2020.
Vote in the new KDnuggets poll: which data science skills do you have, and which ones do you want? Netflix is not only for movies: its Polynote is a new open-source framework for building better data science notebooks; learn about containerizing PySpark with Kubernetes; read the findings from the Data Scientist Job Market 2020 analysis; and explore the latest on GPT-3.
- Exploring GPT-3: A New Breakthrough in Language Generation - Aug 10, 2020.
GPT-3 is the largest natural language processing (NLP) transformer released to date, eclipsing the previous record, Microsoft Research’s Turing-NLG at 17B parameters, by about 10 times. This has resulted in an explosion of demos: some good, some bad, all interesting.
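The "about 10 times" claim above is simple arithmetic on the two parameter counts cited in the blurb; a minimal check:

```python
# Parameter counts as cited above: GPT-3 (OpenAI) vs. Turing-NLG (Microsoft Research)
gpt3_params = 175e9        # 175 billion parameters
turing_nlg_params = 17e9   # 17 billion parameters

ratio = gpt3_params / turing_nlg_params
print(f"GPT-3 is about {ratio:.1f}x the size of Turing-NLG")
```

The exact ratio is roughly 10.3, which the article rounds to "about 10 times."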
- GPT-3, a giant step for Deep Learning and NLP? - Jun 9, 2020.
Recently, OpenAI announced a new successor to their language model: GPT-3, now the largest model trained to date, with 175 billion parameters. Training a language model this large has its merits and limitations, so this article covers some of its most interesting and important aspects.