NLP Overview: Modern Deep Learning Techniques Applied to Natural Language Processing
Trying to keep up with advancements at the intersection of neural networks and natural language processing can be difficult. That's where today's spotlighted resource comes in.
Over the past several years, neural networks have come to play an increasingly central role in natural language processing. Owing in large part to milestones such as word embeddings, and to the explosion of chatbots powered by language models built, at least in part, on neural networks, achievements in the domain have been arriving increasingly quickly. Trying to keep up with these advancements can be difficult. That's where today's spotlighted resource comes in.
Image source: Vered Shwartz
NLP Overview: Modern Deep Learning Techniques Applied to Natural Language Processing is a living resource maintained by Elvis Saravia and Soujanya Poria, who note that a major part of the project was adapted directly from the work of Young et al. (2017). It covers the major achievements of neural networks as applied to NLP. Directly from the project itself:
This project contains an overview of recent trends in deep learning based natural language processing (NLP). It covers the theoretical descriptions and implementation details behind deep learning models, such as recurrent neural networks (RNNs), convolutional neural networks (CNNs), and reinforcement learning, used to solve various NLP tasks and applications. The overview also contains a summary of state of the art results for NLP tasks such as machine translation, question answering, and dialogue systems.
Whether you are looking for an overview of advancements, wondering what the state-of-the-art technique for some specific task is, or seeking implementation code, the NLP Overview project could be useful to you. Its true virtues are its concision and its organization: you can read the main information top to bottom in an hour, or easily locate exactly what interests you.
The project goals are stated as:
The main motivations for this project are as follows:
- Maintain an up-to-date learning resource that integrates important information related to NLP research, such as:
  - state of the art results
  - emerging concepts and applications
  - new benchmark datasets
  - code/dataset releases
- Create a friendly and open resource to help guide researchers and anyone interested to learn about modern techniques applied to NLP
- A collaborative project where expert researchers can suggest changes (e.g., incorporate SOTA results) based on their recent findings and experimental results
This resource may well be a great find for NLP practitioners, or for those with some existing understanding of the domain, though I would not expect it to be terribly useful for those brand new to the area. You might want to develop your basics first, and then visit this overview to get up to speed on how NLP is currently practiced using neural networks and deep learning.
Thanks to Elvis Saravia and Soujanya Poria for putting together an accessible, informative overview of where NLP and neural networks currently overlap, and for providing readers with much to take away with them.