Finally a Book on Attention!
Research has found that RNNs and LSTMs do not perform well on long sequences. Attention is the solution. With attention, you can use transformer models instead of RNNs for your NLP projects.
Learn how to build your own transformer model with Machine Learning Mastery's new book:
>> Building Transformer Models with Attention
(use the offer code KDN20 to get 20% off)
Getting a computer to understand human language is hard because:
- Every language has thousands of words
- Words can carry multiple meanings
- Sentence structure can be very complex
- The same word can have different meanings depending on its position in a sentence
Transformers can solve these problems. We have seen that transformers can:
- Translate passages from one language to another
- Extract specific keywords, such as people's names
- Summarize articles into a paragraph
- Look for answers to a given question in a passage
- Compose an article from a leading sentence
And so much more...
I have developed a playbook titled "Building Transformer Models with Attention", designed to help developers understand the magic behind transformer models and the attention mechanism. It covers:
- The variations of the attention mechanism
- Detailed code to implement reusable attention components
- How to build a complete transformer model from the components you create
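To give a taste of what those attention components look like, here is a minimal sketch of scaled dot-product attention, the core operation inside every transformer. This is an illustrative NumPy example, not code from the book; the function and variable names are my own.

```python
import numpy as np

def scaled_dot_product_attention(queries, keys, values):
    """Minimal scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = queries.shape[-1]
    # Score each query against every key, scaled to keep gradients stable
    scores = queries @ keys.T / np.sqrt(d_k)
    # Softmax over the keys axis turns scores into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted sum of the value vectors
    return weights @ values

# Toy example: 3 queries attending over 4 key/value pairs of dimension 8
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 8): one attended output per query
```

The same handful of lines, wrapped with learned projection matrices and repeated across multiple heads, is what powers every task in the list above.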
As a valued reader, I want to give you a discount on this new book.
Enter the offer code KDN20 and click "apply" to get 20% off the price of the standalone book.