More Free Courses on Large Language Models
Interested in learning about large language models? Get up and running with these free courses from DeepLearning.AI, Google Cloud, Udacity, and more.
Image by Author
Given the potential of large language models (LLMs) and LLM applications, the best time to learn more about them is now! From fun personal projects to academic research and work, it’s always exciting to understand LLMs better so we can build interesting applications using them.
In a previous article, we enumerated free courses and resources that’ll help you learn about large language models. We’ve curated yet another list of free courses to help you upskill.
Let’s get started!
ChatGPT Prompt Engineering for Developers
ChatGPT Prompt Engineering for Developers is offered by DeepLearning.AI in collaboration with the OpenAI team.
If you already use ChatGPT or GPT-4, this course teaches you how to get better at using them. You’ll learn how to use the OpenAI API effectively by using prompt engineering best practices.
Along the way, you’ll have the opportunity to build a custom chatbot and learn to use the OpenAI API for common use cases, including summarization, inference, translation, and spelling and grammar checks.
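To make the idea concrete, here is a minimal sketch of the prompt-engineering pattern taught in courses like this one: state the task explicitly and wrap the input text in clear delimiters. The helper name, model name, and message wording below are illustrative assumptions, not taken from the course itself.

```python
# A hedged sketch of prompt-engineering best practices: explicit instructions,
# delimited input, and a word limit. Helper and model names are illustrative.

def build_summary_messages(text, max_words=30):
    """Build a chat-style message list asking a model to summarize `text`."""
    system = "You are a helpful assistant that summarizes text concisely."
    user = (
        f"Summarize the text delimited by triple backticks "
        f"in at most {max_words} words.\n```{text}```"
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]

# With the openai package installed and an API key configured, the messages
# could be sent roughly like this (left commented so the sketch runs standalone):
#
# import openai
# response = openai.ChatCompletion.create(
#     model="gpt-3.5-turbo",
#     messages=build_summary_messages("Your article text here."),
#     temperature=0,
# )
# print(response.choices[0].message["content"])

messages = build_summary_messages("LLMs can summarize, translate, and classify text.")
print(messages[1]["content"])
```

The same pattern (delimiters plus an explicit task description) carries over to translation, inference, and the other use cases the course covers.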
Also check out Josep Ferrer’s detailed review of this prompt engineering course.
LangChain for LLM Application Development
The LangChain for LLM Application Development course by DeepLearning.AI is co-taught by Harrison Chase, the creator of LangChain. Focused on building applications with the LangChain ecosystem, this course will help you get the hang of:
- Managing prompts, parsing responses, and working within memory and context window constraints
- Using chains to perform a sequence of actions
- Question answering over a document corpus
- Leveraging the reasoning capabilities of agents
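The "chain" idea in the second bullet — feeding one step’s output into the next — can be sketched in plain Python with a stand-in for the model call. This is a conceptual sketch of the pattern LangChain packages up, not the LangChain API itself; the function names and canned responses are made up for illustration.

```python
# A plain-Python sketch of chaining: each prompt template is filled with the
# previous step's output and sent to the model. `fake_llm` stands in for a
# real LLM call so the sketch runs standalone.

def fake_llm(prompt):
    """Stand-in for a model call: returns a canned response per task."""
    if prompt.startswith("Translate"):
        return "Bonjour le monde"
    if prompt.startswith("Count"):
        return str(len(prompt.split(":", 1)[1].split()))
    return "unknown"

def run_chain(steps, initial_input):
    """Run each prompt template in order, piping output into the next input."""
    text = initial_input
    for template in steps:
        text = fake_llm(template.format(input=text))
    return text

chain = [
    "Translate to French: {input}",
    "Count the words in: {input}",
]
print(run_chain(chain, "Hello world"))  # step 2 consumes step 1's output
```

LangChain’s chains add prompt management, memory, and parsing on top of this basic pipe-the-output-forward pattern.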
Building Systems with the ChatGPT API
Building Systems with the ChatGPT API is also offered by DeepLearning.AI in partnership with OpenAI. In this free course, you’ll build a customer service chatbot to apply the following concepts:
- Building systems using large language models
- Using multistage prompts
- Building a subtask pipeline by breaking down tasks into subtasks
- Evaluating LLM inputs and outputs
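One simple form of the output evaluation mentioned in the last bullet is checking that a model’s response is valid, well-formed JSON before passing it downstream. The sketch below is an illustrative assumption of how such a check might look; the field names are hypothetical, not from the course.

```python
# A hedged sketch of evaluating LLM outputs: verify the response parses as
# JSON and contains the fields the rest of the system expects.
import json

def evaluate_output(raw_output, required_keys=("category", "confidence")):
    """Return (ok, parsed_or_error) for a model response expected to be JSON."""
    try:
        parsed = json.loads(raw_output)
    except json.JSONDecodeError as err:
        return False, f"not valid JSON: {err}"
    missing = [k for k in required_keys if k not in parsed]
    if missing:
        return False, f"missing keys: {missing}"
    return True, parsed

ok, result = evaluate_output('{"category": "billing", "confidence": 0.92}')
print(ok, result)
```

In a subtask pipeline, a failed check like this might trigger a retry or a fallback response instead of surfacing malformed output to the user.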
Note: All the above courses are free for a limited time.
Google Cloud Generative AI Learning Path
Google Cloud recently released a dedicated Generative AI learning path. The series of micro-courses that make up this path aims to enable you to develop and deploy generative AI solutions on Google Cloud.
If you’re interested in learning about large language models, you’ll find the following courses helpful:
- Introduction to Generative AI
- Introduction to Large Language Models
- Generative AI Fundamentals
- Encoder-Decoder Architecture
- Attention Mechanism
- Transformer Models and BERT Model
- Generative AI Explorer - Vertex AI
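The attention mechanism covered in these courses can be sketched for a single query in a few lines: score each key against the query with a scaled dot product, normalize the scores with softmax, and take the weighted sum of the values. This pure-Python sketch uses tiny made-up vectors purely for illustration.

```python
# Scaled dot-product attention for one query vector, in pure Python:
# score = (q . k) / sqrt(d), softmax over scores, weighted sum of values.
import math

def softmax(xs):
    m = max(xs)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(query, keys, values):
    """Attend over `keys`/`values` with a single `query` vector."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    # Weighted sum of value vectors
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

q = [1.0, 0.0]
keys = [[1.0, 0.0], [0.0, 1.0]]
values = [[10.0, 0.0], [0.0, 10.0]]
print(attention(q, keys, values))  # output leans toward the first value vector
```

Real transformer attention batches this over many queries with learned projection matrices, but the core computation is the same.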
Introduction to Large Language Models with Google Cloud
Introduction to Large Language Models with Google Cloud is part of Udacity’s free course library and covers getting started with understanding and building LLM applications, including:
- Basics of large language models and use cases
- Prompt tuning
LLM University
LLM University by Cohere provides an easy-to-follow learning path: from the fundamentals of LLMs to building applications using them. The course covers:
- Concepts such as word and sentence embeddings
- Foundational concepts of large language models: transformers and the attention mechanism
- Applications of LLMs in text generation, classification, and analysis
- Building and deploying applications using Cohere’s endpoints
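The embedding idea in the first bullet — representing text as vectors so that similar meanings land close together — can be illustrated with cosine similarity. The vectors below are made-up toy values; real embeddings would come from a model or an embeddings API such as Cohere’s.

```python
# Cosine similarity between toy "embedding" vectors: related words should
# score closer to 1.0 than unrelated ones. The 3-d vectors are illustrative.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical toy embeddings (not produced by any real model)
emb = {
    "cat":    [0.9, 0.1, 0.0],
    "kitten": [0.85, 0.15, 0.05],
    "car":    [0.1, 0.9, 0.2],
}
print(cosine_similarity(emb["cat"], emb["kitten"]))  # high: related meanings
print(cosine_similarity(emb["cat"], emb["car"]))     # lower: unrelated
```

Semantic search, classification, and clustering over embeddings all reduce to comparisons like this one.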
Full Stack LLM Bootcamp
The Full Stack LLM Bootcamp covers it all: from prompt engineering to get the most out of GPT assistants, to deploying and monitoring LLM applications. Here’s an overview of what this bootcamp offers:
- Prompt engineering
- LLM foundations
- Augmented language models
- UX for Language User Interfaces
Here’s a post that breaks down the contents of this Full Stack LLM bootcamp.
Other Helpful Resources
Here are a few other interesting resources that’ll help you get up to speed with LLMs:
- State of GPT Talk: This talk by Andrej Karpathy at Microsoft Build 2023 provides a comprehensive overview of the training pipeline of GPT assistants, including tokenization, pretraining, fine-tuning, and reinforcement learning from human feedback.
- Practical Deep Learning for Coders, Part 2 by fast.ai: Lessons on attention and transformer models can be helpful.
- CS25: Transformers United V2 by Stanford is a series of lectures that covers the basics of transformers up to the most recent advances and applications of large language models—beyond common NLP tasks.
- LangChain Tutorials by Data Independent on YouTube: A series of short tutorials that use LangChain to build LLM apps over different custom data sources and explore a variety of tasks, including solving math problems, summarization, and search.
Hope you found this round-up of some of the best resources to learn large language models helpful. We had a great time putting together this listicle, and we hope you’re excited to learn and start building! Happy learning!
Bala Priya C is a developer and technical writer from India. She likes working at the intersection of math, programming, data science, and content creation. Her areas of interest and expertise include DevOps, data science, and natural language processing. She enjoys reading, writing, coding, and coffee! Currently, she's working on learning and sharing her knowledge with the developer community by authoring tutorials, how-to guides, opinion pieces, and more.