5 LLM Tools I Can’t Live Without
In this article, I share the five essential LLM tools that I currently find indispensable, and which have the potential to help revolutionize the way you work.
Image by Author | Canva
Large language models (LLMs) have transformed, and continue to transform, the AI and machine learning landscape, offering powerful tools to improve workflows and boost productivity across a wide array of domains. I work with LLMs a lot, and have tried out all sorts of tools that help take advantage of the models and their potential. While my favorites at any given time will obviously shift, there is a core group that I find myself sticking with, even as time goes on.
In this article, I share the five essential LLM tools that I currently find indispensable, and which have the potential to help revolutionize the way you work: LlamaIndex, Ollama, Ollama UI, NotebookLM, and ControlFlow.
LlamaIndex
What it is: LlamaIndex is a framework specifically designed for building data-centric applications powered by LLMs. Optimized for building retrieval-augmented generation (RAG) systems, LlamaIndex is my go-to option for anything beyond prototyping a RAG setup. I've built a number of useful applications with LlamaIndex and its core components.
How it works: This powerful tool allows you to connect your data to generative AI, simplifying the process of data ingestion, parsing, indexing, and retrieval across a wide range of sources. LlamaIndex works seamlessly with over 40 vector stores, LLMs, and data sources.
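To give a sense of how little code a basic RAG pipeline takes, here is a minimal sketch using LlamaIndex's core starter API. It assumes a local folder of documents named data and a default LLM backend configured via environment variables, and exact imports can vary by version:

from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Ingest and parse local documents (assumes a ./data directory of files)
documents = SimpleDirectoryReader("data").load_data()

# Index the documents into an in-memory vector store
index = VectorStoreIndex.from_documents(documents)

# Retrieve relevant chunks and have the LLM answer over them
query_engine = index.as_query_engine()
response = query_engine.query("What are the key points across these documents?")
print(response)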
TL;DR: If a RAG app is what I'm looking to build, LlamaIndex is my top choice.
Ollama
What it is: Ollama is a platform that makes it easy to run a variety of LLMs locally on your computer. Don't want to run an LLM in the cloud? Want the security of a local model? Ollama is my top choice for local hosting. Along with being able to run models and interact with them directly within the Ollama command line interface, you can use Ollama to serve models for other applications to interact with, which is a huge advantage.
How it works: With a single command, you can download and run a whole host of models on your own hardware. Want to test out the new Llama 3.2 3B? This will do it for you:
ollama run llama3.2
And once it's running, you can send prompts directly from the terminal. More importantly, you can connect your own scripts to the served model via the Ollama Python library and write your own AI apps.
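As a rough sketch of what that looks like (assuming the ollama Python package is installed, the Ollama server is running, and llama3.2 has already been pulled), a single chat call is about all it takes:

import ollama

# Send one chat turn to the locally served model
response = ollama.chat(
    model="llama3.2",
    messages=[{"role": "user", "content": "Explain what an LLM is in one sentence."}],
)

print(response["message"]["content"])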
TL;DR: To run local models quickly and easily, it's always Ollama.
Ollama UI
What it is: Have you downloaded a language model and want to get it up and running NOW? Don't want to write any code yourself, but don't want to use the Ollama CLI to test your new LLM? Ollama UI is an independently developed, simple chat interface for your Ollama models. With no configuration needed (or even possible), Ollama UI is the easiest no-frills chat client for your local models, one that just works right out of the box.
How it works: The app makes interacting with Ollama even more intuitive and straightforward, streamlining the management and use of your models and chats. I have not found a faster local LLM interface app. Plus, it's available as a Chrome extension, so installation and usage could not be easier.
TL;DR: Get chatting with your local models immediately with Ollama UI.
NotebookLM
What it is: NotebookLM is an AI-powered notebook from Google. It's been around in experimental form for some time, but it got a big injection of cool last week when it introduced the "audio overview" functionality, which allows users to create a deep-dive conversation in podcast form based on the sources referenced in their notebook.
How it works: Let's back up a bit here: NotebookLM allows users to create individual notebooks, which are collections of sources, chats, summaries, FAQs, study guides, and a host of other AI-generated artifacts based on those sources. Users can summarize documents, generate ideas, and write different kinds of creative text formats, and can also add their own human-written notes to these notebooks. Think of it as an app like Microsoft OneNote, but with deep AI integration and a RAG focus.
TL;DR: For help grokking source material in a variety of innovative and useful ways, NotebookLM is a must-have.
ControlFlow
What it is: ControlFlow is a framework for building agentic AI workflows in Python. I was just recently turned on to ControlFlow by this article by Abid Ali Awan on our sister site Machine Learning Mastery. Simple to use and easy to get started with, ControlFlow is a great starter choice for agentic AI.
How it works: As a framework for creating structured AI workflows, ControlFlow allows users to define discrete tasks, assign specialized AI agents, and combine tasks into flows. This approach lets developers harness AI for complex applications while maintaining fine-grained control and oversight. It's task-focused, which makes it intuitive to use, and its syntax is elegant. The real winner, IMO, is how quickly it lets you prototype. In that regard, it fits in well with the other tools on this list.
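To give a flavor of that task-focused style, here is a minimal sketch along the lines of ControlFlow's getting-started examples. The agent name, instructions, and task wording are illustrative, it assumes a default model backend (such as an OpenAI API key) is configured, and the exact API can vary by version:

import controlflow as cf

# A specialized agent with its own instructions (name and instructions are illustrative)
reviewer = cf.Agent(
    name="Reviewer",
    instructions="Summarize technical text clearly for a general audience.",
)

# A flow groups related tasks into a single workflow
@cf.flow
def summarize_flow(text: str) -> str:
    # A discrete task, assigned to the specialized agent
    return cf.run(
        "Summarize the provided text in two sentences",
        agents=[reviewer],
        context={"text": text},
    )

print(summarize_flow("ControlFlow is a Python framework for building agentic AI workflows."))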
TL;DR: If I want easy agentic AI, ControlFlow is my new go-to.
Summary
The five LLM tools discussed — LlamaIndex, Ollama, Ollama UI, NotebookLM, and ControlFlow — have the potential to revolutionize workflows across various tasks, unlocking significant gains in efficiency, productivity, and innovation. I've settled on these tools for my workflows, and I hope some of you find them useful as well.
Matthew Mayo (@mattmayo13) holds a master's degree in computer science and a graduate diploma in data mining. As managing editor of KDnuggets & Statology, and contributing editor at Machine Learning Mastery, Matthew aims to make complex data science concepts accessible. His professional interests include natural language processing, language models, machine learning algorithms, and exploring emerging AI. He is driven by a mission to democratize knowledge in the data science community. Matthew has been coding since he was 6 years old.