Automate Your Codebase with Promptr and GPT
Are you looking to streamline your code operations with GPT but tired of the endless copy-pasting? Promptr, an open-source tool for automating your codebase, may be the solution.
Image by Author
As the field of Artificial Intelligence grows and evolves, we have seen the rise of powerful tools like GPT, ChatGPT, Bard, etc. Programmers are using these tools to streamline their workflows and optimize their codebases, enabling them to focus more on building a program's core logic and less on mundane, repetitive tasks. However, programmers face a tedious cycle: copy-pasting their code into these models, collecting the recommendations, and then updating their codebase by hand. This procedure becomes tiresome for those who do it frequently.
Fortunately, there is now a solution to this problem. Let me introduce you to Promptr, an open-source command-line tool that allows programmers to automate changes to their codebase without leaving their editor. Sounds cool, right? If you'd like to know how this tool works, what it offers, and how to set it up, sit back and relax while I walk you through it.
What is Promptr?
Promptr is a CLI tool that makes applying GPT's code recommendations to your codebase much easier. You can refactor your code, implement classes to pass tests, experiment with LLMs, perform debugging and troubleshooting, and more, all with a single command. As per its official documentation:
“This is most effective with GPT4 because of its larger context window, but GPT3 is still useful for smaller scopes.” (Source - GitHub)
The tool accepts several space-separated parameters that specify the mode, template, prompt, and other settings for generating the output, followed by the context files:
promptr -m <mode> [options] <file1> <file2> <file3> ...
- -m, --mode <mode>: Specifies the mode to use (GPT-3 or GPT-4). The default mode is GPT-3.
- -d, --dry-run: An optional flag; the prompt is sent to the model, but no changes are written to your file system.
- -i, --interactive: Enables interactive mode, allowing the user to enter the prompt and other inputs at the terminal.
- -p, --prompt <prompt>: Used in non-interactive mode; it can be a string, or a URL/path to a file containing the prompt.
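Putting these flags together, a couple of sketch invocations might look like the following. This is a hedged example: the mode value "gpt4" and the file name app.js are illustrative assumptions (check `promptr --help` or the README for the exact accepted values), and the guard simply keeps the snippet safe to paste on a machine where Promptr isn't installed.

```shell
# Sketch: a GPT-4 dry run (preview only) and an interactive session.
# "gpt4" as the mode value and app.js are illustrative assumptions.
dry_run='promptr -m gpt4 -d -p "Add input validation to every function" app.js'
interactive='promptr -i app.js'

if command -v promptr >/dev/null 2>&1; then
  # Promptr is available: actually perform the dry run.
  eval "$dry_run"
else
  # Promptr is not installed here: just print the commands you would run.
  echo "$dry_run"
  echo "$interactive"
fi
```

Because of the -d flag, the first command only previews GPT's suggestions; drop it when you are ready to let Promptr write to your files.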
Similarly, you can use some other options mentioned on their GitHub repository depending on your use case. Now, you might be wondering how it all happens under the hood. So, let's explore that.
How does Promptr Work?
Image by Author
The first thing you do is clean your working area and commit any pending changes. Then, you write a prompt with clear instructions, as if you were explaining the task to an inexperienced co-worker. After that, you specify the context to send along with your prompt to GPT. Please note that the prompt is your instruction to GPT, while the context refers to the files GPT needs to know about to perform the codebase operations. For instance:
promptr -p "Cleanup the code in this file" index.js
Here index.js is the context, while "Cleanup the code in this file" is your prompt to GPT. Promptr sends both to GPT and waits for the response, which may take some time. The response generated by GPT is first parsed by Promptr, after which the suggested changes are applied to your file system. And that's it! Simple, yet very useful.
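The steps above can be sketched as a single safe loop. This is an illustrative sequence, not official usage: it assumes a git repository and an installed Promptr, and the guard keeps it harmless to run anywhere.

```shell
# Sketch of a safe Promptr loop: checkpoint, preview, apply, review.
step1='git add -A && git commit -m "checkpoint before promptr"'
step2='promptr -d -p "Cleanup the code in this file" index.js'  # dry run first
step3='promptr -p "Cleanup the code in this file" index.js'     # then apply
step4='git diff HEAD'                                           # review the changes

if command -v promptr >/dev/null 2>&1; then
  eval "$step1" && eval "$step2" && eval "$step3" && eval "$step4"
else
  # Promptr not installed: just print the workflow.
  printf '%s\n' "$step1" "$step2" "$step3" "$step4"
fi
```

Committing first means that even if GPT's suggestions delete something you needed, git can restore it.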
Setting up Promptr for Automating your Codebase
Here are the steps to set up Promptr on your local computer:
Open the terminal or command line window. Install the Promptr globally by running either of the below-mentioned commands depending on the package manager that you are using:
npm install -g @ifnotnowwhen/promptr
yarn global add @ifnotnowwhen/promptr
You can also install Promptr by copying the binary for the current release to your path but it is only supported for macOS users as of now.
Once the installation is complete, you can verify it by checking that the promptr executable is now on your PATH:
command -v promptr
Setting OpenAI API Key
You will need an OpenAI API key to use Promptr. If you don’t have one, you can sign up for a free account, which includes up to $18 in free credits.
Once you get your secret key, you have to set an environment variable ‘OPENAI_API_KEY’.
For Mac or Linux:
export OPENAI_API_KEY=<your secret key>
For Windows:
Click “Edit the system environment variables” to add a new variable ‘OPENAI_API_KEY’ and set its value to the secret key that you received from your OpenAI account.
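On Mac or Linux, it is worth confirming that the variable is actually exported and will be visible to child processes such as Promptr. The key below is a placeholder, not a real key:

```shell
# Placeholder value for illustration only -- substitute your real secret key.
export OPENAI_API_KEY="sk-your-secret-key"

# Verify the variable is set before invoking promptr.
if [ -n "$OPENAI_API_KEY" ]; then
  echo "OPENAI_API_KEY is set"
else
  echo "OPENAI_API_KEY is missing" >&2
fi
```

To make the setting permanent, add the export line to your shell profile (for example, ~/.bashrc or ~/.zshrc).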
Although it allows developers to operate on their code the way they maintain their text files, this technology is still in its early stages and has some drawbacks. For example, there is a potential for data loss if GPT recommends deleting files, so it is advisable to commit your important work before using it. Similarly, some people have expressed concern about the per-token cost of using the OpenAI API. Nonetheless, I wonder how far we are from software that can repair itself. If you want to experiment with it, here is the link to the official GitHub repository - Promptr.
Kanwal Mehreen is an aspiring software developer with a keen interest in data science and applications of AI in medicine. Kanwal was selected as the Google Generation Scholar 2022 for the APAC region. Kanwal loves to share technical knowledge by writing articles on trending topics, and is passionate about improving the representation of women in the tech industry.