become a 1000x engineer or die tryin'


 

For millennia, programmers have searched far and wide for the vaunted 10X Engineer. Unfortunately, due to inflation, real and imagined, 10X just won’t cut it anymore. We need bigger gains, bigger wins, more code, more PRs, more lines, less linting, etc. Therefore, in this article I’ll cover how to catapult your productivity to the heavens via a series of command-line wrapper functions around the OpenAI API.

First off, you’ll need an OpenAI API key. You can get this by going here and signing up.
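
The wrapper functions below read the key from an environment variable. In fish, one way to set that up is a universal exported variable (the placeholder value is, of course, your own key):

fish
# export the key once; universal variables persist across sessions and shells
set -Ux OPENAI_API_KEY "sk-your-key-here"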

 

Fair warning: I don’t use bash (or, technically, zsh now on Mac). I use Fish. Heartbreaking, I know. Either way, the gist of it will largely remain the same, and it is a good exercise to implement it yourself in bash or zsh.

 

Nevertheless, let’s get started.

 

Hey GPT

The first thing we need is a way to get answers to any and every question we may have.

fish
# model: gpt-4 is in private beta (have to get from waitlist)
# model: gpt-3.5-turbo (if you don't have access)

function hey_gpt
    # join all of the arguments into a single prompt string (no quoting needed)
    set prompt (echo $argv | string join ' ')
    # request a streamed completion; each chunk comes back as an SSE "data: {...}" line
    set gpt (curl https://api.openai.com/v1/chat/completions -s \
    -H "Content-Type: application/json" \
    -H "Authorization: Bearer $OPENAI_API_KEY" \
    -d '{
        "model": "gpt-4",
        "messages": [{"role": "user", "content": "'$prompt'"}],
        "temperature": 0.7,
        "stream": true
    }')
    for text in $gpt
        if test $text = 'data: [DONE]'
            break
        else if string match -q --regex "role" $text
            continue
        else if string match -q --regex "content" $text
            # strip the SSE prefix and print only the streamed token
            echo -n $text | string replace 'data: ' '' | jq -r -j '.choices[0].delta.content'
        else
            continue
        end
    end
end

One other quick note: if you are trying to replicate this, you will need jq installed. If you don’t have it, you should.
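
On macOS, for instance, Homebrew has it; most Linux package managers carry it as well:

fish
brew install jq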

I’m not going to dive super deep into tuning the model parameters — you can read more about that here.

What I will say, though, is that I alias this function to the letter h so that questions can be fired off straight from the prompt (and with the code above, there is no longer any need to wrap the input in quotes).
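
Setting that up is two lines in fish (funcsave persists the wrapper across sessions):

fish
# wrap hey_gpt in a single-letter function and save it
alias h hey_gpt
funcsave h

# now asking is as cheap as possible, no quotes required
h why is my makefile rebuilding everything every time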

[terminal screenshots: example h prompts (bs, chip, make)]

 

Nice, nice…now onto the real gains.

 

Data GPT

One common pattern that I’ve observed myself using is a mixture of prompt + data.

fish
function data_gpt -a prompt data
    # join prompt and data into one line (multi-line data gets flattened onto a single line)
    set prompt_input (echo "$prompt: $data" | string join ' ')

    curl https://api.openai.com/v1/chat/completions -s \
    -H "Content-Type: application/json" \
    -H "Authorization: Bearer $OPENAI_API_KEY" \
    -d '{
        "model": "gpt-4",
        "messages": [{"role": "user", "content": "'$prompt_input'"}],
        "temperature": 0.7
    }' | jq -r '.choices[0].message.content'
end
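
To give a feel for it, a couple of hypothetical invocations (the file names are made up; string collect keeps the file contents together as a single argument, and the function then flattens it onto one line):

fish
# summarize or transform local data
data_gpt 'write an awk one-liner that averages the third column of this CSV' (cat stats.csv | string collect)

# or lean on another CLI for the data half of the prompt
data_gpt 'write a short changelog entry for these changes' (git diff --staged | string collect)

# note: double quotes or backslashes in the data can still break the hand-rolled JSON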

[terminal screenshots: prompting against a generated NBA dataset (link to generated dataset) and the answer, plus awk, python, and git examples]

Obviously, the use cases here are pretty vast. I experimented with another version that could read piped-in input, but the escaping was messy, and for now it seems cleaner to write the data to a file and cat it in.

So far these examples are fairly similar to the experience you get using ChatGPT. However, there are some differences: (1) I find that having it exposed at the CLI level makes me more likely to experiment with GPT; (2) these functions can be used inside other commands, for instance alongside GitHub’s CLI or this Jira CLI; and (3) you can chain multiple invocations together, which gives some of the feel of tools like LangChain (a toy example follows).
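
Purely as an illustration of the chaining idea:

fish
# the answer to the first prompt becomes the data for the second
set outline (h outline a minimal retry wrapper for curl in fish | string collect)
data_gpt 'turn this outline into a numbered checklist' $outline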

 

IMG GPT

One final, meme-worthy example uses the Create Image endpoint.

fish
function img_gpt -a prompt
    set create_img (curl https://api.openai.com/v1/images/generations -s \
    -H "Content-Type: application/json" \
    -H "Authorization: Bearer $OPENAI_API_KEY" \
    -d '{
        "prompt": "'$prompt'",
        "n": 1,
        "size": "1024x1024"
    }')
    echo $create_img | jq
    set url (echo $create_img | jq -r '.data[0].url')
    set rand_num (random 1 1000000)
    curl -s $url -o img-"$rand_num".png
end
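
Usage follows the same pattern; since the function takes a single argument, quote the prompt:

fish
img_gpt 'a golden retriever in a spacesuit floating above earth, digital art'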

[screenshots: an img_gpt run for a dog prompt and the generated image]

 

Wacky

Another useful endpoint is Code Edits. It does, more or less, what it says: you can ask it for revisions on code (improve runtime complexity, rewrite as a one-liner, add a docstring, etc.). However, instead of hand-writing a fish wrapper for it, we’ll use what we’ve already written. A quick glimpse into the near future:

fish
function openai_edits_api
    h 'can you generate a golang script that reads from stdin and sends that to the OpenAI Code Edits API endpoint - include only the code nothing else' | string replace '```' '' > openai_edits_api.go
    h 'can you generate the commands to build and run the golang script - only include the commands' | string replace '```' '' > openai_edits_api.sh
    data_gpt 'can you generate some tests for the following golang script' (cat openai_edits_api.go | string collect) | string replace '```' '' > openai_edits_api_test.go
    data_gpt 'can you generate a makefile for a golang project with the following files' (ls) | string replace '```' '' > Makefile
end

It’s a little wild, but that will effectively get you 90% of the way there, all via plain sentences and a little light cleanup.
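
For comparison, a hand-rolled fish wrapper around the same Edits endpoint is only a few lines. This is a rough sketch (the edit_gpt name and the jq-based escaping are my own additions, not the generated script); it assumes the code-davinci-edit-001 model:

fish
# minimal sketch: send a file plus an instruction to the Edits endpoint
# usage: edit_gpt 'add a docstring to every function' main.py
function edit_gpt -a instruction file
    # build the JSON with jq so quotes and newlines in the code survive escaping
    jq -n --arg inst "$instruction" --rawfile input "$file" '{
        model: "code-davinci-edit-001",
        input: $input,
        instruction: $inst
    }' | curl https://api.openai.com/v1/edits -s \
        -H "Content-Type: application/json" \
        -H "Authorization: Bearer $OPENAI_API_KEY" \
        -d @- | jq -r '.choices[0].text'
end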

 

Conclusion

One last thing to consider is that this is really only the beginning. Last week OpenAI announced Plugins: literal steroids for an already overpowered workhorse. I think the real differentiator in the coming months and years will be developer productivity. Good, bad, or indifferent, there seems to be an unavoidable advantage for those who aggressively adopt workflows that lean into AI versus fighting, or ignoring, it. I think there is an oncoming paradigm of rapid prototyping combined with simulation that will help isolate and test the various components of a project, along with the ability to translate between languages and stay productive in unfamiliar ones. For someone with a strong grasp of the fundamentals, the boundary of what is possible keeps getting fuzzier. I don’t think this replaces programmers; rather, it provides interesting new ways to attack larger problems in murkier domains.

Hopefully this article opened your mind to some new workflows and potential time savings!