Impro: Improvisation and the Theatre
By Keith Johnstone
My GoodReads rating from 2015:
Our afternoon "shake-off" at Woebot:
Because apparently I document everything...
Full rules: github.com/pamelafox/improvlists/Group-Story-Spine.md
We need the computer to generate a creative, human-like response.
Possible tools:
Let's try these models...
Company | Model | Parameters | Host |
---|---|---|---|
OpenAI | gpt-4o-mini | ?? | OpenAI.com, Azure, GitHub Models |
Meta | Llama 3.1 | 8B, 70B, 405B | Azure, GitHub Models, Ollama |
Microsoft | Phi-3 | 3.8B, 14B | Azure, GitHub Models, Ollama |
Mistral | Mistral Small | ?? | Azure, GitHub Models, Ollama |
Ollama is a tool for easily running local LLMs on your computer.
You can also run it from GitHub Codespaces: 🔗 github.com/pamelafox/ollama-python-playground
We can call Ollama programmatically using:
HTTP, ollama package, or openai package.
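The plain-HTTP route needs nothing beyond the standard library. A minimal sketch against Ollama's `/api/chat` endpoint, assuming Ollama is running locally on its default port with `llama3.1` pulled (the function names here are illustrative):

```python
import json
import urllib.request

def build_chat_payload(model: str, system: str, user: str) -> dict:
    """Build the JSON body for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "stream": False,  # ask for one complete response, not chunks
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": user},
        ],
    }

def suggest_first_line(host: str = "http://localhost:11434") -> str:
    payload = build_chat_payload(
        "llama3.1",
        "You're playing improv games.",
        "Suggest a random first line.",
    )
    request = urllib.request.Request(
        host + "/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["message"]["content"]
```

The `ollama` and `openai` packages wrap this same server with less ceremony.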
GitHub Models are freely available models with very low rate limits. 😢 Currently waitlist-only; sign up at: github.com/marketplace/models
We can call GitHub Models programmatically using:
HTTP, azure-ai-inference package, or openai package.
Since the openai package can be used for both Ollama and GitHub models, we'll use it for most examples.
import os
import openai

# GitHub Models endpoint; for Ollama, use base_url="http://localhost:11434/v1"
client = openai.OpenAI(
    base_url="https://models.inference.ai.azure.com",
    api_key=os.environ["GITHUB_TOKEN"],
)
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "You're playing improv games."},
        {"role": "user", "content": "Suggest a random first line."},
    ],
)
print(response.choices[0].message.content)
Full code: example.py
Rules: Group-Story-Spine.md
Code: storyspine.py
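One way the game loop could work, sketched with the classic story-spine openers. This is a hedged approximation, not the contents of storyspine.py: the helper names are illustrative, and `client` stands for an `openai.OpenAI`-style chat client.

```python
# The classic story-spine openers (Kenn Adams); one model turn per opener.
SPINE = [
    "Once upon a time...",
    "Every day...",
    "But one day...",
    "Because of that...",
    "Until finally...",
    "And ever since then...",
]

def spine_prompt(opener: str, story_so_far: str) -> str:
    """Ask for one sentence that continues the story with the given opener."""
    return (
        f"Continue this story with one sentence starting with '{opener}'.\n"
        f"Story so far:\n{story_so_far}"
    )

def play_story_spine(client, model: str) -> str:
    # client: an openai.OpenAI-style chat client
    story = ""
    for opener in SPINE:
        response = client.chat.completions.create(
            model=model,
            messages=[
                {"role": "system", "content": "You're playing improv games."},
                {"role": "user", "content": spine_prompt(opener, story)},
            ],
        )
        story += response.choices[0].message.content + "\n"
    return story
```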
Takeaways:
Rules: Sentence-at-a-time-Story.md
Code: sentenceatatime.py
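In this game the human and the model alternate sentences, so the running story lives naturally in the chat history. A hedged sketch of that shape (the helper names are illustrative, and `client` stands for an `openai.OpenAI`-style chat client; the real logic lives in sentenceatatime.py):

```python
def add_turn(messages: list, role: str, sentence: str) -> list:
    """Append one story sentence to the chat history."""
    return messages + [{"role": role, "content": sentence}]

def play_sentence_at_a_time(client, model: str, turns: int = 3) -> list:
    # client: an openai.OpenAI-style chat client
    messages = [{
        "role": "system",
        "content": "We're telling a story one sentence at a time. "
                   "Reply with exactly one sentence that continues it.",
    }]
    for _ in range(turns):
        messages = add_turn(messages, "user", input("Your sentence: "))
        response = client.chat.completions.create(model=model, messages=messages)
        reply = response.choices[0].message.content
        print("Model:", reply)
        messages = add_turn(messages, "assistant", reply)
    return messages
```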
Rules: Tweet-Generator.md
Code: tweetgenerator.py
Approaches:
Rules: Yes,And-Product-Factory.md
Code: yesandproduct.py
Yes, with multimodal models!
Rules: 3-Things!.md
Code: threethings_simple.py
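For the multimodal variant, the openai package accepts a content list that mixes text and images in one user message. A sketch of building such a message from a local drawing (the helper name and PNG assumption are illustrative):

```python
import base64

def image_message(prompt: str, image_path: str) -> dict:
    """Build a user message carrying text plus an inline base64-encoded image."""
    with open(image_path, "rb") as f:
        data = base64.b64encode(f.read()).decode("ascii")
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": prompt},
            # Multimodal models accept images as data URIs via image_url parts
            {"type": "image_url",
             "image_url": {"url": f"data:image/png;base64,{data}"}},
        ],
    }
```

Pass the result in the `messages` list exactly like a text-only message.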
Yes, but use function calling, if the model supports it.
Ollama blog: Tool support for language models
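A hedged sketch of what function calling could look like here, using the openai package's `tools` parameter to force exactly three things back as structured JSON (the tool name and schema are illustrative, and `client` stands for an `openai.OpenAI`-style chat client):

```python
import json

# JSON-schema description of the function the model is asked to "call"
THREE_THINGS_TOOL = {
    "type": "function",
    "function": {
        "name": "list_three_things",
        "description": "List exactly three things that fit the category.",
        "parameters": {
            "type": "object",
            "properties": {
                "things": {
                    "type": "array",
                    "items": {"type": "string"},
                    "description": "Exactly three items",
                }
            },
            "required": ["things"],
        },
    },
}

def three_things(client, model: str, category: str) -> list:
    # client: an openai.OpenAI-style chat client
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": "You're playing improv games."},
            {"role": "user", "content": f"Name three things: {category}"},
        ],
        tools=[THREE_THINGS_TOOL],
        # Force the model to call our tool rather than reply in prose
        tool_choice={"type": "function",
                     "function": {"name": "list_three_things"}},
    )
    call = response.choices[0].message.tool_calls[0]
    return json.loads(call.function.arguments)["things"]
```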
Add few-shot examples to improve response quality.
Langchain blog: Few-shot prompting to improve tool calling
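Few-shot examples are just extra user/assistant turns placed before the real prompt, so the model imitates their style. A sketch (the example lines are invented):

```python
# Invented example exchanges showing the style we want back
FEW_SHOT_EXAMPLES = [
    ("Suggest a random first line.",
     "The lighthouse keeper found a piano in the sand."),
    ("Suggest a random first line.",
     "Grandma's parrot started quoting the weather report."),
]

def build_few_shot_messages(system: str, user: str) -> list:
    """System prompt, then example turns, then the real user prompt."""
    messages = [{"role": "system", "content": system}]
    for example_user, example_assistant in FEW_SHOT_EXAMPLES:
        messages.append({"role": "user", "content": example_user})
        messages.append({"role": "assistant", "content": example_assistant})
    messages.append({"role": "user", "content": user})
    return messages
```

Pass the result as the `messages` argument to `client.chat.completions.create`.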
Other options: