Generative AI workflows and hacks 2025

March 23, 2024 — February 18, 2025

economics
faster pussycat
innovation
language
machine learning
neural nets
NLP
stringology
technology
UI

I’ll try to synthesise LLM research elsewhere. This is where I keep ephemeral notes and links, continuing my habit from 2024.

1 The year I finally install local LLMs

1.1 …via Ollama for autonomous LLM inference

Useful guides:

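To make this concrete, here is a minimal sketch of the Ollama workflow; the model name is illustrative, so substitute whatever you have pulled locally.

# install (macOS via Homebrew; Linux via the official script at ollama.com)
brew install ollama

# pull a model and chat with it interactively
ollama pull llama3.2
ollama run llama3.2

# Ollama also serves an HTTP API on localhost:11434
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3.2", "prompt": "Why is the sky blue?", "stream": false}'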
1.2 …via Simon Willison

He develops LLM, a CLI utility and Python library for interacting with Large Language Models.

If you like command-line wizardry, this is neat.

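A minimal sketch of day-to-day use (the model names are illustrative; llm models lists what is actually available on your install):

pip install llm                     # or: brew install llm
llm keys set openai                 # paste an API key once; it is stored for later use
llm 'Five fun names for a pet pelican'

# swap models with -m, add a system prompt with -s
llm -m gpt-4o-mini -s 'Answer in one sentence' 'What is the GIL?'

# plugins add local backends, e.g. models already pulled into Ollama
llm install llm-ollama
llm -m llama3.2 'Summarise this repo layout'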
It has powerful tricks, such as Apple Mac acceleration, and this kind of stunt: pillow.md

git clone https://github.com/python-pillow/Pillow
cd Pillow
files-to-prompt -c . -e .py -e c -e h | \
  llm -m gemini-2.0-pro-exp-02-05 \
  -s 'Explain how Pillow interacts with the Python GIL - include example code snippets as part of your explanation'

1.3 … via LM Studio

A GUI option:

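Besides the chat GUI, LM Studio can run a local server that speaks the OpenAI API. A minimal sketch, assuming the server has been started from the GUI on its default port (1234) and a model is already loaded:

# query LM Studio's local OpenAI-compatible endpoint
curl http://localhost:1234/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{
    "model": "local-model",
    "messages": [{"role": "user", "content": "Explain beam search in two sentences."}],
    "temperature": 0.7
  }'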
1.4 Virtually via proxy

2 Moar automation

OpenInterpreter/open-interpreter: A natural language interface for computers.

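A minimal sketch of getting it running; the flags are illustrative, so check interpreter --help for the current options, and note that it asks for confirmation before executing code unless told otherwise.

pip install open-interpreter

# start an interactive natural-language shell session
interpreter

# point it at a specific hosted model, or at a local backend
interpreter --model gpt-4o-mini
interpreter --local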
3 What happened previously

4 Economics

5 DeepSeek and the cheapening of LLMs

Figure 2: Looks like AI Safety is going fine in GitHub Copilot.

6 Using AI Agents

I seem to be doing a lot of it.

See AI Agents for more.