Transparent GPT-5 Support in My Clojure ChatGPT Library

🐾 Pyjama, the Ollama/ChatGPT Clojure Wrapper Now Speaks Fluent GPT-5 (and Still Recognizes Cats)

It’s always a good day when your code gets smarter — and an even better day when it can still identify a cute kitten.

With OpenAI rolling out the GPT-5 family (gpt-5, gpt-5-mini, gpt-5-nano), I wanted my Clojure ChatGPT wrapper to transparently support the new models without breaking a sweat.


Calling GPT-5 Is This Simple

Whether you’re sending text or adding vision, the call looks almost the same:
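As a hypothetical sketch of what such a call could look like: the namespace, function name, and option keys below are assumptions for illustration, not Pyjama's confirmed API.

```clojure
;; Hypothetical sketch: pyjama.chatgpt.core, chat, and the option keys
;; are assumed names, not Pyjama's documented API.
(require '[pyjama.chatgpt.core :as chatgpt])

;; plain text
(chatgpt/chat {:model  "gpt-5-mini"
               :prompt "What animal is in this picture?"})

;; adding vision: same call, just pass the image along
(chatgpt/chat {:model  "gpt-5"
               :prompt "What animal is in this picture?"
               :images ["kitten.png"]})
```

The point is that switching from a GPT-4 era model to any of the gpt-5 family is only a change of the `:model` value.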

Ollama Clojure Frontend - Breeze

Introduction

Breeze is a small Ollama frontend with few features, except that it just works. It lets you talk to Ollama in real time, with a really small memory/disk footprint.

../13_01.png

Prerequisites are:

  • Ollama, to run the models
  • Docker, to run this image from Docker Hub.

Start the docker image

docker run -it --rm -p 3000:3000 hellonico/breeze

Breeze UI

The setup screen includes:

  • Ollama URL
  • Ollama model
  • System prompt

../13_02.png

ClojureScript and Dynamic!? Recharts

I simply forgot how easy it was to use ClojureScript for charting.

Someone got in touch with me recently asking for an example of how things work these days, so here it is.

The data set is an atom, so that the Reagent/React reactive rendering framework can eventually be used at its best.

(defonce data (r/atom [  {:name "Page A" :uv 4000 :pv 2400}
                         {:name "Page B" :uv 3000 :pv 1398}
                         {:name "Page C" :uv 2000 :pv 9800}
                         {:name "Page D" :uv 2780 :pv 3908}
                         {:name "Page E" :uv 1890 :pv 4800}
                         {:name "Page F" :uv 2390 :pv 3800}
                         {:name "Page G" :uv 3490 :pv 4300}]))

The chart is a two-line chart: one line for the uv series, and one for the pv series.
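Such a two-line chart can be sketched as a Reagent component wrapping Recharts. This is a minimal sketch assuming the recharts npm package and shadow-cljs style string requires; `data` is the Reagent atom defined above.

```clojure
;; Sketch, assuming shadow-cljs requires of the recharts npm package.
(ns charts.core
  (:require ["recharts" :refer [LineChart Line XAxis YAxis
                                CartesianGrid Tooltip Legend]]))

(defn two-line-chart []
  ;; Dereferencing the atom here makes Reagent re-render the chart
  ;; whenever the data changes.
  [:> LineChart {:width 600 :height 300 :data (clj->js @data)}
   [:> CartesianGrid {:strokeDasharray "3 3"}]
   [:> XAxis {:dataKey "name"}]
   [:> YAxis]
   [:> Tooltip]
   [:> Legend]
   [:> Line {:type "monotone" :dataKey "uv" :stroke "#8884d8"}]
   [:> Line {:type "monotone" :dataKey "pv" :stroke "#82ca9d"}]])
```

Because `@data` is read inside the component, swapping new points into the atom is all it takes to animate the chart.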

What surprising skills do leaders need to innovate with generative AI?

This article from the Guardian piqued my curiosity.

“What surprising skills do leaders need to innovate with generative AI?”

So I asked my model to come up with a mindmap summary of the article's main concepts:

../12_02.png

It’s a bit short on details and underlying concepts.

Asking the same question directly to the model was (not so) surprisingly more verbose and interesting.

Visualizing Large Language Model Outputs with Mindmaps and ClojureScript

Introduction

In recent years, Large Language Models (LLMs) have revolutionized the field of natural language processing. These models can generate human-like text based on a given input, making them a valuable tool for various applications, including text analysis, sentiment analysis, and more. However, one major limitation of LLM outputs is their lack of visual representation. This makes it difficult to understand complex relationships between ideas or concepts.

To overcome this limitation, we can use mindmaps to visualize the output of LLMs. Mindmaps are a graphical representation of ideas, concepts, and relationships, making them an ideal tool for understanding complex information. In this blog post, we will explore how to integrate ClojureScript with LLM outputs using Markmap, a JavaScript library that allows us to convert Markdown text to mindmaps.
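The Markdown-to-mindmap step can be sketched in ClojureScript as follows. This assumes the markmap-lib and markmap-view npm packages with shadow-cljs style requires; the `Transformer` and `Markmap.create` names follow Markmap's documented JavaScript API, but treat the wiring here as a sketch rather than a finished integration.

```clojure
;; Sketch, assuming the markmap-lib and markmap-view npm packages.
(ns mindmap.core
  (:require ["markmap-lib" :refer [Transformer]]
            ["markmap-view" :refer [Markmap]]))

(def transformer (Transformer.))

(defn render-mindmap!
  "Transform `markdown` (e.g. an LLM answer) into a mindmap tree and
  render it into the given <svg> DOM element."
  [markdown svg-el]
  (let [result (.transform transformer markdown)]
    ;; Markmap.create takes the target svg, options (nil = defaults),
    ;; and the transformed root node.
    (.create Markmap svg-el nil (.-root result))))
```

With this in place, the LLM output only needs to be prompted into Markdown bullet form before being handed to `render-mindmap!`.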

Using OpenRouter to access large models library

OpenRouter makes it super easy to experiment with, and compare answers from, the large set of hosted models it makes available through a single API.

; claude-write-fibonacci

(require '[pyjama.openrouter.core :as openrouter])
(let [response (openrouter/with-prompt "Write Fibonacci in Clojure!")]
  (println
    (-> response :choices first :message :content)))

I actually wanted to evaluate answers from anthropic/claude-3.7-sonnet, so that model is the default one for the OpenRouter wrapper.

With a bit of rewriting, we can get something vastly more generic, where both the prompt and the model are passed to openrouter/with-config.
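A call through the more generic entry point could look like the sketch below. The exact config keys accepted by openrouter/with-config are an assumption here; only the namespace and the response shape are taken from the example above.

```clojure
;; Sketch: the :model and :prompt keys passed to with-config are
;; assumed, not confirmed from Pyjama's documentation.
(require '[pyjama.openrouter.core :as openrouter])

(let [response (openrouter/with-config
                 {:model  "anthropic/claude-3.7-sonnet"
                  :prompt "Write Fibonacci in Clojure!"})]
  (println (-> response :choices first :message :content)))
```

Swapping models then becomes a one-line change to the config map, which is exactly what makes side-by-side evaluation of OpenRouter models convenient.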