Visualizing Large Language Model Outputs with Mindmaps and ClojureScript

Introduction

In recent years, Large Language Models (LLMs) have revolutionized the field of natural language processing. These models can generate human-like text based on a given input, making them a valuable tool for various applications, including text analysis, sentiment analysis, and more. However, one major limitation of LLM outputs is their lack of visual representation. This makes it difficult to understand complex relationships between ideas or concepts.

To overcome this limitation, we can use mindmaps to visualize the output of LLMs. Mindmaps are a graphical representation of ideas, concepts, and relationships, making them an ideal tool for understanding complex information. In this blog post, we will explore how to integrate ClojureScript with LLM outputs using Markmap, a JavaScript library that allows us to convert Markdown text to mindmaps.
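As a taste of what that integration looks like, here is a minimal ClojureScript sketch (assuming a shadow-cljs build with the markmap-lib and markmap-view npm packages; the namespace and function names are my own, not from the original post):

(ns app.mindmap
  (:require ["markmap-lib" :refer [Transformer]]
            ["markmap-view" :refer [Markmap]]))

; Transform the Markdown produced by the LLM and render it as a mindmap
; inside an existing SVG element on the page.
(defn render-mindmap! [markdown svg-el]
  (let [transformer (Transformer.)
        result      (.transform transformer markdown)]
    (.create Markmap svg-el nil (.-root result))))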

Using OpenRouter to access a large library of models

OpenRouter makes it super easy to get top results by tapping into the large set of models it makes available.

; claude-write-fibonacci
(require '[pyjama.openrouter.core :as openrouter])

; Send the prompt to the default model and print the first answer.
(let [response (openrouter/with-prompt "Write Fibonacci in Clojure!")]
  (println
    (-> response :choices first :message :content)))

I actually wanted to evaluate answers from anthropic/claude-3.7-sonnet, so that model is the default one for OpenRouter.

With a bit of rewriting, we can make this vastly more generic, with the prompt and the model simply passed to openrouter/with-config.
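As a rough sketch (the exact map keys accepted by openrouter/with-config are an assumption on my part), the generic version might look like this:

(require '[pyjama.openrouter.core :as openrouter])

; Sketch only: assuming with-config takes a map with :model and :prompt keys.
(let [response (openrouter/with-config
                 {:model  "anthropic/claude-3.7-sonnet"
                  :prompt "Write Fibonacci in Clojure!"})]
  (println
    (-> response :choices first :message :content)))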

ChatGPT 4.5-preview

ChatGPT 4.5 Preview meets Pyjama: Clojure Conversations Just Got Smarter!

The latest and greatest from OpenAI—ChatGPT 4.5 Preview—is out in the wild! And what’s better than playing around with cutting-edge AI? Doing so directly from your favorite REPL, thanks to Pyjama, your go-to Clojure companion for seamless interactions with ChatGPT.

Quick recap: Calling ChatGPT from Clojure

With Pyjama, chatting with GPT is as straightforward as ever:

(require '[pyjama.chatgpt.core :as gpt])

(gpt/chatgpt
  {:prompt "give me a command to show number of sata ports linux"})

And to harness the power of the shiny new GPT-4.5-preview model, it’s just a small configuration tweak away:
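A minimal sketch, assuming the :model key is how Pyjama selects the OpenAI model:

(gpt/chatgpt
  {:model  "gpt-4.5-preview"   ; assumption: :model picks the model to use
   :prompt "give me a command to show number of sata ports linux"})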

FIKA - Filosophy Kafe #1

Our first AI event, where hands-on AI agents join a conversation on a specific topic and battle for wits and intelligence, happened today at the Karabiner Office.

../08_02.png

The setup was to use the pyjama-philosophers arena and let custom agents join the ongoing conversation after the theme was decided among the human participants.

../08_01.png

There are templates for agents in different languages, namely Rust, Python, Clojure and... Ruby!

Using Clojure for RAG with Ollama and Pyjama

Retrieval-Augmented Generation (RAG) is a powerful technique that combines retrieval-based search with generative AI models. This approach ensures that responses are based on relevant context rather than solely on the model’s pretrained knowledge. In this blog post, we’ll explore a simple RAG setup in Clojure using Ollama and Pyjama.

Setting Up the Environment

The script starts by defining the URL of the Ollama server, which serves the embedding and generative models:
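A minimal sketch of that setup (the var and model names here are my own assumptions, not necessarily those of the original script):

; Assumed names: the original script may use different vars and models.
(def ollama-url "http://localhost:11434")

(def embedding-model "mxbai-embed-large")
(def generation-model "llama3.1")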

Pyjama Embeddings: Asking the Big Questions

Fun with Clojure: Asking the Big Questions

Clojure might be known for its Lisp-y elegance and its ability to wrangle data like a pro, but today, we’re using it for something far more profound: uncovering the deepest truths of the universe. Why does the sky turn red? Why do fireworks explode? Why did the sun even bother to rise? Thanks to some Clojure magic, we’ve got the answers—straight from a highly sophisticated knowledge base (i.e., a delightfully whimsical source_of_truth.txt).