LLM Workspace plugin for Obsidian

Quick demo

Let's create a new LLM workspace. A workspace is just a regular note with links to other notes.
[Video: 0.Create.workspace.mp4]

The workspace view opens side by side with the notes. Let's index all information in the workspace. Behind the scenes, this creates the typical RAG setup by chunking documents and computing embeddings for them (sketched below):

[Video: 1.Open.and.index.mp4]
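Here is a minimal sketch of that indexing step, assuming an OpenAI-style embeddings endpoint; the chunk size, model name, and function names are illustrative, not the plugin's actual code:

```ts
// Sketch only, not the plugin's actual pipeline: split each note into chunks,
// then embed every chunk so it can be retrieved later by similarity search.

interface EmbeddedChunk {
	notePath: string
	text: string
	embedding: number[]
}

// Naive chunking: split on blank lines and merge paragraphs up to a rough size cap.
function chunkNote(content: string, maxChars = 1000): string[] {
	const paragraphs = content.split(/\n\s*\n/)
	const chunks: string[] = []
	let current = ""
	for (const paragraph of paragraphs) {
		if (current.length > 0 && current.length + paragraph.length > maxChars) {
			chunks.push(current.trim())
			current = ""
		}
		current += paragraph + "\n\n"
	}
	if (current.trim().length > 0) chunks.push(current.trim())
	return chunks
}

// Embed a batch of chunks in a single API call (OpenAI used as an example provider).
async function embed(texts: string[], apiKey: string): Promise<number[][]> {
	const response = await fetch("https://api.openai.com/v1/embeddings", {
		method: "POST",
		headers: { "Content-Type": "application/json", Authorization: `Bearer ${apiKey}` },
		body: JSON.stringify({ model: "text-embedding-3-small", input: texts }),
	})
	const json = await response.json()
	return json.data.map((item: { embedding: number[] }) => item.embedding)
}

async function indexNote(notePath: string, content: string, apiKey: string): Promise<EmbeddedChunk[]> {
	const chunks = chunkNote(content)
	const embeddings = await embed(chunks, apiKey)
	return chunks.map((text, i) => ({ notePath, text, embedding: embeddings[i] }))
}
```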

We can start a conversation right away, but it's also possible to generate exploratory questions:

[Video: 2.Explore.mp4]

A conversation is always grounded in the workspace sources, and you can debug the RAG pipeline (a retrieval sketch is below):

[Video: 3.Debug.mp4]
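A rough sketch of how that grounding could work, reusing the EmbeddedChunk shape from the indexing sketch above; the retrieval and prompt assembly below are illustrative, not the plugin's exact implementation, but they show the kind of intermediate data a RAG debug view surfaces:

```ts
// Illustrative grounding flow: embed the question, rank workspace chunks by
// cosine similarity, and build the prompt from the top matches.

type EmbeddedChunk = { notePath: string; text: string; embedding: number[] }

function cosineSimilarity(a: number[], b: number[]): number {
	let dot = 0
	let normA = 0
	let normB = 0
	for (let i = 0; i < a.length; i++) {
		dot += a[i] * b[i]
		normA += a[i] * a[i]
		normB += b[i] * b[i]
	}
	return dot / (Math.sqrt(normA) * Math.sqrt(normB))
}

// Pick the k chunks most similar to the question embedding.
function retrieveTopK(queryEmbedding: number[], chunks: EmbeddedChunk[], k = 5): EmbeddedChunk[] {
	return chunks
		.map((chunk) => ({ chunk, score: cosineSimilarity(queryEmbedding, chunk.embedding) }))
		.sort((a, b) => b.score - a.score)
		.slice(0, k)
		.map((scored) => scored.chunk)
}

// Assemble a prompt that keeps the answer grounded in the retrieved sources.
function buildGroundedPrompt(question: string, retrieved: EmbeddedChunk[]): string {
	const context = retrieved
		.map((chunk) => `Source: ${chunk.notePath}\n${chunk.text}`)
		.join("\n\n---\n\n")
	return `Answer the question using only the sources below.\n\n${context}\n\nQuestion: ${question}`
}
```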

You can also chat with a single note (without creating a workspace) and attach more context gradually:

[Video: 4.Single.note.mp4]

Why have another AI plugin for Obsidian?

  • This plugin focuses on a specific approach to AI integration: manually created source sets that an LLM chat is grounded in. It's cheaper, faster, and more accurate than doing RAG over an entire vault (and we organize our notes anyway, right?!).
  • I wanted something that is flexible, configurable, and tweakable. AI progresses too fast anyway; I have no idea what the best setup will be tomorrow.
  • Because it's always fun to build tools for my own needs!

Goals

  • Integrate LLMs into the Obsidian vault, not the other way around (copy-pasting snippets in and out of ChatGPT)
  • Enable granular control, tweaking and transparency: choose from multiple models and LLM providers, customize prompts, and debug responses.
  • Native to Obsidian, not just a ChatGPT clone inside the Obsidian window
  • Great UX and efficient multitasking

Get started

This is a bring-your-own-API-key kind of integration. The first step is to add an API key in the plugin settings and choose a model.

This page lists the supported LLM providers and the specifics of each integration.
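For orientation, the configuration this refers to could be shaped roughly like the sketch below; the field names and example values are assumptions, not the plugin's actual settings schema:

```ts
// Illustrative only; field names and example values are assumptions,
// not the plugin's actual settings schema.
interface LlmWorkspaceSettings {
	provider: string // which LLM provider to use, e.g. "openai"
	apiKey: string // the key you bring yourself
	chatModel: string // model used for conversations
	embeddingModel: string // model used for indexing workspace chunks
}
```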

The main plugin panel can be launched from the context menu entries, but the following commands are also available (a sketch of how such a command is registered follows the list):

  • Open active note as LLM workspace
  • Chat with active note
  • Open existing workspace
  • Open context panel
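For those curious how such commands hook into Obsidian, here is a sketch using Obsidian's standard command API; the command id, the view type, and the openWorkspaceView helper are illustrative names, not the plugin's actual identifiers:

```ts
// Sketch of registering one of the commands above via Obsidian's plugin API.
import { Plugin } from "obsidian"

export default class LlmWorkspacePlugin extends Plugin {
	async onload() {
		this.addCommand({
			id: "open-active-note-as-llm-workspace", // illustrative id
			name: "Open active note as LLM workspace",
			callback: () => this.openWorkspaceView(),
		})
	}

	private openWorkspaceView() {
		// Open the workspace view in a side pane, next to the active note.
		const leaf = this.app.workspace.getRightLeaf(false)
		leaf?.setViewState({ type: "llm-workspace-view", active: true })
	}
}
```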

Recommended plugins

The following plugins work well with LLM Workspace and make the experience even better:

  • Folder notes: blurs the line between a folder and a note in the file tree. A note can have child notes like in Notion and similar tools. Combined with this plugin, an LLM workspace can be a folder with notes inside.
  • Waypoint: generates a table of contents with links to other notes. Combined with Folder notes and LLM workspace, it maintains links to notes in the LLM workspace folder.

Advanced use

Click here for a list of advanced features and use cases.

Current limitations

  • The plugin only handles Markdown files for now. There is no support for images, PDFs, or canvases.
  • The chunking of notes is pretty naive and there is no token count validation (but models have huge context windows nowadays); a sketch of what such a check could look like follows.
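To make the second point concrete, a token budget check of the kind that is currently missing could look roughly like this; the 4-characters-per-token ratio and the context window size are rough assumptions, not exact tokenizer math:

```ts
// Illustrative only: a rough token budget check the plugin does not perform yet.
function estimateTokens(text: string): number {
	// Common heuristic: roughly 4 characters per token; not an exact tokenizer.
	return Math.ceil(text.length / 4)
}

function fitsContextWindow(chunks: string[], question: string, contextWindow = 128_000): boolean {
	const total = estimateTokens(question) + chunks.reduce((sum, chunk) => sum + estimateTokens(chunk), 0)
	return total <= contextWindow
}
```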

Additional details

  • This plugin makes network requests to external LLM APIs. You can view the API-related code here
  • You can review all LLM prompts here
  • The RAG implementation is here

Roadmap

  • Change models and model parameters mid-conversation
  • Tool use
  • Chat history
  • Better query understanding and planning
  • Footnotes in LLM response
  • More advanced ways for including notes in a workspace (Dataview integration maybe?)
  • API cost estimation before heavy actions

Prior work and inspiration

Built with
