OAMC

Local-first LLM wiki for research

Python · FastAPI · Obsidian · OpenAI · macOS
GitHub

I kept losing context across research sessions. Notes in one place, sources in another, LLM conversations gone after closing the tab. I wanted something that ties it all together locally — no cloud sync, no vendor lock-in, just my files.

OAMC ingests raw sources into an Obsidian vault and compiles them into a structured wiki using LLMs. Drop a paper, article, or notes into the raw folder, and the pipeline processes it into linked, searchable wiki pages. Query your knowledge base from the CLI, the local dashboard, or the macOS menubar.
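The ingest step can be sketched in a few lines. This is a minimal illustration, not OAMC's actual code: the folder names and the `summarize` callable are assumptions, standing in for the real pipeline's LLM call and link extraction.

```python
from pathlib import Path


def compile_wiki(raw_dir: Path, wiki_dir: Path, summarize) -> list[Path]:
    """Process each raw markdown source into a wiki page.

    `summarize` is a stand-in for the LLM call in the real pipeline;
    here it is any function mapping source text to a summary string.
    """
    wiki_dir.mkdir(parents=True, exist_ok=True)
    pages = []
    for src in sorted(raw_dir.glob("*.md")):
        text = src.read_text()
        page = wiki_dir / src.name
        # Emit a page with a title, the summary, and an Obsidian-style
        # [[wikilink]] back to the raw source.
        page.write_text(
            f"# {src.stem}\n\n{summarize(text)}\n\nSource: [[raw/{src.name}]]\n"
        )
        pages.append(page)
    return pages
```

The real pipeline also links pages to each other; this only shows the raw-file-to-wiki-page shape of the transform.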

Built with Python and FastAPI. Runs entirely on your machine. Your research stays yours.

Input: Raw sources — papers, articles, notes, any markdown
Output: Structured Obsidian wiki with linked pages and summaries
Interface: CLI, local web dashboard, macOS menubar app
Privacy: Fully local — your files, your machine, your API key