
Bilgecan: Your Own Local AI Platform (Ollama + Spring AI)

Hey everyone,

I’ve been working on a side project called Bilgecan, a self-hosted, local-first AI platform built with Spring AI that uses Ollama as the LLM runtime.

What can you do with Bilgecan?

  • Run local LLMs via Ollama for privacy-friendly AI prompts and chat, without sending your data to third parties (see the sketch after this list).
  • With RAG (Retrieval-Augmented Generation), you can feed your own files into a knowledge base and enrich AI outputs with your private data.
  • Define asynchronous AI tasks to run long operations (document analysis, report generation, large text processing, image analysis, etc.) in the background.
  • Use the file processing pipeline to run asynchronous AI tasks over many files automatically.
  • With the Workspace structure, you can share AI prompts and tasks with your team in a collaborative environment.
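
To give a feel for how the Ollama integration works, here is a minimal Spring AI chat sketch. This is not Bilgecan's actual code; the controller name, endpoint path, and model are assumptions for illustration. It assumes the spring-ai-ollama-spring-boot-starter dependency, a local Ollama server with a model pulled (e.g. `ollama pull llama3`), and `spring.ai.ollama.chat.options.model=llama3` in application.properties.

    import org.springframework.ai.chat.client.ChatClient;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.RequestParam;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    class LocalChatController {

        private final ChatClient chatClient;

        // Spring Boot auto-configures a ChatClient.Builder bound to the local Ollama chat model.
        LocalChatController(ChatClient.Builder builder) {
            this.chatClient = builder.build();
        }

        // The prompt is sent to the local Ollama runtime, so nothing leaves your machine.
        @GetMapping("/chat")
        String chat(@RequestParam String message) {
            return chatClient.prompt()
                    .user(message)
                    .call()
                    .content();
        }
    }

The real chat, RAG, and task endpoints in Bilgecan are richer than this; see the repo below for the actual implementation.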


I’d really appreciate feedback from the community.

Repo: https://github.com/mokszr/bilgecan

YouTube demo video: 

 

