Lightning Talk Intermediate BSD-3

Building a Local-First AI Stack with Open WebUI & Ollama

Rejected
Session Description

AI is everywhere, but so are concerns around data privacy, vendor lock-in, and sending sensitive information to closed-source models.

In this talk, I'll introduce a local-first AI stack built on Ollama and Open WebUI, showing how developers can run and interact with LLMs locally without relying on cloud APIs.

I'll give a quick demo of how to set up your own private AI assistant and how to use it efficiently.
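As a rough sketch of the kind of setup the demo covers (the model name is illustrative; pick any model Ollama supports for your hardware):

```shell
# Install Ollama via the official convenience script (Linux/macOS);
# inspect the script before piping it to sh if you prefer.
curl -fsSL https://ollama.com/install.sh | sh

# Pull a model and chat with it entirely locally -- no cloud API involved.
ollama pull llama3.2
ollama run llama3.2 "Summarize local-first AI in one sentence."

# Ollama also serves a local REST API (default port 11434),
# which tools like Open WebUI connect to.
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3.2", "prompt": "Hello", "stream": false}'
```

The REST API is what makes the stack composable: any local client can talk to the same endpoint the CLI uses.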

Key Takeaways

  • What a local-first AI is

  • How to run LLMs locally with Ollama

  • How to use Open WebUI as a self-hosted AI interface
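The self-hosted interface piece above is typically a one-line Docker deployment; this is a sketch assuming Ollama is already running on the host (port mapping and volume name follow the Open WebUI defaults):

```shell
# Run Open WebUI in Docker, reachable at http://localhost:3000.
# --add-host lets the container reach the host's Ollama server;
# the named volume persists users, chats, and settings across restarts.
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

From there the web UI discovers the local Ollama instance, so model selection and chat all stay on your own machine.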

Session Categories

Introducing a FOSS project or a new version of a popular project
Engineering practice - productivity, debugging
Tutorial about using a FOSS project
Talk License: BSD-3

Speakers

Vignesh Murugan

Software Engineer - VGTS
Founding Member - Rebel/Stack Community

https://www.linkedin.com/in/vignesh-murugan-dev/

Reviews

This is more suited for a blog post.

Reviewer #1 Rejected

The proposal isn't sufficiently detailed, and it's not clear what additional knowledge this talk imparts to the audience that they can't learn from existing information online.

Reviewer #2 Rejected