
Our Workshop Wrap-Up: Building Your Own AI with Open-Source Models

We recently partnered with brno.ai for a workshop on running open-source LLMs locally. Here’s a summary of what we covered, from the benefits of local AI to getting hands-on with Ollama and RAG.

by Jaroslav
Workshop · Ollama · Local RAG

Last week, our co-founder and CTO Josef Podaný had the pleasure of leading a workshop called “Build Your Own AI: Open-Source Models on Your Machine”. The event was organised by the fantastic team at brno.ai and held at the KUMST creative hub in Brno. The atmosphere was brilliant, and it was great to see a room full of people so enthusiastic about diving into the practical side of AI.

The audience was almost entirely developers from Brno’s tech community, including people from local startups, established companies like Red Hat, and Masaryk University. For those who couldn’t make it, or for anyone who wants a quick refresher, here’s a summary of what we discussed.

Why Bother with a Local AI?

The first question we tackled was simple: with powerful APIs from OpenAI and Google, why would you run an AI model on your own computer? The reasons are quite compelling:

  • Privacy and Data Control: When you run a model locally, your data never leaves your machine. This is perfect for working with sensitive company documents, private code, or customer information.
  • Offline Capability: A local model works without an internet connection, which is useful for development on the go or for applications in remote environments.
  • No API Costs or Limits: You avoid paying per-token fees, and you aren’t limited by a provider’s rate limits or terms of service.
  • Full Customisation: You have complete control. You can tweak the model, fine-tune it with your own data, or experiment with different open-source architectures.

The Open vs. Closed Model Landscape

We then discussed the difference between closed models like GPT-4 and open-source models like Meta’s Llama 3 or Mistral.

Closed models are easy to use and often offer the highest performance, but they come at a cost. You send your data to a third-party service, you pay for what you use, and you have no visibility into how the model works.

Open-source models give you freedom and control. You can download the model files and run them wherever you want. This transparency and flexibility are what make the open-source community so exciting. The trade-off is that you are responsible for providing the hardware to run them.

Your Toolkit for Running Local LLMs

Getting started with local models is easier than ever thanks to some great tools. We focused on a few key players in the ecosystem:

  • Hugging Face: Think of it as the GitHub for AI. It’s a huge library where you can find, compare, and download thousands of open-source models.
  • LM Studio: A desktop application with a graphical user interface that lets you download and chat with different models, much like you would with ChatGPT.
  • Ollama: This was the star of our hands-on session. Ollama is a simple command-line tool that makes it incredibly easy to download, manage, and run LLMs locally.

Hands-On: Running a Model and Building a RAG App

In the practical part of the workshop, Josef showed how anyone can get an AI running in just a few minutes with Ollama. With a single command (ollama run mistral), you can download a powerful model and start chatting with it directly in your terminal.
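For reference, the core commands from the demo look roughly like this (assuming Ollama is installed; the model names shown are examples, and the API port is Ollama's default):

```shell
# Download and chat with Mistral directly in the terminal
# (the first run pulls the model weights automatically)
ollama run mistral

# Pull a model without starting a chat, and list what's installed locally
ollama pull llama3
ollama list

# Ollama also serves a local HTTP API (default port 11434),
# which is how you wire a local model into your own applications:
curl http://localhost:11434/api/generate \
  -d '{"model": "mistral", "prompt": "Why run LLMs locally?", "stream": false}'
```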

We also built a simple Retrieval-Augmented Generation (RAG) application. In simple terms, RAG gives a local model access to your own documents. It’s like giving the AI an “open-book exam”, allowing it to answer questions based on a specific set of information you provide, rather than just its general knowledge. This is a powerful way to make a general model an expert on your specific data.
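To make the "open-book exam" idea concrete, here is a minimal sketch of the retrieval step at the heart of RAG. Real applications use vector embeddings to find relevant chunks; this illustrative version scores chunks by simple word overlap instead, but the flow (retrieve context, then augment the prompt) is the same. The documents and function names here are made up for illustration, not taken from the workshop code.

```python
def retrieve(question: str, chunks: list[str]) -> str:
    """Return the document chunk sharing the most words with the question."""
    q_words = set(question.lower().split())
    return max(chunks, key=lambda c: len(q_words & set(c.lower().split())))

def build_prompt(question: str, chunks: list[str]) -> str:
    """Augment the question with retrieved context -- the 'open-book exam'."""
    context = retrieve(question, chunks)
    return (
        "Answer using only the context below.\n"
        f"Context: {context}\n"
        f"Question: {question}"
    )

# Your own documents, split into chunks:
chunks = [
    "Our office is in Brno, at the KUMST creative hub.",
    "Support hours are 9:00 to 17:00 on weekdays.",
]

prompt = build_prompt("Where is the office located?", chunks)
# `prompt` now contains the Brno chunk; the final step is sending it
# to a local model, e.g. via Ollama's HTTP API on localhost:11434.
```

Swapping the word-overlap scorer for embedding similarity turns this toy into a real RAG pipeline; everything else stays structurally identical.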

All the code and examples from this practical demo are available on GitHub for you to try yourself: https://github.com/podanypepa/gorag

It was a fantastic evening, and it’s clear that the developer community in Brno is excited about the possibilities of open-source AI. A final thank you to brno.ai, KUMST, and everyone who attended and brought their curiosity and great questions.

Ready to get started?

Experience the power of unified AI with the Nomodo AI platform.
