  • 29,261 views
  • Published 4 months ago by Python Simplified

Build Full Stack LLM Chat App with Docker Model Runner, LangChain and Streamlit

In this tutorial, I'll show you how to build a complete AI assistant app from scratch! 🚀 You'll learn how to run open-source LLMs locally using Docker's brand-new Model Runner (via the CLI and as a backend service). We'll then combine it with a clean, traditional chat interface built with Streamlit (a very quick and simple GUI library!). And the best part: we'll easily switch from chatting with a small local model to a powerful cloud-based model on OpenRouter - all while saving the conversation history so you don't have to repeat yourself. Yes, both models will be aware of the entire conversation - even the turns where they weren't the one talking! 🤯

📦 Tools Used
--------------------------------------------------
🔹 Docker Model Runner
🔹 LangChain
🔹 Streamlit
🔹 OpenRouter
🔹 Docker Compose

🛠️ What You'll Build
--------------------------------------------------
🔹 Local LLM serving with Docker Model Runner 🤖
🔹 A chat GUI with Streamlit 💻
🔹 Memory for past chat messages 💡
🔹 One-click switch to a big cloud model ☁️
🔹 Fully containerized setup with Docker Compose 🐋

By the end of this video, you'll have a production-ready AI chatbot 🤖 that runs both locally and in the cloud, with all dependencies packaged in Docker containers! This project is the perfect foundation for more advanced AI apps (coming soon... 😉).
💻 Code and Resources:
--------------------------------------------------
⭐ Full Tutorial Code:
⭐ Docker Model Runner documentation:
⭐ Docker AI Namespace - Find the model you need here:
🏃‍♀️‍➡️ Base URL for Docker Model Runner:

⏰ Time Stamps:
--------------------------------------------------
01:25 - Docker Desktop Setup
02:14 - Docker Model Runner CLI
03:22 - Intro to Building Apps with Docker
04:30 - Basic App with Docker Compose [CLI]
08:39 - Docker Model Runner in Docker Compose and LangChain
11:19 - Chat App GUI with Streamlit
18:02 - Store Chat History in User Sessions
21:57 - LLM Chat Context
23:26 - Run Cloud LLM via OpenRouter
28:42 - Best Practices
30:04 - Thanks for Watching!

🎥 Related Videos:
--------------------------------------------------
⭐ Docker Quickstart for Beginners:
⭐ WSL Setup:

If you find this tutorial helpful, don't forget to like 👍 subscribe 🔔 and drop your questions in the comments 💌. Happy coding!

🎯 The Workflow:
--------------------------------------------------
A step-by-step pipeline for bringing the chat app to life:
1. How to install and enable Docker Model Runner.
2. Creating a minimal Python + Docker app.
3. Setting up Docker Compose with local model services.
4. Building a Streamlit chat interface.
5. Storing and passing conversation context.
6. Connecting to OpenRouter for large models.
7. Best practices for environment variables, requirements, and healthchecks.

🤝 Let's Connect 🤝
--------------------------------------------------
🔗 GitHub:
🔗 X:
🔗 LinkedIn:
🔗 Blog:
🔗 Discord:

💳 Credits 💳
--------------------------------------------------
- Beautiful icons by FlatIcon
- Beautiful graphics by Freepik

#python #docker #pythonprogramming #LLM #LangChain #LocalLLM #Streamlit #AgenticAI #coding #software #ai