Published 1 year ago by Ian Wootten

Using Ollama to Run Local LLMs on the Raspberry Pi 5

My favourite local LLM tool, Ollama, is simple to set up and works on a Raspberry Pi 5. I check it out and compare it against benchmarks from more powerful machines.

00:00 Introduction
00:41 Installation
02:12 Model Runs
09:01 Conclusion

Ollama:
Blog:
Support My Work:
Check out my website:
Follow me on twitter:
Subscribe to my newsletter:
I've been using DigitalOcean for web hosting for years. Get $200 credit on sign up:
Buy me a cuppa:
Learn how devs make money from Side Projects:

Gear:
RPi 5 from Pimoroni on Amazon:

Some of these links are affiliates, meaning I earn on qualifying purchases at no extra cost to you.
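For reference, here is a minimal sketch of the kind of commands the Installation and Model Runs chapters cover, based on Ollama's documented Linux CLI. The model name used below is only an example and is not confirmed by the video; any model small enough to fit in the Pi's RAM will do.

    # Install Ollama on the Pi (64-bit OS) using the official install script
    curl -fsSL https://ollama.com/install.sh | sh

    # Pull and run a model interactively ("phi" is just an example model name)
    ollama run phi

    # List the models that have been downloaded locally
    ollama list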