  • Published 9 months ago by BusinessBreeze: run your business with AI tips

DeepSeek R1 full model on 128 GB MacBook Pro M4 Max: how fast is it?

Running a 131 GB AI Model on a MacBook Pro: Pushing the Limits

Ever wondered if you could run a massive AI model on your laptop? In this episode, we dive into running the DeepSeek R1 reasoning model on a 2025 MacBook Pro with the M4 Max chip, one of the most capable notebook computers money can buy today. Through clever optimization and quantization, the original 720 GB model was compressed to 131 GB, making it possible to run on consumer hardware with 128 GB of RAM.

While the model demonstrates impressive reasoning across topics like business scalability and asset diversification, the performance tells an interesting story. With its modest tokens-per-second throughput and a 6-second initial response time, it isn't breaking any speed records, but the ability to run completely offline brings unique advantages for privacy and remote usage. This experiment pushes the boundaries of what's possible with consumer hardware and showcases both the potential and the current limitations of running large AI models locally.

Whether you're a tech enthusiast or an AI practitioner, you'll want to hear the full details of this groundbreaking test. 🎧 Listen now to discover whether your Mac could be your next AI powerhouse, and learn about the future of personal AI computing. Subscribe to my channel for more tips on AI for managers, entrepreneurs, and business people, plus upcoming AI tools that will save you time and make you money.
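The memory arithmetic behind the episode's headline can be sketched in a few lines. This is a rough back-of-the-envelope calculation using only the sizes mentioned above (720 GB original, 131 GB quantized, 128 GB of unified memory); note that the quantized file is actually slightly larger than the machine's RAM, so some memory-mapping or streaming from the SSD is presumably involved (an assumption, not a detail confirmed in the episode).

```python
# Back-of-the-envelope memory math for the quantization described above.
# Figures come from the episode; the streaming remark is an assumption.
original_gb = 720   # full-precision DeepSeek R1 checkpoint
quantized_gb = 131  # quantized version actually run
ram_gb = 128        # M4 Max MacBook Pro unified memory

ratio = original_gb / quantized_gb
overflow_gb = quantized_gb - ram_gb

print(f"Compression ratio: {ratio:.1f}x smaller")
print(f"Fits entirely in RAM: {quantized_gb <= ram_gb}")
print(f"Exceeds RAM by: {overflow_gb} GB (likely memory-mapped from SSD)")
```

The ~5.5x shrink implies only a couple of bits per weight on average, which helps explain why generation speed, not capability, is the main cost of running the model locally.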