The ONLY Local LLM Tool for Mac (Apple Silicon)!!

LM Studio ships with an MLX engine for running on-device LLMs super efficiently on Apple Silicon Macs. MLX support in LM Studio includes:

- Search for and download any supported MLX LLM from Hugging Face (just like you've been doing with GGUF models)
- Use MLX models via the Chat UI, or from your code through an OpenAI-like local server running on localhost (see the Python sketch at the end of this description)
- Enforce LLM responses in specific JSON formats (thanks to Outlines)
- Use vision models like LLaVA via the chat or the API (thanks to mlx-vlm)
- Load and run multiple LLMs simultaneously; you can even mix and match GGUF and MLX models!

🔗 Links 🔗

❤️ If you want to support the channel ❤️
Support here:
Patreon -
Ko-Fi -

🧭 Follow me on 🧭
Twitter -
Linkedin -
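
A minimal sketch of the local-server workflow described above, using the standard openai Python client. It assumes LM Studio's server is running on its default port 1234 and that the model identifier shown (a hypothetical placeholder) has been downloaded and loaded in LM Studio; the second request demonstrates the JSON-format enforcement mentioned in the list.

# Sketch only: talk to an MLX model through LM Studio's OpenAI-like local server.
# Assumes the server is running at LM Studio's default http://localhost:1234/v1;
# the model id below is a hypothetical placeholder, swap in one you have loaded.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's local server
    api_key="lm-studio",                  # any non-empty string works locally
)

MODEL = "mlx-community/Meta-Llama-3.1-8B-Instruct-4bit"  # placeholder id

# Plain chat completion against the loaded MLX model.
chat = client.chat.completions.create(
    model=MODEL,
    messages=[{"role": "user", "content": "In one sentence, what is MLX?"}],
)
print(chat.choices[0].message.content)

# Structured output: constrain the response to a JSON schema.
structured = client.chat.completions.create(
    model=MODEL,
    messages=[{"role": "user", "content": "Name a city and its country."}],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "city",
            "schema": {
                "type": "object",
                "properties": {
                    "city": {"type": "string"},
                    "country": {"type": "string"},
                },
                "required": ["city", "country"],
            },
        },
    },
)
print(structured.choices[0].message.content)  # JSON matching the schema

Because the server speaks the OpenAI API shape, any existing OpenAI-client code can be pointed at the localhost base_url without other changes.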