Distributed AI
exo for Mac Mini
Use exo for Mac Mini with Macfleet Cloud
Run Your Own AI Cluster
exo lets you unify your existing devices into one powerful GPU cluster. It works with iPhones, iPads, Android devices, Macs, NVIDIA GPUs, Raspberry Pis, and almost any other device.
Key Features
- Wide Model Support: Run LLaMA, Mistral, LLaVA, Qwen, DeepSeek, and more
- Dynamic Model Partitioning: Split models across devices based on available resources
- Automatic Device Discovery: Zero configuration required to find other devices
- ChatGPT-compatible API: A one-line change points existing tools at your own hardware (see the sketch after this list)
- Device Equality: P2P architecture with no master-worker hierarchy
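Because the API speaks the standard chat-completions protocol, an existing OpenAI-style client can usually be repointed at the cluster with a single configuration change. A minimal sketch, assuming your client honors the OPENAI_BASE_URL environment variable and that exo serves its API on localhost port 52415; both the variable name and the port are assumptions, so check your client's documentation and exo's startup output for the actual values:

# Repoint an existing OpenAI-SDK-based client at the local exo cluster
# (variable name and port are assumptions; see your client's docs and exo's logs)
export OPENAI_BASE_URL="http://localhost:52415/v1"
export OPENAI_API_KEY="local-exo"  # many SDKs require a non-empty key even for local endpoints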
Getting Started
- Install from source:
git clone https://github.com/exo-explore/exo.git
- Change to the directory:
cd exo
- Install dependencies:
pip install -e .
or, alternatively:
source install.sh
- Start exo on each device by running:
exo
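Once exo is running on each device, the nodes discover one another automatically and the cluster exposes its ChatGPT-compatible endpoint. As a smoke test, you can send a chat-completions request from any machine on the network; the port (52415) and the model name below are assumptions, so check the exo console output for the actual address and the models it can serve:

# Hypothetical smoke test against the cluster's chat-completions endpoint;
# adjust the port and model name to match your exo console output
curl http://localhost:52415/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama-3.2-3b",
    "messages": [{"role": "user", "content": "Hello from my exo cluster"}]
  }'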
Perfect For
- Running larger models than any single device could handle
- Privacy-conscious users wanting local AI processing
- Developers building AI applications
- Utilizing existing hardware instead of buying expensive GPUs
exo is maintained by exo labs and released under the GPL-3.0 license.