
◆ What
M4 Max multi-model inference
◆ How
Ollama, M4 Max, LLMs
◆ Why
The Stack + The Lab
Supporting: Solid production work
Overview
Local AI infrastructure on an M4 Max MacBook Pro: Ollama-based multi-model conversations, Reachy experiments, and a development environment for AI-native workflows.
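As a sketch of the multi-model conversation setup, the snippet below poses the same prompt to several locally served models via Ollama's documented `/api/chat` endpoint (default port 11434). The specific model names are placeholders, not a statement of what is actually installed on the machine.

```python
import json
import urllib.request

# Ollama's default local endpoint (documented REST API).
OLLAMA_URL = "http://localhost:11434/api/chat"


def build_chat_request(model: str, messages: list) -> dict:
    """Build a non-streaming JSON payload for Ollama's /api/chat endpoint."""
    return {"model": model, "messages": messages, "stream": False}


def ask(model: str, messages: list) -> str:
    """Send one chat turn to a locally running Ollama server and
    return the assistant's reply text."""
    data = json.dumps(build_chat_request(model, messages)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]


# Example multi-model round (requires Ollama running locally;
# model names like "llama3.1" / "mistral" are assumptions):
#
#   turn = [{"role": "user", "content": "One-line summary of unified memory?"}]
#   for model in ("llama3.1", "mistral"):
#       print(model, "->", ask(model, turn))
```

Keeping inference local means the same loop can fan a prompt out to every pulled model without API keys or network egress, which is the point of running this on-device.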
Technologies
Ollama, M4 Max, LLMs