Personal · Active · December 2024

Local AI Stack

M4 Max multi-model inference


What

M4 Max multi-model inference

How

Ollama · M4 Max · LLMs

Why

The Stack + The Lab

Supporting: Solid production work

Overview

Local AI infrastructure on an M4 Max MacBook Pro: Ollama-based multi-model conversations, experiments with the Reachy robot, and a development environment for AI-native workflows.
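A minimal sketch of what a multi-model conversation looks like with the Ollama Python client, where two locally pulled models take alternating turns on the same thread. The model names are assumptions; substitute whatever `ollama list` shows on your machine.

```python
# Sketch: round-robin conversation between local models via Ollama.
# Assumes the `ollama` Python package and a running Ollama server;
# model names below are placeholders for whatever is pulled locally.
import ollama

MODELS = ["llama3.2", "qwen2.5"]

def round_robin_chat(prompt: str, turns: int = 4) -> None:
    """Pass a message back and forth between local models,
    feeding each model's reply in as the next model's prompt."""
    message = prompt
    for turn in range(turns):
        model = MODELS[turn % len(MODELS)]
        reply = ollama.chat(
            model=model,
            messages=[{"role": "user", "content": message}],
        )
        message = reply["message"]["content"]
        print(f"[{model}] {message}\n")

if __name__ == "__main__":
    round_robin_chat("Are local LLMs enough for day-to-day dev work?")
```

Because everything runs on-device, swapping models in and out of the rotation is just a matter of editing the list; the M4 Max's unified memory is what makes keeping several mid-size models warm at once practical.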

Technologies

Ollama · M4 Max · LLMs
