Build AI.
Run it anywhere.
The AI infrastructure platform. Version control, GPU compute, and model serving — from research to production.
01
Features.
Store anything, any size
One repository for code, models, datasets, and artifacts. No file size limits, no LFS. Written in Rust.
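Pushing a mixed repository might look like the sketch below. Only `outpost push` appears on this page, so the `init` and `add` subcommand names and the comments are assumptions modeled on similar version-control CLIs:

```shell
# Hypothetical workflow — only `outpost push` is confirmed on this page;
# the `init` and `add` subcommand names are assumptions.
outpost init my-project          # create a repository
outpost add weights.safetensors  # a 50 GB model — no LFS needed
outpost add data/imagenet/       # a 12 GB dataset
outpost add train.py             # 2 KB of code
outpost push                     # one versioned push for all of it
```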
Explore repositories
Launch compute in seconds
A100s, H100s, or CPUs on demand. Dev environments, batch jobs, and sandboxes — all from one CLI.
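Since the page says dev environments, batch jobs, and sandboxes all come from one CLI, the three modes might be invoked like this; the subcommand and flag names are assumptions, not documented syntax:

```shell
# Hypothetical invocations — subcommand and flag names are assumptions;
# the page only states that all three modes share one CLI.
outpost compute start --gpu h100         # interactive dev environment
outpost compute run --gpu a100 train.py  # batch job
outpost compute sandbox --cpu 4          # CPU sandbox
```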
Explore compute
Deploy models as endpoints
Auto-scaling inference. Scale to zero. Multi-region replicas with automatic failover.
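A deploy that exercises scale-to-zero and multi-region replicas could be sketched as follows; the `deploy` subcommand, flag names, and model path are all assumptions for illustration:

```shell
# Hypothetical sketch — the `deploy` subcommand, flags, and model
# path are assumptions, not documented syntax.
outpost deploy my-org/llama-70b \
  --min-replicas 0 \            # scale to zero when idle
  --regions us-east,eu-west     # multi-region replicas with failover
```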
Explore serving
Run models anywhere
Shard large models across phones, laptops, and GPUs. CUDA, Metal, CPU. Zero-config clustering.
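Zero-config clustering across mixed devices might feel like the sketch below; every subcommand name and address here is a hypothetical stand-in, since the page confirms only the capability, not the syntax:

```shell
# Hypothetical sketch — subcommand names and the address are stand-ins;
# the page promises zero-config clustering across CUDA, Metal, and CPU.
outpost edge serve my-org/llama-70b   # start on this laptop (Metal)
outpost edge join 192.168.1.20        # a phone or GPU box joins;
                                      # shards rebalance automatically
```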
Explore edge
outpost push
│
┌────────────┼────────────┐
▼ ▼ ▼
┌─────────┐ ┌─────────┐ ┌─────────┐
│ MODEL │ │ DATASET │ │ CODE │
│ ░░░░░░░ │ │ ░░░░░░░ │ │ ░░░░░░░ │
│ 50 GB │ │ 12 GB │ │ 2 KB │
└─────────┘ └─────────┘ └─────────┘
one repo · no limits · versioned
02
Built for.
| Generative AI | LLMs, image models, multi-modal systems |
| Embodied AI | robotics, sensor data, edge deployment |
| Computer vision | video, image pipelines at scale |
| Research | reproducible experiments, versioned everything |
Developer-first, enterprise-ready.
Single Sign-On · DPAs & SLAs · Reliable Uptime