build-models
Package and build custom AI models with Cog for deployment on Replicate. Use when creating a cog.yaml or predict.py, defining model inputs and outputs, loading model weights at setup time, building Docker images for ML models, serving locally with cog serve or cog predict, or porting a HuggingFace, GitHub, or ComfyUI model to run on Replicate. Trigger on phrases like "build a model", "package a model", "create a Cog model", "wrap a model", "containerize an AI model", "predict.py", "cog.yaml", "BasePredictor", or "Cog container", and when referencing cog.run, github.com/replicate/cog, or github.com/replicate/cog-examples. Covers GPU and CUDA setup, pget for fast weight downloads, async predictors with continuous batching, streaming outputs, and cold-boot optimization for image, video, audio, and LLM models. For pushing built models to Replicate, see publish-models. For running existing models, see run-models.
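The description above references the two files every Cog project is built around: a `cog.yaml` declaring the environment and a `predict.py` defining the model interface. As a minimal sketch of that layout (the weights filename, the `load_model` helper, and the input parameters here are hypothetical placeholders, not part of this skill):

```yaml
# cog.yaml — declares the runtime environment for the container
build:
  gpu: true
  python_version: "3.11"
  python_packages:
    - "torch"
predict: "predict.py:Predictor"
```

```python
# predict.py — subclass BasePredictor; Cog calls setup() once per
# container boot and predict() once per request
from cog import BasePredictor, Input, Path


class Predictor(BasePredictor):
    def setup(self):
        # Load weights here, at setup time, so cold boots pay the
        # cost once instead of on every prediction.
        # load_model and the weights path are illustrative only.
        self.model = load_model("./weights.pth")

    def predict(
        self,
        prompt: str = Input(description="Text prompt for the model"),
    ) -> Path:
        # Run inference and return a file output (e.g. an image).
        return self.model.generate(prompt)
```

With these two files in place, `cog predict -i prompt="..."` runs the model locally in its container, and `cog build` produces the Docker image.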
Unsigned — install at your own risk
This skill has no cryptographic signature attached. We can't verify the contents match what the publisher intended.
Install this skill
Run this command in your terminal. No account required — it auto-detects your AI tool and installs the skill file.
npx @skills-hub-ai/cli install replicate-build-models
Setup by platform
Install
One-click setup for your editor. Run in your project root:
npx @skills-hub-ai/cli install replicate-build-models --target claude-code