huggingface-tokenizers
Fast tokenizers optimized for research and production. Rust-based implementation tokenizes 1GB in <20 seconds. Supports BPE, WordPiece, and Unigram algorithms. Train custom vocabularies, track alignments, handle padding/truncation. Integrates seamlessly with transformers. Use when you need high-performance tokenization or custom tokenizer training.
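The capabilities listed above (training a custom BPE vocabulary, reading token-to-character alignments, and enabling padding/truncation) can be sketched with the tokenizers Python API. This is a minimal example, not part of the skill itself; the corpus, vocab size, and special tokens are placeholder choices.

```python
from tokenizers import Tokenizer
from tokenizers.models import BPE
from tokenizers.trainers import BpeTrainer
from tokenizers.pre_tokenizers import Whitespace

# Build a BPE tokenizer with a whitespace pre-tokenizer.
tokenizer = Tokenizer(BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = Whitespace()

# Train a custom vocabulary from an in-memory corpus (placeholder data).
trainer = BpeTrainer(vocab_size=200, special_tokens=["[UNK]", "[PAD]"])
corpus = ["hello world", "hello tokenizers", "world of tokens"]
tokenizer.train_from_iterator(corpus, trainer)

# Encodings carry alignment info: one (start, end) character span per token.
enc = tokenizer.encode("hello world")
print(enc.tokens)
print(enc.offsets)

# Padding and truncation are opt-in and apply to subsequent encodes.
tokenizer.enable_padding(pad_id=tokenizer.token_to_id("[PAD]"), pad_token="[PAD]")
tokenizer.enable_truncation(max_length=8)
batch = tokenizer.encode_batch(["hello", "hello world of tokens"])
print([len(e.ids) for e in batch])  # equal lengths after padding
```

The same `Tokenizer` object can be saved with `tokenizer.save("tokenizer.json")` and loaded back with `Tokenizer.from_file`, which is how trained vocabularies are typically shipped.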
v1.0.0 · 3 installs
Signing
Signed (SLSA L2)
- Signed by: skills-hub.ai distributor
- Method: distributor-signed by skills-hub.ai. Cryptographically signed by the skills-hub.ai distributor key at publish time.
Install this skill
Run this command in your terminal. No account required — it auto-detects your AI tool and installs the skill file.
npx @skills-hub-ai/cli install ai-research-huggingface-tokenizers
Setup by platform
Install
One-click setup for your editor. Run in your project root:

npx @skills-hub-ai/cli install ai-research-huggingface-tokenizers --target claude-code