hf CLI: A faster, friendlier Hugging Face CLI with Jobs
Sources: https://huggingface.co/blog/hf-cli, Hugging Face Blog
Overview
Hugging Face has officially renamed the CLI from huggingface-cli to hf. The change aims to improve ergonomics and clarity by reorganizing commands into a consistent, resource-driven structure. The new pattern is hf <resource> <action>, with root-level commands like hf upload and hf download expected to be among the most used. The CLI is designed to be predictable and discoverable, setting the stage for upcoming features while preserving compatibility with the legacy huggingface-cli to ease the transition. A key motivation is to simplify how features are accessed as new capabilities are added (upload, download, cache management, repo management, etc.).
A notable addition is a dedicated command for Hugging Face Jobs. Jobs lets you run scripts or Docker images on Hugging Face Infrastructure using your chosen hardware flavor. Billing for Jobs is pay-as-you-go, and access is limited to Pro users and Team or Enterprise organizations. The CLI borrows heavily from Docker's command style to feel familiar to developers.
To start experimenting, install the latest huggingface_hub release, reload your terminal, and test basic commands such as hf version and hf --help. If you are familiar with huggingface-cli, most commands will look familiar, but the new organization makes authentication and other common tasks more coherent. To preview how the new CLI is organized, think in terms of resource groups like hf auth, hf cache, hf repo, etc., with the important exception that hf upload and hf download are surfaced at the root level for quick access.
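To make the renaming concrete, a few legacy commands and their hf counterparts are sketched below. These mappings follow the hf <resource> <action> pattern described above; the repo ids and paths are illustrative placeholders, not examples taken from the source.
# legacy huggingface-cli                new hf equivalent
huggingface-cli login              ->   hf auth login
huggingface-cli whoami             ->   hf auth whoami
huggingface-cli download gpt2      ->   hf download gpt2
huggingface-cli upload my-model .  ->   hf upload my-model .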
Key features
- hf replaces huggingface-cli with a faster, more ergonomic interface.
- Commands follow the predictable pattern: hf <resource> <action>.
- Root-level commands for the most-used actions, including hf upload and hf download.
- Command groups such as hf auth, hf cache, hf repo organize functionality by resource.
- Legacy huggingface-cli remains active and fully functional to ease the migration, with a warning pointing to the new CLI equivalent.
- A dedicated hf jobs command enables running scripts or Docker images on Hugging Face Infrastructure.
- Jobs are paid on a pay-as-you-go basis and are available to Pro users and Team or Enterprise organizations.
- The CLI design is inspired by Docker, aiming for familiarity and ease of learning.
- Authentication commands are grouped under hf auth in the new structure, including hf auth list for listing local profiles (see the example after this list).
- Users should install the latest huggingface_hub version, reload the terminal, and verify with commands like hf version and hf --help.
Note: If you used the legacy CLI, you will see warnings that guide you to the new hf equivalents.
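As a concrete illustration of the auth grouping called out above, a minimal session might look like the following; the commands belong to the hf auth group, while the comments describe expected behavior and are assumptions rather than output quoted from the source.
hf auth login    # store a token locally, as the legacy login command did
hf auth whoami   # show which account the active token belongs to
hf auth list     # list locally stored authentication profiles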
Common use cases
- Authenticate with multiple local profiles using hf auth and hf auth list.
- Manage local caches and repository references via hf cache and hf repo.
- Upload or download artifacts quickly using hf upload and hf download at the root level (see the sketch after this list).
- Explore the new CLI structure with hf --help and drill into any command group's specifics with its own --help.
- Launch and manage Jobs on Hugging Face Infrastructure using hf jobs (subject to plan).
- Transition gradually from huggingface-cli, using the legacy CLI with minimal disruption while adopting the new hf syntax.
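To make the root-level actions concrete, here is a hedged sketch of typical download and upload invocations; the repo ids, file names, and paths are placeholders, not examples from the source.
hf download gpt2                           # fetch a whole model repo into the local cache
hf download gpt2 config.json               # fetch a single file from that repo
hf upload my-username/my-model ./my-model  # push a local folder to a repo you own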
Setup & installation
To start, install the latest huggingface_hub release and reload your terminal session. The exact installation command is not given in the excerpt; the guidance simply emphasizes updating huggingface_hub and restarting the terminal before testing.
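Assuming a pip-based Python environment (an assumption, since the excerpt does not specify one), a typical way to upgrade would be:
pip install -U huggingface_hub
# open a new terminal session afterwards so the hf entry point is picked up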
After installation, test the setup:
hf version
hf --help
You can also inspect authentication state and profiles:
hf auth list
If you want to explore Jobs:
hf jobs --help
Quick start (minimal runnable example)
- Install the latest huggingface_hub and restart your terminal.
- Verify the installation:
- Run hf version to confirm the CLI is available.
- Run hf --help to see the resource-based command structure.
- List local profiles with hf auth list.
- Use the root-level commands for common tasks, such as uploading or downloading artifacts:
- hf upload (root-level command)
- If you are on a Pro/Team/Enterprise plan, explore Jobs with hf jobs --help to learn about launching scripts or Docker images on Hugging Face Infrastructure (see the sketch below).
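For the Jobs step, here is a hedged sketch of the Docker-inspired syntax; the image, command, job id, and hardware flavor are illustrative placeholders, and the ps/logs subcommands are assumed from the Docker-style design rather than quoted from the excerpt.
hf jobs run python:3.12 python -c "print('hello from HF infra')"   # run a command in a Docker image on HF Infrastructure
hf jobs ps                                                          # list your running jobs, Docker-style
hf jobs logs <job-id>                                               # stream logs for a specific job
# a hardware flavor can typically be selected at run time, e.g. --flavor a10g-small (flag name assumed)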
Pros and cons
- Pros:
- Cleaner, more predictable CLI with the hf <resource> <action> structure.
- Root-level commands for the most used tasks simplify common workflows.
- Dedicated Jobs service adds cloud-based execution with usage-based pricing.
- Legacy CLI remains available to reduce disruption during migration.
- Docker-inspired design helps developers learn quickly.
- Cons:
- New features require upgrading to the latest huggingface_hub, and legacy commands emit transitional warnings during migration.
- Jobs access is gated behind Pro and higher-tier plans, with pay-as-you-go pricing.
- For users deeply customized around the old command layout, migration may require retraining on the new structure.
Alternatives (brief comparisons)
| Aspect | huggingface-cli (legacy) | hf (new) |
|---|---|---|
| Command pattern | Feature-specific commands; no strict uniform structure | hf <resource> <action>; more uniform and ergonomic |
| Root-level actions | Not clearly emphasized | hf upload and hf download surfaced at the root |
| Authentication | Auth-related commands distributed across the CLI | Grouped under hf auth; hf auth list for local profiles |
| Migration path | Remains active and fully functional, with warnings pointing to the hf equivalent | Recommended going forward; adoption can be gradual during the transition |
| Jobs support | Not highlighted | hf jobs introduced for running scripts/Docker images on HF infrastructure |
Pricing or License
Hugging Face Jobs are available only to Pro users and Team or Enterprise organizations. Billing for Jobs is pay-as-you-go, meaning you pay only for the seconds you use.
References
- Hugging Face Blog, "hf CLI" announcement: https://huggingface.co/blog/hf-cli