This guide will help you install the necessary Hugging Face packages and tools to download and use models from the Hugging Face Hub.
## transformers Package

The transformers package is used to load and use Hugging Face models in Python.
Run the following command in your terminal or command prompt:
```bash
pip install transformers
```
To confirm the installation was successful, run:
```bash
python -c "from transformers import pipeline; print('Installation successful!')"
```
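You can also print the installed version from Python, which is useful when a model on the Hub requires a recent transformers release:

```python
import transformers

# Show the installed transformers version
print(transformers.__version__)
```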
## huggingface_hub Package

The huggingface_hub package provides the huggingface-cli tool for interacting with the Hugging Face Hub (e.g., downloading models or uploading files).
Run the following command:
```bash
pip install huggingface_hub
```
After installation, check that the huggingface-cli command is available:
```bash
huggingface-cli --help
```
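Beyond the CLI, huggingface_hub also exposes a Python API for talking to the Hub. As a minimal sketch (gpt2 is just an example model ID), you can fetch a repository's metadata like this:

```python
from huggingface_hub import HfApi

api = HfApi()

# Fetch metadata for a model repository on the Hub (gpt2 used as an example)
info = api.model_info("gpt2")
print(info.id)
```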
## huggingface-cli

The huggingface-cli tool allows you to interact with the Hugging Face Hub directly from the command line.
To log in to your Hugging Face account:
```bash
huggingface-cli login
```
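If you prefer to authenticate from Python instead of the shell, huggingface_hub provides a login helper; called with no arguments it prompts for a User Access Token (created under your account settings at https://huggingface.co/settings/tokens):

```python
from huggingface_hub import login

# Prompts interactively for a User Access Token and stores it locally
# so that later downloads and uploads are authenticated.
login()
```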
To download a model (e.g., gpt2):
```bash
huggingface-cli download gpt2
```
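The same download can also be triggered from Python via snapshot_download, which fetches every file in the repository into the local cache and returns the snapshot path:

```python
from huggingface_hub import snapshot_download

# Download the full gpt2 repository (weights, config, tokenizer files)
# into the local Hugging Face cache and return the snapshot directory.
local_path = snapshot_download(repo_id="gpt2")
print(local_path)
```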
To see all available commands and options:
```bash
huggingface-cli --help
```
Here’s an example of downloading and using a model in Python:
```python
from transformers import pipeline

# Download (if not already cached) and load the gpt2 model
generator = pipeline("text-generation", model="gpt2")

# Generate text
output = generator("Hello, how are you?", max_length=50)
print(output)
```
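If you need more control than pipeline gives you (custom tokenization, generation parameters, device placement), you can load the tokenizer and model classes directly. This sketch assumes PyTorch is installed alongside transformers:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the gpt2 tokenizer and model weights (downloaded on first use)
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Tokenize a prompt and generate a short continuation
inputs = tokenizer("Hello, how are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```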
| Command | Description |
|---|---|
| pip install transformers | Install the transformers package. |
| pip install huggingface_hub | Install the huggingface_hub package. |
| huggingface-cli --help | List all available huggingface-cli commands. |
| huggingface-cli login | Log in to your Hugging Face account. |
| huggingface-cli download gpt2 | Download the gpt2 model. |
Now you’re ready to use Hugging Face models and tools in Python! 🚀