I don’t think there is a problem on your end. There was a major outage a few days ago, and it seems that the Inference API for that model has been turned off since then.
John6666