Assistance with 401 Unauthorized Error for API Access

I am currently encountering an issue while attempting to use the Hugging Face Inference API with access tokens. Despite following all documentation and testing multiple configurations, I am consistently receiving a 401 Unauthorized error when calling the /models endpoint.

Here is a summary of the steps I have taken to troubleshoot this issue:

  1. I generated both read-only and fine-grained tokens from my account settings.
  2. I ensured the tokens include permissions for:
  • Making calls to the serverless Inference API.
  • Making calls to Inference Endpoints.
  3. I verified that my network settings allow the API to be reached, as confirmed by successful TLS handshakes.
  4. I tested the tokens using both curl and Python, specifically targeting:
  • https://huggingface.co/proxy/api-inference.huggingface.co/models
  • https://huggingface.co/proxy/api-inference.huggingface.co/models/distilbert-base-uncased
  5. All attempts resulted in the same 401 Unauthorized error, despite using valid tokens.
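One failure mode worth ruling out before any network debugging is the token string itself: smart quotes, trailing whitespace, or a truncated copy picked up while pasting can silently corrupt the Authorization header. A minimal sanity check in Python (the `hf_` alphanumeric pattern is an assumption based on the tokens shown in this thread):

```python
import re

def build_auth_header(token: str) -> dict:
    """Build the Authorization header for the HF API.

    Strips surrounding whitespace and rejects tokens that do not
    match the expected hf_<alphanumeric> shape, so a bad copy-paste
    fails loudly here instead of as an opaque 401 later.
    """
    token = token.strip()
    if not re.fullmatch(r"hf_[A-Za-z0-9]+", token):
        raise ValueError("token does not look like a valid hf_ token")
    return {"Authorization": f"Bearer {token}"}
```

For example, `build_auth_header(" hf_abc123 ")` returns a clean header, while a token containing a stray space raises immediately.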

Based on the error and my troubleshooting, I suspect the issue might be related to the limitations of a free-tier account. Could you kindly confirm:

  1. Whether a paid subscription is required to access the /models endpoint or specific API features?
  2. If there are other configurations or settings I might be missing to enable access with a free-tier account?

I greatly appreciate your guidance and support in resolving this issue. Please let me know if additional details are needed.

Thank you in advance for your help.

Best regards,
Miguel Crespo

That’s strange, it’s working fine over here. Try this.
https://huggingface.co/proxy/api-inference.huggingface.co/models/distilbert/distilbert-base-uncased

Thank you John,

To be more specific, I’m getting a 401 error at my end when I try to use a token that I had created. Please see curl verbose output:

curl -v -H "Authorization: Bearer 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" https://huggingface.co/proxy/api-inference.huggingface.co/models

< HTTP/2 401
< date: Sun, 01 Dec 2024 02:44:17 GMT
< content-length: 0
< vary: Origin, Access-Control-Request-Method, Access-Control-Request-Headers
< x-request-id: -x55LlHujz9vzSw9J7-gp
< access-control-allow-credentials: true
<

Thank you!

Thanks for the details. The strange thing is that there shouldn’t be a problem even without specifying anything in particular, and from the message content it doesn’t look like an SSL error…

Adding a "Connection: close" header to your curl command should solve the problem:
curl -v -H "Connection: close" https://example.com
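The same request can be sketched with Python’s standard library if curl isn’t handy; the URL and token below are placeholders, and the request is only prepared here, not sent:

```python
from urllib.request import Request

# Prepare (but do not send) the request with the suggested header,
# so the exact headers that would go on the wire can be inspected.
req = Request(
    "https://api-inference.huggingface.co/models",
    headers={
        "Authorization": "Bearer hf_xxx",  # placeholder token
        "Connection": "close",
    },
)
print(req.get_header("Connection"))  # prints "close"
```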

Hi John, really appreciate you looking into this. I tried incorporating your suggestion, but unfortunately I get the same error: HTTP/2 400, invalid token.

I generated both read-only and fine-grained tokens from my account settings. But I always get invalid token at the HF end.

All seems correct, but somehow HF says the token is invalid.

Please see screenshot.

Thanks for your help!

@not-lain @meganariley Weird 401 problem.

Hi John,

Do you think this could be because I need to have a paid subscription? Do you manage to get tokens and then use them?

Thank you!

It’s true that there are advantages to the paid plan, but I don’t think that’s the issue here. If it were a free-tier limitation, I’d expect a different error rather than a 401…

Hi John,

I don’t know if this is possible, but would you be able to run this command in your environment and see if you still get the same error?

Not sure if I can ask this; if not, not to worry. I’ll try something else :)

curl -v -H "Authorization: Bearer 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" -H "Connection: close" https://huggingface.co/proxy/api-inference.huggingface.co/models

Thank you!

There’s nothing wrong on my end. I’ll redact parts of the output, but it generally looks like this (Windows CMD).
Incidentally, the result was exactly the same when I used my own token.

Host api-inference.huggingface.co:443 was resolved.

*   Trying 35.171.117.73:443...
* Connected to api-inference.huggingface.co (35.171.117.73) port 443
* schannel: disabled automatic use of client certificate
* ALPN: curl offers http/1.1
* using HTTP/1.x
> GET /models HTTP/1.1
> Host: api-inference.huggingface.co
> User-Agent: curl/8.9.1
> Accept: */*
> Authorization: Bearer hf_*******
> Connection: close
>
* Request completely sent off
* schannel: server close notification received (close_notify)
< HTTP/1.1 401 Unauthorized
< Date: Tue, 03 Dec 2024 07:41:39 GMT
< Content-Length: 0
< Connection: close
< access-control-allow-credentials: true
< x-request-id: ********
< vary: Origin, Access-Control-Request-Method, Access-Control-Request-Headers
<
* shutting down connection #0
* schannel: shutting down SSL/TLS connection with api-inference.huggingface.co port 443

Hi John,

Thank you so much for your help, much appreciated. I will repost the issue again on the forum and see if someone from support can help me on this.

Have a great day!

Miguel.

I get the same 401 error when trying to connect to HF inference endpoints with any model, and I have a paid subscription.

danielbyrne@Daniels-MacBook-Pro FunctionalNetworksSFT % curl -s https://huggingface.co/api/whoami -H "Authorization: Bearer $HF_TOKEN"
{"error":"Invalid credentials in Authorization header"}

Oh, whoami is effectively deprecated. Try whoami-v2 instead.

curl -s https://huggingface.co/api/whoami-v2 -H "Authorization: Bearer $HF_TOKEN"
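For reference, the same whoami-v2 check sketched in Python. The request is built separately from the send so it can be inspected; treat the exact response fields (e.g. `name`) as an assumption, and pass in your real token to run it live:

```python
import json
from urllib.error import HTTPError
from urllib.request import Request, urlopen

def whoami_request(token: str) -> Request:
    """Build the GET request for /api/whoami-v2.

    A 401 here means the token itself is rejected, independent of
    any inference endpoint or model permissions.
    """
    return Request(
        "https://huggingface.co/api/whoami-v2",
        headers={"Authorization": f"Bearer {token}"},
    )

def check_token(token: str) -> None:
    """Send the request and report whether the token is accepted."""
    try:
        with urlopen(whoami_request(token)) as resp:
            print("token OK, logged in as:", json.load(resp).get("name"))
    except HTTPError as e:
        print(f"token rejected: HTTP {e.code}")

# check_token("hf_...")  # uncomment with a real token to hit the live API
```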

Hi @miguelcrespo, did you manage a breakthrough? Months later, I am facing the same issue!