Description
We are trying to use Nexus as a Hugging Face proxy, but are experiencing issues pulling models from Hugging Face via vLLM (the same issue seems to appear here: gpustack/gpustack#1490).
The only logs I can see are as follows:
10.89.0.2 - - [24/Nov/2025:13:14:19 +0000] "HEAD /repository/huggingface/gpt2/resolve/main/config.json HTTP/1.0" 200 - 0 15 "unknown/None; hf_hub/0.36.0; python/3.12.12; torch/2.9.0+cu129; transformers/4.57.1; session_id/e7a16bda0bca41f8a94c3bf9b56d88b5; file_type/config; from_auto_class/False" [qtp1805909312-5148]
10.89.0.2 - - [24/Nov/2025:13:14:19 +0000] "GET /repository/huggingface/gpt2/resolve/main/config.json HTTP/1.0" 200 - 665 5 "unknown/None; hf_hub/0.36.0; python/3.12.12; torch/2.9.0+cu129; transformers/4.57.1; session_id/e7a16bda0bca41f8a94c3bf9b56d88b5; file_type/config; from_auto_class/False" [qtp1805909312-2534]
10.89.0.2 - - [24/Nov/2025:13:14:19 +0000] "HEAD /repository/huggingface/gpt2/resolve/main/config.json HTTP/1.0" 200 - 0 4 "unknown/None; hf_hub/0.36.0; python/3.12.12; torch/2.9.0+cu129; transformers/4.57.1; session_id/e7a16bda0bca41f8a94c3bf9b56d88b5; file_type/config; from_auto_class/False" [qtp1805909312-5148]
10.89.0.2 - - [24/Nov/2025:13:14:19 +0000] "HEAD /repository/huggingface/gpt2/resolve/main/config.json HTTP/1.0" 200 - 0 3 "unknown/None; hf_hub/0.36.0; python/3.12.12; torch/2.9.0+cu129; transformers/4.57.1; session_id/e7a16bda0bca41f8a94c3bf9b56d88b5; file_type/config; from_auto_class/True" [qtp1805909312-2534]
10.89.0.2 - - [24/Nov/2025:13:14:19 +0000] "GET /repository/huggingface/api/models/gpt2/tree/main?recursive=True&expand=False HTTP/1.0" 400 - 1433 3 "unknown/None; hf_hub/0.36.0; python/3.12.12; torch/2.9.0+cu129" [qtp1805909312-2368]
For reproduction, we are trying to run vLLM in Docker:
```bash
docker run --gpus all --rm -p 8001:8000 \
  -e HF_ENDPOINT="https://internalNexus.fqdn/repository/huggingface" \
  -v /home/local/vllm/certs/:/usr/local/share/ca-certificates \
  -e REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt \
  -e SSL_CERT_FILE=/etc/ssl/certs/ca-certificates.crt \
  --entrypoint bash internalNexus.fqdn:8091/vllm/vllm-openai:latest \
  -c "ls -l /usr/local/share/ca-certificates && update-ca-certificates && vllm serve gpt2"
```
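To narrow the failure down independently of vLLM, a minimal sketch that triggers the same recursive tree listing via huggingface_hub (assuming huggingface_hub is installed on the host and the certificates are trusted; the gpt2 repo and the Nexus endpoint are taken from above):

```bash
# Points huggingface_hub at the Nexus proxy and lists the repo tree recursively,
# which hits /api/models/gpt2/tree/main?recursive=True - the request that 400s above.
HF_ENDPOINT="https://internalNexus.fqdn/repository/huggingface" \
python3 -c "from huggingface_hub import HfApi; [print(e.path) for e in HfApi().list_repo_tree('gpt2', recursive=True)]"
```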