tritonserver-backend-tensorrtllm-24.04
Chainguard
Status
Impact
Upgrading transformers from 4.50.0 to the fixed version 4.53.0 causes the TensorRT-LLM backend build to fail due to dependency conflicts. The package has complex interdependencies with other Python ML libraries (torch, tensorrt_llm, etc.) that are tightly coupled to specific versions, so upgrading transformers in isolation breaks compatibility with the TensorRT-LLM requirements and fails the build. A fix requires upstream NVIDIA to update the entire TensorRT-LLM stack to support newer transformers versions in a compatible manner.
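The resolution failure described above can be sketched with a minimal version-constraint check. The specific pin below (tensorrt_llm requiring transformers in [4.50.0, 4.51.0)) is an assumption for illustration, not the actual TensorRT-LLM requirement:

```python
# Hypothetical sketch: why bumping transformers alone fails dependency
# resolution. The pins are illustrative assumptions, not real metadata.

def parse(version):
    """Turn a dotted version string into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split("."))

# Assumed pins held by dependents: {package: (lower_inclusive, upper_exclusive)}
pins = {
    "tensorrt_llm": (parse("4.50.0"), parse("4.51.0")),  # assumed pin
}

def satisfies(candidate, lower, upper):
    """True if candidate falls inside the half-open range [lower, upper)."""
    v = parse(candidate)
    return lower <= v < upper

candidate = "4.53.0"  # the fixed transformers version
conflicts = [pkg for pkg, (lo, hi) in pins.items()
             if not satisfies(candidate, lo, hi)]
print(conflicts)  # non-empty: at least one dependent rejects the upgrade
```

With the assumed pin, the candidate 4.53.0 lands outside the allowed range, so the resolver (here, the toy check) reports a conflict; pip's real resolver fails the install for the same structural reason.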