Chainguard Container for tritonserver-pytorch-backend

The Triton backend for the PyTorch TorchScript models.

Chainguard Containers are regularly-updated, secure-by-default container images.

Download this Container Image

For those with access, this container image is available on cgr.dev:

docker pull cgr.dev/ORGANIZATION/tritonserver-pytorch-backend:latest

Be sure to replace the ORGANIZATION placeholder with the name used for your organization's private repository within the Chainguard Registry.

Compatibility Notes

The Chainguard tritonserver-pytorch-backend image is functionally comparable to the official NVIDIA nvcr.io/nvidia/tritonserver:*-pyt-python-py3 image. It contains the PyTorch backend (libtriton_pytorch.so) and supporting libraries such as TorchVision and Torch-TensorRT.

Getting Started

The following instructions serve a TorchScript-exported ResNet-50 model using the Triton Inference Server with the PyTorch backend.

Step 1: Export a TorchScript model using NVIDIA’s PyTorch container

MODEL_PATH=$(pwd)/model_repository/resnet50/1
mkdir -p "$MODEL_PATH"

docker run --rm -v "$MODEL_PATH:/out" nvcr.io/nvidia/pytorch:25.04-py3 \
  python3 -c '
import torch
import torchvision.models as models
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.eval()
torch.jit.script(model).save("/out/model.pt")'

Step 2: Create the model configuration file

Triton expects each model to live in its own directory inside the model repository, with config.pbtxt placed alongside the numbered version directory:

cat > model_repository/resnet50/config.pbtxt <<EOF
name: "resnet50"
platform: "pytorch_libtorch"
max_batch_size: 1
input [
  {
    name: "input"
    data_type: TYPE_FP32
    format: FORMAT_NCHW
    dims: [3, 224, 224]
  }
]
output [
  {
    name: "output"
    data_type: TYPE_FP32
    dims: [1000]
  }
]
EOF
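In this configuration, max_batch_size: 1 makes Triton prepend a batch dimension to the declared dims, so a single-sample request carries a tensor of shape [1, 3, 224, 224]. A small sketch in plain Python (variable names are illustrative) showing how the element count used for the dummy input in the later inference step follows from these values:

```python
from math import prod

# Values taken from config.pbtxt above
max_batch_size = 1
dims = [3, 224, 224]  # per-sample input dims (FORMAT_NCHW)

# With max_batch_size > 0, Triton adds a leading batch dimension,
# so a single-sample request has this shape:
request_shape = [1] + dims
num_elements = prod(request_shape)

print(request_shape)  # [1, 3, 224, 224]
print(num_elements)   # 150528 -- length of the "data" list in the request
```

This is why the dummy request later fills its "data" field with 1 * 3 * 224 * 224 values.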

Step 3: Run Triton Inference Server using the Chainguard PyTorch backend image

docker run --rm --name triton -v "$(pwd)/model_repository:/models" -p8000:8000 -p8001:8001 -p8002:8002 cgr.dev/ORGANIZATION/tritonserver-pytorch-backend --model-repository=/models

Step 4: Run inference with dummy input

Once the server reports that the model is ready, create the inference request JSON:

python3 -c '
import json
with open("infer_input.json", "w") as f:
  json.dump({
    "inputs": [{
      "name": "input",
      "shape": [1, 3, 224, 224],
      "datatype": "FP32",
      "data": [0.0] * (1 * 3 * 224 * 224)
    }],
    "outputs": [{"name": "output"}]
  }, f)
'

Send the request:

curl -sf localhost:8000/v2/models/resnet50/infer \
  -H "Content-Type: application/json" \
  -d @infer_input.json | jq

You should see a JSON response with output logits of shape [1, 1000] indicating successful inference!
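Triton answers using the KServe v2 inference protocol, in which each entry in "outputs" carries a shape and a flat "data" list. A minimal sketch of post-processing such a response in plain Python (the response below is a synthetic stand-in, not real server output):

```python
# Synthetic stand-in for a Triton KServe-v2 response body; a real
# response would carry 1000 model-produced logits under "data".
response = {
    "model_name": "resnet50",
    "outputs": [{
        "name": "output",
        "datatype": "FP32",
        "shape": [1, 1000],
        "data": [0.0] * 1000,
    }],
}
response["outputs"][0]["data"][282] = 9.5  # pretend class 282 scored highest

# The predicted class is the index of the largest logit.
logits = response["outputs"][0]["data"]
predicted = max(range(len(logits)), key=logits.__getitem__)
print(predicted)  # 282 for this synthetic response
```

With a real image (rather than the all-zeros dummy input) the index of the largest logit maps to an ImageNet class label.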

What are Chainguard Containers?

Chainguard Containers are minimal container images that are secure by default.

In many cases, the Chainguard Containers tagged as :latest contain only an open-source application and its runtime dependencies. These minimal container images typically do not contain a shell or package manager. Chainguard Containers are built with Wolfi, our Linux undistro designed to produce container images that meet the requirements of a more secure software supply chain.

For cases where you need container images with shells and package managers to build or debug, most Chainguard Containers come paired with a -dev variant.

Although the -dev container image variants have security features similar to those of their more minimal counterparts, they include additional software that is typically unnecessary in production environments. We recommend using multi-stage builds to leverage the -dev variants: build with the -dev image, then copy application artifacts into a final minimal container that offers a reduced attack surface and does not allow package installations or logins.

Learn More

To better understand how to work with Chainguard Containers, please visit Chainguard Academy and Chainguard Courses.

In addition to Containers, Chainguard offers VMs and Libraries. Contact Chainguard to access additional products.

Trademarks

This software listing is packaged by Chainguard. The trademarks set forth in this offering are owned by their respective companies, and use of them does not imply any affiliation, sponsorship, or endorsement by such companies.

Licenses

Chainguard container images contain software packages that are direct or transitive dependencies. The following licenses were found in the "latest" tag of this image:

  • Apache-2.0

  • BSD-2-Clause

  • BSD-3-Clause

  • GCC-exception-3.1

  • GPL-2.0-only

  • GPL-2.0-or-later

  • GPL-3.0-or-later

For a complete list of licenses, please refer to this image's SBOM.

Software license agreement

Compliance

A FIPS-validated version of this image is available for FedRAMP compliance; a STIG is included with the FIPS image.


© 2025 Chainguard. All Rights Reserved.