Chainguard Container for hailo-ai-onnxruntime

Container image with ONNX Runtime, HailoRT, and the Hailo Execution Provider for hardware-accelerated ML inference

Chainguard Containers are regularly updated, secure-by-default container images.

Download this Container Image

For those with access, this container image is available on cgr.dev:

docker pull cgr.dev/ORGANIZATION/hailo-ai-onnxruntime:latest

Be sure to replace the ORGANIZATION placeholder with the name used for your organization's private repository within the Chainguard Registry.

What's Included

This image contains:

  • HailoRT: Open source runtime library for Hailo AI accelerator devices
  • ONNX Runtime: ML inference engine with C++ libraries and headers
  • Hailo Execution Provider: Integration enabling ONNX Runtime to use Hailo accelerators
  • Python 3.12 Bindings: Full Python API for ONNX Runtime with Hailo support
  • Development Tools: Build tools for compiling C++ applications (bash, build-base)
  • HailortCLI: Command-line tool for managing and querying Hailo devices

Usage

Available Execution Providers

This image includes both CPUExecutionProvider and HailoExecutionProvider. You can verify the providers are available:

docker run --rm cgr.dev/ORGANIZATION/hailo-ai-onnxruntime:latest -lc "python3 -c 'import onnxruntime as ort; print(ort.get_available_providers())'"

This will output: ['HailoExecutionProvider', 'CPUExecutionProvider']

Python Example

Run ONNX model inference using Python:

import onnxruntime as ort
import numpy as np

# Create inference session with Hailo provider
session = ort.InferenceSession(
    "model.onnx",
    providers=["HailoExecutionProvider", "CPUExecutionProvider"]
)

# Prepare input data
input_name = session.get_inputs()[0].name
input_data = np.random.randn(1, 3, 224, 224).astype(np.float32)

# Run inference
outputs = session.run(None, {input_name: input_data})
print("Inference completed:", outputs[0].shape)

C++ Example

Compile and link against ONNX Runtime C++ API:

#include <iostream>
#include "core/session/onnxruntime_cxx_api.h"

int main() {
    // List available providers
    for (auto& provider : Ort::GetAvailableProviders()) {
        std::cout << provider << std::endl;
    }

    // Get ONNX Runtime version
    std::cout << OrtGetApiBase()->GetVersionString() << std::endl;
    return 0;
}

Build the application:

docker run --rm -v "$PWD:/work" -w /work \
  cgr.dev/ORGANIZATION/hailo-ai-onnxruntime:latest -lc \
  "c++ -std=c++17 app.cpp -o app \
   \$(pkgconf --cflags libonnxruntime) \
   \$(pkgconf --libs libonnxruntime)"

Note the escaped \$( ) substitutions: they ensure pkgconf runs inside the container, where the ONNX Runtime headers and libraries live, rather than in the host shell.

Using HailortCLI

The image includes the hailortcli tool for device management:

docker run --rm cgr.dev/ORGANIZATION/hailo-ai-onnxruntime:latest -lc "hailortcli --version"

Hardware Requirements

  • For CPU-only inference: No special hardware required
  • For accelerated inference: Hailo AI accelerator device (e.g., Hailo-8, Hailo-15) with appropriate drivers installed on the host

When running with Hailo hardware, you may need to pass device access to the container:

docker run --rm --device=/dev/hailo0 \
  cgr.dev/ORGANIZATION/hailo-ai-onnxruntime:latest \
  -lc "hailortcli scan"

Environment Variables

  • HAILORT_LOGGER_PATH: Path for HailoRT log files (default: /var/log/hailort)
  • ONNXRUNTIME_LOG_LEVEL: Set ONNX Runtime logging verbosity (0-4, where 0 is most verbose)
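Both variables can be set at container start with docker run -e. The following is a sketch; the host log directory and the verbosity level chosen here are illustrative:

```shell
# Raise ONNX Runtime verbosity and collect HailoRT logs in a
# mounted host directory (paths are illustrative).
docker run --rm \
  -e ONNXRUNTIME_LOG_LEVEL=1 \
  -e HAILORT_LOGGER_PATH=/tmp/hailort-logs \
  -v "$PWD/logs:/tmp/hailort-logs" \
  cgr.dev/ORGANIZATION/hailo-ai-onnxruntime:latest \
  -lc "python3 -c 'import onnxruntime as ort; print(ort.get_available_providers())'"
```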

Resources

  • Hailo AI GitHub - HailoRT
  • Hailo AI ONNX Runtime Fork
  • ONNX Runtime Documentation
  • ONNX Model Zoo

What are Chainguard Containers?

Chainguard's free tier of Starter container images is built with Wolfi, our minimal Linux undistro.

All other Chainguard Containers are built with Chainguard OS, Chainguard's minimal Linux operating system designed to produce container images that meet the requirements of a more secure software supply chain.


For cases where you need container images with shells and package managers to build or debug, most Chainguard Containers come paired with a development, or -dev, variant.

In all other cases, including Chainguard Containers tagged as :latest or with a specific version number, the container images include only an open-source application and its runtime dependencies. These minimal container images typically do not contain a shell or package manager.

Although the -dev container image variants have similar security features as their more minimal versions, they include additional software that is typically not necessary in production environments. We recommend using multi-stage builds to copy artifacts from the -dev variant into a more minimal production image.
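As a minimal sketch of that recommendation, the following Dockerfile compiles the C++ example from above in a -dev build stage and copies only the resulting binary into the minimal runtime image. It assumes a -dev variant is available at cgr.dev/ORGANIZATION/hailo-ai-onnxruntime:latest-dev; adjust the tag to match your organization's repository.

```Dockerfile
# Build stage: the -dev variant includes a shell and build tools
FROM cgr.dev/ORGANIZATION/hailo-ai-onnxruntime:latest-dev AS build
WORKDIR /work
COPY app.cpp .
RUN c++ -std=c++17 app.cpp -o app \
      $(pkgconf --cflags libonnxruntime) \
      $(pkgconf --libs libonnxruntime)

# Runtime stage: copy only the built artifact into the minimal image
FROM cgr.dev/ORGANIZATION/hailo-ai-onnxruntime:latest
COPY --from=build /work/app /usr/local/bin/app
```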

Need additional packages?

To improve security, Chainguard Containers include only essential dependencies. Need more packages? Chainguard customers can use Custom Assembly to add packages through the Console, chainctl, or the API.

To use Custom Assembly in the Chainguard Console: navigate to the image you'd like to customize in your Organization's list of images, and click on the Customize image button at the top of the page.

Learn More

Refer to our Chainguard Containers documentation on Chainguard Academy. Chainguard also offers VMs and Libraries; contact us for access.

Trademarks

This software listing is packaged by Chainguard. The trademarks set forth in this offering are owned by their respective companies, and use of them does not imply any affiliation, sponsorship, or endorsement by such companies.

Licenses

Chainguard's container images contain software packages that are direct or transitive dependencies. The following licenses were found in the "latest" tag of this image:

  • Apache-2.0
  • BSD-1-Clause
  • BSD-2-Clause
  • BSD-3-Clause
  • BSD-4-Clause-UC
  • CC-BY-4.0
  • CC-PDDC

For a complete list of licenses, please refer to this Image's SBOM.


Category: application
