Check out the preview of the 2nd version of this platform, developed by the open MLCommons taskforce on automation and reproducibility as a free, open-source and technology-agnostic on-prem platform.
program:image-classification-tensorrt-py (v3.0.0)
Copyright: See copyright in the source repository
License: See license in the source repository
Creation date: 2019-11-05
Source: GitHub
cID: b0ac08fe1d3c2615:b244e68887347d16

Don't hesitate to get in touch if you encounter any issues or would like to discuss this community project!

Description  

This portable workflow is our attempt to provide a common CLI with a Python JSON API and a JSON meta description. It automatically detects or installs the required components (models, datasets, libraries, frameworks and tools), and then builds, runs, validates, benchmarks and auto-tunes the associated program across diverse models, datasets, compilers, platforms and environments. Our ongoing goal is to make the onboarding process as simple as possible via this platform. Please check the CK white paper and don't hesitate to contact us if you have suggestions or feedback!
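As a rough illustration of the Python JSON API mentioned above, the sketch below builds a CK request dictionary for this program and invokes the framework only if it is installed. The request keys follow CK's JSON-in/JSON-out convention; treat the exact keys as an assumption based on the command line shown further down, not as authoritative documentation.

```python
# Hedged sketch of calling this program via the CK Python JSON API.
# Assumes the `ck` package (pip install ck); the request keys mirror
# `ck run program:image-classification-tensorrt-py --cmd_key=default`.
request = {
    "action": "run",
    "module_uoa": "program",
    "data_uoa": "image-classification-tensorrt-py",
    "cmd_key": "default",
}

def run_via_ck(req):
    """Invoke CK if available; CK replies are dicts with a 'return' code."""
    try:
        import ck.kernel as ck  # CK automation framework
    except ImportError:
        # CK is not installed in this environment
        return {"return": 1, "error": "CK is not installed (pip install ck)"}
    return ck.access(req)
```

A non-zero `return` code in the reply dictionary indicates an error, with details in the `error` field.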
  • Automation framework: CK
  • Development repository: ck-ml
  • Source: GitHub
  • Available command lines:
    • ck run program:image-classification-tensorrt-py --cmd_key=default (META)
  • Support for host OS: any
  • Support for target OS: any
  • Tags: image-classification,tensorrt,trt,standalone,lang-python
  • How to get the stable version via the client:
    pip install cbench
    cb download program:image-classification-tensorrt-py --version=3.0.0 --all
    ck run program:image-classification-tensorrt-py
  • How to get the development version:
    pip install ck
    ck pull repo:ck-ml
    ck run program:image-classification-tensorrt-py

  • CLI and Python API: module:program
  • Dependencies    

    ReadMe  

    Image Classification - TensorRT-Python program

    The instructions below have been tested on a Jetson TX1 board with JetPack 4.2.2 installed via the NVIDIA SDK Manager.

    Convert TF model to ONNX model

    When installing JetPack via the NVIDIA SDK Manager, tick the TensorFlow option. For JetPack 4.2.2, this installs TensorFlow 1.14.0.

    Detect TensorFlow

    $ ck detect soft:lib.tensorflow --full_path=/usr/local/lib/python3.6/dist-packages/tensorflow/__init__.py
    

    Install ONNX from source (with the ProtoBuf compiler dependency)

    $ ck install package --tags=lib,python-package,onnx,from-source
    

    Install TF-to-ONNX converter (of a known good version)

    $ ck install package --tags=lib,python-package,tf2onnx --force_version=1.5.1
    

    NB: Versions 1.5.2 and 1.5.3 can also be installed, but both fail to convert ResNet to ONNX on the TX1.

    Convert TF to ONNX

    $ ck install package --tags=model,resnet,onnx,converted-from-tf
    

    Convert ONNX to TensorRT

    When converting an ONNX model to TensorRT, you can select the numerical precision (fp32 or fp16) and the maximum batch size (currently 1 to 20).

    precision=fp32, max_batch_size=1

    $ ck install package --tags=model,resnet,tensorrt,converted-from-onnx
    

    precision=fp16, max_batch_size=1

    $ ck install package --tags=model,resnet,tensorrt,converted-from-onnx,fp16
    

    precision=fp32, max_batch_size=2

    $ ck install package --tags=model,resnet,tensorrt,converted-from-onnx,fp32,maxbatch.2
    

    precision=fp16, max_batch_size=2

    $ ck install package --tags=model,resnet,tensorrt,converted-from-onnx,fp16,maxbatch.2
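The four variants above differ only in their tag strings. As a convenience, a small helper can assemble the tags for a given precision and maximum batch size; the tag grammar here is inferred from the examples above, not taken from CK documentation.

```python
# Hedged helper: build the package tag string for a TensorRT model
# conversion variant, mirroring the four `ck install package` examples
# above (the tag naming scheme is an assumption based on them).
def tensorrt_model_tags(precision="fp32", max_batch_size=1):
    tags = ["model", "resnet", "tensorrt", "converted-from-onnx"]
    # The default variant (fp32, max batch 1) omits the explicit tags.
    if not (precision == "fp32" and max_batch_size == 1):
        tags.append(precision)                    # "fp32" or "fp16"
    if max_batch_size > 1:
        tags.append(f"maxbatch.{max_batch_size}") # e.g. "maxbatch.2"
    return ",".join(tags)
```

For example, `tensorrt_model_tags("fp16", 2)` reproduces the tag string of the last command above.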
    

