If you want to jump straight to the code, go here. Otherwise, keep reading…
In a previous blog post, I wrote about classifying images with the ResNet50v2 model from the ONNX Model Zoo. In that post, the container ran on a Kubernetes cluster with GPU nodes, each equipped with an NVIDIA V100 GPU. The actual classification was done with a simple Python script with help from Keras and NumPy. Each inference took around 25 milliseconds.
In this post, we will do two things:
- run the scoring container (CPU) on a local machine that runs Docker
- perform the scoring (classification) in Go
Installing the scoring container locally
I pushed the scoring container with the ONNX ResNet50v2 image to the following location: https://cloud.docker.com/u/gbaeke/repository/docker/gbaeke/onnxresnet50v2. Run the container with the following command:
docker run -d -p 5001:5001 gbaeke/onnxresnet50v2
The container will be pulled and started. The scoring URI is on http://localhost:5001/score.
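To check that the endpoint responds, you can POST a JSON payload to it directly. The command below is only an illustration: payload.json is a hypothetical file that would have to contain the full 1x3x224x224 tensor in a data field, which is exactly what the Go program later in this post builds for you.

curl -X POST -H "Content-Type: application/json" -d @payload.json http://localhost:5001/score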
Note that in the previous post, Azure Machine Learning deployed two containers: the scoring container (the one described above) and a front-end container. In that scenario, the front-end container handles the HTTP POST requests (optionally with SSL) and routes them to the actual scoring container.
The scoring container accepts the same payload as the front-end container. That means it can be used on its own, as we are doing now.
Note that you can also use IoT Edge, as explained in an earlier post. That post shows how easy it is to push AI models to the edge and use them locally when that fits your business case.
Scoring with Go
To actually classify images, I wrote a small Go program. Although there are some scientific libraries for Go, they are not really needed in this case. That does mean we have to create the 4D tensor payload and interpret the softmax result manually, but if you check the code, you will see that this is not terribly difficult.
The code can be found in the following GitHub repository: https://github.com/gbaeke/resnet-score.
Remember that this model expects the input as a 4D tensor with the following dimensions:
- dimension 0: batch (we only send one image here)
- dimension 1: channels (one each for R, G and B)
- dimension 2: height
- dimension 3: width
The 4D tensor needs to be serialized to JSON in a field called data. We send that data with HTTP POST to the scoring URI at http://localhost:5001/score.
The response from the container will be JSON with two fields: a result field with the 1000 softmax values and a time field with the inference time. We can use the following two structs for marshaling the request and unmarshaling the response.
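The structs below are a sketch based on the JSON contract described above (a data field for the request, result and time fields for the response); the exact definitions in the repository may differ slightly.

```go
// InputData wraps the 4D tensor that goes in the "data" field of the request.
type InputData struct {
	Data [1][3][224][224]uint8 `json:"data"`
}

// OutputData holds the container's response: one row of 1000 softmax values
// per image in the batch, plus the inference time in seconds.
type OutputData struct {
	Result [][]float64 `json:"result"`
	Time   float64     `json:"time"`
}
```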

Note that this model expects pictures to be scaled to 224 by 224 pixels, as reflected by the height and width dimensions of the uint8 array. The rest of the code is summarized below; a condensed sketch of these steps follows the list:
- read the image; the path of the image is passed to the code via the -image command line parameter
- the image is resized with the github.com/disintegration/imaging package (linear method)
- the 4D tensor is populated by iterating over all pixels of the image, extracting r, g and b and placing them in the BCHW array; note that the r, g and b values are 16-bit values that are scaled down to fit in a uint8
- construct the input, which is a struct of type InputData
- marshal the InputData struct to JSON
- POST the JSON to the local scoring URI
- read the HTTP response and unmarshal it into a struct of type OutputData
- find the highest probability in the result and note the index where it was found
- read the 1000 ImageNet categories from imagenet_class_index.json and unmarshal the JSON into a map of string arrays
- print the category using the index with the highest probability and the map
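Here is a condensed sketch of those steps, using the InputData and OutputData structs shown earlier. It is not the exact program from the repository; the scoreURL constant, the array shapes and the error handling are assumptions, but the flow matches the description above.

```go
package main

import (
	"bytes"
	"encoding/json"
	"flag"
	"fmt"
	"log"
	"net/http"
	"os"
	"strconv"

	"github.com/disintegration/imaging"
)

const scoreURL = "http://localhost:5001/score" // local scoring URI

func main() {
	imgPath := flag.String("image", "", "path to the image to classify")
	flag.Parse()

	// read the image and resize it to 224x224 with the linear filter
	img, err := imaging.Open(*imgPath)
	if err != nil {
		log.Fatal(err)
	}
	resized := imaging.Resize(img, 224, 224, imaging.Linear)

	// populate the BCHW tensor; RGBA() returns 16-bit values, so shift them down to uint8
	var input InputData
	for y := 0; y < 224; y++ {
		for x := 0; x < 224; x++ {
			r, g, b, _ := resized.At(x, y).RGBA()
			input.Data[0][0][y][x] = uint8(r >> 8)
			input.Data[0][1][y][x] = uint8(g >> 8)
			input.Data[0][2][y][x] = uint8(b >> 8)
		}
	}

	// marshal the payload and POST it to the scoring URI
	payload, err := json.Marshal(input)
	if err != nil {
		log.Fatal(err)
	}
	resp, err := http.Post(scoreURL, "application/json", bytes.NewReader(payload))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// unmarshal the response into OutputData
	var output OutputData
	if err := json.NewDecoder(resp.Body).Decode(&output); err != nil {
		log.Fatal(err)
	}

	// find the highest softmax value and the index where it occurs
	maxIndex, maxProb := 0, output.Result[0][0]
	for i, p := range output.Result[0] {
		if p > maxProb {
			maxIndex, maxProb = i, p
		}
	}
	fmt.Printf("Highest prob is %v at %v (inference time: %v)\n", maxProb, maxIndex, output.Time)

	// map the index to an ImageNet category via imagenet_class_index.json
	classFile, err := os.ReadFile("imagenet_class_index.json")
	if err != nil {
		log.Fatal(err)
	}
	var classes map[string][]string
	if err := json.Unmarshal(classFile, &classes); err != nil {
		log.Fatal(err)
	}
	fmt.Println("Probably", classes[strconv.Itoa(maxIndex)])
}
```

The r >> 8 shift is one way to scale the 16-bit channel values returned by RGBA() down to the uint8 range the tensor expects.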
What happens when we score the image below?

Running the code gives the following result:
$ ./class -image images/cassette.jpg
Highest prob is 0.9981583952903748 at 481 (inference time: 0.3309464454650879 )
Probably [n02978881 cassette]
The inference time is 1/3 of a second on my older Linux laptop with a dual-core i7.
Try it yourself by running the container and the class program. Download it from here (Linux).