Hello.
I have a monocular depth estimation model and have completed quantization and compilation. But on the AI-G it does not run inference; it just returns the input video. Could you check my commands and model?
Thank you.
Hello. This is TOPST.
Have you resolved the camera-related issue you previously asked about?
First, you converted the model to a type the toolkit does not recognize, but currently only classification and object detection models can be inferred by the toolkit and the inference app.
Slight modifications are required to use other types of models.
First, the output tensor is currently produced by custom_postproc.c, under the custom_postproc folder of the build_network directory, and by post_process.c in the compiled network.
Currently, these simply dump the raw output tensor, so it may appear that no inference or visualization is being performed. You need to analyze this tensor, apply post-processing, and then retrieve the resulting values in AI-G's tcnnapp for visualization.
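As a rough illustration of the post-processing step described above, a depth-estimation output tensor can be normalized to an 8-bit grayscale map before visualization. This is only a sketch: the function name, and the assumption that the output is a flat array of float depth values, are mine, not part of the tc-nn-toolkit or tcnnapp API.

```c
/* Hypothetical sketch: normalize a raw float depth tensor to 8-bit
 * grayscale so the drawing code can overlay it on the camera frame.
 * depth_to_gray() is an illustrative name, not an actual API. */
#include <stdint.h>
#include <float.h>

static void depth_to_gray(const float *depth, uint8_t *gray, int n)
{
    float mn = FLT_MAX, mx = -FLT_MAX;

    /* Find the min/max of the depth map for normalization. */
    for (int i = 0; i < n; i++) {
        if (depth[i] < mn) mn = depth[i];
        if (depth[i] > mx) mx = depth[i];
    }

    float range = (mx > mn) ? (mx - mn) : 1.0f;

    /* Scale each depth value into the 0..255 range. */
    for (int i = 0; i < n; i++)
        gray[i] = (uint8_t)(255.0f * (depth[i] - mn) / range);
}
```

The actual tensor layout (resolution, channel order, quantization scale) depends on your compiled network, so check the values dumped by custom_postproc.c first.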
In tcnnapp, you need to modify the postprocessing steps in NnAppMain.c, which we shared previously, when the model type is custom.
You also need to modify NnNeuralNetwork.c to determine which structure the values are stored in when the model type is custom.
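The NnNeuralNetwork.c change above amounts to deciding which structure holds the custom model's output. A minimal sketch of such a container is below; the struct and function names are purely illustrative and do not come from the tcnnapp sources.

```c
/* Hypothetical sketch: a container of the kind NnNeuralNetwork.c would
 * need to fill when the model type is custom. All names are illustrative. */
#include <stdlib.h>

typedef struct {
    int    width;   /* depth map width  */
    int    height;  /* depth map height */
    float *depth;   /* raw output tensor values, width*height floats */
} custom_output_t;

/* Allocate a zero-initialized output buffer for a w x h depth map. */
static custom_output_t *custom_output_alloc(int w, int h)
{
    custom_output_t *o = malloc(sizeof *o);
    if (!o)
        return NULL;
    o->width  = w;
    o->height = h;
    o->depth  = calloc((size_t)w * (size_t)h, sizeof *o->depth);
    if (!o->depth) {
        free(o);
        return NULL;
    }
    return o;
}

static void custom_output_free(custom_output_t *o)
{
    if (o) {
        free(o->depth);
        free(o);
    }
}
```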
A model conversion guide for custom types is currently being prepared.
Please understand that the materials are still insufficient. Please feel free to contact us with any further questions.
Thank you.
Hello.
So I need to modify custom_postproc.c in tc-nn-toolkit to convert, quantize, and compile the model from TFLite to the Enlight type.
After that, I modify the post-processing in NnAppMain.c and NnNeuralNetwork.c and then rebuild AI-G. However, I see two NPU_POST_CUSTOM cases in NnAppMain.c. Do I need to modify both of them?
One is in the NnDrawResult(app_context_t *pContext) function,
and one is in the NnOutputResultData(app_context_t *pContext, MessageHandle msgHandle) function.
Thank you.
Yes, you can modify both. I handled everything in the first function (NnDrawResult), so the NPU_POST_CUSTOM case in NnOutputResultData was left empty.
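The split described above can be sketched as two switch branches: do all the custom post-processing in the drawing path and leave the data-output path as a no-op. The enum values mirror the NPU_POST_CUSTOM case mentioned in the thread, but the function signatures and return values here are simplified illustrations, not the real tcnnapp code.

```c
/* Hypothetical sketch: where each NPU_POST_CUSTOM branch does its work.
 * Simplified signatures; the real functions take app_context_t, etc. */
#include <stddef.h>

typedef enum {
    NPU_POST_CLASSIFY,
    NPU_POST_DETECT,
    NPU_POST_CUSTOM
} npu_post_t;

/* Drawing path (NnDrawResult): handle the custom tensor here. */
static int draw_result(npu_post_t type)
{
    switch (type) {
    case NPU_POST_CUSTOM:
        /* parse output tensor, post-process, build visualization ... */
        return 1;   /* handled */
    default:
        return 0;
    }
}

/* Data-output path (NnOutputResultData): safe to leave custom empty. */
static int output_result_data(npu_post_t type)
{
    switch (type) {
    case NPU_POST_CUSTOM:
        /* intentionally empty: everything is done in draw_result() */
        return 0;
    default:
        return 0;
    }
}
```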
Hello.
I’m working on another Linux server that has never had the AI-G firmware built. I ran ‘bitbake -c compile -f tc-nn-app’ instead of ‘bitbake telechips-topst-ai-image’ at step 3.6. Then I modified the post-processing in NnAppMain.c and NnNeuralNetwork.c.
After that, how do I rebuild AI-G/tcnnapp? Do I just follow section 3.7 in the TOPST guide?
Thank you.
Hello,
After making the modifications, run ‘bitbake -c compile -f tc-nn-app’ again.
Then, as shown in the picture below, navigate to tcnnapp’s build directory, where you should see the built tcnnapp binary.
[Image: tcnnapp build directory listing showing the built tcnnapp binary]
After transferring the app to the board using the scp command, set the permission with “chmod 755 tcnnapp” and run it with “./tcnnapp”.
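Putting the steps above together, the rebuild-and-deploy cycle looks roughly like the following. This assumes the Yocto build environment from the TOPST guide is already set up; the board IP address and target path are placeholders, so substitute your own.

```shell
# On the build server: rebuild only the app recipe after editing
# NnAppMain.c / NnNeuralNetwork.c
bitbake -c compile -f tc-nn-app

# Copy the rebuilt binary from tcnnapp's build directory to the board
# (192.168.0.10 and /home/root are example values)
scp tcnnapp root@192.168.0.10:/home/root/

# On the board: make the app executable and run it
chmod 755 tcnnapp
./tcnnapp
```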
Thank you.