Hello, I am trying to run CIFAR-10 classification in real time on hardware (PYNQ-Z2): capture a frame from a USB webcam, classify it, and send the result to HDMI out (a monitor). For this, I took the PYNQ base overlay (which provides HDMI-in and HDMI-out) and integrated it with the CNV IP core (which contains the binary neural network and its weights), generating my own overlay with both CNV and HDMI using the Vivado IP Integrator. My IP block design is shown below:
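For reference, this is roughly how I load the generated overlay in the notebook. The file name `cnv_hdmi.bit` is just a placeholder for my bitstream, and the sanity check only lists the IPs so I can confirm that both the CNV core and the video pipeline blocks carried over from the base overlay are present:

```python
def load_cnv_hdmi_overlay(bit_path="cnv_hdmi.bit"):
    """Load the custom CNV + HDMI overlay on the PYNQ board.

    Runs on the board only; 'cnv_hdmi.bit' is a placeholder name for the
    bitstream generated from the block design above (the matching .hwh
    must sit next to it so PYNQ can parse the design).
    """
    from pynq import Overlay  # board-only import, so it stays inside the function
    overlay = Overlay(bit_path)
    # Sanity check: the CNV core and the base overlay's video blocks
    # should both appear in the IP dictionary.
    print(list(overlay.ip_dict.keys()))
    return overlay
```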
The block design successfully generated a bitstream, and I have uploaded the .bit, .tcl, and .hwh files to the Jupyter Notebook environment. Using this bitstream, I want to run the BNN classifier in hardware on each USB frame for real-time recognition. How should I classify the image captured from the USB webcam and then display the result on the HDMI-out monitor?
CNV IP core: BNN-PYNQ/bnn/src/network at master · Xilinx/BNN-PYNQ · GitHub