I'm working on a little project to make a convolution core, and I was surprised to find that hls_video.h and hls_opencv.h have been deprecated, which makes the tutorials I was following much less useful.
So now I am trying to implement the Conv2D with LineBuffer that is in the Vivado HLS examples (and also in this link).
Can anyone guide me a little? I want to pass an image to the program in the testbench. How can I do it without the OpenCV libraries?
After I export the generated design to Vivado, I get an "ap_ctrl" port, but Vivado doesn't let me connect a GPIO or anything else to it.
How can I control the core?
If I tie it to a Constant of "1", would that mean the core is always on?
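From what I've read, ap_ctrl is the block-level handshake (ap_start / ap_done / ap_idle), so I was also wondering whether I should just synthesize the core with that handshake removed, something like this (conv2d is only a placeholder for my top function, and I may be misreading the docs):

```cpp
// Placeholder top-level function for the convolution core.
// ap_ctrl_none removes the block-level ap_start/ap_done handshake,
// so the core runs whenever data arrives on its AXI-Stream inputs
// and there is no control port left to drive from the block design.
void conv2d(/* hls::stream arguments for src and dst */) {
#pragma HLS INTERFACE ap_ctrl_none port=return
    // ... line buffer + 2D convolution body ...
}
```

Would that be preferable to wiring a constant to ap_start?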
When I try to validate the design, Vivado tells me it lacks a TLAST signal, which I understood was important for AXI-Stream. Where can I get this signal from?
Is it one of DST_TVALID or DST_TREADY on the 11x11 filter core that I should wire to AXI2mm_tlast?
One last question: supposing I got a bitstream and then ran it on the PYNQ,
how would I be able to pass a test image to it?
At which address should I send it with Python?
I suspect it should be an address on the DMA that I should configure in the Address Editor in Vivado, but I am still a bit confused; the AXI documentation is super vast and I got lost in there many times.
Thank you for helping me!