Sending frames from the PYNQ-Z2 PS to the FPGA PL

I want to send video camera frames from a PYNQ-Z2 Jupyter notebook to the FPGA PL side and vice versa. How do I create a video IP in the PL and process frames from it in the PS? Any kind of suggestion is welcome.
Thanks


Hi @dpk1,

Did you check the composable overlay?

Mario

Hello,
Here is some code I put together from some references (noted in the code) that sends 800x600 frames from my USB webcam to the PL, where they are turned into video frames and sent out the HDMI output. This runs on my ZCU104.

# https://discuss.pynq.io/t/webcam-video-to-hdmi-output/3101
# https://discuss.pynq.io/t/how-to-stream-a-video-from-webcam-to-hdmi-out-using-only-base-overlay-of-pynq-z1/816



from pynq.overlays.base import BaseOverlay
from pynq.lib.video import *
import time
import cv2

base = BaseOverlay("base.bit")

#camera (input) configuration

frame_in_w = 800
frame_in_h = 600


# configure the HDMI output to match the camera resolution
mode = VideoMode(frame_in_w, frame_in_h, 24)
hdmi_out = base.video.hdmi_out
hdmi_out.configure(mode, PIXEL_BGR)
hdmi_out.start()

# monitor (output) frame buffer size (not used in this example)
frame_out_w = 1920
frame_out_h = 1080

# used to record the time when we processed last frame
prev_frame_time = 0
 
# used to record the time at which we processed current frame
new_frame_time = 0

# initialize camera from OpenCV

cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, frame_in_w)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, frame_in_h)
print("Capture device is open: " + str(cap.isOpened()))


# Capture webcam video; interrupt the kernel to stop
# so the cleanup below runs

try:
    while True:
        ret, frame = cap.read()
        if ret:
            outframe = hdmi_out.newframe()
            # copy the camera frame into the PL frame buffer in place
            outframe[:] = frame
            hdmi_out.writeframe(outframe)
            new_frame_time = time.time()
            fps = int(1 / (new_frame_time - prev_frame_time))
            prev_frame_time = new_frame_time
            print("Frames per second : {0}".format(fps))
except KeyboardInterrupt:
    pass

# print the camera fps setting while the device is still open
print(cap.get(cv2.CAP_PROP_FPS))

cap.release()
hdmi_out.close()
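A note on the `outframe[:] = frame` line: `hdmi_out.newframe()` returns a buffer in memory the PL can reach, so the pixels must be copied into it in place; rebinding the name instead would leave that buffer untouched and the PL would keep showing stale data. A minimal sketch of the difference, with a plain NumPy array standing in for the real PYNQ frame buffer (that substitution is an assumption of the sketch):

```python
import numpy as np

# Stand-in for the buffer hdmi_out.newframe() returns; on real hardware this
# is a frame in PL-accessible memory, here an ndarray plays that role.
outframe = np.zeros((600, 800, 3), dtype=np.uint8)
frame = np.full((600, 800, 3), 128, dtype=np.uint8)  # fake camera frame

buffer_before = outframe  # keep a handle to the original allocation

outframe[:] = frame       # in-place copy: pixels land in the PL buffer

# The buffer object is unchanged; only its contents were overwritten.
assert outframe is buffer_before
assert (outframe == 128).all()

# Rebinding instead (outframe = frame) would only move the Python name and
# never touch the PL buffer's contents.
```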

Good luck,
John

Is this only for an HDMI-in camera? Is pynq.lib.video the way to call the video IP that is built in the PL from the PS? I want to send frames from any type of camera to the PL.
Thanks

That code is using the OpenCV capture device (i.e. a USB webcam), not HDMI-IN.

This code would copy HDMI-IN to HDMI-OUT (without using the PS)

#https://stackoverflow.com/questions/59578954/pynq-z1-how-to-stream-a-video-from-hdmi-input-to-hdmi-output-using-base-overla



from pynq.overlays.base import BaseOverlay
from pynq.lib.video import *
import time

base = BaseOverlay("base.bit")


hdmi_in = base.video.hdmi_in
hdmi_out = base.video.hdmi_out
hdmi_in.configure()
hdmi_out.configure(hdmi_in.mode)
hdmi_in.start()
hdmi_out.start()

     
while True:
    try:
        frame = hdmi_in.readframe()
        hdmi_out.writeframe(frame)
        time.sleep(0.016)
    except KeyboardInterrupt:
        break

hdmi_in.close()
hdmi_out.close()
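The passthrough loop can also do PS-side work on each frame before it goes out. A minimal sketch of such a processing step, with a NumPy array standing in for what `hdmi_in.readframe()` returns on real hardware (the 1280x720 size and the inversion filter are just assumptions for illustration):

```python
import numpy as np

# Stand-in for a frame from hdmi_in.readframe(); on hardware this is a
# PL-backed buffer, here a plain ndarray plays that role.
frame = np.full((720, 1280, 3), 10, dtype=np.uint8)

# Example PS-side step: invert the image in place, so the very same buffer
# can then be handed to hdmi_out.writeframe(frame) without an extra copy.
np.subtract(255, frame, out=frame)
```

On the board this step would sit between `readframe()` and `writeframe()` inside the loop; as I understand it, `writeframe` takes ownership of the buffer, which is why processing in place before handing it over is convenient.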


Thank you for your response. Just to confirm:
Is the "from pynq.lib.video import *" library used to transfer frames between the PS and the PL?
Also, the video pipeline IP is built on the PL side, and by using pynq.lib we are accessing that IP from the PS Jupyter notebook. Is that so?
Does this method work for all types of cameras, like USB monocular, pyrealsense, etc.?
Thanks

Any support on how to create an IP for a video camera in Vivado and how to use that IP from the PS in a Jupyter notebook?
Thanks

Hi, I am having an issue while installing the composable overlay package. When I run
“pip install PYNQ_Composable_Pipeline/”, pynq-2.7.0.tar.gz starts installing and it gives me the error “ERROR: Package ‘pynq-composable’ requires a different Python: 3.6.5 not in ‘>=3.8.0’”. I am using a PYNQ-Z2 with the image pynq_z2_v2.6.0. Any solution for how I can install it on my board?

Hi @dpk1,

The latest version of the composable overlay depends on PYNQ 2.7. You will need to burn an SD card with PYNQ 2.7.

Mario

Thanks, PYNQ 2.7 worked.
I want to confirm one thing: does this composable overlay use OpenCV for accessing video frames, or is the video IP built on the PL and accessed from the PS?

Hi,

OpenCV is only used to capture frames from a webcam or a file in the PS; the frames are then quickly moved to a PynqBuffer.
All of the vision functions implemented on hardware are accelerated versions of their corresponding OpenCV counterparts.
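To make that handoff concrete, here is a minimal sketch: on a board the destination buffer would come from `pynq.allocate(shape=(600, 800, 3), dtype=np.uint8)`, which returns contiguous memory the PL can DMA from; here `np.zeros` stands in for it, a synthetic array stands in for a decoded cv2 frame, and the helper name `to_pl_buffer` is made up for illustration (all three are assumptions of the sketch):

```python
import numpy as np

def to_pl_buffer(frame, pl_buffer):
    """Copy a PS-side frame into the PL-visible buffer in one step."""
    # Single in-place copy; on real hardware you would typically follow
    # this with pl_buffer.flush() so the PL sees the new data.
    pl_buffer[:] = frame
    return pl_buffer

# np.zeros stands in for pynq.allocate(...); np.full stands in for cap.read()
pl_buffer = np.zeros((600, 800, 3), dtype=np.uint8)
cam_frame = np.full((600, 800, 3), 64, dtype=np.uint8)

out = to_pl_buffer(cam_frame, pl_buffer)
assert out is pl_buffer and (out == 64).all()
```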

Mario


Thanks,

Where can I find the hardware-accelerated IP so that I can modify it or use it as a reference for my work?
Also, I want to send video frames from a USB camera. Would this composable pipeline work for a USB camera?

Between the documentation page and the repo, your questions should be answered.
The vision IPs come from the Vitis accelerated libraries.

Mario
