Taking control of the DisplayPort output using live feed settings

Hi all,

My setup:
PYNQ 2.4
Ultra96 V1 (Zynq UltraScale+)

I'm basically trying to use the Xilinx Test Pattern Generator (TPG) and Video Timing Controller (VTC) to show colour bars on the DisplayPort output of the device.

I followed Adam Taylor's recipe to do this, with some additions (like an adder HLS block, just to test AXI reads/writes).

I then followed the recipe for poking the TPG and VTC blocks according to:
https://forums.xilinx.com/t5/Design-and-Debug-Techniques-Blog/Video-Series-23-Generate-a-video-output-on-Pynq-Z2-HDMI-out/ba-p/932553#feedback-success

tpg = overlay.v_tpg_0
vtc = overlay.v_tc_0
tpg.write(0x10, 720)    # active height
tpg.write(0x18, 1280)   # active width
tpg.write(0x40, 0)      # colour format (0 = RGB)
tpg.write(0x20, 9)      # background pattern (9 = colour bars)
tpg.write(0x00, 0x81)   # control: ap_start | auto-restart
vtc.write(0x00, 0x01)   # ← this is the go signal

I set the DisplayPort output to the same resolution as that generated by the TPG & VTC, i.e. 1280x720. There's a 74.25 MHz pixel clock in the FPGA fabric to drive this.
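For reference, the 74.25 MHz figure falls out of the standard 720p60 timing: the total frame (active plus blanking) is 1650 × 750 pixels at 60 Hz. A quick sanity check:

```python
# Sanity check: standard CEA-861 720p60 timing gives the 74.25 MHz pixel clock.
h_active, v_active = 1280, 720
h_total, v_total = 1650, 750    # active + blanking for 720p60
fps = 60

pixel_clock_hz = h_total * v_total * fps
print(pixel_clock_hz / 1e6)     # 74.25
```

If the fabric clock doesn't match the timing the VTC is generating, the DP controller will reject or mangle the stream, so this is worth double-checking first.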

Here's what I have checked:

  • the blocks are bound in correctly
  • I can peek and poke the blocks (proving the memory mapping is correct)
  • double-checked the VTC and TPG register settings against the drivers and Xilinx documentation.
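One more thing worth verifying in the poke sequence above is the control word: in the standard HLS AXI-Lite register map, 0x81 is ap_start (bit 0) combined with auto-restart (bit 7), so the TPG keeps generating frames rather than stopping after one. A quick check of the bit encoding (field layout per the usual HLS control register; verify against your IP version):

```python
# HLS AXI-Lite control register (offset 0x00) bit fields:
AP_START     = 1 << 0   # start the core
AP_DONE      = 1 << 1   # (read-only) core finished a frame
AP_IDLE      = 1 << 2   # (read-only) core is idle
AUTO_RESTART = 1 << 7   # restart automatically after each frame

ctrl = AP_START | AUTO_RESTART
print(hex(ctrl))        # 0x81

# On the board you could read the register back to confirm the core started,
# e.g. something like:  assert tpg.read(0x00) & AP_START
```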

The issue is that the Chromium desktop remains. There is no sign of the colour bars.

Has anyone tried using the PL fabric to control the hard DisplayPort block of the Zynq UltraScale+ part?

Thanks,

Darth


Hi all,

Bringing this thread back to life, as I am trying to do the same thing: take this design provided by Adam Taylor and merge it with a previous design I have created (see: here).

The aim of my design is to create a video processing pipeline on the Kria KV260 (Zynq UltraScale+ MPSoC) that can capture video from a MIPI camera (Digilent Pcam 5C) and pass it through a PL processing pipeline that allows for low-latency, purely FPGA-based streaming, i.e. no storage of frames in DDR memory through the VDMA. To do this, I aim to deviate from the MIPI processing chain in the PYNQ base design (which writes frames to DDR via the VDMA, then reads them back and pushes them to the DisplayPort) and instead bypass the VDMA, as shown below:

From looking into the MIPI/DisplayPort example notebooks provided as part of the PYNQ base design, I can see the DisplayPort controller is configured via the underlying pynq/lib/video/drm.py file. But from what I can see, this file is tailored to the base design and hence cannot be used for live PL → DisplayPort streaming.

Ideally I would like to keep my software configuration in the PYNQ world, as with my previous MIPI work I ended up writing the camera driver in Python notebook form. Is it possible to configure the DisplayPort controller for live PL streaming out of the HDMI port of a Zynq UltraScale+ MPSoC from PYNQ?
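I haven't tried this on a KV260, but from my reading of the ZynqMP register reference (UG1087), the DisplayPort controller's AV_BUF block has an output-select register that chooses between live (PL) and non-live (memory) video. A minimal, unverified sketch of what poking it from PYNQ might look like — the base address, offset, and field encoding below are assumptions taken from UG1087 and must be checked before use:

```python
# ASSUMPTIONS (verify against UG1087 and your silicon revision):
#   DP controller base address on ZynqMP:      0xFD4A0000
#   AV_BUF_OUTPUT_AUDIO_VIDEO_SELECT offset:   0xB070
#   vid_stream1_sel field (bits [1:0]):        0 = live (PL) video input
DP_BASE = 0xFD4A0000
AV_BUF_OUTPUT_AUDIO_VIDEO_SELECT = 0xB070
VID_STREAM1_LIVE = 0b00          # select the PL live interface for stream 1

value = VID_STREAM1_LIVE         # leave the other select fields at 0 here
print(hex(DP_BASE + AV_BUF_OUTPUT_AUDIO_VIDEO_SELECT))   # 0xfd4ab070

# On the board (root required), something like:
#   from pynq import MMIO
#   dp = MMIO(DP_BASE, 0x10000)
#   dp.write(AV_BUF_OUTPUT_AUDIO_VIDEO_SELECT, value)
# Caveat: the Linux DP/DRM driver also programs this block, so the desktop
# display stack may need to be stopped (or patched) before poking it directly.
```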

Concerns:

  1. Adam Taylor's example describes using Vitis to modify the BSP, configuring the DisplayPort component on the arm_cortex_a53 to use the AVBUF driver. Does this bind me to loading the hardware and software designs through the Vitis environment?

  2. I find it strange that there are no examples of this in a PYNQ environment; all the designs I can find seem to rely on the VDMA for streaming, which doesn't allow pixel-by-pixel streaming to the HDMI output.
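On the latency point: the VDMA path buffers at least one full frame in DDR, which at 60 fps already costs on the order of a frame time before any processing happens. A rough back-of-envelope:

```python
# Rough latency cost of frame buffering through VDMA at 60 fps.
fps = 60
frame_time_ms = 1000 / fps          # ~16.7 ms per buffered frame
buffered_frames = 1                 # triple-buffered VDMA setups can hold more
added_latency_ms = buffered_frames * frame_time_ms
print(round(added_latency_ms, 1))   # 16.7
```

A purely streaming PL pipeline, by contrast, only adds line-buffer and pipeline-register delays, which is why bypassing the VDMA matters for this use case.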

Any help on this would be appreciated.
I believe live PL streaming out of the DisplayPort/HDMI outputs of a Zynq UltraScale+ MPSoC device would be beneficial for many others.

Regards,
Cameron