KV260: reading MIPI frame gets stuck on DMA, help

Hi, I am using a KV260 with Ubuntu 22.04 and PYNQ 3.0.

I rebuilt the base overlay with no errors. I copied the bit, hwh, tcl, and dtbo files into one folder and loaded the overlay. There are no errors when I load the overlay and initialize the MIPI pipeline, but when I try to read a frame it only works sometimes. When it does not work it gets stuck on frame = mipi.readframe(); when I interrupt the notebook I get the status below. Does anyone know how to solve this?

Thank you


KeyboardInterrupt Traceback (most recent call last)
Input In [6], in <cell line: 1>()
----> 1 frame = mipi.readframe()
2 PIL.Image.fromarray(frame[:,:,[2,1,0]])

File /usr/local/share/pynq-venv/lib/python3.10/site-packages/pynq/lib/video/pcam5c.py:153, in Pcam5C.readframe(self)
148 def readframe(self):
149 """Read a video frame
150
151 See AxiVDMA.S2MMChannel.readframe for details
152 """
---> 153 return self._vdma.readchannel.readframe()

File /usr/local/share/pynq-venv/lib/python3.10/site-packages/pynq/lib/video/dma.py:184, in AxiVDMA.S2MMChannel.readframe(self)
182 while self._mmio.read(0x34) & 0x1000 == 0:
183 loop = asyncio.get_event_loop()
---> 184 loop.run_until_complete(asyncio.ensure_future(self._interrupt.wait()))
185 pass
186 self._mmio.write(0x34, 0x1000)

File /usr/local/share/pynq-venv/lib/python3.10/site-packages/nest_asyncio.py:83, in _patch_loop.<locals>.run_until_complete(self, future)
81 f._log_destroy_pending = False
82 while not f.done():
---> 83 self._run_once()
84 if self._stopping:
85 break

File /usr/local/share/pynq-venv/lib/python3.10/site-packages/nest_asyncio.py:106, in _patch_loop.<locals>._run_once(self)
99 heappop(scheduled)
101 timeout = (
102 0 if ready or self._stopping
103 else min(max(
104 scheduled[0]._when - self.time(), 0), 86400) if scheduled
105 else None)
---> 106 event_list = self._selector.select(timeout)
107 self._process_events(event_list)
109 end_time = self.time() + self._clock_resolution

File /usr/lib/python3.10/selectors.py:469, in EpollSelector.select(self, timeout)
467 ready = []
468 try:
---> 469 fd_event_list = self._selector.poll(timeout, max_ev)
470 except InterruptedError:
471 return ready

KeyboardInterrupt:


Hi @ihsanalhafiz28,

You already have the same issue open, and I provided an answer there. This is most likely the camera.

Mario

I wonder whether it really is the connection, because sometimes, with the same connection, it works and sometimes it does not. It seems weird, but I don't know exactly what is happening. I also get zocl-drm axi:zyxclmm_drm: IRQ index 0 not found in the status over the serial connection; could it be that the interrupt from the CSI is not working?

Hi guys,

I recently encountered the same error as ihsanalhafiz28 when attempting to run the mipi_to_displayport.ipynb notebook (https://github.com/Xilinx/Kria-PYNQ/blob/main/kv260/base/notebooks/video/mipi_to_displayport.ipynb) with a Digilent Pcam 5C on a Kria KV260 (Ubuntu 22.04 and PYNQ 3.0).

After digging into the files and trying to debug, I settled for purchasing a new camera as a quick means of solving the problem; however, even a new camera straight out of the box presents the same error (shown below).

Note: the notebook hangs on the readframe call; the error is prompted by a keyboard interrupt. Also, the "im stuck here" text is the result of a print statement I added to the pcam5c.py file.

From what I can deduce, deeper in the dma.py file the code gets stuck in a loop waiting for bit 12 (mask 0x1000) of the 0x34 status register of the VDMA (v6.3) to go high, which, according to the AMD Adaptive Computing documentation, is the frame-count interrupt - i.e. it is waiting on an interrupt showing that a frame has arrived. Is this a correct assessment?
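If it helps anyone check this on their own board, the bit in question can be read directly from Python; a minimal sketch, assuming mipi is base.mipi and using the private _vdma attribute visible in the traceback:

vdma = mipi._vdma                          # private attribute, as seen in the traceback above
status = vdma.mmio.read(0x34)              # S2MM_DMASR register
print("S2MM_DMASR = 0x{:08x}".format(status))
print("halted (bit 0):            ", bool(status & 0x0001))
print("DMA internal error (bit 4):", bool(status & 0x0010))
print("frame-count IRQ (bit 12):  ", bool(status & 0x1000))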

I opened the base overlay design in Vivado 2020.2.2 to investigate the MIPI configuration and thought the issue may be related to the cam_gpio output of the GPIO_IP_reset IP block (see below for an image of the MIPI subsection).

From my understanding, following the trail of the PACKAGE_PIN assigned in the XDC (pin F11) through the KV260 board schematic, this output leads to RPI_ENABLE (pin 11) of the 15-pin FFC Raspberry Pi connector. Looking at the Digilent Pcam 5C reference manual, this pin is used by the camera for "PWUP", the power-up and enable sequence (see the "Power-up and Reset" section: Pcam 5C Reference Manual - Digilent Reference). This sequence follows a set of steps to ensure the enable signal can be driven high to the camera, allowing frames to be captured and transmitted across the MIPI lanes. From digging into the repo files, I found this C file, which appears to perform the same PWUP steps: https://github.com/Xilinx/PYNQ/blob/master/pynq/lib/_pynq/_pcam5c/pcam_5c.c
It is then compiled into a .so file used by the pcam5c.py file referenced before: https://github.com/Xilinx/PYNQ/blob/master/pynq/lib/video/pcam5c.py.

This is where I am stumped. Am I going down the right path? Is there something wrong with these files that is preventing a frame from being released by the camera for transfer to the board, or do you think it may be an issue with the VDMA and how the frame is being written to memory?

Is there any way I could debug the problem further?

Any support appreciated,
Thanks,
Cameron


Hi @cking,

It is unfortunate that you are having these issues. You are on the right track. You could try to write the driver in Python to be able to update it more quickly. You would also probably want to add ILAs to check whether there is data coming from the camera.

You can also check this post as reference https://www.adiuvoengineering.com/post/microzed-chronicles-kria-raspberry-pi-camera

Mario

Hi Mario,

Thanks for the reply. I also came across this post: Pcam not generating frames · Issue #27 · Xilinx/Kria-PYNQ · GitHub, where the user seemed to be in the same predicament as me.

I will try to follow Adam Taylor's design to make sure my camera works and can be used for streaming. Rewriting the driver will be quite the task as it is unfamiliar territory for me, but I may give it a go and report back.

Thanks,
Cameron

Hi Mario,

Unfortunately I don't have the expertise to poke any further into the driver design of the Pcam 5C for KV260 PYNQ. I do sense there is something inherently wrong under the surface; I'm just unsure where.

After reading Pcam not generating frames · Issue #27 · Xilinx/Kria-PYNQ · GitHub, I have realised I am in the same boat. The camera can only be seen at address 0x3c after mipi.start() is entered from the top level in Python.

But writing to this address using axi_iic.send/receive from the AXI IIC class defined under pynq/lib (https://github.com/Xilinx/PYNQ/blob/master/pynq/lib/iic.py) works (it doesn't prompt an error), yet reading the output from the command gives nothing back. The only output is whatever integer is entered in the send/receive call; should this be the case?
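For reference, this is roughly the kind of access I have been attempting - my understanding of iic.py is that receive() fills the buffer you pass in rather than returning the data, so the result should be checked in the buffer (the 0x300A/0x300B chip-ID registers are from the OV5640 datasheet; correct me if I have this wrong):

iic = base.axi_iic               # AxiIIC instance from the base overlay
CAM_ADDR = 0x3c                  # OV5640 7-bit address on the Pcam 5C

reg = bytearray([0x30, 0x0A])    # 16-bit register address of CHIP_ID high byte, MSB first
iic.send(CAM_ADDR, reg, 2)       # set the register pointer

buf = bytearray(2)
iic.receive(CAM_ADDR, buf, 2)    # the received bytes land in buf
print("chip ID: 0x{:02x}{:02x} (expecting 0x5640)".format(buf[0], buf[1]))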

It would be great to have someone look into the problem as it has clearly troubled some people, particularly because the Xilinx KV260 board is advertised as the go-to board for all things vision related - including its "plug and play" Raspberry Pi camera connector.

Thanks,
Cameron

Hi Mario,

I hope this reaches you,

I have continued with my endeavour, attempting to get the MIPI camera (Digilent Pcam 5C) streaming through to the top-level Jupyter notebook environment on the PYNQ framework with my KV260.

After investigating Adam Taylor's design, I was able to use his Vitis application to prove that my Pcam was working - getting a stream that I could display over HDMI to a monitor. I then copied his driver format and translated it into Python, running it as notebook cells - writing to the sequence of registers he detailed.
Note: there were some minor differences between his driver and the pcam5c.py driver under the PYNQ layers, mainly in how he repeated particular writes down the I2C for resets, and in the commands to configure the VDMA and video mode (these are handled differently in PYNQ, I assume by the other files and functions, e.g. dma.py and VideoMode).

When trialling this driver, it worked up until the very last register I had to write: 0x02 to register 0x3008. This is done in the PYNQ driver file as below:
[image: the 0x3008 register write in the PYNQ driver]
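For reference, my Python translation of that write looks roughly like this; iic_tx is my own helper, and its shape is my assumption rather than anything from PYNQ:

iic = base.axi_iic                       # AxiIIC controller from the overlay

# iic_tx: write one 8-bit value to a 16-bit OV5640 register (sensor at 0x3c)
def iic_tx(reg, value, addr=0x3c):
    buf = bytearray([(reg >> 8) & 0xff, reg & 0xff, value & 0xff])
    iic.send(addr, buf, 3)

iic_tx(0x3008, 0x02)                     # the write that fails for me: leave software power-down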

So with this failing, and uncertain as to why, I turned to comparing the hardware designs and making minor tweaks to the PYNQ hardware design, adding in ILAs as you suggested. To check that the AXI-Stream data passes from one block to the next - and is not held up waiting for an AXI-Stream TREADY signal from the next block - I added a FIFO as the next block in the chain, planning to test the movement of the AXI Stream through each block from the MIPI RX Subsystem to the VDMA, such as below:

I then ran synthesis and implementation and generated the bitstream before turning to the hardware manager. Here I connect the board via JTAG and power it up with all peripherals already connected. The K26 SOM is detected, but no ILAs show in the Hardware tab underneath it.

As the board boots up, it enters a bootup sequence I don't recognise, and the Vivado Tcl console prompts the critical warning:

*[Labtools 27-3421] xck26_0 PL Power Status OFF cannot connect PL TAP. Check Por_B signal*

After a while the board enters the regular bootup sequence, but along the way I noticed that it says the Jupyter notebook server fails to launch. This is then reinforced once I am logged in as the Ubuntu user:

I have been reading up and found this discussion: Problem with Using integrated logic analyzer (ILA) for Debugging with PYNQ - #5 by parsonsk, which suggests you can't use JTAG debugging and the Jupyter notebooks at the same time out of the box.

Is this the correct next port of call? Do I have to add this to the SD card and re-flash the board with this code, or is there a change to this code for the KV260? I can't find any other examples of people using PYNQ boards being debugged through the hardware manager to view ILA data.

Thought I'd update the thread with what I have attempted and found. Am I going down the right avenues? Do you have any further suggestions?

Thanks,
Cameron

Hi Cameron,

Apologies for the delay, I have been a bit busy.

It is good to know that the camera works.

I have been able to use the ILA in Kria designs, this is my process:

  • Connect JTAG
  • Power on the board
  • Download overlay with ILA using a notebook
  • Open HW manager in Vivado
  • Connect to the board
  • Add the .ltx file
  • Refresh

This has worked for me in the past.
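For step 3, the notebook side is just loading your overlay so that the PL is programmed before the hardware manager connects; something like this, where the file name is a placeholder:

from pynq import Overlay

# Loading the overlay programs (and powers up) the PL, so the hardware
# manager should be able to see the ILA cores after a refresh.
ol = Overlay("design_with_ila.bit")   # placeholder for your .bit containing the ILA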

Also note that in the PYNQ design there’s a GPIO IP that drives the reset of the camera.

Mario

Hi Mario,

No problem at all, I’m amazed you have any time at all to keep on top of the forums.

I had sort of sussed this was the order of operations for carrying out the debugging. And it works… in that I can see the ILA in the Vivado hardware manager with the screen below. I have the trigger set for whenever the TLAST line goes high - expecting the capture of one frame.

Unfortunately, I don’t see any data but I feel like it may be related to another unexpected thing I noticed.

When I load in my new overlay (containing the ILA/FIFO setup shown last post) and try to run the following logic, copied from the MIPI example in the KV260-specific notebooks:

base = BaseOverlay("base_v2.bit", ignore_version=True)

mipi = base.mipi
iic = base.axi_iic

The base.mipi line fails, quoting the pynq/lib/video/pcam5c.py file at the following section and raising this error at line 81:


This also prevents the device I2C address (3c) from appearing on the I2C bus of the board when you scan using: i2cdetect -y 6

But it does work if, after booting the board, I first load the default PYNQ base overlay and run base.mipi; then, when I load my bitstream - with no changes to any blocks, just a design with an added ILA and an AXI-Stream data FIFO between the MIPI RX Subsystem and the subset converter - the base.mipi command works fine and the Pcam's address 0x3c appears on the bus.

Do you have any idea why this may be happening?

My main concern for the investigation at the moment is whether I am initialising the camera to start streaming before I open the ILA with a trigger - so that I am missing the only chance to see the first frame passing through before the data line jams and the TREADY signal is lowered.

→ Does the streaming of the camera only start into the MIPI Rx Subsystem block with the mipi.start() command, or does it start with the base.mipi command?

P.S. I have been resetting the camera with the following commands - as I have seen is done in the pcam_5c.c driver file:

from time import sleep

# Toggle the camera reset/enable GPIO low and then high, as pcam_5c.c does
base.mipi.gpio_ip_reset[0].write(0)
sleep(1)
base.mipi.gpio_ip_reset[0].write(1)

Thanks for continued support,
Cameron

Hi Cameron,

Good to know that you have a path to check the ILA.

This is probably because the .dtbo file that adds the I2C controller to the device tree is not present. There should be a base.dtbo file next to the base.bit; you can simply copy it and rename the copy base_v2.dtbo so that it matches your base_v2.bit.
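For example, from the folder where your bitstream lives (names here are just illustrative):

import shutil

# The Overlay class picks up a .dtbo whose name matches the .bit,
# so make a copy of the existing device tree overlay with the new name.
shutil.copy("base.dtbo", "base_v2.dtbo")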

→ Does the streaming of the camera only start into the MIPI Rx Subsystem block with the mipi.start() command, or does it start with the base.mipi command?

I don’t think so, .start only configures the VDMA

Mario


Hi Mario,

Thanks for the .dtbo info, that worked a treat; the design now works on first launch with the .mipi command.

But now the issue is back to the driver of the Pcam 5C, I believe. I have rewritten the driver code from the project that I know works (see Adam Taylor), translating from C to Python.

I have attached the two files used below: one is the main order of operations - pcam5c_driver.ipynb - and the configuration file containing the lists of registers, alongside the data to be written, is pcam5c_cfg.ipynb.

This driver file works with the current config up until the software reset command is sent as part of the frame-rate configuration. As seen in the PYNQ driver, this is done here:

[image: the software reset write in the PYNQ driver]

When this is sent down the I2C bus, the bus jams as busy - no more commands for driver configuration can be sent and the notebook hangs. Nothing is seen on the ILA - I assume because the camera is not yet streaming.

I am not sure of a way forward. At the moment I am also attempting to write my own ILA in HLS that will send the output of a bypassing stream to memory, so I can see a log of anything that has gone through. But I am not sure how to assess the driver issue. I will report back with any changes.

Thanks,
Cameron


Cameron,

You could comment out these lines, so that the pcam driver is not called:

Then you could run the Python version of the driver before you start the stream.

Mario

Hi Mario,

Still looking into this issue, I have dug deeper into the design and the driver files to gain more of an understanding of the setup. As well as this, I have rebuilt Adam Taylor's design, added an axi_iic controller and loaded the bitstream using the PYNQ Overlay(<.bit>) format.

My thought was to use the PYNQ drivers (pcam5c.py and the underlying C files) to configure the blocks in the pipeline of Adam Taylor's design (MIPI RX Subsystem, demosaic, gamma LUT and VDMA), but there are some key differences between the designs; below shows the flow of data in both cases:

Key difference up to the VDMA: Adam Taylor's does not have the video processing and pixel_pack IPs.

I am trying to run the driver for these IPs from the top level, but I cannot use the functions defined for this in the compiled C file, pcam_mipi.c.

This image shows one of the IP block driver functions (demosaic) and then how it fits into the main initialisation driver function for all the pipeline IPs, InitImageProcessingPipe.

  • Now I know I cannot use this function - since it includes config of the Vprocess IP, which is not in my design - but can I call the individual driver functions (e.g. ConfigDemosaic) and start there to make my driver?

The aim would be to remake this pcam_init driver function, which is then called in the pcam_5c.py function to initiate streaming; a rough sketch of the demosaic part of what I mean is below.
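For the demosaic part, this is roughly what I have in mind; the register offsets are my reading of the HLS-style control interface used by the C driver (0x10 width, 0x18 height, 0x28 Bayer phase, 0x00 control), and the instance name is a placeholder:

from pynq import Overlay

ol = Overlay("my_rebuilt_design.bit")   # placeholder for my rebuilt design
demosaic = ol.demosaic                  # placeholder instance name

WIDTH, HEIGHT = 1280, 720               # 720p, to match the cfg_720p_60fps register set
demosaic.write(0x10, WIDTH)             # HwReg_width
demosaic.write(0x18, HEIGHT)            # HwReg_height
demosaic.write(0x28, 0x03)              # Bayer phase (the value Adam Taylor uses)
demosaic.write(0x00, 0x81)              # ap_start | auto_restart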

Is this worth doing, or would the preferred method for creating a driver at the top level be to follow this tutorial: Overlay Tutorial — Python productivity for Zynq (Pynq) v1.0?

The current main issue still arises when trying to read a frame from the VDMA - it hangs, waiting on the interrupt that signals a frame has arrived.

Looking at the IPs before the VDMA, all blocks are initialised as shown, with functions writing to and reading from particular registers - except for the pixel_pack IP, which only has this line in the top-level pcam5c.py:

  • Is there something missing from its configuration perhaps?

  • It appears that the AXI Stream out of the MIPI RX Subsystem block has a 16-bit pixel depth - how does this carry an RGB image? Is it divided into 4 channels of depth 4, or are multiple 16-bit words recombined in the demosaic block (whose output is 24-bit)?

  • Do you have any other suggestions as to what to focus on? I feel I have exhausted a lot of avenues and am getting bogged down in the layers of config files.

Thanks for any help,
Cameron

Hi @cking,

You could try to modify the C driver directly on the board.

For this, inside the KV260 on a terminal:

  1. Clone the PYNQ repo recursively:

    git clone https://github.com/Xilinx/PYNQ.git --recursive
    
  2. Move to the folder where the driver is:

    cd PYNQ/pynq/lib/_pynq/_pcam5c
    
  3. Compile

    make
    

    Hopefully, this will generate the .so file. Once the driver compiles, you can start making modifications and recompile.

  4. Update the path where the .so file is being imported here https://github.com/Xilinx/PYNQ/blob/master/pynq/lib/video/pcam5c.py#L60

I need to dig deeper into the other questions.

Mario


Hi Mario,

Thanks for the quick response. Yes, we considered recompiling the driver this way and may do so to modify the drivers to fit Adam Taylor's design, though this may turn out to be quite a bit of work.

I have attached below a notebook which shows the driver commands we run at the top level, if you are interested. It follows the same structure as Adam Taylor's driver for the camera - differing from the init_pcam function in the pcam_5c.c file only slightly in the order of operations.

cameron king notebook.ipynb (20.7 KB)

At the minute, with this driver (as shown in the notebook, where there are no results past cell 22), there seem to be two particular key sequences where the code hangs:

Case 1

  1. Software power down - writing 0x42 (sending bit 6 high) to register 0x3008
  2. Software reset - writing 0x82 to register 0x3008

Case 2

  1. Software power down - writing 0x42 (sending bit 6 high) to register 0x3008
  2. Waking up the sensor - writing 0x02 to register 0x3008

After these two, no more commands can be sent down the I2C bus. From our understanding of how the bus works, 8 bits are sent to the slave (the camera) and then 1 acknowledge bit is sent back to the master by the slave to confirm it has received them.

  • Could it be that the master never receives the acknowledge bit for command 2, because the sensor is powered down and so cannot send any more?

Contradicting this thought is the fact that this driver works in the Vitis environment with Adam Taylor's application, and also that the PYNQ driver (pcam_5c.c) gets past this point (it doesn't hang).

Very confused…
Regards,
Cameron

Hi Cameron,

I am not able to find the command in cell 18, iic_tx(0x3008, 0x02), until much later in the configuration.
Have you tried without this?

Hi Mario,

Yes, but note that this notebook was just to test different commands to 0x3008 - it isn't a replica of the driver.

The reason we are doing this is that, when copying this driver at the top level (Adam Taylor's C driver), helloworld.c contains the main driver, which reaches out to the arrays stored in i2c.h (the same as pcam_5c.c and pcam_5c.h):
helloworld.c (8.6 KB)
i2c.h (10.6 KB)

These I replicated here:
pcam_cfg.ipynb (26.2 KB)
pcam_driver(1).ipynb (21.4 KB)

The code hangs in the main function of pcam_driver(1), after the cell that writes all the cfg_720p_60fps commands (cell 13). It sends the final command, 0x02 to register 0x3008, and then the I2C channel hangs - no more commands can be sent.

…so we investigated this command and those before and after it.

Hi @cking,

I wonder if you can execute the .exe file that Vitis generates, which you have tested and know works, on the Ubuntu-based image. This will help us narrow down the issue.

If this works, you may have a workaround.

Mario

Hi Mario,

I have not yet tried to execute the .exe on the Ubuntu image, but I will look into doing that now to see if it works.

Apologies for not responding; I made progress when investigating the camera reset commands in the driver setup. The issue I was facing I think we can put down to timing: after trialling commands sent with different breaks in between, I found a working configuration - see the notebook below.

cameron king notebook.ipynb (10.2 KB)
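In outline the sequence is as follows; the delays shown here are only illustrative, and the exact values I settled on are in the notebook:

from time import sleep

# iic_tx as defined earlier: 16-bit register address + 8-bit value over axi_iic
iic_tx(0x3008, 0x42)   # software power-down (bit 6 high)
sleep(0.5)
iic_tx(0x3008, 0x82)   # software reset (bit 7 high)
sleep(1.0)             # give the sensor time to come back before touching the bus again
iic_tx(0x3008, 0x02)   # wake the sensor back up
sleep(0.5)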

By working, I mean that a frame can be passed through the MIPI RX Subsystem → demosaic → gamma LUT → VDMA and then taken to the PS for display from the PYNQ Python level. Hence I can display a frame from the MIPI camera to an HDMI output - but this doesn't seem to be behaving well:

  1. When first booted and run, a frame such as the following is displayed:


    Quite clearly this image has the wrong hue, as the demosaic channels have been configured incorrectly. As outlined in the demosaic user guide, by writing to register offset 0x28 you can change the Bayer grid phase, which tells the IP which pixel captures which colour (RGB). In this image, phase 0 is selected (0x00 written to 0x28, as in the PYNQ pcam_mipi.c file).

  2. After flicking through the different configs, it seems that the last phase (writing 0x03 to 0x28) is the closest. This is the demosaic config that Adam Taylor uses in his code for the Vitis application. However, I say closest because, as shown below, it is still wrong - the blue and green channels are inverted (changing the gamma has no effect on this):

  3. Sometimes (what I take to be randomly), when streaming to the DisplayPort output, the kernel will just die and need restarting. When I restart it and display a frame after re-running, the image will have a different colour configuration than before with the 0x03 demosaic config, so I change it back to 0x00 and the frame is then perfect in terms of colour, but a column of pixels from the start of the frame is offset to the end:

I'm not sure how to explain this. After resetting the read channel of the VDMA, restarting the kernel, or even loading a different overlay and then loading the design back on, the camera will display this type of frame.
Restarting the whole system sends it back to a frame like the one in bullet point 1.

Have you ever seen this when working with video frame display in PYNQ?

I notice the frames extracted from the VDMA are of the type pynq.buffer.PynqBuffer, described as a "subclass of array" - does that mean it can be treated the same as a numpy array?
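For example, this is the sort of thing I am assuming should work if it really behaves like an ndarray; vdma here stands in for the AxiVDMA instance in my design:

import numpy as np
import PIL.Image

frame = vdma.readchannel.readframe()    # returns a pynq.buffer.PynqBuffer
print(type(frame), frame.shape, frame.dtype)

# If it is ndarray-like, channel reordering should just work, for example
# swapping what look like the inverted green and blue channels:
PIL.Image.fromarray(np.asarray(frame)[:, :, [0, 2, 1]])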

Note: this is using a modified version of Adam Taylor's design as outlined before, modified by including an axi_iic controller. A pieced-together screenshot of the block design is shown below:

I will keep on looking into it, any help would be appreciated as always,
Thanks,
Cameron