PYNQ: PYTHON PRODUCTIVITY

DPU-PYNQ on ZCU102

Hello,
I’m trying to get DPU-PYNQ running on my ZCU102 board. I successfully built a PYNQ image with my own board repo and ran the upgrade makefile of the DPU-PYNQ repo. For the board repo I used the base board support package and the prebuilt aarch64 v2.6 image. In the spec file I added the packages for pynq, xrt and ethernet.

Then I built the DPU-TRD and copied the necessary files onto the board. Now the Jupyter notebook stops on the execute_async command and Jupyter crashes (I can’t access it afterwards). The models are compiled with the arch.json from the DPU-TRD implementation. I also tried using the DPU-PYNQ board makefiles to generate the bitstream, but that didn’t solve it.

Is anyone running DPU-PYNQ on the ZCU102 and can help me with the setup?

Markus

Which bitstream are you loading onto the board? The one from the TRD or one you are creating yourself? When the Jupyter notebook stops, does the entire board lock up (serial and all) or just the currently running notebook?

@rock any thoughts on what might be going on?

Peter

I’ve tried it with the bitstream from the TRD and from the DPU-PYNQ make flow.
The board crashes completely. I can’t access it through the browser or serial.

Looks like an AXI hang. One thing we have noticed is that your model has to be compatible with your bitstream. The arch.json has to be taken from the project where you build your bitstream. If the xmodel does not match your bitstream, it will error like that. Check DPU-PYNQ/host at master · Xilinx/DPU-PYNQ · GitHub for more information.

My best suggestion is to use the DPU-PYNQ board makefile to generate the hardware design. Once the design is done, find the arch.json file and use it to rebuild your model file.
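To double-check which DPU configuration a given build produced, you can read the fingerprint out of its arch.json and compare it with the one your model was compiled against. A minimal sketch (the file path is just an example):

```python
import json

def read_fingerprint(arch_json_path):
    """Return the DPU fingerprint string from a Vitis AI arch.json."""
    with open(arch_json_path) as f:
        return json.load(f)["fingerprint"]

# Example: read_fingerprint("arch.json") -> e.g. "0x1000020F6014407"
```

If the fingerprint the model was compiled with differs from the one in the bitstream’s arch.json, the xmodel and the DPU IP don’t match.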

I tried your suggestion but nothing has changed.

I built the bitstream with the DPU-PYNQ board make file and copied the generated files to the notebook.
With the arch.json ({"fingerprint":"0x1000020F6014407"}) I manually compiled these two models:
cf_resnet50_imagenet_224_224_7.7G_1.3
tf_inceptionv1_imagenet_224_224_3G_1.3
Then I also copied the .xmodel files to the notebook.
When I run the example notebooks, I see the same behavior as before.

Did you pip install DPU-PYNQ? I wonder if the model name you provided in your notebook matches the one you generated. Also, when you load the bitstream, make sure you use our overlay class, because the AXI data width may need to be adjusted based on your overlay.

Yes, I did pip install DPU-PYNQ, and I use your overlay class.
The notebooks from DPU-PYNQ were uploaded manually, together with the bitstream, xclbin, hwh and xmodel files.

The notebook stops at the line dpu.execute_async(…).
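For context, the flow in the notebook around that call looks roughly like this, following the DPU-PYNQ example notebooks. The bitstream and model names are placeholders, and the import is guarded so the sketch is a no-op on a machine without the pynq_dpu package:

```python
# Rough run flow per the DPU-PYNQ example notebooks; "dpu.bit" and
# "model.xmodel" are placeholder names.
import numpy as np

try:
    from pynq_dpu import DpuOverlay  # only present on the board image
except ImportError:
    DpuOverlay = None

def run_once(bitstream="dpu.bit", xmodel="model.xmodel"):
    if DpuOverlay is None:
        return None  # not running on a PYNQ board
    overlay = DpuOverlay(bitstream)  # loads the .bit/.hwh/.xclbin together
    overlay.load_model(xmodel)       # xmodel fingerprint must match the DPU IP
    dpu = overlay.runner
    in_t = dpu.get_input_tensors()[0]
    out_t = dpu.get_output_tensors()[0]
    in_buf = [np.zeros(tuple(in_t.dims), dtype=np.float32, order="C")]
    out_buf = [np.zeros(tuple(out_t.dims), dtype=np.float32, order="C")]
    job_id = dpu.execute_async(in_buf, out_buf)  # the call that hangs for me
    dpu.wait(job_id)
    return out_buf[0]
```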

How did you build your xmodel then? It is probably a model issue. Where did you find the arch.json, and how did you use it to compile your model?

I copied the arch.json from DPU-PYNQ/Boards/ZCU102/binary_container_1/link/vivado/vpl/prj/prj.gen/sources_1/bd/dpu/ip/dpu_DPUCZDX8G_1_0/.
Then I started the docker container and manually typed in vai_c_tensorflow … and vai_c_caffe … using the copied arch.json.
I didn’t use the compile.sh from DPU-PYNQ/host.
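Roughly, the tensorflow invocation looked like this; exact paths and the net name differ in my setup, so treat these as placeholders (run inside the Vitis AI docker with the tensorflow conda environment active):

```shell
# Placeholder paths and net_name; arch.json is the one copied from the
# generated DPU IP directory. Run inside the Vitis AI docker.
conda activate vitis-ai-tensorflow
vai_c_tensorflow \
    --frozen_pb  quantize_results/deploy_model.pb \
    --arch       arch.json \
    --output_dir compiled_model \
    --net_name   tf_inceptionv1
```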

That looks fine. What about your Vitis platform? Have you verified that?

I added some lines to the makefiles. I only type make BOARD=ZCU102 in DPU-PYNQ/boards.
Is that a problem?

You will have to provide your own Vitis platform. That makefile only covers Ultra96, ZCU104, and ZCU111. For ZCU102 I don’t have a Vitis platform, so you may need to build one yourself or download it from somewhere.

I’ll try it with my own Vitis platform.
FYI:
For the makefile flow to work with the DPU-PYNQ repo, I modified some files.
In DPU-PYNQ/boards/Makefile I added:
SUPPORTED = Ultra96 ZCU104 ZCU111 ZCU102

ifeq ($(BOARD),ZCU102)
VITIS_PLATFORM := $(shell pwd)/$(BOARD)/dpu/dpu.xpfm
endif

In the PYNQ-derivative-overlays/dpu/Makefile I added:
ifeq ($(BOARD),ZCU102)
device=xczu9eg-ffvb1156-2-e
endif

In PYNQ-derivative-overlays/vitis_platform:
I added the folder ZCU102 and copied the src folder from ZCU104 into it.

Then the make flow ran without issues.