The Composable Video Pipeline

Hello PYNQ aficionados,

Edit 10 March 2022: comments are closed; to request support, open a new post in the support category.

I am pleased to announce that the PYNQ team has just released the composable (video) pipeline. This is an open-source project, and you can find it on GitHub: PYNQ_Composable_Pipeline

Note that throughout this blog, I will use the terms “composable overlay” and “composable pipeline” interchangeably.

What is a composable overlay?

A composable overlay provides runtime configuration as well as runtime hardware composability. More on composability in a moment.

Our composable overlay is based on three pillars:

  1. An AXI4-Stream Switch that provides the runtime hardware composability, defining how data flows from one IP core to another.
  2. DFX regions that bring new functionality to the design at runtime.
  3. An API, built on top of pynq, that exposes the functionality to control the composable pipeline in a Pythonic way.

The AXI4-Stream Switch is the cornerstone in achieving composability: its runtime configuration allows us to modify our design without redesigning the overlay, making us more productive. This is the key to the composable pipeline.
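As a rough mental model, the switch can be pictured as a routing table that maps each master (output) port to a slave (input) port; reprogramming the table rewires the dataflow without rebuilding the bitstream. The sketch below is purely illustrative (the class and its methods are made up for this post; the real PYNQ driver programs memory-mapped switch registers):

```python
# Toy model of an AXI4-Stream Switch routing table (illustrative only;
# the real driver writes the switch's port-selection registers).
class StreamSwitchModel:
    def __init__(self, num_ports):
        # route[mi] = si means master port `mi` receives data from slave port `si`
        self.route = {mi: None for mi in range(num_ports)}

    def connect(self, si, mi):
        """Route slave (input) port si to master (output) port mi."""
        self.route[mi] = si

    def path(self):
        """Return the active si -> mi connections."""
        return {mi: si for mi, si in self.route.items() if si is not None}

# Re-routing at runtime changes the pipeline without a new bitstream
switch = StreamSwitchModel(num_ports=4)
switch.connect(si=0, mi=1)   # e.g. video_in -> filter2d
switch.connect(si=1, mi=2)   # e.g. filter2d -> video_out
print(switch.path())
```

Changing a connection is just another register write, which is why recomposing a pipeline takes microseconds rather than a new synthesis run.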

DFX regions are optional in a composable pipeline; however, having them adds an extra dimension of flexibility, broadening the applicability of a composable overlay. For instance, not all applications need the same functions, and some functions are rarely used.

The composable video pipeline

To demonstrate the benefits of the composable overlay, we are providing a composable video pipeline that you can use out-of-the-box. An overview of the composable video pipeline is shown in the image below.

This version implements several standard vision functions. The six most common functions are implemented in the static region. The composable overlay also provides 12 dynamic functions implemented across four DFX regions; note that pr_0 and pr_1 provide pairs of functions.

| Dynamic function | DFX region |
|---|---|
| absdiff | pr_join |
| add | pr_join |
| bitwise_add | pr_join |
| cornerHarris | pr_1 |
| dilate | pr_0 & pr_1 |
| duplicate | pr_fork |
| erode | pr_0 & pr_1 |
| fast | pr_0 |
| fifo | pr_0 & pr_1 |
| filter2d | pr_0 |
| rgb2xyz | pr_fork |
| subtract | pr_join |

The Python API provides a nice visualization of the available functions in the .c_dict attribute.

Filtering by loaded (.c_dict.loaded) and unloaded (.c_dict.unloaded) functions is also supported. The .loadIP() method allows you to load a function or a list of functions. For instance, cpipe.loadIP([cpipe.pr_1.dilate_accel, cpipe.pr_0.filter2d])
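To illustrate the idea, .c_dict can be pictured as a mapping from function name to metadata, and the loaded/unloaded views as filters on a loaded flag. The snapshot below is invented for illustration (it is not the overlay's real dictionary contents or driver code):

```python
# Hypothetical snapshot of a c_dict-like structure: function name -> metadata
c_dict = {
    "filter2d":  {"dfx": "pr_0",        "loaded": False},
    "dilate":    {"dfx": "pr_0 & pr_1", "loaded": True},
    "duplicate": {"dfx": "pr_fork",     "loaded": True},
}

# "loaded" functions are already present in the fabric and ready to compose;
# "unloaded" ones need their partial bitstream downloaded first (cf. loadIP)
loaded   = {k: v for k, v in c_dict.items() if v["loaded"]}
unloaded = {k: v for k, v in c_dict.items() if not v["loaded"]}

print(sorted(loaded))    # ['dilate', 'duplicate']
print(sorted(unloaded))  # ['filter2d']
```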

How to get started?

If you have one of the supported boards (PYNQ-Z2 or PYNQ-ZU), burn a new SD card with its corresponding v2.6.0 image. Once you power on your board and the system boots, open Jupyter Lab (http://<board_ip_address>/lab) and then open a terminal.

Copy and paste the following commands into the terminal:

git clone
pip install PYNQ_Composable_Pipeline/
pynq-get-notebooks composable-pipeline -p $PYNQ_JUPYTER_NOTEBOOKS

You will see pynq being updated to the latest stable version (2.6.2), and the composable video pipeline installed in the composable-pipeline folder. This process takes approximately 10 minutes.

Applications and Custom Pipeline

The composable-pipeline folder contains two notebooks at the top level: one describes how to set up the board, and the other describes how some of the vision functions operate (with examples using OpenCV).



The application folder contains five pre-built applications: three of them can be controlled with ipywidgets, and the remaining two with the switches and buttons on the board.

The image below shows one of the applications. The API at the application level is straightforward, yet powerful enough to provide runtime configuration.

Custom pipeline

The custom_pipeline folder provides several notebooks that gradually introduce the API until you get familiar with it and build an application. There is a bonus notebook with advanced features!

For instance, to compose a pipeline you would write:

video_pipeline = [video_in_in, rgb2gray, filter2d, colorthr, gray2rgb, video_in_out]

You can visualize what has been composed by calling the .graph attribute:
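Conceptually, composing is just ordering stages so that each stage's stream output feeds the next stage's input. A software analogy of that dataflow, with plain Python functions standing in for hardware IP (all names here are illustrative, not the overlay's API):

```python
# Software stand-ins for hardware stages; each takes a "frame" and returns one
def rgb2gray(frame):  return f"gray({frame})"
def filter2d(frame):  return f"filtered({frame})"
def gray2rgb(frame):  return f"rgb({frame})"

# A composed pipeline is simply an ordered list of stages
pipeline = [rgb2gray, filter2d, gray2rgb]

def run(pipeline, frame):
    # The composable overlay performs this chaining in hardware,
    # routing the stream between IP cores via the AXI4-Stream Switch
    for stage in pipeline:
        frame = stage(frame)
    return frame

print(run(pipeline, "frame0"))  # rgb(filtered(gray(frame0)))
```

In the real overlay the stages are hardware accelerators, so reordering the list reroutes streams rather than calling functions, but the mental model of "an ordered list of processing stages" is the same.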

Final remarks

The composable overlay unleashes the benefits of the programmable logic while providing unprecedented flexibility at the user level.

The provided composable video pipeline has been carefully optimized to support three applications:

  • Difference of Gaussians
  • Corner Detect
  • Color Detect

However, it is up to your imagination what you can compose!

I would love to hear your feedback and feature requests for future releases. This is an open-source project; contributions are welcome.

Hope you enjoy composing! :notes:


I think the device is the same for both the PYNQ-Z1 and Z2, but does it work with the Z1?

@Msk_Nak, I have not tested it, but I believe it should work.

Once you clone the repo, edit this line to PYNQ-Z1 and proceed with the installation.

Thank you for the information. I applied that fix (Pynq-Z2 -> Pynq-Z1).
However, I got an error when checking board_folder.
So, next, I copied board/Pynq-Z2 to create board/Pynq-Z1.
After that, I continued the installation and it was successful.
I have not confirmed the operation on the device yet; I will confirm it later.


Thanks for your reply. OK, I'll try it.

On Monday, 5 July 2021 at 16:43, Mario Ruiz via PYNQ wrote:

After completing the installation of PYNQ_Composable_Pipeline, I got an error running the notebook.

/usr/local/lib/python3.6/dist-packages/composable_pipeline/ in __init__(self, bitfile_name, dtbo, download, ignore_version, device)
    299             ['C_NUM_MI_SLOTS'])
--> 301         predefpaths = _default_paths[]
    303         if len(switch) != 0:
KeyError: 'Pynq-Z1'

I tried changing the environment variable on the board to "Pynq-Z2", but it didn't work.
I thought it was because there is no "Pynq-Z1" key in _default_paths.
So, I uninstalled composable-pipeline once and copied the "Pynq-Z2" entry in _default_paths as "Pynq-Z1".

_default_paths = {
    'Pynq-Z1': {
        0: {'ci': 0, 'pi': 0},
        1: {'ci': 1, 'pi': 1}
    },
    'Pynq-Z2': {
        0: {'ci': 0, 'pi': 0},
        1: {'ci': 1, 'pi': 1}
    }
}

After fixing it, I installed it again.
sudo python3 -m pip install PYNQ_Composable_Pipeline/

I ran the notebook again and it was successful.
It doesn't work well if the HDMI source is 1080p, but there is no problem at 720p.
Thank you for your advice.

@Msk_Nak thank you for confirming that the design works on the PYNQ-Z1. I'll incorporate the changes into the repo so it is easier to install for the PYNQ-Z1.

The recommended resolution for Zynq-7000 devices is 720p; 1080p support is at the limit of these devices' capabilities.


Thank you for supporting Pynq-Z1.

The recommended resolution for Zynq-7000 devices is 720p; 1080p support is at the limit of these devices' capabilities.

I understood that. Thank you very much.


Did my thesis work provide any inspiration for this?
It does seem to differ, with better DFX, HLS library, and PYNQ Python support.

Hi @Space_Zealot,

The idea of the composable overlay is very similar to your thesis, but I was not aware of your work until now.

Thank you for letting us know. :grinning:



Well if you have any questions on the approach I tried, feel free to ask :grinning_face_with_smiling_eyes:


Some videos of it:
Demo: PYNQ HLS Video Demo - YouTube
In-Depth: PYNQ Video HLS and Dynamic PR (Retry) - YouTube


3 posts were split to a new topic: Composable Overlay v2.7

A post was split to a new topic: Composable Overlay question

3 posts were split to a new topic: Composable Overlay not working on PYNQ-ZU