Hello PYNQ aficionados,
Edit 10 March 2022: comments are closed. To request support, open a new post in the Support category.
I am pleased to announce that the PYNQ team has just released the composable (video) pipeline. This is an open source project, and you can find it on GitHub: PYNQ_Composable_Pipeline.
Note that throughout the blog, I will use the terms “composable overlay” and “composable pipeline” interchangeably.
A composable overlay provides runtime configuration as well as runtime hardware composability. More on composability in a moment.
Our composable overlay is based on three pillars:
- An AXI4-Stream Switch that provides the runtime hardware composability, defining how data flows from one IP core to another.
- DFX regions that bring new functionality to the design at runtime.
- An API, built on top of `pynq`, that exposes the functionality to control the composable pipeline in a pythonic way.
The AXI4-Stream Switch is the cornerstone of composability: its runtime configuration allows us to modify the design without redesigning the overlay, making us more productive. This is the key to the composable pipeline.
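To make the routing idea concrete, here is a small pure-Python toy model of an AXI4-Stream Switch routing table (no hardware involved; the class and method names are illustrative, not part of the actual overlay). Each master (output) port is driven by at most one slave (input) port, and reprogramming the table re-routes data between IP cores without touching the hardware design:

```python
class StreamSwitchModel:
    """Toy software model of an AXI4-Stream Switch routing table."""

    def __init__(self, num_ports):
        # -1 means the master port is disabled (no producer routed to it)
        self.route = {m: -1 for m in range(num_ports)}

    def connect(self, slave, master):
        """Route the stream from slave (input) port to master (output) port."""
        self.route[master] = slave

    def pipeline(self, order):
        """Chain a list of port indices so each stage feeds the next."""
        for producer, consumer in zip(order, order[1:]):
            self.connect(producer, consumer)


# Route port 0 -> 2 -> 1 (e.g. video in -> filter -> video out)
sw = StreamSwitchModel(num_ports=4)
sw.pipeline([0, 2, 1])
print(sw.route)  # {0: -1, 1: 2, 2: 0, 3: -1}
```

Re-running `pipeline` with a different port order is all it takes to recompose the dataflow, which is exactly what the hardware switch enables at runtime.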
DFX regions are optional in a composable pipeline; however, having them adds an extra dimension of flexibility, broadening the applicability of a composable overlay. For instance, not all applications need the same functions, and some functions are rarely used.
To demonstrate the benefits of the composable overlay, we are providing a composable video pipeline that you can use out-of-the-box. An overview of the composable video pipeline is shown in the image below.
This version implements several standard vision functions. The six most common functions are implemented in the static region. The composable overlay also provides 12 dynamic functions implemented across four DFX regions; note that pr_0 and pr_1 each provide pairs of functions.
| Function | DFX region |
|----------|------------|
| dilate   | pr_0 & pr_1 |
| erode    | pr_0 & pr_1 |
| fifo     | pr_0 & pr_1 |
The python API provides a nice visualization of the available functions in the `.c_dict`. Filtering by loaded (`.c_dict.loaded`) and unloaded (`.c_dict.unloaded`) functions is also supported. The `.loadIP()` method allows you to load a function or list of functions. For instance,
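the call pattern looks like the following. Since the real `cpipe` object only exists on the board, this is a software-only mock that mimics the pattern; `loadIP` and `c_dict` are the names from this post, while everything else here is illustrative:

```python
class CDict(dict):
    """Dictionary of available functions (name -> loaded?), filterable by status."""

    @property
    def loaded(self):
        return CDict({k: v for k, v in self.items() if v})

    @property
    def unloaded(self):
        return CDict({k: v for k, v in self.items() if not v})


class MockComposable:
    """Stand-in for the composable pipeline object (illustrative only)."""

    def __init__(self, functions):
        self.c_dict = CDict(functions)

    def loadIP(self, names):
        # Accept a single function name or a list of names
        for name in [names] if isinstance(names, str) else names:
            self.c_dict[name] = True  # stands in for DFX partial reconfiguration


cpipe = MockComposable({"filter2d": True, "dilate": False, "erode": False})
print(sorted(cpipe.c_dict.unloaded))  # ['dilate', 'erode']
cpipe.loadIP(["dilate", "erode"])     # load a list of DFX functions
print(sorted(cpipe.c_dict.loaded))    # ['dilate', 'erode', 'filter2d']
```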
If you have one of the supported boards (PYNQ-Z2 or PYNQ-ZU), burn a new SD card with the corresponding v2.6.0 image. Once you power on your board and the system boots, open Jupyter Lab (`http://<board_ip_address>/lab`) and then open a terminal. Copy and paste the following commands into the terminal:
```shell
git clone https://github.com/Xilinx/PYNQ_Composable_Pipeline
pip install PYNQ_Composable_Pipeline/
pynq-get-notebooks composable-pipeline -p $PYNQ_JUPYTER_NOTEBOOKS
```
You can see how `pynq` is updated to the latest stable version (2.6.2), and the composable video pipeline gets installed in the `composable-pipeline` folder. This process takes approximately 10 minutes.
The `composable-pipeline` folder contains two notebooks at the top level: one that describes how to set up the board and another that describes how some of the vision functions operate (with examples using OpenCV).
The `application` folder contains five pre-built applications: three of them can be controlled with `ipywidgets` and the remaining two with the switches and buttons on the board.
The image below shows one of the applications; the API at the application level is straightforward, yet powerful enough to provide runtime configuration.
The `custom_pipeline` folder provides several notebooks that gradually introduce the API until you are familiar with it and can build an application. There is a bonus notebook with advanced features!
For instance, to compose a pipeline you would write:
```python
video_pipeline = [video_in_in, rgb2gray, filter2d, colorthr, gray2rgb, video_in_out]
cpipe.compose(video_pipeline)
```
You can also visualize what has been composed through the API.
The composable overlay unleashes the benefits of the programmable logic while providing unprecedented flexibility at the user level.
The provided composable video pipeline has been carefully optimized to support three applications:
- Difference of Gaussians
- Corner Detect
- Color Detect
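To give an idea of what the first of these applications computes, here is a NumPy-only sketch of a Difference of Gaussians on a test image. This runs purely in software and is not the hardware pipeline; the sigma values, kernel radius, and function names are all illustrative:

```python
import numpy as np


def gaussian_kernel(sigma, radius=3):
    """1-D normalized Gaussian kernel, truncated at +/- radius."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()


def blur(img, sigma):
    """Separable Gaussian blur (same-size output, zero-padded edges)."""
    k = gaussian_kernel(sigma)
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, rows)


def difference_of_gaussians(img, sigma1=1.0, sigma2=2.0):
    """Subtract a wider blur from a narrower one to highlight edges."""
    return blur(img, sigma1) - blur(img, sigma2)


# A bright square on a dark background: the DoG responds at its edges
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
dog = difference_of_gaussians(img)
```

In the hardware version, the two blurs and the subtraction run as IP cores chained through the AXI4-Stream Switch, so the same result is produced at video rate.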
However, it is up to your imagination what you can compose!
I would love to hear your feedback and feature requests for future releases. This is an open source project; contributions are welcome.
Hope you enjoy composing!