ZCU111 Build Problems

I’m trying to build the ZCU111 PYNQ software from the GitHub repo Xilinx/ZCU111-PYNQ (board files to build the ZCU111 PYNQ image).

I have Vivado 2021.2 (this is the standard version that we use). See questions below on versions.

~/Pynq$ vivado -version
Vivado v2021.2 (64-bit)
SW Build 3367213 on Tue Oct 19 02:47:39 MDT 2021
IP Build 3369179 on Thu Oct 21 08:25:16 MDT 2021
Copyright 1986-2021 Xilinx, Inc. All Rights Reserved.

Running in an Ubuntu 20.04 LTS VirtualBox VM. From the ZCU111-PYNQ README and other documentation, I have collected the steps below. My first question: can I use Vivado 2021.2, and if so, what needs to be changed? See the error output below after the PYNQ make is run. I also get errors when the PYNQ overlay bitstreams are built; both errors seem related to using Vivado 2021.2. Should I be using 2020.2 instead? Also see the warning below about an unsupported OS when the PetaLinux environment is set up (is this a problem?). Thanks in advance for any help.

Reference: the PYNQ readthedocs pynq_sd_card.html documentation

  1. Download/install Vivado

  2. Clone the PYNQ code:
    git clone https://github.com/Xilinx/PYNQ.git

  3. Set up the host Ubuntu with all required packages:
    ./PYNQ/sdbuild/scripts/setup_host.sh

  4. Build the overlay bitstreams:
    ./PYNQ/build.sh
    ERROR: '2203161343' is an invalid argument. Please specify an integer value.

  5. Clone the specific board repo at Xilinx/ZCU111-PYNQ, e.g.:
    git clone https://github.com/Xilinx/ZCU111-PYNQ.git

  6. Download the PetaLinux tools from the Xilinx Embedded Design Tools page (embedded-design-tools.html). Note the version (it matters and should match the version the board was originally built with), and place the .bsp file in the specific board folder (e.g. ZCU111_Pynq/ZCU111):

  7. Set up the Vivado and Petalinux environments:
    source ~/tools/Xilinx/Vivado/2021.2/settings64.sh

    source ./petalinux-SDK/settings.sh
    PetaLinux environment set to ‘/home/jramsey/Pynq-Builds/petalinux-SDK’
    WARNING: /bin/sh is not bash!
    bash is PetaLinux recommended shell. Please set your default shell to bash.
    WARNING: This is not a supported OS
    INFO: Checking free disk space
    INFO: Checking installed tools
    INFO: Checking installed development libraries
    INFO: Checking network and other services

    petalinux-util --webtalk off

  8. Build Pynq:
    cd ~/Pynq/sdbuild/
    make

    source pynqz2.tcl -notrace

ERROR: [BD::TCL 103-2041] This script was generated using Vivado <2020.2> and is being run in <2021.2> of Vivado. Please run the script in Vivado <2020.2> then open the design in Vivado <2021.2>. Upgrade the design by running "Tools => Report => Report IP Status...", then run write_bd_tcl to create an updated script.
INFO: [Common 17-206] Exiting Vivado at Wed Mar 16 13:33:22 2022…
vivado -mode batch -source build_bitstream.tcl -notrace

****** Vivado v2021.2 (64-bit)
**** SW Build 3367213 on Tue Oct 19 02:47:39 MDT 2021
**** IP Build 3369179 on Thu Oct 21 08:25:16 MDT 2021
** Copyright 1986-2021 Xilinx, Inc. All Rights Reserved.

source build_bitstream.tcl -notrace
ERROR: [Coretcl 2-27] Can’t find specified project.
INFO: [Common 17-206] Exiting Vivado at Wed Mar 16 13:33:45 2022…
make[1]: *** [makefile:13: bitstream] Error 1
make[1]: Leaving directory ‘/home/jramsey/Pynq/sdbuild/build/Pynq-Z2/petalinux_bsp/hardware_project’
make: *** [Makefile:345: /home/jramsey/Pynq/sdbuild/build/Pynq-Z2/petalinux_bsp/xilinx-pynqz2-2021.2.bsp] Error 2

  9. Build the board:
    pushd ~/PYNQ/sdbuild; make BOARDDIR=~/ZCU111-PYNQ

Hi there,

Vivado 2020.2 is the latest supported version for PYNQ 2.7. It might be easier to downgrade your Vivado version than to make changes to the current sdbuild flow.

Also, Ubuntu 20.04 has not been tested with the current sdbuild flow, so I can’t say for certain whether you’ll run into issues or need to install different packages. It is recommended to run in 18.04 and use the setup_host.sh script.
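If you do end up juggling versions, a quick pre-flight check before kicking off a multi-hour build can save a lot of time. Here is a rough sketch (my own, not part of the sdbuild scripts) that parses the `vivado -version` output shown earlier and aborts on a mismatch:

```python
import re
import subprocess

# Latest version supported by the PYNQ v2.7 sdbuild flow
SUPPORTED_VIVADO = "2020.2"

def vivado_version(output: str) -> str:
    """Extract the version number from `vivado -version` output."""
    match = re.search(r"Vivado v(\d{4}\.\d)", output)
    if not match:
        raise ValueError("could not parse Vivado version")
    return match.group(1)

def check_vivado() -> None:
    """Abort early if the Vivado on PATH is not the supported release."""
    out = subprocess.run(["vivado", "-version"],
                         capture_output=True, text=True).stdout
    found = vivado_version(out)
    if found != SUPPORTED_VIVADO:
        raise SystemExit(f"Vivado {found} found, but the sdbuild flow "
                         f"expects {SUPPORTED_VIVADO}")
```

(The Makefile does a similar `vivado -version | fgrep 2020.2` check itself, as you can see later in your build log.)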

I highly recommend you use the PREBUILT and PYNQ_SDIST flags in your make call. You can download the board-agnostic PYNQ v2.7 image here and the PYNQ 2.7 source distribution here.

Then run:

make BOARDDIR=zcu111_dir PYNQ_SDIST=sdist_tarball_dir PREBUILT=prebuilt_image_dir BOARDS=ZCU111

Thanks
Shawn


@skalade

Hi Shawn,

Your suggestion worked (almost). I was able to build the ZCU111 board, but right at the end I saw these messages:

WARNING: Unable to access the TFTPBOOT folder /tftpboot!!!
WARNING: Skip file copy to TFTPBOOT folder!!!

INFO: Failed to copy built images to tftp dir: /tftpboot

tar: /home/jramsey/Pynq-Builds: Cannot read: Is a directory
tar: At beginning of tape, quitting now
tar: Error is not recoverable: exiting now
make: *** [Makefile:345: /home/jramsey/Pynq/sdbuild/build/ZCU111.tar.gz] Error 2

I also saw many "Permission denied" messages before the final WARNING/Error messages shown above, like this one:

rm: cannot remove ‘pynq_2_7_root_fs/opt/microblazeel-xilinx-elf/microblazeel-xilinx-elf/include/c++/9.2.0/cwctype’: Permission denied

Would you know why I’m getting these messages, and should I be concerned? And in what directory are the final binary files that I need to copy to the SD Card? Thanks.

Jeff

Hi Jeff,

Do you have passwordless sudo permissions on your machine? You can usually set this by editing the sudoers file with sudo visudo. Also, once you have this configured, run a sudo command just before you call make; a simple sudo ls is enough to confirm it works.
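If you want to double-check the sudoers entry itself, something like this toy validator works (illustrative only; it only understands the simple one-line `user ALL=(ALL) NOPASSWD:ALL` form, not full sudoers syntax):

```python
import re

def grants_passwordless_sudo(entry: str, user: str) -> bool:
    """Check whether a sudoers line gives `user` passwordless sudo.

    Only handles the simple 'user ALL=(ALL) NOPASSWD:ALL' form, which
    is all the sdbuild flow needs; real sudoers syntax is much richer.
    """
    pattern = (rf"^{re.escape(user)}\s+ALL\s*=\s*\(ALL(:ALL)?\)"
               rf"\s+NOPASSWD:\s*ALL\s*$")
    return re.match(pattern, entry.strip()) is not None
```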

Thanks
Shawn

Hi Shawn,

I have a file named jramsey-user in /etc/sudoers.d:

jramsey@jramsey-VirtualBox:~/Pynq/sdbuild$ sudo cat /etc/sudoers.d/jramsey-user
jramsey ALL=(ALL) NOPASSWD:ALL

I think that’s how passwordless sudo is enabled for a specific user rather than system-wide. I can sudo as myself and execute any command without being prompted for a password, so I’m fairly certain that is working. I re-ran make, and it didn’t rebuild anything (which I expected), but it shows a shorter output with the problem right at the end of the build (see below). What is the build script trying to tar? It looks like the directory above the board directory. And where should the output that needs to be written to an SD card end up, once I get past this problem? Thanks.

Jeff

jramsey@jramsey-VirtualBox:~/Pynq/sdbuild$ make BOARDDIR=~/ZCU111-PYNQ PYNQ_SDIST=~/Pynq-Builds PREBUILT=~/Pynq-Builds BOARDS=ZCU111
/opt/qemu/bin/qemu-aarch64-static -version | fgrep 5.2.0
qemu-aarch64 version 5.2.0
vivado -version | fgrep 2020.2
Vivado v2020.2 (64-bit)
vitis -version | fgrep 2020.2
****** Vitis v2020.2 (64-bit)
which petalinux-config
/home/jramsey/Pynq-Builds/petalinux-SDK/tools/common/petalinux/bin/petalinux-config
which arm-linux-gnueabihf-gcc
/home/jramsey/tools/Xilinx/Vitis/2020.2/gnu/aarch32/lin/gcc-arm-linux-gnueabi/bin/arm-linux-gnueabihf-gcc
which microblaze-xilinx-elf-gcc
/home/jramsey/Pynq-Builds/petalinux-SDK/tools/xsct/gnu/microblaze/lin/bin/microblaze-xilinx-elf-gcc
which ct-ng
/opt/crosstool-ng/bin/ct-ng
bash /home/jramsey/Pynq/sdbuild/scripts/check_env.sh
Checking system for required packages:
bc gperf bison flex texi2html texinfo help2man gawk libtool libtool-bin build-essential automake libglib2.0-dev device-tree-compiler qemu-user-static binfmt-support multistrap git lib32z1 libbz2-1.0 lib32stdc++6 libssl-dev kpartx zerofree u-boot-tools rpm2cpio libsdl1.2-dev rsync python3-pip gcc-multilib libidn11 curl libncurses6 lib32ncurses6
sudo rm -fr /home/jramsey/Pynq/sdbuild/build/focal.ZCU111
mkdir /home/jramsey/Pynq/sdbuild/build/focal.ZCU111
(cd /home/jramsey/Pynq/sdbuild/build/focal.ZCU111 && sudo tar -xf /home/jramsey/Pynq-Builds)
tar: /home/jramsey/Pynq-Builds: Cannot read: Is a directory
tar: At beginning of tape, quitting now
tar: Error is not recoverable: exiting now
make: *** [Makefile:345: /home/jramsey/Pynq/sdbuild/build/ZCU111.tar.gz] Error 2


Oh, apologies, I should have been clearer about the make command. You should be pointing to the prebuilt files rather than the directories they live in. I also recommend using absolute paths rather than relative ones. I think the tar command was trying to extract the contents of the prebuilt rootfs and not finding the tar.gz file. It should be something like this:

make BOARDDIR=/home/jramsey/ZCU111-PYNQ PYNQ_SDIST=/home/jramsey/Pynq-Builds/pynq-2.7.0.tar.gz PREBUILT=/home/jramsey/Pynq-Builds/focal.aarch64.2.7.0_2021_11_17.tar.gz BOARDS=ZCU111

The image file should be in sdbuild/output
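Since a failed attempt costs hours, it can be worth sanity-checking the arguments before launching make. A small sketch (my own, not part of the sdbuild flow) that catches exactly the "Is a directory" mistake above:

```python
import os

def check_make_args(boarddir: str, sdist: str, prebuilt: str) -> list:
    """Sanity-check sdbuild make arguments before a long build.

    BOARDDIR should be a directory; PYNQ_SDIST and PREBUILT should be
    .tar.gz files, not the directories containing them (passing a
    directory is what produces tar's 'Cannot read: Is a directory').
    """
    problems = []
    if not os.path.isdir(boarddir):
        problems.append(f"BOARDDIR is not a directory: {boarddir}")
    for name, path in [("PYNQ_SDIST", sdist), ("PREBUILT", prebuilt)]:
        if os.path.isdir(path) or not path.endswith(".tar.gz"):
            problems.append(f"{name} should be a .tar.gz file: {path}")
    return problems
```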

Thanks
Shawn

Hi Shawn,

I got further with the following command (close to what you suggested, with path corrections):

make BOARDDIR=/home/jramsey/Pynq-Builds/ZCU111-PYNQ PYNQ_SDIST=/home/jramsey/Pynq/pynq-2.7.0.tar.gz PREBUILT=/home/jramsey/Pynq-Builds/focal.aarch64.2.7.0_2021_11_17.tar.gz BOARDS=ZCU111

But I see the following after make runs for a while. Any idea why the Vivado IP build would fail? The "bad lexical cast" seems suspicious. Thanks again for your support; it’s a big help.

****** Vivado v2020.2 (64-bit)
**** SW Build 3064766 on Wed Nov 18 09:12:47 MST 2020
**** IP Build 3064653 on Wed Nov 18 14:17:31 MST 2020
** Copyright 1986-2020 Xilinx, Inc. All Rights Reserved.

source run_ippack.tcl -notrace
bad lexical cast: source type value could not be interpreted as target
while executing
"rdi::set_property core_revision 2203181132 {component component_1}"
invoked from within
"set_property core_revision $Revision $core"
(file "run_ippack.tcl" line 937)
INFO: [Common 17-206] Exiting Vivado at Fri Mar 18 11:33:32 2022…
ERROR: [IMPL 213-28] Failed to generate IP.
INFO: [HLS 200-111] Finished Command export_design CPU user time: 18.42 seconds. CPU system time: 2.88 seconds. Elapsed time: 56.19 seconds; current allocated memory: 246.764 MB.
command ‘ap_source’ returned error code
while executing
“source color_convert/script.tcl”
(“uplevel” body line 1)
invoked from within
"uplevel #0 [list source $arg] "

INFO: [HLS 200-112] Total CPU user time: 36.94 seconds. Total CPU system time: 4.98 seconds. Total elapsed time: 88.6 seconds; peak allocated memory: 242.969 MB.
INFO: [Common 17-206] Exiting vitis_hls at Fri Mar 18 11:33:38 2022…
child process exited abnormally
INFO: [Common 17-206] Exiting Vivado at Fri Mar 18 11:33:38 2022…
make[1]: *** [makefile:10: hls_ip] Error 1
make[1]: Leaving directory ‘/home/jramsey/Pynq/sdbuild/build/PYNQ/boards/Pynq-Z2/base’

+ unmount_special
+ for fs in $fss
+ sudo umount -l /home/jramsey/Pynq/sdbuild/build/focal.ZCU111/proc
+ for fs in $fss
+ sudo umount -l /home/jramsey/Pynq/sdbuild/build/focal.ZCU111/run
+ for fs in $fss
+ sudo umount -l /home/jramsey/Pynq/sdbuild/build/focal.ZCU111/dev
+ sudo umount -l /home/jramsey/Pynq/sdbuild/build/focal.ZCU111/ccache
+ rmdir /home/jramsey/Pynq/sdbuild/build/focal.ZCU111/ccache
make: *** [Makefile:346: /home/jramsey/Pynq/sdbuild/build/ZCU111.tar.gz] Error 2

Oh this looks like the dreaded Vivado y2k22 bug… There’s a patch and documentation on it here.
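As I understand the bug, Vivado packs the IP build timestamp into the revision field as a YYMMDDHHMM number, which is stored in a signed 32-bit integer and therefore overflows for any date from January 2022 onwards. A quick illustration using the revision from your log:

```python
from datetime import datetime

INT32_MAX = 2**31 - 1  # 2147483647

def ip_revision(ts: datetime) -> int:
    """Derive an IP core_revision the way the y2k22 bug assumes:
    the build timestamp encoded as YYMMDDHHMM (two-digit year)."""
    return int(ts.strftime("%y%m%d%H%M"))

# The revision from the run_ippack.tcl error above: 18 Mar 2022, 11:32
rev = ip_revision(datetime(2022, 3, 18, 11, 32))
assert rev == 2203181132
# Any 2022+ timestamp exceeds int32 range, hence the 'bad lexical cast'
assert rev > INT32_MAX
# Late-2021 timestamps still fit, which is why builds worked before
assert ip_revision(datetime(2021, 12, 31, 23, 59)) <= INT32_MAX
```

(Treat the exact encoding as my reading of the advisory; the fix either way is the official patch.)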

Thanks
Shawn

Hi Shawn,

Okay, it looks like I’m past that problem now that Vivado 2020.2 is patched. The build attempts to build all the boards and ends up failing on the Z2. The output from the run is below. Again, this is the command I used:

make BOARDDIR=/home/jramsey/Pynq-Builds/ZCU111-PYNQ PYNQ_SDIST=/home/jramsey/Pynq/pynq-2.7.0.tar.gz PREBUILT=/home/jramsey/Pynq-Builds/focal.aarch64.2.7.0_2021_11_17.tar.gz BOARDS=ZCU111

Since I only listed the ZCU111, why would all the boards be built? The output follows. I think I might save the 20.04 VM and create a new VM based on 18.04 so that my environment matches the suggested build environment precisely (although I’m going to use the sdbuild/scripts/setup_host.sh script rather than Vagrant). Note that in the output below there is no runme.log file.

Thanks,

Jeff

synth_1: /home/jramsey/Pynq/sdbuild/build/PYNQ/boards/Pynq-Z2/base/base/base.runs/synth_1/runme.log
[Fri Mar 18 19:19:12 2022] Launched impl_1…
Run output will be captured here: /home/jramsey/Pynq/sdbuild/build/PYNQ/boards/Pynq-Z2/base/base/base.runs/impl_1/runme.log
launch_runs: Time (s): cpu = 00:04:01 ; elapsed = 00:04:22 . Memory (MB): peak = 2660.562 ; gain = 247.305 ; free physical = 133 ; free virtual = 4684
[Fri Mar 18 19:19:12 2022] Waiting for impl_1 to finish…
/home/jramsey/tools/Xilinx/Vivado/2020.2/bin/rdiArgs.sh: line 309:  3213 Killed "$RDI_PROG" "$@"
make[1]: *** [makefile:16: bitstream] Error 137
make[1]: Leaving directory ‘/home/jramsey/Pynq/sdbuild/build/PYNQ/boards/Pynq-Z2/base’
+ unmount_special
+ for fs in $fss
+ sudo umount -l /home/jramsey/Pynq/sdbuild/build/focal.ZCU111/proc
+ for fs in $fss
+ sudo umount -l /home/jramsey/Pynq/sdbuild/build/focal.ZCU111/run
+ for fs in $fss
+ sudo umount -l /home/jramsey/Pynq/sdbuild/build/focal.ZCU111/dev
+ sudo umount -l /home/jramsey/Pynq/sdbuild/build/focal.ZCU111/ccache
+ rmdir /home/jramsey/Pynq/sdbuild/build/focal.ZCU111/ccache
make: *** [Makefile:346: /home/jramsey/Pynq/sdbuild/build/ZCU111.tar.gz] Error 2

Hi Shawn,

I got further with Vivado 2020.2 patched. Again, the command I am using follows. It fails on building the Z2, and it appears as though all the boards are being built. If I specify only the ZCU111, why would the other boards be built?

make BOARDDIR=/home/jramsey/Pynq-Builds/ZCU111-PYNQ PYNQ_SDIST=/home/jramsey/Pynq/pynq-2.7.0.tar.gz PREBUILT=/home/jramsey/Pynq-Builds/focal.aarch64.2.7.0_2021_11_17.tar.gz BOARDS=ZCU111

Here’s what’s in the ZCU111 directory when the build ends, after failing the Z2 portion. Does this mean the ZCU111 was built successfully? And which parts of this do I need on the SD card to boot the ZCU111 (or are the SD card files in another location)?

jramsey@jramsey-VirtualBox:~/Pynq/sdbuild/build/PYNQ/boards/ZCU111$ ls -l
total 530924
drwxrwxr-x 3 jramsey jramsey 4096 Mar 18 11:30 packages
drwxrwxr-x 3 jramsey jramsey 4096 Mar 18 11:30 petalinux_bsp
-rw-rw-r-- 1 jramsey jramsey 543645818 Mar 18 11:30 xilinx-zcu111-v2020.2-final.bsp
-rwxrwxr-x 1 jramsey jramsey 176 Mar 18 11:30 ZCU111.spec

I am going to save my current 20.04 Ubuntu VM and recreate an Ubuntu 18.04 VM (using the sdbuild/scripts/setup_host.sh script and not Vagrant) so that my environment matches the suggested build environment. Here’s the excerpt from the build output; let me know if you can think of any reason why the Z2 build is failing (for some reason there’s no runme.log file in the location stated in the output). Thanks again.

Jeff

Run output will be captured here: /home/jramsey/Pynq/sdbuild/build/PYNQ/boards/Pynq-Z2/base/base/base.runs/impl_1/runme.log
launch_runs: Time (s): cpu = 00:04:01 ; elapsed = 00:04:22 . Memory (MB): peak = 2660.562 ; gain = 247.305 ; free physical = 133 ; free virtual = 4684
[Fri Mar 18 19:19:12 2022] Waiting for impl_1 to finish…
/home/jramsey/tools/Xilinx/Vivado/2020.2/bin/rdiArgs.sh: line 309:  3213 Killed "$RDI_PROG" "$@"
make[1]: *** [makefile:16: bitstream] Error 137
make[1]: Leaving directory ‘/home/jramsey/Pynq/sdbuild/build/PYNQ/boards/Pynq-Z2/base’

+ unmount_special
+ for fs in $fss
+ sudo umount -l /home/jramsey/Pynq/sdbuild/build/focal.ZCU111/proc
+ for fs in $fss
+ sudo umount -l /home/jramsey/Pynq/sdbuild/build/focal.ZCU111/run
+ for fs in $fss
+ sudo umount -l /home/jramsey/Pynq/sdbuild/build/focal.ZCU111/dev
+ sudo umount -l /home/jramsey/Pynq/sdbuild/build/focal.ZCU111/ccache
+ rmdir /home/jramsey/Pynq/sdbuild/build/focal.ZCU111/ccache
make: *** [Makefile:346: /home/jramsey/Pynq/sdbuild/build/ZCU111.tar.gz] Error 2


Hi Jeff,

It seems like your build crashed because your VM is running out of memory; the crash message you’re seeing is discussed here and here, where the solution was to add more swap/RAM space.
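If you want to check headroom before relaunching, reading /proc/meminfo is enough. A rough sketch (the 8 GiB threshold is my assumption of a comfortable minimum for these implementation runs, not an official figure):

```python
def free_build_memory_kib(meminfo: str) -> int:
    """Sum MemAvailable and SwapFree (both reported in KiB) from
    /proc/meminfo-style text."""
    total = 0
    for line in meminfo.splitlines():
        key, _, rest = line.partition(":")
        if key in ("MemAvailable", "SwapFree"):
            total += int(rest.split()[0])
    return total

# Assumed working threshold; large UltraScale+ impl runs may need more
MIN_KIB = 8 * 1024 * 1024

with open("/proc/meminfo") as f:
    headroom = free_build_memory_kib(f.read())
print(f"{headroom // 1024} MiB available (RAM + swap)")
if headroom < MIN_KIB:
    print("warning: Vivado impl runs may be OOM-killed (exit code 137)")
```

Exit code 137 is 128 + SIGKILL(9), which is consistent with the kernel OOM killer terminating the run.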

Unfortunately, some of the bitstreams for the Z2 have to be built as part of the pynq package (it happens in this script). However, you don’t have to rebuild the Pynq-Z1 and ZCU104 bitstreams if you provide the prebuilt sdist with the PYNQ_SDIST flag.

Thanks
Shawn


Hi Shawn,

While waiting for the memory I ordered to arrive (I need more RAM in order to increase the VM RAM size), I decided to try building the ZCU216 project that you referenced. After 13 hours, the build completed with my current RAM configuration, so I may just use that as a base for the ZCU208 adaptation. I’m going to try booting the ZCU216 build on the ZCU208 board; I’m not sure what to expect, but I’ll see. Thanks again for your help, and I’ll update this thread as I get further with my ZCU208 PYNQ testing.

Jeff

Hi Shawn,

I’ve been trying to get the ZCU216 build up on the ZCU208 board, and I’m wondering if you can give me some references on where to look for the register mappings for the tics files in the ZCU216 build. I want to verify their settings and how appropriate/accurate they are for the ZCU208 board, which is my real target. There are two tics files in the ZCU216 repo, as follows:

pynq@pynq-VirtualBox:~/ZCU216-PYNQ/tics$ ls
LMK04828_245.76.txt LMX2594_491.52.txt

Here’s an excerpt from the first file (I won’t include all the contents; it’s rather large):

R0 (INIT) 0x000090
R0 0x000010
R2 0x000200
R3 0x000306
R4 0x0004D0
R5 0x00055B
R6 0x000600
R12 0x000C51
...
R371 0x017300
R8189 0x1FFD00
R8190 0x1FFE00
R8191 0x1FFF53

I was looking at ug1087-zynq-ultrascale-registers.htm to see if the mappings are obvious, but nothing is jumping out at me, if that’s even the correct reference. Thanks again for your help.

Jeff

Hi Jeff,

I believe both the ZCU208 and ZCU216 use the CLK104 daughterboard for clocking, which has LMK04828B and LMX2594 chips, so the registers should be the same. There’s a bit more info on this, and on how it’s configured in PYNQ, in the xrfdc package.
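If it helps with cross-checking the two files, the export format is simple enough to sanity-check mechanically. A small sketch of a line parser (my own; it assumes the LMK04828 datasheet's 24-bit programming-word layout, register address in the upper bits and 8-bit data in the low byte, so treat the field split as an assumption to verify against the datasheet):

```python
import re

# Matches lines like 'R12 0x000C51' or '|R8191|0x1FFF53|'
LINE_RE = re.compile(r"R(\d+)(?:\s*\(INIT\))?[|\s]+0x([0-9A-Fa-f]{6})")

def parse_tics_line(line: str):
    """Split a TICS Pro register line into (register, address, data).

    Assumes the address field of the 24-bit word equals the R number,
    which holds for write words with the R/W bit clear.
    """
    m = LINE_RE.search(line)
    if not m:
        raise ValueError(f"not a register line: {line!r}")
    reg, word = int(m.group(1)), int(m.group(2), 16)
    addr, data = word >> 8, word & 0xFF
    if addr != reg:
        raise ValueError(f"address field 0x{addr:X} != R{reg}")
    return reg, addr, data
```

Running it over the excerpts above, e.g. R12 0x000C51 splits into address 0x00C (12) and data 0x51, which is consistent with that layout.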

Thanks
Shawn

Hi Shawn,

I’ve got Jupyter up and communicating with my ZCU208 (using the publicly available ZCU216 build mentioned above). Here are my questions:

  1. The base.bit file wasn’t available in the file system (I would assume whatever packages are built would be included in the file system created when U-Boot loads the image, but maybe that’s not the case). So I pushed over a base.bit and base.hdf to the directory that PYNQ expects for the Python overlays (I actually used the ZCU104 base files; maybe I should use the Z2?). The Jupyter notebook for IPython seems fine, but when I execute the following from a notebook, it produces an error for the base overlay. After I pushed over the base bit files, the error changed.

from pynq.overlays.base import BaseOverlay
base = BaseOverlay("base.bit")

I’m not in the lab so I can’t get the exact error, but does the build system have an option to automatically deploy all the overlays, or are they built into the file system that U-Boot loads on start-up, depending on the packages included in the build spec?

  2. Just to see what the results of another build are, I rebuilt the ZCU111. It still has errors at the end (and warnings scattered throughout). First, should I be concerned about warnings like these?

WARNING: [BD 41-1306] The connection to interface pin </lcp_ar/FSM_generator/fsm_bram_rst_addr/gpio_io_o> is being overridden by the user with net <smb_bram_rst_addr_o>. This pin will not be connected as a part of interface connection .
WARNING: [BD 41-2180] Resetting the memory initialization file of </lcp_ar/lmb/lmb_bram> to default.
CRITICAL WARNING: [BD 41-1265] Different slave segments </lcp_ar/lmb/lmb_bram_if_cntlr/SLMB1/Mem> and </ps7_0/S_AXI_HP0/HP0_DDR_LOWOCM> are mapped into related address spaces </lcp_ar/mb/Data> and </lcp_ar/axi_cdma_0/Data>, at the same offset 0x0000_0000 [ 64K ].

  3. Right at the end of the build, the installation failed. See the following excerpt from the build output. Can you tell what is going on? Is the installation trying to install the build results in /root? Thanks again for your help. I’m getting closer, but it’s a little frustrating. It sure would be nice if Xilinx had a supported ZCU208 PYNQ port.

Jeff

+ export HOME=/root
+ HOME=/root
+ export BOARD=ZCU111
+ BOARD=ZCU111
+ cd /root/xrfclk_build
+ pip3 install .
Processing /root/xrfclk_build
  DEPRECATION: A future pip version will change local packages to be built in-place without first copying to a temporary directory. We recommend you use --use-feature=in-tree-build to test your packages with this new behavior before it becomes the default.
   pip 21.3 will remove support for this functionality. You can find discussion regarding this at https://github.com/pypa/pip/issues/7555.
  Installing build dependencies: started
  Installing build dependencies: finished with status 'error'
  ERROR: Command errored out with exit status 1:
   command: /usr/local/share/pynq-venv/bin/python3 /tmp/pip-standalone-pip-kfxgpz45/__env_pip__.zip/pip install --ignore-installed --no-user --prefix /tmp/pip-build-env-znmtelh8/overlay --no-warn-script-location --no-binary :none: --only-binary :none: -i https://pypi.org/simple -- 'setuptools>=40.8.0' wheel
       cwd: None
  Complete output (7 lines):
  WARNING: Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<pip._vendor.urllib3.connection.HTTPSConnection object at 0x550c70f880>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')': /simple/setuptools/
  ERROR: Could not find a version that satisfies the requirement setuptools>=40.8.0 (from versions: none)
  ERROR: No matching distribution found for setuptools>=40.8.0
  ----------------------------------------
WARNING: Discarding file:///root/xrfclk_build. Command errored out with exit status 1: /usr/local/share/pynq-venv/bin/python3 /tmp/pip-standalone-pip-kfxgpz45/__env_pip__.zip/pip install --ignore-installed --no-user --prefix /tmp/pip-build-env-znmtelh8/overlay --no-warn-script-location --no-binary :none: --only-binary :none: -i https://pypi.org/simple -- 'setuptools>=40.8.0' wheel Check the logs for full command output.

Hi Jeff,

You can’t use base designs from other boards; you would have to build your own overlay for the ZCU208. Image builds don’t necessarily require a base.bit.
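For reference, when BaseOverlay is given a bare name like "base.bit", pynq looks it up under its package overlays folder, while an absolute path to your own .bit (with matching .hwh alongside it) bypasses that lookup. Roughly, the resolution behaves like this sketch (an illustrative approximation, not the actual pynq source):

```python
import os

def resolve_overlay(bitfile: str, overlays_dir: str) -> str:
    """Mimic (roughly) how pynq resolves a bitstream argument.

    A bare name is looked up as <overlays_dir>/<name>/<name>.bit;
    an absolute path is used as-is. Illustrative approximation only.
    """
    if os.path.isabs(bitfile):
        return bitfile
    name = os.path.splitext(bitfile)[0]
    return os.path.join(overlays_dir, name, bitfile)
```

So dropping a ZCU104 base.bit into that folder makes the file load, but the design inside it still targets the wrong part, hence the changed (rather than fixed) error.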

Warnings are usually fine… I don’t really know what this design is building, so I can’t comment too much.

The QEMU scripts run as if you were on the board, so when it says /root it’s referring to the root directory on the image. From this part of the log:

WARNING: Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<pip._vendor.urllib3.connection.HTTPSConnection object at 0x550c70f880>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution')': /simple/setuptools/
  ERROR: Could not find a version that satisfies the requirement setuptools>=40.8.0 (from versions: none)

Were you by any chance not connected to the internet on your host machine, or did you lose connectivity for a moment?
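If connectivity is intermittent, a quick name-resolution check on the host just before resuming the build can rule this out; a minimal sketch:

```python
import socket

def can_resolve(host: str) -> bool:
    """Return True if `host` resolves -- a quick proxy for the
    'Temporary failure in name resolution' pip error above."""
    try:
        socket.getaddrinfo(host, None)
        return True
    except socket.gaierror:
        return False
```

For example, `can_resolve("pypi.org")` should return True before the rootfs stage is allowed to run pip.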

Thanks
Shawn