DPU-PYNQ connect to S_AXI_LPD

Hello,

I am trying to use the DPU-PYNQ flow to connect the instruction fetch port (M_AXI_GP0) of a single DPU core to the S_AXI_LPD port on a Zynq US+, as suggested in this documentation. I am getting the following error:

ERROR: [CF2SW 83-2178] Memory for component zynq_ultra_ps_e, interface S_AXI_LPD cannot have address segment information automatically inferred. Please annotate the interface with memory segment information.

I have a working design where the instruction fetch port is connected to the S_AXI_HPC1_FPD port. However, I have significant data movement on the S_AXI_HPC0_FPD port, which results in a ~25% performance drop for the DPU and a nominal performance drop for the subsystem connected to S_AXI_HPC0_FPD. I am trying to resolve this by using only the S_AXI_HPC0_FPD port in the design and moving instruction fetching to the S_AXI_LPD port.

I looked up the error but haven't found anything that could help me with this issue, so any help would be highly appreciated.

Thank you,

Mario

Setup

  • Custom ZU+ SoC
  • PYNQ 2.7.0 image
  • DPU-PYNQ 1.4.0
  • Xilinx tools 2020.2
  • Ubuntu 18.04.6 set up using Vagrant file

AXI Port for the platform

(screenshot: AXI port configuration for the platform)

prj_config content

[clock]
id=1:DPUCZDX8G_1.aclk
id=2:DPUCZDX8G_1.ap_clk_2 

[connectivity]
sp=DPUCZDX8G_1.M_AXI_GP0:LPD
sp=DPUCZDX8G_1.M_AXI_HP0:HP0
sp=DPUCZDX8G_1.M_AXI_HP2:HP2

[advanced]
misc=:solution_name=link
param=compiler.addOutputTypes=sd_card

[vivado]
prop=run.impl_1.strategy=Performance_Explore
param=place.runPartPlacer=0

Full build log

vagrant@ubuntu-bionic:/workspace/DPU-PYNQ/boards$ make BOARD=mlvap_test1 VITIS_PLATFORM=/workspace/mlvap_test1_pfm/export/mlvap_test1_pfm/mlvap_test1_pfm.xpfm 
BOARD: mlvap_test1
VITIS_PLATFORM: /workspace/mlvap_test1_pfm/export/mlvap_test1_pfm/mlvap_test1_pfm.xpfm
bash check_env.sh
cp -rf /workspace/DPU-PYNQ/boards/../vitis-ai-git/dsa/DPU-TRD/prj/Vitis/kernel_xml/dpu/kernel.xml /workspace/DPU-PYNQ/boards/mlvap_test1/kernel_xml/dpu/kernel.xml
cp -f /workspace/DPU-PYNQ/boards/../vitis-ai-git/dsa/DPU-TRD/prj/Vitis/scripts/package_dpu_kernel.tcl /workspace/DPU-PYNQ/boards/mlvap_test1/scripts/package_dpu_kernel.tcl
sed -i 's/set path_to_hdl "..\/..\/dpu_ip"/set path_to_hdl "..\/..\/vitis-ai-git\/dsa\/DPU-TRD\/dpu_ip"/' /workspace/DPU-PYNQ/boards/mlvap_test1/scripts/package_dpu_kernel.tcl
cp -f /workspace/DPU-PYNQ/boards/../vitis-ai-git/dsa/DPU-TRD/prj/Vitis/scripts/gen_dpu_xo.tcl /workspace/DPU-PYNQ/boards/mlvap_test1/scripts/gen_dpu_xo.tcl
cp -f /workspace/DPU-PYNQ/boards/../vitis-ai-git/dsa/DPU-TRD/prj/Vitis/scripts/bip_proc.tcl /workspace/DPU-PYNQ/boards/mlvap_test1/scripts/bip_proc.tcl
cd /workspace/DPU-PYNQ/boards/mlvap_test1 ;\
/workspace/xilinx/Vivado/2020.2/bin/vivado -mode batch -source scripts/gen_dpu_xo.tcl \
	-tclargs binary_container_1/dpu.xo DPUCZDX8G hw mlvap_test1

****** Vivado v2020.2 (64-bit)
  **** SW Build 3064766 on Wed Nov 18 09:12:47 MST 2020
  **** IP Build 3064653 on Wed Nov 18 14:17:31 MST 2020
    ** Copyright 1986-2020 Xilinx, Inc. All Rights Reserved.

source scripts/gen_dpu_xo.tcl
# if { $::argc != 4 } {
#     puts "ERROR: Program \"$::argv0\" requires 4 arguments!\n"
#     puts "Usage: $::argv0 <xoname> <krnl_name> <target> <device>\n"
#     exit
# }
# set xoname    [lindex $::argv 0]
# set krnl_name [lindex $::argv 1]
# set target    [lindex $::argv 2]
# set device    [lindex $::argv 3]
# puts $xoname
binary_container_1/dpu.xo
# set suffix "${krnl_name}_${target}_${device}"
# if { [info exists ::env(DIR_PATH)] } {
#     source -notrace $env(DIR_PRJ)/scripts/package_dpu_kernel.tcl
# } else {
#     source -notrace ./scripts/package_dpu_kernel.tcl
# }
INFO: [IP_Flow 19-5654] Module 'DPUCZDX8G' uses SystemVerilog sources with a Verilog top file. These SystemVerilog files will not be analysed by the packager.
INFO: [IP_Flow 19-1842] HDL Parser: Found include file "src/arch_def.vh" from the top-level HDL file.
INFO: [IP_Flow 19-1842] HDL Parser: Found include file "/workspace/DPU-PYNQ/boards/mlvap_test1/dpu_conf.vh" from the top-level HDL file.
INFO: [IP_Flow 19-1841] HDL Parser: Add include file "/workspace/DPU-PYNQ/boards/mlvap_test1/dpu_conf.vh" to file group xilinx_anylanguagesynthesis.
INFO: [IP_Flow 19-1841] HDL Parser: Add include file "/workspace/DPU-PYNQ/boards/mlvap_test1/dpu_conf.vh" to file group xilinx_anylanguagebehavioralsimulation.
INFO: [IP_Flow 19-234] Refreshing IP repositories
INFO: [IP_Flow 19-1704] No user IP repositories specified
INFO: [IP_Flow 19-2313] Loaded Vivado IP repository '/workspace/xilinx/Vivado/2020.2/data/ip'.
INFO: [IP_Flow 19-5107] Inferred bus interface 'aclk' of definition 'xilinx.com:signal:clock:1.0' (from X_INTERFACE_INFO parameter from HDL file).
INFO: [IP_Flow 19-5107] Inferred bus interface 'aclk' of definition 'xilinx.com:signal:clock:1.0' (from 'X_INTERFACE_INFO' attribute).
INFO: [IP_Flow 19-5107] Inferred bus interface 'ap_clk_2' of definition 'xilinx.com:signal:clock:1.0' (from X_INTERFACE_INFO parameter from HDL file).
INFO: [IP_Flow 19-5107] Inferred bus interface 'ap_clk_2' of definition 'xilinx.com:signal:clock:1.0' (from 'X_INTERFACE_INFO' attribute).
INFO: [IP_Flow 19-5107] Inferred bus interface 'ap_rst_n_2' of definition 'xilinx.com:signal:reset:1.0' (from X_INTERFACE_INFO parameter from HDL file).
INFO: [IP_Flow 19-5107] Inferred bus interface 'ap_rst_n_2' of definition 'xilinx.com:signal:reset:1.0' (from 'X_INTERFACE_INFO' attribute).
INFO: [IP_Flow 19-5107] Inferred bus interface 'aresetn' of definition 'xilinx.com:signal:reset:1.0' (from X_INTERFACE_INFO parameter from HDL file).
INFO: [IP_Flow 19-5107] Inferred bus interface 'aresetn' of definition 'xilinx.com:signal:reset:1.0' (from 'X_INTERFACE_INFO' attribute).
INFO: [IP_Flow 19-5107] Inferred bus interface 'M_AXI_GP0' of definition 'xilinx.com:interface:aximm:1.0' (from Xilinx Repository).
INFO: [IP_Flow 19-5107] Inferred bus interface 'M_AXI_HP0' of definition 'xilinx.com:interface:aximm:1.0' (from Xilinx Repository).
INFO: [IP_Flow 19-5107] Inferred bus interface 'M_AXI_HP2' of definition 'xilinx.com:interface:aximm:1.0' (from Xilinx Repository).
INFO: [IP_Flow 19-5107] Inferred bus interface 'S_AXI_CONTROL' of definition 'xilinx.com:interface:aximm:1.0' (from Xilinx Repository).
INFO: [IP_Flow 19-5107] Inferred bus interface 'interrupt' of definition 'xilinx.com:signal:interrupt:1.0' (from Xilinx Repository).
INFO: [IP_Flow 19-4728] Bus Interface 'interrupt': Added interface parameter 'SENSITIVITY' with value 'LEVEL_HIGH'.
INFO: [IP_Flow 19-4728] Bus Interface 'aclk': Added interface parameter 'ASSOCIATED_BUSIF' with value 'M_AXI_GP0'.
INFO: [IP_Flow 19-4728] Bus Interface 'aclk': Added interface parameter 'ASSOCIATED_RESET' with value 'aresetn'.
INFO: [IP_Flow 19-4728] Bus Interface 'ap_clk_2': Added interface parameter 'ASSOCIATED_RESET' with value 'ap_rst_n_2'.
INFO: [IP_Flow 19-4728] Bus Interface 'ap_rst_n_2': Added interface parameter 'POLARITY' with value 'ACTIVE_LOW'.
INFO: [IP_Flow 19-4728] Bus Interface 'aresetn': Added interface parameter 'POLARITY' with value 'ACTIVE_LOW'.
WARNING: [IP_Flow 19-5661] Bus Interface 'ap_clk_2' does not have any bus interfaces associated with it.
WARNING: [IP_Flow 19-3157] Bus Interface 'ap_rst_n_2': Bus parameter POLARITY is ACTIVE_LOW but port 'ap_rst_n_2' is not *resetn - please double check the POLARITY setting.
WARNING: [IP_Flow 19-731] File Group 'xilinx_anylanguagesynthesis (Synthesis)': "/workspace/DPU-PYNQ/boards/mlvap_test1/dpu_conf.vh" file path is not relative to the IP root directory.
WARNING: [IP_Flow 19-4816] The Synthesis file group has two include files that have the same base name. It is not guaranteed which of these two files will be picked up during synthesis/simulation:   src/dpu_conf.vh
  /workspace/DPU-PYNQ/boards/mlvap_test1/dpu_conf.vh
WARNING: [IP_Flow 19-991] Unrecognized or unsupported file 'src/fingerprint_json.ttcl' found in file group 'Synthesis'.
Resolution: Remove the file from the specified file group.
WARNING: [IP_Flow 19-731] File Group 'xilinx_anylanguagebehavioralsimulation (Simulation)': "/workspace/DPU-PYNQ/boards/mlvap_test1/dpu_conf.vh" file path is not relative to the IP root directory.
WARNING: [IP_Flow 19-4816] The Simulation file group has two include files that have the same base name. It is not guaranteed which of these two files will be picked up during synthesis/simulation:   src/dpu_conf.vh
  /workspace/DPU-PYNQ/boards/mlvap_test1/dpu_conf.vh
WARNING: [IP_Flow 19-991] Unrecognized or unsupported file 'src/fingerprint_json.ttcl' found in file group 'Simulation'.
Resolution: Remove the file from the specified file group.
INFO: [IP_Flow 19-2181] Payment Required is not set for this core.
INFO: [IP_Flow 19-2187] The Product Guide file is missing.
ipx::package_project: Time (s): cpu = 00:00:04 ; elapsed = 00:00:06 . Memory (MB): peak = 2467.258 ; gain = 0.344 ; free physical = 4172 ; free virtual = 8759
INFO: [IP_Flow 19-795] Syncing license key meta-data
INFO: [IP_Flow 19-234] Refreshing IP repositories
INFO: [IP_Flow 19-1704] No user IP repositories specified
INFO: [IP_Flow 19-2313] Loaded Vivado IP repository '/workspace/xilinx/Vivado/2020.2/data/ip'.
INFO: [IP_Flow 19-5107] Inferred bus interface 'ap_clk_2' of definition 'xilinx.com:signal:clock:1.0' (from TCL Argument).
INFO: [IP_Flow 19-5107] Inferred bus interface 'ap_rst_n_2' of definition 'xilinx.com:signal:reset:1.0' (from TCL Argument).
# if {[file exists "${xoname}"]} {
#     file delete -force "${xoname}"
# }
# if { [info exists ::env(DIR_PATH)] } {
#     package_xo -xo_path ${xoname} -kernel_name ${krnl_name} -ip_directory ./packaged_kernel_${suffix} -kernel_xml $env(DIR_PRJ)/kernel_xml/dpu/kernel.xml
# } else {
#     package_xo -xo_path ${xoname} -kernel_name ${krnl_name} -ip_directory ./packaged_kernel_${suffix} -kernel_xml ./kernel_xml/dpu/kernel.xml
# }
WARNING: [Vivado 12-4404] The CPU emulation flow in v++ is only supported when using a packaged XO file that contains C-model files, none were found.
INFO: [Common 17-206] Exiting Vivado at Mon Oct  2 14:48:18 2023...
cd /workspace/DPU-PYNQ/boards/mlvap_test1 ;\
v++ -t hw --platform /workspace/mlvap_test1_pfm/export/mlvap_test1_pfm/mlvap_test1_pfm.xpfm --save-temps --config /workspace/DPU-PYNQ/boards/mlvap_test1/prj_config --xp param:compiler.userPostSysLinkTcl=/workspace/DPU-PYNQ/boards/../vitis-ai-git/dsa/DPU-TRD/prj/Vitis/syslink/strip_interconnects.tcl -l --temp_dir binary_container_1 \
	--log_dir binary_container_1/logs \
	--remote_ip_cache binary_container_1/ip_cache -o /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/dpu.xclbin binary_container_1/dpu.xo
WARNING: [v++ 60-1600] The option 'xp' was used directly on the command line, where its usage is deprecated. To ensure input line works for supported operating systems or shells, v++ supports specification for some options in a configuration file. As an alternative, please use options 'advanced.*', 'vivado.*' in a configuration file. Use one or more configuration files along with section headers to define key-value pairs for the advanced properties or parameters. Specify a configuration file using '--config'.
INFO: [v++ 82-185] Check out the auto-generated 'sample_link.ini' configuration file. The file shows how to migrate from deprecated command line --xp switches to configuration file directives.
Option Map File Used: '/workspace/xilinx/Vitis/2020.2/data/vitis/vpp/optMap.xml'

****** v++ v2020.2 (64-bit)
  **** SW Build (by xbuild) on 2020-11-18-05:13:29
    ** Copyright 1986-2020 Xilinx, Inc. All Rights Reserved.

WARNING: [v++ 60-1495] Deprecated parameter found: compiler.userPostSysLinkTcl. Please use this replacement parameter instead: compiler.userPostDebugProfileOverlayTcl
INFO: [v++ 60-1306] Additional information associated with this v++ link can be found at:
	Reports: /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/reports/link
	Log files: /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/logs/link
Running Dispatch Server on port:35543
INFO: [v++ 60-1548] Creating build summary session with primary output /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/dpu.xclbin.link_summary, at Mon Oct  2 14:48:29 2023
INFO: [v++ 60-1316] Initiating connection to rulecheck server, at Mon Oct  2 14:48:29 2023
Running Rule Check Server on port:34103
INFO: [v++ 60-1315] Creating rulecheck session with output '/workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/reports/link/v++_link_dpu_guidance.html', at Mon Oct  2 14:48:30 2023
INFO: [v++ 60-895]   Target platform: /workspace/mlvap_test1_pfm/export/mlvap_test1_pfm/mlvap_test1_pfm.xpfm
INFO: [v++ 60-1578]   This platform contains Xilinx Shell Archive '/workspace/mlvap_test1_pfm/export/mlvap_test1_pfm/hw/mlvap-9z1.xsa'
INFO: [v++ 60-629] Linking for hardware target
INFO: [v++ 60-423]   Target device: mlvap_test1_pfm
INFO: [v++ 60-1332] Run 'run_link' status: Not started
INFO: [v++ 60-1443] [14:48:31] Run run_link: Step system_link: Started
INFO: [v++ 60-1453] Command Line: system_link --xo /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/dpu.xo -keep --config /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/int/syslinkConfig.ini --xpfm /workspace/mlvap_test1_pfm/export/mlvap_test1_pfm/mlvap_test1_pfm.xpfm --target hw --output_dir /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/int --temp_dir /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/sys_link
INFO: [v++ 60-1454] Run Directory: /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/run_link
INFO: [SYSTEM_LINK 60-1316] Initiating connection to rulecheck server, at Mon Oct  2 14:48:32 2023
INFO: [SYSTEM_LINK 82-70] Extracting xo v3 file /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/dpu.xo
INFO: [SYSTEM_LINK 82-53] Creating IP database /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/sys_link/_sysl/.cdb/xd_ip_db.xml
INFO: [SYSTEM_LINK 82-38] [14:48:32] build_xd_ip_db started: /workspace/xilinx/Vitis/2020.2/bin/build_xd_ip_db -ip_search 0  -sds-pf /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/sys_link/mlvap-9z1.hpfm -clkid 1 -ip /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/sys_link/iprepo/xilinx_com_RTLKernel_DPUCZDX8G_1_0,DPUCZDX8G -o /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/sys_link/_sysl/.cdb/xd_ip_db.xml
INFO: [SYSTEM_LINK 82-37] [14:48:36] build_xd_ip_db finished successfully
Time (s): cpu = 00:00:05 ; elapsed = 00:00:04 . Memory (MB): peak = 1630.387 ; gain = 314.297 ; free physical = 4512 ; free virtual = 9108
INFO: [SYSTEM_LINK 82-51] Create system connectivity graph
INFO: [SYSTEM_LINK 82-102] Applying explicit connections to the system connectivity graph: /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/sys_link/cfgraph/cfgen_cfgraph.xml
INFO: [SYSTEM_LINK 82-38] [14:48:36] cfgen started: /workspace/xilinx/Vitis/2020.2/bin/cfgen  -sp DPUCZDX8G_1.M_AXI_GP0:LPD -sp DPUCZDX8G_1.M_AXI_HP0:HP0 -sp DPUCZDX8G_1.M_AXI_HP2:HP2 -clock.id 1:DPUCZDX8G_1.aclk -clock.id 2:DPUCZDX8G_1.ap_clk_2 -dmclkid 1 -r /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/sys_link/_sysl/.cdb/xd_ip_db.xml -o /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/sys_link/cfgraph/cfgen_cfgraph.xml
INFO: [CFGEN 83-0] Kernel Specs: 
INFO: [CFGEN 83-0]   kernel: DPUCZDX8G, num: 1  {DPUCZDX8G_1}
INFO: [CFGEN 83-0] Port Specs: 
INFO: [CFGEN 83-0]   kernel: DPUCZDX8G_1, k_port: M_AXI_GP0, sptag: LPD
INFO: [CFGEN 83-0]   kernel: DPUCZDX8G_1, k_port: M_AXI_HP0, sptag: HP0
INFO: [CFGEN 83-0]   kernel: DPUCZDX8G_1, k_port: M_AXI_HP2, sptag: HP2
INFO: [CFGEN 83-2228] Creating mapping for argument DPUCZDX8G_1.dpu_instr_addr to LPD for directive DPUCZDX8G_1.M_AXI_GP0:LPD
INFO: [CFGEN 83-2228] Creating mapping for argument DPUCZDX8G_1.dpu_prof_addr to LPD for directive DPUCZDX8G_1.M_AXI_GP0:LPD
INFO: [CFGEN 83-2228] Creating mapping for argument DPUCZDX8G_1.dpu_base0_addr to HP0 for directive DPUCZDX8G_1.M_AXI_HP0:HP0
INFO: [CFGEN 83-2228] Creating mapping for argument DPUCZDX8G_1.dpu_base1_addr to HP0 for directive DPUCZDX8G_1.M_AXI_HP0:HP0
INFO: [CFGEN 83-2228] Creating mapping for argument DPUCZDX8G_1.dpu_base2_addr to HP0 for directive DPUCZDX8G_1.M_AXI_HP0:HP0
INFO: [CFGEN 83-2228] Creating mapping for argument DPUCZDX8G_1.dpu_base3_addr to HP0 for directive DPUCZDX8G_1.M_AXI_HP0:HP0
INFO: [CFGEN 83-2228] Creating mapping for argument DPUCZDX8G_1.dpu_base4_addr to HP2 for directive DPUCZDX8G_1.M_AXI_HP2:HP2
INFO: [CFGEN 83-2228] Creating mapping for argument DPUCZDX8G_1.dpu_base5_addr to HP2 for directive DPUCZDX8G_1.M_AXI_HP2:HP2
INFO: [CFGEN 83-2228] Creating mapping for argument DPUCZDX8G_1.dpu_base6_addr to HP2 for directive DPUCZDX8G_1.M_AXI_HP2:HP2
INFO: [CFGEN 83-2228] Creating mapping for argument DPUCZDX8G_1.dpu_base7_addr to HP2 for directive DPUCZDX8G_1.M_AXI_HP2:HP2
INFO: [SYSTEM_LINK 82-37] [14:48:37] cfgen finished successfully
Time (s): cpu = 00:00:00.89 ; elapsed = 00:00:00.94 . Memory (MB): peak = 1630.387 ; gain = 0.000 ; free physical = 4510 ; free virtual = 9107
INFO: [SYSTEM_LINK 82-52] Create top-level block diagram
INFO: [SYSTEM_LINK 82-38] [14:48:37] cf2bd started: /workspace/xilinx/Vitis/2020.2/bin/cf2bd  --linux --trace_buffer 1024 --input_file /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/sys_link/cfgraph/cfgen_cfgraph.xml --ip_db /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/sys_link/_sysl/.cdb/xd_ip_db.xml --cf_name dr --working_dir /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/sys_link/_sysl/.xsd --temp_dir /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/sys_link --output_dir /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/int
INFO: [CF2BD 82-31] Launching cf2xd: cf2xd -linux -trace-buffer 1024 -i /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/sys_link/cfgraph/cfgen_cfgraph.xml -r /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/sys_link/_sysl/.cdb/xd_ip_db.xml -o dr.xml
INFO: [CF2BD 82-28] cf2xd finished successfully
INFO: [CF2BD 82-31] Launching cf_xsd: cf_xsd -disable-address-gen -dn dr -dp /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/sys_link/_sysl/.xsd
INFO: [CF2BD 82-28] cf_xsd finished successfully
INFO: [SYSTEM_LINK 82-37] [14:48:38] cf2bd finished successfully
Time (s): cpu = 00:00:01 ; elapsed = 00:00:01 . Memory (MB): peak = 1630.387 ; gain = 0.000 ; free physical = 4505 ; free virtual = 9106
INFO: [v++ 60-1441] [14:48:38] Run run_link: Step system_link: Completed
Time (s): cpu = 00:00:08 ; elapsed = 00:00:07 . Memory (MB): peak = 1586.246 ; gain = 0.000 ; free physical = 4544 ; free virtual = 9145
INFO: [v++ 60-1443] [14:48:38] Run run_link: Step cf2sw: Started
INFO: [v++ 60-1453] Command Line: cf2sw -sdsl /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/int/sdsl.dat -rtd /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/int/cf2sw.rtd -nofilter /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/int/cf2sw_full.rtd -xclbin /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/int/xclbin_orig.xml -o /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/int/xclbin_orig.1.xml
INFO: [v++ 60-1454] Run Directory: /workspace/DPU-PYNQ/boards/mlvap_test1/binary_container_1/link/run_link
ERROR: [CF2SW 83-2178] Memory for component zynq_ultra_ps_e, interface S_AXI_LPD cannot have address segment information automatically inferred.  Please annotate the interface with memory segment information
INFO: [v++ 60-1442] [14:48:39] Run run_link: Step cf2sw: Failed
Time (s): cpu = 00:00:01 ; elapsed = 00:00:01 . Memory (MB): peak = 1586.246 ; gain = 0.000 ; free physical = 4545 ; free virtual = 9147
ERROR: [v++ 60-661] v++ link run 'run_link' failed
ERROR: [v++ 60-626] Kernel link failed to complete
ERROR: [v++ 60-703] Failed to finish linking
INFO: [v++ 60-1653] Closing dispatch client.
Makefile:124: recipe for target 'dpu.xclbin' failed
make: *** [dpu.xclbin] Error 1

I managed to resolve the issue by specifying the "memory" field as <PS name> LPD_DDR_LOW when enabling the S_AXI_LPD port for the platform. I didn't find any documentation about this; I stumbled upon it while reading through the source code of this TRD. While this resolved the link error, it didn't resolve my bottleneck.
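For reference, the fix amounts to annotating the port with a memory segment in the platform's block-design Tcl, along the lines of the sketch below. The cell name (zynq_ultra_ps_e_0), sptag, and segment name here are assumptions from my design and may differ in yours:

```tcl
# Declare S_AXI_LPD as a platform AXI port and annotate it with a memory
# segment ("<PS cell name> <segment>") so that cf2sw can infer address
# segment information instead of failing with CF2SW 83-2178.
set_property PFM.AXI_PORT {
  S_AXI_LPD {memport "S_AXI_HP" sptag "LPD" memory "zynq_ultra_ps_e_0 LPD_DDR_LOW"}
} [get_bd_cells /zynq_ultra_ps_e_0]
```

The sptag "LPD" must match the sptag used in the [connectivity] section of prj_config (sp=DPUCZDX8G_1.M_AXI_GP0:LPD), otherwise v++ cannot map the kernel port to this interface.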

Part of the subsystem is responsible for data preprocessing for the DPU. Is there a way to connect a preprocessing subsystem directly to the DPU? I feel that should help with the bottleneck, as it would avoid using the PS to move data between sections of DDR (the output of the preprocessing subsystem and the input to the DPU).

Thank you,

Mario