
LINPACK on Xilinx U280: invalid port or argument name: m_axi_gmem0 #14

Open

ndcontini opened this issue Dec 21, 2022 · 9 comments

@ndcontini

I am attempting to build the torus kernel for the LINPACK benchmark, but the build errors out in the link stage due to an invalid port mapping. I'm not sure why this error occurs, but my guess is that the m_axi_gmemX ports are expected to be specified within the kernel, and the error implies the final kernel code is not being generated correctly. Is there a setting missing from my build? I expected the config file to take care of most of the gotchas, since U280s seem to be supported by the benchmark.

cd LINPACK
mkdir build
cd build
cmake .. -DVitis_INCLUDE_DIRS=/opt/software/FPGA/Xilinx/Vitis/2021.2/include -DVitis_FLOATING_POINT_LIBRARY=/opt/software/FPGA/Xilinx/Vitis_HLS/2021.2/lnx64/tools/fpo_v7_0/libIp_floating_point_v7_0_bitacc_cmodel.so -DHPCC_FPGA_CONFIG=../configs/Xilinx_U280_B8_SB3_R2.cmake -DMPI_C=$HOME/repos/mvapich2/install/lib/libmpi.so -DMPI_CXX=$HOME/repos/mvapich2/install/lib/libmpi.so
make hpl_torus_PCIE_xilinx
...
[ 50%] Generating ../../bin/hpl_torus_PCIE.xclbin
WARNING: [v++ 60-1600] The option 'jobs' was used directly on the command line, where its usage is deprecated. To ensure input line works for supported operating systems or shells, v++ supports specification for some options in a configuration file. As an alternative, please use options 'hls.jobs', 'vivado.synth.jobs' in a configuration file. 
Option Map File Used: '/opt/software/FPGA/Xilinx/Vitis/2021.2/data/vitis/vpp/optMap.xml'

****** v++ v2021.2 (64-bit)
  **** SW Build 3363252 on 2021-10-14-04:41:01
    ** Copyright 1986-2020 Xilinx, Inc. All Rights Reserved.

INFO: [v++ 60-1306] Additional information associated with this v++ link can be found at:
	Reports: /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/bin/xilinx_reports/link
	Log files: /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/bin/xilinx_reports/logs/link
Running Dispatch Server on port: 34327
INFO: [v++ 60-1548] Creating build summary session with primary output /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/bin/hpl_torus_PCIE.xclbin.link_summary, at Tue Dec 20 16:47:58 2022
INFO: [v++ 60-1316] Initiating connection to rulecheck server, at Tue Dec 20 16:47:58 2022
INFO: [v++ 60-1315] Creating rulecheck session with output '/upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/bin/xilinx_reports/link/v++_link_hpl_torus_PCIE_guidance.html', at Tue Dec 20 16:47:59 2022
INFO: [v++ 60-895]   Target platform: /opt/software/FPGA/Xilinx/platforms/xilinx_u280_xdma_201920_3_3246211/xilinx_u280_xdma_201920_3.xpfm
INFO: [v++ 60-1578]   This platform contains Xilinx Shell Archive '/opt/software/FPGA/Xilinx/platforms/xilinx_u280_xdma_201920_3_3246211/hw/xilinx_u280_xdma_201920_3.xsa'
INFO: [v++ 74-78] Compiler Version string: 2021.2
INFO: [v++ 60-1302] Platform 'xilinx_u280_xdma_201920_3.xpfm' has been explicitly enabled for this release.
INFO: [v++ 60-629] Linking for hardware target
INFO: [v++ 60-423]   Target device: xilinx_u280_xdma_201920_3
INFO: [v++ 60-1332] Run 'run_link' status: Not started
INFO: [v++ 60-1443] [16:48:13] Run run_link: Step system_link: Started
INFO: [v++ 60-1453] Command Line: system_link --xo /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/src/device/xilinx_tmp_compile/hpl_torus_PCIE.xo --config /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/src/device/_x/link/int/syslinkConfig.ini --xpfm /opt/software/FPGA/Xilinx/platforms/xilinx_u280_xdma_201920_3_3246211/xilinx_u280_xdma_201920_3.xpfm --target hw --output_dir /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/src/device/_x/link/int --temp_dir /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/src/device/_x/link/sys_link
INFO: [v++ 60-1454] Run Directory: /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/src/device/_x/link/run_link
INFO: [SYSTEM_LINK 60-1316] Initiating connection to rulecheck server, at Tue Dec 20 16:48:14 2022
INFO: [SYSTEM_LINK 82-70] Extracting xo v3 file /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/src/device/xilinx_tmp_compile/hpl_torus_PCIE.xo
INFO: [SYSTEM_LINK 82-53] Creating IP database /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/src/device/_x/link/sys_link/_sysl/.cdb/xd_ip_db.xml
INFO: [SYSTEM_LINK 82-38] [16:48:27] build_xd_ip_db started: /opt/software/FPGA/Xilinx/Vitis/2021.2/bin/build_xd_ip_db -ip_search 0  -sds-pf /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/src/device/_x/link/sys_link/xilinx_u280_xdma_201920_3.hpfm -clkid 0 -ip /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/src/device/_x/link/sys_link/iprepo/xilinx_com_hls_lu_1_0,lu -ip /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/src/device/_x/link/sys_link/iprepo/xilinx_com_hls_top_update_1_0,top_update -ip /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/src/device/_x/link/sys_link/iprepo/xilinx_com_hls_inner_update_mm0_1_0,inner_update_mm0 -ip /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/src/device/_x/link/sys_link/iprepo/xilinx_com_hls_left_update_1_0,left_update -o /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/src/device/_x/link/sys_link/_sysl/.cdb/xd_ip_db.xml
INFO: [SYSTEM_LINK 82-37] [16:48:32] build_xd_ip_db finished successfully
Time (s): cpu = 00:00:04 ; elapsed = 00:00:05 . Memory (MB): peak = 2369.379 ; gain = 0.000 ; free physical = 392220 ; free virtual = 422552
INFO: [SYSTEM_LINK 82-51] Create system connectivity graph
INFO: [SYSTEM_LINK 82-102] Applying explicit connections to the system connectivity graph: /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/src/device/_x/link/sys_link/cfgraph/cfgen_cfgraph.xml
INFO: [SYSTEM_LINK 82-38] [16:48:32] cfgen started: /opt/software/FPGA/Xilinx/Vitis/2021.2/bin/cfgen  -nk lu:1 -nk left_update:1 -nk top_update:1 -nk inner_update_mm0:2 -slr lu_1:SLR0 -slr left_update_1:SLR0 -slr top_update_1:SLR0 -slr inner_update_mm0_1:SLR1 -slr inner_update_mm0_2:SLR2 -sp lu_1.m_axi_gmem0:DDR[0] -sp lu_1.m_axi_gmem1:DDR[0] -sp lu_1.m_axi_gmem2:DDR[1] -sp top_update_1.m_axi_gmem0:DDR[0] -sp top_update_1.m_axi_gmem1:DDR[0] -sp top_update_1.m_axi_gmem2:DDR[0] -sp left_update_1.m_axi_gmem0:DDR[0] -sp left_update_1.m_axi_gmem1:DDR[1] -sp left_update_1.m_axi_gmem2:DDR[1] -sp inner_update_mm0_1.m_axi_gmem0:DDR[0] -sp inner_update_mm0_1.m_axi_gmem1:DDR[1] -sp inner_update_mm0_1.m_axi_gmem2:DDR[0] -sp inner_update_mm0_2.m_axi_gmem0:DDR[0] -sp inner_update_mm0_2.m_axi_gmem1:DDR[1] -sp inner_update_mm0_2.m_axi_gmem2:DDR[0] -dmclkid 0 -r /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/src/device/_x/link/sys_link/_sysl/.cdb/xd_ip_db.xml -o /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/src/device/_x/link/sys_link/cfgraph/cfgen_cfgraph.xml
INFO: [CFGEN 83-0] Kernel Specs: 
INFO: [CFGEN 83-0]   kernel: lu, num: 1  {lu_1}
INFO: [CFGEN 83-0]   kernel: left_update, num: 1  {left_update_1}
INFO: [CFGEN 83-0]   kernel: top_update, num: 1  {top_update_1}
INFO: [CFGEN 83-0]   kernel: inner_update_mm0, num: 2  {inner_update_mm0_1 inner_update_mm0_2}
INFO: [CFGEN 83-0] Port Specs: 
INFO: [CFGEN 83-0]   kernel: lu_1, k_port: m_axi_gmem0, sptag: DDR[0]
INFO: [CFGEN 83-0]   kernel: lu_1, k_port: m_axi_gmem1, sptag: DDR[0]
INFO: [CFGEN 83-0]   kernel: lu_1, k_port: m_axi_gmem2, sptag: DDR[1]
INFO: [CFGEN 83-0]   kernel: top_update_1, k_port: m_axi_gmem0, sptag: DDR[0]
INFO: [CFGEN 83-0]   kernel: top_update_1, k_port: m_axi_gmem1, sptag: DDR[0]
INFO: [CFGEN 83-0]   kernel: top_update_1, k_port: m_axi_gmem2, sptag: DDR[0]
INFO: [CFGEN 83-0]   kernel: left_update_1, k_port: m_axi_gmem0, sptag: DDR[0]
INFO: [CFGEN 83-0]   kernel: left_update_1, k_port: m_axi_gmem1, sptag: DDR[1]
INFO: [CFGEN 83-0]   kernel: left_update_1, k_port: m_axi_gmem2, sptag: DDR[1]
INFO: [CFGEN 83-0]   kernel: inner_update_mm0_1, k_port: m_axi_gmem0, sptag: DDR[0]
INFO: [CFGEN 83-0]   kernel: inner_update_mm0_1, k_port: m_axi_gmem1, sptag: DDR[1]
INFO: [CFGEN 83-0]   kernel: inner_update_mm0_1, k_port: m_axi_gmem2, sptag: DDR[0]
INFO: [CFGEN 83-0]   kernel: inner_update_mm0_2, k_port: m_axi_gmem0, sptag: DDR[0]
INFO: [CFGEN 83-0]   kernel: inner_update_mm0_2, k_port: m_axi_gmem1, sptag: DDR[1]
INFO: [CFGEN 83-0]   kernel: inner_update_mm0_2, k_port: m_axi_gmem2, sptag: DDR[0]
INFO: [CFGEN 83-0] SLR Specs: 
INFO: [CFGEN 83-0]   instance: inner_update_mm0_1, SLR: SLR1
INFO: [CFGEN 83-0]   instance: inner_update_mm0_2, SLR: SLR2
INFO: [CFGEN 83-0]   instance: left_update_1, SLR: SLR0
INFO: [CFGEN 83-0]   instance: lu_1, SLR: SLR0
INFO: [CFGEN 83-0]   instance: top_update_1, SLR: SLR0
ERROR: [CFGEN 83-2292] --sp tag applied to an invalid port or argument name: m_axi_gmem0
ERROR: [CFGEN 83-2292] --sp tag applied to an invalid port or argument name: m_axi_gmem1
ERROR: [CFGEN 83-2292] --sp tag applied to an invalid port or argument name: m_axi_gmem2
ERROR: [CFGEN 83-2292] --sp tag applied to an invalid port or argument name: m_axi_gmem0
ERROR: [CFGEN 83-2292] --sp tag applied to an invalid port or argument name: m_axi_gmem1
ERROR: [CFGEN 83-2292] --sp tag applied to an invalid port or argument name: m_axi_gmem2
ERROR: [CFGEN 83-2292] --sp tag applied to an invalid port or argument name: m_axi_gmem0
ERROR: [CFGEN 83-2292] --sp tag applied to an invalid port or argument name: m_axi_gmem1
ERROR: [CFGEN 83-2292] --sp tag applied to an invalid port or argument name: m_axi_gmem2
ERROR: [CFGEN 83-2292] --sp tag applied to an invalid port or argument name: m_axi_gmem0
ERROR: [CFGEN 83-2292] --sp tag applied to an invalid port or argument name: m_axi_gmem1
ERROR: [CFGEN 83-2292] --sp tag applied to an invalid port or argument name: m_axi_gmem2
ERROR: [CFGEN 83-2292] --sp tag applied to an invalid port or argument name: m_axi_gmem0
ERROR: [CFGEN 83-2292] --sp tag applied to an invalid port or argument name: m_axi_gmem1
ERROR: [CFGEN 83-2292] --sp tag applied to an invalid port or argument name: m_axi_gmem2
ERROR: [CFGEN 83-2298] Exiting due to previous error
ERROR: [SYSTEM_LINK 82-36] [16:48:35] cfgen failed
Time (s): cpu = 00:00:03 ; elapsed = 00:00:03 . Memory (MB): peak = 2369.379 ; gain = 0.000 ; free physical = 391997 ; free virtual = 422329
ERROR: [SYSTEM_LINK 82-62] Error generating design file for /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/src/device/_x/link/sys_link/cfgraph/cfgen_cfgraph.xml, command: /opt/software/FPGA/Xilinx/Vitis/2021.2/bin/cfgen  -nk lu:1 -nk left_update:1 -nk top_update:1 -nk inner_update_mm0:2 -slr lu_1:SLR0 -slr left_update_1:SLR0 -slr top_update_1:SLR0 -slr inner_update_mm0_1:SLR1 -slr inner_update_mm0_2:SLR2 -sp lu_1.m_axi_gmem0:DDR[0] -sp lu_1.m_axi_gmem1:DDR[0] -sp lu_1.m_axi_gmem2:DDR[1] -sp top_update_1.m_axi_gmem0:DDR[0] -sp top_update_1.m_axi_gmem1:DDR[0] -sp top_update_1.m_axi_gmem2:DDR[0] -sp left_update_1.m_axi_gmem0:DDR[0] -sp left_update_1.m_axi_gmem1:DDR[1] -sp left_update_1.m_axi_gmem2:DDR[1] -sp inner_update_mm0_1.m_axi_gmem0:DDR[0] -sp inner_update_mm0_1.m_axi_gmem1:DDR[1] -sp inner_update_mm0_1.m_axi_gmem2:DDR[0] -sp inner_update_mm0_2.m_axi_gmem0:DDR[0] -sp inner_update_mm0_2.m_axi_gmem1:DDR[1] -sp inner_update_mm0_2.m_axi_gmem2:DDR[0] -dmclkid 0 -r /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/src/device/_x/link/sys_link/_sysl/.cdb/xd_ip_db.xml -o /upb/departments/pc2/users/m/mpifpga2/repos/HPCC_FPGA/LINPACK/build/src/device/_x/link/sys_link/cfgraph/cfgen_cfgraph.xml
ERROR: [SYSTEM_LINK 82-96] Error applying explicit connections to the system connectivity graph
ERROR: [SYSTEM_LINK 82-79] Unable to create system connectivity graph
INFO: [v++ 60-1442] [16:48:35] Run run_link: Step system_link: Failed
Time (s): cpu = 00:00:12 ; elapsed = 00:00:23 . Memory (MB): peak = 2265.199 ; gain = 0.000 ; free physical = 392006 ; free virtual = 422334
ERROR: [v++ 60-661] v++ link run 'run_link' failed
ERROR: [v++ 60-626] Kernel link failed to complete
ERROR: [v++ 60-703] Failed to finish linking
INFO: [v++ 60-1653] Closing dispatch client.
make[3]: *** [src/device/CMakeFiles/hpl_torus_PCIE_xilinx.dir/build.make:75: bin/hpl_torus_PCIE.xclbin] Error 1
make[2]: *** [CMakeFiles/Makefile2:501: src/device/CMakeFiles/hpl_torus_PCIE_xilinx.dir/all] Error 2
make[1]: *** [CMakeFiles/Makefile2:508: src/device/CMakeFiles/hpl_torus_PCIE_xilinx.dir/rule] Error 2
make: *** [Makefile:283: hpl_torus_PCIE_xilinx] Error 2
@Mellich
Collaborator

Mellich commented Jan 5, 2023

The ports are not created correctly in the compile stage because the compile settings file (specified with XILINX_COMPILE_SETTINGS_FILE in the config file) is ignored. This is a bug in the benchmark's CMake scripts.

As a workaround, adding the following line to your config file should fix this:
set(XILINX_ADDITIONAL_COMPILE_FLAGS "--hls.max_memory_ports=all" CACHE STRING "Additional compile flags for v++" FORCE)
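
For context: max_memory_ports tells the compile/HLS step to create a separate AXI master interface for every global-memory argument instead of bundling them all into a single m_axi_gmem port. Only then do the port names m_axi_gmem0, m_axi_gmem1 and m_axi_gmem2 that the link-stage --sp mappings refer to actually exist. As a rough illustration (hypothetical kernel signature and argument names, not the actual benchmark source), the same effect in C++ HLS would come from giving each pointer argument its own m_axi bundle:

// Minimal C++ HLS sketch -- only meant to show where the
// m_axi_gmem0/1/2 interface names come from; not HPCC_FPGA code.
extern "C" void lu(double *a, double *lu_out, double *scratch, int blocks) {
#pragma HLS INTERFACE m_axi port=a       offset=slave bundle=gmem0
#pragma HLS INTERFACE m_axi port=lu_out  offset=slave bundle=gmem1
#pragma HLS INTERFACE m_axi port=scratch offset=slave bundle=gmem2
#pragma HLS INTERFACE s_axilite port=blocks
#pragma HLS INTERFACE s_axilite port=return
    // ... kernel body ...
}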

@ndcontini
Author

Thank you for the response. That workaround got me past the initial link step. FYI, this is also an issue when building PTRANS; I assume the same workaround will work there.

After building with the workaround, I run into a different error:

[19:29:13] Starting logic placement..
[19:31:26] Phase 1 Placer Initialization
[19:31:26] Phase 1.1 Placer Initialization Netlist Sorting
[19:35:17] Phase 1.2 IO Placement/ Clock Placement/ Build Placer Device
[19:37:30] Phase 1.3 Build Placer Netlist Model
[19:43:36] Phase 1.4 Constrain Clocks/Macros
[19:44:08] Phase 2 Global Placement
[19:44:08] Phase 2.1 Floorplanning
[19:45:49] Phase 2.1.1 Partition Driven Placement
[19:45:49] Phase 2.1.1.1 PBP: Partition Driven Placement
[19:53:01] Phase 2.1.1.2 PBP: Clock Region Placement
[19:54:42] Phase 2.1.1.3 PBP: Discrete Incremental
[19:55:15] Phase 2.1.1.4 PBP: Compute Congestion
[19:55:15] Phase 2.1.1.5 PBP: Macro Placement
[19:56:22] Phase 2.1.1.6 PBP: UpdateTiming
[19:57:29] Phase 2.2 Update Timing before SLR Path Opt
[19:57:29] Phase 2.3 Global Placement Core
[20:17:01] Phase 2.3.1 Physical Synthesis In Placer
[20:39:19] Phase 3 Detail Placement
[20:39:19] Phase 3.1 Commit Multi Column Macros
[20:39:52] Phase 3.2 Commit Most Macros & LUTRAMs
[20:42:40] Phase 3.3 Small Shape DP
[20:42:40] Phase 3.3.1 Small Shape Clustering
[20:44:20] Phase 3.3.2 Flow Legalize Slice Clusters
[20:44:54] Phase 3.3.3 Slice Area Swap
[20:47:41] Phase 3.4 Place Remaining
[20:48:15] Phase 3.5 Re-assign LUT pins
[20:48:47] Phase 3.6 Pipeline Register Optimization
[20:49:22] Phase 3.7 Fast Optimization
[20:52:10] Phase 4 Post Placement Optimization and Clean-Up
[20:52:10] Phase 4.1 Post Commit Optimization
[20:58:19] Phase 4.1.1 Post Placement Optimization
[20:58:19] Phase 4.1.1.1 BUFG Insertion
[20:58:19] Phase 1 Physical Synthesis Initialization
[21:01:40] Phase 4.1.1.2 BUFG Replication
[21:05:00] Phase 4.1.1.3 Replication
[21:08:54] Phase 4.2 Post Placement Cleanup
[21:09:28] Phase 4.3 Placer Reporting
[21:09:28] Phase 4.3.1 Print Estimated Congestion
[21:10:01] Phase 4.4 Final Placement Cleanup
[21:24:07] Run vpl: Step impl: Failed
[21:24:11] Run vpl: FINISHED. Run Status: impl ERROR

===>The following messages were generated while processing /upb/departments/pc2/groups/hpc-prf-mpifpga/LINPACK/src/device/_x/link/vivado/vpl/prj/prj.runs/impl_1 :
ERROR: [VPL 17-69] Command failed: Failed to create design checkpoint
ERROR: [VPL 60-773] In '/upb/departments/pc2/groups/hpc-prf-mpifpga/LINPACK/src/device/_x/link/vivado/vpl/runme.log', caught Tcl error:  problem implementing dynamic region, impl_1: place_design ERROR, please look at the run log file '/upb/departments/pc2/groups/hpc-prf-mpifpga/LINPACK/src/device/_x/link/vivado/vpl/prj/prj.runs/impl_1/runme.log' for more information                                                                                                                          
WARNING: [VPL 60-732] Link warning: No monitor points found for BD automation.
ERROR: [VPL 60-704] Integration error, problem implementing dynamic region, impl_1: place_design ERROR, please look at the run log file '/upb/departments/pc2/groups/hpc-prf-mpifpga/LINPACK/src/device/_x/link/vivado/vpl/prj/prj.runs/impl_1/runme.log' for more information                                                                                                                                                                                                                            
ERROR: [VPL 60-1328] Vpl run 'vpl' failed
ERROR: [VPL 60-806] Failed to finish platform linker
INFO: [v++ 60-1442] [21:24:19] Run run_link: Step vpl: Failed
Time (s): cpu = 00:06:41 ; elapsed = 04:17:11 . Memory (MB): peak = 1756.656 ; gain = 0.000 ; free physical = 156008 ; free virtual = 461854
ERROR: [v++ 60-661] v++ link run 'run_link' failed
ERROR: [v++ 60-626] Kernel link failed to complete
ERROR: [v++ 60-703] Failed to finish linking
INFO: [v++ 60-1653] Closing dispatch client.
make[3]: *** [src/device/CMakeFiles/hpl_torus_PCIE_xilinx.dir/build.make:75: bin/hpl_torus_PCIE.xclbin] Error 1
make[2]: *** [CMakeFiles/Makefile2:501: src/device/CMakeFiles/hpl_torus_PCIE_xilinx.dir/all] Error 2
make[1]: *** [CMakeFiles/Makefile2:508: src/device/CMakeFiles/hpl_torus_PCIE_xilinx.dir/rule] Error 2
make: *** [Makefile:283: hpl_torus_PCIE_xilinx] Error 2

I tried some basic web searching to figure out why this error arises, but I can't tell whether this is a local issue or a problem with the build process.

@Mellich
Collaborator

Mellich commented Jan 6, 2023

From the current output, I can only see that synthesis failed during the placement phase. Maybe you can find out more by looking into the log files mentioned in the error message. Does the design overutilize the resources on one SLR? The matrix multiplication kernels are quite large in this configuration and nearly fill a whole SLR. Together with the DDR memory interconnect, it may get dense on SLR1, but it should not overutilize the available resources on the U280.

If it is just about getting a working bitstream, we could switch to HBM instead and/or reduce the size of the design. But maybe it's better to first track down this issue.
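
As an illustration of what switching to HBM boils down to at the v++ link step: the --sp mappings would point at HBM banks instead of DDR, e.g. in a linker configuration file (bank indices chosen arbitrarily here, not the benchmark's actual mapping):

[connectivity]
sp=lu_1.m_axi_gmem0:HBM[0]
sp=lu_1.m_axi_gmem1:HBM[1]
sp=lu_1.m_axi_gmem2:HBM[2]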

@ndcontini
Author

I don't think this is an overutilization issue. It honestly seems like it could be a bug inside Vitis itself. This is the first error from the excerpt above as well as in the runme.log that the console output refers to:

ERROR: [VPL 17-69] Command failed: Failed to create design checkpoint

@ndcontini
Author

I retried building the kernels on a newer version of Vitis and did not get the above errors, so I'm going to assume this is an issue with Vitis 20.2. However, with HBM both enabled and disabled, I get timing errors:

[07:55:26] Starting bitstream generation..
Starting optional post-route physical design optimization.
[08:36:26] Phase 1 Physical Synthesis Initialization
[08:43:23] Phase 2 Critical Path Optimization
Finished optional post-route physical design optimization.
[12:31:46] Run vpl: Step impl: Failed
[12:31:53] Run vpl: FINISHED. Run Status: impl ERROR

===>The following messages were generated while  Compiling (bitstream) accelerator binary: hpl_torus_PCIE Log file: /upb/departments/pc2/groups/hpc-prf-mpifpga/LINPACK/src/device/_x/link/vivado/vpl/prj/prj.runs/impl_1/runme.log  :
ERROR: [VPL 101-2] design did not meet timing - Design did not meet timing. One or more unscalable system clocks did not meet their required target frequency. For all system clocks, this design is using 0 nanoseconds as the threshold worst negative slack (WNS) value. List of system clocks with timing failure:                                                                               
system clock: pll_clk[0]_DIV; slack: -1.009 ns
system clock: mmcm_clkout0; slack: -0.751 ns
system clock: mmcm_clkout0_1; slack: -0.166 ns
system clock: pll_clk[1]_DIV; slack: -0.134 ns
ERROR: [VPL 101-3] sourcing script /upb/departments/pc2/groups/hpc-prf-mpifpga/LINPACK/src/device/_x/link/vivado/vpl/scripts/impl_1/_full_write_bitstream_pre.tcl failed                          
ERROR: [VPL 60-773] In '/upb/departments/pc2/groups/hpc-prf-mpifpga/LINPACK/src/device/_x/link/vivado/vpl/runme.log', caught Tcl error:  problem implementing dynamic region, impl_1: write_bitstream ERROR, please look at the run log file '/upb/departments/pc2/groups/hpc-prf-mpifpga/LINPACK/src/device/_x/link/vivado/vpl/prj/prj.runs/impl_1/runme.log' for more information                  
WARNING: [VPL 60-732] Link warning: No monitor points found for BD automation.
ERROR: [VPL 60-704] Integration error, problem implementing dynamic region, impl_1: write_bitstream ERROR, please look at the run log file '/upb/departments/pc2/groups/hpc-prf-mpifpga/LINPACK/src/device/_x/link/vivado/vpl/prj/prj.runs/impl_1/runme.log' for more information
ERROR: [VPL 60-1328] Vpl run 'vpl' failed
ERROR: [VPL 60-806] Failed to finish platform linker
INFO: [v++ 60-1442] [12:32:06] Run run_link: Step vpl: Failed
Time (s): cpu = 00:58:31 ; elapsed = 14:52:56 . Memory (MB): peak = 2265.207 ; gain = 0.000 ; free physical = 139218 ; free virtual = 467439                                                      
ERROR: [v++ 60-661] v++ link run 'run_link' failed
ERROR: [v++ 60-626] Kernel link failed to complete
ERROR: [v++ 60-703] Failed to finish linking
INFO: [v++ 60-1653] Closing dispatch client.
make[3]: *** [src/device/CMakeFiles/hpl_torus_PCIE_xilinx.dir/build.make:75: bin/hpl_torus_PCIE.xclbin] Error 1
make[2]: *** [CMakeFiles/Makefile2:501: src/device/CMakeFiles/hpl_torus_PCIE_xilinx.dir/all] Error 2
make[1]: *** [CMakeFiles/Makefile2:508: src/device/CMakeFiles/hpl_torus_PCIE_xilinx.dir/rule] Error 2
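
For reference on how far timing is off: WNS is measured against the clock period, so assuming for example a 300 MHz kernel clock target (3.333 ns period; the actual target is set by the platform/build), the worst clock pll_clk[0]_DIV with -1.009 ns slack would need a period of about 3.333 + 1.009 = 4.342 ns, i.e. it would only close timing at roughly 230 MHz.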

Please let me know what extra information I can give to help diagnose the issue.

@Mellich
Collaborator

Mellich commented Jan 9, 2023

Interesting. Which Vitis and XRT versions are you using now?

Could you please provide the v++ logs for the compilation (bin/xilinx_reports/logs/v++_hpl_torus_PCIE.log) and linking (bin/xilinx_reports/logs/link/v++.log)?

@ndcontini
Author

[mpifpga2@n2login2 LINPACK]$ vitis --version

****** Xilinx Vitis Development Environment
****** Vitis v2021.2 (64-bit)
  **** SW Build 3363750 on 2021-10-16-13:10:08
    ** Copyright 1986-2021 Xilinx, Inc. All Rights Reserved.

[mpifpga2@n2login2 LINPACK]$ xbutil --version
Version              : 2.12.429
Branch               : 2021.2_RHEL8.5
Hash                 : 2180e838abe791cb1e90d9011bbc8b3676774172
Hash Date            : 2022-04-08 11:43:35
XOCL                 : unknown, unknown
XCLMGMT              : unknown, unknown

v++_hpl_torus_PCIE.log
v++.log

@Mellich
Collaborator

Mellich commented Jan 10, 2023

It looks like the two logs are the same. Could you please re-upload the compilation log? It should be at this path: bin/xilinx_reports/logs/v++_hpl_torus_PCIE.log

@ndcontini
Author

Actually, I think bin/xilinx_reports/logs/hpl_torus_PCIE/v++.log is the compilation log.
