
Yosys-Slang Ram Inference Issues #3262

@loglav03

Description


Two benchmarks currently fail in the VPR stage when slang is used as the parser: arm_core.v and or1200.v.

They hit the following assertion during the VPR flow: vtr-root/vpr/src/timing/timing_graph_builder.cpp:546 create_block_internal_clock_timing_edges: Assertion 'clk_pin' failed. The flow then errors out completely while building the timing graph.

The error comes from the clock and data ports being left unconnected on some single/dual port RAM instances in the .blif files, as can be seen below:

arm_core.pre-vpr.blif, line 14139 (full file attached: arm_core.pre-vpr.txt):

.subckt single_port_ram addr[0]=$dffe~1793^Q~0 addr[1]=$dffe~1793^Q~1 addr[2]=$dffe~1793^Q~2 addr[3]=$dffe~1793^Q~3 \
 addr[4]=unconn addr[5]=unconn addr[6]=unconn addr[7]=unconn addr[8]=unconn addr[9]=unconn addr[10]=unconn addr[11]=unconn \
 addr[12]=unconn addr[13]=unconn addr[14]=unconn data=unconn we=gnd clk=unconn out=single_port_ram^MEM~64-14^out~0
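To make it easier to spot every affected instance, here is a minimal standalone checker (not part of VTR; written only for this issue) that scans a .pre-vpr.blif and reports single/dual port RAM subckts whose clk or data pins are tied to unconn. The unconn net name and the clk/data pin prefixes are taken from the excerpt above; adjust them if your flow names them differently.

// check_ram_pins.cpp: report RAM .subckt clk/data pins left as "unconn" in a BLIF.
// Standalone sketch for this issue only; not part of the VTR codebase.
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>

int main(int argc, char** argv) {
    if (argc != 2) {
        std::cerr << "usage: " << argv[0] << " <design.pre-vpr.blif>\n";
        return 1;
    }
    std::ifstream blif(argv[1]);
    std::string line, stmt;
    long line_no = 0, stmt_start = 0;
    while (std::getline(blif, line)) {
        ++line_no;
        // BLIF continues a statement onto the next line with a trailing '\'.
        bool continued = !line.empty() && line.back() == '\\';
        if (continued) line.pop_back();
        if (stmt.empty()) stmt_start = line_no;
        stmt += line + " ";
        if (continued) continue;

        // Full statement assembled: flag RAM subckts with dangling clk/data pins.
        if (stmt.find(".subckt single_port_ram") != std::string::npos ||
            stmt.find(".subckt dual_port_ram") != std::string::npos) {
            std::istringstream toks(stmt);
            std::string tok;
            while (toks >> tok) {
                auto eq = tok.find('=');
                if (eq == std::string::npos) continue;
                std::string pin = tok.substr(0, eq);
                std::string net = tok.substr(eq + 1);
                bool clk_or_data =
                    pin.rfind("clk", 0) == 0 || pin.rfind("data", 0) == 0;
                if (clk_or_data && net == "unconn") {
                    std::cout << "line " << stmt_start << ": pin " << pin
                              << " is unconnected\n";
                }
            }
        }
        stmt.clear();
    }
    return 0;
}

Compile with g++ -std=c++17 -o check_ram_pins check_ram_pins.cpp and pass it the .pre-vpr.blif path.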

This issue is believed to stem from the way parmys handles techmapping for RAM instances. It could be happening in one of the three steps parmys takes to do this:

First, parmys attempts to resolve memory nodes by transforming Yosys-generated memory blocks into BRAMs. This takes place during the elaboration phase, in the resolve_top function in vtr-root/parmys/parmys-plugin/parmys.cc.

Next, in the optimization phase, it maps the BRAMs to single/dual port RAMs. Following this, it splits these sp/dp RAMs into a set of functionally equivalent instances whose input and output data buses are only one bit wide (a conceptual sketch of this splitting is shown below).
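To make the splitting step concrete, here is a conceptual sketch of what a correct split should produce. This is not the actual parmys implementation; all struct, function, and net names are made up for illustration. The point is that every 1-bit slice has to carry over the shared addr/we/clk connections, otherwise the .blif ends up with unconn pins exactly like the excerpt above.

// split_sketch.cpp: conceptual model of splitting a W-bit single-port RAM
// into W functionally equivalent 1-bit-wide instances. Illustration only;
// the real logic lives in the parmys plugin.
#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

struct RamInstance {
    std::vector<std::string> addr;  // shared address bus
    std::string data;               // 1-bit data-in net
    std::string we;                 // shared write enable
    std::string clk;                // shared clock
    std::string out;                // 1-bit data-out net
};

// Each slice takes one bit of the data/out buses but must keep the *same*
// addr/we/clk nets; dropping them would produce "unconn" pins in the BLIF.
std::vector<RamInstance> split_single_port_ram(
        const std::vector<std::string>& addr,
        const std::vector<std::string>& data,
        const std::string& we,
        const std::string& clk,
        const std::vector<std::string>& out) {
    std::vector<RamInstance> slices;
    for (std::size_t bit = 0; bit < data.size(); ++bit) {
        slices.push_back({addr, data[bit], we, clk, out[bit]});
    }
    return slices;
}

int main() {
    // Toy 2-bit-wide memory with a 4-bit address; net names are invented.
    std::vector<std::string> addr = {"a0", "a1", "a2", "a3"};
    std::vector<std::string> data = {"d0", "d1"};
    std::vector<std::string> out  = {"q0", "q1"};

    for (const RamInstance& r : split_single_port_ram(addr, data, "we_net", "clk_net", out)) {
        std::cout << ".subckt single_port_ram";
        for (std::size_t i = 0; i < r.addr.size(); ++i)
            std::cout << " addr[" << i << "]=" << r.addr[i];
        std::cout << " data=" << r.data << " we=" << r.we
                  << " clk=" << r.clk << " out=" << r.out << "\n";
    }
    return 0;
}

Each printed .subckt single_port_ram line has the full address bus plus we and clk connected; an instance like the one at line 14139 above, with clk=unconn and data=unconn, is what it looks like when that carry-over does not happen.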

In arm_core.pre-vpr.blif, only one set of 15 split instances shows this problem and the rest look fine, so this appears to be an odd corner case that we need to account for when using slang.

In addition to this issue, various other benchmarks still fail QoR. The most concerning are benchmarks like mkDelayWorker32B, where the reported number of memories is 0 when it should be 44. These QoR failures may be related to the issue seen with arm_core and or1200.

These errors can be reproduced by using the modified Verilog code in vtr-root/vtr_flow/benchmarks/verilog from the verilog_benchmark_fix_slang_errors branch and then running the following from the VTR root:
./vtr_flow/scripts/run_vtr_flow.py <path_to_verilog_file> vtr_flow/arch/timing/k6_frac_N10_frac_chain_mem32K_40nm.xml -track_memory_usage -crit_path_router_iterations 100 -parser slang

For convenience, a .zip of the currently modified Verilog files is attached below:
modified_verilog.zip
