UVM Class Labnotes
Table of Contents
Software Dependencies
Workshop Setup Instructions
Lab 1. Stimulus Modeling
1.1. Objectives
1.2. Installing Lab1
Lab 2. UVM Sequences
2.1. Objectives
2.2. Installing Lab2
Lab 3. Reusable Environment Topology
3.1. Objectives
3.2. Installing Lab3
Lab 4. Test Creation Using Reusable Components
4.1. Objectives
4.2. Installing Lab4
Lab 5. Using Incisive Verification Builder
5.1. Objective
Lab 6. Coverage Analysis and Ranking Runs
6.1. Objective
6.2. Installing Lab6
6.3. Loading Regression Output, Viewing Coverage
6.4. Viewing Regression Run Results
6.5. Ranking Runs
6.6. Analyzing Coverage
Software Dependencies
• IES-XL 13.1
Lab 1. Stimulus Modeling
1.1. Objectives
Use the UVM class library to:
• Generate random stimulus and layer constraints
• Explore the UVM automation provided in the library
• Use the UVM messaging capability to control verbosity from the command-line
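For reference, a minimal sketch of the kind of sequence item used in this lab is shown below. The uart_frame class name appears later in these notes; the fields, constraints, and the derived class are illustrative assumptions, not the lab's exact code.

// Illustrative sketch only: field names and constraints are assumed, not the lab code.
`include "uvm_macros.svh"
import uvm_pkg::*;

class uart_frame extends uvm_sequence_item;
  rand bit [7:0] payload;
  rand bit       parity;

  // UVM field automation: enables print(), copy(), compare(), pack(), and so on.
  `uvm_object_utils_begin(uart_frame)
    `uvm_field_int(payload, UVM_DEFAULT)
    `uvm_field_int(parity,  UVM_DEFAULT)
  `uvm_object_utils_end

  function new(string name = "uart_frame");
    super.new(name);
  endfunction
endclass

// Layering an additional constraint on top of the base item by extension.
class short_payload_frame extends uart_frame;
  `uvm_object_utils(short_payload_frame)

  constraint c_short_payload { payload < 8'h10; }

  function new(string name = "short_payload_frame");
    super.new(name);
  endfunction
endclass

Message verbosity of the `uvm_info calls made by the environment can be raised or lowered at the command line with the standard UVM plusarg +UVM_VERBOSITY (for example, +UVM_VERBOSITY=UVM_HIGH) without recompiling.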
Note: IES-XL does not recompile/re-elaborate the design/testbench for these changes.
5. Run with multiple random seeds to get different results:
% irun -f run.f +svseed=RANDOM
Lab 2. UVM Sequences
2.1. Objectives
• Explore the driver and sequencer interaction
• Review the sequencer default behavior
• Execute a specific sequence
• Write a new sequence
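As background for writing a new sequence, below is a minimal sketch of a sequence that generates uart_frame items. The sequence name and the num_frames knob are assumptions for illustration; the lab's own sequences may differ.

// Illustrative sketch: the sequence name and the num_frames constraint are assumed.
`include "uvm_macros.svh"
import uvm_pkg::*;

class lab2_example_seq extends uvm_sequence #(uart_frame);
  `uvm_object_utils(lab2_example_seq)

  rand int unsigned num_frames = 3;   // randomized when the sequence itself is randomized
  constraint c_num_frames { num_frames inside {[1:5]}; }

  function new(string name = "lab2_example_seq");
    super.new(name);
  endfunction

  virtual task body();
    repeat (num_frames) begin
      `uvm_do(req)   // create, randomize, and send one uart_frame to the driver
    end
  endtask
endclass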
5. Run a simulation and review the results (found in the irun.log file):
% irun -f run.f
Review the frames printed out by the driver:
o How many frames were executed?
o Which sequence(s) were executed during the simulation? (hint: look at the value of “parent
sequence” in each frame)
o How did the sequencer know what sequence to execute?
6. Run with different random seeds and get different results:
% irun -f run.f +svseed=RANDOM
7. One way to view the transactions is in the irun.log file. With built-in transaction recording, SimVision
allows you to view the transactions graphically. Invoke SimVision and load the transaction
database:
% simvision -input lab2a.svcf
• The SimVision Waveform window displays the five transactions generated during the simulation
run. You should be able to see the transactions generated by the simple sequence, and those
captured in the driver’s send_to_dut() task. The fields of the uart_frame are captured and the
timing of the task matches the delay value.
• The Transaction Stripe Chart window displays the same data in a different format. You can see
the frame field values in either stripe-chart or table format.
• Select a transaction in the stripe chart (or Table) and you will see the same transaction
highlighted in the waveform window.
o Which sequence(s) were executed during the simulation? (Hint: look at the value of “parent
sequence” in each frame.) Because this is a nested sequence, you will see that it executed
multiple sub-sequences.
o Do the values look correct?
o How did the sequencer know what sequence to execute?
5. Invoke SimVision and load the new transaction database:
% simvision -input lab2b.svcf
• The SimVision Waveform window displays the sequencer and driver transactions. If you expand
the uart_nested_seq (click the [+] sign next to the name in the Waveform window), it shows the
top-level sequence (uart_nested_seq), the sub-sequences (incr_payload_seq, bad_parity_seq, and
transmit_seq), and the individual uart frame transactions (req) that were generated. A sketch of
this nested-sequence structure appears after this list.
• The Transaction Stripe Chart window can also expand/collapse to show the sequence and
transaction data.
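For reference, the structure of such a nested sequence looks roughly like the sketch below. Only the sequence type names are taken from the waveform description above; the handle names and the execution order are illustrative.

// Rough sketch of a nested sequence that executes sub-sequences.
`include "uvm_macros.svh"
import uvm_pkg::*;

class uart_nested_seq extends uvm_sequence #(uart_frame);
  `uvm_object_utils(uart_nested_seq)

  incr_payload_seq incr_seq;
  bad_parity_seq   parity_seq;
  transmit_seq     tx_seq;

  function new(string name = "uart_nested_seq");
    super.new(name);
  endfunction

  virtual task body();
    // Each `uvm_do creates, randomizes, and runs a sub-sequence, which in turn
    // generates its own uart_frame items (the req transactions seen in SimVision).
    `uvm_do(incr_seq)
    `uvm_do(parity_seq)
    `uvm_do(tx_seq)
  endtask
endclass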
Lab 3. Reusable Environment Topology
3.1. Objectives
• Review and understand the correct structure of a reusable component.
2. Run a simulation and review the irun.log file:
% irun -f run.f
o Search for SVSEED in the irun.log file.
o The first thing you see printed in the log file is the UART topology. This capability is provided
by the UVM built-in automation and is triggered when the test calls uart0.print() in the second
initial block (lab3_note3d); note that uart0.print() is called after #1 to ensure that the build_phase
has completed and the uart and all its sub-components have been built. A sketch of this call
appears after this list.
o Notice that the Tx agent is UVM_ACTIVE and the Rx agent is UVM_PASSIVE. The Tx agent
contains a driver, a monitor, and a sequencer, while the Rx agent only contains a monitor.
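The call referred to above looks roughly like this (surrounding module code omitted; the instance name uart0 is taken from the lab text):

// Sketch of the topology print described above.
initial begin
  #1;            // wait past time 0 so build_phase (and the full hierarchy) is complete
  uart0.print(); // UVM field automation prints the UART UVC topology to the log
end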
Lab 4. Test Creation Using Reusable Components
4.1. Objectives
• Explore test creation and control using a UVM Testbench
• Examine an existing test and observe how it creates a verification environment
• Control environment behavior from the testbench and the test
• Control test execution and execute multiple tests without recompiling/re-elaborating the design
• Control the test exit using the objection handling mechanism
[Figure: block diagram of the UVM testbench, showing the module UVC (scoreboard, control/status registers, and control/interrupt logic) and the virtual sequencer]
This lab is run on the full verification environment and UART DUT. It includes:
• The Verilog RTL for the UART DUT
• An instance of the APB UVC
• An Instance of the UART UVC
• A module UVC which includes a scoreboard and a monitor for checking and coverage collection
• A virtual sequencer which controls multiple UVCs
• Coverage collection
• Transaction recording
1. Review the Simulation Verification Environment: uart_ctrl_tb.sv:
This file contains the testbench class for this design. uart_ctrl_tb extends uvm_env (lab4_note1).
It contains instances of the APB UVC, the UART UVC, the module UVC, and the virtual sequencer.
The build() method of the SVE (simulation verification environment) is used to configure the
verification environment and create/build the sub-components of the environment (lab4_note2).
A structural sketch of such a testbench class follows.
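The sketch below shows this structure. The UVC class names and the virtual sequencer type are assumptions for illustration; the instance names follow the topology printed in step 3. The lab text calls the method build(); the sketch uses the equivalent UVM build_phase().

// Illustrative sketch: UVC class names and the sequencer type are assumed.
`include "uvm_macros.svh"
import uvm_pkg::*;

class uart_ctrl_tb extends uvm_env;
  `uvm_component_utils(uart_ctrl_tb)

  apb_env                     apb0;              // APB UVC
  uart_env                    uart0;             // UART UVC
  uart_ctrl_env               uart_ctrl0;        // module UVC (scoreboard + monitor)
  uart_ctrl_virtual_sequencer virtual_sequencer; // coordinates the UVCs

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    // Configuration of the sub-environments would also go here (lab4_note2).
    apb0              = apb_env::type_id::create("apb0", this);
    uart0             = uart_env::type_id::create("uart0", this);
    uart_ctrl0        = uart_ctrl_env::type_id::create("uart_ctrl0", this);
    virtual_sequencer = uart_ctrl_virtual_sequencer::type_id::create("virtual_sequencer", this);
  endfunction
endclass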
2. Review a test class: tests/apb_uart_rx_tx.sv:
In UVM, tests are classes. u2a_a2u_full_rand_test extends uvm_test.
• It contains an instance of the SVE: uart_ctrl_tb uart_ctrl_tb0;
• The build() method of the test sets a default sequence for the virtual sequencer and then
creates and builds the testbench (uart_ctrl_tb0). A sketch of this test class follows.
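The sketch below shows this structure. Using uvm_config_db to set the virtual sequencer's default sequence is one common UVM idiom and may differ from the lab's exact code.

// Illustrative sketch of the test described above.
`include "uvm_macros.svh"
import uvm_pkg::*;

class u2a_a2u_full_rand_test extends uvm_test;
  `uvm_component_utils(u2a_a2u_full_rand_test)

  uart_ctrl_tb uart_ctrl_tb0;   // instance of the SVE

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    // Set a default sequence for the virtual sequencer's run phase.
    uvm_config_db#(uvm_object_wrapper)::set(this,
      "uart_ctrl_tb0.virtual_sequencer.run_phase", "default_sequence",
      concurrent_u2a_a2u_rand_trans::type_id::get());
    // Create and build the testbench.
    uart_ctrl_tb0 = uart_ctrl_tb::type_id::create("uart_ctrl_tb0", this);
  endfunction
endclass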
3. Run a simulation and review the printed topology (irun.log):
% irun -f run1.f
• The run1.f file specifies +UVM_TESTNAME=u2a_a2u_full_rand_test to tell the run_test()
method which UVM test class to execute.
• When reviewing the results in the irun.log file, we see the printed topology for the full
verification environment (uart_ctrl_tb0, apb0, uart0, uart0_ctrl0, and the virtual sequencer)
• Notice that the virtual sequencer was executing the “concurrent_u2a_a2u_rand_trans” sequence
(hint: search for “Executing sequence” string)
• Multiple UART and APB frames were generated and sent during the simulation.
4. Review the Virtual Sequence Library: uart_ctrl_virtual_seq_lib.sv:
• A base_virtual_seq sequence is included to handle the objection mechanism (lab4_note1). An
objection is raised in the pre_body() task and dropped in the post_body() task. All virtual
sequences extend from this base sequence. A sketch of this pattern appears after this list.
• The body() of the virtual sequence begins at the line marked lab4_note2 of this file:
o It starts by programming the DUT by calling the program_dut_csr_seq sequence (config_dut
instance).
o Then it spawns two threads: one sending a random number of frames from the UART to the
APB, and one sending transfers from the APB to the UART.
o Finally, it calls another sequence to read the UART DUT RX FIFO from the APB UVC.
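The objection pattern and the body structure described above look roughly like the sketch below. The virtual sequencer type, the traffic/read sequence types, and the sub-sequencer handles (apb_seqr, uart_seqr) are hypothetical placeholders; only base_virtual_seq, program_dut_csr_seq (config_dut), and concurrent_u2a_a2u_rand_trans come from the lab text.

// Illustrative sketch: names marked "hypothetical" are placeholders, not lab code.
`include "uvm_macros.svh"
import uvm_pkg::*;

class base_virtual_seq extends uvm_sequence;
  `uvm_object_utils(base_virtual_seq)
  `uvm_declare_p_sequencer(uart_ctrl_virtual_sequencer)  // hypothetical sequencer type name

  function new(string name = "base_virtual_seq");
    super.new(name);
  endfunction

  virtual task pre_body();
    if (starting_phase != null)
      starting_phase.raise_objection(this);  // keep the run phase alive while the sequence runs
  endtask

  virtual task post_body();
    if (starting_phase != null)
      starting_phase.drop_objection(this);   // let the test end once the sequence is done
  endtask
endclass

class concurrent_u2a_a2u_rand_trans extends base_virtual_seq;
  `uvm_object_utils(concurrent_u2a_a2u_rand_trans)

  function new(string name = "concurrent_u2a_a2u_rand_trans");
    super.new(name);
  endfunction

  virtual task body();
    program_dut_csr_seq config_dut;   // DUT register programming over APB (from the lab)
    uart_to_apb_seq     u2a_seq;      // hypothetical UART-to-APB traffic sequence
    apb_to_uart_seq     a2u_seq;      // hypothetical APB-to-UART traffic sequence
    read_rx_fifo_seq    rd_rx_fifo;   // hypothetical RX FIFO read sequence

    `uvm_do_on(config_dut, p_sequencer.apb_seqr)   // 1. program the DUT
    fork                                           // 2. two concurrent traffic threads
      `uvm_do_on(u2a_seq, p_sequencer.uart_seqr)
      `uvm_do_on(a2u_seq, p_sequencer.apb_seqr)
    join
    `uvm_do_on(rd_rx_fifo, p_sequencer.apb_seqr)   // 3. read the RX FIFO via the APB UVC
  endtask
endclass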
5. Run the uart_incr_payload test:
• This sequence programs the DUT via the APB bus, executes an incrementing payload sequence
via the UART interface, and then reads the TX FIFO via the APB bus.
% irun -f run2.f
The IUS simulator should not recompile/re-elaborate the design because we only changed the
UVM_TESTNAME argument in the run2.f file.
6. Run a simulation in GUI mode:
• Explore the SimVision debug capability by running in GUI mode:
% irun -f run2.f -input lab4.tcl
Lab 5. Using Incisive Verification Builder
5.1. Objective
• Use IVB to understand the necessary steps to implement a UVC
• Perform a simple simulation to check the correctness of the generated code
6. On the main page, click Create a new project. Add the project name "my_lib" as shown in the
second window below, and then click Finish:
7. The Wizard flow window opens. Select Create UVCs > Interface UVC Builder.
8. The IVB Building Wizard starts with the following windows; keep the default values unless
otherwise indicated:
9. On the General UVC Options window, type 'my_uvc' as the package name and select 'SystemVerilog –
Accellera UVM'.
10. Continue filling in the names for the initiator, responder, and data item, say master, slave, and
transaction, respectively.
11. Invoke the UVC generation by selecting my_uvc and clicking the Generate button:
12. Click OK on the pop-up form, and IVB will show the list of generated files.
13. Exit from IVB and cd to my_lib_lib/my_uvc/sv. Analyze:
• my_uvc_transaction.sv - for data item generation
• my_uvc_master_agent.sv, my_uvc_master_driver.sv, my_uvc_master_monitor.sv, and
my_uvc_master_sequencer.sv - to understand the generated agent structure (a generic
outline of this structure appears after this list)
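The generated agent typically has the structure outlined below; this is a generic sketch, not the exact IVB-generated code.

// Generic outline of an interface UVC agent; not the exact generated code.
`include "uvm_macros.svh"
import uvm_pkg::*;

class my_uvc_master_agent extends uvm_agent;
  `uvm_component_utils(my_uvc_master_agent)

  my_uvc_master_driver    driver;
  my_uvc_master_monitor   monitor;
  my_uvc_master_sequencer sequencer;

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    monitor = my_uvc_master_monitor::type_id::create("monitor", this);
    if (get_is_active() == UVM_ACTIVE) begin  // driver and sequencer only in active agents
      driver    = my_uvc_master_driver::type_id::create("driver", this);
      sequencer = my_uvc_master_sequencer::type_id::create("sequencer", this);
    end
  endfunction

  function void connect_phase(uvm_phase phase);
    if (get_is_active() == UVM_ACTIVE)
      driver.seq_item_port.connect(sequencer.seq_item_export);
  endfunction
endclass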
14. cd ../examples and review the files:
• test_lib.sv - check how the test has been implemented
• the sequence library - examine how the sequences are implemented
15. cd .. and execute ./demo.sh; the simulation starts with a graphical interface. Run the simulation by
clicking the Run button:
16. Analyze the irun.log and look for the topology print-out.
17. In the Design Browser, you can view the topology by expanding the uvm_test_top instance.
Lab 6. Coverage Analysis and Ranking Runs
In this section, we use Enterprise Manager to view regression results and perform test-case ranking.
To save time, we load an already-executed regression result.
6.1. Objective
• Use an executable verification plan to analyze test runs.
• Analyze the relative contribution of each run to the total coverage.
$SOCV_KIT_HOME/ip_regression_data/uvm_class/uart_ctrl/vm_uart_ctrl_regression.user1.11*/*.vsof
c. Click Open.
This opens the regression results.
2. To view the coverage, click the vPlan box in the Incisive Enterprise Manager window.
3. Click OK in the Backward Compatibility Note window if it pops up.
4. On the Automatic Coverage Merge window that opens, select the
“I intend to perform per-run analysis (do not merge)” option and click OK.
This displays the raw coverage, as shown below.
We now load the vPlan we reviewed in step 1 on top of this coverage:
1. Click Read in the Verification Plan Tree (Default) window.
2. Load the uart_apb_vplan.xml file and click Open.
Note: Remember to change Files of type to XML file (*.xml).
A Verification Plan Tree window appears with the UART vPlan annotated with coverage results.
Note that all the test cases in the regression have passed (the count of twelve tests shown in the figure
above is only indicative). Let us review the individual result logs.
2. In the passed column (marked “P”), click the number 12 (again, the number shown in the figure
above is only indicative).
This displays a list of the tests that passed and their seeds. You can select one of the runs
to get information such as the run directory, log file, seed, and failure cause (if any).
3. Click OK on the Ranking pop-up window that appears.
4. Click OK on the Reread Coverage window.
A Runs window opens if one is not already open; otherwise, the table in the existing Runs window
changes as shown below:
The cumulative attribute is added to the table, which is sorted according to coverage efficiency.
The optimum runs have a purple background, while redundant runs have a blue one.
5. Click the Filter radio button under the vPlan button.
We use the general filtering mechanism to remove runs that don’t contribute additional code or
functional coverage towards our plan goals.
6. From the pull-down menu on the left, select Ranking Id.
7. Select != from the comparison pull-down, and enter -1 in the text box.
8. Click Filter.
9. These runs can be written to a verification session input file (VSIF) to launch a focused regression suite.
6.6. Analyzing Coverage
Restore the Verification Plan Tree window that we previously minimized.
Notice that we have only hit a very small subset of the TX FIFO level coverage bins. In particular, the
TX FIFO is never completely filled.
2. To create a correlation matrix for the Serial Data FIFOs node, select Serial Data FIFOs in the vPlan by
clicking it.
3. Click the Correlate button or select Analysis - Correlation Matrix from the menu.
4. Click OK when the following dialog box appears.
5. If the Reread Coverage dialog box appears, click OK to create the correlation matrix.
The following Runs window opens, displaying the runs in the current context. A coverage attribute
column is added to the Runs table, displaying the contribution made by each run (seed).
The simulations run so far do not reach 100% coverage for this feature. We may need a large number of
seeds, or to bias the constraints differently, to increase coverage of this section.
We have created a test that targets the FIFO coverage holes and addresses this coverage goal.
Next, we run this test and look at the cumulative coverage to determine if we are making progress
towards this feature.
To run this test we start a new simulation session in Incisive Manager by following the steps below:
6. Go to the main Incisive Enterprise Manager window.
7. Click Start.
This launches a Start Session window.
8. Select uart_rxtx_fifo_cov.vsif and click Open.
The test case runs in a new xterm.
9. Once the test case completes, quit the xterm.
10. To see the simulation result, click Refresh in the Incisive Manager window.
11. Follow the steps 1-3 above to create a correlation matrix for the Serial Data FIFOs node.
The Runs window opens displaying the runs in the current context.
Now you can see that the new uart_txrx_fifo_cov test has 100% correlation for the Serial Data
FIFOs node. The total coverage has also increased from the previous simulation.
Summary
Incisive Enterprise Manager automates the execution of multiple random tests. It allows intuitive,
feature-based review of multiple coverage metrics, lets you rank your regression suite to remove
redundant tests, and correlates simulation runs with the features they exercise.