
UVM and unit-testing methodology


In addition to chip design, I've done a lot of OO software development, and I very much like unit testing. I need to verify, in an automated test, that my integrity tests and other checks actually work. I would assume that other verification engineers have the same problem, and that UVM has some method of doing it. But it's not obvious to me, so I'll assume (in my ignorance) that it doesn't exist...

So, in UVM, what's the best way to create a test case such that the test passes if a specific assertion fires, and fails otherwise?

I would like to get a PASS iff the following occurs:

1) every element in a list of expected errors is seen

2) each element in the list consists of:

- ID (the ID passed into `uvm_error)
- error_count_min, error_count_max (in the case of lists, I may expect the error to occur N to M times)

I would prefer to have a method to specify the ID code as {error_code_enum, error_code_instance} rather than have to use file/line, but that's an enhancement for later.

This would seem like a generally useful thing in UVM - everybody has to verify that their error-detection logic works.




That's what the report catcher is for. As long as the assertion reports its failure through the UVM reporting system, you can catch messages, inspect them, and, if they are the ones you are looking for, demote them or simply swallow them so they are never displayed.

That's how negative tests are implemented in UVM.

There are no examples in the distribution, but if you are an experienced OO programmer, it should not be too difficult to figure out.
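Since the distribution has no examples, here is a minimal sketch of the pattern described above: a `uvm_report_catcher` subclass that recognizes one expected error ID and demotes it so a negative test can pass. The class name and the ID string (`"MY_CHECKER_ERR"`) are invented for illustration; substitute the ID your checker passes to `uvm_error.

```systemverilog
// Hypothetical sketch -- catcher that demotes one expected error ID.
class expected_error_catcher extends uvm_report_catcher;
  string exp_id = "MY_CHECKER_ERR"; // ID your checker passes to `uvm_error
  int    seen   = 0;                // how many times the expected error fired

  function new(string name = "expected_error_catcher");
    super.new(name);
  endfunction

  // Called for every message issued through the UVM reporting system.
  function action_e catch();
    if (get_severity() == UVM_ERROR && get_id() == exp_id) begin
      seen++;
      set_severity(UVM_INFO);                    // demote: test no longer fails
      set_message({"(expected) ", get_message()});
    end
    return THROW;                                // let the (demoted) message print
  endfunction
endclass

// Registration, e.g. in the test's build_phase():
//   expected_error_catcher c = new();
//   uvm_report_cb::add(null, c);   // null = catch reports from all components
```

Returning THROW (rather than CAUGHT) keeps the demoted message in the log, which is handy evidence that the error detection actually fired.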


Yes, I saw that I can swallow errors using the report catcher. But I want positive evidence that the error detection fired, not simply to turn off the error.

Essentially, I want an error scoreboard: all the errors that I expected to see were seen (or else I get an error), and no unexpected errors were seen.

I thought about modifying the report catcher to keep a static list of the errors that were sent to it, so it could then check that all detected errors were expected.

What do you think of giving the report catcher a list of expected errors? For each of those errors that occurs, it:

a) converts it to an INFO (so that the logfile shows the time/location where the error was reported)

b) sends the information to an actual scoreboard in my testbench, which knows the list of expected errors, so I can use the usual scoreboard system of comparing expected vs. actual?

Or is there some better method to actively verify that all expected errors were seen?
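To make the question concrete, here is a sketch of what I have in mind, including the min/max occurrence range from my first post. All names are invented; `check_expected()` would be called from the test's report_phase().

```systemverilog
// Hypothetical sketch -- catcher that demotes expected errors to INFO and
// tallies hits per ID, then compares actuals against a min/max range.
class error_scoreboard_catcher extends uvm_report_catcher;
  typedef struct { int unsigned min_cnt; int unsigned max_cnt; } range_t;
  range_t      expected[string];  // expected error ID -> allowed count range
  int unsigned actual[string];    // observed count per ID

  function new(string name = "error_scoreboard_catcher");
    super.new(name);
  endfunction

  function action_e catch();
    if (get_severity() == UVM_ERROR && expected.exists(get_id())) begin
      actual[get_id()]++;
      set_severity(UVM_INFO);     // keep the time/location in the logfile
      set_message({"(expected) ", get_message()});
    end
    return THROW;
  endfunction

  // Call from the test's report_phase(): flag missing or excess errors.
  function void check_expected();
    foreach (expected[id]) begin
      int unsigned n = actual.exists(id) ? actual[id] : 0;
      if (n < expected[id].min_cnt || n > expected[id].max_cnt)
        `uvm_error("EXP_ERR_CHK",
          $sformatf("ID '%s' seen %0d times, expected %0d..%0d",
                    id, n, expected[id].min_cnt, expected[id].max_cnt))
    end
  endfunction
endclass
```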


Since you define the report catcher, you can make it part of your scoreboarding mechanism. What you describe would indeed work.

For simple error-detection tests, I keep a counter of expected-and-seen messages in the catcher and check that it has the expected value at the end of the test, in report_phase(). If you see more or fewer messages, the test fails. Another way is to use the uvm_report_server::get_id_count() method to get a count of the (presumably demoted) messages of a specific ID that were issued.

And demoting expected messages to INFO is indeed a good idea -- with a modification to the text to state that it is an expected warning/error/fatal.
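A sketch of the end-of-test check using get_id_count(), which counts issued messages by ID regardless of the severity they were demoted to. The test name, ID string, and expected count of 3 are invented for illustration.

```systemverilog
// Hypothetical sketch -- verify in report_phase() that the expected error
// was issued exactly N times (here N = 3).
class my_negative_test extends uvm_test;
  `uvm_component_utils(my_negative_test)

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  function void report_phase(uvm_phase phase);
    uvm_report_server svr = uvm_report_server::get_server();
    super.report_phase(phase);
    // get_id_count() tallies messages by ID, even after demotion to INFO.
    if (svr.get_id_count("MY_CHECKER_ERR") != 3)
      `uvm_error("NEG_TEST",
        $sformatf("expected 3 MY_CHECKER_ERR messages, saw %0d",
                  svr.get_id_count("MY_CHECKER_ERR")))
  endfunction
endclass
```

Note that in UVM 1.2 the report server is obtained through uvm_coreservice_t; uvm_report_server::get_server() is the UVM 1.1 accessor.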

Edited by janick

ejessen, there's also an open-source SV unit-test framework called SVUnit, which you can find at www.agilesoc.com/svunit. I think the most comparable SW framework would be cpptest, if you're familiar with that. I'm in the middle of building a usage model that incorporates UVM into the framework (integrating UVM components was a big hurdle, but as of a couple days ago, SVUnit is UVM-component friendly).


