uvm_rookie Posted February 7, 2012
set_global_timeout is a deprecated feature in UVM 1.1a. What is the new way of specifying a global timeout?
Roman Posted February 8, 2012
Hi there, why not use +UVM_TIMEOUT=<timeout>,<overridable> on the command line?
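As a sketch of what that might look like (the simulator invocation below is just an example; the exact tool command varies, but the plusarg itself is read by uvm_root at the start of simulation):

```
# +UVM_TIMEOUT=<timeout>,<overridable>
#   <timeout>     : timeout in simulation time units
#   <overridable> : YES allows a later set_timeout() call to change it,
#                   NO locks the value for the whole run
vsim -c top +UVM_TIMEOUT=1000000000,NO -do "run -all"
```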
janick Posted February 8, 2012
See $UVM_HOME/examples/simple/phases/timeout for an example of how to add a global per-phase timer.
uvm_rookie Posted February 8, 2012 (Author)
I have tests that run from minutes to hours, and up to a week, so I would like to use a different timeout for each test. What is the proper way of doing so?
uwes Posted February 8, 2012
Hi, to be clear: you should be using objections as a distributed means of performing end-of-test handling. The timeouts are usually just a "grace" period to delay the end of simulation or the transition to another phase.
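A minimal sketch of the objection-based approach Uwe describes, assuming a hypothetical test class (the names and the placeholder delay are illustrative, not from this thread):

```systemverilog
// End-of-test via objections: the run phase stays alive only while
// at least one component holds an objection on it.
class my_test extends uvm_test;
  `uvm_component_utils(my_test)

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  task run_phase(uvm_phase phase);
    phase.raise_objection(this, "stimulus in flight");
    // ... start sequences, wait for the DUT to drain ...
    #100ns; // placeholder for the real end-of-test condition
    phase.drop_objection(this, "stimulus done");
  endtask
endclass
```

When all objections are dropped, the phase ends on its own; the timeout then only matters if something hangs while an objection is still raised.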
janick Posted February 8, 2012
That depends on the timeout mechanism you are using, the timescale that is active in the timer code, and how long you need the timer to be. The value should be set in each test where it needs to differ from the default set in the base test, probably best in the build phase. And as Uwe pointed out, timeouts should only be used to catch run-away conditions, not to terminate the test gracefully.
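One way to sketch this per-test scheme in UVM 1.1 is to call uvm_root's set_timeout() in each test's build phase; the class names here are hypothetical, and the values are simulation time, so explicit time literals are used:

```systemverilog
// Base test establishes a default, overridable timeout.
class base_test extends uvm_test;
  `uvm_component_utils(base_test)

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    uvm_top.set_timeout(1ms, 1); // default; second arg 1 = overridable
  endfunction
endclass

// A longer test overrides the default with its own value.
class long_test extends base_test;
  `uvm_component_utils(long_test)

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    uvm_top.set_timeout(100ms, 1); // this test needs more simulated time
  endfunction
endclass
```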
uvm_rookie Posted February 8, 2012 (Author)
That's exactly what I'm trying to do: catch run-away conditions. In case something goes wrong with a particular test, it won't be stuck indefinitely and hold up an LSF slot/server. I will follow the example and set up a test-specific timeout value in the build phase of each test. Thanks.
jillkam Posted February 24, 2012
I, too, am new to UVM, and I, too, am trying to set a global timer without using deprecated methods. I do use raise/drop objections. But I don't want to kick off loops of a test overnight only to find out that the first run got stuck... and since the default value of the timeout is 90 seconds or something like that, it's still stuck in the morning! So I've generally set a global timer to kill a test that is objecting but otherwise stuck. I prefer to set this in the base test class, and then set it again in a subclass test if that test needs more time.

I've tried the example code janick recommended, tried the command-line UVM_TIMEOUT (not my preferred solution), and also tried calling set_timeout from the base test class. The timeouts do occur... but they occur immediately (in wall time), regardless of the value that I set. If I set it to 2 seconds, it reports a timeout after 2 seconds, but the run ends just as instantly as if I'd set it to 2 ns. I suspected the timescale in my environment, but if I run the example code stand-alone (where the timescale is set by make on the command line) and change the timeout values in the env class, I see the same behavior.

I'm running Questa at the moment, but this code needs to port to Incisive and VCS as well. Any clues? What am I doing wrong? Thanks!
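One likely culprit for "instant" timeouts, offered here as an assumption to check rather than a confirmed diagnosis: the timeout value is simulation time, not wall-clock time, and a bare integer literal is scaled by the timescale in effect where it is compiled. A sketch:

```systemverilog
// With `timescale 1ns/1ns in effect, set_timeout(2) means 2 ns of
// *simulated* time, which expires almost instantly in wall-clock terms.
// An explicit time literal removes the timescale ambiguity:
uvm_top.set_timeout(2ms, 0); // 2 ms of simulated time, not overridable
```

Note also that "2 seconds" of simulation time is enormous for most designs: a test that runs for hours of wall-clock time on a server may cover only microseconds of simulated time, so the timeout should be sized in simulation-time terms, not wall-clock terms.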
krb Posted April 5, 2012
Hi, a question along the same lines: how can my test know that it was stopped due to a timeout set earlier (either via the function or from the command line)? Thanks, krb
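One possible approach is a report catcher that watches for the timeout message. This sketch assumes the UVM 1.1 library reports the expired timeout with message id "PH_TIMEOUT"; check the message id your UVM version actually emits before relying on it:

```systemverilog
// Flags when the global phase-timeout message is seen, while still
// letting the original message propagate.
class timeout_catcher extends uvm_report_catcher;
  bit timed_out;

  function new(string name = "timeout_catcher");
    super.new(name);
  endfunction

  function action_e catch();
    if (get_id() == "PH_TIMEOUT") timed_out = 1; // remember it fired
    return THROW; // pass the message through unchanged
  endfunction
endclass

// Registration, e.g. in the test's build_phase:
//   timeout_catcher tc = new();
//   uvm_report_cb::add(null, tc);
```

The test (or a final-phase hook) can then inspect tc.timed_out to distinguish a timeout-triggered shutdown from a normal end of test.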