ankushKumar Posted December 29, 2014
Hello All, I need some strong suggestions regarding modeling a TIMER. I am new to SystemC and want to know what a good timer model should consist of and how it should be modeled. Since a timer has a clock period, should I model it by incrementing/decrementing a counter and waiting one timer period per step, or should I model it without any delay, using sensitivity to a new register write and incrementing/decrementing the counter by the elapsed time between the current and previous activation divided by the clock period? With Warm Regards.
David Black Posted December 30, 2014
Since ESL models are supposed to be lightweight and fast (especially if used for software development), I strongly suggest you do not increment/decrement a counter. Instead, use timed event notification and calculation, as you seem to have suggested. I did a presentation at NASCUG on this, "Look Ma, no clocks", several years back. I am pretty sure the code was published; www.nascug.org should have a reference to it somewhere.
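For reference, a minimal sketch of the clockless style described above: the timer schedules a single timed notification for the moment it would expire and reconstructs the current count on demand from sc_time_stamp(). The module and method names (ClocklessTimer, load(), current_count()) and the fixed 10 ns period are assumptions for illustration, not taken from the NASCUG presentation.

```cpp
// Sketch only: one timed notification per load, no per-tick activity.
#include <systemc>
using namespace sc_core;

SC_MODULE(ClocklessTimer) {
    sc_event      expired_ev;     // fires once, when the count would reach zero
    sc_time       period;         // modelled tick period (assumed 10 ns here)
    sc_time       start_time;     // simulation time of the last load
    unsigned int  start_count;    // value written at the last load

    SC_CTOR(ClocklessTimer)
      : period(10, SC_NS), start_time(SC_ZERO_TIME), start_count(0)
    {
        SC_METHOD(on_expire);
        sensitive << expired_ev;
        dont_initialize();
    }

    // Register write: remember when/what was loaded and schedule one event.
    void load(unsigned int count) {
        start_count = count;
        start_time  = sc_time_stamp();
        expired_ev.cancel();                   // drop any pending expiry
        expired_ev.notify(period * count);     // single timed notification
    }

    // Register read: derive the count from elapsed time, no per-tick work.
    unsigned int current_count() const {
        unsigned int elapsed_ticks =
            static_cast<unsigned int>((sc_time_stamp() - start_time) / period);
        return (elapsed_ticks >= start_count) ? 0 : start_count - elapsed_ticks;
    }

    void on_expire() {
        SC_REPORT_INFO("ClocklessTimer", "timer expired");
        // e.g. raise an interrupt line here
    }
};
```

A bus model or testbench process would call load() on a register write and current_count() on a read; the kernel only wakes the timer once, at expiry, regardless of how large the count is.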
ankushKumar Posted December 30, 2014 (Author)
Regarding the suggestion to use timed event notification: may I know what difference it makes whether I use a timed notification or a wait() after every increment/decrement? I mean, a timed notification will also fire only after the delay that has been provided.
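As an illustrative aside on the question above (not from the thread): both styles reach the same simulation time; the difference is how many times the kernel has to resume the process. A sketch, with made-up names and values:

```cpp
// Sketch: per-tick waiting versus one timed wait for the whole interval.
#include <systemc>
using namespace sc_core;

SC_MODULE(WaitStyles) {
    sc_time      period;
    unsigned int n;

    SC_CTOR(WaitStyles) : period(10, SC_NS), n(1000) {
        SC_THREAD(per_tick);      // resumed n times by the scheduler
        SC_THREAD(single_shot);   // resumed once
    }

    void per_tick() {
        for (unsigned int count = n; count > 0; --count)
            wait(period);                     // one wake-up per tick
        SC_REPORT_INFO("WaitStyles", "per-tick counter done");
    }

    void single_shot() {
        wait(period * n);                     // one wake-up, at the end
        SC_REPORT_INFO("WaitStyles", "single timed wait done");
    }
};

int sc_main(int, char*[]) {
    WaitStyles top("top");
    sc_start();
    return 0;
}
```

The per-tick thread is suspended and resumed n times, each with scheduler overhead; the single timed wait is resumed once, which is why the clockless style scales better as counts grow or as more timers are added to the model.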
dakupoto Posted December 31, 2014
Hello Sir, A simple timer can be implemented as outlined below. First of all, note that a timer can count down/up or simply wait for a triggering event, and this trigger is essential for any timer to work.
1. Have a SystemC clock (sc_core::sc_clock) with a pre-defined time period.
2. Have a loop that runs while the triggering event is FALSE and resets when the triggering event is TRUE. Inside the loop, decrement/increment a counter at each clock tick. When the triggering event stops the loop, the counter value multiplied by the clock period gives the time value that you need.
Hope that helps.
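A rough sketch of the clocked, counting style outlined in the post above; the port and member names (start, expired, reload) and the reload value are assumptions for illustration, and the clk port would be bound to an sc_clock of the desired period.

```cpp
// Sketch only: a down-counter evaluated on every clock edge.
#include <systemc>
using namespace sc_core;

SC_MODULE(CountingTimer) {
    sc_in<bool>   clk;        // bind to an sc_clock with the desired period
    sc_in<bool>   start;      // triggering signal: begin counting down
    sc_out<bool>  expired;    // asserted when the count reaches zero

    unsigned int  count;
    unsigned int  reload;     // value loaded on each trigger

    SC_CTOR(CountingTimer) : count(0), reload(100) {
        SC_METHOD(tick);
        sensitive << clk.pos();   // one evaluation per rising clock edge
        dont_initialize();
    }

    void tick() {
        if (start.read() && count == 0) {
            count = reload;           // trigger seen: (re)load the counter
            expired.write(false);
        } else if (count > 0) {
            if (--count == 0)
                expired.write(true);  // elapsed time = reload * clock period
        }
    }
};
```

Note that tick() runs on every rising clock edge whether or not the timer is active, which is the per-tick overhead discussed in the following posts.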
ankushKumar Posted December 31, 2014 (Author)
Hello, incrementing/decrementing the counter at every clock tick slows down the simulation, which is why David recommended the second method (timed notification).
dakupoto Posted January 2, 2015
Hello Sir, The choice is yours. With event tracking, the programmer has to manually track each event -- why bother, when the same goal can be achieved with sensitivity lists and the SystemC runtime infrastructure taking care of the task? Note also that whether the simulation runs slowly or not does not matter in the least when you are trying to understand the behavior of a design. Simulation time is just what the name suggests -- an artificial setup. If you are really interested in performance characteristics, it is highly recommended that you use good old SPICE simulations.