
Modeling of a Timer in SystemC


ankushKumar


Hello All,

I need some strong suggestions regarding the modeling of a TIMER. I am new to SystemC.

I want to know what a good model should consist of, and how its modeling should be done.

Since a timer has a clock time period, should I model it by incrementing/decrementing a counter and applying a delay of one timer period per step? Or should I model it without any delay, by making the process sensitive to a new write to the register, and then incrementing/decrementing the counter value by the difference between the current time and the time of the previous activation, divided by the clock time period?

With Warm Regards.


Since ESL models are supposed to be lightweight and fast (esp. if used for software development), I strongly suggest you do not increment/decrement a counter. Instead, use timed event notification and calculation as you seem to have suggested. I did a presentation at NASUG on this, "Look Ma, no clocks" several years back. I am pretty sure the code was published -- www.nascug.org should have reference to it somewhere.
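The timed-notification idea can be sketched in plain C++ (a hypothetical illustration; the `LazyTimer` name and fields are made up). In a real SystemC model, `now_ns` would come from `sc_time_stamp()` and the expiry delay would be handed once to `sc_event::notify(delay)` instead of being counted tick by tick:

```cpp
#include <cstdint>

// Hypothetical sketch of the "no clocks" counter arithmetic.
struct LazyTimer {
    uint64_t load_value;   // value written to the count register
    double   period_ns;    // timer clock period in nanoseconds
    double   start_ns;     // simulation time of the last write

    // The count is derived on demand instead of decremented per tick.
    uint64_t current_count(double now_ns) const {
        uint64_t elapsed_ticks =
            static_cast<uint64_t>((now_ns - start_ns) / period_ns);
        return elapsed_ticks >= load_value ? 0 : load_value - elapsed_ticks;
    }

    // Delay until expiry; a SystemC model would pass this to
    // sc_event::notify() once, instead of waiting on every tick.
    double expiry_delay_ns(double now_ns) const {
        return current_count(now_ns) * period_ns;
    }
};
```

The register value only needs to be computed when software actually reads it, so the simulator does no per-tick work at all.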


May I know what difference it makes whether I use a timed notification or a wait() after every increment or decrement?

I mean, a timed notification will also notify after the delay that has been provided.



Hello Sir,

A simple timer can be implemented as outlined below. First of all, note that a timer can count down or up, or simply wait for a triggering event, and this trigger is essential for any timer to work.

1. Have a SystemC clock (sc_core::sc_clock) with a pre-defined time period.

2. Have a loop that runs while the triggering event is FALSE and resets when the triggering event is TRUE. Inside the loop, decrement/increment a counter at each clock tick. When the triggering event stops the loop, the counter value multiplied by the clock period gives the time value that you need.

Hope that helps.
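For reference, that per-tick loop can be reduced to a plain C++ sketch (hypothetical; in SystemC each loop iteration would be a `wait(clk.posedge_event())` inside an `SC_THREAD`):

```cpp
#include <cstdint>

// Hypothetical sketch of the per-tick approach, outside SystemC.
struct TickTimer {
    double period_ns;  // clock period in nanoseconds

    // Count ticks until `trigger` fires; returns the elapsed time.
    // `trigger` stands in for the triggering event described above.
    template <class Trigger>
    double run(Trigger trigger) const {
        uint64_t count = 0;
        while (!trigger(count)) {
            ++count;              // one wait(clk) per step in SystemC
        }
        return count * period_ns; // counter value times clock period
    }
};
```

Note that every iteration of the loop corresponds to one wake-up of the timer process by the simulation kernel.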



Hello,

Incrementing/decrementing the counter at every clock tick slows down the simulation; that is why David recommended the second method.
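To put a rough number on that difference (a hypothetical cost model, not SystemC code): for a timer loaded with N counts, the per-tick style wakes the timer process N times, while a single timed notification wakes it once:

```cpp
#include <cstdint>

// Hypothetical cost model for the two timer styles: how many times
// the simulation kernel must resume the timer process.
uint64_t activations_per_tick(uint64_t ticks) {
    return ticks;  // one wait(clk) per increment/decrement
}

uint64_t activations_timed_notify(uint64_t /*ticks*/) {
    return 1;      // a single event.notify(ticks * period)
}
```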



Hello Sir,

The choice is yours. With manual event tracking, the programmer has to keep track of each event himself; why bother, when the same goal can be achieved with sensitivity lists, with the SystemC runtime infrastructure taking care of the task?

Note also that whether the simulation runs slowly or not does not matter in the least when you are trying to understand the behavior of a design. Simulation time is just what the name suggests: an artificial setup. If you are really interested in performance characteristics, it is highly recommended that you use good old SPICE simulations.
