

Posted

Hi Everyone,

I am a beginner in SystemC and am implementing a model. How can we implement a delay for a value wider than 64 bits?

Suppose val is a uint64 set to its maximum value (assigned -1, i.e. 0xFFFFFFFFFFFFFFFF).

It needs to be multiplied by some time interval (say 50 ns).

How can I use wait or notify to trigger a method after this much delay?

Is there any data structure in SystemC which can hold a result wider than 64 bits?

In my case the delay value itself may also exceed 64 bits. Does anyone have an idea how to deal with this?

Posted

First, the basic answer is NO. SystemC only allows 64 bits for a time variable.

Before you rant about how inappropriate that is, consider that using picoseconds (ps) as the smallest unit of time resolution means that you can represent up to 30 weeks, 3 days, 12 hours, 5 minutes and 44 seconds of simulated time.

I do not know of any simulations that need that much dynamic range.

If need be, you can change the resolution to nanoseconds (ns) and represent up to about 585 years!
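
If you want to see these limits for yourself, here is a minimal illustrative snippet (my own addition, not from the original post) that prints the current time resolution and the largest representable sc_time using the standard queries sc_get_time_resolution() and sc_max_time():

#include <systemc>
#include <iostream>

int sc_main(int, char*[])
{
    // With the default 1 ps resolution, sc_max_time() is roughly 213 days
    // of simulated time; a coarser resolution extends this range.
    std::cout << "resolution: " << sc_core::sc_get_time_resolution() << std::endl;
    std::cout << "max time:   " << sc_core::sc_max_time() << std::endl;
    return 0;
}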

Posted

Dear Sir,

One way to get around this problem is to synchronize with a clock and then wait for, say, 100 clock ticks, which corresponds to a finite simulation time period. Also, although SystemC does not allow for more than 64-bit time variables, it does allow for arbitrary-length bit vectors, for example:

sc_dt::sc_bv<128> b_v;
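
As a rough, untested sketch of this clock-synchronized idea (the names wait_big_delay and CHUNK_CYCLES are mine, and I use sc_biguint instead of sc_bv so the count supports arithmetic), a delay whose cycle count does not fit into 64 bits can be consumed in bounded chunks from inside an SC_THREAD:

void wait_big_delay(sc_dt::sc_biguint<128> cycles, const sc_core::sc_time& period)
{
    // Wait in chunks small enough that chunk * period always fits into sc_time.
    const sc_dt::uint64 CHUNK_CYCLES = 1000000000ULL;
    while (cycles > CHUNK_CYCLES) {
        sc_core::wait(CHUNK_CYCLES * period);   // one bounded wait
        cycles -= CHUNK_CYCLES;
    }
    sc_core::wait(cycles.to_uint64() * period); // remaining cycles
}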

Hope that helps.

Posted

Let me try to explain my case:

Consider val_1 and val_2 (both are 64-bit values).

val_2 is a compare value.

x is an 8-bit value (can be anywhere between 0x00 and 0xFF).

Assume the supplied clock period is 50 ns.

val_1 needs to be incremented every (x+1) clock cycles.

i.e. if x is 4, then val_1 is incremented every 5 clock cycles.

If we start at 0 ns, the first increment should be at 250 ns.

The implementation should be optimized: rather than checking on each increment, the delay to the match should be calculated up front from val_2.

When val_1 matches val_2, some task needs to be done, such as generating an interrupt.

How should we accurately calculate the delay to the match here?

Kindly suggest a better implementation which meets these requirements.

Posted

What about something like this:

sc_time delay = (x+1) * sc_time(50,SC_NS) * (val_2 - val_1);
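
To tie this back to the original wait/notify question, here is a minimal, untested sketch (the module and member names are mine, not from this thread) of how such a delay could arm a timed event notification that triggers a method when the compare value is reached:

SC_MODULE(compare_timer)
{
    sc_core::sc_event match_ev;

    SC_CTOR(compare_timer)
    {
        SC_METHOD(on_match);
        sensitive << match_ev;
        dont_initialize();
    }

    void arm(sc_dt::uint64 val_1, sc_dt::uint64 val_2, unsigned x)
    {
        // The conversion of (val_2 - val_1) to double is where the precision
        // concern discussed below comes from.
        sc_core::sc_time delay =
            (x + 1) * sc_core::sc_time(50, sc_core::SC_NS) * double(val_2 - val_1);
        match_ev.notify(delay);   // timed notification
    }

    void on_match()
    {
        // compare match reached: generate the interrupt here
    }
};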

Greetings from Oldenburg,

Posted

Hi Philipp,

Sorry, but it won't be accurate for a higher range of values (i.e. not accurate for difference values > 0xFFFFFFFFFFF).

Thanks anyway; please let me know if there is a better approach.

Posted

Side note: Can you try to keep your reply outside of the "[ quote ]" blocks? Otherwise, it's hard to parse what's new and what's quoted.

It is as accurate as you can get in SystemC. As David said in his reply, SystemC time is itself represented as a 64-bit value.

You can adjust the resolution (as a power of 10) with the function call (before using sc_time for the first time!):

sc_set_time_resolution( 10, SC_NS );

to increase the maximum simulation time (but lose local precision).

In case of very big differences, I would suggest generating a warning and ignoring the notification. It is unlikely that you need both a very high time resolution (below a single clock cycle) and very long simulated time periods (several years).
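
A rough sketch of that check (reusing the val_1/val_2/x names from the thread and the match_ev event from the sketch above; the threshold and message text are just placeholders):

sc_dt::uint64 diff = val_2 - val_1;
if (diff > 0xFFFFFFFFFFFULL) {
    // Difference too large to convert accurately; warn and skip the notification.
    SC_REPORT_WARNING("compare_timer", "compare value too far away, notification skipped");
} else {
    match_ev.notify((x + 1) * sc_core::sc_time(50, sc_core::SC_NS) * double(diff));
}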

Greetings from Oldenburg,

Philipp
