How to model output delay in SystemC like the Verilog non-blocking intra-assignment delay?

I am trying to model an OR gate with a 2 ns output delay in SystemC.

#include "systemc.h"

SC_MODULE(adder) {
   sc_in<bool>  A, B;  
   sc_out<bool> OUT;
   
   void add() {
     
     while (true){
       wait(A.default_event() | B.default_event());
       bool intermediate = A.read() | B.read();
       wait(2, SC_NS);
       //cout << "adding at time " << sc_time_stamp() << endl;
       OUT.write(intermediate);
     }  
   }
 
   SC_CTOR(adder){
     SC_THREAD(add);
     sensitive << A << B;
   }
};

This is my attempt so far, but the simulation result is not what I expect:

Time    A    B    OUT
0 s     0    0    0
5 ns    1    0    0
6 ns    0    0    0
7 ns    0    0    1
13 ns   0    1    1

Instead of OUT returning to 0 at 8 ns, the process has missed the falling edge of A at 6 ns, because the thread was suspended in wait(2, SC_NS). In Verilog, an event is similarly missed when a blocking assignment with a delay is used, and the fix is a non-blocking intra-assignment delay (see this about Verilog: https://electronics.stackexchange.com/q/572643/238188). So how do I stop the event from getting missed in SystemC?
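One possible way to avoid losing input events, analogous to Verilog's transport delay OUT <= #2 (A | B), is to split the gate into a compute process that never suspends and a separate delayed driver, queuing each scheduled output value. This is a sketch only, and the module and member names below are my own invention, but sc_event_queue is part of standard SystemC (IEEE 1666) and, unlike a plain sc_event, it keeps every pending timed notification instead of overwriting the previous one:

```cpp
#include <systemc.h>
#include <deque>

// Hypothetical module name: OR gate with a 2 ns transport delay.
SC_MODULE(or_gate_delayed) {
   sc_in<bool>  A, B;
   sc_out<bool> OUT;

   sc_event_queue   delayed;   // holds any number of pending notifications
   std::deque<bool> pending;   // output values, in the same FIFO order

   void compute() {            // SC_METHOD: runs on every edge of A or B
     pending.push_back(A.read() || B.read());
     delayed.notify(2, SC_NS); // schedule a matching write 2 ns later
   }

   void drive() {              // SC_METHOD: runs once per expired notification
     OUT.write(pending.front());
     pending.pop_front();
   }

   SC_CTOR(or_gate_delayed) {
     SC_METHOD(compute);
     sensitive << A << B;
     dont_initialize();

     SC_METHOD(drive);
     sensitive << delayed;
     dont_initialize();
   }
};
```

Because compute() is an SC_METHOD, it can never be stuck in a wait, so no edge of A or B is missed; back-to-back input changes (such as the ones at 5 ns and 6 ns in the trace above) each queue their own delayed write, which is exactly the behavior of a Verilog non-blocking intra-assignment delay.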
