Error: CLK failed to set wave to 0 and Sampling id is greater than delay


Hi All, 

 

While simulating an A2D converter, I ran into the following problem.

 

Compilation: No Error

 

running .exe file returns the following error message:

 

-----------------------------------------------------------------------------------------------------------------------------------------------------------------

Warning: SystemC-AMS: Initialization for tracing of: CLK failed  set wave to 0
 
In file: sca_trace_object_data.cpp:136
In process: method_p_0 @ 0 s
 
Error: SystemC-AMS: Sample id (0) is greater than delay (0 while initializing port: A2D_dut.a2d.eoc
In file: /home/4all/packages/systemc/2.2.0-sl4.5//include/scams/predefined_moc/tdf/sca_tdf_sc_out.h:418
In process: A2D_dut.a2d.sca_implementation_0.cluster_process_0 @ 0 s
-----------------------------------------------------------------------------------------------------------------------------------------------------------------
 
In this regard, I have checked the complete code as well as the files mentioned in the error messages (sca_tdf_sc_out.h:418, sca_trace_object_data.cpp:136), but I did not get any clue.
 
My code looks like this:
 
//A2D.h
 
SCA_TDF_MODULE (a2d_nbit)
 
{
//port declaration
  sca_tdf::sca_in<double> a_in; // analog input pin
 
  sca_tdf::sca_de::sca_in<sc_dt::sc_logic> start; //start signal
  sca_tdf::sca_de::sca_in<bool> clk; //clock signal
 
  sca_tdf::sca_de::sca_out<sc_dt::sc_logic> eoc; //end of conversion pin  
  sca_tdf::sca_de::sca_out<sc_dt::sc_lv<8> > d_out; // digital output signal
 
  a2d_nbit(sc_core::sc_module_name nm, double Vmax_ = 5.0, double delay_ = 10.0e-3, int bit_rng = 8): 
    a_in("a_in"), start("start"),clk("clk"), eoc("eoc"), d_out("d_out"), Vmax(Vmax_), delay(delay_), bit_range(bit_rng){}
 
  void set_attributes() // note: the TDF callback must be named set_attributes()
  {
    set_timestep(1, sc_core::SC_MS);
    eoc.set_delay(0);
  }
 
  void initialize()
  {
    eoc.initialize(sc_dt::SC_LOGIC_0);
    start.initialize(sc_dt::SC_LOGIC_0);
  }
 
  void processing();
  
 private:
  double Vmax; // ADC maximum range
  double delay; // ADC conversion time 
  int bit_range; //vector length of d_temp and d_out
  
};
 
//A2D_top_level.h
 
SC_MODULE (A2D_top_level)
{
  a2d_nbit a2d; 
  vtg_src input_vtg;
  sc_core::sc_clock clk1;
  
  void start_logic(){
    while(true)
      {
          start.write(sc_dt::SC_LOGIC_0);
          wait(20, sc_core::SC_MS);
          start.write(sc_dt::SC_LOGIC_1);
          wait(20, sc_core::SC_MS);
          start.write(sc_dt::SC_LOGIC_0);
          sc_core::sc_stop();
      }
  }
  
  SC_CTOR(A2D_top_level)
    : in("in"), out("out"), a2d("a2d"), input_vtg("input_vtg"),clk1("clk1",10, 0.5, true), start("start"), eoc("eoc")
    {
      input_vtg.out(in);
      
      a2d.a_in(in); 
      a2d.start(start);
      a2d.clk(clk1.signal());
      a2d.eoc(eoc);
      a2d.d_out(out);
 
      SC_THREAD(start_logic);
    }
 
    public:
 
    sca_tdf::sca_signal <double> in;
    sc_core::sc_signal<sc_dt::sc_lv<8> > out;
    sc_core::sc_signal<sc_dt::sc_logic> start;
    sc_core::sc_signal<sc_dt::sc_logic> eoc;
 
};  
 
//A2D_test.cpp
 
int sc_main(int argc, char* argv[])
{
 
  A2D_top_level A2D_dut("A2D_dut");
 
  sca_util::sca_trace_file* atfs = sca_util::sca_create_tabular_trace_file("A2D.dat");
 
  sca_util::sca_trace(atfs, A2D_dut.a2d.clk, "CLK");
  sca_util::sca_trace(atfs, A2D_dut.start, "START");
  sca_util::sca_trace(atfs, A2D_dut.in, "INPUT");
  sca_util::sca_trace(atfs, A2D_dut.out, "OUTPUT");
  sca_util::sca_trace(atfs, A2D_dut.eoc, "EOC");
 
  sc_start(1.5, SC_SEC);
 
  sca_util::sca_close_tabular_trace_file(atfs);
 
  return 0;
}
 
 
Could you please help in solving this problem?
 
regards, 
Milind 

Hello Sir,

Please note that you have used 'set_timestep', but it is not clear which object it is referring to. It is suggested that you first try out (i.e., compile and run) the simple examples in the User's Guide and understand what is going on before you try your own model. If you jump right into your own model, I am afraid it will not be very productive.

 


Hello Milind,

 

The error message regarding the delay is caused by your calls to sca_tdf::sca_de::sca_out<T>::initialize() in the a2d_nbit::initialize() callback. Simply removing these two initialize() calls from that callback will make the error disappear.

The reason is the semantic difference between initialize() of a TDF (converter) port and that of a DE port. The initialize() of sc_core::sc_in<T>, sc_core::sc_out<T>, and sc_core::sc_inout<T> sets the initial value of the sc_core::sc_signal<T> bound to the port. The initialize() of sca_tdf::sca_in<T>, sca_tdf::sca_out<T>, sca_tdf::sca_de::sca_in<T>, and sca_tdf::sca_de::sca_out<T> initializes the delay samples of the port. By default, a TDF port has a delay of zero! Therefore, you are not allowed to call initialize() on these kinds of ports in the context of the TDF module's initialize() callback. Once you specify a sample delay of n samples on a TDF port in the module's set_attributes() callback, you can initialize the delay samples with ids 0 to n-1 using initialize(val, id).
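As an illustration (a sketch, not code from the thread, assuming you actually want a one-sample delay on eoc), the two callbacks would then look like this:

```cpp
// Sketch: a one-sample delay on eoc makes initializing its delay sample legal.
void set_attributes()
{
    set_timestep(1, sc_core::SC_MS);
    eoc.set_delay(1);  // n = 1 delay sample on the converter output port
}

void initialize()
{
    // Now allowed: initializes the delay sample with id 0 (ids 0 .. n-1).
    eoc.initialize(sc_dt::SC_LOGIC_0, 0);
}
```

Note that the extra delay sample also shifts the port's output by one timestep, so only add it if that behavior is acceptable for your design.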

 

For more information, have a look in the SystemC AMS User's Guide and the SystemC and SystemC AMS LRMs.

 

Regards, Torsten


Thanks Torsten for your detailed response.

As you correctly pointed out, after removing the two initialize() calls, the error disappeared.

Are there other ways to initialize the sca_tdf::sca_de::sca_in<sc_dt::sc_logic> port? In the simulation, "eoc" and "output" remain "X" (undefined).

 

Could you please also tell me if the clock connection is correct? I have declared the clock in A2D_top_level.h and connected it directly to the clk input of the a2d instance. In the constructor I have defined the clock period as 10; does it take 10 time units of the "set_timestep" attribute of a2d_nbit, which is 1 ms?

 

regards, 

Milind.


Dear Milind,

 

SystemC AMS 1.0 doesn't offer a way to set the initial value of a DE signal of type sc_core::sc_signal<T> through a TDF converter output port of type sca_tdf::sca_de::sca_out<T>. This missing functionality has been added in SystemC AMS 2.0 with the sca_tdf::sca_de::sca_out<T>::initialize_de_signal() member function. In SystemC AMS 1.0, you will have to initialize the sc_core::sc_signal<T> directly from your top cell using the channel's initialize() member function.
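A minimal sketch of that SystemC AMS 1.0 workaround, using the signal names from the code above (whether the channel offers an initialize() member depends on your SystemC implementation; writing the value before simulation start has the same effect):

```cpp
SC_CTOR(A2D_top_level)
{
    // Sketch (SystemC AMS 1.0): set the DE signal's initial value directly
    // on the channel in the top cell, not through the TDF converter port.
    eoc.initialize(sc_dt::SC_LOGIC_0);
    // ... port bindings as in the original constructor ...
}
```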

 

Regarding your sc_clock initialization problem: you are using a deprecated pre-SystemC 2.0 version of the sc_clock constructor, which still accepts an integer argument without an associated time unit and then interprets that integer value as a multiple of the time resolution set via sc_core::sc_set_time_resolution(). Don't do this! Instead, replace:

 

clk1("clk1",10, 0.5, true)

 

with 

 

clk1("clk1",10, sc_core::SC_MS, 0.5)

 

Please check clause 6.7 on sc_clock in IEEE Std 1666-2005 for more information on the sc_clock's constructors.

 

An sc_clock has no knowledge at all of the time step you assign to a TDF cluster. Indeed, SystemC has no particular knowledge of the semantics of SystemC AMS. SystemC AMS has simply been defined in a way that its enhancements can be implemented and executed within the constraints and semantics defined by the SystemC standard. SystemC AMS thus doesn't need to modify the SystemC simulation kernel.

 

That said, please be aware of the limitations of the DE<->TDF synchronization semantics, which are defined in the LRM and also discussed more clearly in the SystemC AMS User's Guide. A TDF cluster is always executed at delta cycle 0 of a given SystemC time and will sample the value of a DE signal which is valid at that time. Any value written to a DE signal via a TDF converter output port will become valid for the DE side at delta cycle 1.

 

I also have some concerns regarding your A2D_top_level::start_logic() thread: you probably don't want to call sc_stop() from within it, as you will force the end of the simulation after just 40 ms, whereas you initially started the simulation for 1.5 s. Instead, just wait indefinitely using wait() without any arguments.
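With that change applied, the thread could look like this (a sketch; the while loop is no longer needed once the thread simply suspends at the end):

```cpp
void start_logic()
{
    start.write(sc_dt::SC_LOGIC_0);
    wait(20, sc_core::SC_MS);
    start.write(sc_dt::SC_LOGIC_1);
    wait(20, sc_core::SC_MS);
    start.write(sc_dt::SC_LOGIC_0);
    wait();  // suspend forever; sc_start(1.5, SC_SEC) in sc_main ends the run
}
```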

 

Regards, Torsten
