gakwaya Posted July 2, 2014

Hello SystemC community,

I want to model a network-on-chip for my research, and SystemC proved to be my choice as it meets most of my requirements. In my attempt to try it, I created two nodes that send simple data to each other using simple Generator and testNode modules, as shown in the code below. The generator generates numbers, but the code hangs on the statements that write to the fifos. Can anyone hint at what can be done to make this work? I am quite new to SystemC, so sorry if this is obvious.

Thanks.
Dan.

#include <systemc.h>
#include <iostream>
using namespace std;

SC_MODULE(Generator)
{
    sc_out<int> hanze;
    sc_event event1;
    int value;

    SC_HAS_PROCESS(Generator);
    Generator(sc_module_name name);
    void generateMethod();
    void eventTriggerThread();
};

Generator::Generator(sc_module_name name)
    : sc_module(name)
{
    value = 0;
    SC_THREAD(eventTriggerThread);
    SC_METHOD(generateMethod);
    dont_initialize();
    sensitive << event1;
}

void Generator::eventTriggerThread()
{
    for (;;)
    {
        wait(1, SC_NS);
        event1.notify();
    }
}

void Generator::generateMethod()
{
    value++;
    hanze.write(value);
    cout << name() << " @ " << sc_time_stamp()
         << " wrote " << value << " to the terminal" << endl;
}

SC_MODULE(testNode)
{
    // GENERATOR PORTS
    sc_in<int> dataToSend;
    sc_port< sc_fifo_out_if<int> > out_port;
    sc_port< sc_fifo_in_if<int> > in_port;

    SC_HAS_PROCESS(testNode);
    testNode(sc_module_name name);
    void method1();
    void method2();
};

testNode::testNode(sc_module_name name)
    : sc_module(name)
{
    SC_THREAD(method1);
    dont_initialize();
    sensitive << dataToSend;
    SC_THREAD(method2);
    dont_initialize();
    sensitive << dataToSend;
}

void testNode::method1()
{
    for (;;)
    {
        wait();
        int availableData = dataToSend.read();
        out_port->write(availableData);
        std::cout << "At time " << sc_time_stamp()
                  << " Method1 wrote a piece of data to the output port" << std::endl;
        wait(1, SC_NS);
    }
}

void testNode::method2()
{
    for (;;)
    {
        wait();
        int availableData = dataToSend.read();
        out_port->write(availableData);
        std::cout << "At time " << sc_time_stamp()
                  << " Method2 wrote a piece of data to the output port" << std::endl;
        wait(1, SC_NS);
    }
}

int sc_main(int argc, char *argv[])
{
    sc_time sim_time(10, SC_NS);

    // CREATE THE NODES
    testNode mNode1("node1");
    testNode mNode2("node2");

    // CREATE THE SIGNALS AND FIFOS
    sc_signal<int> dataToSend1;
    sc_signal<int> dataToSend2;
    sc_fifo<int> connect1To2(1);
    sc_fifo<int> connect2To1(1);

    // CREATE THE GENERATORS
    Generator gen1("gen1");
    Generator gen2("gen2");

    // CONNECT THE GENERATORS
    gen1.hanze(dataToSend1);
    gen2.hanze(dataToSend2);

    // CONNECT THE NODES
    mNode1.dataToSend(dataToSend1);
    mNode1.in_port(connect2To1);
    mNode1.out_port(connect1To2);
    mNode2.dataToSend(dataToSend2);
    mNode2.in_port(connect1To2);
    mNode2.out_port(connect2To1);

    // START THE SIMULATION
    sc_start(sim_time);
    return 0;
}
David Black Posted July 2, 2014

sc_fifo is of fixed size and defaults to 16. sc_fifo::read() and sc_fifo::write() are blocking. In other words, read() will wait if the fifo is empty, and write() will wait if the fifo is full. I don't see any consumers for your fifos. Add a process to read the fifo. Alternatively, you could use sc_fifo::nb_write().

Also, a suggestion (pedantic): change dataToSend.read() to dataToSend->read(). Get into the habit of using -> when accessing channel methods; it will save you grief in the long run. Fortunately, since you are using sc_in, there exists sc_in::read(), which calls port->read().
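A sketch of the two fixes described above, assuming the testNode class from the original post (method and variable names beyond those in the question are illustrative; compiling this requires the SystemC library):

    // (1) Non-blocking write in the producer: if the fifo is full,
    //     nb_write() returns false and the value is dropped instead of
    //     the process hanging inside a blocking write().
    void testNode::method1()
    {
        for (;;)
        {
            wait();
            int availableData = dataToSend->read();   // note ->, via the sc_in port
            if (!out_port->nb_write(availableData))
                std::cout << "At time " << sc_time_stamp()
                          << " fifo full, value dropped" << std::endl;
            wait(1, SC_NS);
        }
    }

    // (2) A consumer thread that drains in_port so the peer node's
    //     blocking out_port->write() can complete. Register it in the
    //     constructor with SC_THREAD(consumer);
    void testNode::consumer()
    {
        for (;;)
        {
            int received = in_port->read();           // blocks until data arrives
            std::cout << "At time " << sc_time_stamp()
                      << " " << name() << " consumed " << received << std::endl;
        }
    }

Either change alone unblocks the simulation; in the original code both nodes write into size-1 fifos that nothing ever reads, so the second write() can never complete.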
gakwaya Posted July 3, 2014 Author

Thank you for the answer. I set up consumer processes in my nodes and now it is working; I had completely missed that. A few more questions, though. You pointed out that I should use the pointer access operator (->) when reading/writing on ports; looking at my code, are there any other good practices I could adopt to write more idiomatic SystemC code?

Also, I am modeling a system-on-chip simulator and I still have trouble modeling the concepts of delay and throughput. I am thinking of hard-coding throughput values on all my interfaces (output/input ports) and using wait statements to model the channel delay. Is this a good way to do this? It would be good to hear expert advice on this.

Thank you again for your valuable time.
Dan.