Showing results for tags 'VCS performance'.
Found 1 result

  1. What is the best way to measure simulation time using VCS? I would like to do some same-tool benchmarks to measure simulation performance improvements from different coding styles or tricks on the same simulator.

     I do not intend to do cross-tool benchmarking; only same-tool benchmarking. I will check whether techniques used with one tool also produce similar performance improvements with other tools, but I will not report actual speed differences between the tools (in accordance with my tool-usage agreements with multiple vendors). I am trying to identify best-performance coding practices.

     The best technique I currently have is to run "time simv", which reports real, user, and sys times.

     Regards - Cliff Cummings
     Verilog & SystemVerilog Guru
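     For the same-tool benchmarking flow described above, a small wrapper can report the same real/user/sys breakdown that "time simv" gives, and average it over several runs to smooth out machine noise. The sketch below is only an illustration, assuming a compiled simulation executable named ./simv in the current directory and a run count of 3; both are placeholders, not part of the original post.

```python
import resource
import subprocess
import time

# Sketch: time a compiled VCS simulation binary the way "time simv" does,
# reporting real (wall-clock), user, and sys seconds, averaged over RUNS runs.
# SIMV and RUNS are illustrative assumptions.
SIMV = "./simv"
RUNS = 3

def time_one_run(cmd):
    """Run cmd once; return (real, user, sys) seconds consumed by the child."""
    before = resource.getrusage(resource.RUSAGE_CHILDREN)
    start = time.perf_counter()
    subprocess.run(cmd, check=True, stdout=subprocess.DEVNULL)
    real = time.perf_counter() - start
    after = resource.getrusage(resource.RUSAGE_CHILDREN)
    return real, after.ru_utime - before.ru_utime, after.ru_stime - before.ru_stime

if __name__ == "__main__":
    results = [time_one_run([SIMV]) for _ in range(RUNS)]
    for i, (real, user, sys_) in enumerate(results, 1):
        print(f"run {i}: real {real:.2f}s  user {user:.2f}s  sys {sys_:.2f}s")
    avg_real, avg_user, avg_sys = (sum(col) / RUNS for col in zip(*results))
    print(f"average: real {avg_real:.2f}s  user {avg_user:.2f}s  sys {avg_sys:.2f}s")
```

     The user time is usually the most stable figure for comparing coding styles on the same tool and machine, since real (wall-clock) time also picks up system load unrelated to the simulation.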