
Machine Learning-based Timing Error Simulation of Microelectronic Circuits



As transistor sizes scale down into the deep-nanometer era, microelectronic circuits become increasingly susceptible to parameter variations. The most immediate manifestation of such variations is delay uncertainty, which can prevent circuits from meeting their timing specifications and result in timing errors. Without proper protection, such timing errors pose a serious threat to the reliability of digital systems. While gate-level simulation can accurately capture circuit timing errors, it is often prohibitively slow and may require expensive licensing fees for commercial simulation tools. In this talk, I will present our work on using machine learning to build predictive models that classify the timing errors of circuits under different operating conditions, clock speeds, and input workloads. Our model achieves an average classification accuracy above 95% and runs 50-100X faster than gate-level simulation. These promising results point to a novel path toward using machine learning to complement or replace simulation tools in electronic design automation.
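To make the approach concrete, below is a minimal sketch of how such a predictive model could be assembled with scikit-learn. The feature set (supply voltage, temperature, clock period, an input-toggle summary), the random-forest classifier, and the synthetic data are all illustrative assumptions rather than the authors' actual setup; in the real workflow, the labels would come from gate-level simulation and the features from the circuit and workload under study.

    # A minimal sketch, not the presented model: train a classifier to
    # predict per-cycle timing errors from circuit operating features.
    # Feature names and the random-forest choice are assumptions.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)

    # Hypothetical training data: each row describes one simulated cycle.
    # Columns: supply voltage (V), temperature (deg C), clock period (ns),
    # plus a toggle-count summary of the input workload.
    n_samples = 5000
    X = np.column_stack([
        rng.uniform(0.7, 1.1, n_samples),   # supply voltage
        rng.uniform(-25, 125, n_samples),   # temperature
        rng.uniform(0.5, 2.0, n_samples),   # clock period
        rng.integers(0, 64, n_samples),     # input toggle count
    ])
    # Labels would come from gate-level simulation: 1 = timing error seen.
    # Here a placeholder rule stands in for those simulation-derived labels.
    y = (X[:, 2] < 1.0).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)
    print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))

Once trained, such a model replaces a slow gate-level run with a single inference call per cycle, which is the source of the reported speedup over simulation.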

Document Details

Reference: SEM_230222_3735_p
Author: Jiao, X.
Language: English
Type: Presentation
Date: 23rd February 2022
Organisation: Villanova University
Region: Americas
