The Future Of Semiconductors: Engineering In The Convergence Era


The semiconductor industry is entering a convergence era where silicon, software, physics, packaging, security, AI, and power constraints all intertwine. Device scaling still matters, but architecture, integration, verification, and automation will define the industry’s trajectory. Organizations that embrace this cross-domain, lifecycle-oriented mindset will lead the next decade. Moore’... » read more

Across The Vast Reaches Of The 3D Stack: Mastering ESD Verification In Advanced Semiconductor Design


Introduction: The epic challenge. In the vast reaches of the semiconductor cosmos, a silent menace lurks—one that can obliterate years of design work in a fraction of a nanosecond. Electrostatic discharge (ESD) verification stands as the guardian against this invisible threat, a critical discipline that separates triumphant chip designs from the smoldering wreckage of failed silicon dream... » read more

Limiting AI/ML Tools To Ensure Physical AI Safety, Security


Key Takeaways: AI-based tools can help monitor physical AI systems and LLMs, but human oversight is still needed to avoid false positives, bias, and other anomalies. For autonomous vehicles and robots, edge case scenarios and understanding human values are weak points, especially as moral and social values change over time. AI tools are growing and becoming increasingly helpful for c... » read more

System-level Reliability Verification for 2.5D/3D ICs Using Innovator3D IC and Calibre 3DPERC


The increasing demand for higher performance, lower power, and greater functionality in smaller packages has driven the rapid adoption of 2.5D and 3D Integrated Circuits (ICs). However, the inherent complexity of these multi-die architectures presents significant reliability verification challenges that traditional 2D flows cannot adequately address, particularly concerning electrostatic discha... » read more

New Challenges In Signoff


Multi-die assemblies coupled with leading-edge process nodes make signoff increasingly challenging. There are more corner cases and more data to consider, but no slack in the delivery schedule. Marc Heyberger, product engineer group director at Cadence Design Systems, talks about full-chip timing, flat versus hierarchical timing analysis, the ongoing development of full 3D-ICs, and wh... » read more

Using Data And AI More Effectively In EDA


Key Takeaways: The data being produced by EDA tools tends to be for human consumption and has weak semantics. Agents are attempting to create actionable information from unstructured data. The Model Context Protocol may provide AI with access to better data. Semiconductor design generates a lot of data, but how much of that is useful or currently being used by AI tools? And h... » read more

Formal Verification First: How AI Supports But Cannot Replace It


At a recent VLSI-D panel, industry leaders explored one of the most pressing topics in silicon design today: AI-powered EDA and its role in revolutionizing chip design. Ashish Darbari, CEO of Axiomise, questioned the panelists on the role of AI in chip design, optimizing PPA, and validation and verification. While the panel explored the role of AI in design implemen... » read more

Formal Verification Fundamentals Remain Non-Negotiable In The New Verification Revolution


The semiconductor industry stands at a critical juncture. First-time silicon success rates have reached all-time lows, while design complexity continues to grow exponentially. System-on-chip designs now integrate billions of transistors, multiple processor cores, complex memory hierarchies, and sophisticated interconnect fabrics. In this environment, the stakes for verification accuracy have ne... » read more

The Verification Conundrum


When constrained random test pattern generation became the de facto way to verify designs, reference models became necessary to check that a design was producing the correct output. These were often distributed across several models, such as checkers, scoreboards, and assertions. Another model that had to be created was the coverage model, which was required because you had to know whether a generate... » read more

Benchmark For AI-Aided Chip Design That Evaluates LLMs Across 3 Critical Tasks (UCSD, Columbia)


Researchers at UCSD and Columbia University published "ChipBench: A Next-Step Benchmark for Evaluating LLM Performance in AI-Aided Chip Design." Abstract "While Large Language Models (LLMs) show significant potential in hardware engineering, current benchmarks suffer from saturation and limited task diversity, failing to reflect LLMs' performance in real industrial workflows. To address t... » read more
