Integrating Digital Twins In Semiconductor Operations

22 February 2024, 09:01

By Mark da Silva, Nishita Rao and Karim Somani

Chipmakers must adopt transformative technologies including Digital Twins (DT) to keep pace with unprecedented global semiconductor industry growth that is expected to drive its total market value to $1 trillion[1] as soon as 2030. Leveraging predictive modeling and other efficiency-enhancing innovations, DTs promise to optimize semiconductor design, manufacturing processes and equipment maintenance while improving overall operational efficiency.

With DTs rising in prominence as a critical enabler of industry growth, key players from across the semiconductor ecosystem – including OEMs, platforms and end users – gathered at the Semiconductor Digital Twin Workshop last December at SEMI headquarters in Milpitas, Calif. to discuss the latest DT developments and explore the path to advancing the technology.

Following are highlights from the sold-out event hosted by the SEMI Smart Manufacturing Initiative.

Key takeaways

  • Industry Alignment on DT Definition and Taxonomy
    • The semiconductor industry needs to align on the definition and taxonomy of DTs in semiconductor operations.
    • With collaboration crucial to advances in DTs, the industry must come together to develop a common understanding of the technology.
  • Data Sharing for Sustainability Improvements
    • Sharing data among various chip ecosystem players will be vital to driving sustainability improvements.
    • Focusing on equipment and operational DTs with sustainability in mind will help foster collaboration among industry stakeholders.
  • Advocacy for Standardized DT Architecture and Framework
    • A standardized DT framework architecture must be established to enhance interoperability, reliability, synchronization, and security.
    • The adoption of digital twin technical standards is in its early stages but increasing in importance as DT technology evolves.
    • Collaboration will be essential to accelerate the availability and adoption of several digital twin technical standards under development by SEMI and other Standards Development Organizations (SDOs).

Key challenges

  • Robust DT Framework and Overcoming Development Silos
    • Establishing a robust DT framework and overcoming isolated development silos in microelectronics are challenges the industry must overcome.
  • Managing Unclean Factory Data
    • Challenges include managing unclean factory data, varying data granularity, and addressing the lifecycle of data models.
  • Sharing Data Between Tools and Process Steps
    • Data sharing between various semiconductor tools and process steps must be seamless. Data provenance is critical for DT accuracy and validation.
  • Legacy Factories & Small/Medium Firms
    • Factories with older generation tools and processes have a unique challenge in developing process level DTs for existing products.

Workshop sessions

The workshop consisted of four sessions focused on DT efforts by equipment makers, solution providers, device makers, and factory integration providers.

Equipment-level digital twins session

The session focused on OEM efforts to develop tool-level DTs and highlighted the potential to improve efficiency, performance, and sustainability. The session also featured discussions on equipment-level data sharing, standards, and interoperability challenges that need to be addressed. Speakers included IRDS Co-Chair Supika Mashiro of TEL, Ala Moradian of Applied Materials, Joseph Ervin of Lam Research, Sean Glazier of Onto Innovation, Basil Milton and Chan-Pin Chong of Kulicke & Soffa, and Mark Huntington of McKinsey & Company.

Session speakers: (L) Supika Mashiro, TEL, and (R) Ala Moradian, AMAT.

Speakers discussed DTs already deployed in manufacturing, such as Run-to-Run (R2R) control, virtual metrology, and predictive maintenance (PdM), and the need for standardized DTs that can communicate with each other. Tool-level DT solutions such as Applied Materials' EcoTwin, part of the AppliedTwin platform, provide a virtualized replica of chipmaking equipment for developing and improving chip-level processes. The platform has also demonstrated extensibility to sustainability analysis, a significant development.
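
To make the R2R idea concrete, here is a minimal, purely illustrative sketch of an EWMA-based run-to-run controller; the linear process model, drift rate, and gain are assumptions chosen for the example, not figures presented at the workshop.

```python
# Minimal sketch of an EWMA run-to-run (R2R) controller, one of the simplest
# digital twins already deployed in fabs. All model parameters and numbers
# here are illustrative assumptions.

def simulate_r2r(n_runs=10, target=50.0, slope=10.0, lam=0.3):
    """EWMA R2R control of a drifting linear process y = slope * u + drift."""
    u = target / slope      # initial recipe setpoint
    b_hat = 0.0             # EWMA estimate of the drifting disturbance
    drift = 0.0             # true (unknown to the controller) tool drift
    history = []
    for run in range(n_runs):
        drift += 0.4                                        # tool drifts each run
        y = slope * u + drift                               # measured output
        b_hat = lam * (y - slope * u) + (1 - lam) * b_hat   # update disturbance estimate
        u = (target - b_hat) / slope                        # setpoint for the next run
        history.append((run, round(y, 2), round(u, 3)))
    return history

for run, y, u in simulate_r2r():
    print(f"run {run}: output = {y}, next setpoint = {u}")
```

The same pattern of pairing a lightweight model with streaming tool data also underlies virtual metrology and PdM twins.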

Other focus areas were the connectivity of DTs across different levels (tools to factories) and the use of AI to make manufacturing processes self-adjusting. Speakers stressed the importance of DT infrastructure, with ensuring clean and accessible data, data flow, and the communication needed to keep DTs synchronized raised as significant challenges. In the back end, OEMs are making steady progress on virtualizing tools such as wire bonders. The session also highlighted DTs as a major investment across industries, with huge potential in chipmaking, and a strong data-sharing foundation as key to success.

Chamber process, operations and planning level digital twin session

The session was led by solution providers from across the semiconductor ecosystem that develop tools to facilitate DTs at various hierarchical levels. The providers offer a variety of products and services spanning physics-based process models, chamber processes, operations, and planning modeling approaches to help companies implement and manage DTs. The session covered technical details of DT models and their potential impact on the entire manufacturing process.

While the session made clear that DTs promise to revolutionize the semiconductor industry, the industry must still overcome significant technical challenges in integrating DTs into day-to-day operations. Speakers included Sarbajit Ghosal of SC Solutions, Norman Chang of Ansys, Holland Smith of INFICON, Chandra Reddy of IBM Research, Jon Herlocker of TIGNIS, Ken Smerz of ZELUS, and John Behnke of INFICON.

Speakers emphasized the need for fast, accurate, multi-physics-based (and data-assisted) DTs for real-time control and monitoring that react instantly to changes, just like the physical equipment. Think of it as a virtual process line that can predict how different processes will interact. Sitting on top of the DTs are AI-powered (physics- and/or data-driven) models that can then be harnessed to optimize manufacturing processes and predict yield.
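
As a hedged illustration of how an AI model might sit on top of DT outputs to predict yield, the sketch below fits a data-driven model to synthetic process data; the parameter names, data, and use of scikit-learn are all assumptions made for the example, not a method shown at the workshop.

```python
# Hypothetical sketch: a data-driven yield model layered on top of DT outputs.
# Feature names and data are invented for illustration only.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Pretend these columns come from process-level digital twins
# (temperature, pressure, RF power); yield is a noisy nonlinear function of them.
X = rng.uniform([350, 1.0, 200], [450, 3.0, 400], size=(n, 3))
y = (0.95
     - 0.002 * np.abs(X[:, 0] - 400)      # best yield near 400
     - 0.01 * (X[:, 1] - 2.0) ** 2        # sweet spot at 2.0
     + 0.0001 * X[:, 2]                   # mild dependence on RF power
     + rng.normal(0, 0.005, n))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor().fit(X_train, y_train)
print("held-out R^2:", round(model.score(X_test, y_test), 3))

# The fitted model can then be queried to rank candidate recipes by predicted yield.
candidates = rng.uniform([350, 1.0, 200], [450, 3.0, 400], size=(5, 3))
print("predicted yields:", model.predict(candidates).round(3))
```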

Speakers also discussed operational-level DTs and the need for a central hub for all factory operational data to boost efficiency, maximize productivity, and reduce waste – all critical as the number of fabs grows in the years ahead. Construction DTs for new chip fabs or brownfield expansions provide a pre-construction virtual blueprint that can help identify potential problems early on and minimize time to wafer starts. Lastly, how these various levels of DTs are integrated vertically within a factory plays a key role in decisions about autonomous fabs.

Digital twin adoption and implementation session

The session was led by device makers and owners of fabs, where DTs are critical for improving productivity by predicting yield, quality, and efficiency. A process-level DT enables a virtual representation of a product’s process flow in the fab, and it can be used to speed integration efforts (MRL 5-7), simulate specific outcomes, and optimize operations. Imagine a future where chip fabs are run by AI agents, with virtual models predicting problems before they happen and optimizing processes on the fly. That’s the vision shared by the session’s expert speakers. Their insights painted a fascinating picture of what’s next for the semiconductor industry. Speakers included Professor H.-S. Philip Wong of Stanford University, Steven J Meyer of Intel, Jae Yong Park of Samsung, Rosa Javadi of JABIL, Professor Amit Lal and Peter Doerschuk of Cornell University, Ben Davaji of Northeastern University, Pushkar Apte of SEMI, and Bobby Mitra of Deloitte.

Session speakers: (L) Steven J Meyer, Intel, and (R) Jae Yong Park, Samsung.

The key development target is advanced AI-assisted manufacturing with three layers of virtual models – processes, tools, and the entire fab itself – all working together seamlessly. This ambitious vision aligns with the National Semiconductor Technology Center (NSTC) DT Grand Challenge, which focuses on generating, sharing, and using data effectively. Intel’s AFS Software Suite, which includes high-speed simulators and graphical models to enable better planning and decision-making across multiple sites, is a real-world example of DTs used in today’s fabs.

Use cases of deploying AI to improve Automated Material Handling Systems (AMHS) asset utilization by 30% have also been demonstrated in real-world fab environments. The session highlighted the importance of scheduling with AI-powered DTs and standardizing data availability across the industry. Other impressive product development use case studies shared included a rapid COVID-19 tester system development and a global supply chain DT.

Speakers described how challenges such as infrastructure readiness, talent gaps, and data privacy concerns are slowing industrywide adoption. They also discussed efforts to develop an open-access academic cleanroom dedicated to developing and testing DT models for lithography and etching processes, with federated learning under investigation to address data privacy and sharing concerns. The experts characterized the hierarchy of DT types as a framework based on the ISA-95 standard to ensure seamless communication and collaboration between DTs across various levels, from process development to production. This interconnected approach, illustrated by an example of DTs spanning the enterprise, could revolutionize chipmaking.
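
For readers unfamiliar with ISA-95, the sketch below shows one way a DT hierarchy could be encoded against the standard's level structure; the mapping of specific twins to levels is an illustrative assumption, not the framework the speakers presented.

```python
# Illustrative only: one way to encode an ISA-95-style hierarchy for digital twins.
# The assignment of example twins to levels is an assumption for illustration.
from dataclasses import dataclass, field
from enum import IntEnum

class Isa95Level(IntEnum):
    PROCESS = 0          # the physical process itself
    CONTROL = 1          # sensing and manipulating the process
    SUPERVISION = 2      # monitoring and supervisory control
    OPERATIONS = 3       # manufacturing operations management (MES)
    ENTERPRISE = 4       # business planning and logistics (ERP)

@dataclass
class DigitalTwin:
    name: str
    level: Isa95Level
    feeds: list["DigitalTwin"] = field(default_factory=list)  # twins it reports up to

# A chamber-process twin feeds a tool-level twin, which feeds fab- and enterprise-level twins.
etch_chamber = DigitalTwin("etch chamber process", Isa95Level.PROCESS)
etch_tool = DigitalTwin("etch tool (R2R, PdM)", Isa95Level.CONTROL)
fab_ops = DigitalTwin("fab scheduling / AMHS", Isa95Level.OPERATIONS)
enterprise = DigitalTwin("supply-chain twin", Isa95Level.ENTERPRISE)

etch_chamber.feeds.append(etch_tool)
etch_tool.feeds.append(fab_ops)
fab_ops.feeds.append(enterprise)

for dt in (etch_chamber, etch_tool, fab_ops, enterprise):
    print(f"L{int(dt.level)} {dt.name} -> {[t.name for t in dt.feeds]}")
```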

Digital twin connectivity and platform integration session

The session focused on a variety of product and service offerings by cloud, facilities, and supply chain solution providers that help companies implement and manage DTs of various levels. These solutions include integration, connectivity, security and horizontal integration across the supply chain. Almost all speakers pointed to the importance of standardization efforts as crucial for future development. Speakers included Rad Desiraju of Microsoft, Gautham Unni of AWS, David Gross and Srividya Jayaram of Siemens, Slava Libman of FTD Solutions, Becky Kelderman of Rockwell Automation, Ram Walvekar of HCL Technologies, and Paul Trio of SEMI International Standards.

Session speakers touched on definitions and categorization of DTs, including types and uses, as well as building dedicated infrastructure to support their development. The experts highlighted DT development challenges in areas such as data sources, provenance, and visualization, and shared their solution offerings for creating, connecting, and maintaining these digital twins both vertically and horizontally within an enterprise.

The presenters also shared use cases on how DTs bridge design and manufacturing, enabling simulations and faster production, and how connecting DTs for various assets, processes, and products creates a holistic view. Session speakers also discussed a DT maturity scorecard that enables players from across the supply chain to track their progress and identify areas for improvement. Facility-level DTs for water management in fabs to promote sustainability were also discussed.

The semiconductor industry’s commitment to digital twins

The Semiconductor Digital Twin Workshop showcased the industry’s commitment to adopting and advancing the technology. Continued collaboration and adherence to standards and sustainable practices will play a crucial role in unlocking the full potential of DT technology in semiconductor manufacturing.

SEMI thanks the speakers who provided access to their material presented at the workshop. Visit Semiconductor Digital Twin Workshop OnDemand | SEMI for the workshop materials.

Reference

  1. Ondrej Burkacky, Julia Dragon, and Nikolaus Lehmann, The semiconductor decade: A trillion-dollar industry, McKinsey & Company (blog), April 1, 2021

Nishita Rao is senior product marketing manager at SEMI.

Karim Somani is program manager at SEMI.  

The post Integrating Digital Twins In Semiconductor Operations appeared first on Semiconductor Engineering.

How AI Will Change Chip Design

8 February 2022, 15:00

By Rina Diane Caballar, IEEE Spectrum

The end of Moore’s Law is looming. Engineers and designers can do only so much to miniaturize transistors and pack as many of them as possible into chips. So they’re turning to other approaches to chip design, incorporating technologies like AI into the process.

Samsung, for instance, is adding AI to its memory chips to enable processing in memory, thereby saving energy and speeding up machine learning. Speaking of speed, Google’s TPU V4 AI chip has doubled its processing power compared with that of its previous version.

But AI holds still more promise and potential for the semiconductor industry. To better understand how AI is set to revolutionize chip design, we spoke with Heather Gorr, senior product manager for MathWorks’ MATLAB platform.

How is AI currently being used to design the next generation of chips?

Heather Gorr: AI is such an important technology because it’s involved in most parts of the cycle, including the design and manufacturing process. There’s a lot of important applications here, even in the general process engineering where we want to optimize things. I think defect detection is a big one at all phases of the process, especially in manufacturing. But even thinking ahead in the design process, [AI now plays a significant role] when you’re designing the light and the sensors and all the different components. There’s a lot of anomaly detection and fault mitigation that you really want to consider.

Heather Gorr, MathWorks

Then, thinking about the logistical modeling that you see in any industry, there is always planned downtime that you want to mitigate; but you also end up having unplanned downtime. So, looking back at that historical data of when you’ve had those moments where maybe it took a bit longer than expected to manufacture something, you can take a look at all of that data and use AI to try to identify the proximate cause or to see something that might jump out even in the processing and design phases. We think of AI oftentimes as a predictive tool, or as a robot doing something, but a lot of times you get a lot of insight from the data through AI.
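
As a hypothetical illustration of mining that historical data, the sketch below flags unusually slow runs with an isolation forest so an engineer can then dig into the proximate cause; the features, numbers, and choice of scikit-learn are assumptions for the example only.

```python
# Hypothetical sketch: flag anomalous (unusually slow) manufacturing runs from
# historical cycle-time data so engineers can investigate the proximate cause.
# The data are synthetic; scikit-learn stands in for whatever stack a team uses.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
# Features per run: cycle time (min), queue time (min), number of rework loops.
normal_runs = np.column_stack([
    rng.normal(120, 5, 200),   # typical cycle times
    rng.normal(30, 8, 200),    # typical queue times
    rng.poisson(0.2, 200),     # occasional rework
])
slow_runs = np.array([[165, 95, 3], [150, 80, 2]])  # the "took longer" cases
X = np.vstack([normal_runs, slow_runs])

detector = IsolationForest(contamination=0.02, random_state=0).fit(X)
labels = detector.predict(X)            # -1 = anomaly, 1 = normal
print("flagged run indices:", np.where(labels == -1)[0])
```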

What are the benefits of using AI for chip design?

Gorr: Historically, we’ve seen a lot of physics-based modeling, which is a very intensive process. We want to do a reduced order model, where instead of solving such a computationally expensive and extensive model, we can do something a little cheaper. You could create a surrogate model, so to speak, of that physics-based model, use the data, and then do your parameter sweeps, your optimizations, your Monte Carlo simulations using the surrogate model. That takes a lot less time computationally than solving the physics-based equations directly. So, we’re seeing that benefit in many ways, including the efficiency and economy that are the results of iterating quickly on the experiments and the simulations that will really help in the design.
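
Below is a minimal sketch of the surrogate-model workflow Gorr describes, with a toy function standing in for the expensive physics solver and a Gaussian-process regressor as the cheap surrogate; the specific models and numbers are illustrative assumptions, not anything from MathWorks.

```python
# Minimal sketch of a surrogate (reduced-order) model workflow: fit a cheap model
# to a few runs of an "expensive" physics simulation, then run the Monte Carlo
# parameter sweep on the surrogate. The physics function here is a toy stand-in.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_physics_model(x):
    """Toy stand-in for a slow solver: response vs. a normalized process knob."""
    return np.sin(3 * x) * np.exp(-x) + 0.05 * x

# 1) Run the expensive model at a handful of design points.
X_train = np.linspace(0, 2, 12).reshape(-1, 1)
y_train = expensive_physics_model(X_train).ravel()

# 2) Fit the surrogate model to those runs.
surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                                     normalize_y=True).fit(X_train, y_train)

# 3) Do the Monte Carlo sweep on the cheap surrogate instead of the solver.
rng = np.random.default_rng(0)
samples = rng.uniform(0, 2, size=(100_000, 1))
predictions = surrogate.predict(samples)
print("mean response:", predictions.mean().round(4))
print("95th percentile:", np.percentile(predictions, 95).round(4))
```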

So it’s like having a digital twin in a sense?

Gorr: Exactly. That’s pretty much what people are doing, where you have the physical system model and the experimental data. Then, in conjunction, you have this other model that you could tweak and tune and try different parameters and experiments that let you sweep through all of those different situations and come up with a better design in the end.

So, it’s going to be more efficient and, as you said, cheaper?

Gorr: Yeah, definitely. Especially in the experimentation and design phases, where you’re trying different things. That’s obviously going to yield dramatic cost savings if you’re actually manufacturing and producing [the chips]. You want to simulate, test, experiment as much as possible without making something using the actual process engineering.

We’ve talked about the benefits. How about the drawbacks?

Gorr: The [AI-based experimental models] tend to not be as accurate as physics-based models. Of course, that’s why you do many simulations and parameter sweeps. But that’s also the benefit of having that digital twin, where you can keep that in mind—it’s not going to be as accurate as that precise model that we’ve developed over the years.

Both chip design and manufacturing are system intensive; you have to consider every little part. And that can be really challenging. It’s a case where you might have models to predict something and different parts of it, but you still need to bring it all together.

One of the other things to think about too is that you need the data to build the models. You have to incorporate data from all sorts of different sensors and different sorts of teams, and so that heightens the challenge.

How can engineers use AI to better prepare and extract insights from hardware or sensor data?

Gorr: We always think about using AI to predict something or do some robot task, but you can use AI to come up with patterns and pick out things you might not have noticed before on your own. People will use AI when they have high-frequency data coming from many different sensors, and a lot of times it’s useful to explore the frequency domain and things like data synchronization or resampling. Those can be really challenging if you’re not sure where to start.
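
The sketch below, using invented signals, shows the kind of resampling, synchronization, and frequency-domain exploration Gorr mentions; pandas and NumPy are used simply as common tools, not tools recommended in the interview.

```python
# Illustrative sketch: align two sensors sampled at different rates onto a common
# timebase, then look at one of them in the frequency domain. Signals are synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Two sensors with different sample rates.
t_fast = pd.date_range("2024-01-01", periods=2000, freq="1ms")    # 1 kHz
t_slow = pd.date_range("2024-01-01", periods=200, freq="10ms")    # 100 Hz
fast = pd.Series(np.sin(2 * np.pi * 50 * np.arange(2000) / 1000.0)
                 + 0.1 * rng.standard_normal(2000), index=t_fast, name="vibration")
slow = pd.Series(rng.standard_normal(200).cumsum(), index=t_slow, name="temperature")

# Synchronize: resample both onto a common 10 ms grid and join.
combined = pd.concat([fast.resample("10ms").mean(),
                      slow.resample("10ms").mean()], axis=1)
print(combined.head())

# Frequency domain: the 50 Hz component of the vibration signal stands out.
spectrum = np.abs(np.fft.rfft(fast.values))
freqs = np.fft.rfftfreq(len(fast), d=1e-3)       # 1 kHz sampling
print("dominant frequency (Hz):", freqs[spectrum.argmax()])
```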

One of the things I would say is, use the tools that are available. There’s a vast community of people working on these things, and you can find lots of examples [of applications and techniques] on GitHub or MATLAB Central, where people have shared nice examples, even little apps they’ve created. I think many of us are buried in data and just not sure what to do with it, so definitely take advantage of what’s already out there in the community. You can explore and see what makes sense to you, and bring in that balance of domain knowledge and the insight you get from the tools and AI.

What should engineers and designers consider when using AI for chip design?

Gorr: Think through what problems you’re trying to solve or what insights you might hope to find, and try to be clear about that. Consider all of the different components, and document and test each of those different parts. Consider all of the people involved, and explain and hand off in a way that is sensible for the whole team.

How do you think AI will affect chip designers’ jobs?

Gorr: It’s going to free up a lot of human capital for more advanced tasks. We can use AI to reduce waste, to optimize the materials, to optimize the design, but then you still have that human involved whenever it comes to decision-making. I think it’s a great example of people and technology working hand in hand. It’s also an industry where all people involved—even on the manufacturing floor—need to have some level of understanding of what’s happening, so this is a great industry for advancing AI because of how we test things and how we think about them before we put them on the chip.

How do you envision the future of AI and chip design?

Gorr: It’s very much dependent on that human element—involving people in the process and having that interpretable model. We can do many things with the mathematical minutiae of modeling, but it comes down to how people are using it, how everybody in the process is understanding and applying it. Communication and involvement of people of all skill levels in the process are going to be really important. We’re going to see less of those superprecise predictions and more transparency of information, sharing, and that digital twin—not only using AI but also using our human knowledge and all of the work that many people have done over the years.
