Blog Review: Aug. 21

Cadence’s Reela Samuel explores the critical role of PCIe 6.0 equalization in maintaining signal integrity and solutions to mitigate verification challenges, such as creating checkers to verify all symbols of TS0, ensuring the correct functioning of scrambling, and monitoring phase and LTSSM state transitions.

Siemens’ John McMillan introduces an advanced packaging flow for Intel’s Embedded Multi-die Interconnect Bridge (EMIB) technology, including technical challenges, design methodologies, and the integration of EMIBs in system-level package designs.

Synopsys’ Dustin Todd checks out what’s next for the U.S. CHIPS and Science Act, including the establishment of the National Semiconductor Technology Center and the allocation of $13 billion for research and development efforts.

Keysight’s Roberto Piacentini Filho explores the challenges of managing the large design files and massive volumes of data involved in a modern chip design project, which can take up as much as a terabyte of disk space and involve hundreds of thousands of files.

Arm’s Sandeep Mistry shows how ML models developed for mobile computer vision applications and requiring tens to hundreds of millions of multiply-accumulate (MAC) operations per inference can be deployed to a modern microcontroller.

Ansys’ Aliyah Mallak explores an effort to manufacture biotech products in microgravity and how simulation helps ensure payloads containing delicate, temperature-sensitive spore samples and bioreactors make it safely to the International Space Station or low Earth orbit.

Micron Technology’s Amit Srivastava, ULVAC’s Brian Coppa, and SEMI’s Mark da Silva suggest tackling corporate sustainability goals with a bottom-up approach that leverages various sensing technologies, at the cleanroom, sub-fab, and facilities levels for both greenfield and brownfield device-making facilities, to enable predictive analytics.

And don’t miss the blogs featured in the latest Manufacturing, Packaging & Materials newsletter:

Amkor’s JeongMin Ju shows how to prevent critical failures in copper RDLs caused by overcurrent-induced fusing.

Synopsys’ Al Blais discusses curvilinear checking and fracture requirements for the MULTIGON era.

Lam Research’s Dempsey Deng compares the parasitic capacitance of a 6F2 honeycomb DRAM device to a 4F2 VCAT DRAM structure.

Brewer Science’s Jessica Albright covers debonding methods, thermal, topography, adhesion, and thickness variation.

SEMI’s John Cooney reviews a fireside chat between the President of SEMI Americas and the U.S. Under Secretary of State for Economic Growth, Energy, and the Environment on securing supply chains.

The post Blog Review: Aug. 21 appeared first on Semiconductor Engineering.

Chip Industry Week in Review

Okinawa Institute of Science and Technology proposed a new EUV litho technology using only four reflective mirrors and a new method of illumination optics that it claims will use 1/10 the power and cost half as much as existing EUV technology from ASML.

Applied Materials may not receive expected U.S. funding to build a $4 billion research facility in Sunnyvale, CA, due to internal government disagreements over how to fund chip R&D, according to Bloomberg.

SEMI published a position paper this week cautioning the European Union against imposing additional export controls, encouraging it to allow companies to be “as free as possible in their investment decisions to avoid losing their agility and relevance across global markets.” SEMI’s recommendations on outbound investments are in response to the European Economic Security Strategy and emphasize the need for a transparent and predictable regulatory framework.

The U.S. may restrict China’s access to HBM chips and the equipment needed to make them, reports Bloomberg. Today those chips are manufactured by two Korean-based companies, Samsung and SK hynix, but U.S.-based Micron expects to begin shipping 12-high stacks of HBM3E in 2025, and is currently working on HBM4.

Synopsys executive chair and founder Dr. Aart de Geus was named the winner of the Semiconductor Industry Association’s Robert N. Noyce Award. De Geus was selected due to his contributions to EDA technology over a career spanning more than four decades.

The top three foundries plan to implement high-NA EUV lithography as early as 2025 for the 18 angstrom generation, but whether single-exposure high-NA EUV (0.55 NA) replaces double patterning with standard EUV (0.33 NA) depends on whether it provides better results at a reasonable cost per wafer.

Quick links to more news:

Global
In-Depth
Market Reports and Earnings
Education and Training
Security
Product News
Research
Events and Further Reading


Global

Belgium-based Imec released part 2 of its chiplets series, addressing testing strategies and standardization efforts, as well as guidelines and research “towards efficient ESD protection strategies for advanced 3D systems-on-chip.”

Also in Belgium, BelGan, maker of GaN chips, filed for bankruptcy according to the Brussels Times.

TSMC’s Dresden, Germany, plant will break ground this month.

The UK will dole out more than £100 million (~US $128 million) in funding to develop five new quantum research hubs in Glasgow, Edinburgh, Birmingham, Oxford, and London.

MassPhoton is opening Hong Kong’s first ultra-high vacuum GaN epitaxial wafer pilot line and will establish a GaN research center.

Infineon completed the sale of its manufacturing sites in the Philippines and South Korea to ASE.

Israel-based RAAAM Memory Technologies received a €5.25 million grant from the European Innovation Council (EIC) to support the development and commercialization of its innovative memory solutions. This funding will enable RAAAM to advance its research in high-performance and energy-efficient memory technologies, accelerating their integration into various applications and markets.


In-Depth

Semiconductor Engineering published its Automotive, Security and Pervasive Computing newsletter this week, featuring these top stories and video:

And:


Market Reports and Earnings

The semiconductor equipment industry is on a positive trajectory in 2024, with moderate revenue growth observed in Q2 after a subdued Q1, according to a new report from Yole Group. Wafer Fab Equipment revenue is projected to grow by 1.3% year-on-year, despite a 12% drop in Q1. Test equipment lead times are normalizing, improving order conditions. Key areas driving growth include memory and logic capital expenditures and high-bandwidth memory demand.

Worldwide silicon wafer shipments increased by 7% in Q2 2024, according to SEMI’s latest report. This growth is attributed to robust demand from multiple semiconductor sectors, driven by advancements in AI, 5G, and automotive technologies.

The RF GaN market is projected to grow to US $2 billion by 2029, a 10% CAGR, according to Yole Group.

Counterpoint released its Q2 smartphone top 10 report.

Renesas completed its acquisition of EDA firm Altium, best known for its EDA platform and freeware CircuitMaker package.

It’s earnings season and here are recently released financials in the chip industry:

AMD, Advantest, Amkor, Ansys, Arteris, Arm, ASE, ASM, ASML, Cadence, IBM, Intel, Lam Research, Lattice, Nordson, NXP, Onsemi, Qualcomm, Rambus, Samsung, SK Hynix, STMicro, Teradyne, TI, Tower, TSMC, UMC, Western Digital

Industry stock price impacts are here.


Education and Training

Rochester Institute of Technology is leading a new pilot program to prepare community college students in areas such as cleanroom operations, new materials, simulation, and testing processes, with the intent of eventual transfer into RIT’s microelectronic engineering program.

Purdue University inked a deal with three research institutions — University of Piraeus, Technical University of Crete, and King’s College London — to develop joint research programs for semiconductors, AI and other critical technology fields.

The European Chips Skills Academy formed the Educational Leaders Board to help bridge the talent gap in Europe’s microelectronics sector.  The Board includes representatives from universities, vocational training providers, educators and research institutions who collaborate on strategic initiatives to strengthen university networks and build academic expertise through ECSA training programs.


Security

The Cybersecurity and Infrastructure Security Agency (CISA) is encouraging Apple users to review and apply this week’s recent security updates.

Microsoft Azure experienced a nearly 10-hour DDoS attack this week, leading to global service disruption for many customers. “While the initial trigger event was a Distributed Denial-of-Service (DDoS) attack, which activated our DDoS protection mechanisms, initial investigations suggest that an error in the implementation of our defenses amplified the impact of the attack rather than mitigating it,” stated Microsoft in a release.

NIST published:

  • “Recommendations For Increasing U.S. Participation and Leadership in Standards Development,” a report outlining cybersecurity recommendations and mitigation strategies.
  • Final guidance documents and software to help improve the “safety, security and trustworthiness of AI systems.”
  • Cloud Computing Forensic Reference Architecture guide.

Delta Air Lines plans to seek damages after losing $500 million in revenue due to security company CrowdStrike’s software update debacle. Shareholders are also angry.

Recent security research:

  • Physically Secure Logic Locking With Nanomagnet Logic (UT Dallas)
  • WBP: Training-time Backdoor Attacks through HW-based Weight Bit Poisoning (UCF)
  • S-Tune: SOT-MTJ Manufacturing Parameters Tuning for Secure Next Generation of Computing (U. of Arizona, UCF)
  • Diffie Hellman Picture Show: Key Exchange Stories from Commercial VoWiFi Deployments (CISPA, SBA Research, U. of Vienna)

Product News

Lam Research introduced a new version of its cryogenic etch technology designed to enhance the manufacturing of 3D NAND for AI applications. This technology allows for the precise etching of high aspect ratio features, crucial for creating 1,000-layer 3D NAND.


Fig. 1: 3D NAND etch. Source: Lam Research

Alphawave Semi launched its Universal Chiplet Interconnect Express Die-to-Die IP. The subsystem offers 8 Tbps/mm bandwidth density and supports operation at 24 Gbps for D2D connectivity.

Infineon introduced a new MCU series for industrial and consumer motor controls, as well as power conversion system applications. The company also unveiled its new CoolGaN Drive product family of integrated single switches and half-bridges with integrated drivers.

Rambus released its DDR5 Client Clock Driver for next-gen, high-performance desktops and notebooks. The company’s memory interface chip portfolio also includes Gen1 to Gen4 RCDs, power management ICs, Serial Presence Detect hubs, and temperature sensors for leading-edge servers.

SK hynix introduced its new GDDR7 graphics DRAM. The product has an operating speed of 32Gbps, can process 1.5TB of data per second and has a 50% power efficiency improvement compared to the previous generation.

Intel launched its new Lunar Lake Ultra processors. The long-awaited chips will be included in more than 80 laptop designs and deliver more than 40 NPU TOPS (tera operations per second) and over 60 GPU TOPS, for more than 100 platform TOPS.

Brewer Science achieved recertification as a Certified B Corporation, reaffirming its commitment to sustainable and ethical business practices.

Panasonic adopted Siemens’ Teamcenter X cloud product lifecycle management solution, citing Teamcenter X’s Mendix low-code platform, improved operational efficiency and flexibility for its choice.

Keysight validated its 5G NR FR1 1024-QAM demodulation test cases for the first time. The 5G NR radio access technology supports eMBB and was validated against the 3GPP TS 38.521-4 test specification.


Research

In a 47-page deep-dive report, the Center for Security and Emerging Technology delved into the scientific breakthroughs from 1980 to the present that brought EUV lithography to commercialization, including lessons learned for the next emerging technologies.

Researchers at the Paul Scherrer Institute developed a high-performance X-ray tomography technique using burst ptychography, achieving a resolution of 4nm. This method allows for non-destructive imaging of integrated circuits, providing detailed views of nanostructures in materials like silicon and metals.

MIT signed a four-year agreement with the Novo Nordisk Foundation Quantum Computing Programme at University of Copenhagen, focused on accelerating quantum computing hardware research.

MIT’s Research Laboratory of Electronics (RLE) developed a mechanically flexible wafer-scale integrated photonics fabrication platform. This enables the creation of flexible photonic circuits that maintain high performance while being bendable and stretchable. It offers significant potential for integrating photonic circuits into various flexible substrate applications in wearable technology, medical devices, and flexible electronics.

The Naval Research Lab identified a new class of semiconductor nanocrystals with bright ground-state excitons, emphasizing an important advancement in optoelectronics.

Researchers from the National University of Singapore developed a novel method, known as tension-driven CHARM3D, to fabricate 3D self-healing circuits, enabling the 3D printing of free-standing metallic structures without the need for support materials or external pressure.

Find more research in our Technical Papers library.


Events and Further Reading

Find upcoming chip industry events here, including:

Event Date Location
Atomic Layer Deposition (ALD 2024) Aug 4 – 7 Helsinki
Flash Memory Summit Aug 6 – 8 Santa Clara, CA
USENIX Security Symposium Aug 14 – 16 Philadelphia, PA
SPIE Optics + Photonics 2024 Aug 18 – 22 San Diego, CA
Cadence Cloud Tech Day Aug 20 San Jose, CA
Hot Chips 2024 Aug 25 – 27 Stanford University / Hybrid
Optica Online Industry Meeting: PIC Manufacturing, Packaging and Testing (imec) Aug 27 Online
SEMICON Taiwan Sep 4 – 6 Taipei
DVCON Taiwan Sep 10 – 11 Hsinchu
AI HW and Edge AI Summit Sep 9 – 12 San Jose, CA
GSA Executive Forum Sep 26 Menlo Park, CA
SPIE Photomask Technology + EUVL Sep 29 – Oct 3 Monterey, CA
Strategic Materials Conference: SMC 2024 Sep 30 – Oct 2 San Jose, CA
Find All Upcoming Events Here

Upcoming webinars are here, including topics such as quantum safe cryptography, analytics for high-volume manufacturing, and mastering EMC simulations for electronic design.

Find Semiconductor Engineering’s latest newsletters here:

Automotive, Security and Pervasive Computing
Systems and Design
Low Power-High Performance
Test, Measurement and Analytics
Manufacturing, Packaging and Materials

 

The post Chip Industry Week in Review appeared first on Semiconductor Engineering.

Chip Industry Week In Review

BAE Systems and GlobalFoundries are teaming up to strengthen the supply of chips for national security programs, aligning technology roadmaps and collaborating on innovation and manufacturing. Focus areas include advanced packaging, GaN-on-silicon chips, silicon photonics, and advanced technology process development.

Onsemi plans to build a $2 billion silicon carbide production plant in the Czech Republic. The site would produce smart power semiconductors for electric vehicles, renewable energy technology, and data centers.

The global chip manufacturing industry is projected to boost capacity by 6% in 2024 and 7% in 2025, reaching 33.7 million 8-inch (200mm) wafers per month, according to SEMI’s latest World Fab Forecast report. Leading-edge capacity for 5nm nodes and below is expected to grow by 13% in 2024, driven by AI demand for data center applications. Additionally, Intel, Samsung, and TSMC will begin producing 2nm chips using gate-all-around (GAA) FETs next year, boosting leading-edge capacity by 17% in 2025.

At the IEEE Symposium on VLSI Technology & Circuits, imec introduced:

  • Functional CMOS-based CFETs with stacked bottom and top source/drain contacts.
  • CMOS-based 56Gb/s zero-IF D-band beamforming transmitters to support next-gen short-range, high-speed wireless services at frequencies above 100GHz.
  • ADCs for base stations and handsets, a key step toward scalable, high-performance beyond-5G solutions, such as cloud-based AI and extended reality apps.

Quick links to more news:

Global
In-Depth
Market Reports
Education and Training
Security
Product News
Research
Events and Further Reading


Global

Wolfspeed postponed plans to construct a $3 billion chip plant in Germany, underscoring the EU’s challenges in boosting semiconductor production, reports Reuters. The North Carolina-based company cited reduced capital spending due to a weakened EV market, saying it now aims to start construction in mid-2025, two years later than originally planned.

Micron is building a pilot production line for high-bandwidth memory (HBM) in the U.S., and considering HBM production in Malaysia to meet growing AI demand, according to a Nikkei report. The company is expanding HBM R&D facilities in Boise, Idaho, and eyeing production capacity in Malaysia, while also enhancing its largest HBM facility in Taichung, Taiwan.

Kioxia restored its Yokkaichi and Kitakami plants in Japan to full capacity, ending production cuts as the memory market recovers, according to Nikkei. The company, which is focusing on NAND flash production, has secured new bank credit support, including refinancing a ¥540 billion loan and establishing a ¥210 billion credit line. Kioxia had reduced output by more than 30% in October 2022 due to weak smartphone demand.

Europe’s NATO Innovation Fund announced its first direct investments, which include semiconductor materials. Twenty-three NATO allies co-invested in this fund of over $1 billion, devoted to addressing critical defense and security challenges.

The second meeting of the U.S.-India Initiative on Critical and Emerging Technology (iCET) was held in New Delhi, with various funding and initiatives announced to support semiconductor technology, next-gen telecommunications, connected and autonomous vehicles, ML, and more.

Amazon announced investments of €10 billion in Germany to drive innovation and support the expansion of its logistics network and cloud infrastructure.

Quantum Machines opened the Israeli Quantum Computing Center (IQCC) research facility, backed by the Israel Innovation Authority and located at Tel Aviv University. Also, Israel-based Classiq is collaborating with NVIDIA and BMW, using quantum computing to find the optimal automotive architecture of electrical and mechanical systems.

Global data center vacancy rates are at historic lows, and power is becoming less available, according to a Siemens report featured on Broadband Breakfast. The company called for an influx of financing to find new ways to optimize data center technology and sustainability.


In-Depth

Semiconductor Engineering published its Manufacturing, Packaging & Materials newsletter this week, featuring these top stories:

More reporting this week:


Market Reports

Renesas completed its acquisition of Transphorm and will immediately start offering GaN-based power products and reference designs to meet the demand for wide-bandgap (WBG) chips.

Revenues for the top five wafer fab equipment (WFE) companies fell 9% YoY in Q1 2024, according to Counterpoint. This was offset partially by increased demand for NAND and DRAM, which increased 33% YoY, and strong growth in sales to China, which were up 116% YoY.

The SiC power devices industry saw robust growth in 2023, primarily driven by the BEV market, according to TrendForce. The top five suppliers, led by ST with a 32.6% market share and onsemi in second place, accounted for 91.9% of total revenue. However, the anticipated slowdown in BEV sales and weakening industrial demand are expected to significantly decelerate revenue growth in 2024. 

About 30% of vehicles produced globally will have E/E architectures with zonal controllers by 2032, according to McKinsey & Co. The market for automotive micro-components and logic semiconductors is predicted to reach $60 billion in 2032, and the overall automotive semiconductor market is expected to grow from $60 billion to $140 billion in the same period, at a 10% CAGR.

The automotive processor market generated US$20 billion in revenue in 2023, according to Yole. US$7.8 billion was from APUs and FPGAs and $12.2 billion was from MCUs. The ADAS and infotainment processors market was worth US$7.8 billion in 2023 and is predicted to grow to $16.4 billion by 2029 at a 13% CAGR. The market for ADAS sensing is expected to grow at a 7% CAGR.


Security

The CHERI Alliance was established to drive adoption of memory safety and scalable software compartmentalization via the security technology CHERI, or Capability Hardware Enhanced RISC Instructions. Founding members include Capabilities Limited, Codasip, the FreeBSD Foundation, lowRISC, SCI Semiconductor, and the University of Cambridge.

In security research:

  • Japan and China researchers explored a NAND-XOR ring oscillator structure to design an entropy source architecture for a true random number generator (TRNG).
  • University of Toronto and Carleton University researchers presented a survey examining how hardware is applied to achieve security and how reported attacks have exploited certain defects in hardware.
  • University of North Texas and Texas Woman’s University researchers explored the potential of hardware security primitive Physical Unclonable Functions (PUF) for mitigation of visual deepfakes.
  • Villanova University researchers proposed the Boolean DERIVativE attack, which generalizes Boolean domain leakage.

Post-quantum cryptography firm PQShield raised $37 million in Series B funding.

Former OpenAI executive Ilya Sutskever, who quit over safety concerns, launched Safe Superintelligence Inc. (SSI).

EU industry groups warned the European Commission that its proposed cybersecurity certification scheme (EUCS) for cloud services should not discriminate against Amazon, Google, and Microsoft, reported Reuters.

Cyber Europe tested EU cyber preparedness in the energy sector by simulating a series of large-scale cyber incidents in an exercise organized by the European Union Agency for Cybersecurity (ENISA).

The Cybersecurity and Infrastructure Security Agency (CISA) issued a number of alerts/advisories.


Education and Training

New York non-profit NY CREATES and South Korea’s National Nano Fab Center partnered to develop a hub for joint research, aligned technology services, testbed support, and an engineer exchange program to bolster chips-centered R&D, workforce development, and each nation’s high-tech ecosystem.

New York and the Netherlands agreed on a partnership to promote sustainability within the semiconductor industry, enhance workforce development, and boost semiconductor R&D.

Rapidus is set to send 200 engineers to AI chip developer Tenstorrent in the U.S. for training over the next five years, reports Nikkei. This initiative, led by Japan’s Leading-edge Semiconductor Technology Center (LSTC), aims to bolster Japan’s AI chip industry.


Product News

UMC announced its 22nm embedded high voltage (eHV) technology platform for premium smartphone and mobile device displays. The 22eHV platform reduces core device power consumption by up to 30% compared to previous 28nm processes. Die area is reduced by 10% with the industry’s smallest SRAM bit cells.

Alphawave Semi announced a new 9.2 Gbps HBM3E sub-system silicon platform capable of 1.2 terabytes per second. Based on the HBM3E IP, the sub-system is aimed at addressing the demand for ultra-high-speed connectivity in high-performance compute applications.

Movellus introduced the Aeonic Power product family for on-die voltage regulation, targeting the challenging area of power delivery.

Cadence partnered with Semiwise and sureCore to develop new cryogenic CMOS circuits with possible quantum computing applications. The circuits are based on modified transistors found in the Cadence Spectre Simulation Platform and are capable of processing analog, mixed-signal, and digital circuit simulation and verification at cryogenic temperatures.

Renesas launched R-Car Open Access (RoX), an integrated development platform for software-defined vehicles (SDVs), designed for Renesas R-Car SoCs and MCUs with tools for deployment of AI applications, reducing complexity and saving time and money for car OEMs and Tier 1s.

Infineon released industry-first radiation-hardened 1 and 2 Mb parallel interface ferroelectric-RAM (F-RAM) nonvolatile memory devices, with up to 120 years of data retention at 85 degrees Celsius, along with random access and full memory write at bus speeds. Plus, a CoolGaN Transistor 700 V G4 product family for efficient power conversion up to 700 V, ideal for consumer chargers and notebook adapters, data center power supplies, renewable energy inverters, and more.

Ansys adopted NVIDIA’s Omniverse application programming interfaces for its multi-die chip designers. Those APIs will be used for 5G/6G, IoT, AI/ML, cloud computing, and autonomous vehicle applications. The company also announced ConceptEV, a SaaS solution for automotive concept design for EVs.

Fig. 1: Field visualization of 3D-IC with Omniverse. Source: Ansys

QP Technologies announced a new dicing saw for its manufacturing line that can process a full cassette of 300mm wafers 7% faster than existing tools, improving throughput and productivity.

NXP introduced its SAF9xxx of audio DSPs to support the demand for AI-based audio in software-defined vehicles (SDVs) by using Cadence’s Tensilica HiFi 5 DSPs combined with dedicated neural-network engines and hardware-based accelerators.

Avionyx, a provider of software lifecycle engineering in the aerospace and safety-critical systems sector, partnered with Siemens and will leverage its Polarion application lifecycle management (ALM) tool. Also, Dovetail Electric Aviation adopted Siemens Xcelerator to support sustainable aviation.


Research

Researchers from imec and KU Leuven released a 70+ page paper, “Selecting Alternative Metals for Advanced Interconnects,” addressing interconnect resistance and reliability.

A comprehensive review article — “Future of plasma etching for microelectronics: Challenges and opportunities” — was created by a team of experts from the University of Maryland, Lam Research, IBM, Intel, and many others.

Researchers from the Institut Polytechnique de Paris’s Laboratory of Condensed Matter for Physics developed an approach to investigate defects in semiconductors. The team “determined the spin-dependent electronic structure linked to defects in the arrangement of semiconductor atoms,” the first time this structure has been measured, according to a release.

Lawrence Berkeley National Laboratory-led researchers developed a small enclosed chamber that can hold all the components of an electrochemical reaction, which can be paired with transmission electron microscopy (TEM) to generate precise views of a reaction at atomic scale, and can be frozen to stop the reaction at specific time points. They used the technique to study a copper catalyst.

The U.S. Food and Drug Administration (FDA) approved a clinical trial to test a device with 1,024 nanoscale sensors that records brain activity during surgery, developed by engineers at the University of California San Diego (UC San Diego).


Events and Further Reading

Find upcoming chip industry events here, including:

Event Date Location
Standards for Chiplet Design with 3DIC Packaging (Part 2) Jun 21 Online
DAC 2024 Jun 23 – 27 San Francisco
RISC-V Summit Europe 2024 Jun 24 – 28 Munich
Leti Innovation Days 2024 Jun 25 – 27 Grenoble, France
ISCA 2024 Jun 29 – Jul 3 Buenos Aires, Argentina
SEMICON West Jul 9 – 11 San Francisco
Flash Memory Summit Aug 6 – 8 Santa Clara, CA
USENIX Security Symposium Aug 14 – 16 Philadelphia, PA
Hot Chips 2024 Aug 25 – 27 Stanford University
Find All Upcoming Events Here

Upcoming webinars are here.

Semiconductor Engineering’s latest newsletters:

Automotive, Security and Pervasive Computing
Systems and Design
Low Power-High Performance
Test, Measurement and Analytics
Manufacturing, Packaging and Materials


The post Chip Industry Week In Review appeared first on Semiconductor Engineering.

Chip Industry Week In Review

Rapidus and IBM are jointly developing mass production capabilities for chiplet-based advanced packages. The collaboration builds on an existing agreement to develop 2nm process technology.

Vanguard and NXP will jointly establish VisionPower Semiconductor Manufacturing Company (VSMC) in Singapore to build a $7.8 billion, 12-inch wafer plant. This is part of a global supply chain shift “Out of China, Out of Taiwan,” according to TrendForce.

Alphawave joined forces with Arm to develop an advanced chiplet based on Arm’s Neoverse Compute Subsystems for AI/ML. The chiplet contains the Neoverse N3 CPU core cluster and Arm Coherent Mesh Network, and will be targeted at HPC in data centers, AI/ML applications, and 5G/6G infrastructure.

ElevATE Semiconductor and GlobalFoundries will partner for high-voltage chips to be produced at GF’s facility in Essex Junction, Vermont, which GF bought from IBM. The chips are essential for semiconductor testing equipment, aerospace, and defense systems.

NVIDIA, OpenAI, and Microsoft are under investigation by the U.S. Federal Trade Commission and Justice Department for violation of antitrust laws in the generative AI industry, according to the New York Times.

Quick links to more news:

Global
Markets and Money
In-Depth
Security
Education and Training
Product News
Research
Events and Further Reading


Global

Apollo Global Management will invest $11 billion in Intel’s Fab 34 in Ireland, thereby acquiring a 49% stake in Intel’s Irish manufacturing operations.

imec and ASML opened their jointly run High-NA EUV Lithography Lab in Veldhoven, the Netherlands. The lab will be used to prepare the next-generation lithography for high-volume manufacturing, expected to begin in 2025 or 2026.

Expedera opened a new semiconductor IP design center in India. The location, the sixth of its kind for the company, is aimed at helping to make up for a shortfall in trained technicians, researchers, and engineers in the semiconductor sector.

Foxconn will build an advanced computing center in Taiwan with NVIDIA’s Blackwell platform at its core. The site will feature GB200 servers, which consist of 64 racks and 4,608 GPUs, and will be completed by 2026.

Intel and its 14 partner companies in Japan will use Sharp‘s LCD plants to research semiconductor production technology, a cost reduction move that should also produce income for Sharp, according to Nikkei Asia.

Japan is considering legislation to support the commercial production of advanced semiconductors, per Reuters.

Saudi Arabia aims to establish at least 50 semiconductor design companies as part of a new National Semiconductor Hub, funded with over $266 million.

Air Liquide is opening a new industrial gas production facility in Idaho, which will produce ultra-pure nitrogen and other gases for Micron’s new fab.

Microsoft will invest 33.7 billion Swedish crowns ($3.2 billion) to expand its cloud and AI infrastructure in Sweden over a two-year period, reports Bloomberg. The company also will invest $1 billion to establish a new data center in northwest Indiana.

AI data centers could consume as much as 9.1% of the electricity generated in the U.S. by 2030, according to a white paper published by the Electric Power Research Institute. That would more than double the electricity currently consumed by data centers, though EPRI notes this is a worst case scenario and advances in efficiency could be a mitigating factor.


Markets and Money

The Semiconductor Industry Association (SIA) announced global semiconductor sales increased 15.8% year-over-year in April, and the group projected a market growth of 16% in 2024. Conversely, global semiconductor equipment billings contracted 2% year-over-year to US$26.4 billion in Q1 2024, while quarter-over-quarter billings dropped 6% during the same period, according to SEMI‘s Worldwide Semiconductor Equipment Market Statistics (WWSEMS) Report.

Cadence completed its acquisition of BETA CAE Systems International, a provider of multi-domain, engineering simulation solutions.

Cisco‘s investment arm launched a $1 billion fund to aid AI startups as part of its AI innovation strategy. Nearly $200 million has already been earmarked.

The power and RF GaN markets will grow beyond US$2.45 billion and US$1.9 billion in 2029, respectively, according to Yole, which is offering a webinar on the topic.

The micro LED chip market is predicted to reach $580 million by 2028, driven by head-mounted devices and automotive applications, according to TrendForce. The cost of Micro LED chips may eventually come down due to size miniaturization.


In-Depth

Semiconductor Engineering published its Automotive, Security, and Pervasive Computing newsletter this week, featuring these top stories:

More reporting this week:


Security

Scott Best, Rambus senior director of Silicon Security Products, delivered a keynote at the Hardwear.io conference this week (below), detailing a $60 billion reverse engineering threat for hardware in just three markets — $30 billion for printer consumables, $20 billion for rechargeable batteries with some type of authentication, and $10 billion for medical devices such as sonogram probes.


Photo source: Ed Sperling/Semiconductor Engineering

wolfSSL debuted wolfHSM for automotive hardware security modules, with its cryptographic library ported to run in automotive HSMs like Infineon’s Aurix Tricore TC3XX.

Cisco integrated AMD Pensando data processing units (DPUs) with its Hypershield security architecture for defending AI-scale data centers.

OMNIVISION released an intelligent CMOS image sensor for human presence detection, infrared facial authentication, and always-on technology with a single sensing camera, along with two new image sensors for industrial and consumer security surveillance cameras.

Digital Catapult announced a new cohort of companies will join Digital Security by Design’s Technology Access Program, gaining access to an Arm Morello prototype evaluation hardware kit based on Capability Hardware Enhanced RISC Instructions (CHERI), to find applications across critical UK sectors.

University of Southampton researchers used formal verification to evaluate the hardware reliability of a RISC-V ibex core in the presence of soft errors.

Several institutions published their students’ master’s and PhD work:

  • Virginia Tech published a dissertation proposing sPACtre, a defense mechanism that aims to prevent Spectre control-flow attacks on existing hardware.
  • Wright State University published a thesis proposing an approach that uses various machine learning models to improve hardware Trojan identification with power signal side channel analysis.
  • Wright State University published a thesis examining the effect of aging on the reliability of SRAM PUFs used for secure and trusted microelectronics IC applications.
  • Nanyang Technological University published a Final Year Project proposing a novel SAT-based circuit preprocessing attack based on the concept of logic cones to enhance the efficacy of SAT attacks on complex circuits like multipliers.

The Cybersecurity and Infrastructure Security Agency (CISA) issued a number of alerts/advisories.


Education and Training

Renesas and the Indian Institute of Technology Hyderabad (IIT Hyderabad) signed a three-year MoU to collaborate on VLSI and embedded semiconductor systems, with a focus on R&D and academic interactions to advance the “Make in India” strategy.

Charlie Parker, senior machine learning engineer at Tignis, presented a talk on “Why Every Fab Should Be Using AI.”

Penn State and the National Sun Yat-Sen University (NSYSU) in Taiwan partnered to develop educational and research programs focused on semiconductors and photonics.

Rapidus and Hokkaido University partnered on education and research to enhance Japan’s scientific and technological capabilities and develop human resources for the semiconductor industry.

The University of Minnesota named Steve Koester its first “Chief Semiconductor Officer,” and launched a website devoted to semiconductor and microelectronics research and education.

The state of Michigan invested $10 million toward semiconductor workforce development.


Product News

Siemens reported breakthroughs in high-level C++ verification that will be used in conjunction with its Catapult software. Designers will be able to use formal property checking via the Catapult Formal Assert software and reachability coverage analysis through Catapult Formal CoverCheck.

Infineon released several products:

Augmental, an MIT Media Lab spinoff, released a tongue-based computer controller, dubbed the MouthPad.

NVIDIA revealed a new line of products that will form the basis of next-gen AI data centers. Along with partners ASRock Rack, ASUS, GIGABYTE, Ingrasys, and others, the NVIDIA GPUs and networking tech will offer cloud, on-premises, embedded, and edge AI systems. NVIDIA founder and CEO Jensen Huang showed off the company’s upcoming Rubin platform, which will succeed its current Blackwell platform. The new system will feature new GPUs, an Arm-based CPU and advanced networking with NVLink 6, CX9 SuperNIC and X1600 converged InfiniBand/Ethernet switch.

Intel showed off its Xeon 6 processors at Computex 2024. The company also unveiled architectural details for its Lunar Lake client computing processor, which will use 40% less SoC power, as well as a new NPU, and X2 graphic processing unit cores for gaming.


Research

imec released a roadmap for superconducting digital technology to revolutionize AI/ML.

CEA-Leti reported breakthroughs in three projects it considers key to the next generation of CMOS image sensors. The projects involved embedding AI in the CIS and stacking multiple dies to create 3D architectures.

Researchers from MIT’s Computer Science & Artificial Intelligence Laboratory (MIT-CSAIL) used a type of generative AI, known as diffusion models, to train multi-purpose robots, and designed the Grasping Neural Process for more intelligent robotic grasping.

IBM and Pasqal partnered to develop a common approach to quantum-centric supercomputing and to promote application research in chemistry and materials science.

Stanford University and Q-NEXT researchers investigated diamond to find the source of its temperamental nature when it comes to emitting quantum signals.

TU Wien researchers investigated how AI categorizes images.

In Canada:

  • Simon Fraser University received funding of over $80 million from various sources to upgrade the supercomputing facility at the Cedar National Host Site.
  • The Digital Research Alliance of Canada announced $10.28 million to renew the University of Victoria’s Arbutus cloud infrastructure.
  • The Canadian government invested $18.4 million in quantum research at the University of Waterloo.

Events and Further Reading

Find upcoming chip industry events here, including:

Event Date Location
SNUG Europe: Synopsys User Group Jun 10 – 11 Munich
IEEE RAS in Data Centers Summit: Reliability, Availability and Serviceability Jun 11 – 12 Santa Clara, CA
AI for Semiconductors (MEPTEC) Jun 12 – 13 Online
3D & Systems Summit Jun 12 – 14 Dresden, Germany
PCI-SIG Developers Conference Jun 12 – 13 Santa Clara, CA
Standards for Chiplet Design with 3DIC Packaging (Part 1) Jun 14 Online
AI Hardware and Edge AI Summit: Europe Jun 18 – 19 London, UK
Standards for Chiplet Design with 3DIC Packaging (Part 2) Jun 21 Online
DAC 2024 Jun 23 – 27 San Francisco
RISC-V Summit Europe 2024 Jun 24 – 28 Munich
Leti Innovation Days 2024 Jun 25 – 27 Grenoble, France
Find All Upcoming Events Here

Upcoming webinars are here.


Semiconductor Engineering’s latest newsletters:

Automotive, Security and Pervasive Computing
Systems and Design
Low Power-High Performance
Test, Measurement and Analytics
Manufacturing, Packaging and Materials

 

The post Chip Industry Week In Review appeared first on Semiconductor Engineering.

AI For Data Management

Data management is becoming a significant new challenge for the chip industry, as well as a brand new opportunity, as the amount of data collected at every step of design through manufacturing continues to grow.

Exacerbating the problem is the rising complexity of designs, many of which are highly customized and domain-specific at the leading edge, as well as increasing demands for reliability and traceability. There also is a growing focus on chiplets developed using different processes, including some from different foundries, and new materials such as glass substrates and ruthenium interconnects. On the design side, EDA and verification tools can generate terabytes of data on a weekly or even a daily basis, unlike in the past when this was largely done on a per-project basis.

While more data can be used to provide insights into processes and enable better designs, it’s an ongoing challenge to manage the current volumes being generated. The entire industry must rethink some well-proven methodologies and processes, as well as invest in a variety of new tools and approaches. At the same time, these changes are generating concern in an industry used to proceeding cautiously, one step at a time, based on silicon- and field-proven strategies. Increasingly, AI/ML is being added into design tools to identify anomalies and patterns in large data sets, and many of those tools are being regularly updated as algorithms are updated and new features are added, making it difficult to know exactly when and where to invest, which data to focus on, and with whom to share it.

“Every company has its own design flow, and almost every company has its own methodology around harvesting that data, or best practices about what reports should or should not be written out at what point,” said Rob Knoth, product management director in Cadence’s Digital & Signoff group. “There’s a death by 1,000 cuts that can happen in terms of just generating titanic volumes of data because, in general, disk space is cheap. People don’t think about it a lot, and they’ll just keep generating reports. The problem is that just because you’re generating reports doesn’t mean you’re using them.”

Fig. 1: Rising design complexity is driving increased need for data management. Source: IEEE Rising Stars 2022/Cadence

As with any problem in chip design, there is opportunity in figuring out a path forward. “You can always just not use the data, and then you’re back where you started,” said Tony Chan Carusone, CTO at Alphawave Semi. “The reason it becomes a problem for organizations is because they haven’t architected things from the beginning to be scalable, and therefore, to be able to handle all this data. Now, there’s an opportunity to leverage data, and it’s a different way. So it’s disruptive because you have to tear things apart, from re-architecting systems and processes to how you collect and store data, and organize it in order to take advantage of the opportunity.”

Buckets of data, buckets of problems
The challenges that come with this influx of data can be divided into three buckets, said Jim Schultz, senior staff product manager at Synopsys. The first is figuring out what information is actually critical to keep. “If you make a run, designers tend to save that run because if they need to do a follow-up run, they have some data there and they may go, ‘Okay, well, what’s the runtime? How long did that run take, because my manager is going to ask me what I think the runtime is going to be on the next project or the next iteration of the block.’ While that data may not be necessary, designers and engineers have a tendency to hang onto it anyway, just in case.”

The second challenge is that once the data starts to pour in, it doesn’t stop, raising questions about how to manage collection. And third, once the data is collected, how can it be put to best use?

“Data analytics have been around, with other types of companies exploring different types of data analytics, but the difference is those can be very generic solutions,” said Schultz. “What we need for our industry is going to be very specific data analytics. If I have a timing issue, I want you to help me pinpoint what the cause of that timing violation is. That’s very specific to what we do in EDA. When we talk about who is cutting through the noise, we don’t want data that’s just presented. We want the data that is what the designer most cares about.”

Data security
The sheer number of tools being used and companies and people involved along the design pathway raises another challenge — security.

“There’s a lot of thought and investment going into the security aspect of data, and just as much as the problem of what data to save and store is the type of security we have to have without hindering the user day-to-day,” said Simon Rance, director of product management at Keysight. “That’s becoming a bigger challenge. Things like the CHIPS Act and the geopolitical scenarios we have at the moment are compounding that problem because a lot of the companies that used to create all these devices by themselves are having to collaborate, even with companies in different regions of the globe.”

This requires a balancing act. “It’s almost like a recording studio where you have all these knobs and dials to fine tune it, to make sure we have security of the data,” said Rance. “But we’re also able to get the job done as smoothly and as easily as we can.”

Further complicating the security aspect is that designing chips is not a one-man job. As leading-edge chips become increasingly complex and heterogeneous, they can involve hundreds of people in multiple companies.

“An important thing to consider when you’re talking about big data and analytics is what you’re going to share and with whom you’re going to share it,” said Synopsys’ Schultz. “In particular, when you start bringing in and linking data from different sources, if you start bringing in data related to silicon performance, you don’t want everybody to have access to that data. So the whole security protocol is important.”

Even the mundane matters — having a ton of data makes it likely, at some point, that data will be moved.

“The more places the data has to be transferred to, the more delays,” said Rance. “The bigger the data set, the longer it takes to go from A to B. For example, a design team in the U.S. may be designing during the day. Then, another team in Singapore or Japan will pick up on that design in their time zone, but they’re across the world. So you’re going to have to sync the data back and forth between these kinds of design sites. The bigger the data, the harder to sync.”

Solutions
The first step toward solving the issue of too much data is figuring out what data is actually needed. Rance said his team has found success using smart algorithms that help figure out which data is essential, which in turn can help optimize storage and transfer times.

Less technical problems can rear their heads, as well. Gina Jacobs, head of global communications and brand marketing at Arteris, said that engineers who use a set methodology, particularly those who are used to working on a problem by themselves and “brute forcing” a solution, also can find themselves overwhelmed by data.

“Engineers and designers can also switch jobs, taking with them institutional knowledge,” Jacobs said. “But all three problems can be solved with a single solution — having data stored in a standardized way that is easily accessible and sortable. It’s about taking data and requirements and specifications in different forms and then having it in the one place so that the different teams have access to it, and then being able to make changes so there is a single source of truth.”
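
As a minimal sketch of what such a standardized, single-source-of-truth store might look like, the snippet below defines one shared record format for tool runs and appends each run to a common index that every team can read. The schema, field names, and file layout are hypothetical, not a description of Arteris' actual solution.

```python
from dataclasses import dataclass, asdict, field
from datetime import datetime, timezone
import json

@dataclass
class RunRecord:
    """One standardized entry describing a tool run and where its data lives."""
    block: str                  # design block or IP name
    tool: str                   # e.g. synthesis, place-and-route, STA
    run_id: str
    started: str                # ISO-8601 timestamp
    runtime_minutes: float
    report_paths: list = field(default_factory=list)
    metrics: dict = field(default_factory=dict)   # e.g. {"wns_ps": -12.0}

def log_run(record: RunRecord, index_file: str = "run_index.jsonl") -> None:
    """Append the record to a shared, append-only index so all teams read one source of truth."""
    with open(index_file, "a") as fh:
        fh.write(json.dumps(asdict(record)) + "\n")

# Example usage with made-up values.
log_run(RunRecord(
    block="cpu_core", tool="sta", run_id="sta_0421",
    started=datetime.now(timezone.utc).isoformat(),
    runtime_minutes=183.5,
    report_paths=["reports/sta_0421/timing.rpt"],
    metrics={"wns_ps": -12.0},
))
```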

Here, EDA design and data management tools increasingly rely on artificial intelligence to help. Schultz forecast a future in which generative AI touches every facet of chip development. “Along with that is advanced data analytics that is able to mine all of that data you’ve been collecting, going beyond the simple things that people have been doing, like predicting how long runtime is going to be or getting an idea of what the performance is going to be,” he said. “Tools are going to be able to deal with all of that data and recognize trends much faster.”

Still, those all-encompassing AI tools, capable of complex analysis, are years away. Cadence’s Knoth said he already has encountered clients reluctant to bring AI into the mix due to fears over the costs of disk space, compute resources, and licenses. Others, however, have been a bit more open-minded.

“Initially, AI can use a lot of processors to generate a lot of data because it’s doing a lot of things in parallel when it’s doing the inferencing, but it usually gets to the result faster and more predictably,” he said. So while a machine learning algorithm may generate even more vast amounts of data, on top of the piles currently available, “a good machine learning algorithm could be watching and smartly killing or restarting jobs where needed.”
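
A minimal sketch of that “watch and restart” idea is shown below, with a stand-in scoring function where a trained model would sit. The job telemetry, threshold, and callbacks are all assumptions for illustration, not any vendor's implementation.

```python
def predict_failure_probability(job_stats: dict) -> float:
    """Stand-in for a trained model that scores a running job from its live telemetry.
    A real model would be trained on historical run logs; here a toy heuristic is used."""
    stalled = job_stats["minutes_since_progress"] > 60
    return 0.9 if stalled else 0.1

def supervise(jobs: dict, kill, restart, threshold: float = 0.8) -> None:
    """Kill and resubmit jobs the model considers unlikely to finish usefully."""
    for job_id, stats in jobs.items():
        if predict_failure_probability(stats) >= threshold:
            kill(job_id)
            restart(job_id)

# Example usage with hypothetical telemetry and callbacks.
jobs = {
    "route_blockA": {"minutes_since_progress": 5},
    "route_blockB": {"minutes_since_progress": 120},
}
supervise(jobs,
          kill=lambda j: print(f"killing {j}"),
          restart=lambda j: print(f"restarting {j}"))
```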

As for the humans who remain an essential component of chip design, Alphawave’s Carusone said hardware engineers should take a page from their counterparts in the software development world, who learned these lessons years ago.

These include:

  • Having an organized and automated way to collect data, file it in a repository, and not do anything manually;
  • Developing ways to run verification and lab testing and everything in between in parallel, but with the data organized in a way that can be mined; and
  • Creating methods for rigorously checking in and out of different test cases that you want to consider.

“The big thing is you’ve got all this data collected, but then what is each of those files, each of those collections of data?” said Carusone. “What does that correspond to? What test conditions was that collected in? The software community dealt with that a while ago, and the hardware community also needs to have this under its belt, taking it to the next level and recognizing we really need to be able to do this en masse. We need to be able to have dozens of people work in parallel, collecting data and have it all on there. We can test a big collection of our designs in the lab without anyone having to touch a thing, and then also try refinements of the firmware, scale them out, then have all the data come in and be analyzed. Being able to have all that done in an automated way lets you track down and fix problems a lot more quickly.”
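
A minimal sketch of the kind of automated, metadata-tagged filing Carusone describes appears below: each lab capture is copied into a repository along with the conditions it was collected under, so it can be mined later without anyone reconstructing the setup by hand. The directory layout, file names, and condition fields are hypothetical.

```python
import hashlib
import json
import shutil
from pathlib import Path

def archive_measurement(data_file: str, conditions: dict, repo_root: str = "lab_repo") -> Path:
    """File a raw measurement into the repository together with its test conditions."""
    src = Path(data_file)
    digest = hashlib.sha256(src.read_bytes()).hexdigest()[:12]
    dest_dir = Path(repo_root) / digest
    dest_dir.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, dest_dir / src.name)
    (dest_dir / "conditions.json").write_text(json.dumps(conditions, indent=2))
    return dest_dir

# Example: a hypothetical eye-diagram capture tagged with its test conditions.
if Path("eye_scan_ch3.csv").exists():
    archive_measurement(
        "eye_scan_ch3.csv",
        {"temperature_c": 85, "supply_v": 0.75, "firmware": "rc2", "board": "eval_A7"},
    )
```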

Conclusion
The influx of new tools used to analyze and test chip designs has increased productivity, but those designs come with additional considerations. Institutions and individual engineers and designers have never had access to so much data, but that data is of limited value if it’s not used effectively.

Strategies to properly store and order that data are essential. Some powerful tools are already in place to help do that, and the AI revolution promises to make even more powerful resources available to quickly cut down on the time needed to run tests and analyze the results.

For now, handling all that data remains a tricky balance, according to Cadence’s Knoth. “If this was an easy problem, it wouldn’t be a problem. Being able to communicate effectively, hierarchically — not just from a people management perspective, but also from a chip and project management perspective — is difficult. The teams that do this well invest resources into that process, specifically the communication of top-down tightening of budgets or top-down floorplan constraints. These are important to think about because every engineer is looking at chip-level timing reports, but the problem that they’re trying to solve might not ever be visible. But if they have a report that says, ‘Here is your view of what your problems are to solve,’ you can do some very effective work.”

Further Reading
EDA Pushes Deeper Into AI
AI is both evolutionary and revolutionary, making it difficult to assess where and how it will be used, and what problems may crop up.
Optimizing EDA Cloud Hardware And Workloads
Algorithms written for GPUs can slice simulation time from weeks to hours, but not everything is optimized or benefits equally.

The post AI For Data Management appeared first on Semiconductor Engineering.

Chip Industry Week In Review

President Biden will raise the tariff rate on Chinese semiconductors from 25% to 50% by 2025, among other measures to protect U.S. businesses from China’s trade practices. Also, as part of President Biden’s AI Executive Order, the Administration released steps to protect workers from AI risks, including human oversight of systems and transparency about what systems are being used.

Intel is in advanced talks with Apollo Global Management for the equity firm to provide more than $11 billion to build a fab in Ireland, reported the Wall Street Journal. Also, Intel’s Foundry Services appointed Kevin O’Buckley as the senior vice president and general manager.

Polar is slated to receive up to $120 million in CHIPS Act funding to establish an independent American foundry in Minnesota. The company expects to invest about $525 million in the expansion of the facility over the next two years, with a $75 million investment from the State of Minnesota.

Arm plans to develop AI chips for launch next year, reports Nikkei Asia.

South Korea is planning a support package worth more than 10 trillion won ($7.3 billion) aimed at chip materials, equipment makers, and fabless companies throughout the semiconductor supply chain, according to Reuters.

Quick links to more news:

Global
In-Depth
Markets and Money
Security
Supercomputing
Education and Training
Product News
Research
Events and Further Reading


Global

Edwards opened a new facility in Asan City, South Korea. The 15,000m² factory provides a key production site for abatement systems, and integrated vacuum and abatement systems for semiconductor manufacturing.

France’s courtship with mega-tech is paying off.  Microsoft is investing more than US $4 billion to expand its cloud computing and AI infrastructure, including bringing up to 25,000 advanced GPUs to the country by the end of 2025. The “Choose France” campaign also snagged US $1.3 billion from Amazon for cloud infrastructure expansion, genAI and more.

Toyota, Nissan, and Honda are teaming up on AI and chips for next-gen cars with support from Japan’s Ministry of Economy, Trade and Industry (METI), reports Nikkei Asia.

Meanwhile, IBM and Honda are collaborating on long-term R&D of next-gen technologies for software-defined vehicles (SDV), including chiplets, brain-inspired computing, and hardware-software co-optimization.

Siemens and Foxconn plan to collaborate on global manufacturing processes in electronics, information and communications technology, and electric vehicles (EV).

TSMC confirmed a Q424 construction start date for its first European plant in Dresden, Germany.

Amazon Web Services (AWS) plans to invest €7.8 billion (~$8.4B) in the AWS European Sovereign Cloud in Germany through 2040. The system is designed to serve public sector organizations and customers in highly regulated industries.


In-Depth

Semiconductor Engineering published its Low Power-High Performance newsletter this week, featuring these stories:

And this week’s Test, Measurement & Analytics newsletter featured these stories:


Markets and Money

The U.S. National Institute of Standards and Technology (NIST) awarded more than $1.2 million to 12 businesses in 8 states under the Small Business Innovation Research (SBIR) Program to fund R&D of products relating to cybersecurity, quantum computing, health care, semiconductor manufacturing, and other critical areas.

Engineering services and consulting company Infosys completed the acquisition of InSemi Technology, a provider of semiconductor design and embedded software development services.

The quantum market, which includes quantum networking and sensors alongside computing, is predicted to grow from $838 million in 2024 to $1.8 billion in 2029, reports Yole.

Shipments of OLED monitors reached about 200,000 units in Q1 2024, a year over year growth of 121%, reports TrendForce.

Global EV sales grew 18% in Q1 2024 with plug-in hybrid electric vehicles (PHEV) sales seeing 46% YoY growth and battery electric vehicle (BEV) sales growing just 7%, according to Counterpoint. China leads global EV sales with 28% YoY growth, while the US grew just 2%. Tesla saw a 9% YoY drop, but topped BEV sales with a 19% market share. BYD grew 13% YoY and exported about 100,000 EVs with 152% YoY growth, mainly in Southeast Asia.

DeepX raised $80.5 million in Series C funding for its on-device NPU IP and AI SoCs tailored for applications including physical security, robotics, and mobility.

MetisX raised $44 million in Series A funding for its memory solutions built on Compute Express Link (CXL) for accelerating large-scale data processing applications.


Security

While security experts have been warning of a growing threat in electronics for decades, there have been several recent fundamental changes that elevate the risk.

Synopsys and the Ponemon Institute released a report showing 54% of surveyed organizations suffered a software supply chain attack in the past year and 20% were not effective in their response. And 52% said their development teams use AI tools to generate code, but only 32% have processes to evaluate it for license, security, and quality risks.

Researchers at Ruhr University Bochum and TU Darmstadt presented a solution for the automated generation of fault-resistant circuits (AGEFA) and assessed the security of examples generated by AGEFA against side-channel analysis and fault injection.

TXOne reported on operational technology security and the most effective method for preventing production interruptions caused by cyber-attacks.

CrowdStrike and NVIDIA are collaborating to accelerate the use of analytics and AI in cybersecurity to help security teams combat modern cyberattacks, including AI-powered threats.

The National Institute of Standards and Technology (NIST) finalized its guidelines for protecting sensitive data, known as controlled unclassified information, aimed at organizations that do business with the federal government.

The Defense Advanced Research Projects Agency (DARPA) awarded BAE Systems a $12 million contract to solve thermal challenges limiting electronic warfare systems, particularly in GaN transistors.

Sigma Defense won a $4.7 million contract from the U.S. Army for an AI-powered virtual training environment, partnering with Brightline Interactive on a system that uses spatial computing and augmented intelligence workflows.

SkyWater’s advanced packaging operation in Florida has been accredited as a Category 1A Trusted Supplier by the Defense Microelectronics Activity (DMEA) of the U.S. Department of Defense (DoD).

Videos of two CWE-focused sessions from CVE/FIRST VulnCon 2024 were made available on the CWE YouTube Channel.

The Cybersecurity and Infrastructure Security Agency (CISA) issued a number of alerts/advisories.


Supercomputing

Supercomputers are battling to be top dog.

The Frontier supercomputer at Oak Ridge National Laboratory (ORNL) retained the top spot on the Top500 list of the world’s fastest systems with an HPL score of 1.206 EFlop/s. The as-yet incomplete Aurora system at Argonne took second place, becoming the world’s second exascale system at 1.012 EFlop/s. The Green500 list, which tracks energy efficiency of compute, saw three new entrants take the top places.

Cerebras Systems, Sandia National Laboratory, Lawrence Livermore National Laboratory, and Los Alamos National Laboratory used Cerebras’ second generation Wafer Scale Engine to perform atomic scale molecular dynamics simulations at the millisecond scale, which they claim is 179X faster than the Frontier supercomputer.

UT Austin‘s Stampede3 Supercomputer is now in full production, serving the open science community through 2029.


Education and Training

SEMI announced the SEMI University Semiconductor Certification Programs to help alleviate the workforce skills gap. Its first two online courses are designed for new talent seeking careers in the industry, and experienced workers looking to keep their skills current.  Also, SEMI and other partners launched a European Chip Skills Academy Summer School in Italy.

Siemens created an industry credential program for engineering students that supplements a formal degree by validating industry knowledge and skills. Nonprofit agency ABET will provide accreditation. The first two courses are live at the University of Colorado Boulder (CU Boulder) and a series is planned with Pennsylvania State University (Penn State).

Syracuse University launched a $20 million Center for Advanced Semiconductor Manufacturing, with co-funding from Onondaga County.

Starting young is a good thing. An Arizona school district, along with the University of Arizona, is creating a semiconductor program for high schoolers.


Product News

Siemens and Sony partnered to enable immersive engineering via a spatial content creation system, NX Immersive Designer, which includes Sony’s XR head-mounted display. The integration of hardware and software gives designers and engineers natural ways to interact with a digital twin. Siemens also extended its Xcelerator as a Service portfolio with solutions for product engineering and lifecycle management, cloud-based high-performance simulation, and manufacturing operations management. It will be available on Microsoft Azure, as well.

Advantest announced the newest addition to its portfolio of power supplies for the V93000 EXA Scale SoC test platform. The DC Scale XHC32 power supply offers 32 channels with single-instrument total current of up to 640A.

Fig. 1: Advantest’s DC Scale XHC32. Source: Advantest

Infineon released its XENSIV TLE49SR angle sensors, which can withstand stray magnetic fields of up to 8 mT, making them suited for safety-critical automotive chassis systems.

Google debuted its sixth generation Cloud TPU, 4.7X faster and 67% more energy-efficient than the previous generation, with double the high-bandwidth memory.

X-Silicon uncorked a RISC-V vector CPU, coupled with a Vulkan-enabled GPU ISA and AI/ML acceleration in a single processor core, aimed at embedded and IoT applications.

IBM expanded its Qiskit quantum software stack, including the stable release of its SDK for building, optimizing, and visualizing quantum circuits.

Northeastern University announced the general availability of testing and integration solutions for Open RAN through the Open6G Open Testing and Integration Center (Open 6G OTIC).


Research

The University of Glasgow received £3 million (~$3.8M) from the Engineering and Physical Sciences Research Council (EPSRC)’s Strategic Equipment Grant scheme to help establish “Analogue,” an Automated Nano Analysing, Characterisation and Additive Packaging Suite to research silicon chip integration and packaging.

EPFL researchers developed scalable photonic ICs, based on lithium tantalate.

DISCO developed a way to increase the diameter of diamond wafers that uses the KABRA process, a laser ingot slicing method.

CEA-Leti developed two complementary approaches for high performance photon detectors — a mercury cadmium telluride-based avalanche photodetector and a superconducting single photon detector.

Toshiba demonstrated storage capacities of over 30TB with two next-gen large capacity recording technologies for hard disk drives (HDDs): Heat Assisted Magnetic Recording (HAMR) and Microwave Assisted Magnetic Recording (MAMR).

Caltech neuroscientists reported that their brain-machine interface (BMI) worked successfully in a second human patient, following 2022’s first instance, proving the device is not dependent on one particular brain or one location in a brain.

Linköping University researchers developed a cheap, sustainable battery made from zinc and lignin, while ORNL researchers developed carbon-capture batteries.


Events and Further Reading

Find upcoming chip industry events here, including:

Event Date Location
European Test Symposium May 20 – 24 The Hague, Netherlands
NI Connect Austin 2024 May 20 – 22 Austin, Texas
ITF World 2024 (imec) May 21 – 22 Antwerp, Belgium
Embedded Vision Summit May 21 – 23 Santa Clara, CA
ASIP Virtual Seminar 2024 May 22 Online
Electronic Components and Technology Conference (ECTC) 2024 May 28 – 31 Denver, Colorado
Hardwear.io Security Trainings and Conference USA 2024 May 28 – Jun 1 Santa Clara, CA
SW Test Jun 3 – 5 Carlsbad, CA
IITC2024: Interconnect Technology Conference Jun 3 – 6 San Jose, CA
VOICE Developer Conference Jun 3 – 5 La Jolla, CA
CHIPS R&D Standardization Readiness Level Workshop Jun 4 – 5 Online and Boulder, CO
Find All Upcoming Events Here

Upcoming webinars are here.


Semiconductor Engineering’s latest newsletters:

Automotive, Security and Pervasive Computing
Systems and Design
Low Power-High Performance
Test, Measurement and Analytics
Manufacturing, Packaging and Materials

 

The post Chip Industry Week In Review appeared first on Semiconductor Engineering.

Chip Industry Week In Review

Synopsys refocused its security priorities around chips, striking a deal to sell off its Software Integrity Group subsidiary to private equity firms Clearlake Capital Group and Francisco Partners for about $2.1 billion. That deal comes on the heels of Synopsys’ recent acquisition of Intrinsic ID, which develops physical unclonable function IP. Sassine Ghazi, Synopsys’ president and CEO, said in an interview that the sale of the software group “gives us the ability to have management bandwidth, capital, and to double down on what we’re doing in our core business.”

The U.S. Commerce Department pulled export licenses from Intel and Qualcomm that permitted them to ship semiconductors to Huawei, the Financial Times reported. The move comes after advanced chips from Intel reportedly were used in new laptops and smartphones from the China-based company.

Apple debuted its second-generation 3nm M4 chip with the launch of the new iPad Pro. The CPU and GPU each have up to 10 cores, with a neural engine capable of 38 TOPS, and a total of 28 billion transistors. Apple also is working with TSMC to develop its own AI processors for running software in data centers, reports The Wall Street Journal.

The U.S. is expected to triple its semiconductor manufacturing capacity by 2032, according to a new report by the Semiconductor Industry Association and Boston Consulting. By that year, the U.S. is projected to have 28% of global capacity for advanced logic manufacturing and over a quarter of total global capital expenditures.

Fig. 1: Source: Semiconductor Industry Association and Boston Consulting Group.

Quick links to more news:

Global
Market Reports
Automotive
Security
Product News
Education and Training
Research
In-Depth
Events
Further Reading

Around The Globe

The U.S. Commerce Department plans to solicit bids from organizations interested in creating and managing a new CHIPS Manufacturing USA institute focused on digital twins in the semiconductor sector. The government will award up to $285 million to the selected proposal.

The U.S. National Science Foundation and Department of Energy announced the first 35 projects to be supported with computational time through the National Artificial Intelligence Research Resource (NAIRR) Pilot. The initial selected projects will gain access to several U.S. supercomputing centers and other resources, with the goal of advancing responsible AI research.

Through its new Federal AI Sandbox, MITRE is offering up its computing power to U.S. government agencies. “Our new Federal AI Sandbox will help level the playing field, making the high-quality compute power needed to train and test custom AI solutions available to any agency,” stated Charles Clancy, MITRE, senior vice president and chief technology officer, in the release.

Saudi Arabia’s $100 billion investment fund for semiconductor and AI technology pledged it would divest from China if requested by the U.S., Bloomberg reported.

Japan’s SoftBank is holding talks with UK-based AI Chip firm Graphcore about a possible acquisition, reports Bloomberg.

India’s chip industry is heating up. Mindgrove launched the country’s first SoC, named Secure IoT. The chip clocks at 700 MHz, and the company is touting its key security algorithms, secure boot, and on-chip OTP memory. Meanwhile, Lam Research is expanding its global semiconductor fabrication supply chain to include India.

Microsoft will build a $3.3 billion AI data center in Racine, Wisconsin, the same location as the failed Foxconn investment touted six years ago.

Markets And Money

The SIA announced first-quarter global semiconductor sales grew more than 15% YoY, still 5.7% below Q4 2023, but a big improvement over last year. Consider that the semiconductor materials market contracted 8.2% in 2023 to $66.7 billion, down from a record $72.7 billion in 2022, according to a new report from SEMI.

The demand for AI-powered consumer electronics will drive global AI chipset shipments to 1.3 billion by 2030, according to ABI Research.

TrendForce released several new industry reports this week. Among the highlights:

  • HBM prices are expected to increase by up to 10% in 2025, representing more than 30% of total DRAM value.
  • In Q2, DRAM contract prices rose 13% to 18%, while NAND flash prices increased 15% to 20%.
  • The top 10 design firms’ combined revenue increased 12% in 2023, with NVIDIA taking the lead for the first time.

A number of acquisitions were announced recently:

  • High-voltage IC company Power Integrations will purchase the assets of Odyssey Semiconductor Technologies, a developer of gallium nitride (GaN) transistors.
  • Mobix Labs agreed to buy RF design company RaGE Systems for $20 million in cash, stock, and incentives.
  • V-Tek, a packaging services and inspection company, acquired A&J Programming, a manufacturer of automated handling and programming equipment.

The global smartphone market grew 6% year-over-year, shipping 296.9 million units in Q1 2024, according to a Counterpoint report. Samsung toppled Apple for the top spot with a 20% share.

Automotive

The U.S. Justice Department is investigating whether Tesla committed securities or wire fraud by misleading consumers and investors about its EVs’ Autopilot capabilities, according to Reuters.

The automotive ecosystem is undergoing a huge transformation toward software-defined vehicles, spurring new architectures that can be future-proofed and customized with software.

Infineon introduced a microcontroller for the automotive battery management sector, integrating high-precision analog and high-voltage subsystems on a single chip. Infineon also inked a deal with China’s Xiaomi to provide SiC power modules for Xiaomi’s new SU7 smart EV.

Keysight and ETAS are teaming up to embed ETAS fuzz testing software into Keysight’s automotive cybersecurity platform.

Also, Keysight’s device security research lab, Riscure Security Solutions, can now conduct vehicle type approval evaluations under United Nations R155/R156 regulations. Keysight acquired Riscure in March.

Two autonomous driving companies received big funding. British AI company Wayve received a $1.05 billion Series C investment from SoftBank, with contributions from NVIDIA and Microsoft. Hyundai spent an additional $475 million on Motional, according to its recent earnings report.

The automotive imaging market grew to U.S. $5.7 billion in 2023 due to increased production, autonomy demand, and higher-resolution offerings.

Automotive Grade Linux (AGL), a collaborative cross-industry effort developing an open source platform for all software-defined vehicles (SDVs), released cloud-native functionality, RISC-V architecture support, and Flutter applications.

Security

SRAM security concerns are intensifying as a combination of new and existing techniques allow hackers to tap into data for longer periods of time after a device is powered down. This is particularly alarming as the leading edge of design shifts to heterogeneous systems in package, where chiplets frequently have their own memory hierarchy.

Machine learning is being used by hackers to find weaknesses in chips and systems, but it also is starting to be used to prevent breaches by pinpointing hardware and software design flaws.

TXOne Networks, a provider of cyber-physical systems security, raised $51 million in a Series B extension funding round.

The U.S. Department of Justice charged a Russian national for his role as the creator, developer, and administrator of LockBit, a prolific ransomware group that allegedly stole $100 million in payments from 2,000 victims.

The Cybersecurity and Infrastructure Security Agency (CISA) launched “We Can Secure Our World,” a new public awareness program promoting “basic cyber hygiene.” The agency also issued a number of alerts/advisories.

Product News

Siemens unveiled its Solido IP Validation Suite software, an automated quality assurance product designed to work across all design IP types and formats. The suite includes Solido Crosscheck and IPdelta software, which both provide in-view, cross-view and version-to-version QA checks.

proteanTecs announced its lifecycle monitoring solution is being integrated into SAPEON’s new AI processors.

SpiNNcloud Systems revealed its SpiNNaker2 system, an event-based AI supercomputing platform built from chips that each contain a mesh of 152 ARM-based cores. The platform can emulate 10 billion neurons while maintaining power efficiency and reliability.

Ansys partnered with Schrodinger to develop new computational materials. The collaboration will see Schrodinger’s molecular modeling technology used in Ansys’ simulation tools to evaluate performance ahead of the prototype phase.

Keysight introduced a pulse generator to its handheld radio frequency analyzer software options. The Option 357 pulse generator is downloadable on B- and C-Series FieldFox analyzers.

Education and Training

Semiconductor fever is hitting academia:

  • Penn State discussed its role in leading 15 universities to drive advances in chip integration and packaging.
  • Georgia Tech explained how its research spans all levels of the “semiconductor stack,” touting its 28,500 square feet of academic cleanroom space.
  • And in the past month Purdue University, Dassault Systems and Lam Research expanded an existing deal to use virtual twins and simulation tools in workforce development.

Arizona State University is beefing up its technology programs with new bachelor’s and doctoral degrees in robotics and autonomous systems.

Microsoft is partnering with Gateway Technical College in Wisconsin to create a Data Center Academy to train Wisconsinites for data center and STEM roles by 2030.

Research

Stanford-led researchers used ordinary-appearing glasses for an augmented reality headset, utilizing waveguide display techniques, holographic imaging, and AI.

UC Berkeley, LLNL, and MIT engineered miniaturized on-chip energy storage and power delivery, using an atomic-scale approach to modify electrostatic capacitors.

ORNL and other researchers observed a “surprising isotope effect in the optoelectronic properties of a single layer of molybdenum disulfide” when they substituted heavier isotope of molybdenum in the crystal.

Three U.S. national labs are partnering with NVIDIA to develop advanced memory technologies for high performance computing.

In-Depth

In addition to this week’s Automotive, Security and Pervasive Computing newsletter, here are more top stories and tech talk from the week:

Events

Find upcoming chip industry events here, including:

Event Date Location
ASMC: Advanced Semiconductor Manufacturing Conference May 13 – 16 Albany, NY
ISES Taiwan 2024: International Semiconductor Executive Summit May 14 – 15 New Taipei City
Ansys Simulation World 2024 May 14 – 16 Online
Women In Semiconductors May 16 Albany, NY
European Test Symposium May 20 – 24 The Hague, Netherlands
NI Connect Austin 2024 May 20 – 22 Austin, Texas
ITF World 2024 (imec) May 21 – 22 Antwerp, Belgium
Embedded Vision Summit May 21 – 23 Santa Clara, CA
ASIP Virtual Seminar 2024 May 22 Online
Electronic Components and Technology Conference (ECTC) 2024 May 28 – 31 Denver, Colorado
Hardwear.io Security Trainings and Conference USA 2024 May 28 – Jun 1 Santa Clara, CA
Find All Upcoming Events Here

Upcoming webinars are here.

Further Reading

Read the latest special reports and top stories, or check out the latest newsletters:

Automotive, Security and Pervasive Computing
Systems and Design
Low Power-High Performance
Test, Measurement and Analytics
Manufacturing, Packaging and Materials

The post Chip Industry Week In Review appeared first on Semiconductor Engineering.

Chip Industry Week In Review

Samsung and Synopsys collaborated on the first production tapeout of a high-performance mobile SoC design, including CPUs and GPUs, using the Synopsys.ai EDA suite on Samsung Foundry’s gate-all-around (GAA) process. Samsung plans to begin mass production of 2nm process GAA chips in 2025, reports BusinessKorea.

UMC developed the first radio frequency silicon on insulator (RF-SOI)-based 3D IC process for chips used in smartphones and other 5G/6G mobile devices. The process uses wafer-to-wafer bonding technology to address radio frequency interference between stacked dies and reduces die size by 45%.

Fig. 1: UMC’s 3D IC solution for RFSOI technology. Source: UMC

The first programmable chip capable of shaping, splitting, and steering beams of light is now being produced by Skywater Technology and Lumotive. The technology is critical for advancing lidar-based systems used in robotics, automotive, and other 3D sensing applications.

Driven by demand for AI chips, SK hynix revealed it already has booked its entire production of high-bandwidth memory chips for 2024 and is nearly sold out of its production capacity for 2025, reported the Korea Times. Meanwhile, SEMI reported that silicon wafer shipments declined 13% in Q1 2024, quarter over quarter, attributed to continued weakness in IC fab utilization and inventory adjustments.

PCI-SIG published the CopprLink Internal and External Cable specifications to provide PCIe 5.0 and 6.0 signaling at 32 and 64 GT/s and leverage standard connector form factors for applications including storage, data centers, AI/ML, and disaggregated memory.

The U.S. Department of Commerce (DoC) launched the CHIPS Women in Construction Framework to boost the participation of women and economically disadvantaged people in the workforce, aiming to support on-time and successful completion of CHIPS Act-funded projects. Intel and Micron adopted the framework.

Quick links to more news:

Market Reports
Global
In-Depth
Education and Training
Security
Product News
Quantum
Research
Events
Further Reading


Markets and Money

The SiC wafer processing equipment market is growing rapidly, reports Yole. SiC devices will exceed $10B by 2029 at a CAGR of 25%, and the SiC manufacturing tool market is projected to reach $5B by 2026.

imec.xpand launched a €300 million (~$321 million) fund that will invest in semiconductor and nanotechnology startups with the potential to push semiconductor innovation beyond traditional applications and drive next-gen technologies.

Blaize raised $106 million for its programmable graph streaming processor architecture suite and low-code/no-code software platform for edge AI.

Guerrilla RF completed the acquisition of Gallium Semiconductor‘s portfolio of GaN power amplifiers and front-end modules.

About 90% of connected cars sold in 2030 will have embedded 5G capability, reported Counterpoint. Also, about 75% of laptop PCs sold in 2027 will be AI laptop PCs with advanced generative AI, and the global high-level OS (HLOS) or advanced smartwatch market is predicted to grow 15% in 2024.


Global

Powerchip Semiconductor opened a new 300mm facility in northwestern Taiwan targeting the production of AI semiconductors. The facility is expected to produce 50,000 wafers per month at 55, 40, and 28nm nodes.

Taiwan-based KYEC Semiconductor will withdraw its China operations by the third quarter due to increasing geopolitical tensions, reports the South China Morning Post.

Japan will expand its semiconductor export restrictions to China related to four technologies: Scanning electron microscopes, CMOS, FD-SOI, and the outputs of quantum computers, according to TrendForce.

IBM will invest CAD$187 million (~US$137M) in Canada’s semiconductor industry, with the bulk of the investment focused on advanced assembly, testing, and packaging operations.

Microsoft will invest US$2.2 billion over the next four years to build Malaysia’s digital infrastructure, create AI skilling opportunities, establish an AI Center of Excellence, and enhance cybersecurity.


In-Depth

New stories and tech talks published by Semiconductor Engineering this week:


Security

Infineon collaborated with ETAS to integrate the ESCRYPT CycurHSM 3.x automotive security software stack into its next-gen AURIX MCUs to optimize security, performance, and functionality.

Synopsys released Polaris Assist, an AI-powered application security assistant on its Polaris Software Integrity Platform, combining LLM technology with application security knowledge and intelligence.

In security research:

U.S. President Biden signed a National Security Memorandum to enhance the resilience of critical infrastructure, and the White House announced key actions taken since Biden’s AI Executive Order, including measures to mitigate risk.

CISA and partners published a fact sheet on pro-Russia hacktivists who seek to compromise industrial control systems and small-scale operational technology systems in North American and European critical infrastructure sectors. CISA issued other alerts including two Microsoft vulnerabilities.


Education and Training

The U.S. National Institute for Innovation and Technology (NIIT) and the Department of Labor (DoL) partnered to celebrate the inaugural Youth Apprenticeship Week on May 5 to 11, highlighting opportunities in critical industries such as semiconductors and advanced manufacturing.

SUNY Poly received an additional $4 million from New York State for its Semiconductor Processing to Packaging Research, Education, and Training Center.

The University of Pennsylvania launched an online Master of Science in Engineering in AI degree.

The American University of Armenia celebrated its 10-year collaboration with Siemens, which provides AUA’s Engineering Research Center with annual research grants.


Product News

Renesas and SEGGER Embedded Studio launched integrated code generator support for its 32-bit RISC-V MCU. 

Rambus introduced a family of DDR5 server Power Management ICs (PMICs), including an extreme current device for high-performance applications.

Fig. 2: Rambus’ server PMIC on DDR5 RDIMM. Source: Rambus

Keysight added capabilities to Inspector, part of the company’s recently acquired device security research and test lab Riscure, that are designed to test the robustness of post-quantum cryptography (PQC) and help device and chip vendors identify and fix hardware vulnerabilities. Keysight also validated new conformance test cases for narrowband IoT non-terrestrial networks standards.

Ansys’ RedHawk-SC and Totem power integrity platforms were certified for TSMC‘s N2 nanosheet-based process technology, while its RaptorX solution for on-chip electromagnetic modeling was certified for TSMC’s N5 process.

Netherlands-based athleisure brand PREMIUM INC selected CLEVR to implement Siemens’ Mendix Digital Lifecycle Management for Fashion & Retail solution.

Micron will begin shipping high-capacity DRAM for AI data centers.

Microchip uncorked radiation-tolerant SoC FPGAs for space applications that use a real-time Linux-capable RISC-V-based microprocessor subsystem.


Quantum

University of Chicago researchers developed a system to boost the efficiency of quantum error correction using a framework based on quantum low-density parity-check (qLDPC) codes and new hardware involving reconfigurable atom arrays.

PsiQuantum will receive AUD $940 million (~$620 million) in equity, grants, and loans from the Australian and Queensland governments to deploy a utility-scale quantum computer in the regime of 1 million physical qubits in Brisbane, Australia.

Japan-based RIKEN will co-locate IBM’s Quantum System Two with its Fugaku supercomputer for integrated quantum-classical workflows in a heterogeneous quantum-HPC hybrid computing environment. Fugaku is currently one of the world’s most powerful supercomputers.

QuEra Computing was awarded a ¥6.5 billion (~$41 million) contract by Japan’s National Institute of Advanced Industrial Science and Technology (AIST) to deliver a gate-based neutral-atom quantum computer alongside AIST’s ABCI-Q supercomputer as part of a quantum-classical computing platform.

Novo Holdings, the controlling stakeholder of pharmaceutical company Novo Nordisk, plans to boost the quantum technology startup ecosystem in Denmark with DKK 1.4 billion (~$201 million) in investments.

The University of Sydney received AUD $18.4 million (~$12 million) from the Australian government to help grow the quantum industry and ecosystem.

The European Commission plans to spend €112 million (~$120 million) to support AI and quantum research and innovation.


Research

Intel researchers developed a 300-millimeter cryogenic probing process to collect high-volume data on the performance of silicon spin qubit devices across whole wafers using CMOS manufacturing techniques.

EPFL researchers used a form of ML called deep reinforcement learning (DRL) to train a four-legged robot to avoid falls by switching between walking, trotting, and pronking.

University of Cambridge researchers developed tiny, flexible nerve cuff devices that can wrap around individual nerve fibers without damaging them, which could be useful for treating a range of neurological disorders.

Argonne National Laboratory and Toyota are exploring a direct recycling approach that carefully extracts components from spent batteries. Argonne is also working with Talon Metals on a process that could increase the number of EV batteries produced from mined nickel ore.


Events

Find upcoming chip industry events here, including:

Event Date Location
IEEE International Symposium on Hardware Oriented Security and Trust (HOST) May 6 – 9 Washington DC
MRS Spring Meeting & Exhibit May 7 – 9 Virtual
ASMC: Advanced Semiconductor Manufacturing Conference May 13 – 16 Albany, NY
ISES Taiwan 2024: International Semiconductor Executive Summit May 14 – 15 New Taipei City
Ansys Simulation World 2024 May 14 – 16 Online
NI Connect Austin 2024 May 20 – 22 Austin, Texas
ITF World 2024 (imec) May 21 – 22 Antwerp, Belgium
Embedded Vision Summit May 21 – 23 Santa Clara, CA
ASIP Virtual Seminar 2024 May 22 Online
Electronic Components and Technology Conference (ECTC) 2024 May 28 – 31 Denver, Colorado
Hardwear.io Security Trainings and Conference USA 2024 May 28 – Jun 1 Santa Clara, CA
Find All Upcoming Events Here

Upcoming webinars are here.


Further Reading

Read the latest special reports and top stories, or check out the latest newsletters:

Systems and Design
Low Power-High Performance
Test, Measurement and Analytics
Manufacturing, Packaging and Materials
Automotive, Security and Pervasive Computing

The post Chip Industry Week In Review appeared first on Semiconductor Engineering.

Fundamental Issues In Computer Vision Still Unresolved

Given computer vision’s place as the cornerstone of an increasing number of applications, from ADAS to medical diagnosis and robotics, it is critical that its weak points be mitigated, such as the inability to identify corner cases or algorithms trained on shallow datasets. While well-known bloopers are often the result of human decisions, there are also fundamental technical issues that require further research.

“Computer vision” and “machine vision” were once used nearly interchangeably, with machine vision most often referring to the hardware embodiment of vision, such as in robots. Computer vision (CV), which started as the academic amalgam of neuroscience and AI research, has now become the dominant idea and preferred term.

“In today’s world, even the robotics people now call it computer vision,” said Jay Pathak, director, software development at Ansys. “The classical computer vision that used to happen outside of deep learning has been completely superseded. In terms of the success of AI, computer vision has a proven track record. Anytime self-driving is involved, any kind of robot that is doing work — its ability to perceive and take action — that’s all driven by deep learning.”

The original intent of CV was to replicate the power and versatility of human vision. Because vision is such a basic sense, the problem seemed like it would be far easier than higher-order cognitive challenges, like playing chess. Indeed, in the canonical anecdote about the field’s initial naïve optimism, Marvin Minsky, co-founder of the MIT AI Lab, having forgotten to include a visual system in a robot, assigned the task to undergraduates. But instead of being quick to solve, the problem consumed a generation of researchers.

Both academic and industry researchers work on problems that roughly can be split into three categories:

  • Image capture: The realm of digital cameras and sensors. It may use AI for refinements or it may rely on established software and hardware.
  • Image classification/detection: A subset of AI/ML that uses image datasets as training material to build models for visual recognition.
  • Image generation: The most recent work, which uses tools like LLMs to create novel images, and with the breakthrough demonstration of OpenAI’s Sora, even photorealistic videos.

Each one alone has spawned dozens of PhD dissertations and industry patents. Image classification/detection, the primary focus of this article, underlies ADAS, as well as many inspection applications.

The change from lab projects to everyday uses came as researchers switched from rules-based systems that simulated visual processing as a series of if/then statements (if red and round, then apple) to neural networks (NNs), in which computers learned to derive salient features by training on image datasets. NNs are basically layered graphs. The earliest models, the McCulloch-Pitts neuron of 1943 and the perceptron that followed it, were single-layer simulations of a biological neuron, which is one element in a vast network of interconnecting brain cells. Neurons have inputs (dendrites) and outputs (axons), driven by electrical and chemical signaling. The perceptron and its descendant neural networks emulated the form but skipped the chemistry, instead focusing on electrical signals with algorithms that weighted input values. Over the decades, researchers refined different forms of neural nets with vastly increased inputs and layers, which eventually became the deep learning networks that underlie the current advances in AI.
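
A minimal sketch of that weighted-input idea, written in present-day Python rather than the original formulation; the weights and threshold below are arbitrary values chosen so the toy neuron behaves as a logical AND gate:

```python
import numpy as np

def perceptron(inputs: np.ndarray, weights: np.ndarray, bias: float) -> int:
    """One artificial neuron: weight each input, sum, then apply a hard threshold."""
    activation = float(np.dot(weights, inputs)) + bias
    return 1 if activation > 0 else 0

# Toy example: weights and bias chosen so the neuron acts as an AND gate.
weights = np.array([1.0, 1.0])
bias = -1.5
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, "->", perceptron(np.array(x, dtype=float), weights, bias))
```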

The most recent forms of these network models are convolutional neural networks (CNNs) and transformers. In highly simplified terms, the primary difference between them is that CNNs are very good at distinguishing local features, while transformers perceive a more globalized picture.
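
As a toy illustration of that local-versus-global distinction (the array size and kernel are arbitrary, chosen only for demonstration), a convolution output depends on a small neighborhood of the input, while an attention-style output mixes every position:

```python
import numpy as np

x = np.random.randn(100)               # 1-D stand-in for a row of image features

# Convolution: each output position sees only a local window (3 taps here).
kernel = np.array([0.25, 0.5, 0.25])
conv_out = np.convolve(x, kernel, mode="same")

# Attention-style mixing: each output position is a weighted sum over all positions.
scores = np.outer(x, x)
scores -= scores.max(axis=1, keepdims=True)   # numerical stability before softmax
weights = np.exp(scores)
weights /= weights.sum(axis=1, keepdims=True)
attn_out = weights @ x                        # each row mixes all 100 positions

print(conv_out.shape, attn_out.shape)
```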

Thus, transformers are a natural evolution from CNNs and recurrent neural networks, as well as long short-term memory approaches (RNNs/LSTMs), according to Gordon Cooper, product marketing manager for Synopsys’ embedded vision processor family.

“You get more accuracy at the expense of more computations and parameters. More data movement, therefore more power,” said Cooper. “But there are cases where accuracy is the most important metric for a computer vision application. Pedestrian detection comes to mind. While some vision designs still will be well served with CNNs, some of our customers have determined they are moving completely to transformers. Ten years ago, some embedded vision applications that used DSPs moved to NNs, but there remains a need for both NNs and DSPs in a vision system. Developers still need a good handle on both technologies and are better served to find a vendor that can provide a combined solution.”

CNN-based neural networks began supplanting traditional CV techniques for object detection and recognition.

“While first implemented using hardwired CNN accelerator hardware blocks, many of those CNN techniques then quickly migrated to programmable solutions on software-driven NPUs and GPNPUs,” said Aman Sikka, chief architect at Quadric.

Two parallel trends continue to reshape CV systems. “The first is that transformer networks for object detection and recognition, with greater accuracy and usability than their convolution-based predecessors, are beginning to leave the theoretical labs and enter production service in devices,” Sikka explained. “The second is that CV experts are reinventing the classical ISP functions with NN and transformer-based models that offer superior results. Thus, we’ve seen waves of ISP functionality migrating first from pure hardwired to C++ algorithmic form, and now into advanced ML network formats, with a modern design today in 2024 consisting of numerous machine-learning models working together.”

CV for inspection
While CV is well-known for its essential role in ADAS, another primary application is inspection. CV has helped detect everything from cancerous tumors to manufacturing errors to, in the case of IBM’s productized research, critical flaws in the built environment. For example, a drone equipped with the IBM system could check whether a bridge has cracks, a far safer and more precise way to perform visual inspection than having a human climb to dangerous heights.

By combining visual transformers with self-supervised learning, IBM has vastly reduced the annotation requirement. In addition, the company has introduced a new process named “visual prompting,” in which the AI can be taught to make the correct distinctions with limited supervision by using “in-context learning,” such as a scribble as a prompt. The end goal is a system that can respond to LLM-like prompts, such as “find all six-inch cracks.”

“Even if it makes mistakes and needs the help of human annotations, you’re doing far less labeling work than you would with traditional CNNs, where you’d have to do hundreds if not thousands of labels,” said Jayant Kalagnanam, director, AI applications at IBM Research.

Beware the humans
Ideally, domain-specific datasets should increase the accuracy of identification. They are often created by expanding on foundation models already trained on general datasets, such as ImageNet. Both types of datasets are subject to human and technical biases. Google’s infamous racial identification gaffes resulted from both technical issues and subsequent human overcorrections.

Meanwhile, IBM was working on infrastructure identification, and the company’s experience of getting its model to correctly identify cracks, including the problem of having too many images of one kind of defect, suggests a potential solution to the bias problem: allowing the inclusion of contradictory annotations.

“Everybody who is not a civil engineer can easily say what a crack is,” said Cristiano Malossi, IBM principal research scientist. “Surprisingly, when we discuss which crack has to be repaired with domain experts, the amount of disagreement is very high because they’re taking different considerations into account and, as a result, they come to different conclusions. For a model, this means if there’s ambiguity in the annotations, it may be because the annotations have been done by multiple people, which may actually have the advantage of introducing less bias.”

Fig. 1: IBM’s Self-supervised learning model. Source: IBM

Corner cases and other challenges to accuracy
The set of possible images is effectively infinite, which in practical terms leaves most computer vision systems vulnerable to corner cases, potentially with fatal results, noted Alan Yuille, Bloomberg distinguished professor of cognitive science and computer science at Johns Hopkins University.

“So-called ‘corner cases’ are rare events that likely aren’t included in the dataset and may not even happen in everyday life,” said Yuille. “Unfortunately, all datasets have biases, and algorithms aren’t necessarily going to generalize to data that differs from the datasets they’re trained on. And one thing we have found with deep nets is if there is any bias in the dataset, the deep nets are wonderful at finding it and exploiting it.”

Thus, corner cases remain a problem to watch for. “A classic example is the idea of a baby in the road. If you’re training a car, you’re typically not going to have many examples of images with babies in the road, but you definitely want your car to stop if it sees a baby,” said Yuille. “If the companies are working in constrained domains, and they’re very careful about it, that’s not necessarily going to be a problem for them. But if the dataset is in any way biased, the algorithms may exploit the biases and corner cases, and may not be able to detect them, even if they may be of critical importance.”

This includes instances, such as real-world weather conditions, where an image may be partly occluded. “In academic cases, you could have algorithms that when evaluated on standard datasets like ImageNet are getting almost perfect results, but then you can give them an image which is occluded, for example, by a heavy rain,” he said. “In cases like that, the algorithms may fail to work, even if they work very well under normal weather conditions. A term for this is ‘out of domain.’ So you train in one domain and that may be cars in nice weather conditions, you test in out of domain, where there haven’t been many training images, and the algorithms would fail.”
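
One common way to quantify that out-of-domain degradation is to score the same classifier on clean images and on artificially corrupted copies of them. A hedged sketch follows, in which `model`, `clean_images`, and `labels` are placeholders rather than any specific system discussed here:

```python
import numpy as np

def occlude(image: np.ndarray, fraction: float = 0.3) -> np.ndarray:
    """Simulate heavy occlusion (e.g., rain or debris) by zeroing random pixels."""
    corrupted = image.copy()
    mask = np.random.rand(*image.shape[:2]) < fraction
    corrupted[mask] = 0
    return corrupted

def accuracy(model, images, labels) -> float:
    """Fraction of images the classifier labels correctly."""
    predictions = [model(img) for img in images]
    return float(np.mean([p == y for p, y in zip(predictions, labels)]))

# In-domain vs. out-of-domain evaluation of the same classifier:
# in_domain  = accuracy(model, clean_images, labels)
# out_domain = accuracy(model, [occlude(img) for img in clean_images], labels)
# A large gap between the two scores is the "out of domain" failure described above.
```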

The underlying reasons go back to the fundamental challenge of trying to replicate a human brain’s visual processing in a computer system.

“Objects are three-dimensional entities. Humans have this type of knowledge, and one reason for that is humans learn in a very different way than machine learning AI algorithms,” Yuille said. “Humans learn over a period of several years, where they don’t only see objects. They play with them, they touch them, they taste them, they throw them around.”

By contrast, current algorithms do not have that type of knowledge.

“They are trained as classifiers,” said Yuille. “They are trained to take images and output a class label — object one, object two, etc. They are not trained to estimate the 3D structure of objects. They have some sort of implicit knowledge of some aspects of 3D, but they don’t have it properly. That’s one reason why if you take some of those models, and you’ve contaminated the images in some way, the algorithms start degrading badly, because the vision community doesn’t have datasets of images with 3D ground truth. Only for humans, do we have datasets with 3D ground truth.”

Hardware implementation challenges
The hardware side is becoming a bottleneck, as academics and industry work to resolve corner cases and create ever-more comprehensive and precise results. “The complexity of the operation behind the transformer is quadratic,” said Malossi. “As a result, they don’t scale linearly with the size of the problem or the size of the model.”
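
A back-of-the-envelope sketch of that quadratic growth (the model width here is an arbitrary example, not a figure cited by Malossi): because every token attends to every other token, doubling the sequence length quadruples the number of pairwise attention scores.

```python
# Rough count of the pairwise computations behind naive self-attention.
d_model = 64                                       # illustrative model width
for seq_len in (256, 512, 1024, 2048):
    score_entries = seq_len * seq_len              # one score per token pair
    approx_macs = score_entries * d_model          # dot product of length d_model per pair
    print(f"seq_len={seq_len:5d}  score entries={score_entries:>10,}  ~MACs={approx_macs:>13,}")
```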

While the situation might be improved with a more scalable iteration of transformers, for now progress has been stalled as the industry looks for more powerful hardware or any suitable hardware. “We’re at a point right now where progress in AI is actually being limited by the supply of silicon, which is why there’s so much demand, and tremendous growth in hardware companies delivering AI,” said Tony Chan Carusone, CTO of Alphawave Semi. “In the next year or two, you’re going to see more supply of these chips come online, which will fuel rapid progress, because that’s the only thing holding it back. The massive investments being made by hyperscalers are evidence of the backlogs in delivering silicon. People wouldn’t be lining up to write big checks unless there were very specific projects they had ready to run as soon as they get the silicon.”

As more AI silicon is developed, designers should think holistically about CV, since visual fidelity depends not only on sophisticated algorithms, but also on image capture by a chain of co-optimized hardware and software, according to Pulin Desai, group director of product marketing and management for Tensilica vision, radar, lidar, and communication DSPs at Cadence. “When you capture an image, you have to look at the full optical path. You may start with a camera, but you’ll likely also have radar and lidar, as well as different sensors. You have to ask questions like, ‘Do I have a good lens that can focus on the proper distance and capture the light? Can my sensor perform the DAC correctly? Will the light levels be accurate? Do I have enough dynamic range? Will noise cause the levels to shift?’ You have to have the right equipment and do a lot of pre-processing before you send what’s been captured to the AI. Remember, as you design, don’t think of it as a point solution. It’s an end-to-end solution. Every different system requires a different level of full path, starting from the lens to the sensor to the processing to the AI.”

One of the more important automotive CV applications is passenger monitoring, which can help reduce the tragedies of parents forgetting children who are strapped into child seats. But such systems depend on sensors, which can be challenged by noise to the point of being ineffective.

“You have to build a sensor so small it goes into your rearview mirror,” said Jayson Bethurem, vice president of marketing and business development at Flex Logix. “Then the issue becomes the conditions of your car. The car can have the sun shining right in your face, saturating everything, to the complete opposite, where it’s completely dark and the only light in the car is emitting off your dashboard. For that sensor to have that much dynamic range and the level of detail that it needs to have, that’s where noise creeps in, because you can’t build a sensor of that much dynamic range to be perfect. On the edges, or when it’s really dark or oversaturated bright, it’s losing quality. And those are sometimes the most dangerous times.”

Breaking into the black box
Finally, yet another serious concern for computer vision systems is that they cannot be exhaustively tested. Transformers, especially, are a notorious black box.

“We need to have algorithms that are more interpretable so that we can understand what’s going on inside them,” Yuille added. “AI will not be satisfactory till we move to a situation where we evaluate algorithms by being able to find the failure mode. In academia, and I hope companies are more careful, we test them on random samples. But if those random samples are biased in some way — and often they are — they may discount situations like the baby in the road, which don’t happen often. To find those issues, you’ve got to let your worst enemy test your algorithm and find the images that break it.”

Related Reading
Dealing With AI/ML Uncertainty
How neural network-based AI systems perform under the hood is currently unknown, but the industry is finding ways to live with a black box.

The post Fundamental Issues In Computer Vision Still Unresolved appeared first on Semiconductor Engineering.

Blog Review: May 1

Cadence’s Vatsal Patel stresses the importance of having testing and training capabilities for high-bandwidth memory to prevent the entire SoC from becoming useless and points to key HBM DRAM test instructions through IEEE 1500.

In a podcast, Siemens’ Stephen V. Chavez chats with Anaya Vardya of American Standard Circuits about the growing significance of high density interconnect and Ultra HDI technologies, which enable denser component placement and increased signal integrity compared to traditional PCB designs.

Synopsys’ Ian Land and Randy Fish find that silicon lifecycle management is increasingly being used on chips that target the aerospace and government market to ensure system health and longevity.

Arm’s Hristo Belchev looks at how to enable testing of system designs using the Memory Partitioning and Monitoring (MPAM) Arm architecture supplement, which allows privileged software to partition caches, memory controllers and interconnects on the hardware level.

Keysight’s Jonathon Wright considers where generative AI can add value in software testing by proposing a wide range of scenarios and improving communication between different stakeholders.

Ansys’ Laura Carter checks out how simulation is used to reduce the risks to drivers during a crash in stock car racing.

SEMI’s Maria Daniela Perez chats with Owen J. Guy of Swansea University about the challenge of onboarding talent within the microelectronics industry and the importance of ensuring students receive hands-on experience and exposure to real-world applications.

And don’t miss the blogs featured in the latest Systems & Design newsletter:

Technology Editor Brian Bailey suggests that although it is great to see the DAC conference come back to life, EDA companies need to do something about the show floor.

Siemens’ John Ferguson shows how to glean useful information well before all the details of an assembly are known.

Axiomise’ Ashish Darbari explains how formal verification can help improve chips.

Arteris’ Frank Schirrmeister tracks the race to centralized computing in automotive.

Synopsys’ Andrew Appleby explores the co-optimization of foundation IP and design flows for new transistors.

Cadence’s Anika Sunda looks at controlling the access to physical memory addresses.

Keysight’s Ben Coffin digs into how AI will be used in just about every subsystem of 6G networks.

The post Blog Review: May 1 appeared first on Semiconductor Engineering.

Chip Industry Week In Review

By Adam Kovac, Gregory Haley, and Liz Allan.

Cadence plans to acquire BETA CAE Systems for $1.24 billion, the latest volley in a race to sell multi-physics simulation and analysis across a broad set of customers with deep pockets. Cadence said the deal opens the door to structural analysis for the automotive, aerospace, industrial, and health care sectors. Under the terms of the agreement, 60% of the purchase would be paid in cash, and the remainder in stock.

South Korea’s National Intelligence Service reported that North Korea was targeting cyberattacks at domestic semiconductor equipment companies, using a “living off the land” approach, in which attackers rely on legitimate tools and applications already installed on the server rather than deploying malware. That makes an attack more difficult to spot. According to the government, “In December last year, Company A, and in February this year, Company B, had their configuration management server and security policy server hacked, respectively, and product design drawings and facility site photos were stolen.”

As the memory market goes, so goes the broader chip industry. Last quarter, and heading into early 2024, both markets began showing signs of sustainable growth. DRAM revenue jumped 29.6% in Q4 for a total of $17.46 billion. TrendForce attributed some of that to new efforts to stockpile chips and strategic production control. NAND flash revenue was up 24.5% in Q4, with solid growth expected to continue into the first part of this year, according to TrendForce. Revenue for the sector topped $11.4 billion in Q4, and it’s expected to grow another 20% this quarter. SSD prices rebounded in Q4, as well, up 15% to $23.1 billion. Across the chip industry, sales grew 15.2% in January compared to the same period in 2023, according to the Semiconductor Industry Association (SIA). This is the largest increase since May 2022, and that trend is expected to continue throughout 2024 with double-digit growth compared to 2023.

Marvell said it is working with TSMC to develop a technology platform for the rapid deployment of analog, mixed-signal, and foundational IP. The company plans to sell both custom and commercial chiplets at 2nm.

The Dutch government is concerned that ASML, the only maker of EUV/high-NA EUV lithography equipment in the world, is considering leaving the Netherlands, according to De Telegraaf.

Quick links to more news:

Design and Power
Manufacturing and Test
Automotive and Batteries
Security
Pervasive Computing and AI
Events

Design and Power

AMD appears to have hit a roadblock with the U.S. Department of Commerce (DoC) over a new AI chip it designed for the Chinese market, as reported by Bloomberg. U.S. officials told the company the new chip is too powerful to be sold without a license.

JEDEC released its new memory standard as a free download on its website. The JESD239 Graphics Double Data Rate 7 (GDDR7) SGRAM standard can reach speeds of 192 GB/s per device and improves signal-to-noise ratio.

Accellera rolled out the IEEE Std. 1800‑2023 Standard for SystemVerilog—Unified Hardware Design, Specification, and Verification Language, which is now available for free download. The decision to offer it at no cost is due to Accellera’s participation in the IEEE GET Program, which was founded in 2010 with the intention of providing open access to some standards. Accellera also announced it had approved for release the Verilog-AMS 2023 standard, which offers enhancements to analog constructs, dynamic tolerance for event control statements, and other upgrades.

Chiplets are a hot topic these days. Six industry experts discuss chiplet standards, interoperability, and the need for highly customized AI chiplets.

Optimizing EDA hardware for the cloud can shorten the time required for large and complex simulations, but not all workloads will benefit equally, and much more can be done to improve those that can.

Flex Logix is developing InferX DSP for use with existing EFLX eFPGA from 40nm to 7nm. InferX achieves about 30 times the DSP performance/mm² of eFPGA.

The number of challenges is growing in power semiconductors, just as it is in traditional chips. This tech talk looks at integrating power semiconductors with other devices, different packaging impacts, and how these devices will degrade over time.

Vultr announced it will use NVIDIA’s HGX H100 GPU clusters to expand its Seattle-based cloud data center. The company said the expansion, which will be powered by hydroelectricity, will make the facility one of the cleanest, most power efficient data centers in the country.

Amazon Web Services will expand its presence in Saudi Arabia, announcing a new $5.3 billion infrastructure region in the country that will launch in 2026. The new region will offer developers, entrepreneurs and companies access to healthcare, education and other services.

Google is teaming up with the Geneva Science and Diplomacy Anticipator (GESDA) to launch the XPRIZE Quantum Applications, with $5 million in prizes for winners who can demonstrate ways to use quantum computing to solve real-world problems. Teams must submit a proposal that includes analysis of how long their algorithm would need to run before reaching a solution to a problem, such as improving drug development or designing new battery materials.

South Korea’s nepes corporation has turned to Siemens EDA for solutions in the development of advanced 3D-IC packages. The deal will see nepes incorporating several Siemens technologies, including the Calibre nmPlatform, Hyperlynx software and Xpedition Substrate Integrator software.

Siemens also formalized a partnership with Nuclei System Technology in which the two companies will work together on solution support for Nuclei’s RISC-V processor cores. The collaboration will allow clients to monitor CPU program execution in real time via Nuclei’s RISC-V CPU IP.

Keysight and ETS-Lindgren announced a breakthrough test solution for cellular devices using non-terrestrial networks. The solution is capable of measuring and validating the performance of both the transmitter and receiver of devices capable of supporting the network.

Nearly fifty companies raised $800 million for power electronics, data center interconnects, and more last month.

Manufacturing and Test

SEMI Europe issued a position statement to the European Union, warning against additional export controls or rules on foreign investment. SEMI argued that free trade partnerships are a better method for ensuring security than bans or restrictions.

Revenues for the top five wafer fab equipment manufacturers declined 1% YoY in 2023 to $93.5 billion, according to Counterpoint Research. The drop was attributed to weak spending on memory, inventory adjustments, and low demand in consumer electronics. The tide is changing, though.

Bruker closed two acquisitions. One involved Chemspeed Technologies, a Switzerland-based provider of automated laboratory R&D and QC workflow solutions. The second involved Phasefocus, an image processing company based in the UK.

A Swedish company, SCALINQ, released a commercially available large-scale packaging solution capable of controlling quantum devices with hundreds of qubits.

Solid Sands, a provider of testing and qualification technology for compilers and libraries, will partner with California-based Emprog to establish a representative presence in the U.S.

Automotive

Tesla halted production at its Brandenburg, Germany, gigafactory after an environmental activist group attacked an electricity pylon, reports the Guardian.

Stellantis will invest €5.6 billion (~$6.1B) in South America to support more than 40 new products, decarbonization technologies, and business opportunities.

The amount of data being collected, processed, and stored in vehicles is exploding, and so is the value of that data. That raises questions that are still not fully answered about how that data will be used, by whom, and how it will be secured.

While industry experts expect many benefits from V2X technology, there are still technological and social hurdles to cross. But there is progress.

Infineon released its next-gen silicon carbide (SiC) MOSFET trench technology with 650V and 1,200V options improving stored energies and charges by up to 20%, ideal for power semiconductor applications such as photovoltaics, energy storage, DC EV charging, motor drives, and industrial power supplies.

Hyundai selected Ansys to supply structural simulation solutions for vehicle body system analysis, providing end-to-end, predictively accurate capabilities for virtual performance validation.

ION Mobility used the Siemens Xcelerator portfolio for styling, mechanical engineering, and electric battery pack development for its ION M1-S electric motorbike.

Ethernovia sampled a family of automotive PHY transceivers that scale from 10 Gbps to 1 Gbps over 15 meters of automotive cabling.

The California Public Utilities Commission (CPUC) approved Waymo’s plan to expand its driverless robotaxi services to Los Angeles and other cities near San Francisco, reports Reuters.

By 2027, next-gen battery EVs (BEVs) will on average be cheaper to produce than comparable gas-powered cars, reports Gartner. But the firm noted that average cost of EV accident repair will rise by 30%, and 15% of EV companies founded in the last decade will be acquired or bankrupt.

University of California San Diego (UCSD) researchers developed a cathode material for solid-state lithium-sulfur batteries that is electrically conductive and structurally healable.

ION Storage Systems announced its anodeless and compressionless solid-state batteries (SSBs) achieved 125 cycles with under 5% capacity degradation in performance. ION has been working with the U.S. Department of Defense (DoD) to test its SSB before expanding into markets such as EVs, energy storage, consumer electronics, and aerospace.

Security

Advanced process nodes and higher silicon densities are heightening DRAM’s susceptibility to Rowhammer attacks, as reduced cell spacing significantly decreases the hammer count needed for bit flips. A multi-layered, system-level approach is crucial to DRAM protection.

Researchers at Bar-Ilan University and Rafael Defense Systems proposed an analytical electromagnetic model for IC shielding against hardware attacks.

Keysight acquired the IP of Firmalyzer, whose firmware security analysis technology will be integrated into the Keysight IoT Security Assessment and Automotive Security solutions, providing analysis into what is happening inside the IoT device itself.

Flex Logix joined the Intel Foundry U.S. Military Aerospace Government (USMAG) Alliance, ensuring U.S. defense industrial base and government customers have access to the latest technology, enabling successful designs for mission critical programs.

The EU Council presidency and European Parliament reached a provisional agreement on a Cyber Solidarity Act and an amendment to the Cybersecurity Act (CSA) concerning managed security services.

The EU Agency for Cybersecurity (ENISA) and partners updated the compendium on elections cybersecurity in response to issues such as AI deep fakes, hacktivists-for-hire, the sophistication of threat actors, and the current geopolitical context.

The Cybersecurity and Infrastructure Security Agency (CISA) launched efforts to help secure the open source software ecosystem; updated its Public Safety Communications and Cyber Resiliency Toolkit; and issued other alerts including security advisories for VMware, Apple, and Cisco.

Pervasive Computing and AI

Johns Hopkins University engineers used natural language prompts and ChatGPT-4 to produce detailed instructions to build a spiking neural network (SNN) chip. The neuromorphic accelerators could power real-time machine intelligence for next-gen embodied systems like autonomous vehicles and robots.

The global AI hardware market size was estimated at $53.71 billion in 2023, and is expected to reach about $473.53 billion by 2033, at a compound annual growth rate of 24.5%, reports Precedence Research.

National Institute of Standards and Technology (NIST) researchers and partners built compact chips capable of converting light into microwaves, which could improve navigation, communication, and radar systems.

Fig. 1: NIST researchers test a chip for converting light into microwave signals. Pictured is the chip, which is the fluorescent panel that looks like two tiny vinyl records. The gold box to the left of the chip is the semiconductor laser that emits light to the chip. Credit: K. Palubicki/NIST

The Indian government is investing 103 billion rupees ($1.25B) in AI projects, including computing infrastructure and large language models (LLMs).

Infineon is collaborating with Qt Group, bringing Qt’s graphics framework to Infineon’s graphics-enabled TRAVEO T2G cluster MCUs to optimize graphical user interface (GUI) development.

Keysight leveraged fourth-generation AMD EPYC CPUs to develop a new benchmarking methodology to test mobile and 5G private network performance. The method uses realistic traffic generation to uncover a CPU’s true power and scalability while observing bandwidth requirements.

The AI industry is pushing a nuclear power revival, reports NBC, and Amazon bought a nuclear-powered data center in Pennsylvania from Talen Energy for $650 million, according to WNEP.

Bank of America was awarded 644 patents in 2023 for technology including information security, AI, machine learning (ML), online and mobile banking, payments, data analytics, and augmented and virtual reality (AR/VR).

Mistral AI’s large language model, Mistral Large, became available in the Snowflake Data Cloud for customers to securely harness generative AI with their enterprise data.

China’s smartphone unit sales declined 7% year over year in the first six weeks of 2024, with Apple declining 24%, reports Counterpoint.

Shipments of LCD TV panels are expected to reach 55.8 million units in Q1 2024, a 5.3% quarter over quarter increase, reports TrendForce. And an estimated 5.8 billion LED lamps and luminaires are expected to reach the end of their lifespan in 2024, triggering a wave of secondary replacements and boosting total LED lighting demand to 13.4 billion units.

Korea Institute of Science and Technology (KIST) researchers mined high-purity gold from electrical and electronic waste.

The San Diego Supercomputer Center (SDSC) and the University of Utah launched a National Data Platform pilot project, aimed at making access to and use of scientific data open and equitable.

Events

Find upcoming chip industry events here, including:

Event Date Location
ISS Industry Strategy Symposium Europe Mar 6 – 8 Vienna, Austria
GSA International Semiconductor Conference Mar 13 – 14 London
Device Packaging Conference (DPC 2024) Mar 18 – 21 Fountain Hills, AZ
GOMACTech Mar 18 – 21 Charleston, South Carolina
SNUG Silicon Valley Mar 20 – 21 Santa Clara, CA
SEMICON China Mar 20 – 22 Shanghai
OFC: Optical Communications & Networking Mar 24 – 28 Virtual; San Diego, CA
DATE: Design, Automation and Test in Europe Conference Mar 25 – 27 Valencia, Spain
SEMI Therm Mar 25- 28 San Jose, CA
MemCon Mar 26 – 27 Silicon Valley
All Upcoming Events

Upcoming webinars are here.

Further Reading and Newsletters

Read the latest special reports and top stories, or check out the latest newsletters:

Systems and Design
Low Power-High Performance
Test, Measurement and Analytics
Manufacturing, Packaging and Materials
Automotive, Security and Pervasive Computing

The post Chip Industry Week In Review appeared first on Semiconductor Engineering.

Integrating Digital Twins In Semiconductor Operations

By Mark da Silva, Nishita Rao and Karim Somani

Chipmakers must adopt transformative technologies including Digital Twins (DT) to keep pace with unprecedented global semiconductor industry growth that is expected to drive its total market value to $1 trillion[1] as soon as 2030. Leveraging predictive modeling and other efficiency-enhancing innovations, DTs promise to optimize semiconductor design, manufacturing processes and equipment maintenance while improving overall operational efficiency.

With DTs rising in prominence as a critical enabler of industry growth, key players from across the semiconductor ecosystem – including OEMs, platforms and end users – gathered at the Semiconductor Digital Twin Workshop last December at SEMI headquarters in Milpitas, Calif. to discuss the latest DT developments and explore the path to advancing the technology.

Following are highlights from the sold-out event hosted by the SEMI Smart Manufacturing Initiative.

Key takeaways

  • Industry Alignment on DT Definition and Taxonomy
    • The semiconductor industry needs to align on the definition and taxonomy of DTs in semiconductor operations.
    • With collaboration crucial to advances in DTs, the industry must come together to develop a common understanding of the technology.
  • Data Sharing for Sustainability Improvements
    • Sharing data among various chip ecosystem players will be vital to driving sustainability improvements.
    • Focusing on equipment and operational DTs with sustainability in mind will help foster collaboration among industry stakeholders.
  • Advocacy for Standardized DT Architecture and Framework
    • A standardized DT framework architecture must be established to enhance interoperability, reliability, synchronization, and security.
    • The adoption of digital twin technical standards is in its early stages but increasing in importance as DT technology evolves.
    • Collaboration will be essential to accelerate the availability and adoption of several digital twin technical standards under development by SEMI and other Standards Development Organizations (SDOs).

Key challenges

  • Robust DT Framework and Overcoming Development Silos
    • Establishing a robust DT framework and overcoming isolated development silos in microelectronics are challenges the industry must overcome.
  • Managing Unclean Factory Data
    • Challenges include managing unclean factory data, varying data granularity, and addressing the lifecycle of data models.
  • Sharing Data Between Tools and Process Steps
    • Data sharing between various semiconductor tools and process steps must be seamless. Data provenance is critical for DT accuracy and validation.
  • Legacy Factories & Small/Medium Firms
    • Factories with older generation tools and processes have a unique challenge in developing process level DTs for existing products.

Workshop sessions

The workshop consisted of four sessions focused on DT efforts by equipment makers, solution providers, device makers, and factory integration providers.

Equipment-level digital twins session

The session focused on OEM efforts to develop tool-level DTs and highlighted the potential to improve efficiency, performance, and sustainability. The session also featured discussions on equipment-level data sharing, standards, and interoperability challenges that need to be addressed. Speakers included IRDS Co-Chair Supika Mashiro of TEL, Ala Moradian of Applied Materials, Joseph Ervin of LAM Research, Sean Glazier of Onto Innovation, Basil Milton and Chan-Pin Chong of Kulicke & Soffa, and Mark Huntington of McKinsey & Company.

Session speakers: (L) Supika Mashiro, TEL, and (R) Ala Moradian, AMAT.

Speakers discussed existing DTs deployed in manufacturing such as Run-to-Run (R2R) control, virtual metrology, and predictive maintenance (PdM) and the need for standardized DTs that can communicate with each other. Tool-level DT solutions such as Applied Materials EcoTwin within the AppliedTwin platform provide a virtualized replica of chipmaking equipment for development and improvement of chip-level processes. The platform has also demonstrated extensibility to sustainability analysis, a significant development.

Other focus areas were the connectivity of DTs across different levels (tools to factories) and the use of AI to make them self-adjusting for manufacturing processes. The importance of DT infrastructure was emphasized, with ensuring clean and accessible data, data flow, and the communication needed to keep DTs synchronized raised as significant challenges. In the back-end, OEMs are making steady progress to virtualize various tools such as wire bonding. The session also highlighted DTs as a major investment across industries, with huge potential in chipmaking. Building a strong data sharing foundation is key to success.

Chamber process, operations and planning level digital twin session

The session was led by solution providers from across the semiconductor ecosystem that develop tools to facilitate DTs at various hierarchical levels. The providers offer a variety of products and services across areas such as process physics-based models, chamber processes, operations, as well as planning modelling approaches to help companies implement and manage DTs. The session included technical details of DT models and their potential impact on the entire manufacturing process.

While the session made clear that DTs promise to revolutionize the semiconductor industry, it must overcome significant technical development challenges of integrating DTs into day-to-day operations. Speakers included Sarbajit Ghosal of SC Solutions, Norman Chang of Ansys, Holland Smith of INFICON, Chandra Reddy of IBM Research, Jon Herlocker of TIGNIS, Ken Smerz of ZELUS and John Behnke of INFICON.

Speakers emphasized the need for fast, multi-physics-based (and data-assisted) accurate DTs for real-time control and monitoring and that react instantly to changes, just like physical equipment. Think of it as having a virtual process line that can predict how different processes will interact. Sitting on top of the DTs are AI-powered (physics and/or data-driven) models that can then be harnessed to optimize manufacturing processes and predict yield.

Speakers also discussed operational-level DTs and the need for a central hub for all factory operational data to boost efficiency, maximize productivity, and reduce waste – all critical as the number of fabs grows in the years ahead. Construction DTs for pre-construction planning when building new chip fabs or expanding brown-field sites provide a virtual blueprint that can help identify potential problems early on and minimize time to wafer starts. Lastly, how these various levels of DTs are integrated vertically within a factory plays a key role in making decisions about autonomous fabs.

Digital twin adoption and implementation session

The session was led by device makers and owners of fabs, where DTs are critical for improving productivity by predicting yield, quality, and efficiency. A process-level DT enables a virtual representation of a product’s process flow in the fab, and it can be used to speed integration efforts (MRL 5-7), simulate specific outcomes, and optimize operations. Imagine a future where chip fabs are run by AI agents, with virtual models predicting problems before they happen and optimizing processes on the fly. That’s the vision shared by the session’s expert speakers. Their insights painted a fascinating picture of what’s next for the semiconductor industry. Speakers included Professor H.-S. Philip Wong of Stanford University, Steven J Meyer of Intel, Jae Yong Park of Samsung, Rosa Javadi of JABIL, Professor Amit Lal and Peter Doerschuk of Cornell University, Ben Davaji of Northeastern University, Pushkar Apte of SEMI and Bobby Mitra of Deloitte.

Session speakers: (L) Steven J Meyer, Intel, and (R) Jae Yong Park, Samsung.

The key development target is advanced AI-assisted manufacturing in which three layers of virtual models (processes, tools, and the entire fab itself) work together seamlessly. This ambitious vision aligns with the National Semiconductor Technology Center (NSTC) DT Grand Challenge, which focuses on generating, sharing, and using data effectively. Intel’s AFS Software Suite, which includes high-speed simulators and graphical models to enable better planning and decision-making across multiple sites, is a real-world example of DTs used in today’s fabs.

Use cases of deploying AI to improve Automated Material Handling Systems (AMHS) asset utilization by 30% have also been demonstrated in real-world fab environments. The session highlighted the importance of scheduling with AI-powered DTs and standardizing data availability across the industry. Other impressive product development use case studies shared included a rapid COVID-19 tester system development and a global supply chain DT.

Speakers described how challenges such as infrastructure readiness, talent gaps, and data privacy concerns are slowing industrywide adoption. They also discussed efforts to develop an open-access academic cleanroom dedicated to developing and testing DT models for lithography and etching processes, with investigation of federated learning to address data privacy & sharing concerns. The experts characterized the hierarchy of DT types as a framework based on the ISA-95 standard to ensure seamless communication and collaboration between DTs across various levels, from process development to production. This interconnected approach could revolutionize chipmaking across the entire enterprise, as demonstrated by an example showing DTs spanning the enterprise.

Digital twin connectivity and platform integration session

The session focused on a variety of product and service offerings by cloud, facilities, and supply chain solution providers that help companies implement and manage DTs of various levels. These solutions include integration, connectivity, security and horizontal integration across the supply chain. Almost all speakers pointed to the importance of standardization efforts as crucial for future development. Speakers included Rad Desiraju of Microsoft, Gautham Unni of AWS, David Gross and Srividya Jayaram of Siemens, Slava Libman of FTD Solutions, Becky Kelderman of Rockwell Automation, Ram Walvekar of HCL Technologies, and Paul Trio of SEMI International Standards.

Session speakers touched on definitions and categorization of DTs, including types and uses, as well as building dedicated infrastructure to support their development. The experts highlighted a few DT development challenges in areas such as data sources and provenance, as well as visualization and shared their solution offerings for creating, connecting, and maintaining these digital twins both vertically and horizontally within an enterprise.

The presenters also shared use cases on how DTs bridge design and manufacturing, enabling simulations and faster production, and how connecting DTs for various assets, processes, and products creates a holistic view. Session speakers also discussed a DT maturity scorecard that enables players from across the supply chain to track their progress and identify areas for improvement. The use of facility-level DTs for water management in fabs to promote sustainability was also a topic of discussion.

The semiconductor industry’s commitment to digital twins

The Semiconductor Digital Twin Workshop showcased the industry’s commitment to adopting and advancing the technology. Continued collaboration and adherence to standards and sustainable practices will play a crucial role in unlocking the full potential of DT technology in semiconductor manufacturing.

SEMI thanks the speakers who provided access to their material presented at the workshop. Visit Semiconductor Digital Twin Workshop OnDemand | SEMI for the workshop materials.

Reference

  1. Ondrej Burkacky, Julia Dragon, and Nikolaus Lehmann, The semiconductor decade: A trillion-dollar industry, McKinsey & Company (blog), April 1, 2021

Nishita Rao is senior product marketing manager at SEMI.

Karim Somani is program manager at SEMI.  

The post Integrating Digital Twins In Semiconductor Operations appeared first on Semiconductor Engineering.

Blog Review: Feb. 21

Siemens’ John McMillan digs into physical verification maturity for high-density advanced packaging (HDAP) designs and major differences in the LVS verification flow compared to the well-established process for SoCs.

Synopsys’ Varun Shah identifies why a cloud adoption framework is key to getting the most out of deploying EDA tools in the cloud, including by ensuring that different types of necessary compute are accessible for all stages of the design cycle.

Cadence’s Reela Samuel suggests that a chiplet-based approach will provide improved performance and reduced complexity for the automotive sector, enabling OEMs to construct a robust yet flexible electronic architecture.

Keysight’s Emily Yan finds that today’s chip design landscape is facing challenges reminiscent of those encountered by the Large Hadron Collider in managing data volume, version control, and global collaboration.

Ansys’ Raha Vafaei explains why the finite-difference time-domain (FDTD) method, an algorithmic approach to solving Maxwell’s equations, is key for modeling nanophotonic devices, processes, and materials.

Arm’s Ed Player explains the different components of the Common Microcontroller Software Interface Standard (CMSIS) to help identify which are useful for particular Arm-based microcontroller projects.

SEMI’s Mark da Silva, Nishita Rao and Karim Somani check out the state of digital twins in semiconductor manufacturing and challenges such as the need for standardization and communication between different digital twins.

Plus, check out the blogs featured in the latest Low Power-High Performance newsletter:

Rambus’ Lou Ternullo looks at why performance demands of generative AI and other advanced workloads will require new architectural solutions enabled by CXL.

Ansys’ Raha Vafaei shines a light on how the evolution of photonics engineering will encompass novel materials and cutting-edge techniques.

Siemens’ Keith Felton explains why embracing emerging approaches is essential for crafting IC packages that address the evolving demands of sustainability, technology, and consumer preferences.

Cadence’s Mark Seymour lays out how CFD simulation software can predict time-dependent aspects and various failure scenarios for data center managers.

Arm’s Adnan Al-Sinan and Gian Marco Iodice point out that LLMs already run well on small devices, and that will only improve as models become smaller and more sophisticated.

Keysight’s Roberto Piacentini Filho shows how a modular approach can improve yield, reduce cost, and improve PPA/C.

Quadric’s Steve Roddy finds that smart local memory in an AI/ML subsystem solves SoC bottlenecks.

Synopsys’ Ian Land, Kenneth Larsen, and Rob Aitken detail why the traditional approach using monolithic system-on-chips (SoCs) falls short when addressing the complex needs of modern systems.

The post Blog Review: Feb. 21 appeared first on Semiconductor Engineering.

Why Chiplets Are So Critical In Automotive

By John Koon

Chiplets are gaining renewed attention in the automotive market, where increasing electrification and intense competition are forcing companies to accelerate their design and production schedules.

Electrification has lit a fire under some of the biggest and best-known carmakers, which are struggling to remain competitive in the face of very short market windows and constantly changing requirements. Unlike in the past, when carmakers typically ran on five- to seven-year design cycles, the latest technology in vehicles today may well be considered dated within several years. And if they cannot keep up, there is a whole new crop of startups producing cheap vehicles with the ability to update or change out features as quickly as a software update.

But software has speed, security, and reliability limitations, and being able to customize the hardware is where many automakers are now putting their efforts. This is where chiplets fit in, and the focus now is on how to build enough interoperability across large ecosystems to make this a plug-and-play market. The key factors to enable automotive chiplet interoperability include standardization, interconnect technologies, communication protocols, power and thermal management, security, testing, and ecosystem collaboration.

Similar to non-automotive applications at the board level, many design efforts are focusing on a die-to-die approach, which is driving a number of novel design considerations and tradeoffs. At the chip level, the interconnects between various processors, chips, memory, and I/O are becoming more complex due to increased design performance requirements, spurring a flurry of standards activities. Different interconnect and interface types have been proposed to serve varying purposes, while emerging chiplet technologies for dedicated functions — processors, memories, and I/Os, to name a few — are changing the approach to chip design.

“There is a realization by automotive OEMs that to control their own destiny, they’re going to have to control their own SoCs,” said David Fritz, vice president of virtual and hybrid systems at Siemens EDA. “However, they don’t understand how far along EDA has come since they were in college in 1982. Also, they believe they need to go to the latest process node, where a mask set is going to cost $100 million. They can’t afford that. They also don’t have access to talent because the talent pool is fairly small. With all that together comes the realization by the OEMs that to control their destiny, they need a technology that’s developed by others, but which can be combined however needed to have a unique differentiated product they are confident is future-proof for at least a few model years. Then it becomes economically viable. The only thing that fits the bill is chiplets.”

Chiplets can be optimized for specific functions, which can help automakers meet reliability, safety, security requirements with technology that has been proven across multiple vehicle designs. In addition, they can shorten time to market and ultimately reduce the cost of different features and functions.

Demand for chips has been on the rise for the past decade. According to Allied Market Research, global automotive chip demand will grow from $49.8 billion in 2021 to $121.3 billion by 2031. That growth will attract even more automotive chip innovation and investment, and chiplets are expected to be a big beneficiary.

But the marketplace for chiplets will take time to mature, and it will likely roll out in phases. Initially, a vendor will provide different flavors of proprietary dies. Then, partners will work together to supply chiplets to support each other, as has already happened with some vendors. The final stage will be universally interoperable chiplets, as supported by UCIe or some other interconnect scheme.

Getting to the final stage will be the hardest, and it will require significant changes. To ensure interoperability, large enough portions of the automotive ecosystem and supply chain must come together, including hardware and software developers, foundries, OSATs, and material and equipment suppliers.

Momentum is building
On the plus side, not all of this is starting from scratch. At the board level, modules and sub-systems always have used onboard chip-to-chip interfaces, and they will continue to do so. Various chip and IP providers, including Cadence, Diode, Microchip, NXP, Renesas, Rambus, Infineon, Arm, and Synopsys, provide off-the-shelf interface chips or IP to create the interface silicon.

The Universal Chiplet Interconnect Express (UCIe) Consortium is the driving force behind the die-to-die, open interconnect standard. The group released its latest UCIe 1.1 specification in August 2023. Board members include Alibaba, AMD, Arm, ASE, Google Cloud, Intel, Meta, Microsoft, NVIDIA, Qualcomm, Samsung, and others. Industry partners are showing widespread support. AIB and Bunch of Wires (BoW) also have been proposed. In addition, Arm just released its own Chiplet System Architecture, along with an updated AMBA spec to standardize protocols for chiplets.

“Chiplets are already here, driven by necessity,” said Arif Khan, senior product marketing group director for design IP at Cadence. “The growing processor and SoC sizes are hitting the reticle limit and the diseconomies of scale. Incremental gains from process technology advances are lower than rising cost per transistor and design. The advances in packaging technology (2.5D/3D) and interface standardization at a die-to-die level, such as UCIe, will facilitate chiplet development.”

Nearly all of the chiplets used today are developed in-house by big chipmakers such as Intel, AMD, and Marvell, because they can tightly control the characteristics and behavior of those chiplets. But there is work underway at every level to open this market to more players. When that happens, smaller companies can begin capitalizing on what the high-profile trailblazers have accomplished so far, and innovating around those developments.

“Many of us believe the dream of having an off-the-shelf, interoperable chiplet portfolio will likely take years before becoming a reality,” said Guillaume Boillet, senior director strategic marketing at Arteris, adding that interoperability will emerge from groups of partners who are addressing the risk of incomplete specifications.

This also is raising the attractiveness of FPGAs and eFPGAs, which can provide a level of customization and updates for hardware in the field. “Chiplets are a real thing,” said Geoff Tate, CEO of Flex Logix. “Right now, a company building two or more chiplets can operate much more economically than a company building near-reticle-size die with almost no yield. Chiplet standardization still appears to be far away. Even UCIe is not a fixed standard yet. Not all agree on UCIe, bare die testing, and who owns the problem when the integrated package doesn’t work, etc. We do have some customers who use or are evaluating eFPGA for interfaces where standards are in flux like UCIe. They can implement silicon now and use the eFPGA to conform to standards changes later.”

There are other efforts supporting chiplets, as well, although for somewhat different reasons — notably, the rising cost of device scaling and the need to incorporate more features into chips, which are reticle-constrained at the most advanced nodes. But those efforts also pave the way for chiplets in automotive, and there is strong industry backing to make this all work. For example, under the sponsorship of SEMI, ASME, and three IEEE Societies, the new Heterogeneous Integration Roadmap (HIR) looks at various microelectronics design, materials, and packaging issues to come up with a roadmap for the semiconductor industry. Their current focus includes 2.5D, 3D-ICs, wafer-level packaging, integrated photonics, MEMS and sensors, and system-in-package (SiP), aerospace, automotive, and more.

At the recent Heterogeneous Integration Global Summit 2023, representatives from AMD, Applied Materials, ASE, Lam Research, MediaTek, Micron, Onto Innovation, TSMC, and others demonstrated strong support for chiplets. Another group that supports chiplets is the Chiplet Design Exchange (CDX) working group, which is part of the Open Domain Specific Architecture (ODSA) and the Open Compute Project Foundation (OCP). The CDX charter focuses on the various characteristics of chiplets and chiplet integration, including electrical, mechanical, and thermal design exchange standards for 2.5D stacked and 3D integrated circuits (3D-ICs). Its representatives include Ansys, Applied Materials, Arm, Ayar Labs, Broadcom, Cadence, Intel, Macom, Marvell, Microsemi, NXP, Siemens EDA, Synopsys, and others.

“The things that automotive companies want in terms of what each chiplet does in terms of functionality is still in an upheaval mode,” Siemens’ Fritz noted. “One extreme has these problems, the other extreme has those problems. This is the sweet spot. This is what’s needed. And these are the types of companies that can go off and do that sort of work, and then you could put them together. Then this interoperability thing is not a big deal. The OEM can make it too complex by saying, ‘I have to handle that whole spectrum of possibilities.’ The alternative is that they could say, ‘It’s just like a high speed PCIe. If I want to communicate from one to the other, I already know how to do that. I’ve got drivers that are running my operating system. That would solve an awful lot of problems, and that’s where I believe it’s going to end up.”

One path to universal chiplet development?

Moving forward, chiplets are a focal point for both the automotive and chip industries, and that will involve everything from chiplet IP to memory interconnects and customization options and limitations.

For example, in November 2023 Renesas Electronics announced plans for its next-generation SoCs and MCUs targeting all major applications across the automotive digital domain. The announcement included advance information about its fifth-generation R-Car SoC for high-performance applications, which uses in-package chiplet integration technology intended to give automotive engineers greater flexibility to customize their designs.

Renesas noted that if more AI performance is required in Advanced Driver Assistance Systems (ADAS), engineers will have the capability to integrate AI accelerators into a single chip. The company said this roadmap comes after years of collaboration and discussions with Tier 1 and OEM customers, which have been clamoring for a way to accelerate development without compromising quality, including designing and verifying the software even before the hardware is available.

“Due to the ever-increasing need to increase compute on demand, and the increasing need for higher levels of autonomy in the cars of tomorrow, we see challenges in monolithic solutions scaling and providing the performance needs of the market in the upcoming years,” said Vasanth Waran, senior director for SoC Business & Strategies at Renesas. “Chiplets allow compute solutions to scale above and beyond the needs of the market.”

Renesas announced plans to create a chiplet-based product family specifically targeted at the automotive market starting in 2025.

Standard interfaces allow for SoC customization
It is not entirely clear how much overlap there will be between standard processors, which is where most chiplets are used today, and chiplets developed for automotive applications. But the underlying technologies and developments certainly will build off each other as this technology shifts into new markets.

“Whether it is an AI accelerator or ADAS automotive application, customers need standard interface IP blocks,” noted David Ridgeway, senior product manager, IP accelerated solutions group at Synopsys. “It is important to provide fully verified IP subsystems around their IP customization requirements to support the subsystem components used in the customers’ SoCs. When I say customization, you might not realize how customizable IP has become over the course of the last 10 to 20 years, on the PHY side as well as the controller side. For example, PCI Express has gone from PCIe Gen 3 to Gen 4 to Gen 5 and now Gen 6. The controller can be configured to support multiple bifurcation modes of smaller link widths, including one x16, two x8, or four x4. Our subsystem IP team works with customers to ensure all the customization requirements are met. For AI applications, signal and power integrity is extremely important to meet their performance requirements. Almost all our customers are seeking to push the envelope to achieve the highest memory bandwidth speeds possible so that their TPU can process many more transactions per second. Whenever the applications are cloud computing or artificial intelligence, customers want the fastest response rate possible.”
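
The bifurcation options Ridgeway describes are simply different ways of partitioning the controller's 16 lanes. Here is a minimal sketch of that idea (illustrative only; the mode list and validity check are assumptions, not Synopsys configuration code):

```python
# Illustrative sketch: enumerating bifurcation modes of a 16-lane PCIe controller.
# The mode list and the validity rule below are assumptions for illustration only.
from typing import List, Tuple

# Each mode is a tuple of link widths that must use all 16 lanes of the controller.
BIFURCATION_MODES: List[Tuple[int, ...]] = [
    (16,),          # one x16 link
    (8, 8),         # two x8 links
    (4, 4, 4, 4),   # four x4 links
]

def is_valid_mode(widths: Tuple[int, ...], total_lanes: int = 16) -> bool:
    """A mode is valid here if its link widths are powers of two and use all lanes."""
    return sum(widths) == total_lanes and all(w & (w - 1) == 0 for w in widths)

for mode in BIFURCATION_MODES:
    links = " + ".join(f"x{w}" for w in mode)
    print(f"{links:<20} valid={is_valid_mode(mode)}")
```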

Fig 1: IP blocks including processor, digital, PHY, and verification help developers implement the entire SoC. Source: Synopsys

Optimizing power, performance, and area (PPA) serves the ultimate goal of increasing efficiency, and this makes chiplets particularly attractive in automotive applications. When UCIe matures, it is expected to improve die-to-die interface performance by one to two orders of magnitude. For example, UCIe can deliver a shoreline bandwidth density of 28 to 224 GB/s/mm in a standard package, and 165 to 1,317 GB/s/mm in an advanced package, a 20- to 100-fold improvement. Bringing latency down from 20ns to 2ns is a 10-fold improvement. Around 10 times better power efficiency, at 0.5 pJ/bit (standard package) and 0.25 pJ/bit (advanced package), is another plus. The key is to shorten the interface distance whenever possible.
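
To make those shoreline numbers concrete, here is a back-of-the-envelope sketch using the peak figures above; the 5 mm of die edge assumed for the interface is a hypothetical value, not something specified by UCIe:

```python
# Back-of-the-envelope sketch of the UCIe figures quoted above.
# The 5 mm of die edge dedicated to the interface is an assumed value for illustration.

def link_bandwidth_GBps(shoreline_GBps_per_mm: float, edge_mm: float) -> float:
    """Aggregate die-to-die bandwidth for a given length of die edge ('shoreline')."""
    return shoreline_GBps_per_mm * edge_mm

def link_power_W(bandwidth_GBps: float, energy_pJ_per_bit: float) -> float:
    """Power = bits/s * energy/bit. 1 GB/s = 8e9 bit/s; 1 pJ = 1e-12 J."""
    return bandwidth_GBps * 8e9 * energy_pJ_per_bit * 1e-12

edge_mm = 5.0  # assumed die-edge length available for the UCIe interface

for pkg, density, energy in [("standard", 224, 0.5), ("advanced", 1317, 0.25)]:
    bw = link_bandwidth_GBps(density, edge_mm)
    print(f"{pkg:>8} package: {bw:7.0f} GB/s over {edge_mm} mm, "
          f"~{link_power_W(bw, energy):.2f} W at {energy} pJ/bit")
```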

To optimize chiplet designs, the UCIe Consortium provides some suggestions:

  • Careful planning of architectural cut-lines (i.e., chiplet boundaries), optimizing for power, latency, silicon area, and IP reuse. For example, confining the leading-edge process node to the one chiplet that needs it, while re-using other chiplets built on older nodes, can reduce cost and development time (see the sketch after this list).
  • Thermal and mechanical packaging constraints need to be planned up front, covering package thermal envelopes, hot spots, chiplet placement, and I/O routing and breakouts.
  • Process nodes need to be carefully selected, particularly in the context of the associated power delivery scheme.
  • A test strategy for chiplets and packaged/assembled parts needs to be developed up front, so silicon issues are caught during chiplet-level testing rather than after the chiplets are assembled into a package.
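
As a rough illustration of the cut-line exercise in the first bullet, the sketch below models a hypothetical partition in which only the compute chiplet moves to a leading-edge node while I/O and analog chiplets are reused on mature nodes. All names and node values are invented for illustration:

```python
# Hypothetical partitioning sketch: which chiplets require a new leading-edge design
# versus which can be reused from older nodes. Names and nodes are invented examples.
from dataclasses import dataclass
from typing import List

@dataclass
class Chiplet:
    name: str
    node_nm: int
    reused: bool  # True if this is an existing, already-validated chiplet

partition: List[Chiplet] = [
    Chiplet("cpu_ai_compute", 3, reused=False),   # only die that needs the new node
    Chiplet("lpddr_phy_io", 12, reused=True),
    Chiplet("analog_sensor_if", 28, reused=True),
]

new_leading_edge = [c.name for c in partition if not c.reused and c.node_nm <= 5]
reused_mature = [c.name for c in partition if c.reused]

print(f"Chiplets requiring new leading-edge design: {new_leading_edge}")
print(f"Reused chiplets on mature nodes: {reused_mature}")
```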

Conclusion
The idea of standardizing die-to-die interfaces is catching on quickly, but the path to get there will take time, effort, and a lot of collaboration among companies that rarely talk with each other. Building a vehicle takes one determined carmaker. Building a vehicle with chiplets requires an entire ecosystem of developers, foundries, OSATs, and materials and equipment suppliers, all working together.

Automotive OEMs are experts at putting systems together and at finding innovative ways to cut costs. But it remains to be seen how quickly and effectively they can build and leverage an ecosystem of interoperable chiplets to shrink design cycles, improve customization, and adapt to a world in which leading-edge technology may be outdated by the time it is fully designed, tested, and available to consumers.

— Ann Mutschler contributed to this report.

Related Reading
Automotive Relationships Shifting With Chiplets
As the automotive ecosystem balances the best approaches for designing in increasingly advanced features, how companies interact is still evolving.

The post Why Chiplets Are So Critical In Automotive appeared first on Semiconductor Engineering.
