
Age Verification Laws Are Just A Path Towards A Full Ban On Porn, Proponent Admits

August 20, 2024, 19:50

It’s never about the children. Supporters of age verification laws, book bans, drag show bans, and abortion bans always claim they’re doing these things to protect children. But it’s always just about themselves. They want to impose their morality on other adults. That’s all there is to it.

Abortion bans are just a way to strip women of bodily autonomy. If it was really about cherishing children and new lives, these same legislators wouldn’t be routinely stripping school lunch programs of funding, introducing onerous means testing to government aid programs, and generally treating children as a presumptive drain on society.

The same goes for book bans. They claim they want to prevent children from accessing inappropriate material. But you can only prevent children from accessing it by removing it entirely from public libraries, which means even adults will no longer be able to read these books.

The laws targeting drag shows aren’t about children. They’re about punishing certain people for being the way they are — people whose mere existence seems to be considered wholly unacceptable by bigots with far too much power.

The slew of age verification laws introduced in recent years are being shot down by courts almost as swiftly as they’re enacted. And for good reason. Age verification laws are unconstitutional. And they’re certainly not being enacted to prevent children from accessing porn.

Of course, none of the people pushing this kind of legislation will ever openly admit their reasons for doing so. But they will admit it to people they think are like-minded. All it takes is a tiny bit of subterfuge to tease these admissions out of activist groups that want to control what content adults have access to — something that’s barely hidden by their “for the children” facade.

As Shawn Musgrave reports for The Intercept, a couple of people managed to coax this admission out of a former Trump official simply by pretending they were there to give his pet project a bunch of cash.

“I actually never talk about our porn agenda,” said Russell Vought, a former top Trump administration official, in late July. Vought was chatting with two men he thought were potential donors to his right-wing think tank, the Center for Renewing America. 

For the last three years, Vought and the CRA have been pushing laws that require porn websites to verify their visitors are not minors, on the argument that children need to be protected from smut. Dozens of states have enacted or considered these “age verification laws,” many of them modeled on the CRA’s proposals. 

[…]

But in a wide-ranging, covertly recorded conversation with two undercover operatives — a paid actor and a reporter for the British journalism nonprofit Centre for Climate Reporting — Vought let them in on a thinly veiled secret: These age verification laws are a pretext for restricting access to porn more broadly. 

“Thinly veiled” is right. While it’s somewhat amusing Vought was taken in so easily and was immediately willing to say the quiet part loud when he thought cash was on the line, he’s made his antipathy towards porn exceedingly clear. As Musgrave notes in his article, Vought’s contribution to Project 2025 — a right-wing masturbatory fantasy masquerading as policy proposals should Trump take office again — almost immediately veers into the sort of territory normally only explored by dictators and autocrats who relied heavily on domestic surveillance, forced labor camps, and torture to rein in those who disagreed with their moral stances.

Pornography, manifested today in the omnipresent propagation of transgender ideology and sexualization of children, for instance, is not a political Gordian knot inextricably binding up disparate claims about free speech, property rights, sexual liberation, and child welfare. It has no claim to First Amendment protection. Its purveyors are child predators and misogynistic exploiters of women. Their product is as addictive as any illicit drug and as psychologically destructive as any crime. Pornography should be outlawed. The people who produce and distribute it should be imprisoned. Educators and public librarians who purvey it should be classed as registered sex offenders. And telecommunications and technology firms that facilitate its spread should be shuttered.

Perhaps the most surprising part of this paragraph (and, indeed, a lot of Vought’s contribution to Project 2025) is that it isn’t written in all caps with a “follow me on xTwitter” link attached. These are not the words of a hinged person. They are the opposite — the ravings of a man in desperate need of a competent re-hinging service.

And he’s wrong about everything in this paragraph, especially his assertion that pornography is not a First Amendment issue. It is. That’s why so many of these laws are getting rejected by federal courts. The rest is hyperbole that pretends it’s just bold, common-sense assertions. I would like to hear more about the epidemic of porn overdoses that’s leaving children parentless and overloading our health system. And who can forget the recent killing sprees of the Sinaloa Porn Cartel, which has led to federal intervention from the Mexican government?

But the most horrifying part is Vought’s desire to imprison people for producing porn and converting librarians to registered sex offenders just because their libraries carry some content that personally offends his sensibilities.

These are the words and actions of people who strongly support fascism so long as they’re part of the ruling party. They don’t care about kids, America, democracy, or the Constitution. They want a nation of followers and the power to punish anyone who steps out of line. The Center for Renewing America is only one of several groups with the same ideology and the same censorial urges. These are dangerous people, but their ideas and policy proposals are now so common it’s almost impossible to classify them as “extremist.” There are a lot of Americans who would rather see the nation destroyed than have to, at minimum, tolerate people and ideas they don’t personally like. Their ugliness needs to be dragged out into the open as often as possible, if only to force them to confront the things they’ve actually said and done.


A Hybrid ECO Detailed Placement Flow for Mitigating Dynamic IR Drop (UC San Diego)

A new technical paper titled “A Hybrid ECO Detailed Placement Flow for Improved Reduction of Dynamic IR Drop” was published by researchers at UC San Diego.

Abstract:

“With advanced semiconductor technology progressing well into sub-7nm scale, voltage drop has become an increasingly challenging issue. As a result, there has been extensive research focused on predicting and mitigating dynamic IR drops, leading to the development of IR drop engineering change order (ECO) flows – often integrated with modern commercial EDA tools. However, these tools encounter QoR limitations while mitigating IR drop. To address this, we propose a hybrid ECO detailed placement approach that is integrated with existing commercial EDA flows, to mitigate excessive peak current demands within power and ground rails. Our proposed hybrid approach effectively optimizes peak current levels within a specified “clip”– complementing and enhancing commercial EDA dynamic IR-driven ECO detailed placements. In particular, we: (i) order instances in a netlist in decreasing order of worst voltage drop; (ii) extract a clip around each instance; and (iii) solve an integer linear programming (ILP) problem to optimize instance placements. Our approach optimizes dynamic voltage drops (DVD) across ten designs by up to 15.3% compared to original conventional flows, with similar timing quality and 55.1% less runtime.”
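The three-step loop in the abstract (order instances by worst voltage drop, extract a clip around each, optimize placement inside the clip) can be sketched in miniature. This is only a hedged illustration of the idea, not the paper's implementation: the data model, the toy peak-current cost, and the exhaustive search standing in for the ILP solver are all assumptions made here.

```python
# Hedged sketch of the clip-based ECO placement loop from the abstract.
# All names (peak_rail_current, the cost model, clip extraction) are
# illustrative assumptions; the paper solves an ILP per clip, which we
# stand in for with exhaustive search over a tiny clip.
from itertools import permutations

def peak_rail_current(order, currents):
    """Toy cost: worst sum of adjacent instances' switching currents,
    standing in for peak current demand on a shared rail segment."""
    return max(currents[a] + currents[b] for a, b in zip(order, order[1:]))

def optimize_clip(instances, currents):
    """Try every slot assignment inside one clip and keep the best
    (the paper formulates this step as an ILP instead of brute force)."""
    best = min(permutations(instances),
               key=lambda p: peak_rail_current(p, currents))
    return list(best)

def hybrid_eco_flow(instances, drop, currents, clip_size=4):
    # (i) order instances by decreasing worst voltage drop
    placement = sorted(instances, key=lambda i: drop[i], reverse=True)
    # (ii) extract a clip around each worst-drop region and
    # (iii) optimize instance placements within it
    for start in range(0, len(placement) - clip_size + 1, clip_size):
        clip = placement[start:start + clip_size]
        placement[start:start + clip_size] = optimize_clip(clip, currents)
    return placement
```

A real flow would add legal-site and timing constraints to the per-clip problem; exhaustive search is only viable here because the clip is tiny.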

Find the technical paper here. Published June 2024.

Andrew B. Kahng, Bodhisatta Pramanik, and Mingyu Woo. 2024. A Hybrid ECO Detailed Placement Flow for Improved Reduction of Dynamic IR Drop. In Proceedings of the Great Lakes Symposium on VLSI 2024 (GLSVLSI ’24). Association for Computing Machinery, New York, NY, USA, 390–396. https://doi.org/10.1145/3649476.3658727.

The post A Hybrid ECO Detailed Placement Flow for Mitigating Dynamic IR Drop (UC San Diego) appeared first on Semiconductor Engineering.



Pornhub prepares to block five more states rather than check IDs

June 20, 2024, 22:33

(credit: Aurich Lawson | Getty Images)

Pornhub will soon be blocked in five more states as the adult site continues to fight what it considers privacy-infringing age-verification laws that require Internet users to provide an ID to access pornography.

On July 1, according to a blog post on the adult site announcing the impending block, Pornhub visitors in Indiana, Idaho, Kansas, Kentucky, and Nebraska will be "greeted by a video featuring" adult entertainer Cherie Deville, "who explains why we had to make the difficult decision to block them from accessing Pornhub."

Pornhub explained that—similar to blocks in Texas, Utah, Arkansas, Virginia, Montana, North Carolina, and Mississippi—the site refuses to comply with soon-to-be-enforceable age-verification laws in this new batch of states that allegedly put users at "substantial risk" of identity theft, phishing, and other harms.


WhatsApp’s New Rule: Enter Your Birthdate to Continue Chatting!

By: Abdullah
June 10, 2024, 11:39

WhatsApp, the popular messaging platform, is reportedly preparing to introduce age verification for users in Europe. This move comes as the minimum age requirement for ...

The post WhatsApp’s New Rule: Enter Your Birthdate to Continue Chatting! appeared first on Gizchina.com.

Using Formal Verification To Evaluate The HW Reliability Of A RISC-V Ibex Core In The Presence Of Soft Errors

May 31, 2024, 18:33

A technical paper titled “Using Formal Verification to Evaluate Single Event Upsets in a RISC-V Core” was published by researchers at University of Southampton.

Abstract:

“Reliability has been a major concern in embedded systems. Higher transistor density and lower voltage supply increase the vulnerability of embedded systems to soft errors. A Single Event Upset (SEU), which is also called a soft error, can reverse a bit in a sequential element, resulting in a system failure. Simulation-based fault injection has been widely used to evaluate reliability, as suggested by ISO26262. However, it is practically impossible to test all faults for a complex design. Random fault injection is a compromise that reduces accuracy and fault coverage. Formal verification is an alternative approach. In this paper, we use formal verification, in the form of model checking, to evaluate the hardware reliability of a RISC-V Ibex Core in the presence of soft errors. Backward tracing is performed to identify and categorize faults according to their effects (no effect, Silent Data Corruption, crashes, and hangs). By using formal verification, the entire state space and fault list can be exhaustively explored. It is found that misaligned instructions can amplify fault effects. It is also found that some bits are more vulnerable to SEUs than others. In general, most of the bits in the Ibex Core are vulnerable to Silent Data Corruption, and the second pipeline stage is more vulnerable to Silent Data Corruption than the first.”
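The four fault-effect categories in the abstract (no effect, Silent Data Corruption, crashes, and hangs) can be illustrated with a toy simulation-based classifier. The paper uses model checking to explore the state space and fault list exhaustively; the 8-bit accumulator, the illegal-state "crash", and every name below are assumptions made purely for illustration.

```python
# Hedged sketch of SEU effect classification: exhaustively flip each
# state bit of a toy 8-bit accumulator and compare the outcome against
# a golden (fault-free) run. The toy model is an assumption; the paper
# proves these properties with formal methods rather than simulation.

def run(inputs, state=0, flip_bit=None, max_steps=200):
    """Accumulate inputs mod 256; optionally inject one SEU first."""
    if flip_bit is not None:
        state ^= 1 << flip_bit          # the single event upset
    steps = 0
    for x in inputs:
        steps += 1
        if steps > max_steps:
            return ("hang", None)       # bounded run never finished
        if state == 0xFF:               # toy 'illegal state' -> crash
            return ("crash", None)
        state = (state + x) & 0xFF
    return ("done", state)

def classify(inputs):
    """Exhaustively explore every single-bit upset of the state register,
    mirroring the paper's exhaustive exploration of the fault list."""
    _, golden = run(inputs)
    report = {}
    for bit in range(8):
        status, value = run(inputs, flip_bit=bit)
        if status != "done":
            report[bit] = status                      # crash or hang
        elif value == golden:
            report[bit] = "no effect"
        else:
            report[bit] = "SDC"                       # wrong data, no symptom
    return report
```

Consistent with the paper's finding, most bits in this toy end up classified as Silent Data Corruption; only upsets that steer the machine into the illegal state surface as crashes.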

Find the technical paper here. Published May 2024 (preprint).

Xue, Bing, and Mark Zwolinski. “Using Formal Verification to Evaluate Single Event Upsets in a RISC-V Core.” arXiv preprint arXiv:2405.12089 (2024).

Related Reading
Formal Verification’s Usefulness Widens
Demand for IC reliability pushes formal into new applications, where complex interactions and security risks are difficult to solve with other tools.
RISC-V Micro-Architectural Verification
Verifying a processor is much more than making sure the instructions work, but the industry is building from a limited knowledge base and few dedicated tools.

The post Using Formal Verification To Evaluate The HW Reliability Of A RISC-V Ibex Core In The Presence Of Soft Errors appeared first on Semiconductor Engineering.


Ctrl-Alt-Speech: Won’t Someone Please Think Of The Adults?

June 1, 2024, 00:36

Ctrl-Alt-Speech is a weekly podcast about the latest news in online speech, from Mike Masnick and Everything in Moderation‘s Ben Whitelaw.

Subscribe now on Apple Podcasts, Overcast, Spotify, Pocket Casts, YouTube, or your podcast app of choice — or go straight to the RSS feed.

In this week’s round-up of the latest news in online speech, content moderation and internet regulation, Mike and Ben cover:

This episode is brought to you with financial support from the Future of Online Trust & Safety Fund.


Competitive Open-Source EDA Tools

May 19, 2024, 22:52

A technical paper titled “Basilisk: Achieving Competitive Performance with Open EDA Tools on an Open-Source Linux-Capable RISC-V SoC” was published by researchers at ETH Zurich and University of Bologna.

Abstract:

“We introduce Basilisk, an optimized application-specific integrated circuit (ASIC) implementation and design flow building on the end-to-end open-source Iguana system-on-chip (SoC). We present enhancements to synthesis tools and logic optimization scripts improving quality of results (QoR), as well as an optimized physical design with an improved power grid and cell placement integration enabling a higher core utilization. The tapeout-ready version of Basilisk implemented in IHP’s open 130 nm technology achieves an operation frequency of 77 MHz (51 logic levels) under typical conditions, a 2.3x improvement compared to the baseline open-source EDA design flow presented in Iguana, and a higher 55% core utilization compared to 50% in the baseline design. Through collaboration with EDA tool developers and domain experts, Basilisk exemplifies a synergistic effort towards competitive open-source electronic design automation (EDA) tools for research and industry applications.”

Find the technical paper here. Published May 2024.

Sauter, Philippe, Thomas Benz, Paul Scheffler, Zerun Jiang, Beat Muheim, Frank K. Gürkaynak, and Luca Benini. “Basilisk: Achieving Competitive Performance with Open EDA Tools on an Open-Source Linux-Capable RISC-V SoC.” arXiv preprint arXiv:2405.03523 (2024).

Related Reading
EDA Back On Investors’ Radar
Big changes are fueling growth, and it’s showing in EDA revenue, acquisitions, and stock prices.
RISC-V Wants All Your Cores
It is not enough to want to dominate the world of CPUs. RISC-V has every core in its sights, and it’s starting to take steps to get there.

The post Competitive Open-Source EDA Tools appeared first on Semiconductor Engineering.


Reset Domain Crossing Verification

May 13, 2024, 09:01

By Reetika and Sulabh Kumar Khare, Siemens EDA DI SW

To meet low-power and high-performance requirements, system on chip (SoC) designs are equipped with several asynchronous and soft reset signals. These reset signals help to safeguard software and hardware functional safety as they can be asserted to speedily recover the system onboard to an initial state and clear any pending errors or events.

By definition, a reset domain crossing (RDC) occurs when a path’s transmitting flop has an asynchronous reset, and the receiving flop either has a different asynchronous reset than the transmitting flop or has no reset. The multitude of asynchronous reset sources found in today’s complex automotive designs means there are a large number of RDC paths, which can lead to systematic faults and hence cause data-corruption, glitches, metastability, or functional failures — along with other issues.
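The definition above reduces to a simple check over a netlist graph. A minimal sketch, assuming a toy data model in which each flop maps to its asynchronous reset domain (or None when it has no reset):

```python
# Hedged sketch of the RDC definition: flag any transmitter->receiver
# data path where the transmitting flop has an asynchronous reset and
# the receiving flop has a different asynchronous reset, or none at all.
# The flop/edge data model is an assumption for illustration only.

def find_rdc_paths(flops, edges):
    """flops: name -> async reset domain (None if the flop has no reset).
    edges: (tx, rx) pairs for direct data paths between flops."""
    rdc = []
    for tx, rx in edges:
        tx_rst, rx_rst = flops[tx], flops[rx]
        if tx_rst is not None and tx_rst != rx_rst:
            rdc.append((tx, rx))
    return rdc
```

A production tool works on the elaborated netlist and traces resets through gating logic, but the per-path rule is exactly this comparison.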

This issue is not covered by standard, static verification methods, such as clock domain crossing (CDC) analysis. Therefore, a proper reset domain crossing verification methodology is required to prevent errors in the reset design during the RTL verification stage.

A soft reset is an internally generated reset (register/latch/black-box output is used as a reset) that allows the design engineer to reset a specific portion of the design (specific module/subsystem) without affecting the entire system. Design engineers frequently use a soft reset mechanism to reset/restart the device without fully powering it off, as this helps to conserve power by selectively resetting specific electronic components while keeping others in an operational state. A soft reset typically involves manipulating specific registers or signals to trigger the reset process. Applying soft resets is a common technique used to quickly recover from a problem or test a specific area of the design. This can save time during simulation and verification by allowing the designer to isolate and debug specific issues without having to restart the entire simulation. Figure 1 shows a simple soft reset and its RTL to demonstrate that SoftReg is a soft reset for flop Reg.

Fig. 1: SoftReg is a soft reset for register Reg.

This article presents a systematic methodology to identify unsafe RDCs with different soft resets, even when the asynchronous reset domain is the same at the transmitter and receiver ends. With sufficient debug aids, it also identifies the safe RDCs (safe from metastability only if they meet static timing analysis) across different asynchronous reset domains, which helps avoid silicon failures and minimizes false crossing results. As part of static analysis, this systematic methodology enables designers to intelligently identify critical reset domain bugs associated with soft resets.

A methodology to identify critical reset domain bugs

With highly complex reset architectures in automotive designs, there arises the need for a proper verification method to detect RDC issues. It is essential to detect unsafe RDCs systematically and apply appropriate synchronization techniques to tackle the issues that may arise due to delays in reset paths caused by soft resets. Thus designers can ensure proper operation of their designs and avoid the associated risks. By handling RDCs effectively, designers can mitigate potential issues and enhance the overall robustness and performance of a design. This systematic flow involves several steps to assist in RDC verification closure using standard RDC verification tools (see figure 2).

Fig. 2: Flowchart of methodology for RDC verification.

Specification of clock and reset signals

Signals that are intended to generate a clock and reset pulse should be specified by the user as clock or reset signals, respectively, during the set-up step in RDC verification. By specifying signals as clocks or resets (according to their expected behavior), designers can perform design rule checking and other verification checks to ensure compliance with clock and reset related guidelines and standards as well as best practices. This helps identify potential design issues and improve the overall quality of the design by reducing noise in the results.

Clock detection

Ideally, design engineers should define the clock signals and then the verification tool should trace these clocks down to the leaf clocks. Unfortunately, with complex designs this is not always possible, as the design might have black boxes that originate clocks, or it may have combinational logic in the clock paths that is not covered by the user-specified clocks. All the unspecified clocks need to be identified and mapped to the user-specified primary clocks. An exhaustive detection of clocks is required in RDC verification, as potential metastability may occur if resets are used in different clock domains than the sequential element itself, leading to critical bugs.

Reset detection

Ideally, design engineers should define the reset signals, but again, due to the complexity of automotive and other modern designs, it is not possible to specify all the reset signals. Therefore a specialized verification tool is required for detection of resets. All the localized, black-box, gated, and primary resets need to be identified, and based on their usage in the RTL, they should be classified as synchronous, asynchronous, or dual type and then mapped to the user-specified primary resets.

Soft reset detection

The soft resets — i.e., the internally generated resets by flops and latches — need to be systematically detected as they can cause critical metastability issues when used in different clock domains, and they require static timing analysis when used in the same clock domain. Detecting soft resets helps identify potential metastability problems and allows designers to apply proper techniques for resolving these issues.

Reset tree analysis

Analysis of reset trees helps designers identify issues early in the design process, before RDC analysis. It helps to highlight some important errors in the reset design that are not commonly caught by lint tools. These include:

  • Dual synchronicity reset signals, i.e., the reset signal with a sample synchronous reset flop and a sample asynchronous reset flop
  • An asynchronous set/reset signal used as a data signal can result in incorrect data sampling because the reset state cannot be controlled

Reset domain crossing analysis

This step involves analyzing a design to determine the logic across various reset domains and identify potential RDCs. The analysis should also identify common reset sequences of asynchronous and soft reset sources at the transmitter and receiver registers of the crossings to avoid detection of false crossings that might appear as potential issues due to complex combinations of reset sources. False crossings are where a transmitter register and receiver register are asserted simultaneously due to dependencies among the reset assertion sequences, and as a result, any metastability that might occur on the receiver end is mitigated.
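The false-crossing rule described here can be sketched as a filter, under the assumption that reset dependencies are modeled as groups of resets that are always asserted together; the data model and all names are illustrative, not the tool's actual representation:

```python
# Hedged sketch of false-crossing filtering: a crossing is treated as
# false when every reset sequence that asserts the transmitter's reset
# also asserts the receiver's reset at the same time, so the receiver is
# itself in reset when the transmitter's output changes.

def is_false_crossing(tx_rst, rx_rst, assertion_groups):
    """assertion_groups: list of sets of resets asserted simultaneously."""
    return all(rx_rst in group
               for group in assertion_groups if tx_rst in group)

def filter_crossings(crossings, assertion_groups):
    """Split raw RDC crossings into real violations and false crossings.
    crossings: (tx, rx, tx_reset, rx_reset) tuples."""
    real, false = [], []
    for tx, rx, tx_rst, rx_rst in crossings:
        (false if is_false_crossing(tx_rst, rx_rst, assertion_groups)
         else real).append((tx, rx))
    return real, false
```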

Analyze and fix RDC issues

The concluding step is to analyze the results of the verification steps to verify if data paths crossing reset domains are safe from metastability. For the RDCs identified as unsafe — which may occur either due to different asynchronous reset domains at the transmitter and receiver ends or due to the soft reset being used in a different clock domain than the sequential element itself — design engineers can develop solutions to eliminate or mitigate metastability by restructuring the design, modifying reset synchronization logic, or adjusting the reset ordering. Traditionally safe RDCs — i.e., crossings where a soft reset is used in the same clock domain as the sequential element itself — need to be verified using static timing analysis.

Figure 3 presents our proposed flow for identifying and eliminating metastability issues due to soft resets. After implementing the RDC solutions, re-verify the design to ensure that the reset domain crossing issues have been effectively addressed.

Fig. 3: Flowchart for proposed methodology to tackle metastability issues due to soft resets.

This methodology was used on a design with 374,546 register bits, 8 latch bits, and 45 RAMs. The Questa RDC verification tool using this new methodology identified around 131 reset domains, which consisted of 19 asynchronous domains defined by the user, as well as 81 asynchronous reset domains inferred by the tool.

The first run analyzed data paths crossing asynchronous reset domains without any soft reset analysis. It reported nearly 40,000 RDC crossings (as shown in table 1).

Reset domain crossings without soft reset analysis | Severity  | Number of crossings
Reset domain crossing from a reset to a reset      | Violation | 28,408
Reset domain crossing from a reset to non-reset    | Violation | 11,235

Table 1: RDC analysis without soft resets.

In the second run, we did soft reset analysis and detected 34 soft resets, which resulted in additional violations for RDC paths with transmitter soft reset sources in different clock domains. These were critical violations that were missed in the initial run. Also, some RDC violations were converted to cautions (RDC paths with a transmitter soft reset in the same clock domain) as these paths would be safe from metastability as long as they meet the setup time window (as shown in table 2).

Reset domain crossings with soft reset analysis                 | Severity  | Number of crossings
Reset domain crossing from a reset to a reset                   | Violation | 26,957
Reset domain crossing from a reset to non-reset                 | Violation | 10,523
Reset domain crossing with Tx reset source in a different clock | Violation | 880
Reset domain crossing from a reset to Rx with the same clock    | Caution   | 2,412

Table 2: RDC analysis with soft resets.

To gain a deeper understanding of RDC, metastability, and soft reset analysis in the context of this new methodology, please download the full paper Techniques to identify reset metastability issues due to soft resets.

The post Reset Domain Crossing Verification appeared first on Semiconductor Engineering.


Distributing RTL Simulation Across Thousands Of Cores On 4 IPU Sockets (EPFL)

A technical paper titled “Parendi: Thousand-Way Parallel RTL Simulation” was published by researchers at EPFL.

Abstract:

“Hardware development relies on simulations, particularly cycle-accurate RTL (Register Transfer Level) simulations, which consume significant time. As single-processor performance grows only slowly, conventional, single-threaded RTL simulation is becoming less practical for increasingly complex chips and systems. A solution is parallel RTL simulation, where ideally, simulators could run on thousands of parallel cores. However, existing simulators can only exploit tens of cores.
This paper studies the challenges inherent in running parallel RTL simulation on a multi-thousand-core machine (the Graphcore IPU, a 1472-core machine). Simulation performance requires balancing three factors: synchronization, communication, and computation. We experimentally evaluate each metric and analyze how it affects parallel simulation speed, drawing on contrasts between the large-scale IPU and smaller but faster x86 systems.
Using this analysis, we build Parendi, an RTL simulator for the IPU. It distributes RTL simulation across 5888 cores on 4 IPU sockets. Parendi runs large RTL designs up to 4x faster than a powerful, state-of-the-art x86 multicore system.”
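The three-way balance the abstract describes (synchronization, communication, computation) can be illustrated with a back-of-the-envelope cost model. This is our own toy sketch, not Parendi's cost model, and every constant is an arbitrary illustration value:

```python
import math

# Toy per-cycle cost: compute work is divided across p cores, but barrier
# synchronization grows with log2(p) and communication grows with p, so
# neither overhead shrinks as more cores are added.
def cycle_time(p, compute=1000.0, sync=2.0, comm=0.05):
    return compute / p + sync * math.log2(p) + comm * p

def speedup(p):
    return cycle_time(1) / cycle_time(p)

# Speedup rises, saturates, then falls once overheads dominate.
for p in (1, 16, 256, 1472, 5888):
    print(p, round(speedup(p), 1))
```

Even this crude model reproduces the qualitative behavior that makes thousand-way RTL simulation hard: past some core count, adding cores slows the simulation down.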

Find the technical paper here. Published March 2024 (preprint).

Emami, Mahyar, Thomas Bourgeat, and James Larus. “Parendi: Thousand-Way Parallel RTL Simulation.” arXiv preprint arXiv:2403.04714 (2024).

Related Reading
Anatomy Of A System Simulation
Balancing the benefits of a model with the costs associated with that model is tough, but it becomes even trickier when dissimilar models are combined.

The post Distributing RTL Simulation Across Thousands Of Cores On 4 IPU Sockets (EPFL) appeared first on Semiconductor Engineering.

Bipartisan Group Of Senators Introduce New Terrible ‘Protect The Kids Online’ Bill

By Mike Masnick (Techdirt) | 2 May 2024, 21:05

Apparently, the world needs even more terrible bills that let ignorant senators grandstand to the media about how they’re “protecting the kids online.” There’s nothing more serious to work on than that. The latest bill comes from Senators Brian Schatz and Ted Cruz (with assists from Senators Chris Murphy, Katie Britt, Peter Welch, Ted Budd, John Fetterman, Angus King, and Mark Warner). This one is called the “Kids Off Social Media Act” (KOSMA) and it’s an unconstitutional mess built on a long list of debunked and faulty premises.

It’s especially disappointing to see this from Schatz. A few years back, I know his staffers would regularly reach out to smart people on tech policy issues in trying to understand the potential pitfalls of the regulations he was pushing. Either he’s no longer doing this, or he is deliberately ignoring their expert advice. I don’t know which one would be worse.

The crux of the bill is pretty straightforward: it would be an outright ban on social media accounts for anyone under the age of 13. As many people will recognize, we kinda already have a “soft” version of that because of COPPA, which puts much stricter rules on sites directed at those under 13. Because most sites don’t want to deal with those stricter rules, they officially limit account creation to those over the age of 13.

In practice, this has been a giant mess. Years and years ago, Danah Boyd pointed this out, talking about how the “age 13” bit is a disaster for kids, parents, and educators. Her research showed that all this generally did was to have parents teach kids that “it’s okay to lie,” as parents wanted kids to use social media tools to communicate with grandparents. Making that “soft” ban a hard ban is going to create a much bigger mess and prevent all sorts of useful and important communications (which, yeah, is a 1st Amendment issue).

Schatz’s reasons put forth for the bill are just… wrong.

No age demographic is more affected by the ongoing mental health crisis in the United States than kids, especially young girls. The Centers for Disease Control and Prevention’s Youth Risk Behavior Survey found that 57 percent of high school girls and 29 percent of high school boys felt persistently sad or hopeless in 2021, with 22 percent of all high school students—and nearly a third of high school girls—reporting they had seriously considered attempting suicide in the preceding year.

Gosh. What was happening in 2021 with kids that might have made them feel hopeless? Did Schatz and crew simply forget about the fact that most kids were under lockdown and physically isolated from friends for much of 2021? And that there were plenty of other stresses, including millions of people, including family members, dying? Noooooo. Must be social media!

Studies have shown a strong relationship between social media use and poor mental health, especially among children.

Note the careful word choice here: “strong relationship.” They won’t say a causal relationship because studies have not shown that. Indeed, as the leading researcher in the space has noted, there continues to be no real evidence of any causal relationship. The relationship appears to work the other way: kids who are dealing with poor mental health and who are desperate for help turn to the internet and social media because they’re not getting help elsewhere.

Maybe offer a bill that helps kids get access to more resources that help them with their mental health, rather than taking away the one place they feel comfortable going? Maybe?

From 2019 to 2021, overall screen use among teens and tweens (ages 8 to 12) increased by 17 percent, with tweens using screens for five hours and 33 minutes per day and teens using screens for eight hours and 39 minutes.

I mean, come on Schatz. Are you trolling everyone? Again, look at those dates. WHY DO YOU THINK that screen time might have increased 17% for kids from 2019 to 2021? COULD IT POSSIBLY BE that most kids had to do school via computers and devices at home, because there was a deadly pandemic making the rounds?

Maybe?

Did Schatz forget that? I recognize that lots of folks would like to forget the pandemic lockdowns, but this seems like a weird way to manifest that.

I mean, what a weird choice of dates to choose. I’m honestly kind of shocked that the increase was only 17%.

Also, note that the data presented here isn’t about an increase in social media use. It could very well be that the 17% increase was Zoom classes.

Based on the clear and growing evidence, the U.S. Surgeon General issued an advisory last year, calling for new policies to set and enforce age minimums and highlighting the importance of limiting the use of features, like algorithms, that attempt to maximize time, attention, and engagement.

Wait. You mean the same Surgeon General’s report that denied any causal link between social media and mental health (which you falsely claim has been proved) and noted just how useful and important social media is to many young people?

From that report, which Schatz misrepresents:

Social media can provide benefits for some youth by providing positive community and connection with others who share identities, abilities, and interests. It can provide access to important information and create a space for self-expression. The ability to form and maintain friendships online and develop social connections are among the positive effects of social media use for youth. These relationships can afford opportunities to have positive interactions with more diverse peer groups than are available to them offline and can provide important social support to youth. The buffering effects against stress that online social support from peers may provide can be especially important for youth who are often marginalized, including racial, ethnic, and sexual and gender minorities. For example, studies have shown that social media may support the mental health and well-being of lesbian, gay, bisexual, asexual, transgender, queer, intersex and other youths by enabling peer connection, identity development and management, and social support. Seven out of ten adolescent girls of color report encountering positive or identity-affirming content related to race across social media platforms. A majority of adolescents report that social media helps them feel more accepted (58%), like they have people who can support them through tough times (67%), like they have a place to show their creative side (71%), and more connected to what’s going on in their friends’ lives (80%). In addition, research suggests that social media-based and other digitally-based mental health interventions may also be helpful for some children and adolescents by promoting help-seeking behaviors and serving as a gateway to initiating mental health care.

Did Schatz’s staffers just, you know, skip over that part of the report or nah?

The bill also says that companies need to not allow algorithmic targeting of content to anyone under 17. This is also based on a widely believed myth that algorithmic content is somehow problematic. No studies have legitimately shown that of current algorithms. Indeed, a recent study showed that removing algorithmic targeting leads to people being exposed to more disinformation.

Is this bill designed to force more disinformation on kids? Why would that be a good idea?

Yes, some algorithms can be problematic! About a decade ago, algorithms that tried to optimize solely for “engagement” definitely created some bad outcomes. But it’s been a decade since most such algorithms have been designed that way. On most social media platforms, the algorithms are designed in other ways, taking into account a variety of different factors, because they know that optimizing just on engagement leads to bad outcomes.

Then the bill tacks on Cruz’s bill to require schools to block social media. There’s an amusing bit when reading the text of that part of the law. It says that you have to block social media on “federally funded networks and devices” but also notes that it does not prohibit “a teacher from using a social media platform in the classroom for educational purposes.”

But… how are they going to access those if the school is required by law to block access to such sites? Most schools are going to do a blanket ban, and teachers are going to be left to do what? Show kids useful YouTube science videos on their phones? Or maybe some schools will implement a special teacher code that lets them bypass the block. And by the end of the first week of school half the kids in the school will likely know that password.

What are we even doing here?

Schatz has a separate page hyping up the bill, and it’s even dumber than the first one above. It repeats some of the points above, though this time linking to Jonathan Haidt, whose work has been trashed left, right, and center by actual experts in this field. And then it gets even dumber:

Big Tech knows it’s complicit – but refuses to do anything about it…. Moreover, the platforms know about their central role in turbocharging the youth mental health crisis. According to Meta’s own internal study, “thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.” It concluded, “teens blame Instagram for increases in the rate of anxiety and depression.”

This is not just misleading, it’s practically fraudulent misrepresentation. The study Schatz is citing is one that was revealed by Frances Haugen. As we’ve discussed, it was done because Meta was trying to understand how to do better. Indeed, the whole point of that study was to see how teens felt about using social media in 12 different categories. Meta found that most boys felt neutral or better about themselves in all 12 categories. For girls, it was 11 out of 12. It was only in one category, body image, where the split was more pronounced. 32% of girls said that it made them feel worse. Basically the same percentage said it had no impact, or that it made them feel better.

[Image: slide from Meta’s internal study]

Also, look at that slide’s title. The whole point of this study was to figure out if they were making kids feel worse in order to look into how to stop doing that. And now, because grandstanders like Schatz are falsely claiming that this proves they were “complicit” and “refuse to do anything about it,” no social media company will ever do this kind of research again.

Because, rather than proactively looking to see if they’re creating any problems that they need to try to fix, Schatz and crew are saying “simply researching this is proof that you’re complicit and refuse to act.”

Statements like this basically ensure that social media companies stick their heads in the sand, rather than try to figure out where harm might be caused and take steps to stop that harm.

Why would Schatz want to do that?

That page then also falsely claims that the bill does not require age verification. This is a silly two-step that lying politicians perform every time they do this. Does the bill directly mandate age verification? No. But by making the penalties for failing to keep kids off social media serious and costly, it will obviously drive companies to adopt stronger age verification measures, which are inherently dangerous and an attack on privacy.

Perhaps Schatz doesn’t understand this, but it’s been widely discussed by many of the experts his staff used to talk to. So, really, he has no excuse.

The FAQ also claims that the bill will pass constitutional muster, while at the same time admitting that they know there will be lawsuits challenging it:

Yes. As, for example, First Amendment expert Neil Richards explains, “[i]nstead of censoring the protected expression present on these platforms, the act takes aim at the procedures and permissions that determine the time, place and manner of speech for underage consumers.” The Supreme Court has long held that the government has the right to regulate products to protect children, including by, for instance, restricting the sale of obscene content to minors. As Richards explains: “[i]n the same way a crowded bar or nightclub is no place for a child on their own”—or in the way every state in the country requires parental consent if it allows a minor to get a tattoo—“this rule would set a reasonable minimum age and maturity limitation for social media customers.” 

While we expect legal challenges to any bill aimed at regulating social media companies, we are confident that this content-neutral bill will pass constitutional muster given the government interests at play.

There are many reasons why this is garbage under the law, but rather than breaking them all down (we’ll wait for judges to explain it in detail), I’ll just point out that the major tell is in the law itself. The definition of a “social media platform” in the bill includes a long list of exceptions the law does not cover. Among them are a few targets of the “moral panics of yesteryear” that gullible politicians tried to ban, only for courts to find those bans violated the First Amendment.

It explicitly carves out video games and content that is professionally produced, rather than user-generated:

[Image: excerpt of the bill’s carve-outs for video games and professionally produced content]

Remember the moral panics about video games and TV destroying kids’ minds? Yeah. So this child protection bill is quick to say “but we’re not banning that kind of content!” Because whoever drafted the bill recognized that the Supreme Court has already made it clear that politicians can’t do that for video games or TV.

So, instead, they have to pretend that social media content is somehow on a whole different level.

But it’s not. It’s still the government restricting access to content. They’re going to pretend that there’s something unique and different about social media, and that they’re not banning the “content” but rather the “place” and “manner” of accessing that content. Except that’s laughable on its face.

You can see that in the quote above where Schatz does the fun dance where he first says “it’s okay to ban obscene content to minors” and then pretends that’s the same as restrictions on access to a bar (it’s not). One is about the content, and one is about a physical place. Social media is all about the content, and it’s not obscene content (which is already an exception to the First Amendment).

And, the “parental consent” for tattoos… I mean, what the fuck? Literally 4 questions above in the FAQ where that appears Schatz insists that his bill has nothing about parental consent. And then he tries to defend it by claiming it’s no different than parental consent laws?

The FAQ also claims this:

This bill does not prevent LGBTQ+ youth from accessing relevant resources online and we have worked closely with LGBTQ+ groups while crafting this legislation to ensure that this bill will not negatively impact that community.

I mean, it’s good you talked to some experts, but I note that most of the LGBTQ+ groups I’m aware of are not listed on your list of “groups supporting the bill” on the very same page. That absence stands out.

And, again, the Surgeon General’s report that you misleadingly cited elsewhere highlights how helpful social media can be to many LGBTQ+ youth. You can’t just say “nah, it won’t harm them” without explaining why all those benefits that have been shown in multiple studies, including the Surgeon General’s report, somehow don’t get impacted.

There’s a lot more, but this is just a terrible bill that would create a mess. And, I’m already hearing from folks in DC that Schatz is trying to get this bill added to the latest Christmas tree of a bill to reauthorize the FAA.

It would be nice if we had politicians looking to deal with the actual challenges facing kids these days, including the lack of mental health support for those who really need it. Instead, we get unconstitutional grandstanding nonsense bills like this.

Everyone associated with this bill should feel ashamed.

Merging Power and Arithmetic Optimization Via Datapath Rewriting (Intel, Imperial College London)

A new technical paper titled “Combining Power and Arithmetic Optimization via Datapath Rewriting” was published by researchers at Intel Corporation and Imperial College London.

Abstract:
“Industrial datapath designers consider dynamic power consumption to be a key metric. Arithmetic circuits contribute a major component of total chip power consumption and are therefore a common target for power optimization. While arithmetic circuit area and dynamic power consumption are often correlated, there is also a tradeoff to consider, as additional gates can be added to explicitly reduce arithmetic circuit activity and hence reduce power consumption. In this work, we consider two forms of power optimization and their interaction: circuit area reduction via arithmetic optimization, and the elimination of redundant computations using both data and clock gating. By encoding both these classes of optimization as local rewrites of expressions, our tool flow can simultaneously explore them, uncovering new opportunities for power saving through arithmetic rewrites using the e-graph data structure. Since power consumption is highly dependent upon the workload performed by the circuit, our tool flow facilitates a data dependent design paradigm, where an implementation is automatically tailored to particular contexts of data activity. We develop an automated RTL to RTL optimization framework, ROVER, that takes circuit input stimuli and generates power-efficient architectures. We evaluate the effectiveness on both open-source arithmetic benchmarks and benchmarks derived from Intel production examples. The tool is able to reduce the total power consumption by up to 33.9%.”
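The paper's idea of encoding optimizations as local rewrites of expressions can be illustrated with a toy single-rule rewriter. This is a hypothetical sketch, not ROVER's implementation: ROVER uses an e-graph to explore many rewrites simultaneously, whereas this applies one rule greedily, bottom-up:

```python
# Expressions are nested tuples, e.g. ('mul', ('add', 'a', 'b'), 8).
def rewrite(expr):
    """Bottom-up, single-pass rewriter applying one local rule:
    x * 2^k  ->  x << k, which costs almost nothing in hardware (wiring)."""
    if not isinstance(expr, tuple):
        return expr
    op, *args = expr
    args = [rewrite(a) for a in args]  # rewrite subexpressions first
    if (op == 'mul' and isinstance(args[-1], int)
            and args[-1] > 0 and args[-1] & (args[-1] - 1) == 0):
        # constant is a power of two: replace the multiplier with a shift
        return ('shl', args[0], args[-1].bit_length() - 1)
    return (op, *args)

print(rewrite(('mul', ('add', 'a', 'b'), 8)))  # ('shl', ('add', 'a', 'b'), 3)
```

A greedy rewriter like this commits to each rewrite immediately; the point of the e-graph approach in the paper is to keep all rewritten forms available at once and pick the best one against a cost (here, power) model afterwards.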

Find the technical paper here. Published April 2024.

Samuel Coward, Theo Drane, Emiliano Morini, George Constantinides; arXiv:2404.12336v1.

The post Merging Power and Arithmetic Optimization Via Datapath Rewriting (Intel, Imperial College London) appeared first on Semiconductor Engineering.

SCOTUS Needs To Take Up The Texas Age Verification Lawsuit

Techdirt | 19 April 2024, 21:52

I think we could witness one of the most important First Amendment legal showdowns ever.

The U.S. Supreme Court is being asked to rule on the constitutionality of mandatory age verification for porn websites. If the high court takes up the case, it would queue up a landmark debate pertaining to the First Amendment and privacy rights of millions of people.

Free Speech Coalition and the parent companies of the largest adult entertainment websites on the web filed suit in the U.S. District Court for the Western District of Texas with the intention to block House Bill (HB) 1181.

HB 1181 requires mandatory age verification for porn websites with users from Texas IP addresses. It also requires pseudoscientific health warnings to be posted on adult websites. Counsel representing the coalition and the porn companies argued that it violated the First Amendment rights of consumers and owners of the websites. This prompted the federal district court to initially enjoin the state of Texas from enforcing the law because its text appeared to be unconstitutional.

Acting Texas Attorney General Angela Colmenero appealed the injunction to the Fifth Circuit Court of Appeals. After a clear demonstration of classic Fifth Circuit tap dancing and the return of Ken Paxton to the helm of the Attorney General’s office, Texas was granted permission to enforce the age verification requirements outlined in the law. Luckily, the circuit judges properly applied the Zauderer standard, denying the requirement to post the bogus health warnings.

Soon after this, Paxton announced lawsuits against the parent companies of Pornhub, xHamster, and Stripchat for violations of HB 1181, with penalties totaling millions of dollars in damages under the law. After those lawsuits were filed in circuit courts in Travis County, counsel for the plaintiffs sought to halt enforcement while they petitioned the high court to take up the case. Justice Samuel Alito, the circuit justice for the Fifth Circuit, has yet to indicate whether the Supreme Court will take the case. Given how important this case is going forward, and how this issue is showing up in so many other states, there is no reason why they shouldn’t.

The case, Free Speech Coalition et al. v. Paxton, is so important that the national affiliate of the American Civil Liberties Union announced it is aiding the plaintiffs and their current counsel, a team from the big law firm Quinn Emanuel. The ACLU will support the petition for writ of certiorari, potential oral arguments, and more, in an effort to render House Bill 1181 and all age verification laws unconstitutional pipedreams.

Plaintiffs accurately argue that this is settled law, referring to the high court’s landmark decision in Reno v. American Civil Liberties Union, which found that segregating the content of the internet by age violates the rights of not only adults but also minors. The vast majority of age verification laws, as they are structured now, do just that.

While the Supreme Court provided for a less restrictive means to filter out minors from viewing age-restricted materials and potentially facing some level of harm, the vehicles of enforcement and some of the options touted in these bills for controlling minors’ web usage are, to the plaintiffs and civil liberties organizations, a violation of the First Amendment. ACLU and Quinn Emanuel attorneys for the plaintiffs present these arguments in their petition for writ of certiorari, which was filed in April 2024. Now, we just need the Supreme Court to take this seriously and not let the Fifth Circuit, the circuit that upheld a ban on drag shows, dictate law for the nation.

Michael McGrady covers the legal and tech side of the online porn business, among other topics.


K-Fault Resistant Partitioning To Assess Redundancy-Based HW Countermeasures To Fault Injections

A technical paper titled “Fault-Resistant Partitioning of Secure CPUs for System Co-Verification against Faults” was published by researchers at Université Paris-Saclay, Graz University of Technology, lowRISC, University Grenoble Alpes, Thales, and Sorbonne University.

Abstract:

“To assess the robustness of CPU-based systems against fault injection attacks, it is necessary to analyze the consequences of the fault propagation resulting from the intricate interaction between the software and the processor. However, current formal methodologies that combine both hardware and software aspects experience scalability issues, primarily due to the use of bounded verification techniques. This work formalizes the notion of k-fault resistant partitioning as an inductive solution to this fault propagation problem when assessing redundancy-based hardware countermeasures to fault injections. Proven security guarantees can then reduce the remaining hardware attack surface to consider in a combined analysis with the software, enabling a full co-verification methodology. As a result, we formally verify the robustness of the hardware lockstep countermeasure of the OpenTitan secure element to single bit-flip injections. Besides that, we demonstrate that previously intractable problems, such as analyzing the robustness of OpenTitan running a secure boot process, can now be solved by a co-verification methodology that leverages a k-fault resistant partitioning. We also report a potential exploitation of the register file vulnerability in two other software use cases. Finally, we provide a security fix for the register file, verify its robustness, and integrate it into the OpenTitan project.”

Find the technical paper here. Published 2024 (preprint).

Tollec, Simon, Vedad Hadžić, Pascal Nasahl, Mihail Asavoae, Roderick Bloem, Damien Couroussé, Karine Heydemann, Mathieu Jan, and Stefan Mangard. “Fault-Resistant Partitioning of Secure CPUs for System Co-Verification against Faults.” Cryptology ePrint Archive (2024).

Related Reading
RISC-V Micro-Architectural Verification
Verifying a processor is much more than making sure the instructions work, but the industry is building from a limited knowledge base and few dedicated tools.
New Concepts Required For Security Verification
Why it’s so difficult to ensure that hardware works correctly and is capable of detecting vulnerabilities that may show up in the field.

The post K-Fault Resistant Partitioning To Assess Redundancy-Based HW Countermeasures To Fault Injections appeared first on Semiconductor Engineering.


Maximizing Energy Efficiency For Automotive Chips

March 7, 2024, 09:06

Silicon chips are central to today’s sophisticated advanced driver assistance systems, smart safety features, and immersive infotainment systems. Industry sources estimate that there are now over 1,000 integrated circuits (ICs), or chips, in an average ICE car, and twice as many in an average EV. Such a large amount of electronics translates into kilowatts of power being consumed – equivalent to a couple of dishwashers running continuously. For an ICE vehicle, this puts a lot of stress on the vehicle’s electrical and charging system, leading automotive manufacturers to consider moving to 48V systems (vs. today’s mainstream 12V systems). These 48V systems reduce the current levels in the vehicle’s wiring, enabling the use of lower-cost, smaller-gauge wire, as well as delivering higher reliability. For EVs, higher energy efficiency of on-board electronics translates directly into longer range – the primary consideration of many EV buyers (second only to price). Driver assistance and safety features often employ redundant component techniques to ensure reliability, further increasing vehicle energy consumption. Lack of energy efficiency for an EV also means more frequent charging, further stressing the power grid and producing a detrimental effect on the environment. All these considerations necessitate a comprehensive energy-efficient design methodology for automotive ICs.
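The electrical argument for 48V comes down to I = P / V. A quick sketch, using an illustrative 2 kW load chosen to match the "couple of dishwashers" comparison (the figure is hypothetical, not an industry number):

```python
# Sketch: why 48V electrical systems allow smaller-gauge wire.
# The 2 kW load below is an illustrative assumption, not measured data.

def wire_current(power_w: float, bus_voltage_v: float) -> float:
    """Current the harness must carry for a given load power (I = P / V)."""
    return power_w / bus_voltage_v

load_w = 2000.0  # hypothetical total electronics load

i_12v = wire_current(load_w, 12.0)
i_48v = wire_current(load_w, 48.0)

print(f"12V bus: {i_12v:.1f} A")  # 166.7 A
print(f"48V bus: {i_48v:.1f} A")  # 41.7 A

# Resistive harness loss scales as I^2 * R, so for the same wire the
# 48V system dissipates (12/48)^2 = 1/16 of the loss -- which is why
# thinner, cheaper wire suffices at the same power delivery.
loss_ratio = (i_48v / i_12v) ** 2
print(f"relative I^2*R loss at 48V: {loss_ratio:.4f}")  # 0.0625
```

The quadratic scaling of conduction loss with current is the design choice at the heart of the 12V-to-48V migration.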

What’s driving demand for compute power in cars?

Classification and processing of massive amounts of data from multiple sources in automotive applications – video, audio, radar, lidar – results in a high degree of complexity in automotive ICs as software algorithms require large amounts of compute power. Hardware architectural decisions, and even hardware-software partitioning, must be done with energy efficiency in mind. There is a plethora of tradeoffs at this stage:

  • Flexibility of a general-purpose CPU-based architecture vs. efficiency of a dedicated digital signal processor (DSP) vs. a hardware accelerator
  • Memory sub-system design: how much is required, how it will be partitioned, how much precision is really needed, just to name a few considerations

In order to enable reliable decisions, architects must have access to a system that models, in a robust manner, power, performance, and area (PPA) characteristics of the hardware, as well as use cases. The idea is to eliminate error-prone estimates and guesswork.

To improve energy efficiency, automotive IC designers also must adopt many of the power reduction techniques traditionally used by architects and engineers in the low-power application space (e.g. mobile or handheld devices), such as power domain shutoff, voltage and frequency scaling, and effective clock and data gating. These techniques can be best evaluated at the hardware design level (register transfer level, or RTL) – but with the realistic system workload. As a system workload – either a boot sequence or an application – is millions of clock cycles long, only an emulation-based solution delivers a practical turnaround time (TAT) for power analysis at this stage. This power analysis can reveal intervals of wasted power – power consumption bugs – whether due to active clocks when the data stream is not active, redundant memory access when the address for the read operation doesn’t change for many clock cycles (and/or when the address and data input don’t change for the write operation over many cycles), or unnecessary data toggles while clocks are gated off.
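One class of "power bug" described above, redundant memory reads where the address does not change for many cycles, can be sketched as a simple scan over a cycle-by-cycle activity trace. The trace format, threshold, and addresses below are hypothetical, not any particular tool's output:

```python
# Illustrative sketch (not a real EDA flow): scan a per-cycle list of
# read addresses for spans where the address is unchanged -- cycles in
# which the memory is re-read needlessly and power is wasted.

def find_redundant_read_windows(read_addrs, min_len=4):
    """Return (start_cycle, end_cycle, addr) spans where the read
    address stays constant for at least `min_len` consecutive cycles."""
    spans = []
    start = 0
    for cyc in range(1, len(read_addrs) + 1):
        # Close the current run at a change of address or end of trace.
        if cyc == len(read_addrs) or read_addrs[cyc] != read_addrs[start]:
            if cyc - start >= min_len:
                spans.append((start, cyc - 1, read_addrs[start]))
            start = cyc
    return spans

# Hypothetical 10-cycle trace: address 0x10 held 5 cycles, 0x18 held 4.
trace = [0x10, 0x10, 0x10, 0x10, 0x10, 0x14, 0x18, 0x18, 0x18, 0x18]
print(find_redundant_read_windows(trace))  # [(0, 4, 16), (6, 9, 24)]
```

Each reported span is a candidate for address-change gating or an enable signal on the memory's read port.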

To cope with the huge amount of data and the requirement to process that data in real time (or near real time), automotive designers employ artificial intelligence (AI) algorithms, both in software and in hardware. Millions of multiply-accumulate (MAC) operations per second and other arithmetic-intensive computations to process these algorithms give rise to a significant amount of wasted power due to glitches – multiple signal transitions per clock cycle. At the RTL stage, with the advanced RTL power analysis tools available today, it is possible to measure the amount of wasted power due to glitches as well as to identify glitch sources. Equipped with this information, an RTL design engineer can modify their RTL source code to lower the glitch activity, reduce the size of the downstream logic, or both, to reduce power.
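The glitch measurement described above, counting extra signal transitions beyond one per clock cycle, can be sketched over a toggle-event list. The event format and net names are illustrative assumptions:

```python
# Sketch: tally wasted (glitch) toggles per net from a hypothetical
# list of (cycle, net) toggle events. A glitch shows up as more than
# one transition of the same net within a single clock cycle.

from collections import defaultdict

def glitch_count(transitions):
    """Return {net: wasted_toggles}, where wasted toggles are any
    transitions beyond one per net per cycle."""
    per_cycle = defaultdict(int)
    for cycle, net in transitions:
        per_cycle[(net, cycle)] += 1
    wasted = defaultdict(int)
    for (net, _cycle), n in per_cycle.items():
        if n > 1:
            wasted[net] += n - 1
    return dict(wasted)

# Hypothetical MAC datapath trace: "mac_out" glitches twice in cycle 0,
# "acc" glitches once in cycle 2.
events = [(0, "mac_out"), (0, "mac_out"), (0, "mac_out"),
          (1, "mac_out"), (1, "acc"), (2, "acc"), (2, "acc")]
print(glitch_count(events))  # {'mac_out': 2, 'acc': 1}
```

Nets with high wasted-toggle counts are the ones worth balancing path delays or restructuring logic for.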

Working together with the RTL design engineer is another critical persona – the verification engineer. In order to verify the functional behavior of the design, the verification engineer is no longer dealing just with the RTL source: they also have to verify the proper functionality of the global power reduction techniques such as power shutoff and voltage/frequency scaling. Doing so requires a holistic approach that leverages a comprehensive description of power intent, such as the Unified Power Format (UPF). All verification technologies – static, formal, emulation, and simulation – can then correctly interpret this power intent to form an effective verification methodology.

Power intent also carries through to the implementation part of the flow, as well as signoff. During the implementation process, power can be further optimized through physical design techniques while conforming to timing and area constraints. Highly accurate power signoff is then used to check conformance to power specifications before tape-out.

Design and verification flow for more energy-efficient automotive SoCs

Synopsys delivers a complete end-to-end solution that allows IC architects and designers to drive energy efficiency in automotive designs. This solution spans the entire design flow from architecture to RTL design and verification, to emulation-driven power analysis, to implementation and, ultimately, to power signoff. Automotive IC design teams can now put in place a rigorous methodology that enables intelligent architectural decisions, RTL power analysis with consistent accuracy, power-aware physical design, and foundry-certified power signoff.

The post Maximizing Energy Efficiency For Automotive Chips appeared first on Semiconductor Engineering.


Accellera Preps New Standard For Clock-Domain Crossing

February 29, 2024, 09:06

Part of the hierarchical development flow is about to get a lot simpler, thanks to a new standard being created by Accellera. What is less clear is how long it will take before users see any benefit.

At the register transfer level (RTL), when a data signal passes between two flip flops, it initially is assumed that clocks are perfect. After clock-tree synthesis and place-and-route are performed, there can be considerable timing skew between the clock edges arriving at those adjacent flops. That makes timing sign-off difficult, but at least the clocks are still synchronous.

But if the clocks come from different sources, are at different frequencies, or a design boundary exists between the flip flops — which would happen with the integration of IP blocks — it’s impossible to guarantee that no clock edges will arrive when the data is unstable. That can cause the output to become unknown for a period of time. This phenomenon, known as metastability, cannot be eliminated, and the verification of those boundaries is known as clock-domain crossing (CDC) analysis.
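The standard back-of-the-envelope model for metastability failure rates shows why adding synchronizer flops on these boundaries helps so dramatically: MTBF = exp(t_r / tau) / (T_w * f_clk * f_data), where t_r is the resolution time allowed before the output is sampled, and tau and T_w are process-dependent flop constants. All device constants below are illustrative placeholders, not real process data:

```python
# Sketch: mean time between metastability failures for a CDC boundary,
# using the standard MTBF model. All constants are hypothetical.

import math

def mtbf_seconds(t_resolve, tau, t_w, f_clk, f_data):
    """MTBF = exp(t_resolve / tau) / (t_w * f_clk * f_data)."""
    return math.exp(t_resolve / tau) / (t_w * f_clk * f_data)

tau, t_w = 20e-12, 100e-12     # illustrative flop metastability constants
f_clk, f_data = 500e6, 50e6    # 500 MHz clock sampling 50 MHz async data

# Single flop: roughly half a cycle of resolution slack.
one_flop = mtbf_seconds(0.5e-9, tau, t_w, f_clk, f_data)
# Two-flop synchronizer: one extra full cycle (2 ns) to resolve.
two_flop = mtbf_seconds(2.5e-9, tau, t_w, f_clk, f_data)

print(f"1-flop MTBF: {one_flop:.2e} s")
print(f"2-flop MTBF: {two_flop:.2e} s")
```

Because the extra cycle of resolution time sits in the exponent, the two-flop MTBF is larger by a factor of e^100 here: metastability cannot be eliminated, but it can be made astronomically rare.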

Special care is required on those boundaries. “You have to compensate for metastability by ensuring that the CDC crossings follow a specific set of logic design principles,” says Prakash Narain, president and CEO of Real Intent. “The general process in use today follows a hierarchical approach and requires that the clock-domain crossing internal to an IP is protected and safe. At the interface of the IP, where the system connects with the IP, two different teams share the problem. An IP provider may recommend an integration methodology, which often is captured in an abstraction model. That abstraction model enables the integration boundary to be verified while the internals of it will not be checked for CDC. That has already been verified.”

In the past, those abstract models differentiated the CDC solutions from various vendors. That’s no longer the case. Every IP and tool vendor has different formats, making it costly for everyone. “I don’t know that there’s really anything new or differentiating coming down the pipe for hierarchical modeling,” says Kevin Campbell, technical product manager at Siemens Digital Industries Software. “The creation of the standard will basically deliver much faster results with no loss of quality. I don’t know how much more you can differentiate in that space other than just with performance increases.”

While this has been a problem for the whole industry for quite some time, Intel decided it was time for a solution. The company pushed Accellera to take up the issue, and helped facilitate the creation of the standard by chairing the committee. “I’m going to describe three methods of building a product,” says Iredamola “Dammy” Olopade, chair of the Accellera working group, and a principal engineer at Intel. “Method number one is where you build everything in a monolithic fashion. You own every line of code, you know the architecture, you use the tool of your choice. That is a thing of the past. The second method uses some IP. It leverages reuse and enables the quick turnaround of new SoCs. There used to be a time when all IPs came from the same source, and those were integrated into a product. You could agree upon the tools. We are quickly moving to a world where I need to source IPs wherever I can get them. They don’t use the same tools as I do. In that world, common standards are critical to integrating quickly.”

In some cases, there is a hierarchy of IP. “Clock-domain crossings are a central part of our business,” says Frank Schirrmeister, vice president of solutions and business development at Arteris. “A network-on-chip (NoC) can be considered as ‘CDC central’ because most blocks connected to the NoC have different clocks. Also, our SoC integration tools see all of the blocks to be integrated, and those touch various clock domains and therefore need to deal with the CDC code that is inserted.”

This whole thing can become very messy. “While every solution supports hierarchical modeling, every tool has its own model solution and its own model representation,” says Siemens’ Campbell. “Vendors, or users, are stuck with a CDC solution, because the models were created within a certain solution. There’s no real transportability between any of the hierarchical modeling solutions unless they want to go regenerate models for another solution.”

That creates a lot of extra work. “Today, when dealing with customer CDC issues, we have to consider the customer’s specific environment, and for CDC, a potential mix of in-house flows and commercial tools from various vendors,” says Arteris’ Schirrmeister. “The compatibility matrix becomes very complex, very fast. If adopted, the new Accellera CDC standard bears the potential to make it easier for IP vendors, like us, to ensure compatibility and reduce the effort required to validate IP across multiple customer toolsets. The intent, as specified in the requirements is that ‘every IP provider can run its tool of choice to verify and produce collateral and generate the standard format for SoCs that use a different tool.'”

Everyone benefits. “IP providers will not need to provide extra documentation of clock domains for the SoC integrator to use in their CDC analysis,” says Ahmed Nasr, digital design manager at Mixel. “The standard CDC attributes generated by the EDA tool will be self-contained.”

The use model is relatively simple. “An IP developer signs off on CDC and then exports the abstract model,” says Real Intent’s Narain. “It is likely they will write this out in both the Accellera format and the native format to provide backward compatibility. At the next level of hierarchy, you read in the abstract model instead of reading in the full view of the design. They have various views of the IP, including the CDC view of the IP, which today is on the basis of whatever tool they use for CDC sign-off.”

The potential is significant. “If done right and adopted, the industry may arrive at a common language to describe CDC aspects that can streamline the validation process across various tools and environments used by different users,” says Schirrmeister. “As a result, companies will be able to integrate and validate IP more efficiently than before, accelerating development cycles and reducing the complexity associated with SoC integration.”

The standard
Intel’s Olopade describes the approach that was taken during the creation of the standard. “You take the most complex situations you are likely to find, you box them, and you co-design them in order to reduce the risk of bugs,” he said. “The boundaries you create are supposed to be simple boundaries. We took that concept, and we brought it into our definition to say the following: ‘We will look at all kinds of crossings, we will figure out the simple common uses, and we will cover that first.’ That is expected to cover 95% to 98% of the community. We are not trying to handle 700 different exceptions. It is common. It is simple. It is what guarantees production quality, not just from a CDC standpoint, but just from a divide-and-conquer standpoint.”

That was the starting point. “Then we added elements to our design document that says, ‘This is how we will evaluate complexity, and this is how we’ll determine what we cover first,'” he says. “We broke things down into three steps. Step one is clock-domain crossing. Everyone suffers from this problem. Step two is reset-domain crossing (RDC). As low power is getting into more designs, there are a lot more reset domains, and there is risk between these reset domains. Some companies care, but many companies don’t because they are not in a power-aware environment. It became a secondary consideration. Beyond the basic CDC in phase one, and RDC in phase two, all other interesting, small usage complexities will be handled in phase three as extensions to the standard. We are not going to get bogged down supporting everything under the sun.”

Within the standards group there are two sub-groups — a mapping team and a format team. Common standards, such as AMBA, UCIe, and PCIe have been looked at to make sure that these are fully covered by the standard. That means that the concepts should be useful for future markets.

“The concepts contained in the standard are extensible to hardened chiplets,” says Mixel’s Nasr. “By providing an accurate standard CDC view for the chiplet, it will enable integration with other chiplets.”

Some of those issues have yet to be fully explored. “The standard’s current documentation primarily focuses on clock-domain crossing within an SoC itself,” says Schirrmeister. “Its direct applicability to the area of chiplets would depend on further developments. The interfaces between fully hardened IP blocks on chiplets would communicate through standard interfaces like UCIe, BoW, or XSR, so the synchronization issues between chiplets on substrates would appear to be elevated to the protocol levels.”

Reset-domain crossings have yet to appear in the standard. “The genesis of CDC is asynchronous clocks,” says Narain. “But the genesis for reset-domain crossing is asynchronous resets. While the destination is due to the clock, the source of the problem is somewhere else. And as a result, the nature of the problem, the methodology that people use to manage that problem, are very different. The kind of information that you need to retain, and the kind of information that you can throw away, is different for every problem. Hence, abstractions are actually very customized for the application.”

Does the standard cover enough ground? That is part of the purpose of the review period that was used to collect information. “I can see some room for future improvement — for example, making some attributes mandatory like logic, associated_clocks, clock_period for clock ports,” says Nasr. “Another proposed improvement is adding reconvergence information, to be able to detect reconverging outputs of parallel synchronizers.”

The impact of all of this, if realized, is enormous. “If you truly run a collaborative, inclusive, development cycle, two things will happen,” says Olopade. “One, you are going to be able to find multiple ways to solve each problem. You need to understand the pros and cons against the real problems you are trying to solve and agree on the best way we should do it together. For each of those, we record the options, the pros and cons, and the reason one was selected. In a public review, those that couldn’t be part of that discussion get to weigh in. We weigh it against what they are suggesting versus why did we choose it. In the cases where it is part of what we addressed, and we justified it, we just respond, and we do not make a change. If you’re truly inclusive, you do allow that feedback to cause you to change your mind. We received feedback on about three items that we had debated, where the feedback challenged the decisions and got us to rehash things.”

The big challenge
Still, the creation of a standard is just the first step. Unless a standard is fully adopted, its value becomes diminished. “It’s a commendable objective and a worthy endeavor,” says Schirrmeister. “It will make interoperability easier and eventually allow us, and the whole industry, to reduce the compatibility matrix we maintain to deal with vendor tools individually. It all will depend on adoption by the vendors, though.”

It is off to a good start. “As with any standard, good intentions sometimes get severed by reality,” says Campbell. “There has been significant collaboration and agreements on how the standard is being pushed forward. We did not see self-submarining, or some parties playing nice just to see what’s going on but not really supporting it. This does seem like good collaboration and good decision making across the board.”

Implementation is another hurdle. “Will it actually provide the benefit that it is supposed to provide?” asks Narain. “That will depend upon how completely and how quickly EDA tool vendors provide support for the standard. From our perception, the engineering challenge for implementing this is not that large. When this is standardized, we will provide support for it as soon as we can.”

Even then, adoption isn’t a slam dunk. “There are short- and long-term problems,” warns Campbell. “IP vendors already have to support multiple formats, but now you have to add Accellera on top of that. There’s going to be some pain both for the IP vendors and for EDA vendors. We are going to have to be backward-compatible and some programs go on for decades. There’s a chance that some of these models will be around for a very long time. That’s the short-term pain. But the biggest hurdle to overcome for a third-party IP vendor, and EDA vendor, is quality assurance. The whole point of a hierarchical development methodology is faster CDC closure with no loss in quality. The QA load here is going to be big, because no customer is going to want to take the risk if they’ve got a solution that is already working well.”

Some of those issues and fears are expected to be addressed at the upcoming DVCon conference. “We will be providing a tutorial on CDC,” says Olopade. “The first 30 minutes covers the basics of CDC for those who haven’t been doing this for the last 10 years. The next hour will talk about the Accellera solution. It will concentrate on those topics which were hotly debated, and we need to help people understand, or carry people along with what we recommend. Then it may become more acceptable and more adoptive.”

Related Reading
Design And Verification Methodologies Breaking Down
As chips become more complex, existing tools and methodologies are stretched to the breaking point.

The post Accellera Preps New Standard For Clock-Domain Crossing appeared first on Semiconductor Engineering.


Weak Verification Plans Lead To Project Disarray

February 29, 2024, 09:02

The purpose of the verification plan, or vPlan as we call it, is to capture all the verification goals needed to prove that the device works as specified. It’s a big responsibility! Getting it right means having a good blueprint for verification closure. Getting it wrong, however, could result in bug escapes, wasted resources, and possibly a device that fails altogether. With the focus on AI-driven verification, the efficiency and effectiveness of verification planning are expected to improve significantly.

There are several key elements needed to create a good vPlan. We will go over a few below.

Accurate verification features are needed for verification closure

The concept of divide and conquer suggests that every complex feature can be broken down into sub-features, which in turn can be further divided. Verisium Manager’s Planning Center facilitates this process by enabling users to create expandable/collapsible feature sections; without this key capability, quality is at risk.

Close alignment to the functional specification

Close adherence to the functional specification is crucially linked to the first point. Any new features or changes to existing ones should prompt immediate updates to the vPlan, as failing to do so could affect verification quality. The Planning Center allows users to associate paragraphs in the specification to the vPlan and provides notifications of any corresponding alterations. This allows users to respond by adjusting the vPlan accordingly in alignment with the specifications.

Connecting relevant metrics, vPlan features

Once the vPlan is defined, it’s important to connect the relevant metrics to demonstrate verification assurance of each feature. It may involve using a combination of code coverage, functional coverage, or directed test to provide that assurance. The Planning Center makes connecting these metrics to the vPlan very straightforward. Failing to link these metrics with the features could result in insufficiently verified features.
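The metric-to-feature linkage can be pictured as a small rollup, where a feature's assurance is bounded by its weakest linked metric and an unlinked feature is flagged as a planning gap. The data model, feature names, and scoring rule below are illustrative, not Verisium Manager's:

```python
# Sketch: rolling up coverage metrics per vPlan feature. Everything
# here (names, metrics, the min() rollup policy) is a hypothetical
# illustration of the linkage described in the text.

vplan = {
    "pkt_rx": {"code_cov": 0.92, "func_cov": 0.85},
    "pkt_tx": {"func_cov": 0.60, "directed_test": 1.00},
    "reset":  {},  # feature with no metrics attached: a planning gap
}

def feature_score(metrics):
    """Worst-case rollup: a feature is only as verified as its weakest metric."""
    return min(metrics.values()) if metrics else None

for feat, metrics in vplan.items():
    score = feature_score(metrics)
    status = f"{score:.0%}" if score is not None else "NO METRICS LINKED"
    print(f"{feat:8s} {status}")
```

The "NO METRICS LINKED" case is exactly the insufficiently verified feature the paragraph warns about: a feature that appears in the plan but has no evidence behind it.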

Showing real-time results

To effectively monitor progress and respond promptly to areas requiring attention, the vPlan should dynamically reflect the results in real time. This allows for accurate measurement of progress and focused allocation of resources, whereas delayed results could waste project time in non-priority areas. Verisium Manager’s vPlan Analysis automates this process, enabling users to view real-time vPlan status for relevant regressions.

Customers have shared that vPlan quality significantly influences project outcomes. It’s crucial to prioritize creating higher quality vPlans, rather than simply focusing on speed. However, maintaining consistent high quality can be challenging due to the human tendency to quickly lose interest, with initial strong efforts tapering off as the process continues.

A thorough verification plan is the key to success in ASIC verification. Verification reuse is critical to the productivity and efficiency of system-on-chip (SoC), and a good vPlan is the first step in this direction. If you’re a verification engineer, take the time to develop a thorough verification plan for your next project. It will be one of the best investments you can make in the success of your project.

The post Weak Verification Plans Lead To Project Disarray appeared first on Semiconductor Engineering.


Impact of Scaling and BEOL Technology Solutions At The 7nm Node On MRAM

A technical paper titled “Impact of Technology Scaling and Back-End-of-the-Line Technology Solutions on Magnetic Random-Access Memories” was published by researchers at Georgia Institute of Technology.

Abstract:

“While magnetic random-access memories (MRAMs) are promising because of their nonvolatility, relatively fast speeds, and high endurance, there are major challenges in adopting them for the advanced technology nodes. One of the major challenges in scaling MRAM devices is caused by the ever-increasing resistances of interconnects. In this article, we first study the impact of shrunk interconnect dimensions on MRAM performance at various technology nodes. Then, we investigate the impact of various potential back-end-of-the-line (BEOL) technology solutions at the 7 nm node. Based on interconnect resistance values from technology computer-aided design (TCAD) simulations and MRAM device characteristics from experimentally validated/calibrated physical models, we quantify the potential array-level performance of MRAM using SPICE simulations. We project that some potential BEOL technology solutions can reduce the write energy by up to 34.6% with spin-orbit torque (SOT) MRAM and 29.0% with spin-transfer torque (STT) MRAM. We also observe up to 21.4% reduction in the read energy of the SOT-MRAM arrays.”

Find the technical paper here. Published January 2024.

P. Kumar, D. E. Shim, S. Narla and A. Naeemi, “Impact of Technology Scaling and Back-End-of-the-Line Technology Solutions on Magnetic Random-Access Memories,” in IEEE Journal on Exploratory Solid-State Computational Devices and Circuits, vol. 10, pp. 13-21, 2024, doi: 10.1109/JXCDC.2024.3357625.

Related Reading
MRAM Getting More Attention At Smallest Nodes
Why this 25-year-old technology may be the memory of choice for leading edge designs and in automotive applications.
ReRAM Seeks To Replace NOR
There is increased interest in ReRAM for embedded computing, especially in automotive applications, as more of its known issues are solved. Nevertheless, there is no one-size-fits-all NVM.

The post Impact of Scaling and BEOL Technology Solutions At The 7nm Node On MRAM appeared first on Semiconductor Engineering.
