FreshRSS


Ultrathin vdW Ferromagnet at Room Temperature (MIT)

A technical paper titled “Current-induced switching of a van der Waals ferromagnet at room temperature” was published by researchers at Massachusetts Institute of Technology (MIT).

Abstract:

“Recent discovery of emergent magnetism in van der Waals magnetic materials (vdWMM) has broadened the material space for developing spintronic devices for energy-efficient computation. While there has been appreciable progress in vdWMM discovery, a solution for non-volatile, deterministic switching of vdWMMs at room temperature has been missing, limiting the prospects of their adoption into commercial spintronic devices. Here, we report the first demonstration of current-controlled non-volatile, deterministic magnetization switching in a vdW magnetic material at room temperature. We have achieved spin-orbit torque (SOT) switching of the PMA vdW ferromagnet Fe3GaTe2 using a Pt spin-Hall layer up to 320 K, with a threshold switching current density as low as Jsw = 1.69 × 10⁶ A cm⁻² at room temperature. We have also quantitatively estimated the anti-damping-like SOT efficiency of our Fe3GaTe2/Pt bilayer system to be ξDL = 0.093, using the second harmonic Hall voltage measurement technique. These results mark a crucial step in making vdW magnetic materials a viable choice for the development of scalable, energy-efficient spintronic devices.”
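
For a rough sense of scale, the quoted threshold current density can be turned into an absolute switching current for a device of assumed size. The short Python sketch below does that arithmetic; the channel width and conducting-layer thickness are hypothetical illustration values, not figures from the paper.

    # Rough scale estimate from the abstract's threshold current density.
    # The device dimensions below are hypothetical, NOT taken from the paper.
    J_SW_A_PER_CM2 = 1.69e6                 # Jsw quoted in the abstract, in A/cm^2
    J_SW_A_PER_M2 = J_SW_A_PER_CM2 * 1e4    # convert A/cm^2 -> A/m^2

    width_m = 1e-6        # assumed 1 um wide channel (hypothetical)
    thickness_m = 10e-9   # assumed 10 nm conducting thickness (hypothetical)

    cross_section_m2 = width_m * thickness_m
    switching_current_A = J_SW_A_PER_M2 * cross_section_m2
    print(f"Implied switching current: {switching_current_A * 1e6:.0f} uA")  # ~169 uA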

Find the technical paper here. Published February 2024. MIT’s related news article and video are here.

Kajale, S.N., Nguyen, T., Chao, C.A. et al. Current-induced switching of a van der Waals ferromagnet at room temperature. Nat Commun 15, 1485 (2024). https://doi.org/10.1038/s41467-024-45586-4

The post Ultrathin vdW Ferromagnet at Room Temperature (MIT) appeared first on Semiconductor Engineering.


Matrix multiplication breakthrough could lead to faster, more efficient AI models

By Benj Edwards (Ars Technica) | 8 March 2024, 22:07
When you do math on a computer, you fly through a numerical tunnel like this—figuratively, of course. (credit: Getty Images)

Computer scientists have discovered a new way to multiply large matrices faster than ever before by eliminating a previously unknown inefficiency, reports Quanta Magazine. This could eventually accelerate AI models like ChatGPT, which rely heavily on matrix multiplication to function. The findings, presented in two recent papers, have led to what is reported to be the biggest improvement in matrix multiplication efficiency in over a decade.

Multiplying two rectangular number arrays, known as matrix multiplication, plays a crucial role in today's AI models, including speech and image recognition, chatbots from every major vendor, AI image generators, and video synthesis models like Sora. Beyond AI, matrix math is so important to modern computing (think image processing and data compression) that even slight gains in efficiency could lead to computational and power savings.
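
To make the operation concrete, here is a minimal schoolbook matrix multiplication sketch in Python (not from the article): for two n × n matrices it performs on the order of n³ scalar multiplications, which is the baseline cost that faster matrix multiplication algorithms try to reduce.

    # Schoolbook matrix multiplication: C[i][j] = sum over k of A[i][k] * B[k][j].
    def matmul(A, B):
        n, inner, p = len(A), len(B), len(B[0])
        assert len(A[0]) == inner, "inner dimensions must match"
        C = [[0.0] * p for _ in range(n)]
        for i in range(n):
            for j in range(p):
                total = 0.0
                for k in range(inner):
                    total += A[i][k] * B[k][j]
                C[i][j] = total
        return C

    print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19.0, 22.0], [43.0, 50.0]]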

Graphics processing units (GPUs) excel at matrix multiplication because they can perform many calculations at once: they break large matrix problems into smaller blocks and compute those blocks concurrently.
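
As a loose CPU-side illustration of that splitting, the NumPy sketch below multiplies matrices block by block; the block size is an arbitrary example value, and a real GPU kernel would map tiles onto thread blocks rather than Python loops.

    import numpy as np

    # Blocked (tiled) matrix multiplication: the big product is assembled from many
    # small tile products. For a fixed output tile (i, j) the partial products are
    # accumulated; different output tiles are independent and could run concurrently.
    def blocked_matmul(A, B, block=64):
        n, k = A.shape
        k2, m = B.shape
        assert k == k2, "inner dimensions must match"
        C = np.zeros((n, m), dtype=np.result_type(A, B))
        for i in range(0, n, block):
            for j in range(0, m, block):
                for p in range(0, k, block):
                    C[i:i+block, j:j+block] += A[i:i+block, p:p+block] @ B[p:p+block, j:j+block]
        return C

    A = np.random.rand(256, 300)
    B = np.random.rand(300, 128)
    assert np.allclose(blocked_matmul(A, B), A @ B)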

