
Honor Play 60 Plus Launched With Big 6000mAh Battery

24 June 2024 at 10:03

Honor has launched a new affordable phone in China. The Honor Play 60 Plus targets people who want a long-lasting battery experience. We will check ...

The post Honor Play 60 Plus Launched With Big 6000mAh Battery appeared first on Gizchina.com.

  • ✇Semiconductor Engineering
  • Chip Industry Week In Review, by The SE Staff

Chip Industry Week In Review

31 May 2024 at 09:01

JEDEC and the Open Compute Project rolled out a new set of guidelines for standardizing chiplet characterization details, such as thermal properties, physical and mechanical requirements, and behavior specs. Those details have been a sticking point for commercial chiplets, because without them it’s not possible to choose the best chiplet for a particular application or workload. The guidelines are a prerequisite for a multi-vendor chiplet marketplace.

AMD, Broadcom, Cisco, Google, HPE, Intel, Meta, and Microsoft proposed a new high-speed, low-latency interconnect specification, Ultra Accelerator Link (UALink), between accelerators and switches in AI computing pods. The 1.0 specification will enable the connection of up to 1,024 accelerators within a pod and allow for direct loads and stores between the memory attached to accelerators.

Arm debuted a range of new CPUs, including the Cortex-X925 for on-device generative AI, and the Cortex-A725 with improved efficiency for AI and mobile gaming. It also announced the Immortalis-G925 GPU for flagship smartphones, and the Mali-G725/625 GPUs for consumer devices. Additionally, Arm announced Compute Subsystems (CSS) for Client to provide foundational computing elements for AI smartphone and PC SoCs, and it introduced KleidiAI, a set of compute kernels for developers of AI frameworks. The Armv9-A architecture also added support for the Scalable Matrix Extension to accelerate AI workloads.

TSMC said its 2nm process is on target to begin mass production in 2025. Meanwhile, Samsung is expected to release its 1nm plan next month, targeting mass production for 2026 — a year ahead of schedule, reports Business Korea.

CHIPS for America and NATCAST released a 2024 roadmap for the U.S. National Semiconductor Technology Center (NSTC), identifying priorities for facilities, research, workforce development, and membership.

China is investing CNY 344 billion (~$47.5 billion) into the third phase of its National Integrated Circuit Industry Investment Fund, also known as the Big Fund, to support its semiconductor sector and supply chain, according to numerous reports.

Malaysia plans to invest $5.3 billion in seed capital and support for semiconductor manufacturing in an effort to attract more than $100 billion in foreign investments, reports Reuters. Prime Minister Anwar Ibrahim announced the effort to create at least 10 companies focused on IC design, advanced packaging, and equipment manufacturing.

imec demonstrated a die-to-wafer hybrid bonding flow for Cu-Cu and SiCN-SiCN at pitches down to 2µm at the IEEE’s ECTC conference. This breakthrough could enable die and wafer-level optical interconnects.

The chip industry is racing to develop glass for advanced packaging, setting the stage for one of the biggest shifts in chip materials in decades — and one that will introduce a broad new set of challenges that will take years to fully resolve.

Quick links to more news:

In-Depth
Global
Product News
Markets and Money
Security
Research and Training
Quantum
Events and Further Reading


In-Depth

Semiconductor Engineering published its Systems & Design newsletter featuring these top stories:


Global

STMicroelectronics is building a fully integrated SiC facility in Catania, Italy. The high-volume 200mm facility is projected to cost over $5 billion.

Siliconware Precision Industries Co., Ltd. (SPIL) broke ground on an RM 6 billion (~$1.3 billion) advanced packaging and testing facility in Malaysia. Google will also invest $2 billion in Malaysia for its first data center and a Google Cloud hub, to meet growing demand for cloud services and AI literacy programs, reports AP.

Applied Materials disclosed in an SEC filing that it received additional subpoenas from the U.S. Department of Commerce’s (DoC) Bureau of Industry and Security related to shipments of advanced semiconductor equipment to China. This comes on the heels of similar subpoenas issued last year.

A Chinese contractor working for SK hynix was arrested in South Korea and is being charged with funneling more than 3,000 copies of a paper on solving process failure issues to Huawei, reports South Korea’s Union News.

VSORA, CEA-Grenoble, and Valeo were awarded $7 million from the French government to build low-latency, low-power AI inference co-processors for autonomous driving and other applications.

In the U.S., the National Highway Traffic Safety Administration (NHTSA) is investigating unexpected driving behaviors of vehicles equipped with Waymo‘s 5th Generation automated driving system (ADS), citing nine new incidents in addition to the 22 previously reported.


Product News

ASE introduced powerSIP, a power delivery platform designed to reduce signal and transmission loss while addressing current density challenges.

Infineon announced a roadmap for energy-efficient power supply units based on Si, SiC, and GaN to address the energy needs of AI data centers, featuring new 8 kW and 12 kW PSUs, in addition to the 3 kW and 3.3 kW units available today. The company also released its CoolSiC MOSFET 400 V family, specially developed for use in the AC/DC stage of AI servers, complementing the PSU roadmap.

Fig. 1: Infineon’s 8kW PSU. Source: Infineon

Infineon also introduced two new generations of high voltage (HV) and medium voltage (MV) CoolGaN TM devices, enabling customers to use GaN in voltage classes from 40 V to 700 V. The devices are built using Infineon’s 8-inch foundry processes.

Ansys launched Ansys Access on Microsoft Azure to provide pre-configured simulation products optimized for HPC on Azure infrastructure.

Foxconn Industrial Internet used Keysight Technology’s Open RAN Studio solution to certify an outdoor Open Radio Unit (O-RU).

Andes Technology announced an SoC and development board for the development and porting of large RISC-V applications.

MediaTek uncorked a pair of mobile chipsets built on a 4nm process that use an octa-core CPU consisting of 4X Arm Cortex-A78 cores operating at up to 2.5GHz paired with 4X Arm Cortex-A55 cores.

The NVIDIA H200 Blackwell platform is expected to begin shipping in Q3 of 2024 and will be available to data centers by Q4, according to TrendForce.

A room-temperature direct fusion hybrid bonding system from Be Semiconductor has shipped to the NHanced advanced packaging facility in North Carolina. The new system offers faster throughput for copper interconnects with submicron pad sizes, greater accuracy and reduced warpage.


Markets and Money

Frore Systems raised $80 million for its solid-state active cooling module, which removes heat from the top of a chip without fans. The device can be used in systems ranging from notebooks and network edge gateways to data centers.

Axus Technology received $12.5 million in capital equity funding to make its chemical mechanical planarization (CMP) equipment for semiconductor wafer polishing, thinning, and cleaning, including of silicon carbide (SiC) wafers.

Elon Musk’s xAI announced a series B funding round of $6 billion.

Micron was ordered to pay $445 million in damages to Netlist for patent infringement of the company’s DDR4 memory module technology between 2021 and 2024.

Global revenue from AI semiconductors is predicted to total $71 billion in 2024, up 33% from 2023, according to Gartner. In 2025, it is expected to jump to $91.9 billion. The value of AI accelerators used in servers is expected to total $21 billion in 2024 and reach $33 billion by 2028.

NAND flash revenue was $14.71 billion in Q1 2024, an increase of 28.1%, according to TrendForce.

The optical transceiver market dipped from $11 billion in 2022 to $10.9 billion in 2023, but it is predicted to reach $22.4 billion by 2029, driven by AI, 800G applications, and the transition to 200G/lane ecosystem technologies, reports Yole.

Yole also found that ultra-wideband technical choices and packaging types used by NXP, Apple, and Qorvo vary considerably, ranging from 7nm to 90nm, with both CMOS and finFET transistors.

The global market share of GenAI-capable smartphones increased to 6% in Q1 2024 from 1.3% in the previous quarter, reports Counterpoint. The premium segment accounted for over 70% of sales with Samsung on top and contributing 58%. Meanwhile, global foldable smartphone shipments were up 49% YoY in Q1 2024, led by Huawei, HONOR, and Motorola.


Security

The National Science Foundation awarded Worcester Polytechnic Institute researcher Shahin Tajik nearly $600,000 to develop new technologies to address hardware security vulnerabilities.

The Hyperform consortium was formed to develop European sovereignty in post-quantum cryptography, funded by the French government and EU credits. Members include IDEMIA Secure Transactions, CEA Leti, and the French cybersecurity agency (ANSSI).

In security research:

  • University of California Davis and University of Arizona researchers proposed a framework leveraging generative pre-trained transformer (GPT) models to automate the obfuscation process.
  • Columbia University and Intel researchers presented a secure digital low dropout regulator that integrates an attack detector and a detection-driven protection scheme to mitigate correlation power analysis.
  • Pohang University of Science and Technology (POSTECH) researchers analyzed threshold switch devices and their performance in hardware security.

The U.S. Defense Advanced Research Projects Agency (DARPA) seeks proposals for its AI Quantified program to develop technology to help deploy generative AI safely and effectively across the Department of Defense (DoD) and society.

Vanderbilt University and Oak Ridge National Laboratory (ORNL) partnered to develop dependable AI for national security applications.

The Cybersecurity and Infrastructure Security Agency (CISA) issued a number of alerts/advisories.


Research and Training

New York continues to amp up its semiconductor offerings. NY CREATES and Raytheon unveiled a semiconductor workforce training program, and Syracuse University is hosting a free virtual course focused on the semiconductor industry this summer.

In research news:

  • A team of researchers at MIT and other universities found that extreme temperatures up to 500°C did not significantly degrade GaN materials or contacts.
  • University of Cambridge researchers developed adaptive and eco-friendly sensors that can be directly and imperceptibly printed onto biological surfaces, such as a finger or flower petal.
  • Researchers at Rice University and Hanyang University developed an elastic material that moves like skin and can adjust its dielectric frequency to stabilize RF communications and counter disruptive frequency shifts that interfere with electronics when a substrate is twisted or stretched, with potential for stretchable wearable electronic devices.

The National Science Foundation (NSF) awarded $36 million to three projects chosen for their potential to revolutionize computing. The University of Texas at Austin-led project aims to create a next-gen open-source intelligent and adaptive OS. The Harvard University-led project targets sustainable computing. The University of Massachusetts Amherst-led project will develop computational decarbonization.


Quantum

Singapore will invest close to S$300 million (~$222 million) into its National Quantum Strategy to support the development and deployment of quantum technologies, including an initiative to design and build a quantum processor within the country.

Several quantum partnerships were announced:

  • Riverlane and Alice & Bob will integrate Riverlane’s quantum error correction stack within Alice & Bob’s larger quantum computing system based on cat qubit technology.
  • New York University and the University of Copenhagen will collaborate to explore the viability of hybrid superconductor-semiconductor quantum materials for the production of quantum chips and integration with CMOS processes.
  • NXP, eleQtron, and ParityQC showed off a full-stack, ion-trap based quantum computer demonstrator for Germany’s DLR Quantum Computing Initiative.
  • Photonic says it demonstrated distributed entanglement between quantum modules using optically-linked silicon spin qubits with a native telecom networking interface as part of a quantum internet effort with Microsoft.
  • Classiq and HPE say they developed a rapid method for solving large-scale combinatorial optimization problems by combining quantum and classical HPC approaches.

Events and Further Reading

Find upcoming chip industry events here, including:

Event | Date | Location
Hardwear.io Security Trainings and Conference USA 2024 | May 28 – Jun 1 | Santa Clara, CA
SWTest | Jun 3 – 5 | Carlsbad, CA
IITC2024: Interconnect Technology Conference | Jun 3 – 6 | San Jose, CA
VOICE Developer Conference | Jun 3 – 5 | La Jolla, CA
CHIPS R&D Standardization Readiness Level Workshop | Jun 4 – 5 | Online and Boulder, CO
SNUG Europe: Synopsys User Group | Jun 10 – 11 | Munich
IEEE RAS in Data Centers Summit: Reliability, Availability and Serviceability | Jun 11 – 12 | Santa Clara, CA
3D & Systems Summit | Jun 12 – 14 | Dresden, Germany
PCI-SIG Developers Conference | Jun 12 – 13 | Santa Clara, CA
AI Hardware and Edge AI Summit: Europe | Jun 18 – 19 | London, UK
DAC 2024 | Jun 23 – 27 | San Francisco
Find All Upcoming Events Here

Upcoming webinars are here, including an integrated SLM analytics solution, prototyping and validation of perception sensor systems, and improving PCB designs for performance and reliability.


Semiconductor Engineering’s latest newsletters:

Automotive, Security and Pervasive Computing
Systems and Design
Low Power-High Performance
Test, Measurement and Analytics
Manufacturing, Packaging and Materials

The post Chip Industry Week In Review appeared first on Semiconductor Engineering.

Elevating Smartphone Opulence: Honor Magic 6 RSR Porsche Design Review

11 May 2024 at 12:39

Luxury and performance often merge in the smartphone realm, as brands aim to provide an unmatched experience. The Honor Magic 6 RSR Porsche Design is ...

The post Elevating Smartphone Opulence: Honor Magic 6 RSR Porsche Design Review appeared first on Gizchina.com.

  • ✇XDA
  • Download Android USB Drivers for popular OEMs, by Skanda Hazarika

Download Android USB Drivers for popular OEMs

21 April 2024 at 01:30

Aftermarket tinkering on Android phones isn't as prominent as it once used to be. That's not to say you can't root your Android phone or install a custom ROM like LineageOS on it. You can still tinker and customize your phone to a large extent as long as you have the interest and the technical know-how. Most — if not all — aftermarket tinkering requires you to connect your Android smartphone to a computer using a USB cable, so you can use tools like the Android Debug Bridge (ADB) to interact with the device.

  • ✇IEEE Spectrum
  • Robert Kahn: The Great Interconnector, by Tekla S. Perry

Robert Kahn: The Great Interconnector

20 April 2024 at 17:00


In the mid-1960s, Robert Kahn began thinking about how computers with different operating systems could talk to each other across a network. He didn’t think much about what they would say to one another, though. He was a theoretical guy, on leave from the faculty of the Massachusetts Institute of Technology for a stint at the nearby research-and-development company Bolt, Beranek and Newman (BBN). He simply found the problem interesting.

“The advice I was given was that it would be a bad thing to work on. They would say it wasn’t going to lead to anything,” Kahn recalls. “But I was a little headstrong at the time, and I just wanted to work on it.”

Robert E. Kahn



Current job: Chairman, CEO, and president of the Corporation for National Research Initiatives (CNRI)

Date of birth: 23 December 1938

Birthplace: Brooklyn, New York

Family: Patrice Ann Lyons, his wife

Education: BEE 1960, City College of New York; M.A. 1962 and Ph.D. 1964, Princeton University

First job: Runner for a Wall Street brokerage firm

First electronics job: Bell Telephone Laboratories, New York City

Biggest surprise in career: Leaving—and then staying out of—academics

Patents: Several, including two related to the digital-object architecture and two on remote pointing devices

Heroes: His parents, his wife, Egon Brenner, Irwin Jacobs, Jack Wozencraft

Favorite books: March of Folly: From Troy to Vietnam (1984) by Barbara W. Tuchman; The Two-Ocean War: A Short History of the United States Navy in the Second World War (1963) by Samuel Eliot Morison

Favorite movies: The Day the Earth Stood Still (1951), Casablanca (1942)

Favorite kind of music: Opera, operatic musicals

Favorite TV shows: Golf, tennis, football, soccer—basically any sports show

Favorite food: Chinese that he cooks himself, as taught to him by Franklin Kuo, codeveloper of ALOHAnet at the University of Hawaii

Favorite restaurants: Le Bernardin, New York City, and L’Auberge Chez Francois, Great Falls, Va.

Leisure activities past and present: Skiing, whitewater canoeing, tennis, golf, cooking

Key organizational memberships: IEEE, Association for Computing Machinery (ACM), the U.S. National Academies of Science and Engineering, the Marconi Society

Major awards: IEEE Medal of Honor “for pioneering technical and leadership contributions in packet communication technologies and foundations of the Internet,” the Presidential Medal of Freedom, the National Medal of Technology and Innovation, the Queen Elizabeth Prize for Engineering, the Japan Prize, the Prince of Asturias Award

Kahn ended up “working on it” for the next half century. And he is still involved in networking research today.

It is for this work on packet communication technologies—as part of the project that became the ARPANET and in the foundations of the Internet—that Kahn is being awarded the 2024 IEEE Medal of Honor.

The ARPANET Is Born

Kahn wasn’t the only one thinking about connecting disparate computers in the 1960s. In 1965, Larry Roberts, then at the MIT Lincoln Laboratory, connected one computer in Massachusetts to another in California over a telephone line. Bob Taylor, then at the Advanced Research Projects Agency (ARPA), got interested in connecting computers, in part to save the organization money by getting the expensive computers it funded at universities and research organizations to share their resources over a packet-switched network. This method of communications involves cutting up data files into blocks and reassembling them at their destination. It allows each fragment to take a variety of paths across a network and helps mitigate any loss of data, because individual packets can easily be resent.
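The packet-switching scheme described above is easy to sketch: cut a message into numbered blocks, let the blocks travel (and arrive) in any order, and reorder them at the destination; a lost block can be resent individually. A toy illustration in Python, not any real ARPANET format:

```python
import random

def packetize(data: bytes, size: int):
    """Cut a message into (sequence number, chunk) packets of at most `size` bytes."""
    return [(seq, data[i:i + size])
            for seq, i in enumerate(range(0, len(data), size))]

def reassemble(packets):
    """Sort packets by sequence number and rebuild the original message."""
    return b"".join(chunk for _, chunk in sorted(packets))

message = b"packets can take different paths and arrive out of order"
packets = packetize(message, 8)
random.shuffle(packets)  # simulate packets taking different routes through the network
assert reassemble(packets) == message
```

Because each fragment is numbered independently, losing one means resending only that fragment rather than the whole file.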

Taylor’s project—the ARPANET—would be far more than theoretical. It would ultimately produce the world’s first operational packet network linking distributed interactive computers.

Meanwhile, over at BBN, Kahn intended to spend a couple of years in industry so he could return to academia with some real-world experience and ideas for future research.

“I wasn’t hired to do anything in particular,” Kahn says. “They were just accumulating people who they thought could contribute. But I had come from the conceptual side of the world. The people at BBN viewed me as other.”

Kahn didn’t know much about computers at the time—his Ph.D. thesis involved signal processing. But he did know something about communication networks. After earning a bachelor’s degree in electrical engineering from City College of New York in 1960, Kahn had joined Bell Telephone Laboratories, working at its headquarters in Manhattan, where he helped to analyze the overall architecture and performance of the Bell telephone system. That involved conceptualizing what the network needed to do, developing overall plans, and handling the mathematical calculations related to the architecture as implemented, Kahn recalls.

“We would figure out things like: Do we need more lines between Denver and Chicago?” he says.

Kahn stayed at Bell Labs for about nine months; to his surprise, a graduate fellowship came through that he decided to accept. He was off to Princeton University in the autumn of 1961, returning to Bell Labs for the next few summers.

So, when Kahn was at BBN a few years later, he knew enough to realize that you wouldn’t want to use the telephone network as the basis of a computer network: Dial-up connections took 10 or 20 seconds to go through, the bandwidth was low, the error rate was high, and you could connect to only one machine at a time.

Other than generally thinking that it would be nice if computers could talk to one another, Kahn didn’t give much thought to applications.

“If you were engineering the Bell System,” he says, “you weren’t trying to figure out who in San Francisco is going to say what to whom in New York. You were just trying to figure out how to enable conversations.”

Bob Kahn graduated from high school in 1955. Photo: Bob Kahn

Kahn wrote a series of reports laying out how he thought a network of computers could be implemented. They landed on the desk of Jerry Elkind, a BBN vice president who later joined Xerox PARC. And Elkind told Kahn about ARPA’s interest in computer networking.

“I didn’t really know what ARPA was, other than I had seen the name,” Kahn says. Elkind told him to send his reports to Larry Roberts, the recently hired program manager for ARPA’s networking project.

“The next thing I know,” Kahn says, “there’s an RFQ [request for quotation] from ARPA for building a four-node net.” Kahn, still the consummate academic, hadn’t thought he’d have to do much beyond putting his thoughts down on paper. “It never dawned on me that I’d actually get involved in building it,” he says.

Kahn handled the technical portion of BBN’s proposal, and ARPA awarded BBN the four-node-network contract in January of 1969. The nodes rolled out later that year: at UCLA in September; the Stanford Research Institute (SRI) in October; the University of California, Santa Barbara, in November; and the University of Utah in December.

Kahn postponed his planned return to MIT and continued to work on expanding this network. In October 1972, the ARPANET was publicly unveiled at the first meeting of the International Conference on Computer Communications, in Washington, D.C.

“I was pretty sure it would work,” Kahn says, “but it was a big event. There were 30 or 40 nodes on the ARPANET at the time. We put 40 different kinds of terminals in the [Washington Hilton] ballroom, and people could walk around and try this terminal, that terminal, which might connect to MIT, and so forth. You could use Doug Engelbart’s NLS [oN-Line System] at SRI and manipulate a document, or you could go onto a BBN computer that demonstrated air-traffic control, showing an airplane leaving one airport, which happened to be on a computer in one place, and landing at another airport, which happened to be on a computer in another place.”

The demos, he recalls, ran 24 hours a day for nearly a week. The reaction, he says, “was ‘Oh my God, this is amazing’ for everybody, even people who worried about how it would affect their businesses.”

Goodbye BBN, Hello DARPA

Kahn officially left BBN the day after the demo concluded to join DARPA (the agency having recently added the word “Defense” to its name). He felt he’d done what he could on networking and was ready for a new challenge.

“They hired me to run a hundred-million-dollar program on automated manufacturing. It was an opportunity of a lifetime, to get on the factory floor, to figure out how to distribute processing, distribute artificial intelligence, use distributed sensors.”

Bob Kahn served on the MIT faculty from 1964 to 1966. Photo: Bob Kahn

Soon after he arrived at DARPA, Congress pulled the plug on funding for the proposed automated-manufacturing effort. Kahn shrugged his shoulders and figured he’d go back to MIT. But Roberts asked Kahn to stay. Kahn did, but rather than work on ARPANET he focused on developing packet radio, packet satellite, and even, he says, packetizing voice, a technology that led to VoIP (Voice over Internet Protocol) today.

Getting those new networks up and running wasn’t always easy. Irwin Jacobs, who had just cofounded Linkabit and later cofounded Qualcomm, worked on the project. He recalls traveling through Europe with Kahn, trying to convince organizations to become part of the network.

“We visited three PTTs [postal, telegraph, and telephone services],” Jacobs said, “in Germany, in France, and in the U.K. The reactions were all the same. They were very friendly, they gave us the morning to explain packet switching and what we were thinking of doing, then they would serve us lunch and throw us out.” But the two of them kept at it.

“We took a little hike one day,” Jacobs says. “There was a steep trail that went up the side of a fjord, water coming down the opposite side. We came across an old man, casting a line into the stream rushing downhill. He said he was fishing for salmon, and we laughed—what were his chances? But as we walked uphill, he yanked on his rod and pulled out a salmon.” The Americans were impressed with his determination.

“You have to have confidence in what you are trying to do,” Jacobs says. “Bob had that. He was able to take rejection and keep persisting.”

Ultimately, a government laboratory in Norway, the Norwegian Defence Research Establishment, and a laboratory at University College London came on board—enough to get the satellite network up and running.

And Then Came the Internet

With the ARPANET, packet-radio, and packet-satellite networks all operational, it was clear to Kahn that the next step would be to connect them. He knew that the ARPANET design all by itself wouldn’t be useful for bringing together these disparate networks.

“Number one,” he says, “the original ARPANET protocols required perfect delivery, and if something didn’t get through and you didn’t get acknowledgment, you kept trying until it got through. That’s not going to work if you’re in a noisy environment, if you’re in a tunnel, if you’re behind a mountain, or if somebody’s jamming you. So I wanted something that didn’t require perfect communication.”

“Number two,” he continues, “you wanted something that didn’t have to wait for everything in a message to get through before the next message could get through.

“And you had no way in the ARPANET protocols for telling a destination what to do with the information when it got there. If a router got a packet and it wasn’t for another node on the ARPANET, it would assume ‘Oh, must be for me.’ It had nowhere else to send it.”
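The three gaps Kahn lists point toward one design: self-contained datagrams that tolerate loss and carry a destination network as well as a host, so a gateway can decide whether to deliver locally or forward. A minimal sketch, with field and function names that are illustrative rather than the real TCP/IP formats:

```python
from dataclasses import dataclass

@dataclass
class Datagram:
    dst_network: int  # which network the packet is bound for
    dst_host: int     # which host on that network
    payload: bytes

def route(gateway_network: int, dgram: Datagram) -> str:
    """A gateway either delivers locally or forwards -- the choice the original
    ARPANET protocols could not express, having no network field at all."""
    if dgram.dst_network == gateway_network:
        return f"deliver to host {dgram.dst_host}"
    return f"forward toward network {dgram.dst_network}"

print(route(1, Datagram(1, 42, b"hi")))  # destined for this gateway's own network
print(route(1, Datagram(3, 7, b"hi")))   # destined elsewhere, so it is handed on
```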


He approached Vint Cerf, then an assistant professor at Stanford University, who had been involved with Kahn in testing the ARPANET during its development, and he asked him to collaborate.

“Vint, as a computer scientist, thought of things in terms of bits and computer programs. As an electrical engineer, I thought about signals and bandwidth and the nondigital side of the world. We brought together different sets of talents,” Kahn says.

“Bob came out to Stanford to see me in the spring of 1973 and raised the problem of multiple networks,” Cerf recalls. “He thought they should have a set of rules that allowed them to be autonomous but interact with each other. He called it internetworking.”

“He’d already given this serious thought,” Cerf continues. “He wanted SRI to host the operations of the packet-radio network, and he had people in the Norwegian defense-research establishment working on the packet-satellite network. He asked me how we could make it so that a host on any network could communicate with another in a standardized way.”

Cerf was in.

The two met regularly over the next six months to work on “the internetworking problem.” Between them, they made some half a dozen cross-country trips and also met one-on-one whenever they found themselves attending the same conference. In July 1973, they decided it was time to commit their ideas to paper.

“I remember renting a conference room at the Cabana Hyatt in Palo Alto,” Kahn says. The two planned to sequester themselves there in August and write until they were done. Kahn says it took a day; Cerf remembers it as two, or at least a day and a half. In any case, they got it done in short order.

Cerf took the first crack at it. “I sat down with my yellow pad of paper,” he says. “And I couldn’t figure out where to start.”

“I went out to pay for the conference room,” Kahn says. “When I came back Vint was sitting there with the pencil in his hand—and not a single word on the paper.”

Kahn admits that the task wasn’t easy. “If you tried to describe the United States government,” he says, “what would you say first? It’s the buildings, it’s the people, it’s the Constitution. Do you talk about Britain? Do you talk about Indians? Where do you start?”

In 1997, President Bill Clinton [right] presented the National Medal of Technology to Bob Kahn [center] and Vint Cerf [left]. Photo: Bob Kahn

Kahn took the pencil from Cerf and started writing. “That’s his style,” Cerf says, “write as much as you can and edit later. I tend to be more organized, to start with an outline.”

“I told him to go away,” Kahn says, “and I wrote the first eight or nine pages. When Vint came back, he looked at what I had done and said, ‘Okay, give me the pencil.’ And he wrote the next 20 or 30 pages. And then we went back and forth.”

Finally, Cerf walked off with the handwritten version to give to his secretary to type. When she finished, he told her to throw that original draft away. “Historians have been mad at me ever since,” Cerf says.

“It might be worth a fortune today,” Kahn muses. The resulting paper, published in the IEEE Transactions on Communications in 1974, represented the basis of the Internet as we now know it. It introduced the Transmission Control Protocol, later separated into two parts and now known as TCP/IP.

A New World on an Index Card

A key to making this network of networks work was the Internet Protocol (IP) addressing system. Every new host coming onto the network required a new IP address. These numerical labels uniquely identify computers and are used for routing packets to their locations on the network.
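The network/host split that this paragraph describes can be sketched with Python's standard `ipaddress` module (a modern, CIDR-style illustration using example addresses; ARPANET-era addressing actually used fixed-length network numbers rather than prefix lengths):

```python
import ipaddress

# Example host address and the network it belongs to.
host = ipaddress.ip_address("192.0.2.42")
net = ipaddress.ip_network("192.0.2.0/24")

# Routers forward packets by matching the network part of the address;
# the host part identifies one machine within that network.
assert host in net
print(net.network_address)             # network part: 192.0.2.0
print(int(host) & int(net.hostmask))   # host part: 42
```

The network part is what Kahn was handing out from his index card; everything after it was up to the individual network's operator.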

Initially, Kahn assigned the network part of the IP addresses himself, keeping a record of who had been allotted what set of numbers on a single index card he carried in his shirt pocket. When that card began to fill up in the late ’70s, he decided it was time to turn over the task to others. It became the responsibility of Jon Postel, and subsequently that of the Internet Assigned Numbers Authority (IANA) at the University of Southern California. IANA today is part of ICANN, the Internet Corporation for Assigned Names and Numbers.

Bob Kahn and Vint Cerf visited Yellowstone National Park together in the early 2000s. Credit: Bob Kahn

Kahn moved up the DARPA ladder, to chief scientist, deputy director, and, in 1979, director of the Information Processing Techniques Office. He stayed in that last role until late 1985. At DARPA, in addition to his networking efforts, he launched the VLSI [very-large-scale integration] Architecture and Design Project and the billion-dollar Strategic Computing Initiative.

In 1985, with political winds shifting and government research budgets about to shrink substantially, Kahn left DARPA to form a nonprofit dedicated to fostering research on new infrastructures, including designing and prototyping networks for computing and communications. He established it as the Corporation for National Research Initiatives (CNRI).

Kahn reached out to industry for funding, making it clear that, as a nonprofit, CNRI intended to make its research results open to all. Bell Atlantic, Bellcore, Digital Equipment Corp., IBM, MCI, NYNEX, Xerox, and others stepped up with commitments that totaled over a million dollars a year for several years. He also reached out to the U.S. National Science Foundation and received funding to build testbeds to demonstrate technology and applications for computer networks at speeds of at least a gigabit per second. CNRI also obtained U.S. government funding to create a secretariat for the Internet Activities Board, which eventually led to the establishment of the Internet Engineering Task Force, which has helped evolve Internet protocols and standards. CNRI ran the secretariat for about 18 years.

Cerf joined Kahn at CNRI about six months after it started. “We were thinking about applications of the Internet,” Cerf says. “We were interested in digital libraries, as were others.” Kahn and Cerf sought support for such work, and DARPA again came through, funding CNRI to undertake a research effort involving building and linking digital libraries at universities.

They also began working on the concept of “Knowbots,” mobile software programs that could collect and store information to be used to handle distributed tasks on a network.

As part of that digital library project, Kahn collaborated with Robert Wilensky at the University of California, Berkeley, on a paper called “A Framework for Distributed Digital Object Services,” published in the International Journal on Digital Libraries in 2006.

The Digital Object Emerges

Out of this work came the idea that today forms the basis of much of Kahn’s current efforts: digital objects, also known as digital entities. A digital object is a sequence of bits, or a set of such sequences, having a unique identifier. A digital object may incorporate a wide variety of information—documents, movies, software programs, wills, and even cryptocurrency. The concept of a digital object, together with distributed repositories, metadata registries, and a decentralized identifier resolution system, form the digital-object architecture. From its identifier, a digital object can be located even if it moves to a different place on the net. Kahn’s collaborator on much of this work is his wife, Patrice Lyons, a copyright and communications lawyer.
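The core idea, an identifier that stays stable while the object's location changes, can be sketched in a few lines of Python. This is an illustrative toy, not the actual Handle System or DOI protocol, and every identifier and URL in it is made up:

```python
# Toy registry mapping persistent identifiers to mutable location records.
registry = {
    "20.500.1234/example-object": {
        "type": "document",
        "locations": ["https://repo-a.example/objects/42"],
    }
}

def resolve(identifier: str) -> dict:
    """Look up an identifier and return its current record."""
    return registry[identifier]

def move(identifier: str, new_location: str) -> None:
    """The object moves; only the registry record changes, never the identifier."""
    registry[identifier]["locations"] = [new_location]

# The object relocates, but anyone holding the identifier can still find it.
move("20.500.1234/example-object", "https://repo-b.example/objects/42")
print(resolve("20.500.1234/example-object")["locations"])
```

In the real architecture the registry itself is distributed and resolution is decentralized, but the contract is the same: the identifier, not the location, is the durable reference.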

Initially, CNRI maintained the registry of Digital Object Identifier (DOI) records. Then those came to be kept locally, and CNRI maintained just the registry of prefix records. In 2014, CNRI handed off that responsibility to a newly formed international body, the DONA Foundation in Geneva. Kahn serves as chair of the DONA board. The organization uses multiple distributed administrators to operate prefix registries. One, the International DOI Foundation, has handled close to 100 billion identifiers to date. The DOI system is used by a host of publishers, including IEEE, as well as other organizations to manage their digital assets.

A plaque commemorating the ARPANET now stands in front of the Arlington, Va., headquarters of the Defense Advanced Research Projects Agency (DARPA). Credit: Bob Kahn

Kahn sees this current effort as a logical extension of the work he did on the ARPANET and then the Internet. “It’s all about how we use the Internet to manage information,” he says.

Kahn, now 85, works more than five days a week and has no intention of slowing down. The Internet, he says, is still in its startup phase. Why would he step back now?

“I once had dinner with [historian and author] David McCullough,” Kahn explains. Referring to the 1974 paper he wrote with Cerf, he says, “I told him that if I were sitting in the audience at a meeting, people wouldn’t say ‘Here’s what the writers of this paper really meant,’ because I would get up and say, ‘Well, we wrote that and….’”

“I asked McCullough, ‘When do you consider the end of the beginning of America?’” After some discussion, McCullough put the date at 4 July 1826, when both John Adams and Thomas Jefferson passed away.

Kahn agreed that their deaths marked the end of the country’s startup phase, because Adams and Jefferson never stopped worrying about the country that they helped create.

“It was such an important thing that they were doing that their lives were completely embedded in it,” Kahn says. “And the same is true for me and the Internet.”

This article appears in the May 2024 print issue as “The Great Interconnector.”


I drove a car using just my eyes and a smartphone

28 February 2024 at 13:49

At Mobile World Congress, there are always a few bizarre demonstrations of technology. HONOR had one of its own in which it allowed journalists to control a car using only eye movements. This was made possible by eye-tracking tech inside the company’s newly launched HONOR Magic 6 Pro smartphone.

Before you assume I was driving a car through the streets of Barcelona just by looking at things, let me make some details clear. I could only do four things: start the car, stop the car, move it forward a few meters, and move it backward a few meters. Additionally, the car was in a warehouse the entire time, with a safety crew monitoring the situation to ensure no one was hurt. Despite the lack of a wild Barca joyride, it was still pretty impressive.

So how does this work? Let’s jump in!

Smartphone eye tracking

HONOR Eye Tracking Car Demo MWC 2024 03

Credit: Paul Jones / Android Authority

Eye tracking in smartphones is nothing new. Way back in 2013, Samsung launched its Smart Scroll feature inside the Galaxy S4. Using eye tracking, you could scroll a page up or down simply by looking at the top or bottom of the phone. It worked surprisingly well — but few people liked it because using your finger was both faster and easier. Eventually, Samsung abandoned Smart Scroll.

A lot has changed in eleven years, though, and HONOR is ready to give eye-tracking tools another chance. On the Chinese version of the HONOR Magic 6 Pro, eye-tracking tools already enable people to expand and open notifications just by looking at them. This requires much more finesse than Smart Scroll could have ever provided, and that finesse is made possible by AI processing on the device itself.

Eye tracking in smartphones has been a thing for over a decade, but AI smarts are making it faster and more accurate than ever.

To track eye movements, the phone uses its front-facing 3D sensor, which lives in the pill-shaped display cutout. To set it up, you need to calibrate the phone so that it knows where your eyes are and how your eye movement works. It only takes a few seconds to do.

I was astonished at how accurately the system could detect what I was looking at once I had calibrated the phone — but it certainly wasn’t flawless. This makes sense, though: this calibration setup is heavily skewed at the moment because all the data is based on Asian eyes. I also had some trouble with my glasses, which seemed to confuse the system a bit. I took them off, and things worked much better.

These are some of the reasons why the global version of the Magic 6 Pro isn’t launching with this capability. HONOR wants to collect more data on a wider range of eyes before it pushes the feature to global products.

Once you’re set up, you can look at a screen element (such as a button or notification) and hold your gaze on it for a second or two; that’s all it takes to activate it.
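That dwell-to-activate behavior can be sketched as a simple loop over timestamped gaze samples. This is assumed logic for illustration only; HONOR hasn't published its implementation, and the threshold value here is a guess at "a second or two":

```python
DWELL_SECONDS = 1.5  # assumed activation threshold

def dwell_activate(gaze_samples, threshold=DWELL_SECONDS):
    """gaze_samples: list of (timestamp, element) pairs in time order,
    where element is the UI target under the gaze (or None for nothing).
    Returns the first element dwelled on long enough, else None."""
    current, since = None, None
    for t, element in gaze_samples:
        if element != current:          # gaze moved: restart the dwell timer
            current, since = element, t
        elif element is not None and t - since >= threshold:
            return element              # gaze held long enough: activate
    return None

# The user glances at "forward", looks away, then settles on "stop".
samples = [(0.0, "forward"), (0.4, "forward"), (0.9, None),
           (1.0, "stop"), (1.8, "stop"), (2.6, "stop")]
print(dwell_activate(samples))  # → stop
```

The key design point is that the timer resets whenever the gaze leaves the element, which is what prevents a passing glance from triggering a button.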

So that’s how the eye-tracking tech works, but now let’s get into driving the car!

Driving a car through eye movements

HONOR Eye Tracking Car Demo MWC 2024 01

Credit: Paul Jones / Android Authority

In HONOR’s warehouse space, an unidentified car (likely an Alfa Romeo Giulia Quadrifoglio) was hooked up to a self-driving system. This allowed the car to move forward and backward without anyone touching anything inside it. The car’s remote key was inside, which allowed it to auto-start and stop through smartphone controls.

HONOR built a simple app for this demo. The app had four buttons to control the four actions, with plenty of space between each button to avoid eye-tracking confusion. After all, you don’t want someone accidentally moving the car forward when they meant to move it backward, for example.

There were a few hiccups, but overall, the eye-tracking tech worked great.

Like I said earlier, I did face issues with my glasses when I first tried the demo. But, once I removed them, I could “push” each button accurately and quickly. HONOR let us do it a few times (and even ride in the car one of those times, which was cool).

Overall, the demo was simple but effective. The impracticality of the demo notwithstanding, it did prove that smartphone eye tracking could be used for more than just scrolling your screen.

Is this safe? And why does this matter?

HONOR Eye Tracking Car Demo MWC 2024 02

Credit: Paul Jones / Android Authority

If you’re concerned about the security of any data obtained from eye tracking, HONOR says that it is taking that very seriously. All eye-tracking data is kept on-device, not in the cloud, and the selfie camera isn’t used at all for the tracking, so there’s never a full photo of your eyes. In a sense, it’s not too dissimilar from a phone’s fingerprint scanner.

The big question, though, is why is HONOR doing this? No one will be driving a car with just eye movements any time soon, after all. While that’s true, the idea of using eye movements to interact with technology is a sound concept. For example, if your phone were propped on your kitchen counter while you’re cooking, using your eyes to move around a recipe would be incredibly advantageous.

What HONOR has really done here is put some theatrics behind a simple idea: “pushing” a button using your eyes. It’s a bit over the top, sure, but no less exciting.

HONOR says the global version of the Magic 6 Pro will get eye-tracking features at some point this year. It also said it might come to last year’s Magic 5 Pro. That device has the necessary hardware for the feature, so it’s just a matter of HONOR putting in the work to make it happen.

What do you think about this? Do you see any advantages to eye-tracking tech? Let us know in the comments!


First Samsung, now HONOR is also working on a smart ring

28 February 2024 at 07:55

A user holds an Oura Ring 3 between two fingers, displaying the device’s sensors.

Credit: Kaitlyn Cimino / Android Authority
  • HONOR has reportedly confirmed that it’s working on a smart ring.
  • The company said it’s part of a wider health strategy.
  • The manufacturer also reiterated plans for a flip foldable.

Samsung made tech headlines earlier this year when it announced the Galaxy Ring. Now, fellow smartphone manufacturer HONOR has revealed smart ring plans of its own.

AI Eye Tracking Feature of Honor Magic6 Pro Can Remotely Control a Car

22 February 2024 at 07:40
Honor Magic6 Pro Car Control

The Honor Magic6 Pro debuted with a lot of advanced technologies. Just recently, we covered the battery techs the company used for its new flagship phone, making ...

The post AI Eye Tracking Feature of Honor Magic6 Pro Can Remotely Control a Car appeared first on Gizchina.com.
