
The Doyen of the Valley Bids Adieu



When Senior Editor Tekla S. Perry started in this magazine’s New York office in 1979, she was issued the standard tools of the trade: notebooks, purple-colored pencils for making edits and corrections on page proofs, a push-button telephone wired into a WATS line for unlimited long distance calling, and an IBM Selectric typewriter, “the latest and greatest technology, from my perspective,” she recalled recently.

And she put that typewriter through its paces. “In this period she was doing deep and outstanding reporting on major Silicon Valley startups, outposts, and institutions, most notably Xerox PARC,” says Editorial Director for Content Development Glenn Zorpette, who began his career at IEEE Spectrum five years later. “She did some of this reporting and writing with Paul Wallich, another staffer in the 1980s. Together they produced stories that hold up to this day as invaluable records of a pivotal moment in Silicon Valley history.”

Indeed, the October 1985 feature story about Xerox PARC, which she cowrote with Wallich, ranks as Perry’s favorite article.

“While now it’s widely known that PARC invented history-making technology and blew its commercialization—there have been entire books written about that—Paul Wallich and I were the first to really dig into what had happened at PARC,” she says. “A few of the key researchers had left and were open to talking, and some people who were still there had hit the point of being frustrated enough to tell their stories. So we interviewed a huge number of them, virtually all in person and at length. Think about who we met! Alan Kay, Larry Tesler, Alvy Ray Smith, Bob Metcalfe, John Warnock and Chuck Geschke, Richard Shoup, Bert Sutherland, Charles Simonyi, Lynn Conway, and many others.”

“I know without a doubt that my path and those of my younger women colleagues have been smoothed enormously by the very fact that Tekla came before us and showed us the way.” –Jean Kumagai

After more than seven years of reporting trips to Silicon Valley, Perry relocated there permanently as Spectrum’s first “field editor.”

Over the course of more than four decades, Perry became known for her profiles of Valley visionaries and IEEE Medal of Honor recipients, most recently Vint Cerf and Bob Kahn. She established working relationships—and, in some cases, friendships—with some of the most important people in Northern California tech, including Kay and Smith, Steve Wozniak (Apple), Al Alcorn and Nolan Bushnell (Atari), Andy Grove (Intel), Judy Estrin (Bridge, Cisco, Packet Design), and John Hennessy (chairperson of Alphabet and former president of Stanford).

Just as her interview subjects were regarded as pioneers in their fields, Perry herself ranks as a pioneer for women tech journalists. As the first woman editor hired at Spectrum and one of a precious few women journalists reporting on technology at the time, she blazed a trail that others have followed, including several current Spectrum staff members.

“Tekla had already been at Spectrum for 20 years when I joined the staff,” Executive Editor Jean Kumagai told me. “I know without a doubt that my path and those of my younger women colleagues have been smoothed enormously by the very fact that Tekla came before us and showed us the way.”

Perry is retiring this month after 45 years of service to IEEE and its members. We’re sad to see her go, and I know from personal experience that many readers are, too. I met an IEEE Life Member for breakfast a few weeks ago and asked him, as an avid Spectrum reader since 1964, what he liked most about the magazine. He began talking about Perry’s stories and how she had inspired him through the years. The connections forged between reader and writer are rare in this age of blurbage and spew, but the way Perry connected readers to their peers was, well, peerless. Just like Perry herself.

This article appears in the August 2024 print issue.

Space-based Solar Power: A Great Idea Whose Time May Never Come



The scene: A space-based solar power station called the Converter being commissioned some time in the Future. The characters: Two astronauts, Powell and Donovan, and a robot named QT-1 (“Cutie” to its human friends). The plot: The astronauts are training Cutie to take over the station’s operations, which involve collecting solar energy in space and then directing it as intense beams of microwaves down to Earth.

This is the backdrop for Isaac Asimov’s 1941 short story “Reason.” Most of the story centers on Asimov’s Three Laws of Robotics and the humans’ relationship with the robot. But the station itself is worth a second look. It’s pretty clear Asimov had no idea how a system like the Converter would actually work, except in the most basic terms. Here’s how Powell tries to explain it to Cutie:

“Our beams feed these worlds energy drawn from one of those huge incandescent globes that happens to be near us. We call that globe the Sun and it is on the other side of the station where you can’t see it.”

Harnessing the power of the sun in space is certainly an enticing idea. A decade ago we featured a project at the Japan Aerospace Exploration Agency that aimed to launch a 1-gigawatt solar station by 2031. As a step in that direction, JAXA says it will demonstrate a small satellite transmitting 1 kilowatt of power to Earth from an altitude of 400 kilometers next year. We’ve also reported on Caltech’s SSPD-1 demonstrator project and the US $100 million gift from a billionaire donor that funds it.

A space solar project would “waste capital that could be better spent improving less risky ways to shore up renewable energy, such as batteries, hydrogen, and grid improvements.”

And yet, space-based solar power remains more science fiction than science fact, as Henri Barde writes in “Castles in the Sky?” Barde should know: He recently retired from the European Space Agency, where among other things he evaluated space power systems. As Barde’s article makes abundantly clear, this clean energy would come at an enormous cost, if it can be done at all, “[wasting] capital that could be better spent improving less risky ways to shore up renewable energy, such as batteries, hydrogen, and grid improvements.”

For example, U.K.-based Space Solar estimates it will need 68 (!) SpaceX Starship launches to loft all the assets necessary to build one 1.7-kilometer-long solar array in orbit. Never mind that SpaceX hasn’t yet successfully launched a Starship into orbit and brought it back in one piece. Even if the company can eventually get the price down to $10 million per launch, we’re still talking hundreds of millions of dollars in launch costs alone. We also don’t have real-life Cuties to build such a station. And the ground stations and rectennas necessary for receiving the beamed power and putting it on the grid are still just distant dots on a road map in someone’s multimillion-dollar research proposal.
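The launch math above is easy to check for yourself. Here is the back-of-the-envelope calculation in a few lines of Python; note that the $10 million per-launch figure is the optimistic hypothetical from the text, not a confirmed price.

```python
# Back-of-the-envelope launch-cost estimate using the figures above.
# The $10 million per-launch price is the article's optimistic
# hypothetical best case, not a confirmed SpaceX figure.

launches = 68                  # Starship launches Space Solar estimates
cost_per_launch = 10_000_000   # optimistic best case, in US dollars

total_cost = launches * cost_per_launch
print(f"Launch costs alone: ${total_cost / 1e6:.0f} million")
# prints "Launch costs alone: $680 million"
```

Even under that best-case price, launch costs by themselves land in the high hundreds of millions, before a single rectenna or ground station is built.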

Engineers are often inspired by science fiction. But inspiration only gets you so far. Space-based solar power will remain sci-fi fodder for the foreseeable future. For the monumental task of electrifying everything while reducing greenhouse gas emissions, it’s better to focus on solutions based on technology already in hand, like conventional geothermal, nuclear, wind, and Earth-based solar, rather than wasting time, brainpower, and money on a fantasy.

This article appears in the June 2024 print issue as “The Chasm Between Imagination and Feasibility.”

Travels with Perplexity AI



“How did you find me?” specialty coffee roaster Dajo Aertssen asked. He’d just handed me a bag of single-origin cascara, the dried flesh of coffee cherries, in his shop, Cafés Muda in Lille, France.

“The AI sent us,” I replied.

He looked puzzled, so I explained that my companion Dawn and I had asked an app called Perplexity AI to name a “coffee roaster in central Lille,” and that it suggested four places, including Muda. It had linked to a Web page that mentioned Aertssen’s French and world cup tasting championships and that he sources the kinds of beans I enjoy, like Muda’s sidra from Colombia, which tastes like raspberry jam. When I alluded to his status as a supertaster, Aertssen beamed, astonished that I knew anything about him at all.

Dawn and I used Perplexity as an on-demand travel guide throughout our tour of the Flemish region of Belgium and Northern France. The chatbot-based, generative-search startup was founded in 2022 and backed by Nvidia, Jeff Bezos, and New Enterprise Associates, among others. The company’s app rests on GPT-3.5 (the free version) or GPT-4 (the paid version, which I used) and incorporates the company’s own page-ranking algorithm, large language model, and natural-language-processing system.

Perplexity aims to provide an alternative to the broken Google search experience, which forces users to sift through ads, paid placements, and AI-generated junk posts to find what they’re looking for. Perplexity gives answers to questions along with links to the sources it uses. According to search engine optimization company BrightEdge, website referral traffic from Perplexity has been increasing 40 percent per month since January.

As we criss-crossed the countryside seeking cycling museums, waffles, and beer, we passed dozens of wind turbines, prompting me to ask “How much of France’s total energy generation comes from wind power?” (Perplexity didn’t exactly answer this one, but referenced an International Energy Agency site that stated wind met 7.8 percent of national electricity demand in 2021.) Next: “Where can I buy distilled water near me right now?” (In the nearest Carrefour hypermarket, in the housekeeping section, it turns out.) Then we flipped on the audio response mode so the vaguely English-accented male voice could “tell us about the construction of the cathedral in Amiens, France,” which we were going to the next day.

Despite the occasional miss, we agreed that the app significantly enhanced our real-world experience and connected us to Flemish culture and people.

Those connections would not have been as convenient, or maybe even possible, without the Internet—which provides the data on which generative search and many popular large language models rely—and without the seminal contributions of this year’s Medal of Honor recipient, Bob Kahn, whom Senior Editor Tekla S. Perry profiles on page 36 in this issue.

Kahn is being honored for his work on packet communication technologies, which were part of the project that became the ARPANET and the foundations of the Internet. Could he have foreseen people driving around asking an AI-powered search app any question that popped into their heads? Probably not. As Perry writes: “Other than generally thinking that it would be nice if computers could talk to one another, Kahn didn’t give much thought to applications.”

“If you were engineering the Bell system,” Kahn told Perry, “you weren’t trying to figure out who in San Francisco is going to say what to whom in New York. You were just trying to figure out how to enable conversations.”

Kahn’s creations are enabling conversations today, with chatbots in cars and coffee roasters in cafés around the world.

This article appears in the May 2024 print issue as “Making Connections.”

Software Sucks, but It Doesn’t Have To



You can’t see, hear, taste, feel, or smell it, but software is everywhere around us. It underpins modern civilization even while consuming more energy, wealth, and time than it needs to and burping out a significant amount of carbon dioxide into the atmosphere. The software industry and the code it ships need to be much more efficient in order to minimize the emissions attributable to programs running in data centers and over transmission networks. Two approaches to software development featured in Spectrum’s April 2024 issue can help us get there.

In “Why Bloat Is Still Software’s Biggest Vulnerability,” Bert Hubert pays homage to the famed computer scientist and inventor of Pascal, Niklaus Wirth, whose influential essay “A Plea for Lean Software” appeared in IEEE Computer in 1995. Wirth’s essay built on a methodology first conceived by Spectrum contributing editor Robert N. Charette, who in the early 1990s adapted the Toyota Production System for software development.

Hubert points out that bloated code offers giant attack surfaces for bad actors. Malicious hacks and ransomware attacks, not to mention run-of-the-mill software failures, are like the weather now: partly cloudy with a 50 percent chance of your app crashing or your personal information being circulated on the Dark Web. Back in the day, limited compute resources forced programmers to write lean code. Now, with much more robust resources at hand, coders are writing millions of lines of code for relatively simple apps that call on hundreds of libraries of, as Hubert says, “unknown provenance.”
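To make that cascade concrete, here is a small Python sketch of how a couple of direct dependencies can fan out into a much larger transitive closure of third-party code. The package names and the dependency graph are entirely hypothetical; the point is only the shape of the problem.

```python
# Toy illustration of dependency bloat: a "simple" app with two
# direct dependencies transitively pulls in a much larger set of
# packages of, as Hubert puts it, unknown provenance.
# All package names below are hypothetical.

def transitive_deps(pkg, dep_map, seen=None):
    """Return the set of all packages reachable from pkg."""
    if seen is None:
        seen = set()
    for dep in dep_map.get(pkg, []):
        if dep not in seen:
            seen.add(dep)
            transitive_deps(dep, dep_map, seen)
    return seen

# Hypothetical dependency graph for a garage-door-opener app
dep_map = {
    "garage-app": ["http-framework", "json-lib"],
    "http-framework": ["tls-lib", "event-loop", "logging"],
    "event-loop": ["sys-bindings"],
    "tls-lib": ["crypto-core", "asn1-parser"],
    "json-lib": ["unicode-tables"],
}

closure = transitive_deps("garage-app", dep_map)
print(f"Direct dependencies: 2, transitive closure: {len(closure)}")
# prints "Direct dependencies: 2, transitive closure: 9"
```

In real ecosystems the fan-out is far worse: each node in this toy graph would itself be a package with dozens of dependencies, which is how a trivial app ends up shipping millions of lines of other people’s code.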

“There’s an already existing large segment of the software-development ecosystem that cares about this space—they just haven’t known what to do.” —Asim Hussain, Green Web Foundation

Among other things, he argues for legislation along the lines of what the European Union is trying to enforce: “NIS2 for important services; the Cyber Resilience Act for almost all commercial software and electronic devices; and a revamped Product Liability Directive that also extends to software.” Hubert, a software developer himself, walks the lean walk: His 3-megabyte image-sharing program Trifecta does the same job as other programs that use hundreds of megabytes of code.

Lean software should, in theory, be green software. In other words, it should run so efficiently that it reduces the amount of energy used in data centers and transmission networks. Overall, the IT and communications sectors are estimated to account for 2 to 4 percent of global greenhouse gas emissions and, according to one 2018 study, could by 2040 reach 14 percent. And that study came out prior to the explosion in AI applications, whose insatiable hunger for computing resources and the power required to feed the algorithms exacerbates an already complicated problem.

Thankfully, several groups are working on solutions, including the Green Web Foundation. The GWF was spun up almost 20 years ago to figure out how the Internet is powered, and now has a goal of a fossil-free Internet by 2030.

There are three main ways to achieve that objective, according to the foundation’s chair and executive director Asim Hussain: Use less energy, use fewer physical resources, and use energy more prudently—by, for instance, having your apps do more when there’s power from wind and solar available and less when there’s not.
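As a minimal sketch of that third idea, carbon-aware scheduling, consider deferring flexible batch work to the hour when forecast grid carbon intensity is lowest. Every number here is illustrative, and a real scheduler would query a live grid-data API and weigh job deadlines as well.

```python
# Carbon-aware scheduling sketch: run flexible work when the grid
# is cleanest. The threshold and forecast values are illustrative,
# not real grid data.

LOW_CARBON = 200  # gCO2e/kWh; illustrative "clean grid" cutoff

def pick_run_hour(forecast):
    """Return the index of the hour with the lowest carbon intensity."""
    return min(range(len(forecast)), key=lambda h: forecast[h])

# Hypothetical hourly carbon-intensity forecast (gCO2e/kWh)
forecast = [480, 350, 210, 150, 140, 300]

best = pick_run_hour(forecast)
print(f"Schedule the job for hour {best}: {forecast[best]} gCO2e/kWh, "
      f"below clean-grid cutoff: {forecast[best] <= LOW_CARBON}")
```

The decision logic fits in a dozen lines; the hard part in practice is classifying which workloads can actually tolerate being shifted in time.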

“There’s an already existing large segment of the software-development ecosystem that cares about this space—they just haven’t known what to do,” Hussain told Spectrum contributing editor Rina Diane Caballar. They do now, thanks to Caballar’s extensive reporting and the handy how-to guide she includes in “We Need to Decarbonize Software.” Programmers have the tools to make software leaner and greener. Now it’s up to them, and as we’ve seen in the EU, their legislators, to make sustainable and secure code their top priority. Software doesn’t have to suck.

Lean Software, Power Electronics, and the Return of Optical Storage



Stephen Cass: Hi. I’m Stephen Cass, a senior editor at IEEE Spectrum. And welcome to Fixing The Future, our biweekly podcast that focuses on concrete solutions to hard problems. Before we start, I want to tell you that you can get the latest coverage from some of Spectrum’s most important beats, including AI, climate change, and robotics, by signing up for one of our free newsletters. Just go to spectrum.ieee.org/newsletters to subscribe.

Today on Fixing The Future, we’re doing something a little different. Normally, we deep dive into exploring one topic, but that does mean that some really interesting things get left out of the podcast simply because they wouldn’t take up a whole episode. So here today to talk about some of those interesting things, I have Spectrum’s Editor in Chief Harry Goldstein. Hi, boss. Welcome to the show.

Harry Goldstein: Hi there, Stephen. Happy to be here.

Cass: You look thrilled.

Goldstein: I mean, I am thrilled. I’m always excited to talk about Spectrum stories.

Cass: No, we’ve tied you down and made you agree to this, but I think it’ll be fun. So first up, I’d like to talk about this guest post we had from Bert Hubert which seemed to really strike a chord with readers. It was called “Why Bloat Is Still Software’s Biggest Vulnerability: A 2024 plea for lean software.” Why do you think this one resonated with readers, and why is it so important?

Goldstein: I think it resonated with readers because software is everywhere. It’s ubiquitous. The entire world is essentially run on software. A few days ago, even, there was a good example of the AT&T network going down, likely because of some kind of software misconfiguration. This happens constantly. In fact, it’s kind of like bad weather, the software systems going down. You just come to expect it, and we all live with it. But why we live with it, and why we’re forced to live with it, is something people are interested in finding out more about, I guess.

Cass: So I think, in the past, we associated giant bloated software with large projects: big government projects, big airline systems, big, big, big projects. And we’ve written about that a lot at Spectrum before, haven’t we?

Goldstein: We certainly have. And Bob Charette, our longtime contributing editor, who is actually the father of lean software, back in the early ‘90s took the Toyota Total Quality Management program and applied it to software development. And so it was pretty interesting to see Hubert’s piece on this more than 30 years later where the problems have just proliferated. And think about your average car these days. It’s approaching a couple hundred million lines of code. A glitch in any of those could cause some kind of safety problem. Recalls are pretty common. I think Toyota had one a few months ago. So the problem is everywhere, and it’s just going to get worse.

Cass: Yeah. One of the things that struck me was that Bert’s making the argument that you don’t actually need an army of programmers now to create bloated software— to get all those millions of lines of code. You could be just writing code to open a garage door. This is a trivial program. Because of the way you’re writing it on frameworks, and those are pulling in dependencies and so on, you’re pulling in millions of lines of other people’s code. You might not even know you’re doing it. And you kind of don’t notice unless, at the end of the day, you look at your final program file and you’re like, “Oh, why is that megabytes upon megabytes?” which represents endless lines of source code. Why is that so big? Because this is how you do software. You just pull these things together. You glue stuff. You focus on the business logic because that’s your value add, but you’re not paying attention to this enormous sort of—I don’t know; what would you call it?—invisible dark matter that surrounds your software.

Goldstein: Right. It is kind of like dark matter. It actually got me thinking about all of these large language models being applied to software development. Co-piloting, I guess they call it, right, where the coder is sitting with an AI, trying to write better code. Do you think that might solve the problem, or get us closer?

Cass: No, because I think those systems, if you look at them, reflect modern programming usage. And modern programming usage is often to use the frameworks that are available. It’s not about really getting in and writing something that’s a little bit leaner. Actually, I think the AIs—it’s not their fault—just do what we do. And we write bloaty software. So I don’t think that’s going to get any better with this AI stuff, because the point of lean software is that it does take extra time to make, and there are no incentives to make lean software. And Bert talks about, “Maybe we’re going to have to impose some of this legis— l e g i s l a tively.”—I speak good. I editor. You hire wise.—But some of these things are going to have to be mandated through standards and regulations, and specifically through the lens of cybersecurity requirements and knowing what’s going into your software. And that may help with things getting a little bit leaner. But I did actually want to— another news story that came up this week was Apple closing down its EV division. And you mentioned Bob Charette there. He wrote this great thing for us recently about why EV cars are one thing and EV infrastructure is an even bigger problem, and why EVs are proving to be really quite tough. And maybe the problem— again, it’s a dark matter problem, not so much the car at the center but the infrastructure around it— just talk a little bit about Bob’s book, which is, by the way, free to download, and we’ll have the link in the show notes.

Goldstein: Everything you need to know about the EV transition can be yours for the low, low price of free. But, yeah. And I think we’re starting to see— I mean, even if you mandate things, you’re going to— you were talking about legislation to regulate software bloat.

Cass: Well, it’s kind of indirect. If you want to have good security, then you’re going to have to do certain things. The White House just came out with a paper, I think yesterday or the day before, saying, “Okay, you need to start using memory-safe languages.” It’s not quite saying, “You are forbidden from using C, and you must use Rust,” but it’s kind of close to that for certain applications. They exempted certain areas. But you can see that this is the government stepping into what has often been a very personal decision for programmers, like, “What language do I use?” and, “I know how to use C. I know how to manage my own memory,” and saying, “Yeah, we don’t care how great a programmer you think you are. These languages lead to this class of bugs, and we’d really prefer if you used one of these memory-safe languages.” And that’s, I guess, a push into the private lives of programmers that I think we’re going to see more of as time goes by.

Goldstein: Oh, that’s interesting because the—I mean, where I was going with that connection to legislation is that—I think what Bob found in the EV transition is that the knowledge base of the people who are charged with making decisions about regulations is pretty small. They don’t really understand the technology. They certainly don’t understand the interdependencies, which are very similar to the software development processes you were just referring to. It’s very similar to the infrastructure for electric cars because the idea, ultimately, for electric cars is that you also are revamping your grid to facilitate, whatchamacallit, intermittent renewable energy sources, like wind and solar, because having an electric car that runs off a coal-fired power plant is defeating the purpose, essentially. In fact, Ozzie Zehner wrote an article for us way back in the mid-Teens about the— the dirty secret behind your electric car is the coal that fuels it. And—

Cass: Oh, that was quite controversial. Yeah. I think maybe because the cover was a car perched at the top of a giant mountain of coal. I think that—

Goldstein: But it’s true. I mean, China has one of the biggest electric car industries in the world, if not the biggest, and one of the biggest markets that has not been totally saturated by personal vehicles, and all their cars are going to be running on coal. And they’re the world’s largest emitter, ahead of the US. But just circling back to the legislative angle and the state of the electric vehicle industry— well, actually, are we just getting way off topic with the electric vehicles?

Cass: No, it is this idea of interdependence, these very systems that are all coupled in ways we don’t expect. And with that EV story— so last time I was home in Ireland, one of the stories was that they had bought a fleet of electric double-decker buses for Dublin, to replace the existing double-deckers and help Ireland hit its carbon targets. So this was an official government goal. They bought the buses at great expense, and then they couldn’t charge the buses, because they hadn’t yet gotten the planning permission to add the charging stations to the bus depot. It was this staggering disconnect: on one hand, the national government is very much, “Yes, we’re meeting our target goals. We’re getting these green buses in. Fantastic advance. Very proud of it,” la la la la, and you can’t plug the things in, because the basic work on the ground, dealing with the local government, has not been done to put in the charging stations. All of these little disconnects add up. And the bigger, more complex a system you have, the more these things add up, which I think does come back to lean software. Because it’s not so much, “Okay, your software is bloaty, so you don’t win the Turing Award. Boo-hoo.” The problem is that you are pulling in all of these dependencies that you just do not know, and all these places where things can break— or the problem of libraries getting hijacked.

So we have to retain the capacity on some level— and this is actually a personal thing with me: I believe in the concept of personal computing. This was the thing back in the 1970s when personal computers first came out. It was very explicitly part of the culture that you would free yourself from the utilities and the centralized systems, that you could have a computer on your desk that would let you do stuff without having to go through, at that stage, university administrators and paperwork. It was a personal computer revolution, and that was very much front and center. And nowadays it’s kind of come back full circle, because we’re increasingly finding things don’t work if they’re not network connected. So I believe it should be possible to have machines that operate independently, truly personal machines. I believe it should be possible to write software to do even complicated things without relying on network servers or vast downloads. Or, again, you get the situation where you want something to run independently, okay, but you’ve got to download these Docker images that are 350 megabytes or something, because an entire operating system has to be bundled into them, because it is impossible to otherwise replicate the correct environment in which the software runs. Which also undercuts the whole point of open source software. The point of open source is, if I don’t like something, I can change it. But if it’s so hard for me to change something because I have to replicate the exact environment and toolchains that the people on a particular project are using, it really limits my ability to come in and maybe make some small changes, or modify something, or pull it into my project. That I have to bring this whole trail of dependencies with me is really tough. Sorry, that’s my rant.

Goldstein: Right. Yeah. Yeah. Actually, one of the things I learned the most about from the Hubert piece was Docker and the idea that you have to put your program in a container that carries with it an entire operating system or whatever. Can you tell me more about containers?

Cass: Yeah. Yeah. Yeah. I mean, you can put whatever you want into a container, and some containers are very small. You can get very lean containers that are basically just the program and its installation. But containers basically replace the old idea of installing software, where— and that was a problem, because every time you installed a bit of software, it scarred your system in some way. There was always scar tissue because it made changes. It nestled in. If nothing else, it put files onto your disk. And so over time, one of the problems was that your computer would accumulate random files. It was very hard to really uninstall something completely, because it would always put little hooks in and register itself in different places in the operating system, again, because now it’s interoperating with a whole bunch of stuff. Programs are not completely standalone. At the very least, they’re talking to an operating system, and you want them to talk nicely to other programs in that operating system. And this led to all these kinds of direct-install problems.

And so the idea was, “Oh, we will sandbox this out. We’ll have these little Docker images, basically, to do it.” But that does give you the freedom to build these huge images, which are essentially virtual machines. So, again, it relieves you of having to figure out your install and your configuration, which is one thing he was talking about: when you had to write installers, it really made you clarify your thinking very sharply on configuration and so on. So again, containers are great. All these cloud technologies, being able to use libraries, being able to automatically pull in dependencies, they’re all terrific in moderation. They all solve very real problems. I don’t want to be a Luddite and go, “We should go back to writing assembler code as God intended.” That’s not what I’m saying. But it does sometimes enable bad habits. It can incentivize bad habits. And you have to really think very deliberately about how to combat those problems as they pop up.

Goldstein: But from the beginning, right? I mean, it seems to me like you have to commit to a lean methodology at the start of any project. It’s not something that the AI is going to come in and magically solve and slim down at the end.

Cass: No, I agree. Yeah, you have to commit to it, or you have to be very deliberate about your frameworks: I’m not going to necessarily use these frameworks; I’m going to try to do some of this myself, or I’m going to be very careful about which libraries I use. Maybe I’ll use a library that has 80 percent of what I need and doesn’t pull in other dependencies, unlike the bells-and-whistles library that does 400 percent of what I need, and maybe I’ll write that extra 20 percent myself. And again, it requires skill and it requires time. And like anything else, there are incentives in the world that tend to militate against having the time to do that, which, again, is where we come back to these regulatory regimes, where it becomes a compliance requirement. I think a lot of people listening will know that the time when things get done is when they become compliance requirements, and then it’s mandatory. That has its own set of issues in terms of losing a certain amount of flexibility and so on, but it sometimes seems to be the only way to get things done, certainly in commercial environments. Not in personal projects, but certainly in commercial environments.

Goldstein: So what are the consequences, in a commercial environment, of bloat? Are there things beyond security? Here’s why I’m asking: the alternative to legislating lean software into the world is having it come from the bottom up, with people recognizing the need because bloat is costing them something. So what are the commercial costs of bloated software?

Cass: Well, apparently, absolutely none. That really is the issue. Really, none, because software often isn’t maintained. People just really want to get their products out. They want to move very quickly. We see this when it comes to— they like to abandon old software very quickly. Some companies like to abandon old products as soon as the new one comes out. There really is no commercial downside to using this big software because you can always say, “Well, it’s industry standard. Everybody is doing it.” Because everybody’s doing it, you’re not necessarily losing out to your competitor. We see these massive security breaches. And again, the way to legislate for lean software is through demanding better security. Because currently, we see these huge security breaches, and there are very minimal consequences. Occasionally, yes, a company screws up so badly that it goes down. But even so, sometimes they’ll reemerge in a different form, or they’ll get gobbled up by someone else.

There really does not, at the moment, seem to be any commercial downside for this big software, in the same way that— there are a lot of weird incentives in the system, and this certainly is one of them where, actually, the incentive is, “Just use all the frameworks. Bolt everything together. Use JS Electron. Use all the libraries. Doesn’t matter because the end user is not really going to notice very much if their program is 10 megabytes versus 350 megabytes,” especially now when people are completely immune to the size of their software. Back in the days when software came on floppy disk, if you had a piece of software that came on 100 floppy disks, that would be considered impractical. But nowadays, people are downloading gigabytes of data just to watch a movie or something like this. If a program is 1 gigabyte versus 100 megabytes, they don’t really notice. I mean, the only people who notice is if, say, video games— a really big video game. And then you see people going, “Well, it took me three hours to download the 70 gigabytes for this AAA game that I wanted to play.” That’s about the only time you see people complaining about the actual storage size of software anymore, but everybody else, they just don’t care. Yeah, it’s just invisible to them now.

Goldstein: And that’s a good thing. I think Charles Choi had a piece for us on how we’ll have endless storage on disks, apparently.

Cass: Oh, I love this story because it’s another story of a technology that looks like it’s headed off into the sunset, “We’ll see you in the museum.” And this is optical disk technology. I love this story and the idea that you can— we had laser disks. We had CDs. We had CD-ROMs. We had DVD. We had Blu-ray. And Blu-ray really seemed to be in many ways the end of the line for optical disks, that after that, we’re just going to use solid-state storage devices, and we’ll store all our data in those tiny little memory cells. And now we have these researchers coming back. And now my brain has frozen for a second on where they’re from. I think they’re from Shanghai. Is it Shanghai Institute?

Goldstein: Yes, I think so.

Cass: Yes, Shanghai. There we go. There we go. Very nice subtle check of the website there. And it might let us squeeze a data center into something the size of a room. And this is an optical disk technology where you can make a disk that’s about the size of a regular DVD, and you can squeeze in an enormous amount of data. I think he’s talking about petabits in a—

Goldstein: Yeah, like 1.6 petabits on--

Cass: Petabits on this optical surface. And the magic key is, as always, a new material. I mean, we do love new materials because they’re always the wellspring from which so much springs. And we have at Spectrum many times chased down materials that have not fulfilled necessarily their promise. We have a long history— and sometimes materials go away and they come back, like—

Goldstein: They come back, like graphene. It’s gone away. It’s come back.

Cass: —graphene and stuff like this. We’re always looking for the new magic material. But this new magic material, which has this—

Goldstein: Oh, yeah. Oh, I looked this one up, Stephen.

Cass: What is it? What is it? What is it? It is called--

Goldstein: Actually, our story did not even bother to include the translation because it’s so botched. But it is A-I-E, dash, D-D-P-R, AIE-DDPR or aggregation-induced emission dye-doped photoresist.

Cass: Okay. Well, let’s just call it magic new dye-doped photoresist. And the point about this is that this material works at basically four wavelengths. And why would you want a material that responds at four different wavelengths? Because the limit on optical technologies— and I’m also stretching here into the boundaries on either side of optical. The standard rule is you can’t really do anything that’s smaller than the wavelength of the light you’re using to read or write. So the wavelength of your laser sets the density of data on your disk. And what these clever clogs have done is they’ve worked out that by using basically two lasers at once, you can, in a very clever way, write a blob that is smaller than the wavelength of light, and you can do it in multiple layers. Your standard Blu-ray disks are very limited in the number of layers they have on them; CDs originally had just one layer.

So you have multiple layers on this disk that you can write to, and you can write at resolutions that you wouldn’t think you could do based on your high school physics or whatever. So you write it using these two lasers of two wavelengths, and then you read it back using another two lasers at two different wavelengths. And this all localizes and makes it work. And suddenly, as I say, you can squeeze racks and racks and racks of solid-state storage down to hopefully something that is very small. And what’s also interesting is that they’re actually closer to commercialization than you normally see with these early material stories. And they also think you could write one of these disks in six minutes, which is pretty impressive. As someone who has sat watching the progress bars on a lot of DVD-ROM burns over the years, six minutes to burn one of these—that’s probably for commercial mass production—is still pretty impressive. And so you could solve this problem of some of these large data transfers we get, where currently you do have to ship servers from one side of the world to the other because it actually is too slow to copy things over the internet. And so this would increase the bandwidth of sort of the global sneakernet, or station wagon net, quite dramatically as well.
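To put those figures in perspective, here’s a back-of-the-envelope sketch in Python. The 1.6-petabit capacity and the six-minute write time come from the discussion above; the 25-gigabyte single-layer Blu-ray and the 1-gigabit-per-second network link are assumptions added purely for comparison.

```python
# Back-of-the-envelope numbers for the proposed optical disk.
# From the discussion: 1.6 petabits per disk, six minutes to write.
# Comparison figures (single-layer Blu-ray, home internet link) are assumptions.

disk_bits = 1.6e15            # 1.6 petabits
disk_bytes = disk_bits / 8    # 2.0e14 bytes, i.e. 200 terabytes

bluray_bytes = 25e9           # assumed single-layer Blu-ray, ~25 GB
print(f"One disk holds ~{disk_bytes / 1e12:.0f} TB "
      f"(~{disk_bytes / bluray_bytes:,.0f} single-layer Blu-rays)")

write_seconds = 6 * 60
write_bps = disk_bits / write_seconds
print(f"Writing in six minutes implies ~{write_bps / 1e12:.1f} Tb/s sustained")

# Moving the same data over an assumed 1 Gb/s internet link instead:
link_bps = 1e9
transfer_days = disk_bits / link_bps / 86400
print(f"Same data over 1 Gb/s: ~{transfer_days:.0f} days")
```

On those assumed numbers, a payload that would take more than two weeks over a fast home connection fits on a single disk written in six minutes, which is the sneakernet appeal in a nutshell.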

Goldstein: Yeah. They are super interested in seeing them deployed in big data centers. And in order for them to do that, they still have to get the writing speed up and the energy consumption down. So the real engineering is just beginning for this. Well, speaking of new materials, there’s a new use for aluminum nitride, according to our colleague Glenn Zorpette, who wrote about the use of the material in power transistors. And apparently, if you properly dope this material, it has a much wider band gap and can handle higher voltages. So what does this mean for the grid, Stephen?

Cass: Yeah. So I actually find power electronics really fascinating because most of the history of transistors, right, is about making them use ever smaller amounts of electricity—5-volt logic used to be pretty common; now 3.3 is pretty common, and even 1.1 volts is pretty common—and really sipping microamps of power through these circuits. And power electronics kind of gets you back to the origins of being an electrical engineer, which is when you’re really talking about power and energy, and you are humping around thousands of volts and huge currents. And power electronics is an attempt to bring some of that smartness that transistors give you into these much higher voltages. And we’ve seen some of this with, say, gallium nitride, which is a material we had talked about in Spectrum for years—speaking of materials that float around for years—and then in the last five years or so you’ve seen it be a real commercial success. So all those wall warts we have have gotten dramatically smaller and better, which is why you can have a USB-C charger where you can drive your laptop and a bunch of ancillary peripherals all off one little wall wart without worrying about it bringing down the house, because it’s just so efficient and so small. And most of those now are these new gallium-nitride-based devices, which is one example where a material really is making some progress.

And so aluminum nitride is kind of another step along that path, being able to handle even higher voltages and bigger currents. So we’re not yet up to the level where you could switch these massive high-voltage transmission lines directly, but there’s a rising tide of where you can put this kind of electronics into your systems. First off, it means more efficiency. As I say, these power adapters that convert AC to DC get more efficient. The power supplies in your computer get more efficient, and the power supplies in your data center get more efficient (we’ve talked about how much power data centers use today). And it all adds up. And the whole point of this is that you do want a grid that is as smart as possible. You need something that will be able to handle very intermittent, fluctuating power sources. The current grid is really built around very, very stable power supplies, very constant power supplies, very stable frequency timings. So the frequency of the grid is the key to stability. Everything’s got to be on that 60 hertz in the US, 50 hertz in other places. Every power station has got to be synchronized very precisely with the others. So being able to handle fluctuations quickly is the key both to grid stability and to handling some of these intermittent sources, where the power varies as the wind blows stronger or weaker, as the day turns, as clouds move in front of your solar farm. So it’s very exciting from that point of view to see these very esoteric technologies. We’re talking about things like band gaps and how you stick the right doping molecule in the matrix, but it does bubble up into these very-large-scale impacts when we’re talking about the future of electrical engineering in that old-school power-and-energy, keeping-the-lights-on-and-the-motors-churning kind of way.

Goldstein: Right. And the electrification of everything is just going to put bigger demands on the grid, like you were saying, for alternative energy sources. “Alternative.” They’re all price competitive now, the solar and wind. But--

Cass: Yeah, and not just at the generation end—this idea that you have distributed power, and power can be generated locally, and also being able to switch power. So you have these smart transformers so that if you are generating surplus power on your solar panels, you can send that to maybe your neighbor next door who’s charging their electric vehicle, without it having to be mediated by going up to the power company. Maybe your local transformer is making some of these grid-balancing decisions that are much closer to where the power is being used.

Goldstein: Oh, yeah. Stephen, that reminds me of this other piece we had this week, actually, on utilities and how the profit motive on their part is hampering US grid expansion. It’s by a Harvard scholar named Ari Peskoe, and his first line is, “The United States is not building enough transmission lines to connect regional power networks. The deficit is driving up electricity prices, reducing grid reliability, and hobbling renewable-energy deployment.” What he does a good job of explaining is not only how these new projects might impact the utilities’ bottom lines but also all of the industry alliances they’ve established over the years, which become embedded interests that need to be disrupted.

Cass: Yeah, the truth is there is a list of things we could do. Not magic things. There are pretty obvious things we could do that would make the US grid better— even if you don’t care much about renewables, you probably do care about your grid’s resilience and reliability and being able to move power around. The US grid is not great. It is creaky. We know there are things that could be done. As a byproduct of doing those things, you also would actually make it much more renewable friendly. So it is this issue of— there are political problems. Depending on which administration is in power, there is more or less appetite to deal with some of these interests. And then, yeah, these utilities often have incentives to kind of keep things the way they are. They don’t necessarily want a grid where it’s easier to get cheaper electricity or greener electricity from one place to a different market. Everybody loves a captive monopoly market they can sell into. I mean, that’s wonderful if you can do that. And then there are many places with anti-competitive rules. But grids are a real— it’s really difficult to break down those barriers.

Goldstein: It is. And if you’re in Texas in a bad winter and the grid goes down and you need power from outside but you’re an island unto yourself and you can’t import that power, it becomes something that is disruptive to people’s lives, right? And people pay attention to it during a disaster, but we have a slow-rolling disaster called climate change that if we don’t start overturning some of the barriers to electrification and alternative energy sources, we’re kind of digging our own grave.

Cass: It is very tricky, because we do then get into these issues where you build these transmission lines, and there are questions about who ends up paying for those transmission lines and whether they get built over people’s lands, the local impacts of those. And it’s hard sometimes to tell. Is this a group that genuinely feels that there is a sort of justice gap here—that they’re being asked to pay for the sins of higher carbon producers—or is this astroturfing? Sometimes it’s very difficult to tell whether these organizations are being underwritten by people who are invested in the status quo, and it does become a knotty problem. And we are going to, I think, as things get more and more difficult, really be forced into making some difficult choices. And I am not quite sure how that’s going to play out, but I do know that we will keep tracking it as best we can. And I think maybe, yeah, you just have to come back and see how we keep covering the grid in the pages of Spectrum.

Goldstein: Excellent. Well—

Cass: And so that’s probably a good point where— I think we’re going to have to wrap this round up here. But thank you so much for coming on the show.

Goldstein: Excellent. Thank you, Stephen. Much fun.

Cass: So today on Fixing The Future, I was talking with Spectrum‘s Editor in Chief Harry Goldstein, and we talked about electric vehicles, we talked about software bloat, and we talked about new materials. I’m Stephen Cass, and I hope you join us next time.

Sci-fi and Hi-fi



Many a technologist has been inspired by science fiction. Some have even built, or rebuilt, entire companies around an idea introduced in a story they read, as the founders of Second Life and Meta did, working from the metaverse as imagined by Neal Stephenson in his seminal 1992 novel Snow Crash.

IEEE Spectrum has a history of running amazing sci-fi stories. Twenty years ago, I worked with computer scientist and novelist Vernor Vinge on his “Synthetic Serendipity,” a short story he adapted from his novel Rainbows End just for publication in Spectrum. Vinge’s work is informed by his research and relationships with some of the world’s leading technologists, which in turn gave me plenty of background for the accompanying 2004 Spectrum article “Mike Villas’s World.” Vinge’s tale of the near future explored then-nascent technologies, such as 3D printing, augmented reality, and advanced search engines, all of which he depicted with stunning clarity and foresight.

So when our News Manager Margo Anderson and Contributing Editor Charles Q. Choi hatched the idea for the science fiction/fact package featured in this issue, our local sci-fi maven, Special Projects Editor Stephen Cass, eagerly volunteered to shepherd the project. Stephen is coauthor of Hollyweird Science: From Quantum Quirks to the Multiverse (on the science shown in movies and TV shows) and the editor of several sci-fi anthologies, including Coming Soon Enough, published by Spectrum 10 years ago.

Choi suggested we hire the futurist Karl Schroeder, author of 10 sci-fi novels, to write the sci-fi story. Cass, Choi, and Schroeder then had a brainstorming session. Cass recalls, “I knew by the end of it that Karl had the chops to nail the real science concepts we wanted to explore, and come up with a compelling narrative.”

The idea they hit upon—turning a planet into a computer—is not new in science fiction, Cass notes. But “we wanted Karl to explore the idea in a way that would shed light on what purpose you’d put one to,” he says, “and also think about what some of the unintended consequences might be. And he had to do it in 2,500 words, which is a very tight fit for a story.”

As for the accompanying nonfiction annotations, Choi’s brief was to work with Cass and Schroeder to make sure that the story, although fantastical and set in the far future, was sufficiently grounded in ideas that scientists and futurists are taking seriously today.

And of course, any good sci-fi story needs some cool art. For that, Deputy Art Director Brandon Palacio chose Andrew Archer, whose work has a terrific balance of realism and stylistic flair. Historically, many science-fiction stories and books have had accompanying art that’s only barely related to what happens in the text, but Archer worked with us to make sure his work really fit “Hijack”.

Deft storytelling is something Cass himself delivers in this month’s Hands On: “Vintage Hi-Fi Enters the 21st Century”. Not only is he our in-house sci-fi expert, he’s also our staff do-it-yourselfer. This month, he resurrects a vintage hi-fi that came from his wife’s family. Inspired by the recent passing of his father, who helped his own father in their radio and television rental shop in Dublin before spending decades working as a broadcast engineer, Cass wires up a tale of family and connection through technology that you’ll read only in these pages.

The Greening of Transportation



According to the Intergovernmental Panel on Climate Change, approximately 15 percent of net anthropogenic greenhouse gas emissions come from the transportation sector. To meet global climate targets, we must devise ways to get people and goods from point A to point B without burning fossil fuels.

In this month’s special report on the greening of transportation, we examine a moonshot idea for powering electric vehicles, the biggest change in aviation since the jet engine, and a plan to power cargo ships with a battle-tested mode of generation.

Internal combustion engines (ICEs) in cars and vans accounted for almost half of all carbon dioxide emissions attributable to the transportation sector in 2022, according to Statista. And the world is waking up to the staggering challenges of going electric, as Contributing Editor Robert N. Charette pointed out last year in the IEEE Spectrum series “The EV Transition Explained.”

During his reporting for that series, Charette ran across a startup called Influit Energy that is trying to commercialize a new type of flow battery. Flow batteries are typically used in stationary applications like power-grid storage, but as Charette notes in our cover story, “Can Flow Batteries Finally Beat Lithium?” Influit’s battery circulates an energy-dense nanoelectrofuel to store 15 to 25 times as much energy as a similarly sized conventional flow battery. The Influit battery also compares favorably to lithium-based batteries in terms of safety and stability, and it could provide the range of an ICE vehicle. Cars and trucks with these kinds of batteries could fill up with the nanoelectrofuel at the pump, perhaps taking advantage of the existing infrastructure built for gas-guzzlers.
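To give a rough sense of what that 15-to-25-times claim implies, here is a small Python sketch. Only the multipliers come from the article; the roughly 25 watt-hours-per-liter baseline for a conventional (vanadium) flow battery and the lithium-ion comparison range are illustrative assumptions, not figures from the story.

```python
# What the article's "15 to 25 times" claim implies for energy density.
# Only the multipliers come from the article; the baseline and the
# lithium-ion comparison range are illustrative assumptions.

conventional_wh_per_l = 25.0              # assumed: typical vanadium flow battery
multiplier_low, multiplier_high = 15, 25  # from the article

nanofuel_low = conventional_wh_per_l * multiplier_low    # 375 Wh/L
nanofuel_high = conventional_wh_per_l * multiplier_high  # 625 Wh/L
print(f"Implied nanoelectrofuel density: {nanofuel_low:.0f}-{nanofuel_high:.0f} Wh/L")

# Assumed volumetric density range for lithium-ion cells, for comparison.
li_ion_low, li_ion_high = 250, 700
print(f"Li-ion cells (assumed): {li_ion_low}-{li_ion_high} Wh/L")
```

On those assumed numbers, the nanoelectrofuel would land in lithium-ion territory while remaining a pumpable liquid, which is what makes the fill-up-at-the-pump scenario plausible.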

“We are in the early stages of a key transition: Electrification could be the first fundamental change in airplane propulsion systems since the advent of the jet engine.”–Amy Jankovsky, Christine Andrews, and Bill Rogers

The second article in our report looks at how recent innovations in power electronics, electric motors, and batteries for the car industry are beginning to find applications in airplane design. In one effort, GE Aerospace and Boeing’s Aurora Flight Sciences are working together on a hybrid-electric propulsion system for a 150-to-180-seat airplane. The project, described by Amy Jankovsky, Christine Andrews, and Bill Rogers in “Fly the Hybrid Skies,” started in 2021 and aims to modify a Saab 340 aircraft using two GE CT7 engines combined with electric propulsion units for a megawatt-class system. As the authors note, “We are in the early stages of a key transition: Electrification could be the first fundamental change in airplane propulsion systems since the advent of the jet engine.”

The maritime industry needs a similar fundamental advance, reports Prachi Patel in “Merchant Shipping’s Nuclear Option.” Almost all of the world’s commercial fleets still run on diesel fuel. The industry needs to move much faster if it’s to reach the target of net-zero emissions by 2050 set by the United Nations’ International Maritime Organization.

One way to meet this goal is to go nuclear. Some 160 nuclear-powered vessels ply the high seas today, though almost all are navy ships and submarines. Next-generation small modular reactors (SMRs) could be a game changer for commercial cargo ships. Patel describes several efforts around the world to adapt SMRs to the marine environment. In theory, the small reactors should be safer and simpler to operate than conventional nuclear reactors.

It’s easy to look at the challenges posed by climate change and sigh. Or cry. The engineers you’ll find in this issue don’t have time for despair. They’re too busy working the problem.
