Latency, Interconnects, And Poker

Semiconductor Engineering sat down with Larry Pileggi, Coraluppi Head and Tanoto Professor of Electrical and Computer Engineering at Carnegie Mellon University, and the winner of this year’s Phil Kaufman Award for Pioneering Contributions. What follows are excerpts of that conversation.

SE: When did you first get started working in semiconductors — and particularly, EDA?

Pileggi: This was 1984 at Westinghouse research. We were making ASICs — analog and digital — and with digital you had logic simulators. But for analog, there was no way to simulate them at Westinghouse. They didn’t even have SPICE loaded onto the machine. So I got a copy of SPICE from Berkeley and loaded that tape, and I was the first to use it in the research center. I saw how limited it was and thought, ‘There must be more mature things than this.’ While I was working there, I was taking a class with Andrzej Strojwas at CMU (Carnegie Mellon University). He came up to me after a few weeks in that class and said, ‘I really think you should come back to school for a PhD.’ I had never considered it up until that point. But getting paid to go to school? That was cool, so I signed up.

SE: Circuit simulation in analog is largely brute force, right?

Pileggi: The tools that are out there are really good. There are many SPICEs out there, and they all have their niches where they can do really great things. But it’s not something you can easily scale. That’s really been a challenge. There’s brute force in the innermost loop, but you can accelerate it with hardware.
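
To make “brute force in the innermost loop” concrete, here is a minimal sketch of what a SPICE-style transient analysis does at every timestep: replace the capacitor with a backward-Euler discretization and run Newton iterations on the resulting nonlinear nodal equation. The one-node diode-RC circuit and all of the component values below are illustrative, not taken from any particular simulator.

```python
import math

# Toy transient analysis of one nonlinear node: a 1 V step source drives a series
# resistor R into a node loaded by a diode and a capacitor to ground.
# Nodal equation:  (v - Vs)/R + C*dv/dt + Is*(exp(v/Vt) - 1) = 0
R, C = 1e3, 1e-9            # 1 kOhm, 1 nF  (illustrative values)
Is, Vt = 1e-14, 0.025       # diode saturation current, thermal voltage
Vs = 1.0                    # step applied at t = 0
h, steps = 1e-9, 5000       # fixed 1 ns timestep, 5 us of simulated time

v = 0.0
for _ in range(steps):
    v_old = v
    # Inner loop: Newton's method on the backward-Euler discretized equation.
    for _ in range(50):
        f  = (v - Vs) / R + C * (v - v_old) / h + Is * (math.exp(v / Vt) - 1.0)
        df = 1.0 / R + C / h + (Is / Vt) * math.exp(v / Vt)
        dv = -f / df
        v += max(min(dv, 0.1), -0.1)   # damp the update so exp() stays well-behaved
        if abs(dv) < 1e-12:
            break

print(f"node voltage after {steps * h * 1e6:.0f} us: {v:.3f} V")  # settles near the ~0.6 V diode drop
```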

SE: What was the ‘aha’ moment for you with regard to dealing with latency of interconnect as interconnect continued to scale?

Pileggi: There was some interest in looking at RC networks that appeared on chips as sort of a special class of problem. Paul Penfield and others at MIT did this Elmore approximation of RC lines using the first moment of the impulse response. It’s from a 1930s Elmore paper about estimating the delay of amplifiers. Mark Horowitz, a student of Penfield, tried to extend that to a few moments. What we did was more of a generalized approach, using many moments and building high-order approximations that you could apply to these RC lines. So you’re really using this to calculate the dominant time constants, or the dominant poles, in the network. And for RC circuits, what’s really interesting is that the bigger the network gets, the more dominant the poles get. So you could have a million nodes out there — and it’s a million capacitors and a million poles — but for an RC line, three of them will model it really well. That makes things really efficient, providing you can capture those three efficiently. I was naive, not knowing that French mathematicians like [Henri] Padé had already attempted Padé approximations long before. I dove in like, ‘Oh, this should work.’ And I ran into a lot of the realities for why it doesn’t work. But then I was able to apply some of the circuit know-how to squeeze it into a place where it worked very effectively.
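
Here is a minimal numerical sketch of the moment-matching idea (illustrative only; this is not the original AWE code): build a uniform RC line, compute a few moments of its transfer function, and compare the first moment, which is exactly the Elmore delay, and a moment-ratio pole estimate against the exact dominant time constant.

```python
import numpy as np

# Uniform RC line: n segments, each a series resistance R followed by a grounded
# capacitance C, driven by a unit step through the first resistor. (Values illustrative.)
n, R, C = 20, 1.0, 1.0

# Nodal conductance matrix G (tridiagonal) and capacitance matrix for the chain.
G = np.zeros((n, n))
for i in range(n):
    G[i, i] += 1.0 / R            # resistor from the previous node (or the source) to node i
    if i + 1 < n:
        G[i, i] += 1.0 / R        # resistor from node i to node i+1
        G[i, i + 1] -= 1.0 / R
        G[i + 1, i] -= 1.0 / R
Cm = C * np.eye(n)
b = np.zeros(n); b[0] = 1.0 / R   # source injection through the first resistor
c = np.zeros(n); c[-1] = 1.0      # observe the far end of the line

# Moments of H(s) = c^T (G + sC)^(-1) b:  m_k = (-1)^k c^T (G^{-1} C)^k G^{-1} b.
# -m1 is exactly the Elmore delay; the ratio m_k / m_{k+1} converges to the dominant pole.
A = np.linalg.solve(G, Cm)
r = np.linalg.solve(G, b)
m = []
for k in range(5):
    m.append(((-1) ** k) * (c @ r))
    r = A @ r

elmore = -m[1]
pole_est = m[3] / m[4]                           # dominant-pole estimate from moment ratios
pole_exact = -np.min(np.linalg.eigvalsh(G))      # exact dominant pole (C is the identity here)
print(f"Elmore delay (first moment)   : {elmore:.1f}")
print(f"dominant time constant, est   : {-1.0 / pole_est:.1f}")
print(f"dominant time constant, exact : {-1.0 / pole_exact:.1f}")
```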

SE: A lot of that early work was around radio signals. But as you move that into the computing world, what else can you do with that? And if you now don’t have to put everything onto a single chip, does that change things?

Pileggi: Let’s take the power distribution for an IC, for example. That’s primarily dominated on the chip by RC phenomena. The resistance far dominates the jωL impedance — the inductance. But as you move to a package, that’s different. If you put different chips together, whether you stack them or you put them on an interposer, inductance starts to rear its ugly head. Inductance is extraordinarily nasty to model and simulate. The problem is that when you look at capacitances, that’s a potential matrix where you take the nearest couplings and say, ‘Okay, I have enough of this capacitance to say this is going to dominate the behavior.’ You’re essentially throwing away what you don’t need. With inductance, there’s a one-over relationship as compared to capacitance. Now, if you want the dominant inductance effect, that’s not so easy to get. If you have mutual couplings from everything to everything else, and if you say I’m going to throw away the couplings to faraway things, that’s a seemingly reasonable thing to do from an accuracy standpoint, but it affects the stability of the approximation. Essentially it can violate conservation of flux, such that you get positive poles. So you can actually create unstable systems by just throwing away small inductance terms. Usually when you see someone calculating inductance, it’s really just an estimate — or they’ve done some things to crunch it into a stable model.
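
A small numerical sketch of that stability point, using invented inductance values rather than anything extracted from a layout: discarding even one small mutual term can make a strongly coupled inductance matrix indefinite, which is exactly what lets a simulator produce positive (unstable) poles.

```python
import numpy as np

# Three mutually coupled conductors with strong coupling (values made up for illustration).
# The magnetic energy 0.5 * i^T L i is only guaranteed non-negative if L is positive definite.
L_full = np.array([[1.0, 0.9, 0.8],
                   [0.9, 1.0, 0.9],
                   [0.8, 0.9, 1.0]])

# "Sparsify" by discarding the smallest mutual coupling (the 0.8 terms) -- the kind of
# truncation that looks harmless from a pure accuracy standpoint.
L_trunc = L_full.copy()
L_trunc[0, 2] = L_trunc[2, 0] = 0.0

print("eigenvalues, full matrix     :", np.round(np.linalg.eigvalsh(L_full), 3))
print("eigenvalues, truncated matrix:", np.round(np.linalg.eigvalsh(L_trunc), 3))
# The truncated matrix picks up a negative eigenvalue: it no longer describes a passive
# system, so the simulated circuit can exhibit right-half-plane (unstable) poles.
```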

SE: Is that simulation based on the 80/20 rule, or 90/10?

Pileggi: Even for the packages we had before we started doing the multi-chip stuff, power distribution was RC, but when you flip it into a package with many layers of metal, it’s LC. We’ve had the same problem for the past 20 years, but it has been managed by good engineers. They apply very conservative methods to make sure the chips will work.

SE: So now, when you pile that into advanced nodes and packages and eliminate all that margin, you’ve got serious challenges, right?

Pileggi: Yes, and that’s why it was a good time for me to switch to electric power grids.

SE: Power grids for our communities have their own set of problems, though, like localization and mixing direct and alternating current, and a bunch of inverters.

Pileggi: It’s a fascinating problem. When I first stepped into it, a student of mine started telling me about how they did simulation. I said, ‘Wow, that doesn’t make any sense.’ I naively thought it was just like a big circuit, but it’s much more than that. It’s a very cool problem to work on. We’ve developed a lot of really exciting technology for that problem. With inverters, there’s a whole control loop. There isn’t the inertia that you have with big rotating machines that are fed by coal. But you have all these components on the same grid. How the grid behaves dynamically is a daunting problem.
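
One standard, textbook way to see the inertia point is the classical swing equation for an aggregated machine. The sketch below uses illustrative parameters and has no relation to the simulation technology Pileggi describes; it just shows how a lower inertia constant H gives a faster initial rate of change of frequency and a deeper dip after the same disturbance.

```python
# Classical swing equation for one aggregated machine plus a first-order droop governor
# (per-unit; all parameter values are illustrative, not from a real grid):
#   2H * d(dw)/dt = dP + Pg - D*dw        dw: frequency deviation [pu]
#   Tg * d(Pg)/dt = -dw/R - Pg            Pg: primary-response power [pu]
f0 = 60.0                                  # nominal frequency [Hz]
dP, D, R, Tg = -0.10, 1.0, 0.05, 0.5       # 10% generation loss, load damping, droop, governor lag
dt, T = 0.001, 20.0                        # Euler timestep and horizon [s]

def frequency_nadir(H):
    dw = Pg = 0.0
    nadir = 0.0
    for _ in range(int(T / dt)):
        ddw = (dP + Pg - D * dw) / (2.0 * H)
        dPg = (-dw / R - Pg) / Tg
        dw += dt * ddw
        Pg += dt * dPg
        nadir = min(nadir, dw)
    return nadir

for H in (6.0, 2.0):   # lots of rotating inertia vs. an inverter-heavy mix (illustrative)
    print(f"H = {H:.0f} s: initial rate of change = {f0 * dP / (2 * H):+.2f} Hz/s, "
          f"frequency nadir about {f0 * (1 + frequency_nadir(H)):.2f} Hz")
```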

SE: Does that also vary by weather? You’ve got wide variations in ambient temperature and all sorts of noise to contend with.

Pileggi: Yes, absolutely. In fact, how the lines behave is very much a function of temperature. That affects how resistive the transmission lines are. Frequency is very low, but the lengths are very long, so you have similar problems, but even more so with renewables. There’s sun, then a cloud, then sun. Or the wind changes direction. How do you store energy for use later? That’s where they talk about heavy batteries in the ground and things like that. Doing this with an old grid, like the one we have, is challenging. I’d much rather be starting from scratch.
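
For the temperature point, the usual first-order model is R(T) = R_ref·(1 + α·(T − T_ref)). The sketch below uses a typical handbook coefficient for aluminum conductor and an invented reference resistance, purely to show the size of the effect.

```python
# First-order temperature model for conductor resistance (alpha ~ 0.004 per degC is a
# typical handbook value for aluminum; the 20 degC resistance here is illustrative):
#   R(T) = R_ref * (1 + alpha * (T - T_ref))
alpha, R_ref, T_ref = 0.004, 0.05, 20.0          # per degC, ohm/km, reference temperature
for T in (20.0, 45.0, 70.0):                     # mild day vs. a hot, heavily loaded line
    print(f"{T:4.0f} degC: {R_ref * (1 + alpha * (T - T_ref)):.4f} ohm/km")
```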

SE: When you got started in electronics, was it largely the domain of some very big companies with very big research budgets?

Pileggi: Yes, and this is where you saw that management really makes a difference. Some of those companies, like Westinghouse Research, had these incredible R&D facilities, but they didn’t utilize them effectively, like all the gallium arsenide research where I was working. It seemed that every time we would develop an improvement, management didn’t know what to do with it. I worked with some of the smartest people I’ve ever met, and they had worked on projects like the first camera in space, but they were living in obscurity. Nobody knew anything about their work, but it was just amazing.

SE: One other math-related question. You apparently have a reputation for being a very strong poker player. How did these two worlds collide?

Pileggi: I was in Las Vegas for a DARPA meeting and I had an afternoon off and there was a Texas Hold’em poker tournament going on. I thought it would be kind of fun, so I played four or five hours, got knocked out, and it cost me 100 bucks. I was intrigued by it, though. I went back to Pittsburgh and found our local casino had started a poker room with tournaments. I started getting better, probably because I read like 30 books on the subject. The more you play, the more you realize there are lots of layers to this. I ultimately played in the World Series in Vegas, because it’s like a bucket-list thing, and that first time I made it to day two of the main event. That’s equivalent to finishing in the top 40% of the field. When I was back in Pittsburgh, there was a ‘Poker Night in America’ event at the casino. There were about 300 people and some pros. I played in that, and won first place. That was a Saturday around Thanksgiving in 2013. We played from noon until just after midnight, and then you start again on Sunday. We played until maybe 5 a.m.

SE: That must have taken a toll.

Pileggi: Yes, because I was chairing the search for new department heads. I had a Monday morning meeting scheduled that I couldn’t miss, so I e-mailed everyone to say I would be an hour late and asked if they could push back the meeting. I went home and ate something, slept for an hour, and went to campus to do the final vote. They asked, what happened? I said I was in a poker tournament. They thought I was joking. But then they saw me on TV. All the local news stations covered it like, ‘Local professor skips school.’ I got a call from someone I hadn’t talked to in 34 years. My dean said his son thought engineering was stupid. But then he found out that this engineer won this poker tournament, and now he thinks engineering is really cool.

SE: How did that affect your engineering classes?

Pileggi: I introduced myself to a group of students here two years ago when I became the department head and asked if they had any questions. One young lady raised her hand and said, ‘Yeah, can you teach us how to play poker?’ So now I do a poker training session with students once a semester.

