Play it again, Spirio

Seated at the grand piano in MIT’s Killian Hall last fall, first-year student Jacqueline Wang played through the lively opening of Mozart’s “Sonata in B-flat major, K. 333.” When she’d finished, Mi-Eun Kim, pianist and lecturer in MIT’s Music and Theater Arts Section (MTA), asked her to move to the rear of the hall. Kim tapped at an iPad. Suddenly, the sonata Wang had just played poured forth again from the piano — its keys dipping and rising just as they had with Wang’s fingers on them, the resonance of its strings filling the room. Wang stood among a row of empty seats with a slightly bemused expression, taking in a repeat of her own performance.

“That was a little strange,” Wang admitted when the playback concluded, then added thoughtfully: “It sounds different from what I imagine I’m playing.”

This unusual lesson took place during the nearly three-week MIT residency of the Steinway Spirio | r, a piano embedded with technology for live performance capture and playback. “The residency offered students, faculty, staff, and campus visitors the opportunity to engage with this new technology through a series of workshops that focused on such topics as the historical analysis of piano design, an examination of the hardware and software used by the Spirio | r, and step-by-step guidance of how to use the features,” explains Keeril Makan, head of MIT Music and Theater Arts and associate dean of the School of Humanities, Arts, and Social Sciences.

Wang was one of several residency participants to have the out-of-body experience of hearing herself play from a different vantage point, while watching the data of her performance scroll across a screen: color-coded rectangles indicating the velocity and duration of each note, an undulating line charting her use of the damper pedal. Wang was even able to edit her own performance, as she discovered when Kim suggested her rhythmic use of the pedal might be superfluous. Using the iPad interface to erase the pedaling entirely, they listened to the playback again, the notes gaining new clarity.

“See? We don’t need it,” Kim confirmed with a smile.

“When MIT’s new music building (W18) opens in spring 2025, we hope it will include this type of advanced technology. It would add value not just to Wang’s cohort of 19 piano students in the Emerson/Harris Program, which provides a total of 71 scholars and fellows with support for conservatory-level instruction in classical, jazz, and world music, but could also offer educational opportunities to a much wider swath of the MIT community,” says Makan. “Music is the fifth-most popular minor at MIT; 1,700 students enroll in music and theater arts classes each semester, and the Institute is brimming with vocalists, composers, instrumentalists, and music history students.”

According to Kim, the Spirio enables insights beyond what musicians could learn from a conventional recording; hearing playback directly from the instrument reveals sonic dimensions an MP3 can’t capture. “Speaker systems sort of crunch everything down — the highs and the lows, they all kind of sound the same. But piano solo music is very dynamic. It’s supposed to be experienced in a room,” she says.

During the Spirio | r residency, students found they could review their playing at half speed, adjust the volume of certain notes to emphasize a melody, transpose a piece to another key, or layer their performance — prerecording one hand, for example, then accompanying it live with the other.

“It helps the student be part of the learning and the teaching process,” Kim says. “If there’s a gap between what they imagined and what they hear and then they come to me and say, ‘How do I fix this?’ they’re definitely more engaged. It’s an honest representation of their playing, and the students who are humbled by it will become better pianists.”

For Wang, reflecting on her lesson with Kim, the session introduced an element she’d never experienced since beginning her piano studies at age 5. “The visual display of how long each key was played and with what velocity gave me a more precise demonstration of the ideas of voicing and evenness,” Wang says. “Playing the piano is usually dependent solely on the ears, but this combines with the auditory experience a visual experience and statistics, which helped me get a more holistic view of my playing.”

As a first-year undergraduate considering a Course 6 major (electrical engineering and computer science, or EECS), Wang was also fascinated to watch Patrick Elisha, a representative from Steinway dealer M. Steinert & Sons, disassemble the piano action to point out the optical sensors that measure the velocity of each hammer strike at 1,020 levels of sensitivity, sampled 800 times per second.

“I was amazed by the precision of the laser sensors and inductors,” says Wang. “I have just begun to take introductory-level courses in EECS and am just coming across these concepts, and this certainly made me more excited to learn more about these electrical devices and their applications. I was also intrigued that the electrical system was added onto the piano without interfering with the mechanical structure, so that when we play the Spirio, our experience with the touch and finger control was just like that of playing a usual Steinway.”

Another Emerson/Harris scholar, Víctor Quintas-Martínez, a PhD candidate in economics who resumed his lapsed piano studies during the Covid-19 pandemic, visited Killian Hall during the residency to rehearse a Fauré piano quartet with a cellist, violist, and violinist. “We did a run of certain passages and recorded the piano part. Then I listened to the strings play with the recording from the back of the hall. That gave me an idea of what I needed to adjust in terms of volume, texture, pedal, etc., to achieve a better balance. Normally, when you’re playing, because you’re sitting behind the strings and close to the piano, your perception of balance may be somewhat distorted,” he notes.

Kim cites another campus demographic ripe for exploring instruments like the Spirio | r and its software: future participants in MIT’s relatively new Music Technology Master’s Program, along with others across the Institute whose work intersects with the wealth of data the instrument captures. Among them is Praneeth Namburi, a research scientist at the MIT.nano Immersion Lab. Typically, Namburi focuses his neuroscience expertise on the biomechanics of dancing and expert movement. For two days during the MTA/Spirio residency, he used the sensors at the Immersion Lab, along with those of the Spirio, to analyze how pianists use their bodies.

“We used motion capture that can help us contrast the motion paths of experts such as Mi-Eun with those of students, potentially aiding in music education,” Namburi recounts, “force plates that can give scientific insights into how movement timing is organized, and ultrasound to visualize the forearm tissues during playing, which can potentially help us understand musicianship-related injuries.”

“The encounter between MTA and MIT.nano was something unique to MIT,” Kim believes. “Not only is this super useful for the music world, but it’s also very exciting for movement researchers, because playing piano is one of the most complex activities that humans do with our hands.”

In Kim’s view, that quintessentially human complexity is complemented by these kinds of technical possibilities. “Some people might think oh, it's going to replace the pianist,” she says. “But in the end it is a tool. It doesn’t replace all of the things that go into learning music. I think it's going to be an invaluable third partner: the student, the teacher, and the Spirio — or the musician, the researcher, and the Spirio. It's going to play an integral role in a lot of musical endeavors.”

© Still from a video by Trillium Studios/Arts at MIT; videography by Seven Generations

Mi-Eun Kim (seated), pianist and lecturer at MIT Music and Theater Arts, and student Holden Mui interact with the Steinway Spirio.

Zildjian's new e-drum kit is a gamechanger in music technology

By: Thom Dunn

Electronic drum kits can do wonderful things, but even the best models out there still suffer from the same essential flaw: those god damn rubber cymbals.

But a few weeks ago, my friend Chris (drummer for the Roland High Life) and I visited the headquarters of the Avedis Zildjian Company just outside of Boston. — Read the rest

The post Zildjian's new e-drum kit is a gamechanger in music technology appeared first on Boing Boing.

💾

💾

💾

Zildjian's new e-drum kit is a gamechanger in music technology

Od: Thom Dunn
Zildjian

Electronic drum kits can do wonderful things, but even the best models out there still suffer from the same essential flaw: those god damn rubber1 cymbals.

But a few weeks ago, my friend Chris (drummer for the Roland High Life) and I visited the headquarters of the Avedis Zildjian Company just outside of Boston. — Read the rest

The post Zildjian's new e-drum kit is a gamechanger in music technology appeared first on Boing Boing.

💾

💾

💾

Play it again, Spirio

Seated at the grand piano in MIT’s Killian Hall last fall, first-year student Jacqueline Wang played through the lively opening of Mozart’s “Sonata in B-flat major, K.333.” When she’d finished, Mi-Eun Kim, pianist and lecturer in MIT’s Music and Theater Arts Section (MTA), asked her to move to the rear of the hall. Kim tapped at an iPad. Suddenly, the sonata she'd just played poured forth again from the piano — its keys dipping and rising just as they had with Wang’s fingers on them, the resonance of its strings filling the room. Wang stood among a row of empty seats with a slightly bemused expression, taking in a repeat of her own performance.

“That was a little strange,” Wang admitted when the playback concluded, then added thoughtfully: “It sounds different from what I imagine I’m playing.”

This unusual lesson took place during a nearly three-week residency at MIT of the Steinway Spirio | r, a piano embedded with technology for live performance capture and playback. “The residency offered students, faculty, staff, and campus visitors the opportunity to engage with this new technology through a series of workshops that focused on such topics as the historical analysis of piano design, an examination of the hardware and software used by the Spirio | r, and step-by-step guidance of how to use the features,” explains Keeril Makan, head of MIT Music and Theater Arts and associate dean of the School of Humanities, Arts, and Social Sciences.

Wang was one of several residency participants to have the out-of-body experience of hearing herself play from a different vantage point, while watching the data of her performance scroll across a screen: color-coded rectangles indicating the velocity and duration of each note, an undulating line charting her use of the damper pedal. Wang was even able to edit her own performance, as she discovered when Kim suggested her rhythmic use of the pedal might be superfluous. Using the iPad interface to erase the pedaling entirely, they listened to the playback again, the notes gaining new clarity.

“See? We don’t need it,” Kim confirmed with a smile.

“When MIT’s new music building (W18) opens in spring 2025, we hope it will include this type of advanced technology. It would add value not just to Wang’s cohort of 19 piano students in the Emerson/Harris Program, which provides a total of 71 scholars and fellows with support for conservatory-level instruction in classical, jazz, and world music. But could also offer educational opportunities to a much wider swath of the MIT community,” says Makan. “Music is the fifth-most popular minor at MIT; 1,700 students enroll in music and theater arts classes each semester, and the Institute is brimming with vocalists, composers, instrumentalists, and music history students.”

According to Kim, the Spirio enables insights beyond what musicians could learn from a conventional recording; hearing playback directly from the instrument reveals sonic dimensions an MP3 can’t capture. “Speaker systems sort of crunch everything down — the highs and the lows, they all kind of sound the same. But piano solo music is very dynamic. It’s supposed to be experienced in a room,” she says.

During the Spirio | r residency, students found they could review their playing at half speed, adjust the volume of certain notes to emphasize a melody, transpose a piece to another key, or layer their performance — prerecording one hand, for example, then accompanying it live with the other.

“It helps the student be part of the learning and the teaching process,” Kim says. “If there’s a gap between what they imagined and what they hear and then they come to me and say, ‘How do I fix this?’ they’re definitely more engaged. It’s an honest representation of their playing, and the students who are humbled by it will become better pianists.”

For Wang, reflecting on her lesson with Kim, the session introduced an element she’d never experienced since beginning her piano studies at age 5. “The visual display of how long each key was played and with what velocity gave me a more precise demonstration of the ideas of voicing and evenness,” Wang says. “Playing the piano is usually dependent solely on the ears, but this combines with the auditory experience a visual experience and statistics, which helped me get a more holistic view of my playing.”

As a first-year undergraduate considering a Course 6 major (electrical engineering and computer science, or EECS), Wang was also fascinated to watch Patrick Elisha, a representative from Steinway dealer M. Steinert & Sons, disassemble the piano action to point out the optical sensors that measure the velocity of each hammer strike at 1,020 levels of sensitivity, sampled 800 times per second.

“I was amazed by the precision of the laser sensors and inductors,” says Wang. “I have just begun to take introductory-level courses in EECS and am just coming across these concepts, and this certainly made me more excited to learn more about these electrical devices and their applications. I was also intrigued that the electrical system was added onto the piano without interfering with the mechanical structure, so that when we play the Spirio, our experience with the touch and finger control was just like that of playing a usual Steinway.”

Another Emerson/Harris scholar, Víctor Quintas-Martínez, a PhD candidate in economics who resumed his lapsed piano studies during the Covid-19 pandemic, visited Killian Hall during the residency to rehearse a Fauré piano quartet with a cellist, violist, and violinist. “We did a run of certain passages and recorded the piano part. Then I listened to the strings play with the recording from the back of the hall. That gave me an idea of what I needed to adjust in terms of volume, texture, pedal, etc., to achieve a better balance. Normally, when you’re playing, because you’re sitting behind the strings and close to the piano, your perception of balance may be somewhat distorted,” he notes.

Kim cites another campus demographic ripe for exploring these types of instruments like the Spirio | r and its software: future participants in MIT’s relatively new Music Technology Master's Program, along with others across the Institute whose work intersects with the wealth of data the instrument captures. Among them is Praneeth Namburi, a research scientist at the MIT.nano Immersion Lab. Typically, Namburi focuses his neuroscience expertise on the biomechanics of dancing and expert movement. For two days during the MTA/Spirio residency, he used the sensors at the Immersion Lab, along with those of the Spirio, to analyze how pianists use their bodies.

“We used motion capture that can help us contrast the motion paths of experts such as Mi-Eun from those of students, potentially aiding in music education,” Namburi recounts, “force plates that can give scientific insights into how movement timing is organized, and ultrasound to visualize the forearm tissues during playing, which can potentially help us understand musicianship-related injuries.”

“The encounter between MTA and MIT.nano was something unique to MIT,” Kim believes. “Not only is this super useful for the music world, but it’s also very exciting for movement researchers, because playing piano is one of the most complex activities that humans do with our hands.”

In Kim’s view, that quintessentially human complexity is complemented by these kinds of technical possibilities. “Some people might think oh, it’s going to replace the pianist,” she says. “But in the end it is a tool. It doesn’t replace all of the things that go into learning music. I think it’s going to be an invaluable third partner: the student, the teacher, and the Spirio — or the musician, the researcher, and the Spirio. It’s going to play an integral role in a lot of musical endeavors.”

© Still from a video by Trillium Studios/Arts at MIT; videography by Seven Generations

Mi-Eun Kim (seated), pianist and lecturer at MIT Music and Theater Arts, and student Holden Mui interact with the Steinway Spirio.

Zildjian's new e-drum kit is a gamechanger in music technology

By: Thom Dunn
Zildjian

Electronic drum kits can do wonderful things, but even the best models out there still suffer from the same essential flaw: those god damn rubber cymbals.

But a few weeks ago, my friend Chris (drummer for the Roland High Life) and I visited the headquarters of the Avedis Zildjian Company just outside of Boston.

Q&A: A high-tech take on Wagner’s “Parsifal” opera

The world-famous Bayreuth Festival in Germany, annually centered around the works of composer Richard Wagner, launched this summer on July 25 with a production that has been making headlines. Director Jay Scheib, an MIT faculty member, has created a version of Wagner’s celebrated opera “Parsifal” that is set in an apocalyptic future (rather than the original Medieval past), and uses augmented reality headset technology for a portion of the audience, among other visual effects. People using the headsets see hundreds of additional visuals, from fast-moving clouds to arrows being shot at them. The AR portion of the production was developed through a team led by designer and MIT Technical Instructor Joshua Higgason.

The new “Parsifal” has engendered extensive media attention and discussion among opera followers and the viewing public. Five years in the making, it was developed with the encouragement of Bayreuth Festival general manager Katharina Wagner, Richard Wagner’s great-granddaughter. The production runs until Aug. 27, and can also be streamed on Stage+. Scheib, the Class of 1949 Professor in MIT’s Music and Theater Arts program, recently talked to MIT News about the project from Bayreuth.

Q: Your production of “Parsifal” led off this year’s entire Bayreuth festival. How’s it going?

A: From my point of view it’s going quite swimmingly. The leading German opera critics and the audiences have been super-supportive and Bayreuth makes it possible for a work to evolve … Given the complexity of the technical challenge of making an AR project function in an opera house, the bar was so high, it was a difficult challenge, and we’re really happy we found a way forward, a way to make it work, and a way to make it fit into an artistic process. I feel great.

Q: You offer a new interpretation of “Parsifal,” and a new setting for it. What is it, and why did you choose to interpret it this way?

A: One of the main themes in “Parsifal” is that the long-time king of this holy grail cult is wounded, and his wound will not heal. [With that in mind], we looked at what the world was like when the opera premiered in the late 19th century, around the time of what was known as the Great African Scramble, when Europe re-drew the map of Africa, largely based on resources, including mineral resources.

Cobalt remains [the focus of] dirty mining practices in the Democratic Republic of Congo, and is a requirement for a lot of our electronic objects, in particular batteries. There are also these massive copper deposits discovered under a Buddhist temple in Afghanistan, and lithium under a sacred site in Nevada. We face an intense challenge in climate change, and the predictions are not good. Some of our solutions like electric cars require these materials, so they’re only solutions for some people, while others suffer [where minerals are being mined]. We started thinking about how wounds never heal, and when the prospect of creating a better world opens new wounds in other communities. … That became a theme. It also comes out of the time when we were making it, when Covid happened and George Floyd was murdered, which created an opportunity in the U.S. to start speaking very openly about wounds that have not healed.

We set it in a largely post-human environment, where we didn’t succeed, and everything has collapsed. In the third act, there’s derelict mining equipment, and the holy water is this energy-giving force, but in fact it’s this lithium-ion pool, which gives us energy and then poisons us. That’s the theme we created.

Q: What were your goals about integrating the AR technology into the opera, and how did you achieve that?

A: First, I was working with my collaborator Joshua Higgason. No one had ever really done this before, so we just started researching whether it was possible. And most of the people we talked to said, “Don’t do it. It’s just not going to work.” Having always been a daredevil at heart, I was like, “Oh, come on, we can figure this out.”

We were diligent in exploring the possibilities. We made multiple trips to Bayreuth and made these millimeter-accurate laser scans of the auditorium and the stage. We built a variety of models to see how to make AR work in a large environment, where 2,000 headsets could respond simultaneously. We built a team of animators and developers and programmers and designers, from Portugal to Cambridge to New York to Hungary, the UK, and a group in Germany. Josh led this team, and they got after it, but it took us the better part of two years to make it possible for an audience, some of whom don’t really use smartphones, to put on an AR headset and have it just work.

I can’t even believe we did this. But it’s working.
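
Scheib doesn’t detail the rendering pipeline, but the role of those millimeter-accurate scans is easy to sketch in the abstract: if virtual content is authored once in the scanned auditorium’s coordinate frame, each headset only needs its own pose in that frame to show the same scene from its seat. The example below is a hypothetical illustration of that registration step, not the production’s software; all names and numbers are invented.

```python
# Generic illustration of why a precise venue scan matters for shared AR:
# content authored in the scanned-auditorium ("venue") frame can be rendered
# by any headset that knows its own pose in that frame. Names and numbers
# are hypothetical, not the Bayreuth production's actual pipeline.

import numpy as np

def pose_matrix(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 rigid transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def to_headset_frame(point_venue: np.ndarray, headset_pose_in_venue: np.ndarray) -> np.ndarray:
    """Express a point authored in venue (scan) coordinates in a headset's own frame."""
    p = np.append(point_venue, 1.0)                       # homogeneous coordinates
    return (np.linalg.inv(headset_pose_in_venue) @ p)[:3]

# Example: a virtual cloud 3 m above center stage, viewed from a seat 20 m back.
cloud_venue = np.array([0.0, 3.0, 0.0])
seat_pose = pose_matrix(np.eye(3), np.array([0.0, 1.2, 20.0]))  # hypothetical seat pose
print(to_headset_frame(cloud_venue, seat_pose))           # -> [  0.    1.8 -20. ]
```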

Q: In opera there’s hopefully a productive tension between tradition and innovation. How do you think about that when it comes to Wagner at Bayreuth?

A: Innovation is the tradition at Bayreuth. Musically and scenographically. “Parsifal” was composed for this particular opera house, and I’m incredibly respectful of what this event is made for. We are trying to create a balanced and unified experience, between the scenic design and the AR and the lighting and the costume design, and create perfect moments of convergence where you really lose yourself in the environment. I believe wholly in the production and the performers are extraordinary. Truly, truly, truly extraordinary.

Q: People have been focused on the issue of bringing AR to Bayreuth, but what has Bayreuth brought to you as a director?

A: Working in Bayreuth has been an incredible experience. The level of intellectual integrity among the technicians is extraordinary. The amount of care and patience and curiosity and expertise in Bayreuth is off the charts. This community of artists is the greatest. … People come here because it’s an incredible meeting of the minds, and for that I’m immensely filled with gratitude every day I come into the rehearsal room. The conductor, Pablo Heras-Casado, and I have been working on this for several years. And the music is still first. We’re setting up technology not to overtake the music, but to support it, and visually amplify it.

It must be said that Katharina Wagner has been one of the most powerfully supportive artistic directors I have ever worked with. I find it inspiring to witness her tenacity and vision in seeing all of this through, despite the hurdles. It’s been a great collaboration. That’s the essence: great collaboration.

This work was supported, in part, by an MIT.nano Immersion Lab Gaming Program seed grant, and was developed using capabilities in the Immersion Lab. The project was also funded, in part, by a grant from the MIT Center for Art, Science, and Technology.

© Image: Enrico Nawrath. Courtesy of the Bayreuth Festival

Director and MIT Professor Jay Scheib speaks about his widely heralded production of Wagner’s “Parsifal” opera at the Bayreuth Festival, which features an apocalyptic theme and augmented reality headsets for the audience.

Play it again, Spirio

Seated at the grand piano in MIT’s Killian Hall last fall, first-year student Jacqueline Wang played through the lively opening of Mozart’s “Sonata in B-flat major, K.333.” When she’d finished, Mi-Eun Kim, pianist and lecturer in MIT’s Music and Theater Arts Section (MTA), asked her to move to the rear of the hall. Kim tapped at an iPad. Suddenly, the sonata she'd just played poured forth again from the piano — its keys dipping and rising just as they had with Wang’s fingers on them, the resonance of its strings filling the room. Wang stood among a row of empty seats with a slightly bemused expression, taking in a repeat of her own performance.

“That was a little strange,” Wang admitted when the playback concluded, then added thoughtfully: “It sounds different from what I imagine I’m playing.”

This unusual lesson took place during a nearly three-week residency at MIT of the Steinway Spirio | r, a piano embedded with technology for live performance capture and playback. “The residency offered students, faculty, staff, and campus visitors the opportunity to engage with this new technology through a series of workshops that focused on such topics as the historical analysis of piano design, an examination of the hardware and software used by the Spirio | r, and step-by-step guidance of how to use the features,” explains Keeril Makan, head of MIT Music and Theater Arts and associate dean of the School of Humanities, Arts, and Social Sciences.

Wang was one of several residency participants to have the out-of-body experience of hearing herself play from a different vantage point, while watching the data of her performance scroll across a screen: color-coded rectangles indicating the velocity and duration of each note, an undulating line charting her use of the damper pedal. Wang was even able to edit her own performance, as she discovered when Kim suggested her rhythmic use of the pedal might be superfluous. Using the iPad interface to erase the pedaling entirely, they listened to the playback again, the notes gaining new clarity.

“See? We don’t need it,” Kim confirmed with a smile.

“When MIT’s new music building (W18) opens in spring 2025, we hope it will include this type of advanced technology. It would add value not just to Wang’s cohort of 19 piano students in the Emerson/Harris Program, which provides a total of 71 scholars and fellows with support for conservatory-level instruction in classical, jazz, and world music, but could also offer educational opportunities to a much wider swath of the MIT community,” says Makan. “Music is the fifth-most popular minor at MIT; 1,700 students enroll in music and theater arts classes each semester, and the Institute is brimming with vocalists, composers, instrumentalists, and music history students.”

According to Kim, the Spirio enables insights beyond what musicians could learn from a conventional recording; hearing playback directly from the instrument reveals sonic dimensions an MP3 can’t capture. “Speaker systems sort of crunch everything down — the highs and the lows, they all kind of sound the same. But piano solo music is very dynamic. It’s supposed to be experienced in a room,” she says.

During the Spirio | r residency, students found they could review their playing at half speed, adjust the volume of certain notes to emphasize a melody, transpose a piece to another key, or layer their performance — prerecording one hand, for example, then accompanying it live with the other.
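Under the hood, each of these edits amounts to a simple transformation of the captured note data. The following is a minimal sketch, assuming the performance can be exported as MIDI-like note events; the NoteEvent structure and helper functions are hypothetical illustrations, not the Spirio’s actual data format or editing software.

```python
# Hypothetical note-event representation of a captured performance.
from dataclasses import dataclass, replace

@dataclass
class NoteEvent:
    pitch: int         # MIDI note number (60 = middle C)
    velocity: int      # 1-127 here; the Spirio captures much finer gradations
    onset_s: float     # time the key goes down, in seconds
    duration_s: float  # how long the key is held, in seconds

def transpose(notes, semitones):
    """Shift every note up or down by a fixed interval (another key)."""
    return [replace(n, pitch=n.pitch + semitones) for n in notes]

def half_speed(notes):
    """Stretch onsets and durations so playback takes twice as long."""
    return [replace(n, onset_s=n.onset_s * 2, duration_s=n.duration_s * 2)
            for n in notes]

def emphasize(notes, melody_pitches, gain=1.2):
    """Raise the velocity of selected notes to bring out a melody line."""
    return [
        replace(n, velocity=min(127, round(n.velocity * gain)))
        if n.pitch in melody_pitches
        else n
        for n in notes
    ]

def layer(prerecorded_hand, live_hand):
    """Merge a prerecorded part with a second take, ordered by onset time."""
    return sorted(prerecorded_hand + live_hand, key=lambda n: n.onset_s)
```

The residency examples — transposing a piece, bringing out a melody, prerecording one hand and playing the other live — map onto operations of roughly this kind, whatever form the instrument’s own software actually uses.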

“It helps the student be part of the learning and the teaching process,” Kim says. “If there’s a gap between what they imagined and what they hear, and they come to me and say, ‘How do I fix this?’ they’re definitely more engaged. It’s an honest representation of their playing, and the students who are humbled by it will become better pianists.”

For Wang, the session with Kim introduced an element she’d never encountered since beginning her piano studies at age 5. “The visual display of how long each key was played and with what velocity gave me a more precise demonstration of the ideas of voicing and evenness,” Wang says. “Playing the piano is usually dependent solely on the ears, but this combines the auditory experience with a visual experience and statistics, which helped me get a more holistic view of my playing.”

As a first-year undergraduate considering a Course 6 major (electrical engineering and computer science, or EECS), Wang was also fascinated to watch Patrick Elisha, a representative from Steinway dealer M. Steinert & Sons, disassemble the piano action to point out the optical sensors that measure the velocity of each hammer strike at 1,020 levels of sensitivity, sampled 800 times per second.
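As a rough illustration of what those figures imply, the sketch below estimates a hammer’s peak speed from position samples taken 800 times per second and maps it onto one of 1,020 discrete levels. The constants and the processing itself are assumptions made for illustration only; Steinway’s actual sensor processing is proprietary and not described here.

```python
SAMPLE_RATE_HZ = 800        # samples per second, as described in the article
VELOCITY_LEVELS = 1020      # discrete dynamic levels, as described in the article
MAX_SPEED_M_PER_S = 5.0     # assumed top hammer speed used for scaling (illustrative)

def hammer_peak_speed(positions_m):
    """Estimate peak hammer speed (m/s) from a list of position samples (meters)."""
    dt = 1.0 / SAMPLE_RATE_HZ
    speeds = [(b - a) / dt for a, b in zip(positions_m, positions_m[1:])]
    return max(speeds, default=0.0)

def quantize_speed(speed_m_per_s):
    """Map an estimated speed onto one of the discrete velocity levels (0-1019)."""
    frac = min(max(speed_m_per_s / MAX_SPEED_M_PER_S, 0.0), 1.0)
    return round(frac * (VELOCITY_LEVELS - 1))
```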

“I was amazed by the precision of the laser sensors and inductors,” says Wang. “I have just begun to take introductory-level courses in EECS and am just coming across these concepts, and this certainly made me more excited to learn more about these electrical devices and their applications. I was also intrigued that the electrical system was added onto the piano without interfering with the mechanical structure, so that when we play the Spirio, our experience with the touch and finger control was just like that of playing a usual Steinway.”

Another Emerson/Harris scholar, Víctor Quintas-Martínez, a PhD candidate in economics who resumed his lapsed piano studies during the Covid-19 pandemic, visited Killian Hall during the residency to rehearse a Fauré piano quartet with a cellist, violist, and violinist. “We did a run of certain passages and recorded the piano part. Then I listened to the strings play with the recording from the back of the hall. That gave me an idea of what I needed to adjust in terms of volume, texture, pedal, etc., to achieve a better balance. Normally, when you’re playing, because you’re sitting behind the strings and close to the piano, your perception of balance may be somewhat distorted,” he notes.

Kim cites another campus demographic ripe for exploring instruments like the Spirio | r and its software: future participants in MIT’s relatively new Music Technology Master’s Program, along with others across the Institute whose work intersects with the wealth of data the instrument captures. Among them is Praneeth Namburi, a research scientist at the MIT.nano Immersion Lab. Typically, Namburi focuses his neuroscience expertise on the biomechanics of dancing and expert movement. For two days during the MTA/Spirio residency, he used the sensors at the Immersion Lab, along with those of the Spirio, to analyze how pianists use their bodies.

“We used motion capture that can help us contrast the motion paths of experts such as Mi-Eun with those of students, potentially aiding in music education,” Namburi recounts, “force plates that can give scientific insights into how movement timing is organized, and ultrasound to visualize the forearm tissues during playing, which can potentially help us understand musicianship-related injuries.”
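For readers curious what contrasting motion paths might look like computationally, here is a toy sketch that compares two motion-capture trajectories by resampling them to a common length and averaging the point-to-point distance. The approach and function names are illustrative assumptions, not the Immersion Lab’s actual analysis pipeline.

```python
import numpy as np

def resample(path, n=200):
    """Resample a (frames, 3) trajectory to n evenly spaced points along its timeline."""
    path = np.asarray(path, dtype=float)
    old = np.linspace(0.0, 1.0, len(path))
    new = np.linspace(0.0, 1.0, n)
    return np.column_stack([np.interp(new, old, path[:, k]) for k in range(path.shape[1])])

def mean_path_distance(expert_path, student_path):
    """Average point-to-point distance (meters) between two resampled trajectories."""
    a, b = resample(expert_path), resample(student_path)
    return float(np.linalg.norm(a - b, axis=1).mean())
```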

“The encounter between MTA and MIT.nano was something unique to MIT,” Kim believes. “Not only is this super useful for the music world, but it’s also very exciting for movement researchers, because playing piano is one of the most complex activities that humans do with our hands.”

In Kim’s view, that quintessentially human complexity is complemented by these kinds of technical possibilities. “Some people might think, oh, it’s going to replace the pianist,” she says. “But in the end it is a tool. It doesn’t replace all of the things that go into learning music. I think it’s going to be an invaluable third partner: the student, the teacher, and the Spirio — or the musician, the researcher, and the Spirio. It’s going to play an integral role in a lot of musical endeavors.”

© Still from a video by Trillium Studios/Arts at MIT; videography by Seven Generations

Mi-Eun Kim (seated), pianist and lecturer at MIT Music and Theater Arts, and student Holden Mui interact with the Steinway Spirio.

Q&A: A high-tech take on Wagner’s “Parsifal” opera

The world-famous Bayreuth Festival in Germany, centered annually on the works of composer Richard Wagner, launched this summer on July 25 with a production that has been making headlines. Director Jay Scheib, an MIT faculty member, has created a version of Wagner’s celebrated opera “Parsifal” that is set in an apocalyptic future (rather than the original medieval past) and uses augmented reality headset technology for a portion of the audience, among other visual effects. People using the headsets see hundreds of additional visuals, from fast-moving clouds to arrows being shot at them. The AR portion of the production was developed by a team led by designer and MIT Technical Instructor Joshua Higgason.

The new “Parsifal” has engendered extensive media attention and discussion among opera followers and the viewing public. Five years in the making, it was developed with the encouragement of Bayreuth Festival general manager Katharina Wagner, Richard Wagner’s great-granddaughter. The production runs until Aug. 27, and can also be streamed on Stage+. Scheib, the Class of 1949 Professor in MIT’s Music and Theater Arts program, recently talked to MIT News about the project from Bayreuth.

Q: Your production of “Parsifal” led off this year’s entire Bayreuth festival. How’s it going?

A: From my point of view it’s going quite swimmingly. The leading German opera critics and the audiences have been super-supportive and Bayreuth makes it possible for a work to evolve … Given the complexity of the technical challenge of making an AR project function in an opera house, the bar was so high, it was a difficult challenge, and we’re really happy we found a way forward, a way to make it work, and a way to make it fit into an artistic process. I feel great.

Q: You offer a new interpretation of “Parsifal,” and a new setting for it. What is it, and why did you choose to interpret it this way?

A: One of the main themes in “Parsifal” is that the long-time king of this holy grail cult is wounded, and his wound will not heal. [With that in mind], we looked at what the world was like when the opera premiered in the late 19th century, around the time of what was known as the Great African Scramble, when Europe re-drew the map of Africa, largely based on resources, including mineral resources.

Cobalt remains [the focus of] dirty mining practices in the Democratic Republic of Congo, and is a requirement for a lot of our electronic objects, in particular batteries. There are also these massive copper deposits discovered under a Buddhist temple in Afghanistan, and lithium under a sacred site in Nevada. We face an intense challenge in climate change, and the predictions are not good. Some of our solutions like electric cars require these materials, so they’re only solutions for some people, while others suffer [where minerals are being mined]. We started thinking about how wounds never heal, and when the prospect of creating a better world opens new wounds in other communities. … That became a theme. It also comes out of the time when we were making it, when Covid happened and George Floyd was murdered, which created an opportunity in the U.S. to start speaking very openly about wounds that have not healed.

We set it in a largely post-human environment, where we didn’t succeed, and everything has collapsed. In the third act, there’s derelict mining equipment, and the holy water is this energy-giving force, but in fact it’s this lithium-ion pool, which gives us energy and then poisons us. That’s the theme we created.

Q: What were your goals about integrating the AR technology into the opera, and how did you achieve that?

A: First, I was working with my collaborator Joshua Higgason. No one had ever really done this before, so we just started researching whether it was possible. And most of the people we talked to said, “Don’t do it. It’s just not going to work.” Having always been a daredevil at heart, I was like, “Oh, come on, we can figure this out.”

We were diligent in exploring the possibilities. We made multiple trips to Bayreuth and made these millimeter-accurate laser scans of the auditorium and the stage. We built a variety of models to see how to make AR work in a large environment, where 2,000 headsets could respond simultaneously. We built a team of animators and developers and programmers and designers, from Portugal to Cambridge to New York to Hungary, the UK, and a group in Germany. Josh led this team, and they got after it, but it took us the better part of two years to make it possible for an audience, some of whom don’t really use smartphones, to put on an AR headset and have it just work.

I can’t even believe we did this. But it’s working.

Q: In opera there’s hopefully a productive tension between tradition and innovation. How do you think about that when it comes to Wagner at Bayreuth?

A: Innovation is the tradition at Bayreuth, musically and scenographically. “Parsifal” was composed for this particular opera house, and I’m incredibly respectful of what this event is made for. We are trying to create a balanced and unified experience, between the scenic design and the AR and the lighting and the costume design, and create perfect moments of convergence where you really lose yourself in the environment. I believe wholly in the production, and the performers are extraordinary. Truly, truly, truly extraordinary.

Q: People have been focused on the issue of bringing AR to Bayreuth, but what has Bayreuth brought to you as a director?

A: Working in Bayreuth has been an incredible experience. The level of intellectual integrity among the technicians is extraordinary. The amount of care and patience and curiosity and expertise in Bayreuth is off the charts. This community of artists is the greatest. … People come here because it’s an incredible meeting of the minds, and for that I’m immensely filled with gratitude every day I come into the rehearsal room. The conductor, Pablo Heras-Casado, and I have been working on this for several years. And the music is still first. We’re setting up technology not to overtake the music, but to support it, and visually amplify it.

It must be said that Katharina Wagner has been one of the most powerfully supportive artistic directors I have ever worked with. I find it inspiring to witness her tenacity and vision in seeing all of this through, despite the hurdles. It’s been a great collaboration. That’s the essence: great collaboration.

This work was supported, in part, by an MIT.nano Immersion Lab Gaming Program seed grant, and was developed using capabilities in the Immersion Lab. The project was also funded, in part, by a grant from the MIT Center for Art, Science, and Technology.

© Image: Enrico Nawrath. Courtesy of the Bayreuth Festival

Director and MIT Professor Jay Scheib speaks about his widely heralded production of Wagner’s “Parsifal” opera at the Bayreuth Festival, which features an apocalyptic theme and augmented reality headsets for the audience.