MIT spinout Arnasi begins applying LiquiGlide no-stick technology to help patients

The no-stick technology invented by Professor Kripa Varanasi and David Smith SM ’11, initially commercialized as LiquiGlide in 2012, went viral for its uncanny ability to make materials that stick to their containers — think ketchup, cosmetics, and toothpaste — slide out with ease.

Now, the company that brought you Colgate no-stick toothpaste is moving into the medical space, and the applications could improve millions of lives. The company, which recently rebranded as the Arnasi Group, has developed an ambitious plan to launch three new biomedical products over the next four years.

The first of those products, called Revel, is a deodorizing lubricant designed for ostomy pouches, which are used by individuals to collect bodily waste after digestive system surgeries. Up to 1 million people rely on such pouches in the United States. Ostomy pouches must be emptied multiple times per day, and issues resulting from sticking or clogging can cause embarrassing, time-consuming situations for the people relying on them.

Arnasi’s deodorizing lubricant can prevent clogging and simplify the ostomy pouch cleaning process. Unlike other available options, a single application of its lubricant works for an entire day, the Arnasi team says, and the company designed a single unit dose that fits in a pocket for added convenience.

An ostomy pouch “significantly impacts a person’s lifestyle,” Varanasi says. “They need to keep it clean, and they need to use it at all times. We are solving a very important problem while helping people by giving their dignity and lifestyles back.”

Revel, Arnasi’s FDA-registered product, officially launched this month, and it has already received promising feedback from nurses and patients.

Margaret is a nurse who relies on an ostomy pouch herself and cares for patients who need them after receiving colostomies and ileostomies. She received samples of Revel at a recent conference and says it could dramatically improve both her and her patients’ lives.

“These pouches need to be emptied frequently, and sometimes that’s very difficult to do,” she says. “This particular product makes everything slide out without any problems at all, and it’s a wonderful improvement. It also lasts long enough to empty the pouch three to four times, which is great because you don’t have to carry a bunch of this stuff around.”

Margaret’s experience echoes feedback Arnasi’s team has heard from many others.

“When we showed it to the nurses, they were blown away with the product,” says Arnasi CEO Dan Salain. “They asked us to get this product out to the market as fast as we could, and so that’s what we’re doing.”

Arnasi’s next medical products will be used to prevent biofilm and bacterial infections caused by implants and catheters, and will also help people with cystic fibrosis.

“We want to create products that really help people,” Salain says. “Anything that’s implantable in the body, whether it’s a catheter, a hip, knee, or joint replacement, a breast implant, a bladder sling — those things lend themselves to our technology.”

From packages to patients

Varanasi initially developed Arnasi’s liquid-impregnated surface technology with Smith, Arnasi’s co-founder and current CTO, when Smith was a graduate student in Varanasi’s lab. The research was initially funded by the MIT Energy Initiative and the MIT Deshpande Center to work on solid-liquid interfaces with broad applications for energy, water, and more.

“There’s this fundamental friction constraint called the no-slip boundary condition between a liquid and a solid, so by creating a new surface in which we can infuse a liquid that is less viscous, we can now get the product to easily slide on surfaces,” Varanasi explains. “That aha moment meant we could get around a fundamental constraint in fluid dynamics.”
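
As a textbook illustration of the constraint Varanasi describes (a standard two-layer lubrication estimate, not Arnasi’s proprietary model): the no-slip condition pins the product’s velocity to zero at a bare wall, while a thin film of a much less viscous infused liquid lets the product move with an apparent slip.

    No-slip wall:            u|_{y=0} = 0
    Lubricant-infused wall:  u|_{y=0} ≈ b · (∂u/∂y)|_{y=0},   with   b ≈ h · (μ_product / μ_lubricant)

Here h is the lubricant film thickness and μ denotes viscosity; when the infused liquid is far less viscous than ketchup or toothpaste, the effective slip length b is large and the contents glide rather than stick.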

Still, sticky surfaces are everywhere, and the scientific co-founders had to decide where to apply their technology first. Shortly after the invention, Varanasi was at home trying to decide on the best application when he saw his wife across the kitchen table trying to get honey out of a bottle. It was another aha moment.

Soon after, Varanasi’s team entered the MIT $100K Entrepreneurship Competition. The competition — and the corresponding videos of ketchup and other materials sliding out of their bottles with ease — created a media storm and a frenzy of attention.

“The press exploded,” Varanasi says. “For three months, my phone didn’t stop ringing. My group website crashed. There was a lot of market pull and in response, we founded the company.”

Arnasi, still operating as LiquiGlide, licensed the intellectual property from MIT’s Technology Licensing Office and eventually signed large deals with some of the world’s biggest consumer packaged goods companies, which used it to create products like fully recyclable toothpaste.

“There is so much waste just because we can't get all of the product, be it food, cosmetics, or medical products, out of containers,” Varanasi says. “Fifty billion-plus packages are sold every year, and 5 to 10 percent of product is left behind on average. So, you can imagine the CO2 footprint of the wasted product. And even though a lot of this is in recyclable packaging, they can’t be recycled because you need to wash out all the product. The water footprint of this is huge, not to mention the wasted product.”
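
Taken at face value, the figures in that quote give a rough sense of scale (an illustrative back-of-the-envelope calculation, not Arnasi’s own accounting):

    50 billion packages/year × 5 to 10 percent of product left behind
      ≈ 2.5 to 5 billion package-equivalents of product wasted per year

before counting the wash water needed to make the emptied containers recyclable.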

While all of that was going on, Arnasi’s team was also looking into the biomedical space. For instance, Varanasi’s lab previously showed the technology could be used to prevent occlusion from blood clots and thrombosis and reduce biofilm formation, among other applications.

After studying the industry and speaking with patients and nurses, Arnasi realized a better lubricant for ostomy pouches could improve millions of people’s lives.

“Stool accumulates in these pouches outside of people’s bodies, and they need to empty it up to eight times a day,” explains Brienne Engel, Arnasi’s director of business development. “That process has a lot of challenges associated with it: It can be difficult to drain, leaving a lot of mass behind, it takes a long time to drain, so you can spend a long time in a restroom trying to clear out your pouch, and then there’s something called pancaking that can push the pouch off the [surgical opening], introducing issues like leakage, odor, and failure of the ostomy pouching system.”

Ostomy and beyond

Arnasi’s ostomy lubricant, Revel, is the first non-water-based solution on the market, and as-yet unpublished third-party testing has shown it allows for faster, more complete pouch drainage, along with other benefits.

“A lot of the existing brands treat their consumers like patients, but what we’ve found is they want to be treated like people and have a consumer experience,” Salain says. “The magic we saw with our toothpaste product was people got this amazing consumer experience out of it, and we wanted to create the same thing with Revel.”

Now Arnasi is planning to use its technology in medical products for skin infections and cystic fibrosis, and in implantable catheters and joint replacements. Arnasi’s team believes those last two use cases could prevent millions of deadly infections.

“When people are getting hemodialysis catheters, they have a 33 percent risk of developing infections, and those that do get those infections have a 25 percent chance of dying from them,” Engel says. “Taking our underlying technology and applying it to catheters, for example, imparts anti-biofilm properties and also prevent things like thrombosis, or blood clotting on the outside of these catheters, which is a problem in and of itself but also provides a space for bacteria to seed.”

Ultimately, Varanasi’s team is balancing progress on its biomedical applications against exploration of other avenues for its technology — including energy, manufacturing, and agriculture — to maximize its impact on the world.

“We think of this as a company with many companies within it because of all the different areas that it can impact. Liquid-solid interfaces are ubiquitous, viscous products are everywhere, and deploying this technology to solve difficult problems has been a dream,” Varanasi says. “It’s a great example of how MIT technology can be used for the benefit of humankind.”

© Image: Courtesy of LiquiGlide

“[B]y creating a new surface in which we can infuse a liquid that is less viscous, we can now get the product to easily slide on surfaces,” Varanasi explains.

Brain surgery training from an avatar

Becky Ham | MIT.nano

Benjamin Warf, a renowned neurosurgeon at Boston Children’s Hospital, stands in the MIT.nano Immersion Lab. More than 3,000 miles away, his virtual avatar stands next to Matheus Vasconcelos in Brazil as the resident practices delicate surgery on a doll-like model of a baby’s brain.

With a pair of virtual-reality goggles, Vasconcelos can watch Warf’s avatar demonstrate a brain surgery procedure, ask questions of Warf’s digital twin, and then replicate the technique himself.

“It’s an almost out-of-body experience,” Warf says of watching his avatar interact with the residents. “Maybe it’s how it feels to have an identical twin?”

And that’s the goal: Warf’s digital twin bridged the distance, allowing him to be functionally in two places at once. “It was my first training using this model, and it had excellent performance,” says Vasconcelos, a neurosurgery resident at Santa Casa de São Paulo School of Medical Sciences in São Paulo, Brazil. “As a resident, I now feel more confident and comfortable applying the technique in a real patient under the guidance of a professor.”

Warf’s avatar arrived via a new project launched by medical simulator and augmented reality (AR) company EDUCSIM. The company is part of the 2023 cohort of START.nano, MIT.nano’s deep-tech accelerator that offers early-stage startups discounted access to MIT.nano’s laboratories.

In March 2023, Giselle Coelho, EDUCSIM’s scientific director and a pediatric neurosurgeon at Santa Casa de São Paulo and Sabará Children’s Hospital, began working with technical staff in the MIT.nano Immersion Lab to create Warf’s avatar. By November, the avatar was training future surgeons like Vasconcelos.

“I had this idea to create the avatar of Dr. Warf as a proof of concept, and asked, ‘What would be the place in the world where they are working on technologies like that?’” Coelho says. “Then I found MIT.nano.”

Capturing a surgeon

As a neurosurgery resident, Coelho was so frustrated by the lack of practical training options for complex surgeries that she built her own model of a baby brain. The physical model contains all the structures of the brain and can even bleed, “simulating all the steps of a surgery, from incision to skin closure,” she says.

She soon found that simulators and virtual reality (VR) demonstrations reduced the learning curve for her own residents. Coelho launched EDUCSIM in 2017 to expand the variety and reach of the training for residents and experts looking to learn new techniques.

Those techniques include a procedure to treat infant hydrocephalus that was pioneered by Warf, the director of neonatal and congenital neurosurgery at Boston Children’s Hospital. Coelho had learned the technique directly from Warf and thought his avatar might be the way for surgeons who couldn’t travel to Boston to benefit from his expertise.

To create the avatar, Coelho worked with Talis Reks, the AR/VR/gaming/big data IT technologist in the Immersion Lab.

“A lot of technology and hardware can be very expensive for startups to access as they start their company journey,” Reks explains. “START.nano is one way of enabling them to utilize and afford the tools and technologies we have at MIT.nano’s Immersion Lab.”

Coelho and her colleagues needed high-fidelity and high-resolution motion-capture technology, volumetric video capture, and a range of other VR/AR technologies to capture Warf’s dexterous finger motions and facial expressions. Warf visited MIT.nano on several occasions to be digitally “captured,” including performing an operation on the physical baby model while wearing special gloves and clothing embedded with sensors.

“These technologies have mostly been used for entertainment or VFX [visual effects] or CGI [computer-generated imagery],” says Reks, “but this is a unique project, because we’re applying it now for real medical practice and real learning.”

One of the biggest challenges, Reks says, was helping to develop what Coelho calls “holoportation” — transmitting the 3D, volumetric video capture of Warf in real time over the internet so that his avatar can appear in transcontinental medical training.

The Warf avatar has synchronous and asynchronous modes. The training that Vasconcelos received was in the asynchronous mode, where residents can observe the avatar’s demonstrations and ask it questions. The answers, delivered in a variety of languages, come from AI algorithms that draw from previous research and an extensive bank of questions and answers provided by Warf.
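
The article doesn’t describe how EDUCSIM’s answer engine is built; as a toy sketch of the general retrieval idea behind an asynchronous question-and-answer bank (all data and names below are hypothetical), a matcher can simply return the stored answer whose recorded question best overlaps the resident’s query:

    # Toy retrieval over a pre-recorded Q&A bank; illustrative only, not EDUCSIM's system.
    # Each entry pairs a question Warf has answered with his recorded answer; the matcher
    # returns the answer whose question shares the most words with the incoming query.

    def tokenize(text: str) -> set:
        return {word.strip(".,?!").lower() for word in text.split()}

    def best_answer(query: str, qa_bank: list) -> str:
        query_tokens = tokenize(query)

        def similarity(entry):
            question_tokens = tokenize(entry[0])
            union = query_tokens | question_tokens
            return len(query_tokens & question_tokens) / max(len(union), 1)

        question, answer = max(qa_bank, key=similarity)
        return answer

    qa_bank = [
        ("How is the endoscope positioned?", "Recorded answer about positioning ..."),
        ("When do you stop the procedure?", "Recorded answer about stopping criteria ..."),
    ]
    print(best_answer("How should the endoscope be positioned?", qa_bank))
    # -> "Recorded answer about positioning ..."

A real system would layer speech recognition, multilingual delivery, and the research corpus the article mentions on top of a lookup like this; the sketch shows only the matching step.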

In the synchronous mode, Warf operates his avatar from a distance in real time, Coelho says. “He could walk around the room, he could talk to me, he could orient me. It’s amazing.”

Coelho, Warf, Reks, and other team members demonstrated a combination of the modes in a second session in late December. This demo consisted of volumetric live video capture between the Immersion Lab and Brazil, spatialized and visible in real time through AR headsets. It significantly expanded upon the previous demo, which had only streamed volumetric data in one direction through a two-dimensional display.

Powerful impacts

Warf has a long history of training desperately needed pediatric neurosurgeons around the world, most recently through his nonprofit Neurokids. Remote and simulated training has been an increasingly large part of training since the pandemic, he says, although he doesn’t feel it will ever completely replace personal hands-on instruction and collaboration.

“But if in fact one day we could have avatars, like this one from Giselle, in remote places showing people how to do things and answering questions for them, without the cost of travel, without the time cost and so forth, I think it could be really powerful,” Warf says.

The avatar project is especially important for surgeons serving remote and underserved areas like the Amazon region of Brazil, Coelho says. “This is a way to give them the same level of education that they would get in other places, and the same opportunity to be in touch with Dr. Warf.”

One baby treated for hydrocephalus at a recent Amazon clinic had traveled by boat 30 hours for the surgery, according to Coelho.

Training surgeons with the avatar, she says, “can change reality for this baby and can change the future.”

© Photo courtesy of the MIT.nano Immersion Lab.

Benjamin Warf, the director of neonatal and congenital neurosurgery at Boston Children’s Hospital, uses a virtual reality environment to demonstrate a procedure that he pioneered to treat infant hydrocephalus. As Warf operates his avatar from a distance in real time, medical residents in Brazil watch, interact, and learn in a 3D environment.

CEO of failing hospital chain got $250M amid patient deaths, layoffs, bankruptcy

By Beth Mole
August 20, 2024, 11:20 pm

Hospital staff and community members held a protest in front of Carney Hospital in Boston on August 5 after Steward announced it would close the hospital. "Ralph" refers to Steward's CEO, Ralph de la Torre, who owns a yacht. (credit: Getty | Suzanne Kreiter)

As the more than 30 hospitals in the Steward Health Care System scrounged for cash to cover supplies, shuttered pediatric and neonatal units, closed maternity wards, laid off hundreds of health care workers, and put patients in danger, the system paid out at least $250 million to its CEO and his companies, according to a report by The Wall Street Journal.

The newly revealed financial details bring yet more scrutiny to Steward CEO Ralph de la Torre, a Harvard University-trained cardiac surgeon who, in 2020, took over majority ownership of Steward from the private equity firm Cerberus. De la Torre and his companies were reportedly paid at least $250 million since that takeover. In May, Steward, which has hospitals in eight states, filed for Chapter 11 bankruptcy.

Critics—including members of the Senate Committee on Health, Education, Labor, and Pensions (HELP)—allege that de la Torre stripped the system's hospitals of assets, siphoned payments from them, and loaded them with debt, all while reaping huge payouts that made him obscenely wealthy.

Brickbat: What a Nice Idea!

By Charles Oliver
August 2, 2024, 10:00 am

(Image: A female dentist sits by the patient's chair with a laptop open. Credit: Valerii Honcharuk | Dreamstime.com)

In the United Kingdom, a National Health Service survey found that 26 percent of people in Cheshire and Merseyside who tried to book a dental appointment in the last two years were unsuccessful, and that 20 percent of residents who did get to see a dentist rated the experience fairly or very poor, figures that are typical of England as a whole. The British Dental Association said that millions of people no longer even try to book appointments because they know they can't get one, adding that getting to see an NHS dentist "is just a nice idea rather than a reality they can depend on."

The post Brickbat: What a Nice Idea! appeared first on Reason.com.

  • ✇MIT News - Nanoscience and nanotechnology | MIT.nano
  • Brain surgery training from an avatarBecky Ham | MIT.nano
    Benjamin Warf, a renowned neurosurgeon at Boston Children’s Hospital, stands in the MIT.nano Immersion Lab. More than 3,000 miles away, his virtual avatar stands next to Matheus Vasconcelos in Brazil as the resident practices delicate surgery on a doll-like model of a baby’s brain. With a pair of virtual-reality goggles, Vasconcelos is able to watch Warf’s avatar demonstrate a brain surgery procedure before replicating the technique himself and while asking questions of Warf’s digital twin. “I
     

Brain surgery training from an avatar

Benjamin Warf, a renowned neurosurgeon at Boston Children’s Hospital, stands in the MIT.nano Immersion Lab. More than 3,000 miles away, his virtual avatar stands next to Matheus Vasconcelos in Brazil as the resident practices delicate surgery on a doll-like model of a baby’s brain.

With a pair of virtual-reality goggles, Vasconcelos is able to watch Warf’s avatar demonstrate a brain surgery procedure before replicating the technique himself and while asking questions of Warf’s digital twin.

“It’s an almost out-of-body experience,” Warf says of watching his avatar interact with the residents. “Maybe it’s how it feels to have an identical twin?”

And that’s the goal: Warf’s digital twin bridged the distance, allowing him to be functionally in two places at once. “It was my first training using this model, and it had excellent performance,” says Vasconcelos, a neurosurgery resident at Santa Casa de São Paulo School of Medical Sciences in São Paulo, Brazil. “As a resident, I now feel more confident and comfortable applying the technique in a real patient under the guidance of a professor.”

Warf’s avatar arrived via a new project launched by medical simulator and augmented reality (AR) company EDUCSIM. The company is part of the 2023 cohort of START.nano, MIT.nano’s deep-tech accelerator that offers early-stage startups discounted access to MIT.nano’s laboratories.

In March 2023, Giselle Coelho, EDUCSIM’s scientific director and a pediatric neurosurgeon at Santa Casa de São Paulo and Sabará Children’s Hospital, began working with technical staff in the MIT.nano Immersion Lab to create Warf’s avatar. By November, the avatar was training future surgeons like Vasconcelos.

“I had this idea to create the avatar of Dr. Warf as a proof of concept, and asked, ‘What would be the place in the world where they are working on technologies like that?’” Coelho says. “Then I found MIT.nano.”

Capturing a Surgeon

As a neurosurgery resident, Coelho was so frustrated by the lack of practical training options for complex surgeries that she built her own model of a baby brain. The physical model contains all the structures of the brain and can even bleed, “simulating all the steps of a surgery, from incision to skin closure,” she says.

She soon found that simulators and virtual reality (VR) demonstrations reduced the learning curve for her own residents. Coelho launched EDUCSIM in 2017 to expand the variety and reach of the training for residents and experts looking to learn new techniques.

Those techniques include a procedure to treat infant hydrocephalus that was pioneered by Warf, the director of neonatal and congenital neurosurgery at Boston Children’s Hospital. Coelho had learned the technique directly from Warf and thought his avatar might be the way for surgeons who couldn’t travel to Boston to benefit from his expertise.

To create the avatar, Coelho worked with Talis Reks, the AR/VR/gaming/big data IT technologist in the Immersion Lab.

“A lot of technology and hardware can be very expensive for startups to access as they start their company journey,” Reks explains. “START.nano is one way of enabling them to utilize and afford the tools and technologies we have at MIT.nano’s Immersion Lab.”

Coelho and her colleagues needed high-fidelity and high-resolution motion-capture technology, volumetric video capture, and a range of other VR/AR technologies to capture Warf’s dexterous finger motions and facial expressions. Warf visited MIT.nano on several occasions to be digitally “captured,” including performing an operation on the physical baby model while wearing special gloves and clothing embedded with sensors.

“These technologies have mostly been used for entertainment or VFX [visual effects] or CGI [computer-generated imagery],” says Reks, “But this is a unique project, because we’re applying it now for real medical practice and real learning.”

One of the biggest challenges, Reks says, was helping to develop what Coelho calls “holoportation”— transmitting the 3D, volumetric video capture of Warf in real-time over the internet so that his avatar can appear in transcontinental medical training.

The Warf avatar has synchronous and asynchronous modes. The training that Vasconcelos received was in the asynchronous mode, where residents can observe the avatar’s demonstrations and ask it questions. The answers, delivered in a variety of languages, come from AI algorithms that draw from previous research and an extensive bank of questions and answers provided by Warf.

In the synchronous mode, Warf operates his avatar from a distance in real time, Coelho says. “He could walk around the room, he could talk to me, he could orient me. It’s amazing.”

Coelho, Warf, Reks, and other team members demonstrated a combination of the modes in a second session in late December. This demo consisted of volumetric live video capture between the Immersion Lab and Brazil, spatialized and visible in real time through AR headsets. It significantly expanded upon the previous demo, which had only streamed volumetric data in one direction through a two-dimensional display.

Powerful impacts

Warf has a long history of training desperately needed pediatric neurosurgeons around the world, most recently through his nonprofit Neurokids. Remote and simulated instruction has become an increasingly large part of that training since the pandemic, he says, although he doesn’t believe it will ever completely replace in-person, hands-on instruction and collaboration.

“But if in fact one day we could have avatars, like this one from Giselle, in remote places showing people how to do things and answering questions for them, without the cost of travel, without the time cost and so forth, I think it could be really powerful,” Warf says.

The avatar project is especially important for surgeons serving remote and underserved areas like the Amazon region of Brazil, Coelho says. “This is a way to give them the same level of education that they would get in other places, and the same opportunity to be in touch with Dr. Warf.”

One baby treated for hydrocephalus at a recent Amazon clinic had traveled by boat 30 hours for the surgery, according to Coelho.

Training surgeons with the avatar, she says, “can change reality for this baby and can change the future.”

© Photo courtesy of the MIT.nano Immersion Lab.

Benjamin Warf, the director of neonatal and congenital neurosurgery at Boston Children’s Hospital, uses a virtual reality environment to demonstrate a procedure that he pioneered to treat infant hydrocephalus. As Warf operates his avatar from a distance in real time, medical residents in Brazil watch, interact, and learn in a 3D environment.

Christian Britschgi | Reason

The COVID-19 Vaccines Shouldn't Have Been Free

May 30, 2024, 4:30 p.m.
Photo: Wachiwit | Dreamstime.com

In a recent essay in the journal Monash Bioethics Review, oncologist Vinay Prasad and health researcher Alyson Haslam provide a comprehensive after-the-fact assessment of the federal government's rollout of the COVID-19 vaccines.

Their basic takeaway is that the vaccines were a "scientific success" tarnished by flawed federal vaccine policy.

The two argue the tremendous benefits of the COVID-19 vaccines for the elderly were undercut by government guidance and messaging that pushed vaccines on the young, healthy, and previously infected when data suggested that wasn't worthwhile (and was in some cases counterproductive).

Worse still, the government even pushed vaccine mandates when it was increasingly clear the vaccines did not stop COVID-19 transmission, they argue.

To correct these errors for future pandemic responses, Prasad and Haslam recommend performing larger vaccine trials and collecting better data on vaccine performance in lower-risk populations. They also urge policy makers to be more willing to acknowledge the tradeoffs of vaccination.

That's sound advice. We'll have to wait and see if the government adopts it come the next pandemic.

There is one policy that they don't mention and doesn't totally depend on the government getting better at judging the risks of new vaccines: Charge people for them.

Had the government not provided COVID-19 vaccines for free, and had it not shielded vaccine makers and administrators from any liability for adverse reactions, prices could have better rationed vaccine supply and better informed people about their risks and benefits.

Without prices, people were instead left with flawed government recommendations, incentives, and rationing schemes.

Those who recall early 2021 will remember the complex, often transparently silly eligibility criteria state governments set up to ration scarce vaccine supplies. This often involved prioritizing younger, healthier, often politically connected "essential workers" over elderly people.

Prasad and Haslam criticize this as a government failure to prioritize groups at most risk of dying from COVID-19.

"While the UK prioritized nursing home residents and older individuals…the US included essential workers, including young, resident physicians," write Prasad and Haslam. "Health care workers face higher risks of acquiring the virus due to occupation (though this was and is offset by available personal protective equipment), but this was less than the elevated risk of death faced by older individuals."

Yet if the government hadn't assigned itself the role of distributing vaccines for free, it wouldn't have been forced into this position of rationing scarce vaccine supplies.

Demand for the vaccine is a function of the vaccine's price. Since the vaccine's price was $0, people who stood to gain comparatively less from vaccination and people for whom a vaccine would be lifesaving were equally incentivized to receive it.

Consequently, everyone rushed to get in line at the same time. The government then had to decide who got it first and predictably made flawed decisions.

Had vaccine makers been left to sell their product on an open market (instead of selling doses in bulk to the federal government to distribute for free), the elderly and those most at risk of COVID-19 would have been able to outbid people who could afford to wait longer. Perhaps more lives could have been saved.
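
As a toy illustration of that allocation argument, consider a simulation in which scarce doses are handed out either by queue order (the free-vaccine case) or to the highest bidders. All numbers and the use of willingness to pay as a stand-in for expected benefit are invented for illustration and are not from the essay.

```python
# Toy model of the essay's allocation argument: with a $0 price and scarce
# supply, doses go to whoever lines up first; with a market price, they go
# to those willing to pay the most (a stand-in for expected benefit).
# All numbers are invented for illustration.
import random

random.seed(0)

# 1,000 people: 200 high-risk (high willingness to pay), 800 low-risk.
people = [{"risk": "high", "wtp": random.uniform(200, 500)} for _ in range(200)]
people += [{"risk": "low", "wtp": random.uniform(5, 50)} for _ in range(800)]
DOSES = 300  # scarce early supply


def high_risk_share(recipients):
    return sum(p["risk"] == "high" for p in recipients) / len(recipients)


# Free vaccine: arrival order is effectively random with respect to risk.
queue = random.sample(people, len(people))
free_alloc = queue[:DOSES]

# Priced vaccine: the scarce doses clear to the highest willingness-to-pay bidders.
priced_alloc = sorted(people, key=lambda p: p["wtp"], reverse=True)[:DOSES]

print(f"High-risk share, free + queue:  {high_risk_share(free_alloc):.0%}")
print(f"High-risk share, market price:  {high_risk_share(priced_alloc):.0%}")
```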

Over the course of 2021, the supply of vaccines outpaced demand.

At the same time, as Prasad and Haslam recount, an increasing number of people (particularly young men) were developing myocarditis as a result of vaccination. Nevertheless, the government downplayed this risk, continued to urge younger populations to get vaccinated, and failed to collect data about the potential risks of vaccination.

That's all a failure of the government policy. Even if the government was slow to adjust its recommendations, prices could have played a constructive role in informing people about their own risk-reward tradeoff of getting vaccinated.

If a 20-year-old man who’d already had COVID-19 had to spend something to get vaccinated, instead of nothing, fewer men like him would have. Prasad and Haslam argue that would have been the right call healthwise.

Without prices, that hypothetical 20-year-old's decision was informed mostly by government guidance, and, later, government mandates.

The government compounded this lack of prices by giving liability shields to vaccine makers. As it stands right now, no one is able to sue the maker of a COVID-19 vaccine should they have an adverse reaction. (Unlike with standard, non-COVID vaccines, people are also not allowed to sue the government for compensation for vaccine injuries.)

If pharmaceutical companies had to charge individual consumers to make money off their vaccines, and if those prices had to reflect the liability risks of the side effects some number of people would inevitably have, consumers would have been even better informed about the risks and rewards of vaccination.

One might counter that individual consumers aren't in a position to perform this risk-reward calculation on their own.

That ignores the ways that other intermediaries in a better position to evaluate the costs and benefits of vaccination could contribute to the price signals individuals would use to make their own decisions.

One could imagine an insurance company declining to cover COVID-19 vaccines for the aforementioned healthy 20-year-old while subsidizing their elderly customers to get the shot. (This is, of course, illegal right now. The Affordable Care Act requires most insurance plans to cover the costs of vaccination for everyone.)

Instead, the financial incentives that were attached to vaccination were another part of the federally subsidized vaccination campaign.

State Medicaid programs paid providers bonuses for the number of patients they vaccinated (regardless of how at risk of COVID-19 those patients were). State governments gave out gift cards to those who got vaccinated and entered them in lotteries to win even bigger prizes.

Leaving it up to private companies to produce and charge for vaccines would have one added benefit: It would make it much more difficult for the government to mandate vaccines or otherwise coerce people into getting them.

One of the things that made it easy for local and state governments to bar the unvaccinated from restaurants and schools was that they also had a lot of free, federally subsidized doses to give away. People didn't have a real "excuse" not to get a shot.

Had people been required to pay for vaccines, it would have been more awkward and much harder (politically and practically) to mandate that they do so.

Economist Alex Tabarrok likes to say that a "price is a signal wrapped up in an incentive." They signal crucial information and then incentivize people to act on that information in a rational, efficient way.

By divorcing COVID-19 vaccines from real price signals, we were left with an earnest, government-led vaccination effort. That effort got a lot of lifesaving vaccines to a lot of people.

But it also encouraged and subsidized people to get vaccinated when doing so was probably unnecessary, or even a bad idea. When not enough people got vaccinated, governments turned to coercive mandates.

The post The COVID-19 Vaccines Shouldn't Have Been Free appeared first on Reason.com.

Koch Institute | MIT News

MIT’s tiny technologies go to Washington

On Nov. 7, a team from the Marble Center for Cancer Nanomedicine at MIT showed a Washington audience several examples of how nanotechnologies developed at the Institute can transform the detection and treatment of cancer and other diseases.

The team was one of 40 innovative groups featured at “American Possibilities: A White House Demo Day.” Technology on view spanned energy, artificial intelligence, climate, and health, highlighting advancements that contribute to building a better future for all Americans.

Participants included President Joe Biden, Biden-Harris administration leaders and White House staff, members of Congress, federal R&D funding agencies, scientists and engineers, academics, students, and science and technology industry innovators. The event holds special significance for MIT: eight years ago, MIT's Computer Science and Artificial Intelligence Laboratory participated in the previous iteration of the White House Demo Day, under President Barack Obama.

“It was truly inspirational hearing from experts from all across the government, the private sector, and academia touching on so many fields,” said President Biden of the event. “It was a reminder, at least for me, of what I’ve long believed — that America can be defined by a single word... possibilities.”

The Marble Center for Cancer Nanomedicine was launched in 2016 at the Koch Institute for Integrative Cancer Research at MIT to serve as a hub for miniaturized biomedical technologies, especially those that address grand challenges in cancer detection, treatment, and monitoring. The center convenes Koch Institute faculty members Sangeeta Bhatia, Paula Hammond, Robert Langer, Angela Belcher, Darrell Irvine, and Daniel Anderson to advance nanomedicine, as well as to facilitate collaboration with industry partners, including Alloy Therapeutics, Danaher Corp., Fujifilm, and Sanofi.

Ana Jaklenec, a principal research scientist at the Koch Institute, highlighted several groundbreaking technologies in vaccines and disease diagnostics and treatment at the event. Jaklenec gave demonstrations of projects from her research group, including novel vaccine formulations capable of releasing a dozen booster doses pulsed over predetermined time points, microneedle vaccine technologies, and nutrient delivery technologies for precise control over microbiome modulation and nutrient absorption.

Jaklenec describes the event as “a wonderful opportunity to meet our government leaders and policymakers and see their passion for curing cancer. But it was especially moving to interact with people representing diverse communities across the United States and hear their excitement for how our technologies could positively impact their communities.”

Jeremy Li, a former MIT postdoc, presented a technology developed in the Belcher laboratory and commercialized by the spinout Cision Vision. The startup is developing a new approach to visualize lymph nodes in real time without any injection or radiation. The shoebox-sized device was also selected as part of Time Magazine’s Best Inventions of 2023 and is currently being used in a dozen hospitals across the United States.

“It was a proud moment for Cision Vision to be part of this event and discuss our recent progress in the field of medical imaging and cancer care,” says Li, who is a co-founder and the CEO of Cision Vision. “It was a humbling experience for us to hear directly from patient advocates and cancer survivors at the event. We feel more inspired than ever to bring better solutions for cancer care to patients around the world.”

Other technologies shown at the event included new approaches such as a tortoise-shaped pill designed to enhance the efficacy of oral medicines, a miniature organ-on-a-chip liver device to predict drug toxicity and model liver disease, and a wireless bioelectronic device that provides oxygen for cell therapy applications and for the treatment of chronic disease.

“The feedback from the organizers and the audience at the event has been overwhelmingly positive,” says Tarek Fadel, who led the team’s participation at the event. “Navigating the demonstration space felt like stepping into the future. As a center, we stand poised to engineer transformative tools that will truly make a difference for the future of cancer care.”

Sangeeta Bhatia, the Director of the Marble Center and the John J. and Dorothy Wilson Professor of Health Sciences and Technology and Electrical Engineering and Computer Science, adds: “The showcase of our technologies at the White House Demo Day underscores the transformative impact we aim to achieve in cancer detection and treatment. The event highlights our vision to advance cutting-edge solutions for the benefit of patients and communities worldwide.”

Ana Jaklenec (right), principal research scientist at the Koch Institute for Integrative Cancer Research at MIT, and Jeremy Li, CEO and co-founder of Cision Vision, presented at “American Possibilities: A White House Demo Day.”
Adam Zewe | MIT News

Scientists 3D print self-heating microfluidic devices

MIT researchers have used 3D printing to produce self-heating microfluidic devices, demonstrating a technique which could someday be used to rapidly create cheap, yet accurate, tools to detect a host of diseases.

Microfluidics, miniaturized machines that manipulate fluids and facilitate chemical reactions, can be used to detect disease in tiny samples of blood or fluids. At-home test kits for Covid-19, for example, incorporate a simple type of microfluidic.

But many microfluidic applications require chemical reactions that must be performed at specific temperatures. These more complex microfluidic devices, which are typically manufactured in a clean room, are outfitted with heating elements made from gold or platinum using a complicated and expensive fabrication process that is difficult to scale up.

Instead, the MIT team used multimaterial 3D printing to create self-heating microfluidic devices with built-in heating elements, through a single, inexpensive manufacturing process. They generated devices that can heat fluid to a specific temperature as it flows through microscopic channels inside the tiny machine.

Their technique is customizable, so an engineer could create a microfluidic device that heats fluid to a certain temperature, or follows a given heating profile, within a specific area of the device. The low-cost fabrication process requires about $2 of materials to generate a ready-to-use microfluidic.

The process could be especially useful in creating self-heating microfluidics for remote regions of developing countries where clinicians may not have access to the expensive lab equipment required for many diagnostic procedures.

“Clean rooms in particular, where you would usually make these devices, are incredibly expensive to build and to run. But we can make very capable self-heating microfluidic devices using additive manufacturing, and they can be made a lot faster and cheaper than with these traditional methods. This is really a way to democratize this technology,” says Luis Fernando Velásquez-García, a principal scientist in MIT’s Microsystems Technology Laboratories (MTL) and senior author of a paper describing the fabrication technique.

He is joined on the paper by lead author Jorge Cañada Pérez-Sala, an electrical engineering and computer science graduate student. The research will be presented at the PowerMEMS Conference this month.

An insulator becomes conductive

This new fabrication process utilizes a technique called multimaterial extrusion 3D printing, in which several materials can be squirted through the printer’s many nozzles to build a device layer by layer. The process is monolithic, which means the entire device can be produced in one step on the 3D printer, without the need for any post-assembly.

To create self-heating microfluidics, the researchers used two materials — a biodegradable polymer known as polylactic acid (PLA) that is commonly used in 3D printing, and a modified version of PLA.

The modified PLA has copper nanoparticles mixed into the polymer, which convert this insulating material into an electrical conductor, Velásquez-García explains. When electrical current is fed into a resistor composed of this copper-doped PLA, energy is dissipated as heat.

“It is amazing when you think about it because the PLA material is a dielectric, but when you put in these nanoparticle impurities, it completely changes the physical properties. This is something we don’t fully understand yet, but it happens and it is repeatable,” he says.

Using a multimaterial 3D printer, the researchers fabricate a heating resistor from the copper-doped PLA and then print the microfluidic device, with microscopic channels through which fluid can flow, directly on top in one printing step. Because the components are made from the same base material, they have similar printing temperatures and are compatible.

Heat dissipated from the resistor will warm fluid flowing through the channels in the microfluidic.
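
The heater itself follows Joule’s law: a resistor driven at voltage V dissipates P = V²/R as heat. Below is a minimal sketch with placeholder electrical values; the article does not report the device’s actual resistance or drive voltage.

```python
# Joule heating from a printed copper-doped PLA resistor: P = V^2 / R.
# The resistance and drive voltages below are placeholders for illustration;
# the article does not give the device's actual electrical parameters.
def joule_power(voltage_v: float, resistance_ohm: float) -> float:
    """Power dissipated as heat by a resistor, in watts."""
    return voltage_v ** 2 / resistance_ohm


if __name__ == "__main__":
    R = 100.0  # ohms, assumed printed-trace resistance
    for v in (3.3, 5.0, 12.0):
        print(f"{v:>5.1f} V across {R:.0f} ohm -> {joule_power(v, R) * 1000:.0f} mW of heat")
```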

In addition to the resistor and microfluidic, they use the printer to add a thin, continuous layer of PLA that is sandwiched between them. It is especially challenging to manufacture this layer because it must be thin enough so heat can transfer from the resistor to the microfluidic, but not so thin that fluid could leak into the resistor.
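
One way to see that tradeoff is Fourier’s law for conduction across the layer, q = k·A·ΔT/t: halving the thickness doubles the heat delivered for the same temperature difference, but a thinner layer is harder to print leak-free. The sketch below uses a typical literature value for PLA’s thermal conductivity and assumed dimensions, not figures from this work.

```python
# Conduction through the thin PLA layer between resistor and channel,
# from Fourier's law: q = k * A * dT / thickness. Layer thickness, area,
# and temperature drop are assumed for illustration; k ~ 0.13 W/(m*K) is a
# typical literature value for PLA, not a figure from this article.
K_PLA = 0.13  # W/(m*K), approximate thermal conductivity of PLA


def conducted_heat_w(area_mm2: float, thickness_um: float, delta_t_c: float) -> float:
    """Heat conducted across a PLA layer of the given area and thickness."""
    area_m2 = area_mm2 * 1e-6
    thickness_m = thickness_um * 1e-6
    return K_PLA * area_m2 * delta_t_c / thickness_m


if __name__ == "__main__":
    # A 5 mm x 5 mm heated zone under a 200-um membrane with a 10 C drop:
    print(f"{conducted_heat_w(25.0, 200.0, 10.0) * 1000:.0f} mW conducted")
```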

The resulting machine is about the size of a U.S. quarter and can be produced in a matter of minutes. Channels about 500 micrometers wide and 400 micrometers tall are threaded through the microfluidic to carry fluid and facilitate chemical reactions.

Importantly, the PLA material is translucent, so fluid in the device remains visible. Many processes rely on visualization or the use of light to infer what is happening during chemical reactions, Velásquez-García explains.

Customizable chemical reactors

The researchers used this one-step manufacturing process to generate a prototype that could heat fluid by 4 degrees Celsius as it flowed between the input and the output. This customizable technique could enable them to make devices that heat fluids in certain patterns or along specific gradients.
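
For intuition about that 4-degree figure, a steady-flow energy balance relates heater power to temperature rise, ΔT = P / (ρ·Q·c_p), where Q is the volumetric flow rate. The flow rate and power levels below are assumptions for illustration, not values reported by the researchers.

```python
# Back-of-envelope check: temperature rise of water flowing past a heater,
# from the steady-flow energy balance dT = P / (rho * Q * cp).
# The flow rate and heater powers are assumed values, not from the paper.
RHO_WATER = 997.0   # kg/m^3
CP_WATER = 4180.0   # J/(kg*K)


def temp_rise_c(power_w: float, flow_ul_per_min: float) -> float:
    """Steady-state temperature rise (degrees C) of water absorbing `power_w`."""
    flow_m3_per_s = flow_ul_per_min * 1e-9 / 60.0  # 1 uL = 1e-9 m^3
    return power_w / (RHO_WATER * flow_m3_per_s * CP_WATER)


if __name__ == "__main__":
    # Example: ~30 mW absorbed by the fluid at 100 uL/min gives roughly a 4 C rise.
    for p in (0.01, 0.03, 0.1):
        print(f"{p * 1000:.0f} mW at 100 uL/min -> {temp_rise_c(p, 100):.1f} C rise")
```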

“You can use these two materials to create chemical reactors that do exactly what you want. We can set up a particular heating profile while still having all the capabilities of the microfluidic,” he says.

However, one limitation comes from the fact that PLA can only be heated to about 50 degrees Celsius before it starts to degrade. Many chemical reactions, such as those used for polymerase chain reaction (PCR) tests, require temperatures of 90 degrees or higher. And to precisely control the temperature of the device, researchers would need to integrate a third material that enables temperature sensing.

In addition to tackling these limitations in future work, Velásquez-García wants to print magnets directly into the microfluidic device. These magnets could enable chemical reactions that require particles to be sorted or aligned.

At the same time, he and his colleagues are exploring the use of other materials that could reach higher temperatures. They are also studying PLA to better understand why it becomes conductive when certain impurities are added to the polymer.

“If we can understand the mechanism that is related to the electrical conductivity of PLA, that would greatly enhance the capability of these devices, but it is going to be a lot harder to solve than some other engineering problems,” he adds.

“In Japanese culture, it’s often said that beauty lies in simplicity. This sentiment is echoed by the work of Cañada and Velasquez-Garcia. Their proposed monolithically 3D-printed microfluidic systems embody simplicity and beauty, offering a wide array of potential derivations and applications that we foresee in the future,” says Norihisa Miki, a professor of mechanical engineering at Keio University in Tokyo, who was not involved with this work.

“Being able to directly print microfluidic chips with fluidic channels and electrical features at the same time opens up very exciting applications when processing biological samples, such as to amplify biomarkers or to actuate and mix liquids. Also, due to the fact that PLA degrades over time, one can even think of implantable applications where the chips dissolve and resorb over time,” adds Niclas Roxhed, an associate professor at Sweden’s KTH Royal Institute of Technology, who was not involved with this study.

This research was funded, in part, by the Empiriko Corporation and a fellowship from La Caixa Foundation.

© Image: Courtesy of the researchers

MIT researchers developed a fabrication process to produce self-heating microfluidic devices in one step using a multi-material 3D printer. Pictured is an example of one of the devices.
  • ✇MIT News - Nanoscience and nanotechnology | MIT.nano
  • MIT’s tiny technologies go to WashingtonKoch Institute
    On Nov. 7, a team from the Marble Center for Cancer Nanomedicine at MIT showed a Washington audience several examples of how nanotechnologies developed at the Institute can transform the detection and treatment of cancer and other diseases. The team was one of 40 innovative groups featured at “American Possibilities: A White House Demo Day.” Technology on view spanned energy, artificial intelligence, climate, and health, highlighting advancements that contribute to building a better future for
     

MIT’s tiny technologies go to Washington

On Nov. 7, a team from the Marble Center for Cancer Nanomedicine at MIT showed a Washington audience several examples of how nanotechnologies developed at the Institute can transform the detection and treatment of cancer and other diseases.

The team was one of 40 innovative groups featured at “American Possibilities: A White House Demo Day.” Technology on view spanned energy, artificial intelligence, climate, and health, highlighting advancements that contribute to building a better future for all Americans.

Participants included President Joe Biden, Biden-Harris administration leaders and White House staff, members of Congress, federal R&D funding agencies, scientists and engineers, academics, students, and science and technology industry innovators. The event holds special significance for MIT as eight years ago, MIT's Computer Science and Artificial Intelligence Laboratory participated in the last iteration of the White House Demo Day under President Barack Obama.

“It was truly inspirational hearing from experts from all across the government, the private sector, and academia touching on so many fields,” said President Biden of the event. “It was a reminder, at least for me, of what I’ve long believed — that America can be defined by a single word... possibilities.”

Launched in 2016, the Marble Center for Cancer Nanomedicine was established at the Koch Institute for Integrative Research at MIT to serve as a hub for miniaturized biomedical technologies, especially those that address grand challenges in cancer detection, treatment, and monitoring. The center convenes Koch Institute faculty members Sangeeta Bhatia, Paula Hammond, Robert Langer, Angela Belcher, Darrell Irvine, and Daniel Anderson to advance nanomedicine, as well as to facilitate collaboration with industry partners, including Alloy Therapeutics, Danaher Corp., Fujifilm, and Sanofi. 

Ana Jaklenec, a principal research scientist at the Koch Institute, highlighted several groundbreaking technologies in vaccines and disease diagnostics and treatment at the event. Jaklenec gave demonstrations from projects from her research group, including novel vaccine formulations capable of releasing a dozen booster doses pulsed over predetermined time points, microneedle vaccine technologies, and nutrient delivery technologies for precise control over microbiome modulation and nutrient absorption.

Jaklenec describes the event as “a wonderful opportunity to meet our government leaders and policymakers and see their passion for curing cancer. But it was especially moving to interact with people representing diverse communities across the United States and hear their excitement for how our technologies could positively impact their communities.”

Jeremy Li, a former MIT postdoc, presented a technology developed in the Belcher laboratory and commercialized by the spinout Cision Vision. The startup is developing a new approach to visualize lymph nodes in real time without any injection or radiation. The shoebox-sized device was also selected as part of Time Magazine’s Best Inventions of 2023 and is currently being used in a dozen hospitals across the United States.

“It was a proud moment for Cision Vision to be part of this event and discuss our recent progress in the field of medical imaging and cancer care,” says Li, who is a co-founder and CEO of Cision Vision. “It was a humbling experience for us to hear directly from patient advocates and cancer survivors at the event. We feel more inspired than ever to bring better solutions for cancer care to patients around the world.”

Other technologies shown at the event included new approaches such as a tortoise-shaped pill designed to enhance the efficacy of oral medicines, a miniature organ-on-a-chip liver device to predict drug toxicity and model liver disease, and a wireless bioelectronic device that provides oxygen for cell therapy applications and for the treatment of chronic disease.

“The feedback from the organizers and the audience at the event has been overwhelmingly positive,” says Tarek Fadel, who led the team’s participation at the event. “Navigating the demonstration space felt like stepping into the future. As a center, we stand poised to engineer transformative tools that will truly make a difference for the future of cancer care.”

Sangeeta Bhatia, the Director of the Marble Center and the John J. and Dorothy Wilson Professor of Health Sciences and Technology and Electrical Engineering and Computer Science, adds: “The showcase of our technologies at the White House Demo Day underscores the transformative impact we aim to achieve in cancer detection and treatment. The event highlights our vision to advance cutting-edge solutions for the benefit of patients and communities worldwide.”

Ana Jaklenec (right), principal research scientist at the Koch Institute for Integrative Cancer Research at MIT, and Jeremy Li, CEO and co-founder of Cision Vision, presented at “American Possibilities: A White House Demo Day.”
MIT News - Nanoscience and nanotechnology | MIT.nano · Adam Zewe

Scientists 3D print self-heating microfluidic devices

MIT researchers have used 3D printing to produce self-heating microfluidic devices, demonstrating a technique which could someday be used to rapidly create cheap, yet accurate, tools to detect a host of diseases.

Microfluidics, miniaturized machines that manipulate fluids and facilitate chemical reactions, can be used to detect disease in tiny samples of blood or fluids. At-home test kits for Covid-19, for example, incorporate a simple type of microfluidic.

But many microfluidic applications require chemical reactions that must be performed at specific temperatures. These more complex microfluidic devices, which are typically manufactured in a clean room, are outfitted with heating elements made from gold or platinum using a complicated and expensive fabrication process that is difficult to scale up.

Instead, the MIT team used multimaterial 3D printing to create self-heating microfluidic devices with built-in heating elements, through a single, inexpensive manufacturing process. They generated devices that can heat fluid to a specific temperature as it flows through microscopic channels inside the tiny machine.

Their technique is customizable, so an engineer could create a microfluidic that heats fluid to a certain temperature or given heating profile within a specific area of the device. The low-cost fabrication process requires about $2 of materials to generate a ready-to-use microfluidic.

The process could be especially useful in creating self-heating microfluidics for remote regions of developing countries where clinicians may not have access to the expensive lab equipment required for many diagnostic procedures.

“Clean rooms in particular, where you would usually make these devices, are incredibly expensive to build and to run. But we can make very capable self-heating microfluidic devices using additive manufacturing, and they can be made a lot faster and cheaper than with these traditional methods. This is really a way to democratize this technology,” says Luis Fernando Velásquez-García, a principal scientist in MIT’s Microsystems Technology Laboratories (MTL) and senior author of a paper describing the fabrication technique.

He is joined on the paper by lead author Jorge Cañada Pérez-Sala, an electrical engineering and computer science graduate student. The research will be presented at the PowerMEMS Conference this month.

An insulator becomes conductive

This new fabrication process utilizes a technique called multimaterial extrusion 3D printing, in which several materials can be squirted through the printer’s many nozzles to build a device layer by layer. The process is monolithic, which means the entire device can be produced in one step on the 3D printer, without the need for any post-assembly.

To create self-heating microfluidics, the researchers used two materials — a biodegradable polymer known as polylactic acid (PLA) that is commonly used in 3D printing, and a modified version of PLA.

The modified PLA has copper nanoparticles mixed into the polymer, which converts the insulating material into an electrical conductor, Velásquez-García explains. When electrical current is fed into a resistor composed of this copper-doped PLA, energy is dissipated as heat.

“It is amazing when you think about it because the PLA material is a dielectric, but when you put in these nanoparticle impurities, it completely changes the physical properties. This is something we don’t fully understand yet, but it happens and it is repeatable,” he says.

Using a multimaterial 3D printer, the researchers fabricate a heating resistor from the copper-doped PLA and then print the microfluidic device, with microscopic channels through which fluid can flow, directly on top in one printing step. Because the components are made from the same base material, they have similar printing temperatures and are compatible.

Heat dissipated from the resistor will warm fluid flowing through the channels in the microfluidic.

In addition to the resistor and microfluidic, they use the printer to add a thin, continuous layer of PLA that is sandwiched between them. It is especially challenging to manufacture this layer because it must be thin enough so heat can transfer from the resistor to the microfluidic, but not so thin that fluid could leak into the resistor.
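
For intuition about why that layer's thickness matters so much, a back-of-envelope conduction estimate is useful; the heated footprint, temperature drop, and candidate thicknesses below are illustrative assumptions rather than values from the paper, and the PLA conductivity is a typical literature figure.

# Rough Fourier's-law estimate of heat conducted through the thin PLA layer
# separating the resistor from the fluid channel. Numbers are illustrative only.
k_pla = 0.13            # W/(m*K), typical thermal conductivity of PLA (assumed)
area = 10e-3 * 10e-3    # m^2, assumed 10 mm x 10 mm heated footprint
delta_t = 10.0          # K, assumed temperature drop across the layer

for thickness_um in (100, 200, 500, 1000):
    thickness = thickness_um * 1e-6               # micrometers to meters
    heat_w = k_pla * area * delta_t / thickness   # steady-state conduction
    print(f"{thickness_um:4d} um layer -> ~{heat_w:.2f} W reaches the fluid")

Halving the layer roughly doubles the heat delivered to the channel, which is why the print has to stay thin without perforating and letting fluid reach the resistor.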

The resulting machine is about the size of a U.S. quarter and can be produced in a matter of minutes. Channels about 500 micrometers wide and 400 micrometers tall are threaded through the microfluidic to carry fluid and facilitate chemical reactions.

Importantly, the PLA material is translucent, so fluid in the device remains visible. Many processes rely on visualization or the use of light to infer what is happening during chemical reactions, Velásquez-García explains.

Customizable chemical reactors

The researchers used this one-step manufacturing process to generate a prototype that could heat fluid by 4 degrees Celsius as it flowed between the input and the output. This customizable technique could enable them to make devices which would heat fluids in certain patterns or along specific gradients.

“You can use these two materials to create chemical reactors that do exactly what you want. We can set up a particular heating profile while still having all the capabilities of the microfluidic,” he says.
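
As a rough sanity check on the 4-degree-Celsius figure, a steady-state energy balance ties heater power, flow rate, and outlet temperature rise together. The flow rate and heating efficiency below are assumptions for illustration, not numbers reported by the researchers; only the 4-degree rise comes from the article.

# Steady-state energy balance: efficiency * power = rho * Q * c_p * dT.
# Water properties are standard; the flow rate and efficiency are assumed.
rho = 1000.0              # kg/m^3, density of water
c_p = 4186.0              # J/(kg*K), specific heat of water
flow_ul_per_min = 100.0   # assumed volumetric flow rate, microliters per minute
efficiency = 0.5          # assumed fraction of resistor heat reaching the fluid
delta_t = 4.0             # K, temperature rise reported for the prototype

q_flow = flow_ul_per_min * 1e-9 / 60.0          # convert to m^3/s
power_needed = rho * q_flow * c_p * delta_t / efficiency
print(f"~{power_needed * 1000:.0f} mW of resistor power for a {delta_t:.0f} K rise")

At microfluidic flow rates a few tens of milliwatts is enough, which helps explain why a small printed polymer resistor can do the job.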

However, one limitation comes from the fact that PLA can only be heated to about 50 degrees Celsius before it starts to degrade. Many chemical reactions, such as those used for polymerase chain reaction (PCR) tests, require temperatures of 90 degrees Celsius or higher. And to precisely control the temperature of the device, researchers would need to integrate a third material that enables temperature sensing.

In addition to tackling these limitations in future work, Velásquez-García wants to print magnets directly into the microfluidic device. These magnets could enable chemical reactions that require particles to be sorted or aligned.

At the same time, he and his colleagues are exploring the use of other materials that could reach higher temperatures. They are also studying PLA to better understand why it becomes conductive when certain impurities are added to the polymer.

“If we can understand the mechanism that is related to the electrical conductivity of PLA, that would greatly enhance the capability of these devices, but it is going to be a lot harder to solve than some other engineering problems,” he adds.

“In Japanese culture, it’s often said that beauty lies in simplicity. This sentiment is echoed by the work of Cañada and Velásquez-García. Their proposed monolithically 3D-printed microfluidic systems embody simplicity and beauty, offering a wide array of potential derivations and applications that we foresee in the future,” says Norihisa Miki, a professor of mechanical engineering at Keio University in Tokyo, who was not involved with this work.

“Being able to directly print microfluidic chips with fluidic channels and electrical features at the same time opens up very exciting applications when processing biological samples, such as to amplify biomarkers or to actuate and mix liquids. Also, due to the fact that PLA degrades over time, one can even think of implantable applications where the chips dissolve and resorb over time,” adds Niclas Roxhed, an associate professor at Sweden’s KTH Royal Institute of Technology, who was not involved with this study.

This research was funded, in part, by the Empiriko Corporation and a fellowship from La Caixa Foundation.

© Image: Courtesy of the researchers

MIT researchers developed a fabrication process to produce self-heating microfluidic devices in one step using a multi-material 3D printer. Pictured is an example of one of the devices.
Techdirt

Nurses Say Hospital Adoption Of Half-Cooked ‘AI’ Is Reckless

By: Karl Bode
May 2, 2024 at 14:22

We’ve noted repeatedly that while “AI” (large language models) holds a lot of potential, the rushed implementation of half-assed early variants is causing no shortage of headaches across journalism, media, health care, and other sectors. In part because the kind of terrible brunchlord managers in charge of many institutions primarily see AI as a way to cut corners and attack labor.

It’s been a particular problem in healthcare, where broken “AI” is being layered on top of already broken systems. Like in insurance, where error-prone automation, programmed from the ground up to prioritize money over health, is incorrectly denying essential insurance coverage to the elderly.

Last week, hundreds of nurses protested the implementation of sloppy AI into hospital systems in front of Kaiser Permanente. Their primary concern: that systems incapable of empathy are being integrated into an already dysfunctional sector without much thought toward patient care:

“No computer, no AI can replace a human touch,” said Amy Grewal, a registered nurse. “It cannot hold your loved one’s hand. You cannot teach a computer how to have empathy.”

There are certainly roles automation can play in easing strain on a sector full of burnout after COVID, particularly when it comes to administrative tasks. The concern, as with other industries dominated by executives with poor judgement, is that this is being used as a justification by for-profit hospital systems to cut corners further. From a National Nurses United blog post (spotted by 404 Media):

“Nurses are not against scientific or technological advancement, but we will not accept algorithms replacing the expertise, experience, holistic, and hands-on approach we bring to patient care,” they added.

Kaiser Permanente, for its part, insists it’s simply leveraging “state-of-the-art tools and technologies that support our mission of providing high-quality, affordable health care to best meet our members’ and patients’ needs.” The company claims its “Advance Alert” AI monitoring system — which algorithmically analyzes patient data every hour — has the potential to save upwards of 500 lives a year.

The problem is that healthcare giants’ primary obligation no longer appears to reside with patients, but with their financial results. And that’s true even of non-profit healthcare providers. It shows up in the form of cut corners, worse service, and an assault on already over-taxed labor via lower pay and higher workloads (curiously, it never seems to touch outsized high-level executive compensation).

AI provides companies the perfect justification for making life worse for employees under the pretense of progress. Which wouldn’t be quite as terrible if the implementation of AI in health care hadn’t been such a preposterous mess, ranging from mental health chatbots doling out dangerously inaccurate advice, to AI health insurance bots that make error-prone judgements a good 90 percent of the time.

AI has great potential in imaging analysis. But while it can help streamline analysis and solve some errors, it may introduce entirely new ones if not adopted with caution. Concern on this front can often be misrepresented as being anti-technology or anti-innovation by health care hardware technology companies again prioritizing quarterly returns over the safety of patients.

Implementing this kind of transformative but error-prone tech in an industry where lives are on the line requires patience, intelligent planning, broad consultation with every level of employee, and competent regulatory guidance, none of which are American strong suits of late.

MIT News - Nanoscience and nanotechnology | MIT.nano · Becky Ham

Brain surgery training from an avatar

Benjamin Warf, a renowned neurosurgeon at Boston Children’s Hospital, stands in the MIT.nano Immersion Lab. More than 3,000 miles away, his virtual avatar stands next to Matheus Vasconcelos in Brazil as the resident practices delicate surgery on a doll-like model of a baby’s brain.

With a pair of virtual-reality goggles, Vasconcelos is able to watch Warf’s avatar demonstrate a brain surgery procedure before replicating the technique himself, asking questions of Warf’s digital twin along the way.

“It’s an almost out-of-body experience,” Warf says of watching his avatar interact with the residents. “Maybe it’s how it feels to have an identical twin?”

And that’s the goal: Warf’s digital twin bridged the distance, allowing him to be functionally in two places at once. “It was my first training using this model, and it had excellent performance,” says Vasconcelos, a neurosurgery resident at Santa Casa de São Paulo School of Medical Sciences in São Paulo, Brazil. “As a resident, I now feel more confident and comfortable applying the technique in a real patient under the guidance of a professor.”

Warf’s avatar arrived via a new project launched by medical simulator and augmented reality (AR) company EDUCSIM. The company is part of the 2023 cohort of START.nano, MIT.nano’s deep-tech accelerator that offers early-stage startups discounted access to MIT.nano’s laboratories.

In March 2023, Giselle Coelho, EDUCSIM’s scientific director and a pediatric neurosurgeon at Santa Casa de São Paulo and Sabará Children’s Hospital, began working with technical staff in the MIT.nano Immersion Lab to create Warf’s avatar. By November, the avatar was training future surgeons like Vasconcelos.

“I had this idea to create the avatar of Dr. Warf as a proof of concept, and asked, ‘What would be the place in the world where they are working on technologies like that?’” Coelho says. “Then I found MIT.nano.”

Capturing a surgeon

As a neurosurgery resident, Coelho was so frustrated by the lack of practical training options for complex surgeries that she built her own model of a baby brain. The physical model contains all the structures of the brain and can even bleed, “simulating all the steps of a surgery, from incision to skin closure,” she says.

She soon found that simulators and virtual reality (VR) demonstrations reduced the learning curve for her own residents. Coelho launched EDUCSIM in 2017 to expand the variety and reach of the training for residents and experts looking to learn new techniques.

Those techniques include a procedure to treat infant hydrocephalus that was pioneered by Warf, the director of neonatal and congenital neurosurgery at Boston Children’s Hospital. Coelho had learned the technique directly from Warf and thought his avatar might be the way for surgeons who couldn’t travel to Boston to benefit from his expertise.

To create the avatar, Coelho worked with Talis Reks, the AR/VR/gaming/big data IT technologist in the Immersion Lab.

“A lot of technology and hardware can be very expensive for startups to access as they start their company journey,” Reks explains. “START.nano is one way of enabling them to utilize and afford the tools and technologies we have at MIT.nano’s Immersion Lab.”

Coelho and her colleagues needed high-fidelity and high-resolution motion-capture technology, volumetric video capture, and a range of other VR/AR technologies to capture Warf’s dexterous finger motions and facial expressions. Warf visited MIT.nano on several occasions to be digitally “captured,” including performing an operation on the physical baby model while wearing special gloves and clothing embedded with sensors.

“These technologies have mostly been used for entertainment or VFX [visual effects] or CGI [computer-generated imagery],” says Reks, “But this is a unique project, because we’re applying it now for real medical practice and real learning.”

One of the biggest challenges, Reks says, was helping to develop what Coelho calls “holoportation” — transmitting the 3D, volumetric video capture of Warf in real-time over the internet so that his avatar can appear in transcontinental medical training.

The Warf avatar has synchronous and asynchronous modes. The training that Vasconcelos received was in the asynchronous mode, where residents can observe the avatar’s demonstrations and ask it questions. The answers, delivered in a variety of languages, come from AI algorithms that draw from previous research and an extensive bank of questions and answers provided by Warf.
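
The article does not describe how those answers are produced beyond the question-and-answer bank, so the following is only a minimal sketch of one plausible retrieval approach: match an incoming question to the closest pre-recorded entry and return its answer. The bank contents, function names, and TF-IDF similarity are illustrative assumptions, not EDUCSIM's implementation.

# Minimal sketch: route a resident's question to the closest pre-recorded answer.
# TF-IDF plus cosine similarity is an illustrative choice, not the real system.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

qa_bank = {  # hypothetical placeholder entries
    "How do I identify the correct entry point?": "Pre-recorded answer from the surgeon goes here.",
    "When should I abandon the endoscopic approach?": "Pre-recorded answer from the surgeon goes here.",
    "What irrigation rate should I use?": "Pre-recorded answer from the surgeon goes here.",
}

vectorizer = TfidfVectorizer()
question_matrix = vectorizer.fit_transform(list(qa_bank.keys()))

def answer(question: str) -> str:
    scores = cosine_similarity(vectorizer.transform([question]), question_matrix)[0]
    return list(qa_bank.values())[scores.argmax()]

print(answer("Where is the correct entry point?"))

A production system would presumably layer the multilingual delivery mentioned above, and a fallback to the synchronous mode, on top of this kind of lookup.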

In the synchronous mode, Warf operates his avatar from a distance in real time, Coelho says. “He could walk around the room, he could talk to me, he could orient me. It’s amazing.”

Coelho, Warf, Reks, and other team members demonstrated a combination of the modes in a second session in late December. This demo consisted of volumetric live video capture between the Immersion Lab and Brazil, spatialized and visible in real-time through AR headsets. It significantly expanded upon the previous demo, which had only streamed volumetric data in one direction through a two-dimensional display.

Powerful impacts

Warf has a long history of training desperately needed pediatric neurosurgeons around the world, most recently through his nonprofit Neurokids. Remote and simulated training has played an increasingly large role since the pandemic, he says, although he doesn’t feel it will ever completely replace personal hands-on instruction and collaboration.

“But if in fact one day we could have avatars, like this one from Giselle, in remote places showing people how to do things and answering questions for them, without the cost of travel, without the time cost and so forth, I think it could be really powerful,” Warf says.

The avatar project is especially important for surgeons serving remote and underserved areas like the Amazon region of Brazil, Coelho says. “This is a way to give them the same level of education that they would get in other places, and the same opportunity to be in touch with Dr. Warf.”

One baby treated for hydrocephalus at a recent Amazon clinic had traveled by boat 30 hours for the surgery, according to Coelho.

Training surgeons with the avatar, she says, “can change reality for this baby and can change the future.”

© Photo courtesy of the MIT.nano Immersion Lab.

Benjamin Warf, the director of neonatal and congenital neurosurgery at Boston Children’s Hospital, uses a virtual reality environment to demonstrate a procedure that he pioneered to treat infant hydrocephalus. As Warf operates his avatar from a distance in real-time, medical residents in Brazil watch, interact, and learn in a 3D environment.
Techdirt

96% Of Hospitals Share Sensitive Visitor Data With Meta, Google, and Data Brokers

By: Karl Bode
April 22, 2024 at 14:23

I’ve mentioned more than a few times how the singular hyperventilation about TikTok is kind of a silly distraction from the fact that the United States is too corrupt to pass a modern privacy law, resulting in no limit of dodgy behavior, abuse, and scandal. We have no real standards thanks to corruption, and most people have no real idea of the scale of the dysfunction.

Case in point: a new study out of the University of Pennsylvania (hat tip to The Register) analyzed a nationally representative sample of 100 U.S. hospitals, and found that 96 percent of them were doling out sensitive visitor data to Google, Meta, and a vast coalition of dodgy data brokers.

Hospitals, it should be clear, aren’t legally required to publish website privacy policies that clearly detail how and with whom they share visitor data. Again, because we’re too corrupt as a country to impose and enforce such requirements. The FTC does have some jurisdiction, but it’s too short-staffed and underfunded (quite intentionally) to tackle the real scope of U.S. online privacy violations.

So the study found that a chunk of these hospital websites didn’t even have a privacy policy. And of the ones that did, about half the time the over-verbose pile of ambiguous and intentionally confusing legalese didn’t really inform visitors that their data was being transferred to a long list of third parties. Or, for that matter, who those third-parties even are:

“…we found that although 96.0% of hospital websites exposed users to third-party tracking, only 71.0% of websites had an available website privacy policy…Only 56.3% of policies (and only 40 hospitals overall) identified specific third-party recipients.”
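
The study's full methodology is not reproduced in the article, but the core measurement, counting the third-party hosts a hospital homepage loads scripts from, can be sketched in a few lines. The function below is a simplification that only inspects static HTML (real measurements use instrumented browsers, since many trackers are injected by JavaScript), and the URL is a placeholder.

# Rough sketch: list third-party script hosts referenced in a page's static HTML.
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

def third_party_script_hosts(url: str) -> set[str]:
    first_party = urlparse(url).hostname
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    hosts = set()
    for tag in soup.find_all("script", src=True):
        host = urlparse(tag["src"]).hostname
        if host and host != first_party:
            hosts.add(host)  # e.g. googletagmanager.com, connect.facebook.net
    return hosts

print(third_party_script_hosts("https://hospital.example.org"))  # placeholder URL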

Data in this instance can involve everything from email and IP addresses to what you clicked on, what you researched, demographic info, and location. This was all a slight improvement from a study they did a year earlier showing that 98 percent of hospital websites shared sensitive data with third parties. The professors clearly knew what to expect, but were still disgusted in comments to The Register:

“It’s shocking, and really kind of incomprehensible,” said Dr. Ari Friedman, an assistant professor of emergency medicine at the University of Pennsylvania. “People have cared about health privacy for a really, really, really long time. It’s very fundamental to human nature. Even if it’s information that you would have shared with people, there’s still a loss, just an intrinsic loss, when you don’t even have control over who you share that information with.”

If this data is getting into the hands of dodgy international and unregulated data brokers, there’s no limit of places it can end up. Brokers collect a huge array of demographic, behavioral, and location data, use it to create detailed profiles of individuals, then sell access in a million different ways to a long line of additional third parties, including the U.S. government and foreign intelligence agencies.

There should be hard requirements about transparent, clear, and concise notifications of exactly what data is being collected and sold and to whom. There should be hard requirements that users have the ability to opt out (or, preferably in the cases of sensitive info, opt in). There should be hard punishment for companies and executives that play fast and loose with consumer data.

And we have none of that because our lawmakers decided, repeatedly, that making money was more important than market health, consumer welfare, and public safety. The result has been a parade of scandals that skirt ever closer to people being killed, at scale.

So again, the kind of people that whine about the singular privacy threat that is TikTok (like say FCC Commissioner Brendan Carr, or Senator Marsha Blackburn) — but have nothing to say about the much broader dysfunction created by rampant corruption — are advertising they either don’t know what they’re talking about, or aren’t addressing the full scope of the problem in good faith.

  • ✇MIT News - Nanoscience and nanotechnology | MIT.nano
  • Brain surgery training from an avatarBecky Ham | MIT.nano
    Benjamin Warf, a renowned neurosurgeon at Boston Children’s Hospital, stands in the MIT.nano Immersion Lab. More than 3,000 miles away, his virtual avatar stands next to Matheus Vasconcelos in Brazil as the resident practices delicate surgery on a doll-like model of a baby’s brain. With a pair of virtual-reality goggles, Vasconcelos is able to watch Warf’s avatar demonstrate a brain surgery procedure before replicating the technique himself and while asking questions of Warf’s digital twin. “I
     

Brain surgery training from an avatar

Benjamin Warf, a renowned neurosurgeon at Boston Children’s Hospital, stands in the MIT.nano Immersion Lab. More than 3,000 miles away, his virtual avatar stands next to Matheus Vasconcelos in Brazil as the resident practices delicate surgery on a doll-like model of a baby’s brain.

With a pair of virtual-reality goggles, Vasconcelos is able to watch Warf’s avatar demonstrate a brain surgery procedure before replicating the technique himself and while asking questions of Warf’s digital twin.

“It’s an almost out-of-body experience,” Warf says of watching his avatar interact with the residents. “Maybe it’s how it feels to have an identical twin?”

And that’s the goal: Warf’s digital twin bridged the distance, allowing him to be functionally in two places at once. “It was my first training using this model, and it had excellent performance,” says Vasconcelos, a neurosurgery resident at Santa Casa de São Paulo School of Medical Sciences in São Paulo, Brazil. “As a resident, I now feel more confident and comfortable applying the technique in a real patient under the guidance of a professor.”

Warf’s avatar arrived via a new project launched by medical simulator and augmented reality (AR) company EDUCSIM. The company is part of the 2023 cohort of START.nano, MIT.nano’s deep-tech accelerator that offers early-stage startups discounted access to MIT.nano’s laboratories.

In March 2023, Giselle Coelho, EDUCSIM’s scientific director and a pediatric neurosurgeon at Santa Casa de São Paulo and Sabará Children’s Hospital, began working with technical staff in the MIT.nano Immersion Lab to create Warf’s avatar. By November, the avatar was training future surgeons like Vasconcelos.

“I had this idea to create the avatar of Dr. Warf as a proof of concept, and asked, ‘What would be the place in the world where they are working on technologies like that?’” Coelho says. “Then I found MIT.nano.”

Capturing a Surgeon

As a neurosurgery resident, Coelho was so frustrated by the lack of practical training options for complex surgeries that she built her own model of a baby brain. The physical model contains all the structures of the brain and can even bleed, “simulating all the steps of a surgery, from incision to skin closure,” she says.

She soon found that simulators and virtual reality (VR) demonstrations reduced the learning curve for her own residents. Coelho launched EDUCSIM in 2017 to expand the variety and reach of the training for residents and experts looking to learn new techniques.

Those techniques include a procedure to treat infant hydrocephalus that was pioneered by Warf, the director of neonatal and congenital neurosurgery at Boston Children’s Hospital. Coelho had learned the technique directly from Warf and thought his avatar might be the way for surgeons who couldn’t travel to Boston to benefit from his expertise.

To create the avatar, Coelho worked with Talis Reks, the AR/VR/gaming/big data IT technologist in the Immersion Lab.

“A lot of technology and hardware can be very expensive for startups to access as they start their company journey,” Reks explains. “START.nano is one way of enabling them to utilize and afford the tools and technologies we have at MIT.nano’s Immersion Lab.”

Coelho and her colleagues needed high-fidelity and high-resolution motion-capture technology, volumetric video capture, and a range of other VR/AR technologies to capture Warf’s dexterous finger motions and facial expressions. Warf visited MIT.nano on several occasions to be digitally “captured,” including performing an operation on the physical baby model while wearing special gloves and clothing embedded with sensors.

“These technologies have mostly been used for entertainment or VFX [visual effects] or CGI [computer-generated imagery],” says Reks, “But this is a unique project, because we’re applying it now for real medical practice and real learning.”

One of the biggest challenges, Reks says, was helping to develop what Coelho calls “holoportation”— transmitting the 3D, volumetric video capture of Warf in real-time over the internet so that his avatar can appear in transcontinental medical training.

The Warf avatar has synchronous and asynchronous modes. The training that Vasconcelos received was in the asynchronous mode, where residents can observe the avatar’s demonstrations and ask it questions. The answers, delivered in a variety of languages, come from AI algorithms that draw from previous research and an extensive bank of questions and answers provided by Warf.

In the synchronous mode, Warf operates his avatar from a distance in real time, Coelho says. “He could walk around the room, he could talk to me, he could orient me. It’s amazing.”

Coelho, Warf, Reks, and other team members demonstrated a combination of the modes in a second session in late December. This demo consisted of volumetric live video capture between the Immersion Lab and Brazil, spatialized and visible in real-time through AR headsets. It significantly expanded upon the previous demo, which had only streamed volumetric data in one direction through a two-dimensional display.

Powerful impacts

Warf has a long history of training desperately needed pediatric neurosurgeons around the world, most recently through his nonprofit Neurokids. Remote and simulated training has been an increasingly large part of training since the pandemic, he says, although he doesn’t feel it will ever completely replace personal hands-on instruction and collaboration.

“But if in fact one day we could have avatars, like this one from Giselle, in remote places showing people how to do things and answering questions for them, without the cost of travel, without the time cost and so forth, I think it could be really powerful,” Warf says.

The avatar project is especially important for surgeons serving remote and underserved areas like the Amazon region of Brazil, Coelho says. “This is a way to give them the same level of education that they would get in other places, and the same opportunity to be in touch with Dr. Warf.”

One baby treated for hydrocephalus at a recent Amazon clinic had traveled by boat 30 hours for the surgery, according to Coelho.

Training surgeons with the avatar, she says, “can change reality for this baby and can change the future.”

© Photo courtesy of the MIT.nano Immersion Lab.

Benjamin Warf, the director of neonatal and congenital neurosurgery at Boston Children’s Hospital, uses a virtual reality environment to demonstrate a procedure that he pioneered to treat infant hydrocephalus. As Warf operates his avatar from a distance in real-time, medical residents in Brazil watch, interact, and learn in a 3D environment.
  • ✇MIT News - Nanoscience and nanotechnology | MIT.nano
  • MIT’s tiny technologies go to WashingtonKoch Institute
    On Nov. 7, a team from the Marble Center for Cancer Nanomedicine at MIT showed a Washington audience several examples of how nanotechnologies developed at the Institute can transform the detection and treatment of cancer and other diseases. The team was one of 40 innovative groups featured at “American Possibilities: A White House Demo Day.” Technology on view spanned energy, artificial intelligence, climate, and health, highlighting advancements that contribute to building a better future for
     

MIT’s tiny technologies go to Washington

On Nov. 7, a team from the Marble Center for Cancer Nanomedicine at MIT showed a Washington audience several examples of how nanotechnologies developed at the Institute can transform the detection and treatment of cancer and other diseases.

The team was one of 40 innovative groups featured at “American Possibilities: A White House Demo Day.” Technology on view spanned energy, artificial intelligence, climate, and health, highlighting advancements that contribute to building a better future for all Americans.

Participants included President Joe Biden, Biden-Harris administration leaders and White House staff, members of Congress, federal R&D funding agencies, scientists and engineers, academics, students, and science and technology industry innovators. The event holds special significance for MIT as eight years ago, MIT's Computer Science and Artificial Intelligence Laboratory participated in the last iteration of the White House Demo Day under President Barack Obama.

“It was truly inspirational hearing from experts from all across the government, the private sector, and academia touching on so many fields,” said President Biden of the event. “It was a reminder, at least for me, of what I’ve long believed — that America can be defined by a single word... possibilities.”

Launched in 2016, the Marble Center for Cancer Nanomedicine was established at the Koch Institute for Integrative Research at MIT to serve as a hub for miniaturized biomedical technologies, especially those that address grand challenges in cancer detection, treatment, and monitoring. The center convenes Koch Institute faculty members Sangeeta Bhatia, Paula Hammond, Robert Langer, Angela Belcher, Darrell Irvine, and Daniel Anderson to advance nanomedicine, as well as to facilitate collaboration with industry partners, including Alloy Therapeutics, Danaher Corp., Fujifilm, and Sanofi. 

Ana Jaklenec, a principal research scientist at the Koch Institute, highlighted several groundbreaking technologies in vaccines and disease diagnostics and treatment at the event. Jaklenec gave demonstrations from projects from her research group, including novel vaccine formulations capable of releasing a dozen booster doses pulsed over predetermined time points, microneedle vaccine technologies, and nutrient delivery technologies for precise control over microbiome modulation and nutrient absorption.

Jaklenec describes the event as “a wonderful opportunity to meet our government leaders and policymakers and see their passion for curing cancer. But it was especially moving to interact with people representing diverse communities across the United States and hear their excitement for how our technologies could positively impact their communities.”

Jeremy Li, a former MIT postdoc, presented a technology developed in the Belcher laboratory and commercialized by the spinout Cision Vision. The startup is developing a new approach to visualize lymph nodes in real time without any injection or radiation. The shoebox-sized device was also selected as part of Time Magazine’s Best Inventions of 2023 and is currently being used in a dozen hospitals across the United States.

“It was a proud moment for Cision Vision to be part of this event and discuss our recent progress in the field of medical imaging and cancer care,” says Li, who is a co-founder and the CEO of CisionVision. “It was a humbling experience for us to hear directly from patient advocates and cancer survivors at the event. We feel more inspired than ever to bring better solutions for cancer care to patients around the world.”

Other technologies shown at the event included new approaches such as a tortoise-shaped pill designed to enhance the efficacy of oral medicines, a miniature organ-on-a-chip liver device to predict drug toxicity and model liver disease, and a wireless bioelectronic device that provides oxygen for cell therapy applications and for the treatment of chronic disease.

“The feedback from the organizers and the audience at the event has been overwhelmingly positive,” says Tarek Fadel, who led the team’s participation at the event. “Navigating the demonstration space felt like stepping into the future. As a center, we stand poised to engineer transformative tools that will truly make a difference for the future of cancer care.”

Sangeeta Bhatia, the Director of the Marble Center and the John J. and Dorothy Wilson Professor of Health Sciences and Technology and Electrical Engineering and Computer Science, adds: “The showcase of our technologies at the White House Demo Day underscores the transformative impact we aim to achieve in cancer detection and treatment. The event highlights our vision to advance cutting-edge solutions for the benefit of patients and communities worldwide.”

Ana Jaklenec (right), principal research scientist at the Koch Institute for Integrative Cancer Research at MIT, and Jeremy Li, CEO and co-founder of Cision Vision, presented at “American Possibilities: A White House Demo Day.”
  • ✇MIT News - Nanoscience and nanotechnology | MIT.nano
  • Scientists 3D print self-heating microfluidic devicesAdam Zewe | MIT News
    MIT researchers have used 3D printing to produce self-heating microfluidic devices, demonstrating a technique which could someday be used to rapidly create cheap, yet accurate, tools to detect a host of diseases. Microfluidics, miniaturized machines that manipulate fluids and facilitate chemical reactions, can be used to detect disease in tiny samples of blood or fluids. At-home test kits for Covid-19, for example, incorporate a simple type of microfluidic. But many microfluidic applications r
     

Scientists 3D print self-heating microfluidic devices

MIT researchers have used 3D printing to produce self-heating microfluidic devices, demonstrating a technique which could someday be used to rapidly create cheap, yet accurate, tools to detect a host of diseases.

Microfluidics, miniaturized machines that manipulate fluids and facilitate chemical reactions, can be used to detect disease in tiny samples of blood or fluids. At-home test kits for Covid-19, for example, incorporate a simple type of microfluidic.

But many microfluidic applications require chemical reactions that must be performed at specific temperatures. These more complex microfluidic devices, which are typically manufactured in a clean room, are outfitted with heating elements made from gold or platinum using a complicated and expensive fabrication process that is difficult to scale up.

Instead, the MIT team used multimaterial 3D printing to create self-heating microfluidic devices with built-in heating elements, through a single, inexpensive manufacturing process. They generated devices that can heat fluid to a specific temperature as it flows through microscopic channels inside the tiny machine.

Their technique is customizable, so an engineer could create a microfluidic that heats fluid to a certain temperature or given heating profile within a specific area of the device. The low-cost fabrication process requires about $2 of materials to generate a ready-to-use microfluidic.

The process could be especially useful in creating self-heating microfluidics for remote regions of developing countries where clinicians may not have access to the expensive lab equipment required for many diagnostic procedures.

“Clean rooms in particular, where you would usually make these devices, are incredibly expensive to build and to run. But we can make very capable self-heating microfluidic devices using additive manufacturing, and they can be made a lot faster and cheaper than with these traditional methods. This is really a way to democratize this technology,” says Luis Fernando Velásquez-García, a principal scientist in MIT’s Microsystems Technology Laboratories (MTL) and senior author of a paper describing the fabrication technique.

He is joined on the paper by lead author Jorge Cañada Pérez-Sala, an electrical engineering and computer science graduate student. The research will be presented at the PowerMEMS Conference this month.

An insulator becomes conductive

This new fabrication process utilizes a technique called multimaterial extrusion 3D printing, in which several materials can be squirted through the printer’s many nozzles to build a device layer by layer. The process is monolithic, which means the entire device can be produced in one step on the 3D printer, without the need for any post-assembly.

To create self-heating microfluidics, the researchers used two materials — a biodegradable polymer known as polylactic acid (PLA) that is commonly used in 3D printing, and a modified version of PLA.

The modified PLA has copper nanoparticles mixed into the polymer, which converts this insulating material into an electrical conductor, Velásquez-García explains. When electrical current is fed into a resistor composed of this copper-doped PLA, energy is dissipated as heat.

“It is amazing when you think about it because the PLA material is a dielectric, but when you put in these nanoparticle impurities, it completely changes the physical properties. This is something we don’t fully understand yet, but it happens and it is repeatable,” he says.

Using a multimaterial 3D printer, the researchers fabricate a heating resistor from the copper-doped PLA and then print the microfluidic device, with microscopic channels through which fluid can flow, directly on top in one printing step. Because the components are made from the same base material, they have similar printing temperatures and are compatible.

Heat dissipated from the resistor will warm fluid flowing through the channels in the microfluidic.

In addition to the resistor and microfluidic, they use the printer to add a thin, continuous layer of PLA that is sandwiched between them. It is especially challenging to manufacture this layer because it must be thin enough so heat can transfer from the resistor to the microfluidic, but not so thin that fluid could leak into the resistor.
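
To get a feel for that trade-off, the short sketch below estimates conductive heat flux through the separating layer for a few thicknesses. It is purely illustrative: the thermal conductivity is a typical literature value for PLA, and the thicknesses and temperature drop are assumptions rather than figures from the paper.

    # Rough conduction estimate for the thin PLA layer between resistor and channel.
    # K_PLA is a typical literature value; the thicknesses and the temperature drop
    # are illustrative assumptions, not numbers reported by the researchers.
    K_PLA = 0.13  # W/(m*K), approximate thermal conductivity of PLA

    def heat_flux(delta_t_k: float, thickness_m: float) -> float:
        """Steady-state conductive heat flux (W/m^2) across the layer."""
        return K_PLA * delta_t_k / thickness_m

    for t_um in (100, 200, 500):
        q = heat_flux(delta_t_k=5.0, thickness_m=t_um * 1e-6)
        print(f"{t_um:3d} um layer -> {q / 1000:.1f} kW/m^2 at a 5 K drop")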

The resulting machine is about the size of a U.S. quarter and can be produced in a matter of minutes. Channels about 500 micrometers wide and 400 micrometers tall are threaded through the microfluidic to carry fluid and facilitate chemical reactions.

Importantly, the PLA material is translucent, so fluid in the device remains visible. Many processes rely on visualization or the use of light to infer what is happening during chemical reactions, Velásquez-García explains.

Customizable chemical reactors

The researchers used this one-step manufacturing process to generate a prototype that could heat fluid by 4 degrees Celsius as it flowed between the input and the output. This customizable technique could enable them to make devices which would heat fluids in certain patterns or along specific gradients.
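
As a back-of-envelope check on what such a device has to do, the sketch below relates resistor power to the temperature rise of the flowing fluid using a simple energy balance. The drive voltage, resistance, and flow rates are hypothetical values chosen only for illustration; none are reported by the researchers.

    # Energy balance for a self-heating channel: all resistor heat is assumed to
    # reach the fluid (no losses), and the fluid is treated as water.
    RHO = 1000.0  # density, kg/m^3
    CP = 4186.0   # specific heat, J/(kg*K)

    def joule_power(voltage_v: float, resistance_ohm: float) -> float:
        """Heat dissipated by the copper-doped PLA resistor, P = V^2 / R."""
        return voltage_v ** 2 / resistance_ohm

    def temperature_rise(power_w: float, flow_ul_per_min: float) -> float:
        """Steady-state fluid temperature rise (K) at a given volumetric flow rate."""
        mass_flow = RHO * flow_ul_per_min * 1e-9 / 60.0  # kg/s
        return power_w / (mass_flow * CP)

    p = joule_power(voltage_v=5.0, resistance_ohm=100.0)  # hypothetical drive conditions
    for flow in (50.0, 100.0, 500.0):  # uL/min, hypothetical
        print(f"{flow:5.0f} uL/min -> dT ~ {temperature_rise(p, flow):5.1f} K")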

“You can use these two materials to create chemical reactors that do exactly what you want. We can set up a particular heating profile while still having all the capabilities of the microfluidic,” he says.

However, one limitation comes from the fact that PLA can only be heated to about 50 degrees Celsius before it starts to degrade. Many chemical reactions, such as those used for polymerase chain reaction (PCR) tests, require temperatures of 90 degrees Celsius or higher. And to precisely control the temperature of the device, researchers would need to integrate a third material that enables temperature sensing.

In addition to tackling these limitations in future work, Velásquez-García wants to print magnets directly into the microfluidic device. These magnets could enable chemical reactions that require particles to be sorted or aligned.

At the same time, he and his colleagues are exploring the use of other materials that could reach higher temperatures. They are also studying PLA to better understand why it becomes conductive when certain impurities are added to the polymer.

“If we can understand the mechanism that is related to the electrical conductivity of PLA, that would greatly enhance the capability of these devices, but it is going to be a lot harder to solve than some other engineering problems,” he adds.

“In Japanese culture, it’s often said that beauty lies in simplicity. This sentiment is echoed by the work of Cañada and Velasquez-Garcia. Their proposed monolithically 3D-printed microfluidic systems embody simplicity and beauty, offering a wide array of potential derivations and applications that we foresee in the future,” says Norihisa Miki, a professor of mechanical engineering at Keio University in Tokyo, who was not involved with this work.

“Being able to directly print microfluidic chips with fluidic channels and electrical features at the same time opens up very exciting applications when processing biological samples, such as to amplify biomarkers or to actuate and mix liquids. Also, due to the fact that PLA degrades over time, one can even think of implantable applications where the chips dissolve and resorb over time,” adds Niclas Roxhed, an associate professor at Sweden’s KTH Royal Institute of Technology, who was not involved with this study.

This research was funded, in part, by the Empiriko Corporation and a fellowship from La Caixa Foundation.

© Image: Courtesy of the researchers

MIT researchers developed a fabrication process to produce self-heating microfluidic devices in one step using a multi-material 3D printer. Pictured is an example of one of the devices.
  • ✇Latest
  • Two Cheers for the Proposed End Kidney Deaths Act, by Ilya Somin

Two Cheers for the Proposed End Kidney Deaths Act

19 April 2024, 01:02

At the Vox website, Dylan Matthews offers a compelling defense of the proposed End Kidney Deaths Act. He makes good points, and I agree the act would be a major improvement over the status quo. But full legalization of organ markets would be better still. Here's an excerpt from Matthews' article:

What if I told you there was a way that the US could prevent 60,000 deaths, save American taxpayers $25 billion, and pay a deserving group of people $50,000 each? Would you be interested?…

I am not a spokesman. I am simply a fan and supporter of the End Kidney Deaths Act, a bill put together by a group of kidney policy experts and living donors that would represent the single biggest step forward for US policy on kidneys since … well, ever….

The plan is simple: Every nondirected donor (that is, any kidney donor who gives to a stranger rather than a family member) would be eligible under the law for a tax credit of $10,000 per year for the first five years after they donate. That $50,000 in total benefits is fully refundable, meaning even people who don't owe taxes get the full benefit.

Elaine Perlman, a kidney donor who leads the Coalition to Modify NOTA, which is advocating for the act, based the plan on a 2019 paper that estimated the current disincentives to giving a kidney (from travel expenses to lost income while recovering from surgery to pain and discomfort) amounted to about $38,000. That's almost $50,000 in current dollars, after the past few years' inflation.

The paper also found that removing disincentives by paying this amount to donors would increase the number of living donors by 11,500 a year. Because the law would presumably take a while to encourage more donations, Perlman downgrades that to about 60,000 over the first 10 years, with more donations toward the end as people become aware of the new incentives. But 60,000 is still nothing to sneeze at….

The End Kidney Deaths Act is trying to solve a fundamental problem: Not nearly enough people are donating their kidneys….

In 2021, some 135,972 Americans were diagnosed with end-stage renal disease, meaning they would need either dialysis or a transplant to survive. That year saw only 25,549 transplants. The remaining 110,000 people needed to rely on dialysis.

Dialysis is a miraculous technology, but compared to transplants, it's awful. Over 60 percent of patients who started traditional dialysis in 2017 were dead within five years. Of patients diagnosed with kidney failure in 2017 who subsequently got a transplant from a living donor, only 13 percent were dead five years later.

Life on dialysis is also dreadful to experience. It usually requires thrice-weekly four-hour sessions sitting by a machine, having your blood processed. You can't travel for any real length of time, since you have to be close to the machine. More critically, even part-time work is difficult because dialysis is physically extremely draining.

An estimated 40,000 Americans die every year for lack of kidneys available for transplant. If enacted, the End Kidney Deaths Act would save many of these people. In addition, as Matthews points out, the $50,000 per kidney tax credits would easily pay for themselves, because kidney dialysis is vastly more expensive, and Medicare ends up paying for most of that expense. If more people suffering from kidney failure could get a new kidney quickly, the government would save a lot of money on dialysis expenses, and those people would be able to be more productive (as well as avoiding great pain and discomfort).
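
To make that fiscal claim concrete, the arithmetic below uses only the figures quoted above: roughly 60,000 additional donations over ten years, a $50,000 refundable credit per donor, and Matthews' headline estimate of $25 billion in taxpayer savings. It is a rough sketch, and it assumes both figures cover the same ten-year window.

    # Back-of-envelope fiscal comparison using only figures cited in the post.
    donations = 60_000                 # additional donations over ten years (estimate)
    credit_per_donor = 10_000 * 5      # $10,000/year for five years = $50,000
    total_credits = donations * credit_per_donor
    claimed_savings = 25_000_000_000   # Matthews' $25 billion estimate

    print(f"Total credit outlay: ${total_credits / 1e9:.1f} billion")      # ~$3.0 billion
    print(f"Claimed taxpayer savings: ${claimed_savings / 1e9:.0f} billion")
    print(f"Savings per credit dollar: {claimed_savings / total_credits:.1f}x")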

Matthews also has a good response to claims that paying for kidneys would amount to problematic "commodification":

When you think of donor compensation as payment for work done, the injustice of the current system gets a lot clearer.

When I donated my kidney, many dozens of people got paid. My transplant surgeon got paid; my recipient's surgeon got paid. My anesthesiologist got paid; his anesthesiologist got paid. My nephrologist and nurses and support staff all got paid; so did his. My recipient didn't get paid, but hey — he got a kidney. The only person who was expected to perform their labor with no reward or compensation whatsoever was me, the donor.

This would outrage me less if the system weren't also leading to tens of thousands of people dying unnecessarily every year. But a system that refuses to pay people for their work, and in the process leads to needless mass death, is truly indefensible.

I agree, and have made similar points myself. And Matthews deserves great commendation for donating a kidney, thereby quite possibly saving a life! At the very least, he probably saved the recipient from having to endure additional years of painful kidney dialysis.

The major shortcoming of the End Kidney Deaths Act is the implicit price control it creates. By setting the payment at $50,000, it prevents higher payments where that would be necessary to ensure adequate supply. While the Act would save thousands of lives, the estimates Matthews cites (some 6,000 to 11,500 additional kidney donations per year) would still leave us many thousands of kidneys short, thereby still dooming many people to needless death, or at least additional years on kidney dialysis. This problem might be especially acute for patients whose genetics make it unusually difficult to find a matching donor. Conversely, if some potential donors are willing to sell for less than $50,000, there is no good reason to ban such transactions.

Full legalization of organ sales, with no price controls, would fix these problems. It's basic economics 101 that markets function best if prices are allowed to fluctuate in response to supply and demand. In a free market, insurance companies, medical care providers, and others have every incentive to pay what it takes, as the alternative of kidney dialysis is far more expensive. If necessary, the government could subsidize consumption by the poor, as it already does for kidney dialysis and many other health care expenses.

Matthews includes a passage lauding the End Kidney Deaths Act in part precisely because it falls short of authorizing a full-blown organ market:

The most common objection to compensating kidney donors is that it amounts to letting people "sell" their kidneys, a phrasing that even some proponents of compensation adopt. For opponents, this feels dystopian and disturbing, violating their sense that the human body is sacred and should not be sold for parts.

But "selling kidneys" in this case is just a metaphor, and a bad one at that. The End Kidney Deaths Act would not in any sense legalize the selling of organs. Rich people would not be able to outbid poor people to get organs first. There would be no kidney marketplace or kidney auctions of any kind.

What the proposal would do is pay kidney donors for their labor. It's a payment for a service — that of donation — not a purchase of an asset. It's a service that puts some strain on our bodies, but that's hardly unusual. We pay a premium to people in jobs like logging and roofing precisely because they risk bodily harm; this is no different.

This formulation is clever. And I myself have noted parallels between organ markets and paying people for doing jobs involving physical risk, such as the work performed by lumberjacks and professional football players (both of whom accept far greater risks than those faced by kidney donors). Nonetheless, if we compensate kidney donors, it is difficult to deny that such compensation is at least in part for giving up a kidney.

And there is nothing wrong with that! If you believe in the principle of "my body, my choice," the right to sell organs is one of the liberties that ideal entails. And there is no good reason to distinguish organ-selling from other potentially risky activities people are allowed to do for pay.  If anything, organ markets are more defensible than most of the others, because they could save many thousands of lives. By contrast, NFL players take greater risks to provide the rest of us with entertainment.

As for the fear that rich people will hoard or monopolize kidneys, that is highly improbable given that few people—rich or otherwise—are likely to have a need for more than one. In a nation of over 300 million people, full legalization would induce sufficient sales to fully cover the demand (roughly another 40,000 kidneys per year or so). If necessary, as noted above, government could subsidize the purchase of kidneys for poor people suffering kidney failure, as it does for other kinds of medical care for the poor.

A free market might be politically difficult to enact. But survey data suggests it may not be nearly as hard as is usually supposed.

In sum, the End Kidney Deaths Act would be a major improvement over the status quo. Matthews is absolutely right about that. But a more fully free market would be much better still.

In previous writings on organ sales, I have discussed the scope of the problem, and addressed standard arguments against organ market legalization, such as concerns that it would be too dangerous for organ donors, claims that it amounts to immoral "commodification" of the body, and fears that it would lead to exploitation of the poor (see also here).

The post Two Cheers for the Proposed End Kidney Deaths Act appeared first on Reason.com.

  • ✇MIT News - Nanoscience and nanotechnology | MIT.nano
  • Brain surgery training from an avatar, by Becky Ham | MIT.nano

Brain surgery training from an avatar

Benjamin Warf, a renowned neurosurgeon at Boston Children’s Hospital, stands in the MIT.nano Immersion Lab. More than 3,000 miles away, his virtual avatar stands next to Matheus Vasconcelos in Brazil as the resident practices delicate surgery on a doll-like model of a baby’s brain.

With a pair of virtual-reality goggles, Vasconcelos is able to watch Warf’s avatar demonstrate a brain surgery procedure, ask questions of Warf’s digital twin, and then replicate the technique himself.

“It’s an almost out-of-body experience,” Warf says of watching his avatar interact with the residents. “Maybe it’s how it feels to have an identical twin?”

And that’s the goal: Warf’s digital twin bridged the distance, allowing him to be functionally in two places at once. “It was my first training using this model, and it had excellent performance,” says Vasconcelos, a neurosurgery resident at Santa Casa de São Paulo School of Medical Sciences in São Paulo, Brazil. “As a resident, I now feel more confident and comfortable applying the technique in a real patient under the guidance of a professor.”

Warf’s avatar arrived via a new project launched by medical simulator and augmented reality (AR) company EDUCSIM. The company is part of the 2023 cohort of START.nano, MIT.nano’s deep-tech accelerator that offers early-stage startups discounted access to MIT.nano’s laboratories.

In March 2023, Giselle Coelho, EDUCSIM’s scientific director and a pediatric neurosurgeon at Santa Casa de São Paulo and Sabará Children’s Hospital, began working with technical staff in the MIT.nano Immersion Lab to create Warf’s avatar. By November, the avatar was training future surgeons like Vasconcelos.

“I had this idea to create the avatar of Dr. Warf as a proof of concept, and asked, ‘What would be the place in the world where they are working on technologies like that?’” Coelho says. “Then I found MIT.nano.”

Capturing a surgeon

As a neurosurgery resident, Coelho was so frustrated by the lack of practical training options for complex surgeries that she built her own model of a baby brain. The physical model contains all the structures of the brain and can even bleed, “simulating all the steps of a surgery, from incision to skin closure,” she says.

She soon found that simulators and virtual reality (VR) demonstrations reduced the learning curve for her own residents. Coelho launched EDUCSIM in 2017 to expand the variety and reach of the training for residents and experts looking to learn new techniques.

Those techniques include a procedure to treat infant hydrocephalus that was pioneered by Warf, the director of neonatal and congenital neurosurgery at Boston Children’s Hospital. Coelho had learned the technique directly from Warf and thought his avatar might be the way for surgeons who couldn’t travel to Boston to benefit from his expertise.

To create the avatar, Coelho worked with Talis Reks, the AR/VR/gaming/big data IT technologist in the Immersion Lab.

“A lot of technology and hardware can be very expensive for startups to access as they start their company journey,” Reks explains. “START.nano is one way of enabling them to utilize and afford the tools and technologies we have at MIT.nano’s Immersion Lab.”

Coelho and her colleagues needed high-fidelity and high-resolution motion-capture technology, volumetric video capture, and a range of other VR/AR technologies to capture Warf’s dexterous finger motions and facial expressions. Warf visited MIT.nano on several occasions to be digitally “captured,” including performing an operation on the physical baby model while wearing special gloves and clothing embedded with sensors.

“These technologies have mostly been used for entertainment or VFX [visual effects] or CGI [computer-generated imagery],” says Reks, “But this is a unique project, because we’re applying it now for real medical practice and real learning.”

One of the biggest challenges, Reks says, was helping to develop what Coelho calls “holoportation” — transmitting the 3D, volumetric video capture of Warf in real time over the internet so that his avatar can appear in transcontinental medical training.

The Warf avatar has synchronous and asynchronous modes. The training that Vasconcelos received was in the asynchronous mode, where residents can observe the avatar’s demonstrations and ask it questions. The answers, delivered in a variety of languages, come from AI algorithms that draw from previous research and an extensive bank of questions and answers provided by Warf.
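
A toy sketch of how such an asynchronous question-and-answer mode might work is shown below: the trainee's question is matched against a curated bank of question-and-answer pairs supplied by the expert. The bank contents, matching method, and fallback behavior here are illustrative assumptions; the production system's AI and translation pipeline is not described in detail in this article.

    # Toy retrieval over a curated Q&A bank (illustrative only; not EDUCSIM's system).
    def _tokens(text: str) -> set[str]:
        return set(text.lower().replace("?", " ").replace(".", " ").split())

    def answer(question: str, qa_bank: list[tuple[str, str]]) -> str:
        """Return the banked answer whose stored question best matches the query."""
        q = _tokens(question)
        best_answer, best_score = "No close match found; question logged for review.", 0.0
        for stored_q, stored_a in qa_bank:
            s = _tokens(stored_q)
            score = len(q & s) / max(len(q | s), 1)  # Jaccard similarity on words
            if score > best_score:
                best_answer, best_score = stored_a, score
        return best_answer

    bank = [
        ("How do I position the endoscope?", "See the banked demonstration of endoscope positioning."),
        ("When is this procedure preferred over a shunt?", "See the banked discussion of patient selection."),
    ]
    print(answer("Where should the endoscope be positioned?", bank))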

In the synchronous mode, Warf operates his avatar from a distance in real time, Coelho says. “He could walk around the room, he could talk to me, he could orient me. It’s amazing.”

Coelho, Warf, Reks, and other team members demonstrated a combination of the modes in a second session in late December. This demo consisted of volumetric live video capture between the Immersion Lab and Brazil, spatialized and visible in real-time through AR headsets. It significantly expanded upon the previous demo, which had only streamed volumetric data in one direction through a two-dimensional display.

Powerful impacts

Warf has a long history of training desperately needed pediatric neurosurgeons around the world, most recently through his nonprofit Neurokids. Remote and simulated training has played an increasingly large role since the pandemic, he says, although he doesn’t feel it will ever completely replace personal hands-on instruction and collaboration.

“But if in fact one day we could have avatars, like this one from Giselle, in remote places showing people how to do things and answering questions for them, without the cost of travel, without the time cost and so forth, I think it could be really powerful,” Warf says.

The avatar project is especially important for surgeons serving remote and underserved areas like the Amazon region of Brazil, Coelho says. “This is a way to give them the same level of education that they would get in other places, and the same opportunity to be in touch with Dr. Warf.”

One baby treated for hydrocephalus at a recent Amazon clinic had traveled by boat 30 hours for the surgery, according to Coelho.

Training surgeons with the avatar, she says, “can change reality for this baby and can change the future.”

© Photo courtesy of the MIT.nano Immersion Lab.

Benjamin Warf, the director of neonatal and congenital neurosurgery at Boston Children’s Hospital, uses a virtual reality environment to demonstrate a procedure that he pioneered to treat infant hydrocephalus. As Warf operates his avatar from a distance in real-time, medical residents in Brazil watch, interact, and learn in a 3D environment.
  • ✇MIT News - Nanoscience and nanotechnology | MIT.nano
  • Scientists 3D print self-heating microfluidic devicesAdam Zewe | MIT News
    MIT researchers have used 3D printing to produce self-heating microfluidic devices, demonstrating a technique which could someday be used to rapidly create cheap, yet accurate, tools to detect a host of diseases. Microfluidics, miniaturized machines that manipulate fluids and facilitate chemical reactions, can be used to detect disease in tiny samples of blood or fluids. At-home test kits for Covid-19, for example, incorporate a simple type of microfluidic. But many microfluidic applications r
     

Scientists 3D print self-heating microfluidic devices

MIT researchers have used 3D printing to produce self-heating microfluidic devices, demonstrating a technique which could someday be used to rapidly create cheap, yet accurate, tools to detect a host of diseases.

Microfluidics, miniaturized machines that manipulate fluids and facilitate chemical reactions, can be used to detect disease in tiny samples of blood or fluids. At-home test kits for Covid-19, for example, incorporate a simple type of microfluidic.

But many microfluidic applications require chemical reactions that must be performed at specific temperatures. These more complex microfluidic devices, which are typically manufactured in a clean room, are outfitted with heating elements made from gold or platinum using a complicated and expensive fabrication process that is difficult to scale up.

Instead, the MIT team used multimaterial 3D printing to create self-heating microfluidic devices with built-in heating elements, through a single, inexpensive manufacturing process. They generated devices that can heat fluid to a specific temperature as it flows through microscopic channels inside the tiny machine.

Their technique is customizable, so an engineer could create a microfluidic that heats fluid to a certain temperature or given heating profile within a specific area of the device. The low-cost fabrication process requires about $2 of materials to generate a ready-to-use microfluidic.

The process could be especially useful in creating self-heating microfluidics for remote regions of developing countries where clinicians may not have access to the expensive lab equipment required for many diagnostic procedures.

“Clean rooms in particular, where you would usually make these devices, are incredibly expensive to build and to run. But we can make very capable self-heating microfluidic devices using additive manufacturing, and they can be made a lot faster and cheaper than with these traditional methods. This is really a way to democratize this technology,” says Luis Fernando Velásquez-García, a principal scientist in MIT’s Microsystems Technology Laboratories (MTL) and senior author of a paper describing the fabrication technique.

He is joined on the paper by lead author Jorge Cañada Pérez-Sala, an electrical engineering and computer science graduate student. The research will be presented at the PowerMEMS Conference this month.

An insulator becomes conductive

This new fabrication process utilizes a technique called multimaterial extrusion 3D printing, in which several materials can be squirted through the printer’s many nozzles to build a device layer by layer. The process is monolithic, which means the entire device can be produced in one step on the 3D printer, without the need for any post-assembly.

To create self-heating microfluidics, the researchers used two materials — a biodegradable polymer known as polylactic acid (PLA) that is commonly used in 3D printing, and a modified version of PLA.

The modified PLA has mixed copper nanoparticles into the polymer, which converts this insulating material into an electrical conductor, Velásquez-García explains. When electrical current is fed into a resistor composed of this copper-doped PLA, energy is dissipated as heat.

“It is amazing when you think about it because the PLA material is a dielectric, but when you put in these nanoparticle impurities, it completely changes the physical properties. This is something we don’t fully understand yet, but it happens and it is repeatable,” he says.

Using a multimaterial 3D printer, the researchers fabricate a heating resistor from the copper-doped PLA and then print the microfluidic device, with microscopic channels through which fluid can flow, directly on top in one printing step. Because the components are made from the same base material, they have similar printing temperatures and are compatible.

Heat dissipated from the resistor will warm fluid flowing through the channels in the microfluidic.

In addition to the resistor and microfluidic, they use the printer to add a thin, continuous layer of PLA that is sandwiched between them. It is especially challenging to manufacture this layer because it must be thin enough so heat can transfer from the resistor to the microfluidic, but not so thin that fluid could leak into the resistor.

The resulting machine is about the size of a U.S. quarter and can be produced in a matter of minutes. Channels about 500 micrometers wide and 400 micrometers tall are threaded through the microfluidic to carry fluid and facilitate chemical reactions.

Importantly, the PLA material is translucent, so fluid in the device remains visible. Many processes rely on visualization or the use of light to infer what is happening during chemical reactions, Velásquez-García explains.

Customizable chemical reactors

The researchers used this one-step manufacturing process to generate a prototype that could heat fluid by 4 degrees Celsius as it flowed between the input and the output. This customizable technique could enable them to make devices which would heat fluids in certain patterns or along specific gradients.

“You can use these two materials to create chemical reactors that do exactly what you want. We can set up a particular heating profile while still having all the capabilities of the microfluidic,” he says.

However, one limitation comes from the fact that PLA can only be heated to about 50 degrees Celsius before it starts to degrade. Many chemical reactions, such as those used for polymerase chain reaction (PCR) tests, require temperatures of 90 degrees or higher. And to precisely control the temperature of the device, researchers would need to integrate a third material that enables temperature sensing.

In addition to tackling these limitations in future work, Velásquez-García wants to print magnets directly into the microfluidic device. These magnets could enable chemical reactions that require particles to be sorted or aligned.

At the same time, he and his colleagues are exploring the use of other materials that could reach higher temperatures. They are also studying PLA to better understand why it becomes conductive when certain impurities are added to the polymer.

“If we can understand the mechanism that is related to the electrical conductivity of PLA, that would greatly enhance the capability of these devices, but it is going to be a lot harder to solve than some other engineering problems,” he adds.

“In Japanese culture, it’s often said that beauty lies in simplicity. This sentiment is echoed by the work of Cañada and Velasquez-Garcia. Their proposed monolithically 3D-printed microfluidic systems embody simplicity and beauty, offering a wide array of potential derivations and applications that we foresee in the future,” says Norihisa Miki, a professor of mechanical engineering at Keio University in Tokyo, who was not involved with this work.

“Being able to directly print microfluidic chips with fluidic channels and electrical features at the same time opens up very exiting applications when processing biological samples, such as to amplify biomarkers or to actuate and mix liquids. Also, due to the fact that PLA degrades over time, one can even think of implantable applications where the chips dissolve and resorb over time,” adds Niclas Roxhed, an associate professor at Sweden’s KTH Royal Institute of Technology, who was not involved with this study.

This research was funded, in part, by the Empiriko Corporation and a fellowship from La Caixa Foundation.

© Image: Courtesy of the researchers

MIT researchers developed a fabrication process to produce self-heating microfluidic devices in one step using a multi-material 3D printer. Pictured is an example of one of the devices.
  • ✇MIT News - Nanoscience and nanotechnology | MIT.nano
  • Brain surgery training from an avatarBecky Ham | MIT.nano
    Benjamin Warf, a renowned neurosurgeon at Boston Children’s Hospital, stands in the MIT.nano Immersion Lab. More than 3,000 miles away, his virtual avatar stands next to Matheus Vasconcelos in Brazil as the resident practices delicate surgery on a doll-like model of a baby’s brain. With a pair of virtual-reality goggles, Vasconcelos is able to watch Warf’s avatar demonstrate a brain surgery procedure before replicating the technique himself and while asking questions of Warf’s digital twin. “I
     

Brain surgery training from an avatar

Benjamin Warf, a renowned neurosurgeon at Boston Children’s Hospital, stands in the MIT.nano Immersion Lab. More than 3,000 miles away, his virtual avatar stands next to Matheus Vasconcelos in Brazil as the resident practices delicate surgery on a doll-like model of a baby’s brain.

With a pair of virtual-reality goggles, Vasconcelos is able to watch Warf’s avatar demonstrate a brain surgery procedure before replicating the technique himself and while asking questions of Warf’s digital twin.

“It’s an almost out-of-body experience,” Warf says of watching his avatar interact with the residents. “Maybe it’s how it feels to have an identical twin?”

And that’s the goal: Warf’s digital twin bridged the distance, allowing him to be functionally in two places at once. “It was my first training using this model, and it had excellent performance,” says Vasconcelos, a neurosurgery resident at Santa Casa de São Paulo School of Medical Sciences in São Paulo, Brazil. “As a resident, I now feel more confident and comfortable applying the technique in a real patient under the guidance of a professor.”

Warf’s avatar arrived via a new project launched by medical simulator and augmented reality (AR) company EDUCSIM. The company is part of the 2023 cohort of START.nano, MIT.nano’s deep-tech accelerator that offers early-stage startups discounted access to MIT.nano’s laboratories.

In March 2023, Giselle Coelho, EDUCSIM’s scientific director and a pediatric neurosurgeon at Santa Casa de São Paulo and Sabará Children’s Hospital, began working with technical staff in the MIT.nano Immersion Lab to create Warf’s avatar. By November, the avatar was training future surgeons like Vasconcelos.

“I had this idea to create the avatar of Dr. Warf as a proof of concept, and asked, ‘What would be the place in the world where they are working on technologies like that?’” Coelho says. “Then I found MIT.nano.”

Capturing a Surgeon

As a neurosurgery resident, Coelho was so frustrated by the lack of practical training options for complex surgeries that she built her own model of a baby brain. The physical model contains all the structures of the brain and can even bleed, “simulating all the steps of a surgery, from incision to skin closure,” she says.

She soon found that simulators and virtual reality (VR) demonstrations reduced the learning curve for her own residents. Coelho launched EDUCSIM in 2017 to expand the variety and reach of the training for residents and experts looking to learn new techniques.

Those techniques include a procedure to treat infant hydrocephalus that was pioneered by Warf, the director of neonatal and congenital neurosurgery at Boston Children’s Hospital. Coelho had learned the technique directly from Warf and thought his avatar might be the way for surgeons who couldn’t travel to Boston to benefit from his expertise.

To create the avatar, Coelho worked with Talis Reks, the AR/VR/gaming/big data IT technologist in the Immersion Lab.

“A lot of technology and hardware can be very expensive for startups to access as they start their company journey,” Reks explains. “START.nano is one way of enabling them to utilize and afford the tools and technologies we have at MIT.nano’s Immersion Lab.”

Coelho and her colleagues needed high-fidelity and high-resolution motion-capture technology, volumetric video capture, and a range of other VR/AR technologies to capture Warf’s dexterous finger motions and facial expressions. Warf visited MIT.nano on several occasions to be digitally “captured,” including performing an operation on the physical baby model while wearing special gloves and clothing embedded with sensors.

“These technologies have mostly been used for entertainment or VFX [visual effects] or CGI [computer-generated imagery],” says Reks, “But this is a unique project, because we’re applying it now for real medical practice and real learning.”

One of the biggest challenges, Reks says, was helping to develop what Coelho calls “holoportation”— transmitting the 3D, volumetric video capture of Warf in real-time over the internet so that his avatar can appear in transcontinental medical training.

The Warf avatar has synchronous and asynchronous modes. The training that Vasconcelos received was in the asynchronous mode, where residents can observe the avatar’s demonstrations and ask it questions. The answers, delivered in a variety of languages, come from AI algorithms that draw from previous research and an extensive bank of questions and answers provided by Warf.

In the synchronous mode, Warf operates his avatar from a distance in real time, Coelho says. “He could walk around the room, he could talk to me, he could orient me. It’s amazing.”

Coelho, Warf, Reks, and other team members demonstrated a combination of the modes in a second session in late December. This demo consisted of volumetric live video capture between the Immersion Lab and Brazil, spatialized and visible in real-time through AR headsets. It significantly expanded upon the previous demo, which had only streamed volumetric data in one direction through a two-dimensional display.

Powerful impacts

Warf has a long history of training desperately needed pediatric neurosurgeons around the world, most recently through his nonprofit Neurokids. Remote and simulated training has been an increasingly large part of training since the pandemic, he says, although he doesn’t feel it will ever completely replace personal hands-on instruction and collaboration.

“But if in fact one day we could have avatars, like this one from Giselle, in remote places showing people how to do things and answering questions for them, without the cost of travel, without the time cost and so forth, I think it could be really powerful,” Warf says.

The avatar project is especially important for surgeons serving remote and underserved areas like the Amazon region of Brazil, Coelho says. “This is a way to give them the same level of education that they would get in other places, and the same opportunity to be in touch with Dr. Warf.”

One baby treated for hydrocephalus at a recent Amazon clinic had traveled by boat 30 hours for the surgery, according to Coelho.

Training surgeons with the avatar, she says, “can change reality for this baby and can change the future.”

© Photo courtesy of the MIT.nano Immersion Lab.

Benjamin Warf, the director of neonatal and congenital neurosurgery at Boston Children’s Hospital, uses a virtual reality environment to demonstrate a procedure that he pioneered to treat infant hydrocephalus. As Warf operates his avatar from a distance in real-time, medical residents in Brazil watch, interact, and learn in a 3D environment.
  • ✇MIT News - Nanoscience and nanotechnology | MIT.nano
  • MIT’s tiny technologies go to WashingtonKoch Institute
    On Nov. 7, a team from the Marble Center for Cancer Nanomedicine at MIT showed a Washington audience several examples of how nanotechnologies developed at the Institute can transform the detection and treatment of cancer and other diseases. The team was one of 40 innovative groups featured at “American Possibilities: A White House Demo Day.” Technology on view spanned energy, artificial intelligence, climate, and health, highlighting advancements that contribute to building a better future for
     

MIT’s tiny technologies go to Washington

On Nov. 7, a team from the Marble Center for Cancer Nanomedicine at MIT showed a Washington audience several examples of how nanotechnologies developed at the Institute can transform the detection and treatment of cancer and other diseases.

The team was one of 40 innovative groups featured at “American Possibilities: A White House Demo Day.” Technology on view spanned energy, artificial intelligence, climate, and health, highlighting advancements that contribute to building a better future for all Americans.

Participants included President Joe Biden, Biden-Harris administration leaders and White House staff, members of Congress, federal R&D funding agencies, scientists and engineers, academics, students, and science and technology industry innovators. The event holds special significance for MIT as eight years ago, MIT's Computer Science and Artificial Intelligence Laboratory participated in the last iteration of the White House Demo Day under President Barack Obama.

“It was truly inspirational hearing from experts from all across the government, the private sector, and academia touching on so many fields,” said President Biden of the event. “It was a reminder, at least for me, of what I’ve long believed — that America can be defined by a single word... possibilities.”

Launched in 2016, the Marble Center for Cancer Nanomedicine was established at the Koch Institute for Integrative Research at MIT to serve as a hub for miniaturized biomedical technologies, especially those that address grand challenges in cancer detection, treatment, and monitoring. The center convenes Koch Institute faculty members Sangeeta Bhatia, Paula Hammond, Robert Langer, Angela Belcher, Darrell Irvine, and Daniel Anderson to advance nanomedicine, as well as to facilitate collaboration with industry partners, including Alloy Therapeutics, Danaher Corp., Fujifilm, and Sanofi. 

Ana Jaklenec, a principal research scientist at the Koch Institute, highlighted several groundbreaking technologies in vaccines and disease diagnostics and treatment at the event. Jaklenec gave demonstrations from projects from her research group, including novel vaccine formulations capable of releasing a dozen booster doses pulsed over predetermined time points, microneedle vaccine technologies, and nutrient delivery technologies for precise control over microbiome modulation and nutrient absorption.

Jaklenec describes the event as “a wonderful opportunity to meet our government leaders and policymakers and see their passion for curing cancer. But it was especially moving to interact with people representing diverse communities across the United States and hear their excitement for how our technologies could positively impact their communities.”

Jeremy Li, a former MIT postdoc, presented a technology developed in the Belcher laboratory and commercialized by the spinout Cision Vision. The startup is developing a new approach to visualize lymph nodes in real time without any injection or radiation. The shoebox-sized device was also selected as part of Time Magazine’s Best Inventions of 2023 and is currently being used in a dozen hospitals across the United States.

“It was a proud moment for Cision Vision to be part of this event and discuss our recent progress in the field of medical imaging and cancer care,” says Li, who is a co-founder and the CEO of CisionVision. “It was a humbling experience for us to hear directly from patient advocates and cancer survivors at the event. We feel more inspired than ever to bring better solutions for cancer care to patients around the world.”

Other technologies shown at the event included new approaches such as a tortoise-shaped pill designed to enhance the efficacy of oral medicines, a miniature organ-on-a-chip liver device to predict drug toxicity and model liver disease, and a wireless bioelectronic device that provides oxygen for cell therapy applications and for the treatment of chronic disease.

“The feedback from the organizers and the audience at the event has been overwhelmingly positive,” says Tarek Fadel, who led the team’s participation at the event. “Navigating the demonstration space felt like stepping into the future. As a center, we stand poised to engineer transformative tools that will truly make a difference for the future of cancer care.”

Sangeeta Bhatia, the Director of the Marble Center and the John J. and Dorothy Wilson Professor of Health Sciences and Technology and Electrical Engineering and Computer Science, adds: “The showcase of our technologies at the White House Demo Day underscores the transformative impact we aim to achieve in cancer detection and treatment. The event highlights our vision to advance cutting-edge solutions for the benefit of patients and communities worldwide.”

Ana Jaklenec (right), principal research scientist at the Koch Institute for Integrative Cancer Research at MIT, and Jeremy Li, CEO and co-founder of Cision Vision, presented at “American Possibilities: A White House Demo Day.”
  • ✇MIT News - Nanoscience and nanotechnology | MIT.nano
  • Scientists 3D print self-heating microfluidic devicesAdam Zewe | MIT News
    MIT researchers have used 3D printing to produce self-heating microfluidic devices, demonstrating a technique which could someday be used to rapidly create cheap, yet accurate, tools to detect a host of diseases. Microfluidics, miniaturized machines that manipulate fluids and facilitate chemical reactions, can be used to detect disease in tiny samples of blood or fluids. At-home test kits for Covid-19, for example, incorporate a simple type of microfluidic. But many microfluidic applications r
     

Scientists 3D print self-heating microfluidic devices

MIT researchers have used 3D printing to produce self-heating microfluidic devices, demonstrating a technique which could someday be used to rapidly create cheap, yet accurate, tools to detect a host of diseases.

Microfluidics, miniaturized machines that manipulate fluids and facilitate chemical reactions, can be used to detect disease in tiny samples of blood or fluids. At-home test kits for Covid-19, for example, incorporate a simple type of microfluidic.

But many microfluidic applications require chemical reactions that must be performed at specific temperatures. These more complex microfluidic devices, which are typically manufactured in a clean room, are outfitted with heating elements made from gold or platinum using a complicated and expensive fabrication process that is difficult to scale up.

Instead, the MIT team used multimaterial 3D printing to create self-heating microfluidic devices with built-in heating elements, through a single, inexpensive manufacturing process. They generated devices that can heat fluid to a specific temperature as it flows through microscopic channels inside the tiny machine.

Their technique is customizable, so an engineer could create a microfluidic that heats fluid to a certain temperature or given heating profile within a specific area of the device. The low-cost fabrication process requires about $2 of materials to generate a ready-to-use microfluidic.

The process could be especially useful in creating self-heating microfluidics for remote regions of developing countries where clinicians may not have access to the expensive lab equipment required for many diagnostic procedures.

“Clean rooms in particular, where you would usually make these devices, are incredibly expensive to build and to run. But we can make very capable self-heating microfluidic devices using additive manufacturing, and they can be made a lot faster and cheaper than with these traditional methods. This is really a way to democratize this technology,” says Luis Fernando Velásquez-García, a principal scientist in MIT’s Microsystems Technology Laboratories (MTL) and senior author of a paper describing the fabrication technique.

He is joined on the paper by lead author Jorge Cañada Pérez-Sala, an electrical engineering and computer science graduate student. The research will be presented at the PowerMEMS Conference this month.

An insulator becomes conductive

This new fabrication process utilizes a technique called multimaterial extrusion 3D printing, in which several materials can be squirted through the printer’s many nozzles to build a device layer by layer. The process is monolithic, which means the entire device can be produced in one step on the 3D printer, without the need for any post-assembly.

To create self-heating microfluidics, the researchers used two materials — a biodegradable polymer known as polylactic acid (PLA) that is commonly used in 3D printing, and a modified version of PLA.

The modified PLA has mixed copper nanoparticles into the polymer, which converts this insulating material into an electrical conductor, Velásquez-García explains. When electrical current is fed into a resistor composed of this copper-doped PLA, energy is dissipated as heat.

“It is amazing when you think about it because the PLA material is a dielectric, but when you put in these nanoparticle impurities, it completely changes the physical properties. This is something we don’t fully understand yet, but it happens and it is repeatable,” he says.

Using a multimaterial 3D printer, the researchers fabricate a heating resistor from the copper-doped PLA and then print the microfluidic device, with microscopic channels through which fluid can flow, directly on top in one printing step. Because the components are made from the same base material, they have similar printing temperatures and are compatible.

Heat dissipated from the resistor will warm fluid flowing through the channels in the microfluidic.

In addition to the resistor and microfluidic, they use the printer to add a thin, continuous layer of PLA that is sandwiched between them. It is especially challenging to manufacture this layer because it must be thin enough so heat can transfer from the resistor to the microfluidic, but not so thin that fluid could leak into the resistor.

The resulting machine is about the size of a U.S. quarter and can be produced in a matter of minutes. Channels about 500 micrometers wide and 400 micrometers tall are threaded through the microfluidic to carry fluid and facilitate chemical reactions.

Importantly, the PLA material is translucent, so fluid in the device remains visible. Many processes rely on visualization or the use of light to infer what is happening during chemical reactions, Velásquez-García explains.

Customizable chemical reactors

The researchers used this one-step manufacturing process to generate a prototype that could heat fluid by 4 degrees Celsius as it flowed between the input and the output. This customizable technique could enable them to make devices which would heat fluids in certain patterns or along specific gradients.

“You can use these two materials to create chemical reactors that do exactly what you want. We can set up a particular heating profile while still having all the capabilities of the microfluidic,” he says.

However, one limitation comes from the fact that PLA can only be heated to about 50 degrees Celsius before it starts to degrade. Many chemical reactions, such as those used for polymerase chain reaction (PCR) tests, require temperatures of 90 degrees or higher. And to precisely control the temperature of the device, researchers would need to integrate a third material that enables temperature sensing.

In addition to tackling these limitations in future work, Velásquez-García wants to print magnets directly into the microfluidic device. These magnets could enable chemical reactions that require particles to be sorted or aligned.

At the same time, he and his colleagues are exploring the use of other materials that could reach higher temperatures. They are also studying PLA to better understand why it becomes conductive when certain impurities are added to the polymer.

“If we can understand the mechanism that is related to the electrical conductivity of PLA, that would greatly enhance the capability of these devices, but it is going to be a lot harder to solve than some other engineering problems,” he adds.

“In Japanese culture, it’s often said that beauty lies in simplicity. This sentiment is echoed by the work of Cañada and Velásquez-García. Their proposed monolithically 3D-printed microfluidic systems embody simplicity and beauty, offering a wide array of potential derivations and applications that we foresee in the future,” says Norihisa Miki, a professor of mechanical engineering at Keio University in Tokyo, who was not involved with this work.

“Being able to directly print microfluidic chips with fluidic channels and electrical features at the same time opens up very exciting applications when processing biological samples, such as to amplify biomarkers or to actuate and mix liquids. Also, due to the fact that PLA degrades over time, one can even think of implantable applications where the chips dissolve and resorb over time,” adds Niclas Roxhed, an associate professor at Sweden’s KTH Royal Institute of Technology, who was not involved with this study.

This research was funded, in part, by the Empiriko Corporation and a fellowship from La Caixa Foundation.

© Image: Courtesy of the researchers

MIT researchers developed a fabrication process to produce self-heating microfluidic devices in one step using a multimaterial 3D printer. Pictured is an example of one of the devices.
MIT’s tiny technologies go to Washington

On Nov. 7, a team from the Marble Center for Cancer Nanomedicine at MIT showed a Washington audience several examples of how nanotechnologies developed at the Institute can transform the detection and treatment of cancer and other diseases.

The team was one of 40 innovative groups featured at “American Possibilities: A White House Demo Day.” Technology on view spanned energy, artificial intelligence, climate, and health, highlighting advancements that contribute to building a better future for all Americans.

Participants included President Joe Biden, Biden-Harris administration leaders and White House staff, members of Congress, federal R&D funding agencies, scientists and engineers, academics, students, and science and technology industry innovators. The event holds special significance for MIT: eight years ago, MIT's Computer Science and Artificial Intelligence Laboratory participated in the previous iteration of the White House Demo Day, held under President Barack Obama.

“It was truly inspirational hearing from experts from all across the government, the private sector, and academia touching on so many fields,” said President Biden of the event. “It was a reminder, at least for me, of what I’ve long believed — that America can be defined by a single word... possibilities.”

Launched in 2016, the Marble Center for Cancer Nanomedicine was established at the Koch Institute for Integrative Cancer Research at MIT to serve as a hub for miniaturized biomedical technologies, especially those that address grand challenges in cancer detection, treatment, and monitoring. The center convenes Koch Institute faculty members Sangeeta Bhatia, Paula Hammond, Robert Langer, Angela Belcher, Darrell Irvine, and Daniel Anderson to advance nanomedicine and to facilitate collaboration with industry partners, including Alloy Therapeutics, Danaher Corp., Fujifilm, and Sanofi.

Ana Jaklenec, a principal research scientist at the Koch Institute, highlighted several groundbreaking technologies in vaccines and disease diagnostics and treatment at the event. Jaklenec gave demonstrations of projects from her research group, including novel vaccine formulations capable of releasing a dozen booster doses pulsed over predetermined time points, microneedle vaccine technologies, and nutrient delivery technologies for precise control over microbiome modulation and nutrient absorption.

Jaklenec describes the event as “a wonderful opportunity to meet our government leaders and policymakers and see their passion for curing cancer. But it was especially moving to interact with people representing diverse communities across the United States and hear their excitement for how our technologies could positively impact their communities.”

Jeremy Li, a former MIT postdoc, presented a technology developed in the Belcher laboratory and commercialized by the spinout Cision Vision. The startup is developing a new approach to visualize lymph nodes in real time without any injection or radiation. The shoebox-sized device was also selected as part of Time Magazine’s Best Inventions of 2023 and is currently being used in a dozen hospitals across the United States.

“It was a proud moment for Cision Vision to be part of this event and discuss our recent progress in the field of medical imaging and cancer care,” says Li, who is a co-founder and the CEO of Cision Vision. “It was a humbling experience for us to hear directly from patient advocates and cancer survivors at the event. We feel more inspired than ever to bring better solutions for cancer care to patients around the world.”

Other technologies shown at the event included a tortoise-shaped pill designed to enhance the efficacy of oral medicines, a miniature organ-on-a-chip liver device to predict drug toxicity and model liver disease, and a wireless bioelectronic device that provides oxygen for cell therapy applications and for the treatment of chronic disease.

“The feedback from the organizers and the audience at the event has been overwhelmingly positive,” says Tarek Fadel, who led the team’s participation at the event. “Navigating the demonstration space felt like stepping into the future. As a center, we stand poised to engineer transformative tools that will truly make a difference for the future of cancer care.”

Sangeeta Bhatia, the Director of the Marble Center and the John J. and Dorothy Wilson Professor of Health Sciences and Technology and Electrical Engineering and Computer Science, adds: “The showcase of our technologies at the White House Demo Day underscores the transformative impact we aim to achieve in cancer detection and treatment. The event highlights our vision to advance cutting-edge solutions for the benefit of patients and communities worldwide.”

Ana Jaklenec (right), principal research scientist at the Koch Institute for Integrative Cancer Research at MIT, and Jeremy Li, CEO and co-founder of Cision Vision, presented at “American Possibilities: A White House Demo Day.”