FreshRSS


Photonic Chip Cuts Cost of Hunting Exoplanets

12 August 2024 at 15:01 · IEEE Spectrum · Rachel Berkowitz


At 6.5 meters in diameter, the James Webb Space Telescope’s primary mirror captures more light than any telescope that’s ever been launched from Earth. But not every astronomer has US $10 billion to spend on a space telescope. So to help bring the cost of space-based astronomy down, researchers at the National Research Council of Canada in Ottawa are working on a way to process starlight on a tiny optical chip. Ross Cheriton, a photonics researcher there, and his students built and tested a CubeSat prototype with a new kind of photonic chip. The goal is to lower the barrier to entry for astronomical science using swarms of lower-cost spacecraft.

“We hope to enable smaller space telescopes to do big science using highly compact instrument-on-chips,” says Cheriton, who is also affiliated with the Quantum and Nanotechnology Research Centre in Ottawa.

Photonic integrated circuits (PICs) use light instead of electricity to process information, and they’re in wide use slinging trillions and trillions of bits around data centers. But only recently have astronomers begun to examine how to use them to push the boundaries of what can be learned about the universe.

Ground-based telescopes are plagued by Earth’s atmosphere, where turbulence blurs incoming light, making it difficult to focus it onto a camera chip. In outer space, telescopes can peer at extremely faint objects in non-visible wavelengths without correcting for the impact of turbulence. That’s where Cheriton aims to boldly go with a PIC filter that detects very subtle gas signatures during an exoplanet “eclipse” called a transit.

The main motivation for putting photonic chips in space is to reduce the size, weight, and cost of components, because they can be produced en masse in a semiconductor foundry. “The dream is a purely fiber and chip-based instrument with no other optics,” says Cheriton. Replacing filters, lenses, and mirrors with a chip also improves stability and scalability compared to ordinary optical parts.

CubeSats—inexpensive, small, and standardized satellites—have proved to be a cost-effective way of deploying small instrument payloads. “The compact nature of PICs is a perfect match for CubeSats to study bright exoplanet systems James Webb doesn’t have time to stare at,” says Cheriton.

For a total mission cost of less than $1 million—compared to the Webb’s $10 billion—an eventual CubeSat mission could stare at a star for days to weeks while it waits for a planet to cross the field of view. Then, it would look for slight changes in the star’s spectrum that are associated with how the planet’s atmosphere absorbs light—telltale evidence of gases of a biological origin.
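
To get a feel for how small that spectral signal is: the dimming during a transit scales as the square of the planet-to-star radius ratio, and the atmosphere contributes only a thin extra ring around the planet's silhouette. Here is a minimal back-of-the-envelope sketch in Python, using Earth and Sun values purely for illustration (these are not figures from the proposed mission):

    # Back-of-the-envelope size of a transit signal, using Earth and Sun
    # values purely for illustration (not the mission's target parameters).
    R_STAR_KM = 695_700.0    # Sun's radius
    R_PLANET_KM = 6_371.0    # Earth's radius
    SCALE_HEIGHT_KM = 8.5    # Earth's atmospheric scale height

    # Fractional dimming while the planet blocks part of the stellar disk.
    transit_depth = (R_PLANET_KM / R_STAR_KM) ** 2

    # Extra absorption from an atmosphere a few scale heights thick,
    # treated as a thin ring around the planet's silhouette.
    atmosphere_signal = 2 * R_PLANET_KM * 5 * SCALE_HEIGHT_KM / R_STAR_KM**2

    print(f"transit depth:      {transit_depth:.1e}")     # ~8.4e-05
    print(f"atmospheric signal: {atmosphere_signal:.1e}") # ~1.1e-06

The takeaway is that the atmospheric signature is parts per million of the star's light, which is why long, uninterrupted stares at bright stars matter more than a huge mirror for this kind of measurement.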

Smaller spectroscopy

As a proof-of-concept, Cheriton guided a team of undergraduate students who spent eight months designing and integrating a PIC into a custom 3U CubeSat (10 cm × 10 cm × 30 cm) platform. Their silicon nitride photonic circuit sensor proved itself capable of detecting the absorption signatures of CO2 in incoming light.

In their design, light entering the CubeSat’s collimating lens gets focused into a fiber and then pushed to the photonic chip. It enters an etched set of waveguides that includes a ring resonator. Here, light having a specific set of wavelengths builds in intensity over multiple trips around the ring, and is then output to a detector. Because only a select few wavelengths constructively interfere—those chosen to match a gas’s absorption spectrum—the ring serves as a comb-like filter. After the light goes through the ring resonator, the signal from the waveguide gets passed to an output fiber and onto a camera connected to a Raspberry Pi computer for processing. A single pixel’s intensity therefore serves as a reading for a gas’s presence.
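
For intuition on why the ring behaves as a comb filter: light resonates whenever a whole number of wavelengths fits around the ring, that is, when m × λ = n_eff × L, where L is the ring's circumference, n_eff is the effective index of the waveguide mode, and m is an integer. A minimal sketch in Python, with made-up but plausible values rather than the actual chip's parameters:

    import numpy as np

    # Resonance condition for a ring resonator: m * wavelength = n_eff * L,
    # where L is the ring's circumference and m an integer mode order.
    # The values below are illustrative, not the NRC chip's real parameters.
    n_eff = 1.8                    # effective index of the waveguide mode
    radius_um = 50.0               # ring radius, micrometers
    L_um = 2 * np.pi * radius_um   # circumference

    # Resonant wavelengths (in nm) whose mode orders land near 760 nm,
    # the oxygen absorption band mentioned later in the article.
    orders = np.arange(700, 800)
    resonances_nm = 1e3 * n_eff * L_um / orders
    print(resonances_nm[(resonances_nm > 755) & (resonances_nm < 765)])

    # Comb spacing (free spectral range) near 760 nm, assuming the group
    # index is close to n_eff: FSR ~ lambda^2 / (n_eff * L), about 1 nm here.
    fsr_nm = 760.0**2 / (n_eff * L_um * 1e3)
    print(f"FSR ~ {fsr_nm:.2f} nm")

Choosing the ring's size and index sets the comb spacing, which is how the designer lines the filter's passbands up with a particular gas's absorption lines.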

Light travels through a waveguide on a photonic integrated circuit. Credit: Teseract

Because it’s built on a chip, the sensor could be multiplexed to observe several objects or to sense different gases simultaneously. Additionally, because all the light falls on a single pixel, the signal is more sensitive than that of a traditional spectrometer, says Cheriton. Moreover, instead of hunting for peaks in a full spectrum, the technology looks for how well the absorption spectrum matches that of a specific gas, a more efficient process. “If something is in space, you don’t want to send gigabytes of data home if you don’t have to,” he says.

Space travel is still a long way off for the astrophotonic CubeSat. The current design does not use space-qualified components. But Cheriton’s students tested it in the lab for red light (635 nm) and CO2 in a gas cell. They used a “ground station” computer to transmit all commands and receive all results—and to monitor the photovoltaics and collect data from the flight control sensors onboard their CubeSat.

Next, the team plans to test whether their sensor can detect oxygen with the silicon nitride chip, a material chosen for its transparency at the gas’s 760-nm absorption wavelength. Success would leave them well positioned to meet what Cheriton calls the next huge milestone for astronomers: looking for an Earth-like planet with oxygen.

The work was presented at the Optica (formerly Optical Society of America) Advanced Photonics conference in July.

Boeing eats another $125 million loss over Starliner woes

Boeing has revealed that it has taken another $125 million in losses as a result of its Starliner spacecraft's delayed return from the ISS. As SpaceNews reports, the company disclosed the losses in a filing with the US Securities and Exchange Commission, along with more details about its earnings for the second quarter of the year. The company had already posted $288 million in losses "primarily as a result of delaying" the Crew Flight Test mission in 2023.

The first crewed Starliner flight took off in June with NASA astronauts Butch Wilmore and Sunita Williams on board. Boeing's spacecraft was only supposed to stay docked to the ISS for eight days before ferrying the astronauts back to Earth, but issues with its hardware prevented the mission from sticking to its original timeline. 

The company had to determine what caused Starliner's maneuvering thrusters to degrade as the spacecraft approached the ISS. In addition, the helium leak that had caused several delays to the spacecraft's launch seemed to have worsened. Since June, the company has been putting the spacecraft through a series of tests. Just a few days ago, on July 27, it completed a hot fire test of Starliner's reaction control system jets and confirmed that the vehicle's helium leak rates remained within acceptable margins. The tests, which are part of the preparations for the spacecraft's flight back home, were conducted with Williams and Wilmore onboard.

NASA said the test results are still being reviewed. But once Boeing and the agency are confident that Starliner is ready, they will set a date for the spacecraft and the astronauts' return flight.

This article originally appeared on Engadget at https://www.engadget.com/boeing-eats-another-125-million-loss-over-starliner-woes-130027376.html?src=rss

A spacecraft with a view of Earth in the background. Credit: Boeing

Starliner astronauts’ return trip has been pushed back even further

Astronauts Butch Wilmore and Suni Williams, who flew on the much-delayed first crewed flight of Boeing’s Starliner craft, won’t be coming home from the International Space Station until sometime next month, well past their originally planned return date of June 14. NASA announced last night that it's pushing the date of their return trip back even further in order to allow for more reviews of problems that arose with Starliner during its flight, and to avoid conflicts with upcoming spacewalks. As of now, there’s no date set for the flight back to Earth.

Starliner launched on June 5 and delivered Wilmore and Williams to the ISS about a day later. Their stay was only supposed to last a week or so. During the flight, however, four small helium leaks sprang up in the propulsion system, on top of the one that had already been identified prior to launch. And, when Starliner first attempted to approach the ISS on June 6 and begin docking, five of its 28 thrusters went offline. Boeing was able to get four of them back up and running. NASA also revealed a few days after launch that the teams were looking into an issue with a valve in the service module that was “not properly closed.”

The space agency had already pushed the date of the return trip back a few times over the last week and most recently landed on June 26, but now says the flight won’t take place until after the spacewalks planned for June 24 and July 2 have been completed. “We are letting the data drive our decision making relative to managing the small helium system leaks and thruster performance we observed during rendezvous and docking,” said Steve Stich, manager of NASA’s Commercial Crew Program, on Friday.

Leaders from @NASA and @BoeingSpace are adjusting the June 26 return to Earth of the Crew Flight Test mission with @NASA_Astronauts Butch Wilmore and Suni Williams from @Space_Station.

This adjustment deconflicts from a series of spacewalks while allowing mission teams time to… pic.twitter.com/pjqz1zEu4g

— NASA Commercial Crew (@Commercial_Crew) June 22, 2024

“Starliner is performing well in orbit while docked to the space station,” Stich also said. “We are strategically using the extra time to clear a path for some critical station activities while completing readiness for Butch and Suni’s return on Starliner and gaining valuable insight into the system upgrades we will want to make for post-certification missions.”

This article originally appeared on Engadget at https://www.engadget.com/starliner-astronauts-return-trip-has-been-pushed-back-even-further-174336571.html?src=rss

Boeing and ISS astronauts pictured on the ISS. Front, from left: Suni Williams, Oleg Kononenko, and Butch Wilmore. Second row, from left: Alexander Grebenkin, Tracy C. Dyson, and Mike Barratt. Back row: Nikolai Chub, Jeanette Epps, and Matthew Dominick. Credit: NASA

The Webb Telescope’s dazzling nebula image supports a long-held theory

The image of the Serpens Nebula you see above, taken by NASA’s James Webb Space Telescope (JWST), not only looks mesmerizing but also captures a never-before-seen phenomenon. The aligned, elongated “protostellar outflows” visible in the top left support a longstanding theory. As suspected, the jets shoot out in alignment from the swirling disks of surrounding material, showing evidence that clusters of forming stars spin in the same direction.

NASA says the bright and clumpy streaks in the image’s upper-left area, which somewhat resemble JJ Abrams-style lens flare, represent shockwaves caused by outward-shooting jets that emerge when the interstellar gas cloud collapses inwards. As forming stars condense and twirl more rapidly, some material shoots out perpendicular to the disk.

“Astronomers have long assumed that as clouds collapse to form stars, the stars will tend to spin in the same direction,” Klaus Pontoppidan of NASA Jet Propulsion Laboratory wrote in a blog post. “However, this has not been seen so directly before. These aligned, elongated structures are a historical record of the fundamental way that stars are born.”

The aligned jets (seen as thin beams of light, a bit like JJ Abrams-style lens flare), beaming out from a reddish cluster of forming stars, indicate the stars spin in the same direction.

The Serpens Nebula is only one or two million years old and sits around 1,300 light years from Earth. NASA says the dense cluster of protostars at the image’s center includes stars less than 100,000 years old. Serpens is a reflection nebula, meaning the gas and dust cloud shines by reflecting light from stars inside or nearby.

The JWST’s Near-Infrared Camera (NIRCam) captured the image, which covers about 16 trillion miles by 11 trillion miles. The black rectangles you see at the full image’s lower left and upper left represent missing data. NASA says its next step is to use the telescope’s Near-Infrared Spectrograph (NIRSpec) to study the Serpens Nebula’s chemical breakdown.

You can check out NASA’s instructional video below for a closer look at specific details from the glorious image.

This article originally appeared on Engadget at https://www.engadget.com/the-webb-telescopes-dazzling-nebula-image-supports-a-long-held-theory-210229206.html?src=rss

Forming stars in the Serpens Nebula. Credit: NASA

NASA’s James Webb Space Telescope has found the most distant galaxy ever observed

The hits keep on coming with NASA’s James Webb Space Telescope. According to the space agency, the JWST just found the most distant known galaxy ever observed. The catchily named JADES-GS-z14-0 galaxy is said to have formed just 290 million years after the big bang, but it features some unique properties that are at odds with that notion.

The galaxy is incredibly large, at 1,600 light years across. It’s also very bright and features an unusual amount of starlight, given how soon it formed after the big bang. This has led researchers Stefano Carniani and Kevin Hainline to ask “how can nature make such a bright, massive, and large galaxy in less than 300 million years?” In cosmic time, that’s barely a blip.
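
For readers who want to check the "290 million years" figure, the galaxy's reported redshift (about z = 14.3, a value from the JADES team's announcements rather than this article) converts to a cosmic age with a standard cosmology library. A minimal sketch using astropy's Planck 2018 parameters:

    # Convert the galaxy's reported redshift into the universe's age at the
    # time the light was emitted. z ~ 14.32 is the value reported for
    # JADES-GS-z14-0; Planck18 is astropy's Planck 2018 cosmology.
    import astropy.units as u
    from astropy.cosmology import Planck18

    z = 14.32
    age_at_emission = Planck18.age(z).to(u.Myr)     # ~290 Myr after the big bang
    lookback = Planck18.lookback_time(z).to(u.Gyr)  # how long the light traveled

    print(f"age of universe at emission: {age_at_emission:.0f}")
    print(f"light travel (lookback) time: {lookback:.2f}")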

The wavelengths of light emitted from JADES-GS-z14-0, as spotted by the JWST’s MIRI (Mid-Infrared Instrument), indicate the presence of strong ionized gas emissions, likely from an abundance of hydrogen and oxygen. This is also weird, as oxygen is not typically present early in the life of a galaxy. This suggests that “multiple generations of very massive stars had already lived their lives before we observed the galaxy.”

A chart of the light wavelengths emitted from JADES-GS-z14-0. Credit: NASA

As always with distant space objects, we are actually looking at the past, due to the finite speed of light, which means the galaxy spawned those multiple generations of massive stars in under 290 million years. Stars “only” take around ten million years to form, but can take up to 20 billion years to die. However, ultra-massive stars typically have much shorter lifespans. So this finding doesn’t exactly rewrite our understanding of the cosmos, but it does call into question the nature of star formation in the early life of the universe.

“All of these observations, together, tell us that JADES-GS-z14-0 is not like the types of galaxies that have been predicted by theoretical models and computer simulations to exist in the very early universe,” the researchers told NASA. “It is likely that astronomers will find many such luminous galaxies, possibly at even earlier times, over the next decade with Webb.”

The Webb telescope has made a habit out of redefining our understanding of the cosmos. It has shown us stars being born in the Virgo constellation, found water for the first time orbiting a comet and discovered carbon dioxide on a distant exoplanet, which was a first. All of this has been done in under two years of operation, so who knows what the future will bring.

This article originally appeared on Engadget at https://www.engadget.com/nasas-james-webb-space-telescope-has-found-the-most-distant-galaxy-ever-observed-185833121.html?src=rss

An image of the most distant known galaxy. Credit: NASA

Blue Origin successfully sends tourists to the edge of space again after a long hiatus

Blue Origin is back in the space tourism game. Jeff Bezos’ spaceflight company successfully flew six paying customers to the edge of space and back this morning, ending its nearly two-year hiatus from crewed missions. This was Blue Origin’s seventh trip with humans on board. The mission — a quick jaunt to cross the Kármán line, or the boundary of space, about 62 miles above Earth — lifted off from the company’s Launch Site One in West Texas shortly after 10:30AM ET.

The six people inside the New Shepard crew capsule included 90-year-old Ed Dwight, a former Air Force captain who was the first Black astronaut candidate when he was picked for the training program in 1961. He went through training but ultimately wasn’t selected for NASA’s Astronaut Corps, and he hadn’t made it to space until now. Also on board were Mason Angel, Sylvain Chiron, Kenneth L. Hess, Carol Schaller and Gopi Thotakura. They were briefly able to unbuckle their seatbelts and experience zero gravity.

Blue Origin's crew capsule is seen descending to Earth with two parachutes deployed.

The crew safely landed back on the ground about 10 minutes after launch. One of the capsule's three parachutes didn't properly deploy on the return trip, but this didn't pose any problems for its touchdown thanks to the redundancies in the system that account for exactly that type of situation. 

This was also the 25th mission for a New Shepard rocket. It last flew a crew in August 2022, but suffered a structural failure in its engine nozzle the following month during the launch of a payload mission and didn't fly again at all until December 2023. It returned to flight then with another payload mission, making today's launch its first with human passengers in almost two years. 

This article originally appeared on Engadget at https://www.engadget.com/blue-origin-successfully-sends-tourists-to-the-edge-of-space-again-after-a-long-hiatus-144745261.html?src=rss

A New Shepard rocket lifts off from the launch pad.

How the perils of space have affected asteroid Ryugu

19 May 2024 at 13:55 · Ars Technica · Elizabeth Rayne
The surface of Ryugu. Credit: JAXA, University of Tokyo, Kochi University, Rikkyo University, Nagoya University, Chiba Institute of Technology, Meiji University, Aizu University, AIST

An asteroid that has been wandering through space for billions of years will have been bombarded by everything from rocks to radiation. All those years of travel through interplanetary space increase the odds of colliding with something in the vast emptiness, and at least one of those impacts had enough force to leave the asteroid Ryugu forever changed.

When JAXA’s Hayabusa2 spacecraft touched down on Ryugu, it collected samples from the surface that revealed that particles of magnetite (which is usually magnetic) in the asteroid’s regolith are devoid of magnetism. A team of researchers from Hokkaido University and several other institutions in Japan is now offering an explanation for how this material lost most of its magnetic properties. Their analysis showed that the cause was at least one high-velocity micrometeoroid collision that broke down the magnetite’s chemical structure so that it was no longer magnetic.

“We surmised that pseudo-magnetite was created [as] the result of space weathering by micrometeoroid impact,” the researchers, led by Hokkaido University professor Yuki Kimura, said in a study recently published in Nature Communications.


'Extreme' geomagnetic storm may bless us with more aurora displays tonight and tomorrow

The strongest geomagnetic storm in 20 years made the colorful northern lights, or aurora borealis, visible Friday night across the US, even in areas that are normally too far south to see them. And the show may not be over. Tonight may offer another chance to catch the aurora if you have clear skies, according to the NOAA, and Sunday could bring yet more displays reaching as far as Alabama.

The extreme geomagnetic storm continues and will persist through at least Sunday... pic.twitter.com/GMDKikl7mA

— NOAA Space Weather Prediction Center (@NWSSWPC) May 11, 2024

The NOAA’s Space Weather Prediction Center said on Saturday that the sun has continued to produce powerful solar flares. That’s on top of previously observed coronal mass ejections (CMEs), or explosions of magnetized plasma, that won’t reach Earth until tomorrow. The agency has been monitoring a particularly active sunspot cluster since Wednesday, and confirmed yesterday that it had observed G5 conditions — the level designated “extreme” — which haven’t been seen since October 2003. In a press release on Friday, Clinton Wallace, Director, NOAA’s Space Weather Prediction Center, said the current storm is “an unusual and potentially historic event.”

The Sun emitted two strong solar flares on May 10-11, 2024, peaking at 9:23 p.m. EDT on May 10, and 7:44 a.m. EDT on May 11. NASA’s Solar Dynamics Observatory captured images of the events, which were classified as X5.8 and X1.5-class flares. https://t.co/nLfnG1OvvE pic.twitter.com/LjmI0rk2Wm

— NASA Sun & Space (@NASASun) May 11, 2024

Geomagnetic storms happen when outbursts from the sun interact with Earth’s magnetosphere. While it all has kind of a scary ring to it, people on the ground don’t really have anything to worry about. As NASA explained on X, “Harmful radiation from a flare cannot pass through Earth’s atmosphere” to physically affect us. These storms can mess with our technology, though, and have been known to disrupt communications, GPS, satellite operations and even the power grid.

This article originally appeared on Engadget at https://www.engadget.com/extreme-geomagnetic-storm-may-bless-us-with-more-aurora-displays-tonight-and-tomorrow-192033210.html?src=rss

The aurora borealis, also known as the northern lights, caused by a coronal mass ejection on the Sun, illuminates the skies over the southwestern Siberian town of Tara, Omsk region, Russia, May 11, 2024. Credit: Reuters/Alexey Malgavko

Monster galactic outflow powered by exploding stars

12 May 2024 at 12:00 · Ars Technica · Elizabeth Rayne
All galaxies have large amounts of gas that influence their star-formation rates. Credit: NASA, ESA, CSA, and J. Lee (NOIRLab)

Galaxies pass gas—in the case of galaxy NGC 4383, so much so that its gas outflow is 20,000 light-years across and more massive than 50 million Suns.

Yet even an outflow of this immensity was difficult to detect until now. Observing what these outflows are made of and how they are structured demands high-resolution instruments that can only see gas from galaxies that are relatively close, so information on them has been limited. Which is unfortunate, since gaseous outflows ejected from galaxies can tell us more about their star formation cycles.

The MAUVE (MUSE and ALMA Unveiling the Virgo Environment) program is now changing things. MAUVE’s mission is to understand how the outflows of galaxies in the Virgo cluster affect star formation. NGC 4383 stood out to astronomer Adam Watts, of the University of Western Australia and the International Centre for Radio Astronomy Research (ICRAR), and his team because its outflow is so enormous.



A Brief History of the World’s First Planetarium

1 May 2024 at 17:00 · IEEE Spectrum · Allison Marsh


In 1912, Oskar von Miller, an electrical engineer and founder of the Deutsches Museum, had an idea: Could you project an artificial starry sky onto a dome, as a way of demonstrating astronomical principles to the public?

It was such a novel concept that when von Miller approached the Carl Zeiss company in Jena, Germany, to manufacture such a projector, they initially rebuffed him. Eventually, they agreed, and under the guidance of lead engineer Walther Bauersfeld, Zeiss created something amazing.

The use of models to show the movements of the planets and stars goes back centuries, starting with mechanical orreries that used clockwork mechanisms to depict our solar system. A modern upgrade was Clair Omar Musser’s desktop electric orrery, which he designed for the Seattle World’s Fair in 1962.

The projector that Zeiss planned for the Deutsches Museum would be far more elaborate. For starters, there would be two planetariums. One would showcase the Copernican, or heliocentric, sky, displaying the stars and planets as they revolved around the sun. The other would show the Ptolemaic, or geocentric, sky, with the viewer fully immersed in the view, as if standing on the surface of the Earth, seemingly at the center of the universe.

The task of realizing those ideas fell to Bauersfeld, a mechanical engineer by training and a managing director at Zeiss.

Zeiss engineer Walther Bauersfeld worked out the electromechanical details of the planetarium. In this May 1920 entry from his lab notebook, he sketched the two-axis system for showing the daily and annual motions of the stars. Credit: ZEISS Archive

At first, Bauersfeld focused on projecting just the sun, moon, and planets of our solar system. At the suggestion of his boss, Rudolf Straubel, he added stars. World War I interrupted the work, but by 1920 Bauersfeld was back at it. One entry in May 1920 in Bauersfeld’s meticulous lab notebook showed the earliest depiction of the two-axis design that allowed for the display of the daily as well as the annual motions of the stars. (The notebook is preserved in the Zeiss Archive.)

The planetarium projector was in fact a concatenation of many smaller projectors and a host of gears. According to the Zeiss Archive, a large sphere held all of the projectors for the fixed stars as well as a “planet cage” that held projectors for the sun, the moon, and the planets Mercury, Venus, Mars, Jupiter, and Saturn. The fixed-star sphere was positioned so that it projected outward from the exact center of the dome. The planetarium also had projectors for the Milky Way and the names of major constellations.

The projectors within the planet cage were organized in tiers with complex gearing that allowed a motorized drive to move them around one axis to simulate the annual rotations of these celestial objects against the backdrop of the stars. The entire projector could also rotate around a second axis, simulating the Earth’s polar axis, to show the rising and setting of the sun, moon, and planets over the horizon.

The Zeiss planetarium projected onto a spherical surface, which consisted of a geodesic steel lattice overlaid with concrete. Credit: ZEISS Archive

Bauersfeld also contributed to the design of the surrounding projection dome, which achieved its exactly spherical surface by way of a geodesic network of steel rods covered by a thin layer of concrete.

Planetariums catch on worldwide

The first demonstration of what became known as the Zeiss Model I projector took place on 21 October 1923 before the Deutsches Museum committee in their not-yet-completed building, in Munich. “This planetarium is a marvel,” von Miller declared in an administrative report.

In 1924, public demonstrations of the Zeiss planetarium took place on the roof of the company’s factory in Jena, Germany. Credit: ZEISS Archive

The projector then returned north to Jena for further adjustments and testing. The company also began offering demonstrations of the projector in a makeshift dome on the roof of its factory. From July to September 1924, more than 30,000 visitors experienced the Zeisshimmel (Zeiss sky) this way. These demonstrations became informal visitor-experience studies and allowed Zeiss and the museum to make refinements and improvements.

On 7 May 1925, the world’s first projection planetarium officially opened to the public at the Deutsches Museum. The Zeiss Model I displayed 4,500 stars, the band of the Milky Way, the sun, moon, Mercury, Venus, Mars, Jupiter, and Saturn. Gears and motors moved the projector to replicate the changes in the sky as Earth rotated on its axis and revolved around the sun. Visitors viewed this simulation of the night sky from the latitude of Munich and in the comfort of a climate-controlled building, although at first chairs were not provided. (I get a crick in the neck just thinking about it.) The projector was bolted to the floor, but later versions were mounted on rails to move them back and forth. A presenter operated the machine and lectured on astronomical topics, pointing out constellations and the orbits of the planets.

Word of the Zeiss planetarium spread quickly, through postcards and images. Credit: ZEISS Archive

The planetarium’s influence quickly extended far beyond Germany, as museums and schools around the world incorporated the technology into immersive experiences for science education and public outreach. Each new planetarium was greeted with curiosity and excitement. Postcards and images of planetariums (both the distinctive domed buildings and the complicated machines) circulated widely.

In 1926, Zeiss opened its own planetarium in Jena based on Bauersfeld’s specifications. The first city outside of Germany to acquire a Zeiss planetarium was Vienna. It opened in a temporary structure on 7 May 1927 and in a permanent structure four years later, only to be destroyed during World War II.

The Zeiss planetarium in Rome, which opened in 1928, projected the stars onto the domed vault of the 3rd-century Aula Ottagona, part of the ancient Baths of Diocletian.

The first planetarium in the western hemisphere opened in Chicago in May 1930. Philanthropist Max Adler, a former executive at Sears, contributed funds to the building that now bears his name. He called it a “classroom under the heavens.”

Japan’s first planetarium, a Zeiss Model II, opened in Osaka in 1937 at the Osaka City Electricity Science Museum. As its name suggests, the museum showcased exhibits on electricity, funded by the municipal power company. The city council had to be convinced of the educational value of the planetarium. But the mayor and other enthusiasts supported it. The planetarium operated for 50 years.

Who doesn’t love a planetarium?

After World War II and the division of Germany, the Zeiss company also split in two, with operations continuing at Oberkochen in the west and Jena in the east. Both branches continued to develop the planetarium through the Zeiss Model VI before shifting the nomenclature to more exotic names, such as the Spacemaster, Skymaster, and Cosmorama.

The two large spheres of the Zeiss Model II, introduced in 1926, displayed the skies of the northern and southern hemispheres, respectively. Each sphere contained a number of smaller projectors. Credit: ZEISS Archive

Over the years, refinements included increased precision, the addition of more stars, automatic controls that allowed the programming of complete shows, and a shift to fiber optics and LED lighting. Zeiss still produces planetariums in a variety of configurations for different size domes.

Today more than 4,000 planetariums are in operation globally. A planetarium is often the first place where children connect what they see in the night sky to a broader science and an understanding of the universe. My hometown of Richmond, Va., opened its first planetarium in April 1983 at the Science Museum of Virginia. That was a bit late in the big scheme of things, but just in time to wow me as a kid. I still remember the first show I saw, narrated by an animatronic Mark Twain with a focus on the 1986 visit of Halley’s Comet.

By then the museum also had a giant OmniMax screen that let me soar over the Grand Canyon, watch beavers transform the landscape, and swim with whale sharks, all from the comfort of my reclining seat. No wonder the museum is where I got my start as a public historian of science and technology. I began volunteering there at age 14 and have never looked back.

Part of a continuing series looking at historical artifacts that embrace the boundless potential of technology.

An abridged version of this article appears in the May 2024 print issue as “A Planetarium Is Born.”

References


In 2023, the Deutsches Museum celebrated the centennial of its planetarium, with the exhibition 100 Years of the Planetarium, which included artifacts such as astrolabes and armillary spheres as well as a star show in a specially built planetarium dome.

I am always appreciative of corporations that recognize their own history and maintain robust archives. Zeiss has a wonderful collection of historic photos online with detailed descriptions.

I also consulted The Zeiss Works and the Carl Zeiss Foundation in Jena by Felix Auerbach, although I read an English translation that was in the Robert B. Ariail Collection of Historical Astronomy, part of the University of South Carolina’s special collections.



Electronically Assisted Astronomy on the Cheap

28 April 2024 at 17:00 · IEEE Spectrum · David Schneider


I hate the eye strain that often comes with peering through a telescope at the night sky—I’d rather let a camera capture the scene. But I’m too frugal to sink thousands of dollars into high-quality astrophotography gear. The Goldilocks solution for me is something that goes by the name of electronically assisted astronomy, or EAA.

EAA occupies a middle ground in amateur astronomy: more involved than gazing through binoculars or a telescope, but not as complicated as using specialized cameras, expensive telescopes, and motorized tracking mounts. I set about exploring how far I could get doing EAA on a limited budget.

Electronically-assisted-astronomy photographs captured with my rig: the moon, the sun, and the Orion Nebula. Credit: David Schneider

First, I purchased a used Canon T6 DSLR on eBay. Because it had a damaged LCD viewscreen and came without a lens, it cost just US $100. Next, rather than trying to marry this camera to a telescope, I decided to get a telephoto lens: Back to eBay for a 40-year-old Nikon 500-mm F/8 “mirror” telephoto lens for $125. This lens combines mirrors and lenses to create a folded optical path. So even though the focal length of this telephoto is a whopping 50 centimeters, the lens itself is only about 15 cm long. A $20 adapter makes it work with the Canon.

The Nikon lens lacks a diaphragm to adjust its aperture and hence its depth of field. Its optical geometry makes things that are out of focus resemble doughnuts. And it can’t be autofocused. But these shortcomings aren’t drawbacks for astrophotography. And the lens has the big advantage that it can be focused beyond infinity. This allows you to adjust the focus on distant objects accurately, even if the lens expands and contracts with changing temperatures.

Getting the focus right is one of the bugaboos of using a telephoto lens for astrophotography, because the focus on such lenses is touchy and easily gets knocked off kilter. To avoid that, I built something (based on a design I found in an online astronomy forum) that clamps to the focus ring and allows precise adjustments using a small knob.

My next purchase was a modified gun sight to make it easier to aim the camera. The version I bought (for $30 on Amazon) included an adapter that let me mount it to my camera’s hot shoe. You’ll also need a tripod, but you can purchase an adequate one for less than $30.


The only other hardware you need is a laptop. On my Windows machine, I installed four free programs: Canon’s EOS Utility (which allows me to control the camera and download images directly), Canon’s Digital Photo Professional (for managing the camera’s RAW format image files), the GNU Image Manipulation Program (GIMP) photo editor, and a program called Deep Sky Stacker, which lets me combine short-exposure images to enhance the results without having Earth’s rotation ruin things.

It was time to get started. But focusing on astronomical objects is harder than you might think. The obvious strategy is to put the camera in “live view” mode, aim it at Jupiter or a bright star, and then adjust the focus until the object is as small as possible. But it can still be hard to know when you’ve hit the mark. I got a big assist from what’s known as a Bahtinov mask, a screen with angled slats you temporarily stick in front of the lens to create a diffraction pattern that guides focusing.

Stacking software takes a series of images of the sky, compensates for the motion of the stars, and combines the images to simulate long exposures without blurring.
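
To make the stacking step concrete, here is a bare-bones sketch of the shift-and-average idea: each frame is shifted to undo the stars' drift, and the results are averaged, which suppresses random noise by roughly the square root of the number of frames. The per-frame offsets are assumed to be already measured; real stackers such as Deep Sky Stacker estimate them by registering stars across frames.

    import numpy as np

    def stack_frames(frames, offsets):
        """Shift each short exposure to undo star drift, then average.

        frames  -- list of 2D arrays, one per exposure
        offsets -- list of (dy, dx) pixel drifts of the stars in each frame,
                   measured relative to the first frame
        """
        aligned = [np.roll(f, shift=(-dy, -dx), axis=(0, 1))
                   for f, (dy, dx) in zip(frames, offsets)]
        # Averaging N frames suppresses random noise by roughly sqrt(N).
        return np.mean(aligned, axis=0)

    # Toy demo: a faint source drifting one pixel per frame through noise.
    rng = np.random.default_rng(0)
    frames, offsets = [], []
    for i in range(100):
        frame = rng.normal(0.0, 1.0, size=(64, 64))
        frame[32, 32 + i % 4] += 2.0   # dim "star", drifting horizontally
        frames.append(frame)
        offsets.append((0, i % 4))
    print(stack_frames(frames, offsets)[32, 32])  # ~2.0, well above the noise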

After getting some good shots of the moon, I turned to another easy target: the sun. That required a solar filter, of course. I purchased one for $9, which I cut into a circle and glued to a candy tin from which I had cut out the bottom. My tin is of a size that slips perfectly over my lens. With this filter, I was able to take nice images of sunspots. The challenge again was focusing, which required trial and error, because strategies used for stars and planets don’t work for the sun.

With focusing down, the next hurdle was to image a deep-sky object, or DSO—star clusters, galaxies, and nebulae. To image these dim objects really well requires a tracking mount, which turns the camera so that you can take long exposures without blurring from the motion of the Earth. But I wanted to see what I could do without a tracker.

I first needed to figure out how long of an exposure was possible with my fixed camera. A common rule of thumb is to divide 500 by the focal length of your lens in millimeters to get the maximum exposure duration in seconds. For my setup, that would be 1 second. A more sophisticated approach, called the NPF rule, factors in additional details regarding your imaging sensor. Using an online NPF-rule calculator gave me a slightly lower number: 0.8 seconds. To be even more conservative, I used 0.6-second exposures.
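
Both rules reduce to one-line formulas. Below is a minimal sketch using one common simplified form of the NPF rule, t = (35 × N + 30 × p) / f, where N is the f-number, p the pixel pitch in micrometers, and f the focal length in millimeters; the ~4.3-µm pixel pitch for the Canon T6 is an estimate based on its sensor width and resolution, not a figure from the article.

    def rule_500(focal_length_mm):
        """Classic rule of thumb: max exposure (s) = 500 / focal length (mm)."""
        return 500.0 / focal_length_mm

    def rule_npf(focal_length_mm, f_number, pixel_pitch_um):
        """One common simplified form of the NPF rule:
        t = (35*N + 30*p) / f, in seconds."""
        return (35.0 * f_number + 30.0 * pixel_pitch_um) / focal_length_mm

    # The rig described here: 500-mm mirror lens at a fixed f/8. The ~4.3-um
    # pixel pitch is an estimate for the Canon T6 (22.3-mm sensor / 5184 px).
    print(rule_500(500))          # 1.0 second
    print(rule_npf(500, 8, 4.3))  # ~0.82 seconds, close to the 0.8 s above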

My first DSO target was the Orion Nebula, of which I shot 100 images from my suburban driveway. No doubt, I would have done better from a darker spot. I was mindful, though, to acquire calibration frames—“flats” and “darks” and “bias images”—which are used to compensate for imperfections in the imaging system. Darks and bias images are easy enough to obtain by leaving the lens cap on. Taking flats, however, requires an even, diffuse light source. For that I used a $17 A5-size LED tracing pad placed on a white T-shirt covering the lens.
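
The calibration arithmetic itself is simple: subtract the dark from each light frame, and divide by a bias-subtracted, normalized flat. A minimal sketch of that standard processing, assuming the master frames have already been averaged from many individual calibration shots:

    import numpy as np

    def calibrate(light, master_dark, master_flat, master_bias):
        """Standard frame calibration for astrophotography.

        light       -- raw exposure of the sky
        master_dark -- average of darks (lens cap on, same exposure as light)
        master_flat -- average of flats (evenly illuminated field)
        master_bias -- average of zero-length bias frames
        """
        # Bias-subtract the flat, then normalize so dividing preserves flux.
        flat = master_flat.astype(float) - master_bias
        flat /= flat.mean()
        # Dark subtraction removes thermal signal and hot pixels; flat
        # division removes vignetting and dust shadows.
        return (light.astype(float) - master_dark) / flat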

With all these images in hand, I fired up the Deep Sky Stacker program and put it to work. The resultant stack didn’t look promising, but postprocessing in GIMP turned it into a surprisingly detailed rendering of the Orion Nebula. It doesn’t compare, of course, with what somebody can do with better gear. But it does show the kinds of fascinating images you can generate with some free software, an ordinary DSLR, and a vintage telephoto lens pointed at the right spot.

This article appears in the May 2024 print issue as “Electronically Assisted Astronomy.”


Io: New image of a lake of fire, signs of permanent volcanism

19 April 2024 at 20:17 · Ars Technica · John Timmer
Image credit: NASA/JPL-Caltech/SwRI/MSSS/Gerald Eichstädt/Thomas Thomopoulos

Ever since the Voyager mission sent home images of Jupiter's moon Io spewing material into space, we've gradually built up a clearer picture of Io's volcanic activity. It slowly became clear that Io, which is a bit smaller than Mercury, is the most volcanically active body in the Solar System, with all that activity driven by the gravitational strain caused by Jupiter and its three other giant moons. There is so much volcanism that its surface has been completely remodeled, with no signs of impact craters.

A few more details about its violence came to light this week, with new images being released of the moon's features, including an island in a lake of lava, taken by the Juno orbiter. At the same time, imaging done using an Earth-based telescope has provided some indications that this volcanism has been reshaping Io from almost the moment it formed.

Fiery, glassy lakes

The Juno orbiter's mission is primarily focused on studying Jupiter, including the dynamics of its storms and its internal composition. But many of its orbital passes have taken it right past Io, and this week, the Jet Propulsion Laboratory released some of the best images from these flybys. They include a shot of Loki Patera, a lake of lava that has an island within it. Also featured: the impossibly sheer slopes of Io's Steeple Mountain.


A satellite designed to inspect space junk just made it to orbit

Astroscale’s ADRAS-J spacecraft, a demonstration satellite that could inform future space junk cleanup efforts, is now in orbit after a successful launch from New Zealand on Sunday. The satellite was sent to space atop an Electron rocket from Rocket Lab. Its mission, which was selected by Japan’s space agency (JAXA) for Phase I of the Commercial Removal of Debris Demonstration program, will see ADRAS-J rendezvous with an old Japanese rocket upper stage that’s been in orbit since 2009.

There it goes! 🛰️👋

ADRAS-J is now in orbit, ready to start its mission of rendezvousing with an aging piece of space debris and observing it closely to determine whether it can be deorbited in future.

Proud to be part of this innovative @astroscale_HQ mission studying ways to… pic.twitter.com/WcMexdBhHR

— Rocket Lab (@RocketLab) February 18, 2024

The accumulation of waste in Earth’s orbit from decades of spaceflight is an issue of growing concern, and space agencies around the world are increasingly working to address it, in many cases tapping private companies to develop potential solutions. One of the most effective ways to deal with space junk could be to deorbit it, or move it to a lower altitude so it can burn up in Earth’s atmosphere. ADRAS-J will be the first to target a piece of existing large debris and attempt to safely approach and characterize it, relying on ground-based data to home in on its position.

Over the next few months, it’ll make its way to the target and eventually try to get close enough to take images and assess its condition to determine if it can be removed. “ADRAS-J is officially on duty and ready to rendezvous with some space debris!” the company tweeted. “Let the new era of space sustainability begin!”

This article originally appeared on Engadget at https://www.engadget.com/a-satellite-designed-to-inspect-space-junk-just-made-it-to-orbit-192236821.html?src=rss

A rendering of the ADRAS-J satellite in orbit with Earth in the background. Credit: Astroscale

Intuitive Machines’ moon lander sent home its first images and they’re breathtaking

Intuitive Machines’ lunar lander is well on its way to the moon after launching without a hitch on Thursday, but it managed to snap a few incredible images of Earth while it was still close to home. The company shared the first batch of images from the IM-1 mission on X today after confirming in an earlier post that the spacecraft is “in excellent health.” Along with a view of Earth and some partial selfies of the Nova-C lander, nicknamed Odysseus, you can even see the SpaceX Falcon 9 second stage falling away in the distance after separation.

Intuitive Machines successfully transmitted its first IM-1 mission images to Earth on February 16, 2024. The images were captured shortly after separation from @SpaceX's second stage on Intuitive Machines’ first journey to the Moon under @NASA's CLPS initiative. pic.twitter.com/9LccL6q5tF

— Intuitive Machines (@Int_Machines) February 17, 2024

Odysseus is on track to make its moon landing attempt on February 22, and so far appears to be performing well. The team posted a series of updates on X at the end of the week confirming the lander has passed some key milestones ahead of its touchdown, including engine firing. This marked “the first-ever in-space ignition of a liquid methane and liquid oxygen engine,” according to Intuitive Machines.

This article originally appeared on Engadget at https://www.engadget.com/intuitive-machines-moon-lander-sent-home-its-first-images-and-theyre-breathtaking-194208799.html?src=rss

A fisheye view showing part of the Nova-C lander with a portion of Earth in the background, along with a Falcon 9 upper stage falling away. Credit: Intuitive Machines

NASA is looking for volunteers to live in its Mars simulation for a year

If extreme challenges are your cup of tea, NASA has the perfect opportunity for you. The space agency put out a call on Friday for volunteers to participate in its second yearlong simulated Mars mission, the Crew Health and Performance Exploration Analog (CHAPEA 2). For the duration of the mission, which will start in spring 2025, the four selected crew members will be housed in a 1,700-square-foot 3D-printed habitat in Houston. NASA is accepting applications on the CHAPEA website from now through April 2. It’s a paid gig, but NASA hasn’t publicly said how much participants will be compensated.

The Mars Dune Alpha habitat at NASA’s Johnson Space Center is designed to simulate what life might be like for future explorers on the red planet, where the environment is harsh and resources will be limited. There’s a crew currently living and working there as part of the first CHAPEA mission, which is now more than halfway through its 378-day assignment. During their stay, volunteers will perform habitat maintenance and grow crops, among other tasks. The habitat also has a 1,200-square-foot sandbox attached to it for simulated spacewalks.

To be considered, applicants must be a US citizen aged 30-55, speak English proficiently and have a master’s degree in a STEM field, plus at least two years of professional experience, a minimum of one thousand hours piloting an aircraft or two years of work toward a STEM doctoral program. Certain types of professional experience may allow applicants without a master’s to qualify too. CHAPEA 2 is the second of three missions NASA has planned for the program, the first of which began on June 25, 2023.

This article originally appeared on Engadget at https://www.engadget.com/nasa-is-looking-for-volunteers-to-live-in-its-mars-simulation-for-a-year-172926396.html?src=rss

Nathan Jones stands inside the simulated Mars environment in a 1,200-square-foot sandbox attached to the 3D-printed habitat. Credit: NASA/CHAPEA crew

Newly spotted black hole has mass of 17 billion Suns, adding another daily

20 February 2024 at 19:59 · Ars Technica · John Timmer
Artist's view of a tilted orange disk with a black object at its center. Credit: ESO/M. Kornmesser

Quasars initially confused astronomers when they were discovered. First identified as sources of radio-frequency radiation, later observations showed that the objects had optical counterparts that looked like stars. But the spectrum of these ostensible stars showed lots of emissions at wavelengths that didn't seem to correspond to any atoms we knew about.

Eventually, we figured out these were spectral lines of normal atoms, heavily redshifted by immense distances. That means that, to appear like stars at those distances, these objects had to be brighter than an entire galaxy. We ultimately discovered that quasars are the light produced by an actively feeding supermassive black hole at the center of a galaxy.

But finding new examples has remained difficult because, in most images, they continue to look just like stars—you still need to obtain a spectrum and figure out their distance to know you're looking at a quasar. Because of that, there might be some unusual quasars we've ignored because we didn't realize they were quasars. That's the case with an object named J0529−4351, which turned out to be the brightest quasar we've ever observed.

