Space Articles: Latest Space and Astronomy News | Popular Science https://www.popsci.com/category/space/

SpaceX reveals new sleek spacesuits ahead of upcoming historic mission https://www.popsci.com/science/spacex-eva-suits/ Mon, 06 May 2024 18:11:09 +0000
The EVA suit helmet is 3D printed from polycarbonate materials. SpaceX

The Extravehicular Activity (EVA) suits will be worn during the Polaris Dawn spacewalk and feature HUD visor displays.

SpaceX has revealed its new Extravehicular Activity (EVA) suits that could make their low-Earth orbital debut by summer’s end. The new uniform is described as an evolution of the spacesuits currently worn by astronauts aboard Dragon missions, which are designed solely for remaining within pressurized environments. In contrast, the EVA suits will allow astronauts to work both within and outside their capsule as needed, thanks to advancements in materials fabrication and joint design, enhanced redundancy safeguards, and the integration of a helmet-visor heads-up display (HUD).

Announced over the weekend, the SpaceX EVA suits will be worn by the four crewmembers scheduled to comprise the Polaris Program’s first mission, Polaris Dawn. First launched in 2022, the Polaris Program is a joint venture with SpaceX intended to “rapidly advance human spaceflight capabilities,” according to its website. Targeted for no earlier than summer 2024, Polaris Dawn will mark the first commercial spacewalk, as well as the first spacewalk to simultaneously include four astronauts. While making history outside their Dragon capsule, the crew will be the first to test Starlink laser-based communications systems that SpaceX believes will be critical to future missions to the moon and eventually Mars.

Polaris Dawn’s four astronauts will conduct their mission no earlier than summer 2024. SpaceX

Mobility is the central focus of SpaceX’s teaser video posted to X on May 4, with an EVA suit wearer showing off their smooth ranges of motion for fingers, shoulders, and elbows. As PCMag.com also detailed on Monday, SpaceX EVA suits are fabricated with a variety of textile-based thermal materials and include semi-rigid rotator joints that allow work in both pressurized and unpressurized environments. For the boots, designers utilized the same temperature-resilient material found in the Falcon 9 rocket’s interstage and the Dragon capsule’s trunk.

Polaris Dawn astronauts will also sport 3D-printed polycarbonate helmets with visors coated in copper and indium tin oxide alongside anti-glare and anti-fog treatments. During the spacewalk roughly 435 miles above Earth, each crewmember’s helmet will project a built-in heads-up display (HUD) to provide real-time pressure, temperature, and relative humidity readings.

[Related: Moon-bound Artemis III spacesuits have some functional luxury sewn in.]

Similar to the Prada-designed getups for NASA’s Artemis III astronauts, the SpaceX EVA suit is also meant to illustrate a future in which all kinds of body types can live and work beyond Earth. SpaceX explains that all the EVA upgrades are scalable in design, which will allow customization to accommodate “different body types as SpaceX seeks to create greater accessibility to space for all of humanity.” Its proposed goal of manufacturing “millions” of spacesuits for multiplanetary life may seem far-fetched right now, but it’s got to start somewhere, even if with just four of them at the moment.

Why Venus is so dry https://www.popsci.com/science/venus-dry/ Mon, 06 May 2024 15:00:00 +0000
The planet Venus is dry thanks to water loss to space as atomic hydrogen. In this illustration of the dominant loss process, an HCO+ ion recombines with an electron, producing speedy hydrogen atoms (orange) that use CO molecules (blue) as a launchpad to escape. Aurore Simonnet / Laboratory for Atmospheric and Space Physics / University of Colorado Boulder

New computer simulations offer clues into the 'cloud-swaddled' planet's upper atmosphere.

Despite being Earth’s sister planet in terms of size, Venus is pretty parched compared to our watery world. New computer simulations may hold clues about exactly how our neighbor became so dry. 

Hydrogen atoms in the planet’s atmosphere may fling off into space through a process called dissociative recombination, in which a molecular ion recombines with an electron and breaks apart. Venus may be losing roughly twice as much water every day as previous estimates suggested. The findings are detailed in a study published May 6 in the journal Nature and may help explain what happens to water on other planets in our home galaxy.

“Water is really important for life,” study co-author and University of Colorado Boulder astrophysicist Eryn Cangi said in a statement. “We need to understand the conditions that support liquid water in the universe, and that may have produced the very dry state of Venus today.”

The mystery of the missing water

Roughly 71 percent of the Earth’s surface is covered in water. If you took all of that water and spread it evenly across the planet, you’d get a liquid layer about 1.9 miles deep. If you did the same thing on Venus, you would get a layer that is only 1.2 inches deep.

“Venus has 100,000 times less water than the Earth, even though it’s basically the same size and mass,” study co-author and astrophysicist Michael Chaffin said in a statement.
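
Those two depth figures line up with Chaffin’s 100,000-fold comparison, since the two planets are nearly the same size, so the ratio of layer depths roughly tracks the ratio of water volumes. A minimal back-of-the-envelope sketch in Python, using only the numbers quoted in this article:

# Back-of-the-envelope check using the figures quoted in this article.
earth_layer_miles = 1.9        # global water layer on Earth, spread evenly
venus_layer_inches = 1.2       # the same exercise for Venus
inches_per_mile = 63_360

earth_layer_inches = earth_layer_miles * inches_per_mile
print(f"Earth layer: {earth_layer_inches:,.0f} inches")            # ~120,384 inches
print(f"Ratio: ~{earth_layer_inches / venus_layer_inches:,.0f}x")  # ~100,000x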

[Related: A private company wants to look for life just above Venus.]

However, the planet was not always such a desert. Scientists believe that billions of years ago when Venus was forming, it got about as much water as Earth. At some point, clouds of carbon dioxide in Venus’ atmosphere essentially turned the planet into a greenhouse. The trapping of carbon dioxide raised surface temperatures to 900 degrees Fahrenheit. All of Venus’ water evaporated into steam and most drifted into space. 

That ancient evaporation still isn’t enough to explain why Venus is as dry as it is warm, or how it continues to lose water into space.

“As an analogy, say I dumped out the water in my water bottle. There would still be a few droplets left,” Chaffin said. 

What’s kicking out the hydrogen?

To try to determine why Venus is so dry, the team behind this study used computer models to look at the different chemical reactions occurring in the planet’s swirling atmosphere.

“We’re trying to figure out what little changes occurred on each planet to drive them into these vastly different states,” said Cangi.

They found that a molecule made up of one atom each of hydrogen, carbon, and oxygen called HCO+ may be causing the planet to leak water.

[Related: Something is making Venus’s clouds less acidic.]

In a planet’s upper atmosphere, water mixes with carbon dioxide to form these HCO+ molecules. Earlier studies found that HCO+ may also be the reason why Mars lost a large amount of its original water.

On Venus, HCO+ is constantly produced in the atmosphere, but the individual ions don’t survive very long. Electrons in the atmosphere find these ions, recombine with them, and split the molecule in two. When this happens, the hydrogen atoms zip away and may escape into space entirely, slowly stealing one of the two ingredients needed for water away from Venus. The team calculated that the only way to explain Venus’ dry state was if the planet had higher than expected volumes of HCO+ in its atmosphere.
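
In shorthand, the chain described above (and in the image caption at the top of this article) looks roughly like this, with the fast-moving hydrogen doing the escaping:

HCO+ + electron  →  CO + fast hydrogen atom   (dissociative recombination)
fast hydrogen atom  →  escapes Venus’ gravity into space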

Probing Venus

Scientists have never observed HCO+ around Venus. The team believes that is because they’ve never had instruments that can properly look for the ion. None of the spacecraft that have visited Venus, including NASA’s Mariner 2, the European Space Agency’s Venus Express, and Japan’s Akatsuki, have carried instruments that could detect HCO+.

“One of the surprising conclusions of this work is that HCO+ should actually be among the most abundant ions in the Venus atmosphere,” Chaffin said.

[Related: We finally know why Venus is absolutely radiant.]

By the end of this decade, NASA plans to drop a probe through Venus’ atmosphere down to the surface during its DAVINCI (Deep Atmosphere Venus Investigation of Noble gases, Chemistry, and Imaging) mission. While it won’t be able to detect HCO+, the team is hopeful that a future Venus mission might reveal another clue to the mystery of Venus’ missing water.

“There haven’t been many missions to Venus,” Cangi said. “But newly planned missions will leverage decades of collective experience and a flourishing interest in Venus to explore the extremes of planetary atmospheres, evolution and habitability.”

China is en route to collect first-ever samples from the far side of the moon https://www.popsci.com/science/china-moon-launch/ Fri, 03 May 2024 14:20:28 +0000
A Long March 5 rocket, carrying the Chang'e-6 mission lunar probe, lifts off as it rains at the Wenchang Space Launch Centre in southern China's Hainan Province on May 3, 2024. Credit: HECTOR RETAMAL/AFP via Getty Images

Chang'e-6 spacecraft's payoff could be historic.

China launched its uncrewed Chang’e-6 lunar spacecraft at 5:27 PM local time (5:27 AM EDT) on Friday from the southern island province of Hainan, accelerating its ongoing space race with the US. If successful, a lander will detach upon reaching lunar orbit and descend to the surface to scoop up samples from the expansive South Pole-Aitken basin impact crater. Once finished, the lander will launch back up to Chang’e-6, dock, and return to Earth with the first-of-its-kind samples in tow. All told, the mission should take roughly 56 days to complete.

China’s potential return to the moon marks a significant development in international efforts to establish a permanent presence there. As the US moves forward with its Artemis program missions alongside assistance from Japan and commercial partners, China and Russia are also seeking to build their own lunar research station. Which side gets there first could have major ramifications for the future of moon exploration, resource mining, and scientific progress.

[Related: Why do all these countries want to go to the moon right now? ]

The China National Space Administration’s (CNSA) previous Chang’e-5 mission successfully landed a spacecraft at a volcanic plain on the moon’s near side, but Chang’e-6 aims to take things further, both technologically and logistically. To pull off the far side feat, CNSA mission controllers will need to use a satellite already in orbit around the moon to relay communications with Chang’e-6 once its direct line to Earth becomes blocked. But if they can manage it, the payoff will be substantial.

As NBC News explained Friday, the moon’s far side is much less volcanically active than its near side. Since all previous lunar samples have come from the near side, experts believe retrieving new samples elsewhere will help increase their understanding of the moon’s history, as well as potential information on the solar system’s origins.

NASA most likely still has an edge when it comes to returning actual humans to the moon, however. Even with recent mission delays, Artemis 3 astronauts are currently scheduled to reach the probable ice-laden lunar south pole by 2026. China does not expect to send its own taikonauts to the moon until at least 2030, and its joint research station with Russia remains in a conceptual phase.

That same year will also mark the official decommissioning of the International Space Station. After NASA remotely guides it into a fiery re-entry through Earth’s atmosphere, the only remaining orbital station will be China’s three-module Tiangong facility.

In an interview with Yahoo Finance earlier this week, NASA Administrator Bill Nelson didn’t mince words about the potential ramifications of who sets up on the moon first.

“I think it’s not beyond the pale that China would suddenly say, ‘We are here. You stay out,’” Nelson said at the time. “That would be very unfortunate—to take what has gone on on planet Earth for years, grabbing territory, and saying it’s mine and people fighting over it.”

‘Lucy’s baby’ asteroid is only about 2 to 3 million years old https://www.popsci.com/science/baby-asteroid/ Fri, 03 May 2024 13:26:19 +0000
A pair of stereoscopic images of the asteroid Dinkinesh and Selam created with data collected by the L’LORRI camera on NASA's Lucy spacecraft in the minutes around closest approach on November 1, 2023. NASA/Goddard/SwRI/Johns Hopkins APL/NOIRLab for the original images/Brian May/Claudia Manzoni for stereo processing of the images

The moonlet orbiting the asteroid Dinkinesh is 'an extraordinarily unique and complex body.’

A newly discovered asteroid is a toddler, at least in space years. Selam, the moonlet circling the small asteroid Dinkinesh, is about 2 to 3 million years old. Scientists arrived at this age estimate using new calculation methods that are described in a study published April 19 in the journal Astronomy and Astrophysics.

Selam is nicknamed “Lucy’s baby,” after NASA’s Lucy spacecraft discovered it orbiting another asteroid in November 2023. The Lucy mission is the first set to explore the Trojan asteroids, a group of about 7,000 primitive space rocks that share Jupiter’s orbit around the sun. Lucy is expected to provide the first high-resolution images of these space rocks. Dinkinesh and Selam, however, are located in the Main Asteroid Belt between Mars and Jupiter.

Discovering a tiny moonlet was a surprise. According to study co-author and Cornell University aerospace engineering doctoral student Colby Merrill, Selam turned out to be “an extraordinarily unique and complex body.” Selam is a contact binary, made of two lobes of rubble stuck together, and is the first contact binary ever observed orbiting another asteroid. Scientists believe that Selam was formed from surface material ejected by Dinkinesh’s rapid spinning.

[Related: NASA spacecraft Lucy says hello to ‘Dinky’ asteroid on far-flying mission.]

“Finding the ages of asteroids is important to understanding them, and this one is remarkably young when compared to the age of the solar system, meaning it formed somewhat recently,” Merrill said in a statement. “Obtaining the age of this one body can help us to understand the population as a whole.”

To estimate its age, the team studied how Dinkinesh and Selam move in space, or their dynamics. Binary asteroids like this pair are engaged in a gravitational tug-of-war. The gravity acting on the two bodies makes them physically bulge, raising tides similar to the ones Earth’s oceans experience. The tides slowly reduce the system’s energy. At the same time, the sun’s radiation also changes the binary system’s energy, a process known as the Binary Yarkovsky-O’Keefe-Radzievskii-Paddack (BYORP) effect. The system will eventually reach an equilibrium, where tides and BYORP are equally strong.

Assuming that the forces between the two were at equilibrium and plugging in asteroid data from the Lucy mission, the team calculated how long it would have taken for Selam to get to its current state after it formed. The team said that they improved preexisting equations that assumed both bodies in a binary system are equally dense and did not factor in the secondary body’s mass. Their computer simulations ran about 1 million calculations with varying parameters and found a median age of 3 million years, with 2 million being the most likely result. This calculation also agreed with an estimate made by the Lucy mission team using a more traditional method that dates asteroids by analyzing their surface craters.
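
The workflow described here, drawing many parameter sets from their uncertainty ranges, running each through the equilibrium equations, and summarizing the resulting ages, can be sketched in a few lines of Python. This is only a minimal illustration: age_from_parameters below is a hypothetical placeholder rather than the study’s actual tidal-plus-BYORP equations, and the parameter names and ranges are invented for the example.

import random
import statistics

def age_from_parameters(density_g_cm3, separation_km, mass_ratio):
    """Hypothetical stand-in for the study's tidal + BYORP equilibrium
    equations (not reproduced here); returns an age in millions of years."""
    return 3.0 * (density_g_cm3 / 2.5) * (separation_km / 3.1) / (mass_ratio / 0.004)

ages = []
for _ in range(100_000):  # the study ran roughly 1 million such draws
    density = random.uniform(1.5, 3.5)          # illustrative uncertainty range
    separation = random.uniform(2.8, 3.4)       # illustrative uncertainty range
    mass_ratio = random.uniform(0.003, 0.005)   # illustrative uncertainty range
    ages.append(age_from_parameters(density, separation, mass_ratio))

# With these made-up ranges the median lands near ~3 Myr, echoing the study's result.
print("median age (Myr):", round(statistics.median(ages), 1))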

According to the team, studying asteroids this way does not require a spacecraft like Lucy to take close-up images, thus saving money. It could also be more accurate in cases where an asteroid’s surface has undergone recent changes during its time in space. Since roughly 15 percent of all near-Earth asteroids are binary systems, this method can also be used to study other secondary bodies like the moonlet Dimorphos. NASA deliberately crashed a spacecraft into Dimorphos in September 2022 to test out planetary defense technology.

[Related: NASA’s asteroid blaster turned a space rock into an ‘oblong watermelon.’]

“Used in tandem with crater counting, this method could help better constrain a system’s age,” study co-author and Cornell University astronomy doctoral student Alexia Kubas said in a statement. “If we use two methods and they agree with each other, we can be more confident that we’re getting a meaningful age that describes the current state of the system.”

NASA’s Mars Sample Return mission has a shaky future https://www.popsci.com/science/mars-sample-return-nasa/ Fri, 03 May 2024 12:00:00 +0000
A still from an animation showing the Mars Sample Return mission’s plan, as designed by the Jet Propulsion Laboratory. NASA/JPL/YouTube

The agency is calling on private companies for backup.

This article was originally featured on The Conversation.

A critical NASA mission in the search for life beyond Earth, Mars Sample Return, is in trouble. Its budget has ballooned from US$5 billion to over $11 billion, and the sample return date may slip from the end of this decade to 2040.

The mission would be the first to try to return rock samples from Mars to Earth so scientists can analyze them for signs of past life.

NASA Administrator Bill Nelson said during a press conference on April 15, 2024, that the mission as currently conceived is too expensive and too slow. NASA gave private companies a month to submit proposals for bringing the samples back in a quicker and more affordable way.

As an astronomer who studies cosmology and has written a book about early missions to Mars, I’ve been watching the sample return saga play out. Mars is the nearest and best place to search for life beyond Earth, and if this ambitious NASA mission unraveled, scientists would lose their chance to learn much more about the red planet.

The habitability of Mars

The first NASA missions to reach the surface of Mars in 1976 revealed the planet as a frigid desert, uninhabitable without a thick atmosphere to shield life from the Sun’s ultraviolet radiation. But studies conducted over the past decade suggest that the planet may have been much warmer and wetter several billion years ago.

The Curiosity and Perseverance rovers have each shown that the planet’s early environment was suitable for microbial life.

They found the chemical building blocks of life and signs of surface water in the distant past. Curiosity, which landed on Mars in 2012, is still active; its twin, Perseverance, which landed on Mars in 2021, will play a crucial role in the sample return mission.

The Mars Jezero Crater, where scientists are searching for signs of ancient bacteria. Credit: ESA/DLR/FU Berlin, CC BY-SA

Why astronomers want Mars samples

The first time NASA looked for life in a Mars rock was in 1996. Scientists claimed they had discovered microscopic fossils of bacteria in the Martian meteorite ALH84001. This meteorite is a piece of Mars that landed in Antarctica 13,000 years ago and was recovered in 1984. Scientists disagreed over whether the meteorite really had ever harbored biology, and today most scientists agree that there’s not enough evidence to say that the rock contains fossils.

Several hundred Martian meteorites have been found on Earth in the past 40 years. They’re free samples that fell to Earth, so while it might seem intuitive to study them, scientists can’t tell where on Mars these meteorites originated. Also, they were blasted off the planet’s surface by impacts, and those violent events could have easily destroyed or altered subtle evidence of life in the rock.

There’s no substitute for bringing back samples from a region known to have been hospitable to life in the past. That necessity comes at a cost: the agency is facing a price tag of roughly $700 million per ounce, which would make these samples the most expensive material ever gathered.

A compelling and complex mission

Bringing Mars rocks back to Earth is the most challenging mission NASA has ever attempted, and the first stage has already started.

Perseverance has collected over two dozen rock and soil samples, depositing them on the floor of the Jezero Crater, a region that was probably once flooded with water and could have harbored life. The rover inserts the samples in containers the size of test tubes. Once the rover fills all the sample tubes, it will gather them and bring them to the spot where NASA’s Sample Retrieval Lander will land. The Sample Retrieval Lander includes a rocket to get the samples into orbit around Mars.

The European Space Agency has designed an Earth Return Orbiter, which will rendezvous with the rocket in orbit and capture the basketball-sized sample container. The samples will then be automatically sealed into a biocontainment system and transferred to an Earth entry capsule, which is part of the Earth Return Orbiter. After the long trip home, the entry capsule will parachute to the Earth’s surface.

The complex choreography of this mission, which involves a rover, a lander, a rocket, an orbiter and the coordination of two space agencies, is unprecedented. It’s the culprit behind the ballooning budget and the lengthy timeline.

Sample return breaks the bank

Mars Sample Return has blown a hole in NASA’s budget, which threatens other missions that need funding.

The NASA center behind the mission, the Jet Propulsion Laboratory, just laid off over 500 employees. It’s likely that Mars Sample Return’s budget partly caused the layoffs, but they also came down to the Jet Propulsion Laboratory having an overfull plate of planetary missions and suffering budget cuts.

Within the past year, an independent review board report and a report from the NASA Office of Inspector General raised deep concerns about the viability of the sample return mission. These reports described the mission’s design as overly complex and noted issues such as inflation, supply chain problems and unrealistic costs and schedule estimates.

NASA is also feeling the heat from Congress. For fiscal year 2024, the Senate Appropriations Committee cut NASA’s planetary science budget by over half a billion dollars. If NASA can’t keep a lid on the costs, the mission might even get canceled.

Thinking out of the box

Faced with these challenges, NASA has put out a call for innovative designs from private industry, with a goal of shrinking the mission’s cost and complexity. Proposals are due by May 17, which is an extremely tight timeline for such a challenging design effort. And it’ll be hard for private companies to improve on the plan that experts at the Jet Propulsion Laboratory had over a decade to put together.

An important potential player in this situation is the commercial space company SpaceX. NASA is already partnering with SpaceX on America’s return to the Moon. For the Artemis III mission, SpaceX will attempt to land humans on the Moon for the first time in more than 50 years.

However, the massive Starship rocket that SpaceX will use for Artemis has had only three test flights and needs a lot more development before NASA will trust it with a human cargo.

In principle, a Starship rocket could bring back a large payload of Mars rocks in a single two-year mission and at far lower cost. But Starship comes with great risks and uncertainties. It’s not clear whether that rocket could return the samples that Perseverance has already gathered.

Starship uses a launchpad, and it would need to be refueled for a return journey. But there’s no launchpad or fueling station at the Jezero Crater. Starship is designed to carry people, but if astronauts go to Mars to collect the samples, SpaceX will need a Starship rocket that’s even bigger than the one it has tested so far.

Sending astronauts also carries extra risk and cost, and a strategy of using people might end up more complicated than NASA’s current plan.

With all these pressures and constraints, NASA has chosen to see whether the private sector can come up with a winning solution. We’ll know the answer next month.

Disclosure: Chris Impey receives funding from the National Science Foundation and the Howard Hughes Medical Institute.

Space Force finds a dead Cold War-era satellite missing for 25 years https://www.popsci.com/science/lost-satellite-found/ Thu, 02 May 2024 18:16:29 +0000
The S73-7 Infra-Red Calibration Balloon has been lost once before since it first launched in 1974. NASA/JSC

It's not the first time the tiny spy balloon has disappeared.

The US Space Force located a tiny experimental satellite after it spent two and a half decades missing in orbit. Hopefully, they’ll be able to keep an eye on it for good this time, unlike the last time.

The S73-7 Infra-Red Calibration Balloon (IRCB) was dead on arrival after ejecting from one of the Air Force’s largest Cold War orbital spy camera systems. Although it successfully departed the KH-9 Hexagon reconnaissance satellite about 500 miles above Earth in 1974, the S73-7 failed to inflate to its full 26-inch diameter. The malfunction prevented it from helping ground-based equipment triangulate remote sensing arrays, thus rendering it yet another hunk of space junk.

It wasn’t long afterwards that observers lost sight of the IRCB, only to once again locate the small satellite in the early 1990s. And then, they managed to lose it again. Now, after another 25 years, the US Space Force’s 18th Space Defense Squadron has rediscovered the experimental device.

Confirmation came through a recent post on X from Jonathan McDowell, an astrophysicist at the Harvard-Smithsonian Center for Astrophysics, who offered his “congrats to whichever… analyst made the identification.”

So how does a satellite disappear for years on end not once, but twice? It’s actually much easier than you might think. As Gizmodo explained on May 1, over 27,000 objects are currently in orbit, most of which are spent rocket boosters. These, along with various satellites, don’t transmit any sort of identification back to Earth. Because of this, tracking systems must match a detected object to a satellite’s predictable orbital path in order to ID it.

[Related: Some space junk just got smacked by more space junk, complicating cleanup.]

If you possess relatively up-to-date radar data, and there aren’t many contenders in a similar orbit, then it usually isn’t hard to pinpoint satellites. But the more crowded an area, the more difficult it is for sensors to make a match, especially if you haven’t seen your target in a while—say, a miniature Infra-Red Calibration Balloon from the 1970s.

It’s currently unclear exactly what information tipped off Space Force analysts to match their newly detected object with the S73-7, but regardless, the satellite is now at least trackable above everyone’s heads. In all that time, McDowell’s data indicates the balloon has only descended roughly 9 miles from its original 500-mile altitude, so it’ll be a while before it succumbs to gravity and burns up in the atmosphere. Accounting for everything in orbit may sometimes be taken for granted, but it’s a vital component of humanity’s increasing reliance on satellite arrays, as well as the overall future of space travel.

Astronomers still haven’t decided if Planet 9 is real https://www.popsci.com/science/is-planet-9-real/ Thu, 02 May 2024 12:30:00 +0000
Artist's concept of a hypothetical planet orbiting far from the Sun. Caltech/R. Hurt (IPAC) / NASA

'We can’t quite explain it, but if you add Planet Nine into the model it all makes sense.'

Almost a decade after astronomers proposed the existence of Planet 9, an unseen extra planet in the distant reaches of the solar system, they still haven’t all agreed whether it’s real or not. Now, new research from Caltech astronomers just uncovered an extra line of evidence in favor of the hidden planet. Their computer simulations require Planet 9’s gravitational kick to explain how small bits of rock and ice from around Neptune’s orbit end up close to the sun.

“There is an open question of why particular objects in the solar system act the way they do, and we can’t quite explain it, but if you add Planet Nine into the model it all makes sense,” says Juliette Becker, an astronomer at the University of Wisconsin, Madison not affiliated with this new work. 

These objects are Trans-Neptunian Objects (TNOs): chunks of debris in the outer solar system, beyond Neptune and even Pluto. Until the 2000s, astronomers hadn’t spotted many TNOs—especially not the most distant ones. They’re incredibly faint and difficult to see, a result of their small sizes and huge distances from Earth. Once astronomers had built up a more substantial catalog of observed TNOs, however, they began to notice some strange trends.

A group of TNOs were bunched together, sharing similar orbits as if they were being wrangled up by something, like a group of sheep by a shepherd. These oddballs were orbiting at very high angles compared to other TNOs, and they were lined up in the same direction. Some astronomers, including the same Caltech crew behind the new bit of evidence, claimed that the most likely explanation for these observations was the existence of Planet 9, a massive object acting as a gravitational shepherd for the TNO sheep.

However, other astronomers thought Planet 9 was an outlandish solution to the puzzle at hand, coming up with other ways to explain the unexpected observations. Some suggested that the clusters of TNOs could be a natural result of the solar system’s formation, with no need for Planet 9. Others thought that the shepherd was actually a small black hole instead of a giant planet. More recently, two astronomers in Japan proposed that a different planet, instead of Planet 9, might be lurking in the Kuiper Belt. 

Theories abound for explaining the observed orbits of TNOs—and astronomers have spent the past eight years discussing and debating which make the most sense. This isn’t an anomaly, but instead an illustration of the scientific process. Scientists iteratively and collaboratively improve our understanding of a natural phenomenon, exploring all the evidence to find the best explanation for an observation.

Now, the Caltech team just showed how Planet 9 could be necessary to explain a different group of TNOs, which were somehow chucked towards the sun. An object on a path that crosses Neptune’s orbit, dips towards the sun, and swings back shouldn’t be able to stay that way for long. If we see objects in these kinds of orbits, something has to be pushing them to be there—perhaps even Planet 9. 

“If Planet Nine exists, it would occasionally pull the orbits of distant Trans-Neptunian objects closer to the sun, to the point where they cross Neptune’s orbit. Without Planet Nine, these objects can’t be pushed inward past Neptune very often,” explains Konstantin Batygin, Caltech astronomer and lead author on the new paper. 

“Planet Nine would re-supply the population of these objects as they are depleted, explaining why we can see them at the present day when the Solar System is relatively old,” adds Becker.

Throughout the years of theories, some astronomers have been entranced by the idea of actually spotting Planet 9 in the night sky. Despite the evidence of its gravitational influence, seeing is still believing and many of us won’t be satisfied until we have concrete proof that Planet 9 is there in our telescopes.

Batygin and co-author Mike Brown, also an astronomer at Caltech, have been hunting for Planet 9 using huge archives of data taken by surveys of the night sky from the Pan-STARRS1 facility atop Haleakala in Hawai’i, the Dark Energy Survey completed in Chile, and the Zwicky Transient Facility at Palomar Observatory near San Diego. Astronomers from Yale even used the exoplanet-hunting satellite TESS to scan the sky for Planet 9. Unfortunately, no one has seen the elusive extra planet yet.

“Simply put, Planet 9 is very distant and extraordinarily dim,” says Batygin. “The challenge of directly detecting it is difficult to appreciate without seeing first-hand how complex the observation process is, especially when looking for the proverbial needle in a haystack.”

The Vera Rubin Observatory in Chile—currently scheduled to begin operations in early 2025 and equipped with the largest digital camera ever made for astronomy—will provide an excellent opportunity to continue the search for Planet 9. Astronomers have been looking forward to this facility for years, even citing it in a PopSci article from 2020 as the key to solving this mystery once and for all.

And if there’s no planet to be found, even with a bigger and better observatory on the case? “If it turns out not to be there, then we will need to find individual explanations for all these different observations,” says Becker. “I am continually amazed by just how many solar system puzzles Planet Nine’s existence would solve.”

Ancient farm practice could help sustain future humans on Mars https://www.popsci.com/science/mars-farms-future/ Wed, 01 May 2024 18:00:00 +0000
Intercropped tomato (left) compared to monocropped tomatoes (right). Both were planted on the same day, but here we can see that the intercropped tomato plant is larger, bears more fruit, and the tomatoes ripened earlier than its monocropped counterpart. Wageningen University & Research/Rebeca Gonçalves

This ancient agricultural technique may increase yields of some plants on the Red Planet.

NASA has big plans for space farms and there are plenty of ideas from astrobiologists for what the best crops to grow on Mars could be. To best optimize these future extraterrestrial farms, scientists are also exploring what planting methods could boost potential crop yields on the Red Planet. Some new experiments with tomato, carrot, and pea plants found that growing different crops mixed together could boost yields of some plants in certain Martian conditions. The findings could also have implications for life on Earth and are described in a study published May 1 in the journal PLOS One.

A Martian greenhouse

In order for future humans to survive on Mars for long stretches at a time, nutritious food is going to be essential. While learning how fake astronaut Mark Watney grew potatoes in the sci-fi novel and film The Martian was entertaining and informative, real astronauts should have some helpful resources from planet Earth for growing food in future Mars settlements.

To learn how to best do this, scientists on Earth must simulate the unique conditions on the Red Planet here. Mars’ atmosphere is about 100 times thinner than Earth’s and is mostly made up of carbon dioxide, nitrogen, and argon gasses. Entire Martian colonies in the future will need to be set up in controlled enclosures similar to greenhouses with an Earth-like atmosphere of the right mixture of oxygen, nitrogen, and carbon dioxide.

[Related: Why space lettuce could be the pharmacy astronauts need.]

“The best ‘Martian environment’ is actually simply a greenhouse with controlled conditions including temperature, humidity, and gasses,” Rebeca Gonçalves, a study co-author and astrobiologist at Wageningen University & Research in The Netherlands, tells PopSci.

For this study, Gonçalves and the team used greenhouses at the university to simulate a growing environment on Mars. They tested how crops fare in a simulated version of Martian regolith, the loose and rocky material covering the planet. Pots of standard potting soil and sand were used as a control group. Bits of organic Earth soil and other nutrients were also added to the sand and Martian regolith samples to improve water retention and root holding.

A close-up of Martian tomatoes growing (left). The simulated Martian regolith with a root system (right). CREDIT: Wageningen University & Research /Rebeca Gonçalves.

Picking plants

For the plants on this fake Martian farm, the team selected peas, carrots, and tomatoes. A 2014 study found that all three are able to grow in Martian regolith. According to Gonçalves, knowing that these plants could grow was key, since they were looking for an answer to a different question. They wanted to know how to use companion plants and intercropping–an ancient planting technique of growing two or more plants in close proximity–to boost crop yields. These three also could have an important nutritional role in the future. 

Experimental setup in a greenhouse (left). Pots with Mars, sand, and Earth soil (right). CREDIT: Wageningen University & Research /Rebeca Gonçalves.

“They were chosen for their nutritional content, being high in antioxidants, vitamin C, and beta carotene,” says Gonçalves. “This is important because these nutrients are all completely lost in the process of food dehydration, which is the main process we use to send food to space missions. Therefore, the production of fresh food containing these nutrients is a must in a Martian colony.”

These crops are also companion species that share complementary traits. Peas are considered a main contributor to the intercropping system because they are legumes that can “fix” nitrogen. In nitrogen fixing, some plants and bacteria can turn nitrogen from the air into a form of ammonia that plants can use for nutrition. This, in turn, benefits other plants and diminishes the need for fertilizers to be added to the plant system. According to Gonçalves, it optimizes the resources needed for plants to grow on the Red Planet.

All three experimental species yielded well in the Mars regolith treatment. Healthy Martian tomatoes (left), Martian carrots (middle) and Martian peas (right). CREDIT: Wageningen University & Research/Rebeca Gonçalves.

“Carrots were used to help aerate the soil, which can improve water and nutrient uptake by the companion plants, and tomatoes were used to provide shade for the temperature sensitive carrot and to give climbing support for the peas,” says Gonçalves.

Red fruit, red planet

All three species grew fairly well in the Martian regolith, producing just over half a pound of produce with only a minimum addition of nutrients. The tomatoes grew better alongside the peas and carrots in the intercropping setup than the control tomatoes that were grown alone. The intercropped tomatoes had a higher biomass and also contained more potassium.

Rebeca Gonçalves with ground samples from the harvested tomatoes, peas, and carrots ready for nutrient analysis. CREDIT: Wageningen University & Research /Rebeca Gonçalves.

However, intercropping in this regolith appeared to decrease yields for the carrots and peas. These plants did better alone. In future experiments, the team hopes that some modifications to how the simulated Martian regolith is treated could help increase yields when intercropping is used, so that the carrots and peas can have similarly bigger harvests.

“The fact that it worked really well for one of the species was a big find, one that we can now build further research on,” says Gonçalves. 

[Related: Watering space plants is hard, but NASA has a plan.]

The team was also surprised by how intercropping showed an advantage in the sandy soil control group. It benefited two of the three plant species and this find could be applied to agricultural systems on Earth. Climate change is making some soils more sandy and this study is part of ongoing efforts to see how intercropping can help tackle this issue.

In future studies, the team hopes to figure out how to reach “a completely self-sustainable system using 100% of the local resources on Mars.” This would help make future colonies more financially viable and less dependent on resupply missions.

“If we can unlock the secret to regenerating poor soils while developing a high-yielding, self-sustainable food production system—exactly the goal of Martian agriculture research—we will have found a solution for a lot of the issues we are having here on Earth as well,” says Gonçalves.

JWST measures ‘Hot Jupiter,’ a distant exoplanet hot enough to forge iron https://www.popsci.com/science/jwst-wasp-43b/ Wed, 01 May 2024 15:00:48 +0000
This artist’s concept shows what the hot gas-giant exoplanet WASP-43 b could look like. A Jupiter-sized planet roughly 280 light-years away, the planet orbits its star at a distance of about 1.3 million miles, completing one circuit in about 19.5 hours. Credit: NASA, ESA, CSA

Blazing temperatures and supersonic winds rule WASP-43b.

NASA’s James Webb Space Telescope isn’t only snapping some of the most detailed images of our cosmos—it’s also helping an international team of astronomers determine the weather on planets trillions of miles away from Earth. Its latest subject, WASP-43b, appears to live up to its extremely heavy metal-sounding name.

Astronomers discovered WASP-43b back in 2011, but initially could only assess some of its potential conditions using the Hubble and now-retired Spitzer space telescopes. That said, it was immediately clear that the gas giant is a scorcher. According to their measurements, the planet orbits its star at just 1.3 million miles away. For comparison, that’s not even 1/25th the distance separating Mercury from the sun. WASP-43b is also tidally locked in its orbit, meaning that one side is always facing its star while the other half is constantly cloaked in darkness.

Data from the Mid-Infrared Instrument on NASA’s Webb telescope shows the changing brightness of the WASP-43 star and planet system. The system appears brightest when the hot dayside of the planet is facing the telescope, and grows dimmer as the planet’s nightside rotates into view. Credit: Taylor J. Bell (BAERI); Joanna Barstow (Open University); Michael Roman (University of Leicester) Graphic Design: NASA, ESA, CSA, Ralf Crawford (STScI)

But at 280 light-years away and practically face-to-face with its star, WASP-43b is difficult to see clearly through telescopes. To get a better look, experts enlisted JWST’s Mid-Infrared Instrument (MIRI) to measure extremely small fluctuations in the brightness emitted by the WASP-43 system every 10 seconds for over 24 hours.
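
That cadence also explains the size of the dataset behind the temperature maps shown further down. A quick arithmetic sketch, assuming a simple, uninterrupted 10-second sampling over a full day of monitoring:

# Measurements implied by the observing cadence described above.
cadence_seconds = 10
duration_hours = 24                      # "over 24 hours" of continuous monitoring
measurements = duration_hours * 3600 // cadence_seconds
print(measurements)                      # 8640, consistent with the "more than 8,000
                                         # brightness measurements" noted in the map caption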

“By observing over an entire orbit, we were able to calculate the temperature of different sides of the planet as they rotate into view. From that, we could construct a rough map of temperature across the planet,” Taylor Bell, a researcher at the Bay Area Environmental Research Institute and the lead author of a study published yesterday in Nature Astronomy, said in Tuesday’s announcement.

[Related: JWST images show off the swirling arms of 19 spiral galaxies.]

Some of those temperatures are blazing enough to forge iron, with WASP-43b’s dayside averaging almost 2,300 degrees Fahrenheit. And while the nightside is a balmier 1,100 degrees Fahrenheit, that’s still only about 120 degrees short of the melting point for aluminum.

MIRI’s broad spectrum mid-infrared light data, paired alongside additional telescope readings and 3D climate modeling, also allowed astronomers to measure water vapor levels around the planet. With this information, the team could better calculate WASP-43b’s cloud properties, including their thickness and height.

This set of maps shows the temperature of the visible side of the hot gas-giant exoplanet WASP-43 b as it orbits its star. The temperatures were calculated based on more than 8,000 brightness measurements by Webb’s MIRI (the Mid-Infrared Instrument). Credit: Science: Taylor J. Bell (BAERI); Joanna Barstow (Open University); Michael Roman (University of Leicester) Graphic Design: NASA, ESA, CSA, Ralf Crawford (STScI)

The light data also revealed something striking about the gas giant’s atmospheric conditions—a total lack of methane, which astronomers previously hypothesized may be detectable, at least on the nightside. This fact implies that nearly 5,000 mph equatorial winds must routinely whip across WASP-43b, which are fast enough to prevent the chemical reactions necessary to produce detectable levels of methane.

“With Hubble, we could clearly see that there is water vapor on the dayside. Both Hubble and Spitzer suggested there might be clouds on the nightside,” Bell said on Tuesday. “But we needed more precise measurements from Webb to really begin mapping the temperature, cloud cover, winds, and more detailed atmospheric composition all the way around the planet.”

Zippy meteors, a globular cluster, and more light up May’s night sky https://www.popsci.com/science/cosmic-calendar-may-2024/ Tue, 30 Apr 2024 13:00:00 +0000
The full Flower Moon rises above One World Trade Center and the skyline of lower Manhattan in New York City on May 5, 2023. Lokman Vural Elibol/Anadolu Agency via Getty Images

Be sure to check out the Full Flower Moon on May 22 and 23.

May 5 and 6–Eta Aquarids Meteor Shower Predicted Peak
May 11–Globular Cluster Messier 5 Highest Point
May 14 through 30–Lāhaina Noon
May 22 and 23–Full Flower Moon

While we may not have the excitement of a total solar eclipse this month, May offers us a good chance to see some incredibly fast meteors zipping by. Nighttime stargazing should also start to get more comfortable as temperatures warm up in the Northern Hemisphere. Here’s what to look for in the night sky in May. 

May 5 and 6–Eta Aquarids Meteor Shower Predicted Peak

The Eta Aquarids Meteor Shower is expected to peak on May 5, when roughly 10 to 30 meteors per hour can be seen. Eta Aquarid meteors are known to be super speedy, with some traveling at about 148,000 mph as they slam into our planet’s atmosphere. These fast meteors can also leave behind incandescent bits of debris in their wake called trains.
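
For a sense of how fast that is in metric terms, the quoted entry speed works out to roughly 66 kilometers per second. A quick conversion using the figure above:

# Convert the quoted Eta Aquarid entry speed to kilometers per second.
speed_mph = 148_000
speed_km_s = speed_mph * 1.609344 / 3600
print(round(speed_km_s, 1))   # ~66.2 km/s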

According to the 2024 Observer’s Handbook from the Royal Astronomical Society of Canada, this year’s Eta Aquarid Meteor Shower may put on a particularly good show. The waning crescent moon means less light in the night sky and may help viewing conditions.

[Related: The history of Halley’s Comet—and the fireball show it brings us every spring.]

The Farmer’s Almanac suggests looking towards the southeast between 2 and 4 a.m. local time on May 5 and 6. If it’s cloudy or you miss those days, the shower will likely stay fairly strong until around May 10. This meteor shower is usually active between April 19 and May 28 every year, peaking in early May.

The point in the sky where the meteors appear to come from, known as the radiant, is in the direction of the constellation Aquarius, and the shower is named for the constellation’s brightest star, Eta Aquarii. It is also one of two meteor showers created by the debris from Comet Halley.

May 11–Globular Cluster Messier 5 At Highest Point

A bright globular cluster called Messier 5 (or NGC 5904) will reach its highest point in the sky at about midnight local time. Using a telescope or pair of binoculars, look to the southeastern sky, where it should appear like a patch of light. On the evenings that follow, M5 will reach its highest point about four minutes earlier each night, according to In the Sky.
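
That roughly four-minute nightly shift comes from the difference between the 24-hour solar day and the slightly shorter sidereal day (about 23 hours, 56 minutes, and 4 seconds), the time it takes the same stars to return to the same spot in the sky. A quick check:

# Why deep-sky objects reach their highest point about four minutes earlier each night.
solar_day_minutes = 24 * 60                     # 1440.0
sidereal_day_minutes = 23 * 60 + 56 + 4 / 60    # ~1436.07
print(round(solar_day_minutes - sidereal_day_minutes, 1))   # ~3.9 minutes per night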

[Related: How the Hubble telescope is keeping a 265-year-old stargazing project alive.]

M5 is one of the oldest globular clusters in our galaxy. According to NASA, stars in globular clusters like this are believed to form in the same stellar nursery and grow old together. M5 has an apparent magnitude of 6.7 and is about 25,000 light-years away in the constellation Serpens. It is also very bright in July.

May 14 through 30–Lāhaina Noon

This twice-a-year event in the Earth’s tropical regions occurs when the sun is directly overhead around solar noon. At this point, upright objects do not cast shadows. It happens in May and then again in July. If you are in Hawaii, you can consult this timetable to see what day and time this month’s Lāhaina Noon will occur near you.

According to the Bishop Museum, in English, the word “lāhainā” can be translated as “cruel sun,” and is a reference to severe droughts experienced in that part of the island of Maui in Hawaii. An older term in ʻŌlelo Hawaiʻi is “kau ka lā i ka lolo,” which means “the sun rests upon the brain” and references both the physical and cultural significance of the event. 

May 22 and 23–Full Flower Moon

May’s full moon will reach its peak illumination at 9:53 a.m. EDT on Thursday, May 23. Since it will already be below the horizon when it reaches peak illumination, it will be best viewed on the nights of May 22 and 23. You can use a moonrise and moonset calculator to determine exactly what time to head out and take a gander at this month’s full moon.

The name Flower Moon is in reference to May’s blooms when flowers are typically most abundant in the Northern Hemisphere. May’s full moon is also called the Flowering Moon or Waabigoni-giizis in Anishinaabemowin (Ojibwe), the They Plant Moon or Latiy^thos in Oneida, and the Dancing Moon or Ganö́’gat in Seneca. 

The same skygazing rules that apply to pretty much all space-watching activities are key during the nighttime events this month: Go to a dark spot away from the lights of a city or town and let your eyes adjust to the darkness for about half an hour.

The post Zippy meteors, a globular cluster, and more light up May’s night sky appeared first on Popular Science.

China compiled the most detailed moon atlas ever mapped https://www.popsci.com/science/moon-atlas/ Mon, 29 Apr 2024 19:00:00 +0000 https://www.popsci.com/?p=612856
Moon photograph from Artemis 1
On flight day 20 of NASA’s Artemis I mission, Orion’s optical navigation camera looked back at the Moon as the spacecraft began its journey home. NASA/JSC

The Geologic Atlas of the Lunar Globe includes 12,341 craters, 81 basins, and 17 different rock types.

If we want to establish a permanent human presence on the moon, we need more detailed maps than the existing options, some of which date back to the Apollo missions of the 1960s and 1970s. After more than ten years of collaboration between more than 100 researchers at the Chinese Academy of Sciences (CAS), the newest editions of lunar topography are rolling out for astronomers and space agencies around the world.

As highlighted recently by Nature, the Geologic Atlas of the Lunar Globe includes 12,341 craters, 81 basins, and 17 different rock types found across the moon’s surface, doubling previous map resolutions to a scale of 1:2,500,000.
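For a sense of what those ratios mean on the ground, a map scale is just a multiplier: one unit on the map corresponds to that many units of real lunar surface. A quick comparison (plain arithmetic, not anything drawn from the atlas itself):

```python
# At 1:2,500,000, one millimeter on the map spans 2.5 kilometers of lunar surface;
# the older 1:5,000,000 maps spanned twice that per millimeter.
for scale in (2_500_000, 5_000_000):
    km_per_map_mm = scale / 1_000_000  # mm of ground per mm of map, converted to km
    print(f"1:{scale:,} -> {km_per_map_mm} km of terrain per map millimeter")
```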

[Related: Why do all these countries want to go to the moon right now?]

Although higher accuracy maps have been available for areas near Apollo mission landing sites, the US Geological Survey’s original lunar maps generally managed a 1:5,000,000 scale. Project co-lead and CAS geochemist Jianzhong Liu explained to Nature that “our knowledge of the Moon has advanced greatly, and those maps could no longer meet the needs for future lunar research and exploration.”

Geologic map of the moon
Credit: Chinese Academy of Sciences via Xinhua/Alamy

To guide lunar mapping into the 21st century, CAS relied heavily on China’s ongoing lunar exploration programs, including the Chang’e-1 mission. Beginning in 2007, Chang’e-1’s high-powered cameras surveyed the moon’s surface from orbit for two years, alongside an interference imaging spectrometer used to identify various rock types. Additional data compiled by the Chang’e-3 (2013) and Chang’e-4 (2019) lunar landers subsequently helped hone those mapping endeavors. International projects like NASA’s Gravity Recovery and Interior Laboratory (GRAIL) and Lunar Reconnaissance Orbiter, as well as India’s Chandrayaan-1 probe, all provided even more valuable topographical information.

The pivotal topographical milestone wasn’t an entirely altruistic undertaking, however. While CAS geophysicist Ross Mitchell described the maps as “a resource for the whole world,” he added that “contributing to lunar science is a profound way for China to assert its potential role as a scientific powerhouse in the decades to come.” 

[Related: Japan and NASA plan a historic lunar RV road trip together.]

The US is also far from the only nation anxious to set up shop on the moon—both China and Russia hope to arrive there by the mid-2030s with the construction of an International Lunar Research Station near the moon’s south pole. Despite the two nations’ prior promise to be “open to all interested countries and international partners,” the US is distinctly not among the 10 other governments currently attached to the project.

China plans to launch its Chang’e-6 robotic spacecraft later this week, which will travel to the far side of the moon as the first of three new missions. In an interview on Monday, NASA Administrator Bill Nelson voiced his concerns of a potential real estate war on the moon.

Lithographic map of the moon
Credit: Chinese Academy of Sciences via Xinhua/Alamy

“I think it’s not beyond the pale that China would suddenly say, ‘We are here. You stay out,’” Nelson told Yahoo Finance. “That would be very unfortunate—to take what has gone on on planet Earth for years, grabbing territory, and saying it’s mine and people fighting over it.”

But if nothing else, at least the new maps will soon be available to virtually everyone. The Geologic Atlas is included in a new book from CAS, Map Quadrangles of the Geologic Atlas of the Moon, which also features an additional 30 sector diagrams offering even closer looks at individual lunar regions. The entire map resource will soon also become available to international researchers online through a cloud platform called Digital Moon.

The post China compiled the most detailed moon atlas ever mapped appeared first on Popular Science.

Rare quadruple solar flare event captured by NASA https://www.popsci.com/science/quadruple-solar-flare/ Thu, 25 Apr 2024 18:18:20 +0000 https://www.popsci.com/?p=612553
Image of sun highlighting four solar events
Similar activity will likely increase as the sun nears its 'solar maximum.'. Credit: NASA/SDO/AIA

The 'super-sympathetic flare' might affect satellites and spacecraft near Earth.

Earlier this week, NASA’s Solar Dynamics Observatory (SDO) recorded a rarely seen event—four nearly-simultaneous flare eruptions involving three separate sunspots, as well as the magnetic filament between them. But as impressive as it is, the event could soon pose problems for some satellites and spacecraft orbiting Earth, as well as electronic systems here on the ground.

It may seem like a massive ball of fiery, thermonuclear chaos, but there’s actually a fairly predictable rhythm to the sun. Similar to Earth’s seasonal changes, the yellow dwarf star’s powerful electromagnetic fluctuations follow a roughly 11-year cycle of ebbs and flows. Although astronomers still aren’t quite sure why this happens, it’s certainly observable—and recent activity definitely indicates the sun is heading towards its next “solar maximum” later this year.

Gif of supersympathetic solar flares
Credit: NASA/SDO/AIA

As Spaceweather.com notes, early Tuesday morning’s “complex quartet” of solar activity was what’s known as a “super-sympathetic flare,” in which multiple events occur at nearly the same time. This happens thanks to the often hard-to-detect magnetic loops spreading across the sun’s corona, which can create explosive chain reactions in the process. In this case, hundreds of thousands of miles separated the three individual flares, but they still erupted within minutes of each other. All told, the super-sympathetic flare encompassed about a third of the sun’s total surface facing Earth.

[Related: Why our tumultuous sun was relatively quiet in the late 1600s]

And that “facing Earth” factor could present an issue. BGR explains “at least some” of the electromagnetic “debris” could be en route towards the planet in the form of a coronal mass ejection (CME). If so, those forces could result in colorful auroras around the Earth’s poles—as well as create potential tech woes for satellite arrays and orbiting spacecraft, not to mention blackouts across some radio and GPS systems. The effects, if there are any, are estimated to occur over the next day or so, but at least they’re predicted to only be temporary inconveniences.

Luckily, multi-flare situations like this week’s aren’t a regular occurrence—the last time something similar happened was back in 2010 in what became known as the Great Eruption.

[Related: Hold onto your satellites: The sun is about to get a lot stormier]

Still, these super-sympathetic flares serve as a solid reminder of just how much of our modern, electronically connected society is at the sun’s mercy. As recently as 2022, for example, a solar storm knocked around 40 Starlink satellites out of orbit. The risk of solar-induced problems will continue to rise as the skies grow increasingly crowded.

While many companies continue to construct redundancy programs and backup systems for these potential headaches, astronomers and physicists still can’t predict solar activity very accurately. More research and funding is needed to create early warning and forecasting programs.

This year alone has already seen at least two other solar activity events—and seeing as how we still haven’t passed the solar maximum, more impressive (and maybe damaging) activity is likely on the way.

The post Rare quadruple solar flare event captured by NASA appeared first on Popular Science.

Gassy geysers create ‘spiders’ on Mars https://www.popsci.com/science/spiders-mars-inca-city/ Thu, 25 Apr 2024 15:58:35 +0000 https://www.popsci.com/?p=612539
left image- This rectangular image shows part of the martian surface as if the viewer is looking down and across the landscape, with the irregular, mottled ground appearing in swirled tones of brown and tan. right image- A slice of the martian surface is shown here. A rounded segment of an eroded crater basin is visible to the right. The key features seen across the image are dark spots with tendrils that are eerily reminiscent of spiders. These are visible in large numbers to the left, and scattered irregularly across the rest of the image.
Mars' Inca City formation (left) is home to Martian ‘spiders’ every spring (right). ESA/DLR/FU Berlin (left) ESA/TGO/CaSSIS (right)

ESA captures new images of the seasonal phenomenon in the 'Inca City' region of the Red Planet.

It’s ‘spider’ season on the Red Planet. There are no actual spiders on Mars–that we know of–but arachnid-shaped black spots dot some parts of our celestial neighbor every spring.

[Related: Mars’s mascara-like streaks may be caused by slush and landslides.]

The European Space Agency (ESA) released new images of these seasonal eruptions in a formation called Inca City in Mars’ southern polar region.

How do Martian ‘spiders’ form?

Mars has four distinct seasons, much like Earth, though each Martian season lasts roughly twice as long as a season here. According to the ESA, these spider-like marks appear in Martian spring, when sunlight falls on layers of carbon dioxide that were deposited over the dark Martian winter. The sunlight causes carbon dioxide ice at the bottom layer to turn into gas. The gas builds up and eventually breaks through slabs of overlying ice around Mars’ poles. When it bursts free, it drags dark material up to the surface, shattering layers of ice up to three feet thick as it travels.
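The “roughly twice as long” figure follows straight from the length of the Martian year, which is about 687 Earth days. A quick check (seasons on both planets actually vary in length because of orbital eccentricity, so this is only the average):

```python
MARS_YEAR_EARTH_DAYS = 687     # one trip around the sun for Mars
EARTH_YEAR_DAYS = 365.25

avg_mars_season = MARS_YEAR_EARTH_DAYS / 4    # ~172 Earth days
avg_earth_season = EARTH_YEAR_DAYS / 4        # ~91 Earth days
print(f"A Martian season averages {avg_mars_season / avg_earth_season:.1f}x an Earth season")
# prints roughly 1.9x, i.e. "roughly twice as long"
```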

A slice of the martian surface is shown here. A rounded segment of an eroded crater basin is visible to the right. The key features seen across the image are dark spots with tendrils that are eerily reminiscent of spiders. These are visible in large numbers to the left, and scattered irregularly across the rest of the image.
Spider-like features form when spring sunshine falls on layers of carbon dioxide deposited over the dark winter months. CREDIT: ESA/TGO/CaSSIS.

The emerging gas is full of dark dust and shoots up through cracks in the ice like a fountain or geyser. The dust then falls back down and settles on the surface, creating dark spots that range from 0.3 to 0.6 miles across. This same process creates the spider-shaped patterns that are etched beneath the ice.

The image was captured by the CaSSIS instrument aboard the ESA’s ExoMars Trace Gas Orbiter (TGO). CaSSIS stands for Colour and Stereo Surface Imaging System, and it was built at the University of Bern in Switzerland. It creates high-resolution images designed to complement data collected on Mars. It is made up of a telescope and focal plane system mounted on a rotation mechanism, plus three electronics units that relay images back to the ESA.

Mars’ mysterious Inca City

Most of the spots in this new image are seen on the outskirts of Angustus Labyrinthus–more commonly called Inca City. NASA’s Mariner 9 probe first spotted Inca City in 1972, and its geometric-looking network of ridges reminded astronomers of Inca ruins.

Scientists are still not sure exactly how Inca City formed. It may be sand dunes that have turned to stone over millennia. Materials like magma or sand could also be seeping through cracked sheets of Martian rock. The ridges could also be winding structures related to glaciers called eskers. 

This rectangular image shows part of the martian surface as if the viewer is looking down and across the landscape, with the irregular, mottled ground appearing in swirled tones of brown and tan.
This oblique perspective view looks across a part of Mars nicknamed Inca City (formally named Angustus Labyrinthus). The reason for this is no mystery, with the linear network of ridges being reminiscent of Inca ruins. Traces of features known as ‘spiders’ can be seen; these small, dark features form as carbon dioxide gas warms up in sunlight and breaks through slabs of overlying ice. CREDIT: ESA/DLR/FU Berlin

Inca City also appears to be part of a large circle–about 53 miles in diameter. Scientists believe that the formation sits within a large crater that may have taken shape as a rock from space crashed into Mars’ surface. The impact likely caused faults to ripple through the surrounding plain. The faults were then filled with rising lava and have worn away over time.

Towards the middle section of the image the landscape changes somewhat, with large roundish and oval swirls creating an effect reminiscent of marble. This effect is thought to occur when layered deposits are worn away over time.

[Related: Scientists brought ‘Mars spiders’ to Earth—here’s how.]

A few prominent steep, flat-topped mounds and hills stand almost 5,000 feet above the surrounding terrain. These mounds form as softer material is eroded by wind, water, or ice. The harder material left behind forms these hills. Some signs of the ‘spiders’ are scattered across the dust-covered plateaus, lurking amongst various canyons and troughs.

The data for these images was captured on October 4, 2020 during Mars’ most recent spring. The Red Planet is currently in its autumn and its next spring equinox will be on November 12, 2024.

The post Gassy geysers create ‘spiders’ on Mars appeared first on Popular Science.

Astronomers fight to save a major X-ray space telescope https://www.popsci.com/science/save-chandra/ Thu, 25 Apr 2024 14:00:00 +0000 https://www.popsci.com/?p=612487
In about 5 billion years, our Sun will run out of fuel and expand, possibly engulfing Earth. These end stages of a star’s life can be utterly beautiful as is the case with this planetary nebula called the Helix Nebula. Astronomers study these objects by looking at all kinds of light, including X-rays that the Chandra X-ray Observatory sees.
In about 5 billion years, our Sun will run out of fuel and expand, possibly engulfing Earth. These end stages of a star’s life can be utterly beautiful as is the case with this planetary nebula called the Helix Nebula. Astronomers study these objects by looking at all kinds of light, including X-rays that the Chandra X-ray Observatory sees. X-ray: NASA/CXC/SAO; UV: NASA/JPL-Caltech/SSC; Optical: NASA/ STScI/M. Meixner, ESA/NRAO/T.A. Rector; Infrared:NASA/JPL-Caltech/K. Su; Image Processing: NASA/CXC/SAO/N. Wolk and K. Arcand

Chandra X-Ray Observatory is in trouble. Losing it could set the field back decades.

One of NASA’s Great Observatories may soon meet an untimely demise. The Chandra X-Ray Observatory—an orbiting telescope launched in 1999 aboard Space Shuttle Columbia—is facing a major financial threat in NASA’s latest budget proposal. Major cuts to its funding could lead to layoffs for half of the observatory’s staff by October and, according to concerned scientists, a premature end to the mission around 2026. Astronomers are worried that losing a telescope so crucial to our studies of the high-energy cosmos could set the field back by decades. 

In an open letter, a group of astronomers claimed that Chandra “is capable of many more years of operation and scientific discovery” and that a “reduction of the budget of our flagship X-ray mission will have an outsized impact on both U.S. high-energy astrophysics research and the larger astronomy and astrophysics community.”

“It’s a huge monetary and environmental toll to put an observatory up in space, so I think it’s really important to value that and to not treat these instruments as disposable,” adds Samantha Wong, an astronomer at McGill University. “People outside of astronomy contribute to the cost of these instruments (both literally and in terms of environmental and satellite pollution), so it’s in everyone’s best interest that we use Chandra to the full extent it’s capable of.” 

Chandra launched in 1999 as one of NASA’s Great Observatories, alongside the optical and ultraviolet Hubble Space Telescope, the infrared Spitzer Space Telescope (decommissioned in 2020), and the Compton Gamma Ray Observatory (the shortest lived of the bunch, ending in 2000). Much like the powerhouse Hubble, Chandra was initially meant to operate for five years—but its enduring excellent performance has cemented it as a pillar of astronomy research for the past two and a half decades. Although any piece of equipment will naturally degrade over time, Chandra continues to return excellent scientific results, and a recent NASA senior review deemed it “the most powerful X-ray facility in orbit,” with the potential to keep going for another decade until it runs out of fuel, as long as the team on the ground can continue operating it.

Eventually, our Sun will run out of fuel and die (though not for another 5 billion years). As it does, it will become like the object seen here, the Cat’s Eye Nebula, which is a planetary nebula. A fast wind from the remaining stellar core rams into the ejected atmosphere and pushes it outward, creating wispy structures seen in X-rays by Chandra and optical light by the Hubble Space Telescope.
Eventually, our Sun will run out of fuel and die (though not for another 5 billion years). As it does, it will become like the object seen here, the Cat’s Eye Nebula, which is a planetary nebula. A fast wind from the remaining stellar core rams into the ejected atmosphere and pushes it outward, creating wispy structures seen in X-rays by Chandra and optical light by the Hubble Space Telescope. Credit: X-ray: NASA/CXC/SAO; Optical: NASA/ESA/STScI; Image Processing: NASA/CXC/SAO/J. Major, L. Frattare, K. Arcand

Space telescopes are huge endeavors and marvels of engineering, and each one opens up a new window to the universe. Astronomy requires seeing the universe in multiple wavelengths of light, far beyond what human eyes can sense, from low-energy radio waves to the highest-energy gamma rays. “It’s hard to overstate how much we’re learning about the cosmos by just putting big pieces of glass in the sky,” said Harvard astronomer Grant Tremblay in a conversation with New York Rep. Joe Morelle.

[ Related: Where do all those colors in space telescope images come from? ]

In space, X-rays can tell us about the most explosive phenomena in the cosmos: supernovae, supermassive black holes, colliding neutron stars, and more. Chandra is one of a small number of telescopes—including the European XMM-Newton and Japanese XRISM—that can sense X-rays, the same high-energy light used to image human bones here on the ground. However, Chandra is unique even out of that small bunch, able to see in unparalleled detail. Observations from Chandra have also revealed fluorescence on planets in the solar system, and where mysterious dark matter lurks in a cluster of galaxies.
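To put “high-energy” in perspective, photon energy and wavelength are tied together by E = hc/λ. The quick conversion below assumes Chandra’s commonly quoted observing band of roughly 0.1 to 10 kiloelectronvolts; the corresponding wavelengths are thousands of times shorter than visible light.

```python
# Convert photon energy to wavelength: lambda = h * c / E
PLANCK = 6.626e-34      # J*s
LIGHT_SPEED = 2.998e8   # m/s
EV_TO_JOULE = 1.602e-19

def wavelength_nm(energy_kev: float) -> float:
    return PLANCK * LIGHT_SPEED / (energy_kev * 1e3 * EV_TO_JOULE) * 1e9

for energy in (0.1, 1.0, 10.0):  # roughly the soft-to-hard X-ray band Chandra covers
    print(f"{energy:>5} keV  ->  {wavelength_nm(energy):6.3f} nm")
# about 12 nm down to 0.12 nm, versus 400-700 nm for visible light
```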

This composite image shows the galaxy cluster 1E 0657-56, also known as the "bullet cluster." This cluster was formed after the collision of two large clusters of galaxies, the most energetic event known in the universe since the Big Bang.
This composite image shows the galaxy cluster 1E 0657-56, also known as the “bullet cluster.” This cluster was formed after the collision of two large clusters of galaxies, the most energetic event known in the universe since the Big Bang. Credit: X-ray: NASA/CXC/CfA/M.Markevitch et al.; Optical: NASA/STScI; Magellan/U.Arizona/D.Clowe et al.; Lensing Map: NASA/STScI; ESO WFI; Magellan/U.Arizona/D.Clowe et al.

Since the budget cuts were announced in March, astronomers have rallied together to #SaveChandra, compiling their case for the observatory into a website. “Together, Chandra and Hubble are amongst the most scientifically productive missions in the entire NASA Science portfolio,” reads the Save Chandra website. Astronomers also expect Chandra to be highly complementary to the famous JWST and the upcoming Rubin Observatory in Chile, which will scan nearly the entire sky every night. For example, Chandra can peer into the hearts of high-redshift galaxies seen by Webb, learning more about the supermassive black holes at their centers. It will also be crucial for follow up on the 10 million alerts that Rubin will generate each night by pinpointing short-lived, bright flashes from explosive celestial events.

Astronomers have also been advocating on social media, including sharing personal anecdotes of how important Chandra has been to their scientific careers, from a graduate student describing the importance of Chandra in her education to a professor reminiscing on her first research paper in 2001, which used Chandra data and led to 30 other papers related to the mission in her career. 

One of the biggest concerns in the community is the fact that there is no replacement for Chandra on the horizon. Its successor, the Lynx observatory, is “unlikely to launch before the 2050s” according to Dublin Institute of Advanced Studies astronomer Affelia Wibisono—that is, if it launches at all. NASA is considering a smaller X-ray probe mission (to be chosen from a few ideas, including STROBE-X or the Line Emission Mapper), but none of these concepts would fill the gap left by Chandra. Plus, the resulting layoffs from Chandra’s demise would lead to a huge loss of expertise in X-ray astronomy as jobless astronomers are forced to leave the field, undermining our ability to even do the science expected from Lynx and other future observatories. Without Chandra, “there’s little incentive or accessibility to doing high energy work for the next decade or so, which really depletes the field and makes it hard to retain momentum in the science that we’re doing,” adds Wong.

To learn more about the supernova explosion, scientists compared the Webb view of the pristine debris with X-ray maps of radioactive elements that were created in the supernova. They used NASA’s Nuclear Spectroscopic Telescope Array (NuSTAR) data to map radioactive titanium — still visible today — and Chandra to map where radioactive nickel was by measuring the locations of iron. Radioactive nickel decays to form iron. These additional images show NuSTAR in blue, Chandra in purple, Webb/Spitzer in gold and green, and Hubble in yellow. Credit: X-ray: NASA/CXC/SAO, NASA/JPL/Caltech/NuStar; Optical: NASA/STScI/HST; IR: NASA/STScI/JWST, NASA/JPL/CalTech/SST; Image Processing: NASA/CXC/SAO/J. Schmidt, K. Arcand, and J. Major
To learn more about the supernova explosion, scientists compared the Webb view of the pristine debris with X-ray maps of radioactive elements that were created in the supernova. They used NASA’s Nuclear Spectroscopic Telescope Array (NuSTAR) data to map radioactive titanium—still visible today—and Chandra to map where radioactive nickel was by measuring the locations of iron. Radioactive nickel decays to form iron. These additional images show NuSTAR in blue, Chandra in purple, Webb/Spitzer in gold and green, and Hubble in yellow. Credit: X-ray: NASA/CXC/SAO, NASA/JPL/Caltech/NuStar; Optical: NASA/STScI/HST; IR: NASA/STScI/JWST, NASA/JPL/CalTech/SST; Image Processing: NASA/CXC/SAO/J. Schmidt, K. Arcand, and J. Major

With scientists so dedicated to the success of this mission, then, why would it be canceled by NASA?

Chandra’s plight is a symptom of a larger issue: ongoing cuts to science funding in the United States, partially resulting from the Fiscal Responsibility Act of 2023 that limited non-defense spending. NASA’s budget was cut by 2% in the 2024 fiscal year, a stark change from President Biden’s recent request for a 7% increase to their funding. Essentially, NASA leadership has been forced between a rock and a hard place—who would want to choose between one incredible discovery machine and another nearly equally amazing? “We acknowledge that we are operating in a challenging budget environment, and we always want to do the most science we possibly can,” explained NASA Astrophysics director Mark Clampin.

This isn’t the only budget threat facing astronomy this year, either. NASA’s Jet Propulsion Lab in Pasadena, CA was forced to make major layoffs in February, and the highly ambitious Mars Sample Return program (intended to bring rocks collected by the Perseverance rover, currently on the Red Planet, back home to Earth) is undergoing major restructuring after its original budget was deemed unrealistic. Even ground-based astronomy is faced with hard choices, as the National Science Foundation is now forced to decide on one next-generation telescope instead of the two originally planned for construction.

Astronomy, however, is a mere sliver of the U.S.’s overall budget, and many people are hoping for a future where we can fund more of these scientific endeavors. “Science has such absurdly high national and global return on investment that you can easily advocate for the whole discovery portfolio,” wrote Tremblay in a post on X. Astronomers have also highlighted the importance of astronomy missions for inspiring the next generation of scientists, and keeping the public interested in science overall. “Continuing to operate Chandra would symbolize a renewed dedication to setting big goals,” says West Virginia University astronomer Graham Doskoch. “That’s an idea that has relevance for everyone.”

So, what comes next for the great X-ray observatory? NASA is currently planning a “mini-review” to decide how to best operate Chandra under the new budget constraints, ideally hoping to scale back operations without completely shuttering the program. Meanwhile, Chandra advocates are encouraging people to talk to their government representatives, sign a community letter, and spread the word on social media with resources available on the Save Chandra Website.

“You want to #SaveChandra? The high impact way to do that is to reach out to your representatives and senators,” wrote astronomer Laura Lopez on X. In this critical moment for the future of astronomy, now all eyes are on Congress to see how the budget shakes out in the coming months and years.

The post Astronomers fight to save a major X-ray space telescope appeared first on Popular Science.

NASA HQ picked their best photos of the year. Here are our 13 favorites. https://www.popsci.com/science/best-nasa-hq-images-2023/ Wed, 24 Apr 2024 21:03:00 +0000 https://www.popsci.com/?p=612255
three photos: a sample of rock on a microscope slide, rocket boosters firing, a rocket from below
2023 was a stellar year for NASA images. Upper-right, clockwise: NASA/Keegan Barber, NASA/Bill Ingalls, NASA/Bill Ingalls

From space rock chunks to rocket launches, NASA had a big year.

On September 24, 2023, a capsule from NASA’s OSIRIS-REx mission floated back to Earth, landing safely in the Utah desert. The mission marked the first time the U.S. brought back a piece of an asteroid and was a big moment for the space agency. The history-making success of OSIRIS-REx features prominently in NASA HQ’s Best of 2023 photos album, recently curated and shared on Flickr.

Other milestone moments documented in the photos include the Psyche spacecraft launch, the SpaceX Dragon Endurance landing, and the Earthly return of Frank Rubio, the record holder for longest single spaceflight by a U.S. astronaut.

NASA HQ shared 100 photographs, but we’ve selected our 13 favorites.

a large rocket shot from underneath
The Soyuz rocket is seen after being rolled out by train to the launch pad at Site 31, Tuesday, Sept. 12, 2023, at the Baikonur Cosmodrome in Kazakhstan. Expedition 70 NASA astronaut Loral O’Hara, Roscosmos cosmonauts Oleg Kononenko, and Nikolai Chub are scheduled to launch aboard their Soyuz MS-24 spacecraft on Sept. 15. Image: NASA/Bill Ingalls
a white and orange parachute with a capsule attached flies down from the sky into a sandy landscape
A training model of the sample return capsule is seen during a drop test in preparation for the retrieval of the sample return capsule from NASA’s OSIRIS-REx mission, Wednesday, Aug. 30, 2023, at the Department of Defense’s Utah Test and Training Range. The sample was collected from asteroid Bennu in October 2020 by NASA’s OSIRIS-REx spacecraft and will return to Earth on September 24th, landing under parachute at the Utah Test and Training Range. Image: NASA/Keegan Barber
two people watch a space capsule load onto a ship
Support teams raise the SpaceX Dragon Endeavour spacecraft aboard the recovery ship MEGAN shortly after it landed with NASA astronauts Stephen Bowen and Warren “Woody” Hoburg, UAE (United Arab Emirates) astronaut Sultan Alneyadi, and Roscosmos cosmonaut Andrey Fedyaev aboard in the Atlantic Ocean off the coast of Jacksonville, Florida, Monday, Sept. 4, 2023. Bowen, Hoburg, Alneyadi, and Fedyaev are returning after nearly six months in space as part of Expedition 69 aboard the International Space Station. Image: NASA/Joel Kowsky
a streak of white light on a black sky
In this eight-minute long exposure, a SpaceX Falcon 9 rocket carrying the company’s Dragon spacecraft is launched on NASA’s SpaceX Crew-7 mission to the International Space Station with NASA astronaut Jasmin Moghbeli, ESA (European Space Agency) astronaut Andreas Mogensen, Japan Aerospace Exploration Agency (JAXA) astronaut Satoshi Furukawa, and Roscosmos cosmonaut Konstantin Borisov onboard, Saturday, Aug. 26, 2023, at NASA’s Kennedy Space Center in Florida. Also visible in this image is the entry burn and landing burn, at right, conducted by the first stage of the Falcon 9 rocket as it returned to Landing Zone 1 at Cape Canaveral Space Force Station. NASA’s SpaceX Crew-7 mission is the seventh crew rotation mission of the SpaceX Dragon spacecraft and Falcon 9 rocket to the International Space Station as part of the agency’s Commercial Crew Program. Moghbeli, Mogensen, Furukawa, and Borisov launched at 3:27 a.m. EDT from Launch Complex 39A at the Kennedy Space Center to begin a six month mission aboard the orbital outpost. Image: NASA/Joel Kowsky
fire shoots out of a rocket
The Soyuz rocket is launched with Expedition 70 NASA astronaut Loral O’Hara, and Roscosmos cosmonauts Oleg Kononenko and Nikolai Chub on Friday, Sept. 15, 2023, at the Baikonur Cosmodrome in Kazakhstan. Image: NASA/Bill Ingalls
two pilots from inside a cockpit
Members of the Kennedy Space Center (KSC) Flight Operations team are seen operating a helicopter as the sample return capsule from NASA’s OSIRIS-REx mission is en route to the cleanroom, Sunday, Sept. 24, 2023, shortly after the capsule landed at the Department of Defense’s Utah Test and Training Range. The sample was collected from the asteroid Bennu in October 2020 by NASA’s OSIRIS-REx spacecraft. Image: NASA/Keegan Barber
photographers stand behind a glass wall as people in white hazmat suits touch a capsule
Curation teams process the sample return capsule from NASA’s OSIRIS-REx mission in a cleanroom, Sunday, Sept. 24, 2023, at the Department of Defense’s Utah Test and Training Range. The sample was collected from the asteroid Bennu in October 2020 by NASA’s OSIRIS-REx spacecraft. Image: NASA/Keegan Barber
a man in a space suit waves at the camera
Expedition 69 NASA astronaut Frank Rubio is helped out of the Soyuz MS-23 spacecraft just minutes after he and Roscosmos cosmonauts Sergey Prokopyev and Dmitri Petelin landed in a remote area near the town of Zhezkazgan, Kazakhstan on Wednesday, Sept. 27, 2023. The trio are returning to Earth after logging 371 days in space as members of Expeditions 68-69 aboard the International Space Station. For Rubio, the mission is the longest single spaceflight by a U.S. astronaut in history. Image: NASA/Bill Ingalls
on a cloudy day, a rocket launches as fire shoots out. a bird flies in the foreground
A SpaceX Falcon Heavy rocket with the Psyche spacecraft onboard is launched from Launch Complex 39A, Friday, Oct. 13, 2023, at NASA’s Kennedy Space Center in Florida. NASA’s Psyche spacecraft will travel to a metal-rich asteroid by the same name orbiting the Sun between Mars and Jupiter to study its composition. The spacecraft also carries the agency’s Deep Space Optical Communications technology demonstration, which will test laser communications beyond the Moon. Image: NASA/Aubrey Gemignani
a tiny speck of rock on a microscope slide
A sample from asteroid Bennu is seen prepared on a microscope slide, Friday, Nov. 3, 2023, at the Smithsonian’s National Museum of Natural History in Washington. The sample was collected from the carbon-rich near-Earth asteroid Bennu in October 2020 by NASA’s OSIRIS-REx spacecraft. Image: NASA/Keegan Barber
a black capsule sits on desert sand near shrub brush
The sample return capsule from NASA’s OSIRIS-REx mission is seen shortly after touching down in the desert, Sunday, Sept. 24, 2023, at the Department of Defense’s Utah Test and Training Range. The sample was collected from the asteroid Bennu in October 2020 by NASA’s OSIRIS-REx spacecraft. Image: NASA/Keegan Barber
a spacecraft reading "NASA" at night with a dark sky and bright moon
The Moon and the star Antares are seen in the sky above a SpaceX Falcon 9 rocket, topped with the company’s Dragon spacecraft, on the launch pad at Launch Complex 39A, Thursday, Aug. 24, 2023, at NASA’s Kennedy Space Center in Florida. NASA’s SpaceX Crew-7 mission is the seventh crew rotation mission of the SpaceX Crew Dragon spacecraft and Falcon 9 rocket to the International Space Station as part of the agency’s Commercial Crew Program. NASA astronaut Jasmin Moghbeli, ESA (European Space Agency) astronaut Andreas Mogensen, Japan Aerospace Exploration Agency (JAXA) astronaut Satoshi Furukawa, and Roscosmos cosmonaut Konstantin Borisov are scheduled to launch at 3:27 a.m. EDT on Saturday, August 26, from Launch Complex 39A at the Kennedy Space Center. Image: NASA/Joel Kowsky
a spacecraft floats in the water as people in helmet look from boats
Support teams work around the SpaceX Dragon Endurance spacecraft shortly after it landed with NASA astronauts Nicole Mann and Josh Cassada, Japan Aerospace Exploration Agency (JAXA) astronaut Koichi Wakata, and Roscosmos cosmonaut Anna Kikina onboard in the Gulf of Mexico off the coast of Tampa, Florida, Saturday, March 11, 2023. Mann, Cassada, Wakata, and Kikina are returning after 157 days in space as part of Expedition 68 aboard the International Space Station. Image: NASA/Keegan Barber

The post NASA HQ picked their best photos of the year. Here are our 13 favorites. appeared first on Popular Science.

NASA will unfurl an 860-square-foot solar sail from within a microwave-sized cube https://www.popsci.com/science/nasa-solar-sail/ Wed, 24 Apr 2024 15:53:58 +0000 https://www.popsci.com/?p=612334
ACS3 solar sail concept art above Earth
This artist’s concept shows the Advanced Composite Solar Sail System spacecraft sailing in space using the energy of the sun. NASA/Aero Animation/Ben Schweighart

The highly advanced solar sail boom could one day allow spacecraft to travel without bulky rocket fuel.

NASA hitched a ride aboard Rocket Lab’s Electron launcher in New Zealand yesterday evening and is preparing to test a new, highly advanced solar sail design. Now in a sun-synchronous orbit roughly 600 miles above Earth, the agency’s Advanced Composite Solar Sail System (ACS3) will in the coming weeks deploy and showcase technology that could one day power deep-space missions without the need for any rocket fuel after launch.

The fundamentals behind solar sails aren’t in question. By capturing the gentle but constant pressure exerted by sunlight, thin reflective sheets can gradually propel a spacecraft to immense speeds, much as wind drives a sailboat. Engineers have demonstrated the principle before, but NASA’s new project will specifically showcase a promising boom design constructed of flexible composite polymer materials reinforced with carbon fiber.

Although delivered in a toaster-sized package, ACS3 will take less than 30 minutes to unfurl into an 860-square-foot sheet of ultrathin plastic anchored by its four accompanying 23-foot-long booms. These poles, once deployed, function as sailboat booms, and will keep the sheet taut enough to capture solar energy.

[Related: How tiny spacecraft could ‘sail’ to Mars surprisingly quickly.]

But what makes the ACS3 booms so special is how they are stored. Any solar sail’s boom system will need to remain stiff enough through harsh temperature fluctuations, as well as durable enough to last through lengthy mission durations. Scaled-up solar sails, however, will be pretty massive—NASA is currently planning future designs as large as 5,400 square feet, or roughly the size of a basketball court. These sails will need extremely long boom systems that won’t necessarily fit in a rocket’s cargo hold.

To solve this, NASA rolled up its new composite material booms into a package roughly the size of an envelope. When ready, engineers will use an extraction system similar to a tape spool to uncoil the booms, an approach meant to minimize potential jamming. Once in place, the booms will anchor the microscopically thin solar sail as onboard cameras record the entire process.

NASA hopes the project will allow them to evaluate their new solar sail design while measuring how its resulting thrust influences the tiny spacecraft’s low-Earth orbit. Meanwhile, engineers will assess the resiliency of their novel composite booms, which are 75 percent lighter and designed to offer 100 times less shape distortion than any previous solar sail boom prototype.

Don’t expect the ACS3 experiment to go soaring off into space, though. After an estimated two-month initial flight and subsystem testing phase, ACS3 will conduct a weeks-long test of its ability to raise and lower the CubeSat’s orbit. It’s a lot of work to harness a solar force NASA says is equivalent to the weight of a paperclip in your palm. Still, if ACS3’s sail and boom system is successful, it could lead towards scaling up the design enough to travel across the solar system.
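For a rough sense of how small that force is, the standard radiation-pressure estimate for a flat, sun-facing, perfectly reflective sail at Earth’s distance from the sun works out to well under a millinewton for an 860-square-foot sheet. The numbers below are textbook values, not NASA’s own figures, and they ignore sail angle and imperfect reflectivity:

```python
# Thrust on an ideal, sun-facing reflective sail at 1 AU: F = 2 * (S / c) * A
SOLAR_CONSTANT = 1361.0    # W/m^2 at Earth's distance from the sun
LIGHT_SPEED = 2.998e8      # m/s
AREA_M2 = 860 * 0.0929     # 860 square feet is roughly 80 square meters

thrust_newtons = 2 * (SOLAR_CONSTANT / LIGHT_SPEED) * AREA_M2
print(f"~{thrust_newtons * 1000:.2f} millinewtons of thrust")
# well under a thousandth of a newton, tiny but applied continuously with no propellant
```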

The post NASA will unfurl an 860-square-foot solar sail from within a microwave-sized cube appeared first on Popular Science.

Greetings, Earth! NASA can understand Voyager 1 again https://www.popsci.com/science/voyager-back/ Tue, 23 Apr 2024 14:08:21 +0000 https://www.popsci.com/?p=612013
An artist’s concept of NASA’s Voyager 1 traveling through interstellar space–or the space between stars.
An artist’s concept of NASA’s Voyager 1 traveling through interstellar space–or the space between stars. NASA/JPL-Caltech

The 46-year-old space probe is making sense for the first time in five months after remote repairs.

For the first time since November 2023, NASA is receiving meaningful communication from its Voyager 1 probe. The agency has spent months troubleshooting a glitch that caused the famed probe to send home messages that looked like garbled gibberish rather than scientific data. The probe is now coherent, but according to NASA, the next step is to enable Voyager 1 to begin returning usable science information again.

[Related: Voyager 1 is sending back bad data, but NASA is on it.]

Voyager 1 and its twin, Voyager 2, are the only spacecraft ever to fly in interstellar space–the region between stars beyond the influence of the sun. Both probes launched in 1977. Their mission initially centered on detailed observations of Jupiter and Saturn, but it continued on to explore the outer reaches of the solar system. Voyager 1 became the first spacecraft to enter interstellar space in 2012, and Voyager 2 followed in 2018.

On November 14, 2023, Voyager 1 stopped sending readable science and engineering data back to Earth for the first time. Mission controllers could tell that the spacecraft was still receiving their commands and otherwise operating normally, so they were not sure why it was sending back such incoherent information. In March, the Voyager engineering team at NASA’s Jet Propulsion Laboratory (JPL) confirmed that the issue was related to one of the spacecraft’s three onboard computers, called the flight data subsystem (FDS). The FDS packages science and engineering data before it’s sent to Earth so that NASA can use it.

The team pinpointed the code responsible for packaging the spacecraft’s engineering data. The glitch was confined to a single chip representing around 3 percent of the FDS memory, according to Space. Since the chip itself could not be repaired, JPL engineers migrated the affected code to other portions of the FDS memory on April 18. This required splitting the code into several sections and storing them at multiple locations in the FDS. The code was then adjusted to work from those multiple locations as one cohesive process, and references to its new locations were updated.
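The bookkeeping problem here is easy to picture, even though the real fix involved patching decades-old FDS flight software. Purely as a conceptual sketch (none of this is Voyager’s actual code, memory map, or tooling), splitting a block that no longer fits in one healthy region across several free regions and then patching references might look like:

```python
def relocate(block, free_regions):
    """block: list of (old_address, word); free_regions: list of (start_address, size).

    Scatter the block across the free regions and record where every word went,
    so that anything referring to an old address can be pointed at the new one.
    """
    table = {}      # old address -> new address
    placed = []     # (new address, word)
    remaining = list(block)
    for start, size in free_regions:
        chunk, remaining = remaining[:size], remaining[size:]
        for offset, (old_addr, word) in enumerate(chunk):
            table[old_addr] = start + offset
            placed.append((start + offset, word))
    if remaining:
        raise ValueError("not enough healthy memory to hold the relocated code")
    return placed, table

def patch_references(placed, table):
    """Rewrite any word that is itself an old address so it points to the new location."""
    return [(addr, table.get(word, word)) for addr, word in placed]

# Toy example: a six-word block at addresses 100-105, where two of the words
# (103 and 100) are internal references, split across two free regions.
code = list(zip(range(100, 106), [0, 103, 7, 100, 2, 9]))
placed, table = relocate(code, free_regions=[(200, 4), (300, 3)])
print(patch_references(placed, table))
```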

“When the mission flight team heard back from the spacecraft on April 20, they saw that the modification worked: For the first time in five months, they have been able to check the health and status of the spacecraft,” NASA wrote in an update on April 22.

[Related: When Voyager 1 goes dark, what comes next?]

As of now, the usable data returned relates to how the spacecraft’s engineering systems are working. The team plans more software repair work over the next several weeks so that Voyager 1 can once again send readable science data about the outer reaches of the solar system. Voyager 2, meanwhile, is still operating normally.

The post Greetings, Earth! NASA can understand Voyager 1 again appeared first on Popular Science.

FAA now requires reentry license to prevent spacecraft getting stuck up there https://www.popsci.com/science/space-reentry-license-faa/ Mon, 22 Apr 2024 19:25:50 +0000 https://www.popsci.com/?p=611946
round earth
The FAA said that it won’t allow “reentry vehicles” to launch into space without a license to return back to Earth. DepositPhotos

If what goes up must come down, you’ll need a license for that.

What happens if you design a spacecraft to survive reentry, but launch without a green light from regulators to bring it back down? As we saw with Varda Space Industries, which fired a capsule into orbit last spring to make stuff in zero gravity, you might have to park in orbit until your Federal Aviation Administration paperwork is complete.

In a new notice issued April 17 and effective immediately, the FAA indicates that it’s looking to avoid repeats of the Varda saga; the company successfully landed its capsule in Utah back in February after a roughly seven-month delay. Varda aimed to grow Ritonavir crystals in space, taking advantage of the environment to potentially improve the efficacy of the HIV antiviral drug.

Varda Space Industries’ spacecraft, W-1, successfully landed at the Utah Test and Training Range on February 21, 2024. This marks the first time a commercial company has landed a spacecraft on United States soil. Credit: Varda Space Industries.

Without citing the incident directly, the agency said that it won’t allow “reentry vehicles” to launch without a license to return. In other words, if a company plans to bring its vehicle back, it can’t send one into space in the first place unless the FAA has preemptively deemed its reentry plans safe. The agency said it analyzes the impact vehicles may have on public health, property, and national security before issuing reentry licenses.

Without pre-approval, the FAA argues, critical systems could fail or the vehicle might run out of propellant or power before regulators and reentry operators get all their ducks in a row. The agency says it reviews numerous details that are self-disclosed by reentry operators, including the payload’s weight, the amount of hazardous materials present, the “explosive potential of payload materials,” and the planned reentry site.

Varda emphasized earlier this month that it received launch approval last year and complied with all regulatory requirements to do so. In a statement to SpaceNews, FAA associate administrator Kelvin Coleman said the agency learned “some lessons” when it approved the company to launch without a reentry license.  

As spaceflight evolves, returnable vehicles require special attention to mitigate collisions with people and property on the ground, the FAA said in its notice. “Unlike typical payloads designed to operate in outer space, a reentry vehicle has primary components that are designed to withstand reentry substantially intact and therefore have a near-guaranteed ground impact,” the FAA wrote. 

[ Related: Yes, a chunk of the space station crashed into a house in Florida ]

The post FAA now requires reentry license to prevent spacecraft getting stuck up there appeared first on Popular Science.

Smooth lava lake on Jupiter’s moon sizzles in NASA aerial animations https://www.popsci.com/science/laval-lake-io-moon-jupiter/ Fri, 19 Apr 2024 13:24:44 +0000 https://www.popsci.com/?p=611705
an artist's illustration of a lava lake on one of jupiter's moons. it is primarily black as the magma has cooled with orange lava encircling it
An artist’s concept of a lava lake on Jupiter’s moon Io called Loki Patera. The image was made using data from the JunoCam imager aboard NASA’s Juno spacecraft. Loki is a depression filled with magma and rimmed with molten lava. NASA/JPL-Caltech/SwRI/MSSS

'Io is simply littered with volcanoes.'

NASA’s Juno mission scientists have used complex data collected during two flybys of Jupiter’s third largest moon Io to create animations that highlight this moon’s most dramatic features. Io is a little bit larger than Earth’s moon and is home to both a striking mountain and a smooth lake of lava. Lava lakes like Io’s Loki Patera have a cooling surface crust that slowly thickens until it becomes denser than the underlying magma. It then sinks and pulls in the nearby crust.
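One simple way to picture that overturn: solid rock is denser than its own melt, but a fresh crust is riddled with gas bubbles, so its bulk density starts out low. As the crust thickens and the bubbly fraction shrinks, its bulk density climbs past the magma’s and the slab founders. The densities below are generic illustrative values for silicate lava, not measurements of Loki Patera:

```python
RHO_SOLID_CRUST = 3000.0   # kg/m^3, fully dense solidified lava (illustrative)
RHO_MAGMA = 2650.0         # kg/m^3, underlying melt (illustrative)

def crust_sinks(vesicle_fraction: float) -> bool:
    """Does a crust with this fraction of gas bubbles outweigh the melt below it?"""
    bulk_density = (1 - vesicle_fraction) * RHO_SOLID_CRUST
    return bulk_density > RHO_MAGMA

for bubbles in (0.30, 0.20, 0.10, 0.05):
    print(f"{bubbles:.0%} bubbles -> sinks? {crust_sinks(bubbles)}")
# with these numbers the crust founders once bubbles make up less than ~12% of its volume
```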

First launched in 2011, Juno arrived at our solar system’s largest planet in 2016 with a mission to explore the Jovian system. Jupiter has 95 known moons, and its four largest–Io, Europa, Ganymede, and Callisto–are called the Galilean moons. Io is the most volcanically active of them.

This animation is an artist’s concept of Loki Patera, a lava lake on Jupiter’s moon Io, made using data from the JunoCam imager aboard NASA’s Juno spacecraft. With multiple islands in its interior, Loki is a depression filled with magma and rimmed with molten lava. CREDIT: NASA/JPL-Caltech/SwRI/MSSS.

“Io is simply littered with volcanoes, and we caught a few of them in action,” Juno’s principal investigator Scott Bolton said in a statement. “We also got some great close-ups and other data on a 200-kilometer-long [127-mile-long] lava lake called Loki Patera. There is amazing detail showing these crazy islands embedded in the middle of a potentially magma lake rimmed with hot lava. The specular reflection our instruments recorded of the lake suggests parts of Io’s surface are as smooth as glass, reminiscent of volcanically created obsidian glass on Earth.”

The observations were announced April 16 during the European Geophysical Union General Assembly in Vienna, Austria.

[Related: See the most volcanic world in our solar system in new NASA images.]

Juno conducted very close flybys of Io in December 2023 and February 2024, getting within 930 miles of the surface. The spacecraft obtained the first close-up images of Io’s northern latitudes. Maps created with data collected by Juno’s Microwave Radiometer (MWR) instrument show that Io’s surface is smoother than those of Jupiter’s other Galilean moons, but that its poles are colder than its middle latitudes.

Created using data collected by the JunoCam imager aboard NASA’s Juno during flybys in December 2023 and February 2024, this animation is an artist’s concept of a feature on the Jovian moon Io that the mission science team nicknamed “Steeple Mountain.” CREDIT: NASA/JPL-Caltech/SwRI/MSSS

Mountains and polar cyclones

With every pass, Juno flies closer to the north pole of Jupiter. Changing the spacecraft’s orientation allows the MWR instrument to improve its resolution of Jupiter’s northern polar cyclones. These storms at the top of the gas giant can reach wind speeds of 220 miles per hour and the data collected by Juno reveals that not all polar cyclones are created equal.

“Perhaps [the] most striking example of this disparity can be found with the central cyclone at Jupiter’s north pole,” Steve Levin, Juno’s project scientist at NASA’s Jet Propulsion Laboratory, said in a statement. “It is clearly visible in both infrared and visible light images, but its microwave signature is nowhere near as strong as other nearby storms. This tells us that its subsurface structure must be very different from these other cyclones. The MWR team continues to collect more and better microwave data with every orbit, so we anticipate developing a more detailed 3D map of these intriguing polar storms.”

swirling red cyclones on the planet jupiter
NASA’s Juno spacecraft captured infrared images that astronomers combined to create this picture of Jupiter’s north pole, showing a central cyclone and the eight cyclones that encircle it. Data indicate that the storms are enduring features at the pole, with each circumpolar cyclone almost as wide as the distance between Naples, Italy, and New York City in the United States. Wind speeds in the storms can reach 220 miles per hour. The colors in this composite represent radiant heat; the yellow (thinner) clouds are about 9 degrees Fahrenheit and the dark red (thickest) are around –181 degrees Fahrenheit. CREDIT: NASA, Caltech, SwRI, ASI, INAF, JIRAM

Just how much water is on Jupiter? An enduring mystery

One of Juno’s primary science goals is to collect data that will help astronomers better understand Jupiter’s water abundance. However, the team isn’t looking for liquid water. Instead, they are studying Jupiter’s atmosphere to quantify the presence of the elements that make up water–oxygen and hydrogen. According to NASA, an accurate estimate of the oxygen and hydrogen present in Jupiter’s atmosphere is crucial to unlocking some of the underlying mysteries of how our solar system formed.

Jupiter was likely the first planet to form roughly 4.5 billion years ago. It also contains most of the gas and dust that wasn’t incorporated into the sun when the solar system formed. Water abundance also has important implications for Jupiter’s meteorology and internal structure.

[Related: Juno finally got close enough to Jupiter’s Great Red Spot to measure its depth.]

In 1995, NASA’s Galileo probe provided early data on the amount of water on Jupiter, but the data created more questions than answers. It showed that the gas giant’s atmosphere was unexpectedly hot and actually deprived of water—contrary to what computer models had initially indicated.

“The probe did amazing science, but its data was so far afield from our models of Jupiter’s water abundance that we considered whether the location it sampled could be an outlier. But before Juno, we couldn’t confirm,” said Bolton. “Now, with recent results made with MWR data, we have nailed down that the water abundance near Jupiter’s equator is roughly three to four times the solar abundance when compared to hydrogen. This definitively demonstrates that the Galileo probe’s entry site was an anomalously dry, desert-like region.”

[Related: Jupiter’s icy ocean worlds could be cool travel destinations in the future.]

The new results support the idea that sometime during the formation of our solar system, water-ice material may have been the source of heavy element enrichment, meaning the chemical elements heavier than hydrogen and helium that Jupiter accumulated. The planet’s formation remains puzzling, because Juno’s results suggest that water abundance near the gas giant’s core is very low. Just how abundant H2O is on the gas giant remains a mystery that the Juno mission could potentially solve.

What’s next for Juno

Data from the remainder of Juno’s mission could help determine how much water is on Jupiter in two ways. It could enable scientists to compare Jupiter’s water abundance near the polar regions to that near the equatorial region. It also may shed additional light on the structure of the planet’s dilute liquid core.

Juno’s most recent flyby of Io was on April 9, when the spacecraft came within about 10,250 miles of the moon’s surface. Its 61st flyby of Jupiter is scheduled for May 12, and the spacecraft will continue to explore the planet and its moons through September 2025.

The post Smooth lava lake on Jupiter’s moon sizzles in NASA aerial animations appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Where do all those colors in space telescope images come from? https://www.popsci.com/science/telescope-images/ Thu, 18 Apr 2024 14:28:10 +0000 https://www.popsci.com/?p=611438
Cassiopeia A (Cas A) is a supernova remnant located about 11,000 light-years from Earth in the constellation Cassiopeia. It spans approximately 10 light-years. This image uses data from Webb’s Mid-Infrared Instrument (MIRI) to reveal Cas A in a new light. This image combines various filters with the color red assigned to 25.5 microns (F2550W), orange-red to 21 microns (F2100W), orange to 18 microns (F1800W), yellow to 12.8 microns (F1280W), green to 11.3 microns (F1130W), cyan to 10 microns (F1000W), light blue to 7.7 microns (F770W), and blue to 5.6 microns (F560W).
Cassiopeia A (Cas A) is a supernova remnant located about 11,000 light-years from Earth in the constellation Cassiopeia. It spans approximately 10 light-years. This image uses data from Webb’s Mid-Infrared Instrument (MIRI) to reveal Cas A in a new light. This image combines various filters with the color red assigned to 25.5 microns (F2550W), orange-red to 21 microns (F2100W), orange to 18 microns (F1800W), yellow to 12.8 microns (F1280W), green to 11.3 microns (F1130W), cyan to 10 microns (F1000W), light blue to 7.7 microns (F770W), and blue to 5.6 microns (F560W). NASA, ESA, CSA, Danny Milisavljevic (Purdue University), Tea Temim (Princeton University), Ilse De Looze (UGent) / Image Processing: Joseph DePasquale (STScI)

How scientists make vibrant spectacles out of grayscale blobs.

The post Where do all those colors in space telescope images come from? appeared first on Popular Science.

]]>
Cassiopeia A (Cas A) is a supernova remnant located about 11,000 light-years from Earth in the constellation Cassiopeia. It spans approximately 10 light-years. This image uses data from Webb’s Mid-Infrared Instrument (MIRI) to reveal Cas A in a new light. This image combines various filters with the color red assigned to 25.5 microns (F2550W), orange-red to 21 microns (F2100W), orange to 18 microns (F1800W), yellow to 12.8 microns (F1280W), green to 11.3 microns (F1130W), cyan to 10 microns (F1000W), light blue to 7.7 microns (F770W), and blue to 5.6 microns (F560W).
Cassiopeia A (Cas A) is a supernova remnant located about 11,000 light-years from Earth in the constellation Cassiopeia. It spans approximately 10 light-years. This image uses data from Webb’s Mid-Infrared Instrument (MIRI) to reveal Cas A in a new light. This image combines various filters with the color red assigned to 25.5 microns (F2550W), orange-red to 21 microns (F2100W), orange to 18 microns (F1800W), yellow to 12.8 microns (F1280W), green to 11.3 microns (F1130W), cyan to 10 microns (F1000W), light blue to 7.7 microns (F770W), and blue to 5.6 microns (F560W). NASA, ESA, CSA, Danny Milisavljevic (Purdue University), Tea Temim (Princeton University), Ilse De Looze (UGent) / Image Processing: Joseph DePasquale (STScI)

We’ve all seen beautiful images of outer space, with vivid swirls and bright stars resting on a black abyss. With how quick it is to snap a color photo on an iPhone, you might think that sophisticated space telescopes churn out color photos automatically, too. 

However, no digital camera—from your phone to the James Webb Space Telescope—can actually see in color. Digital cameras record images as a bunch of ones and zeros, counting the amount of light hitting their sensors. Each pixel has a colored filter over it (either red, green, or blue), which only allows specific wavelengths of light to go through. The filters are arranged in a specific pattern (typically a four-pixel repeating square known as the Bayer pattern), which allows the camera’s computing hardware to combine the captured data into a full-color image. Some digital cameras instead spread the colored filters across three individual sensors, whose data can similarly be combined into a full-color image. Telescope cameras, however, take images through one filter at a time, so the separate exposures must later be combined by experts into a composite image.
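
To make the camera side of this concrete, here is a minimal sketch in Python with NumPy of how one full-color pixel could be recovered from each 2x2 tile of a Bayer mosaic. The sensor readout and layout below are made up for illustration; real cameras use far more sophisticated interpolation than this.

```python
import numpy as np

# Hypothetical 4x4 raw sensor readout (light counts per pixel),
# assuming a Bayer layout that repeats every 2x2 tile:
#   R G
#   G B
raw = np.array([
    [120,  60,  90,  55],
    [ 58,  30,  52,  25],
    [110,  65,  95,  50],
    [ 62,  28,  57,  22],
], dtype=float)

def simple_demosaic(mosaic):
    """Crude demosaic: one RGB pixel per 2x2 Bayer tile.

    Real cameras interpolate between neighboring tiles instead of
    collapsing them, but the principle is the same: combine
    measurements made through different color filters.
    """
    rgb = np.zeros((mosaic.shape[0] // 2, mosaic.shape[1] // 2, 3))
    rgb[..., 0] = mosaic[0::2, 0::2]                             # red-filtered pixels
    rgb[..., 1] = (mosaic[0::2, 1::2] + mosaic[1::2, 0::2]) / 2  # average of the two greens
    rgb[..., 2] = mosaic[1::2, 1::2]                             # blue-filtered pixels
    return rgb

print(simple_demosaic(raw).shape)  # (2, 2, 3): a tiny full-color image
```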


Approximate outlines help to define the features in the Sagittarius C (Sgr C) region. Astronomers are studying data from NASA’s James Webb Space Telescope to understand the relationship between these features, as well as other influences in the chaotic galaxy center. Credits: NASA, ESA, CSA, STScI, Samuel Crowe (UVA)
Approximate outlines help to define the features in the Sagittarius C (Sgr C) region. Astronomers are studying data from NASA’s James Webb Space Telescope to understand the relationship between these features, as well as other influences in the chaotic galaxy center. Credits: NASA, ESA, CSA, STScI, Samuel Crowe (UVA)

In our smartphones, the combination of layers happens incredibly fast—but telescopes are complicated scientific behemoths, and it takes a bit more effort to get the stunning results we know and love. Plus, when we’re looking at the cosmos, astronomers use wavelengths of light that our eyes can’t even see (e.g. infrared and X-rays), so those also need to be represented with colors in the rainbow. There are lots of decisions to be made about how to colorize space images, which raises the question: who is making these images, and how do they make them?

For the spectacular results we’ve been seeing from JWST, processing scientific data into beautiful color images is actually a full-time job. Science visualization specialists at the Space Telescope Science Institute in Baltimore stack images together and stitch observations from different instruments on the telescope. They also remove artifacts, or things in the image that aren’t actually real, but instead just results of the telescope equipment and how digital data is processed. These could be streaks from stray cosmic rays, oversaturation of the brightest stars, or noise from the detector itself. 

Black and white to color

Before they even think about color, these specialists need to balance out the dark and light values in the image. Scientific cameras are meant to record a wide range of brightnesses beyond what our eyes can pick up on. This means that the raw images from telescopes often look very dark to our eyes, and you have to brighten up the image to see anything.
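
As an illustration of that brightening step, here is a minimal sketch of one common approach, an arcsinh stretch, assuming the raw exposure is simply a NumPy array of counts. The synthetic data below is invented for the example; real pipelines work from calibrated FITS files and choose their scaling much more carefully.

```python
import numpy as np

def asinh_stretch(raw_counts, softening=10.0):
    """Map raw counts spanning a huge dynamic range to 0-1 display values.

    The arcsinh curve is roughly linear for faint pixels and logarithmic
    for bright ones, so dim structure becomes visible while bright stars
    are compressed rather than clipped.
    """
    scaled = np.arcsinh(raw_counts / softening)
    return scaled / scaled.max()

# Hypothetical exposure: a faint background plus one very bright star.
raw = np.random.poisson(lam=5.0, size=(256, 256)).astype(float)
raw[128, 128] = 50_000.0

display = asinh_stretch(raw)
print(display.min(), display.max())  # everything now squeezed into 0-1
```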

Once they have black and white images where the details are visible, they start adding color. “Different telescopes have filters that are made to be sensitive to only certain wavelengths of light, and the colorful space images we see are combinations of separate exposures taken in these different filters,” similar to the earlier description of a phone camera, explains Katya Gozman, an astronomer at the University of Michigan. “We can assign each filter to a separate color channel—red, green or blue, the primary colors of visible light. When stacked on top of each other, we get the spectacular textbook color image that we’re used to seeing in the media,” she adds.
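
A minimal sketch of the stacking Gozman describes, assuming three aligned, already-stretched exposures are on hand as NumPy arrays. The random arrays and filter roles below are placeholders rather than a real observation.

```python
import numpy as np

def chromatic_composite(short_wl, mid_wl, long_wl):
    """Assign three single-filter exposures to blue, green, and red channels.

    Following the usual convention, the shortest-wavelength filter becomes
    blue and the longest becomes red. Inputs are assumed to be aligned 2D
    arrays already stretched into the 0-1 range.
    """
    return np.dstack([long_wl, mid_wl, short_wl])  # (height, width, 3) in R, G, B order

# Hypothetical stand-ins for three filter exposures of the same field.
f_short = np.random.rand(128, 128)  # shortest-wavelength filter -> blue channel
f_mid   = np.random.rand(128, 128)  # intermediate filter -> green channel
f_long  = np.random.rand(128, 128)  # longest-wavelength filter -> red channel

color_image = chromatic_composite(f_short, f_mid, f_long)
print(color_image.shape)  # (128, 128, 3)
```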


The end result, of course, also depends on what kind of data the image specialists have to work with in the first place. The team often chooses different colors to highlight the fact that NIRCam and MIRI—two of Webb’s infrared cameras—are looking at different wavelengths (near-infrared and mid-infrared, respectively), and therefore different physical structures. For example, in the Cassiopeia A supernova remnant, JWST’s observations revealed a bubble of material emitting a specific wavelength of light, colored green in the MIRI image and nicknamed the “Green Monster” as a result. Without this visualization, astronomers may not have noticed such a curious feature, one that provides insight into how giant stars die—and after some investigation, they figured out the Green Monster is a region of debris disturbed by the huge blast from the supernova explosion.

This image provides a side-by-side comparison of supernova remnant Cassiopeia A (Cas A) as captured by NASA’s James Webb Space Telescope’s NIRCam (Near-Infrared Camera) and MIRI (Mid-Infrared Instrument). Credits: NASA, ESA, CSA, STScI, Danny Milisavljevic (Purdue University), Ilse De Looze (UGent), Tea Temim (Princeton University)
This image provides a side-by-side comparison of supernova remnant Cassiopeia A (Cas A) as captured by NASA’s James Webb Space Telescope’s NIRCam (Near-Infrared Camera) and MIRI (Mid-Infrared Instrument). Credits: NASA, ESA, CSA, STScI, Danny Milisavljevic (Purdue University), Ilse De Looze (UGent), Tea Temim (Princeton University)

Invisible to visible

Generally, image specialists try to keep things as close to reality as possible. For example, if a telescope is observing in visible light, wavelengths can directly map to colors we’re used to seeing. But for those parts of the spectrum invisible to our eyes, they have to make choices about which visible colors to use. This is where it becomes a bit of an art, choosing colors based on not only scientific accuracy, but also what looks best. For JWST and Hubble, the usual routine is to use blue for the shortest wavelengths, green for in between, and red for the longest wavelengths. If there are more than three different filters to choose from (as is often the case with JWST, especially when using more than one of its high tech instruments), sometimes they’ll add in purple, teal, and orange for other wavelengths in between the red, green, and blue.

Webb’s raw telescope images initially appear almost completely black (left). They are initially transformed by image processors into crisp black-and-white images (center) and then full-color composites (right). Credit: JWST

Color images are far more than a pretty picture, though—they’re actually quite useful for science. The human brain is excellent at picking up patterns in color, such as parsing a map with color-coded subway lines or recognizing that “a red light is stop, green is go,” says Mark Popinchalk, an astronomer at the American Museum of Natural History. “These are daily examples where societal information is presented and processed quickly through color. Scientists want to use the same tool,” he adds. “But instead of societal information, it’s scientific. If X-rays are red, and ultraviolet is blue, we can very quickly interpret energetic light beyond what humans are capable of.” The result is a visual representation of an intense amount of data–much more than can be processed with the naked eye, or in black and white alone. 

For example, Gozman describes how images have helped astronomers recognize “where different physical processes are happening in an object, such as seeing where star formation is happening in a galaxy or where different elements are located around a nebula.” Color images with light beyond the visible spectrum have even revealed dark matter around galaxies, such as in the Bullet Cluster.

[ Related: This is what Uranus and Neptune may really look like ]

Another particularly recent and interesting example of image coloration is the case of Neptune. The dark blue photo of the icy world from the Voyager mission doesn’t actually reflect its true color, as if we were looking at it with our own eyes—instead, it’s more similar to the pale face of Uranus. “Back in the 80s, astronomers actually stretched and modified the images of Neptune to bring out more contrast in some of its fainter features, leading it to have that deep blue hue which made it look very different compared to Uranus,” explains Gozman. “Though astronomers were aware of this, the public was not. This is one good example of how reprocessing the same data in different ways can lead to completely different representations.”

Image analysis is, and always has been, a huge part of astronomy, finding ways to see the cosmos beyond the limitations of our very limited human eyes. You can even try your own hand at it—JWST data is available to the public from NASA, and they even run an astrophotography challenge open to anyone. Now, when you see a beautiful image of space, perhaps you can think of it as a wonderful melding of science and art.

The post Where do all those colors in space telescope images come from? appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Plunge into an immersive IMAX movie featuring the James Webb Space Telescope https://www.popsci.com/science/james-webb-space-telescope-imax/ Thu, 18 Apr 2024 13:22:11 +0000 https://www.popsci.com/?p=611355
a galaxy that looks like an hourglass pinched at the center with a shining protostar
An image of a protostar within the dark cloud L1527, taken with JWST’s Near-Infrared Camera (NIRCam). It is embedded within a cloud of material feeding its growth. Ejections from the star have cleared out cavities above and below it, whose boundaries glow orange and blue in this infrared view. NASA/ESA/CSA

In 'Deep Sky,' JWST comes to the really big screen with an abundance of data and no shortage of tears.

The post Plunge into an immersive IMAX movie featuring the James Webb Space Telescope appeared first on Popular Science.

]]>
a galaxy that looks like an hourglass pinched at the center with a shining protostar
An image of a protostar within the dark cloud L1527, taken with JWST’s Near-Infrared Camera (NIRCam). It is embedded within a cloud of material feeding its growth. Ejections from the star have cleared out cavities above and below it, whose boundaries glow orange and blue in this infrared view. NASA/ESA/CSA

In the new IMAX film Deep Sky, a protostar shines from the center of a dark cloud, the phantom galaxy swirls, and the dusty space clouds of the Cosmic Cliffs of Carina tower like mountain peaks. Also, scientists cry. The film centers on the James Webb Space Telescope’s visual legacy and the people behind it. At one point, NASA astrophysicist Amber Straughn gets to the heart of why seeing the Cosmic Cliffs of Carina is such an emotional journey. “This has always been there. It’s always been out there, but we’re just now able to see it. We now have this new telescope that’s opened up our eyes to let us see something we haven’t seen before.”

dusty space clouds with shining stars at the clouds peaks
Astronomers using JWST combined the capabilities of the telescope’s two cameras to create a never-before-seen view of a star-forming region in the Carina Nebula. Captured in infrared light by the Near-Infrared Camera (NIRCam) and Mid-Infrared Instrument (MIRI), this combined image reveals previously invisible areas of star birth. CREDIT: NASA/ESA/CSA

While not quite as challenging as building a space telescope, making Deep Sky posed a novel challenge to the filmmakers, Nathaniel Kahn noted: “…Every time we’d start to get close to finishing, NASA would release a new amazing image, and we’d have to find a way to work that in!” As the film’s writer, director, and producer, Kahn and team were finishing the project in September of 2023, combining digital cinematography by NASA, ESA, and commercial satellite launch company Arianespace with animations and graphics created specifically for IMAX. If you want to see the stereotypes of the stoic scientists challenged and bask in the glory of space, you can catch the IMAX experience starting Friday, April 19. 

The drive to uncover the secrets of the cosmos propels this new telling of JWST’s unfolding story. Here’s what it took to get there.

‘It was waving goodbye’

In the almost two years since those first images were beamed back to planet Earth, it’s easy for casual observers to forget how improbable it was. JWST was initially supposed to launch in 2011, and Congress even tried to cancel it that same year over budget concerns. It ultimately took 10,000 people from 14 countries, $10 billion, and 20 years to complete.

[Related: JWST images show off the swirling arms of 19 spiral galaxies.]

“I’ve worked on JWST for 15 years and I’m sort of one of the younger ones working on this telescope,” Straughn tells PopSci. “We faced a lot of challenges along the way and it was an audacious mission. We had to build this enormous telescope that had to be cold and that had to unfold in space. When you describe it, it sounds impossible.”

Multiple technologies needed to be invented to get this game-changer off the ground, including a critical sunshield. Since JWST primarily observes infrared light from faint and very faraway objects, it must be kept extremely cold, at about -370 degrees Fahrenheit, to detect these faint signals of heat. The team constructed a five-layer sunshield about the size of a tennis court that protects it from other heat sources like the Earth, sun, and various moons. In the documentary, Amy Lo, the Deputy Director for Vehicle Engineering on JWST for Northrop Grumman, described it as being “SPF one million,” in order to keep the telescope so cold and protected. She noted that there was no “second shot of doing this.”

a diagram of JWST's science instruments
The JWST has a cool side, which faces away from the sun, and a hot side, which faces the sun. CREDIT: NASA, ESA, CSA, Joyce Kang (STScI).

In the weeks following its launch on Christmas Day 2021, JWST completed over 40 crucial deployments of its various components and overcame 344 potential “single point failures.” If any one of those single points had failed, the entire mission would have ended.

The mission overcame all 344 single point failures and even got an added surprise. About 45 seconds into the launch, the team caught sight of the telescope’s power source, its solar array, opening up. This proved JWST officially had power, and seeing the deployment during the launch was not something the team had planned to be able to do with their own eyes. Through tears, NASA JWST Program Scientist Eric Smith said in the documentary, “It was waving goodbye.”

Back to the big bang

By several accounts, JWST is performing better than expected. It’s standing up against micrometeoroids–tiny pieces of space dust whose impacts can accumulate on the telescope’s mirrors. The team had a good idea of how frequently the dust would hit the mirrors, but the size of the impacts was more surprising.

[Related: Why a 3,000-mile-long jet stream on Jupiter surprised NASA scientists.]

“What we’ve been able to do to help mitigate this is essentially change the way we’re operating, so that the telescope is facing away from the direction that the micrometeoroids are coming from when we think we could have higher impacts,” Straughn tells PopSci.

It has also proven to be more stable and more efficient overall. According to Straughn, JWST has delivered more data in even less time than the team anticipated, revealing some of the most distant galaxies in the universe. These are galaxies that were born just after the big bang about 13.8 billion years ago. JWST has revealed that many are brighter, bigger, and more numerous than astrophysicists previously thought and their black holes are also growing incredibly fast. 

a swirling galaxy
M74 shines at its brightest in this combined optical/mid-infrared image, featuring data from both the Hubble Telescope and JWST. CREDIT: NASA/ESA/CSA.

“There’s an overarching new mystery that’s arisen of why galaxies are growing so big,” says Straughn. “When we find something that we don’t expect, that’s a new problem to solve that will help increase our knowledge about how the universe works.”

Towards the future

JWST built on the success of the Hubble Space Telescope, and other observational projects are on the horizon. Scheduled to launch in 2027, the Nancy Grace Roman Space Telescope will explore exoplanets and dark matter. The Habitable Exoplanet Observatory (HabEx), a mission concept in the early stages of development, would be specifically designed to search for signs of life on planets beyond our solar system.

[Related: In NASA’s new video game, you are a telescope hunting for dark matter.]

“I think that this telescope launch and these images came along at a perfect time to present a contrast to the bad things that are going on in the world,” says Straughn. “It really is an example of something that’s good, of what we humans can do when we put our hearts and our minds into something that’s for a bigger purpose.”

Deep Sky releases in IMAX theaters nationwide on Friday, April 19.

The post Plunge into an immersive IMAX movie featuring the James Webb Space Telescope appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Daddy long-legs-inspired robot could one day squirm through Martian caves https://www.popsci.com/technology/spider-robot/ Wed, 17 Apr 2024 18:00:00 +0000 https://www.popsci.com/?p=611312
Close-up photos of ReachBot.
Close-up photos of ReachBot. BDML Stanford University

The spiderbot's extendable legs can grasp onto uneven rock surfaces and propel it forward.

The post Daddy long-legs-inspired robot could one day squirm through Martian caves appeared first on Popular Science.

]]>
Close-up photos of ReachBot.
Close-up photos of ReachBot. BDML Stanford University

Robotic engineers are no strangers to turning to nature for inspiration. In recent years, birds, dogs, extinct sea creatures, and even humans themselves have all served as jumping-off points for new mechanical designs. Now, researchers from Stanford are citing the Harvestman spider, better known as a daddy long-legs, as the inspiration for a new robot design they believe could be better equipped to navigate uneven rocky caverns and lava tubes. One day, they hope, this spider-like design could even help robots navigate the icy caverns of the moon and Mars.

How does the spider robot work?

The researchers introduced their new machine, called the “ReachBot,” in a paper published today in the journal Science Robotics. ReachBot features multiple extendable boom limbs, which it can use to reach out for rocks and propel itself forward. Each limb comes attached with a three-finger gripper that grabs onto the rocks and uses them as anchor points. The long-legged design means the robot’s limbs can potentially access the floor, ceiling, and walls of a lava tube or cave, which in turn provides increased leverage. This unique positioning, the researchers write, lets ReachBot “assume a wide variety of possible configurations, bracing stances, and force application options.”

Harvestman spider, better known as a “daddy long-legs."
Harvestman spider, better known as a daddy long-legs. DepositPhotos

ReachBot attempts to fill a form-factor gap among existing exploration robots. Small robots, the researchers argue, are useful for navigating through tight corridors but typically have limited reach. Larger robots, by contrast, might be able to reach more area but can get bogged down by their hefty mass and mechanical complexity. ReachBot offers a compromise by relying on a small main body with limbs that can expand and reach out as necessary.

The robot utilizes a set of onboard sensors to scan the area ahead of it and look for convex rocks or other signs of a graspable area. Like a physical spider, ReachBot doesn’t immediately assume rock surfaces are flat, but instead seeks “rounded features that the gripper can partially enclose.” Researchers say they tested the robot in simulation to improve its ability to correctly identify grippable surface areas and to aid in footstep planning. Following the simulation, ReachBot was tested in the real world in a lava tube near Pisgah Crater in the Mojave Desert.

“Results from the field test confirm the predictions of maximum grasp forces and underscore the importance of identifying and steering toward convex rock features that provide a strong grip,” the researchers write. “They also highlight a characteristic of grasp planning with ReachBot, which is that identifying, aiming for, and extending booms involves a higher level of commitment than grasping objects in manufacturing scenarios.”

ReachBot could help researchers explore deep caves and caverns on other planets

Researchers believe ReachBot’s arachnid design could have extraterrestrial applications. Lava tubes like the one in the Mojave Desert where the robot was tested resemble features thought to lie beneath the surfaces of the moon and Mars. On the Red Planet, researchers say, ancient subsurface environments remain relatively unchanged from the time when some believe the planet may have been habitable. These sheltered cavern areas, they write, “could provide sites for future human habitation.”

In theory, future exploratory space robots could use a design like ReachBot’s to venture deeper into areas contemporary robots find inaccessible. Elsewhere, researchers are exploring how three-legged jumping machines and four-legged, dog-inspired robots could similarly help scientists learn more about undiscovered areas of our solar system neighbors.

The post Daddy long-legs-inspired robot could one day squirm through Martian caves appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Yes, a chunk of the space station crashed into a house in Florida https://www.popsci.com/science/space-junk-crash-florida/ Tue, 16 Apr 2024 19:29:11 +0000 https://www.popsci.com/?p=611173
ISS
March 11, 2021 - An external pallet packed with old nickel-hydrogen batteries is released from the Canadarm2 robotic arm as the International Space Station orbited 260 miles above the Pacific Ocean west of Central America. Mission controllers in Houston commanded the Canadarm2 to release the external pallet into space where it will orbit Earth between two to four years before burning up harmlessly in the atmosphere. The batteries were removed during previous spacewalks and replaced with newer lithium-ion batteries to continue powering the station's systems. NASA

NASA confirmed the origins of the orbital junk that left a homeowner shaken ‘in disbelief.’

The post Yes, a chunk of the space station crashed into a house in Florida appeared first on Popular Science.

]]>
ISS
March 11, 2021 - An external pallet packed with old nickel-hydrogen batteries is released from the Canadarm2 robotic arm as the International Space Station orbited 260 miles above the Pacific Ocean west of Central America. Mission controllers in Houston commanded the Canadarm2 to release the external pallet into space where it will orbit Earth between two to four years before burning up harmlessly in the atmosphere. The batteries were removed during previous spacewalks and replaced with newer lithium-ion batteries to continue powering the station's systems. NASA

The “object from the sky” that pierced through a home in Naples, Florida, last month wasn’t a meteorite after all.

On April 15, NASA said the mysterious metallic cylinder—which tore through homeowner Alejandro Otero’s ceiling and floor—was actually part of a cargo pallet that contained “aging nickel hydride batteries.” The agency jettisoned the pallet from the International Space Station back in 2021, after installing new lithium-ion batteries on the artificial satellite. 

NASA expected the hardware to “fully burn up during entry through Earth’s atmosphere on March 8, 2024,” yet things turned out quite differently for the Otero family.

“It was a tremendous sound, and it almost hit my son. He was two rooms over and heard it all,” Otero told Florida broadcaster WINK News. After prying the object out from between mangled floorboards, Otero said he suspected it was a meteorite.

space debris
Recovered stanchion from the NASA flight support equipment used to mount International Space Station batteries on a cargo pallet. The stanchion survived re-entry through Earth’s atmosphere on March 8, 2024, and impacted a home in Naples, Florida. Credit: NASA

According to NASA, the debris was actually made of a nickel- and chromium-based superalloy called Inconel. The object originally functioned as part of a battery mount; after hitting Otero’s home, the surviving cylinder clocked in at 1.6 pounds, 4 inches tall and 1.6 inches in diameter.

In 2021, NASA anticipated the pallet would “orbit Earth between two to four years before burning up harmlessly in the atmosphere.” This week, NASA said the ISS will investigate the incident to “determine the cause of the debris survival,” adding that it’s “committed to responsibly operating in low Earth orbit, and mitigating as much risk as possible to protect people on Earth when space hardware must be released.”

While a space-junk crisis may sound like science fiction, debris left by humans in low-Earth orbit is rapidly piling up. The European Space Agency estimates there are 36,500 debris objects greater than 10 cm in Earth’s orbit. The agency reports that the total mass of all known space objects exceeds 11,500 metric tons (or more than 25 million pounds). Such junk includes everything from paint flecks and bolts to dead satellites and spent rocket boosters. Much of this stuff originates from governmental space programs, but it also comes from private companies, such as Elon Musk’s Starlink.

Around a dozen objects reenter the atmosphere on a daily basis, Moriba Jah, a professor of aerospace engineering at the University of Texas at Austin, said in a call with PopSci. “It’s not uncommon for these things to survive and make it to the surface,” explained Jah, though typically they crash into the ocean. However, as satellite launches rapidly increase, the professor cautioned that “statistically, [falling debris] will kill somebody at some point.” 

According to Professor Jah, solving this problem will require more reusable and recyclable tech. Today, “we have a linear space economy where the end state of any given satellite is to become junk.” Governments need to embrace a circular approach and “mandate that satellites can’t be launched if they’re going to be single use,” Jah argued.

In a separate call, John L. Crassidis—a professor of mechanical and aerospace engineering at the State University of New York at Buffalo—argued to PopSci that readers shouldn’t be too worried, for now, about death by space junk. “I think I’d be a lot more concerned about getting struck by lightning than having a piece of space debris fall on me,” said Crassidis. However, the professor said the risk will grow in the coming decades. The more space junk we have, the greater the chance that “somebody’s going to be eventually hurt.”

According to estimates by the nonprofit Aerospace Corporation, the likelihood of space debris injuring a particular person is less than one in a trillion. However, a 2022 University of British Columbia study predicts there’s a 10% chance that falling space debris will result in “one or more casualties” by 2032.

[ Related: How harpoons, magnets, and ion blasts could help us clean up space junk ]

To mitigate worst-case scenarios, Crassidis pointed to the need for “hard international treaties” that require space-faring nations to follow UN space debris guidelines. “No matter what anybody tells you, we do not have the technology to take out space debris right now,” argued Crassidis, who acknowledged that Europe has some “nice experiments” in the works. 

In the coming decades, “if our technology can’t catch up to the point of making that a reality, and if we keep doing what we’re doing, then we’re for sure well on our way to Kessler syndrome,” Crassidis said, referring to a worst-case scenario in which space-junk collisions become so likely that they render low-Earth orbit useless for generations.

The post Yes, a chunk of the space station crashed into a house in Florida appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Humongous stellar-mass black hole is the biggest ever found in Milky Way https://www.popsci.com/science/biggest-stellar-mass-black-hole/ Tue, 16 Apr 2024 13:27:33 +0000 https://www.popsci.com/?p=611103
an illustration of a black hole with a companion star. the star is bright with a blue line indicating where the black hole is.
An artist’s impression of the largest stellar black hole in the Milky Way galaxy. The illustration shows the orbits of both the star and the black hole around their common center of mass. The stellar-mass black hole was found due to the wobbling motion it induces on a companion star. ESO/L. Calçada

Gaia-BH3's mass is 33 times bigger than our sun and it's only 2,000 light years from Earth.

The post Humongous stellar-mass black hole is the biggest ever found in Milky Way appeared first on Popular Science.

]]>
an illustration of a black hole with a companion star. the star is bright with a blue line indicating where the black hole is.
An artist’s impression of the largest stellar black hole in the Milky Way galaxy. The illustration shows the orbits of both the star and the black hole around their common center of mass. The stellar-mass black hole was found due to the wobbling motion it induces on a companion star. ESO/L. Calçada

Scientists have discovered an enormous stellar-mass black hole in our Milky Way galaxy that’s roughly 33 times more massive than our sun. This black hole, designated Gaia-BH3, was observed with the European Space Agency’s (ESA) Gaia space telescope and is pretty close to Earth in space terms, at only 2,000 light-years away. It is described in a paper published April 16 in the journal Astronomy & Astrophysics.

What is a stellar-mass black hole?

Stellar-mass black holes like Gaia-BH3 are formed when a large star runs out of fuel and then collapses. They are generally about 10 times as massive as our sun. Data from the European Southern Observatory’s Very Large Telescope and other ground-based observatories confirmed its mass at 33 times that of the sun. For comparison, the stellar-mass black hole Cygnus X-1 is only 21 solar masses, making Gaia-BH3 “exceptional.”

This artist’s impression compares side-by-side three stellar black holes in our galaxy: Gaia BH1, Cygnus X-1, and Gaia BH3, whose masses are 10, 21, and 33 times that of the sun respectively. Gaia BH3 is the most massive stellar black hole found to date in the Milky Way. The radii of the black holes are directly proportional to their masses, but note that the black holes themselves have not been directly imaged. CREDIT: ESO/M. Kornmesser
This artist’s impression compares side-by-side three stellar black holes in our galaxy: Gaia BH1, Cygnus X-1, and Gaia BH3, whose masses are 10, 21, and 33 times that of the sun respectively. Gaia BH3 is the most massive stellar black hole found to date in the Milky Way. The radii of the black holes are directly proportional to their masses, but note that the black holes themselves have not been directly imaged. CREDIT: ESO/M. Kornmesser

However, both are considered small compared with the supermassive black hole at the heart of our galaxy–Sagittarius A*. Its mass is 4.2 million times that of the sun. Enormous black holes like Sagittarius A* are created by progressively larger and larger black holes merging together, and not by the death of large stars. 
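
To put those masses in perspective, the event horizon of a non-spinning black hole scales directly with its mass (roughly 3 kilometers of radius per solar mass), which is why the caption above notes that the radii are proportional to the masses. A quick back-of-the-envelope sketch using the standard Schwarzschild radius formula:

```python
# Schwarzschild radius r = 2GM/c^2, roughly 2.95 km per solar mass.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def schwarzschild_radius_km(solar_masses):
    return 2 * G * (solar_masses * M_SUN) / c**2 / 1000.0

for name, mass in [("Gaia BH1", 10), ("Cygnus X-1", 21), ("Gaia BH3", 33)]:
    print(f"{name}: event horizon radius ~{schwarzschild_radius_km(mass):.0f} km")
# Prints roughly 30 km, 62 km, and 97 km for the three black holes.
```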

A landmark discovery

This new discovery is considered a landmark by scientists because it’s the first time that a large black hole with this kind of origin story has been found so close to Earth. One of the clues that tipped off the Gaia mission team was an odd “wobbling” motion of the companion star orbiting the black hole.
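
That wobble is what lets astronomers weigh the system: the visible star traces a small orbit around the pair’s shared center of mass, and Kepler’s third law then ties the orbit’s size and period to the total mass involved. A minimal sketch with purely illustrative numbers, not the published orbital parameters for Gaia-BH3:

```python
# Kepler's third law in convenient units:
# total mass (in solar masses) = a^3 / P^2, with a in AU and P in years.
def total_mass_solar(semi_major_axis_au, period_years):
    return semi_major_axis_au**3 / period_years**2

# Hypothetical system: a companion star circling an unseen object once
# every 12 years at an average separation of 16 AU.
print(total_mass_solar(16, 12))  # ~28 solar masses shared by the star and its dark companion
```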

Gaia-BH3 is 2,000 light-years away in the constellation Aquila and is Earth’s second-closest known black hole. It was also an unexpected find while an international team of scientists were reviewing Gaia observations ahead of a full data drop planned for next year.

[Related: Fastest-growing black hole eats the equivalent of one sun a day.]

“No one was expecting to find a high-mass black hole lurking nearby, undetected so far,” Pasquale Panuzzo, an astronomer at the Observatoire de Paris, part of France’s National Centre for Scientific Research and Gaia collaboration member, said in a statement. “This is the kind of discovery you make once in your research life.”

Mass rich, metal-poor

Astronomers have found similarly large black holes outside of the Milky Way galaxy. The prevailing theory is that they may form from the collapse of stars that do not have many elements heavier than helium and hydrogen in their chemical makeup. These stars are considered “metal-poor” and are believed to lose less mass over their lifetimes, so they have more material left over to produce these high-mass black holes after they die. Evidence directly linking metal-poor stars to high-mass black holes has been lacking until these new observations. 

Stars that come in pairs tend to have similar chemical compositions, so Gaia BH3’s companion star holds some important clues to how the star collapsed to create this giant black hole. Data from the Very Large Telescope’s Ultraviolet and Visual Echelle Spectrograph instrument showed that the companion star was very metal-poor. This means that the star that collapsed to form Gaia BH3 was also metal-poor, as the theories predicted. 

[Related: Black hole collisions could possibly send waves cresting through space-time.]

“We took the exceptional step of publishing this paper based on preliminary data ahead of the forthcoming Gaia release because of the unique nature of the discovery,” astronomer Elisabetta Caffau, a study co-author from the CNRS Observatoire de Paris and a Gaia collaboration member, said in a statement.

According to the team, making this data available early will allow other astronomers to study Gaia BH3 immediately without waiting for the complete Gaia data release. The full release from the space telescope is planned for late 2025 at the earliest and additional observations could reveal more about the black hole’s history.

The post Humongous stellar-mass black hole is the biggest ever found in Milky Way appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
NASA wants to measure moonquakes with laser-powered fiber optic cables https://www.popsci.com/science/moonquake-laser-fiber-optic/ Mon, 15 Apr 2024 19:57:18 +0000 https://www.popsci.com/?p=611037
Moon surface
Although the moon lacks tectonic plates, it still generates quakes from a variety of other factors. NASA/GSFC/Arizona State University

The moon’s seismic activity requires extremely sensitive tools to cut through the lunar dust.

The post NASA wants to measure moonquakes with laser-powered fiber optic cables appeared first on Popular Science.

]]>
Moon surface
Although the moon lacks tectonic plates, it still generates quakes from a variety of other factors. NASA/GSFC/Arizona State University

Even without any known active tectonic movement, the moon can still rumble. Its dramatic thermal changes, minuscule contractions from cooling, and even the influence of Earth’s gravity have all contributed to noticeable seismic activity. And just like on Earth, detecting these potentially powerful moonquakes will be important for the safety of any future equipment, buildings, and people atop the lunar surface.

But instead of traditional seismometers, NASA hopes Artemis astronauts will be able to deploy laser-powered fiber optic cables.

In a recent study published in Earth and Planetary Science Letters, researchers at Caltech made the case for the promising capabilities of a new, high-tech seismological tool known as distributed acoustic sensing (DAS). Unlike a traditional seismometer, a DAS setup senses ground motion all along a fiber optic cable by measuring the extremely tiny changes in laser light traveling through the fiber as quakes stretch and squeeze it. According to a separate paper from last year, a roughly 62-mile DAS cable line could hypothetically do the job of 10,000 individual seismometers.

[Related: Researchers unlock fiber optic connection 1.2 million times faster than broadband.]

This is particularly crucial given just how difficult it’s been to measure lunar seismic activity in the past. Apollo astronauts installed multiple seismometers on the lunar surface during the 1970s, which managed to record quakes as intense as magnitude 5. But those readings weren’t particularly precise, due to what’s known as scattering—when seismic waves are muddied from passing through layers of extremely fine, powdery regolith dust.

Researchers believe using fiber optic DAS setups could potentially solve this problem by averaging thousands of sensor points, and they have data to back it up. According to a recent Caltech profile, the team of geophysicists deployed a similar cable system near Antarctica’s South Pole, the closest environment on Earth to our natural satellite’s surface due to its remote, harsh surroundings. Subsequent tests successfully detected subtle seismic activity such as cracking and shifting ice, while the cable held up against the brutal conditions.
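
A minimal sketch of why all that averaging helps, assuming each DAS channel records the same weak quake signal buried in independent scattering noise. The traces below are synthetic stand-ins, not real lunar data; the point is simply that stacking N noisy channels improves the signal-to-noise ratio by roughly the square root of N.

```python
import numpy as np

rng = np.random.default_rng(0)

n_channels = 10_000   # virtual sensing points along the fiber
n_samples = 500       # time samples per channel

# Synthetic 'moonquake': a weak pulse common to every channel...
t = np.linspace(0, 10, n_samples)
signal = 0.05 * np.exp(-((t - 5.0) ** 2) / 0.1)

# ...buried in strong, independent noise on each channel (the scattering).
traces = signal + rng.normal(scale=1.0, size=(n_channels, n_samples))

single_snr = np.abs(signal).max() / traces[0].std()
stacked = traces.mean(axis=0)                     # average all channels together
stacked_snr = np.abs(signal).max() / (stacked - signal).std()

print(f"one channel SNR: ~{single_snr:.2f}")   # well below 1: the quake is invisible
print(f"stacked SNR:     ~{stacked_snr:.1f}")  # ~100x better, roughly sqrt(10,000)
```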

Of course, the moon’s brutal surface makes Antarctica look almost pleasant by comparison. Aside from the dust, temperature fluctuations routinely vary between 130 and -334 degrees Fahrenheit, while the lack of atmosphere means regular bombardment by solar radiation. All that said, Caltech researchers believe fiber optic cabling could easily be designed to withstand these factors. With additional work, including further optimizing its energy efficiency, the team believes DAS equipment could arrive alongside Artemis astronauts in the near future, ready to measure any moonquakes that come its way.

The post NASA wants to measure moonquakes with laser-powered fiber optic cables appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
The history-altering medical mystery of an astronaut who fell in the bathroom https://www.popsci.com/science/john-glenn-health-mystery/ Sat, 13 Apr 2024 17:32:00 +0000 https://www.popsci.com/?p=610769
a man looks into the ear of a man with thick glasses on
John Glenn's balance mechanism is tested by running cool water into his ear and measuring effect on eye motions (1962). NASA

What really caused the debilitating illnesses?

The post The history-altering medical mystery of an astronaut who fell in the bathroom appeared first on Popular Science.

]]>
a man looks into the ear of a man with thick glasses on
John Glenn's balance mechanism is tested by running cool water into his ear and measuring effect on eye motions (1962). NASA

On February 26, 1964, a 42-year-old man slipped in a hotel bathroom and clocked his head on the tub. The painful tumble would end up altering how the entire world approached space exploration. Why? Because that man was John Glenn, the first American to orbit the Earth, and that fall triggered a medical mystery that pushed to the forefront research into what spaceflight might do to the human body.

In the latest Popular Science video, we dig into the intriguing backstory of Glenn’s bathroom spill.


Want more Popular Science videos? Learn about the buried treasure that helped take us to the moon. And remember to subscribe on YouTube for a new video every week.

The post The history-altering medical mystery of an astronaut who fell in the bathroom appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Let this astronaut show you around the International Space Station https://www.popsci.com/science/iss-video-tour/ Fri, 12 Apr 2024 17:00:00 +0000 https://www.popsci.com/?p=610687
Astronaut Andreas Mogensen aboard the ISS
Astronaut Andreas Mogensen spent over six months aboard the ISS. ESA/NASA

Danish astronaut Andreas Mogensen made a ‘keepsake’ tour video before returning to Earth.

The post Let this astronaut show you around the International Space Station appeared first on Popular Science.

]]>
Astronaut Andreas Mogensen aboard the ISS
Astronaut Andreas Mogensen spent over six months aboard the ISS. ESA/NASA

Andreas Mogensen returned to Earth in mid-March after a six-and-a-half month stint aboard the International Space Station. To mark his tenure as part of NASA’s Crew-7 mission, the Danish European Space Agency (ESA) astronaut has shared his souvenir from undock day—a guided video tour of the ISS.

“It’s been a month now since I left the [ISS],” Mogensen posted to X early Friday morning. “… It is as much a keepsake for me as it is a way for me to share the wonder of the International Space Station with you. Whenever I will miss my time onboard ISS, and especially my crewmates, I will have this video to look at.”

Mogensen began his show-and-tell in the space station’s front end, above which a docked SpaceX Dragon craft waited to take him home on March 12. On his left is the roughly 114-by-22-foot Columbus module—a science laboratory provided by the ESA back in 2008. Across from the lab is the smaller Japanese Experiment Module (JEM), nicknamed Kibō, which arrived not long after Columbus.

Astronauts waving in ISS
Fellow astronauts wave to Mogensen aboard the ISS. Credit: ESA/NASA

From there, Mogensen provides a first-person look at various other ISS facilities, including workstations, storage units, bathrooms, gym equipment, multiple docking nodes, and even the station kitchen. Of course, given the delicate environment, that module looks more like another lab than an actual place to cook meals—presumably because, well, no one is actually cooking anything up there.

International Space Station orbiting above Earth
The International Space Station is pictured from the SpaceX Crew Dragon Endeavour during a fly around of the orbiting lab that took place following its undocking from the Harmony module’s space-facing port on Nov. 8, 2021. NASA

But the most stunning area in the entire ISS is undoubtedly the cupola, which provides a 360-degree panoramic view of Earth, as well as a decent look at the space station’s overall size.

[Related: What a total eclipse looks like from the ISS.]

Speaking of which, Mogensen’s video also does a great job showcasing just how comparatively small the ISS really is, even after over 25 years of module and equipment additions. At 356 feet long, it’s just one yard shy of the length of a football field, but any given module or transit space is only a few feet wide. Factor in the copious amounts of cargo, equipment, supplies, and experiment materials, as well as the over 8 miles of cabling that wire its electrical systems, and it makes for pretty tight living conditions. Near the end of Mogensen’s tour, it only takes him a little over a minute to glide through most of the entire station back to his original starting point.

View of Earth from ISS cupola
Andrea Mogensen’s view of Earth from inside the ISS cupola. Credit: ESA/NASA

Of course, none of that undercuts one of humanity’s most monumental achievements in space exploration. Although the ISS is nearing the end of its tenure (it’s scheduled for decommission in 2031), Mogensen’s keepsake is a great document of what life is like aboard the habitat. But for those now looking for an even more detailed tour, there’s always NASA’s virtual walkthrough.

The post Let this astronaut show you around the International Space Station appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Watch a tripod robot test its asteroid leaping skills https://www.popsci.com/technology/spacehopper-zero-gravity/ Fri, 12 Apr 2024 13:35:48 +0000 https://www.popsci.com/?p=610621
SpaceHopper robot in midair during parabolic flight test
SpaceHopper is designed to harness an asteroid's microgravity to leap across its surface. Credit: ETH Zurich / Nicolas Courtioux

SpaceHopper maneuvered in zero gravity aboard a parabolic flight.

The post Watch a tripod robot test its asteroid leaping skills appeared first on Popular Science.

]]>
SpaceHopper robot in midair during parabolic flight test
SpaceHopper is designed to harness an asteroid's microgravity to leap across its surface. Credit: ETH Zurich / Nicolas Courtioux

Before astronauts leave Earth’s gravity for days, weeks, or even months at a time, they practice aboard NASA’s famous parabolic flights. During these intense rides in modified passenger jets, trainees experience a series of stomach-churning climbs and dives as the aircraft’s steep maneuvers create brief zero-g environments. Recently, however, a robot received a similar education to its human counterparts—potentially ahead of its own journeys to space.

A couple years back, eight students at ETH Zürich in Switzerland helped design the SpaceHopper. Engineered specifically to handle low-gravity environments like asteroids, the small, three-legged bot is meant to (you guessed it) hop across its surroundings. Using a neural network trained in simulations with deep reinforcement learning, SpaceHopper is built to jump, coast along by leveraging an asteroid’s low gravity, then orient and stabilize itself mid-air before safely landing on the ground. From there, it repeats this process to efficiently span large distances.
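
To get a feel for why hopping works so well there, here is a minimal back-of-the-envelope sketch. The launch speed and surface gravity below are illustrative guesses rather than SpaceHopper’s actual specs, and the math assumes flat ground and uniform gravity, but it shows how even a gentle push-off in asteroid-strength gravity keeps a robot coasting for minutes and carries it tens of meters.

```python
import math

def hop(launch_speed_m_s, launch_angle_deg, surface_gravity_m_s2):
    """Flight time and range of a simple ballistic hop over flat ground."""
    angle = math.radians(launch_angle_deg)
    v_up = launch_speed_m_s * math.sin(angle)
    v_forward = launch_speed_m_s * math.cos(angle)
    flight_time = 2 * v_up / surface_gravity_m_s2
    return flight_time, v_forward * flight_time

# Hypothetical 45-degree hop at 0.2 m/s on a small asteroid whose surface
# gravity is 0.001 m/s^2 (Earth's, for comparison, is 9.81 m/s^2).
t_flight, distance = hop(0.2, 45, 1e-3)
print(f"airborne for ~{t_flight/60:.1f} minutes, covering ~{distance:.0f} m")
```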

But it’s one thing to design a machine that theoretically works in computer simulations—it’s another thing to build and test it in the real world.


Sending SpaceHopper to the nearest asteroid isn’t exactly a cost-effective or simple way to conduct a trial run. But thanks to the European Space Agency and Novespace, a company specializing in zero-g plane rides, the robot could test out its moves in the next best thing.

Over the course of a recent 30-minute parabolic flight, researchers let SpaceHopper perform in a small enclosure aboard Novespace’s Airbus A310 for upwards of 30 zero-g simulations, each lasting between 20 and 25 seconds. In one experiment, handlers released the robot in midair once the plane hit zero gravity, then observed it resituate itself to specific orientations using only its leg movements. In a second test, the team programmed SpaceHopper to leap off the ground and reorient itself before gently colliding with a nearby safety net.
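
Those 20-to-25-second windows follow directly from ballistics: during each parabola the jet flies the same free-fall arc its passengers do, so the weightless time is set by how much vertical speed the aircraft carries when it noses over. A rough sketch with an illustrative climb rate, not Novespace’s published flight profile:

```python
# During the parabola the aircraft is essentially in free fall, so the zero-g
# window lasts about as long as a ballistic arc with the same climb rate:
# t = 2 * v_up / g.
g = 9.81  # m/s^2

def zero_g_seconds(vertical_speed_m_s):
    return 2 * vertical_speed_m_s / g

# A jet nosing over with roughly 110 m/s of vertical speed (illustrative).
print(f"~{zero_g_seconds(110):.0f} seconds of weightlessness per parabola")  # ~22 s
```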

Because a parabolic flight creates completely zero-g environments, SpaceHopper actually made its debut in less gravity than it would on a hypothetical asteroid. Because of this, the robot couldn’t “land” as it would in a microgravity situation, but demonstrating its ability to orient and adjust in real-time was still a major step forward for researchers. 

[Related: NASA’s OSIRIS mission delivered asteroid samples to Earth.]

“Until that moment, we had no idea how well this would work, and what the robot would actually do,” SpaceHopper team member Fabio Bühler said in ETH Zürich’s recent highlight video. “That’s why we were so excited when we saw it worked. It was a massive weight off of our shoulders.”

SpaceHopper’s creators believe deploying their jumpy bot to an asteroid one day could help astronomers gain new insights into the universe’s history, as well as provide information into our solar system’s earliest eras. Additionally, many asteroids are filled with valuable rare earth metals—resources that could provide a huge benefit across numerous industries back at home.

The post Watch a tripod robot test its asteroid leaping skills appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Japan and NASA plan a historic lunar RV roadtrip together https://www.popsci.com/science/japan-lunar-rv/ Thu, 11 Apr 2024 15:00:12 +0000 https://www.popsci.com/?p=610467
Toyota concept art for lunar RV
Japan is working alongside Toyota and Hyundai to develop a massive lunar RV. Toyota / JAXA

It would be the first time a non-American lands on the moon.

The post Japan and NASA plan a historic lunar RV roadtrip together appeared first on Popular Science.

]]>
Toyota concept art for lunar RV
Japan is working alongside Toyota and Hyundai to develop a massive lunar RV. Toyota / JAXA

Japan has offered to provide the United States with a pressurized moon rover—in exchange for a reserved seat on the lunar van. Per NASA, the two nations have themselves a deal. 

According to a new signed agreement between NASA and Japan’s government, the Japan Aerospace Exploration Agency (JAXA) will “design, develop, and operate” a sealed vehicle for both crewed and uncrewed moon excursions. NASA will then oversee the launch and delivery, while Japanese astronauts will join two surface exploration missions in the vehicle.

[ Related: SLIM lives! Japan’s upside-down lander is online after a brutal lunar night ]

‘A mobile habitat’

Japan’s pressurized RV will mark a significant step forward for lunar missions. According to Space.com, the nation has spent the past few years working to develop such a vehicle alongside Toyota and Mitsubishi Heavy Industries. Toyota offered initial specs for the RV last year—at nearly 20 feet long, 17 feet wide, and 12.5 feet tall, the rover will be about as large as two minibuses parked side by side. The cabin itself will provide “comfortable accommodation” for two astronauts, although four can apparently cram in, should an emergency arise.

Like an RV cruising across the country, the rover is meant to provide its inhabitants with everything they could need for as long as 30 days at a time. While inside, astronauts will even be able to remove their bulky (and fashionable) getups and move about normally—albeit in about 16.6 percent of the gravity on Earth. Last week, NASA announced it had narrowed the search for its new Artemis Lunar Terrain Vehicle (LTV) to three companies, but unlike Japan’s vehicle, that one will be unpressurized.

[Related: It’s on! Three finalists will design a lunar rover for Artemis.]

“It’s a mobile habitat,” NASA Administrator Nelson said during yesterday’s press conference alongside Minister Moriyama, describing it as “a lunar lab, a lunar home, and a lunar explorer… a place where astronauts can live, work, and navigate the lunar surface.”


Similar to the forthcoming Lunar Terrain Vehicle, the Japanese RV can be remotely controlled if astronauts aren’t around, and will remain in operation for 10 years following its delivery.

“The quest for the stars is led by nations that explore the cosmos openly, in peace, and together… America no longer will walk on the moon alone,” Nelson added.

A total of 12 astronauts—all American men—have walked across the moon’s surface. When the U.S. returns to the moon with NASA’s Artemis missions, it will also be the first time a woman and a person of color will land on the moon.

After some rescheduling, NASA currently intends to send its Artemis II astronauts on a trip around the moon in late 2025. Artemis III will see the first two humans touch down in over 50 years, in either late 2026 or early 2027. The Artemis IV mission is currently intended to occur no earlier than 2030. Meanwhile, China is trying to land its own astronauts on the lunar surface in 2030.

The post Japan and NASA plan a historic lunar RV roadtrip together appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Baby stars release gassy ‘sneezes’ while forming https://www.popsci.com/science/baby-star-sneeze/ Thu, 11 Apr 2024 12:45:53 +0000 https://www.popsci.com/?p=610435
an illustration of a shiny baby star surrounded by lines indicating magnetic fields
The baby star at the center is surrounded by a bright disk called a protostellar disk. Spikes of magnetic flux, gas, and dust are shown in blue. Researchers found that the protostellar disk will expel magnetic flux, gas, and dust—much like a sneeze—during a star's formation. ALMA (ESO/NAOJ/NRAO)

The disk that surrounds newly forming stars shoots out material that could impact its future development.

The post Baby stars release gassy ‘sneezes’ while forming appeared first on Popular Science.

]]>
an illustration of a shiny baby star surrounded by lines indicating magnetic fields
The baby star at the center is surrounded by a bright disk called a protostellar disk. Spikes of magnetic flux, gas, and dust are shown in blue. Researchers found that the protostellar disk will expel magnetic flux, gas, and dust—much like a sneeze—during a star's formation. ALMA (ESO/NAOJ/NRAO)

Our bodies sometimes forcefully expel dust from our noses in the form of a sneeze. A similar phenomenon may be happening in baby stars. New observations of the protostellar disk that surrounds a baby star offer a closer look at how the disk releases plumes of gas, electromagnetic energy, and dust. The team from Kyushu University in Japan describes these “sneezes” as a release of magnetic flux or energy that could be a vital part of star formation. The findings are described in a study published April 11 in The Astrophysical Journal.

All stars develop in stellar nurseries, but star formation is a complex process that we still do not fully understand. These large areas of space are full of the raw materials needed to create stars: gas, dust, and energy. Stellar nurseries with large concentrations of dust and gas eventually condense, forming a stellar core or baby star. Over time, the stellar cores will accumulate more material and grow in mass. As this growth unfolds, dust and gas form a ring around the new star that astronomers call the protostellar disk.

“These structures are perpetually penetrated by magnetic fields, which brings with it magnetic flux,” study co-author and Kyushu University radio astronomer Kazuki Tokuda said in a statement. “However, if all this magnetic flux were retained as the star developed, it would generate magnetic fields many orders of magnitude stronger than those observed in any known protostar.”

[Related: The biggest gaseous structure in our galaxy is filled with baby star factories.]

Scientists have hypothesized that some mechanism during star development removes the magnetic flux. One theory is that the magnetic field gradually weakens over time as the cloud is pulled into the stellar core.

In this new study, the team set their sights on a stellar nursery called MC 27, located about 450 light-years from Earth. They observed it using ALMA (the Atacama Large Millimeter/submillimeter Array), a collection of 66 high-precision radio telescopes in northern Chile.

“As we analyzed our data, we found something quite unexpected,” said Tokuda. “There were these ‘spike-like’ structures extending a few astronomical units from the protostellar disk. As we dug in deeper, we found that these were spikes of expelled magnetic flux, dust, and gas.”

According to the team, this phenomenon is called interchange instability. This occurs when instabilities in the magnetic field react with different amounts of gas in the protostellar disk surrounding the baby star. The result is the expulsion of magnetic flux.

“We dubbed this a baby star’s ‘sneeze’ as it reminded us of when we expel dust and air at high speeds,” said Tokuda.

[Related: Bursting stars could explain why it was so bright after the big bang.]

They also observed other spikes of energy thousands of astronomical units away from the protostellar disk. The team believes that these extra spikes could be the remnants of past stellar sneezes.

The team hopes that their findings will improve astronomers’ understanding of the detailed processes that shape the universe.

“Similar spike-like structures have been observed in other young stars, and it’s becoming a more common astronomical discovery,” said Tokuda. “By investigating the conditions that lead to these ‘sneezes’ we hope to expand our understanding of how stars and planets are formed.”

The post Baby stars release gassy ‘sneezes’ while forming appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Youth-stealing stars could explain ‘missing giants’ at the Milky Way’s center https://www.popsci.com/science/youth-stealing-stars-could-explain-missing-giants-at-the-milky-ways-center/ Wed, 10 Apr 2024 16:22:17 +0000 https://www.popsci.com/?p=610279
An image of the core of our Milky Way galaxy. This view combines the sharp imaging of Hubble’s Near Infrared Camera and Multi-Object Spectrometer (NICMOS) with color imagery from a previous Spitzer Space Telescope survey done with its Infrared Astronomy Camera (IRAC).
An image of the core of our Milky Way galaxy. This view combines the sharp imaging of Hubble’s Near Infrared Camera and Multi-Object Spectrometer (NICMOS) with color imagery from a previous Spitzer Space Telescope survey done with its Infrared Astronomy Camera (IRAC). NASA

The densely populated area at the center of our galaxy is home to epic cosmic collisions and mysterious phenomena.

The post Youth-stealing stars could explain ‘missing giants’ at the Milky Way’s center appeared first on Popular Science.

]]>
An image of the core of our Milky Way galaxy. This view combines the sharp imaging of Hubble’s Near Infrared Camera and Multi-Object Spectrometer (NICMOS) with color imagery from a previous Spitzer Space Telescope survey done with its Infrared Astronomy Camera (IRAC).
An image of the core of our Milky Way galaxy. This view combines the sharp imaging of Hubble’s Near Infrared Camera and Multi-Object Spectrometer (NICMOS) with color imagery from a previous Spitzer Space Telescope survey done with its Infrared Astronomy Camera (IRAC). NASA

Our planet is in the Milky Way’s suburbs—think of a quaint town in New Jersey, with the heart of Manhattan as the galactic center. However, instead of Central Park, the main attraction of the Milky Way is a supermassive black hole known as Sagittarius A*. This black hole is so large that it weighs a few million times our sun’s mass. In that galactic center, stars are packed much closer together than in the sun’s neighborhood, similar to rush hour commuters in the Big Apple, flinging about in their speedy orbits around the black hole. 

These stars are packed so tightly that they sometimes collide—an event that simply doesn’t happen in our more spacious part of the galaxy. New research presented this month at the American Physical Society’s annual meeting and submitted to The Astrophysical Journal uses computer simulations to show that these collisions are actually key to explaining some of the more mysterious phenomena of the galactic center.

Since the discovery of our galaxy’s central black hole earned the Nobel Prize in Physics in 2020, astronomers have been studying the center of our galaxy as a way to understand galaxies across the universe.

“The centers of other galaxies are too far away for us to observe in detail, but we can learn about these environments by studying the center of the Milky Way,” explains Sanaea Rose, lead author and astronomer at Northwestern University’s Center for Interdisciplinary Exploration and Research in Astrophysics.

So far, astronomers have often been surprised by what they find at the center of the galaxy. For example, there is a population of particularly young, bright stars there that shouldn’t exist. The forces from the black hole are simply too strong for new stars to be born in that environment. On the other hand, there should be a bunch of older red giant stars in the galactic center, but they’re mysteriously missing. Plus, there are weird, yet-unexplained “puffy balls of dust and gas” as Rose describes them, known as G objects. “What we try to do is explain these unusual findings with stellar interactions, which are common in the dense star cluster,” she adds.

The orbits of stars within the central 1.0 x 1.0 arcseconds of our galaxy. In the background, the central portion of a diffraction-limited image taken in 2015 is displayed. While every star in this image has been seen to move over the past 20 years, estimates of orbital parameters are best constrained for stars that have been observed through at least one turning point of their orbit. The annual average positions for these stars are plotted as colored dots, which have increasing color saturation with time. Also plotted are the best fitting simultaneous orbital solutions. (This image was created by Prof. Andrea Ghez and her research team at UCLA and is from data sets obtained with the W. M. Keck Telescopes.) Credit: UCLA Galactic Center Group – W.M. Keck Observatory Laser Team.

“Very close to the supermassive black hole, collisions of stars are so common that they become one of the strongest forces shaping the lives of stars,” says Morgan MacLeod, a co-author of this research and astronomer at the Harvard-Smithsonian Center for Astrophysics. 

These stars move at extremely high speeds: thousands of kilometers each second, compared to the leisurely 30 km/s our sun trots through the galaxy. When two speeding stars run into each other head-on, they can destroy each other in the process like a car crash.

But, there are even stranger things that can happen according to Rose’s simulations. Some stars in the very closest regions—about ⅓ of a light year from Sagittarius A*—lose only their outer layers in these collisions, creating a population of strange, lower-mass stars at the galactic center and destroying most of the red giant stars astronomers expected to see. Ohio State University astronomer Alexander Stephan, who is not affiliated with the research team, noted that this work therefore explains the “crucial issue of the ‘missing giants’ in the galactic center.” These collisions might also be responsible for the mysterious G objects, their work suggests.

Outside of the closest ⅓ of a light year, the collisions aren’t quite so catastrophic. Two stars can actually merge together to make a huge star that “can masquerade as young-looking stars, even though they formed from an older population,” explains Rose. “Collisions may therefore explain the presence of young-seeming, massive stars very near the supermassive black hole.”

This bizarre part of the galaxy is certainly nothing like our Goldilocks-perfect home on Earth, but work like this brings us closer to understanding how invigorating collisions might explain some of the strange things we see in the exotic environment of the galactic center.

The post Youth-stealing stars could explain ‘missing giants’ at the Milky Way’s center appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
ADHD may have evolved to give us foraging superpowers https://www.popsci.com/science/weirdest-thing-adhd/ Wed, 10 Apr 2024 13:00:00 +0000 https://www.popsci.com/?p=610266
Researchers analyzed data from 457 adults who played an online foraging game.
Researchers analyzed data from 457 adults who played an online foraging game. DepositPhotos

Plus other weird things we learned this week.

The post ADHD may have evolved to give us foraging superpowers appeared first on Popular Science.

]]>
Researchers analyzed data from 457 adults who played an online foraging game.
Researchers analyzed data from 457 adults who played an online foraging game. DepositPhotos

What’s the weirdest thing you learned this week? Well, whatever it is, we promise you’ll have an even weirder answer if you listen to PopSci’s hit podcast. The Weirdest Thing I Learned This Week hits Apple, Spotify, YouTube, and everywhere else you listen to podcasts every other Wednesday morning. It’s your new favorite source for the strangest science-adjacent facts, figures, and Wikipedia spirals the editors of Popular Science can muster. If you like the stories in this post, we guarantee you’ll love the show.

Heads up: The Weirdest Thing I Learned This Week has been nominated for a Webby! You can vote to help us win the Webby People’s Voice Award. Click here to vote by April 18.

FACT: ADHD may have evolved to make us better at picking berries 

By Rachel Feltman

Researchers from the University of Pennsylvania recently released a study on the potential evolutionary benefits of ADHD. They analyzed data from 457 adults who played an online foraging game, where the objective was to collect as many berries as possible within an eight-minute span.

Players could choose to either keep collecting berries from the bushes in their original location, or move to a new patch. (By the way, this sounds an awful lot like a game I used to play on Neopets!) Moving would cost them a brief time out, and there was no guarantee that the new patch would have as many berries as their current location, but the number of berries each bush yielded went down every time it was foraged again.
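This setup is a classic patch-foraging, explore-versus-exploit problem. To illustrate why abandoning depleted bushes earlier can pay off, here is a minimal simulation sketch. The berry yields, depletion rate, harvest time, and travel penalty below are made-up parameters for illustration, not values from the study.

```python
import random

def forage(leave_threshold, total_time=480, harvest_time=3, travel_time=15,
           initial_yield=20, depletion=0.7, seed=0):
    """Simulate a simple patch-foraging game.

    The forager keeps harvesting the current bush (each harvest takes
    harvest_time seconds and yields fewer berries than the last), and moves
    to a fresh bush, paying travel_time, once the next harvest's expected
    yield falls below leave_threshold.
    """
    rng = random.Random(seed)
    t, berries = 0, 0
    current_yield = initial_yield * rng.uniform(0.8, 1.2)
    while t + harvest_time <= total_time:
        if current_yield < leave_threshold:       # give up on this bush
            t += travel_time                      # pay the cost of moving
            current_yield = initial_yield * rng.uniform(0.8, 1.2)
            continue
        t += harvest_time
        berries += current_yield
        current_yield *= depletion                # bush gets picked over
    return berries

# Leaving bushes sooner (a higher threshold) beats squeezing each one dry.
for threshold in (1, 5, 10):
    print(threshold, round(forage(threshold)))
```

With these toy numbers, the strategies that move on sooner end the eight minutes with noticeably more berries than the one that exhausts every bush, which is the same qualitative pattern the study reports.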

Along with the game, subjects also took a survey designed to assess whether they had symptoms of ADHD. This didn’t constitute a full or formal diagnosis, but it screened for traits like having difficulty concentrating. 

When the researchers compared the survey results with the game play stats, they found that people with ADHD symptoms played differently—and more effectively—than their peers. They were more likely to move on to another bush, and collected an average of 602 berries compared with 521. 

I probably don’t need to tell you that this isn’t exactly a perfect model for actual foraging. The researchers do hope to do a similar experiment in the future involving in-person foraging, where they’d use people with formal ADHD diagnoses as their experimental subjects, but that would obviously be a much more complicated experiment to run. 

But this isn’t the first research to suggest that ADHD traits and other types of neurodiversity might have evolved to help our ancestors survive. Other studies have examined the differences in how people with ADHD search for information or objects and found that we spend more time in the “explore” phase of foraging versus the “exploit” phase. There’s even ongoing research to suggest that kids with ADHD are less susceptible to inattention bias.

In 2008, researchers found that members of a nomadic group in Kenya who had gene mutations associated with ADHD were in better health than average, while those same mutations were associated with malnourishment in closely related people who lived as farmers. There’s a broad idea known as the hunter versus farmer hypothesis that covers this phenomenon. The idea is that the hyperfocus associated with ADHD was actually a really useful trait back when humans spent their days hunting and foraging. It’s much less useful in agrarian and industrialized life. One 1998 study found that adults with self-reported ADHD were much better able to postpone eating, sleeping, and other personal needs to absorb themselves in an urgent task, like a last-minute deadline. That’s a mindset that would have come in handy for unpredictable food acquisition, like the sudden appearance of a herd of mammoths or an unexpected bounty of berries.

Some researchers have even suggested that sugar can trigger hyperactivity symptoms because the fructose makes our brains think we’ve come across a foraging bounty and should search for more berries.

While there’s a lot more research to be done on this subject, this study is an important reminder that our current sense of what’s “good” and what’s “normal” is pretty arbitrary—and that reframing these ideas can unlock really cool insights into why humans actually are the way they are. And at least according to some foragers, these findings are no surprise at all.

FACT: Venus is Earth’s evil twin

By Knimbley

Join me as I embark on a fascinating journey into the depths of Venus’s mysteries. From Elden Ring’s DLC to Venus’s mythological allure and its longstanding status as a scientific enigma, my contribution to this week’s episode dances between realms of curious tangents, genderfluid anatomy, and fantasy. As we explore Venus’s dual nature and delve into the origins of stories both factual and fictional, listeners are invited to ponder the cosmic wonders that await us beyond Earth’s confines (and hopefully are unveiled within the Shadow of the Erdtree). With warmth and perhaps too much matcha, we navigate the intersection of myth and science, embracing the magic of exploration.

If you’re hungry for some more Venus-related science after this week’s episode, check out NASA’s content on the subject.

FACT: People think this lotion attracts spiders en masse—but the truth is more complicated than that 

By Jess Boddy

At the end of last year, people were all in a tizzy because of the lotion spiders. Yes, the lotion spiders. Someone left a review on Sephora’s website about a specific kind of lotion: the Delícia Drench body butter made by the company Sol de Janeiro. Here’s that review.


This wasn’t the only review that said this lotion attracted spiders—there were a handful. And then, the unspeakable happened… People posted the reviews to Reddit. Word of lotion spiders spread like wildfire. Folks started doing their own home “experiments,” putting the lotion on tissues and watching to see if spiders appeared. Pretty much everyone came to the same conclusion: this lotion attracts wolf spiders. 

However, scientists aren’t so sure. Listen to this week’s episode to find out the scientific truth about this potentially spider-attracting beauty product—and if there are others to avoid if you have a fear of arachnids. (Spoiler: It’s complicated.)

The post ADHD may have evolved to give us foraging superpowers appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
The best star projectors for 2023 https://www.popsci.com/gear/best-star-projectors/ Tue, 23 Aug 2022 15:00:00 +0000 https://www.popsci.com/?p=463938
The best star projectors
Stan Horaczek

A light show billions of years in the making beams into your home.

The post The best star projectors for 2023 appeared first on Popular Science.

]]>
The best star projectors
Stan Horaczek

We may earn revenue from the products available on this page and participate in affiliate programs. Learn more ›

Best overall: Sega Toys Homestar Flux

Get a scientifically accurate recreation of the night sky at home.

Best for adults: BlissLights Sky Lite 2.0

Skip the kid stuff without breaking the bank.

Best budget: Infmetry Star Projector

This star light is designed for gaming rooms, home theaters, and more.

Beyond a few bright celestial objects, the rise of light pollution has made it difficult for most people to experience a genuinely starry night sky—and that’s where star projectors come in. If artificial lights have obscured your view of the Milky Way, these compact devices provide a fun and comfortable way to observe the cosmos. All you need is a dark room with a power outlet and you’re ready to bask in the wonders of the universe. Many also function as night lights or pattern projectors that can spruce up a room without the celestial theme. While nothing can replace the awe-inspiring feeling of seeing millions of stars in person, the best star projectors can still leave you transfixed.

How we chose the best star projectors

I’ve been fortunate to visit areas less affected by light pollution, so I know what it’s like to gaze upon the grandeur of our galaxy. As an editor at TechnoBuffalo, I visited NASA’s Jet Propulsion Lab in Pasadena, Calif., to learn about the Mars rover. I also took a guided tour of the Goldstone Deep Space Communications Complex, where I saw enormous satellites used to communicate with faraway spacecraft. Over the last 10 years, I’ve written about gadgets and space for outlets like CNN Underscored, TechnoBuffalo, and Popular Science, and this guide, in a way, allows me to write about both. If you’re searching for a projector for movie night, you’re in the wrong place (though we do have a guide for the best projectors, the best home theater projectors, and the best outdoor projectors). But if you enjoy the stars of the sky as much as you do the stars of the screen, read on.

The best star projectors: Reviews & Recommendations

Whether you’re looking to liven up your space with colorful lights or follow in the footsteps of Carl Sagan, a star projector is a novel way to explore the cosmos. When making our picks, we found a balance between fantastical projectors, options for kids and adults, and a more scientifically accurate model that’s great for those who love astronomy.

Best overall: Sega Toys Homestar Flux

Sega


Why it made the cut: Sega’s Homestar Flux features the most scientifically accurate images out of all the star projectors we picked.

Specs 

  • Dimensions: 6.3 x 6.3 x 5.9 inches (LWH)
  • Weight: 1.36 pounds
  • Power: USB

Pros 

  • Supports multiple discs
  • Projects up to 60,000 stars at once
  • Great educational tool

Cons 

  • Expensive

Sega’s Homestar Flux is the closest thing to a planetarium if you’re a fan of astronomy and intend to use your star projector as an educational tool. It can project up to 60,000 stars at once and covers a circle with a 106-inch diameter. Unlike the other star projectors on this list, Sega’s model supports interchangeable discs, allowing owners to explore different parts of the universe in incredible detail. The Homestar Flux comes with two discs, the Northern Hemisphere and the Northern Hemisphere with constellation lines; it also supports additional discs that feature the Andromeda Galaxy, the southern hemisphere, and more. 

These discs contain data from different missions of the National Aeronautics and Space Administration (NASA), the National Oceanic and Atmospheric Administration (NOAA), and the United States Naval Observatory (USNO). While Sega’s projector is pricey, it features the most scientifically accurate experience and is a must-have for would-be astronomers.

Best portable: NEWSEE Northern Lights Star Projector

NEWSEE


Why it made the cut: NEWSEE’s Northern Lights Star Projector lets you take the magic of the stars with you everywhere.

Specs 

  • Dimensions: 4.7 x 4.7 x 4.8 inches (LWH)
  • Weight: 1.15 pounds
  • Power: USB-C

Pros 

  • Battery powered
  • 360-degree projection
  • White noise mode
  • Bluetooth streaming

Cons 

  • Don’t expect high-fidelity audio

NEWSEE’s Northern Lights Star Projector is the only model we’re recommending that can be taken anywhere. The battery-powered projector can run for a couple of hours before needing to be recharged—though because it has a USB-C port, you can plug it into a portable charger to extend its life. The projector sits on a stand and can be rotated so that you can find the best angle for your room. This flexibility comes in handy if you end up moving the projector from room to room.

You can program NEWSEE’s projector to display one of four different star patterns, and play five different white noises. This star projector can even be used as a Bluetooth speaker for playing any music from your digital library. However, you shouldn’t get your hopes up where audio fidelity is concerned—consider this a fun bonus feature. If you want to take a star projector to a friend’s place or on vacation, this is the one to grab.

Best for adults: BlissLights Sky Lite

BlissLights


Why it made the cut: The Sky Lite from BlissLights will help you set the mood with the right lighting.

Specs 

  • Dimensions: 5.95 x 2.91 x 5.95 inches (LWH)
  • Weight: 1.68 pounds
  • Power: AC adapter

Pros 

  • Adjustable brightness
  • Tilting base
  • App controlled

Cons 

  • Projector design is easy to tip over

The Sky Lite from BlissLights is an excellent option for adults because it offers brightness controls and several lighting effects, making it easy to set the proper mood. While star projectors generally become the center of attention in whatever room they’re in, the Sky Lite is excellent as complementary lighting, casting colorful auroras during dinner, movie nights, and parties. Additionally, the Sky Lite 2.0 supports a rotation feature and a shutoff timer so that you can have your magical night under the stars before nodding off to bed. 

Best for kids: Gdnzduts Galaxy Projector

Gdnzduts


Why it made the cut: This galaxy projector features brightness controls and a shutoff timer, plus it doubles as a colorful night light.

Specs 

  • Dimensions: 6.45 x 6.45 x 4.92 inches (LWH)
  • Weight: 0.61 pounds
  • Power: USB

Pros 

  • Built-in speaker
  • Shutoff timer
  • Brightness controls

Cons 

  • Doesn’t show constellations

This simple galaxy projector features 21 lighting effects, a shutoff timer, brightness controls, and doubles as a night light. That way, you can find the right effect you like, adjust the brightness, and set a timer before bed. You can also toggle the lasers on and off, turning off the stars and letting the nebula-like effect lull you to sleep. The Galaxy Projector also comes with a remote, making it easy for kids to operate. Whether you want to inspire your kid’s imagination or keep them feeling safe with a night light, the Galaxy Projector is an excellent choice.

Best budget: Infmetry Star Projector

Amazon


Why it made the cut: Infmetry’s Star Projector offers an array of features at an affordable price.

Specs 

  • Dimensions: 7.1 x 7.1 x 7.5 inches (LWH)
  • Weight: 1.37 pounds
  • Power: USB

Pros 

  • Affordable
  • Five brightness modes
  • Shutoff timer

Cons

  • No nebula or aurora features

Infmetry’s Star Projector casts 360 degrees of light through a precut dome, creating a night sky-like effect. This model also supports five brightness modes, a breathing mode, and four colors (white, yellow, blue, and green). There’s also a shutoff timer, so you can fall asleep with the projector on and wake up with it off. It’s not nearly as captivating as the other options on this list, but for the price, it’s a fun way to introduce someone to the wonders of the universe.

What to consider when buying the best star projectors

Generally, cheap star projectors are novelties that emit a mix of colorful swirling LED lights and class 2 lasers, which are low-power visible lasers—the same type used in laser pointers. While most models aren’t scientifically accurate, they provide a fanciful escape and can offer a calming experience. However, if you’re serious about astronomy and willing to spend more, you can find a star projector that can turn your room into a personal planetarium.

Most models we researched offer features like brightness and color controls, image rotation, and an automatic shut-off timer. We found picking the right star projector is more about finding the experience that matches your mood. Are you looking for the cosmic color of nebulae? What about scientifically accurate constellations? Whatever you’re after, there’s a star projector for everyone.

Projection type

You’d think that a star projector only projects, well, stars. But many of them can cover the broad cosmic spectrum and mimic everything from nebulae to auroras to constellations. As we mentioned, picking the right one is about capturing your interest and imagination. A projector that can cast a nebula or aurora is an excellent choice if you want to create a calming environment before going to sleep. A star projector with more scientifically accurate images is ideal for studying and educational use.

Brightness control

A good star projector uses an LED bulb and offers multiple brightness settings. While star projectors are most effective in a dark room, the models that project nebula and aurora make for great complementary lighting, such as during a party or movie night. They also make for good night lights and can help create a calming environment that encourages rest.

Color settings

In addition to adjusting brightness, most star projectors offer different color settings, similar to smart light bulbs. Users can create a scene that fits their mood through advanced color settings and change it with the press of a button. A green aurora may be suitable for calm and tranquility, while yellow may be ideal for happiness and optimism. Most star projectors allow color adjustments through a controller or smartphone app and support millions of color options.

Still vs. rotating

Star projectors generally offer different viewing modes: still and rotating. A projector that operates in still mode will cast light onto a surface and remain static. A projector with a rotating feature will put on a more dynamic light show by slowly rotating the lights. Many of the models we looked at are capable of switching between still and rotating modes.

Extra features

Beyond simply projecting lights onto a wall, some star projectors include extra features like white noise, app support, and shutoff timers. Some models can even be synced with your music so that you can put on a cosmic light show. While these features aren’t necessary, they make specific models more appealing, especially if you intend to use a star projector in a child’s room, because it can act as a night light and white noise machine and then shut off after a few hours.

FAQs

Q: How much do the best star projectors cost?

Star projectors can start at $10 and go up to the $150-$200 range, depending on quality and additional features. For example, something that provides a planetarium-like experience will be more expensive than one that has constellations etched into the cover.

Q: Can I use a star projector on any wall?

Yes, you can use a star projector on any wall in your home. These projectors typically have a short throw, which essentially means they need to be relatively close to the surface they’re projecting onto (between 6-10 feet). We recommend pointing your star projector at a blank wall or ceiling so that you can enjoy the maximum effect of the colorful lights without distractions.

Q: Where should a star projector be placed in a room?

For an evenly lit ceiling, you should try to put the star projector in the center of your room. We realize that’s not ideal for most people, so any place you have an outlet is a good spot. There’s no wrong place to put a star projector, as long as it’s not too close to the surface it’s projecting onto. What works and looks best to you may not be the same for someone else.

Q: Are star projectors for kids and adults the same?

Star projectors are appropriate for any age and generally offer the same features or designs, whether for a kid or an adult. There are some variations if you do want one for a specific age group. For example, some models might come with imagery, such as an astronaut or spaceship, aimed at younger audiences. Meanwhile, projectors that offer scientifically accurate images might only appeal to adults or people who are enthusiastic about astronomy. If you’re purchasing a star projector for someone, you must consider their interests to get the most out of what you buy.

Q: Are star projectors good to use as night lights?

If the imagery and colors of a star projector make you feel more relaxed, then you should use it as a night light. While not their intended purpose, many models we researched feature brightness and timer settings, making them suited for bedtime use. Some also have built-in sound machines, which some people claim help them fall asleep and stay asleep.

Final thoughts on the best star projectors

Star projectors are a fun and affordable way to add bright, colorful lights to your bedroom. That said, most are nothing more than novelties and put on light shows that vaguely resemble nebulae and auroras. If you’re searching for something with more scientifically accurate imagery, you can find some excellent options if you don’t mind spending more money. Better yet, we recommend traveling to a place unaffected by light pollution and experiencing the feeling of seeing millions of stars in person.

Why trust us

Popular Science started writing about technology more than 150 years ago. There was no such thing as “gadget writing” when we published our first issue in 1872, but if there was, our mission to demystify the world of innovation for everyday readers means we would have been all over it. Here in the present, PopSci is fully committed to helping readers navigate the increasingly intimidating array of devices on the market right now.

Our writers and editors have combined decades of experience covering and reviewing consumer electronics. We each have our own obsessive specialties—from high-end audio to video games to cameras and beyond—but when we’re reviewing devices outside of our immediate wheelhouses, we do our best to seek out trustworthy voices and opinions to help guide people to the very best recommendations. We know we don’t know everything, but we’re excited to live through the analysis paralysis that internet shopping can spur so readers don’t have to.

The post The best star projectors for 2023 appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Internet use dipped in the eclipse’s path of totality https://www.popsci.com/technology/eclipse-internet-drop/ Tue, 09 Apr 2024 19:16:12 +0000 https://www.popsci.com/?p=610142
People looking up at eclipse wearing protective glasses
Internet usage dropped as much as 60 percent in some states while people watched the eclipse. Photo by Brad Smith/ISI Photos/USSF/Getty Images for USSF

Data shows a lot of people logged off during the cosmic event.

The post Internet use dipped in the eclipse’s path of totality appeared first on Popular Science.

]]>
People looking up at eclipse wearing protective glasses
Internet usage dropped as much as 60 percent in some states while people watched the eclipse. Photo by Brad Smith/ISI Photos/USSF/Getty Images for USSF

New data indicates a once-in-a-generation eclipse is a pretty surefire way to convince people to finally log off the internet—at least for a few minutes. According to estimates from cloud-computing provider Cloudflare, yesterday’s online traffic dropped between 40 and 60 percent week-over-week within the April 8 eclipse’s path of totality. In aggregate terms for the US, “bytes delivered traffic dropped by 8 percent and request traffic by 12 percent as compared to the previous week” around 2:00 pm EDT.

According to NASA, yesterday’s path of totality included a roughly 110-mile-wide stretch of land as it passed across Mazatlán, Mexico, through 13 states within the continental US, and finally over Montreal, Canada. In America alone, an estimated 52 million people lived within the eclipse’s path of totality. And it certainly seems like a lot of them put down their phones and laptops to go outside and have a look.

[Related: What a total eclipse looks like from space.]

As The New York Times highlights, Vermont saw the largest mass log-off, with an estimated 60-percent drop in internet usage compared to the week prior. South Carolinians, meanwhile, appeared to be the least compelled to take a computer break, since their traffic only dipped by around four percent.

Map of solar eclipse internet traffic change in US from Cloudflare
Credit: Cloudflare

Interestingly, you can also glean a bit about weather conditions during the eclipse from taking a look at Cloudflare’s internet usage map of the US. While most of the states within the event’s trajectory showcase pretty sizable downturns, Texas only experienced a 15 percent reduction. But given that a large part of the Lone Star State endured severe weather conditions, it’s likely many people remained inside—maybe even online to livestream the views of the eclipse elsewhere.

[Related: The full sensory experience of an eclipse totality, from inside a convertible in Texas.]

So what were people doing if they weren’t posting through the eclipse? Well, snapping photos of the moment is always pretty popular, while NASA oversaw multiple volunteer research projects.

Judging from Cloudflare’s data, it didn’t take long for people to log back online once the eclipse ended above them. Usage appeared to spike back to pretty standard levels almost exactly in time with the event’s ending in any given state. No doubt most people rushed to post their reactions, photos, and videos… but maybe yesterday will still serve as a nice reminder that there’s a lot more to see when you take a break and go outside for a bit.

The post Internet use dipped in the eclipse’s path of totality appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
What a total solar eclipse looks like from space https://www.popsci.com/science/eclipse-from-space/ Mon, 08 Apr 2024 23:06:13 +0000 https://www.popsci.com/?p=609974
a shadow on earth's surface
The view of Earth from the International Space Station during the eclipse. NASA

NASA shared an eerie view of the moon's shadow passing over Earth.

The post What a total solar eclipse looks like from space appeared first on Popular Science.

]]>
a shadow on earth's surface
The view of Earth from the International Space Station during the eclipse. NASA

Darkness, slivers of sunshine, and crescent shadows: The 2024 total solar eclipse put on quite a show. Down here on Earth, millions of people witnessed the fascinating sight of the moon passing in front of the sun. But a select few people had the chance to experience the eclipse from a different perspective: space.

The current residents of the International Space Station watched not only the actual eclipse, but what happened to Earth as the eclipse occurred. In a video shared by NASA, you can see the ominous shadow of the moon sliding over the surface of our planet.

“I can hardly imagine a view being better than the one we have right now, but if there is one, it’s from the Space Station,” NASA’s Earth-bound livestream commentators noted.

North America will not experience another total solar eclipse until August 23, 2044.

The post What a total solar eclipse looks like from space appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
The full sensory experience of eclipse totality, from inside an Audi convertible https://www.popsci.com/technology/eclipse-audi-convertible/ Mon, 08 Apr 2024 20:58:06 +0000 https://www.popsci.com/?p=609843
a convertible in blackness during the eclipse
NASA’s eclipse expert Dr. Kelly Korreck says that the celestial event is a whole body experience: temperature, sound, and sight. Andi Hedrick/Audi

We headed into the path of totality in a techy open-air Audi S5 Cabriolet.

The post The full sensory experience of eclipse totality, from inside an Audi convertible appeared first on Popular Science.

]]>
a convertible in blackness during the eclipse
NASA’s eclipse expert Dr. Kelly Korreck says that the celestial event is a whole body experience: temperature, sound, and sight. Andi Hedrick/Audi

NASA’s Science Mission Directorate Heliophysics Division studies the nature of the sun and everything it touches. That includes the Earth, the atmosphere, and the magnetosphere, which is basically the planet’s force field against solar wind and radiation. As the United States amps up to a fever pitch due to today’s total solar eclipse, NASA is ground zero for the most interesting studies and history about this natural phenomenon.

Today the sun is more of a rock star than usual, with “eclipse parties” in full swing and roadside stands selling commemorative t-shirts and cardboard viewing glasses popping up all along the path of totality. Dr. Kelly Korreck, a heliophysicist and NASA’s eclipse lead, gave us the background on this captivating astro-event and offered tips on the best viewing areas.

We asked Dr. Korreck if watching the eclipse from a convertible (specifically, a tech-focused Audi S5 Cabriolet) would be a good idea, and she said it would be very appropriate. After all, besides safety glasses and a clear view of the sky, the only other thing you need is a great place to sit and lean your head back. 

As we waited for the clouds to clear from the sky, our photography team was a bit nervous. We got glimpses of the eclipse as the moon cast its great shadow, but would it clear? We’d soon find out.


Spoiler: It was amazing. Video: Audi

An eclipse ushers in boatloads of scientific data points

If the moon’s shadow doesn’t excite you, consider this: Albert Einstein published his theory of general relativity in 1915, but it wasn’t confirmed until the total solar eclipse of 1919, when Sir Arthur Eddington and his team measured the influence of the sun’s gravity on starlight.

Dr. Korreck has been fascinated by the star at the heart of our solar system–the sun, of course–since long before she earned her doctorate on the subject. Scientists have long used solar eclipses to make scientific discoveries, she says. Eclipses led us to the first detection of helium, for instance, and this one will continue to give scientists the opportunity to study the sun’s effect on the ionosphere. Disturbances in the ionospheric layer can cause blips in our GPS navigation systems and communications, especially radio waves.  

To that end, we tested the Audi S5’s unique Bluetooth-connected seatbelt microphones, which enable clear conversations even with the top down. Three thumbtack-sized microphones are built into the outward-facing side of the seatbelt, which makes talking to someone like a brilliant NASA heliophysicist even more interesting. We also kept an eye on the S5’s GPS system, which didn’t flinch. 

seats in a car with the seatbelt pulled. on the seat belt are three dots that are microphones
Audi’s seatbelt microphones offer clearer conversations with the top down. Image: Audi

Eclipses happen about every 18 months somewhere in the world, but only in the same place every 400 to 1000 years, Dr. Korreck told us. In fact, the last total solar eclipse in Austin, Texas was more than 600 years ago, in 1397. Austin didn’t even exist back then. And the next one won’t be until 2343, long after we’re all gone. 

“Any specific town or city normally only gets an eclipse between every 400 and 1,000 years,” Dr. Korreck says. “So it’s very rare to [see one] in a specific location, but somewhere on Earth is getting this special dance, this special alignment of the planets.” 

The reason this particular total eclipse is so unusual is that it’s occurring during the period of “solar maximum,” when the sun is most active. There’s even a chance to see “streamers,” which NASA says will look like bright, pink curls or loops emanating from the sun. Heliophysicists (and the entire scientific community) are excited about this eclipse because of the length of totality and the intensity of the sun’s magnetic field during this period. 

“We’re at four and a half minutes for this eclipse,” Dr. Korreck says. “It was only two and a half minutes maximum in 2017, but it’ll be six-ish minutes in 2045. So we have more to look forward to in 20 years.” 

It’s more than just a visual event

When the moon stands between the sun and the Earth, the temperature outside can drop quickly – up to 10 degrees. I turned on the heated headrest, which blows warm air onto my neck, a welcome feature when you’re chilly. In Texas, it’s hot more often than it’s cold, so typically I’d use the cool setting to whisper cooling air instead. During an eclipse, the shroud of shadow blocking the sun erases heat quickly. So the sky goes dark, the temperature falls, and there’s even a measurable sound component. 

Image: Andi Hedrick/Audi

“We mapped the bright light of the sun to a flute sound,” Harvard astronomer Allyson Bieryla told CNN on Friday. “Then it goes to a midrange, which is a clarinet, and then during totality, it kind of goes down to a low clicking sound, and that clicking even slows down during totality.”

That doesn’t even count the chirps, croaks, whines, and other sounds of the animal and insect kingdom as they process the odd turn of light during the event.

“I think in general, an eclipse is such a full body experience,” Dr. Korreck says. “It gets colder, the light changes, the shadow gets a bit sharper. It’s a way to really experience a celestial event more than just a visual. Take some time to really enjoy it and take advantage of the special alignment that we have.” 

As the moment of totality approached, nearby horses brayed and dogs barked, as if it were truly twilight. And then it happened: The clouds parted and the sky grew dark, the animals quieted, and a stillness blanketed the landscape. We could see solar prominences peeking out from behind the moon’s edge, and Venus appeared below the sun. Outside of the S5 Cabriolet, the car’s headlights and taillights cast a signature pattern. For a couple of minutes, time stood still, and then daylight crept in again. It’s something I’ll never forget.

The post The full sensory experience of eclipse totality, from inside an Audi convertible appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
14 stellar photos from the 2024 total solar eclipse https://www.popsci.com/science/2024-eclipse-photos/ Mon, 08 Apr 2024 20:35:11 +0000 https://www.popsci.com/?p=609896
a partial eclipse behind the hand of the statue of liberty
A partial solar eclipse moves across the sky near the Crown of the Statue of Liberty on Liberty Island. TIMOTHY A. CLARY / AFP

April 8th's total solar eclipse began on the Pacific coast of Mexico and ended off the Atlantic coast of Canada.

The post 14 stellar photos from the 2024 total solar eclipse appeared first on Popular Science.

]]>
a partial eclipse behind the hand of the statue of liberty
A partial solar eclipse moves across the sky near the Crown of the Statue of Liberty on Liberty Island. TIMOTHY A. CLARY / AFP

Today was one for the history books as a total solar eclipse crossed North America. The sky first darkened in Mazatlán, Mexico on the country’s Pacific Coast. Torreón, Mexico saw the longest totality at 4 minutes and 28 seconds. It then entered the United States through Texas and traveled through Oklahoma, Arkansas, Missouri, Illinois, Kentucky, Indiana, Ohio, Pennsylvania, New York, Vermont, New Hampshire, and Maine. It entered Canada via Southern Ontario, and continued through Quebec, New Brunswick, Prince Edward Island, and Nova Scotia. The eclipse left continental North America on the Atlantic coast of Newfoundland, Canada, at 5:16 p.m. NDT. 

Here’s how the eclipse looked at various locations, from Mexico to Canada.

the moon covers the sun
The moon eclipses the sun during a total solar eclipse across North America, at Niagara Falls State Park in Niagara Falls, New York. The next total solar eclipse that can be seen from a large part of North America won’t come around until 2044. Photo by ANGELA WEISS / AFP
pink ejections appear on the edges of a black eclipse
Solar prominences are seen during a total solar eclipse in Dallas, Texas. Photo by NASA/Keegan Barber
a partial sliver of the sun seen above the washington monument
The solar eclipse is seen above the Washington Monument in Washington, DC. Photo by Chip Somodevilla/Getty Images
The moon eclipses the sun during a total solar eclipse across North America, in Bloomington, Indiana, on April 8, 2024. This year's path of totality is 115 miles (185 kilometers) wide and home to nearly 32 million Americans, with an additional 150 million living less than 200 miles from the strip. The next total solar eclipse that can be seen from a large part of North America won't come around until 2044. (Photo by JOSH EDELSON / AFP)
The moon eclipses the sun during a total solar eclipse across North America, in Bloomington, Indiana. Photo by JOSH EDELSON / AFP
sliver of sun with clouds
A sliver of the sun is seen through the clouds in Niagara Falls, Ontario, Canada. Photo by Vaughn Ridley/Getty Images
child's hand with the eclipse
A child observes the reflection of the eclipse in Guadalajara, Mexico. Photo by Leonardo Alvarez Hernandez/Getty Images
the moon covers the sun
A solar eclipse is seen through the clouds in Niagara Falls, Ontario, Canada. Photo by Vaughn Ridley/Getty Images
a composite of the eclipse showing all stages above a lake
This composite image of multiple exposures shows the progression of a total solar eclipse in Dallas, Texas. Photo by NASA/Keegan Barber
the eclipse behind the tip of the washington monument
The Moon, top, is seen passing in front of the Sun, with the top of the Washington Monument in silhouette. Photo by NASA/Bill Ingalls
progression of eclipse
This composite image of multiple exposures shows the progression of a total solar eclipse in Dallas, Texas. Photo by NASA/Keegan Barber
a sliver of the sun remains as moon and clouds move over it
The Moon is seen passing in front of the Sun just before totality during a solar eclipse in Kerrville, Texas. Photo by NASA/Aubrey Gemignani
a tiny spot of the sun shines through as the moon approaches full totality
A total solar eclipse is seen from the Indianapolis Motor Speedway. Photo by NASA/Joel Kowsky
people hold up smartphones to capture the eclipse
People are seen as they watch a total solar eclipse at the Indianapolis Motor Speedway. Photo by NASA/Joel Kowsky

And if you’re wondering what the eclipse looked like from space, NASA shared the view from the International Space Station.

If you can, consider recycling or donating any used eclipse glasses. Visit Astronomers Without Borders to learn more about how you can recycle your glasses. If you are located in the path of totality, many libraries will also offer convenient eclipse glasses recycling locations.

The post 14 stellar photos from the 2024 total solar eclipse appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
How to look at the eclipse without damaging your eyes https://www.popsci.com/how-to-not-damage-eyes-during-eclipse/ Tue, 22 Aug 2017 01:02:11 +0000 https://www.popsci.com/uncategorized/how-to-not-damage-eyes-during-eclipse/
a couple wearing glasses sits on the grass watching the eclipse
You need proper eclipse eyewear. DepositPhotos

It’s always a bad idea to look directly at the sun.

The post How to look at the eclipse without damaging your eyes appeared first on Popular Science.

]]>
a couple wearing glasses sits on the grass watching the eclipse
You need proper eclipse eyewear. DepositPhotos

Today, millions of people will have a chance to watch a total solar eclipse. If you’re one of them, be careful: looking directly at a solar eclipse without eye protection can permanently damage your vision.

It doesn’t matter if our rocky satellite is blocking all or some of our nearest star—the sun is still an incredibly bright source of light. Don’t risk your eyesight for a quick glimpse or even a once-in-a-lifetime event. Thankfully, it’s pretty easy to protect your eyes while watching an eclipse.

What happens if you look at a solar eclipse

We are able to see thanks to photoreceptors. These cells, also known as rods and cones, are located at the backs of our eyes, and convert the light reflected by the world around us into electrical impulses that our brain interprets as the image we see. But when strong light, like that from the sun, hits our eyes, a series of chemical reactions occur that damage and often destroy these rods and cones. This is known as solar retinopathy, and can make our eyesight blurry. Sometimes, if the damage is too great in one area, you can lose sight completely.

[Related: Every sunset ends with a green flash. Why is it so hard to see?]

On a typical sunny day, you almost never have to worry about solar retinopathy. That’s because our eyes have natural mechanisms that ensure too much light doesn’t get in. When it’s really bright outside, our pupils get super tiny, reducing the amount of sunlight that can hit your photoreceptors. But when you stare directly at the sun, your pupils’ shrinking power isn’t enough to protect your peepers.

This is where your eyes’ second defense mechanism comes into play. When we look at something bright, we tend to blink. This is known as the corneal or blink reflex, and it prevents us from staring at anything too damagingly bright. 

Just before a solar eclipse has reached its totality, the moon is partially blocking the sun, making it a lot easier for us to look up at the star without blinking. But that doesn’t mean you should—even that tiny sliver of sunlight is too intense for our sensitive photoreceptors.

[Related: Total eclipses aren’t that rare—and you’ve probably missed a bunch of them]

Unfortunately, if you practice unprotected sun-gazing, you probably won’t know the effects of your actions until the next morning, when the damage to your photoreceptors has kicked in.

And while solar retinopathy is extremely rare, it is by no means unheard of. If you search the term in medical journals, you’ll find case reports after almost every popular solar eclipse. Let’s try really hard to do better this time, eyeball-havers.

How to safely watch a solar eclipse

Watching the eclipse with your own two eyes is easy: just wear legitimate eclipse sunglasses. These are crucial, as they will block the sun’s rays enough for you to safely see the eclipse without burning your eyes out.

And if you don’t have eclipse glasses, you can still enjoy the view, albeit not directly. Try whipping up your own eclipse projector or a DIY pinhole camera so you can enjoy the view without having to book an emergency visit to the eye doctor.

This story has been updated. It was originally published in 2017.

The post How to look at the eclipse without damaging your eyes appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Could a self-sustaining starship carry humanity to distant worlds? https://www.popsci.com/science/starship-humanity-distant-worlds/ Sun, 07 Apr 2024 16:00:00 +0000 https://www.popsci.com/?p=609101
spacecraft
The concept of a species being liberated from its home planet has been the dream of sailors and stargazers since the beginning of recorded history. NASA/Rick Guidice, Public domain, via Wikimedia Commons

Generation ships offer a tantalizing possibility: transporting humans on a permanent voyage to a new home among the stars.

The post Could a self-sustaining starship carry humanity to distant worlds? appeared first on Popular Science.

]]>
spacecraft
The concept of a species being liberated from its home planet has been the dream of sailors and stargazers since the beginning of recorded history. NASA/Rick Guidice, Public domain, via Wikimedia Commons

This article was originally featured on MIT Press Reader. This article is adapted from Christopher Mason’s book “The Next 500 Years: Engineering Life to Reach New Worlds.”

The only barrier to human development is ignorance, and this is not insurmountable.

Robert Goddard

Until 1992, when the first exoplanets were discovered, there had never been direct evidence of a planet found outside our solar system. Thirty years after this first discovery, thousands of additional exoplanets have been identified. Further, hundreds of these planets are within the “habitable zone,” indicating a place where liquid water, and maybe life, could be present. However, to get there, we need a brave crew to leave our solar system, and an even braver intergenerational crew to be born into a mission that, by definition, they could not choose. They would likely never see our solar system as anything more than a bright dot among countless others.

The idea of having multiple generations of humans live and die on the same spacecraft is actually an old one, first described by rocket engineer Robert Goddard in 1918 in his essay “The Last Migration.” As he began to create rockets that could travel into space, he naturally thought of a craft that would keep going, onward, farther, and eventually reach a new star. More recently, the Defense Advanced Research Projects Agency (DARPA) and NASA launched a project called the 100 Year Starship, with the goal of fostering the research and technology needed for interstellar travel by 2100.

This concept of a species being liberated from its home planet was captivating to Goddard, but it has also been the dream of sailors and stargazers since the beginning of recorded history. Every child staring into the night sky envisions flying through it. But, usually, they also want to return to Earth. One day, we may need to construct a human-driven city aboard a spacecraft and embark on a generational voyage to another solar system—never meant to return.

Distance, energy, particle assault

Such a grand mission would need to overcome many enormous challenges, the first and perhaps most obvious being distance. Not including the sun, the closest known star to Earth (Proxima Centauri) is 4.24 light-years, or roughly 25 trillion miles, away. Although 4.24 light-years is a mere hop on the cosmic scale, it would take quite some time to get there with our current technology.

The Parker solar probe, launched by NASA in 2018, is the fastest-moving object ever made by humans, clocking in at 430,000 miles per hour. But even at this speed, it would take 6,617 years to reach Proxima Centauri. Or, put another way, it would take roughly 220 human generations to make the trip.
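
For readers who like to check the arithmetic, here is a rough sketch in Python. The distance and speed come from the figures above; the roughly 30-year generation length is an assumption used only to reproduce the "220 generations" estimate.

```python
# Back-of-the-envelope: travel time to Proxima Centauri at Parker Solar Probe speed.
DISTANCE_MILES = 25e12            # ~4.24 light-years, per the figure above
SPEED_MPH = 430_000               # Parker Solar Probe's top speed
HOURS_PER_YEAR = 24 * 365.25
YEARS_PER_GENERATION = 30         # assumed generation length

travel_years = DISTANCE_MILES / SPEED_MPH / HOURS_PER_YEAR
generations = travel_years / YEARS_PER_GENERATION

print(f"Travel time: ~{travel_years:,.0f} years")    # ~6,600 years
print(f"Generations: ~{generations:,.0f}")           # ~220
```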


The only way to decrease this number would be to move faster. Which brings us to our second challenge: finding the needed energy for propulsion and sustenance. To decrease the amount of time (and the number of generations) it would take to get to the new star, our speed would need to increase through either burning more fuel or developing new spacecraft with technology orders of magnitude better than what is currently at hand. Regardless of the technology used, the acceleration would likely need to come from one or a combination of these sources: prepackaged (nonrenewable) fuel, energy collected from starlight (which would be more challenging when between stars), elements like hydrogen in the interstellar medium, or by slingshotting off of celestial bodies.

The latest advancements in thrust technology might help refocus this issue. Nuclear fusion offers a promising solution, as it produces less radiation and converts energy more efficiently than other methods, which would enable spacecraft to reach much higher speeds. Leveraging nuclear fusion, as envisioned by Project Daedalus (British Interplanetary Society) and Project Longshot (U.S. Naval Academy/NASA), offers a path to interstellar travel within a single human lifetime. These studies suggest that a fusion-powered spacecraft could reach speeds exceeding 62 million miles per hour, potentially reducing travel times to nearby stars to just 45 years.
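
As a quick sanity check on those figures, 62 million miles per hour works out to a bit under a tenth of the speed of light, which lines up with a roughly 45-year cruise to Proxima Centauri (ignoring the time spent accelerating and decelerating). A minimal sketch:

```python
# How fast is 62 million mph as a fraction of light speed, and how long is the trip?
C_MPH = 670_616_629               # speed of light in miles per hour
FUSION_SPEED_MPH = 62e6
DISTANCE_LIGHT_YEARS = 4.24

fraction_of_c = FUSION_SPEED_MPH / C_MPH
cruise_years = DISTANCE_LIGHT_YEARS / fraction_of_c

print(f"Fraction of light speed: {fraction_of_c:.1%}")   # ~9.2%
print(f"Cruise time: ~{cruise_years:.0f} years")         # ~46 years
```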

Yet even if we address the challenges of distance and energy by designing an incredibly fast, fuel-efficient engine, we’re faced with another problem: the ever-present threat of micrometeoroids. Consider that a grain of sand moving at 90 percent of the speed of light carries kinetic energy comparable to a small nuclear bomb (roughly two kilotons of TNT). Given the range of particle sizes floating around in space and the extremely high velocities proposed for this mission, any encounter would be potentially catastrophic. This, too, would require further engineering to overcome, as the thick shielding available to us now would not only degrade over time but would likely be far too heavy. A few solutions might be creating lighter polymers, which could be replaced and repaired as needed in flight; extensive long-distance monitoring to identify large objects before impact; or some kind of protective field projected from the spacecraft’s front, capable of deflecting or absorbing incoming particles.
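
The kinetic-energy comparison comes straight from the relativistic formula KE = (gamma - 1)mc². How many kilotons you get depends entirely on how heavy you assume "a grain of sand" to be; the two-kiloton figure corresponds to a fairly hefty grain of several tens of milligrams. A hedged sketch:

```python
import math

# Relativistic kinetic energy of a dust grain at 90% of light speed:
# KE = (gamma - 1) * m * c^2, converted to kilotons of TNT.
C = 2.998e8                        # speed of light, m/s
JOULES_PER_KILOTON = 4.184e12

def kinetic_energy_kilotons(mass_kg, beta=0.9):
    gamma = 1.0 / math.sqrt(1.0 - beta ** 2)
    return (gamma - 1.0) * mass_kg * C ** 2 / JOULES_PER_KILOTON

# The grain mass is an assumption; the answer scales linearly with it.
for mass_mg in (1, 10, 70):
    print(f"{mass_mg} mg grain at 0.9c: ~{kinetic_energy_kilotons(mass_mg * 1e-6):.2f} kilotons of TNT")
# Output: roughly 0.03, 0.28, and 1.95 kilotons, respectively.
```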

Physiological and psychological risks

As exemplified by the NASA Twins Study, the SpaceX Inspiration4 mission, and additional NASA one-year and six-month missions, the crews of a generation ship would face another critical issue: physiological and psychological stress. One way to get around the technological limitation of either increasing the speed of our ships or protecting the ships from colliding with debris is to, instead, slow biology using hibernation or diapause. However, humans who overeat and lie around all day with little movement in simulated hibernation or bed-rest studies can run a higher risk of developing type 2 diabetes, obesity, heart disease, and even death. So, how do bears do it?

During hibernation or torpor, bears are nothing short of extraordinary. Their body temperature dips, their heart rate plummets to as low as five beats per minute, and for months, they essentially do not eat, urinate, or defecate. Remarkably, they’re able to maintain their bone density and muscle mass. Part of their hibernation trick seems to come from turning down their sensitivity to insulin by maintaining stable blood glucose levels. Their heart becomes more efficient as well. A bear essentially activates an energy-saving, “smart heart” mode, relying on only two of its four chambers to circulate thicker blood.

In 2019, a seminal study led by Joanna Kelley at Washington State University revealed striking gene expression changes in bears during hibernation. Researchers used the same Illumina RNA-sequencing technology as used in NASA’s Twins Study to examine the grizzly bears as they entered hyperphagia (when bears eat massive quantities of food to store energy as fat) and then again during hibernation. They found that tissues across the body had coordinated, dynamic gene expression changes occurring during hibernation. Though the bears were fast asleep, their fatty tissue was anything but quiet. This tissue showed extensive signs of metabolic activity, including changes in more than 1,000 genes during hibernation. These “hibernation genes” are prime targets for people who would prefer to wait in stasis on the generation ship than stay awake.

Another biological mechanism that we could utilize on the generation ship is diapause, which enables organisms to delay their own development in order to survive unfavorable environmental conditions (e.g., extreme temperature, drought, or food scarcity). Many moth species, including the Indian meal moth, can start diapause at different developmental stages depending on the environmental signals. If there is no food to eat, as in a barren desert, it makes sense to wait until there is a better time and the rain of nutrients falls.

Diapause is actually not a rare event; embryonic diapause has been observed occurring in more than 100 mammals. Even after fertilization, some mammalian embryos can decide “to wait.” Rather than immediately implanting into the uterus, the blastocyst (early embryo) can stay in a state of dormancy, where little or no development takes place. This is somewhat like a rock climber pausing during an ascent, such as when a storm arrives, then examining all of the potential routes they may take and waiting until the storm passes. In diapause, even though the embryo is unattached to the uterine wall, the embryo can wait out a bad situation, such as a scarcity of food. Thus, the pregnant mother can remain pregnant for a variable gestational period, in order to await improved environmental conditions. The technology to engage human hibernation or diapause doesn’t exist in the 21st century, but one day might.

The impact of weightlessness, radiation, and mission stress on the muscles, joints, bones, immune system, and eyes of astronauts is not to be underestimated. The physiological and psychological risks of such a mission are especially concerning given that the majority of existing models are based on trips that were relatively short and largely protected from radiation by the Earth’s magnetosphere, with the most extensive study so far from Captain Scott Kelly’s 340-day trip.

Artificial gravity—essentially building a spacecraft that spins to replicate the effects of Earth’s gravity—would address many of these issues, though not all. Another major challenge would be radiation. There are a number of ways to try and mitigate this risk, be it shielding around the ship, preemptive medications (actively being studied by NASA), frequent temporal monitoring of cell-free DNA (cfDNA) for the early detection of actionable mutations, or cellular and genetic engineering of astronauts to better protect or respond to radiation. The best defense against radiation, especially in a long-term mission outside of our solar system, would likely be through a combination of these efforts.

But even if the radiation problem is solved, the psychological and cognitive strain of isolation and limited social interaction must be addressed. Just imagine if you had to work and live with your officemates and family, for your entire life, in the same building. While we can carefully select the first generation of astronauts for a long generation ship mission, their children might struggle to adapt to the social and environmental aspects of their new home.


Analog missions performed on Earth, such as the Mars-500 project, have shown that after 500 days in isolation with a small crew, most of the relationships were strained or even antagonistic. There are many descriptions of “space madness” in both fiction and nonfiction, but modeling of it and its association with risk is limited. There is simply no way to know how the same crew and its descendant generations would perform in 10 or 100 years, and certainly not over thousands of years. Human history is replete with examples of strife, war, factions, and political backstabbing, but also with examples of cooperation, symbiosis, and shared governance in support of large goals (such as research stations in Antarctica).

Choosing our new home

Before we launch the first-ever generation ships, we will need to gain a large amount of information about the candidate planets to which we are sending the first settlers. One way to do this is by sending probes to potential solar systems, gaining as much detail as possible to ensure ships have what they need before they are launched. Work on such ideas has already begun, as with the Breakthrough Starshot mission proposed by Yuri Milner, Stephen Hawking, and Mark Zuckerberg.

The idea is simple enough, and the physics was detailed by Kevin Parkin in 2018. If there were a fleet of extremely light spacecraft that contained miniaturized cameras, communication equipment, navigation thrusters, and a power supply, they could be “beamed” ahead with lasers to accelerate them. If each minispacecraft had a “lightsail” targetable by lasers, they could all be sped up to reduce the transit time. Such a “StarChip” could make the journey to Proxima Centauri b—an exoplanet orbiting within the habitable zone of Proxima Centauri—in roughly 25 years and send back data for us to review after a bit over four more years of light-speed transit back to Earth. Then, we would have more information on what may be awaiting a crew if that location were chosen. The idea for this plan is credited to physicist Philip Lubin, who imagined in his 2015 article, “A Roadmap to Interstellar Flight,” an array of adjustable lasers that could focus on the StarChip with a combined power of 100 gigawatts to propel the probes to our nearest known star.
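
Working backward from those numbers (a sketch, not part of the original proposal): a 25-year transit over 4.24 light-years implies a cruise speed of roughly 17 percent of light speed, and any data beamed home at light speed needs just over four more years to arrive.

```python
# Implied cruise speed for a 25-year transit, plus the light-speed delay for data sent home.
DISTANCE_LIGHT_YEARS = 4.24
TRANSIT_YEARS = 25

implied_speed = DISTANCE_LIGHT_YEARS / TRANSIT_YEARS    # as a fraction of light speed
signal_delay_years = DISTANCE_LIGHT_YEARS               # radio/laser signals travel at light speed

print(f"Implied cruise speed: ~{implied_speed:.0%} of light speed")   # ~17%
print(f"Data return delay: ~{signal_delay_years:.1f} years")          # ~4.2 years
```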

The ideal scenario would be seeding the destination world in preparation for humans, similar to missions being conducted on Mars. If these StarChips work, they could be used to send microbes to other planets as well as sensors. They certainly have many challenges ahead of them, requiring them to survive the trip, decelerate, and then land on the new planet—no small feat. However, this travel plan is within the range of tolerable conditions for known extremophiles on Earth that casually survive extreme temperatures, radiation, and pressure. Tardigrades, for one, have already survived the vacuum of space and may be able to make the trip, and other “seed” organisms could be sent along, too. Such a “genesis probe” that could seed other planets with Earth-based microbes, first proposed by Claudius Gros in 2016, would obviously violate all current planetary-protection guidelines, but it might also be the best means to prepare a planet for our arrival. Ideally, this would be done only once robotic probes have conducted an extensive analysis of the planet to decrease the chance of harming any life that may already exist there.

The ethics of a generation ship

These biological, tactical, and psychological issues are driven by one key, last constraint on the generation ship: The passengers are stuck there. As such, this issue represents another challenge that must be addressed: the ethical component. What are the ethics of placing an entire group of people on a single spacecraft, with the expectation that they further procreate additional generations of people, on that ship? They would have to live with the knowledge that the ship on which they live, or are born, is the only world they will ever get to know. Certain social, economic, and cultural infrastructure would need to be built into a generation ship, along with recreational activities.

Bodysuits, virtual/augmented reality camera sets, and immersive experience sets have been built for recreational purposes on Earth, and these would be essential for the generation ship’s crews. Groups could play each other in a virtual environment, which would require less infrastructure than traditional sporting events and equipment do. Video games are, after all, not just exploratory and recreational events; they are a technological glue of society. Of course, games are just a single piece of the puzzle. Life aboard a generation ship would be fundamentally different and undeniably more challenging than anything experienced on Earth.

Some critics of sending spacecraft with humans have argued that if an interstellar mission cannot be completed within the lifetime of the crew, then it should not be started at all. Rather, because the technology for propulsion, design of ships, and rocketry (as well as our methods for genome and biological engineering) will all continue to improve, it would be better to wait. It is even possible that if we sent a generation ship to Proxima Centauri b in the year 2500, it would be passed by another spacecraft with more advanced propulsion sent in the year 3000.

This “incessant obsolescence postulate,” first framed by Robert Forward in 1996, is compelling as a thought experiment. Most technologies do tend to get better, and technology has continued to improve in almost all human societies. So how can one know when the right time is? Predicting the future is notoriously difficult.


However, the promise of a perfect option should not be the enemy of a good one. We can send two ships—the first in 2500 and the second in 3000—not just one. If the new ship catches up to the old one, they would likely be able to assist each other and should plan to do so. Further, this obsolescence concern misses the key risk of waiting too long to act. The extinction we are trying to avoid could occur in that 500-year lag, resulting in the obliteration of all life with no backup.

But even with advanced entertainment and potential hope of a new, enhanced ship appearing any moment, would the crew still stare out the windows into constant star-filled skies thinking of blue oceans? Or would they perhaps be elated about being the “chosen ones” with an extraordinary opportunity to explore and, quite literally, build a new world? The reality is this ship would be their world, and, for most, it would be the only world they would get to experience.

Yet this limitation of experience is actually not that different from the lives of all humans in history. All humans have been stuck on just one world, looking to the stars and thinking, “What if?” This vessel, the Earth, while large and diverse, is still just a single ship with a limited landscape, environment, and resources, wherein everyone up to the 21st century lived and died without the choice to leave. A few hundred astronauts have left Earth, temporarily, but they all had to return. The generation ship is just a smaller version of the one on which we grew up, and, if done properly, it may even be able to lead to a planet that is better than what we inherited. The new planet could be fertile ground for expanding life in the universe, while also offering lessons on how to preserve life on Earth.


Christopher E. Mason is a geneticist and computational biologist who leads the Space Omics and Medical Atlas (SOMA) project and the Cornell Aerospace Medicine Biobank (CAMbank). He is Professor of Genomics, Physiology, and Biophysics at Weill Cornell Medicine, Director of the WorldQuant Initiative for Quantitative Prediction, and the author of “The Next 500 Years: Engineering Life to Reach New Worlds,” from which this article is adapted.

The post Could a self-sustaining starship carry humanity to distant worlds? appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
In 1919, one eclipse chaser wanted to mount a telescope on a seaplane https://www.popsci.com/science/1919-eclipse-chasers/ Sat, 06 Apr 2024 16:03:00 +0000 https://www.popsci.com/?p=609006
a plane and an eclipse on a text background
An ambitious plan to mount a telescope on a seaplane. Popular Science

Even a century ago, astronomers went to great lengths not to be foiled by clouds.

The post In 1919, one eclipse chaser wanted to mount a telescope on a seaplane appeared first on Popular Science.

]]>
a plane and an eclipse on a text background
An ambitious plan to mount a telescope on a seaplane. Popular Science

“What can the astronomer do, when, just as the moon is about to obscure the sun during a total eclipse, a cloud intervenes?” Popular Science posed such a dilemma to its readers in a 1919 solar eclipse story. “Pack up and go home” was the answer for the average eclipse viewer. But even in 1919 extreme eclipse chasers had contingency plans.

Roughly every 18 months, the moon’s full shadow hurtles across part of the Earth at a breakneck 1,500 mph. By a twist of cosmic fate unique in our solar system, our planet’s one and only moon happens to be just the right size and distance to completely block the sun’s face, briefly exposing its corona and creating a spectacular sight. But that complete overlap only happens in a narrow path about 100 miles wide—the path of totality.
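
Those two numbers also explain, roughly, why totality is so brief: divide the width of the shadow by its speed and you get a few minutes at most. A back-of-the-envelope sketch (real durations vary with geometry and where you stand in the path):

```python
# Rough upper bound on totality at a single spot: the time it takes the
# ~100-mile-wide umbra to sweep past you at ~1,500 mph.
UMBRA_WIDTH_MILES = 100
SHADOW_SPEED_MPH = 1_500

max_totality_minutes = UMBRA_WIDTH_MILES / SHADOW_SPEED_MPH * 60
print(f"~{max_totality_minutes:.0f} minutes of totality")   # ~4 minutes
```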

Extreme eclipse chasers, who call themselves umbraphiles, will seek that path whenever it comes around, even to the remotest regions of Earth. Since the path carved by the moon’s shadow typically traverses thousands of miles—across oceans and continents—the goal is to pick a destination known for its cloudless skies.

Kelly Korreck, NASA’s program manager for the 2024 solar eclipse, which will speed across the US from Texas to Maine on April 8, has viewed eclipses from places as different as the deck of a US aircraft carrier (USS Yorktown) and the northern Chilean coast. For Korreck, the experience is incomparable. “Very strong emotions come up,” she says, “from almost fear that the sun has gone away to something very magical and very exciting.” As soon as it’s over—totality only lasts several minutes or less, location dependent—she admits that her immediate thought is, “When’s the next one? Where are we going to go?”

a man with a pipe and bowtie sits on a ladder looking through a large telescope
Dr. David Todd at the Georgetown Observatory on Aug. 21, 1924. Image: Library of Congress

In 1919, jetting across the world was not yet possible, and less of the planet was developed and accessible. Eclipse chasers were mostly well-funded scientists and astronomers who had the wherewithal to mount an expedition, set aside months for travel, and haul tons of equipment into remote regions. That’s why one astronomer’s plan in 1919 to mount a telescope on a seaplane and fly above the clouds seemed worth reporting, even though Popular Science’s editors were skeptical that it would work. The alternative, “unmanned balloons” fitted with cameras, proposed by George Hale, founder of the Mount Wilson Observatory in California, seemed much more practical. 

Whether the daring aeronautical astronomer, David Todd, an eccentric eclipse chaser and erstwhile professor at Amherst College, ever succeeded with his seaplane plan isn’t recorded. But the 1919 eclipse went down in the history books for its role in providing the backdrop for Arthur Eddington and Frank Dyson to confirm a key prediction of Einstein’s general theory of relativity.

Today, NASA operates dozens of heliophysics missions, most from space-based observatories, free from the chance of cloudy skies.


A total eclipse of the sun can never last more than eight minutes. Usually it lasts much less. An astronomer will travel thousands and thousands of miles to an out-of-the-way place, in order to make the most of a few precious minutes. The actors in a play are no more carefully rehearsed than are astronomers stationed at the various instruments. No one member of an eclipse expedition sees the eclipse as a whole; each one performs the special duties assigned to him. 

What if cloud or fog should steal between the earth and the sun? What if it should rain? All these elaborate preparations, all this tedious traveling, go for nothing. But fogs are always low-lying—never more than a thousand feet thick. Therefore, if cloud or fog creep in between the earth and the sun, the solution is to climb above them and see the eclipse in all its uncanniness. 

No wonder, then, that astronomers are interested in the experiment undertaken by Professor David Todd, of the Amherst College Astronomical Observatory, of using a seaplane in which to rise high above the clouds to view the eclipse.

Professor Todd’s Experiment

With the assistance of United States Naval officers and a seaplane, Professor Todd set out to take photographs of the sun’s eclipse which occurred on May 29. It was planned that the steamship on which the expedition sailed would stop at a point near the equator off the South American coast, launch the seaplane, and then stand by while the astronomer tried out his plan.


It might have been expected that Professor Todd would be the first to carry astronomy into the air. He is the most enthusiastic, indefatigable, and ingenious of eclipse observers. He even went so far, some years ago, as to devise a method of operating a whole battery of astronomical instruments from a central point, but was unable to employ his invention for the observation of this particular eclipse because the sky was at the time obscured.

Although at the time of going to press the results of Professor Todd’s experiment have not been reported, it may be doubted that the plan of using a seaplane is practicable. Such is the vibration caused by a seaplane’s engine that the steady platform that must be provided for all telescopes becomes a shaking base hardly suitable for Professor Todd’s purpose. To be sure, it was his intention to offset the vibration by an elastic mounting of the telescope; but anyone who knows anything at all about the inertia of movable parts will admit that absolute steadiness can hardly be thus obtained.

A More Practical Scheme

Professor George E. Hale, of Mount Wilson Observatory, has a far more practical scheme, to our mind. His plan is to send an unmanned balloon above the clouds, and to steady the cameras, which the balloon will carry, by means of a gyroscope. Professor Hale plans to study the corona—that ghostly appendage which surrounds the sun, and which is visible from the earth only during an eclipse—at any time.

As we ascend in the atmosphere of the earth we finally reach a point, perhaps at an altitude of thirty miles or more, where the sky is not blue, but jet-black.

The sky is blue because the air is filled with countless billions of dust particles that diffuse the light of the sun. In the inky canopy of the sky above the region of dust particles, where the air is extremely thin, the stars appear in their proper places even in broad daylight. And the sun is a great blazing ball hung in the blackness. Its wonderful corona, the chief object of study during a total eclipse, gleams in all its pearly beauty.

Should Professor Todd Succeed

If Professor Hale succeeds in realizing his plan, we need not wait for a total eclipse in order to study the corona but we can photograph it whenever we please and study it day by day.

The post In 1919, one eclipse chaser wanted to mount a telescope on a seaplane appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
How to make a pinhole camera to watch the solar eclipse https://www.popsci.com/diy/how-to-make-a-pinhole-camera/ Fri, 06 Oct 2023 16:19:21 +0000 https://www.popsci.com/?p=577644
A cardboard pinhole camera to watch an eclipse
Listen, we know this is not the most sophisticated-looking artifact, but it does a great job at protecting your eyes when you want to look at the sun. Sandra Gutierrez

This DIY projector might be the easiest you ever build.

The post How to make a pinhole camera to watch the solar eclipse appeared first on Popular Science.

]]>
A cardboard pinhole camera to watch an eclipse
Listen, we know this is not the most sophisticated-looking artifact, but it does a great job at protecting your eyes when you want to look at the sun. Sandra Gutierrez

We may earn revenue from the products available on this page and participate in affiliate programs. Learn more ›

It’s a well-known fact that staring at the sun is… not the best idea. In the same way that the sun can burn your skin, our home star can overwhelm your peepers with UV rays and literally scorch your retina.

That is a huge bummer, especially because watching a solar eclipse (when the moon covers the sun) is an incredibly cool experience. Thankfully, there are several ways to watch an eclipse without risking your vision, and one of them is building a pinhole camera out of a box, a piece of aluminum foil, and lots of tape. This is an easy and incredibly versatile project, and you can turn it into a permanent camera obscura when you’re done watching the eclipse. 

Stats

  • Time: 10 minutes
  • Cost: $1
  • Difficulty: easy 

Materials

  • A cardboard box (a cereal or shoe box works well)
  • Duct tape or electrical tape
  • A sheet of white paper
  • Aluminum foil
  • Glue (optional)

Tools

  • Ruler
  • Pencil
  • Box cutter or scissors
  • Thumbtack (or a toothpick or embroidery needle)

How to make a pinhole camera

1. Light-proof your box. Leaving one side open, use duct tape or electrical tape to seal the box and prevent any light rays from sneaking in. Pay special attention to the corners and wherever two pieces of cardboard meet. The pinhole will only allow a few rays of light into your box, so the projection of the sun will be dim. That means the darker your camera, the easier it will be to see the image.

As we said, this project is versatile. You can use a wide range of box sizes to make your pinhole camera, but cereal and shoe boxes work exceptionally well. We used the 15-by-7 ½-by-5 ½-inch box that carried our neighbor’s latest online shopping spree.

Light-proofed box for pinhole camera.
Covering the openings and corners with duct tape is the easiest way to light-proof your box. But electrical tape will also do. Sandra Gutierrez

Likewise, duct tape and electrical tape are the best choices to light-proof your box, but you can use any tape that will block light—dark washi tape or masking tape will also do the trick. Just keep in mind that you may have to apply multiple layers to achieve total darkness inside your box. 

  • Pro tip: Check your work by holding your box up to a light and looking inside. If you still see some shine coming through, apply another layer of tape. 
Arrows pointing to the openings of a box where the light filters in.
Hold your box against a window or a lamp to see where the light comes through. The corners are often problematic spots you’ll need to cover. Sandra Gutierrez

2. Determine your pinhole’s location and cover the inside of the opposite face with white paper. Measure one of the smallest sides of the box, cut a piece of white paper to the same size, and tape or glue it to the inside of the corresponding face. It doesn’t have to be perfect—as long as most of the side is covered, you’ll be good to go. Just make sure that the paper doesn’t have any wrinkles or folds, as they may distort the image of the sun. 

White sheet of paper glued to the inside of a box.
If you don’t want to mess around with glue, you can always just tape the white paper that will be your screen. Do it carefully to avoid wrinkles and creases. Sandra Gutierrez

3. Measure the openings for the pinhole and the viewer. On the side opposite the one you covered with white paper, use your ruler and a pencil to measure two openings. The pinhole opening will be located in the upper left corner (about half an inch from the edges) and will be 2-by-2 inches (we’ll make it smaller later). 

Ruler measuring a square on a cardboard box.
Measurements don’t have to be exact. As long as the aluminum foil covers the entire opening, you’ll be fine. Sandra Gutierrez

The viewing opening will be located in the upper right corner of the box, half an inch from the top edge and an inch from the right edge of the box. This opening will be smaller—only 1 inch square.

4. Cut the openings. Using a box cutter or scissors, cut out the openings you drew. 

  • Pro tip: If the openings end up being too big, don’t sweat it—you can always adjust their size with tape. 

5. Close and seal the box. Use your newly cut openings to make sure there are no other places where light might be sneaking in. Pay special attention to the corners of the box above and below your openings. Cover all the places where pieces of cardboard meet with tape. 

6. Cover the larger opening with aluminum foil. Cut a smooth 2 ½-by-2 ½-inch piece of aluminum foil. With the dull side facing you, carefully cover the big opening with the metallic sheet and tape it in place. Make sure you secure it tightly so no light can get into the box.  

Aluminum foil covering the corner of a cardboard box.
Having a smooth piece of aluminum foil will prevent sunlight from being redirected. Sandra Gutierrez
  • Pro tip: To smooth out any creases, softly rub the top of any fingernail over the foil in a small, circular motion. 

7.  Use the thumbtack to poke a hole in the foil. Find the rough center of the 2-by-2-inch square under the aluminum sheet and gently push the tack through before pulling it back out—you want a clean, round hole. If you don’t have a thumbtack, you can use the tip of a toothpick or an embroidery needle. Just make sure that whatever you’re using has a point (it’ll make a neater hole) and that it’s approximately 0.2 millimeters wide. 

Fingers holding a needle in front of a pinhole camera.
We used an embroidery needle to poke our pinhole. If you find that what you used is too wide, you can just replace the piece of aluminum foil and start again. Sandra Gutierrez
  • Note: The width of your pinhole will determine how much light gets into the box. Too much light and the image will be blurry. If that’s the case, don’t worry—just replace the foil and try making a smaller pinhole. 

8. Put your pinhole camera to the test. Stand with your back facing the sun and look into the box through the viewport. Use your hands to block out as much light as possible and move around until you find the angle where sunlight enters through the pinhole. When this happens, you should see a small projection of the shape of the sun on the white paper you pasted inside the box. 

[Related: Total eclipses aren’t that rare—and you’ve probably missed a bunch of them]

Keep in mind that the weather is crucial in determining the quality of the image you’ll see inside your pinhole camera, and whether you can see the eclipse at all.

How a pinhole camera works

Images are light. Everything we see we perceive because there’s light bouncing off of it, beaming directly through our pupils and into our eyes. All cameras, including the humble pinhole camera you just made, operate under this basic principle. The better they filter the light, the sharper the resulting image will be. 

The sun, of course, is the ultimate light source. On a sunny day, rays from the star travel to Earth and bounce off of every surface they reach. This is a lot of light coming from all directions, so if we want to see only a small portion of the sun’s rays, we have to focus those rays and filter out the rest. That’s why the pinhole in your camera is so tiny or, in more technical terms, why its aperture is so narrow—it only lets a small amount of light into the box, just enough so you can see only a dim projection of the sun when you point the pinhole directly at it. 
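
If you're curious how small and dim that projection will be, two quick estimates help, sketched below. The sun spans about half a degree of sky, so its image is roughly the box depth times 0.009; and a common rule of thumb for the sharpest pinhole diameter is the square root of 2.44 times the light's wavelength times the box depth. The 15-inch box depth and green-light wavelength below are assumptions for illustration.

```python
import math

# How big is the sun's projection, and what pinhole size is "optimal"?
BOX_DEPTH_M = 0.38                      # assumed ~15-inch box
SUN_ANGULAR_DIAMETER_DEG = 0.53         # the sun's apparent size in the sky
WAVELENGTH_M = 550e-9                   # green light

image_mm = BOX_DEPTH_M * math.tan(math.radians(SUN_ANGULAR_DIAMETER_DEG)) * 1000
optimal_pinhole_mm = math.sqrt(2.44 * WAVELENGTH_M * BOX_DEPTH_M) * 1000

print(f"Sun's image: ~{image_mm:.1f} mm across")               # ~3.5 mm
print(f"Rule-of-thumb pinhole: ~{optimal_pinhole_mm:.2f} mm")  # ~0.7 mm
```

Holes smaller than that rule of thumb still work; they just pass less light, and eventually diffraction starts to soften the image.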

The image of an LED lamp with a filter besides the image inside a pinhole camera.
I built my pinhole camera on a cloudy day, so I tested it with my LED lamp and a decidedly unprofessional filter made from aluminum foil to check the sharpness of the image. Sandra Gutierrez

The dimness of the image is not ideal, but it’s the tradeoff we make for sharpness—too much light results in a blurry, out-of-focus picture. This is important during a solar eclipse, as filtering the light will allow you to see the round shape of the sun become a crescent or a ring as the moon moves in and gradually blocks the sunlight. 

When the eclipse is over, use a skewer to widen your camera’s pinhole. When you look inside, you’ll see not only the sun but also a slightly brighter, inverted image of your surroundings. A bigger pinhole turns your box into a camera obscura, letting more light in and projecting an image of the objects around you.

This story was originally published in 2023 and updated in 2024.

The post How to make a pinhole camera to watch the solar eclipse appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
It’s on! Three finalists will design a lunar rover for Artemis https://www.popsci.com/science/artemis-moon-rover-finalists/ Thu, 04 Apr 2024 15:06:52 +0000 https://www.popsci.com/?p=609478
NASA Lunar Terrain Vehicle concept art
NASA wants the LTV ready for Artemis V astronauts scheduled to land on the moon in 2030. NASA

The Lunar Terrain Vehicle must be seen in action on the moon before NASA names its winner.

The post It’s on! Three finalists will design a lunar rover for Artemis appeared first on Popular Science.

]]>
NASA Lunar Terrain Vehicle concept art
NASA wants the LTV ready for Artemis V astronauts scheduled to land on the moon in 2030. NASA

NASA has announced the three finalists that will pitch their best moon car ideas over the coming year for use on upcoming Artemis lunar missions. During a press conference yesterday afternoon, the agency confirmed Intuitive Machines, Lunar Outpost, and Venturi Astrolab will all spend the next 12 months developing their Lunar Terrain Vehicle (LTV) concepts as part of the “feasibility task order.”

According to Vanessa Wyche, director of NASA’s Johnson Space Center in Houston, the final LTV will “greatly increase our astronauts’ ability to explore and conduct science on the lunar surface while also serving as a science platform between crewed missions.”

Intuitive Machines LTV concept art
Credit: Intuitive Machines

While neither Lunar Outpost nor Venturi Astrolab has been to the moon yet, both are planning uncrewed rover missions within the next couple of years. In February, Intuitive Machines became the first privately funded company to successfully land on the lunar surface with its NASA-backed Odysseus spacecraft. Although “Odie” officially returned the US to the moon after a more than 50-year hiatus, touchdown complications resulted in the craft landing on its side, severely limiting the extent of its mission.

[Related: NASA’s quirky new lunar rover will be the first to cruise the moon’s south pole.]

Astronauts first zipped around on a moon buggy back in 1971 during NASA’s Apollo 15 mission, and last drove one during Apollo 17 in 1972. The new LTV, like its Apollo predecessor, will only accommodate two people in an unpressurized cockpit—i.e. exposed to the harsh moon environment.

Venturi Astrolab LTV concept next to rocket on moon
Credit: Venturi Astrolab

Once deployed, however, the LTV will differ from the Apollo-era Lunar Roving Vehicle in a few key aspects—most notably, it won’t always need someone at the steering wheel. While astronauts will pilot the LTV during their expeditions, the vehicle will be specifically designed for remote control once the Artemis crew is back home on Earth. In its initial May 2023 proposal call, the agency explained the LTV’s capabilities will be “similar to NASA’s Curiosity and Perseverance Mars rovers.” When NASA isn’t using the LTV, the winning company will also be free to contract it out to private ventures.

But while a promising lunar rover design is great to see on paper, companies will need to demonstrate their vehicle’s capabilities before NASA makes its final selection—and not just on some desert driving course here on Earth.

Lunar Outpost LTV concept art
Credit: Lunar Outpost

After reviewing the three proposals, NASA will issue a second task order to at least one of the finalists, requesting to see their prototype in action on the moon. That means the company (or companies) will need to plan and execute an independent lunar mission, deliver a working vehicle to the moon, and “validate its performance and safety.” Only once that little hurdle is cleared does NASA plan to greenlight one of the companies’ rovers.

If everything goes smoothly, NASA’s Artemis V astronauts will use the winning LTV when they arrive near the moon’s south pole in 2030.

The post It’s on! Three finalists will design a lunar rover for Artemis appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
NASA is designing a time zone just for the moon https://www.popsci.com/science/coordinated-lunar-time/ Wed, 03 Apr 2024 14:57:29 +0000 https://www.popsci.com/?p=609290
Buzz Aldrin on the moon next to American flag.
The White House has instructed the agency to begin looking into Coordinated Lunar Time ahead of our return to the moon—something Buzz Aldrin never had. NASA

Timekeeping works differently up there.

The post NASA is designing a time zone just for the moon appeared first on Popular Science.

]]>
Buzz Aldrin on the moon next to American flag.
The White House has instructed the agency to begin looking into Coordinated Lunar Time ahead of our return to the moon—something Buzz Aldrin never had. NASA

What time is it on the moon?

Well, right now, that’s somewhat a matter of interpretation. But humanity is going to need to get a lot more specific if it intends to permanently set up shop there. In preparation for the upcoming Artemis missions, NASA is aligning its clocks. On Tuesday, the White House issued a memo directing the agency to establish a Coordinated Lunar Time (LTC), which will help guide humanity’s potentially permanent presence on the moon. Like the internationally recognized Coordinated Universal Time (UTC), LTC will have no time zones and no daylight saving time.

It’s not quite a time zone like those on Earth, but an entire frame of time reference for the moon. 

As Einstein famously noted, time is very much relative. Most timekeeping on Earth is tied to Coordinated Universal Time (UTC), which relies on an international array of atomic clocks designed to determine the most precise time possible. This works just fine in relation to our planet’s gravitational forces, but thanks to physics, things are observed differently elsewhere in space, including on the moon.

“Due to general and special relativity, the length of a second defined on Earth will appear distorted to an observer under different gravitational conditions, or to an observer moving at a high relative velocity,” Arati Prabhakar, Assistant to the President for Science and Technology and Director at the Office of Science and Technology Policy (OSTP), explained in yesterday’s official memorandum

Because of this, an Earth-based clock seen by a lunar astronaut would appear to lose an average of 58.7 microseconds per Earth day, with additional periodic variations on top of that. This might not seem like much, but it would pose major issues for any future lunar spacecraft and satellites that require extremely precise timekeeping, synchronization, and logistics.
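
The quoted figure is dominated by two relativistic effects: clocks run faster on the moon because they sit in a shallower gravitational potential, and slightly slower because of the moon's orbital motion. A rough estimate using textbook values lands near the official number, which includes additional smaller terms; this is a sketch, not NASA's calculation.

```python
# Rough estimate of how much faster a clock on the lunar surface ticks
# compared with one on Earth's surface.
G = 6.674e-11                      # gravitational constant
C_SQUARED = (2.998e8) ** 2
SECONDS_PER_DAY = 86_400

M_EARTH, R_EARTH = 5.972e24, 6.371e6
M_MOON, R_MOON = 7.342e22, 1.737e6
V_MOON_ORBIT = 1.022e3             # moon's orbital speed around Earth, m/s
V_EARTH_EQUATOR = 465.0            # Earth's equatorial rotation speed, m/s

# Gravitational term: the moon's surface sits at a shallower potential, so its clock runs fast.
gravitational = (G * M_EARTH / R_EARTH - G * M_MOON / R_MOON) / C_SQUARED
# Kinematic term: the moon's orbital motion slows its clock slightly.
kinematic = (V_MOON_ORBIT ** 2 - V_EARTH_EQUATOR ** 2) / (2 * C_SQUARED)

microseconds_per_day = (gravitational - kinematic) * SECONDS_PER_DAY * 1e6
print(f"Lunar clock gains ~{microseconds_per_day:.0f} microseconds per Earth day")   # ~57
```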

[Related: How to photograph the eclipse, according to NASA.]

“A consistent definition of time among operators in space is critical to successful space situational awareness capabilities, navigation, and communications, all of which are foundational to enable interoperability across the U.S. government and with international partners,” Steve Welby, OSTP Deputy Director for National Security, said in Tuesday’s announcement.

NASA’s new task is about more than just literal timing—it’s symbolic, as well. Although the US aims to send the first humans back to the lunar surface since the 1970s, it isn’t alone in that goal. As Reuters noted yesterday, China wants to put astronauts on the moon by 2030, while both Japan and India have successfully landed uncrewed spacecraft there in the past year. In moving forward to establish an international LTC, the US is making its lunar leadership plans known to everyone.

[Related: Why do all these countries want to go to the moon right now?]

But it’s going to take a lot of global discussions—and, yes, time—to solidify all the calculations needed to make LTC happen. In its memo, the White House acknowledged putting Coordinated Lunar Time into practice will need international agreements made with the help of “existing [timekeeping] standards bodies,” such as the United Nations International Telecommunications Union. They’ll also need to discuss matters with the 35 other countries who signed the Artemis Accords, a pact concerning international relations in space and on the moon. Things could also get tricky, given that Russia and China never agreed to those accords.

“Think of the atomic clocks at the US Naval Observatory. They’re the heartbeat of the nation, synchronizing everything,” Kevin Coggins, NASA’s space communications and navigation chief, told Reuters on Tuesday. “You’re going to want a heartbeat on the moon.”

NASA has until the end of 2026 to deliver its standardization plan to the White House. If all goes according to plan, there might be actual heartbeats on the moon by that point—the Artemis III crewed lunar mission is scheduled to launch “no earlier than September 2026.”

The post NASA is designing a time zone just for the moon appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Older galaxies are more chaotic https://www.popsci.com/science/aging-chaotic-galaxies/ Wed, 03 Apr 2024 13:31:42 +0000 https://www.popsci.com/?p=609253
a swirling, purple galaxy in space
NASA's Neil Gehrels Swift Observatory viewed our neighboring spiral galaxy Andromeda, also called M31, in ultraviolet light. NASA/Swift/Stefan Immler (GSFC) and Erin Grand (UMCP)

As they age, galaxies get more random.

The post Older galaxies are more chaotic appeared first on Popular Science.

]]>
a swirling, purple galaxy in space
NASA's Neil Gehrels Swift Observatory viewed our neighboring spiral galaxy Andromeda, also called M31, in ultraviolet light. NASA/Swift/Stefan Immler (GSFC) and Erin Grand (UMCP)

Galaxies come in a variety of shapes and sizes. Some have buff spiral arms. Others are necklace-shaped or oblong. They begin their lives rotating in an orderly fashion, but the movement of their stars eventually gets more random and less organized. Astronomers have not been able to pinpoint the reasons behind the changes, but new research poses a somewhat simple explanation: aging. As galaxies age, they tend to become more chaotic. The findings are described in a study published April 3 in Monthly Notices of the Royal Astronomical Society (MNRAS).

[Related: Listen to three breathtaking NASA images.]

“When we did the analysis, we found that age, consistently, whichever way we slice or dice it, is always the most important parameter,” study co-author and University of Sydney observational astrophysicist Scott Croom said in a statement. “Once you account for age, there is essentially no environmental trend, and it’s similar for mass. If you find a young galaxy it will be rotating, whatever environment it is in, and if you find an old galaxy, it will have more random orbits, whether it’s in a dense environment or a void.”

Solar System photo
A comparison of a young (top) and old (bottom) galaxy observed as part of the SAMI Galaxy Survey. Panels on the left are regular optical images from the Subaru Telescope. In the middle are rotational velocity maps (blue coming towards us, red going away from us) from SAMI. On the right are maps measuring random velocities (redder colors for greater random velocity). Both galaxies have the same total mass. The top galaxy has an average age of 2 billion years, high rotation and low random motion. The bottom galaxy has an average age of 12.5 billion years, slower rotation and much larger random motion. CREDIT: Image from the Hyper Suprime-Cam Subaru Strategic Program

When galaxies are young, they are star-forming machines. Older ones typically stop forming new stars. Earlier studies suggested that a galaxy’s environment or mass were the most important factors influencing how galaxies behave and move. According to the team, these ideas are not necessarily incorrect.

“We do know that age is affected by [the] environment. If a galaxy falls into a dense environment, it will tend to shut down the star formation. So galaxies in denser environments are, on average, older,” study co-author and University of Sydney astronomer Jesse van de Sande said in a statement. “The point of our analysis is that it’s not living in dense environments that reduces their spin, it’s the fact that they’re older.” For example, our own 13.6-billion-year-old Milky Way galaxy still has a thin star-forming disk and is considered a high-spin rotational galaxy. Older galaxies also move around more randomly than younger ones, no matter how densely packed their environments are.

In the new study, an international team of scientists used data from observations from the SAMI Galaxy Survey. SAMI has surveyed 3,000 galaxies across a wide range of cosmic environments, which helped the team compare and contrast different types of galaxies. Having more accurate observations of galactic behavior helped them fine-tune their models of how the universe developed. 

[Related: JWST images show off the swirling arms of 19 spiral galaxies.]

In future studies, the team hopes to create galaxy evolution simulations in better detail using the University of Sydney’s Hector Galaxy Survey.

“Hector is observing 15,000 galaxies, but with higher spectral resolution, allowing the age and spin of galaxies to be measured even in much lower mass galaxies and with more detailed environmental information,” study co-author and Hector Galaxy Survey lead Julia Bryant said in a statement.

This work ultimately aims to give scientists a better understanding of how the universe has evolved over billions of years and how our solar system came to be.

The post Older galaxies are more chaotic appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
A 3,200-megapixel digital camera is ready for its cosmic photoshoot https://www.popsci.com/science/largest-digital-camera/ Wed, 03 Apr 2024 13:00:00 +0000 https://www.popsci.com/?p=609139
LSST Camera Deputy Project Manager Travis Lange shines a flashlight into the LSST Camera.
The LSST Camera took two decades to build, and will embark on a 10-year-long cosmic imaging project. Credit: Jacqueline Ramseyer Orrell/SLAC National Accelerator Laboratory

The Legacy Survey of Space and Time (LSST) Camera is the size of a small car—and the biggest digital camera ever built for astronomy.

The post A 3,200-megapixel digital camera is ready for its cosmic photoshoot appeared first on Popular Science.

]]>
LSST Camera Deputy Project Manager Travis Lange shines a flashlight into the LSST Camera.
The LSST Camera took two decades to build, and will embark on a 10-year-long cosmic imaging project. Credit: Jacqueline Ramseyer Orrell/SLAC National Accelerator Laboratory

The world’s largest digital camera is officially ready to begin filming “the greatest movie of all time,” according to its makers. This morning, engineers and scientists at the Department of Energy’s SLAC National Accelerator Laboratory announced the completion of the Legacy Survey of Space and Time (LSST) Camera, a roughly 6,610-pound, car-sized tool designed to capture new information about the nature of dark matter and dark energy.

Following a two-decade construction process, the 3,200-megapixel LSST Camera will now travel to the Vera C. Rubin Observatory, located 8,900 feet up Chile’s Cerro Pachón. Once attached to the facility’s Simonyi Survey Telescope later this year, its dual five-foot and three-foot-wide lenses will aim skyward for a 10-year-long survey of the solar system, the Milky Way galaxy, and beyond.

Just how much detail can you get from a focal plane flat to within a tenth the width of a human hair and packed with 10-micron-wide pixels? Aaron Roodman, SLAC professor and Rubin Observatory Deputy Director and Camera Program Lead, likens its ability to capturing the details of a golf ball from 15 miles away “while covering a swath of the sky seven times wider than the full moon.” The resultant images will include billions of stars and galaxies, and with them, new insights into the universe’s structure.
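
Translated into angles (a rough sketch using assumed values, not official specifications): a golf ball 15 miles away spans about a third of an arcsecond, and a field of view seven full moons wide, sampled at roughly 0.2 arcseconds per pixel, needs a few billion pixels, the same ballpark as the camera's 3.2 gigapixels.

```python
# How small is a golf ball at 15 miles, and how many pixels does the field of view need?
GOLF_BALL_M = 0.043
FIFTEEN_MILES_M = 15 * 1609.34
ARCSEC_PER_RADIAN = 206_265

golf_ball_arcsec = GOLF_BALL_M / FIFTEEN_MILES_M * ARCSEC_PER_RADIAN
print(f"Golf ball at 15 miles: ~{golf_ball_arcsec:.2f} arcseconds")      # ~0.37

FIELD_OF_VIEW_DEG = 7 * 0.5            # seven times the full moon's ~0.5 degrees
PIXEL_SCALE_ARCSEC = 0.2               # assumed plate scale
pixels_across = FIELD_OF_VIEW_DEG * 3600 / PIXEL_SCALE_ARCSEC
print(f"Pixels needed: ~{pixels_across ** 2 / 1e9:.1f} billion")         # ~4 billion (square field)
```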

[Related: JWST takes a jab at the mystery of the universe’s expansion rate.]

Among its many duties, the LSST Camera will search for evidence of weak gravitational lensing, which occurs when a gigantic galaxy’s gravitational mass bends light pathways from the galaxies behind it. Analyzing this data can offer researchers a better look at how mass is distributed throughout the universe, as well as how that distribution changed over time. In turn, this could help provide astronomers new ways to explore how dark energy influences the universe’s expansion.

Illustration breakdown of LSST Camera components
An artist’s rendering of the LSST Camera showing its major components including lenses, sensor array, and utility trunk. Credit: Chris Smith/SLAC National Accelerator Laboratory

To achieve these impressive goals, the LSST Camera needed to be much more than simply a scaled-up version of a point-and-shoot digital camera. While cameras like the one in your smartphone often don’t include physical shutters, SLR cameras usually do. Even so, their shutter speeds aren’t nearly as slow as the LSST Camera’s.

“The [LSST] sensors are read out much more slowly and deliberately… ” Andy Rasmussen, SLAC staff physicist and LSST Camera Integration and Testing Scientist, tells PopSci. “… the shutter is open for 15 seconds (for the exposure) followed by 2 seconds to read (with shutter closed).” This snail’s pace keeps readout noise low—only around 6 or 7 electrons—letting the camera capture much darker skies.

“We need quiet sensors so that we can tell that the dark sky is actually dark and also so that we can measure very dim objects in the sky,” Rasmussen continues. “During this 2 second readout period, we need to block any more light from entering the Camera, so that’s why we have a shutter (one of several mechanisms inside the Camera).”

To further ensure operators can measure dim objects, the team also slows thermal activity near the LSST Camera’s focal plane by lowering surrounding temperatures to as low as -100 C (173 kelvin).

Beyond dark matter and dark energy research, cosmologists intend to use the LSST Camera to conduct a new, detailed census of the solar system. Researchers estimate the new imagery could increase the number of known objects by a factor of 10, providing additional insight into how the solar system formed, as well as keeping track of any errant asteroids that may speed by Earth a little too close for comfort.

“More than ever before, expanding our understanding of fundamental physics requires looking farther out into the universe,” Kathy Turner, the Department of Energy’s Cosmic Frontier Program manager, said in today’s announcement. With LSST Camera’s installation, Turner believes researchers will be on the path to “answer some of the hardest, most important questions in physics today.”

The post A 3,200-megapixel digital camera is ready for its cosmic photoshoot appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
April skygazing: A total solar eclipse, a meteor shower, and the Pink Moon https://www.popsci.com/science/cosmic-calendar-april-2024/ Mon, 01 Apr 2024 13:00:00 +0000 https://www.popsci.com/?p=608714
a full moon with some clouds obscuring it
A Full Pink Moon sets over San Francisco, California on April 5, 2023. Tayfun Coskun/Anadolu Agency via Getty Images

The eclipse officially kicks off in North America on April 8 at 11:07 a.m. PDT.

The post April skygazing: A total solar eclipse, a meteor shower, and the Pink Moon appeared first on Popular Science.

]]>
a full moon with some clouds obscuring it
A Full Pink Moon sets over San Francisco, California on April 5, 2023. Tayfun Coskun/Anadolu Agency via Getty Images
April 8 - Total Solar Eclipse
April 21 - Comet 12P/Pons-Brooks Reaches Perihelion
April 21 through 23 - Lyrids Meteor Shower Predicted Peak
April 23 - Full Pink Moon

Millions across Canada, the United States, and Mexico are getting ready for this month’s big total solar eclipse. However, this exciting celestial event is not the only thing to get pumped about this Global Astronomy Month. April will bring in another possible chance to see the “Devil Comet” and a meteor shower. 

[ Related: This is the most cosmically perfect time in history ]

April 8 - Total Solar Eclipse

In North America, the moon will pass between the sun and Earth, completely blocking the face of the sun. According to NASA, the sky will darken as if it were dawn or dusk in the areas where the moon blocks out the sun’s light. Torreón, Mexico will see the longest totality at 4 minutes and 28 seconds, while most places along the path of totality will see it last between 3.5 and 4 minutes.

The first location in continental North America that will experience totality is the Pacific Coast of Mexico, at about 11:07 am PDT. The path of the eclipse will then enter the United States in Texas, and travel through Oklahoma, Arkansas, Missouri, Illinois, Kentucky, Indiana, Ohio, Pennsylvania, New York, Vermont, New Hampshire, and Maine. It will enter Canada via Southern Ontario, and continue through Quebec, New Brunswick, Prince Edward Island, and Nova Scotia. The eclipse will leave continental North America on the Atlantic coast of Newfoundland, Canada, at 5:16 p.m. NDT. 

The path of totality and partial contours crossing the US for the 2024 total solar eclipse occurring on April 8, 2024. CREDIT: NASA.

It is incredibly important not to look directly at the sun without proper eye protection during the eclipse. You can also build your own eclipse projector or pinhole camera to watch this incredible event without frying your eyeballs. Aspiring astrophotographers are encouraged to try to photograph the event, and you can learn how to do so safely with this NASA-approved guide.

[Related: How to make sure your eclipse glasses actually work.]

April 21 - Comet 12P/Pons-Brooks Reaches Perihelion

The "Devil Comet" put on a show in the Northern Hemisphere in March, and could even photobomb this month's eclipse. On April 21, it will reach its closest point to the sun. During this time, it may be visible to the naked eye if the sky is dark enough. As it moves from the constellation Aries to Taurus, it will also become visible from the Southern Hemisphere. For the best spots to try to catch a glimpse of Pons-Brooks, consult StarWalk.

After June, Pons-Brooks will take another 71 years to complete a full circuit around the sun. It won't be visible again until summer 2095, so this will likely be the last time most of us get to see it.

April 21 through 23 - Lyrids Meteor Shower Predicted Peak

The annual Lyrids meteor shower officially begins on April 15 and is predicted to peak beginning in the early evening hours of April 21. Unfortunately, this year's shower will be impacted by a bright waxing gibbous moon, making the night sky a bit brighter. In a dark, moonless sky, 10 to 15 meteors per hour can be expected, so this year's rates may be a bit lower. However, the Lyrids are known for rare surges in activity that can sometimes bring them up to 100 per hour. The meteor shower will be visible from both the Northern and Southern hemispheres, but is much more active in the north.

[Related: The moon is shrinking (very slowly).]

April 23 - Full Pink Moon

The first full moon of spring in the Northern Hemisphere will reach peak illumination at 7:49 pm EDT on April 23. You can use the Farmer’s Almanac to calculate the local moonrise and moonset times near you. For best viewing, watch as the moon rises just above the horizon. 

April's full moon is often called the pink moon in reference to the early springtime blooms of the wildflower Phlox subulata found in eastern North America, not because the moon itself takes on a pink hue. The April full moon is also called the Loon Moon or Maango-giizis in Anishinaabemowin (Ojibwe), the It's Thundering Moon or Wasakayutese in Oneida, and the Planting Moon or O'nót'ah in Seneca.

The same skygazing rules that apply to pretty much all space-watching activities are key during the nighttime events this month: Go to a dark spot away from the lights of a city or town and let your eyes adjust to the darkness for about half an hour.

The post April skygazing: A total solar eclipse, a meteor shower, and the Pink Moon appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
How to photograph solar eclipse: The only guide you need https://www.popsci.com/how-to-photograph-solar-eclipse-only-guide-you-need/ Fri, 18 Aug 2017 23:03:03 +0000 https://www.popsci.com/uncategorized/how-to-photograph-solar-eclipse-only-guide-you-need/
the ring of the sun during a solar eclipse
Save your eyeballs, get the pic. Teguh Prihatna/NurPhoto via Getty Images

There's a lot of information out there and not all of it is good.

The post How to photograph solar eclipse: The only guide you need appeared first on Popular Science.

]]>
the ring of the sun during a solar eclipse
Save your eyeballs, get the pic. Teguh Prihatna/NurPhoto via Getty Images

If you use social media of any kind, you've seen tons of articles about how to take a photo of the April 8 total solar eclipse. Some of them were created by true experts, while others were cobbled together from bits of information found hastily on Google. But photographing a solar eclipse is actually rather complicated, at least if you want to maximize your chances of getting a good image and minimize the possibility of frying your eyes into scorched Ikea meatballs. Here, I'll help cut through some of the noise to what's really important for a good eclipse photograph.

Don’t: Put your eyes in jeopardy

By now you’re probably sick of hearing about how the eclipse can hurt your vision, but damaged sight is serious business. Here’s a solid safety guide to review before you even think about creating a photographic masterpiece. Make sure your eclipse glasses are legit, too. (We have some recommendations on eclipse glasses to try, too.)

Do: Plan to be in a good spot for the eclipse

Planning is important for a shot like this, but luckily it’s easy to scout a location before go-time. Use a resource like these NASA eclipse maps to find out where you can see the most coverage. The super popular spots will be crowded with other photography enthusiasts, so aim for something more unique than that scenic overlook where everyone you know had their engagement photos taken.

Do: Try and work a foreground element into your eclipse shot to give it context

Without any foreground, eclipse pictures just look like partial orange discs. If that’s what you want, have at it; but it likely won’t stand out when you share it. Using a wide-angle lens will help you fit other elements into the shot, to give it scope and context. Try a cityscape or some trees. Anything with a strong shape will read better in the image. A telephoto lens can get you some stunning images, but be careful: That kind of glass magnifies light, and, eclipse or no, you’re still photographing the sun. The viewfinder is not your friend.

If you have access to a really long telephoto lens, you can do some truly cool stuff by positioning a subject between you and the sun. The shot below, for example, took an 800mm telephoto lens (that’s even bigger than most of the big, white lenses you see on the sidelines of an NFL game) to make this scene look as it does. The result is a lot more striking than a circle floating in space. (Again, beware the viewfinder!)

Don’t: Rent expensive gear or buy expensive filters for a single eclipse

Solar filters are specialized pieces of photographic equipment meant to do two things: considerably reduce the total amount of visible light coming in and filter out invisible (at least to the naked eye) IR and UV rays that could damage your camera. For most people, buying one is total overkill. The good ones are expensive and don’t really have much utility outside of an eclipse. The cheap, bad ones will just distort your photos. If you’re dead set on pointing your camera at the sun, you need a filter to protect your gear, but make sure it’s worth the investment for you.

NASA says you can use welding glass to protect your eyes and camera, but it needs to be at least shade 12, which is much darker than the stuff you'll typically find at the hardware store. So, if you have a typical welding helmet lying around, don't use it for viewing unless you check the shade of the glass.

You can also reportedly use a standard neutral density filter, but it needs to be rated to reduce light intake by 16 stops for safety. That means it takes exponentially more light to make the same exposure with the filter as it would without. That’s seriously dark and it still may not be sufficient for blocking the IR and UV rays.
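
For a sense of scale, each photographic stop halves the light reaching the sensor, so the cut grows as a power of two. Here's a minimal sketch of that arithmetic; the 16-stop figure comes from the recommendation above, while the 1/1000-second baseline exposure is purely illustrative.

```python
# Each photographic "stop" halves the light reaching the sensor, so an
# N-stop neutral density filter cuts incoming light by a factor of 2**N.

def nd_attenuation(stops: int) -> int:
    """Light-reduction factor for an N-stop ND filter."""
    return 2 ** stops

def compensated_shutter(base_seconds: float, stops: int) -> float:
    """Shutter time needed to match an unfiltered exposure (illustrative only;
    real solar photography also depends on aperture and ISO)."""
    return base_seconds * nd_attenuation(stops)

print(nd_attenuation(16))                 # 65536 -> a 16-stop filter passes ~1/65,000th of the light
print(compensated_shutter(1 / 1000, 16))  # ~65.5 seconds to match a 1/1000 s unfiltered exposure
```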

If you already bought the filter, then have a blast with it. If not, see if a friend will let you share. You don't actually need a solar filter on your camera (or your eyes) when the sun is fully covered by the moon (called totality), so you can shoot during that time without damage. Just err on the side of caution with your timing to make sure you don't catch some straggling or emerging rays. The rule of thumb is that if you see the little marbles of light starting to form at the edge of the eclipse, you should already have your filter back on.

Don’t: Use an optical viewfinder when you’re using a solar filter

That photographic solar filter on the front of your camera can protect the sensor inside from getting fried and help correct your exposure, but it won’t necessarily protect your eyeballs. That’s not its job nor its promise. Amazon actually ran into a serious problem with this fact regarding some filters from a very reputable company called Lee. If you’re using a DSLR or a rangefinder camera, then stay away from that viewfinder and use your camera’s live view mode.

Don’t: Look through a telephoto lens with your eclipse glasses on

Solar filters are meant to go on the front of lenses you look through. Putting eclipse glasses behind a lens only subjects them, and your eyes, to rays that have already been magnified, and that intensity can do serious damage. This goes for photo lenses, telescopes, and even binoculars, and it's true even when there's no eclipse happening.


Do: Be careful about exposure (and shoot raw if you know what that is)

Exposure can be tricky when shooting the eclipse. The B&H guide has a handy exposure settings chart you can refer to before you go shooting. Chances are, when you point your camera in auto mode toward the eclipse, the resulting photo will be washed out because it’s compensating for the dark scene. You can fix this by using a camera feature called exposure compensation. This is typically achieved by turning a dial on your camera, but you can also do it on your smartphone. Tap and hold over the eclipsed sun to lock the focus and exposure. Then drag your thumb down and watch the image get darker.

If you’re using a camera that has raw mode—some smartphones even have this option now—you will get extra image data you can use when you’re editing your photo.

If you have no idea what "shooting in raw" means, then skip this step because a big important event is the worst time to try something new. Give raw a try down the road, though. Here's a tutorial on how to get started.

Do: Shoot a lot of photos

While I’m not advocating the “spray and pray” method enabled by holding down the camera’s shutter button and letting it hammer out pictures as fast as it can, you should plan to shoot a lot of photos. If you get a sufficient number of good photos, you can make a cool sequence like the one below.

There are certain important moments during the eclipse that you’ll want to capture, especially if you’re making a sequence. Each phase has a technical name, as well as a more colloquial name like the “diamond ring” which happens just before and after totality.

Don't: Miss out on the actual event because you're trying to snap a perfect pic

While getting the photo you want is great, also consider that you’re witnessing a very rare event, and take some time to admire it rather than obsessing over your camera.

Don’t: Be afraid to shoot a picture with your iPhone

According to Apple, the iPhone’s camera doesn’t need a solar filter for shooting a picture of the eclipse because of its wide angle lens and overall construction. We haven’t tested that fact, and Samsung and LG didn’t comment when contacted about the subject, but Apple seems very confident that it’s not a problem. Do consider this a disclaimer, however: It’s still totally possible that the eclipse will damage your iPhone camera, so proceed at your own risk.

One caveat is that your smartphone will probably try to overexpose the scene and wash out the wonderful eclipse part. You can fix this by pressing your thumb over the sun to lock the focus and exposure, then dragging your thumb downward to reduce the overall exposure. Exposure compensation, remember? It’s a pretty handy thing to use in a variety of other situations, too, not just epic events like this one.

Do: Use a tripod

Setting up your shot and following the action will be a lot easier if you have a stable base, especially if you’re planning on using a long telephoto lens. Just be sure to check if you need a permit to put down a tripod where you’re planning to shoot. Often, parks let you operate without a permit if you’re going handheld, but require a permit when you start putting down stands or tripods.

Do: Use that fancy solar filter you bought after the eclipse is over

OK, so you didn’t get the advice about not buying a filter in time and now you’ve got an overly-expensive and extremely specific piece of gear. Luckily, that solar filter can also act as a pretty capable neutral density filter, which photographers use all the time.

By limiting light as it comes into the camera, it allows you to use longer shutter speeds to blur moving objects in the photo or shoot at wider apertures in bright conditions, which is good if you want to get that oh-so-popular shallow depth of field look with a nice blurry background.

This story was originally published in 2017 and updated in 2024.

The post How to photograph solar eclipse: The only guide you need appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Japan’s SLIM moon lander survives a second brutal lunar night https://www.popsci.com/science/slim-reboot-again/ Thu, 28 Mar 2024 14:00:00 +0000 https://www.popsci.com/?p=608358
Image taken of JAXA SLIM lunar lander on moon upside down
SLIM lived through another two weeks of -200 degree temperatures. JAXA/Takara Tomy/Sony Group Corporation/Doshisha University

It's still upside down, but it's showing signs of life.

The post Japan’s SLIM moon lander survives a second brutal lunar night appeared first on Popular Science.

]]>
Image taken of JAXA SLIM lunar lander on moon upside down
SLIM lived through another two weeks of -200 degree temperatures. JAXA/Takara Tomy/Sony Group Corporation/Doshisha University

SLIM, Japan’s first successful lunar lander, isn’t going down without a fight. After making history—albeit upside down—in January, the Smart Lander for Investigating Moon continues to surprise mission control at Japan Aerospace Exploration Agency (JAXA) by surviving not one, but now two brutally frigid lunar nights.

"Last night, we received a response from #SLIM, confirming that the spacecraft made it through the lunar night for the second time!" JAXA posted to X on Wednesday alongside a new image of its likely permanent, inverted vantage point near the Shioli crater. JAXA also noted that, because the sun is currently high above the lunar horizon, SLIM's equipment is extremely hot (around 212 degrees Fahrenheit), so only the navigation camera can be used for the time being.

Based on their newly acquired data, however, it appears that some of the lander's temperature sensors and unused battery cells are beginning to malfunction. Even so, JAXA says "the majority of functions that survived the first lunar night" are still going strong after yet another two-week stretch of darkness that sees temperatures drop to -208 degrees Fahrenheit.

It's been quite the multi-month journey for SLIM. After launching last September, SLIM entered lunar orbit in early October, where it then spent weeks circling the moon. On January 19, JAXA initiated SLIM's landing procedures, with early indications pointing towards a successful touchdown. After reviewing lander data, JAXA confirmed the spacecraft stuck the landing roughly 180 feet from an already extremely narrow 330-foot-wide target site—thus living up to SLIM's "Moon Sniper" nickname.

[Related: SLIM lives! Japan’s upside-down lander is online after a brutal lunar night.]

The historic moment wasn't a flawless mission, however. In the same update, JAXA explained that one of the lander's main engines malfunctioned as it neared the surface, causing SLIM to tumble over, apparently onto its head. As a result, the craft's solar panels can't work at their full potential, limiting battery life and making basic functions much more difficult for the lander.

JAXA still managed to make the most of its situation by using SLIM’s sensors to gather a ton of data on the surrounding lunar environment, as well as deploy a pair of tiny autonomous robots to survey the lunar landscape. On January 31, mission control released what it cautioned could very well be SLIM’s last postcard image from the moon ahead of an upcoming lunar night. The lander wasn’t designed for a lengthy life even in the best of circumstances, but its prospects appeared even dimmer given its accidental positioning.

Roughly two weeks later, however, SLIM proved it could endure in spite of the odds by booting back up and offering JAXA another opportunity to gather additional lunar information. A repeat of JAXA's same warning came a few days later—and yet here things stand, with SLIM still chugging along. From the start, researchers have employed the lander's multiple tools, including a Multi-Band Camera, to analyze the moon's chemical composition, particularly the amounts of olivine, which JAXA says "will help solve the mystery of the origin of the moon."

At this point, it's anyone's guess how much longer the lander has in it. Perhaps it's taking a cue from NASA's only recently retired Mars Ingenuity helicopter, which lasted around three years longer than intended.

The post Japan’s SLIM moon lander survives a second brutal lunar night appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
This is the most cosmically perfect time in history https://www.popsci.com/science/the-most-cosmically-perfect-time-in-history/ Wed, 27 Mar 2024 13:00:00 +0000 https://www.popsci.com/?p=608121
A man uses a tinted glass to watch as the moon passes in front of the Earth's star, marking a total eclipse, the only one this year, in Vigo, northwestern Spain on March 20, 2015.
A man uses a tinted glass to watch as the moon passes in front of the Earth's star, marking a total eclipse, the only one this year, in Vigo, northwestern Spain on March 20, 2015. MIGUEL RIOPA/AFP via Getty Images

Plus other weird things we learned this week.

The post This is the most cosmically perfect time in history appeared first on Popular Science.

]]>
A man uses a tinted glass to watch as the moon passes in front of the Earth's star, marking a total eclipse, the only one this year, in Vigo, northwestern Spain on March 20, 2015.
A man uses a tinted glass to watch as the moon passes in front of the Earth's star, marking a total eclipse, the only one this year, in Vigo, northwestern Spain on March 20, 2015. MIGUEL RIOPA/AFP via Getty Images

What's the weirdest thing you learned this week? Well, whatever it is, we promise you'll have an even weirder answer if you listen to PopSci's hit podcast. The Weirdest Thing I Learned This Week hits Apple, Spotify, YouTube, and everywhere else you listen to podcasts every other Wednesday morning. It's your new favorite source for the strangest science-adjacent facts, figures, and Wikipedia spirals the editors of Popular Science can muster. If you like the stories in this post, we guarantee you'll love the show.

Heads up: Rachel and Jess are planning a livestream Q&A in the near future, as well as other fun bonus content! Follow Rachel on Patreon and Jess on Twitch to stay up to date. 

FACT: This is the most cosmically perfect time in history

By Clara Moskowitz

At least, in terms of observing cosmic phenomena, it is. We’re about to see a total solar eclipse over North America, which is a pretty rare phenomenon. But if we were living at a different point in cosmic history, it would be more than rare—it’d be impossible. 

See, the fact that the moon is the perfect size to cover up the face of the sun in the sky is a total coincidence. It didn’t have to be that way, and in fact, it didn’t used to be that way. 

The moon started off closer to Earth than it is now, so it would have looked bigger in the sky. It would have been so big that it wouldn’t just block the sun, it would also have covered up the solar corona—the glowing atmosphere around the sun that turns a total solar eclipse into a beautiful spectacle. 

And the moon’s getting farther away all the time—by about 1.5 inches each year. This movement is a consequence of how the moon tugs on Earth to create the tides, which in turn drag Earth’s spin down minutely. To conserve angular momentum, the moon speeds up a teensy tiny amount, and thus moves away from us. In another 620 million years, the moon will be far enough away that its face will appear too small to completely block out the sun like it does now. 
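
For a rough sense of scale, here is a back-of-the-envelope estimate of how much farther out the moon drifts over that 620-million-year span, assuming (unrealistically) that today's recession rate of about 1.5 inches per year stays constant; the current mean Earth-moon distance of roughly 384,400 kilometers is a standard reference value, not a figure from the episode.

```python
# Back-of-the-envelope lunar recession estimate, assuming a constant rate.
# Rate (~1.5 in/yr) and time span (620 million years) come from the text above;
# the mean Earth-moon distance (~384,400 km) is a standard reference value.

INCH_TO_KM = 2.54e-5
rate_km_per_year = 1.5 * INCH_TO_KM
years = 620e6
current_distance_km = 384_400

extra_km = rate_km_per_year * years
print(f"Extra distance: {extra_km:,.0f} km")                       # ~23,600 km
print(f"Relative increase: {extra_km / current_distance_km:.1%}")  # ~6% farther out
```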

FACT: Eclipses have been freaking humans out for pretty much forever 

By Rachel Feltman

Our oldest visual representation of a solar eclipse could be a fairly innocuous looking mound of stone in County Meath, Ireland called the Loughcrew. This grassy hump dates back to around 3,300 BC, making it a good 1,000 years older than Stonehenge. It features a number of large stones with intricate carvings of abstract shapes like spirals and diamonds. Most importantly for our purposes, one of the cairns shows a large carving of overlapping concentric circles—a common visual representation of the sun being eclipsed and then revealed by the moon. 

In 2002, archaeoastronomer Paul Griffin compared the age of the site to calculations of when solar eclipses should have been visible in the area, and found a good match for November 30, 3340 BC, just around sunset. He argued that the other symbols on the cairns might represent stars that became visible due to the darkness of the partial eclipse.

Archaeologists had previously noted the presence of charred human remains from around 50 individuals placed in a basin just in front of the carving, which of course evokes some kind of ceremonial sacrifice. 

Now, some scientists vehemently disagree with this interpretation of the Loughcrew cairns, because there's no written record to back it up. But that's kind of the issue with looking 5,000 years into the past: We can say confidently what was going on in the sky, but we have to make a lot of inferences to piece together what people were doing on the ground.

That being said, we can be pretty certain that our ancestors had some wild reactions to—and explanations for—total solar eclipses. You can hear about more of them in this week’s episode. 

FACT: A total solar eclipse is a perfect opportunity for scrutinizing the sun’s deeply weird corona

By Lee Billings 

One of the most striking aspects of a full-blown solar eclipse is the totality, the period in which the moon hangs over the sun to almost perfectly blot out its starlight. You might expect the sky to simply be dark around our briefly shadowed star, but you'd be wrong. Instead the dark sun is wreathed by what looks like a wavering silvery crown—hence the name, "corona," Latin for "wreath" or "crown." This is a complex, dynamic region of hot, rarefied plasma—ionized gas—swirling and billowing in magnetic fields that emanate from deeper within, and being in the moon's star-blocking shadow is by far the best time to see it. The corona envelops our star like a tattered, diaphanous and ever-regenerating shroud, constantly shedding pieces at its edges which flow out along magnetic field lines to make the solar wind, which itself forms a larger bubble around our entire solar system that serves as a semipermeable barrier against the seething background of cosmic radiation. Sometimes the corona unleashes larger clumps of material in what are known as coronal mass ejections, which can strike orbiting planets to raise potent solar storms.

And, for reasons no one fully understands, the corona is quite hot—a few million degrees. Which may not seem so strange until you realize the sun’s apparent “surface,” which lies just beneath, is only some thousands of degrees.

This temperature difference is the so-called "coronal heating problem," and one reason it's so deeply weird is that it requires non-thermal energy transfer. Neither simple radiant heat—infrared light—nor convective heat, like the bubbling churn of hot fluids, can pump enough energy into the corona to explain its high temperature. The situation is a bit like holding a hot incandescent light bulb or a mug of boiling tea and, instead of suffering a burn, having your hand flash-vaporize into a rapidly expanding cloud of plasma; that is, the available thermal energy is insufficient to do the deed. So heliophysicists know that more bizarre processes must be at play, such as heating from some combination of turbulence and the crashing reconnection of immense, writhing loops of the sun's powerful magnetic field. Bizarre processes that, in turn, must somehow contribute to larger-scale corona-connected phenomena such as the solar wind and the giant mass ejections that reach out to shape the entire solar system and beyond. Scientists will be trying to unlock some of these solar secrets during the upcoming eclipse.

The post This is the most cosmically perfect time in history appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
How to photograph the eclipse, according to NASA https://www.popsci.com/science/nasa-eclipse-photo-tips/ Tue, 26 Mar 2024 15:00:00 +0000 https://www.popsci.com/?p=607943
2017 Total Solar Eclipse timelapse
This composite image shows the progression of a partial solar eclipse over Ross Lake, in Northern Cascades National Park, Washington on Monday, Aug. 21, 2017. A total solar eclipse swept across a narrow portion of the contiguous United States from Lincoln Beach, Oregon to Charleston, South Carolina. A partial solar eclipse was visible across the entire North American continent along with parts of South America, Africa, and Europe. NASA/Bill Ingalls

You're gonna need some protection for your smartphone and camera lenses.

The post How to photograph the eclipse, according to NASA appeared first on Popular Science.

]]>
2017 Total Solar Eclipse timelapse
This composite image shows the progression of a partial solar eclipse over Ross Lake, in Northern Cascades National Park, Washington on Monday, Aug. 21, 2017. A total solar eclipse swept across a narrow portion of the contiguous United States from Lincoln Beach, Oregon to Charleston, South Carolina. A partial solar eclipse was visible across the entire North American continent along with parts of South America, Africa, and Europe. NASA/Bill Ingalls

It’s hard to think of anyone as excited about the upcoming North American total solar eclipse as NASA. From citizen research projects to hosted events within the path of totality, the agency is ready to make the most of next month’s cosmic event—and they want to help you enjoy it, too. Earlier this month, NASA offered a series of tips on how to safely and effectively photograph the eclipse come April 8. Certain precautions are a must, but with a little bit of planning, you should be able to capture some great images of the moon’s journey across the sun, as well as its effects on everything beneath it.

First and foremost is protection. Just as you wouldn't stare directly at the eclipse with your own eyes, NASA recommends you place specialized filters in front of your camera or smartphone's lens to avoid damage. The easiest way to do this is simply to use an extra pair of eclipse viewing glasses, but there are also a number of products specifically designed for cameras. It's also important to remember to remove the filter while the moon is completely in front of the sun—that way you'll be able to snap pictures of the impressive coronal effects.

[Related: How to photograph solar eclipse: The only guide you need]


And while you’re welcome to use any super-fancy, standalone camera at your disposal, NASA reminds everyone that it’s not necessary to shell out a bunch of money ahead of time. Given how powerful most smartphone cameras are these days, you should be able to achieve some stunning photographs with what’s already in your pocket. That said, there are still some accessories that could make snapping pictures a bit easier, such as a tripod for stabilization.

Next: practice makes perfect, as they say. Even though you can't simulate the eclipse ahead of time, you can still test DSLR and smartphone camera settings on the sun whenever it's out and shining (with the proper vision protection, of course). For DSLR cameras, NASA recommends using a fixed aperture of f/8 to f/16, alongside shutter speeds somewhere between 1/1000 and one-fourth of a second. These variations can be used during the many stages of the partial eclipse as it heads into totality. Once that happens, the corona's brightness will vary greatly, "so it's best to use a fixed aperture and a range of exposures from approximately 1/1000 to 1 second," according to the agency. Most smartphone cameras offer similar fine-tuning, so experiment with those as needed, too.
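
One common way to cover a range that wide is to bracket in whole-stop steps, doubling the shutter time with each frame. The sketch below uses the conventional whole-stop shutter speeds between NASA's suggested 1/1000-second and 1-second endpoints; the one-stop step size is an assumption, not part of the agency's guidance.

```python
from math import log2

# Conventional whole-stop shutter speeds spanning the 1/1000 s to 1 s range
# suggested above for totality (each step roughly doubles the exposure time).
shutter_speeds = [1/1000, 1/500, 1/250, 1/125, 1/60, 1/30,
                  1/15, 1/8, 1/4, 1/2, 1]

def stops_between(t_short: float, t_long: float) -> float:
    """Exposure difference, in stops, between two shutter times."""
    return log2(t_long / t_short)

print(f"{stops_between(1/1000, 1):.1f} stops of range")   # ~10.0 stops
for t in shutter_speeds:
    print(f"1/{round(1/t)} s" if t < 1 else "1 s")
```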

[Related: NASA needs your smartphone during April’s solar eclipse.]

A few other things to keep in mind: Make sure you turn off the flash, and opt for a wide-angle or portrait framing. For smartphones during totality, be sure to lock the camera’s focus feature, as well as enable the burst mode to capture a bunch of potentially great images. Shooting in the RAW image format is a favorite for astrophotographers, so that’s an option for those who want to go above and beyond during the eclipse. While Google Pixel cameras can enable RAW files by themselves, most other smartphones will require a third-party app download to do so, such as Yamera and Halide.

But regardless of your camera (and/or app) choice, it’s not just the sun and moon you should be striving to capture. NASA makes a great point that eclipses affect everything beneath them, from the ambient light around you, to the “Wow” factor on the faces of nearby friends and family members. Be sure to grab some shots of what’s happening around you in addition to what’s going on above.
For more detailed info on your best eclipse photographic options, head over to NASA.

The post How to photograph the eclipse, according to NASA appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Don’t miss your chance to see the cryovolcanic ‘devil comet’ https://www.popsci.com/science/see-devil-comet/ Tue, 26 Mar 2024 13:00:00 +0000 https://www.popsci.com/?p=607518
the icy white core of a comet surrounded by green dust
Comet 12P/Pons-Brooks growing brighter. The greenish coma of this periodic Halley-type comet has become relatively easy to observe in small telescopes. But the bluish ion tail now streaming from the active comet's coma, buffeted by the solar wind, is faint and difficult to follow. Still, in this image, stacked exposures made on the night of February 11 reveal the fainter tail's detailed structures. Copyright Dan Bartlett

Comet 12P/Pons-Brooks won’t be visible again until 2097.

The post Don’t miss your chance to see the cryovolcanic ‘devil comet’ appeared first on Popular Science.

]]>
the icy white core of a comet surrounded by green dust
Comet 12P/Pons-Brooks growing brighter. The greenish coma of this periodic Halley-type comet has become relatively easy to observe in small telescopes. But the bluish ion tail now streaming from the active comet's coma, buffeted by the solar wind, is faint and difficult to follow. Still, in this image, stacked exposures made on the night of February 11 reveal the fainter tail's detailed structures. Copyright Dan Bartlett

Skygazers have the chance to view more than just the bright planet Mercury or April's total solar eclipse over the next few days. An unusual "devil comet," Comet 12P/Pons-Brooks, will be visible across the night sky over the next several days and may make an appearance during the big eclipse on April 8. Since it only makes one orbit around the sun every 71 years, seeing Pons-Brooks is generally a once-in-a-lifetime opportunity.

What is the ‘devil comet’?

Pons-Brooks is a 10.5-mile-wide ball of ice and rock. It has a stretched out, or highly elliptical, orbit and is currently heading in the direction of our sun. Its core of solid ice, gas, and dust is surrounded by a frozen shell, or nucleus. The nucleus is in turn wrapped in a cloud of icy dust called a coma, which slowly leaks out of the center of the comet.

Comet 12P/Pons-Brooks in the night sky. Green glowing gas swirls around a white center and red glowing gas encircles the green.
Comet 12P/Pons-Brooks’ swirling coma. This image is a composite of three very specific colors, showing the comet’s ever-changing ion tail in light blue, its outer coma in green, and highlights some red-glowing gas around the coma in a spiral. The spiral is thought to be caused by gas being expelled by the slowly rotating nucleus of the giant iceberg comet. CREDIT: Copyright Jan Erik Vallestad

Unlike most other comets, Pons-Brooks is cryovolcanic. It frequently erupts when solar radiation opens up fissures in the nucleus. This causes highly pressurized icy cryomagma to spew into space. When this occurs, the cloud of icy dust that surrounds it expands and appears brighter than usual. 

Pons-Brooks had a major eruption for the first time in 69 years in July 2023, which left it with two distinct trails of gas and ice that resemble a pair of devil horns. It has continued to erupt fairly frequently.

[Related: ‘Oumuamua isn’t an alien probe, but it might be the freakiest comet we’ve ever seen.]

When will it be visible?

Throughout the next few weeks, Pons-Brooks may be visible to the naked eye as it travels through the inner solar system. It will remain so only until around April 2, after which it travels too close to the sun to be seen in a dark night sky. It will be closest to Earth on June 2, when it is headed away from the sun. It does not pose any known threats to Earth and will pass about 139.4 million miles away.

SETI Institute postdoctoral fellow Ariel Graykowski told Gizmodo that it is set to become even more active in the coming weeks and will be visible to the naked eye with a maximum brightness magnitude of around 4.0. The lower the magnitude, the brighter the object appears.

“The limit for naked eye objects in dark, moonless skies is around 6 magnitudes,” Graykowski said, so “it won’t be super obvious in the sky.”
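
In the astronomical magnitude system, smaller numbers mean brighter objects, and a difference of five magnitudes corresponds to a factor of 100 in brightness. A quick sketch comparing the comet's expected peak with that naked-eye limit (the formula is the standard magnitude relation; the comparison itself is just illustrative):

```python
# Standard magnitude relation: a 5-magnitude difference is a factor of 100
# in brightness, so ratio = 100 ** ((mag_b - mag_a) / 5).

def brightness_ratio(mag_a: float, mag_b: float) -> float:
    """How many times brighter object A (smaller magnitude) is than object B."""
    return 100 ** ((mag_b - mag_a) / 5)

comet_peak = 4.0       # expected peak magnitude quoted above
naked_eye_limit = 6.0  # rough limit in dark, moonless skies

print(f"{brightness_ratio(comet_peak, naked_eye_limit):.1f}x brighter than the limit")
# ~6.3x above the threshold -- detectable, but faint, hence "won't be super obvious"
```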

Where should I look?

In the Northern Hemisphere, it is most visible in the early evening towards the west-northwest horizon. Pons-Brooks is near the Pisces constellation and sits low in the northwestern sky. It should appear like a glowing ball of ice, with its forked horns following behind it.

[Related: Halley’s comet is on its way back towards Earth.]

“The comet will brighten a bit as it gets closer to the sun, and it should be visible to the naked eye low in the west about an hour after sunset,” Paul Chodas and Davide Farnocchia from NASA’s Jet Propulsion Laboratory told CNN. “You should go to a location away from city lights and with an unobstructed view of the western horizon. It would be advisable to use a pair of binoculars, since the comet may be hard to locate without them.”

Will it appear during the April 8 solar eclipse?

Maybe. The forecast remains uncertain, but Pons-Brooks could be visible if it flares significantly. It would only be seen by viewers in the path of totality–the area stretching from Texas northeast towards Maine where the moon will fully block the sun's light.

According to EarthSky, “when the sky darkens, you’ll see the brightest planet Venus pop into view on one side of the sun. On the other side of the sun, you’ll find the second-brightest planet, Jupiter. And if Comet Pons-Brooks is bright enough, you’ll see it between Jupiter and the sun, but closer to Jupiter.”

It will not make an appearance again until 2097, so now is your chance to get a look. 

The post Don’t miss your chance to see the cryovolcanic ‘devil comet’ appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Carl Sagan in 1986: ‘Voyager has become a new kind of intelligent being—part robot, part human’ https://www.popsci.com/science/carl-sagan-voyager/ Mon, 25 Mar 2024 13:02:00 +0000 https://www.popsci.com/?p=607384
Space photo
Bettmann/Getty / Popular Science

The renowned scientist reflected on the lesser-known triumphs and lofty ambitions of Voyager in Popular Science's October 1986 issue.

The post Carl Sagan in 1986: ‘Voyager has become a new kind of intelligent being—part robot, part human’ appeared first on Popular Science.

]]>
Space photo
Bettmann/Getty / Popular Science

One of the worries that kept legendary astronomer Carl Sagan up at night was whether aliens would understand us. In the mid-1970s, Sagan led a committee formed by NASA to assemble a collection of images, recorded greetings, and music to represent Earth. The montage was pressed onto golden albums and dispatched across the cosmos on the backs of Voyagers 1 and 2.

In a 1986 story Sagan wrote for Popular Science, he noted that “hypothetical aliens are bound to be very different from us—independently evolved on another world,” which meant they likely wouldn’t be able to decipher the golden discs. But he took assurance from an underappreciated dimension of Voyagers’ message: the designs of the vessels themselves.

“We are tool makers,” Sagan wrote. “This is a fundamental aspect, and perhaps the essence, of being human.” What better way to tell alien civilizations that Earthlings are toolmakers than by sending a living room-sized, aluminum-framed probe clear across the Milky Way. 

Although both spacecraft were only designed to swing by Jupiter and Saturn, Voyager 2’s trajectory also hurled it past Uranus and Neptune. Despite numerous mishaps along the way—and because of the elite toolmaker skills of NASA engineers—the probe was in good enough shape to send back close-ups of those distant worlds. In 2012, Voyager 1 became the first interstellar spacecraft, followed soon thereafter by Voyager 2. “Once out of the solar system,” Sagan wrote, “the surfaces of the spacecraft will remain intact for a billion years or more,” so resilient is their design.

Today, the probes are 12–15 billion miles from Earth, still operable (despite experiencing recent communication difficulties), and sailing through the relative calm of interstellar space. They are expected to continue to transmit data back to Earth for another year or so, or until their plutonium batteries quit. 

It was early 20th century wireless inventor Guglielmo Marconi who suggested that radio signals never die, they only diminish as they travel across space and time. Even after communications from the Voyager spacecraft cease, perhaps the tiny voices of Earth’s first emissaries, animated by NASA’s master toolmakers nearly half a century ago, will continue to drift through the cosmos for all time, accessible to far-flung civilizations equipped with sensitive enough receivers to listen.


“Voyager’s Triumph” (Carl Sagan, October 1986)

A noted scientist tells the little-known story of the remarkable feats of the Voyager engineers, a dedicated band who repeatedly overcame technical adversity to ensure the success of these historic expeditions to the outer solar system.

Carl Sagan is Director, Laboratory for Planetary Studies, Cornell University, and, since 1970, a member of the Voy­ager Imaging Science Team. His Cosmos: A Special Edition is televised this fall. 

On Jan. 25, 1986, the Voyager 2 robot probe entered the Uranus system and reported a procession of wonders. The encounter lasted only a few hours, but the data faithfully relayed back to Earth have revolutionized our knowledge of the aquamarine planet, its more than 15 moons, its pitch black rings, and its belt of trapped high-energy charged particles. Voyager 2 and its companion, Voyager 1, have done this before. At Jupiter, in 1979, they braved a dose of trapped charged particles 1,000 times what it takes to kill a human being [PS, July '79]; and in all that radiation they discovered the rings of the largest planet, the first active volcanoes outside Earth, and a possible underground ocean on an airless world—among a few hundred other major findings. At Saturn, in 1980 and 1981, the two spacecraft survived a pummeling by tiny icy particles as they plummeted through previously unknown rings; and there they discovered not a few, but thousands of Saturnian rings, icy moons recently melted through unknown causes, and a large world with an ocean of liquid hydrocarbons surmounted by clouds of organic matter [PS, March '81]. These spacecraft have returned to Earth four trillion bits of information, the equivalent of about 100,000 encyclopedia volumes.

Because we are stuck on Earth, we are forced to peer at distant worlds through an ocean of distorting air. It is easy to see why our spacecraft have revolutionized the study of the solar system: We ascend to the stark clarity of the vacuum of space, and there approach our objectives, flying past them or orbiting them or landing on their surfaces. These nearby worlds have much to teach us about our own, and they will be—unless we are so foolish as to destroy ourselves—as familiar to our descendants as the neighboring states are to those who live in America today.

Voyager and its brethren are prodigies of human inventiveness. Just before Voyager 2 was to encounter the Uranus system, the mission design had scheduled a final course correction, a short firing of the on-board propulsion system to position Voyager correctly as it flew among the moving moons. But the course correction proved unnecessary. The spacecraft was already within 200 kilometers of its designed trajectory after a voyage along an arcing path five billion kilometers in length. This is roughly the equivalent of throwing a pin through the eye of a needle 50 kilometers away, or firing your target pistol in New York and hitting the bull's eye in Dallas.

The lodes of planetary treasure were transmitted back to Earth by the radio antenna aboard Voyager; but Earth is so far away that by the time the signal was gathered in by radiotelescopes on our planet, the received power was only 10^-16 watts (fifteen zeros after the decimal point). Comparing this weak signal with the power emitted by an ordinary reading lamp is like comparing the width of an atom with the distance between Earth and the moon. (Incidentally, the first photograph ever taken of Earth and the moon together in space was acquired by one of the Voyager spacecraft.)
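
Sagan's comparison checks out with rough numbers. In the sketch below, the 60-watt lamp and the roughly one-angstrom atomic diameter are assumed typical values rather than figures from the article; the 10^-16-watt received power and the Earth-moon comparison come from the text above.

```python
# Rough check of Sagan's analogy: the lamp-to-signal power ratio is about the
# same order of magnitude as the Earth-moon distance divided by an atom's width.
# Assumed values: ~60 W lamp, ~1e-10 m atomic diameter (not from the article).

lamp_watts = 60.0
received_watts = 1e-16       # received Voyager signal power quoted in the article

earth_moon_m = 3.84e8        # mean Earth-moon distance, meters
atom_width_m = 1e-10         # ~1 angstrom

print(f"power ratio:  {lamp_watts / received_watts:.1e}")   # ~6e+17
print(f"length ratio: {earth_moon_m / atom_width_m:.1e}")   # ~4e+18 -- same ballpark
```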

We tend to hear much about the splendors returned, and very little about the ships that brought them, or the shipwrights. It has always been that way. Our history books do not tell us much about the builders of the Nina, Pinta, and Santa Maria, or even the principle of the caravel. Despite ample precedent, it is a clear injustice: The Voyager engineering team and its accomplishments deserve to be much more widely known.


The Voyager spacecraft were designed and assembled, and are operated by the Jet Propulsion Laboratory (JPL) of the National Aeronautics and Space Administration in Pasadena, Calif. The mission was conceived during the late 1960s, first funded in 1972, but was not approved in its present form (which includes encounters at Uranus and Neptune) until after the 1979 Jupiter flyby. The two spacecraft were launched in late summer and early fall 1977 by a non-reusable Titan/Centaur booster configuration at Cape Canaveral, Fla. Weighing about a ton, a Voyager would fill a good-sized living room. Each spacecraft draws about 400 watts of power—considerably less than an average American home—from a generator that converts radioactive plutonium into electricity. The instrument that measures interplanetary magnetic fields is so sensitive that the flow of electricity through the innards of the spacecraft would generate spurious signals. As a result, this instrument is placed at the end of a long boom stretching out from the spacecraft. With other projections, it gives Voyager a slightly porcupine appearance. Two cameras, infrared and ultraviolet spectrometers, and an instrument called the photopolarimeter are on a scan platform; the platform swivels so these instruments can point toward a target world. The spacecraft antenna must know where Earth is if the transmitted data are to be received back home. The spacecraft also needs to know where the sun is and at least one bright star, so it can orient itself in three dimensions and point properly toward any passing world. It does no good to be able to return pictures over billions of miles if you can’t point the camera.

On-orbit repairs

Each spacecraft costs about as much as a single modern strategic bomber. But unlike bombers, Voyager cannot, once launched, be returned to the hangar for repairs.

As a result, the spacecraft’s computers and electronics are designed redundantly. And when Voyager finds itself in trouble, the computers use branched contingency tree logic to work out the appropriate course of action. As the spacecraft journeys increasingly far from Earth, the round-trip light (and radio) travel time also increases, approaching six hours by the time Voyager is at the distance of Uranus.
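
That six-hour figure is easy to reproduce. The sketch below uses a rough Earth-Uranus distance of about 19 astronomical units at the time of the encounter; the distance and the physical constants are standard reference values, not numbers from the article.

```python
# Rough check of the round-trip signal delay at Uranus's distance.
# Distance (~19 AU) and constants are standard reference values, not from the article.

AU_KM = 1.496e8        # kilometers per astronomical unit
C_KM_S = 299_792       # speed of light, km/s
distance_au = 19.2     # approximate Earth-Uranus distance

one_way_s = distance_au * AU_KM / C_KM_S
print(f"Round-trip light time: {2 * one_way_s / 3600:.1f} hours")  # ~5.3 hours
```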

Thus, in case of an emergency, the spacecraft needs to know how to put itself in a safe standby mode while awaiting instructions from Earth. As the spacecraft ages, more and more failures are expected, both in its mechanical parts and its computer system, although there is as yet no sign of a serious memory deterioration, some robot Alzheimer’s disease. When an unexpected failure occurs, special teams of engineers—some of whom have been with the Voyager program since its inception—are assigned to “work” the problem. They will study the underlying basic science and draw upon their previous experience with the failed subsystems. They may do experiments with identical Voyager spacecraft equipment that was never launched or even manufacture a large number of components of the sort that failed in order to gain some statistical understanding of the failure mode.

In April 1978, almost eight months after launch, an omitted ground command caused Voyager 2’s on-board computer to switch from the prime radio receiver to its backup.

During the next ground transmission to the spacecraft, the receiver refused to lock onto the signal from Earth. A component called a tracking loop capacitor had failed. After seven days in which Voyager 2 was out of contact, its fault protection software commanded the backup receiver to be switched off and the prime receiver to be switched back on. But, mysteriously, the prime receiver failed moments later: It never recovered. Voyager 2 was now fundamentally imperiled. Although the primary receiver had failed, the on-board computer commanded the spacecraft to use it. There was no way for the controllers on Earth to command Voyager to revert to the backup receiver. Even worse, the backup receiver would be unable to receive the commands from Earth because of the failed capacitor. Finally, after a week of command silence, the computer was programmed to switch automatically between receivers.

And during that week’s time the JPL engineers designed an innovative command frequency control procedure to make a few essential commands comprehensible to the damaged backup receiver.

This meant the engineers were able to communicate, at least a little bit, with the spacecraft. Unfortunately the backup receiver now turned giddy, becoming extremely sensitive to the stray heat dumped when various components of the spacecraft were powered up or down. Over the following months the JPL engineers designed and conducted a series of tests that let them thoroughly understand the thermal consequences of most operational modes of the spacecraft on its ability to receive commands from Earth. The backup-receiver problem was entirely circumvented. It was this backup receiver that acquired all the commands from Earth on how to gather data in the Jupiter, Saturn, and Uranus systems. The engineers had saved the mission. (But to be on the safe side, during most of Voyager’s subsequent flight there is in residence in the onboard computers a nominal data-taking sequence for the next planet to be encountered.)

Another heart-wrenching failure occurred just after Voyager 2 emerged from behind Saturn after its closest approach to the planet in August 1981. The scan platform had been moving rapidly in the azimuth direction—quickly pointing here and there among the rings, moons, and the planet itself during the time of closest approach. Suddenly, the platform jammed. A stuck scan platform obviously implies a severe reduction in future pictures and other key data. The scan platform is driven by gear trains called actuators, so first the JPL engineers ran an identical copy of the flight actuator in a simulated mission. The ground actuator failed after 348 revolutions: the actuator on the spacecraft had failed after 352 revolutions. The problem turned out to be a lubrication failure. Plainly, it would be impossible to overtake Voyager with an oil can. The engineers wondered whether it would be possible to restart the failed actuator by alternately heating and cooling it, so that the thermal stresses would cause the components of the actuator to expand and contract at different rates and un-jam the system. After gaining experience with specially manufactured actuators on the ground, the engineers jubilantly found that they were able to use this procedure to start the scan platform up again in space. More than this, they devised techniques to diagnose any imminent actuator failure early enough to work around the problem. Voyager 2’s scan platform worked perfectly in the Uranus system. The engineers had saved the day again.

Ingenious solutions

Voyager 1 and 2 were designed to explore the Jupiter and Saturn systems only. It is true that their trajectories would carry them to Uranus and Neptune, but officially these planets were never contemplated as targets for Voyager exploration: The spacecraft was not supposed to last that long. Because of trajectory requirements in the Saturn system, Voyager 1 was flung on a path that will never encounter any other known world; but Voyager 2 flew to Uranus with brilliant success, and is now on its way to an August 1989 encounter with the Neptune system.


At these immense distances, sunlight is getting progressively dimmer, and the spacecraft’s transmitted radio signals to Earth are getting progressively fainter. These were predictable but still very serious problems that the JPL engineers and scientists also had to solve before the encounter with Uranus.

Because of the low light levels at Uranus, the Voyager television cameras were obliged to take longer time exposures. But the spacecraft was hurtling through the Uranus system so fast (about 35,000 miles per hour) that the image would have been smeared or blurred—an experience shared by many amateur photographers. To overcome this, the entire spacecraft had to be moved during the time exposures to compensate for the motion, like panning in the direction opposite yours while taking a photograph of a street scene from a moving car. This may sound easier than it is: You have to compensate for the most casual of motions. At zero gravity, the mere start and stop of the on-board tape recorder that's registering the image can jiggle the spacecraft enough to smear the picture. This problem was solved by commanding the spacecraft thrusters, instruments of exquisite sensitivity, to compensate for the tape-recorder jiggle at the start and stop of each sequence by turning the entire spacecraft just a little. To compensate for the low received radio power at Earth, a new and more efficient digital encoding algorithm was designed for the cameras, and the radiotelescopes on Earth were joined together with others to increase their sensitivity. Overall, the imaging system worked, by many criteria, better at Uranus than it did at Saturn or even at Jupiter.

Voyager has become a new kind of intelligent being—part robot, part human. It extends the human senses to far-off worlds.

The ingenuity of the JPL engineers is growing faster than the spacecraft is deteriorating. And Voyager may not be done exploring after its Neptune encounter.

There is, of course, a chance that some vital subsystem will fail tomorrow, but in terms of the radioactive decay of the plutonium power source, the two Voyager spacecraft will be able to return data to Earth until roughly the year 2015. By then they will have traveled more than a hundred times Earth’s distance from the sun, and may have penetrated the heliopause, the place where the interplanetary magnetic field and charged particles are replaced by their interstellar counterparts; the heliopause is one definition of the frontier of the solar system.

Robot-human partnerships

These engineers are heroes of our time. And yet almost no one knows their names. I have attached a table giving the names of a few of the JPL engineers who played central roles in the success of the Voyager missions.

In a society truly concerned for its future, Don Gray, Charlie Kohlhase, or Howard Marderness would be as well known for their extraordinary abilities and accomplishments as Dwight Gooden, Wayne Gretzky, or Kareem Abdul-Jabbar are for theirs.

Voyager has become a new kind of intelligent being—part robot, part human. It extends the human senses to far-off worlds. For simple tasks and short-term problems, it relies on its own intelligence; but for more complex tasks and longer term problems, it turns to another, considerably larger brain—the collective intelligence and experience of the JPL engineers. This trend is sure to grow. The Voyagers embody the technology of the early 1970s; if such spacecraft were to be designed in the near future, they would incorporate stunning improvements in artificial intelligence, in data-processing speed, in the ability to self-diagnose and repair, and in the capacity for the spacecraft to learn from experience. In the many environments too dangerous for people, the future belongs to robot-human partnerships that will recognize Voyager as antecedent and pioneer.

Unlike what seems to be the norm in the so-called defense industry, the Voyager spacecraft came in at cost, on time, and vastly exceeding both their design specifications and the fondest dreams of their builders. These machines do not seek to control, threaten, wound, or destroy; they represent the exploratory part of our nature, set free to roam the solar system and beyond.

Once out of the solar system, the surfaces of the spacecraft will remain intact for a billion years or more, as the Voyagers circumnavigate the center of the Milky Way galaxy.

This kind of technology, its findings freely revealed to all humans everywhere, is one of the few activities of the United States admired as much by those who find our policies uncongenial as by those who agree with us on every issue. Unfortunately, the tragedy of the space shuttle Challenger implies agonizing delays in the launch of Voyager’s successor missions, such as the Galileo Jupiter orbiter and entry probe. Without real support from Congress and the White House, and a clear long-term NASA goal, NASA scientists and engineers will be forced to find other work, and the historic American triumphs in solar-system exploration—symbolized by Voyager—will become a thing of the past. Missions to the planets are one of those things—and I mean this for the entire human species—that we do best. We are tool makers—this is a fundamental aspect, and perhaps the essence, of being human.

Greeting the aliens

Both Voyager spacecraft are on escape trajectories from the solar system. The gravitational fields of Jupiter, Saturn, and Uranus have flung them at such high velocities that they are destined ultimately to leave the solar system altogether and wander for ages in the calm, cold blackness of interstellar space—where, it turns out, there is essentially no erosion.

Once out of the solar system, the surfaces of the spacecraft will remain intact for a billion years or more, as the Voyagers circumnavigate the center of the Milky Way galaxy. We do not know whether there are other space-faring civilizations in the Milky Way. And if they do exist, we do not know how abundant they are.

But there is at least a chance that some time in the remote future one of the Voyagers will be intercepted by an alien craft. Voyagers 1 and 2 are the fastest spacecraft ever launched by humans; but even so, they are traveling so slowly that it will be tens of thousands of years before they go the distance to the nearest star. And they are not headed toward any of the nearby stars. As a result there could be no danger of Voyager attracting “hostile” aliens to Earth, at least not any time soon.

So, it seemed appropriate to include some message of greeting from Earth. At NASA's request, a committee I chaired designed a phonograph record that was affixed to the outside of each of the Voyager spacecraft. The records contain 116 pictures in digital form, describing our science and technology, our institutions, and ourselves; what will surely be unintelligible greetings in many languages; a sound essay on the evolution of our planet; and an hour and a half of the world's greatest music. But the hypothetical aliens are bound to be very different from us—independently evolved on another world. Are we really sure they could understand our message? Every time I feel these concerns stirring, though, I reassure myself: Whatever the incomprehensibilities of the Voyager record, any extraterrestrial that finds it will have another standard by which to judge us.

Each Voyager is itself a message. In its exploratory intent, in the lofty ambition of its objectives, and in the brilliance of its design and performance, it speaks eloquently for us.

The post Carl Sagan in 1986: ‘Voyager has become a new kind of intelligent being—part robot, part human’ appeared first on Popular Science.

The buried treasure that helped take us to the moon https://www.popsci.com/science/buried-papers-space-race/ Sun, 24 Mar 2024 17:06:00 +0000 https://www.popsci.com/?p=607746
surface of the moon
NASA/Johns Hopkins University Applied Physics Laboratory/Carnegie Institution of Washington

The space race, Cold War, and moon landing all connect back to an obscure silver iron mining operation in Germany.

The post The buried treasure that helped take us to the moon appeared first on Popular Science.


One of the most valuable caches in human history was found buried in an old mine in Lower Saxony. It wasn’t a precious mineral or ancient artifact—no, it was paper. In the latest video from Popular Science, we share the unbelievable and little known story of how 14 tons of buried paper impacted the space race.


Want more Popular Science videos? Check out “The $15,000 A.I. From 1983” and “Why Do We Put Holes In Our Head?” And don’t forget to subscribe on YouTube.

The post The buried treasure that helped take us to the moon appeared first on Popular Science.

NASA needs your smartphone during April’s solar eclipse https://www.popsci.com/science/nasa-smartphone-eclipse-app/ Thu, 21 Mar 2024 14:00:00 +0000 https://www.popsci.com/?p=607305
Timelapse of total solar eclipse showcasing Baily's beads
This image highlights Baily's beads, a feature of total solar eclipses that are visible at the very beginning and the very end of totality. It's composed of a series of images taken during a total solar eclipse visible from ESO's La Silla Observatory on 2 July 2019. Baily's Beads are caused by the Moon's mountains, valleys, and craters. These surface features create an uneven edge of the Moon, where small "beads" of sunlight still shine through the lowest parts for a few moments after the rest of the Sun is covered. P. Horálek/European Southern Observatory

The free SunSketcher app will use your phone’s camera to record the event and help study the sun’s ‘oblateness.’

The post NASA needs your smartphone during April’s solar eclipse appeared first on Popular Science.


Listening for crickets isn’t the only way you can help NASA conduct research during the total solar eclipse passing across much of North America on April 8—you can also lend your smartphone camera to the cause. The agency is calling on anyone within the upcoming eclipse’s path of totality to participate in its SunSketcher program. The program will amass volunteer researcher data to better understand the star’s shape. To participate, all you need is NASA’s free app, which uses a smartphone’s camera coupled with its GPS coordinates to record the eclipse. But why?

The sun looks simply spherical in many photographs and renderings, and in the sky if you happen to briefly glance at it during the day—an emphasis on “briefly,” of course. But thanks to what’s known as oblateness, this isn’t ever really the case. A rotating spheroid becomes oblate as the centrifugal effect of its spin slightly flattens it into a more elliptical shape. Within the solar system, Earth, Jupiter, and Saturn also display oblateness, but the sun has some unique characteristics affecting how it oblates in particular.
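For readers who want a number to hang on the idea: oblateness (astronomers also call it flattening) is simply the fractional difference between a body's equatorial and polar radii. The short Python sketch below plugs in rounded radii from standard references (illustrative values, not data from the SunSketcher project) to show how subtle the sun's distortion is compared with the planets'.

```python
# Flattening (oblateness) f = (equatorial radius - polar radius) / equatorial radius
def flattening(r_eq_km: float, r_pol_km: float) -> float:
    return (r_eq_km - r_pol_km) / r_eq_km

# Approximate equatorial and polar radii in kilometers (rounded reference values)
bodies = {
    "Earth":   (6_378.1, 6_356.8),
    "Jupiter": (71_492.0, 66_854.0),
    "Saturn":  (60_268.0, 54_364.0),
    # The sun's radius varies by only on the order of 10 km out of ~696,000 km,
    # which is why pinning down its oblateness takes an eclipse-scale experiment.
    "Sun":     (696_000.0, 695_990.0),
}

for name, (r_eq, r_pol) in bodies.items():
    print(f"{name:8s} flattening ~ {flattening(r_eq, r_pol):.6f}")
```

Earth comes out near 0.003 and Saturn near 0.1, while the sun sits around a hundred-thousandth, small enough that it takes a trick like the Baily's beads timing described below to measure it at all.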

Total solar eclipse showcasing Baily's beads
Baily’s Beads as seen during the 2017 total eclipse. CREDIT: NASA/Aubrey Gemignani

According to NASA, the sun’s oblateness “depends upon the interior structure of the rotation, which we know from sunspot motions to be latitude-dependent at least.” Astronomers also think gas flows accompanying the sun’s magnetic activity and convection can create “transient distortions at a smaller level.” The upcoming total solar eclipse will provide astronomers an opportunity to better understand all this in the sun, but to make that happen, NASA wants you to harness the moon.

Earth’s natural satellite can serve as a valuable research partner in measuring the sun’s oblateness. This is due to a phenomenon known as “Baily’s beads,” which are the tiny flashes of light during an eclipse that occur as solar light passes over the moon’s rugged terrain of craters, hills, and valleys. Since satellite imagery has helped produce extremely detailed mappings of lunar topography, experts can match Baily’s beads to the moon’s features as it passes in front of the sun.

[Related: New evidence suggests dogs may ‘picture’ objects in their minds, similarly to people.]

These flashes will vary depending on where an observer is located within the path of totality. If you could amass data from a vast number of observer locales, however, you could better understand the sun’s surface variations due to its oblateness. And there are potentially millions of individual locales directly underneath the April 8 eclipse. Enter: SunSketcher.

“With your help, we hope to create a massive hour-long database of observations, more than we could ever make on our own,” NASA says.

All volunteers need to do is angle their phones up to capture the big event and let SunSketcher record the rest. Once all those videos are collected, NASA says the solar disk’s size and shape can be calculated to within a few kilometers, “an accuracy that is far better than currently known.” The reliable, detailed information on solar oblateness captured during SunSketcher can also be used to study how solar gravity affects the motions of inner planets, as well as help test various gravitational theories.

It’s worth noting that serving as an official SunSketcher volunteer will sacrifice the ability to use your smartphone to snap videos or pictures for yourself—but that’s arguably a small price to pay for helping conduct valuable scientific research.

The post NASA needs your smartphone during April’s solar eclipse appeared first on Popular Science.

NASA’s asteroid blaster turned a space rock into an ‘oblong watermelon’ https://www.popsci.com/science/dart-oblong-asteroid/ Wed, 20 Mar 2024 14:00:00 +0000 https://www.popsci.com/?p=607218
A circular asteroid with a smaller rock orbiting it. The asteroid Dimorphos was captured by NASA’s DART mission just two seconds before the spacecraft struck its surface on September 26, 2022.
The asteroid Dimorphos was captured by NASA’s DART mission just two seconds before the spacecraft struck its surface on September 26, 2022. Observations of the asteroid before and after impact suggest it is a loosely packed “rubble pile” object. NASA/Johns Hopkins APL

The successful DART mission likely made Dimorphos' shape more 'askew' and eccentric.

The post NASA’s asteroid blaster turned a space rock into an ‘oblong watermelon’ appeared first on Popular Science.


In a “picture perfect” test, NASA’s Double Asteroid Redirection Test (DART) successfully smashed a car-sized spacecraft into an asteroid in September 2022. The mission showed that a spacecraft could successfully deflect a hazardous space rock if it were ever heading for Earth, even though the odds of a cataclysmic event happening are pretty low. DART changed the asteroid’s orbit, and now scientists have found that the blistering impact also likely changed the asteroid’s shape. The findings are described in a study published March 19 in the Planetary Science Journal.

DART targeted the 560-foot-wide asteroid Dimorphos, which orbits a larger near-Earth asteroid called Didymos. Before the impact, Dimorphos had a generally symmetrical oblate spheroid shape.

“When DART made impact, things got very interesting,” Shantanu Naidu, a study co-author and navigation engineer at NASA’s Jet Propulsion Laboratory (JPL), said in a statement. “Dimorphos’ orbit is no longer circular. The entire shape of the asteroid has changed, from a relatively symmetrical object to a ‘triaxial ellipsoid’-–something more like an oblong watermelon.”

NASA photo
This illustration shows the approximate shape change that the asteroid Dimorphos experienced after DART hit it. Before impact, left, the asteroid was shaped like a squashed ball; after impact it took on a more elongated shape, like a watermelon. CREDIT: NASA/JPL-Caltech

Previously, it took Dimorphos 11 hours and 55 minutes to complete one loop around Didymos and it had a well-defined, circular orbit about 3,900 feet from it. The space rock’s orbital period–the time it takes to complete one orbit–is now shorter by about 33 minutes and 15 seconds. 

To look into the changes after the impact with DART, Naidu and the team on this study used multiple sources of data in their computer models. The first source was the images that DART captured as it approached the asteroid. These images taken aboard the spacecraft gave close-up measurements of the gap between Didymos and Dimorphos and helped the team gauge the dimensions of both asteroids just before impact.  

The second data source was NASA’s Deep Space Network’s Goldstone Solar System Radar, located near Barstow, California. It bounced radio waves off both Didymos and Dimorphos to precisely measure the position of Dimorphos relative to Didymos after impact. These radar observations helped NASA conclude that DART had exceeded the mission’s expectations.

[Related: DART left an asteroid crime scene. This mission is on deck to investigate it.]

The most significant source of data came from ground telescopes all over the world that measured both asteroids’ light curve. This is how the sunlight reflecting off the asteroids’ rocky surfaces changed over time. Comparing the light curves before and after impact helped the team learn how DART changed Dimorphos’ motion. As Dimorphos orbits, it periodically passes in front of Didymos and then behind it. During these mutual events, one of the asteroids in the system can cast a shadow on the other, or block our view from Earth. A temporary dimming in the light curve can be recorded by telescopes in both scenarios. 

The team used the timing of this series of light-curve dips to figure out the shape of the orbit. Their models revealed that Dimorphos’ orbit is now slightly elongated, or eccentric. 
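To make the light-curve timing idea concrete, here is a minimal, self-contained sketch using synthetic data rather than real DART observations. It fakes a brightness record with two mutual-event dips per orbit, then recovers the orbital period from the timing of those dips; the period, dip depth, and noise level are all made-up illustrative values.

```python
import numpy as np

# Synthetic illustration of the light-curve timing technique (toy numbers, not DART data):
# periodic dips in the system's combined brightness reveal the orbital period.
period_hr = 11.37           # assumed "true" orbital period for this toy binary
t = np.arange(0, 240, 0.1)  # ten days of observations, one sample every six minutes
flux = np.ones_like(t)

phase = (t % period_hr) / period_hr
# Two mutual events per orbit: one near phase 0, one near phase 0.5
flux[(phase < 0.02) | (np.abs(phase - 0.5) < 0.02)] -= 0.05
flux += np.random.default_rng(0).normal(0.0, 0.005, t.size)  # measurement noise

# Recover the period from the timing of the dips
dip_times = t[flux < 0.97]                   # samples that fall inside a dip
gaps = np.diff(dip_times)
dip_starts = dip_times[1:][gaps > 1.0]       # first sample of each dip after the first
mean_spacing = np.mean(np.diff(dip_starts))  # ~half the period (two dips per orbit)
print(f"Recovered period: {2 * mean_spacing:.2f} hours")
```

The real analysis folds in years of observations and detailed shape models, but the core trick is the same: the timing of the dips encodes the orbit.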

“Before impact the times of the events occurred regularly, showing a circular orbit. After impact, there were very slight timing differences, showing something was askew,” study co-author and JPL senior research scientist Steve Chesley said in a statement. “We never expected to get this kind of accuracy.”

[Related: Smashed asteroid surrounded by a ‘cloud’ of boulders.]

According to the team, the models are so precise that they can even show that Dimorphos rocks back and forth as it orbits Didymos. 

The models also calculated how the orbital period evolved. Right after impact, DART reduced the average distance between the two asteroids. It shortened Dimorphos’ orbital period by 32 minutes and 42 seconds, down to 11 hours, 22 minutes, and 37 seconds. 

In the weeks following its collision with DART, the asteroid’s orbital period continued to shorten as it shed more rocky material. It settled in at 11 hours, 22 minutes, and 3 seconds per orbit–or 33 minutes and 15 seconds less time than it took before impact. Dimorphos also now has an average orbital distance of about 3,780 feet–or roughly 120 feet closer to Didymos than it was before colliding with DART.
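The period bookkeeping above is easy to sanity-check with ordinary time arithmetic. The snippet below simply replays the figures quoted in this article using Python's timedelta; it is a consistency check on the reporting, not any kind of mission data.

```python
from datetime import timedelta

settled     = timedelta(hours=11, minutes=22, seconds=3)   # final post-impact period
right_after = timedelta(hours=11, minutes=22, seconds=37)  # period right after impact
total_cut   = timedelta(minutes=33, seconds=15)            # total reported shortening

# Working backwards gives the (unrounded) pre-impact period
pre_impact = settled + total_cut
print("Implied pre-impact period:", pre_impact)  # 11:55:18 -- the article rounds to 11 h 55 min

# Extra shortening while Dimorphos kept shedding debris after the initial change
print("Further shortening after impact:", right_after - settled)  # 0:00:34
```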

Another study published in February found that Dimorphos is likely a loosely bound “rubble pile” asteroid–similar in composition to the recently sampled asteroid Bennu–based on how it responded to its collision with DART.

“The results of this study agree with others that are being published,” lead scientist for solar system small bodies at NASA Headquarters Tom Statler, said in a statement. “Seeing separate groups analyze the data and independently come to the same conclusions is a hallmark of a solid scientific result. DART is not only showing us the pathway to asteroid-deflection technology, it’s revealing [a] new fundamental understanding of what asteroids are and how they behave.” Statler was not an author on this study. 

To get a closer look at Didymos and Dimorphos, the European Space Agency’s Hera mission is scheduled to launch in October 2024. It will be taking a detailed survey of the asteroid pair and could officially confirm just how much DART reshaped Dimorphos.

The post NASA’s asteroid blaster turned a space rock into an ‘oblong watermelon’ appeared first on Popular Science.

For nearly $500,000, you too can have dinner in the ‘SpaceBalloon’ above Earth https://www.popsci.com/science/spacevip-balloon-trip/ Mon, 18 Mar 2024 15:30:00 +0000 https://www.popsci.com/?p=606914
SpaceVIP SpaceBalloon above Earth concept art
The high-priced trips are scheduled to start in late 2025. SpaceVIP

'Space is for everybody.'

The post For nearly $500,000, you too can have dinner in the ‘SpaceBalloon’ above Earth appeared first on Popular Science.


A luxury space tourism company called SpaceVIP is currently taking reservations for its Stratospheric Dining Experience. For $495,000, six participants will enjoy a Michelin Star restaurant-catered jaunt into the stratosphere, sans rockets or zero gravity. Scheduled to launch as early as 2025 from Florida’s Space Coast, the travelers will “gently lift” into the sky aboard the pressurized cabin of Spaceship Neptune, a supposedly carbon neutral “SpaceBalloon” designed by another elite getaway startup called Space Perspective. Over the course of six hours, travelers will be wined and dined by Rasmus Munk, Head Chef at Alchemist, a 2 Michelin Star “Holistic Cuisine” restaurant.

What is “Holistic Cuisine”? According to a joint announcement, it’s apparently a meal that doubles as “an intentional story… that will inspire thought and discussion on the role of humanity in protecting our planet” while “challenging the diner to reexamine our relationship with Earth and those who inhabit it.” The diners can ponder this while watching the sunrise over Earth’s curvature from approximately 100,000 feet above sea level.

“Embarking on this unprecedented culinary odyssey to the cosmos marks a pivotal moment in human history,” Roman Chiporukha, founder of SpaceVIP, said in a statement. “This inaugural voyage is but the first chapter in SpaceVIP’s mission to harness the transformative power of space travel to elevate human consciousness and shape the course of our collective evolution.”

Concept art of SpaceBalloon cabin interior
Concept art depicting the SpaceBalloon’s interior. Credit: SpaceVIP

Space Perspective representatives also said they believe such a trip will spur what’s known as the “Overview Effect” within their “Explorers,” referring to the feeling of awe many astronauts have described upon seeing the Earth from the heavens. If it doesn’t, at least the proceeds from their tickets will reportedly go to the Space Prize Foundation, a nonprofit dedicated to advancing women within the space industry.

Those astronauts, however, felt their Overview Effect after years of physical, mental, and technological training. With a pressurized cabin, stable gravity, and a Space Spa (the name for the bathroom), Stratospheric Dining Experience attendees can simply bypass all of that by ponying up 12 times the annual salary of a first-year public school teacher in the US. For a broader comparison, one ticket costs about 2,640 percent more than the global average yearly wage.

Test flights will commence later this year ahead of the 2025 launch window, when SpaceVIP’s Explorers “will be making history by enjoying the meal of a lifetime above 99-percent of Earth’s atmosphere.”

Despite being replete with mentions of “space” throughout the press materials, the meal won’t technically be in outer space. At its apex, SpaceVIP and Space Perspective’s “SpaceBalloon” will be about 43 miles below the Kármán line. For an actual, albeit brief, trip to space, Blue Origin spots are reportedly going for about $250,000 a seat.
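The altitude gap is straightforward to check with unit conversions alone. A quick sketch, using standard conversion factors rather than any figures supplied by SpaceVIP or Space Perspective:

```python
FEET_PER_MILE = 5280
KM_PER_MILE = 1.609344

balloon_apex_miles = 100_000 / FEET_PER_MILE  # ~18.9 miles up
karman_line_miles = 100 / KM_PER_MILE         # the 100 km Karman line, ~62.1 miles

print(f"Balloon apex: {balloon_apex_miles:.1f} mi")
print(f"Karman line:  {karman_line_miles:.1f} mi")
print(f"Shortfall:    {karman_line_miles - balloon_apex_miles:.1f} mi")  # ~43 mi
```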

The post For nearly $500,000, you too can have dinner in the ‘SpaceBalloon’ above Earth appeared first on Popular Science.

Why do solar eclipses happen? https://www.popsci.com/science/why-do-solar-eclipses-happen/ Sat, 16 Mar 2024 16:00:00 +0000 https://www.popsci.com/?p=606787
solar eclipse
Solar eclipses happen because of a few factors, including the Moon’s size and distance from the Sun. Xinhua/Xinhua via Getty Images

Solar eclipses result from a fantastic celestial coincidence of scale and distance.

The post Why do solar eclipses happen? appeared first on Popular Science.


This article was originally featured on The Conversation.

On April 8, 2024, millions across the U.S. will have the once-in-a-lifetime chance to view a total solar eclipse. Cities including Austin, Texas; Buffalo, New York; and Cleveland, Ohio, will have a direct view of this rare cosmic event that lasts for just a few hours.

While you can see many astronomical events, such as comets and meteor showers, from anywhere on Earth, eclipses are different. You need to travel to what’s called the path of totality to experience the full eclipse. Only certain places get an eclipse’s full show, and that’s because of scale.

The relatively small size of the Moon and its shadow make eclipses truly once-in-a-lifetime opportunities. On average, total solar eclipses are visible somewhere on Earth once every few years. But from any one location on Earth, it is roughly 375 years between total solar eclipses.
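That several-century figure falls out of a simple back-of-the-envelope estimate: treat each total eclipse as painting a narrow strip across the globe and ask how long, on average, before a strip crosses any particular spot. The path dimensions and eclipse frequency below are rough assumed values, not numbers from this article, but they land in the same ballpark.

```python
# Back-of-the-envelope: how often does totality revisit a given spot?
# Assumed rough numbers (not from the article):
path_width_km = 150          # typical width of a totality path
path_length_km = 12_000      # typical length of the track across Earth
earth_surface_km2 = 510e6    # Earth's total surface area
eclipses_per_year = 2 / 3    # roughly two total solar eclipses every three years

fraction_covered = (path_width_km * path_length_km) / earth_surface_km2
mean_wait_years = 1 / (eclipses_per_year * fraction_covered)
print(f"Fraction of Earth covered per eclipse: {fraction_covered:.4f}")
print(f"Rough mean wait at one location: {mean_wait_years:.0f} years")  # a few hundred years
```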

I’m an astronomer, but I have never seen a total solar eclipse, so I plan to drive to Erie, Pennsylvania, in the path of totality, for this one. This is one of the few chances I have to see a total eclipse without making a much more expensive trip to someplace more remote. Many people have asked me why nearby eclipses are so rare, and the answer is related to the size of the Moon and its distance from the Sun.


Size and scale

You can observe a solar eclipse when the Moon passes in front of the Sun, blocking some or all of the Sun from view. For people on Earth to be able to see an eclipse, the Moon, while orbiting around the Earth, must lie exactly along the observer’s line of sight with the Sun. Only some observers will see an eclipse, though, because not everyone’s view of the Sun will be blocked by the Moon on the day of an eclipse.

The fact that solar eclipses happen at all is a bit of a numerical coincidence. It just so happens that the Sun is approximately 400 times larger than the Moon and also 400 times more distant from the Earth.

So, even though the Moon is much smaller than the Sun, it is just close enough to Earth to appear the same size as the Sun when seen from Earth.

For example, your pinky finger is much, much smaller than the Sun, but if you hold it up at arm’s length, it appears to your eye to be large enough to block out the Sun. The Moon can do the same thing – it can block out the Sun if it’s lined up perfectly with the Sun from your point of view.
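You can check the coincidence yourself with the small-angle rule that angular size is roughly diameter divided by distance. The sketch below uses rounded textbook values for the diameters and average distances (illustrative numbers, not measurements from this article):

```python
import math

def angular_size_deg(diameter_km: float, distance_km: float) -> float:
    # Small-angle approximation: angle (radians) ~= diameter / distance
    return math.degrees(diameter_km / distance_km)

sun_deg = angular_size_deg(1_391_400, 149_600_000)  # Sun: diameter, Earth-Sun distance
moon_deg = angular_size_deg(3_474, 384_400)         # Moon: diameter, Earth-Moon distance

print(f"Sun:  {sun_deg:.2f} degrees")   # ~0.53
print(f"Moon: {moon_deg:.2f} degrees")  # ~0.52
```

Both come out close to half a degree of sky, which is exactly why the Moon can just barely cover the solar disk.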

Path of totality

When the Earth, Moon and Sun line up perfectly, the Moon casts a shadow onto the Earth. Since the Moon is round, its shadow is round as it lands on Earth. The only people who see the eclipse are those in the area on Earth where the shadow lands at a given moment.

The Moon is continuously orbiting around the Earth, so as time goes on during the eclipse, the Moon’s shadow moves over the face of the Earth. Its shadow ends up looking like a thick line that can cover hundreds of miles in length. Astronomers call that line the path of totality.

From any given location along the path of totality, an observer can see the Sun completely eclipsed for a few minutes. Then, the shadow moves away from that location and the Sun slowly becomes more and more visible.

A tilted orbit

Solar eclipses don’t happen every single time the Moon passes in between Earth and the Sun. If that were the case, there would be a solar eclipse every month.

If you could float above the Earth’s North Pole and see the Moon’s orbit from above, you would see the Moon line up with the Sun once every time it orbits around the Earth, which is approximately once per month. From this high point of view, it looks like the Moon’s shadow should land on Earth every orbit.

However, if you could shift your perspective to look at the Moon’s orbit from the orbital plane, you would see that the Moon’s orbit is tilted by about 5 degrees compared with Earth’s orbit around the Sun. This tilt means that sometimes the Moon is too high and its shadow passes above the Earth, and sometimes the Moon is too low and its shadow passes below the Earth. An eclipse happens only when the Moon is positioned just right and its shadow lands on the Earth.
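A one-line calculation shows why that 5-degree tilt matters so much. Over the Earth-Moon distance, even that small angle can swing the Moon, and its shadow, tens of thousands of kilometers above or below the line to the Sun; the distances below are standard rounded values used only to illustrate the geometry.

```python
import math

earth_moon_km = 384_400    # average Earth-Moon distance
earth_radius_km = 6_371
tilt_deg = 5.0             # inclination of the Moon's orbit to Earth's orbital plane

# Roughly how far above or below the ecliptic the Moon can sit at new moon
max_offset_km = earth_moon_km * math.sin(math.radians(tilt_deg))

print(f"Maximum offset from the ecliptic: ~{max_offset_km:,.0f} km")  # ~33,500 km
print(f"Earth's radius:                   ~{earth_radius_km:,} km")
# The offset is several times Earth's radius, so the shadow usually misses Earth entirely.
```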


As time goes on, the Earth and the Moon continue moving along their orbits, and eventually the Moon aligns with Earth’s orbit around the Sun at the same moment the Moon passes between the Sun and the Earth.

While only certain cities are in the path of totality for this April’s eclipse, the entire U.S. is still close enough to this path that observers outside of the path of totality will see a partial eclipse. In those locations, the Moon will appear to pass in front of part of the Sun, leaving a crescent shape of the Sun still visible at the moment of maximum eclipse.

The post Why do solar eclipses happen? appeared first on Popular Science.

Uranus is a grizzly bear: Understanding planet mass using animals https://www.popsci.com/science/planet-mass-compared-to-animals/ Sat, 16 Mar 2024 14:11:00 +0000 https://www.popsci.com/?p=606693
planet names alongside images of a cat, human, dog, rat, squirrel, giraffes, elephant, bear, and horse
You can understand the real scale of planets by comparing their relative sizes to animals on Earth. Popular Science

Aww, Mercury is a kitten.

The post Uranus is a grizzly bear: Understanding planet mass using animals appeared first on Popular Science.


The vastness of space can make trying to understand the relative mass of celestial bodies brain boggling. Sure, you know the sun dwarfs the planets orbiting it, but how much more mass does the hot ball of hydrogen and helium at the center of our solar system actually have?

If that question had you scratching your head trying to visualize something, take a breath and watch our latest video, Planets as Animals:


By mass, if Venus is a 6-year-old human child, then Earth is a Labrador dog (and a very good boy) and Uranus is a grizzly bear. Using 360-degree 3D animation, we help your brain comprehend the real scale of the planets in our solar system using animals.
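The mapping in the video amounts to shrinking every planetary mass by one common factor. Here is a minimal sketch of that scaling, assuming Venus corresponds to a roughly 20 kg six-year-old; the child's weight, the planet masses, and the resulting animal comparisons are approximations for illustration, not values taken from the video.

```python
# Planetary masses in units of 10^24 kg (approximate)
masses = {"Mercury": 0.33, "Venus": 4.87, "Earth": 5.97,
          "Mars": 0.64, "Uranus": 86.8, "Neptune": 102.0}

child_kg = 20.0                     # assumed weight of a 6-year-old standing in for Venus
scale = child_kg / masses["Venus"]  # kg of "animal" per 10^24 kg of planet

for planet, mass in masses.items():
    print(f"{planet:8s} -> {mass * scale:6.1f} kg")
# Earth lands near 25 kg (Labrador territory); Uranus near 360 kg (a large grizzly);
# Mercury comes out around 1.4 kg, about the weight of a kitten.
```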

Want more Popular Science videos? Check out “The $15,000 A.I. From 1983” and “Why Do We Put Holes In Our Head?” And don’t forget to subscribe on YouTube.

The post Uranus is a grizzly bear: Understanding planet mass using animals appeared first on Popular Science.

Crypto scammers flooded YouTube with sham SpaceX Starship livestreams https://www.popsci.com/technology/crypto-scam-starship-launch-livestream/ Thu, 14 Mar 2024 15:26:22 +0000 https://www.popsci.com/?p=606533
Starship rocket launching during third test
The SpaceX Starship Flight 3 Rocket launches at the Starbase facility on March 14, 2024 in Brownsville, Texas. The operation is SpaceX's third attempt at launching this rocket into space. The Starship Flight 3 rocket becomes the world's largest rocket launched into space and is vital to NASA's plans for landing astronauts on the Moon and Elon Musk's hopes of eventually colonizing Mars. Photo by Brandon Bell/Getty Images

A fake Elon Musk hawked an ‘amazing opportunity’ during this morning’s big launch.

The post Crypto scammers flooded YouTube with sham SpaceX Starship livestreams appeared first on Popular Science.


YouTube is flooded with fake livestream accounts airing looped videos of “Elon Musk” supposedly promoting crypto schemes. This is not the first time it has happened, but the website’s layout, verification badges, and search results page continue to make it difficult to separate legitimate sources from the con artists attempting to leverage today’s Starship test launch—its most successful to date, although ground control eventually lost contact with the rocket yet again.

After entering search queries such as “Starship Launch Livestream,” at least one supposed verified account within the top ten results takes users to a video of Elon Musk standing in front of the over 400-feet-tall rocket’s launchpad in Boca Chica, Texas. Multiple other accounts airing the same clip can be found further within the search results.


“Don’t miss your chance to change your financial life,” a voice similar to Musk’s tells attendees over footage of him attending a previous, actual Starship event. “This initiative symbolizes our commitment to making space exploration accessible to all, while also highlighting the potential of financial innovations represented by cryptocurrencies.”

“…to send either 0.1 Bitcoin or one Ethereum or Dogecoin to the specified address. After completing the transaction within a minute, twice as much Bitcoin or Ethereum will be returned to your address. …It is very important to use reliable and verified sources to scan the QR code and visit the promotion website. This will help avoid possible fraudulent schemes. Please remember administration is not responsible for loss due to not following the rules of our giveaway due to incorrect transactions or the use of unreliable sources. Don’t miss your chance to change your financial life. Connect Cryptocurrency wallet right now and become part of this amazing opportunity. You will receive double the amount reflected in your Bitcoin wallet. This initiative symbolizes our commitment to making space exploration accessible to all while also highlighting the potential of financial innovations are represented by cryptocurrencies. So let us embark on this remarkable journey to financial independence and cosmic discoveries…”

Fake Elon Musk

It’s unclear if the audio is an AI vocal clone or simply a human impersonation, but either way it is oddly stilted and filled with glitches. A QR code displayed at the bottom of the screen (which PopSci cropped out of the video above) takes viewers to a website falsely advertising an “Official event from SpaceX Company” offering an “opportunity to take a share of 2,000 BTC,” among other massive cryptocurrency hauls.

There are currently multiple accounts mirroring the official SpaceX YouTube page airing simultaneous livestreams of the same scam clip. One of those accounts has been active since May 16, 2022, and has over 2.3 million subscribers—roughly one-third that of SpaceX’s actual, verified profile. Unlike the real company’s locale, however, the fake profile is listed as residing in Venezuela.

[Related: Another SpaceX Starship blew up.]

Scammers have long leveraged Musk’s public image for similar con campaigns. The SpaceX, Tesla, and X CEO is a longtime pusher of various cryptocurrency ventures, and is one of the world’s wealthiest men. Likewise, YouTube is a particularly popular venue for crypto grifters. In June 2020, for example, bad actors made off with $150,000 through nearly identical SpaceX YouTube channels. Almost exactly two years later, the BBC noted dozens of fake Musk videos advertising crypto scams, earning a public rebuke from the actual Musk himself. The crypto enthusiast outlet BitOK revealed a nearly identical campaign around the time of the November 2023 Starship event.

Update 3/15/24 12:40pm: A YouTube spokesperson confirmed that the company has “terminated four channels in line with our policies which prohibit cryptocurrency phishing schemes.” According to YouTube, video uploads are monitored by a combination of machine learning and human reviewers.

The post Crypto scammers flooded YouTube with sham SpaceX Starship livestreams appeared first on Popular Science.

Voyager, Chandrayaan, Curiosity: How do spacecraft get their names? https://www.popsci.com/science/voyager-chandrayaan-curiosity-how-do-spacecrafts-get-their-names/ Thu, 14 Mar 2024 12:09:34 +0000 https://www.popsci.com/?p=606504
Artist's concept of Voyager.
Artist's concept of Voyager. NASA

Bits of different countries’ cultures are launched into space with each space exploration project.

The post Voyager, Chandrayaan, Curiosity: How do spacecraft get their names? appeared first on Popular Science.


Somehow, when you talk about space exploration, it seems like all the human-made projects have these incredible, inspiring names: Perseverance, Voyager, Challenger, Curiosity. That, however, isn’t a coincidence. NASA has been thoughtfully discussing and implementing policies for how they bestow official names on their spacecraft since the organization’s very beginnings in the 1960s. 

The milestone of naming a mission or spacecraft is still met with fanfare today, as it marks a sort of officialization of the project—it makes it seem more real and gives the public a way to bring up a cutting-edge science project in conversation. Recently, for example, the China National Space Administration (CNSA) announced names for its upcoming lunar projects, which aim to get humans to the moon by 2030: a crew vessel named Mengzhou (“dream vessel”) and a lander called Lanyue (“embracing the moon”).

So, how do these names come to be in the first place? Who decided to call NASA’s modern return to the moon Artemis, and where did CNSA come up with Lanyue

The answer to that question depends on which space agency’s names you’re asking about, as there is no centralized authority a la airport code naming from the International Air Transport Association. “Names given to spaceflight projects and programs have originated from no single source or method. Some have their foundations in mythology and astrology, some in legend and folklore. Some have historic connotations. Some are based on a straightforward description of their mission,” explains the historical NASA document, “Origins of NASA Names” from 1976

Lanyue, for example, comes from a 1965 Mao Zedong poem and was suggested by members of the Chinese public to capture the community’s spirit of reaching towards the moon. Another recent project—India’s Chandrayaan-2 and -3—also has a pretty literal meaning for a project traveling to the Moon. In Sanskrit, Chandrayaan literally means “mooncraft.” Other parts of the mission source their names from other bits of Indian culture and history; Chandrayaan-3’s lander Vikram is a tribute to one of the pioneers of the Indian Space Research Organization (ISRO), while the Pragyan rover gets its name from a Sanskrit word meaning “wisdom.”

This tradition of naming spacecraft after their targets in a language other than English even runs straight back to the beginning of the space race, with the then-USSR’s naming scheme. Sputnik, the first launched satellite, means “companion” or “satellite” in the astronomical sense. Venera, the Soviet mission to Venus, is actually just how you say Venus in Russian! Their progression of space stations faced a similar nomenclatorial fate: the original and somewhat military-focused Salyut meant salute, and the later Mir and Soyuz—peace and union respectively—tracked the transition to international cooperation in Earth orbit.

NASA, in particular, has been wrangling names since its inception. In a 1963 memo, then-NASA Associate Administrator Robert C. Seamans, Jr. wrote, “The terminology used for the several spacecraft within the project, both before and after launch, has become somewhat confusing. It is important that we have a more nearly standard method of designating spacecraft.” That first mission directive, number 7620, became the foundation for decades of NASA policy surrounding mission names. 

Their guidelines? Pick simple but unique words that won’t be confused with other NASA efforts (or international projects!), and start numbering if you have multiple spacecraft in a series (e.g. Voyager 1 and Voyager 2). Try to avoid acronyms, and don’t write in all caps. According to a 1960 draft NASA directive, names must also “convey meaning consistent with NASA mission for the exploration, scientific investigation, and exploration of space for peaceful purposes.”

These somewhat simple rules allowed for a lot of flexibility in how names got to the proposal stage. For some missions, the team of scientists and engineers working on the project comes up with ideas themselves. Many famous names have also come from student essay contests, including the naming of the Sojourner Mars rover after abolitionist Sojourner Truth, designating ground relay stations in New Mexico with Native American names like Cacique (leader) and Danzante (dancer), and the more recent Perseverance rover in the Mars 2020 mission.

Many NASA satellites—the Hubble Space Telescope, the Nancy Grace Roman Space Telescope, the Parker Solar Probe—are named after people, particularly scientists and pioneers in astronomy. In the past few years, however, the first major change to Policy 7620 in decades was presented as a result of the James Webb Space Telescope naming controversy. Some criticized using the name of an administrator who was in power during a time of intense exclusion for gay employees (known as the Lavender Scare), and although NASA decided to keep the name, they did suggest future projects avoid naming themselves after individuals, in the most recent revision of Policy 7620.

What’s in a name contains the culture, values, heroes, and priorities of a given society and their space exploration program. A project like Lanyue captures the excitement of a return to the moon, while a project like the Perseverance rover reminds us that science requires determination. These names bring life to the goals of the mission and the dreams of the people creating them, and carve out a designation to make a mark on the history of space exploration.

Thank you to Nanette Smith and Stephen Garber from NASA History Division for their helpful guidance in finding historical policy memos and documents.

The post Voyager, Chandrayaan, Curiosity: How do spacecraft get their names? appeared first on Popular Science.

‘Space headaches’ could be a real pain for astronauts https://www.popsci.com/science/space-headaches-astronauts/ Wed, 13 Mar 2024 20:00:00 +0000 https://www.popsci.com/?p=606454
NASA astronaut Scott Tingle is pictured during a 2018 spacewalk to swap out a degraded robotic hand on the Canadarm2.
NASA astronaut Scott Tingle during a 2018 spacewalk to swap out a degraded robotic hand on the Canadarm2. Johnson Space Center

A study of several NASA, ESA, and JAXA astronauts shows an association between long-haul space flight and headaches.

The post ‘Space headaches’ could be a real pain for astronauts appeared first on Popular Science.


Space travel is certainly not for the faint of heart, for many reasons including its effects on physical health. It can potentially disturb human immune systems and increase red blood cell death. Astronauts can even suffer from bone loss during missions. Space flight may also bring on headaches: astronauts with no prior history of headaches can experience migraine and tension-type headaches during long-haul space flights–those lasting more than 10 days. The findings are detailed in a study published March 13 in the journal Neurology, the medical journal of the American Academy of Neurology.

“Changes in gravity caused by space flight affect the function of many parts of the body, including the brain,” W. P. J. van Oosterhout, study co-author and a neurologist at Leiden University Medical Center in the Netherlands, said in a statement. “The vestibular system, which affects balance and posture, has to adapt to the conflict between the signals it is expecting to receive and the actual signals it receives in the absence of normal gravity.”

[Related: 5 space robots that could heal human bodies—or even grow new ones.]

The study looked at 24 astronauts from NASA, the European Space Agency (ESA), and the Japan Aerospace Exploration Agency (JAXA). All of the astronauts were assigned to International Space Station expeditions for up to 26 weeks from November 2011 to June 2018. Combined, the astronauts studied spent a total of 3,596 days in space. 

The astronauts all completed health screenings and a questionnaire about individual headache history before their space flight. Nine of them reported never having any headaches prior to the study, with three reporting a headache that interfered with their daily activities within the last year. None of the astronauts had a history of recurrent headaches or had a migraine diagnosis.

During space flight, they filled out a daily questionnaire for the first seven days and a weekly questionnaire each following week throughout their stay in the International Space Station. The astronauts reported 378 headaches during their combined days in space.

The study found that 92 percent of the astronauts surveyed experienced headaches during space flight, compared to just 38 percent who reported experiencing headaches in the two to six months before going into space. Twenty-two of the 24 astronauts studied also experienced one or more headache episodes during their first week in space. About 89 percent of these headaches were tension headaches and about 10 percent were likely migraines. Headaches were also of a higher intensity and more likely to resemble a migraine during the first week of space flight.

According to van Oosterhout, the changes to the brain’s balance and posture system, combined with adjusting to zero gravity during the first week of space flight, “can lead to space motion sickness in the first week, of which headache is the most frequently reported symptom. Our study shows that headaches also occur later in space flight and could be related to an increase in pressure within the skull.” 

[Related: Why space lettuce could be the pharmacy astronauts need.]

The astronauts were monitored after returning back to Earth and none of them reported any headaches in the three months after returning home. 

One of the study’s limitations is that it relied on self-reporting of symptoms, so recall may not have been fully accurate. The study also does not show that going into space causes headaches, only that there is an association.

“Further research is needed to unravel the underlying causes of space headache and explore how such discoveries may provide insights into headaches occurring on Earth,” said van Oosterhout. “Also, more effective therapies need to be developed to combat space headaches as for many astronauts this [is] a major problem during space flights.”

The post ‘Space headaches’ could be a real pain for astronauts appeared first on Popular Science.

15 captivating photos of auroras seen from space https://www.popsci.com/science/space-auroras-photos/ Sun, 10 Mar 2024 00:20:00 +0000 https://www.popsci.com/?p=605303
a green and red aurora with a space craft in the foreground
This is one of a series of night time images photographed by one of the Expedition 29 crew members from the International Space Station. It features Aurora Australis over the southern Indian ocean. NASA

Greens, purples, reds, and yellows splash across the sky.

The post 15 captivating photos of auroras seen from space appeared first on Popular Science.


An aurora is a dazzling spectacle to witness down here on Earth, but from space, you get an entirely different perspective on the light show. While most of us won’t have the opportunity to see the beauty of an aurora from space first-hand, astronauts have captured stunning images of what an aurora looks like dancing over our planet.

In February, NASA astronaut Jasmin Moghbeli shared her view of an aurora from the International Space Station, writing on X (formerly Twitter): “Sometimes I can’t believe this is our planet, OUR home. How lucky we are to live somewhere so spectacular and alive. I will definitely miss these views, but I look forward to exploring more of our planet and the beautiful views from the ground.” Moghbeli’s photos show a green aurora australis dancing over the southern hemisphere with the ISS in the foreground.

Unless you have a flight to the ISS booked, you’ll have to settle for experiencing the beauty of auroras from space in photographs. Thankfully the photos are quite beautiful too.

[Related: We finally know what sparks the Northern Lights]

green and red aurora in the middle with a cloudy earth below and stars above
The aurora australis seemingly crowns the Earth’s horizon as the International Space Station orbited 272 miles above the southern Indian Ocean in between Asia and Antarctica. Image: NASA
Space photo
A greenish appearing aurora forms the backdrop for this 35mm scene of the Earth orbiting Space Shuttle Endeavour’s aft cargo bay. Featured in the bay are the antennae for the SIR-C/X-SAR imaging radar instruments, illuminated by moonlight. The crew sighted the southern lights (aurora australis) several times during each of the eleven days of the mission. Image: NASA
Space photo
Earth Observation taken during a night pass by the Expedition 40 crew aboard the International Space Station (ISS). A docked Soyuz spacecraft is also visible in foreground. Image: NASA
red and green aurora over a blue ocean
A brilliant and vivid Aurora Borealis illuminates the Earth’s northern hemisphere on Jan 20, 2016, providing a spectacular view for members of Expedition 46 aboard the International Space Station. Image: NASA
green aurora
The aurora australis streams across the Earth’s atmosphere as the International Space Station orbited 271 miles above the southern Indian Ocean in between Asia and Antarctica. Image: NASA
green aurora over a cloudy earth
While docked and onboard the International Space Station, a STS-123 Endeavour crew member captured the glowing green beauty of the Aurora Borealis. Looking northward across the Gulf of Alaska, over a low pressure area (cloud vortex), the aurora brightens the night sky. This image was taken on March 21, 2008. Image: NASA
a green, yellow, purple, and blue aurora over the earth
The aurora australis streams across the Earth’s atmosphere as the International Space Station orbited 271 miles above the southern Indian Ocean in between Asia and Antarctica. Image: NASA
a green aurora line in a dark sky
Astronaut Don L. Lind, mission specialist, termed this scene of an aurora in the Southern Hemisphere as “spectacular,” during a TV down link featuring discussion of the auroral observations on the seven-day flight. This scene was captured by astronaut Robert F. Overmyer, crew commander, using a 35mm camera. Dr. Lind, monitoring activity in the magnetosphere at various points throughout the flight, pinpointed the spacecraft’s location as being over a point halfway between Australia and the Antarctic continent. There are moonlit clouds on Earth. The blue-green band and the tall red rays are aurora. The brownish band parallel to the Earth’s horizon is a luminescence of the atmosphere itself and is referred to as airglow. Dr. T. Hallinan of the Geophysical Institute of Fairbanks serves as principal investigator for the auroral observations experiment and spent a great deal of time with Dr. Lind in preparation for the flight. Image: NASA
green and purple aurora
Earth Observation taken during a night pass by the Expedition 40 crew aboard the International Space Station (ISS). Folder lists this as: Phenomenal Aurora. Part of the Space Station Remote Manipulator System (SSRMS) arm is also visible. Image: NASA
an aurora over city lights
Members of Expedition 43 on the International Space Station captured this contrasting image of Earth sunrise, aurora and sparkling cities in northern Europe. Image: NASA
the arm of a space craft in the foreground with a green aurora in the background
Night Earth Observations taken by Expedition 41 crewmember. Aurora and Remote Manipulator System (RMS) are visible. Image: NASA
green aurora over clouds
A 35mm frame of the Aurora Australis, also known as the Southern Lights, photographed from Space Shuttle Discovery’s flight deck by one of its seven crew members. One of the mission objectives was to measure the spectral and spatial characteristics of auroral emissions. While passing over the sunlighted portion of Earth, the crew was able to take a number of photos of the various geographic points on the planet; much of the time on nightside passes was devoted to a thorough study and documentation of auroral displays. Image: NASA
a green aurora with the 'legs' of the international space station
Earth observation taken by the Expedition 40 crew aboard the International Space Station (ISS). Image: NASA
city lights in the night's sky with a green aurora overhead
The city lights (bottom center to far left) of Moscow and Saint Petersburg in Russia, to Helsinki, Finland, are framed by an aurora in this photograph from the International Space Station as it orbited 264 miles above. Image: NASA

The post 15 captivating photos of auroras seen from space appeared first on Popular Science.

‘Alien’ signal was likely a very big truck https://www.popsci.com/science/uap-seismic-data-truck/ Fri, 08 Mar 2024 18:15:00 +0000 https://www.popsci.com/?p=605936
Google Earth Image of seismic center and truck road
The area near the seismic station in Manus Island, based on satellite images acquired on March 23, 2023. CREDIT: ROBERTO MOLAR CANDANOSA AND BENJAMIN FERNANDO/JOHNS HOPKINS UNIVERSITY, WITH IMAGERY FROM CNES/AIRBUS VIA GOOGLE

Researchers took a deeper look at seismic data taken during the 2014 fireball landing near Papua New Guinea.

The post ‘Alien’ signal was likely a very big truck appeared first on Popular Science.


There’s no doubt an extremely bright fireball careened through the atmosphere north of Papua New Guinea on January 8, 2014. It’s also true that divers recovered materials at the bottom of the ocean last year near where many experts believed the object landed—and that prominent Harvard astrophysicist Avi Loeb theorized some of these metallic spherules were possibly of “extraterrestrial technological” origin. But as to the ground vibrations recorded at a seismic station on Manus Island during the same atmospheric event? The explanation is likely much more mundane.

“[T]hey have all the characteristics we’d expect from a truck and none of the characteristics we’d expect from a meteor,” Johns Hopkins planetary seismologist Benjamin Fernando said on Thursday.

Fernando and his colleagues will present their findings on March 12 during the annual Lunar and Planetary Science Conference in Houston, Texas.

Although Fernando’s team concedes it’s difficult to prove what something isn’t through signal data, it’s pretty easy to highlight the characteristics it may share with existing, explainable seismic info. 

“The signal changed directions over time, exactly matching a road that runs past the seismometer,” said Fernando.

[Related: How scientists decide if they’ve actually found signals of alien life.]

To further bolster the much more everyday explanation, researchers also utilized data collected during the 2014 event by facilities in Australia and Palau originally built to measure nuclear test sound waves. After factoring in those recordings, Fernando’s team revised the previous location estimations for a more exact spot of the atmospheric occurrence—an area 100 miles away from the original region.

“The fireball location was actually very far away from where the oceanographic expedition went to retrieve these meteor fragments,” Fernando said of the 2023 recovery trip. “Not only did they use the wrong signal, they were looking in the wrong place.”

The team also doesn’t mince words in their new paper, “Probably Not Aliens: Seismic Data Analysis from the 2014 ‘Interstellar Meteor.’” Of the alien theory, the researchers “consider it to be at best highly overstated and at worst entirely erroneous.” And of the material recovered last year, “poor localisation implies that any material recovered is far less likely to be from the meteor, let alone of interstellar or even extraterrestrial origin.”

[Related: How lightning on exoplanets could make it harder to find alien life.]

Given NASA’s estimate that around 50 tons of meteoritic material bombards Earth every day, Fernando’s team says it’s definitely possible some of those fragments retrieved from the ocean floor may indeed be from some other meteorite. Regardless, they “strongly suspect that it wasn’t aliens.”

Disappointing? Perhaps. But there’ll probably be plenty of new UAP sightings to parse in the future—especially if people take up the government’s offer to submit their own inexplicable events.

For more detailed debunking, tune into a livestream of next week’s findings here.

The post ‘Alien’ signal was likely a very big truck appeared first on Popular Science.

How lightning on exoplanets could make it harder to find alien life https://www.popsci.com/science/how-lightning-on-exoplanets-could-make-it-harder-to-find-alien-life/ Thu, 07 Mar 2024 14:00:00 +0000 https://www.popsci.com/?p=605673
It depends on the type of atmosphere and how much lightning is happening.
It depends on the type of atmosphere and how much lightning is happening. NASA

Extraterrestrial biosignatures can be both boosted and hidden.

The post How lightning on exoplanets could make it harder to find alien life appeared first on Popular Science.


We’re used to thunder and lightning here on Earth. But what might they be like on another planet? We know other worlds in the solar system have lightning strikes, for example, high in the clouds of Jupiter or during dust storms on Mars. Now, astronomers are thinking about lightning on planets beyond the solar system–and its effects on the signs of life on those planets.

In a new research paper accepted to the journal Astronomy and Astrophysics, a team of astrobiologists investigated how lightning might change some of the biosignatures—chemical signs of life—we could look for on other worlds. Overall, the results are nuanced, just like the complex atmosphere we experience here on Earth. Lightning, it turns out, can amplify some biosignatures while masking others. This both offers clues to find extraterrestrial life and complicates our observations.

Lightning appears to us as a bright flash, usually during a big rainstorm, and it’s caused by electricity in the atmosphere discharging between clouds or to the ground. It also “influences the chemistry of planetary atmospheres, including, as we all know, on Earth,” explains co-author Edward Schwieterman, an astrobiologist at the University of California, Riverside. Lightning even may have played a role in how life got started on our planet—astrobiologists think it could have brought together some of the molecules that eventually became amino acids in our bodies.

How much lightning happens on a planet depends on a whole array of factors, including how much water is in the atmosphere and how hot (or cold) it is. And if there’s enough lightning, it might significantly change what’s going on in the atmosphere. This is particularly critical for astrobiologists planning to look for specific chemicals that indicate that there is alien biology happening on planets beyond the solar system. 

“A lot of the work that people do on exoplanet atmospheric biosignatures is very theoretical, based purely on computational modeling,” adds Nick Wogan, an astronomer at NASA Goddard not affiliated with the new work. This research team, however, combined both hands-on lab experiments and computer simulations to study how many different chemicals are produced and affected when lightning strikes. They found that the outcome depends on the type of atmosphere and how much lightning is happening.

What signs of life are astrobiologists looking for?

A few particularly interesting chemicals for biology are ammonia, methane, ozone and nitrous oxide. Scientists don’t know of a way ammonia can be made without biological processes, and nitrous oxide is a common product of bacteria’s metabolisms. Methane is very commonly produced by life here on Earth, including when humans burn fossil fuels and cows burp, and ozone is a good sign that an atmosphere contains oxygen, which we humans famously need to breathe. If these chemicals are detected on a faraway planet, there is potential that life may exist on that world, marking it as a major target of interest for further research.

In the new study, the team showed that lightning can’t make enough ammonia, methane, or nitrous oxide on its own, making those chemicals even more reliable signs of life. They also showed that it can’t make carbon monoxide—sometimes called an anti-biosignature, because its presence means life probably doesn’t exist there—meaning we won’t discount an inhabited planet as dead and full of carbon monoxide just because it has lightning on it. 

[ Related: How scientists decide if they’ve actually found signals of alien life ]

However, lightning might make it harder to spot ozone: a planet with only a few times more lightning than Earth could have its ozone signature masked.

When will we see these signs?

Actually observing chemistry and lightning on an Earth-like rocky exoplanet remains a goal for the future. “We have yet to identify the presence of an atmosphere on a temperate rocky planet like Earth, though few planets have been examined so far,” explains Schwieterman. Most of the rocky planets we have studied so far are close enough to their stars that their atmospheres are gone, burned off by the intense radiation of their suns.

Astronomers are, however, making tangible progress towards observing truly Earth-like planets. The Habitable Worlds Observatory, planned for launch in the 2040s, will be designed to look for biosignatures. Research to understand atmospheres and their many factors—including lightning—“will be key when future telescopes begin searching for biosignature gases on Earth-sized exoplanets,” adds Wogan.

“Overall, the effect on biosignatures by lightning seems to be not too much to worry about,” says lead author Patrick Barth, an astronomer at the University of Stuttgart. Lightning, he adds, is just “another piece of information that you have to take into account when you’re observing an exoplanet.”

Update 03/11/24 8:55am: The missing word “enough” has been added to sentence beginning with, “In the new study, the team showed that lightning…”

NASA’s astronaut applications are open again. Do you have what it takes? https://www.popsci.com/science/nasa-astronaut-application-open/ Wed, 06 Mar 2024 17:00:00 +0000 https://www.popsci.com/?p=605607
You can apply for the 24th astronaut candidate class until April 6. NASA / YouTube

If you missed out on space camp, it's time to see if you qualify for the real thing.

NASA is wasting no time after yesterday’s 23rd astronaut class graduation ceremony. After congratulating the 10 newest people now eligible for flight assignments, the agency has opened the application portal for its next pool of potentially spacebound voyagers. And to celebrate the occasion, NASA enlisted the legend Morgan Freeman to narrate its announcement video.

A total of 360 candidates have now taken part in the demanding, two-year training program since 1959, and only three did not finish. Just 48 Astronaut Office members are currently active. NASA picked its latest 10 graduates from over 12,000 applicants; the new graduates are now qualified for future assignments aboard the International Space Station, Artemis program missions, and even future commercial space station projects. As Space.com notes, however, their newbie status means they will more likely start out in technical roles supporting flights, such as serving as Mission Control capsule communicators (capcoms) and overseeing rocket and spacecraft preparations. Before long, they could find themselves in line to board those very same vehicles for missions to the ISS or the moon, based on their backgrounds and career experience.

“Picture yourself in space, contributing to a new chapter of human exploration as a NASA astronaut,” Freeman says during the one-minute spot—okay, easy to do, but what about the reality of what’s required to apply?

The astronaut application checklist

According to NASA’s current astronaut candidate application page, you need at least a master’s degree or international equivalent in a STEM-related field such as “engineering, biological science, physical science, computer science, or mathematics.” At least two years of work toward a PhD or similar doctoral program can also qualify you, as can an advanced medical degree. Participation in a stateside or international Test Pilot School program is fair game, too. You will also need either two years of related STEM professional experience or a minimum of 1,000 hours of pilot-in-command time in jet aircraft. There aren’t any age restrictions, but every astronaut candidate so far has been between 26 and 46 years old, with a median age of 34.

[Related: How to apply for NASA’s next Mars habitat simulation.]

Unsurprisingly, there’s also a lengthy list of physical assessments and medical requirements, including preliminary and random drug testing for illegal substances, psychiatric evaluations, swimming tests, and the Agency Physical Fitness Test. Your sitting blood pressure can’t exceed 140/90, and you need 20/20 vision, although eyeglasses or LASIK surgery are fine these days. On the shorter or taller side? Sorry: the height window is only 65 to 75 inches, in order to fit into NASA’s (increasingly trendy) spacesuits.
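
Purely as an illustration—and emphatically not an official NASA screening tool—the numeric cut-offs quoted above can be gathered into one quick self-check. Every threshold in the sketch below simply restates a figure from this article; the function name and parameters are made up for the example.

```python
def meets_basic_criteria(
    has_stem_masters_or_equivalent: bool,
    years_stem_experience: float = 0.0,
    jet_pilot_in_command_hours: float = 0.0,
    sitting_bp: tuple[int, int] = (120, 80),
    height_inches: float = 70.0,
    has_2020_vision_or_corrected: bool = True,
) -> bool:
    """Rough screen against the figures quoted in this article, not NASA's actual rules."""
    education_ok = has_stem_masters_or_equivalent
    experience_ok = years_stem_experience >= 2 or jet_pilot_in_command_hours >= 1000
    blood_pressure_ok = sitting_bp[0] <= 140 and sitting_bp[1] <= 90
    height_ok = 65 <= height_inches <= 75
    return all([education_ok, experience_ok, blood_pressure_ok,
                height_ok, has_2020_vision_or_corrected])


print(meets_basic_criteria(True, years_stem_experience=3))            # True
print(meets_basic_criteria(True, jet_pilot_in_command_hours=1200,
                           height_inches=61))                         # False: outside the height window
```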

If all those hurdles sound relatively feasible to clear, feel free to head over to the USAJOBS page to fire off an application by April 6. That said, if you’re looking to start a bit closer to Earth, there’s always NASA’s Mars habitat simulation project to consider.

Listen to three breathtaking NASA images https://www.popsci.com/science/listen-nasa-images/ Mon, 04 Mar 2024 15:00:00 +0000 https://www.popsci.com/?p=605318
M74 aka the Phantom Galaxy shown in a combined optical/mid-infrared image, featuring data from both the Hubble Space Telescope and the James Webb Space Telescope. It is one of the celestial objects featured. ESA/Webb, NASA & CSA, J. Lee and the PHANGS-JWST Team; ESA/Hubble & NASA, R. Chandar Acknowledgement: J. Schmidt

Sonification translates complex space data into soundscapes.

Space produces some otherworldly sounds–black hole songs, Martian dust tornadoes, and meteorites crashing into the Red Planet to name a few. Now, NASA has released three new sonifications of images taken from NASA’s Chandra X-ray Observatory and other telescopes.

The new sonifications highlight different celestial objects observed by NASA telescopes.

[Related: NASA turns spectacular space telescope images into vibey ‘cosmic sonifications.’]

What is sonification?

Sonification translates data into sound. Chandra and other space telescopes collect scientific data as digital signals, which are usually turned into the dazzling visuals we see on Earth. Sonification maps that same information into audio instead.

According to NASA, a sonification scans the data from one side of the image to the other, and each wavelength of light is mapped to a different range of tones that our ears can hear. Light toward the top of an image is given a higher pitch, and the intensity of the light controls the volume. Radio waves get the lowest tones, visible light the medium tones, and X-rays the highest.
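
That scan-and-map recipe is simple enough to sketch in a few lines of code. The example below is a rough illustration only, not NASA's actual pipeline: it assumes a single 2D brightness array (here filled with random numbers standing in for telescope data) and made-up frequency and timing parameters, sweeps across the array column by column, turns vertical position into pitch and pixel brightness into volume, and writes the result to a WAV file.

```python
import numpy as np
from scipy.io import wavfile

def sonify(image, seconds_per_column=0.05, f_low=200.0, f_high=2000.0, sample_rate=44100):
    """Scan a 2D brightness array left to right: row position sets pitch, intensity sets volume."""
    n_rows, n_cols = image.shape
    freqs = np.linspace(f_high, f_low, n_rows)      # rows near the top get higher pitches
    samples_per_col = int(seconds_per_column * sample_rate)
    t = np.arange(samples_per_col) / sample_rate
    columns = []
    for col in range(n_cols):
        weights = image[:, col][:, None]            # brightness of each pixel in this column
        tone = (weights * np.sin(2 * np.pi * freqs[:, None] * t)).sum(axis=0)
        columns.append(tone)
    audio = np.concatenate(columns)
    audio /= np.abs(audio).max() + 1e-12            # normalize to [-1, 1]
    return (audio * 32767).astype(np.int16)

# Demo with random data standing in for a telescope image.
rng = np.random.default_rng(0)
fake_image = rng.random((64, 100))
wavfile.write("sonification_demo.wav", 44100, sonify(fake_image))
```

NASA's real sonifications layer separate radio, visible, and X-ray data and tune the mapping by hand for each object, but the underlying idea is the same kind of scan-and-map pass.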

MSH 11-52–The Cosmic Hand

The first sonification is of MSH 11-52. This is a supernova remnant that is releasing a large cloud of energized particles that looks somewhat like a human hand. It’s estimated that light from this supernova reached Earth roughly 1,700 years ago. The supernova is seen and heard here using data from Chandra, NASA’s Imaging X-ray Polarimetry Explorer (IXPE), and ground-based optical data.

M74–The Phantom Galaxy

This sonification features M74, which is a spiral galaxy like our Milky Way. It is about 32 million light-years away from Earth in the constellation Pisces. Spiral galaxies like these typically have a rotating disc with spiral ‘arms’ that curve out from a dense central region. This sonification combines data taken with NASA’s James Webb and Hubble Space Telescopes and X-rays from Chandra.

IC 443–The Jellyfish Nebula

The third sonification features IC 443, nicknamed the Jellyfish Nebula. This nebula is about 5,000 light-years away and is the expanding debris cloud from a very large star that exploded. The light from this supernova reached planet Earth more than 30,000 years ago. The data in this sonification include X-rays from Chandra and the now-retired German ROSAT mission, along with radio data from NSF’s Very Large Array and optical data from the Digitized Sky Survey.

NASA’s sonification project began in 2020 and built on other Chandra projects aimed at reaching blind and visually impaired audiences. A new documentary, Listen to the Universe, is now available on NASA+ and explores how these sonifications are created, telling the story of the team that makes them possible.

[Related: Listen: Meteoroids make little ‘bloop’ noises when crashing into Mars.]

“Sonifications add a new dimension to stunning space imagery, and make those images accessible to the blind and low-vision community for the first time,” Liz Landau, who leads multimedia efforts for NASA’s Astrophysics Division at NASA Headquarters, said in a statement. “I was honored to help tell the story of how Dr. Arcand and the System Sounds team make these unique sonic experiences and the broad impact those sonifications have had.”

Watch the plasma fly in space capsule’s dramatic fall to Earth https://www.popsci.com/science/space-capsule-reentry-video/ Thu, 29 Feb 2024 21:45:00 +0000 https://www.popsci.com/?p=605067
After 8 months in orbit, Varda's first reusable capsule made a safe return to Earth on Feb. 21. Varda / YouTube

Varda's W-1 spent 8 months in orbit before recording its entire trip home.

It took less than 30 minutes for Varda Space Industries’ W-1 capsule to leave its orbital home of eight months and plummet back to Earth. Such a short travel time not only required serious speed (around 25 times the speed of sound), but also the engineering wherewithal to endure “sustained plasma conditions” while careening through the atmosphere. In spite of these challenges, Varda’s first-of-its-kind reentry mission was a success, landing back on the ground on February 21. To celebrate, the company has released video footage of the capsule’s entire descent home.

Check out W-1’s fiery return below—available as both abbreviated and extended cuts:

Installed on a Rocket Lab Photon satellite bus, Varda’s W-1 capsule launched aboard a SpaceX Falcon 9 rocket June 12, 2023. Once in low-Earth orbit, its mini-lab autonomously grew crystals of the common HIV treatment drug ritonavir. Manufacturing anything in space, let alone pharmaceuticals, may seem like overcomplicating things, but there’s actually a solid reason for it. As Varda explains on its website, processing materials in microgravity may benefit from a “lack of convection and sedimentation forces, as well as the ability to form more perfect structures due to the absence of gravitational stresses.”

In other words, medication crystals like those in ritonavir can be grown larger and more structurally sound than is typically possible here on Earth.

Although the experiment wrapped up in just three weeks, Varda needed to delay reentry plans multiple times due to issues securing FAA approval. After finally getting the go-ahead, the W-1 readied for its return earlier this month. All the while, it contained a video camera ready to capture its dramatic fall.

After ejecting from its satellite host, W-1 begins a slightly dizzying spin that provides some incredible shots from hundreds of miles above Earth. At about the 12-minute mark, the planet’s gravitational pull really takes hold—that’s when things begin to heat up for Varda’s experimental capsule.

[Related: First remote, zero-gravity surgery performed on the ISS from Earth (on rubber)]

At Mach 25 (around 17,500 mph), the compression and friction of air against the craft becomes so intense that it literally splits the chemical bonds of nearby air molecules. This results in a dazzling show of sparks and plasma before W-1’s parachute deploys to slow and stabilize its final descent. Finally, the capsule can be seen touching down in a remote region of Utah, where it was recovered by the Varda crew.

Next up will be an assessment of the space-grown drug ingredients, and additional launches of capsules for more manufacturing experiments. While they might not all include onboard cameras to document their returns, W-1’s is plenty mesmerizing enough.

March’s skies shine with the worm moon, a bright Mercury, and penumbral lunar eclipse https://www.popsci.com/science/march-2024-cosmic-calendar/ Thu, 29 Feb 2024 19:00:00 +0000 https://www.popsci.com/?p=605047
The full moon rises behind the Castel del Monte in Andria, Italy on March 7, 2023. March's full moon is also called the worm moon. Davide Pischettola/NurPhoto via Getty Images

Get ready for April’s solar eclipse by practicing stargazing this month.

March 15 through 31: Look for Mercury
March 20: Vernal Equinox
March 25: Full Worm Moon
March 25: Penumbral Lunar Eclipse

The countdown to April’s solar eclipse has begun, but there is still a month of fun stargazing opportunities to keep us excited before the big show. Weather folklore says that in the Northern Hemisphere, the third month of the year goes “in like a lion, out like a lamb.” Usually, we can expect fierce wintery weather to kick off March and calm springlike weather to end it. While it is tough to predict exactly what kind of weather the transitional and temperamental month of March will bring, there are some cosmic events to keep your eye on as the days start to get a little bit longer.

[Related: Delta’s solar eclipse flight sold out, but your best bet to see it is still down here.]

March 15 through 31–Look for Mercury

With a radius of only 1,516 miles, Mercury is our solar system’s smallest planet and the closest to our sun. Since it sits so near the sun’s bright glare and is so tiny, it can be difficult to spot in the night sky. Starting around March 15, Mercury will appear just far enough from the sun for stargazers to get a look at the planet.

According to the Adler Planetarium in Chicago, it’s best to begin looking to the western sky about 40 minutes after sunset. It will be about seven to 10 degrees above the horizon, so try to have a clear sightline without a lot of interference from buildings or trees. It will reach its greatest eastern elongation on March 24, and then get slightly dimmer as the month winds down. 

March 20–Vernal Equinox

The Vernal Equinox is also known as the first day of spring. The season technically arrives in the Northern Hemisphere at 11:06 p.m. EDT on March 19, or March 20 at 3:06 a.m. Coordinated Universal Time (UTC). This is the standard measurement used to keep time zones organized and is maintained by very precise atomic clocks that are housed at laboratories all over the world. The United States Naval Observatory keeps official time in the United States. 

The equinox occurs twice a year (once in the spring and once in the fall). The March equinox brings earlier sunrises, later sunsets and sprouting plants to the Northern Hemisphere and the opposite effects to the Southern Hemisphere.

March 25–Full Worm Moon

This month’s full moon will reach peak illumination at 3:00 a.m. EDT on Monday, March 25, and the bright moon will begin to rise above the horizon on March 24. This month’s moon is also the Paschal Full Moon, which determines when Easter is celebrated: the holiday always falls on the first Sunday after the first full moon of spring, so this year Easter will be on Sunday, March 31.

[Related: Why scientists think it’s time to declare a new lunar epoch.]

The origin of the name worm moon has a few different stories. Originally, it was believed to refer to the time of year when earthworms emerge as snow melts and soil warms. However, recent research from the Farmer’s Almanac found that during the 1760s, Captain Jonathan Carver, a colonial explorer from Massachusetts, visited the Naudowessie (Dakota) and other Native American tribes and wrote that “Worm Moon” refers to beetle larvae, which start to emerge from the thawing bark of trees and the other places they hide out during the winter.

Additional names for March’s full moon include the Snowshoe Breaking Moon or Bebookwedaagime-giizis in Anishinaabemowin (Ojibwe), the Hackberry Month or Niyó’not’à:h in Seneca, and the Spring Moon or Upinagasraq in the Inupiat language.

March 25–Penumbral Lunar Eclipse

While not quite as dramatic as next month’s solar eclipse, a penumbral lunar eclipse will take place on March 25. This occurs when the moon passes through the Earth’s partial shadow, or penumbra. The moon will darken slightly, but not completely. According to NASA, this month’s eclipse will be visible throughout North America, Mexico, Central America, and South America. For the best viewing time near you, check out timeanddate.com.

The same rules that apply to pretty much all stargazing activities are key this month: Go to a dark spot away from the lights of a city or town and let your eyes adjust to the darkness for about half an hour.

Why our tumultuous sun was relatively quiet in the late 1600s https://www.popsci.com/science/sun-quiet/ Thu, 29 Feb 2024 14:00:00 +0000 https://www.popsci.com/?p=604855
This composite image of the Sun includes high-energy X-ray data from NASA's Nuclear Spectroscopic Telescope Array (NuSTAR) shown in blue; lower energy X-ray data from the X-ray Telescope (XRT) on the Japanese Aerospace Exploration Agency's Hinode mission shown in green; and ultraviolet light detected by the Atmospheric Imaging Assembly (AIA) on NASA's Solar Dynamics Observatory (SDO) shown in red. JPL/NASA

New simulations shine a light on 300-year-old solar mystery.

From Earth, the Sun appears comfortingly constant, always just about the same brightness in our blue skies during the day. Up close, however, it’s a lot more tumultuous, dotted with sunspots and roiling with solar flares.

Solar activity varies in fairly predictable 11-year cycles, meaning there are more sunspots and flares at certain intervals. But there’s also fluctuation between cycles—some periods are more intense, and others, like the three-decade Maunder minimum in the late 1600s, are almost entirely devoid of sunspots. Astronomers have been trying to pin down the exact physics behind these phenomena for years, although they have a clue as to the culprit: the Sun’s magnetic field.

New research, published in the journal Monthly Notices of the Royal Astronomical Society last year and recently presented at a symposium of the International Astronomical Union, simulates how a star’s spin can lead to very different kinds of solar cycles. The researchers explain that spin shapes a star’s magnetic field, which is born from hot plasma flowing through the star’s interior, and that this helps explain the differences seen among nearby sun-like stars. It turns out that the so-called “grand minima” that our sun experiences—like the long-ago Maunder minimum—may not be uniform across all stars.

“Young stars, rapidly rotating and full of energy, are like children: lively, unpredictable, and active… Old stars, with their slow rotation, are reminiscent of the elderly: moving at a more measured pace, and embodying a calmer presence.”

Observations from the past 50 years all point to one trend: that more active stars tend to rotate faster. The new simulations provide a physical explanation for this trend and “confirm the suspected link between a faster rotation and more active stellar cycle,” according to Ryan French, an astronomer at the National Solar Observatory not affiliated with the publication.

A very long solar filament that had been snaking around the Sun erupted (Dec. 6, 2010) with a flourish. STEREO (Behind) caught the action in dramatic detail in extreme ultraviolet light of Helium. It had been almost a million km long (about half a solar radius) and a prominent feature on the Sun visible over two weeks earlier before it rotated out of view. Filaments, elongated clouds of cooler gases suspended above the Sun by magnetic forces, are rather unstable and often break away from the Sun. Credit: NASA/GSFC/SOHO

The newly published simulations use the physics of fluid dynamics to imitate the rotation and flow of hot plasma within a star. This moving plasma is what generates a star’s magnetic field, through a process known as a magnetic dynamo. The researchers ran these simulations with sun-sized stars that rotate at different speeds, from slow (30 days to complete a revolution), to something similar to our sun (about 25 days), to very fast (1 day). They found that the faster a star rotates, the stronger and more chaotic its magnetic field is. This leads to less predictable solar cycles, and fewer periods of inactivity like the Maunder minimum.

“Young stars, rapidly rotating and full of energy, are like children: lively, unpredictable, and active. These stars have magnetic fields that are strong and chaotic, mirroring the boundless energy and sometimes erratic behaviors of kids,” write lead author Vindya Vashishth and her colleague Anu Sreedevi from the Indian Institute of Technology. “Old stars, with their slow rotation, are reminiscent of the elderly: moving at a more measured pace, and embodying a calmer presence. Their magnetic fields are weaker, and their activity cycles are smooth and predictable, with occasional grand minima. These grand minima become more frequent as the star ages, much like how elderly individuals may have more frequent periods of rest or quietude,” they add.

In order to have grand minima like the sun—which the researchers call the “hibernation phase of a star”—a star must spin slower than once every ten days, according to their models. They have good reason to be confident in their results, too. We have observational evidence that the sun has experienced around 27 grand minima in the past 11,000 years, and the study found a similar frequency of those events in a simulation spanning 11,000 years of the sun’s history.

What is happening with our sun right now?

Some astronomers even think our sun might be in a grand minimum right now, or at least just coming out of one. “Whether the last solar cycle was a Maunder Minimum or not is still debatable,” says Jia Huang, a solar scientist at Berkeley not involved with the new work. “This paper provides a new aspect to understanding the relationship between solar rotations and the occurrence of grand minima, thus it is timely and insightful to understand why and how the grand minima happen.”

The previous solar cycle—Solar Cycle 24—spanned December 2008 to December 2019, and was particularly weak, meaning there weren’t as many sunspots, flares, or other activity on the sun’s surface. A lack of solar activity can actually be good for humanity, as large solar storms or Coronal Mass Ejections (CMEs) can have deleterious effects on our power grids and satellites. On the other hand, extended periods of quiet from the Sun might make Earth a less pleasant place to live; the Maunder minimum seems to coincide with the Little Ice Age, but a causal link has yet to be confirmed.

What’s next for our sun’s solar activity?

We’re now in Solar Cycle 25, which began in December 2019 and will continue until about 2030. Solar scientists are starting to get excited, as the maximum of this particular cycle is set to occur within the next two years. “As we approach the solar maximum of the Sun’s solar cycle, more attention than ever turns to the activity of our local star,” adds French. “Years ago, predictions were made of how this Solar Cycle 25 would play out, and we’re finally close to revealing the truth.”

Scientists at the National Oceanic and Atmospheric Administration claim Cycle 25 will be fairly mild, but will also finally break the trend of weakening solar activity, avoiding a situation that would truly mimic the Maunder minimum. They’ve also predicted that Cycle 25’s maximum might bring some excitement, including “impactful space weather events” in 2024 and possibly extra-radiant aurora. That’s not all for the sun’s time in the spotlight, either — solar activity will be on full display in April 2024, when a total solar eclipse will allow viewers in North America a gorgeous glimpse at the sun’s corona. It’s the last such event expected for this continent until 2045, so don’t miss out!

Odie the lunar lander is not dead yet https://www.popsci.com/science/odysseus-lunar-lander-mission/ Wed, 28 Feb 2024 19:35:57 +0000 https://www.popsci.com/?p=604519
On Feb. 22, 2024, Intuitive Machines’ Odysseus lunar lander captures a wide field of view image of Schomberger crater on the Moon approximately 125 miles (200 km) uprange from the intended landing site, at approximately 6 miles (10 km) altitude. Intuitive Machines

Despite toppling on its side during landing, Odysseus is outliving its 10-20 hour prognosis.

Despite landing on its side and struggling to maintain power, Odysseus, the first US spacecraft to land on the moon in over half a century, is still somewhat operational. Built by the Houston-based company Intuitive Machines, “Odie” marked a historic return to the lunar surface and became the first privately funded venture ever to successfully land on the moon.

On Tuesday morning, Intuitive Machines predicted that the spacecraft “may continue up to an additional 10-20 hours.” Still, mission control plans to put the lander to sleep later tonight. Odie “continues to generate solar power,” said Intuitive Machines co-founder and president Steve Altemus during today’s mission update. Altemus also confirmed that engineers will attempt to revive Odysseus in two to three weeks, following the upcoming lunar night’s conclusion.

“We’ve gotten over 15 megabytes of data,” said CLPS project scientist Sue Lederer when discussing the data the team is retrieving from Odysseus on Wednesday. “We went from basically a cocktail straw of data coming back to a boba tea size straw of data coming back.”

An image of Odysseus on the surface of the moon, touching down with its engine firing. Pieces of landing gear are broken off. Credit: Intuitive Machines

Launched from NASA’s Kennedy Space Center on February 15 aboard a SpaceX Falcon 9 rocket, Odysseus spent the next week traveling 230,000 miles towards the moon—and even documented its journey in the process.

[Related: ‘Odie’ makes space history with successful moon landing.]

For a moment, it seemed as though Odysseus might meet the same fate as a recent predecessor. Weeks before the Odysseus launch, the Peregrine lunar lander built by Astrobotic experienced a “critical loss of propellant” on its way to the moon, forcing the private company to abandon its mission.

NASA’s Lunar Reconnaissance Orbiter captured this image of the Intuitive Machines’ Nova-C lander, called Odysseus, on the Moon’s surface on Feb. 24, 2024, at 1:57 p.m. EST. Odysseus landed at 80.13 degrees south latitude, 1.44 degrees east longitude, at an elevation of 8,461 feet (2,579 meters). The image is 3,192 feet (973 meters) wide, and lunar north is up. (LROC NAC frame M1463440322L) Credit: NASA/Goddard/Arizona State University

While Odysseus circled the moon ahead of last week’s descent, ground engineers discovered they had failed to turn on the spacecraft’s laser navigation system. As luck would have it, Odysseus housed an experimental NASA laser navigation device intended for testing once it reached its final destination. Mission controllers managed to boot up the laser, which allowed the lander to finish its trip. On February 22, Odysseus arrived close to the Malapert A crater, within a mile of its target and approximately 185 miles from the moon’s south pole—but not without a debilitating setback.

During landing, a faster-than-intended descent caused one of its six legs to malfunction and tip Odysseus onto its side. According to mission representatives, the resulting position blocked a number of Odie’s antennas and angled its solar panels in a way that limited their ability to draw power. A similar issue plagued another recent historic lunar landing mission, when Japan’s SLIM spacecraft arrived at the moon last month intact, if upside down.

[ Related: SLIM lives! Japan’s upside-down lander is online after a brutal lunar night ]

But even if it had perfectly stuck the landing, Odysseus would still only have had another two to three days of life before powering down as the moon entered its next lunar night. Designers did not intend Odie to survive the harsh, 14.5-day phase that sees temperatures plummet as low as -208 degrees Fahrenheit.

During a February 28 mission update, representatives said NASA Administrator Bill Nelson considers Odie’s landing a “success” despite the setbacks.

Odysseus contained six NASA experiments (including that aforementioned laser nav system) intended to help plan for future Artemis program missions, a camera designed by university students, a lunar telescope prototype, as well as an art project containing 125 steel sculptures by Jeff Koons. According to Intuitive Machines CEO Steve Altemus, Odysseus tipped so that only the Koons cargo faces downward into the lunar dirt.

This story is developing. We will update this article with more details.

NASA and Google Earth Engine team up with researchers to help save tigers https://www.popsci.com/environment/tiger-conservation-nasa-google/ Tue, 27 Feb 2024 15:37:58 +0000 https://www.popsci.com/?p=604513
A female tiger in the Sikhote-Alin Biosphere Reserve, a UNESCO site, in Russia. ANO WCS and Sikhote-Alin Biosphere Reserve

Here’s how a new real-time data system could improve wild tiger habitats—and the health of our planet.

Fewer than 4,500 tigers remain in the world, according to the International Union for Conservation of Nature (IUCN). Habitat loss continues to pose an immense existential threat to the planet’s largest cat species—a problem compounded by the fact that the animals reside in some of Earth’s most ecologically at-risk regions and landscapes.

To better monitor the situation in real time, NASA, Google Earth Engine, and more than 30 collaborating researchers are announcing TCL 3.0 today, a new program that combines satellite imagery and powerful computer processing to keep an eye on tigers’ existing and reemerging ecosystems.

“The ultimate goal is to monitor changes in real time to help stabilize tiger populations across the range,” explained Eric W. Sanderson, VP for Urban Conservation at the New York Botanical Garden and first author of a recent foundational study published in Frontiers in Conservation Science.

[Related: A new algorithm could help detect landslides in minutes.]

“Tiger Conservation Landscapes,” or TCLs, refer to the planet’s distinct locales where Panthera tigris still roam in the wild. Because of their size, diet, and social habits, tigers require comparatively large areas to not only survive, but flourish.

According to researchers, stable tiger populations “are more likely to retain higher levels of biodiversity, sequester more carbon, and mitigate the impacts of climate change, at the same time providing ecosystem services to millions of humans in surrounding areas.” In doing so, TCLs can serve as a reliable, informative indicator of overall environmental health.

Unfortunately, the total area of Tiger Conservation Landscapes declined around 11 percent between 2001 and 2020. Meanwhile, potential restored habitats have only plateaued near 16 percent of their original scope—if such spaces were properly monitored and protected, however, tigers could see a 50 percent increase in available living space. 

Using this new analytical computing system based on Google Earth Engine data, NASA Earth satellite observations, biological info, and conservation modeling, TCL 3.0 will offer environmentalist groups and national leaders critical, near-real time tools for tiger conservation efforts.
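
The article doesn't include TCL 3.0's code, but the kind of cloud-side satellite query such a system builds on can be sketched with Google Earth Engine's public Python API. Everything specific below is an assumption made for illustration: the bounding box is arbitrary, and the Hansen Global Forest Change asset ID, version, and band name describe a commonly used public forest-loss dataset rather than the actual TCL 3.0 pipeline.

```python
import ee

ee.Initialize()  # assumes a (free) Earth Engine account and prior `earthengine authenticate`

# Arbitrary example region; real tiger conservation landscapes use published boundaries.
region = ee.Geometry.Rectangle([93.0, 26.5, 94.0, 27.5])

# Hansen Global Forest Change product (asset ID and version assumed here).
gfc = ee.Image("UMD/hansen/global_forest_change_2023_v1_11")

# 'lossyear' encodes the year of forest loss as years since 2000 (0 = no loss).
recent_loss = gfc.select("lossyear").gte(21)  # loss during 2021 or later

# Convert the boolean loss mask into square meters of affected area within the region.
loss_area = recent_loss.multiply(ee.Image.pixelArea()).reduceRegion(
    reducer=ee.Reducer.sum(),
    geometry=region,
    scale=30,
    maxPixels=1e10,
)

print("Approx. forest loss since 2021 (m^2):", loss_area.get("lossyear").getInfo())
```

Because Earth Engine runs queries like this on Google's servers against continuously updated imagery, the same pattern can be re-run on a schedule, which is the basic ingredient of the near-real-time monitoring described above.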

“Analysis of ecological data often relies on models that can be difficult and slow to implement, leading to gaps in time between data collection and actionable science,” Charles Tackulic, a research statistician with the US Geological Survey, said in today’s announcement. “The beauty of this project is that we were able to minimize the time required for analysis while also creating a reproducible and transferable approach.”

Researchers say government and watchdog users of TCL 3.0 will be able to pinpoint tiger habitat loss as it happens, and hopefully respond accordingly. National summaries of initial available data can be found through the Wildlife Conservation Society, with more information to come.

TCL 3.0 provides an unprecedentedly complex and advanced monitoring system for one of the planet’s most threatened creatures, but as researchers note in their new study, the solution is arguably extremely simple.

“What have we learned about tiger conservation over the last two decades? Conservation works when we choose to make it so,” the authors conclude in their recent report. “Conservation is straightforward. Don’t cut down their habitat. Don’t stalk them, harass them, or kill them or their prey. Control poaching and extinguish the illegal trade in tiger bones and parts. Prevent conflicts with people and livestock wherever possible, and where and when not, then mitigate losses to forestall retaliation.”

Correction 2/27/24 5:53PM: This article has been updated to more accurately reflect the world’s remaining tiger population. PopSci regrets the error.

Astronomers discover new moons orbiting Uranus and Neptune https://www.popsci.com/science/uranus-neptune-new-moons/ Mon, 26 Feb 2024 21:28:55 +0000 https://www.popsci.com/?p=604308
The discovery image of the new Uranian moon S/2023 U1 using the Magellan telescope. Uranus is just off the field of view in the upper left, as seen by the increased scattered light. S/2023 U1 is the faint point of light in the center of the image with the arrow pointing to it. The trails are from background stars. Scott Sheppard

The tiny satellites were spotted circling our solar system’s most far flung planets.

Astronomers are adding three newly discovered moons to our solar system’s growing list of known celestial bodies. A team of international researchers spotted an additional moon circling Uranus, the first found there in almost two decades, as well as two new moons orbiting Neptune. The discoveries were announced on February 23 by the International Astronomical Union’s Minor Planet Center, the scientific organization responsible for designating our solar system’s comets, planets, and moons.

[Related: Neptune’s faint rings glimmer in new James Webb Space Telescope image.]

“The three newly discovered moons are the faintest ever found around these two ice giant planets using ground-based telescopes,” Scott S. Sheppard, an astronomer with the Carnegie Institution for Science who collaborated on the moons’ discovery, said in a statement. “It took special image processing to reveal such faint objects.”

Uranus’ new moon will have a dramatic name

The planet Uranus now has 28 known moons. The new moon is temporarily named S/2023 U1, but it will eventually be named after a character from a Shakespearean play. Uranian moons including Puck, Titania, and Oberon reference A Midsummer Night’s Dream, while the moon Miranda is a reference to The Tempest, another of the English playwright’s works.

At only five miles wide, S/2023 U1 is likely Uranus’ smallest known moon. It takes the tiny satellite 680 days to orbit the planet. Sheppard first spotted S/2023 U1 on November 4, 2023, using the Magellan telescopes at Carnegie Science’s Las Campanas Observatory in Chile, and follow-up observations were conducted one month later. Marina Brozovic and Bob Jacobson of NASA’s Jet Propulsion Laboratory then helped Sheppard determine a possible orbit for the moon.

New Neptunian moons–one bright, one faint

With these new discoveries, the planet Neptune now has 16 known satellites. The brighter of Neptune’s two newly found moons is tentatively named S/2002 N5. It is 14 miles wide and appears to be in a 9-year orbit around Neptune. The fainter moon is named S/2021 N1; it is about 8.6 miles wide and circles the planet once every 27 years. Both of these moons will eventually be given names based on sea gods and nymphs in Greek mythology.

The two new Neptunian moons were first observed in September 2021. Sheppard worked with David Tholen of the University of Hawaii, Chad Trujillo of Northern Arizona University, and Patryk Sofia Lykawka of Kindai University, using the Subaru telescope to detect the moons. They confirmed the orbit of the brighter moon (S/2002 N5) over about two years and conducted follow-up observations with the Magellan telescopes.

“Once S/2002 N5’s orbit around Neptune was determined using the 2021, 2022, and 2023 observations, it was traced back to an object that was spotted near Neptune in 2003 but lost before it could be confirmed as orbiting the planet,” said Sheppard. 

Detecting the fainter moon (S/2021 N1) required some special observing time under “ultra-pristine conditions” at the European Southern Observatory’s Very Large Telescope and on Gemini Observatory’s 8-meter telescope in order to secure its orbit. 

[Related: Expect NASA to probe Uranus within the next 10 years.]

By using these telescopes, Sheppard and colleagues snapped dozens of five-minute exposures over three- or four-hour periods on a series of nights. The short-burst images were then layered so that all three new moons could come into view.

“Because the moons move in just a few minutes relative to the background stars and galaxies, single long exposures are not ideal for capturing deep images of moving objects,” Sheppard said. “By layering these multiple exposures together, stars and galaxies appear with trails behind them, and objects in motion similar to the host planet will be seen as point sources, bringing the moons out from behind the background noise in the images.” 
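
The layering Sheppard describes is commonly known as "shift-and-stack." The snippet below is a minimal illustration of the idea rather than the team's actual pipeline, with invented numbers: each frame is shifted to undo the moving object's predicted drift before averaging, so the faint moon adds up coherently while background noise averages down (and, in real data, stars would smear into trails).

```python
import numpy as np

def shift_and_stack(frames, dx_per_frame, dy_per_frame):
    """Average exposures after undoing a constant drift (whole-pixel shifts only)."""
    stacked = np.zeros_like(frames[0], dtype=float)
    for i, frame in enumerate(frames):
        # np.roll wraps at the edges; a real pipeline would interpolate sub-pixel shifts instead.
        shift = (-round(i * dy_per_frame), -round(i * dx_per_frame))
        stacked += np.roll(frame, shift, axis=(0, 1))
    return stacked / len(frames)

# Demo: a faint source drifting 1 pixel per frame through pure noise.
rng = np.random.default_rng(1)
frames = []
for i in range(20):
    img = rng.normal(0.0, 1.0, (50, 50))   # background noise (sigma = 1)
    img[25, 10 + i] += 2.0                  # the "moon": only 2 sigma in any single frame
    frames.append(img)

stacked = shift_and_stack(frames, dx_per_frame=1.0, dy_per_frame=0.0)
print("Stacked value at the aligned position:", round(float(stacked[25, 10]), 2))
# The source now stands out because the noise has averaged down to roughly 1/sqrt(20) of its single-frame level.
```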

A better understanding of how these moons were captured can help astronomers learn about the tumultuous early years of our solar system and how the planets at its outer edge move. Future missions to Uranus and Neptune are in the preliminary planning stages, and more data on their moons will allow the team to better study these far-flung planets.

SLIM lives! Japan’s upside-down lander is online after a brutal lunar night https://www.popsci.com/science/slim-moon-lander-reboot/ Mon, 26 Feb 2024 16:00:00 +0000 https://www.popsci.com/?p=604194
SLIM is defying the odds yet again after a two-week lunar night. JAXA/Takara Tomy/Sony Group Corporation/Doshisha University

The historic moon lander beat the odds.

Japan Aerospace Exploration Agency (JAXA) announced on Monday that its historic Smart Lander for Investigating Moon has defied the odds—after surviving a brutal, two-week lunar night while upside down, SLIM’s solar cells gathered enough energy to restart the spacecraft over the weekend. In an early morning post to X, JAXA reported it briefly established a communication relay with its lunar lander on Sunday, but the moon’s extremely high surface temperature currently prevents engineers from doing much else. Once SLIM’s instrument temperatures cool off in a few days’ time, however, JAXA intends to “resume operations” and continue scientific observations for as long as possible.

[Related: This may be SLIM’s farewell transmission from the moon.]

SLIM arrived near the moon’s Shioli crater on January 19, making Japan the fifth nation ever to reach the lunar surface. Although JAXA’s lander successfully pulled off an extremely precise touchdown, it did so upside down after its main engines malfunctioned about 162 feet above the ground. The resulting nose-down angle means SLIM’s solar cell arrays face westward, severely hindering the lander’s ability to gather power. Despite these problems, the craft’s two tiny robots still deployed, carried out their reconnaissance duties as hoped, and snapped some images of the inverted lander. Meanwhile, SLIM transmitted its own geological survey data back to Earth for a few precious hours before shutting down.

Although JAXA officials cautioned that might be it for their lander, SLIM defied the odds and rebooted 10 days later with enough juice to continue surveying its lunar surroundings, such as identifying and measuring nearby rock formations.

“Based on the large amount of data obtained, analysis is now underway to identify rocks and estimate the chemical composition of minerals, which will help to solve the mysteries surrounding the origin of the Moon. The scientific results will be announced as soon as they are obtained,” JAXA said at the time.

But by February 1, the moon’s roughly 14.5-day lunar night was setting in, plunging temperatures down to a potentially SLIM-killing -208 degrees Fahrenheit. Once again, JAXA bid a preemptive farewell to its plucky, inverted technological achievement—only to be surprised yet again over the weekend.

Figure 2: The rocks on which a detailed 10-band observation was performed. Due to different solar illumination conditions, a few of the rocks selected for observation were changed and others were added. CREDIT: JAXA, RITSUMEIKAN UNIVERSITY, THE UNIVERSITY OF AIZU

In the few days since the most recent lunar night’s conclusion, SLIM has apparently recharged its solar cells enough to come back online. But as frigid as the moon’s night phases are, its daytime temperatures can be just as brutal. According to JAXA, some of the lander’s equipment initially warmed up to over 212 degrees Fahrenheit. To play it safe, mission control is giving things a little time to cool off before tasking SLIM with additional scans, such as using its Multi-Band Camera to assess the chemical composition of nearby regolith formations.

JAXA has a few more days before the moon enters another two-week night, during which SLIM will go into yet another hibernation. While it could easily succumb to the lunar elements this next time, it has already proven far more resilient than its designers thought possible. It may not surpass expectations as dramatically as NASA’s Ingenuity Mars helicopter (RIP), but the fact that SLIM made it this long is cause enough for celebration.

Apollo 17: Looking back at the last time the US landed on the moon https://www.popsci.com/science/apollo-17-moon-images/ Sun, 25 Feb 2024 13:15:00 +0000 https://www.popsci.com/?p=604135
Scientist-astronaut Harrison H. Schmitt is photographed standing next to a huge, split lunar boulder during the third Apollo 17 extravehicular activity (EVA) at the Taurus-Littrow landing site. NASA

The 1972 lunar landing concluded a pioneering time for moon exploration.

In the December 1972 issue of Popular Science, writer Alden P. Armagnac described Apollo 17 as “the most exciting geological field trip in history.” The lunar landing concluded NASA’s groundbreaking Apollo program and ended up being the last time the United States landed on the moon in the 20th century.

This week, after 51 years, the US returned to the moon on Odysseus, an uncrewed lander that became the first privately built spacecraft to survive a moon landing. Odysseus (or “Odie”) was built by Texas-based Intuitive Machines and carried a payload that included NASA navigation and tech experiments. NASA plans to use the instruments to collect vital data ahead of planned crewed missions later this decade.

The December 1972 issue of Popular Science included a preview of the Apollo 17 mission and a look back at previous Apollo missions.

To mark the American return to the moon, we wanted to take a look back at Apollo 17 through images. Commander Gene Cernan, Lunar Module Pilot Harrison Schmitt, and Command Module Pilot Ronald Evans blasted off from the Kennedy Space Center on December 7, 1972. The 12-day mission included several notable feats: the first astronaut-scientist on the moon (Schmitt), the first poem read from the surface of the moon, and circling the moon 75 times.

As Armagnac wrote: “When some future lunar settler writes the history of man on the moon, its most dramatic chapter is bound to be the Apollo adventures of 1969-1972.” We’ll have to wait and see what dramatics 21st century moon exploration brings.

The crescent Earth rises above the lunar horizon in this photograph taken from the Apollo 17 spacecraft in lunar orbit during National Aeronautics and Space Administration’s (NASA) final lunar landing mission in the Apollo program. While astronauts Eugene A. Cernan, commander, and Harrison H. Schmitt, lunar module pilot, descended in the Lunar Module (LM) “Challenger” to explore the Taurus-Littrow region of the moon, astronaut Ronald E. Evans, command module pilot, remained with the Command and Service Modules (CSM) “America” in lunar orbit. Photo: NASA

The Apollo 17 Lunar Roving Vehicle (LRV) is photographed near a large lunar boulder during the third Apollo 17 extravehicular activity (EVA) at the Taurus-Littrow landing site. About half of the boulder is captured in this scene, photographed by astronaut Eugene A. Cernan, mission commander. While astronauts Cernan and Harrison H. Schmitt descended in the Lunar Module (LM) “Challenger” to explore the lunar surface, astronaut Ronald E. Evans, command module pilot, remained with the Command and Service Modules (CSM) in lunar orbit. Photo: NASA

Astronaut Eugene A. Cernan stands near an over-hanging rock during the third Apollo 17 lunar surface extravehicular activity (EVA) at the Taurus-Littrow landing site. Scientist-astronaut Harrison H. Schmitt took this photograph. The tripod-like object just outside the shaded area is the gnomon and photometric chart assembly which is used as a photographic reference to establish local vertical sun angle, scale and lunar color. The gnomon is one of the Apollo Lunar Geology Hand Tools. While astronauts Cernan and Schmitt descended in the Lunar Module “Challenger” to explore the moon, astronaut Ronald E. Evans remained with the Apollo 17 Command and Service Modules in lunar orbit. Photo: NASA

Astronaut Eugene A. Cernan, Apollo 17 commander, is photographed next to the deployed United States flag during lunar surface extravehicular activity (EVA) at the Taurus-Littrow landing site. The highest part of the flag appears to point toward our planet Earth in the distant background. This picture was taken by scientist-astronaut Harrison H. Schmitt, lunar module pilot. While astronauts Cernan and Schmitt descended in the Lunar Module (LM) to explore the moon, astronaut Ronald E. Evans, command module pilot, remained with the Command and Service Modules (CSM) in lunar orbit. Photo: NASA
the shadow of an astronaut is seen in front of lunar vehicles

 Wide-angle view of the Apollo 17 Taurus-Littrow lunar landing site. To the left in the background is the Lunar Module. To the right in the background is the Lunar Roving vehicle. An Apollo 17 crewmember is photographed between the two points. The shadow of the astronaut taking the photograph can be seen in the right foreground. Photo: NASA
an american flag on the surface of the grey, dusty moon

This view, looking out the Lunar Module (LM) windows, shows the United States flag on the moon’s surface. The view looks toward the North Massif, and the LM thrusters can be seen in the foreground. While astronauts Eugene A. Cernan, commander, and Harrison H. Schmitt, lunar module pilot, descended in the LM “Challenger” to explore the Taurus-Littrow region of the moon, astronaut Ronald E. Evans, command module pilot, remained with the Command and Service Modules (CSM) “America” in lunar orbit. Photo: NASA
orange dust seen amongst grey dust

 A close-up view of the much-publicized orange soil which the Apollo 17 crewmen found at Station 4 (Shorty Crater) during the second Apollo 17 extravehicular activity (EVA) at the Taurus-Littrow landing site. The orange soil was first spotted by scientist-astronaut Harrison H. Schmitt. While astronauts Schmitt and Eugene A. Cernan descended in the Lunar Module (LM) “Challenger” to explore the lunar surface, astronaut Ronald E. Evans remained with the Apollo 17 Command and Service Modules (CSM) in lunar orbit. The orange soil was never seen by the crewmen of the other lunar landing missions – Apollo 11 (Sea of Tranquility); Apollo 12 (Ocean of Storms); Apollo 14 (Fra Mauro); Apollo 15 (Hadley-Apennines); and Apollo 16 (Descartes). Photo: NASA
an astronaut mid-trip with a leg in the air

Scientist-astronaut Harrison H. Schmitt loses his balance and heads for a fall during the second Apollo 17 extravehicular activity (EVA) at the Taurus-Littrow landing site, as seen in this black and white reproduction taken from a color television transmission made by the color RCA TV camera mounted on the Lunar Roving Vehicle. Schmitt is lunar module pilot of the Apollo 17 lunar landing mission. Astronaut Ronald E. Evans, command module pilot, remained with Apollo 17 Command and Service Modules in lunar orbit while astronauts Schmitt and Eugene A. Cernan, commander, descended in the Lunar Module “Challenger” to explore the moon. Photo: NASA
a shiny silver module floats above the surface of a cratered moon

An excellent view of the Apollo 17 Command and Service Modules (CSM) photographed from the Lunar Module (LM) “Challenger” during rendezvous and docking maneuvers in lunar orbit. The LM ascent stage, with astronauts Eugene A. Cernan and Harrison H. Schmitt aboard, had just returned from the Taurus-Littrow landing site on the lunar surface. Astronaut Ronald E. Evans remained with the CSM in lunar orbit. Note the exposed Scientific Instrument Module (SIM) Bay in Sector 1 of the Service Module (SM). Three experiments are carried in the SIM bay: S-209 lunar sounder, S-171 infrared scanning spectrometer, and the S-169 far-ultraviolet spectrometer. Also mounted in the SIM bay are the panoramic camera, mapping camera and laser altimeter used in service module photographic tasks. A portion of the LM is on the right. Photo: NASA

The post Apollo 17: Looking back at the last time the US landed on the moon appeared first on Popular Science.

What we should think about before terraforming alien worlds https://www.popsci.com/science/why-terraform-alien-worlds/ Sat, 24 Feb 2024 17:00:00 +0000 https://www.popsci.com/?p=603335
Mars surface NASA Odyssey
Carl Sagan famously argued, “If there is life on Mars, I believe we should do nothing with Mars.” NASA/JPL-Caltech/ASU

Whenever someone waxes poetic about planetary engineering, it’s worth taking a moment to consider the ethical implications.


This story originally featured on the MIT Press Reader.

Exploration, habitation, and resource extraction all carry a risk of inflicting environmental damage in space, just as they do here on Earth. But some futurists and space settlement enthusiasts have proposed an even more drastic alteration of the space environment: the transformation of the surface of a planet or moon into a more Earth-like environment via a process known as terraforming.

The atmospheric chemistry, pressure, and temperature inside an artificial space habitat is, by design, Earth-like enough to be habitable by humans, but it requires enclosure by pressurized walls and constant maintenance. Terraforming would affect the entire surface of a planet, rather than just a smaller “indoor” region, and by planetary scientist Christopher McKay’s definition, the environment of a terraformed planet “must be stable over long time scales and must require no, or a minimum of, continued technological intervention.” After an initial input of energy and effort, a terraformed environment would behave like Earth’s natural environment and essentially maintain itself.

For example, in 1961, Carl Sagan speculated on the possibility of the “microbiological re-engineering” of Venus by introducing blue-green algae into its atmosphere. The algae would use photosynthesis to convert the planet’s abundant carbon dioxide into oxygen, which would also reduce the greenhouse effect and lower Venus’s surface temperature. Sagan later turned his attention to the potential for “re-engineering” Mars, a planet now considered to be one of our best candidates for successful terraformation. Mars has the opposite problem as Venus: Instead of harboring a thick, toxic atmosphere with a runaway greenhouse effect maintaining deathly high temperatures and pressures at the surface, Mars lost nearly its entire original atmosphere to solar wind, leaving surface pressures so low that liquid water cannot exist. To terraform Mars, planetary engineers would need to increase its surface temperature and atmospheric pressure while protecting the atmosphere from solar wind. Sagan suggested spreading a dark material, or even growing dark-colored plants, on Mars’s polar ice caps, allowing them to absorb more of the Sun’s heat, increasing the surface temperature while releasing water vapor and carbon dioxide into the atmosphere. Other researchers have explored the feasibility of importing greenhouse gases or building giant orbital mirrors to increase Mars’s surface temperature, constructing a magnetic shield to protect Mars’s atmosphere, and releasing genetically engineered microbes onto the planet’s surface to alter the atmospheric and surface chemistry.

Terraformation is the ultimate example of long-term planning, as even optimistic estimates predict that it would take centuries of effort and patience before a human could walk unprotected on the surface of Mars. Advocates of terraforming Mars or other space environments see it as a crucial step toward creating a truly multi-planet civilization. Robert Zubrin, the founder and president of the Mars Society, an organization that advocates for human Mars exploration and colonization, even claims that the successful terraforming of Mars would demonstrate humanity’s superiority over the physical world: “The first astronauts to reach Mars will prove that the worlds of the heavens are accessible to human life. But if we can terraform Mars, it will show that the worlds of the heavens themselves are subject to the human intelligent will,” he writes in his 1996 book “The Case for Mars.”

By transforming Mars’s surface into Earth’s, we might exterminate species or entire ecosystems without even detecting their existence.

Whenever someone waxes poetic about humankind bending the universe to our will, it’s worth taking a moment to consider the ethical implications of the proposal. One major consideration about terraforming is that the process could damage or even wipe out any existing life on the planet being terraformed. If an alien microbe evolved on Mars, it probably would not survive in a more Earth-like environment, so by transforming Mars’s surface into Earth’s, we might exterminate species or entire ecosystems without even detecting their existence. The changes we would make to a cold, dry, relatively airless world like Mars would also introduce physical processes—such as wind, flowing water, and new chemical reactions—that could easily erase or contaminate any evidence that extraterrestrial life ever existed on the surface. If we allow planetary engineering to race ahead of astrobiological research, we could miss our opportunity to make what would be the most important scientific discovery in human history: the discovery of life that evolved beyond our planet. We also risk exterminating the very lifeforms we dream of discovering.

The ethical dilemma of terraforming far exceeds planetary protection concerns about forward contamination by a lander or even a human settlement. The goal of terraforming is to intentionally create an entire ecosystem on a global scale, which would more than likely destroy any existing ecosystem. Terraforming technology might even become feasible before we definitively determine whether extraterrestrial life exists on the planet or moon that we hope to transform. But suppose we do discover evidence of existing microbial life on a planet like Mars. Should this disqualify Mars as a target for terraforming? Should we avoid settling on Mars at all?

Carl Sagan, in his book “Cosmos,” famously argued for exactly this stance: “If there is life on Mars, I believe we should do nothing with Mars. Mars then belongs to the Martians, even if the Martians are only microbes. The existence of an independent biology on a nearby planet is a treasure beyond assessing, and the preservation of that life must, I think, supersede any other possible use of Mars.” Planetary scientist Christopher McKay even argues that if microbial life is discovered on Mars, humans should not simply leave Mars to the microbes; rather, we should “undertake the technological activity that will enhance the survival of any indigenous Martian biota and promote global changes on Mars that will allow for maximizing the richness and diversity of these Martian life forms.” In other words, we should engineer the surface of Mars not to improve its habitability for terrestrial life, but for Martian life!

Space ethicist Kelly Smith finds these types of arguments, that humans should avoid worlds where microbial life might already exist, difficult to defend. “You have to first grant that microbes, as a class of organisms, are somehow on the same level with human beings,” he told me in 2018. “I’m not saying you can’t make an argument to that effect, but it really stretches credulity. It’s an uphill battle.” After all, humans have already demonstrated that we are willing to intentionally eradicate disease-causing viruses like smallpox to prevent human death and suffering. Admittedly, viruses are not unequivocally considered to be “alive,” and there were some ethical concerns during the development of the vaccine that smallpox eradication represented a “new form of genocide.” But given the opportunity, humans would likely jump at the chance to exterminate deadly microbial species like the bacterium that causes cholera or the parasite that causes malaria. Unlike these terrestrial microbes, however, hypothetical Martian microbes currently pose no danger to humanity, or even to individual humans. They may merely someday stand in the way of our off-Earth expansion. Space settlement advocates argue that such an expansion is vital for humanity’s long-term survival, but does this potential for indirect harm justify their extinction?

It may seem premature to debate the ethics of using a technology that does not yet exist to indirectly destroy an ecosystem that may not exist at all. But our potential for inadvertently exterminating a unique species or ecosystem in space might arise long before we develop the technology to terraform entire planets. By the time we come to an agreement about the ethics of terraforming and planetary protection, it might be too late.

Erika Nesvold, an astrophysicist, has worked as a researcher at NASA Goddard and the Carnegie Institution for Science. She is a developer for Universe Sandbox, a physics-based space simulator; cofounder of the nonprofit organization the JustSpace Alliance; the creator and host of the podcast Making New Worlds; and author of “Off-Earth,” from which this article is excerpted.

The post What we should think about before terraforming alien worlds appeared first on Popular Science.

NASA wants you to record crickets during April’s solar eclipse https://www.popsci.com/science/nasa-eclipse-study-soundscapes/ Fri, 23 Feb 2024 18:00:00 +0000 https://www.popsci.com/?p=604045
Colorful cricket on green leaf
The behaviors of animals such as birds and crickets can be affected when they see a solar eclipse. Credit: Moment Open / Getty

Here's how to capture nature for the Eclipse Soundscapes Project.


American scientist William Wheeler not only looked to the sky during a total solar eclipse; he also made sure to pay attention to everything around him. On August 31, 1932, Wheeler and fellow collaborators located throughout the northeastern regions of the US and Canada took part in one of the earliest eclipse-related participatory studies to document the celestial event’s effects on wildlife. Volunteers made nearly 500 records of animal and insect reactions that day—nearly a century later, NASA hopes to honor those contributions, as well as exponentially expand on them.

The agency is calling for citizen scientist volunteers along the path of the April 8 total solar eclipse to help in its ongoing Eclipse Soundscapes Project. Through a combination of visual, audio, and written recordings, NASA aims to further researchers’ understanding of the event’s influence on various ecosystems across the country.


As the moon passes in front of the sun, ambient light dims, temperatures fall, and even some stars begin to appear. These sudden environmental shifts have been known to fool animals into behaving as they would at dusk or dawn. According to NASA, the agency is specifically interested in better understanding the behavior of crickets, as well as observing the differences between how nocturnal and diurnal animals may respond.

“The more audio data and observations we have, the better we can answer these questions,” Kelsey Perrett, Communications Coordinator with the Eclipse Soundscapes Project, said in an announcement earlier this month. “Contributions from participatory scientists will allow us to drill down into specific ecosystems and determine how the eclipse may have impacted each of them.”

[Related: Delta’s solar eclipse flight sold out, but your best bet to see it is still down here.]

There are multiple ways any of the roughly 30 million people within the eclipse’s path can participate on April 8. People on or close to the path of totality can act as designated “Data Collectors” by purchasing a relatively low-cost audio recorder called an AudioMoth alongside a micro-SD card to capture surrounding sounds. Meanwhile, “Observers” can write down what they see and hear, then submit them through the project’s website, while “Apprentices” and “Data Analysts” can take quick, free online courses to help assess the incoming data. There are also plenty of options for anyone with sensory accessibility issues, and NASA made sure to include resources for facilitating large groups of volunteers through local schools, libraries, parks, and community centers.
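
For volunteers who want a rough sense of what their own AudioMoth files contain before submitting them, a simple loudness-over-time pass is one way to spot a dip or surge in insect noise around totality. The sketch below is not part of the official Eclipse Soundscapes workflow; the file name, the one-minute window, and the use of numpy and scipy are assumptions for illustration only.

# Minimal sketch: chart the loudness of an AudioMoth recording minute by minute,
# so a change in cricket or bird activity around totality stands out.
# Assumes a mono WAV file named "eclipse_recording.wav" (a hypothetical name).
import numpy as np
from scipy.io import wavfile

rate, samples = wavfile.read("eclipse_recording.wav")  # rate is samples per second
samples = samples.astype(np.float64)

window = rate * 60  # one-minute windows
for i in range(len(samples) // window):
    chunk = samples[i * window:(i + 1) * window]
    rms = np.sqrt(np.mean(chunk ** 2))  # root-mean-square amplitude for that minute
    print(f"minute {i:4d}: loudness (RMS) = {rms:12.1f}")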

The post NASA wants you to record crickets during April’s solar eclipse appeared first on Popular Science.

JWST detects evidence of a neutron star in fiery supernova remains https://www.popsci.com/science/jwst-neutron-star-supernova/ Fri, 23 Feb 2024 14:13:51 +0000 https://www.popsci.com/?p=603963
A combination of a Hubble Space Telescope image of SN 1987A and the compact argon source. The faint blue source in the center is the emission from the compact source detected with the James Webb Space Telescope. Outside this is the stellar debris, which contains most of the mass. The inner bright “string of pearls” is the gas from the outer layers of the star that was expelled about 20,000 years before the final explosion. The fast debris is now colliding with the ring, explaining the bright spots. Outside of the inner ring are two outer rings, presumably produced by the same process as forming the inner ring. The bright stars to the left and right of the inner ring are unrelated to the supernova. Hubble Space Telescope WFPC-3/James Webb Space Telescope NIRSpec/J. Larsson

After almost 40 years, astronomers are finally getting a closer look at SN 1987A's dramatic death.


Astronomers using the James Webb Space Telescope (JWST) may be the winners of a mysterious 37-year-long game of hide-and-seek, and they solved a stellar death mystery in the process. They detected the best evidence yet for a neutron star lying in the remnants of one of the most famous supernovae in space.

This massive star explosion created so much debris that it took several years and one of the most powerful space telescopes ever created to peer through the wreckage of its stellar death. The findings, detailed in a study published February 22 in the journal Science, advance the study of these dramatic celestial deaths.

“The mystery over whether a neutron star is hiding in the dust has lasted for more than 30 years and it is exciting that we have solved it,” study co-author and University College London astrophysicist Mike Barlow said in a statement.

[Related: An amateur astronomer spotted a new supernova remarkably close to Earth.]

What is a supernova?

A supernova is the explosive final death of some of the most massive stars in the known universe. They occur in stars that are eight to 10 times the mass of our sun, so it can take years for all of that gas and energy to collapse in on itself. The final death throes can end within a few hours, but the brightness of the explosion generally peaks within a few months. Importantly, supernovae offer a way for scientists to study a key astronomical process in real time. Explosions like these fill space with the iron, silicon, carbon, and oxygen that build future stars and planets. They can even forge the molecules that make life possible.

In the study, the team looked at Supernova (SN) 1987A. This well-known supernova occurred 160,000 light-years from Earth in a region called the Large Magellanic Cloud. Its light was first observed on Earth in February 1987, with its brightness peaking that May. It was the first supernova that could be seen with the naked eye since Kepler’s Supernova in 1604.

“Supernovae are the main sources of chemical elements that make life possible–so we want to get our models of them right,” said Barlow. “There is no other object like the neutron star in Supernova 1987A, so close to us and having formed so recently. Because the material surrounding it is expanding, we will see more of it as time goes on.”

An image taken with JWST’s Near-Infrared Camera image released in 2023 (left). Light from singly ionized argon (Argon II) captured by the Medium Resolution Spectrograph mode of the Mid-Infrared Instrument (top right). Light from multiply ionized argon captured by the Near-Infrared Spectrograph (bottom right). Both instruments show a strong signal from the center of the supernova remnant. This indicated to the science team that there is a source of high-energy radiation there, most likely a neutron star. CREDIT: NASA, ESA, CSA, STScI, Claes Fransson (Stockholm University), Mikako Matsuura (Cardiff University), M. Barlow (UCL), Patrick Kavanagh (Maynooth University), Josefin Larsson (KTH).

SN 1987A is also considered a core-collapse supernova, meaning its compacted remains could form either a neutron star or a black hole. Some incredibly small subatomic particles produced by the supernova, called neutrinos, indicated that a neutron star may have formed. However, in the almost 40 years since SN 1987A was detected, it has not been clear whether this neutron star persisted or collapsed into a black hole, because the remnant has been hidden by dust from the explosion.

How JWST confirmed a neutron star

The observations for this work were taken on July 16, 2022, just after the space telescope became operational. The team in the study used JWST instruments–MIRI and NIRSpec–that can observe the supernova at infrared wavelengths to peer beyond the dust. They found evidence of heavy argon and sulfur atoms whose outer electrons had been stripped off near where the explosion occurred. This process is called ionization

[Related: See the stunning Supernova 1987A in a whole new light.]

They modeled multiple scenarios and found that the atoms may have been ionized by ultraviolet and X-ray radiation from a hot cooling neutron star. It also could have been due to the winds of relativistic particles that were accelerated by a quickly rotating neutron star and interacting with material from the supernova. 

“Our detection with James Webb’s MIRI and NIRSpec spectrometers of strong ionized argon and sulfur emission lines from the very center of the nebula that surrounds Supernova 1987A is direct evidence of the presence of a central source of ionizing radiation,” said Barlow. “Our data can only be fitted with a neutron star as the power source of that ionizing radiation.”

The findings are consistent with several theories about how neutron stars form. Models suggest that sulfur and argon are produced in large amounts inside a dying star just before it goes supernova. Scientists studying SN 1987A and other supernovae predicted that ultraviolet and X-ray radiation in a supernova remnant would indicate that a newborn neutron star was present. Now, that very radiation is what helped astronomers find it.

“This supernova keeps offering us surprises,” study co-author and Sweden Royal Institute of Technology astrophysicist Josefin Larsson said in a statement. “Nobody had predicted that the compact object would be detected through a super strong emission line from argon, so it’s kind of amusing that that’s how we found it in the JWST.”

The post JWST detects evidence of a neutron star in fiery supernova remains appeared first on Popular Science.

‘Odie’ makes space history with successful moon landing https://www.popsci.com/science/successful-moon-landing-2024-odie/ Thu, 22 Feb 2024 23:58:11 +0000 https://www.popsci.com/?p=603888
Odysseus passes over the near side of the moon. The photo was taken while the spacecraft was orbiting the moon. Intuitive Machines

Odysseus becomes the first privately-built spacecraft to survive a lunar landing.


After a few tense minutes troubleshooting some communications issues, Odysseus has officially become the first privately constructed spacecraft to land on the moon. Mission Director Tim Crain announced that “Odysseus has a new home.” The uncrewed lunar lander likely touched down near an impact crater called Malapert A by the moon’s south pole at 6:24 p.m. EST on February 22, 2024. Built by Houston-based Intuitive Machines, “Odie” is the first American spacecraft to land on the moon since Apollo 17 in 1972.

“I know this was a nail-biter, but we are on the surface, and we are transmitting,” Intuitive Machines CEO Steve Altemus announced on the webcast. “Welcome to the moon.” While the company has confirmed that it has made contact with the lander, the state of the spacecraft is not yet clear.

It landed in a region that is about 3.5 billion years old. This landing site is near craters and cliffs, on the side of the moon that is visible from Earth, and could be a prime future landing spot for astronauts. Scientists believe that the permanently shadowed craters hold frozen water, which could be used for drinking water during the crewed Artemis missions scheduled later this decade.

[Related: ‘Odie’ snaps its first images of Earth on its way to the moon.]

During the livestream, NASA administrator Bill Nelson announced that today begins “a new adventure in science, innovation, and American leadership in space. Today is a day that shows the power and promise of NASA’s commercial partnerships. Congratulations to everyone involved in this great endearing quest at Intuitive Machines, Space X, and right here at NASA.”

On Wednesday February 21, Intuitive Machines announced that Odysseus had fired its engine for six minutes and 48 seconds. This slowed it down enough to be pulled into the moon’s orbit about 57 miles above the lunar surface. 

Odysseus successfully launched atop a SpaceX Falcon 9 rocket on February 15 at 1:05 a.m. EST. The uncrewed lander completed a 230,000-mile journey towards the moon, sending back some images of the Earth along the way. Only government-funded programs from Russia, China, India, the United States, and most recently Japan have performed a successful lunar landing. 

The spacecraft is a hexagonal cylinder with six landing legs that stands roughly 14 feet tall and five feet wide. Intuitive Machines calls the spacecraft design Nova-C and notes that it’s about the size of a red London telephone booth. When completely loaded with fuel, it weighs about 4,200 pounds.

An artist’s rendition of Odie on the moon. CREDIT: Intuitive Machines

NASA is the main sponsor of the mission, paying Intuitive Machines about $118 million to deliver its payload to the moon. NASA hopes that this mission will jumpstart the lunar economy ahead of future crewed Artemis missions. The six NASA navigation and tech experiments in the lander’s payload will collect data critical for the planned missions. Odysseus is also carrying a camera built by students at Embry-Riddle Aeronautical University, a prototype for a future moon telescope, and an art project by Jeff Koons. Koons told The New York Times that the project was inspired by his son, Sean Koons. It includes 125 stainless steel sculptures of the moon that are named after inspiring historical figures, including Ada Lovelace, Plato, and Leonardo da Vinci. 

[Related: This private lander could be the first US machine on the moon this century.]

Odysseus’s success comes one month after Pittsburgh-based Astrobotic’s Peregrine lunar lander failed to complete its mission. The spacecraft burned in the Earth’s atmosphere about 10 days after a broken fuel tank and massive leak caused the mission to fail. Other attempts to get a privately-built lunar lander on the moon include Israel’s Beresheet lander in 2019 and Japan’s Hakuto-R Mission 1 lander in 2023

This is a developing story, please check back for more details.

UPDATE, February 22, 2024, 8:53 p.m. EST: Two hours after successfully landing on the moon, Intuitive Machines confirmed on X that “Odysseus is upright and starting to send data.”

UPDATE, February 23, 2024 8:19 a.m. EST: This story has been updated with images taken by Odysseus while in orbit and an artist’s rendition of what the lander could look like on the lunar surface.

UPDATE, February 26, 2024 8:23 a.m. EST: According to Intuitive Machines, the moon lander tripped and fell during its touchdown and is lying on its side. However, it is still functioning. During landing maneuvers, one of its legs caught on the lunar surface, causing it to tip over onto a rock.

The post ‘Odie’ makes space history with successful moon landing appeared first on Popular Science.

Delta’s solar eclipse flight sold out, but your best bet to see it is still down here https://www.popsci.com/science/delta-solar-eclipse-flight/ Thu, 22 Feb 2024 20:07:58 +0000 https://www.popsci.com/?p=603866
A total solar eclipse is seen on Monday, August 21, 2017 from onboard a NASA Armstrong Flight Research Center’s Gulfstream III 25,000 feet above the Oregon coast. A total solar eclipse swept across a narrow portion of the contiguous United States from Lincoln Beach, Oregon to Charleston, South Carolina. NASA/Carla Thomas

Don’t worry, there are plenty of places to still catch the April 8 event on the ground.


Earlier this week, Delta Air Lines announced an extra flight for its April 8 schedule, timed specifically to provide passengers an aerial view of the total solar eclipse. But if you were still hoping to snag a ticket for the afternoon jaunt alongside the path of totality, you’re already out of luck—seats aboard the Airbus A220-300 sold out within 24 hours.

According to Delta’s original announcement, DL Flight 1218 with service from Austin to Detroit will depart at 12:15 PM CST for its roughly 1,380-mile, 3-hour-long trip. Once at a cruising altitude of 30,000-feet, passengers will be able to view the celestial event through the plane’s “extra-large windows,” which the official Airbus specs manual says measure in at 11×16 inches. For comparison, a Boeing 777 includes 10×15 inch glimpses of the outside world. Everyone on the plane will receive special glasses to safely watch the eclipse (which is nice to hear, given how few free amenities remain on most commercial flights).

[Related: We can predict solar eclipses to the second. Here’s how.]

While totality will last only a few minutes for anyone on the ground, Flight 1218’s timing and route should grant passengers a longer spectacle.
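
The advantage comes from simple chase arithmetic: the moon’s shadow races across the ground far faster than an airliner flies, but a plane headed in roughly the same direction shaves off part of that closing speed, stretching totality. The numbers below are illustrative assumptions (typical shadow and cruise speeds), not figures from Delta’s flight plan.

# Back-of-the-envelope: how much a plane flying along the eclipse path stretches totality.
# All inputs are rough, assumed values for illustration.
shadow_speed_mph = 1600.0   # approximate ground speed of the moon's shadow over the US
plane_speed_mph = 500.0     # typical airliner cruise speed
ground_totality_min = 4.0   # roughly what an observer near the centerline sees

# The shadow overtakes the plane more slowly than it overtakes the ground,
# so totality is stretched by the ratio of the two closing speeds.
stretch = shadow_speed_mph / (shadow_speed_mph - plane_speed_mph)
print(f"~{stretch:.2f}x longer, or about {ground_totality_min * stretch:.1f} minutes of totality")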

As cool as a first-class seat to the eclipse would be, there are plenty of (likely cheaper) locations across the US to consider visiting on April 8. After crossing Mexico, the path of totality will enter the US in Texas, then pass across large portions of Oklahoma, Arkansas, Missouri, Illinois, Kentucky, Indiana, Ohio, Pennsylvania, New York, Vermont, New Hampshire, and Maine.

If you’re truly determined to head to the skies, NPR notes that there are other flight options scheduled to pass by at least some part of the eclipse, including from Delta, as well as several from Southwest.

But keep in mind: A plane’s altitude doesn’t necessarily guarantee a picture-perfect view of the eclipse—if anything, there’s a chance that cloud coverage could impede an onlooker’s vantage. There’s also the possibility of weather or air traffic control delays, which… well, this country has a history of such headaches.

So despite the multiple jetset options, your best bet to see April’s eclipse is simply making sure you’re within its route, firmly on the ground, and equipped with proper eyewear. Seriously, take it from NASA: “Viewing any part of the bright Sun through a camera lens, binoculars, or a telescope without a special-purpose solar filter secured over the front of the optics will instantly cause severe eye injury.”

The post Delta’s solar eclipse flight sold out, but your best bet to see it is still down here appeared first on Popular Science.

‘Odie’ snaps its first images of Earth on its way to the moon https://www.popsci.com/science/odie-snaps-its-first-images-of-earth-on-its-way-to-the-moon/ Tue, 20 Feb 2024 20:00:00 +0000 https://www.popsci.com/?p=603451
The planet Earth as seen from space, aboard the Odysseus spacecraft.
Intuitive Machines' Odysseus moon lander beamed this image one day after its successful launch. Intuitive Machines

The privately-built lander took some selfies during its 230,000 mile journey to the lunar surface.


UPDATE, February 23, 2024 9:09 a.m. EST: Odie successfully landed on the moon and has started sending images back to Earth.

After a successful launch, Intuitive Machine’s robotic Odysseus spacecraft (aka Odie) beamed home its first images from space. In a post on X, Intuitive Machines wrote that the spacecraft “successfully transmitted its first IM-1 mission images to Earth on February 16, 2024.” The images were captured one day after the spacecraft blasted off from NASA’s Kennedy Space Center in Florida.

[Related: ‘Odie’ is en route for its potentially historic moon landing.]

According to the Houston-based company, the four images were selected from hundreds taken by the lander’s cameras. These cameras were programmed to take five images every five minutes for the first two hours after Odysseus separated from the rocket’s second stage.

“Out of all the images collected, Intuitive Machines chose to show humanity’s place in the universe with four wonderful images we hope to inspire the next generation of risk-takers,” the company wrote in a statement.

They capture the planet Earth fading into the background as the spacecraft continues on its 230,000-mile journey towards the moon.

Intuitive Machines also announced that Odysseus “continues to be in excellent health” and is communicating with mission control. Odysseus launched one month after Pittsburgh-based Astrobotic’s Peregrine lunar lander failed to complete its mission. The spacecraft burned in the Earth’s atmosphere about 10 days after a broken fuel tank and massive leak caused the mission to fail.

In addition to these first images, Odysseus’ engine also passed a crucial check in deep space over the weekend.

“Intuitive Machines flight controllers successfully fired the first liquid methane and liquid oxygen engine in space, completing the IM-1 mission engine commissioning. This engine firing included a full thrust mainstage engine burn and throttle down-profile necessary to land on the moon,” the company wrote in a post on X on February 16.

[Related: This private lander could be the first US machine on the moon this century.]

If the mission continues to go as planned, Odysseus will land on the moon on February 22, where it would be the first private spacecraft to conduct a successful lunar landing. Only government-funded programs from Russia, China, India, the United States, and most recently Japan have performed a lunar landing. 

The lander is aiming to touch down 186 miles away from the moon’s south pole. This region has cliffs, craters, and possibly frozen water. NASA is the main sponsor of the mission, paying Intuitive Machines about $118 million to deliver its payload to the moon. NASA hopes that if this mission is successful it will jumpstart the lunar economy ahead of future crewed missions. The space agency plans to land astronauts there later this decade. The six navigation and tech experiments in the lander’s payload will collect data critical for these missions.

The post ‘Odie’ snaps its first images of Earth on its way to the moon appeared first on Popular Science.

Dead satellite hurtles towards Earth in new grainy images https://www.popsci.com/science/ers-2-deorbit-photos/ Tue, 20 Feb 2024 16:00:00 +0000 https://www.popsci.com/?p=603399
Satellite image of ERS-2 deorbiting in Earth's atmosphere
ERS-2 launched in 1995, and surveyed Earth's topography and natural events for the ESA. ESA / HEO Space

After 29 years in orbit, ERS-2 is en route for a fiery demise tomorrow.


A 5,000-pound dead satellite resembling a spaceship from Star Wars is hurtling towards Earth, but don’t worry—experts say situations like this happen “every week or two.”

Launched in 1995 by the European Space Agency from Kourou, French Guiana, the European Remote Sensing 2 (ERS-2) array spent over a decade-and-a-half observing the planet’s topography and weather events, including natural disasters in remote, hard-to-document regions. Alongside its sibling, ERS-1, the pair were considered the “most sophisticated Earth observation spacecraft” ever developed at the time of their deployment.

In July 2011, however, the ESA decided to retire its “nominally” functioning ERS-2 and begin a scheduled deorbiting process. The satellite underwent 66 maneuvers over the ensuing month, using up its remaining fuel to descend from an altitude of roughly 487 miles to 356 miles above the Earth’s surface. Since then, ERS-2’s orbit has slowly decayed to its current point—caught in the planet’s gravitational pull, and picking up speed as it falls into the atmosphere.

On Sunday, the ESA posted grainy, black-and-white images to X taken last month by the Australian commercial imaging company, HEO, which show ERS-2 (then about 150-miles high) spiraling downwards during its final journey. From the camera’s vantage, the satellite certainly looks a lot like an incoming TIE Fighter from Star Wars

But no need to evade Imperial scrutiny—or even fiery orbital debris, for that matter. ERS-2 is currently falling at a rate of over 6.2 miles per day, a speed expected to accelerate as atmospheric drag takes an even greater hold. As of February 20, ERS-2 has around 120 miles left to go, and it will start breaking up and bursting into flames once it is about 50 miles high. Most, if not all, of the subsequent debris will then burn up into harmless dust and ash, posing an extremely low damage risk for anything or anyone below it.

[Related: Some space junk just got smacked by more space junk, complicating cleanup.]

The ESA estimates ERS-2 will burn away around 3:53PM EST on Wednesday, although trackers offer as much as a 7-hour window on either side to account for “unpredictable solar activity” that could influence its descent speed. As to where in the world the satellite will fall apart—well, that part is a little more difficult to predict at the moment, although more accurate geolocation estimates are expected over the next day.

Deorbiting satellites is vital to ensuring enough room is kept for the thousands upon thousands of other human-made objects orbiting Earth. Increasingly crowded skies are a major concern for space agencies, private companies, and watchdog groups—an issue that isn’t likely to diminish anytime soon. Back in October, for example, a space junk cleanup mission proved more complicated when another piece of debris smacked into the satellite targeted for decommissioning. In the meantime, regulators like the FCC are fining companies for failing to do their part in accounting for their dead satellites.

After all, while a single satellite burning up during deorbit isn’t cause for concern—a “Kessler cascade” most certainly is. 

The post Dead satellite hurtles towards Earth in new grainy images appeared first on Popular Science.

Fastest-growing black hole eats the equivalent of one sun a day https://www.popsci.com/science/fastest-growing-brightest-black-hole/ Tue, 20 Feb 2024 15:09:53 +0000 https://www.popsci.com/?p=603382
An artist’s impression of quasar J0529-4351, the bright core of a distant galaxy that is powered by a supermassive black hole. This quasar has been found to be the most luminous object known in the universe to date. ESO/M. Kornmesser

It powers a quasar as bright as 500 trillion suns.


It’s the fastest-growing black hole ever recorded. The black hole powering quasar J0529-4351 devours the mass equivalent of one sun every single day, and it is roughly 17 billion times more massive than our sun. This ravenous star-gobbling hole is described in a study published February 19 in the journal Nature Astronomy, and its size could help piece together the universe’s history.

[Related: Blindingly bright black holes could help cosmologists see deeper into the universe’s past.]

“The incredible rate of growth also means a huge release of light and heat,” study co-author and astronomer at The Australian National University (ANU) Christian Wolf said in a statement. “So, this is also the most luminous known object in the universe. It’s 500 trillion times brighter than our sun.” 

What are quasars?

Quasars are galaxies with an active and energetic core powered by a black hole. They typically offer astronomers a different view of black holes, showing energetic jets beaming out from two sides. A quasar’s dark center gobbles up the matter that is nearby and then smushes that material into an incredibly hot disc. This matter is then shot out over huge distances. However, it takes billions of years for their light to reach Earth, which means astronomers can view these black holes as they existed billions of years ago.

This image shows the region of the sky in which the record-breaking quasar J0529-4351 is situated. This picture was created from images forming part of the Digitized Sky Survey 2, while the inset shows the location of the quasar in an image from the Dark Energy Survey. CREDIT: ESO/Digitized Sky Survey 2/Dark Energy Survey.

Quasars are still quite mysterious, but some more recent studies have found that quasars may shine consistently enough for astronomers to use them to fill in gaps in cosmic history. J0529-4351’s unprecedented brightness and size could help further this study of the universe’s early days. 

“It’s a surprise it remained undetected until now, given what we know about many other, less impressive black holes. It was hiding in plain sight,” study co-author and ANU astronomer Christopher Onken said in a statement.

A big black hole meets the Very Large Telescope

J0529-4351 has a mass roughly 17 billion times that of our solar system’s sun. It was detected with ANU’s Siding Spring Observatory telescope, but confirming such a massive black hole required the help of an even bigger instrument. The team turned to the European Southern Observatory’s Very Large Telescope in Chile. With four telescopes 27 feet in diameter, it is one of the largest telescopes on Earth. They used it to confirm the full nature of the black hole and measure its mass.

“The light from this black hole has traveled over 12 billion years to reach us,” study co-author and University of Melbourne astrophysicist Rachel Webster said in a statement. “In the adolescent universe, matter was moving chaotically and feeding hungry black holes. Today, stars are moving orderly at safe distances and only rarely plunge into black holes.” 

[Related: What we can learn from baby black holes.]

The intense radiation is coming from the accretion disc around the black hole and creates a holding pattern for all the cosmic material waiting to be consumed. According to the team, it looks like a large storm cell with temperatures over 18,000 degrees Fahrenheit. The region has cosmic winds that are blowing so fast that they would go around the Earth in one second. 

Approaching a limit?

The accretion disc is about seven light years in diameter—or roughly 42 trillion Earth miles. According to the team, this makes it the largest known accretion disc in the universe. 
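
The mileage figure is just a unit conversion. Taking a light year as roughly 5.88 trillion miles, a quick check lands on the same order as the number above:

# Quick unit check on the accretion disc's quoted size.
miles_per_light_year = 5.88e12   # about 5.88 trillion miles per light year
disc_diameter_ly = 7.0
disc_diameter_miles = disc_diameter_ly * miles_per_light_year
print(f"~{disc_diameter_miles / 1e12:.0f} trillion miles across")  # prints ~41 trillion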

The team believes that the black hole could be approaching the Eddington limit, a proposed ceiling on how much material an accretion disc can feed its black hole and how brightly it can shine before radiation pressure pushes infalling matter away. More research and observations are needed to get a better idea of its growth rate.
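
As a rough sanity check, the standard Eddington luminosity, about 1.26 × 10^38 erg/s for every solar mass of black hole, puts a ceiling on how brightly an object of this size can steadily shine. Plugging in 17 billion solar masses lands within shouting distance of the "500 trillion suns" brightness quoted earlier. The constants below are standard textbook values; this is an illustrative estimate, not the study's own calculation.

# Sanity check: Eddington luminosity for a 17-billion-solar-mass black hole,
# expressed in multiples of the sun's luminosity.
L_EDD_PER_SOLAR_MASS = 1.26e38   # erg/s per solar mass (standard textbook value)
SOLAR_LUMINOSITY = 3.8e33        # erg/s

mass_in_suns = 17e9
eddington_luminosity = L_EDD_PER_SOLAR_MASS * mass_in_suns
print(f"~{eddington_luminosity / SOLAR_LUMINOSITY:.1e} solar luminosities")
# roughly 5.6e14, i.e. a few hundred trillion suns, the same order as the quoted brightness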

When the quasar was first detected in 1980, astronomers believed it was a star. It was reclassified in 2023 after more detailed observations were taken in Chile and Australia. 

“The exciting thing about this quasar is that it was hiding in plain sight and was misclassified as a star previously,” Yale University astrophysicist Priyamvada Natarajan told Sky News. Natarajan was not involved in the study.

The post Fastest-growing black hole eats the equivalent of one sun a day appeared first on Popular Science.

How to apply for NASA’s next Mars habitat simulation https://www.popsci.com/science/nasa-mars-habitat-chapea-volunteers/ Fri, 16 Feb 2024 21:00:00 +0000 https://www.popsci.com/?p=603220
Concept art of NASA Mars habitat
Three, one-year-long stints in a Mars habitat simulation are meant to pave the way for the real thing. NASA

See if you qualify to be a volunteer for a yearlong stint.


Looking for a change of pace from your day-to-day routine? Life on Earth feeling a bit overwhelming at the moment? How about a one-year residency alongside three strangers at a 3D-printed Mars habitat simulation?

On Friday, NASA announced it is now accepting applications for the second of three missions in its ongoing Crew Health and Performance Analog (CHAPEA) experiment. For 12 months, a quartet of volunteers will reside within Mars Dune Alpha, a 1,700-square-foot residence based at the Johnson Space Center in Houston, Texas, where they can expect to experience “resource limitations, equipment failures, communication delays, and other environmental stressors.” 

[Related: To create a small Mars colony, leave the jerks on Earth.]

When not pretending to fight for their survival on a harsh, barren Martian landscape, CHAPEA team members will also conduct virtual reality spacewalk simulations, perform routine maintenance on the Mars Dune Alpha structure itself, oversee robotic operations, and grow their own crops, all while staying in shape through regular exercise regimens.

But if the thought of pretending to reside 300 million miles away from your current home sounds appealing, well… cool your jets. NASA makes it clear that there are a few requirements applicants must meet before being considered for the job—such as a master’s degree in a STEM field like engineering, computer science, or mathematics. Then you’ll need either two years of professional experience in a related field, or a minimum of 1,000 hours spent piloting aircraft. Also, only non-smokers between 30 and 55 years old will be considered, and military experience certainly sounds like a plus.

Oh, and you’ll also need to fill out NASA’s lengthy questionnaire, which includes entries like, “Are you willing to have no communication outside of your crew without a minimum time delay of 20 minutes for extended periods (up to one year)?” and, “Are you willing to consume processed, shelf-stable spaceflight foods for a year with no input into the menu?”
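
That 20-minute figure lines up with basic light-travel arithmetic: Earth and Mars swing between roughly 0.4 and 2.7 astronomical units apart, so a one-way radio signal takes anywhere from about three minutes to more than 20. The distances below are approximate orbital extremes used purely for illustration.

# One-way radio delay between Earth and Mars at its nearest and farthest.
AU_KM = 1.496e8              # kilometers in one astronomical unit
LIGHT_SPEED_KM_S = 299792.458

for label, distance_au in [("closest approach", 0.38), ("farthest separation", 2.67)]:
    delay_min = distance_au * AU_KM / LIGHT_SPEED_KM_S / 60
    print(f"{label}: ~{delay_min:.0f} minutes one way")
# prints roughly 3 minutes at closest approach and about 22 minutes at the farthest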

It’s certainly a lot to consider. But as tough as it might be, simulations like CHAPEA are vital for NASA’s Artemis plans to establish a permanent human presence on both the moon and Mars. The truly intrepid and accomplished among you have until April 2 to fill out the official application. Seeing as how CHAPEA’s inaugural class is currently about halfway through their one-year stint, this second round of volunteers won’t need to report for duty until sometime in 2025. 

The post How to apply for NASA’s next Mars habitat simulation appeared first on Popular Science.

Water molecules detected on the surface of an asteroid in space for the first time https://www.popsci.com/science/water-asteroid-space/ Fri, 16 Feb 2024 17:00:00 +0000 https://www.popsci.com/?p=603138
A graphic rendering of an asteroid in space, with a close up on where water molecules are on the surface.
Using data from NASA’s Stratospheric Observatory for Infrared Astronomy (SOFIA), Southwest Research Institute scientists have discovered water molecules on the surface of an asteroid. NASA/Carla Thomas/SwRI

The data came from a now-defunct NASA mission and was collected by the Faint Object InfraRed Camera.


Scientists have detected water molecules on the surface of an asteroid in space for the first time. The findings reveal new details about how water is distributed in the solar system and are detailed in a study published February 12 in The Planetary Science Journal.

[Related: What astronomers learned from a near-Earth asteroid they never saw coming.]

Water molecules have been detected in asteroid samples returned to Earth, but this marks the first time that the molecules have been discovered on the surface of an asteroid in space. The team studied four silicate-rich asteroids using data from the now-retired Stratospheric Observatory for Infrared Astronomy (SOFIA), a plane equipped with a telescope that was operated by the German Aerospace Center and NASA. Observations taken by SOFIA’s Faint Object InfraRed Camera (FORCAST) instrument revealed that the asteroids Iris and Massalia show a specific wavelength of light indicating that water molecules are present on their surfaces. Iris is a giant at 124 miles in diameter and orbits our sun between Mars and Jupiter; Massalia is about 84 miles across and follows a similar orbit.

“Asteroids are leftovers from the planetary formation process, so their compositions vary depending on where they formed in the solar nebula,” Anicia Arredondo, study co-author and astronomer and asteroid specialist at the Southwest Research Institute, said in a statement. “Of particular interest is the distribution of water on asteroids, because that can shed light on how water was delivered to Earth.”

Dry silicate asteroids are described as anhydrous and they typically form closer to the sun. More icy space rocks like Chariklo are found further away from the sun. Understanding where asteroids are located in the solar system and what they are made from can tell us how the materials in our solar system have been distributed and evolved over time. Since water is necessary for all life on Earth, pinpointing where water could exist can drive where to look for life in our solar system and even beyond.

“We detected a feature that is unambiguously attributed to molecular water on the asteroids Iris and Massalia,” said Arredondo. “We based our research on the success of the team that found molecular water on the sunlit surface of the moon. We thought we could use SOFIA to find this spectral signature on other bodies.”

In that earlier work, SOFIA detected water molecules in one of the largest craters in the moon’s southern hemisphere. Previous observations of the moon and asteroids had found some form of hydrogen, but could not tell the difference between water and a close chemical relative called hydroxyl. That team found roughly the equivalent of a 12-ounce bottle of water in the crater, chemically bound in minerals and trapped within a cubic meter of soil spread across the lunar surface.

“Based on the band strength of the spectral features, the abundance of water on the asteroid is consistent with that of the sunlit Moon,” said Arredondo. “Similarly, on asteroids, water can also be bound to minerals as well as adsorbed to silicate and trapped or dissolved in silicate impact glass.”

[Related: NASA spacecraft Lucy says hello to ‘Dinky’ asteroid on far-flying mission.]

Parthenope and Melpomene were the two fainter asteroids in the study, and the data did not support any definitive conclusions about the presence of water molecules. According to the team, the FORCAST instrument is not sensitive enough to detect the water spectral feature on these bodies, if it is present. The team is now enlisting NASA’s James Webb Space Telescope, with its precise optics and sensitivity to infrared light, to investigate other targets in space.

“We have conducted initial measurements for another two asteroids with Webb during cycle two,” said Arredondo. “We have another proposal in for the next cycle to look at another 30 targets. These studies will increase our understanding of the distribution of water in the solar system.”

The post Water molecules detected on the surface of an asteroid in space for the first time appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
First remote, zero-gravity surgery performed on the ISS from Earth (on rubber) https://www.popsci.com/technology/remote-surgery-robot-iss/ Fri, 16 Feb 2024 15:00:00 +0000 https://www.popsci.com/?p=602988
Surgeon using spaceMIRA remote surgery tool on ISS
A team of surgeons used rubber bands to represent human tissue aboard the ISS. Credit: Virtual Incision

Surgeons in Nebraska controlled spaceMIRA from 250 miles below the ISS as it cut through simulated human tissue.

The post First remote, zero-gravity surgery performed on the ISS from Earth (on rubber) appeared first on Popular Science.

]]>
Surgeon using spaceMIRA remote surgery tool on ISS
A team of surgeons used rubber bands to represent human tissue aboard the ISS. Credit: Virtual Incision

Researchers successfully completed the first remote, zero-gravity “surgery” procedure aboard the International Space Station. Over the weekend, surgeons based at the University of Nebraska spent two hours testing out a small robotic arm dubbed the Miniaturized In Vivo Robotic Assistant, or spaceMIRA, aboard the ISS as it orbited roughly 250 miles above their heads. 

But don’t worry—no ISS astronauts were in need of desperate medical attention. Instead, the experiment utilized rubber bands to simulate human skin during its proof-of-concept demonstration on Saturday.

[Related: ‘Odie’ is en route for its potentially historic moon landing.]

Injuries are inevitable, but that little fact of life gets complicated when the nearest hospital is a seven-month, 300-million-mile journey away. Even if an incredibly skilled doctor is among the first people to set foot on Mars, they can’t be trained to handle every possible emergency. Certain issues, such as invasive surgeries, will likely require backup help. In some of those situations, remote-controlled operations could offer a solution.

Designed by Virtual Incision, a startup developing remote-controlled medical tools for the world’s most isolated regions, spaceMIRA weighs only two pounds and takes up about as much shelf space as a toaster oven. One end of its wandlike body is topped with a pair of pronglike arms—a left one to grip, and a right one to cut.

[Related: 5 space robots that could heal human bodies—or even grow new ones ]

Speaking with CNN on Wednesday, Virtual Incision cofounder and chief technology officer Shane Farritor explained that spaceMIRA’s engineering could offer Earthbound surgeons the hands and eyes needed to perform “a lot of procedures minimally invasively.”

On February 10, a six-surgeon team in Lincoln, Nebraska, took spaceMIRA (recently arrived aboard the ISS via a SpaceX Falcon 9 rocket) for its inaugural test drive. One arm gripped a mock tissue sample, and the other used scissors to dissect specific portions of the elastic rubber bands.

spaceMIRA prototype on desk
A version of the spaceMIRA (seen above) traveled to the ISS earlier this month. Credit: Virtual Incision

While researchers deemed the experiment a success, surgeons noted the difficulty in accounting for lag time. Communications between Earth and the ISS are delayed about 0.85 seconds—a minor inconvenience in most circumstances, but even milliseconds can mean the difference between life and death during certain medical emergencies. Once on the moon, Artemis astronauts and NASA headquarters will deal with roughly 1.3 seconds of delay each way when sending and receiving data. On Mars, depending on where the two planets sit in their orbits, the first human explorers could wait as long as 44 minutes between firing off a message and receiving a response. Even taking recent laser communications breakthroughs into consideration, patience will remain a virtue for everyone involved in future lunar and Mars expeditions.
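For readers who want to sanity-check those lag figures, the one-way signal delay is roughly the distance divided by the speed of light. The short Python sketch below is a back-of-the-envelope illustration added for context (it is not from NASA or Virtual Incision), and the distances are approximate averages and extremes.

```python
# Back-of-the-envelope light-travel delays (illustrative; distances are approximate).
C_KM_PER_S = 299_792  # speed of light, km per second

# Straight-line light time to the ISS (~400 km up) is only about 0.0013 s; the quoted
# 0.85-second lag comes mostly from relay satellites, ground routing, and processing.
distances_km = {
    "Moon (average distance)": 384_400,
    "Mars at closest approach": 54_600_000,
    "Mars at its farthest": 401_000_000,
}

for body, d_km in distances_km.items():
    one_way_s = d_km / C_KM_PER_S
    print(f"{body}: one-way {one_way_s:,.1f} s, round trip {2 * one_way_s:,.1f} s")
```

Run as written, that works out to roughly 1.3 seconds one way for the moon, and a Mars round trip of anywhere from about six minutes to roughly 45 minutes depending on where the two planets are in their orbits.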

This means that, for the time being, devices like spaceMIRA are unlikely to help in split-second medical decisions. But for smaller issues—say, stitching up a lunar resident after a tumble—such medical tools could prove invaluable for everyone involved. In the meantime, Virtual Incision’s remote-controlled equipment could still find plenty of uses here on Earth.

The post First remote, zero-gravity surgery performed on the ISS from Earth (on rubber) appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
‘Odie’ is en route for its potentially historic moon landing https://www.popsci.com/science/intuitive-moon-landing-launch/ Thu, 15 Feb 2024 17:00:00 +0000 https://www.popsci.com/?p=602957
A SpaceX Falcon 9 rocket carrying Intuitive Machines’ Nova-C lunar lander lifts off from Launch Pad 39A at NASA’s Kennedy Space Center in Florida a
A SpaceX Falcon 9 rocket carrying Intuitive Machines’ Nova-C lunar lander lifts off from Launch Pad 39A at NASA’s Kennedy Space Center in Florida at 1:05 a.m. EST on February 15, 2024. As part of NASA’s Commercial Lunar Payload Services (CLPS) initiative and Artemis campaign, Intuitive Machines’ first lunar mission is intended to carry science and commercial payloads to the moon. NASA

The robotic lander from Intuitive Machines could become the first privately-built spacecraft on the moon.

The post ‘Odie’ is en route for its potentially historic moon landing appeared first on Popular Science.

]]>
A SpaceX Falcon 9 rocket carrying Intuitive Machines’ Nova-C lunar lander lifts off from Launch Pad 39A at NASA’s Kennedy Space Center in Florida a
A SpaceX Falcon 9 rocket carrying Intuitive Machines’ Nova-C lunar lander lifts off from Launch Pad 39A at NASA’s Kennedy Space Center in Florida at 1:05 a.m. EST on February 15, 2024. As part of NASA’s Commercial Lunar Payload Services (CLPS) initiative and Artemis campaign, Intuitive Machines’ first lunar mission is intended to carry science and commercial payloads to the moon. NASA

A potentially new frontier of lunar exploration began at NASA’s Kennedy Space Center in Florida in the wee hours of the morning. Intuitive Machines’ robotic Odysseus lunar lander successfully launched atop a SpaceX Falcon 9 rocket on February 15 at 1:05 a.m. EST. The uncrewed lander was successfully separated from the rocket about an hour after launch, beginning its 230,000-mile journey towards the moon.

If the mission goes as planned, Odysseus will land on the moon on February 22, becoming the first private spacecraft to conduct a successful lunar landing. Only government-funded programs from the former Soviet Union, China, India, the United States, and most recently Japan have performed a soft lunar landing.

[Related: This private lander could be the first US machine on the moon this century.]

“It is a profoundly humbling moment for all of us at Intuitive Machines,” the company’s vice president of space systems Trent Martin said during a pre-launch press conference. “The opportunity to return the United States to the moon for the first time since 1972 demands a hunger to explore, and that’s at the heart of everyone at Intuitive Machines.”


The spacecraft is a hexagonal cylinder with six landing legs and is roughly 14 feet tall and five feet wide. Intuitive Machines calls the spacecraft design Nova-C and notes that it’s about the size of a classic red London telephone booth. When fully loaded with fuel, it weighs about 4,200 pounds.

The lander is aiming to touch down 186 miles away from the moon’s south pole. This region has cliffs, craters, and possibly frozen water. NASA is the main sponsor of the mission, paying Intuitive Machines about $118 million to deliver its payload to the moon. NASA hopes that if this mission is successful it will jumpstart the lunar economy ahead of future crewed missions. The space agency plans to land astronauts there later this decade. The six navigation and tech experiments in the lander’s payload will collect data critical for those missions.

Intuitive Machines’ Nova-C moon lander stands upright on six legs next to an American flag.
Intuitive Machines’ Nova-C moon lander. This particular spacecraft is named Odysseus. Credit: Intuitive Machines

“NASA scientific instruments are on their way to the moon–a giant leap for humanity as we prepare to return to the lunar surface for the first time in more than half a century,” NASA Administrator Bill Nelson said in a statement. “These daring moon deliveries will not only conduct new science at the moon, but they are supporting a growing commercial space economy while showing the strength of American technology and innovation. We have so much to learn through CLPS flights that will help us shape the future of human exploration for the Artemis Generation.”

A camera constructed by students at Embry-Riddle Aeronautical University and an art project by Jeff Koons are also making the lunar journey.

Employees at Intuitive Machines held a naming contest to select the lander’s moniker, picking Odysseus after the hero in the ancient Greek poem the Odyssey by Homer. Engineer Mario Romero suggested the name as an analogy for a mission to the moon. 

[Related: Watch a giant, inflatable space station prototype explode during its intentional ‘ultimate burst.’]

“This journey takes much longer due to the many challenges, setbacks and delays,” Romero said in a statement. “Traveling the daunting, wine-dark sea repeatedly tests his mettle, yet ultimately, Odysseus proves worthy and sticks the landing back home after 10 years.”

Odysseus launches one month after Pittsburgh-based Astrobotic’s Peregrine lunar lander failed to complete its mission. That spacecraft burned up in Earth’s atmosphere about 10 days after a ruptured fuel tank and massive propellant leak doomed the mission. Other unsuccessful attempts to put a privately built lander on the moon include Israel’s Beresheet lander in 2019 and Japan’s Hakuto-R Mission 1 lander in 2023.

The post ‘Odie’ is en route for its potentially historic moon landing appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
A Martian solar eclipse turns the sun into a giant googly eye https://www.popsci.com/science/phobos-mars-solar-eclipse/ Mon, 12 Feb 2024 19:11:26 +0000 https://www.popsci.com/?p=602387
Phobos creating partial solar eclipse on Mars, image taken by Perseverance rover
A Phobos eclipse will only grow larger over the next 50 million years as it continues to descend towards Mars. NASA/JPL-Caltech/ASU

NASA's Perseverance rover captured Phobos as it crossed in front of the sun last week.

The post A Martian solar eclipse turns the sun into a giant googly eye appeared first on Popular Science.

]]>
Phobos creating partial solar eclipse on Mars, image taken by Perseverance rover
A Phobos eclipse will only grow larger over the next 50 million years as it continues to descend towards Mars. NASA/JPL-Caltech/ASU

The next solar eclipse to cross North America is fast approaching, but over on Mars, the Red Planet already experienced one of its own celestial shadow events this year.

On February 8, the asteroid-sized Martian moon Phobos crossed in front of the sun above Jezero Crater—the area just so happening to host NASA’s Perseverance rover. As Phobos continued across the sky, Percy’s left Mastcam-Z camera angled away from its usual landscape vista subject matter towards the satellite, snapping a few dozen photos for project coordinators back at NASA’s Jet Propulsion Laboratory (JPL).

Gallery of Phobos solar eclipse thumbnails
Credit: NASA/JPL/ASU

The images showcase a markedly different event than the total solar eclipses Earth experiences roughly every 18 months. Given both Phobos’ size and shape, the moon doesn’t fully cover the sun—instead, the 17x14x11-mile misshapen hunk of rock blocks only a small portion of the star as it continues along its path. The result arguably resembles a googly eye more than an awe-inspiring cosmic calendar occurrence, but it’s still a pretty impressive vantage point.
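A rough angular-size comparison helps explain why the eclipse looks this way. The sketch below is an editor-added illustration, not a NASA calculation; it assumes Phobos circles about 6,000 kilometers above the Martian surface, that Mars sits roughly 228 million kilometers from the sun, and it averages the moon’s dimensions quoted above to about 22 kilometers.

```python
# Rough angular-size comparison: why Phobos can never fully cover the sun from Mars.
# Assumed values (not from the article): Phobos orbits ~6,000 km above the surface;
# Mars averages ~228 million km from the sun.
import math

PHOBOS_MEAN_DIAMETER_KM = 22          # rough average of the 17 x 14 x 11 mile figure above
PHOBOS_ALTITUDE_KM = 6_000
SUN_DIAMETER_KM = 1_392_000
MARS_SUN_DISTANCE_KM = 228_000_000

phobos_deg = math.degrees(PHOBOS_MEAN_DIAMETER_KM / PHOBOS_ALTITUDE_KM)
sun_deg = math.degrees(SUN_DIAMETER_KM / MARS_SUN_DISTANCE_KM)
print(f"Phobos spans ~{phobos_deg:.2f} degrees, the sun ~{sun_deg:.2f} degrees, "
      f"so Phobos covers only about {phobos_deg / sun_deg:.0%} of the sun's width")
```

By this estimate Phobos appears a bit more than half as wide as the sun from the Martian surface, so even a dead-center pass leaves sunlight showing all the way around its silhouette.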

Phobos and its smaller sibling moon Deimos were discovered in 1877 by US astronomer Asaph Hall, and are respectively named after the Greek words for “Fear” and “Dread.” The origins of both satellites aren’t wholly understood, although astronomers theorize them to be either asteroids or debris leftover from the solar system’s formation that occurred around 4.5 billion years ago.

[Related: The Mars Express just got up close and personal with Phobos.]

While the Earth’s moon continues to inch away from our planet at a rate of roughly 1.5 inches per year, Phobos is actually being drawn towards Mars—about six feet closer every century. While that makes for a comparatively slow descent, it does still mean the moon will eventually either crash into Mars or break apart into thousands of fragments that form a planetary ring like Saturn’s. No need to worry, though, since that grand finale isn’t expected for another 50 million years. In the meantime, Phobos will continue orbiting Mars about three times per day, while the slower Deimos completes its journey every 30 hours.

Perseverance’s eclipse capture, while incredible on its own, naturally fails to show much detail of the moon’s pockmarked surface. Luckily, the European Space Agency’s Mars Express caught a closer look back in 2022, when the orbiter came within just 52 miles of Phobos to snap its own photos.

The post A Martian solar eclipse turns the sun into a giant googly eye appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
NASA’s Hubble space telescope reveals a galactic ‘string of pearls’ https://www.popsci.com/science/nasa-hubble-string-of-pearls/ Fri, 09 Feb 2024 15:36:29 +0000 https://www.popsci.com/?p=602089
Galaxy AM 1054-325 has been distorted into an S-shape from a normal pancake-like spiral shape by the gravitational pull of a neighboring galaxy, seen in this NASA Hubble Space Telescope image. A consequence of this is that newborn clusters of stars form along a stretched-out tidal tail for thousands of light-years, resembling a string of pearls. They form when knots of gas gravitationally collapse to create about one million newborn stars per cluster.
Galaxy AM 1054-325 has been distorted into an S-shape from a normal pancake-like spiral shape by the gravitational pull of a neighboring galaxy, seen in this NASA Hubble Space Telescope image. A consequence of this is that newborn clusters of stars form along a stretched-out tidal tail for thousands of light-years, resembling a string of pearls. They form when knots of gas gravitationally collapse to create about one million newborn stars per cluster. NASA, ESA, STScI, Jayanne English/University of Manitoba

425 clusters of newborn stars shine along cosmic tails that could hold clues to the universe's past.

The post NASA’s Hubble space telescope reveals a galactic ‘string of pearls’ appeared first on Popular Science.

]]>
Galaxy AM 1054-325 has been distorted into an S-shape from a normal pancake-like spiral shape by the gravitational pull of a neighboring galaxy, seen in this NASA Hubble Space Telescope image. A consequence of this is that newborn clusters of stars form along a stretched-out tidal tail for thousands of light-years, resembling a string of pearls. They form when knots of gas gravitationally collapse to create about one million newborn stars per cluster.
Galaxy AM 1054-325 has been distorted into an S-shape from a normal pancake-like spiral shape by the gravitational pull of a neighboring galaxy, seen in this NASA Hubble Space Telescope image. A consequence of this is that newborn clusters of stars form along a stretched-out tidal tail for thousands of light-years, resembling a string of pearls. They form when knots of gas gravitationally collapse to create about one million newborn stars per cluster. NASA, ESA, STScI, Jayanne English/University of Manitoba

When galaxies collide, their stars are not actually destroyed. These rough-and-tumble dynamics actually trigger the formation of new generations of stars, and potentially even planets to accompany them.

[Related: Behold six galactic collisions, masterfully captured by Hubble.]

Astronomers using the Hubble Space Telescope have taken a closer look at 12 of these interacting galaxies. These galaxies all have long tails of gas, dust, and multitudes of stars. Hubble can detect ultraviolet light and has uncovered 425 clusters of newborn stars located along these galactic tails that resemble a string of lights or pearls. Each of these clusters is packed with as many as one million newborn blue stars. The findings were described in a study published in the Monthly Notices of the Royal Astronomical Society in September 2023. A new image of the string of pearls galaxy (Galaxy AM 1054-325) was released by NASA on February 8.

Galactic collisions and high pressure

When galaxies interact with each other, gravitational tidal forces will pull out long streams of gas and dust from the material that make up each galaxy. The Antennae and Mice galaxies have long, narrow, finger-like tendrils and are common examples of what these galactic tails look like.

“As galaxies merge, clouds of gas collide and collapse, creating a high-pressure environment where stars could form,” study co-author and Penn State University astronomer Jane Charlton said in a statement. “The interiors of these mergers have been well studied, but less was known about possible star formation in the debris that results from these mergers, such as in the tidal tails.”

Tails of young stars

In their study, a team of scientists used new observations and archival data to estimate the ages and masses of tidal tail star clusters. At only 10 million years old, these clusters are very young and appear to be forming at the same rate along tails that stretch for thousands of light-years.

“It’s a surprise to see lots of the young objects in the tails. It tells us a lot about cluster formation efficiency,” study co-author and Randolph-Macon College astronomer Michael Rodruck said in a statement. “With tidal tails, you will build up new generations of stars that otherwise might not have existed.”

The tails appear to be taking the spiral arm of a galaxy and stretching it further out into space. The exterior arm is pulled in a gravitational tug-of-war between the two interacting galaxies. 

Galaxies as a time capsule

Before these galactic mergers even happened, the galaxies were rich in dusty clouds of molecular hydrogen. These clouds may simply have been unable to move, but the clouds eventually jostled around and began to bump into each other. This activity then compressed the hydrogen to a point where it created a huge storm of star birth. 

[Related: How do you make cosmic sausage?]

Scientists are still not sure what the fate of these strung-out clusters of stars will be. They could stay gravitationally intact and eventually change into globular star clusters. It’s possible that they may also disperse to form a halo of stars around their host galaxy, or even be cast off and become wandering intergalactic stars.

Nearby galaxies observed by Hubble like these can be used as a proxy for what happened in our universe millions of years ago and are a way to peer into the distant past.

“We think that star clusters in tidal tails may have been more common in the early universe,” said Charlton, “when the universe was smaller and galaxy collisions were more frequent.”

The post NASA’s Hubble space telescope reveals a galactic ‘string of pearls’ appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
NASA’s PACE satellite takes off to monitor phytoplankton—from space https://www.popsci.com/science/nasas-pace-satellite-launch/ Thu, 08 Feb 2024 14:30:00 +0000 https://www.popsci.com/?p=601927
NASA’s Plankton, Aerosol, Climate, ocean Ecosystem (PACE) satellite launches aboard a SpaceX Falcon 9 rocket. Fire and smoke are below the spacecraft as it lifts off from the launch pad.
NASA’s Plankton, Aerosol, Climate, ocean Ecosystem (PACE) satellite launched aboard a SpaceX Falcon 9 rocket from Space Launch Complex 40 at Cape Canaveral Space Force Station in Florida. NASA

The spacecraft will monitor the oceans and atmosphere of a warming planet in new ways.

The post NASA’s PACE satellite takes off to monitor phytoplankton—from space appeared first on Popular Science.

]]>
NASA’s Plankton, Aerosol, Climate, ocean Ecosystem (PACE) satellite launches aboard a SpaceX Falcon 9 rocket. Fire and smoke are below the spacecraft as it lifts off from the launch pad.
NASA’s Plankton, Aerosol, Climate, ocean Ecosystem (PACE) satellite launched aboard a SpaceX Falcon 9 rocket from Space Launch Complex 40 at Cape Canaveral Space Force Station in Florida. NASA

NASA’s PACE satellite successfully launched into orbit at 1:33 a.m. EST on February 8. The climate satellite launched aboard a SpaceX Falcon 9 rocket from Cape Canaveral Space Force Station in Florida.

[Related: Scientists say the ocean is changing color—and it’s probably our fault.]

The new Plankton, Aerosol, Climate, ocean Ecosystem satellite will study ocean health, air quality, the atmosphere, and the effects of climate change from about 420 miles above the Earth. While NASA already has over 24 Earth-observing satellites and instruments in orbit, this new one should give scientists better insight into how particles in the atmosphere like pollutants and volcanic ash interact with algae and plankton. 


“It’s going to teach us about the oceans in the same way that Webb is teaching us about the cosmos,” said Karen St. Germain, director of NASA’s Earth Science Division, Science Mission Directorate, during a pre-launch briefing on February 4th. St. Germain is referencing the James Webb Space Telescope, which has been unveiling mysteries of the deep cosmos for almost two years. 

What’s on board

PACE will scan the Earth every day using two science instruments, while a third device will take monthly measurements.  

According to NASA, the hyperspectral ocean color instrument will help researchers measure the world’s oceans and other bodies of water across a spectrum of ultraviolet (UV), visible, and near-infrared light. PACE will be able to detect 200 colors, compared to the seven or eight colors that current satellites can pick up. Seeing such a wide spectrum of color will allow researchers to track how phytoplankton is distributed around the globe. 

Two polarimeter instruments are also onboard–Hyper-Angular Rainbow Polarimeter #2 and Spectro-polarimeter for Planetary Exploration. Both will detect how sunlight interacts with particles in the atmosphere. This will give scientists new insight into atmospheric aerosols, cloud properties, and air quality at local, regional, and global scales.

“Observations and scientific research from PACE will profoundly advance our knowledge of the ocean’s role in the climate cycle,” St. Germain said in a statement following the launch. “The value of PACE data skyrockets when we combine it with data and science from our Surface Water and Ocean Topography mission–ushering in a new era of ocean science. As an open-source science mission with early adopters ready to use its research and data, PACE will accelerate our understanding of the Earth system and help NASA deliver actionable science, data, and practical applications to help our coastal communities and industries address rapidly evolving challenges.”

Why study phytoplankton from space

Our planet’s oceans are responding to climate change in several different ways. Sea levels are rising as polar ice melts. Marine heat waves are killing sea life and fueling stronger storms. The ocean is even getting greener, and shifts in ocean color are an indication that ecosystems may also be changing. A July 2023 study found that the blue-green fluctuations in the ocean’s hue over the last two decades cannot be explained by natural year-to-year variability alone. These changes are present in over 56 percent of the planet’s oceans. The study also found that tropical oceans near the Earth’s equator have become steadily greener over time.

[Related: The epic journey of dust in the wind often ends with happy plankton.]

Following tiny phytoplankton can help monitor all of these changes. These microscopic marine algae play a major role in the global carbon cycle. Phytoplankton absorb carbon dioxide from the atmosphere and convert it into cellular material, and they drive the larger aquatic and global ecosystem by providing food for bigger organisms.

PACE will provide the first measurements of phytoplankton community composition around the world. “This will significantly improve our ability to understand Earth’s changing marine ecosystems, manage natural resources such as fisheries and identify harmful algal blooms,” wrote NASA.

A mission 20 years in the making

In addition to two scrubbed launch attempts earlier this week, PACE has powered through other adversity on its way into orbit. The Trump Administration tried to cancel the mission four times in separate budget proposals, but Congress had already allocated the funds, which saved it.

The mission has also weathered delays and cost overruns. NASA capped its total price tag at $805 million in 2014, with a launch initially scheduled for 2022. By the 2024 launch, the cost had ballooned to $948 million.

“After 20 years of thinking about this mission, it’s exhilarating to watch it finally realized and to witness its launch. I couldn’t be prouder or more appreciative of our PACE team,” Jeremy Werdell, PACE project scientist at NASA’s Goddard Space Flight Center, said in a statement. “The opportunities PACE will offer are so exciting, and we’re going to be able to use these incredible technologies in ways we haven’t yet anticipated. It’s truly a mission of discovery.”

Scientists expect to start getting the first data from PACE in a month or two. 

The post NASA’s PACE satellite takes off to monitor phytoplankton—from space appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
NASA’s Perseverance Rover spots damaged, lonely Ingenuity helicopter in the ‘bland’ part of Mars https://www.popsci.com/science/perseverance-ingenuity/ Wed, 07 Feb 2024 18:00:00 +0000 https://www.popsci.com/?p=601829
The Ingenuity helicopter sits on the surface of Mars on February 4, 2024.
The Ingenuity helicopter sits on the surface of Mars on February 4, 2024. NASA/JPL-Caltech

The rover will have to continue exploring the Red Planet without its drone companion.

The post NASA’s Perseverance Rover spots damaged, lonely Ingenuity helicopter in the ‘bland’ part of Mars appeared first on Popular Science.

]]>
The Ingenuity helicopter sits on the surface of Mars on February 4, 2024.
The Ingenuity helicopter sits on the surface of Mars on February 4, 2024. NASA/JPL-Caltech

On February 4, NASA’s Perseverance rover snapped an image of its now-defunct companion, the Ingenuity helicopter. The pair had spent almost three Earth years scouring the Red Planet for signs of ancient life and advancing aerial exploration on Mars. The damaged Ingenuity helicopter has now been sitting idle on the Martian surface for just over two weeks.

[Related: RIP Mars Ingenuity, the ‘little helicopter that could.’]

The Perseverance rover snapped the image at 1:05 p.m. local mean solar time; it shows the “little helicopter that could” sitting alone on a barren Martian sand dune in Neretva Vallis. Perseverance then rolled away from its broken companion, possibly for the last time. The image was beamed back to Earth and processed by visual design student Simeon Schmauss, who stitched six raw NASA frames together into a panorama.

On January 18, Ingenuity’s rotors were damaged when it made a landing on what NASA called a “bland” patch of Martian landscape. Typically, the helicopter used rocks and other distinguishing features on the Red Planet to help it navigate, but the drone did not have many visual cues during its 72nd and final flight. 

NASA confirmed that the rotorcraft damaged at least one blade when it completed the flight. While it landed upright and was still in communication with NASA’s Jet Propulsion Laboratory (JPL), its flying days were officially over. JPL is still analyzing the damage.

On January 31, NASA held a live streamed tribute to Ingenuity. “We couldn’t be prouder or happier with how our little baby has done,” Ingenuity Project Manager Teddy Tzanetos said during the event. “It’s been the mission of a lifetime for all of us. And I wanted to say thank you to all of the people here that gave their weekends, their late nights. All the engineers, the aerodynamic scientists, the technicians who hand-crafted this aircraft.”

Ingenuity first landed on Mars on February 18, 2021. By April, it had become the first powered aircraft to lift off from the surface of another planet. Ingenuity was initially intended to make five test flights alongside Perseverance over 30 days. However, this four-pound helicopter just kept going: it flew 14 times farther than planned, logged a total flight time of two hours, and lasted about 33 times longer than NASA expected. Ingenuity hovered above the rover acting as a scout as Perseverance puttered along the sands of Mars.

[Related: Name a better duo than NASA’s hard-working Mars rover and helicopter.]

Before Ingenuity’s demise, the dynamic duo explored Mars’ Jezero Crater. This site contains evidence of ancient bodies of water that could have harbored life billions of years ago. Ingenuity worked by capturing aerial views of Mars that pinpointed places for Perseverance to explore further. 

During the January 31 livestream, NASA’s Mars Exploration Program deputy director Tiffany Morgan said that Ingenuity will leave a lasting legacy for future aerial missions and has demonstrated how helicopters can be used in missions to other planets.

Thanks in part to Ingenuity’s success, NASA has proposed using two helicopters in a planned Mars Sample Return mission. These small aircraft could help pick up the canisters of rock samples that the rover has been placing along the planet’s surface. The orbiter for this mission is expected to launch in 2027 and the lander in 2028, with the samples returned to Earth as early as 2033.

Until then, Perseverance must go it alone. 

The post NASA’s Perseverance Rover spots damaged, lonely Ingenuity helicopter in the ‘bland’ part of Mars appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
ESA will send a triangle of satellites into space to study gravitational waves https://www.popsci.com/science/lisa-esa/ Wed, 07 Feb 2024 15:17:48 +0000 https://www.popsci.com/?p=601793
An artist's representation of two black holes generating gravitational waves, and LISA’s triangle of laser light.
An artist's representation of two black holes generating gravitational waves, and LISA’s triangle of laser light. ESA

New LISA satellite trio will be able to detect the forgotten 'middle children' of the black hole family.

The post ESA will send a triangle of satellites into space to study gravitational waves appeared first on Popular Science.

]]>
An artist's representation of two black holes generating gravitational waves, and LISA’s triangle of laser light.
An artist's representation of two black holes generating gravitational waves, and LISA’s triangle of laser light. ESA

A precisely arranged triangle of three satellites shooting laser beams at each other truly sounds like science fiction. But the European Space Agency (ESA) is set to make this a reality by 2035.

The project, known as the Laser Interferometer Space Antenna (LISA), is like the famous gravitational-wave-discovering LIGO experiment—just in space instead of in tunnels on the ground. Led by ESA, the project is a collaboration with NASA and a consortium of scientists. ESA recently gave the mission team the official go-ahead, with construction of the spacecraft to begin in January 2025 and launch planned a decade later. Astrophysicists who work on the mission, like Max Planck Institute astrophysicist Sarah Paczkowski, were overjoyed at the news, describing the mission adoption as “rewarding” and “super exciting.”

“LISA will be sensitive to an as-of-yet unexplored regime of gravitational waves” or ripples in the fabric of spacetime, explains Michael Zevin, an astrophysicist at the Adler Planetarium and part of the LIGO collaboration. Gravitational waves reveal the physics of black holes smashing together, massive supernova explosions, and even the earliest moments of the universe. LISA’s new perspective is “akin to the first time observing the universe in light outside the visible range, such as X-rays or infrared, which enabled an immense amount of science, discovery, and understanding of the cosmos,” he adds. 

Since the first detection of gravitational waves was announced in 2016, astronomers have been eagerly exploring the cosmos through this new window. We’ve heard the final moments when two black holes spiral into each other, and learned how the highest-energy events in the cosmos—like supernovae and gamma-ray bursts—happen by simultaneously spotting light and gravitational waves from a cosmic explosion.


[ Related: Gravitational waves just showed us something even cooler than black holes ]

The common words between LISA and LIGO are laser interferometer, describing the setup of the experiment. These projects note how the distance between two objects changes ever so slightly as a gravitational wave ripples by. They do this with two long arms—in LIGO’s case, two big tunnels each 2.5 miles long. In those tunnels, there are huge vacuum chamber tubes where laser light travels down each arm and bounces back off of a mirror. Where the light recombines in the center, it’ll look different if the laser has to travel farther in one arm due to a passing gravitational wave.

But Earth-bound experiments are limited in the type of gravitational waves they can detect. With gravitational waves, you need long tunnels like LIGO’s to notice the ripples—and the longer the arms, the longer the waves you can detect. Ground-based detectors like LIGO are best at spotting the shortest gravitational waves, very high frequency vibrations in spacetime that come from the last milliseconds before black holes or neutron stars collide. This is partly because Earth is absolutely buzzing with activity that can disrupt the detector. “Earthquakes, cars, ocean waves, and even clouds passing over the detector” create noise that prevents astronomers from hearing the low-frequency rumbles of the universe, says Simon Barke, LISA Charge Management Device scientist at the University of Florida’s Precision Space Systems Laboratory. 

Ground-based detectors are limited in the size of the experiment; we simply can’t create a building bigger than Earth, or even bigger than a few states, limiting us to the shortest gravitational waves. On the other end of the spectrum, pulsar timing arrays like NANOGrav can only hear the lowest frequencies originating from humongous supermassive black holes, as they’re measuring the vast distances between dead stars known as pulsars. For everything in between, you need LISA.

“We’ve so far only heard a very narrow range of the sounds of spacetime,” explains Zevin. “Detectors on the ground have been listening to the violins of the gravitational-wave symphony. Pulsar timing arrays have recently announced evidence for hearing the bass section from supermassive black holes throughout the universe. LISA will be the first space-based gravitational-wave instrument and will listen to frequencies between these two regimes: the viola and cello sections of the orchestra.”

LISA will be made of three spacecraft, each containing a pair of solid gold-platinum cubes, with lasers spanning the distance between them to note any changes caused by gravitational waves. This concept requires mind-boggling precision: the spacecraft must be ultra-stable, ensuring only the movement of spacetime affects the test masses, in order to measure ripples of only a few billionths of a millimeter. The spacecraft will also each be separated by a length “larger than the diameter of the Sun,” says Barke. 
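To put that “few billionths of a millimeter” into numbers, a passing gravitational wave changes an arm of length L by roughly ΔL = h × L, where h is the wave’s strain. The snippet below is a rough, editor-added illustration rather than a LISA team calculation; it assumes the mission’s planned arm length of about 2.5 million kilometers and a representative strain of about 1e-21, a typical value for the sources LISA targets.

```python
# Rough illustration of the arm-length change LISA must resolve (assumed values).
ARM_LENGTH_M = 2.5e9   # planned arm length: ~2.5 million km, written in meters
STRAIN = 1e-21         # representative gravitational-wave strain (assumption)

delta_l_m = STRAIN * ARM_LENGTH_M  # change in the distance between two spacecraft
print(f"Arm-length change: {delta_l_m:.1e} m, or about {delta_l_m * 1e12:.1f} picometers "
      f"(a few billionths of a millimeter)")
```

That comes out to roughly 2.5 picometers, squarely in the range quoted above.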

Infographic providing information on gravitational waves and how the LISA mission will measure them using laser beams and free-floating cubes. The image shows the three LISA spacecraft in orbit with the Earth and Sun visible. A zoomed in circle focuses on one of the spacecraft and the two golden cubes it contains. In the background an illustration of two colliding black holes is creating ripples in spacetime. Another box shows a sequence of triangles to demonstrate the effect gravitational waves will have on the distance travelled by LISA’s laser beams. Credit: ESA / ATG Medialab, CC BY-SA 3.0 IGO

This technology will allow astronomers to track the collisions of binary stars in the Milky Way, map the gravity around the most monstrous black holes lurking at the centers of galaxies, and test relativity in such an extreme environment that “we might see deviations from Einstein’s prediction,” says Barke. Scientists also hope LISA will help solve a long-standing question in astronomy: how do supermassive black holes grow to their gargantuan sizes? The upcoming experiment will finally be able to observe so-called intermediate-mass black holes, the “elusive middle-children of the black hole family” according to Zevin, which weigh in at thousands of times the mass of our sun and could be a key step on the path to becoming supermassive.

Sumeet Kulkarni, a gravitational wave astronomer at the University of Mississippi, also notes the gravity of this moment for those of us on Earth: “The confirmation that LISA will actually fly in the near future opens so many new research opportunities and challenges to overcome for an entire generation of physicists, astronomers, and engineers.”

The post ESA will send a triangle of satellites into space to study gravitational waves appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
New map of half the known universe shines with cosmic energy https://www.popsci.com/science/map-half-of-the-universe-x-ray/ Mon, 05 Feb 2024 17:00:00 +0000 https://www.popsci.com/?p=601441
Two different versions of the eROSITA map. Extended X-ray emissions (left) and point-like X-ray sources (right).
Two different versions of the eROSITA map. Extended X-ray emissions (left) and point-like X-ray sources (right). © MPE, J. Sanders for the eROSITA consortium

More than one million sources of energy are featured in the first X-ray images from the eROSITA space telescope.

The post New map of half the known universe shines with cosmic energy appeared first on Popular Science.

]]>
Two different versions of the eROSITA map. Extended X-ray emissions (left) and point-like X-ray sources (right).
Two different versions of the eROSITA map. Extended X-ray emissions (left) and point-like X-ray sources (right). © MPE, J. Sanders for the eROSITA consortium

On February 1, a team of astronomers and cosmologists published the first data from the eROSITA sky survey. The data includes a cosmic map of half of the universe, taken in X-ray light. Scientists from institutions in Germany and Russia used the eROSITA space telescope, which is positioned at Lagrange Point 2 near the James Webb Space Telescope. The soft X-ray imaging telescope generated a detailed X-ray view of the western galactic hemisphere of the sky. The map includes close to one million cosmic sources of energy, including more than 700,000 supermassive black holes gobbling up material at the centers of galaxies.

[Related: What is matter? It’s not as basic as you’d think.]

This new survey called the eROSITA All-Sky Survey Catalogue (eRASS1) is the largest-ever catalog of the most powerful sources of energy in the universe. The data was gathered between December 12, 2019 and June 11, 2020. According to the team from the Max Planck Institute for Extraterrestrial Physics in Germany, eROSITA was able to capture 170 million individual particles of X-ray light called photons. By measuring the energy and arrival time of each photon, astronomers could build a detailed map of the cosmos. The team did so by processing and calibrating the photons detected by the telescope against both bright and diffuse backgrounds.
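For context on what those photon energies mean, X-ray astronomers quote energies in kiloelectronvolts (keV), which map to wavelength through E = hc/λ, where hc is about 1.24 keV·nm. The snippet below is an illustrative conversion added by the editor, not part of the eROSITA pipeline; it turns the energy bands used to color-code the new map (listed in the image caption below) into wavelengths, showing that these are “soft” X-rays of roughly half a nanometer to a few nanometers.

```python
# Convert X-ray photon energies (keV) to wavelengths (nm) via E = hc / wavelength.
HC_KEV_NM = 1.23984  # Planck's constant times the speed of light, in keV * nm

for energy_kev in (0.3, 0.6, 1.0, 2.3):  # band edges used to color the eRASS1 map
    wavelength_nm = HC_KEV_NM / energy_kev
    print(f"{energy_kev:>4} keV  ->  {wavelength_nm:.2f} nm")
```

The resulting wavelengths, roughly 0.5 to 4 nanometers, are about a hundred to a thousand times shorter than visible light.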

In addition to the over 900,000 X-ray sources and supermassive black holes, the map has about 180,000 X-ray-emitting stars in the Milky Way and 12,000 galaxy clusters. It includes some less common objects like pulsars, binary stars, and supernova remnants. The survey also features some cosmic web filaments. These bunches of hot gas are the largest known structures in the universe and connect galaxies in clusters. 

Half of the X-ray sky, projected onto a circle (so-called Zenit Equal Area projection) with the center of the Milky Way on the left and the galactic plane running horizontally. Photons have been color-coded according to their energy (red for energies 0.3-0.6 keV, green for 0.6-1 keV, blue for 1-2.3 keV). CREDIT: © MPE, J. Sanders for the eROSITA consortium.

“These are mind-blowing numbers for X-ray astronomy,” Andrea Merloni, eROSITA principal investigator and co-author of the first paper, said in a statement. “We’ve detected more sources in 6 months than the big flagship missions XMM-Newton and Chandra have done in nearly 25 years of operation.”

About 50 scientific papers accompany the data release, adding to the roughly 200 studies already written using the eROSITA telescope. The telescope’s primary goal is to use clusters of galaxies to observe how dark energy accelerates the expansion of the universe.

The treasure trove of data collected from eRASS1 could lead to answers to some of the biggest questions in cosmology, namely how the universe evolved and why space is expanding at a quickening pace. The data release also pinpoints the positions in the sky from which individual photons are received, when the photons arrive, and how much energy they contain. This helped make the map more precise. 

The eROSITA Consortium has also made available the software needed to analyze the telescope’s data, along with catalogs that go beyond the X-ray data.

[Related: Dark energy fills the cosmos. But what is it?]

“We’ve made a huge effort to release high-quality data and software,” eROSITA Operations team leader Miriam Ramos-Ceja said in a statement. “We hope this will broaden the base of scientists worldwide working with high-energy data and help push the frontiers of X-ray astronomy.”

eROSITA made three other scans between June 2020 and February 2022, but the project was put on hold due to Russia’s invasion of Ukraine. Additional cosmology results based on an in-depth analysis of the galaxy cluster data are expected to be published by the middle of February. A full list of the newly released scientific publications from this data can be found here.

The post New map of half the known universe shines with cosmic energy appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
This may be SLIM’s farewell transmission from the moon https://www.popsci.com/science/slim-probe-last-picture/ Fri, 02 Feb 2024 19:30:00 +0000 https://www.popsci.com/?p=601357
SLIM lunar lander final moon photo
This might be the last image Earth receives from Japan's lunar probe. Credit: JAXA, Ritsumeikan University, University of Aizu

Japan’s lunar lander made history and defied the odds, but it may finally be down for the count.

The post This may be SLIM’s farewell transmission from the moon appeared first on Popular Science.

]]>
SLIM lunar lander final moon photo
This might be the last image Earth receives from Japan's lunar probe. Credit: JAXA, Ritsumeikan University, University of Aizu

SLIM, Japan’s historic moon lander, is officially powered down in preparation for a brutal, likely fatal lunar nighttime lasting around 14.5 days. Before drifting off to what very probably will be a permanent slumber, however, the small craft beamed back a few final glimpses of its new home to mission control at the Japanese space agency, JAXA.

[Related: Japan’s SLIM lunar lander stuck the landing—upside down.]

“Last night (January 31st to February 1st), we sent a command to turn on the probe’s communication device just in case, and when there was no response, we confirmed that SLIM had entered a dormant state,” reads a machine translated message from JAXA posted to X on Thursday. “This is the last scene taken by SLIM with its navigation camera before dusk.”

Japan’s Smart Lander for Investigating Moon, or SLIM, first ran into trouble during its descent on January 19, when its main engines malfunctioned approximately 162 feet above the lunar surface. The resulting loss of thrust threw the lander off kilter, and while it arrived intact, it did so nose down with its solar panels facing westward. Engineers worried the lander would be unable to generate enough power to keep communicating with Earth for very long, and SLIM went silent only a few hours after its arrival—although its two tiny autonomous robots ejected unscathed to begin their own surveys.

Image taken of JAXA SLIM lunar lander on moon upside down
Japan’s lunar lander SLIM landed upside down. Credit: JAXA/Takara Tomy/Sony Group Corporation/Doshisha University JAXA/Takara Tomy/Sony Group Corporation/Doshisha University

Almost 10 days later, however, the sun’s return provided SLIM enough juice to reboot itself and commence a few more operations, including using its Multi-Band Camera to scan the chemical composition of its lunar surroundings. JAXA researchers are currently analyzing all the data SLIM relayed back to Earth, paying specific attention to the detection of olivine, which “will help solve the mystery of the origin of the moon,” JAXA officials said in a statement released on February 1.

SLIM’s final glimpse of the moon shows a darkening landscape as it enters its lengthy lunar night, when temperatures plummet as low as a balmy -208 Fahrenheit. It’s interesting to compare the last photo with SLIM’s two previous snapshots taken immediately after touchdown on January 19, as well as after coming back online ten days later. Viewed side-by-side, the triptych highlights an out-of-frame sun’s slow descent across the moon’s horizon as it casts lengthening shadows across the lunar landscape and regolith. (Pictured below: From left to right: SLIM’s images of the lunar surface from Jan. 19 to Feb. 1. Credit: JAXA/Takara Tomy/Sony Group Corporation/Doshisha University.)

But although this very likely marks the official end of SLIM’s monthslong journey, JAXA isn’t shutting down operations just yet. After all, spacecraft often prove far more resilient than initially believed—just ask the NASA teams behind Voyager or Ingenuity.

“Although SLIM was not designed for the harsh lunar nights, we plan to try to operate again from mid-February, when the Sun will shine again on SLIM’s solar cells,” JAXA posted to X on Thursday.

The post This may be SLIM’s farewell transmission from the moon appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Why interstellar objects like ‘Oumuamua and Borisov may hold clues to exoplanets https://www.popsci.com/science/oumuamua-and-borisov-may-hold-clues-to-exoplanets/ Fri, 02 Feb 2024 14:25:18 +0000 https://www.popsci.com/?p=601253
The first interstellar interloper detected passing through the Solar System, 1l/‘Oumuamua, came within 24 million miles of the Sun in 2017. It’s difficult to know exactly what ‘Oumuamua looked like, but it was probably oddly shaped and elongated, as depicted in this illustration.
The first interstellar interloper detected passing through the Solar System, 1l/‘Oumuamua, came within 24 million miles of the Sun in 2017. It’s difficult to know exactly what ‘Oumuamua looked like, but it was probably oddly shaped and elongated, as depicted in this illustration. NASA, ESA, JOSEPH OLMSTED (STSCI), FRANK SUMMERS (STSCI)

The detection of two celestial interlopers careening through our solar system have scientists eagerly anticipating more.

The post Why interstellar objects like ‘Oumuamua and Borisov may hold clues to exoplanets appeared first on Popular Science.

]]>
The first interstellar interloper detected passing through the Solar System, 1l/‘Oumuamua, came within 24 million miles of the Sun in 2017. It’s difficult to know exactly what ‘Oumuamua looked like, but it was probably oddly shaped and elongated, as depicted in this illustration.
The first interstellar interloper detected passing through the Solar System, 1l/‘Oumuamua, came within 24 million miles of the Sun in 2017. It’s difficult to know exactly what ‘Oumuamua looked like, but it was probably oddly shaped and elongated, as depicted in this illustration. NASA, ESA, JOSEPH OLMSTED (STSCI), FRANK SUMMERS (STSCI)

This article was originally featured on Knowable Magazine.

On October 17 and 18, 2017, an unusual object sped across the field of view of a large telescope perched near the summit of a volcano on the Hawaiian island of Maui. The Pan-STARRS1 telescope was designed to survey the sky for transient events, like asteroid or comet flybys. But this was different: The object was not gravitationally bound to the Sun, or to any other celestial body. It had arrived from somewhere else.

The mysterious object was the first visitor from interstellar space observed passing through the solar system. Astronomers named it 1I/‘Oumuamua, borrowing a Hawaiian word that roughly translates to “messenger from afar arriving first.” Two years later, in August 2019, amateur astronomer Gennadiy Borisov discovered the only other known interstellar interloper, now called 2I/Borisov, using a self-built telescope at the MARGO observatory in Nauchnij, Crimea.

While typical asteroids and comets in the solar system orbit the Sun, ‘Oumuamua and Borisov are celestial nomads, spending most of their time wandering interstellar space. The existence of such interlopers in the solar system had been hypothesized, but scientists expected them to be rare. “I never thought we would see one,” says astrophysicist Susanne Pfalzner of the Jülich Supercomputing Center in Germany. At least not in her lifetime.

With these two discoveries, scientists now suspect that interstellar interlopers are much more common. Right now, within the orbit of Neptune alone, there could be around 10,000 ‘Oumuamua-size interstellar objects, estimates planetary scientist David Jewitt of UCLA, coauthor of an overview of the current understanding of interstellar interlopers in the 2023 Annual Review of Astronomy and Astrophysics.

Researchers are busy trying to answer basic questions about these alien objects, including where they come from and how they end up wandering the galaxy. Interlopers could also provide a new way to probe features of distant planetary systems.

But first, astronomers need to find more of them.

“We’re a little behind at the moment,” Jewitt says. “But we expect to see more.”

2I/Borisov appears as a fuzzy blue dot in front of a distant spiral galaxy (left) in this November 2019 image taken by the Hubble Space Telescope when the object was approximately 200 million miles from Earth.
2I/Borisov appears as a fuzzy blue dot in front of a distant spiral galaxy (left) in this November 2019 image taken by the Hubble Space Telescope when the object was approximately 200 million miles from Earth. CREDIT: NASA, ESA, AND D. JEWITT (UCLA)

Alien origins

At least since the beginning of the 18th century, astronomers have considered the possibility that interstellar objects exist. More recently, computer models have shown that the solar system sent its own population of smaller bodies into the voids of interstellar space long ago due to gravitational interactions with the giant planets.

Scientists expected most interlopers to be exocomets composed of icy materials. Borisov fit this profile: It had a tail made of gases and dust created by ices that evaporated during its close passage to the Sun. This suggests that it originated in the outer region of a planetary system where temperatures were cold enough for gases like carbon monoxide to have frozen into its rocks. At some point, something tossed Borisov, roughly a kilometer across, out of its system.

One potential culprit is a stellar flyby. The gravity of a passing star can eject smaller bodies, known as planetesimals, from the outer reaches of a system, according to a recent study led by Pfalzner. A giant planet could also eject an object from the outer regions of a planetary system if an asteroid or comet gets close enough for the planet’s gravitational tug to speed up the smaller body enough for it to escape its star’s hold. Close approaches can also happen when planets migrate across their planetary systems, as Neptune is thought to have done in the early solar system.

The interstellar interloper 2I/Borisov (large black dot) was discovered three months before it passed by the Sun, allowing astronomers to capture images of the object for about a year. Borisov’s path brought it within 180 million miles of Earth (large blue dot). The relative locations of Borisov and Earth are shown for three points in time.
The interstellar interloper 2I/Borisov (large black dot) was discovered three months before it passed by the Sun, allowing astronomers to capture images of the object for about a year. Borisov’s path brought it within 180 million miles of Earth (large blue dot). The relative locations of Borisov and Earth are shown for three points in time. CREDIT: Knowable Magazine

‘Oumuamua, on the other hand, is not what scientists expected. Observations suggest it is quite elongated—perhaps 240 meters long and as narrow as 40 meters. And unlike Borisov, it didn’t show any gas or dust activity, raising the possibility that it originated closer to its star where it was too warm for ices to form. If this was the case, a stellar flyby or giant planet probably would not have been able to pull the object out of its system. Instead, it may have been ejected during the death throes of its star: Pulses of gas from a dying star could push planets and planetesimals outward, destabilizing their orbits enough to send some of them flying into interstellar space.

It’s possible, however, that ‘Oumuamua did form in the cold outer reaches of its system and, as it neared the Sun, developed a gas tail that was not detected by telescopes. One clue is that the object sped up more than would be expected from the gravity of the solar system alone. A recent study suggests that such a boost could have come from small amounts of hydrogen outgassing that the telescopes didn’t detect. Several asteroids in our solar system may have gotten a similar boost from outgassing of water vapor, according to another study. Future observations by the James Webb Space Telescope, and by the JAXA Hayabusa2 Extended Mission (which will rendezvous with one of these solar system asteroids, known as “dark comets,” in 2031) may detect low levels of outgassing.

“We’ll have to wait and see, but they could be analogs of ‘Oumuamua,” says planetary scientist Darryl Seligman of Cornell University, coauthor with Jewitt of the review of interstellar interlopers.

Searching for nomads

More data, from more interlopers, may help resolve some of these questions. In order to gather these data, scientists will need better odds of detecting the objects when they pass through the solar system. “If Pan-STARRS1 didn’t observe where we did that particular night, it’s likely that ‘Oumuamua would never have been found,” says astronomer Robert Weryk, formerly of the University of Hawaii, who discovered the interloper in the telescope’s data.

The upcoming Legacy Survey of Space and Time at the Vera C. Rubin Observatory is expected to increase astronomers’ chances of finding these fast movers: Beginning as soon as 2025, the observatory’s telescope will image the entire visible southern sky every few nights, and its primary mirror has a diameter nearly seven meters larger than that of Pan-STARRS1, enabling it to see fainter objects, farther away. Once interlopers are detected, ground- and space-based telescopes will image them to try to determine what they are made of. And if a reachable target is discovered, the European Space Agency and the Japan Aerospace Exploration Agency’s Comet Interceptor, slated to launch in 2029, could be redirected to image the visitor up close.
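
The faintness claim comes down to collecting area. The sketch below is a rough, illustrative calculation rather than anything from the observatory teams: the 8.4-meter figure appears in the caption below, while the roughly 1.8-meter aperture assumed for Pan-STARRS1 is an outside number and should be treated as an assumption.

```python
import math

# Rough sketch of why a larger primary mirror can detect fainter objects.
# The 8.4 m Simonyi Survey Telescope aperture is cited in the caption below;
# the ~1.8 m aperture for Pan-STARRS1 is an assumed comparison value.
rubin_diameter_m = 8.4
panstarrs1_diameter_m = 1.8  # assumption

# Light-gathering power scales with collecting area, i.e. with diameter squared.
area_ratio = (rubin_diameter_m / panstarrs1_diameter_m) ** 2

# Converting that ratio to astronomical magnitudes (2.5 * log10 of the ratio)
# gives a rough sense of how much fainter the larger mirror can reach, all other
# factors (detectors, exposure time, sky conditions) held equal.
magnitude_gain = 2.5 * math.log10(area_ratio)

print(f"collecting-area ratio: about {area_ratio:.0f}x")
print(f"approximate limiting-magnitude gain: {magnitude_gain:.1f} magnitudes")
```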

The Vera C. Rubin Observatory in northern Chile will host the decade-long Legacy Survey of Space and Time, set to begin in 2025. The Observatory’s 8.4-meter Simonyi Survey Telescope will collect images at a rate that covers the entire visible sky every few nights, potentially allowing for the detection of more interstellar interlopers.
The Vera C. Rubin Observatory in northern Chile will host the decade-long Legacy Survey of Space and Time, set to begin in 2025. The Observatory’s 8.4-meter Simonyi Survey Telescope will collect images at a rate that covers the entire visible sky every few nights, potentially allowing for the detection of more interstellar interlopers. CREDIT: RUBINOBS / NSF / AURA / H. STOCKEBRAND

Eventually, astronomers hope to build a catalog of interstellar objects similar to the inventory of exoplanets, which has grown to over 5,500 entries since the first discovery in 1992. That future inventory could help researchers answer the long-standing question of how typical Earth and the solar system are. The compositions of a large sample of interstellar objects could yield clues about the makeup of objects in exoplanetary systems—including ones that might support life.

“Planetesimals are the building blocks of exoplanets,” says astronomer Meredith Hughes of Wesleyan University in Middletown, Connecticut. This means they “can provide information about the diversity of environments, including ones that could be habitable.”

Now, ‘Oumuamua is beyond the orbit of Neptune, and comet Borisov is almost as far. They will continue their journey back into interstellar space, where it’s anyone’s guess what will happen next. Perhaps they will spend an eternity wandering the vast voids of space, or maybe they will be captured by a star. Or they could collapse into a disk of evolving gas and dust in a new planetary system and begin their journeys all over again.

Astronomers estimate there could be more interstellar objects in the Milky Way than stars in the observable universe. Finding more of them will offer a new way to probe the mysteries of the cosmos.

“The really cool thing,” Pfalzner says, “is that interstellar objects come to us.”

This article originally appeared in Knowable Magazine, an independent journalistic endeavor from Annual Reviews. Sign up for the newsletter.

The post Why interstellar objects like ‘Oumuamua and Borisov may hold clues to exoplanets appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
February’s skies shine with Orion, a harmless comet, and an extra day of stargazing https://www.popsci.com/science/february-2024-cosmic-calendar/ Thu, 01 Feb 2024 13:00:00 +0000 https://www.popsci.com/?p=601007
The constellation Orion, the Hunter as seen in the northern winter Milky Way in February 2020. The Orion Nebula is the bright, overexposed pink glow below the Belt of Orion, while the curving arc of red is Barnard’s Loop, now thought to be a supernova remnant. The bright red glow at upper left is the Rosette Nebula. Red Betelgeuse was at its minimum the night this image was taken, at about the same brightness as Bellatrix to the right—Betelgeuse is usually about as bright as blue-white Rigel at the lower right.
The constellation Orion, the Hunter as seen in the northern winter Milky Way in February 2020. The Orion Nebula is the bright, overexposed pink glow below the Belt of Orion, while the curving arc of red is Barnard’s Loop, now thought to be a supernova remnant. The bright red glow at upper left is the Rosette Nebula. Red Betelgeuse was at its minimum the night this image was taken, at about the same brightness as Bellatrix to the right—Betelgeuse is usually about as bright as blue-white Rigel at the lower right. Alan Dyer/VW PICS/Universal Images Group via Getty Images

Leap Day is on February 29, 2024.

The post February’s skies shine with Orion, a harmless comet, and an extra day of stargazing appeared first on Popular Science.

]]>
The constellation Orion, the Hunter as seen in the northern winter Milky Way in February 2020. The Orion Nebula is the bright, overexposed pink glow below the Belt of Orion, while the curving arc of red is Barnard’s Loop, now thought to be a supernova remnant. The bright red glow at upper left is the Rosette Nebula. Red Betelgeuse was at its minimum the night this image was taken, at about the same brightness as Bellatrix to the right—Betelgeuse is usually about as bright as blue-white Rigel at the lower right.
The constellation Orion, the Hunter as seen in the northern winter Milky Way in February 2020. The Orion Nebula is the bright, overexposed pink glow below the Belt of Orion, while the curving arc of red is Barnard’s Loop, now thought to be a supernova remnant. The bright red glow at upper left is the Rosette Nebula. Red Betelgeuse was at its minimum the night this image was taken, at about the same brightness as Bellatrix to the right—Betelgeuse is usually about as bright as blue-white Rigel at the lower right. Alan Dyer/VW PICS/Universal Images Group via Getty Images
February 1 through 29–Orion, the Hunter dominates the night sky
February 14–Comet C/2021 S3 (PanSTARRS) closest approach to the sun
February 24–Full Snow Moon
February 29–Leap Day

February brings with it weather-forecasting rodents and romance, but a Valentine’s Day comet and Leap Day make February 2024 even more exciting. The shortest month of the year has a few solid opportunities for looking up at the night sky and catching unique celestial bodies. If you’re in the Northern Hemisphere, the winter chill makes the sky a little easier to see thanks to colder, less hazy air. Here are some of the cosmic events to keep your eye on with your Valentine (or groundhog).

[Related: Why we turn stars into constellations.]

February 1 through 29–Orion, the Hunter dominates the night sky

One of the brightest constellations in the sky will be dominant this month. Orion, the Hunter will be most visible in the sky towards the south after midnight local time. It’s best to look for the three stars that make up Orion’s Belt. These three stars form a straight line at the midsection of the Hunter. 

Over a dozen stars make up this constellation, but there are two particularly bright spots named Betelgeuse and Rigel. The red supergiant Betelgeuse shines on Orion’s right shoulder. Betelgeuse is only about 10 million years old, making it a baby compared to our nearly 5 billion-year old sun. The constellation’s brightest star is the blue supergiant Rigel, located towards The Hunter’s left foot. Rigel is about 8 million years old and is 36,000 degrees Fahrenheit at its surface.

February 14–Comet C/2021 S3 (PanSTARRS) closest approach to the sun

Comet C/2021 S3 (PanSTARRS) will reach its closest point to the sun–or perihelion–on Valentine’s Day. It is expected to shine at a relatively bright magnitude of about 7.3, so it should be fairly easy to spot with binoculars on a clear night. If you are in the northeastern United States, look towards the southeastern horizon at least two hours before dawn. The comet will reach its closest (but not dangerous) approach to Earth this year on March 14th.

[Related: Why leap years exist.]

February 24–Full Snow Moon

February’s full moon will reach its peak illumination at 7:30 a.m. EST on Saturday, February 24. It will still appear full Friday night. It will drift above the horizon towards the east around sunset and should reach its highest point in the sky at about midnight on Saturday.

The name snow moon is pretty straightforward, as February is known for heavy snowfall. It is also called the When the Bear Cubs are Born Moon or Makoonsag-gaa-nitaawaadi-giizis in Anishinaabemowin (Ojibwe), Midwinter Moon, or Tsha’tekohselha in Oneida, and the Little Sister of the Waning Moon or Tahch’awɛka Tehekuma in Tunica.

February 29–Leap Day

It’s not something you can see from Earth, but Leap Day is technically an astronomical event.

It takes our planet about 365.2422 days to make one complete revolution around the sun. That means there are about six extra hours in every year that are not included in the calendar year. So every four years, we have roughly 24 extra hours to add to the calendar at the end of February. If there were no Leap Day, annual events including the summer and winter solstices and the vernal and autumnal equinoxes would drift later and later in the year. According to NASA, it would take only 100 years for summer to start in mid-July instead of June.
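
For readers who want to check the math, here is a minimal sketch of that drift. It assumes nothing beyond the 365.2422-day year quoted above; the year counts in the loop are arbitrary illustration choices.

```python
# Minimal sketch: how far the seasons would drift against a calendar that never
# adds a leap day. The 365.2422-day tropical year is the figure quoted above.
TROPICAL_YEAR_DAYS = 365.2422
CALENDAR_YEAR_DAYS = 365  # a calendar with no Leap Day at all

for years in (4, 25, 100):
    drift_days = years * (TROPICAL_YEAR_DAYS - CALENDAR_YEAR_DAYS)
    print(f"after {years:3d} years, the seasons arrive ~{drift_days:.1f} days later")

# after   4 years: ~1 day  -> hence one extra calendar day every fourth February
# after 100 years: ~24 days -> roughly NASA's summer-starting-in-mid-July scenario
```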

The same rules that apply to pretty much all stargazing activities are key this month: Go to a dark spot away from the lights of a city or town and let your eyes adjust to the darkness for about half an hour.

The post February’s skies shine with Orion, a harmless comet, and an extra day of stargazing appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
JWST images show off the swirling arms of 19 spiral galaxies https://www.popsci.com/science/jwst-new-spiral-galaxies/ Mon, 29 Jan 2024 19:15:00 +0000 https://www.popsci.com/?p=600638
Face-on spiral galaxy, NGC 628, is split diagonally in this image: The James Webb Space Telescope’s observations appear at top left, and the Hubble Space Telescope’s on bottom right. JWST’s observations combine near- and mid-infrared light and Hubble’s showcase visible light. Dust absorbs ultraviolet and visible light, and then re-emits it in the infrared. In JWST’s images, we see dust glowing in infrared light. In Hubble’s images, dark regions are where starlight is absorbed by dust.
Face-on spiral galaxy, NGC 628, is split diagonally in this image: The James Webb Space Telescope’s observations appear at top left, and the Hubble Space Telescope’s on bottom right. JWST’s observations combine near- and mid-infrared light and Hubble’s showcase visible light. Dust absorbs ultraviolet and visible light, and then re-emits it in the infrared. In JWST’s images, we see dust glowing in infrared light. In Hubble’s images, dark regions are where starlight is absorbed by dust. NASA, ESA, CSA, STScI, Janice Lee (STScI), Thomas Williams (Oxford), PHANGS Team

The stars, dust, and gas swirling around black holes could reveal the origin of some of our universe’s most intricate structures.

The post JWST images show off the swirling arms of 19 spiral galaxies appeared first on Popular Science.

]]>
Face-on spiral galaxy, NGC 628, is split diagonally in this image: The James Webb Space Telescope’s observations appear at top left, and the Hubble Space Telescope’s on bottom right. JWST’s observations combine near- and mid-infrared light and Hubble’s showcase visible light. Dust absorbs ultraviolet and visible light, and then re-emits it in the infrared. In JWST’s images, we see dust glowing in infrared light. In Hubble’s images, dark regions are where starlight is absorbed by dust.
Face-on spiral galaxy, NGC 628, is split diagonally in this image: The James Webb Space Telescope’s observations appear at top left, and the Hubble Space Telescope’s on bottom right. JWST’s observations combine near- and mid-infrared light and Hubble’s showcase visible light. Dust absorbs ultraviolet and visible light, and then re-emits it in the infrared. In JWST’s images, we see dust glowing in infrared light. In Hubble’s images, dark regions are where starlight is absorbed by dust. NASA, ESA, CSA, STScI, Janice Lee (STScI), Thomas Williams (Oxford), PHANGS Team

Astronomers using the James Webb Space Telescope (JWST) have released new images of 19 nearby face-on spiral galaxies seen in a combination of near- and mid-infrared light. Spiral galaxies are some of the universe’s most awe-inspiring bodies. Their buff and wavy arms are chock full of stars arranged in a whirlpool pattern with vibrant colors and light. According to the European Space Agency (ESA), the most visually spectacular spiral galaxies are considered “face-on,” which means that their spiral arms and bulge are clearly visible.

[Related: Elliptical galaxies may just be spiral galaxies with their arms lobbed off.]

These new images combine years of data collected from multiple different telescopes to paint a more complete picture of these whirly spiral galaxies and how they form. 

“I feel like our team lives in a constant state of being overwhelmed–in a positive way–by the amount of detail in these images,” Thomas Williams, a postdoctoral researcher from the University of Oxford in the United Kingdom, said in a statement.

Tracing spiral arms

JWST’s Near-Infrared Camera (NIRCam) captured millions of stars that appear in blue tones in the new images. Some of the stars appear clumped tightly together in clusters, while others are spread along the spiral arms.

The telescope’s Mid-Infrared Instrument (MIRI) data shows where glowing space dust exists around and between the stars. It also shows some stars that have not fully formed. These stars are still encased in the dust and gas that fuel their growth.

A collection of 19 face-on spiral galaxies from the James Webb Space Telescope in near- and mid-infrared light. CREDIT: Image NASA, ESA, CSA, STScI, Janice Lee (STScI), Thomas Williams (Oxford), PHANGS Team. Designer: Elizabeth Wheatley (STScI)
A collection of 19 face-on spiral galaxies from the James Webb Space Telescope in near- and mid-infrared light. CREDIT: Image NASA, ESA, CSA, STScI, Janice Lee (STScI), Thomas Williams (Oxford), PHANGS Team. Designer: Elizabeth Wheatley (STScI)

“These are where we can find the newest, most massive stars in the galaxies,” Erik Rosolowsky, a physicist from the University of Alberta in Canada, said in a statement.

The JWST images also show large, spherical shells in the gas and dust. According to the team, these holes were potentially created by one or more stars that exploded. The explosion then carved out giant holes in interstellar material. 

The spiral arms also reveal the extended regions of gas that appear red and orange in the new images.  

“These structures tend to follow the same pattern in certain parts of the galaxies,” Rosolowsky added. “We think of these like waves, and their spacing tells us a lot about how a galaxy distributes its gas and dust.” 

Further research into these structures could provide key insights into how galaxies in our universe build, maintain, and stop star formation. 

Center of the galaxy

Spiral galaxies likely grow from the inside out. Stars will begin to form at the core of the galaxy before spreading along the arms and spiraling away from the center. The location of the stars can also provide clues to their ages. The younger stars are most likely the ones furthest away from the galaxy’s core. The areas closest to the core that appear to be illuminated by a blue spotlight are believed to be the older stars.

Face-on barred spiral galaxy, NGC 1512, is split diagonally in this image. The JWST’s observations appear at top left, and the Hubble Space Telescope’s on bottom right. JWST’s observations combine near- and mid-infrared light and Hubble’s showcase visible and ultraviolet light. Dust absorbs ultraviolet and visible light, and then re-emits it in the infrared. In JWST’s images, we see dust glowing in infrared light. In Hubble’s images, dark regions are where starlight is absorbed by dust. CREDIT: NASA, ESA, CSA, STScI, Janice Lee (STScI), Thomas Williams (Oxford), PHANGS Team
Face-on barred spiral galaxy, NGC 1512, is split diagonally in this image. The JWST’s observations appear at top left, and the Hubble Space Telescope’s on bottom right. JWST’s observations combine near- and mid-infrared light and Hubble’s showcase visible and ultraviolet light. Dust absorbs ultraviolet and visible light, and then re-emits it in the infrared. In JWST’s images, we see dust glowing in infrared light. In Hubble’s images, dark regions are where starlight is absorbed by dust. CREDIT: NASA, ESA, CSA, STScI, Janice Lee (STScI), Thomas Williams (Oxford), PHANGS Team

Galaxy cores that glow with pink and red spikes may be a sign of a giant, non-dormant black hole.

“That’s a clear sign that there may be an active supermassive black hole,” Eva Schinnerer, a staff scientist at the Max Planck Institute for Astronomy in Germany, said in a statement. “Or, the star clusters toward the center are so bright that they have saturated that area of the image.”

Sinking PHANGS into space

The images are part of a long-standing project called PHANGS–Physics at High Angular resolution in Nearby GalaxieS. It is supported by over 150 astronomers worldwide. Before JWST created the images, PHANGS was already analyzing large amounts of data from NASA’s Hubble Space Telescope, the Very Large Telescope’s Multi-Unit Spectroscopic Explorer, and the Atacama Large Millimeter/submillimeter Array. 

[Related: Bursting stars could explain why it was so bright after the big bang.]

These previous observations were taken in ultraviolet, visible, and radio light. JWST’s new near- and mid-infrared contributions have provided several pieces of evidence to the study of spiral galaxies. 

Face-on spiral galaxy, NGC 4535. The gas and dust stand out in stark shades of orange and red, and show finer spiral shapes with the appearance of jagged edges. These are some of the star-forming regions of the galaxy. Both older and younger stars appear blue in color. CREDIT: NASA, ESA, CSA, STScI, Janice Lee (STScI), Thomas Williams (Oxford), PHANGS Team.

“Webb’s new images are extraordinary,” Janice Lee, a project scientist for strategic initiatives at the Space Telescope Science Institute in Maryland, said in a statement. “They’re mind-blowing even for researchers who have studied these same galaxies for decades. Bubbles and filaments are resolved down to the smallest scales ever observed, and tell a story about the star formation cycle.”

In addition to these new images, the PHANGS team has also released the largest catalog to date of about 100,000 star clusters, which may help astronomers learn more about their stellar lives.

“Stars can live for billions or trillions of years,” Ohio State University astronomer Adam Leroy said in a statement. “By precisely cataloging all types of stars, we can build a more reliable, holistic view of their life cycles.”

The post JWST images show off the swirling arms of 19 spiral galaxies appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
The moon is shrinking (very slowly) https://www.popsci.com/science/moon-shrinking/ Fri, 26 Jan 2024 17:26:55 +0000 https://www.popsci.com/?p=600316
The full moon rises, with clouds below and behind it.
The full moon rises in Washington DC on Monday, March 9, 2020. NASA/Joel Kowsky

Some of the sites for future Artemis missions are vulnerable to landslides and 'moonquakes' from the resulting fault lines.

The post The moon is shrinking (very slowly) appeared first on Popular Science.

]]>
The full moon rises, with clouds below and behind it.
The full moon rises in Washington DC on Monday, March 9, 2020. NASA/Joel Kowsky

Earth’s moon is a constant in the night sky, following predictable phases in its orbit. However, its size likely has been changing over time. A study published January 25 in the Planetary Science Journal found that the moon has shrunk more than 150 feet in circumference as its core gradually cooled over the past few hundred million years. 
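
To put that figure in perspective, a quick back-of-the-envelope conversion (not part of the study itself) shows how small the corresponding change in the moon's radius is. The roughly 1,080-mile lunar radius used below is an assumed round number included only for scale.

```python
import math

# Back-of-the-envelope sketch: convert "more than 150 feet of circumference" into
# a change in radius via C = 2 * pi * r.
circumference_loss_ft = 150  # figure quoted from the study coverage above
radius_loss_ft = circumference_loss_ft / (2 * math.pi)

moon_radius_ft = 1080 * 5280  # assumed mean lunar radius (~1,080 miles), in feet
fractional_shrinkage = radius_loss_ft / moon_radius_ft

print(f"radius change: roughly {radius_loss_ft:.0f} feet")
print(f"fractional shrinkage: about {fractional_shrinkage:.1e}")
```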

[Related: The moon is 40 million years older than we thought, according to crystals collected by Apollo astronauts.]

A team of scientists from NASA, the Smithsonian, Arizona State University, and The University of Maryland discovered evidence that the continuing shrinkage led to some surface changes around the Lunar South Pole. The terrain has even changed in areas where NASA hopes to land during the crewed Artemis III mission.

How the moon is like a grape

This lunar shrinking process looks similar to how a grape wrinkles when it becomes a raisin. The moon also wrinkles and creases as it shrinks down. However, a grape has a flexible skin, while the moon has a brittle surface. The brittleness causes faults to form where sections of the crust push up against each other.

The fault formation caused by this continued shrinking often comes with seismic activity like moonquakes. Any locations near these moon fault zones could pose a threat to human exploration there, the same way that those living near fault lines on Earth face a greater risk of earthquakes. 

The epicenter of one of the strongest moonquakes recorded by the Apollo Passive Seismic Experiment was located in the lunar south polar region. However, the exact location of the epicenter could not be accurately determined. A cloud of possible locations (magenta dots and light blue polygon) of the strong shallow moonquake using a relocation algorithm specifically adapted for very sparse seismic networks are distributed near the pole. Blue boxes show the locations of proposed Artemis III landing regions. Lobate thrust fault scarps are shown by small red lines. The cloud of epicenter locations encompasses a number of lobate scarps and many of the Artemis III landing regions. CREDIT: NASA/LRO/LROC/ASU/Smithsonian Institution.

In the new study, the team linked a group of faults in the moon’s south polar region to a powerful moonquake recorded by Apollo seismometers over 50 years ago. They used computer models to simulate the stability of surface slopes here and found that some areas in particular were vulnerable to lunar landslides from the seismic activity.

“Our modeling suggests that shallow moonquakes capable of producing strong ground shaking in the south polar region are possible from slip events on existing faults or the formation of new thrust faults,” Thomas R. Watters, study co-author and senior scientist emeritus in the National Air and Space Museum, said in a statement. “The global distribution of young thrust faults, their potential to be active and the potential to form new thrust faults from ongoing global contraction should be considered when planning the location and stability of permanent outposts on the moon.”

Shaking for hours

Shallow moonquakes occur only about 100 miles deep in the moon’s crust. They are caused by faults and can be strong enough to damage equipment and human-made structures. Earthquakes tend to last for only a few seconds or minutes at most, but shallow moonquakes can last for hours, even a whole afternoon. The team connected the magnitude 5 moonquake recorded by the Apollo Passive Seismic Network in the 1970s to a group of faults detected more recently by the Lunar Reconnaissance Orbiter. This means that this kind of seismic activity could devastate any future hypothetical settlements on the moon.

[Related: 10 incredible lunar missions that paved the way for Artemis.]

“You can think of the moon’s surface as being dry, grounded gravel and dust. Over billions of years, the surface has been hit by asteroids and comets, with the resulting angular fragments constantly getting ejected from the impacts,” study co-author and University of Maryland geologist Nicholas Schmerr, said in a statement. “As a result, the reworked surface material can be micron-sized to boulder-sized, but all very loosely consolidated. Loose sediments make it very possible for shaking and landslides to occur.”

Lunar Reconnaissance Orbiter Camera (LROC), Narrow Angle Camera (NAC) mosaic of the Wiechert cluster of lobate scarps (left pointing arrows) near the lunar south pole. A thrust fault scarp cut across an approximately 1-kilometer (0.6-mile) diameter degraded crater (right pointing arrow). CREDIT: NASA/LRO/LROC/ASU/Smithsonian Institution.

The team will continue to map out this seismic activity on the moon, hoping to pinpoint more locations that could be dangerous for human exploration. NASA’s Artemis missions are currently scheduled to launch their first crewed flight in September 2025, with a crewed moon landing scheduled for September 2026. One of the ultimate goals of these future missions is a long-term human presence on the moon.

“As we get closer to the crewed Artemis mission’s launch date, it’s important to keep our astronauts, our equipment and infrastructure as safe as possible,” Schmerr said. “This work is helping us prepare for what awaits us on the moon—whether that’s engineering structures that can better withstand lunar seismic activity or protecting people from really dangerous zones.”

The post The moon is shrinking (very slowly) appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
RIP Mars Ingenuity, the ‘little helicopter that could’ https://www.popsci.com/science/rip-ingenuity-mars-helicopter/ Fri, 26 Jan 2024 15:09:07 +0000 https://www.popsci.com/?p=600283
Ingenuity rotocopter on Mars
Goodnight, sweet prince. NASA/JPL-Caltech/ASU/MSSS

NASA confirms its historic engineering feat finally succumbed to the Red Planet’s hazards, after surviving 33 times longer than expected.

The post RIP Mars Ingenuity, the ‘little helicopter that could’ appeared first on Popular Science.

]]>
Ingenuity rotocopter on Mars
Goodnight, sweet prince. NASA/JPL-Caltech/ASU/MSSS

Ingenuity—NASA’s tiny, overachieving Mars rotocopter—has officially ended its historic mission after three years of loyal, extended service. Despite initial plans to only conduct five-or-so test flights over roughly 30 days back in 2021, the four-pound, 19-inch-tall drone kept on trucking for another three years. Ingenuity ultimately spent over two hours buzzing through the Red Planet’s thin, CO2-laden atmosphere during its 72 total flights, eventually traversing a whopping distance of roughly 11 miles.

On January 25, however, NASA confirmed its rotocopter damaged at least one blade while completing a flight on January 18. Although upright and still in communication with ground control, Ingenuity’s days of aerial exploration are definitely behind it.


Dubbed “the little helicopter that could” by NASA Administrator Bill Nelson in a prerecorded message posted yesterday, Ingenuity “flew higher and farther than we ever imagined.”

“Through missions like Ingenuity, NASA is paving the way for future flight in our solar system and smarter, safer human exploration to Mars and beyond,” he continued.

The helicopter touched down alongside the Perseverance rover way back on February 18, 2021, but continued setting new records as recently as last month. On December 20, 2023, Ingenuity sped along at nearly 22.5 mph for 135 seconds, covering about 2,315 feet in the process. Another successful flight ensued on December 22, but Ingenuity’s 71st mission unfortunately ended in an emergency landing. A planned vertical takeoff to confirm its location on January 18 allowed Ingenuity to ascend 40 feet into the air for 4.5 seconds before starting a slow descent to the Martian surface.

NASA’s Ingenuity Mars Helicopter captured this view of sand ripples during its 70th flight, on Dec. 22, 2023. The smooth, relatively featureless terrain proved difficult for the helicopter’s navigation system to track during Flight 72, on Jan. 18, 2024, resulting in a rough landing. Credits: NASA/JPL-Caltech.

At about three feet from landing, however, the rotocopter lost contact with Perseverance, which is (among many other things) responsible for relaying Ingenuity’s data back to Earth. NASA reestablished a link the following day, but later identified significant rotor blade damage.

[Related: NASA’s Ingenuity helicopter set a new flight distance record on Mars.] 

“Ingenuity is an exemplar of the way we push the boundaries of what’s possible every day,” Laurie Leshin, director of NASA’s Jet Propulsion Laboratory, said in yesterday’s announcement. “I’m incredibly proud of our team behind this historic technological achievement and eager to see what they’ll invent next.”

According to NASA’s final tally, Ingenuity lived up to its name for nearly 1,000 Martian days—around 33 times longer than anticipated. During its tenure, the rotocopter received a software update beamed through space that allowed it to autonomously select the best landing sites, weathered destructive dust storms, contended with a dead sensor, and lived through Martian winter temperatures as low as -112 degrees Fahrenheit.

Fare thee well, Ingenuity. For a trip down memory lane, check out NASA’s official mission website.

The post RIP Mars Ingenuity, the ‘little helicopter that could’ appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Japan’s SLIM lunar lander stuck the landing—upside down https://www.popsci.com/science/slim-lunar-lander-upside-down/ Thu, 25 Jan 2024 21:00:00 +0000 https://www.popsci.com/?p=600209
Image taken of JAXA SLIM lunar lander on moon upside down
Eh. Close enough. JAXA/Takara Tomy/Sony Group Corporation/Doshisha University

Despite the inverted arrival, JAXA gives its ‘Moon Sniper’ a ‘perfect score.’

The post Japan’s SLIM lunar lander stuck the landing—upside down appeared first on Popular Science.

]]>
Image taken of JAXA SLIM lunar lander on moon upside down
Eh. Close enough. JAXA/Takara Tomy/Sony Group Corporation/Doshisha University

Here’s the good news: Japan’s space agency confirmed its historic Smart Lander for Investigating Moon (SLIM) successfully touched down last week with near pinpoint accuracy. 

The bad news? SLIM did it upside-down.

The Japan Aerospace Exploration Agency (JAXA) confirmed the topsy-turvy predicament on Thursday, thanks to images received from a pair of autonomous probes dispatched by SLIM shortly ahead of touchdown. Regardless of its positioning, however, JAXA project manager Shinichiro Sakai gave the endeavor a “perfect score.”

“Something we designed traveled all the way to the moon and took that snapshot. I almost fell down when I saw it,” he told the Associated Press on January 25. “We demonstrated that we can land where we want. We opened a door to a new era.”

[Related: Japan makes history with its first uncrewed moon landing.]

Japan is now the fifth nation to make it to the moon’s surface, but differentiates its feat through its precision. Although lunar landers previously aimed for landing zones as large as six miles wide, SLIM lived up to its “Moon Sniper” nickname. Following a few days of analysis, JAXA confirmed the craft landed barely 180 feet from its already impressive 330-foot-wide target—well within the hopes of JAXA engineers. SLIM now resides close to the Shioli crater on the moon’s near side.

Officials have now confirmed, however, that the lander’s main engines malfunctioned during its descent, an estimated 162 feet above the surface. This loss of thrust resulted in a slightly rougher touchdown than planned, likely influencing its current inverted position. Due to SLIM’s now-perpetual handstand, its solar panels are angled in the wrong direction. Without any reliable access to the sun’s energy, SLIM is basically powerless—at least for the time being. JAXA officials believe there still may be a chance for their lander to juice back up in a few days’ time, once the angle of sunlight shifts later in the lunar day and reaches its panels.

Map with annotations of SLIM lunar lander's position
Lunar topography captured by the Indian spacecraft Chandrayaan-2, overlaid with images acquired by the SLIM navigation camera during the HV2 (second hovering) at an altitude of about 50m. The two blue frames are images acquired during the obstacle detection at HV2. As the spacecraft subsequently enters the obstacle avoidance operation, the performance of the pinpoint landing is evaluated based on the positional accuracy at this point. The positional accuracy at the time of the first and second obstacle detection was respectively about 3 – 4m and 10m. Note that it is highly likely that the main engine was already affected by the loss of function when the second obstacle detection occurred. The SLIM footprint in the red frame is the safe landing zone set autonomously by SLIM based on the obstacle detection during HV2. Credit: Chandrayaan-2:ISRO/SLIM:JAXA

But even if SLIM is destined to take an indefinite, much deserved nap, its mission has already provided researchers an initial batch of data. The lander’s two tiny drones, LEV-1 and LEV-2, transmitted a recording of their mothership’s landing alongside 275 images back home.

SLIM arguably marks one of JAXA’s biggest accomplishments in years. In 2003, the agency’s Hayabusa probe began its two-year journey to the 1,000-foot-long asteroid Itokawa. After briefly touching down on the asteroid in 2005, Hayabusa lifted off again and finally returned to Earth in 2010 with samples in tow—a first in space exploration. JAXA repeated a similar mission with Hayabusa2, which returned from its sojourn to the asteroid Ryugu in 2020.

The lunar win also likely provides a welcome morale boost for the nation’s space enthusiasts. Last April, private Japanese company ispace’s Hakuto-R lander made it to the moon’s orbit, but promptly crashed during its descent attempt.

The post Japan’s SLIM lunar lander stuck the landing—upside down appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>
Check out JWST’s new image of a star factory https://www.popsci.com/science/jwst-star-factory-nebula/ Thu, 25 Jan 2024 16:30:00 +0000 https://www.popsci.com/?p=600128
A bright young star within a colorful nebula. The star is identifiable as the brightest spot in the image, surrounded by six large spokes of light that cross the image. A number of other bright spots can also be seen in the clouds, which are shown in great detail as layers of colorful wisps.
An image from the NASA/ESA/CSA James Webb Space Telescope features an H II region in the Large Magellanic Cloud, a satellite galaxy of our Milky Way. This nebula, known as N79, is a region of interstellar atomic hydrogen that is ionized, captured here by Webb’s Mid-InfraRed Instrument (MIRI). ESA/Webb, NASA & CSA, M. Meixner

The extragalactic, starburst-patterned nebula N79 is about 1,630 light-years wide.

The post Check out JWST’s new image of a star factory appeared first on Popular Science.

]]>
A bright young star within a colorful nebula. The star is identifiable as the brightest spot in the image, surrounded by six large spokes of light that cross the image. A number of other bright spots can also be seen in the clouds, which are shown in great detail as layers of colorful wisps.
An image from the NASA/ESA/CSA James Webb Space Telescope features an H II region in the Large Magellanic Cloud, a satellite galaxy of our Milky Way. This nebula, known as N79, is a region of interstellar atomic hydrogen that is ionized, captured here by Webb’s Mid-InfraRed Instrument (MIRI). ESA/Webb, NASA & CSA, M. Meixner

Stars are being born in a stellar new image from the James Webb Space Telescope (JWST). The telescope imaged the 1,630-light-year-wide nebula N79, located in the Large Magellanic Cloud (LMC), a satellite galaxy of the Milky Way that is almost 200,000 light-years away from Earth. It’s possible that the LMC could crash into our home galaxy in about two billion years.

[Related: Astronomers spot an extragalactic star with a disc around it for the first time.]

The image uses orange, yellow, and blue filters to showcase stellar nurseries forming new stars, and astronomers are still exploring it. JWST’s Mid-InfraRed Instrument (MIRI) captured the image, which features ionized interstellar atomic hydrogen.

Seeing ‘starbursts’ more clearly

This new image homes in on a region called N79 South, or S1. The area is made up of three giant clumps of cold gas known as molecular clouds. The bright starburst effect at the center is due to the diffraction spikes created by the 18 hexagonal segments that make up JWST’s primary mirror as they collect light. Because these segments are arranged in a hexagonal pattern, there are six main diffraction spikes.

“Patterns like these are only noticeable around very bright, compact objects, where all the light comes from the same place,” the European Space Agency (ESA) wrote in a press release. “Most galaxies, even though they appear very small to our eyes, are darker and more spread out than a single star, and therefore do not show this pattern.”

Even for galaxies that appear small, the arrangement of JWST’s mirror segments helps the telescope collect more of their light.
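
For the curious, here is a minimal sketch of where those six spikes come from. It approximates the segmented primary as a single filled hexagonal aperture and computes the far-field diffraction pattern of a point source numerically; the grid and hexagon sizes are arbitrary illustration choices, not JWST specifications.

```python
import numpy as np

# Minimal sketch: the far-field (Fraunhofer) image of a point source seen through
# an aperture is proportional to |FFT(aperture)|^2. A hexagonal outline sends
# diffracted light into six spikes, one for each pair of parallel edges.
N = 512                  # grid size in pixels (arbitrary)
R = N // 6               # hexagon circumradius in pixels (arbitrary)
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]

# A regular hexagon is the intersection of three slabs rotated by 60 degrees;
# each slab's half-width equals the hexagon's apothem, R * sqrt(3) / 2.
aperture = np.ones((N, N), dtype=bool)
for angle in (0.0, np.pi / 3, 2 * np.pi / 3):
    u = x * np.cos(angle) + y * np.sin(angle)
    aperture &= np.abs(u) <= R * np.sqrt(3) / 2

# Far-field intensity pattern; a log stretch makes the faint spikes visible.
intensity = np.abs(np.fft.fftshift(np.fft.fft2(aperture))) ** 2
log_image = np.log10(intensity / intensity.max() + 1e-12)

# Away from the bright core, the largest values in log_image trace six straight
# spikes -- the same geometry behind the starburst pattern in the N79 image.
print("image shape:", log_image.shape)
```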

The telescope’s Mid-InfraRed Instrument also captures light at longer wavelengths, and this new view uses that capability to showcase N79’s glowing gas and dust. Mid-infrared light can reveal what is going on deep inside these gas and dust clouds; shorter wavelengths of light would be scattered or absorbed by the nebula’s dust grains.

Nebula siblings

N79 is considered to be a younger sibling of the Tarantula Nebula, which is about 161,000 light-years from Earth. While the two are similar, astronomers believe that N79 has been forming stars twice as fast as the Tarantula Nebula. 

[Related: The Running Chicken Nebula shimmers in new ESO image.]

“Star-forming regions such as this are of interest to astronomers because their chemical composition is similar to that of the gigantic star-forming regions observed when the Universe was only a few billion years old and star formation was at its peak,” said the ESA.

The star-forming regions in our galaxy aren’t producing stars at quite the same breakneck pace as N79. They also have different chemical compositions. JWST is helping astronomers compare and contrast observations of star formation in N79 with its deep observations of distant galaxies in the early Universe.

Astronomers are also hoping to get a look at the planet-forming disks of material that surround young stars resembling our sun. An image like that could give us a better idea of how our solar system formed roughly 4.6 billion years ago.

The post Check out JWST’s new image of a star factory appeared first on Popular Science.

Articles may contain affiliate links which enable us to share in the revenue of any purchases made.

]]>