How Does NASA Take Photos of the Sun?

A brief guide to the fascinating world of solar photography.

A tiled composite image of the sun depicting multiple wavelengths of light.


At a time when the sun was still widely thought to revolve around the Earth, the famed astronomer Galileo Galilei was tracing its reflection onto a sheet of paper. By aiming a telescope at the sun and projecting the image from the eyepiece onto a screen, he was able to trace patterns on the sun's face, providing some of the earliest telescopic evidence of sunspots.

This week, NASA scientists—using a far more advanced light capture system than Galileo’s—observed a massive square-shaped dark spot on the surface of the sun.

The spot looks eerily like a hole, but is actually just a large area where solar winds are shooting out at superfast speeds—also known as a "coronal hole."

It’s a stunning image, but how exactly did scientists capture it? Better question: How do they capture any images of the sun at all?

You can trace this obsession with solar photography back to the earliest photographs.

As any child learns early on, staring at the sun for too long can blind you. And the big fireball in the sky usually blinds cameras, too. Even if you crank up the shutter speed and shrink the aperture on a typical camera, you’re likely to be left with a featureless blot of light where Sol is supposed to be.
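The exposure arithmetic behind that featureless blot can be sketched with the standard exposure-value formula, EV = log2(N²/t), for aperture N and shutter time t. The numbers below are illustrative assumptions, not measurements from the article: even a camera's most extreme settings top out around EV 22, while the uneclipsed solar disk is often ballparked in the low EV 30s at base ISO, which is why photographers reach for very dense solar filters.

```python
import math

# Illustrative sketch (not from the article): exposure value
# EV = log2(N^2 / t) for f-number N and shutter time t in seconds.
def exposure_value(aperture: float, shutter_seconds: float) -> float:
    return math.log2(aperture ** 2 / shutter_seconds)

# An extreme combination on a typical camera: f/22 at 1/8000 s.
ev_max = exposure_value(22, 1 / 8000)
print(round(ev_max, 1))  # -> 21.9

# The bare solar disk is commonly ballparked around EV 33 at ISO 100
# (an assumption here), roughly 11 stops brighter than ev_max --
# hence the blown-out blot without a dense solar filter.
```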

Scientists and amateur shooters have been trying to get around this problem since the earliest days of photography. In 1845, French physicists Hippolyte Fizeau and Léon Foucault captured a rudimentary image of the sun with an exposure of just 1/60th of a second—super fast for the time. This was less than seven years after Louis Daguerre publicly unveiled the daguerreotype process.


Courtesy: National Science Foundation, High Altitude Observatory

Eventually, through decades of experimentation, these pioneers discovered the secret: To capture the sun, you need to record the light we can't see.

Today, the stunning solar images NASA pumps out—along with pictures of distant galaxies and gaseous nebulae—are captured by creating composite images of ultraviolet and infrared light.


Some of the most famous images of the sun—the ones that depict its corona in all its gaseous majesty—are the result of cameras that filter out all but extreme ultraviolet light.

That means the vibrant yellows and oranges are not, in fact, accurate color renderings—they're added after the fact, to make the photos more appealing to the human eye.

Dean Pesnell, a project scientist at NASA, explained in an interview with the National Journal that the colors are totally arbitrary. “Otherwise [...] there would be all these black and white images and it wouldn't be nearly as fun.”
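The false-coloring step can be sketched in a few lines. An extreme-ultraviolet detector records intensity only—a grayscale frame per wavelength channel—and a display color is assigned arbitrarily to each channel afterward. This is a minimal NumPy sketch; the gold tint for a hypothetical 171 Å channel is illustrative, not NASA's actual palette:

```python
import numpy as np

def colorize(intensity, tint):
    """Turn a 2-D grayscale intensity array (values 0..1) into an RGB
    image by scaling an arbitrarily chosen RGB tint (values 0..1)."""
    intensity = np.clip(intensity, 0.0, 1.0)
    # Broadcasting: (H, W, 1) * (3,) -> (H, W, 3)
    return intensity[..., np.newaxis] * np.asarray(tint, dtype=float)

# Fake 4x4 "detector readout" with one bright pixel.
frame = np.zeros((4, 4))
frame[1, 2] = 1.0

gold = (1.0, 0.8, 0.2)  # arbitrary display color for this channel
rgb = colorize(frame, gold)
print(rgb.shape)  # -> (4, 4, 3)
```

Compositing several such channels—as in the six-wavelength flare image below—amounts to blending these tinted layers into one frame.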


The technology used to capture these wavelengths is extremely complicated, but even NASA's most cutting-edge imaging devices are fundamentally not so different from the cameras used by Hippolyte Fizeau and Léon Foucault during the Polk administration.


The most advanced of these systems is probably NASA’s Solar Dynamics Observatory (SDO), which Pesnell is involved in operating. This geosynchronous satellite was launched in 2010 and is designed to directly observe the sun for a period of five years.

NASA’s goal for the SDO—and really all solar imaging systems—is not merely to capture breathtaking pictures of the sun. It's to study how the sun’s magnetic field is generated, and how that stored energy is released as solar wind, energetic particles, and radiation that affect Earth.


A solar flare captured by the SDO in six different wavelengths.

A precise understanding of the sun’s electromagnetic processes could allow researchers to predict solar storms. A single high-intensity solar storm could wipe out power grids on a national scale and wreak havoc on the world economy, so an early-warning system could have real benefits. In that sense, the SDO is more like a storm-chasing device than a traditional camera.


Of course, the SDO is not the only piece of technology that allows us to snap amazing photographs of the sun. The Solar and Heliospheric Observatory—a joint project between NASA and the European Space Agency—was launched back in 1995. Since then it has discovered more than 2,700 comets and provided researchers with near-real-time data on solar activity, including some breathtaking images.

And as for you? It’s not impossible to take your own pictures of the sun, but even if you invest in protective filters, high-powered telescopes, expensive digital backs, and safety equipment, you’re not likely to capture its violent beauty at NASA-level quality.

Unless, y'know, you can find a way to get into space yourself.

The Sun and Its Wavelengths

These images show the variety of ways the SDO observes the sun.
