
The Future of Deep-Sky Astrophotography

The post The Future of Deep-Sky Astrophotography appeared first on Sky & Telescope.


Astrophotography: Tips & Techniques

By Richard S. Wright Jr.
February 20, 2026


Deep-sky astrophotography is rapidly evolving, and in some ways, the future is already here.

Daniel Hertzberg

It seems that every few years someone states, “We are in the golden age of astrophotography!” This is usually followed by a string of examples explaining why they believe it to be so. Here we are again at a crossroads, with astro-imaging gaining traction in popular culture beyond the amateur astronomy community. And much of this is driven by smartphones, microcomputers, CMOS detectors, and even gaming consoles.

Deep-sky astrophotography is again at the cusp of a revolution. It’s an amazing time to be sure, and we’re seeing things we never thought would be possible — mounts so accurate they don’t require guiding corrections during long exposures, cameras more sensitive than ever, and tiny computers that ride along with the telescope while orchestrating the tasks of a host of accessories. Astrophotography has an amazing journey ahead. Here’s where I see it going next.

The Times They Are A-changin’

We saw one revolution already at the close of the 20th century. Film astrophotography, with its chemical processes, darkroom artistry, and gas-hypering alchemy, gave way to digital cameras, specialized computer software, and the virtual darkroom. But even that familiar scenario will eventually become the exception rather than the rule.

Everything having to do with astrophotography is changing, including the optics we use. Commercially produced telescopes and camera lenses are better than ever, thanks both to advances in computer-formulated optical designs and the addition of new materials such as extra-low-dispersion glasses. Fast, high-quality optics are easily accessible and more affordable than ever. For example, with a mirrorless camera, a tripod, and an f/1.4 lens, you can now record a Milky Way image in under 10 seconds with equipment that doesn’t cost a small fortune. An f/7 telescope, once considered photographically “fast” by deep-sky imagers, is now passed over unless it’s paired with a reducer/flattener. Scopes with focal ratios of f/5 and f/4 are now the norm, with even f/2 instruments available from several manufacturers. I currently own three telescopes that produce sharp, round stars across a full-frame camera sensor at f/3 and faster — a feat that was until recently considered prohibitively expensive.
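As a back-of-the-envelope illustration (my arithmetic, not from the article): for the same sky and sensor, the exposure time needed for equal image brightness scales with the square of the focal ratio, which is why fast glass matters so much for short Milky Way exposures.

```python
# Illustrative sketch (not from the article): exposure time for equal image
# brightness scales with the square of the focal ratio.
def relative_exposure(f_ratio, reference=1.4):
    """Exposure time needed at f_ratio, relative to a reference f/1.4 lens."""
    return (f_ratio / reference) ** 2

# A 10-second Milky Way exposure at f/1.4 would need about 25x longer at f/7:
print(round(10 * relative_exposure(7.0)), "seconds")
```

By this simple scaling, the 10-second f/1.4 shot described above would stretch to several minutes at f/7, long enough to demand an accurate tracking mount.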

The same goes for the detectors in our cameras. CMOS chips today exceed the performance of CCDs. Some CMOS camera manufacturers are closing in on the effective elimination of read noise and achieving the maximum quantum efficiencies possible for digital detectors. What’s next? I have some ideas, and I’m not the only one.

[Image: The Milky Way, captured with a Canon EOS Ra at ISO 3200 and an f/1.4 Sigma Art lens on a stationary tripod]

*FAST AND DEEP This image of the Milky Way was recorded with a single, 10-second exposure made with a Canon EOS Ra operating at ISO 3200 and an f/1.4 Sigma Art lens on a stationary tripod. The same photograph taken with a DSLR a dozen years ago would have been riddled with noise and would have barely shown our galaxy at all.
Richard S. Wright, Jr.*

Computational Photography

When a process can’t be improved, the only thing left to do is to change the process. Any student of modern history understands that today’s amazing, leading-edge, expensive technology is only a few years away from being cheap and small enough to fit in your pocket or strap to your wrist. A good example is my Apple Watch. It can monitor my heart rate, read my blood-oxygen level, make international phone calls for free, and, if I ask it to, find all the dog photos on my paired smartphone. We already live in the future!

The next big buzzword creeping into the conversation about astro-imaging is computational photography. In fact, computational photography is already revolutionizing photography. You’re most likely already taking it for granted.

[Image: The Arsenal 2 camera accessory with its Graphics Processing Unit (GPU)]

*AI FOR YOUR DSLR Products like the Arsenal 2 add a Graphics Processing Unit (GPU) to your camera for remote-controlled computational photography assistance. This device adds new capabilities to your camera, including focus stacking, high-dynamic-range compositing, and even accurate auto-focusing on star fields.
Richard S. Wright, Jr.*

Computational photography is when computer processing is used to aid or improve photography in-camera. We already have handheld cameras (in your smartphone) that combine multiple exposures of varying lengths to create high-dynamic-range (HDR) images. That same smartphone camera will take a dozen or more very short exposures in low light, then align and combine them to make a single, low-noise image — all happening automatically in your device while you watch. Today I can take a 5-second, handheld exposure in low light without a tripod, and it won’t be a smeared mess. These are all examples of computer processing being applied while the image is recorded to dramatically improve the capabilities of our cameras and the quality of our photos.
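The multi-frame trick can be sketched numerically (my own simulation, not the article's): averaging N aligned short exposures reduces random noise by roughly the square root of N.

```python
# Simulated multi-frame stacking (illustrative, not from the article):
# averaging N aligned short exposures cuts random noise by about sqrt(N).
import numpy as np

rng = np.random.default_rng(42)
signal = 100.0                                  # "true" pixel value
n_frames = 16
frames = signal + rng.normal(0.0, 10.0, size=(n_frames, 64, 64))

single_noise = frames[0].std()                  # noise in one short exposure
stacked = frames.mean(axis=0)                   # frames are pre-aligned here
stacked_noise = stacked.std()                   # roughly 10 / sqrt(16) = 2.5

print(f"one frame: {single_noise:.1f}  stack of {n_frames}: {stacked_noise:.1f}")
```

In a real camera the hard part is the alignment step, which is skipped here by generating frames that already overlap; the noise statistics are the point of the sketch.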

This technology is still in its infancy, as improving computational power continues to increase the capabilities of image- and graphics-processing technologies. In every case, industries outside of astrophotography are driving these technologies forward. You’re probably familiar with a Central Processing Unit (CPU) — the “brain” at the heart of most computer systems. Another acronym you may be less familiar with is GPU, or Graphics Processing Unit, a term coined by the hardware graphics accelerator company NVIDIA in 1999. Today, GPUs power 3D simulations and games on computers and gaming consoles, and the performance improvements they’ve made possible are staggering. Computer images are simply numbers, and a modern GPU can perform trillions of numeric calculations every second, making those 3D-rendered games run smoothly. Many of the latest advances in our smartphone-camera capabilities have arisen because your smartphone, just like your laptop or desktop computer, now comes with an integrated GPU.

Live Stacking

Sure, this technology is amazing, but what does it have to do with astrophotography?

[Image: Smartscopes imaging the Running Man Nebula]

*AUTOMATED IMAGING This is the future: completely integrated imaging systems like the Unistellar eVscope 2 (left) and Vaonis Stellina (middle) that perform live stacking and image processing while you watch. Results like the image of the Running Man Nebula (right) excite beginners, and eventually these systems will deliver extremely high-quality results.
Richard Berry*
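The core idea behind live stacking can be sketched in a few lines (an illustrative sketch of the technique, not any vendor's actual firmware): the displayed preview is a running average that is refined as each new frame arrives, so the image gets steadily cleaner while you watch.

```python
# Illustrative live-stacking loop (my sketch, not any vendor's firmware):
# the preview is a running mean, updated incrementally as each frame arrives.
import numpy as np

def live_stack(frames):
    """Yield the running-mean preview after each incoming frame."""
    mean = None
    for n, frame in enumerate(frames, start=1):
        if mean is None:
            mean = frame.astype(float)
        else:
            mean += (frame - mean) / n          # incremental mean update
        yield mean.copy()                       # snapshot of the preview

rng = np.random.default_rng(0)
frames = 50.0 + rng.normal(0.0, 8.0, size=(32, 16, 16))  # simulated frames
previews = list(live_stack(frames))
print(f"noise after 1 frame: {previews[0].std():.2f}, "
      f"after 32 frames: {previews[-1].std():.2f}")
```

The incremental update gives the same result as averaging all frames at the end, but it never needs to hold more than one frame plus the running mean in memory, which is why small onboard computers can do it in real time.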

[...]

