Improving Mobile Device Astrophotography

Explore my latest computational photography project focused on enhancing astrophotography on mobile devices.


Kevin Sorto

3/1/2024 · 8 min read

My research project aims to revolutionize mobile device astrophotography by leveraging advanced computational techniques. In this blog post, we will discuss the challenges in capturing stunning astrophotography with smartphones and share our novel approach to overcome these limitations. Join me on this exciting journey to push the boundaries of mobile photography.

brown concrete building under starry night

What is Astrophotography?

When taking pictures of the night sky with a mobile device, many issues arise, as devices are usually not equipped with the sensors or processing techniques needed to produce a sharp image. Luckily, this has started to change, and companies have been greatly improving their devices' cameras for low-light shooting.

The Apple iPhone 13, for example, includes a “night mode” that allows for better image capturing during low-light conditions. This, in combination with an image enhancement pipeline, is used to create sharp images of the night sky using a mobile device.

Astrophotography, also known as astronomical imaging, is the photography or imaging of astronomical objects, celestial events, or areas of the night sky [1]. Astrophotography is a popular pastime, and with improvements to mobile devices, more and more enthusiasts are turning to them as a medium for capturing images of the night sky. Nevertheless, many challenges still exist, which lead to subpar results. In an effort to close the gap, manufacturers have developed better sensors and have incorporated image processing techniques into their devices. Apple, for example, has equipped the iPhone 13 models with brand-new features aimed at improving low-light photography. This research focuses on exploring the efficacy of these new systems paired with an image processing pipeline, with the goal of creating sharp images of the night sky.

In the vast expanse of the night sky, where stars twinkle and galaxies unfold their mysteries, lies a universe waiting to be captured by the lens of our mobile devices. In an era where smartphones have become an integral part of our daily lives, they are now poised to be more than just communication tools. With advancements in computational photography, we embark on a celestial journey to push the boundaries of what our mobile devices can capture in the dark canvas of the night. This project delves into the realm of astrophotography, aiming to elevate the capabilities of your handheld device to unveil the cosmic wonders that have, until now, remained elusive to the casual stargazer.

Join me as we navigate the challenges of low-light conditions, intricate details of celestial bodies, and the delicate dance of light in the cosmos. Our mission is to empower amateur astronomers, stargazers, and space enthusiasts alike, providing them with the tools to transform their smartphones into potent astrophotography instruments.

In this blog series, we will delve into the intricacies of computational photography algorithms, explore innovative techniques, and share the excitement of capturing mesmerizing cosmic phenomena with nothing more than the device in your pocket. The universe is vast, and so are the possibilities hidden within the pixels of our screens.

Get ready to witness the stars as you've never seen them before. "Improving Mobile Device Astrophotography" is not just a project; it's an invitation to gaze upon the cosmos from the palm of your hand. Together, let's unlock the secrets of the night sky and bring the wonders of the universe closer to home.


The image processing pipeline includes the following: Low-Light Image Enhancement via Illumination Map Estimation (LIME) [2], Exposure Fusion [3], Single Image Haze Removal Using Dark Channel Prior [4], and global tone mapping. The overall aim is to combine the iPhone 13's new low-light capture capabilities with known and effective image processing techniques to create a high-quality, sharp image.


Various images were taken using an iPhone 13 Pro under different settings and light conditions, spanning several days, in order to introduce variance within the images. Various celestial bodies were photographed, including the moon and various stars, and shots were also taken of the sun at sunrise or sunset. Candidate images were then run through the image processing pipeline and enhanced.

Image collection quickly became one of the main issues with the experiments, as unfavorable weather conditions made it difficult to capture the desired photographs. Clouds would constantly hide the stars, and the moon would seldom be viewable from a good angle. Nevertheless, timing the image collection closer to sunrise, rather than sunset, led to a greater degree of success, as clouds would disappear at that time. The majority of the photographs taken without cloud distortion were captured then. Initially, a specialized app for night photography was used; however, it quickly became apparent that it led to subpar results. The native iPhone camera app was used instead, as it was found to produce sharper images, possibly due to its built-in image processing pipeline.

Candidate photographs were then run through a 4-tier image enhancement pipeline to create the final image. The Low-Light Image Enhancement via Illumination Map Estimation (LIME) [2] algorithm was implemented in MATLAB and was the first enhancement applied to the images. This algorithm was chosen because it was found to provide very promising results on extremely low-light images, which this research focuses on. It works by estimating the illumination of each pixel as the maximum value across the RGB channels and then applying the resulting illumination map to the image. As opposed to other algorithms, LIME estimates only one factor, illumination, which was desirable in order to keep the image processing as simple as possible.
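The core of LIME can be sketched in a few lines. This is an illustrative Python/NumPy version, not the MATLAB implementation used in the project, and it omits the structure-aware refinement of the illumination map described in the full paper:

```python
import numpy as np

def lime_enhance(img, gamma=0.6, eps=1e-3):
    """Simplified LIME sketch.
    img: float array in [0, 1], shape (H, W, 3).
    """
    # Initial illumination map: per-pixel max over the RGB channels
    t = np.max(img, axis=2, keepdims=True)
    # Gamma-adjust the map so very dark regions are not over-amplified
    t = np.clip(t, eps, 1.0) ** gamma
    # Recover the enhanced image R = I / T, clipped to the valid range
    return np.clip(img / t, 0.0, 1.0)
```

Dividing each pixel by its (gamma-adjusted) illumination estimate lifts dark regions far more than bright ones, which is why estimating this single factor can suffice.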

Following LIME, an Exposure Fusion [3] algorithm was applied. This algorithm works by fusing the original image with the illuminated LIME image, balancing the illumination and producing an image with high dynamic range. This was necessary because many images after LIME were far too illuminated, and small bright stars would tend to disappear, the opposite of the aim. This step, however, would often add undesirable blur and noise to the image.
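As a rough sketch of the idea (again in Python rather than the MATLAB used in the project), a per-pixel version of Mertens-style exposure fusion weights each input by how well-exposed its pixels are and takes the normalized weighted average. The full algorithm also uses contrast and saturation weights and blends with Laplacian pyramids, which this simplification omits:

```python
import numpy as np

def exposure_fuse(images, sigma=0.2):
    """Per-pixel exposure fusion sketch.
    images: list of float arrays in [0, 1], each shaped (H, W, 3).
    """
    stack = np.stack(images)                 # (N, H, W, 3)
    lum = stack.mean(axis=3, keepdims=True)  # per-pixel luminance
    # Well-exposedness weight: Gaussian peaked at mid-grey (0.5)
    w = np.exp(-((lum - 0.5) ** 2) / (2 * sigma ** 2))
    w = w / (w.sum(axis=0, keepdims=True) + 1e-12)
    # Normalized weighted average across the exposure stack
    return (w * stack).sum(axis=0)
```

In this project's setting, `images` would hold the original capture and its LIME-enhanced counterpart, so over-brightened regions of the LIME output receive less weight in the fused result.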

Next, a dehazing algorithm was applied using Single Image Haze Removal Using Dark Channel Prior [4]. The nature of the subject of the photographs introduced undesirable fog and smoke, leading to hazy images, so the algorithm was applied to ensure sharpness.
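A minimal Python sketch of the dark channel prior method is below, assuming a NumPy image in [0, 1]; the published algorithm additionally refines the transmission map with soft matting, which is omitted here, and the naive minimum filter is fine for illustration but slow on real images:

```python
import numpy as np

def dark_channel(img, k=7):
    """Min over colour channels, then a naive k x k spatial minimum filter."""
    mins = img.min(axis=2)
    pad = k // 2
    padded = np.pad(mins, pad, mode='edge')
    out = np.empty_like(mins)
    for i in range(mins.shape[0]):
        for j in range(mins.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].min()
    return out

def dehaze(img, k=7, omega=0.95, t0=0.1):
    """Simplified dark-channel-prior dehazing (no soft-matting refinement)."""
    dark = dark_channel(img, k)
    # Atmospheric light A: mean colour of the brightest dark-channel pixels
    n = max(1, int(dark.size * 0.001))
    idx = np.unravel_index(np.argsort(dark, axis=None)[-n:], dark.shape)
    A = img[idx].mean(axis=0)
    # Transmission estimate t(x) = 1 - omega * dark_channel(I / A)
    t = 1.0 - omega * dark_channel(img / A, k)
    # Recover scene radiance J = (I - A) / max(t, t0) + A
    t = np.maximum(t, t0)[..., None]
    return np.clip((img - A) / t + A, 0.0, 1.0)
```

The key observation behind the prior is that haze-free outdoor patches almost always contain some pixel that is dark in at least one channel, so a bright dark channel signals haze.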

Lastly, global tone mapping was applied to the images in order to increase contrast. A built-in MATLAB implementation was used, which helped to further enhance image sharpness.
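The project used a built-in MATLAB routine; as an illustrative stand-in, a global tone map can be as simple as a contrast stretch followed by gamma compression, with the same curve applied to every pixel (which is what makes it "global", as opposed to local operators that adapt per neighborhood):

```python
import numpy as np

def global_tonemap(img, gamma=0.8):
    """Illustrative global tone map: stretch the image to its own full
    range, then apply gamma compression.
    img: float array in [0, 1]."""
    lo, hi = img.min(), img.max()
    # Contrast stretch to span [0, 1]
    stretched = (img - lo) / max(hi - lo, 1e-6)
    # Gamma < 1 lifts midtones while preserving pixel ordering
    return stretched ** gamma
```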


Moon and Stars

Overall, the image processing pipeline coupled with a new device’s capabilities resulted in a good, mostly sharp image that successfully illuminated the stars, sky, and moon. As can be seen from Figure 1, the brightness was improved tremendously by the LIME and Exposure Fusion algorithms, and the dehazing algorithm did a good job of blending it all together into a clear final product. The HDR operation, however, introduced undesirable noise along the top edge of the image; further research could include using a different algorithm to avoid this noise.


Additionally, the pipeline was tested to see how it would perform with a large amount of cloud cover under low-light conditions. The results were even more impressive than with just the moon and stars. The top image in Figure 2 produced a clean final result, and stars that were not visible in the original image became visible after going through the pipeline.

Below is a before and after of some images. The left image is the camera's native image, and the right image is the image produced by the pipeline.

Motion

The iPhone “night mode” was also tested with a degree of motion or shake while taking the image. This was done to see how the processing would fare given this extra variable, as holding perfectly still, especially when aiming high at the sky, is not always easy. Figure 3 shows an image taken with some degree of motion. The original image turned out better than expected; while some noise is present, the moon and its surroundings are still quite visible. The processed image, also shown, did a good job of highlighting the moon. Nevertheless, for a high-quality result to be attainable under motion, further processing would be needed. Perhaps a burst of photos taken under motion and then stacked together would lead to better results.

For fun...

I was curious to see what the pipeline would produce under non-dark conditions. Below is the before and after of a picture of the George Mason University campus taken at sunset and run through the pipeline. It did a pretty good job of bringing out the colors; though not perfect, it did create a sharper, more appealing image.


Aside from some unwanted noise from the Exposure Fusion algorithm, the experiments were quite successful. The image processing pipeline was able to enhance the stars and the moon and produce a sharp image. Additionally, the results with clouds were far better than expected: the clouds helped reduce the unwanted glare from the Exposure Fusion algorithm, while the stars and the moon were just as sharply enhanced.

For further research, a few changes should be made. For instance, only one mobile device was tested. In order to truly examine the extent of mobile astrophotography, experiments need to be run on different mobile devices and their results compared. Furthermore, due to location and poor weather, shots with more celestial bodies were not feasible; this is another limitation that needs to be addressed in order to examine the extent to which a mobile device can compete with DSLR astrophotography. Lastly, a different exposure fusion or lighting algorithm should be experimented with to ensure that the unwanted glare and noise are avoided.


[1]    Cornell, S. (2022, February 1). Astrophotography for beginners 2023: How to shoot the night sky.

[2]    X. Guo, Y. Li and H. Ling, "LIME: Low-Light Image Enhancement via Illumination Map Estimation," in IEEE Transactions on Image Processing, vol. 26, no. 2, pp. 982-993, Feb. 2017, doi: 10.1109/TIP.2016.2639450.

[3]    T. Mertens, J. Kautz and F. Van Reeth, "Exposure Fusion," 15th Pacific Conference on Computer Graphics and Applications (PG'07), Maui, HI, USA, 2007, pp. 382-390, doi: 10.1109/PG.2007.17. 

[4]    K. He, J. Sun and X. Tang, "Single Image Haze Removal Using Dark Channel Prior," in IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 33, no. 12, pp. 2341-2353, Dec. 2011, doi: 10.1109/TPAMI.2010.168.

[5]    H. Gupta, D. B. Salvadi, A. S. Areeckal and S. Udupa, "Star identification in night sky images using mobile phone camera," 2022 IEEE International Conference on Signal Processing, Informatics, Communication and Energy Systems (SPICES), THIRUVANANTHAPURAM, India, 2022, pp. 314-319, doi: 10.1109/SPICES52834.2022.9774242.