Is it Significant if Samsung's Smartphone Cameras Alter Their Lunar Photographs?
After a week of an old controversy resurfacing, Samsung has officially explained how its smartphones capture high-definition moon photos. The details aren't new, but Samsung's clarification of this week's uproar over its moon photography serves as a reminder that the wonders of smartphone photography owe a great deal to the software enhancements running in the background.
The controversy was ignited a week ago on Reddit by a user named ibreakphotos, whose post claiming that Samsung's space zoom moon shots are fake went viral. Backing the claim up with evidence, they concluded that:
The moon photos taken by Samsung are fabricated. Its marketing is misleading. The photos add detail where there is none (in this case, the detail was intentionally eliminated). Samsung mentions multi-frames and multi-exposures, but the reality is that AI is doing the majority of the work, not the optics, which are incapable of resolving the level of detail displayed in the photos. And since the moon is tidally locked to the Earth, it's an easy task to train a model on various moon images and simply superimpose that texture whenever a moon-like object is detected.
For anyone who understands the intricacies of smartphone photography, it's not particularly shocking to learn that AI is handling a significant portion of the work. Samsung's response affirms as much, emphasizing its reliance on AI to enhance certain shots. The company clarifies that its Scene Optimizer feature has supported moon photography since the Galaxy S21 series, and that it has since refined the associated algorithms so the feature can recognize a moon in the frame and optimize it accordingly.
According to Samsung:
The engine for detecting the moon was developed using a collection of various moon shapes and details, ranging from full moons to crescent moons, and is based on images captured from our perspective on Earth.
It utilizes an AI deep learning model to spot the presence of the moon and pinpoint the area it occupies within the relevant image – as indicated by the square box. The AI model learning process allows it to identify the area occupied by the moon even in images it wasn't trained on.
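Samsung hasn't published its detector, but the behavior it describes, finding a moon-like disk and marking the square region to enhance, is easy to sketch. Below is a minimal stand-in that uses a classical Hough-circle search in place of Samsung's proprietary deep learning model; the file name and thresholds are illustrative assumptions:

```python
import cv2
import numpy as np

def find_moon_bbox(image_path: str):
    """Toy stand-in for a moon detector: find one bright circular blob
    and return a square bounding box, loosely mirroring the 'square box'
    Samsung describes. The real detector is a deep learning model."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        raise FileNotFoundError(image_path)
    img = cv2.medianBlur(img, 5)  # suppress sensor noise before detection

    circles = cv2.HoughCircles(
        img, cv2.HOUGH_GRADIENT, dp=1.5, minDist=img.shape[0],
        param1=100, param2=30,
        minRadius=img.shape[0] // 20, maxRadius=img.shape[0] // 2,
    )
    if circles is None:
        return None  # no moon-like disk found, so no enhancement would run

    x, y, r = np.round(circles[0, 0]).astype(int)
    return (x - r, y - r, x + r, y + r)  # (left, top, right, bottom)

print("Moon detected at:", find_moon_bbox("night_sky.jpg"))  # hypothetical file
```

The key behavior the sketch captures is the fallthrough: if nothing moon-like is found, no optimization runs at all, which matches what I ran into in my own testing below.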
I tested Samsung’s “space zoom” mode on two Samsung smartphones, the Galaxy S22 Ultra and this year’s Galaxy S23 Ultra, and got different results. When I photographed a full moon with the Galaxy S23 Ultra, instead of the intricate craters in the Galaxy S22 Ultra’s photo, I got an overexposed beam of light emanating from the top of the frame. That's consistent with Samsung’s explanation of how the algorithm operates: the Galaxy S23 Ultra couldn't produce a moon shot like its predecessor's because it never identified the moon in the first place, and no detection means no enhancement. That is exactly how the company trained its algorithms to behave.
Samsung's entire post is worth a read, particularly if you're fascinated by how smartphones manage to generate the images they do. But it doesn't absolve Samsung of overhyping its cameras and making it sound like they can zoom 100x natively, akin to a DSLR or mirrorless camera with a powerful telescopic lens. A smartphone and its glass are no match for a complete camera setup; capturing real lunar detail requires a much larger sensor and lens.
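A back-of-the-envelope diffraction calculation makes the optics point concrete. The sketch below assumes a roughly 4 mm effective aperture for the periscope telephoto, which is my own illustrative figure, not a published Samsung spec:

```python
import math

# Rayleigh criterion for the smallest resolvable angle: theta = 1.22 * lambda / D.
wavelength_m = 550e-9  # mid-visible (green) light
aperture_m = 4e-3      # assumed effective aperture diameter, not an official spec

theta_rad = 1.22 * wavelength_m / aperture_m
theta_arcsec = math.degrees(theta_rad) * 3600

moon_diameter_arcsec = 31 * 60  # the full moon spans roughly 31 arcminutes

print(f"Diffraction limit: {theta_arcsec:.1f} arcsec")
print(f"Resolvable elements across the moon: {moon_diameter_arcsec / theta_arcsec:.0f}")
```

Under those assumptions, the lens resolves about 35 arcseconds, or roughly 54 elements across the lunar disk: the optical equivalent of a moon about 54 pixels wide, nowhere near the crisp crater detail in the final photos.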
This is not standard postprocessing; it's closer to a live version of Photoshop. As the original Reddit post discovered, Samsung's moon photos don't always just enhance the shot you took. The phone combines multiple frames captured at the same time (unbeknownst to the user) with AI deep learning of existing moon detail to partly synthesize what it believes the moon's surface should look like in your shot. That means things like color are preserved (which wouldn't be the case if the phone simply copied and pasted other moon photos onto yours), but it also means the phone can exploit the moon's tidally locked position to determine what it should look like at any given moment and adjust your photo accordingly.
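To separate the ordinary from the contested, here's a rough sketch of the conventional multi-frame half of that pipeline: averaging aligned burst frames to cut noise, then sharpening the result. This is the part that preserves the scene's own colors. The learned detail-synthesis step that paints in lunar texture sits on top of this and isn't reproduced here; frame names and parameters are illustrative:

```python
import numpy as np
import cv2

def stack_and_sharpen(frames):
    """Average several already-aligned burst frames to reduce noise,
    then apply an unsharp mask. Real pipelines also align the frames
    first; Samsung's adds an AI detail-synthesis pass on top."""
    stack = np.mean([f.astype(np.float32) for f in frames], axis=0)

    blurred = cv2.GaussianBlur(stack, (0, 0), sigmaX=3)
    sharpened = cv2.addWeighted(stack, 1.5, blurred, -0.5, 0)  # unsharp mask
    return np.clip(sharpened, 0, 255).astype(np.uint8)

# Hypothetical usage with an eight-frame burst:
# frames = [cv2.imread(f"burst_{i}.jpg") for i in range(8)]
# result = stack_and_sharpen(frames)
```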
In essence, if it looks like a moon photo, it might just be a moon photo. I'm hesitant to take sides between a large conglomerate and the Reddit user, but I don't blame Samsung for boasting about its camera abilities. All most of us expect from our smartphones is the ability to capture the world, enabling us to reminisce upon our digital memories, rather than questioning why a photo appears the way it does.
Apple and Google employ similar marketing strategies for their respective smartphone cameras, with Google in particular emphasizing how its machine learning produces superior photos compared to the competition. But if you're interested in documenting the moon phases as they occur outside your window, you might reconsider the $1,200 starting price of the Samsung Galaxy S23 Ultra and put some of that money toward a powerful telescopic camera setup instead.
In the future, advancements in smartphone camera technology, such as Samsung's Scene Optimizer feature, could make moon photography far more accessible to amateur astronomers. AI already plays a significant role in enhancing smartphone images, including capturing detailed moon photos.
As the tech industry continues to innovate, we can expect to see even more sophisticated AI integration in smartphone cameras, further blurring the lines between what can be achieved with a phone and a dedicated camera setup.