
If there were a change to the surface of the moon visible to the naked eye, would this camera's software filter it out? Because that is where I draw the line. Imagine seeing a large asteroid impact, taking a photo, and getting gaslit.



Early on in the 'controversy', someone on Reddit ran a similar test: they photoshopped an image of the moon to have a different crater pattern, blurred it, and took an 'enhanced' photo of it with the phone. Samsung 'enhanced' the moon with the incorrect crater pattern.

After seeing that, I was pretty skeptical of the "they're just swapping in a high-res png!" claims, no matter how often they were repeated. This post is more evidence that they're not just swapping in a higher-res image, but I suspect people will keep repeating it anyway.
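For anyone who wants to reproduce that test, here's a rough sketch of the prep step, assuming Pillow; the filenames and the 170px working size are arbitrary placeholders, not anything from the original Reddit post:

    # Degrade the doctored image so the phone's sensor can't possibly be
    # resolving the fake craters on its own.
    from PIL import Image, ImageFilter

    img = Image.open("moon_doctored.png").convert("L")   # your photoshopped moon
    small = img.resize((170, 170), Image.LANCZOS)        # throw away fine detail
    blurred = small.filter(ImageFilter.GaussianBlur(radius=2))
    blurred.resize((1024, 1024), Image.LANCZOS).save("moon_display.png")

Display moon_display.png full-screen in a dark room and photograph it with the phone. If the fake craters come back sharpened, the "enhancement" is working from the scene it saw, not swapping in a stock photo.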


That the enhanced photo contains the wrong data would seem to pretty clearly prove that it is not enhancing the data from the camera sensor, and is instead using pre-existing photos as the data source.


To clarify:

1. User took a photo of the "real" moon and manipulated the craters in Photoshop.

2. User photographed a low-resolution copy of the manipulated moon image with the Samsung phone.

3. The "enhanced" photo included the manipulated, not-real craters in greater detail.

It seems to me that this indicates the camera app is using data from the camera sensor to at least some extent, not just pre-existing photos, because it "enhanced" craters that do not exist in any pre-existing photo. Why would it indicate the opposite?
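If you want to go beyond eyeballing it, a crude way to check which crater pattern the "enhanced" shot actually matches is a normalized correlation against both sources. A back-of-the-envelope sketch, assuming numpy and Pillow, manually aligned crops, and placeholder filenames:

    import numpy as np
    from PIL import Image

    def load_gray(path, size=(512, 512)):
        # Resize to a common frame and normalize, so the mean of the
        # product below is a Pearson correlation of the crater patterns.
        a = np.asarray(Image.open(path).convert("L").resize(size), dtype=float)
        return (a - a.mean()) / (a.std() + 1e-9)

    enhanced = load_gray("phone_enhanced.jpg")  # what the phone produced
    doctored = load_gray("moon_doctored.png")   # the fake crater pattern
    real = load_gray("moon_real.png")           # a genuine moon reference

    print("corr vs doctored:", (enhanced * doctored).mean())
    print("corr vs real:    ", (enhanced * real).mean())

If the phone were pasting in a stock moon image, the real-moon correlation should win; in the Reddit test it was the doctored pattern that survived enhancement.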


A solution would be to always store the original, unenhanced image alongside the processed one, like Apple does for images in iCloud.
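In pipeline terms it could be as simple as this purely hypothetical sketch (the function and file naming are made up for illustration, not anything Apple or Samsung actually ships):

    import hashlib
    from pathlib import Path

    def save_pair(original: bytes, enhanced: bytes, stem: str) -> None:
        # Keep the straight-off-the-sensor JPEG next to the processed one,
        # plus a checksum so the pair can be tied together later.
        Path(f"{stem}_original.jpg").write_bytes(original)
        Path(f"{stem}_enhanced.jpg").write_bytes(enhanced)
        Path(f"{stem}.sha256").write_text(
            hashlib.sha256(original).hexdigest() + "\n")

That way any "enhancement" dispute can be settled by just looking at what the sensor actually captured.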



