You mean like the wrinkle removal in basically all of today's phones (at least on the selfie camera, and very aggressively) for silky-smooth skin, or the complete skin-tone changes that make even ghouls look OK for a nice Instagram stream?
I honestly don't get all the outrage. Ironing out faces, changing colors, and removing moles are celebrated features that even spawned the whole 'Apple skin' thing, but adding well-known static details to a blurry image of the moon is somehow suddenly crossing the line? That line was crossed a long time ago, my friends; look at the optical physics of those tiny sensors and crappy lenses, and at the results y'all want from them. People in yesterday's main thread mentioned that Apple's latest swapped a picture of one side of a bunny in the grass for its other side, which is an even more hilarious case of 'painting rather than photography'.
Plus, how exactly these things are done in the phone is known to maybe 10 engineers, most probably in Korea, yet people here quickly jumped on the outrage wagon based on one very casual, non-scientific test on Reddit (I tried to repeat his experiment with an S22 Ultra, using his own images, and failed 100% of the time; the result was just a blurry mess in every case).
I have an S22 Ultra and it takes genuinely good, completely handheld night photos in near-total darkness (just stars, no moon, no artificial light). I mean me standing in a dark forest at 11pm, snapping nicely detailed foliage and starry-sky pics where my own eyes can barely see the scene. It easily surpasses my full-frame Nikon with a superb lens at this. But it's true that this is done with the main, big sensor, not the 10x zoom one, which is then zoomed an extra, say, 3-4x digitally to get a shot of the moon that spans the whole picture.
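For a sense of scale, here's a rough back-of-the-envelope sketch in Python (my own assumed numbers, not anything from Samsung's actual pipeline): taking the moon's ~0.52° apparent diameter and treating the 10x periscope as a ~230mm full-frame-equivalent lens, it shows how much of the frame the moon covers at various digital zoom factors.

```python
import math

MOON_ANGLE_DEG = 0.52       # apparent diameter of the moon (assumed)
FF_SENSOR_HEIGHT_MM = 24.0  # full-frame sensor height, for "equivalent" math

def moon_fraction_of_frame(equiv_focal_mm):
    # Image size of the moon on a full-frame-equivalent sensor,
    # as a fraction of the frame height.
    moon_image_mm = equiv_focal_mm * math.tan(math.radians(MOON_ANGLE_DEG))
    return moon_image_mm / FF_SENSOR_HEIGHT_MM

for digital_zoom in (1, 3, 4, 10):
    # ~230mm-equivalent 10x periscope (assumed), digital crop on top
    f = 230 * digital_zoom
    print(f"{digital_zoom}x digital -> moon covers {moon_fraction_of_frame(f):.0%} of frame height")
```

At 1x the moon is under a tenth of the frame height, so whatever detail ends up in the final shot, the optics alone aren't delivering much of it; that's the whole point above.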