It's not in the cloud; this is all done on-device. The image is passed through a moon-detection AI model, whose output is then used to configure and run the moon-enhancement model, which fills in detail from the moon images it was trained on.
These models are part of a larger package of super-resolution models and a processing pipeline that Samsung licenses from ArcSoft, a computational photography software company that specialises in mobile devices.
Anyway, if they can recognize gender and age and what you have in your refrigerator ("This offers users a smoother and smarter refrigerator experience."):
On Samsung devices that support this, it's implemented in the file libsuperresolution_raw.arcsoft.so in /system/lib64, if you're curious to have a look at how it works.
Some strings from that file, relating to the moon detection and enhancement process:
I suspect that if the AI detects two moons, it probably abandons "enhancement" and drops a message in the logs (or renders an error, though I doubt that's what that particular string is for).
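If you want to poke around without a copy of the `strings` utility handy, a run of printable bytes is all it looks for. Here's a minimal sketch of that idea in Python; note that the sample blob and the string names in it are invented for illustration, not actual contents of libsuperresolution_raw.arcsoft.so:

```python
import re

def extract_strings(data: bytes, min_len: int = 4) -> list[str]:
    # Mimic the Unix `strings` tool: find runs of printable ASCII
    # characters (0x20-0x7e) at least min_len bytes long.
    pattern = rb"[\x20-\x7e]{%d,}" % min_len
    return [m.group().decode("ascii") for m in re.finditer(pattern, data)]

# Simulated binary blob -- the embedded names are made up, not real
# symbols from the ArcSoft library.
blob = b"\x00\x01moon_detect\x00\xffsuperresolution\x02ok\x00"
print(extract_strings(blob))  # -> ['moon_detect', 'superresolution']
```

In practice you'd just run the stock `strings` binary against the .so after copying it off a device where the file is readable (e.g. `adb pull /system/lib64/libsuperresolution_raw.arcsoft.so`).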