{"id":618117,"date":"2023-03-15T09:49:08","date_gmt":"2023-03-15T14:49:08","guid":{"rendered":"https:\/\/news.sellorbuyhomefast.com\/index.php\/2023\/03\/15\/samsung-responds-to-fake-moon-controversy\/"},"modified":"2023-03-15T09:49:08","modified_gmt":"2023-03-15T14:49:08","slug":"samsung-responds-to-fake-moon-controversy","status":"publish","type":"post","link":"https:\/\/newsycanuse.com\/index.php\/2023\/03\/15\/samsung-responds-to-fake-moon-controversy\/","title":{"rendered":"Samsung responds to fake Moon controversy"},"content":{"rendered":"<div>\n<p>Samsung has published an <a href=\"https:\/\/www.samsungmobilepress.com\/feature-stories\/how-samsung-galaxy-cameras-combine-super-resolution-technologies-with-ai-technology-to-produce-high-quality-images-of-the-moon\/\">English-language blog post<\/a> explaining the techniques used by its phones to photograph the Moon. The post\u2019s content isn\u2019t exactly new \u2014 it appears to be a lightly edited translation of an article <a href=\"https:\/\/r1-community-samsung-com.translate.goog\/t5\/camcyclopedia\/%EB%8B%AC-%EC%B4%AC%EC%98%81\/ba-p\/19202094?_x_tr_sl=auto&#038;_x_tr_tl=en&#038;_x_tr_hl=en-GB\">posted in Korean last year<\/a> \u2014 and doesn\u2019t offer much new detail on the process. But, because it\u2019s an official translation, we can more closely scrutinize its explanation of what Samsung\u2019s image processing technology is doing.<\/p>\n<p>The explanation is a response to a viral <a href=\"https:\/\/www.reddit.com\/r\/Android\/comments\/11nzrb0\/samsung_space_zoom_moon_shots_are_fake_and_here\/\">Reddit post<\/a> that showed in stark terms just how much extra detail Samsung\u2019s camera software is adding to images when taking a photo of what appears to be the Moon. 
These criticisms aren\u2019t new (<a href=\"https:\/\/www.inverse.com\/input\/reviews\/is-samsung-galaxy-s21-ultra-using-ai-to-fake-detailed-moon-photos-investigation-super-resolution-analysis\"><em>Input <\/em>published a lengthy piece<\/a> about Samsung\u2019s moon photography in 2021) but the simplicity of the test brought the issue greater attention: Reddit user ibreakphotos simply snapped a photo of an artificially blurred image of the Moon using a Samsung phone, which added in extra detail that didn\u2019t exist in the original. You can see the difference for yourself below: <\/p>\n<p>Samsung\u2019s blog post today explains that its \u201cScene Optimizer\u201d feature (which has supported Moon photography since the Galaxy S21 series) combines several techniques to generate better photos of the Moon. To start with, the company\u2019s Super Resolution feature kicks in at zoom levels of 25x and higher, and uses multi-frame processing to combine over 10 images to reduce noise and enhance clarity. 
It also optimizes its exposure so the Moon doesn\u2019t appear blown-out in the dark sky, and uses a \u201cZoom Lock\u201d feature that combines optical and digital image stabilization to reduce image blur.<\/p>\n<p>Actually identifying the Moon in the first place is done with an \u201cAI deep learning model\u201d that\u2019s been \u201cbuilt based on a variety of moon shapes and details, from full through to crescent moons, and is based on images taken from our view from the Earth.\u201d<\/p>\n<p>But the key step, and the one that\u2019s generated all the controversy, appears to be the use of an under-explained \u201cAI detail enhancement engine.\u201d Here\u2019s how Samsung\u2019s blog post describes the process:<\/p>\n<div>\n<blockquote>\n<p>\u201cAfter Multi-frame Processing has taken place, Galaxy camera further harnesses Scene Optimizer\u2019s deep-learning-based AI detail enhancement engine to effectively eliminate remaining noise and enhance the image details even further.\u201d<\/p>\n<\/blockquote>\n<\/div>\n<p>And here\u2019s Samsung\u2019s flow chart of the process, which describes the Detail Enhancement Engine as a convolutional neural network (a type of machine learning model commonly used to process imagery) that ultimately compares the result with enhanced detail against a \u201cReference with high resolution.\u201d<\/p>\n<div>\n<div>\n<figure>\n<div>\n<p><span><img alt=\"Flow chart showing Samsung moon photography processing pipeline.\" src=\"https:\/\/duet-cdn.vox-cdn.com\/thumbor\/0x0:1440x547\/2400x912\/filters:focal(720x274:721x275):format(webp)\/cdn.vox-cdn.com\/uploads\/chorus_asset\/file\/24509255\/007_galaxy_camera_AI_technology.jpg\" decoding=\"async\" loading=\"lazy\"><\/span><\/p>\n<\/div>\n<\/figure>\n<\/div>\n<div><figcaption><em>Samsung\u2019s flow chart shows how 
the moon is identified, and then its \u201cDetail Enhancement Engine\u201d gets to work.<\/em><\/figcaption><p><cite>Image: Samsung<\/cite><\/p>\n<\/div>\n<\/div>\n<p>It seems to be this stage that\u2019s adding detail that wasn\u2019t present when the photo was originally taken, which could explain why <a href=\"https:\/\/old.reddit.com\/r\/Android\/comments\/11p7rqy\/update_to_the_samsung_space_zoom_moon_shots_are\/\">ibreakphotos\u2019 followup test<\/a> \u2014 inserting <a href=\"https:\/\/imgur.com\/PYV6pva\">a plain gray square<\/a> onto a blurry photo of the Moon \u2014 resulted in the blank square being <a href=\"https:\/\/imgur.com\/oa1iWz4\">given a Moon-like texture<\/a> by Samsung\u2019s camera software.<\/p>\n<p>While this new blog post offers more details in English compared to what Samsung has said publicly before, it\u2019s unlikely to satisfy those who see any software capable of generating a realistic image of the Moon from a blurry photo as essentially faking the whole thing. And when these AI-powered capabilities are <a href=\"https:\/\/youtu.be\/iLwsPnywFc0\">used to advertise phones<\/a>, Samsung risks misleading customers about what the zoom features of its phones are capable of.<\/p>\n<p>But, as <a href=\"http:\/\/www.theverge.com\/2023\/3\/14\/23640006\/samsung-s23-moon-photo-controversy-space-zoom-computational-photography\">my colleague Allison wrote yesterday<\/a>, Samsung\u2019s camera software isn\u2019t a million miles away from what smartphone computational photography has been doing for years to get increasingly crisp and vibrant photographs out of relatively small image sensors. \u201cYear after year, smartphone cameras go a step further, trying to make smarter guesses about the scene you\u2019re photographing and how you want it to look,\u201d Allison wrote. 
\u201cThese things all happen in the background, and generally, we like them.\u201d<\/p>\n<p>Samsung\u2019s blog post ends with a telling line: \u201cSamsung continues to improve Scene Optimizer to reduce any potential confusion that may occur <em>between the act of taking a picture of the real moon and an image of the moon<\/em>.\u201d (Our emphasis.)<\/p>\n<p>On one level, Samsung is essentially saying: \u201cWe don\u2019t want to get fooled by any more creative Redditors who take pictures of images of the Moon that our camera thinks is the Moon itself.\u201d But on another, the company is also highlighting just how much computational work goes into producing these pictures, and will continue to in the future. In other words, we\u2019re left asking the same question: \u201cwhat is a photograph anyway?\u201d<\/p>\n<\/div>\n<p><a href=\"https:\/\/www.theverge.com\/2023\/3\/15\/23641069\/samsung-fake-moon-controversy-english-language-blog-post\" class=\"button purchase\" rel=\"nofollow noopener\" target=\"_blank\">Read More<\/a><br \/>\n Jon Porter<\/p>\n
But, because it\u2019s an<\/p>\n","protected":false},"author":1,"featured_media":618118,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[77,3454,46],"tags":[],"class_list":{"0":"post-618117","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-responds","8":"category-samsung","9":"category-technology"},"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts\/618117","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/comments?post=618117"}],"version-history":[{"count":0,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts\/618117\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/media\/618118"}],"wp:attachment":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/media?parent=618117"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/categories?post=618117"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/tags?post=618117"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}