{"id":239540,"date":"2025-12-23T15:42:06","date_gmt":"2025-12-23T20:42:06","guid":{"rendered":"https:\/\/today.uconn.edu\/?p=239540"},"modified":"2025-12-23T15:42:06","modified_gmt":"2025-12-23T20:42:06","slug":"new-image-sensor-breaks-optical-limits","status":"publish","type":"post","link":"https:\/\/today.uconn.edu\/2025\/12\/new-image-sensor-breaks-optical-limits\/","title":{"rendered":"New Image Sensor Breaks Optical Limits"},"content":{"rendered":"<p>Imaging technology has transformed how we observe the universe \u2014 from mapping distant galaxies with radio telescope arrays to unlocking microscopic details inside living cells. Yet despite decades of innovation, a fundamental barrier has persisted: capturing high-resolution, wide-field images at optical wavelengths without cumbersome lenses or strict alignment constraints.<\/p>\n<figure id=\"attachment_239542\" aria-describedby=\"caption-attachment-239542\" style=\"width: 451px\" class=\"wp-caption alignright\"><img decoding=\"async\" class=\"wp-image-239542 img-responsive lazyload\" data-src=\"https:\/\/today.uconn.edu\/wp-content\/uploads\/2025\/12\/Picture1.jpg\" alt=\"Close-up of a compact image sensor chip held between fingers.\" width=\"451\" height=\"256\" data-srcset=\"https:\/\/today.uconn.edu\/wp-content\/uploads\/2025\/12\/Picture1.jpg 536w, https:\/\/today.uconn.edu\/wp-content\/uploads\/2025\/12\/Picture1-300x170.jpg 300w\" data-sizes=\"(max-width: 451px) 100vw, 451px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 451px; --smush-placeholder-aspect-ratio: 451\/256;\" \/><figcaption id=\"caption-attachment-239542\" class=\"wp-caption-text\">Professor Guoan Zheng&#8217;s lab developed a new image sensor that achieves optical super-resolution without lenses. 
Inspired by the telescope array that captured the first black hole image, the device uses multiple sensors working in concert, computationally merging their observations to see finer details (Contributed photo).<\/figcaption><\/figure>\n<p>A new study by Guoan Zheng, a biomedical engineering professor and the director of the <a href=\"https:\/\/bioinnovation.engineering.uconn.edu\/\">UConn Center for Biomedical and Bioengineering Innovation (CBBI)<\/a>, and his research team at the UConn College of Engineering, was published in <a href=\"https:\/\/www.nature.com\/articles\/s41467-025-65661-8\">Nature Communications<\/a>, introducing a breakthrough solution that could redefine optical imaging across science, medicine, and industry.<\/p>\n<p>\u201cAt the heart of this breakthrough is a longstanding technical problem,\u201d said Zheng. \u201cSynthetic aperture imaging \u2013 the method that allowed the Event Horizon Telescope to image a black hole \u2013 works by coherently combining measurements from multiple separated sensors to simulate a much larger imaging aperture.\u201d<\/p>\n<p>In radio astronomy, this is feasible because the wavelength of radio waves is much longer, making precise synchronization between sensors possible. But at visible light wavelengths, where the scale of interest is orders of magnitude smaller, traditional synchronization requirements become nearly impossible to meet physically.<\/p>\n<p>The Multiscale Aperture Synthesis Imager (MASI) turns this challenge on its head. 
Rather than forcing multiple optical sensors to operate in perfect physical synchrony \u2013 a task that would require nanometer-level precision \u2013 MASI lets each sensor measure light independently and then uses computational algorithms to synchronize the data afterward.<\/p>\n<p>Zheng explained that it\u2019s akin to having multiple photographers capture the same scene, not as ordinary photos but as raw measurements of light wave properties, and then letting software stitch these independent captures into one ultra-high-resolution image.<\/p>\n<p>This computational phase synchronization scheme eliminates the need for rigid interferometric setups that have prevented optical synthetic aperture systems from practical deployment until now.<\/p>\n<figure id=\"attachment_239548\" aria-describedby=\"caption-attachment-239548\" style=\"width: 426px\" class=\"wp-caption alignleft\"><img decoding=\"async\" class=\"size-full wp-image-239548 img-responsive lazyload\" data-src=\"https:\/\/today.uconn.edu\/wp-content\/uploads\/2025\/12\/Picture2.png\" alt=\"Scientific visualizations showing MASI measurements and a 3D reconstruction of a bullet cartridge surface.\" width=\"426\" height=\"638\" data-srcset=\"https:\/\/today.uconn.edu\/wp-content\/uploads\/2025\/12\/Picture2.png 426w, https:\/\/today.uconn.edu\/wp-content\/uploads\/2025\/12\/Picture2-200x300.png 200w, https:\/\/today.uconn.edu\/wp-content\/uploads\/2025\/12\/Picture2-280x420.png 280w\" data-sizes=\"(max-width: 426px) 100vw, 426px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 426px; --smush-placeholder-aspect-ratio: 426\/638;\" \/><figcaption id=\"caption-attachment-239548\" class=\"wp-caption-text\">A bullet cartridge imaged by MASI. Top: The captured complex electric field contains both amplitude (brightness) and phase (color) information. 
Bottom: This data enables 3D reconstruction at micrometer resolution, showing the firing pin impression, a unique marking that can link a bullet casing to a specific gun (Contributed photo).<\/figcaption><\/figure>\n<p>MASI deviates from conventional optical imaging in two transformative ways. Rather than relying on lenses to focus light onto a sensor, MASI deploys an array of coded sensors positioned in different parts of a diffraction plane. Each captures raw diffraction patterns \u2013 essentially the way light waves spread after interacting with an object. These diffraction measurements contain both amplitude and phase information, which are recovered using computational algorithms.<\/p>\n<p>Once each sensor\u2019s complex wavefield is recovered, the system digitally pads and numerically propagates the wavefields back to the object plane. A computational phase synchronization method then iteratively adjusts the relative phase offsets of each sensor\u2019s data to maximize the overall coherence and energy in the unified reconstruction.<\/p>\n<p>This step is the key innovation: by optimizing the combined wavefields in software rather than aligning sensors physically, MASI overcomes the diffraction limit and other constraints imposed by traditional optics.<\/p>\n<p>The result? A virtual synthetic aperture far larger than any single sensor, enabling sub-micron resolution and wide field coverage without lenses.<\/p>\n<p>Conventional lenses, whether in microscopes, cameras, or telescopes, force designers into trade-offs. To resolve smaller features, lenses must be closer to the object, often within millimeters, limiting working distance and making certain imaging tasks impractical or invasive.<\/p>\n<p>The MASI approach dispenses with lenses entirely, capturing diffraction patterns from centimeters away and reconstructing images with resolution down to sub-micron levels. 
This is similar to being able to examine the fine ridges on a human hair from across a desktop instead of bringing it inches from your eye.<\/p>\n<p>\u201cThe potential applications for MASI span multiple fields, from forensic science and medical diagnostics to industrial inspection and remote sensing,\u201d said Zheng. \u201cBut what\u2019s most exciting is the scalability &#8211; unlike traditional optics that become exponentially more complex as they grow, our system scales linearly, potentially enabling large arrays for applications we haven&#8217;t even imagined yet.\u201d<\/p>\n<p>The Multiscale Aperture Synthesis Imager represents a paradigm shift in optical imaging: one where computation resolves the fundamental limitations imposed by physical optics. By decoupling measurement from synchronization and replacing bulky lenses with software-controlled sensor arrays, MASI unlocks a new domain of imaging that is high-resolution, flexible, and scalable.<\/p>\n<p><iframe title=\"MASI enables 3D reconstruction of a battery\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/asmFPYvawM0?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><\/p>\n","protected":false},"excerpt":{"rendered":"<p>UConn engineers develop new image sensor to achieve 3D microscopic resolution without 
lenses.<\/p>\n","protected":false},"author":224,"featured_media":239564,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"_crdt_document":"","wds_primary_category":0,"wds_primary_series":0,"wds_primary_attribution":0,"footnotes":""},"categories":[1866],"tags":[],"magazine-issues":[],"coauthors":[2646],"class_list":["post-239540","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-engr"],"pp_statuses_selecting_workflow":false,"pp_workflow_action":"current","pp_status_selection":"publish","acf":[],"publishpress_future_action":{"enabled":false,"date":"2026-04-19 09:59:08","action":"change-status","newStatus":"draft","terms":[],"taxonomy":"category","extraData":[]},"publishpress_future_workflow_manual_trigger":{"enabledWorkflows":[]},"_links":{"self":[{"href":"https:\/\/today.uconn.edu\/wp-rest\/wp\/v2\/posts\/239540","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/today.uconn.edu\/wp-rest\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/today.uconn.edu\/wp-rest\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/today.uconn.edu\/wp-rest\/wp\/v2\/users\/224"}],"replies":[{"embeddable":true,"href":"https:\/\/today.uconn.edu\/wp-rest\/wp\/v2\/comments?post=239540"}],"version-history":[{"count":12,"href":"https:\/\/today.uconn.edu\/wp-rest\/wp\/v2\/posts\/239540\/revisions"}],"predecessor-version":[{"id":239570,"href":"https:\/\/today.uconn.edu\/wp-rest\/wp\/v2\/posts\/239540\/revisions\/239570"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/today.uconn.edu\/wp-rest\/wp\/v2\/media\/239564"}],"wp:attachment":[{"href":"https:\/\/today.uconn.edu\/wp-rest\/wp\/v2\/media?parent=239540"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/today.uconn.edu\/wp-rest\/wp\/v2\/categories?post=239540"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/today.uconn.edu\/wp-rest\/wp\/v2\/tags?post=
239540"},{"taxonomy":"magazine-issue","embeddable":true,"href":"https:\/\/today.uconn.edu\/wp-rest\/wp\/v2\/magazine-issues?post=239540"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/today.uconn.edu\/wp-rest\/wp\/v2\/coauthors?post=239540"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}