Apple Develops Sensor That Rivals Human Eye in Dynamic Range
by Rajat Saini · The Mac Observer

Apple is developing a next-generation image sensor that could push the limits of dynamic range in digital imaging. According to a newly published patent, the company is working on a stacked sensor architecture capable of delivering up to 20 stops of dynamic range. This level would outperform cinema-grade cameras like the ARRI ALEXA 35 and come close to the dynamic range of the human eye.
A 20-Stop Breakthrough with Stacked Sensor Architecture
The patent, titled “Image Sensor With Stacked Pixels Having High Dynamic Range And Low Noise,” outlines a two-layer sensor design. The top sensor die captures light through photodiodes, while the underlying logic die handles processing. This layout reduces noise before the signal even exits the chip. Each pixel includes a current memory circuit to cancel thermal noise in real time, eliminating the need for post-processing corrections.
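The patent describes that noise cancellation at the circuit level. As a rough mental model only, not Apple's actual design, the idea resembles subtracting a stored noise reference from each pixel's signal before the value ever leaves the chip. A minimal, purely illustrative Swift sketch, with hypothetical names and numbers:

```swift
// Illustrative only: a simplified model of per-pixel noise cancellation in the
// spirit of the patent's description (a memorized noise reference subtracted from
// the signal on-chip). Names and values are hypothetical, not Apple's circuit.

struct PixelReadout {
    let signalSample: Double   // photodiode signal plus thermal (kTC) noise
    let noiseSample: Double    // noise reference held by the per-pixel memory circuit

    /// Corrected value: the stored noise estimate is removed before readout,
    /// so no downstream post-processing correction is needed.
    var correctedSignal: Double {
        signalSample - noiseSample
    }
}

let pixel = PixelReadout(signalSample: 412.7, noiseSample: 12.7)
print(pixel.correctedSignal)   // 400.0
```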
The sensor also integrates LOFIC (Lateral Overflow Integration Capacitor) technology. This allows each pixel to handle three levels of charge depending on scene brightness, preserving image detail in both highlights and shadows. Notably, Apple uses a simpler 3-transistor (3T) structure rather than the more common 4T, prioritizing noise control at the silicon level.
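How a pixel chooses among those three charge levels is not spelled out in consumer terms, but conceptually it amounts to switching storage capacity with scene brightness: small capacity and high sensitivity for shadows, an overflow capacitor for bright highlights. A toy Swift sketch, with hypothetical thresholds that do not come from the patent:

```swift
// Illustrative only: a toy model of a LOFIC-style pixel picking one of three
// charge-handling levels based on scene brightness. Thresholds are made up.

enum ChargeLevel {
    case low      // dim scenes: small capacity, highest sensitivity
    case medium   // typical scenes
    case high     // bright scenes: overflow capacitor absorbs extra charge
}

func chargeLevel(forLuminance lux: Double) -> ChargeLevel {
    switch lux {
    case ..<50:      return .low
    case 50..<5_000: return .medium
    default:         return .high
    }
}

print(chargeLevel(forLuminance: 20))      // low  — shadow detail preserved
print(chargeLevel(forLuminance: 80_000))  // high — highlight detail preserved
```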
If Apple brings this sensor to market, it could set a new standard in mobile photography and challenge traditional camera makers in dynamic range and low-light performance. The 20-stop range implies a contrast ratio of 1,048,576 to 1. Few imaging systems today can achieve this level of detail across lighting extremes.
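That ratio follows directly from the definition of a stop: each additional stop doubles the captured contrast, so an n-stop range corresponds to a 2^n : 1 contrast ratio. A quick Swift check of the figures quoted in this article:

```swift
import Foundation

// Each stop doubles the captured contrast, so n stops = 2^n : 1.
func contrastRatio(stops: Double) -> Double {
    pow(2, stops)
}

print(contrastRatio(stops: 20))    // 1,048,576 — the ratio cited for 20 stops
print(contrastRatio(stops: 13.4))  // ≈ 10,809 — roughly the iPhone 15 Pro Max's measured range
```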
Implications for iPhones and Beyond
Apple has historically relied on Sony for image sensors. This patent suggests a shift toward proprietary solutions built in-house. As Y.M.Cinema Magazine first reported (via 9to5mac), the company appears to be engineering an advanced imaging pipeline with direct sensor-level processing. This technology may debut in future iPhones or in devices like the Apple Vision Pro.
Real-world testing of the iPhone 15 Pro Max by CineD showed that its camera delivers around 12 to 13.4 stops of dynamic range, depending on ISO and testing method. However, heavy internal noise reduction distorts some of those results, raising questions about how much usable dynamic range is actually preserved. At ISO 55, for example, CineD observed only five stops of exposure latitude, well below cinema cameras like the ARRI Alexa Mini LF, which outpaces the iPhone by five to seven stops.
Meanwhile, the human eye typically resolves about 10 to 14 stops at any given moment; after adapting to lighting conditions, it can perceive up to 30. Apple's goal of matching or surpassing that range through sensor innovation reflects its push to unite computational photography with hardware-level performance.
Caution: Patent Does Not Mean Product
Despite the excitement, this remains a patent. Apple routinely files for advanced technologies that never leave the lab, and as discussed on Reddit and in Y.M.Cinema's comments, whether this design is feasible in a mobile form factor has drawn valid skepticism. The camera community is quick to challenge claims that stretch the limits of current imaging science.
Still, Apple's effort to design sensors with this level of precision shows its strategy of controlling the full imaging stack. If the technology reaches production, it could transform mobile cinematography, AR and VR fidelity, and HDR content capture, and could even lead to Apple's first standalone professional camera system.