(All Images credit: Sony)
Sony has announced that it has "succeeded in developing the world’s first stacked CMOS image sensor technology with 2-Layer Transistor Pixel".
"Erm, what?" I hear you cry. Well, according to Sony, whereas conventional CMOS image sensors’ photodiodes and pixel transistors occupy the same substrate, this new technology separates photodiodes and pixel transistors onto different substrate layers. This allegedly has the potential to roughly double the saturation signal level compared to conventional CMOS sensors, resulting in enhanced dynamic range and reduced noise.
If this is sounding rather like déjà vu, then you're not alone. I recall vaguely similar claims being touted when the back-illuminated sensor was first released, and again when Sony introduced the first iteration of its stacked CMOS sensor at the launch of the RX100 IV and RX10 II cameras back in 2015. Since then we've seen a number of other 'world-first' implementations of stacked sensors in full-frame cameras and camera phones, each bringing incremental upgrades.
Sony has now modified this original stacked sensor structure by packaging the photodiodes and pixel transistors on separate substrates stacked one atop the other. In conventional stacked CMOS image sensors, by contrast, the photodiodes and pixel transistors sit alongside each other on the same substrate.
This new stacking technology allows the photodiode and pixel transistor layers each to be optimized independently, which Sony claims can approximately double the saturation signal level relative to conventional image sensors and, in turn, widen dynamic range.
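To put that claim in rough perspective: engineering dynamic range is commonly expressed as the ratio of the saturation signal (full-well capacity) to the read noise, so doubling the saturation level with everything else held equal works out to about one extra stop. The sketch below illustrates that arithmetic with made-up example numbers; they are assumptions for illustration, not figures Sony has published.

```python
import math

def dynamic_range_stops(full_well_e: float, read_noise_e: float) -> float:
    """Engineering dynamic range in stops (EV): log2 of the ratio of
    saturation signal to read noise, both measured in electrons."""
    return math.log2(full_well_e / read_noise_e)

# Illustrative values only -- NOT Sony's published specifications.
conventional = dynamic_range_stops(full_well_e=6000, read_noise_e=2.0)
two_layer = dynamic_range_stops(full_well_e=12000, read_noise_e=2.0)  # ~2x saturation

print(f"Conventional pixel:        {conventional:.1f} stops")
print(f"2-Layer Transistor Pixel:  {two_layer:.1f} stops (+{two_layer - conventional:.1f})")
```

In other words, if the doubled-saturation claim holds and read noise doesn't increase, the headline benefit amounts to roughly one additional stop of highlight headroom.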