According to a recent supply chain rumor from China, Apple is investigating multispectral imaging technology for potential use in future iPhone cameras. This advancement could enhance features like Visual Intelligence, improve material identification, and advance overall image processing capabilities.
Weibo leaker “Digital Chat Station” reported that Apple is presently assessing related components in the supply chain. However, the tipster noted that official testing has not started yet, indicating the tech is still in an early exploratory phase.
Multispectral imaging differs from traditional smartphone photography, which relies solely on standard red, green, and blue light. Instead, the technology captures image data across multiple, distinct wavelength bands, which can add sensitivity to near-infrared or other narrow spectral ranges. This could potentially allow cameras to detect information that is largely invisible to conventional sensors.
If adopted in future iPhones, one potential advantage could be improved material and surface differentiation. By analyzing how different materials reflect light across wavelengths, the iPhone’s camera could more accurately distinguish skin, fabric, vegetation, or reflective surfaces, enabling cleaner subject recognition and more reliable portrait effects.
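The idea behind material differentiation can be illustrated with a toy sketch: each material gets a reflectance "signature" across a few wavelength bands, and an observed pixel is matched to the nearest signature. The band choices and reflectance values below are made-up assumptions for illustration, not anything Apple has described.

```python
import math

# Hypothetical mean reflectance in four bands: blue, green, red, near-infrared.
# Values are illustrative assumptions only.
SIGNATURES = {
    "skin":       (0.25, 0.35, 0.45, 0.60),
    "vegetation": (0.05, 0.15, 0.08, 0.70),  # strong NIR reflectance ("red edge")
    "fabric":     (0.40, 0.42, 0.41, 0.38),
}

def classify(pixel):
    """Return the material whose signature is closest (Euclidean) to the pixel."""
    return min(SIGNATURES, key=lambda m: math.dist(SIGNATURES[m], pixel))

# A surface that looks greenish in plain RGB but reflects strongly in NIR
# is separated from green fabric only because of the extra band.
print(classify((0.06, 0.14, 0.09, 0.65)))  # vegetation
```

The point of the extra near-infrared band is that two surfaces with nearly identical RGB values (say, green fabric and a leaf) can have very different NIR reflectance, making them trivially separable.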
MacDailyNews Note: In the same Weibo post, “Digital Chat Station” reiterated that the main camera on the iPhone 18 Pro models will feature a variable aperture, and the telephoto lens will come with a larger aperture. However, the leaker added that Apple has not yet started prototyping 200-megapixel sensors for future iPhones.

Absolutely no one is prototyping 200-megapixel sensors for cell phones or tablets. Why? Because it does not make any sense. The pixel size would have to be so small that, even with a quantum efficiency (the fraction of incoming photons converted into a usable electric signal) of 100%, the sensitivity would only be good enough for bright daylight. Smaller pixels absorb fewer photons in a given pixel in a given time frame, and with too few photons per pixel, the electric signal becomes indistinguishable from electronic noise.
Apple’s 48-megapixel cameras are about as high-resolution as Apple can go while still getting reasonable low-light capability. It might be able to reach 80 or even 100 megapixels, but approaching 200 megapixels is out of the question for the foreseeable future, if ever. (Yes, Sony debuted a 256-megapixel focal plane array a couple of years back, but that sensor could not, even in theory, be used in a cell phone: it is physically far too large and draws far too much power.)
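The photon-count argument above can be made concrete with a back-of-envelope estimate: the light a pixel collects scales with its area, so packing more pixels onto the same sensor shrinks each pixel's photon budget proportionally. All numbers below (pixel pitch, photon flux, exposure time) are illustrative assumptions, not Apple or Sony specifications.

```python
# Back-of-envelope: photons collected per pixel scale with pixel area.
# All figures are illustrative assumptions, not real sensor specs.

def photons_per_pixel(pixel_pitch_um, photon_flux_per_um2=1000.0, exposure_s=0.01):
    """Photons hitting one pixel during an exposure.

    photon_flux_per_um2: assumed photons per square micron per second at
    the sensor -- a stand-in for a dim-light scene.
    """
    area_um2 = pixel_pitch_um ** 2
    return area_um2 * photon_flux_per_um2 * exposure_s

# For a fixed sensor size, pixel pitch scales roughly as 1/sqrt(megapixels).
pitch_48mp = 1.22                                 # um, typical of 48 MP phone sensors
pitch_200mp = pitch_48mp * (48 / 200) ** 0.5      # ~0.60 um on the same sensor area

p48 = photons_per_pixel(pitch_48mp)
p200 = photons_per_pixel(pitch_200mp)

print(f"48 MP pixel:  {p48:.1f} photons")
print(f"200 MP pixel: {p200:.1f} photons ({p200 / p48:.0%} of the 48 MP budget)")
```

Under these assumed numbers, each 200-megapixel pixel collects only 24% of the photons a 48-megapixel pixel does (the ratio is just 48/200), which is exactly the regime where shot noise and read noise start to swamp the signal in anything dimmer than daylight.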