Apple is reportedly exploring camera technology currently used in medicine, archeology, and the military, with rumors suggesting the integration of multispectral imaging and high-resolution 200MP sensors in forthcoming iPhones. That would certainly be exciting if true, though as always, treat rumors like this with a grain of salt.
According to prominent tipster Digital Chat Station (via Weibo), Apple is in the early stages of evaluating components for multispectral sensors. Unlike the standard red, green, and blue (RGB) sensors found in modern smartphones, multispectral cameras capture light data across a wider range of the electromagnetic spectrum, including near-infrared and ultraviolet. Multispectral imaging, typically reserved for military reconnaissance, satellite crop monitoring, and industrial quality control, could allow future iPhones to see what is invisible to the human eye. And no, we’re not talking about Sony Handycams here (IYKYK).
In terms of practical benefits for mobile photography, multispectral setups have the potential to analyze how different materials reflect light across various wavelengths. In theory, the camera could accurately distinguish between skin, fabric, and foliage, particularly in challenging lighting conditions, while also improving color accuracy and white balance.
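To make that idea concrete, here’s a minimal, hypothetical sketch of the kind of analysis extra spectral bands enable. It uses NDVI, a standard index from remote sensing that compares near-infrared and red reflectance to flag vegetation; the band values and threshold below are made-up illustrations, not anything tied to Apple’s rumored hardware.

```python
import numpy as np

# Hypothetical per-pixel reflectance values for two multispectral bands.
# A real sensor would supply calibrated data; these numbers are invented.
red = np.array([0.08, 0.30, 0.12])   # visible red band
nir = np.array([0.45, 0.33, 0.50])   # near-infrared band

# NDVI (Normalized Difference Vegetation Index): healthy foliage reflects
# far more near-infrared light than red, so it scores high, while materials
# like skin and fabric score much lower.
ndvi = (nir - red) / (nir + red)

for value in ndvi:
    label = "likely foliage" if value > 0.4 else "non-vegetation"
    print(f"NDVI {value:+.2f} -> {label}")
```

The same reflectance-comparison logic, generalized across more bands, is what would let a camera tell materials apart and correct color under tricky light.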

Before multispectral imaging becomes a reality on Apple’s product timeline, though, industry analysts at Morgan Stanley are more confident that Apple will adopt a Samsung-supplied 200MP primary sensor as early as 2028, likely with the iPhone 21. For now, Apple’s focus remains on refining its 48MP systems: the upcoming iPhone 18 Pro is expected to prioritize optical flexibility, with rumors pointing toward a new variable aperture lens and a larger aperture for the telephoto camera to boost low-light performance.
Main photo credit: MicaSense (featuring its RedEdge-P Dual multispectral cameras)