What's so ‘Fusion’ about the iPhone 16’s 48MP camera?

The iPhone 16 in ultramarine on a pink and blue background
(Image credit: Apple)

A new word is being thrown about to describe the iPhone 16 family’s main rear camera. It’s the 48MP Fusion Camera. 

The resolution? That’s not new. But the other word, Fusion, is. Apple uses the term Deep Fusion for its computational photography processing, which helps dramatically improve low-light shooting in particular. But Apple didn’t describe the last-generation iPhone 15 as having a Fusion Camera.

So what’s up? 

Is 'Fusion' used in part to make it seem the iPhone 16 gets more of a camera upgrade than it really has? Sure, probably. But there’s also some substance behind the name. Potentially. 

Getting to the bottom of this one could also help you make the most of an iPhone 16 or iPhone 16 Pro camera. It’s time for a closer look. 

48MP Fusion camera: the cynical take

Apple iPhone 16 Pro Max Hands on

(Image credit: Future / Lance Ulanoff)

Why Fusion? This term has quite the history among cameras, given it doesn’t always refer to a single, specific piece of technology.

Iconic film director James Cameron’s 3D shooting array was dubbed the Fusion Camera System. It’s a dual-camera arrangement, used to capture video in native 3D for loads of films including the original Avatar, back in the (most recent) heyday of 3D cinema. Cameron co-developed it with director of photography Vince Pace. 

GoPro’s first 360-degree camera was also called the Fusion, released back in 2017. It was the precursor to today’s GoPro Max, and knitted together the feeds of a pair of cameras with fisheye lenses, one on each side of its body.  

The term Fusion has a fairly powerful sense of merging physicality and technology. And in both of these cases it refers to combining the efforts of two separate camera sensors and lenses to achieve a specific goal: 3D footage or 360-degree photos and video. 

Does the same apply to the iPhone 16?

iPhone 16: one camera, three personalities 

Apple iPhone 16 Pro Max Hands on

(Image credit: Future / Lance Ulanoff)

The iPhone 16’s Fusion camera isn’t quite like the GoPro Fusion or James Cameron’s cinematic rig, though. Its 48MP primary rear camera gets the Fusion label, not the entire rear array. 

And there are actually several feature candidates for the reasoning behind the name. The first is all about how Apple exploits the style of sensor used in the iPhone 16, and how it can take on multiple identities. It can behave like a 48MP camera, a 12MP one, and a 2x zoom that Apple claims matches the quality of a native zoom. But how?

Just about all very high-resolution phone camera sensors use what’s known as a Quad Bayer array. If you want to get your head around phone cameras, you need to know a little about this kind of sensor. 

When light enters a camera lens, it passes through a color filter and hits the light-gathering part of the sensor, called a photosite. There are many millions of these in a camera sensor. 

In a standard camera sensor, there are red, green and blue color filters for each pixel in the final image, and a sub-pixel photosite for each color underneath. The information from these little red-, green- and blue-serving dots is collated to form a pixel of image data in your actual photo.

With a Quad Bayer array, there are blocks of four photosites under each of the red, green and blue color filter sections, not one. This is why the iPhone 16 kinda has a 48MP camera and a 12MP one at the same time. 

In theory you’d use the full-resolution mode in bright conditions, treating each little sub-pixel as its own point of information, and merge those four sub-pixels in limited light. That’s called pixel binning, used for greater light sensitivity. In practice, phone manufacturers typically use the quarter resolution mode until you select a high-res one. 
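If you’re curious what pixel binning actually does to the numbers, here’s a minimal sketch in Python using NumPy. The tiny 8x8 array stands in for the raw photosite grid; this isn’t Apple’s pipeline, just the arithmetic of merging 2x2 blocks into single pixels.

```python
import numpy as np

# Stand-in for a raw photosite grid (a real 48MP sensor is roughly 8064 x 6048).
raw = np.random.randint(0, 1024, size=(8, 8)).astype(np.float32)

def bin_2x2(photosites):
    """Average each 2x2 block of photosites into a single pixel (pixel binning).

    Four small photosites pooled together behave like one bigger, more
    light-sensitive pixel, which is why phones default to this
    quarter-resolution mode in dim conditions.
    """
    h, w = photosites.shape
    return photosites.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

binned = bin_2x2(raw)
print(raw.shape, "->", binned.shape)  # (8, 8) -> (4, 4): a quarter of the pixels
```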

This makes sense because we are still dealing with somewhat limited information in these Quad Bayer designs. Thanks to that color filter, while the iPhone 16 may be able to register 48MP of luminance (brightness) picture info, it still only effectively has 12MP of color hue data, regardless of how you treat those pixels. 

You might compare this to a 4K TV display, where each of its 8.3 million pixels consists of red, green and blue sub-pixels. But here there are four of each per pixel block. 

Quad Bayer cameras use a process called demosaicing to perform a sort of “best guess” at what this missing color data might be. And there will also be denoising to compensate for the absolutely tiny photosites trying to create a pixel’s worth of data each. 
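As a rough illustration of that “best guess”, the toy function below fills in the unmeasured samples of a single color channel by averaging whatever measured neighbours exist. Real demosaicing is far more sophisticated (edge-aware, noise-aware, and tuned per sensor), so treat this purely as a sketch of the principle.

```python
import numpy as np

def interpolate_missing(channel, measured):
    """Guess unmeasured pixels of one color channel from measured neighbours.

    `measured` is a boolean mask marking where the sensor actually sampled
    this color. Everything else is estimated: the crude heart of demosaicing.
    """
    guessed = channel.astype(np.float32).copy()
    h, w = channel.shape
    for y in range(h):
        for x in range(w):
            if not measured[y, x]:
                neighbours = [channel[ny, nx]
                              for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                              if 0 <= ny < h and 0 <= nx < w and measured[ny, nx]]
                if neighbours:
                    guessed[y, x] = np.mean(neighbours)
    return guessed

# Toy usage: green samples laid out as a checkerboard (a classic Bayer layout).
mosaic = np.random.randint(0, 255, size=(6, 6))
green_mask = (np.indices((6, 6)).sum(axis=0) % 2) == 0
full_green = interpolate_missing(mosaic, green_mask)
```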

The result is often a lot less clean and confident than the lower-resolution image a phone would shoot by default. It might appear less natural: aliased, processed, overly smoothed. It’s a bit of a toss-up whether the extra supposed detail is just an AI-style confection. But in good light? Sure, an iPhone 16’s 48-megapixel mode will be worth playing around with. 

The iPhone 16 family adds a third shooting mode to counter this — 24MP. This combines the lower and higher resolution captures, shot in quick succession, ostensibly to balance size and detail. 
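Apple hasn’t published how the 24MP mode merges its two captures, so the snippet below is only a sketch of the general idea: resample a clean lower-resolution frame and a detailed higher-resolution frame onto the same in-between grid and blend them. The crude resize helper and the 50/50 weighting are illustrative choices here, not Apple’s.

```python
import numpy as np

def resize_nearest(img, shape):
    """Crude nearest-neighbour resize, just enough for the illustration."""
    ys = np.arange(shape[0]) * img.shape[0] // shape[0]
    xs = np.arange(shape[1]) * img.shape[1] // shape[1]
    return img[np.ix_(ys, xs)]

def fuse_frames(clean_lowres, detailed_highres, target_shape, weight=0.5):
    """Blend a low-noise low-res capture with a detail-rich high-res one."""
    a = resize_nearest(clean_lowres.astype(np.float32), target_shape)
    b = resize_nearest(detailed_highres.astype(np.float32), target_shape)
    return weight * a + (1 - weight) * b

low = np.random.rand(4, 4)            # stands in for the binned "12MP" frame
high = np.random.rand(8, 8)           # stands in for the full "48MP" frame
mid = fuse_frames(low, high, (6, 6))  # an in-between "24MP-style" grid
```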

A render of the iPhone 16 Pro's rear camera array

(Image credit: Apple)

An iPhone 16 can also take a 2x 12MP zoom image, using a sensor crop. Companies, Apple included, like to claim this is a form of lossless zoom. But it isn’t really because, as already noted, even if there are 12MP’s worth of photosites, they can’t deliver 12MP of color information.
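The 2x mode itself is mostly geometry: take the central quarter of the full-resolution frame. Here is a minimal sketch of that sensor crop, with array shapes that mirror the 48MP-to-12MP maths (the dimensions are illustrative, not a statement about Apple’s exact processing).

```python
import numpy as np

def crop_2x(full_frame):
    """Keep the central half of the width and height of the full-res frame.

    Half the width times half the height leaves a quarter of the pixels,
    which is how a 48MP sensor yields a 12MP "2x" image with no moving lens.
    """
    h, w = full_frame.shape[:2]
    return full_frame[h // 4: 3 * h // 4, w // 4: 3 * w // 4]

frame_48mp = np.zeros((8064, 6048), dtype=np.uint8)  # illustrative 48MP-shaped array
zoomed = crop_2x(frame_48mp)
print(zoomed.shape)                                  # (4032, 3024): roughly 12MP
```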

There’s a problem here. All of this was already in place in the last generation of iPhone. The 48MP camera came to the iPhone 15. It had a 24MP mode, and a 12MP mode too.

And yet, Apple’s launch event did suggest this familiar stuff is at the root of the whole new 'Fusion' thing.

“The capability to capture 48MP photos and optical quality 2x telephoto images makes this a 48MP Fusion camera,” said iPhone Product Manager Piyush Pratik during the phone’s launch.

It also does not appear Apple has upgraded the sensor this time around. The specs indicate the iPhone 16 and iPhone 16 Pro cameras do not have the often-rumored upgrade to a larger Sony IMX903 sensor, as Apple still quotes the same “2 micron” and “2.44 micron” quad-pixels as it did last year for the iPhone 15 and iPhone 15 Pro cameras respectively.

The Sony IMX903 is a larger 1/1.14-inch sensor, while the IMX803 used in the iPhone 14 Pro and iPhone 15 Pro phones is a 1/1.3-inch sensor. This doesn’t mean the sensors are exactly the same this year, but they are more similar to the hardware of the iPhone 14 Pro from 2022 than camera fans would probably like.

Should we expect the worst, that this is all just a marketing ploy to make the old seem new? Perhaps not.

Paging Mr Cameron

Apple iPhone 16 Plus

(Image credit: Future/Jacob Krol)

The other reason the iPhone camera might be dubbed Fusion refers back to something mentioned earlier, how James Cameron and Vince Pace devised an arrangement of Sony CineAlta cameras to shoot in 3D. 

Apple has kinda done this with the iPhone 16 series too. And all it took was a nudge of the second sensor. 

In the standard iPhone 15 and iPhone 15 Plus, the main and ultra-wide cameras sit across a diagonal. This has been switched up for a more conventional vertical arrangement in the iPhone 16 family. 

It looks less jaunty, sure, but it means the two lenses can be used as a stereoscopic pair. Despite their completely different fields of view, the offset between their positions on the phone’s back lets the iPhone separate close objects from far-away ones, just as our eyes do, using parallax to judge distance. 
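That parallax idea boils down to one relationship: nearer objects shift more between the two lens positions. A hedged sketch of the classic stereo triangulation formula follows, with completely made-up numbers rather than real iPhone optics.

```python
def depth_from_disparity(disparity_px, baseline_m, focal_px):
    """Classic stereo triangulation: depth = focal length x baseline / disparity.

    The baseline (lens separation) and pixel focal length used below are
    invented purely to show the trend; they are not iPhone 16 specifications.
    """
    return focal_px * baseline_m / disparity_px

for disparity in (5, 20, 80):  # pixels an object appears to shift between views
    depth = depth_from_disparity(disparity, baseline_m=0.01, focal_px=3000)
    print(f"{disparity:>3}px shift -> roughly {depth:.2f} m away")
```

The bigger the apparent shift, the closer the object, which is all the phone needs to pull nearby subjects apart from the background.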

The base iPhone 16 can now capture stereoscopic images and video as a result, which can then be enjoyed on the Vision Pro headset. Last year this feature was limited to the iPhone 15 Pro and Pro Max. They already had those lined-up wide and ultra-wide camera lenses. 

Letting the more affordable iPhone lines tie in directly to the Vision Pro headset might be taken as a positive sign of support for the whole spatial computing idea, reassuring when the original headset reportedly hasn’t sold like gangbusters.

Apple hasn’t suggested the whole stereoscopic capture thing is the reason for the 'Fusion' tag. But it has a much firmer grounding than Apple’s messing about with the potential of Quad Bayer sensors, which on the surface seems a rerun of what Apple has done in the two previous generations of iPhone. It also bears much more of a relation to those previous uses of the term by Cameron and GoPro.

Will the iPhone 16 camera be better than the iPhone 15's?

iPhone 16 lineup

The iPhone 16 in Black, White, Pink, Teal and Ultramarine (Image credit: Apple)

One final possibility is perhaps the ideal one: Apple has learnt how to Bayer better in this generation. 

Maybe those 24MP images taken with the iPhone 16 family are going to shine like never before, enhanced by the power of the Apple Intelligence those Apple execs are so keen on talking about these days. 

Increased machine learning power and smarts could absolutely mean an improvement in the photo processing of these new phones, and even amount to a generational shift in quality despite using the same (or similar) hardware and techniques that look identical from a distance. 

However, to check that out we’ll have to wait until we’ve taken the iPhone 16 and iPhone 16 Pro out for a test drive. Does Fusion really mark the creation of something new? The jury’s out. 

For our first impressions of the new Apple phones, check out our hands-on iPhone 16 review, iPhone 16 Plus review, iPhone 16 Pro review, and iPhone 16 Pro Max review articles. 


Andrew Williams

Andrew is a freelance journalist and has been writing and editing for some of the UK's top tech and lifestyle publications including TrustedReviews, Stuff, T3, TechRadar, Lifehacker and others.