Here’s a brief tour of how the iPhone camera quality has improved over the years, starting with the iPhone X and moving forwards right up to the current iPhone 14 series and the incoming iPhone 15…
Apple’s iPhone – alongside Android-powered smartphones – has been a trailblazer not just in the smartphone market, but also in the realm of mobile photography. Platforms like Instagram, TikTok, and other image-based social networks are so popular because everybody is now a photographer. The catalyst for this change? Smartphones.
Apple’s iPhone is perhaps the best-known smartphone from a brand perspective. People from all walks of life, old and young, use iPhones. And this mass adoption of smartphones has completely democratized photography.
Over the years, starting with the iPhone X, each iteration of Apple’s iPhone has introduced impressive camera upgrades, improving on what came before and bridging the gap between DSLR and smartphone camera performance. Nowadays, thanks to billions spent on R&D, that gap is narrower than ever.
Feature films have been shot on iPhones, along with millions of hours’ worth of YouTube content and trillions of photographs. But how did we get here? Let’s chart the development and evolution of Apple’s iPhone camera over the past several years to find out…
iPhone Camera Quality & Resolution
| iPhone Model | Camera Resolution | Megapixels |
|---|---|---|
| iPhone 6 | 3264 x 2448 pixels | 8 MP |
| iPhone 6S to iPhone X | 4032 x 3024 pixels | 12 MP |
| iPhone XR | 4032 x 3024 pixels | 12 MP |
| iPhone XS and XS Max | 4032 x 3024 pixels | 12 MP |
| iPhone 11, 11 Pro, and 11 Pro Max | 4032 x 3024 pixels | 12 MP |
| iPhone 12, 12 Mini, 12 Pro, and 12 Pro Max | 4032 x 3024 pixels | 12 MP |
| iPhone 13 Mini, 13, 13 Pro, and 13 Pro Max | 4032 x 3024 pixels | 12 MP |
| iPhone 14 | 4032 x 3024 pixels | 12 MP |
| iPhone 14 Plus | 4032 x 3024 pixels | 12 MP |
| iPhone 14 Pro | 8064 x 6048 pixels | 48 MP |
| iPhone 14 Pro Max | 8064 x 6048 pixels | 48 MP |
It’s important to note that these figures apply to the default settings. You can choose to capture images in different formats (like HEIF, JPEG, or RAW), different aspect ratios, or use features that might affect the final resolution (like Panorama or Burst mode).
Furthermore, starting with the iPhone 12 Pro, you can take ProRAW images, which contain more detail and allow for more flexibility in editing but also result in much larger file sizes. The resolution remains the same, but the additional data in ProRAW files allows for more detail to be captured and preserved.
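If you want to sanity-check the megapixel figures in the table above, the math is simple: multiply the width by the height and divide by one million. A quick illustrative snippet:

```python
def megapixels(width: int, height: int) -> float:
    """Convert a pixel resolution to a megapixel count."""
    return width * height / 1_000_000

# The figures from the table above:
print(f"{megapixels(4032, 3024):.1f} MP")  # 12.2 MP, marketed as 12 MP
print(f"{megapixels(8064, 6048):.1f} MP")  # 48.8 MP, marketed as 48 MP
```

As you can see, the marketing numbers round the true pixel counts down slightly.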
What Are ProRAW Images on iPhone?
Apple ProRAW is a feature introduced with the iPhone 12 Pro and 12 Pro Max that combines the benefits of shooting in RAW with the computational photography capabilities that iPhones are known for.
In traditional photography, RAW is a file format that captures all image data recorded by the sensor when you take a photo. Unlike JPEG or HEIF images which are processed and compressed, RAW images are uncompressed and unprocessed. This gives photographers more control over parameters like white balance, exposure, tone mapping, and noise reduction in post-production.
Shooting in standard RAW, however, means missing out on many of the advanced photography features that modern iPhones offer, like Deep Fusion and Smart HDR, which are applied at the moment of capture.
This is where ProRAW comes in. It gives you all the standard RAW information, along with the image processing and noise reduction techniques that iPhones apply. This allows photographers to manually adjust things like exposure, color, and tone in post-production without losing the benefits of the iPhone’s computational photography.
You can think of ProRAW as a hybrid between a traditional RAW format and the processed JPEG or HEIF formats that iPhones typically capture. It’s designed for people who want to manually edit their photos to achieve a specific look, while still benefiting from the iPhone’s advanced photography features.
iPhone Camera Comparison Over The Years
The Leap with iPhone X
With the iPhone X, Apple signaled a significant shift in iPhone camera technology. Boasting a 12MP dual-camera system, the iPhone X offered a wide and telephoto lens combination, allowing optical zoom and Portrait mode, which had previously been exclusive to the Plus models.
The wide sensor featured an f/1.8 aperture, while the telephoto sensor had an f/2.4 aperture. What stood out was its improved image signal processor (ISP) that offered faster autofocus in low light conditions, and the introduction of Portrait Lighting.
What is Portrait Lighting on iPhone?
Portrait Lighting uses complex algorithms and the iPhone’s depth-sensing capabilities to manipulate the lighting on the subject of your photograph as you take it, almost as if you were changing real-world lighting setups in a studio environment.
There are six different Portrait Lighting effects:
- Natural Light: Your subject’s face in sharp focus against a blurred background.
- Studio Light: A clean look with your subject’s face brightly lit.
- Contour Light: Dramatic directional lighting that highlights the subject’s facial features.
- Stage Light: The subject’s face is spotlit against a deep black background.
- Stage Light Mono: Like Stage Light, but in classic black and white.
- High-Key Light Mono: The subject appears in grayscale against a white background.
The Refinement with iPhone XS and XS Max
In the next iteration, the iPhone XS and XS Max, Apple focused on refining its camera capabilities. On paper, the specs seemed the same as the iPhone X’s. However, Apple leaned heavily into computational photography with its new “Smart HDR” feature, which used multiple exposures to create a single image with more detail in both the shadows and highlights, elevating the camera’s dynamic range.
What is Smart HDR on iPhone?
Traditionally, cameras have struggled with scenes that include both very bright and very dark areas, also known as high-contrast scenes. HDR technology is designed to remedy this by taking multiple photos at different exposure levels (one for highlights, one for shadows, and one in between) and then combining them into one photo. This results in an image that more accurately represents the range of light and dark areas that the human eye can see in the real world.
Smart HDR takes this concept a step further. When you take a photo, the iPhone’s camera actually captures a series of images almost simultaneously. These images are taken at different exposure levels: some are optimized for highlights, and others for shadows.
The iPhone then analyzes these images, chooses the best parts of each one, and merges them into a single photo. This happens almost instantly, thanks to the incredibly fast processing power of the iPhone’s chip.
Moreover, Smart HDR uses machine learning to recognize faces in your photos, ensuring that people are always well-exposed and detailed.
The result is photos that show more detail, have better balanced lighting, and look more like what you see with your own eyes. It’s particularly useful when you’re shooting in challenging lighting conditions, such as backlit scenes or situations with high contrast between bright and dark areas.
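To make the merging step concrete, here’s a toy exposure-fusion sketch in Python. It is not Apple’s actual Smart HDR pipeline (that is proprietary and far more sophisticated), just the core principle described above: blend the bracketed frames, trusting each frame most where its pixels are well exposed.

```python
def fuse_exposures(frames):
    """Blend same-sized grayscale frames (pixel values 0.0-1.0)."""
    height, width = len(frames[0]), len(frames[0][0])
    fused = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            # Trust each frame most where its pixel sits near mid-gray,
            # i.e. where it is neither blown out nor crushed to black.
            weights = [1.0 - abs(f[y][x] - 0.5) for f in frames]
            total = sum(weights)
            fused[y][x] = sum(w * f[y][x] for w, f in zip(weights, frames)) / total
    return fused

# Three bracketed "exposures" of the same tiny 1x2 scene.
under = [[0.05, 0.20]]   # optimized for highlights
mid   = [[0.40, 0.60]]   # balanced exposure
over  = [[0.80, 0.98]]   # optimized for shadows
result = fuse_exposures([under, mid, over])
```

Each fused pixel ends up closest to whichever frame exposed it best, which is exactly why the merged photo keeps detail in both the shadows and the highlights.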
The Game Changer: iPhone 11 Series
The iPhone 11 series brought with it a notable advancement: the introduction of an ultra-wide lens. The new lens allowed users to capture a much wider field of view, opening up creative possibilities in landscapes, cityscapes, and group shots.
Night Mode was also introduced, enabling the capture of remarkably bright photos in low-light environments. The iPhone 11 Pro models offered a triple-lens setup for the first time, bringing together wide, ultra-wide, and telephoto lenses.
What is Night Mode on iPhone?
Traditionally, photos taken in dimly lit environments have been a challenge for most smartphones due to digital noise, loss of detail, and inaccurate colors. Night Mode on iPhone addresses these issues using a combination of longer exposure times, multiple shots, and computational algorithms.
Here’s how it works:
- When you take a photo in a low-light situation and Night Mode is engaged, the camera automatically takes a series of images over a period of time (the exact duration varies depending on the light conditions). Each of these images is at a different exposure level to capture various levels of detail.
- Then, the iPhone’s powerful chip and advanced software go to work. They align the images to correct for any movement (either from the subject or the camera), combine the best parts of each image, reduce noise, and enhance detail.
- The result is a single, well-exposed photo that appears much brighter and more detailed than what you could typically achieve in a similar low-light situation.
- One key thing to note about Night Mode is that it engages automatically when the camera sensor determines that the lighting conditions require it. You can tell when Night Mode is active because the Night Mode icon (a moon) at the top of the Camera app becomes yellow. You can also manually adjust the exposure time when Night Mode is active by tapping on the Night Mode icon and using the slider that appears.
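The multi-frame step above can be sketched in a few lines of Python. This is purely illustrative (the frame values are made up, and Apple’s real pipeline also aligns frames and weights them intelligently), but it shows why stacking several noisy captures produces a cleaner image:

```python
def stack_frames(frames):
    """Average equally sized grayscale frames pixel by pixel."""
    n = len(frames)
    height, width = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / n for x in range(width)]
            for y in range(height)]

true_brightness = 0.30   # the "real" value of one pixel in a dark scene
sensor_noise = [0.04, -0.03, 0.02, -0.05, 0.01, 0.03, -0.02, -0.01, 0.05]
frames = [[[true_brightness + n]] for n in sensor_noise]  # nine noisy 1x1 frames

stacked = stack_frames(frames)
# Individual frames were off by up to 0.05; the stacked pixel lands much closer.
```

Random sensor noise partially cancels out when frames are averaged, which is why a longer Night Mode capture yields a cleaner, brighter result.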
The Bold Innovator: iPhone 12 Series
The iPhone 12 series, while retaining the triple-camera system of the iPhone 11 Pro models, brought about significant improvements. The entire iPhone 12 lineup saw the introduction of Night mode across all cameras, including the front-facing camera, a first for iPhones. HDR video recording became possible with Dolby Vision HDR, raising the bar for video quality in smartphones.
The flagship iPhone 12 Pro Max introduced sensor-shift optical image stabilization, previously found only in dedicated DSLR and mirrorless cameras, for its wide lens. This change helped reduce camera shake and improved low-light photography. In addition, it also brought in a larger sensor for the wide camera and a longer telephoto lens.
What is Sensor-Shift Optical Image Stabilization?
In traditional optical image stabilization, the lens moves to counteract any minor movements of the camera (such as your hand shaking slightly) that might blur the image. The goal is to keep the lens stable so that the image hitting the sensor is as sharp as possible.
Sensor-shift OIS takes a different approach: instead of moving the lens, it moves the sensor. This allows for potentially more precise correction of camera movement, resulting in even sharper photos and steadier videos. It’s especially beneficial in situations where the camera might be moving a lot (like walking or riding in a vehicle) or in low-light conditions, where the camera needs to keep the shutter open longer and is therefore more susceptible to blur from camera shake.
One of the main advantages of sensor-shift OIS over traditional OIS is that it can correct movement along five axes (horizontal and vertical shifts, plus pitch, yaw, and roll rotations) rather than just two, providing a more comprehensive stabilization system.
This feature, which was previously found only in high-end DSLR and mirrorless cameras, is part of Apple’s ongoing effort to make the iPhone a serious tool for professional photographers and videographers.
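Conceptually, the control loop is simple even though the hardware is not: measure the camera’s displacement, then drive the sensor the same distance in the opposite direction. A deliberately simplified two-axis sketch (not Apple’s implementation, and real systems work from gyroscope rates rather than clean displacement readings):

```python
def stabilize(shake_path):
    """Return where the image lands on the sensor (in mm) once the sensor
    is shifted to cancel each measured displacement."""
    landing_points = []
    for dx, dy in shake_path:
        sensor_dx, sensor_dy = -dx, -dy          # counter-move the sensor
        landing_points.append((dx + sensor_dx, dy + sensor_dy))
    return landing_points

hand_shake = [(0.2, -0.1), (-0.3, 0.05), (0.1, 0.1)]  # per-frame wobble (mm)
print(stabilize(hand_shake))  # image stays put: [(0.0, 0.0), (0.0, 0.0), (0.0, 0.0)]
```

A five-axis system extends the same idea with three rotational corrections on top of the two shifts shown here.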
The Ultimate Powerhouse: iPhone 13 Series
With the iPhone 13 series, Apple continued to focus on low-light photography. The wide camera saw an increase in sensor size, enabling it to capture more light. The sensor-shift optical image stabilization also extended across the entire iPhone 13 range. Notably, the Cinematic Mode was introduced, using depth effects to automatically change focus during videos – a feature usually associated with high-end cameras and professional filmmaking.
The Pro models continued to differentiate themselves with ProRAW and ProRes video capabilities, giving professionals more control over their images and videos.
What is Cinematic Mode on iPhone?
The standout characteristic of Cinematic Mode is its ability to automatically change the focus between subjects in a video in a way that mimics the focus pulls in professional movies, hence the name “Cinematic”. It relies on a shallow depth-of-field, or “bokeh”, effect, where the subject is in sharp focus while the background is blurred.
Here’s how it works:
- When you’re shooting a video in Cinematic Mode, the iPhone uses machine learning and AI to identify people, animals, and objects in the scene. It then automatically creates a depth map of the scene, which allows it to keep certain subjects in sharp focus while blurring the background.
- One of the most impressive aspects of Cinematic Mode is its ability to automatically switch focus when a subject enters the frame or even when a subject in the frame looks away. For example, if two people are in the frame and one looks at the other, Cinematic Mode will automatically shift the focus to the person being looked at.
- Furthermore, you can manually change the focus and even adjust the level of the bokeh effect both during and after recording, offering a great deal of creative control. This is possible because the iPhone saves the scene’s depth information alongside the video.
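The depth map mentioned above is what makes all of this work. As a rough sketch (illustrative only; the depth values and the blur formula are invented for the example), blur strength simply grows with each pixel’s distance from the chosen focal plane, and “racking” focus is just recomputing with a new focus depth:

```python
def bokeh_strength(depth_map, focus_depth, max_blur=10.0):
    """Map per-pixel depth (in meters) to a blur radius:
    zero at the focal plane, growing with distance from it."""
    return [[min(max_blur, abs(depth - focus_depth) * 2.0) for depth in row]
            for row in depth_map]

depth = [[1.0, 1.0, 8.0],
         [1.0, 2.0, 8.0]]   # a person ~1 m away against a background ~8 m away

on_subject = bokeh_strength(depth, focus_depth=1.0)  # subject sharp, background blurred
rack_focus = bokeh_strength(depth, focus_depth=8.0)  # focus "racks" to the background
```

Because the depth map is stored with the clip, the same computation can be redone in the Photos app long after recording, which is why focus is editable after the fact.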
The New Era: iPhone 14 Series
The iPhone 14 series, while still fresh in the market, has shown that Apple is not slowing down in pushing the boundaries of mobile photography. There’s a noticeable enhancement in both photo and video quality, thanks to the advanced ISP and neural engine. Night mode has become more capable, and Photographic Styles (first introduced with the iPhone 13) let users apply preferred tonal adjustments that persist across shots, ensuring a consistent style without compromising photo quality.
What Are Photographic Styles on iPhone?
Unlike filters, which are applied after a photo is taken and can often drastically alter an image, Photographic Styles are built into the photographic process itself. When you choose a style, it influences how the iPhone processes photos at the moment of capture, influencing factors like color, contrast, and tone mapping.
Apple provides four preset Photographic Styles to choose from:
- Rich Contrast: For strong, vibrant colors and a deeper contrast. Great for high-impact, bold images.
- Vibrant: For brighter, more vivid colors and a bit of boosted contrast.
- Warm: Adds a warm color temperature to your photos, enhancing yellows and reducing blues.
- Cool: Applies a cooler color temperature, emphasizing blues and reducing yellows.
Each of these styles is designed to create a different look and feel, allowing you to choose the one that best matches your creative intent or the mood of the scene you’re capturing. Furthermore, you can customize each style to further match your preferences, adjusting the Tone and Color of each style.
Once set, your chosen Photographic Style applies to all photos you take, ensuring a consistent look across your images. However, you can change your Photographic Style at any time, and it won’t affect the original quality of the images.
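As a rough illustration of what the Warm and Cool presets do (Apple’s real pipeline is scene-aware and far more nuanced, so treat this as a sketch only), a color-temperature shift boils down to nudging the red and blue channels in opposite directions:

```python
def apply_style(rgb, warmth=0.0):
    """Shift an (r, g, b) pixel (values 0.0-1.0) toward warm tones for
    positive warmth, or cool tones for negative warmth."""
    r, g, b = rgb
    clamp = lambda v: max(0.0, min(1.0, v))
    return (clamp(r + warmth), g, clamp(b - warmth))

neutral_gray = (0.5, 0.5, 0.5)
warm = apply_style(neutral_gray, warmth=0.1)    # more red, less blue
cool = apply_style(neutral_gray, warmth=-0.1)   # less red, more blue
```

The key difference from a filter is when this runs: a style applies adjustments like these during capture processing, rather than on top of an already-finished image.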
- iPhone 14 Review
- iPhone 13 Pro Max Review
- iPhone 14 Pro Max Review
- Smartphone Camera Settings Explained: How To Use Them Like A Pro
Richard Goodwin has been working as a tech journalist for over 10 years. He is the editor and owner of KnowYourMobile.
iPhone Camera FAQs

What resolution and frame rate give the best video quality?

For best image quality, shoot 4K at 24 FPS. It’s the highest resolution available on iPhones, and you’ll get lots of detail in the image. The iPhone 13 and iPhone 13 Pro are both capable of shooting professional-quality video along with high-resolution photography.

What is the camera resolution of the iPhone 13?

The iPhone 13 and 13 Mini feature the same camera system with three cameras: one front-facing camera (12MP f/2.2) and two back-facing cameras, a wide (12MP f/1.6) and an ultra-wide (12MP f/2.4).

Why is my iPhone camera not high quality?

If your iPhone camera is blurry, be sure to clean the lenses with a clean, dry microfiber cloth. You should also try restarting the Camera app and the phone itself. Don’t pinch to zoom when composing your photos, as this can lead to blurry photos as well. On the iPhone 13 specifically, blurry photos are often caused by the camera shifting between lenses when it shouldn’t; to fix this, enable Macro Control and turn off Lens Correction in the Settings app.

What resolution should I set my camera to?

Use the table below as a rough guide to print sizes:

| Resolution | Avg. Quality | Best Quality |
|---|---|---|
| 5 megapixels | 6x8 in. | 5x7 in. |
| 8 megapixels | 8x10 in. | 6x8 in. |
| 12 megapixels | 9x12 in. | 8x10 in. |
| 15 megapixels | 12x15 in. | 10x12 in. |

Is the iPhone 14 camera better than the iPhone 13’s?

The big difference between the two iPhones is the improved camera capabilities in the iPhone 14, which include better (faster) low-light photography and Action Mode. Action Mode is definitely the marquee feature and one many people are likely to use, unlike the Cinematic Mode that arrived with the iPhone 13.

Is the iPhone 13 camera good enough?

For most people, the iPhone 13’s camera setup will be more than sufficient to take excellent photos and videos, and it still offers a range of practical, easy-to-use camera features such as Night mode, Portrait mode, and Deep Fusion.

What is the resolution of the iPhone 14 camera?

One of the biggest upgrades for the iPhone 14 Pro camera system is the main camera’s increase to 48MP with what Apple calls an “advanced quad-pixel sensor”. However, the camera defaults to taking 12MP images.

What is the iPhone 14 camera quality like?

By default, the camera bins pixels together in groups of four, resulting in 12MP photos with improved lighting; you can use ProRAW to capture full 48MP photos. There is also a new zoom step between the 1x wide and 3x telephoto lenses, achieved by cropping into the new, larger sensor for a 2x zoom. Note that 12MP remains the default resolution for photos: open the Camera app and take a photo, and you’ll get a 12MP file. Indeed, enabling the 48MP capability is almost hidden, as there’s no 12MP/48MP switch in the Camera app.

Should I shoot in 4K or 1080p on iPhone?

We recommend shooting at a resolution of 4K with a frame rate of 24fps. If you’d like to shoot some slow-mo footage, 4K at 60fps will allow you to slow footage down by 40%. If you need to slow things down even further, iPhone 8 models and newer can shoot 240fps; however, this mode is limited to a resolution of 1080p.

What resolution photos do iPhones take?

The latest, top-of-the-line iPhone 14 Pro Max has a 48 megapixel sensor, which works out to a resolution of 8064 x 6048 pixels: four times the pixel count of an iPhone 13. That’s a lot of pixels to tuck into such a small camera. All other recent iPhones use 12MP cameras.

What resolution do iPhones shoot video in?

Any iPhone from the past several years offers basic video-recording options, though the premium Pro and Pro Max editions up the game with advanced video features. You can shoot a video at different resolutions and frame rates, including 720p at 30 frames per second, 1080p at 30 or 60fps, and 4K at 24, 30, or 60fps.
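The quad-pixel binning mentioned above is easy to visualize in code. In this toy sketch (illustrative only, with made-up numbers), each 2x2 block of sensor pixels is averaged into a single output pixel, quartering the pixel count (48MP becomes 12MP) while gathering roughly four times the light per output pixel:

```python
def bin_2x2(image):
    """Average each 2x2 block of a grayscale image (even dimensions assumed)."""
    height, width = len(image), len(image[0])
    return [[(image[y][x] + image[y][x + 1] +
              image[y + 1][x] + image[y + 1][x + 1]) / 4
             for x in range(0, width, 2)]
            for y in range(0, height, 2)]

full_res = [[10, 20, 30, 40],
            [10, 20, 30, 40]]      # a tiny stand-in for a 48MP readout
binned = bin_2x2(full_res)         # [[15.0, 35.0]]: a quarter of the pixels
```

Averaging the four readings also smooths out random sensor noise, which is why the binned 12MP photos show improved low-light performance compared with the full 48MP capture.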