Xiaomi 12S Ultra Camera test


The Xiaomi 12S Ultra is the Chinese brand’s latest flagship phone, featuring an impressive spec sheet that includes a WQHD+ TrueColor display and Qualcomm’s latest Snapdragon 8+ Gen 1 top-end chipset for plenty of processing power. Special emphasis has been put on camera performance and design. The 12S Ultra comes with a leatherette-style material on the back, mimicking the look and feel of a serious camera. The large Leica-badged camera module houses a 50MP 1-inch sensor in the primary camera, 48MP sensors in the other camera modules, and a long tele lens with a 120mm equivalent focal length. Users can also choose between two Leica image presets that offer either a more vibrant or a more subdued color response and contrast curve.

We put the Xiaomi 12S Ultra through our rigorous DXOMARK Camera test suite to measure its performance in photo, video, and zoom quality from an end-user perspective. This article breaks down how the device fared in a variety of tests and several common use cases and is intended to highlight the most important results of our testing with an extract of the captured data.

Overview

Key camera specifications:

  • Primary: 50MP 1.0″ sensor, 23mm equivalent f/1.9-aperture lens, Dual Pixel PDAF, OIS
  • Ultra-wide: 48MP 1/2.0″ sensor, 128˚ field of view, f/2.2-aperture lens, PDAF
  • Tele: 48MP 1/2.0″ sensor, 120mm equivalent f/4.1-aperture lens, OIS
  • Video: 4K at 60/30fps, 1080p at 60/30fps


Xiaomi 12S Ultra

DXOMARK Camera score: 138

Sub-scores, with the best device tested for each attribute in parentheses:

  • Exposure: 104 (Honor Magic4 Ultimate, 111)
  • Color: 102 (Apple iPhone 13 Pro Max, 107)
  • Autofocus: 93 (Asus Smartphone for Snapdragon Insiders, 109)
  • Texture: 104 (Xiaomi Mi 11, 111)
  • Noise: 96 (Honor Magic4 Ultimate, 102)
  • Artifacts: 71 (Google Pixel 6, 77)
  • Night: 76 (Huawei Mate 40 Pro+, 82)
  • Bokeh: 65 (Huawei P50 Pro, 80)
  • Preview: 66 (Apple iPhone 13 Pro Max, 80)
  • Tele: 124 (Honor Magic4 Ultimate, 140)
  • Wide: 56 (Honor Magic4 Ultimate, 58)
  • Video Exposure: 102 (Apple iPhone 13 Pro Max, 118)
  • Video Color: 103 (Honor Magic4 Ultimate, 107)
  • Video Autofocus: 94 (Apple iPhone 13 Pro Max, 109)
  • Video Texture: 86 (Oppo Reno6 Pro 5G (Snapdragon), 99)
  • Video Noise: 93 (Apple iPhone 13 Pro Max, 105)
  • Video Artifacts: 85 (best: Xiaomi 12S Ultra)
  • Video Stabilization: 98 (Vivo X70 Pro+, 103)

CAMERA

Position in Global Ranking: 5th

CAMERA

Position in Ultra-Premium Ranking: 5th

Pros

  • Good exposure for landscape, cityscape and portrait, from bright light to very low light conditions
  • Fairly wide dynamic range in both photo and video
  • Accurate photo autofocus in all light conditions
  • Excellent compromise between texture retention and image noise
  • Pleasant color rendering; the two Leica modes provide different color renderings for photo but are overall similar in terms of image quality
  • Good image quality at long-range tele zoom

Cons

  • Some noticeable exposure and HDR rendering instabilities in various scenes
  • Lack of zero shutter lag in HDR or low-light conditions, with a systematic delay measured at more than 0.3 seconds
  • Focus and exposure instabilities when using the tele camera
  • Visible noise in high-contrast scenes even in good lighting conditions
  • Occasional low contrast and halo effects in backlit scenes, especially with deep or olive skin tones
  • No HDR on the preview image, so the preview does not match the captured image

Test summary

When all the Xiaomi 12S Ultra’s high-end camera components (hardware, software, ISP) work together well, the resulting image quality is nothing short of amazing, and the device easily secures itself a top 5 position in the DXOMARK Camera ranking. However, compared to other high-performing devices in its class, for example, the Apple iPhone 13 Pro, image quality varies more, and overall, the device is less dependable.

The 12S Ultra also lags slightly behind its own predecessor, last year’s Xiaomi Mi 11 Ultra, in several camera test areas.

Xiaomi 12S Ultra Camera Scores vs Ultra-Premium

This graph compares the DXOMARK Photo, Zoom, and Video scores of the tested device and reference devices. Average and maximum scores for the price segment, computed from the DXOMARK database of tested devices, are also indicated.

About DXOMARK Camera tests: DXOMARK’s Camera evaluations take place in laboratories and in real-world situations using a wide variety of subjects. The scores rely on objective tests, for which the results are calculated directly by measurement software on our laboratory setups, and on perceptual tests, in which a sophisticated set of metrics allows a panel of image experts to compare aspects of image quality that require human judgment. Testing a smartphone involves a team of engineers and technicians for about a week. Photo, Zoom, and Video quality are scored separately and then combined into an Overall score for comparison among the cameras in different devices. For more information about the DXOMARK Camera protocol, click here. More details on smartphone camera scores are available here. The following section gathers key elements of DXOMARK’s exhaustive tests and analyses. Full performance evaluations are available upon request. Please contact us for details on how to receive a full report.

Photo

Score: 144 (best: Honor Magic4 Ultimate)

About DXOMARK Camera Photo tests

For scoring and analysis, DXOMARK engineers capture and evaluate more than 2,600 test images both in controlled lab environments and in outdoor, indoor and low-light natural scenes, using the camera’s default settings. The photo protocol is designed to take into account the main use cases and is based on typical shooting scenarios, such as portraits, family, and landscape photography. The evaluation is performed by visually inspecting images against a reference of natural scenes, and by running objective measurements on images of charts captured in the lab under different lighting conditions from 1 to 1,000+ lux and color temperatures from 2,300K to 6,500K.

When shooting still images, the 12S Ultra delivers very good image quality in most situations. Colors look overall very nice, and the engineers found an excellent compromise between detail retention and image noise reduction. However, compared to last year’s Mi 11 Ultra, autofocus is often slower and exposure is more unstable on the 12S Ultra, and in some scenes, the older model delivers better image detail and lower noise levels.

Out of the box, the Xiaomi 12S Ultra camera lets you choose from two default camera presets that control parameters such as color response and contrast curve: Leica Vibrant and Leica Authentic. We performed all DXOMARK Camera testing with the Vibrant setting, and we are confident that the Leica Authentic setting would have resulted in near-identical scores. Choosing one preset over the other is a matter of personal taste and does not have an impact on our scores; only an inaccurate color response would result in a point deduction.

Xiaomi 12S Ultra Photo scores vs Ultra-Premium

The photo tests analyze image quality attributes such as exposure, color, texture, and noise in various light conditions. Autofocus performances and the presence of artifacts on all images captured in controlled lab conditions and in real-life images are also evaluated. All these attributes have a significant impact on the final quality of the images captured with the tested device and can help to understand the camera’s main strengths and weaknesses.

Exposure

Score: 104 (best: Honor Magic4 Ultimate)

Exposure is one key attribute of technically good pictures. We evaluate the brightness of the main subject through various use cases, such as landscape, portrait, or still life. Other factors evaluated are the contrast and the dynamic range, i.e., the ability to render visible details in both bright and dark areas of the image. Repeatability is also important because it demonstrates the camera’s ability to provide the same rendering when shooting several images of the same scene.

On average, the 12S Ultra produces accurate target exposure down to low light, and dynamic range is among the best we have seen, capturing good detail in both the brightest and darkest parts of a scene. However, the high dynamic range comes at a cost, and our testers observed some contrast issues, especially in more difficult backlit scenes. This issue appears to be better under control on the older Mi 11 Ultra model.

Xiaomi 12S Ultra: accurate exposure, wide dynamic range

Apple iPhone 13 Pro Max: accurate exposure, more limited dynamic range

Honor Magic4 Ultimate: accurate exposure, wide dynamic range

The contrast issue mentioned above is also noticeable when shooting with the Leica Authentic preset, albeit slightly less pronounced. Black areas are a little deeper, and highlight areas of the frame are slightly oversaturated.

Xiaomi 12S Ultra, Leica Vibrant preset: wide dynamic range, lack of contrast on subjects

Xiaomi 12S Ultra, Leica Authentic preset: reduced dynamic range, same lack of contrast

Xiaomi Mi 11 Ultra: wide dynamic range, lack of contrast on subjects

Our testers also observed exposure issues in very challenging light situations, such as the strongly backlit scene below. All three cameras struggle with these conditions, but the underexposure on the subject is less pronounced on the Apple and Honor devices. Because the Xiaomi 12S Ultra lacks zero shutter lag, exposure was also affected in the final capture.

Xiaomi 12S Ultra: strong subject underexposure

Apple iPhone 13 Pro Max: subject underexposure

Honor Magic4 Ultimate: subject underexposure

Color

Score: 102 (best: Apple iPhone 13 Pro Max)

For color, the image quality attributes analyzed are skin-tone rendering, white balance, color shading, and repeatability. For color and skin tone rendering, we penalize unnatural colors but we respect a manufacturer’s choice of color signature.

White balance is usually accurate, and there is no color shading. Colors are rich and pleasant in almost all tested conditions. The two different presets — Leica Vibrant and Authentic — both provide good color rendering and achieve the same color score.

Xiaomi 12S Ultra, Leica Authentic

Xiaomi 12S Ultra, Leica Vibrant

However, a few color casts can be visible, and our testers observed a yellowish tint on skin tones. The latter is especially true for darker skin tones but is also noticeable on lighter skin. As expected, the Leica Vibrant preset produces more saturated color than the Authentic setting, but both options are within acceptable limits and are purely a matter of taste.

Xiaomi 12S Ultra, Leica Vibrant: slight pink white balance cast

Apple iPhone 13 Pro Max: accurate white balance

Samsung Galaxy S22 Ultra (Snapdragon): accurate white balance

 

Autofocus

Score: 93 (best: Asus Smartphone for Snapdragon Insiders)

Autofocus tests concentrate on focus accuracy, focus repeatability, shooting time delay, and depth of field. Shooting delay is the difference between the time the user presses the capture button and the time the image is actually taken. It includes focusing speed and the capability of the device to capture images at the right time, the so-called ‘zero shutter lag’ capability. Even if a shallow depth of field can be pleasant for a single-subject portrait or close-up shot, it can also be a problem in some specific conditions, such as group portraits; both situations are tested. Focus accuracy is also evaluated in all the real-life images taken, from infinity to close-up objects and from low light to outdoor conditions.

The 12S Ultra’s autofocus generally works accurately. It is fast and stable, but you might have to wait a bit longer for the image to be captured than on some rivals. Given its large sensor and fast aperture, it’s no surprise the 12S Ultra has a shallow depth of field, rendering background subjects pretty soft. This can be used as a creative tool in close-up photography.
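The shallow depth of field follows directly from the optics: on a small sensor, perceived depth of field corresponds to the lens’s full-frame equivalent aperture, i.e., the f-number multiplied by the crop factor. A quick sketch (the 1-inch-type sensor diagonal used here is an assumed typical value, not a measured figure):

```python
import math

FULL_FRAME_DIAG_MM = 43.27       # diagonal of a 36 x 24 mm sensor
ONE_INCH_TYPE_DIAG_MM = 15.86    # assumed typical diagonal of a 1-inch-type sensor

def equivalent_aperture(f_number: float, sensor_diag_mm: float) -> float:
    """Full-frame equivalent f-number: the f-number times the crop factor."""
    crop_factor = FULL_FRAME_DIAG_MM / sensor_diag_mm
    return f_number * crop_factor

# Xiaomi 12S Ultra primary camera: f/1.9 on a 1-inch-type sensor
print(round(equivalent_aperture(1.9, ONE_INCH_TYPE_DIAG_MM), 1))  # ~5.2
```

An equivalent aperture around f/5.2 is far faster than typical smartphone cameras, which is why backgrounds blur noticeably even without a dedicated bokeh mode.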

In contrast to some competitors, including some of Xiaomi’s own models, the camera does not offer zero shutter lag in all HDR light conditions or in low light. In practice, this means that you’re more likely to miss the decisive moment with the 12S Ultra than with some of its rivals. We have calculated that to get the best results, the image should be captured within a 100ms window around the moment the shutter button is triggered, i.e., no more than 50ms before or after. Any sooner or later, and the exact intended moment might not be captured, which is particularly problematic in scenes with fast motion. With a 300ms delay, the 12S Ultra is quite a bit slower than the comparison devices in this test and well outside our 50ms threshold.

Xiaomi 12S Ultra: long delay between capture and shot trigger, missing the instant

Apple iPhone 13 Pro Max: fast capture after shot is triggered

Honor Magic4 Ultimate: fast capture after shot is triggered

The photos above were triggered at exactly the same moment for all three devices. The final captures clearly show the long shutter lag of the Xiaomi 12S Ultra. While the two comparison devices manage to capture the instant, the Xiaomi 12S Ultra fails to do so, and the ball is already out of the frame.
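Zero shutter lag itself is conceptually simple: the camera continuously buffers recent preview frames and, when the shutter is pressed, saves the buffered frame closest to the trigger instant instead of starting a fresh capture. A minimal sketch of the idea (the timings below are illustrative, not DXOMARK measurements):

```python
from collections import deque

FRAME_INTERVAL_MS = 33  # ~30 fps preview stream

def zsl_pick(frame_times, trigger_ms):
    """ZSL: return the buffered frame timestamp closest to the trigger time."""
    return min(frame_times, key=lambda t: abs(t - trigger_ms))

def non_zsl_capture(trigger_ms, shutter_lag_ms):
    """Without ZSL, the frame is only captured after the shutter lag elapses."""
    return trigger_ms + shutter_lag_ms

# Preview frames already buffered when the user presses the button at t = 1000 ms
frames = deque(range(0, 1001, FRAME_INTERVAL_MS), maxlen=8)
print(abs(zsl_pick(frames, 1000) - 1000))   # error of at most one frame interval
print(non_zsl_capture(1000, 300) - 1000)    # 300 ms late, the delay measured here
```

With ZSL the capture error is bounded by the frame interval of the preview stream, which is why ZSL-capable rivals catch the ball in the frame while the 300ms delay of the 12S Ultra misses it.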

Autofocus irregularity and speed: 1000Lux Δ2EV Daylight Handheld

This graph illustrates focus accuracy and speed as well as zero shutter lag capability by showing the edge acutance versus the shooting time measured on the AFHDR setup on a series of pictures. All pictures were taken at 1000Lux with Daylight illuminant, 500ms after the defocus. In this scenario, the backlit panels in the scene are set up to simulate a fairly high dynamic range: the luminance ratio between the brightest point and an 18% reflective gray patch is 4, which we denote as an Exposure Value difference of 2. The edge acutance is measured on the four edges of the Dead Leaves chart, and the shooting time is measured on the LED Universal Timer.
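The Exposure Value difference used to describe these setups is simply the base-2 logarithm of the luminance ratio, so a Δ2EV scene corresponds to a 4:1 ratio between the brightest point and the 18% gray patch:

```python
import math

def delta_ev(l_bright: float, l_gray: float) -> float:
    """EV difference between two luminances: each EV step doubles the luminance."""
    return math.log2(l_bright / l_gray)

# A Delta-2EV scene: brightest point at 4x the luminance of the 18% gray patch
print(delta_ev(400.0, 100.0))  # 2.0
```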

Autofocus irregularity and speed: 20Lux Δ0EV Tungsten Handheld

This graph illustrates focus accuracy and speed as well as zero shutter lag capability by showing the edge acutance versus the shooting time measured on the AFHDR setup on a series of pictures. All pictures were taken at 20Lux with Tungsten illuminant, 500ms after the defocus. The edge acutance is measured on the four edges of the Dead Leaves chart, and the shooting time is measured on the LED Universal Timer.

The limited depth of field also makes it challenging to find the best focus point in multi-layer scenes, such as the group portrait below. In contrast to the Mi 11 Ultra and some rivals, the 12S Ultra does not appear to apply any sharpening or enhancement to faces in the background.



Xiaomi 12S Ultra, depth of field

Xiaomi 12S Ultra: shallow depth of field



Apple iPhone 13 Pro Max, depth of field

Apple iPhone 13 Pro Max: wider depth of field



Honor Magic4 Ultimate, depth of field

Honor Magic4 Ultimate: wider depth of field

Texture

Score: 104 (best: Xiaomi Mi 11)

Texture tests analyze the level of details and the texture of subjects in the images taken in the lab as well as in real-life scenarios. For natural shots, particular attention is paid to the level of details in the bright and dark areas of the image. Objective measurements are performed on chart images taken in various lighting conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The charts used are the proprietary DXOMARK chart (DMC) and the Dead Leaves chart.

DXOMARK CHART (DMC) detail preservation score vs lux levels for tripod and handheld conditions

This graph shows the evolution of the DMC detail preservation score with the level of lux, for two holding conditions. DMC detail preservation score is derived from an AI-based metric trained to evaluate texture and details rendering on a selection of crops of our DXOMARK chart.

Thanks to the high pixel count and high-quality lens, the 12S Ultra camera usually captures excellent levels of detail, but in terms of fine detail retention, the Xiaomi 12S Ultra lags just a little behind its own predecessor and the best models from other brands in its class. It is worth noting that the Vibrant preset produces slightly better detail than Authentic. Given the camera’s specifications, the texture/noise compromise is somewhat disappointing, as the large sensor is capable of capturing roughly 30% more light than its predecessor’s.
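The light-capture claim can be sanity-checked from the nominal sensor formats: captured light scales with sensor area, which grows with the square of the diagonal. The diagonals below are assumed typical values for the nominal type sizes of the two sensors, not measured figures:

```python
# Assumed typical diagonals for the nominal sensor type sizes
DIAG_12S_ULTRA_MM = 15.86        # 1-inch-type sensor (12S Ultra primary)
DIAG_MI11_ULTRA_MM = 14.16       # 1/1.12-inch-type sensor (Mi 11 Ultra primary)

# Area (and thus light capture at equal exposure) scales with diagonal squared
area_gain = (DIAG_12S_ULTRA_MM / DIAG_MI11_ULTRA_MM) ** 2
print(f"{(area_gain - 1) * 100:.0f}% more area")  # roughly 25%
```

The nominal type sizes suggest an area advantage in the 25 to 30 percent range, consistent with the order of magnitude of the claim, which makes the modest texture/noise gain over the Mi 11 Ultra all the more surprising.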



Xiaomi 12S Ultra, Leica Vibrant

Xiaomi 12S Ultra, Leica Vibrant: good detail



Xiaomi 12S Ultra, Leica Authentic

Xiaomi 12S Ultra, Leica Authentic: lower level of detail



Xiaomi Mi 11 Ultra: excellent fine detail

Noise

Score: 96 (best: Honor Magic4 Ultimate)

Noise tests analyze various attributes of noise, such as intensity, chromaticity, grain, and structure, on real-life images as well as on images of charts taken in the lab. For natural images, particular attention is paid to the noise on faces and landscapes, but also in dark areas and in high dynamic range conditions. Noise on moving objects is also evaluated on natural images. Objective measurements are performed on images of charts taken in various conditions from 1 to 1000 lux and different kinds of dynamic range conditions. The chart used is the Dead Leaves chart, and standardized measurements such as Visual Noise are derived from ISO 15739.

Visual noise evolution with illuminance levels in handheld condition

This graph shows the evolution of visual noise metric with the level of lux in handheld condition. The visual noise metric is the mean of visual noise measurement on all patches of the Dead Leaves chart in the AFHDR setup. DXOMARK visual noise measurement is derived from ISO15739 standard.

Thanks to the large sensor and the powerful chipset’s noise-reduction algorithms, image noise is usually pretty low. However, it is often not quite as low as on the best-in-class devices, including the Mi 11 Ultra. This is especially true for shadow areas in the frame, such as foliage.



Xiaomi 12S Ultra, low light noise

Xiaomi 12S Ultra: slight noise on face



Apple iPhone 13 Pro Max, low light noise

Apple iPhone 13 Pro Max: noise on face



Honor Magic4 Ultimate, low light noise

Honor Magic4 Ultimate: very low noise



Xiaomi 12S Ultra, outdoor noise

Xiaomi 12S Ultra: noise in the shadows



Xiaomi Mi 11 Ultra, outdoor noise

Xiaomi Mi 11 Ultra: noise very well under control

Artifacts

Score: 71 (best: Google Pixel 6)

The artifacts evaluation looks at lens shading, chromatic aberrations, geometric distortion, edge ringing, halos, ghosting, quantization, and unexpected color hue shifts, among other types of unnatural effects on photos. The more severe and the more frequent the artifact, the higher the point deduction from the score. The main artifacts observed and the corresponding point losses are listed below.

The 12S Ultra keeps image artifacts under control very well. A few points were deducted for color quantization and for fusion artifacts, including ghosting and halo effects. However, in terms of artifacts, the 12S Ultra is an improvement over the Mi 11 Ultra, particularly regarding ghosting and fusion artifacts.

Xiaomi 12S Ultra: tone compression and slight halo effect around the face and the back of the neck; the contrast seems slightly inconsistent within the face

Apple iPhone 13 Pro Max: no halo, natural contrast, but low target exposure

Honor Magic4 Ultimate: no halo, natural contrast

Night

Score: 76 (best: Huawei Mate 40 Pro+)

The Xiaomi 12S Ultra’s night mode is one of the best we have seen. Night shots usually offer a very wide dynamic range, similar to the Mi 11 Ultra, which is great for night cityscapes and similar applications. However, on some occasions, the HDR algorithm does not kick in, resulting in shadow clipping, as can be seen in the second sample set below. This is why the score in this section is slightly lower than on the Mi 11 Ultra.

Xiaomi 12S Ultra: good exposure and detail

Apple iPhone 13 Pro Max: limited dynamic range, less detail in highlight and shadow areas

Honor Magic4 Ultimate: good exposure and detail

Xiaomi 12S Ultra, flash-off: wide dynamic range, excellent highlight detail

Xiaomi 12S Ultra, flash-auto: neither flash nor HDR flash triggered, resulting in a lack of both shadow and highlight detail

Preview

Score: 66 (best: Apple iPhone 13 Pro Max)

Preview tests analyze the image quality of the camera app’s preview of the image, with particular attention paid to the difference between the capture and the preview, especially regarding dynamic range and the application of the bokeh effect. Also evaluated is the smoothness of the exposure, color and focus adaptation when zooming from the minimal to the maximal zoom factor available. The preview frame rate is measured using the LED Universal Timer.

The 12S Ultra’s preview on the display does not fully reflect the captured image, especially in high-contrast conditions or when bokeh mode is activated. So, even if the conditions aren’t ideal for taking an image, you should push the shutter anyway. The end result might look better than expected.

Xiaomi 12S Ultra, preview: HDR not applied to preview, bokeh effect very different to capture

Xiaomi 12S Ultra, capture: much wider dynamic range than preview, different bokeh effect to preview

 

Bokeh

Score: 65 (best: Huawei P50 Pro)

Bokeh is tested in one dedicated mode, usually portrait or aperture mode, and analyzed by visually inspecting all the images captured in the lab and in natural conditions. The goal is to reproduce portrait photography comparable to a photo taken with a DSLR at a wide aperture. The main image quality attributes paid attention to are depth estimation, artifacts, blur gradient, and the shape of the bokeh blur spotlights. Portrait image quality attributes (exposure, color, texture) are also taken into account.

Depth estimation for applying the bokeh effect works well in simple scenes but is of noticeably lesser quality than on the Xiaomi Mi 11 Ultra. We also noticed a softer level of detail on the main subject.



Xiaomi 12S Ultra, bokeh mode: depth estimation errors, soft details on subject



Xiaomi Mi 11 Ultra, bokeh mode: better depth estimation, better detail on subject

Grid is correctly detected



Honor Magic4 Ultimate: better depth estimation, better detail on subject

Grid is more or less detected



Xiaomi 12S Ultra, low light bokeh in the lab

Xiaomi 12S Ultra: very noticeable depth estimation issues



Apple iPhone 13 Pro, low light bokeh in the lab

Apple iPhone 13 Pro: better depth estimation



Honor Magic4 Ultimate, low light bokeh in the lab

Honor Magic4 Ultimate: better depth estimation

Zoom

Score: 96 (best: Honor Magic4 Ultimate)

About DXOMARK Camera Zoom tests

DXOMARK engineers capture and evaluate over 400 test images in controlled lab environments and in outdoor, indoor, and low-light natural scenes, using the camera’s default settings and pinch zoom at various zoom factors from ultra-wide to very long-range zoom. The evaluation is performed by visually inspecting the images against a reference of natural scenes, and by running objective measurements on chart images captured in the lab under different conditions from 20 to 1000 lux and color temperatures from 2300K to 6500K.

The Xiaomi 12S Ultra camera is great for zooming both in and out. Thanks to top-end hardware in both the ultra-wide and tele cameras, the device is capable of achieving excellent results at all zoom settings. However, zoom performance is also less stable than on many rivals, so the Zoom score is not quite up with the very best in class.

Wide

Score: 56 (best: Honor Magic4 Ultimate)

These tests analyze the performance of the ultra-wide camera at several focal lengths from 12 mm to 20 mm. All image quality attributes are evaluated, with particular attention paid to such artifacts as chromatic aberrations, lens softness, and distortion. Pictures below are an extract of tested scenes.

At a measured 13mm equivalent, the 12S Ultra’s ultra-wide camera offers a very wide field of view, allowing you to squeeze a lot of scene into the frame. However, the lens is a touch soft, and fine detail is not quite as well rendered as on the best in class.
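For reference, the diagonal field of view that follows from a full-frame-equivalent focal length can be computed directly. A measured 13mm equivalent corresponds to roughly 118˚, which suggests (our assumption) that the 128˚ spec-sheet figure refers to the lens before distortion correction and cropping:

```python
import math

FULL_FRAME_DIAG_MM = 43.27  # diagonal of a 36 x 24 mm sensor

def diagonal_fov_deg(equiv_focal_mm: float) -> float:
    """Diagonal field of view (degrees) for a full-frame-equivalent focal length."""
    return math.degrees(2 * math.atan(FULL_FRAME_DIAG_MM / (2 * equiv_focal_mm)))

print(round(diagonal_fov_deg(13)))  # ~118 degrees
```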

Xiaomi 12S Ultra, ultra-wide: wide field of view, lack of detail

Apple iPhone 13 Pro Max, ultra-wide: wide field of view, better detail

Honor Magic4 Ultimate, ultra-wide: wide field of view, better detail

Tele

Score: 124 (best: Honor Magic4 Ultimate)

All image quality attributes are evaluated at focal lengths from approximately 40 mm to 300 mm, with particular attention paid to texture and detail. The score is derived from a number of objective measurements in the lab and perceptual analysis of real-life images.

DXOMARK CHART (DMC) detail preservation zoom score vs lux levels for tripod and handheld conditions

This graph shows the evolution of the DMC detail preservation zoom score with the level of lux, for two holding conditions. DMC detail preservation score is derived from an AI-based metric trained to evaluate texture and details rendering on a selection of crops of our DXOMARK chart.

DXOMARK CHART (DMC) detail preservation zoom score vs lux levels for tripod and handheld conditions

This graph shows the evolution of the DMC detail preservation zoom score with the level of lux, for two holding conditions. DMC detail preservation score is derived from an AI-based metric trained to evaluate texture and details rendering on a selection of crops of our DXOMARK chart.

DXOMARK CHART (DMC) detail preservation zoom score vs lux levels for tripod and handheld conditions

This graph shows the evolution of the DMC detail preservation zoom score with the level of lux, for two holding conditions. DMC detail preservation score is derived from an AI-based metric trained to evaluate texture and details rendering on a selection of crops of our DXOMARK chart.

Thanks to its dedicated 120mm tele lens and powerful processing, the 12S Ultra offers an excellent tele zoom, especially at long range. However, the score is negatively impacted by strong and frequent exposure and focus instabilities. These instabilities can occur in all conditions, so it’s always best to take multiple shots to increase the chances of having at least one good image.
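To put the tele module in context: relative to the 23mm primary camera, the 120mm lens provides roughly a 5.2x optical zoom, and long-range framings beyond that rely on an additional digital crop. A quick calculation:

```python
PRIMARY_EQUIV_MM = 23   # primary camera equivalent focal length
TELE_EQUIV_MM = 120     # tele camera equivalent focal length

# Zoom factor is the ratio of equivalent focal lengths
optical_zoom = TELE_EQUIV_MM / PRIMARY_EQUIV_MM
print(f"{optical_zoom:.1f}x optical zoom")          # ~5.2x

# A 300 mm long-range framing needs a further digital crop on top of the tele lens
digital_crop = 300 / TELE_EQUIV_MM
print(f"{digital_crop:.1f}x additional digital crop")  # 2.5x
```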



Xiaomi 12S Ultra, long range zoom

Xiaomi 12S Ultra: good detail



Apple iPhone 13 Pro Max, long range zoom

Apple iPhone 13 Pro Max: lack of detail



Honor Magic4 Ultimate, long range zoom

Honor Magic4 Ultimate: good detail

The three samples below were captured in short succession and illustrate the exposure and focus instabilities mentioned above.

Xiaomi 12S Ultra, tele zoom, first capture

Xiaomi 12S Ultra, tele zoom, second capture of the same scene, exposure is completely different

Xiaomi 12S Ultra, tele zoom, third capture of the same scene, different exposure and out of focus

Video

Score: 113 (best: Apple iPhone 13 Pro Max)

About DXOMARK Camera Video tests

DXOMARK engineers capture and evaluate more than 2.5 hours of video in controlled lab environments and in natural low-light, indoor and outdoor scenes, using the camera’s default settings. The evaluation consists of visually inspecting natural videos taken in various conditions and running objective measurements on videos of charts recorded in the lab under different conditions from 1 to 1000 lux and color temperatures from 2300 to 6500K.

Overall, the 12S Ultra Video performance is on a very high level, and most users will be able to record very nice-looking video footage. However, the most demanding videographers might be held back by some of the instabilities we observed.

As with Photo and Zoom, the 12S Ultra does well for Video but is not quite on the same level as the very best, even lagging slightly behind its predecessor, the Mi 11 Ultra. Video footage shows good dynamic range, natural texture rendering in bright light, and good exposure down to low light. Face tracking works well, too. However, our testers also observed inaccurately rendered blue tones in outdoor conditions, jerky exposure adaptation in changing light conditions, and autofocus failures.

The Xiaomi 12S Ultra’s video mode was tested at 4K resolution and a frame rate of 60 frames per second. The camera offers Dolby Vision HDR format as an option. We found in our testing that using the Dolby Vision format does not result in a higher score than the default SDR option.

Xiaomi 12S Ultra Video scores vs Ultra-Premium

Video tests analyze the same image quality attributes as for still images, such as exposure, color, texture, and noise, in addition to temporal aspects such as the speed, smoothness, and stability of exposure, white balance, and autofocus transitions.

Exposure

102

Best: Apple iPhone 13 Pro Max

The 12S Ultra video footage offers good target exposure. Dynamic range is wide in outdoor and indoor conditions and close to the latest iPhones, which lead the pack in this respect. Differences are noticeable in low light, though. Our testers also found that exposure was unstable and that exposure adaptation in changing light conditions was not always smooth.

Xiaomi 12S Ultra, video still: fairly good dynamic range and good target exposure

Apple iPhone13 Pro Max, video still: fairly good dynamic range and good target exposure

Honor Magic4 Ultimate, video still: fairly good dynamic range but slight underexposure

Xiaomi 12S Ultra: limited dynamic range in low-light conditions

Apple iPhone 13 Pro Max: good dynamic range in low-light conditions

 

Xiaomi 12S Ultra: wide dynamic range, good exposure on face but exposure stepping and instabilities

Xiaomi Mi 11 Ultra: stable exposure

Color

103

Best: Honor Magic4 Ultimate

Exposure tests evaluate the brightness of the main subject and the dynamic range, i.e., the ability to render visible details in both bright and dark areas of the image. Stability and temporal adaptation of the exposure are also analyzed.
Image quality color analysis looks at color rendering, skin-tone rendering, white balance, color shading, and the stability of the white balance and its adaptation when the light changes.

Video white balance tends to be accurate, but in bright outdoor conditions blue hues are sometimes rendered inaccurately. The color signature in video differs from that in photos, and the Leica Authentic and Vibrant color presets are only available when shooting still images.

Xiaomi 12S Ultra: inaccurate blue hues

Apple iPhone 13 Pro Max: more natural colors

Honor Magic4 Ultimate: more natural colors

Autofocus

94

Best: Apple iPhone 13 Pro Max

Xiaomi 12S Ultra: quick focus convergence but noticeable stepping

Apple iPhone 13 Pro Max: quick and smooth focus convergence

Honor Magic4 Ultimate: very quick and smooth focus convergence

Focus convergence when the subject distance changes is not as smooth as on the best-in-class competitors. Our testers also observed some focus issues, including unnecessary refocusing.

Xiaomi 12S Ultra: unwanted focus searching

Apple iPhone 13 Pro Max: no focus issues

Honor Magic4 Ultimate: no focus issues

Texture

86

Best: Oppo Reno6 Pro 5G (Snapdragon)

Texture tests analyze the level of detail and texture in real-life videos as well as in videos of charts recorded in the lab. Natural video recordings are visually evaluated, with particular attention paid to the level of detail in both the bright and the dark areas. Objective measurements are performed on videos of charts recorded in various conditions from 1 to 1000 lux. The charts used are the DXOMARK chart (DMC) and the Dead Leaves chart.

As seen in the graph below, texture levels are slightly below the best in class across all light conditions. As expected, there is a drop in detail as the light gets dimmer.

DXOMARK CHART (DMC) detail preservation video score vs lux levels

This graph shows how the DMC detail preservation video score evolves with the light level in lux. The DMC detail preservation score is derived from an AI-based metric trained to evaluate texture and detail rendering on a selection of crops of our DXOMARK chart.
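The actual DMC score comes from a trained AI metric, but a toy high-pass statistic gives some intuition for what a detail measure responds to. The sketch below (a crude proxy of our own, not DXOMARK's metric) scores an image by the energy of its Laplacian, which drops when noise reduction smears fine texture.

```python
# Toy detail proxy (not the AI-based DMC metric): mean squared response of a
# 4-neighbour Laplacian; heavier smoothing means less high-frequency energy.
import numpy as np

def laplacian_detail(img):
    """Mean squared 4-neighbour Laplacian over the interior of a 2-D image."""
    lap = (-4 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(np.mean(lap ** 2))

rng = np.random.default_rng(0)
textured = rng.random((64, 64))                          # stand-in for a detailed crop
smoothed = textured[::2, ::2].repeat(2, 0).repeat(2, 1)  # same crop with detail removed
print(laplacian_detail(textured) > laplacian_detail(smoothed))  # True
```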

The following frames and crops, extracted from one of our test videos in bright light, show the slight lack of detail of the Xiaomi 12S Ultra compared with the Xiaomi Mi 11 Ultra.



Xiaomi 12S Ultra, video still 1000Lux

Xiaomi 12S Ultra: lack of fine detail



Xiaomi Mi 11 Ultra, video still 1000Lux

Xiaomi Mi 11 Ultra: better detail

Noise

93

Best: Apple iPhone 13 Pro Max

Noise tests analyze various attributes of noise, such as intensity, chromaticity, grain, structure, and temporal behavior, on real-life video recordings as well as on videos of charts taken in the lab. Natural videos are visually evaluated, with particular attention paid to noise in the dark areas and in high dynamic range conditions. Objective measurements are performed on videos of charts recorded in various conditions from 1 to 1000 lux. The chart used is the DXOMARK visual noise chart.

The 12S Ultra’s video noise levels are slightly higher than on some rivals across all light levels. This is true for both spatial and temporal noise, as our lab measurements in the graphs below show. In very low light (1 lux), exposure was too low to perform a noise measurement, but this extreme use case has very little impact on our scores.

Spatial visual noise evolution with the illuminance level

This graph shows the evolution of spatial visual noise with the light level in lux. Spatial visual noise is measured on the visual noise chart in the video noise setup. The DXOMARK visual noise measurement is derived from the ISO 15739 standard.
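As a loose illustration of the idea behind a perceptually weighted noise metric (the actual ISO 15739 procedure is more involved, and the weights below are assumptions of ours, not the standard's values):

```python
# Hedged sketch of a visual-noise-style metric: weight the per-channel
# variances of a flat patch in a CIELAB-like space, penalising lightness
# noise more than chroma noise. Weights are illustrative, not ISO 15739's.
import numpy as np

def visual_noise(L, a, b, w_L=1.0, w_a=0.5, w_b=0.3):
    """Weighted noise estimate on a nominally uniform patch."""
    return float(np.sqrt(w_L * np.var(L) + w_a * np.var(a) + w_b * np.var(b)))

flat = np.zeros((32, 32))
rng = np.random.default_rng(0)
noisy_L = rng.normal(50.0, 2.0, (32, 32))  # noisy lightness plane
print(visual_noise(flat, flat, flat))       # 0.0 on a perfectly clean patch
```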

Temporal visual noise evolution with the illuminance level

This graph shows the evolution of temporal visual noise with the light level in lux. Temporal visual noise is measured on the visual noise chart in the video noise setup.
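Temporal noise, unlike spatial noise, shows up as frame-to-frame flicker. In simplified form (an assumption about the general approach, not DXOMARK's exact protocol), it can be computed as per-pixel variation across frames of a static scene:

```python
# Simplified sketch: temporal noise as the per-pixel standard deviation
# across frames of a static scene, averaged over the image.
import numpy as np

def temporal_noise(frames):
    """frames: array of shape (n_frames, height, width)."""
    return float(np.std(frames, axis=0).mean())

rng = np.random.default_rng(1)
static = np.full((10, 32, 32), 0.5)                   # perfectly static scene
noisy = static + rng.normal(0.0, 0.02, static.shape)  # with sensor noise added
print(temporal_noise(static))  # 0.0
```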

In more common but still challenging low light use cases, such as this scene, the 12S Ultra produces chromatic noise, which is more noticeable than on the best in class.



Xiaomi 12S Ultra, low light video still

Xiaomi 12S Ultra: visible chromatic noise



Xiaomi Mi 11 Ultra, low light video still

Xiaomi Mi 11 Ultra: noise well under control

Stabilization

98

Best: Vivo X70 Pro+

Stabilization evaluation tests the device’s ability to stabilize footage using software or hardware technologies such as OIS, EIS, or other means. The evaluation looks at residual motion, smoothness, jello artifacts, and residual motion blur in walking and running use cases under various lighting conditions. The video below is an extract from one of the tested scenes.
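Residual motion of the kind this test looks for can be estimated frame to frame. The toy sketch below recovers a purely horizontal shift between two frames by cross-correlating their column-mean intensity profiles; it is a simplified stand-in of our own, not DXOMARK's measurement pipeline.

```python
# Toy residual-motion estimator: recover an integer horizontal shift between
# two frames via cross-correlation of their column-mean intensity profiles.
import numpy as np

def estimate_shift(prev, curr):
    """Estimated horizontal shift (in pixels) of `curr` relative to `prev`."""
    p = prev.mean(axis=0) - prev.mean()   # centred column-mean profile
    c = curr.mean(axis=0) - curr.mean()
    corr = np.correlate(c, p, mode="full")
    return int(np.argmax(corr)) - (len(p) - 1)

rng = np.random.default_rng(2)
frame = rng.random((16, 64))
shifted = np.roll(frame, 3, axis=1)  # simulate a 3-pixel camera shift
print(estimate_shift(frame, shifted))  # 3
```

A well-stabilized clip would show small, smoothly varying shifts between consecutive frames rather than abrupt jumps.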

Video stabilization on the Xiaomi 12S Ultra is effective at counteracting camera shake and motion. However, when the camera motion is more complex, stabilization is not perfectly smooth, which explains the lower stabilization score compared to the current top device for video, the Xiaomi Mi 11 Ultra. This can be seen in the samples below, recorded with both phones mounted on the same support and recording simultaneously. When the camera rotates (between 3 and 5 seconds), the Xiaomi 12S Ultra footage is noticeably choppier than the Mi 11 Ultra’s.

Xiaomi 12S Ultra: visible “frame shift” or smoothness stabilization artifacts at 3 and 4 seconds during the rotation of the camera

Xiaomi Mi 11 Ultra: smooth stabilization during rotation of the camera

Artifacts

85

Best: Xiaomi 12S Ultra (highest score)

Artifacts are evaluated with MTF and ringing measurements on the SFR chart in the lab, as well as frame-rate measurements using the LED Universal Timer. Natural videos are visually evaluated, paying particular attention to artifacts such as aliasing, quantization, blocking, and hue shift, among others. The more severe and the more frequent an artifact, the higher the point deduction from the score. The main artifacts and the corresponding point loss are listed below.

Our testers did not observe any strong artifacts in the 12S Ultra’s video footage. Overall, it is one of the best devices in this category that we have seen to date. Our testers barely noticed ringing, moiré, and color fringing. Thanks to the fast 60fps frame rate, the judder effect is especially well under control, making for very smooth panning shots.
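The judder benefit of 60fps comes down to simple arithmetic: each frame of a pan covers half the angular step it would at 30fps, so motion is sampled twice as densely. A trivial sketch (the pan speed is chosen arbitrarily for illustration):

```python
# Why 60 fps pans look smoother: the angular step between consecutive
# frames halves compared with 30 fps recording.

def pan_step_deg(pan_speed_deg_per_s, fps):
    """Angular motion captured between two consecutive frames, in degrees."""
    return pan_speed_deg_per_s / fps

print(pan_step_deg(30, 30))  # 1.0 degree per frame at 30 fps
print(pan_step_deg(30, 60))  # 0.5 degree per frame at 60 fps
```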
