Smartphones: what they go through in our test lab (2021)

    Our laboratory sees between 150 and 200 smartphones and tablets per year. You may be wondering what path these mobile devices follow on our premises. Look no further: the answers are here!

    Every self-respecting new smartphone soon makes its way to the Digital editorial offices. Once there, each device tested passes through several rooms of the laboratory (audio, photo, etc.). Several hundred points are scrutinized, analyzed, and compared; no device escapes any stage of our protocol. While we obviously cannot reveal the protocol in its entirety here, we invite you to follow the route dedicated to smartphones and tablets within the editorial offices.

    Screen: immersed in pixels

    As the primary point of interaction between a smartphone or tablet and its user, the screen is one of the most important parts of our tests; it counts for a lot in the final score (see box). To measure this central element of a smartphone (or tablet), we use an X-Rite i1 DisplayPro probe along with a battery of test charts developed in-house. To make results comparable across smartphones, all measurements are taken at 200 cd/m². On the software side, we use the HCFR colorimeter.

    The probe used to measure the screen's color temperature and contrast ratio.

    The first criterion measured is the contrast ratio, defined as the ratio between the luminance of white and that of black. We place our probe on black and white test charts to calculate this famous white/black ratio. At this little game, smartphones with AMOLED panels, like the Samsung Galaxy S20 Ultra, do best, with values our probe cannot define precisely because black is… black.
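    As an illustration, the computation behind that ratio is simple division; here is a minimal sketch (the luminance readings below are hypothetical examples, not our actual measurements):

```python
def contrast_ratio(white_cd_m2: float, black_cd_m2: float) -> float:
    """Contrast ratio = white luminance / black luminance.

    On AMOLED panels black can measure as 0 cd/m2, in which case
    the ratio is effectively infinite rather than a finite number.
    """
    if black_cd_m2 == 0:
        return float("inf")
    return white_cd_m2 / black_cd_m2

# Hypothetical readings at a 200 cd/m2 reference brightness:
print(contrast_ratio(200.0, 0.13))  # a typical LCD: roughly 1538:1
print(contrast_ratio(200.0, 0.0))   # AMOLED: "infinite" contrast
```

    This is exactly why AMOLED screens come out on top: with a true black of 0 cd/m², the division has no finite answer.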

    The second criterion, color fidelity, is judged by measuring the delta E: the average of the differences between perfect colors and those actually displayed. Here too we rely on in-house test charts. At the same time, grayscale test charts let us measure the average gamma and the average color temperature.
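    The delta E figure we quote is just such an average over the chart's color patches; a sketch with invented per-patch values (not real measurements):

```python
def average_delta_e(deltas):
    """Average of per-patch delta E values, i.e. the difference
    between the color requested and the color actually displayed."""
    return sum(deltas) / len(deltas)

# Hypothetical delta E values for a handful of test-chart patches.
# Below roughly 3, the deviation is generally considered invisible
# to the naked eye.
patches = [1.8, 2.4, 3.1, 1.2, 2.0]
print(round(average_delta_e(patches), 2))  # -> 2.1
```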

    Manufacturers sometimes multiply display modes to let everyone find what they are looking for. Vivid, Cinema, Reading, Basic… so many names for modes with often very disparate results. We test them all and keep the one that delivers the best results.

    In addition to display quality, we measure the responsiveness of the screen. For this, a device filming at 1000 fps takes over from the probe. We evaluate the afterglow, that is, the time an image takes to disappear, and the touch latency, the time the screen needs to react to a finger.
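    At 1000 fps, each captured frame represents exactly one millisecond, so counting frames on the footage gives the delay directly; a sketch with hypothetical frame counts:

```python
CAPTURE_FPS = 1000  # frame rate of the high-speed camera

def frames_to_ms(frame_count: int, fps: int = CAPTURE_FPS) -> float:
    """Convert a number of captured frames into milliseconds."""
    return frame_count * 1000.0 / fps

# Hypothetical counts read off the high-speed footage:
print(frames_to_ms(18))  # afterglow: image gone after 18 frames = 18.0 ms
print(frames_to_ms(65))  # touch latency: 65 frames = 65.0 ms
```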

    Finally, we evaluate reflectance. For this, we use a glossmeter that measures the amount of light reflected at various viewing angles. We collect the results and compute an average, which we then report to you.

    Photo: like a digital camera

    Today, photography is an essential use of mobile devices, and over time megapixels and camera modules have multiplied. To assess mobile devices in this discipline, we use our classic camera protocol and test scene. In the photo lab, we capture pictures in lighting equivalent to daylight, in low light, with flash, on a tripod and hand-held. Main, ultra-wide-angle, telephoto… every camera module goes through it. And if the smartphone offers it, we also examine the results at full definition, if only to check the relevance of the 108-megapixel sensor of the Galaxy S20 Ultra, for example.


    Since some details may escape us under these conditions, all smartphones are also used to take photos “in real life”. Finally, we evaluate the front sensor and the video mode, as well as the ergonomics of the dedicated app and the responsiveness of the entire photo environment.

    The photos are then processed and made available in our face-off tool, which brings together smartphones and tablets alongside cameras.

    Performance: multitasking and gaming

    We evaluate the performance of a smartphone, as always, in partnership with SmartViser. The tests take place in two stages. First, we assess the phone's ability to juggle different applications, taking the opportunity to load the RAM to varying degrees to see how the chip reacts.


    Second, gaming is put to the test. For this, we run software developed by SmartViser. This game lets us see not only what the phone has under the hood, but also its frame rate and its ability to sustain it. Some phones can indeed deliver a high frame rate, but only for five minutes, whether because of overheating or software optimization; we make sure to check what happens on each one. Combined, the multitasking and gaming tests produce a performance index that determines the grade.
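    How exactly the two phases are folded into one index is internal to the protocol; purely as an illustration of the idea, here is a weighted-average sketch (the weights and scores below are invented, not Digital's actual formula):

```python
def performance_index(multitask_score: float, gaming_score: float,
                      multitask_weight: float = 0.5) -> float:
    """Combine multitasking and gaming scores (both on a 0-100 scale
    here) into a single index via a weighted average. The 50/50 split
    is an assumption for illustration only."""
    gaming_weight = 1.0 - multitask_weight
    return multitask_score * multitask_weight + gaming_score * gaming_weight

# Invented scores for illustration:
print(performance_index(82.0, 74.0))  # -> 78.0
```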

    Battery life: Viser, Netflix and real life

    To assess the battery life of smartphones, we use an application developed by SmartViser. It simulates a host of uses (internet browsing, calls, SMS, video playback, basic and 3D computation, etc.), so the devices are subjected to the equivalent of real-world use until the battery is exhausted. Using the smartphones and tablets daily until the review is published also allows us to assess their endurance in a more subjective way. We also evaluate the charger supplied with the smartphone.

    Remember that the situation for iPhones is still different today. Since they are not compatible with the Viser protocol, we continue to evaluate them with a Netflix battery test. A battery-life score obtained with our Android protocol (Viser) is therefore in no way comparable to one obtained with the iOS protocol (Netflix). Charging time will, however, be taken into account for the next Apple phones.

    Audio: in the den of sound

    As you may have noticed if you have been reading us for a while, audio is no longer a standalone subcategory. We did not make it disappear, however: audio performance is now part of the ergonomics and design score. We therefore simply penalize phones that offer neither a 3.5 mm mini-jack connector nor an adapter. Apple and its latest-generation iPhones are no strangers to this penalty.

    Let's take the time to explain how these mini-jack connectors are evaluated. The audio lab is surely the busiest at Digital, and for good reason: everything capable of producing sound must pass through it. TVs, monitors, smartphones, tablets, headsets, speakers, laptops… all are entitled to the charm of the subdued light in this small soundproofed room dedicated to acoustics. We test the characteristics of the headphone output by connecting the smartphone to our Audio Precision AP58x analyzer, then feed it a variety of audio files to assess output power, dynamic range, crosstalk, and distortion. These tests are carried out after deactivating all the equalizers and other exotic options promising Dolby Atmos or 12.1 sound, which often (not to say systematically) lead to cacophony.


    Once the machine has spoken, we cross-check its verdict with our very human ears. We certainly cannot detect deviations of a few decibels, but it happens that the machine is wrong (well, that the human who set its parameters is wrong) or that a setting slips. When we are in doubt, we bring in the editorial staff's audio experts. Sometimes a completely outlandish measurement turns out to be nothing other than the work of a careless manufacturer.

    At the same time, the loudspeaker(s) are also analyzed. The vast majority sit between very bad and bad, but here and there we come across decent performances and, exceptionally, a few good students.

    Because not everything is quantifiable

    The procedures above give us quantified results, but some things are evaluated through our expertise (yes, when you see this many smartphones pass through, you can call yourself an expert). This is the case with ergonomics, for which we rely on our impressions, those of others, and comparisons with other products. We do not judge a smartphone on its beauty, which already fuels enough debate among testers and readers, but on the quality of the materials and their assembly, the clever placement of a given button, or the proportion of the front occupied by the screen.

    And to keep sharpening our assessment, we want to keep in mind what matters to you. We therefore regularly run surveys to find out which criteria interest you, in order to meet your expectations.

    Let’s get out of the lab

    Although the results of our laboratory tests give us points of comparison between devices, such controlled conditions can sometimes miss certain flaws or strengths. The other part of our tests therefore consists of using smartphones and tablets as you would in everyday life: in transport, at home, at the office, in the street… in short, everywhere!

    It is only after all these steps have been completed, with or without success, that we let the devices go on their way.
