We give you an overview of our two types of Awards, our dynamic score system, and the tools and methodology we use for testing. You will also find a list of products that have received awards.
Last updated on June 22, 2015 as part of the major FlatpanelsHD redesign.
We have two Awards that are given to great products: the Reference Award and the Highly Recommended Award. We can obviously only give awards to products that we have had in for testing. Our Recommendations list, on the other hand, includes all the TVs, monitors, and media players that we consider recommendable – even if we haven’t had the product on the test bench.
The Reference Award is the ultimate award, given to displays that set a new benchmark for picture quality. These displays stand out from the crowd and set a reference for all other products to beat. Our Reference Award takes into account picture quality only – not price, not features, not UI. There is a single exception: if a display can deliver the exact same picture quality at a lower price, it takes over the award.
The Reference Award is valid until a new display takes over. That may take a few months, or it may take years. If a new Reference has not appeared during a given year, the award carries over into the next year. The Award is year-numbered, and at the end of the calendar year it is archived and can no longer be taken away. Only within the same calendar year can a Reference Award be taken away and given to a new, better display.
The Highly Recommended Award is given to a product that we think is recommendable. Picture quality is the main factor, but we also take into account features and UI/UX. Finally, we consider price and compare the full package to what is available on the market.
The Highly Recommended Award can be given to as many products as necessary. It is linked to our dynamic score system (see below) in the sense that a TV that received the Award two years ago is not necessarily better than a TV that did not receive one this year. The requirements for this Award change every year and are always assessed in relation to what is currently available on the market.
A Highly Recommended Award can only be taken away in instances where a change to the product significantly degrades quality or if the price increases beyond normal fluctuations.
Dynamic score system
We have recently introduced a score system for our reviews. The primary reason we have not used one until now is that scores given in earlier reviews often prove invalid quickly, and that new reviews constantly push towards 100%, eventually rendering the full scale invalid.
So we developed our own dynamic score system. In the process we had to overcome certain technical challenges and we have spent months evaluating before putting it into action. Here it is.
All our TV, monitor, and media player reviews will use the new score system, which is divided into categories. For TVs the categories are: picture quality, features, and user experience.
“Picture quality” is an assessment of factors such as color reproduction, image processing, motion handling, contrast / black level, light homogeneity etc. “Features” is an evaluation of functionality such as apps, smart features, tuners, recording, as well as sound quality. “User experience” is an assessment of the user interface, speed, usability, the remote control, and build quality.
The total score is a weighted sum of the subcategories: “Picture Quality” weighs 50%, “Features” weighs 25%, and “User Experience” weighs 25%.
We have three different sets of scores; one for TVs, one for monitors, and one for media boxes (media streamers, game consoles etc). Monitors will be assessed based on picture quality (40%), speed / response time (10%), features (25%) and ergonomics (25%). Media boxes will be assessed based on features (30%), ecosystem / apps (40%) and user experience (30%).
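The weighted totals described above can be sketched as a small calculation. This is our own illustration, not FlatpanelsHD code: the dictionary keys and the example point values are assumptions made for the sketch.

```python
# Category weights per product type, as described above (each set sums to 1.0).
WEIGHTS = {
    "tv": {"picture_quality": 0.50, "features": 0.25, "user_experience": 0.25},
    "monitor": {"picture_quality": 0.40, "speed": 0.10,
                "features": 0.25, "ergonomics": 0.25},
    "media_box": {"features": 0.30, "ecosystem": 0.40, "user_experience": 0.30},
}

def total_score(product_type, category_points):
    """Return the weighted sum of category points for a product type."""
    weights = WEIGHTS[product_type]
    return sum(weights[cat] * points for cat, points in category_points.items())

# A hypothetical TV: 85 points picture quality, 70 features, 60 user experience.
tv_total = total_score("tv", {"picture_quality": 85,
                              "features": 70,
                              "user_experience": 60})
# 0.50 * 85 + 0.25 * 70 + 0.25 * 60 = 75.0
```

The same function covers monitors and media boxes simply by switching the weight set.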
And then the “new” dynamic element: each of the three categories is scored on an absolute scale where the maximum number of points that can be achieved is a moving target that we determine. In 2015, we start with all categories on index 100, but as new and better technologies are introduced – for example, a new TV that sets a new bar for picture quality – the maximum score (one for each category) is increased. This means that a TV with a score of 85 points in 2015 will have a score of 85% (= 85/100). But in 2016 a new and better TV might increase the maximum score to index 120, and the old TV will now score 85/120 = 71%.
That is why the score system will allow you to compare TVs even across years.
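The arithmetic in the example above can be expressed in a few lines. A minimal sketch of the normalization – the function name is our own, not part of any published system:

```python
def normalized_score(points, max_index):
    """Express absolute category points as a percentage of the current
    maximum index, rounded to a whole percent as in the example above."""
    return round(100 * points / max_index)

# 85 points against the 2015 index of 100:
print(normalized_score(85, 100))  # 85
# The same 85 points after a better TV raises the 2016 index to 120:
print(normalized_score(85, 120))  # 71
```

Note that the absolute points never change; only the denominator moves, which is what keeps scores comparable across years.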
All of our reviews will include the score and conclusion box seen below. It is updated automatically and dynamically. You can find older reviews and read the score that compares to what is available today. Neat, right?
Picture quality is assessed as overall picture quality, including color reproduction, image processing, contrast, motion etc. Features is an evaluation of the built-in functionality such as apps, connector ports, tuners, recording capabilities, decoder formats, and how useful they are, as well as sound quality. User experience is evaluated on the basis of user friendliness, speed, build quality, and day-to-day use of the TV. Total score is weighted: 50% Picture quality, 25% Features, 25% User experience. All scores are calculated against a moving maximum target, defined by what we currently consider the best on the market, and presented as a percentage. This means that a score will fall over time as new and better TVs set new standards, which allows you to compare scores across years. A score of 100% in a given category means that the product is considered the best available in this category to date.
Pros: fantastic picture quality, perfect black level, colors, webOS is promising, virtually perfect viewing angles.
Cons: curved panel, motion reproduction not great, high input lag, webOS too slow, poor sound quality.
We will raise the maximum score whenever we find a better or revolutionary technology that sets a new bar for quality, but as a general rule we reexamine the maximum score once a year. This year, for example, almost every new TV will offer a new operating system, so if we had launched the score system in 2014, the maximum score would probably have been increased this year. Another example is the introduction of new picture technologies such as OLED, 4K, 8K, and HDR, which will raise the bar. But as said, we start on index 100 this year.
It should be noted that a maximum score may well be composed of factors not found in a specific TV (and therefore not available to buy) – which differs from our “Reference Award”. For example, a 4K OLED panel with HDR from manufacturer A, combined with image processing from manufacturer B, an operating system from manufacturer C, and various other features from manufacturers D and E. It is therefore unlikely to see any product score 100% - even if it has the Reference Award! One consequence of this is that you will probably find the scores a bit lower than on other reviewer sites that have more loosely defined score systems. This is not because the product is bad, just because our target is very high.
When it comes to our two awards, things are a little different. You can read a full explanation of the Awards above, but in relation to scores it is important to note that scores do not take price into account, whereas our Highly Recommended Award does. So the two are not directly linked. For example, a TV with a 60% score in picture quality might receive a Highly Recommended Award whereas a TV with 65% does not, for the simple reason that the former is considerably cheaper – more value for money.
As part of the redesign of FlatpanelsHD we have changed the structure and layout of reviews. In addition to the new score system, we will make use of more tables that should be self-explanatory, meaning that the text will become less descriptive. We have also introduced several interactive tab elements, so look for something you can tap on!
However, our test methodology is virtually unchanged. We use equipment for measuring image parameters such as color accuracy, gamma, contrast (always on ANSI patterns), black level, brightness, input lag, and PWM (flicker). These measurements are presented in the tables and are of course included in the assessment of the product. We have also begun taking time readings: we measure boot-up times for the TV and for the most important apps. That way you can compare products from different manufacturers.
Besides the theoretical tests, we obviously release the products into the wild. For a TV that means TV channels, Blu-ray, streaming services, game consoles, and PCs. We continue to use our own monitorTest for stress testing, but we also use other programs and discs.
As always, we include our suggested calibration settings in all reviews. They should be considered guidance, not conclusive fact. It is important to emphasize that in calibration we use industry standards as targets: when a TV is told to reproduce a certain shade of purple, it must produce exactly that color – nothing else. It is not up to the manufacturer to interpret or “improve” it, which unfortunately is a common misconception about picture reproduction. However, colors can be affected by ambient light (for example room lighting intensity and the color temperature of the light), which is why our suggested calibration settings are designed for relatively dark room viewing. The settings might be somewhat different if you mostly use your TV in a brightly lit room.
We developed this test methodology almost a decade ago and have perfected it over the years. It is just as valid today as it was back then, simply because it addresses the fundamental elements of picture quality. This also means that all displays are evaluated on equal terms. That is what we believe a review is about!