How We Test Desktop PCs
At ConsumerReviews.TV, our approach to reviewing desktop computers upholds the traditions of our dedicated Testing Labs, whose legacy traces back to 1984. We compare each system against its counterparts in the same category on key factors: price, features, design, and our in-house performance evaluations.
To gauge performance, we employ a comprehensive suite of benchmark software alongside real-world applications and games. These curated tests are designed to highlight the strengths and weaknesses of a tested PC’s component mix: the processor, memory subsystem, storage hardware, and graphics capabilities.
To ensure accuracy, we use standardized tests from reputable benchmark developers, supplemented by proprietary tests of our own where necessary. To stay ahead of the curve, we continually assess new benchmarks as they reach the market and adjust our testing procedures to reflect the impact of the latest technologies.
We conduct comprehensive evaluations of both desktop computers and laptops to ensure accurate, insightful comparisons. Our process centers on UL’s PCMark 10 benchmark, a robust tool that replicates real-world productivity and content-creation scenarios.
This benchmark empowers us to effectively gauge the performance of everyday tasks such as word processing, web browsing, and videoconferencing.
The results are presented as a proprietary numeric score, where higher values indicate superior performance. To maintain uniformity in our assessments, we employ the primary test suite provided by PCMark 10.
For desktop PCs, we carry out the test at a resolution of 1,920 by 1,080 pixels (1080p), unless the system is an all-in-one (AIO) desktop equipped with an integrated display, which is tested at its native resolution. Our evaluation isn’t limited to PCMark 10 alone; we also administer the Full System Drive storage subtest.
This subtest serves to quantify program load times and boot drive throughput, offering a numeric score that reflects the responsiveness of the storage subsystem.
This comprehensive approach ensures that our reviews are both accurate and valuable to our readers seeking informed purchasing decisions.
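As a rough illustration of the kind of figure the storage subtest rolls up, drive throughput is simply data moved divided by elapsed time. This sketch uses hypothetical numbers, not actual PCMark 10 output:

```python
# Hypothetical sequential read: 8 GiB moved in 3.2 seconds (illustrative only).
bytes_moved = 8 * 1024**3
elapsed_s = 3.2

throughput_mibps = bytes_moved / elapsed_s / 1024**2  # MiB per second
print(f"{throughput_mibps:.0f} MiB/s")
```

Higher throughput, like a higher subtest score, indicates a more responsive storage subsystem.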
Geekbench, a processor-focused benchmark, runs a series of CPU workloads simulating real-world applications. We record the Geekbench Multi-Core score, which enables cross-platform comparisons.
For video editing capabilities, we rely on HandBrake, a tough, threaded workout that transcodes a 4K video to a 1080p MP4 file. We time the process and consider lower results (faster times) as better performance.
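The timing logic behind a lower-is-better test is straightforward: a stopwatch around the workload. This is a minimal sketch; the workload here is a stand-in, not HandBrake itself:

```python
import time

def timed(job):
    """Run a workload and return elapsed wall-clock seconds (lower is better)."""
    start = time.perf_counter()
    job()
    return time.perf_counter() - start

# Stand-in workload; a real run would launch the 4K-to-1080p transcode here.
elapsed = timed(lambda: sum(range(1_000_000)))
print(f"Finished in {elapsed:.2f} s")
```

`time.perf_counter()` is a monotonic high-resolution clock, so it is unaffected by system clock adjustments mid-run.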
Lastly, we employ Puget Systems’ PugetBench for Photoshop, which uses Adobe’s image editor to measure a PC’s aptitude for content creation. This benchmark covers a range of tasks, including image resizing, applying filters, and GPU-accelerated functions. The PugetBench for Photoshop Overall Score provides a numeric value representing the system’s performance in content creation applications.
By utilizing these comprehensive benchmark tests and evaluations, we ensure that our reviews on ConsumerReviews.TV are accurate, reliable, and enable users to make informed decisions when choosing a desktop computer for their specific needs.
Graphics Performance Testing
As with productivity testing, we compare systems within the same category on price, features, design, and performance, with the goal of accurate, reliable reviews that help consumers make informed decisions.
To evaluate the graphics performance of desktop PCs, we use a range of benchmark tests that challenge every system and provide meaningful comparisons. One of the benchmarks we employ is UL’s 3DMark, which consists of various subtests that measure graphics muscle by rendering highly detailed 3D graphics. These tests emphasize particles and lighting.
We run two DirectX 12 tests from 3DMark on all PCs: Night Raid and Time Spy. Night Raid is suitable for lower-power systems with integrated graphics, while Time Spy is more demanding and suitable for high-end PCs with the latest graphics cards. Each test provides an overall score, with higher numbers indicating better performance.
In addition to 3DMark, we also utilize GFXBench, a cross-platform GPU performance benchmark. We run two subtests, Aztec Ruins and Car Chase, which stress-test low-level routines and game-like image rendering. These tests measure frames per second (fps), with higher numbers indicating better performance.
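One detail worth making explicit: an average fps figure is total frames divided by total time, not the mean of per-frame instantaneous rates. A sketch with hypothetical per-frame times:

```python
# Hypothetical per-frame render times in seconds (illustrative only).
frame_times = [0.016, 0.017, 0.015, 0.020, 0.016]

avg_fps = len(frame_times) / sum(frame_times)  # total frames / total time
print(f"{avg_fps:.1f} fps")
```

Computed this way, slow frames weigh on the average in proportion to the time they actually take on screen.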
While synthetic benchmarks are useful for assessing general 3D graphics aptitude, we also evaluate gaming performance using full retail video games. Titles like Cyberpunk 2077, Resident Evil 4, and Hi-Fi Rush, which have built-in benchmarks, help us assess how a system handles real-world gaming at different settings.
We test these games at both moderate and maximum graphics-quality presets, primarily at 1080p but sometimes at higher resolutions like 4K if the system configuration allows. For all-in-one desktops, we test at their native screen resolution if it’s neither 1080p nor 4K.
In some cases, we run F1 2021 twice to evaluate the impact of Nvidia’s DLSS technology on less powerful desktops. The first run uses TAA anti-aliasing; if the system supports it, we switch to DLSS on a second run to improve frame rates. Our comprehensive suite of benchmark tests and evaluations ensures accurate and reliable reviews on ConsumerReviews.TV.
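The two-run DLSS comparison reduces to a simple relative uplift between the TAA and DLSS frame rates. The numbers below are hypothetical, not measured results:

```python
# Hypothetical results from the two F1 2021 runs (illustrative only).
taa_fps = 48.0   # first run, TAA anti-aliasing
dlss_fps = 67.0  # second run, DLSS enabled

uplift_pct = (dlss_fps - taa_fps) / taa_fps * 100  # percent gain over TAA
print(f"DLSS uplift: {uplift_pct:.1f}%")
```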
Special Cases: Workstations and Chrome OS
Not all of the aforementioned tests are conducted on each computer we evaluate at ConsumerReviews.TV. Our testing methodology is tailored to provide relevant insights for different types of systems. In the case of systems designed explicitly for gaming and equipped with dedicated graphics cards, we exclusively employ in-game benchmarks to assess performance.
For Macs, we steer clear of PCMark and 3DMark due to the absence of macOS versions of these tests. To comprehensively evaluate specialized subsets of desktops, we integrate supplementary tests into our standard procedures.
For Chrome OS desktops, a category that has become increasingly rare, compatibility with our standard tests is limited. We rely on PCMark, GFXBench, and Geekbench, all of which offer Android app versions suitable for Chromeboxes. However, it’s important to note that the scores generated by these tests are not directly comparable to those of Windows desktops or Macs.
Despite this limitation, we utilize these tests alongside alternatives like CrXPRT and Basemark Web, using their proprietary scores as relative benchmarks for Chrome machines.
Our evaluation scope extends to workstations as well. These robust machines cater to design, graphics rendering, and data science applications and are often equipped with high-end Core or Xeon processors and Nvidia or AMD professional GPUs. We incorporate two additional benchmarks specifically tailored to these professional systems.
First is the Blender benchmark, which measures the time taken by Blender’s Cycles path tracer to render photorealistic scenes of BMW cars using both the system’s CPU and GPU. Lower rendering times are indicative of superior performance.
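Because lower render times are better, comparing the CPU and GPU passes reduces to a ratio. The times below are hypothetical, for illustration only:

```python
# Hypothetical Blender render times in seconds (lower is better; illustrative only).
cpu_render_s = 312.0
gpu_render_s = 74.0

speedup = cpu_render_s / gpu_render_s  # how many times faster the GPU pass finished
print(f"GPU pass is {speedup:.1f}x faster than the CPU pass")
```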
The second benchmark, SPECviewperf 2020, is a well-recognized suite that gauges graphics performance for professional applications. This test involves rendering, rotating, and zooming in and out of models using viewsets from renowned independent software vendor (ISV) applications.
We execute the 1080p resolution tests based on PTC’s Creo CAD platform, Autodesk’s Maya software, and Dassault Systèmes’ SolidWorks 3D rendering package. Results are conveyed in frames per second (fps), with higher scores denoting better performance. This comprehensive testing approach enables us to offer thorough insights for our readers seeking accurate evaluations of a diverse range of systems.