Originally Posted by flippin_waffles
Benchmarks have been run largely unchanged for the last decade, and they have been a flawed way of discerning hardware performance from the beginning. It certainly benefits whichever company can afford to have the most software optimized in its favor, but it still isn't an accurate way of deciding which architecture best fits an individual's needs, IMO.
For example, if I wanted to calculate Pi to 1 million digits, would I use SuperPi, or another program that can do it in a fraction of the time?
-Would it make more sense to go spend $1000 on a new system, or download a faster, 500 kB program for free?
Or if I wanted a good compression utility,
-Would it make more sense to spend $1000 on a new system, or download a faster, freeware, multi-threaded compression utility?
Or if I wanted a good encoder/transcoder,
-Would it make more sense to spend $1000 on a new system, or download a faster, freeware (or <$50) program that dramatically increases the speed?
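The underlying point, that the same hardware runs different software at very different speeds, is easy to demonstrate. A minimal sketch using Python's standard-library compressors (zlib and lzma standing in for the 7zip/WinRar comparison; the workload is made up for illustration):

```python
import time
import zlib
import lzma

# Hypothetical workload: a couple of MB of moderately compressible text.
data = b"The quick brown fox jumps over the lazy dog. " * 50000

results = {}
for name, compress, decompress in [
    ("zlib", zlib.compress, zlib.decompress),
    ("lzma", lzma.compress, lzma.decompress),
]:
    start = time.perf_counter()
    packed = compress(data)
    elapsed = time.perf_counter() - start
    # Both must round-trip correctly; the difference is speed and ratio.
    assert decompress(packed) == data
    results[name] = (elapsed, len(packed))

for name, (elapsed, size) in results.items():
    print(f"{name}: {elapsed:.3f}s, {size} bytes")
```

On identical hardware the two implementations will post very different times and compressed sizes, which is exactly the kind of software-side variable a hardware-only benchmark chart hides.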
To me it's a no-brainer. Looking at a round of graphs and making your purchasing decisions based on which bar is longer or shorter in generic benchmark suites is old school. Why would I use software that's slower than an alternative piece of software? It makes zero sense.
If, for example, 7zip compresses my file faster than WinRar, why on God's green earth would I use WinRar, regardless of whether I'm using an Intel or AMD based system? What it comes down to is which combination of hardware and software gives me the best user experience, and which one gives me the best perf/$. These standardized benchmarketing practices need to evolve and give potential consumers a chance to make a truly informed decision about what's best for them. The way hardware is reviewed currently is the same old status quo that means nothing in the end.
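The perf/$ comparison described above is simple arithmetic. A minimal sketch with entirely made-up numbers (both the system names and the figures are hypothetical, chosen only to show the calculation):

```python
# Hypothetical inputs: seconds each system takes to finish the same job,
# and what each system costs. These numbers are invented for illustration.
systems = {
    "System A": {"seconds": 120.0, "price": 1000.0},  # faster but pricier
    "System B": {"seconds": 150.0, "price": 600.0},   # slower but cheaper
}

# Throughput (jobs per second) divided by price gives perf per dollar.
perf_per_dollar = {
    name: (1.0 / s["seconds"]) / s["price"]
    for name, s in systems.items()
}

best = max(perf_per_dollar, key=perf_per_dollar.get)
for name, value in perf_per_dollar.items():
    print(f"{name}: {value:.2e} jobs/s per dollar")
print(f"Best perf/$: {best}")
```

With these invented figures the slower-but-cheaper system wins on perf/$, which is the kind of conclusion a bar chart of raw speed alone never shows.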
Unless it's professional software, in which case the decision makers are a little better informed than to base their judgement on these kinds of tests.
About the only benchmarks that make any sense are gaming benchmarks, where there isn't alternative software that can completely skew results and conclusions.
So, since it's clear that software is just as important as hardware in evaluating performance, review sites need to change that status quo and focus a lot more on what's actually best for the consumer, IMO.