Awww. DilTech <3 Nvidia. How cute.
I haven't seen a single post by you which talked about GPUs and didn't either praise AMD or bury Nvidia to some degree.
Unless you really think there is nothing positive Nvidia has ever done, or there is nothing negative AMD has ever done (you don't really seem to be retarded, so I assume you don't think that way), you shouldn't be calling people "fans" of a company.
How about you reach out to AMD and ask about the performance difference instead of assuming that "they" don't care or "they" are too slow?
Or maybe it's just you defending an unplayed, badly written title? Try to find out for all of us.
Both cards are targeted at the DX11 market and, if I may remind you, both of them run the majority of DX11 games well. Of course, if you go overboard with tessellation levels then Fermi pulls ahead, but it also suffers a huge performance loss. As a reviewer, could you answer this question: was tessellation meant to add complexity without any visible IQ improvement? I guess not, and there goes your overhyped NV advantage.

Fermi was always targeted towards DX11 titles and its performance shows that. Cayman and Barts are a hodge-podge of two generations of products: an older one and a newer one. You don't seem to understand this one fact for some reason.
Again, don't assume; ask why a GTX 470 is on par with the HD 5970 and HD 6970 in the game you carefully selected for a review. As I mentioned, if you think AMD couldn't have optimized their drivers for an older game like Lost Planet 2, then you either think too highly of NVIDIA's driver team or choose to ignore that something is wrong...

Plus, as I said: AMD could very well have comparable performance if they bothered to optimize their drivers for it. I for one won't bury my head in the sand just because someone's driver development is spinning its wheels.
I agree that open standards should be rewarded, though the proprietary PhysX approach should be dropped, since it doesn't add anything new beyond a vendor lock on standard features. That's of course my opinion; you, as a reviewer with free access to all cards, may see it differently.

That's actually a great idea. I should stop paying lip service to all the trolls who refuse to realize that both NVIDIA's AND AMD's developer relations programs should be rewarded. If anyone wants to sponsor a title, I am all for that, since it pushes technologies that would otherwise not be seen in this overly consolized PC gaming market. If you aren't all for that and don't agree that it should be rewarded, then shame on you.
I appreciate all the work the reviewers do... however, could we start a poll or something on which games should be reviewed, with a list of cards? Anyone interested? SkyMTL?
And maybe some specific scenes? For example, I hear that in Crysis, during the final boss fight, there's a lot of on-screen action that causes jerky graphics.
So you think AMD aren't aware that their cards/drivers are not optimised for the mentioned titles? They even tell reviewers not to use those particular titles when reviewing their hardware. So why not have a look into it? It's like telling a kid not to smoke cigarettes; he might try it after all...
If SkyMTL didn't use those titles, no one would even know what's going on. Imagine, even if the game is total crap to you, that someone buys a brand new card for it and gets the same performance as a lower-specced card or a card he owned before.
And even if it's a badly written title, there are loads of games that could use better multi-core CPU support or even multi-GPU support...
Really, man, if there is one website I could vouch for as not being biased or putting out half-done reviews, it has to be Hardware Canucks...
I think you really have it by the wrong end... you indirectly imply he's green-team biased, while it clearly shows you are 100% red team... stop trolling. If you don't like it, too bad... then don't read it... and live happily in your own CAREFULLY SELECTED world...
Maybe you can get together with Calmatory, share a flat and discuss the rest of the biased world...
Merry Xmas, all...
Last edited by Leeghoofd; 12-24-2010 at 12:45 AM.
Question : Why do some overclockers switch into d*ckmode when money is involved
Remark : They call me Pro AsusSaaya yupp, I agree
Of course your opinion is 100% valid and legit, and everyone else is trolling? Great, I'm glad you like the review, since somehow your personal approach to others and your standards seem to fit the overall picture being discussed here.
Anyway, once again: if someone tries to draw conclusions on e.g. power/performance ratios in a review based on biased titles, then I still think it's not the best approach.
Merry Xmas to you too, though I will skip the troll part.
Indeed, but it didn't use to be as bad as it is now.
I have been totally disappointed with what's been done with DX after DX9, and just having the new tech implemented does not make a good-looking game, as that awful AMD tech demo proves. DX is just a tool, not a magic everything-will-look-pretty paintbrush.
Just like PhysX gets tacked onto some games, too many DX11 features are tacked on for the hell of it too; you can't make up for bad art direction by throwing lots of DX11 tech at it.
Too much effort is put into making games technically better on paper, but in a practical sense they are not, because the extra tech is used in a very ineffective way.
I think the consoles are a lot to blame: aside from a hi-res texture pack, more geometry and effects, the art direction is made with them in mind, and throwing DX11 features at such a game fails because it was not designed with DX11 in mind.
Many times you see people commenting about how they can't see the difference with some DX11 features turned on and off.
Tessellation is one that pops up a lot, with many saying they can't see the difference because of where it's been used. In the case of LP2 they exaggerated it to say "hey, look", but that is no better than the over-exaggerated use of bloom, DOF, yellow colour filters and motion blur, all of which I turn off if the option is there, because even at a lower setting it's still over the top for my tastes.
Yes, we will keep buying cards anyway, but we are not enough.
So only optimised titles (be it software- or hardware-wise) should be included then? Isn't that exactly what bias is about?
My opinion: include as many titles as possible that provide the reviewer with a good, consistent output. The latter is what is essential for a reviewer. FYI, I can bench F1 2010 ten times and get ten different outcomes, sometimes miles apart... which makes it impossible for me to make a good judgement.
If I bench Lost Planet 2 (and I have) I get a pretty consistent output, which is what I need to base conclusions upon... I don't care whether it makes a card shine or not. If a card sucks monkey balls in a certain game, why not make it public? Be it Lost Planet 2 or HAWX 2...
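That repeatability criterion can be made concrete. Below is a minimal sketch (the FPS numbers are invented for illustration, not taken from any review) using the coefficient of variation, i.e. the standard deviation as a fraction of the mean, to compare a noisy benchmark with a consistent one:

```python
from statistics import mean, stdev

# Hypothetical FPS results from ten repeated runs of two benchmarks.
f1_2010_runs = [48, 61, 44, 57, 52, 66, 41, 59, 50, 63]       # wide spread
lost_planet2_runs = [55, 56, 54, 55, 57, 55, 56, 54, 55, 56]  # tight spread

def cv(runs):
    """Coefficient of variation: run-to-run spread relative to the mean."""
    return stdev(runs) / mean(runs)

for name, runs in [("F1 2010", f1_2010_runs),
                   ("Lost Planet 2", lost_planet2_runs)]:
    print(f"{name}: mean {mean(runs):.1f} FPS, CV {cv(runs):.1%}")
```

A reviewer could then set a cutoff, say rejecting any title whose CV exceeds a few percent, so conclusions rest only on runs that actually reproduce.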
The 69xx series are a nice evolution, but nothing revolutionary. Not good enough to take the performance crown back... Admittedly I was a bit disappointed, as the PR slides were telling me otherwise...
But it's a good step in the right direction. ATI will be back !!
Haven't you noticed that the HD 5970 still holds the "performance crown", and that ATI is no more?
Just pointing out some facts...
When you try to generate statistics and draw conclusions, you shouldn't base them on results that are an exception for one of the vendors. I hope that someone with English as their mother tongue can explain it better to you.
For example, games like Lost Planet 2 or HAWX 2 could tell you that the GTX has a superb power/performance ratio compared to the Radeons, which isn't true. This is what I call biased and unprofessional, since adding those two titles heavily swings the results in one direction (while 99% of games would tell you otherwise).
Reviewers should avoid such situations, IMHO, since it misleads normal users who don't have proper IT know-how.
Last edited by Shadov; 12-24-2010 at 03:03 AM. Reason: Typos
If these were the only two titles tested you would be 100% right, but there were loads of other titles tested, no? Plus, if you look at e.g. the LP2 results, the 69xx series perform better than the 58xx and 68xx series... so good progress there...
If you add the remark that AMD didn't want these particular titles used for performance testing, it should be clear enough that their cards don't shine in them (yet)... it has always been the case that some games run better on one brand while others run ultra smooth on your hardware...
I really don't see your problem...
Oh guys, stop attacking Sky. He's a reviewer, so don't make him hate AMD, especially when I think he's objective about both brands, and that's really rare to find these days.
As for LP2 and HAWX 2, there's no chance AMD cards will be able to compete in these games; they bring out 200% of the performance of CUDA-based cards, as they make heavy use of the geometry shaders of GF100 (SV position: 4 floats for 4 shaders).
Here the parallel performance of GF100 is fully in use... impossible for AMD to fight there; their cards are not made to do this (but to handle normal rendering operations).
Both games have reduced quality outside the geometry (AI planes look like they're rendered in a DX8 game; explosions and effects are from another age, looking like bitmaps from the eighties; textures are really poor, even a 16-bit console could render them). But the environment geometry is pushed to the limit with tessellation, and to make sure other rendering won't interfere with performance, they have simply cut back all the other features.
In reality, HAWX 2 is more suited to being rendered by a Quadro FX than a standard GPU, pushing parallel operations to the maximum; but for this they needed to lighten the work of the normal renderer and so cut back all other resources.
That's why a GT 430 is as fast as an HD 5870, or a GTS 450 completely kills an HD 5870 2GB in this game. Look at the FPS in DX11; if I were MS I would push this game to show how fast a DX11 game can be... 150 FPS average with a single card (max everything out and just remove the AA and tessellation, and any card will be in the 300 FPS range... for a game from 2010... lol). We are far from Crysis or Metro. I get 161 FPS average in this game with HD 5870 CFX, with scaling near 105% versus a single card (2x)... which is what a single-GPU GTX 480 scores... Even Unigine with tessellation set to 2.0 (the max) doesn't show this kind of difference between Cypress and GF100.
We are not talking about a normal sponsored title...
Read this, find the sweet spot where Fermi pulls ahead of Cypress, and you will know what HAWX 2 is coded for.
http://beyond3d.com/content/reviews/55/10
Last edited by Lanek; 12-24-2010 at 03:53 AM.
CPU: - I7 4930K (EK Supremacy )
GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
Motherboard: Asus x79 Deluxe
RAM: G-skill Ares C9 2133mhz 16GB
Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0
children on the Internet = Fail
double post
Last edited by seanx; 12-24-2010 at 04:14 AM.
I'm going to generalise and say some people (including me) get annoyed when reviewers benchmark titles that most hardware fans (like on this site) don't play, e.g. HAWX, etc.
However, I'm guessing they (the reviewers) like to have standard tests to go by, and that's why they do this. Another reason is the use of special features, e.g. tessellation, DX11 or whatever else,
so the newer cards get benched on the newer games, and so goes the tech circle of life...
The bad part is that I, for one, haven't got to try HAWX or Batman or BattleForge or any of these (to me) obscure titles... and TBH I open up a few review sites at a time and check which games
they're reviewing before I even read the review... if there are "obscure" games I just close the page instantly... I prefer "mainstream" games (or in this case games I would play) benchmarked,
as mentioned above, i.e. Crysis 1/Warhead, UT3, DiRT 1/2, GRID, Civ 5, LP2, STALKER CS/CoP, F1 2010, Need for Speed, etc.,
and then I will read the whole review...
Another consideration for me is the use of older cards in the review to make comparisons against... so all in all I like sites like
Guru3D, Hardware Canucks, TechPowerUp, Tom's and AnandTech, just to name a few.
One more thing: how come I don't see game/card round-ups anymore? Do these still exist? You know, one game vs 15 or so cards? Help?
OK, let's do it this way then, just as an example to show the problem:
Lost Planet 2 adds ~6% to the overall performance score of the GTX cards and HAWX 2 adds another ~6%, so 12% in total. Not much, someone uninformed might say, but it throws the whole product positioning and pricing out the window.
Now the author of the review summarizes the scores from "various games" (mind the word various), and as a result the GTX 570 is on par with the HD 6970. Yet if HAWX 2 and LP2 weren't included, the card would be 12% slower. Now don't tell me 12% doesn't matter to enthusiasts who buy more expensive cards.
Furthermore, less experienced users could walk away with the impression that the GTX 570 is on par with the HD 6970, but nowhere is it mentioned that without those two seldom-played games the HD 6970 would be ahead and the better purchase for 99% of the games out there.
The fact that those games were included in the charts might also lead to other anomalies, like the GTX coming out artificially ahead in power/performance or price/performance. Of course, these are just examples and I'm not implying that one card or the other is better. I just think that reviewers should be more cautious with their game choices so as not to accidentally misinform others.
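The effect Shadov describes is easy to demonstrate with a toy calculation. The relative scores below are invented for illustration (card B normalised to 100), not taken from the review; the point is only that two outlier titles can flip which card looks faster on average:

```python
from statistics import mean

# Hypothetical per-game scores for card A relative to card B (B = 100).
scores = {
    "Game 1": 95, "Game 2": 97, "Game 3": 94,
    "Game 4": 96, "Game 5": 93, "Game 6": 98,
    "Lost Planet 2": 135,  # outlier favouring card A
    "HAWX 2": 140,         # outlier favouring card A
}

outliers = {"Lost Planet 2", "HAWX 2"}

all_games = mean(scores.values())
typical = mean(v for k, v in scores.items() if k not in outliers)

print(f"All eight games:  card A at {all_games:.1f}% of card B")  # 106.0
print(f"Outliers removed: card A at {typical:.1f}% of card B")    # 95.5
```

With the outliers in, card A appears 6% ahead overall; without them it trails by 4.5% in the remaining titles, which is exactly the kind of swing that can reorder a power/performance or price/performance chart.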
Last edited by Shadov; 12-24-2010 at 05:19 AM.
Since you're talking about statistics, I'm going to assume you've heard of selection bias. That's exactly what you're proposing, which is of course ridiculous. Let's just eliminate any data points from the analysis that don't agree with our predetermined, baseless hypothesis that the cards should perform closely in ALL games.
Well, with your reasoning we can also remove AvP then, as it seriously favours ATI cards, GRID 2 for NVIDIA ones, etc... This moot discussion is endless... as mentioned before, some games run better on a particular brand...
The ATI 6970 will have stiff competition from the GTX 570 at the lower resolutions, like Sky mentioned; once you up the resolution and such, the extra RAM on the card and the improved design will allow it to pull away... ATI still has the advantage for multi-monitor support (the RAM helps)... Multi-GPU (CrossFire and SLI, being too heavily dependent on game/driver support) is as always hit and miss...
I've read up on five reviews now and all conclude about the same as the HC review... so they must all be biased... Both companies did a nice overhaul to reduce power consumption and heat... that's all there is to it with the current generation, no spectacular improvements (as promised by the PR folks).
Maybe you can give it a try yourself and give us a fully unbiased approach to reviewing GPUs... I think you are in for a surprise, as it can't be done to suit everyone... One will nag about the drivers, another wants only resolutions used on 30-inch monitors. Some want maxed-out details, others complain about the test platform... the list is endless... a reviewer can't win, never ever...
I always compare several sites' results to get a general idea... and in my 20 years of scanning hardware sites it's been the same with each generation... fanboys and co hop in and start discussions that lead nowhere... I just buy what suits me best. I replaced my GTX 260 with a 6850; a nice upgrade, and perfectly suitable for my 24" monitor...
Last edited by Leeghoofd; 12-24-2010 at 06:12 AM.
Actually, I'd say the 6970 really does represent a spectacular improvement in that we're now finally seeing 2GB of VRAM on cards as standard, resulting in much better performance at very high and multi-monitor resolutions and improved minimum FPS. This is a pretty big thing.
Rig specs
CPU: i7 5960X Mobo: Asus X99 Deluxe RAM: 4x4GB G.Skill DDR4-2400 CAS-15 VGA: 2x eVGA GTX680 Superclock PSU: Corsair AX1200
Foundational Falsehoods of Creationism
I won't defend Lost Planet 2, since we all know its reception ON THE CONSOLES wasn't that great. Personally, I had a good time playing it on the PC, but that's beside the point.
Just because a game isn't well received doesn't mean that driver development should stop. Justifying poor performance by stating a game isn't popular is no way to go about looking at this industry.
If I tested 3/4 of those games, our reviews would be absolutely pointless, since not one of them really puts massive strain on the GPU. Many think Civ 5 does, and I beg to differ.

I'm surprised you never tested COD: MW2 (although it's not demanding, it's certainly a relevant game, perhaps at high IQ settings), Civ 5 (late-game situation? It's the first game to actually lag my computer, and it's not even an FPS; it has DX11 settings with tessellation), Bad Company 2, even SC2, or COD: Black Ops? With your much-advertised timedemo / real-world walkthrough testing, this would yield the most valuable GPU data on the internet.
BC2 is used. As was SC2, but that was removed due to the CPU playing a massive role in the overall results.
I have played Civ 5 religiously since it was released and I can tell you that the CPU is what bottlenecks performance. Granted, there are sites using it for benchmarking purposes that somehow get low framerates on even high-end GPUs. Since they flat out refuse to publish their in-game methodology we have no idea where those numbers come from and I sure as heck can't repeat their results even after months of playing.
I don't choose games based upon popularity but rather on a combination of feature sets, benchmark run repeatability and overall GPU demands.
Paying lip service to a press release shouldn't be taken as fact.
No, we should be testing a mix of old and new popular games. TechReport has a fantastic testing suite; it would be better still if they also included a Source engine benchmark.
Actually, TechReport proves that even SC2, Civ 5, and BF: BC2 are enough to really stress mid-range GPUs at ultra-high IQ, so you have an interesting testing ground at medium resolution, high IQ.
HAWX 2 is fine for testing, but Lost Planet 2 is a critically panned game and not relevant at all to typical gaming performance. It's almost like benching a tech demo.
E7200 @ 3.4 ; 7870 GHz 2 GB
Intel's atom is a terrible chip.
I loathe AMD just as much as Nvidia, with the exception that thus far they've been less restrictive and more open with their technologies, which means that everyone benefits. Oh, and their µArch has been superior to Nvidia's; kudos for that. Every single company is there to screw its customers, yet some people seem to think that "Nvidia/AMD/Intel are the GOOD guys". Wtf?
There is nothing positive about Nvidia except the G80. Dare to disagree? They're hurting the consumer, yet some people have the guts to praise them. But hey, at least we've got TWIMTBP, PhysX and CUDA!!

Unless you really think there is nothing positive Nvidia has ever done, or there is nothing negative AMD has ever done (you don't really seem to be retarded, so I assume you don't think that way), you shouldn't be calling people "fans" of a company.

Though I'm somewhat interested in their future advancements in GPGPU, not from a graphics point of view at all; gaming is for the weak, after all. Once we get 100+ GPGPU cores with an open ISA (yeah right, it's Nvidia after all), and all this under 1 W of power consumption, I'll get genuinely interested in Nvidia again. Until then I really see them more as a joke than anything.