
Thread: AMD Radeon HD6950/6970(Cayman) Reviews

  1. #701
    Xtreme Member
    Join Date
    Dec 2005
    Posts
    104
    Quote Originally Posted by SKYMTL View Post
    I won't defend Lost Planet 2 since we all know its reception ON THE CONSOLES wasn't that great. Personally, I had a good time playing it on the PC but that's beside the point.

    Just because a game isn't well received doesn't mean that driver development should stop. Justifying poor performance by stating a game isn't popular is no way to go about looking at this industry.




    If I tested 3/4 of those games, our reviews would be absolutely pointless since not one of them really puts massive strain on the GPU. Many think Civ 5 does and I beg to differ.

    BC2 is used. As was SC2 but that was removed due to the CPU playing a massive role in the overall results.

    I have played Civ 5 religiously since it was released and I can tell you that the CPU is what bottlenecks performance. Granted, there are sites using it for benchmarking purposes that somehow get low framerates on even high-end GPUs. Since they flat out refuse to publish their in-game methodology we have no idea where those numbers come from and I sure as heck can't repeat their results even after months of playing.

    I don't choose games based upon popularity but rather on a combination of feature sets, benchmark run repeatability and overall GPU demands.





    Paying lip service to a press release shouldn't be taken as fact.

    interesting info ty

  2. #702
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    Quote Originally Posted by SKYMTL View Post
    I won't defend Lost Planet 2 since we all know its reception ON THE CONSOLES wasn't that great. Personally, I had a good time playing it on the PC but that's beside the point.

    Just because a game isn't well received doesn't mean that driver development should stop. Justifying poor performance by stating a game isn't popular is no way to go about looking at this industry.
    Right, personally I'm sure you did. Or not. The game is terrible and there is a 90% chance it was ported terribly with barely functioning KBM support.

    If I tested 3/4 of those games, our reviews would be absolutely pointless since not one of them really puts massive strain on the GPU. Many think Civ 5 does and I beg to differ.
    Lies. Please reference high IQ tests of those games that bring mid range GPUs to their knees. For some reason no one bothers to test high IQ med. resolution except for techpowerup. Techreport does high IQ high res and GTX 460s drop to 23 fps.

    BC2 is used. As was SC2 but that was removed due to the CPU playing a massive role in the overall results.

    I have played Civ 5 religiously since it was released and I can tell you that the CPU is what bottlenecks performance. Granted, there are sites using it for benchmarking purposes that somehow get low framerates on even high-end GPUs. Since they flat out refuse to publish their in-game methodology we have no idea where those numbers come from and I sure as heck can't repeat their results even after months of playing.

    I don't choose games based upon popularity but rather on a combination of feature sets, benchmark run repeatability and overall GPU demands.
    Then you aren't reaching out to the people who actually read your reviews. We want to know how cards will play in games that we play.

    Paying lip service to a press release shouldn't be taken as fact.
    Perhaps you should actually look at the testing they did, and not just the lip service.
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  3. #703
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by cegras View Post
    Lies. Please reference high IQ tests of those games that bring mid range GPUs to their knees. For some reason no one bothers to test high IQ med. resolution except for techpowerup. Techreport does high IQ high res and GTX 460s drop to 23 fps.
    23 FPS in what? Civ 5? Sure, because the CPU becomes the bottleneck. I have yet to encounter any repeatable late-game situation where the framerates dip to low digits.

    Or maybe that's because sites testing the game are using a completely unrealistic in-game benchmark that doesn't use AI (the one thing that tanks framerates when actually PLAYING the game).... Yes, it spits out frames but what's the use if it doesn't reflect real-world gameplay?

    We used Civ V and I am continuing to look for an acceptable situation where framerates can be conclusive but once again, I haven't been able to find one yet.


    Perhaps you should actually look at the testing they did, and not just the lip service.
    They did no testing other than discussing the points and counter points between NVIDIA's claims and AMD's. And yet they continue to use the benchmark in their reviews.
    Last edited by SKYMTL; 12-24-2010 at 07:36 AM.

  4. #704
    Xtreme Addict
    Join Date
    Oct 2007
    Location
    Chicago,Illinois
    Posts
    1,182
    For example, games like Lost Planet 2 or HAWX 2 could tell you that a GTX has a superb power/performance ratio compared to Radeons, which isn't true. This is what I call biased and unprofessional, since adding those 2 titles heavily swings the results in one direction (while 99% of games would tell you otherwise).

    Reviewers should avoid such situations IMHO, since it misleads normal users that don't have proper IT know-how.
    Hardware Canucks shows highest playable settings for all hardware.



  5. #705
    Xtreme Member
    Join Date
    Nov 2005
    Location
    Cape Town - South Africa
    Posts
    261
    SKYMTL, maybe it is best to let people believe what they want to. You can lead someone to water or food, but you cannot make them drink or eat. Thanks for the information and sharing your point of view.

  6. #706
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by ripken204 View Post
    I only look at benchmarks for the games that I have actually played or want to play
    Problem solved!

  7. #707


    Quote Originally Posted by trinibwoy View Post
    Problem solved!
    Problem would be solved if you could add the games (as in results) you're interested in to an overall chart from a review and see which card comes out on top.

    I wonder why no one has yet implemented such a feature...

  8. #708
    Xtreme Member
    Join Date
    Mar 2005
    Location
    TX, USA
    Posts
    308
    Quote Originally Posted by Shadov View Post
    Problem would be solved if you could add the games (as in results) you're interested in to an overall chart from a review and see which card comes out on top.

    I wonder why no one has yet implemented such a feature...
    I would love it if reviewers could implement dynamic results like that. You could filter out the games you don't play and see exactly how much an upgrade would benefit you. It would make it easier to choose between the two camps.
    2600k@4.9Ghz 1.42V | Asrock Ext 3 Gen 3 | Custom H20 | 16GB PC3 1600 | 7970@1375/1600

  9. #709
    Xtreme Addict
    Join Date
    Jun 2005
    Location
    Rochester, NY
    Posts
    2,276
    Quote Originally Posted by Elfear View Post
    I would love it if reviewers could implement dynamic results like that. You could filter out the games you don't play and see exactly how much an upgrade would benefit you. It would make it easier to choose between the two camps.
    I am very tempted to code that: simply have people select their CPU, GPU, what speeds they are running at, what games and settings they played at, FPS, and what driver they used,
    then be able to graph them in many different ways. Pretty simple.
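    Something along these lines would already cover the basic idea; a rough Python sketch, where the data layout and the FPS figures are purely invented placeholders, not real review data:

```python
# Rough sketch of a reader-filterable benchmark chart.
# The field names and FPS numbers below are invented placeholders, not review data.
from statistics import mean

results = [
    {"gpu": "HD 6970", "game": "BC2",   "res": "1920x1080", "driver": "10.12",  "fps": 74},
    {"gpu": "HD 6970", "game": "Civ V", "res": "1920x1080", "driver": "10.12",  "fps": 41},
    {"gpu": "GTX 570", "game": "BC2",   "res": "1920x1080", "driver": "263.09", "fps": 71},
    {"gpu": "GTX 570", "game": "Civ V", "res": "1920x1080", "driver": "263.09", "fps": 44},
]

def overall_chart(results, games, res="1920x1080"):
    """Average FPS per GPU over only the games the reader actually plays."""
    per_gpu = {}
    for r in results:
        if r["game"] in games and r["res"] == res:
            per_gpu.setdefault(r["gpu"], []).append(r["fps"])
    return {gpu: mean(fps_list) for gpu, fps_list in per_gpu.items()}

# The reader picks the games they care about; the "winner" can change with the selection.
print(overall_chart(results, games={"BC2", "Civ V"}))
print(overall_chart(results, games={"Civ V"}))
```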
    Quote Originally Posted by NKrader View Post
    just start taking pics of peoples kids the parents will come talk to you shortly. if you have a big creepy van it works faster

  10. #710
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,128
    Yeah, except that all those pre-run results in the database are terribly inaccurate unless you bother re-running old tests to replace the results that become invalid due to driver/software updates. Anyone claiming that's bull can try to explain how every single review happens to have different results even though the hardware and software are seemingly the same. Either the results are inaccurate, or someone is skewing them. The only proper way to have accurate results is to run the benchmarks with all the compared cards/CPUs for every review. Sure, it duplicates results, is tedious and everyone hates that; at least the results are accurate. In b4 SKYMTL comes over to write an essay about how "today it's hard because there's too much hardware to bench". Oh well, #care.
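    At the very least, such a database could store the driver version with every run and flag anything benched on an older driver as needing a re-run; a minimal sketch, with made-up driver strings, dates and numbers:

```python
# Minimal sketch: flag archived benchmark runs as stale once a newer driver ships.
# The driver version strings, dates and FPS values are invented for illustration.
from datetime import date

runs = [
    {"gpu": "GTX 460", "driver": "260.99", "tested": date(2010, 10, 25), "fps": 48},
    {"gpu": "HD 6870", "driver": "10.10",  "tested": date(2010, 10, 22), "fps": 51},
]

current_driver = {"GTX 460": "263.09", "HD 6870": "10.12"}

def flag_stale(runs, current_driver):
    """Mark any run not done on the current driver as needing a re-run."""
    for r in runs:
        r["stale"] = r["driver"] != current_driver.get(r["gpu"], r["driver"])
    return runs

for r in flag_stale(runs, current_driver):
    status = "re-run needed" if r["stale"] else "current"
    print(f"{r['gpu']}: {r['fps']} fps on {r['driver']} ({status})")
```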

  11. #711
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Quote Originally Posted by Calmatory View Post
    There is nothing positive about Nvidia except G80. Dare to disagree? They're hurting the consumer, yet some people have the guts to praise them. But hey, at least we've got TWIMTBP, PHYSX and CUDA!!

    Though, I'm somewhat interested in their future advancements in GPGPU; not from a graphics point of view at all. Gaming is for the weak after all. Once we get 100+ GP(GPU) cores which have an open (yeah right, it's Nvidia after all) ISA and all this at under 1 W power consumption, I'll get genuinely interested in Nvidia again. Until then I really see them more as a joke than anything.
    So it must be Nvidia's fault that transparency AA doesn't work in DX10/11 on AMD cards? Even in DX9 the Nvidia implementation is better. Of all of the IQ arguments between the two this is the only one that I have noticed and I'm surprised that it doesn't get mentioned more.

    To say that Nvidia cards have no advantages over AMD is just blind fanboyism.
    Last edited by BababooeyHTJ; 12-24-2010 at 11:43 AM.

  12. #712
    Xtreme Addict
    Join Date
    Oct 2007
    Location
    Chicago,Illinois
    Posts
    1,182
    @Calmatory

    Not really hard: just get a mobo with 6/7 PCIe slots; with 3 of those rigs that's 18/21 GPUs installed. Run 7/10 tests a day then swap GPUs, so 30 days of benching = 500+ GPU results.

    You can use 1 AMD and 1 Intel platform with the fastest CPU/mem combo to get 1000+ GPU results in 30 days.

    Update once a year.
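    Taking those figures at face value, the back-of-the-envelope math (a quick illustrative calculation, not an actual benching schedule) works out like this:

```python
# Back-of-the-envelope throughput, taking the post's figures at face value.
rigs_per_platform = 3        # three 6/7-slot boards per platform
tests_per_rig_per_day = 7    # low end of the "7/10 tests a day" claim
days = 30

runs_one_platform = rigs_per_platform * tests_per_rig_per_day * days
runs_two_platforms = 2 * runs_one_platform   # one AMD testbed plus one Intel testbed

print(runs_one_platform, runs_two_platforms)  # 630 and 1260 runs, i.e. "500+" and "1000+"
```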
    Last edited by Hell Hound; 12-24-2010 at 11:43 AM.



  13. #713
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    Quote Originally Posted by SKYMTL View Post
    23 FPS in what? Civ 5? Sure, because the CPU becomes the bottleneck. I have yet to encounter any repeatable late-game situation where the framerates dip to low digits.

    Or maybe that's because sites testing the game are using a completely unrealistic in-game benchmark that doesn't use AI (the one thing that tanks framerates when actually PLAYING the game).... Yes, it spits out frames but what's the use if it doesn't reflect real-world gameplay?

    We used Civ V and I am continuing to look for an acceptable situation where framerates can be conclusive but once again, I haven't been able to find one yet.
    Hang on there though. It's important to still grab FPS numbers from just panning around a crowded map. "Next Turn" is CPU dependent, but if you're just wandering around your civ during your turn and selecting production and whatnot, that should be GPU dependent, no?

    Also, I still think including a Source timedemo like TF2 in a busy match would be an excellent reference, although outdated.
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  14. #714
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Quote Originally Posted by cegras View Post
    No, we should be testing a mix of old and new popular games. Techreport has a fantastic testing suite, and it would be better if only they included a source benchmark as well.

    Actually, techreport proves that even SCII, Civ 5, and BF2: BC are enough to really stress mid-range GPUs at ultra-high IQ, and so you have an interesting testing ground at medium res, high IQ.

    Hawx 2 is fine for testing, but Lost Planet is a critically terrible game and not relevant at all to typical gaming performance. It's almost like benching a tech demo.
    I'm not pushing for it to be tested or not tested, I'm just explaining what the problem is with how HawX2 is coded.... that's all.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  15. #715
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by cegras View Post
    Hang on there though. It's important to still grab FPS numbers from just panning around a crowded map. "Next Turn" is CPU dependent, but if you're just wandering around your civ during your turn and selecting production and whatnot, should be GPU dependent, no?
    Panning still uses a massive amount of CPU cycles. Open up the Win 7 Performance Monitor and watch the usage jump like mad when panning and zooming on a late-game map.
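    If anyone wants to log it instead of eyeballing Performance Monitor, a few lines of Python with the third-party psutil package will do roughly the same thing (the 60-second window and 1-second sampling interval are arbitrary choices):

```python
# Rough sketch: sample total and per-core CPU usage while panning a late-game map.
# Needs the third-party psutil package; window length and interval are arbitrary.
import time
import psutil

print("elapsed  total%  per-core%")
start = time.time()
for _ in range(60):                                   # sample for about a minute
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    total = sum(per_core) / len(per_core)
    print(f"{time.time() - start:6.0f}s  {total:5.1f}  {per_core}")
```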

  16. #716


    Quote Originally Posted by BababooeyHTJ View Post
    So it must be Nvidia's fault that transparency AA doesn't work in DX10/11 on AMD cards? Even in DX9 the Nvidia implementation is better. Of all of the IQ arguments between the two this is the only one that I have noticed and I'm surprised that it doesn't get mentioned more.

    To say that Nvidia cards have no advantages over AMD is just blind fanboyism.
    Have you heard of AMD Radeon's Morphological AA? It works in every game that I have tried so far...

    Now you should go back to reading reviews!
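    The reason it can be forced in basically any title is that it's a pure post-process over the finished frame. Here is a toy Python/NumPy sketch of that general idea (luma edge detection plus blending across detected edges), which is nothing like the driver's actual MLAA implementation:

```python
# Toy illustration of why a post-process AA filter is game-agnostic:
# it only ever sees the finished frame, never the renderer that produced it.
# This is NOT the real MLAA algorithm, just the general shape of the idea.
import numpy as np

def post_process_aa(frame, threshold=0.1):
    """frame: float32 RGB array in [0, 1], shape (H, W, 3)."""
    luma = frame @ np.array([0.299, 0.587, 0.114])                   # per-pixel luminance
    edge_x = np.abs(np.diff(luma, axis=1, prepend=luma[:, :1])) > threshold
    edge_y = np.abs(np.diff(luma, axis=0, prepend=luma[:1, :])) > threshold
    edges = edge_x | edge_y                                          # jaggy candidates
    # Blend each pixel with its right/bottom neighbours to soften stair-stepping.
    blurred = frame.copy()
    blurred[:, :-1] = 0.5 * (frame[:, :-1] + frame[:, 1:])
    blurred[:-1, :] = 0.5 * (blurred[:-1, :] + blurred[1:, :])
    out = frame.copy()
    out[edges] = blurred[edges]                                      # only touch edge pixels
    return out

# Works on any frame grab, regardless of the API or engine that rendered it.
smoothed = post_process_aa(np.random.rand(720, 1280, 3).astype(np.float32))
```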

  17. #717
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Quote Originally Posted by Shadov View Post
    Have you heard of AMD Radeons Morphological AA, works in every game that I have tried so far...

    Now you should go back to reading reviews!
    Yep, I own a 6870. It works well and is great for certain games but doesn't look as good as 4xMSAA with MSTSAA, and it comes with a massive performance hit which sadly makes it unusable for GTA4 as well as some other games.

    There really is no excuse for the lack of TSAA support and I wouldn't call MLAA a replacement. It's a nice feature and all but I'm pretty sure that, much like PhysX, it won't factor into my next purchase.


    While we're on the topic of testing games, it would be great to see games like Darkplaces and Serious Sam HD, which are probably more popular than Lost Planet 2 and happen to stress my hardware.
    Last edited by BababooeyHTJ; 12-24-2010 at 12:33 PM.

  18. #718
    Xtreme Member
    Join Date
    Oct 2010
    Location
    192.168.1.1
    Posts
    221
    Quote Originally Posted by trinibwoy View Post
    Problem solved!
    Not totally, though. You buy graphics cards to play not only the current games, but also games that will come out in the future. The only way to estimate how a graphics card will do in the future is to look at current, more straining titles. It won't be the most accurate estimation, but that's the only thing you'd have.

    So, surely you should care more about games you'll actually play, but for future-proofing you should at least consider the performance in current, GPU-straining games whether you intend to play them or not.

  19. #719
    Registered User
    Join Date
    Apr 2006
    Posts
    64
    Quote Originally Posted by Calmatory View Post
    I loathe AMD just as much as Nvidia; with the exception that thus far they've been less restrictive and more open with their technologies, which means that everyone benefits. Oh, and their µArch has been superior to Nvidia's, kudos for that. Every single company is there to screw its customers, yet some people seem to think that "Nvidia/AMD/Intel are the GOOD guys". Wtf?
    Good post. I couldn't agree more.
    TANDY PC
    Intel 486 SX 25
    4 MB RAM
    Trident 512K SVGA
    120 MB Seagate HD
    14 Inch CTX CRT Monitor
    14.4K Modem (too slow for :banana::banana::banana::banana:)
    Radio Shack 2.0 Speakers (6V Battery Operated)
    OS: MS DOS 6.2
    Games: X-wing, Wing Commander, Veil of Darkness, Kings Quest, Zork

  20. #720
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
    Quote Originally Posted by omega1alpha View Post
    Good post. I couldn't agree more.
    I too can agree with this short statement.
    Core i7 920@ 4.66ghz(H2O)
    6gb OCZ platinum
    4870x2 + 4890 in Trifire
    2*640 WD Blacks
    750GB Seagate.

  21. #721
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by Calmatory View Post
    I loathe AMD just as much as Nvidia; with the exception that thus far they've been less restrictive and more open with their technologies, which means that everyone benefits. Oh, and their µArch has been superior to Nvidia's, kudos for that. Every single company is there to screw its customers, yet some people seem to think that "Nvidia/AMD/Intel are the GOOD guys". Wtf?
    I don't think any of them are the good guys. I completely ignore the politics when making my purchase, as whoever has the best hardware for the cash I feel like spending at that point in time gets my money. I've owned AMD/ATi hardware, and I've owned NVidia hardware. I've owned AMD processors, and I've owned Intel processors. I let my wallet do the talking for my end, and I let the product's performance do the talking for the companies; between the two they generally come to an agreement that suits my needs.

    At the same time, I also tend to prefer to make sure that information on this site is correct when possible. The most annoying thing you can have happen is to have lots of people give you bad information (I've had it happen myself: I got an ATi card back when they pretty much COULDN'T run OpenGL, not fun), which sways your decision in a way that it may not have gone had it not been for that bad bit of info. As staff on this site, it's partially my responsibility to correct that information so that others who read this site looking for said information can find what it is they're looking for. Make sense?

    Quote Originally Posted by Calmatory View Post
    There is nothing positive about Nvidia except G80. Dare to disagree? They're hurting the consumer, yet some people have the guts to praise them. But hey, at least we've got TWIMTBP, PHYSX and CUDA!!

    Though, I'm somewhat interested in their future advancements in GPGPU; not from a graphics point of view at all. Gaming is for the weak after all. Once we get 100+ GP(GPU) cores which have an open (yeah right, it's Nvidia after all) ISA and all this at under 1 W power consumption, I'll get genuinely interested in Nvidia again. Until then I really see them more as a joke than anything.
    Nothing positive, eh? I guess you thought the GTX 460 wasn't a GREAT card for the money? I guess you think the GTX 570 is over-priced for its performance level?

    I mean, I guess you might think they're hurting customers by having closed standards, but NVidia did pay the costs to create/obtain those things. Ask developers what they think about TWIMTBP, which helps bring the cost of developing a title down and seriously helps out with the burden of testing a title (try buying all those configurations of systems and see how much just that part costs you). You may see it as them "buying out" games, but the fact is they do assist the developers in doing what WE want them to do, and that is to continue to make games for the PC. Considering it's a much higher development cost to do that, it's likely we'd see fewer titles than we already do without programs like this in place.

    PhysX, well, I still don't know how I feel about that one. At the same time, there's NOTHING stopping AMD from attempting to make their own standard. We could then see if all their "we don't believe in closed standards" hype is true or not.

    As for the final part... So you're pretty much saying ONLY when a 100% impossible non-realistic goal is met that you'll be interested in NVidia's products; yet you have the audacity to call anyone else biased in any way, shape, or form? That pretty much sums up your intentions in a nut-shell, you'll ONLY like a brand if they can do something that is not physically possible, yet you'll buy the other brand regardless... That's an open and shut case right there.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  22. #722
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,128
    Quote Originally Posted by DilTech View Post
    As for the final part... So you're pretty much saying ONLY when a 100% impossible non-realistic goal is met that you'll be interested in NVidia's products; yet you have the audacity to call anyone else biased in any way, shape, or form? That pretty much sums up your intentions in a nut-shell, you'll ONLY like a brand if they can do something that is not physically possible, yet you'll buy the other brand regardless... That's an open and shut case right there.
    You, sir, have no clue what you're talking about; I haven't bought a single discrete GPU since late 2008, and I don't realistically see myself buying any ever again. Ever. I couldn't care less how AMD or Nvidia twist their products and marketing; it doesn't touch me in any way. However, I'm curious about the show. It is just entertainment for me to follow this circus. That said, I'm not buying AMD either.

    Oh, and it WILL be possible for Nvidia to build multi-dozen general purpose processing cores which will consume less than 1 W. Not yet. Possibly not with silicon. What I was after was that I'll be genuinely interested in their products once they can really produce something I care about: general purpose (as in, I can compile the Linux kernel on them), low power, hugely parallel processors. Current GPUs are ALMOST there. They only need to make the ISA open, improve the general purpose logic and features, improve the memory subsystem and then shrink it down. I'm not sure if Nvidia will be around by the time all this becomes reality; maybe not. But as of right now, it seems that Nvidia will be the company to produce fun stuff like this; AMD and Intel push performance, ARM doesn't produce parallel processors and VIA can't deliver. Only Nvidia can do this. Maybe their products could be based on the ARM ISA, I don't know. But right now GPUs are more or less worthless for me; I don't need graphics processing power. Heck, Intel's GMA 950 is fast enough for my needs and it even supports OpenGL 1.3; I can plot tons of texture mapped polygons at high framerates. That's enough for now.

    No doubt Nvidia has done some good product launches with Fermi, e.g. the GTX 460. However, Fermi as a microarchitecture... it sucks. It tried to be both GPGPU and gaming focused, and it didn't do well at either of them (compared to the potential if it had focused on only one at a time). Tons of transistors and heat for the performance provided. Not efficient. And given that efficiency is the #1 thing to focus on these days (thin clients, battery powered portable devices, low costs, being "eco-friendly"), it just makes it worse. I have no problem having an Nvidia GPU in my future notebook; if it can provide good performance for low power consumption then I'm all for it. But really, Fermi can't quite deliver that. And even if it could I wouldn't care, as I have no plans to get a new one for a while. Maybe when we're at 28 nm, or even 22 nm.

    Oh well, there's ONE thing good about Nvidia; their developer support. They're actively supporting and encouraging developers with their Developer Zone. Heck, they even support demoscene with nVision. It's damn valuable for us devs who are interested to work with the hardware. Nvidia just has no interest in pushing open standards which would push the technology forward. Kind of like Microsoft, "it's just business".

    In the end, no matter if it is Nvidia, AMD, Intel, VIA or ARM having the crown, consumers will get screwed. Just look at Apple or Microsoft. Are they pushing the technology or making money?

  23. #723
    Registered User
    Join Date
    Jun 2010
    Location
    Denmark
    Posts
    90
    I play at the resolution of 2048*1152.

    Is that going to make a noticeable FPS drop compared to 1920*1080?

    On a 6970, btw.

  24. #724
    Xtreme Enthusiast
    Join Date
    Dec 2009
    Location
    Burbank, CA
    Posts
    563
    OK, so I built a computer for a customer this weekend with an XFX 6950. The corner of the power connector was filed down, then I booted into Windows, installed the newest drivers from AMD's website and the drivers wouldn't recognize the card, so I researched online, found the hotfix and finally got it to work. AMD's drivers are still lacking, and the corner being filed down just did it for me. Absolutely no reason to pick up the 6900 over the GTX 570/580. I'm not a fanboy, but I do prefer Nvidia from my 15 years in the computer business.
    Intel Core i7 4770K @ 4.6ghz| Corsair H100i | MSI Z87-GD65 Gaming Edition | 16GB Corsair Vengeance @ 2.4GHZ | eVGA GTX 780 | Creative Sound Blaster ZxR | Samsung 500GB SSD/4TB Storage | LG 14X BLu-Ray Burner | Corsair HX1050 | Corsair Air 540 | Asus VG248QE 144HZ Monitor

  25. #725
    Xtreme Guru
    Join Date
    Jun 2010
    Location
    In the Land down -under-
    Posts
    4,452
    Quote Originally Posted by HelixPC View Post
    OK, so I built a computer for a customer this weekend with an XFX 6950. The corner of the power connector was filed down, then I booted into Windows, installed the newest drivers from AMD's website and the drivers wouldn't recognize the card, so I researched online, found the hotfix and finally got it to work. AMD's drivers are still lacking, and the corner being filed down just did it for me. Absolutely no reason to pick up the 6900 over the GTX 570/580. I'm not a fanboy, but I do prefer Nvidia from my 15 years in the computer business.
    That being said, Nvidia drivers are crap also... it seems like their drivers are going from good to worse too..

    Another thing I find funny is AMD/Intel would snipe any of our Moms on a grocery run if it meant good quarterly results, and you are forever whining about what feser did?

