Page 87 of 109
Results 2,151 to 2,175 of 2,723

Thread: The GT300/Fermi Thread - Part 2!

  1. #2151
    Xtreme Member
    Join Date
    Sep 2009
    Location
    Czech Republic, 50°4'52.22"N, 14°23'30.45"E
    Posts
    474
    Quote Originally Posted by kemo View Post
    Saturation point is when you switch from a 5870 to a 5970 in a game supporting crossfire and gets the same fps , then overclock your CPU and keep getting the same fps
    That only means you actually do stuff according to the specs provided by some vendor (don't remember the name) for the GTX 470 driver: running Crysis under Vista on a Pentium IV 2 GHz with 1 GB RAM
    Quote Originally Posted by zalbard View Post
    I think we should start a new "Fermi part <InsertNumberHere>" thread each time it's delayed in this fashion!
    Quote Originally Posted by Movieman View Post
    Heck, I think we should start a whole new forum dedicated to hardware delays.

  2. #2152
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,356
    So a mere 7 to 9 month wait gets you an extra 5% performance. With probably an additional 15% power consumption.

    Impressive Nvidia, impressive.

  3. #2153
    Xtreme Member
    Join Date
    Sep 2006
    Posts
    171
    Quote Originally Posted by Sly Fox View Post
    So a mere 7 to 9 month wait gets you an extra 5% performance. With probably an additional 15% power consumption.

    Impressive Nvidia, impressive.
    don't forget the 25% extra price

  4. #2154
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    wow, you guys, like, determined everything. performance, price, tdp. who needs to actually see reviews and pricing?
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  5. #2155
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,356
    Quote Originally Posted by annihilat0r View Post
    wow, you guys, like, determined everything. performance, price, tdp. who needs to actually see reviews and pricing?
    Oh lighten up man.

    I'll be more than pleasantly surprised if I'm wrong. But I've seen nothing solid to the contrary thus far.

    And if Nvidia proves me wrong, I'll gladly eat my shorts. And post pics.
    Last edited by Sly Fox; 03-07-2010 at 03:33 PM.

  6. #2156
    Xtreme Enthusiast
    Join Date
    Jul 2008
    Location
    Portugal
    Posts
    811
    Quote Originally Posted by Sly Fox View Post
    And if Nvidia proves me wrong, I'll gladly eat my shorts. And post pics.
    Same here, for a different reason
    ASUS Sabertooth P67B3· nVidia GTX580 1536MB PhysX · Intel Core i7 2600K 4.5GHz · Corsair TX850W · Creative X-Fi Titanium Fatal1ty
    8GB GSKill Sniper PC3-16000 7-8-7 · OCZ Agility3 SSD 240GB + Intel 320 SSD 160GB + Samsung F3 2TB + WD 640AAKS 640GB · Corsair 650D · DELL U2711 27"

  7. #2157
    Xtreme Member
    Join Date
    Sep 2008
    Posts
    449
    Quote Originally Posted by zerazax View Post
    Oops, you're right, I screwed up my math

    Uh, and how do you guys know that games aren't reaching the limits of shading/texture power? Neither side might have "screwed up", it might be that hardware is reaching a saturation point
    http://www.anandtech.com/video/showdoc.aspx?i=3740

    This.

    At first I was like, WTH, the 5870 is slower than the 4870X2

    Then I read that ATI totally redesigned the chip midway into the cycle so that made sense.
    --lapped Q9650 #L828A446 @ 4.608, 1.45V bios, 1.425V load.
    -- NH-D14 2x Delta AFB1212SHE push/pull and 110 cfm fan -- Coollaboratory Liquid PRO
    -- Gigabyte EP45-UD3P ( F10 ) - G.Skill 4x2Gb 9600 PI @ 1221 5-5-5-15, PL8, 2.1V
    - GTX 480 ( 875/1750/928)
    - HAF 932 - Antec TPQ 1200 -- Crucial C300 128Gb boot --
    Primary Monitor - Samsung T260

  8. #2158
    Xtreme Addict
    Join Date
    Jan 2004
    Posts
    1,313
    Quote Originally Posted by ajaidev View Post
    I got my old baseball bat out of the closet to get a good whack at the dead horse.
    IMHO I know drivers.

    1. What are drivers? There are many components to control hardware and display output.. and then there is some sort of GUI. Bad GUI = bad driver?
    Is it more important that brightness control over HDMI doesn't work, or that a game crashes on startup? What if the driver install fails, but only in some very rare situation, like triple-SLI/CF, that 99.99% of users will never see?

    2. GUI accessibility. Before CCC (tree on the left side, preview on the right) ATI used a "Control Panel" with tabs.. amateurish at best. CCC, built on .NET, had HORRENDOUS startup times (~40 sec), and even button-click lag. So nVidia wins by default, right? Well, I don't care much for huge mickey-mouse icons, or the Vista-like "what would you like to do" left column. I like the current nVidia Control Panel and its left "topic" tree, because it emulates CCC.

    3. Bad game developer, sit. Let's face it, the industry is young. And not all of the hundreds of games made every year are properly checked out. So if a dev expects code to return 15, but it's coded to return 14.7, causing the game to crash, whose fault is that? Well, if nVidia gets a beta before release, notices the issue and adds an exe-detection workaround, every ATI card owner automatically assumes it's their ATI driver's fault, since it works on nVidia.

    4. BSOD. There are many charts. Many statistics. And until the recent ATI GSOD, nVidia was the only one with threads HUNDREDS of pages long on debilitating issues and BSODs unresolved for GF6, GF7, GF8, GF9... well, pretty much all generations. Check for yourself - try to install an nVidia driver on XP64 SP2 - BSOD before getting to the desktop... talk about a "great user experience". nVidia blames MS. MS blames nVidia. No issues with ATI.

    5. nVidia and MS sitting in a tree. Ever since the fall from grace, when ATI was first with DX9 and won the contract for the XBOX360, nVidia has been far behind the OS curve. After a lot of kicking and screaming about Vista's new driver model and what it required, nVidia caved in. If you were fortunate enough to be a beta tester for Vista (2006), you're probably trying to forget how clicking GUI buttons can cause crashes and BSOD. 3D, multi-display, rotation... either not supported or "experimental".

    6. Finally, a shout-out to Intel. A million monkeys may not be able to recreate Shakespeare, but Intel has somehow managed to make a "3D" driver. Just 2-3 years ago, your stupidity in trying to play a game on an IGP would be rewarded with startup crashes. Due to lack of vertex shaders. Due to memory allocation errors. Due to the wrong moon phase. Of course there was the long "exclusion" list - games it was hopeless to try running. 9% of all Vista BSODs, caused by the Windows GUI and a little simple low-setting 3D graphics. Fortunately, the Intel driver has greatly improved, just in time for actually working (!!) DX10, HDMI and Win7.

    PS: Anybody here remember VIA GART drivers (back when it was on chipset instead of GPU), or special game specific drivers (ie Tomb Raider, UT, S3 Metal)?
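
    The kind of bug described in the "bad game developer" point (a game expecting a driver call to return exactly 15 while it actually returns 14.7) can be sketched in a few lines. This is a minimal, hypothetical Python illustration — the function names and values are invented, not any real driver API:

```python
import math

# Hypothetical game-side check of a value returned by a driver call.
# A fragile exact comparison crashes when the driver returns 14.7 instead of 15.
def fragile_check(value: float) -> str:
    if value == 15:
        return "ok"
    raise RuntimeError("unexpected driver value -> crash on startup")

# A tolerant comparison survives the same slightly-off return value.
def tolerant_check(value: float) -> str:
    if math.isclose(value, 15, abs_tol=0.5):
        return "ok"
    return "fallback path"

print(tolerant_check(14.7))  # ok
```

    An exe-detection workaround in the driver amounts to the vendor patching around `fragile_check` on the game's behalf, which is why the same title can "work on nVidia" and crash on ATI.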

    24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
    1 GB OCZ Gold (='.'=) 240 2-2-2-5
    Giga-byte NF3 (")_(") K8NSC-939
    XFX 6800 16/6 NV5 @420/936, 1.33V

  9. #2159
    Xtreme Addict
    Join Date
    Jul 2007
    Location
    Alberta, Canada
    Posts
    1,264
    Well said sir. I can't say I disagree with any of the points you've listed.

    CCC has improved vastly performance-wise (I remember when it had 1min+ startup times...) but in the past, yes, it was beyond unacceptable.

    I cringed when you mentioned the Vista beta... I had used it as far back as the alpha and, my god... epic fail. It took a good year after it formally released before I'd call it remotely stable (and another year to regain my sanity).

    PS: I remember the GART drivers. Facepalm to the max.
    Feedanator 7.0
    CASE:R5|PSU:850G2|CPU:i7 6850K|MB:x99 Ultra|RAM:8x4 2666|GPU:980TI|SSD:BPX256/Evo500|SOUND:2i4/HS8
    LCD:XB271HU|OS:Win10|INPUT:G900/K70 |HS/F:H115i

  10. #2160
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Quote Originally Posted by ***Deimos*** View Post
    If you were unfortunate enough to be a beta tester for Vista (2006), you're probably trying to forget how clicking GUI buttons can cause crashes and BSOD. 3D, multi-display, rotation... either not supported or "experimental".
    Fixed

  11. #2161
    Xtreme Member
    Join Date
    Jan 2007
    Posts
    287
    When is the supposed release date? I'm having a hard time deciding whether or not to sell my HD 5870.
    Last edited by aznsniper911; 03-07-2010 at 09:56 PM.

  12. #2162
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    March 26th is the supposed launch date. I just got news that Turkey will receive more Fermi-based cards than France as the first shipment. Also, I am able to buy a 5850 for $280. Keep your sodding EU to yourselves, guaiz!
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  13. #2163
    Xtreme Enthusiast
    Join Date
    Oct 2004
    Location
    Canada
    Posts
    763
    Quote Originally Posted by aznsniper911 View Post
    When is the suppose release date? I'm having a hard time, sell or not to sell my HD 5870.
    If it's close to how things are showing so far.... I doubt you would have much reason to sell your card.
    Lian Dream: i7 2600k @ 4.7Ghz, Asus Maximus IV Gene-z, MSI N680GTX Twin Frozr III, 8GB 2x4GB Mushkin Ridgeback, Crucial M4 128GB x2, Plextor PX-755SA, Seasonic 750 X, Lian-li
    HTPC: E5300@3.8, Asus P5Q Pro Turbo, Gigabyte 5750 Silentcell, Mushkin 2GBx2, 2x 500gb Maxtor Raid 1, 300gb Seagate, 74gb Raptor, Seasonic G series 550 Gold, Silverstone LC16m

    Laptop: XPS15z Crucial M4
    Nikon D700 ~ Nikkor 17-35 F2.8 ~ Nikkor 50mm F1.8
    Lian Dream Work Log
    my smugmug
    Heatware

  14. #2164
    Registered User
    Join Date
    Jul 2009
    Posts
    66
    Quote Originally Posted by Jokester_wild View Post
    If its close to how things are showing so far.... I doubt you would have much reason to sell your card.
    One reason could be to get a 2GB 5870.

  15. #2165
    Banned
    Join Date
    Dec 2009
    Location
    China
    Posts
    69
    Quote Originally Posted by annihilat0r View Post
    Some guy at DH posted these, have you seen this before?


    Those are numbers from bbs.expreview.com ... posted in a table before, not measured ... only copied

    PS. Those numbers are legit, but made with a pre-production sample at lower clocks!

  16. #2166
    Registered User
    Join Date
    Feb 2010
    Location
    NVIDIA HQ
    Posts
    76
    So they're legit numbers from a configuration that won't be sold, using drivers that won't get released? ...just making sure we're clear.


    Amorphous

    Quote Originally Posted by Zed_X View Post
    Those are numbers from bbs.expreview.com ... posted in a table before, not measured ... only copied

    PS. Those numbers are legit, but made with a pre-production sample at lower clocks!
    NVIDIA Forums Administrator

  17. #2167
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    That's what we like to hear !
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  18. #2168
    Banned
    Join Date
    Dec 2009
    Location
    China
    Posts
    69
    Fermi is supported in the 196.75 drivers, but the device ID was removed from the .inf file. In a few days a new version will be released to replace 196.75; once the cards launch you can download the new drivers .. or ONE week before launch from Guru3D or some Chinese site.

    This card is good for a rough view of performance, because its clocks are lower than the retail piece. Logically, if the card beats the 5870 at lower clocks, won't the final revision be even better?

    PS. The price of this card is very good, lower than the 5870's

  19. #2169
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    So you're saying they've actually set clocks? Last I heard, they're still juggling them

    And for that matter, prices have been set?

  20. #2170
    Wanna look under my kilt?
    Join Date
    Jun 2005
    Location
    Glasgow-ish U.K.
    Posts
    4,396
    Personal opinion... I don't think these cards will be cheap, regardless of performance.

    There's no reason for them to be cheap
    Quote Originally Posted by T_M View Post
    Not sure i totally follow anything you said, but regardless of that you helped me come up with a very good idea....
    Quote Originally Posted by soundood View Post
    you sigged that?

    why?
    ______

    Sometimes, it's not your time. Sometimes, you have to make it your time. Sometimes, it can ONLY be your time.

  21. #2171
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by Zed_X View Post
    This card is good for a rough view of performance, because its clocks are lower than the retail piece. Logically, if the card beats the 5870 at lower clocks, won't the final revision be even better?
    Yeah, but... what about the case when the card doesn't beat the HD5870 at lower clocks, as is the case here? Will the final GTX470 be able to reach the HD5870, or won't it?

    Because the current numbers show a clear defeat in the 2 games tested (Crysis Warhead and Dirt 2), by 18.5% and 10% respectively, and an innocuous 7% win in what is nearly a synthetic benchmark.

    That doesn't sound too promising, frankly. Let's wait for reviews with the final clock frequencies, anyway...
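
    For reference, percentages like these are usually relative deficits against the reference card. A quick sketch of that arithmetic; the fps values below are hypothetical, chosen only to reproduce an 18.5% gap:

```python
def percent_behind(reference_fps: float, card_fps: float) -> float:
    """Deficit of card_fps relative to reference_fps, as a percentage."""
    return (reference_fps - card_fps) / reference_fps * 100

# Hypothetical numbers: a card doing 44 fps against a reference doing 54 fps
print(round(percent_behind(54, 44), 1))  # 18.5
```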

    Quote Originally Posted by K404 View Post
    Personal opinion... I don't think these cards will be cheap, regardless of performance.

    There's no reason for them to be cheap
    How can that be? The price of a product depends entirely on how much people are willing to pay for it, not on any other consideration (like production costs, for example). And as I see it, performance is one of the main reasons people are willing to pay more or less for a piece of hw.

    Of course, there are other things besides performance that might affect this, like availability or brand recognition, but performance should be one of the main factors in the price...
    Last edited by Farinorco; 03-08-2010 at 01:23 AM.

  22. #2172
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by mapel110 View Post
    Yes, because ATI launched a mid range chip with 2.15 Billion Transistor and nvidia is releasing a real high end monster with 3 Billion Transistors.
    drawing conclusions from their sheer transistor count is not a good idea...
    and calling the 5870 a midrange chip is a pretty weird statement... ati had to go for multi display setups to find a configuration that makes use of all the graphics performance it offers...

    Quote Originally Posted by H2O View Post
    Saaya, I thought we had a PCIe rep confirm that neither the HD5970 nor the GTX295 were certified as PCIe compatiable, because they could pass the 300W TDP limit. So if the GTX495 goes over 300W, and Nvidia can sufficiently cool it, losing the PCIe compatiability should not be a big deal, right?
    it was a pciE rep? i thought somebody just checked the list on the pci sig site? note that i looked at real measured power consumption, not tdp and peak values... of course you can build a mars like card, but we all know that cooling was a major issue with that card and it wasnt stable at stock speeds for some people as it simply ran too hot. and that was with a huge and expensive heatsink already... like i said, above 300W you reach a point where every extra watt of power makes the pcb, pwm and heatsink designs exponentially more expensive.

    Quote Originally Posted by trinibwoy View Post
    Depends on which games you're talking about. Future games will make much more use of compute shaders and hence the distinction between games and general computing will begin to fade. The software is badly lagging the hardware at this point so it's hard to see the benefits on anything more than an academic level but hopefully that changes soon.
    yes, but why should software suddenly, magically, catch up? why should there not only be a lot of dx11 games, but good dx11 games, and then not only good dx11 games but good dx11 games that use compute shaders so much that gf100 has an advantage from it? i just dont see that happening... sure, eventually games will demand a lot more tessellation and compute shader power, but by then we will have second and most likely third or 4th gen dx11 hardware and all this first gen dx11 stuff will be useless.

    Quote Originally Posted by Marios View Post
    The official TDP is different though.
    GTX295, Radeon HD 5970 and Radeon HD 4870X2 have a TDP of about 290W.
    GTX280 240W, HD5870 190W, GTX285 180W, HD4890 190W full load.
    yes, but the tdp values only matter for certificates and verification with pci sig... im more interested in the feasibility of a card above 300W than whether it can be certified

    Quote Originally Posted by Olivon View Post
    http://tof.canardpc.com/view/a46f0aba-0458-4abe-9ce6-f1c14b4d5cbd.jpg
    http://tof.canardpc.com/preview2/a46...c14b4d5cbd.jpg
    more 8 vs 1 fps, 1.3gb vs 1gb nonsense...

    Quote Originally Posted by zalbard View Post
    GPU-Z does not support Fermi yet, it is a fake.
    it does... but this one is a fake
    whoever did it made a loooot of mistakes, i think he wanted people to know its fake, the mistakes are too obvious...

    Quote Originally Posted by ethomaz View Post
    Crysis Warhead: HD 5870 vs GTX 470 (Already posted?).

    PS. GTX 470 confirmed looking the mem size.
    http://bbs.expreview.com/attachments...2c1b865e9c.jpg
    15fps average...and more 1.3gb vs 1gb nonsense...

    Quote Originally Posted by illidan View Post
    so then, in your opinion, what's the difference between GDDR5 and DDR5?
    thats like asking what the difference between an elephant and a llama is, in your opinion... its not an opinion, it IS a different standard... why evga keeps making this mistake, who knows... either they dont know, which is very possible, they are marketing people after all, or they say ddr+high number because it makes it sound more advanced... everybody knows their system is using ddr2 or ddr3 memory, and if they think the memory on the card is 2 or 3 generations ahead, many n00bs probably go whOooOOoOAaAaaA

    Quote Originally Posted by Chickenfeed View Post
    Buy a HD5 series or GT4xx because of performance in current (DX9/10) games.
    yes, totally agree
    buy a next gen card that DOES support the next standard, but when it comes to performance, focus on current games.
    Quote Originally Posted by Chickenfeed View Post
    All of EVGAs boxes say DDR not GDDR.
    that doesnt make it correct, does it?
    some cards actually do use ddr memory as its cheaper, especially entry level and mainstream cards tend to use ddr2 and ddr3 these days as its fast enough and cheaper.

    Quote Originally Posted by Chickenfeed View Post
    Given the wider bus, they don't need memory much faster if faster at all than whats on the 58x0 cards to achieve adequete bandwidth. Unless they chose to use faster memory later on down the pipe (admist the delays), I don't expect clocks faster than 1300Hz (5.2Ghz) myself.
    yes, i totally agree... that was nvidias strategy with gt200 as well, wider bus means they can use cheaper slower memory and still beat ati in bandwidth and they dont need to push clocks really high which can be a pita. since they need all the performance they can get though, i wouldnt be surprised if they actually go for fast gddr5 now... probably as fast as they can get it to run... since their gddr5 controller is first gen or maaaybe second gen, im not sure how high they will be able to get... the imc will be the same or a slightly tweaked version of that in the gt21x 10.1 40nm cards and those only clock in at 3500mhz effective...
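
    for what its worth, the bus-width trade-off here is simple arithmetic: peak bandwidth is bus width in bytes times the effective transfer rate. a quick sketch using the HD5870's known 256-bit / 4.8 GT/s configuration, and the 384-bit / 5.2 GT/s figure speculated above (the second config is a rumor, not a confirmed spec):

```python
def bandwidth_gbs(bus_width_bits: int, effective_gtps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    return bus_width_bits / 8 * effective_gtps

# HD 5870: 256-bit bus, GDDR5 at 4.8 GT/s effective  -> ~153.6 GB/s
print(bandwidth_gbs(256, 4.8))
# Rumored GF100 config from the posts above: 384-bit at 5.2 GT/s -> ~249.6 GB/s
print(bandwidth_gbs(384, 5.2))
```

    so even at the same memory clock, the wider bus alone buys a ~50% bandwidth lead, which is why slower, cheaper memory can still win on bandwidth.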

    Quote Originally Posted by Chickenfeed View Post
    All that said, with 1.5gb VRAM and high bandwidth, SLI 480s should be the best 2560x1600 high IQ config for some time to come ( I have my doubts that 2 5870 eyefinity cards will do better )
    5870 xfire might be good enough though, and cheaper...

    Quote Originally Posted by orangekiwii View Post
    If the advantage is for future dx11 games, then thats not really an advantage for consumer usage. By the time there is any spectacular or even semi decent game (AvP is bad and runs poorly anyway) there will be MUCH better dx11 hardware out. What matters is current games, if gtx480 is the same as 5870, then whats the point for consumers? I want performance in current games not games in 2 years, if I did I'd buy a card in 2 years. If gtx480 does in fact more or less equal 5870, then theres no reason to buy any card from this generation as simply put its just not a big enough step up in performance on any level.
    totally agree
    hogging hw performance "for later" is the most foolish thing you can do in IT

    Quote Originally Posted by annihilat0r View Post
    crysis warhead numbers look interesting!

    Quote Originally Posted by annihilat0r View Post
    This is getting ridiculous. At launch the highest Cypress chip (5870) was slower than the x2 version of the latest generation. (4870x2 > 5870) Now it's a problem that 470 (not even 480) is slower than a GTX 295?
    well maaaybe, just maaaybe thats because nvidia was creating a huge hype with several events and claiming 40-60% over 5870?

    Quote Originally Posted by Sly Fox View Post
    So a mere 7 to 9 month wait gets you an extra 5% performance. With probably an additional 15% power consumption.
    Impressive Nvidia, impressive.
    Quote Originally Posted by weston View Post
    don't forget the 25% extra price
    gt200 (295) vs gt300 (470/480)
    ~5-20% extra performance
    ~5-20% extra power consumption
    ~5-40% higher price
    +dx11
    +single gpu instead of dual gpu

    i think thats actually pretty damn good, and its not like ati did any better...
    the 5870 is slower than the 4870x2 and costs more, but consumes less power, is a single gpu and has dx11, which made it acceptable... while the 470 will probably lose to the 295, the 480 definitely wont. more perf comes at a cost: more power and a higher price. i think the price doesnt justify the extra performance, especially because more performance at higher prices is not what 90% of the market needs and wants right now... but hey, market demand will take care of that, and im sure there are enough people who are willing to pay huge prices for the fastest single gpu card. the only problem i see for nvidia is availability...

    if you compare the last product cycles from ati and nvidia, the differences are that nvidia uses more power and costs more, but also offers a performance boost, while ati couldnt even reach the performance of their previous gen highend dual gpu card... ati was able to ship though, slowly and with a few bumps, but they could... even the 470 seems to be veeery limited in numbers :/

    i think this is a classical example of pr hype actually hurting the product because it drove expectations too high... and it was a bad decision to focus so much on more performance for a higher price instead of the same performance for a lower price, as performance really isnt such a limiting factor for todays gaming pcs...

    the specs look fine though, performance is good i think, price is acceptable, and so are the power consumption and heat... but availability... thats a real issue... not so much for nvidia, but for its partners its a huge problem... they need some business to make money...
    Last edited by saaya; 03-08-2010 at 01:49 AM.

  23. #2173
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
    I can actually see this. The test cards everyone has been seeing are A2. Regardless of how things went, A3 should be better than A2 in clocks, even if just barely. A1 was supposedly 500MHz, A2 was 600MHz according to Charlie. A3 is unknown at this point. Charlie is saying things didn't get better, but if the rumors flying around about clocks are all given equal weight, A2 is already at speeds of 625-650. Even with a garbage A3 respin, we should be seeing clocks very close to, if not at, 700MHz, and shader clocks into the 1400 range.

    I think the bad performance right now is bad drivers more than anything, because the GTX 295, with terrible clocks of 576 and 1242 on the core and shader respectively, is giving 5 percent or more performance over the 5870. We are already seeing possible signs that 8x AA is no longer the spot where AMD takes a vast lead over NV.

    If there is a 512 core part out there, and they have a 675 core with 1400MHz+ shader, we should see a part that is 20% faster than the 5870 even with bad drivers. We likely saw an A2 revision part produce the 84 FPS vs 50 FPS on a GTX 285 at the earlier CES show, considering the timing. An A3 should do better than that, and might get very close to the 100 FPS that a 5970 produces. Even if this is one benchmark, since these are not obscure settings, some of it has to translate into real-world gaming results.

    http://ht4u.net/reviews/2009/amd_ati...70/index30.php

    I think NV might be sandbagging its performance right now because, no matter what they do, they won't be able to affect 5870 sales; yields are still low enough that there will always be a supply issue. In addition, the number of people who are interested in Fermi and are going to buy one already exceeds the small number of cards NV has produced. I think the reason NV is so silent is the very reason AMD was silent about the performance of the RV770: it would rather face a less prepared opponent than a highly prepared one. I think AMD knows better than to fall for this like NV has done in the past; hence the reason they are releasing new low-leakage R800 parts.

    NV screwed up badly in regards to performance with the G200, because its performance was way too close to the competition's to justify the price difference between the parts, and it had to make a really severe price cut. NV canned the G212 40nm GTX 28x replacement for a reason. A 40nm part with close to 360 shaders using 10.1 tech (and GDDR5) + G200 tech would typically be more than enough to match a 5870, with a similar footprint to boot. I think NV went to DirectX 11 and a new part for a reason, because it had sacrificed a lot by not making the G212.

    Charlie writes about NV like they are amateurs who have no right to make silicon because they are too incompetent, but I think we know better than that, although we might be pessimistic. I think Anand is right in saying NV didn't underestimate AMD this time. Although they might not have gotten the results they wanted, Fermi parts are not going to be garbage, and to some who want ultimate single-chip performance, the wait could have been worth it.

    I am thinking the 470 will slightly beat the 5870 and the 480 will be on average 20-25% faster. Both cards will be inefficient in power usage, though. With mature drivers I am expecting the overclocked editions of Fermi especially to be 30-35% faster than the 5870.
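
    The A2-to-A3 reasoning above amounts to naive linear clock scaling (fps assumed proportional to core clock, which real workloads rarely achieve, so treat this as an upper bound). As a sketch, using the rumored 600 MHz A2 clock and the 84 FPS CES figure:

```python
def scaled_fps(measured_fps: float, old_clock_mhz: float, new_clock_mhz: float) -> float:
    """Naive estimate: assume fps scales linearly with core clock."""
    return measured_fps * new_clock_mhz / old_clock_mhz

# Rumored A2 sample at 600 MHz scoring 84 fps; what might 700 MHz give?
print(scaled_fps(84, 600, 700))  # 98.0, close to the ~100 fps a 5970 produces
```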
    Last edited by tajoh111; 03-08-2010 at 01:44 AM.
    Core i7 920@ 4.66ghz(H2O)
    6gb OCZ platinum
    4870x2 + 4890 in Trifire
    2*640 WD Blacks
    750GB Seagate.

  24. #2174
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    tajoh, your up in the clouds man...

  25. #2175
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    But wait ... I was once told that the Nvidia 8800 trounces every single Radeon ... So surely the Fermi must be like ... like ... like ... ZOMG AWESOME

    And ATI drivers are rubbish and far less stable compared to Nvidia's ... That's what I heard.

    Quote Originally Posted by ajaidev View Post
    This heathen chart lies, dont listen to it, it is lies! They swapped ATI and Nvidia on purpose.



    Buy Fermi.
    Last edited by Mungri; 03-08-2010 at 02:39 AM.

