
Thread: The Fermi Thread - Part 3

  1. #2801
    Xtremely Bad Overclocker
    Join Date
    Jan 2005
    Location
    East Blue
    Posts
    3,596
    Quote Originally Posted by LiquidFiction View Post
    nVidia will have to sell the company by 2011 Q1
and 480 GTX will go down in history as "the other Voodoo 6000"

    Haha I love what is rumored currently
    Last edited by SoF; 03-29-2010 at 12:35 PM.
    | '12 IvyBridge - "ticks different"... | AwardFabrik IvyBridge round I by SoF | AwardFabrik IvyBridge round II by angoholic & stummerwinter
    | '11 The SandyBridge madness... | AwardFabrik / Team LDK OC-Season 2011/2012 Opening Event
    | '10 Gulftown LaunchDay OC round up @ASUS RIIE | 3DM05 2x GPU WR LIVE @Cebit 2010 @ASUS MIIIE | SandyBridge arrived @ASUS P8P67

    | '09 Foxconn Avenger | E8600 | Foxconn A79A-S | Phenom II 940 BE | LaunchDay Phenom II OC round up
    | '08 7.438s 1m LN2 | AMD 1m WR LN2 | 2nd AOCM | Phenom II teasing
    | '07 100% E2140 | 106.5% E2160 | 100% E4500 | 103% E4400 | 5508 MHZ E6850 | 7250 MHZ P4 641 126.5% by SoF and AwardFabrik Crew all on Gigabyte DS3P c? and LN2...
    | '06 3800+ X2 Manchester 0531TPEW noHS 3201MHZ c? | 3200+ Venice noHS 3279MHZ c? | Opteron 148 0536CABYE 3405MHZ c? all on Gigabyte K8NXP-SLI compressorcooled

    | '05 3500+[NC], 3000+[W], 2x 3200+[W], 3500+[NC], 3200+[V] 0516GPDW

    Quote Originally Posted by saaya
    sof pulled a fermi on all of us !!!

  2. #2802
    Xtreme Enthusiast
    Join Date
    Jan 2007
    Location
    QLD
    Posts
    942
yeah keep dreaming, then ATI can bend all of us over the table and demand $1500 for a high-end card, as you Daamit fans always wanted.

  3. #2803
    Xtreme Member
    Join Date
    Jul 2005
    Posts
    429
    Quote Originally Posted by Dahmer View Post
    You're so green it makes my eyes hurt...

    One major point though, real gamers spend more money on games than on hardware to run the games. A real gamer will spend money on a system good enough to play it and that's it.
    Find me one person that has everything you mentioned and uses all that JUST for games then you have found yourself an idiot with too much money for his/her own good.

    I've got two friends that have a home theater system and they both use for....guess what....watching movies! Who would've known, using massive screens for movies =o
Perhaps you are speaking of kids, I'm not sure, but most adults with a decent income can afford the best PC hardware, at least the people I know of anyway. Define "real gamers" anyway, are there "fake gamers"? When you say "spend more money on games", do tell me, in the course of a year do you think they purchase 5 games? 10 games? Not very many "real gamer" titles come out.

Also, you proved my point as well when you say "uses all that JUST for games": a lot of people with a 30" monitor and high-end hardware just might use the PC for more than games. Shocker!
    PC1 EVGA Nvidia 790i Ultra | E8400 Retail @ 4.05GHz 1.35v | 8GB Mushkin Ascent @ DDR3-1680 | 2xBFG GTX 280 SLI @ stock | 30" Dell 3008WFP @ 2560x1600 XHD | XFI Fatality | 3x256GB Corsair(Samsung) SSD Raid0 & 1TB Samsung Backup | LG DVD / CDRW | NEC DVD DL 16x | CoolerMaster Stacker 830 8x120mm high 110 CFM per fan| 1000w Corsair SLI certified | Scythe Infinity (dremel mod to avoid caps on board)

    PC2 (Wife) ASUS P5WDH Bios 1101 | E6400 Retail @ 3.2 | 4GB Corsair 6400C4 | 1xBFG GTX 260 @ stock | 24" Dell 2405FPW @ 1920x1200 XHD | XFI Xtrememusic | 2x150G Raptor Raid0 & 1TB WD Backup | Pioneer DVD | CoolerMaster Stacker 830 | 850w PCP&C | Stock HSF

  4. #2804
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Manicdan View Post
    the same guy who got the 30" just had me rebuild his machine, a phenom quad 3.4ghz, 4GBs of ddr3, and a 5850, plays CoD MW2 maxed out, and i mean maxed the F out, as smooth as can be.
    mhhh that IS nice
    but then look at games like crysis, stalker, metro 2033... crysis2 is coming up... forget about playing that with a single gpu at 2560 res...

    Quote Originally Posted by bill_d View Post
at 8xaa maybe, but at 25x16 2xaa i can get 60fps on almost all games with two 4890's, and in a few games i can get 4xaa
crysis? stalker? metro 2033? assassins creed? and thats multi gpu... like i said, while there have been a lot of improvements, id really avoid multi gpu when building a gaming rig...

    Quote Originally Posted by zalbard View Post
    Not 5000?
    charlie claimed 5-10k, ive heard 10-15k from others, and i think it was zed_x who mentioned 60k during the first weeks... though he also claimed the cards would have some weird new marketing name which they didnt, and that the 480 had 512sps etc...

    Quote Originally Posted by Coldon View Post
    I never said they were perfect did i? 5%? strange how the new ATI drivers gave a good ~10% increase across the board for an old architecture. You could then argue as to why these improvements weren't in the drivers at release since its an old architecture, they have technically had years to perfect and tweak it, and yet the launch drivers for the 5870s were horrendous.
good point... but i think most of the tweaks were related to new games that they had to optimize the drivers for, and then a lot was most likely reducing cpu dependence, so you dont need as much mhz to drive the cards... so if you compare old vs new drivers with a 4ghz intel cpu you wouldnt notice that much of a boost...

but dont forget, the 10.3 drivers brought the biggest boost for the 5800 series so far and it was UP TO 10% per gpu boost... on average it was maybe 3% across the board at most... thats why i said i dont think nvidia will be able to improve average perf across the board by much more than 5% in the next couple of weeks... in individual games, im sure we will see 10% boosts in some settings, maybe even 15%... but on average... dont think so...

    Quote Originally Posted by Coldon View Post
    This is a completely new architecture, and you expect them to have drivers optimized in 6 months?! how long did it take for the g80 drivers to get tweaked and optimized, the move to USM/USA needed a lot of driver tweaks and optimization. The same can be said here as well, though not to such a great extent as the g70 -> g80 change over.
no idea... but its not like its scientists studying a UFO, they built the darn thing so they should know how to optimize their drivers to make use of it, and id be surprised if they waited for final hardware to think about how to actually use the units and bandwidth with their driver/compiler... its very uncommon to see notable performance boosts across the board that go beyond tweaking and fixing the drivers for new games, after 6+ months that a new architecture came out, in my experience... you usually see notable boosts within 3 months of the launch and from there on there are barely any perf boosts across the board, just tweaks and fixes. fermi launched NOW, but is delayed by 6 months... so they started working on the drivers more than 6 months ago... so theyve had more than 6 months to tweak their drivers!
there was an article about driver evolution on ati and nvidia cards on 2 sites in the last year iirc... they both concluded that within 6 months drivers rarely improve performance across the board by more than 5%, and the best was around 10% at some res for a certain card iirc.
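The "up to 10% per gpu" vs "maybe 3% across the board" distinction above is just averaging. A toy sketch (all the per-game numbers below are made up for illustration, not benchmark results) shows how a couple of headline titles dilute into a small overall figure:

```python
# Hypothetical per-game gains (percent) from a driver update.
# These numbers are invented to illustrate the averaging effect,
# not measured 10.3-driver results.
per_game_gain_pct = [10.0, 8.0, 2.0, 1.5, 1.0, 0.5, 0.0, 0.0]

headline = max(per_game_gain_pct)                        # the "up to X%" claim
average = sum(per_game_gain_pct) / len(per_game_gain_pct)  # across-the-board figure

print(f"up to {headline:.0f}% in the best title")
print(f"{average:.1f}% on average across the board")
```

With these made-up numbers the "up to" figure is 10% while the suite-wide average comes out under 3%, which is the shape of the argument above.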

    Quote Originally Posted by Coldon View Post
    I'm starting to feel that you are just arguing and disagreeing for the sake of disagreeing and voicing your opinion.
    thats nonsense! i couldnt disagree more!
    hehehe, yeah i tend to disagree a lot... i dont mean that you are wrong though, so please dont take it personally

    Quote Originally Posted by Coldon View Post
    Some people might like larger displays simply cause they are older and their eyes get tired staring at smaller displays, or they prefer sitting further back from their screens. Yeh, SLI/crossfire is not a great gaming experience but they can simply turn off some eye candy on a single GPU and still have excellent framerates. High levels of AA at such a high resolution is not as essential as it is at lets say 1680x1050.
    true... aa isnt that important at high res... unless its a low res alpha texture like a fence or twigs of a tree etc...
    but think about it, would you prefer lower detail and quality settings at 2560 res and 2-4" more screen than better iq at a slightly lower display?
    its a subjective thing... id def prefer a smaller screen with better iq...
    and thats not even taking into account the cost of a bigger screen plus more gpu oomph...



    Quote Originally Posted by tajoh111 View Post
    Fermi has inconsistent performance across the board. Games the old architecture was good at perform comparatively bad on fermi. Its a new architecture and whatever people may say.
    i think thats mostly based on texturing performance and geometry performance though... if you look at the games where fermi does well, its games that are shader and geometry heavy. in games that are texture heavy 480 is as fast as a 295 or even slower... im not really sure, but thats the impression i got when reading the fermi reviews... crysis is very texture heavy, and fermi performs the same as 5870. stalker and metro 2033 are very shader heavy and fermi does very well there... 3dmark series have always been pretty texture heavy, and fermi is as fast as a 5870 there...

    Quote Originally Posted by tajoh111 View Post
    More power to fuel higher resolutions is one of the fundamental reason why videocards have to get more powerful.
    do you really think in 2010 this is still a valid claim?
i thought 5800 and fermi showed us that resolution is no longer a driving factor? i mean ati and nvidia had to come up with multi monitor solutions and 3d to somehow show a notable performance boost of the latest gen hardware, cause in 1080P, which is still NOT the standard, there really isnt a need to buy the latest and greatest hardware... at all...

    i think we have reached a point from where on the amount of pixels becomes less and less important and the amount of pixels increases less and less. its all about increasing calculation power per pixel now. thats what several game devs said at the last graphics convention as well...

    Quote Originally Posted by damha View Post
    Let's start taking bets: ATI's next generation GPU will be more powerful and run cooler than the GTX480.
    Is that a far-fetched assumption?
    Video cards will get more powerful, but not Nvidia's version of powerful.
    ~5970 performance, faster @ tesselation, less heat than a 5970, ~399$, q1 2011... and ~470 performance at half the power consumption for 249$ is what id expect as well...

i hope they focus on better 3d support and come up with a proper infrastructure, and dont just tell customers to go and find displays and glasses themselves and figure out how to set it all up and get it working properly...

  5. #2805
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by tajoh111 View Post
    + 1 to both your points.

    Fermi has inconsistent performance across the board.
    that about sums it up. looks like they didnt have enough time to really figure out how to utilize the new architecture for older games. and honestly why should they care, who wants 10 more fps when u already have 150fps, but for newer games, those extra 5 will really help when your at 45fps. ill probably take a look at the hard ocp review to see what settings they found playable (i do like how they do that kind of thing) id bet you can enjoy every old game just fine, and with a few driver enhancements, up the 4xaa to 8x (woopty dooo. lol)

  6. #2806
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by pentium777 View Post
Perhaps you are speaking of kids, I'm not sure, but most adults with a decent income can afford the best PC hardware, at least the people I know of anyway.
    yes, we got it, your in the money and so are your friends :P
its not about affording it, its about choosing to spend money on something you dont actually need or see no real gain from...

    Quote Originally Posted by pentium777 View Post
Define "real gamers" anyway, are there "fake gamers"? When you say "spend more money on games", do tell me, in the course of a year do you think they purchase 5 games? 10 games? Not very many "real gamer" titles come out.
i wouldnt categorize gamers based on how many games they buy or how much they spend on hw, but how many hours a day they play games... then again, for hardware and software companies those gamers are actually not interesting at all, as they dont make a lot of money off of them

    Quote Originally Posted by pentium777 View Post
    Also you proved my point as well when you say "uses all that JUST for games" a lot of people with 30" monitor and high end hardware just might use the PC for more than games shocker!
    yeah but once you go 30", for whatever reason, you HAVE to play at 2560... or else the image quality will suck... well not suck, but why spend so much on a big screen and then play below its native res with a slightly blurry image? thats what i mean... once you go 30" you HAVE to play at 30" and you HAVE to invest more on hardware... and you might be able to get 99% the same game play experience as on single gpu 24" screens, but it wont be the same or even better... show me any pro gamer that plays on a 30" screen... ANY! see what i mean? ask most benchers here on xs... they bench on tri or quad sli... but when they play games they prefer single gpu rigs...

    Quote Originally Posted by Manicdan View Post
    that about sums it up. looks like they didnt have enough time to really figure out how to utilize the new architecture for older games. and honestly why should they care, who wants 10 more fps when u already have 150fps, but for newer games, those extra 5 will really help when your at 45fps. ill probably take a look at the hard ocp review to see what settings they found playable (i do like how they do that kind of thing) id bet you can enjoy every old game just fine, and with a few driver enhancements, up the 4xaa to 8x (woopty dooo. lol)
maybe... or maybe its that older games are texturing limited on fermi... the newer the game, the more pixel and geometry heavy games get... mostly pixel shader heavy...
i think fermi is definitely texturing limited...
maybe it has enough texturing performance so they focused on pixel shader and geometry performance... but tbh im not very happy with the texturing in current games, and many people arent, just look at all the texture mods and hacks out in the wild... would be interesting to make an article about this... cant wait for some beyond3d or techreport or xbitlabs analysis of fermi

  7. #2807
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,692
    Quote Originally Posted by SoF View Post
    and 480 GTX will be go into history as "the other Voodoo 6000"

    Haha I love what is rumored currently
    We get the point already.

    Intel Core i7-3770K
    ASUS P8Z77-I DELUXE
    EVGA GTX 970 SC
    Corsair 16GB (2x8GB) Vengeance LP 1600
    Corsair H80
    120GB Samsung 840 EVO, 500GB WD Scorpio Blue, 1TB Samsung Spinpoint F3
    Corsair RM650
    Cooler Master Elite 120 Advanced
    OC: 5Ghz | +0.185 offset : 1.352v

  8. #2808
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
Its weird though. In anandtech's review, the 480 does significantly better at crysis warhead, 11-17 percent better with 33 percent higher minimums than the 5870. The SLI results are a complete beat down in those results, with SLI gtx480 likely matching CF 5970s.
    Core i7 920@ 4.66ghz(H2O)
    6gb OCZ platinum
    4870x2 + 4890 in Trifire
    2*640 WD Blacks
    750GB Seagate.

  9. #2809
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by saaya View Post
    yeah but once you go 30", for whatever reason, you HAVE to play at 2560... or else the image quality will suck... well not suck, but why spend so much on a big screen and then play below its native res with a slightly blurry image? thats what i mean... once you go 30" you HAVE to play at 30" and you HAVE to invest more on hardware... and you might be able to get 99% the same game play experience as on single gpu 24" screens, but it wont be the same or even better... show me any pro gamer that plays on a 30" screen... ANY! see what i mean? ask most benchers here on xs... they bench on tri or quad sli... but when they play games they prefer single gpu rigs...
    single gpu isnt too scary, u just cant use the ultra textures or 4x+AA, but with that kind of resolution, and dot pitch, u may not need to use more than 2xaa.

and yes if u can drop 1000$ on a monitor, it may just be better to drop 500$ on a good IPS around 25" and put an extra gpu in ur pc. thats what i was telling myself with the SSD drive i had, i could buy 2 more 4850s, or a 60GB SSD. but i was happy with my gpu performance (especially for WoW) and the SSD felt more worthwhile, even though its retarded expensive still. but back to the point, 1000$ on a monitor probably means an extra 200$ on GPU power.

    and keep in mind its only 40% more pixels.
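For reference, assuming the comparison intended here is a 24" 1920x1200 panel against a 30" 2560x1600 one (the two resolutions discussed in this exchange), the pixel arithmetic actually works out to a bigger jump than 40%:

```python
# Pixel counts for the two resolutions discussed in the thread
# (assuming the 24" 1920x1200 vs 30" 2560x1600 comparison).
pixels_1920x1200 = 1920 * 1200   # 2,304,000 pixels
pixels_2560x1600 = 2560 * 1600   # 4,096,000 pixels

extra_pct = (pixels_2560x1600 / pixels_1920x1200 - 1) * 100
print(f"{extra_pct:.0f}% more pixels")
```

Going from 1920x1200 to 2560x1600 is roughly 78% more pixels, which may also be part of why performance drops so sharply at the higher res.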

  10. #2810
    Xtreme Enthusiast
    Join Date
    Feb 2005
    Posts
    970
    Quote Originally Posted by Coldon View Post
    This is a completely new architecture, and you expect them to have drivers optimized in 6 months?! how long did it take for the g80 drivers to get tweaked and optimized, the move to USM/USA needed a lot of driver tweaks and optimization. The same can be said here as well, though not to such a great extent as the g70 -> g80 change over.
This shouldn't be a problem with nv's supposed wizards in writing drivers. Can't have it both ways, either they are or they aren't. Saying nv's drivers are second to none, with all this speculation on poor 4xx performance being due to poor drivers, seems to me like a dog chasing its tail.

  11. #2811
    Xtreme Enthusiast
    Join Date
    Dec 2008
    Posts
    752
    Quote Originally Posted by flippin_waffles View Post
This shouldn't be a problem with nv's supposed wizards in writing drivers. Can't have it both ways, either they are or they aren't. Saying nv's drivers are second to none, with all this speculation on poor 4xx performance being due to poor drivers, seems to me like a dog chasing its tail.
    Not really.

They can have better drivers than AMD and be good at making them. But just recently they've pulled 15 - 20% more performance out of some games, on an architecture thats extremely similar to the g80, which is 3 years old. Now are you honestly going to tell me that a BRAND SPANKING NEW ARCHITECTURE which has had basically four months of development won't get similar improvements?


    Compare the 5870 with release drivers to the drivers now. This arch is really just a cleaned up version of the rv770 (which was based off of 2900 amirite?) and yet they still added a clean 10 - 15% increase across the board. Do you really expect Fermi to not have similar improvements if not MUCH greater improvements in the coming months?

  12. #2812
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by orangekiwii View Post
    Compare the 5870 with release drivers to the drivers now. This arch is really just a cleaned up version of the rv770 (which was based off of 2900 amirite?) and yet they still added a clean 10 - 15% increase across the board. Do you really expect Fermi to not have similar improvements if not MUCH greater improvements in the coming months?
    +1,
i think its a lot of balancing the new gpu/memory power. the 4870s had much stronger ram relative to gpu power than we do now. i dont know much about building drivers, but when it comes to billions of calculations a second, i bet its a lot of trial and error testing, and then boom, they find the sweet spot

  13. #2813
    Xtreme Member
    Join Date
    Sep 2008
    Posts
    449
    Quote Originally Posted by saaya View Post
    and the 197 drivers for the gt200 based cards are perfect? theres no such thing as a perfect driver with constantly new hw and sw being introduced into the eco system...
    i think drivers for gf100 will improve, but not all that much... maybe 5% on average... cards that come out with a big delay usually already have the drivers tweaked for performance..
Yea but unlike GF100, Cypress (RV870 or whatever its new name is) was not a huge jump from the RV770 architecture. On the contrary, GF100 in comparison to G80, G92 or G200 is a huge jump in some areas. It may be a while before we see Fermi's real performance, and by then it will probably already be too late .
    --lapped Q9650 #L828A446 @ 4.608, 1.45V bios, 1.425V load.
    -- NH-D14 2x Delta AFB1212SHE push/pull and 110 cfm fan -- Coollaboratory Liquid PRO
    -- Gigabyte EP45-UD3P ( F10 ) - G.Skill 4x2Gb 9600 PI @ 1221 5-5-5-15, PL8, 2.1V
    - GTX 480 ( 875/1750/928)
    - HAF 932 - Antec TPQ 1200 -- Crucial C300 128Gb boot --
    Primary Monitor - Samsung T260

  14. #2814
    Xtremely Bad Overclocker
    Join Date
    Jan 2005
    Location
    East Blue
    Posts
    3,596
    Quote Originally Posted by Dainas View Post
    yeah keep dreaming, then ATI can bend all of us over the table and demand $1500 for a highend card as you Daamit fans always wanted.
    Quote Originally Posted by Tim View Post
    We get the point already.
I added one more smiley in my post so that nobody gets me wrong - won't happen and would be very bad anyway... love benching different cards, but Fermi still needs some time until it's worth spending some DICE or LN2 on it for me.

  15. #2815
    Xtreme Enthusiast
    Join Date
    Feb 2005
    Posts
    970
    Quote Originally Posted by orangekiwii View Post
    Not really.

They can have better drivers than AMD and be good at making them. But just recently they've pulled 15 - 20% more performance out of some games, on an architecture thats extremely similar to the g80, which is 3 years old. Now are you honestly going to tell me that a BRAND SPANKING NEW ARCHITECTURE which has had basically four months of development won't get similar improvements?


    Compare the 5870 with release drivers to the drivers now. This arch is really just a cleaned up version of the rv770 (which was based off of 2900 amirite?) and yet they still added a clean 10 - 15% increase across the board. Do you really expect Fermi to not have similar improvements if not MUCH greater improvements in the coming months?

I don't see it like that. I see a few percentage points that you could attribute to refining optimizations. The big jumps I see are in DX11, which isn't exactly old. I think it was just released in September. There's no excuse for nv not having as much development time with it as AMD has. I also see big improvements in TWIMTBP'd titles, or other games where AMD doesn't get pre-release code to optimize for, which they basically have to buy off the shelf. Take Metro for example: optimizations are still coming, as they never had access to the code. I can't think of a situation where nv wouldn't have the chance to optimize for their own games.

    2 completely different situations I think.

  16. #2816
    Xtreme Enthusiast
    Join Date
    Dec 2009
    Posts
    591
    Quote Originally Posted by SoF View Post
I added one more smiley in my post so that nobody gets me wrong - won't happen and would be very bad anyway... love benching different cards, but Fermi still needs some time until it's worth spending some DICE or LN2 on it for me.
    You still think fermi will be worth nvidias time in the future?

    G80 is creaking and cracking and the more I read about fermi the more I realize its going to take a genius and an engineering miracle to bring out a well balanced GPU from that hungry and noisy beast.

If Nvidia can do it without a major redesign, I will be forever impressed. Advice to nvidia (worthless I know): its time to move on. I don't want to see you like this.

  17. #2817
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,692
    Quote Originally Posted by SoF View Post
I added one more smiley in my post so that nobody gets me wrong - won't happen and would be very bad anyway... love benching different cards, but Fermi still needs some time until it's worth spending some DICE or LN2 on it for me.
    Gotcha.


  18. #2818
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    it wont take a miracle to make a good gpu based off of fermi. it might catch us by surprise like rv770 but thats just from good engineering. they dont need a major redesign, if anyone does its ATi. fermi is really good at tessellation and designed to handle it very efficiently. they just need more work on process/yields.

  19. #2819
    Xtreme Member
    Join Date
    Jul 2006
    Posts
    403
to be honest i think the hardest thing about the new architecture is the predictive load balancing that needs to be done on the GPU. I think that with a bit of time, and more cards in the wild and in test groups, the drivers can easily be tweaked to provide better load balancing on a per game basis. I am guessing here though.

    @saaya, I personally think that 1920x1200 is the perfect res to game at, on a 24" screen, tho sometimes i do feel like even 24" is a little large since i tend to subscribe to the "nose up against the screen" CS style of play school.

I think this thread has devolved into nvidia vs ATI, which is all I've read on any forum lately, and I'm getting a little tired of it. I personally don't lean towards either camp, but I also will not write off the gtx480 as quickly as some people on this board will. I will give it a fair chance; for all we know this new architecture might be the future. Look at where the pathetic 2900xt ended up.

  20. #2820
    Xtreme Enthusiast
    Join Date
    Dec 2009
    Posts
    591
    Quote Originally Posted by Chumbucket843 View Post
    it wont take a miracle to make a good gpu based off of fermi. it might catch us by surprise like rv770 but thats just from good engineering. they dont need a major redesign, if anyone does its ATi. fermi is really good at tessellation and designed to handle it very efficiently. they just need more work on process/yields.
    Oooh tessellation is such a big thing now nvidia's got it. We know ATI has got solid hardware under the hood. In fact, if you check this link you can see there isn't a single figure that nvidia has the edge on. This shows ATI is still missing something: they are producing a beast with better specs in theory, fewer transistors (a billion less), and excellent power consumption.

    Hardware or software? I am guessing ATI's processing units aren't saturated effectively, something in the hardware. Load balancing isn't doing its thing properly or maybe something entirely different.

    Tessellation? I can make a guess that ATI can improve tessellation performance with better drivers, given the nature of tessellation.

  21. #2821
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    considering people are still able to OC the crap out of these cards, i wonder if they can be undervolted and still run a solid 500-600mhz, but with a huge chunk less of power.

  22. #2822
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by damha View Post
    oooh tessellation is such a big thing now nvidia's got it. We know ati has got solid hardware under the hood. In fact, if you check this link you can see there isn't a single figure that nvidia has the edge on. This shows ati is still missing something: They are producing a beast with better specs in theory, fewer transistors (a billion less), and excellent power consumption.
    not really. its designed around big triangles. if you have small primitives from tessellation it pretty much destroys performance of pixel shaders.

    you might want to look harder.


    fyi heaven culls ~70% of triangles in the dragon scene.
    hardware or software? I am guessing ati's processing units aren't saturated effectively, something in the hardware. Load balancing isn't doing its thing properly or maybe something entirely different.

    Tessellation? I can make a guess that ati can improve tessellation performance with better drivers, given the nature of tessellation.
    their graphics pipeline is stalling for tessellation. its a hardware problem that will require a lot of thinking and problem solving to get right. im sure they will have a good solution in r900.

  23. #2823
    Xtremely Bad Overclocker
    Join Date
    Jan 2005
    Location
    East Blue
    Posts
    3,596
    Quote Originally Posted by Manicdan View Post
    considering people are still able to OC the crap out of these cards, i wonder if they can be undervolted and still run a solid 500-600mhz, but with a huge chunk less of power.
considering the low default voltage, I am not that optimistic that undervolting can save you "a huge chunk" of power. Maybe a bit, but if it is enough to justify the consumption for the speed you have at those clocks, then you basically have the performance of a different card with less consumption, so we are back at the point that it doesn't look good for undervolting. Also I am wondering why software voltage tools are still only announced, not released -.-
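The skepticism here can be framed with the usual first-order CMOS dynamic-power relation, P ∝ f·V². The clocks and voltages below are assumptions for illustration only, not measured GTX 480 operating points, and static/leakage power (a big factor on this chip) is not modeled:

```python
# First-order dynamic power sketch: P is roughly proportional to f * V^2.
# Both operating points are hypothetical, chosen only to show why the
# available voltage headroom decides how much undervolting can buy.
f_stock, v_stock = 700e6, 1.00    # assumed stock core clock / voltage
f_low,   v_low   = 550e6, 0.85    # hypothetical undervolted point

ratio = (f_low * v_low ** 2) / (f_stock * v_stock ** 2)
print(f"dynamic power at the lower point: {ratio:.0%} of stock")
```

The V² term does most of the work in that ratio, so if the default voltage is already near the stability floor, v_low cannot drop far below v_stock and the savings shrink toward the plain clock reduction, which is the point being made above.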
    Last edited by SoF; 03-29-2010 at 01:49 PM.

  24. #2824
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Tim View Post
    We get the point already.
    oh come on, don't be such a party pooper :P

    Quote Originally Posted by Manicdan View Post
    a single gpu isn't too scary, you just can't use the ultra textures or 4x+ AA, but with that kind of resolution and dot pitch, you may not need more than 2xAA.

    and yes, if you can drop $1000 on a monitor, it may just be better to drop $500 on a good IPS around 25" and put an extra GPU in your PC. that's what I was telling myself with the SSD I had: I could buy two more 4850s or a 60GB SSD. I was happy with my GPU performance (especially for WoW), so the SSD felt more worthwhile, even though it's still absurdly expensive. but back to the point: $1000 on a monitor probably means an extra $200 on GPU power.

    and keep in mind it's only 40% more pixels.
    yeah... only 40% more pixels, but for some reason perf collapses when going to 2560x1600 with current hardware... i expected a lot from the 480s in that regard... i thought they'd be really, really nice at high res and a perfect companion for a 30" screen...

    yeah, if you buy a 2560 display and you play games, you basically have to add some extra money on top of the cost of the display for upgrading your PC's graphics...
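    For what it's worth, the pixel math (assuming the step being discussed is 1920x1200 → 2560x1600) works out to quite a bit more than 40%:

    ```python
    # Pixel-count comparison; the two resolutions are assumed from context.
    base = 1920 * 1200   # 2,304,000 pixels
    high = 2560 * 1600   # 4,096,000 pixels
    print(round(high / base - 1, 2))  # 0.78 -> ~78% more pixels to shade
    ```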

    Quote Originally Posted by tajoh111 View Post
    It's weird though. In Anandtech's review, the 480 does significantly better at Crysis Warhead: 11-17 percent better, with 33 percent higher minimums than the 5870. The SLI results there are a complete beatdown, with SLI GTX 480s likely matching CF 5970s.
    what drivers did anandtech use? what cpu speed and what windows version? i noticed big differences during the 5800 launch between reviews using a 2.66GHz 920 or a 4GHz 965, and there were weird differences in 32-bit vs 64-bit as well: some games seem to perform better on 64-bit than others, and iirc ATI gained slightly more from 64-bit than nvidia? can't remember...

    Quote Originally Posted by Manicdan View Post
    +1,
    i think it's a lot of balancing of the new gpu/memory power. the 4870s had much stronger RAM relative to GPU power than we have now. i don't know much about building drivers, but when it comes to billions of calculations a second, i bet it's a lot of trial-and-error testing, and then boom, they find the sweet spot
    yeah... and that sweet spot is probably different for every game and every res... if you use AA and AF it's different... and then they find another tweak and the sweet spot moves again...

    Quote Originally Posted by Coldon View Post
    @saaya, I personally think that 1920x1200 on a 24" screen is the perfect res to game at, though sometimes I do feel like even 24" is a little large, since I tend to subscribe to the "nose up against the screen" CS school of play.

    I think this thread has devolved into nvidia vs ATI, which is all I've read on any forum lately, and I'm getting a little tired of it. I don't lean towards either camp, but I also won't write off the GTX 480 as quickly as some people on this board will. I'll give it a fair chance; for all we know this new architecture might be the future. Look at where the "pathetic" 2900XT ended up.
    lol @ nose at the screen, hehe
    well, tbh, if the pixels are big and the space between them is tiny, then there's no problem looking at the screen from close up, i think
    but yeah, 24" at 1920x1080 or 1200 sounds like the best res to me as well...
    i didn't know there were 27" screens with the same res though... those might actually be very interesting as well

    about ati vs nvidia... like i said before, i don't think there's a big difference between the 5850, 5870, 470 and 480... what can you do on one of those that you can't do on another? if we built four rigs, one with each of them, would any of us be able to tell them apart by playing games on them? i doubt it...

    for me it comes down to cost, driver preference, and how important power and heat are to the user...

    i don't think even the most radical ati fanboy would call fermi a slow card... it's hotter and costs more than a 5870, but i doubt any ati fan would refuse one if you gave it to him for free

  25. #2825
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by SoF View Post
    Considering the low default voltage, I am not that optimistic that undervolting can save you "a huge chunk" of power. Maybe a bit, but if you have to drop the clocks to make the lower voltage stable, you end up with the performance of a different card at that card's consumption, so we are back at the point that undervolting doesn't look good.
    yeah... stock volts are .99v :o
    what's interesting is that idle volts are .96... so i really think power management is broken... that explains the 85W video playback and 55W idle power consumption...
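    A back-of-the-envelope split of why a 0.99 V → 0.96 V idle drop points at broken power management (all figures below are assumptions for the sketch, not measurements):

    ```python
    # If idle clocks really dropped (assume ~50 MHz vs ~700 MHz at load), the
    # dynamic part should nearly vanish; whatever remains of the measured idle
    # power would be static/leakage and ungated logic.

    def dynamic_at_idle(p_dyn_load, v_idle, f_idle, v_load=0.99, f_load=700.0):
        """Dynamic power at idle, scaled by V^2 * f from the load point."""
        return p_dyn_load * (v_idle / v_load) ** 2 * (f_idle / f_load)

    idle_dyn = dynamic_at_idle(150.0, 0.96, 50.0)  # assume ~150 W dynamic at load
    print(round(idle_dyn, 1))  # 10.1 -> only ~10 W of dynamic power at idle
    # Against a measured ~55 W idle, the other ~45 W would be static, which a
    # mere 0.03 V idle voltage drop cannot touch.
    ```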
