
Thread: 7900GTX / GT Information

  1. #76
    Xtreme Enthusiast
    Join Date
    Feb 2005
    Location
    NY
    Posts
    665
    Quote Originally Posted by nn_step
    So you believe that ATi will have the edge in DirectX and nVidia will continue having a major advantage in OpenGL?
No, shader-intensive games. FEAR is a nice example. I think Oblivion results will be similar to FEAR's, same as UT07. The 7900 will simply dominate in older games, whereas newer games and engines will love the X1900's shading power.
    Unapproved link in signature. Signature has been removed.
    Please read the forums rules and guidelines. They are at the top of the forums.

    Edited by IFMU

  2. #77
    Muslim Overclocker
    Join Date
    May 2005
    Location
    Canada
    Posts
    2,786
    Quote Originally Posted by sabrewolf732
No, shader-intensive games. FEAR is a nice example. I think Oblivion results will be similar to FEAR's, same as UT07. The 7900 will simply dominate in older games, whereas newer games and engines will love the X1900's shading power.
That's why ATI did the research and chose that design path... What's the use of pixel processing power if shading is lacking and shading is what future games will stress?

    My watercooling experience

    Water
    Scythe Gentle Typhoons 120mm 1850RPM
    Thermochill PA120.3 Radiator
    Enzotech Sapphire Rev.A CPU Block
    Laing DDC 3.2
    XSPC Dual Pump Reservoir
    Primochill Pro LRT Red 1/2"
    Bitspower fittings + water temp sensor

    Rig
    E8400 | 4GB HyperX PC8500 | Corsair HX620W | ATI HD4870 512MB


    I see what I see, and you see what you see. I can't make you see what I see, but I can tell you what I see is not what you see. Truth is, we see what we want to see, and what we want to see is what those around us see. And what we don't see is... well, conspiracies.



  3. #78
    Xtreme Mentor
    Join Date
    Sep 2005
    Location
    Netherlands
    Posts
    2,693
Maybe nvidia chose to make sure the 7800/7900 would be better in older games that don't need a lot of shader power, because they have the G80 not that far away, which will be powerful at shading.

So they didn't think the investment was worth it, because the G80 is so near.
    Time flies like an arrow. Fruit flies like a banana.
    Groucho Marx



    i know my grammar sux so stop hitting me

  4. #79
    Xtreme X.I.P.
    Join Date
    Aug 2004
    Location
    Chile
    Posts
    4,151
I believe nvidia went that way because, as far as I know, shader processors do mostly physics, so Ageia (don't remember the spelling) could come in handy for nvidia. They would also have a lot more pixel processors than ATI. Also, given that nvidia sponsors more game development, I don't think developers will code to take advantage of the competition.

  5. #80
    Xtreme Mentor
    Join Date
    Sep 2005
    Location
    Netherlands
    Posts
    2,693
Wasn't there a rumor (a small one) that the G80 had two cores?
Maybe one is for physics?

  6. #81
    Xtreme Enthusiast
    Join Date
    Feb 2005
    Location
    NY
    Posts
    665
    Quote Originally Posted by metro.cl
I believe nvidia went that way because, as far as I know, shader processors do mostly physics, so Ageia (don't remember the spelling) could come in handy for nvidia. They would also have a lot more pixel processors than ATI. Also, given that nvidia sponsors more game development, I don't think developers will code to take advantage of the competition.
The industry is headed towards more shader-based games... that's pretty much fact.

  7. #82
    YouTube Addict
    Join Date
    Aug 2005
    Location
    Klaatu barada nikto
    Posts
    17,574
    Quote Originally Posted by Starscream
Wasn't there a rumor (a small one) that the G80 had two cores?
Maybe one is for physics?
Yeah, the rumor mentioned something about a second specialized core... perhaps an Ageia physics chip...
    Fast computers breed slow, lazy programmers
    The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay.
    http://www.lighterra.com/papers/modernmicroprocessors/
    Modern Ram, makes an old overclocker miss BH-5 and the fun it was

  8. #83
    Xtreme Addict
    Join Date
    Aug 2004
    Location
    Austin, TX
    Posts
    1,346
    Quote Originally Posted by ahmad
    Thats why ATI did the research and chose that design path... Whats the use of pixel processing power if shading is lacking and its what games will be in the future?
That's correct. B3D interviewed Carmack and Sweeney, who both said that the shader:tex ratio shouldn't be 1:1 (it should definitely be higher).
    oh man

  9. #84
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by ahmad
I disagree. Nobody knew or expected the 7800GTX 512MB to be a limited edition, for one thing (nvidia never said quantities would be limited, just that it would be hard-launched, which it was).

Now look at it from another point of view: if the card was that much better than the x1900XTX, then Nvidia could essentially price it as high as they want and people would still buy it because it is the best. People who put down cash for these things won't mind spending extra to get the best (I know I wouldn't and never do). Nvidia is obviously not dumb, and if they see a chance to make money, why let it go to waste?

This is pretty sound reasoning IMHO. But I guess we will all find out for sure on the 9th.
The Inq and AACDirect both told us the 7800GTX 512MB was a limited-edition card.

I think NVidia is going to play it smart: not only attempt to dominate the performance segment, but dominate on price as well. You see, we all know the 7800GTX 512MB was merely a PR move to beat the x1800XT, no point in hiding it. The 7900GTX is to be an actual CARD, marketed to be bought in stores, not just a "haha ati, we win" thing.

After the amount of flak NVidia caught over the phantom 512MB GTX, they'd be committing suicide not to release this one at a good price. On top of that, do you really think NVidia would release two months after ATi and come up with a worse card than the x1900xtx?

That's just being naive.

As for the tex:shader ratio, yes, we're seeing more shaders used than textures. Thing is, 16 texture units may still be too low. Those 16 may bottleneck it by the time games reach the point where it could see a major-league advantage from the 48 pixel shaders, stopping it from realizing its full advantage anyway. Now, I can't say this for a fact; none of us can unless we personally work for companies like Epic, id, Blizzard, etc. I can say this, though: 32:32 will surely put up a fight against 16:48, and no one anywhere can deny that.

We'll see who's better come March. It's only a few short weeks away, so how about we quit speculating and just wait for it.
    Last edited by DilTech; 02-16-2006 at 10:59 PM.

  10. #85
    Xtreme Mentor
    Join Date
    Jul 2004
    Posts
    3,247
    7900GTX @ 650Mhz/1.6GHz 512MB GDDR3 1.1ns RAMs

    7900GT @ 450Mhz/1.32GHz 256MB GDDR3 1.4ns RAMs

    http://www.hardspell.com/news/showco...?news_id=22084

  11. #86
    Xtreme Enthusiast
    Join Date
    Feb 2005
    Location
    NY
    Posts
    665
    Quote Originally Posted by DilTech

After the amount of flak NVidia caught over the phantom 512MB GTX, they'd be committing suicide not to release this one at a good price. On top of that, do you really think NVidia would release two months after ATi and come up with a worse card than the x1900xtx?
    Remember the 5800? =P



As for the tex:shader ratio, yes, we're seeing more shaders used than textures. Thing is, 16 texture units may still be too low. Those 16 may bottleneck it by the time games reach the point where it could see a major-league advantage from the 48 pixel shaders, stopping it from realizing its full advantage anyway. Now, I can't say this for a fact; none of us can unless we personally work for companies like Epic, id, Blizzard, etc. I can say this, though: 32:32 will surely put up a fight against 16:48, and no one anywhere can deny that.
Thing is, in FEAR, an extremely shader-dependent game, the x1900xtx is up to 60% faster than a 7800GTX 512; the most the 7900GTX can theoretically gain over the GTX 512 is about 33%.

  12. #87
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,058
    Quote Originally Posted by onethreehill
    7900GTX @ 650Mhz/1.6GHz 512MB GDDR3 1.1ns RAMs

    7900GT @ 450Mhz/1.32GHz 256MB GDDR3 1.4ns RAMs

    http://www.hardspell.com/news/showco...?news_id=22084
Here's hoping we'll see an ATI BIOS for 700/800 on the XTX out before then. Frankly, they could release a 750/850 BIOS for the XTX given how it's binned.

    Perkam

  13. #88
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Quote Originally Posted by sabrewolf732
    Thing is, in fear, an extremely shader dependent game the x1900xtx is up to 60% faster than a 7800gtx 512, the most the 7900gtx can be faster is 33% or so.
Everybody uses FEAR as an end-all example of shader performance. I don't know how you can look at FEAR benchmark results and not say it's optimized for ATI. There's no other shader-intensive benchmark or game out there that shows ATI having nearly that much of an advantage.

And yes, I'm aware that at the very last minute NVIDIA won the bid to get TWIMTBP stamped on the boxes. I'm also aware that up to that point, it was a heavily ATI-sponsored game, as ATI was doing as much as it could to get in bed with Vivendi and all of its games (they managed to keep Tribes: Vengeance, but unfortunately for them it turned out mediocre in popularity). Since they were around first, during the development of the engine, this relationship shows.

    So people should really reference another game instead. FEAR is totally biased and unrealistic.
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  14. #89
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,058
Everybody uses FEAR as an end-all example of shader performance. I don't know how you can look at FEAR benchmark results and not say it's optimized for ATI. There's no other shader-intensive benchmark or game out there that shows ATI having nearly that much of an advantage.
[RANT] Don't worry about it... it's just residual comments still being made in response to Nvidia users crowing like roosters off the Empire State Building: "LOOK, NVIDIA PWNS ATI IN DOOM3 !!!! WOOT !!!" So really no one can be blamed for it.

ATI's shader advantage is not only in FEAR... as more shader-intensive games come out, we'll see an even greater disparity between 7800 and X1900 performance, ESPECIALLY by the time the 6.9 and 6.10 drivers come out later this year. "The REAL competitor to the X1900 will be the 7900," you say... I'm afraid that doesn't sit well with people comparing X1800 and 7800GTX512 performance just because they were out at the same time. [/RANT]

    What we really need to see is the price range for the 7900GT and performance vs x1800 parts.

    Perkam
    Last edited by perkam; 02-17-2006 at 08:46 AM.

  15. #90
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Quote Originally Posted by perkam
[RANT] ATI's shader advantage is not only in FEAR... as more shader-intensive games come out, we'll see an even greater disparity between 7800 and X1900 performance, ESPECIALLY by the time the 6.9 and 6.10 drivers come out later this year. "The REAL competitor to the X1900 will be the 7900," you say... I'm afraid that doesn't sit well with people comparing X1800 and 7800GTX512 performance just because they were out at the same time. [/RANT]
    Let me put it this way: I EXPECT the X1900XT(X) to pwn the daylights out of the 7800GTX 512 in shader-intensive stuff. I mean come on, with 48 shader pipes, that'd be sad if it didn't. But the performance advantage in FEAR extends all the way down to the X800 family. At some points, an X850XT PE performs the same as a 7800GT! I don't care who you are, that's not right.

    And yes, I do say the real competitor to the X1900 series is the 7900 series. Not only is the time of release a factor, as you said, but that the 7800GTX 512 was the only card that had the same amount of RAM coming from NVIDIA, or the closest in price at release (don't forget, at that time the GTX 256MB was over $100 cheaper than the X1800XT). Could NVIDIA help it that the 7800GTX 512 was powerful enough to nearly compare to a future card whose performance they had no way of anticipating?

  16. #91
    Xtreme Addict
    Join Date
    Apr 2005
    Location
    Wales, UK
    Posts
    1,195
    Quote Originally Posted by Cybercat
    Let me put it this way: I EXPECT the X1900XT(X) to pwn the daylights out of the 7800GTX 512 in shader-intensive stuff. I mean come on, with 48 shader pipes, that'd be sad if it didn't. But the performance advantage in FEAR extends all the way down to the X800 family. At some points, an X850XT PE performs the same as a 7800GT! I don't care who you are, that's not right.

    And yes, I do say the real competitor to the X1900 series is the 7900 series. Not only is the time of release a factor, as you said, but that the 7800GTX 512 was the only card that had the same amount of RAM coming from NVIDIA, or the closest in price at release (don't forget, at that time the GTX 256MB was over $100 cheaper than the X1800XT). Could NVIDIA help it that the 7800GTX 512 was powerful enough to nearly compare to a future card whose performance they had no way of anticipating?
So it's unfair to lump the GTX 256 and x1800 together because the x1800 was $100 more expensive, and yet it's OK to compare it to the GTX 512, which was far more expensive and in limited availability? And it's not fair to compare the x1900 to a card that costs more, is nvidia's best card, and will remain so for the first three months of its product cycle?

You simply go by what's available at the time, and at what prices. So if someone chose any of these options at the time of their release, I could see how they would justify the purchase; however, the GTX 512 shouldn't be compared to any ati card, as they are not in the same price bracket. The x1900 will be comparable to nvidia's 7900 when it comes out, as it will be ati's best at that time, so it comes back down to price/performance. You have to look at it both ways and not just say it's unfair to compare cards because the next one will be better. The launches are no longer synchronous, so that way of thinking, comparing one card to another, has to go.

Right now, the x1900 is both more powerful and cheaper than a GTX 512, and there's still a month before the 7900 is released. The x1900 will be months old at that time, so by your own standard (in comparing the GTX 256 and x1800, or GTX 512 and x1900) you have already declared that you shouldn't do this.

  17. #92
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by sabrewolf732
    Remember the 5800? =P





Thing is, in FEAR, an extremely shader-dependent game, the x1900xtx is up to 60% faster than a 7800GTX 512; the most the 7900GTX can theoretically gain over the GTX 512 is about 33%.
Yes, I remember the 5800. Poor DX9 performance and a 128-bit memory bus, delayed to hell and back due to issues with a die shrink... Now, do you remember every other card NVidia has ever released outside of the FX series? My point exactly.

Look at it this way: the 7900GTX only needs about 430MHz to catch the 7800GTX 512MB, and at 700-750MHz it theoretically should downright destroy all scores we've seen from the 7800GTX 512MB. 550MHz on the 7900GTX should beat any score ever reached on any cooling by the 7800GTX 512MB. Regardless of what happens, we'll definitely have a new 3DMark champion. I don't know where you get that it could only be 33% faster; 8 more pipelines and 150MHz faster... do the math!

This card may be closer to twice the speed of the 7800GTX 512MB, at least theoretically. We all know increases are never linear, so everything is just guesses until we actually see reviews.
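As a rough sketch of where the "430MHz" figure comes from, here is the same back-of-the-envelope linear-scaling assumption (pipes × clock) worked out in Python. The pipe counts and clocks are the rumored specs discussed in this thread, not confirmed ones, and real performance won't scale this cleanly:

```python
# Hypothetical linear-scaling model: peak throughput = pipes * core clock.
pipes_7800, clock_7800 = 24, 550   # 7800GTX 512MB
pipes_7900 = 32                    # rumored 7900GTX pipe count

# Clock at which 32 pipes match 24 pipes at 550 MHz under this model
break_even_mhz = pipes_7800 * clock_7800 / pipes_7900
print(break_even_mhz)  # 412.5 -- the same ballpark as the ~430 MHz quoted above
```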

    Quote Originally Posted by perkam
    [RANT] Don't worry about it...its just residual comments that are still being said in response to Nvidia users pointing out like roosters off of the empire state building "LOOK, NVIDIA PWNS ATI IN DOOM3 !!!! WOOT !!!" So really no one can be blamed for it.

    ATI's shader advantage is not only in fear...as more shader intensive games come out, we'll see an even greater disparity between 7800 and X1900 performance, ESPECIALLY by the time we have 6.9 and 6.10 drivers coming out later this year. The REAL competitor to the X1900 will be the 7900 you say...I'm afraid that doesn't go all that well with people comparing X1800 and 7800GTX512 performance just cos they were out at the same time [/RANT]

    What we really need to see is the price range for the 7900GT and performance vs x1800 parts.

    Perkam
Perkam and Sabre, may I remind you that Black & White 2 is also an extremely shader-heavy game with a newer engine, and the x1900xtx loses to the 7800GTX in it. This is one of those examples of games where 16 TMUs might just not be enough. At maximum quality settings, the 7800GTX 512MB beat the x1900xtx by a whopping 31%; we could be looking at 50-60% for the 7900GTX here.

    http://www.anandtech.com/video/showdoc.aspx?i=2679&p=9

Even the x1900xt in CF can't touch the 7800GTX 256MB. Why? Because of a lack of TMUs to back up those pixel shaders.

This is why I tell you: we'll see who has the better design in less than a month. It could be ATi, or it could be NVidia; we'll just have to wait and see.

For now ATi has the fastest card available, but next month we could see all that change. It's less than three weeks away, so if you haven't already bought a new video card and plan to this round, I strongly urge you to wait it out.
    Last edited by DilTech; 02-17-2006 at 09:48 AM.

  18. #93
    Muslim Overclocker
    Join Date
    May 2005
    Location
    Canada
    Posts
    2,786
    Quote Originally Posted by DilTech
    I think NVidia is going to play it smart, not only an attempt to dominate the performance segment, but also dominate the price segment as well. You see, we all know the 7800GTX 512mb was merely a PR move to beat the x1800XT, no point in hiding it. The 7900GTX is to be an actual CARD, marketed to be bought in store, not just as a "haha ati, we win" thing.

    After the amount of flak NVidia caught over the phantom 512mb GTX, they'd be committing suicide not to release this one at a good price. Ontop of that, do you realllllly think NVidia would release 2 months after ATi and come up with a worse card than the x1900xtx?

    That's just being niave.
I think it would be naive to assume otherwise.

Your argument is weak. It's possible, but it's weak. Why price it lower than the x1900XT? Why not similar? I think your post should say "what I hope for..."




  19. #94
    Registered User
    Join Date
    May 2005
    Posts
    3,691
Why is simple: by March we'll see x1900xtx's for around $550 everywhere, so pricing at $649 would be stupid. If they price it at $549 or below out of the gate, it should be pretty close to what we see the x1900xtx going for at that time.
    Last edited by DilTech; 02-17-2006 at 10:12 AM.

  20. #95
    Xtreme Enthusiast
    Join Date
    Feb 2005
    Location
    NY
    Posts
    665
You do the math. It has 33% more pipes and 33% higher clocks; 33% is the fastest it can theoretically be vs a GTX 512.
    http://images.anandtech.com/graphs/a...0152/10664.png

And AFAIK, no game has a TMU:shader ratio like FEAR does. Also, the x850xt PE is close to a 7800GT in pretty much all games, so the fact that it beats it in FEAR is not surprising. In the image above, the x1900xtx has a 100% performance advantage over the 7800GTX 512. If that's not impressive, I don't know what is. And the x1800 series and 7800 series were rather close in FEAR, so you couldn't really call it biased until the x1900 came out, of course.
    Last edited by sabrewolf732; 02-17-2006 at 10:38 AM.

  21. #96
    Xtreme Addict
    Join Date
    Apr 2005
    Location
    Wales, UK
    Posts
    1,195
Firstly, performance doesn't scale linearly with the number of pipes and clocks, so it's useless to say that because it has 33% more pipes/clocks it's going to be 33% faster.

Secondly, by your logic it would be 76% better, because a 33% increase in clocks would make it 33% better and a 33% increase in pipes would make it 33% better; it would be:

1.33 (33% more clocks than original) x 1.33 (33% more pipes than original) x 100 (to express the answer as a percentage) = 176.89% of the original = 76% better.

Never use the term "you do the maths" and then give the wrong sum; it looks silly.

Of course this is not the case at all: parts of the chip will go unused, and there may be a deficit in vital areas as well as the oversupply. Ati's implementation is a little weak on TMUs and strong on shaders, while nvidia's 7900 will be devastating on TMUs but still has fewer shaders, and so may perform worse in certain applications.

We won't know performance until it's out.
    Last edited by onewingedangel; 02-17-2006 at 10:53 AM.

  22. #97
    Xtreme Enthusiast
    Join Date
    Feb 2005
    Location
    NY
    Posts
    665
No, that's not how performance works. To gain 33% in performance you must increase everything by 33%. Increasing clocks by 33% and pipes by 33% doesn't mean 76% more performance. As I said, you do the math. And obviously this is theoretical.

  23. #98
    Xtreme Addict
    Join Date
    Apr 2005
    Location
    Wales, UK
    Posts
    1,195
If card y has twice as many pipes doing the same amount of work (same clock speed) as card x, then twice as much work gets done.

If card y has the same number of pipes but each pipe runs at twice the speed of card x, then twice as much work gets done.

If card y has twice as many pipes and is twice as fast, then card y theoretically gets four times as much work done as card x.

Of course in practice this is not the case; other things come into play (memory access, imbalance in resources, i.e. some areas going idle while others bottleneck the system, etc.).

I was just showing you that you cannot make performance predictions based on clock speed or pipeline count increases alone, and I hope I've shown you how your MATH is flawed.
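The multiplicative point being argued here can be sketched in a couple of lines of Python. This is an idealized model only (peak work scales with pipes × clock, and the 33% figures are the thread's hypothetical numbers); real cards never scale this cleanly:

```python
# Idealized throughput model: peak work per second = pipes * clock.
pipe_gain = 32 / 24    # ~33% more pipes
clock_gain = 1.33      # ~33% higher clock (hypothetical figure from the thread)

# Gains multiply, they don't cap at the larger of the two.
combined = pipe_gain * clock_gain
print(round(combined, 2))  # 1.77 -> ~77% more peak throughput, not 33%
```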

  24. #99
    Xtreme Member
    Join Date
    Dec 2005
    Location
    Toronto, Canada
    Posts
    127
    Quote Originally Posted by onewingedangel
Firstly, performance doesn't scale linearly with the number of pipes and clocks, so it's useless to say that because it has 33% more pipes/clocks it's going to be 33% faster.

Secondly, by your logic it would be 76% better, because a 33% increase in clocks would make it 33% better and a 33% increase in pipes would make it 33% better; it would be:

1.33 (33% more clocks than original) x 1.33 (33% more pipes than original) x 100 (to express the answer as a percentage) = 176.89% of the original = 76% better.

Never use the term "you do the maths" and then give the wrong sum; it looks silly.

Of course this is not the case at all: parts of the chip will go unused, and there may be a deficit in vital areas as well as the oversupply. Ati's implementation is a little weak on TMUs and strong on shaders, while nvidia's 7900 will be devastating on TMUs but still has fewer shaders, and so may perform worse in certain applications.

We won't know performance until it's out.
Agreed. Let's use hardspell's clockspeeds, for instance...

    7900 GTX 512 650MHZ/1600MHZ

    32 Pixel Shaders/32TMU/24ROP
    20800 MP Fillrate/15600 MP Output

    7800 GTX 512 550MHZ/1700MHZ

    24 Pixel Shaders/24TMU/16ROP
    13200 MP Fillrate/8800 MP Output

Even at these conservative clockspeeds you're looking at a ~57% increase in shader power and texturing power over the 7800 GTX 512.

Typically, shader power and texturing power in a 1:1 ratio, backed by 0.66x or more of the ROP power, will yield almost linear performance increases, so a 50% performance improvement basically everywhere would still be quite impressive.

It would be nice if it were clocked at 700MHz or higher, though, as that would pretty much ensure the F.E.A.R. crown as well. Everyone loves to say it's 60% faster in F.E.A.R.; tell me, at what settings???
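For what it's worth, the fillrate figures in this post follow directly from units × core clock. A quick check (using the rumored, unconfirmed hardspell clocks quoted above):

```python
# Texture fillrate = TMUs * core clock; pixel output = ROPs * core clock.
fill_7900 = 32 * 650   # 20800 MT/s   (7900 GTX, rumored)
rop_7900  = 24 * 650   # 15600 MP/s
fill_7800 = 24 * 550   # 13200 MT/s   (7800 GTX 512)
rop_7800  = 16 * 550   # 8800 MP/s

print(round(fill_7900 / fill_7800 - 1, 3))  # 0.576 -> ~57% more texturing power
print(round(rop_7900 / rop_7800 - 1, 3))    # 0.773 -> ~77% more ROP output
```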

  25. #100
    YouTube Addict
    Join Date
    Aug 2005
    Location
    Klaatu barada nikto
    Posts
    17,574
Honestly, the only thing we should expect is that it will beat the X1900XTX... anything else would be appreciated...
But should they seriously mess up and release something that performs worse *cough*5800*cough*, they'll be in one heck of a pickle...

