Page 7 of 10 FirstFirst ... 45678910 LastLast
Results 151 to 175 of 227

Thread: Nvidia 270, 290 and GX2 roll out in November

  1. #151
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    Quote Originally Posted by Carfax View Post
    If you overclock the GTX 260 core alone, then you have a point..

    As you know, the shader processors and the core itself don't run at the same speed, so overclocking the core alone while leaving the shaders won't result in the best performance gains.

    Overclocking both the shaders and the core though, will see the same kind of progression you see on the R700.

    On my own card, I overclocked the core, the shaders and the memory.
    I thought RivaTuner linked the core/shader clocks by default.
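    The linked-clock point can be illustrated with a little arithmetic (a hypothetical sketch: the reference clocks are the stock GTX 260 figures, and `linked_shader_clock` is invented for the example — real drivers expose more states than this):

```python
# G200-era cards run the shader domain at a fixed multiple of the core
# clock, so raising the core alone leaves shader throughput on the table.

STOCK_CORE_MHZ = 576       # GTX 260 reference core clock
STOCK_SHADER_MHZ = 1242    # GTX 260 reference shader clock
LINK_RATIO = STOCK_SHADER_MHZ / STOCK_CORE_MHZ  # ~2.156

def linked_shader_clock(core_mhz: float) -> float:
    """Shader clock if the driver holds the stock core:shader ratio."""
    return core_mhz * LINK_RATIO

# Overclocking the core to 650 MHz with the domains linked:
print(round(linked_shader_clock(650)))  # -> 1402
```

    With the domains linked, a core overclock drags the shaders up by the same ratio; unlinking them is what lets you tune the two independently.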

    All along the watchtower the watchmen watch the eternal return.

  2. #152
    Xtreme Addict
    Join Date
    Aug 2008
    Posts
    2,036
    Quote Originally Posted by Clairvoyant129 View Post
    If the multi-GPU has the Nvidia logo on it, these fanboys will eat it up no problem.

    Wanna put money on that? I will NOT EVER (not today, not a week from now, not a month from now, not a year from now, not ten years from now, or 20, or at any point in any year of the future) fall into that fad. It is stupidly expensive, stupidly expensive to watercool, horribly inefficient, eats up a ridiculous amount of wattage, and I won't be buying into the bugs they come pre-equipped with. On top of that, they usually downclock the GPUs, which is the opposite of what I do with my systems. I OC... I don't do UCs.

    One powerful GPU per card is all I need. It's all I will ever need, because if there is ever a need for more than that, then it's time to buy a new video card. There will never be a use for dual GPUs on one card. That's what SLI is for. One heat producer per card is enough.

  3. #153
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Birmingham AL.
    Posts
    1,079
    Quote Originally Posted by T_Flight View Post
    Wanna put money on that? I will NOT EVER (not today, not a week from now, not a month from now, not a year from now, not ten years from now, or 20, or at any point in any year of the future) fall into that fad. It is stupidly expensive, stupidly expensive to watercool, horribly inefficient, eats up a ridiculous amount of wattage, and I won't be buying into the bugs they come pre-equipped with. On top of that, they usually downclock the GPUs, which is the opposite of what I do with my systems. I OC... I don't do UCs.

    One powerful GPU per card is all I need. It's all I will ever need, because if there is ever a need for more than that, then it's time to buy a new video card. There will never be a use for dual GPUs on one card. That's what SLI is for. One heat producer per card is enough.
    It is likely that a time will come when this is the only option for the top-end cards (once the approach is more refined) from both NV and AMD. So far NV is the only one underclocking its multi-GPU cards, but if they ever design a chip for that purpose I'm sure there would be no need for UCing.
    Particle's First Rule of Online Technical Discussion:
    As a thread about any computer related subject has its length approach infinity, the likelihood and inevitability of a poorly constructed AMD vs. Intel fight also exponentially increases.

    Rule 1A:
    Likewise, the frequency of a car pseudoanalogy to explain a technical concept increases with thread length. This will make many people chuckle, as computer people are rarely knowledgeable about vehicular mechanics.

    Rule 2:
    When confronted with a post that is contrary to what a poster likes, believes, or most often wants to be correct, the poster will pick out only minor details that are largely irrelevant in an attempt to shut out the conflicting idea. The core of the post will be left alone since it isn't easy to contradict what the person is actually saying.

    Rule 2A:
    When a poster cannot properly refute a post they do not like (as described above), the poster will most likely invent fictitious counter-points and/or begin to attack the other's credibility in feeble ways that are dramatic but irrelevant. Do not underestimate this tactic, as in the online world this will sway many observers. Do not forget: Correctness is decided only by what is said last, the most loudly, or with greatest repetition.

    Remember: When debating online, everyone else is ALWAYS wrong if they do not agree with you!

  4. #154
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    Quote Originally Posted by T_Flight View Post
    Wanna put money on that? I will NOT EVER (not today, not a week from now, not a month from now, not a year from now, not ten years from now, or 20, or at any point in any year of the future) fall into that fad. It is stupidly expensive, stupidly expensive to watercool, horribly inefficient, eats up a ridiculous amount of wattage, and I won't be buying into the bugs they come pre-equipped with. On top of that, they usually downclock the GPUs, which is the opposite of what I do with my systems. I OC... I don't do UCs.

    One powerful GPU per card is all I need. It's all I will ever need, because if there is ever a need for more than that, then it's time to buy a new video card. There will never be a use for dual GPUs on one card. That's what SLI is for. One heat producer per card is enough.
    Multi-GPU is going to become the norm for the high end soon. And it is going to follow a similar progression to CPUs:

    separate sockets (i.e., separate chips and/or boards) and poor software compatibility (ever try to install Win98 on a 4-socket PPro server? Lol) ---> better multi-socket solutions and good software support (WinNT, 2000, XP, Linux, etc.) ---> multiple dies on the same chip package ---> native multicore

    The thing holding multi-GPU back is not having the ability to share the same memory pool. But you can see that issue is being tackled by Intel with Larrabee, and you can bet that the other two are chasing the same goal.

  5. #155
    Xtreme Addict
    Join Date
    May 2008
    Posts
    1,192
    Quote Originally Posted by T_Flight View Post
    Wanna put money on that? I will NOT EVER (not today, not a week from now, not a month from now, not a year from now, not ten years from now, or 20, or at any point in any year of the future) fall into that fad. It is stupidly expensive, stupidly expensive to watercool, horribly inefficient, eats up a ridiculous amount of wattage, and I won't be buying into the bugs they come pre-equipped with. On top of that, they usually downclock the GPUs, which is the opposite of what I do with my systems. I OC... I don't do UCs.

    One powerful GPU per card is all I need. It's all I will ever need, because if there is ever a need for more than that, then it's time to buy a new video card. There will never be a use for dual GPUs on one card. That's what SLI is for. One heat producer per card is enough.
    Yield and profit.
    Quote Originally Posted by alacheesu View Post
    If you were consistently able to put two pieces of lego together when you were a kid, you should have no trouble replacing the pump top.

  6. #156
    Xtreme Addict
    Join Date
    Aug 2008
    Posts
    2,036
    Quote Originally Posted by Solus Corvus View Post
    Multi-GPU is going to become the norm for the high end soon. And it is going to follow a similar progression to CPUs:

    separate sockets (i.e., separate chips and/or boards) and poor software compatibility (ever try to install Win98 on a 4-socket PPro server? Lol) ---> better multi-socket solutions and good software support (WinNT, 2000, XP, Linux, etc.) ---> multiple dies on the same chip package ---> native multicore

    The thing holding multi-GPU back is not having the ability to share the same memory pool. But you can see that issue is being tackled by Intel with Larrabee, and you can bet that the other two are chasing the same goal.
    Multiple dies on the same package is an entirely different matter, one I support, and it is the reason I will never support this "double cheeseburger" arrangement. I wouldn't mind if it had 80-100 dies, but it had better be under one IHS so you can put one cooler or block on top of it. I wouldn't care if the package was 4 inches square.

    With dual GPUs side by side all you get is double the heat and double the electric bill, but nowhere near double the performance. On top of that you get the microstutters. You double the price of your cooling, and it puts more stress on the boards. It's a bad idea that should've been ditched after the first one of those blunders was unloaded on the market.

  7. #157
    Xtreme Addict
    Join Date
    Mar 2007
    Posts
    1,377
    Cool. Hope Tek9 fits.

  8. #158
    Registered User
    Join Date
    Mar 2005
    Location
    Mumbai, India
    Posts
    1,090
    I met with a few guys here last week, working in industry as well as media... all of them were saying the new GX2 is about to hit the scene soon. NVIDIA has a software development center in the city of Pune [not far from Mumbai]. The performance of the GX2 was not talked about much. So I am not surprised by the news.

  9. #159
    Xtreme Enthusiast
    Join Date
    Aug 2008
    Posts
    577
    Quote Originally Posted by Chruschef View Post
    [...] (2) We are of course, making this all up [...]

    this entire article is rendered useless because of that single statement. They just spewed a full page of BS slamming nvidia that was a fat pack of lies.

    There is no proof, no unbiased opinion, and therefore no validity in this article.


    and I'm no fanboi, I love nVidia. But I love the fact ATi slammed them in the butt this time around; competition drives performance. If nVidia had too much control of the market, their product quality would spin out of control, and this is the result of that.
    from the article
    (1) It was code named "Smoking Sepuku", but Nvidia didn't want to publicly suggest that this strategy was vaguely related to anything honorable, so it was renamed. We saw the memo, it was poignant. (2)

    (2) We are, of course, making this all up.
    Point 2 is referring to point 1, not the whole article. As in "Smoking Sepuku", not everything Charlie said (not that he is always 100% correct anyway). I bolded, underlined, and made the 2 huge so you can see what it was referring to.
    --Intel i5 3570k 4.4ghz (stock volts) - Corsair H100 - 6970 UL XFX 2GB - - Asrock Z77 Professional - 16GB Gskill 1866mhz - 2x90GB Agility 3 - WD640GB - 2xWD320GB - 2TB Samsung Spinpoint F4 - Audigy-- --NZXT Phantom - Samsung SATA DVD--(old systems Intel E8400 Wolfdale/Asus P45, AMD965BEC3 790X, Antec 180, Sapphire 4870 X2 (dead twice))

  10. #160
    Xtreme Member
    Join Date
    Apr 2006
    Location
    los angeles
    Posts
    387
    Whoopty-fricking-ding-dong.
    loved that, haha
    If the Inq is right, and this is a big if, then Nvidia will slowly die out or will have to conform to AMD's strategy.
    Seti@Home Optimized Apps
    Heat
    Quote Originally Posted by aNoN_ View Post
    pretty low score, why not higher? kingpin gets 40k in 3dmark05 and 33k in 06 and 32k in vantage performance...

  11. #161
    Banned
    Join Date
    Dec 2005
    Location
    Everywhere
    Posts
    1,715
    I said it a month ago; the launch was only postponed from 22.10. to November ...

  12. #162
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    112
    Delayed again? What's wrong with NVIDIA? It doesn't have GPUs competitive (price/performance) with AMD's, and the launch date of its refresh parts is delayed again and again. It seems that AMD is going to be king of the hill for another couple of months (it has the fastest card and the single GPUs with the best performance/price ratio).

    @OBR you said NVIDIA is going to release MANY more parts than just GT200B, but according to TheInquirer this is ONLY GT200B: a GTX290 with full GT200B specs and a GTX270 with some parts disabled, like the GTX260 relative to the GTX280 (I'm talking only about hardware). Moreover, NVIDIA doesn't need a much faster GPU right now but a cheaper one, so it doesn't have to release an HD4870X2 killer; it needs to release HD4870/4850 killers on price/performance.
    Last edited by Barys; 10-09-2008 at 11:51 PM.

  13. #163
    Xtreme Enthusiast
    Join Date
    Jul 2007
    Location
    Phoenix, AZ
    Posts
    866
    I freaking hope multi-GPU single-card solutions don't become the norm; some of us act like SLI/CF don't exist, including me. Single card always, until they fix the problems with SLI/CF. If they want to lose my business to Intel (granted Intel keeps single-GPU cards on the market), then that's what they would do if they made multi-GPU the market norm.
    This post above was delayed 90 times by Nvidia. 'Cause that's their thing; that's what they do.
    This announcement of the delayed post above has been brought to you by Nvidia Inc.

    RIGGY
    case:Antec 1200
    MB: XFX Nforce 750I SLI 72D9
    CPU:E8400 (1651/4x9) 3712.48
    MEM:4gb Gskill DDR21000 (5-5-5-15)
    GPU: NVIDIA GTX260 EVGA SSC (X2 in SLI) both 652/1403
    PS:Corsair 650TX
    OS: Windows 7 64-bit Ultimate
    --Cooling--
    5x120mm 1x200mm
    Zalman 9700LED
    Displays: Samsung LN32B650/Samsung 2243BWX/samsung P2350


  14. #164
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    Quote Originally Posted by T_Flight View Post
    Multiple dies on the same package is an entirely different matter, one I support, and it is the reason I will never support this "double cheeseburger" arrangement. I wouldn't mind if it had 80-100 dies, but it had better be under one IHS so you can put one cooler or block on top of it. I wouldn't care if the package was 4 inches square.
    I think that multiple separate GPUs (whether they are on separate cards or the same one) will be around for a while, even after you get multiple dies under the same IHS. It's for the same reason that multi-CPU systems still exist even though we have quad cores (well, not the only reason). There is always someone who wants or needs more, and an easy way to get that is to put together multiples of whatever is the best currently available.

    With dual GPUs side by side all you get is double the heat and double the electric bill, but nowhere near double the performance. On top of that you get the microstutters. You double the price of your cooling, and it puts more stress on the boards. It's a bad idea that should've been ditched after the first one of those blunders was unloaded on the market.
    This is the real crux from my perspective. Software support needs to be better. Games and applications access the GPU through APIs. The GPU manufacturers need to make their drivers (or hardware) spread the load across multiple GPUs transparently to the API. Having to make a special profile for each game isn't going to work in the long run; it just puts more strain on the driver development team. Instead the GPUs (or drivers) need logic that will allow them to share memory and share the workload on a single frame, thus simultaneously ridding us of AFR and the doubled-up memory.
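    The "share the workload on a single frame" idea can be sketched as split-frame load balancing: each GPU renders a horizontal slice of the same frame, sized by its measured throughput, instead of AFR's alternate-frame handoff. This is a hypothetical illustration, not any vendor's actual driver logic; `split_frame` and its proportional-slice scheme are invented for the example:

```python
def split_frame(height: int, throughputs: list[float]) -> list[tuple[int, int]]:
    """Return (start_row, end_row) slices, one per GPU, proportional to throughput."""
    total = sum(throughputs)
    slices, start = [], 0
    for i, t in enumerate(throughputs):
        # The last GPU takes the remainder so no rows are lost to rounding.
        end = height if i == len(throughputs) - 1 else start + round(height * t / total)
        slices.append((start, end))
        start = end
    return slices

# Two equal GPUs split a 1200-row frame down the middle:
print(split_frame(1200, [1.0, 1.0]))   # [(0, 600), (600, 1200)]
# A GPU twice as fast as its partner takes two-thirds of the rows:
print(split_frame(1080, [2.0, 1.0]))   # [(0, 720), (720, 1080)]
```

    Because both GPUs work on the same frame, each needs only the geometry for its slice in flight at once, which is the intuition behind losing both AFR's latency pattern and the fully duplicated memory pool.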

  15. #165
    Xtreme Enthusiast
    Join Date
    Aug 2008
    Posts
    577
    Quote Originally Posted by Solus Corvus View Post
    I think that multiple separate GPUs (whether they are on separate cards or the same one) will be around for a while, even after you get multiple dies under the same IHS. It's for the same reason that multi-CPU systems still exist even though we have quad cores (well, not the only reason). There is always someone who wants or needs more, and an easy way to get that is to put together multiples of whatever is the best currently available.


    This is the real crux from my perspective. Software support needs to be better. Games and applications access the GPU through APIs. The GPU manufacturers need to make their drivers (or hardware) spread the load across multiple GPUs transparently to the API. Having to make a special profile for each game isn't going to work in the long run; it just puts more strain on the driver development team. Instead the GPUs (or drivers) need logic that will allow them to share memory and share the workload on a single frame, thus simultaneously ridding us of AFR and the doubled-up memory.
    I believe the quote you have in there is wrong. Not wrong maybe a few years ago, but wrong right now. Dual-GPU solutions aren't perfect (like, say, the 4870 X2), but they are still pretty damn good if you know what you are getting.

    I know I have gotten double performance in various games, almost a perfect 100% over 1 GPU. And the whole microstutter thing is blown way out of proportion; most people think any stutter that occurs = microstutter, when in fact it is likely just a driver bug.
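    One rough way to separate real microstutter from ordinary stutter: look at consecutive frame-time deltas rather than average FPS, since microstutter is alternating long/short frame pacing that averages out to a healthy-looking frame rate. The metric below is a hypothetical illustration, not an established benchmark:

```python
def microstutter_index(frame_times_ms: list[float]) -> float:
    """Mean absolute difference between consecutive frame times,
    normalized by the mean frame time. 0 = perfectly even pacing."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_ft = sum(frame_times_ms) / len(frame_times_ms)
    return (sum(deltas) / len(deltas)) / mean_ft

smooth = [16.7] * 8            # steady ~60 FPS
stutter = [8.0, 25.0] * 4      # same average FPS, alternating pacing
print(round(microstutter_index(smooth), 2))   # 0.0
print(round(microstutter_index(stutter), 2))  # 1.03
```

    Both traces average ~60 FPS, but the second would feel far choppier, which is why FPS counters alone can't settle a microstutter argument.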

    On some other forums we have an X2 thread, and it is quite obvious the X2 is only as successful as the driver you use. Some people report stuttering in, say, Crysis. All you have to do is point them to a better driver and it goes away. We also recommend Vista 64, as the CF drivers are much better there compared to anything on XP (32 or 64).

    Either way, when you get a dual-GPU setup (whether CF or an X2) you have to be aware that brand new games are not going to be profiled and performance will be bad. There may be a day when this doesn't happen, but it is what happens right now. You just have to be aware of it, and if you don't like it there are plenty of single-GPU solutions you can use (such as the GTX 280). (Or do what most of us do: disable Catalyst AI and just use 1 GPU, and don't take the performance hit.)

    And regardless of what you or others think, multi-GPU is going to be the 'way of the future', at least at AMD. But you are right that, with that, software becomes the most important part. Drivers will sink or swim these cards, but they are here to stay. I also think R800 actually uses MCM and might not have to use AFR/pooled memory.

  16. #166
    I am Xtreme
    Join Date
    Feb 2005
    Location
    SiliCORN Valley
    Posts
    5,543
    I still find it quite hilarious, but bad for Nvidia that after, how many years?, ATI brings out 2 cards, ok now 3 with the 4870X2 that can beat Nvidia. The aforementioned 2 (4850/4870) are not ahead of nvidia by only a smidgen, they win only on price, so everyone starts calling nvidia dead, they are doomed, omg nvidia lost!!

    the fact is boys and girls, ATI has not produced a competitive card since the 9600/9700/9800 series. the X800/X850 series was nice too, but all along nvidia was raking in the profits and the hearts and minds of PC enthusiasts.

    Nvidia has lost the race 1 time in nearly 15 years, and people are dooming nvidia, and frankly all this bashing by the community and websites is truly hurting nvidia.

    Now granted nvidia has hurt themselves, no one will deny that, with their fixed pricing, yada yada yada, and their mobile chipset solder issue, but to label them losers in the video card race is just total bullsh|t.

    I like any card that has the power to do what I want it to do, and frankly I like the GTX260/280 and the 4850/4870, so in my mind everyone wins.
    but this continued smashing, raking over the coals, and literal market killing talk about nvidia has got to stop.

    ATI, 1 time in the last, oh 6 years produces a card that has FINALLY overtaken Nvidia (4870X2), and all the sudden ATI is king.

    This is what's hilarious, and you people flip-flopping, bashing, and raising ATI from the grave are hilarious.
    ATI is in FAR FAR worse shape than nvidia E V E R will be financially.
    "These are the rules. Everybody fights, nobody quits. If you don't do your job I'll kill you myself.
    Welcome to the Roughnecks"

    "Anytime you think I'm being too rough, anytime you think I'm being too tough, anytime you miss-your-mommy, QUIT!
    You sign your 1248, you get your gear, and you take a stroll down washout lane. Do you get me?"

    Heat Ebay Feedback

  17. #167
    Xtreme Guru
    Join Date
    Aug 2005
    Location
    Burbank, CA
    Posts
    3,766
    ^^^ i agree

  18. #168
    Xtreme Addict
    Join Date
    Nov 2007
    Posts
    1,195
    Quote Originally Posted by Lestat View Post
    I still find it quite hilarious, but bad for Nvidia that after, how many years?, ATI brings out 2 cards, ok now 3 with the 4870X2 that can beat Nvidia. The aforementioned 2 (4850/4870) are not ahead of nvidia by only a smidgen, they win only on price, so everyone starts calling nvidia dead, they are doomed, omg nvidia lost!!

    the fact is boys and girls, ATI has not produced a competitive card since the 9600/9700/9800 series. the X800/X850 series was nice too, but all along nvidia was raking in the profits and the hearts and minds of PC enthusiasts.

    Nvidia has lost the race 1 time in nearly 15 years, and people are dooming nvidia, and frankly all this bashing by the community and websites is truly hurting nvidia.

    Now granted nvidia has hurt themselves, no one will deny that, with their fixed pricing, yada yada yada, and their mobile chipset solder issue, but to label them losers in the video card race is just total bullsh|t.

    I like any card that has the power to do what I want it to do, and frankly I like the GTX260/280 and the 4850/4870, so in my mind everyone wins.
    but this continued smashing, raking over the coals, and literal market killing talk about nvidia has got to stop.

    ATI, 1 time in the last, oh 6 years produces a card that has FINALLY overtaken Nvidia (4870X2), and all the sudden ATI is king.

    This is what's hilarious, and you people flip-flopping, bashing, and raising ATI from the grave are hilarious.
    ATI is in FAR FAR worse shape than nvidia E V E R will be financially.
    thx lestat for this informative post

  19. #169
    Xtreme Addict
    Join Date
    Nov 2006
    Location
    Red Maple Leaf
    Posts
    1,556
    Quote Originally Posted by Lestat View Post
    ATI is in FAR FAR worse shape than nvidia E V E R will be financially.
    That's because they're stuck to AMD.
    E8400 @ 4.0 | ASUS P5Q-E P45 | 4GB Mushkin Redline DDR2-1000 | WD SE16 640GB | HD4870 ASUS Top | Antec 300 | Noctua & Thermalright Cool
    Windows 7 Professional x64


    Vista & Seven Tweaks, Tips, and Tutorials: http://www.vistax64.com/

    Game's running choppy? See: http://www.tweakguides.com/

  20. #170
    Xtreme Addict
    Join Date
    Aug 2004
    Location
    Austin, TX
    Posts
    1,346
    Quote Originally Posted by Lestat View Post
    I still find it quite hilarious, but bad for Nvidia that after, how many years?, ATI brings out 2 cards, ok now 3 with the 4870X2 that can beat Nvidia. The aforementioned 2 (4850/4870) are not ahead of nvidia by only a smidgen, they win only on price, so everyone starts calling nvidia dead, they are doomed, omg nvidia lost!!

    the fact is boys and girls, ATI has not produced a competitive card since the 9600/9700/9800 series. the X800/X850 series was nice too, but all along nvidia was raking in the profits and the hearts and minds of PC enthusiasts.

    Nvidia has lost the race 1 time in nearly 15 years, and people are dooming nvidia, and frankly all this bashing by the community and websites is truly hurting nvidia.

    ATI, 1 time in the last, oh 6 years produces a card that has FINALLY overtaken Nvidia (4870X2), and all the sudden ATI is king.
    What are you talking about? ATI has been competitive with NV (frequently beating NV) with the sole exception of the R600 generation.

    For example, R580 completely destroyed its competition, especially now (it's sometimes even 100% faster than G70). R520 was slightly delayed, though ATI has since improved its execution.

    This is a return of ATI, new and improved in the following areas:
    1. Execution: no more (or fewer) delays!
    2. Cost effectiveness: a smaller die and a cheaper PCB mean cheaper manufacturing, which means we can buy a 4850 for a mere $150.
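    The die-size point can be sketched with back-of-envelope arithmetic. This is a hypothetical illustration that ignores edge loss and yield (both of which favor the smaller die even further); the ~256 mm² RV770 and ~576 mm² GT200 die sizes are the commonly reported figures:

```python
import math

WAFER_DIAMETER_MM = 300  # standard 300 mm wafer

def dies_per_wafer(die_area_mm2: float) -> int:
    """Crude candidate-die count: wafer area / die area, ignoring edge loss."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area / die_area_mm2)

print(dies_per_wafer(256))  # RV770 (~256 mm^2) -> 276 candidates
print(dies_per_wafer(576))  # GT200 (~576 mm^2) -> 122 candidates
```

    Roughly 2.25x as many candidate dies per wafer, before yield even enters the picture, is the economics behind pairing two small chips instead of building one huge one.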

    Quote Originally Posted by Lestat
    ATI is in FAR FAR worse shape than nvidia E V E R will be financially.
    That's because NV's engineering R&D budget is equal to ATI's revenues (or something along those lines). What do you expect when one company is 10x larger than the other?

    You can't really make that comparison, though.

  21. #171
    Registered User
    Join Date
    Jul 2007
    Posts
    49
    Quote Originally Posted by Shadowmage View Post
    That's because NV's engineering R&D budget is equal to ATI's revenues (or something along those lines). What do you expect when one company is 10x larger than the other?

    You can't really make that comparison, though.
    And still, with such an R&D budget, all they gave us in the last two years is the G80/G92 crapolla (don't get me wrong, they were great chips, but, hell, 2 years on the same tech?). Talk about innovation right here...

  22. #172
    Xtreme Member
    Join Date
    Mar 2007
    Location
    Pilipinas
    Posts
    445
    Quote Originally Posted by Lestat View Post
    -snip-
    Actually, you're the one who is hilarious, being so emotionally attached to one brand.

  23. #173
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Birmingham AL.
    Posts
    1,079
    I've lost hope in NV for this gen. I just hope that they will be able to work some magic around R800 time.

  24. #174
    Xtreme Member
    Join Date
    Oct 2005
    Posts
    197
    Quote Originally Posted by Lestat View Post
    The fact is boys and girls, ATI has not produced a competitive card since the 9600/9700/9800 series. the X800/X850 series was nice too, but all along nvidia was raking in the profits and the hearts and minds of PC enthusiasts.

    Nvidia has lost the race 1 time in nearly 15 years, and people are dooming nvidia, and frankly all this bashing by the community and websites is truly hurting nvidia.
    Your fact is not a fact but an opinion. I don't see the history as that cut and dried, and I don't see how you could. How can you call yourself an enthusiast and come to such conclusions? You must be an NV enthusiast.

  25. #175
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by Lestat View Post
    I still find it quite hilarious, but bad for Nvidia that after, how many years?, ATI brings out 2 cards, ok now 3 with the 4870X2 that can beat Nvidia. The aforementioned 2 (4850/4870) are not ahead of nvidia by only a smidgen, they win only on price, so everyone starts calling nvidia dead, they are doomed, omg nvidia lost!!

    the fact is boys and girls, ATI has not produced a competitive card since the 9600/9700/9800 series. the X800/X850 series was nice too, but all along nvidia was raking in the profits and the hearts and minds of PC enthusiasts.

    Nvidia has lost the race 1 time in nearly 15 years, and people are dooming nvidia, and frankly all this bashing by the community and websites is truly hurting nvidia.

    Now granted nvidia has hurt themselves, no one will deny that, with their fixed pricing, yada yada yada, and their mobile chipset solder issue, but to label them losers in the video card race is just total bullsh|t.

    I like any card that has the power to do what I want it to do, and frankly I like the GTX260/280 and the 4850/4870, so in my mind everyone wins.
    but this continued smashing, raking over the coals, and literal market killing talk about nvidia has got to stop.

    ATI, 1 time in the last, oh 6 years produces a card that has FINALLY overtaken Nvidia (4870X2), and all the sudden ATI is king.

    This is what's hilarious, and you people flip-flopping, bashing, and raising ATI from the grave are hilarious.
    ATI is in FAR FAR worse shape than nvidia E V E R will be financially.
    What a joker.

    The X1900 XT was a great card; it could do AA and HDR at the same time and had the better IQ at the time.

