
Thread: [Expreview] GeForce GTX 295 will be announced at CES 2009

  1. #76
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
We know nothing about the 55nm shrink; the best option is to wait a few days and see what's going on.
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.

  2. #77
    Xtreme Addict
    Join Date
    Aug 2008
    Posts
    2,036
They must want to commit commercial suicide. Not that I'd ever even consider a dual-GPU card, but even if I did, I wouldn't consider this abomination. If these rumors are true, this is going to put them in the red if they attempt to mass-produce these things. I seriously hope they don't.

    Heck, I'm an nVidia user and I wouldn't get one of those things near my system. Like was said earlier, I guess they are looking at joining ATi in the microstuttering club. Bad move, nVidia. Until this you had better engineering. That just went out the window if this thing is true. I'm seriously hoping it's another one of those Inq rumors that will die quickly, but it sure isn't sounding like it.

  3. #78
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,356
    Quote Originally Posted by T_Flight View Post
They must want to commit commercial suicide. Not that I'd ever even consider a dual-GPU card, but even if I did, I wouldn't consider this abomination. If these rumors are true, this is going to put them in the red if they attempt to mass-produce these things. I seriously hope they don't.

    Heck, I'm an nVidia user and I wouldn't get one of those things near my system. Like was said earlier, I guess they are looking at joining ATi in the microstuttering club. Bad move, nVidia. Until this you had better engineering. That just went out the window if this thing is true. I'm seriously hoping it's another one of those Inq rumors that will die quickly, but it sure isn't sounding like it.
    Hey, I actually fully agree with you for once.

    Unless this card has a better dual GPU solution than ATI, it's just kinda pointless if you ask me. Now if it has some sort of amazing new hardware "load splitting" and scales 99.99% all the time and is just godlike, maybe that would be different.

    From what I picture in my head so far though, this card will be nothing but a little oven for your computer.

  4. #79
    Xtreme Addict
    Join Date
    Dec 2005
    Posts
    1,035
Laser-locking only the bus and not the shaders seems a little weird to me. I'd speculate that they got the shader count wrong... 2x 216 shaders sounds more feasible to me.

    Let's wait and see.

  5. #80
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Quote Originally Posted by Sly Fox View Post
    Unless this card has a better dual GPU solution than ATI
    Any multiGPU is bad, so the rest doesn't matter.
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.

  6. #81
    Xtreme Addict
    Join Date
    Dec 2005
    Posts
    1,035
    Quote Originally Posted by T_Flight View Post
    They must be wanting to commit comercial suicide. Not that I'd ever even consider a dual GPU card, but even if I did I wouldn't consider this abomination. If these rumors are true this is going to put them in the red if they attempt to mass produce these things. I seriously hope they don't.

    Heck I'm an nVidia user and wouldn't get one of those things near my system. Like was said earlier, I guess they are looking at joining ATi in the microstuttering club. Bad move nVidia. Until this you had better engineering. That just went out the window if this thing is true. I'm seriously hoping it's another one of those Inq rumors that will die quickly, but it sure isn't sounding like it.
They need something to go against the X2....

    Having the flagship crown, or at least being very competitive in the high end, is important for name recognition.

    It will be very low volume compared to mainstream card sales, so not such a big impact on their pockets.

    Smart move IMO, and they should have released it a long time ago.

    They already lost a lot of the flagship sales... I don't see people who have an HD X2 changing for this one. They will trade blows here and there, but there's no reason to upgrade.

  7. #82
    Wanna look under my kilt?
    Join Date
    Jun 2005
    Location
    Glasgow-ish U.K.
    Posts
    4,396
    Well for synthetics, 4870X2 is beating GTX260 SLI in the top category scores......
    Quote Originally Posted by T_M View Post
    Not sure i totally follow anything you said, but regardless of that you helped me come up with a very good idea....
    Quote Originally Posted by soundood View Post
    you sigged that?

    why?
    ______

    Sometimes, it's not your time. Sometimes, you have to make it your time. Sometimes, it can ONLY be your time.

  8. #83
    Xtreme Member
    Join Date
    Mar 2007
    Posts
    317
    Quote Originally Posted by K404 View Post
    Well for synthetics, 4870X2 is beating GTX260 SLI in the top category scores......
If the GTX 295 indeed has 480 shaders, it would be more reminiscent of GTX 280 SLI than GTX 260 SLI, or better yet, a hybrid of the two.

  9. #84
    Registered User
    Join Date
    Aug 2008
    Location
    Redwood City, CA USA
    Posts
    38
Will this be a new PCB with new parts? I'm worried about buying any nVidia cards while there are reports of card failures due to heat, although the main place I saw the failure reports was the Inquirer. But I prefer nVidia cards for their historically better Linux driver support.

  10. #85
    Xtreme Member
    Join Date
    Jun 2008
    Posts
    197
LOL, the moment ATi went the X2 way, nVidia had to go that way too; it all started with the 3870X2.

    But if we look a little further back in time, nVidia was the real starter of the X2 style with the 7900GX2, to counterattack the X1950 XTX, so I don't understand how some talk about engineering leadership; it has been like this for some time now.

    And it will be like this from now on ----------->

  11. #86
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    290
    Quote Originally Posted by K404 View Post
    Well for synthetics, 4870X2 is beating GTX260 SLI in the top category scores......
In actual games, GTX 260 SLI beats the 4870 X2 almost everywhere. GTX 260-216 SLI or GTX 280 SLI easily beats it.

    Unfortunately for ATI, SLI scales better than Crossfire almost universally. This is especially true in the games where it matters, like Crysis.

    I'd like to see nVidia continue to develop single-GPU solutions, but you can't blame them for going the multi-GPU route when ATI has done it so successfully with the 4870 X2.
    Intel Core i7 920 @ 3.8GHz - Asus P6T Deluxe X58 - 6GB (2GBx3) G. SKILL DDR3-1600 @ 8-8-8-20 - 2 x EVGA GTX 280 1GB SLI - Corsair TX750 PSU - Windows Vista HP 64-bit

  12. #87
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    Quote Originally Posted by T_Flight View Post
They must want to commit commercial suicide. Not that I'd ever even consider a dual-GPU card, but even if I did, I wouldn't consider this abomination. If these rumors are true, this is going to put them in the red if they attempt to mass-produce these things. I seriously hope they don't.

    Heck, I'm an nVidia user and I wouldn't get one of those things near my system. Like was said earlier, I guess they are looking at joining ATi in the microstuttering club. Bad move, nVidia. Until this you had better engineering. That just went out the window if this thing is true. I'm seriously hoping it's another one of those Inq rumors that will die quickly, but it sure isn't sounding like it.
    Quite a subjective opinion there.
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  13. #88
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by JohnJohn View Post
LOL, the moment ATi went the X2 way, nVidia had to go that way too; it all started with the 3870X2.

    But if we look a little further back in time, nVidia was the real starter of the X2 style with the 7900GX2, to counterattack the X1950 XTX, so I don't understand how some talk about engineering leadership; it has been like this for some time now.

    And it will be like this from now on ----------->
Nope, the 7900GX2 was a 2-PCB sandwich, which is not ATi's X2 style at all.
    Last edited by Final8ty; 12-10-2008 at 02:17 AM.

  14. #89
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    Quote Originally Posted by Extelleron View Post
    In actual games GTX 260 SLI beats the 4870 X2 almost everywhere. GTX 260-216 SLI or GTX 280 SLI easily beat it.
    Links?
    AMD Phenom II X2 550@Phenom II X4 B50
    MSI 890GXM-G65
    Corsair CMX4GX3M2A1600C9 2x2GB
    Sapphire HD 6950 2GB

  15. #90
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    290
    Intel Core i7 920 @ 3.8GHz - Asus P6T Deluxe X58 - 6GB (2GBx3) G. SKILL DDR3-1600 @ 8-8-8-20 - 2 x EVGA GTX 280 1GB SLI - Corsair TX750 PSU - Windows Vista HP 64-bit

  16. #91
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Video Drivers Catalyst 8.7
    ForceWare 177.34

    Those drivers are too old to conclude anything
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufactor due to stolen technology and making clones.

  17. #92
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Location
    Test Labs
    Posts
    512
So I'm guessing the GTX 295 is going to be very short-lived, just like the 9800GX2,

    until they move to another single-PCB architecture that'll match the GTX 295.

  18. #93
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by v_rr View Post
    Video Drivers Catalyst 8.7
    ForceWare 177.34

    Those drivers are too old to conclude anything
Cat. 8.10 added pretty substantial increases for ATI's Crossfire, so I have to agree.

  19. #94
    Xtreme Member
    Join Date
    Sep 2007
    Location
    Montreal, Canada
    Posts
    263
    Quote Originally Posted by xsbb View Post
So I'm guessing the GTX 295 is going to be very short-lived, just like the 9800GX2,

    until they move to another single-PCB architecture that'll match the GTX 295.
QFT.

It will run too hot, have driver problems, clocks will be scaled down, it will be beaten by 260s in SLI, it will have ungodly power consumption (not a problem), and it will have scaling problems galore in future games. You heard it here first.

It's a win-win for me: if they prove me wrong (highly unlikely), then we have a pretty kickass card, and if they don't prove me wrong, then I was right all along.

  20. #95
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Quote Originally Posted by SKYMTL View Post
Cat. 8.10 added pretty substantial increases for ATI's Crossfire, so I have to agree.
Yeah, and so do the ForceWare 180.xx drivers for NV.

    Anyway, this GTX 295 is only like a slightly castrated GTX 280 SLI setup, so theoretically it would probably be 70~80% faster than a single GTX 280, but of course it will vary a bit from game to game, as usual with these dual-PCB constructions, so some games will show a much lower benefit too, maybe 50%.

    I've never liked these kinds of solutions; it's just an easy way for NV to release a faster card, and it's very short-lived and won't get proper driver support for games in the future after the next-gen series is released, so some games will be buggy on it, etc. It's both expensive and not very "future-safe" due to lacking driver support.
    Last edited by RPGWiZaRD; 12-15-2008 at 07:59 AM.
    Intel? Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place
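[Editor's note] The scaling figures in the post above (70~80% in a good case, maybe 50% in a bad one) can be sanity-checked with a quick sketch. The FPS numbers below are hypothetical, purely for illustration; none of them are measured GTX 295 results.

```python
# Illustrative only: effective frame rate of a dual-GPU card given a
# per-game scaling factor (1.0 = perfect 2x, i.e. +100% over one GPU).

def dual_gpu_fps(single_gpu_fps, scaling):
    """FPS of two GPUs when the second GPU adds `scaling` of a full GPU."""
    return single_gpu_fps * (1.0 + scaling)

# A hypothetical single GTX 280 at 40 FPS, with the 80%, 70%, and 50%
# scaling figures mentioned above:
for scaling in (0.8, 0.7, 0.5):
    print(f"{scaling:.0%} scaling -> {dual_gpu_fps(40, scaling):.0f} FPS")
```

So the gap between a well-scaling and a poorly-scaling game (72 vs 60 FPS from a 40 FPS baseline) is exactly the variability the post is describing.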

  21. #96
The Doctor
    Join Date
    Oct 2006
    Location
    Kansas City, MO
    Posts
    2,597
    Quote Originally Posted by RPGWiZaRD View Post
Yeah, and so do the ForceWare 180.xx drivers for NV.

    Anyway, this GTX 295 is only like a slightly castrated GTX 280 SLI setup, so theoretically it would probably be 70~80% faster than a single GTX 280, but of course it will vary a bit from game to game, as usual with these dual-PCB constructions, so some games will show a much lower benefit too, maybe 50%.

    I've never liked these kinds of solutions; it's just an easy way for NV to release a faster card, and it's very short-lived and won't get proper driver support for games in the future after the next-gen series is released, so some games will be buggy on it, etc. It's both expensive and not very "future-safe" due to lacking driver support.
    +1 Totally agree.
    My Rig can do EpicFLOPs, Can yours?
    Once this baby hits 88 TeraFLOPs, You're going to see some serious $@#%....

    Build XT7 is currently active.
    Current OS Systems: Windows 10 64bit

  22. #97
    Banned
    Join Date
    Dec 2008
    Posts
    63
    Quote Originally Posted by K404 View Post
    Well for synthetics, 4870X2 is beating GTX260 SLI in the top category scores......
That's not a fair comparison though, because of CPU clocks.
    Look at the GX2 beating the 8800 GTS 512 SLI in benchmarks.

  23. #98
    Banned
    Join Date
    Dec 2008
    Posts
    63
    Quote Originally Posted by JohnJohn View Post
LOL, the moment ATi went the X2 way, nVidia had to go that way too; it all started with the 3870X2.

    But if we look a little further back in time, nVidia was the real starter of the X2 style with the 7900GX2, to counterattack the X1950 XTX, so I don't understand how some talk about engineering leadership; it has been like this for some time now.

    And it will be like this from now on ----------->
Not really; there were dual 6600GT cards, as well as 6800GT ones, and the dual 7800GT and the 7900GT masterpiece, on one PCB no less.

  24. #99
    Xtreme Enthusiast
    Join Date
    Aug 2005
    Location
    Melbourne, Australia
    Posts
    942
There were dual Voodoo2 and dual ATI Rage 128 Pro cards too.
    Q9550 || DFI P45 Jr || 4x 2G generic ram || 4870X2 || Aerocool M40 case || 3TB storage


  25. #100
    Administrator
    Join Date
    Nov 2007
    Location
    Stockton, CA
    Posts
    3,569
    Quote Originally Posted by xsbb View Post
So I'm guessing the GTX 295 is going to be very short-lived, just like the 9800GX2,

    until they move to another single-PCB architecture that'll match the GTX 295.
Agreed. I purchased my 9800GX2s the day they came out, EVGA brand. GTX 280s came out 91 days later; yup, I missed my Step-Up by 1 day.
