Page 9 of 15
Results 201 to 225 of 355

Thread: GeForce GTX295 Card Exposed

  1. #201
    Xtreme Legend
    Join Date
    Mar 2005
    Location
    Australia
    Posts
    17,242
    Quote Originally Posted by Glow9 View Post
    I do, but do you get mine? Paying $100 more for something that gives 10% is pretty pointless. Especially when the faster card runs hotter and uses more power. It even costs you more in the long run. I just think value is going to outweigh minimal performance differences in the next while.
    lol, I'm not arguing value for money in my previous post... we're specifically talking about performance differences and fair comparisons,

    not dollar-for-dollar comparisons.

    I'm not saying one is better than the other;
    hey, I run an 8600GT in my 24/7 rig.
    It's very fast for everything I need it for... to me your GTX260 looks like a waste of money too, but I'm not arguing that lol
    Team.AU
    Got tube?
    GIGABYTE Australia
    Need a GIGABYTE bios or support?



  2. #202
    Xtreme Addict
    Join Date
    Aug 2008
    Posts
    2,036
    Quote Originally Posted by Yukon Trooper View Post
    And Nvidia feels the exact same way, so they're hustling to put a dual-GPU solution on the market.
    Yeah, you do have to wonder what they are thinking, or IF they were thinking. I think the insanity of trying to compete with ATi has caused them to compete against themselves, and is affecting their engineering choices. It just doesn't make sense. They tried that once before with the GX2 and proved this multi-GPU stuff is a complete failure on all levels, and now they are gonna go backwards after having the top performing single GPU solution and wanna do it again?

    It's insanity. That is THE definition of insanity...Doing the same thing over and over again, and expecting different results.

  3. #203
    Xtreme Addict
    Join Date
    May 2008
    Location
    Land of Koalas and Wombats
    Posts
    1,058
    Quote Originally Posted by T_Flight View Post
    Yeah, you do have to wonder what they are thinking, or IF they were thinking. I think the insanity of trying to compete with ATi has caused them to compete against themselves, and is affecting their engineering choices. It just doesn't make sense. They tried that once before with the GX2 and proved this multi-GPU stuff is a complete failure on all levels, and now they are gonna go backwards after having the top performing single GPU solution and wanna do it again?

    It's insanity. That is THE definition of insanity...Doing the same thing over and over again, and expecting different results.
    What's worse is they're going to do the same thing they did with the 7950GX2 and the 9800GX2: they'll bring it out, do driver fixes for 3 months until the next architecture comes out, then they'll forget about it and leave the owners high and dry yet again. I hate to say it, but their track record on dual-GPU cards isn't very positive. Current 9800GX2 driver support is abysmal by any standard.

    DFI LT-X48-T2R UT CDC24 Bios | Q9550 E0 | G.Skill DDR2-1066 PK 2x2GB |
    Geforce GTX 280 729/1566/2698 | Corsair HX1000 | Stacker 832 | Dell 3008WFP


  4. #204
    Xtreme Addict
    Join Date
    Aug 2008
    Posts
    2,036
    Uh oh, I just had another thought: I wonder how painfully expensive watercooling one of these beasts might be? It might take its own rad, pump, and fans, and the block for it would be a nightmare.

    The first factory (EVGA) block I saw for one of those 9800GX2s was aluminum, and it was huge. I think they made it out of Al to save weight, to keep from breaking off the poor PCI-e port hanging on for dear life. Of course, aluminum is useless in a loop containing Cu or brass, so that was a non-starter from the get-go. Can anyone imagine what a block like that would weigh in Cu? It would be like 4 pounds or some ridiculous number! Because the GPUs were facing each other, the cards were sandwiched around one huge, massively thick block. I did see a Danger Den Cu block, but like I said, I'd hate to see what it weighed. It's a double-sided, full-cover block for the 9800GX2 only.


    If nVidia does go ahead and do this (which I seriously hope they don't), it would be wise to situate the GPUs on the outside so block mounting would be easier. Of course, I hate to even give them ideas for something that's a bad idea to begin with.
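The copper-versus-aluminium weight question above can be ballparked from material densities alone. A minimal sketch, assuming standard published densities and a made-up 250 cm³ solid block volume (the volume is purely illustrative, not a real block's spec):

```python
# Rough estimate of how much heavier a full-cover waterblock gets
# when machined from copper instead of aluminium.
AL_DENSITY = 2.70   # g/cm^3, standard figure for aluminium
CU_DENSITY = 8.96   # g/cm^3, standard figure for copper
GRAMS_PER_LB = 453.6

block_volume_cm3 = 250  # hypothetical solid volume, for illustration only

al_weight_lb = block_volume_cm3 * AL_DENSITY / GRAMS_PER_LB
cu_weight_lb = block_volume_cm3 * CU_DENSITY / GRAMS_PER_LB

print(f"Aluminium: {al_weight_lb:.1f} lb")
print(f"Copper:    {cu_weight_lb:.1f} lb")
print(f"Copper is {CU_DENSITY / AL_DENSITY:.1f}x heavier for the same volume")
```

Whatever the real volume, copper comes out roughly 3.3x heavier than the same block in aluminium, which is why a double-sided full-cover Cu block lands in "4 pounds or some ridiculous number" territory.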

  5. #205
    Xtreme Mentor
    Join Date
    Sep 2006
    Posts
    2,834
    If the reports on power savings from going to 55nm on GT200 are accurate, then the heat-dump might not be too bad for a water-cooled setup.

    For my part I know nothing with any certainty, but the sight of the stars makes me dream.

    ..

  6. #206
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Location
    Killadelphia
    Posts
    598
    Why not name it the GTX300?
    CPU: INTEL Q9550 E0 @ 3600 1.28V + CORSAIR H50
    GPU: EVGA GTX 480 @ 725/1450/3800
    PPU: EVGA 9800GTX @ 738/1836/2250
    MB: ASUS P5E X38
    RAM: MUSHKIN 2X2GB
    SOUND: X-FI TITANIUM FATAL1TY | LOGITECH Z-5500 | LOGITECH G35
    HDD: WD 76GB X 2 RAPTORS RAID 0 10KRPM | WD 1TB CAVIAR BLACK 32MB STORAGE
    PSU: CORSAIR HX1000W PSU
    OS: WINDOWS 7 PRO X64 OS
    MON: HANNSPREE 28" 1920X1200 RESOLUTION


  7. #207
    Banned
    Join Date
    Dec 2005
    Location
    Everywhere
    Posts
    1,715
    Quote Originally Posted by Envydia007 View Post
    Why not name it the GTX300?
    Because the GTX 300 series will be a next generation with the G300 core. Its launch is planned for summer/fall 2009.

  8. #208
    Xtreme Addict
    Join Date
    Aug 2008
    Posts
    2,036
    Quote Originally Posted by Yukon Trooper View Post
    If the reports on power savings from going to 55nm on GT200 are accurate, then the heat-dump might not be too bad for a water-cooled setup.
    Yeah, the 280 wasn't as bad as I thought it was going to be. Guys had me scared of it before I ever got it. It's really not that bad on air, but it really comes alive with water. They have a lot in them, and you can get a lot more out of them with water.

    I've had mine up to max clocks already just to see it, but the heat is out of control at max clocks. It needs water badly; water really does help them. Talonman did a thread in the Water forum, and he has awesome clocks and temps. I can't even touch what he's getting except for a few seconds loaded, and then I have to back out.

    I hope they get all these new cards to run a bit cooler, and maybe increase the shader clocking headroom a bit. I'm still a single-card fan. If I need more I can always SLI another one.

  9. #209
    Xtreme Enthusiast
    Join Date
    May 2006
    Location
    Green Bay, WI
    Posts
    616
    Quote Originally Posted by T_Flight View Post
    Talonman did a thread in the Water forum, and he has awesome clocks and temps. I can't even touch what he's getting except for a few seconds loaded, and then I have to back out.
    Can I get a link?

  10. #210
    Xtreme Addict
    Join Date
    Aug 2008
    Posts
    2,036

  11. #211
    Xtreme Mentor
    Join Date
    Sep 2006
    Posts
    2,834
    Single card setups are definitely the preferred method of attack.

    As far as heat and lower fabrication processes go, I can't wait until we're all running high-end setups on 250W PSUs again.

    For my part I know nothing with any certainty, but the sight of the stars makes me dream.

    ..

  12. #212
    Xtreme Enthusiast
    Join Date
    Jul 2008
    Posts
    950
    Quote Originally Posted by [XC] Synthetickiller View Post
    I'm in the same boat.

    I don't think the gtx265 is worth it.

    I think you're looking at gtx285 or gtx295.

    If you dislike SLI or don't want to spend that much, you're looking at gtx285 effectively. I'm waiting to see pricing before I pick. I will not be getting a gtx265.

    That's my 2 cents on the matter.
    Yeah, I think I'll go for either the GTX 285 or the 295, depending on the difference I have to pay. I've only had my GTX260-216 for 7 weeks, so hopefully I won't have to put much toward the new card. We'll see... good luck, man.

  13. #213
    Xtreme Enthusiast
    Join Date
    Mar 2008
    Location
    Alberta Canada
    Posts
    631
    I'll be interested to see numbers and prices once the new cards are out,
    although the card most likely catching my eye is the GTX265 (depending on power consumption, heat output, and folding power).
    Current System:
    eVGA 680i SLi "A2" P30 BIOS
    intel Core 2 Quad Q6600 (currently at stock)
    OCZ ReaperX 4GB DDR2 1000 (running at DDR2 800 Speeds with cas4)
    320GB Seagate 7200.10
    XFX 8800GT XXX 512MB (stock clocks)
    auzentech X-Fi Prelude
    PC Power and Cooling Silencer 750 Quad Copper
    Win XP Pro

  14. #214
    Xtreme Member
    Join Date
    Oct 2008
    Posts
    263
    I just hope they don't release these and then drop all support when the 300 series comes along; that's what happened when I bought the 7950GX2.
    Whats up?

  15. #215
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Puerto Rico
    Posts
    1,374
    Quote Originally Posted by shoopdawoopa View Post
    I just hope they dont release these and then drop all support when the 300 series comes along, thats what happened when I bought the 7950gx2
    It will probably happen again, like with the 9800GX2. I think Nvidia is shooting itself in the foot once again. While I'm anxious to see this monster, I think it's a bad move. Why do I say it's a bad move? Well, remember the 9800GX2 making the GT200's performance at launch look mediocre? The GX2 made Nvidia's statement about a ">100%" performance increase over G80 (more like what we saw going from G70 to G80) a joke. What happened then? EOL... Nvidia's pattern is to release single-core chips at the start, and if the competition beats them, bring out a dual-chip card and EOL it when the next product line arrives. That's my opinion; I could be wrong about this card, though!
    ░█▀▀ ░█▀█ ░█ ░█▀▀ ░░█▀▀ ░█▀█ ░█ ░█ ░░░
    ░█▀▀ ░█▀▀ ░█ ░█ ░░░░█▀▀ ░█▀█ ░█ ░█ ░░░
    ░▀▀▀ ░▀ ░░░▀ ░▀▀▀ ░░▀ ░░░▀░▀ ░▀ ░▀▀▀ ░

  16. #216
    Xtreme Member
    Join Date
    Nov 2007
    Posts
    154
    Quote Originally Posted by LOE View Post
    Well, about ATI's dual-GPU approach: I can classify that as ONE CARD with 2 GPUs, but the Nvidia solution has 2 PCBs, so it's pretty doubtful I can call this "a single card". It's 2 cards stuck together with only one PCI-E slot.
    I think this thing about 2 PCBs glued together almost sounds like an elitist view: having 2 PCBs makes the card seem less impressive, since it "needs" two cards to perform, unlike a single PCB. Practical aspects of heat and power aside, I don't see what the fuss is about. I mean, the Q6600 was no amazing feat: two dual cores jammed together. Yet you don't see people complaining about that all day.

  17. #217
    Xtreme Addict
    Join Date
    Jul 2008
    Location
    SF, CA
    Posts
    1,294
    Quote Originally Posted by noinimod View Post
    I think this thing about 2 PCBs glued together almost sounds like an elitist view: having 2 PCBs makes the card seem less impressive, since it "needs" two cards to perform, unlike a single PCB. Practical aspects of heat and power aside, I don't see what the fuss is about. I mean, the Q6600 was no amazing feat: two dual cores jammed together. Yet you don't see people complaining about that all day.
    /take off calm hat/
    It means the Nvidia design team can't get off their asses and design a unified PCB like ATI. A dual-PCB sandwich is a ham-handed throe of desperation. It would make sense if, say, they needed the extra PCB for additional VRAM or a beefier vreg circuit, but otherwise it just means they want to put out another card without doing a whole lot of design work. Which is probably the same attitude they'll take with drivers.
    (Don't compare it to Intel's Core 2 architecture; there aren't really any parallels.)
    /replace calm hat/

    love you nvidia!

  18. #218
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by BulldogPO View Post
    Yep. HDMI is the way to go, not DisplayPort.
    ??

    What is wrong with DisplayPort..?



    .

  19. #219
    Xtreme Enthusiast
    Join Date
    Oct 2006
    Location
    AU
    Posts
    510
    Quote Originally Posted by LOE View Post
    Well, about ATI's dual-GPU approach: I can classify that as ONE CARD with 2 GPUs, but the Nvidia solution has 2 PCBs, so it's pretty doubtful I can call this "a single card". It's 2 cards stuck together with only one PCI-E slot.
    So you class a tower case that has 2 motherboards as two cases?

  20. #220
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,356
    Quote Originally Posted by X.T.R.E.M.E_ICE View Post
    So you class a tower case that has 2 motherboards as two cases?
    Your analogy doesn't make sense.

    It would be "So you class a tower case that has 2 motherboards as two computers"

    And yes, you could certainly call them that.

  21. #221
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    Quote Originally Posted by Xoulz View Post
    ??

    What is wrong with DisplayPort..?



    .
    Here is the start:

    1) It's slower than HDMI/DVI, since it uses dual encryption (it sends an encrypted message to the monitor/TV, gets an offset back, then sends a packet to let it display the frame).
    2) It's got less bandwidth than DVI or HDMI and degrades more over distance.
    3) It is not an open standard; you have to pay to use it.
    4) It offers no advantages for the user and no advantage for the distributor, only more latency.
    5) It cannot be converted to another input without losing HDCP, making it useless if you only have native DisplayPort.
    6) It blocks non-HDCP content natively, so once it reaches proliferation there will be no DRM-free media, since live signing will be disabled (live signing temporarily encrypts playback).

    The list goes on further if you look around, but just know that it's bad and don't buy anything with it. Hopefully we can kill this like DIVX discs.
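Bandwidth claims like (1) and (2) are easy to sanity-check against the raw link rates published for that era. A minimal sketch, assuming the commonly quoted payload data rates for dual-link DVI, HDMI 1.3, and DisplayPort 1.1, and a reduced-blanking pixel clock for a 2560x1600 @ 60 Hz panel (verify all figures against the actual specs):

```python
# Compare published payload data rates (circa 2008 specs) against
# the bandwidth a 30" 2560x1600 @ 60 Hz, 24-bit panel needs.
# All link figures below are assumptions from the era's spec sheets.
links_gbps = {
    "Dual-link DVI": 7.92,    # 2 x 165 MHz x 24 bit
    "HDMI 1.3": 8.16,         # 340 MHz TMDS x 24 bit
    "DisplayPort 1.1": 8.64,  # 10.8 Gbit/s raw, after 8b/10b coding
}

# 2560x1600 @ 60 Hz with reduced blanking needs a ~268 MHz pixel clock.
pixel_clock_hz = 268e6
bits_per_pixel = 24
needed_gbps = pixel_clock_hz * bits_per_pixel / 1e9  # ~6.43 Gbit/s

for name, rate in sorted(links_gbps.items(), key=lambda kv: kv[1]):
    verdict = "ok" if rate >= needed_gbps else "too slow"
    print(f"{name}: {rate:.2f} Gbit/s payload -> {verdict}")
print(f"Needed for the panel: {needed_gbps:.2f} Gbit/s")
```

By these published figures, all three links can drive the panel, and DP 1.1's payload rate actually edges out dual-link DVI, so the claims above are worth checking against the specs rather than taking on faith.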
    Last edited by zanzabar; 12-15-2008 at 05:50 PM.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  22. #222
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Quote Originally Posted by Envydia007 View Post
    Why not name it the GTX300?
    Don't worry, I'm sure they will at some point.

  23. #223
    Xtreme Enthusiast
    Join Date
    Oct 2006
    Location
    AU
    Posts
    510
    Quote Originally Posted by Sly Fox View Post
    Your analogy doesn't make sense.

    It would be "So you class a tower case that has 2 motherboards as two computers"

    And yes, you could certainly call them that.
    The 4870X2 has 2 GPUs; does that mean it is 2 cards? I remember seeing a card, PowerColor I think it was, that had a second PCB for the output connections.

    EDIT: I think people need to know the difference between a PCB and a card. LOL. This debate is getting real old.
    Last edited by X.T.R.E.M.E_ICE; 12-15-2008 at 06:20 PM.

  24. #224
    Xtreme Mentor
    Join Date
    Jul 2004
    Posts
    3,247

    GeForce GTX 295 benchmarks out?


  25. #225
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    So it's good at PhysX and G200-optimized games. It looks like it will make AMD get off their ass.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

