
Thread: Nvidia 270, 290 and GX2 roll out in November

  1. #26
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Puerto Rico
    Posts
    1,374
    Quote Originally Posted by RPGWiZaRD View Post
    That would hurt GTX260 sales a bit too much, dropping it to something like say 150~170 EUR / 220~240 USD.



    I seriously doubt it; like the INQ said, it's more likely that they'll have to drop the clocks a bit in order to keep the power draw realistic.


    NV is not in a good situation right now; for customers there aren't any huge problems here, but as a company it ain't doing so well.

    Why's that? Nvidia is known for pulling unexpected stunts out of nowhere, and they actually work better when they're under pressure.

  2. #27
    all outta gum
    Join Date
    Dec 2006
    Location
    Poland
    Posts
    3,390
    Quote Originally Posted by Bo_Fox View Post
    GTX 290: die shrink, slightly lower power consumption and/or higher clocks, possibly GDDR5 memory
    No way. GDDR5 requires a specialized memory controller, and since it's just an optical shrink, it still has the GDDR3 controller.
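    For context on why the controller matters: GDDR3 moves 2 bits per pin per memory clock while GDDR5 moves 4, so the same bus width roughly doubles its peak bandwidth at a given clock. A minimal back-of-the-envelope sketch (the 256-bit bus and 1000 MHz clock here are purely illustrative assumptions, not any real card's spec):

        # Peak bandwidth in GB/s: (bus width in bytes) x (memory clock) x (transfers per clock)
        def bandwidth_gbs(bus_bits, mem_clock_mhz, transfers_per_clock):
            return bus_bits / 8 * mem_clock_mhz * 1e6 * transfers_per_clock / 1e9

        # Same hypothetical 256-bit bus at 1000 MHz:
        print(bandwidth_gbs(256, 1000, 2))   # GDDR3 (double data rate): 64.0 GB/s
        print(bandwidth_gbs(256, 1000, 4))   # GDDR5 (quad data rate):  128.0 GB/s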
    www.teampclab.pl
    MOA 2009 Poland #2, AMD Black Ops 2010, MOA 2011 Poland #1, MOA 2011 EMEA #12

    Test bench: empty

  3. #28
    Xtreme Member
    Join Date
    Mar 2007
    Posts
    317
    Quote Originally Posted by Bo_Fox View Post
    Nah, although the INQ does bash Nvidia *hard*, it was a great article by the INQ nonetheless. It certainly does sound very plausible.

    Let's look at the reality for a minute... do you really think that 55nm is going to bring more than the generous "15%" power savings the INQ claimed? It was generous of the INQ to say that, for Nvidia's sake. Do you think it would actually bring more than a 50 MHz boost in clock speed without increasing power usage? Apparently, 55nm only allowed the 9800GTX+ a 68 MHz clock increase within the same power envelope. The 9800GTX+ still failed to show a clear lead over the single-slot HD 4850, which was a great disappointment for Nvidia.
    I was speaking of a GX2 card, not GTX290. GTX290 won't be a threat to 4870x2 anyhow...

  4. #29
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    Eh. While it was a pretty rough article, he did accurately point out the whole honoured tradition of 'milking the stupid.'

    It's true.
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  5. #30
    Xtreme Addict
    Join Date
    Dec 2005
    Posts
    1,035
    Quote Originally Posted by Junos View Post
    When I saw the headline I thought, yeah, finally some new info about GT200b. While reading the article it became clear that it was very anti-Nvidia, and I thought "it's gotta be the Inquirer". And guess what, it was. I didn't even bother to finish it.
    Same here, I suspected it was the INQ after the "NV is in deep doo-doo right now..." part. So I checked the "source" and stopped reading.

    The initial GTX 280 price was due to the absence of anything faster at the time... aimed at early adopters. Nvidia probably knew ATI was going to launch a dual-GPU card, and likely expected/planned the price drops, just like what happened. I may or may not be wrong, but the INQ's stance/interpretation of facts annoys the hell out of me. Very subjective and unprofessional.
    Last edited by Tonucci; 10-09-2008 at 06:30 AM.

  6. #31
    Xtreme Addict
    Join Date
    Nov 2007
    Posts
    1,195
    Quote Originally Posted by RealTelstar View Post
    I really can't stand the ATI fanboyism of the INQ.

    The perf. margin between the GTX 280 and the 4870X2 is like 20%, and not in every game title.
    Haha, 20%? Are you joking? There are many games where the difference is as much as 60-70%.

  7. #32
    Xtreme Addict
    Join Date
    Aug 2004
    Location
    Austin, TX
    Posts
    1,346
    Quote Originally Posted by Stevethegreat View Post
    I was speaking of a GX2 card, not GTX290. GTX290 won't be a threat to 4870x2 anyhow...
    The GX2 card won't be based off the GTX 280 due to heat and power. In the best case, it'll be based off a 55nm 9TPC GTX270-core216. As we can all probably guess, the GTX270-core216 probably only beats the 4870 by a little (and loses with 8xAA), so I don't think that the GX2 card would be much of a threat to ATI (a marginal win for NV).

  8. #33
    Xtreme Enthusiast
    Join Date
    Jul 2007
    Location
    Kuwait
    Posts
    616
    Quote Originally Posted by Shadowmage View Post
    The GX2 card won't be based off the GTX 280 due to heat and power. In the best case, it'll be based off a 55nm 9TPC GTX270-core216. As we can all probably guess, the GTX270-core216 probably only beats the 4870 by a little (and loses with 8xAA), so I don't think that the GX2 card would be much of a threat to ATI (a marginal win for NV).
    I've been looking for a review of the GTX 260 216 vs HD 4870; if you know of one, will you link it please? I got it: Guru3D. Thanks anyway.
    Last edited by Mk; 10-09-2008 at 06:37 AM.

  9. #34
    Xtreme Member
    Join Date
    Apr 2007
    Posts
    124
    Quote Originally Posted by C.Ron7aldo View Post
    I've been looking for a review of the GTX 260 216 vs HD 4870; if you know of one, will you link it please?
    Tech Report just posted this - http://www.techreport.com/articles.x/15651 . It's the GTX 260 Core 216 vs a 1GB HD 4870.

  10. #35
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by C.Ron7aldo View Post
    I've been looking for a review of the GTX 260 216 vs HD 4870; if you know of one, will you link it please? I got it: Guru3D. Thanks anyway.
    AMD's ATI Radeon HD 4870 with 1GB of GDDR5
    http://enthusiast.hardocp.com/articl...50aHVzaWFzdA==


    The Radeon HD 4870 1GB: The Card to Get

    http://www.anandtech.com/video/showdoc.aspx?i=3415
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufactor due to stolen technology and making clones.

  11. #36
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    Quote Originally Posted by Stevethegreat View Post
    I was speaking of a GX2 card, not GTX290. GTX290 won't be a threat to 4870x2 anyhow...
    To sandwich and cool a dual-460mm^2 GPU solution is probably going to require water cooling.. if that's what you mean by engineering. Or maybe it will have to be done on a single 12-inch long PCB so that a dual-slot cooler can be used like the 4870X2? Or maybe it will have to have special external heatsinks connected with heatpipes? Just so that it can beat the 4870X2 by 10%?

    The only way Nvidia could keep on using the same 9800GX2-style sandwich cooler is if the new GTX270 cores do not use any more power than the 9800GX2 cores, which is certainly possible. However, unless they are cherry-picked like the INQ said, they would not be any faster than a single 4870 core. The 4870X2 already has a dual-slot cooler, so Nvidia could try to match it at best, just like the 9800GTX+ managed to match the single-slot HD 4850.

    What I'm glad to see is the GTX 290 extending the single-GPU performance lead further (since I'm a retired SLI veteran, preferring to avoid all the compatibility and microstuttering headaches).

    --two awesome rigs, wildly customized with
    5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
    --SONY GDM-FW900 24" widescreen CRT, overclocked to:
    2560x1600 resolution at 68Hz!(from 2304x1440@80Hz)

    Updated List of Video Card GPU Voodoopower Ratings!!!!!

  12. #37
    Xtreme Enthusiast
    Join Date
    Dec 2007
    Location
    Boston
    Posts
    553
    Quote Originally Posted by Bo_Fox View Post
    That was an excellent article by the INQ!!!

    And that is what I was thinking too! After Nvidia named the dual "8800GTS-512" solution a new number generation, the 9800GX2, there was a leak of the GTX-350. It was rumored to have 2GB of GDDR5 memory at 512-bit bus width. At first, I was wondering "what the heck"?!? Now, it all makes sense, given that the bandwidth is "doubled" on Nvidia's dual-GPU solution, and that Nvidia was always a sucker for making a dual-GPU solution since the 7900GX2 days. I knew that at 55nm, it was Nvidia's only chance of beating the 4870X2, by shrinking the GT200 cores and forcing them into a dual-GPU solution.

    The INQ did a good job at pointing out the power consumption and cooling issues. It is unquestionably going to be a huge challenge to cool this sandwiched thing!!

    GTX 290: die shrink, slightly lower power consumption and/or higher clocks, possibly GDDR5 memory

    GTX 350 (dual GT200b): ~15-30% more power consumption than the 4870X2, probably has to be less than 600MHz per core due to cooling limits, still loses to the 4870X2 in some games, insane cooling fan noise :down:
    I have no idea what kind of crazy INQ ish you are smoking, but a GTX 350 would absolutely dominate the 4870X2. Why? The bandwidth of a GTX 280 with GDDR3 is already more than a 4870/4870X2 with GDDR5, due to the 448/512-bit bus. Put two of those together with some GDDR5 and you will have a card that will leave the 4870X2 face down in the mud. (The power consumption at load would be high, but it would be 55nm, so I doubt it would be that much more than the 55nm 4870X2, which runs hotter than the two 65nm G92s in a 9800GX2.)
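    For what it's worth, the raw bandwidth numbers behind that claim can be sanity-checked. A quick sketch using commonly quoted reference-card specs (the memory clocks below are my assumptions, not figures from this thread):

        def bandwidth_gbs(bus_bits, mem_clock_mhz, transfers_per_clock):
            # Peak memory bandwidth in GB/s.
            return bus_bits / 8 * mem_clock_mhz * 1e6 * transfers_per_clock / 1e9

        # Assumed reference specs: GTX 280 = 512-bit GDDR3 @ 1107 MHz,
        # GTX 260 = 448-bit GDDR3 @ 999 MHz, HD 4870 = 256-bit GDDR5 @ 900 MHz.
        print(f"GTX 280: {bandwidth_gbs(512, 1107, 2):.0f} GB/s")   # ~142 GB/s
        print(f"GTX 260: {bandwidth_gbs(448, 999, 2):.0f} GB/s")    # ~112 GB/s
        print(f"HD 4870: {bandwidth_gbs(256, 900, 4):.0f} GB/s")    # ~115 GB/s

    So a GTX 280 does lead a single 4870 on raw bandwidth, while the GTX 260 and the 4870 are roughly even; whether that gap shows up in games is a separate question.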
    As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
    "I am become Death, the destroyer of worlds."
    Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki

  13. #38
    Xtreme Enthusiast
    Join Date
    Dec 2007
    Location
    Boston
    Posts
    553
    Quote Originally Posted by Bo_Fox View Post
    To sandwich and cool a dual-460mm^2 GPU solution is probably going to require water cooling.. if that's what you mean by engineering. Or maybe it will have to be done on a single 12-inch long PCB so that a dual-slot cooler can be used like the 4870X2? Or maybe it will have to have special external heatsinks connected with heatpipes? Just so that it can beat the 4870X2 by 10%?

    The only way Nvidia could keep on using the same 9800GX2-style sandwich cooler is if the new GTX270 cores do not use any more power than the 9800GX2 cores, which is certainly possible. However, unless they are cherry-picked like the INQ said, they would not be any faster than a single 4870 core. The 4870X2 already has a dual-slot cooler, so Nvidia could try to match it at best, just like the 9800GTX+ managed to match the single-slot HD 4850.

    What I'm glad to see is the GTX 290 extending the single-GPU performance lead further (since I'm a retired SLI veteran, preferring to avoid all the compatibility and microstuttering headaches).

    You're completely forgetting that Nvidia will be getting GDDR5, which is the only thing that makes the 4870X2 competitive with the GTX series right now.
    As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
    "I am become Death, the destroyer of worlds."
    Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki

  14. #39
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Bandwidth >100GB/s matters very little.

    See this (GRID benchmark graph; image not preserved):

    Where did the fillrate, bandwidth, texturing advantages go? You have a chip half the size of the other that is unplayable.

    Future games will be made close to this type of specifications. Starting with Operation Flashpoint 2, but it'll be widespread in a while.

    ATI can spam a LOT of shader units and even TMUs inside their chips without using much die size. The same does NOT go for nVidia.
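    For a rough sense of the die-size point, here is a sketch using commonly cited approximate figures; note that ATI and Nvidia "shader units" are architecturally very different, so this only illustrates area cost, not performance:

        # Approximate, commonly cited figures (assumptions, not measured here).
        chips = {
            "GT200 (65nm, GTX 280)": {"die_mm2": 576, "shader_units": 240},
            "RV770 (55nm, HD 4870)": {"die_mm2": 256, "shader_units": 800},
        }
        for name, c in chips.items():
            print(f"{name}: {c['shader_units'] / c['die_mm2']:.2f} shader units per mm^2")
        # Roughly ~0.4/mm^2 for GT200 vs ~3.1/mm^2 for RV770 -- the units aren't
        # equivalent, but it shows how cheap (in area) extra ALUs are for RV770.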
    Quote Originally Posted by radaja View Post
    so are they launching BD soon or a comic book?

  15. #40
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Toronto, Canada
    Posts
    1,491
    Wow...

    People who love companies and defend them to the end are funny.

    On both sides.

    "This one is going to kick that one's butt... one day"

    "This one is better than that one"

    "This one has one GPU and that one has two, so it's not as good"

    The same damn arguments over and over and over and over. The responses are very often predictable and you can tell what brand is going to be in the sig before you even get to the bottom of the post.

    Give it a rest...holy crap.

    Single fastest card on the market = 4870 X2
    Single fastest GPU on the market = GT200

    That is what we know right now. All the rest is just speculation for the future.

    Either card is awesome... don't worry, your penis won't change size, no matter which brand you own.
    RIG 1 (in progress):
    Core i7 920 @ 3GHz 1.17v (WIP) / EVGA X58 Classified 3X SLI / Crucial D9JNL 3x2GB @ 1430 7-7-7-20 1T 1.65v
    Corsair HX1000 / EVGA GTX 295 SLI / X-FI Titanium FATAL1TY Pro / Samsung SyncMaster 245b 24" / MM H2GO
    2x X25-M 80GB (RAID0) + Caviar 500 GB / Windows 7 Ultimate x64 RC1 Build 7100

    RIG 2:
    E4500 @ 3.0 / Asus P5Q / 4x1 GB DDR2-667
    CoolerMaster Extreme Power / BFG 9800 GT OC / LG 22"
    Antec Ninehundred / Onboard Sound / TRUE / Vista 32

  16. #41
    Xtreme Enthusiast
    Join Date
    Dec 2007
    Location
    Boston
    Posts
    553
    Quote Originally Posted by Macadamia View Post
    Bandwidth >100GB/s matters very little.

    See this (GRID benchmark graph; image not preserved):

    Where did the fillrate, bandwidth, texturing advantages go? You have a chip half the size of the other that is unplayable.

    Future games will be made close to this type of specifications. Starting with Operation Flashpoint 2, but it'll be widespread in a while.

    ATI can spam a LOT of shader units and even TMUs inside their chips without using much die size. The same does NOT go for nVidia.

    Dude, did you even look at the 4870 512MB vs 4870 1GB in the above graph?
    Take a look at that graph you posted when you get a chance.
    That's where the fillrate, bandwidth, and extra GDDR5 advantages went.

    GRID is not the most graphically demanding game. All the above shows is that it scales well on a 4870X2, arguably due to its GDDR5.
    We'd have to have a 4850X2 to really know.
    Last edited by fragmasterMax; 10-09-2008 at 07:11 AM.
    As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
    "I am become Death, the destroyer of worlds."
    Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki

  17. #42
    Xtreme Addict
    Join Date
    Nov 2006
    Location
    Red Maple Leaf
    Posts
    1,556
    Quote Originally Posted by spursindonesia View Post
    Wow, a dual 460mm^2 GPU card. While it might or might not take the performance crown, it sure would be one heck of a power hog and heat generator at the same time.
    That's excellent news! I can replace my space heater.

    It gets cold here in Canada.
    E8400 @ 4.0 | ASUS P5Q-E P45 | 4GB Mushkin Redline DDR2-1000 | WD SE16 640GB | HD4870 ASUS Top | Antec 300 | Noctua & Thermalright Cool
    Windows 7 Professional x64


    Vista & Seven Tweaks, Tips, and Tutorials: http://www.vistax64.com/

    Game's running choppy? See: http://www.tweakguides.com/

  18. #43
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by fragmasterMax View Post
    I have no idea what kind of crazy INQ ish you are smoking, but a GTX 350 would absolutely dominate the 4870X2. Why? The bandwidth of a GTX 280 with GDDR3 is already more than a 4870/4870X2 with GDDR5, due to the 448/512-bit bus. Put two of those together with some GDDR5 and you will have a card that will leave the 4870X2 face down in the mud. (The power consumption at load would be high, but it would be 55nm, so I doubt it would be that much more than the 55nm 4870X2, which runs hotter than the two 65nm G92s in a 9800GX2.)
    Are you talking about a card with a 1024-bit bus (2x 512-bit) + 2GB of 4.0GHz GDDR5 + 2x 460mm^2 GPUs + NVIO?

    That's simply crazy.
    That would be nearly impossible to make, and even if it were possible it would cost $800++.
    That would be an even bigger shot in the foot than GT200 was.
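    Just to put a number on the configuration being quoted (taking the 2x 512-bit buses and 4.0 Gbps GDDR5 from the post above as givens):

        # Hypothetical per-GPU bandwidth: 512-bit bus at 4.0 Gbps per pin.
        bytes_per_transfer = 512 / 8              # 64 bytes across the bus per transfer
        per_gpu_gbs = bytes_per_transfer * 4.0    # 4.0 GT/s -> 256 GB/s per GPU
        print(per_gpu_gbs, "GB/s per GPU (each GPU has its own bus, so ~512 GB/s combined)")

    That is more than double the aggregate bandwidth of a 4870X2, which is part of why the cost and board complexity look so questionable.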
    Last edited by v_rr; 10-09-2008 at 07:19 AM.
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufactor due to stolen technology and making clones.

  19. #44
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    Quote Originally Posted by Tonucci View Post
    Let's just say that I'm glad to be able to communicate in 4 different languages besides my native one. Content is what matters.

    Your sig speaks volumes about your immature self. From now on, your posts won't be displayed on my screen anymore.
    Thanks for showing me who you really are. I'm more than glad to block you (the first user I've ever blocked on any forums)! Go, you king of maturity! After all, what are you doing here on the forums with your rotten attitude, failing to see the humor? Aren't you too mature to talk about GTX 290 rumors in the first place? I reported you because of your rudeness.
    Last edited by Bo_Fox; 10-09-2008 at 07:28 AM.

    --two awesome rigs, wildly customized with
    5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
    --SONY GDM-FW900 24" widescreen CRT, overclocked to:
    2560x1600 resolution at 68Hz!(from 2304x1440@80Hz)

    Updated List of Video Card GPU Voodoopower Ratings!!!!!

  20. #45
    Xtreme Enthusiast
    Join Date
    Dec 2007
    Location
    Boston
    Posts
    553
    Quote Originally Posted by B.E.E.F. View Post
    That's excellent news! I can replace my space heater.

    It gets cold here in Canada.
    What about in the summertime?
    How much is it per kWh where you live?
    Man, I wish we would take you guys' lead and build some ultra-efficient heavy-water reactors.
    As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
    "I am become Death, the destroyer of worlds."
    Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki

  21. #46
    Xtreme Enthusiast
    Join Date
    Dec 2007
    Location
    Boston
    Posts
    553
    Quote Originally Posted by v_rr View Post
    Are you talking about a card with a 1024-bit bus (2x 512-bit) + 2GB of 4.0GHz GDDR5 + 2x 460mm^2 GPUs + NVIO?

    That's simply crazy.
    That would be nearly impossible to make, and even if it were possible it would cost $800++.
    That would be an even bigger shot in the foot than GT200 was.
    What's crazy is that the 256-bit 55nm 4870 uses more than 50% more power than a 448-bit 65nm GTX 260, both cards at idle.

    WHAT'S CRAZY is that the 55nm 256-bit 4870 uses more power than a 448-bit 65nm GTX 260 at load, and you are all thinking it would be IMPOSSIBLE to put two die-shrunk 55nm GTX 280s on a single card.
    :ROFL:
    Stop being tards.
    Mark my words: in a few months the 4870X2 will be old news, like the 9800GX2 is now.

    http://www.techreport.com/r.x/gtx260...power-idle.gif
    Last edited by fragmasterMax; 10-09-2008 at 07:30 AM.
    As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
    "I am become Death, the destroyer of worlds."
    Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki

  22. #47
    Xtreme Enthusiast
    Join Date
    Jul 2004
    Posts
    535
    Quote Originally Posted by fragmasterMax View Post
    WHAT'S CRAZY is that the 55nm 256-bit 4870 uses more power than a 448-bit 65nm GTX 260 at load, and you are all thinking it would be IMPOSSIBLE to put two die-shrunk 55nm GTX 280s on a single card.
    :ROFL:
    Stop being tards.
    What people are doubting is that they can put it on a *dual PCB* product. If they were taking the ATI approach, I doubt anyone would be raising issues.

  23. #48
    Xtreme Enthusiast
    Join Date
    Dec 2007
    Location
    Boston
    Posts
    553
    Quote Originally Posted by Bo_Fox View Post
    Thanks for showing me who you really are. I'm more than glad to block you (the first user I've ever blocked on any forums)! Go, you king of maturity! After all, what are you doing here on the forums with your rotten attitude, failing to see the humor? Aren't you too mature to talk about GTX 290 rumors in the first place? I reported you because of your rudeness.
    Rofl
    You guys are silly!
    There are a lot of misconceptions on this forum. I'm not trying to come off as an Nvidia fanboy, but the fact is the only thing that differentiates the 4870 from the 4850 (8800GTS 512 performance) is GDDR5, which Nvidia will soon get.
    When it does, the tables will be turned.
    As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
    "I am become Death, the destroyer of worlds."
    Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki

  24. #49
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Posts
    516
    Apparently the "new" 260 is not as much crap as had been thought.

    http://techreport.com/articles.x/15651

    Despite the fact that these are tremendously complex chips with hundreds of millions of transistors, AMD and Nvidia have achieved a remarkable amount of parity in their GPUs. In terms of image quality, overall features, performance, and even price, the Radeon HD 4870 1GB and the GeForce GTX 260 "Reloaded" are practically interchangeable. That fact represents something of a comeback for Nvidia, since the older GTX 260 cost more than the 4870 and didn't perform quite as well. If anything, the GTX 260 Reloaded was a smidgen faster than the 4870 1GB overall in our test suite.

    The GTX 260 is based on a much larger chip with a wider path to memory, which almost certainly means it costs more to make than the 4870, but as a consumer, you'd never know it when using the two products, so I'm not sure it matters much for our purposes. Even the GTX 260's power consumption is lower than the 4870's, and its noise levels are comparable.
    This is quite the reversal for TechReport compared to their earlier RV770 vs GT200 reviews. If the new 260 is capable of this, then the new 55nm versions may actually be worthwhile.

  25. #50
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Stop spreading lies, fanboy:


    Quote Originally Posted by fragmasterMax View Post
    What's crazy is that the 256-bit 55nm 4870 uses more than 50% more power than a 448-bit 65nm GTX 260, both cards at idle.
    The 1GB HD 4870 uses 10% more power at idle than the GTX 260:



    Quote Originally Posted by fragmasterMax
    WHAT'S CRAZY is that the 55nm 256-bit 4870 uses more power than a 448-bit 65nm GTX 260 at load
    The 1GB HD 4870 uses less power at load than the GTX 260:


    http://www.bit-tech.net/hardware/200...tx-sli-pack/11
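    Part of why these percentages can disagree is that such charts typically measure total system power at the wall, so the same card-level gap shows up as a much smaller percentage. A sketch with purely made-up numbers (not taken from either linked chart):

        # Hypothetical illustration only -- these watt figures are invented.
        card_a_idle, card_b_idle = 30.0, 48.0      # card-only idle draw (W)
        rest_of_system = 120.0                     # the rest of the test rig (W)

        card_only_pct = (card_b_idle - card_a_idle) / card_a_idle * 100
        at_wall_pct = (card_b_idle - card_a_idle) / (card_a_idle + rest_of_system) * 100
        print(f"card-only: +{card_only_pct:.0f}%   at the wall: +{at_wall_pct:.0f}%")
        # The same 18 W gap reads as +60% card-only but only +12% at the wall.

    So a ">50% more at idle" reading and a "10% more at idle" reading are not necessarily from contradictory data; they can come from measuring at different points.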
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufactor due to stolen technology and making clones.
