
Thread: Nvidia 270, 290 and GX2 roll out in November

  1. #51
    Xtreme Enthusiast
    Join Date
    Dec 2007
    Location
    Boston
    Posts
    553
    Quote Originally Posted by hurleybird View Post
What people are doubting is that they can put it on a *dual PCB* product. If they were taking the ATI approach, I doubt anyone would be raising issues.
So, can you come up with some empirical data (or anecdotal evidence from a good source) showing that making a dual-PCB dual-GPU card is harder than just using a single long card?
Does it really matter to you guys?!
This reminds me of when ATI said they had the first true quad-core, because all four cores were on one die. (Yet it was destroyed by the Q6600, and to this day hasn't changed much, except for stepping.)
Who cares!?
What matters is PERFORMANCE!
    Last edited by fragmasterMax; 10-09-2008 at 07:40 AM.
    As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
    "I am become Death, the destroyer of worlds."
    Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki

  2. #52
    Xtreme Enthusiast
    Join Date
    Dec 2007
    Location
    Boston
    Posts
    553
    Quote Originally Posted by v_rr View Post
Stop spreading lies, fanboy:

take a look at the review link I just posted

    http://www.techreport.com/r.x/gtx260...power-idle.gif
    :rOFL: !!

That idle figure is with Windows Aero enabled, which probably taxes the GPU to some slight extent, driving up power consumption.
    Last edited by fragmasterMax; 10-09-2008 at 07:43 AM.
    As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
    "I am become Death, the destroyer of worlds."
    Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki

  3. #53
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by Periander6 View Post
    Apparently the "new" 260 is not as much crap as had been thought.

This is quite the reversal from TechReport compared to their earlier RV770 vs GT200 reviews. If the new 260 is capable of this, then the new 55nm versions may actually be worthwhile.
I recommend reading other reviews with different opinions:
    AMD's ATI Radeon HD 4870 with 1GB of GDDR5
    http://enthusiast.hardocp.com/articl...50aHVzaWFzdA==

    The Radeon HD 4870 1GB: The Card to Get
    http://www.anandtech.com/video/showdoc.aspx?i=3415

And the new GTX 260 core's obviously low sales:

    Nvidia’s Partners Reluctant to Adopt New Flavour of a High-End Chip.
    Graphics Card Makers Hesitant to Produce GeForce GTX 260-216


Nvidia Corp. recently released an improved version of its high-end graphics processing unit (GPU) in an attempt to offer a product that would be indisputably better than the offering from arch-rival ATI, the graphics product group of Advanced Micro Devices. While the new core does have advantages over the previous one, many leading-edge manufacturers of graphics cards have decided to stick with the old one for a while.

The world's most influential supplier of discrete graphics chips recently released an improved version of its GeForce GTX 260 graphics card that features 216 stream processing units, a substantial increase compared to the GeForce GTX 260 with 192 stream processors available earlier. The attempt was made in order to stop the invasion of ATI Radeon 4870 graphics cards into the higher-end market. Nvidia has even maintained the price of the new model at the same level as the less powerful model 260: $299 a card.

But the attempt was not successful, it seems: many of the largest suppliers of Nvidia GeForce-based graphics cards, including, but not limited to, Asustek Computer, Gainward, MicroStar International and Leadtek Research, still do not sell graphics cards powered by the so-called GeForce GTX 260-216.

According to market sources, many companies “just have too many” GeForce GTX 260-192 graphics cards available in stock. This seems to be correct, as Nvidia has been aggressively advertising the GeForce brand along with the new GeForce GTX 200-series graphics cards in recent weeks, and also pushing the older graphics cards into the hands of its partners and into the channel.

But graphics card makers have other arguments regarding the low popularity of the novelty: while the new model GTX 260-216 has higher computing power than its predecessor, it still has the same number of other fixed-function execution units (e.g. texture processors and render back-ends) and the same 448-bit memory bus as the predecessor. According to suppliers of graphics cards, the actual performance “improvement is just not large enough” to drive huge demand for the new GeForce GTX 260-216.

Despite the fact that Nvidia has revealed an improved version of the GeForce GTX 260, many graphics card vendors see that ATI Radeon HD 4850 and 4870 graphics cards are still very popular, which is why the transition to improved flavours of the Nvidia GeForce GTX 260 cards is likely to be slow. But it is, obviously, inevitable, which is a reason for ATI to worry.

The companies mentioned made no comments for this news story.
    http://www.xbitlabs.com/news/video/d..._End_Chip.html
    Last edited by v_rr; 10-09-2008 at 07:44 AM.
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufactor due to stolen technology and making clones.

  4. #54
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    Quote Originally Posted by fragmasterMax View Post
Rofl
You guys are silly!
There are a lot of misconceptions on this forum. I'm not trying to come off as an Nvidia fanboy, but the fact is the only thing that differentiates the 4870 from the 4850 (8800 GTS 512 performance) is GDDR5, which Nvidia will soon get.
When it does, the tables will be turned.
    LOL...

True, Nvidia just needs to do it on a single PCB, but those chips are still huge... that's the dilemma! Nobody wants a 12" long PCB, right?

Hey, the 4870 has a 125MHz core clock gain over the 4850, so it's not just the memory...

    --two awesome rigs, wildly customized with
    5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
    --SONY GDM-FW900 24" widescreen CRT, overclocked to:
    2560x1600 resolution at 68Hz!(from 2304x1440@80Hz)

    Updated List of Video Card GPU Voodoopower Ratings!!!!!

  5. #55
    Xtreme Member
    Join Date
    Nov 2004
    Location
    Germany, Munich
    Posts
    201
    Quote Originally Posted by hurleybird View Post
What people are doubting is that they can put it on a *dual PCB* product. If they were taking the ATI approach, I doubt anyone would be raising issues.
    This.
    And some people should realize that there won't be GDDR5 on Nvidia cards until the new generation rolls out.
Also, there's a so-called edit button; no need for double/triple/quadruple posts here, especially when it comes to ROFLBOBBLING and flaming other members~
    Q6600 G0 | 4GB Mushkin | HD 4890 | X-Fi Fatal1ty
    Canon EOS 7D | 1000D | 70-200 F4 | Sigma 50 1.4 | 100 2.8 L IS Macro | 18-55 IS

  6. #56
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Toronto, Canada
    Posts
    1,491
    Quote Originally Posted by fragmasterMax View Post
take a look at the review link I just posted

    http://www.techreport.com/r.x/gtx260...power-idle.gif
    :rOFL: !!
    Where's the 50% more usage in that image?
    RIG 1 (in progress):
    Core i7 920 @ 3GHz 1.17v (WIP) / EVGA X58 Classified 3X SLI / Crucial D9JNL 3x2GB @ 1430 7-7-7-20 1T 1.65v
    Corsair HX1000 / EVGA GTX 295 SLI / X-FI Titanium FATAL1TY Pro / Samsung SyncMaster 245b 24" / MM H2GO
    2x X25-M 80GB (RAID0) + Caviar 500 GB / Windows 7 Ultimate x64 RC1 Build 7100

    RIG 2:
    E4500 @ 3.0 / Asus P5Q / 4x1 GB DDR2-667
    CoolerMaster Extreme Power / BFG 9800 GT OC / LG 22"
    Antec Ninehundred / Onboard Sound / TRUE / Vista 32

  7. #57
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Quote Originally Posted by fragmasterMax View Post
What's crazy is that the 256-bit 55nm 4870 uses more than 50% more power than a 448-bit 65nm GTX 260, both cards at idle.
Do you know why? On the 4870, the GDDR5 doesn't reduce speed at idle; it remains at 900MHz. If you have ever had a card with GDDR5, you'd know that it consumes a buttload of power at idle if the frequencies are not lowered. Example: at idle, dropping from 900MHz to 300MHz does almost nothing if you're talking about GDDR3, but with GDDR5 you're talking about 40+ watts. You know why the 4870 doesn't clock the RAM down to 500MHz at idle? The screen flickers when the change is made, and unfortunately 2D/3D detection on the 4870 sucks ass, so they can't lower it. The 2D/3D detection is changed on the 4870X2, so they can clock the GDDR5 to 500MHz without flickering at undesired moments. That's why you see the 4870X2 consuming only a little bit more than a single 4870 at idle, even having twice the RAM chips and two RV770s. That's also why you see the tremendous difference between the 4870 and 4850 at idle.

Be careful with what you say about GDDR5: it's fast, but it sucks power like there's no tomorrow. And that's precisely what you don't want in a GX2 card.
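To put rough numbers on it, here's a first-order sketch in Python. It assumes dynamic power scales roughly linearly with clock at fixed voltage (P ~ C·V²·f), and the 40W figure is just the ballpark claimed above, not a datasheet value:

Code:
# First-order sketch of the idle-clock argument: dynamic power scales
# roughly linearly with frequency at fixed voltage (P ~ C * V^2 * f).
# The "40+ W" gap between 300MHz and 900MHz is the ballpark claimed
# above, not a datasheet value.
delta_watts = 40.0                          # W between 300MHz and 900MHz idle
watts_per_mhz = delta_watts / (900 - 300)   # ~0.067 W/MHz for this GDDR5 setup

# Estimated saving if the 4870 could idle its GDDR5 at 500MHz like the X2:
savings_at_500 = watts_per_mhz * (900 - 500)
print(f"~{savings_at_500:.0f} W saved by idling the memory at 500MHz")  # ~27 W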
    Last edited by STaRGaZeR; 10-09-2008 at 07:50 AM.
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.

  8. #58
    Diablo 3! Who's Excited?
    Join Date
    May 2005
    Location
    Boulder, Colorado
    Posts
    9,412
    Quote Originally Posted by fragmasterMax View Post
So, can you come up with some empirical data (or anecdotal evidence from a good source) showing that making a dual-PCB dual-GPU card is harder than just using a single long card?
Does it really matter to you guys?!
This reminds me of when ATI said they had the first true quad-core, because all four cores were on one die. (Yet it was destroyed by the Q6600, and to this day hasn't changed much, except for stepping.)
Who cares!?
What matters is PERFORMANCE!
Uhh, no... cooling a single-card dual-GPU solution is much easier than cooling a sandwich like the 9800GX2. With a single-card dual-GPU setup you can use off-the-shelf waterblocks; with a GX2 you need a custom block that's useless after that card is sold. Bring on the 12" graphics cards

  9. #59
    Xtreme Addict
    Join Date
    May 2008
    Posts
    1,192
From the INQ article:

    (2) We are, of course, making this all up.
    Quote Originally Posted by alacheesu View Post
    If you were consistently able to put two pieces of lego together when you were a kid, you should have no trouble replacing the pump top.

  10. #60
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by zlojack View Post
    Where's the 50% more usage in that image?
It isn't there, and it isn't in any review. It's just trolling around the forum.

Also, that review is just lame. -> http://www.techreport.com/articles.x/15651/1
They compare an overclocked GTX 260 Core 216 with a stock, factory-clocked HD 4870 1GB.
Why didn't they pick a stock GTX 260 Core 216, or an overclocked HD 4870 1GB??


The AMP²! has a 649MHz GPU core, 1404MHz shaders, and 896MB of GDDR3 memory at 1053MHz, up from 576/1242/999MHz on the first wave of GTX 260 cards. Those clock speeds are also, I should note, higher than the stock clocks for the GeForce GTX 280, which are 602/1296/1107MHz.
Why didn't they pick this HD 4870 1GB:

    Powercolor HD4870 1GB PCS+ Edition
    Graphics Engine RADEON HD4870
    Video Memory 1GB GDDR5
    Engine Clock 800MHz
Memory Clock 925MHz (3.7Gbps)


Simple: money talks....
Anandtech and HardOCP picked stock cards for a decent review.
It's unfair to compare a 17% overclocked GTX 260 to a stock-clocked 4870.
    Last edited by v_rr; 10-09-2008 at 07:57 AM.
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufactor due to stolen technology and making clones.

  11. #61
    Xtreme Enthusiast
    Join Date
    Oct 2006
    Posts
    658
Another anti-nVidia piece by TheINQ, what's new... :p

    Quote Originally Posted by Charlie 'Dancing in the Aisles' Demerjian
    If you are underwhelmed, then the dualie card is for you. We haven't seen a code name for it officially, but NV AIBs are talking about it. Take a 55nm GT200b/206 and put two PCBs together a la the 9800GX2, and you get the idea. There is one minor problem this time... heat.

    The G92 (8800GT/9800GTX/GT15x) was coolable, barely, with a single slot cooler. The GT200 is not. Even with a theoretical 20 per cent lower power draw, you would be at 290W for a dual 55nm 260 clone. If you use a 260-216 or jack the clock up, you are at 300+W in an instant, and we can see 350W without trying hard.
    I call BS on this. Heat doesn't seem to be much of an issue for the 450W 4870X2:
    http://www.anandtech.com/video/showdoc.aspx?i=3415&p=9



So with the touted theoretical 20% lower power draw from 55nm, we get ~250W for a single GTX280. Since an 'X2' card doesn't double power consumption (4870 = 280W, 4870X2 = 450W, so a ~1.6x gain in power consumption), a 280GX2 could conceivably run COOLER than the 4870X2 whilst easily outperforming it.
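A quick Python sketch of that scaling argument, taking the quoted figures at face value (note they are total system draws at the wall, not card-only numbers, as pointed out a few posts down):

Code:
# Back-of-the-envelope for the GX2 plausibility argument, using the
# system-level draws quoted above (not card-only numbers).
hd4870_sys   = 280.0   # W at the wall, single HD 4870
hd4870x2_sys = 450.0   # W at the wall, HD 4870 X2
x2_scaling = hd4870x2_sys / hd4870_sys      # ~1.61x, not 2x

gtx280_55nm_est = 250.0                     # W, the ~20%-lower 55nm estimate
gx2_estimate = gtx280_55nm_est * x2_scaling # ~402 W hypothetical "280GX2"

print(f"X2 scaling: {x2_scaling:.2f}x")
print(f"Estimated 280GX2 draw: {gx2_estimate:.0f} W "
      f"vs {hd4870x2_sys:.0f} W for the 4870X2")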

  12. #62
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Quote Originally Posted by fragmasterMax View Post
Rofl
You guys are silly!
There are a lot of misconceptions on this forum. I'm not trying to come off as an Nvidia fanboy, but the fact is the only thing that differentiates the 4870 from the 4850 (8800 GTS 512 performance) is GDDR5, which Nvidia will soon get.
When it does, the tables will be turned.

You might claim not to be an nVidia fanboy, but your deductions are still pitiful. No changing that.
    -----
SLIed GTX 260s still won't be anywhere close to a decisive win. Nothing will be. For a $500+ card, you'd expect to play at 8x AA with max settings, and obviously at 8x there is no competition between the 260/260-216 and the 4870.
    Last edited by Macadamia; 10-09-2008 at 07:59 AM.
    Quote Originally Posted by radaja View Post
    so are they launching BD soon or a comic book?

  13. #63
    Xtreme Enthusiast
    Join Date
    Dec 2007
    Location
    Boston
    Posts
    553
    Quote Originally Posted by v_rr View Post
It isn't there, and it isn't in any review. It's just trolling around the forum.

Also, that review is just lame. -> http://www.techreport.com/articles.x/15651/1
They compare an overclocked GTX 260 Core 216 with a stock, factory-clocked HD 4870 1GB.
Why didn't they pick a stock GTX 260 Core 216, or an overclocked HD 4870 1GB??




Why didn't they pick this HD 4870 1GB:

    Powercolor HD4870 1GB PCS+ Edition




Simple: money talks....
Anandtech and HardOCP picked stock cards for a decent review.
dude
wow
This is a throwback to third grade, when everybody thought they had the best everything and couldn't do simple math.
GTX 260: 114W
4870: 166W
Hmmm: 114/2 = 57, and 114 + 57 = 171W (what the power consumption would have to be to be 50% more).
The power consumption of the 4870 is 166W, a bit less than 50% more than the GTX 260.
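The same check in Python, for anyone who wants to follow the arithmetic (the 114W and 166W idle figures are the ones quoted from the TechReport chart, not independently verified):

Code:
# Checking the "50% more idle power" claim against the chart's numbers.
gtx260_idle = 114.0   # W at idle, GTX 260 (TechReport chart)
hd4870_idle = 166.0   # W at idle, HD 4870 (TechReport chart)

fifty_percent_more = gtx260_idle * 1.5                  # 171 W threshold
actual_gap_pct = (hd4870_idle / gtx260_idle - 1) * 100  # ~45.6%

print(f"50% more than {gtx260_idle:.0f} W would be {fifty_percent_more:.0f} W")
print(f"The 4870 actually draws {actual_gap_pct:.1f}% more at idle")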
    As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
    "I am become Death, the destroyer of worlds."
    Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki

  14. #64
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    Quote Originally Posted by Tonucci View Post
Same here, I suspected it was theinq after the "NV is in deep doo-doo right now..." part. So I checked the "source" and stopped reading.

The initial GTX280 price was due to the absence of something faster at the time... aimed at early adopters. Nvidia probably knew ATI was going to launch a dual-GPU card, and likely expected/planned the price drops, just as it happened. I may or may not be wrong, but theinq's stance/interpretation of facts annoys the hell out of me. Very subjective and unprofessional.
All right, you didn't get my humor, and reacted bitterly to it...

Well, here's a "mature" reply, hopefully to bring you back to contentedness...

    My take is that Nvidia has been working on the GT200 architecture for quite a while. After easily dominating the 3870 series from ATI, Nvidia was almost 100% sure that the 4870 could not be any more than 50% faster than the 3870 on the same 55nm process, according to the rumors. The 4870 actually ended up being closer to 100% faster than the 3870, and actually got rid of the AA "bug". That was an amazing feat by ATI, given that the die size of the chip increased by only like 30% or something. And ATI still managed to make an X2 version of it.

    All this time, Nvidia was set on doing a repeat of the 8800GTX profit rake, selling large quantities of GTX 280 at $650 a pop. Well, Nvidia did not have the slightest clue that the 4870 would be just as fast as the huge GTX260 until the very last minute.

    You had good logic there, but I think mine is better, no offense...

    The INQ was never a professional site. It was always a rumor site, and sometimes had spot-on rumors with accurate stats. I always found it an amusing read, for the humor.. hey, learn the humor in English, at least!!! :P

    --two awesome rigs, wildly customized with
    5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
    --SONY GDM-FW900 24" widescreen CRT, overclocked to:
    2560x1600 resolution at 68Hz!(from 2304x1440@80Hz)

    Updated List of Video Card GPU Voodoopower Ratings!!!!!

  15. #65
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by fragmasterMax View Post
dude
wow
This is a throwback to third grade, when everybody thought they had the best everything and couldn't do simple math.
GTX 260: 114W
4870: 166W
Hmmm: 114/2 = 57, and 114 + 57 = 171W (what the power consumption would have to be to be 50% more).
The power consumption of the 4870 is 166W, a bit less than 50% more than the GTX 260.
You are picking a biased review that puts a heavily overclocked GTX 260 head to head against a stock HD 4870 1GB.
Those power consumption numbers are off too.

Use this review instead, please:
http://www.bit-tech.net/hardware/200...tx-sli-pack/11

The difference at idle between the GTX 260 and HD 4870 1GB is 10%:
    http://www.xtremesystems.org/forums/...6&postcount=52
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufactor due to stolen technology and making clones.

  16. #66
    Xtreme Enthusiast
    Join Date
    Dec 2007
    Location
    Boston
    Posts
    553
    Quote Originally Posted by STaRGaZeR View Post
Do you know why? On the 4870, the GDDR5 doesn't reduce speed at idle; it remains at 900MHz. If you have ever had a card with GDDR5, you'd know that it consumes a buttload of power at idle if the frequencies are not lowered. Example: at idle, dropping from 900MHz to 300MHz does almost nothing if you're talking about GDDR3, but with GDDR5 you're talking about 40+ watts. You know why the 4870 doesn't clock the RAM down to 500MHz at idle? The screen flickers when the change is made, and unfortunately 2D/3D detection on the 4870 sucks ass, so they can't lower it. The 2D/3D detection is changed on the 4870X2, so they can clock the GDDR5 to 500MHz without flickering at undesired moments. That's why you see the 4870X2 consuming only a little bit more than a single 4870 at idle, even having twice the RAM chips and two RV770s. That's also why you see the tremendous difference between the 4870 and 4850 at idle.

Be careful with what you say about GDDR5: it's fast, but it sucks power like there's no tomorrow. And that's precisely what you don't want in a GX2 card.
Unless you show me some data sheets, I am going to take that with a mound of salt.
According to this, a 4870 with 1GB of GDDR5 uses only ~20 more watts than a 4850. The 512MB 4870 uses 36 more watts. I think most of the difference in power consumption is due to the 125MHz core clock gain, which bofox already mentioned. Show me a 4850 at 4870 clocks and then let's do a comparison.

    http://www.techreport.com/r.x/gtx260...power-idle.gif
    As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
    "I am become Death, the destroyer of worlds."
    Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki

  17. #67
    Xtreme Enthusiast
    Join Date
    Oct 2006
    Posts
    658
    Quote Originally Posted by v_rr View Post
You are picking a biased review that puts a heavily overclocked GTX 260 head to head against a stock HD 4870 1GB.
Those power consumption numbers are off too.

Use this review instead, please:
http://www.bit-tech.net/hardware/200...tx-sli-pack/11

The difference at idle between the GTX 260 and HD 4870 1GB is 10%:
    http://www.xtremesystems.org/forums/...6&postcount=52
    What about Anandtech's numbers?

    http://www.anandtech.com/video/showdoc.aspx?i=3415&p=9

  18. #68
    Xtreme Addict
    Join Date
    Dec 2005
    Posts
    1,035
    Quote Originally Posted by Epsilon84 View Post
Another anti-nVidia piece by TheINQ, what's new... :p



    I call BS on this. Heat doesn't seem to be much of an issue for the 450W 4870X2:
    http://www.anandtech.com/video/showdoc.aspx?i=3415&p=9



So with the touted theoretical 20% lower power draw from 55nm, we get ~250W for a single GTX280. Since an 'X2' card doesn't double power consumption (4870 = 280W, 4870X2 = 450W, so a ~1.6x gain in power consumption), a 280GX2 could conceivably run COOLER than the 4870X2 whilst easily outperforming it.
450W is the system consumption, but yeah, even though the X2 uses a lot of power (I'd estimate somewhere around 300W), the stock cooler seems able to take care of it well.

  19. #69
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by Epsilon84 View Post
22% difference, and still the HD 4870 1GB gets recommended for superior performance in that review.
We can average the two reviews:
(10% + 22%) / 2 = 16%
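Spelled out (with the caveat that the two deltas come from different test systems, so a simple mean is only a rough indicator):

Code:
# Naive average of the two reviews' idle-power gaps. The two figures
# were measured on different test systems, so treat the mean loosely.
bittech_gap   = 0.10   # bit-tech idle difference
anandtech_gap = 0.22   # Anandtech idle difference
print(f"mean gap: {(bittech_gap + anandtech_gap) / 2:.0%}")  # 16%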
    Last edited by v_rr; 10-09-2008 at 08:10 AM.
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufactor due to stolen technology and making clones.

  20. #70
    Xtreme Enthusiast
    Join Date
    Dec 2007
    Location
    Boston
    Posts
    553
    Quote Originally Posted by v_rr View Post
You are picking a biased review that puts a heavily overclocked GTX 260 head to head against a stock HD 4870 1GB.
Those power consumption numbers are off too.

Use this review instead, please:
http://www.bit-tech.net/hardware/200...tx-sli-pack/11

The difference at idle between the GTX 260 and HD 4870 1GB is 10%:
    http://www.xtremesystems.org/forums/...6&postcount=52

So it's biased because it's right?
Having Vista Aero on would not be considered idle by most people.
Sorry.
And btw, if the GTX 260 is so heavily overclocked, why does it still use so much less power than a 4870 at idle?

Speaking of biased, skewed reviews, look at this:

    "For our idle testing, we left the cards idling on the desktop for ten minutes, recording the average draw at the wall socket. For load testing, we used our benchmark routine from Crysis in DirectX 10 mode and measured the peak power consumption throughout the benchmark. We tested the cards in a number of other scenarios and this proved to be the most intensive in all cases, so you can consider this to be a worst-case scenario."

In other words, they couldn't get their Kill A Watt meter to read an average over a time period; they just recorded the maximum being drawn from the wall. Real professional...
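For contrast, here is roughly what reporting both numbers from a wall-socket log would look like; the samples below are made up purely for illustration:

Code:
# Average vs. peak from a series of wall-socket power samples.
# A benchmark with brief spikes shows a much higher peak than its
# sustained average, which is exactly the gap complained about above.
samples = [168, 172, 236, 310, 298, 305, 189, 301]  # W, hypothetical 1Hz log

average_draw = sum(samples) / len(samples)
peak_draw = max(samples)
print(f"average: {average_draw:.0f} W, peak: {peak_draw} W")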
    Last edited by fragmasterMax; 10-09-2008 at 08:15 AM.
    As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
    "I am become Death, the destroyer of worlds."
    Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki

  21. #71
    Xtreme Enthusiast
    Join Date
    Oct 2006
    Posts
    658
    Quote Originally Posted by Tonucci View Post
450W is the system consumption, but yeah, even though the X2 uses a lot of power (I'd estimate somewhere around 300W), the stock cooler seems able to take care of it well.
Yeah, my mistake, but if it's system power consumption, then that makes a 280GX2 even more plausible, doesn't it? If a GTX280 SLI setup only draws 55W more than a 4870X2, then surely a 55nm 280GX2 should get power consumption down to 4870X2 levels at the very least, even if nVidia uses the 'double cheeseburger' approach for the GX2.

  22. #72
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Puerto Rico
    Posts
    1,374
    Quote Originally Posted by fragmasterMax View Post
Rofl
You guys are silly!
There are a lot of misconceptions on this forum. I'm not trying to come off as an Nvidia fanboy, but the fact is the only thing that differentiates the 4870 from the 4850 (8800 GTS 512 performance) is GDDR5, which Nvidia will soon get.
When it does, the tables will be turned.
Hmm, imo Nvidia is gonna keep using GDDR3. Why? Because they can achieve the same bandwidth either way, using either 512-bit @ 2200MHz GDDR3 or 256-bit @ 4400MHz GDDR5 for the GTX280, for example. But in this case GDDR3 has lower latencies than GDDR5, which is better, right? Also, GDDR3 is cheaper and has a lot more availability.

Please correct me if I'm wrong...
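A quick sanity check of the equal-bandwidth claim (peak bandwidth = bus width in bytes × effective data rate), using the two configurations mentioned above:

Code:
# Peak memory bandwidth for the two configurations mentioned above.
def bandwidth_gbs(bus_width_bits: int, effective_mtps: float) -> float:
    """Bytes per transfer times transfers per second, in GB/s."""
    return bus_width_bits / 8 * effective_mtps / 1000

print(f"512-bit GDDR3 @ 2200MT/s: {bandwidth_gbs(512, 2200):.1f} GB/s")  # 140.8
print(f"256-bit GDDR5 @ 4400MT/s: {bandwidth_gbs(256, 4400):.1f} GB/s")  # 140.8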

  23. #73
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by fragmasterMax View Post
And btw, the TechReport link I have been referencing shows both the default GTX 260 and the GTX 260-216, which has a less crippled core; I'm pretty sure they are not overclocked.
The AMP²! has a 649MHz GPU core, 1404MHz shaders, and 896MB of GDDR3 memory at 1053MHz, up from 576/1242/999MHz on the first wave of GTX 260 cards. Those clock speeds are also, I should note, higher than the stock clocks for the GeForce GTX 280, which are 602/1296/1107MHz.
    Taken directly from the review.
It's unfair to compare a 17% overclocked GTX 260 Core 216 to a stock-clocked 4870 1GB.
    Last edited by v_rr; 10-09-2008 at 08:15 AM.
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufactor due to stolen technology and making clones.

  24. #74
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    393
    Quote Originally Posted by fragmasterMax View Post
take a look at the review link I just posted

http://www.techreport.com/r.x/gtx260...power-idle.gif
:rOFL: !!

That idle figure is with Windows Aero enabled, which probably taxes the GPU to some slight extent, driving up power consumption.
... Where is the 50% more power at idle in that link? I love how NV fanboys like to pull numbers out of nowhere.

  25. #75
    Xtreme Addict
    Join Date
    Aug 2008
    Posts
    2,036
What a damned idiot fanboi wrote that pile of crap!

...and people wonder why that place has the reputation it does. It's no wonder, with verbal diarrhea like that coming out of their mouths.

