Page 6 of 10 FirstFirst ... 3456789 ... LastLast
Results 126 to 150 of 227

Thread: Nvidia 270, 290 and GX2 roll out in November

  1. #126
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by Caveman787 View Post
It's true, see any review that tests them overclocked; the GTX series doesn't scale as linearly with overclocking as the 4000 series.

That's why an 850 MHz overclocked 4000-series card matches a GTX 260 at 750 MHz: a 100 MHz overclock compared to a 175 MHz overclock.
So, a 13.3% OC on an HD4870 is equivalent to a 30% overclock on a GTX260.

Would you mind posting links to that please? TY
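The clocks behind those percentages can be sanity-checked in a few lines. A quick sketch, assuming the stock clocks implied by the posts (750 MHz core for the HD 4870, 576 MHz for the GTX 260; the function name is just for illustration):

```python
# Overclock expressed as a percentage over the stock clock.
def oc_percent(stock_mhz, oc_mhz):
    return (oc_mhz - stock_mhz) / stock_mhz * 100

# HD 4870: 750 -> 850 MHz (a 100 MHz bump)
print(f"HD 4870: {oc_percent(750, 850):.1f}%")  # HD 4870: 13.3%
# GTX 260: 576 -> 751 MHz (a 175 MHz bump)
print(f"GTX 260: {oc_percent(576, 751):.1f}%")  # GTX 260: 30.4%
```

The same absolute MHz gap really does translate into very different percentages, which is why raw overclock numbers on the two cards are hard to compare directly.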

    Quote Originally Posted by Miss Banana View Post
I don't get it, why is it a load of crap? The article contains many interesting things that could be considered news, if only one were able to not get upset by someone's opinion about a company (a company that is not your mother, btw).
Many interesting things that can be considered opinions, made by nVidia haters. I don't know of any site bashing ATI and getting their "news" posted here; why is that?
    Are we there yet?

  2. #127
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by Carfax View Post
    Very well, on diminishing returns.

    While diminishing returns is obviously a factor, does that mean we should not push our hardware as high as it can safely go?

    Call it the enthusiast's curse, but most of us here will always try to push our hardware to the max, even if it results in minute performance gains.

Do I need my C2Q at 3.6 GHz, when I game at resolutions and settings that are GPU limited?

No, of course not. But it's at 3.6 GHz anyway, and it will stay there.

    I see your point, but from the perspective of an enthusiast, it's moot.
    ...This was the primary reason I bought my GTX 260. It's a helluva lot more overclockable than the HD 4870...
    ...the GTX 260 overclocks like a bat out of hell...
    ...your HD 4870s achieves a 19% overclock on the core through volt mod.
    My GTX 260 216 achieved a 28% overclock at stock volts and stock cooling..
    etc...
The only real thing that is moot is your overclock comparison between two different video cards with no other information, which is proven in your posts.
Hint: CPU overclocks aren't related to your claims of greater video card overclocks.
    Last edited by Eastcoasthandle; 10-09-2008 at 01:56 PM.

  3. #128
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
Anybody remember this?

Probably bogus, but anyway... I'm hoping that the GTX 290 will be at least 725 MHz core and 1550 MHz shaders. That would instantly earn it 1+ TFLOPS status!

    I'm just hoping that Nvidia did a good job on the 55nm re-spin. It better not be a cut-down, bottlenecked version like the G92 series, with all the focus on a GX2 version.
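The 1+ TFLOPS figure follows from how NVIDIA counts GT200 shader throughput (dual-issue MAD + MUL, i.e. 3 FLOPs per SP per shader clock). A quick check, assuming the GTX 290 keeps the GTX 280's 240 SPs:

```python
# GT200 theoretical single-precision throughput:
# SPs * FLOPs per cycle (MAD + MUL = 3) * shader clock in MHz -> GFLOPS.
def gt200_gflops(sps, shader_mhz, flops_per_cycle=3):
    return sps * flops_per_cycle * shader_mhz / 1000

print(gt200_gflops(240, 1550))  # 1116.0 GFLOPS -> just over 1 TFLOP
print(gt200_gflops(240, 1296))  # 933.12 GFLOPS at the stock GTX 280 shader clock
```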

    Quote Originally Posted by AuDioFreaK39 View Post
I just got a head start and posted it on 6 other forums!


    I stumbled upon a Hardspell article written July 18, 2008 that speaks of a GTX 350 engineering sample with the same specs as listed above:

    HardSpell.com - NVIDIA GTX 350 ES version is ready and the specs revealed?!

We got to know the related news, but we are not so sure about this:

NVIDIA GTX 350
GT300 core
55nm process
576 mm² die
512-bit memory bus
2GB GDDR5 memory, double the GTX 280
480 SPs, double the GTX 280
64 raster operation units, the same as the GTX 280
216 GB/s memory bandwidth
Default clocks 830/2075/3360 MHz
Pixel fill rate 36.3 Gpixels/s
Texture fill rate 84.4 Gtexels/s
Supports D3D 10.0/SM4.0, not D3D 10.1
    http://www.xtremesystems.org/forums/...eply&p=3305160

Silly rumors... eh? Perhaps the "GT300" core is tweaked to integrate support for GDDR5 memory, using only a 256-bit bus per core? The only way this configuration could hold true is if it's a 12-inch long PCB with 3 PCI-E power plugs to supply enough power for those outrageous clocks, plus a powerful dual-slot cooler.
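The "216G bandwidth" line in the rumored specs is at least internally consistent with a 512-bit bus at the listed 3360 MHz effective memory clock. A minimal check:

```python
# Memory bandwidth = bus width in bytes * effective memory clock.
def bandwidth_gb_s(bus_bits, effective_mhz):
    return bus_bits / 8 * effective_mhz / 1000  # GB/s

print(bandwidth_gb_s(512, 3360))  # ~215 GB/s, matching the listed ~216 GB/s
```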

Obviously, you guys love rumors, yet you bash the INQ for being a rumor mill. Otherwise you wouldn't be posting in this thread, which is based on rumors! Ha!!
    Last edited by Bo_Fox; 10-09-2008 at 02:09 PM.

    --two awesome rigs, wildly customized with
    5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
    --SONY GDM-FW900 24" widescreen CRT, overclocked to:
    2560x1600 resolution at 68Hz!(from 2304x1440@80Hz)

    Updated List of Video Card GPU Voodoopower Ratings!!!!!

  4. #129
    Xtreme Member
    Join Date
    Jul 2008
    Location
    Tokyo, Japan
    Posts
    328
    Quote Originally Posted by Luka_Aveiro View Post
Many interesting things that can be considered opinions, made by nVidia haters. I don't know of any site bashing ATI and getting their "news" posted here; why is that?
Hate ATI?
After ATI brought graphics card prices down, I thought even NV fanboys would love them.

  5. #130
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
Quote Originally Posted by Luka_Aveiro View Post
So, a 13.3% OC on an HD4870 is equivalent to a 30% overclock on a GTX260.

Would you mind posting links to that please? TY
In the TechReport test, a 20%+ overclock on the core and shaders barely pulls the GTX 260 216 ahead by 12% at best, and more like 8% on average, compared to a 4870 at stock... Everyone can see it.
    AMD Phenom II X2 550@Phenom II X4 B50
    MSI 890GXM-G65
    Corsair CMX4GX3M2A1600C9 2x2GB
    Sapphire HD 6950 2GB

  6. #131
    Banned
    Join Date
    Jun 2008
    Posts
    250
    Quote Originally Posted by Luka_Aveiro View Post
Many interesting things that can be considered opinions, made by nVidia haters. I don't know of any site bashing ATI and getting their "news" posted here; why is that?
I think there's more than enough bashing going on in all directions, so let's not get into a "no, MY company gets bashed more!" argument.

Charlie obviously has some personal issues with nvidia, and it shows in his articles, often in an entertaining way. You don't have to agree with him, I know I don't, but being upset by his articles? Let's just say there are more important things in the world to get upset about.

  7. #132
    Xtreme Member
    Join Date
    Sep 2008
    Location
    Romania
    Posts
    157
Hehe, Charlie is the biggest troll on the internet, just look at this thread.. And no, he's no ATI fanboy, he's not paid by ATI, he just hates nvidia and he loves to make fanboys angry. Plus his articles give the INQ tons of page hits...
    the state is universally evil, there is no good country only good people

  8. #133
    Xtreme Enthusiast
    Join Date
    Mar 2008
    Location
    Dallas, TX
    Posts
    965
    [...] (2) We are of course, making this all up [...]

This entire article is rendered useless by that single statement. They just spewed a full page of BS slamming nvidia that was a fat pack of lies.

There is no proof, no unbiased opinion, and therefore no validity in this article.


And I'm no fanboi, I love nVidia. But I love the fact that ATi slammed them in the butt this time around; competition drives performance. If nVidia had too much control of the market, their product quality would spin out of control, and this is the result of that.
    "fightoffyourdemons"


  9. #134
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by azraeel101 View Post
Hehe, Charlie is the biggest troll on the internet, just look at this thread.. And no, he's no ATI fanboy, he's not paid by ATI, he just hates nvidia and he loves to make fanboys angry. Plus his articles give the INQ tons of page hits...
I couldn't care less about fanboys or not.

I'm into those things I like to call "news". News, to be news, should be done the opposite way the Inq does theirs...

Any news site that wants to be known as a news site should look at the InQ and say: "hey, I'm gonna do the opposite of what these guys do... I'm going to give quality news!"

All in all, I find the journalism quality presented by the Inq disgusting, even though, I must admit, they sometimes have very interesting personal points of view.

It seems quite clear to me that those who enjoy the Inq's news go along with their "bash the green team" angle, and even they can't take it seriously, because if they did, then I would say "Internet, serious business"...
    Are we there yet?

  10. #135
    Xtreme Member
    Join Date
    Nov 2005
    Location
    Boise, ID
    Posts
    353
    I don't read INQ's articles anymore. I will read the headlines and wait until the same information comes from a more balanced and less biased source. For example ANYWHERE else.

  11. #136
    Muslim Overclocker
    Join Date
    May 2005
    Location
    Canada
    Posts
    2,786
I don't see why everyone is getting all horny over this. It's all pure speculation. spec-u-lation. Here are the facts:

    1- Nvidia is getting screwed because of the price cuts
2- Nvidia needs a die shrink to maintain prices because AIB partners are complaining - this is coming
    3- Nvidia will not be able to use GDDR5 on this GPU, so stop dreaming
    4- Nvidia needs a card to compete in the super high end, and they will do whatever they can to match the 4870x2

    So unless people have something to add to this information, the rest is all speculation.

    I can go ahead and say nVidia's new card will cause a nuclear explosion, and you can't tell me I am wrong

    As far as the article goes, I thought it was a good read.
    Last edited by ahmad; 10-09-2008 at 02:52 PM.

    My watercooling experience

    Water
    Scythe Gentle Typhoons 120mm 1850RPM
    Thermochill PA120.3 Radiator
    Enzotech Sapphire Rev.A CPU Block
    Laing DDC 3.2
    XSPC Dual Pump Reservoir
    Primochill Pro LRT Red 1/2"
    Bitspower fittings + water temp sensor

    Rig
    E8400 | 4GB HyperX PC8500 | Corsair HX620W | ATI HD4870 512MB


    I see what I see, and you see what you see. I can't make you see what I see, but I can tell you what I see is not what you see. Truth is, we see what we want to see, and what we want to see is what those around us see. And what we don't see is... well, conspiracies.



  12. #137
    Xtreme Enthusiast
    Join Date
    Dec 2003
    Location
    UK
    Posts
    567
    I don't understand fanboys/girls.

    I buy whatever I can afford that is decent for what I require. I use a mobile C2D as one of my rigs - this suits me fine as it's quiet and fast.

    I also use a dual Opteron rig because it is great for video encoding, etc.

    I use an Nvidia 8800GT, and will probably buy 2 Ati 4850's for the C2D rig.

    Hell I even mentioned in the standard folding thread I may sell the lot and go AMD X2 along with 4 9600GSO's for folding

    I think all companies do a great job by giving the consumer choice. You want to break WR's you go Intel/Ati. You want value you go AMD. You want to GPU fold you go Nvidia.

  13. #138
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Birmingham AL.
    Posts
    1,079
    Quote Originally Posted by ahmad View Post
I don't see why everyone is getting all horny over this. It's all pure speculation. spec-u-lation. Here are the facts:

    1- Nvidia is getting screwed because of the price cuts
2- Nvidia needs a die shrink to maintain prices because AIB partners are complaining - this is coming
    3- Nvidia will not be able to use GDDR5 on this GPU, so stop dreaming
    4- Nvidia needs a card to compete in the super high end, and they will do whatever they can to match the 4870x2

    So unless people have something to add to this information, the rest is all speculation.

    I can go ahead and say nVidia's new card will cause a nuclear explosion, and you can't tell me I am wrong
How can you say those are facts without factual proof, such as claiming no GDDR5? Don't get me wrong, I speculate there will be no GDDR5 either, but I'm guessing this cannot be proven at this point, so basically you told everyone their speculation is not worth 2 cents but yours are facts.
    Particle's First Rule of Online Technical Discussion:
    As a thread about any computer related subject has its length approach infinity, the likelihood and inevitability of a poorly constructed AMD vs. Intel fight also exponentially increases.

    Rule 1A:
    Likewise, the frequency of a car pseudoanalogy to explain a technical concept increases with thread length. This will make many people chuckle, as computer people are rarely knowledgeable about vehicular mechanics.

    Rule 2:
    When confronted with a post that is contrary to what a poster likes, believes, or most often wants to be correct, the poster will pick out only minor details that are largely irrelevant in an attempt to shut out the conflicting idea. The core of the post will be left alone since it isn't easy to contradict what the person is actually saying.

    Rule 2A:
    When a poster cannot properly refute a post they do not like (as described above), the poster will most likely invent fictitious counter-points and/or begin to attack the other's credibility in feeble ways that are dramatic but irrelevant. Do not underestimate this tactic, as in the online world this will sway many observers. Do not forget: Correctness is decided only by what is said last, the most loudly, or with greatest repetition.

    Remember: When debating online, everyone else is ALWAYS wrong if they do not agree with you!

  14. #139
    Xtreme Addict
    Join Date
    Aug 2008
    Posts
    2,036
Take a guess how many page hits that place has gotten from me? Here's a hint... ZERO. Anybody who hates a product based on a name is a fool, and has no business even being involved with computers. It's what gives the rest of us bad names. It hinders progress, and spreads disreputable info that may lead people to make buying choices they later wish they hadn't. In some cases some OC'er may buy a video card based on "reverse hype" only to find out the fanboi led them down the wrong path, and now they have to buy another to be competitive in gaming, simming, benchmarking, or performance OC'ing. That's no game. People spend big money on these systems, and they pour blood, sweat, and tears into them. They don't wanna hear fanboi BS.

I despise fanbois with a passion. I am the Anti-Fanboi, and as God is my witness, I kid you not. I am so against it I will go to the ends of the earth to rid this planet of them no matter what I have to do, and I will go on campaigns like I do on this forum to negate their slurs and hatred of the computing industry.

    It's really stupid. If they hate things so much with computers why don't they find another hobby? I hate hearing their constant flames and baiting people into threads just like this. I'd love for somebody like that to bring that crap up to my face so I could tell them straight up what I think about it without this keyboard in between.

    You gotta have the IQ of the Operator's number to think like that. Not a lick of sense at all.

You know something? It's not the Inquirer who is really to blame, but they do allow it, so they are guilty by association. They'd get a lot better rep if they changed their attitude and started posting things that are at least believable and have some basis to them, and left the fanboyism out of it. If I wrote crap like that I'd expect to be fired... whether I was a volunteer or I was paid. People like this write this crap, but they are the same type who wouldn't say it to the faces of the people they are writing about. They talk real big from behind a KB, but they are the same ones you see hiding in a corner at a trade show because they are afraid to show their face after the enemies they have made.

It really makes me sick. It really does. People here have some personal bias sometimes. It's human nature, but this place is Xtreme Systems, and people don't post pure trash talk. That's not what this place is about; the Admin Team does not like it, the users do not like it, and this place exists for people to truly build the Xtreme of the Xtreme, or to any level of performance they want, and to get straight info.

It doesn't matter whether it's nVidia or ATI; people who are serious do not wanna see such trash littering our hobby. If the Inquirer wants to be taken seriously, then they need to get serious. They earned the rep they have now; it's up to them if they wanna change it or not. It's not a rep I'd want... that's for sure.
    Last edited by T_Flight; 10-09-2008 at 03:05 PM.

  15. #140
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Birmingham AL.
    Posts
    1,079
Sadly though, it's the money they're after, and not so much the rep..

  16. #141
    Xtreme Member
    Join Date
    Mar 2007
    Location
    Pilipinas
    Posts
    445
ROFL, calm down, it's a tabloid site; they spread rumors for page hits and all that.

  17. #142
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Location
    Test Labs
    Posts
    512
    INQ?

    Take it with a grain of salt.

  18. #143
    Xtreme Enthusiast
    Join Date
    Jul 2007
    Location
    Kuwait
    Posts
    616
    Quote Originally Posted by ahmad View Post
    I can go ahead and say nVidia's new card will cause a nuclear explosion, and you can't tell me I am wrong
Wow buddy, be careful, people like you and me are not allowed to say words like Nuke or explosion on the internet

  19. #144
    Xtreme Enthusiast
    Join Date
    Jul 2004
    Location
    London
    Posts
    577
    Quote Originally Posted by Caveman787 View Post
It's true, see any review that tests them overclocked; the GTX series doesn't scale as linearly with overclocking as the 4000 series.
ATI's architecture overclocks the shaders along with the core, hence lower overclocks in terms of MHz but better scaling overall.

I'd love to see max-overclocked 4870 1GB vs 260 Core 216 benches (yeah, with the shaders overclocked too). I'm sure they'd be dead even, just as they are at stock speeds.
    i7 920@4.34 | Rampage II GENE | 6GB OCZ Reaper 1866 | 8800GT (zzz) | Corsair AX750 | Xonar Essence ST w/ 3x LME49720 | HiFiMAN EF2 Amplifier | Shure SRH840 | EK Supreme HF | Thermochill PA 120.3 | MCP355 | XSPC Reservoir | 3/8" ID Tubing

    Phenom 9950BE @ 3400/2000 (CPU/NB) | Gigabyte MA790GP-DS4H | HD4850 | 4GB Corsair DHX @850 | Corsair TX650W | T.R.U.E Push-Pull

    E2160 @3.06 | ASUS P5K-Pro | BFG 8800GT | 4GB G.Skill @ 1040 | 600W Tt PP

    A64 3000+ @2.87 | DFI-NF4 | 7800 GTX | Patriot 1GB DDR @610 | 550W FSP

  20. #145
    Xtreme Addict
    Join Date
    Jul 2004
    Location
    U.S of freakin' A
    Posts
    1,931
    Quote Originally Posted by SNiiPE_DoGG View Post
... it's not about "inferior" architecture. It's about how the architecture behaves at higher clock speeds.

You keep spewing crap about how it reaches higher frequencies, but the fact of the matter is that the R700 architecture has 800 SPs in a unified design, whereas Nvidia does this vastly differently. A 30 MHz increase on the R700 architecture has a much greater effect than a 30 MHz increase on the GT200 architecture. I don't have numbers on this, but I think everyone who has used both cards can say that this is essentially true.
If you overclock the GTX 260 core alone, then you have a point.

As you know, the shader processors and the core don't run at the same speed, so overclocking the core while leaving the shaders alone won't yield the best performance gains.

Overclocking both the shaders and the core, though, will show the same kind of scaling you see on the R700.

    On my own card, I overclocked the core, the shaders and the memory.
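The core-vs-shader point can be put in rough numbers. A sketch, assuming a GTX 260's stock 1242 MHz shader clock (shader math throughput tracks the shader clock, not the core clock, so a core-only overclock leaves it unchanged):

```python
# Relative shader throughput after an overclock: new clock / stock clock.
def shader_scale(stock_mhz, new_mhz):
    return new_mhz / stock_mhz

print(f"core-only OC:   {shader_scale(1242, 1242):.2f}x")  # 1.00x - shaders untouched
print(f"core+shader OC: {shader_scale(1242, 1512):.2f}x")  # 1.22x theoretical gain
```

Real-game gains will be smaller than the theoretical ratio, since memory bandwidth and the core clock still matter, but it illustrates why linked shader overclocks scale better.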
    Intel Core i7 6900K
    Noctua NH-D15
    Asus X99A II
    32 GB G.Skill TridentZ @ 3400 CL15 CR1
    NVidia Titan Xp
    Creative Sound BlasterX AE-5
    Sennheiser HD-598
    Samsung 960 Pro 1TB
    Western Digital Raptor 600GB
    Asus 12x Blu-Ray Burner
    Sony Optiarc 24x DVD Burner with NEC chipset
    Antec HCP-1200w Power Supply
    Viewsonic XG2703-GS
    Thermaltake Level 10 GT Snow Edition
    Logitech G502 gaming mouse w/Razer Exact Mat
    Logitech G910 mechanical gaming keyboard
    Windows 8 x64 Pro

  21. #146
    Xtreme Addict
    Join Date
    Jul 2004
    Location
    U.S of freakin' A
    Posts
    1,931
    Quote Originally Posted by LightSpeed View Post
ATI's architecture overclocks the shaders along with the core, hence lower overclocks in terms of MHz but better scaling overall.

I'd love to see max-overclocked 4870 1GB vs 260 Core 216 benches (yeah, with the shaders overclocked too). I'm sure they'd be dead even, just as they are at stock speeds.
    Exactly, this is an important distinction between the two architectures.

The R700 will scale better than the GTX 200 due to its unified design, unless you increase both the core and shader processor speeds in tandem on the GTX.

    It's the shader processors that do the heavy lifting after all..

    Most GTX owners, including myself, understand this and act accordingly.

  22. #147
    Xtreme Enthusiast
    Join Date
    Mar 2008
    Location
    Dallas, TX
    Posts
    965
    article = BS/Fanboy-ism.


    /thread.
    "fightoffyourdemons"


  23. #148
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Puerto Rico
    Posts
    1,374
    Quote Originally Posted by G0ldBr1ck View Post
How can you say those are facts without factual proof, such as claiming no GDDR5? Don't get me wrong, I speculate there will be no GDDR5 either, but I'm guessing this cannot be proven at this point, so basically you told everyone their speculation is not worth 2 cents but yours are facts.
    ^^^ I was asking myself the same thing...

  24. #149
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
If this is indeed true, then it looks like NV will be making a multi-GPU G200 architecture card. Where are all the multi-GPU haters now?

Multiple GPUs are the way of the future. They have problems right now, but those can be fixed. The next step is to allow multiple GPUs to access the same memory pool through some sort of memory controller, and to integrate the multiple dies on a single package. From there, GPUs can be moved onto the same chip as the CPU, and so on.

    If this dual card is indeed the fastest card at the time (in the games I play) and I have the money for it (not likely ), I'd get it. I don't really care about power consumption, I have a 1250W power supply for a reason. I don't really care about heat either, that's what my watercooling loops are for.

    Quote Originally Posted by Sly Fox View Post
And the ultimate irony? Even on the next step up (280 and kinda 4870X2), Crysis/STALKER still aren't truly "maxable" smoothly. Barring certain cases of course, I realize things aren't THAT simple.
My 4870x2 plays STALKER: Clear Sky at 1920x1200 maxed. Not just the "maximum" setting, or whatever it's called, but actually moving every slider to max - except for AA, which I have at 4x + edge detect ("12x").

  25. #150
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    393
    Quote Originally Posted by Solus Corvus View Post
    If this is indeed true then it looks like NV will be making a multi-gpu G200 architecture card. Where are all the multi-gpu haters now?
    If the multi-GPU has the Nvidia logo on it, these fanboys will eat it up no problem.

