You just proved my point:
Thanks! :) ;) :P
Quote:
Because what I really wanted to say was edited by the forum:
"S&^t flinging contests"
Maybe he's trying to say that the Crysis 3 slide is fake, and the Egypt results are legit?
The Egypt results are from a third-party review graph, not an NVIDIA one. I trust those the most.
Well, whatever the Egypt scores are, they show the Titan doesn't come close to the GTX 690 if you're comparing single cards.
Yea, this looks like the first card that actually has enough power to handle over 4 GB of VRAM, but if you're only running up to 1200p and you think you need one of these, or that much VRAM, just no.
They are going to be great for triple monitor / 2560 resolutions, and even better for games that don't support SLI / Xfire (almost every MMO that I play).
Hm, any chance of a 15 SMX unlock? Or laser cut? I wonder...
I'm guessing that the lower-than-expected performance in some tests like Crysis 3 may be due to how low the core frequency is. Could the GPU clock be bottlenecking such a powerful GPU? Going from 1200+ MHz capable GK104s down to 850 MHz on these Titans is sad.
How so if the chip is wider?
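Quick napkin math on that, assuming the rumored 2688 shaders at 875 MHz versus a GK104's 1536 shaders at ~1200 MHz: 2688 × 875 ≈ 2.35M shader-MHz against 1536 × 1200 ≈ 1.84M, so the wider chip should still have roughly 28% more raw shader throughput despite the lower clock.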
I'm pretty sure these will come with all 15 enabled.
I mean, I don't see anything saying otherwise other than a few peeps saying it's a 14 SMX thing, which I think is hogwash lol.
I'm pretty sure we'll be able to unlock these cards' voltage and TDP as well with a BIOS mod.
But it might be a hassle to get that old version of MSI Afterburner working with it, since it'll have a different device ID.
I think it can still be done though.
Here's hoping for a Lightning to come out soon...
This will kinda suck, seeing these cards come out and having to wait it out a bit longer to see if there's gonna be custom cards.
But then again, if you get a custom card, it'll be harder to figure out a hard Quadro mod...
I'm almost thinking screw it, and paying $3200 for a real one...
But that would be one heck of a waste of money, I can't be doing that, I should be investing in solar..
What happened to Titan being about 50% over a stock GTX 680? These preliminary results are just plain weak. What is this 30% over the GTX 680 I see in Crysis 3? I am so disappointed right now, and they will want $1000 for this card? Why on earth would I want that card when the GTX 690 is about $999 at the egg right now and the AMD 7990 is about $899 with free shipping? I am a fan of Nvidia (I have a 680 atm), but I hope they get financially hurt if they ask for a grand for a 30% perf gain over the current GTX 680. Outrageous!
You're basing your statement on one game? Look further down; the difference is almost 55% in the games on Egypt Hardware's image. I don't know how accurate they are, since I've only seen the benchmark numbers.
Doesn't look that way...
http://goo.gl/U3YSV
Out of curiosity:
NCP4206 (Titan, $800) -> $1 × 1,000 pieces -> http://www.onsemi.com/PowerSolutions....do?id=NCP4208
CHiL 8228 (MSI 7970 Lightning, $500) -> $5.50 × 3,000 pieces -> http://ec.irf.com/v6/en/US/adirect/i...=CHL8228-00CRT
Is nVidia stingy?
Where is this? I don't see a single graph that shows anything faster than 35% on the Egyptian slide:
http://imageshack.us/a/img404/8374/gtxtitan.png
That Crysis 3 slide is terrible; 32% isn't much considering the price.
The max price for this card needs to fall to $800, and the only reason it is even that high is the 6 GB of memory. This card should be a $699 3 GB card. It's a bit early to write this card off, but testing needs to be done at 2560×1600 to see if it's worth some of the hype.
The only thing that looks nice is that this card doesn't look like it has its double precision snipped off, which would make it an awesome way to get a cheap professional card.
I can imagine Titan looks freakishly desirable for the workstation Quadro market, where it looks like a bargain.
That is compared to a stock-clocked 680? So the margin will be much smaller for the guys that already have nice-clocking water-cooled 680 Lightnings?
Yep, how long did it take for the 680 Lightning to come out? Waiting sux.....
Quote:
Originally Posted by NEOAethyr
This will kinda suck, seeing these cards come out and having to wait it out a bit longer to see if there's gonna be custom cards.
:(
delete this
Voltage will be adjustable, no worries.
I hope all reviewers who bench 3-way SLI with Titan try to avoid CPU bottlenecks. 4K anyone? :)
Crysis 2
Radeon HD 7970 GHz Edition: 68 %
GeForce GTX 680: 65 %
GeForce GTX Titan: 100 %
----------------
GeForce GTX Titan vs Radeon HD 7970 GHz Edition: (100/68)*100 = 147 % = 47 % faster
GeForce GTX Titan vs GeForce GTX 680: (100/65)*100 = 154 % = 54 % faster
3DMark 2013 X Firestrike
Radeon HD 7970 GHz Edition: 77 %
GeForce GTX 680: 67 %
GeForce GTX Titan: 100 %
----------------
GeForce GTX Titan vs Radeon HD 7970 GHz Edition: (100/77)*100 = 130 % = 30 % faster
GeForce GTX Titan vs GeForce GTX 680: (100/67)*100 = 149 % = 49 % faster
3DMark Vantage GPU
Radeon HD 7970 GHz Edition: 76 %
GeForce GTX 680: 81 %
GeForce GTX Titan: 100 %
----------------
GeForce GTX Titan vs Radeon HD 7970 GHz Edition: (100/76)*100 = 132 % = 32 % faster
GeForce GTX Titan vs GeForce GTX 680: (100/81)*100 ≈ 123 % = 23 % faster
Battlefield 3
Radeon HD 7970 GHz Edition: 74 %
GeForce GTX 680: 65 %
GeForce GTX Titan: 100 %
----------------
GeForce GTX Titan vs Radeon HD 7970 GHz Edition: (100/74)*100 = 135 % = 35 % faster
GeForce GTX Titan vs GeForce GTX 680: (100/65)*100 = 154 % = 54 % faster
Far Cry 3
Radeon HD 7970 GHz Edition: 70 %
GeForce GTX 680: 73 %
GeForce GTX Titan: 100 %
----------------
GeForce GTX Titan vs Radeon HD 7970 GHz Edition: (100/70)*100 = 143 % = 43 % faster
GeForce GTX Titan vs GeForce GTX 680: (100/73)*100 = 137 % = 37 % faster
Hitman
Radeon HD 7970 GHz Edition: 81 %
GeForce GTX 680: 73 %
GeForce GTX Titan: 100 %
----------------
GeForce GTX Titan vs Radeon HD 7970 GHz Edition: (100/81)*100 ≈ 123 % = 23 % faster
GeForce GTX Titan vs GeForce GTX 680: (100/73)*100 = 137 % = 37 % faster
Conclusion
GeForce GTX Titan average increase over Radeon HD 7970 GHz Edition: (47 + 30 + 32 + 35 + 43 + 23) / 6 = 35 %
GeForce GTX Titan average increase over GeForce GTX 680: (54 + 49 + 23 + 54 + 37 + 37) / 6 ≈ 42.3 %
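If anyone wants to redo that arithmetic without rounding each game first, here's a quick Python sketch (the input values are just the percentages read off the Egypt chart above, so treat them as assumptions):
Code:
# Normalized scores read off the Egypt chart (Titan = 100 in every test).
scores = {
    "Crysis 2":                 {"7970 GHz": 68, "GTX 680": 65},
    "3DMark 2013 X Firestrike": {"7970 GHz": 77, "GTX 680": 67},
    "3DMark Vantage GPU":       {"7970 GHz": 76, "GTX 680": 81},
    "Battlefield 3":            {"7970 GHz": 74, "GTX 680": 65},
    "Far Cry 3":                {"7970 GHz": 70, "GTX 680": 73},
    "Hitman":                   {"7970 GHz": 81, "GTX 680": 73},
}

def pct_faster(titan, other):
    # "X % faster" means (titan / other - 1) * 100
    return (titan / other - 1) * 100

for card in ("7970 GHz", "GTX 680"):
    gains = [pct_faster(100, s[card]) for s in scores.values()]
    print(card, "->", round(sum(gains) / len(gains), 1), "% average")
It prints 35.0 % for the 7970 GHz and 42.4 % for the 680; the tiny difference from the 42.3 % above is just from rounding each game to whole percent first.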
All benchmarks were done with drivers so premature they aren't even launch-day drivers... Either 3DMark is way off the ball, or the drivers are going to show some massive improvements, as the scores I've seen in 3DMark paint a much more favourable picture. As I said, I haven't got game performance info, so I can't comment on the accuracy of the above.
Very nice improvements across the board over the 7970 and 680. It looks like NV will fire the first bullet; let's see if AMD can dodge it :D. From AMD's camp there isn't much coming this year, so I doubt they will have an answer with a single-GPU solution. Maybe an official 7990 part will launch to rival the 690 and the new Titan.
Yea, I was wrong...
I found it this way for Crysis 2:
((100 − 65) / 65) × 100 ≈ 53.8%
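(Same answer either way, by the way: (100 − 65)/65 × 100 is just (100/65) × 100 − 100 rearranged, so both routes give ≈ 53.8%.)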
NVIDIA feels like this now
http://www.abload.de/img/sbylu8wsqb.gif
AMD feels like this now
http://www.abload.de/img/2vwsca.gif
and you'll feel like this with TITAN
http://www.abload.de/img/pbsqd5ns9u.gif
It surprises me that people still fail that much in math for such simple things :eek:
I don't think the math is all that important at the moment, we'll have to wait and see.
And besides, it's not like we're trying to establish a gate through a black hole lol...
http://www.abload.de/img/gtxtitan-nvidia90f5b.png
Thx to my friend ;)
Math is basic logic. If you're trying to make an argument.... how can you possibly expect to even be arguing the same thing, let alone have anyone take you seriously, if you cannot utilize those constructs? (This isn't aimed at you, but more of an "in general" response to your remarks.)
^^ Either jealous, or not serious.
I was just thinking that if prices came down on the GTX 690s, I might want to get one after getting a 680 replacement today, but then I checked tri-SLI benchmarks and no way, Jose.
And I doubt there will be any price drops soon, though there usually are after new stuff is released.
AMD fanboy.
I was really ready to buy one of these as a replacement for my GTX 580. But if these benchmarks are close to the truth, I don't see myself paying the price of three (!!!) 7970 GHz cards for this (they cost 350 Euro over here, and you even get cool games with them).
So today was wrong???
How good is NV multi-GPU scaling these days? Can I expect +90% or better for each additional card?
@bill_d
Today was never right. It was an internet rumor run amok. However, today is the day that specs got released.
We should have more information soon. Fudzilla reports release tomorrow, some other sites say the 21st.
http://www.nordichardware.se/Grafik/...-februari.html
Quote:
Nvidia will launch its new flagship graphics card, the GeForce GTX Titan, on February 21. The media will be allowed to publish pictures and details of the new card on February 19, so performance tests and full reviews will show up two days later.
Fudzilla big noob :down:
nordichardware big dogg :up:
haha, believe me when I say that I hold Fudzilla to be about as accurate as Helen Keller trying to hit a quacking duck with a bow and arrow. I learned that lesson back in the days of the DFI RD600.
That being said, my point was that we can all agree that the sites are placing a release between now and the 24th.
I'm still surprised there haven't been more leaks by now; a few reviewers must have cards already.
NVIDIA usually leaks like a sieve.
The plexi window over the heatsink looks fantastic.... /drool.
No way. Dual GPU is very good in games that support scaling, but tri GPU needs huge CPU power, and by huge we're thinking of a Sandy Bridge-E clocked to at least 4.6 GHz. Going from dual to tri GPU is only about a 25% increase on average, and very few games support it.
A 4.4 GHz i7 980 might manage, but I was checking 3-way SLI GTX 680 benchmarks earlier and it looked far too disappointing on a 3.8 GHz X58 CPU, and my 24/7 clock is 3.875 GHz @ stock volts on mine (I hate increased heat output for 24/7 use).
I was just doing loads of research in advance in case GTX 690 prices tumble, but without a current high-end SB CPU + X79 system, it's really not worth going 3-way SLI. I could still get a 690 and sell the 680 I just got today, though.
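To put a ballpark on that CPU-cap effect, here's a tiny Python sketch (the per-card efficiencies and the fps cap are made-up assumptions for illustration, not measured numbers):
Code:
# Rough model: each extra GPU contributes a diminishing fraction of a
# full card, and the CPU caps total fps no matter how many GPUs you add.
def multi_gpu_fps(single_fps, n_cards, scaling=(1.0, 0.9, 0.8), cpu_cap=140):
    gpu_fps = single_fps * sum(scaling[:n_cards])
    return min(gpu_fps, cpu_cap)

for n in (1, 2, 3):
    print(n, "card(s):", multi_gpu_fps(60, n), "fps")
Starting from 60 fps on one card, that gives 60 / 114 / 140: the third GPU has the muscle for 162, but the CPU cap eats it, leaving only ~23% over dual, right around the 25% average mentioned above.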
Yea it looks awesome, but IMO the 690 still looks nicer.
Well, if those new benches are real and it is only about 40% faster than a 680, the prices we saw in the last few days (€1000 / $1300) are a total joke; anything more than €600 / $800 will be a rip-off.
AMD still spoiled the Titan's launch by getting Crysis 3.
lol, exactly what I just said
What is the scaling from 1 -> 2 cards for Nvidia recently? I can probably just do some more research. I just generally get whatever is best at the time; as for my sig, I gave my second 5970 to a friend because scaling was pretty terrible on AMD cards at that time, though I've read they've improved since then. As for my CPU, I'll likely be moving to Haswell in June/July depending on OC headroom. I would consider IB-E if it materializes, and better yet if they solder the IHS properly.
I find rebuilding the watercooling to be a huge PITA, so I will only be doing it once, when I decide on Haswell for sure or not. In the meantime I'll be waiting to figure out the best version of the Titan and play the F5 game.
I don't know about SLI yet, but two 7970s in CFX have been great at 1600p. I've had CF going back to the X1950 XTX; the 6000 and 7000 series have been much better than my 5970 was.
I will be adding a second 680 DirectCU II TOP to my NV system soon. I just got a new Maximus V Formula to replace my Maximus V Gene so I could; I should have waited for the Formula to come out :shakes:
And they keep saying SLI is better than CFX :shrug:
Thanks, bill. I couldn't possibly care less about the AMD vs Nvidia mud slinging. Find a few reviewers you trust, hope they test the games you want to play, and go from there.
And yeah, 2560x1600 is all I care about, so that gets rid of all the SLI/CFX reviews that only test at 1080p.
I'm mainly interested in reviews that show 2-way SLI/CFX AND 3-way SLI/CFX @ Nvidia Surround/Eyefinity resolutions of 5760 x 1200 (1080p) and up.
Anything else is just child's play.
http://images1.wikia.nocookie.net/__...s_com-2888.jpg
I don't deny some multi-monitor setups look great (especially monitors with really small bezels). But one 30" has always been plenty for me.
$1200 USD... that is a total failure imo.
You're kind of showing this behavior with comments like this.
AMD isn't spoiling anything with software. Hardware is the only thing that is going to spoil a hardware launch, especially if the GTX Titan is faster than a 7970 in that particular software title.
The only person/group that could spoil Nvidia's launch at this point is, honestly, Nvidia. That can be done by overhyping/overpricing the card, which could very well happen.
Well, Nvidia is counting on our booming economy. We all have so much spare cash that we just don't care about spending $1000 on a video card even if it should normally be $600-700. Oh wait...
On a 120 Hz (no VSync) 1080p monitor, 3-way GTX 680 SLI still shows input lag when AA is enabled in BF3 or Far Cry 3, for instance. As long as I can feel that, there's room for improvement, even at low resolution. Also, Far Cry 3 has a lousy frame rate at 2560 x 1600; it's far more fun to play at 1080p at 90-140 fps.
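For reference on why that's still perceptible: 120 Hz means a new frame every 1000/120 ≈ 8.3 ms, while 90 fps is ~11.1 ms per frame and 140 fps is ~7.1 ms, so at the low end of that 90-140 range the GPU is already delivering frames slower than the refresh.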
Here's to hoping it drops a lot in price and is still available in 6 months :P
I think the price is just right. That card is a status symbol, so it shouldn't be cheap. Poor people can play with their SLI and Crossfire trash (what a first world problem...).
Regarding the SLI convo -
Two cards scale very well, but only in supported games. Back when I used SLI / Xfire, I supported it with a passion, but then came all the MMOs that didn't support it. Currently I've been playing DDO, GW2, and Path of Exile, none of which support dual GPU (zero FPS increase, the second GPU just sits idle), and back when they were running on my 560 Ti SLI, I had plenty of lag with the FPS dropping below 30 (I also see lag at around 30-40 FPS; I need to maintain at least 45 FPS). So I chucked those out for a single Kepler instead.
Triple-card scaling in reviews was mainly only noticeable at ultra-high resolutions, and with highly clocked SB-E CPUs. One review compared an X58 CPU @ 3.8 against an SB-E @ 4.6, and the results from that turned me away from bothering with 3 GPUs.
I don't even need to bother with 2 GPUs, but I just love the look of the 690 and Titan cards so much, and this reference 680 I just got through RMA is ugly, yet a decent clocker running at 1254 / 7200.
I understand as I am an off-again on-again WoW player, and had a dabble in TERA Online. Now, TERA Online is an absolutely gorgeous game that plays great on Nvidia Surround--WoW not so much. Ergo, that is one of the reasons why I have a trusty DDR2 backup rig running a single GTX 460 and a single 1080p display. :)
Well, even a single 560 Ti wasn't managing for me in those games @ 1200p. It was enough for around 30 FPS with all the settings cranked up, but that's too laggy for me; I need 45 FPS minimum maintained.
Actually, I'd be much better off getting a second 512 GB M4 for RAID 0 goodness, and I don't think the GTX 690 will get reduced enough. Titan costs far too much for me, and I don't think there are going to be any GK104 replacements, because that chip was always meant to be mid-range. This free 680 upgrade from KFA2 should be good for a long time; it looks like I'll be skipping a gen now.
It's true that Nvidia are just milking people dry with their prices, but then these cards are amazing.
The question I want answered is how these stack up in 3D programs like Maya/PS and Max, as well as in gaming/benching...
http://img1.gtimg.com/digi/pics/hv1/...7/82417246.jpg
The GALAXY GeForce GTX Titan is still part of the Kepler family, based on the GK110 core built on the 28 nm process, with 2688 stream processors and a PCIe 3.0 x16 bus interface. It will be equipped with 6 GB of GDDR5 memory on a 384-bit bus and supports DirectX 11.1, Shader Model 5.0, OpenGL 4.3, and OpenCL 1.2.
The GALAXY GeForce GTX Titan has a core frequency of 875 MHz but does not use GPU Boost. On the memory side, the GTX Titan uses GDDR5 at an effective 6008 MHz on a 384-bit interface, for a total capacity of 6 GB.
The card uses a dual-slot design, the TDP is rated at 235 W, and power comes from 8-pin + 6-pin connectors.
For outputs, the GeForce GTX Titan will be equipped with two DVI ports, one HDMI, and one DisplayPort.
http://digi.tech.qq.com/a/20130219/000838.htm
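From those listed specs you can work out the theoretical numbers yourself (assuming they're accurate): memory bandwidth = 6008 MHz effective × 384 bit / 8 ≈ 288 GB/s, and single-precision compute = 2688 shaders × 875 MHz × 2 ops per clock ≈ 4.7 TFLOPS.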
Galaxy is not the only brand doing 875 MHz without Boost, and there's another clocked at 900 MHz.
Good to know! Anything higher than 900? GHz Edition? :D
I'm waiting for overclocking results at stock voltage.
If these things manage to reach 1200-1300+ MHz out of the box like GK104 can... :eek2:
If it does 900 MHz out of the box, then let's hope it's got a bit of headroom and it will be a rocket. If it does 1000 MHz at stock volts, then maybe 1100 or more with water and the volts turned up; the fun part will be modding it with water to see what she does.
Can't wait.
:D
The samples I've seen don't clock as well as GK104, but 900 MHz still leaves a fair bit of headroom.
1000 MHz without a voltage increase is very hard, I think. Maybe a few handpicked cards.
So I was being way too optimistic :(
If only we could have custom versions; I'm sure that MSI / Asus / Galaxy etc. could break 1000 MHz if they were allowed to make non-reference cards.
Why have the AIB partner clock the card higher and charge you a premium when you can do it yourself? ;)
We really need reviews and benchmarks; isn't the NDA meant to expire today or something?
Initially it was said here that there would be no non-reference PCBs for Titan; has that been revised? i.e., a Lightning?
:)