Yup, but it IS coming.
^^ and lotsa DP! Everyone in this forum would surely think DP is a good thing!
New comparison then, assuming 2880 CCs:
Interestingly, the ratio between the boost and base clocks of the K6000 (1.13) is about twice as large as the difference in many other Kepler cards with boost. Maybe it's due to the lower base clock?
Code:
                                        780 Ti (hypothetical)
                  K20X   K6000  TITAN   Prop.  Half   225 W  225 W
CCs               2688   2880   2688    2880   2880   2880   2880
Core (MHz)         732    797    837     911    874    857    804
GFLOPS            3935   4590   4500    5249   5035   4934   4630
Memory (Mbps)     5200   6000   6000    7228   6613   6794   6082
Bandwidth (GB/s)   208    288    288     347    317    326    292
TDP (W)            235    225    250     239    244    225    225
GFLOPS/W          16.7   20.4   18.0    21.9   20.6   21.9   20.6
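If anyone wants to sanity-check the table, here's a minimal Python sketch of the math behind it (assuming Kepler's 2 single-precision FLOPs per CUDA core per clock, which is what the GFLOPS row implies):
Code:
# Sanity check: SP GFLOPS = 2 FLOPs/core/clock * CCs * core MHz / 1000
cards = {
    "K20X":  (2688, 732, 235),   # (CCs, core MHz, TDP W)
    "K6000": (2880, 797, 225),
    "TITAN": (2688, 837, 250),
}
for name, (ccs, mhz, tdp) in cards.items():
    gflops = 2 * ccs * mhz / 1000
    print(f"{name}: {gflops:.0f} GFLOPS, {gflops / tdp:.1f} GFLOPS/W")
# e.g. K20X -> 3935 GFLOPS, 16.7 GFLOPS/W, matching the table's first column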
(Note that the M2070 has 3132 Mbps memory and the 6000 has 3000 Mbps memory.)
Personally I'm still thinking that a fully enabled GK110 is hoping for too much; I'm more of a conservative in this regard. I would rather expect the 780 Ti to get one of the two shader clusters that were disabled from the Titan enabled, with higher clocks. Looking at what Ti cards have been in the past, they have mostly been just another shader cluster with slightly different clocks. Now, if it does release with all the chip's shaders enabled and at the supposed $700 price point, I will be overjoyed. But I won't leave myself open to be disappointed.
Is it asking too much......
http://i1281.photobucket.com/albums/...psf4422565.jpg
Let's see what [XC] Oj101 says about this :D
Wouldn't that be EXACTLY what I said? ;)
The card in this picture can easily be 15 percent above Titan.
But PedantOne says it's a Titan with 3 GB...a Titan has only 2688 cores. I guess I'll have to wait until the reviews :)
Btw the fillrates are wrong, they don't match the clocks. Which leads me to believe this screenshot is fake.
48*0.902 = 43.3 GPix/s, the picture says 45.2
240*0.902 = 216.5 GTex/s, the picture says 225.8
Memory bandwidth is also wrong. Very slightly, but still wrong:
384/8 * 7.008 GHz = 336.384 GB/s = 336.4 GB/s (rounded), the picture says 336.5 GB/s
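Spelling those checks out as a minimal sketch (the 48 ROPs, 240 TMUs, 0.902 GHz boost clock, 384-bit bus and 7.008 Gbps memory are the figures from the screenshot):
Code:
# Theoretical rates implied by the screenshot's own specs
rops, tmus, boost_ghz = 48, 240, 0.902
bus_bits, mem_gbps = 384, 7.008

print(rops * boost_ghz)         # 43.3 GPix/s  (screenshot says 45.2)
print(tmus * boost_ghz)         # 216.5 GTex/s (screenshot says 225.8)
print(bus_bits / 8 * mem_gbps)  # 336.384 GB/s (screenshot says 336.5)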
I said our sample here has more CUDA cores than the GTX 780, but I never gave an exact number :) A lot of the data in that picture is correct, for example the ID, but some of the data is faked.
I'm not commenting on the clock speeds (I've never mentioned them), just the CUDA core count. I have this strange suspicion (probably unwarranted) that people are generating screens based on what's being said here :rofl:
Let's just hope power consumption doesn't go through the roof with this card. Is the GPU on the 780 Ti a GK110-400-A1 or B1 or even a GK180?
Is its power consumption lower than Thermi part 2, AKA the 290X? If these are the clocks of a stock card with just the regular fan, imagine the GTX Lightnings. 780 Ti: 1050 normal clocks and 1150 boost clocks.
Look at the Galaxy GTX 780 HOF Edition and you will see B1 is already being used on some cards.
^ GTX 780 is GK110-300, Titan is GK110-400 ;)
The big Hawaii show was basically cut short by a simple price drop and the launch of a Ti. Man.....
Hawaii could still do well and still do Nvidia a lot of damage if they get their stock situation sorted out. And from the looks of it, they didn't even deliver that entire 8000-unit run of BF4 editions; on HardOCP, new stock coming in is still coming with BF4, and that was as of yesterday. So that 8000 quantity was not just pre-order editions, as people who ordered on Wednesday were getting BF4 editions.
So much for 10's of thousands.
Mantle could be its saving grace, along with 4K performance and the 280X coming with 3 GB vs the 770's 2 GB. Apart from that, not sure. nVidia played their cards for this generation well, keeping the monster under wraps. Goodness knows what Maxwell will bring to the table.
Will the current Titan water blocks, i.e., the Titan XXL, still be suitable for the Ti?
If the power circuitry has been modified, there is a chance the PCB may be physically different as well, unless the 780 or Titan PCB was not entirely utilized.
GeForce GTX 780 GHz Edition clocked at 1006/1046 MHz.
beats 290X and Titan
780ti will do it even better...:D
http://videocardz.com/images/2013/10...vs-GTX-780.png
http://videocardz.com/images/2013/10...-GTX-TITAN.png
http://videocardz.com/images/2013/10...vs-R9-290X.png
more here: http://videocardz.com/47420/nvidia-u...80-ghz-edition
original source: http://www.expreview.com/29089.html
I think this is all great, regardless of who wins the performance crown. We haven't seen this type of back and forth between AMD and NVIDIA since the days of the 9800 gtx vs 4870
Hahaha... beats 290x... Good one.
FXAA, minimal AA settings, max tessellation, mainly synthetics.
Galaxy HOF is using A1 still...
Thermi part 2? Wow... stretching reality a bit aren't you?
Not at all. Just means Nvidia is trying to stay competitive.
Where do you see any difference?
290x is clearly going in and out of stock multiple times a day.
Different bins of silicon. They are saying 780Ti is a metal respin which is extremely unlikely, though not impossible.
It would mean there was some inherent architectural flaw with GK110 that they need to fix.
FXAA is what most people use, and maybe 2x (4x on older games) MSAA at high res like 2560x1440 or better; more isn't performant enough to be worth using (low fps no one would actually want to play at, and basically useless in terms of visual quality in motion anyway). I game at 2560x1440 at 110Hz; many use 1080p 144Hz or 2560x1440, many of them at high refresh rates, with how inexpensive 27" PLS/IPS panels have gotten.
Nvidia has indeed pretty much cut Hawaii's show short I'd say I have to agree. Who on Earth is going to buy a $550 card that runs at 95c (consistently hotter/louder than Fermi), has little oc headroom, and is loud, when you can buy a $500 one with lots of oc headroom, generally the same performance when taking into account pre-heated benchmarks rather than short sprints before the 290X throttles (which happens during normal gaming, clocks are 800-900mhz not 1000mhz+), and a quiet heatsink (or custom designs for $510)?
290X has had tiny amounts of units come in-stock and sell off, that doesn't have anything to do with them selling well, just minute quantities. As he said, so much for tens of thousands...
Competition is good, but there's a clear and obvious choice for buying unless you have a huge favoritism towards one brand: at $500 ($510-520 with custom coolers) and barely any slower if at all during actual gaming except 4K res which is so tiny a market as to be nonexistent so far, while being quiet and cool, the GTX 780 post-pricedrop is that.
I'm honestly perfectly fine with all that. Sure it'll be disappointing for AMD if they can't pull out as much profit as they would have expected from Hawaii, but from my perspective they've done their job. We wouldn't be seeing a GTX 780 Ghz edition for $550 if it weren't for the 290X, and I wouldn't be surprised if AMD sweetens their bundle with a bunch of games like the 7970 GHz packages.
Competition is not just good, it is the best thing we could ask for. I really don't care who manufactures my hardware as long as I'm getting good value for my money, and that normally won't happen if a company can sit back on their laurels (just look at Intel, for what I do my 2500k really isn't worth upgrading).
@AliG absolutely, I'm VERY glad to see there be fierce competition here, better pricing and performance for everyone regardless of who is the "winner". I'd hate to see what's happened with CPU's happen with GPU's... talk about bad for consumers. The only thing better than what's happened with the GPU fight now would be if AMD had a good card (and I'd be cheering its release) in the 20nm gen too and we saw this kind of competition there!
You don't buy a +$500 card to run FXAA at 1080p...
Hotter and louder than Fermi? Completely false... It actually pulls less power than Fermi.
It boosts over 900mhz consistently in gaming situations. So that is false.
There have been tens of thousands shipped by partners.
Tiny amounts? You do realized how many are on a pallet that gets shipped to the etailers, right?
Sorry, Nvidia just evened the playing field, they didn't take any sort of advantage.
You do if you don't like alpha and shader aliasing, which is far worse than edge aliasing ;) Most people know this. Most people running 1080p don't run 60Hz nowadays, or they'd be on an IPS panel instead, whether 1080p 60Hz or 2560x1440 60-120Hz.
Hotter/louder indeed, power consumption is a different thing.
It boosts to that in some non-preheated runs, but when pre-heated or in actual gaming it often falls to 800-900 MHz as I said, rather than 1 GHz.
Tens of thousands shipped? Doubt it since newegg seems to receive 5-10 at a time and thus sells out quickly due to the tiny quantity ;).
$500 + 100 or more worth of games (ignoring the shield coupon) versus 550 + no games... with similar performance but better acoustics/thermals, most people will go for the better-priced option. Don't get me wrong, I am VERY glad we're seeing good competition finally, it's only a good thing for us as end-users!
But speaking of stagnant CPU innovation, a lot of people have completely overlooked just how far along peripherals have come quietly in the background. I just got a Corsair K95 and holy :banana::banana::banana::banana: is a mechanical keyboard a massive upgrade over the previous ergonomic one. It's interesting that it's almost more comfortable to type with, because even though I sacrificed the wrist support, the keys are just so damn responsive. Or you could look at the Korean 1440p monitors and how SSDs are finally affordable. Just in general I think there's a lot of nifty stuff that's been rolling out without much noise
In Guru3D's and TechPowerUp's gaming loads, it pulls more than or similar amounts to a GTX 480. Maybe only by 20 watts at times, but that's still more. Thus it's pretty safe to assume the 290X consumes as much as Fermi.
In addition, they both operate at 95 degrees, and even at that point the 290X still continues to throttle in HardOCP's testing. HardOCP had to set the fans beyond the Uber limits to stop it from throttling so it could maintain 1000 MHz.
At that point it actually beats Titan significantly (10% maybe), but noise goes up beyond Fermi levels, beyond dual cards.
So it is very much a Fermi, but Nvidia had a better excuse for it: it was first-generation silicon, and it was even bigger than the 290X is. AMD has lots of experience with 28nm, and they are designing a refinement of GCN rather than a whole new architecture like Fermi.
The amount of heat it can generate is spectacular.
http://hardforum.com/showpost.php?p=...10&postcount=2
An overclocked 290X can get into the 80s C even with water cooling; I have never seen a card do that.
Add in the overvolted overclocking study from TechPowerUp and the 290X is indeed piping hot.
So similar to Fermi, not worse. Fermi fan does a good 65-70db at stock. I didn't see 290x going over 60db.
Nvidia's excuse was they had a broken architecture that could barely beat a chip almost half the size. Nvidia simply didn't have a choice.
AMD made a decision and decided to set the limits to where they did to balance the design.
That 80C is false. If you believe that, well then I can't help you.
Sure.
PedantOne said the 780 Ti is GK110-x00-B2.
I asked what B1 was (and A1 for that matter).
If the 780 Ti has indeed 2880 cores, it is not GK110-400-xx but GK110-500 (or 600 or whatever)-xx.
Since the first fully enabled GK110 GPUs entered the market only very recently or are not even released yet (Quadro K6000, Tesla K40), the 780 Ti will most certainly not use different silicon, but the same. Thus there are no previous 2880-core GPUs that could be A1 or B1 in order for a B2 to exist.
Unless the stepping is independent of the number of enabled SMX clusters. But even then, why not B1 for the 780 Ti?
Edit:
tajoh111 is right. 290X pulls more power than Fermi with the uber bios, significantly so:
http://www.3dcenter.org/artikel/laun...9-290x-seite-2
http://www.3dcenter.org/artikel/eine...tromverbrauchs
279W on average for the 290X uber mode
235W on average for the 480
Both values are averages of multiple measurements that measured only the card itself, so it doesn't get more accurate than that.
- The 290X is late just like Fermi was.
- It is hot and loud just like Fermi was.
- It doesn't beat the competition at even ground across the board (quiet bios vs Titan stock or uber bios vs Titan@maximized targets) just like Fermi did. In 4K it wins slightly by under 10% but loses in 1080p or with SGSSAA:
http://ht4u.net/reviews/2013/amd_rad...ew/index46.php
https://www.computerbase.de/artikel/...90x-im-test/5/
- It uses more power than the competition in either comparison (see above), the same or more than Fermi depending on the bios mode.
Now the positives:
The 290X's perf/W doesn't fall as far from Titan's compared to GTX 480 vs 5870.
The GPU is smaller, great feat of engineering
Price
Considering the power consumption scaling we have seen, I could see it getting that hot. W1zzard was already there when applying just a bit more voltage and overclocking to 1190 MHz: his system consumed 650 W (originally 400 W), which meant the card was consuming 550 W. This was at 1.26 V. An overvolted GTX 780 Lightning consumed less than 400 W for the whole system (300 W in total) in the same test, and it was overvolted even more and clocked higher as well. I have seen people clock their 290X at 1.4 V under air, so if someone wants to push water even harder, and anything else is in the loop, I could see the loop being overwhelmed and reaching those temps. Some of Intel's hotter chips are capable of reaching 80 under water when sufficiently clocked. I was just looking around, and Fermi could push near 70 in a water-cooling loop when stressed, and I don't think it's capable of drinking as much power as an overclocked 290X. I would be scared to see what kind of voltage this thing can drink down at greater than 1.4 V. I don't think we have anything where power consumption climbs as steeply as the 290X's as we overclock further.
Also, according to TechPowerUp, in uber mode the card is just as loud as a GTX 480. Both cards are 9 decibels above a GTX 580.
You might hate to admit it, but Thermi and the 290X have a lot in common as far as thermals and temps go. It's almost striking.
http://hardforum.com/showthread.php?t=1788367
Quote:
I enable Post AA (any level) (or leave it enabled as the default setting): 33 votes (26.40%)
I disable Post AA: 92 votes (73.60%)
So we go from 290x is better than Titan, fact.
To "we can't compare 290x to Titan because GTX780 is basically Titan for cheap."
To "even footing 290x doesn't beat Titan."
Make up your mind and please stop shifting the goalposts.
Fermi doesn't pull only 235w. It can pull over 300w in gaming situations.
290x isn't competing against a card that has a max power consumption of 180w.
GTX480 was consistently pulling ~100w more in gaming situations to give ~10% more performance.
290x is using ~30-40w more to beat the competition by ~15% on average.
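To put those asserted figures side by side, a back-of-the-envelope sketch (using the ~100 W / ~10% and ~30-40 W / ~15% numbers claimed above, not new measurements):
Code:
# Extra watts paid per percentage point of extra performance, per the claims above
def watts_per_percent(extra_watts, extra_perf_pct):
    return extra_watts / extra_perf_pct

print(watts_per_percent(100, 10))  # GTX 480: ~10.0 W per %
print(watts_per_percent(35, 15))   # 290X:    ~2.3 W per %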
I already told you about sites that showed the 290X can pull more than a GTX 480 in the same scenarios.
The 290X is a complex product because it tries to pull off two modes at once: one where it is a hair slower than Titan (quiet mode),
and a second mode where it is a bit faster than a GTX Titan. It's the nature of the beast. No other cards have this quiet mode.
10 percent more on average, maybe, against a GTX 780. Not 15 percent more on average, unless maybe we turn on uber mode.
The GTX 780 is so neutered and underclocked that it might as well be a small chip like the 5870. It's Nvidia's attempt to sell massively docked GK110s and do no binning. If we consider how mangled it is and how hard AMD is pushing their own chip (it also being fully enabled), it should be no surprise it gets beat, particularly with more and more Gaming Evolved titles being used.
Fermi might not have shown its gaming worth so much for its size, but it more than made up for it in its primary purpose, the professional market.
It was the best professional compute card ever at the time. It literally made trash of anything at the time: it spanked AMD's professional cards, often being twice as fast and sometimes 4 times as fast. Even today, AMD's W9000 cards have a hard time competing against it, and that was against a cut-down chip with very low clocks (HotHardware and Tom's Hardware reviews). So even though it had high power consumption, it justified it with other capabilities that are yet to be seen on the 290X.
Did you overlook where I said "on even ground"? I'll reiterate for you:
quiet mode (stock setting) vs. Titan stock setting
OR
uber mode vs Titan@maxed targets
Titan and the 290X are equally fast if you value a fair comparison - that's a fact. Don't fall for most of the English speaking reviews - they are biased towards AMD (maybe not even intentionally) by not or not properly pre-heating their cards and by testing uber mode vs Titan/780 stock settings with a temp target of 80C.
Read the review links I posted if you don't believe me. And don't quote selectively, quote whole sentences.
Source for your 300+W claim?
The 290X certainly can, 306.88 W:
http://ht4u.net/reviews/2013/amd_rad...ew/index21.php
You do understand how an average works, right? In some games a card may pull more than the average, in others less. You won't find better power consumption data than what I posted.
And maybe you also overlooked where I said that the 290X wasn't as awful in perf/W compared to the competition as Fermi was.
And you are the one who should make up your mind. What is it now - Titan killer? Then compare it to Titan, which it doesn't beat. Or GTX 780 killer? Okay, if you want to call 10-15% a "killer", fine. But then also use the GTX 780 power numbers instead of Titan's: 189 W (GTX 780) vs 239 W (290X quiet mode). That is a hefty 50 W more.
More evidence of the 290X's power-drinking ways.
http://www.anandtech.com/show/2977/n...rth-the-wait-/
http://www.anandtech.com/show/7457/t...9-290x-review/
http://imageshack.us/a/img30/6678/hcxh.png
http://imageshack.us/a/img19/3287/h9wj.png
Guess what these reviews have in common.
There is a 5870 in both of them.
And there are cards that use about 100 more watts than a 5870..... guess what cards those are.
The GTX 480 and the 290X (in quiet mode). With the card in uber mode this grows to about 130 watts, which would make it the biggest power-drinking single-GPU card of all time by a comfortable margin.
If nvidia didn't cut down the gtx 780 so badly and clock it so conservatively, AMD wouldn't even have that lead. the gtx 780 can be clocked quite a bit higher without increasing power consumption if we look at some reviews of overclocked models at Hardocp or techpowerup.
Ok... Where were all these "even footing" and "we need to change our testing/benching methodology" arguments the past 2-3 years?
Why do you need to shift the parameters of the test? Why can't you test stock vs stock?
That's right because the results don't show what you want.
Look at any GTX480 review.
No I don't know what average means... please explain.
I bet I can because those results look extremely strange. GTX780 and Titan pulling less than 200w?
There is something screwy going on there because those results don't reflect what other sites have found.
Ahhh... good old scientific method, lets just go ahead and throw you out the window, we don't need you.
All I hear is "ifs" "ands" and "buts," don't worry about comparing what is, we can only discuss what could be.
Guys, a few watts is nothing, especially in the enthusiast segment. I don't care if it's 100 W or not. It's "nothing for me", and I'm not the biggest enthusiast in GPUs....
I don't know where it was in other reviews, but hardware.fr, PCGH, HT4U and ComputerBase have been preheating Kepler cards from day one. Now finally several other sites are beginning to do it, among them HardwareCanucks, TechReport and Anandtech, IIRC.
And it IS a stock vs stock comparison. Quiet mode is the stock setting, uber mode isn't.
The power consumption numbers there are not strange for one very simple reason. 3DCenter compiles values mostly from realistic gaming scenarios, i.e. with pre-heating (at least HT4U, PCGH and hardware.fr). This leads to all Nvidia Boost 2.0 cards clocking lower due to the 80C temperature target being hit, thus they also consume less power.
I don't need to look at "any" GTX 480 review, because this compilation of measurements is far better. Most reviews test the whole system which is very prone to errors. Where else on the web do you have 5-6 measurements of the consumption of the card itself? Nowhere.
Btw still waiting on proof on your claim that Fermi could use 300+W under gaming loads...
A 1 GHz 780 is better than Titan and an uber-moded R9 290X in many games with AA. That's a fact! On average it is the same performance, in 1080 and 1600. Not in Star Trek 4K resolution, of course.
The 780 Ti has DP power disabled forever; here Titan is still king, with its price.
What I know: Nvidia sent the 780 Ti to all German reviewers this week. Many people have cards at home; some leaks are on the way!
PS: I don't know which GPU revision the 780 Ti has, I think it is B1, but we will see. I am out of the office now and can't check it, but I'll try to ask my colleague to dismantle that card.
I can confirm that NVIDIA has killed DP performance with the 780 Ti. Using 1/3 DP doesn't make financial sense, as either it means selling the card for $1500+ or hurting their own workstation market. I'm disappointed, I use 64 bit floating point operations and the old GTX 580 offers more power than the 780 Ti. Ah well.
I don't know about the rest of them, which I doubt, but hardware.fr definitely did not. If they did, they didn't mention it.
Uber is stock. Maybe you meant to say Uber isn't default?
I didn't see them mention the list of games they used for power consumption.
I don't see them testing the card by itself...
I thought you didn't need to look at GTX480 reviews. You supposedly know it all.
I don't recall you asking for 5870 to be clocked up so that it could be at the same temperatures as GTX480... or the same power consumption.
Shifting the parameters. Tsk tsk.
I meant Boost 2.0 Kepler cards since those are the ones mainly affected by temperature. Boost 1.0 cards like the GTX 680 are barely affected. Sorry if that wasn't clear.
Yes, default. Stock is an unmodified card, I guess. Like no bios flash, no watercooling etc.
Well, you need to go to the respective reviews of course.
TPU tests with Crysis 2
PCGH tests with Battlefield Bad Company 2
hardware.fr tests with Anno 2070 and Battlefield 3
HT4U tests with Tom Clancy's HawX
I forgot what Heise tests with, I'll write them an email.
It is obvious by the values that these are only the cards themselves. The values are way too low for power consumption of the whole system. It is also well known that these sites do test that way.
What? The GTX 480 was measured as well by the sites I mentioned. What do I need other reviews for, especially with inferior testing methodologies?
And what's up with your second comment? Relevance?
The point is, quiet mode of the 290X is the default setting. Either you test both cards at default or not. You cannot have it both ways.
I had 4 GTX 480s overclocked at 1.2 V each with the CPU at 5.2 GHz, and my rig was pulling 1700 W during 3DMark.
http://hwbot.org/submission/2358521_...80_21354_marks
I agree on this one... went from a 2560x1600 CCFL 30" panel to a much less expensive 27" 2560x1440 LED panel that can do 110Hz and LOVE it. Also grabbed a mechanical keyboard with Cherry MX Brown switches, added O-ring dampers, and it's amazingly comfy to type on (I use a wrist rest with it). Been using SSDs since 2008 I think it was, but they've gotten to really good pricing and capacities at this point. Peripherals and less-glorified parts have definitely come a long way, no question.
TechReport has the 5870 using 319 W and 253 W in their reviews (the first being the GTX 480 review and the latter being the 290X review). Fermi uses 424 W and the 290X 346 W, differences: 105 W and 93 W. So the pendulum swings both ways, and the conclusion can only be: the difference isn't big enough to argue about. Fermi and the 290X have about the same power draw. As for acoustics, according to TechReport the winner is the GTX 480 by a fraction (49.9 dB to 50.1 dB). However, the 5870 had a load noise level of 51.0 dB in that review and only 49.1 dB in the 290X review. In all comparisons I've used Über mode. If not, the clear winner in power would be the 290X, and in acoustics it would also be a slight victory for AMD.
Ref:
http://techreport.com/review/25509/a...rd-reviewed/12
http://techreport.com/review/18682/n...-processors/13
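For clarity, a minimal sketch of where those deltas come from (the wattages are the load figures quoted above from the two TechReport reviews, with the 5870 as the common baseline):
Code:
# Card-vs-5870 deltas from the two TechReport load power figures above
fermi_review = {"GTX 480": 424, "HD 5870": 319}       # GTX 480 review
r290x_review = {"R9 290X uber": 346, "HD 5870": 253}  # 290X review

print(fermi_review["GTX 480"] - fermi_review["HD 5870"])       # 105 W
print(r290x_review["R9 290X uber"] - r290x_review["HD 5870"])  # 93 W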
Dude, the single worst card I have ever heard is the Radeon 4890... Mine sounded like a diesel turbo spinning up.
The 290X can't be nearly that bad.
In fact they did, and they mentioned it:
http://www.hardware.fr/articles/910-...cole-test.html
Quote:
Finally, note that given the influence of temperature on the results, and the fact that we measure performance on an open bench table while letting the temperature/frequency of the different cards stabilize, the room temperature was controlled and fixed at 26 °C for all tests.
They wait for temperature/frequency stabilization before launching the benchmark. Room temperature is also fixed at 26 degrees Celsius, in order to give fair, accurate results.
NVIDIA GeForce GTX 780 Ti has 2880 CUDA cores
Quote:
You read that right! TITAN ULTRA is actually GeForce GTX 780 Ti. It packs 2880 CUDAs, 240 TMUs and 48 ROPs (ROPs are not mentioned though). NVIDIA GeForce GTX 780 Ti is clocked at 876 MHz for base and 928 MHz for boost. Card is equipped with 3GB GDDR5 memory running at 7 GHz. The GTX 780 Ti is using 384-bit interface.
^^ my early warning sign was the G's pushing me hard into my seat, that was followed by the speedo going off the dial..... oh the days...
Overclocked my cars, now I overclock my PC's....
wonder who in the forums had the sig on the overclocked testicles, that one made me laugh hard.
http://www.youtube.com/watch?v=ljFUcBzYkPs
Demoed on Titan, I don't think it would do so well on a 780GTX.
I never say that I am impressed,
I am F'ING IMPRESSED!
I should probably stay on topic too
Workstation cards don't even really utilize DP. DP is great for crunching numbers and scientific stuff, not so great for running CAD/CAM or rendering. That is why Nvidia segmented their professional VGAs: Quadro uses single precision and Tesla uses double precision. That's how they force you to make two separate costly purchases now. Fermi actually had a good balance of both.
Great that you mentioned this. I remember never being bothered by the noise my Radeon 4890 generated over the sound of my headphones or my ceiling fan. The R9 290X's noise is probably a non-issue for me, not that I'll buy anything from this gen, though.
That said, I don't mine/fold/etc while I sleep, so they all stay in sweet idle.
Everything in my machine is at full load 24/7 and I sleep 3m away from it. I grew up on a busy main road and now stay in a very quiet neighborhood, the noise actually helps me to sleep at night :D
Something new about 780 Ti:
It seems the GPU is still GK110 revision A1. But as you already know, fully unlocked.
This card is a monster. A colleague made some tests with current drivers and a modded INF: OVERCLOCKED out of the box 250+ MHz on the GPU, with 6300 points in 3DMark 11 Extreme, without any voltage tuning and with a silent cooler. Nobody will want the R9 290X after this monster is launched, maybe only to save a few dollars :)
Yuck, A1. Isn't that the same as GTX Titan? But these are binned better, I can imagine.
If the stock clocks are 875 MHz, +250 = 1125 MHz, which isn't that impressive even without adding voltage (unless it naturally boosts to that), because we all know it doesn't take any effort for Titan to reach past 1100 MHz. What I would rather see is consistent 1300 MHz overclocks compared to the 1200 we get now.
That 3dmark is pretty good though. I was just looking and it takes a titan overclocked to 1254mhz to reach 6400.
You missed the boost clocks: 875 + boost + 250 = more than your 1125 MHz :)
vario, you are right, 699 is more than 549, but we are in the enthusiast segment; for the best performance you will always pay a premium price.
~1200 MHz without voltage increase? I don't believe that. That may be stable for a quick 3DMark bench, but not game stable. My two Titans are not stable at 1200 MHz@1.2V.
This is stable at 1200+ in all the tests he made: 3DMark 11 Extreme, Fire Strike Extreme, Heaven 2560 + 8xAA, Crysis 3, and many more games.
Maybe a golden sample. Oh, important: How far can you increase the power target? If it's only 106% (assuming 100% is again 250W like Titan and 780), it would be disappointing. Even with water, my Titans throttled before I could raise the power target to 120-125% using a modded bios.
So about 1175 MHz with boost clocks, which would be in line with that Titan score. It's pretty decent, but the important thing will be how far this thing overclocks. Disappointed if they are all A1. Seems like Nvidia was hoarding all this time.
Well, I guess for most people 1175 MHz, if it is quiet and doesn't consume much power, is a lot more usable and faster than the under-1000 MHz people are getting with the 290X, unless they turn up the fan profile to prevent throttling when they overclock.
Then again, Nvidia might have raised the default voltage. Maybe not excessive like the original 7970 GHz Edition, but enough that higher clocks are possible without having to add real voltage.
Yup, but I would have hoped it wasn't the same stepping as Titan. A full GK110 only has 7 percent more shaders; I would hope it could boost higher than the leaked BIOS specifications without resorting to any overclocking, i.e. a 900 MHz base clock and 1000+ MHz boost clocks at default.
If the GTX 780 GHz Edition does exist, there might only be a 6-9% performance difference between the two at the clocks the GTX 780 Ti is set at.
If it were running a new stepping, I could see it being cooler than the old GTX Titan and thus boosting to higher levels. But with the same stepping, it might as well be running the same clocks, since the base clock only determines the minimum performance; temperature otherwise controls how much the card boosts. And with the memory controller being pushed more, and perhaps higher voltage, we might not get much of a frequency increase, which is important for review performance (not so much overclocking performance).
Regarding clock speeds... Just... Wait and see. I'm biting my tongue very, VERY hard at the moment, as much as I have said already there's still a lot more that I can't say :(
Sounds like the 780 Ti clocks higher than what videocardz.com said or that it is an excellent overclocker (1200-1300 MHz).
Do you think it's wise to grab a couple of these on launch, or wait for higher-spec revisions, i.e., Classy etc., if it goes that way?
About default performance: find my leaked 3DMark 11 score from a few days back, and remember that number :)
NVIDIA GeForce GTX 780 Ti has great overclocking potential
Attachment 131698
Attachment 131699
damn:eek:
Quote:
NVIDIA GeForce GTX 780 Ti with 3GB, 6GB and 12GB memory
NVIDIA's new flagship is much more than a GeForce GTX 780 replacement. Despite the fact that the reference card will only arrive with 3GB memory, it does not mean that there won't be cards with more memory. Rumor has it that the NVIDIA GTX 780 Ti will be available in 3 versions: 3GB, 6GB and even 12GB. Which version you choose will only depend on how deep your pockets are.
No custom models next week
According to my information, we won?t see custom GTX 780 Ti models at launch. This is the same story as with R9 290X, manufacturers were simply not prepared for this release, so it will take some time to finalize the designs with custom PCBs and cooling solutions.
NVIDIA GeForce GTX 780 Ti has great overclocking potential
NVIDIA GTX 780 Ti features an 876 MHz base clock and a 928 MHz boost clock. I managed to confirm that with the stock cooler it can overclock up to 1240 MHz (based on Afterburner readings). I even saw a GPU-Z screenshot with an 1851 MHz memory clock (7.4 GHz). I don't know if these are stable clocks; all I know is it was enough to complete multiple 3DMark benchmark runs, which I saw.
I made a chart with cumulative performance from 3DMark11 and 3DMark Fire Strike (both Extreme presets). These scores are based on 100% true tests, I was asked not to publish the numbers, but nobody said I cannot post them in percentages.
I took the R9 290X as a reference for this comparison. Long story short, the GTX 780 Ti is about 6% faster than the R9 290X; this is out-of-the-box performance. With overclocking, the GTX 780 Ti can be 15-23% faster than the R9 290X. To illustrate, the GTX 780 Ti scored 6100 points in 3DMark 11 at a 1246 MHz clock, and let me remind you, we are still talking about the reference cooler.
Since synthetic benchmarks are never the best performance indicator, I will post gaming performance as soon as possible. Treat this post as a sneak peek of what the GTX 780 Ti is capable of.
I will post more information tomorrow so stay tuned.
That's not a GTX680 review... Nice try.
I will show you every single GTX780 ever made.
They don't need DP for that...
This was also showcased a while ago, I wanna say last year, but I don't think they mentioned Flex.
I remember Oj101 mentioning something about this card.
http://videocardz.com/47530/nvidia-g...pecial-edition
NVIDIA GeForce GTX 780 Ti special black edition
While the original GTX 780 Ti will run at 876/928 MHz clocks, the exclusive special edition will boost to 1000/1033 MHz (unconfirmed). Both variants will have the same memory clock of 7 GHz. The clock speeds are not the only difference: the faster GTX 780 Ti has no TDP limit (at least on paper). This card is basically what Uber Mode is to the R9 290X, only here you get higher clocks and more memory.
According to our sources, GeForce GTX 780 Ti will arrive with black cooler, so this is more than just a new BIOS.
These are the clocks I want on a Ti. I am certain I will hate the price of these black cards.
For those running on water, though, will there be any point in the 'black edition'?
I'll be waiting for the GTX TI ULTRA BLACK ELITE 2" LONGER DEN U edition.