If the power circuitry has been modified, there is a chance the PCB is physically different as well, unless the 780 or Titan PCB wasn't fully utilized to begin with.
GeForce GTX 780 GHz Edition clocked at 1006/1046 MHz.
beats 290X and Titan
780ti will do it even better...:D
http://videocardz.com/images/2013/10...vs-GTX-780.png
http://videocardz.com/images/2013/10...-GTX-TITAN.png
http://videocardz.com/images/2013/10...vs-R9-290X.png
more here: http://videocardz.com/47420/nvidia-u...80-ghz-edition
original source: http://www.expreview.com/29089.html
I think this is all great, regardless of who wins the performance crown. We haven't seen this type of back and forth between AMD and NVIDIA since the days of the 9800 GTX vs the 4870.
Hahaha... beats 290x... Good one.
FXAA, minimal AA settings, max tessellation, mainly synthetics.
Galaxy HOF is using A1 still...
Thermi part 2? Wow... stretching reality a bit, aren't you?
Not at all. Just means Nvidia is trying to stay competitive.
Where do you see anything different?
290x is clearly going in and out of stock multiple times a day.
Different bins of silicon. They are saying the 780 Ti is a metal respin, which is extremely unlikely, though not impossible.
It would mean there was some inherent architectural flaw with GK110 that they need to fix.
FXAA is what most people use, plus maybe 2x MSAA (4x on older games) at high resolutions like 2560x1440 or better; anything more isn't performant enough to be worth using (fps so low no one would actually want to play at, and basically useless in terms of visual quality in motion anyway). I game at 2560x1440 110Hz; many use 1080p 144Hz or 2560x1440, many of those at high refresh rates given how inexpensive 27" PLS/IPS panels have gotten.
Nvidia has indeed pretty much cut Hawaii's show short, I have to agree. Who on Earth is going to buy a $550 card that runs at 95°C (consistently hotter/louder than Fermi), has little OC headroom, and is loud, when you can buy a $500 one with lots of OC headroom, generally the same performance once you account for pre-heated benchmarks rather than short sprints before the 290X throttles (which happens during normal gaming; clocks are 800-900MHz, not 1000MHz+), and a quiet heatsink (or custom designs for $510)?
The 290X has had tiny numbers of units come in stock and sell out; that doesn't mean they're selling well, just in minute quantities. As he said, so much for tens of thousands...
Competition is good, but there's a clear and obvious choice unless you have a huge favoritism towards one brand: at $500 ($510-520 with custom coolers), barely any slower if at all during actual gaming (except at 4K, a market so tiny as to be nonexistent so far), while being quiet and cool, the post-price-drop GTX 780 is it.
I'm honestly perfectly fine with all that. Sure it'll be disappointing for AMD if they can't pull out as much profit as they would have expected from Hawaii, but from my perspective they've done their job. We wouldn't be seeing a GTX 780 Ghz edition for $550 if it weren't for the 290X, and I wouldn't be surprised if AMD sweetens their bundle with a bunch of games like the 7970 GHz packages.
Competition is not just good, it is the best thing we could ask for. I really don't care who manufactures my hardware as long as I'm getting good value for my money, and that normally won't happen if a company can rest on its laurels (just look at Intel; for what I do, my 2500K really isn't worth upgrading).
@AliG absolutely, I'm VERY glad to see there be fierce competition here, better pricing and performance for everyone regardless of who is the "winner". I'd hate to see what's happened with CPU's happen with GPU's... talk about bad for consumers. The only thing better than what's happened with the GPU fight now would be if AMD had a good card (and I'd be cheering its release) in the 20nm gen too and we saw this kind of competition there!
You don't buy a $500+ card to run FXAA at 1080p...
Hotter and louder than Fermi? Completely false... It actually pulls less power than Fermi.
It boosts over 900MHz consistently in gaming situations. So that is false.
There have been tens of thousands shipped by partners.
Tiny amounts? You do realize how many are on a pallet that gets shipped to the e-tailers, right?
Sorry, Nvidia just evened the playing field, they didn't take any sort of advantage.
You do if you don't like alpha and shader aliasing, which is far worse than edge aliasing ;) - most people know this. Most people running 1080p don't run 60Hz nowadays; otherwise they'd be on an IPS panel instead, whether 1080p 60Hz or 2560x1440 60-120Hz.
Hotter/louder indeed, power consumption is a different thing.
It boosts to that on some non-preheated runs, but when pre-heated or in actual gaming it often falls to 800-900MHz, as I said, rather than 1GHz.
Tens of thousands shipped? I doubt it, since Newegg seems to receive 5-10 at a time and thus sells out quickly due to the tiny quantity ;).
$500 plus $100 or more worth of games (ignoring the Shield coupon) versus $550 plus no games... with similar performance but better acoustics/thermals, most people will go for the better-priced option. Don't get me wrong, I am VERY glad we're finally seeing good competition; it's only a good thing for us as end-users!
But speaking of stagnant CPU innovation, a lot of people have completely overlooked just how far peripherals have quietly come in the background. I just got a Corsair K95 and holy :banana::banana::banana::banana: is a mechanical keyboard a massive upgrade over my previous ergonomic one. Interestingly, it's almost more comfortable to type on: even though I sacrificed the wrist support, the keys are just so damn responsive. Or you could look at the Korean 1440p monitors, or how SSDs are finally affordable. In general I think there's a lot of nifty stuff that's been rolling out without much noise.
At guru3d and TechPowerUp, under gaming loads, it pulls similar or greater amounts than a GTX 480 - maybe only by 20 watts at times, but that's still more. So it's a pretty safe educated guess that the 290X consumes as much as Fermi.
In addition, they both operate at 95°C, and even at that point the 290X still continues to throttle in HardOCP's testing. HardOCP had to set the fan beyond the Uber limits to stop it from throttling so it could maintain 1000MHz.
At that point it actually beats Titan significantly (maybe 10%), but noise goes up beyond Fermi levels, beyond dual cards.
So it is very much a Fermi, but Nvidia had a better excuse for it: it was first-generation silicon, and an even bigger chip than the 290X's. AMD has lots of experience with 28nm, and they're designing a refinement of GCN rather than a whole new architecture like Fermi was.
The amount of heat it can generate is spectacular.
http://hardforum.com/showpost.php?p=...10&postcount=2
An overclocked 290X can get into the 80s (°C) even with water cooling; I have never seen a card do that.
Add in the overvolted overclocking testing at TechPowerUp and the 290X is indeed piping hot.
So similar to Fermi, not worse. Fermi's fan does a good 65-70dB at stock; I didn't see the 290X going over 60dB.
Nvidia's excuse was that they had a broken architecture that could barely beat a chip almost half the size. Nvidia simply didn't have a choice.
AMD made a decision and decided to set the limits to where they did to balance the design.
That 80C is false. If you believe that, well then I can't help you.
Sure.
PedantOne said the 780 Ti is GK110-x00-B2.
I asked what B1 was (and A1 for that matter).
If the 780 Ti has indeed 2880 cores, it is not GK110-400-xx but GK110-500 (or 600 or whatever)-xx.
Since the first fully enabled GK110 GPUs entered the market only very recently or are not even released yet (Quadro K6000, Tesla K40), the 780 Ti will most certainly not use different silicon, but the same. Thus there are no previous 2880-core GPUs that could be A1 or B1 in order for a B2 to exist.
Unless the stepping is independent of the number of enabled SMX clusters. But even then, why not B1 for the 780 Ti?
Edit:
tajoh111 is right. The 290X pulls more power than Fermi with the Uber BIOS, significantly so:
http://www.3dcenter.org/artikel/laun...9-290x-seite-2
http://www.3dcenter.org/artikel/eine...tromverbrauchs
279W on average for the 290X uber mode
235W on average for the 480
Both values are averages of multiple measurements that measured only the card itself, so it doesn't get more accurate than that.
- The 290X is late just like Fermi was.
- It is hot and loud just like Fermi was.
- It doesn't beat the competition at even ground across the board (quiet bios vs Titan stock or uber bios vs Titan@maximized targets) just like Fermi did. In 4K it wins slightly by under 10% but loses in 1080p or with SGSSAA:
http://ht4u.net/reviews/2013/amd_rad...ew/index46.php
https://www.computerbase.de/artikel/...90x-im-test/5/
- It uses more power than the competition in either comparison (see above), and the same or more than Fermi depending on the BIOS mode.
Now the positives:
- The 290X's perf/W doesn't fall as far behind Titan's as the GTX 480's did behind the 5870's.
- The GPU is smaller, a great feat of engineering.
- Price.
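To make that perf/W point concrete, here is a minimal sketch using the wattage figures quoted in this thread (279W Uber 290X, 235W GTX 480) plus some assumptions of mine: roughly equal 290X/Titan performance, a ~250W Titan board power, and a ~135W 5870 (implied by the "~100W more" claim later in the thread). Treat the numbers as illustrative, not benchmark data:

```python
# Rough perf/W comparison. Wattages are from the thread; the relative
# performance values and the Titan/5870 board powers are assumptions.

def perf_per_watt(relative_perf: float, watts: float) -> float:
    """Relative performance divided by average board power."""
    return relative_perf / watts

# 290X Uber vs Titan: assumed roughly equal performance.
r290x = perf_per_watt(1.00, 279)
titan = perf_per_watt(1.00, 250)

# GTX 480 vs HD 5870: ~10% faster for ~100W more (thread's numbers).
gtx480 = perf_per_watt(1.10, 235)
hd5870 = perf_per_watt(1.00, 135)

print(f"290X reaches {r290x / titan:.0%} of Titan's perf/W")
print(f"GTX 480 reached {gtx480 / hd5870:.0%} of the 5870's perf/W")
```

Under these assumptions the 290X lands around 90% of Titan's efficiency, while the 480 managed only around 63% of the 5870's, which is the gap the bullet point above is describing.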
Considering the power consumption scaling we have seen, I could see it being that hot. W1zzard was already there when applying just a bit more voltage and overclocking to 1190MHz: his system consumed 650W (originally 400W), which meant the card was consuming 550W. This was at 1.26V. An overvolted GTX 780 Lightning system consumed less than 400W (around 300W for the card) in the same test, and it was overvolted even more and clocked higher as well.

I have seen people clock a 290X at 1.4V under air, so if someone pushes water even harder, and anything else is in the loop, I could see the loop being overwhelmed and reaching those temps. Some of Intel's hotter chips can reach 80°C under water when sufficiently clocked. I was just looking around, and Fermi could push near 70°C in a water cooling loop when stressed - and I don't think it's capable of drinking as much power as an overclocked 290X. I would be scared to see what kind of voltage this thing can drink down beyond 1.4V. I don't think we have anything whose power consumption climbs as much as a 290X's as you overclock further.
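The card-power figure in that post is just wall-power subtraction, which is worth spelling out because it's only a ballpark method. A minimal sketch; the ~100W non-GPU baseline is my assumption chosen to reproduce the post's ~550W claim, and the subtraction ignores PSU efficiency entirely:

```python
# Estimating GPU power from total system draw, as done informally above.
# baseline_w is what the rest of the system pulls (an assumption here);
# real measurements would also need to correct for PSU efficiency.

def card_power(system_load_w: float, baseline_w: float) -> float:
    """Approximate card draw as the rise over the non-GPU baseline."""
    return system_load_w - baseline_w

# 650W system load from the post; ~100W baseline assumed.
oc_290x = card_power(650, 100)
print(f"Overclocked 290X estimate: {oc_290x:.0f}W")
```

This is why at-the-wall numbers from different reviews are hard to compare directly: the baseline and PSU losses differ per test bench.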
Also, according to TechPowerUp, in Uber mode the card is just as loud as a GTX 480. Both cards are 9 decibels above a GTX 580.
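For context on that 9dB gap: decibels are logarithmic, so a 9dB difference is roughly an eight-fold increase in sound power. A quick check of the math (this says nothing about either cooler specifically, just what the unit means):

```python
# Convert a decibel difference into a sound-power ratio: ratio = 10^(dB/10).
def db_to_power_ratio(db: float) -> float:
    return 10 ** (db / 10)

ratio = db_to_power_ratio(9)
print(f"+9 dB = {ratio:.1f}x the sound power")  # ~7.9x
```

Perceived loudness scales differently (a common rule of thumb is that +10dB sounds about twice as loud), but either way 9dB is a very noticeable step up.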
You might hate to admit it, but Thermi and the 290X have a lot in common as far as thermals and temps. It's almost striking.
http://hardforum.com/showthread.php?t=1788367
Quote:
I enable Post AA (any level) (or leave it enabled as the default setting): 33 votes (26.40%)
I disable Post AA: 92 votes (73.60%)
So we go from 290x is better than Titan, fact.
To "we can't compare 290x to Titan because GTX780 is basically Titan for cheap."
To "even footing 290x doesn't beat Titan."
Make up your mind and please stop shifting the goalposts.
Fermi doesn't pull only 235W; it can pull over 300W in gaming situations.
The 290X isn't competing against a card that has a max power consumption of 180W.
The GTX 480 was consistently pulling ~100W more in gaming situations to give ~10% more performance.
The 290X is using ~30-40W more to beat the competition by ~15% on average.
I already told you about sites that showed the 290X can pull as much as or more than a GTX 480 in the same scenarios.
The 290X is a complex product because it tries to pull off two modes at once: one where it is a hair slower than Titan (quiet mode),
and a second mode where it is a bit faster than a GTX Titan. It's the nature of the beast. No other cards have this quiet mode.
Maybe 10 percent on average against a GTX 780. Not 15 percent more on average, unless maybe we turn on Uber mode.
The GTX 780 is so neutered and underclocked that it might as well be a small chip like the 5870. It's Nvidia's attempt to sell massively docked GK110s and do no binning. If we consider how mangled it is and how hard AMD is pushing their own chip (which is also fully enabled), it should be no surprise it gets beaten, particularly with more and more Gaming Evolved titles being used.
Fermi might not have shown its gaming worth for its size, but it more than made up for it in its primary purpose: the professional market.
It was the best professional compute card of its time - it literally made trash of everything else. It spanked AMD's professional cards, often being twice as fast and sometimes four times as fast. Even today, AMD's W9000 cards have a hard time competing against it, and that was against a cut-down chip with very low clocks (HotHardware and Tom's Hardware reviews). So even though it had high power consumption, it justified it with capabilities that are yet to be seen on the 290X.
Did you overlook where I said "on even ground"? I'll reiterate for you:
quiet mode (stock setting) vs. Titan stock setting
OR
uber mode vs Titan@maxed targets
Titan and the 290X are equally fast if you value a fair comparison - that's a fact. Don't fall for most of the English speaking reviews - they are biased towards AMD (maybe not even intentionally) by not or not properly pre-heating their cards and by testing uber mode vs Titan/780 stock settings with a temp target of 80C.
Read the review links I posted if you don't believe me. And don't quote selectively, quote whole sentences.
Source for your 300+W claim?
The 290X certainly can - 306.88W:
http://ht4u.net/reviews/2013/amd_rad...ew/index21.php
You do understand how an average works, right? In some games a card may pull more than the average, in others less. You won't find better power consumption data than what I posted.
And maybe you also overlooked where I said that the 290X wasn't as awful in perf/W compared to the competition as Fermi was.
And you are the one who should make up your mind. What is it now - Titan killer? Then compare it to Titan, which it doesn't beat. Or GTX 780 killer? Okay, if you want to call 10-15% a "killer", fine. But then also use the GTX 780's power numbers instead of Titan's: 189W (GTX 780) vs 239W (290X quiet mode). That is a hefty 50W more.
More evidence of the 290X's power-drinking ways.
http://www.anandtech.com/show/2977/n...rth-the-wait-/
http://www.anandtech.com/show/7457/t...9-290x-review/
http://imageshack.us/a/img30/6678/hcxh.png
http://imageshack.us/a/img19/3287/h9wj.png
Guess what these reviews have in common.
There is a 5870 in both of them.
And there are cards that use about 100 more watts than a 5870... guess which cards those are.
The GTX 480 and the 290X (in quiet mode). With the card in Uber mode this grows to about 130 watts, which would make it the biggest power-drinking single-GPU card of all time by a comfortable margin.
If Nvidia hadn't cut down the GTX 780 so badly and clocked it so conservatively, AMD wouldn't even have that lead. The GTX 780 can be clocked quite a bit higher without increasing power consumption, judging by some reviews of overclocked models at HardOCP or TechPowerUp.
OK... where were all these "even footing" and "we need to change our testing/benching methodology" arguments the past 2-3 years?
Why do you need to shift the parameters of the test? Why can't you test stock vs stock?
That's right because the results don't show what you want.
Look at any GTX480 review.
No I don't know what average means... please explain.
I bet I can, because those results look extremely strange. A GTX 780 and a Titan pulling less than 200W?
There is something screwy going on there because those results don't reflect what other sites have found.
Ahhh... good old scientific method, let's just throw you out the window; we don't need you.
All I hear is "ifs," "ands," and "buts" - don't worry about comparing what is, we can only discuss what could be.