Well, now, let's see: http://i829.photobucket.com/albums/z...roadmap1-1.jpg
AMD loves it when Nvidia doesn't show up for the fight.
It's all a series of trade-offs. After all, it took nVidia exactly two years to get to their second-generation compute architecture, while for AMD it's looking to be 10-15 months to reach its second generation. Hey, maybe both AMD and nVidia will reach their third generation at the same time. :)
I suspect an HD 7970 GHz+ edition (aka a "7800 GTX 512" edition) should narrow (or widen) the gap between the GTX 680 and HD 7970.
Really? I remember nVidia showing up every time and winning; maybe late, but they always show up and put on a good show.
The FX 5900 sucked, but they came back with the 6800 GT/Ultra, the 7800/7900 GTX, the 8800 GTX, the GTX 280, and the GTX 580. All winners. And nVidia has far better software and drivers, which in itself is worth a lot.
Trinity will rock, Vishera will hold its own, Intel's crying in a corner over SeaMicro, the entire Radeon HD 7000 lineup is out, AMD stock is up around ~$8.30 today, and nVidia is still a no-show and out at Apple.
I'll overclock a 7970 to 1200 MHz and equal a GTX 680 for ~$100 less, and I'll be able to get one, like, yesterday.
Good luck getting a GTX 680 when it launches.
OK, sorry, rant done.
I personally can't wait for nVidia's answer and their software tweaks...
This pricing structure is getting out of control!!!
I don't think these Kepler cards are going to be much better than the new AMD/ATI ones, so I'm waiting for the Maxwell generation. My GTX 580 is still a very good performer!
This can be true if it's really late, but if this launch lines up with Ivy Bridge it can actually be worse than if AMD were the one being late. If both products are out, people won't wait for a competitor's upcoming product, because both are already available; and if one outshines the other, people will buy the better product. Of course, if the GTX 680 sucks badly, everyone will just jump to AMD, because they'll already know Kepler sucks.
I don't think your argument is valid. The 6800 GT was a winner, but the X1950 XTX beat the 7900 GTX. The 8800 GTX might just be the greatest GPU of all time; its dominance can only be described by the word leng-wait-for-it-i-hope-you're-not-lactose-intolerant-dary. The GTX 280, on the other hand, sucked in a lot of ways and frankly was not a very strong performer, especially compared to the 4870X2. The GTX 580 could be described as a winner, but it took the epic fail known as the GTX 480 to get there, and it still runs very hot.
I would say the two companies have been trading blows rather evenly ever since the introduction of the 4870. Winning would imply actually having a superior product across the board, which frankly I don't think has been true, with the exception of the 8800 GT/GTX and perhaps the GTX 680 generation if the rumors are true.
You're quite the optimist.
"intel crying in the corner over SeaMicro"?!
:ROTF:
I've got some bad news for you: a $140B company with 80% market share doesn't care what a $5B company does. AMD exists because Intel allows them to. All Intel would have to do is lower prices a bit temporarily and AMD wouldn't sell a part on the planet. That is the sad reality of the CPU market.
I agree. Considering SeaMicro was purchased for a little more than $300 million in the multi-billion-dollar server world, I'm just not seeing the impact; it might as well have been AMD buying a shaved-ice popsicle stand.
If Intel decided to compete on raw pricing and deliberately priced at AMD's current levels, I doubt AMD would be able to hold out very long on such skinny margins; that's just the reality of the situation.
There's no way Intel could totally stop AMD from selling product, but there's no doubt Intel could squeeze AMD on its margins very easily, especially with the fabbing advantage.
If Kepler shapes up to anything like the hype, all of AMD's previous advantages over Nvidia in performance per mm² and power consumption will largely be nullified, with Nvidia possibly gaining the advantage all around and not just in raw single-GPU performance.
However, I'm a wait-and-see kind of person; I'll believe it all when I see it.
The advantage of SeaMicro is the density and the power and cooling savings it offers, plus the fact that it has actual servers in the marketplace. There's nothing quite like it out there right now, and that's a huge advantage. The power and cooling advantage is so large that Intel's pricing becomes irrelevant for the target customers of the SeaMicro servers.
I hear that. Triple GTX 480s under water are just fine right now. I was thinking about waiting until Kepler to upgrade again; however, it's looking more and more like there aren't any games to upgrade for until, maybe, Crysis 3?
Looks like a future Maxwell upgrade for the old Smooth as well...
WTF? If you release your card later, short of screwing something up badly, you SHOULD win.
What a ridiculous thing to say.
And before you ask, I'm waiting for Kepler. Then I'll decide which card to buy. I'm not sitting solely in either camp.
Buy more monitors?
Try 3d?
There are ways! Although I suspect you'll be crippled by vram.
It doesn't have a scale, so how can you say it's in line with anything? For all you know, it could be a logarithmic scale.
We know the performance of the 7950, 7970, 580, and 480.
That means we know the differences between the 480 and 580, the 580 and 7950, the 580 and 7970, and the 7950 and 7970...
So if we assume valid results, then the scale, the resolutions, and the drivers are known and fixed. Using a different scale would make the known results invalid.
An interesting observation: Shogun 2, which is probably the game that uses the most VRAM to date (without mods), also performs the worst (compared to the 7970).
So if the graph is indeed legitimate, then it's quite obvious that 2GB is already not enough.
Notice that in Shogun 2 the GTX 680 is worse than the GTX 480/GTX 580, which have only 1.5 GB of VRAM.
It's not because of memory. It's either bad support for Shogun 2 in the current drivers, or maybe some compute and rendering features that were in Fermi but may not be in Kepler anymore.
It's an AMD Gaming Evolved title, and AMD hardware has always done better with this particular game.
I wonder how long it took Charlie to come up with that one. The man is without question a modern-day Nostradamus.
Since his brain was so tired from pinching off that pathetic thought, I'll have to add what he obviously forgot to mention on purpose: the same goes for any AMD card when you run non-AMD-optimized games.
Now, taking into account the fact that nVidia has far better support from game developers and an in-house driver team that actually fixes problems... the safe bet is clear. :)
He didn't say "optimized for Nvidia cards"; he used another expression which I couldn't exactly recall when I wrote my previous post. I think he said PhysX titles, but I'm not sure about that. I invite you to have a look yourselves and tell us what you find.
http://bbs.expreview.com/forum.php?m...ge=1#pid392116
Quote:
It seems the air-cooled limit is expected to be above 1.4 GHz
1.4 GHz, wow! Impressive.
What's the top clock a 7970 can achieve with air cooling, 1250-ish MHz?
PS: It seems NV made their own Bulldozer (a higher-clock design, a bit slower per clock), but their version actually clocks high and consumes the same or less power than the design it replaces. Unlike Bulldozer, unfortunately.
The Kepler/Bulldozer comparison is really funny to read (you guys always jokin'?). One is a winner, the other is a loser.
A GPU/CPU comparison is about as good as a car analogy, IMHO.
If you want to know what nVidia is thinkin' about Bulldozer, I recommend this link:
http://mediasite.colostate.edu/Media...157581c458a1d#
00:15:00
Bulldozer was supposed to be a winner... ;)
" ZOTAC GTX680 2GB DDR5 256 bit 1006 6008HDCP Dual DVI HDMI DP
Bron: Factoryprices - ZOTAC GTX680 2GB DDR5 256 bit 1006 6008HDCP Dual DVI HDMI DP - ZT-60101-10P - Zotac
Url: http://www.factoryprices.nl/product_...C-ZT-60101-10P
€607,59 :eek: 607.00 EUR = 797.416 USD
Let the pre-orders begin!
As usual, that's not how to compare EU to USD pricing; they are entirely different markets, with the EU having high taxes. The cheapest HD 7970s, for example, are selling for around 480-500 EUR in Germany, while in the US it's $549.
Preorder prices are usually a lot higher and can jump up or down very suddenly, but based on this I'd still guess it's targeted at at least $549... which would be a shame. Totally expected at this point, though. :(
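(Rough arithmetic, assuming Germany's 19% VAT and the ~1.31 USD/EUR rate implied by the Zotac listing above: 480 EUR / 1.19 ≈ 403 EUR net of tax, × 1.31 ≈ $528, which is in the same ballpark as the $549 US pre-tax price.)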
Actually, Shogun 2 has a massive memory leak unless it is patched to the latest version. Basically, when ending one instance, changing settings, and starting another saved game (a typical situation when benchmarking), the memory footprint goes from ~1.1GB to 3GB+ in a matter of minutes. Luckily, CA fixed it, but if you aren't using any of the last 2-3 patches, memory will prove to be a bottleneck on ALL cards to varying degrees. One thing I have noticed is a near total lack of sites actually updating their Shogun 2 installs to reflect the new in-game reality, likely due to the hassle of retesting every card just because of a patch.
The only time a patched Shogun 2 install uses more than 2GB is at 2560 x 1600 at the highest details with 4xMSAA+ enabled. A simple way around this is to use a compute-based AA routine like MLAA or FXAA, which gives very similar image quality (due to the ultra-high resolution) without any huge impact on framerates.
In Europe, prices are at parity: €1 = $1.
So if the card is $579, we pay €579.
The worst part is that this applies even in countries that don't have the euro, which is very sad.
So I had to pay €220 for my 2500K, which is about $270.
A 7970 is also about €550, which is about $690 :down:
That's not nice at all....
To be honest, I will believe it when I see it lol... Before that, I'll put it in the same bag as the rumors that try to hype the product, like that TDP/performance graph. It is so ugly I can't imagine Nvidia not firing the guy who made it. Seriously, even a 12-year-old kid would do better. (The blurred Nvidia logo, the square in the center with those circles, lol, the font used.)
I think he wanted to compare it to the P4 and Bulldozer, which were designed for high clocks... not performance- or quality-wise lol. (But I don't want to speak for him.)
But I agree with you, it's just impossible to compare CPUs and GPUs...
58 pages and counting, and this thread is sooo full of crap! :rolleyes: Ya know, crappy pics and crappy charts; they seem to be on every other page... :D:D
I hope the thing we're talking about comes out the opposite of crap... LOL! Now, somebody nawty enough to bweak eNe De eey and bwing us some "real" leaks..
Come on guys, if we are this close to release, this is the best time to "leak" what you've got. :shocked:
The cost is too high; half that would be fine.
:yepp:
So much hype... :eek:
Some cherry-picked benchmarks @ 1080p where the GTX 680 beats the 7970: http://translate.google.com/translat...age%2F6%23view
Is it only me, or is every site that leaks some new GTX 680 details getting slowed to a crawl and becoming unresponsive? Is the GTX 680 interesting people THAT much? xD
Yeah, well, the problem is that AMD undershot and Nvidia overshot their typical goals. The GTX 680 seems to be a very impressive chip taking every aspect of it into account, and it doesn't help that, despite the rather subpar performance for a new series of cards, AMD decided to price them very high (usually we get a bigger performance boost out of a new series at the same or similar pricing). Nvidia will just jump on the same bandwagon and adjust the price according to these cards, so we're paying high-end prices for what would have been a midrange/performance card if only AMD had delivered more performance. xD
The waiting game for a more reasonable price/performance offer still continues, for me at least. I've got a 3D Vision supporting monitor, so I'd prefer an Nvidia card due to better 120Hz support.
Can you explain what you mean by "cherry picked" without bringing TWIMTBP into the conversation? IMO, the whole "NVIDIA vs. AMD sponsored game" conversation has pretty much gone out with the trash over the last year or so.
Case in point:
- The Witcher 2: NVIDIA sponsored, AMD holds performance edge
- Battlefield 3: NVIDIA sponsored, relatively even performance between the two
- Dirt 3: AMD sponsored, NVIDIA holds a performance edge
- Deus Ex: AMD sponsored, relatively even performance
- Batman AA: NVIDIA sponsored, relatively even performance
- Crysis 2: NVIDIA sponsored, even performance again
- Shogun 2: AMD sponsored and AMD holds an edge in non-AA situations due to a specific lighting shader being used. With AA enabled, things tend to even out
- Saint's Row 3: Not sponsored by anyone and plays like crap on all systems it seems...
IMO, in today's world of optimizations, there are no "cherry picked" games unless a reviewer decides to suddenly start using hardware-based PhysX in comparative benchmarks.
If these cherry-picked benchmarks are the best a heavily OCed GK104 can do, it doesn't look good at all
http://tof.canardpc.com/preview/79f7...a2314c79b6.jpg http://tof.canardpc.com/preview/a50b...3249994597.jpg http://tof.canardpc.com/preview/fef3...c6c06782ce.jpg http://tof.canardpc.com/preview/6c03...485d2d4413.jpg http://tof.canardpc.com/preview/8c05...3fa1aa386f.jpg http://tof.canardpc.com/preview/53f1...8ece2bb968.jpg http://tof.canardpc.com/preview/5eb6...cd2ed8650f.jpg http://tof.canardpc.com/preview/0f75...923ff5b8df.jpg http://tof.canardpc.com/preview/e09b...e5ed97d815.jpg http://tof.canardpc.com/preview/cefa...453aec529a.jpg http://tof.canardpc.com/preview/1fab...05266ebda0.jpg http://tof.canardpc.com/preview/64df...ad4667635c.jpg http://tof.canardpc.com/preview/f8ba...504ee91046.jpg http://tof.canardpc.com/preview/abb5...81b2f011db.jpg
http://tof.canardpc.com/preview/377f...868f7859d8.jpg http://tof.canardpc.com/preview/1dba...d865880234.jpg http://tof.canardpc.com/preview/d263...5500fe1a39.jpg http://tof.canardpc.com/preview/414b...4c46ba8753.jpg
http://tof.canardpc.com/preview/5d8e...2378b85f0d.jpg http://tof.canardpc.com/preview/a1c0...d4c890a31f.jpg http://tof.canardpc.com/preview/e98e...9ccb005ab9.jpg http://tof.canardpc.com/preview/f7df...4e2513ca45.jpg http://tof.canardpc.com/preview/185d...e2b4b2b343.jpg http://tof.canardpc.com/preview/28ba...8188651334.jpg
http://tof.canardpc.com/preview/552b...7e3e573019.jpg http://tof.canardpc.com/preview/1473...cca72e34f6.jpg http://tof.canardpc.com/preview/e988...4634c6364a.jpg http://tof.canardpc.com/preview/e9c7...3d1b7aa781.jpg http://tof.canardpc.com/preview/c1d7...55e5c59f6b.jpg http://tof.canardpc.com/preview/2d3e...3b4103a31e.jpg http://tof.canardpc.com/preview/8437...c71ffbdbeb.jpg http://tof.canardpc.com/preview/737f...97c64d5241.jpg
http://tof.canardpc.com/preview/146c...82e29cdaea.jpg http://tof.canardpc.com/preview/e0c9...15d3cfa09b.jpg http://tof.canardpc.com/preview/a0f7...9d8cc7326a.jpg http://tof.canardpc.com/preview/d7c2...028da3c922.jpg
It's a Gaming Evolved title IIRC. And it runs just fine without the highest lighting setting that seems to bog the game down quite a bit.
The review used LP2, which at least used to be way out there in the Nvidia camp; not sure what the current situation with that game is. And all the other games were titles known to work well for Nvidia. I guess we'll just have to wait for SKYMTL to release his unbiased DX:HR and Shogun II results. ;)
It's good for marketing... First 1GHz top GPU EVAR!
So what's the clock? In this benchmark suite they show 706 MHz default, yet they run everything at 1006 MHz? Is 1006 MHz the default? Turbo boost? Or did they OC to 1006?
^ Welcome to the new, opaque world of benchmarking, sponsored by Nvidia turbo :D
LOL @ the BF3 chart, where's the 7970 FPS count?
@SKYMTL
I think the whole "AMD sponsored" thing just means that AMD partnered with the developer to help ensure proper Eyefinity support, maybe a little additional help here and there; they generally aren't as involved with game companies as Nvidia tends to be.
I completely agree, the days of graphical features being exclusive to either AMD or Nvidia seem to have died a long time ago. I think game companies learned that having exclusive content towards one vendor does not help with game sales.
The world is turning upside down: Nvidia using fewer transistors, a smaller die, higher clocks, a lower TDP, cooler running, a smaller memory bus, and who knows what else.
Insanity I say!!!!
Saint's Row 3 never worked well for me and the Steam forums are loaded with people complaining about the game's performance on even high end systems.
Our games list will be expanded this time around. Skyrim, Batman: AC and Wargame: EU Escalation have all been added. Saint's Row would have been added too but like I said, there seems to be a long list of issues with that game.... :(
Do you have any proof of that? I have several friends at the Ubisoft offices here in Montreal that say differently about AMD's involvement with certain titles....
Oh darn....the most popular resolution around....
Yeah, that's totally cherry picked. :rolleyes:
Indeed, personally I prefer a card that clocks itself up instead of throttling down... this will just need some study in real gaming situations. Interesting feature. (Clocks up to 1100 MHz in BF3, as reported in the article.)
(Anyway, in their article BF3 has no 8xAA setting, only 4xMSAA + FXAA.)
A little question for SKY: is the NDA down or not?
So if that chart is true, then it's pretty obvious a Fermi shader is not the same thing as a Kepler shader, since they tripled the shader count but barely increased the transistor count.
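(Rough arithmetic, assuming the rumors that Kepler drops Fermi's doubled "hot clock" and the leaked 1006 MHz figure are right: the GTX 580 does 512 shaders × 1544 MHz × 2 ops ≈ 1.58 TFLOPS, while the GTX 680 would do 1536 × 1006 MHz × 2 ≈ 3.09 TFLOPS. That's three times the shaders but only about twice the theoretical throughput, since each Kepler shader runs at roughly two-thirds the clock.)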
I hope the 680 trounces the 7970! And I hope it's only $500; I'd get two for a new rig.
Whad'ya know, something that seems like actual benchmarks! :o
1) People who buy $500+ video cards are a minority
2) People who run 3D, multi-monitor, or 30" LCDs are an even smaller minority
If you feel you'll need more video memory (which is what you are alluding to with your snide comment), then just get an aftermarket card with more VRAM, or a 7970, and have a Coke, smile, and STFU? :rolleyes:
http://imgur.com/a/aQmuA
MOTHER OF GOD
Most if not all of that was already posted.
It is not. So take it all with a grain of salt, as we don't know which driver they are using....
Check out the Steam hardware survey. High-end GPUs are MUCH more popular than ultra-high-resolution monitor setups. So it stands to reason that more people are using high-end cards at lower resolutions than at higher ones. ;)
This I personally find most interesting, even if the broken English is a bit hard to understand:
Adaptive VSync, why hasn't this existed earlier! :) I didn't fully understand the TXAA antialiasing method. Did he mean TXAA 1 only costs as much as 2x MSAA but looks better than 8x MSAA, and TXAA 2 looks even better while only costing about as much as 4x MSAA? Sounds good if true lol, but it appears games need to support it. (See the rough sketch of the adaptive logic after the quote.)
Quote:
New Adaptive VSync technology
NVIDIA has introduced a new Adaptive VSync setting aimed at gamers and at optimizing how smoothly games actually run. With conventional VSync, the main goal is to stop the graphics card from outputting frames too fast or too slow, so the output frame rate is locked to the mainstream 60 fps of the display. However, the card's actual output still varies between game scenes, and whenever the actual frame rate falls below 60 fps, the output drops straight to 30 fps, and below 30 fps it drops to 20 fps.
Such cases are quite common, and the sudden conversion from 60 fps to 30 fps makes the picture visibly stutter, which greatly affects the fluidity of the game. The new Adaptive VSync solves exactly this problem: when it is enabled, the system automatically detects the actual frame rate. If it is above 60, VSync is turned on and the output is locked to 60 fps; if it is below 60, VSync is automatically turned off and the card runs at its actual output frame rate, so the sudden drop from 60 fps to 30 fps never appears. Very useful.
New practical feature: TXAA antialiasing technology
In addition to graphics performance, NVIDIA has also added a number of new and useful features to the new-generation GeForce GTX 680 for gamers, releasing a new anti-aliasing technology called TXAA. This new generation of hardware-rendered anti-aliasing is essentially an enhanced version of the old MSAA: it provides better anti-aliasing while reducing the drop in fluidity from enabling it. For the time being it is an anti-aliasing mode unique to the GeForce GTX 680, but NVIDIA says it will be opened up to GeForce GTX 500 series models at a later time.
TXAA currently comes in two levels: TXAA 1 and TXAA 2. The former costs only about as much as 2x MSAA, yet its anti-aliasing is better than 8x MSAA; TXAA 2's effect is even more outstanding, but it costs only about as much as 4x MSAA. The game itself must support TXAA for it to be used, and a variety of games supporting it are due during the year. Although we could not test its real performance yet, based on NVIDIA's official photos the anti-aliasing clearly works, and it is worth looking forward to.
In addition to the new TXAA anti-aliasing, NVIDIA also brings good news to existing graphics card users: FXAA, which was previously only available in a few games, is being added to the NVIDIA Control Panel, where users can easily turn the feature on to get better anti-aliasing in different games and enhance image quality.
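Reading between the lines of that translation, the Adaptive VSync decision seems to boil down to a simple per-frame rule: VSync on while the game can hold the refresh rate, VSync off when it can't. Here's a minimal C++ sketch of that idea, based purely on the description above and not on NVIDIA's actual driver code; setVSync() and renderFrame() are hypothetical stubs standing in for the real driver calls.
Code:
// Minimal sketch of the Adaptive VSync idea described above; not NVIDIA's
// driver code. setVSync() and renderFrame() are hypothetical stand-ins.
#include <chrono>
#include <cstdio>
#include <thread>

static bool g_vsyncOn = true;

// Stand-in for the real swap-interval call (e.g. wglSwapIntervalEXT on Windows).
void setVSync(bool enabled)
{
    if (enabled != g_vsyncOn) {
        g_vsyncOn = enabled;
        std::printf("VSync %s\n", enabled ? "ON (locked to 60)" : "OFF (running free)");
    }
}

// Stand-in for rendering one frame; the sleep simulates varying GPU load.
void renderFrame(int frameIndex)
{
    int ms = ((frameIndex / 120) % 2) ? 25 : 10;  // alternate heavy/light scenes
    std::this_thread::sleep_for(std::chrono::milliseconds(ms));
}

int main()
{
    const double refreshHz = 60.0;
    auto last = std::chrono::steady_clock::now();

    for (int frame = 0; frame < 600; ++frame) {
        renderFrame(frame);

        auto now = std::chrono::steady_clock::now();
        double fps = 1.0 / std::chrono::duration<double>(now - last).count();
        last = now;

        // The core idea: keep VSync on while we can hold the refresh rate,
        // drop it below that so the frame rate degrades gradually (55, 50, ...)
        // instead of snapping from 60 to 30 to 20 fps.
        setVSync(fps >= refreshHz);
    }
}
If that reading is right, the difference from a plain 60 fps limiter is at the top end: above 60 fps the output is actually synchronized to the refresh (no tearing), while below 60 fps it behaves like VSync off, trading possible tearing for smoother frame pacing than the hard 60-to-30 drop.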
Somebody needs to leak those drivers, now dammit! Can't wait :up:
Adaptive Vsync, BRILLIANT!!
Benchies look fine too, good performance, and whoa 195w confirmed, adaptive clocks are excellent. Good GPU.
Looks like a sledgehammer. nVidia has done it. Well deserved respect from me.
nVidia will not be happy with an early leak like that, that reviewer better hide lol.
after nvidia adaptive v-sync.... lucid MVP team.... MASS SUICIDE....
LOL
How does that Adaptive VSync differ from a simple 60 FPS cap? I was hoping for something that would help with multi-GPU micro-stutter, but it looks to me like this simply turns VSync off below 60 FPS.
At least 4 monitor Surround seems to be confirmed. Hopefully it's also possible to use mixed monitors.
You could always use different monitors. 3D Surround is a different matter.
30% performance increase (from the previous flagship) with thrice the shader count?
How's that even possible?
By the way, the graphics industry is increasingly becoming like the CPU industry: every new generation gives only marginally better performance than the last one. On one hand that's good, because I can still hold on to my aging dual GTX 460s, but on the other it's a bit of a shame, since hardware evolution (in general) is slowing to a halt (we have to find new techniques beyond mere shrinking of transistors).
As long as we're still stuck with the X360/PS3, it doesn't really matter. Once the new generation hits, preferably really, really soon, the need for higher performance will become much larger and they will try to fill it. You can bet on that.
@Pantsu & Migi06:
You can use the framelimiter in the driver to eliminate input lag and multi-gpu stuttering.
@Stevethegreat:
Not really. GK104 is only a beefed-up GF114 successor. Don't let the name fool you; this is no classical high-end part. In that regard, performance has increased considerably.