boxleitnerb: it is a pre-NDA period before the final NDA end date :) They told us you cannot show anything before xxx date, BUT the exact NDA date will be updated in a few days
Audio in games is something that has needed work on the PC for some time. Surround hasn't worked properly for years now. So another big thumbs up to AMD for at least trying to sort it out.
Looking at it all, Hawaii seems nice. The rest of the pack, not really, as it seems to be the same chips at the same prices with the same performance.
What seems really exciting about all this is Mantle and TrueAudio. I didn't even realise how big a deal Mantle could be: a new graphics API dedicated to AMD hardware. 3dfx and its MiniGL come to mind, and S3's METAL, and how much better they were compared to generic D3D performance-wise. Of course, it could all be big talk with nothing coming of it, BUT AMD has landed both consoles with their APUs and that changes the picture dramatically. Also TrueAudio: finally someone tries to tackle PC audio, broken since Vista times, and it's an ideal thing to incorporate into GPUs and APUs. Too bad all this is only going to kick in from the next gen and not this one, as there seem to be only 1 or 2 new chips with it and the rest is just a rebrand.
TrueAudio, Mantle = useless stuff, nothing interesting. We want performance, not this BS :)
That's one huge stock fan, hope it's quieter than the last ones...
I can't help but think of Glide when I read about Mantle. Does anyone know if it'll be closed source, as Glide was?
Open, but it needs Graphics Core Next. As far as I can see it won't work on a 6970 or older, and I don't know what NV can do.
http://anandtech.com/show/7371/under...cs-api-for-gcn
Agreed.
I got tired of struggling with generation after generation of Creative cards with poor drivers (since the HDD corruption with VIA chipsets I haven't experienced any major showstoppers with Creative, just an endless stream of niggles that were never addressed), so I gave up after the ZxR.
I also tried a couple of C-Media based cards (one ASUS, one Auzentech) but again the drivers were buggy, and I had to rely on third-party modified drivers to get most of the functionality most of the time.
So, I decided to ditch 5.1 and go stereo. I chose what looked like an appropriate match of stereo amp, DAC and speakers (not audiophile grade; once you step there the budget spirals out of control) and never looked back. I lost a great deal of positional audio but I have good sound all the time. No SCP, no blue screens, no games stuck in an infinite audio loop.
I disagree with PedantOne. If this audio engine implementation delivers and is not just a marketing gimmick to make up for shortcomings in other areas, it is a big plus for me.
I'm looking forward to similar tech making its way to their APUs someday :)
-PB
Bit confused. Is this pre-paperlaunch stuff?
That's the way I understand it, yes :shrug:
I see, just to whet the appetite. A few years back people hated paper launches; how times change. :)
The architecture looks good, but we need at least an NDA lift date; this kind of sucks.
Sorry, again off topic, but I am returning to the X-Fi. The onboard audio of the P8Z77-V Pro is horrible after the X-Fi.
Actually Asus really seems to have botched the Windows 8 release. There were complaints all over the forums for a while there with those Xonar cards.
I also find Dolby headphone very good on the Xonar models.
Aside from that, no games use OpenAL or EAX anymore anyway, so as far as positional audio goes just about everything is pretty much the same.
I wonder if this TrueAudio is supposed to work in conjunction with Mantle. I don't see any other way it would be any better than what's already on the market.
So this TrueAudio stuff, can it work together with a separate sound card (like the ASUS Essence STX I've got)? The Essence STX has great audio quality, but positional sound is not so good. If it can be combined with TrueAudio it could be great for everything I do with it.
I was just going off speculation from two months ago that they "could" be out by late Sept. or early Oct... and I haven't had time to keep up with this thread.
That's one of the dumbest theories I've ever heard, of course it's 512bit... ^^
You can cut it out with the pissing contest already... danke much.
Say what
Well, it seems it's only for the "BF4" bundle. I doubt it's gonna be limited overall.
That would be my guess as well. It will probably require an external DAC through USB or SPDIF or HDMI, or whatever they use. It's unlikely to work with current sound cards (especially considering the atrocious drivers...).
They may just make it work with various USB headphones, which would be cheap enough and pretty cool for your average gamer.
Baumann said that it would work through any of your normal outputs.
http://forum.beyond3d.com/showpost.p...&postcount=198
Quote:
Originally Posted by Dave Baumann
Ah, that's pretty neat. Although I still think they will have many issues due to crappy ASUS and Creative drivers. In fact, not sure ASUS and Creative are going to be willing to assist this sort of thing, as it may just be their direct competitor...
Yeah, it sounds like it is meant to be relatively easy to implement from the user side but, like you said, sound card issues are way too common.
It gave me chills thinking about trying to setup/receive TrueAudio and then trying to change all the effects/settings on the soundcard to not interfere.
It seems the NDA for the R9 290X will end on 10/02/2013! Only a few days.
Let's hope it Fire Strikes 10k+, otherwise it's pretty sad; that's of course IMHO.
BF4's public release date is 28/29.10.
Yes, but R9 290X + BF4 pre-orders start from 10/03/2013. You can buy the Radeon sooner, and there will be a coupon or code for download after the game's launch date.
http://www.youtube.com/watch?v=uR1Cy75DOXE
Quote:
showing it off running across three 4K resolution monitors from a pair of AMD Radeon R9 290X GPUs in CrossFire mode is pretty sweet.
http://www.tweaktown.com
R9 290X will be a great card no doubt but I will hold on to my 7970 for the time being, I think I am still ok with it gaming at 1200p for now. When I see a new release that brings me close to 100% improvement then I will surely hit the trigger..
Quote:
You've heard about Mantle and how it will boost the performance of AMD's cards on Battlefield 4, right? Well, today we get a glimpse of what we can expect from that low-level API. According to AMD, Battlefield 4's demo (at its recent AMD event) was running on a single R9 290X card at 5760×1080 (though it has not been confirmed whether it was running at a constant 60fps. We should note that we did not notice any slowdowns, so we are most probably looking at 60fps gameplay heaven).
http://www.dsogaming.com/news/battle...-at-5760x1080/
Quote:
Now that's simply spectacular. Rumor has it that the R9 290X's specs are lower than those of Nvidia's Titan card, yet with Mantle it seems that AMD's latest GPU will be able to compete with, or even surpass, Nvidia's GPU flagship. Seems that AMD's comment about the R9 290X surpassing the Titan was true after all.
We're still a bit skeptical about this whole "Mantle" thing, and whether it will be usable by other GPU manufacturers. Yes yes, AMD claimed that Mantle will be open and Nvidia will be able to use it, though during its presentation the red team claimed that Mantle works only via GCN, and as we all know Nvidia's cards are not based on that architecture.
I disagree. If the Hawaii chip is 30% faster (assuming), and the next-gen 20nm tops Hawaii by 40% (assuming), then you've already got 82% more performance than Tahiti (1.3 × 1.4 = 1.82); that's good enough for an upgrade in my case.
I just think 30% over what I've got isn't quite enough yet; my 7970 will last me a long time still..
Yeah, but you said 100% and you only supposedly will get 82% ;). For a full 100% perf. increase you will essentially need 3 generations, which in turn means 4+ years.
I have a 4870X2.
Would this new card give me a 100% performance increase for 1080p gaming, at the same settings I use now, to achieve ~60fps?
Or would it still be a waste to upgrade? It's only been the 5870, 6970, 7970 since then.
So 4 complete generations.
Or would I have to play some random game that might actually require it, for further eye candy I couldn't even notice (or set in game)?
Currently I max out all the games I have played, with no noticeable perf issues.
I do have a lot of games though: 570+ on Steam, 20+ on Origin, 100+ on Desura.
Have yet to encounter a need for a new video card, though I haven't played 555 of the games yet.
If you had the money to buy all those games you have the money to upgrade that card. I couldn't use mine longer than 3 days before I sold it, it was terrible.
Yes, but I don't wanna waste it. I am just trying to figure out if there is any reason to upgrade yet, or whether I should wait another gen or two.
Is there anything else, other than just higher res etc., that I am missing from these new GPUs?
I hear they do hardware acceleration of online video streaming now, which I kinda think is cool, if it can improve the quality of the image - like my sound card can turn horrid sound into something quite good.
I have been very interested in upgrading since the 6970 but keep talking myself out of it, like all I will achieve is a 15% perf gain (under my usage, 1080p up to 60FPS).
People don't manage to build wealth if they frivolously waste it as they go :p: and they don't stay wealthy if they dump it on anything that looks shiny :D (just a general comment on "if you have the money to buy X, why not Y"). To answer Greg83's question: if you're only after 1080p 60fps you might see a small gain, but if you're OK with "only" 2x MSAA or FXAA in the newest games I'm not sure you'll really see a big benefit, other than escaping the obvious microstuttering problems (but if you don't notice them, no need to change, of course).
I'm going to try and be one of those 8K people...hopefully PowerColor is one of the partners. If you guys got any info on where I would go to pre-order I would appreciate it.
Well...if non-BF4 edition R9 290x cards are out before the BF4-edition card is out...then never-mind. Wow, I just want a PowerColor R9 290x come on already!
Since I own a 780 Lightning I'm going to wait and see how it all comes out. If r290 is a beast then I might wait for the Lightning version of this too.
Fun times ahead boys
:D
I'll be damned they really are going without a crossfire bridge. Why?
Why not just redesign the CrossFire bridge? It's not like SLI, where the bridge is packaged with the motherboard.
One of the things that I really loved about CrossFire was that tri-fire works natively with Ivy Bridge and Haswell on motherboards like the Gigabyte UD5H. I can't imagine that this won't require more bandwidth from the PCIe slots and possibly bottleneck a PCIe 2.0 x8 or PCIe 3.0 x4 slot.
Well, it's not that PCIe 3.0 is any bottleneck really. Why not put the excess bandwidth to good use and at the same time shave a few cents off the board's BOM by omitting the CF link?
A lot of people use motherboards with pci-e 2.0. I can't imagine that this wouldn't bottleneck people running those boards especially since most default to x8.
Like I said about tri-fire it'll probably also cause issues at pci-e 3.0 x4 like the Gigabyte UD5H (works great for 7970 trifire) and MSI Mpower. I don't see how this has any pros for the consumer.
I agree. I think they rushed their decision a bit too much.
It's probably fine on 2011, but on 1155 some people's performance may suffer...
I guess there are two reasons to go with this: cutting costs, and they couldn't figure out how to solve all frame-pacing problems with the double-connection scheme.
most interesting is this:
Attachment 131371
Well that puts a damper on crossfiring these new cards on my am3+ mobo lol
Direct access between GPU display pipelines over PCI Express.
That explains it all.
Plus, wasn't it already said that the CrossFire bridge only has the bandwidth of a PCIe x1 slot, with lower latencies? It didn't have much use in recent CrossFire as it evolved and became redundant.
Surely they optimized it and have now left the old method behind completely.
This new slide shows alternate frame rendering isn't the method used anymore.
It might even mean the texture RAM issues of CrossFire are gone too.
How good is Mantle vs DirectX 11 in terms of picture quality, and does it have tessellation, for example? Or is this going to be basically OpenGL but faster?
Hi all friends.
Sorry, could someone please explain the DMA engine in the Radeon R9 290X, or over PCI Express?
Thank you.
Hehe :), that's awesome and must be a really enjoyable experience to upgrade each time. I'm planning on trying to hold out with a 670 w/ 680pcb/hsf 4GB SLI setup (got a great deal but may or may not end up being eligible for warranty which would mean I'd return them if not and won't know until they arrive soon) for a Maxwell refresh or Volta intro, upgrading from my single 780. I figure it wouldn't be too hard since those two cards should outpace my 780's oc'd speed by around 40-45%, and a new chip isn't likely to be much faster than that over GK110 anyway for the initial Maxwell 20nm chips. Combine that with trucking on a 2600K oc'd still from when they first launched (literally day 1) and I may have to turn in my badge :p: !
For everyone wondering about CF through PCI-e think about HSA.
You are right... You had the connector wired on the DVI of the master card (one cable went to the slave, and the second to the monitor). (I had CFX with the X1800PE 512 - hmm, not sure about that one, I remember buying the X1900 only 2 months later - then X1900/X1950XTX > 2900XT > 4870 > 5870 > 7970, and SLI too at the same time, starting from the 6600GT SLI. I still have nearly all the GPUs here.)
The way they do CFX looks to have been rewritten there, but I would not even be surprised to see something similar from Nvidia for SLI in the next generation...
All it says is that it's "compatible with frame pacing technology". I don't know where they made the leap to "this technology basically fixes most of AMD's frame pacing problems".
Likely it's not that removing it had no performance penalty, but that keeping it did hurt performance.
Maybe they found the low-latency, low-bandwidth CrossFire link was harming performance. Unless they put a 16x PCIe link on top for CrossFire, it doesn't actually make any sense.
Now that they're trying to prevent lag and micro stuttering, if you have a lower-latency bus for CrossFire you introduce a calculation differential.
Does 1x PCIe worth of bandwidth really make any difference?
Sounds like CrossFire left the old realm of "copy everything to both cards, process half on each" and moved towards how that hybrid ATI + Nvidia card tech was going to work.
Maybe this will also result in closer to 100% scaling now.
Even ATI said back in the 4870's time that the CrossFire link had become pretty useless and was only a bus used to say "hello" and initiate CrossFire.
AMD are certainly bringing some interesting new ideas to their hardware. I'm really interested to see how all these ideas pan out. We could really see some of the best performance gains over the next couple of years. Interesting times ahead indeed.
Remember they now support Eyefinity with 4K displays @ 60 fps (the Dirt 3 video)... I think that's the reason why they just didn't update the CF bridge.
I don't know about that. If you take a modern high end video card and run Heaven on a Z67 board, and then pop in a sound card or something into the bottom pci-e slot which will drop the top slot to x8 you'll lose a few points. I noticed that with a GTX680. Wasn't a huge difference but I wouldn't be shocked if the difference were larger with titan.
Nvidia's SLI doesn't have that issue and the bridge does provide more bandwidth. They could have gone that route.
I'm just disappointed that no one is really going to test this. Or if they do it'll be like anandtech's awful pci-e 3.0 comparison on Titan where they decided to use X58 with a full 16 lanes per card.
A few points of drop pretty much fits in with the 8x to 16x difference that happens on any card.
CrossDMA is the same idea as the CrossFire bridge. The old bridge was ONE PCIe 1.1 lane; now ONE of the 16 PCIe 3.0 lanes is dedicated to sync. The function is the same as the bridge's.
16x PCIe 3.0 / 8x PCIe 3.0: no performance hit. 16x PCIe 2.0: no performance hit. 8x PCIe 2.0: maybe a minimal performance hit, but I think very, very small.
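If that one-lane comparison holds, the raw numbers look roughly like this. A quick sketch with nominal per-lane rates I'm assuming from the PCIe specs, not anything from AMD's slides:
Code:
# Nominal per-lane, per-direction throughput after encoding overhead
# (8b/10b for PCIe 1.1/2.0, 128b/130b for PCIe 3.0). Rough figures only.
lane_mb_s = {
    "PCIe 1.1 x1 (old CF bridge, per the post above)": 250,
    "PCIe 2.0 x1": 500,
    "PCIe 3.0 x1 (one lane of a x16 slot)": 985,
}
for link, rate in lane_mb_s.items():
    print(f"{link}: ~{rate} MB/s")
# So a single PCIe 3.0 lane already carries roughly 4x what the old bridge did.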
What tells you Nvidia is not taking the same road as AMD for Maxwell? Don't forget the latest GPUs from Nvidia are still Kepler based... I have the feeling it is even a necessity for 4K displays and multiple monitors.
Anyway, the majority of data (>90%) for CFX and SLI is already moved through PCI Express. I'm not even sure the difference would be quantifiable in %...
The way GPUs handle information through PCI Express is changing now, with new GPGPU features including virtual memory access, tiled resources, etc.
AMD is not crazy; if it had a negative impact and sacrificed performance, they would have just kept the bridge... lol
I've always been told that the SLI bridge handles more bandwidth. Why do you think you need a special bridge for tri-SLI? You're comparing apples to oranges.
Thank you, my question is why not update the CrossFire bridge to something that can handle a little more bandwidth?
That's not what you said before. My point is that it's already happening on a slower card that's not putting excess stress on the PCIe lanes for multi-GPU sync.
No, its the same.
To cause a significant drop requires a system bottleneck, typically the HDD during streaming or GPU ram if you're running your resolution/AA way high.
Switching from 16x to 8x only shows a small decrease in performance, typically due to DMA request depth: since you have fewer lanes in parallel, you have to stack requests deeper in series to make up for it. A small latency hit at most, and no bandwidth issues as long as you aren't running into the bottleneck issues, which is fairly hard on a single card with 3GB or more of memory at under 1600p.
Been out of the loop for awhile. Really exciting to read about what AMD has planned and is doing with this new gen. Can't wait to upgrade my 6950. (It pays to skip a generation or 2 to get that wow feeling). I remember that AFR rendering slide presentation AMD did when the 4870x2 first came out. If AMD has figured out a more effective and efficient way to render and improve crossfire even the haters have to tip their hats. Since I already preordered my PS4, take more of my money please AMD!
And mantle...oh my!
I will be an early adopter.. anyone else?
http://www.overclock.net/t/1428344/v...#post_20890405
Quote:
Why would you need it? It needs to be able to transfer that much data in that much time. While more speed and lanes will get it there faster, it only needs to be there by the deadline, which for a 60Hz monitor is 30 times every second for 2-card AFR.
For a 4k 60hz screen, it needs to be able to transfer a full 34MB of data every 33 milliseconds to get there by the deadline.
PCI-e 2.0 x8 (4GB/s) can transfer that in 8 milliseconds.
PCI-e 2.0 x8 can transfer one 1080p frame in 2 milliseconds.
Theoretically speaking, a PCI-e 2.0 x8 has enough speed to handle 4k eyefinity at 60hz. It would be like running crossfire in a PCI-e x2 slot, but you could. Practically speaking, you want a 2.0 x16 or 3.0 x8. Even at this most extreme of resolutions you still don't need a PCI-e 3.0 x16
oh, and because math:
PCI-e 3.0 x16 (16GB/s) can transfer 4k 60hz in 2 milliseconds.
PCI-e 3.0 x16 can transfer one 1080p frame in 0.5 milliseconds.
Pretty simple math to work out how much other resolutions and refresh rates will need.
Also there's that video of Dirt 3 on 3x4K displays on a pair of cards in crossfire. Seems to work fine.
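For anyone who wants to redo the arithmetic in that overclock.net quote, here's a throwaway Python sketch (my own back-of-envelope numbers, assuming uncompressed 32-bit frames and the nominal link rates the post mentions):
Code:
def frame_mb(width, height, bytes_per_pixel=4):
    # Uncompressed frame buffer size in MB (32-bit RGBA assumed)
    return width * height * bytes_per_pixel / 1e6

def transfer_ms(size_mb, link_gb_s):
    # Milliseconds to move one frame; MB divided by GB/s comes out in ms
    return size_mb / link_gb_s

uhd_4k = frame_mb(3840, 2160)    # ~33 MB, matches the quoted "34MB"
full_hd = frame_mb(1920, 1080)   # ~8 MB

print(f"4K over PCIe 2.0 x8 (4 GB/s):    {transfer_ms(uhd_4k, 4):.1f} ms")    # ~8 ms
print(f"1080p over PCIe 2.0 x8 (4 GB/s): {transfer_ms(full_hd, 4):.1f} ms")   # ~2 ms
print(f"4K over PCIe 3.0 x16 (16 GB/s):  {transfer_ms(uhd_4k, 16):.1f} ms")   # ~2 ms
print(f"1080p over PCIe 3.0 x16:         {transfer_ms(full_hd, 16):.1f} ms")  # ~0.5 ms
# With 60 Hz two-card AFR each GPU has ~33 ms per frame, so even the
# slowest link above makes the deadline with plenty of margin.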
I don't know why, but I find some of the information in this a bit strange: no dynamic clocking, but it can reduce clocks... I don't think AMD is back to the 6000 series era. And the 288GB/s, as if AMD would have developed a 512-bit MC just to get the same bandwidth as the 7970...
Possible, but there was not much information in the non-NDA part of the presentation beyond ">300GB/s" and ">5 TFLOPS"... You need 885MHz with 2816 SPs to match 5.0 TFLOPS, and their slide said higher than 5 TFLOPS. Not to mention that at this clock speed you are far, really far, from the triangle rate given by AMD (quick sanity check of the arithmetic below).
http://img266.imageshack.us/img266/6834/p6sq.jpg
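Quick sanity check of that ">5 TFLOPS with 2816 SPs" arithmetic, assuming the usual 2 FLOPs per shader per clock (FMA); the 1 GHz line is just an illustrative guess, not a leaked spec:
Code:
def sp_tflops(shaders, clock_mhz, flops_per_clock=2):
    # Theoretical single-precision throughput: shaders * FLOPs/clock * clock
    return shaders * flops_per_clock * clock_mhz / 1e6

print(f"2816 SP @ 885 MHz:  {sp_tflops(2816, 885):.2f} TFLOPS")   # ~4.98, the break-even clock
print(f"2816 SP @ 1000 MHz: {sp_tflops(2816, 1000):.2f} TFLOPS")  # ~5.63 if it runs at 1 GHz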
Charts... Titan's minimum fps is higher than average...
Chart #2... Titan's minimum fps & average fps = exactly the same.
Sure about the validity of those "tests" :D
Being color blind, it's hard to see which line is which in the graph. I can see the difference in the colors of the labels, but in the graph itself it's useless; I just can't read it.
Something like blue and red would have worked.
I see that a lot lately: people put together these graphs with half the colors so close together that I can't tell them apart when they are all squiggly lines in a graph.
I don't mean it as a rant but I'm just saying, use colors that aren't so close together in the spectrum guys...
Time to release my Asus 7970 TOP
http://www.3dmark.com/3dm/1255142
A probable score for the 290X...
Some comments I see on Beyond3D (source of the futuremark link)
Quote:
Graphics score: 10882, almost 1000 more than titan. 9238 total score.
If this were true, why did the AMD slide show only near 8000 for the R9 290X?
Quote:
I don't know if it is true or not... (I mostly think it is fake; it could be overclocked, etc.)... But I would not base too much on what we have seen in the presentation about performance... They really did not give any precise numbers, only some hints ("higher than", "more than"...).
Quote:
Memory reported = 3.072 MB. Doesn't R9 290X ship with 4 GB?
Quote:
The GTX Titan shows 3GB in 3DMark too.
W1zzard fixed the clock info in the TPU post.
Quote:
32bit Vs. 64bit related problem
http://www.techpowerup.com/forums/sh...8&postcount=16
The 290X result shown above is likely fake. However, that's not to say the 290X won't reach those levels. ;)
My (sold) eVGA GTX 780 SC ACX pulled more than that with a memory OC and core at effectively what it boosted stock, (I installed a TI bios so it wouldn't throttle ever, but the out-of-box boost was ~1137 and I ran it at 1150 24/7). It was hitting ~11200 graphics score with just that... not impressed unless this thing oc's like a beast and thus is able to go higher than an oc'd 780 (core+mem) would, or it comes in at a notably nicer price (excluding the paltry limited-availability BF4 bundle edition that virtually no one will get if the rumors of 8k units are true worldwide). Particularly considering the missing features and that 780s now are able to be overvolted for great core clocks.
EDIT: And it's entirely possible I'm getting jaded at companies just managing to match each other months later either way it swings, but if a card coming out several months later isn't appreciably better in some way, I just shrug my shoulders at this point.