Was looking at http://downloads.guru3d.com/download.php?det=1315
I know they are a few days old, but I haven't seen anything on the net about the 7900s; they seem to keep them under wraps. Is it the same core? Are they made to sink the X1900s?
Supposedly it's a 90nm 7800GTX (the G70 core was 110nm), rumored to have a ~700MHz core clock, with a possibility of 32 pipelines. I think the codename for the core is G71. It should be equal to or better than the X1900XTX.
it is the next phase in the perpetual game of one-upmanship.. which I just love
Then from what I gather, the performance gain is dismal at best? Although 90nm sounds good.
I don't think 32 pixel shader processors, 32 texture address units and 16 ROPs would be anything close to dismal.. it might not be a huge gain, but it will be noticeable...Quote:
Originally Posted by OmegaMerc
yes yes, but I meant a performance jump like from the 6800 -> 7800, but that was a stupid assumption since they are the same series; guess we will have to wait for the 8xxx series.Quote:
Originally Posted by nn_step
this sounds like a big performance jump to me
Yeah, about 33.3% provided that the extra pipelines scale well.. :slobber: which would spank the X1900XTs :slapass:Quote:
Originally Posted by Xenogias
So it is good news for guys like me, right?Quote:
Originally Posted by nn_step
well it will be worth the wait, but if you're going to wait.. G80 shouldn't be too much farther off.. which I think would be a better investment for your funds.. But then again, I am assuming you don't upgrade your graphics a lot..Quote:
Originally Posted by physics_geek
and I would suggest getting a 7300 just to tide yourself over for a little bit
edit: I thought they only made the MX-4000 in PCI and AGP.. why on God's green earth did they make PCIe? :scratch:
Well it is a PCI card. Picked it up from CompUSA. Forgot to update the sig.Quote:
Originally Posted by nn_step
I do upgrade my video card, but I do it once every three years. When I upgrade I always grab the best one I can find, or just wait for the ones that are about to be released.
after CES 2006
http://www.chilehardware.cl/modules....ticle&sid=1243
that's the most I know, and I have heard the same things.
Now from what I've seen, ATI is really confident they have a great card. I think NVIDIA could have something new on their VGA; if we look at history, NVIDIA has added new features to their VGAs before ATI (SM3). Now, what if NVIDIA launched a DX10-ready VGA? That would really put ATI under pressure.
Some older info (before CES 2006):
http://www.chilehardware.cl/index.ph...wtopic&t=14345
And if the frequency is also higher..... just add to the performance % :cool:Quote:
Originally Posted by nn_step
It's not gonna scale linearly; more texture units won't give it more performance. 32 pipes @ 700MHz can do 1600x1200 @ ~11667 fps of raw fillrate. I know that's not representative, but still. I do think that the 7900 will be better though, because the X1900XTX only barely beats the 7800GTX 512 in 05's pixel shader test (the X1800XT gets owned). If the 7900GTX gets a significant boost there, it will probably win most tests. I'd like the X1900XTX to stay on top though; I am a fanboy at heart.
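(For anyone checking that raw fillrate figure, here is the arithmetic as a minimal Python sketch; it assumes the rumored 32 pipelines at 700MHz and a best case of one pixel per pipeline per clock, which real workloads never reach.)

```python
# Theoretical pixel fillrate for the rumored G71 configuration.
pipes = 32
core_mhz = 700

fillrate_mpix = pipes * core_mhz          # 22400 Mpixels/s
pixels_per_frame = 1600 * 1200            # 1.92 Mpixels per frame

fps = fillrate_mpix * 1e6 / pixels_per_frame
print(f"{fillrate_mpix} Mpix/s -> {fps:.0f} fps at 1600x1200")  # ~11667 fps
```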
I think regardless of wanting info on the 7900, if you upgrade your gfx card once every 3 years to the best... you need to wait for a DX10 card. G80 will be out in a few months, R600 by the end of the year. I would buy something affordable now/soon that you can pawn off later to go to one of them. If I were you, I'd buy a 7600GT/X1700 (end of March?) or X1800XL/7800GT (depending on how much you want to spend, and therefore lose when selling later) and wait out an R600, but that's just me.
The former DX10 card will get you on the road sooner, although the latter will undoubtedly be worth waiting for... Although I too am a fanboy at heart...
But not paid by AEG. ;)
I too am a fanboy, but only for BSD.. as for the parts I use.. only what works best..
physics_geek, the best advice for you is to not upgrade yet.. wait until G80 comes out
http://www.chilehardware.cl/index.ph...=175439#175439
basically the 7900GT is gonna replace the 7800GTX 512
If you have info on the 7600, or more info on the 7900 series, please share it if you can
Wow. I wonder what the price will be for the 256MB version.
Another thing: expect NVIDIA to take the low segment and the mid segment completely.
Don't think you're right about that. ATI has a strong position in the integrated graphics market = low-end.
Those 90nm G71s are gonna clock really well with some decent cooling and more volts. I wonder whether NVIDIA will add software-adjustable voltage, or maybe they'll introduce it later on DX10-compatible GPUs. ATI still has an advantage with this feature, allowing cards to be OCed as much as possible without losing the warranty.
Low end = competition for the X1300 GPU.
As far as ATI goes, I'm not so sure about that strong position; as I mentioned in the article, the R590 is gonna be launched in April, but NVIDIA has G80 ready.
R590 or R600? So you're saying that NVIDIA might present G80 in just a couple of months? Erhhhmmm... so many new products in just half a year...
And ATI has the R600 - there's no evidence of any actual 12-month guarantee that ATI won't release unified shaders because of the Xbox 360 GPU. The current ATI desktop GPU, the R580, is already faster than the Xbox 360's R500, and that GPU was based around far older tech (originally developed back in the R4xx generation). The current R600 may be far different from the R500, and the unique factors of the R500 design (the 10MB cache) are unlikely to be used on a desktop GPU where higher resolutions are used, so when NVIDIA release their unified shader part there may be no reason ATI can't release their competitor.
I take it this means we'll see a 128-bit 7300?Quote:
Originally Posted by metro.cl
and I hope this means the 7600 will be 256-bit, but I think that's wishful thinking
no, we won't see a 128-bit 7300.. the low end will still be 64-bitQuote:
Originally Posted by Starscream
and the 7600 will still be 128-bit..:fact:
This is true, because the 7300 uses the G72 core, which, due to pin count, only supports a 64-bit interface. Unless they took a G73 (supposed to be 12 pipes) and disabled a majority of the pipes, in which case they could manage a 128-bit 7300, but that would be a major waste. It's cheaper to use a narrower bus with higher-clocked memory, especially considering how cheap DDR2 is right now.
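(A quick sketch of the bandwidth trade-off described above; the clocks here are made up purely for illustration. Halving the bus width while doubling the effective memory clock gives the same peak bandwidth, with fewer pins and board traces to pay for.)

```python
def bandwidth_gbs(bus_bits: int, effective_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

print(bandwidth_gbs(64, 800))    # 6.4 GB/s on a 64-bit bus with fast memory
print(bandwidth_gbs(128, 400))   # 6.4 GB/s on a 128-bit bus with slow memory
```
Quote: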
Originally Posted by nn_step
But I also fully expect the 7600GT to own the mainstream segment. It should perform slightly above the 6800GS, which already owns the X1600XT. But I never considered the X1600XT to be a decent mainstream card. The X1700 on the other hand...
is still MIA.. and from what I hear still gets pwned in OpenGLQuote:
Originally Posted by Cybercat
OpenGL performance of ATI versus NVIDIA is no longer cut and dried; NVIDIA loses its traditional lead in a number of games that have optimisations enabled in the driver.
I thought nvidia said they were going to implement FP32...
What does this mean?
Thanks.Quote:
2. Hardware FP16 --> RGBA: a type of multisampling anti-aliasing, which suggests that Nvidia will be able to do HDR + filters (ATI already has it implemented)
3. Hardware FP16 Tex Compression: FP16 texture compression, since HDR + FSAA consumes a great deal of bandwidth
If the 7900GT is a replacement for the 7800GTX 512, does that mean the 7900GTX is going to cost even more than the 7800GTX 512, or that we're going to get far more bang for our buck?
I already did a :banana::banana::banana::banana:ty online translation. I want it from someone who speaks Spanish :stick:Quote:
Originally Posted by onewingedangel
(Post reworked to be less resentful.....)Quote:
Originally Posted by metro.cl
Hey metro, could you please provide a summary of the link(s) in English, or some other translation (maybe Babelfish, Google or something could translate?), when posting non-English links as your news source, especially when you seem to link to your own site/posts? Myself, I wasn't even sure what language it was (it just looked like random mumbo-jumbo, besides the numbers, to me)! Sorry if it sounds harsh, it was not meant that way, but it shouldn't be too much to ask when you post your own scoop as news to the rest of us...
If it's your own news, then you really, really should consider translating it (all) into English for us all to 'enjoy' here, since this is an English forum (I think?) (i.e. not ChileHardware).
Thanks!
[Edit: Sorry... I realized later on that I had put it in a not-too-friendly manner and also, thanks to Perkam, that I'd effectively told you to stop posting non-English links; that wasn't what I intended to say (as you said yourself, the numbers and facts are what count!). Sorry for that!]
Learn to speak Spanish; better yet, learn ANY language other than English. Being multilingual is very good for you.Quote:
Originally Posted by ahmad
here, I'll translate for you...
"I have great news from nvidia for you
They are going to display 3 new models @ cebit 2k6
7900gtx:
32 pixel processors (g71 8 cores x 1 quad x 4ps = 32pipes)
core 700MHz stock, up to 750MHz (I'm guessing he means if you OC on stock?)
memory 900MHz GDDR3 1.1ns, 512MB
process fab: 90nm
also
1. FP16 render target MSAA support: improves the quality with which HDR is processed
2. Hardware FP16 --> RGBA: a new type of multisample AA; what's known is that it will allow NVIDIA to use HDR and filters (ATI has it implemented already)
3. Hardware FP16 texture compression: compression of textures over FP16 (AS?), since HDR+FSAA consumes a lot of the available bandwidth.
7900GT
Pixel processors: 24 pixel processors
Core: 650MHz
Memory: 1400MHz with 512MB
process fab: 90nm
It would displace the 7800GTX 512MB, since those won't be produced any longer.
Also, they will release the 7600 series, although there isn't much more info available about it. It is known that NV will retake the mid and low sector of the market around April.
Teaser:
ATI will debut the R590 core in April, with a fab process of 80nm."
(later down in posts)
"It is not believed that G71's will support DX10."
I lived in Argentina for 8 years; I speak/write Spanish proficiently. I can be the official Spanish-to-English translator! :banana:
edit: after putting 2 and 2 together, I realize the OP is actually the dude that posted in the ChileHardware forums, duh.
Quote:
Originally Posted by krille
if you use your brain a little you will see the website is CHILE HARDWARE; Chile is a country in South America that speaks Spanish.... :toast:
but metro, if you are going to link to CHW, please at least make a little summary in English.... if anybody wants the whole thing in English, well, we can try to translate it....
Thanks Omega!
read my post :slap:Quote:
Originally Posted by leviathan18
Welcome!Quote:
Originally Posted by softpain
Thanks for the advice. I already speak 3 languages, including English.Quote:
Originally Posted by freecableguy
Some interesting info...much of it expected.
ATI's biggest worry will not be the 7900 by itself... seeing as the X1900XT made its debut at ~$510 in some places.
Where ATI will need improvement is its CrossFire performance, which still leaves a lot to be desired... and the current GTX SLI beats the X19 CF configuration because of the maturity of its drivers.
Hopefully we'll see some good drivers prior to the 6.8s... seeing as the only meaningful drivers coming out of ATI last year were 5.8/5.9 onward. If that same development cycle keeps up, CF users may have to wait until August to see some decent driver improvements for X19 CF... again, hopefully it's a lot sooner.
Perkam
Hmm...two threads on the 7900....you know what this means :D
--Threads Merged--
Perkam
Quote:
Originally Posted by krille
Well mate, numbers ain't that hard to get; plus I don't give a :banana::banana::banana::banana: about most of the news being in English. I read Japanese and Chinese forums, and I speak Spanish and try to speak English.
Small summary (based mostly on OmegaMerc's translation, posted in full above):
Next time I'll do a small summary, but I don't think there was much to read; it was mostly raw numbers, sorry for that.
P.S. Sorry for creating another thread, didn't see this one.
Np Metro. Pls disregard Krille's comments. The forum welcomes people from all backgrounds and those speaking all languages...
Plus, Spanish is the second language in the U.S.A... so it's not as if people mind. I must've read a thousand articles on HKEPC and other Chinese websites, because it's the information that's important, not which language it's in.
Perkam
I realize I slightly misworded myself... my humble apologies to Metro. Sorry mate!
(will edit other post and then this one)
Edit:
OmegaMerc > Thanks for translation, exactly what I was looking for! And any other time you'd help translate a spanish text would be greatly appreciated too!
leviathan18 > Not to be rude, but I explicitly said ChileHardware in my post (so I evidently used my brain :p:); I just wasn't sure what language they speak in Chile. Sorry for that one, I simply didn't know (I was leaning towards Portuguese for some reason...). Again, all translations are highly valued!
metro.cl > Thanks for another translation! Of course a shorter summary works just fine too. Sorry again, hope no feelings were hurt!! :toast:
perkam > Yeah, sorry. Please disregard that... it didn't come out quite right. (Post edited.) Spanish news is as good as any!
Hope discussion can go back to normal now! :)
No problem mate :toast:Quote:
Originally Posted by krille
I'll try to make a summary next time :)
Quote:
Originally Posted by OmegaMerc
I did, but what I was trying to say is that metro sometimes posts direct links to ChileHardware; since he knows English, he can make a summary in English so people won't complain about the link being in Spanish... after that, if anyone wants the whole Spanish thing translated, I can do it, or you, or even metro.... we all can, in fact; even saaya can. Lots of people here speak Spanish too.
Ah, sorry for misunderstanding; anyway, enough with the digression! Back to the topic in the title! :fact: :fact:Quote:
Originally Posted by leviathan18
Can't wait for the new cards. I see a lot of information, yet nobody has provided a valid source of where they got their information from.
Provide a link so we know it's valid please...
I know for sure he has reliable sources for that:
people inside, and people that talk with people that talk with more people inside.... so I know the numbers are going to be close on this....
As they are all under NDA, metro cannot give names on this matter.
Doesn't work like that. Is the R580 200% better than the R520 because it has 3x the shaders?Quote:
Originally Posted by Piotrsama
Look at the difference between a 7800GT and GTX at the same clock speeds = basically nothing at all.
I think he meant that since the core is new, it will already have a performance gain over the 7800/X1800. If it is clocked higher, it will always be better.Quote:
Originally Posted by Hicks
Yep. Plus, competition keeps the prices down. :DQuote:
Originally Posted by nn_step
I love it; competition fuels advancement.Quote:
Originally Posted by nn_step
http://www.theinquirer.net/?article=29469
1 GB of RAM possible on G71
Wonder if NVIDIA will pop out a 1GB 7900GTX just for kicks and giggles?
should be pretty interesting.... :)
Quote:
Originally Posted by crackmann
wtf, does a 7800GTX / 1900XTX even use 512MB at all? That would mean SLI would have 2GB!! I'd need like 4GB of RAM in my PC to keep up! :slobber:
Looks like CPU mfg's are falling far behind.
That is a special Quadro with a price tag of $8k.
I'm quite sure I read a while back that G70 (maybe even NV40) could handle 2GB of RAM. The issues they write about in the article are power-related (a graphics card may only draw so much power according to today's standards), not the memory controller.Quote:
Originally Posted by crackmann
A single 7800GTX should be able to host 2GB, were it not for the power concerns...Quote:
Originally Posted by OmegaMerc
And if you have the money to blow on such an abomination.. you either love AutoCAD too much or you need to learn some basic economics :eek:Quote:
Originally Posted by leviathan18
yeah, 90nm should have a great effect on power consumption...
I was looking around yesterday at X1900XT CF vs. SLI 7800GTX 512MBs and saw that the 1900XTs consumed more power than the 7800GTXs (only with 2 video cards in, though), so I am anxious to see how the power consumption drops with G71.
It should be quite nice compared to a 6800 Ultra :)...
is there an estimated release date yet?
http://vr-zone.com/?i=3231
Some more little tidbits.
How does that make sense? 2×700=1200 now? The world is getting stranger by the day! :p:Quote:
The GeForce 7900 GT card comes with 256MB of K4J55323QG-BC14 GDDR3 memories from Samsung which has an effective data rate of 700MHz so a good guess will be 1.2GHz memory clock for the 7900 GT.
I think they mean that since it is 700MHz memory, the stock clock will be 600, i.e. 1200 effective. This doesn't stray from any other product that uses memory rated higher than its stock clock. This of course leaves room for EVGA, XFX and BFG factory-overclocked cards.
Of course, you already knew that. They just left out some much-needed wording.
I guess we can expect ~1500+ overclocked mem speeds, if the X1800XLs are any indication. I wonder when we'll get the key specs, that of course being the pipes/TMU/ROP setup. 24/16/16? Probably. Couple that with 90nm speeds (600MHz+) and we can probably start drawing some conclusions... that being that it will probably be a 7800GTX 512-performance replacement part, like has been said.
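(A small sketch of the DDR arithmetic behind those doubled numbers; the 600MHz stock guess is the poster's, not a confirmed spec.)

```python
# GDDR3 is double data rate: two transfers per clock, so the marketing
# numbers are twice the real memory clock.
def effective_mhz(real_clock_mhz: float) -> float:
    return 2 * real_clock_mhz

print(1000 / 1.4)           # ~714: a 1.4ns part is rated for ~700MHz real clock
print(effective_mhz(700))   # 1400 -> the "1.4GHz" rated ceiling
print(effective_mhz(600))   # 1200 -> the guessed stock clock, sold as "1.2GHz"
```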
I don't think there will be a 256MB version... As the 512MB 7800GTX is going to be the new 7900GT, I don't suppose the 7900GTX will have less onboard memory than the GT. Perhaps they'll have a lower-end 7900 with 256MB?Quote:
Originally Posted by Cooper
Like has been done before, I'm sure we'll see both.
I also think that the 512MB version will come with a larger $$ premium than it's worth, just like the XL/XL (X800), XL/XT (X18/X19), and 7800GT (amongst others) versions have shown in the past.
I figure initial pricing at ~$400 and ~$500, quickly coming down ~$100 to sit between the 18XLs and 19XTs in the price range. Figure the same math for the 7900GTX: $600, going down to around $500 relatively quickly, if not slightly lower. A little more expensive than the X19XTs, but probably a little more powerful.
number of pipes = number of TMUs for NVIDIA. Texture operations are still attached to their shader engine with the GF7 architecture.Quote:
Originally Posted by turtle
I want one now!
http://www.digitimes.com/news/a20060216A7037.html (no 1 GB for those who were fantasizing :slap: )
This doesn't say anything about availability, but let us hope it won't be launched like the 7300 was :stick: (p word). And it's likely that the 7600(GT) will show up around the same time the R590 is expected.
The only funny part is:
This tells me one thing very clearly: it won't be able to dethrone ATI that easily. We saw how insanely NVIDIA priced the 7800GTX 512MB even when it beat ATI by only a small margin in the days of the X1800XT. Now this...Quote:
Nvidia’s upcoming GeForce 7900 GPUs will have more competitive pricing than the Radeon X1900 from ATI Technologies, the sources noted.
Then the other interesting thing that will be a thorn in NVIDIA's side is the RD580 (which I really do hope lives up to its hype), expected on March 3rd.
March is turning out to be a very interesting month for us hardware enthusiasts.
Honestly, the only things that are helped by having more than 256MB of RAM are Folding, WZ21 reaper, AutoCAD and EXTREMELY memory-intensive graphics applications.
Quake 4 is helped, and so are COD2, Doom 3, and F.E.A.R. at high res, all on high with AA/AF.
Also, Ahmad, the 7800GTX 512MB was priced so high because it was a limited edition. The 7900s won't be limited edition, and therefore won't need a $649 price tag even if it DOES blow the X1900XTX out of the water.
What this tells us is that the 7900GTX will be cheaper than the X1900XTX/XT upon release.
And the universal cycle of who has the best card continues.. ultimately reaching a state of perfection at prices that we all can afford..
I disagree. For one thing, nobody knew or expected the 7800GTX 512MB to be a limited edition (NVIDIA never said quantities would be limited, just that it was going to be hard-launched, which it was).Quote:
Originally Posted by DilTech
Now look at it from another point of view: if the card were that much better than the X1900XTX, then NVIDIA could essentially price it as high as they want and people would still buy it because it is the best. People who put down cash for these things won't mind spending extra to get the best (I know I wouldn't, and never do). NVIDIA is obviously not dumb, and if they see a chance to make money, why let it go to waste?
This is pretty sound reasoning IMHO. But I guess we will all find out for sure on the 9th ;)
False; depending on the game, sometimes the X1900XT is closer to 60% faster than a GTX 512. So it all depends on the game. ATI kind of went in a different direction. In UT07 I believe the X1900 will be faster.Quote:
Originally Posted by nn_step
So you believe that ATi will have the edge in DirectX and nVidia will continue having a major advantage in OpenGL?
True, SOME people will spend $100 more on a graphics card because it is a little faster than another.Quote:
Originally Posted by ahmad
but not ALL.
but the group of people that buy high-end cards has increased over the last few years, from what I notice on a lot of forums.
and most won't spend $100 to $150 extra for a slightly faster card.
Like when the X1800XT came out: it was faster than the 7800GTX but also more expensive, and from the forums I read I noticed that some, not all, people went and bought an X1800XT.
Most went and bought a GTX simply because it was a lot cheaper.
If NVIDIA has a faster card with the 7900GTX, they have 2 options.
A: they price it $100/150 above the X1900XTX.
Some people who are OK with spending $100 more for a small boost will buy NVIDIA, and NVIDIA will make more profit per card.
The majority will go buy ATI.
B: they price it at the same price as the X1900XTX.
Then people have the choice that, for more or less the same money, they can buy a faster card.
This will result in the majority going to buy the NVIDIA card, and only a few will buy the ATI.
The con is that per card you make less profit, but you will sell a lot more cards and steal customers from the competition.
If NVIDIA is smart they'll go with B, because yeah, less profit per card, but a lot more cards sold, and more people will have their card instead of the competition's card.
The only reason you would go with A is if you aren't able to deliver your card in massive quantities, or if the cost to fabricate the card is very high.
You don't want to have just part of the high-end customers; you want to dominate it.
You want to make sure a large share of people have your card, which makes them more likely to buy your card again next upgrade (unless that company's card at that point sucks).
That, and you want to keep money away from your competitor.
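(Starscream's volume-versus-margin point in two lines of arithmetic; every number here is made up purely for illustration.)

```python
# Option A: premium price, fewer sales. Option B: match the XTX price.
units_a, margin_a = 20_000, 150   # hypothetical units sold and profit per card
units_b, margin_b = 80_000, 75

print(units_a * margin_a)   # 3,000,000
print(units_b * margin_b)   # 6,000,000 -> plus the mindshare for the next upgrade
```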
No, shader-intensive games. FEAR is a nice example. I think Oblivion results will be similar to FEAR, same as UT07. Whereas the 7900 will simply dominate in older games, newer games and engines will love the X1900's shading power.Quote:
Originally Posted by nn_step
That's why ATI did the research and chose that design path... What's the use of pixel processing power if shading is lacking, and shading is where games are headed in the future?Quote:
Originally Posted by sabrewolf732
Maybe NVIDIA chose to make sure the 7800/7900 will be better in older games that don't need a lot of shader power because they've got the G80 not that far away, which will be powerful at shading.
So they didn't think the investment was worth it, because the G80 is so near.
I believe NVIDIA went that way because, as far as I know, shader processors do mostly physics, so AGEIA (don't remember the spelling) could come in handy for NVIDIA, and they would also have a lot more pixel processors than ATI. Also, if you look at the fact that NVIDIA sponsors more game development, I don't think developers will code to take advantage of the competition.
Wasn't there a rumor (a small one) that the G80 had 2 cores?
Maybe 1 is for physics?
The industry is headed towards more shader-based games... that's pretty much fact.Quote:
Originally Posted by metro.cl
Yeah, the rumor mentioned something about a second specialized core.. perhaps an AGEIA physics chip...Quote:
Originally Posted by Starscream
That's correct. B3D interviewed Carmack and Sweeney, who both said that the shader:tex ratio shouldn't be 1:1 (it definitely should be higher).Quote:
Originally Posted by ahmad
The Inq and AACDirect both told us the 7800GTX 512mb was a limited edition card.Quote:
Originally Posted by ahmad
I think NVIDIA is going to play it smart: not only an attempt to dominate the performance segment, but to dominate the price segment as well. You see, we all know the 7800GTX 512MB was merely a PR move to beat the X1800XT, no point in hiding it. The 7900GTX is to be an actual CARD, marketed to be bought in stores, not just as a "haha ATI, we win" thing.
After the amount of flak NVIDIA caught over the phantom 512MB GTX, they'd be committing suicide not to release this one at a good price. On top of that, do you realllllly think NVIDIA would release 2 months after ATI and come up with a worse card than the X1900XTX?
That's just being naive.
As for the tex:shader ratio, yes, we're seeing more shaders used than textures. Thing is, 16 texture units may still be too low. That 16 may bottleneck it by the time it gets to the point where it could see a major-league advantage from the 48 pixel shaders, stopping it from seeing its full advantage anyway. Now, I can't state this as fact, and none of us can unless we personally work for companies like Epic, id, Blizzard, etc. I can say this though: 32:32 will surely put up a fight against 16:48, and no one anywhere can deny that.
We'll see who's better come March; it's only a few short weeks away, so how about we quit speculating and just wait for it.
7900GTX @ 650MHz/1.6GHz, 512MB GDDR3 1.1ns RAM
7900GT @ 450MHz/1.32GHz, 256MB GDDR3 1.4ns RAM
http://www.hardspell.com/news/showco...?news_id=22084
Remember the 5800? =PQuote:
Originally Posted by DilTech
Thing is, in FEAR, an extremely shader-dependent game, the X1900XTX is up to 60% faster than a 7800GTX 512; the most the 7900GTX can be faster is 33% or so.Quote:
As for the tex:shader ratio, yes, we're seeing more shaders used than textures. Thing is, 16 texture units may still be too low. That 16 may bottleneck it by the time it gets to the point where it could see a major-league advantage from the 48 pixel shaders, stopping it from seeing its full advantage anyway. Now, I can't state this as fact, and none of us can unless we personally work for companies like Epic, id, Blizzard, etc. I can say this though: 32:32 will surely put up a fight against 16:48, and no one anywhere can deny that.
Here's to hoping we'll see an ATI BIOS for 700/800 on the XTX out before then :p: Frankly, they could release a 750/850 BIOS for the XTX because of how it's binned.Quote:
Originally Posted by onethreehill
Perkam
Everybody uses FEAR as the end-all example of shader performance. I don't know how you can look at FEAR benchmark results and not say it's optimized for ATI. There's no other shader-intensive benchmark or game out there that shows ATI having nearly that much of an advantage.Quote:
Originally Posted by sabrewolf732
And yes, I'm aware that at the very last minute NVIDIA won the bid to get TWIMTBP stamped on the boxes. I'm also aware that up to that point it was a heavily ATI-sponsored game, as ATI was doing as much as it could to get in bed with Vivendi and all of its games (they managed to keep Tribes: Vengeance, but unfortunately for them that turned out mediocre in popularity). Since they were around first, during the development of the engine, this relationship shows.
So people should really reference another game instead. FEAR is totally biased and unrealistic.
[RANT] Don't worry about it... it's just residual comments that are still being made in response to NVIDIA users crowing like roosters off the Empire State Building "LOOK, NVIDIA PWNS ATI IN DOOM3 !!!! WOOT !!!" ;) So really no one can be blamed for it.Quote:
Everybody uses FEAR as the end-all example of shader performance. I don't know how you can look at FEAR benchmark results and not say it's optimized for ATI. There's no other shader-intensive benchmark or game out there that shows ATI having nearly that much of an advantage.
ATI's shader advantage is not only in FEAR... as more shader-intensive games come out, we'll see an even greater disparity between 7800 and X1900 performance, ESPECIALLY by the time we have the 6.9 and 6.10 drivers coming out later this year. The REAL competitor to the X1900 will be the 7900, you say... I'm afraid that doesn't square with people comparing X1800 and 7800GTX 512 performance just because they were out at the same time ;) [/RANT]
What we really need to see is the price range for the 7900GT and its performance vs the X1800 parts.
Perkam
Let me put it this way: I EXPECT the X1900XT(X) to pwn the daylights out of the 7800GTX 512 in shader-intensive stuff. I mean, come on, with 48 shader pipes it'd be sad if it didn't. But the performance advantage in FEAR extends all the way down to the X800 family. At some points, an X850XT PE performs the same as a 7800GT! I don't care who you are, that's not right.Quote:
Originally Posted by perkam
And yes, I do say the real competitor to the X1900 series is the 7900 series. Not only is the time of release a factor, as you said, but the 7800GTX 512 was the only card coming from NVIDIA that had the same amount of RAM, or was the closest in price at release (don't forget, at that time the GTX 256MB was over $100 cheaper than the X1800XT). Could NVIDIA help it that the 7800GTX 512 was powerful enough to nearly compare to a future card whose performance they had no way of anticipating?
So it's unfair to lump the GTX 256 and X1800 together because the X1800 was more expensive by $100, and yet it's OK to compare it to the GTX 512, which was far more expensive and in limited availability? And it's not fair to compare the X1900 to a card that costs more and is NVIDIA's best card, and is to remain so for the first three months of its product cycle?Quote:
Originally Posted by Cybercat
You simply go by what's available at the time, and at what prices. So if someone chose any of these options at the time of their release, I could see how they would justify their purchases; however, the GTX 512 shouldn't be compared to any ATI card as they are not in the same price bracket. The X1900 will be comparable to NVIDIA's 7900 when it comes out, as it will be ATI's best at that time, so it comes back down to price/performance. You have to look at it both ways and not just say that it's unfair to compare cards because the next one will be better; the launches are no longer synchronous, so that way of thinking about comparing one card to another has to go.
Right now, the X1900 is both more powerful and cheaper than a GTX 512, and there's still a month before the 7900 is released. The X1900 will be months old at that time, and therefore you have already declared (in comparing the GTX 256 and X1800, or GTX 512 and X1900) that you shouldn't do this.
Yes, I remember the 5800. Poor DX9 performance and a 128-bit memory bus, delayed to hell and back due to issues with a die shrink... Now, do you remember every other card NVIDIA has ever released outside of the FX series? My point exactly.Quote:
Originally Posted by sabrewolf732
Look at it this way: the 7900GTX only needs 430MHz to catch the 7800GTX 512MB, and at 700-750 it theoretically should downright destroy all scores we've seen done by the 7800GTX 512MB. 550 on the 7900GTX should beat any score ever reached on any cooling by the 7800GTX 512MB. Regardless of what happens, we'll definitely have a new 3DMark champion. I don't know where you get that it could only be 33% faster; 8 more pipelines and 150MHz faster... do the math!
This card may be closer to twice the speed of the 7800GTX 512MB, at least theoretically. We all know increases are never linear, and therefore everything is just guesses until we actually see reviews.
Perkam and Sabre, may I remind you that Black & White 2 is also an extremely shader-heavy game on a newer engine, and it loses to the 7800GTX. This is one of those examples of games where 16 TMUs might just not be enough. At maximum quality settings, the 7800GTX 512MB beat the X1900XTX by a whopping 31%; we could be looking at 50-60% for the 7900GTX here.Quote:
Originally Posted by perkam
http://www.anandtech.com/video/showdoc.aspx?i=2679&p=9
Even the X1900XT in CF can't touch the 7800GTX 256MB. Why? Because of a lack of TMUs to back up those pixel shaders.
This is why I tell you: we'll see who has the better design in less than a month. Could be ATI, or it could be NVIDIA; we'll just have to wait and see.
As for now, ATI has the fastest card available, but next month we could see all that change. Less than 3 weeks' time; if you haven't already bought a new video card and plan to this round, I strongly urge you to wait it out.
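(DilTech's break-even figure is easy to sanity-check with pure pipeline arithmetic, assuming the rumored 32 pipes and ignoring everything but pixels per clock; real scaling is never this clean.)

```python
# Clock at which a 32-pipe part matches a 24-pipe part at 550MHz.
gtx512_pipes, gtx512_mhz = 24, 550
g71_pipes = 32

breakeven_mhz = gtx512_pipes * gtx512_mhz / g71_pipes
print(f"{breakeven_mhz:.0f} MHz")   # ~413 MHz, in the ballpark of the 430 quoted
```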
I think it would be naive to assume otherwise ;)Quote:
Originally Posted by DilTech
Your argument is weak. It's possible, but it's weak. Why price it lower than the X1900XT? Why not similar? I think your post should say "what I hope for..." :slap:
The why is simple: by March we'll see X1900XTXs for around $550 everywhere, so pricing at $649 would be stupid. If they price it at $549 or below out of the gate, then it should be pretty close to what we see the X1900XTX going for at that time.
You do the math. It has 33% more pipes and a 33% higher clock. 33% is the fastest it can theoretically be vs a GTX 512.
http://images.anandtech.com/graphs/a...0152/10664.png
And AFAIK, no game has a TMU/shader ratio like FEAR does. Also, the X850XT PE is close to a 7800GT in pretty much all games; the fact that it beats it in FEAR is not surprising. In the above image the X1900XTX has a 100% performance advantage over the 7800GTX 512. If that's not impressive, I don't know what is. And the X1800 series and 7800 series were rather close in FEAR; you couldn't really call it biased until the X1900 came out, of course :rolleyes:
Firstly, performance doesn't scale linearly with the number of pipes and clocks, so it's useless to say that because it has 33% more pipes/clocks it's going to be 33% faster.
Secondly, by your logic it would be 76% better, because a 33% increase in clocks would make it 33% better AND a 33% increase in pipes would make it 33% better; it would be:
1.33 (33% more clock than original) x 1.33 (33% more pipes than original) x 100 (to express the answer as a percentage) = 176.89% of the original = ~77% better.
Never use the phrase "you do the maths" and then give the wrong sum; it looks silly.
Of course this is not the case at all; parts of the chip will go unused, and there may be a deficit in vital areas as well as an oversupply. ATI's implementation is a little weak in regards to TMUs and strong in shaders, while NVIDIA's 7900 will be devastating in regards to TMUs but still has fewer shaders, and so may perform worse in certain applications.
We won't know performance until it's out.
No, that's not how performance works. To gain 33% in performance you must increase everything by 33%. Doing 33% higher clocks and 33% more pipes doesn't mean 76% more performance. As I said, you do the math. And obviously this is theoretical.
If card Y has twice as many pipes doing the same amount of work (same clock speeds) as card X, then twice as much work gets done.
If card Y has the same number of pipes but each pipe runs at twice the speed of card X, then twice the amount of work gets done.
If card Y has twice as many pipes and is twice as fast, then card Y gets four times the work done as card X, theoretically.
Of course, in practice this is not the case; other things come into play (memory access, imbalance in resources, i.e. some areas going idle while others bottleneck the system, etc.).
I was just showing you that you cannot make performance predictions based on clock speed or pipeline count increases alone, and I hope I've shown you how your MATH is flawed.
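(The multiplicative scaling onewingedangel describes, as a minimal sketch using his 1.33 factors; as he says, this is a theoretical ceiling, not a performance prediction.)

```python
# Theoretical throughput scales with the PRODUCT of pipe count and clock.
def relative_throughput(pipe_ratio: float, clock_ratio: float) -> float:
    return pipe_ratio * clock_ratio

both = relative_throughput(1.33, 1.33)
print(f"{(both - 1) * 100:.1f}% better in theory")   # 76.9%

pipes_only = relative_throughput(1.33, 1.0)
print(f"{(pipes_only - 1) * 100:.0f}%")              # 33% - either factor alone
```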
Agreed. Let's use hardspell's clock speeds, for instance...Quote:
Originally Posted by onewingedangel
7900 GTX 512: 650MHz/1600MHz
32 Pixel Shaders/32 TMU/24 ROP
20800 MP Fillrate/15600 MP Output
7800 GTX 512: 550MHz/1700MHz
24 Pixel Shaders/24 TMU/16 ROP
13200 MP Fillrate/8800 MP Output
Even at these conservative clock speeds, you're looking at a ~57% increase in shader power and texturing power over the 7800 GTX 512.
Typically, shader power and texturing power in a 1:1 ratio with 0.66x+ the ROP power will yield almost linear performance increases, so a 50% performance improvement basically everywhere would still be quite impressive.
Would be nice if it were clocked at 700MHz or higher, though, as that would pretty much ensure the F.E.A.R. crown as well. Everyone loves to say 60% faster in F.E.A.R.; tell me, at what settings???
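(The fillrate arithmetic from the table above, reproduced as a short sketch; the 7900 GTX numbers are hardspell's rumored specs, not confirmed.)

```python
# Texel fillrate = TMUs x core clock; pixel output = ROPs x core clock.
cards = {
    "7900 GTX (rumored)": {"mhz": 650, "tmus": 32, "rops": 24},
    "7800 GTX 512":       {"mhz": 550, "tmus": 24, "rops": 16},
}

for name, c in cards.items():
    print(f'{name}: {c["mhz"] * c["tmus"]} MT/s fillrate, '
          f'{c["mhz"] * c["rops"]} MP/s output')

# 20800 / 13200 = ~1.58, i.e. the ~57% increase quoted above.
```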
Honestly, the only thing we should expect is that it will beat the X1900XTX.. anything else would be appreciated..
But should they seriously mess up and release something that performs worse *cough*5800*cough*, they will be in one heck of a pickle...