I was talking about at launch, but you go ahead and read the parts of my post that you feel like.
Quote:
Originally Posted by onewingedangel
Yes, for sure, developers are just idiots... ;-P C'mon, give them more credit!
Quote:
I believe Nvidia went that way because, as far as I know, shader processors mostly do physics, so Ageia (don't remember the spelling) could come in handy for Nvidia, and they would also have a lot more pixel processors than ATI. Also, given that Nvidia sponsors more game development, I don't think developers will code to take advantage of the competition.
No, shaders do visual effects.... physics is done by the CPU...
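To illustrate the split as it stands today: per-frame physics is just an integration loop the CPU runs, while shaders only decide what the pixels look like. A minimal CPU-side sketch in C (the values are invented, purely for illustration):

```c
#include <stdio.h>

/* CPU-side physics: a basic Euler integration step per object.
   This is the kind of per-frame work game engines of this era run on the CPU. */
typedef struct { float pos[3], vel[3]; } Body;

static void physics_step(Body *b, const float gravity[3], float dt) {
    for (int i = 0; i < 3; i++) {
        b->vel[i] += gravity[i] * dt;   /* accelerate */
        b->pos[i] += b->vel[i] * dt;    /* move */
    }
}

int main(void) {
    Body crate = { {0.0f, 10.0f, 0.0f}, {0.0f, 0.0f, 0.0f} };
    const float g[3] = {0.0f, -9.81f, 0.0f};
    for (int frame = 0; frame < 60; frame++)   /* one second at 60 fps */
        physics_step(&crate, g, 1.0f / 60.0f);
    printf("crate height after 1s: %.2f m\n", crate.pos[1]);
    return 0;
}
```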
Reading these 5 pages is making my head explode (and yes, I just sat down and read everything) - why can't we just admit both companies are truly great and all just get along? In a year this rivalry is going to be worse than the Yankees vs. Red Sox, if it isn't already, lol.
The rivalry is a good thing.. it is just the fanboys that need to be shot... ;)
Quote:
Originally Posted by BSill
the competition between the companies keeps prices low and keeps the performance level rising..
But the bickering of the Fanboys grows bothersome
That's why XS is such a great community. The fanboy element is small compared to other boards. I don't doubt for a moment that most of the membership would jump back and forth between ATI/Nvidia and Intel/AMD in pursuit of more powah!
Quote:
Originally Posted by nn_step
Overall Nvidia has had some good launches lately, with possibly the exception of the 7800GTX 512 since it was available only in limited quantities. Does anyone have reason to believe that the launch of the 7900GT on March 9 won't have truckloads of cards available at most retailers?
Well if they aren't.. nVidia is in deep...
But I am banking on plenty of supply
nvm....! somewhere called me already...!
just remember 1XX XGB
1GB of graphics RAM! That is going to make it cost twice as much.
I think most people hope for too much, and most of the time the specs of future products end up exaggerated.
G71 is a 90nm version of G70. Due to the shrinking of the die, NV will be able to increase the clocks, but that's about it. It has been confirmed that HDR+AA support won't be added in any G7x series card and will only be available in G80.
At the moment we can't be sure which card will be the fastest, as ATi has changed a lot of the rules this year and we have to wait and see what kind of impact the 7900 will make. If current benchmarks are anything to go by, the X1900 should be faster in heavier titles such as FEAR, and the 7900 in Quake 4.
You guys have to realise this is the year of DX10, and ATi has released their last DX9 card just in time. The 7900 will be too close to the release date of the DX10 cards, and that will hurt its sales: the 7900 is supposed to be NV's highest offering, yet it will be outdated by the DX10 cards, so choosing it over a DX10 card seems crazy. The X1900 will have been around long enough to have a very nice price tag and would be a much better choice for people that can't afford to jump on the DX10 bandwagon.
The 7900 is nothing more than NV's test run at 90nm in preparation for the G80. They might push it to beat the X1900XTX, but by the time it is in stock it won't matter that much.
We are still quite far from DX10, which is why the 7900 is getting all this attention, but that's today; come release date, not many people will care about it because the forums will be full of rumours about R6xx and G8x.
Quote:
Originally Posted by softpain
haven't you heard that most of the benches are CPU limited? Developers might not be idiots, but this is a business, so money is a big factor.
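For what it's worth, the quick way to spot a CPU-limited bench is that fps barely moves when you change resolution. A toy model in C, with invented per-frame costs:

```c
#include <stdio.h>

/* Toy model: each frame costs the CPU a fixed amount of time, while the
   GPU cost scales with pixel count. The slower of the two sets the fps. */
int main(void) {
    const double cpu_ms = 8.0;           /* invented: per-frame CPU work */
    const double gpu_ms_per_mpix = 4.0;  /* invented: GPU cost per megapixel */
    const int res[][2] = { {800,600}, {1280,1024}, {1600,1200}, {2048,1536} };

    for (int i = 0; i < 4; i++) {
        double mpix   = res[i][0] * (double)res[i][1] / 1e6;
        double gpu_ms = gpu_ms_per_mpix * mpix;
        double frame  = cpu_ms > gpu_ms ? cpu_ms : gpu_ms;  /* bottleneck wins */
        printf("%4dx%-4d -> %5.1f fps (%s-bound)\n", res[i][0], res[i][1],
               1000.0 / frame, cpu_ms > gpu_ms ? "CPU" : "GPU");
    }
    return 0;
}
```

With these made-up numbers the fps is pinned at 125 until the resolution gets high enough for the GPU to become the bottleneck, which is exactly the flat line you see in CPU-limited benches.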
Nice post. Are we still looking at G80 in June @ Computex and R600 shortly after (or much later.... perhaps coinciding with the 12/1/06 release of Vista)?
Quote:
Originally Posted by Syn.
R600 is apparently going to be in October, according to the article that said the R590 is the X1800GTO.
a bit more info for ya guys
The GT performs just like 7800GTX 512 SLI....
What GT? I don't know....
MS says that when Vista comes out there will be DX10 cards available. I don't know if that's good or bad news; we will have to see. The way progress is going on Vista, and the way more info is slowly being released, I would say we should have Vista, the Radeon X2800 and the GeForce 8800 in Q4 this year. That's talking stock-wise; I don't know when each one will be revealed, as it would be too much guesswork at this point.
Quote:
Originally Posted by vapb400
ATi has just released the X1900 range, so they will keep things quiet so that attention stays on those cards. NV will focus on the 7900, and only after that is released can we expect ATi to speak of R600 to take the attention away from the 7900. NV will probably talk about G80 when we get closer to the Vista release, probably to counter the PR coming out of the red camp.
AMD is changing a lot of things this year as well, so we might expect some attention to be diverted to the chipsets. NV has to confirm support for AM2, and besides ATi doing the same, they still have to release the SB600, not to mention the RD580 that's coming out soon.
Of course we agree on both! :)
Quote:
haven't you heard that most of the benches are CPU limited? Developers might not be idiots, but this is a business, so money is a big factor.
I just disagree with how big a factor you thought it (the money) is in favoring Nvidia (if the G71 is 24 and not 32 pipes, our disagreement will be for nothing) ;)
PS: My opinion is based on interviews I read with the Oblivion (Bethesda) and SS2 lead programmers, and Sweeney...
G71 to be 650MHz 24p but $499?
http://www.theinquirer.net/?article=29795
If it actually is only a 24 pipe card, it's disappointing and won't beat the X1900XTX, but it could still become a success if they price it accordingly.
If it's 24, then it makes it more logical to launch the G80 this summer....
I really don't see how people were expecting this to be a 32-pipe card... but I already knew it was going to be 24 pipes. It was funny seeing it go from 750 > 700 and now to 650MHz. A 650MHz 7800GTX 512MB won't do anything for Nvidia, that's for sure.
At the end of the day it would be really unusual for a refresh of a card to so significantly alter performance; most refreshes are simply die shrinks, a la G71. At least the rivalry is much more interesting now, and prices and availability are good. Nvidia still needs to fix their IQ to get it to ATI levels, but that's another story.
Eh, if what Fuad says is true then I'm definitely just going to wait for G80.
24 pipes at 650 with only 800MHz on the RAM? That's barely going to beat the 7800GTX 512MB.
If this is true (and it's Fuad the Fraud, known to be given fake info all the time; even ATi admitted to feeding him false info on an almost weekly basis), then ATi has won this round. If it wasn't for the 32-pipeline rumors from the get-go I'd have already bought an R580, but at this point (3 months til G80) there's just no point anymore.
Well, waiting never ends. Bought my card and haven't regretted it at all, been too busy enjoying it. No point in upgrading until new games come out that don't allow me to game at the settings I prefer and that doesn't look to be happening until fall at the earliest.
Might as well be 32p, but 32p rumour might also be out there to keep ppl from ATI for a while =)
Oh noes!!! :(
Quote:
Originally Posted by Ubermann
My predictions:
24p
525MHz in quantity, maybe 6*0MHz with 7800GTX 512MB availability
1400MHz mem for the card that replaces the 7800GTX, and maybe 1750MHz for the top 6*0MHz card
7900GT 24 pipes
7900GTX 32 pipes
With the 7900 being only 24 pipes at a low price, and the G80 at most 3 months away, could that mean that when the G80 comes we won't soon see any mid or low-end G80-based GPUs?
So that when the G80 comes, the 7300 is the low end below 100 bucks,
the 7600 is the bottom half of the mainstream, the 7900 the upper half, and the G80 the high end,
and that at the end of 2006 we'll see mid and low-end unified cards?
NVidia said in the same release that stated the G80 would be here in the summer that the midrange and low end of the series won't show up until early 2007.
I believe you mean 7900GT and 7900GTX :rolleyes:
Quote:
Originally Posted by leviathan18
And what did I write????
Gotta love the edit button.
I agree.. better to go back and fix than leave bad information...
If Nvidia waited until March to release the 7900GT and 7900GTX, they won't show up with just a 24-pipe card; they are going to take the crown again, so they will show a 24-pipe 7900GT and a 32-pipe 7900GTX.
If they do so, they'll probably rename the GTX back to Ultra, so the world knows these cards will be hard to get.
Quote:
Originally Posted by leviathan18
We'll have to see quantities here... I think there's little doubt it will be a hard launch, but it remains to be seen if they have the supply to deal with the demand.
ATI's only strategy on the G71 release would be to undercut the GT with the X1900XT... seeing as the GTX 512MB had trouble catching up to it, it may be the same with the GT.
A mistake both companies made was to release a substitute product so soon after their original flagships... (at least for the X1800)... so now ATI can't undercut the newcomers with the X1800XT because it's clearly old tech in the eyes of the market.
Perkam
Let's just hope the 7900GTX will have the 7800GTX's availability, and perhaps price while they are at it.
Ugh, no DX10 for the masses :( When the R300 launched, didn't the 9700 non-pro and 9500 Pro launch within a month of it?
I wouldn't expect the low/mid end cards based on the R600 this year either....
sorry, I just don't get the point there.
Quote:
Originally Posted by sabrewolf732
this might be slightly off topic, but here's an update on G80
http://www.xbitlabs.com/news/video/d...220100915.html
I always said that NV does not like unified shaders, and it seems that even though they have a patent for unified shaders, they won't be releasing a GPU with them just yet.
I must say, I'm not believing that article Syn...
It's definitely a rumour, reasons being:
1.) Why would NVidia release a card with only as many pixel shaders as the R580 6-8 months AFTER the R580's release?
2.) DX10 requires pixel shaders, vertex shaders, AND geometry shaders! (http://www.gamedev.net/reference/pro...d3d10overview/)
3.) AFAIK unified shaders are required for DX10; if G80 uses fixed pipelines then G80 will NOT be DX10 compliant...
Quote:
With Direct3D 10 we have a new programmable unit – giving three in total: Vertex Shaders (VS), Geometry Shaders (GS) and Pixel Shaders (PS). All three form "Shader Model 4.0". Both vertex and pixel shaders are fundamentally the same as they always have been – but with a few added bells and whistles. However, the Geometry Shader is completely new – and allows us to write code that operates on a per-primitive basis. Not only that, but it also allows us to add geometry procedurally – effectively extending the hardware to a whole new class of algorithm.
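To picture what "per-primitive" and "adding geometry procedurally" mean, here's a rough CPU-side analogy in C. The triangle-split scheme is just an invented example of the kind of amplification a geometry shader could do on the GPU, not actual D3D10 code:

```c
#include <string.h>

typedef struct { float p[3][3]; } Tri;   /* one triangle, 3 vertices */

/* CPU-side analogy of a geometry shader: one primitive in, several out.
   Here we split a triangle into 4 by inserting edge midpoints -- the kind
   of procedural amplification a GS can do per-primitive on the GPU. */
static int split_tri(const Tri *in, Tri out[4]) {
    float mid[3][3];
    for (int e = 0; e < 3; e++)                    /* edge midpoints */
        for (int k = 0; k < 3; k++)
            mid[e][k] = 0.5f * (in->p[e][k] + in->p[(e + 1) % 3][k]);

    for (int c = 0; c < 3; c++) {                  /* 3 corner triangles */
        memcpy(out[c].p[0], in->p[c],         sizeof(float) * 3);
        memcpy(out[c].p[1], mid[c],           sizeof(float) * 3);
        memcpy(out[c].p[2], mid[(c + 2) % 3], sizeof(float) * 3);
    }
    memcpy(out[3].p, mid, sizeof(mid));            /* center triangle */
    return 4;                                      /* primitives emitted */
}

int main(void) {
    Tri t = { { {0.0f,0.0f,0.0f}, {1.0f,0.0f,0.0f}, {0.0f,1.0f,0.0f} } };
    Tri out[4];
    return split_tri(&t, out) == 4 ? 0 : 1;
}
```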
I dunno, they quoted a guy from nvidia, so who knows.
Quote:
Originally Posted by DilTech
That quote came from the beginning of 2005, before Microsoft announced that unified shaders/pipelines would be required for DX10/WGF2/Avalon/everything else Microsoft has called it.
Quote:
Originally Posted by Syn.
:rolleyes: Obviously, what I meant was that affordable DX9 solutions were available within a few months of the first DX9 card, the R300. NV has stated they won't release a lower-end card based on G80 until early '07 (G80 launches in early summer). A pretty long time; sorry, but not everyone has the money to get G80 :nono:
With all due respect, I don't put a lot of stock in DX10.. it is just another forced M$ standard that I couldn't care less about...
AFAIK ATi isn't going to launch any DX10 parts aside from the R600 this year either. :nono:
Quote:
Originally Posted by sabrewolf732
that's sad as well :(
Quote:
Originally Posted by DilTech
You can't rip nVidia for something ATi is doing as well..
Well it wouldn't be the first time standards have been followed... :slapass:
But it would really be a step backwards if Nvidia decides not to take the unified shader approach. Apparently there aren't any gains from unified vs. non-unified shaders; it just makes for a smaller and less complicated chip, that's all. It's a lot like SM3.0 vs. SM2.0: it's not so much that it's "faster", it just has more features.
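The usual argument for unified is utilization rather than raw speed: when a frame's vertex/pixel mix is lopsided, dedicated units sit idle while a unified pool doesn't. A toy C comparison, with invented unit counts and workloads:

```c
#include <stdio.h>

/* Invented workload: abstract units of vertex and pixel work in one frame. */
#define VERTEX_WORK 100.0
#define PIXEL_WORK  500.0

int main(void) {
    /* Split design (like G7x): 8 vertex + 24 pixel units; the slower
       stage gates the frame while the other pool can sit idle. */
    double vs_time = VERTEX_WORK / 8.0;
    double ps_time = PIXEL_WORK / 24.0;
    double split_time = vs_time > ps_time ? vs_time : ps_time;

    /* Unified design: 32 generic units take whichever work is queued. */
    double unified_time = (VERTEX_WORK + PIXEL_WORK) / 32.0;

    printf("split:   %.1f time units (8 VS + 24 PS)\n", split_time);
    printf("unified: %.1f time units (32 generic units)\n", unified_time);
    return 0;
}
```

Whether that scheduling win shows up in real games, versus just making the chip easier to balance, is exactly the open question here.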
I didn't really attack NV or ATI :stick: I said it sucks that there will only be the flagship for DX10 this year :stick:
Quote:
Originally Posted by nn_step
I believe he meant you can't claim that Nvidia is building DX10 hardware exclusively for the high end when ATI will be doing the same...
Though I imagine Nvidia will not be able to respond as fast to ATI when the R600 hits, as it'll have responsibilities to the GPU of the supposedly delayed PS3.
Perkam
Nvidia isn't making the RSX for the PS3; they sold the design to Sony, so Sony will decide how the RSX gets made, AFAIK.
The RSX is basically a 7800GTX on steroids, no? So you're saying that they sold it to Sony and Sony is fabbing the chips?
Quote:
Originally Posted by leviathan18
The CEO of Nvidia said that the design for the RSX is 100% finished and they are ready to deliver them (so I think that means that Nvidia is also delivering them).
Quote:
Originally Posted by perkam
so they don't need to put any more work into it.
Why would you need DX10 parts this year, Sabrewolf?
I don't think you will see games using it in a real way (not just for marketing) before 2007...
Vista...
Perkam
Quote:
Originally Posted by perkam
Vista is 3-4 months delayed ;)
Quote:
Originally Posted by sabrewolf732
AFAIK yes; same with the ATI R500 and the MS Xbox 360, they sold the design.
Delayed again, for the nth time since 1998.. when it was first mentioned... :p:
Quote:
Originally Posted by metro.cl
http://www.theinquirer.net/?article=29795
Quote:
NVIDIA'S flagship G71 won't be able to get to the speeds the firm wanted it to. We reported before that Nvidia needs at least 700MHz to beat ATI's R580 based X1900 XTX. However, according to our sources, the G71 will be nothing more than a 90 nanometre G70 die shrink. It won't have more than 24 pipelines but you never know what Nvidia has exactly up its sleeves.
It will use GDDR3 memory running at 800MHz, for a 1600MHz effective memory clock. In the case it ends up with 24 pipes, it won't be enough to touch the R580 Radeon X1900 XTX's performance crown. If Nvidia did put 32 pipes in the G71 chip, this would make its chip a much better competitor and would make ATI run hard for its crown. We think that Nvidia could win most of the benchmarks with 32 pipes.
Nvidia plans to price its Geforce 7900 GTX very aggressively. That's what you do if you cannot beat your competitor on the performance side. This card should cost only $499, and even the price indicates that the card won't be able to get the performance crown back. The last time Nvidia had a great card that managed to win all the benchmarks, it priced it at a saucy $650 and sold every single piece.
The cards should be sampled by the end of the month or at CeBIT. Nvidia will invite its loyal press to Satan Clara to show them its new part. There will also be lower-clocked G70 90 nanometre parts branded as Geforce 7800 GT and priced even lower. Nvidia also plans to attack the X1600 generation with its new 7600 GT and GS cards, all scheduled for a CeBIT launch.
Remember, if G71 has 24 pipelines, so does the Sony PlayStation 3. We will still sniff around it, as some things are still unclear, and G71 is more important to Nvidia than you can imagine. As long as it can ship millions of these chips at 550MHz, the speed desired by Sony, Nvidia is fine.
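On the memory numbers in that article: 800MHz GDDR3 is 1600MHz effective because GDDR3 transfers twice per clock, and on a 256-bit bus (assumed here, the usual high-end width) it pencils out like this:

```c
#include <stdio.h>

int main(void) {
    const double mem_clock_mhz = 800.0;               /* real clock from the article */
    const double effective_mhz = mem_clock_mhz * 2.0; /* DDR: 2 transfers per clock */
    const int    bus_bits      = 256;                 /* assumed standard high-end bus */

    double gbps = effective_mhz * 1e6 * (bus_bits / 8.0) / 1e9;
    printf("%.0f MHz effective x %d-bit = %.1f GB/s\n",
           effective_mhz, bus_bits, gbps);            /* -> 51.2 GB/s */
    return 0;
}
```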
Saw that a couple of hours ago....taken with huge amounts of
http://www.library.villanova.edu/blu...2gifs/salt.jpg
Perkam
I don't think the Dead Sea has enough salt for all the crap out there...
I guarantee this whole thread is going to degenerate into people posting pics of salt mines :p:
One item to note: there are already some game developers working on DX10 games. There have even been some leaks with DX10 video. Also, the DX10 hardware standards should all be complete, so hardware may already be Vista-ready and DX10 compatible. But until Vista is released, and the DX10 code is ironclad, I would not want to announce being DX10 compatible as a manufacturer.
BTW, DX10 is looking very cool, especially the initial works from Crytek. All physical objects, like buildings, have physical boundaries. So an artillery barrage, like the ones in BF2, would also take out the buildings. Games will definitely take a leap forward.
And I am hoping for a 32-pipe core. 24 pipes may not sell well, but if they come in at $500 and keep pace with the X1900XTX, then that is not all bad either. Hope the spec rumors from the Taiwan sites hold true (32 pipes, 650 core, 1600MHz 1.1ns memory for the GTX and 24 pipes, 450 core, 1320MHz 1.4ns memory for the GT).
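If those Taiwan specs hold, the theoretical pixel fill rates pencil out like this (rumored numbers only, nothing confirmed):

```c
#include <stdio.h>

/* Rumored specs from the Taiwan sites: pipes x core clock = pixel fill rate. */
static void fill_rate(const char *name, int pipes, int core_mhz) {
    printf("%-7s %2d pipes x %d MHz = %5.1f Gpixel/s\n",
           name, pipes, core_mhz, pipes * core_mhz / 1000.0);
}

int main(void) {
    fill_rate("7900GTX", 32, 650);   /* rumored: 20.8 Gpixel/s */
    fill_rate("7900GT",  24, 450);   /* rumored: 10.8 Gpixel/s */
    fill_rate("GTX512",  24, 550);   /* 7800GTX 512 for comparison: 13.2 */
    return 0;
}
```

By that math the rumored GT would actually trail the 7800GTX 512 in raw fill rate, so clocks and pricing would have to carry it.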
I doubt that highly, due to the fact that Google only has 5 good pics of salt mines :p:
Quote:
Originally Posted by sabrewolf732
I think it has to be more than 24 pipes/650MHz; if that's all it is, I don't think it's really gonna catch the X1900XTX, which is more than 30% faster in several game titles. What I want is more info about the 7600GT/X1700XT.
Quote:
Originally Posted by HeavyH20
http://www.neowin.net/index.php?act=view&id=30487
Check out the DX10iness....
I wouldn't even play the storyline... I'd just run over buildings...
:)
the Crytek demo is INSANE, saw it a while ago
Quote:
Originally Posted by crackmann
there you have it, 7600GT benches appearing
Quote:
Originally Posted by sabrewolf732
Quote:
Originally Posted by metro.cl
no:slap:
thanks :D It doesn't seem much faster than a 6800GS, however :( might as well not wait :( Hopefully it overclocks really well and the 1.8GHz A64 was holding it back? Hmm, gonna register on the chilehw forums, gotta improve my Spanish, it's kinda lackluster :(
Quote:
Originally Posted by metro.cl
in fact, it looks identical to a 5% overclock on a 6800GS :stick:
Quote:
Originally Posted by sabrewolf732
7600 info belongs in the official 7300/7600 thread...which btw, had those benchies a week ago :rolleyes:
Perkam
sorry perkamy :(
Quote:
Originally Posted by perkam
That core is tiny....certainly no bigger than a 90nm G70 :stick:
And no shim? Does that mean I actually have to try when I mount the cooler? :(
Here's an overlay of a 136-pin (or ball, or whatever....might actually be 144--it's the one on the 7900GT's PCB, lol) GDDR3 IC on top of the core.
http://img86.imageshack.us/img86/328...toverla6nn.jpg
n.b., the OUTSIDE edge of the red box is the OUTSIDE edge of the GDDR3 IC :)
The closeup of the GPU is not the same GPU as in the board pic?
That's a zoom in from the pic of the entire PCB.
Hence why it's so ugly :D
EDIT: a zoom in of this pic (click if you want to see it....too lazy to rehost :p: ) http://images.dailytech.com/nimage/546_7900_front.jpg
7900 GTX 512 is set to have an MSRP of $599
http://www.theinquirer.net/?article=29821
I think some of the text is simply photoshopped out for some reason.
Quote:
Originally Posted by Ubermann
7900gtx has 32 pipes :)
God I hope so.. :slobber:
Quote:
Originally Posted by metro.cl
I don't think that is the 7900GT core; look at this, the cores are the same.
http://www.vr-zone.com/index.php?i=3219
The VR-Zone pic says it's a 7600GT core:
http://img98.imageshack.us/img98/1932/7600gt1mb.jpg
The DailyTech pic says it's a 7900GT core:
http://img98.imageshack.us/img98/711/7900gt9et.jpg
Someone must be wrong.
The dates are very different:
one is Feb 13th (the VR pic)
and the other is today (the DT pics).
BTW, the card shots from DT aren't similar.
Anyone wish to flip the coin of public opinion?
Considering the core is smaller than a GDDR3 IC, it's probably the 7600GT. No way a 90nm shrink of the G70 could get it that small :stick: (rough math below)
Quote:
Originally Posted by metro.cl
This is what the 7600 looks like:
http://lp.pcmoddingmy.com/albums/use...1139577987.jpg
http://lp.pcmoddingmy.com/albums/use...1139578213.jpg
Those pics are correct
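Rough sanity check on the size argument, taking the G70 at about 333 mm² on 110nm (ballpark, not an official figure) and assuming a perfect linear shrink to 90nm:

```c
#include <stdio.h>

int main(void) {
    const double g70_area_mm2  = 333.0;          /* assumed ballpark, 110nm G70 */
    const double shrink        = (90.0 / 110.0) * (90.0 / 110.0); /* area ~ feature^2 */
    const double gddr3_pkg_mm2 = 12.0 * 14.0;    /* assumed ~12x14mm FBGA package */

    printf("ideal 90nm G70 shrink: %.0f mm^2\n", g70_area_mm2 * shrink);
    printf("GDDR3 package:         %.0f mm^2\n", gddr3_pkg_mm2);
    /* ~223 mm^2 vs ~168 mm^2: even a perfect shrink should still be bigger
       than the memory package, so a core smaller than the IC points to the
       cut-down 7600GT, as argued above. */
    return 0;
}
```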
some more news:
a company is going to make a 4-core SLI card.
Let me guess, it looks kind of like this:
Quote:
Originally Posted by guess2098
http://www.3dfx.ch/gallery/albums/co..._100SB_001.jpg
Dual-core cards maybe, but quad I highly doubt..
hehe, I meant 1 card with 2 cores, like the one Dell had at CES
Dell's SLI setup had 4 cards.. it is just that the pairs of cards were connected.. it was a hack job..
Quote:
Originally Posted by guess2098
yeah, I think it is like that
Quote:
Originally Posted by nn_step
but not really sure
Talked to someone in the lab yesterday who said it is 4-core SLI cards (2 cores per card).
actually what would be really sweet would be Quad Cores in a Quad SLi set up
http://www.hardwareoc.hu/upload/news...SLi-Quad_1.jpg
16 cores of POWER!
nvidia is getting lazy if it has to resort to 4 gpus :stick:
Dude, that is like saying AMD became lazy when it went from single cores to dual cores... :fact:
Quote:
Originally Posted by sabrewolf732
it is just another trick they can use to boost performance..
Quote:
Originally Posted by nn_step
They are getting lazy.... Instead of engineering a better GPU, Nvidia is just adding more and more :nono:
it is called the law of diminishing returns..
Quote:
Originally Posted by sabrewolf732
it is identical to the RISC/CISC debate..
I would rather have Quad RISC than a single great CISC
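On diminishing returns, a quick Amdahl's-law style estimate in C shows the shape of the curve; the 20% non-scaling share is invented, just for illustration:

```c
#include <stdio.h>

/* Amdahl's law: speedup(n) = 1 / (serial + parallel/n).
   With any fixed non-scaling share, each extra GPU buys less than the last. */
int main(void) {
    const double serial = 0.20;   /* invented: CPU/driver work that won't scale */
    for (int gpus = 1; gpus <= 4; gpus *= 2) {
        double speedup = 1.0 / (serial + (1.0 - serial) / gpus);
        printf("%d GPU(s): %.2fx\n", gpus, speedup);
    }
    return 0;
}
```

With those numbers, 2 GPUs get you 1.67x but 4 GPUs only 2.5x, which is roughly why the fourth GPU never buys what the second did.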
As long as performance increases, I'm happy ;)
Quote:
Originally Posted by sabrewolf732
I'd rather have a well-engineered product than some brute-force attempt. I like my computers small, not too hot, and not consuming that much power. Brute force is inefficient, and I'm all about efficiency. Also, this adding-more-and-more-cores approach is actually raising the price: a few years ago the highest-end video card setup cost you $300-$400. Now, with the advent of quad SLI, you can easily get into the area of several thousand dollars.
Quote:
Originally Posted by nn_step