
View Full Version : nvidia gt300



bigKr33
09-12-2009, 09:00 PM
I'm sure this has been asked several times, but I can't find any consistent info. I'm probably just looking in the wrong places.

I understand that the GPU Technology Conference is coming up very soon. What sort of things take place at these conferences, in a nutshell? I'm really anticipating the release of the new GTX series, or whatever it's going to be called. Is there any news with regards to its release date?

I really appreciate the info. Thanks

Oberon
09-12-2009, 09:52 PM
Most rumors (I don't know that there's any hard info) suggest that the cards will be released in limited quantities at the end of this year.

zanzabar
09-12-2009, 09:56 PM
It's 4-6 months out, so I wouldn't trust any info on it. We didn't have hard GT200 info until 1-2 months before it was out, and I would expect that to continue.

bigKr33
09-13-2009, 02:18 AM
Thanks for the input guys. Yeah, that's a long wait then, if it's 4-6 months out. I may just have to stick with the GTX 275 for now. Well, I shouldn't worry about it yet anyway; my computer still isn't finished :shakes:

Thanks again

perkam
09-13-2009, 07:51 AM
We started getting concrete information for the 5870 in early September, and it's a September launch product.

The GT300 won't launch till Q1 2010. Do the math.

Perkam

Talonman
09-16-2009, 10:42 AM
We don't know how soon GT300 will launch...

It should be fast....

http://www.fudzilla.com/content/view/15535/1/

"A few people who saw just leaked performance data on ATI's soon to launch Radeon HD 5870 have told Fudzilla that they are absolutely confident that GT300 will win over Radeon HD 5870."

Possible specs on the 380...

http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units#GeForce_300_Series

Code Name: GeForce 380 (G300)

Year: Q4/09

Fab process (nm): 40

Transistors (million): <2000

Die size (mm²): 500

Number of dies: 1

Bus Interface: PCIe x16 2.0

Memory (MB): 1024? 2048?

Core configuration: 2x 512?:128?:64?

Clocks (MHz): Core 700, Shader 1600, Memory 1100

Pixel fill rate (GP/s): 44.8

Texture fill rate (GT/s): 89.6

Bandwidth (GB/s): 281.6

Memory type: GDDR5

Bus width: 512-bit (32-bit x 16)

DirectX: 11

OpenGL: 3.1

GFLOPs (MADD+MUL): 2457.6

TDP (watts): ≈225

Price: $499?

New MIMD instructions, CUDA 3.0 and OpenCL




Possible specs on the 390

Code Name: GeForce 390 (G300)

Year: Q1/10

Fab process (nm): 40

Transistors (million): 2x 2000

Die size (mm²): 2x 500

Number of dies: 2

Bus Interface: PCIe x16 2.0

Memory (MB): 1792? 3584?

Core configuration: 2x 480?:112?:44?

Clocks (MHz): Core 725?, Shader 1600?, Memory 1200?

Pixel fill rate (GP/s): 2x 31.9?

Texture fill rate (GT/s): 2x 81.2?

Bandwidth (GB/s): 2x 268.8?

Memory type: GDDR5

Bus width: 2x 448-bit (32-bit x 14)

DirectX: 11

OpenGL: 3.1

GFLOPs (MADD+MUL): 4608?

TDP (watts): ≈425

Price: $699?

New MIMD instructions, CUDA 3.0 and OpenCL
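For what it's worth, the headline figures in these rumor sheets are at least internally consistent with the listed clocks. A quick sanity check (assuming 3 FLOPs per shader per clock for MADD+MUL and GDDR5's 4 transfers per memory clock; these conventions, like the specs themselves, are unconfirmed):

```python
# Sanity-check the rumored GT300 figures against the listed clocks.
# Assumptions (unconfirmed): MADD+MUL counts as 3 FLOPs per shader per
# clock, and GDDR5 transfers 4 bits per pin per memory clock.

def gflops(shaders, shader_mhz, flops_per_clock=3):
    """Theoretical shader throughput in GFLOPs."""
    return shaders * shader_mhz * flops_per_clock / 1000

def bandwidth_gbs(mem_mhz, bus_bits, transfers_per_clock=4):
    """Theoretical memory bandwidth in GB/s."""
    return mem_mhz * 1e6 * transfers_per_clock * (bus_bits / 8) / 1e9

# "GeForce 380": 512 shaders @ 1600 MHz, 512-bit bus @ 1100 MHz GDDR5
print(gflops(512, 1600))         # 2457.6, matching the listed GFLOPs
print(bandwidth_gbs(1100, 512))  # 281.6, matching the listed GB/s

# "GeForce 390", per GPU: 480 shaders @ 1600 MHz, 448-bit @ 1200 MHz
print(2 * gflops(480, 1600))     # 4608.0, matching the "2x" total
print(bandwidth_gbs(1200, 448))  # 268.8, matching the per-GPU GB/s

# A 448-bit bus is 14 chips x 32 bits, so with 128 MB chips:
print(14 * 128)                  # 1792 MB per GPU, as listed for the 390
```

That last line also explains the odd-looking 1792 MB figure: it pairs with a 448-bit bus (14 chips), not a 512-bit one.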

SNiiPE_DoGG
09-16-2009, 11:00 AM
LOL, you posted specs from Wikipedia. No, the GT300 will not come out before the end of '09... if it were coming, Nvidia would have already announced it themselves to steal hype from the 58xx.

Chumbucket843
09-16-2009, 11:13 AM
^^Those specs don't look right. 425 watts on one card is ridiculous, and how could you have 1792 MB of memory on a 512-bit bus?

Oberon
09-16-2009, 11:17 AM
Also, fewer than 2 billion transistors and they expect to be that much faster than GT200? I know it's a new architecture and that performance doesn't scale with transistor count in any meaningful way, but I'm guessing that count is wrong.

Talonman
09-16-2009, 11:18 AM
LOL, you posted specs from Wikipedia. No, the GT300 will not come out before the end of '09... if it were coming, Nvidia would have already announced it themselves to steal hype from the 58xx.

With NDA still in effect, Wikipedia is about all we have right now... :)

Us Nvidia boys still think it will be close...

But like others have posted, still just rumors...

Here is an older link with some info.
http://www.brightsideofnews.com/news/2009/4/22/nvidias-gt300-specifications-revealed---its-a-cgpu!.aspx

bigKr33
09-16-2009, 08:16 PM
Yeah, after doing a little more research, I don't think it's going to be out anytime soon. Plus there's the rumor about only a 2% yield, but who knows. I really don't think it's going to be this year. Maybe Q1 2010.

I may just have a change of heart though. AMD's new 5 series is very tempting, and it comes out very soon. It would be nice to top off my build, which will be finished really soon.

Talonman
09-17-2009, 10:42 AM
I still think the 300 will be out this year.

SNiiPE_DoGG
09-17-2009, 10:50 AM
I think unicorns are real!










doesn't make it true

B.E.E.F.
09-17-2009, 11:31 AM
I still think the 300 will be out this year.

Then this year must be over 12 months long.

http://majorleaguejerk.com/wp-content/uploads/2009/06/smarch.jpg

Talonman
09-17-2009, 11:42 AM
We shall see... :)

It's not like anybody but nVidia has the answer for sure.

iTravis
09-17-2009, 12:23 PM
I think unicorns are real!










doesn't make it true

If what Charlie said is true then yeh :ROTF:

Oberon
09-17-2009, 12:26 PM
Wait, define "out". Do you mean actually available for retail or do you mean listed everywhere but out of stock and, rather mysteriously, only review websites get parts?

Talonman
09-17-2009, 12:50 PM
Hopefully out in the store to buy... :)

I want the card that can beat a 295!!

SNiiPE_DoGG
09-17-2009, 12:55 PM
a 5870 you mean?

Levesque
09-17-2009, 01:02 PM
Nvidia CEO saying: ''DX11 is not important'' = no DX11 card coming soon from Nvidia... ;)

Talonman
09-17-2009, 01:07 PM
a 5870 you mean?

No, not the 5870...

It is slower at 1920x1200 with 4XAA and 8XAF... :)

And if the review sites have the gonads to include it in their graphs, they will also reflect this.

The 5870X2 with more memory will beat the 295...
But the 380 will beat a 5870, and the 390 will beat the 5870X2.


Nvidia CEO saying: ''DX11 is not important'' = no DX11 card coming soon from Nvidia... ;)

That would be bad info. The 380 and 390 will be DX11.

B.E.E.F.
09-17-2009, 01:11 PM
We shall see... :)

It's not like anybody but nVidia has the answer for sure.

There's no hardware to release so they're quiet about it. When you start to hear more 'rumors' from the community, you can be sure the hardware is months away.


No, not the 5870...

It is slower at 1920x1200 with 4XAA and 8XAF... :)

Nobody knows. Reviews will be up soon then you can say that for sure

Talonman
09-17-2009, 01:15 PM
There's no hardware to release so they're quiet about it. When you start to hear more 'rumors' from the community, you can be sure the hardware is months away.


Possible, but you know they would love to have it out for the Christmas shopping season. :D


Nobody knows. Reviews will be up soon then you can say that for sure

Kind of like when the 300 is released?

Guessing doesn't seem to stop anybody else...

wez
09-17-2009, 02:32 PM
Kind of like when the 300 is released?

Guessing doesn't seem to stop anybody else...

The difference is we know the specs for 5870, and that it will be released this year :up:

SNiiPE_DoGG
09-17-2009, 02:46 PM
No, not the 5870...

It is slower at 1920x1200 with 4XAA and 8XAF... :)



huh?

http://www.techpowerup.com/reviews/AMD/HD_5000_Leaks/images/perf2.jpg

with 295 coming in at 25fps that has got to be very high resolution, quality and AA

Levesque
09-17-2009, 02:49 PM
That would be bad info. The 380 and 390 will be DX11.

We all know that. But that's why we now know the 380 won't be out soon... ;) Or else Nvidia's PR would be all over the internet saying that DX11 is the 2nd coming...

Talonman
09-17-2009, 04:18 PM
huh?

http://www.techpowerup.com/reviews/AMD/HD_5000_Leaks/images/perf2.jpg

with 295 coming in at 25fps that has got to be very high resolution, quality and AA

We will have to see how the reviews call it.... :)

From what I have seen in the posted benchmarks, the 295 looked to be on top, at 1920x1200 X4 AA, X8 AF.
http://img35.imageshack.us/img35/3665/benchmarkn.jpg

The 5870 will probably win a few, as it is a very impressive card.

I still would think the 295 will come out on top, most of the time...
(It is a dual chipped card after all.)

I do find the Crysis benchmark you posted especially odd, since that game normally runs best on Nvidia GPUs.

Talonman
09-17-2009, 04:24 PM
We all know that. But that's why we now know the 380 won't be out soon... ;) Or else Nvidia's PR would be all over the internet saying that DX11 is the 2nd coming...

You may have a point, as they are not known for being a quiet bunch.
They are good at what they do. ;)

B.E.E.F.
09-17-2009, 05:43 PM
I still would think the 295 will come out on top, most of the time...
(It is a dual chipped card after all.)

The dual chips might be the reason it doesn't. Bottlenecks and all. Thanks for the benchmarks. Looks like they show that the HD5870 is indeed faster than the 295, and shows more promise with driver improvements.

Can't wait for the other benches. The HD5870 is a little pricey now.

Talonman
09-17-2009, 05:49 PM
Promise is the key word. :)

I still want Nvidia to make the same driver performance improvements at higher AA levels that they did with 4xAA... :)

The 295 using 4XAA rocks.

I also want to see how much NewEgg sells the HD5870 for. I really do think it is a monster card, and I like that it can run multiple FPDs.

I don't think it should take the 295's crown yet, though... like the 4870X2 took the 280's. (Not for 1920x1200 anyway.)
It may down the line, but I think we need to see those driver improvements first. ;)

For 2560x1600, I will give it to the 5870, but I do think there are a lot more 1920x1200 guys.

Heck, they are so close in performance on most things, we just need better benchmarks to confirm this raw info for a final ruling.
It could go a tad either way, and then all bets would be off.

LowRun
09-18-2009, 04:19 AM
We will have to see how the reviews call it.... :)

From what I have seen in the posted benchmarks, the 295 looked to be on top, at 1920x1200 X4 AA, X8 AF.
http://img35.imageshack.us/img35/3665/benchmarkn.jpg

The 5870 will probably win a few, as it is a very impressive card.

I still would think the 295 will come out on top, most of the time...
(It is a dual chipped card after all.)

I do find the Crysis benchmark you posted especially odd, since that game normally runs best on Nvidia GPUs.

Did you notice in the graph you provided 5870 wins 10 out of the 10 tests @ 2560x1600 8xAA 16xAF. Looks a lot like the 295 beater you were asking for :shrug:

jaredpace
09-18-2009, 04:39 AM
Did you notice in the graph you provided 5870 wins 10 out of the 10 tests @ 2560x1600 8xAA 16xAF. Looks a lot like the 295 beater you were asking for :shrug:

Yes, the 5870 also won in 12 out of 14 games at the highest resolutions tested (only losing in FC2 & Warhead)... it definitely beats the 295 where it matters most. Who gives a crap if one card is at 180fps and the other at 195fps when the resolution is 1680x1050 0AA/0AF? May as well "put them to the test" :)

Talonman
09-18-2009, 05:23 AM
Did you notice in the graph you provided 5870 wins 10 out of the 10 tests @ 2560x1600 8xAA 16xAF. Looks a lot like the 295 beater you were asking for :shrug:

Yes I did... It did do an outstanding job at that res. :)

Good thing I'm a 1920x1200 guy!

Those guys that run at 2560x1600 are a lot harder to find. What size display is that?


Yes, 5870 also won in 12 out of 14 games at the highest resolutions tested (only losing in FC2 & Warhead).... def. beats the 295 when in matters most. Who gives a crap if one card is 180fps and the other is 195fps when the resolution is 1680x1050 0AA/0AF - may as well "put them to the test" :)

When it matters most is when you run @ 2560x1600?

I still think there are WAY more 1920x1200 guys. (24" FPDs are usually optimized for 1920x1200.)

What RES do you guys run at?


FYI - Posted by: ownage

http://www.xtremesystems.org/forums/showthread.php?t=234525&page=9

http://img21.imageshack.us/img21/3395/gt300.jpg

GT300-A1 version

GT300
Drawing clock (MHz): 700
Processor clock (MHz): 1600 x 512 shaders = 819200
281.6 GB/s GDDR5, <225W

GT200b
Drawing clock (MHz): 648
Processor clock (MHz): 1476 x 240 shaders = 354240
159.0 GB/s GDDR3, <204W

Looks a lot like Wikipedia...

A 295's stock clock settings are Core = 576MHz, and Shaders (Stream Processors) = 1242MHz

My 295 will OC to: Talonman ------------ Q6600@ 3.81GHz --- (1) 295 679/1548/1224 ----- P24,252 (PhysX on)

If the A1 Revision would run at Core = 700MHz, and Shaders = 1600MHz at stock speed, that would be a nice jump! :)

The 512-bit bus is sweet, and I do hope it's 2GB of GDDR5.
If some extra performance is indeed realized going from SIMD to MIMD, we could be in for a good ride.

I can't wait until they strap that puppy in for a bench!
Better use a faster CPU than I have, to let it shine. ;)
She sounds like a runner...
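Taking those rumored numbers at face value, the implied jump in raw shader throughput over GT200b is easy to put a figure on (clock times shader count, exactly as in the spec sheet above; theoretical only):

```python
# Raw shader throughput implied by the rumored clocks (MHz x shader count),
# using the same arithmetic as the spec comparison above. Theoretical only.
gt300 = 1600 * 512    # 819200, as in the rumored GT300 figures
gt200b = 1476 * 240   # 354240, as in the GT200b figures
print(gt300 / gt200b) # roughly 2.31x the raw shader throughput on paper
```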

LowRun
09-18-2009, 07:50 AM
FYI - Posted by: ownage

http://www.xtremesystems.org/forums/showthread.php?t=234525&page=9

http://img21.imageshack.us/img21/3395/gt300.jpg


FYI - Posted by jaredpace


Maybe you saw it here?

http://www.xtremesystems.org/Forums/showpost.php?p=3266406&postcount=58

Talonman
09-18-2009, 07:54 AM
I can't remember if I did...

Thanks for the link.

perkam
09-18-2009, 08:36 AM
Talon, you're the ONLY one going around defending the 295 vs the 5870 m8.

Why bother? It's not like people will forget to look at heat, size and wattage issues. The 5870 is a next generation card with next generation features.

NO ONE wants to spend $500 and be stuck in 2009.

Perkam

Talonman
09-18-2009, 09:56 AM
Perkam, you were the one trying to sell that the 4890 took the 285 down so badly... :ROTF:

You shouldn't be so quick when it's Nvidia's crown that you're trying to hand back to ATI, bro...

It looked to me like you were trying to sell your own thing...

http://www.xtremesystems.org/forums/showthread.php?t=234531

I saw your '$399 for > GTX 295 Performance !!!' thread.


http://www.xtremesystems.org/forums/showthread.php?t=233621&page=46

Posted by saaya:

'5850 beats 285 but not by much
so going from a 285 to a 5850 doesnt really make sense...

upgrading to a 5870 MAYBE makes sense...

upgrading from a 295 to 5870 would actually be a step back... unless your just into benching and want hundreds of fps... where you really need extra gpu perf to get acceptable playable fps, ie demanding games, the 295 seems to be about the same or faster than the 5870.'

Just wait for the official reviews. To me, it looks res dependent. I'm sure it does to others too.

B.E.E.F.
09-18-2009, 01:06 PM
Those guys that run at 2560x1600 are allot harder to find. What size display is that?

30" and up. But more models are coming out in 1920x1200, and now there's another NEW standard for LCDs: 1920x1080. :shrug:


upgrading from a 295 to 5870 would actually be a step back... unless your just into benching and want hundreds of fps... where you really need extra gpu perf to get acceptable playable fps, ie demanding games, the 295 seems to be about the same or faster than the 5870.'[/COLOR]

Just wait for the official reviews. To me, it looks res dependent. I'm sure it does to others too.

It depends on your needs and how much money you can get for selling a used GTX 295.

If you're looking at power consumption, noise, and heat, the HD5870 wins because it's built on the 40nm process. And you don't get any multi-card issues like with CrossFire/SLI. And yes, wait for the official reviews. We don't even know which driver this card was tested with. Maybe the official driver offers more, or even less, performance. Who knows?

Talonman
09-18-2009, 01:24 PM
Thanks for the info on the '30" and up' 2560x1600 displays.

I will be waiting to see if the reviews cover 1920x1200, or if they go right for the 30" + res displays.

If they don't cover the lower res too, a lot of people may walk away with info that isn't the most accurate for their system.

There will no doubt be some strong opinions on how the reviews were conducted. :)

Sounds like the makings for a good read! :up:

If the 5870 is faster than a 295, they better get it in the store fast, before the 380 and 390 are released...

Or its reign will be a short one. ;)

jasonelmore
09-18-2009, 01:25 PM
Meh, I believe I'll just stick with my GTX 295 for now, until the GT300 comes out in or around January. The only game that pushes my GTX 295 is Crysis Wars, and from the benchmarks above it looks like the 295 is neck and neck with the 5870. I'm just hoping my dual-PCB GTX 295 holds its value around Christmas time so I can sell it and get the GT300 with $100 to boot.

Talonman
09-18-2009, 01:38 PM
My dual pcb 295 will make it to the 380 too. :)

Sounds like a plan. :D

bigKr33
09-20-2009, 07:11 AM
I'm going to try waiting for the GT300. I'm more of an Nvidia fan, but I need more info if I'm going to wait. I really would like to get the GTX 295's successor. But if for whatever reason I don't hear anything soon, I'm going straight for the 5870X2.

Clairvoyant129
09-20-2009, 09:09 AM
Perkam, You were the one who was trying to sell that the 4890 took a 285 down so bad... :ROTF:

You shouldn't be so quick when it's nVidia's crown, that your trying to hand back to ATI bro...

I prefer Nvidia cards, but I still think you're trying too hard, Talon. With driver improvements, at the end of the day the HD5870 will be faster than a GTX 295 at every resolution that isn't CPU bound.

CyberDruid
09-21-2009, 06:34 AM
Any rumors or whispers about a dual GPU 300 series?

216
09-21-2009, 10:18 AM
Thanks for the info on the '30" and up' 2560x1600 displays.

I will be waiting to see if the reviews cover 1920x1200, or if they go right for the 30" + res displays.


Looks like I need a 5870 for my 30incher!
Brings water to my eyes!!! :rofl:

LordEC911
09-21-2009, 02:55 PM
Any rumors or whispers about a dual GPU 300 series?
Quite a while ago; I don't know how much to trust it though, since recent talk of G300 is of a 40nm chip larger than G200...
They will have to wait for the process to mature quite a bit, or maybe hope to hop onto 32/28nm pretty quickly, and maybe get it out in 2H '10.

Chumbucket843
09-21-2009, 03:47 PM
Quite awhile ago, don't know how much to trust it though since recent talk of G300 is of a 40nm chip larger than G200...
They will have to wait for the process to mature quite a bit or maybe hope to hop on 32/28nm pretty quickly and maybe get it out in 2H '10.

I thought it was ~104 dies per wafer? GT200 is ~94 and GT200b is ~120. They aren't increasing the bus size, and the shaders scale pretty well with the fab process, so we will see if the rumours are true.

lonewolf218
09-23-2009, 04:31 AM
We all know that. But that's why we now know the 380 won't be out soon... ;) Or else Nvidia's PR would be all over the internet saying that DX11 is the 2nd coming...

Or Nvidia knows they have an ace up their sleeve and really isn't worried about what ATI says. Everything ATI/AMD says should be taken with a grain of salt; I've personally never seen their products live up to the hype. The fact that they are comparing their new products to Nvidia's last generation pretty much sums it up for me.

216
09-23-2009, 09:02 AM
The fact they are comparing their new products to Nvidia's last generation pretty much sums it up for me.

What else are they going to do? Compare yields with the GT300? :rofl:
Use your loaf!

Jamesrt2004
09-23-2009, 10:02 AM
Or Nvidia knows they have an ace up their sleeve and really aren't worried what ATI says. Everything ATI/AMD says should be taken with a grain of salt, I've personally never seen their products live up to the hype. The fact they are comparing their new products to Nvidia's last generation pretty much sums it up for me.

This just made my day... damn that ATI, comparing their products to other ones on the market *shakes fist*

Anyway, we normally see around a 5-15% increase from driver updates in games, so I still think the 5870 will just trade punches with the 295 while being cheaper (which is always good). I think the cheapest 5870 vs. 295 price gap on Newegg (I'm from the UK, I don't know any other US site :P) was $90. That's a big gap, and not worth it to "swap" cards, but if you're choosing between the two, the 5870 wins, especially with the low power consumption and the (finally) fixed idle power consumption...

I do believe the GT300 will beat the RV8xx, but there's a reason Nvidia is quiet. Normally they are the ones showing off when they have something coming up... it's just quiet... mm, unsettling? I hope it does well. I'm just worried the monolithic design (lower yields) on TSMC's (crap) 40nm process has kinda shot them in the foot a tad :/ ...

BTW, don't believe those leaked specs on the wiki... even the 58xx was over 2 billion transistors, and assuming the GT300 is still a "monolithic" design, it just has to have more, right :/?

meh.

Good luck to them both, but I just want Nvidia's stuff out sooner!!!!!



P.S. Also, if you didn't see the hype for the 4xxx series, you're a deluded fanboy... lol, everyone agrees that was an awesome launch; I mean, they surprised even themselves.

lonewolf218
09-23-2009, 10:51 AM
What else are they gunna do! Compare yields with the gt300 :rofl:
Use your loaf!

You missed the point entirely. The 5870 is newer tech; it should beat the 295 hands down. It's apples and oranges, but instead ATI wants us to believe they have pulled off a great victory. Nvidia's answer to the 5870 is right around the corner, and it's rather presumptuous for ATI to proclaim victory just yet.

Jamesrt2004
09-23-2009, 12:33 PM
You missed the point entirely. The 5870 is newer tech; it should beat the 295 hands down. It's apples and oranges, but instead ATI wants us to believe they have pulled off a great victory. Nvidia's answer to the 5870 is right around the corner, and it's rather presumptuous for ATI to proclaim victory just yet.
Then why complain? If the GT300 is as great as you think, you have nothing to fear... they will just show it beating the ATI cards :rolleyes:


IMO they got out a new card that's trading blows not with an old-gen card, but with a DUAL old-gen card, while priced $90 UNDER it... that's still awesome price/perf-wise. And when it comes down in price it will still be the best price/perf-wise, which is what everyone should actually pay attention to.

lonewolf218
09-23-2009, 02:01 PM
imo they got out a new card.. swapping blows with not a old gen card... a DUAL old gen card of a price of $90 UNDER it.. thats still awesome.. price/perf wise. and when it comes down in price it will still be the best price/perf wise.. which is what everyone should actually pay attention to

Well, almost everyone. There is one other factor in the decision- folding@home.
Until Stanford gets their act together with respect to ATI GPU's, Nvidia is the only option for some of us.:(

Maybe it's just wishful thinking, but I hope the gt3xx's do blow away the 5870's because then 1)ATI, having no claim to the top spot, would lower their prices to remain attractive. 2) Nvidia, realizing their cards are priced much higher than ATI, would also be forced to cut prices. 3) We the consumers, green or red, would save $$.
I would call that a victory for us all.:)

Otis11
09-23-2009, 06:47 PM
Well, almost everyone. There is one other factor in the decision- folding@home.
Until Stanford gets their act together with respect to ATI GPU's, Nvidia is the only option for some of us.:(

Maybe it's just wishful thinking, but I hope the gt3xx's do blow away the 5870's because then 1)ATI, having no claim to the top spot, would lower their prices to remain attractive. 2) Nvidia, realizing their cards are priced much higher than ATI, would also be forced to cut prices. 3) We the consumers, green or red, would save $$.
I would call that a victory for us all.:)


Unfortunately, Nvidia, realizing they had the superior card, would not lower their prices and might even raise them... If they have the superior card. (I think they will, but not going to claim anything for the time being) :shrug:

I'm really a Nvidia person because of the applications with Cuda and the DC section, but IMO - if you're not into either of those, this is a really attractive offer. :up:

Just a note: It doesn't matter that ATI's solution is newer tech, it's all that's on the market and we don't even have a timeline for the 300 series. I will hold out because of what I do with my cards, but again, if that's not for you, I don't see why you wouldn't upgrade. This seems to be an amazing card, at a great price. The only reason I see to go anything else is for DC/Cuda...

Jamesrt2004
09-24-2009, 01:55 AM
Unfortunately, Nvidia, realizing they had the superior card, would not lower their prices and might even raise them... If they have the superior card. (I think they will, but not going to claim anything for the time being) :shrug:

I'm really a Nvidia person because of the applications with Cuda and the DC section, but IMO - if you're not into either of those, this is a really attractive offer. :up:

Just a note: It doesn't matter that ATI's solution is newer tech, it's all that's on the market and we don't even have a timeline for the 300 series. I will hold out because of what I do with my cards, but again, if that's not for you, I don't see why you wouldn't upgrade. This seems to be an amazing card, at a great price. The only reason I see to go anything else is for DC/Cuda...

+1

I think the 5850 will be the best bang for the buck this time around. I'm just worried about Nvidia's strategy (the 4870 is the only ATI card I've ever owned), but price/perf-wise they're just kicking Nvidia in the nuts, and it sucks :(


I agree that ATI should do more about DC,

but apparently there are like two types of things they can do, single point and multi point; Nvidia is amazing at single point (like the normal DC stuff) and ATI at multi point (the Milkyway DC app and stuff like that),

so I think it's the kind of calculations they do that affects it. But meh, I can't fold anyway; my parents complain about the power draw =(

bigKr33
09-24-2009, 02:47 AM
I'm really impressed with the 5870's pricing, but what about the X2 version? That's the one I would really want to buy. However, it's the scaling that worries me about the ATI cards. We know the dual-GPU cards perform smack dab between the single and CrossFire setups, and two 5870s in CrossFire are only around 30% (+/- 5%) faster than the single setups. So are we only going to expect a 15-20% increase in performance from ATI's dual GPUs? And at what cost... $599?

Don't get me wrong, 30% is a big performance jump, but not when Nvidia scales to nearly 50% with two GPUs in SLI.

Please don't bash me; I'm just going by what I have seen in benchmarks and reviews. There are some of us who go for price/performance, and some who go for what's simply faster. Either way, I'm still considering the ATI as an option. Just my $.02

wez
09-24-2009, 03:03 AM
I'm really impressed with the 5870's pricing, but what about the x2 version? Thats the one I would want to buy really. However its the scaling that worries me about the ati's. We know the dual counterparts perform smack dab between the single and crossfire setups. And two 5870's in crossfire are only around 30% faster +/- 5% than the single setups. So are we only going to expect a 15-20% increase in performance for dual gpus from ati? And at what cost... $599?

Don't get me wrong, 30% is big performance jump. But not when nvidia scales to nearly 50% with two gpu in sli.

Please don't bash me, I'm just going by what I have seen with benchmarks and reviews. Theres some of us that go for price/performance, and theres some that go for whats simply faster. Either way I'm still considering the ati as an option though. Just my $.02

I’m going to wait for the x2, at first I was going for two 5870, but the price for a pair is a bit on the steep side this early on, and a single card isn’t enough to justify an upgrade, at least not until I see some dx11 apps :)

Regarding the CFX scaling, I'd say it's due to not-so-optimal drivers, and hopefully it will get better further down the road; in some games the scaling is already where it should be.

bigKr33
09-24-2009, 03:15 AM
I’m going to wait for the x2, at first I was going for two 5870, but the price for a pair is a bit on the steep side this early on, and a single card isn’t enough to justify an upgrade, at least not until I see some dx11 apps :)

Regarding the cfx scaling I’d say it’s due to not so optimal drivers, and hopefully it will get better further down the road. As in some games the scaling is where it should be already.

I would hope it's the drivers. I noticed that the scaling for the HD4000 is about the same as the HD5000. Well, either way, let's just hope there is a driver improvement.

Jamesrt2004
09-24-2009, 04:04 AM
I'm really impressed with the 5870's pricing, but what about the x2 version? Thats the one I would want to buy really. However its the scaling that worries me about the ati's. We know the dual counterparts perform smack dab between the single and crossfire setups. And two 5870's in crossfire are only around 30% faster +/- 5% than the single setups. So are we only going to expect a 15-20% increase in performance for dual gpus from ati? And at what cost... $599?

Don't get me wrong, 30% is big performance jump. But not when nvidia scales to nearly 50% with two gpu in sli.

Please don't bash me, I'm just going by what I have seen with benchmarks and reviews. Theres some of us that go for price/performance, and theres some that go for whats simply faster. Either way I'm still considering the ati as an option though. Just my $.02

All the good benchmark sites I've seen showed up to 90% scaling, with the lowest around 60%, not 20/30% :S... unless you're counting one card as 50%, in which case I agree.

50% would be perfect scaling in that framing. And that's not true; ATI normally scales way better than Nvidia cards. The 5870 just has :banana::banana::banana::banana:ty drivers :ROTF: Dunno why they don't optimize them a lot more before they come out :/

Farinorco
09-24-2009, 04:08 AM
You missed the point entirely. The 5870 is newer tech; it should beat the 295 hands down. It's apples and oranges, but instead ATI wants us to believe they have pulled off a great victory. Nvidia's answer to the 5870 is right around the corner, and it's rather presumptuous for ATI to proclaim victory just yet.

What? Since when should new-generation cards beat the higher-end cards of the previous generation "hands down"? You meant the GTX 285, right?

Anyway, speaking about "should", keep in mind that both the Radeon HD4800 and HD5800 are aimed at the sweet-spot value segment at 264 and 333 mm², while the GeForce GTX parts are aimed at the enthusiast, I-don't-mind-how-much-I-pay-as-long-as-I-have-the-most-powerful-chip-in-the-world market at 576 (470 at 55nm) and a rumored 530+ mm². So I find it pretty impressive that the new 333 mm² chip keeps so close to a dual 470 mm² solution from last year. Do you really think the new 333 mm² chip "should be beating hands down" the dual 470 mm² GeForce?

I think the performance increase from the Radeon HD4000 to HD5000 series is pretty impressive, and I really hope the GeForce's is even higher, because they need it to avoid the same bad situation as last year (using enthusiast chips to compete against competitors' value-performance chips, and so losing money on all fronts).

The monstrous GeForce GTX chips should be competing against the X2 Radeon solutions of the same generation, so let's hope this time it happens that way.

bigKr33
09-24-2009, 04:38 AM
All the good benchmark sites I've seen showed up to 90% scaling, with the lowest around 60%, not 20/30% :S... unless you're counting one card as 50%, in which case I agree.

50% would be perfect scaling in that framing. And that's not true; ATI normally scales way better than Nvidia cards. The 5870 just has :banana::banana::banana::banana:ty drivers :ROTF: Dunno why they don't optimize them a lot more before they come out :/

I'm just doing simple math: I use the percentage of framerate difference. For example, 75fps vs 100fps is a 25% difference. Although I am curious how you got those percentages.

Farinorco
09-24-2009, 07:21 AM
I'm just doing simple math. I just use the percentage of framerate difference. Example, 75fps vs 100fps is a 25% difference. Although I am curious how you got those percentages.

Anandtech, for example, shows an average improvement of +59% in CF mode, with 4 out of 8 games giving >70% extra performance, the lowest being +28% and the highest +92%, setting aside the case of Dawn of War II (there CF is obviously malfunctioning, since it gives less performance than a single card). TPU gives +49% @1920x1200 and +61% @2560x1600 across all the games (I have not checked whether all of them were working as they should). Nothing like a 20-30% improvement.

Ahm, and 75fps vs 100fps is not a 25% increase. It's a 33% increase. 75->100, you're increasing by 25, and 25 is 1/3 (33%) out of 75. ;)

But I thought this was a thread about GT300...

bigKr33
09-24-2009, 08:40 AM
Anandtech, for example, shows an average improvement of +59% in CF mode, with 4 out of 8 games giving >70% extra performance, the lowest being +28% and the highest +92%, setting aside the case of Dawn of War II (there CF is obviously malfunctioning, since it gives less performance than a single card). TPU gives +49% @1920x1200 and +61% @2560x1600 across all the games (I have not checked whether all of them were working as they should). Nothing like a 20-30% improvement.

Ahm, and 75fps vs 100fps is not a 25% increase. It's a 33% increase. 75->100, you're increasing by 25, and 25 is 1/3 (33%) out of 75. ;)

But I thought this was a thread about GT300...

Yeah, you totally lost me. 75 out of 100 leaves 25 left. I really have no idea how you get 33%. That would be in 1/3's :wth:

I would need a link to how they measure scaling, because it really is confusing. I'm not saying you're wrong or anything, but we're obviously seeing this in completely different ways.

Farinorco
09-24-2009, 08:53 AM
Yeah, you totally lost me. 75 out of 100 leaves 25 left. I really have no idea how you get 33%. That would be in 1/3's :wth:

I would need a link to how they measure scaling, because it really is confusing. I'm not saying you're wrong or anything, but we're obviously seeing this in completely different ways.

No, it's not a completely different way. It's simple math (as you said); you're just doing it wrong.

If you have 75fps, and you increase up to 100fps, you are increasing +25fps. You said yourself. Now, how much are those 25fps expressed like a percentage of the original quantity, 75fps? 25/75*100=33.33%.

You're doing your maths wrong, because you're not calculating how much the original quantity is increased, but how much that increase is in terms of the final quantity. Which is wrong. Yes, going to 75fps from 100fps is a 25% decrease from 100fps, but going to 100fps from 75fps is a 33% increase from the 75fps.

The same way, doubling the original quantity (let's say from 50fps to 100fps) is always a 100% increase, while halving the original quantity (let's say from 100fps to 50fps) is always a 50% decrease.

With your maths, doubling performance is a 50% increase, and it's not, doubling is a 100% increase.

Going from 75 to 100: +33%. Going from 100 to 75: -25%.
Going from 60 to 90: +50%. Going from 90 to 60: -33%.
Going from 75 to 150: +100%. Going from 150 to 75: -50%.
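Farinorco's examples above can be checked with a couple of lines of plain Python (a minimal sketch; the `percent_change` helper name is just for illustration):

```python
def percent_change(old: float, new: float) -> float:
    """Percent change measured relative to the ORIGINAL (old) value."""
    return (new - old) / old * 100

# The examples from the post above:
print(round(percent_change(75, 100)))   # 33  (not 25)
print(round(percent_change(100, 75)))   # -25
print(round(percent_change(60, 90)))    # 50
print(round(percent_change(75, 150)))   # 100
```

The asymmetry (+33% one way, -25% the other) is exactly why the denominator matters: it is always the value you started from.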

bigKr33
09-24-2009, 09:00 AM
Ok now I get it. Sorry didn't mean to have you go through all that.

Otis11
09-24-2009, 09:00 AM
Percent change is (new - old)/old, so if it went from 75 to 100, that's (100-75)/75, or a 33% increase.

Now if it went from 100 to 75 for some odd reason, you get (75-100)/100, or a 25% decrease.

Not that confusing, just math really. But that's why you can get stats to say anything: you can say the new one is 33% faster, or the old one is 25% slower, and (incorrectly) infer that the new one must be 25% faster.

Just another reason why there are only "Lies, Damn Lies and Statistics." (Mark Twain)

Now don't kill me. :sofa:



Edit: anyone mind telling me which side I just helped? I just do the math... :rofl:

Farinorco
09-24-2009, 09:11 AM
Ok now I get it. Sorry didn't mean to have you go through all that.

Don't worry, I don't mind it at all ;)

If you want a quick formula for how much a new quantity is as a percentage of the original (old) value, you simply do:

new value / old value * 100.

Then if you get a value >100, the quantity has increased by the difference, and if you get a value <100, it has decreased by the difference (for example, from 75 to 100 it's 100/75*100 = 133% of the initial value, so a 133-100 = 33% increase, and from 100 to 75 it's 75/100*100 = 75% of the initial value, so a 100-75 = 25% decrease).

You can do what Otis11 says too; in the end, it's exactly the same thing.


Edit: anyone mind telling me which side I just helped? I just do the math... :rofl:

You have helped the CF to scale a little better :lol:

Jamesrt2004
09-25-2009, 07:35 AM
I feel this thread's off topic lol.. :)

(dunno how I got my percentages; I always change how I work stuff out, cos I'll do it my way, then see how someone did theirs, then mix them and get confused :ROTF:)

Anyway, GT300... do you believe that the yields are "fine", or do you think they're BSing a little bit? Personally, if they'd said yields were OK or satisfactory I could believe it; I just don't think "fine" is good... meh, your thoughts, people?

(to try to bring this back on track)

justin.kerr
09-25-2009, 07:53 AM
Looking at Anandtech's review, I see that World of Warcraft scaled pretty well? I have never played it, but I always hear that SLI does not scale in it. Is this a new phenomenon, or am I just behind the times?

bigKr33
09-25-2009, 08:52 AM
Looking at Anandtech's review, I see that World of Warcraft scaled pretty well? I have never played it, but I always hear that SLI does not scale in it. Is this a new phenomenon, or am I just behind the times?

No, I would say Nvidia's SLI does scale a bit better. Then again, it can be hit or miss; some games scale better than others. In general though, Nvidia does scale very well (two-GPU SLI).

justin.kerr
09-25-2009, 09:08 AM
Sorry, I was not clear: just that particular game is supposed to scale poorly, and it looks like it scaled very well in the Anandtech article.

jaredpace
09-27-2009, 04:02 PM
Yeah, you totally lost me. 75 out of 100 leaves 25 left. I really have no idea how you get 33%. That would be in 1/3's :wth:


Think of it like this: one 5870 gets you 100 FPS; two 5870s get you 159.2 FPS.

It's a 59.2% increase (which is Anandtech's average btw :p:)


Ok now I get it.
okay didn't read down far enough
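jaredpace's 100 FPS vs 159.2 FPS example generalizes to how review sites arrive at an average scaling figure across a suite of games. A minimal sketch with made-up fps numbers (not real benchmark data):

```python
# Hypothetical per-game fps: (single card, CrossFire pair).
results = {
    "Game A": (60.0, 110.0),
    "Game B": (45.0, 72.0),
    "Game C": (80.0, 130.0),
}

# Percent gain per game, relative to the single-card result.
gains = {name: (cf - single) / single * 100
         for name, (single, cf) in results.items()}

for name, gain in gains.items():
    print(f"{name}: +{gain:.1f}%")

avg = sum(gains.values()) / len(gains)
print(f"average scaling: +{avg:.1f}%")
```

Each game's gain is computed against its own single-card baseline first, and only then averaged, which is why one badly scaling title (like the Dawn of War II case mentioned earlier) can drag the headline number down.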

Chumbucket843
09-27-2009, 04:22 PM
Think of it like this: One 5870 gets you 100 FPS. Two 5870's gets you 159.2 FPS.

It's a 59.2% increase (which is anandtech's average btw:p:)


okay didn't read down far enough

It's either PCI-E bandwidth or drivers; a 4870X2 scales to around 1.7x in most games.

Jamesrt2004
09-28-2009, 07:55 AM
It's either PCI-E bandwidth or drivers; a 4870X2 scales to around 1.7x in most games.

Drivers. PCI-E bandwidth is fine atm; probably the last time it will be for PCI-E gen1 though, and you'd need both cards at x16 as well =(

***Deimos***
09-28-2009, 05:22 PM
Drivers. PCI-E bandwidth is fine atm; probably the last time it will be for PCI-E gen1 though, and you'd need both cards at x16 as well =(

Why is 5870 CF scaling "poor" compared to the 4870?
A single 5870 already makes 2560x1600 4AA playable in most games. Diminishing returns as you hit CPU bottlenecks, PCIe bottlenecks, game and driver algorithm scaling and overhead, etc.
Besides, the 5870 is new, probably using a separate code path in the drivers which may not include all optimizations.

Why is the 4870X2 faster than the 5870?
Both the 4870X2 and the 5870 have the exact same number of shaders, ROPs and texture units. Personally I think it's mostly a 5870 bandwidth limitation. I've got an Excel spreadsheet with all the scores, and the 5870 loses by 22%-42% in RE5, Batman and ESPECIALLY HAWX.

5870 vs GT300
GT200 already made the switch to 32 ROPs and 80 texture units, and consequently a die size >500mm^2. So GT300 doesn't need to increase those, just the shaders and their arrangement/scheduling, and shader execution units don't use much space. AMD went from 320 SPs in RV670 to 800 in RV770, both on 55nm. Thus nVidia can easily go to 1000+ SPs on 40nm.

It's also doubtful that, after 384-bit and 512-bit, nVidia would go back to a 256-bit memory interface. If they did, it would be impossible to beat AMD, since both would be limited by the same GDDR5 chips. GT300 is probably 384-bit.

Finally, the three big problems:
1. nVidia has never done tessellation, and only recently DX10.1. Although it will almost certainly be faster than the 5870, it **could** be like the FX5800 in DX11. Gamers may be disappointed by late DX11 drivers and poor performance (remember nVidia was late with DX10 drivers).

2. nVidia is still pushing PhysX. What will happen with DX11? Abandon PhysX? Support both DX11 and PhysX, with higher QA cost?

3. No word on multi-display. There is no reason to buy GT300 to run 1 monitor; the GTX295 already runs 2560x1600 (with AA!!) in virtually every game. If AMD gets Eyefinity working well with CF, it will effectively open up higher non-CPU-bound resolutions.

EDIT: IF GT300 is bigger and hotter than the 5870, how will it stay within the PCIe spec? It's max 225W for 6+6 pin, or 300W for 6+8 pin.
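The PCIe power limits Deimos cites fall out of the per-source budgets in the PCIe spec: up to 75W from the x16 slot, 75W per 6-pin connector, and 150W per 8-pin connector. A tiny sketch of that arithmetic (the helper name is just for illustration):

```python
SLOT_W = 75   # PCIe x16 slot can deliver up to 75W
PIN6_W = 75   # each 6-pin PEG connector adds 75W
PIN8_W = 150  # each 8-pin PEG connector adds 150W

def board_power_limit(six_pin: int = 0, eight_pin: int = 0) -> int:
    """Max board power under the PCIe spec for a given connector mix."""
    return SLOT_W + six_pin * PIN6_W + eight_pin * PIN8_W

print(board_power_limit(six_pin=2))               # 6+6 pin: 225W
print(board_power_limit(six_pin=1, eight_pin=1))  # 6+8 pin: 300W
```

This is also why the later "dual 8-pin" speculation in the thread matters: 8+8 pin would budget 375W, outside the 300W ceiling the spec defines for a single card.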

***Deimos***
09-28-2009, 05:30 PM
IMHO,

if GT300 is anything like GT200 (576mm^2) or GT200b (470mm^2),
nVidia will get pwned on yields and profit margins. Expect $699 MSRP or higher.

No matter how "cool" it is for it to be the fastest single GPU, 95% of folks will choose the much cheaper AMD card.

WeeMaan
09-29-2009, 03:00 AM
http://forum.beyond3d.com/showpost.php?p=1341596&postcount=2552

Nice, more than 3 billion transistors.

Helloworld_98
09-29-2009, 03:38 AM
http://forum.beyond3d.com/showpost.php?p=1341596&postcount=2552

Nice, more than 3 billion transistors.

It might help if the image were a decent size.

And this is going to use a lot of power if it has 3 billion transistors; if they do make a dual GT300 card, it will be dual 8-pin.

***Deimos***
09-29-2009, 11:53 AM
Just in case amnesia has gripped you:
at launch 3 years ago in Nov 2006, the 8800GTX, the first DX10 card, standing in at 680mil transistors on 90nm (just like the 7900GTX), was $599 MSRP and a worrisome 11" long.

It was first to support a separate clock for shaders, CUDA, and the first >256-bit (384) memory, with a whopping 768MB at 80GB/s bandwidth.

In contrast, the 5870 looks like a small evolutionary step (i.e. CUDA launched 3 years ago... DX11 Compute... yawn). GT300 will need some unprecedented, crazy, jaw-dropping feature to catch everybody's attention like the 8800GTX did. And if you recall, the 8800GTX was THE ONLY card that could really run Crysis (even better in some situations than the 9800GTX, 2 years later).

Chumbucket843
09-29-2009, 12:59 PM
It might help if the image were a decent size.

And this is going to use a lot of power if it has 3 billion transistors; if they do make a dual GT300 card, it will be dual 8-pin.

The power envelope is said to be under 225 watts, and a 295 doesn't even have dual 8-pin and it has around a 290W TDP. It really depends on clock speed; if a lot of those transistors are in the memory bus, then it will run a lot cooler.

***Deimos***
09-29-2009, 05:44 PM
The power envelope is said to be under 225 watts, and a 295 doesn't even have dual 8-pin and it has around a 290W TDP. It really depends on clock speed; if a lot of those transistors are in the memory bus, then it will run a lot cooler.

U saying GT300 or GT300x2 under 225W?

I always thought the codename GT300 = 300W = 6 + 8 pin. Makes sense no?
[perhaps GT300 = $300 :rofl::ROTF:]

AN00BIS
09-30-2009, 07:06 AM
I just found this article on Brightsideofnews.com; enjoy:
Just like we disclosed in the first article "nVidia GT300 specifications revealed – it's a cGPU!", the nVidia GT300 chip is a computational beast like you have never seen before. In fact, we would go as far as to state that this is as close as a GPU can be to a CPU in the whole history of graphics technology. Now, time will tell whether GT300 was the much-needed revolution.

Beside the regular NV70 and GT300 codenames [codenames for the GPU], nVidia's insiders called the GPU architecture Fermi. Enrico Fermi was an Italian physicist who is credited with the invention of the nuclear reactor. That brings us to one of the codenames we heard for one of the GT300 boards itself - "reactor".
When it comes to boards themselves, you can expect to see configurations with 1.5, 3.0 GB and 6GB of GDDR5 memory, but more on that a little bit later.

GPU specifications
This is the meat you always want to read first. So, here's how it goes:
3.0 billion transistors
40nm TSMC
384-bit memory interface
512 shader cores [renamed into CUDA Cores]
32 CUDA cores per Shader Cluster
1MB L1 cache memory [divided into 16KB Cache - Shared Memory]
768KB L2 unified cache memory
Up to 6GB GDDR5 memory
Half Speed IEEE 754 Double Precision
:shocked::shocked::shocked::shocked::shocked:


As you can read for yourself, GT300 packs three billion transistors of silicon real estate, with 16 Streaming Multiprocessors [the new name for the former Shader Cluster] in a single chip. Each of these sixteen multiprocessors packs 32 cores; we already disclosed future plans for this cluster in terms of future applications. What makes a single unit important is the fact that it can execute an integer or a floating point instruction per clock per thread.

TSMC was in charge of manufacturing the three billion transistor mammoth, but it didn't stop there. Just like the G80 chip, nVidia's GT300 packs six 64-bit memory controllers for a grand total of 384 bits, bringing back the odd memory capacity numbers. The memory controller is a native GDDR5 controller, which means it can take advantage of the built-in ECC features inside GDDR5 SDRAM and, more importantly, GT300 can drive GDDR5 memory in the same manner as AMD does with its really good Radeon HD 5800 series. The additional two memory interfaces will have to wait until the 28nm or 22nm full-node shrinks, if we get to them with an essentially unchanged architecture. You can expect the lower-end variants of the GT300 architecture to pack a less dense memory controller for more cost efficiency, especially on the memory side.

GPGPU is dead, cGPU lives!
Just like we reported earlier, GT300 changes the way the GPU functions. If we compare it to the old GT200 architecture, the comparisons are breathtaking. The Fermi architecture operates at 512 Fused Multiply-Add [FMA] operations per clock in single precision mode, or 256 FMA per clock if you're doing double precision.
The interesting bit is the type of IEEE formats. In the past, nVidia supported IEEE 754-1985 floating point arithmetic, but with GT300, nVidia now supports the latest IEEE 754-2008 floating-point standard. Just as expected, GT300 chips will do all industry standards - allegedly with no tricks.

A GPU supports C++ natively?
The Fermi architecture natively supports C [CUDA], C++, DirectCompute, DirectX 11, Fortran, OpenCL, OpenGL 3.1 and OpenGL 3.2. Yes, you've read that correctly - Fermi comes with support for native execution of C++. For the first time in history, a GPU can run C++ code with no major issues or performance penalties, and when you add Fortran and C to that, it is easy to see that, GPGPU-wise, nVidia did a huge job.

Implementing an ISA inside the GPU took a lot of bravery, and with the GT200 project over and done with, the time was right to launch a chip as flexible as developers wanted, yet affordable.

In a nutshell, this is just baseline information about what nVidia is going to introduce in the next couple of weeks. Without any doubt, we can see that nVidia reacted to Larrabee by introducing a part that is extremely efficient, natively supports key industry standards and, more importantly, doesn't cost an arm and a leg.

The line-up consists of a high-end consumer part [GeForce], a commercial part [Quadro] and a scientific part [Tesla]. You can expect memory sizes from 1.5GB for the consumer GeForce 380 to 6GB for the commercial Quadro and Tesla parts.
:D:D:D:D:D:clap::clap::clap:

Otis11
09-30-2009, 10:32 AM
^^^ :clap:


U saying GT300 or GT300x2 under 225W?

I always thought the codename GT300 = 300W = 6 + 8 pin. Makes sense no?
[perhaps GT300 = $300 :rofl::ROTF:]


We can only hope... :rofl:

Helloworld_98
09-30-2009, 11:06 AM
U saying GT300 or GT300x2 under 225W?

I always thought the codename GT300 = 300W = 6 + 8 pin. Makes sense no?
[perhaps GT300 = $300k :rofl::ROTF:]

fixed :rofl:

Chumbucket843
09-30-2009, 12:44 PM
Implementing an ISA inside the GPU took a lot of bravery, and with the GT200 project over and done with, the time was right to launch a chip as flexible as developers wanted, yet affordable.

OMFG.

It looks like the sales from G80 have paid off in full. Way to go, nVidia. All you have to do now is not kill my wallet.

BenchZowner
09-30-2009, 12:46 PM
Official whoops... (http://www.nvidia.com/content/PDF/fermi_white_papers/NVIDIAFermiArchitectureWhitepaper.pdf)

cowie
09-30-2009, 02:03 PM
i heard it comes with an operating system and power supply
http://www.nvidia.com/object/fermi_architecture.html

Dezmen
10-03-2009, 12:45 PM
Here are prices for new Fermi GT300 GPUs:

Fermi GX2 - $649
Fermi GTX - $399
Fermi GT - $299

Those prices are confirmed, so they're retail prices :P I really hope they hold; I'll get the GX2 ^^

bigKr33
10-03-2009, 04:12 PM
Do you have a source for those prices?

komadyret
10-03-2009, 04:23 PM
Anybody catch this: http://www.semiaccurate.com/2009/10/01/nvidia-fakes-fermi-boards-gtc/

Charlie finally got some good points, and it gave me a good laugh at nvidia's expense reading this.


In a really pathetic display, Nvidia actually faked the introduction of its latest video card, because it simply doesn't have boards to show. Why? Because it didn't get enough parts to properly bring them up, much less make demo boards. Why do we say they are faked? If you look at the pictures, it is painfully obvious that Fermi cards don't exist. Well, painful if you happen to be Dear Leader who waved fakes around and hopes to get away with it, but hilarious if you are anyone not working at Nvidia.

Dezmen
10-03-2009, 04:32 PM
small part source http://www.fudzilla.com/content/view/15795/1/
Forgot where i saw prices <_<

Otis11
10-03-2009, 05:09 PM
Anybody catch this: http://www.semiaccurate.com/2009/10/01/nvidia-fakes-fermi-boards-gtc/

Charlie finally got some good points, and it gave me a good laugh on nvidias expence reading this.

If that's true, that's very sad... Just read the article, and I've got to say, if those pics are real, NVIDIA has a problem... :shakes:

***Deimos***
10-03-2009, 07:11 PM
Here are prices for new Fermi GT300 GPUs:

Fermi GX2 - $649
Fermi GTX - $399
Fermi GT - $299

Those prices are confirmed, so they're retail prices :P I really hope they hold; I'll get the GX2 ^^

There is absolutely no way in hell that nVidia, which has lost a lot of pride/ego/money/margins on GT200, would price their pinnacle of engineering at a crazy-low $399. And when is the last time you saw nVidia launch a "GX2" anywhere soon after a GTX? nEvEr.
- all the high-end cards for 5+ years, even from ATI, have been $550-650. This includes the X800XT PE, X1900XTX, 8800GTX, the 8800 Ultra which was $830 MSRP, and of course the GTX280 too.
- the only breaks in this trend are the 4870, and now the 5870. The cheapest high-end launches since GeForce 1 days.

DilTech
10-03-2009, 07:56 PM
Deimos...NVidia admitted they priced the GT200 too high and said it's a mistake they will never make again... Those prices fit in line with that statement.

Otis11
10-03-2009, 08:20 PM
Deimos...NVidia admitted they priced the GT200 too high and said it's a mistake they will never make again... Those prices fit in line with that statement.

I sure hope you're right! :up:

And I hope the reason isn't poor performance. :shakes:

Helloworld_98
10-04-2009, 12:53 AM
Deimos...NVidia admitted they priced the GT200 too high and said it's a mistake they will never make again... Those prices fit in line with that statement.

Even so, Fud's statement is way off; getting a dual GT300 under the 300W barrier would require dropping a load of shaders.

Dezmen
10-04-2009, 04:53 AM
Nvidia is not so stupid; they understand pretty well that their prices were too high. So they will price lower to kill ATI xD I think the prices are real, or at least close to what they will be.

Chumbucket843
10-04-2009, 08:26 AM
Even so, Fud's statement is way off; getting a dual GT300 under the 300W barrier would require dropping a load of shaders.

just like the gtx295...

Chumbucket843
10-04-2009, 08:36 AM
There is absolutely no way in hell that nVidia, which has lost a lot of pride/ego/money/margins on GT200, would price their pinnacle of engineering at a crazy-low $399. And when is the last time you saw nVidia launch a "GX2" anywhere soon after a GTX? nEvEr.
- all the high-end cards for 5+ years, even from ATI, have been $550-650. This includes the X800XT PE, X1900XTX, 8800GTX, the 8800 Ultra which was $830 MSRP, and of course the GTX280 too.
- the only breaks in this trend are the 4870, and now the 5870. The cheapest high-end launches since GeForce 1 days.
Are you aware there's a recession? It's hard to make a profit during one. You must have misread his post; their "pinnacle of engineering" is not the GTX360 either. Prices of next-gen cards at launch are usually relative to the speed of the last gen or of competitors.

***Deimos***
10-04-2009, 11:04 AM
Deimos...NVidia admitted they priced the GT200 too high and said it's a mistake they will never make again... Those prices fit in line with that statement.

See if you can find a pattern here (you can win a cookie!)?
High-end, Date, MSRP/Street at launch
GeForce Oct99 $300
GeForce2 Apr00 $350
GeForce3 Feb01 $375
GeForce4 Feb02 $320-$450
GeForceFX Jan03 $400
6800 Apr04 $540
7800 June05 $600
8800GTX Nov06 $600 (Ultra debut at $830)
GTX280 June08 $650

ATI/AMD trend?
9700pro - $399.
X800XTPE - $549.
X1800XT - $549.

The HD5870 was **only** $379 .. the cheapest high-end launch, 2nd only to the $299 4870.

Wow, we're incredibly spoiled.

High-end is NEVER meant to be affordable; it's there to set the level for the mainstream. If GT300 were $379, is nVidia supposed to "give away" the GTX285 for $150?

Chumbucket843
10-04-2009, 11:17 AM
The HD5870 was **only** $379 .. the cheapest high-end launch, 2nd only to the $299 4870.

Wow, we're incredibly spoiled.


http://img58.imageshack.us/img58/9288/expensiveati2.th.jpg (http://img58.imageshack.us/i/expensiveati2.jpg/)

Crow-
10-04-2009, 05:13 PM
http://img58.imageshack.us/img58/9288/expensiveati2.th.jpg (http://img58.imageshack.us/i/expensiveati2.jpg/)

$638 ?!?! :eek:

OMG :shakes: :rofl: :ROTF:

I'm thinking how much will they ask for the 5870x2 :rolleyes:

Dezmen
10-04-2009, 10:34 PM
That's scamming <_< Newegg sells the 5870 for $379 :P and the 5850 for $289 :P

Jamesrt2004
10-05-2009, 01:41 AM
wonder if that site sold any lmao

Crow-
10-05-2009, 02:23 AM
wonder if that site sold any lmao

There is always someone stupid enough to buy that :ROTF:

owcraftsman
10-05-2009, 01:30 PM
Everywhere I've looked, the 5850/70 is out of stock, or, like Amazon says, "This item has not yet been released." but available for pre-order. The egg says "Auto Notify"; we all know what that means. Not that I would want one, but ATI seems to be cutting off its nose to spite its face. Nvidia has been no better in the past at having high volumes of stock at release, either. Even if you are willing to pay whatever the price ends up being, you will be lucky to get one of the Fermis by year's end, I suspect.

Jamesrt2004
10-05-2009, 05:55 PM
Everywhere I've looked, the 5850/70 is out of stock, or, like Amazon says, "This item has not yet been released." but available for pre-order. The egg says "Auto Notify"; we all know what that means. Not that I would want one, but ATI seems to be cutting off its nose to spite its face. Nvidia has been no better in the past at having high volumes of stock at release, either. Even if you are willing to pay whatever the price ends up being, you will be lucky to get one of the Fermis by year's end, I suspect.

I guess the UK's kinda lucky; there's consistently some in stock somewhere, which is good (I guess having 5/6 "BIG" hardware sites pays off here compared to one or 2 ginormous ones :) ...

Anyway, I agree: if everything goes completely perfectly there will be a handful around the end of November, but nothing good stock-wise till the start of the year even if everything goes well.

owcraftsman
10-05-2009, 06:54 PM
Well, as soon as you say it, somebody proves you wrong: I have a forum bud who just got a 5850 from ZZF USA.

railmeat
10-05-2009, 07:15 PM
That's scamming <_< Newegg sells the 5870 for $379 :P and the 5850 for $289 :P


Agree... don't those toolbags know newegg is on planet earth... :rolleyes:

The 5870X2 is the only way to go as far as ATI... patience... GT300 and Larrabee are coming.

Otis11
10-05-2009, 07:48 PM
Yeah, hope the 300 is good (and cheap! I know it's not gonna happen, but still hoping).

Anyone heard anything on Larrabee? It's been kinda quiet...

owcraftsman
10-05-2009, 07:50 PM
Agree... don't those toolbags know newegg is on planet earth... :rolleyes:

The 5870X2 is the only way to go as far as ATI... patience... GT300 and Larrabee are coming.

You think that's good? I saw an E6600 selling for 450 USD just the other day; google it :ROTF:

ajaidev
10-06-2009, 08:55 AM
Did you people read this:

http://www.brightsideofnews.com/news/2009/10/5/gt300-demo-card-was-a-dummy3b-does-that-really-matter.aspx

"So we hear that the Tesla card shown at GTC by NVIDIA was a dummy; no GPU, no memory, just a PCB and a cooling solution on it. This information was "leaked" from a source inside AMD and sent around the internet until it found a home with someone willing to write about it as a sign that the GT300 is in trouble and that NVIDIA was/is lying about it."

Huh, people on XS and several other forums noticed that the card was a fake; it's really odd that BSN would point the finger at AMD. Another thing: the pics of a man with a 5870 X2 in front of a Ruby cardboard cutout seemed quite real IMO, but I don't know, could be a fake...

Dezmen
10-06-2009, 10:29 AM
Did you people read this:

http://www.brightsideofnews.com/news/2009/10/5/gt300-demo-card-was-a-dummy3b-does-that-really-matter.aspx

"So we hear that the Tesla card shown at GTC by NVIDIA was a dummy; no GPU, no memory, just a PCB and a cooling solution on it. This information was "leaked" from a source inside AMD and sent around the internet until it found a home with someone willing to write about it as a sign that the GT300 is in trouble and that NVIDIA was/is lying about it."

Huh, people on XS and several other forums noticed that the card was a fake; it's really odd that BSN would point the finger at AMD. Another thing: the pics of a man with a 5870 X2 in front of a Ruby cardboard cutout seemed quite real IMO, but I don't know, could be a fake...



lol! Everyone knows that already; this was posted after Nvidia showed the card... But Fermi will have the same size and design.

P.S. http://www.newegg.com/Product/Product.aspx?Item=N82E16814125299 $379, it's always in stock, you can buy one at any time :P

owcraftsman
10-06-2009, 04:02 PM
When I start seeing reviews (actual cards in hand) we'll talk but I'm sure it's not the first fake.

LordEC911
10-07-2009, 12:19 AM
Did you people read this:

http://www.brightsideofnews.com/news/2009/10/5/gt300-demo-card-was-a-dummy3b-does-that-really-matter.aspx

"So we hear that the Tesla card shown at GTC by NVIDIA was a dummy; no GPU, no memory, just a PCB and a cooling solution on it. This information was "leaked" from a source inside AMD and sent around the internet until it found a home with someone willing to write about it as a sign that the GT300 is in trouble and that NVIDIA was/is lying about it."

Huh, people on XS and several other forums noticed that the card was a fake; it's really odd that BSN would point the finger at AMD. Another thing: the pics of a man with a 5870 X2 in front of a Ruby cardboard cutout seemed quite real IMO, but I don't know, could be a fake...

I wonder when someone is going to write that the bolded part was sent out from Nvidia...:rofl:


lol! Everyone knows that already; this was posted after Nvidia showed the card... But Fermi will have the same size and design.

P.S. http://www.newegg.com/Product/Product.aspx?Item=N82E16814125299 $379, it's always in stock, you can buy one at any time :P
Unlikely... They don't know the size/design yet, since they don't have a product.
I'm sure they have several ideas/designs they are working on, but seeing as they couldn't show a working card, since they don't have one that isn't in a testbed/debug setup, there is obviously nothing set in stone.

owcraftsman
10-07-2009, 07:10 AM
Source (http://www.fudzilla.com/content/view/15813/65/)

Nvidia's excuse for not showing a GT300 Fermi-based board is that they want to hide some things from the red-green competition.

Since the latest GPU technology conference was all about how to make money on GPU computing, Nvidia didn't want to cloud this with a DirectX 11 demo, which would actually prove that Fermi GT300 board is real.

The second reason is that the board has a lot of wires sticking out, as well as test modules, and looks like a character from Terminator. Judging from what we've learned in the last few days, we are talking about a handful of boards, if not even fewer than that.

Source (http://www.fudzilla.com/content/view/15798/65/)

The real engineering sample card is full of dangling wires. Basically, it looks like an octopus of modules and wires and the top brass at Nvidia didn't want to give that thing to Jensen. This was confirmed today by a top Nvidia VP president class chap, but the same person did confirm that the card is real and that it does exist.

It appears they have little more than a benchtop engineer's dream ATM.
It's obvious all the hype is little more than a ploy to hurt 5870/50 sales.
I hope the experts are wrong, but we probably won't see Fermi until H1 2010.

safan80
10-12-2009, 08:14 PM
if there's no sign of the GT300 in November I'll probably get a 5870x2 4GB (2GB per gpu).

bigKr33
10-12-2009, 08:24 PM
I'm still surprised there's nothing on the GT300, besides the specs at least; I want to see some benchmarks and, most importantly, prices. The 5870X2, I'm sure, is around the corner, and still there is no word from nvidia.

To(V)bo Co(V)bo
10-12-2009, 09:46 PM
Heck, a big turd pile has a better smell than gt300 right now.

Jakethesnake011
10-12-2009, 10:21 PM
I think they will bring something good to the table. With the things I have been hearing about them, most being bad, they'd better bring something that can compete with the 58xx series; I don't feel performance will be a problem there, but I don't know about price.

bigKr33
10-12-2009, 11:16 PM
Yeah, very true; nvidia has been on top performance-wise since the 6 series. I have no doubt in my mind that still holds true with the GT300. However, it needs to be released to prove that :rolleyes:

Final8ty
10-12-2009, 11:53 PM
5870X2 listing

Either ATI has cracked shared mem, or it's a listing error, as that X2 would only have 512MB per GPU.

Final8ty
10-12-2009, 11:54 PM
if there's no sign of the GT300 in November I'll probably get a 5870x2 4GB (2GB per gpu).

Right up my street.

dan7777
10-13-2009, 12:18 AM
Either ATI has cracked shared mem, or it's a listing error, as that X2 would only have 512MB per GPU.

Isn't it 1GB per GPU?

Final8ty
10-13-2009, 01:03 AM
Isn't it 1GB per GPU?

It should be at least 1GB per GPU, but they always combine them, so it should be listed as a 2GB card and not a 1GB card.

To(V)bo Co(V)bo
10-13-2009, 08:23 PM
Video RAM has no real impact without true x64 game binaries. Most games can't even utilize more than 2 gigs of RAM on an x64 platform due to not being real x64 programs; if you run an x86 game on x64, it basically stays x86. Then there is the whole mirroring-video-memory-in-system-RAM thing. We are really far from seeing more than 1 gig used effectively; it's more or less just marketing.

Aussie FX
10-14-2009, 07:15 AM
http://img58.imageshack.us/img58/9288/expensiveati2.th.jpg (http://img58.imageshack.us/i/expensiveati2.jpg/)

LOL, I assume those would be Australian prices, and I know of two retailers that were selling at that price for the first week. They didn't sell many. Now they are ~$500 AUD, so the USD price is pretty much spot on at $375.

That screenshot would have been taken when the AUD was worth ~US$0.70.

To add a little relevance: when the 5870 launched here, GTX 295s were anywhere between $800 and $900 AUD.

In fact, GTX 295s are still $800 today.
http://www.austin.net.au/ProductList/ProductDetail/tabid/104/Default.aspx?ProductCode=GCGTX29517HWFF+

5870 @ $543, which is still very expensive, but there's a lot of price gouging going on here. In reality they should be somewhere between $400 and $450 going by current exchange rates.
http://www.mwave.com.au/newAU/mwaveAU/productdetail.asp?CartID=mAU@52T81U4CZUMDVCKTHSJM4 VTP2ZGVDHEMOXDELRO8JV9NJC&sku=42060169

The exchange rate here against the USD has zoomed up to near US$0.90 due to us not falling into recession, so in a week or two I would expect prices to settle at the correct levels.

But it all comes down to international currency exchange, so all the prices are relative.
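The arithmetic behind that $400–$450 estimate is just the US recommended price divided by the spot rate. A quick sketch, assuming the US$375 RRP and the ~0.90 USD-per-AUD rate quoted above (ignoring GST and shipping):

```python
def fair_aud_price(usd_rrp: float, usd_per_aud: float) -> float:
    """Convert a US recommended retail price into AUD at the spot rate."""
    return usd_rrp / usd_per_aud

# US$375 at ~0.90 USD per AUD comes out around A$417, inside the
# $400-450 range quoted above and well below the A$543 street price.
print(round(fair_aud_price(375, 0.90)))
```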

What does worry me is nVidia pulling all their high-end cards. AMD might have a recommended retail price of $375, but retailers in my country will charge whatever they can get. :shrug:
That's why my biggest nightmare would be AMD going bust.

Competition is necessary, and that's why fanboyism is insanity.

owcraftsman
10-16-2009, 04:08 AM
Has anybody seen this?

http://www.youtube.com/watch_popup?v=10xKBSEEvwQ#t=79

NaMcO
10-16-2009, 04:09 AM
I guess everyone has seen that video and it's complete bull:banana::banana::banana::banana: :down:

B-Shot
10-16-2009, 04:21 AM
Has anybody seen this?

http://www.youtube.com/watch_popup?v=10xKBSEEvwQ#t=79

WOW, I want to make graphs with a fake card too. I could even make up my own stats from dreams I've had. :rofl:
(This thread is starting to give me gas.)

rwurdig
10-16-2009, 07:26 AM
Any news on the GT300?

Sam_oslo
10-16-2009, 09:12 AM
Any news on the GT300?

Some info about the architecture is surfacing — for example, Inside Fermi: Nvidia's HPC Push (http://www.realworldtech.com/page.cfm?ArticleID=RWT093009110932&p=11)

Lestat
10-16-2009, 09:55 AM
I guess everyone has seen that video and it's complete bull:banana::banana::banana::banana: :down:

It's not B.S. when they use extremely Nvidia-biased benchmarks.
Nvidia has always been better than ATI in Far Cry 2 and Crysis, and typically in Vantage too once the CPU PhysX score is run on the GPU.

I don't think those numbers are completely out of the question — a little high, but not so far-fetched as to be impossible.

owcraftsman
10-16-2009, 09:56 AM
I guess everyone has seen that video and it's complete bull:banana::banana::banana::banana: :down:


Funny... the guy who posted it over at the EVGA forums swears it's not fake.