Your posts are (in general) good reads, Deimos. Either full of information, full of teh lolz (not in a negative way!), or both
Notice any grammar or spelling mistakes? Feel free to correct me! Thanks
They've been EOL for quite a while now; try to find a GTX 275 in stores...
The only one left is the 285, for a ridiculous $375, which is close to the 5870. How much longer do we have to wait?!
Don't forget about the G92, which is still going strong in the low end but can't really compete with ATI on price anymore. I think the G92 will never go EOL... they'll just keep putting them out.
2560x1600 is not some mythical unicorn that no one runs on modern games. It's 100% a good resolution to test at, especially since it pushes the GPU and not the CPU. Hardly biased, since most people with these kinds of cards would be running it regardless. Battlefield: Bad Company 2, which I didn't have issues with on my 5870, was running pretty well (45-50fps most of the time) at 2560x1600 with 4x AA. A 20-25% boost on that from a 480 would put it into nicely playable territory (57-63fps). And it's a brand-new DX11 game we're talking about. Same thing with games like Warhammer Online (another current MMO), where it's close to being playable but dips and is a bit too low on average to really enjoy. Need For Speed: Shift, same story really... 45-50fps on the 5870 when not running into the issues from hitting other objects...
In short, if the 470 is 10% faster at those settings, then we can probably safely say a 480 would be 25% faster, and thus extremely attractive to high-end gamers. I disagree with you that 2560x1600 would be a rarity for people buying one or two $500 video cards. I'd think most people dropping that much cash on cards ($500-1k or so for a setup) would already have bought the $750-1000 monitor to really show them off, since monitors generally last several years, whereas a top-end card stays top-end for only 6-8 months.
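Just to spell that scaling math out (a rough back-of-the-envelope sketch; the linear 20-25% scaling is an assumption drawn from the figures above, not a benchmark):

Code:
# Rough FPS scaling sketch: assumes a GTX 480 lands 20-25% above a
# 5870's observed 45-50fps at 2560x1600 4xAA (linear scaling assumed).
base_low, base_high = 45, 50
for boost in (1.20, 1.25):
    print(f"{(boost - 1) * 100:.0f}% faster: "
          f"{base_low * boost:.0f}-{base_high * boost:.0f} fps")
# 20% faster: 54-60 fps
# 25% faster: 56-62 fps  (roughly the 57-63fps ballpark quoted above)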
I got my Dell 30" widescreen 3007WFP-HC 2560x1600 LCD for $750 shipped (refurb, pristine condition) with a 5-year warranty from them. You can find similar ones new for $1100-1200. What you describe is like buying a super-highend projector and then using it at 40" screen with a measly $100 home theatre in box setup: no one does it. They run 75-100" or more screens and get nice bookshelf or tower speakers to make the setup actually shine. There's little-to-no point in buying crossfire 5870's, a 5970, SLI 470's or 480's just to run them at 1920x1080 or 1680x1050, and I doubt most people do. It's overkill.
I can only think of 9, maybe 11 G92 products... (GT330 is probably that 40nm chip)
April 2010 - the all-new, super uber cool GTS450 (Iron Man 2 edition) - c'mon nVidia, no need to hide those PR slides, we ALL knew another G92 was coming.

Code:
The Life of Jeremy Bentham
Nov-07  8800 GT
Dec-07  8800 GTS 512MB
Jan-08  8800 GS
Mar-08  9800GX2
Apr-08  9800 GTX
May-08  9600 GSO
Jul-08  9800 GTX+
        9800 GT
Mar-09  GTS150
        GTS250
Feb-10  GT330?
We all had a good ride... now it's time to take old Yeller out back with the shotgun. Come on, old boy, don't stare at me with those big round puppy eyes - ugh, I can't do it - ok, ok, this is the last and final year, I promise
24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
1 GB OCZ Gold (='.'=) 240 2-2-2-5
Giga-byte NF3 (")_(") K8NSC-939
XFX 6800 16/6 NV5 @420/936, 1.33V
You're missing the overall point - which is what many people do when they are backed into the corner by a sound argument supported by facts.
E7200 @ 3.4 ; 7870 GHz 2 GB
Intel's atom is a terrible chip.
Well, if Deimos could clarify his original point, that would help. Are you talking about rebrands, or just different SKUs based off G92? There is a difference; all you have to do is look:
9800 GTX
GT330
G92 was a great architecture; nVidia just can't get enough of it. The consumers might have a different opinion.
G92 will never die ....
You *completely* missed my point...
I clearly said it isn't realistic to expect to run 2560x1600 fluently with a single video card. It is multi-GPU ground. I also said the majority of people who do shell out the cash on a display like this are likely going to do it justice with an adequate system (assuming it is being used for gaming purposes). Nowhere did I say 30" LCDs are rare, merely that expecting to tame games at 2560x1600 with reasonable IQ is unrealistic. A single 4870 didn't even last a year running stuff at 1920x1200, and I don't expect the 5870 to be any different (I already find its performance lacking in some games, even without anti-aliasing). I suppose it does depend on one's definition of playability, but in my eyes most things aren't playable at 2560x1600 with the currently available single-GPU cards (and before you comment, I've put a 5870, a 5970 and a GTX295 through their paces on a 30" display). The fact remains 30" LCDs are the ultra high end. I'm willing to bet more multi-GPU users run 22-24" 1080/1200 displays, for that matter...
Feedanator 7.0
CASE:R5|PSU:850G2|CPU:i7 6850K|MB:x99 Ultra|RAM:8x4 2666|GPU:980TI|SSD:BPX256/Evo500|SOUND:2i4/HS8
LCD:XB271HU|OS:Win10|INPUT:G900/K70 |HS/F:H115i
Well, Deimos and the red camp are sure having fun trolling this thread....
I'm not sure what some of you consider fluid and playable, but I find the majority of games I have very playable at 2560x1600 on a lowly 260, with 2-4x AA and details generally as high as they can be set, without issue.
It seems there is too much bench/graph comparing going on without enough actual hands-on time in the real world.
Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
3x2048 GSkill pi Black DDR3 1600, Quadro 600
PCPower & Cooling Silencer 750, CM Stacker 810
Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
3x4096 GSkill DDR3 1600, PNY 660ti
PCPower & Cooling Silencer 750, CM Stacker 830
AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
2x2gb Patriot DDR2 800, PowerColor 4850
Corsair VX450
GTX480 machine power consumption at full load is nearly 130W more than the HD5870 machine; full-load temperature is 92°C.
http://www.nordichardware.com/en/com...e-gtx-480.html
GTX480 and 5970 side by side
FX-8350(1249PGT) @ 4.7ghz 1.452v, Swiftech H220x
Asus Crosshair Formula 5 Am3+ bios v1703
G.skill Trident X (2x4gb) ~1200mhz @ 10-12-12-31-46-2T @ 1.66v
MSI 7950 TwinFrozr *1100/1500* Cat.14.9
OCZ ZX 850w psu
Lian-Li Lancool K62
Samsung 830 128g
2 x 1TB Samsung SpinpointF3, 2T Samsung
Win7 Home 64bit
My Rig
I think the pattern is fairly obvious. All except probably the GT330 use "G92 and/or G92b"... i.e., the same chip.
Is the 8800GS a rebrand? The 9800GX2? I guess you could say the GTS250 is a rebrand, but it has a different PCB, memory, heatsink, box!!, etc.
=============
G200 EOL a good thing?
Imagine you bought a GTX275 or, worse, a GTX295 yesterday. A week later, your "investment" resembles those GM stocks you thought would be a good idea.
============
To those arguing not to worry if nVidia loses money on the GTX470:
So how do they recoup several hundred million dollars of R&D expense?
Uber super cool and brand-new DX10 G92 products?
Those high-margin, non-existent DX11 cards that haven't launched?
Perhaps the millions of Tesla products you see every teen in school buying?
What is my point?
The GTX 280 launched with an MSRP of $650. Because of AMD's 4870, it was reduced a week later to $500. Let's pretend the board has a fixed cost of $300. That's a difference of $350 vs. $200 profit per card... almost half as much.
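To make that margin math concrete (the $300 fixed board cost is just the hypothetical from above, not a real BOM number):

Code:
# Hypothetical GTX 280 margin squeeze; the $300 board cost is the
# made-up fixed cost from the post, not an actual BOM figure.
board_cost = 300
for msrp in (650, 500):
    print(f"MSRP ${msrp}: ${msrp - board_cost} gross profit per card")
# MSRP $650: $350 gross profit per card
# MSRP $500: $200 gross profit per card  -> nearly halved in a week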
Fast-forward two years, and nVidia still hasn't learned. Boards and chips are bigger, hotter and more expensive to make.
Everything else being equal, consumers would probably buy:
- cooler/lower-power product
- higher performance
- nVidia, if it still represents a great brand-name image - questionable?
So you can't just say "10% higher performance, so we'll make the MSRP 10% higher" - you need to factor in the cons of power/heat/availability. Things would be rosy if a 384-bit board, 50% more DRAM chips, and a 50% bigger die with low yields only added 10% to costs. Very likely a lot, lot more.
With zero competition, AMD has sold millions of DX11 products and made mucho $$$. nVidia is at $0.00. And people who already own a DX11 card are unlikely to "upgrade".
So, just like AMD with Phenom(1), how do you sell the product?
- Lower price? -> temporary gain in market share at the expense of profits.
- Lower power? -> trouble enough as it is getting working dies.
- Higher clocks? -> power is already very high.
- Lower costs? -> would you buy a more expensive 768MB card? What if "some assembly required"?
- PhysX marketing? -> trouble is, it's nothing new, since all GF8/9/200 cards have it.
- DX11 promotion? -> sucks that you spent the last 6 months saying it's not a big deal.
- CUDA GPGPU apps? -> it's a nice bonus, but people don't buy $500 video cards to accelerate their Adobe PDF Reader.
- Eyefinity? -> Don't got none.
The desperate "solution" is the same as it was in the FX era... it's the only thing you can really change:
questionable driver optimizations at the expense of rendering quality and driver stability.
Of course, like with G80, about a year later they're gonna have a 32/28nm 448SP 256-bit 1GB GDDR5 board. They're gonna call it the "480GT".
24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
1 GB OCZ Gold (='.'=) 240 2-2-2-5
Giga-byte NF3 (")_(") K8NSC-939
XFX 6800 16/6 NV5 @420/936, 1.33V
So that metal part on top IS part of the heatsink... interesting.
Gotta give kudos to the team that designed that thing; they certainly were creative with shoving as much heatsink onto the PCB as possible!
INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"
Has anyone really been far even as decided to use even go want to do look more like?
How is this different from ATI compared to their previous gen? Their boards and chips are bigger, hotter and of course more expensive to make as well.
As far as DX11 goes, sure, but the inventory shortages at etailers are a clear indicator that nVidia have managed to sell off inventory easily, even in light of ATI's DX11 and performance advantage.
Quote: "With zero competition, AMD has sold millions of DX11 products and made mucho $$$. nVidia is at $0.00. And people who already own a DX11 card are unlikely to 'upgrade'."
I think you're blowing the desperation of the market out of proportion; graphics cards don't sell to the masses like iPhones or something. Just because DX11 or newer, faster hardware is released doesn't mean the majority of users run out and buy new hardware; not everybody lives on the bleeding edge of tech.
And if you're worried that the card you bought today may be obsolete a month later when a new model comes out, well, you need a new hobby. We all know there is new hardware always around the corner, and if you don't time your purchase early in the cycle, your hardware will be superseded sooner - possibly the same day you make your purchase.
Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
3x2048 GSkill pi Black DDR3 1600, Quadro 600
PCPower & Cooling Silencer 750, CM Stacker 810
Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
3x4096 GSkill DDR3 1600, PNY 660ti
PCPower & Cooling Silencer 750, CM Stacker 830
AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
2x2gb Patriot DDR2 800, PowerColor 4850
Corsair VX450