
View Full Version : Lost the magic smoke



penguins
03-20-2012, 04:25 PM
Medium-sized story short:

Beer bottle cap + back of video card = instant shutdown.
Gave it an overnight cool-off just in case. Unplugged it, plugged it back in... oh, it's fine, it just needed to chill.
sssssstttt SMOKE NoooooO!!!

So, GTX 570 dead.

I was going to upgrade monitors to a 30" with a nice huge desktop res,
but now it looks like it's time to upgrade my video card instead.

I've already got the EK 570/580 block, so logically it would make sense to keep it and go for a 580.
I'm limited to one card and I want to stay with Nvidia for compatibility reasons.
I know this isn't the video card section, but I know you'll know.
So which 580 would be good for high resolutions?
I figured the 1.5 GB 580 would do, but I don't want to go buy a card that isn't future-proof.

EVGA GTX 580 015-P3-1580-AR
EVGA GTX 580 3072MB GDDR5 (03G-P3-1584)

Those look like my two best options.
Anyway, thanks for your help everyone.

The price difference is really small, so never mind, I'll go 3 GB. Couldn't find where to delete this.

NKrader
03-20-2012, 05:38 PM
Oh no, try to catch it and put it back in!

I'd personally get the 3 GB just because.

JoeM
03-20-2012, 06:28 PM
Is the 3 GB card a reference design? In other words, will the waterblock still fit it?

penguins
03-20-2012, 06:50 PM
Yes it will, about to go buy it.

penguins
03-20-2012, 07:56 PM
Well, I just got back from Fry's. Stupid people. I asked a guy if the rebate was still good since the card was a return.
He said of course, I said thanks and started to walk away. "Can I write that up for you?" "No." "I get points for it." "I know."

I know I sound douchey not wanting this guy to get commission, but what did he really do for it? If I didn't have a box in my hands I could have asked the same question, and he wouldn't get commission for that.

To make things worse, I wanted to pay with a credit card. Who would have thought I'd have to stand there waiting for the poor cashier's supervisor (who was sitting on a turned-over trash bin having a conversation with two other people)?
It took five minutes for her to get off her lazy ass and about twenty "Super on 17" pages. The part that pissed me off the most was that she asked me how I was doing, then took less than ten seconds to type in her stupid code, all the while the cashier apologizing over and over.
I eventually had to tell him to stop apologizing, it's not his fault his super is just too lazy to stand up. He got a chuckle, and that was the only good experience I had during my wonderful outing to my local Fry's.


Sorry for the rage post, but seriously, OMG.

Utnorris
03-20-2012, 08:18 PM
I would have waited for the GTX 680s to hit next week, but if it's an EVGA you should have 90 days to decide to upgrade. Did you call EVGA and ask if it could be RMA'd? Probably a long shot, but you never know.

fr0wn3r
03-21-2012, 06:15 AM
Yeah man, I'm with Utnorris... I think you rushed it a bit. 3 GB cards are better for higher-resolution monitors, but the extra memory won't help you that much to get really smooth FPS on a 30" while playing games, for example. If it's for some other kind of work it will suffice; otherwise the 580 is just limited by its processing power...
I know you didn't want to change the waterblock, but the 680 is around the corner at only a slightly higher price than that 3 GB version of the 580, so try to do what Utnorris said.

Just some friendly advice. :)

I'll watch out for beer caps in the future; who would've imagined a situation like that. Gotta love life! :D

NKrader
03-21-2012, 10:05 AM
3 GB cards are better for higher-resolution monitors, but the extra memory won't help you that much to get really smooth FPS on a 30" while playing games

with current-gen games.

Although I suppose that worrying about tomorrow really doesn't matter that much to you bleeding-edge guys.

penguins
03-21-2012, 02:47 PM
Eh, the card was only 40 dollars more than the 1.5 GB, and I do want to keep the waterblock and just get the PC working.

Ugh... but seeing as the 680 will be cheaper and better? I'll just go to Fry's and return it. I was most likely going to anyway; when I got home and opened the box, the whole thing almost fell apart. Dammit, now I gotta sell my waterblock lol.

Thanks for the update y'all, I haven't really been paying attention to any news lately.

And yeah, who would have thought the cap would bounce off me and into the computer, eh :|

penguins
03-21-2012, 07:45 PM
Returned the card, mainly because I couldn't trust that it wasn't as bashed to hell as the box it came in.

I've also got a concern about things fitting. The EK 580 block was the smallest block of all the manufacturers when I got it.

If the 680 block ends up any taller than the 580 block, I won't even be able to fit it.

i.e. http://img233.imageshack.us/img233/406/onyx2435.jpg

safan80
03-21-2012, 08:05 PM
Well, if you plan on running a 30" monitor, keep in mind that at 2560x1600 Skyrim uses 2.4 GB of VRAM with the high-res texture pack.

fr0wn3r
03-22-2012, 02:14 AM
Yeah, I see what you mean. Really tight in there. Dunno what to say. So far the leaked pics show just an ordinary successor to the 580, but you can't really tell the difference with no side-by-side pics and no existing waterblocks...
Searched the web a bit for a price; according to rumors the MSRP should land around $499, cheaper than the Radeon 7970's price at the moment. Looks like Nvidia wants to start a price war, pressing AMD to lower their prices. Performance-wise it should end up better than, or at least as good as, AMD's current flagship. Specs for both, well, you can find them everywhere... The main differences would be a 256-bit bus for the new Nvidia vs 384-bit on AMD's side, and 2 GB RAM vs 3 GB RAM, but expect 4 GB versions of the 680 too...
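Back-of-the-envelope bandwidth math, just a rough sketch assuming the commonly quoted memory clocks (around 6.0 GHz effective GDDR5 on the 680 and 5.5 GHz on the 7970):

680: 256 bit / 8 = 32 bytes per transfer x 6.0 GT/s ≈ 192 GB/s
7970: 384 bit / 8 = 48 bytes per transfer x 5.5 GT/s ≈ 264 GB/s

So the faster memory makes up part of the narrower bus, but the 7970 should still have more raw bandwidth on paper.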

EDIT:

A few reviews are out...
http://www.xbitlabs.com/articles/graphics/display/nvidia-geforce-gtx-680.html
http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review

As you will see, Nvidia decided to name this chip the 680, even though its codename, GK104, marks it as the direct successor to the GTX 560 Ti's chip. The "real" 580 successor, if you follow the nomenclature strictly, is GK110, and it will probably be called the 685, if you bother with such things at all... The reason, as they say, is that this gives the best performance per watt.