Ah, this is truly it:
Do you want a new boat, or what's in the box? OMG, the box could be anything, it could even be a boat!
I'll take the box!
We can all wait until mid-November to make decisions.
http://www.theinquirer.net/gb/inquir...eal-masterhood
It talks about the 8800GT slides. Not sure if this was posted already. Nvidia lying? NOOOOOOOO! :D
I like this one. Quote:
Originally Posted by The INQUIRER
Yeah, gotta love the Inq on this one. I always find it humorous when Fud and the Inq are telling complete opposites. It's like a technology soap opera.
It made me giggle, even more so given how confident he seems about a part he hasn't even seen yet. If it still has the same issue with AA, then NVidia are 100% right, and the 8800GT will remain best in class. ;)
As for Q1, I'm hoping for ATi there, but I'm not holding my breath. NVidia are getting what is essentially an extra 3-4 months to work on their next high-end chip, seeing as they apparently won't need to launch it this year. 3-4 months is quite a bit of tweak time, especially considering it appears NVidia will just be going up against a dual-die RV670 anyway.
For sure, the person who wrote that article likes ATI more and wrote it with the slant of "NVIDIA's playing tricks on us and being unfair, yadda yadda". I don't see the big deal here; it's just called marketing. Every manufacturer does this: make their own company look as good as possible and the competitor look as weak as possible. It's only marketing, damnit. I'd be doing all these tricks myself if I were NVIDIA's CEO or whatever, so I have no problem with it; I see it as only natural.
That article made me giggle at times though; gotta love the Inq-istic tone.
What about slide 20? If Nvidia really said ATI can't play HD DVD at 2560x1600 but Charlie has it playing on his rig then Nvidia is lying. :rolleyes:
He also says the HD3870 beats the 8800GT in most cases. hmmm, I can't wait for this slugfest. :D
Oh, and these are strong words too.
Come on, slugfest! :lsfight: Quote:
As we have pointed out, the RV670 will be out and will match it for price performance, and according to all the leaked numbers I have seen, beat it on most counts.
EVGA Cards:
EVGA 512-P3-N801-AR - 600/1800MHz (default)
EVGA 512-P3-N805-A1 - 675/1950MHz (OC)
EVGA 512-P3-N806-A1 - 700/2000MHz (OC)
I'm going to try to get the last one of course :rolleyes:
Roll on Monday!
So since the GT will be better than the GTS according to the previews, does that mean the 8800GTS will drop in price? Lower than the GT? :rofl:
The GTS is being revamped as well, infinity. Supposedly it'll be a G92, but with 128 shaders.
DX10.1 is more for developers than end users. It mainly gives the developer more control over the GPU via the API (games will have control over how much AA is applied to each edge, hence the requirement to support at least 4xAA), and it makes for more efficient multi-GPU control.
Thus far, there's nothing too major in DX10.1 that benefits the actual user. Also, nothing announced at present will use DX10.1, but that's obviously likely to change at some point.
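The per-edge AA control mentioned above boils down to letting the game read individual MSAA samples and decide how to combine them, instead of the fixed averaging older hardware does. Here's a toy Python sketch of the idea (purely illustrative; the function and names are made up for this post, not the actual D3D API):

```python
# Toy illustration of a custom MSAA resolve: each pixel carries N
# coverage samples, and the game decides per pixel how many to blend.
# DX10.1 exposes per-sample access to shaders; this is just the concept.

def resolve_pixel(samples, use_n):
    """Average the first use_n samples (a custom, per-pixel resolve)."""
    use_n = max(1, min(use_n, len(samples)))
    return sum(samples[:use_n]) / use_n

# An edge pixel with 4 samples: two covered (1.0) and two not (0.0).
edge = [1.0, 1.0, 0.0, 0.0]
print(resolve_pixel(edge, 4))  # full 4x resolve -> 0.5 (smoothed edge)
print(resolve_pixel(edge, 1))  # single sample -> 1.0 (no AA on this pixel)
```

That's why the spec mandates at least 4x support: the per-edge control only means something if there are enough samples to choose from.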
See above.
If developers start to use DX10.1 more often, wouldn't their games also require DX10.1, forcing users to buy DX10.1 hardware? Or will developers switch their games back to DX10 when they go retail?
hi guys! i'm a nub, as you can see by my post total <
so, someone said they just sold their GTS, and i'm guessing it's to upgrade to this badboy, what are your general thoughts on that move? it's tempting to think that i can sell my card, get a faster one, and have money left over for some sammiches. :)
good move, bad move?
Depends. Is it a 320MB? Is it a 640MB? How much would you buy it for? How much would you sell your own card for? Etc.
If it's a 640MB GTS then I wouldn't spend more than $260-$270, if it's a 320MB version I wouldn't buy it at all.
Personally, I sold my graphics card not too long ago (X1900XTX) and now I'm waiting for either rv670 or 8800gt to come out. I can't play anything in the meantime but it's alright.
You know... since it's based on 65nm technology, it should overclock pretty high. High enough to beat an 8800 Ultra, haha. Although I don't understand why NVIDIA doesn't release the high-end card first... Anyway, I did wait for a new one, so let this one be it.
http://www.clubit.com/product_detail...mp=AFC-CJ_CBIT
In Stock .... nice price
I doubt they even had them in stock at the time; they probably counted the numbers they were allotted by nVidia (of the 50,000 total) for the Oct 29 launch as "in stock."
another marketing gimmick by a shop i suppose. :D
So when is the official online release date? I need to know when to sell my GTS!
Maybe I can downgrade through NVIDIA's Step-Up program, lol; they should send me some money back, har har har :rofl:
How do you overclock this 8800GT? RivaTuner and Expertool don't seem to work...
that seems kinda odd, since they used Expertool here: http://news.expreview.com/2007-10-18...1918d6065.html
http://www.expreview.com/img/news/07...equency_02.jpg
and I think they also used ATITool, but only Expertool did the shaders.
It's the BFG 8800GTS 640 OC2. :yepp:
It's an awesome card, and has treated me well in the last 6 months, but the new 8800GT is tempting.
http://farm3.static.flickr.com/2140/...0feaac9849.jpg
To be honest, I would love to get 300+ for the card, and if I get it out there fast enough, I might be able to. But if I can sell it, and then with that money get an 8800GT, I'd be happy with that, too. We'll see :D
100MHz OC on the core? Man, I hope you just got a bad card and they don't all clock this poorly. What are your temps after overclocking? Maybe this card will really show its true colors after some vmods?
So that 900MHz core GPU-Z screen looks kinda fishy after seeing OBR's results... http://www.hardspell.com/pic/article...40054138d8.jpg
I'm guessing that's a suicide/bug shot? Many people suggest ~750 to be the max on the core, and ~2000 the max on the mem for OC...
That GPU-Z shot is a bug - it's not reading the values correctly.
If you read the thread you pulled that image from, you'd know that was a bug in GPU-Z... which I believe was fixed in the latest build.
Also, if you had taken the time to read the review of the card before it was pulled from the Chinese site, it showed he achieved a max core overclock of around 730MHz, and I think that was with a VF900 cooler on it. There were a few different screenshots with different clocks: one was @ 700/1000 and the other 730/1000.
So 700 on the stock cooler seems typical; not a bad card!
edit: I'm not an nvidia guy or an ati guy... I'll take whichever one performs well. I'm just keeping misinformation from spreading... since I've been researching this card and what's been released about it :p:
lol, are you serious man?! I never saw the site because it was taken down. All I ever saw was that mysterious screenshot floating around showing a 900MHz core... I didn't know it was a bug, so how would I know about it if the review was pulled before I could see it? And it's a disappointment, since most later-revision 90nm 8800GTS cards will do 650-700MHz core on average on the stock cooler, up from 500MHz.
If 700MHz is a typical overclock on this card, then the RV670 ends up being slightly more exciting.
Quote:
:up:
huh? no 20k?
quad @ 3.6 and both cards o/c to 750/2000 = >20k
nice post masacaras
Original thread those images came from (non-expreview branded): http://forum.lowyat.net/topic/541066/+360
oh man!
i may have to sell my 2 7900gtos for a pair of these :up:
http://forum.lowyat.net/index.php?ac...post&id=318212
these run extremely hot.
this image is from http://forum.lowyat.net/topic/541066/+420
The stock coolers are rather dinky. After market cooling will be your friend.
For the record, I wouldn't expect the RV670 to run any cooler.
Those are some ridiculous temperatures...
Damn high temps.
Summer + bad airflow in the case + stock 8800GT cooler = 100°C.
We can call 8800GTs kettles, toasters or pans :ROTF:
Eh, it isn't the first time we've seen temps like that. I wonder what the fan speed was set to?
:::edit:::
The fan's on auto, which makes me wonder if auto is the standard 60% the other 88s saw. Kick it to 100% and it'll probably drop quite a bit.
Also, that 73°C ambient is telling me someone has some VERY bad case airflow.
http://forum.lowyat.net/index.php?ac...post&id=318443
Here's a stock MSI OC version at idle... I kind of doubt it's going to jump 40°C between idle and full load. The other one must be under the worst of airflow.
I really hope we don't see those temps. I have good case airflow and my computer room is in the basement, so I'm usually at 18°C ambient; I guess I shouldn't worry much.
I will try it on the stock cooler first but I think eventually I will move to a Zalman
what water blocks would fit these? :D
Since I'll take 2 of those... **fingers crossed** hopefully that stupid logo cover can be taken off, and I'll put a 120mm fan blowing away from the heatsink.
I don't like Thermalright's or Zalman's RAM sinks... those things fall off rather easily... and the MOSFET heatsink as well... I'll just use the stock cooler.
But boy, that 18k 3DMark06 is stunning.
...Where are all the RV670 benches? Also that 8800GTS with the G92 core as well...
Someone shoot that guy with the 8800GT SLI and take over his setup. What forum allows you to run a QX6850 @ 12x300 with a 3:4 divider? :mad: Overclocking ABC says 9x400 1:1, and if you're aiming for a low OC, don't get a QX.
http://www.expreview.com/img/news/071026/DSC00525_1.JPG
:D
:yepp: Quote:
hopefully that stupid logo cover can be taken off and ill put a 120mm fan blowing away from the heatsink
Isn't it Monday yet?
Also take note that this is a load temp with the ATITool 3D artifact checker running for 32 minutes straight. I don't know about you, but I usually game for more than 32 minutes.
And assuming the fan is on auto... exactly how high does the GPU temp have to get before that dinky fan on the cooler runs at 100%?
You have to take into account that ATITool's fuzzy-cube test also loads the GPU much more than an actual game does in terms of generating heat. I don't think you'll ever see the temps you see under ATITool while gaming... at least that was true when I was first overclocking my X1800XL a year or so ago... or was that 2 years ago? lol
So was that 18k 3DMark run with the 8800GTs OCed, or was it @ stock?
Volt mod anyone?
It was stock. :up:
8800GT SLI@600/1800
Link: http://forum.lowyat.net/topic/541066/+360
EVGA pics
http://images10.newegg.com/NeweggIma...130-303-01.jpg
http://images10.newegg.com/NeweggIma...130-303-05.jpg
http://images10.newegg.com/NeweggIma...130-303-06.jpg
http://images10.newegg.com/NeweggIma...130-303-07.jpg
http://images10.newegg.com/NeweggIma...130-303-08.jpg
http://images10.newegg.com/NeweggIma...130-303-09.jpg
http://images10.newegg.com/NeweggIma...130-303-10.jpg
Damn!! With a Q6x00 @ 4GHz + GT SLI @ say 700/2000... 20k should be ez.
@warboy, any links? I want more info about those cards, lol!! EVGA usually sells for ~$30 to $50 cheaper than XFX and BFG (at least for the GTS and GTX), and they happen to be in CA, which means RMA is a bit quicker.
good thing i havent bought Quakewars ...
man that card is freaking sexy :up:
Good find, Warboy.
So far we have a clown from Foxconn, hair from Inno3D, and a blue energy-ray whatchamacallit from EVGA...
Wait till we see Frobot in action. :D
Wow the EVGA is nice. Since it's coming out November 29th it kind of sucks that I can't preorder it on newegg... :(
If they got 18,200 with a quad clocked at 3.6GHz and 2 stock GT cards, then I'm assuming that overclocked, with the quad, it should get around 21-22k.
That's depending on how well they overclock.
The GTS was nice: 500MHz core, and it overclocked to 660+ on average.
The GT is at 600, so if we can get 760-775, that would be great!
Also, more importantly, if NVIDIA can work on their SLI drivers, it would then make almost perfect sense to own SLI as opposed to one GTX or Ultra.
They should be shooting for close to 2x the speed in every game; 1.3-1.4x just makes no cents.
Also, if the GT is able to score that high, can we expect close to 23-24k from GTS SLI? ;)
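The 21-22k guess above can be sanity-checked with some back-of-the-envelope math. Assuming (my assumption, not a measurement) that roughly 60% of a 3DMark06 score scales with GPU core clock and the rest is CPU/driver bound:

```python
# Rough 3DMark06 scaling estimate for the 8800GT SLI numbers in this
# thread. The 60% GPU-bound share is a guess, not benchmarked data.

def estimate_score(baseline, stock_mhz, oc_mhz, gpu_bound_share=0.6):
    """Scale only the GPU-bound share of the score linearly with core clock."""
    clock_gain = oc_mhz / stock_mhz - 1.0   # 760/600 - 1 ~= 26.7%
    return baseline * (1.0 + gpu_bound_share * clock_gain)

# 18,200 baseline from the stock SLI run, GT cores pushed from 600 to 760MHz.
print(round(estimate_score(18200, 600, 760)))  # 21112, in line with 21-22k
```

Of course real scaling flattens out once the CPU becomes the limit, so treat this as an upper-ish bound rather than a prediction.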
i think he means Oct 29th
So everyone wants the evga for the 700/1000/1750 clocks?
OIC.
I don't think that will launch until November 12.
It will be very interesting to see whether this SKU actually surpasses the GTX or manages to fit between the GT and GTX. If it's a 640MB card with 112/128 SPs, then we could be talking about minute percentage points differentiating the GT, GTS, and GTX, yet there's a huge gulf between the prices O__________O.
It brings back the days of the GeForce 4 series, when the low end and the high end weren't far apart in performance but the prices covered a wide range.
tweaktown has a review of the msi 8800gt:
http://www.tweaktown.com/reviews/121...ion/index.html
Zotac GeForce 8800 GT 512MB
1269,00€ (= $1826.35)
http://geizhals.at/eu/a291462.html
Sparkle GeForce 8800 GT, 512MB
242,95€ (= $349.65)
http://geizhals.at/eu/a291468.html
And of course, they've got nothing in stock.
But I love the price tag of 243€ before launch; it should drop 10-15% after a while...
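For what it's worth, the dollar figures in those listings follow from an exchange rate of roughly 1.44 USD/EUR (derived here from the numbers in the post, not an official rate), which also explains the absurd-looking Zotac placeholder:

```python
# Exchange rate implied by the Sparkle listing (242.95 EUR = 349.65 USD).
implied_rate = 349.65 / 242.95
print(round(implied_rate, 2))            # 1.44

# The Zotac placeholder price converted at the same rate.
print(round(1269.00 * implied_rate, 2))  # ~1826, matching the quoted $1826.35
```

So the $1,826 Zotac "price" is just the 1269€ placeholder run through the converter, not anyone's actual asking price.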
Well, I dunno... everything looks fine to me... Except For The Noise LEVEL!!!
I didn't know that little card could be this loud. :(
http://images.tweaktown.com/imagebank/msi88gt_g_13.gif
And I thought this card drew less power than the GTS? :confused:
http://images.tweaktown.com/imagebank/msi88gt_g_11.gif
And temperature is very high:
http://images.tweaktown.com/imagebank/msi88gt_g_12.gif
Power consumption is a bit odd...
But the high temps make it even more exciting: the higher the temps and the louder the fan on the stock cooling, the better it'll OC with a better HSF or water cooling.
;)
The higher power consumption may be a result of the higher clock speeds set on the card.
LOL, in the TweakTown review they have the HD2900XT at only 11,000 marks in 3DMark06.
Just to compare, I ran 3DMark on exactly the same system used in the review (Q6600 @ 3000MHz & HD2900XT at default):
http://img150.imageshack.us/img150/2...viewsetpq0.jpg
~ +1,500 marks
:up:
That TweakTown review is... not trustworthy :p:
Also, measuring the temperature behind the core? WTF... And about it being very hot: it's also a single-slot card. My X1900GT is easily around 75-80°C at stock.
So they basically invalidate their own test :D Quote:
There are two places we pull temperature from - the back of the card directly behind the core, and if the card is dual-slot and has an exhaust point we also pull a temperature from there, as seen in the picture.
yep
I think these numbers are more realistic:
Quote:
GeForce 8800GT graphics detailed evaluation
Intel Core 2 Extreme QX6800 @ 2.93GHz
DDR2 800 4-4-4-12
NVIDIA nForce680i SLI
Windows Vista
8800GT Default (600/1800) = 12,038 marks
8800GT Overclocked @ 650/2000MHz = 12,824 marks
http://img240.imageshack.us/img240/9...3d06thubx8.jpg
http://translate.google.com/translat...language_tools
Also, in the TweakTown review the 8800GT at DEFAULT beats the 8800 Ultra:
8800 Ultra = 13,143 marks
http://img240.imageshack.us/img240/9...3d06thubx8.jpg
8800GT (TweakTown) = 13,730 marks
http://img248.imageshack.us/img248/4663/gtau4.jpg
:up:
8800 Ultra / 8800GT review (13,143 marks for the 8800 Ultra):
Processor: Intel Core 2 Extreme QX6800 @ 2.93GHz
Memory: DDR2 800 4-4-4-12
Motherboard: NVIDIA nForce 680i SLI
Operating System: Windows Vista
8800GT TweakTown review (13,732 marks for the 8800GT):
Processor: Intel Core 2 Quad Q6600 @ 3GHz (333MHz x 9)
Motherboard: ASUS Blitz Extreme
Memory: 2 X 1GB Corsair XMS3 DDR-3 1066MHz 7-7-7-21
Operating System: Windows XP Professional SP2,Windows Vista
Well, TweakTown used Windows XP SP2 and the other review used Windows Vista; in 3DMark06 that matters a lot. But even with XP, I think 13,730 marks with the 8800GT at default is too much.
regards