Please.
Unless I'm mistaken, nowhere in that article does it refer to a card design that would be traceable; rather, it refers to ID tags that they will "remove."
Hence the Photoshop argument.
Quote:
When DailyTech contacted the site owner to get verification of the benchmarks, the owner replied that the benchmark screenshots could not be published due to origin-specific markers that would trace the card back to its source -- the author mentioned the card is part of the Microsoft Vista driver certification program
Sorry, maybe I misread the sentence (English is not my first language), but "specific markers" may refer not only to numbers but also to their location on the card :confused:
If the card they are using is not a mass-production unit, I am sure that AMD engineers can recognise the card without even needing an ID number.
In fact I am :stick:Quote:
Originally Posted by Cybercat
Is the article's author reading this forum? It looks like his changes coincide with posts here...
This is not the only forum with this news...
Yeah, the best thing we can do is wait until Sampsa and the ATI/AMD guys get some ES samples and show us R600 performance :fact:Quote:
Originally Posted by Ubermann
Simply put, guys, we were just eager and highly over-anticipated the G80, and I'm sure the same thing is happening now. Just blow it off and pay the R600 no mind till it's actually launched. All this happens because we've been waiting so long that anything we hear about this card becomes gospel :slobber: ... I'm through waiting on specs and pics, so when it releases I'll be sure to see it :fact:
Of course, the XT has 25% more shader units so it will be a much better folder than the GT.Quote:
Originally Posted by nn_step
Not if you Fold, it doesn't.Quote:
Originally Posted by dejavuxx
Well, if this is not a fake, ATI is back in business :)
Now to wait 'til Friday or Saturday to see what those pics look like...I want to see that massive pcb/cooler.
More like another 2 weeks for me :( Oh well, at least I'm going someplace warm.Quote:
Originally Posted by turtle
You won't see any pics 'cause the site is down :mad:Quote:
Originally Posted by turtle
It's up, and they promise pics soon =)
I can't wait, and am very impatient :D
LOL,Quote:
Originally Posted by roibm
ATI was already in business. They pwned the 7900 series, and just because nVidia has had the fastest cards for less than 3 months (the G80, I mean), it doesn't mean ATI was out of business.
I hope R600 won't let us down, because lots of overclockers have waited a long time for this GPU.
And when I say long, I mean more than half a year. It would suck if R600 turned out to be a disappointment.
I just wanna know what it looks like...
Is it just me or is their website down? LOL
Maybe their benches were all legit and ATi shut them down!!! :eek:
That would suck bigtime :eek:Quote:
Originally Posted by HKPolice
I want to see that big massive uber 133t haxor R600 card, gvd. :stick:
Here's the only official news you can find on the ATI (AMD) home site:
http://img59.imageshack.us/img59/4735/ds44gfsd2bd0.jpg
In other news:
Other rumors:Quote:
First, keep in mind that the memory interface will be a 512-bit wide one. If you pair that with a maximum of 2GB of GDDR4 RAM (it will actually feature 1 to 2GB of RAM, so this is no mistake!) you'll get about 140GB/s of data throughput, which will leave the 8800 cards in the dust. With a maximum of 86GB/s, the 8800 GTX seems far away when compared to Ati's new toy.
The PCB is probably the most complex one ever used in the video card industry. It’s a 12 layer one, a little shorter than 8800 GTX’s but a lot “fatter”. And so is the card since it houses the R600 and a total of 2GBs of Ram. Add up the cooling device and the power lines and memory connectors and you’ll quickly realize that R600 will also be the heaviest card on the market. The “guilty” part in all this equation is actually the mighty quad heatpipe cooler produced by Arctic Cooling especially for R600. As the cooler and heatsink are actually integrated into the PCB in order to improve its structural integrity, the final card should look quite interesting.
A new Rage Theater 200 chip will be used to take care of the card's VIVO functions. As for the actual performance part, you may already know that R600 will integrate 64 unified shaders, similar to Nvidia's 128 stream processors found in the 8800 GTX but far more complex. The unified shader units are 4-way SIMD capable, making them comparable to 256 real shader units. And that should be enough to challenge Nvidia again. On the downside, R600 can only manage 16 pixels per clock, making it slower than the 8800 GTX, but officials from AMD said that it won't be a problem since the actual frequency of R600 will be around 750MHz, which will compensate for this.
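Quick sanity check of the numbers in that quote, as my own back-of-the-envelope Python sketch (not from the article): the 8800 GTX figures (384-bit bus, 1.8 Gbit/s per pin GDDR3, 1.35 GHz shader clock) are the published specs, everything R600-related is the rumored 512-bit bus, 64 x 4-way shaders and 750MHz, and the ~2.2 Gbit/s GDDR4 per-pin rate is just inferred from the claimed 140GB/s, so treat it all as speculation.
Code:
# Back-of-the-envelope numbers for the figures quoted above.
# R600 values are rumors; the GDDR4 per-pin rate is inferred, not stated.

def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps_per_pin

def shader_ops_per_second(units: int, ops_per_unit_per_clock: int, clock_ghz: float) -> float:
    """Naive peak shader issue rate in billions of ops/s (ignores clock domains, co-issue, texture/ROP limits)."""
    return units * ops_per_unit_per_clock * clock_ghz

# Memory bandwidth
print(mem_bandwidth_gbs(384, 1.8))   # 8800 GTX: 86.4 GB/s (matches the ~86GB/s above)
print(mem_bandwidth_gbs(512, 2.2))   # rumored R600: 140.8 GB/s (matches the ~140GB/s above)

# Shader throughput
print(shader_ops_per_second(128, 1, 1.35))  # 8800 GTX: 172.8 billion scalar ops/s
print(shader_ops_per_second(64, 4, 0.75))   # rumored R600: 192.0 billion ops/s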
Quote:
Rumors inside Ati have confirmed that early R600 samples running at 750MHz have produced a score of 11,000 points in Futuremark's 3DMark 2006. While this remains to be proven when the final product becomes available on the market, several voices also said that Ati is already pre-testing some of its samples at a whopping 1GHz.
Various reports coming from Ati (again, this is not an official statement) suggested that the postponing of the launch had a lot to do with the fact that the cooling system cannot cope with the heat dissipation, especially when the GPU is running at high speeds. I kind of doubt that R600 will actually come out on the market running at a full 1GHz, but it will surely be clocked higher than G80. And if you keep in mind that maybe 80% of buyers will overclock the card as much as they can, this becomes a real bottleneck.
At the moment, G80 is the way to go. Ati still has a lot of things to take care of, especially problems regarding high clock speeds and the integration of GDDR4 into the PCB. The G80 pre-calculates loads of answers and stores them in its texture units (G80 has plenty of those). Ati has more computing power but less pre-load design so it’s hard to say which will be the winner. Anyhow, it’s good to see that the architectures of the two giants seem to differ a lot.
ATi is curiously targeting February the 14th as the actual launch date. Yep, it's Valentine's Day, and this time it will coincide with the R600 release. Give or take a few bucks, it's a "perfect" gift for your girlfriend. As illogical as it may be, the launch comes at the perfect time for CeBIT, which will start on March 15 and last until March 21. Whether R600 will be the new king at CeBIT is really hard to say, especially since it seems that Nvidia has 3 months of silence ahead. And I doubt they're going to rest their brains all that time. Several voices from Nvidia have already pointed out that the company is preparing a tweaked 80nm part which will perform better than G80. The chip is called G81 and is expected to come out just in time to meet R600.
Quote:
Since September, Samsung kept announcing how GDDR4 was going to be the next best thing in IT. And I proudly use the word "since" because the Samsung GDDR4 wafers from which the first GDDR4 chips emerged were in many cases either faulty or simply not working as they should. And then came Nvidia's G80 equipped with GDDR3 (384-bit wide). Meanwhile, Ati continued to develop the R600 product line, keeping alive both the GDDR3 and GDDR4 lines.
It seems that the future equals GDDR4 since Ati has recently announced that along with the R600 line, the Stream Processor 2 line (GPGPU board based on R600) will only support GDDR4 memory chips. The name for the new Ati graphics board is still unknown but it’s beginning to be quite clear that both products will rely solely on GDDR4. And that will surely translate into two things.
A projected 2GB GDDR4 R600 will probably be even more expensive than a G80 based one. And then there’s the problem of availability since 2GB GDDR4 monsters will force the memory producers to supply a larger quantity of chips. And since GDDR4 is only at the beginning of its time, this could easily turn into a great GDDR4 shortage.
No other details about R600 are available. Rumors claim that the final chip is rotated at a 45-degree angle (not 60 as originally presumed) in order to keep the memory traces as short as possible. Moreover, R600's back could end up with a G80-style DIN-DVI-DVI layout, with the HD output connector located at the top of the PCB, not in the middle as it was with the older X1950 series. Lots of good news, except for one thing: the R600 was postponed until February. The reasons are unknown, but that seems to be no problem for DirectX 10 users, since Nvidia has some problems itself, with a working DirectX 10 driver for Vista slated for December 15.
Oh dear god, will nothing play Oblivion at that level?? I wonder what 2 of these could do... Any ideas?
Paper launch on Jan 30th no doubt.
2gb card :drool:
Wow, the level of speculation in this thread is crazy.
Here is a little bit for you:
a PROTO with just 4 units and 256MB DDR3 can compete with a 7900GT and win
:slapass:Quote:
We are in the process of migrating our servers. Estimated downtime: 2 hours.
It's up again... still no pics of the card.... :(
R600 Delayed? (INQ)
INQQuote:
The last revision of the prototype chip - upon which a certain "pre-review" is based - also suffers from problems which are serious enough to get another re-spin, sources tell me. This re-spin puts a hold on the launch for another couple of weeks, and now R600 is looking like an early March launch, probably the week before SnowBIT in Hangover. However, AMD/ATI is making severe changes to the whole line up and we can say that this launch, when it happens, will be very, very shocking for the 3D industry.
I put it in this thread..
Lol:Quote:
One small note on the tech demos. Prepare for your mandibles to drop, because Adrienne should look like a high-school attempt at 3D graphics compared to ATI's thing. µ
Looks like nVidia will have a new chip out before the R600: the 80-nanometre version of the 8800 GTX!
The way it's going, it will end up a DNF project :rolleyes:
The late launch is very good for nV, 'cause I think even an 8900GTX will have problems with R600.
By then they should have two more revisions, maybe even a low-energy model (160W - 200W); perhaps they'll move to 45nm to achieve that. One thing's for sure: huge problems with drivers. Since ATi has big problems with drivers even for the present generation, the next will be a disaster, so here's a speculation of my own: if ATi is out by March with some beta drivers, the drivers from November/December 2007 will add more than a 30% bonus.
PS. Seen it before, over and over again.
I wonder what the Inq is referring to when they say shocking to the 3D industry? Maybe something to do with GPGPU? Physics acceleration on the same card? 16 cards in Crossfire mode, lol? A 1GHz 2GB GDDR4 card would be nice :rolleyes:
But the Inq is guilty of overhyping.. remember the R520 Fudo?
As he says, delayed by weeks; big deal?
Hangover? Does he mean Hanover?
Hangover isn't nice :D I think he meant Hannover in Germany
He was thinking about the New Year's hangover :DQuote:
Originally Posted by w0mbat
Oh, the subtleties of fud bot humor
If you read the INQ much, you'll notice that they have dumb nicknames for various places and things. "Hangover" was intentional, likely due to the parties associated with CeBIT. Notice he also calls it SnowBIT in the article.Quote:
Originally Posted by CedricFP
It's the INQ. It's what they do. If you'd like to read more, here is a guide to some of their naming conventions and definitions.
INQ Guide to Jargon
:D Let's seeQuote:
Update - Server Migration Done, ATI R600 Test Pt. 2 Coming Now
Indeed :)Quote:
Originally Posted by Syn.
http://img.webring.com/r/p/popcorn/logo
Just sent a comment requesting info on the Adaptive AA method used in Oblivion...Quality or Performance. I know supersampling punishes NV in that game. We'll see...
I see. I had no idea, thanks for the info :)Quote:
Originally Posted by trakslacker
They just posted this.
http://level505.com/2007/01/03/the-f...0-test-part-2/
Sounds like some 2560 X 1600 benchies will be uploaded soon.
Funny, it looks like AMD is behind all of this. :woot:
then why the billion Google ads?Quote:
Originally Posted by Metroid
"I'm right on top of that, Rose." :p:
^ Obscure Christina Applegate quote ftw. :rolleyes:
Quote:
Originally Posted by nn_step
Well the truth is uncertain. I have to watch X-Files again.
"We are holding them back until we are sure the performance was not biased by this. It’s not gonna take long."
Maybe they should say: We are holding them back because we never had the card in the first place. :)
" We are holding the results back until our ad revenues reach $1200 so we can afford two real R600's when they appear in March .."
Regards
Andy
I think the delay is more to generate hits on the site. I've refreshed that site several times today looking for these phantom benchmarks... Someone will be dumb enough to click a few of those Google ads...
If they were worried about the CPU temps they were getting, why would they even commence benchmark runs until it was sorted out? Now they are taking more time to retest, convenient.....
I would love to believe that these results are real, but I'm even more suspicious than I was before...
*thinks to himself*
What a great way to make a buck... Say that you have some pre-released hardware, run a bunch of fake benchies, and then release the URL to hardware forums and put lots of ads on the page... Sit back and watch the hit count and $$$ roll in.
Wanna see some 8900GTX benches? :fact:Quote:
Originally Posted by Timmay
I really wish we'd get some concrete info on the R600, mainly a release date. I'm not bothered about a hard launch, as long as it kicks ass. I've been waiting for R600 for ages.
Don't be so biased, eh?Quote:
Originally Posted by Poodle
Quote:
Important Notice
We had some unusual development of heat on our Intel QX6700 (180 °F), which obviously caused the CPU to throttle down. We are investigating this issue right now, and if we find this had no impact on the earlier GPU test, we will publish the full scores. We are holding them back until we are sure the performance was not biased by this. It's not gonna take long.