Uh, you can find minutes to most pre-release info meetings with hardware vendors if you look hard enough. Should fill in the gaps ;)
http://www.digit-life.com/articles2/...g70-part2.html
Quote:
Originally Posted by DilTech
According to its designation, the chip is manufactured or packaged in Korea. That's confusing. It's well known that TSMC has its own facilities for packaging dies, that's why the die should have been stamped TAIWAN, if manufactured by TSMC. There is a packaging plant in Korea, but it mostly processes dies from IBM. Has the American blue giant taken up NV47\G70 manufacturing again? The matter is left open
It's not 100% your story; a hardware site does have it posted, and you are 3 months out of date.
Regards
Andy
Nvidia and TSMC have done this before; I think they have a partner in Korea for products while they are under development:
http://www.xbitlabs.com/news/video/d...704104447.html
Regards
John.
Uh, DilTech, you know that the original R400, the one that is now the XBOX chip... was a unified shader design? Except back then we called it programmable shaders.
try dumping THAT into google, and see what you come up with. Research/sources my ass! :rolleyes:
There ya go, I already said that's what it is man... READ MY POSTS...
Quote:
Originally Posted by DilTech
The R400 was set aside when nvidia threw out the NV40 because it wasn't nearly as fast(no one expected twice the performance out of it!) and made into the R500, which is now the R600.
Quote:
Originally Posted by zakelwe
Notice I put that in my FIRST POST; I'm well aware of what it says. In fact, I QUOTED THAT EXACT PHRASE!!!
However, they only said it appears IBM is making them. I dug in deep on WHY ALL the chips say Korea when they're fabbed by TSMC, which happens to produce its chips in Taiwan... Sure, they might have tested them in Korean fabs, but at least some would say Taiwan, and if you look at every review, ALL OF THEM SAY KOREA!
Ok, but think about it... reports are 16 pipes. It cannot possibly be based off of the R420 and maintain the level of performance that has been touted already; if that were the case... IT'S JUST NOT FRIKEN GONNA HAPPEN. It would have to be a completely new design for the pipeline... hence, not based on the R420. Maybe the PCI-E/memory interface is somewhat the same, but because current memory interfaces don't allow you to use HDR + AA, because of the way textures are rendered, and because ATI have been at the forefront of HDR and HLSL since they designed the 9600 for Microsoft to test DX9 on, I don't see it as likely that they are using the same design @ 600MHz w/ 16 pipes. If it was 24 pipes, you might have something...
If it is as you claim, then how did they get the extra performance that has been touted? Please explain THAT to me, and then you can get back up on that high horse of yours. However, if you cannot, then you'd best get digging.
The point that I was trying to make, but your immature defensive wall won't allow you to see, is that before you decide what this R520 is composed of, check back at the reason WHY the R400 was switched to the R500.
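To put a rough number on the 16-vs-24-pipe argument, here's a quick back-of-envelope calc in Python. The 24 pipes @ 430MHz figure is the commonly quoted 7800 GTX spec, not something from this thread, and pipes x clock is only a crude theoretical ceiling, not measured performance.
Code:
# Crude "pipes x core clock" parity check -- purely illustrative;
# real performance depends on far more than raw pipe count and clock.
g70_pipes, g70_clock_mhz = 24, 430   # commonly quoted 7800 GTX figures (assumption)
r520_pipes = 16                      # the rumoured 16-pipe R520 configuration

# Core clock a 16-pipe part would need to match 24 pipes at 430MHz
parity_clock_mhz = g70_pipes * g70_clock_mhz / r520_pipes
print(parity_clock_mhz)              # 645.0 -- well above the rumoured 600MHz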
Quote:
Originally Posted by cadaveca
Same way the NV47 is twice as fast as the NV40 in those high rez cases: optimizing the architecture. For NVidia, they simply added a second shader pipe, 2 vertex shaders, and 8 more pipes, while adding support for 64-bit FPP. For ATi it's the 512-bit internal memory bus (even though it's only attached to the RAM by a 256-bit bus). There are supposed to also be added ALUs for each pipe, but I'm betting that's false now.
Also, only the XT, which is now CONFIRMED to be a 24-piped card, is as fast as the GTX; the 16-piped cards will not be. They aren't going to be beating the NV47 with a 16-piped card at all, read the official R520 thread! (which, if you notice, was also started by me)
Confirmation here http://www.vr-zone.com/?i=2667&s=1
Also here
http://www.hkepc.com/hwdb/r520firstlook-e.htm
Keep in mind, that isn't speculation, they DO have the card!
Looks like I DO have something, doesn't it?
I'm going defensive because people are typing without reading (as proven by your R400 statement), or they're typing without thinking (as proven by other statements here), and just saying "who cares" simply because the core is actually the NV47, instead of asking themselves WHY!
Can I get back on my horse now, or do you still wanna beat it? :horse: :horse: :horse:
Finally, why the R400 was cancelled, which came down to many reasons: one, ATi not being able to complete it in a timely manner, and two, R&D cuts. There are other reasons too, but I'm tired of hunting for links when this has nothing to do with the conversation at hand.
http://www.dvhardware.net/article1882.html
http://www.geek.com/news/geeknews/20...1219023173.htm
Well, DilTech, I'm just trying to push you to dig deeper. You don't have the full story yet.
I still don't see any confirmation of 24 pipes... and remember that VR-Zone is shamino, and he's an nVidia boy.
The NV47 is faster because of the extra pipes, vertex shaders, and CORE/MEM SPEED, and not much else. Anything else can at most amount to a 2-3% performance increase, not the almost 100% increase. :rolleyes: :stick: If you knew as much as you proclaim, you'd know that memory has the largest impact at higher resolutions, both in speed and size. :rolleyes:
As I said earlier, if it is 24 pipes, then you are right in it being just an R420 spin-off; if it is 16, then it is not. The memory controller is only how they were able to use 2 banks of 256MB chips for memory and not pay a latency hit. There may be minor improvements there, but not enough for the performance numbers they claim. This is why you do not see very many nVidia 512MB cards... they lose out bigtime on that second bank, or cost more than $1000 because of the cost of the RAM in one 512MB bank.
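To put some rough numbers behind the "memory matters most at high resolution" point, here's a ballpark framebuffer calc in Python. The resolutions, AA level and buffer layout are my own illustrative assumptions, and real drivers compress and manage buffers differently, so treat these as order-of-magnitude figures only.
Code:
# Ballpark framebuffer footprint with 4xAA -- illustrative only.
def framebuffer_mb(width, height, aa_samples, bytes_per_pixel=4):
    # multisampled colour + depth buffers, plus front and back display buffers
    msaa_buffers = width * height * aa_samples * bytes_per_pixel * 2
    display_buffers = width * height * bytes_per_pixel * 2
    return (msaa_buffers + display_buffers) / 2**20

print(framebuffer_mb(1280, 1024, 4))   # ~50 MB
print(framebuffer_mb(2048, 1536, 4))   # ~120 MB, before a single texture is loaded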
You are missing the point, however. The R400 was changed to the R500 not because it didn't perform well enough, but because it did not perform under the current-at-the-time DX9 scheme. The actual performance numbers would have made it an easy 16-pipe, 8-VS, programmable shader beast, and I'm pretty sure it would beat the 7800GT @ about 550MHz. But this would have required the entire DX structure to change... and you know the craziness that would have brought... especially if M$ was involved. So it was set as a core for the next version of Windows, and DX10.
So think about all of this for a moment. Supposedly a new core, but where are the enhancements? SM3.0? You know that this is only really a programmer's benefit, and not performance, right? And if that's the case, ATI really IS in trouble... because their up-and-coming card with a unified shader pipeline has only been delayed due to Windows Vista, and nothing else.
All these people fighting Microsoft and the release of Vista are slowing the industry down, in terms of the gaming side, and 3D, for sure!
Anyway, I must digress, this is VERY off-topic... however, I think it's already been shown that there's no "conspiracy" going on here... regardless of what you may personally think. Had you dug a bit deeper in the first place, you would have found that A LOT of nVidia chips have Korea on them... which means absolutely nothing BUT THAT THEY WERE PACKAGED THERE. NOT FABBED, JUST SIMPLY PACKAGED.
X1800 XT 16 Pipes 600MHz Core 700MHz Mem 512MB GDDR3
Quote:
Originally Posted by DilTech
http://www.anandtech.com/video/showdoc.aspx?i=2532
Also, interesting find, but don't be so caught up in an eagerness to do something impressive by discovering this that you begin to ignore relevant counter-evidence for ego reasons.
Hmm... 550MHz, positioned right between the GTX and GT... interesting... why'd that pop up today tho? :confused:
Quote:
Next up is the Radeon X1800 XL, which is positioned between the GeForce 7800 GTX and the 7800 GT. The XL drops the core clock down to 550MHz, and the memory clock down to 625MHz. Other than the lower clock speeds, the XL is identical to the XT, meaning it still has 512MB of GDDR3 memory connected to a 256-bit memory bus. The X1800 XL will be priced at $499. Both the X1800 XT and X1800 XL appear to be dual-slot designs from previous roadmaps and existing box art. This configuration planout also details that there will be HDCP support for the X1800 XL and X1800 XT via Texas Instrument's TFP513PAP DVI transmitter
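Just to translate those quoted clocks into theoretical peak memory bandwidth: all three cards sit on a 256-bit GDDR3 bus, and GDDR3 is double data rate. The 600MHz figure for the 7800 GTX is the commonly quoted spec, not from the article above, and these are peak numbers, not real-world throughput.
Code:
# Peak memory bandwidth from bus width and effective (DDR) memory clock.
def bandwidth_gbs(mem_mhz, bus_bits=256):
    return bus_bits / 8 * (mem_mhz * 2) * 1e6 / 1e9

print(bandwidth_gbs(700))   # X1800 XT : 44.8 GB/s
print(bandwidth_gbs(625))   # X1800 XL : 40.0 GB/s
print(bandwidth_gbs(600))   # 7800 GTX : 38.4 GB/s (assumed 600MHz memory)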
'Cause I think the NDA expires on the 15th
Quote:
Originally Posted by cadaveca
Quote:
Originally Posted by Doumz
I think NDA kicks in tomorrow
Quote:
Originally Posted by Anandtech
Diltech
My thoughts are that in your first post you made claims that this is 100% your story and that no hardware site has it posted. This is not true. Your theories are all your own, though, but that is not news.
You then say
"The chip was then supposidly cancelled so that NVidia could concentrate on the G70, a core that was suppose to be in the works along side the NV47, however, it turns out the NV47 was never actually cancelled... The 7800gtx IS INFACT THE NV47."
Can you point me to a link where nvidia have said the NV47 has been cancelled so that they could concentrate on G70? If you cannot quote nvidia, what are your sources for this claim? What reliability do these sources have if they are not nvidia?
Other people say G70 is NV47, they just renamed it. They said so months ago; this is not a news story. That is in the digit-life article as well.
You then come up with 3 theories which are interesting and then come out with
"Regardless, NVidia is going thru ALOT of trouble to hide the core's true name"
But are they? You have already said it is reported as NV47 at the hardware level and that nvidia have gone to no trouble to hide that the chips are made in Korea (you stated that people do not take heatsinks off so would not notice; however, most reviews do, did and have noticed... not very clever by nvidia in that case).
Rather than the simple, and probably correct, assumption that nvidia's NV47 is G70 and that nvidia decided to change code names for some reason, you have opted for G70 being another chip that is not released yet and perhaps renamed to G71.
Some good evidence for this would be if both NV47 and G70 were listed in the nvidia display .inf file, as this would indicate they are two distinct chips. I think G70 was mentioned as far back as leaked beta drivers in Feb 2005; can you show me an .inf file where they are both present?
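If anyone wants to actually check a driver for both code names, something like this Python snippet would do it. The nv4_disp.inf file name is an assumption on my part; point it at whichever leaked or official driver .inf you have.
Code:
# Scan a driver .inf for NV47 and G70 device strings.
def find_codenames(path, names=("NV47", "G70")):
    hits = {name: [] for name in names}
    with open(path, errors="ignore") as f:
        for lineno, line in enumerate(f, 1):
            for name in names:
                if name.lower() in line.lower():
                    hits[name].append((lineno, line.strip()))
    return hits

for name, matches in find_codenames("nv4_disp.inf").items():
    print(f"{name}: {len(matches)} matching line(s)")
    for lineno, text in matches[:5]:   # show the first few matches
        print(f"  line {lineno}: {text}")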
Your post is interesting in regard to where the chip is made and who is making it; that is a question that would be interesting to answer, but it still does not make this a news story, I am afraid, IMO.
My heart goes out to you and your kind... how you managed to survive this long in our technology-driven world is a miracle. I will pray for you.
Quote:
Originally Posted by Turok
much better :)
Quote:
Originally Posted by Mesce
NV47, G70, doesn't matter what it's called. Yes, I realize the purpose of the thread was to highlight that this is just an interim product and there will be a faster one. But, guess what... there will always be a new, faster one!
I'm surprised that more psychologists and psychotherapists don't visit these forums. It's a bit more understandable at rage3d.com or nvnews.net, but it's the same old arguments, misconceptions, rumours and imagination. Pretty much EVERYTHING being argued about is speculation. You might as well be arguing about whether or not it will rain 1 year from now... quoting some fisherman, using tea-leaf reading as proof, and perhaps even tossing some coins.
But, after all this arguing, let's just imagine for a sec that one of you is 100% right. Wow... oh wait. You can see into the future and *know* exactly the specs and price and launch date and everything about the next product (something nVidia probably doesn't know). Perhaps you know everything about all new products launched in the next 10 years!... So, please tell me, how does this benefit you? You still only have a 6600GT or some other current card... you can't bring back things from the future. Worst of all, you don't even get to feel *special*, because whenever you go to forums telling people all about the wonders of the future, they all think you're a raving lunatic, or no better than the hundreds and thousands of others with their own speculations.
Slowly but steadily, new, faster products will be released... don't waste precious time stressing over that when you could be playing games NOW!
ok, I'm going back to BF2...
COOL - Country Of Origin Labeling.
The G70 die (chip) is fabricated at TSMC. The die is then packaged in Korea, probably by Amkor (http://www.amkor.com). Amkor is the #2 semiconductor packaging company in the world and it is one of the packaging companies that Nvidia uses. Other assemblers are ASE (#1), STATS-ChipPAC (#3) and SPIL (#4). All have multiple factories in various countries (Taiwan, China, Malaysia and Singapore are the most common). Amkor is one of the oldest contract assemblers and has major facilities in Korea and the Philippines. (The Philippines was once a popular location for semi assembly, but the current hotbed is Taiwan and China for proximity to fabs and end customers.)
It would then be tested at Amkor or by other test houses. The package is marked with the last place of manufacturing. Test is not considered part of the manufacturing process. (DoD/Pentagon and US Customs)
This is due to US Customs Regulations. The full document is too long, so I extracted the relevant paragraph :
U.S. customs laws require each imported article produced abroad to be marked in a conspicuous place as legibly, indelibly, and permanently as the nature of the article permits, with the English name of the country of origin, to indicate to the ultimate purchaser in the United States the name of the country in which the article was manufactured or produced. See:
http://www.customs.ustreas.gov
Even though the laser marking is on the die, it is actually applied during the test process, and the datecode is the packaging completion week set by the assemblers, not the week of TSMC fabrication.
It is on the die and not on the substrate (the green PCB) because it is the most convenient location for a flip-chip package. TSMC does foundry work and wafer sort. It does not do any package assembly. It does have an alliance with ASE to provide turnkey services.
For the same reason, AMD Athlon 64s are marked "Assembled in Malaysia" even though the die is fabricated in Dresden, Germany and final test is in Singapore. Intel processors are marked Malaysia, Philippines, Puerto Rico and USA. Intel also has a facility in China, but I am not sure whether it serves China only or exports as well. Intel fabs are located in the USA (NM and Oregon), the UK (Northern Ireland) and Israel.
AMD Athlon64 Malaysia marking and not Germany... the precise example I was waiting for somebody to use.
It's a damn shame that on a website where there are supposed to be so many informed enthusiasts, there is so much confusion and misinformation.
Thank you very much, siliconman, for one of the best first posts ever...
Jesus, it's called a suspicion. Don't get all uptight about it. Personally, I could see this happening without any kind of info dug up. I mean, ATI is delaying the crap out of everything, so maybe when they actually do release the R520, they can make damn sure it's gonna beat the 7800GTX. So it would be smart for nVidia to "cancel" the 7800 Ultra and just sit on it in case the R520 is the :banana::banana::banana::banana::banana:, that way they can release a monster 32 (maybe) pipe card that just wipes the floor with ATI's best offering. Honestly, it makes great business sense.
Digging up an old thread
Although I disagreed with Diltech's original story, I was rather intrigued by the fact that the 7800GTX was marked Korea rather than Taiwan...
As things have progressed, this has become more and more interesting.
As people have pointed out, the packaging factory and stamp do not really show where the core was produced, though it seems strange for TSMC to produce the core in Taiwan and then ship it to Korea for packaging.
Looking at the latest 512MB GTX, the package now says Taiwan. There are other things that indicate the core is perhaps made differently.
a) Massive increase in clocks
b) All clock domains the same
c) Kingpin reports no cold bug.
d) Packaging now says Taiwan.
Because of the above, I still wonder if Diltech's original claim that the original G70 was IBM-made might be correct. It seems convenient that, with all those changes, the cores are now stamped Taiwan, where TSMC have their core and packaging fabs.
Hmmm.
Regards
Andy
If you read up, it is in fact a tweaked PCB that allows for the higher memory clocks on the 512MB GTX, and NOT a tweaked core.
The 512MB GTX cores are in fact just cherry-picked cores that nVidia has been storing up for this card since day one.
There's a nice review on hexus.net of the 7800GTX which details the core and how it works 'n' all that jazz. http://www.hexus.net/content/item.php?item=3904&page=1
On page three there's a photo of the core which shows the core as being made... or at least marked in Taiwan, and also bearing the marking "GF-7800-U", hmmm. Just read it. It's got dates, core comparisons, details on the board, the whole bit. - (or at least what they think they know) :para:
Anyhow, no conspiracy theories here, just adding something to the mix.
I can't comment on the weird situation with the Korea/Taiwan markings because it simply makes no sense to me... I have a feeling you might be onto it with the IBM vs. TSMC thing, I really don't have the slightest clue though... but I do know that there is a lot more than meets the eye with the GTX A2 revision.
Quote:
Originally Posted by zakelwe
1) The mem controller has been de-cold-bugged (yes, it, like in the A64, was the reason) and has been generally retweaked for internal bandwidth (the new cards run significantly looser timings, and as I saw in one review, it tends to perform identically at identical speeds [some games benefit from 512MB while one game [D3 I think] did have a noticeable slow-down]);
2) They used 'mobile tech' from the beginning--that's how we had the various speeds. What about various voltages? Yes, 1.45V is the voltage we all read and is probably the voltage the entire A2 core is getting, but was 1.4V the voltage the entire A1 core was getting? ;)
3) The various clockspeeds are just a BIOS setting--it can be yours for the small cost of a new BIOS :D