Does it have the 256-bit icing bus?
Overclocked cocoa count? Or are we talking white chocolate?
-PB
Seen here in 3-SLI..
http://www.lolhome.com/img_big/funny...1158109222.jpg
hehehehe nice one :D
Lol nice pic.
Overclocking on that seems like it could be nice; when you're done you can eat your record-setting gingerbread memory and CPU.
What extra caps?, I ate those...
Can Titan be Nvidia's atonement to overclockers for the Greenlight program?
Actually, one for graphics, a second as dedicated PhysX, and the third is a SB ZxR.
If you guys are worried about getting your GTX 690's ass handed by the new Single-GPU, worry not, it'll be slower than the GTX 690. ( doesn't take a genius to figure that out even without any real info at all, but stating the obvious sometimes ain't a bad thing... especially when surrounded by blonde women :p: )
Even if it is a bit slower, it still is the better choice (for new buyers obviously). No profiles, no microstuttering, more memory. If these results are true, this truly is a beast.
I doubt that it'll be that close. I would be shocked if it came out of the box clocked as high as a GTX 680.
You're pretty much comparing a pair of GTX560s to a 580 here.
Overclocked 560s. Afaik, GTX680 was clocked about 10% higher than originally intended to compete with Tahiti. Anyway, a 560 Ti SLI is about 30% faster than a 580 at 85% scaling. So if Titan is closer to the 690 than that, kudos.
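A quick back-of-envelope on that scaling claim (my own arithmetic, assuming "85% scaling" means the second card adds 85% on top of the first):

```python
# Hypothetical sketch: if 560 Ti SLI is 30% faster than a GTX 580
# at 85% scaling, where does a single 560 Ti land relative to a 580?

sli_vs_580 = 1.30    # 560 Ti SLI relative to a GTX 580 (claim above)
sli_scaling = 0.85   # second card adds 85% of the first

single_vs_580 = sli_vs_580 / (1 + sli_scaling)
print(round(single_vs_580, 2))   # 0.7 -> a single 560 Ti at roughly 70% of a 580
```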
It's really all about the tdp for me.
Seems like it's lower than 2x 680s, and maybe close to them in performance, so I'll probably be getting one just about as soon as they come out. I'll wait a tiny bit to see some reviews or something first.
I'd hate to buy a reference card if a Lightning did come out... that's a hard one to say, though...
I should really get into sli, save me a few $, and it would give me the opportunity to play with the sli compatibility bits and get my profiles updated for that.
But I do wanna save as much as I can on energy if possible, at least somewhat lol.
Has anyone else here tried 3dvision on codmw2?
That's one good example of a game that's showing me that my current card isn't up to snuff at all :(.
I'm getting around 20fps lol.
http://www.pcper.com/reviews/Graphic...eview/3DMark11
I think the more accurate comparison would actually be a pair of GTX 460s, considering the difference in the number of shaders is greater this generation and GK110 is based on a 2nd-generation spin vs. a 1st-generation spin (i.e. GK11x vs GK10x).
So the difference doesn't have to be that big.
As I'm against dual GPUs, this card may be my hero.
I love a strong single gpu vid card, I have tried SLi but other than benchmarking I get no satisfaction from it, but getting good scores from a single gpu is great fun. Titan will be fun, even if it needs soldering to do so ....
:D
ASUS GeForce GTX Titan Has The Same Clocks As GTX 690 and 512-bit interface?
http://videocardz.com/images/2013/02...ng-Listing.png
Quote:
If you were concerned about a slow GeForce GTX Titan clock from earlier reports, then we have great news for you. An Australian shop has just revealed the final clocks of the GTX Titan, thankfully, which are definitely higher.
The ASUS model, which is purportedly a reference one, will receive the same clocks as the GeForce GTX 690. That is: 915 MHz base clock, 1019 MHz boost clock and 6008 MHz effective memory clock. This ends the discussion about the very slow clock (732 MHz) which was mentioned by SweClockers two weeks ago. In my previous post you can read about all the previous rumors, some of which were not true, so I strongly recommend taking all the info about Titan with a grain of salt, including this very post too. This source, however, is straight from a retailer, the third one to be exact. So this is quite possibly the most accurate leak we can expect from an unofficial source before the launch. What this new source also reveals is that the GTX Titan will also feature GPU Boost technology, probably the most recognized technology from the Kepler architecture.
512-bit memory interface?
It’s worth noting that the listing mentions a 512-bit memory interface. I cannot say why it’s wider than previously reported; it may just be a typo. Or it could be a new leak, which would explain the extraordinary performance of this card.
Anything in between 40% and 60% faster than the GTX 680 is quite a feat ( considering the "business" part of the companies & profitability initiatives of course ).
Everything at this point is a guess. As always, some guesses are better than others.
Between the GTX 680 and GTX 690 there's only a 35% difference in performance? If that's right, then Titan will trounce the GTX 690 if it's 40-60% stronger than the GTX 680.
http://tpucdn.com/reviews/ASUS/ARES_...rfrel_1920.gif
Find it hard to believe that Austin Computers has the inside word.
Plus that price is outlandish.
-PB
1920x1200 ? NoAA ?
You know that lower resolution with just 2xAA or none at all, isn't helping the GTX 690 and SLI "open up", right ?
http://images.anandtech.com/graphs/graph5805/46182.png
http://images.anandtech.com/graphs/graph5805/46181.png
1920x1200 and SLI/CF with high end cards is like comparing a Miata and a LFA-1 with a 1/12th mile drag :D
but there is but
http://i288.photobucket.com/albums/l...forces/nv1.jpg
really??
SKYMTL - every big event you have a new password and username.
512bit and 6gb makes no sense
512 bits is just the memory bus size... I think they can put any amount of memory on there and it would work.
Of course you can make 512-bit work with 6GB, but it's not efficient.
What would be the perf difference? 1%?
Think of it like this: to populate a 384-bit bus with 32-bit chips you need 12 chips, and for 512-bit you need 16 chips. You can achieve 6GB with 12 512MB chips on a 384-bit bus, but you'd need 8 512MB and 8 256MB chips to reach 6GB on 512-bit. Using different-size chips is not preferred, as it increases unnecessary complexity, while you can solve this simply by using same-size chips.
Two weeks left ..... hurry up .....
:D
BenchZowner - these reviews are increasingly difficult to take seriously; I made that comment about the percentage difference with this in mind. Before you criticize, know that they are false.
Each RAM chip is 32 bits; old SDRAM and maybe DDR1 were 16 bits per chip, if I remember correctly.
32 bits x 16 = 512 bits
6144 MB / 16 = 384 MB per chip.
This isn't likely, and it doesn't match the specs of the Quadro.
32 bits x 12 = 384 bits
6144 MB / 12 = 512 MB per chip.
This seems more likely, and it matches the Quadro.
Each crossbar on the GPU is 64-bit, as far as I know.
The GPU would likely need a minimum of 2 chips to run.
And any multiple of that would work as long as it has the crossbars for it.
In this case it's 6: 6x 64-bit = 384-bit.
Yes, it's not an even number in the sense that it's not 4 or 8 or 16.
But it's fine; instead of say a 200% increase, it's more like a 150% perf increase in RAM.
There's no perf penalty as far as I'm guessing; I'm sure it uses 64 bits per read, it's just that it can pipeline it into a chain of 384 bits.
Just a guess, for the heck of it.
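A tiny sketch of the chip-count arithmetic from the last two posts (my own illustration; it assumes 32 bits of interface per GDDR5 chip, as stated above):

```python
# Bus width fixes the chip count, and chip count plus total capacity
# fixes the per-chip density (when all chips are the same size).

CHIP_BUS_BITS = 32  # interface width per GDDR5 chip

def chips_for_bus(bus_bits):
    """Number of memory chips needed to populate a bus of the given width."""
    return bus_bits // CHIP_BUS_BITS

def megs_per_chip(total_mb, bus_bits):
    """Per-chip capacity (MB) if all chips are the same size."""
    return total_mb / chips_for_bus(bus_bits)

# 6144 MB on 512-bit -> 16 chips of 384 MB each (not a standard density)
print(chips_for_bus(512), megs_per_chip(6144, 512))   # 16 384.0
# 6144 MB on 384-bit -> 12 chips of 512 MB each (standard, matches the Quadro)
print(chips_for_bus(384), megs_per_chip(6144, 384))   # 12 512.0
```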
8x sgssaa at a 50% fps increase in theory... hmm
I don't know any numbers right off hand...
But I do know that 4xsgssaa is a tad bit sluggish on the 680 in some games, other games 8x is fine.
That and the card still struggles with games like mafia 2.
The 780 might be the thing I'm looking for, but it's more likely that it's not, I'm betting the next gen after that is the one to get.
But I'm gonna get it anyways probably.
I really like SGSSAA with a -1.5 LOD bias. I wish there was a 16x SS grid transparency AA to go along with 16x AA; I think that would be the sweet spot in terms of quality (8x SGSSAA still lowers the texture quality a bit).
Edit:
I should be checking perf reviews of the 690 in certain games to figure out what the performance could look like at best, excluding the ram amount thing though.
Let's stay rational... The GK110 you can find on Tesla cards has a 384-bit bus. Why would Nvidia release a card based on this SKU and rework the MC to get a 512-bit bus, instead of using the same memory controller?
Then Titan will be 2x GTX 670; they can market 2x 256-bit as 512-bit :)
GTX Titan OpenCL Environment Vs 7970GHz :
http://tof.canardpc.com/preview2/2a8...a23d3f40f4.jpg
Reporting 2688SP/875MHz/6GB
http://clbenchmark.com/compare.jsp?c...fig_1=14470292
Found on 3DCenter, thanks to Godmode :up:
hi guys
Exclusive NVIDIA GEFORCE GTX Titan Final Specification
http://www.arabpcworld.com/images/20...ation-APCW.jpg
http://www.arabpcworld.com/?p=26335
Quote:
The final specifications for the NVIDIA GeForce Titan card reached our hands today, and they really are violent specs. The card will be based on the GK110 core, manufactured on the 28nm process, with the GPU containing 14 SMX units, 2688 CUDA cores, 224 texture units and 48 ROPs.
The card comes with a huge memory capacity of up to 6GB of GDDR5 on a 384-bit bus, running at a frequency of 6000 MHz.
The card will come in at 240 watts of power consumption, with 6-pin and 8-pin power connectors, so the maximum power the card's PCB can draw is 300 watts.
The actual launch date of the card is February 18.
We'll hold the clock frequencies for another time, but all we can say is that it's powerful.
Wait for surprises soon.
Aw, not another cut-down top end chip... :shakes:
I hope it will not be a big disappointment after such huge expectations.
If all these specifications are correct, AMD will feel like this :D
http://s7.postimage.org/bl2sae3xj/avatar126343_1.gif
this result in Physx Fluid, Driver?
I thought there was a new fabrication node with the 8800GTX, as well as a totally different architecture. Let alone how crappy SLI was in those days... bad example, mate ;)
lulz? Where do you see a 35% difference in performance? I see 42% difference...
What is there to see in that picture?
prava 83 -> 118 = 35 :D
Guys, are we really discussing results obtained in a CPU bottleneck?
42% faster = 1.42. Let's get back to Titan ;)
Math is a wonderful science isn't it ? :D
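The disagreement above comes down to points vs. percent. Taking the chart's relative scores (83 for the GTX 680, 118 for the GTX 690), the raw difference is 35 points, but the 690 is actually about 42% faster:

```python
# Relative performance scores read off the chart linked above
gtx680, gtx690 = 83, 118

point_diff = gtx690 - gtx680              # raw difference in chart points
pct_faster = (gtx690 / gtx680 - 1) * 100  # how much faster the 690 actually is

print(point_diff)             # 35
print(round(pct_faster, 1))   # 42.2
```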
Anyone else feel the supposed boost clocks seem a little high?
This is off topic but what memory speed are the radeon hd 7000 cards?
yes. if 680 boosts around 100mhz to 1100mhz-1200mhz, then a chip 2x bigger boosting to 1000mhz+ does seem aggressive.
the only clock speed I trust so far is the opencl one below. 875mhz is around 100mhz more than the k20x, and this makes sense. trade some efficiency for more performance in this enthusiast card, and boost will take it to 900-something mhz, very close to 1000mhz, maybe even a little over. I will be slightly surprised if titan really boosts over 1000mhz, but not very surprised if it is just 1005mhz or whatever. I just think ~950mhz makes more sense. this is still looking to be a 250W card easily
Stock was 5500MHz; on the GHz Edition it is 6000MHz.
Anyway, yes, the boost seems much too high for 240W, and the slide is strange (two font colors, fonts that don't completely match, a strange way of writing the memory specs (no space around the "/"), and have you ever seen a GPU brand write "b" for bit?).
Can't say whether it's a true one or not (that depends on who made the slide, which is similar to the 680 slide).
Considering 28nm is a mature process at this point, both AMD and Nvidia should have an idea of how best to utilize transistor designs for 28nm to optimize their GPUs for power and performance.
Just because the 1st-gen 680 consumed xxx power at xxx clock speed doesn't mean new designs will perform identically, even on the same node.
I'm sure a lot has been learned about what kind of transistor to use where, to optimize power efficiency, as well as general GPU design optimization for a particular node.
For the most part I'll reserve my thoughts until I see real numbers.
Thing is, the 580 didn't want to clock that well: 950 was very good, 1000 was stellar and took a lot of juice and cooling, aaaand a custom PCB to make it comfortable. At stock vGPU they were nothing overly special, 900-925 MHz. This is another power-hungry (probably hot, with the same size cooler to dissipate 230+ W) card, limited (so we're told) with no voltage adjustment and an 8-phase 'Nvidia spec' VRM, so I honestly think a moderate OC at best is expected. That said, they should be powerful even at stock, so with all that firepower for gamers I doubt overclocking is even necessary unless you're running huge resolutions or cranked hardcore GFX games.
real benchmarks:
http://www.hardware-infos.com/news/4...ark-score.html
http://www.techpowerup.com/gpudb/b14...GTX_Titan.html
GPU Name: GK110
Process Size: 28 nm
Transistors: 7,080 million
Die Size: 502 mm²
Render Config
Shading Units: 2688
TMUs: 224
ROPs: 48
SMX: 14
Pixel Rate: 51.2 GPixel/s
Texture Rate: 205.0 GTexel/s
Floating-point: 4,919.04 GFLOPS
Graphics Features
DirectX: 11.1
OpenGL: 4.3
OpenCL: 1.2
Shader Model: 5.0
Released: February 18, 2013
Bus Interface: PCIe 3.0 x16 ?
ASUS Part #: GTXTITAN-6GD5
Clock Speeds
GPU Clock: 915 MHz
Boost Clock: 1019.5 MHz
Memory Clock: 1502 MHz (6008 MHz effective)
Memory
Memory Size: 6144 MB
Memory Type: GDDR5
Memory Bus: 384 bit
Bandwidth: 288.4 GB/s
Reference Board
Slot Width: Dual-slot
TDP: 235 W
Outputs: 2x DVI
1x HDMI
1x DisplayPort
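The headline rates in that listing are internally consistent: they all follow from the base clock, unit counts and memory spec (2 FLOPs per CUDA core per clock for fused multiply-add, 1 texel per TMU per clock, and bandwidth = bus width in bytes times effective memory clock). A quick sanity check:

```python
# Specs from the TPU listing above
shaders, tmus = 2688, 224
base_clock_mhz = 915
mem_clock_mhz = 6008   # effective
bus_bits = 384

gflops = 2 * shaders * base_clock_mhz / 1000       # FMA: 2 FLOPs/core/clock
gtexels = tmus * base_clock_mhz / 1000             # 1 texel/TMU/clock
bandwidth_gbs = (bus_bits / 8) * mem_clock_mhz / 1000

print(gflops)                   # 4919.04
print(round(gtexels, 1))        # 205.0
print(round(bandwidth_gbs, 1))  # 288.4
```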
Good finds.... :up:.
You're welcome!
Guys, come on, it's the same result from OBR-Hardware that appeared two days before :)
http://i.imgur.com/dtAs2QE.png
http://www.obr-hardware.com/2013/02/...l-numbers.html
http://www.xtremesystems.org/forums/...=1#post5169787
Oops... you're right!
They may be the source OBR took these results from...
Special Report: Nvidia GeForce GTX official winning features of Titan - Donanimhaber
Quote:
First let me say, contrary to the news of the last few days, the GeForce GTX Titan will not have a 512-bit memory bus but a 384-bit one. Built on the 28nm Kepler architecture, the GK110 graphics chip has 2688 parallel processing units, and Nvidia's official documents bill it as the world's fastest single-GPU graphics card. So, contrary to some claims, the GeForce GTX Titan is not faster than the dual-GPU GeForce GTX 690, but its performance is very close.
As on the GeForce GTX 690, high-strength, thermally conductive metal is used in the card, which is equipped with vapor-chamber-based cooling and accompanied by 6GB of GDDR5 memory. We learn that the card's texture fill rate is 288 billion/sec (the GeForce GTX 690's is 234) and that it will at the same time offer 5.4 teraflops of computing power.
Quote:
The PCIe 3.0 compliant graphics card comes with Nvidia's recommended minimum power supply of 600 watts.
288 GTexels/s and 5.4 TFLOPS is really something!
Quote:
The video card is expected to launch at an astronomical retail price, and the GK110 graphics chip is expected to hold the single-GPU performance and performance-per-watt benchmark crowns for a very long time.
Yeah, it's a fake. 288 GTex doesn't fit with 5.4 TFLOPS.
They should at least learn to calculate.
They probably mean 288GB/s bandwidth. Lost in translation most likely.
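The inconsistency is easy to check, assuming the leaked unit counts (2688 shaders, 224 TMUs) and 2 FLOPs per core per clock: the clock implied by 5.4 TFLOPS gives a texture rate nowhere near 288 GTexels/s, while 288 happens to match the GB/s bandwidth of a 384-bit bus at 6008 MHz effective:

```python
# Consistency check on the "288 GTexels/s + 5.4 TFLOPS" claim
shaders, tmus = 2688, 224

clock_from_flops = 5400 / (2 * shaders) * 1000   # MHz implied by 5.4 TFLOPS
gtexels_at_that_clock = tmus * clock_from_flops / 1000

print(round(clock_from_flops))        # 1004 -> a plausible clock
print(round(gtexels_at_that_clock))   # 225  -> nowhere near 288 GTexels/s
```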
Well, I think the people saying 42% are incorrect, because we're talking about the percentages stated earlier.
The baseline of the calculation is the Asus Ares; all the other percentages are made relative to it, and I repeat, we're not talking about frames per second but a preconceived estimate.
Fake. 502mm² is wrong; that gives it away. Also, at these clocks, a 235W TDP is outright impossible.
Why would 502mm² be so wrong? The GTX 580 was around 520mm² and the GTX 680 is 294mm². Going by the GTX 680's size, Titan is not a doubling of the 680's specs, and neither should its size be, so 502mm² seems like a realistic assumption to me.
The GTX 580, with a 520mm² die, was a 244W-TDP part stock-clocked at 772MHz core; a 235W TDP for a roughly 500mm² Titan core fits within the same general dimensions and TDP.
The GTX 780 might be 502mm², but Titan is GK110; these chips have been available for nearly a year, there are x-rays of them online, and you don't miss that much of a difference in size. Either Titan is GK110 and the size on the chart is wrong, or it isn't GK110 and the chip on the chart is wrong.
Info from TPU and Donanimhaber is correct. The launch date is correct too. I saw three Titans today (it looks better than the GTX 690, very sexy boards); a few magazines have cards for review.
Performance is a few percent under the GTX 690; the numbers from OBR are correct.
Is it a waste to buy that card for my system? I can OC the CPU to a stable 4.4GHz if needed... I'll use the card for 3 1080p monitors.
The link I posted is two days older than OBR and has approximately the same score :shrug:
IF this is correct, it could scare AMD away from the high end or the professional GPU-compute market, if the prices of these cards start settling down. Unless AMD starts making monster cards like Nvidia, they might only just match this card at 20nm, and at that point that would amount to a disastrous launch for a next-next-generation card (50% is nothing to scoff at as an improvement within the same manufacturing process). AMD could beat this card with a smaller chip, but it would have to give up a lot of GPU-compute ability and thus surrender a lot of potential application performance, like they did in the old days.
AMD probably doesn't want to make a big die because AMD doesn't have enough reputation or sales in the pro market to make it worth the R&D and the lost wafers on other products.
Although we don't have much data and nothing is certain at this point, the paper specs really paint the GeForce Titan as a monster card, based on what we know of how much die size and how many more components it has over GK104. Unlike the ridiculous 6970 hype, which had people claiming 60 percent performance jumps on the same manufacturing process from a card that was just a little bigger.
http://www.xtremesystems.org/forums/...umor%29/page17