Custom cards are coming, but only in late Q2 2013.
I might not be able to get one now anyway.
I need to get a new system beforehand anyway.
I just blew up my truck; bloody sucks, but oh well I guess...
Well, I guess I'll get two Lightnings or DCIIs, whichever comes first and/or is easier to mod.
I will be in France in May. If I can find a custom one there, I can get one. I don't want to change my card, but Titan is tempting.
Yeah, I returned my reference board; I want a better one, so I'll wait for the Lightning or Matrix version.
My sources are private, not something I read online - my proof sits in emails and IM chats that I treat with 110% confidentiality, as I don't want to lose them. It's up to you whether you want to believe me or not, but keep in mind that I haven't been wrong about Titan yet - and I was the first with accurate 3DMark 11 scores :) You're welcome to look further back into my track record if you want; I was also the first to say that Sandy Bridge-E would have "tiered" BCLKs :p:
No need to be so defensive; I'll take your word for it. It's commonplace to ask for proof of claims people make. I hope you're right, though - I hate to think what a Titan Lightning would be worth...
:D
Some people have complained that the 6GB of memory the GeForce Titan has is a waste of money. However, I read that both the PS4 and Xbox 720 will have 8GB, of which 6GB can be used as video RAM. So wouldn't it be useful for future console ports to have 6GB ready on your video card?
Nope!
6GB is too much, but worthwhile (not fully utilized, but semi-decent) for 5760x1xxx resolution gaming & GPGPU.
PS4, Xbox, etc.... muhahahaha. seriously :D
Doesn't it say "shared RAM"? So it's like any low-end video card borrowing system RAM when it's free? How much is actually dedicated video RAM?
I know I will never need 6GB of RAM; 3GB would be plenty. What do you guys do that needs so much RAM? Maybe that's worth another thread.
:)
You need to understand that both use GDDR5 as memory shared between the CPU and GPU; I don't think the choice to use high-latency RAM was made without the API in mind. Basically, the CPU will just handle the first instructions that land, and most of the GDDR5 will be used by the GPU with virtual memory in mind (basically, either core can access any instructions or code in memory that was produced by the GPU, and vice versa). Those consoles are highly multithreaded and will use parallel work as never before. Both support HSA instructions, the latest DX11.1 multithreading features, and OpenCL (and the list of libraries is as long as your arm, including PhysX 3 in software), all with a simple, efficient API/OS dedicated to only one thing: making this hardware work.
Yeah, it seems that on the PS4 the memory is 'shared' between the OS/software and the textures etc. But realistically, I don't think much more than 2GB would go to the OS/software - consoles normally have a lot less overhead than PCs. So this would leave a lot of RAM that could be used as VRAM, and filling it up with high-quality textures seems one of the easiest things to do.
Have you seen the Lightning pic that I hear is circulating? I will try to find it.
Edit: actually it's from the TechPowerUp article on CeBIT. Haven't these pics already been discussed? I'm sure I saw them mentioned already.
:)
[xc] - so you're saying the partners will be making non-reference Titans but no Lightning?
:(
No, I'm thinking of the financial implications and hearing my wallet whimpering from here :(
Haha, yes, I can feel the pain building in my right back pocket already...
What do you think? A USD 1299 RRP, so $1500 for the rest of us?
....ouch ....
:(
If I had to say something about this, it would be that both caches and improvements in a CPU's branch prediction unit could mitigate the downsides of using high-latency RAM (which is why I think performance barely scales with enthusiast-class DDR3 memory modules on systems that aren't memory-bandwidth starved, like Llano and Trinity). Besides, as far as I recall, GDDR RAM is characterized by higher timings, but it also runs at much higher frequencies. I don't know how much worse or better the absolute access latency is compared to standard DDR, but maybe the performance hit on the CPU side is not that bad.
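The "higher timings vs. higher frequency" trade-off above can be put in rough numbers: absolute latency is just cycles divided by the command clock. A minimal sketch, where the DDR3-1600 CL9 figure is a common retail spec and the GDDR5 CAS of 15 cycles is an illustrative assumption, not a datasheet value:

```python
def cas_latency_ns(cas_cycles, command_clock_mhz):
    """Absolute CAS latency in nanoseconds: cycles / command clock."""
    return cas_cycles / command_clock_mhz * 1e3

# DDR3-1600 CL9: command clock is 800 MHz (half the 1600 MT/s data rate)
ddr3 = cas_latency_ns(9, 800)     # ~11.25 ns
# GDDR5 at 6 GT/s: command clock is 1500 MHz (a quarter of the data rate,
# since data toggles at double rate on WCK = 2x CK); CAS of 15 is assumed
gddr5 = cas_latency_ns(15, 1500)  # ~10 ns
```

Under these assumed numbers the absolute latencies land in the same ballpark, which would support the guess that the CPU-side penalty is modest.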
The Lightning is supposedly being announced tomorrow, or within a few days after.
I'm apparently still game for this lol.
I hope they hurry their butts up lol :).
As for it existing to set 3DMark records, screw 3DMark :P.
To follow up -
My two TITANs from ASUS do 1.0GHz out of the box on air. I get the feeling that more SLI performance is possible via driver updates, but they still rock 5760x1200. I hope this setup is not so rare that NVIDIA won't spend more time optimizing SLI.
Games that would stutter with 2x GTX 680s no longer do - however, it still feels like more could be squeezed out of these monsters. Very happy with the out-of-the-box overclocks!
When you consider how far Kingpin overclocked his cards (1750MHz, which is what LN2 7970 Lightnings are getting), I think it's the power delivery rather than the chip that is limiting these cards.
I think if the Lightning edition is able to bypass the TDP limit, it will clock similarly to a 7970, that being in the 1200 to 1300MHz range. It might take water to get those clocks, but percentage-wise, the GTX Titan is among the better overclockers out there.
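The "percentage-wise" claim is easy to sanity-check against the published stock clocks (Titan boost spec 876 MHz, HD 7970 reference core 925 MHz); the 1250 and 1300 MHz targets below are just the figures floated in this thread, not measured results:

```python
def oc_percent(stock_mhz, oc_mhz):
    """Overclock headroom as a percentage above the stock clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

# GTX Titan: 876 MHz boost spec, mid-point of the 1200-1300 MHz target range
titan = oc_percent(876, 1250)    # ~42.7%
# HD 7970: 925 MHz reference clock, a typical 1300 MHz overclock
tahiti = oc_percent(925, 1300)   # ~40.5%
```

So under these assumed targets the Titan's headroom would indeed be comparable to, or slightly better than, a 7970's in relative terms.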
:rofl:
Titan overclocks like garbage. For most of us they throttle down to unacceptable speeds. I just received mine and am so disappointed in its overclocking performance that it's going back - I've never done that before. GPU Boost 2.0 does not work as advertised.
My Titan is the worst-clocking card I've received in years.
Here is a user on OCN with a water-cooled Titan throttling to 967MHz at 1.012V. Yeah, these cards are amazing clockers. :rolleyes: