Ray Tracing To Debut in DirectX 11 *Official April Fools Thread!*
"This is breaking news. Microsoft has not only decided to support ray tracing in DirectX 11, but they will also be basing it on Intel's x86 ray-tracing technology and get this ... it will be out by the end of the year! In this article, we will examine what ray tracing is all about and why it would be superior to the current raster-based technology. As for performance, well, let Intel dazzle you with some numbers. Here's a quote from the article: 'You need not worry about your old raster-based DirectX 10 or older games or graphics cards. DirectX 11 will continue to support rasterization. It just includes support for ray-tracing as well. There will be two DirectX 11 modes, based on support by the application and the hardware.'"
http://tech.slashdot.org/tech/08/03/31/1423247.shtml
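For anyone who hasn't seen it before, the core difference from rasterization is simple: instead of projecting triangles onto the screen, you fire one ray per pixel into the scene and intersect it with the geometry. Here's a minimal toy sketch in plain C++ (one sphere, one light, grayscale PPM to stdout) just to show the idea. It has nothing to do with whatever API Microsoft would actually ship; every name in it is made up for illustration.

```cpp
// Toy ray tracer: one primary ray per pixel, ray-sphere intersection,
// simple diffuse shading. Output is a grayscale PPM (P2) on stdout.
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
static Vec sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec norm(Vec v) { double l = std::sqrt(dot(v, v)); return {v.x / l, v.y / l, v.z / l}; }

// Solve |o + t*d - c|^2 = r^2 for the nearest t > 0 (d must be unit length).
static bool hitSphere(Vec o, Vec d, Vec c, double r, double& t) {
    Vec oc = sub(o, c);
    double b = dot(oc, d);
    double disc = b * b - (dot(oc, oc) - r * r);
    if (disc < 0) return false;       // ray misses the sphere
    t = -b - std::sqrt(disc);         // nearest intersection
    return t > 0;
}

int main() {
    const int W = 256, H = 256;
    Vec center{0, 0, -3};             // sphere in front of the camera
    Vec light = norm({1, 1, 1});      // directional light
    std::printf("P2\n%d %d\n255\n", W, H);
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            // One ray per pixel, from the origin through the image plane.
            Vec dir = norm({(x - W / 2.0) / W, (H / 2.0 - y) / H, -1});
            double t;
            int shade = 0;
            if (hitSphere({0, 0, 0}, dir, center, 1.0, t)) {
                Vec p{dir.x * t, dir.y * t, dir.z * t};
                Vec n = norm(sub(p, center));
                double l = dot(n, light);
                shade = l > 0 ? int(l * 255) : 0; // Lambertian diffuse term
            }
            std::printf("%d ", shade);
        }
    return 0;
}
```

The selling point is that effects like shadows and reflections fall out of the same mechanism (just trace more rays from the hit point), while rasterizers have to fake them; the catch is that all those intersections are expensive, which is why Intel keeps waving performance numbers around.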
I smell trouble for NVIDIA. They lack an x86 license. AMD was right to purchase ATI. I just hope they kick Intel in the nuts. It's not like there's a tonne of DX10 games out yet. DX11 will take at least 2-3 years; maybe by then Windows 7 will be out.
APRIL FOOLS! All April Fools' jokes found on tech sites are to be posted in this thread. Thank you, the staff.
xGT300 initial specs leaked!!
Quote:
A few initial documents reveal that the xGT300 has already passed the initial design phase and that alpha-stage dies have begun to tape out. The chip differs from the G80/G92/GT200 in several ways. First, the x in the name: it indicates multiple cores (think Larrabee, to some extent).
The top-end part has four cores, each with 64 shaders, 18 TMUs (each with two addressing and filtering units), and 12 ROPs. That may not sound like much, but the system is designed to break the image down into four parts and render each one separately. Oh, and the cores are going to run at well over 1 GHz each, with the shaders at more than double that.
After stepping back to a 256-bit bus with the 9800 GTX (down from the 8800 GTX's 384-bit bus), the xGT300 is being prepped with a 512-bit bus and is expected to have its memory measured in gigabytes, not megabytes.
:clap:
more info:
http://arstechnica.com/articles/paedia/hardware/real-next-gen-nvidia-specs-leaked.ars
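Side note on the "breaks the image down into four parts" bit: that's basically split-frame rendering, the same idea as old SFR-mode SLI, just with the split happening between on-die cores. Here's a toy sketch (plain C++ threads; fillQuadrant and the rest are made-up names, and in real hardware the split and load balancing would happen in the driver/silicon, not in app code) of carving a framebuffer into quadrants for four independent workers:

```cpp
// Toy split-frame rendering: the frame is cut into four quadrants and
// each "core" (here, a std::thread) fills in its own tile independently.
#include <cstdint>
#include <thread>
#include <vector>

constexpr int W = 640, H = 480;

// Stand-in for per-core rendering work: fill a rectangular tile with a color.
void fillQuadrant(std::vector<uint32_t>& fb, int x0, int y0,
                  int x1, int y1, uint32_t color) {
    for (int y = y0; y < y1; ++y)
        for (int x = x0; x < x1; ++x)
            fb[y * W + x] = color;
}

int main() {
    std::vector<uint32_t> framebuffer(W * H);
    // One thread per quadrant, mimicking the four cores in the "leak".
    std::thread cores[4] = {
        std::thread(fillQuadrant, std::ref(framebuffer), 0,     0,     W / 2, H / 2, 0xFF0000u),
        std::thread(fillQuadrant, std::ref(framebuffer), W / 2, 0,     W,     H / 2, 0x00FF00u),
        std::thread(fillQuadrant, std::ref(framebuffer), 0,     H / 2, W / 2, H,     0x0000FFu),
        std::thread(fillQuadrant, std::ref(framebuffer), W / 2, H / 2, W,     H,     0xFFFFFFu),
    };
    for (auto& t : cores) t.join(); // all four parts must finish before scanout
    return 0;
}
```

The weak spot of any fixed four-way split is load balancing: if all the scene complexity lands in one quadrant, the other three cores sit idle, which is exactly why real SFR implementations move the split lines around per frame.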