Oxide Games Reveals VR-Only Nitrous 2 Engine, 'Not Enough Bullets' Game



StyM
03-01-2017, 06:09 AM
http://www.tomshardware.com/news/oxide-games-nitrous-2-engine,33777.html


AMD had much to say during the Capsaicin and Cream event, but the Red Team wasn't the only company with news to share. Dan Baker from Oxide Games took the stage to introduce his next game and the next-gen game engine powering it.
Baker is up to his old tricks again. The co-founder of Oxide Games is out to push the performance boundaries of next-generation game development technology. He and his company are hard at work building the second-generation Nitrous engine--and this time he's not letting the old guard hold him back.
Baker played a role in bringing the Khronos Group's Vulkan API to market, and his company was among the first to embrace AMD's Mantle API. You may also recall that a little over a year ago, we spoke with Baker about his experiments with DX12 in the first Nitrous engine.

At the time, Ashes of the Singularity was still in development, but a lot of eyes were on Oxide Games because Ashes was poised to be one of the first DX12 games on the market. Baker took the time to explore DX12's Explicit Multiadapter feature, which allowed him to create an experimental build of the game that drives Nvidia and AMD graphics cards simultaneously.
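For readers unfamiliar with the feature: Explicit Multiadapter exposes each GPU in the machine as its own D3D12 device and leaves it to the engine to divide work between them. The sketch below (our illustration, not Oxide's code; the helper name is hypothetical) shows how such a setup typically begins, by enumerating every hardware adapter and creating an independent device on each:

#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>
// Link against d3d12.lib and dxgi.lib.

using Microsoft::WRL::ComPtr;

// Hypothetical helper: create one D3D12 device per hardware GPU so work
// can later be split across them (e.g., an Nvidia and an AMD card).
std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllAdapters()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
         ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(),
                                        D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device); // one independent device per GPU
    }
    return devices;
}

Everything past that point--cross-adapter heaps, fences, and copy queues--is where the real engineering lives, which is why the feature was experimental even in Ashes.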
With development of Ashes of the Singularity out of the way, Baker turned toward the virtual reality market and wondered what boundaries he could push next.


"When we finished Ashes [of the Singularity], and we had the first DX12 engine, we said 'Well, ok. We still have to support the old APIs, but what could we do if we did next-generation only? And what would a new engine look like if you were only DX12 and only VR? And what would it look like?'" said Baker. "What we came up with is Nitrous 2.0."


"We have this new technology where we're actually able to rasterize the entire scene at 90 [fps], and in the asynchronous compute we shade at 30 [fps]--we decouple the two. And we can share the data between the eyes," said Baker. "So, that means, in theory, our shading can be 6x more efficient than it ordinarily would be. And async compute is working perfectly for this."
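The 6x figure is straightforward arithmetic under that description: a naive VR renderer shades two eyes at the full 90Hz (180 shading passes per second), while a decoupled renderer runs one shared shading pass at 30Hz. A toy sketch of that math (our interpretation of the quote, not Nitrous 2 code):

#include <cstdio>

int main()
{
    const int raster_hz = 90; // per-eye rasterization rate Baker quotes
    const int shade_hz  = 30; // decoupled shading rate, run on async compute
    const int eyes      = 2;  // shading data is shared between both eyes

    // Naive renderer: shade every eye on every rasterized frame.
    const int naive_shades_per_sec     = raster_hz * eyes; // 180
    // Decoupled renderer: one shared shading pass per shading frame.
    const int decoupled_shades_per_sec = shade_hz;         // 30

    std::printf("theoretical shading speedup: %dx\n",
                naive_shades_per_sec / decoupled_shades_per_sec); // prints 6x
    return 0;
}

In practice the win would depend on how well shading computed at 30Hz holds up across the intervening 90Hz frames, which is presumably why Baker hedges with "in theory."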