
Thread: NVIDIA Shows 4K Surround Gaming on Assassin's Creed IV - 24.9 Megapixels

  1. #1
    Xtreme Enthusiast
    Join Date
    Jun 2010
    Posts
    588

Thumbs up NVIDIA Shows 4K Surround Gaming on Assassin's Creed IV - 24.9 Megapixels

NVIDIA has been talking about 4K gaming monitors quite often these days, but we haven't heard much about 4K Surround monitor setups until today. NVIDIA has a 4K Surround demo up and running that you really need to see in person to believe.



NVIDIA's demo has three ASUS PQ321Q 31.5-inch 4K Ultra HD monitors set up at 3840 x 2160 each in portrait mode. That means you end up with a 6480 x 3840 array for 24.88 million pixels of pure goodness. They were then running Assassin's Creed IV: Black Flag and letting attendees at the event take the mega-panel setup for a test drive.







Here is a quick look at the available resolutions in the in-game menu! Our current 5760 x 1080 setup officially feels tiny after looking at this 6480 x 3840 setup! How is NVIDIA powering such a monitor setup?



    Very cool demo and this is the first time that we have seen a 4K Surround gaming setup! We just had to share!





NVIDIA was using a high-end Maingear Shift gaming PC running three NVIDIA GeForce GTX Titan 6GB video cards! This system has a combined 18GB of GDDR5 memory and starts at $6,613.00 with the three video cards. Each ASUS PQ321Q 4K Ultra HD display runs $3,499.00, so this 4K Surround setup would set you back an additional $10,497! Not every day you get to see a $20,000 PC setup!
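For the curious, the resolution and price figures quoted above are easy to sanity-check. A quick illustrative sketch (all input numbers taken straight from the article):

```python
# Sanity check of the 4K Surround figures quoted in the article.

# Three ASUS PQ321Q panels (3840 x 2160 each), rotated to portrait:
panel_w, panel_h = 3840, 2160
panels = 3

surround_w = panels * panel_h   # in portrait, the 2160 edge becomes the width
surround_h = panel_w            # and the 3840 edge becomes the height
megapixels = surround_w * surround_h / 1e6

print(surround_w, "x", surround_h)   # 6480 x 3840
print(round(megapixels, 2), "MP")    # 24.88 MP

# Price tally: the Maingear Shift plus three displays.
pc_price = 6613.00
display_price = 3499.00
total = pc_price + panels * display_price
print(total)  # 17110.0 -- the article's "$20,000" is a generous round-up
```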

    http://www.legitreviews.com/nvidia-s...reed-iv_126752
    WOOOOOF

  2. #2
    Xtreme Mentor
    Join Date
    Apr 2003
    Location
    Ankara Turkey
    Posts
    2,631
It'll be a little off topic, but I don't really get why NVIDIA panicked this much. Yes, AMD has been making good moves since last year, and AMD also got attention by being late, but IMO that shouldn't rattle NVIDIA this much. NVIDIA still has the upper hand, and I believe it would have been better for them to show all this a month after the release of the 290X.

    Sent from my SM-N9000Q using Tapatalk


    When i'm being paid i always do my job through.

  3. #3
    Xtreme Cruncher
    Join Date
    Jun 2006
    Posts
    6,215
How do you mean they have the upper hand? You mean they have the upper hand for another few weeks? If so, then it's true.

  4. #4
    Xtreme Mentor
    Join Date
    Apr 2003
    Location
    Ankara Turkey
    Posts
    2,631
    Quote Originally Posted by informal View Post
    How do you mean they have upper hand? You mean they have upper hand for another few weeks? If so then it's true.
I mean that in general they have a huge lead in sales numbers because of NVIDIA's image in people's minds, and that will take much more effort for AMD to break. This is the upper hand I am talking about.
    Last edited by kromosto; 10-19-2013 at 05:47 AM.



  5. #5
    Xtreme Cruncher
    Join Date
    Jun 2006
    Posts
    6,215
Well, AMD is not sitting still.

  6. #6
    Xtreme Mentor
    Join Date
    Feb 2009
    Location
    Bangkok,Thailand (DamHot)
    Posts
    2,693
    impressive
    Intel Core i5 6600K + ASRock Z170 OC Formula + Galax HOF 4000 (8GBx2) + Antec 1200W OC Version
    EK SupremeHF + BlackIce GTX360 + Swiftech 655 + XSPC ResTop
    Macbook Pro 15" Late 2011 (i7 2760QM + HD 6770M)
    Samsung Galaxy Note 10.1 (2014) , Huawei Nexus 6P
    [history system]80286 80386 80486 Cyrix K5 Pentium133 Pentium II Duron1G Athlon1G E2180 E3300 E5300 E7200 E8200 E8400 E8500 E8600 Q9550 QX6800 X3-720BE i7-920 i3-530 i5-750 Semp140@x2 955BE X4-B55 Q6600 i5-2500K i7-2600K X4-B60 X6-1055T FX-8120 i7-4790K

  7. #7
    Xtreme Member
    Join Date
    Sep 2008
    Posts
    115
    Quote Originally Posted by PatRaceTin View Post
    impressive


What's with the comments, though? Eyefinity stretches the image, so it's not real 12K?

  8. #8
    Registered User
    Join Date
    Jul 2009
    Location
    New zealand
    Posts
    23
    I wish this was true...

    EVGA sr2 2x x5667 8c/16t total with EVbot (@ 4.3ghz 24/7) | EVbot | 12Gbddr3 2300mhz stock Gskill CL9 |4x gtx480 on h20|4x 128x corsair ssd |6x wd 2tb| 2x Strider 1500watt + 500 watt |2x swiftech XT ultra ,3x quad rads with two loops | mountain mod case||5.1Ghz,best|Laptopell XPS 1730| 4Gb| 2x 9800gtx build log

  9. #9
    Xtreme Monster
    Join Date
    May 2006
    Location
    United Kingdom
    Posts
    2,182
    Quote Originally Posted by kromosto View Post
    It will be a little of topic but i don't really get why nvidia paniced this much.
It must be because they are not supplying chips to any of the next-generation video game consoles, namely those from Nintendo, Sony, and Microsoft. As John Carmack has already said, AMD has the chance to get close to Intel on the CPU and pull far ahead of NVIDIA on the GPU thanks to optimizations.

  10. #10
    Xtreme Mentor
    Join Date
    Apr 2003
    Location
    Ankara Turkey
    Posts
    2,631
    Quote Originally Posted by Metroid View Post
    It must be because they are not supplying any of its chips to next generation video game consoles, mainly, Nintendo, Sony and or Microsoft. As John Carmack already talked about, AMD has the chance to get close to Intel on the CPU and get far ahead from Nvidia on the GPU due to optimizations.
Yes, that is a good reason for panic...



  11. #11
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Quote Originally Posted by cronicash View Post
    I wish this was true...

    22ms input lag
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.

  12. #12
    Xtreme Member
    Join Date
    Jun 2003
    Location
    Italy
    Posts
    351
    Quote Originally Posted by informal View Post
    How do you mean they have upper hand? You mean they have upper hand for another few weeks? If so then it's true.
Upper hand means they sell more even when their products are inferior.
    3570K @ 4.5Ghz | Gigabyte GA-Z77-D3H | 7970 Ghz 1100/6000 | 256GB Samsung 830 SSD (Win 7) | 256GB Samsung 840 Pro SSD (OSX 10.8.3) | 16GB Vengeance 1600 | 24'' Dell U2412M | Corsair Carbide 300R

  13. #13
    Registered User
    Join Date
    Jul 2009
    Location
    New zealand
    Posts
    23
    Quote Originally Posted by STaRGaZeR View Post
    22ms input lag
ASUS PB278Q 1440p input lag = 26ms
30" DELL U3011 1600p input lag = 26ms
23" Samsung S23A950D 1080p 120Hz input lag = 22ms

Sure you're not confusing this with pixel response?

  14. #14
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by Metroid View Post
    It must be because they are not supplying any of its chips to next generation video game consoles, mainly, Nintendo, Sony and or Microsoft. As John Carmack already talked about, AMD has the chance to get close to Intel on the CPU and get far ahead from Nvidia on the GPU due to optimizations.
NVIDIA backed out of the console deals willingly, on its own two feet. Not sure why they would be worried about that.

Remember the HD 2900 XT? I know ATi/AMD does: similar shader design to the Xbox 360's GPU (the main difference being that the HD 2900 XT could do geometry shading as well). It still lost to the 8800 GTX in Xbox 360 ports, even when they were designed around AMD's shader tech.

I wouldn't put too much thought into the console situation. Even if a game were written from top to bottom around what AMD has in the consoles, you have to realize that those GPUs are essentially a stripped-down 7850. You could optimize for that to perfection, and it'll STILL run like a banshee on a 780 just off the extra horsepower alone.

True story.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  15. #15
    Xtreme Mentor
    Join Date
    Apr 2003
    Location
    Ankara Turkey
    Posts
    2,631
NVIDIA backing out of consoles was their choice, of course, but was it because they wanted to, or because they couldn't match AMD on things like price?

As for console-port game performance, I think the same as you. But if AMD supplies a good console-to-PC porting tool with its SDK for the Xbox, that could easily change. Back then AMD wasn't very interested in the software side, but today they've realized its importance; Mantle is proof of that. Well, we will see.



  16. #16
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by kromosto View Post
    nvidia backed at consoles are their choice of course but is it because they wanted to or because of the things than can't match amd like price.

    and for the console port games performance i think like you. but if amd supply a good console to pc port tool with it's sdk for xbox this can easily change. back in that times amd didn't interested on software side to much but today they realized importance of it, mantle is a proof of this. well we will see.
It was because the money wasn't right. The amount made off console deals is a drop in the bucket compared to the normal profits, so NVIDIA walked away.

Also, the Xbox One uses DX, not Mantle. One of the architects made that statement directly.

Although, in all fairness, I firmly believe that AMD was the better choice for the new consoles. An APU for a game console just makes sense: less power, lower price.
    Last edited by DilTech; 10-20-2013 at 11:27 AM.

  17. #17
    Xtreme Mentor
    Join Date
    Apr 2003
    Location
    Ankara Turkey
    Posts
    2,631
    Quote Originally Posted by DilTech View Post
    Also, Xbox One uses DX, not mantle. One of the architects made that statement directly.
I didn't mean they will use Mantle. Mantle is proof that they understand the power of software. AMD has built a very good software team over the last few years, so it would be no surprise if they provided a good porting utility that converts Xbox games to PC with considerably less effort for developers, applying every possible optimization during the port. If they don't do that, it will be like the last generation, as you stated, because no developer wants to put effort into hardware-specific optimizations while converting a game to PC.



  18. #18
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Quote Originally Posted by cronicash View Post
    ASUS PB278Q 1440p input lag = 26ms
    30" DELL U3011 1600p input lag = 26ms
    23" Samsung S23A950D 1080p 120hz input lag = 22ms

    Sure your not confusing this with pixel response?
    Check the OSD in the pic I quoted

  19. #19
    Registered User
    Join Date
    Jul 2009
    Location
    New zealand
    Posts
    23
    Quote Originally Posted by STaRGaZeR View Post
    Check the OSD in the pic I quoted
I know it says 22ms. What's funny about that? I checked the input lag of most 4K screens.

    Last edited by cronicash; 10-20-2013 at 12:26 PM.

  20. #20
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    Is there a bigger list somewhere that includes 1600p and 1440p monitors?

    All along the watchtower the watchmen watch the eternal return.

  21. #21
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Quote Originally Posted by cronicash View Post
    I know it says 22ms. Whats funny about that? Checked lag input of most 4k screens
That it sucks, especially for a supposedly gaming-focused monitor. It's also known that some monitors suck more than others; nothing new here. 22ms is almost 3 frames at 120Hz. That's the price of post-processing: it takes time.

Also, that pic comes directly from NVIDIA, so it may need a reality check, which usually means even more.
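The "almost 3 frames" figure works out as follows; a trivial illustrative sketch:

```python
# Express display input lag in units of refresh frames,
# as in the "22ms is almost 3 frames at 120Hz" comment above.
def lag_in_frames(lag_ms: float, refresh_hz: float) -> float:
    frame_time_ms = 1000.0 / refresh_hz  # one frame at this refresh rate
    return lag_ms / frame_time_ms

print(round(lag_in_frames(22, 120), 2))  # 2.64 -- "almost 3 frames" at 120Hz
print(round(lag_in_frames(22, 60), 2))   # 1.32 frames at 60Hz
```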
    Last edited by STaRGaZeR; 10-20-2013 at 01:44 PM.

  22. #22
    Xtreme Mentor
    Join Date
    Sep 2005
    Location
    Netherlands
    Posts
    2,693
    Quote Originally Posted by DilTech View Post
    It was because the money wasn't right. The amount made off console deals is a drop in the bucket compared to the normal profits, so NVidia walked away.

    Also, Xbox One uses DX, not mantle. One of the architects made that statement directly.
Aye, because aren't those deals mostly about who will supply the most for the least amount of money?

AMD has the advantage that they can do both CPU and GPU and can therefore offer an all-in-one solution, something NVIDIA can't. So for AMD the profit might be good enough, where for NVIDIA it might have been too low.

But even without any of the console deals, NVIDIA will still do pretty well.
    Time flies like an arrow. Fruit flies like a banana.
    Groucho Marx



    i know my grammar sux so stop hitting me

  23. #23
    Xtreme Enthusiast
    Join Date
    Feb 2009
    Posts
    531
    Quote Originally Posted by DilTech View Post
    NVidia backed out willingly on the console dealings on their own two feet. Not sure why they would be worried about that.

    Remember the HD 2900xt? I know ATi/AMD does, similar shader design to the xbox360 (main difference is that the HD2900xt could do geometry shading as well). It still lost to the 8800GTX in xbox360 ports, even when they were designed around the AMD shader tech.

    I wouldn't put too much thought into the console situation. Even if the game was written from top to bottom around what AMD has in the consoles, you have to realize that those gpus are essentially a stripped down 7850. You could optimize for that to perfection, and it'll STILL run like a banshee on a 780 just off the extra horse-power alone.

    True story.
That, right there, is terribly false. If NVIDIA isn't in any of the next-gen consoles, it's because they couldn't present a competitive product, not because they didn't want to. Sure, they will tell everybody that they didn't want to... but it's far from the truth.

Would you not want to be in a product line that will sell in the hundreds of millions of units? EVERYBODY would want to... but the fact is that NVIDIA couldn't, simply because they lack the x86 license required to qualify. For AMD this is a huge success, a HUGE one.
    Quote Originally Posted by NKrader View Post
    im sure bill gates has always wanted OLED Toilet Paper wipe his butt with steve jobs talking about ipad..
    Mini-review: Q6600 vs i5 2500K. Gpu scaling on games.

  24. #24
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
    Quote Originally Posted by cronicash View Post
    I know it says 22ms. Whats funny about that? Checked lag input of most 4k screens

That's actually why I'm glad my cheapo Korean PLS monitor only has a DVI input and no switchboard => almost no input lag. So even though the pixel response time isn't as fast as my old 2ms TN panel's, it's still much better for just about everything.
    Quote Originally Posted by Hans de Vries View Post

    JF-AMD posting: IPC increases!!!!!!! How many times did I tell you!!!

    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    .....}
    until (interrupt by Movieman)


    Regards, Hans

  25. #25
    Registered User
    Join Date
    Jul 2009
    Location
    New zealand
    Posts
    23
Because Flash Gordon here can't handle thirty thousandths of a second, and it's hugely noticeable. Pfffft...

Don't your GPU/GPUs add an equal amount of input lag anyway, depending on post-processing, render-ahead, and SLI/CrossFire frame syncing?

It would be even better if they added an optional interpolation function (replacing dropped frames) to the GPU (possibly a separate IC on the PCB) to perfectly sync frames to 60Hz without too much input lag. Then there wouldn't be any of this niche stuff (having to choose from a limited/overpriced monitor lineup), and NVIDIA would still make a decent buck on the tech...

But it looks as though they're going to use the gimmicky strobing...

Or they could even use CEC over HDMI (2.0 preferably) so that the GPU can sync to the monitor's scans and possibly interpolate dropped/unsynced frames.

http://www.blurbusters.com/

Lol, at least they didn't call it N-Sync...
    Last edited by cronicash; 10-20-2013 at 08:58 PM.
