
Thread: [News/Rumor] NVIDIA's next-gen Volta GPUs teased, 3 high-end models

  1. #1
StyM
    Join Date
    Mar 2006
    Location
    Tropics
    Posts
    9,468

    [News/Rumor] NVIDIA's next-gen Volta GPUs teased, 3 high-end models

    http://www.tweaktown.com/news/53396/...els/index.html

    Well, according to the latest rumors from Baidu user USH Ishimura, NVIDIA's next generation Volta architecture is going to be an absolute powerhouse. The user said that the successor to the GP104 (which is the GPU that powers the GTX 1080 and GTX 1070) will offer "really strong" performance.

    The rumor continues, adding that there will be three high-end gaming graphics cards unveiled with the next-gen Volta architecture. Right now under the Pascal architecture, we have the GP102 which powers the Titan X, GP104 which powers the GeForce GTX 1080 and GTX 1070, while the GP106 powers the mid-range GTX 1060.

    NVIDIA will reportedly unleash the GV102, GV104 and GV110 - but we have no idea which GPU will be powering the new GeForce GTX 1180. I'd say the GV110 could be an utter beast, powered by HBM2 memory, with a price tag that should make the Titan X walk away with its tail between its legs.

    The next-gen GV102 should arrive as a Volta-based successor to the Titan X, while the GV104 will succeed the GP104-based GeForce GTX 1080. But then we have to wonder what NVIDIA will name the next-gen cards. We know that NVIDIA is tapping Samsung's 14nm process for its next-gen GPUs, which might arrive as the new Volta-based cards.

    Do the GeForce GTX 1180 and GTX 1170 sound right? I don't know... but I also worried that NVIDIA naming the new Pascal cards GTX 1080 would make people associate them with the 1080p resolution - and that hasn't been a problem at all.

  2. #2
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Is a forum post a credible news source now?
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  3. #3
    Xtreme Addict
    Join Date
    Dec 2004
    Location
    Flying through Space, with armoire, Armoire of INVINCIBILATAAAAY!
    Posts
    1,939
    On the other hand, this contains approximately zero actual new information.

    Nvidia is planning to make a successor to their current product (duh! what tech company isn't?)
    It's codenamed "Volta" - we knew this from roadmap slides for like a year now
    It's going to be faster than current products (again, unsurprising)
    It's going to use HBM2 - this is a reasonable guess, although they might change their mind if even faster GDDR5X shows up. This is going to be a last-minute decision, likely on a per-SKU basis.
    It's going to use Samsung's fabs - we've already heard this rumor elsewhere.
    Sigs are obnoxious.

  4. #4
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
    I don't buy it. Volta might offer big gains in deep learning optimization, but GP102 is already a huge die. I just don't see Nvidia finding a way to cram in a significant increase in shaders - or even increase clocks much - without running into power/thermal issues.

    It's certainly possible they will work on improving their asynchronous compute; I just don't see a revolutionary update coming our way.
    Quote Originally Posted by Hans de Vries View Post

    JF-AMD posting: IPC increases!!!!!!! How many times did I tell you!!!

    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    .....}
    until (interrupt by Movieman)


    Regards, Hans

  5. #5
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Quote Originally Posted by AliG View Post
    Volta might offer big gains in deep learning optimization, but GP102 is already a huge die.
    GP102 at 471 mm^2 is not that big. GP100 is 610 mm^2, roughly 30% larger.
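    Quick sanity check on that 30% figure, as a back-of-the-envelope Python sketch using the die sizes quoted above (take the reported numbers with the usual grain of salt):
    Code:
    # die sizes quoted above, in mm^2
    gp102 = 471
    gp100 = 610
    print(f"GP100 is {gp100 / gp102 - 1:.0%} larger than GP102")  # prints "GP100 is 30% larger than GP102"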
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.
