
Thread: SuperPi on GPU, we're going CUDA

  1. #26
    Xtreme Member
    Join Date
    Mar 2005
    Location
    The Big Apple
    Posts
    338
    I used to own a Cuda, great American muscle car.

  2. #27
    Xtreme Member
    Join Date
    Jun 2007
    Posts
    324
    call it GPi
    Last edited by cookerjc; 05-25-2008 at 08:13 PM.

  3. #28
    I am Xtreme
    Join Date
    Feb 2005
    Location
    SiliCORN Valley
    Posts
    5,543
    Has there been any HONEST, EDUCATED speculation as to the true power of CUDA vs today's dual and quad core CPUs?
    I read the quick article about Photoshop running with the GPU plugin, and although nothing technical was given, the one-line statement they gave makes it seem like a GPU is light years beyond today's CPUs.

    If this is the case, then why hasn't AMD or Intel made the CgPU? I mean, if AMD/ATI want to win the war, make the damn next thing an ATI CPU.....

    If the GPU is so powerful... then... wtf are they waiting for???
    "These are the rules. Everybody fights, nobody quits. If you don't do your job I'll kill you myself.
    Welcome to the Roughnecks"

    "Anytime you think I'm being too rough, anytime you think I'm being too tough, anytime you miss-your-mommy, QUIT!
    You sign your 1248, you get your gear, and you take a stroll down washout lane. Do you get me?"

    Heat Ebay Feedback

  4. #29
    I am Xtreme
    Join Date
    Feb 2005
    Location
    SiliCORN Valley
    Posts
    5,543
    And what about SLI-Pi?? Or X-Pi...


    Given the piss-poor multi-core support Pi currently has, IF GPUs really are many, many cores, do you really expect it to work all that well? And if it does, then fix the damn PC version to be multi-core capable, TRULY capable, not just 80% one core and 20% the other..
    "These are the rules. Everybody fights, nobody quits. If you don't do your job I'll kill you myself.
    Welcome to the Roughnecks"

    "Anytime you think I'm being too rough, anytime you think I'm being too tough, anytime you miss-your-mommy, QUIT!
    You sign your 1248, you get your gear, and you take a stroll down washout lane. Do you get me?"

    Heat Ebay Feedback

  5. #30
    Xtreme Cruncher
    Join Date
    Nov 2005
    Location
    Rhode Island
    Posts
    2,740
    Quote Originally Posted by Lestat View Post
    Has there been any HONEST, EDUCATED speculation as to the true power of CUDA vs today's dual and quad core CPUs?
    I read the quick article about Photoshop running with the GPU plugin, and although nothing technical was given, the one-line statement they gave makes it seem like a GPU is light years beyond today's CPUs.

    If this is the case, then why hasn't AMD or Intel made the CgPU? I mean, if AMD/ATI want to win the war, make the damn next thing an ATI CPU.....

    If the GPU is so powerful... then... wtf are they waiting for???
    Ever seen Folding @ Home on ATI GPUs? There ya go. They have released toolkits to do so. The thing with GPUs is that they are very powerful, very parallel "CPUs". They are only very good at doing a few specific tasks, most of the time with images and image-related things. The F@H GPU client can't crunch all the same Work Units the CPU version gets, because of limitations of what the GPU can do.
    Fold for XS!
    You know you want to
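
    To illustrate the "very parallel" point above, here is a minimal CUDA sketch (a hypothetical toy kernel, nothing taken from the F@H client): one trivial arithmetic operation mapped across a million elements, with thousands of GPU threads each handling one element. This is the shape of workload a GPU is very good at; anything that does not break into many independent pieces like this is where the limitations mentioned above come in.

    Code:
        #include <cstdio>
        #include <cuda_runtime.h>

        // Each thread scales exactly one array element; the GPU runs
        // thousands of these threads concurrently.
        __global__ void scale(float *data, float factor, int n)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n)
                data[i] *= factor;
        }

        int main()
        {
            const int n = 1 << 20;                      // ~1M elements
            float *d_data;
            cudaMalloc(&d_data, n * sizeof(float));
            cudaMemset(d_data, 0, n * sizeof(float));

            scale<<<(n + 255) / 256, 256>>>(d_data, 2.0f, n);
            cudaDeviceSynchronize();

            cudaFree(d_data);
            printf("done\n");
            return 0;
        }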

  6. #31
    Xtreme Enthusiast
    Join Date
    Aug 2005
    Location
    Melbourne, Australia
    Posts
    942
    It would be cool to see if wwww could implement CUDA support in wprime
    (Especially since it's already multithreaded)
    Q9550 || DFI P45 Jr || 4x 2G generic ram || 4870X2 || Aerocool M40 case || 3TB storage


  7. #32
    Xtreme Mentor
    Join Date
    Aug 2006
    Location
    HD0
    Posts
    2,646
    I can see Intel shaking in their boots.

  8. #33
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Lol, what exactly is the point?

    Pi became a common bench because it turned out to indicate gaming and memory performance with only a quick benchmark run. I'm curious about this, but just running Pi on a GPU might be pretty pointless and have no relation to real-world performance. It might scale when real apps don't scale anymore, or stop scaling when real apps still scale, or scale differently, etc.

    But a new bench is always exciting, and I'm looking forward to more CUDA stuff!
    I'm actually surprised there is barely any app that uses a GPU to crunch data, even though ATI and Nvidia have both supported this for years... I know it's hard to code, but still, the performance you can tap into with such an app is quite big, so it should be worth the effort?

  9. #34
    Xtreme Cruncher
    Join Date
    Jul 2003
    Location
    Finland, Eura
    Posts
    1,744
    I'm still waiting for a QMC BOINC CUDA client, please somebody make that happen


    http://mato78.com - Finnish PC Hardware news & reviews
    BulldogPO @ Twitter


  10. #35
    Xtreme Addict
    Join Date
    May 2004
    Posts
    1,756
    Quote Originally Posted by SKYMTL View Post
    Naming it with the CUDA moniker is the same as advertising IMO.
    QFT.

  11. #36
    Xtreme Member
    Join Date
    Oct 2006
    Posts
    169
    First F@H, and now this. Nvidia's marketing team works very, very well indeed.
    Let's just hope they'll deliver.

    -tam2-

  12. #37
    Banned
    Join Date
    May 2005
    Location
    Belgium, Dendermonde
    Posts
    1,292
    Before everyone thinks GPGPU is the future, I would like to point some things out:

    1) GPUs are good at calculating parallel stuff.
    2) GPUs truly, truly suck when it comes to branching, because when you branch you stall every other shader... (see the sketch below)
    3) CUDA is slower than when you implement a similar thing in DirectX, and you can also program GPUs with DirectX instead of using CUDA.

    And ATI has something like CUDA, called CTM (Close To Metal); too bad you have to program in assembler with CTM, while in CUDA you can program in C & C++.
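
    A minimal sketch of point 2, branch divergence (a made-up kernel, not from any real application): threads that sit in the same 32-thread warp but take different sides of an if are serialized, so the hardware runs both paths back to back while the threads on the other path idle. Strictly speaking the stall is per warp rather than across every shader on the chip, but the point stands: branchy code maps poorly to GPUs.

    Code:
        #include <cuda_runtime.h>

        // Odd and even threads within one warp take different branches,
        // so the two paths execute one after the other instead of in parallel.
        __global__ void divergent(float *out, int n)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i >= n)
                return;

            if (i % 2 == 0)
                out[i] = sinf((float)i);   // even threads run this path...
            else
                out[i] = cosf((float)i);   // ...then odd threads run this one
        }

        int main()
        {
            const int n = 1024;
            float *d_out;
            cudaMalloc(&d_out, n * sizeof(float));
            divergent<<<n / 256, 256>>>(d_out, n);
            cudaDeviceSynchronize();
            cudaFree(d_out);
            return 0;
        }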

  13. #38
    Xtreme Cruncher
    Join Date
    Nov 2005
    Location
    Rhode Island
    Posts
    2,740
    Quote Originally Posted by GoThr3k View Post
    Before everyone thinks GPGPU is the future, I would like to point some things out:

    1) GPUs are good at calculating parallel stuff.
    2) GPUs truly, truly suck when it comes to branching, because when you branch you stall every other shader...
    3) CUDA is slower than when you implement a similar thing in DirectX, and you can also program GPUs with DirectX instead of using CUDA.

    And ATI has something like CUDA, called CTM (Close To Metal); too bad you have to program in assembler with CTM, while in CUDA you can program in C & C++.

    So ATI F@H is programmed in Assembly? Anyways, that keeps the nubs out of programming
    Fold for XS!
    You know you want to

  14. #39
    I am Xtreme
    Join Date
    Feb 2005
    Location
    SiliCORN Valley
    Posts
    5,543
    Quote Originally Posted by [XC] Lead Head View Post
    Ever seen Folding @ Home on ATI GPUs? There ya go. They have released toolkits to do so. The thing with GPUs is that they are very powerful, very parallel "CPUs". They are only very good at doing a few specific tasks, most of the time with images and image-related things. The F@H GPU client can't crunch all the same Work Units the CPU version gets, because of limitations of what the GPU can do.
    I have seen a few posts on it (the folding client), but to me that doesn't prove a lot. In one specific area, yes.

    And yes, I understand the GPU being image/graphics limited; my comment was more one of hope than misunderstanding or disbelief.

    BUT taking that same knowledge and mentality and making a CPU can't be so life-altering that Intel/AMD can't do it.

    I fully understand and appreciate the technology and how different chips are designed to do different things, but we do not live, and never have, on a flat plane... the world is round... to believe that we can't do something will force us to stay in the dark ages of technology.

    And yes, I have read a few tidbits here and there about tapping into the GPU for extra computing power.
    Let's just hope they can expand on that in our lifetime, expand on it in such a way that everyday computing graduates to the next level.
    "These are the rules. Everybody fights, nobody quits. If you don't do your job I'll kill you myself.
    Welcome to the Roughnecks"

    "Anytime you think I'm being too rough, anytime you think I'm being too tough, anytime you miss-your-mommy, QUIT!
    You sign your 1248, you get your gear, and you take a stroll down washout lane. Do you get me?"

    Heat Ebay Feedback

  15. #40
    Xtreme Addict
    Join Date
    Aug 2005
    Location
    Somewhere on earth between Taipei, Paris and Montreal
    Posts
    1,194
    Interesting! Hope this will work fine and turns out to be something worthwhile.

    There is more and more software using GPGPU to help the CPU under heavy load, like the Adobe Premiere Pro plugin for HD video, Photoshop CS4 (S Next), Folding@home, etc...

    Maybe the next version of SiSoft Sandra will have a measurement for such "GPGPU" calculations?
    Overclocking-TV Staff
    Quote Originally Posted by hipro5 View Post
    Ha, ha, ha.....NO WAY.....When I show someone preparing to take a shot, I hided the cigarette....

  16. #41
    Xtreme X.I.P.
    Join Date
    Apr 2005
    Posts
    4,475
    And ATI has something like CUDA, called CTM (Close To Metal); too bad you have to program in assembler with CTM, while in CUDA you can program in C & C++.
    CTM allows both low- and high-level programming, as opposed to high-level only for CUDA.

  17. #42
    Xtreme Member
    Join Date
    Apr 2006
    Location
    Belgrade, Serbia
    Posts
    187
    One very important thing to note:

    You will have to have (at least) two NVIDIA GPUs in your system to run SuperPi or any other CUDA application on a GPU for more than 5 seconds.

    Primary display adapter in Windows cannot work under full load for more than 5 seconds without the task being aborted by the operating system because the OS assumes that the driver has got stuck.

    Such a limitation does not apply to the secondary adapter. Therefore, you will be able to use only secondary adapter for running any CUDA applications which take more than 5 seconds to complete their workload.

    That means you will have to invest in a dual PCI-Express x16 mainboard and into another NVIDIA card, even if it is only 9600GT (or cheaper) for the primary display adapter.

    And when you are already investing, why not go SLI? That way NVIDIA sells two cards and a mainboard chipset. Nice way to boost sales.

    Some of you already have SLI, and some may want to get it because of this "exciting" announcement so a word of warning to you:

    1. You won't be able to run CUDA applications with SLI enabled EVER. Each CUDA application must manage multiple GPUs on its own.

    2. Multi-GPU CUDA applications require that each GPU thread be associated with a distinct CPU thread. It means that for maximum performance on a Quad GPU setup you would need Quad-Core CPU as well.
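
    A minimal sketch of point 2 (one CPU thread per GPU), using a made-up kernel rather than anything from SuperPi or an announced CUDA port: each host thread calls cudaSetDevice() for its own GPU and then drives that GPU independently. With the CUDA releases of that era this one-thread-per-device mapping was mandatory; later toolkits relaxed it.

    Code:
        #include <cstdio>
        #include <thread>
        #include <vector>
        #include <cuda_runtime.h>

        __global__ void busywork(float *data, int n)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n)
                data[i] = data[i] * 2.0f + 1.0f;
        }

        // One host thread drives exactly one GPU.
        void worker(int device)
        {
            cudaSetDevice(device);                  // bind this CPU thread to its GPU
            const int n = 1 << 20;
            float *d;
            cudaMalloc(&d, n * sizeof(float));
            cudaMemset(d, 0, n * sizeof(float));
            busywork<<<(n + 255) / 256, 256>>>(d, n);
            cudaDeviceSynchronize();
            cudaFree(d);
            printf("GPU %d done\n", device);
        }

        int main()
        {
            int count = 0;
            cudaGetDeviceCount(&count);
            std::vector<std::thread> threads;
            for (int dev = 0; dev < count; ++dev)
                threads.emplace_back(worker, dev);
            for (auto &t : threads)
                t.join();
            return 0;
        }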

  18. #43
    I am Xtreme
    Join Date
    Feb 2005
    Location
    SiliCORN Valley
    Posts
    5,543
    Quote Originally Posted by audiofreak View Post
    One very important thing to note:

    You will have to have (at least) two NVIDIA GPUs in your system to run SuperPi or any other CUDA application on a GPU for more than 5 seconds.

    Primary display adapter in Windows cannot work under full load for more than 5 seconds without the task being aborted by the operating system because the OS assumes that the driver has got stuck.

    Such a limitation does not apply to the secondary adapter. Therefore, you will be able to use only secondary adapter for running any CUDA applications which take more than 5 seconds to complete their workload.

    That means you will have to invest in a dual PCI-Express x16 mainboard and into another NVIDIA card, even if it is only 9600GT (or cheaper) for the primary display adapter.

    And when you are already investing, why not go SLI? That way NVIDIA sells two cards and a mainboard chipset. Nice way to boost sales.

    Some of you already have SLI, and some may want to get it because of this "exciting" announcement so a word of warning to you:

    1. You won't be able to run CUDA applications with SLI enabled EVER. Each CUDA application must manage multiple GPUs on its own.

    2. Multi-GPU CUDA applications require that each GPU thread be associated with a distinct CPU thread. It means that for maximum performance on a Quad GPU setup you would need Quad-Core CPU as well.
    Ewwwwww....

    That's not good....
    Pi doesn't run at full core % though, maybe we'll be ok.. (at least it doesn't for the CPU version)
    Still an interesting twist.

    That would be a neat tool, since you touched on it: a GPU-based task manager... something to show GPU load, processes and memory usage.

    Someone get on that lol...
    "These are the rules. Everybody fights, nobody quits. If you don't do your job I'll kill you myself.
    Welcome to the Roughnecks"

    "Anytime you think I'm being too rough, anytime you think I'm being too tough, anytime you miss-your-mommy, QUIT!
    You sign your 1248, you get your gear, and you take a stroll down washout lane. Do you get me?"

    Heat Ebay Feedback

  19. #44
    Xtreme 3D Team Member
    Join Date
    Jun 2006
    Location
    small town in Indiana
    Posts
    2,285
    Quote Originally Posted by Lestat View Post
    Ewwwwww....

    That's not good....
    Pi doesn't run at full core % though, maybe we'll be ok.. (at least it doesn't for the CPU version)
    Still an interesting twist.

    That would be a neat tool, since you touched on it: a GPU-based task manager... something to show GPU load, processes and memory usage.

    Someone get on that lol...
    Somebody did: RivaTuner charting shows it, along with more, as does GPU-Z in the sensors page (ver .22 is out now).
    QX 9650 5ghz with 1.55v 4.8ghz with 1.5v 24/7 in a VAPOLI V-2000B+ Single stage phase cooling.
    DFI LP LT X-38 T2R
    2X HD4850's water cooled , volt modded
    Thermaltake 1KW Psu
    4x Seagate 250GB in RAID 0
    8GB crucial ballistix ram
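
    For the "memory usage" part of that wish list, the CUDA runtime itself exposes a little of it. A minimal sketch (this is not how RivaTuner or GPU-Z do it; they read far more through the driver, and GPU load / per-process info is not available from the runtime API at all):

    Code:
        #include <cstdio>
        #include <cuda_runtime.h>

        int main()
        {
            int count = 0;
            cudaGetDeviceCount(&count);
            for (int dev = 0; dev < count; ++dev) {
                cudaSetDevice(dev);

                cudaDeviceProp prop;
                cudaGetDeviceProperties(&prop, dev);

                size_t free_b = 0, total_b = 0;
                cudaMemGetInfo(&free_b, &total_b);   // free/total memory on the current device

                printf("GPU %d: %s, %zu MB free of %zu MB\n",
                       dev, prop.name, free_b >> 20, total_b >> 20);
            }
            return 0;
        }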

  20. #45
    Xtreme Guru
    Join Date
    Dec 2002
    Posts
    4,046
    SuperPi on CPU: battle of the seconds

    SuperPi on GPU: battle of the milliseconds

    So who'll break 9 ms?

  21. #46
    Xtreme Member
    Join Date
    Mar 2008
    Location
    Germany
    Posts
    351
    Is calculating pi a process that's easily multi-threadable? (see the sketch below)

    That thing about needing a second GPU kinda killed my interest at the last second. Hope they can make a workaround and we see good software.
    X3350 | DFI LP X38 T2R | d9gkx
    9800gtx | Raptor1500AHFD/5000AACS/WD3201ABYS
    Corsair 620HX | Coolermaster CM690
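
    On the multi-threading question above: whatever algorithm SuperPi itself uses is a separate matter, but computing pi is certainly parallelizable. A toy sketch (my own illustration, nothing to do with SuperPi's method) that approximates pi by numerically integrating 4/(1+x^2) over [0,1], with every GPU thread summing its own slice of the intervals (the doubles need a double-precision-capable GPU):

    Code:
        #include <cstdio>
        #include <vector>
        #include <cuda_runtime.h>

        // Midpoint-rule integration of 4/(1+x^2) over [0,1], which equals pi.
        // Each thread accumulates a strided subset of intervals into its own slot.
        __global__ void pi_partial(double *partial, long long steps)
        {
            long long tid    = blockIdx.x * blockDim.x + threadIdx.x;
            long long stride = (long long)gridDim.x * blockDim.x;
            double h = 1.0 / (double)steps;
            double sum = 0.0;
            for (long long i = tid; i < steps; i += stride) {
                double x = ((double)i + 0.5) * h;
                sum += 4.0 / (1.0 + x * x);
            }
            partial[tid] = sum * h;
        }

        int main()
        {
            const int blocks = 64, threads = 256;
            const int slots = blocks * threads;
            const long long steps = 100000000LL;    // 1e8 intervals

            double *d_partial;
            cudaMalloc(&d_partial, slots * sizeof(double));
            pi_partial<<<blocks, threads>>>(d_partial, steps);

            std::vector<double> h_partial(slots);
            cudaMemcpy(h_partial.data(), d_partial, slots * sizeof(double),
                       cudaMemcpyDeviceToHost);

            double pi = 0.0;
            for (int i = 0; i < slots; ++i)
                pi += h_partial[i];

            printf("pi ~= %.12f\n", pi);
            cudaFree(d_partial);
            return 0;
        }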

  22. #47
    Banned
    Join Date
    May 2005
    Location
    Belgium, Dendermonde
    Posts
    1,292
    Kinda strange you need a second GPU; some of my comrades using CUDA don't need that.

  23. #48
    Xtreme Member
    Join Date
    Jul 2006
    Location
    Brazil
    Posts
    118
    Nice.. my poor 8500GT is on the list.

  24. #49
    Xtreme Enthusiast
    Join Date
    Apr 2008
    Posts
    912
    Can't see why you'd need a second GPU - just don't stress it to 100%, it looks like.

    If you need a second GPU to run any GPGPU application, I wonder how the ATI F@H client does it, or how Folding on Nvidia will work.

  25. #50
    Xtreme Addict
    Join Date
    Feb 2007
    Posts
    1,674
    Can't you just get a cheapo PCI card for your primary display?
