Thread: Intel Files Lawsuit Against Nvidia

  #27
    Xtreme Enthusiast | Join Date: Oct 2006 | Posts: 617
    supercomputers started off as single-core solutions, but now they have thousands of cores, often mixing CPUs from different manufacturers, and even nvidia GPUs are turning up in them.

    the single-socket desktop became parallel decades later - hyperthreaded single cores, duals, hyperthreaded duals, quads, hyperthreaded quads, and soon we'll have hyperthreaded hex-cores.

    so i think multi-GPU will stay, and expand. a multi-GPU setup is like a multi-processor server.
    going again by the principle that history repeats - in the past, if you wanted multiple CPU cores you had to buy a multi-socket motherboard; now you can get multi-core CPUs. in the past, if you wanted multiple GPUs you needed a multi-PCI-e motherboard, but in the future single cards carrying multiple GPUs will become more popular - ATI have already made multi-GPU their high-end card policy. quad-sli/xfire on a single PCB? it's coming.

    wanna know an upcoming gaming technology that'll get everybody buying multiple cards? stereoscopic vision. we'll have each card working on a different picture instead of a single picture being split between two cards, so scaling will be perfect. and in cases where two GPUs already scale well on a single picture, stereoscopic 3d will make quad-GPU setups interesting instead of a micro-stutter-fest.

    but nvidia have taken on AMD and Intel instead of just ATI by giving their GPUs a CPU feature - programmability. and it wasn't a half-assed effort: sub-$100 NV cards beat an overclocked core2quad at folding. NV GPU owners have a seriously powerful piece of silicon that sits unused when they're not gaming, but in theory it could accelerate any desktop app, and make the CPU obsolete in many.
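
    to make "programmability" concrete, here's a rough CUDA sketch (names and numbers are made up for illustration, it's not from any real folding client) - the point is that the work gets split into thousands of lightweight threads, one per data element, which is exactly the pattern that lets a cheap GPU embarrass a quad-core at this kind of maths:

        // illustrative CUDA kernel: one lightweight thread per element.
        // scale_add is a hypothetical name, not from any real application.
        #include <cstdio>
        #include <cuda_runtime.h>

        __global__ void scale_add(const float *x, const float *y, float *out,
                                  float a, int n)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
            if (i < n)
                out[i] = a * x[i] + y[i];                   // one tiny unit of work per thread
        }

        int main()
        {
            const int n = 1 << 20;                 // ~1 million elements
            size_t bytes = n * sizeof(float);

            float *x, *y, *out;
            cudaMalloc(&x, bytes);
            cudaMalloc(&y, bytes);
            cudaMalloc(&out, bytes);
            cudaMemset(x, 0, bytes);               // just so the inputs aren't garbage
            cudaMemset(y, 0, bytes);

            // launch enough 256-thread blocks to cover all n elements
            int threads = 256;
            int blocks  = (n + threads - 1) / threads;
            scale_add<<<blocks, threads>>>(x, y, out, 2.0f, n);
            cudaDeviceSynchronize();

            printf("launched %d blocks of %d threads\n", blocks, threads);
            cudaFree(x); cudaFree(y); cudaFree(out);
            return 0;
        }

    roughly that shape - lots of independent little calculations - is what most CUDA apps boil down to, and it's exactly what a GPU has hundreds of stream processors sitting around for.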

    i'm not sure how this will pan out for nvidia; they may not have enough money to invest to get CUDA into the market. they need to lower the cost of entry for software companies, i.e. they need to sponsor CUDA development for the really big companies, but that isn't cheap.

    intel is going to rain on their parade pretty hard with larrabee, which will already be compatible with x86, so today's programmers will find it easier to work with. can you imagine how much NV would like to have the world's biggest software company dedicating most of its time to programming CUDA? now you see NV's problem - microsoft dedicates most of its time to x86. larrabee may well accelerate microsoft windows, and any program that runs on windows, out of the box. if you're an occasional gamer, or not into graphically stressful first person shooters, do you buy a $200 NV card, or a $200 Intel card with the gfx power of a $100 NV card plus the ability to speed up any multi-threaded windows app, and multitasking in general?
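
    and to see why x86 compatibility matters so much to today's programmers, here's the same scale-and-add written as ordinary multi-threaded C++ (again just an illustrative sketch) - code in this style compiles with the tools people already use and would, in principle, run on anything x86 out of the box, whereas the CUDA version above needs a rewrite plus an NV card:

        // the same scale-and-add as the CUDA sketch above, written as plain
        // multi-threaded C++ with OpenMP - the kind of loop existing x86
        // compilers already spread across however many cores are present.
        // purely illustrative, not from any real application.
        #include <cstdio>
        #include <vector>

        int main()
        {
            const int n = 1 << 20;
            std::vector<float> x(n, 1.0f), y(n, 2.0f), out(n);

            // one compiler pragma parallelises the loop over all cores;
            // no new language, no new toolchain, no vendor-specific hardware
            #pragma omp parallel for
            for (int i = 0; i < n; ++i)
                out[i] = 2.0f * x[i] + y[i];

            printf("out[0] = %f\n", out[0]);
            return 0;
        }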

    in the longer run i agree with you that GPUs and CPUs will become the same thing. a GPU with CUDA is a great maths processor, and like the maths co-processor before it, it will end up on-die. there's so much waste in having the GPU and CPU separate - two memory controllers, two sets of memory, two programming languages...

    if i were Jen-Hsun Huang, there's one thing i would do today - i'd get a few hundred motherboards specially made, and find elaborate and entertaining ways of giving them away:
    a GTX285 core (maybe two, so it'd be the most powerful single-PCB card in existence) where the CPU should be, GDDR5 where the ram should be, and a slot that resembles a PCI-e x16 slot that takes one of two small add-in cards - one with an LGA1366 socket + 3 DDR3 slots, the other with an AM3 socket + 2 or 4 DDR3 slots. the GTX285 should have a tru120e on it, and the CPU should have a thin GPU-style cooler that cools the CPU, chipset and VRM.
    an essential marketing stunt
    Last edited by hollo; 02-22-2009 at 02:19 PM.
