http://www.aiseek.com/index.html
Now if I can just find room in my 4X4 machine with SLi/Crossfire, a Bigfoot NIC, Creative X-Fi sound card, Ageia PPU, and RAID card with processor I'll pick it up.
Smells like complete Pu.
Go away!!!
We don't want another add-in card!! :slapass: :nono: :stick: :slap: :mad:
There is going to be nothing left for the cpu to do if it continues at this rate!
I'd definitely put this in the "why?" category next to the Ageia PPU.
wow. how large will mobos be in the future??
Regardless...They won't have enough pci slots. ;)
I love the idea! But mobos won't have room for it. And after I get this, my Bigfoot NIC, my PPU, dual or quad graphics cards, a RAID accelerator, and an X-Fi (sound accelerator), what is the CPU going to do? Wash my dishes? Clean my room? What? Now, if all of the above had the ability to run Folding when not gaming... :slobber: so many WUs! :slobber: :slobber:
This is just the beginning of a new era. Once we get 8+ cores on our CPUs, everything will start to be integrated more.
The first thing I thought of back when the PPU was first announced was: what are they going to think of next... an AI card? And there it is...
Obviously, this Intia processor still has a lot to prove, but if it can improve AI in games over and above what a dedicated CPU core could possibly do [a la the PPU with physics], and if it results in improving gameplay on a revolutionary scale, who would be against that?
Even if I didn't have an open PCI slot [which I do], if it were all that, I'd make sure I had an open slot for it mighty quick. :stick:
As usual with these "add-ons" though, it'll depend a whole lot on game developer adoption as to whether this thing succeeds or not. I certainly hope it does succeed... I'm for anything that could improve gameplay.
I have only one graphics card, and the PCI slots on my board aren't accessible anymore, lol. The GFX card is in the second slot in order to fit a chipset cooler.
And not a single positive thought about this :rolleyes:
IMO this is a good idea, but as with the PPU, it doesn't have ANY software that we use NOW. I still think that with DX10 and multi-core CPUs there won't be any need for such devices in a home PC. But there are also other non-gaming simulations where this device could be used.
ummm better AI card?
what if HAL stops liking me? :rolleyes:
Edit: on a more technical note...
It would make Swarm far more realistic... and definitely cut down on the processor requirement...
(Subtitles Processing Unit) SPU is the next big thing! :D
Just imagine the performance boost! :eek:
We all love quality subtitles, but not all computers are powerful enough! :p:
Thumbs up if game devs can and will take advantage of it.
Popo > Lol. Subs is teh sh1t!
Yeah, in 3 years we'll be forced to buy 4 add-in cards for about 800 € to be able to play new games with decent fps.
LEAVE ME ALONE!
They'll start making built-ins first of all.
Second of all, they're approaching this exact solution wrong.
If this is an introduction to AI processing, that would be fine, but graph processing is not the way to go.
If they made an actual functional neural network, that I'd understand. And it's not hard to do, either; you just have to take the chance and apply the muscle.
If they made a neural-network-based device next... horizons would truly be endless.
And let's not see Ageia all over again, of course...
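For what it's worth, a "functional neural network" of the kind suggested above fits in a few lines of code. Here's a minimal sketch of a single fixed-weight feedforward inference pass, the sort of small, repeated workload a dedicated AI chip could in principle accelerate; the layer sizes, weights, and the take-cover scenario are all invented for illustration:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, hidden_weights, output_weights):
    """One inference pass: inputs -> hidden layer -> single output in (0, 1)."""
    hidden = [sigmoid(sum(w * x for w, x in zip(row, inputs)))
              for row in hidden_weights]
    return sigmoid(sum(w * h for w, h in zip(output_weights, hidden)))

# Hypothetical weights: an NPC estimating threat from
# (enemy_distance, own_health), both normalized to [0, 1].
hidden_w = [[-4.0, 2.0], [3.0, -5.0]]
output_w = [2.5, 2.5]

threat_level = forward([0.2, 0.3], hidden_w, output_w)
print(round(threat_level, 3))
```

A real game would run passes like this across hundreds of agents every frame, which is exactly where fixed-function hardware could pay off, at least in theory.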
Why don't you leave them alone if you don't like them? Is there a universal law I don't know about which forces you to play games at max settings?
Quote:
Originally Posted by RaZz!
Currently we are at a point where the CPU lacks the flexibility to do everything we want it to. This was also the case many years ago, when you might have had a CPU and 4 co-processors to do all the extra bits. Now we are at a similar stage, but once we get multi-core in full (4-8 cores standard), you'll have the capability to chuck all of this onto the CPU. Then you have an all-in-one system where all you need is a single CPU which incorporates CPU/NB/SB/GPU/PPU/AIPU/(whatever other PU you can think of). That will probably be the case for low-end systems, with add-in cards and separate chipsets being used in mid- to high-end systems. I think a separate GPU will always be necessary for intensive 3D situations until quite a long way into the future.
Partially true... one would think that for uber settings you would need to upgrade your graphics card far more than you would have to upgrade your CPU.
Quote:
Originally Posted by Cobalt
However, to this day you can play games perfectly fine (for the most part) on a 9800 Pro, which used only 110 million transistors, slightly more than the transistors used in a K8 processor (105 million). So in theory you could remove one K8 core (of a quad-core setup) and have room for a good GPU.
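The transistor arithmetic in that quote can be sanity-checked in a couple of lines (figures as claimed in the post, not verified die data):

```python
# Back-of-envelope check of the quoted transistor figures.
# Numbers are those claimed in the post above, not verified die data.
r350_transistors = 110_000_000  # Radeon 9800 Pro (R350), per the post
k8_transistors = 105_000_000    # single-core K8, per the post

# Dropping one K8 core from a quad frees a transistor budget only
# about 5 million short of a 9800 Pro-class GPU.
shortfall = r350_transistors - k8_transistors
print(shortfall)  # 5000000
```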
OK, then reserve add-in cards for just the high end...
You would be happy if you were forced to buy add-in cards for 800 € and more just to play a game? :eek:
Quote:
Originally Posted by xenolith
If add-in cards really establish themselves, this would be the situation...
damn it
my future dual Clovertown system can do everything in software
no need for some stupid PCI card
Personally I think this will fail like Ageia and lose a bunch of investors' money.
Game developers won't make their games require this kind of tech, because only .01% of the market would own it. Therefore it would only be an "enhancement", and I don't know anybody who would pay 300 dollars for some silly square dirt flying up when you shoot the ground, like in Ghost Recon Advanced Warfighter.
Who is forcing you to do anything? You do realize you can play all the latest games [with med-low settings] on a rig three years old, right?
Quote:
Originally Posted by RaZz!
I can't believe the negativity I'm seeing here toward technological advancements in discrete solutions. You guys do realize this is the XtremeSystems forum, right?
Unfortunately, I agree. I'm hopeful though that maybe somebody like AMD/ATI or Nvidia will buy these technologies and integrate them all into future high-end GPU platforms.
Quote:
Originally Posted by brandinb
Let's put one thing straight: Bigfoot's card is a 'nix box on a card, not exactly something you can integrate into a processor. That being said, I am excited about that card coming out. I just wish it were cheaper and PCIe.
Quote:
Originally Posted by xenolith