AI is still in its early days, Krzanich writes, and the underlying hardware that's used to execute deep learning tasks is bound to change. "Some scientists have used GPGPUs [general purpose graphics processing units] because they happen to have parallel processing units for graphics, which are opportunistically applied to deep learning," he writes. "However, GPGPU architecture is not uniquely advantageous for AI, and as AI continues to evolve, both deep learning and machine learning will need highly scalable architectures." via HPC Wire