NeuReality: Fast Inference Processing For 90% Less?

  • 📰 ForbesTech


Inferencing, Low-Cost, Power Efficient

I love to learn about and share the amazing hardware and services being built to enable Artificial Intelligence, the next big thing in technology.

Most of the investment buzz in AI hardware concentrates on the amazing accelerator chips that crunch the math required for neural networks, like Nvidia’s GPUs. But what about the rest of the story? CPUs and NICs that pre- and post-process the query add significant costs and are not designed for AI; they are general-purpose devices and can cost tens of thousands of dollars per server.

An Israeli startup called NeuReality, led by Moshe Tanach, has tackled exactly this problem, and the results are impressive. Instead of a "CPU-centric" architecture, the company front-ends each deep learning accelerator with dedicated silicon. NeuReality describes its approach as "Network Addressable Processing Units" (NAPUs) and has measured the potential performance and cost savings.
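To make the cost argument concrete, here is a minimal sketch of the kind of per-server cost comparison the article implies. All the dollar figures and the `cost_per_server` helper are hypothetical illustrations, not NeuReality's published measurements; the point is only that shrinking the general-purpose host share (CPUs and NICs) is what drives the savings.

```python
# Conceptual cost model (illustrative numbers only, not NeuReality's data).
# In a CPU-centric server, general-purpose CPUs and NICs handle the pre-
# and post-processing around each accelerator; a NAPU-style design moves
# that work into dedicated silicon, so the expensive host share shrinks.

def cost_per_server(accelerator_cost: int, host_cost: int) -> int:
    """Total hardware cost of one inference server: accelerator + host parts."""
    return accelerator_cost + host_cost

# Hypothetical figures purely for illustration
cpu_centric = cost_per_server(accelerator_cost=10_000, host_cost=30_000)
napu_based = cost_per_server(accelerator_cost=10_000, host_cost=3_000)

# Fractional savings from replacing the general-purpose host components
savings = 1 - napu_based / cpu_centric
print(f"server cost savings: {savings:.0%}")
```

Under these made-up numbers the accelerator cost is unchanged; the savings come entirely from the host side, which is the article's central claim about where the overlooked cost lives.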
