IBM reveals details about Telum processor for AI inferencing

IBM on Monday unveiled details of the upcoming IBM Telum Processor that contains on-chip acceleration for AI inferencing while a transaction is taking place.
The IBM Telum processor, revealed Monday at the annual Hot Chips conference, is designed to help customers achieve business insights at scale across banking, finance, trading, and insurance applications, as well as customer interactions. A Telum-based system is planned for the first half of 2022.

According to recent Morning Consult research commissioned by IBM, 90 percent of respondents said that being able to build and run AI projects wherever their data resides is important. IBM Telum is designed to let applications run efficiently where the data resides, helping to overcome traditional enterprise AI approaches, which tend to require significant memory capacity and data movement to handle inferencing.

Enterprises can use Telum to conduct high-volume inferencing for real-time, sensitive transactions without invoking off-platform AI solutions, which may impact performance. Clients can also build and train AI models off-platform, then deploy them and run inference on a Telum-enabled IBM system for analysis.
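To illustrate the in-transaction inferencing pattern described above, the sketch below scores a payment for fraud synchronously, inside the transaction path, instead of calling out to an off-platform AI service after the fact. This is a minimal, hypothetical example: the function names, features, and toy logistic model are illustrative assumptions, not IBM APIs, and a real deployment would replace `score_transaction` with a call to an on-chip accelerated model.

```python
from math import exp

# Toy model parameters -- illustrative only, not a real fraud model.
WEIGHTS = {"amount_zscore": 1.2, "new_merchant": 0.8, "foreign_ip": 1.5}
BIAS = -2.0
THRESHOLD = 0.5  # risk score above which the transaction is blocked


def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + exp(-x))


def score_transaction(features: dict) -> float:
    """Toy logistic scorer standing in for an on-chip inference call."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return sigmoid(z)


def process_transaction(txn: dict) -> str:
    # Inference runs synchronously, while the transaction is in flight,
    # so a suspicious payment can be blocked before it completes --
    # prevention rather than after-the-fact detection.
    risk = score_transaction(txn["features"])
    return "BLOCK" if risk >= THRESHOLD else "APPROVE"


suspicious = {"id": "T1001",
              "features": {"amount_zscore": 3.0, "new_merchant": 1.0,
                           "foreign_ip": 1.0}}
routine = {"id": "T1002", "features": {"amount_zscore": 0.1}}

print(process_transaction(suspicious))  # high risk: blocked in-flight
print(process_transaction(routine))     # low risk: approved
```

The key design point is that scoring happens in the request path rather than in a separate batch or remote service, which is what removes the latency penalty the article attributes to off-platform solutions.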

According to the Federal Trade Commission’s 2020 Consumer Sentinel Network Databook, consumers reported losing more than $3.3 billion to fraud in 2020, up from $1.8 billion in 2019. Telum can help clients to move their thinking from a fraud detection posture to a fraud prevention posture.

The chip features an innovative centralized design, which allows clients to leverage the full power of the AI processor for AI-specific workloads, making it ideal for financial services workloads like fraud detection, loan processing, clearing and settlement of trades, anti-money laundering and risk analysis.

The chip contains 8 processor cores with a deep super-scalar, out-of-order instruction pipeline running at a clock frequency above 5GHz, optimized for the demands of heterogeneous enterprise-class workloads.

The redesigned cache and chip-interconnection infrastructure provides 32MB of cache per core and can scale to 32 Telum chips. The dual-chip module design contains 22 billion transistors and 19 miles of wire across 17 metal layers.

Telum is the first IBM chip built with technology created by the IBM Research AI Hardware Center. Samsung is IBM's technology development partner for the Telum processor, which is fabricated in a 7nm EUV technology node.
