Amazon offers cloud tools for customers to build their own chatbots

Amazon Web Services (AWS) has released cloud technologies aimed at helping its customers develop their own chatbot and image-generation services powered by artificial intelligence (AI).
Microsoft and Alphabet’s Google are adding AI chatbots to consumer products like their search engines, but they are also eyeing another huge market: selling the underlying technology to other companies via their cloud operations, Reuters reported.

AWS, the world’s biggest cloud computing provider, said its Amazon Bedrock service enables businesses to customize what are called foundation models – the core AI technologies that do things like respond to queries with human-like text or generate images from a prompt – with their own data to create a unique model.

ChatGPT creator OpenAI, for example, offers a similar service, letting customers fine-tune the models behind ChatGPT to create a custom chatbot.

AWS did not reveal pricing for the Bedrock service.

How it works

Amazon Bedrock provides the flexibility to choose from a range of foundation models (FMs) built by leading AI startups and Amazon, so customers can select the model best suited to their use case. Because Bedrock is serverless, customers can get started quickly, privately customize FMs with their own data, and integrate and deploy them into their applications using familiar AWS tools and capabilities, without managing any infrastructure.
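
To make the workflow concrete, here is a minimal sketch of invoking a foundation model through Bedrock's runtime API with boto3, the AWS Python SDK. The article does not describe the API surface, so the client name, the Titan model ID, and the request and response fields shown are assumptions based on the SDK that later shipped, not details from the announcement.

```python
# Sketch: calling a Bedrock-hosted foundation model via boto3.
# Model ID and request/response schema for Amazon Titan are illustrative.
import json

import boto3

# The "bedrock-runtime" client handles model invocation; Bedrock itself
# provisions and scales the serving infrastructure behind the scenes.
runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

request_body = {
    "inputText": "Summarize our refund policy in two sentences.",
    "textGenerationConfig": {
        "maxTokenCount": 256,
        "temperature": 0.2,
    },
}

response = runtime.invoke_model(
    modelId="amazon.titan-text-express-v1",  # illustrative Titan model ID
    contentType="application/json",
    accept="application/json",
    body=json.dumps(request_body),
)

# The response body is a streaming payload; read it and decode the JSON.
result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```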

AWS said the Bedrock service will let customers work with Amazon’s own proprietary foundation models, called Amazon Titan, but it will also provide a menu of models from other companies. The first third-party options will come from startups AI21 Labs, Anthropic and Stability AI, alongside Amazon’s own models.
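
Browsing that menu programmatically might look like the sketch below, which uses boto3's Bedrock control-plane client to list available models by provider. The list_foundation_models call and its response fields are assumptions drawn from the SDK that later shipped rather than from this announcement.

```python
# Sketch: listing Bedrock's foundation-model menu with boto3.
# Response field names (modelSummaries, providerName, modelId) are illustrative.
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

# Providers named in the announcement: Amazon (Titan), AI21 Labs,
# Anthropic, and Stability AI.
for summary in bedrock.list_foundation_models()["modelSummaries"]:
    print(f'{summary["providerName"]:<15} {summary["modelId"]}')
```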

The Bedrock service lets AWS customers test-drive those technologies without having to deal with the underlying data center servers that power them.

“It’s unneeded complexity from the perspective of the user,” Vasi Philomin, vice president of generative AI at AWS, told Reuters. “Behind the scenes, we can abstract that away.”

Those underlying servers will use a mix of Amazon’s own custom AI chips and chips from Nvidia, the biggest supplier of chips for AI work, which have been in tight supply this year.

“We’re able to land tens of thousands, hundreds of thousands of these chips, as we need them,” said Dave Brown, vice president of Elastic Compute Cloud at AWS. “It is a release valve for some of the supply-chain concerns that I think folks are worried about.”