Deep learning to be a must for future businesses

Deep learning is becoming increasingly popular in both projects and hiring. Part of this rapid evolution is the result of big research labs such as Facebook and IBM investing heavily in the field.

In the business world, about 30 percent of data science platform vendors have a first version of deep learning in their products, according to research firm Gartner.

Last week, the US Food and Drug Administration approved a new machine learning application for medical imaging. The medical imaging platform, developed by Arterys, has been approved for use in helping doctors diagnose heart problems.

The platform is a cloud-based self-teaching neural network that has learned from 1,000 cases thus far. According to a biotechin.asia report, the platform will continue to learn and improve as it examines more cases.

The US space agency NASA is using deep learning tools from the startup Neurala to develop a software controller for robotic rovers that could autonomously explore Mars.

Owing to this popularity, the number of deep-learning jobs has grown from practically zero in 2014 to around 41,000 today.


Research and Markets expects the deep learning market to be worth $1.72 billion by 2022, growing at a CAGR of 65.3 percent between 2016 and 2022.

The hardware segment of the deep learning market is also expected to grow, owing to the increasing need for hardware platforms with high computing power to run deep learning algorithms.

Some of the companies involved in the development of hardware for the deep learning technique are Google, Microsoft, Intel, Qualcomm and IBM.

Growth boosters

The major factors driving the deep learning market globally are robust R&D into better processing hardware and the increasing adoption of cloud-based technology for deep learning.

Most recently, IBM expanded its machine learning portfolio with TensorFlow. The enterprise IT vendor said its PowerAI distribution of popular open-source machine learning and deep learning frameworks on the POWER8 architecture will now support the TensorFlow 0.12 framework, originally created by Google.

Besides these, the increasing use of deep learning in data analytics, cyber security, fraud detection and database systems is fuelling the growth of data mining applications in the deep learning market.

The medical industry, for instance, generates huge volumes of data related to medication, patient details and diagnoses.

Deep learning and machine learning

Deep learning is not a stand-alone technology. In fact, it’s a branch of machine learning. Gartner says companies should look to machine learning as a potential service offering.

Potential applications include anomaly detection, speech control and queries, sentiment analysis and facial recognition.

Data and analytics leaders should begin looking for potential opportunities to incorporate deep learning in the organization, specifically any critical business problems with significant “perceptual components.”

They can also consider acquiring start-ups to gain talent and technology.

US technology giant Apple has moved ahead with this strategy. In September, Apple acquired Tuplejump, a Hyderabad-based machine learning startup whose software helps companies store, process and visualise big data.


Apple is on a machine learning buying spree, having also acquired the start-ups Perceptio and Turi.

Meanwhile, Microsoft has bought Montreal-based Maluuba, a research-oriented startup focused on deep learning.

In a blog post, the tech giant wrote, “Maluuba’s expertise in deep learning and reinforcement learning for question-answering and decision-making systems will help us advance our strategy to democratize AI and to make it accessible and valuable to everyone — consumers, businesses and developers.”

Natural-language generation

Visualization has been a major driver of modern business intelligence (BI), but data in this form can be difficult to fully interpret.

At the same time, natural-language generation (NLG) can create a written or spoken narrative of data findings alongside the visualizations, producing a full story about key action items.
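As a rough, hypothetical illustration of the idea rather than any vendor's actual engine, the Python sketch below turns a pair of metric values into a one-sentence narrative; the metric names, figures and thresholds are invented for the example.

# Minimal, hypothetical sketch of natural-language generation over BI data:
# given last period's and this period's value for a metric, emit a
# plain-English sentence describing the change. Names and thresholds are
# illustrative only, not any product's actual behaviour.

def narrate(metric: str, previous: float, current: float) -> str:
    change = (current - previous) / previous * 100  # percent change
    if abs(change) < 1:
        trend = "was roughly flat"
    elif change > 0:
        trend = f"rose {change:.1f}%"
    else:
        trend = f"fell {abs(change):.1f}%"
    return f"{metric} {trend}, from {previous:,.0f} to {current:,.0f}."

if __name__ == "__main__":
    findings = {"Monthly active users": (120_000, 134_500),
                "Support tickets": (4_200, 3_950)}
    for metric, (prev, curr) in findings.items():
        print(narrate(metric, prev, curr))

Running the sketch prints sentences such as "Monthly active users rose 12.1%, from 120,000 to 134,500", the kind of narrative an NLG layer would place next to a chart.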

Currently, BI teams integrate stand-alone NLG engines, but as the technology evolves, that will change.

Gartner notes that by 2019, natural-language generation will be a standard feature of 90 percent of modern BI and analytics platforms, expanding the possibilities of deep learning.

NLG will enable next-generation BI and analytics platforms to automatically find, visualize and narrate important findings.

Therefore, data and analytics leaders should begin to integrate NLG with existing BI, data discovery and other tools, and monitor the capabilities and roadmaps of BI, data discovery and data science vendors, as well as emerging startups.

Arya MM
