Things to note before investing in on-premises big data capabilities

As firms transform their businesses digitally, they need to move beyond on-premises big data infrastructure to relocate data and insights. Thus, the migration of data and analytics to the public cloud that began in 2016 was still going strong in 2017 and will continue in 2018.

Reductions in cost and increases in analytics power are planting the seeds of disruption. While most firms think they have time to migrate, disruptors steal customers by leveraging big data cloud innovation such as serverless analytics and artificial intelligence (AI).

Spending on public cloud services and infrastructure will grow 24.4 percent to $122.5 billion in 2017, according to IDC.

Public cloud spending will achieve a 21.5 percent compound annual growth rate (CAGR) during 2016-2020 – nearly seven times the rate of overall IT spending growth – reaching $203.4 billion by 2020.

The United States will be the largest market for public cloud services, generating more than 60 percent of total revenues. Western Europe will spend $24.1 billion, while Asia Pacific excluding Japan will spend $9.5 billion on public cloud.

According to market research firm Forrester, the time has come to think exponentially and see the immediate need for action.

It warns enterprises to stop investing in on-premises big data capabilities right now. “We recognize this is not practical for many; therefore, you should at least redirect many of your on-premises plans toward public cloud platforms and hybrid interim solutions,” the firm said in a report.

Therefore, it suggests four steps for enterprises to consider before investing:

Select a basic cloud strategy

As a first step, firms need to develop a list of existing big data analytics workloads and candidate system-of-insight solutions, then look for common application objectives such as the need for business self-service or support for digital innovation or agility. Also, consider the need for rapid progress to support high-priority business opportunities as well as enterprise technology SLAs that demand control over the infrastructure for mission-critical applications.

Additionally, enterprises need to spend time creating an inventory of their existing big data analytics and insight applications on various public cloud services. This can be done by talking to your marketing, sales, customer experience, and customer insights teams, as they are likely using many SaaS solutions already. Also, find operational technology groups that may be using cloud services for M2M/internet-of-things (IoT) solutions, which 52 percent of enterprises say they already have or are planning for.
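As a minimal sketch of this first step, the workload inventory can be modeled as a simple list and grouped by shared objective to surface common migration candidates. The workload names and objectives below are hypothetical, purely for illustration:

```python
from collections import defaultdict

# Hypothetical inventory of existing big data analytics workloads.
# Each entry records a workload and its primary business objective.
workloads = [
    {"name": "campaign-attribution", "objective": "business self-service"},
    {"name": "churn-prediction", "objective": "digital innovation"},
    {"name": "fraud-scoring", "objective": "mission-critical SLA"},
    {"name": "sales-dashboards", "objective": "business self-service"},
]

def group_by_objective(items):
    """Group workloads by shared objective to spot common cloud-strategy candidates."""
    groups = defaultdict(list)
    for w in items:
        groups[w["objective"]].append(w["name"])
    return dict(groups)

# Workloads sharing an objective are candidates for the same basic cloud strategy.
print(group_by_objective(workloads)["business self-service"])
# → ['campaign-attribution', 'sales-dashboards']
```

Workloads that cluster under the same objective can then be evaluated together against enterprise SLAs and business priorities, as the step describes.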

Identify candidate cloud platform technologies

This is a key step when moving big data into the public cloud. Insight service provider platforms from Deloitte or Infosys and cloud BI platforms like Birst host your data and let your business perform analytics in a SaaS application. Through embedded BI, these platforms can also serve as development platforms for building lightweight business analytics insight applications, including predictive models.

Insight service providers use their platforms to build custom applications, while BI cloud platforms feature more self-service capability.

Global public cloud vendors like Google and IBM are integrating their data management and analytics offerings into insight PaaS.

Some cloud BI vendors like GoodData and Hadoop cloud solution vendors like Qubole are also adding more PaaS features — more developer tools, a greater number of analytic choices, and application runtime environments.

Adjust your big data analytics road map

The third step is to prioritize efforts that close the loop between data, customer insight, and action. For instance, the Golden State Warriors basketball organization uses beacon technology to track fans' movements through its arena, then uses those insights to make offers such as seat upgrades.

By combining customer data and external data sets available in the public cloud from vendors like 1010Data, marketing pros can learn a lot when fans either decline or accept the offers.

Also as part of the road map, try out different cloud platforms; unlike on-premises infrastructure, a public cloud lets you try before you buy.

If you are planning a multicloud strategy, try out different platforms hosted on different cloud infrastructures. For example, Qubole can run on AWS and Azure, but its cost-optimizing tools are more sophisticated on AWS.

Rinse and repeat for other basic cloud strategies

Instead of starting on a data lake and hoping that it yields benefits, plan to build several systems of insight on different cloud platforms.

For example, a retailer is using IBM’s Spark-based Watson Data Platform for business analytics while using Databricks for Spark-as-a-service data science and AI algorithm training. That’s OK, as long as you track both projects to ensure you know how to consolidate or connect them later.

As you get better at building systems of insight, you will naturally identify shared data and insight requirements. For example, an airline is capturing sales data in a common data platform and plans to make that knowledge available for use in customer insights projects across the organization, relieving the strain on limited customer insights team members.

Finally, design a big data fabric that weaves together technologies such as object stores, streaming services like Kinesis, and Spark-as-a-service. Ensure your fabric leverages common metadata services such as Azure Data Catalog and common security services like AWS Identity and Access Management.
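To make the fabric idea concrete, the design above can be sketched as a declarative map of layers, with a check that every layer shares the same metadata and security services. The layer names and service assignments below are illustrative assumptions, not a real deployment descriptor:

```python
# Illustrative big data fabric definition: each layer names its technology
# plus the common metadata and security services it is expected to use.
fabric = {
    "storage":   {"service": "object store",       "metadata": "Azure Data Catalog", "security": "AWS IAM"},
    "streaming": {"service": "Kinesis",            "metadata": "Azure Data Catalog", "security": "AWS IAM"},
    "compute":   {"service": "Spark-as-a-service", "metadata": "Azure Data Catalog", "security": "AWS IAM"},
}

def uses_common_services(fabric, keys=("metadata", "security")):
    """Return True only if every fabric layer shares the same service for each key."""
    for key in keys:
        values = {layer[key] for layer in fabric.values()}
        if len(values) != 1:
            return False
    return True
```

A check like this, run against an inventory of your actual platform configuration, helps catch layers that drift away from the shared metadata and security services the fabric depends on.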

Source: Forrester