There’s an immense amount of data out there, and many companies scaling the mountain are working with modern-day picks and shovels, trying to extract value from this new form of economic resource. In fact, forward-looking organisations have been leveraging insights from big data for years, even decades.
It’s been a continual evolution, a journey driven by ever-larger volumes of data, along with increasing velocity and variety. Whereas the highly structured, clean, precise, and validated data generated by business transactions and the like sits in the more familiar data marts and data warehouses, companies are now collecting raw data from a range of sources outside the immediate corporate bounds, created by things, people, or processes.
Examples abound of hugely successful enterprises such as Airbnb and Uber that extract their core business value from data and networks without owning physical assets. Not long ago, an activity like a cab ride meant you hailed one on the street, told the driver your destination, and paid in cash. No data. Today, you use your app, track the route via GPS, pay with a credit card, and then rate the driver on social media – and the driver can also rate you as a passenger. Four types of data are created by just that one activity. Large volumes of this raw unstructured data reside in data reservoirs or data lakes, and new methods of extracting meaning from these pools form a large part of the big data story.
As with any tech evolution, along with the pioneers who are a long way down the road, other companies find themselves at different stages of the journey. Some are just taking their first steps. They know they need to start somewhere, and understand that data is a valuable corporate asset. It’s a bit like sorting through stacks of old newspapers and magazines stored away in the garage. Standing in front of a mound of unsorted stuff is always daunting. But the scale of the challenge is a matter of perception, not a technical reality.
Other companies seem to meet roadblocks. With computing resources already at capacity, businesses believe they need to buy additional infrastructure, storage, and computing power, and find a data scientist to complement their existing IT specialists. Even with this heightened awareness, companies are still throwing data away because they don’t have the appropriate tools and technology.
The reality is that the tools and technologies are already available to support any big data journey if you know where to look.
Big data and cloud
What’s new is the broader accessibility of these tools in the cloud. Cloud computing means that infrastructure resources are immediately at hand, and when those cloud resources take the form of a big data lab then the ideal solution is already in place. Now, big data environments can be deployed in days to handle diverse types of data without lengthy configuration, available as pre-built environments that people can just go ahead and use. All that’s needed is to hand the keys over to the business user – not a data scientist.
With the best solutions, users do not have to worry about the detailed mechanics happening behind the scenes, and they have an elastic resource that expands and contracts with immediate needs. The user can focus their attention on a single pane of glass that coordinates the analytics process, and on what matters most: the business at hand, as opposed to technology and infrastructure. All of this is essential to getting fast value from data.
The start of a journey
That’s what we call a Data Lab, and it’s available from Oracle in the Oracle Cloud. In a cloud environment, big data is not a single product but rather a portfolio of technologies that transparently correlates diverse data sets into business insights. The big data discovery journey is guided by the integration of data sources, allowing visualisation in a way that makes sense to the business user. But above and beyond that, it’s an environment that lets you experiment and quickly work through a process of elimination.
It’s a turning point very similar to Edison’s invention factory at Menlo Park in 1876, where he vowed to turn out a minor invention every ten days and a major one every six months. He ultimately achieved his goal, producing many inventions that shaped the modern world, including the familiar light bulb we now rely on in our daily lives.
A modern-day innovation lab encourages fun and creative analytics that scale and connect to your business. If the underlying data is correct, the answers derived from it should be too. But there’s no official start point, and no official finishing line. It’s OK to have just a rough compass bearing indicating the general direction. The first question might not be the right question, and you will discover more questions along the way. Trial and error is permitted. The remit is simply to start the journey and use it: to find the correlations and answers that are right for your business, and that tell you where to go next. Both line-of-business users and data scientists can use the Data Lab, bringing it into the mainstream and integrating it with corporate planning and business execution.
The data analyst and the line-of-business user team up as power players to find new ways to address challenges, innovate how the company does business, and find new revenue streams. In fact, the true success metric is how many new citizen data scientists are created.
Data projects contribute to the corporate competitive edge by adding value to existing products, improving processes, and helping invent new products. Data collected from customer interactions can be leveraged to engage customers better, increase revenue, and improve loyalty and customer satisfaction.
With so much data being stored and actively processed, one additional concern is security. Organisations go to great lengths to secure their core data assets within on-premises IT, and wonder whether cloud security matches those standards. However, the reality is that for an organisation primarily involved in finance or commerce, the focus is necessarily on its core business, not IT. Cloud providers such as Oracle have a long history of enterprise IT systems development and are specialists in security, with large teams dedicated to making sure the entire computing environment is secure, from the periphery of a data centre through all three layers of the stack: software, middleware, and hardware.
In addition, concerns about the physical location of data in the cloud can also be allayed. Big data analysis can take place over distributed stores of data, often close to the collection point, with processing done locally, depending on the data type.
According to Forrester, a leading independent research firm, enterprise data warehouses are now evolving beyond traditional data storage and delivery, with scale, performance, and innovation distinguishing the leaders. Oracle was the top-ranked vendor in the current offering category in The Forrester Wave: Enterprise Data Warehouse, Q4 2015. Furthermore, Oracle was the top-ranked in the strategy category in the same report.
A new form of capital
No matter the size of an organisation, business sector or geographic location, making decisions without a comprehensive set of data is like shooting in the dark hoping to hit your mark, or at best making an educated guess. Data should be treated as a new form of capital, to be invested throughout the enterprise for competitive advantage. And the scope must encompass data outside of the traditional sources and formats.
Like it or not, every organisation will be affected by the big data revolution. The only question is whether you want to rely solely on your traditional data warehouses to get partial answers to partial questions, or proactively extend them to harness the value of all your data.
The writer is a director at Oracle.