DATA ANALYTICS | Modernization in a Big Data world
Although Big Data remains, for some, an overhyped term, the reality is that the explosion of data is unlike anything we’ve seen before, and it is here to stay. Companies big and small are looking to sustain value and competitive edge by leveraging all of their data assets, regardless of how big, how structured, or how fast-moving that data is. In today’s competitive markets, new upstarts have the luxury of building their infrastructure, governance and organizational processes from scratch to best monetize their data assets. For example, organizations like Expedia are able to increase conversion rates through offers that leverage real-time pricing information for hoteliers and historical customer data to predict which offers would be most relevant to consumers.
However, large organizations that have evolved their IT infrastructures over several decades don’t have it so easy. Much of their IT budget is spent operating legacy systems, and it is difficult for IT to keep up with the pace of change that Big Data demands, with analytics at the forefront of that change. These organizations need to modernize several key aspects of their business systems, processes, and culture; in essence, they must re-architect their enterprises to compete and thrive in a data-rich world.
Adapting to modern needs or habits
Everyone is talking about fact-based decision making, but are their organizations set up to do it in the most efficient way? Let’s face it: most organizations have old systems and infrastructure of different vintages, some of which haven’t been upgraded or replaced in decades. Meanwhile, upstarts and leading-edge competitors will leverage their data to out-maneuver them, moving smarter and faster, while laggards struggle with their systems and fumble through the new digital world of social media, mobile, and machine-to-machine communication, and its impact on delivering services and interacting with customers.
This also raises the question: is everyone getting the right information at the right time? Usually, at enterprise scale, the answer is no. Some systems in an organization may be better than others, and individual lines of business within a large organization may have good intelligence within their domain. Rarely will you find a large organization that has interconnected all of its data at an enterprise level. There have been huge strides in this area through data warehousing, but those systems are showing stress under the growing volumes of data and the business’s growing appetite for fact-based decisions. Most organizations rely on decision support systems that work against silos of data that are incomplete and out of date. Often these decisions serve the interests of an individual business unit or department, but are not what is best for the enterprise. Trustworthy and complete data has to be shared across the enterprise and externally with partners; otherwise no one has the right, or complete, picture.
Leveraging data and analytics allows companies to experiment on their customer populations in ways that were never before possible. This provides the opportunity to use an experimental approach to determine the outcomes of different tactics and offerings before they are rolled out fully. For example, I may choose to buy impressions on a website for a small section of my population, using a new method of segmentation, to see whether it generates a higher conversion rate from click, through offer, to purchase. It is this ability to test and learn that gives organizations the agility to find the best paths and double down on those strategies. Speed of execution is paramount and can make the difference between thriving in your market and being left in the dust, outsmarted by your competition.
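As a minimal sketch of that test-and-learn idea (the sample sizes and conversion counts below are hypothetical, and the helper function is mine, not a reference to any particular product), a simple two-proportion comparison in Python can indicate whether a new segmentation really lifts conversion before committing to a full rollout:

    import math

    def two_proportion_z(conv_a, n_a, conv_b, n_b):
        """Z-statistic comparing two conversion rates (test vs. control)."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        return (p_a - p_b) / se

    # Hypothetical pilot: new segmentation offer vs. existing offer
    z = two_proportion_z(conv_a=460, n_a=10_000,   # new segmentation: 4.6% conversion
                         conv_b=400, n_b=10_000)   # existing offer:   4.0% conversion
    print(f"z = {z:.2f}")  # roughly |z| > 1.96 suggests the lift is unlikely to be noise

Commercial experimentation tools wrap far more sophistication around this, but the point stands: a small, cheap experiment yields evidence before the full spend.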
Installing modern equipment
There are very powerful reference architectures for leveraging massively parallel processing (MPP) to handle big data analytics. These MPP infrastructures were once reserved for universities, military science and research, using built-for-purpose and very distinct software systems. Today the same reference architectures, with very fast networks and massively parallel servers, are readily available, affordable, scalable and agile.
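The core pattern behind these architectures is simple: partition the data across many nodes, let each node compute a partial result, and combine the pieces. The Python sketch below illustrates that partition-and-aggregate idea on a single machine with synthetic data; it is a conceptual toy, not a representation of any particular vendor’s MPP platform.

    from multiprocessing import Pool
    import random

    def partial_sum_count(chunk):
        """Each 'node' computes a partial aggregate over its shard of the data."""
        return sum(chunk), len(chunk)

    if __name__ == "__main__":
        data = [random.gauss(100.0, 15.0) for _ in range(1_000_000)]  # synthetic records
        shards = [data[i::8] for i in range(8)]                       # partition across 8 workers

        with Pool(processes=8) as pool:
            partials = pool.map(partial_sum_count, shards)            # scatter the work

        total, count = map(sum, zip(*partials))                       # gather and combine
        print(f"mean = {total / count:.2f}")

Real MPP systems apply the same scatter-gather logic across racks of servers connected by very fast networks, which is what makes queries over billions of rows practical.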
Enter: the era of the modern data center. Organizations are deploying server clusters as networked appliances, leveraging commodity hardware along with commercial and open-source software that is readily available to take advantage of these server farms. These modern data center architectures are critical for IT to deliver services that are fundamental to their business strategies. IT organizations use virtualization to provision this infrastructure as cloud-based services: services that can be rapidly turned on for business consumption, providing data, analytic platforms, and full application systems on demand.
The utility of these modern data centers is delivered through cloud-based services, provisioned either internally or externally to the organization. Internal cloud services work when organizations have the means to build the data centers and quickly provision applications that can integrate with their legacy systems. New entrants, unburdened by legacy systems of different vintages, can bypass this step completely and leverage cloud services from external providers that maintain the modern data centers. This eliminates the need for capital investment in the data center while still allowing them to compete using on-demand cloud services, dramatically lowering their barriers to entry.
Adopting modern ideas or methods
There’s a cultural change that needs to happen in organizations to take advantage of Big Data and advanced analytics. Changing a culture is not easy, and it takes focus and leadership. Today’s leaders need to have a bias toward analytics, where strategic risk-taking is based on empirical evidence drawn from analysis of all the available data. These leaders need to transform their organizations from a world where the ability to harness data is scarce to one where it is abundant. Investing in an analytics competency requires the right skill set to get the data organized, the right infrastructure to use the data efficiently, and new processes for how that data is managed and governed.
New roles are emerging at senior levels of the organization, like the Chief Data Officer (CDO) and Chief Analytics Officer (CAO). These are senior-level roles whose holders often sit on their organizations’ governing boards. They are pressing forward to develop data governance practices that set organization-wide policies for how data is treated and exploited. Analytic centers of excellence coordinate the use of analytics in operational decisions and support the adoption of proven practices more broadly across these organizations.
But where does this leave the Chief Information Officer? This role emerged around the automation of systems and their ongoing operations and support. Big Data and analytic applications put tremendous pressure on how IT manages and governs its processes today. For example, the release cycles of classic IT systems take far too long to get new systems deployed and are too rigid, making this kind of experimentation nearly impossible. Systems need to react quickly to changing inputs, and new models that perform better need to be deployed fast. We can’t treat analytic systems the way we treat classic IT system development; new governance policies and processes for this Big Data phenomenon must emerge.
CIOs must transform their infrastructures to take advantage of these new trends in affordable ways. When it comes to data, they must adopt systems in which storage costs drop dramatically. Never mind all that effort spent deciding what to keep and what to discard; now we can keep it all! We need to change the mindset of decision makers to leverage their data and analytics to run their business. Those who don’t will get left behind.
Presenting information visually
Data visualization is the presentation of data in a pictorial or graphical format. Though the term may be new, the concept is not. For centuries, people have depended on visual representations, such as charts, graphs and maps, to understand information more easily and quickly. As more and more data is collected and analyzed, decision makers at all levels will increasingly look to data visualization software to find relevance among millions of variables, communicate concepts and hypotheses to others, and even predict the future. Organizations will need to think of their data as a great but unedited story which needs the help of visual analytics to bring it to life.
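As a minimal illustration (the segment names and monthly figures below are hypothetical), a few lines of Python with matplotlib can turn a table of conversion rates into a trend that is obvious at a glance:

    import matplotlib.pyplot as plt

    # Hypothetical monthly conversion rates (%) for two customer segments
    months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
    segment_a = [3.1, 3.4, 3.3, 3.9, 4.2, 4.6]
    segment_b = [2.8, 2.9, 3.0, 2.9, 3.1, 3.0]

    plt.plot(months, segment_a, marker="o", label="New segmentation")
    plt.plot(months, segment_b, marker="s", label="Existing offers")
    plt.ylabel("Conversion rate (%)")
    plt.title("A trend that hides in a table stands out in a chart")
    plt.legend()
    plt.show()

The same numbers buried in a spreadsheet rarely prompt a decision; the picture does.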
In summary
As the Big Data phenomenon rages on, few are considering that it’s about more than just data and software; it’s also about culture and adaptation. For large, established organizations, a cultural change needs to take place to take advantage of Big Data and advanced analytics. Like any transformation, there is room for innovation and for leaders to emerge to guide these organizations through the transition and ensure that they emerge healthy and able to thrive in the digital world of Big Data.
Tagline: Marc Smith is an Enterprise Architect in the SAS Americas enterprise architecture practice. He works with teams across the US and Canada to solve customers’ complex business problems and to drive sales through expertise in high-performance analytics. Marc has been architecting, implementing, and selling business analytics applications and solutions for over 20 years in the financial services, government, telecommunications, retail, health and life sciences, energy, and mining and metallurgy industries.