
Ready to take the plunge? Assessing risk and internal issues

When a company decides to take the plunge into big data, it must fully understand the issues and internal risks inherent in that decision and determine how to integrate big data into its existing standards. To get the most from big data, you need a governance and data management framework that spans the company's key dimensions: its organization, its processes and its technology.

Big data is shaking up current practices across the various fields of data management: architecture, modelling and design, storage and operations, integration, security, quality, data warehousing and business intelligence. Given big data's massive volumes, high velocity and diverse sources, the quality of your data is of paramount importance.

Unlike traditional data management, which focuses on data from internal enterprise systems, big data is more about large volumes of externally sourced data and how it can be applied to derive new knowledge or ideas that are useful to the company and otherwise unobtainable. For example, big data BI can combine consolidated internal data from traditional data warehouses with semi-structured (e.g. social media) or unstructured (e.g. video) sources.

Moving toward a new generation of data warehouses

Big data BI requires a new generation of data warehouses that incorporate new features and technologies to take data from various sources, clean it and store it in a common format. The new-gen warehouse is a central and standardized repository of structured, semi-structured and unstructured data to feed big data BI tools—thereby becoming the universal access layer for all data sources.
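The "common format" idea above can be sketched in a few lines. This is a hypothetical illustration only, assuming two invented sources (a structured CRM export and a semi-structured social media feed) and an invented unified schema; a real warehouse pipeline would use dedicated ETL tooling rather than hand-written functions like these.

```python
import json

def normalize_crm(row):
    """Structured source: fixed columns (customer_id, name, region)."""
    return {
        "source": "crm",
        "entity_id": row["customer_id"],
        "text": None,
        "attributes": {"name": row["name"], "region": row["region"]},
    }

def normalize_social(post_json):
    """Semi-structured source: free-form JSON from a social media feed."""
    post = json.loads(post_json)
    return {
        "source": "social",
        "entity_id": post.get("user", {}).get("id"),
        "text": post.get("message"),
        # Keep any remaining fields as open-ended attributes.
        "attributes": {k: v for k, v in post.items() if k not in ("user", "message")},
    }

crm_rows = [{"customer_id": "C42", "name": "Acme", "region": "QC"}]
social_posts = ['{"user": {"id": "C42"}, "message": "Great service!", "lang": "en"}']

# Both sources now share one schema and can feed the same BI layer.
unified = [normalize_crm(r) for r in crm_rows] + [normalize_social(p) for p in social_posts]
```

The point of the sketch is the single output schema: once every source maps into it, the warehouse can act as the universal access layer the paragraph describes.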

Even though most companies already have data warehouses and business intelligence systems, introducing big data technology presents a number of challenges. The IT department will inevitably be affected, down to details as simple as the skills required to operate the technology effectively.

Generally speaking, big data technology involves a new infrastructure made up of new tech components, and it means the company will need new work methods. So how do you organize the operation of this technology? Do you create a new group or update an existing structure? How many additional people and skills will you need?

Big data experts usually recommend agile processes that put speed ahead of stability and security. How do you integrate them into a group where system stability and security are still the main concerns? In short, what big data governance structure is needed?

To help companies with big data projects, NOVIPRO has developed a risk-analysis method that covers all the aspects that need to be assessed, helping you avoid pitfalls and develop a risk-reduction plan. The analysis is based on widely recognized reference frameworks such as COBIT, GTAG, ITIL and DMBOK. As part of the analysis, we take into account risk factors that affect the organization, processes and technologies, and then devise a targeted improvement plan to help clients attain big data maturity.

Later in this series, we'll take a closer look at the issues surrounding the infrastructure needed to support big data exploitation.

Read the next article in our Big Data Series: Big Data, a box of surprises?