by Greg Hutchins

Using Big Data to Identify Trends and Evolving Risks

Guest Post by Greg Carroll (first posted on CERM® RISK INSIGHTS – reposted here with permission)

Probably one of the most recognized but least understood disruptive technologies for Risk Management in the 2020s is Big Data.

This is the second article in the risk analytics series covering the Top 10 Disruptive Technologies that will change Risk Management as we know it:

10. Scenario Analysis – to provide operational management with decision-making collateral
9. Big Data – to identify trends and evolving risks
8. Neural Networking – to identify and map real world interrelationships
7. Predictive Analytics – to set up threat management & preventive action programs
6. Virtual & Augmented Reality – to gain a quantum leap in staff training and awareness
5. IoT – Intelligent Things – to monitor changes in environmental factors in real-time
4. Deep Machine Learning – to monitor customer and staff sentiment, etc.
3. Automated Processes – robots to replace laborious risk assessments & reviews
2. Blockchain Distributed Trust Systems – to obsolete Cybersecurity & Supply Chain risk
1. The biggest single change will be ….

This week I look at No. 9 – Using Big Data to identify trends and evolving risks.

To recap the lead-in article in this series, “The Future of Risk Management in the 2020s”: the World Economic Forum 2018 Global Risks Report urged the need for “individuals and organizations to think critically and creatively about how they can respond to rapidly evolving risks”.

Big Data provides the single greatest opportunity for identifying trends and evolving risks.

What is Big Data?

Big Data is the aggregation and analysis of the huge mass of public (and not-so-public) data available in the cloud. Most of it is unstructured data (i.e. documents and articles): published research papers, government statistics, transport and power usage, social media chatter, news, announcements and rumours in the press, even information from competitors’ websites. More often than not it is generated as a by-product of other applications or activities, such as cloud-based CRM or accounting systems. Leaving aside the legal, ethical, and sovereignty issues, a number of large online software providers aggregate statistics on your data and usage, anonymize them, and then make them available for consumption through Big Data. You will be astounded by the wealth of free big data sets available; just Google “free big data sources”.

How do you know what’s relevant?

Companies like Google and PeopleSoft have developed extensive metrics and demographics on data in the cloud, based on who accesses it and its relationship to other data found useful in similar searches. Relevance is identified from the length of time users spend at a document and from repeat visits by a large volume of users running similar searches; it is not based on “meta-data” or SEO engineering.

How do you use Big Data?

The most common way of accessing Big Data is through the Hadoop technology. Hadoop is an open-source framework that allows the processing and storage of extremely large data sets in a distributed computing environment. It provides a method of aggregating multiple data sources and then extracting meaningful results from simple queries, much like the queries you would use to find internal sales figures, but drawing on worldwide activity for your selected demographics (locale, target group, activity, sentiment, etc.). This is the innovation that moved the masses of data on the internet from being an obstacle to being a disruptive resource. Numerous third-party software packages are available to facilitate establishing a Hadoop data lake.
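As a rough sketch of what such a query can look like (assuming a Spark-on-Hadoop environment; the file path, table, and column names below are hypothetical placeholders, not from the original article), an external data set landed in the data lake is queried much like an internal sales table:

    # Minimal sketch: querying an external data set landed in a Hadoop data
    # lake with PySpark. The path, table, and column names are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("risk-data-lake").getOrCreate()

    # Load a public data set previously ingested into the lake (e.g. HDFS).
    usage = spark.read.parquet("hdfs:///datalake/external/transport_usage")
    usage.createOrReplaceTempView("transport_usage")

    # A simple aggregate query scoped to a chosen locale and period,
    # analogous to pulling internal sales figures.
    trend = spark.sql("""
        SELECT region, month, SUM(trips) AS total_trips
        FROM transport_usage
        WHERE region = 'VIC' AND year >= 2018
        GROUP BY region, month
        ORDER BY region, month
    """)
    trend.show()

The point is that once external sources are in the lake, the query itself is no harder than any internal reporting query.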

Be aware! Big Data is its own discipline and shouldn’t be combined or confused with Analytics. Just as you don’t start building a house before you design it, and design and carpentry are not interchangeable disciplines, the same holds for Big Data and Analytics. Big Data is infrastructure, while Analytics is consumption. Your aim with Big Data is to establish a comprehensive Data Inventory, including external sources, followed by a good Data Governance framework to ensure confidence in that data before you start making assumptions based on it.

Why use Big Data for Risk Management?

Big Data can be used to extract risk collateral but, more importantly, to identify threats and evolving risks. Traditional risk management systems rely almost exclusively on risk reviews to identify threats and evolving risks, which is both ineffective and subject to bias. People don’t know what they don’t know (see “Does anyone really understand Emerging Risks?”).

For data to be usable for analytics, academics refer to the need to identify the “data journey” and its many “touch points” (data point sources).

If you have moved from unproductive Risk Register-based risk management to Scenario-based risk event management (see “Using Scenario Analysis for Risk Based Decision Making”), you have already identified both the journey (the scenario steps) and the touch points (the drivers and influences on risk events). Against these you can set Hadoop queries that monitor any activity affecting those scenarios. This is how you identify trends and evolving risks.
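As a minimal, hypothetical sketch (the “scenario_drivers” and “external_signals” tables and their columns are invented for illustration), such a monitoring query might join recently observed external signals against the drivers recorded for each scenario:

    # Minimal sketch: match recent external signals against the drivers and
    # influences recorded for each risk scenario. Table and column names
    # ("scenario_drivers", "external_signals") are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("risk-data-lake").getOrCreate()

    hits = spark.sql("""
        SELECT s.scenario_id, s.driver, e.source, e.headline, e.observed_at
        FROM scenario_drivers s
        JOIN external_signals e
          ON e.topic = s.driver
        WHERE e.observed_at >= date_sub(current_date(), 7)
    """)
    hits.show()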

Here is an example from the HBR article “Don’t Let Data Paralysis Stand Between You and Your Customers” by Harald Fanderl:

“One bank, for instance, was looking for ways to use big data to spot early indications of loss risk in its small business lending and service operations. Touchpoint data revealed subtle changes in customer behavior that raised questions in the fraud team’s mind. It was only when the team connected the dots across touchpoints, however, that the bank discovered behavior patterns that highly correlated with imminent risk of default. These included changed behaviors in online account checking frequency, number and type of call center inquiries and branch visits, and credit line use. Analyzing those complex patterns allowed the bank to develop an early warning system that flagged high-risk customers.

Big data harbors big opportunities to improve customer journeys and value. What it requires is a commitment to focus on what really matters.”

According to Gartner, 70% of big data initiatives never move past the “pilot” phase, i.e. they fail. This is because they are really IT playing with concepts rather than solving real problems. A Big Data project must start by identifying the drivers and influences that affect risk, then identify the Big Data sources that measure those influences. Next, ensure confidence in that data through a stringent data-governance regime that tests quality, accuracy, bias, and consistency before thinking about analytics. Finally, make sure you have corporate policies on its selection and use.
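For illustration only (the thresholds, path, and column names are invented), such governance checks can be as simple as completeness, duplication, and range tests run on each source before it feeds any analytics:

    # Minimal sketch of pre-analytics data-governance checks on one source:
    # completeness, duplication, and range. Thresholds are illustrative only.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("risk-data-lake").getOrCreate()
    usage = spark.read.parquet("hdfs:///datalake/external/transport_usage")

    total = usage.count()
    null_rate = usage.filter(F.col("trips").isNull()).count() / total
    dup_rate = 1 - usage.dropDuplicates(["region", "month"]).count() / total
    negatives = usage.filter(F.col("trips") < 0).count()

    assert null_rate < 0.02, "completeness check failed"
    assert dup_rate < 0.01, "consistency check failed"
    assert negatives == 0, "accuracy/range check failed"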

Using Big Data to identify trends and evolving risks

The beauty of this is that instead of relying on an individual within your organization to identify an evolving risk to your business, the “wisdom of the crowd” identifies the threats posed by world or local events, which can then be matched, through Big Data, against the identified risk drivers and influences within your risk scenarios.

These Hadoop queries are relatively easy to develop and consume using currently available desktop analytics tools such as Google Analytics, Power BI, and even Excel. Coupled with modern ERM software that provides KRI monitoring tools with trigger notifications, operational management can be notified of an identified evolving risk, presented with the applicable scenarios, and use Predictive Analytics (a subject of its own in an upcoming article) to evaluate and test possible courses of action. This provides them with invaluable collateral for informed decision making: practical, value-adding risk management.
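As a bare-bones illustration of the trigger idea (the indicator value, threshold, and webhook URL are placeholders; commercial ERM platforms provide this capability out of the box), a KRI breach can be turned into a notification in a few lines of Python:

    # Minimal sketch of a KRI trigger notification. The indicator value,
    # threshold, and webhook URL are hypothetical placeholders.
    import json
    import urllib.request

    def notify(message: str, webhook: str = "https://example.com/erm-webhook") -> None:
        # Post a simple JSON alert to an ERM or chat webhook endpoint.
        payload = json.dumps({"text": message}).encode("utf-8")
        req = urllib.request.Request(
            webhook, data=payload, headers={"Content-Type": "application/json"}
        )
        urllib.request.urlopen(req)

    kri_value = 0.22       # latest value of the monitored key risk indicator
    kri_threshold = 0.15   # agreed tolerance for that indicator

    if kri_value > kri_threshold:
        notify(f"KRI breach: indicator at {kri_value:.2f} "
               f"(threshold {kri_threshold:.2f}); review the linked scenario.")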

Next week I will look in more detail at No. 8 – “Using Neural Networking to identify and map Risk interrelationships”.

Bio:

Greg Carroll – Founder & Technical Director, Fast Track Australia Pty Ltd. Greg Carroll has 30 years’ experience addressing risk management systems in life-and-death environments like the Australian Department of Defence and the Victorian Infectious Diseases Laboratories, among others. He has also worked for decades with top-tier multinationals like Motorola, Fosters, and Serco.

In 1981 he founded Fast Track (www.fasttrack365.com) which specialises in regulatory compliance and enterprise risk management for medium and large organisations. The company deploys enterprise-wide solutions for Quality, Risk, Environmental, OHS, Supplier, and Innovation Management.

His book “Mastering 21st Century Risk Management” is available from the www.fasttrack365.com website.


About Greg Hutchins

Greg Hutchins PE CERM is the evangelist of Future of Quality: Risk®. He has been involved in quality since 1985, when he set up the first quality program in North America based on Mil Q 9858 for the natural gas industry. Mil Q became ISO 9001 in 1987.

He is the author of more than 30 books. ISO 31000: ERM is the best-selling and highest-rated ISO risk book on Amazon (4.8 stars). Value Added Auditing (4th edition) is the first ISO risk-based auditing book.


