Get Out of the Spreadsheet Abyss

When an organization turns a blind eye to the proliferation of spreadsheet-based processes, it subjects itself to substantial risks. Have you ever encountered (or enabled) the following scenario?

  • Enterprise reporting is lacking, so a “power user” (i.e., an analyst) is conscripted into cobbling together an ad hoc, spreadsheet-based process to address a management request;
  • The power user exports data from various business applications and then manipulates the output, typically with macros and formulas;
  • This initial spreadsheet is manually integrated with business unit data from various other unofficial spreadsheets, further distancing the data from the source business application;
  • Multiple tabs are added, charts are generated and data is pivoted (all manually);
  • Management finds value in the report and elevates it to a repeatable process;
  • The original request increases in complexity over time as it requires more manipulations, calculations and rogue data sources to meet management needs;
  • Management doubles down with a new request, and the process is repeated;
  • IT proper is NEVER consulted on any of these requests.

The business unit is now supporting a “spreadmart”, a term considered derogatory in data circles.

“A spreadmart (spreadsheet data mart) is a business data analysis system running on spreadsheets or other desktop databases that is created and maintained by individuals or groups to perform the tasks normally done by a data mart or data warehouse. Typically a spreadmart is created by individuals at different times using different data sources and rules for defining metrics in an organization, creating a fractured view of the enterprise.” [1]

Although the initial intentions behind these requests may be reasonable, the business never bothers to approach IT to propose building out a proper data store. Additionally, the conscripted analysts are unhappy with their added manual responsibilities: spreadsheet wrangling and manual integration shift precious time away from more value-added pursuits such as data analysis and formulating recommendations.

From management’s perspective, why should they pay IT to build out an officially sanctioned solution that will deliver the same information that an internal team of analysts can provide? After all, the spreadmart is responsive (changes can be made quickly) and it’s inexpensive (as opposed to new investments in IT). Eventually, the manual processes are baked into the job description and new hires are enlisted to expand and maintain this system. The business sinks deeper and deeper into the spreadsheet abyss.

The short-term rewards of the spreadmart are generally not worth the longer-term risks.

Risks:

“It’s not an enterprise tool. The error rates in spreadsheets are huge. Excel will dutifully average the wrong data right down the line. There’s no protection around that.” [2]
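
To make the quote concrete, here is a contrived sketch of the same failure mode outside Excel; the data and the mistyped value are invented for illustration:

```python
import pandas as pd

# A hypothetical exported column in which one value was mistyped:
# "12O" ends in the letter O, not a zero.
raw = pd.Series(["120", "135", "12O", "140"])

# Coercion silently turns the bad value into NaN...
values = pd.to_numeric(raw, errors="coerce")

# ...and mean() silently skips NaN, returning a plausible but wrong
# answer computed from only 3 of the 4 values.
print(values.mean())        # ~131.67
print(values.isna().sum())  # 1 -- the sanity check a manual process never runs
```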

The spreadmart can also be bracketed as a “data shadow” system, to borrow a term from The Business Intelligence Guidebook by Rick Sherman. Here are the problems associated with “data shadow” systems, paraphrased from the book [3]:

  • Productivity is severely diminished as analysts spend their time creating and maintaining an assortment of manual data processes;
    • I would add that team morale suffers as well;
  • Business units have daggers drawn as they try to reconcile and validate whose numbers are “more correct”;
    • As a result of a new silo, the organization has compounded its data governance issues;
  • Data errors can (and will) occur as a result of manual querying, integrating and calculating;
  • Data sources can change without notice and the data shadow process is not on IT’s radar for source change notifications;
  • Embedded business logic stagnates in complex macros or code modules because it is hidden or simply not understood by those who inherit it;
  • The solution doesn’t scale with increasing data volume or number of users;
  • An audit trail to ensure control and compliance does not exist;
    • “It is often ironic that a finance group can pass an audit because the IT processes it uses are auditable, but the data shadow systems that they use to make decisions are not, and are ignored in an internal audit”;
  • Process and technical documentation does not exist, which impairs the ability to update the solution.

Additionally, these processes are not backed up with any regularity, multiple versions may exist on multiple users’ desktops, and anyone can make changes to the embedded business logic. The bottom line is that the business is potentially making decisions based upon erroneous data, which can have serious financial and reputational impacts.

“F1F9 estimated that 88 percent of all spreadsheets have errors in them, while 50 percent of spreadsheets used by large companies have material defects. The company said the mistakes are not just costly in terms of time and money – but also lead to damaged reputations, lost jobs and disrupted careers.” [4]

Mitigation:

There is nothing wrong with the business responding to an emerging issue by requesting a one-time, ad hoc solution. The highest risks emerge when the ad hoc process is systematized, repeatable ad hoc processes proliferate unchecked, and IT is never involved in any of the discussions.

IT proper is highly effective when it is allowed to merge, integrate and validate data. Business unit analysts and spreadsheets should be out of the collection and integration game for repeatable management reporting; analysts should focus on analysis, trending and interpretation. Too often, analysts get tossed into productivity traps, spending hours cutting, pasting and linking to someone else’s spreadsheet in order to meet management demands.

When IT is brought into the discussion, it must not point fingers, but rather seek to understand why the shadow system was established in the first place. Likewise, the business unit should not point fingers at IT for being unresponsive or constrained by budget. Once the peace treaty has been established, IT should analyze and reverse-engineer the cobbled-together integration processes and data sources (admittedly a time-consuming exercise) and deliver more controlled and scalable processes.

The new data integration processes should culminate in loading data to a business-specific, validated, central data mart. The central mart doesn’t impose an unfamiliar tool upon the business, but rather automates integration activities and references more “trustworthy” data sources. Spreadsheets can still be used by analysts to access the data, but the analysts are no longer expected to be manual aggregators wielding a sub-standard ETL tool.
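
As a rough illustration of the difference, here is a minimal sketch of what one automated, validated integration step might look like, assuming hypothetical CSV extracts and a SQLite data mart (the file, column and table names are all invented):

```python
import sqlite3

import pandas as pd

# Hypothetical extracts that analysts previously merged by hand.
sales = pd.read_csv("sales_extract.csv", parse_dates=["order_date"])
regions = pd.read_csv("region_reference.csv")

# Validation steps a manual copy/paste process would silently skip.
if sales["order_id"].duplicated().any():
    raise ValueError("Duplicate order_id values in the sales extract")
unknown = set(sales["region_code"]) - set(regions["region_code"])
if unknown:
    raise ValueError(f"Unknown region codes: {unknown}")

# Integrate once, in code, instead of linking spreadsheets together.
merged = sales.merge(regions, on="region_code", how="left")

# Load the validated result into a central, queryable data mart.
with sqlite3.connect("data_mart.db") as conn:
    merged.to_sql("fact_sales", conn, if_exists="replace", index=False)
```

Analysts can then point their spreadsheets at the mart for analysis, rather than rebuilding the integration by hand each cycle.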

“Go with the flow. Some users will never give up their spreadsheets, regardless of how robust an analytic environment you provide. Let them keep their spreadsheets, but configure them as front ends to the data warehouse. This way, they can use their spreadsheets to access corporate-approved data, metrics and reports. If they insist on creating new reports, provide an incentive for them to upload their reports to the data warehouse instead of distributing them via e-mail.” [5]

Have I ever had to “get out” of a situation where data governance was lacking and burned-out, morale-depleted analysts spent all of their time collecting and integrating spreadsheets to maintain an inefficient spreadmart?

I’ll never tell!

References:

[1] https://en.wikipedia.org/wiki/Spreadmart

[2] http://ww2.cfo.com/analytics/2012/01/imagine-theres-no-excel/

[3] Sherman, R. (2015). Business Intelligence Guidebook: From Data Integration to Analytics.

[4] http://www.cnbc.com/id/100923538

[5] https://www.information-management.com/news/the-rise-and-fall-of-spreadmarts


The Data Quality Face-Off: Who Owns Data Quality, the Business or IT?

This article is also posted on LinkedIn.

Data is the lifeblood of organizations. By now you’ve probably heard the comparative slogans “data is the new oil”, “data is the new water” or “data is the new currency”. Type “data is the new” into the Google search bar and the first result delivered is “data is the new bacon”. I’m not sure how apt this slogan is, except to say that both can be highly fulfilling.

With the exception of a well-known Seattle-based retailer, most enterprises experience substantial data quality issues, as data quality work is typically an exercise in “pass the buck”. An InformationWeek article shrewdly commented on the risks associated with the lack of data quality:

“There are two core risks: making decisions based on ‘information fantasy,’ and compliance. If you’re not representing the real world, you can be fined and your CFO can be imprisoned. It all comes down to that one point: If your systems don’t represent the real world, then how can you make accurate decisions?” [3]

The Handbook of Data Quality: Research and Practice has reported that 60% of enterprises suffer from data quality issues, that 70% of manufacturing orders contain poor-quality data, and that poor data quality management costs companies roughly $1.4 billion every year [4].

Organizational Hamilton/Burr-style face-offs occur in which IT and the business are at loggerheads over the state of data quality and its ownership. The business typically believes that since data is co-mingled with the systems IT already manages, IT should own any data issues. Given the high costs of poor data quality just cited, and the risk of nimble disrupters using data more efficiently to attack incumbents’ market share, both IT and the business need to be on the same team with regard to data quality, for the organization’s sake.

“The relationship between IT and the business is a source of tension in many organizations, especially in relation to data management. This tension often manifests itself in the definition of data quality, as well as the question of who is responsible for data quality.” [5]

Anecdotally, IT units have no desire to be held responsible for “mysterious” data and/or systems they had no hand in standing up. In my opinion, the enterprise IT mindset is to make sure the data arrives in the consolidated Enterprise Data Warehouse or other centralized data repository; and if downstream users don’t raise concerns about data quality issues, all the better for IT. Garbage-In, Garbage-Out. If the checksums or record counts from source to target match, then it’s time to call it a day.
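
To see why matching record counts alone is a weak quality gate, consider this contrived sketch (the tables and values are invented):

```python
import pandas as pd

# Hypothetical source extract and warehouse target after a load.
source = pd.DataFrame({"customer_id": [1, 2, 3],
                       "revenue": [100.0, 250.0, 80.0]})
target = pd.DataFrame({"customer_id": [1, 2, 3],
                       "revenue": [100.0, 250.0, None]})

# The load "reconciles": record counts from source to target match...
assert len(source) == len(target)

# ...yet the content is wrong. A value-level check catches what counts miss.
print(source["revenue"].sum())  # 430.0
print(target["revenue"].sum())  # 350.0 -- a value was silently dropped
```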

The developer or analyst mindset is to immediately dive in and start building applications or reports, with the potential to deliver sub-optimal results because the data was misunderstood or misinterpreted as the “golden copy”. Up-front data profiling isn’t in the equation.
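
Up-front profiling doesn’t have to be elaborate. Even a quick pass like the sketch below, with an invented file and key column, surfaces null rates, duplicate keys and suspicious ranges before a single report is built:

```python
import pandas as pd

# A hypothetical extract that a report is about to be built on.
df = pd.read_csv("customer_extract.csv")

# A minimal profile: volume, null rates, duplicate keys and value ranges.
print(f"rows: {len(df)}")
print("null rate per column:")
print(df.isna().mean().round(3))
print(f"duplicate customer_id values: {df['customer_id'].duplicated().sum()}")
print(df.describe(include="all"))
```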

Gartner has suggested that the rise of the Chief Data Officer (particularly in the banking, government and insurance industries) has been beneficial in helping both IT and the business manage data [2]. The strategic use of a CDO has the potential to free up the CIO and the enterprise IT organization to carry on with managing infrastructure and maintaining systems.

Most experts agree that the business needs to define what constitutes high-quality, acceptable data, and that the business should “own the data”. However, IT is typically the “owner” of the systems that house such data. Thus, a mutually beneficial organizational relationship would involve IT gaining a better understanding of data content so as to ensure a higher level of data quality [5].

From a working-together perspective, I find the matrix from Allen & Cervo (2015) helpful in depicting the risks that arise from one-sided activities: data profiling without business context, and vice versa. It illustrates how both “business understanding and data profiling are necessary to minimize any risk of incorrect assumptions about the data’s fitness for use or about how the data serves the business” [1]. Although originally offered in a Master Data Management context, I find the example fitting in illustrating how business and IT expertise should work together.

[Figure: business knowledge vs. data profiling quadrant matrix] From: Allen, M. & Cervo, D. (2015). Multi-Domain Master Data Management: Advanced MDM and Data Governance in Practice. Morgan Kaufmann Publishers. Chapter 8 – Data Integration.
  • In the bottom-left quadrant, low business knowledge and inadequate data profiling leave the enterprise in a less-than-optimal position. This is not the quadrant an organization wants to languish in.
  • The top-left quadrant illustrates that business context is high, but “knowledge” is unsubstantiated because data quality has not been verified through profiling exercises. Cue quality guru W. Edwards Deming: “Without data, you’re just a person with an opinion.”
  • The bottom-right quadrant illustrates the opposite problem: data profiling without business knowledge doesn’t yield enough context for meaningful analyses. Cue a “bizarro” W. Edwards Deming: “Without an opinion, you’re just a person with data.”
  • The “Goldilocks” quadrant in the upper right yields an appropriate understanding of data quality and the necessary context in which to conduct meaningful analyses.

The more engaged the business and IT are in finding common ground with respect to understanding existing data and solving data quality issues, the better positioned the organization is to avoid regulatory fines, departmental strife, threats from upstart firms and bad decision making overall. Refuse to become complacent in the data space and give data the attention it deserves; your organization’s survival just may depend upon it.

References:

[1] Allen, M. & Cervo, D. (2015). Multi-Domain Master Data Management: Advanced MDM and Data Governance in Practice. Morgan Kaufmann Publishers. Chapter 8 – Data Integration.

[2] Gartner. (January 30, 2014). By 2015, 25 Percent of Large Global Organizations Will Have Appointed Chief Data Officers.
http://www.gartner.com/newsroom/id/2659215

[3] Morgan, Lisa. (October 14, 2015). 8 Ways To Ensure Data Quality. InformationWeek.
http://www.informationweek.com/big-data/big-data-analytics/8-ways-to-ensure-data-quality/d/d-id/1322239

[4] Sadiq, Shazia (Ed.) (2013). Handbook of Data Quality: Research and Practice: Cost and Value Management for Data Quality.

[5] Sebastian-Coleman, L. (2013). Measuring Data Quality for Ongoing Improvement: A Data Quality Assessment Framework. Morgan Kaufmann Publishers. Chapter 2 – Data, People, and Systems.


The IT Department Needs To Market Its Value or Suffer the Consequences

This article is also published on LinkedIn.

By now it’s an all too common cliché that the IT department does not garner the respect it deserves from its counterpart areas of the business. This perceived respect deficiency can manifest itself in the lack of upfront involvement in business strategy (we’ll call you when it breaks), unreasonable timelines (do it yesterday), rampant budget cuts and layoffs (do more with less) and/or limited technical promotional tracks (promotions are for business areas only).

IT pros tend to believe that if they’re adding value, delivering difficult solutions within reasonable timeframes and providing it all in a cost-efficient manner, the recognition and gratitude will follow. Typical IT and knowledge-worker responsibilities fall under the high-level categories of “keep things running” (you’re doing a great job, so we don’t notice) or “attend to our technical emergency” (drop what you’re doing).

It’s fair to say that there is a perception gap between the true value and the perceived value of what IT brings to the table. Anecdotally, there certainly seems to be a disconnect between the perceived lack of difficulty in business asks and the actual difficulty in delivering solutions. This perception gap can occur not only between IT and the “business” but also between the non-technical IT manager and the technical rank and file.

In this era of automation, outsourcing and job instability, there is an element of danger in one’s contributions going unnoticed, underappreciated and/or misunderstood. Within IT, leaders and the rank and file need to overcome their stereotypical introverted nature and do a better job of internally marketing their value to the organization. IT rank and file need to better market their value to their managers, and in turn the IT department collectively needs to better market its value to other areas of the business.

Perception matters, but IT must deliver the goods as well. If the business misperceives the actual work that the IT department provides and equates it to commoditized functions such as “fix the printers” or “print the reports”, then morale dips and the IT department can expect to compete with external third parties (vendors, consulting firms, outsourcing outfits) who do a much better job of finding the ear of influential higher-ups and convincing these decision-makers of their value.

I once worked on an extremely complex report automation initiative that required assistance from ETL developers, architects, report developers and business unit team members. The purpose was to gather information from disparate source systems, perform complex ETL on the data and then store it in a data mart for downstream reporting. Ultimately, the project met its objective of automating several reports, which in turn saved the business a week’s worth of manual Excel report creation. After phase 1 completion, one thanks I received was genuine gratitude from the business analyst whose job I made easier. The other was “where’s phase 2, this shouldn’t be that hard” from the business manager whose technology knowledge was limited to cutting and pasting into Excel.

Ideally, my team should have better marketed the value of this win and helped the business partner understand the appropriate timeline (given the extreme complexity), instead of just being glad to move forward after solving a difficult problem for the business.

I believe Dan Roberts accurately paraphrases the knowledge worker’s stance in Marketing IT’s Value [2].

“’What does marketing have to do with IT? Why do I need to change my image? I’m already a good developer!’ Because marketing is simply not in IT’s comfort zone, they revert to what is more natural for them, which is doing their technical job and leaving the job of marketing to someone else, which reinforces the image that ‘IT is just a bunch of techies.’”

The IT department needs to promote better awareness of its value before the department is shut out of strategic planning meetings, the budget gets cut, project timelines shrink and morale dips. IT workers need to promote the value they bring to the table by touting their wins and staying up to date on education, training and certifications as necessary. At-will employment works both ways: if the technical staff feels stagnant, undervalued and underappreciated, there is always a better situation around the corner, especially considering the technical skills shortage in today’s marketplace.

“It’s not about hype and empty promises; it’s about creating an awareness of IT’s value. It’s about changing client perceptions by presenting a clear, consistent message about the value of IT. After all, if you don’t market yourself, someone else will, and you might not like the image you end up with.” [1]

References:

[1] Colisto, Nicholas R. (2012). The CIO Playbook: Strategies and Best Practices for IT Leaders to Deliver Value.

[2] Roberts, Dan. (2014). Unleashing the Power of IT: Bringing People, Business, and Technology Together, Second Edition.