Enterprise Technologies

Get Out of the Spreadsheet Abyss

When an organization turns a blind eye to the proliferation of spreadsheet-based processes, it subjects itself to substantial risks. Have you ever encountered (or enabled) the following scenario?

  • Enterprise reporting is lacking, so a “power user” (i.e. an analyst) is conscripted into cobbling together an ad-hoc, spreadsheet-based process to address a management request;
  • The power user exports data from various business applications and then manipulates the output, typically with macros and formulas;
  • This initial spreadsheet is manually integrated with business unit data from various other unofficial spreadsheets, further distancing the data from the source business application;
  • Multiple tabs are added, charts are generated and data is pivoted (all manually);
  • Management finds value in the report and elevates it to a repeatable process;
  • The original request increases in complexity over time as it requires more manipulations, calculations and rogue data sources to meet management needs;
  • Management doubles down with the need for a new request and the process is repeated;
  • IT proper is NEVER consulted on any of the requests.

The business unit is now supporting a “spreadmart”, a term considered derogatory in data circles.

“A spreadmart (spreadsheet data mart) is a business data analysis system running on spreadsheets or other desktop databases that is created and maintained by individuals or groups to perform the tasks normally done by a data mart or data warehouse. Typically a spreadmart is created by individuals at different times using different data sources and rules for defining metrics in an organization, creating a fractured view of the enterprise.” [1]

Although the initial intentions of these requests may be reasonable, the business never bothers to approach IT to propose building out a proper data store. Additionally, the conscripted analysts are unhappy with their additional manual responsibilities. Spreadsheet wrangling and manual integration activities shift precious time away from more value-added pursuits such as data analysis and formulating recommendations.

From management’s perspective, why should they pay IT to build out an officially sanctioned solution that will deliver the same information that an internal team of analysts can provide? After all, the spreadmart is responsive (changes can be made quickly) and it’s inexpensive (as opposed to new investments in IT). Eventually, the manual processes are baked into the job description and new hires are enlisted to expand and maintain this system. The business sinks deeper and deeper into the spreadsheet abyss.

The short-term rewards of the spreadmart are generally not worth the longer-term risks.

Risks:

“It’s not an enterprise tool. The error rates in spreadsheets are huge. Excel will dutifully average the wrong data right down the line. There’s no protection around that.” [2]
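The failure mode in that quote is easy to reproduce. The sketch below (an illustration of my own, not from the cited article) mimics a spreadsheet formula pinned to a fixed range: when new rows arrive, the “formula” keeps returning a plausible average with no error raised.

```python
# Hypothetical illustration: a fixed range, like =AVERAGE(B2:B5) in a sheet,
# silently ignores rows appended later and keeps producing a plausible number.

monthly_sales = [1200.0, 1350.0, 1100.0, 1500.0]   # original data
AVERAGE_RANGE = slice(0, 4)                        # the "pinned" formula range

def report_average(rows):
    window = rows[AVERAGE_RANGE]                   # anything outside is dropped, no warning
    return sum(window) / len(window)

baseline = report_average(monthly_sales)           # 1287.5

# Two new months arrive; nobody updates the "formula".
monthly_sales += [900.0, 2100.0]
stale = report_average(monthly_sales)              # still 1287.5 -- wrong, but no error

true_average = sum(monthly_sales) / len(monthly_sales)
print(baseline, stale, round(true_average, 2))
```

The report keeps publishing 1287.5 while the real average has moved, which is exactly the “dutifully average the wrong data” problem: no exception, no protection, just a quietly wrong number.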

The spreadmart can also be categorized as a “data shadow” system, to borrow a term from The Business Intelligence Guidebook, authored by Rick Sherman. Here are the problems associated with “data shadow” systems, as paraphrased from the book [3]:

  • Productivity is severely diminished as analysts spend their time creating and maintaining an assortment of manual data processes;
    • I would add that team morale suffers as well;
  • Business units have daggers drawn as they try to reconcile and validate whose numbers are “more correct”;
    • As a result of a new silo, the organization has compounded its data governance issues;
  • Data errors can (and will) occur as a result of manual querying, integrating and calculating;
  • Data sources can change without notice and the data shadow process is not on IT’s radar for source change notifications;
  • Embedded business logic becomes stagnant in various complex macros or code modules because they are hidden or simply not understood by inheritors;
  • The solution doesn’t scale with increasing data volume or number of users;
  • An audit trail to ensure control and compliance does not exist;
    • “It is often ironic that a finance group can pass an audit because the IT processes it uses are auditable, but the data shadow systems that they use to make decisions are not, and are ignored in an internal audit”;
  • Process and technical documentation does not exist, which impairs the ability to update the solution.

Additionally, these processes are not backed up with any regularity, multiple versions may exist on multiple users’ desktops, and anyone can make changes to the embedded business logic. The bottom line is that the business is potentially making decisions based upon erroneous data, which can have serious financial and reputational impacts.

“F1F9 estimated that 88 percent of all spreadsheets have errors in them, while 50 percent of spreadsheets used by large companies have material defects. The company said the mistakes are not just costly in terms of time and money – but also lead to damaged reputations, lost jobs and disrupted careers.” [4]

Mitigation:

There is nothing wrong with the business responding to an emerging issue by requesting a one-time ad-hoc solution. The highest risks emerge when the ad-hoc process is systematized, repeatable ad-hoc processes proliferate unchecked, and IT is never involved in any of the discussions.

IT proper is highly effective when it is allowed to merge, integrate and validate data. Business unit analysts and spreadsheets should be out of the collection and integration game for repeatable management reporting. Analysts should focus on analysis, trending and interpretation. Too often, analysts are tossed into productivity traps: hours of cutting, pasting and linking to someone else’s spreadsheet just to integrate data for management demands.

When IT is brought into the discussion, it must not point fingers but rather understand why the shadow system was established in the first place. Likewise, the business unit should not point fingers at IT for being unresponsive or constrained by budget. Once the peace treaty has been established, IT should analyze and reverse-engineer the cobbled-together integration processes and data sources (which, admittedly, is time-consuming) and deliver more controlled and scalable processes.

The new data integration processes should culminate in loading data to a business-specific, validated, central data mart. The central mart doesn’t try to impose an unfamiliar tool upon the business but rather automates integration activities and references more “trustworthy” data sources. Spreadsheets can still be used by analysts to access the data, but the analysts are no longer expected to be manual aggregators using a sub-standard ETL tool.
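To make the idea concrete, here is a minimal sketch (standard library only, my own illustration) of the kind of controlled load process described above: rows from two hypothetical source extracts are validated and landed in a central SQLite “mart” table that analysts query instead of re-integrating spreadsheets by hand. All table, column and source names are invented for the example.

```python
import sqlite3

# Hypothetical source extracts (e.g. CRM and ERP exports that analysts used
# to stitch together manually in spreadsheets).
crm_extract = [("2024-01", "East", 1200.0), ("2024-01", "West", 950.0)]
erp_extract = [("2024-01", "East", 300.0)]   # adjustments from a second system

def validate(row):
    """Reject bad rows before they ever reach the mart."""
    period, region, amount = row
    assert region in {"East", "West"}, f"unknown region: {region}"
    assert amount >= 0, f"negative amount: {amount}"
    return row

mart = sqlite3.connect(":memory:")           # stand-in for the central data mart
mart.execute("CREATE TABLE sales_mart (period TEXT, region TEXT, amount REAL)")
for source in (crm_extract, erp_extract):
    mart.executemany("INSERT INTO sales_mart VALUES (?, ?, ?)",
                     (validate(r) for r in source))
mart.commit()

# Analysts query one validated store instead of linking spreadsheets together.
for period, region, total in mart.execute(
        "SELECT period, region, SUM(amount) FROM sales_mart GROUP BY period, region"):
    print(period, region, total)
```

The point is not the specific tools but the shape of the process: validation happens once, centrally and automatically, and every consumer sees the same numbers.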

“Go with the flow. Some users will never give up their spreadsheets, regardless of how robust an analytic environment you provide. Let them keep their spreadsheets, but configure them as front ends to the data warehouse. This way, they can use their spreadsheets to access corporate-approved data, metrics and reports. If they insist on creating new reports, provide an incentive for them to upload their reports to the data warehouse instead of distributing them via e-mail.” [5]

Have I ever had to “get out” of a situation where data governance was lacking and burned-out, morale depleted analysts spent all of their time collecting and integrating spreadsheets to maintain an inefficient spreadmart?

I’ll never tell!

References:

[1] https://en.wikipedia.org/wiki/Spreadmart

[2] http://ww2.cfo.com/analytics/2012/01/imagine-theres-no-excel/

[3] Sherman, R. (2015). Business intelligence guidebook: from data integration to analytics.

[4] http://www.cnbc.com/id/100923538

[5] https://www.information-management.com/news/the-rise-and-fall-of-spreadmarts


The Benefits of Service Oriented Architecture for Financial Services Organizations

The banking and financial industry is one where legacy systems are prevalent. Banking systems tend to skew older and are highly heterogeneous, and replacing or integrating these systems is a difficult undertaking.

Mazursky (as cited in Baskerville, Cavallari, Hjort-Madsen, Pries-Heje, Sorrentino & Virili, 2010) states that older architectures complicate integration of enterprise applications because the underlying elements have created ‘closed’ architectures. Closed architectures restrict access to vital software and hardware configurations and force organizations to rely on single-vendor solutions for parts of their information technology. Thus, closed architectures hinder a bank’s ability to innovate and roll out new integrated financial products.

The flexibility of SOA facilitates easier connectivity to legacy backend systems from outside sources. Because of the size and complexity of most enterprise banking systems, a potential reengineering of these systems to flawlessly accommodate interaction with newer systems is not economically feasible. Inflexible systems can be difficult to modernize and maintain on an on-going basis. Furthermore, banks’ legacy systems can suffer from a lack of interoperability, which can stifle innovation.

“According to a survey commissioned by Infosys and Ovum in May 2012, approximately three quarters of European banks are using outdated core legacy systems. What’s more, 80% of respondents see these outdated systems as barriers to bringing new products to market, whilst 75% say such systems hinder, rather than enable, process change. Integration cost is effectively a barrier to service provision flexibility and subsequent innovation” (Banking Industry Architecture Network, 2012).

The greater the number of banking systems that can easily interact, connect and share data with other systems and services, the more innovative banks can become with respect to their product offerings for consumers. Market demands can be more easily responded to when new product creation is allowed to flourish with appreciably lower system integration costs. New applications can also be developed in much shorter time frames in order to react to customer demand as long as services are maintained and shared across the enterprise.

According to Earley & Free (2002), a bank that identifies a new business opportunity such as wealth management, but has never provided such a service in the past, can quickly ramp up functionality by utilizing services previously designed for other applications. The new wealth management application is designed without duplicating functionality, in a more cost-efficient and rapid manner. Smaller and mid-tier banks can capitalize on bringing new applications and services to market quickly and economically. This responsiveness allows smaller banks to offer similar products to those offered by their larger competitors. The modularity of SOA design precludes the need for substantial re-engineering of smaller and mid-tier banks’ information technology. This is an important benefit because smaller banks do not have financial resources comparable to those of the more sizable industry players.

Additionally, SOA has the potential to free up development resources sooner, enabling those resources to engage in other development initiatives. “Few banks will be able to buy or build new capabilities quickly enough to satisfy market requirements without an SOA” (Earley & Free, 2002). In the survey conducted by Infosys Consulting and Ovum (a London-based research organization), 100% of banks responded that SOA would become the “dominant, accepted future banking architecture” (Banking Industry Architecture Network, 2012).

Furthermore, banks can employ multiple services to complete a business process. One service that determines the value of a portfolio of financial assets at market rates (mark to market calculations) could be coupled with another service that calculates the Value at Risk (VaR) of the bank’s entire portfolio. In a similar fashion to the new wealth management application example previously cited, the dual components could be made available to many other enterprise wide financial services and applications that require portfolio valuation and risk related information. In this manner, the functionalities are not inefficiently repeated across every application where they are requested.
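The composition described above can be sketched in a few lines. Both functions below are hypothetical stand-ins for network services (the names, positions and return figures are invented); the point is that either one can be reused by any application that needs valuation or risk figures, rather than each application re-implementing them.

```python
# Two illustrative "services": portfolio valuation and a crude historical VaR.
# In a real SOA these would be remote calls; plain functions keep the sketch short.

def mark_to_market(positions, prices):
    """Value each holding at its current market price and sum the results."""
    return sum(qty * prices[asset] for asset, qty in positions.items())

def historical_var(portfolio_value, daily_returns, confidence=0.95):
    """Simple historical VaR: the loss at the chosen percentile of past returns."""
    worst = sorted(daily_returns)                      # worst day first
    index = int((1 - confidence) * len(worst))
    return -worst[index] * portfolio_value

# A downstream application composes the two services.
positions = {"AAPL": 100, "GOVT_BOND": 50}
prices = {"AAPL": 180.0, "GOVT_BOND": 98.0}

value = mark_to_market(positions, prices)              # 100*180 + 50*98 = 22900.0
risk = historical_var(value, [-0.03, -0.01, 0.0, 0.01, 0.02, -0.02, 0.015, 0.005])
print(value, round(risk, 2))
```

Any other enterprise application needing portfolio valuation or risk figures would invoke the same two services, so the logic is written and maintained once.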

Additionally, via use of SOA, “New technology provides the ability to mix and match outsourced business processes with on-premise assets and services” (Essvale Corporation Limited, 2011). Software designed or in use by third-party vendors can be more easily integrated with bank systems and processes due to the high connectivity of an SOA approach. According to Gartner research, smaller and mid-tier banks are adopting SOA in order to make the most of their limited IT budgets and resources. “Until now, midtier banks had to rely on customized software packages from a single vendor, and assumed all of the maintenance costs and function limitations inherent in a single, proprietary set of solutions” (Earley et al., 2005). Due to a rising interest in SOA, technology vendors that serve the financial services industry are increasingly working with integration providers to offer a standard set of component integration (Earley et al., 2005).

One of the benefits of SOA standardization is the enablement of more functionality performed by much less underlying code. This leads to less complex, more cost-effective system maintenance, thereby reducing operational risks.

“A fully implemented SOA provides a bank with a highly predictable application environment that reduces risk in day-to-day operations, due to the minimization and isolation of change to the production systems. Banks that fail to take this approach must constantly change their interfaces as external and internal requirements change. This introduces significant risk and the need for near-continuous testing to ensure that the customer ‘touchpoints’ and the back-end processes do not fail, while ensuring that one data or service change doesn’t adversely affect other data changes integrated through an interface” (Earley et al., 2005).

Conclusion

SOA has an important role to play in the architectural repertoire of banking and financial organizations. Its loosely coupled design characteristic allows services to be shared and reused across the enterprise without disparate systems concerning themselves with the underlying development code. Multiple services can be combined together to form reusable chunks of a business process. Outside sources can connect to legacy backend systems via an API, which increases the opportunity to mix and match vendor capabilities with in-house assets. SOA also helps banks and financial firms ramp up new applications and functionality quickly and economically, increasing product responsiveness to market demands. When SOA is combined with Event Driven Architecture, dynamic event driven systems can be developed that do not rely solely on the less proactive request/reply paradigm.

Banks and financial companies need to remain innovative, cost effective and anticipate customer needs in order to remain profitable. SOA allows organizations to become more agile and flexible with their application development. The rise of applications on mobile, cloud-enabled platforms means that customers will need to connect to data wherever it dwells. “Bank delivery is focused on reactively providing products on customer request, and mass-market, one-size-fits-all products (for mainstream retail banking). However, it is no longer feasible to force-fit mass-market bank products when technology devices, context and location are key elements to the type of customized bank service a customer needs” (Moyer, 2012). As SOA continues to mature with cloud-enabled solutions and the rise of mobile computing, it is primed to be the building block for the next generation of banking application functionality.

References:

Baskerville, R., Cavallari, M., Hjort-Madsen, K., Pries-Heje, J., Sorrentino, M., & Virili, F. (2010). The strategic value of SOA: a comparative case study in the banking sector. International Journal of Information Technology and Management, 9(1).

Banking Industry Architecture Network. (2012). SOA, standards and IT systems: how will SOA impact the future of banking services? Available from https://bian.org/wp-content/uploads/2012/10/BIAN-SOA-report-2012.pdf

Earley, A., & Free, D. (2002, September 4). SOA: A ‘Must Have’ for Core Banking (ID: SPA-17-9683). Retrieved from Gartner database.

Earley, A., Free, D., & Kun, M. (2005, July 1). An SOA Approach Will Boost a Bank’s Competitiveness (ID: G00126447). Retrieved from Gartner database.

Essvale Corporation Limited. (2011). Business knowledge for IT in global retail banking: a complete handbook for IT professionals.

Overview of Service Oriented Architecture


Service Oriented Architecture (SOA) can be described as an architectural style or strategy of “building loosely coupled distributed systems that deliver application functionality in the form of services for end-user applications” (Ho, 2003). Ho (2003) states that a service can be envisioned as a simple unit of work offered by a service provider. The service produces a desired end result for the service consumer. Another way to envision a service is as a “reusable chunk of a business process that can be mixed and matched with other services” (Allen, 2006). The services either communicate with each other (i.e. pass data back and forth) or work in unison to enable or coordinate an activity.

When SOA is employed for designing applications and/or IT systems, the component services can be reused across the enterprise, which helps to lower overall development costs, amongst other benefits. Reuse fosters consistency across the enterprise. For example, SOA enables banks to meet the needs of small but profitable market segments without the need to redevelop new intelligence for a broad set of applications (Earley, Free & Kun, 2005). Furthermore, any number of services can be combined to model a business process.

“One of the most important advantages of a SOA is the ability to get away from an isolationist practice in software development, where each department builds its own system without any knowledge of what has already been done by others in the organization. This ‘silo’ approach leads to inefficient and costly situations where the same functionality is developed, deployed and maintained multiple times” (Maréchaux, 2006).

Architectural Model

Services are accessed only through a published application programming interface (API). The API, which acts as the representative of the service to other applications, services or objects, is “loosely coupled” with its underlying development and execution code. The service’s development code is hidden from, and of no concern to, any outside client invoking the service. “This abstraction of service implementation details through interfaces insulates clients from having to change their own code whenever any changes occur in the service implementation” (Khanna, 2008). In this manner, the service acts as a “black box” whose inner workings and design are completely independent of requestors. If the underlying code of the service were switched from Java to C++, would-be requestors of the service would be completely oblivious to the change.
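The “black box” idea above can be sketched in a few lines: clients program against a published interface, and the implementation behind it can be swapped without any client code changing. All class and method names here are invented for illustration; in practice the interface would be a service contract (e.g. WSDL or an HTTP API) rather than a Python class.

```python
from abc import ABC, abstractmethod

class AccountLookupService(ABC):
    """The published interface: all a client ever sees of the service."""
    @abstractmethod
    def balance(self, account_id: str) -> float: ...

class LegacyMainframeLookup(AccountLookupService):
    """One implementation -- imagine a mainframe call behind this method."""
    def balance(self, account_id: str) -> float:
        return 1042.17

class RestLookup(AccountLookupService):
    """A replacement implementation -- imagine an HTTP call here instead."""
    def balance(self, account_id: str) -> float:
        return 1042.17

def client_report(service: AccountLookupService, account_id: str) -> str:
    # The client codes against the interface and is oblivious to
    # which implementation (or language, or platform) answers.
    return f"{account_id}: {service.balance(account_id):.2f}"

print(client_report(LegacyMainframeLookup(), "ACCT-1"))
print(client_report(RestLookup(), "ACCT-1"))   # same output, different backend
```

Swapping `LegacyMainframeLookup` for `RestLookup` changes nothing in `client_report`, which is the insulation Khanna describes.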

Allen (2006) describes the concept of loose coupling as “a feature of software systems that allows those systems to be linked without having knowledge of the technologies used by one another.” Loosely coupled software can be configured and combined with other software at runtime. Tightly coupled software does not offer the same integration flexibility, as its configuration is determined at design time. This design-time configuration significantly hinders reusability. In addition, loosely coupled applications are much more adaptable to unforeseen changes in business environments.

In the early 1990s, some financial firms adopted an object-oriented approach to their banking architecture. This approach is only superficially similar to a service oriented architecture approach. In an object-oriented (OO) approach, the emphasis is on reusing objects within the source code. SOA emphasizes a “runtime reuse” philosophy in which the service itself is discoverable and reused across a network (Earley, Free & Kun, 2005). SOA also provides a solution to the lack of interoperability between legacy systems.

References:

Allen, P. (2006). Service orientation: winning strategies and best practices.

Earley, A., Free, D., & Kun, M. (2005, July 1). An SOA Approach Will Boost a Bank’s Competitiveness (ID: G00126447). Retrieved from Gartner database.

Ho, H. (2003). What is Service-Oriented Architecture? O’Reilly XML.com.

Khanna, A. (2008). Straight through processing for financial services: the complete guide.

Maréchaux, J., (2006, March 28). Combining Service-Oriented Architecture and Event-Driven Architecture using an Enterprise Service Bus. IBM developerWorks. Retrieved from http://www.ibm.com/developerworks/library/ws-soa-eda-esb/