Airborne Express & The Evolution of the Air Express Industry 1973-2002

Once again, I’m digging into my digital “crates” to share a brief writeup. This one deals with Airborne Express and the air express industry. I composed this synopsis back in February of 2007, about a year before the organization ceased operations in the United States. Airborne Express was once a low-cost alternative to the FedEx-UPS duopoly and was eventually acquired by the parent company of DHL Express in 2003. As of 2017, DHL Supply Chain is the 4th largest supply chain and logistics company in North America [1]. Although DHL is a market leader in many countries internationally, the company could not find success with the Airborne Express acquisition in the United States. After five years of trying to make inroads, DHL shut down Airborne Express in 2008 and ultimately lost $10 billion (USD) on the venture [2].

History and Development of the Industry

In 1973, the majority of the air freight industry consisted of the major players in the passenger airline industry. A handful of all-cargo airlines, most notably Flying Tiger (founded by 10 pilots from the famous WWII volunteer fighter unit), also participated in this space [3]. The dynamics of the industry were soon changed by Fred Smith Jr. and Federal Express. While a student at Yale, Smith envisioned “that as companies relied more on computers and technology, they would want to keep their equipment working without creating a huge inventory of parts.” [4] Federal Express pioneered the hub-and-spoke route system. The central hub was located in Memphis and the “spokes” were the routes between Memphis and the cities that Federal Express served. The model proved to be a success due to its many efficiencies, and new entrants into the industry copied this operating process.

As federal regulations were relaxed on the air cargo industry, all-cargo carriers began to increase their route structure. The result was a substantial withdrawal from all-cargo flights by the major airlines and an increase in demand for next-day package delivery services.

The Eighties: Rapid Growth and Low Returns

Despite the rapid growth in the air express industry during the nineteen eighties, profit margins were declining. Federal Express needed to be aware of the strong competitive force represented by UPS’s potential entry into the overnight delivery industry. UPS decided to enter the industry in 1982 and by 1984 “it had a daily volume of about 175,000 packages for its overnight and second-day service, compared with about 290,000 packages for Federal Express.” [5] UPS’s aggressive push into the industry included a strategy to compete on price and offer overnight parcel services at prices that were half that of other industry players. In 1987 Federal Express further shocked the industry by initiating price cuts on its overnight service [6]. The move was designed to preempt expansion by UPS into this space. Brazen price discounting increased the rivalry amongst established firms and inevitably led to reduced profitability in the industry.

Another threat to industry profitability came from the strong competitive force represented by the bargaining power of buyers. Powerful buyers in an industry can squeeze profits out of that industry. Major corporations that utilized the air express industry began to pool together and leveraged their purchasing power to bargain for more price reductions. “To bag a three-year deal as International Business Machines Corp.’s primary U. S. overnight carrier, Airborne Freight Corp. dangled discounts as much as 84% below Federal’s rate card. With price-cutting of that magnitude, it’s clear why a 34% jump in volume produced only 13% more revenues for the air-express industry in the first half of 1987.” [7]

Furthermore, new technologies during this time were seen as a threat to industry players. Due to the burgeoning popularity of the fax machine during the eighties, financial analysts were predicting that they could eventually displace 30% of Federal Express’s overnight-letter shipments [8].

Rising Prices in 1989

The industry during this period had just emerged from the shakeout phase and was entering the mature phase of the industry life cycle. Purolator Courier, Emery, CF Air Freight and Flying Tiger had either been acquired or been marginalized due to poor operational efficiencies. As a result, the industry was dominated by a small number of companies. After many years of trying to take share from FedEx and Airborne Express, UPS recognized that the market had matured. During the mature stage of the industry life cycle, companies tend to reduce competition and preserve industry profitability. For the first time since it entered the air express market in 1982, UPS began to raise prices on its next-day air service. UPS wanted to convey to the rest of the industry that the price war was over. Price signaling was clearly utilized to influence the rest of the players in the industry to raise their prices. Federal Express and Airborne were more than happy to implement a tit-for-tat strategy and raise their own prices. UPS had succeeded in raising industry profitability, although the upcoming recession rendered the effect short-lived.

Airborne Express Strengths and Weaknesses

Airborne Express remains a distant third in the US air express industry in terms of market share. However, the company did manage to survive the industry shakeout of the late nineteen eighties due to its distinctive competencies and low cost structure. (See Figure 1)

One of the company’s main strengths is its low cost structure relative to its peers. This low cost structure helped Airborne maintain a 16.5% share of the U.S. domestic express market in 2001, roughly the same as in previous years [10]. Airborne’s other strengths include ownership of its own airport, its exclusive grant of a foreign trade zone and its patent on C-containers. In addition, Airborne enjoys very high brand loyalty; it won the Brand Keys Customer Loyalty Award in the parcel delivery category for five years in a row [9]. Brand loyalty is a significant asset as it helps a company retain market share.

In order to compete with its more powerful rivals, Airborne concentrated on the niche market of high-volume corporate accounts. While this strategy provides consistent volume, it does have its drawbacks. First, a predominantly corporate customer base makes Airborne much more sensitive to downturns in the economy. Second, high-volume customers are able to drive down prices and can command significant discounts.

Recommendations

Airborne needs to continue to cut its costs and increase its productivity in order to compete with the larger players in the industry. Its operating costs and capital spending were slashed from $368 million in 2000 to $126 million in 2001 [10]. On the productivity side, Airborne has taken positive steps by centralizing its customs brokerage service at its sort facility in Wilmington, Ohio. “This change will improve customer service and provide Airborne with greater regulatory control.” [11] Lower costs and productivity improvements will begin to pay off when the economy rebounds and will place the company in a more favorable position.

Airborne also needs to ramp up its domestic ground services in order to hedge against the shrinking market in high margin overnight deliveries. Domestic overnight shipments fell from 58% of total volume in 1998 to 52% in 2001 [12]. The company needs to be ready to respond to this shift in demand and capture revenues in ground delivery services.

Although Airborne needs to maintain ties to its corporate customer base, it also needs to pay more attention to smaller customers. Its ground delivery service (GDS) was introduced on a limited basis and was targeted at large corporate customers. This service needs to be expanded to individual customers and small businesses in order to take advantage of the large capital outlay Airborne made to establish ground services. The number of shipping kiosks available to individual customers could be increased through a strategic alliance with a retail partner. In this manner Airborne can compete with the FedEx/Kinko’s and UPS/Mail Boxes Etc. partnerships.

References

[1] Transport Topics: http://www.ttnews.com/top50/logistics/ (retrieved May 13, 2017)

[2] http://content.time.com/time/specials/packages/article/0,28804,1855948_1864555_1864556,00.html (retrieved May 13, 2017)

[3] http://www.answers.com/topic/flying-tiger-line

[4] Miller, Karin. “FedEx founder favors ‘Buck Rogers Ideas’” Fort Wayne Journal Gazette. Final Edition (2 Dec. 2001): 1D. Factiva

[5] Rotbart, Dean. “Federal Express Sinks Near Its 52-Week Low, And ‘Buy’ Recommendations Are Appearing.”  The Wall Street Journal (20 Mar 1984): Factiva

[6] Foust, Dean. “Top Of The News FEDERAL EXPRESS DELIVERS A PRICE SHOCK — It’s firing the first shot in what could be an industrywide war” Businessweek (30 March 1987): Vol 2991, pg 31. Factiva

[7] Foust, Dean. “The Corporation WHY FEDERAL EXPRESS HAS OVERNIGHT ANXIETY — UPS, facsimiles, and a mature market have it worried” Businessweek (9 Nov 1987): Vol 3025, pg 62. Factiva

[8] Foust, Dean “Cover Story MR. SMITH GOES GLOBAL — HE’S PUTTING FEDERAL EXPRESS’ FUTURE ON THE LINE TO EXPAND OVERSEAS”  Businessweek (13 Feb 1989): Vol 3091, pg 66. Factiva

[9] “Air & Express Briefs.” Traffic World Magazine (15 July 2002) Factiva

[10] Putzger, Ian “Airborne Again” The Journal of Commerce Week (11 Mar 2002) Factiva

[11] “Airborne Express Enhances International Express Services” PR Newswire (7 Oct 2002) Factiva

[12] “Moody’s Lowers Airborne Express Senior Notes To Ba1” Dow Jones Corporate Filings Alert (15 Mar 2002) Factiva

More Than You Want to Know About State Street Bank’s Technology Strategy Part 3

This article is a continuation of my earlier analyses (Part 1, Part 2 here) where I waded into State Street’s strategy for Technology Infrastructure, IT Capability and Staffing, Information Risk & Security, Stakeholder Requirements, and Project ROI. In this final part of my three-part series I will broach the company’s strategy for Data Acquisition, Social Media, Organizational Change Management and Project Strategy. State Street’s cloud implementation and virtualization initiative is a worthy example of business strategy/need influencing the firm’s information technology strategy.

State Street: Strategy for Data Acquisition and Impact on Business Processes:

The nature of State Street’s business as a custodian bank with trillions of dollars under custody management and multiple clients distributed worldwide means that the organization houses and processes a tremendous amount of data (internally generated and externally collected). The sheer volume and complexity of this data presents challenges as the bank looks to file regulatory reports and provide data back to its clients. The company receives an untenable 50,000 faxes a month from its client base (Garber, 2016). According to consulting firm Accenture, “(State Street) was unable to adequately track trades through each step in the trading lifecycle because there were multiple reconciliation systems, some reconciliation work was still being done manually and there was no system of record. To maintain industry leadership and comply with regulations, the company’s IT platform had to advance” (Alter, Daugherty, Harris, & Modruson, 2016).

The bank’s cloud initiative helped facilitate and speed up the burdensome process of transferring data back and forth between the bank and its clients. In addition, (as of 2016) a new digital initiative (code-named State Street Beacon) will potentially help drive more cost savings. The cloud architecture project was a boon to data analysis capabilities, as it enabled clients to access their data in the State Street cloud and subsequently enrich that data from multiple sources to support forecasting (Camhi, 2014). In this case, the bank’s infrastructure as a service (IaaS) enabled platform as a service (PaaS) capabilities.

The bank has embarked upon the development of some 70 mobile apps and services that support its PaaS strategy. In one case, the bank developed a tablet- and mobile-accessible app for its client base named State Street Springboard. This application put investment portfolio data in the hands of its client base. Additionally, “Since State Street’s core competency is transaction processing, its Gold Copy app is one of the most important new tools it offers: The app lets a manager follow a single ‘gold’ version of a transaction as it moves through all of the company’s many departments and office locations — say, as it makes its way through trading, accounting, and reporting offices globally” (Hogan, 2012). This capability of the Gold Copy app enables more effective management of counterparty risk as an asset moves through the trading process.

State Street’s new infrastructure and massive data collection provides the bank with new big data capabilities that can better inform business units in the area of risk management. Data insights can potentially fortify stress testing, “What-If” and “Black Swan” scenario modeling. As we’ve seen in the recent 2016 case of Britain’s pending withdrawal from the European Union (i.e. “Brexit”), uncertainty in global financial markets is a certainty.

State Street: Strategy for Social Media/Web Presence:

State Street maintains traditional Facebook, LinkedIn and Twitter presences, but it has also used other social platforms to support various aims such as employee interaction and brand awareness. The bank was named a “Social Business Leader for 2013” by InformationWeek. As part of a “social IT” strategy (in which technology supports a collaboration initiative), the company held an “Innovation Rally” on its eight internal online forums to gather employee ideas on how best to improve its business. From 12,000 total submitted postings, employees could attempt to build a business case around the best ideas for executive management to implement (Carr, 2013). The company also launched an internal “State Street Collaborate” platform with the aim of crowdsourcing employee knowledge to help people find an appropriate in-house expert on diverse work-related topics of interest.

Additional social initiatives include a partnership with TED to provide employees the opportunity to give a “TED Talk” in front of their peers; the aim is to promote knowledge sharing within the organization. The bank also experiments with a presence on the (almost defunct) video sharing platform “Vine,” where it can showcase the organization in quick six-second sound bites. This approach caters to clients and the future millennial talent base.

State Street: Strategy for Organizational Change Management, Project Strategy and Complexity:

As stated earlier in this series, State Street started migrating its new cloud applications to production by selecting those with low volume and low complexity and then gradually ramped up to migrating the more complex applications (McKendrick, 2013). The standardization and virtualization aspects of the cloud infrastructure that the bank implemented are conducive to agile development. Standardization and a reusable-code approach reduce complexity by limiting development choices, simplifying maintenance and enabling new technology staff to get up to speed on fewer systems. Developers are placed in agile teams with business subject matter experts to provide guidance and to increase stakeholder buy-in. Per Perretta, “We circle the agile approach with additional governance to ensure that the investments are paying off in the appropriate timeframe” (High, 2016).

Another key approach State Street uses to gauge project complexity is predictive analytics. The bank can help its internal business teams better understand project costs and delivery timetables by analyzing historical data on all of the projects implemented over the years. The predictive analytics model uses inputs such as “…scope, team sizes, capability of the team, the amount of hours each team member spent, and ultimately, how well it delivered on these programs” (Merrett, 2015). As the business teams list their project requirements, the predictive model is created in real time, which provides all parties with additional clarity.
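To make the idea concrete, here is a minimal sketch of how such a model could be assembled. The feature names, sample figures and the choice of a simple linear regression are my own assumptions for illustration, not State Street’s actual model.

```python
# Illustrative only: a toy cost-estimation regression in the spirit of the
# predictive analytics approach described above. Features and figures are invented.
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical historical projects: [scope points, team size, capability score, hours spent]
X = np.array([
    [120,  5, 3, 2400],
    [300,  9, 4, 7100],
    [80,   4, 2, 1900],
    [450, 12, 5, 9800],
])
y = np.array([310_000, 920_000, 260_000, 1_250_000])  # delivered cost in dollars

model = LinearRegression().fit(X, y)

# As a business team enters requirements, score the proposed project on the spot.
proposed = np.array([[200, 7, 4, 4500]])
print(f"Estimated cost: ${model.predict(proposed)[0]:,.0f}")
```

In practice the value comes less from the specific algorithm than from keeping a clean history of past projects to train against.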

Finally, it would be remiss to mention banking and transformation in the same breath without mentioning the requisite layoffs and outsourcing activities. For all the benefits of the bank’s cloud computing initiatives, technology workers who do not fit in with the new paradigm find themselves subject to domestic and offshore outsourcing. A standardized infrastructure platform leads to fewer distinct systems in the technology ecosystem, an emphasis on code reuse and increased automation. This perfect storm of efficiency gains has led to roughly 850 IT employees being shuffled out of the organization to either IBM, India-based Wipro or outright unemployment. Staffing cuts occurred amongst the employees who maintained and monitored mainframes and worked with other non-cloud infrastructure systems. The bank was interested in swapping fixed costs for variable costs by unloading IT staff who were perceived as not working on innovative, cutting-edge technologies. The layoffs amount to “21% of State Street’s 4,000 IT employees worldwide” (Tucci, 2011b).

References:

Alter, A., Daugherty, R., Harris, J., & Modruson, F., (2016). A Fresh Start for Enterprise IT. Accenture. Retrieved from https://www.accenture.com/us-en/insight-outlook-journal-fresh-start-enterprise-it


Camhi, J. (2014). Chris Perretta Builds Non-Stop Change Into State Street’s DNA. Bank Systems & technology. Retrieved from http://www.banktech.com/infrastructure/chris-perretta-builds-non-stop-change-into-state-streets-dna/d/d-id/1317880

Carr, D. (May 30, 2013). State Street: Social Business Leader Of 2013. Retrieved 6/25/16 from http://www.informationweek.com/enterprise/state-street-social-business-leader-of-2013/d/d-id/1110179?


Garber, K. (February 29, 2016). State Street doubles down on digital. SNL Bank and Thrift Daily. Retrieved from Factiva 6/20/16.

High, P. (February 8, 2016). State Street Emphasizes Importance Of Data Analytics And Digital Innovation In New Role. Retrieved from http://www.forbes.com/sites/peterhigh/2016/02/08/state-street-emphasizes-importance-of-data-analytics-and-digital-innovation-in-new-role/#a211b1320481

Hogan, M. (September 3, 2012). State Street’s Trip to the Cloud. Barron’s. Retrieved from Factiva 6/20/16

McKendrick, J. (January 7, 2013). State Street’s Transformation Unfolds, Driven by Cloud Computing. Forbes. Retrieved from http://www.forbes.com/sites/joemckendrick/2013/01/07/state-streets-transformation-unfolds-driven-by-cloud-computing/#408e1acf64cf

Merrett, R. (April 2, 2015). How predictive analytics helped State Street avoid additional IT project costs. CIO. Retrieved from http://www.cio.com.au/article/571826/how-predictive-analytics-helped-state-street-avoid-additional-it-project-costs/

Tucci, L. (July 20, 2011). State Street tech layoffs: IT transformation’s dark side. Retrieved from http://searchcio.techtarget.com/blog/TotalCIO/State-Street-tech-layoffs-IT-transformations-dark-side

How to Build a Waterfall Chart in Tableau

In this video I will show you how to go “Chasing Waterfalls” in Tableau (apologies to TLC). Waterfall charts are ideal for demonstrating the journey between an initial value and an ending value. It is a visualization that breaks down the cumulative effect of positive and negative contributions. You’ve probably seen them used in financial statements or at your quarterly town hall meeting. Enjoy!
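For readers who want to see the arithmetic behind the chart outside of Tableau, here is a minimal sketch of how the running totals for a waterfall can be computed; the category names and figures are hypothetical.

```python
# Each contribution bar starts where the running total of the prior steps left off;
# the final bar shows the ending value measured from zero. Data is made up.
import matplotlib.pyplot as plt

labels  = ["Start", "Product A", "Product B", "Returns", "Overhead", "End"]
changes = [100, 40, 25, -15, -30]  # initial value plus positive/negative contributions

running = [0]
for c in changes:
    running.append(running[-1] + c)

bottoms = running[:-1] + [0]            # floating bases for each contribution
heights = changes + [running[-1]]       # final bar = ending value
plt.bar(labels, heights, bottom=bottoms)
plt.title("Waterfall: initial value, contributions, ending value")
plt.show()
```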

If you’re interested in Business Intelligence & Tableau please subscribe and check out my videos here: Anthony B. Smoak

Coursera Final Assignment: Essential Design Principles for Tableau

Dashboard 1

I recently completed Essential Design Principles for Tableau, offered by the University of California, Davis on Coursera, and I’ll offer some review commentary. I thought it was a solid class as it covered data visualization concepts such as pre-attentive attributes and the Gestalt principles. The class was a bit heavier on the conceptual side of the house as opposed to delving into practical Tableau instruction. However, other classes in the specialization take a more hands-on, practical approach.

In this assignment we had to highlight the three worst performing product Sub-Categories in each region. Additionally, we had to demonstrate how these worst performers compared to the other product Sub-Categories in their respective regions. Finally, the visualization had to highlight the three worst performing Sub-Categories overall with a color emphasis. The scenario given to the class was that a sales manager had to cut the three worst performing Sub-Categories in her region and needed a visualization that addressed her concerns.

Guidance was not provided on how to identify the three worst performing categories. Some people in the class simply used profit as their key performance indicator (KPI) which I think is misguided. You learn in business (or business education) that profits do not equal profitability.  From Investopedia:

Profitability is closely related to profit, but it is the metric used to determine the scope of a company’s profit in relation to the size of the business. Profitability is a measurement of efficiency – and ultimately its success or failure. It is expressed as a relative, not an absolute, amount. Profitability can further be defined as the ability of a business to produce a return on an investment based on its resources in comparison with an alternative investment. Although a company can realize a profit, this does not necessarily mean that the company is profitable.

For these reasons I used the Average Profit Ratio of the products in each Sub-Category as my KPI, as opposed to raw profits. If you had to sell $100,000 of product A to make $1,000 in profit (a 1% profit ratio), would you eliminate product B, which requires only $1,000 in sales to generate $500 in profit (a 50% profit ratio)? Only if you want to go out of business!
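As a rough illustration of this KPI logic outside of Tableau, the sketch below computes a profit ratio and pulls out the worst performers per region. The column names and figures are hypothetical stand-ins for the Superstore-style data used in the class, not the actual assignment dataset.

```python
# Toy example: rank sub-categories within each region by average profit ratio,
# not by raw profit. All figures are invented for illustration.
import pandas as pd

orders = pd.DataFrame({
    "Region":       ["East", "East", "East", "West", "West", "West"],
    "Sub-Category": ["Tables", "Phones", "Binders", "Tables", "Phones", "Binders"],
    "Sales":        [100_000, 1_000, 20_000, 80_000, 5_000, 15_000],
    "Profit":       [1_000, 500, -2_000, -4_000, 2_500, 300],
})

orders["Profit Ratio"] = orders["Profit"] / orders["Sales"]

# Average profit ratio per sub-category within each region, worst first.
avg_ratio = (orders.groupby(["Region", "Sub-Category"])["Profit Ratio"]
                   .mean()
                   .sort_values())

# The three worst performers in each region: the set to highlight in the viz.
worst_three = avg_ratio.groupby(level="Region").head(3)
print(worst_three)
```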

In order to complete the visualization you see above on Tableau Public I had to incorporate nested sorting principles and also highlight the three worst performing elements on a bar chart. Luckily for you, I have videos that will demonstrate how to accomplish these tasks.

You can check out the rest of my videos on my Youtube Channel or find them on this site under Videos.

How to Highlight the Top 3 Bar Chart Values in Tableau

In this video I will show you how to highlight the top three highest sales values on a bar chart. I will also teach you how to add a nested dimension and properly sort the values while keeping the top three values highlighted. Enjoy!

If you’re interested in Business Intelligence & Tableau please subscribe and check out my videos here: Anthony B. Smoak


Get Out of the Spreadsheet Abyss

When an organization turns a blind eye to the proliferation of spreadsheet based processes, it subjects itself to substantial risks. Have you ever encountered (or enabled) the following scenario?

  • Enterprise reporting is lacking, thus a “power user” (i.e. an analyst) is conscripted into cobbling together an ad-hoc, spreadsheet-based process to address a management request;
  • The power user exports data from various business applications and then manipulates the data output typically with macros and formulas;
  • This initial spreadsheet is manually integrated with business unit data from various other unofficial spreadsheets, further distancing the data from the source business application;
  • Multiple tabs are added, charts are generated and data is pivoted (all manually);
  • Management finds value in the report and elevates it to a repeatable process;
  • The original request increases in complexity over time as it requires more manipulations, calculations and rogue data sources to meet management needs;
  • Management doubles down with the need for a new request and the process is repeated;
  • IT proper is NEVER consulted on any of the requests.

The business unit is now supporting a “spreadmart”. The term is considered derogatory in data circles.

“A spreadmart (spreadsheet data mart) is a business data analysis system running on spreadsheets or other desktop databases that is created and maintained by individuals or groups to perform the tasks normally done by a data mart or data warehouse. Typically a spreadmart is created by individuals at different times using different data sources and rules for defining metrics in an organization, creating a fractured view of the enterprise.” [1]

Although the initial intentions of these requests may be reasonable, the business never bothers to approach IT to propose building out a proper data store. Additionally, the conscripted analysts are unhappy with their additional manual responsibilities. Spreadsheet wrangling and manual integration activities shift precious time away from more value-added pursuits such as data analysis and formulating recommendations.

From management’s perspective, why should they pay IT to build out an officially sanctioned solution that will deliver the same information that an internal team of analysts can provide? After all, the spreadmart is responsive (changes can be made quickly) and it’s inexpensive (as opposed to new investments in IT). Eventually, the manual processes are baked into the job description and new hires are enlisted to expand and maintain this system. The business sinks deeper and deeper into the spreadsheet abyss.

The short term rewards of the spreadmart are generally not worth the longer term risks.

Risks:

“It’s not an enterprise tool. The error rates in spreadsheets are huge. Excel will dutifully average the wrong data right down the line. There’s no protection around that.” [2]

The spreadmart can also be bracketed as a “data shadow” system to borrow a term from The Business Intelligence Guidebook, authored by Rick Sherman. Here are the problems associated with “data shadow” systems as paraphrased from The Business Intelligence Guidebook [3]:

  • Productivity is severely diminished as analysts spend their time creating and maintaining an assortment of manual data processes;
    • I would add that team morale suffers as well;
  • Business units have daggers drawn as they try to reconcile and validate whose numbers are “more correct”;
    • As a result of a new silo, the organization has compounded its data governance issues;
  • Data errors can (and will) occur as a result of manual querying, integrating and calculating;
  • Data sources can change without notice and the data shadow process is not on IT’s radar for source change notifications;
  • Embedded business logic becomes stagnant in various complex macros or code modules because they are hidden or simply not understood by inheritors;
  • The solution doesn’t scale with increasing data volume or number of users;
  • Audit trail to ensure control and compliance does not exist;
    • “It is often ironic that a finance group can pass an audit because the IT processes it uses are auditable, but the data shadow systems that they use to make decisions are not, and are ignored in an internal audit”;
  • Process and technical documentation does not exist which impacts the ability to update the solution;

Additionally, these processes are not backed up with any regularity, multiple versions may exist on multiple users’ desktops and anyone can make changes to the embedded business logic. The bottom line is that the business is potentially making decisions based upon erroneous data which can have serious financial and reputational impacts.

“F1F9 estimated that 88 percent of all spreadsheets have errors in them, while 50 percent of spreadsheets used by large companies have material defects. The company said the mistakes are not just costly in terms of time and money – but also lead to damaged reputations, lost jobs and disrupted careers.” [4]

Mitigation:

There is nothing wrong with the business responding to an emerging issue by requesting a one-time, ad-hoc solution. The highest risks emerge when the ad-hoc process is systematized, repeatable ad-hoc processes proliferate unchecked, and IT is never involved in any of the discussions.

IT proper is highly effective when it is allowed to merge, integrate and validate data. Business unit analysts and spreadsheets should be out of the collection and integration game for repeatable management reporting. Analysts should focus on analysis, trending and interpretation. Too often analysts get tossed into productivity traps involving hours of cutting, pasting and linking to someone else’s spreadsheet for data integration in order to meet management demands.

When IT is brought into the discussion, it must not point fingers but rather seek to understand why the shadow system was established in the first place. Likewise, the business unit should not point fingers at IT for being unresponsive or limited by budget constraints. Once the peace treaty has been established, IT should analyze and reverse-engineer the cobbled-together integration processes and data sources (which admittedly is a time-consuming effort) and deliver more controlled and scalable processes.

The new data integration processes should culminate in loading data to a business-specific, validated, central data mart. The central mart doesn’t try to impose an unfamiliar tool upon the business but rather automates integration activities and references more “trustworthy” data sources. Spreadsheets can still be used by analysts to access the data, but the analysts are no longer expected to be manual aggregators wielding a sub-standard ETL tool.

“Go with the flow. Some users will never give up their spreadsheets, regardless of how robust an analytic environment you provide. Let them keep their spreadsheets, but configure them as front ends to the data warehouse. This way, they can use their spreadsheets to access corporate-approved data, metrics and reports. If they insist on creating new reports, provide an incentive for them to upload their reports to the data warehouse instead of distributing them via e-mail.” [5]
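As a loose illustration of the “front ends to the data warehouse” idea above, the sketch below pulls a governed summary from a hypothetical central data mart and hands it back to the analyst as a spreadsheet. The connection string, table and column names are invented for the example.

```python
# One validated, IT-managed query replaces the manual export/paste/pivot cycle.
# All identifiers below (host, database, table, columns) are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://analyst@dwh-host/finance_mart")

monthly_summary = pd.read_sql(
    """
    SELECT business_unit, fiscal_month, SUM(net_revenue) AS net_revenue
    FROM sales_fact
    GROUP BY business_unit, fiscal_month
    """,
    engine,
)

# Analysts can still land the result in Excel for presentation,
# without owning the integration logic themselves.
monthly_summary.to_excel("monthly_summary.xlsx", index=False)
```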

Have I ever had to “get out” of a situation where data governance was lacking and burned-out, morale depleted analysts spent all of their time collecting and integrating spreadsheets to maintain an inefficient spreadmart?

I’ll never tell!

References:

[1] https://en.wikipedia.org/wiki/Spreadmart

[2] http://ww2.cfo.com/analytics/2012/01/imagine-theres-no-excel/

[3] Sherman, R. (2015). Business intelligence guidebook: from data integration to analytics.

[4] http://www.cnbc.com/id/100923538

[5] https://www.information-management.com/news/the-rise-and-fall-of-spreadmarts

AOL Time-Warner: You’ve Got Merger

Most people have fond memories of nineties staple AOL, which was purchased by Verizon for $4.4 billion in 2015. AOL was a true internet pioneer that provided many customers their first taste of dial-up internet access. After its much-ballyhooed purchase of “old media” company Time-Warner in 2000, AOL Time-Warner began a slow decline in popularity as emerging broadband technology cut into its market share. By 2003, the combined company had posted a $99 billion loss.

The AOL Time-Warner association finally unraveled in 2009 when AOL was spun off from Time-Warner. The AOL Time-Warner merger remains (as of 2017) the biggest in US history at roughly $166 billion. Currently, Time-Warner’s market capitalization is north of $75 billion, while AOL’s value is estimated at about $2.5 billion.

Fast forward seventeen years after the merger: communications behemoth Verizon plans to launch a new division called “Oath,” which will house AOL and its other media properties. I suppose the name is a reaction to the “fake news” phenomenon that surfaced in the 2016 election cycle. One of Oath’s more prominent holdings is another nineties staple, Yahoo, which was recently purchased by Verizon for $4.8 billion.

The reason I decided to take this AOL trip down memory lane was to share a brief case write-up I generated back in March of 2007 for an MBA strategy class. The write-up briefly discusses the history of AOL and its fateful merger with Time-Warner. Keep in mind that this perspective is from 2007.

History and Development

The media conglomerate known as AOL Time-Warner was formed when America Online, Inc. merged with Time-Warner Inc. on January 11, 2001. At the time of the announcement, Time-Warner was the world’s largest media and entertainment company, with revenues of $26.8 billion, approximately five and a half times AOL’s $4.8 billion [1]. This $166 billion marriage of leading companies in content assets and internet distribution was the largest proposed merger ever.

AOL initially began operations as Quantum Computer Services. “In 1985, Quantum began offering a graphical-user interface (GUI) BBS for PCs and soon expanded GUI services to Apple and Tandy computers.” [2] The company was renamed America Online in 1989 and concentrated on providing easy online access to a predominantly technically illiterate mainstream audience. By the time of the merger with Time-Warner, AOL had grown to be the nation’s largest online company with close to 22 million subscribers.

Time was originally founded by Henry Luce and Briton Hadden with $86,375 borrowed from friends and Yale classmates [3]. Warner Bros. Pictures, Inc. was formed by brothers Harry, Albert, Sam and Jack Warner in 1923 [4]. In 1990, Time Inc. merged with Warner Communications. The $18 billion merger would allow Time to benefit from Warner’s strong international distribution, while Warner would gain from Time’s strong programming [5].

Flat Rate Pricing

AOL initially employed a two-part pricing strategy for access to its services. Members were offered a $19.95 rate for 20 hours of use and were then charged $2.95 for each additional hour they spent online. AOL at this point in time was subject to very strong competitive forces in the marketplace. The company had previously stuck to its two-part pricing strategy even though it had trouble keeping subscribers, because smaller rivals were offering unlimited use of the Internet for a single fee. When AOL’s second biggest competitor, Microsoft, attempted to set the market price with a flat scheme of $19.95 for unlimited access, AOL had to match it in order to keep its subscribers from defecting.
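A quick back-of-the-envelope comparison shows why the flat rate mattered so much to heavy users. The plan terms come from the figures above; the usage levels are hypothetical.

```python
# Compare AOL's old two-part tariff with the flat rate for a few usage levels.
def two_part_price(hours, base=19.95, included=20, overage=2.95):
    """Old plan: $19.95 for the first 20 hours, $2.95 per additional hour."""
    return base + max(0, hours - included) * overage

FLAT_RATE = 19.95  # new plan: unlimited access

for hours in (10, 20, 35, 60):  # hypothetical monthly usage
    print(f"{hours:>3} hrs: two-part ${two_part_price(hours):6.2f} vs. flat ${FLAT_RATE:.2f}")
```

At 35 hours a month the old plan cost $64.20, more than three times the flat rate, which helps explain the surge in demand once pricing changed.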

The shift in pricing strategy had a tremendous effect on the demand for AOL’s service. The subsequent surge in demand illustrates that the demand for AOL’s services was elastic. The price elasticity of demand for certain products or services is highly contingent upon the number and closeness of the substitutes available. For internet access there were many options that were available to consumers during the mid nineties. Thus if the price of a close substitute were to be reduced, buyers of other products would be enticed to switch to the lower cost option. In AOL’s case, buyers switched to the option that offered the most value for the same price as associated switching costs were negligible. The value associated with AOL was its exclusive content and proprietary network in addition to broader internet connectivity. Competitive flat rate pricing along with AOL’s strong brand reputation and a highly elastic demand helped increase its subscriber base and kept its current subscribers from switching to Microsoft’s rival MSN service.

AOL’s Quest for Bandwidth

In the late nineties AOL realized that consumers in the future would demand higher speed connectivity to online content. Unfortunately the only service that AOL offered at this point in time was low bandwidth connectivity via dial-up. The company was at an inflection point where it could decide to stay with a maturing technology or invest in ways to stay competitive as the internet connectivity landscape transitioned. Several high bandwidth options were in the running to become the dominant technology of the future. Of these technologies, access via cable modems using the coaxial cable used to transmit TV signals looked to be the most promising. AOL realized that it would have to develop new strategies to stay competitive in the upcoming high growth mass market populated by the early majority of cable modem users.

AOL’s competitors, Microsoft and AT&T, made significant investments in or outright purchases of existing cable operators. When AT&T moved to acquire TCI, AOL lobbied the FCC to make the deal conditional on TCI opening its cable networks to rival ISPs. “Predictably, this proposal did not sit well with cable operators, especially since they have spent billions on infrastructure upgrades to sell their own Internet-over-cable services.” [6] This strategy proved unsuccessful for AOL, so the company forged ahead with plays in other broadband categories such as satellite and DSL technologies. Strategic alliances with Hughes Electronics, Bell Atlantic and SBC Communications allowed AOL to hedge against a proliferation in broadband connectivity.

Strategy Behind the Merger

In theory, the merger of AOL and Time-Warner would allow both companies to realize substantial synergies. According to Steve Case, “We will draw on one another’s strengths, combining AOL’s superior distribution capacity and Internet expertise with Time-Warner’s programming and cable network assets.” [7] The combined conglomerate would give the new company unprecedented reach across traditional media and new internet media. As an example, the conglomerate could offer advertisers a multimedia package encompassing AOL’s internet offerings and Time-Warner’s traditional media properties. In addition, AOL would finally gain access to a cable network, allowing it to provide high-speed access via the promising coaxial cable method.

Another justification for the merger was the expected cost savings that the new company would realize. For example, “AOL will also be able to shave significant customer acquisition costs by taking advantage of Time-Warner’s vast CD music printing business. One of AOL’s most expensive marketing costs is outsourcing the pressing of its software CDs, which are sent to prospective customers.” [8] AOL Time-Warner would be presented with bundling opportunities, as hit music CDs could carry AOL marketing material and software.

Beyond 2002

Two years after the historic merger with Time-Warner, AOL’s advertising revenue has dropped and its subscription growth has slowed. As the internet landscape moves toward broadband, AOL still relies heavily upon dial-up service. A key problem for AOL at this juncture is how to keep users from defecting as they switch to high-speed access over cable modems. AOL must concentrate on enriching its content to remain a viable player in the internet landscape. “Content, broadly defined-from downloading music and films to exclusive movie and news clips to prime-time series previews (“The Sopranos” and “Friends”) to the pages of Time magazine-also will be the catalyst that entices dial-up, narrowband subscribers to the more lucrative broadband front as AOL transforms itself into the HBO of the Internet.” [9]

Secondly, AOL Time-Warner should concentrate on acquiring more cable operations in order to increase the reach of its broadband services. In 2002 Cablevision was selling at about 25% of its value, and Adelphia had indicated that it would sell some of its best cable assets as well [10].

References

[1] Sutel, Seth. “Time Warner being acquired by AOL for about $166 billion”  Associated Press Newswires (10 January 2000): Factiva

[2] http://www.historyoftheinternet.com/chap5.html

[3] http://money.cnn.com/2000/01/10/deals/aol_warner/timeline.htm

[4] http://en.wikipedia.org/wiki/Warner_Bros.

[5] Coy, Peter. “Time Inc. and Warner Communications Merge” The Associated Press. (4 March 1999): Factiva

[6] “Coax Access Fight Goes Regional City Councils Weigh TCI-AT&T Merger” ISP Business News Vol. 5, Issue: 3 (18 January 1999): Factiva

[7] Auchard, Eric. “FOCUS – AOL, Time Warner agree to world’s biggest merger.” Reuters News (10 January 2000): Factiva

[8] Cho, Joshua “AOL-TW Synergies Meet with Skepticism.(Company Business and Marketing)” Cable World Volume 12; Issue 11 (13 March 2000): Factiva

[9] Mermigas, Diane “AOL and ABC should emphasize content” Electronic Media Vol: 21 Num: 48 (2 December 2002): Factiva

[10] Gilpin, Kenneth “Cable Industry Plays Catchup” The New York Times (19 May 2002): Factiva



Building a Donut Chart in Tableau Using NBA Data

In this video I will show you how to create a donut chart in Tableau. Since a donut chart is essentially a hoop, I put together this quick visualization using NBA data. Visualization aficionados will advise to use pie/donut charts sparingly but they can add value when showing values with respect to the whole. Enjoy!
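If you’d rather see the mechanics in code than in Tableau, here is a minimal sketch of a donut chart; the stat categories and percentages are made up for illustration.

```python
# A donut is just a pie chart with the center hollowed out via the wedge width.
import matplotlib.pyplot as plt

labels = ["3PT", "Mid-range", "Paint", "Free throws"]
shares = [32, 14, 41, 13]  # hypothetical percentages of total points

fig, ax = plt.subplots()
ax.pie(shares, labels=labels, autopct="%1.0f%%",
       wedgeprops={"width": 0.35})  # width < 1 turns the pie into a donut
ax.set_title("Share of total points (illustrative data)")
plt.show()
```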

More Than You Want to Know About State Street Bank’s Technology Strategy Part 2

This article is a continuation of my earlier analysis (Part 1 here) where I waded into State Street’s strategy for Technology Infrastructure and IT Capability and Staffing. In this second part of my three-part series I will broach the company’s strategy for information risk and security, stakeholder requirements and project return on investment. State Street’s cloud implementation and virtualization initiative is a good example of business strategy/need influencing the firm’s information technology strategy.

State Street: Strategy for Information Risk & Security:

State Street has acquired a substantial client base and houses sensitive financial data that is subject to regulatory scrutiny. Given the sensitive nature of its data and operations, the bank chose to implement a virtualized private cloud. Former Chief Innovation Officer Madge Meyer stated, “We’re totally virtualized, our network is a virtual private IP network. Our servers are 72 percent virtualized and our storage is all virtualized for structured/unstructured data” (Burger, 2011). For State Street, a private cloud offers the benefits of a public cloud with the added advantage of being owned and operated by the bank (i.e. exclusive dedication). While no architecture is 100% secure, the risk of an information breach is mitigated because the controlling organization’s data can be completely isolated from the data of other organizations.

Additionally, the cloud implementation and virtualization initiative gave rise to shared services that are centrally managed but enforced across the enterprise. This single security framework can be applied across all of the application touch points precluding the need for multiple security frameworks across disparate systems.

State Street: Strategy for Stakeholder Requirements, Testing & Training/Support:

The architecture group within State Street works with the business to tie technology to strategic objectives. The idea to embrace a cloud implementation (and the additional data functionality it enabled for the bank’s clients) emanated from the architecture group. Thus, the business and the board of directors were key stakeholders in the initiative. The board of directors has a dedicated technology committee that receives “a complete rundown of the technology strategy and the work that we (IT group) are doing in terms of digitizing the business” (High, 2016). According to Perretta, “They (architecture group) created a proof of concept with an eye toward: Here are the capabilities that our entire organization is going to need, here are the technologies that we can deploy, and here’s how to make them operational” (Camhi, 2014).

State Street started migrating its new cloud applications to production by selecting those with low volume and low complexity and then gradually ramped up to migrating the more complex applications (McKendrick, 2013). Dual pilots of the cloud architecture were conducted using roughly 100 machines. Once favorable results were achieved, a larger pilot consisting of 500 machines was stood up. Approximately 120 use cases were tested in the pilot in order to let the development team understand the failure points of the system (Tucci, 2011a).

The standardization and virtualization aspects of the cloud infrastructure the bank implemented were conducive to agile development. Virtual machines in the cloud allowed development teams to spin up multiple server instances as opposed to physically installing a new box in the legacy non-virtualized environment. Contention between teams waiting for server access is virtually eliminated. “When adding cloud computing to agile development, builds are faster and less painful, which encourages experimentation” (Kannan, 2012). The relative ease with which development and testing servers can be instantiated promotes “spur of the moment” experimental builds that could yield additional innovative features and capabilities.

State Street: Project ROI and Key Success Measures:

Prior to State Street’s cloud infrastructure upgrade initiatives, potential operating cost savings were projected to be $575 million to $625 million by the end of 2014, a target State Street is on track to achieve. “The bank had pretax run-rate expense savings from the initiative of $86 million in 2011, $112 million in 2012, and $220 million in 2013” (Camhi, 2014).

When the IT group makes a budget request for substantial investments, they must lay out the potential benefits to the business. Some of the benefits are timely payback, regulatory compliance, data quality improvement and faster development cycle times (providing features and functionality with re-use and less coding). The ultimate aim is to connect the IT strategy to business results in a way that yields advantage for the organization.

In 2011, State Street published a matrix on the advantages of cloud computing vs. traditional IT. The following comparison provides insight into State Street’s IT and business unit considerations with respect to making an investment in a fixed or variable cost infrastructure (Pryor, 2011).

  • Cash Flow: traditional IT purchases hardware/software upfront; cloud computing incurs costs on a pay-as-you-go basis
  • Risk: traditional IT takes the entire risk upfront with uncertain return; cloud computing takes financial risk incrementally, matched to return
  • Income Statement Impact: traditional IT carries maintenance plus depreciated capital expense; cloud computing carries maintenance costs only
  • Balance Sheet Impact: traditional IT carries hardware/software as a long-term asset; cloud computing incurs cost on a pay-as-you-go basis

From a funding perspective, State Street employs the chargeback funding method for its private cloud initiative. Architectural capabilities empower end-users to automatically provision virtualized servers for usage. There are policies in place that determine how long a virtual server may remain instantiated and how much load balancing is performed across the infrastructure. Server usage is monitored, measured and chargeback is calculated based upon end-user processing time. Subsequently, the usage is billed back to the end-user’s respective business unit. “In short, it puts a management layer of software over the virtualized servers and operates them in a highly automated, low touch, fashion” (Babcock, 2011).
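As a simplified illustration of how a usage-based chargeback might be computed, here is a short sketch. The metering fields, business unit names and hourly rate are assumptions, not State Street’s actual billing model.

```python
# Toy chargeback calculation: bill each business unit for the metered processing
# time of the virtual servers it provisioned. All values are hypothetical.
from collections import defaultdict

RATE_PER_CPU_HOUR = 0.12  # hypothetical internal rate

# Metered usage records: (business_unit, virtual_server, cpu_hours)
usage = [
    ("Fund Accounting",    "vm-0142", 310.0),
    ("Fund Accounting",    "vm-0187",  95.5),
    ("Securities Lending", "vm-0220", 780.25),
]

invoice = defaultdict(float)
for business_unit, _server, cpu_hours in usage:
    invoice[business_unit] += cpu_hours * RATE_PER_CPU_HOUR

for unit, amount in invoice.items():
    print(f"{unit}: ${amount:,.2f} charged back this period")
```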

Don’t miss part 3 of the analysis:
More Than You Want to Know About State Street Bank’s Technology Strategy Part 3

References:

Babcock, C. (November 9, 2011). 6 big questions for private cloud projects. Information Week. Retrieved from Factiva.

Burger, K. (October, 1, 2011). Riding The Innovation Wave; Technology innovation has been key to State Street Corp.’s success, according to chief innovation officer Madge Meyer — and she’s been willing to take some risks to prove it. Bank Systems + Technology. Retrieved from Factiva

Camhi, J. (2014). Chris Perretta Builds Non-Stop Change Into State Street’s DNA. Bank Systems & technology. Retrieved from http://www.banktech.com/infrastructure/chris-perretta-builds-non-stop-change-into-state-streets-dna/d/d-id/1317880

High, P. (February 8, 2016). State Street Emphasizes Importance Of Data Analytics And Digital Innovation In New Role. Retrieved from http://www.forbes.com/sites/peterhigh/2016/02/08/state-street-emphasizes-importance-of-data-analytics-and-digital-innovation-in-new-role/#a211b1320481

Kannan, N. (August 20, 2012). 6 Ways the Cloud Enhances Agile Software Development. CIO. Retrieved from http://www.cio.com/article/2393022/enterprise-architecture/6-ways-the-cloud-enhances-agile-software-development.html

McKendrick, J. (January 7, 2013). State Street’s Transformation Unfolds, Driven by Cloud Computing. Forbes. Retrieved from http://www.forbes.com/sites/joemckendrick/2013/01/07/state-streets-transformation-unfolds-driven-by-cloud-computing/#408e1acf64cf

Tucci, L. (July 2011a). In search of speed, State Street’s CIO builds a private cloud. Retrieved from http://searchcio.techtarget.com/podcast/In-search-of-speed-State-Streets-CIO-builds-a-private-cloud

Michael Porter’s Generic Differentiation Strategy Explained

I previously touched upon Michael Porter’s generic cost leadership strategy here. Porter asserts that a business model can’t offer the best product or service at the lowest price and maintain a sustainable competitive advantage. An organization employing a strategy that attempts to be “all things to all people” will become stranded in mediocrity (i.e. earn less than industry average profitability).

A differentiation strategy holds that a business must offer products or services that are valuable and unique to buyers, above and beyond a low price. The ability of a company to command a premium price for its products or services hinges upon how valuable and unique those offerings are in the marketplace. A differentiator invests its resources to gain a competitive advantage from superior innovation, excellent quality and responsiveness to customer needs. [1]

“It should be stressed that the differentiation strategy does not allow the firm to ignore costs, but rather they are not the primary strategic target.” [2]

If you could boil the differentiation strategy down to a manageable sound bite, it would look something like this: differentiation enables a firm to command a higher price.

Starbucks coffee doesn’t taste materially better than offerings from rival Dunkin’ Donuts, but Starbucks has crafted the “Starbucks Experience” complete with intimate environments, sustainable sourcing and mobile ordering to differentiate itself with a cult-like following (i.e. command higher than industry average prices for a commodity item).

Advantages:

Differentiation allows a firm to build brand loyalty, obtain customers who exhibit less price sensitivity and increase its profit margins. As opposed to cost leaders, differentiators are not as concerned with supplier price increases. Differentiators can more easily pass on price increases to their customers because customers are more willing to pay the increases.

Differentiators are protected from powerful buyers since only they can supply the distinct product or service offering. Differentiators are also protected against substitutes and new entrants, in that a rival must invest substantial resources to both match the capabilities of the differentiator and break customer loyalty.

“Differentiation is different from segmentation. Differentiation is concerned with how a firm competes—the ways in which it can offer uniqueness to customers. Such uniqueness might relate to consistency (McDonald’s), reliability (Federal Express), status (American Express), quality (BMW), and innovation (Apple). Segmentation is concerned with where a firm competes in terms of customer groups, localities and product types.”[1]

Risks:

Porter asserts that there are risks to the differentiation strategy:

  • “The cost differential between low-cost competitors and the differentiated firm becomes too great for differentiation to hold brand loyalty. Buyers thus sacrifice some of the features, services, or image possessed by the differentiated firm for large cost savings;
  • Buyers’ need for the differentiating factor falls. This can occur as buyers become more sophisticated;
  • Imitation narrows perceived differentiation, a common occurrence as industries mature.”

All differentiators should be on guard against firms that seek to imitate their distinct offerings, and they should take care never to charge a higher price than the market will bear [1].

The differentiation strategy should not be mistaken for providing unique products simply for the sake of being unique; rather the differentiation should be tied to customer demand or willingness to pay.

References:

[1] Hill, Charles. W. L., & Jones, Gareth. R. (2007). Strategic Management Theory. Houghton Mifflin Company

[2] Porter, M. E. Competitive Strategy: Techniques for Analyzing Industries and Competitors. New York: Free Press, 1980.

Picture Copyright:  urubank / 123RF Stock