Month: April 2017

More Than You Want to Know About State Street Bank’s Technology Strategy Part 3

This article is a continuation of my earlier analyses (Part 1 and Part 2), where I waded into State Street’s strategy for Technology Infrastructure, IT Capability and Staffing, Information Risk & Security, Stakeholder Requirements, and Project ROI. In this final part of my three-part series I will broach the company’s strategy for Data Acquisition, Social Media, Organizational Change Management and Project Strategy. State Street’s cloud implementation and virtualization initiative is a worthy example of business strategy/need influencing the firm’s information technology strategy.

State Street: Strategy for Data Acquisition and Impact on Business Processes:

The nature of State Street’s business as a custodian bank with trillions of dollars under custody management and multiple clients distributed worldwide means that the organization houses and processes a tremendous amount of data (internally generated and externally collected). The sheer volume and complexity of this data presents challenges as the bank looks to file regulatory reports and provide data back to its clients. The company receives an untenable 50,000 faxes a month from its client base (Garber, 2016). According to consulting firm Accenture, “(State Street) was unable to adequately track trades through each step in the trading lifecycle because there were multiple reconciliation systems, some reconciliation work was still being done manually and there was no system of record. To maintain industry leadership and comply with regulations, the company’s IT platform had to advance” (Alter, Daugherty, Harris, & Modruson, 2016).

The bank’s cloud initiative helped facilitate and speed up the burdensome process of transferring data to and from its clients. In addition, (as of 2016) a new digital initiative (code named State Street Beacon) will potentially help drive more cost savings. The cloud architecture project was a boon to data analysis capabilities because it enabled clients to access their data in the State Street cloud and subsequently enrich that data from multiple sources to support forecasting (Camhi, 2014). In this case, the bank’s infrastructure as a service (IaaS) enabled platform as a service (PaaS) capabilities.

The bank has embarked upon the development of some 70 mobile apps and services that support its PaaS strategy. In one case, the bank developed a tablet and mobile accessible app for its client base named State Street Springboard. This application put investment portfolio data in the hands of its client base. Additionally, “Since State Street’s core competency is transaction processing, its Gold Copy app is one of the most important new tools it offers: The app lets a manager follow a single ‘gold’ version of a transaction as it moves through all of the company’s many departments and office locations — say, as it makes it way through trading, accounting, and reporting offices globally” (Hogan, 2012). This capability of the Gold Copy app enables more effective management of counterparty risk as an asset moves through the trading process.

State Street’s new infrastructure and massive data collection provides the bank with new big data capabilities that can better inform business units in the area of risk management. Data insights can potentially fortify stress testing, “What-If” and “Black Swan” scenario modeling. As we’ve seen in the recent 2016 case of Britain’s pending withdrawal from the European Union (i.e. “Brexit”), uncertainty in global financial markets is a certainty.

State Street: Strategy for Social Media/Web Presence:

State Street has traditional Facebook, LinkedIn and Twitter social presences but it has also used other social platforms to support various aims such as employee interaction and brand awareness. The bank was named a “Social Business Leader for 2013” by Information Week. As part of a “social IT” strategy (in which technology supports a collaboration initiative), the company held an “Innovation Rally” on its eight internal online forums to gather employee ideas on how best to improve its business. From 12,000 total submitted postings, employees could attempt to build a business case around the best ideas for executive management to implement (Carr, 2013). The company also launched an internal “State Street Collaborate” platform with the aim of crowdsourcing employee knowledge to help people find an appropriate in-house expert regarding diverse work related topics of interest.

Additional social initiatives include a partnership with TED to provide employees the opportunity to give a “TED Talk” in front of their peers; the aim is to promote knowledge sharing within the organization. The bank also experiments with a presence on the (almost defunct) video sharing platform “Vine,” where it can showcase the organization in quick six-second videos. This approach caters to clients and the future millennial talent base.

State Street: Strategy for Organizational Change Management, Project Strategy and Complexity:

As stated earlier in this series, State Street started migrating its new cloud applications to production by selecting those with low volume and low complexity and then gradually ramped up to migrating the more complex applications (McKendrick, 2013). The standardization and virtualization aspects of the cloud infrastructure that the bank implemented are conducive to agile development. Standardization and a reusable code approach reduce complexity by limiting development choices, simplifying maintenance and enabling new technology staff to get up to speed on fewer systems. Developers are placed in agile teams with business subject matter experts to provide guidance and to increase stakeholder buy-in. Per Perretta, “We circle the agile approach with additional governance to ensure that the investments are paying off in the appropriate timeframe” (High, 2016).

Another key approach State Street uses to gauge project complexity is predictive analytics. The bank can help its internal business teams better understand project costs and delivery timetables by analyzing historical data on all of the projects implemented over the years. The predictive analytics model uses inputs such as “…scope, team sizes, capability of the team, the amount of hours each team member spent, and ultimately, how well it delivered on these programs” (Merrett, 2015). As the business teams list their project requirements, the predictive model is updated in real time, which provides all parties with additional clarity.
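State Street’s actual model is proprietary, so as a rough sketch of the idea, here is a minimal estimator that fits delivered hours against project scope using ordinary least squares. All of the project data and numbers below are invented for illustration; only the technique (simple linear regression on historical delivery data) reflects the approach described above.

```python
# Illustrative only: fit delivered hours against scope for past projects,
# then estimate a new project. All historical data below is invented.

def fit_line(xs, ys):
    """Closed-form simple linear regression (least squares)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical history: (scope points, actual delivered hours)
history = [(10, 420), (25, 980), (40, 1650), (60, 2400), (80, 3300)]
scope, hours = zip(*history)

slope, intercept = fit_line(scope, hours)

def estimate_hours(scope_points):
    """Predicted delivery hours for a proposed project."""
    return slope * scope_points + intercept

print(round(estimate_hours(50)))  # rough estimate for a 50-point project
```

A real model would of course include the other inputs Merrett lists (team size, team capability, and so on) as additional regression features, but the one-variable version captures the mechanic.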

Finally, it would be remiss to discuss banking and transformation in the same breath without mentioning the requisite layoffs and outsourcing activities. For all the benefits of the bank’s cloud computing initiatives, technology workers who do not fit the new paradigm find themselves subjected to domestic and non-domestic outsourcing. A standardized infrastructure platform leads to fewer distinct systems in the technology ecosystem, an emphasis on code reuse and increased automation. This perfect storm of efficiency gains has led to roughly 850 IT employees being shuffled out of the organization to IBM, India-based Wipro or outright unemployment. Staffing cuts occurred amongst the employees who maintained and monitored mainframes and worked with other non-cloud infrastructure systems. The bank was interested in swapping fixed costs for variable costs by unloading IT staff who were perceived as not working on innovative, cutting-edge technologies. The layoffs amount to “21% of State Street’s 4,000 IT employees worldwide” (Tucci, 2011).


References:

Alter, A., Daugherty, R., Harris, J., & Modruson, F., (2016). A Fresh Start for Enterprise IT. Accenture. Retrieved from https://www.accenture.com/us-en/insight-outlook-journal-fresh-start-enterprise-it

Camhi, J. (2014). Chris Perretta Builds Non-Stop Change Into State Street’s DNA. Bank Systems & Technology. Retrieved from http://www.banktech.com/infrastructure/chris-perretta-builds-non-stop-change-into-state-streets-dna/d/d-id/1317880

Carr, D. (May 30, 2013). State Street: Social Business Leader Of 2013. Retrieved 6/25/16 from http://www.informationweek.com/enterprise/state-street-social-business-leader-of-2013/d/d-id/1110179?

Garber, K. (February 29, 2016). State Street doubles down on digital. SNL Bank and Thrift Daily. Retrieved from Factiva 6/20/16.

High, P. (February 8, 2016). State Street Emphasizes Importance Of Data Analytics And Digital Innovation In New Role. Retrieved from http://www.forbes.com/sites/peterhigh/2016/02/08/state-street-emphasizes-importance-of-data-analytics-and-digital-innovation-in-new-role/#a211b1320481

Hogan, M. (September 3, 2012). State Street’s Trip to the Cloud. Barron’s. Retrieved from Factiva 6/20/16

McKendrick, J. (January 7, 2013). State Street’s Transformation Unfolds, Driven by Cloud Computing. Forbes. Retrieved from http://www.forbes.com/sites/joemckendrick/2013/01/07/state-streets-transformation-unfolds-driven-by-cloud-computing/#408e1acf64cf

Merrett, R. (April 2, 2015). How predictive analytics helped State Street avoid additional IT project costs. CIO. Retrieved from http://www.cio.com.au/article/571826/how-predictive-analytics-helped-state-street-avoid-additional-it-project-costs/

Tucci, L. (July 20, 2011). State Street tech layoffs: IT transformation’s dark side. Retrieved from http://searchcio.techtarget.com/blog/TotalCIO/State-Street-tech-layoffs-IT-transformations-dark-side

How to Build a Waterfall Chart in Tableau

In this video I will show you how to go “Chasing Waterfalls” in Tableau (apologies to TLC). Waterfall charts are ideal for demonstrating the journey between an initial value and an ending value. It is a visualization that breaks down the cumulative effect of positive and negative contributions. You’ve probably seen them used in financial statements or at your quarterly town hall meeting. Enjoy!

If you’re interested in Business Intelligence & Tableau please subscribe and check out my videos here: Anthony B. Smoak

Coursera Final Assignment: Essential Design Principles for Tableau

Dashboard 1

I recently completed Essential Design Principles for Tableau offered by the University of California Davis on Coursera. I’ll offer some review commentary. I thought it was a solid class, as it covered data visualization concepts such as pre-attentive attributes and the Gestalt principles. This class was a bit heavier on the conceptual side of the house as opposed to delving into practical Tableau instruction. However, other classes in the specialization take a more hands-on, practical approach.

In this assignment we had to highlight the three worst performing product Sub-Categories in each region. Additionally, we had to demonstrate how these worst performers compared to other product Sub-Categories in their respective regions. Finally, the visualization had to highlight the three worst performing Sub-Categories overall with a color emphasis. The scenario given to the class was that a sales manager had to cut the three worst performing Sub-Categories in her region and needed a visualization that addressed her concerns.

Guidance was not provided on how to identify the three worst performing categories. Some people in the class simply used profit as their key performance indicator (KPI), which I think is misguided. You learn in business (or business education) that profits do not equal profitability. From Investopedia:

Profitability is closely related to profit, but it is the metric used to determine the scope of a company’s profit in relation to the size of the business. Profitability is a measurement of efficiency – and ultimately its success or failure. It is expressed as a relative, not an absolute, amount. Profitability can further be defined as the ability of a business to produce a return on an investment based on its resources in comparison with an alternative investment. Although a company can realize a profit, this does not necessarily mean that the company is profitable.

For these reasons I used the Average Profit Ratio of the products in each Sub-Category as my KPI, as opposed to raw profits. If you had to sell $100,000 of product A to make $1,000 in profit (1% profit ratio), would you eliminate product B, which requires only $1,000 in sales to generate $500 in profit (50% profit ratio)? Only if you want to go out of business!
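The same ranking logic can be sketched outside of Tableau in a few lines. This assumes a Superstore-style dataset with sales and profit per Sub-Category; the figures below are invented to mirror the product A/product B example above, not actual assignment data.

```python
# Illustrative only: rank sub-categories by profit ratio (profit / sales)
# rather than by raw profit. All figures below are invented.

orders = [
    # (sub_category, sales, profit)
    ("Tables",    100_000.0,  1_000.0),
    ("Bookcases",   1_000.0,    500.0),
    ("Machines",   50_000.0, -2_500.0),
    ("Binders",    20_000.0,  6_000.0),
]

def profit_ratio(sales, profit):
    """Profitability as a relative measure, not an absolute one."""
    return profit / sales

# Sort ascending so the worst performers come first
ranked = sorted(orders, key=lambda o: profit_ratio(o[1], o[2]))
worst_three = [name for name, _, _ in ranked[:3]]
print(worst_three)  # → ['Machines', 'Tables', 'Binders']
```

Note that by raw profit, Tables ($1,000) looks healthier than Bookcases ($500); by profit ratio it is the second-worst performer, which is exactly the trap described above. (In the actual assignment I averaged the per-product ratios within each Sub-Category; this sketch uses one aggregate row per Sub-Category to keep it short.)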

In order to complete the visualization you see above on Tableau Public I had to incorporate nested sorting principles and also highlight the three worst performing elements on a bar chart. Luckily for you, I have videos that will demonstrate how to accomplish these tasks.

You can check out the rest of my videos on my YouTube Channel or find them on this site under Videos.

How to Highlight the Top 3 Bar Chart Values in Tableau

In this video I will show you how to highlight the top three highest sales values on a bar chart. I will also teach you how to add a nested dimension and properly sort the values while keeping the top three values highlighted. Enjoy!

If you’re interested in Business Intelligence & Tableau please subscribe and check out my videos here: Anthony B. Smoak


Get Out of the Spreadsheet Abyss

When an organization turns a blind eye to the proliferation of spreadsheet based processes, it subjects itself to substantial risks. Have you ever encountered (or enabled) the following scenario?

  • Enterprise reporting is lacking thus a “power-user” (i.e. analyst) is conscripted into cobbling together an ad-hoc spreadsheet based process to address the management request;
  • The power user exports data from various business applications and then manipulates the data output typically with macros and formulas;
  • This initial spreadsheet is manually integrated with business unit data from various other unofficial spreadsheets, further distancing the data from the source business application;
  • Multiple tabs are added, charts are generated and data is pivoted (all manually);
  • Management finds value in the report and elevates it to a repeatable process;
  • The original request increases in complexity over time as it requires more manipulations, calculations and rogue data sources to meet management needs;
  • Management doubles down with the need for a new request and the process is repeated;
  • IT proper is NEVER consulted on any of the requests;

The business unit is now supporting a “spreadmart,” a term considered derogatory in data circles.

“A spreadmart (spreadsheet data mart) is a business data analysis system running on spreadsheets or other desktop databases that is created and maintained by individuals or groups to perform the tasks normally done by a data mart or data warehouse. Typically a spreadmart is created by individuals at different times using different data sources and rules for defining metrics in an organization, creating a fractured view of the enterprise.” [1]

Although the initial intentions of these requests may be reasonable, the business never bothers to approach IT to propose building out a proper data store. Additionally, the conscripted analysts are unhappy with their additional manual responsibilities. Spreadsheet wrangling and manual integration activities shift precious time away from more value-added pursuits such as data analysis and formulating recommendations.

From management’s perspective, why should they pay IT to build out an officially sanctioned solution that will deliver the same information that an internal team of analysts can provide? After all, the spreadmart is responsive (changes can be made quickly) and it’s inexpensive (as opposed to new investments in IT). Eventually, the manual processes are baked into the job description and new hires are enlisted to expand and maintain this system. The business sinks deeper and deeper into the spreadsheet abyss.

The short term rewards of the spreadmart are generally not worth the longer term risks.

Risks:

“It’s not an enterprise tool. The error rates in spreadsheets are huge. Excel will dutifully average the wrong data right down the line. There’s no protection around that.” [2]

The spreadmart can also be bracketed as a “data shadow” system to borrow a term from The Business Intelligence Guidebook, authored by Rick Sherman. Here are the problems associated with “data shadow” systems as paraphrased from The Business Intelligence Guidebook [3]:

  • Productivity is severely diminished as analysts spend their time creating and maintaining an assortment of manual data processes;
    • I would add that team morale suffers as well;
  • Business units have daggers drawn as they try to reconcile and validate whose numbers are “more correct”;
    • As a result of a new silo, the organization has compounded its data governance issues;
  • Data errors can (and will) occur as a result of manual querying, integrating and calculating;
  • Data sources can change without notice and the data shadow process is not on IT’s radar for source change notifications;
  • Embedded business logic becomes stagnant in various complex macros or code modules because they are hidden or simply not understood by inheritors;
  • The solution doesn’t scale with increasing data volume or number of users;
  • Audit trail to ensure control and compliance does not exist;
    • “It is often ironic that a finance group can pass an audit because the IT processes it uses are auditable, but the data shadow systems that they use to make decisions are not, and are ignored in an internal audit”;
  • Process and technical documentation does not exist which impacts the ability to update the solution;

Additionally, these processes are not backed up with any regularity, multiple versions may exist on multiple users’ desktops and anyone can make changes to the embedded business logic. The bottom line is that the business is potentially making decisions based upon erroneous data which can have serious financial and reputational impacts.

“F1F9 estimated that 88 percent of all spreadsheets have errors in them, while 50 percent of spreadsheets used by large companies have material defects. The company said the mistakes are not just costly in terms of time and money – but also lead to damaged reputations, lost jobs and disrupted careers.” [4]

Mitigation:

There is nothing wrong with the business responding to an emerging issue by requesting a one-time ad-hoc solution. The highest risks emerge when the ad-hoc process is systematized, repeatable ad-hoc processes proliferate unchecked, and IT is never involved in any of the discussions.

IT proper is highly effective when it is allowed to merge, integrate and validate data. Business unit analysts and spreadsheets should be out of the collection and integration game for repeatable management reporting. Analysts should focus on analysis, trending and interpretation. Too often analysts get tossed into productivity traps involving hours of cutting, pasting and linking to someone else’s spreadsheet for data integration in order to meet management demands.

When IT is brought into the discussion, they must not point fingers but rather understand why the shadow system was established in the first place. Likewise, the business unit should not point fingers at IT for being unresponsive or limited by budget constraints. Once the peace treaty has been established, IT should analyze and reverse-engineer the cobbled-together integration processes and data sources (which admittedly is a time-consuming effort) and deliver more controlled and scalable processes.

The new data integration processes should culminate in loading data to a business specific, validated, central data mart. The central mart doesn’t try to impose an unfamiliar tool upon the business but rather automates integration activities and references more “trustworthy” data sources. Spreadsheets can still be used by analysts to access the data but the analysts are not expected to be manual aggregators using a sub-standard ETL tool.
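As a minimal sketch of that pattern, the snippet below replaces manual spreadsheet linking with a small, repeatable load-and-join in a central store. Python’s built-in sqlite3 stands in for the data mart, and every table, column, and figure here is a made-up example rather than a real source system.

```python
# Illustrative only: a tiny, repeatable integration into a central mart
# (SQLite stands in for the data mart). All data below is invented.
import sqlite3

# Hypothetical extracts that analysts previously merged by hand
sales_extract = [("2017-04", "East", 1200.0), ("2017-04", "West", 950.0)]
target_extract = [("East", 1000.0), ("West", 1100.0)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (period TEXT, region TEXT, amount REAL)")
conn.execute("CREATE TABLE targets (region TEXT, target REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", sales_extract)
conn.executemany("INSERT INTO targets VALUES (?, ?)", target_extract)

# One validated join in the mart, instead of VLOOKUPs in a spreadsheet
rows = conn.execute("""
    SELECT s.period, s.region, s.amount, t.target, s.amount - t.target
    FROM sales s JOIN targets t ON s.region = t.region
    ORDER BY s.region
""").fetchall()

for row in rows:
    print(row)
```

The point is not the tooling: once the join lives in one governed place, analysts can still pull the result into a spreadsheet, but the integration step is automated, auditable, and no longer duplicated on anyone’s desktop.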

“Go with the flow. Some users will never give up their spreadsheets, regardless of how robust an analytic environment you provide. Let them keep their spreadsheets, but configure them as front ends to the data warehouse. This way, they can use their spreadsheets to access corporate-approved data, metrics and reports. If they insist on creating new reports, provide an incentive for them to upload their reports to the data warehouse instead of distributing them via e-mail.” [5]

Have I ever had to “get out” of a situation where data governance was lacking and burned-out, morale depleted analysts spent all of their time collecting and integrating spreadsheets to maintain an inefficient spreadmart?

I’ll never tell!

References:

[1] https://en.wikipedia.org/wiki/Spreadmart

[2] http://ww2.cfo.com/analytics/2012/01/imagine-theres-no-excel/

[3] Sherman, R. (2015). Business intelligence guidebook: from data integration to analytics.

[4] http://www.cnbc.com/id/100923538

[5] https://www.information-management.com/news/the-rise-and-fall-of-spreadmarts

AOL Time-Warner: You’ve Got Merger

Most people have fond memories of nineties staple AOL, which was purchased by Verizon for $4.4 billion in 2015. AOL was a true internet pioneer that provided many customers their first taste of dial-up internet access. After its much ballyhooed purchase of “old media” company Time-Warner in 2000, AOL Time-Warner began a slow decline in popularity as emerging broadband technology cut into its market share. By 2003, the combined company had posted a $99 billion loss.

The AOL Time-Warner association finally unraveled in 2009 as Time-Warner was spun off. The AOL Time-Warner merger remains (as of 2017) the biggest in US history at roughly $166 billion. Currently, Time-Warner’s market capitalization is north of $75 billion, while AOL’s is estimated at about $2.5 billion.

Fast-forward seventeen years after the merger: communications behemoth Verizon plans to launch a new division called “Oath,” which will house AOL and its other media properties. I suppose the name is a reaction to the “fake news” phenomenon which surfaced in the 2016 election cycle. One of Oath’s more prominent holdings is another nineties staple, Yahoo, which was recently purchased by Verizon for $4.8 billion.

The reason I decided to take this AOL trip down memory lane was to share a brief case write-up I generated back in March of 2007 for an MBA strategy class. The write-up briefly discusses the history of AOL and its fateful merger with Time-Warner. Keep in mind that this perspective is from 2007.

History and Development

The media conglomerate known as AOL Time-Warner was formed when America Online, Inc. merged with Time-Warner Inc. on January 11, 2001. At the time of the announcement, Time-Warner was the world’s largest media and entertainment company with revenues of $26.8 billion, approximately five and a half times AOL’s $4.8 billion [1]. This $166 billion marriage of leading companies in content assets and internet distribution was the largest proposed merger ever.

AOL initially began operations as Quantum Computer Services. “In 1985, Quantum began offering a graphical-user interface (GUI) BBS for PCs and soon expanded GUI services to Apple and Tandy computers.” [2] The company was renamed America Online in 1989 and concentrated on providing easy online access to a predominantly technically illiterate mainstream audience. By the time of the merger with Time-Warner, AOL had grown to be the nation’s largest online company with close to 22 million subscribers.

Time Inc. was originally founded by Henry Luce and Briton Hadden with $86,375 borrowed from friends and Yale classmates [3]. Warner Bros. Pictures, Inc. was formed by brothers Harry, Albert, Sam and Jack Warner in 1923 [4]. In 1990, Time Inc. merged with Warner Communications. The $18 billion merger would allow Time to benefit from Warner’s strong international distribution, while Warner would gain from Time’s strong programming [5].

Flat Rate Pricing

AOL initially employed a two-part pricing strategy for access to its services. Members were offered a $19.95 rate for 20 hours of use and then charged $2.95 for each additional hour they spent online. AOL at this point in time was subject to very strong competitive forces in the marketplace. The company had previously stuck to its two-part pricing strategy even though it had trouble keeping subscribers, because smaller rivals were offering unlimited use of the Internet for a single fee. When AOL’s second-biggest competitor, Microsoft, introduced a flat pricing scheme of $19.95 for unlimited access, AOL had to match it in order to keep its subscribers from defecting.

The shift in pricing strategy had a tremendous effect on the demand for AOL’s service. The subsequent surge in demand illustrates that demand for AOL’s services was elastic. The price elasticity of demand for a product or service is highly contingent upon the number and closeness of the substitutes available. For internet access, there were many options available to consumers during the mid-nineties. Thus, if the price of a close substitute were reduced, buyers of other products would be enticed to switch to the lower-cost option. In AOL’s case, buyers switched to the option that offered the most value for the same price, as the associated switching costs were negligible. The value associated with AOL was its exclusive content and proprietary network in addition to broader internet connectivity. Competitive flat-rate pricing, along with AOL’s strong brand reputation and highly elastic demand, helped increase its subscriber base and kept its current subscribers from switching to Microsoft’s rival MSN service.
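The elasticity claim can be made concrete with the standard midpoint (arc) formula, E = (%ΔQ)/(%ΔP). The subscriber and price figures below are invented round numbers chosen purely to illustrate the calculation; only the formula itself is standard.

```python
# Illustrative only: midpoint (arc) price elasticity of demand.
# The quantities are made up; only the formula is standard economics.

def arc_elasticity(q1, q2, p1, p2):
    """Percentage change in quantity divided by percentage change
    in price, each measured against the midpoint of the two values."""
    pct_dq = (q2 - q1) / ((q1 + q2) / 2)
    pct_dp = (p2 - p1) / ((p1 + p2) / 2)
    return pct_dq / pct_dp

# Hypothetical: effective price falls from $3.00/hr to $1.00/hr under
# flat-rate pricing, and demand (usage) jumps from 1M to 4M units.
e = arc_elasticity(q1=1_000_000, q2=4_000_000, p1=3.00, p2=1.00)
print(round(e, 2))  # → -1.2
```

Because |E| > 1, a price cut raises total usage more than proportionally, which is the definition of elastic demand, and consistent with the surge AOL saw after matching Microsoft’s flat rate.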

AOL’s Quest for Bandwidth

In the late nineties AOL realized that consumers in the future would demand higher speed connectivity to online content. Unfortunately the only service that AOL offered at this point in time was low bandwidth connectivity via dial-up. The company was at an inflection point where it could decide to stay with a maturing technology or invest in ways to stay competitive as the internet connectivity landscape transitioned. Several high bandwidth options were in the running to become the dominant technology of the future. Of these technologies, access via cable modems using the coaxial cable used to transmit TV signals looked to be the most promising. AOL realized that it would have to develop new strategies to stay competitive in the upcoming high growth mass market populated by the early majority of cable modem users.

AOL’s competitors, Microsoft and AT&T, made significant investments in or outright purchases of existing cable operators. As a prerequisite for the acquisition of TCI by AT&T, AOL lobbied the FCC to force TCI to open its cable networks to rival ISPs. “Predictably, this proposal did not sit well with cable operators, especially since they have spent billions on infrastructure upgrades to sell their own Internet-over-cable services.” [6] This strategy proved unsuccessful, so AOL forged ahead with plays in other broadband categories such as satellite and DSL. Strategic alliances with Hughes Electronics, Bell Atlantic and SBC Communications allowed AOL to hedge its bets as broadband connectivity proliferated.

Strategy Behind the Merger

In theory, the merger of AOL and Time-Warner would allow both companies to realize substantial synergies. According to Steve Case, “We will draw on one another’s strengths, combining AOL’s superior distribution capacity and Internet expertise with Time-Warner’s programming and cable network assets” [7]. The combined conglomerate would give the new company unprecedented reach across traditional and new internet media. As an example, the conglomerate could offer a multimedia package to advertisers encompassing AOL’s internet offerings and Time-Warner’s traditional media properties. In addition, AOL would finally gain access to a cable network, allowing it to provide high-speed access via the promising coaxial cable method.

Another justification for the merger was the expected cost savings that the new company would realize. For example, “AOL will also be able to shave significant customer acquisition costs by taking advantage of Time-Warner’s vast CD music printing business. One of AOL’s most expensive marketing costs is outsourcing the pressing of its software CDs, which are sent to prospective customers.” [8] AOL Time-Warner would be presented with bundling opportunities, as hit music CDs could contain AOL marketing material and software.

Beyond 2002

Two years after the historic merger with Time-Warner, AOL’s advertising revenue has dropped and its subscription growth has slowed. As the internet landscape has moved towards broadband, AOL still relies heavily upon dial-up service. A key problem for AOL at this juncture is how to keep users from defecting as they switch to high-speed access over cable modems. AOL must concentrate on enriching its content to remain a viable player in the internet landscape. “Content, broadly defined-from downloading music and films to exclusive movie and news clips to prime-time series previews (“The Sopranos” and “Friends”) to the pages of Time magazine-also will be the catalyst that entices dial-up, narrowband subscribers to the more lucrative broadband front as AOL transforms itself into the HBO of the Internet.” [9]

Secondly, AOL Time-Warner should concentrate on acquiring more cable operations in order to increase the reach of its broadband services. In 2002, Cablevision was selling at about 25% of its value, and Adelphia had indicated that it would sell some of its best cable assets as well [10].

References

[1] Sutel, Seth. “Time Warner being acquired by AOL for about $166 billion”  Associated Press Newswires (10 January 2000): Factiva

[2] http://www.historyoftheinternet.com/chap5.html

[3] http://money.cnn.com/2000/01/10/deals/aol_warner/timeline.htm

[4] http://en.wikipedia.org/wiki/Warner_Bros.

[5] Coy, Peter. “Time Inc. and Warner Communications Merge” The Associated Press. (4 March 1999): Factiva

[6] “Coax Access Fight Goes Regional City Councils Weigh TCI-AT&T Merger” ISP Business News Vol. 5, Issue: 3 (18 January 1999): Factiva

[7] Auchard, Eric. “FOCUS – AOL, Time Warner agree to world’s biggest merger.” Reuters News (10 January 2000): Factiva

[8] Cho, Joshua “AOL-TW Synergies Meet with Skepticism.(Company Business and Marketing)” Cable World Volume 12; Issue 11 (13 March 2000): Factiva

[9] Mermigas, Diane “AOL and ABC should emphasize content” Electronic Media Vol: 21 Num: 48 (2 December 2002): Factiva

[10] Gilpin, Kenneth “Cable Industry Plays Catchup” The New York Times (19 May 2002): Factiva


Building a Donut Chart in Tableau Using NBA Data

In this video I will show you how to create a donut chart in Tableau. Since a donut chart is essentially a hoop, I put together this quick visualization using NBA data. Visualization aficionados will advise to use pie/donut charts sparingly but they can add value when showing values with respect to the whole. Enjoy!