The Definitive Walmart Technology Review

Did you know that the legal name “Wal-Mart Stores, Inc.” was changed effective Feb. 1, 2018 to “Walmart Inc.”? The name change is intended to reflect the fact that today’s customers don’t just shop in stores.

I’ve kept an eye on Walmart because historically the company was the leading innovator in retail-focused technology and supply chain strategy. Even though Walmart isn’t the typical “underdog,” in the fight for online retail supremacy it currently finds itself in that position; and everyone can appreciate an underdog (see Philadelphia Eagles). Walmart is locked in a fierce battle with Amazon to carve out a more substantial share of digital and online retail. As such, the company is bolstering its capabilities by focusing on technology-enabled business processes, training and digital growth to keep pace with Amazon’s world domination efforts.

Walmart is experimenting with cutting-edge technology such as virtual reality, autonomous robots, cloud storage platforms, cashier-less stores and even blockchain. It’s also formed various online retail and technology-based alliances to keep pace with Team Bezos. Walmart’s kitchen-sink technology strategy seems to be paying dividends from an online sales perspective. The influence of e-commerce industry luminary Marc Lore can be seen in some of the company’s more innovative technology plays.

Blockchain

Walmart, yes Walmart, is getting in on the blockchain ledger revolution (or hype). The company plans to team up with IBM, JD.com and China’s Tsinghua University to create a blockchain food safety alliance.

Here is how the partnership will work according to ZDNet:

  • Walmart, JD, IBM and Tsinghua University will work with regulators and the food supply chain to develop standards and partnerships for safety.
  • IBM will provide its blockchain platform.
  • Tsinghua University will act as technical advisor, providing expertise in technology and China’s food safety ecosystem.
  • Walmart and JD will help develop and optimize technology that can be rolled out to suppliers and retailers.

In pilot tests, Walmart has shown that blockchain technology can reduce the time it takes to trace food from farm to store from days to seconds, a capability that could prove invaluable during product recalls.
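For intuition, here is a toy hash-chained ledger in Python. This is emphatically not the IBM platform Walmart uses, just a sketch of why an append-only, tamper-evident ledger turns a days-long paper chase into a seconds-long lookup (the product IDs and locations are invented):

```python
import hashlib
import json

def _hash(record: dict) -> str:
    """Deterministic SHA-256 digest of a ledger record."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class FoodLedger:
    """Append-only, hash-chained log of supply-chain events."""
    def __init__(self):
        self.blocks = []

    def record(self, product_id: str, location: str, event: str) -> None:
        prev = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        body = {"product_id": product_id, "location": location,
                "event": event, "prev": prev}
        self.blocks.append({**body, "hash": _hash(body)})

    def trace(self, product_id: str) -> list:
        """Farm-to-store history: a single indexed scan, not a paper chase."""
        return [b for b in self.blocks if b["product_id"] == product_id]

    def verify(self) -> bool:
        """Confirm no block in the chain has been altered."""
        prev = "0" * 64
        for b in self.blocks:
            body = {k: v for k, v in b.items() if k != "hash"}
            if body["prev"] != prev or _hash(body) != b["hash"]:
                return False
            prev = b["hash"]
        return True

ledger = FoodLedger()
ledger.record("mango-42", "Farm, Oaxaca", "harvested")
ledger.record("mango-42", "Packing house", "packed")
ledger.record("mango-42", "Store #1024", "shelved")
print([b["location"] for b in ledger.trace("mango-42")])
print(ledger.verify())  # True
```

The real value is the chaining: alter any historical record and every later hash stops matching, so all parties can trust the shared history.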

If Walmart were to offer a more investor appealing use of blockchain (e.g. a “Sam’s-Coin” ICO), you could count me in for a high two figure investment.

Ok Google

Google Home Walmart

Image courtesy of ZDNet

Straight from “the enemy of my enemy is my friend” playbook, Walmart and Google announced a partnership to make Walmart’s items available on Google’s shopping service, Google Express.

The New York Times reports that “it’s the first time the world’s biggest retailer has made its products available online in the United States outside of its own website”. We can readily see what Walmart gets out of the alliance (expanded presence on the dominant search engine) but interpreting Google’s angle requires a bit more perspicacity.

Google fears that its search engine is being bypassed by consumers who go straight to Amazon to search for products. Google (you may have heard) dabbles a bit in search and online advertising, and a substantial shift in product search behavior that favors Amazon is bad for business. If Google can expand its own online marketplace with Walmart’s appreciable offerings, as well as entice customers to use Google Home and the mobile-based Google Assistant to locate products, then the company can retain a greater share of initial product searches. Google Express already offers products from Walmart competitors Target and Costco, although Walmart’s collaboration brings the largest number of items.

“Walmart customers can link their accounts to Google, allowing the technology giant to learn their past shopping behavior to better predict what they want in the future. Google said that because more than 20 percent of searches conducted on smartphones these days are done by voice, it expects voice-based shopping to be not far behind.”

An existing Walmart application feature called “Easy Reorder” is slated for integration with voice-enabled shopping via Google Assistant. Currently, when a consumer logs into the Walmart mobile app, they can see their most frequently purchased in-store and online items and easily reorder them. Integration with Google Express provides an additional data channel to bolster the effectiveness of this Walmart offering.

The Matrix:

Virtual reality would not be the first technology play that one would likely associate with Walmart. However, the company’s tech incubator (Store No. 8) has purchased Spatialand, a VR development tools company. Spatialand has previously worked with Oculus, Intel and “rap rock” artists Linkin Park to create virtual content.

Walmart’s intent is to use Spatialand to develop “immersive retail environments”. My expectations aren’t high that this acquisition will pay off in the near to medium term, but the company is demonstrating that it is trying to be on the vanguard of future retail technology. One can imagine this acquisition eventually enabling Star Trek “holodeck” capabilities where customers can enter virtual stores or dressing rooms and interact with products while in the comfort of their homes.

I propose that Spatialand and Walmart-owned Bonobos would make ideal mash-up partners. Instead of trekking to a physical Bonobos store and trying on shirts and/or slacks, consumers could create an avatar with similar dimensions and play virtual dress-up in 3D. The garments could then be shipped directly to the customer.

Additionally, Walmart is already using Oculus headsets to train employees at its 170 training academies through a partnership with STRIVR, a virtual reality startup based in Menlo Park. I envision virtual customers stampeding through the entrances on a Black Friday opening.

“STRIVR’s technology allows employees to experience real-world scenarios through the use of an Oculus headset, so that employees can prepare for situations like dealing with holiday rush crowds or cleaning up a mess in an aisle.”

There’s an App for That:

Walmart is trying to balance keeping its in-store traffic high while accommodating its growing mobile customer base. The company recently enhanced its Walmart application with a new “Store Assistant” feature that activates when a customer walks in the door. The app will allow customers to build shopping lists, calculate costs and locate in-store items at all of its domestic locations.

“So-called mobile commerce revenue — mostly generated via smartphones — will reach $208 billion, an annual increase of 40 percent, EMarketer forecasts.” – Bloomberg

Channeling its inner Tim Cook, Walmart launched a nationwide rollout of the Walmart Pay system as an additional application enhancement.

“To use the three-step payment system, shoppers link their chosen payment method to their Walmart.com account, open the camera on their smartphone and snap a photo of a QR code at the register. That notifies the app to process the customer’s payment. Shoppers can link their credit or debit cards, prepaid accounts or Wal-Mart gift cards to their payments; however, they still cannot use ApplePay.” – CNBC

Rise of the Machines:

As labor costs rise and technology advances, we can be sure of one thing: robots are coming to take all of our jobs, and Walmart isn’t doing much to disabuse us of this notion. As part of a pilot, the company is deploying autonomous robots at about 50 locations to help perform “repeatable, predictable and manual” tasks. Primary tasks include scanning store shelves for missing stock, calculating inventory and locating mislabeled and unlabeled products. The robots stay docked in charging stations inside the store until they are activated and given an autonomous “mission.”

According to Reuters, Walmart CTO Jeremy King states that the robots are 50% more productive and can scan shelves three times faster with higher accuracy. Walmart’s current carbon-based units (my words) can only scan shelves about twice a week.

While Walmart insists that the robots won’t lead to job losses, remember that technology always marches forward. Today’s “repeatable, predictable and manual” tasks are tomorrow’s automated ones.

Walmart scanner

Image courtesy of Walmart

Additionally, in 2016 the company “patented a system based on mini-robots that can control shopping carts, as well as complete a long list of duties once reserved for human employees.” Keep an eye on those robots as Walmart does not have a reputation for overstaffing.

In all seriousness, shelf inventory checks ensure that customer dollars aren’t left on the table due to un-shelved items. If Walmart can significantly lower the occurrences of un-shelved products with its army of shelf scanning Daleks, then the robots will pay for themselves.

Mr. Drone and Me

walmart drones

Never missing an opportunity to improve supply chain efficiency, reduce labor costs and keep pace with Amazon, Walmart is experimenting with drones. The company’s Emerging Science division has been tasked to consider future applications of drone technology to enhance operational efficiency.

Currently, inventory tracking at the company’s fulfillment centers requires employees to use lifts, harnesses and hand-held scanning devices to traverse 1.2 million square feet (26 football fields) of warehouse space, a process that can take up to a month to complete. Considering that Walmart has close to 190 distribution centers domestically, the inventory process consumes a significant number of labor hours when aggregated across the company.

The current plan is to utilize fully automated quad-copter drones mounted with cameras to scan the entire warehouse for inventory monitoring and error checking.

“The camera is linked to a control center and scans for tracking number matches. Matches are registered as green, empty spaces as blue, and mismatches as red. An employee monitors the drone’s progress from a computer screen.”
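The color-coding described in that quote amounts to a simple comparison, sketched below in Python. The slot names and tracking numbers are hypothetical, and the real system obviously involves computer vision rather than pre-parsed strings:

```python
from typing import Optional

def classify_slot(expected: Optional[str], scanned: Optional[str]) -> str:
    """Color-code one shelf slot, per the scheme quoted above:
    green = tracking numbers match, blue = empty space, red = mismatch."""
    if scanned is None:
        return "blue"
    if scanned == expected:
        return "green"
    return "red"

# Hypothetical slots: what the inventory plan says vs. what the drone camera read
expected = {"A1": "TRK-100", "A2": "TRK-200", "A3": "TRK-300"}
scanned = {"A1": "TRK-100", "A2": None, "A3": "TRK-999"}

report = {slot: classify_slot(expected[slot], scanned[slot]) for slot in expected}
print(report)  # {'A1': 'green', 'A2': 'blue', 'A3': 'red'}
```

An employee watching the control-center screen then only has to chase the blue and red slots.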

Drones would eventually replace the inventory quality assurance employees.

Millennial Digs:

Walmart Austin ATX
The lobby at Walmart ATX in downtown Austin. Jay Janner/Austin American-Statesman

In an effort to increase its appeal to the young, hip and tech savvy, Walmart has opened a new engineering tech design center in Austin. The new millennial digs are located in a renovated 8,000 square foot space. “Walmart ATX” will house minds that work with cutting-edge technology such as machine learning, artificial intelligence, blockchain, the internet of things and other emerging technologies. Factors such as a deep talent pool and low cost of living drove the creation of this Austin hub.

Here’s hoping Amazon decides to stay far away from its retail rival and brings its talents to Atlanta, Georgia.

E-Books

The saying goes that Amazon is trying to become more like Walmart and Walmart is trying to become more like Amazon. Walmart teaming up with Japanese e-commerce giant Rakuten to sell e-books solidifies this sentiment. It should go without saying that Amazon has a sizable lead in selling e-books. However, Walmart is leaving no stone unturned as far as offering products that keep e-commerce shoppers from Amazon’s web presence.

Online Grocery Pickup:

Walmart is experimenting with allowing customers to place their orders online for pickup at a local store. The scheme currently requires the company’s human employees (eventually robots) to walk the aisles using a handheld device in order to fulfill customer orders. The device acts as an in-store GPS that maps the most efficient route to assemble the customer order.
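The “most efficient route” planning can be approximated with a greedy nearest-neighbor walk over the pick locations. This is a toy stand-in, not Walmart’s actual (proprietary) routing logic, and the aisle coordinates below are invented:

```python
from math import dist

def pick_route(start, items):
    """Greedy nearest-neighbor ordering of pick locations: from the
    current position, always walk to the closest remaining item.
    A real system would solve this more carefully (it is a TSP variant)."""
    route, here, remaining = [], start, dict(items)
    while remaining:
        name = min(remaining, key=lambda n: dist(here, remaining[n]))
        here = remaining.pop(name)
        route.append(name)
    return route

# Hypothetical (aisle, shelf) coordinates for one customer's order
order = {"milk": (12, 3), "bread": (2, 1), "eggs": (11, 4), "salsa": (3, 7)}
print(pick_route((0, 0), order))  # ['bread', 'salsa', 'eggs', 'milk']
```

Even this naive heuristic beats picking items in the order they appear on the list, which is the point of handing employees a routed device.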

Customers then pull into a designated pickup area where live human beings (eventually robots) will dispense the pre-assembled order.

I’m not kidding about “eventually robots” dispensing the pre-assembled order. Walmart currently has an automated kiosk in Oklahoma City that dispenses customer orders from internal bins. Customers walk up to the interface and input a code, which then enables the kiosk to retrieve the order. HAL, open the pod bay doors; the future is here and apparently its name is Oklahoma City.

Walmart kiosk

Courtesy of The Oklahoman

These approaches address the “last mile” problem which plagues large e-commerce players and start-ups alike. As consumer preferences shift from physical stores to online channels, repurposing stores into dual e-commerce fulfillment centers wrings additional utility from these assets.

In Home Delivery:

Another innovative e-commerce “last mile” proposal from Walmart involves a smart home delivery pilot. Through a partnership with smart lock startup August Home, outsourced delivery drivers are supplied a one-time passkey to enter a customer’s home and unload cold and frozen groceries into the refrigerator. The homeowner is alerted via phone notification when the driver enters the property and can watch the in-home delivery livestreamed from security cameras. An additional notification is sent when the door has automatically locked. This limited-scale program is only being piloted in Silicon Valley (of course).

Per the Washington Post:

“This is a group of people who are already used to a certain level of intrusiveness … But God help the teenager playing hooky or the family dog who’s not expecting the delivery man.”

I can envision a future sci-fi use case involving “smart fridges” and automatic home replenishments. This pilot move is a search for an advantage in grocery delivery as Amazon recently purchased Whole Foods without overtly signaling what disruptive services may emerge from the amalgamation.

Smart Cart Technology

Jet Smart Cart

When Walmart purchased Jet.com for $3.3 billion (the largest-ever acquisition of a U.S. e-commerce startup), the company was looking for a way to ramp up online sales and infuse itself with fresh perspectives on online selling.

As I’ve mentioned previously, Jet.com has the potential to infuse Walmart with much-needed digital innovation. This fresh perspective could add tremendous value to the organization as a whole, provided the “old guard” rooted in Walmart’s core business model allows acquisitions to thrive instead of imposing the more conservative legacy culture.

The Jet infusion of innovative ideas back to the mothership is happening. Current Walmart e-commerce head Marc Lore launched Jet.com around an innovative “smart cart” system that offers the potential of lowering the price of customer orders. Here is how it works according to Forbes:

“If you have two items in your cart which are both located in the same distribution center and can both fit into a single box, then you will pay one low price. If you add a third item that is located at a different distribution center and cannot be shipped in a single shot with the other two items, you will pay more. As you shop on the site, additional items that can be bundled most efficiently with your existing order are flagged as ‘smart items’ and an icon shows how much more you’ll save on your total order by buying them.”

The order price can be further lowered if customers use a debit card or decline returns. This smart cart process is expected to launch on Walmart’s flagship site in 2018.
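The bundling-and-discount logic described above can be sketched in a few lines of Python. This is a toy model, not Jet’s actual pricing engine: the catalog, the flat per-box fee and the discount percentages are all invented for illustration.

```python
from collections import defaultdict

# Hypothetical catalog: price and fulfillment center for each item
CATALOG = {
    "blender": {"price": 39.99, "dc": "Kansas"},
    "toaster": {"price": 24.99, "dc": "Kansas"},
    "lamp":    {"price": 19.99, "dc": "Ohio"},
}
SHIPMENT_FEE = 5.00  # assumed flat cost per box / distribution center

def cart_total(items, debit=False, no_returns=False):
    """Price a cart the way the smart cart is described above: items
    sharing a distribution center ship together, so each extra DC adds
    a shipment fee; paying by debit or waiving returns shaves a bit
    more off (the rates here are made up)."""
    boxes = defaultdict(list)
    for item in items:
        boxes[CATALOG[item]["dc"]].append(item)
    total = sum(CATALOG[i]["price"] for i in items)
    total += SHIPMENT_FEE * len(boxes)  # one fee per distribution center
    if debit:
        total *= 0.985                  # e.g. 1.5% off for debit payment
    if no_returns:
        total *= 0.97                   # e.g. 3% off for declining returns
    return round(total, 2)

print(cart_total(["blender", "toaster"]))          # one DC, one box
print(cart_total(["blender", "toaster", "lamp"]))  # second DC adds a fee
```

The “smart item” flag then falls out naturally: any candidate item whose distribution center is already in the cart adds no new shipment fee.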

Who Needs Cashiers?

Customers shopping at roughly 100 stores across 33 states can participate in Walmart’s “Scan and Go” service. Via a dedicated mobile app, customers can scan the barcodes of items as they shop, pay through the app using Walmart Pay, and then exit the store after showing a digital receipt to an employee. As customers shop and scan with their phones, they can observe the running total of their purchases. This service is currently available at all Sam’s Club locations.
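The scan-as-you-shop flow is simple enough to model in a few lines. The barcodes and prices below are invented, and the real app naturally does far more (payment processing, fraud checks, digital receipts):

```python
class ScanAndGoCart:
    """Toy model of the scan-as-you-shop flow described above: scan a
    barcode, watch the running total, then 'pay' to get a receipt to
    show an employee on the way out."""
    PRICES = {"0001": ("Bananas", 1.48), "0002": ("Cereal", 3.98),
              "0003": ("Paper towels", 9.97)}

    def __init__(self):
        self.items = []

    def scan(self, barcode):
        name, price = self.PRICES[barcode]
        self.items.append((name, price))
        return round(sum(p for _, p in self.items), 2)  # running total

    def pay(self):
        total = round(sum(p for _, p in self.items), 2)
        return {"items": [n for n, _ in self.items], "total": total}

cart = ScanAndGoCart()
cart.scan("0001")
print(cart.scan("0002"))  # 5.46
print(cart.pay())
```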

In this case Walmart is keeping pace with grocery competitor Kroger, which is also experimenting with digital checkout experiences. Kroger has a “Scan, Bag & Go” service rolling out to 400 of its stores.

Additionally, Walmart’s skunkworks retail division “Store No. 8” is working on a futuristic project codenamed “Project Kepler”. This initiative goes a step further and eliminates both cashiers and checkout lines by using advanced imaging technology on par with Amazon’s “Amazon Go” concept. As customers take items off the shelves, they are automatically billed for their purchases as they walk out of the store. The Jet.com acquisition is in play here, as this initiative is being led by Jet’s CTO Mike Hanrahan.

Grocers already operate on razor-thin margins, so removing cashier interaction from the shopping equation fits the goal of lowering labor costs. Walmart employs approximately 2 million reasons to turn this future technology into reality.

Send in the Clouds:

According to the Wall Street Journal, Walmart is telling some of its technology vendors that if they want to continue being a technical supplier then they cannot run applications for the retailer on the leading cloud computing service, Amazon Web Services. Vendors who do not comply run the risk of losing key Walmart business. This is where we open our strategy textbooks to Porter’s Five Forces and key in on “Bargaining Power of Buyers” in the retail information technology provider space. The Economist reports that in 2015 Walmart poured a staggering $10.5 billion into information technology, more than any other company on the planet. To misquote E.F. Hutton, when Walmart speaks, you listen if you’re a technology vendor. The company’s cloud ultimatum is responsible for an uptick in usage of Microsoft’s Azure offering.

As I’ve mentioned in other posts, Walmart is known for its “build not buy” philosophy in regard to technology. Most of its data is housed on its own servers or Microsoft Azure which is the primary infrastructure provider for e-commerce subsidiary Jet.com. According to CNBC, about 80 percent of Walmart’s cloud network is now in-house.

Walmart’s cloud application development is facilitated by the company’s own open source cloud application development platform, OneOps. The aim of OneOps is to allow users to deploy applications across multiple cloud providers (i.e. to easily move away from Amazon Web Services). Walmart has also been a huge contributor to OpenStack, an open source cloud platform, and has been working with Microsoft, CenturyLink and Rackspace.

OneOps was originally developed by Walmart Labs and has since been released as an open source project so that Walmart can benefit from a broader community that’s willing to offer improvements. The main codebase is currently available on GitHub (https://github.com/oneops/).

Foreign Investment:

flipkart

 

Walmart is currently challenging tech titans Amazon and China’s Alibaba for a stake in India’s burgeoning online retail market. India’s expanding middle class makes its online market a lucrative target, purported to reach $220 billion by 2025 according to Bank of America Merrill Lynch. Walmart is essentially barred from outright owning physical store locations in India due to the country’s restrictive foreign investment regulations. Foreign ownership for multi-brand retailers is limited to 51%, and retailers must source 30% of their goods from small suppliers, which poses a difficulty for Walmart. Walmart uses its global buying power to squeeze deep discounts from major suppliers such as Unilever and Procter & Gamble; smaller Indian firms would have more difficulty yielding such exorbitant price concessions.

Therefore, Walmart is currently in talks to purchase a sizable stake in Indian online retailer Flipkart. Flipkart is a highly attractive opportunity because it has been able to compete effectively with Amazon in India despite being outspent by Team Bezos. Flipkart currently has a 44% market share in India, running ahead of Amazon’s 31%. Walmart’s multibillion-dollar investment will likely value Flipkart at $20 billion to $23 billion.

An infusion of capital from Walmart makes sense for both parties; Flipkart can hold off attacks from Amazon while Walmart gets a piece of the action in a growing and lucrative online market. Amazon has stated its intention to invest $5 billion in India in order to beef up the number of its fulfillment centers. Ironically, Flipkart was launched in 2007 by two former Amazon employees, Sachin and Binny Bansal.

Walmart isn’t the only company looking for a piece of Flipkart; Google is also reported to be weighing a sizable investment in the Indian firm at a valuation of $15 to $16 billion.

Walmart has had difficulties operating in India previously, as evidenced by its now disbanded partnership with Bharti Enterprises. The two companies built 20 superstores branded as Best Price Modern Wholesale, but the venture fizzled due to the aforementioned regulatory restrictions.

Meanwhile in China, Walmart partnered with JD.com which is a fierce Alibaba rival. Walmart and JD will merge their membership systems, so members can receive similar discounts at both retailers. In addition, the two companies will jointly work to create a system that enables JD.com to fulfill customer orders from Walmart inventories. Walmart initially had its own Chinese marketplace named Yihaodian but sold it to JD in 2016 due to its small market share in comparison to both JD and Alibaba.

Header Image Copyright: moovstock / 123RF Stock Photo


Advanced Bar Chart Labeling in Tableau

Here is a quick and easy, yet advanced tip for placing your labels just to the inside of your bar chart. This tip will provide you another alignment option in addition to the default ones. Credit to Andy Kriebel for the tip.

If you’re interested in Business Intelligence & Tableau subscribe and check out my videos either here on this site or on my Youtube channel.

My Journey to Obtaining the Certified Business Intelligence Professional (CBIP) Certification

As of the date of this blog post I can proudly say that I have completed the certification suite of exams that comprise the Certified Business Intelligence Professional (CBIP) designation. My aim in taking the test was threefold.

  1. Discover how my knowledge and experience stacked up against professional standards issued by a reputable body in data and computing.
  2. Find additional motivation to constantly educate myself regarding data and business intelligence since the certification requires renewal.
  3. Bolster credentials, because it never hurts one’s bottom line to show you have expertise in your profession.

If you’ve found this page via search, you’re no doubt already acquainted with this certification offered by The Data Warehouse Institute (TDWI). I started with what I thought would be the most difficult test based upon my research: the Information Systems Core (i.e. IS Core). However, this was not the case, as the specialty exam proved the most difficult in my opinion.

Test 1: Information Systems Core (i.e. IS Core):

12/15/17: I wish I could share some detailed information about the test but that is not allowed per CBIP guidelines. All I can say is that the scope of information covered is very broad.

“The IS Core examination (Information Systems Technology) covers the base 4 year model curriculum from ACM and AIS for information systems – the entire spectrum of organizational and professional skills, teams and supervision, strategic organizational systems development and project management, systems development, web development, databases and systems integration – the subject matter, testing your ability to recognize, differentiate, and understand the definitions of the concepts covered.” – CBIP Examinations Guide

For adequate preparation, you’ll first need to spend $135 on the examinations guide. Unfortunately, the examinations guide is not something you can simply study and then go sit for the test. It is basically a reference book that points you to other sources to consider for test preparation. The guide also outlines the various subject areas that will appear on the test. Let me stress that you should not sit for this test without pertinent work experience and education. You will need to draw upon your knowledge and experiences to have a legitimate shot at passing.

My intent was to devote about three weeks’ worth of study time to the IS Core, but work severely got in the way of that plan. I ended up devoting only ten hours of study time in total, certainly not by design.

First, I took the sample test of 42 questions in the examinations guide and fared pretty well. This gave me the confidence to keep my scheduled exam date when I found out that work was going to shorten my available study time.

The test was difficult; I’m not going to sugarcoat this aspect. While taking the proctored exam, I could count on two hands the number of questions (out of 110) where I was confident I had chosen the correct answer. Part of the difficulty is that you are presented with four choices, of which at least two could be a satisfactory answer.

Test 2: Data Foundations

12/31/17: I performed much better on the Data Foundations test, scoring well above the mastery level threshold of 70%. I was buoyed by my performance on the Information Systems Core test and only scheduled about 10 hours of study time in preparation for Data Foundations. I used one reference book to prepare. My advice for this test would be to have a solid understanding of metadata concepts (a subject area already cited in the CBIP Examinations Guide). Make the DAMA Guide to the Data Management Body of Knowledge your best friend. I used the 1st edition in lieu of the 2nd in my preparation, since I already had the 1st edition in my possession.

Test 3: Specialty Exam: Data Management

1/14/18: This was the most difficult of the three exams I sat. That may have been a function of my limited preparation, as I only put in about three hours of study time. The scope of topics on this exam is so broad that I planned to again leverage my experience and knowledge to power through. The majority of questions required narrowing the choices down to the two best answers and then selecting one; there is a persistent overlap between what could be acceptable and what the exam decrees is the one right answer. I’m not giving away anything that isn’t already on the outline shared by TDWI, but you’ll really need to brush up on your knowledge of data governance, data management, data warehousing and master/reference data.

My Background:

Not to be immodest (I only want to share my mindset for sitting the exams with somewhat minimal study), but I’ve been working with data for 15-plus years and hold both an MBA and a Master’s in Information Management. Before becoming a BI/data and analytics consultant, I worked in a bank’s back office supporting the monthly update of three credit risk data marts. Thankfully, all of that hard-gained experience in a financial institution’s back office paid off. Surprisingly, the number of right answers I gained from study time was minimal. Your mileage may vary in this regard.

Reference Material:

Here are the reference materials I used in my preparation; fortunately (with the exception of the CBIP manual), I already had these in my library from graduate studies. Depending upon your level of experience, you may need to supplement your effort with additional books. I will say that both Wikipedia and Search Business Analytics were very helpful for looking up unfamiliar terms.

 

Best of luck to you on your journey to CBIP certification!

Photo Copyright: dragonimages / 123RF Stock Photo

Create A Barbell/DNA Chart in Tableau with NBA Data

 

A Barbell, Dumbbell or DNA chart should be considered when you want to illustrate the difference or degree of change between two data points. In this video I will use NBA data from the 2016-2017 season (courtesy of basketball-reference.com) to illustrate the difference between team wins and losses.

If you’re interested in Business Intelligence & Tableau subscribe and check out my videos either here on this site or on my Youtube channel.

Data Profiling with Experian Pandora

 

Experian Pandora is a robust data profiling and data quality tool that enables users to quickly obtain useful statistics about their data. Once you load your data into the interface, you can identify statistical anomalies and outliers within seconds. To gain these types of insights, I normally have to write SQL scripts or use the SSIS Data Profiling Task against SQL Server data. Experian Pandora is much easier to use against data in .csv files or Excel spreadsheets, since you can simply drag and drop those items into the interface.
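Pandora itself is point-and-click, but the flavor of per-column statistics it surfaces (row counts, nulls, distinct values, min/max) can be sketched in plain Python. The sample extract below is invented:

```python
import csv
import io

def profile(rows):
    """Quick per-column profile over a list of dict rows: the kind of
    summary statistics a profiling tool surfaces instantly."""
    stats = {}
    for col in rows[0]:
        values = [r[col] for r in rows]
        present = [v for v in values if v not in ("", None)]
        stats[col] = {
            "rows": len(values),
            "nulls": len(values) - len(present),
            "distinct": len(set(present)),
            "min": min(present) if present else None,
            "max": max(present) if present else None,
        }
    return stats

# A messy sample extract, as you might drag in from a .csv
raw = """id,city,amount
1,Austin,10.50
2,,7.25
3,Austin,
4,Dallas,99.00"""
rows = list(csv.DictReader(io.StringIO(raw)))
report = profile(rows)
print(report["city"])    # 1 null, 2 distinct cities
print(report["amount"])  # note: min/max compare as strings here
```

Even this crude pass immediately flags the missing city and the missing amount, which is exactly the kind of anomaly a profiling tool makes visible before bad data reaches a report.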

A lack of data profiling leads to poor data quality, which leads to inaccurate information and poor business performance. I believe you will find this tool a worthy addition to your data toolbox.

Download the Free Data Profiler: https://www.edq.com/experian-pandora/data-profiling/

If you’re interested in Business Intelligence & Tableau subscribe and check out all my videos either here on this site or on my Youtube channel.

Anthony B. Smoak

General Motors’ Information Technology: IT’s Complicated

General Motors’ relationship with technology has been one of innovation followed by periods of stagnation. Its technology staffing strategy of choice has been acquisition, followed by pure outsourcing, until it settled on its current insourcing approach. New entrants like Tesla and Uber are having a profound effect on a rapidly evolving automotive industry. GM, as an industry incumbent, must embrace new trends regarding autonomous vehicles and all the requisite software and technology to remain viable. The company currently believes that an insourced IT staff can help it develop competitive advantages.

The EDS Acquisition

General Motors has a long history of employing Electronic Data Systems Corporation (EDS) to service its information technology needs. The $2.5 billion acquisition of EDS in June of 1984 from billionaire Ross Perot was a move to help impose structure upon GM’s disorganized maze of data-processing systems and IT infrastructure. From the start there were culture clashes between the two organizations, although EDS saw significant revenue increases after the acquisition. The management styles of brash, outspoken EDS founder Ross Perot and the bureaucratic GM CEO Roger Smith were incompatible.

“Problems surfaced within a year when the differences in management style between Perot and Smith became evident. The August 1984 issue of Ward’s Auto World suggested ‘Mr. Perot is a self-made man and iconoclast used to calling his own shots … Roger B. Smith [is] a product of the GM consensus-by-committee school of management, never an entrepreneur.’” [1]

Additionally, six thousand GM employees were transferred from GM to EDS at lower pay [2], which served to stoke the fires of the culture clash.

From 1984 until it was eventually spun-off in 1996, EDS was a wholly owned subsidiary of GM. Although there was an ownership separation, the two behemoths were still tightly coupled in regard to technology staffing. The decision to divest itself of EDS was a strategic decision by GM to focus on its core competency of vehicle manufacturing. EDS also gained the freedom to win additional technology contracting work from other organizations.

1600px-EDS-Plano-TX-5071

HP Enterprise Services (formerly EDS, Electronic Data Systems) corporate headquarters in Plano, Texas (Wikipedia)

Post EDS Spin-Off

Post spin-off, General Motors continued to contract with EDS for technology services, as GM still accounted for a third of EDS’s revenues at the time. Perceived as Texas “outsiders” by the Detroit incumbents, EDS found it difficult to deal with the fragmented nature of GM’s systems across various business units and divisions. While EDS had the requisite technical expertise, it did not always have enough internal influence to navigate GM’s intense political landscape. Obtaining consensus amongst business units in regard to technology decisions was a challenging endeavor. In an attempt to address these issues, incoming GM CIO Ralph Szygenda spearheaded the creation of an internal matrixed organization called Information Systems & Services (IS&S).

IS&S was created as a matrix organization consisting of internal GM technology executives and various other technologists (e.g. business and systems analysts). The new organizational structure consisted of a dual reporting relationship; IS&S members simultaneously reported to the CIO organization and to their local business unit leadership.

Generally, matrix organizations are instituted in order to promote integration. The advantage of the matrix organization is that it allows members to focus on local initiatives in their assigned business unit and it enables an information flow from the local units to the central IT organization. General Motors is a famously siloed global organization. With the creation of IS&S, members could now promote information sharing between different functions within GM and address the cross-organizational problems that had challenged EDS.

The matrix structure is not without weaknesses. To quote a famous book, “No man can serve two masters.” Employees in a matrix organization often deal with additional frustrations as they attempt to reconcile their allegiances and marching orders from conflicting authorities.  “Matrix organizations often make it difficult for managers to achieve their business strategies because they flood managers with more information than they can process” [3]. From my own personal experiences of working with IS&S while employed at GM subsidiary Saturn, I observed that members were inundated with meetings as they tried to stay up to date with the plans and initiatives of the central IT organization while trying to remain focused on their internal business units.

A Return to EDS Insourcing

From the creation of IS&S in 1996 until 2012, GM relied upon a variety of outsourced contractors and vendors, such as Capgemini, IBM, HP and Wipro, to deliver information technology services. In 2010 GM renewed an existing technology outsourcing contract with the old EDS (by then part of HP) for $2 billion.

The general wisdom in regard to outsourcing is that companies will seek to focus on those core activities that are essential to maintain a competitive advantage in their industry. By focusing on core competencies, companies can potentially reduce their cost structure, enhance product or service differentiation and focus on building competitive advantages.

In a reversal of its longstanding IT sourcing strategy, GM made headlines in 2012 with the decision to insource and hire thousands of technologists to supplement its bare bones IT staff. New GM CIO Randy Mott reasoned that an internal technical staff would be more successful working with business units and would deliver technology needs at a cheaper cost than outside providers. These savings could then be used to drive IT innovation and fund the capabilities needed to compete in a rapidly evolving automotive industry.

“By the end of this year (2016) GM will employ about 11,500 IT pros, compared with 1,400 when Mott started at the company four years ago, flipping its internal/external IT worker ratio from about 10/90 to about 90/10, an astounding reversal” [4].

GM decided to hire over 3,000 workers from HP who were already working at GM as part of its Global Information Technology Organization. The move could be considered an act of “getting the band back together”, as HP had purchased EDS in 2008 for $13.9 billion. Randy Mott was the CIO of HP before assuming the same position at GM, and it is plausible that this fact factored into GM’s insourcing decision calculus.

It should be noted that insourcing IT personnel is not without risks. Insourcing requires a company to compete for technical resources which can be difficult in cutting edge technology areas. Furthermore, the complexities of running IT in house “requires management attention and resources that might better serve the company if focused on other value-added activities” [3].

GM’s Information Technology Transitions from Commodity to Innovation

The automotive industry is embarking upon significant changes as it deals with innovations and disruptions from the likes of Uber and Tesla. To illustrate this point, Tesla (founded in 2003) had a higher market capitalization than both GM and Ford for a period of three months in 2017 [5]. Auto industry incumbents like GM are focusing on automating and streamlining commoditized processes as well as applying IT to more innovative value-added functions (e.g. computerized crash testing, simulations to shorten vehicle development times and data analysis for profit optimization).

For much of its history, GM was widely perceived as an innovator before making a series of missteps that harmed this reputation. GM fell behind on hybrid engine development after taking an early technology lead in the electric vehicle space; the company defunded its lauded EV1 offering in the early 2000s to appease the bean counters. The company also starved innovative upstart Saturn of the necessary funds to introduce new models for a period of five years.

2000-2002_Saturn_SL_--_03-16-2012_2

2000-2002 Saturn SL2 (Wikipedia) The innovative Saturn subsidiary was starved of funds.

“G.M.’s biggest failing, reflected in a clear pattern over recent decades, has been its inability to strike a balance between those inside the company who pushed for innovation ahead of the curve, and the finance executives who worried more about returns on investment” [6].

After a government bailout in 2009, the company promised to emerge leaner and commit itself to technology leadership. Automakers are now focusing on software development as a source of competitive advantage. As a result, GM has opened four information technology innovation centers in Michigan, Texas, Georgia and Arizona. These locations were chosen in order to be close to recent college graduates from leading computer science programs.

GM Opens Fourth IT Innovation Center in Chandler, Arizona

One of GM’s four new Information Technology Innovation Centers

Additionally, GM purchased Cruise Automation, which is developing autonomous driving software and hardware; Cruise is even testing a ride-sharing app for autonomous vehicles. The purchase will bolster GM’s technology staff and efforts in an emerging space.

“Harvard Business School professor Alan MacCormack, an expert in product development management within the software sector, says that outsourcing even routine software development can carry risks for companies that are seeking innovation in that area. He notes that today’s vehicles have more software and computing power than the original Apollo mission. ‘Everybody can make a decent enough powertrain. But what differentiates you is what you can do with your software,’ he says of car makers generally. ‘Companies have to be careful that they don’t outsource the crown jewels’” [6].

The company also developed an internal private cloud, nicknamed Galileo, to improve its business and IT operations, and consolidated twenty-three outsourced data centers into two insourced facilities [7].

With its new cadre of insourced technologists, GM will need to find a way to bridge the ever-persistent culture gaps between innovative technologists, bureaucratic management and the Excel zealots in finance.

“IT is core, I think, to GM’s revival, and I think it will be core to their success in the future,” – Former GM CEO Dan Akerson [7]

References:

[1] http://www.fundinguniverse.com/company-histories/electronic-data-systems-corporation-history/

[2] Nauss, D.  (May 20, 1994). Pain and Gain for GM : EDS Spinoff Would Close Stormy, Profitable Chapter. Los Angeles Times. Retrieved from http://articles.latimes.com/1994-05-20/business/fi-60133_1_gm-employees

[3] Pearlson, K. E., Saunders, C. S., & Galletta, D. F. (2016). Managing and Using Information Systems: A Strategic Approach (6th ed.). Wiley.

[4] Preston, R. (April 14, 2016). General Motors’ IT Transformation: Building Downturn-Resistant Profitability. ForbesBrandVoice. Retrieved from https://www.forbes.com/sites/oracle/2016/04/14/general-motors-it-transformation-building-downturn-resistant-profitability/#67b37d551222

[5] Boudette, N. (July 6, 2017). Tesla Loses No. 1 Spot in Market Value Among U.S. Automakers. The New York Times. Retrieved from https://www.nytimes.com/2017/07/06/business/tesla-stock-market-value.html

[6] Leber, J. (November 5, 2012). With Computerized Cars Ahead, GM Puts IT Outsourcing in the Rearview Mirror. MIT Technology Review. Retrieved from https://www.technologyreview.com/s/506746/with-computerized-cars-ahead-gm-puts-it-outsourcing-in-the-rearview-mirror/

[7] Wayland, M. (September 18, 2017). GM Plots Next Phase of IT Overhaul: Unlocking the Potential of a Vast Data Empire. Automotive News. Retrieved from http://www.autonews.com/article/20170918/OEM06/170919754/gm-it-randy-mott

Featured Image Copyright: akodisinghe / 123RF Stock Photo

Tableau Filtering Actions Made Easy

This is a guest post provided by Vishal Bagla, Chaitanya Sagar, and Saneesh Veetil of Perceptive Analytics.

Tableau is one of the most advanced visualization tools available on the market today. It is consistently ranked as a ‘Leader’ in Gartner’s Magic Quadrant. Tableau can process millions of rows of data and perform a multitude of complex calculations with ease. But sometimes analyzing large amounts of data can become tedious if not performed properly. Tableau provides many features that make our lives easier with respect to handling datasets big and small, which ultimately enables powerful visualizations.

Tableau’s filtering actions are useful because they create subsets of a larger dataset to enable data analysis at a more granular level. Filtering also aids user comprehension of data. Within Tableau, data can be filtered at the data source level, sheet level or dashboard level. The application’s filtering capabilities enable data cleansing and can also increase processing efficiency. Furthermore, filtering enables the removal of unnecessary data points and the creation of user-defined date or value ranges. The best part is that all of these filtering capabilities can be accessed by dragging and dropping. Absolutely no coding or elaborate data science capabilities are required to use these features in Tableau.

In this article, we will touch upon the common filters available in Tableau and how they can be used to create different types of charts. After reading this article, you should be able to understand the following four filtering techniques in Tableau:

  1. Keep Only/Exclude Filters
  2. Dimension and Measure Filters
  3. Quick Filters
  4. Higher Level Filters

We will use the sample ‘Superstore’ dataset built in Tableau to understand these various functions.

1. Keep Only/Exclude Filters in Tableau

These filters are the easiest to use in Tableau. You can filter individual/multiple data points in a chart by simply selecting them and choosing the “Keep Only” or “Exclude” option. This type of filter is useful when you want to focus on a specific set of values or a specific region in a chart.

While using the default Superstore dataset within Tableau, if we want to analyze sales by geography, we’d arrive at the following chart.

1.png

However, if we want to keep or exclude data associated with Washington state, we can just select the “Washington” data point on the map. Tableau will then offer the user the option to “Keep Only” or “Exclude”. We can then simply choose the option that fits our need.

2.png

2. Dimension and Measure Filters

Dimension and measure filters are the most common filters used while working with Tableau. These filters enable analysis at the most granular level. Let’s examine the difference between a dimension filter and a measure filter.

Dimension filters are applied to data points which are categorical in nature (e.g. country names, customer names, patient names, products offered by a company, etc.). When using a dimension filter, we can individually select each of the values that we wish to include or exclude. Alternatively, we can identify a pattern for the values that we wish to filter.

Measure filters can be applied to data points which are quantitative in nature, (e.g. sales, units, etc.). For measure filters, we generally work with numerical functions such as sum, average, standard deviation, variance, minimum or maximum.

Let’s examine dimension filters using the default Tableau Superstore dataset. The chart below displays a list of customers and their respective sales.

3.png

Let’s examine how to exclude all customers whose names start with the letter ‘T’ and then subsequently keep only the top 5 customers by Sales from the remaining list.

One way would be to simply select all the customers whose names start with ‘T’ and then use the ‘Exclude’ option to filter out those customers. However, this is not a feasible approach when we have hundreds or thousands of customers. We will use a dimension filter to perform this task.

When you move the Customer Name field from the data pane to the filters pane, a dialogue box like the one shown below will appear.

4.png

As shown in the above dialogue box, you can select all the names starting with “T” and exclude them individually. The dialogue box should look like the one shown below.

5.png

The more efficient alternative is to go to the Wildcard tab in the dialogue box and select the “Exclude” check box. You can then choose the relevant option “Does not start with”.

6.png

To filter the top 5 customers by sales, right click on “Customer Name” in the Filters area, select “Edit Filter” and then go to the “Top” tab in the filter dialogue box. Next, choose the “By Field” option. Make your selections align to the following screenshot.

top-5-customers-by-sales-filter

After performing the necessary steps, the output will yield the top 5 customers by sales.

top 5 customers by sales

Let’s move on to measure filtering within the same Tableau Superstore dataset. We’re going to filter the months where 2016 sales were above $50,000. Without a measure filter applied, our sales data for 2016 would look like the following:

9.png

To keep only the months where sales were more than $50,000, move the Sales measure from the data pane to the filters pane. Observe the following:

10.png

Here, we can choose any one of the filter options depending upon our requirement. Let’s choose sum and click on “Next”. As shown below, we are provided with four different options.

11.png

We can then choose one of the following filter options:

  • Enter a range of values;
  • Enter the minimum value that you want to display using the “At least” tab;
  • Enter the maximum value that you want to display using the “At most” tab;
  • From the Special tab, select “all values”, “null values” or “non-null” values;

Per our example, we want to filter for sales that total more than $50,000. Thus, we will choose the “At least” tab and enter a minimum value of 50,000.

12.png

In the output, we are left with the six months (i.e. March, May, September, October, November, December) that have a sum of sales that is greater than $50,000.

13.png

Similarly, we can choose other options such as minimum, maximum, standard deviation, variance, etc. for measure filters. Dimension and measure filters make it very easy to analyze our data. However, if the dataset is very large, measure filters can lead to slow performance since Tableau must scan the entire dataset before applying the filter.

3. Quick Filters

Quick filters are radio buttons or check boxes that enable the selection of different categories or values that reside in a data field. These filters are very intuitive and infuse your visualizations with additional interactivity. Let’s review how to apply quick filters in our Tableau sheet.

In our scenario, we have sales data for different product segments and different regions from 2014 to 2019. Our data looks like the following:

14.png

We want to filter the data by segment and see data for only two segments (Consumer and Corporate). One way to do this would be to use a dimension filter, but what if we want to compare segments and change the selection every now and then? In this scenario, a quick filter would be a useful addition to the visualization. To add a quick filter, right click on the “Segment” dimension in the Marks pane and choose “Show Filter”.

15.png

Once we click on “Show Filter”, a box will appear on the right side of the Tableau screen. The box contains all constituent values of the Segment dimension. At this point, we could choose to filter on any segment value available in the quick filter box. If we were to select both Consumer and Corporate values, Tableau will display two charts instead of three.

16

Similarly, we can add other quick filters for region, country, ship status or any other dimension.

17.png

4. Higher Level Filters

Dimension, measure and quick filters are very easy to use and make the process of analyzing data hassle-free. However, when multiple filters are used on a large data source, processing becomes slow and inefficient, and application performance degrades with each additional filter.

The right way to begin working with a large data source is to filter it when initially making the connection to the data. Once the data is filtered at this stage, any further analysis is performed on the remaining subset, which makes data processing more efficient. These filters are called macro filters or higher-level filters (Tableau’s documentation refers to them as data source filters). Let’s apply a macro level filter on our main data source.

We can choose the “Add” option under the Filters tab in the top right corner of the Data Source window.

18.png

Once we click on “Add”, Tableau opens a window which presents an option to add various filters.

19.png

Upon clicking “Add” in the Edit Data Source Filters dialogue box, we’re presented with the entire list of variables in the dataset and can add a filter to whichever field we select. Let’s say we want to add a filter to the Region field and include only the Central and East regions in our data.

20.png

Observe that our dataset is filtered at the data source level. Only those data points where the region is either Central or East will be available for our analyses. Let’s turn our attention back to the sales forecast visualization that we used to understand quick filters.

21

 

In the above window, we observe options for only “Central” and “East” in the Region Filter pane. This means that our filter applied at the data source level was successful.

Hopefully, after reading this article, you are more aware of both the importance and the variety of filters available in Tableau. However, using unnecessary filters in unorthodox ways can degrade performance and impact overall productivity. Therefore, always assess whether you’re adding unnecessary options to your charts and dashboards that have the potential to negatively impact performance.

Author Bio:

This article was contributed by Perceptive Analytics. Vishal Bagla, Chaitanya Sagar, and Saneesh Veetil contributed to this article.

Use Parameters in Tableau to Enhance Your Tables

When you receive a requirement to make a boring Excel style table in Tableau, consider spicing up the table by incorporating parameters. One clever use of parameters enables you to incorporate user defined rows and columns into a Tableau table layout. As a user selects a parameter value (representing a column or row), the table is dynamically updated to show the column or row that was selected.

“Parameters are useful when you want to add interactivity and flexibility to a report, or to experiment with what-if scenarios. Suppose you are unsure which fields to include in your view or which layout would work best for your viewers. You can incorporate parameters into your view to let viewers choose how they want to look at the data.

When you work with parameters, consider the following two things that are important in making them useful:

  1. They need to be used in calculations.
  2. The parameter control needs to be displayed so that viewers can interact with it.”

In this video I will show you how to infuse an otherwise boring table with some parameter driven interactivity. Enjoy!

Reference: Tableau Online Help

Benford’s Law Visualization in Tableau

Benford’s law, also called the first-digit law, is an observation about the frequency distribution of leading digits in sets of numerical data. The law states that in many naturally occurring collections of numbers, the leading significant digit is likely to be small [1]. For example, in sets that obey the law, the number 1 appears as the most significant digit about 30% of the time, and the percentages decrease all the way down to a leading digit of 9, which appears 4.6% of the time.
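The percentages above come from the formula P(d) = log10(1 + 1/d). As a quick sketch outside of Tableau, the full expected distribution can be computed in a few lines of Python:

```python
import math

# Benford's expected frequency for each leading digit d: P(d) = log10(1 + 1/d)
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

for digit, probability in benford.items():
    print(f"{digit}: {probability:.1%}")
# 1 appears ~30.1% of the time, tapering down to ~4.6% for 9
```

Comparing these expected frequencies against the observed leading-digit frequencies in your data is the essence of the visualization.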

Why Run This Analysis?

When fraudsters fabricate data, they may not know to manufacture fake numbers in a manner that conforms to Benford’s Law. Constructing a Benford’s Law visualization in Tableau can help you determine whether your numerical data is fake, or at least raise doubts about its authenticity.

In short, remember that one isn’t always the loneliest number!

If you’re interested in Business Intelligence & Tableau subscribe and check out all my videos either here on this site or on my Youtube channel.

[1] https://en.wikipedia.org/wiki/Benford%27s_law

 

Return Unmatched Records with SQL and Microsoft Access

Over many years of building SQL scripts, I’ve often helped SQL novices perform the set difference operation on their data. This post will not provide in-depth coverage of SQL run plans and tuning minutiae, but I do want to provide a high level overview for the novice.

If we define set A as the three numbers {1, 2, 3} and set B as the numbers {2, 3, 4} then the set difference, denoted as A \ B, is {1}. Notice that the element 1 is only a member of set A.
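For readers who think in code, the same operation is built into Python’s set type; this tiny sketch mirrors the example above:

```python
# Set difference: elements in A that are not in B
A = {1, 2, 3}
B = {2, 3, 4}

print(A - B)  # {1}
# Note the operation is not symmetric: B - A yields {4}
```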

A picture is worth a thousand words as they say. A Venn diagram will be effective at illustrating what we’re trying to accomplish in this post.

Venn Diagram Difference

This blog post will cover using SQL and Microsoft Access to address capturing the shaded records in set A. If you have a database table named A and wanted to determine all of the rows in this table that DO NOT reside in another table named B, then you would apply the set difference principle.

LEFT OUTER JOIN & IS NULL SYNTAX

There are multiple ways to implement the set difference principle. It helps if there is a common join key between both sets of data when performing this analysis.

Suppose I were working with two tables: one containing inventory data and one containing order data. I could write the following SQL script to return all the inventory rows that do not reside in the orders table.

SELECT table_inventories.*
FROM   table_inventories
       LEFT OUTER JOIN table_orders
                    ON table_inventories.id = table_orders.id
WHERE  table_orders.id IS NULL  
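If you want to verify the pattern without standing up a database server, the same query can be run against an in-memory SQLite database from Python. The table contents below are hypothetical sample data invented for illustration:

```python
import sqlite3

# Build a throwaway in-memory database with hypothetical sample data
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE table_inventories (id INTEGER PRIMARY KEY, product TEXT);
    CREATE TABLE table_orders      (id INTEGER PRIMARY KEY);
    INSERT INTO table_inventories VALUES (1, 'Widget'), (2, 'Gadget'), (3, 'Sprocket');
    INSERT INTO table_orders VALUES (1);  -- only the Widget was ever ordered
""")

# LEFT OUTER JOIN / IS NULL: inventory rows with no matching order
rows = conn.execute("""
    SELECT table_inventories.*
    FROM   table_inventories
           LEFT OUTER JOIN table_orders
                        ON table_inventories.id = table_orders.id
    WHERE  table_orders.id IS NULL
    ORDER  BY table_inventories.id
""").fetchall()

print(rows)  # [(2, 'Gadget'), (3, 'Sprocket')]
conn.close()
```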

MICROSOFT ACCESS EXAMPLE

Consider the following tables in Microsoft Access. Observe that table_orders has fewer records than table_inventories.

Access Example Inventory Access Example Orders

We can construct a set difference select query using these tables to return all of the products in table_inventories that have not been ordered. Create a query in Microsoft Access in a similar fashion as shown below.

Access SQL Difference Join

The result of this query would produce the following two products that are not in table_orders.

Access Example Query Result

The Microsoft Access Query & View Designer would automatically generate the following SQL if you cared to open the Access SQL editor.

SELECT table_inventories.*
FROM   table_inventories
LEFT JOIN table_orders
ON table_inventories.id = table_orders.id
WHERE  (( ( table_orders.id ) IS NULL ));

Notice that LEFT JOIN is automatically created instead of LEFT OUTER JOIN. In Microsoft Access, the OUTER operation is optional. Also notice that Access loves to add additional parentheses for reasons known only to Microsoft.

Per Microsoft Access SQL Reference:

Use a LEFT JOIN operation to create a left outer join. Left outer joins include all of the records from the first (left) of two tables, even if there are no matching values for records in the second (right) table [1].

NOT EXISTS SYNTAX

Let’s step away from Microsoft Access for the remainder of this post. The NOT EXISTS approach provides similar functionality and, depending on the database engine and available indexes, can perform better than the LEFT OUTER JOIN & IS NULL syntax.

SELECT table_inventories.*
FROM   table_inventories
WHERE  NOT EXISTS (SELECT 1
                   FROM   table_orders
                   WHERE  table_orders.id = table_inventories.id);

EXCEPT SYNTAX (T-SQL)

Alternatively, we could use the SQL EXCEPT operator which would also accomplish the task of returning inventory ids that do not reside in the orders table (i.e. inventory items that were never ordered). This syntax would be appropriate when using SQL Server.

SELECT table_inventories.id
FROM   table_inventories
EXCEPT
SELECT table_orders.id
FROM   table_orders

Per Microsoft:

EXCEPT
Returns any distinct values from the query to the left of the EXCEPT operator that are not also returned from the right query [2].
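SQLite also supports EXCEPT with the same distinct-rows semantics, so the operator can be sanity-checked quickly from Python; the ids below are hypothetical sample data:

```python
import sqlite3

# Hypothetical data: three inventory ids, only id 1 was ever ordered
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE table_inventories (id INTEGER PRIMARY KEY);
    CREATE TABLE table_orders      (id INTEGER PRIMARY KEY);
    INSERT INTO table_inventories VALUES (1), (2), (3);
    INSERT INTO table_orders VALUES (1);
""")

# EXCEPT: distinct ids from the left query that are absent from the right
unordered = conn.execute("""
    SELECT id FROM table_inventories
    EXCEPT
    SELECT id FROM table_orders
    ORDER BY id
""").fetchall()

print(unordered)  # [(2,), (3,)]
conn.close()
```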

MINUS SYNTAX (ORACLE)

The following script will yield the same result as the T-SQL syntax. When using Oracle, make sure to incorporate the MINUS operator.

SELECT table_inventories.id
FROM   table_inventories
MINUS
SELECT table_orders.id
FROM   table_orders

Now take this tip and get out there and do some good things with your data.

Anthony Smoak

 

References:

[1] Access 2007 Developer Reference. https://msdn.microsoft.com/en-us/library/bb208894(v=office.12).aspx

[2] Microsoft T-SQL Docs. Set Operators – EXCEPT and INTERSECT (Transact-SQL). https://docs.microsoft.com/en-us/sql/t-sql/language-elements/set-operators-except-and-intersect-transact-sql

[3] Oracle Help Center. The UNION [ALL], INTERSECT, MINUS Operators. http://docs.oracle.com/cd/B19306_01/server.102/b14200/queries004.htm

Venn diagram courtesy of http://math.cmu.edu/~bkell/21110-2010s/sets.html