Penetration Testing: The Legal Way to Hack

Here is a small excerpt from a larger paper I submitted for an Information Security class back in 2013. This excerpt discusses the penetration test, which has grown in popularity thanks to several high-profile cyber security breaches in recent years. Enjoy!

The penetration test is the activity in which a security vendor or white hat hacker will deploy their skills acquired from training, certification and practical experience. The aim of the penetration test is to discover system or network vulnerabilities and exploit those vulnerabilities with the consent of the system owner(s). The penetration test scans for vulnerabilities and looks to actively exploit any uncovered vulnerabilities; it is a complement to the vulnerability scanners used during a vulnerability assessment. The penetration test helps identify which vulnerabilities are real and discern whether they can actually be exploited. “Vulnerability scanners can tell what potential risks are, but pen tests can provide the actual facts about the risks, including if they are exploitable and what information could be exploited if they were” (Howarth, 2010a, para. 2).

There are many different flavors of pen testing. A manual or automated test may be executed. The manual test is more involved and typically more costly if an outside authority is used, as it requires significantly more expertise than an automated test. The automated testing approach is carried out via the logic, rules and/or AI embedded in a software product. One such commercial product on the market is SAINTexploit, which not only exposes vulnerability points but also exploits those vulnerabilities to prove their existence. SC Magazine for security professionals rates SAINTexploit as an overall 4.75/5 star product for automated penetration testing. The annual cost of the product is $8,745 for 1,000 unique targets; the product must be renewed annually for continued usage (Stephenson, P., 2013). “Automated tools can provide a lot of genuinely good information, but are also susceptible to false positives and false negatives, and they don’t necessarily care what your agreed-upon scope says is your stopping point” (Walker, M., 2013, Chapter 11).
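At their simplest, automated scanners are built on top of basic network probes. As a toy illustration of that building block (not how SAINTexploit or any particular product actually works), a minimal TCP port check in Python might look like this:

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Return the subset of ports that accept a TCP connection.

    This is only the most primitive building block of an automated scan;
    real products layer service fingerprinting, vulnerability rules and
    exploit logic on top of probes like this one.
    """
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                open_ports.append(port)
    return open_ports

# Only ever point this at systems you have written permission to test, e.g.:
# scan_ports("localhost", range(1, 1025))
```

Note that even this trivial probe illustrates Walker's caution above: a filtered port can look closed (a false negative), and a transient service can look like an exposure (a false positive).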

The two types of penetration testing as defined by the EC-Council (the certification body for the Certified Ethical Hacker designation) are external and internal. External assessments test and analyze publicly available information, as well as conduct scanning and exploits from outside the network perimeter. The internal assessment is the opposite and is performed from within the network perimeter.

The concept of black, white and grey box testing also comes into play with respect to determining what information is known beforehand in order to carry out the penetration test. Walker (2013) notes that in a black box test the attacker has no information about the system or infrastructure beforehand. The black box test takes the longest to complete and is the closest simulation to an actual attack. White box testing simulates an insider with complete knowledge of the systems and infrastructure carrying out the penetration test. Finally, the grey box test provides limited information on the targeted systems and/or infrastructure.

Another parameter that can make the pen test more closely resemble real world conditions is the incorporation of social engineering. The white hat is given permission to use phishing attacks in order to gain access to passwords or other sensitive information. With phishing, the ethical hacker can design any number of email messages, websites, or even utilize phone calls under false pretenses in order to get a user to install malicious software or hand over sensitive information. The organization can gauge the results of these controlled social engineering attacks to see which users need a refresher in the company security policy or to determine if the current security policy is effective.

An organization carrying out an external penetration test by using an outside company should have the scope and the rules of the test clearly defined in contractual or service level agreement terms. In the event of a disruption of service or any other catastrophic event, both parties should know ahead of time which party is responsible for correcting any issues. Graves (2010, Chapter 15) asserts that the documents a white hat must have signed by the client before conducting a penetration test are:

  • “Scope of work, to identify what is to be tested”
  • “Nondisclosure agreement, in case the tester sees confidential information”
  • “Liability release, releasing the ethical hacker from any actions or disruption of service caused by the pen test”

Although penetration testing is widely used by organizations to test for system, network or human vulnerabilities, there are some limitations to its effectiveness. All of the potential varying client parameters around the pen test (e.g. financial systems are out of scope, no social engineering, etc.) can mask exploits that would still be available to an actual black hat attacker. Real world attacks can use a combination of social engineering, physical, and electronic methods, often coordinated by an experienced team. That combination of methods and expertise is very hard to simulate in a controlled environment. “The [enterprise’s] board and other stakeholders will not care about a clean network pen test if an attacker enters the building and, through a combination of social engineering and other low-tech gadgets like the hidden camera tie, steals your protected information” (Barr, J., 2012b).


Barr, J. G. (2012a, November). Recruiting Cyber Security Professionals. Faulkner Information Services. Retrieved March 23, 2013.

Graves, K. (2010). CEH—Certified Ethical Hacker—Study Guide. Sybex. Books24x7. Retrieved March 24, 2013.

Howarth, F. (2010a). Emerging Hacker Attacks. Faulkner Information Services. Retrieved April 17, 2013.

Stephenson, P. (2013). SAINTmanager/SAINTscanner/SAINTexploit v7.14. SC Magazine. Retrieved March 23, 2013.

Walker, M. (2012). CEH Certified Ethical Hacker: All-in-One Exam Guide. McGraw-Hill/Osborne. Books24x7. Retrieved March 24, 2013.


Create a Hex Map in Tableau the Easy Way

There are many different ways to create a hex map in Tableau. The hex map renders every state at the same size, which overcomes the discrepancies that make smaller states harder to interpret. Conversely, larger states (e.g. Alaska) can overwhelm a traditional map with their size.

I’ve found that the quickest and easiest way to build a hex map is to leverage a pre-built shape file. Shape files can be found at various open data sources.

In this video I will use a shape file created by Tableau Zen Master Joshua Milligan, who has a blog post where you can download the shape file I reference. Hats off to Joshua for creating and sharing this great shape file!

Tableau Quadrant Analysis Part 2: Dynamic Quadrants

There are a couple of tweaks that can be made to the Quadrant Analysis video I showed you earlier. We can enhance the first iteration of the analysis by making the visualization interactive. I will create parameter-driven quadrants where the reference lines are no longer fixed at a 50% intersection.

You can tweak the instructions to suit your actual visualization as necessary, but the concepts will remain the same.

We’re going to create two new parameters and have those parameters dynamically control the placement of our reference lines. Then we’re going to update the calculated field which defines the color of each data point, or mark, with the parameters we created. In this manner, the colors of each mark will dynamically update as the reference lines are adjusted.

To put this in English, as you change the parameter values, the reference lines will move and the mark colors will update.

Watch the video above and/or follow along with the instructions below.

Remove Existing Reference Lines:

Step 1:

  • Remove all existing reference lines from the original quadrant analysis. Simply right click on a reference line and select “Remove”.
  • Also remove the annotations from the 4 quadrants.

Create Parameters

Step 2:

  • Create a parameter named “Percentile FG Pct” (without quotes). Select the dropdown triangle next to the “Find Field” icon and choose “Create Parameter”.


Make sure your parameter is set up as a “Float” and the Range of values reflects the picture below. The Display Format will be set as “Percentage” with zero decimal places.


Step 3: Duplicate Your Parameter

  • Right click on your new parameter and select “Duplicate”.
  • Right click on the duplicated parameter and select “Edit”.
  • Name the new parameter “Percentile Wins”.

Step 4: Show the Parameter Controls

  • Right click on each parameter and select “Show Parameter Control”.
  • Right click on each drop down triangle in the upper right corner of the Parameter Control and select “Slider”.


Step 5: Add Reference Lines

  • Right click on the Percentile of FG% Axis at the bottom of the viz. Select “Add Reference Line”. The Line Value should refer to the X axis parameter (i.e. Percentile FG Pct). For the Line Formatting I chose the third dashed line option.


  • Right click on the Percentile of Wins Axis on the left side of the viz. Select “Add Reference Line”. The Line Value should refer to the Y axis parameter (i.e. Percentile Wins).

At this point you should have two parameter controls that adjust the placement of the respective reference lines on the visualization.

However, you’ll notice that the colors of the marks do not change as the reference lines move in increments.

Step 6: Edit the original calculated field to use parameters instead of hardcoded percentage values

Right click on the calculated field (i.e. “Color Calc” in my case), select “Edit” and change all references of “.5” to the corresponding parameter name.

  • The original calculated field:

IF RANK_PERCENTILE(SUM([FG%])) >= .5 AND RANK_PERCENTILE(SUM([Wins])) >= .5 THEN 'TOP RIGHT'
ELSEIF RANK_PERCENTILE(SUM([FG%])) < .5 AND RANK_PERCENTILE(SUM([Wins])) >= .5 THEN 'TOP LEFT'
ELSEIF RANK_PERCENTILE(SUM([FG%])) < .5 AND RANK_PERCENTILE(SUM([Wins])) < .5 THEN 'BOTTOM LEFT'
ELSEIF RANK_PERCENTILE(SUM([FG%])) >= .5 AND RANK_PERCENTILE(SUM([Wins])) < .5 THEN 'BOTTOM RIGHT'
END

Is edited to become:

IF RANK_PERCENTILE(SUM([FG%])) >= [Percentile FG Pct] AND RANK_PERCENTILE(SUM([Wins])) >= [Percentile Wins] THEN 'TOP RIGHT'
ELSEIF RANK_PERCENTILE(SUM([FG%])) < [Percentile FG Pct] AND RANK_PERCENTILE(SUM([Wins])) >= [Percentile Wins] THEN 'TOP LEFT'
ELSEIF RANK_PERCENTILE(SUM([FG%])) < [Percentile FG Pct] AND RANK_PERCENTILE(SUM([Wins])) < [Percentile Wins] THEN 'BOTTOM LEFT'
ELSEIF RANK_PERCENTILE(SUM([FG%])) >= [Percentile FG Pct] AND RANK_PERCENTILE(SUM([Wins])) < [Percentile Wins] THEN 'BOTTOM RIGHT'
END



In the above formula both [Percentile FG Pct] and [Percentile Wins] are parameter values that have replaced the hardcoded values of “.5”.
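For readers who want to sanity-check the quadrant logic outside of Tableau, here is a rough Python equivalent of the calculated field. The `percentile_rank` helper is my approximation of Tableau's RANK_PERCENTILE, and the `p_fg`/`p_wins` arguments stand in for the two parameters (their defaults mirror the original hardcoded .5 values):

```python
def percentile_rank(values, v):
    """Fraction of values <= v; a rough stand-in for Tableau's RANK_PERCENTILE."""
    return sum(1 for x in values if x <= v) / len(values)

def quadrant(fg_pcts, wins, fg, w, p_fg=0.5, p_wins=0.5):
    """Classify one team's mark relative to the two movable reference lines."""
    x = percentile_rank(fg_pcts, fg)
    y = percentile_rank(wins, w)
    if x >= p_fg and y >= p_wins:
        return "TOP RIGHT"
    elif x < p_fg and y >= p_wins:
        return "TOP LEFT"
    elif x < p_fg and y < p_wins:
        return "BOTTOM LEFT"
    else:
        return "BOTTOM RIGHT"

# Example: a team at the top of both measures lands in the top-right quadrant.
fg_pcts = [0.44, 0.46, 0.48, 0.50]
wins = [20, 35, 50, 60]
print(quadrant(fg_pcts, wins, 0.50, 60))  # TOP RIGHT
```

Raising `p_fg` or `p_wins` here is the code analogue of sliding a parameter control: the same mark can flip quadrants without any change to the underlying data.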

Final Result:

As you change your parameter values on the parameter control, the corresponding reference line moves and the color of each mark changes automatically to fit its new quadrant.


Before w/ Static Quadrants

Notice how the marks are colored according to their respective quadrant in the screen print below.


After w/ Parameter Driven Quadrants

I hope you enjoyed this tip. Now, get out there and do some good things with your data!

Anthony Smoak

Quadrant Analysis in Tableau

Release your inner Gartner and learn how to create a 2×2 matrix in Tableau. In this video I will perform a quadrant analysis in Tableau using NBA data to plot FG% vs Wins. Since the data points will be tightly clustered, we’ll use percentiles to spread them out and create a calculated field to color the data points per respective quadrant.

Make sure to check out part 2 of this series where I will show you how to make the quadrant boundaries interactive.

If you’re interested in Business Intelligence & Tableau subscribe and check out my videos either here on this site or on my Youtube channel.

The Definitive Walmart Technology Review

Did you know that the legal name “Wal-Mart Stores, Inc.” was changed effective Feb. 1, 2018 to “Walmart Inc.”? The name change is intended to reflect the fact that today’s customers don’t just shop in stores.

I’ve kept an eye on Walmart because historically the company was the leading innovator in advancing retail-focused technology and supply chain strategy. Even though Walmart isn’t the typical “underdog”, it currently finds itself in that position in the fight for online retail supremacy; and everyone can appreciate an underdog (see Philadelphia Eagles). Currently Walmart is locked in a fierce battle with Amazon to carve out a more substantial space in digital and online retail. As such, the company is bolstering its capabilities by focusing on technology-enabled business processes, training and digital growth to keep pace with Amazon’s world domination efforts.

Walmart is experimenting with cutting edge technology such as virtual reality, autonomous robots, cloud storage platforms, cashier-less stores and even blockchain. It’s also formed various online retail and technology-based alliances to keep pace with Team Bezos. Walmart’s kitchen sink technology strategy seems to be paying dividends from an online sales perspective. E-commerce industry luminary Marc Lore’s influence can be seen in some of the innovative technology plays.


Walmart, yes Walmart, is getting in on the blockchain ledger revolution (or hype). The company plans to team up with IBM,, and China-based Tsinghua University to create a blockchain food safety alliance.

Here is how the partnership will work according to ZDNet:

  • Walmart, JD, IBM and Tsinghua University will work with regulators and the food supply chain to develop standards and partnerships for safety.
  • IBM provides its blockchain platform
  • Tsinghua University is the technical advisor and will provide expertise in technology and China’s food safety ecosystem.
  • Walmart and JD will help develop and optimize technology that can be rolled out to suppliers and retailers.

Walmart has shown that blockchain technology has reduced the time it takes to trace food from farm to store from days to seconds. During product recalls this capability could prove useful for the retailer.

If Walmart were to offer a more investor appealing use of blockchain (e.g. a “Sam’s-Coin” ICO), you could count me in for a high two figure investment.

Ok Google

Google Home Walmart

Image courtesy of ZDNet

Straight from “the enemy of my enemy is my friend” playbook, Walmart and Google announced a partnership to make Walmart’s items available on Google’s shopping service, Google Express.

The New York Times reports that “it’s the first time the world’s biggest retailer has made its products available online in the United States outside of its own website”. We can readily see what Walmart gets out of the alliance (expanded presence on the dominant search engine) but interpreting Google’s angle requires a bit more perspicacity.

Google fears that its search engine is being bypassed by consumers who go straight to Amazon to search for products. Google (you may have heard) dabbles a bit in search and online advertising. A substantial shift in product search behavior that favors Amazon is bad for business. If Google can expand its own online marketplace with Walmart’s appreciable offerings, as well as entice customers to use Google Home and the mobile-based Google Assistant to locate products, then the company can retain a greater share of initial product searches. Google Express already offers products from Walmart competitors Target and Costco, although Walmart’s collaboration offers the largest number of items.

“Walmart customers can link their accounts to Google, allowing the technology giant to learn their past shopping behavior to better predict what they want in the future. Google said that because more than 20 percent of searches conducted on smartphones these days are done by voice, it expects voice-based shopping to be not far behind.”

An existing Walmart application feature called “Easy Reorder” is slated for integration with voice enabled shopping via Google Assistant. Currently, when a consumer logs into the Walmart mobile app, they can easily see their most frequently purchased in-store and online items and easily reorder those items. Integration with Google Express provides an additional data channel to bolster the effectiveness of this Walmart offering.

The Matrix:

Virtual reality would not be the first technology play that one would likely associate with Walmart. However, the company’s tech incubator (Store No. 8) has purchased Spatialand, a VR development tools company. Spatialand has previously worked with Oculus, Intel and “rap rock” artists Linkin Park to create virtual content.

Walmart’s intent is to use Spatialand to develop “immersive retail environments”. My expectations aren’t high that this acquisition will pay off in the near to medium term, but the company is demonstrating that it is trying to be on the vanguard of future retail technology. One can imagine this acquisition eventually enabling Star Trek “holodeck” capabilities where customers can enter virtual stores or dressing rooms and interact with products while in the comfort of their homes.

I propose that Spatialand and Walmart owned Bonobos would make ideal mash-up partners. Instead of trekking to a physical Bonobos store and trying on shirts and or slacks, consumers can create an avatar with similar dimensions and play virtual dress-up in 3D. The garments can then be shipped directly.

Additionally, Walmart is actually using Oculus headsets to train employees at its 170 training academies. The company has partnered with STRIVR, a virtual reality startup based in Menlo Park. I envision virtual customers stampeding through entrances on a Black Friday opening.

“STRIVR’s technology allows employees to experience real-world scenarios through the use of an Oculus headset, so that employees can prepare for situations like dealing with holiday rush crowds or cleaning up a mess in an aisle.”

There’s an App for That:

Walmart is trying to balance keeping its in-store traffic high while accommodating its growing mobile customer base. The company recently enhanced its Walmart application with a new “Store Assistant” feature that activates when a customer walks in the door. The app will allow customers to build shopping lists, calculate costs and locate in-store items at all of its domestic locations.

“So-called mobile commerce revenue — mostly generated via smartphones — will reach $208 billion, an annual increase of 40 percent, EMarketer forecasts.” – Bloomberg

Channeling its inner Tim Cook, Walmart launched a nationwide rollout of the Walmart Pay system as an additional application enhancement.

“To use the three-step payment system, shoppers link their chosen payment method to their account, open the camera on their smartphone and snap a photo of a QR code at the register. That notifies the app to process the customer’s payment. Shoppers can link their credit or debit cards, prepaid accounts or Wal-Mart gift cards to their payments; however, they still cannot use ApplePay.” – CNBC

Rise of the Machines:

As labor costs rise and technology advances, we can be sure of one thing: robots are coming to take all of our jobs, and Walmart isn’t doing much to disabuse us of this notion. As part of a pilot, the company is deploying autonomous robots to about 50 locations to help perform “repeatable, predictable and manual” tasks. Primary tasks include scanning store shelves for missing stock, calculating inventory and locating mislabeled and unlabeled products. The robots stay docked in charging stations inside the store until they are activated and given an autonomous “mission”.

According to Reuters, Walmart’s CTO Jeremy King states that the robots are 50% more productive and can scan shelves three times faster at a higher accuracy. Walmart’s current carbon-based units (my words) can only scan the shelves about twice a week.

While Walmart insists that the robots won’t lead to job losses, I say to remember that technology always marches forward. Today’s “repeatable, predictable and manual” tasks are tomorrow’s non-repeatable, unpredictable and automated tasks.

Walmart scanner

Image courtesy of Walmart

Additionally, in 2016 the company “patented a system based on mini-robots that can control shopping carts, as well as complete a long list of duties once reserved for human employees.” Keep an eye on those robots as Walmart does not have a reputation for overstaffing.

In all seriousness, shelf inventory checks ensure that customer dollars aren’t left on the table due to un-shelved items. If Walmart can significantly lower the occurrences of un-shelved products with its army of shelf scanning Daleks, then the robots will pay for themselves.

Mr. Drone and Me

walmart drones

Never missing an opportunity to improve supply chain efficiency, reduce labor costs and keep pace with Amazon, Walmart is experimenting with drones. The company’s Emerging Science division has been tasked to consider future applications of drone technology to enhance operational efficiency.

Currently, inventory tracking at the company’s fulfillment centers requires employees to use lifts, harnesses and hand-held scanning devices to traverse 1.2 million square feet (26 football fields) of warehouse space; a process that can take up to a month to complete. If you consider that Walmart has close to 190 distribution centers domestically, the inventory process consumes a significant amount of labor hours when aggregated across the company.

The current plan is to utilize fully automated quad-copter drones mounted with cameras to scan the entire warehouse for inventory monitoring and error checking.

“The camera is linked to a control center and scans for tracking number matches. Matches are registered as green, empty spaces as blue, and mismatches as red. An employee monitors the drone’s progress from a computer screen.”
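The matching logic in that description is simple enough to sketch. This is a hypothetical illustration of the green/blue/red status check (the function and argument names are mine, not Walmart's):

```python
def slot_status(expected_tag, scanned_tag):
    """Classify one warehouse slot per the color scheme quoted above.

    expected_tag: tracking number the inventory system expects in the slot.
    scanned_tag:  tracking number the drone camera actually read,
                  or None if the slot is empty.
    """
    if scanned_tag is None:
        return "blue"   # empty space
    if scanned_tag == expected_tag:
        return "green"  # tracking number match
    return "red"        # mismatch flagged for an employee to review
```

The hard part of the real system is of course the computer vision that produces `scanned_tag`, not this final classification step.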

Drones would eventually replace the inventory quality assurance employees.

Millenial Digs:

Walmart Austin ATX
The lobby at Walmart ATX in downtown Austin. Jay Janner/Austin American-Statesman

In an effort to increase its appeal to the young, hip and tech savvy, Walmart has opened a new engineering tech design center in Austin. The new millennial digs are located in a renovated 8,000 square foot space. “Walmart ATX” will house minds that work with cutting edge technology such as machine learning, artificial intelligence, blockchain, the internet of things and other emerging technologies. Factors such as a deep talent pool and low cost of living drove the creation of this Austin hub.

Here’s hoping Amazon decides to stay far away from its retail rival and brings its talents to Atlanta, Georgia.


The saying goes that Amazon is trying to become more like Walmart and Walmart is trying to become more like Amazon. Walmart teaming up with Japanese e-commerce giant Rakuten to sell e-books solidifies this sentiment. It should go without saying that Amazon has a sizable lead in selling e-books. However, Walmart is leaving no stone unturned as far as offering products that keep e-commerce shoppers from Amazon’s web presence.

Online Grocery Pickup:

Walmart is experimenting with allowing customers to place their orders online for pickup at a local store. The scheme currently requires the company’s human employees (eventually robots) to walk the aisles using a handheld device in order to fulfill customer orders. The device acts as an in-store GPS that maps the most efficient route to assemble the customer order.

Customers then pull into a designated pickup area where live human beings (eventually robots) will dispense the pre-assembled order.

I’m not kidding about “eventually robots” dispensing the pre-assembled order. Walmart currently has an automated kiosk in Oklahoma City that dispenses customer orders from internal bins. Customers walk up to the interface and input a code which then enables the kiosk to retrieve the order. Hal, open the pod bay doors; the future is here and apparently its name is Oklahoma City.

Walmart kiosk

Courtesy of The Oklahoman

These approaches address the “last mile” problem which plagues large e-commerce players and start-ups alike. As consumer preferences shift from physical stores to online channels, repurposing stores into dual e-commerce fulfillment centers wrings additional utility from these assets.

In Home Delivery:

Another innovative e-commerce “last mile” proposal from Walmart involves the creation of a smart home delivery pilot. By partnering with smart lock startup August Home, outsourced delivery drivers will be supplied a one-time passkey entry into a customer’s home to unload cold and frozen groceries into the refrigerator. The homeowner is alerted via phone notification that the driver has entered the property and can watch the in-home delivery livestreamed from security cameras. An additional notification is sent to the consumer when the door has automatically locked. This limited-scale program is only being piloted in Silicon Valley (of course).

Per the Washington Post:

“This is a group of people who are already used to a certain level of intrusiveness… But God help the teenager playing hooky or the family dog who’s not expecting the delivery man.”

I can envision a future sci-fi use case involving “smart fridges” and automatic home replenishments. This pilot move is a search for an advantage in grocery delivery as Amazon recently purchased Whole Foods without overtly signaling what disruptive services may emerge from the amalgamation.

Smart Cart Technology

Jet Smart Cart

When Walmart made the largest ever purchase of a U.S. e-commerce startup, acquiring for $3.3 billion, the company was looking for a way to ramp up online sales and infuse itself with fresh perspectives for online selling.

As I’ve mentioned previously, has the potential to infuse Walmart with much needed digital innovation. This fresh perspective can add tremendous value to the organization as a whole. The “old guard” rooted in Walmart’s core business model needs to allow acquisitions to thrive instead of imposing the more conservative legacy culture.

The Jet infusion of innovative ideas back to the mothership is happening. Current Walmart e-commerce head Marc Lore launched around an innovative “smart cart” system that offers the potential of lowering the price of customer orders. Here is how it works according to Forbes:

“If you have two items in your cart which are both located in the same distribution center and can both fit into a single box, then you will pay one low price. If you add a third item that is located at a different distribution center and cannot be shipped in a single shot with the other two items, you will pay more. As you shop on the site, additional items that can be bundled most efficiently with your existing order are flagged as ‘smart items’ and an icon shows how much more you’ll save on your total order by buying them.”

The order price can be further lowered if customers use a debit card or decline returns. This smart cart process is expected to launch on Walmart’s flagship site in 2018.
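The bundling idea in that Forbes description can be sketched as a toy model. This is not Jet's actual pricing engine; the flat per-center fee and all names are invented for illustration:

```python
# Toy smart-cart model: each distribution center that must ship a box
# adds one flat fee, so items that can ship together cost less overall.

SHIPMENT_FEE = 5.00  # hypothetical flat fee per distribution center

def order_total(items):
    """items: list of (price, distribution_center) tuples."""
    centers = {dc for _, dc in items}         # centers that must each ship a box
    goods = sum(price for price, _ in items)  # merchandise subtotal
    return round(goods + SHIPMENT_FEE * len(centers), 2)

two_items = [(10.00, "DC-East"), (4.00, "DC-East")]
print(order_total(two_items))                        # one center, one fee: 19.0
print(order_total(two_items + [(4.00, "DC-West")]))  # second center adds a fee: 28.0
```

In this model, flagging a "smart item" amounts to suggesting products whose distribution center is already in the set `centers`, since they add merchandise cost but no new shipment fee.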

Who Needs Cashiers?

Customers shopping at roughly 100 stores across 33 states can participate in Walmart’s “Scan and Go” service. Via a dedicated mobile app, customers can scan the barcodes of items as they shop, pay through the app using Walmart Pay, and then exit the store after showing a digital receipt to an employee. As customers shop and scan with their phones, they can observe the running total of their purchases. This service is currently available at all Sam’s Club locations.

In this case Walmart is keeping pace with grocery competitor Kroger, which is also experimenting with digital checkout experiences. Kroger is rolling out a “Scan, Bag & Go” service to 400 of its stores.

Additionally, Walmart’s skunkworks retail division “Store No. 8” is working on a futuristic project codenamed “Project Kepler”. This initiative goes a step further and eliminates both cashiers and checkout lines by using advanced imaging technology on par with Amazon’s “Amazon Go” concept. As customers take items off of shelves, they are automatically billed for their purchases as they walk out of the store. The acquisition is in play here as well, since this initiative is being led by Jet’s CTO Mike Hanrahan.

Grocers already operate on razor thin margins; therefore, removing cashier interaction from the shopping equation fits in with the goal of lowering labor costs. Walmart employs approximately 2 million reasons to turn this future technology into reality.

Send in the Clouds:

According to the Wall Street Journal, Walmart is telling some of its technology vendors that if they want to continue being a technical supplier then they cannot run applications for the retailer on the leading cloud computing service, Amazon Web Services. Vendors who do not comply run the risk of losing key Walmart business. This is where we open our strategy textbooks to Porter’s Five Forces and key in on “Bargaining Power of Buyers” in the retail information technology provider space. The Economist reports that in 2015 Walmart poured a staggering $10.5 billion into information technology, more than any other company on the planet. To misquote E.F. Hutton, when Walmart speaks, you listen if you’re a technology vendor. The company’s cloud ultimatum is responsible for an uptick in usage of Microsoft’s Azure offering.

As I’ve mentioned in other posts, Walmart is known for its “build not buy” philosophy in regard to technology. Most of its data is housed on its own servers or on Microsoft Azure, which is the primary infrastructure provider for e-commerce subsidiary According to CNBC, about 80 percent of Walmart’s cloud network is now in-house.

Walmart’s cloud application development is facilitated by the company’s own open source cloud application development platform named OneOps. The aim of OneOps is to allow users to deploy applications across multiple cloud providers (i.e. allow users to easily move away from Amazon Web Services). Walmart has also been a huge contributor to OpenStack, which is an open source cloud offering and has been working with Microsoft, CenturyLink and Rackspace.

OneOps was originally developed by Walmart Labs and has since been released as an open source project so that Walmart can benefit from a broader community that’s willing to offer improvements. The main codebase is currently available on GitHub.

Foreign Investment:



Walmart is currently challenging tech titans Amazon and China’s Alibaba for a lucrative stake in India’s burgeoning online retail market. India’s expanding middle class makes its online market a lucrative target. The market is purported to reach $220 billion by 2025 according to Bank of America Merrill Lynch. Walmart is essentially barred from outright owning physical store locations in India due to the country’s restrictive foreign investment regulations. Foreign ownership for multi-brand retailers is limited to 51%, and retailers must source 30% of their goods from small suppliers, which poses a difficulty for Walmart. Walmart uses its global buying power to squeeze deep discounts from major suppliers such as Unilever and Procter and Gamble. Smaller Indian firms will have more difficulty yielding to such exorbitant price concessions.

Therefore, Walmart is currently in talks to purchase a sizable stake in Indian online retailer Flipkart. Flipkart is a highly attractive opportunity because it has been able to effectively compete with Amazon in India despite being outspent by Team Bezos. Flipkart currently has a 44% market share in India, running ahead of Amazon’s 31% share. Walmart’s multibillion dollar investment will likely value Flipkart at $20 to $23 billion.

An infusion of capital from Walmart makes sense for both parties; Flipkart can hold off attacks from Amazon while Walmart gets a piece of the action in a growing and lucrative online market. Amazon has stated its intention to invest $5 billion in India in order to beef up the number of its fulfillment centers. Ironically, Flipkart was launched in 2007 by two former Amazon employees, Sachin and Binny Bansal.

Walmart isn’t the only company looking for a piece of Flipkart as Google is also purported to make a sizable investment in the Indian firm at a valuation of $15 to $16 billion.

Walmart has had difficulties operating in India previously as evidenced by its now disbanded partnership with Bharti Enterprises. The two companies built 20 superstores branded as Best Price Modern Wholesale, but the venture fizzled due to aforementioned regulatory restrictions.

Meanwhile in China, Walmart has partnered with, a fierce Alibaba rival. Walmart and JD will merge their membership systems so that members can receive similar discounts at both retailers. In addition, the two companies will jointly build a system that enables to fulfill customer orders from Walmart inventories. Walmart initially had its own Chinese marketplace, Yihaodian, but sold it to JD in 2016 because of its small market share relative to both JD and Alibaba.

Header Image Copyright: moovstock / 123RF Stock Photo

Advanced Bar Chart Labeling in Tableau

Here is a quick and easy yet advanced tip for placing your labels just inside the bars of your bar chart. This tip provides an alignment option in addition to the default ones. Credit to Andy Kriebel for the tip.

If you’re interested in Business Intelligence & Tableau subscribe and check out my videos either here on this site or on my Youtube channel.

My Journey to Obtaining the Certified Business Intelligence Professional (CBIP) Certification

As of the date of this blog post I can proudly say that I have completed the suite of exams that comprises the Certified Business Intelligence Professional (CBIP) designation. My aim in taking the tests was threefold.

  1. Discover how my knowledge and experience stacked up against professional standards issued by a reputable body in data and computing.
  2. Find additional motivation to constantly educate myself regarding data and business intelligence since the certification requires renewal.
  3. Bolster credentials, because it never hurts one’s bottom line to show you have expertise in your profession.

If you’ve found this page via search, you’re no doubt already acquainted with this certification offered by The Data Warehousing Institute (TDWI). I started with what I thought would be the most difficult test based upon what I had researched: the Information Systems Core (i.e. IS Core). However, this was not the case; in my opinion the specialty exam was the most difficult.

Test 1: Information Systems Core (i.e. IS Core):

12/15/17: I wish I could share some detailed information about the test but that is not allowed per CBIP guidelines. All I can say is that the scope of information covered is very broad.

“The IS Core examination (Information Systems Technology) covers the base 4 year model curriculum from ACM and AIS for information systems – the entire spectrum of organizational and professional skills, teams and supervision, strategic organizational systems development and project management, systems development, web development, databases and systems integration – the subject matter, testing your ability to recognize, differentiate, and understand the definitions of the concepts covered.” – CBIP Examinations Guide

For adequate preparation, you’ll first need to spend $135 on the examinations guide. Unfortunately, the examinations guide is not something you can simply study and then go sit for the test. It is basically a reference book that points you to other sources to consider for test preparation. The guide also outlines the various subject areas that will appear on the test. Let me stress that you should not sit for this test without pertinent work experience and education. You will need to draw upon your knowledge and experiences to have a legitimate shot at passing.

My intent was to devote about 3 weeks’ worth of study time to tackle the IS Core but my work severely got in the way of that plan. I ended up devoting only ten hours of study time in total, but this was certainly not by design.

First, I took the sample test of 42 questions in the examinations guide and fared pretty well. This gave me some confidence to keep my scheduled exam date when I found out that work was going to shorten my available study time.

The test was difficult. I’m not going to sugarcoat this aspect. While taking the proctored exam, I could count on two hands the questions (out of 110) where I was confident that I had chosen the correct answer. Part of the exam’s difficulty is that you are presented with four choices where at least two could be a satisfactory answer.

Test 2: Data Foundations

12/31/17: I performed much better on the Data Foundations test, scoring well above the mastery level threshold of 70%. Buoyed by my performance on the Information Systems Core test, I scheduled only about 10 hours of study time in preparation for Data Foundations. I used one reference book to prepare. My advice for this test would be to have an understanding of metadata concepts (a subject area already cited in the CBIP Examinations Guide). Make the DAMA Guide to the Data Management Body of Knowledge your best friend. I used the 1st edition instead of the 2nd in my preparation since I already had the 1st edition in my possession.

Test 3: Specialty Exam: Data Management

1/14/18: This was the most difficult of the three exams that I sat. That may have been a function of my limited preparation, as I only put in about 3 hours of study time. The scope of topics on this exam is so broad that I planned to again leverage my experience and knowledge to power me through. The majority of questions required narrowing the choices down to the two best answers and then selecting one; there is a persistent overlap between what could be acceptable and what the exam decrees is the one right answer. I’m not giving away anything that isn’t already on the outline shared by TDWI, but you’ll really need to brush up on your knowledge of data governance, data management, data warehousing and master/reference data.

My Background:

Not to be immodest (I only want to share my mindset for sitting the exams with somewhat minimal study), but I’ve been working with data for 15-plus years and hold both an MBA and a Master’s in Information Management. Before becoming a BI/data and analytics consultant, I worked back office at a bank supporting the monthly update of three credit risk data marts. Thankfully, all of that hard-gained experience working in a financial institution’s back office paid off. Surprisingly, the number of right answers I gained from study time was minimal. Your mileage may vary in this regard.

Reference Material:

Here are the reference materials I used in my preparation; fortunately, (with the exception of the CBIP manual) I already had these in my library due to graduate studies. Depending upon your level of experience, you may need to supplement your effort with additional books. I will say that both Wikipedia and Search Business Analytics were very helpful for looking up unfamiliar terms.


Best of luck to you on your journey to CBIP certification!

Photo Copyright: dragonimages / 123RF Stock Photo

Create A Barbell/DNA Chart in Tableau with NBA Data


A Barbell, Dumbbell or DNA chart should be considered when you want to illustrate the difference or degree of change between two data points. In this video I will use NBA data from the 2016-2017 season to illustrate the difference between team wins and losses.
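The video builds the chart in Tableau, but the underlying data shape is simple enough to sketch outside of it. Here is a small Python illustration (with a few 2016-17 records used purely as sample values) of the rows a barbell chart consumes: two endpoints per team plus the gap the connecting line represents.

```python
# Sample 2016-17 NBA records (wins, losses), used here only as illustrative
# input; a barbell chart draws one point per measure and joins them per team.
records = {
    "Golden State Warriors": (67, 15),
    "San Antonio Spurs": (61, 21),
    "Houston Rockets": (55, 27),
}


def barbell_rows(records):
    # Each row holds the two endpoints of the "barbell" and the gap between
    # them, sorted so the biggest winners appear first.
    return [
        {"team": team, "wins": w, "losses": l, "gap": w - l}
        for team, (w, l) in sorted(records.items(), key=lambda kv: -kv[1][0])
    ]


for row in barbell_rows(records):
    print(f'{row["team"]}: wins={row["wins"]}, losses={row["losses"]}, gap={row["gap"]}')
```

In Tableau the same idea is expressed with wins and losses on a shared axis and a line mark connecting the two measure values per team.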

If you’re interested in Business Intelligence & Tableau subscribe and check out my videos either here on this site or on my Youtube channel.

Data Profiling with Experian Pandora


Experian Pandora is a robust data profiling and data quality tool that enables users to quickly obtain useful statistics related to their data. Once you load your data into the interface, you can identify statistical anomalies and outliers in the data within seconds. To gain these types of insights, normally I have to write SQL scripts or use the SSIS Data Profiling Tool against SQL Server data. Experian Pandora is much easier to use against data in .csv files or Excel spreadsheets since you can simply drag and drop those items into the interface.
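To make the kind of statistics concrete, here is a minimal, standard-library-only Python sketch of a per-column profile; it is not Pandora’s API, just an illustration of the null counts, ranges, distinct counts and outlier flags such tools compute automatically.

```python
import statistics


def profile_column(values):
    """Quick per-column profile: the kind of statistics a data profiling
    tool surfaces automatically (null counts, range, distinct values, outliers)."""
    present = [v for v in values if v is not None]
    profile = {
        "count": len(values),
        "nulls": len(values) - len(present),
        "distinct": len(set(present)),
        "min": min(present),
        "max": max(present),
        "mean": round(statistics.mean(present), 2),
    }
    # Flag outliers with a median absolute deviation (MAD) test, which is
    # more robust to a single extreme value than a standard deviation test.
    med = statistics.median(present)
    mad = statistics.median(abs(v - med) for v in present)
    profile["outliers"] = [v for v in present if mad and abs(v - med) / mad > 3.5]
    return profile


print(profile_column([10, 12, 11, None, 13, 500]))
```

Running this against the sample column immediately flags the null and the suspicious 500 value, which is exactly the kind of instant feedback a profiling tool gives you without writing SQL.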

A lack of data profiling leads to poor data quality, which in turn leads to inaccurate information and poor business performance. I believe you will find this tool a worthy addition to your data toolbox.

Download the Free Data Profiler:

If you’re interested in Business Intelligence & Tableau subscribe and check out all my videos either here on this site or on my Youtube channel.

Anthony B. Smoak

General Motors’ Information Technology: IT’s Complicated

General Motors and its relationship with technology has been one of innovation followed by periods of stagnation. Its technology staffing strategy of choice has been acquisition, followed by pure outsourcing, until it settled on its current insourcing approach. New entrants like Tesla and Uber have a profound effect on a rapidly evolving automotive industry. GM as an industry incumbent must embrace new trends regarding autonomous vehicles and all the requisite software and technology to remain viable. The company currently believes that an insourced IT staff can help it develop competitive advantages.

The EDS Acquisition

General Motors has a long history of employing Electronic Data Systems Corporation (EDS) to service its information technology needs. The $2.5 billion acquisition of EDS in June of 1984 from billionaire Ross Perot was a move to help impose structure upon GM’s unorganized maze of data-processing systems and IT infrastructure. From the start, there were culture clashes between the two organizations, although EDS saw significant revenue increases after the acquisition. The management styles of brash, outspoken EDS founder Ross Perot and the bureaucratic GM CEO Roger Smith were incompatible.

“Problems surfaced within a year when the differences in management style between Perot and Smith became evident. The August 1984 issue of Ward’s Auto World suggested ‘Mr. Perot is a self-made man and iconoclast used to calling his own shots … Roger B. Smith [is] a product of the GM consensus-by-committee school of management, never an entrepreneur.’” [1]

Additionally, six thousand GM employees were transferred from GM to EDS at lower pay [2], which served to stoke the fires of the culture clash.

From 1984 until it was eventually spun off in 1996, EDS was a wholly owned subsidiary of GM. Although there was an ownership separation, the two behemoths remained tightly coupled in regard to technology staffing. Divesting EDS was a strategic decision by GM to focus on its core competency of vehicle manufacturing; EDS in turn gained the freedom to win additional technology contracting work from other organizations.


HP Enterprise Services (formerly EDS, Electronic Data Systems) corporate headquarters in Plano, Texas (Wikipedia)

Post EDS Spin-Off

Post spin-off, General Motors continued to contract with EDS for technology services, as GM still accounted for a third of EDS’s revenues at the time. Perceived as Texas “outsiders” by the Detroit incumbents, EDS found it difficult to deal with the fragmented nature of GM’s systems across various business units and divisions. While EDS had the requisite technical expertise, it did not always have enough internal influence to navigate GM’s intense political landscape. Obtaining consensus amongst business units in regard to technology decisions was a challenging endeavor. In an attempt to address these issues, incoming GM CIO Ralph Szygenda spearheaded the creation of an internal matrixed organization called Information Systems & Services (IS&S).

IS&S was created as a matrix organization consisting of internal GM technology executives and various other technologists (e.g. business and systems analysts). The new organizational structure consisted of a dual reporting relationship; IS&S members simultaneously reported to the CIO organization and to their local business unit leadership.

Generally, matrix organizations are instituted in order to promote integration. The advantage of the matrix organization is that it allows members to focus on local initiatives in their assigned business unit and it enables an information flow from the local units to the central IT organization. General Motors is a famously siloed global organization. With the creation of IS&S, members could now promote information sharing between different functions within GM and address the cross-organizational problems that had challenged EDS.

The matrix structure is not without weaknesses. To quote a famous book, “No man can serve two masters.” Employees in a matrix organization often deal with additional frustrations as they attempt to reconcile their allegiances and marching orders from conflicting authorities.  “Matrix organizations often make it difficult for managers to achieve their business strategies because they flood managers with more information than they can process” [3]. From my own personal experiences of working with IS&S while employed at GM subsidiary Saturn, I observed that members were inundated with meetings as they tried to stay up to date with the plans and initiatives of the central IT organization while trying to remain focused on their internal business units.

A Return to EDS Insourcing

From the creation of IS&S in 1996 until 2012, GM relied upon a variety of outsourced contractors and vendors, such as Capgemini, IBM, HP and Wipro, to deliver information technology services. In 2010 GM renewed an existing technology outsourcing contract with the old EDS (now HP) for $2 billion.

The general wisdom in regard to outsourcing is that companies will seek to focus on those core activities that are essential to maintain a competitive advantage in their industry. By focusing on core competencies, companies can potentially reduce their cost structure, enhance product or service differentiation and focus on building competitive advantages.

In a reversal of its longstanding IT sourcing strategy, GM made headlines in 2012 with the decision to insource and hire thousands of technologists to supplement its bare bones IT staff. New GM CIO Randy Mott reasoned that an internal technical staff would be more successful working with business units and would deliver technology needs at a cheaper cost than outside providers. These savings could then be used to drive IT innovation and fund the capabilities needed to compete in a rapidly evolving automotive industry.

“By the end of this year (2012) GM will employ about 11,500 IT pros, compared with 1,400 when Mott started at the company four years ago, flipping its internal/external IT worker ratio from about 10/90 to about 90/10, an astounding reversal” [4].

GM decided to hire over 3,000 workers from HP who were already working at GM as part of its Global Information Technology Organization. The move could be considered an act of “getting the band back together,” as HP had purchased EDS in 2008 for $13.9 billion. Randy Mott was the CIO of HP before assuming the same position at GM; it is plausible that this fact factored into GM’s insourcing decision calculus.

It should be noted that insourcing IT personnel is not without risks. Insourcing requires a company to compete for technical resources which can be difficult in cutting edge technology areas. Furthermore, the complexities of running IT in house “requires management attention and resources that might better serve the company if focused on other value-added activities” [3].

GM’s Information Technology Transitions from Commodity to Innovation

The automotive industry is embarking upon significant changes as it deals with innovations and disruptions from the likes of Uber and Tesla. To illustrate this point, Tesla (founded in 2003) had a higher market capitalization than both GM and Ford for a period of three months in 2017. Auto industry incumbents like GM are focusing on automating and streamlining commoditized processes as well as applying IT to more innovative value-added functions (e.g. computerized crash testing, simulations to shorten vehicle development times and data analysis for profit optimization).

In its early years, GM had been widely perceived as an innovator before making a series of missteps that harmed this reputation. GM fell behind on hybrid engine development after taking an early technology lead in the electric vehicle space; the company defunded its lauded EV1 offering to appease the bean counters. The company also starved innovative upstart Saturn of the funds necessary to introduce new models for a period of five years.


2000-2002 Saturn SL2 (Wikipedia) The innovative Saturn subsidiary was starved of funds.

“G.M.’s biggest failing, reflected in a clear pattern over recent decades, has been its inability to strike a balance between those inside the company who pushed for innovation ahead of the curve, and the finance executives who worried more about returns on investment” [6].

After a government bailout in 2009, the company promised to emerge leaner and commit itself to technology leadership. Automakers are now focusing on software development as a source of competitive advantage. As a result, GM has opened four information technology innovation centers in Michigan, Texas, Georgia and Arizona. These locations were chosen in order to be close to recent college graduates from leading computer science programs.

GM Opens Fourth IT Innovation Center in Chandler, Arizona

One of GM’s 4 new Information Technology Innovation Centers 

Additionally, GM purchased Cruise Automation, which is developing autonomous driving software and hardware; it is even testing a ride-sharing app for autonomous vehicles. The purchase will bolster GM’s technology staff and efforts in an emerging space.

“Harvard Business School professor Alan MacCormack, an expert in product development management within the software sector, says that outsourcing even routine software development can carry risks for companies that are seeking innovation in that area. He notes that today’s vehicles have more software and computing power than the original Apollo mission. ‘Everybody can make a decent enough powertrain. But what differentiates you is what you can do with your software,’ he says of car makers generally. ‘Companies have to be careful that they don’t outsource the crown jewels’” [6].

The company also developed an internal private cloud, nicknamed Galileo, to improve its business and IT operations, and consolidated twenty-three outsourced data centers into two insourced facilities [7].

With its new cadre of insourced technologists, GM will need to find a way to bridge the ever-persistent culture gaps between innovative technologists, bureaucratic management and the Excel zealots in finance.

“IT is core, I think, to GM’s revival, and I think it will be core to their success in the future,” – Former GM CEO Dan Akerson [7]



[2] Nauss, D. (May 20, 1994). Pain and Gain for GM: EDS Spinoff Would Close Stormy, Profitable Chapter. Los Angeles Times. Retrieved from

[3] Pearlson, K. E., Saunders, C. S., & Galletta, D. F. (2016). Managing and Using Information Systems: A Strategic Approach (6th ed.). Wiley.

[4] Preston, R. (April 14, 2016). General Motors’ IT Transformation: Building Downturn-Resistant Profitability. Forbes BrandVoice. Retrieved from

[5] Boudette, N. (July 6, 2017). Tesla Loses No. 1 Spot in Market Value Among U.S. Automakers. The New York Times. Retrieved from

[6] Leber, J. (November 5, 2012). With Computerized Cars Ahead, GM Puts IT Outsourcing in the Rearview Mirror. MIT Technology Review. Retrieved from

[7] Wayland, M. (September 18, 2017). GM Plots Next Phase of IT Overhaul: Unlocking the Potential of a Vast Data Empire. Automotive News. Retrieved from

Featured Image Copyright: akodisinghe / 123RF Stock Photo