B.I. Basics Part 4: Learn the QlikView ApplyMap Function

There will come a time in your QlikView load scripting endeavors when you will need to map a single key value against a lookup table and return the lookup value. If you’ve ever wanted a QlikView function that is somewhat analogous to a CASE statement for simple lookups/transformations, then look no further than the ApplyMap function.

My video breaks down the hard-to-interpret user manual definition and provides a simple example that will have you performing QlikView lookups in no time.
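If you want a feel for the mechanics before watching, here is a hypothetical Python analogue of what ApplyMap does (the country map, field names and "Unknown" default are invented for illustration; in QlikView itself the call shape is `ApplyMap('MapName', expression, default_value)` against a table loaded with the `MAPPING LOAD` prefix):

```python
# A lookup table, analogous to a QlikView MAPPING LOAD table
country_map = {
    "US": "United States",
    "DE": "Germany",
    "JP": "Japan",
}

def apply_map(mapping, key, default="Unknown"):
    """Return the mapped value for key, or the default when the key is absent."""
    return mapping.get(key, default)

# Applying the map while "loading" a fact table, row by row
orders = [("1001", "US"), ("1002", "DE"), ("1003", "BR")]
resolved = [(order_id, apply_map(country_map, code)) for order_id, code in orders]
# "BR" has no entry in the map, so it falls back to the default "Unknown"
```

The third argument is the key feature: like the ELSE branch of a CASE statement, it controls what comes back when the key is missing, instead of leaving you with a null.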

Michael Porter’s Generic Cost Leadership Strategy Explained


Back when I was a heads-down developer analyst working at General Motors, my mindset was completely focused on being a data expert and techie. At the time I did not have a broader understanding of business concepts and business strategies. Thus, I considered myself a “one-dimensional” resource (a very competent one-dimensional resource, but one-dimensional nonetheless). I set out to remedy my blind spots by acquiring business knowledge so I would have an understanding of broader concepts, become less myopic, and position myself favorably in the marketplace against other one-dimensional techies (like myself at that time).

It was in business school that I first learned of American academic Dr. Michael E. Porter of Harvard Business School fame. Dr. Porter is regarded as the preeminent thought leader in the area of business strategy and competitiveness.

Generic Competitive Strategies:

I found value in studying and discussing Porter’s framework that defined generic competitive strategies. A generic competitive strategy is a business level strategy that companies adopt in order to obtain a competitive advantage. The strategies are termed generic because they can be pursued by any and every company across a range of industries. The three primary strategies employed in the framework are:

Cost Leadership (low cost structure, e.g. Wal-Mart, Dell, Southwest Airlines)
Differentiation (offering unique product and services for a premium, e.g. Apple, BMW, Starbucks)
Focus (limiting scope to narrow market segments, e.g. local restaurant or local service provider)


These three strategies help contextualize how businesses aim to obtain profits in their respective marketplaces; they also help businesses understand how they can seek new opportunities for advantage. Porter originally emphasized that a company should target only one of the strategies in the framework or risk paying a “straddling penalty” (à la the doomed airline offshoot Continental Lite). Porter later softened this stance, recognizing the benefits of a hybrid approach in some cases.

Cost Leadership Strategy:

This post focuses on cost leadership because it’s the strategy that relates tangentially to IT and the concept of globalization. As IT workers are aware, the forces of globalization have no mercy in their enablement of companies to offshore work in an attempt to lower costs. This outsourcing of IT work to offshore firms is happening at organizations such as Disney, the University of California and [Insert Any Bank Name Here]. Obviously not all technology related offshoring is done in order to focus on a cost leadership strategy but the activity’s initial intent is to lower a firm’s IT cost structure (refer to my post on how IT has to do a better job communicating its value).

Returning to the main point, the cost leadership strategy is employed when a company aims to be the lowest cost producer in the market. Strategic managers in the organization make a concerted effort to lower business costs in order to achieve a competitive advantage. A lower cost structure enables a business to reap higher than average profitability.

Businesses attempting to implement this strategy may aim to increase inventory turnover, lower their wage expenses and/or manufacturing costs, gain bargaining power over suppliers, develop distinctive competencies in logistics, develop low cost distribution channels, or any combination thereof. As an aside, I could prattle on ad nauseam about Wal-Mart and how its technological capabilities provided the organization significant advantages (and I have here: Part 1, Part 2 and Part 3).


When the cost leader and another company decide to compete in the same price range for the same customers, the cost leader will have the inherent advantage because it will reap higher profits due to its lower cost structure. The cost leader will be able to weather the “price war” due to its lower cost structure advantage.
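The arithmetic behind that point is worth making concrete. The sketch below uses invented unit economics (none of these numbers come from any real company) to show why the lower-cost firm wins a price war:

```python
# Toy unit economics (numbers invented) illustrating the price-war point:
# at the same price, the lower-cost firm keeps a wider margin, and it can
# survive price cuts that push a higher-cost rival below breakeven.
price = 10.00
leader_unit_cost = 6.00   # hypothetical cost leader
rival_unit_cost = 8.50    # hypothetical higher-cost rival

leader_margin = price - leader_unit_cost  # 4.00 profit per unit
rival_margin = price - rival_unit_cost    # 1.50 profit per unit

# If a price war drives the market price down to 8.00, the leader still
# earns 2.00 per unit while the rival now loses 0.50 per unit.
war_price = 8.00
leader_war_margin = war_price - leader_unit_cost
rival_war_margin = war_price - rival_unit_cost
```

The leader can hold the lower price indefinitely; the rival must either exit the price range or bleed cash on every sale.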

“The cost leader chooses a low to moderate level of product differentiation relative to its competitors. Differentiation is expensive; the more a company expends resources to make its products distinct, the more its costs rise. The cost leader aims for a level of differentiation obtainable at a low cost. Wal-Mart, for example does not spend hundreds of millions of dollars on store design to create an attractive shopping experience as chains like Macy’s, Dillard’s, or Saks Fifth Avenue have done.” [1]

The cost leader also positions its products to appeal to the “average customer”. The aim is to provide the least number of products desired by the highest number of customers. Although customers may not find exactly what they are seeking, they are attracted to the lower prices [1].


Since the strategy involves competing on the lowest prices, companies must strive for a large market share when employing it. The cost leadership strategy has been linked to lower customer brand loyalty, which in turn means that customers can be swayed by lower-priced substitutes from other competitors.

Additionally, as technological change enters the marketplace, new competitors can attack cost leaders through innovation, thus nullifying the cost leader’s accumulated advantages. For example, Amazon has accumulated substantial knowledge and proficiency in the e-tail space and has placed Wal-Mart on the defensive in this arena, as Wal-Mart’s expertise is tailored to its brick-and-mortar assets.

Or, as foretold in Porter-speak back in his 1996 HBR article “What Is Strategy?”,

“A company may have to change its strategy if there are major structural changes in its industry. In fact, new strategic positions often arise because of industry changes and new entrants unencumbered by history often can exploit them more easily.” [2]


[1] Hill, Charles. W. L., & Jones, Gareth. R. (2007). Strategic Management Theory. Houghton Mifflin Company

[2] Porter, M. E. “What Is Strategy?” Harvard Business Review 74, no. 6 (November–December 1996): 61–78.

Header Image Copyright: olivier26 / 123RF Stock Photo

Diagram Image By Denis Fadeev – Own work, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=32434551

More Than You Want to Know About State Street Bank’s Technology Strategy Part 1


In November of 2010, investment management firm State Street Bank publicly announced an overall transformation of its technology infrastructure. State Street is a massive provider of transaction services to both mutual and pension fund managers. The custodian bank holds $23 trillion in investor accounts in 29 countries around the world. In an organization with such a vast store of data (most of it subject to regulatory oversight), enterprise-wide data conformity and accessibility is a challenge.

In a case of business strategy/need influencing information technology strategy, State Street’s COO in 2009 (Jay Hooley) wanted to help an institutional client calculate its exposure to a particular market. The information request was highly urgent given the financial consequences of reacting late during the Great Recession. “Getting the numbers turned into a painful exercise as State Street’s middle- and front-office staffers reconciled disparate data sets housed in different client systems and in nine of State Street’s 29 global locations” (Fest, 2013). This failure to deliver for the client instantaneously spurred Hooley to call upon his information technology leadership to present a strategy addressing this business need. The result of the IT leadership planning effort was the idea to move the bank’s diverse legacy infrastructure to a more standardized and nimble cloud computing architecture.

State Street: Strategy for Technology Infrastructure:

Former State Street CIO Chris Perretta, who as of 2016 holds a similar position at MUFG Americas Holdings Corporation, spent a considerable amount of time evangelizing the benefits of cloud computing to both the business and the bank’s board members. In its early stages, the technology vision resulting from the COO’s planning request was to position an updated infrastructure as a competitive advantage for the business in terms of cost savings, automation, and future development efficiency. Furthermore, the updated infrastructure could be an enabler of new product revenue streams. It should be noted that a shift to the cloud for a financial organization the size of State Street was unusual. “Too Big to Fail” sized banks are not typically known for their innovative technology development; derisively, the bank has been known as “Staid Street” for its conservative manner. Within financial services, innovation is usually the domain of smaller, nimbler “fintech” startups looking for scalability and speed to market.

From an infrastructure perspective, State Street embarked upon migrating from disparate legacy data centers running proprietary Unix servers to a standardized cloud architecture based upon commoditized x86 servers running Linux. The initial cloud service was built from a Massachusetts-based disaster recovery center, and the bank currently has six major data centers in the U.S., Europe and Asia along with three backup facilities (Brodkin, 2011). In addition to the rollout of virtualization capabilities and distributed database functionality, Perretta states, “New tools for provisioning, change control, load balancing, a common security framework and various types of instrumentation to enable multi-tenant infrastructures are all part of the mix” (Brodkin, 2011).

State Street has traditionally relied upon a “build rather than buy” approach, developing customized software to meet its needs (development historically accounting for ~20%-25% of the annual IT budget) (CMP TechWeb, 2012). The standardized cloud platform now enables developers to reuse code for future development, which can shorten project timeframes.

State Street: Strategy for IT Capability & Staffing:

State Street’s IT organizational structure can be characterized as federalist. With the federalist approach, the organization gains the benefit of having centralized leadership and vision at the “top of the house”, yet allows decentralized, co-located IT groups to remain responsive to their respective divisions. As described by former CIO Perretta, “We line up delivery capacity with each unit, and each CIO is responsible for delivering business services to that unit” (MacSweeney, 2009). For example, the CIO has a direct report on the ground in China, where the company operates a subsidiary (State Street Technology Zhejiang Co). Furthermore, the bank is tolerant of “skunk works”-style projects that organically develop in different IT divisions throughout the enterprise (MacSweeney, 2009).

On the centralized side of the federalist equation, the bank operates a shared services group that is responsible for technical necessities distributed throughout the enterprise (i.e. security, information and communications). This federalized approach makes sense for a sprawling organization that is comprised of disparate business operations across its custodian bank, investment management, investment research and global divisions.

With the introduction of the cloud infrastructure at State Street, the technology staffing vision is to acquire individuals with architectural knowledge who can think “big picture” yet are able to wallow in the details as necessary. The bank employs a chief architect whose aim is to drive technology innovation that leads to strategies that will impact the business in an advantageous manner. Perretta states, “We don’t use him to manage projects; we use him to come up with the ideas that make sense for our business community. Now he does those pilots, and then we industrialize them for the rest of the organization” (Tucci, 2011).

To be continued in Part 2 and Part 3 where I address additional areas such as:

  • Strategy for Information Risk & Security
  • Strategy for Stakeholder Requirements, Testing & Training/Support
  • Project ROI and Key Success Measures
  • Strategy for Data Acquisition and Impact on Business Processes
  • Strategy for Social Media/Web Presence
  • Strategy for Organizational Change Management, Project Strategy and Complexity


Brodkin, J. (April 14, 2011). State Street modernizing with cloud, Linux technologies; Virtualization, open source drive cloud project at State Street. Network World Fusion. Retrieved from Factiva 6/19/2016

CMP TechWeb. (June 25, 2012). State Street Private Cloud: $600 Million Savings Goal. Retrieved from Factiva 6/19/2016

Fest, G. (January 1, 2013). State Street’s Dig (Data); Championed by CEO Jay Hooley, Boston-based State Street is remapping a huge technology infrastructure to reap the benefits of the cloud and big data. American Banker Magazine. Vol. 123, No. 1. Retrieved from Factiva

MacSweeney, G. (August 1, 2009). Serious Innovation; CIO Christopher Perretta supports all of State Street’s IT needs by mixing new technologies and rapid development and even encouraging ‘skunk works’ experimentation when appropriate. Wall Street & Technology. Retrieved from Factiva

Tucci, L. (July 2011). In search of speed, State Street’s CIO builds a private cloud. Retrieved from http://searchcio.techtarget.com/podcast/In-search-of-speed-State-Streets-CIO-builds-a-private-cloud

Photo courtesy of DAVID L. RYAN/GLOBE STAFF

A Review of Syracuse University’s Executive Master of Science in Information Management Program (MSIM)


Syracuse University iSchool Commencement May 14, 2016

I wanted to share an assessment of the Syracuse University Executive Master of Science in Information Management (MSIM) program, as I recently graduated from the program in the summer of 2016. I will say that if you are considering a distance program, you must be self-disciplined and a self-starter. Typically there are no set class times; there are only set assignment due dates.

Just to give you a bit of background on my experiences: back in 2012 I was toiling full-time in the back office of an Atlanta-based bank in a data warehousing/requirements gathering capacity when I acquired (or I should say re-acquired) the master’s degree bug. I already held an MBA with a focus in IT management, but I wanted additional exposure to areas focused on current, “sexier” technology topics such as data science, R programming, visualization and information security.

In contrast, the IT management portion of my MBA studies concentrated on strategic planning, project management, justifying IT investments, business process analysis and on-site company practicums. Basically, IT management is the academic “CIO starter pack” for the so-inclined executive. I wanted to supplement the more strategic MBA IT focus I had already gained with a granular, data-focused learning opportunity. I was thinking ahead and trying to position myself to tackle more interesting information-related work in visualization and data analysis.

Fortunately, my employer at that time offered a tuition reimbursement program that would help offset some of the costs of a new graduate degree. Thus, at the start of 2012 I decided to take the plunge and started looking at reputable graduate degree programs in IS/MIS/CIS or Information Management.

Just because I was looking at online degree programs didn’t mean I was willing to compromise on academic quality or university reputation, thus I started researching STEM programs at reputable schools: Carnegie Mellon, Northwestern, Brandeis, Boston University and Syracuse University.

Based upon my tenure, here are some pros and cons of the Syracuse Executive MSIM program; I will be as impartial as possible in my assessment. Please note that my experience does not include an assessment of the full-time on-campus program.

From the Syracuse iSchool website:

The 30-credit MS in Information Management for Executives requires students to have six or more years of full-time professional experience in the information management field with a record of continually increasing job responsibilities. This program track is also available online, full time or part time, and can be completed in as little as 15 months.


Pros:

  • Syracuse University is a well-known, respected institution with a solid brand name;
  • According to U.S. News, the iSchool at Syracuse has the #1-ranked program in Information Systems among iSchools
    • Syracuse edged out other respected schools such as Michigan, UNC, Washington, Maryland and UT-Austin
  • All of the Big-4 professional services firms recruit from the full time program;
    • I took this as an indication of program quality
  • The degree is offered by the same school that houses the on campus program and not offered from a Professional Studies school
    • This held sway with me as I was looking for schools that did not herd distance students into a program with separate faculty and lower admissions standards
  • Any student in the Executive MSIM program can at any time enroll in an on-campus class
    • This was further proof of academic parity between all MSIM degree options as I enrolled in two on-campus classes during summer terms
    • The majority of the classes were taught by PhD level faculty
  • You can combine a graduate degree with a CAS (Certificate of Advanced Study) in multiple disciplines;
    • Data Science
    • e-Government
    • Information Security Management
    • Information Systems & Telecommunications Management
    • School Media
  • You can transfer up to 6 credit hours into the MSIM program from another graduate-level program;
  • There are Syracuse University alumni chapters in most major cities that present networking opportunities
  • If you’re a college basketball fan Syracuse gives you a new rooting interest in a quality program. College football fans however…
  • If you complete the program, you WILL learn new concepts that can be put to work immediately in your current position and make you more confident in your abilities


Cons:

  • Expensive!!!
    • The program was much cheaper than Carnegie Mellon and slightly cheaper than Northwestern’s Continuing Studies offering, but it was pricey nonetheless. Back in 2012 classes were roughly $3,800 apiece. By the time I finished in 2016 I was paying $4,400 per class (a whopping 16% increase)
    • The majority of classes I took were of high quality and helped fill in some knowledge gaps, but there were some classes where I felt the material was on par with a MOOC (which is not necessarily bad); at roughly $4,000 a class, though, students will have much higher standards
    • The high cost forced me to drag the program out over three and a half years
  • I would have liked to see greater connectivity among Executive MSIM students specifically. I listed the parity of academic quality as a plus, but I would have liked to know (for networking purposes) who else was in the executive program even if they weren’t in my particular class. Classes were comprised of executive online students, non-executive online students and full-time on-campus students who wanted an online option. This cross-blend of students enriched the overall class experience, but I still wanted a means of connecting with all students in the executive MSIM program
  • Your student advisor can be unresponsive
    • There were times when I received an email response from my advisor up to a week later, and times when my emails and phone calls were completely ignored. Slow responses, and especially non-responses, should not be a possibility when customers are paying the university roughly $4,000 a class
  • Work. School. Social Life. Pick 2. This problem is not unique to Syracuse graduate programs
    • In my first class I had to write a 25-page paper which meant taking a week of “vacation” to finish!

All in all, I was pleased with the program. I was able to combine the M.S. in Information Management degree with a Certificate in Data Science, which included exposure to tools like R, Tableau and QlikView.

I was able to travel to Syracuse during the summer on two different occasions and complete two classes which were both of high quality. If you enroll in the program you should absolutely attend a summer Maymester class to acquaint yourself with the city and the campus. On my first visit I stayed in Haven Hall and lived like a student. On my subsequent visit it was a week in the Marriott since I had hotel points to use from my consulting job.

I specifically enjoyed the capstone class I took on campus, IST-755 Strategic Management of Information Resources. The class involved lectures combined with readings, a mid-term test and an in-class group strategic presentation based on an assigned case problem. After the week of class, students needed to compose an individual strategic paper.

From a cost perspective, there are innovative graduate programs leveraging MOOC (i.e. Massive Open Online Course) components which are currently offered at prestigious public universities. The MOOC underpinnings of these offerings allow the degrees to be offered at a mere fraction of Syracuse’s cost.

Both Georgia Tech (full disclosure, my MBA was earned here) and the University of Illinois are offering M.S. programs in computer science and data science for roughly $7,000 and $19,000 respectively. I would love to see more reputable universities with quality STEM programs follow suit in this regard (looking at you NYU, Cal-Berkeley, Carnegie Mellon and Syracuse).

However, if you can take advantage of your employer’s tuition reimbursement plan to subsidize the cost, I recommend the Syracuse Executive Master of Science in Information Management program for its quality academics taught by PhDs and its flexible curriculum. The program will be especially useful for talented individuals mired in back-office banking looking to transition to consulting!!

Photo Copyright : maglara

B.I. Basics Part 3: Create a Gantt Chart in Excel

If you’ve ever had to put together a quick timeline to share with someone without the need to resort to full blown Microsoft Project then you will find this video helpful. I will show you how to create a very simple but effective Gantt chart that will satisfy your inner project manager. Definitely keep this tip in your Excel toolkit.

B.I. Basics Part 2: Sorting “Correctly” in Tableau

For those of you that are familiar with Tableau, you know that sorting can be an exercise in frustration and futility. Fortunately when you understand how Tableau intends its sort functionality to work, you’ll discover that there is a method to the madness. My video presents a simple solution that will alleviate your sorting frustration and should find a place in your Tableau toolbox.

Enterprise Risk Management at Microsoft

This is a brief writeup from an Enterprise Risk Management class that I took back in 2013. The case describes Microsoft from the mid to late 90’s and its efforts to implement an Enterprise Risk Management group. The case mentions former head of treasury Brent Callinicos, who went on to become a regional CFO at Microsoft and the CFO for Uber.

For those who are interested in the case details, check out “Making Enterprise Risk Management Pay Off: How Leading Companies Implement Risk Management” by Thomas L. Barton, William G. Shenkir & Paul L. Walker.


Historically, the technology sector has always been subject to swift change. Microsoft has always tried to anticipate new threats and technology advances (i.e. dealing with both existing risks and unanticipated risks). Back in the late 1990s, technological changes due to the rise of the internet presented Microsoft with a landscape very different from the historical era of the unconnected, standalone PC. In Microsoft’s 1999 annual report, the first item discussed under “issues and uncertainties” is “rapid technological change and competition” [1].

Additionally, as Microsoft launched new products in the mid-1990s, it also ventured into new business models. The launch of Expedia in 1996 positioned Microsoft as a player in the travel agency business, and its Home Advisor product made the company a licensed mortgage broker. These novel business models exposed the organization to a new set of risks, which in turn exposed the risk management group to new challenges.

Moving to an Enterprise-wide Risk Management Approach

Microsoft has always operated in a fiercely competitive landscape replete with technologically savvy rivals and shrinking product life cycles. As a result, an enterprise-wide commitment to risk management was a necessary and prudent choice to remain competitive in the company’s markets.

The event that triggered a more enterprise-wide view of risks at the company was the establishment of the risk management group in 1997; prior to that, no group existed to start the process of implementing an ERM framework. Within the treasury group, risk management head Brent Callinicos (also notably the eventual CFO of Uber) set out to develop a consolidated approach to risk identification, measurement and management.

The treasury group started with financial risk management changes by increasing the sophistication and effectiveness of its value-at-risk (VaR) analyses. Furthermore, treasury presented a paper to the finance committee of the board of directors that analyzed the derivative losses of several major companies. This report precipitated a more integrated approach to the various financial risks handled within treasury. The creation of Gibraltar (a treasury information system) allowed the company to view all of its risks “holistically rather than on a silo basis” [1].
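For readers unfamiliar with the term, VaR estimates the loss a portfolio should not exceed at a given confidence level over a given horizon. Below is a minimal historical-simulation sketch in Python; the return series, confidence level and portfolio value are invented for illustration and say nothing about Microsoft's actual methodology:

```python
# Hypothetical daily portfolio returns (invented illustration data)
returns = [-0.031, 0.012, -0.008, 0.021, -0.015, 0.004,
           -0.022, 0.009, -0.001, 0.017, -0.027, 0.006]

def historical_var(returns, confidence=0.95):
    """Loss threshold exceeded in only (1 - confidence) of historical periods."""
    ordered = sorted(returns)                 # worst returns first
    index = int((1 - confidence) * len(ordered))
    return -ordered[index]                    # report the loss as a positive number

portfolio_value = 1_000_000
var_95 = historical_var(returns) * portfolio_value
# Reads as: "with 95% confidence, the one-day loss should not exceed var_95"
```

Real treasury VaR models are far more elaborate (parametric and Monte Carlo variants, correlated risk factors, longer horizons), which is precisely the kind of sophistication the case says the group was adding.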

From a business risk perspective, the risk management group worked closely with business unit managers in order to develop risk-financing plans and to aid business units with appropriate quantitative risk modeling. This evangelist approach was an effective method for gaining buy-in regarding the risk management group’s aims.

Microsoft’s Enterprise Risk Management Structure

Microsoft’s risk management group is nestled within the treasury function of the organization. The leader of the risk management group is the corporate treasurer who reports directly to the CFO. Treasury manages somewhere in the neighborhood of $80 billion for the software company [2]. Business risk is divided into worldwide products, worldwide sales & support and worldwide operations. The company does not have a CRO as it decided that a CRO would not be practical.

In my opinion, Microsoft housed its risk management under the treasury function because it viewed a standalone risk organization under a CRO as duplicative. Treasurers concentrate primarily on managing financial risks but by nature must also be generalists with respect to many types of risk. In a multinational technology company such as Microsoft, currency risks across its various markets require appropriate anticipation and response.

Microsoft is inherently technologically driven. The company has very smart, knowledgeable people naturally embedded in its lines of business. These people understand the risks of their technological products and desire to see those products succeed. To the benefit of the organization, the embedded personnel have an inherently risk-minded mentality. Therefore, the job of the risk management group is to partner with and support the lines of business and various operations groups by adding “incremental value”; i.e. information that the business units may not have considered.

“Microsoft is first run by the product group, then maybe by sales, and finance and risk management will come after that. The risk management group or treasury group will not run the company”[1].

Microsoft previously looked at risk in separate silos. In order to look at risk holistically, the risk management group had to step back and take a strategic assessment, which is a much more challenging endeavor. With this holistic approach, the grouping or correlation of risks is considered, as opposed to dealing with one specific risk at a time. For example, Microsoft had long considered property insurance the best way to manage the risk of building damage in an earthquake. Under the new scenario analysis approach employed by the risk management group, additional risks that are correlated with property damage must also be considered. This correlation mentality required partnering with multiple areas of the company to incorporate additional risks into an appropriate risk assessment.

Use of Scenario Analysis

Scenario analysis is used to understand risks in situations where it is very hard to quantify or measure the precise impact to the organization. Sequences of events such as severe earthquake damage or severe shocks to the stock market are difficult to measure quantitatively, so scenario-based tactics are applicable for gauging the fallout. Additionally, Microsoft uses scenario analysis to conduct stress testing, which considers hard-to-measure impacts of political and geographic circumstances. An order-of-magnitude approach is used in scenario analysis as opposed to an exact measurement approach. Microsoft uses qualitative language such as “…the quantification of business risks is not exact…” and “Does this feel about right for this risk?” in its scenario analyses.

Once the risk management group has identified the risks associated with each scenario, it then partners with other business units to understand impacts. The risk group will also investigate other external organizations that have experienced similar events in order to learn how these organizations weathered their experiences.

The Main Benefits of Enterprise Risk Management

One substantial benefit for Microsoft in moving to an ERM approach is that the company can view and assess its risks holistically as opposed to assessing risks in an independent/uncorrelated fashion. This is evident in initiatives such as the company’s Gibraltar treasury system which provides an aggregated view of market risks.

Another benefit is that the risk management group works across the organizational footprint and can provide input to various groups so that each group can “stay current on what is happening in the business” [1]. The risk management group can diffuse information across the organization by working closely with business unit managers. Face time with product and operations managers allows the risk group to understand risks across the enterprise which contributes to a holistic understanding.

This approach is mutually beneficial for both groups as the risk group gains understanding of new risks (continuous cataloging of risks) and the business units gain insight into risks they may not have previously considered.

“By having the business units educate us on the intricate details of their business, the risk management group can be aware of perhaps 90 percent of the risks facing Microsoft”[1].

Closing Thoughts

At Microsoft, the risk management group doesn’t necessarily have to possess the all-encompassing best risk solution for every line of business. Risk management considers the product managers and the respective lines of business the most knowledgeable sources of risk within their own domains. The risk group is on hand to provide additional insight for incremental improvement and to enhance or build upon the risk knowledge already contained within the lines of business.

This approach makes sense for a technology company that is teeming with risk-aware and knowledgeable personnel at the operational levels who are designing or working with complex products.

In my work experience at a traditional bank, the risk group was assumed to have the best procedures, templates and analyses with respect to handling credit, market and operating risks. From Microsoft I have learned that highly efficient and capable risk management can also be a synthesis of understandings from risk management proper and the lines of business.


[1] Barton, T., Shenkir, W., & Walker, P. (2002). Making Enterprise Risk Management Pay Off.

[2] Groenfeldt, T. (November 2013). Microsoft Treasury Wins Best Risk Management Award. Forbes. Retrieved from http://www.forbes.com/sites/tomgroenfeldt/2013/11/19/microsoft-treasury-wins-best-risk-management-award/#4fcade2124ed

Copyright: mrincredible / 123RF Stock Photo

B.I. Basics: Create an SSIS Data Profiling Task In SQL Server

Data profiling is necessary when trying to gain an understanding of a given data set. A data profiling assessment should take place before any reporting or application development work begins. My video will demonstrate how to create a basic SSIS Data Profiling Task using SQL Server Data Tools.

According to the DAMA Guide to the Data Management Body of Knowledge:

“Before making any improvements to data, one must be able to distinguish between good and bad data…. A data analyst may not necessarily be able to pinpoint all instances of flawed data. However, the ability to document situations where data values look like they do not belong provides a means to communicate these instances with subject matter experts, whose business knowledge can confirm the existences of data problems.”

Here is additional information direct from Bill Gates’s former startup outfit regarding the types of data profiling tasks available in SSIS: https://msdn.microsoft.com/en-us/library/bb895263.aspx
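Conceptually, the core SSIS profiles (Column Null Ratio, Column Statistics, Column Value Distribution) boil down to simple aggregate checks. As a rough illustration outside of SSIS, the sketch below computes comparable statistics in plain Python over a hypothetical in-memory orders table; the data and function names are mine, not part of the SSIS API:

```python
from collections import Counter

def profile_column(rows, col):
    """Basic profile stats, loosely analogous to the SSIS Column Null
    Ratio, Column Statistics and Column Value Distribution profiles."""
    values = [r.get(col) for r in rows]
    non_null = [v for v in values if v is not None]
    profile = {
        "null_ratio": (1 - len(non_null) / len(values)) if values else 0.0,
        "distinct_count": len(set(non_null)),
        "top_values": Counter(non_null).most_common(3),
    }
    # Min/max only make sense for purely numeric columns
    if non_null and all(isinstance(v, (int, float)) for v in non_null):
        profile["min"], profile["max"] = min(non_null), max(non_null)
    return profile

# Hypothetical sample data for illustration only
orders = [
    {"state": "MI", "amount": 120.0},
    {"state": "MI", "amount": 80.0},
    {"state": None, "amount": 45.5},
    {"state": "OH", "amount": 45.5},
]
print(profile_column(orders, "state"))   # 25% nulls, 2 distinct values
print(profile_column(orders, "amount"))  # includes min/max
```

The same “document what looks wrong, then confirm with subject matter experts” workflow from the DAMA quote applies regardless of whether the numbers come from SSIS or a script like this.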

More Than You Want to Know About Wal-Mart’s Technology Strategy Part 3

This article is the final piece and a continuation of my earlier analyses (Part 1, Part 2) where I waded into Wal-Mart’s strategy for information risk & security, stakeholder requirements and project ROI. Whether you love or hate Wal-Mart, no one can deny that historically the organization has been highly innovative, effective and efficient. In this third and final installment of the series I will broach the company’s strategy for data acquisition, social media, and project execution.

Wal-Mart: Strategy for Data Acquisition and Impact on Business Processes:

Wal-Mart has always been at the forefront of how an organization acquires, handles and shares data with internal and external parties. The company’s information technology spans its 11,500 stores operating under 63 banners in 28 countries and e-commerce websites in 11 countries (Wal-Mart Stores, Inc, 2015). This diverse assortment of digital and traditional brick and mortar assets serves 260 million customers. The company’s sprawling POS system must remain highly available and robust enough to harvest daily sales data across its global footprint. In order to accommodate data from a customer base of millions of shoppers, the company must have the requisite back-end storage infrastructure. As of 2015, Wal-Mart is embarking on a plan to build a massive private cloud which is expected to make 40 petabytes of data available every day (Buvat et al., 2015). From a strategic standpoint the in-house cloud build-out makes sense, as employing Amazon Web Services would make the company reliant on a key competitor’s technology offering to house sensitive data.

The data acquired from its POS systems allows the company to better allocate its product mix in real time. For example, on “Black Friday”, which is typically the busiest shopping day of the year, the company’s buyers mine the day’s sales data as early as 6am on the East Coast in order to optimize the company’s product offerings for the day (Sullivan, 2004).

From a data acquisition perspective, Wal-Mart pioneered the use of bar code scanners. The usage of scanners and Universal Product Codes was not just to register prices and speed the cashier’s job of processing customers (as its early competitors had used the technology); Wal-Mart realized that bar codes enabled the company to determine where and how sales were made. In order to link POS, inventory and supply chain management data together with headquarters, the company invested in its own private satellite system in 1984 (Sherman, 2013). This investment impacted in-store business processes as analytics could be run against purchase data to determine customer market basket mix and product-to-product correlations. Being at the forefront of purchasing data analytics allowed the company to more effectively stock its stores with the products that customers demanded.

This information linkage enabled by Wal-Mart’s satellite investment also led to profound business practice impacts with respect to its supply chain management process. As mentioned earlier in this analysis, with the development of Retail Link, Wal-Mart led the industry by providing first P&G and then the rest of its supplier network with direct visibility to real time store-shelf data. This was a highly innovative move as suppliers and internal buyers could work together on “forecasting, planning, producing, and shipping products as needed” (Sherman, 2013). Enabled by its new technology system, sharing data with supplier partners allowed those suppliers to manage inventory at Wal-Mart stores with a more comprehensive understanding of product demand, while Wal-Mart benefited from lower inventory storage costs.

Wal-Mart: Strategy for Social Media/Web Presence:

Wal-Mart operates e-commerce web presences across 11 different countries. Although it is by far the biggest brick and mortar retailer in the world, it has struggled to compete with online “e-tail” competitors such as Amazon and Target. Updating its digital know-how and refreshing its digital properties will be a requirement for keeping pace with a shifting industry dynamic. To this end, Wal-Mart has embarked upon a number of strategies to keep itself relevant and enhance its digital capabilities. The company is leveraging its web properties to not only analyze purchasing behavior but also review customer search histories and customer social media interactions. The latter activities are aimed at boosting its online sales and predicting customer demand for its brick and mortar locations.

“Teams at WalmartLabs use visualization techniques to analyze social activity to capture insights that may indicate changes in product demand. Walmart can then use these insights to stock extra inventory at locations where it expects higher demand and reduce it from locations with lower demand” (Buvat et al., 2015).

Wal-Mart has also embarked upon a strategy of purchasing startup companies with the intention of adding to the retailer’s knowledge in the mobile, analytics and social media realms. The retailer purchased a company called Kosmix, whose founders had sold a digital company to Amazon in the late 1990s. Kosmix is a social media data aggregator which analyzes data from Twitter and other social networks with the aim of helping the company understand the relationship between customers and products. Shortly after the Kosmix acquisition, a small team at @WalmartLabs prototyped a new search capability code-named “Polaris”. Embedded within Walmart.com, Polaris helps determine customer intent. “As a result, if a user types in the word ‘denim’, it returns results on ‘jeans’ while ‘chlorine tablets’ returns results related to pool equipment” (Ribeiro, 2012). Within 9 months the prototype was production ready, replacing an Oracle-based product (Endeca) whose search functionality was simply keyword based. The company claims that it sees a 10% to 15% boost in shoppers completing a purchase using the Polaris search algorithm.
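Polaris itself is proprietary, so the following is only a toy illustration of the intent-resolution idea: a term-to-concept lookup with a keyword fallback. The two mappings come straight from the article’s examples; everything else (names, structure) is my own simplification of what a real semantic engine does:

```python
# Illustrative intent map; Polaris is proprietary and far richer than this.
INTENT_MAP = {
    "denim": "jeans",
    "chlorine tablets": "pool equipment",
}

def resolve_intent(query):
    """Return the mapped product concept for a query, falling back to
    the raw keyword (the behavior of a purely keyword-based engine)."""
    return INTENT_MAP.get(query.strip().lower(), query)

print(resolve_intent("denim"))            # prints "jeans"
print(resolve_intent("chlorine tablets")) # prints "pool equipment"
print(resolve_intent("socks"))            # no mapping: keyword passthrough
```

The fallback branch is the key contrast with the Endeca replacement described above: a keyword engine only ever takes the passthrough path.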

One recent splashy acquisition in the e-tail space involved Wal-Mart’s purchase of Jet.com for $3.3 billion in cash and stock. Wal-Mart realizes that customer preferences have shifted to the online retailer space while Wal-Mart has a substantial legacy footprint in brick and mortar locations. According to the Wall Street Journal, “The retailer gains access to a larger group of young, wealthy, urban shoppers through Jet” (Nassau, 2016). The company also gains access to a startup-minded employee talent base, startup executive experience and Jet.com’s proprietary pricing software and customer data.

Additional social media initiatives employed by Wal-Mart include adding a ratings review capability to its products on Walmart.com and a partnership with Facebook to offer individual pages for each of the retailer’s 3,500 stores. The company has also used the hashtag #lovedata to appeal to potential technology hires.

From an employee social engagement perspective, Wal-Mart developed an internal web site called mywalmart.com with the aim of building an employee social media community where employees can blog and answer questions related to the company. Roughly 75% of the company’s 1.4 million U.S.-based associates log on to the site (Tuttle, 2010). This effort required integrating multiple disparate websites running on different web platforms (for example payroll and benefits sites).

It should be noted that although the company is taking positive steps in the social media space, its initial attempts in the early 2000’s were clumsily executed. “Its Walmarting Across America blog in 2006, about two Wal-Mart enthusiasts traveling around the U.S. in an RV, was revealed to be less than authentic when it was learned that Wal-Mart paid for the flights, the RV and the gas of the two protagonists. ‘The Hub,’ a MySpace-like clone, closed in October 2006, just 10 weeks after it launched, while a Wal-Mart sponsored Facebook group reportedly had lackluster results” (Edelson & Karr, 2011).

Wal-Mart: Strategy for Organizational Change Management, Project Strategy and Complexity:

Best practice project management principles for handling complexity include implementing risk management practices and analyzing project risk/rewards. This series has already addressed Wal-Mart’s heavy reliance on ROI as a measure of project success. But Wal-Mart also has the advantage of running a common information systems platform for its global operations. This has allowed the company to offset the higher costs of developing in-house systems by building a system once and then rolling it out, along with the respective business processes, enterprise-wide.

The company also lowers IT project complexity by closely examining the current process that the system is supposed to improve. This activity has been described by former CIO Kevin Turner as “eliminate before we automate”.

“Eliminate steps, processes, reports, keystrokes; eliminate any activity that you possibly can for two reasons: One, you’ll end up building a whole lot better system that’s easier to support, and two, invariably you will have a better solution that’s more [user] friendly” (Lundberg, 2002).

Once the system is built, a piloting phase occurs amongst the stores, distribution centers and customers that will present the most challenges. If the challenges are caught and addressed by pilot builds in the most trying situations, then installations at less challenging locations should experience minimal interruption.

From an Enterprise Risk Management standpoint Wal-Mart uses a 5 step process that allows its ERM group to work with the business to mitigate many of the risks that the company faces. “The five-step ERM process involves: 1. risk identification, 2. risk mitigation, 3. action planning, 4. performance metrics, and 5. shareholder value” (Atkinson, 2003).

An additional project strategy is to funnel all IT projects through the central IT office with a single enterprise-wide portfolio overseen directly by the CIO. King (2014) asserts that objectively ranking projects using an automated spreadsheet tool helps to eliminate politically driven decisions. Resources are assigned to projects based upon criticality and available funds. When available employees and funds are exhausted for the quarter, remaining projects on the list do not make the cut. Technology resources are asked to remain flexible as they rotate to different jobs within the company to gain additional skills. The flexibility of IT resources is a boon to change management plans as resources can be swapped out with minimal interruption to the overall project plan.
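King (2014) does not publish the spreadsheet’s mechanics, but the rank-then-cut process described above can be sketched as a simple greedy selection. All project names, scores, costs and headcounts below are hypothetical:

```python
def select_projects(projects, budget, headcount):
    """Greedy portfolio cut: rank projects by criticality score, then
    fund them in order until money or people run out. Once a project
    no longer fits, everything below the line misses the cut (a sketch
    of the ranking process King (2014) describes, not the actual tool)."""
    funded = []
    for p in sorted(projects, key=lambda p: p["score"], reverse=True):
        if p["cost"] > budget or p["staff"] > headcount:
            break  # resources exhausted for the quarter
        funded.append(p["name"])
        budget -= p["cost"]
        headcount -= p["staff"]
    return funded

portfolio = [  # hypothetical projects; cost in $M, staff in people
    {"name": "POS upgrade", "score": 95, "cost": 4.0, "staff": 10},
    {"name": "HR portal",   "score": 60, "cost": 2.0, "staff": 5},
    {"name": "Store wifi",  "score": 80, "cost": 3.0, "staff": 6},
]
print(select_projects(portfolio, budget=7.0, headcount=16))
```

Sorting by a single objective score before allocating is what makes the cut line transparent rather than political: a project’s position on the list, not its sponsor, determines whether it is funded.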

In case you missed something, make sure to revisit Part 1 & Part 2 of the series.

If you’re interested in Business Intelligence & Tableau check out my videos here: Anthony B. Smoak


Atkinson, W. (December 1, 2003). Enterprise Risk Management at Wal-Mart. Risk Management. Retrieved from Factiva.

Buvat, J., Khadikar, A., & KVJ, S. (2015). Walmart: Where Digital Meets Physical. Capgemini Consulting. Retrieved from https://www.capgemini-consulting.com/walmart-where-digital-meets-physical

Edelson, S., & Karr, A. (April 19, 2011). Wal-Mart to Buy Kosmix. Retrieved from Factiva.

King, R. (October 2014). Wal-Mart Becomes Agile But Finds Some Limits. Dow Jones Institutional News. Retrieved from Factiva.

Lundberg, A. (July 1, 2002). Wal-Mart: IT Inside the World’s Biggest Company. CIO magazine. Retrieved from http://www.cio.com/article/2440726/it-organization/wal-mart--it-inside-the-world-s-biggest-company.html?page=2

Nassau, S. (2016). Wal-Mart to Acquire Jet.com for $3.3 Billion in Cash, Stock. Wall Street Journal. Retrieved from http://www.wsj.com/articles/wal-mart-to-acquire-jet-com-for-3-3-billion-in-cash-stock-1470659763

Ribeiro, J. (August 31, 2012). Walmart rolls out semantic search engine, sees business boost. Computerworld. Retrieved from http://www.computerworld.com/article/2491897/internet/walmart-rolls-out-semantic-search-engine--sees-business-boost.html

Sherman, R. J. (2013). Supply Chain Transformation: Practical Roadmap to Best Practice Results. [Books24x7 version] Available from http://common.books24x7.com.libezproxy2.syr.edu/toc.aspx?bookid=49746.

Sullivan, L. (September 24, 2004). Wal-Mart’s Way: Heavyweight retailer looks inward to stay innovative in business technology. Retrieved 6/17/16 from http://www.informationweek.com/wal-marts-way/d/d-id/1027448?

Tuttle, D. (February 17, 2010). Wal-Mart’s social media community earns accolades. Retrieved 6/17/16 from Factiva.

Wal-Mart Stores, Inc. (January 31, 2016). Form 10-K. Retrieved from https://www.sec.gov/Archives/edgar/data/104169/000010416915000011/wmtform10-kx13115.htm

Photo Copyright: dutourdumonde / 123RF Stock Photo

Anthony Smoak Final Project: Data Visualization and Communication with Tableau



I recently earned a verified course certificate from Coursera in the “Data Visualization and Communication with Tableau” class. This class is the third offered in the “Excel to MySQL: Analytic Techniques for Business” Coursera Specialization. I’m looking forward to taking a couple more MOOCs dealing with Tableau and visualization to supplement and reinforce existing knowledge. I would recommend the class to anyone looking to frame an analysis and learn a good bit about using Tableau.