Make Flashy Maps in Tableau with Mapbox

The default maps in Tableau are just fine but sometimes you need to kick up the flamboyancy factor in your visuals. Integrating maps from Mapbox with Tableau is the perfect way to add some Liberace flash to your development game.

Mapbox is an open source mapping platform for custom-designed maps. By creating an account with Mapbox, you can either design your own maps on the platform or use their preset maps, all of which are more impressive than the out-of-the-box option in Tableau.

All you need to do is enter your generated API token (provided by Mapbox) into Tableau’s Map Services interface and you’ll have access to some pretty impressive mapping options.

If you’re interested in Business Intelligence & Tableau, subscribe to my YouTube channel.

Starbucks, Digital and Analytics: A Perfect Blend

Starbucks is a differentiator, an early adopter in regards to technology and a savvy user of data analytics. Employing the “Starbucks Experience” differentiation strategy (e.g., customer service, ambiance, interior aesthetics, prime locations), the company is able to command above market prices for a commodity product.

Surprisingly, for an organization that was not born in the digital era, Starbucks embraces new technology like a forward-looking digitally native company. The company has demonstrated a willingness to take risks on its road to digital maturity which helped it acquire a unique value proposition in loyalty and customer satisfaction. From a company strategy perspective, the organization is aware that early adopters can acquire advantages.

“In 1998, it was one of the first companies to launch a website; in 2002, it began offering WiFi to its customers, helping to start the transition from quick coffee stop to all-day hangout; and a full decade ago, Starbucks was establishing its social media presence. Now, while others are setting up mobile payment terminals and struggling to start a loyalty program, Starbucks is seeing 11 percent of its sales from mobile order and pay, and 14.2 million Starbucks Rewards members accounting for 37 percent of U.S. company-operated sales.” [1]

As the importance and impact of data has come to prominence in today’s business environment, the company relies heavily upon analytics and digital technologies to enhance business performance. Starbucks processes approximately 90 million transactions per week [2], so the company must have the requisite culture and resources in place to wring optimal value from data and analytics.

Starbucks Technology Leadership

Any company’s drive for a culture of digital, data and analytics is greatly influenced by senior leadership. As such, Starbucks’ senior leadership increasingly reflects a Silicon Valley background and mindset. Former chairman and CEO Howard Schultz has been referred to as the “Steve Jobs” of coffee. Schultz’s handpicked successor Kevin Johnson is a veteran technology player, having previously served as CEO of Juniper Networks and as President of Microsoft’s Windows division.

Another high-level technology hire was that of Gerri Martin-Flickinger, who has held technology leadership roles at Adobe, VeriSign and McAfee Associates. Ms. Martin-Flickinger serves as the company’s first Chief Technology Officer, a title change from the previous CIO position held by Curt Garner. Incidentally, Garner left Starbucks to become the CIO of Chipotle, as his new suitor was impressed with Starbucks’ use of online ordering and mobile pay capabilities.

The ultimate technology name drop belongs to Satya Nadella, who serves on Starbucks’ board of directors (as of 2017) and is currently the CEO of Microsoft. Coincidentally, Starbucks is a prominent customer leveraging the Microsoft Azure cloud platform.

Mobile Order / Mobile Pay

One of Starbucks’ most laudable achievements has been its execution of mobile ordering and mobile pay via its iOS and Android applications. Customers who are short on time can pop open the Starbucks app on a smartphone and then place and pay for their order. Purchases lead to an accumulation of loyalty “stars” which can be redeemed for free products. This digital relationship has helped drive demand in physical stores and increase ticket spend per customer.

“They’re actually broadening the footprint of their stores with technology. If everything was a walk-in order, you’d only be able to sell what people could drive up and wait around to get. But by having mobile pay and drive thru, they can extend that store footprint out for miles.” [1]

This functionality has been so popular that it has caused operational challenges in the physical stores as customers have walked out after seeing large crowds waiting on mobile orders. The company has beefed up staffing and changed store layouts where applicable to accommodate the increased demand. Starbucks has also based loyalty program rewards on total spend as opposed to number of transactions. The latter approach allowed customers to game the system by sub-dividing orders into multiple transactions which caused bottlenecks.

Future enhancements to the application include a personalization engine that will allow stores to target customers for differentiated treatment (e.g., birthday rewards, discounts for previously purchased items). To increase customer satisfaction and operational efficiencies, geolocation can be used to track a customer’s presence near the store and allow the point of sale terminal to pre-assemble the customer’s typical order. The barista can then confirm the order once the customer arrives and submit it at the press of a button.

Additionally, voice ordering capabilities via Amazon’s Alexa platform will allow customers to order and pay for food and drinks as if they were speaking to a human barista. The company demonstrated this capability at an investor conference when it placed an order for a “Double upside down macchiato half decaf with room and a splash of cream in a grande cup.” [3]

Due to the company’s investment in this digital capability, Starbucks is in a position to capitalize on its homegrown mobile order and payment processing technology via licensing to other retailers. The decision would have to be weighed against the advantages of keeping its technology and processes proprietary.

Geolocation

Starbucks is a savvy user of data and analytics to help determine where to place its next retail locations. The organization has a real estate analytics team (amongst many others) that spearheads its site selection strategy.

Using an in-house mapping and business intelligence platform called Atlas, the organization can combine data from various internal and external sources to create models which help drive decision making.

“Through a system called Atlas, Starbucks links to as many external and internal APIs as possible, connecting the data with R to build cannibalization models that can determine impact to existing stores if a new store enters the area. This model drives decision-making in cities across the US and world.” [4]

As part of its growth strategy, Starbucks is planning to open 3,000 stores in mainland China through 2022. That is approximately 600 stores per year, or one every 15 hours [5]. Choosing the correct site locations is absolutely critical to store success, and the correct mix of data and analyses must be conducted to enable a successful rollout.

Starbucks weaves together various data points such as weather, auto traffic, consumer demographics, population density, income levels, coffee purchase patterns, current store locations and even levels of mobile phone ownership to construct its site selection strategy. The company analyzes all pertinent data points overlaid on a visualization map powered by a spatial data warehouse.

Weather and sales data individually can tell two different stories but blended together they can offer the company new insights. For example, if it’s forecasted to be hotter than average in a location, Starbucks can geo-design a localized promotion for cold beverages [6].

Store Labor Analytics

In a presentation given at Dartmouth’s Tuck School of Business, Leslie Hampel (VP, Store Operations) described the process of balancing the company’s top-down strategic store scheduling decisions with emergent strategies from store-level managers. If an improvement suggestion is offered by one store for use in all others, its impact has to be modeled on a statistically significant group of stores first.

Starbucks is very risk averse when it comes to labor-related changes. Considering that labor is about a third of Starbucks’ cost base, a decision that is wrong by 1% will be wrong by about 40 million dollars [7].

In addition to taking suggestions from stores, the company will use analytics to systematize best scheduling practices by examining the best operating 10% of stores. These localized processes are then tested on a statistically significant group of 100 stores. An internal analytics team will assess the impact of the changes to the company’s financials, as well as impacts to store sales and the customer experience. At the end of the 90 day testing period the new process will either be rolled out to multiple stores, tweaked and re-tested, or simply abandoned [7].

As an aside, it’s worth noting that the creation of Starbucks’ famous Frappuccino was the result of an emergent strategy from a store-level manager in California. Although Starbucks stores are only supposed to sell company-approved drinks, the manager sold them in her store against the company mandate. Despite corporate management’s initial reluctance to stock and sell the drink (until seeing the local sales data), Frappuccinos are now a billion-dollar business for Starbucks [8]. Always listen to your ground-level employees, as they are the closest to the customer.

Starbucks & Bitcoin

No post about digital technology would be complete without a nod to Bitcoin.

To burnish its tech bona fides, Starbucks has been in talks with Microsoft, Intercontinental Exchange (the parent of the New York Stock Exchange) and the Boston Consulting Group to develop a crypto-initiative. Bakkt (pronounced “backed”) will allow consumers to store and convert digital currencies to dollars that can be used for in-store purchases.

The idea has the backing of major corporations and could help lend some Fortune 500 legitimacy to the somewhat murky and volatile world of cryptocurrencies. Although cryptocurrencies have yet to reach mainstream appeal, Starbucks is showing its willingness to be an early adopter of a new payments solution as it did with its mobile ordering and payment initiatives.

References:

[1] 5 Ways Starbucks is Innovating the Customer Experience. https://www.qsrmagazine.com/consumer-trends/5-ways-starbucks-innovating-customer-experience

[2] Starbucks’ CTO brews personalized experiences  https://www.cio.com/article/3050920/analytics/starbucks-cto-brews-personalized-experiences.html

[3] Starbucks Adds Voice Ordering to iPhone, Amazon Alexa http://fortune.com/2017/01/30/starbucks-alexa-voice-ordering/

[4] Data Analytics in the Real World: Starbucks https://www.northeastern.edu/levelblog/2016/03/04/data-analytics-in-the-real-world-starbucks/

[5] China is getting nearly 3,000 new Starbucks https://money.cnn.com/2018/05/16/news/companies/starbucks-in-china-store-expansion/index.html

[6] Esri 2014 UC: Starbucks Coffee and IT. Coffee beans and business strategy. https://www.esri.com/videos/watch?videoid=3654&isLegacy=true

[7] Using Data to Create & Maintain the Starbucks Experience https://www.youtube.com/watch?v=sUkQwhMwOig @32:42

[8] Rothaermel, Frank T. 2015. Strategic Management, 2nd Edition. New York: McGraw-Hill Irwin.

Image Copyright : monticello on 123rf.com

The London Whale Trading Incident

Once again I am digging into my digital crates to share an informative post. Here is a small writeup from a graduate Enterprise Risk Management class (IST 625) I completed at Syracuse, concerning JP Morgan Chase and the “London Whale”. The post is slightly edited from the final version I submitted. The assignment was as follows:

“This assignment requires you to research an organization that has suffered a devastating loss from a so-called ‘low frequency or low probability but-high consequence’ event, to understand what happened to that company during and after that event, and to synthesize what we can learn from their experiences.”

In gambling parlance, a whale is a high roller who bets big and has the potential to cause the house substantial losses if he/she stops betting at the wrong time. Although the whale at the center of this episode wasn’t sitting in a casino, the house (JPMorgan) experienced substantial losses when the betting stopped.

JP Morgan Overview

JPMorgan Chase is the largest financial holding company in the United States, with more than 2 trillion dollars in assets. It is also a global financial services firm with more than 250,000 employees (United States Senate, pg. 24). The company has more than 5,600 branches and is a strong market participant in the mortgage lending, investment banking and credit card spaces (Hoover’s Company Records, 2014). JP Morgan’s principal bank subsidiary, JPMorgan Chase Bank, is also the largest bank in the United States.

The current Chairman and CEO of JP Morgan Chase is Mr. James “Jamie” Dimon. Prior to 2012, Mr. Dimon’s name was associated with the eponymous “Dimon principle”. “The ‘Dimon principle,’ as it is known, was shorthand for a safe bank with regular profits” (Ebrahimi, Aldrick & Wilson, 2012). With Mr. Dimon guiding the firm, JP Morgan held an excellent reputation with respect to risk management. The same could not be said of similar financial firms that were no longer “going concerns” as a result of the 2008 Great Recession.

“So when the US Government desperately sought someone with the balance sheet for the corporate rescue acts necessary to prevent financial meltdown, it sent for Dimon. In what were admittedly sweetened deals, JP Morgan swallowed bankrupt investment bank Bear Stearns and cash-strapped retail lender Washington Mutual” (Osborne, 2011). Emerging from the 2008 financial crisis unscathed, Mr. Dimon became more powerful and confident. He frequently railed against the need for government regulations with regard to proprietary trading in large financial firms. (Scuffham & Laurent, 2012) quote Mr. Dimon as stating, “We must not let regulatory reform and requirements create excessive bureaucracy and unnecessary permanent costs.”

Unfortunately for JP Morgan Chase and Mr. Dimon, subsequent trading events and risk management failures of 2011 would tarnish the firm and the CEO’s cultivated and highly regarded reputation.

London Whale Trades

English (2012) offers an analogy for the high risk/low frequency event that occurred in JP Morgan’s Chief Investment Office (i.e., CIO). Imagine a scalper who purchased 50,000 tickets to a sporting event with a capacity of 75,000 seats. The event is not nearly as popular as the scalper anticipated, so the going price on the tickets plummets rapidly as would-be ticket buyers wait for prices to fall further from face value. The scalper intended to hold the tickets for the long term expecting high demand, but with too many people on the sidelines betting that prices would fall, the scalper had to cut his losses and sell at a drastic discount.

The “London Whale” trader at the center of this JP Morgan controversy was a French national and London-based trader named Bruno Iksil, who was known for being a successful leviathan risk taker. Mr. Iksil worked for the firm’s Chief Investment Office. Since JP Morgan has an excess of deposits after the firm makes loans available to businesses and consumers, this excess cash is invested by the CIO group to hedge against disparate investment actions undertaken by other areas of the bank.

The stated purpose of the CIO unit was to protect the bank from losses and interest rate risk by acting as a hedge and offsetting the bank’s other credit risks. The CIO unit is not tasked with proprietary or “prop” trading (essentially placing bets) intended to boost profits. Prop trading is the mandate of the company’s Investment Banking Division. In 2009 alone the CIO group’s Synthetic Credit Portfolio (SCP) of financial derivatives generated 1.05 billion dollars for the bank (United States Senate, pg.87). Ultimately JP Morgan would end up being a victim of its own success as it continued to conduct proprietary trades in the CIO division.

Bruno Iksil and the London CIO office were steadily racking up daily losses in the hundreds of millions of dollars by investing in synthetic derivatives (i.e., credit default swaps or CDS). The trading positions that the CIO office held were not hedging against other bank investments, as was the purported charge of this office. Credit default swaps are financial derivatives that provide investors insurance on bonds against potential default. Mr. Iksil “has been selling protection on an index of 125 companies in the form of credit-default swaps. That essentially means he is betting on the improving credit of those companies, which he does through the index—CDX IG 9—tracking these companies” (Bianco, 2012, para 5).

Needless to say, the companies in the index did not improve. The initial 100 million dollar position that the CIO office held in the CDX IG 9 index was essentially cornering the market and when there were no willing buyers, the firm had to sell at massive loss.

In April of 2012 the press began running stories about the identity of the “London Whale”. The massively large trades in credit default swaps (the same complex financial instruments that doomed A.I.G. during the 2008 financial crisis) began to affect credit markets worldwide. By the end of the week of May 11, 2012, when the firm held a hastily convened conference call regarding transparency around the London Whale trades, JP Morgan had lost 14.4 billion dollars of market capitalization as its stock price fell 11.47% in two days (Ebrahimi, Aldrick & Wilson, 2012). By the end of May the synthetic derivatives portfolio alone had lost 2 billion dollars. By the end of June the losses doubled to 4.4 billion and eventually reached 6.2 billion by the end of the year (United States Senate, pg. 12).

Initial Management Response

Once the CIO division management learned of Bruno Iksil’s precarious investment positions, “it could have announced the maximum possible losses from the trades. Instead it said what the losses were at that moment in time, and hoped a change in sentiment and some clever trading would stop them spiralling [sic]”. To the London Whale’s credit, once he observed the potential for disaster, he suggested that the division take a loss or “full pain” which would have been an additional 100 million dollars wiped out (Farrell, 2012), far less than the eventual 6.2 billion dollar total loss number.

Amazingly, Mr. Iksil’s management began to take actions that would conceal the magnitude of losses reported. Recorded telephone calls, instant messages and a shadow spreadsheet containing actual projected losses, revealed how traders were pressured to minimize the expected losses of the SCP (Synthetic Credit Portfolio) (United States Senate Report, pg. 20).

Internal CIO management also disregarded its own risk metrics, such as VaR (value at risk), which estimates the maximum potential loss over the course of a day. This warning metric was not only ignored, the limit was actually raised. CEO Jamie Dimon, Chief Risk Officer John Hogan and CIO head Ina Drew “approved the temporary increase in the Firm-wide VaR limit, and Ms. Drew approved a temporary increase in CIO’s 10-Q VaR limit.” (JPMorgan Chase & Co, 2013 pg. 79)

Senior bank management was told that potential losses were massive and that the portfolio no longer functioned as a hedge for the bank; management then proceeded to downplay those issues until the losses mounted into the billions of dollars (United States Senate, pg. 21). On an April 13, 2012 first quarter conference call, CEO Jamie Dimon dismissed the initial publicity surrounding the London Whale trades by characterizing them as a “complete tempest in a teapot” (United States Senate, pg. 17). By June 2012, Mr. Dimon stated to a Senate panel on the trading losses, “Let me first say, when I made that statement, I was dead wrong” (PBS NewsHour, 2012).

Remediation and Outcomes

CEO Jamie Dimon stated, “CIO will no longer trade a synthetic credit portfolio and will focus on its core mandate of conservatively investing excess deposits to earn a fair return” (JPMorgan Chase & Co., 2012(a), pg. 3). Management instituted a number of changes as a result of the CIO trading imbroglio. All CIO managers based in London with any responsibility for the Synthetic Credit Portfolio were separated from the firm with no severance and without 2012 incentive compensation. (JPMorgan Chase & Co., 2012(a), pg.22).

JP Morgan instituted significant changes for the better in the CIO Risk Management organization. A new Chief Risk Officer was empowered to hire additional senior level officers to “extend the capacity of the Risk function within CIO, Treasury and Corporate, and he has made 20 such hires since May 2012” (JPMorgan Chase & Co., 2012(a), pg.114). Along with upgraded personnel skills in the CIO Risk organization, management rightfully instituted a common sense approach to structural issues.

In the pre “Whale trades” environment, the CIO Risk Committee met infrequently and did not contain any members from outside of the CIO organization. This lack of diversity in the realm of “risk-thought” fostered a group think/rubber-stamp mentality. CIO Risk managers did not feel “sufficiently independent” from the CIO business to ask hard questions or criticize trading strategies (JPMorgan Chase & Co., 2012(a), pgs. 12-13).

Industry Impact

“Dimonfreude” was a term coined in the wake of the trading losses; “it means taking great satisfaction in the misfortunes of the JPMorgan boss” (Foley, 2012). Yet the fallout from JP Morgan’s episode was more than mere embarrassment for the firm and damage to the CEO’s reputation in the area of risk management. To the chagrin of Mr. Dimon, this episode strengthened the case for more government oversight of the financial industry. In the words of then Treasury Secretary Timothy Geithner, “I think this failure of risk management is just a very powerful case for financial reform” (Shorter, Murphy & Miller, 2012, pg. 24).

References

Bianco, J. (2012). Understanding J.P. Morgan’s Loss, And Why More Might Be Coming. The Big Picture. Retrieved February 2, 2014, from http://www.ritholtz.com/blog/2012/05/understanding-j-p-morgans-loss-and-why-more-might-be-coming/

English, S. (2012). How London Whale’s errors attracted the market sharks. The Independent. Retrieved from Factiva.

Ebrahimi., H., Aldrick, P., & Wilson, H. (2012). The day JP Morgan’s Jamie Dimon lost his sparkle; Breathtaking risk failures at JP Morgan have left the bank’s reputation on the edge. The Telegraph Online. Retrieved from Factiva.

Farrell, M. (2013). JPMorgan slashes Dimon’s bonus by 53%. CNN Wire. Retrieved from Factiva.

Foley, S. Jamie Dimon Chief executive, JP Morgan Chase (2012). The Independent. Retrieved from Factiva.

Hoover’s Company Records. (2014). JPMorgan Chase & Co. Austin, U.S. Retrieved from http://search.proquest.com.libezproxy2.syr.edu/docview/230565788?accountid=14214

JPMorgan Chase & Co. (2012)(a). JPMORGAN CHASE REPORTS SECOND-QUARTER 2012 NET INCOME OF $5.0 BILLION, OR $1.21 PER SHARE, ON REVENUE OF $22.9 BILLION. Retrieved February 1, 2014 from http://files.shareholder.com/downloads/ONE/2939707738x0x582870/6a286dff-ad7e-40ba-92ef-e6ff1b3be161/JPM_2Q12_EPR_Final.pdf

JPMorgan Chase & Co. (2012)(b). CIO Task Force Update. Retrieved February 1, 2014 from http://files.shareholder.com/downloads/ONE/2939707738x0x582869/df1f2a5a-927e-4c10-a6a5-a8ebd8dafd69/CIO_Taskforce_FINAL.pdf

JPMorgan Chase & Co. (2013). Report of JPMorgan Chase & Co. Management Task Force Regarding 2012 CIO Losses. Retrieved February 1, 2014 from http://files.shareholder.com/downloads/ONE/2272984969x0x628656/4cb574a0-0bf5-4728-9582-625e4519b5ab/Task_Force_Report.pdf

Osborne, A. (2012). JP Morgan $2bn loss: Dimon’s in the rough; Be careful what you wish for. The Telegraph Online. Retrieved from Factiva.

PBS NewsHour. JPMorgan Chase’s Big Losses, Big Risk: Blip on Radar or Systemic? Retrieved February 1, 2014 from http://www.pbs.org/newshour/bb/business-jan-june12-jamiedimon_06-13/

Scuffham, M. & Laurent, L. (2012). Trader known as ‘London Whale’ for his huge, hidden bets. The Globe and Mail. Retrieved from Factiva.

Shorter, G., Murphy, E., Miller, R. (2012). JP Morgan Trading Losses: Implications for the Volcker Rule and Other Regulation. Congressional Research Service. Washington, DC. Retrieved from https://www.fas.org/sgp/crs/misc/R42665.pdf

United States Senate. (2013). JPMorgan Chase Whale Trades: A Case History of Derivatives Risks And Abuses. Staff Report. Washington, DC. Retrieved from http://www.hsgac.senate.gov/download/report-jpmorgan-chase-whale-trades-a-case-history-of-derivatives-risks-and-abuses-march-15-2013

Picture Copyright : Andrey Kiselev on 123rf.com

How to Drill into Data Using Set Actions in Tableau

Drilling with Set Actions

If you’ve ever tried to use the default drill functionality within Tableau, you know it could be a more user-friendly experience. The default table drill opens every option at the next drill level, which can force a user to lose sight of the data they’re focusing on. A more user-friendly option lets the user drill into only a specific selected value, so focus and attention can be maintained. This is otherwise known as asymmetric drill down.

Fortunately, as of version 2018.3, Tableau has added Set Actions. At a high level, developers can take an existing set and update its values based upon a user’s actions in the visualization. The set can be employed via a calculated field within the visualization, via direct placement in the visualization, or on the marks card property.

In lay terms this means empowering a user with more interactivity to impact their analyses.

In this first video, I’ll demonstrate a use of set actions on an NBA data set. We’ll drill from Conference to Division to Team to Player. This tip will be easily applicable to your Tableau data. And with the bonus tree-map tip you’ll release your inner Piet Mondrian.

Feel free to interact with the set action example on Tableau Public and then download and dissect the workbook.


Drilling with Level of Detail (LOD) Calculations
If you want to stay with a classic approach, a nice Level of Detail (LOD) workaround can be employed to drill into the next level. Here is a tip that accomplishes a similar outcome where I demonstrate a technique originally presented by Marc Rueter at Tableau Conference 2017.

Now that I’ve equipped you with the knowledge to incorporate customized drilling functionality into your analyses, go forth and do some great things with your data!

References:

https://onlinehelp.tableau.com/current/pro/desktop/en-us/actions_sets.htm
https://www.tableau.com/learn/tutorials/on-demand/set-actions
https://www.basketball-reference.com/leagues/NBA_2018.html
https://www.youtube.com/watch?v=d22A4XVoUEs

Image Copyright dzxy on 123rf.com

How to Get Things Done in T-SQL

Whenever I fire up SQL Server to perform some analysis, there are a few functions, keywords and capabilities I always find myself reaching for. Like most T-SQL users, even those of us who have been using the language for over a decade, I know in my head what I want to do but still refer to my favorite syntax references to get there. I decided to make a handy reference sheet for myself, and then decided to post it here for anyone else.

How to Create a Temporary Table in T-SQL / SQL Server

Temporary (i.e., temp) tables enable the storage of result sets from SQL scripts while requiring less record-locking overhead, which improves performance. They remain in effect until they are explicitly dropped, or until the session that created them ends.

As I see it, their main benefit is that they preclude me from writing difficult to comprehend nested queries since I can place a result set inside a temp table and then join it back to a normal table at-will.

In this example, the results of permanent table ‘TABLE1’ will be placed into global temporary table ##TEMPTABLE:

SELECT
     FIELDNAME1,
     FIELDNAME2,
     FIELDNAME3
INTO ##TEMPTABLE
FROM TABLE1
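Once the result set lands in ##TEMPTABLE, it can be queried and joined like any other table. Here is a sketch of the join-back pattern described above; the ORDERS table and its CUSTOMER_KEY join column are hypothetical names for illustration:

```sql
-- Hypothetical join-back: combine the temp table's result set
-- with a permanent table instead of writing a nested query
SELECT
     T.FIELDNAME1,
     T.FIELDNAME2,
     O.ORDER_TOTAL
FROM ##TEMPTABLE AS T
INNER JOIN ORDERS AS O
     ON O.CUSTOMER_KEY = T.FIELDNAME1
```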

Temp tables are stored in the tempdb system database.

TempDB

“The tempdb system database is a global resource that is available to all users connected to the instance of SQL Server or connected to SQL Database.”

What do the Hashtags Mean in T-SQL Temp Table Creation?

The number of hash signs “#” preceding the name of the temp table determines whether the scope of the table is local or global.

  • If you precede the temp table name with “#”, then the table will be treated as a local temp table.
  • If you precede the temp table with “##”, then the table will be treated as a global temp table.

“You can create local and global temporary tables. Local temporary tables are visible only in the current session, and global temporary tables are visible to all sessions. Temporary tables cannot be partitioned. Prefix local temporary table names with single number sign (#table_name), and prefix global temporary table names with a double number sign (##table_name).”
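To make the scope difference concrete, here is a small sketch reusing TABLE1 and FIELDNAME1 from the earlier example:

```sql
-- Local temp table (#): visible only to the session that created it
SELECT FIELDNAME1 INTO #LOCALTEMP FROM TABLE1

-- Global temp table (##): visible to all sessions until every
-- session referencing it has closed
SELECT FIELDNAME1 INTO ##GLOBALTEMP FROM TABLE1
```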


How to Drop a Temp Table in T-SQL / SQL Server

There are times when you will need to rerun code that creates a temp table. If the temp table has already been created, you will encounter an error.

“There is already an object named ‘##TEMP_TABLE_NAME’ in the database.”

Place the following code above the creation of your temp tables to force SQL Server to drop the temp table if it already exists. Change ##TEMP_TABLE_NAME to your table name and use the correct number of hashtags as applicable to a local (#) or global (##) temp table.

IF OBJECT_ID('tempdb..##TEMP_TABLE_NAME') IS NOT NULL
DROP TABLE ##TEMP_TABLE_NAME
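If you are on SQL Server 2016 (13.x) or later, the same existence check can be written more concisely with DROP TABLE IF EXISTS:

```sql
-- Equivalent guard on SQL Server 2016+: drops the table only if it exists
DROP TABLE IF EXISTS ##TEMP_TABLE_NAME
```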

How to Add a New Field to a Temp Table in T-SQL / SQL Server (ALTER TABLE)

Here is example T-SQL that illustrates how to add a new field to a global temp table. The code below adds a simple bit field (holds either 1 or 0) named FIELD1 to the temp table, declares it as NOT NULL (i.e., it must have a value) and then defaults the value to 0.

ALTER TABLE ##TEMP_TABLE
ADD FIELD1 Bit NOT NULL DEFAULT (0)

The following code changes the data type of an existing field in a global temp table. FIELD1 has its data type changed to NVARCHAR(2) and is declared as NOT NULL.

ALTER TABLE ##TEMP_TABLE
ALTER COLUMN FIELD1 NVARCHAR(20) NOT NULL;


How to Use a CASE Statement in T-SQL / SQL Server

The following information on the CASE statement is direct from Microsoft:

The CASE expression evaluates a list of conditions and returns one of multiple possible result expressions. The CASE expression has two formats:

  • The simple CASE expression compares an expression to a set of simple expressions to determine the result.
  • The searched CASE expression evaluates a set of Boolean expressions to determine the result.

Both formats support an optional ELSE argument.

CASE can be used in any statement or clause that allows a valid expression. For example, you can use CASE in statements such as SELECT, UPDATE, DELETE and SET, and in clauses such as select_list, IN, WHERE, ORDER BY, and HAVING.

Examples from Microsoft:

SELECT
    ProductNumber,
    Category =
        CASE ProductLine
            WHEN 'R' THEN 'Road'
            WHEN 'M' THEN 'Mountain'
            WHEN 'T' THEN 'Touring'
            WHEN 'S' THEN 'Other sale items'
            ELSE 'Not for sale'
        END,
    Name
FROM Production.Product
ORDER BY ProductNumber;


SELECT
    ProductNumber,
    Name,
    "Price Range" =
        CASE
            WHEN ListPrice = 0 THEN 'Mfg item - not for resale'
            WHEN ListPrice < 50 THEN 'Under $50'
            WHEN ListPrice >= 50 AND ListPrice < 250 THEN 'Under $250'
            WHEN ListPrice >= 250 AND ListPrice < 1000 THEN 'Under $1000'
            ELSE 'Over $1000'
        END
FROM Production.Product
ORDER BY ProductNumber;


How to Use the Cast Function in T-SQL / SQL Server

When you need to convert a data field or expression to another data type, the cast function can help. I typically need to take imported text fields and evaluate them as datetime values. The cast statement below resolves this issue.

Select cast(txtOrder_Date as datetime) as Order_Date

This statement can also be used in a WHERE clause to filter the text as if it were a true datetime field.

Where cast(txtOrder_Date as datetime) between '20170101' and '20181231'
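
One caution: cast raises a conversion error when the text is not a valid date. On SQL Server 2012 and later, TRY_CAST returns NULL instead of failing, which is handy for dirty imported data:

```sql
-- TRY_CAST (SQL Server 2012+) returns NULL instead of raising an error
Select TRY_CAST('not a date' as datetime) as Bad_Date    -- NULL
Select TRY_CAST('20180115' as datetime) as Good_Date     -- 2018-01-15
```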

Furthermore, you can cast a literal string to an integer or decimal as needed.

Select cast('12345' as int) as Integer_Field
Select cast('12345.12' as decimal(9,2)) as Decimal_Field

When your FIELDNAME is a text value, you can use the cast function to change its data type to an integer or decimal, and then sum the results. Here are a few examples I have had to use in the past with the sum function.

sum(cast(FIELDNAME as int)) as Sum_Overall_Qty

sum(cast(ltrim(rtrim(FIELDNAME2)) as decimal(38,2))) as Sum_Sales_Price


Using the REPLACE Function in T-SQL / SQL Server

The REPLACE function is useful when you need to replace all occurrences of one character or substring with another. The following select replaces the string 'Anthony' with 'Tony'.

Select REPLACE('My name is Anthony', 'Anthony', 'Tony');
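
REPLACE is also handy for cleaning numeric text before a cast, such as stripping thousands separators:

```sql
-- Remove the commas so the text can be cast to a numeric type
Select cast(replace('1,234,567.89', ',', '') as decimal(12,2)) as Clean_Amount;
```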


How to Convert a Negative Text Number in Parenthesis Format to a Numeric Data Type (T-SQL / SQL Server)

I’ve found this particular expression useful when converting a negative number in text format to a decimal value, where the text is enclosed in parentheses; i.e., changing (123.45) to -123.45.

It makes use of the REPLACE function to find the leading parenthesis and replace it with a negative sign. This first REPLACE is nested inside another REPLACE that finds the trailing parenthesis and replaces it with an empty string.

Select cast(replace(replace('(123.45)','(','-'),')','') as money);

You can also use the convert function to accomplish the same result. Below, I used this line of code to sum negative formatted text (represented by FIELD_NAME) by converting it to the money data type after replacing the parentheses.

sum(convert(money,replace(replace(FIELD_NAME,'(','-'),')',''))) as Sum_Domestic_Price


COALESCE Function in T-SQL / SQL Server

The COALESCE function is very useful for replacing NULL field values with a substitute value. Per Microsoft, the COALESCE function evaluates a comma-delimited list of expressions in order and returns the value of the first expression that does not evaluate to NULL.

For example,

SELECT COALESCE(NULL, NULL, 'third_value', 'fourth_value');

returns the third value because the third value is the first value that is not null. I will use the COALESCE function at times to replace NULL values with 0 for use in calculations.

Select COALESCE(NULL_FIELD, 0)
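
For instance, a common pattern (the table and column names here are placeholders) is guarding arithmetic, since NULL propagates through calculations:

```sql
-- 10.00 - NULL yields NULL; COALESCE substitutes 0 so the math still works
Select Base_Price - COALESCE(Discount_Amount, 0) as Net_Price
From Orders;
```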

https://docs.microsoft.com/en-us/sql/t-sql/language-elements/coalesce-transact-sql?view=sql-server-2017

Yet Another Market Basket Analysis in Tableau

This video represents part two in my Market Basket Analysis series.

The steps in this post were inspired by the book Tableau Unlimited, written by a former co-worker of mine, Chandraish Sinha. I wasn’t planning to construct another market basket analysis video, but when I saw the approach outlined in his book, I felt it warranted sharing with my readers and followers.

In this version we’ll use the default Tableau Superstore data to show the relationship between sub-categories on an order, all without using a self table join. The visualization and analysis are driven by a user selection parameter.

Once the user selects a sub-category, the bar chart visualization updates to reflect the number of associated sub-category items on the same order.

Watch the video and as always get out there and do some great things with your data!

Feel free to also check out Part 1 here where we create a simpler correlation matrix version that shows all the sub-category relationships in one visual.

Market Basket Analysis in Tableau

The market basket analysis is a favorite technique retailers employ to understand the purchase behavior of their customers. When you log on to Amazon, you’ve most likely noticed the “Frequently Bought Together” section, where Jeff Bezos and company would like to cross-sell you additional products based on the purchase history of other people who have purchased the same item.

Market Basket Analysis influences how retailers institute sales promotions, loyalty programs, cross-selling/up-selling and even store layouts.

If a retailer observes that most people who purchase Coca-Cola also purchase a package of Doritos (I know they’re made by competing companies), then it may not make sense to discount both items at once, as the consumer might have purchased the associated item at full price anyway. Understanding the correlation between products is powerful information.

In this video, we’ll use Tableau Superstore data to perform a simple market basket analysis.

Feel free to interact with this market basket analysis on Tableau Public and then download and dissect the workbook.

Watch the video and as always get out there and do some great things with your data.

Feel free to also check out Part 2 here where we’ll create an analysis driven by a user selection parameter.