A Review of Syracuse University’s Executive Master of Science in Information Management Program (MSIM)

Syracuse University iSchool Commencement May 14, 2016 (I am somewhere in the far bottom right)

I wanted to share an assessment of the Syracuse University Executive Master of Science in Information Management (MSIM) program as I recently graduated from the program in the summer of 2016. I will say that if you are considering a distance program, you must be self-disciplined and a self-starter. Typically there are no set class times, there are only set assignment due dates.

Just to give you a bit of background on my experiences: back in 2012 I was toiling full-time in the back office of an Atlanta-based bank in a data warehousing/requirements-gathering capacity when I acquired (or I should say re-acquired) the master's bug. I already held an MBA with a focus in IT management, but I wanted additional exposure to current, “sexier” technology topics such as data science, R programming, visualization and information security.

In contrast, the IT management track of my MBA studies focused on strategic planning, project management, justifying IT investments, business process analysis and on-site company practicums. Basically, IT management is the academic “CIO starter pack” for the so-inclined executive. I wanted to supplement the more strategic MBA focus I had already gained with a granular, data-focused learning opportunity. I was thinking ahead and trying to position myself to tackle more interesting information-related work in visualization and data analysis.

Fortunately, my employer at that time offered a tuition reimbursement program that would help offset some of the costs of a new graduate degree. Thus, at the start of 2012 I decided to take the plunge and started looking at reputable graduate degree programs in IS/MIS/CIS or Information Management.

Just because I was looking at online degree programs didn’t mean I was willing to compromise on academic quality or university reputation, so I researched STEM programs at reputable schools: Carnegie Mellon, Northwestern, Brandeis, Boston University and Syracuse University.

Based upon my tenure, here are some pros and cons of the Syracuse Executive MSIM program; I will be as impartial as possible in my assessment. Please note that my experience does not include an assessment of the full-time on-campus program.

From the Syracuse iSchool website:

The 30-credit MS in Information Management for Executives requires students to have six or more years of full-time professional experience in the information management field with a record of continually increasing job responsibilities. This program track is also available online, full time or part time, and can be completed in as little as 15 months.

Pros:

  • Syracuse University is a well-known respected institution with a solid name brand;
  • According to U.S. News, the Syracuse iSchool has the #1-ranked program in Information Systems among iSchools
    • Syracuse edged out other respected schools such as Michigan, UNC, Washington, Maryland and UT-Austin
  • All of the Big-4 professional services firms recruit from the full time program;
    • I took this as an indication of program quality
  • The degree is offered by the same school that houses the on campus program and not offered from a Professional Studies school
    • This held sway with me as I was looking for schools that did not herd distance students into a program with separate faculty and lower admissions standards
  • Any student in the Executive MSIM program can at any time enroll in an on-campus class
    • This was further proof of academic parity between all MSIM degree options as I enrolled in two on-campus classes during summer terms
    • The majority of the classes were taught by PhD-level faculty
  • You can combine a graduate degree with a CAS (Certificate of Advanced Study) in multiple disciplines;
    • Data Science
    • e-Government
    • Information Security Management
    • Information Systems & Telecommunications Management
    • School Media
  • You can transfer up to 6 credit hours into the MSIM program from another graduate-level program;
  • There are Syracuse University alumni chapters in most major cities that present networking opportunities
  • If you’re a college basketball fan Syracuse gives you a new rooting interest in a quality program. College football fans however…
  • If you complete the program, you WILL learn new concepts that can be put to work immediately in your current position and make you more confident in your abilities

Cons:

  • Expensive!!!
    • The program was much cheaper than Carnegie Mellon and slightly cheaper than Northwestern’s Continuing Studies offering, but it was pricey nonetheless. Back in 2012, classes were roughly $3,800 each. By the time I finished in 2016 I was paying $4,400 per class (a whopping 16% increase)
    • The majority of classes I took were of high quality and I gained additional knowledge to fill in some gaps, but there were some classes where I felt the material was on par with a MOOC (which is not necessarily bad); at $4,000 a class, however, students will have much higher standards
    • The high cost of the program forced me to stretch it out over three and a half years
  • Would have liked to see greater connectivity among Executive MSIM students. I listed academic parity as a plus, but for networking purposes I would have liked to know who else was in the executive program, even if they weren’t in my particular class. Classes were comprised of executive online students, non-executive online students and full-time on-campus students who wanted an online option. This cross-blend of students enriched the overall class experience, but I still wanted a means of connecting with all students in the Executive MSIM program
  • Your student advisor can be unresponsive
    • There were times when I received an email response from my advisor up to a week later, and times when my emails and phone calls were ignored entirely. Slow responses, and especially non-responses, should not be a possibility when customers are paying the university roughly $4,000 a class
  • Work. School. Social Life. Pick 2. This problem is not unique to Syracuse graduate programs
    • In my first class I had to write a 25-page paper which meant taking a week of “vacation” to finish!

All in all, I was pleased with the program. I was able to combine the M.S. in Information Management degree with a Certificate in Data Science which included exposure to tools like R, Tableau and Qlikview.

I was able to travel to Syracuse during the summer on two different occasions and complete two classes which were both of high quality. If you enroll in the program you should absolutely attend a summer Maymester class to acquaint yourself with the city and the campus. On my first visit I stayed in Haven Hall and lived like a student. On my subsequent visit it was a week in the Marriott since I had hotel points to use from my consulting job.

I specifically enjoyed the capstone class I took on campus, IST-755 Strategic Management of Information Resources. The class involved lectures combined with readings, a mid-term test and an in-class group strategic presentation based on an assigned case problem. After the week of class, students needed to compose an individual strategic paper.

From a cost perspective, there are innovative graduate programs leveraging MOOC (i.e. Massive Open Online Course) components which are currently offered at prestigious public universities. The MOOC underpinnings of these offerings allow the degrees to be offered at a mere fraction of Syracuse’s cost.

Both Georgia Tech (full disclosure, my MBA was earned here) and the University of Illinois are offering M.S. programs in computer science and data science for roughly $7,000 and $19,000 respectively. I would love to see more reputable universities with quality STEM programs follow suit in this regard (looking at you NYU, Cal-Berkeley, Carnegie Mellon and Syracuse).

However, if you can take advantage of your employer’s tuition reimbursement plan to subsidize the cost, I recommend the Syracuse Executive Master of Science in Information Management program for its quality academics taught by PhDs and its flexible curriculum. The program will be especially useful for talented individuals mired in back-office banking looking to transition to consulting!!

Photo Copyright: maglara

Consumer Financial Protection Bureau Infographic: Complaints Analysis

Background

As a data and visualization endeavor, I put together an infographic that highlights some product complaints analyses I performed using publicly available Consumer Financial Protection Bureau data.

In case you are unfamiliar with the CFPB, it is an organization created in 2010 as a result of the financial calamity that gripped the nation during the Great Recession. The CFPB’s mission is to write and enforce rules for financial institutions, examine both bank and non-bank financial institutions, monitor and report on markets, and collect and track consumer complaints.

On the bureau’s website they host a consumer complaint database that houses the complaints consumers file against financial institutions.

Each week we send thousands of consumers’ complaints about financial products and services to companies for response. Those complaints are published here after the company responds or after 15 days, whichever comes first. By adding their voice, consumers help improve the financial marketplace.

Process

I downloaded the complaint database from the CFPB’s website and then decided to concentrate on selected bank complaints from the many financial institutions present in the database. I settled on a self-defined “National” category and a “Regional” category and then analyzed the percentage of complaints across three product spaces (Mortgages, Bank Accounts & Credit Cards).

I felt a percentage approach would be more useful than merely listing a total count of complaints. The national banks category consists of four nationally known firms: JP Morgan Chase, Wells Fargo, Bank of America and Citibank. The regional banks category consists of ten fairly large regional banks that have product offerings similar to the national banks.

It’s fairly obvious that the behemoth national banks are going to have more mortgage complaints than the much smaller regional banks on a total count basis. The more interesting analysis is to look at the rate of mortgage complaints for the national banks as compared to the regional banks (e.g. divide a specific product complaints total like mortgage by the total complaints for all products; calculate this percentage for national and regional categories across all three products).

I carried out this analysis using the ggplot package in R to generate the base graphics for the infographic. Adobe Illustrator was then used to further refine the graphics into what you see below:

[Infographic: IST 719 final project, CFPB complaints analysis]

I have an additional unrefined chart that is a straight output from the ggplot package in R. I didn’t have enough space on the infographic to include it there. However, this analysis is the same as is represented in the bottom quadrant of the infographic, except that it solely applies to regional banks.

The analysis consists of totaling all of the specific PRODUCT complaints filed against a particular bank and then dividing that number by the total number of ALL complaints filed against the individual bank (e.g. Total mortgage complaints filed against a bank/total complaints filed against a bank). I call the resulting number the Complaint Ratio.
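For readers who prefer to reproduce the arithmetic in SQL rather than R, here is a minimal sketch of the Complaint Ratio as a single aggregation. The table and column names are hypothetical, not the actual CFPB schema:

-- One row per complaint, with the company and product recorded on each row.
-- COUNT(*) totals complaints per company/product pair; the windowed
-- SUM(COUNT(*)) totals all complaints per company, yielding the ratio.
SELECT
    Company,
    Product,
    COUNT(*) * 1.0 / SUM(COUNT(*)) OVER (PARTITION BY Company) AS ComplaintRatio
FROM Complaints
GROUP BY Company, Product;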

In the ggplot graph output below we can see that Regions’ “Bank Account or service” product represents about 67% of all complaints filed against Regions. If I were to break out the numbers on a total count basis, we’d see that Regions’ overall complaints total is relatively small compared to other banks. However, the bulk of its complaints fall in the “Bank Account or service” product area.

[Chart: regional bank complaint ratios by product]

May your next bank be your best bank.

Additional Reading:

An Interesting Comparison of Bank of America to JPMorgan Chase

The Need For Speed: Improve SQL Query Performance with Indexing

This article is also published on LinkedIn.

How many times have you executed a SQL query against a million-plus-row table and then engaged in a protracted waiting game for your results? Unfortunately, a poor table indexing strategy can counteract the gains of the best hardware and server architectures. The positive impact that strategically applied indexes can have on query performance should not be ignored just because one isn’t wearing a DBA hat. “You can obtain the greatest improvement in database application performance by looking first at the area of data access, including logical/physical database design, query design, and index design” (Fritchey, 2014). The basics of index application should not be eschewed and treated as an esoteric art best left to DBAs.

Make use of the Covering Index

It is important that regularly used, resource-intensive queries be supported by “covering indexes”. The aim of a covering index is to “cover” the query by including all of the fields that are referenced in the WHERE or SELECT clauses. Babbar, Bjeletich, Mackman, Meier and Vasireddy (2004) state, “The index ‘covers’ the query, and can completely service the query without going to the base data. This is in effect a materialized view of the query. The covering index performs well because the data is in one place and in the required order.” The benefit of a properly constructed covering index is clear: the RDBMS can find all the data columns it needs in the index without referring back to the base table, which drastically improves performance. Kriegel (2011) asserts, “Not all indices are created equal — If the column for which you’ve created an index is not part of your search criteria, the index will be useless at best and detrimental at worst.”
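To make the concept concrete, here is a minimal T-SQL sketch; the table, column and index names are hypothetical, chosen only for illustration:

-- Suppose this query runs regularly:
--   SELECT OrderDate, TotalDue FROM dbo.Orders WHERE CustomerID = 42;
-- The index below keys on the WHERE column and INCLUDEs the SELECT columns,
-- so the query can be answered entirely from the index without touching the base table.
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID_Covering
ON dbo.Orders (CustomerID)
INCLUDE (OrderDate, TotalDue);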

Apply a Clustered Index

More often than not, a table should have a clustered index applied so as to avoid expensive table scans. It is advisable to create one clustered index per table, preferably on the PRIMARY KEY column. In theory, since the primary key is the unique identifier for a row, query writers will employ the primary key to aid record search performance.

“When no clustered index is present to establish a storage order for the data, the storage engine will simply read through the entire table to find what it needs. A table without a clustered index is called a heap table. A heap is just an unordered stack of data with a row identifier as a pointer to the storage location. This data is not ordered or searchable except by walking through the data, row by row, in a process called a scan” (Fritchey, 2014).

However, the caveat to applying a clustered index on a transactional table is that the index must be maintained after every INSERT or UPDATE to the key, which can add substantial overhead to those processes. Dimensional or static tables that are only accessed for join purposes are optimal candidates for this indexing strategy.
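As a brief sketch of this advice (same hypothetical table as above), declaring the primary key as CLUSTERED orders the table’s physical storage by that key:

-- One clustered index per table, here on the primary key OrderID.
CREATE TABLE dbo.Orders (
    OrderID    INT   NOT NULL,
    CustomerID INT   NOT NULL,
    OrderDate  DATE  NOT NULL,
    TotalDue   MONEY NOT NULL,
    CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED (OrderID)
);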

Apply a Non-Clustered Index

Another consideration in regard to SQL performance tuning is to apply non-clustered indexes on foreign keys within frequently accessed tables. Babbar et al. (2004) advise, “Be sure to create an index on any foreign key. Because foreign keys are used in joins, foreign keys almost always benefit from having an index.”
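Continuing the hypothetical example, a child table’s foreign key gets its own non-clustered index to speed up joins back to the parent:

-- OrderLines.OrderID is a foreign key referencing dbo.Orders;
-- joins on OrderID benefit from this index.
CREATE NONCLUSTERED INDEX IX_OrderLines_OrderID
ON dbo.OrderLines (OrderID);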

Indexing is an Art not a Science

Always remember that indexing is an art, not a science. Diverse real-world scenarios often call for different indexing strategies. In some instances, indexing a table may not be required at all. If a table is small (only a few data pages), a full table scan will be more efficient than traversing an index and then accessing the base table to locate the rest of the row data.

Conclusion

One of the biggest detriments to SQL query performance is an insufficient indexing strategy. On one hand, under-indexing can cause queries to run longer than necessary due to the costly nature of table scans against unordered heaps. On the other hand, over-indexing will negatively impact insert and update performance.

When possible, SQL practitioners and DBAs should collaborate to understand query performance as a whole, especially in a production environment. DBAs left to their own devices may create indexes without any knowledge of the queries that will utilize them. This uncoordinated approach can render indexes inefficient on arrival. Conversely, it is equally important that SQL practitioners have a basic understanding of indexing as well. Placing “SELECT *” in every SQL query will negate the effectiveness of covering indexes and add processing overhead compared to listing only the subset of fields desired.

Even if you do not have administrative access to the tables that constitute your queries, approaching your DBA with a basic understanding of indexing strategies will lead to a more effective conversation.

References

Babbar, A., Bjeletich, S., Mackman, A., Meier, J., & Vasireddy, S. (May, 2004). Improving .NET Application Performance and Scalability. Retrieved from https://msdn.microsoft.com/en-us/library/ff647793.aspx

Fritchey, G. (2014). SQL Server Query Performance Tuning (4th ed.).

Kriegel, A. (2011). Discovering SQL: A Hands-On Guide for Beginners.

Protection Against Injection: The SQL Injection Attack

As we are all well aware, data is everywhere. Every organization generates and stores data and unfortunately too many bad apples are willing to exploit application weaknesses.  A very popular technique used by hackers of all hats to compromise data confidentiality and integrity is the SQL injection attack. “In terms of the attack methods used by hackers, SQL injection remains the number one exploit, accounting for nearly one-fifth of all vulnerabilities used by hackers” (Howarth, 2010). Don’t believe the hype? Visit the SQL Injection Hall of Fame.

Not everyone is a DBA or a security expert, but if you care about data, you need to have a basic understanding of how this attack can be used to potentially compromise your web-exposed data. In 2009 infamous hacker Albert Gonzalez was indicted by grand juries in Massachusetts and New York for stealing data from companies such as Dave & Buster’s Holdings, TJ Maxx, BJ’s Wholesale Club, OfficeMax, Barnes & Noble and The Sports Authority by using SQL injection attacks. All of these attacks were enabled by poorly coded web application software (Vijayan, 2009). He masterminded “the combined credit card theft and subsequent reselling of more than 170 million card and ATM numbers from 2005 through 2007—the biggest such fraud in history” (Wikipedia, Albert Gonzalez). As an aside, Mr. Gonzalez is serving 20 years in prison for his crimes.

In short, a SQL injection is a malicious hacking method used to compromise the security of a SQL database. Invalid parameters are entered into a user input field on a website, and that input is submitted to a web application database server for execution. A successful exploit will allow the hacker to remotely shell into the server and take control, or simply obtain sensitive information from a hijacked SQL SELECT statement. The attacker may be able to inject further SQL commands and escalate privileges to read, modify or even delete information at will.
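To illustrate the mechanics, here is a deliberately simplified T-SQL sketch; the table and the attack string are hypothetical and not drawn from any real incident:

-- The attacker submits this as a "user name":  ' OR '1'='1
DECLARE @UserInput NVARCHAR(50) = N''' OR ''1''=''1';

-- Vulnerable pattern: raw input concatenated into the statement text.
DECLARE @Sql NVARCHAR(200) =
    N'SELECT * FROM dbo.Users WHERE UserName = ''' + @UserInput + N'''';

-- Executes: SELECT * FROM dbo.Users WHERE UserName = '' OR '1'='1'
-- The WHERE clause is now true for every row, returning the entire table.
EXEC (@Sql);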

A popular method to test the vulnerability of a site is to place a single quote character, ‘, into the query string of a URL (Krutz & Vines, 2008). The desired response is an error message that contains an ODBC (Open Database Connectivity) reference. ODBC is a standard database access protocol used to interact with applications regardless of the underlying database management system. Krutz and Vines (2008) offer this example of a typical ODBC error message:

Microsoft OLE DB Provider for ODBC Drivers error '80040e14'
[Microsoft][ODBC SQL Server Driver][SQL Server]Incorrect syntax near the
keyword 'and'. /wasc.asp, line 68

An error message like this contains a wealth of information that an ill-intentioned hacker can use to exploit an insecure system. It would be in the best interests of a secure application to return a custom generic error response. Furthermore, it is not necessary to be an experienced hacker to take advantage of this exploit; there are automated SQL injection tools available that can make carrying out this attack fairly simple for someone with a script kiddie level of understanding.

There are ways to protect against SQL injection attacks; the most obvious is to apply input validation. Rejecting unreasonably long inputs may prevent exploitation of a buffer overflow scenario. Programmers sometimes skip validation steps due to the extra work involved; however, the extra safety margin may be worth the cost. Encrypting the database contents and limiting the privileges of the accounts that execute user input queries is also ideal (Daswani, Kern, & Kesavan, 2007).

From a SQL Server perspective, here are a few best-practice tips from Microsoft TechNet to consider for input validation:

    • Review all code that calls EXECUTE, EXEC, or sp_executesql.
    • Test the size and data type of input and enforce appropriate limits. This can help prevent deliberate buffer overruns.
    • Test the content of string variables and accept only expected values. Reject entries that contain binary data, escape sequences, and comment characters. This can help prevent script injection and can protect against some buffer overrun exploits.
    • Never build Transact-SQL statements directly from user input.
    • Use stored procedures to validate user input.
    • In multitiered environments, all data should be validated before admission to the trusted zone. Data that does not pass the validation process should be rejected and an error should be returned to the previous tier.
    • Implement multiple layers of validation. Validate input in the user interface and at all subsequent points where it crosses a trust boundary. For example, data validation in a client-side application can prevent simple script injection. However, if the next tier assumes that its input has already been validated, any malicious user who can bypass a client can have unrestricted access to a system.
    • Never concatenate user input that is not validated. String concatenation is the primary point of entry for script injection (see the sketch below).
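As a minimal sketch of the last two points above (hypothetical table and column names), passing input through a typed parameter with sp_executesql means a quote character arrives as data, not as statement syntax:

-- The hostile input from the earlier sketch is now harmless: the parameter
-- is bound as a value, so the statement text never changes.
DECLARE @UserName NVARCHAR(50) = N''' OR ''1''=''1';
EXEC sp_executesql
    N'SELECT UserID, UserName FROM dbo.Users WHERE UserName = @UserName',
    N'@UserName NVARCHAR(50)',
    @UserName = @UserName;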

References

Albert Gonzalez. In Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Albert_Gonzalez

Howarth, F. (2010). Emerging Hacker Attacks. Faulkner Information Services.

Krutz, R. L., & Vines, R. D. (2008). The CEH Prep Guide: The Comprehensive Guide to Certified Ethical Hacking.

Microsoft TechNet. SQL Injection. https://technet.microsoft.com/en-us/library/ms161953%28v=sql.105%29.aspx

Vijayan, J. (2009). “U.S. says SQL injection caused major breaches.” Computerworld, 43(26), 4-4.

SQL: Think in Sets not Rows

This article is also posted on LinkedIn.

Structured Query Language, better known as SQL, is regarded as the working language of relational database management systems (RDBMS). As was the case with the relational model and the concepts of normalization, the language developed as a result of IBM research in the 1970s.

“Left to their own devices, the early RDBMSs implemented a number of languages, including SEQUEL, developed by Donald D. Chamberlin and Raymond F. Boyce in the early 1970s while working at IBM; and QUEL, the original language of Ingres. Eventually these efforts converged into a workable SQL, the Structured Query Language” (Kriegel, 2011).

For information professionals and database practitioners, SQL is regarded as a foundational skill that enables raw data to be manipulated within an RDBMS. “This is a declarative type of language. It instructs the database about what you want to do, and leaves details of implementation (how to do it) to the RDBMS itself” (Kriegel, 2011).

Before the advent of commercially accessible databases, data was typically stored in proprietary file formats. Each vendor had its own specific access mechanisms, which could not easily be configured or customized for access by other applications. As databases began to adopt the relational model, the arrival and eventual standardization of SQL by ANSI (the American National Standards Institute) and ISO (the International Organization for Standardization) helped foster access, manipulation and retrieval consistency across many products.

Think in Sets not Rows!

SQL provides users the ability to query and manipulate data within the RDBMS without having to rely solely on a graphical user interface. There are powerful extensions in the many SQL variants (e.g. T-SQL, PL/SQL) that provide functionality above and beyond the ISO and ANSI standards. However, SQL practitioners must first and foremost remember that SQL is a SET BASED construct. The most efficient SQL code regards table data as a whole and refrains from manipulating individual row elements one at a time unless absolutely necessary.
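To make the contrast concrete, here is a sketch with hypothetical names: the same price adjustment written row by row with a cursor, then as a single set-based statement:

-- Procedural habit: a cursor visits one row per iteration.
DECLARE @ProductID INT;
DECLARE ProductCursor CURSOR FOR
    SELECT ProductID FROM dbo.Products WHERE Discontinued = 0;
OPEN ProductCursor;
FETCH NEXT FROM ProductCursor INTO @ProductID;
WHILE @@FETCH_STATUS = 0
BEGIN
    UPDATE dbo.Products SET Price = Price * 1.10
    WHERE ProductID = @ProductID;
    FETCH NEXT FROM ProductCursor INTO @ProductID;
END;
CLOSE ProductCursor;
DEALLOCATE ProductCursor;

-- Set-based: one statement operates on the whole set at once.
UPDATE dbo.Products SET Price = Price * 1.10
WHERE Discontinued = 0;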

“Thinking in sets, or more precisely, in relational terms, is probably the most important best practice when writing T-SQL code. Many people start coding in T-SQL after having some background in procedural programming. Often, at least at the early stages of coding in this new environment, you don’t really think in relational terms, but rather in procedural terms. That’s because it’s easier to think of a new language as an extension to what you already know as opposed to thinking of it as a different thing, which requires adopting the correct mindset” (Ben-Gan, 2012).

Working with a relational language based upon the relational data model demands a set based mindset. Iterative, cursor-based processing should be used sparingly, if at all.

“By preferring a cursor-based (row-at-a-time) result set—or as Jeff Moden has so aptly termed it, Row By Agonizing Row (RBAR; pronounced ‘ree-bar’)—instead of a regular set-based SQL query, you add a large amount of overhead to SQL Server” (Fritchey, 2014).

If all other set based options have been exhausted and a row-by-row cursor must be employed, then make sure to use an “efficient” (relatively speaking) cursor type. The fast forward-only cursor type provides some performance advantages over other cursor types in a SQL Server environment. Fast forward-only cursors are read-only and move only forward within a data set (i.e. they do not support multi-directional iteration). Furthermore, according to Microsoft TechNet (2015), fast forward-only cursors automatically close when they reach the end of the data; the application driver does not have to send a close request to the server, which saves a round trip across the network.
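If a cursor truly cannot be avoided, the declaration from the sketch above needs only the addition of the documented FAST_FORWARD keyword (the table remains hypothetical):

-- Read-only, forward-only: the cheapest cursor variety in SQL Server.
DECLARE ProductCursor CURSOR FAST_FORWARD FOR
    SELECT ProductID FROM dbo.Products WHERE Discontinued = 0;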

References:

Ben-Gan, I.  (Apr, 2012). T-SQL Foundations: Thinking in Sets. Why this line of thought is important when addressing querying tasks. Retrieved from http://sqlmag.com/t-sql/t-sql-foundations-thinking-sets

Fritchey, G. (2014). SQL Server Query Performance Tuning (4th ed.).

Kriegel, A. (2011). Discovering SQL: A Hands-On Guide for Beginners.

Microsoft Technet. Fast Forward-Only Cursors (ODBC). Retrieved April 23, 2015, from https://technet.microsoft.com/en-us/library/aa177106(v=sql.80).aspx

From White Hat to Cyber Terrorist: The Seven Types of Hackers

The traditional definition of a hacker is someone who uses computers to gain unauthorized access to data. “Hacks” are deployed for various reasons as diverse as the thrill of the conquest, protests, profit or bolstering status within the hacker community. Some security professionals question whether the term “ethical hacker” is a contradiction in terms, as hacking was originally defined as a criminal activity (Wikipedia, Certified Ethical Hacker).

Conrad Constantine, a research engineer at the security management company AlienVault, states, “The term ‘ethical’ is unnecessary – it is not logical to refer to a hacker as an ‘ethical hacker’ because they have moved over from the ‘dark side’ into ‘the light’… The reason companies want to employ a hacker is not because they know the ‘rules’ to hacking, but because of the very fact that they do not play by the rules” (Bodhani, pg. 66).

There are many subgroups within the hacker community that encompass more than the traditional black hat, white hat dichotomy. Here are a few of the different types of hackers and their aims:

  • White Hat: Commonly referred to as an Ethical Hacker. Holders of the Certified Ethical Hacker (CEH) certification who uphold the values of the EC-Council (aka the International Council of Electronic Commerce Consultants) would be classified as white hat hackers. The aim of the white hat is to legally and non-maliciously perform penetration testing and vulnerability assessments against computer systems in order to remediate security weaknesses. White hats are typically employed by security consulting firms that perform penetration testing.
  • Black Hat: Commonly referred to as a “cracker”. Black hats are the opposite of a white hat hacker in that black hats attempt to penetrate computer systems illegally and without prior consent. A Black hat hacker is interested in committing a range of cybercrimes such as identity theft, destroying data, destabilizing systems, credit card fraud etc.
  • Grey Hat: The ethics of the grey hat lie somewhere between those of the white hat and black hat hackers. A grey hat may use the tools and skill sets of a black hat to penetrate a system illegally but will exhibit white hat tendencies in that no harm is caused to the system. Typically, the grey hat will notify the system owner of any vulnerabilities uncovered.
  • Blue Hat: An external security professional invited by Microsoft to exploit vulnerabilities in products prior to launch. This community gathers every year at a conference sponsored by Microsoft; the blue signifies Microsoft’s corporate color. “BlueHat’s goal is to educate Microsoft engineers and executives on current and emerging security threats in an effort to help address security issues in Microsoft products and services and protect customers” (Microsoft, 2013, para. 1)
  • Hacktivists: These hackers will compromise a network or system for political or socially motivated purposes. Website defacement or denial-of-service attacks are the favored methods used by Hacktivists (Wikipedia, Hacker (Computer Security)).
  • Script Kiddies: These “hackers” are amateurs who follow directions and use scripts developed and prepared by advanced hackers. The script kiddie may be able to successfully perform a hack but has no thorough understanding of the actual steps employed.
  • Cyber Terrorists: According to the U.S. Federal Bureau of Investigation, cyberterrorism is any “premeditated, politically motivated attack against information, computer systems, computer programs, and data which results in violence against non-combatant targets by sub-national groups or clandestine agents. Unlike a nuisance virus or computer attack that results in a denial of service, a cyberterrorist attack is designed to cause physical violence or extreme financial harm. According to the U.S. Commission of Critical Infrastructure Protection, possible cyberterrorist targets include the banking industry, military installations, power plants, air traffic control centers, and water systems” (Search Security)

References:

Bodhani, A. (2013, January). “Ethical hacking: bad in a good way.” Engineering and Technology Magazine, 7(12), pg. 64-64.

Cyberterrorism. In Search Security. Retrieved April 16, 2013 from http://searchsecurity.techtarget.com/definition/cyberterrorism

Microsoft. (2013). BlueHat Security Briefings. Retrieved April 16, 2013 from http://technet.microsoft.com/en-us/security/cc261637.aspx

Image courtesy of pat138241 at freedigitalphotos.net

L.A. Lakers Visualization: R Code Plus Illustrator for the Win

I am a huge Los Angeles Lakers fan since I grew up on the West Coast; I lived in Los Angeles for a year and in Las Vegas for many years as a kid. Magic Johnson and the “Showtime” squad of the ’80s will always be the best team dynasty in NBA history, in my rather biased opinion. I wanted to make a visualization using base R code to plot a bar chart of Lakers wins by season and then use Adobe Illustrator to complete the effort. Using a .csv data file from Basketball-Reference.com, I was able to tell the story of the franchise in an easy-to-comprehend visualization. I love bringing data to life and making it tell a story!

[Chart: Lakers wins by season]

Visualizations with R and Adobe Illustrator

I’ve been reading Visualize This by Nathan Yau to better understand visualization concepts. The book provides some direction regarding how to begin graphing data in R and then touching up the graphics in Adobe Illustrator. Here are a few visualizations I was able to create with some basic knowledge of R code and Adobe Illustrator. Nathan’s book provides most of the R code but the Illustrator portion took some work to get just right.

Enterprise Architecture Best Practice: Communicating & Quantifying Value

One of the common threads that I have come across while researching Enterprise Architecture with respect to its rollout and adoption within organizations is the importance of communication and value quantification.

Many case studies hammer home a common theme: communication is a critical success factor in EA engagements. Particular EA challenges include working with a wide assortment of stakeholders who are unfamiliar with EA; successful implementation and adoption depend upon those stakeholders understanding how EA adds value.

Bernard (2012) asserts that translating value to the bottom line is a major concern for key executives and line-of-business managers with respect to an EA program. I believe that his list of quantifiable benefits would shore up any “marketing” plan for EA implementation. Blosch (2012) states that EA is quite frequently new to many business executives, and that these executives often need help to understand the value that EA is adding. Articulating the value proposition of EA is paramount; the ten benefits, as paraphrased from Bernard (2012, pgs. 72-74), are listed below:

Shortening Planning Cycles: The EA repository provides a wealth of information that is already preassembled for strategic planning or BPI (Business Process Improvement) activities.

More Effective Planning Meetings: EA can help reduce uncertainties by providing a common baseline.

Shorter Decision Making Cycles: A majority of strategy, business and technology information is already pre-vetted and assembled thereby abbreviating the decision making process.

Improved Reference Information: Reference data is gathered using a standardized methodology that lends itself to practicable storage on the EA repository; thus data mining and business analysis capabilities are enhanced.

Reduction of Duplicative Resources: EA enables current enterprise resources to be inventoried and then subsequently analyzed for value overlap and performance gaps.

Reduced Re-work: Greatly reduces the potential for duplicative processes and implementations, which typically arise when individual program-level initiatives are not crafted in sync with an overarching strategy.

Improved Resource Integration and Performance: Resources are planned and utilized on an enterprise-wide basis thus promoting enterprise wide integration. Future state requirements are compared to current state requirements to identify performance gaps.

Fewer People in a Process: EA supports BPR (Business Process Reengineering) and BPI (Business Process Improvement), which can lead to streamlined processes.

Improved Communication: An EA approach helps to reduce misunderstandings and potential rework via a common language of the business.

Reduction in Cycle Time: EA facilitates the capturing of “Lessons learned” from completed projects. These lessons can then be reapplied to future projects making implementation more effective and efficient.

With these ten quantifiable benefits of EA in hand, EA practitioners should work to communicate the benefits of EA to the organization as a whole. Concentrating on gathering executive-level support is another key to initial organizational or line-of-business adoption. In turn, executives must remain actively engaged in showing their support. They should also communicate the expectation that the business participate in the burgeoning EA or any other process improvement initiative.

Doucet et al. (2009, pgs. 460-465) describe the AIDA (Attention, Interest, Desire, Action) method that is commonly used in advertising to sway behavior. A marketing communications model is used to push EA from a level of Unawareness to full Adoption. The six communications objectives are as follows: Unaware, Awareness, Interest, Desire, Action and Adoption.

At differing stages of the objectives, different communication approaches are employed. In the earlier Unaware states, more broad based statements about EA benefits and effectiveness are communicated. As the objectives move closer to the adoption stage, the details on EA become more focused until actual benchmarks, touchstones and guidelines are shared for full adoption.

In a similar manner, the “Coherency Management State” of an organization, ranging from Level 0 (Absent) to Level 5 (Innovating), will dictate communication objectives (Doucet et al., 2009, pg. 465).

  • Level 0 (Absent): Recognize the importance and create awareness of EA.
  • Level 1 (Introduced): Find an isolated application of EA and encourage use elsewhere in the organization
  • Level 2 (Encouraged): Reinforce and promote values and practices
  • Level 3 (Instituted): Widen the adoption
  • Level 4 (Optimized): Communicate results and organizational wins achieved through EA
  • Level 5 (Innovating): Maintain sustained interest in continuous improvement

Blosch (2012, pg. 10) promotes the idea of recognizing and measuring the impact of a communications strategy to make sure that it is having the desired effect.

Quantitative Measures:

  • Timeliness of communications
  • Production to plan
  • Readership statistics
  • Amount of feedback
  • Number of communications sent out by channel
  • Access and use of EA artifacts

Qualitative Measures:

  • Feedback from stakeholders
  • Assessment of stakeholders’ perception of EA
  • Adoption of EA: where and how widely it is being used

Marketing the EA program with a credible list of quantifiable benefits and pairing that list with a robust, well-thought-out communications strategy should greatly support the adoption and diffusion of EA throughout the organization.

References:

Bernard, Scott A. (2012). Linking Strategy, Business and Technology. EA3 An Introduction to Enterprise Architecture (3rd ed.). Bloomington, IN: Author House.

Blosch, M (2012, August 16). Best Practices: Communicating the Value of Enterprise Architecture. Retrieved from Gartner.

Doucet, G., Gotze, J., Saha, P., & Bernard, S. (Eds.). (2009) Coherency Management (1st Ed.). Bloomington, IN: Author House.

Image courtesy of Stuart Miles at FreeDigitalPhotos.net

Enterprise Architecture and Business Process Management

We know that Enterprise Architecture is a logical framework that helps forge a relationship between business, strategy and technology. Within those macro concepts lie the various organizational structures, processes and informational flows that help businesses meet their end goals.

With respect to business processes, businesses themselves are dynamic and must adapt to the latest market conditions in order to remain a going concern. Thus, proper attention must be paid to processes and their continuous improvement.

As organizations grow, they need to continuously analyze and refine their processes to ensure they are doing business as effectively and efficiently as possible. Fine-tuning processes gives an organization a competitive advantage in a global marketplace. (Project Management Approach For Business Process Improvement, 2010)

EA and business process management (BPM) are not mutually exclusive. Redshaw (2005, pg. 3) defines BPM as “the management of all the processes supporting a business transaction/event from the beginning to the end while applying the policies/rules needed to support an organization’s stated business model at a specific point in time.” BPM offers advantages to large institutions as it enables a linkage between IT systems and business processes. Jensen (2010) offers this summarization:

“When done together, BPM provides the business context, understanding and metrics, and EA the discipline for translating business vision and strategy into architectural change. Both are, in fact, needed for sustainable continuous optimization. It is important to realize the value of direct collaboration across BPM and EA boundaries. Only when supported by appropriate collaboration and governance processes can BPM and EA roles work effectively together towards the common goals of the enterprise.” (Jensen, 2010)

EA can support BPM projects by helping project teams become better acquainted with the very processes they are trying to improve. A project manager assigned to a new project can simply access the EA repository to get up-to-date information on the current processes pertinent to his or her domain. With respect to the EA3 framework, “The enterprise’s key business and support processes are documented at the Business level of the EA framework” (Bernard, 2012, pg. 127).

As processes are improved and changed and project wins or losses are accumulated, this knowledge is shared back into the EA repository for reuse and can be leveraged across the organization.

Quick process improvement wins and one-off pinpoint projects may embody a “silo-ed” or parochial approach not in keeping with a broader strategic outlook. Ignoring emerging business strategies can be a costly mistake. For example, energy and resources could be mobilized by a bank to architect a new customer account management or card/payments processing system within the enterprise, accompanied by revised processes. The bank could simultaneously be moving forward with emerging cloud strategies that render the newly architected solutions obsolete. This hypothetical example of creating solutions in isolation from the overall strategy would be a very costly endeavor in terms of time and money and should obviously be avoided.

Business process management projects embedded within an EA framework are, by design, aligned to the overall organizational strategy. EA becomes a key enabler for ensuring process improvement projects align to the strategy of the existing enterprise, as well as any future state strategies.

Wells Fargo and its use of Enterprise Architecture and BPM

As with most organizations of comparable size, Wells Fargo wrestled with issues from both the business and IT (Information Technology) ends of the house. The business had to gain a better understanding of what it needed. It also had to become better acquainted with the capabilities and solutions available from IT. On the other side of the coin, IT had to remain agile enough to deliver and react to changes in business conditions. In this manner IT could be better positioned to deliver solutions that met various business needs.

Olding (2008) found that Wells Fargo operated a very decentralized structure but lacked the coordinated ability to understand what was occurring in other groups that were employing business process management initiatives. A disadvantage of not embedding the BPM experiences within an EA framework was the failure to capitalize on successes that were gained across other “silo-ed” groups. Integrating EA into the approach dramatically simplified the process of capturing those wins for organizational reuse.

At Wells Fargo, a BPM Working Group was established with EA as its champion. The business set out to capture the current state of BPM technologies and approaches around a dozen lines of business. The results indicated that there were over 20 different BPM technologies being employed, each with their own varying approaches to implementation (Olding, 2008). In order to maximize the value of BPM, coordination had to occur across these lines of business.

A seasoned Enterprise Architect within the company made use of a communications strategy to raise awareness of the duplicative uncoordinated approaches dotting the landscape. Business analysts, project managers, executives, and technology professionals were engaged and best practices from the various approaches were discussed and reworked into an EA framework.

A year later, senior executives were presented with the best practices from various approaches, which had since been re-developed using a common framework. The commonality gained from the EA framework allowed patterns of success to be easily identified, communicated and ultimately standardized. With senior-level executive backing, the EA framework will persist in the organization, allowing the bank to quickly identify opportunities for standardization.

Burns, Neutens, Newman & Power (2009, pg. 11) state, “Successful EA functions measure, and communicate, results that matter to the business, which in turn only strengthens the message that EA is not simply the preserve of the IT department.” This dovetails into the approach that Wells Fargo’s Enterprise Architect employed; the communication of pertinent information back to various business lines to gain acceptance.

The lessons learned from Wells Fargo’s use of BPM and EA, as paraphrased from Olding (2008, pgs. 5-6):

  • Communicate at all levels of the enterprise.
  • Build BPM adoption from the bottom up. Approach business groups with proven examples and internal successes that will help drive the willingness to adopt new approaches.
  • Facilitate, do not own. Allow business groups to manage their own processes aligned within the framework.
  • Build EA from the top down.
  • Use BPM to derive the needed context and then incorporate it into the EA.

As of 2008, Wells Fargo Financial (a business unit of Wells Fargo & Co.) had nine BPM deployments in production and another four projects in the works. Gene Rawls, VP of continuous development, information services, for Wells Fargo Financial, has stated that not having to reinvent the wheel saves months of development work for every deployment (Feig, 2008). Project turnaround time, from the initial go-ahead for a BPM project to its actual deployment, is just three months.

References:

Bernard, Scott A. (2012). Linking Strategy, Business and Technology. EA3 An Introduction to Enterprise Architecture (3rd ed.). Bloomington, IN: Author House.

Burns, P., Neutens, M., Newman, D., & Power, Tim. (2009). Building Value through Enterprise Architecture: A Global Study. Booz & Co. Retrieved November 14, 2012.

Feig, N. (2008, June 1). The Transparent Bank: The Strategic Benefits of BPM — Banks are taking business process management beyond simple workflow automation to actually measure and optimize processes ranging from online account opening to compliance. Bank Systems + Technology, Vol 31. Retrieved from Factiva database.

Olding, Elise. (2008, December 7). BPM and EA Work Together to Deliver Business Value at Wells Fargo Bank. Retrieved from Gartner October 29, 2012.

Jensen, Claus Torp. (2010, February 10). Continuous improvement with BPM and EA together. Retrieved November 13, 2012.

Project Management Approach For Business Process Improvement. Retrieved November 12, 2012 from http://www.pmhut.com/project-management-approach-for-business-process-improvement

Redshaw, P. (2005, February 24). How Banks Can Benefit From Business Process Management. Retrieved from Gartner October 29, 2012.

Image courtesy of Stuart Miles at FreeDigitalPhotos.net