Trust, but verify: 5 Considerations to Reconcile Your Data Nightly

Ronald Reagan famously popularized the principle of “trust, but verify,” and many users follow it when working with the multitude of technology solutions they use to do their daily jobs.  It becomes increasingly important to ensure that these users are painting an accurate picture for their leadership team.  Let me explain.

A user logs into, let’s say, Workday Financials in this example.  This financial user runs an operating report for a specific account, fiscal month, and fiscal year.  The user then logs into the associated BI solution and runs a longitudinal report on this same account across two fiscal years for comparison purposes.  At first blush, the user will compare what is seen in Workday to the same fiscal month/year in BI to ensure that the figures match.  They will then review the historical months/years.

What happens when the numbers “don’t match”?  In this example, the user will lose confidence in the BI solution quickly – and will question the figures seen in the longitudinal analysis.  Likely, this user will communicate with others on his or her team – and user adoption will decrease until the problem is acknowledged and corrected.  You’re probably thinking, “well – right…fix it!”.

The larger question is how to corroborate and reconcile the data on a nightly basis so that this does not occur.  And, in the off chance that it does occur, you can be proactive in your communication to end-users to provide transparency around the issue(s).  Below are a few suggestions based upon our experience:

  1. Row Counts – this one is simple.  We perform row count analysis on the data extraction from Workday – and ensure this exact number of rows is being written to the data warehouse. We broke this down further to identify the # of rows in the extract, # of rows inserted, and # of rows updated.
    • Note – we’ve seen some fallout due to inaccurate budget dates, and this error was quickly identified by this method.  The root cause was an erroneous configuration in a nightly business integration.  This is value-add for the BI solution, as it helped to fix an error that was causing data integrity issues.
  2. Overall Debit/Credit Match – This is also a fairly simple check.  In our Informatica Workflow, we’ve setup a step to simply check if the overall debits and credits match.  This is a quick and easy way to check if you have something out of alignment.
  3. Audit by Ledger Type and Period – In Workday, this was a bit more complex.  We wanted to bump the BI data against the Workday Trial Balance report.  The out-of-box Workday Trial Balance report cannot be scheduled for CSV or XML delivery to an SFTP, which is our transfer mechanism.  So we created a custom report which mimics the Trial Balance functionality and attached it to a corresponding EIB.  This allows us to generate the Trial Balance report nightly, load the BI data, and then reconcile against the Workday Trial Balance report.  This is done by ledger type (actuals, obligations, and commitments) by fiscal year, period, and debit/credit.
  4. Audit by Ledger Type, Ledger Account, and Period – This reconciliation is the same as step #3, but we added one more layer of detail.  In this audit, we also added the ledger account.  This bumps the Workday Trial Balance report against the loaded BI data by fiscal year, period, ledger type, ledger account, and debit/credit.
  5. Variance File – Finally, to make entirely sure that everything is in alignment (and that we didn’t overlook anything in steps #1-4), we created a variance file.  If anything is out of alignment, the variances are highlighted in this exception file.  This is useful because when something is out of alignment, we don’t have to scroll through long lists (most of which are in alignment).  A minimal sketch of checks like these appears after this list.
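To make checks #1, #2, and #5 a bit more concrete, here is a minimal sketch of how they might be scripted.  Our actual checks run inside the Informatica workflow; this Python version is purely illustrative, and every file, table, and column name in it is hypothetical.

```python
import csv

def count_extract_rows(extract_path):
    """Check #1: count the data rows in the nightly extract (header excluded)."""
    with open(extract_path, newline="") as f:
        return sum(1 for _ in csv.DictReader(f))

def row_counts_reconcile(extract_rows, rows_inserted, rows_updated):
    """Check #1: every extracted row should be either inserted or updated."""
    return extract_rows == rows_inserted + rows_updated

def debits_match_credits(journal_lines):
    """Check #2: overall debits and credits should net to zero."""
    net = sum(line["debit"] - line["credit"] for line in journal_lines)
    return abs(net) < 0.01  # small tolerance for rounding

def write_variance_file(source_balances, warehouse_balances, out_path):
    """Check #5: write only the keys whose balances do not match."""
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["ledger_type", "fiscal_year", "period", "source", "warehouse"])
        for key, source_amount in source_balances.items():
            warehouse_amount = warehouse_balances.get(key, 0.0)
            if abs(source_amount - warehouse_amount) >= 0.01:
                writer.writerow([*key, source_amount, warehouse_amount])
```

The variance file is then the only thing a person needs to review on a normal morning – an empty file means the night reconciled cleanly.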

Hopefully, these have been helpful tidbits.  As always, I’m open to sharing more.  Please contact me if you’d like to learn more.  Or, please feel free to leave comments about what you are doing.  It’s always great to learn from one another.

Top 10 Gotchas of BI in a SaaS World


As we undergo a BI implementation with a cloud-based ERP provider, I thought it would be interesting to share some of our insights.  As you know from previous posts, Georgetown has been undergoing a large ERP implementation on the Workday ERP.  We are now live on both HCM and Financials.

The Business Intelligence (BI) project has been running in parallel to this effort and I wanted to share some initial findings in the event that it may be helpful for others.

Here are our top 10 gotchas:

#10 – Be cognizant of time zones.  When you are setting up incremental nightly data pulls, ensure that the user account you are utilizing is in the same time zone as the underlying tenant.  Otherwise, you might spend hours validating data that is hours out-of-sync (e.g. the user account used to extract data is in EST while the tenant is in PST).
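As a small illustration of the drift (the time zones and the 24-hour window below are hypothetical), notice how an incremental window defined by the extracting account’s clock lands at a different local boundary in the tenant’s time zone:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

tenant_tz = ZoneInfo("America/Los_Angeles")   # tenant configured on Pacific time
account_tz = ZoneInfo("America/New_York")     # extract account set to Eastern time

# "Last 24 hours" as seen by the extracting account.
window_start = datetime.now(account_tz) - timedelta(hours=24)

# The same instant expressed in the tenant's zone - transactions stamped near
# the boundary in the tenant's zone can be missed or double-counted.
print("Window start (account zone):", window_start.isoformat())
print("Window start (tenant zone): ", window_start.astimezone(tenant_tz).isoformat())
```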

#9 – Chop the data into sufficiently sized extracts to prevent timeout issues.  This may be less of an issue on a nightly basis, but it is important if you are loading historical data or preparing for a time when there is a large integration into the ERP.  Large integrations (e.g. from your Student Information System into your Financial System) can cause a nightly extract file to balloon and subsequently fail.
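A minimal sketch of the chunking idea (the date range is arbitrary, and fetch_chunk is a placeholder for whatever extract mechanism you call per window):

```python
from datetime import date, timedelta

def date_chunks(start, end, days_per_chunk=7):
    """Yield non-overlapping (chunk_start, chunk_end) windows covering [start, end]."""
    current = start
    step = timedelta(days=days_per_chunk)
    while current <= end:
        chunk_end = min(current + step - timedelta(days=1), end)
        yield current, chunk_end
        current = chunk_end + timedelta(days=1)

for chunk_start, chunk_end in date_chunks(date(2012, 7, 1), date(2013, 6, 30)):
    print(f"extracting {chunk_start} .. {chunk_end}")
    # fetch_chunk(chunk_start, chunk_end)  # placeholder for the actual extract call
```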

#8 – Send your team to deep ERP training – immediately!  Be prepared to know the ERP as well as the team administering it.

#7 – Ensure that your BI team receives the appropriate security that they require in the SaaS ERP.  If this gets into policy-related issues, ensure that you work these through with the respective owners in advance.  You may have to work through operational protocols and procedures that you did not expect.

#6 – Plan a strategy for how you will create BI-related content in your SaaS-based ERP.  Typically, this development needs to occur in a lower tenant and be migrated to production.  How will you access these environments?  What is the path to migrate content?  Be prepared that some sandbox tenants may refresh on a weekly basis (i.e. BI content may not reside there for longer than a week).  Consider the notion of development in a dev tenant, migration to sandbox (which may refresh weekly), and then to production.  The timing of this process should feed into your planning.

#5 – Set up BI extract notifications.  We set up a specific email address which routes to our on-call team.  These alerts notify our team if any of the extracts fail so that we can quickly re-run the extract and reprocess the file as needed.
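Our alerts are configured within the extract tooling itself, but the idea reduces to something like the sketch below (the SMTP host, addresses, and the commented-out run_extract call are all placeholders):

```python
import smtplib
from email.message import EmailMessage

def alert_on_failure(extract_name, error_text,
                     to_addr="bi-oncall@example.edu",
                     smtp_host="smtp.example.edu"):
    """Email the on-call address when a nightly extract fails."""
    msg = EmailMessage()
    msg["Subject"] = f"[BI extract failed] {extract_name}"
    msg["From"] = "bi-etl@example.edu"
    msg["To"] = to_addr
    msg.set_content(error_text)
    with smtplib.SMTP(smtp_host) as smtp:
        smtp.send_message(msg)

# Usage sketch: wrap the extract step and alert (then re-raise) on failure.
# try:
#     run_extract("payroll_biweekly")
# except Exception as exc:
#     alert_on_failure("payroll_biweekly", str(exc))
#     raise
```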

#4 – Prepare to either learn the API inside and out, or work with custom reports in the SaaS ERP.  Remember, there will be no direct connection to the database, so you’ll need to find alternate ways of efficiently extracting large amounts of data.  In Georgetown’s case, we wrote custom versions of reports from our SaaS-based ERP and then scheduled their output in an XML or CSV format.  These files are then processed by the nightly ETL routine.
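For illustration, the nightly pick-up step reduces to reading the scheduled CSV output and staging its rows – something like the sketch below (our real processing happens in the ETL tool, and the file path and column names here are hypothetical):

```python
import csv
from pathlib import Path

def stage_extract(extract_path):
    """Yield cleaned rows from a scheduled report extract."""
    with Path(extract_path).open(newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            yield {
                "ledger_account": row["Ledger_Account"].strip(),
                "fiscal_period": row["Fiscal_Period"].strip(),
                "amount": float(row["Amount"] or 0),
            }

# Usage sketch:
# for staged_row in stage_extract("/data/extracts/journal_lines.csv"):
#     upsert_into_staging(staged_row)  # placeholder for the warehouse insert/update
```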

#3 – Create a portal for consolidation and collaboration of reporting deliverables.  We set up a portal which provides instructions, documentation, links to metadata, the ability to ask for help, report guides, etc.  Make this easy for your end-users to consume.

#2 – Test, Test, and Test.  SaaS-based ERP tenants typically go down one night a week for patches and maintenance.  This can be problematic for nightly BI extracts.  Make sure that you are aware of the outage schedules and can plan around them.  ETL routines may need to start a bit later on the day on which maintenance occurs.
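A tiny sketch of the scheduling adjustment (the maintenance night and start times below are placeholders for your tenant’s actual schedule):

```python
from datetime import datetime, time

MAINTENANCE_WEEKDAY = 4                 # e.g. Friday-night maintenance (0 = Monday)
NORMAL_START = time(1, 0)               # 1:00 AM on normal nights
POST_MAINTENANCE_START = time(4, 0)     # wait until the outage window typically ends

def etl_start_time(run_date):
    """Return tonight's ETL start time, shifted on the maintenance night."""
    if run_date.weekday() == MAINTENANCE_WEEKDAY:
        return POST_MAINTENANCE_START
    return NORMAL_START

print(etl_start_time(datetime(2014, 3, 7)))  # a Friday -> 04:00:00
```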

#1 – Create an operational structure for the ongoing needs of the BI function in advance.  Georgetown is embarking upon a ‘BI Center of Excellence’ to handle this sort of cross-functional need; it will be composed of subject matter experts who represent different business units.  This will help the business units continue to drive the development of the platform and allow IT to play a supportive role.

 

Data Governance in Higher Education

Data governance in higher education is an interesting topic.  Most colleges and universities operate in a decentralized fashion to support the various functions of their business.  Consider student records (registrar’s office), student accounts, financial aid, admissions, financial affairs, human resources, advancement, facilities, etc.  In larger universities, decentralized operations are further compounded by multiple campuses and disparate physical/geographic locations.

As data become more prolific in our day-to-day jobs, and big data are ever-increasing, what role does data governance play in the current and future strategy of a higher education institution?  It should play a major role.  But how?

Data governance (image source: www.virtusa.com)

I recently listened to a discussion of Notre Dame’s data governance strategy on Educause.com.  Mike Chapple, Sr. Director of Information Technology Service Delivery, had some interesting insights.

In this interview, he discussed 5 key pillars of Notre Dame’s data governance strategy:

  1. Quality and consistency
  2. Policy and standards
  3. Security and privacy
  4. Compliance
  5. Retention and archiving

He also discussed the use of tools such as the RACI matrix to help steward various data governance processes.  All of these topics are relevant – and challenging to centralize.

We’re undergoing a similar effort at Georgetown to address some of these areas and to create a “BI Center of Excellence”.  We’ve engaged Gartner to learn more about the broader higher education landscape and to glean best practices.  This will become increasingly important as we roll out Workday Financials this July.  I look forward to these discussions and the opportunity to help better structure reporting operations and data governance in a central operation.

What is your organization doing?  Where does reporting/BI fall organizationally within your university?  Under the CIO, CFO, or Provost?  Or, do you have a neutral reporting/data governance organization that serves everyone?  Or, do you have reporting centers for each campus that report organizationally into a broader reporting unit?

These are some of the questions that we are trying to answer as we design the future state of how reporting and BI are handled at Georgetown.  If you are a member of HEDW, they just released an informative data governance survey which includes data from about 25 universities.  As we progress further down this path, I will post again about what we ultimately decide to support Georgetown’s efforts.

 

Business Intelligence with Workday

As part of my current project at Georgetown University, we have a requirement to extract large volumes of data for business intelligence purposes from Workday.  We’ve been running Workday HCM for about 2 years and plan to go-live with Workday Financials this July.  Historically, the university has been largely an advocate of on-premise solutions – and Workday represents one of the first large steps we’ve taken to move ERP data “into the cloud”.


What does this mean for business intelligence?  Let’s dive a bit deeper.

Because Workday is a SaaS (software-as-a-service) vendor, a direct on-premise database connection isn’t an option.  As such, we had to explore options to get data out of the Workday platform – and to do it quickly and efficiently.

Below are a few data extraction methods which we considered:

  1. Utilize Workday Studio and leverage the web services-based Workday API
  2. Join required Workday objects in a custom report and expose them as a RaaS (report-as-a-service)
  3. Utilize Informatica Cloud

We quickly ruled out option #3.  We enjoy using Informatica and thought this might be a quick win.  Unfortunately, when aligning the data options available in Informatica Cloud with our requirements, we found several gaps (e.g. missing data, data not defined down to the appropriate grain, etc.).  Informatica offered to partner with us to expand capabilities (much appreciated), but unfortunately our timelines did not permit this to occur for our current project.  We continue to use Informatica PowerCenter as our best-in-class data integration tool.


So, which option was better: #1 (API) or #2 (RaaS)?  This question was kicked around for quite some time.  After much deliberation, we decided on option #2 (RaaS).  You’re probably already asking why.  Isn’t the API faster?  Below are the results of our analysis:

  1. The Workday API was hitting a governor limit for our tenant.  This may have been a specific configuration setting for our tenant, but it made it prohibitive to pull large amounts of data out of the API.  The API was timing out for certain data pulls at the two-hour mark.  It also required an API developer to manage.  That said, Workday has done a nice job of documenting their API, which we found extremely helpful along the way.  I’m sure we’ll have a future use for the API.
  2. Custom Reports allowed us the flexibility to join multiple objects to meet our requirements.  The API calls were limited to what is publicly exposed in each call, whereas Custom Reports gave us a bit more flexibility to combine the objects which support our requirements.
  3. Custom Reports are built using the Workday GUI.  This is an added benefit, as it does not require a developer to maintain API calls.  We can have a functional resource maintain the custom reports which are used for data extraction.
  4. Custom Reports (so far) have been able to support the large data extraction that we require – without timing out.  For example, we have been able to extract about 200K bi-weekly payroll records in about 12 minutes.  We will use this Custom Report-delivered-by-EIB mechanism to support incremental loads to our Finance and HCM data warehouses.


Using Workday’s Enterprise Interface Builder (EIB), we have been able to schedule the Custom Reports and have them output to a text file format which is conducive to loading into our business intelligence platform.  The EIB tool provides an easy-to-use, graphical, guided interface to define inbound and outbound integrations without requiring any programming.  Used by the majority of Workday customers, the EIB can be used by both business and IT users to address a variety of integration needs.
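For readers who would rather pull a custom report directly than wait for a scheduled file, the same reports can be exposed as a web service (RaaS) and retrieved over HTTPS.  The sketch below is purely illustrative – the URL shape, tenant, report name, and credentials are placeholders and will differ in your tenant:

```python
import csv
import io
import requests

# Hypothetical RaaS endpoint for a custom report exposed as a web service.
REPORT_URL = "https://wd-host.example.com/ccx/service/customreport2/tenant/owner/GL_Detail_Extract"

response = requests.get(
    REPORT_URL,
    params={"format": "csv"},
    auth=("integration_user@tenant", "password-from-vault"),  # placeholder credentials
    timeout=600,
)
response.raise_for_status()

rows = list(csv.DictReader(io.StringIO(response.text)))
print(f"Pulled {len(rows)} rows from the custom report")
```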


Our project is still underway.  I’ll write another post with more details as our project moves further toward completion.   If you have any thoughts or suggestions regarding Workday data extraction / integration, please feel free to comment.  Suggestions and comments are welcome!

Ten Steps to Deploy IBM Cognos 10.2.1

Last weekend, my team was able to successfully migrate Georgetown’s IBM Cognos software from version 10.1 to 10.2.1.  Seems like a piece of cake, right?  Well, in our case, not exactly.  I thought I’d share a bit about our experience in the event that it may help others.


Our IBM Cognos upgrade effort followed a previously failed upgrade attempt.  After much deliberation with IBM, it was determined that the failed attempt primarily related to the fact that we had extremely old hardware and were running on a dated Solaris OS.  Still being fairly new to Georgetown, I dug in a bit further.  Indeed, our hardware was ancient and the Solaris OS was preventing us from getting much-needed support from the vendor (IBM).  We had previously implemented IBM Cognos in 2005 on version 8.4.  Then, in 2010, we migrated, on the same hardware, to version 10.1.  Given these facts, I was dealt a server which was 8-9 years old, and software that hadn’t been upgraded in at least 3-4 years.

Through several conversations with IBM SMEs, we settled on a proposed six-server architecture (depicted below).  I’ve removed the details for security reasons, but we designed these machines down to the processor and RAM level.  We also had conversations with IBM about which OS would be best suited for us longer term.  We landed on Windows Server 2012.

Cognos hardware sizing and virtual infrastructure diagrams

For anyone interested, below is the series of steps that we followed to get this project off the ground:

  1. Proposed business case and secured funding
  2. Assessed the current-state architecture internally.  Made an educated decision on what we felt we needed to support our business requirements for the future state.  We were specific – down to the processor and memory level for each machine in the architecture.  We lessened the hardware requirements for the DEV and TEST tiers.  The QA and PROD tiers were identical.
  3. Validated the architecture with the vendor (IBM) and ensured that they supported and recommended the approach.
  4. Altered the architecture post vendor recommendation.
  5. Based upon the agreed architecture, engaged the IBM sales representative to identify the correct licensing.  This step will make your head spin a bit.  IBM calculates license costs on a Processor Value Unit (PVU) basis.  Effectively, it is a proprietary formula used to calculate how much processing power resides in each of your machines.  This is done by processor type and accounts for the number of cores (a rough illustration follows this list).
  6. Negotiated and completed the procurement process.  Thankfully, IBM has some decent higher education discounts.  For future operating budgets, please be aware that the licensing does not stay flat.  You’ll typically see a 4-5% increase per year.  Also, for renewals, you might consider working through a business partner (such as CDW-G) to save money on the renewal.
  7. Set up the infrastructure.  We chose to do this in a virtual environment.  We also set up the architecture in DEV, TEST, QA, and PROD.
  8. Configured the IBM Cognos software (in this case, 10.2.1).  This is more intensive across the distributed architecture, but well worth the performance and scalability benefit.
  9. Tested, tested, and tested.  We started at the DEV tier and slowly promoted to TEST, QA, and then PROD.  If you have an existing environment already in production, you may consider running the two production environments in parallel for a short period of time.  We did this for about a week and then recopied the content store from the live environment to the new production environment.  It provided an additional level of comfort for testing.
  10. Go-live and enjoy the new features of IBM Cognos 10.2.1.  Please note – we decided to go-live with 10.2.1 on our existing 32-bit packages.  As a next phase, we are migrating all of the 32-bit packages to 64-bit.  You may consider this during the testing phase and deploy all at once.
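As a rough illustration of the PVU math from step #5 – the per-core rating comes from IBM’s PVU table and varies by processor, so the numbers below are purely hypothetical:

```python
# Hypothetical per-core PVU rating; look up the real value for your processors
# in IBM's PVU table.
PVU_PER_CORE = 70

servers = {                # licensed cores per machine (hypothetical)
    "gateway": 4,
    "application-1": 8,
    "application-2": 8,
}

total_pvus = sum(cores * PVU_PER_CORE for cores in servers.values())
print(f"Total PVUs to license: {total_pvus}")   # 20 cores x 70 = 1,400 PVUs
```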


What tips do we recommend?

  1. Ensure your SSL certificate is installed across all of the machines in your architecture.  If you only install on the gateway server(s), the images on some of your reports will be broken.  They attempt to run via port 80 (HTTP) instead of port 443 (HTTPS) and are blocked.
  2. The governor limit on packages is reset.  We had to go in and modify each package to reset this limit.
  3. The portal may seem slower initially. Allow the content store several business days to optimize and reindex.  You’ll then see an improvement.
  4. Don’t forget to import the new visualizations and install the mobile capability. Very cool stuff!
  5. Collaborate with IBM. They may offer to provide an overview of the new features to your team.  If you have budget, they may also have optimization recommendations.

So what are our favorite features thus far?

  1. External object store for report output – helps tremendously with the size of our content store.
  2. New visualizations – very cool!
  3. We also enjoy the Cognos Mobile app which allows us to share content on mobile devices and push alerts.



Here’s the full list of new features from IBM:
http://pic.dhe.ibm.com/infocenter/cbi/v10r2m1/index.jsp?topic=%2Fcom.ibm.swg.ba.cognos.ug_cra.10.2.1.doc%2Fc_asg_new_10_2_1.html

Considering BI? Follow these 10 steps for success!

Business intelligence.  It consistently ranks as one of the top priorities in various CIO and IT surveys.  In January 2013, Gartner conducted an executive program survey of over 2,000 CIOs.  I’ve listed the top 10 business and technology priorities from the survey below.

Top 10 Business Priorities

  1. Increasing enterprise growth
  2. Delivering operational results
  3. Reducing enterprise costs
  4. Attracting and retaining new customers
  5. Improving IT applications and infrastructure
  6. Creating new products and services (innovation)
  7. Improving efficiency
  8. Attracting and retaining the workforce
  9. Implementing analytics and big data
  10. Improving business processes

Top 10 Technology Priorities

  1. Analytics and business intelligence
  2. Mobile technologies
  3. Cloud computing (SaaS, IaaS, PaaS)
  4. Collaboration technologies (workflow)
  5. Legacy modernization
  6. IT management
  7. CRM
  8. Virtualization
  9. Security
  10. ERP Applications

SaaS = software as a service; IaaS = infrastructure as a service; PaaS = platform as a service
Source: Gartner Executive Programs (January 2013)

That said, it is well documented that many business intelligence (BI) projects fail.  Are you planning to implement a BI and analytics program?  Maybe thinking about big data?

Here are a few lessons learned that will put you on the path to success:

  1. Ensure executive sponsorship from the start of your project.  This may seem like a simple thing, but ensuring that the executives stand behind your project is critically important.  Before the project starts, lay out the business plan, create budgets (operating/capital), document key stakeholders, and set up a governance structure.  Keep these executives apprised of progress throughout and ensure they see the business value that you are providing.
  2. Understand your current software agreements – and choose the right software/database platform for your organization.  Many people ask me – what is your favorite BI software/database platform?  My answer is that it always depends.  It depends on what contracts your company already has in place.  It depends on the skills and expertise of your staff.  It depends on the budget of your project.  The net is that there are a variety of really good BI tools on the market.  To name a few – Microsoft, Oracle, MicroStrategy, IBM, Teradata, etc.  For a small-scale rapid BI implementation, consider cloud-based tools such as Intuit’s Quickbase.
  3. Be inclusive during requirements gathering – don’t design in a bubble.  IT departments often get frustrated with business users, as they feel they can get the technical solution in place much more quickly without the business users’ involvement.  While this is probably true, if you don’t get critical buy-in from these business users, your initiative will ultimately fail.  The business users need to understand that the system will support their needs and requirements.  This is also critical when you get to the decommissioning phase (item #9 below).
  4. Employ a professional services team to help you.  This is not strictly necessary, but in my opinion it is tremendously helpful.  Your staff may or may not have dedicated time for the project.  Bringing on a few technical consultants and a project manager can really help to drive the project forward.  In addition, they hold an objective opinion and can help facilitate communication and decisions among departments.
  5. Don’t overlook security.  Security is often over-engineered in BI projects.  Please remember that BI projects don’t need to have ERP-level security.  You may consider relaxing the security model a notch in order to gain better performance.  During your design, you may also identify that the users of the BI platform are power-users by nature.  The platform doesn’t have to be available to everyone, so you may consider allowing a greater level of access to a small number of “power-users”.  This will depend on your specific deployment, but make sure you plan to discuss security early in the project and don’t leave it as an afterthought.
  6. Document data definitions and ensure buy-in on each definition across groups.  Data definitions can be the most challenging part of a successful BI project – particularly if you have multiple groups involved and each has its own definition of one piece of data.  This is a tedious process, but be diligent in working through definitions for your data, particularly for any calculated fields.  You may consider software packages that help you to manage this workload (e.g. Informatica’s Business Glossary).
  7. Keep it simple for the end-user and consider a portal.  Presentation is also important.   In a recent implementation, I used Microsoft’s SharePoint 2013 and its Team Site capability to disseminate reports and data by department.  We used team sites for the various departments and a public site to bring all of the departments together.  Consider building a portal similar to this for your end-users.  This makes the report/analytics delivery seamless and easy.  In an ever-growing mobile world, ensure that the portal is mobile capable.  Your users will want to access data when on the go.
  8. Allow for collaboration, but in a controlled environment.  Control the data model and only expose elements that have been approved by your data stewards.  Allow for power-users to create their own reports, but do it in a controlled way.  Expose approved data models to them.   Keep the report creation controlled to departments so that you can keep the solution clean and tidy.  You may even consider a workspace for the custom reports/analytics so that it can be separated from the standard content that is developed and available with the platform.
  9. Decommission old reporting solutions.  During go-live, don’t forget to decommission the legacy reporting solutions.  If you do not, people will continue to use them.  Caution:  if you decommission them, you’ll need executive sponsorship (item #1) and also assurance that you’ve captured the requirements from this system (item #3).
  10. Constantly innovate and evolve your new platform.  Don’t let the platform go-live and become immediately stale.  Engage the end-users and have a constant feedback loop established with them.  Get them engaged to have them help you innovate and build additional content that will better serve their community.  Hold a user-group for key power users so that they can collaborate and share lessons learned.

TrustRadius – A great new Technology Crowdsourcing concept from Vinay Bhagat

Earlier in the year, I wrote an article about Technology Crowdsourcing.  There is a new player in this space named TrustRadius.


The talented Vinay Bhagat and his team have created a unique platform which compiles software reviews sourced from product experts.  Key differentiators in this new platform include:

  1. TrustRadius is really focused on trying to source in-depth insights from primary product users/experts.
  2. The reviews on TrustRadius are structured, and soon reviewers will be able to add to and amend them over time.  Structure allows for curation into things like comparisons.

They launched in May, were funded by the Mayfield Fund in July, and are rapidly starting to scale.  So far, they have 23k monthly visitors and are growing at 30% per month.

For interested higher education folks, I’ve posted a few reviews there that you may find interesting, including a review of Microsoft Business Intelligence at TrustRadius.

Taboo? Microsoft in Higher Education

In a recent post, I discussed the changes in the business intelligence landscape as outlined by Gartner in their 2013 Magic Quadrant.  Today, I wanted to focus solely on Microsoft as a vendor in this space.  Yes, I mentioned Microsoft – and I work in Higher Education!

Gartner Magic Quadrant for Business Intelligence & Analytics - Comparison of 2012 to 2013

In working with a number of higher education institutions over the years, I have often heard direct concerns about “Microsoft.”  In the academic world, we care about doing things in the most open way.  We like to share – as you may have noticed from the adoption of Sakai and the Open Source Portfolio (OSP).

Open-source tools have become prevalent over the last few decades.  You now see many organizations running miscellaneous versions of Linux, open-source wiki tools, and Drupal-type content management systems – and now many have implemented Google (Google Drive, Google Docs, Gmail).  If you mention “Microsoft” – you’d better start running.  You’ll have someone from IT chasing after you pretty quickly – and not in a good way!

OK – you’re not Jack Sparrow, so you can relax a bit!  But you can imagine the feelings of many of these IT organizations when you start to implement enterprise-level software that carries a significant cost and whose source is proprietary.  Think SunGard’s Banner (now Ellucian), or PeopleSoft, and maybe even Workday now in some cases.  Somehow, Oracle has slipped through the cracks, as many of these large ERP vendors require Oracle’s database platform.  Oracle was also smart and acquired MySQL – so it has an almost natural support from the open-source community.  Oracle is an investment, too.

You’re probably asking – what’s your point?  My point is that Microsoft isn’t bad.  It’s actually very, very GOOD!  Besides the educational licensing, and the obvious love for Microsoft Office (Excel, Word, PowerPoint, et al.), let’s look at some of the benefits of Microsoft’s SQL Server platform.  Let’s start with a basic point that is often overlooked: SQL Server is a suite of tools, not simply a database platform.  I have listed a basic table below, but you can read more on Microsoft’s website.

Server components and what they include:

  • SQL Server Database Engine – the core service for storing, processing, and securing data, plus replication, full-text search, tools for managing relational and XML data, and the Data Quality Services (DQS) server.
  • Analysis Services (SSAS) – the tools for creating and managing online analytical processing (OLAP) and data mining applications.
  • Reporting Services (SSRS) – server and client components for creating, managing, and deploying tabular, matrix, graphical, and free-form reports.  Reporting Services is also an extensible platform that you can use to develop report applications.
  • Integration Services (SSIS) – a set of graphical tools and programmable objects for moving, copying, and transforming data.  It also includes the Data Quality Services (DQS) component for Integration Services.
  • Master Data Services (MDS) – the SQL Server solution for master data management.  MDS can be configured to manage any domain (products, customers, accounts) and includes hierarchies, granular security, transactions, data versioning, and business rules, as well as an add-in for Excel that can be used to manage data.

The great part of purchasing Microsoft SQL Server is that these tools come out of the box – and are included with the license for the database platform.  There are several different editions which provide more or less horsepower as your project requires, but it is an added bonus that Microsoft bundles these tools.

Here are a few thoughts from my experience and why I enjoy working with Microsoft BI tools:

Technical Benefits:

  • Relatively easy to deploy and installation is wizard-based
  • Learning curve to adopt SSRS and SSIS is reasonable in comparison with other tools
  • Direct integration with Windows operating system and Active Directory (this is great if you have a nice active directory structure already in place; not so helpful if you do not).
  • Direct integration with Team Foundation Server (TFS) for version control
  • Platform is sophisticated enough to handle complex tasks (i.e. stored procedures, SSRS data driven subscriptions)

Functional Benefits:

  • All-in-one solution (combine with SharePoint for full functionality)
  • End-user tools are intuitive and within a familiar Microsoft interface
  • SharePoint can be used to pull information together in a one-stop-shop
  • Office integration (i.e. Excel, PowerPivot)

Cost Benefits:

  • Educational and non-profit discounts are a nice way for Microsoft to give back.
  • License costs, on average, are lower than combining multiple tools from multiple vendors (this always depends on your situation and the license agreements that you have in place).
  • Total cost of ownership (TCO) tends to be lower.  This is due to the license fees and also the availability of technical resources that are familiar with the Microsoft platform.  Again, this is completely dependent on your situation, but this is what I have seen with other clients.  It may also be indirect, but by having all of these tools with one vendor, you spend less time managing 4 or 5 invoices for maintenance and renewals as well.  And, if you need to renegotiate anything – it is again done with a single vendor not 4 or 5.

My Favorite Features:

  1. SQL Server Management Studio (SSMS) – it seems silly, but this is a great tool and I enjoy testing my queries within SSMS prior to loading them into SSRS.  It has some really nice features built in for ETL developers as well (e.g. the ability to script the creation/deletion of a table with a mouse click).
  2. SSIS Package Variables – I use them frequently to make dynamic filenames in my SSIS routines.  They are flexible and allow SSIS routines to handle a number of complexities that would otherwise be very difficult to address.
  3. Data-driven subscriptions – this is a great way to deliver tailored content to your user base.  Same report…different content.  In a previous consulting organization, I used data-driven subscriptions to improve internal processes and implementation times for external projects.
  4. PowerPivot – Let’s be honest.  It’s just cool!  In-memory BI is a hot topic.  We also like tools like Tableau and Qlikview.
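Regarding the dynamic filenames mentioned in #2 above, here is a tiny sketch of the kind of dated filename a package-variable expression typically builds (shown in Python purely for illustration; the directory and extract name are hypothetical):

```python
from datetime import date

base_dir = "/data/extracts"          # hypothetical landing directory
extract_name = "gl_journal_lines"    # hypothetical extract name

# e.g. /data/extracts/gl_journal_lines_20130715.csv
dynamic_filename = f"{base_dir}/{extract_name}_{date.today():%Y%m%d}.csv"
print(dynamic_filename)
```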

Warning Signs That You Need Better Business Intelligence

Many people ignore the warning signs.  My memory immediately flashes to warning labels on cigarette packaging or the famous 80’s commercial by Partnership for a Drug-free America (This is your brain…this is your brain on drugs).

Unfortunately, warning signs for outdated business intelligence systems may not be that clear…and are much more easily ignored.  Let’s start with a few basic questions:

  1. Do you often go to the ‘data guy’ to get reports or data from your ERP systems?
  2. Do you have to contact multiple people to get the data that you require to run a report?
  3. Is there a central portal where you visit for reporting needs?
  4. Are you able to answer the majority of your business questions without involving the IT department?
  5. Are you stuck with canned reports, or do you have the flexibility to run in-memory BI with tools such as Microsoft PowerPivot, Qlikview, or Tableau?
  6. Are you able to write your own reports, and more importantly, understand the data that is available?
  7. Do you have dashboards set up to govern key business areas?
  8. Are you able to share those dashboards in real-time to further collaborate?
  9. Does your business intelligence system allow you to look across functional areas to compare and contrast?
  10. Is there a process in place to request that additional data be added to the system?

Now that you’ve had a chance to think about the questions above, let’s look at what I would consider a positive and negative response to each question.

  1. Do you often go to the ‘data guy’ to get reports or data from your ERP systems?  Positive response: No / Negative response: Yes
  2. Do you have to contact multiple people to get the data that you require to run a report?  Positive response: No / Negative response: Yes
  3. Is there a central portal where you visit for reporting needs?  Positive response: Yes / Negative response: No
  4. Are you able to answer the majority of your business questions without involving the IT department?  Positive response: Yes / Negative response: No
  5. Are you stuck with canned reports, or do you have the flexibility to run in-memory BI with tools such as Microsoft PowerPivot, Qlikview, or Tableau?  Positive response: you have the flexibility / Negative response: you are stuck with canned reports
  6. Are you able to write your own reports, and more importantly, understand the data that is available?  Positive response: Yes / Negative response: No
  7. Do you have dashboards set up to govern key business areas?  Positive response: Yes / Negative response: No
  8. Are you able to share those dashboards in real-time to further collaborate?  Positive response: Yes / Negative response: No
  9. Does your business intelligence system allow you to look across functional areas to compare and contrast?  Positive response: Yes / Negative response: No
  10. Is there a process in place to request that additional data be added to the system?  Positive response: Yes / Negative response: No

Use these questions as a quick litmus test.  If you answered negatively to 4 or more of the questions above, you should revisit your business intelligence strategy.  If you answered positively to 6 or more of the questions above, you are likely headed in the right direction.  That being said, it is never a bad idea to reassess and evaluate strategies to improve your business intelligence environment.  Considering the constant change and fantastic software tools that are out there – you can always find something that will add to the value of your BI strategy.  Caution:  many of these tools are expensive, so evaluate carefully and find the right fit for your budget and your organization.

If you would like more explanation as to the responses to my questions above, I have provided them below:

Do you often go to the ‘data guy’ to get reports or data from your ERP systems?

I have worked in many organizations to find that only one specific person, or a small group of people, have access to organization-wide data.  If you find yourself constantly going to one person to get the data that you require…or to get ‘that report’ written, then the business intelligence system is failing.  A proper business intelligence system should allow you to get the types of data that you require as an end user.  It may impose appropriate security on the data, but it should allow you to get access to the data that is required to do your job.

Do you have to contact multiple people to get the data that you require to run a report?

If you find yourself contacting one person in Finance to get the finance data – and one person in Human Resources to get the HR data – and one person in Sales to get the sales data, then the business intelligence system is failing.  An appropriate BI solution should bring this data together and make it accessible cross-functionally.

Is there a central portal where you visit for reporting needs?

Given the cross-functional nature of any business intelligence solution, it is important to have a BI portal in place.  This portal may address key information such as:

  • Report glossary
  • Data definitions
  • Training
  • Change requests
  • Project activity
  • Maintenance schedules

Without such a portal, this information is difficult to cobble together and it leads to confusion as end-users are visiting multiple locations to find this information.  This portal serves as a one-stop shop for business intelligence, reporting, and analytics needs.

Are you able to answer the majority of your business questions without involving the IT department?

This question is two-fold.  The first part is to ensure that the business intelligence system has the data required to fulfill your request.  The second part is to ensure that, if the system does have all of the data required, it is accessible to you.  Security is important, but it can also be over-engineered.  I recently stepped into a BI project where each report was executing against a security table with over 3.5 million rows of data.  This killed performance and made the experience so frustrating for end-users that they started to spawn their own shadow systems.  Not good.  Make sure that you have the appropriate security in place for your data, but remember that security for BI projects will likely not be the same as the transactional security that you have set up in your ERP systems.

Are you stuck with canned reports, or do you have the flexibility to run in-memory BI with tools such as Microsoft PowerPivot, Qlikview, or Tableau?

Business users will always find a report that isn’t written…or a permutation of an existing report that is required to fulfill a specific need.  For the canned or standard reports that you make available, try to make them as flexible as possible with report parameters.  These filters allow you to further refine the report for various users.  Once the user community becomes used to the environment, they will quickly outgrow canned reports.  This is where in-memory BI plays a big part.  Allow power-users the flexibility of further analyzing data using in-memory tools.  I’ve seen this done very well with Microsoft PowerPivot, Qlikview, and Tableau among others.  This collaborative analysis may lead to future canned reports or dashboards that will provide benefit to the larger community.  And, if not, it provides the flexibility to perform quick cross-functional analysis without the effort of creating a full report.  In the case of Qlikview, users can collaborate and share dashboard data in real-time.  This is very neat.

Are you able to write your own reports, and more importantly, understand the data that is available?

If your business intelligence environment is setup correctly, power-users should have the ability to write custom reports and save them either to a public or personal directory.  If the BI implementation has been done well, power-users will also understand the metadata which fuels the reports.  In a recent implementation, I’ve seen this done well with IBM Cognos and their Framework Manager.

Do you have dashboards setup to govern key business areas?

Dashboards are always the eye candy of a BI implementation.  With the proliferation of good dashboard tools, they are an integral part of a modern BI solution and a key aid in driving business decisions.

Are you able to share those dashboards in real-time to further collaborate?

Not only are dashboards great for helping the C-level suite make key business decisions, but they are also becoming a fantastic way to collaborate and share information.  As mentioned above, some of the more advanced dashboard tools allow you to share dashboard information in real-time.  This could be as simple as selecting a few parameters and sending over a copy of a dashboard that you’ve modified.  In some of the nicer tools, it could mean sharing a session and collaborating on the dashboard in real-time.

Does your business intelligence system allow you to look across functional areas to compare and contrast?

This may seem like a silly question, but it is an important one.  Does your business intelligence system encompass all of your key ERP systems?  If not, you need to reevaluate and ensure that all key business systems are brought into the BI solution.  Once the data from the key business systems is standardized and brought into the BI solution, you can start to join these data together for insightful cross-functional analysis.

Is there a process in place to request that additional data be added to the system?

Through exploration, collaboration, and report writing, you may identify data that has not been loaded into the BI solution.  This could be as simple as a data field or as complex as another transactional system.  Either way, you should ensure that your BI team has a process in place to queue up work items.  The BI solution needs to evolve as business needs are identified.  Personally, I have gained a lot of benefit from using Intuit’s Quickbase software to manage this process through technology.  Quickbase has a nice blend of tools that facilitate collaboration and allow end-users to submit requests without a lot of effort.

As you evaluate your BI solution, also ensure that your ETLs are loading data at the appropriate interval.  I’ve seen many implementations where the data is so dated that it becomes useless to the end-users and impacts adoption of the platform.  Try to run ETL processes as frequently as possible.  In many implementations that I’ve completed, we’ve set data up to refresh nightly.  This isn’t always possible given the volume, but fresh data always allows for better adoption of the platform.