Top 10 Gotchas of BI in a SaaS World


As we work through a BI implementation on top of a cloud-based ERP, I thought it would be interesting to share some of our insights. As you know from previous posts, Georgetown has been undergoing a large ERP implementation on Workday. We are now live on both HCM and Financials.

The Business Intelligence (BI) project has been running in parallel to this effort, and I wanted to share some initial findings in the event that they may be helpful to others.

Here are our top 10 gotchas:

#10 – Be cognizant of time zones. When you are setting up incremental nightly data pulls, ensure that the user account you are using is in the same time zone as the underlying tenant. Otherwise, you might spend hours validating data that is hours out of sync (e.g., the extract account is in EST while the tenant is in PST).
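
For illustration, here is a minimal Python sketch of anchoring the incremental window to the tenant's time zone rather than the service account's; the zone name and window size are assumptions, not our actual configuration:

```python
# A minimal sketch of pinning an incremental extract window to the
# tenant's time zone rather than the service account's local time.
# The tenant zone ("America/Los_Angeles") is illustrative.
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

TENANT_TZ = ZoneInfo("America/Los_Angeles")  # assume a PST/PDT tenant

def extract_window(hours: int = 24) -> tuple[datetime, datetime]:
    """Return the (start, end) of the nightly pull in tenant time."""
    end = datetime.now(TENANT_TZ)
    start = end - timedelta(hours=hours)
    return start, end

start, end = extract_window()
print(f"Pulling rows updated between {start.isoformat()} and {end.isoformat()}")
```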

#9 – Chop the data into sufficiently sized extracts to prevent timeout issues. This may be less of an issue on a nightly basis, but it matters when you are loading historical data or preparing for a time in which there is a large integration into the ERP. Large integrations (e.g., from your Student Information System into your Financial System) can cause a nightly extract file to balloon and subsequently fail.
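
A rough sketch of the chunking idea, assuming a simple date-windowed backfill (the window size and date range are placeholders):

```python
# A rough sketch of chunking a historical backfill into fixed-size
# date windows so no single extract is large enough to time out.
from datetime import date, timedelta

def date_chunks(start: date, end: date, days: int = 7):
    """Yield (chunk_start, chunk_end) windows covering [start, end)."""
    cursor = start
    while cursor < end:
        chunk_end = min(cursor + timedelta(days=days), end)
        yield cursor, chunk_end
        cursor = chunk_end

for lo, hi in date_chunks(date(2012, 7, 1), date(2013, 7, 1), days=7):
    print(f"extract window: {lo} to {hi}")  # one extract request per window
```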

#8 – Send your team to deep ERP training – immediately! Be prepared to know the ERP as well as the team administering it does.

#7 – Ensure that your BI team receives the security access they require in the SaaS ERP. If this raises policy-related issues, work them through with the respective owners in advance. You may have to work through operational protocols and procedures that you did not expect.

#6 – Plan a strategy for how you will create BI-related content in your SaaS-based ERP. Typically, this development needs to occur in a lower tenant and then be migrated to production. How will you access these environments? What is the path to migrate content? Be prepared for some sandbox tenants to refresh on a weekly basis (i.e., BI content may not survive there for longer than a week). Consider developing in a dev tenant, migrating to sandbox (which may refresh weekly), and then promoting to production. The timing of this process should feed into your planning.

#5 – Set up BI extract notifications. We set up a specific email address that routes to our on-call team. These alerts notify the team if any of the extracts failed so that we can quickly re-extract and reprocess the file as needed.
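
As a sketch of the alert hook – the addresses and SMTP host are hypothetical, and a real setup would hang off the scheduler's failure callback:

```python
# A minimal sketch of the failure alert: if an extract file is missing
# or empty, mail the on-call address. Host and addresses are made up.
import os
import smtplib
from email.message import EmailMessage

ONCALL = "bi-oncall@example.edu"  # hypothetical routing address

def alert_if_failed(path: str) -> None:
    """Email the on-call team when an expected extract looks bad."""
    if os.path.exists(path) and os.path.getsize(path) > 0:
        return  # extract landed; nothing to do
    msg = EmailMessage()
    msg["Subject"] = f"BI extract failed: {path}"
    msg["From"] = "etl@example.edu"
    msg["To"] = ONCALL
    msg.set_content(f"{path} is missing or empty; re-extract and reprocess.")
    with smtplib.SMTP("smtp.example.edu") as server:  # placeholder relay
        server.send_message(msg)

alert_if_failed("/data/extracts/gl_journal_lines.csv")
```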

#4 – Prepare to either learn an API inside and out, or work with custom reports in the SaaS ERP. Remember, there is no direct connection to the database, so you'll need to find alternate ways of efficiently extracting large amounts of data. In Georgetown's case, we wrote custom versions of reports in our SaaS-based ERP and scheduled their output to XML or CSV format. These files are then processed by the nightly ETL routine.
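
For a sense of what this looks like, here is a hedged Python sketch of pulling a scheduled report's CSV output over HTTPS. The URL pattern loosely mimics a report-as-a-service endpoint, but the host, tenant, report name, and credentials are all placeholders – consult your vendor's documentation for the real interface:

```python
# A hedged sketch of pulling a custom report's CSV output over HTTPS.
# Treat the host, tenant, report name, and credentials as placeholders.
import requests

REPORT_URL = (
    "https://services1.myworkday.example/ccx/service/customreport2/"
    "acme_tenant/integration_user/CR_GL_Journal_Lines"  # hypothetical
)

def download_report(dest: str) -> None:
    """Stream the scheduled report output to a local file for ETL."""
    resp = requests.get(
        REPORT_URL,
        params={"format": "csv"},
        auth=("integration_user@acme_tenant", "secret"),  # placeholder creds
        timeout=300,
        stream=True,
    )
    resp.raise_for_status()
    with open(dest, "wb") as fh:
        for chunk in resp.iter_content(chunk_size=65536):
            fh.write(chunk)

download_report("/data/extracts/gl_journal_lines.csv")
```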

#3 – Create a portal for consolidation and collaboration around reporting deliverables. We set up a portal that provides instructions, documentation, links to metadata, the ability to ask for help, report guides, and more. Make this easy for your end users to consume.

#2 – Test, test, and test. SaaS-based ERP tenants typically go down one night a week for patches and maintenance, which can be problematic for nightly BI extracts. Make sure you are aware of the outage schedules and can plan around them; ETL routines may need to start a bit later on the day maintenance occurs.
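
A small sketch of that scheduling tweak, with an invented maintenance day and delay – substitute your vendor's published outage calendar:

```python
# A small sketch of shifting the ETL kickoff on the weekly maintenance
# night. The maintenance day and start times are assumptions.
from datetime import datetime, time

MAINTENANCE_WEEKDAY = 4          # Friday night, for illustration
NORMAL_START = time(1, 0)        # 1:00 AM
DELAYED_START = time(4, 0)       # start later, after patching finishes

def tonight_start(now: datetime | None = None) -> time:
    """Pick the ETL start time based on the maintenance calendar."""
    now = now or datetime.now()
    return DELAYED_START if now.weekday() == MAINTENANCE_WEEKDAY else NORMAL_START

print(f"ETL start tonight: {tonight_start()}")
```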

#1 – Create an operational structure for the ongoing needs of the BI function in advance. Georgetown is embarking upon a 'BI Center of Excellence,' a cross-functional structure composed of subject matter experts who represent the different business units. This will help the business units continue to drive the development of the platform and allow IT to play a supporting role.

 

Ten Steps to Deploy IBM Cognos 10.2.1

Last weekend, my team successfully migrated Georgetown's IBM Cognos software from version 10.1 to 10.2.1. Seems like a piece of cake, right? Well, in our case, not exactly. I thought I'd share a bit about our experience in the event that it may help others.


Our IBM Cognos upgrade effort followed a previously failed attempt to upgrade. After much deliberation with IBM, we concluded that the failure primarily stemmed from extremely old hardware running a dated Solaris OS. Still being fairly new to Georgetown, I dug in a bit further. Indeed – our hardware was ancient, and the Solaris OS was preventing us from getting much-needed support from the vendor (IBM). We had previously implemented IBM Cognos in 2005 on version 8.4. Then, in 2010, we migrated, on the same hardware, to version 10.1. In short, I was dealt a server that was 8-9 years old and software that hadn't been upgraded in at least 3-4 years.

Through several conversations with IBM SMEs, we settled on a proposed six-server architecture (depicted below). I've removed the details for security reasons, but we designed these machines down to the processor and RAM level. We also talked with IBM about which OS would be best suited for us longer term, and landed on Windows Server 2012.

[Figures: Cognos hardware sizing and virtual infrastructure diagrams]

For anyone interested, below are the steps we followed to get this project off the ground:

  1. Proposed the business case and secured funding.
  2. Assessed the current state architecture internally. Made an educated decision on what we felt we needed to support our business requirements for the future state. We were specific – down to the processor and memory level for each machine in the architecture. We lessened the hardware requirements for the DEV and TEST tiers; QA and PROD tiers were identical.
  3. Validated the architecture with the vendor (IBM) and ensured that they supported and recommended the approach.
  4. Altered the architecture based on the vendor's recommendations.
  5. Based upon the agreed architecture, engaged the IBM sales representative to identify the correct licensing. This step will make your head spin a bit. IBM calculates license costs on a Processor Value Unit (PVU) basis – effectively a proprietary formula for how much processing power resides in each of your machines. It is calculated per processor and accounts for the number of cores (see the worked example after this list).
  6. Negotiated and completed the procurement process.  Thankfully, IBM has some decent higher education discounts.  For future operating budgets, please be aware that the licensing does not stay flat.  You’ll typically see a 4-5% increase per year.  Also, for renewals, you might consider working through a business partner (such as CDW-G) to save money on the renewal.
  7. Set up the infrastructure. We chose to do this in a virtual environment, and we stood up the architecture in DEV, TEST, QA, and PROD.
  8. Configured the IBM Cognos software (in this case, 10.2.1).  This is more intensive across the distributed architecture, but well worth the performance and scalability benefit.
  9. Tested, tested, and tested.  We started at the DEV tier and slowly promoted to TEST, QA, and then PROD.  If you have an existing environment already in production, you may consider running the two production environments in parallel for a short period of time.  We did this for about a week and then recopied the content store from the live environment to the new production environment.  It provided an additional level of comfort for testing.
  10. Go live and enjoy the new features of IBM Cognos 10.2.1. Please note – we decided to go live with 10.2.1 on our existing 32-bit packages. As a next phase, we are migrating all of the 32-bit packages to 64-bit. You might consider doing this during the testing phase and deploying all at once.
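
To make the PVU math from step 5 concrete, here is an illustrative tally. A 70-PVUs-per-core rating is commonly cited for x86 server chips, but both it and the core counts below are invented examples, not our actual sizing:

```python
# An illustrative PVU tally for a small virtual Cognos tier. The
# 70-PVUs-per-core rating and the core counts are made-up examples;
# your IBM representative supplies the real figures per processor model.
PVU_PER_CORE = 70  # illustrative rating; varies by processor

cores_by_role = {
    "gateway": 2,           # cores allocated to each VM role
    "dispatcher": 4,
    "content_manager": 4,
}

total_pvus = sum(cores * PVU_PER_CORE for cores in cores_by_role.values())
print(f"Total PVUs to license: {total_pvus}")  # 10 cores x 70 = 700
```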

[Figure: Cognos PVU contract excerpt]

What tips do we recommend?

  1. Ensure your SSL certificate is installed across all of the machines in your architecture. If you install it only on the gateway server(s), the images in some of your reports will be broken: they attempt to load via port 80 (HTTP) instead of port 443 (HTTPS) and are blocked. (A quick verification sketch follows this list.)
  2. The upgrade resets the governor limit on packages. We had to go in and modify each package to restore the limit.
  3. The portal may seem slower initially. Allow the content store several business days to optimize and reindex.  You’ll then see an improvement.
  4. Don’t forget to import the new visualizations and install the mobile capability. Very cool stuff!
  5. Collaborate with IBM. They may offer to provide an overview of the new features to your team.  If you have budget, they may also have optimization recommendations.
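
Here is the quick verification sketch mentioned in tip 1 – a loop that confirms each server presents a valid certificate on port 443 (the hostnames are placeholders):

```python
# A quick check for tip 1: confirm every server in the architecture
# answers on 443 with a valid certificate, so report images don't fall
# back to blocked HTTP. Hostnames are placeholders.
import socket
import ssl

HOSTS = ["cognos-gw1.example.edu", "cognos-app1.example.edu",
         "cognos-cm1.example.edu"]  # hypothetical tier hosts

context = ssl.create_default_context()  # verifies the cert chain
for host in HOSTS:
    try:
        with socket.create_connection((host, 443), timeout=5) as sock:
            with context.wrap_socket(sock, server_hostname=host) as tls:
                print(f"{host}: OK ({tls.version()})")
    except (OSError, ssl.SSLError) as exc:
        print(f"{host}: FAILED - {exc}")
```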

So what are our favorite features thus far?

  1. External object store for report output – helps tremendously with the size of our content store.
  2. New visualizations – very cool!
  3. We also enjoy the Cognos Mobile app, which allows us to share content on mobile devices and push alerts.

[Screenshot: Cognos Workspace]

[Screenshot: Cognos Mobile]

Here’s the full list of new features from IBM:
http://pic.dhe.ibm.com/infocenter/cbi/v10r2m1/index.jsp?topic=%2Fcom.ibm.swg.ba.cognos.ug_cra.10.2.1.doc%2Fc_asg_new_10_2_1.html