Warning Signs That You Need Better Business Intelligence

Many people ignore warning signs.  My memory immediately flashes to the warning labels on cigarette packaging or the famous ’80s commercial by the Partnership for a Drug-Free America (This is your brain…this is your brain on drugs).

Unfortunately, warning signs for outdated business intelligence systems may not be that clear…and are much more easily ignored.  Let’s start with a few basic questions:

  1. Do you often go to the ‘data guy’ to get reports or data from your ERP systems?
  2. Do you have to contact multiple people to get the data that you require to run a report?
  3. Is there a central portal that you visit for reporting needs?
  4. Are you able to answer the majority of your business questions without involving the IT department?
  5. Are you stuck with canned reports, or do you have the flexibility to run in-memory BI with tools such as Microsoft PowerPivot, Qlikview, or Tableau?
  6. Are you able to write your own reports, and more importantly, understand the data that is available?
  7. Do you have dashboards set up to govern key business areas?
  8. Are you able to share those dashboards in real-time to further collaborate?
  9. Does your business intelligence system allow you to look across functional areas to compare and contrast?
  10. Is there a process in place to request that additional data be added to the system?

Now that you’ve had a chance to think about the questions above, let’s look at what I would consider a positive and negative response to each question.

| No. | Question | Positive response |
|-----|----------|-------------------|
| 1 | Do you often go to the ‘data guy’ to get reports or data from your ERP systems? | No |
| 2 | Do you have to contact multiple people to get the data that you require to run a report? | No |
| 3 | Is there a central portal that you visit for reporting needs? | Yes |
| 4 | Are you able to answer the majority of your business questions without involving the IT department? | Yes |
| 5 | Are you stuck with canned reports, or do you have the flexibility to run in-memory BI with tools such as Microsoft PowerPivot, Qlikview, or Tableau? | Flexibility |
| 6 | Are you able to write your own reports, and more importantly, understand the data that is available? | Yes |
| 7 | Do you have dashboards set up to govern key business areas? | Yes |
| 8 | Are you able to share those dashboards in real-time to further collaborate? | Yes |
| 9 | Does your business intelligence system allow you to look across functional areas to compare and contrast? | Yes |
| 10 | Is there a process in place to request that additional data be added to the system? | Yes |

Use these questions as a quick litmus test.  If you answered negatively to 4 or more of the questions above, you should revisit your business intelligence strategy.  If you answered positively to 6 or more, you are likely headed in the right direction.  That said, it is never a bad idea to reassess and evaluate strategies to improve your business intelligence environment.  Considering the constant change and the fantastic software tools out there, you can always find something that will add value to your BI strategy.  Caution: many of these tools are expensive, so evaluate carefully and find the right fit for your budget and your organization.
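
If you want to make that tally concrete, here is a minimal sketch in Python; the answers shown are purely illustrative and not tied to any real assessment:

```python
# Illustrative self-assessment for the ten questions above.
# True marks the positive response for a question; False marks the negative one.
answers = [False, False, True, True, False, True, True, False, True, True]

positives = sum(answers)
negatives = len(answers) - positives

# The thresholds from the paragraph above: 4+ negatives -> revisit;
# 6+ positives -> likely on track.
if negatives >= 4:
    print("Revisit your business intelligence strategy.")
elif positives >= 6:
    print("You are likely headed in the right direction.")
```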

If you would like more explanation of the responses above, I have provided it below:

Do you often go to the ‘data guy’ to get reports or data from your ERP systems?

I have worked in many organizations and found that only one specific person, or a small group of people, has access to organization-wide data.  If you find yourself constantly going to one person to get the data that you require…or to get ‘that report’ written, then the business intelligence system is failing.  A proper business intelligence system should allow you, as an end user, to get the types of data that you require.  It may impose appropriate security on the data, but it should give you access to the data that is required to do your job.

Do you have to contact multiple people to get the data that you require to run a report?

If you find yourself contacting one person in Finance to get the finance data – and one person in Human Resources to get the HR data – and one person in Sales to get the sales data, then the business intelligence system is failing.  An appropriate BI solution should bring this data together and make it accessible cross-functionally.

Is there a central portal that you visit for reporting needs?

Given the cross-functional nature of any business intelligence solution, it is important to have a BI portal in place.  This portal may house key information such as:

  • Report glossary
  • Data definitions
  • Training
  • Change requests
  • Project activity
  • Maintenance schedules

Without such a portal, this information is difficult to cobble together, and end-users end up visiting multiple locations to find it, which leads to confusion.  The portal serves as a one-stop shop for business intelligence, reporting, and analytics needs.

Are you able to answer the majority of your business questions without involving the IT department?

This question is two-fold.  The first part is ensuring that the business intelligence system has the data required to fulfill your request.  The second part is ensuring that, if the system does have all of that data, it is accessible to you.  Security is important, but it can also be over-engineered.  I recently stepped into a BI project where each report ran against a security table with over 3.5 million rows.  This killed performance and made the experience so frustrating for end-users that they started to spawn their own shadow systems.  Not good.  Make sure that you have the appropriate security in place for your data, but remember that security for BI projects will likely not be the same as the transactional security that you have set up in your ERP systems.
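
The simpler alternative I aim for is to resolve a user’s entitlements once and reuse them, rather than joining every report against a multi-million-row security table.  Here is a minimal sketch in Python/pandas; the table and column names are hypothetical:

```python
from functools import lru_cache

import pandas as pd

# Hypothetical extract of a row-level security table (user_id -> allowed department).
security = pd.DataFrame({
    "user_id": ["alice", "alice", "bob"],
    "department": ["Finance", "HR", "Sales"],
})

@lru_cache(maxsize=None)
def allowed_departments(user_id: str) -> frozenset:
    """Resolve a user's entitlements once per session instead of once per report."""
    rows = security.loc[security["user_id"] == user_id, "department"]
    return frozenset(rows)

def secure_report(data: pd.DataFrame, user_id: str) -> pd.DataFrame:
    """Filter report data down to the departments the user may see."""
    return data[data["department"].isin(allowed_departments(user_id))]
```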

Are you stuck with canned reports, or do you have the flexibility to run in-memory BI with tools such as Microsoft PowerPivot, Qlikview, or Tableau?

Business users will always find a report that hasn’t been written…or a permutation of an existing report that is required to fulfill a specific need.  For the canned or standard reports that you make available, make them as flexible as possible with report parameters.  These filters allow you to further refine a report for various users.  Once the user community becomes accustomed to the environment, they will quickly outgrow canned reports.  This is where in-memory BI plays a big part.  Allow power-users the flexibility to further analyze data using in-memory tools.  I’ve seen this done very well with Microsoft PowerPivot, Qlikview, and Tableau, among others.  This collaborative analysis may lead to future canned reports or dashboards that will benefit the larger community.  And, if not, it provides the flexibility to perform quick cross-functional analysis without the effort of creating a full report.  In the case of Qlikview, users can collaborate and share dashboard data in real-time.  This is very neat.
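
To make “flexible with report parameters” concrete outside of any specific tool, here is a minimal pandas sketch; the column names and filters are hypothetical:

```python
from typing import Optional

import pandas as pd

def sales_report(sales: pd.DataFrame,
                 region: Optional[str] = None,
                 year: Optional[int] = None) -> pd.DataFrame:
    """One canned report, refined for different users via optional parameters."""
    result = sales
    if region is not None:
        result = result[result["region"] == region]
    if year is not None:
        result = result[result["year"] == year]
    return result.groupby("product", as_index=False)["revenue"].sum()

# The same report then serves many audiences without a rewrite:
#   sales_report(sales)                           -> company-wide view
#   sales_report(sales, region="East")            -> one regional manager's view
#   sales_report(sales, region="East", year=2013) -> a further refinement
```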

Are you able to write your own reports, and more importantly, understand the data that is available?

If your business intelligence environment is set up correctly, power-users should have the ability to write custom reports and save them to either a public or a personal directory.  If the BI implementation has been done well, power-users will also understand the metadata that fuels the reports.  In a recent implementation, I saw this done well with IBM Cognos and its Framework Manager.

Do you have dashboards set up to govern key business areas?

Dashboards are always the eye-candy of a BI implementation.  With the proliferation of good dashboard tools, they are an integral part of a modern BI solution and a key aid in driving key business decisions.

Are you able to share those dashboards in real-time to further collaborate?

Not only are dashboards great for helping the C-suite make key business decisions, but they are also becoming a fantastic way to collaborate and share information.  As mentioned above, some of the more advanced dashboard tools allow you to share dashboard information in real-time.  This could be as simple as selecting a few parameters and sending over a copy of a dashboard that you’ve modified.  In some of the nicer tools, it could mean sharing a session and collaborating on the dashboard in real-time.

Does your business intelligence system allow you to look across functional areas to compare and contrast?

This may seem like a silly question, but it is an important one.  Does your business intelligence system encompass all of your key ERP systems?  If not, you need to reevaluate and ensure that all key business systems are brought into the BI solution.  Once the data from those systems is standardized and loaded, you can start to join it together for insightful cross-functional analysis.
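
As a toy illustration of that cross-functional join, assuming each source system has already been standardized on a shared key (a hypothetical department code here):

```python
import pandas as pd

# Hypothetical standardized extracts from two source systems.
finance = pd.DataFrame({"dept": ["D01", "D02"], "budget": [500_000, 750_000]})
hr = pd.DataFrame({"dept": ["D01", "D02"], "headcount": [12, 20]})

# Once both systems share a conformed key, cross-functional analysis is a join away.
combined = finance.merge(hr, on="dept")
combined["budget_per_head"] = combined["budget"] / combined["headcount"]
print(combined)
```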

Is there a process in place to request that additional data be added to the system?

Through exploration, collaboration, and report writing, you may identify data that has not been loaded into the BI solution.  This could be as simple as a data field or as complex as another transactional system.  Either way, you should ensure that your BI team has a process in place to queue up work items.  The BI solution needs to evolve as business needs are identified.  Personally, I have gained a lot of benefit from using Intuit’s Quickbase software to manage this process.  Quickbase has a nice blend of tools that facilitate collaboration and allow end-users to submit requests without a lot of effort.
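
The process does not need to be elaborate.  Here is a minimal sketch of the kind of work-item queue a BI team might keep; the fields are illustrative and not Quickbase’s actual schema:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class DataRequest:
    """A request to add data to the BI solution: a field, a feed, or a whole system."""
    requester: str
    description: str
    submitted: date = field(default_factory=date.today)
    status: str = "queued"  # queued -> approved -> in progress -> loaded

queue: List[DataRequest] = [
    DataRequest("jsmith", "Add the vendor payment-terms field from the AP module"),
    DataRequest("mlee", "Bring the admissions system into the warehouse"),
]
```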

As you evaluate your BI solution, also ensure that your ETL processes are loading data at the appropriate interval.  I’ve seen many implementations where the data is so dated that it becomes useless to end-users and hurts adoption of the platform.  Try to run ETL processes as frequently as possible.  In many implementations that I’ve completed, we’ve set ETL jobs to run nightly.  This isn’t always possible given the data volume, but fresh data always drives better adoption of the platform.
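
A simple freshness check is a useful guardrail here.  The sketch below assumes a hypothetical load-audit table recording when each ETL job last completed successfully:

```python
from datetime import datetime, timedelta
from typing import Dict, List

# Hypothetical load-audit records: job name -> last successful completion time.
last_loaded: Dict[str, datetime] = {
    "finance_gl": datetime(2013, 5, 6, 2, 15),
    "hr_roster": datetime(2013, 5, 1, 2, 40),
}

def stale_jobs(audit: Dict[str, datetime],
               max_age: timedelta = timedelta(days=1)) -> List[str]:
    """Flag ETL jobs whose data is older than the agreed interval (nightly here)."""
    now = datetime.now()
    return [job for job, loaded in audit.items() if now - loaded > max_age]

print(stale_jobs(last_loaded))  # jobs that missed their nightly window
```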

Real-time Data Collaboration by Qlikview

Yesterday, I had a very nice meeting with the folks at Qlikview (thanks, @DHurd11 & Andy for making the trip).  They partnered with a local DC consulting firm by the name of Tandem Conglomerate.  I had the pleasure of working with Ben Nah from Tandem Conglomerate last year – and I can vouch that their talent is top-notch.  Qlikview is smart to find partners of this caliber – and utilize them to better serve their customers.

As a technologist, I always have my eyes open to exploring new technology.  This is always challenging with long-term contracts, university politics, and an ever-changing IT landscape.  However, for me, vendors have to prove why they should remain at the top.  Competition is healthy for everyone, as it makes us constantly improve.  I should also say that as long as we have a defined data model, the reporting tools that we use on top of that data model are fluid.  The point is not to constantly rip out what you’ve done just for the sake of redoing it, but it is important to keep an eye on the latest technology, experiment, and find what is best for your organization.  This can be done gradually with small pilot projects to prove value.  We’re actually in the process of doing one of these pilot projects with Hadoop.

Ok – so you’re probably wondering why I titled this post ‘Real-time Data Collaboration’.  During the Qlikview presentation yesterday, I saw something that really resonated with me: the ability to collaborate, in real-time, on the same Qlikview dashboard.

This capability is a market differentiator for Qlikview.  As many of you may have seen from Gartner, and in my previous post regarding Gartner’s Magic Quadrant for BI, this is one of the reasons that Qlikview remains ahead of competitors such as Tableau.  Other dashboard vendors may provide the ability to ‘share’ with others, or to embed the dashboard into a web page or web part.  Don’t misunderstand.  This is NOT the same.

During the product demonstration, Qlikview showed dashboard sharing in real-time.  This means that you can select the filters/parameters that you would like to analyze, hone in on the particular area of interest, and share it in real-time.  The recipient can then see that selection and modify it, and as they modify the shared dashboard, it updates your dashboard.  You can modify and send changes back to the recipient as well.  VERY COOL!  Kudos to Qlikview on this feature.  The steps below show how it works:

  1. Click ‘Share Session’.
  2. Click ‘Start Sharing’.
  3. Choose whether to cut/paste the link, email it, or even expire the shared session.

By the way, recipients don’t need any sort of Qlikview license to provide feedback in real-time.


Related Articles:

Improve your sales pitch with these 5 tips!

At Georgetown University, we are in the RFP stage for a number of key IT projects.  This week, I sat through about 6 different vendor presentations in response to our requirements for these projects.  It was amazing how much the presentations, and the people presenting them, differed.

This is brief, but here are my take-aways and recommendations:

  • Be human.  Mix in some humor and rotate presenters.  I’m not talking about cheesy jokes, but keep the conversation light by mixing in witty humor.  No one wants to listen to a presentation that drones on in monotone.  Take every opportunity to connect with your audience personally.  Don’t be afraid to mix in a brief story from the day-of to show that the presentation is geared directly to the audience.  Rotating presenters also helps to naturally break up the presentation and make it more engaging.  Some groups had up to 4 presenters during an hour-long presentation.
  • Demonstrate with examples and make it real.  People like to understand that you’re an expert in the solution that you are providing.  Make your examples as real as possible.  If you don’t have a direct example, mock up an example from the requirements for the project.  Help the audience understand how you are going to help them and break the solution down.  Visuals are always a good idea.
  • Be transparent about some of the key challenges.  No one is perfect.  Seeing a number like 99% or 100% always raises my eyebrows.  We don’t live in a perfect world.  Be open and share challenges from past experiences.  Help the audience understand your experience and how you would work to mitigate these risks and challenges.  You rarely see a 99.99% up-time guarantee or success rate (see the quick arithmetic after this list), and even when you do, it seems like more of a marketing plug.  Just ask Amazon about their failure on Christmas Eve that sent Netflix users into a fury.  The point here is not to slam Amazon.  Amazon is a great company with fantastic services.  The point is that no one is perfect.  I bet they learned from that Christmas Eve mistake and have measures in place to prevent it this year.  If you were a company similar to Netflix and seeking Amazon’s services, wouldn’t you want to know how they addressed that issue and the preventative measures that they have put in place moving forward?
  • If possible, show the solution.  We all love PowerPoint, right?  I like PPT visuals and pretty charts and graphs, but I also want to see where the rubber meets the road.  It is great to see something like workflow presented in PowerPoint, but it brings an entirely new dimension to the presentation when you can show how the system handles workflow in real time.  If you’ve done your homework, you can align these demo “bursts” with the requirements in the RFP.  I realize this isn’t always possible, but it is a very nice touch!  If you are worried about internet connectivity, pre-record the demo bursts and save them on the local laptop.
  • Be interactive.  We had one vendor actually video-conference references into the presentation.  Not only was it great to connect with someone in our position, but it was helpful to hear them talk about the solution and how it has directly benefited their business.  In other sessions, we enjoyed engaging conversation and impromptu requests: we would provide a requirement, and the vendor would demonstrate how it would be met in the solution.  This goes so much further than talking about the “possibilities” of the solution.  Take action…show us!
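
To put those uptime percentages in perspective, here is the quick arithmetic referenced above:

```python
# Downtime budget implied by common uptime guarantees over a 365-day year.
HOURS_PER_YEAR = 365 * 24

for uptime in (0.99, 0.999, 0.9999):
    downtime_hours = HOURS_PER_YEAR * (1 - uptime)
    print(f"{uptime:.2%} uptime allows about {downtime_hours:.1f} hours of downtime per year")
```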

Higher Education Data Warehouse Conference (HEDW) @ Cornell University

I just returned from an excellent conference (HEDW), which was administered by the IT staff at Cornell University.  Kudos to the 2013 Conference Chair, Jeff Christen, and the staff of Cornell University for hosting this year!  Below is a little bit more information about the conference:

The Higher Education Data Warehousing Forum (HEDW) is a network of higher education colleagues dedicated to promoting the sharing of knowledge and best practices regarding knowledge management in colleges and universities, including building data warehouses, developing institutional reporting strategies, and providing decision support.

The Forum meets once a year and sponsors a series of listservs to promote communication among technical developers and administrators of data access and reporting systems, data custodians, institutional researchers, and consumers of data representing a variety of internal university audiences.

There are more than 2000 active members in HEDW, representing professionals from 700+ institutions found in 38 different countries and 48 different states.

This conference has proven to be helpful to Georgetown University over the last 5 years.  It is a great opportunity to network with peers and share best practices around the latest technology tools.  And, sorry vendors, you are kept at bay.  This is important, as the focus of the conference is less on technology sales and more on relationships and sharing successes.

Cornell University Outside of Statler Conference Center

Personally, this was my first year in attendance.  I gained a lot of industry insight, but it was also helpful to find peer organizations that are using the same technology tools.  We are about to embark upon an Oracle PeopleSoft finance-to-Workday conversion.  It was helpful to connect with others that are going through similar projects.  And for me specifically, it was helpful to learn how folks are starting to extract data from Workday for business intelligence purposes.

Higher Education Data Warehouse Conference

2013 HEDW Attendee List

My key take-aways from the conference were:

  • Business intelligence is happening with MANY tools.  We saw A LOT of technology.  Industry leaders in the higher education space still seem to be Oracle and Microsoft.  Oracle seemed to be embedded in more universities; however, many are starting projects on the Microsoft stack – particularly with the Blackboard Analytics team standardizing on the Microsoft platform.  IBM Cognos still seemed to be the market leader in terms of operational reporting; however, Microsoft’s SSRS is gaining momentum.  From an OLAP and dashboard perspective, it seemed like a mixed bag.  Some were using IBM BI dashboards, while others were using tools such as OBIEE Dashboards, Microsoft SharePoint’s Dashboard Designer, and an emerging product, Pyramid Analytics.  Microsoft’s PowerPivot was also widely demonstrated, and users like it!  PowerView was mentioned, but no one seemed to have it up and running…yet.  Tableau was also a very popular choice and highly recommended.  Several people mentioned how responsive both Microsoft and Tableau had been to their needs pre-sale.
  • Business intelligence requires a SIGNIFICANT amount of governance to be successful.  We saw presentation after presentation about the governance structures that should have been set up, or about projects that had to be restarted in order to be governed in the appropriate way.  This includes changing business processes and ensuring that common data definitions are put in place across university silos.  A stove-piped approach does not work when you are trying to analyze data cross-functionally.
  • Standardizing on one tool is difficult.  We spoke to many universities that had multiple tools in play, due to the difficulty of change management and training.  It is worth making the investment in change management in order to standardize on the appropriate tool set.
  • Technology is expensive.  There is no one-size-fits-all.  Depending on the licensing agreements in place at your university, there may be a clear technology choice.  Oracle is expensive, but it may already be in use to support critical ERP systems.  We also heard many universities discuss their use of Microsoft due to the educational and statewide discounts available.
  • Predictive analytics are still future state.  We had brief discussions about statistical tools like SAS and IBM’s SPSS; however, these tools were not the focus of many discussions.  It seems that most universities are trying to figure out simple ODS and EDW projects.  Predictive analytics and sophisticated statistical tools are in use, but they seem to be taking a back seat while IT departments get the more fundamental data models in place.  Most had a great deal of interest in predictive analytics but felt, “we just aren’t there yet.”  GIS data also came up in a small number of presentations and drew interest.  In fact, one presentation displayed a dashboard with student enrollment by county.  People like to see data overlaid on a map.  I can see more universities taking advantage of this soon.
  • Business intelligence technologists are in high demand and hard to find.  It was apparent throughout the conference that many universities are challenged to find the right technology talent.  Many are in need of employees that possess business intelligence and reporting skills.
  • Hadoop remains on the shelf.  John Rome from Arizona State gave an excellent presentation about Hadoop and its functional use.  He clarified how Hadoop got its name: the creator, Doug Cutting, named the project after his son’s stuffed yellow elephant!  John also presented a few experiments that ASU has been doing to evaluate the value Hadoop may bring the university.  In ASU’s experiments, they used Amazon’s EC2 service to quickly spin up supporting servers and configure the services necessary to support Hadoop.  This presentation was entertaining, but it was almost the only mention of Hadoop during the entire conference.  Hadoop may have more use in research functions, but it does not seem widely adopted in key university business intelligence efforts as of yet.  Wonder if this will change by next year?
Doug Cutting with Son’s Stuffed Elephant