Take a look at my guest post on TrustRadius.
In this blog, I discuss 3 Strategic Tips to Unleash the Power of Self-Service BI tools:
As part of my current project at Georgetown University, we have a requirement to extract large volumes of data from Workday for business intelligence purposes. We’ve been running Workday HCM for about two years and plan to go live with Workday Financials this July. Historically, the university has largely been an advocate of on-premises solutions – and Workday represents one of the first large steps we’ve taken to move ERP data “into the cloud”.
What does this mean for business intelligence? Let’s dive a bit deeper.
Because Workday is a SaaS (software-as-a-service) product, a direct on-premise database connection isn’t an option. As a result, we had to explore other options to get data out of the Workday platform – and to do it quickly and efficiently.
Below are a few data extraction methods which we considered:
We quickly ruled out option #3. We enjoy using Informatica and thought this might be a quick win. Unfortunately, when aligning the data options available in Informatica Cloud with our requirements, we found several gaps (e.g., missing data, or data not defined down to the appropriate grain). Informatica offered to partner with us to expand capabilities (much appreciated), but our timelines did not permit this for the current project. We continue to use Informatica PowerCenter as our best-in-class data integration tool.
So, which option was better: #1 (API) or #2 (RaaS)? This question was kicked around for quite some time. After much deliberation, we decided on option #2 (RaaS). You’re probably already asking why. Isn’t the API faster? Below are the results of our analysis:
Using Workday’s Enterprise Interface Builder (EIB), we have been able to schedule the custom reports and output them to a text file format that loads easily into our business intelligence platform. The EIB provides an easy-to-use, guided graphical interface to define inbound and outbound integrations without requiring any programming. Used by the majority of Workday customers, it serves both business and IT users across a variety of integration needs.
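As a rough illustration of why the RaaS approach is so approachable: each RaaS-enabled custom report is addressable as a plain URL that returns the report in the format you ask for. The host, tenant, owner, and report names below are entirely hypothetical, and the `ccx/service/customreport2` path layout is a common convention – always confirm the exact URL from the report’s own web-service actions in your tenant.

```python
import urllib.parse

def raas_url(host, tenant, owner, report, fmt="csv"):
    """Build the URL for a Workday Report-as-a-Service (RaaS) custom report.

    The path layout used here (ccx/service/customreport2) is typical for
    RaaS-enabled custom reports; confirm the exact URL from the report's
    web-service actions in your own tenant before relying on it.
    """
    path = "/ccx/service/customreport2/{0}/{1}/{2}".format(
        tenant, owner, urllib.parse.quote(report))
    return "https://{0}{1}?format={2}".format(host, path, fmt)

# Hypothetical host, tenant, owner, and report names, for illustration only.
url = raas_url("wd2-impl-services1.workday.com", "georgetown",
               "ISU_BI", "HCM_Worker_Extract")
print(url)
```

A scheduler (or the EIB itself) can then fetch that URL with an integration user’s credentials and drop the resulting text file wherever the BI load process expects it.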
Our project is still underway. I’ll write another post with more details as our project moves further toward completion. If you have any thoughts or suggestions regarding Workday data extraction / integration, please feel free to comment. Suggestions and comments are welcome!
Business intelligence. It consistently ranks as one of the top priorities in various CIO and IT surveys. In January 2013, Gartner conducted an executive program survey of over 2,000 CIOs. I’ve listed the top 10 business and technology priorities from the survey below.
| Top 10 Business Priorities | Top 10 Technology Priorities |
| --- | --- |
SaaS = software as a service; IaaS = infrastructure as a service; PaaS = platform as a service
Source: Gartner Executive Programs (January 2013)
That said, many business intelligence (BI) projects fail. Are you planning to implement a BI and analytics program? Maybe thinking about big data?
Here are a few lessons learned that will put you on the path to success:
The talented Vinay Bhagat and his team have created a unique platform which compiles software reviews sourced from product experts. Key differentiators in this new platform include:
TrustRadius launched in May, received funding from the Mayfield Fund in July, and is rapidly starting to scale. So far, the site has 23k monthly visitors and is growing at 30% per month.
For interested higher education folks, I’ve posted a few reviews there that you may find interesting:
Many people ignore the warning signs. My memory immediately flashes to warning labels on cigarette packaging or the famous ’80s commercial by the Partnership for a Drug-Free America (“This is your brain…this is your brain on drugs”).
Unfortunately, warning signs for outdated business intelligence systems may not be that clear…and are much more easily ignored. Let’s start with a few basic questions:
Now that you’ve had a chance to think about the questions above, let’s look at what I would consider a positive and negative response to each question.
| # | Question | Positive response | Negative response |
| --- | --- | --- | --- |
| 1 | Do you often go to the ‘data guy’ to get reports or data from your ERP systems? | No | Yes |
| 2 | Do you have to contact multiple people to get the data that you require to run a report? | No | Yes |
| 3 | Is there a central portal where you visit for reporting needs? | Yes | No |
| 4 | Are you able to answer the majority of your business questions without involving the IT department? | Yes | No |
| 5 | Are you stuck with canned reports, or do you have the flexibility to run in-memory BI with tools such as Microsoft PowerPivot, Qlikview, or Tableau? | Yes | No |
| 6 | Are you able to write your own reports, and more importantly, understand the data that is available? | Yes | No |
| 7 | Do you have dashboards set up to govern key business areas? | Yes | No |
| 8 | Are you able to share those dashboards in real-time to further collaborate? | Yes | No |
| 9 | Does your business intelligence system allow you to look across functional areas to compare and contrast? | Yes | No |
| 10 | Is there a process in place to request that additional data be added to the system? | Yes | No |
Use these questions as a quick litmus test. If you answered negatively to 4 or more of the questions above, you should revisit your business intelligence strategy. If you answered positively to 6 or more, you are likely headed in the right direction. Even so, it is never a bad idea to reassess and look for ways to improve your business intelligence environment. Given the constant change and the fantastic software tools out there, you can always find something that will add value to your BI strategy. A word of caution: many of these tools are expensive, so evaluate carefully and find the right fit for your budget and your organization.
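The scoring rule above is simple enough to sketch as a tiny helper; the thresholds are exactly the ones stated in the post, and the function name is just for illustration:

```python
def assess_bi_strategy(answers):
    """Score the ten-question litmus test.

    `answers` maps question number (1-10) to True for a positive response
    and False for a negative one. Thresholds follow the rule of thumb in
    the post: 4+ negatives -> revisit; 6+ positives -> on the right track.
    """
    negatives = sum(1 for v in answers.values() if not v)
    positives = sum(1 for v in answers.values() if v)
    if negatives >= 4:
        return "revisit your BI strategy"
    if positives >= 6:
        return "headed in the right direction"
    return "mixed results; reassess selectively"

# Example: three negative answers, seven positive answers.
print(assess_bi_strategy({i: i > 3 for i in range(1, 11)}))
```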
If you would like more explanation as to the responses to my questions above, I have provided them below:
Do you often go to the ‘data guy’ to get reports or data from your ERP systems?
I have worked in many organizations to find that only one specific person, or a small group of people, have access to organization-wide data. If you find yourself constantly going to one person to get the data that you require…or to get ‘that report’ written, then the business intelligence system is failing. A proper business intelligence system should allow you to get the types of data that you require as an end user. It may impose appropriate security on the data, but it should allow you to get access to the data that is required to do your job.
Do you have to contact multiple people to get the data that you require to run a report?
If you find yourself contacting one person in Finance to get the finance data – and one person in Human Resources to get the HR data – and one person in Sales to get the sales data, then the business intelligence system is failing. An appropriate BI solution should bring this data together and make it accessible cross-functionally.
Is there a central portal where you visit for reporting needs?
Given the cross-functional nature of any business intelligence solution, it is important to have a BI portal in place. This portal may address key information such as:
Without such a portal, this information is difficult to cobble together and it leads to confusion as end-users are visiting multiple locations to find this information. This portal serves as a one-stop shop for business intelligence, reporting, and analytics needs.
Are you able to answer the majority of your business questions without involving the IT department?
This question is two-fold. The first part is to ensure that the business intelligence system has the data required to fulfill your request. The second part is to ensure that, if the system does have all of that data, it is accessible to you. Security is important, but it can also be over-engineered. I recently stepped into a BI project where each report joined against a security table with over 3.5 million rows. This killed performance and made the experience so frustrating for end users that they started to spawn their own shadow systems. Not good. Make sure that you have the appropriate security in place for your data, but remember that security for BI projects will likely not be the same as the transactional security you have set up in your ERP systems.
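One way to avoid that per-report join against a huge security table is to resolve a user’s entitlements once, up front, and then apply the resulting small set as a filter. The sketch below is a simplified stand-in for a real entitlement table, not any particular product’s security model:

```python
def allowed_org_units(user, grants):
    """Collapse raw security grants into the set of org units a user may see.

    Resolving this once per session (or materializing it per user) avoids
    re-joining a multi-million-row security table inside every report query.
    `grants` is a list of (user, org_unit) rows, a simplified stand-in for
    a real entitlement table.
    """
    return {org for u, org in grants if u == user}

def secured_rows(rows, units):
    """Filter fact rows down to the org units the user is entitled to."""
    return [r for r in rows if r["org_unit"] in units]

grants = [("alice", "FIN"), ("alice", "HR"), ("bob", "FIN")]
rows = [{"org_unit": "FIN", "amount": 100},
        {"org_unit": "HR", "amount": 50},
        {"org_unit": "SALES", "amount": 75}]
units = allowed_org_units("alice", grants)
print(secured_rows(rows, units))
```

In a real warehouse the same idea usually shows up as a small, per-user security dimension that is joined once, rather than a raw grant table dragged into every query.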
Are you stuck with canned reports, or do you have the flexibility to run in-memory BI with tools such as Microsoft PowerPivot, Qlikview, or Tableau?
Business users will always find a report that isn’t written…or a permutation of an existing report that is required to fulfill a specific need. For the canned or standard reports that you make available, try to make them as flexible as possible with report parameters. These filters allow you to further refine the report for various users. Once the user community becomes used to the environment, they will quickly outgrow canned reports. This is where in-memory BI plays a big part. Allow power-users the flexibility of further analyzing data using in-memory tools. I’ve seen this done very well with Microsoft PowerPivot, Qlikview, and Tableau among others. This collaborative analysis may lead to future canned reports or dashboards that will provide benefit to the larger community. And, if not, it provides the flexibility to perform quick cross-functional analysis without the effort of creating a full report. In the case of Qlikview, users can collaborate and share dashboard data in real-time. This is very neat.
Are you able to write your own reports, and more importantly, understand the data that is available?
If your business intelligence environment is setup correctly, power-users should have the ability to write custom reports and save them either to a public or personal directory. If the BI implementation has been done well, power-users will also understand the metadata which fuels the reports. In a recent implementation, I’ve seen this done well with IBM Cognos and their Framework Manager.
Do you have dashboards setup to govern key business areas?
Dashboards are always the eye-candy of a BI implementation. With the proliferation of good dashboard tools, this is an integral part of a modern BI solution and is a key aid in helping to drive key business decisions.
Are you able to share those dashboards in real-time to further collaborate?
Not only are dashboards great for helping the C-level suite make key business decisions, but they are also becoming a fantastic way to collaborate and share information. As mentioned above, some of the more advanced dashboard tools allow you to share dashboard information in real-time. This could be as simple as selecting a few parameters and sending over a copy of a dashboard that you’ve modified. In some of the nicer tools, it could mean sharing a session and collaborating on the dashboard in real-time.
Does your business intelligence system allow you to look across functional areas to compare and contrast?
This may seem like a silly question, but it is an important one. Does your business intelligence system encompass all of your key ERP systems? If not, you need to reevaluate and ensure that all key business systems are brought into the BI solution. Once the data from the key business systems is standardized and brought into the BI solution, you can start to join these data together for insightful cross-functional analysis.
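The cross-functional join described above hinges on the systems agreeing on a shared key. As a minimal sketch (department code as the hypothetical conformed key, with made-up figures), once finance and HR extracts share that key, their measures can sit side by side in one result:

```python
def join_on_key(left, right, key):
    """Inner-join two lists of dicts on a shared key column.

    A stand-in for the conformed-dimension join a BI platform would do:
    once both source systems agree on a key (here, department code),
    their measures can be compared in a single cross-functional row.
    """
    index = {r[key]: r for r in right}
    return [{**l, **index[l[key]]} for l in left if l[key] in index]

headcount = [{"dept": "FIN", "headcount": 12}, {"dept": "HR", "headcount": 5}]
spend = [{"dept": "FIN", "spend": 250000}, {"dept": "SALES", "spend": 90000}]
print(join_on_key(headcount, spend, "dept"))
```

Note that departments missing from either extract silently drop out of an inner join – exactly the kind of gap that standardizing data across key business systems is meant to close.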
Is there a process in place to request that additional data be added to the system?
Through exploration, collaboration, and report writing, you may identify data that has not been loaded into the BI solution. This could be as simple as a data field or as complex as another transactional system. Either way, you should ensure that your BI team has a process in place to queue up work items. The BI solution needs to evolve as business needs are identified. Personally, I have gained a lot of benefit from using Intuit’s Quickbase software to manage this process through technology. Quickbase has a nice blend of tools that facilitate collaboration and allow end-users to submit requests without a lot of effort.
As you evaluate your BI solution, also ensure that your ETL processes load data at the appropriate interval. I’ve seen many implementations where the data is so dated that it becomes useless to end users and hurts adoption of the platform. Try to run ETL processes as frequently as possible. In many implementations that I’ve completed, we’ve set them up to run nightly. This isn’t always possible given the volume, but fresh data always allows for better adoption of the platform.
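A cheap way to keep that freshness promise honest is to check the last successful load against the agreed refresh window. A minimal sketch, assuming a nightly (24-hour) cadence and a hypothetical load-timestamp you would read from your ETL audit table:

```python
from datetime import datetime, timedelta

def is_stale(last_load, now=None, max_age_hours=24):
    """Return True if the warehouse's last successful load is older than
    the agreed refresh window (24h here, matching a nightly ETL cadence)."""
    now = now or datetime.utcnow()
    return now - last_load > timedelta(hours=max_age_hours)

now = datetime(2013, 6, 1, 8, 0)
print(is_stale(datetime(2013, 5, 31, 23, 0), now=now))  # loaded last night
print(is_stale(datetime(2013, 5, 28, 23, 0), now=now))  # three days old
```

Surfacing this check on the BI portal (a simple "data current as of" banner) goes a long way toward keeping end-user trust.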
Yesterday, I had a very nice meeting with the folks at Qlikview (thanks, @DHurd11 & Andy for making the trip). They partnered with a local DC consulting firm by the name of Tandem Conglomerate. I had the pleasure of working with Ben Nah from Tandem Conglomerate last year – and I can vouch that their talent is top-notch. Qlikview is smart to find partners of this caliber – and to utilize them to better serve their customers.
As a technologist, I always have my eyes open to exploring new technology. This is always challenging with long term contracts, university politics, and an ever-changing IT landscape. However, for me, vendors have to prove why they should remain at the top. Competition is healthy for everyone as it makes us constantly improve. I should also say that as long as we have a defined data model, the reporting tools that we use on top of that data model are fluid. The point is not to constantly change and rip out what you’ve done just for the sake of redoing it, but it is important to keep an eye on the latest technology, experiment, and find what is best for your organization. This can be done gradually with small pilot projects to prove value. We’re actually in the process of doing one of these pilot projects with Hadoop.
Ok – so you’re probably wondering why I titled this post ‘real-time data collaboration’? During the Qlikview presentation yesterday, I saw something that really resonated with me. And, that was the ability to collaborate, in real-time, on the same Qlikview dashboard.
This capability is a market differentiator for Qlikview. As many of you may have seen from Gartner and in my previous post regarding Gartner’s Magic Quadrant for BI, this is one of the reasons that Qlikview remains ahead of competitors such as Tableau. Other dashboard vendors may provide the ability to ‘share’ with others, or to embed the dashboard into a web page or web part. Don’t misunderstand: this is NOT the same.
During their product demonstration, Qlikview showed the ability to share dashboards in real-time. This means that you can select the filters/parameters that you would like to analyze, hone in on the particular area of interest, and share your view in real-time. The recipient can then see that selection and modify it, and as they modify the shared dashboard, it will update yours. You can modify and send changes back to the recipient as well. VERY COOL! Kudos to Qlikview on this feature. Below are a few screenshots to show how it works:
Click ‘Share Session’
Click ‘Start Sharing’
Choose if you want to cut/paste the link, email it, or even expire the shared session. By the way, recipients don’t have to have any sort of Qlikview license to be able to provide feedback in real-time.
Try it out in the demo below:
I just returned from an excellent conference (HEDW) which was administered by the IT staff at Cornell University. Kudos to the 2013 Conference Chair, Jeff Christen, and the staff of Cornell University for hosting this year! Below is a little bit more information about the conference:
The Higher Education Data Warehousing Forum (HEDW) is a network of higher education colleagues dedicated to promoting the sharing of knowledge and best practices regarding knowledge management in colleges and universities, including building data warehouses, developing institutional reporting strategies, and providing decision support.
The Forum meets once a year and sponsors a series of listservs to promote communication among technical developers and administrators of data access and reporting systems, data custodians, institutional researchers, and consumers of data representing a variety of internal university audiences.
There are more than 2000 active members in HEDW, representing professionals from 700+ institutions found in 38 different countries and 48 different states.
This conference has proven helpful to Georgetown University over the last 5 years. It is a great opportunity to network with peers and share best practices around the latest technology tools. And, sorry vendors, you are kept at bay. This is important, as the focus of the conference is less on technology sales – and more about relationships and sharing successes.
Personally, this was my first year in attendance. I gained a lot of industry insight, but it was also helpful to find peer organizations that are using the same technology tools. We are about to embark upon an Oracle Peoplesoft finance to Workday conversion. It was helpful to connect with others that are going through similar projects. And for me specifically, it was helpful to learn how folks are starting to extract data from Workday for business intelligence purposes.
My key take-aways from the conference were:
Recently, I received an email as part of a listserv from a colleague at HEDW.org. HEDW, or Higher Education Data Warehousing Forum, is a network of higher education colleagues dedicated to promoting the sharing of knowledge and best practices regarding knowledge management in colleges and universities, including building data warehouses, developing institutional reporting strategies, and providing decision support.
In the email that I referenced above, my colleague sent a link to an IBM Redbooks publication titled, “Dimensional Modeling: In a Business Intelligence Environment.” This is a good read for someone that wants the basics of data warehousing. It also may be a good refresher for others. Here’s a short description of the book:
In this IBM Redbooks publication we describe and demonstrate dimensional data modeling techniques and technology, specifically focused on business intelligence and data warehousing. It is to help the reader understand how to design, maintain, and use a dimensional model for data warehousing that can provide the data access and performance required for business intelligence.
Business intelligence is comprised of a data warehousing infrastructure, and a query, analysis, and reporting environment. Here we focus on the data warehousing infrastructure. But only a specific element of it, the data model – which we consider the base building block of the data warehouse. Or, more precisely, the topic of data modeling and its impact on the business and business applications. The objective is not to provide a treatise on dimensional modeling techniques, but to focus at a more practical level.
There is technical content for designing and maintaining such an environment, but also business content.
In reading through a few responses on the listserv, it compelled me to produce a list of some of my favorite BI books. I’ll publish a part II to this post in the future, but here is an initial list that I would recommend to any BI professional. It is also worth signing up for the Kimball Group’s Design Tips. They are tremendously useful.
Data quality is an aspect of business intelligence (BI) projects that always seems to be deprioritized. It is easy to focus on the beautiful visualizations and drill-through reports that are the key selling features of a BI project. This article, however, is about the value of cleansing your data so that those tools work seamlessly with the data model you establish. Everyone knows the IT saying: garbage in, garbage out. It holds entirely true for BI projects. If the incoming data is dirty, it will be very difficult to efficiently process it and make it available to a reporting platform. This isn’t an easy problem to solve, either. When working across multiple functional areas, you may have different sets of users entering data into the system in DIFFERENT ways. In that instance, you may not have a data quality issue, but a business process issue.
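When different groups enter the same value in different ways, a synonym table maintained jointly by the business and IT can canonicalize entries at load time. A minimal sketch (the department names and synonyms are made up for illustration):

```python
def standardize(value, synonyms):
    """Map a free-text entry to its canonical value via a synonym table.

    Entries the table doesn't recognize return None so they can be
    flagged for review, rather than being silently passed through.
    """
    key = value.strip().lower()
    return synonyms.get(key)

# Hypothetical synonym table, jointly maintained by the business and IT.
SYNONYMS = {"hr": "Human Resources", "human resources": "Human Resources",
            "h.r.": "Human Resources", "fin": "Finance", "finance": "Finance"}

raw = ["HR", " human resources ", "Fin", "Payrol"]
print([(v, standardize(v, SYNONYMS)) for v in raw])
```

The unmatched entry ("Payrol") is the interesting output: routing those to the data stewards is often where the business-process conversation starts.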
As I have worked through my BI projects, here are 10 steps that I have followed to work with teams to create a data-centric culture and to improve data integrity. I hope that these are of use to you…and please feel free to share any additional best practices in the comments of this blog! We can all learn from one another.
SAS sponsored this report which draws on a survey of 752 executives, as well as in-depth interviews with 17 senior executives and experts who are regarded as data pioneers. I would love to hear from other higher education institutions in regard to “the importance of data to company functions” section. This report focuses on a more corporate audience, but I imagine that in a university setting you might see Finance (Audit & Regulatory Compliance) and Advancement added to the list below. And, perhaps, more mission-driven data such as Learning data (student/teacher performance). Blackboard Analytics has a neat product in that arena (see screen shot below):
Overall, I think this infographic represents a comprehensive list of some of the key challenges of big data and business intelligence. What is the most compelling bit of information in this infographic for you?
I found the following two questions of particular interest. They provoke thought about whether you have the right talent within your organization and whether you have achieved the right outcomes for your end users.