Trust, but verify: 5 Considerations to Reconcile Your Data Nightly

As Ronald Reagan famously said, "trust, but verify" – and many users follow that same principle.  When working with the multitude of technology solutions they use to do their daily jobs, it becomes increasingly important to ensure that users are painting an accurate picture for their leadership team.  Let me explain.

A user logs into, let's say, Workday Financials.  This financial user runs an operating report for a specific account, fiscal month, and fiscal year.  The user then logs into the associated BI solution and runs a longitudinal report on the same account across two fiscal years for comparison purposes.  At first blush, the user will compare what is seen in Workday to the same fiscal month/year in BI to ensure that the figures match.  They will then review the historical months/years.

What happens when the numbers “don’t match”?  In this example, the user will lose confidence in the BI solution quickly – and will question the figures seen in the longitudinal analysis.  Likely, this user will communicate with others on his or her team – and user adoption will decrease until the problem is acknowledged and corrected.  You’re probably thinking, “well – right…fix it!”.

The larger question is how to corroborate and reconcile the data on a nightly basis so that this does not occur.  And, in the off chance that it does occur, you can be proactive in your communication to end-users and provide transparency around the issue(s).  Below are a few suggestions based upon our experience:

  1. Row Counts – this one is simple.  We perform row count analysis on the data extraction from Workday and ensure that this exact number of rows is written to the data warehouse.  We break this down further to identify the number of rows in the extract, the number of rows inserted, and the number of rows updated.
    • Note – we've seen some fallout due to inaccurate budget dates, and this error was quickly identified by this method.  The root cause of the budget date error was an erroneous configuration in a nightly business integration.  This is value-add for the BI solution, as it helped to fix an error that was causing data integrity issues.
  2. Overall Debit/Credit Match – This is also a fairly simple check.  In our Informatica workflow, we've set up a step to check whether the overall debits and credits match.  This is a quick and easy way to tell if something is out of alignment.
  3. Audit by Ledger Type and Period – In Workday, this was a bit more complex.  We wanted to bump the BI data against the Workday Trial Balance report.  The out-of-the-box Workday Trial Balance report cannot be scheduled to deliver CSV or XML to an SFTP location, which is our transfer mechanism.  We recreated a custom report which mimics the Trial Balance functionality and attached it to a corresponding EIB.  This allows us to generate the Trial Balance report nightly, load the BI data, and then reconcile against the Workday Trial Balance report.  This is done by ledger type (actuals, obligations, and commitments), fiscal year, period, and debit/credit.
  4. Audit by Ledger Type, Ledger Account, and Period – This reconciliation is the same as step #3, but with one more layer of detail: the ledger account.  This bumps the Workday Trial Balance report against the loaded BI data by fiscal year, period, ledger type, ledger account, and debit/credit.
  5. Variance File – Finally, to make entirely sure that everything is in alignment (and that we didn't overlook anything in steps #1-4), we created a variance file.  If anything is out of alignment, the variances are highlighted in this exception file.  That way, when something is off, we don't have to scroll through long lists (most of which are in alignment).  A minimal sketch of a few of these checks follows this list.
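For illustration only, here is a minimal Python sketch of checks #1, #2, and #5, assuming the extract and the warehouse load have been staged into two pandas DataFrames and that column names such as debit_amount, credit_amount, ledger_type, fiscal_year, and fiscal_period are hypothetical stand-ins for your own schema (our real checks run inside our Informatica workflows):

```python
import pandas as pd

def reconcile_nightly(extract_df: pd.DataFrame, warehouse_df: pd.DataFrame) -> pd.DataFrame:
    """Run simple nightly reconciliation checks and return an exception (variance) file."""
    # Check #1: row counts. The extract should match what landed in the warehouse.
    if len(extract_df) != len(warehouse_df):
        print(f"Row count mismatch: extract={len(extract_df)}, warehouse={len(warehouse_df)}")

    # Check #2: overall debits should equal overall credits in the loaded data.
    debits = warehouse_df["debit_amount"].sum()
    credits = warehouse_df["credit_amount"].sum()
    if round(debits - credits, 2) != 0:
        print(f"Debits ({debits}) do not equal credits ({credits})")

    # Check #5: variance file. Compare totals by ledger type and period, keep mismatches only.
    keys = ["ledger_type", "fiscal_year", "fiscal_period"]
    source = extract_df.groupby(keys)[["debit_amount", "credit_amount"]].sum()
    target = warehouse_df.groupby(keys)[["debit_amount", "credit_amount"]].sum()
    variance = source.subtract(target, fill_value=0)
    return variance[(variance != 0).any(axis=1)].reset_index()

# Hypothetical usage:
# extract_df = pd.read_csv("workday_extract.csv")
# warehouse_df = pd.read_csv("warehouse_load.csv")
# reconcile_nightly(extract_df, warehouse_df).to_csv("variance_file.csv", index=False)
```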

Hopefully, these have been helpful tidbits.  As always, I’m open to sharing more.  Please contact me if you’d like to learn more.  Or, please feel free to leave comments about what you are doing.  It’s always great to learn from one another.

Top 10 Gotchas of BI in a SaaS World


As we undergo a BI implementation with a cloud-based ERP provider, I thought it would be interesting to share some of our insights.  As you know from previous posts, Georgetown has been undergoing a large ERP implementation on the Workday ERP.  We are now live on both HCM and Financials.

The Business Intelligence (BI) project has been running in parallel to this effort and I wanted to share some initial findings in the event that it may be helpful for others.

Here are our top 10 gotchas:

#10: Be cognizant of time zones.  When you are setting up incremental nightly data pulls, ensure that the user account you are utilizing is in the same time zone as the underlying tenant.  Otherwise, you might spend hours validating data that is hours out of sync (e.g. the user account used to extract data is in EST while the tenant is in PST).
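As a rough illustration (not our actual integration code), the sketch below normalizes an incremental "updated since" cutoff to the tenant's time zone before it is used to filter the nightly pull; the tenant time zone and the prompt name are assumptions:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

# Hypothetical example: the extract account runs in US/Eastern,
# but the tenant keeps its timestamps in US/Pacific.
tenant_tz = ZoneInfo("America/Los_Angeles")

def incremental_cutoff(hours_back=24):
    """Return the 'updated since' cutoff expressed in the tenant's time zone."""
    now_tenant = datetime.now(tz=tenant_tz)
    cutoff = now_tenant - timedelta(hours=hours_back)
    return cutoff.strftime("%Y-%m-%dT%H:%M:%S%z")

# e.g. pass this value to the report's "Updated_Since" prompt (the prompt name is an assumption)
print(incremental_cutoff())
```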

#9: Chop the data into sufficiently sized extracts to prevent timeout issues.  This may be less of an issue on a nightly basis, but it is important if you are loading historical data or preparing for a time when there is a large integration into the ERP.  Large integrations (e.g. from your Student Information System into your Financial System) can cause a nightly extract file to balloon and subsequently fail.
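Here is a minimal sketch of the chunking idea, assuming the extract can be filtered by a date range; fetch_extract is a hypothetical placeholder for whatever report or API call you actually use:

```python
from datetime import date, timedelta

def date_chunks(start, end, days_per_chunk=30):
    """Yield (chunk_start, chunk_end) windows so no single extract grows too large."""
    chunk_start = start
    while chunk_start <= end:
        chunk_end = min(chunk_start + timedelta(days=days_per_chunk - 1), end)
        yield chunk_start, chunk_end
        chunk_start = chunk_end + timedelta(days=1)

def fetch_extract(chunk_start, chunk_end):
    """Hypothetical placeholder: run the report or API call for one window."""
    print(f"Extracting {chunk_start} through {chunk_end}")

# A historical load broken into month-sized pulls instead of one giant extract.
for chunk in date_chunks(date(2013, 7, 1), date(2014, 6, 30)):
    fetch_extract(*chunk)
```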

#8: Send your team to deep ERP training – immediately!  Be prepared to know the ERP as well as the team administering it does.

#7: Ensure that your BI team receives the appropriate security that it requires in the SaaS ERP.  If this gets into policy-related issues, work them through with the respective owners in advance.  You may have to work through operational protocols and procedures that you did not expect.

#6: Plan a strategy for how you will create BI-related content in your SaaS-based ERP.  Typically, this development needs to occur in a lower tenant and be migrated to production.  How will you access these environments?  What is the path to migrate content?  Be prepared that some sandbox tenants may refresh on a weekly basis (i.e. BI content may not reside there for longer than a week).  Consider developing in a dev tenant, migrating to the sandbox (which may refresh weekly), and then migrating to production.  The timing of this process should feed into your planning.

#5: Set up BI extract notifications.  We set up a specific email address which routes to our on-call team.  These alerts notify the team if any of the extracts fail so that we can quickly re-run the extract and reprocess the file as needed.
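A minimal sketch of such an alert, assuming a plain SMTP relay is available; the addresses and host name are placeholders:

```python
import smtplib
from email.message import EmailMessage

def alert_on_failure(extract_name, error):
    """Email the on-call alias when a nightly extract fails (addresses are placeholders)."""
    msg = EmailMessage()
    msg["Subject"] = f"[BI ETL] Extract failed: {extract_name}"
    msg["From"] = "bi-etl@example.edu"
    msg["To"] = "bi-oncall@example.edu"
    msg.set_content(f"The nightly extract '{extract_name}' failed:\n\n{error}")
    with smtplib.SMTP("smtp.example.edu") as server:
        server.send_message(msg)

# Example: wrap each extract step and alert on any exception.
try:
    raise RuntimeError("SFTP file not found")  # stand-in for a real extract step
except Exception as exc:
    alert_on_failure("gl_journal_lines", str(exc))
```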

#4: Prepare to either learn an API inside and out, or work with custom reports in the SaaS ERP.  Remember, there will be no direct connection to the database, so you'll need to find alternate ways of efficiently extracting large amounts of data.  In Georgetown's case, we wrote custom versions of reports from our SaaS-based ERP and then scheduled their output in XML or CSV format.  These files are then processed by the nightly ETL routine.
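For illustration, here is a tiny sketch of the second path: picking up a scheduled report file (assumed here to be XML) and parsing it for the nightly ETL.  The file name and element names are hypothetical:

```python
import xml.etree.ElementTree as ET

# Hypothetical file and element names for a scheduled report delivered as XML.
tree = ET.parse("scheduled_report_output.xml")
rows = []
for entry in tree.getroot().iter("Report_Entry"):
    rows.append({child.tag: child.text for child in entry})

print(f"Parsed {len(rows)} rows for the nightly ETL load")
```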

#3: Create a portal for consolidating and collaborating on reporting deliverables.  We set up a portal which provides instructions, documentation, links to metadata, the ability to ask for help, report guides, etc.  Make this easy for your end-users to consume.

#2: Test, test, and test.  SaaS-based ERP tenants typically go down one night a week for patches and maintenance.  This can be problematic for nightly BI extracts.  Make sure that you are aware of the outage schedules and can plan around them.  ETL routines may need to start a bit later on the day on which maintenance occurs.
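As a small illustration of that scheduling tweak, the sketch below picks a later kickoff time on the maintenance night; the weekday and times are assumptions:

```python
from datetime import datetime, time

# Assumption: the tenant's weekly maintenance window falls on Friday night.
MAINTENANCE_WEEKDAY = 4              # Monday=0 ... Friday=4
NORMAL_START = time(1, 0)            # usual 1:00 AM kickoff
POST_MAINTENANCE_START = time(4, 0)  # later kickoff on patch night

def etl_start_time(today=None):
    """Pick a later kickoff time on the night the tenant is patched."""
    today = today or datetime.now()
    if today.weekday() == MAINTENANCE_WEEKDAY:
        return POST_MAINTENANCE_START
    return NORMAL_START

print(etl_start_time())
```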

#1: Create an operational structure for the ongoing needs of the BI function in advance.  Georgetown is embarking upon a 'BI Center of Excellence' that will provide this sort of cross-functional structure and will be comprised of subject matter experts representing the different business units.  This will help the business units continue to drive the development of the platform and allow IT to play a supporting role.

 

Data Governance in Higher Education

Data governance in higher education is an interesting topic.  Most colleges and universities operate in a decentralized manner to support the various functions of their business.  Consider student records (registrar's office), student accounts, financial aid, admissions, financial affairs, human resources, advancement, facilities, etc.  In larger universities, decentralized operations are further compounded by multiple campuses and disparate physical/geographic locations.

As data become more prolific in our day-to-day jobs, and big data keeps growing, what role does data governance play in the current and future strategy of a higher ed institution?  It should play a major role.  But how?


I recently listened to an interview on Educause.com about Notre Dame's data governance strategy.  Mike Chapple, Sr. Director of Information Technology Service Delivery, had some interesting insights.

In this interview, he discussed 5 key pillars of Notre Dame’s data governance strategy:

  1. Quality and consistency
  2. Policy and standards
  3. Security and privacy
  4. Compliance
  5. Retention and archiving

He also discussed the use of tools such as the RACI matrix to help steward various data governance processes.  All of these topics are relevant – and challenging to centralize.

We’re undergoing a similar effort at Georgetown to address some of these areas and to create a “BI Center of Excellence”.  We’ve engaged Gartner to learn more about the broader higher education landscape and to glean best practices.  This will become increasingly important as we roll out Workday Financials this July.  I look forward to these discussions and the opportunity to help better structure reporting operations and data governance in a central operation.

What is your organization doing?  Where does reporting/BI fall organizationally within your university?  Under the CIO, CFO, or Provost?  Or, do you have a neutral reporting/data governance organization that serves everyone?  Or, do you have reporting centers for each campus that report organizationally into a broader reporting unit?

These are some of the questions that we are trying to answer as we design the future state of how reporting and BI are handled at Georgetown.  If you are a member of HEDW, it just released an informative data governance survey which includes data from about 25 universities.  As we progress further down this path, I will post more about what was ultimately decided to support Georgetown's efforts.

 

Business Intelligence with Workday

As part of my current project at Georgetown University, we have a requirement to extract large volumes of data from Workday for business intelligence purposes.  We've been running Workday HCM for about 2 years and plan to go live with Workday Financials this July.  Historically, the university has largely been an advocate of on-premise solutions – and Workday represents one of the first large steps we've taken to move ERP data "into the cloud".


What does this mean for business intelligence?  Let’s dive a bit deeper.

Given that Workday is a SaaS (software-as-a-service) vendor, a direct on-premise database connection isn't an option.  That being said, we had to explore options to get data out of the Workday platform – and to do it quickly and efficiently.

Below are a few data extraction methods which we considered:

  1. Utilize Workday Studio and leverage the web services-based Workday API
  2. Join required Workday objects in a custom report and expose them as a RaaS (report-as-a-service)
  3. Utilize Informatica Cloud

We quickly ruled out option #3.  We enjoy using Informatica and thought this might be a quick win.  Unfortunately, when aligning the data options available in Informatica Cloud with our requirements, we found several gaps (i.e. missing data, data not defined down to the appropriate grain, etc.).  Informatica offered to partner with us to expand the capabilities (much appreciated), but unfortunately our timeline did not permit this for our current project.  We continue to use Informatica PowerCenter as our best-in-class data integration tool.


So – which option was better: #1 (API) or #2 (RaaS)?  This question was kicked around for quite some time.  After much deliberation, we decided on option #2 (RaaS).  You're probably already asking why.  Isn't the API faster?  Below are the results of our analysis:

  1. The Workday API was hitting a governor limit for our tenant.  This may have been a specific configuration setting for our tenant, but it was prohibitive for pulling large amounts of data out of the API, which was timing out on certain data pulls at the 2-hour mark.  It also required an API developer to manage.  Workday has done a nice job of documenting its API, and we found this to be extremely helpful along the way.  I'm sure we'll have a future use for the API.
  2. Custom Reports gave us the flexibility to join multiple objects to meet our requirements.  The API calls were limited to what is publicly available in those calls; Custom Reports gave us a bit more flexibility to combine the objects which support our requirements.
  3. Custom Reports are built using the Workday GUI.  This is an added benefit, as it does not require a developer to maintain API calls.  A functional resource can maintain the custom reports which are used for data extraction.
  4. Custom Reports (so far) have been able to support the large data extractions that we require – without timing out.  As an example, we have been able to extract about 200K bi-weekly payroll records in about 12 minutes.  We will use this Custom-Report-delivered-by-EIB mechanism to support incremental loads to our Finance and HCM data warehouses.  (A rough sketch of the general RaaS pattern follows this list.)
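For anyone curious what option #2 can look like in practice, here is a minimal sketch of pulling a custom report over HTTP as a RaaS web service rather than via EIB.  The URL shape, tenant, report name, and credentials are hypothetical placeholders, and your tenant's endpoint and authentication method may differ:

```python
import csv
import io
import requests

# Hypothetical RaaS-style URL for a custom report exposed as a web service.
# The host, tenant, owner, report name, and credentials below are placeholders.
REPORT_URL = (
    "https://wd2-impl-services1.workday.com/ccx/service/customreport2/"
    "mytenant/integration_user/BI_GL_Extract?format=csv"
)

def pull_report(user, password):
    """Fetch the report output as CSV and return a list of row dictionaries."""
    response = requests.get(REPORT_URL, auth=(user, password), timeout=600)
    response.raise_for_status()
    return list(csv.DictReader(io.StringIO(response.text)))

# rows = pull_report("integration_user@mytenant", "********")
# print(f"Pulled {len(rows)} rows")
```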


Using Workday's Enterprise Interface Builder (EIB), we have been able to schedule the Custom Reports and output them to a text file format which is conducive to loading into our business intelligence platform.  The EIB tool provides an easy-to-use graphical and guided interface to define inbound and outbound integrations without requiring any programming.  Used by the majority of Workday customers, the EIB can be used by both business and IT users to address a variety of integration needs.  A small illustration of how a nightly ETL might consume one of these delivered files follows below.
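As an illustration only (our production workflow runs in Informatica PowerCenter), the sketch below consumes an EIB-delivered CSV file and performs an incremental insert/update into a staging table, reporting the inserted and updated counts that feed the row-count reconciliation described in an earlier post.  The file name, table, and key column are assumptions:

```python
import csv
import sqlite3

# Placeholders: your warehouse, table, and key column will differ.
conn = sqlite3.connect("warehouse.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS stg_journal_lines "
    "(journal_line_id TEXT PRIMARY KEY, fiscal_period TEXT, amount REAL)"
)

inserted = updated = 0
with open("eib_journal_lines.csv", newline="") as f:
    for row in csv.DictReader(f):
        cur = conn.execute(
            "UPDATE stg_journal_lines SET fiscal_period = ?, amount = ? "
            "WHERE journal_line_id = ?",
            (row["fiscal_period"], float(row["amount"]), row["journal_line_id"]),
        )
        if cur.rowcount:   # existing row: counted as an update
            updated += 1
        else:              # new row: insert it
            conn.execute(
                "INSERT INTO stg_journal_lines VALUES (?, ?, ?)",
                (row["journal_line_id"], row["fiscal_period"], float(row["amount"])),
            )
            inserted += 1

conn.commit()
print(f"inserted={inserted}, updated={updated}")
```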


Our project is still underway.  I'll write another post with more details as it moves further toward completion.  If you have any thoughts or suggestions regarding Workday data extraction / integration, please feel free to comment – suggestions are welcome!

The Power of Timeliness

My team at Georgetown is tackling a large upgrade of Ellucian's Banner Operational Data Store product.  You're probably already wondering what this has to do with timeliness.  Ironically, we haven't been timely at all in keeping pace with Ellucian's software updates for this product.  When I stepped into my current role, I made a commitment to get all of our software products upgraded to the latest versions.  This is important not only for the support/maintenance benefits, but also to take advantage of the latest software features – many of which positively impact our business.


This large upgrade that we are completing for Ellucian’s ODS product further exposed the benefit of timely and responsive team collaboration.  To give you a sense of the magnitude, this upgrade impacts over 400 reports across thousands of users.

  • To better collaborate, we set up a Google Spreadsheet, which enabled the team to perform collaborative testing in real time.
  • From an IT perspective, we documented all of the reports that we wanted the end-users to test by functional area.
  • We then assigned points of contact in each functional area to test that collection of reports.
  • Feedback, errors, and successes were all tracked consistently in the Google Spreadsheet.

So – where am I going with all of this?  The point of this article is to talk about timeliness.


I've seen this in other cycles as well: the testers who performed their testing early actually received more feedback and were more successful in their efforts.

  • By submitting feedback early, testers enabled my team to respond more quickly.  In some cases, we had time to screen share with the users.
  • My team was also timely and provided quicker responses to these initial pieces of feedback.  This was due to bandwidth: they had more flexibility to respond earlier in the testing cycle.
  • As a result, the teams that submitted early feedback received more iterations within the same testing cycle.  This made their efforts more successful overall.


There is a lot of meaning behind the old adage, “The early bird catches the worm.”  In this case, it had a direct correlation to the success of testing.

In many other cases, timeliness and responsiveness can also have a positive impact on:

  • Employee morale
  • Customer satisfaction
  • And, if you’re in the commercial world – profit

Ten Steps to Deploy IBM Cognos 10.2.1

Last weekend, my team successfully migrated Georgetown's IBM Cognos software from version 10.1 to 10.2.1.  Seems like a piece of cake, right?  Well, in our case, not exactly.  I thought I'd share a bit about our experience in the event that it may help others.


Our IBM Cognos upgrade effort followed a previously failed attempt to upgrade.  After much deliberation with IBM, it was determined that the failed attempt was primarily related to the fact that we had extremely old hardware and were running on a dated Solaris OS.  Still being fairly new to Georgetown, I dug in a bit further.  Indeed, our hardware was ancient, and the Solaris OS was preventing us from getting much-needed support from the vendor (IBM).  We had previously implemented IBM Cognos in 2005 on version 8.4.  Then, in 2010, we migrated, on the same hardware, to version 10.1.  In short, I had been dealt a server that was 8-9 years old and software that hadn't been upgraded in at least 3-4 years.

Through several conversations with IBM SMEs, we settled on a proposed 6-server architecture (depicted below).  I've removed the details for security reasons, but we designed these machines down to the processor and RAM level.  We also had conversations with IBM about which OS would be best suited for us longer term, and we landed on Windows Server 2012.

[Diagrams: Cognos hardware sizing and virtual infrastructure]

For anyone interested, below is the series of steps that we followed to get this project off the ground:

  1. Proposed business case and secured funding
  2. Assessed the current state architecture internally.  Made an educated decision on what we felt we needed to support our business requirements for the future state.  We were specific – down to the processor and memory level for each machine in the architecture.  We lessened the hardware requirements for the DEV and TEST tiers; the QA and PROD tiers were identical.
  3. Validated the architecture with the vendor (IBM) and ensured that they supported and recommended the approach.
  4. Altered the architecture post vendor recommendation.
  5. Based upon the agreed architecture, engaged the IBM sales representative to identify the correct licensing.  This step will make your head spin a bit.  IBM calculates license costs on a Processor Value Unit (PVU) basis – effectively, a proprietary formula used to calculate how much processing power resides in each of your machines.  This is done per processor and accounts for cores.  (A rough illustration of the arithmetic follows this list.)
  6. Negotiated and completed the procurement process.  Thankfully, IBM has some decent higher education discounts.  For future operating budgets, please be aware that the licensing cost does not stay flat; you'll typically see a 4-5% increase per year.  Also, for renewals, you might consider working through a business partner (such as CDW-G) to save money.
  7. Set up the infrastructure.  We chose to do this in a virtual environment.  We also set up the architecture in DEV, TEST, QA, and PROD.
  8. Configured the IBM Cognos software (in this case, 10.2.1).  This is more intensive across the distributed architecture, but well worth the performance and scalability benefit.
  9. Tested, tested, and tested.  We started at the DEV tier and slowly promoted to TEST, QA, and then PROD.  If you have an existing environment already in production, you may consider running the two production environments in parallel for a short period of time.  We did this for about a week and then recopied the content store from the live environment to the new production environment.  It provided an additional level of comfort for testing.
  10. Went live and enjoyed the new features of IBM Cognos 10.2.1.  Please note – we decided to go live with 10.2.1 on our existing 32-bit packages.  As a next phase, we are migrating all of the 32-bit packages to 64-bit.  You may consider doing this during the testing phase and deploying everything at once.
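Purely as an illustration of how PVU-based licensing math works (the per-core rating and per-PVU price below are made-up numbers, not IBM's actual tables), the arithmetic looks roughly like this:

```python
# Hypothetical PVU math: the per-core rating and price are illustrative only.
servers = [
    {"name": "gateway", "processors": 1, "cores_per_processor": 4},
    {"name": "app-1",   "processors": 2, "cores_per_processor": 4},
    {"name": "app-2",   "processors": 2, "cores_per_processor": 4},
]

PVU_PER_CORE = 70      # assumption: the rating depends on the processor model
PRICE_PER_PVU = 50.0   # assumption: a negotiated price per PVU

total_pvus = sum(s["processors"] * s["cores_per_processor"] * PVU_PER_CORE for s in servers)
print(f"Total PVUs: {total_pvus}, estimated license cost: ${total_pvus * PRICE_PER_PVU:,.0f}")
```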


What tips do we recommend?

  1. Ensure your SSL certificate is installed across all of the machines in your architecture.  If you only install it on the gateway server(s), the images in some of your reports will be broken.  They attempt to load via port 80 (HTTP) instead of port 443 (HTTPS) and are blocked.
  2. The governor limit on packages gets reset.  We had to go in and modify each package to reset this limit.
  3. The portal may seem slower initially. Allow the content store several business days to optimize and reindex.  You’ll then see an improvement.
  4. Don’t forget to import the new visualizations and install the mobile capability. Very cool stuff!
  5. Collaborate with IBM. They may offer to provide an overview of the new features to your team.  If you have budget, they may also have optimization recommendations.

So what are our favorite features thus far?

  1. External object store for report output – helps tremendously with the size of our content store.
  2. New visualizations – very cool!
  3. We also enjoy the Cognos Mobile app which allows us to share content on mobile devices and push alerts.

CognosWorkspace

CognosMobile

Here’s the full list of new features from IBM:
http://pic.dhe.ibm.com/infocenter/cbi/v10r2m1/index.jsp?topic=%2Fcom.ibm.swg.ba.cognos.ug_cra.10.2.1.doc%2Fc_asg_new_10_2_1.html

Enhance the way you do business in the cloud with Intuit’s Quickbase


My recent experiences with Intuit's QuickBase product led me to write this article.  I found it to be a great tool for project management – and beyond my needs, several departments adopted it as a means to build custom, cloud-based applications.  Here are a few quotes from my former team.  I've also reviewed the application on TrustRadius for any of you who may be looking to purchase.

Quickbase allowed our professional services team to fine tune our delivery processes, reduce delivery time, and ultimately improve our margins.

We felt more connected as a team due to the workflow and notification capabilities.  We felt in-sync!

Quickbase allowed us to quickly adapt and keep pace with our ever-changing business.

I felt more empowered to write my own reports and see data the way I needed to in order to run my business unit.

The flexibility of QuickBase's platform allows for almost limitless uses of the product.  If you are considering a similar tool (e.g. Salesforce), here are a few features that we enjoyed.  Disclaimer: I am not paid in any way by Intuit.

  1. Get started quickly with best practice templates.  My favorite?  The Project Management template.
    • QuickBase provides numerous out-of-the-box templates.  On many occasions, we thought up a concept and then realized that Intuit already had a template available that solved 75-80% of our needs.  We were then able to quickly customize this template to build our “custom” application.  It was simple to add fields, modify forms, etc.
  2. Easy-to-use reports, charts, and graphs.
    • Not only is it easy to set up reports around your data, it is also easy to view and modify the data directly from your report.  The QuickBase platform allows you to drill into the data from the reporting elements.  You can then use 'Grid Edit' mode in many cases to directly edit the data that you've drilled into.  This assumes that you have the appropriate access to that data, of course.


  3. Control access easily through customizable roles.
    • Access and security are sometimes an afterthought.  QuickBase allows you to set up custom roles and then secure the data that resides in your application tables down to the data element.  You can choose the basic functions of view, add, modify, and delete.  This flexibility allows you to set up complex security in your application to reflect the unique needs of your business community.  You can also nest roles for individuals who may be in more than one.

  4. Easily add custom fields to support your business need.  There are a variety of data types which you can select.
    • Be it a custom application or the customization of an existing QuickBase template, it is tremendously helpful to be able to add custom fields to your application.  This is done easily via the 'Add New Fields' feature.  Intuit has made it really easy by predefining a number of logical data types.  You can even use their scripting language to create a derived or calculated field.  Personally, I've found that calculated fields come in extremely handy.  In terms of project management, I've used calculated fields to determine the variance between two dates, which helped me analyze how long a certain activity was taking to complete across several projects.  (A tiny illustration of that calculation follows this list.)

  5. Subscriptions and notifications are a powerful way to keep your users engaged and target communication to them at the right time.
    • QuickBase has an excellent built-in feature to handle notifications.  These are helpful when setting up complex workflows.  They are also very nice for prompting users to take action on particular pieces of data when they change.  We were able to reduce our delivery timeframes by notifying different functional groups when activities were completed ahead of schedule.  This had a direct impact on our business and helped us to achieve better margins.
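QuickBase formulas use their own syntax; purely to show the idea behind that date-variance calculation, here is the equivalent logic in Python with made-up field names:

```python
from datetime import date

# Made-up project records; in QuickBase these would be fields on a table.
activities = [
    {"project": "ERP Phase 1", "planned_finish": date(2014, 3, 1), "actual_finish": date(2014, 3, 18)},
    {"project": "ERP Phase 2", "planned_finish": date(2014, 5, 15), "actual_finish": date(2014, 5, 10)},
]

# Equivalent of a calculated "variance" field: days between two dates.
for a in activities:
    variance_days = (a["actual_finish"] - a["planned_finish"]).days
    print(f"{a['project']}: variance of {variance_days} days")
```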

Don't take my word for it.  QuickBase offers a free 30-day trial (no credit card required).  Sign up with your email address and let me know what you think!  In the trial, you'll have the ability to use all of the features above and get a look at the available templates.  If you're looking to build a pilot 'business case' for your boss – this is the spot!

Considering BI? Follow these 10 steps for success!

Business intelligence.  It consistently ranks as one of the top priorities in various CIO and IT surveys.  In January 2013, Gartner conducted an executive program survey of over 2,000 CIOs.  I’ve listed the top 10 business and technology priorities from the survey below.

Top 10 Business Priorities
  1. Increasing enterprise growth
  2. Delivering operational results
  3. Reducing enterprise costs
  4. Attracting and retaining new customers
  5. Improving IT applications and infrastructure
  6. Creating new products and services (innovation)
  7. Improving efficiency
  8. Attracting and retaining the workforce
  9. Implementing analytics and big data
  10. Improving business processes

Top 10 Technology Priorities
  1. Analytics and business intelligence
  2. Mobile technologies
  3. Cloud computing (SaaS, IaaS, PaaS)
  4. Collaboration technologies (workflow)
  5. Legacy modernization
  6. IT management
  7. CRM
  8. Virtualization
  9. Security
  10. ERP Applications

SaaS = software as a service; IaaS = infrastructure as a service; PaaS = platform as a service
Source: Gartner Executive Programs (January 2013)

That said, it is well known that many business intelligence (BI) projects fail.  Are you planning to implement a BI and analytics program?  Maybe thinking about big data?

Here are a few lessons learned that will put you on the path to success:

  1. Ensure executive sponsorship from the start of your project.  This may seem like a simple thing, but ensuring that the executives stand behind your project is critically important.  Before the project starts, lay out the business plan, create budgets (operating/capital), document key stakeholders, and set up a governance structure.  Keep these executives apprised of progress throughout and ensure they see the business value that you are providing.
  2. Understand your current software agreements – and choose the right software/database platform for your organization.  Many people ask me – what is your favorite BI software/database platform?  My answer is that it always depends.  It depends on what contracts your company already has in place.  It depends on the skills and expertise of your staff.  It depends on the budget of your project.  The net is that there are a variety of really good BI tools on the market – to name a few: Microsoft, Oracle, MicroStrategy, IBM, and Teradata.  For a small-scale, rapid BI implementation, consider cloud-based tools such as Intuit's QuickBase.
  3. Be inclusive during requirements gathering – don't design in a bubble.  IT departments often get frustrated with business users, as they feel they can get the technical solution in place much more quickly without the business users' involvement.  While this is probably true, if you don't get critical buy-in from these business users, your initiative will ultimately fail.  The business users need to understand that the system will support their needs and requirements.  This is also critical when you get to the decommissioning phase (item #9 below).
  4. Employ a professional services team to help you.  This is not necessary, but in my personal opinion, I feel that it is tremendously helpful.  Your staff may or may not have dedicated time for the project.  Bringing on a few technical consultants and a project manager can really help to drive the project forward.  In addition, they hold an objective opinion and can help facilitate communication and decisions among departments.
  5. Don't overlook security.  Security is often over-engineered in BI projects.  Please remember that BI projects don't need to have ERP-level security.  You may consider simplifying the security model a notch in order to gain better performance.  During your design, you may also find that the users of the BI platform are power users by nature.  The platform doesn't have to be available to everyone; you may consider allowing a greater level of access to a small number of "power users".  This will depend on your specific deployment, but make sure you plan to discuss security early in the project and don't leave it as an afterthought.
  6. Document data definitions and ensure buy-in on each definition across groups.  Data definitions can be the most challenging part of a successful BI project – particularly if you have multiple groups involved and they each have their own definition of one piece of data.  This is a tedious process, but be diligent in working through definitions for your data and particularly for any calculated fields.  You may consider software packages that help you manage this workload (e.g. Informatica's Business Glossary).
  7. Keep it simple for the end-user and consider a portal.  Presentation is also important.   In a recent implementation, I used Microsoft’s SharePoint 2013 and its Team Site capability to disseminate reports and data by department.  We used team sites for the various departments and a public site to bring all of the departments together.  Consider building a portal similar to this for your end-users.  This makes the report/analytics delivery seamless and easy.  In an ever-growing mobile world, ensure that the portal is mobile capable.  Your users will want to access data when on the go.
  8. Allow for collaboration, but in a controlled environment.  Control the data model and only expose elements that have been approved by your data stewards.  Allow for power-users to create their own reports, but do it in a controlled way.  Expose approved data models to them.   Keep the report creation controlled to departments so that you can keep the solution clean and tidy.  You may even consider a workspace for the custom reports/analytics so that it can be separated from the standard content that is developed and available with the platform.
  9. Decommission old reporting solutions.  During go-live, don’t forget to decommission the legacy reporting solutions.  If you do not, people will continue to use them.  Caution:  if you decommission them, you’ll need executive sponsorship (item #1) and also assurance that you’ve captured the requirements from this system (item #3).
  10. Constantly innovate and evolve your new platform.  Don’t let the platform go-live and become immediately stale.  Engage the end-users and have a constant feedback loop established with them.  Get them engaged to have them help you innovate and build additional content that will better serve their community.  Hold a user-group for key power users so that they can collaborate and share lessons learned.

TrustRadius – A great new Technology Crowdsourcing concept from Vinay Bhagat

Earlier in the year, I wrote an article about Technology Crowdsourcing.  There is a new player in this space named TrustRadius.


The talented Vinay Bhagat and his team have created a unique platform which compiles software reviews sourced from product experts.  Key differentiators in this new platform include:

  1. TrustRadius is really focused on sourcing in-depth insights from primary product users/experts.
  2. The reviews on TrustRadius are structured, and soon they will be able to be added to and amended over time.  Structure allows for curation into things like comparisons.

They launched in May, were funded by the Mayfield Fund in July, and are rapidly starting to scale.  So far, they have 23k monthly visitors and are growing at 30% per month.

Review of Microsoft Business Intelligence at TrustRadius

For interested higher education folks, I’ve posted a few reviews there that you may find interesting: