It’s Easy to Assess and Share Your Project Portfolio Health

Project Online (POL) Dashboard

 

Successful companies continually improve the way they envision, manage and execute their internal projects.  Companies that execute their projects effectively (on-time and on-budget) have a tremendous advantage in the marketplace, provided they are doing the RIGHT projects.

But we hear a common pain point from many of our clients around reporting and insights: how can we monitor the overall health of our Project Portfolio without getting mired in resource constraints and “eye-chart” style reporting?  Executives need continuous insight into how their portfolio is performing; to get it, they need to calculate the ROI and health of each project individually and then aggregate.  They also frequently need to consider Resource Availability, Schedule Constraints and Costs when assessing the health of any given project.
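
To make the “score each project, then aggregate” idea concrete, here is a minimal sketch in Python. The projects, fields and budget-weighted rollup are hypothetical illustrations, not Project Online’s actual data model; the health measure used is the standard Cost Performance Index (earned value divided by actual cost).

```python
# Toy illustration: score each project, then roll up to a portfolio view.
# The fields and weights here are hypothetical, not Project Online's schema.
projects = [
    {"name": "ERP Upgrade",    "budget": 500_000, "spent": 420_000, "pct_complete": 0.70},
    {"name": "CRM Rollout",    "budget": 250_000, "spent": 270_000, "pct_complete": 0.90},
    {"name": "Data Warehouse", "budget": 300_000, "spent": 150_000, "pct_complete": 0.55},
]

def health(p):
    """Cost Performance Index (earned value / actual cost): >= 1.0 is healthy."""
    earned_value = p["budget"] * p["pct_complete"]
    return earned_value / p["spent"]

for p in projects:
    print(f'{p["name"]}: CPI = {health(p):.2f}')

# Aggregate: a budget-weighted portfolio CPI.
portfolio_cpi = sum(health(p) * p["budget"] for p in projects) / sum(p["budget"] for p in projects)
print(f"Portfolio CPI: {portfolio_cpi:.2f}")
```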

We have been working with our Project and Portfolio Management clients to help them build dashboards and visualizations of project data managed in Microsoft Project Online.  Microsoft Project Online is a cloud-hosted Software-as-a-Service (SaaS) solution for Project Management Offices (PMOs), Project Managers and Project Teams to formally manage ALL of the projects in their portfolio in a disciplined manner.  Blum Shapiro’s Project and Portfolio Management practice helps companies improve their Project Management capabilities, often with technology tools such as Microsoft Project Online.  You can learn more about our Project and Portfolio Management practice here.

Microsoft Power BI is uniquely suited to building these dashboards and visualizations because it, too, is a SaaS product, which means you don’t need to purchase BI servers or make room in your data center.  That would make this into another project!  Further, it is designed to connect to, shape and model ALL the data you may need, whether that data is On-Premises, in the Microsoft Cloud, or in another Cloud.  Finally, it’s really, really EASY to get started.

Power BI comes in two licensing offers: Free and Pro.  The Free Edition works well for individuals or departments who want to build personal dashboards for themselves, simply to keep track of or analyze important data for which they are responsible.  We’re running workshops throughout the fall called Dashboard in a Day where we help clients connect to cloud datasets (such as Project Online) and build themselves a simple set of reports and dashboards. Truth in advertising: it takes less than a day.

However, collaboration is valuable.  Nobody wants to hold status meetings with Senior Management at their desk, or even be tied to a projector.  Therefore, we recommend that companies with a clear Project Management directive upgrade to the Pro Level ($10/user/month), and the biggest reason is to take advantage of Content Packs.  A Content Pack is a collection of pre-built Datasets, Reports and Dashboards which can be published and shared across an entire enterprise.  As easy as it is to connect and consume data in Power BI, some knowledge of the source systems is extremely helpful.  Since not everyone needs to understand how data is stored in Microsoft Project Online, let’s arrange and model the data for our colleagues and direct reports, then share our insights.

Blum Shapiro Consulting offers a Power BI Content Pack which contains pre-built reports and dashboards for Microsoft Project Online.  All that is required is for us to change the web address of our data sources from ours to yours, publish the content pack to your Power BI tenant, and your organization will have instant visibility into the project portfolio.

In order to get this instant visibility, users would follow these simple steps:

First, sign up for Power BI Pro.

Once signed up with an Organizational Account, users will have their own dashboard workspace.

Second, in the lower right-hand corner of the application, click Get Data.

Power BI Get Data

Third, under My Organization, click Get.

Power BI Content Pack Library

Fourth, select one of the several available Content Packs (in this case, Project Online Customer Immersion Experience) and click Connect.

POL Content Pack

After about 10 seconds, users will have a prebuilt set of Reports and Dashboards to view.

POL Dashboard

Before we leave, we’ll help you set up an Hourly Refresh Schedule on the data from Project Online.  That way, the dashboards you share will always be up to date.

Contact us to learn more about Power BI Content Packs, Microsoft Project Online or our Dashboard in a Day workshops.

 

6 Critical Technologies for the Internet of Things

If you and your company prefer Microsoft solutions and technologies, you may fear that the Internet of Things is an opportunity that will pass you by.

Have no fear: Microsoft’s transformation from a “Windows and Office” company to a “Cloud and Services” company continues to accelerate.  Nowhere is this trend more evident than in the range of services supporting Internet of Things scenarios.

So – What are the Microsoft technologies that would comprise an Internet of Things solution architecture?

And – How do Cloud Computing and Microsoft Azure enable Internet of Things scenarios?

Here are the key Microsoft technologies which architects and developers need to understand.

Software for Intelligent Devices

First, let’s understand the Things.  The community of device makers and entrepreneurs continues to flourish, enabled by the emergence of simple intelligent devices.  These devices have a simplified, lightweight computing model capable of connecting machine-to-machine or machine-to-cloud.  Windows 10 for IoT, released in July 2015, enables secure connectivity for a broad range of devices on the Windows Platform.

Scalable Event Ingestion

The Velocity of Big Data demands a solution capable of receiving telemetry data at cloud scale with low latency and high availability.  This component of the architecture is the “front-end” of an event pipeline which sits between the Things sending data and the consumers of the data.  Microsoft’s Azure platform delivers this capability with Azure Event Hubs – extremely easy to set up and connect to over HTTPS.
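
As a taste of how simple that HTTPS connection is, here is a minimal Python sketch that sends one telemetry event to Event Hubs using its REST endpoint. The namespace, hub name, policy name and event fields are placeholders; substitute your own, and note that the Shared Access Signature is built with the standard Service Bus signing scheme.

```python
import base64, hashlib, hmac, json, time, urllib.parse
import requests

NAMESPACE = "contoso-iot"      # placeholder: your Event Hubs namespace
HUB = "telemetry"              # placeholder: your event hub name
KEY_NAME = "SendPolicy"        # a Shared Access Policy with Send rights
KEY = "<your-shared-access-key>"

uri = f"https://{NAMESPACE}.servicebus.windows.net/{HUB}/messages"

def sas_token(uri, key_name, key, ttl_seconds=3600):
    """Build a Service Bus Shared Access Signature for the given URI."""
    expiry = int(time.time()) + ttl_seconds
    string_to_sign = urllib.parse.quote_plus(uri) + "\n" + str(expiry)
    signature = base64.b64encode(
        hmac.new(key.encode(), string_to_sign.encode(), hashlib.sha256).digest()
    ).decode()
    return (f"SharedAccessSignature sr={urllib.parse.quote_plus(uri)}"
            f"&sig={urllib.parse.quote_plus(signature)}&se={expiry}&skn={key_name}")

event = {"deviceId": "sensor-42", "temperature": 72.5, "ts": time.time()}
resp = requests.post(
    uri,
    data=json.dumps(event),
    headers={
        "Authorization": sas_token(uri, KEY_NAME, KEY),
        "Content-Type": "application/atom+xml;type=entry;charset=utf-8",
    },
)
resp.raise_for_status()  # 201 Created on success
```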

Still, Volume plus Velocity leads to major complexity when Big Data is consumed; the data may not be ready for human consumption.  Microsoft provides options for analyzing this massive stream of “fast data”.  Option 1 is to process the events “in-flight” with Azure Stream Analytics (ASA).  ASA allows developers to combine streaming data with Reference Data (e.g. Master Data) to analyze events, detect defects and “likes”, and summarize the data for human consumption.  Option 2 is to stream the data to a massive storage repository for analysis later (see The Data Lake and Hadoop, below).  Regardless of whether you analyze in flight or at rest, a third option can help you learn about what is happening behind the data (see Machine Learning, below).
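
To see what “combine streaming data with Reference Data and summarize” means, here is a plain-Python sketch of the concept: join each event with master data, then count defects per tumbling time window. ASA expresses this in its own SQL-like language; the devices, locations and window size below are hypothetical.

```python
# Conceptual illustration only -- not ASA syntax.
from collections import defaultdict

reference = {"dev-1": "Press Room", "dev-2": "Paint Shop"}  # hypothetical master data

events = [  # hypothetical telemetry: (epoch seconds, device, defect flag)
    (0, "dev-1", 0), (12, "dev-2", 1), (35, "dev-1", 1),
    (61, "dev-2", 0), (75, "dev-1", 1),
]

WINDOW = 60  # one-minute tumbling window
defects = defaultdict(int)
for ts, device, defect in events:
    window_start = ts - (ts % WINDOW)
    location = reference.get(device, "Unknown")  # the "join" with reference data
    defects[(window_start, location)] += defect

for (window_start, location), count in sorted(defects.items()):
    print(f"{window_start:>4}s  {location:<10} defects={count}")
```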

Machine Learning

We’ve learned a lot about “Artificial Intelligence” over the past 10 years.  Indeed, we’ve learned that machines “think” very differently than humans do.  Machines use principles of statistics to assess which features (“columns”) of a dataset provide the most “information” about a given observation (“row”).  For example, which variables are most predictive of (or most closely correlated with) the final feature of the dataset?  Having learned how the features relate to one another, a machine can be “trained” to predict the outcome of the next record in the dataset; given an algorithm and enough data, a machine can learn about the real world.
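
Here is that idea in miniature, using scikit-learn rather than Azure ML: train a model to predict the final column of a dataset from the other features, then ask which features carried the most information.

```python
# A minimal sketch in scikit-learn (not Azure ML) of "train, predict, inspect".
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)          # features ("columns") and target
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(f"Accuracy on unseen rows: {model.score(X_test, y_test):.2f}")

# Which features carried the most "information" about the outcome?
for name, importance in zip(load_iris().feature_names, model.feature_importances_):
    print(f"{name}: {importance:.2f}")
```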

If the IoT solution you envision includes predictions or “intelligence”, you’ll want to look at Azure Machine Learning.  Azure ML provides a development studio for data science professionals to design, test and deploy Machine Learning services to the Microsoft Azure Cloud.

Finally, you’ll also want to understand how to organize a data science project within the structure of your company’s overall project management processes.  The term “Data Science” is telling – it indicates an experimental aspect to the process.  Data scientists prepare datasets, conduct experiments, and test their algorithms (written in statistical processing languages like “R” and “Python”) until the algorithm accurately predicts correct answers to questions posed by the business, using data.  Data Science requires a balance between experimentation and business value.

The Data Lake and Hadoop

A Data Lake is a term used to describe a single place where the huge variety of data produced by your big data initiatives is stored for future analysis.  A Data Lake is not a Data Warehouse.  A Data Warehouse has one single structure; data arriving in a variety of formats must be transformed into that structure.  A Data Lake has no predefined structure.  Instead, the structure is determined when the data is analyzed, and new structures can be created over and over again on the same data.

Businesses have the choice of simply storing Big Data in Azure Storage.  If data velocity and volume exceed the limits of Azure Storage, Azure Data Lake is a specialized storage service optimized for Hadoop, with no fixed limits on file size.  Azure Data Lake was announced in May 2015, and you can sign up for the Public Preview.

The ability to define a structure as the data is read is the magic of Hadoop.  The premise is simple: Big Data is too massive to move from one structure to another, as you would in a Data Warehouse/ETL solution.  Instead, keep all the data in its native format, wait to apply structure until analysis time, and perform as many reads over the same data as needed.  There is no need to buy tons of hardware for Hadoop: Azure HDInsight provides Hadoop-as-a-Service, which can be enabled and disabled as needed to keep your costs low.
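
Schema-on-read fits in a few lines of Python: the raw records stay untouched, and each analysis applies its own structure at read time. (Hive on HDInsight does this at scale; the records and fields below are hypothetical.)

```python
import json

raw_lines = [  # the "lake": raw records kept in their native format
    '{"device": "dev-1", "temp": 71.8, "site": "Hartford"}',
    '{"device": "dev-2", "humidity": 0.44, "site": "Boston"}',
]

records = [json.loads(line) for line in raw_lines]

# Analysis 1: impose a "temperature readings" structure over the raw data.
temps = [r for r in records if "temp" in r]

# Analysis 2: a completely different structure over the same data.
sites = {r["site"] for r in records}

print(temps)
print(sites)
```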

Real Time Analytics

The human consumption part of this equation is represented by Power BI.  Power BI is the “single pane of glass” for all of your data analysis needs, including Big Data.  Power BI is a dashboard tool capable of transforming company data into rich visuals.  It can connect to data sources on-premises, consume data from HDInsight or Azure Storage, and receive real-time updates from data “in-flight”.  If you are located in New England, attend one of our Dashboard in a Day workshops happening throughout the Northeast in 2015.
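
Those real-time updates arrive through the Power BI REST API’s push datasets. Here is a minimal sketch of pushing one “in-flight” row; the dataset ID, table name and Azure AD access token are placeholders, and acquiring the token is omitted.

```python
import requests

DATASET_ID = "<your-dataset-id>"          # placeholder
ACCESS_TOKEN = "<azure-ad-access-token>"  # placeholder; token acquisition omitted

# "Telemetry" is a hypothetical table name in the push dataset.
url = (f"https://api.powerbi.com/v1.0/myorg/datasets/"
       f"{DATASET_ID}/tables/Telemetry/rows")

rows = {"rows": [{"deviceId": "sensor-42", "temperature": 72.5}]}
resp = requests.post(
    url, json=rows,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()  # dashboards tied to this dataset update in real time
```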

Management

IoT solutions are feasible because of the robust cloud offerings currently available.  The cloud is an integral part of your solution, and you need resources capable of managing your cloud assets as though they were on-premises.  Your operations team should be comfortable turning services on and off in your cloud, just as they are comfortable enabling services and capabilities on a server.  Azure PowerShell provides the operations environment for managing Azure cloud services and automating their maintenance and management.

Conclusion

Enterprises ready to meet their customers in the digital world will be rewarded.  First, they must grasp Big Data technologies.  Microsoft customers can take advantage of the Azure cloud to create Microsoft Big Data solutions, designed first by connecting Things to the cloud, then by creating and connecting Azure services to receive, analyze, learn from and visualize the data.  Finally, be ready to treat those cloud assets as part of your production infrastructure by training your operations team in cloud management tools from Microsoft.

Reorganize Your Content in SharePoint

There is never a bad time to think about how your content can be structured better. This is especially true if you are migrating to SharePoint from an old platform, upgrading, or even sticking with your current SharePoint system. This article will give you a few key areas to think about when reorganizing your content in SharePoint.

First off, I want you to think about the date when your current system was implemented and how your company looked. Now think about how your company has changed since then. Your company could have grown exponentially, departments and teams could now be located in different towns/states/countries, or you might be doing old processes in radically different ways. In any case, it’s safe to say that your company has not stayed the same. So does it make sense for your content structure to stay the same?

Roadmap

Reorganizing your content should be a well-thought-out process. A formal roadmap will need to be created to get your content from Point A (current structure) to Point B (completed structure). Steps in your roadmap should be well defined, assigned to specific people and time-boxed.


Your plan might be to reorganize content in your current system and then migrate it. If you are migrating to a new SharePoint system, this is a great way to let your users experience the new structure in their current environment. Otherwise, you can reorganize while the content is in transit, or even after it has been moved. All of this will depend on the tools you are using, timing, business priority, etc. Many factors!

As part of your roadmap, it is always a prudent idea to think about the future of your company. As you thought about how your company changed, you also want to keep in mind how your company WILL change. No one can predict the future, but your company might have a strategic 5/10/20 year plan where they list out their growth strategy. That plan would be a good guideline when planning out your new structure.

 

Some Reorganization Steps

  • Consolidation of Libraries – There are many scenarios in which you should have multiple document libraries; ease of security management, for one. But that doesn’t mean you need multiple document libraries. SharePoint content can start to sprawl, since the ease of adding libraries might not give users pause when choosing between creating a new library and using a current one.
  • Add Content Types / Metadata – This can go hand-in-hand with the consolidation of libraries. If you are adding different document types to one library, you might want to distinguish them with Content Types or even just columns on the library. How many times have you created a new library for the current year (e.g. Financials 2014 and Financials 2015) when one library would be sufficient with the addition of a year column? (A sketch of this idea follows the list.)
  • Archive Content / Keep Content in Place – Not all data needs to make it over. You can segregate content by putting it into an Archive site, or even keep it where it resides (if it’s not going away). Remember, all this content can still be discoverable in SharePoint: SharePoint search can return results from content in prior versions of SharePoint or in other file systems and websites.
  • Remove Content – Not all content is still used, or even useful. Why keep a survey to determine where the company picnic should be if the picnic was two years ago? Back up this content and do not give it the green light to migrate.
  • Rename Content – Sometimes “Shared Documents” just isn’t descriptive enough.
  • Consolidation / Creation of Sites – Teams and departments may no longer work together or even exist.
  • Move Content to the Front – Many people who visit your SharePoint site might not be regulars. Determine what might be most relevant to all users of the site and put that on the home page.
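
Here is a sketch of the “year column instead of yearly libraries” idea from the list above: add a numeric Year column to an existing library via the SharePoint REST API. The site URL, library name and credentials are placeholders; on SharePoint Online you would supply a valid OAuth bearer token (or a form digest for classic authentication).

```python
import requests

SITE = "https://contoso.sharepoint.com/sites/finance"  # placeholder site
LIBRARY = "Financials"                                  # the consolidated library

field = {
    "__metadata": {"type": "SP.Field"},
    "Title": "Year",
    "FieldTypeKind": 9,  # 9 = Number field
}

resp = requests.post(
    f"{SITE}/_api/web/lists/getbytitle('{LIBRARY}')/fields",
    json=field,
    headers={
        "Accept": "application/json;odata=verbose",
        "Content-Type": "application/json;odata=verbose",
        "Authorization": "Bearer <access-token>",  # placeholder credentials
    },
)
resp.raise_for_status()
```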

 

If you are moving to SharePoint, upgrading, or migrating to the cloud, it is always a good idea to take a step back and look at your content. It may well be that it could use a bit more organization.

 


4 Reasons Power BI is Better than Tableau, Qlik Sense


I enjoyed reading this article by Martin Heller in which the author analyzed three top Data Visualization products: Tableau, Qlik Sense and Power BI.  Heller does a nice job explaining how these products represent an evolution in BI, making real data insights attainable for non-IT business users.  Mr. Heller says that each of these makes self-service BI “remarkably easy” for users throughout the organization, but that, in his opinion, Tableau stood out as the best of the three.

But if Heller’s analysis is correct, then his conclusion makes no sense.  He first cautions that none of these products is well suited for Enterprise Reporting: an important point – if you are looking for financial reporting, you are looking in the wrong place.  He then details each product’s features at length, with a focus on ease of use and the breadth of visualizations available in each.  He notes that there is very little difference in mobile capabilities (the Android app for Power BI is an exception, slated for a September release).  Finally, he concedes that Power BI offers better value than Qlik Sense, and significantly better value than Tableau.  My question to Martin is this: if you cannot afford Tableau or Qlik Sense licenses, and therefore cannot truly democratize business insights from data in your organization, what difference does the rest make?

Here are four reasons why Power BI is the smart choice:

1. Better Value – I’m not talking about a few dollars here and there, I’m talking 4x.  Power BI Pro is $10 per user per month ($120 per year); Tableau Online is $500 per user per year.  I’ll let you do the math.  Mr. Heller does state that Qlik Sense is less expensive than Tableau, but does not go into specifics.

2. SaaS Model – Who wants another server on-premises just to publish BI reports and dashboards?  Business users love cloud services: they pay a simple monthly subscription for the insights and visualizations they need, and they don’t need to ask IT for anything!  Further, they get accelerated product updates from the cloud – much faster than traditional IT shops can deliver.  The net result is that vendors inevitably achieve feature parity, and the question “which product has the best visualizations?” quickly becomes moot.

3. Q&A – Power BI is the only product with Natural Language Query capabilities.  You simply type a question (e.g. “What was our customer churn rate in the past 3 months?”) and Power BI selects a visualization for you to explore.  The visualization chosen may or may not have been created by the dashboard’s author.

4. Power Query – All of these tools offer simple connectors to databases, Hadoop, CSV files and cloud data providers, but neither Tableau nor Qlik Sense provides a data shaping tool with the capability of Power Query.  For power data users, Power Query’s M language has all the capability of an IT Pro’s data transformation package, with none of the IT headache.  Power Query can be used in Excel, and it can also be used in the Power BI Desktop application.  In either case, analysts consume and shape data on the desktop, build reports and publish to the cloud.
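
Power Query’s M is its own language, so as a rough analogue only, here is the kind of shaping step it makes point-and-click, sketched in Python with pandas (the CSV file and column names are hypothetical): rename columns, drop unusable rows, derive a Year column and summarize.

```python
import pandas as pd

raw = pd.read_csv("regional_sales.csv")  # hypothetical source file

shaped = (
    raw.rename(columns={"amt": "Amount", "rgn": "Region"})
       .dropna(subset=["Amount"])        # remove unusable rows
       .assign(Year=lambda d: pd.to_datetime(d["OrderDate"]).dt.year)
       .groupby(["Region", "Year"], as_index=False)["Amount"].sum()
)
print(shaped.head())
```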

But is Power BI easy to use?  Yes it is – I invite you to come see for yourself! Dashboard in a Day is a Power BI workshop which Blum Shapiro Consulting is hosting at various locations in New England through the remainder of 2015.   We are offering Dashboard in a Day sessions at No Cost to participants.

Our first Dashboard in a Day workshop will be hosted by Microsoft at the Hartford, CT office on Thursday, August 27th.  We’ll be teaching and working with 12 participants, starting promptly at 9:00 AM.  Those of you who would like to “Bring Your Own Data” are invited to stay for a working lunch where we’ll help you get started with your data (in Excel or CSV format, or organizational data stored in Salesforce, Google Analytics, Marketo or Dynamics CRM).  Register to attend here.