Tag Archive for Cloud Computing

Internet of Things Modern Application Development


Over the past decade, modern application development has shifted from mainframe computing to personal computers, and now to smartphones and cloud services. Each shift required new programming languages, new hardware and new application development tools. The best illustration came with the “Internet Boom” of the 1990s, which moved application development from mainframes and desktop PCs to applications that run in a web browser. That shift also brought new tools such as Visual Studio, new languages such as HTML and JavaScript, new architecture patterns such as MVC and new application lifecycle processes such as Agile/Scrum.

Then came the smartphone. This shift from PCs to small mobile devices such as iPhones forced modern application development to support multiple screen resolutions and to operate offline while remaining connected to cloud services. Our next shift is to the Internet of Things (IoT), which once again gives new meaning to modern application development. Applications now need to run on devices such as thermostats, doorbells and small Bluetooth sensors, and they must be secure, cloud-ready and able to perform predictive analysis using machine learning. Below are my thoughts on this latest shift in modern application development:

Devices

The IoT shift in modern application development includes a multitude of devices, ranging from televisions to cameras to refrigerators to pretty much anything that plugs into an outlet. One of the more notable products in this space is the Amazon Echo, which uses voice recognition as its main interface and can control your light switches, thermostats and even your music collection. The Echo is an example of an IoT device that breaks away from previous models of application development: it uses voice as its interface, is always connected to the cloud and can connect to other IoT devices. This changes everything about how we think of modern application development. It is no longer about supporting multiple device resolutions, but about what data can be captured via the latest IoT devices and how that data can be used to improve our lives. That means we need new software tools, new cloud services, new analysis software and new machine learning algorithms.

These applications do not always include fancy user interfaces, as they are often function specific. For example, an IoT device could capture changes in temperature on a farm, take soil sample readings or even capture images and video of the fields. This data can then be sent to cloud services where it can be analyzed and run through machine learning to produce an easy-to-understand update on the farm. The data from these disparate “things” needs to be collected in a common format before it can yield actionable insights. Of note, most of the “big data” being collected and processed today is machine-to-machine. Cloud services help aggregate and display this data in ways humans can understand, analyze and act on.
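As a concrete illustration of that common format, here is a minimal sketch in C# of how readings from different sensors might be normalized into one serializable shape before being sent to the cloud. The type, field names and units are hypothetical, chosen only for this example.

```csharp
using System;
using System.Text.Json;

// A hypothetical common telemetry record for disparate farm sensors.
// The field names and units are illustrative, not a prescribed schema.
public record SensorReading(
    string DeviceId,          // e.g., "soil-probe-07"
    string SensorType,        // e.g., "temperature", "soil-moisture"
    double Value,             // the reading in the sensor's native unit
    string Unit,              // e.g., "C", "percent"
    DateTimeOffset Timestamp);

public static class Program
{
    public static void Main()
    {
        var reading = new SensorReading(
            "soil-probe-07", "soil-moisture", 23.4, "percent", DateTimeOffset.UtcNow);

        // Serialize to JSON so any cloud service can ingest the data in a common format.
        string json = JsonSerializer.Serialize(reading);
        Console.WriteLine(json);
    }
}
```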

Cloud Services

Cloud services are at the heart of IoT. Devices are built to perform a simple purpose and leave the complex user interfacing, analysis and thinking to the cloud. Cloud services such as Azure IoT Hub provide both the software tooling and the service that let a device talk to the cloud and connect to other devices. For example, in the manufacturing industry, IoT devices built against Azure IoT Hub can monitor the production line and equipment use, submit that data to a cloud service and surface it for analysis so that equipment maintenance can be predicted.
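For illustration, here is a minimal sketch of sending a device-to-cloud message, assuming the Microsoft.Azure.Devices.Client device SDK for Azure IoT Hub; the connection string and telemetry fields are placeholders, not a prescribed schema.

```csharp
using System;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client; // Azure IoT Hub device SDK

public static class Program
{
    public static async Task Main()
    {
        // Placeholder connection string; in practice this comes from the
        // device's registration in your IoT Hub.
        string connectionString =
            "HostName=<your-hub>.azure-devices.net;DeviceId=line-sensor-01;SharedAccessKey=<key>";

        using var deviceClient = DeviceClient.CreateFromConnectionString(connectionString, TransportType.Mqtt);

        // Build a simple telemetry payload (field names are illustrative).
        var telemetry = new { machineId = "press-12", vibration = 0.42, timestamp = DateTimeOffset.UtcNow };
        var message = new Message(Encoding.UTF8.GetBytes(JsonSerializer.Serialize(telemetry)));

        // Send the device-to-cloud message to IoT Hub.
        await deviceClient.SendEventAsync(message);
        Console.WriteLine("Telemetry sent.");
    }
}
```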

With this shift to IoT, modern application development means building software that captures data from a range of sensors, submits that data to cloud services and then processes it with analytics services such as business intelligence dashboards, delivering timely, relevant, role-based information.

Machine Learning

So what is the point of these IoT devices in our homes, cars and workplaces, capturing data and sending it to the cloud? That is what machine learning is all about. We now need to develop algorithms that can learn from IoT data. For example, home IoT devices using machine learning can learn the normal patterns in your house and notify you only when there is a disruption, such as the lights staying on past the usual time or windows left open while you are away. Machine learning is one of the most important aspects of IoT; without it, all we would have is raw data sitting in a cloud service with no meaningful way to use it.
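To make the idea concrete, here is a minimal sketch in C# of the “learn the normal pattern, flag the disruption” concept. It uses a simple statistical baseline (mean and standard deviation over past readings) rather than a real machine learning model, and the readings and threshold are illustrative only.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Learns a simple baseline from historical readings and flags values that
// deviate strongly from it. A stand-in for the "learn normal patterns,
// notify on disruption" idea; real solutions would use richer models.
public class BaselineAnomalyDetector
{
    private readonly List<double> _history = new();

    public void Learn(double observation) => _history.Add(observation);

    public bool IsAnomalous(double observation, double threshold = 3.0)
    {
        if (_history.Count < 2) return false; // not enough data to judge

        double mean = _history.Average();
        double stdDev = Math.Sqrt(_history.Average(x => Math.Pow(x - mean, 2)));
        if (stdDev == 0) return observation != mean;

        // Flag readings more than `threshold` standard deviations from normal.
        return Math.Abs(observation - mean) / stdDev > threshold;
    }
}

public static class Program
{
    public static void Main()
    {
        var detector = new BaselineAnomalyDetector();

        // Hypothetical "lights on" minutes per evening, learned over two weeks.
        foreach (var minutes in new double[] { 180, 175, 190, 185, 170, 182, 178, 188, 176, 181, 179, 186, 174, 183 })
            detector.Learn(minutes);

        Console.WriteLine(detector.IsAnomalous(185)); // False: within the normal pattern
        Console.WriteLine(detector.IsAnomalous(420)); // True: lights on far past the norm
    }
}
```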

BlumShapiro Consulting is a Microsoft Advanced Analytics partner, with experience building modern IoT apps. 


About Hector: 

Hector Luciano, Jr. is a Consulting Manager at BlumShapiro, a Microsoft Gold Partner focusing on SharePoint, Office 365, mobile technologies and custom development solutions. Hector is an active member of the SharePoint community, and his experience spans a breadth of Microsoft .NET technologies. Focusing on software application development, he has architected and designed solutions for web, client/server and mobile platforms, working closely with business owners to understand their business processes and then design and build custom solutions. Hector currently holds the Microsoft Certified Solutions Developer (MCSD) and Microsoft Certified Professional Developer (MCPD) certifications.

 

 

KPIs in Power BI: Not as hard as you think


Power BI just keeps getting better. The addition of the KPI visual to the standard palette is another example in a long line of improvements, and it is the subject of this quick article.

Key Performance Indicators have been around since long before the computer age. Show of hands: who has ever browsed the new car showroom and NOT looked at the window stickers listing the vehicles’ MPG ratings? I thought not. While perhaps not a true KPI as explained by Gerke & Associates, Inc. here, miles per gallon is a performance metric for a car that we all understand. Most cars now come with computerized displays that show instantaneous MPG, or an average over time. Keep this in mind as we transition the discussion to Power BI.

In SQL Server Analysis Services cubes, you had the ability to create KPIs inside the cube. They could then be browsed by whatever tool was connected to the cube, such as an Excel pivot table. Though implemented slightly differently, KPIs were also available in Analysis Services Tabular models and Excel Power Pivot, both precursors to the Power BI Desktop.

But data analysts and modelers may be disappointed to find that there is no way to create a KPI in the Power BI Desktop designer. Knowing it was there in Excel Power Pivot models doesn’t help. It was there before, so why was it taken out? Enter the KPI visual.

[Image: the KPI visual on the Power BI visualizations palette]

By selecting this visual, you can create a KPI out of any metric in your model. It has three simple fields in the designer to define its appearance: Indicator, Trend axis, and Target goals. Let’s take a look at each of these in turn.

  • Indicator: This is the aggregated column or measure being considered. It could be as simple as the Sum of Sales. You do NOT need to slice this metric down to the latest value, such as [Sum of Sales for the last full month] or anything crazy like that. The base metric will suffice. The reason is explained below.
  • Trend axis: Grab a date type field for this (obviously) and (not so obviously) select the granularity: year, month, etc. Doing so will tell the KPI visual how you want to aggregate the Indicator metric over time.

At this point, your KPI should display two things: a black number and a grey shaded area in the background. We’ll explain all this at the end.

[Image: the partially configured KPI, showing a black number over a grey shaded area]

  • Target goal: This can also be an aggregated column or measure similar to the Indicator. If you don’t have one, it’s easy to create a ‘static’ goal by adding a new Measure with a simple static value such as “KPI Goal = 100”, which is what I did for this demo.

Now, there are only two options once you have all of those things set: either it looks right, or it doesn’t. Consider the following two KPI visuals from Power BI, both created from the same set of data and both using the same field settings.

[Image: the two KPI visuals compared side by side, green on the left and red on the right]

The green one on the left, I can tell you, is how it is supposed to look based on the data I entered. It’s easy to see in the six rows of data that there has been a steady increase over the last four years.

[Image: the six rows of sample data behind the KPI]

So why did my initial attempt at a KPI result in the red one on the right? Apparently there is a dependency on the order in which the KPI is designed. If you selected a metric to add to your canvas first, and then decided to switch it to a KPI, you may get erroneous results. If, however, you start by adding an empty KPI to the canvas and then populate the three fields, you will probably see what you expect. If it doesn’t look right, the fix is quite simple: remove the Indicator field and re-add it. It may be quirky, but it works.

Now let’s talk about all the pieces of information contained in this one (now correctly formatted) KPI visual. Referring to the green version on the left above, the bold number 120 corresponds to the value calculated (probably summed) for the latest point of the Trend axis. Based on my data, that is the point for 1/1/2015. Typically, data would be spread over many dates across the years, but the aggregation of the Indicator and the granularity of the Trend axis determine the latest point to be displayed. The shaded green area is the trend plot for the Indicator. It shows a decrease at the very beginning, but a steady increase after that. Next we see the green check mark and green shading, indicating that this point is above the goal. (Using the format menu for the KPI, you can reverse this if a lower number is better.) Finally, the small black text below the Indicator shows the goal and the distance from that goal.

With this type of control in your hands, you can easily create KPIs that display the same Indicator but for different time slices, such as by year, quarter, month, week or day, depending on your needs.

Microsoft’s Cloud-Scale Big Data Solution

The rate of new Azure services designed to address Big Data problems continues to accelerate.  This is due in large part to the continued maturity of Azure as a stable and reliable public cloud offering.  Indeed, the key to profitability in a business data science effort lies largely in how cost-effectively today’s cloud services deliver big data capabilities.

Several pieces of the Microsoft Big Data solution are delivered in Azure, allowing us to truly build Big Data solutions at “Cloud-Scale”.  In the context of the 3-V’s of Big Data (Volume, Variety and Velocity), “Cloud-Scale” means massive cost-effective storage, schema-less data and “ingestion” of millions of rows of data per second.

1. Azure Blob Storage eliminates the need for your Data Science team to provision a Petabyte or more of redundant storage for your Data Lake.

2. Azure Service Bus and Event Hubs deliver telemetry ingestion from websites, apps and devices, taking in millions of events per second from a wide variety of sources.  (A minimal sending sketch appears after this list.)

3. Azure Stream Analytics handles the transformation of data “In Motion”.  In traditional BI, we aggregate and summarize data “At Rest”.  The velocity of Big Data requires technology which aggregates and summarizes data in motion.

4. Azure HDInsight is Microsoft’s Apache Hadoop distribution.  Developed with Hortonworks, it’s 100% compatible with Hadoop toolsets such as Pig, Hive, Sqoop and others.  The Map and Reduce components can be deployed as Microsoft .NET assemblies written in C#.

I’ll add another V, because “Cloud-Scale” does not translate well in the land of humans.  In order for insights to be actionable (and profitable), our Big Data solution must simplify the information Visually for mere mortals.

5. Power BI is a cloud service that lets you share, collaborate on and access your Excel reports anywhere, on any device.
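As a concrete illustration of the ingestion piece mentioned in item 2, here is a minimal sketch of sending a small batch of events to Event Hubs. It assumes the current Azure.Messaging.EventHubs .NET client library (newer than the SDKs available when this was written); the connection string, hub name and payload fields are placeholders.

```csharp
using System;
using System.Text;
using System.Threading.Tasks;
using Azure.Messaging.EventHubs;           // Event Hubs client library
using Azure.Messaging.EventHubs.Producer;

public static class Program
{
    public static async Task Main()
    {
        // Placeholder connection details; use your own Event Hubs namespace.
        string connectionString = "<event-hubs-namespace-connection-string>";
        string eventHubName = "telemetry";

        await using var producer = new EventHubProducerClient(connectionString, eventHubName);

        // Batch a few illustrative telemetry events and send them to the hub.
        using EventDataBatch batch = await producer.CreateBatchAsync();
        for (int i = 0; i < 3; i++)
        {
            string payload = $"{{\"sensorId\":\"device-{i}\",\"value\":{20 + i}}}";
            if (!batch.TryAdd(new EventData(Encoding.UTF8.GetBytes(payload))))
                throw new InvalidOperationException("Event too large for the batch.");
        }

        await producer.SendAsync(batch);
        Console.WriteLine("Batch sent to Event Hubs.");
    }
}
```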

The Big Data picture is coming into focus, and it does not require a legion of consultants, racks of hardware or a team of Hadoop experts to achieve.  Enterprises that have invested in Microsoft Business Intelligence can move into the Internet of Things era with a mid-size data science project and grow to “Cloud-Scale”.

Deploying Software with Windows Intune

New trends in IT are emerging, the biggest being the move by businesses to cloud services. One of the newer cloud-based services from Microsoft is Windows Intune. Intune is Microsoft’s cloud answer to on-premises management technologies such as System Center. With Intune you can manage Windows updates and mobile/tablet devices, perform hardware and software inventories, and manage antivirus. In addition, you can now choose between two Intune plans, with or without a Windows 8 Enterprise license, so you can be sure your desktops and laptops are running the latest version of Windows.

In this blog article, I’ll walk you through deploying a third-party application, Adobe Reader, using Windows Intune. This is a great Windows Intune feature, giving IT the ability to deploy third-party applications with a few clicks rather than having users install the software themselves or IT staff walking to each desk. Time is money, and businesses are looking for ways to save money and increase productivity. In my next article, I’ll discuss how to deploy an Adobe Reader update so you can be sure everyone running third-party applications is kept up to date.

To get started, if you don’t already have the Adobe Reader MSI file, you can download it from Adobe’s FTP site: ftp://ftp.adobe.com/pub/adobe/reader/win/11.x/

Once the file is downloaded, log into your Intune admin console and click the Software icon on the left-hand side of the screen.

1. Click Add Software under Tasks on the right.


2. Run the publishing application that launches and sign in with your Intune credentials. Click Next.


3. Keep the default selections and browse to the MSI file you downloaded from the FTP site. Click Next.


4. Enter a publisher name. Optionally, you can also upload an icon for the package. Click Next.


5. Keep the defaults and click Next.


6. No command-line arguments are needed for Adobe Reader. Note, however, that you may need to supply silent-install arguments for other applications. Click Next.


7. Click Upload on the summary screen and Adobe Reader’s .msi file will be uploaded to your Windows Intune storage.


8. Once the upload is complete, click View Software Properties to go directly to the deployment settings for the software package.



9. Select the devices or users you’d like to deploy the software to, then click Next.

10. Specify the type of deployment, whether the installation is required or simply available to install, and set a deadline. Click Finish.

If you’ve followed all of the above, you should have successfully deployed Adobe Reader through Windows Intune. Comment below if you have issues or questions.