Tag Archive for Azure

On the Leading Edge of New Technology

Being on the leading edge of any technology can be exciting, but it’s often frustrating and even costly. There is an inherent risk associated with adopting new technology; a lack of community support or documentation when something goes wrong is just one of the issues that can arise. However, there are benefits to being an early adopter. For example, working hands-on with a new technology is the best way to understand how it works. As technology consultants, we view it as our job to understand what’s coming so we can advise our clients with a clear eye to the future.

Scenario

A client asked us about alternatives to their current Remote Desktop Services (RDS) implementation, which was hosted by a third-party vendor. There were a few issues with the existing setup, namely cost and maintaining multiple logins, and they didn’t have any type of domain or user directory. After exploring a few different RDS deployment scenarios, they ultimately decided to use a preview version of Azure Active Directory Domain Services (Azure AD DS) on Azure virtual machines.

They really liked the idea of using Azure AD DS because of the promised benefits: no servers (on-premises or in the cloud) to maintain, a simplified user interface, and so on. We shared our assessment of the risks and unknowns of using an untested technology, but the client wholeheartedly accepted these risks because there were so many more upsides to using Azure AD DS for their specific setup. So, we set out to implement Remote Desktop Services using Azure Active Directory Domain Services…and we learned a couple of things along the way which we are happy to share with you.

Sometimes the Leading Edge is the Bleeding Edge

The first lesson learned was that with Azure AD DS, you cannot be added as a Domain Admin or Global Admin. Azure AD DS has its own security group, called AAD DC Administrators, that you have to create yourself. A good thing to note when dealing with Azure AD DS, and it led us right to our second lesson.

When we tried to add the license server as a member of the Terminal Server License Servers group in Active Directory through the Licensing Manager, a permissions error popped up:

The computer account for license server [ServerName] cannot be added to the Terminal Server License Servers group in Active Directory Domain Services (AD DS) because of insufficient privileges.


Thinking back to that security group, I thought, “I am not a Domain Admin, and I cannot be a Domain Admin.” I felt a little helpless. Thankfully the computer didn’t need to be added to the group, since all RDS servers were on the same domain. But still, I couldn’t help feeling like something might be amiss later.

As a Microsoft partner we have top-tier access to Microsoft support, who recommended a few solutions to this issue…each of which ran into the same permissions roadblock.

When the Microsoft support engineer mentioned this was the first he had heard of someone trying this, I thought I must be a pioneer, attempting this while Azure AD DS was still in beta. But one thing was for sure: the Azure AD DS team liked the idea that someone was trying out an RDS implementation with it.

When you work with a beta version, or when you install something without waiting for Service Pack 2 to be released, you are blazing a new trail. There is a thrill in being the first person to try something new, and a long-standing honor in the tech world in being the first to figure something out.

In the end, after another hiccup or two, the rest of the Remote Desktop Services deployment went well, without any additional permission issues. The result showed us that Remote Desktop Services does work well with Azure Active Directory Domain Services and that it was able to accomplish the client’s goals.

Once the beta for Azure Active Directory Domain Services is complete, I’m wondering if RDS will be on the list of supported technologies. Then I will feel like a true trailblazer cutting a path for others to follow.

Our experience with Microsoft tools gives us an inside track and an ability to work with these new technologies because we deeply understand the underlying platform. While being on the bleeding edge of technology can be risky, having experts to help guide you, navigate any issues and provide needed support can help mitigate some of these risks. And in the end, the benefits to your organization will outweigh any roadblocks encountered along the way.

About Brent:


Brent Harvey has over 10 years of software development experience with a specific focus on SharePoint, Project Server, C#, and web development. Brent is an Architect at BlumShapiro Consulting, working on projects across varied industries (banking, manufacturing, health care, etc.). Brent is a Microsoft Certified Solutions Expert in SharePoint 2013, Solutions Associate in Windows Server 2012, Specialist in Developing Azure Solutions, and Professional Developer in SharePoint 2010.


6 Critical Technologies for the Internet of Things

If you and your company prefer Microsoft solutions and technologies, you may be fearing that the Internet of Things is an opportunity which will pass you by.

Have no fear: Microsoft’s transformation from “Windows and Office” company to “Cloud and Services” company continues to accelerate.  Nowhere is this trend more evident than in the range of services supporting Internet of Things scenarios.

So – What are the Microsoft technologies that would comprise an Internet of Things solution architecture?

And – How do Cloud Computing and Microsoft Azure enable Internet of Things scenarios?

Here are the key Microsoft technologies which architects and developers need to understand.

Software for Intelligent Devices

First, let’s understand the Things.  The community of device makers and entrepreneurs continues to flourish, enabled by the emergence of simple intelligent devices.  These devices have a simplified, lightweight computing model capable of connecting machine-to-machine or machine-to-cloud. Windows 10 for IoT, released in July 2015, enables secure connectivity for a broad range of devices on the Windows platform.

Scalable Event Ingestion

The Velocity of Big Data demands a solution capable of receiving telemetry data at cloud scale with low latency and high availability.  This component of the architecture is the “front end” of an event pipeline that sits between the Things sending data and the consumers of the data.  Microsoft’s Azure platform delivers this capability with Azure Event Hubs, which is extremely easy to set up and connect to over HTTPS.
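For a sense of what sending telemetry into an Event Hub looks like from code, here is a minimal Python sketch using the azure-eventhub package (a newer SDK than existed when this was written); the connection string, hub name, and payload fields are placeholders.

```python
# A minimal sketch of sending device telemetry to Azure Event Hubs from Python,
# using the azure-eventhub package (pip install azure-eventhub). The connection
# string, hub name, and payload fields below are placeholders.
import json
import time

from azure.eventhub import EventHubProducerClient, EventData

CONNECTION_STR = "<event-hubs-namespace-connection-string>"
EVENTHUB_NAME = "<event-hub-name>"

def send_reading(producer: EventHubProducerClient, device_id: str, temperature: float) -> None:
    """Serialize one telemetry reading as JSON and send it to the hub."""
    payload = {
        "deviceId": device_id,
        "temperature": temperature,
        "timestamp": time.time(),
    }
    batch = producer.create_batch()               # batches respect the hub's size limits
    batch.add(EventData(json.dumps(payload)))
    producer.send_batch(batch)

if __name__ == "__main__":
    producer = EventHubProducerClient.from_connection_string(
        CONNECTION_STR, eventhub_name=EVENTHUB_NAME
    )
    with producer:
        send_reading(producer, device_id="thermostat-01", temperature=21.5)
```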

Still – Volume + Velocity lead to major complexity when Big Data is consumed; the data may not be ready for human consumption. Microsoft provides options to analyze this massive stream of “fast data.”  Option 1 is to process the events “in flight” with Azure Stream Analytics (ASA).  ASA allows developers to combine streaming data with reference data (e.g. master data) to analyze events, defects, and “likes,” and to summarize the data for human consumption.  Option 2 is to stream the data to a massive storage repository for analysis later (see The Data Lake and Hadoop).  Regardless of whether you analyze in flight or at rest, a third option can help you learn about what is happening behind the data (see Machine Learning).
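To make the in-flight option concrete, here is an illustrative Azure Stream Analytics query held in a Python string (kept in Python only so all the code samples in this post share one language). The input, output, and field names, the reference-data join, and the 60-second window are assumptions for illustration; the query text itself would be pasted into the Stream Analytics job definition, not run from Python.

```python
# Illustrative Azure Stream Analytics query, held in a Python string for reference.
# Input/output names, field names, and the 60-second window are assumptions; the
# query would be pasted into the Stream Analytics job, not executed from Python.
ASA_QUERY = """
SELECT
    System.Timestamp AS WindowEnd,
    r.Region,
    COUNT(*) AS EventCount,
    AVG(e.Temperature) AS AvgTemperature
INTO
    [summarized-output]
FROM
    [telemetry-input] e TIMESTAMP BY EventTime
JOIN
    [device-reference] r ON e.DeviceId = r.DeviceId
GROUP BY
    r.Region, TumblingWindow(second, 60)
"""
```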

Machine Learning

We’ve learned a lot about “Artificial Intelligence” over the past 10 years.  Indeed, we’ve learned that machines “think” very differently than humans.  Machines use principles of statistics to assess which features (“columns”) of a dataset provide the most “information” about a given observation (“row”).  For example, which variables are most predictive of (or most closely correlated with) the final feature of the dataset?  Having learned how the features relate to one another, a machine can be “trained” to predict the outcome of the next record in the dataset; given an algorithm and enough data, a machine can learn about the real world.
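This idea can be demonstrated in miniature with any statistical learning library. The snippet below is a local scikit-learn sketch of “learn to predict the final column from the others” on a bundled sample dataset; it illustrates the concept only and is not Azure ML (which is covered next).

```python
# A local, minimal sketch of "train a model to predict the last column from the
# others" using scikit-learn; this illustrates the idea, not Azure ML itself.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Rows are observations, columns are features, y is the value we want to predict.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=5000)   # a simple, well-understood algorithm
model.fit(X_train, y_train)                 # "training" = estimating relationships from the data

print("accuracy on unseen rows:", model.score(X_test, y_test))
```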

If the IoT solution you envision includes predictions or “intelligence”, you’ll want to look at Azure Machine Learning.  Azure ML provides a development studio for data science professionals to design, test and deploy Machine Learning services to the Microsoft Azure Cloud.

Finally, you’ll also want to understand how to organize a data science project within the structure of your company’s overall project management processes.  The term “Data Science” is telling – it indicates an experimental aspect to the process.  Data scientists prepare datasets, conduct experiments, and test their algorithms (written in statistical processing languages like “R” and “Python”) until the algorithm accurately predicts correct answers to questions posed by the business, using data.  Data Science requires a balance between experimentation and business value.

The Data Lake and Hadoop

A Data Lake is a term used to describe a single place where the huge variety of data produced by your big data initiatives is stored for future analysis.  A Data Lake is not a Data Warehouse.  A Data Warehouse has One Single Structure; data from a variety of formats must be transformed into that structure.  A Data Lake has no predefined structure.  Instead, the structure is determined when the data is analyzed.  New structures can be created over and over again on the same data.

Businesses have the choice of simply storing Big Data in Azure Storage.  If the data velocity and volume exceed Azure Storage’s limits, Azure Data Lake, announced in May 2015, is a specialized storage service optimized for Hadoop with no fixed limits on file size; you can sign up for the Public Preview.

The ability to define a structure as the data is read is the magic of Hadoop.   The premise is simple – Big Data is too massive to move from one structure to another, as you would in a Data Warehouse/ETL solution.  Instead, keep all the data in its native format, wait to apply structure until analysis time, and perform as many reads over the same data as needed.  There is no need to buy tons of hardware for Hadoop: Azure HDInsight provides Hadoop-as-a-Service, which can be enabled/disabled as needed to keep your costs low.
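As a tiny, self-contained illustration of schema-on-read, the Python sketch below keeps raw records in their native JSON text form and applies two different structures only at read time; the record fields are invented for the example.

```python
# Schema-on-read in miniature: raw records stay in their native JSON text form,
# and each analysis applies its own structure when it reads them. Field names
# here are invented for illustration.
import json
from collections import Counter, defaultdict

raw_lines = [
    '{"deviceId": "pump-1", "site": "Hartford", "temperature": 71.2}',
    '{"deviceId": "pump-2", "site": "Boston",   "temperature": 68.4}',
    '{"deviceId": "pump-1", "site": "Hartford", "temperature": 73.9}',
]

# Read #1: structure the data as "events per site".
events_per_site = Counter(json.loads(line)["site"] for line in raw_lines)

# Read #2: over the very same raw data, structure it as "average temperature per device".
readings = defaultdict(list)
for line in raw_lines:
    rec = json.loads(line)
    readings[rec["deviceId"]].append(rec["temperature"])
avg_temp = {device: sum(v) / len(v) for device, v in readings.items()}

print(events_per_site)  # Counter({'Hartford': 2, 'Boston': 1})
print(avg_temp)         # {'pump-1': 72.55, 'pump-2': 68.4}
```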

Real Time Analytics

The human consumption part of this equation is represented by Power BI.  Power BI is the “single pane of glass” for all of your data analysis needs, including Big Data.  Power BI is a dashboard tool capable of transforming company data into rich visuals. It can connect to data sources on premises, consume data from HDInsight or Azure Storage, and receive real-time updates from data “in flight.”  If you are located in New England, attend one of our Dashboard in a Day workshops happening throughout the Northeast in 2015.
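One common way a real-time tile gets its data is by pushing rows to a Power BI streaming dataset over REST. The sketch below assumes you have created such a dataset at powerbi.com and copied its push URL; the URL and the row fields are placeholders.

```python
# A minimal sketch of pushing rows to a Power BI streaming dataset over REST,
# assuming you have created the dataset in powerbi.com and copied its push URL.
# The URL and row fields below are placeholders.
import datetime

import requests

PUSH_URL = "https://api.powerbi.com/beta/<workspace-id>/datasets/<dataset-id>/rows?key=<key>"

rows = [{
    "timestamp": datetime.datetime.utcnow().isoformat() + "Z",
    "requests": 42,
    "country": "US",
}]

response = requests.post(PUSH_URL, json=rows)   # Power BI expects a JSON array of row objects
response.raise_for_status()
```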

Management

IoT solutions are feasible because of the robust cloud offerings currently available.  The cloud is an integral part of your solution, and you need resources capable of managing your cloud assets as though they were on premises.  Your operations team should be comfortable turning services in your cloud on and off, just as they are comfortable enabling services and capabilities on a server. Azure PowerShell provides the operations environment for managing Azure cloud services and automating their maintenance and management.
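The tool of choice named here is Azure PowerShell; to keep all of the code samples in this post in one language, the sketch below shows the same kind of on/off operation with the much newer Azure SDK for Python instead. The subscription, resource group, and VM names are placeholders, and the azure-identity and azure-mgmt-compute packages are assumed.

```python
# Rough equivalent of "turning services on and off" with the newer Azure SDK for
# Python (azure-identity + azure-mgmt-compute); the article itself recommends
# Azure PowerShell. Subscription, resource group, and VM names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
VM_NAME = "<vm-name>"

credential = DefaultAzureCredential()
compute = ComputeManagementClient(credential, SUBSCRIPTION_ID)

# Start the VM for a processing window...
compute.virtual_machines.begin_start(RESOURCE_GROUP, VM_NAME).wait()

# ...and later deallocate it so it stops accruing compute charges.
compute.virtual_machines.begin_deallocate(RESOURCE_GROUP, VM_NAME).wait()
```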

Conclusion

Enterprises ready to meet their customers in the digital world will be rewarded.  First, they must grasp Big Data technologies.  Microsoft customers can take advantage of the Azure cloud to create Big Data solutions: connect the Things to the cloud first, then create and connect Azure services to receive, analyze, learn from, and visualize the data.  Finally, be ready to treat those cloud assets as part of your production infrastructure by training your operations team in Microsoft’s cloud management tools.

Face API and Power BI

At last week’s Build 2015 developer conference, Microsoft demonstrated many great new tools. One demo which got quite a bit of attention was the How Old Am I? app (http://how-old.net). It allows users to upload pictures and lets the service “guess” the age and gender of the individuals in the photo. Within a few hours, the demo went viral, with over 210,000 images uploaded to the site from all over the world. The result was a dashboard of requests from all over the globe.


This solution shows off the use of a number of powerful technologies.

Face API – Project Oxford is a set of Artificial Intelligence APIs and REST services which developers can use today to build Intelligent Systems. In addition to Facial Recognition, the Project Oxford AI services include Speech Recognition, Vision (or Image Recognition and OCR), and Language Understanding Intelligent Services – leveraging the technology capabilities of Bing and Cortana. (A rough example of calling the face-detection endpoint appears after this list.)

Azure Event Hubs – a highly scalable publish-subscribe ingestor that can take in millions of events per second. The Event Hubs API is used to stream a JSON document from the web page each time a user uploads a picture.

Azure Stream Analytics – a fully managed, low-latency, high-throughput stream processing solution. Azure Stream Analytics lets you write your stream processing logic in a very simple SQL-like language (similar to the windowed query sketched earlier). This allows the solution to take measurements every 10 seconds of how many requests arrived, from which countries, and of which gender and age. These measurements become facts for your analysis.

Power BI – the team chose Power BI as the output of the Stream Analytics job, went to http://www.powerbi.com, and selected the dataset and table created by Azure Stream Analytics. No additional coding is needed to create real-time dashboards.
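As noted in the Face API item above, a rough sketch of calling the face-detection endpoint from Python follows. The endpoint shape and returnFaceAttributes values reflect the later Cognitive Services Face API (the productized successor to Project Oxford) and are assumptions here; the subscription key and image path are placeholders.

```python
# A rough sketch of calling the face-detection REST endpoint from Python. The
# endpoint shape and returnFaceAttributes values reflect the later Cognitive
# Services Face API and are assumptions here; the key and image path are placeholders.
import requests

ENDPOINT = "https://<your-face-resource>.cognitiveservices.azure.com/face/v1.0/detect"
SUBSCRIPTION_KEY = "<face-api-key>"

with open("photo.jpg", "rb") as image_file:
    image_bytes = image_file.read()

response = requests.post(
    ENDPOINT,
    params={"returnFaceAttributes": "age,gender"},
    headers={
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "application/octet-stream",   # raw image bytes in the request body
    },
    data=image_bytes,
)
response.raise_for_status()

for face in response.json():
    attrs = face.get("faceAttributes", {})
    print("guessed age:", attrs.get("age"), "gender:", attrs.get("gender"))
```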

The only downside to this is that my worst fears have been confirmed – I look older than I actually am by over 10 years! 🙁
How old do I look?!?!

The Business Value of Microsoft Azure – Part 5 – Notification Hubs

This article is part 5 of a series of articles that focus on the business value of Microsoft Azure. Microsoft Azure provides a variety of cloud-based technologies that can enable organizations in a number of ways. Rather than focusing on the technical aspects of Microsoft Azure (there’s plenty of that content out there), this series focuses on business situations and how Microsoft Azure services can help.

In our last article we focused on virtualization and the use of virtual machines as part of an Infrastructure as a Service (IaaS) solution. While this is a great approach for traditional server workloads, there has been a significant shift in the way individuals interact with and consume information, suggesting the need for something different. Specifically, mobile devices have overtaken the PC in unit sales per year, and this presents a scenario that many municipalities can tap into.

Let’s think back to our fictional town of Gamehendge. A hurricane is approaching and Mayor Wilson needs to warn the town’s citizens. Communicating at that scale would require a significant notification infrastructure, so why pay for that kind of capacity when it’s only needed on occasion? Microsoft Azure Notification Hubs is a massively scalable mobile push notification engine for quickly sending millions of messages to iOS, Android, Windows, or Kindle devices. It’s possible to tailor notifications to specific citizens or entire groups with just a few lines of code, and to do it across any platform.

Further, in Gamehendge there is a population that doesn’t speak English as their native language, and traditional communications can often go misunderstood. The templates feature of Notification Hubs provides a handy way to send localized push notifications, so you’re speaking to citizens in their own language. Templates also eliminate the hassle of storing localization settings for each group.
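Conceptually, a template per locale plus one shared set of properties is all the sender has to supply; the sketch below illustrates that idea in plain Python (it is not the Notification Hubs API itself, and the locales, message text, and property names are invented for illustration).

```python
# A conceptual illustration (not the Notification Hubs API itself) of how
# per-locale templates let one send operation produce localized messages.
# Locales, template text, and property names are invented for illustration.
TEMPLATES = {
    "en": "Hurricane warning for {town}: shelters open at {time}.",
    "es": "Aviso de huracán para {town}: los refugios abren a las {time}.",
    "pt": "Alerta de furacão para {town}: abrigos abrem às {time}.",
}

def render_notifications(properties: dict) -> dict:
    """Expand every locale's template with the same set of properties."""
    return {locale: text.format(**properties) for locale, text in TEMPLATES.items()}

if __name__ == "__main__":
    messages = render_notifications({"town": "Gamehendge", "time": "6:00 PM"})
    for locale, message in messages.items():
        print(locale, "->", message)
```

With Notification Hubs itself, each device registers a template for its own locale and the back end sends only the property values; the hub performs the expansion and delivery to every registered device.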

Combining the scalability and configurability of Notification Hubs with its ability to work with either on-premises or cloud-based systems, your municipality gains the ability to keep citizens prepared and informed, whether in an emergency or as part of a more general community awareness system. While Notification Hubs is just one small component of the Azure platform, it can have a significant impact in your community.

As a partner with BlumShapiro Consulting, Michael Pelletier leads our Technology Consulting Practice. He consults with a range of businesses and industries on issues related to technology strategy and direction, enterprise and solution architecture, service oriented architecture and solution delivery.