Tag Archive for Azure

Send Custom Emails Using Azure Functions as Scheduled Tasks in SharePoint Online

Recently, a client of ours wanted a daily email sent to each user who has an overdue task or a task that is about to expire. The email had a specific format: each task needed a direct link and had to be grouped by the date it was due. Since the client ran an on-premises Project Server and SharePoint 2013 farm, it was not too difficult to build and embed the solution into the farm. I took care of this by building a SharePoint timer job, which leveraged SharePoint search to retrieve all tasks whose due date fell before the next business day. Once deployed and activated, the timer job ran automatically every morning, and the SharePoint admins could trigger it manually. Everything worked great.

Another client of ours was looking for exactly this kind of solution, except they were strictly SharePoint Online / Project Online. They had no on-premises farm; there were no real servers to speak of. One option would have been to create a PowerShell script or .NET executable and run it as a Scheduled Task on some server. But there were no servers, and even if there had been, what is the point of being in the cloud if you are still stuck with a foot (or process) on the ground?

So, I turned to Microsoft Azure, and that’s where Azure Functions came into play.

Azure Functions

Azure Functions are programs or code snippets that run in the cloud. They can run on a schedule or be triggered by different types of events (an HTTP request, an item added to Azure Blob Storage, an Excel file saved to OneDrive, etc.). Think of them as Windows Scheduled Tasks that can be triggered by modern events and activities.

The programs or code snippets can be created and edited directly in the Azure Portal, making it easy to get up and running with an Azure Function. The languages supported by Azure Functions are more than adequate: PowerShell, C#, JavaScript, F#, Python, PHP, Bash, and Batch.
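For example, here is a minimal sketch of a scheduled (timer-triggered) C# function using the Azure Functions SDK attribute model; the function name, CRON schedule, and logging shown are illustrative placeholders, not the client's actual code:

using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class DailyTaskReminder
{
    // NCRONTAB schedule: {second} {minute} {hour} {day} {month} {day-of-week}.
    // "0 0 7 * * 1-5" runs at 7:00 AM every weekday.
    [FunctionName("DailyTaskReminder")]
    public static void Run([TimerTrigger("0 0 7 * * 1-5")] TimerInfo timer, ILogger log)
    {
        log.LogInformation($"Task reminder run started at {DateTime.UtcNow:O}");
        // The overdue-task query and email logic described below would go here.
    }
}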

Note that I could have also used Azure WebJobs to accomplish this, but I felt that Azure Functions had several advantages: they are easy for the client to maintain, they scale automatically, the client only pays for the time the code executes, and they support WebHooks and can easily be triggered via an HTTP request.

Send Custom Emails from SharePoint Online

For this solution, I created the Azure Function in Visual Studio as a pre-compiled assembly that references the SharePoint Online client-side object model (CSOM) DLLs. The solution was straightforward: use CSOM to query SharePoint Online's search service for all overdue tasks and tasks due within the next business day, apply some logic to build the email content for each assigned user, and then send the emails using SendGrid. SendGrid is built into Microsoft Azure, so configuring it was a breeze, and you get 25,000 free emails a month!
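A condensed sketch of that flow is below. The site URL, managed property names (DueDateOWSDATE, AssignedToOWSUSER), sender address, and method shape are all illustrative assumptions rather than the client's actual code:

using System;
using System.Security;
using System.Text;
using System.Threading.Tasks;
using Microsoft.SharePoint.Client;
using Microsoft.SharePoint.Client.Search.Query;
using SendGrid;
using SendGrid.Helpers.Mail;

public static class OverdueTaskMailer
{
    // Queries SharePoint Online search for tasks due before the next business day
    // and emails a summary to one user via SendGrid.
    public static async Task SendReminderAsync(
        string siteUrl, string userName, SecureString password,
        DateTime nextBusinessDay, string toAddress, string sendGridApiKey)
    {
        string emailBody;

        using (var ctx = new ClientContext(siteUrl))
        {
            ctx.Credentials = new SharePointOnlineCredentials(userName, password);

            // Managed property names here are illustrative; use whatever
            // properties your tenant's search schema actually exposes.
            var query = new KeywordQuery(ctx)
            {
                QueryText = $"ContentClass=STS_ListItem_Tasks AND DueDateOWSDATE<{nextBusinessDay:yyyy-MM-dd}",
                RowLimit = 500
            };
            query.SelectProperties.Add("Title");
            query.SelectProperties.Add("DueDateOWSDATE");
            query.SelectProperties.Add("AssignedToOWSUSER");
            query.SelectProperties.Add("Path");

            var executor = new SearchExecutor(ctx);
            var results = executor.ExecuteQuery(query);
            ctx.ExecuteQuery();

            // Build a simple HTML list of task links; grouping by due date
            // and by assigned user is omitted here for brevity.
            var builder = new StringBuilder("<ul>");
            foreach (var row in results.Value[0].ResultRows)
            {
                builder.Append($"<li><a href=\"{row["Path"]}\">{row["Title"]}</a> due {row["DueDateOWSDATE"]}</li>");
            }
            builder.Append("</ul>");
            emailBody = builder.ToString();
        }

        // SendGrid: the API key would normally come from the Function App's settings.
        var client = new SendGridClient(sendGridApiKey);
        var message = MailHelper.CreateSingleEmail(
            new EmailAddress("noreply@example.com"),
            new EmailAddress(toAddress),
            "Tasks due or overdue",
            plainTextContent: null,
            htmlContent: emailBody);
        await client.SendEmailAsync(message);
    }
}

From here, the real solution groups the result rows by assignee and due date before building each user's email, but the CSOM query and SendGrid call above are the heart of it.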

Once deployed, I configured the Azure Function to run on a schedule (as before). It can also be triggered by an HTTP request, so placing an HTTP call in a SharePoint site workflow or Microsoft Flow lets any site user trigger the function on demand.
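An HTTP-triggered entry point can be sketched roughly like this (again using the Azure Functions SDK attribute model; the function name and authorization level are placeholders):

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class SendRemindersNow
{
    // Lets a SharePoint workflow or Microsoft Flow kick off the same logic on demand.
    [FunctionName("SendRemindersNow")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("Manual reminder run requested.");
        // Call the same overdue-task logic used by the timer trigger here.
        return new OkResult();
    }
}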

Long gone are the days when integration servers sat around the data center waiting for more processes to consume their over-allocated resources. Most servers (virtual machines, really) are now dedicated to a specific application and shouldn't share their resources with one-off processes.

Azure Functions is a great serverless solution to these problems. Whether you need to send emails, calculate metrics, or analyze big data, Azure Functions can be a solution for you. Learn more about how BlumShapiro can help your organization with issues like this.

About Brent:


Brent Harvey has over 10 years of software development experience with a specific focus on SharePoint, Project Server, C#, and web development. Brent is an Architect at BlumShapiro Consulting. Brent is a Microsoft Certified Solutions Expert in SharePoint 2013, Solutions Associate in Windows Server 2012, Specialist in Developing Azure Solutions, and Professional Developer in SharePoint 2010.

On the Leading Edge of New Technology

Being on the leading edge of any technology can be exciting, but it's often frustrating and even costly. There is an inherent risk in adopting technology that is new; a lack of community support or documentation when something goes wrong is just one of the issues that can arise. However, there are benefits to being an early adopter. For example, working hands-on with a new technology is the best way to understand how it works. As technology consultants, we view it as our job to understand what's coming so we can advise our clients with a clear eye to the future.

Scenario

A client asked us about alternatives to their current Remote Desktop Services (RDS) implementation, which was hosted by a third-party vendor. There were a few issues with their setup, namely cost and maintaining multiple logins, and they didn't have any kind of domain or user directory. After exploring a few different RDS deployment scenarios, they ultimately decided on using a preview version of Azure Active Directory Domain Services (Azure AD DS) on Azure virtual machines.

They really liked the idea of using Azure AD DS because of the promised benefits: no servers (on-premises or in the cloud) to maintain, a simplified user interface, and so on. We shared our assessment of the risks and unknowns of using an untested technology, but the client wholeheartedly accepted these risks because there were so many upsides to using Azure AD DS for their specific setup. So we set out to implement Remote Desktop Services using Azure Active Directory Domain Services…and we learned a couple of things along the way, which we are happy to share with you.

Sometimes the Leading Edge is the Bleeding Edge

The first lesson learned was that with Azure AD DS, you cannot be added as a Domain Admin or Global Admin. Azure AD DS has its own security group, called AAD DC Administrators, that you have to create yourself. That is a good thing to note when dealing with Azure AD DS, and it led us right to our second lesson.

When trying to add the license server as a member of the Terminal Server License Servers group in Active Directory, a permissions error popped up:

The computer account for license server [ServerName] cannot be added to the Terminal Server License Servers group in Active Directory Domain Services (AD DS) because of insufficient privileges.


Thinking back to that security group, I thought, “I am not a Domain Admin; I cannot be a Domain Admin.” I felt a little helpless. Thankfully the computer account didn’t need to be added to the group, since all the RDS servers were on the same domain. But I still couldn’t help feeling like something might be amiss later.

As a Microsoft partner, we have top-tier access to Microsoft support, who recommended a few solutions to this issue…each of which ran into the same permissions roadblock.

When the Microsoft support engineer mentioned this was the first time he had heard of someone trying this, I thought I must be a pioneer, attempting this while Azure AD DS was still in beta. One thing was for sure: the Azure AD DS team liked the idea that someone was trying out an RDS implementation with it.

When you work with a beta version, or when you install something without waiting for Service Pack 2 to be released, you are blazing a new trail. There is a thrill in being the first person to try something new, and a long-standing honor in the tech world in being the first to figure something out.

In the end, after another hiccup or two, the rest of the Remote Desktop Services deployment went well, without any additional permission issues. And the result showed us that Remote Desktop Services does work well with Azure Active Directory Domain Services and was able to accomplish the client’s goals.

Once the beta for Azure Active Directory Domain Services is complete, I’m wondering if RDS will be on the list of supported technologies. Then I will feel like a true trailblazer cutting a path for others to follow.

Our experience with Microsoft tools gives us an inside track and an ability to work with these new technologies because we deeply understand the underlying platform. While being on the bleeding edge of technology can be risky, having experts to help guide you, navigate any issues and provide needed support can help mitigate some of these risks. And in the end, the benefits to your organization will outweigh any roadblocks encountered along the way.

About Brent:


Brent Harvey has over 10 years of software development experience with a specific focus on SharePoint, Project Server, C#, and web development. Brent is an Architect at BlumShapiro Consulting, working on projects across varied industries (banking, manufacturing, health care, etc.). Brent is a Microsoft Certified Solutions Expert in SharePoint 2013, Solutions Associate in Windows Server 2012, Specialist in Developing Azure Solutions, and Professional Developer in SharePoint 2010.


6 Critical Technologies for the Internet of Things

If you and your company prefer Microsoft solutions and technologies, you may fear that the Internet of Things is an opportunity that will pass you by.

Have no fear: Microsoft’s transformation from a “Windows and Office” company to a “Cloud and Services” company continues to accelerate. Nowhere is this trend more evident than in the range of services supporting Internet of Things scenarios.

So – What are the Microsoft technologies that would comprise an Internet of Things solution architecture?

And – How do Cloud Computing and Microsoft Azure enable Internet of Things scenarios?

Here are the key Microsoft technologies which architects and developers need to understand.

Software for Intelligent Devices

First, let’s understand the Things. The community of device makers and entrepreneurs continues to flourish, enabled by the emergence of simple intelligent devices. These devices have a simplified, lightweight computing model capable of connecting machine-to-machine or machine-to-cloud. Windows 10 for IoT, released in July 2015, enables secure connectivity for a broad range of devices on the Windows platform.

Scalable Event Ingestion

The Velocity of Big Data demands a solution capable of receiving telemetry data at cloud scale with low latency and high availability. This component of the architecture is the “front end” of an event pipeline that sits between the Things sending data and the consumers of that data. Microsoft’s Azure platform delivers this capability with Azure Event Hubs, which is extremely easy to set up and connect to over HTTPS.
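To show how small the device-side sending code can be, here is a C# sketch using the current Azure.Messaging.EventHubs client library; the connection string, hub name, and method shape are placeholder assumptions (earlier SDKs exposed an equivalent EventHubClient):

using System.Text;
using System.Threading.Tasks;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Producer;

public static class TelemetrySender
{
    // Sends one JSON telemetry reading to an Event Hub.
    public static async Task SendReadingAsync(string connectionString, string hubName, string json)
    {
        await using var producer = new EventHubProducerClient(connectionString, hubName);

        // Batch the event and send it; the service scales to millions of events per second.
        using EventDataBatch batch = await producer.CreateBatchAsync();
        batch.TryAdd(new EventData(Encoding.UTF8.GetBytes(json)));
        await producer.SendAsync(batch);
    }
}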

Still, Volume + Velocity lead to major complexity when Big Data is consumed; the data may not be ready for human consumption. Microsoft provides options for analyzing this massive stream of “fast data”. Option 1 is to process the events “in flight” with Azure Stream Analytics (ASA). ASA allows developers to combine streaming data with Reference Data (e.g. Master Data) to analyze events, defects, and “likes”, and to summarize the data for human consumption. Option 2 is to stream the data to a massive storage repository for analysis later (see The Data Lake and Hadoop). Regardless of whether you analyze in flight or at rest, a third option can help you learn about what is happening behind the data (see Machine Learning).

Machine Learning

We’ve learned a lot about “Artificial Intelligence” over the past 10 years. Indeed, we’ve learned that machines “think” very differently than humans. Machines use principles of statistics to assess which features (“columns”) of a dataset provide the most “information” about a given observation (“row”). For example, which variables are most predictive of (or most closely correlated with) the final feature of the dataset? Having learned how the features relate to one another, a machine can be “trained” to predict the outcome of the next record in the dataset; given an algorithm and enough data, a machine can learn about the real world.

If the IoT solution you envision includes predictions or “intelligence”, you’ll want to look at Azure Machine Learning.  Azure ML provides a development studio for data science professionals to design, test and deploy Machine Learning services to the Microsoft Azure Cloud.

Finally, you’ll also want to understand how to organize a data science project within the structure of your company’s overall project management processes. The term “Data Science” is telling: it indicates an experimental aspect to the process. Data scientists prepare datasets, conduct experiments, and test their algorithms (written in statistical processing languages like R and Python) until the algorithms accurately answer the questions posed by the business, using data. Data Science requires a balance between experimentation and business value.

The Data Lake and Hadoop

A Data Lake is a term used to describe a single place where the huge variety of data produced by your big data initiatives is stored for future analysis. A Data Lake is not a Data Warehouse. A Data Warehouse has one single structure, and data arriving in a variety of formats must be transformed into that structure. A Data Lake has no predefined structure; instead, the structure is determined when the data is analyzed, and new structures can be created over and over again on the same data.

Businesses have the choice of simply storing Big Data in Azure Storage. If your data velocity and volume exceed the limits of Azure Storage, Azure Data Lake is a specialized storage service optimized for Hadoop, with no fixed limits on file size. Azure Data Lake was announced in May 2015, and you can sign up for the Public Preview.

The ability to define a structure as the data is read is the magic of Hadoop. The premise is simple: Big Data is too massive to move from one structure to another, as you would in a Data Warehouse/ETL solution. Instead, keep all the data in its native format, wait to apply structure until analysis time, and perform as many reads over the same data as needed. There is no need to buy tons of hardware for Hadoop, either: Azure HDInsight provides Hadoop as a service, which can be enabled and disabled as needed to keep your costs low.

Real Time Analytics

The human consumption part of this equation is represented by Power BI. Power BI is the “single pane of glass” for all of your data analysis needs, including Big Data. Power BI is a dashboard tool capable of transforming company data into rich visuals. It can connect to data sources on premises, consume data from HDInsight or Azure Storage, and receive real-time updates from data “in flight”. If you are located in New England, attend one of our Dashboard in a Day workshops happening throughout the Northeast in 2015.

Management

IoT solutions are feasible because of the robust cloud offerings currently available. The cloud is an integral part of your solution, and you need resources capable of managing your cloud assets as though they were on premises. Your operations team should be as comfortable turning services on and off in your cloud as they are enabling services and capabilities on a physical server. Azure PowerShell provides the operations environment for managing Azure cloud services and automating their maintenance and management.

Conclusion

Enterprises ready to meet their customers in the digital world will be rewarded, but first they must grasp Big Data technologies. Microsoft customers can take advantage of the Azure cloud to create Big Data solutions: connect Things to the cloud, then create and connect Azure services to receive, analyze, learn from, and visualize the data. Finally, be ready to treat those cloud assets as part of your production infrastructure by training your operations team in Microsoft’s cloud management tools.

Face API and Power BI

At last week’s Build 2015 developer conference, Microsoft demonstrated many great new tools. One demo that got quite a bit of attention was the How Old Am I? app (http://how-old.net). The demo lets users upload pictures and have the service “guess” the age and gender of the individuals in each photo. Within a few hours, the demo went viral, with over 210,000 images uploaded from all over the world. The result was a dashboard of requests from all over the globe.


This solution shows off the use of a number of powerful technologies.

Face API – Project Oxford is a set of Artificial Intelligence APIs and REST services that developers can use today to build intelligent systems. In addition to facial recognition, the Project Oxford AI services include Speech Recognition, Vision (image recognition and OCR), and Language Understanding Intelligent Services, leveraging the technology capabilities of Bing and Cortana. (A minimal call sketch appears after this list.)

Azure Event Hubs – a highly scalable publish-subscribe event ingestor that can take in millions of events per second. The Event Hubs API is used to stream a JSON document from the web page each time a user uploads a picture.

Azure Stream Analytics – a fully managed, low-latency, high-throughput stream processing solution. Azure Stream Analytics lets you write your stream processing logic in a simple SQL-like language. This allows the solution to take measurements every 10 seconds of how many requests arrived, from which countries, and of which gender and age. These measurements become facts for your analysis.

Power BI – the team chose Power BI as the output of the Stream Analytics job, then went to http://www.powerbi.com and selected the dataset and table created by Azure Stream Analytics. No additional coding is needed to create real-time dashboards.
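To make the Face API piece concrete, here is a minimal, hypothetical C# call to the Face API detect endpoint asking for age and gender attributes; the endpoint URL shown is illustrative and depends on your subscription and region:

using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class FaceAgeEstimator
{
    // Asks the Face API to detect faces in an image URL and return age/gender attributes.
    public static async Task<string> DetectAsync(string imageUrl, string subscriptionKey)
    {
        // Illustrative endpoint; use the one issued with your Face API subscription.
        const string endpoint =
            "https://westus.api.cognitive.microsoft.com/face/v1.0/detect?returnFaceAttributes=age,gender";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", subscriptionKey);

        var body = new StringContent($"{{\"url\":\"{imageUrl}\"}}", Encoding.UTF8, "application/json");
        HttpResponseMessage response = await client.PostAsync(endpoint, body);
        response.EnsureSuccessStatusCode();

        // Returns a JSON array of detected faces, each with faceAttributes.age and .gender.
        return await response.Content.ReadAsStringAsync();
    }
}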

The only downside to this is that my worst fears have been confirmed – I look older than I actually am by over 10 years! 🙁
How old do I look?!?!