Archive for Azure

Send Custom Emails Using Azure Functions as Scheduled Tasks in SharePoint Online

Recently, a client of ours wanted a daily email sent to each user who had an overdue task or a task about to expire. The email had a specific format: each task needed a direct link, and tasks had to be grouped by due date. Since this was an on-premises Project Server and SharePoint 2013 farm, it was not too difficult to build and embed the solution into the farm. I took care of it by building a SharePoint timer job, which leveraged SharePoint search to retrieve all tasks with a due date before the next business day. Once deployed and activated, the timer job ran automatically every morning, and the SharePoint admins could also trigger it manually. Everything worked great.

Another client of ours was looking for exactly this kind of solution, except they were strictly SharePoint Online / Project Online. They had no on-premises farm; there were no real servers to speak of. One option would have been to write a PowerShell script or .NET executable and run it as a Windows Scheduled Task on some server. But again, there were no servers. And even if there had been, what is the point of being in the cloud if you are still stuck with a foot (or a process) on the ground?

So, I turned to Microsoft Azure, and that’s where Azure Functions came into play.

Azure Functions

Azure Functions are programs or code snippets that run in the cloud. They can run on a schedule or be triggered by different types of events (an HTTP request, an item added to Azure Blob Storage, an Excel file saved to OneDrive, etc.). Think of an Azure Function as a Windows Scheduled Task that can also be triggered by modern events and activities.

The programs or code snippets can be created and edited within the Azure Portal, making it easy to get up and running with an Azure Function. The languages supported by Azure Functions are more than adequate: PowerShell, C#, JavaScript, F#, Python, PHP, Bash, and Batch.
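To make that concrete, here is a minimal sketch of a timer-triggered C# script function as it would appear in the portal editor. The 7:00 AM weekday schedule shown in the comment is an assumption for illustration, not the client’s actual configuration.

  // run.csx – a minimal timer-triggered C# script function.
  // The schedule is set in function.json as a CRON expression,
  // e.g. "0 0 7 * * 1-5" to fire at 7:00 AM every weekday (an assumed schedule).
  using System;

  public static void Run(TimerInfo myTimer, TraceWriter log)
  {
      // The real work (querying SharePoint, building emails) goes here.
      log.Info($"Overdue-task check fired at: {DateTime.Now}");
  }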

Note that I could also have used Azure WebJobs to accomplish this, but I felt Azure Functions had too many positives to pass up: they are easy for the client to maintain, they scale automatically, the client only pays for the time the code actually executes, and they support WebHooks and can easily be triggered via an HTTP request.

Send Custom Emails from SharePoint Online

For this solution, I created the Azure Function in Visual Studio, pre-compiled with the SharePoint Online client-side object model (CSOM) DLLs. The solution was straightforward: it used CSOM to query SharePoint Online’s search service for all overdue tasks and tasks due within the next business day, applied some logic to build the email content for each assigned user, and then sent the emails using SendGrid. SendGrid is built into Microsoft Azure, so configuring it was a breeze, and you get 25,000 free emails a month!
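Here is a condensed sketch of the core of that function, not the production code. The site URL, credentials, search query and email addresses are all placeholders, and the grouping logic is elided; it only shows the shape of the CSOM search call and the SendGrid send.

  // A condensed sketch of the function's core – placeholders throughout.
  // Assumes the SharePoint Online CSOM and SendGrid NuGet packages.
  using System;
  using System.Security;
  using System.Threading.Tasks;
  using Microsoft.SharePoint.Client;
  using Microsoft.SharePoint.Client.Search.Query;
  using SendGrid;
  using SendGrid.Helpers.Mail;

  public static class OverdueTaskMailer
  {
      public static async Task SendOverdueTaskEmailsAsync()
      {
          var password = new SecureString();
          foreach (char c in "<app-password>") password.AppendChar(c);   // placeholder

          using (var ctx = new ClientContext("https://contoso.sharepoint.com/sites/pwa"))
          {
              ctx.Credentials = new SharePointOnlineCredentials("svc@contoso.com", password);

              // Ask SharePoint search for tasks due before the next business day.
              var query = new KeywordQuery(ctx)
              {
                  QueryText = "ContentTypeId:0x0108* DueDateOWSDATE<=\"2017-06-02\""   // placeholder query
              };
              var results = new SearchExecutor(ctx).ExecuteQuery(query);
              ctx.ExecuteQuery();

              // ...walk results.Value[0].ResultRows here, grouping rows by
              // assigned user and due date to build each HTML body...

              var sendGrid = new SendGridClient(Environment.GetEnvironmentVariable("SendGridApiKey"));
              var msg = MailHelper.CreateSingleEmail(
                  new EmailAddress("pwa@contoso.com"),
                  new EmailAddress("user@contoso.com"),
                  "Tasks due or overdue",
                  plainTextContent: null,
                  htmlContent: "<p>Your tasks, grouped by due date…</p>");
              await sendGrid.SendEmailAsync(msg);
          }
      }
  }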

Once deployed, I configured the Azure Function to run on a schedule (just like the timer job before it). It can also be triggered by an HTTP request, so putting that request in a SharePoint site workflow or Microsoft Flow means any site user can trigger the function as needed.
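The workflow or Flow simply issues a request to the function’s URL, passing its function key. Here is a minimal sketch of what that HTTP action effectively does; the function app name, function name and key are placeholders.

  // What the workflow/Flow HTTP action effectively does: call the
  // function's URL with its function key. All names are placeholders.
  using System.Net.Http;
  using System.Threading.Tasks;

  class TriggerFunction
  {
      static async Task Main()
      {
          using (var http = new HttpClient())
          {
              await http.PostAsync(
                  "https://<function-app>.azurewebsites.net/api/SendTaskEmails?code=<function-key>",
                  content: null);
          }
      }
  }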

Long gone are the days when integration servers were lying around in the data center, waiting to take on more processes and consume more of their over-allocated resources. Most servers – virtual machines, really – are now dedicated to a specific application, and shouldn’t share their resources with one-off processes.

Azure Functions is a great serverless solution to these problems. Whether you need to send emails, calculate metrics, or analyze big data, Azure Functions can be the answer. Learn more about how BlumShapiro can help your organization with issues like this.

About Brent:


Brent Harvey has over 10 years of software development experience with a specific focus on SharePoint, Project Server, C# and web development. Brent is an Architect at BlumShapiro Consulting. He is a Microsoft Certified Solutions Expert in SharePoint 2013, a Solutions Associate in Windows Server 2012, a Specialist in Developing Azure Solutions, and a Professional Developer in SharePoint 2010.

Using Real Time Data Analytics and Visualization Tools to Drive Your Business Forward

Business leaders need timely information about the operations and profitability of the businesses they manage in order to make informed decisions. When information delivery is delayed, decision makers lose precious time to adjust and respond to changing market conditions, customer preferences, supplier issues, or all three. When thinking about any business analytics solution, a critical question to ask is: how frequently can we (or should we) update the underlying data? Often, the first answer from the business stakeholders is “as frequently as possible.” The concept of “real time analytics,” with data provided up to the minute, is usually quite attractive. But there may be some confusion about what this really means.

While the term real time analytics does refer to data which is frequently changing, it is not the same as simply refreshing data frequently. Traditional analytics packages which take advantage of data marts, data warehouses and data cubes are often collectively referred to as a Decision Support System (DSS). A DSS helps business analysts, management and ownership understand historical trends in their business, perform root cause analysis and make strategic decisions. Whereas a DSS aggregates and analyzes sales, costs and other transactions, a real time analytics system ingests and processes events. Imagine a $25 million business recording 10,000 transactions a day. Now imagine that same business recording events on its website: logins, searches, shopping cart adds, shopping cart deletes, product image zooms. If the business is 100% online, how many events would that be? The answer may astonish you.

Why Real Time Analytics?

DSS solutions answer questions such as “What was our net income last month?”, “What was our net income compared to the same month last year?” or “Which customers were most profitable last month?” Real time analytics answers questions such as “Is the customer experience positive right now?” or “How can we optimize this transaction right now?” In the retail industry, listening to social media channels to hear what customers are saying about their experience in your stores can drive service level adjustments or pricing promotions. When that analysis is real time, store managers can adjust that same day for optimized profitability. Some examples:

  1. Social media sentiment analysis – addressing customer satisfaction concerns
  2. Eliminating business disruption costs with equipment maintenance analytics
  3. Promotion and marketing optimization with web and mobile analytics
  4. Product recommendations throughout the shopping experience, online or “brick and mortar”
  5. Improved health care services with real time patient health metrics from wearable technology

In today’s world, customers expect world class service. Implicit in that expectation is the assumption that companies with whom they do business “know them,” anticipate their needs and respond to them. That’s easy to say, but harder to execute. Companies that must meet that expectation need technology leaders who are aware of three concepts critical to making real time analytics a reality.

The first is the Internet of Things, or IoT. The velocity and volume of data generated by mobile devices, social media, factory floor sensors and the like is the basis for real time analytics. “Internet of Things” refers to devices or sensors connected to the internet, providing data about their usage or simply about the physical environment where the device is powered on. Like social media and mobile devices, IoT sensors can generate enormous volumes of data very, very quickly – this is the “big data” phenomenon.

The second is Cloud Computing. The massive scale of IoT and big data can only be achieved with cloud scale data storage and cloud scale data processing. Unless your company’s name is Google, Amazon or Microsoft, you probably cannot keep up. So, to achieve real-time analytics, you must embrace cloud computing.

The third is Intelligent Systems. IBM’s “Watson” computer achieved a significant milestone by out-performing humans on Jeopardy. Since then, companies have been integrating artificial intelligence (AI) into large scale systems. AI in this sense is simply a mathematical model which calculates the probability that data represents something a human would recognize: a supplier disruption, a dissatisfied customer about to cancel their order, an equipment breakdown. Using real time data, machine learning models can recognize events which are about to occur. From there, they can automate a response, or raise an alert to the humans involved in the process. Intelligent systems help humans make nimble adjustments to improve the bottom line.

What technologies will my company need to make this happen?

From a technology perspective, a clear understanding of cloud computing is essential. When evaluating a cloud platform, CIOs should look for breadth of capability and support for multiple frameworks. As a Microsoft Partner, BlumShapiro Consulting works with Microsoft Azure and its Cortana Intelligence platform. This gives our clients cloud scale, low cost and a wide variety of real time and big data processing options.

[Diagram: the Azure resources that make up the Cortana Intelligence suite]

This diagram describes the Azure resources which comprise Cortana Intelligence. The most relevant resources for real time analytics are:

  1. Event Hubs ingests high velocity streaming data sent by event providers (i.e. sensors and devices) – see the sender sketch just after this list
  2. Data Lake Store provides low cost cloud storage with no practical limits
  3. Stream Analytics performs in-flight processing of streaming data
  4. Machine Learning, or AzureML, supports the design, evaluation and integration of predictive models into the real-time pipeline
  5. Cognitive Services are out-of-the-box Artificial Intelligence services, addressing a broad range of common machine intelligence scenarios
  6. Power BI supports streaming datasets made visible in a dashboard context
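To ground the first item, here is a minimal sketch of an event provider pushing a single sensor reading into an Event Hub. It assumes the Microsoft.Azure.EventHubs client package; the connection string and the event payload are placeholders, not a real deployment.

  // A minimal event-provider sketch: push one sensor reading into an Event Hub.
  // The connection string and payload shape are placeholders.
  using System;
  using System.Text;
  using System.Threading.Tasks;
  using Microsoft.Azure.EventHubs;

  class SensorSender
  {
      static async Task Main()
      {
          var client = EventHubClient.CreateFromConnectionString(
              "Endpoint=sb://<namespace>.servicebus.windows.net/;<key>;EntityPath=readings");

          // One JSON event per reading; Event Hubs treats it as opaque bytes.
          string json = "{\"deviceId\":\"sensor-42\",\"temperature\":71.3,\"ts\":\"" +
                        DateTime.UtcNow.ToString("o") + "\"}";

          await client.SendAsync(new EventData(Encoding.UTF8.GetBytes(json)));
          await client.CloseAsync();
      }
  }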

Four Steps to Get Started with Real Time Analytics

Start with the Eye Candy – If you do not have a dashboard tool which supports real time data streaming, consider a solution such as Power BI. Even if you are not ready to implement an IoT solution, Power BI makes monitoring social media or customer marketing campaigns much more feasible. Power BI can connect to databases, data marts, data warehouses and data cubes, and it is valuable as a dashboard and visualization tool for existing DSS systems. Without visualization, it is very difficult for humans to draw insights and take action on any kind of data, slow or fast.
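On the streaming side, Power BI exposes a REST push URL for each streaming dataset. Here is a rough sketch of pushing one row; the push URL (with its embedded key) would be copied from your own dataset’s settings, and the URL and row fields below are placeholders.

  // A rough sketch: push one row into a Power BI streaming dataset.
  // The push URL (including its key) and the row fields are placeholders.
  using System.Net.Http;
  using System.Text;
  using System.Threading.Tasks;

  class PowerBiPusher
  {
      static async Task Main()
      {
          const string pushUrl =
              "https://api.powerbi.com/beta/<workspace>/datasets/<dataset-id>/rows?key=<key>";

          // Power BI expects an array of JSON rows matching the dataset schema.
          string rows = "[{\"sentiment\":0.87,\"mentions\":12,\"ts\":\"2017-06-01T14:00:00Z\"}]";

          using (var http = new HttpClient())
          {
              await http.PostAsync(pushUrl, new StringContent(rows, Encoding.UTF8, "application/json"));
          }
      }
  }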

Get to the Cloud – Cloud storage costs and cloud processing scale are the only mechanisms by which real time analytics is economically feasible (for most companies). Learn how investing in technologies like Cloud Computing can really help move your business forward.

Embrace Machine Intelligence – To make intelligent systems a reality, you will need to understand machine learning technologies, if only at a high level. Historically, this has meant building a team of data scientists, many of whom have PhDs in mathematics or statistics, and adopting open source tools like R or Python. Today, machine learning is much more accessible than it has ever been. AzureML helps fast track both the evaluation and the operationalization of predictive models.
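Operationalization in AzureML means the trained model is published as a web service. The sketch below shows roughly what calling a classic request-response endpoint looks like; the endpoint URL, API key and input schema are placeholders for the values AzureML generates when you publish an experiment.

  // A rough sketch of calling an AzureML request-response web service.
  // The endpoint URL, API key and input schema are placeholders.
  using System.Net.Http;
  using System.Net.Http.Headers;
  using System.Text;
  using System.Threading.Tasks;

  class ScoringClient
  {
      static async Task Main()
      {
          const string endpoint =
              "https://<region>.services.azureml.net/workspaces/<ws>/services/<id>/execute?api-version=2.0";

          string body = "{\"Inputs\":{\"input1\":{\"ColumnNames\":[\"temperature\"],\"Values\":[[\"71.3\"]]}}}";

          using (var http = new HttpClient())
          {
              http.DefaultRequestHeaders.Authorization =
                  new AuthenticationHeaderValue("Bearer", "<api-key>");

              var response = await http.PostAsync(
                  endpoint, new StringContent(body, Encoding.UTF8, "application/json"));

              // The response JSON carries the scored probability for the event.
              string scored = await response.Content.ReadAsStringAsync();
          }
      }
  }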

Find the Real-Time Opportunity – As the technology leader in the organization, the CIO will need to work closely with other business leaders to understand where real-time information can increase revenue, decrease costs or both. This may require imagination. Start with the question: what would we like to know faster? If we knew our customer was going to do this sooner, how would we respond? If we knew our equipment was going to fail sooner, how would we respond? If we knew there was an opportunity to sell more, how would we respond?

About Brian: Brian Berry leads the Microsoft Business Intelligence and Data Analytics practice at BlumShapiro. He has over 15 years of experience with information technology (IT), software design and consulting. Brian specializes in identifying business intelligence (BI) and data management solutions for upper mid-market manufacturing, distribution and retail firms in New England. He focuses on technologies which drive value in analytics: data integration, self-service BI, cloud computing and predictive analytics.


4 Cost Saving DevOps Tools on Azure

Technology leaders need to pay attention to DevOps. Yes, it’s a funny little name. Wikipedia states that DevOps is a compound of “development” and “operations” before explaining it as “a culture, movement or practice that emphasizes the collaboration and communication of both software developers and other information-technology (IT) professionals while automating the process of software delivery and infrastructure changes.”

Technology professionals know that identifying, tracking and resolving bugs costs money. If you are the one writing the software (and sooner or later, everyone will be), the bugs are on your dime. Good testing practices can help minimize bugs and costs. Sometimes, however, bugs result from deployment practices. Indeed, the best technology operations focus on standardized, automated testing and release management practices. Under DevOps best practices, software teams treat software deliverables the way a manufacturing company treats finished goods – ruthlessly eliminating deviations with automation.

If you have tried and failed to create innovative solutions within your company by writing software, there could be several reasons why. If the requirements were right, the architecture was right, and your software developers understood the technology, then examine the process of delivering the software to the users.

Delivering Software Cost Effectively

The concept behind DevOps has gone by other names: Continuous Integration (CI), Application Lifecycle Management (ALM), and so on. Often, IT departments found ALM complex, or did not have the knowledge required to design a pipeline for software development. But the tools have continued to evolve, and the processes have become simpler. Today, cloud vendors deliver DevOps services that are very hard for technology professionals to dismiss. Among the very best is Microsoft’s Azure platform, which provides many tools for standardizing, testing and delivering high quality software.

Here are my four favorites:

Azure Resource Manager (ARM) Templates

Azure Resource Manager templates are JSON documents which describe a complete set of Azure services. These documents can be saved and managed by IT operations personnel. This highlights a key cloud computing value proposition: the cloud offers technology as a “standard service,” and each service can be encapsulated to be brought up and down as needed.
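A skeletal ARM template looks something like the following. The single resource shown (an App Service website) and its parameter are illustrative placeholders; a real template would declare every service in the environment.

  {
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
      "siteName": { "type": "string" }
    },
    "resources": [
      {
        "type": "Microsoft.Web/sites",
        "apiVersion": "2015-08-01",
        "name": "[parameters('siteName')]",
        "location": "[resourceGroup().location]",
        "properties": { }
      }
    ]
  }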

ARM templates can describe Infrastructure-as-a-Service offerings (e.g. virtual machines, networks and storage). This enables Dev/Test labs to be designed, templated, deployed and un-deployed as needed. Technology teams which must provide a test environment for an upcoming upgrade no longer need to buy infrastructure to support it. Instead, they can define the environment as an ARM template: build the environment in Azure once, extract the template for later use, and then destroy the resources.

ARM templates can also describe Platform-as-a-Service offerings (e.g. websites, services, databases). This enables the exact same concept, with even better results: in the end, you don’t have any servers to manage or patch, because the underlying infrastructure is standardized. Which brings me to deployment slots.

Deployment Slots

A common best practice in delivering software is to have at least one Quality Assurance (QA) environment. This shadow environment should replicate production as closely as possible. However, in the PaaS world, we don’t control the underlying infrastructure – that’s great; it’s standardized and we want to keep it that way. But we don’t want to abandon the practice of performing final testing before deploying to production.

With deployment slots, we can create a number of “environments” for our applications and services, then swap them back and forth as needed. Let’s say you have a new software release which you want to ensure passes some tests before releasing to the user community. Simply create a slot called “Staging,” deploy to it, perform your tests, then swap it into production.
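Swaps can also be scripted. As a rough sketch – the ARM REST action and API version below are given to the best of my knowledge, and the subscription, resource group, site name and bearer token are placeholders – a swap is a single authenticated POST:

  // A rough sketch: swap the "staging" slot into production via the ARM REST API.
  // The subscription, resource group, site name and token are placeholders.
  using System.Net.Http;
  using System.Net.Http.Headers;
  using System.Text;
  using System.Threading.Tasks;

  class SlotSwapper
  {
      static async Task Main()
      {
          const string url =
              "https://management.azure.com/subscriptions/<sub>/resourceGroups/<rg>" +
              "/providers/Microsoft.Web/sites/<site>/slots/staging/slotsswap?api-version=2016-08-01";

          using (var http = new HttpClient())
          {
              http.DefaultRequestHeaders.Authorization =
                  new AuthenticationHeaderValue("Bearer", "<token>");

              // Tell ARM which slot to trade places with.
              await http.PostAsync(url,
                  new StringContent("{\"targetSlot\":\"production\"}", Encoding.UTF8, "application/json"));
          }
      }
  }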

[Screenshot: swapping the Staging slot into Production]

Uh oh – we missed something. We’re human after all. Users are reporting bugs, and they liked it better the way we had it. Swap it back – no harm, no foul.

[Screenshot: swapping the slots back after a bad release]

There are some important things to consider before adding deployment slots to your DevOps pipeline. For example, if your application relies upon a database of some kind, you may need to provision a staging copy for your tests. You also need to be aware that connection strings are among the configuration values which swap with the slot, unless configured to do otherwise.

Deploy to Azure

I was recently treated to some excellent material on the Cortana Analytics suite of products. Paying close attention (as I sometimes do), I noticed that the lab environment was prepared for me as an ARM template. I was directed to GitHub (an online public software repository) and told to push the button marked “Deploy to Azure.” When I did, I was brought to http://deploy.azure.com – and the URL included a reference to the GitHub repository I had just visited. The author of the software had placed an ARM template describing the entire lab environment there, and included a few parameters so that I could fill in information from my Azure subscription. Twenty minutes later, I had Machine Learning, Hadoop/Spark, Data Factory and Power BI resources at my fingertips. Later in the day, we deployed again, this time a simple web app which consumed advanced analytics services. When I was finished, I simply deleted the resources – the entire day cost me less than $20 in Azure consumption. Deploying an app has never been easier.

Azure Container Service

No discussion of DevOps would be complete without mentioning Docker. Docker is a platform gaining popularity among developers and IT operations because containers offer the consistency of virtual machines with far lower overhead. Essentially, Docker runs as a subsystem which hosts containers, and a container plays a role similar to an ARM template: it describes a complete, reproducible environment that can be stood up and torn down on demand.


DevOps Tools on Azure

Linux or Windows, open source or closed, infrastructure or platform, TFS or GitHub – none of that matters anymore. No more excuses: Microsoft Azure provides outstanding DevOps tooling for modern application development. If you have not deployed your first application to Azure, let’s talk. We can get you optimized quickly.

About Brian: Brian Berry leads the Microsoft Business Intelligence and Data Analytics practice at BlumShapiro. He has over 15 years of experience with information technology (IT), software design and consulting. Brian specializes in identifying business intelligence (BI) and data management solutions for upper mid-market manufacturing, distribution and retail firms in New England. He focuses on technologies which drive value in analytics: data integration, self-service BI, cloud computing and predictive analytics.


On the Leading Edge of New Technology

Being on the leading edge of any technology can be exciting, but it is often frustrating and even costly. There is an inherent risk in adopting technology that is new; a lack of community support or documentation when something goes wrong is just one of the issues that can arise. However, there are benefits to being an early adopter. For example, working hands-on with a new technology is the best way to understand how it works. As technology consultants, we view it as our job to understand what’s coming so we can advise our clients with a clear eye to the future.

Scenario

A client asked us about alternatives to their current Remote Desktop Services (RDS) implementation, which was hosted by a third-party vendor. There were a few issues with the current setup, namely cost and maintaining multiple logins, and the client did not have any type of domain or user directory. After exploring a few different RDS deployment scenarios, they ultimately decided on using a preview version of Azure Active Directory Domain Services (Azure AD DS) with Azure virtual machines.

They really liked the idea of using Azure AD DS because of the promised benefits: no servers (on-premises or in the cloud) to maintain, a simplified user interface, and so on. We shared our assessment of the risks and unknowns of using an untested technology, but the client wholeheartedly accepted those risks because there were so many more upsides to using Azure AD DS for their specific setup. So we set out to implement Remote Desktop Services using Azure Active Directory Domain Services…and we learned a couple of things along the way which we are happy to share with you.

Sometimes the Leading Edge is the Bleeding Edge

The first lesson learned was that with Azure AD DS, your account cannot be added as a Domain Admin or Global Admin. Azure AD DS has its own security group, called AAD DC Administrators, which you have to create yourself – a good thing to note when dealing with Azure AD DS. This led us right to our second lesson.

When trying to add the license server as a member of the Terminal Server License Servers AD group, a permissions error popped up:

The computer account for license server [ServerName] cannot be added to the Terminal Server License Servers group in Active Directory Domain Services (AD DS) because of insufficient privileges.


Thinking back to that security group, I thought, “I am not a Domain Admin; I cannot be a Domain Admin.” I felt a little helpless. Thankfully, the computer didn’t need to be added to the group, since all the RDS servers were on the same domain. But still, I couldn’t help feeling like something might be amiss later.

As a Microsoft partner, we have top tier access to Microsoft support, who recommended a few solutions to this issue…all of which resulted in the same permissions roadblock.

When the Microsoft support engineer mentioned this was the first he had heard of someone trying this, I realized I must be a pioneer, attempting an RDS implementation while Azure AD DS was still in beta. One thing was for sure: the Azure AD DS team liked the idea that someone was trying out an RDS implementation with it.

When you work with a beta version, or when you install something without waiting for Service Pack 2, you are blazing a new trail. There is a thrill in being the first person to try something new, and a long-standing honor in the tech world in being the first to figure something out.

In the end, after another hiccup or two, the rest of the Remote Desktop Services deployment went well, without any additional permission issues. And the result showed us that Remote Desktop Services does work well with Azure Active Directory Domain Services and was able to accomplish the client’s goals.

Once the beta for Azure Active Directory Domain Services is complete, I’m wondering if RDS will be on the list of supported technologies. Then I will feel like a true trailblazer cutting a path for others to follow.

Our experience with Microsoft tools gives us an inside track and an ability to work with these new technologies because we deeply understand the underlying platform. While being on the bleeding edge of technology can be risky, having experts to help guide you, navigate any issues and provide needed support can help mitigate some of these risks. And in the end, the benefits to your organization will outweigh any roadblocks encountered along the way.

About Brent:


Brent Harvey has over 10 years of software development experience with a specific focus on SharePoint, Project Server, C# and web development. Brent is an Architect at BlumShapiro Consulting, working on projects across varied industries (banking, manufacturing, health care, etc.). He is a Microsoft Certified Solutions Expert in SharePoint 2013, a Solutions Associate in Windows Server 2012, a Specialist in Developing Azure Solutions, and a Professional Developer in SharePoint 2010.
