Archive for Technology

T-SQL Inner Join Need Not Always Be an Equal Join

Have you ever had to write a T-SQL query that joined a table to itself? The classic example is an employee table that lists the ID of each person’s supervisor or boss, which refers to another row in the same table. (Hint: if you do have to do that, search for “recursive hierarchy.”) Our example today is something different.

In our example we’re working from the following scenario: given a list of cities on a map, generate a table that can record the distance between any two of them. In other words, create a list where every value in the table is paired with every other value. For this exercise, we’re not going to calculate the actual distances, simply create the records and leave the space for someone else to fill in. Sounds simple enough. Let’s start with the simplest example, TWO points: New York and San Francisco.

We could easily create a simple three column table with the following entries:
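A minimal sketch of what that table and those entries might look like (the table and column names here are assumptions, since the original tables were shown as images):

```sql
-- Hypothetical Distances table; Distance is left NULL for someone else to fill in
CREATE TABLE Distances (
    City1    VARCHAR(50) NOT NULL,
    City2    VARCHAR(50) NOT NULL,
    Distance INT NULL
);

-- One row for each direction of the trip
INSERT INTO Distances (City1, City2)
VALUES ('New York', 'San Francisco'),
       ('San Francisco', 'New York');
```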

But the distance in one direction is the same as the return trip so really, we only need one row. After deleting the redundant record and adding Chicago to our list of cities, we get this:


As you can see, with the addition of just one value to our list of cities, we need two additional rows in the Distances table. As you might expect, the fourth city in the first table would require three more rows in the second, the fifth city would require four more, and so on. The growth is quadratic: at 100 cities, we would need 4,950 pairs of cities in the distance table, and at 1,000 we would need close to half a million. The actual number is

N * ( N – 1 ) / 2.

Time to automate, which is the focus of this article.

Let’s settle on a total of five cities, with the last two being New Orleans and Las Vegas. We could easily build the grid below and fill in any of the appropriate blocks. The black squares are same-city pairs (New York to New York, or Chicago to Chicago, etc.) so we don’t need to worry about them. And if we can get the distance from New York to San Francisco, for example, we don’t need the distance from San Francisco back to New York, so we can also eliminate all the squares marked in gray:

Let’s replace our city names with numbers. After all, any decent database design will have a unique key assigned to each row of the City table.
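Assuming a simple City table keyed on an integer ID, it might look like this (the names and IDs are illustrative):

```sql
-- Hypothetical City table with a unique key per row
CREATE TABLE City (
    CityID   INT PRIMARY KEY,
    CityName VARCHAR(50) NOT NULL
);

INSERT INTO City (CityID, CityName)
VALUES (1, 'New York'),
       (2, 'San Francisco'),
       (3, 'Chicago'),
       (4, 'New Orleans'),
       (5, 'Las Vegas');
```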

And our grid now looks like this:


So how do we create this mega list of relationships from ONE table? An INNER JOIN of the City table to itself on the City ID column will generate the squares marked in black. But that is exactly what we DON’T want. What about a CROSS JOIN?
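A CROSS JOIN sketch, assuming a City table keyed on a CityID column:

```sql
-- Every city paired with every city: 5 x 5 = 25 rows,
-- including same-city pairs and both directions of each trip
SELECT c1.CityID AS City1, c2.CityID AS City2
FROM City AS c1
CROSS JOIN City AS c2;
```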

That will do it, but with over twice the results we need. Remember that once we know the distance from, say, Chicago to Las Vegas, we don’t also need to record the distance from Las Vegas to Chicago. And we’re still left with some extra rows where the cities are the same, like the distance from New Orleans to New Orleans.

Comparing the City ID columns in the result set above with the City ID axis of the grid, we can see that we’re only looking for pairs (cells) where the ID of City 2 is greater than the ID of City 1. From here, we have two choices: Stick with the CROSS JOIN and add a WHERE clause, or go back to the INNER JOIN and change the ON clause:
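The two options might be written like this (a sketch, again assuming a City table keyed on a CityID column):

```sql
-- Option 1: CROSS JOIN filtered by a WHERE clause
SELECT c1.CityID AS City1, c2.CityID AS City2
FROM City AS c1
CROSS JOIN City AS c2
WHERE c2.CityID > c1.CityID;

-- Option 2: INNER JOIN with GREATER THAN in the ON clause
SELECT c1.CityID AS City1, c2.CityID AS City2
FROM City AS c1
INNER JOIN City AS c2
    ON c2.CityID > c1.CityID;

-- Either query returns the N * ( N - 1 ) / 2 pairs we want: 10 rows for 5 cities
```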

The results are the same. In fact, the estimated execution plan from the query optimizer is the same as well.

The point of this whole exercise is this: an INNER JOIN need not always be an EQUAL JOIN, though that is by far the most common case. When faced with unique query needs, think outside the box. In this case, simply substituting a GREATER THAN for the EQUAL in the ON clause did the trick.

Learn more about how BlumShapiro Consulting can help with a project like this and additional application development.

About Todd: Todd Chittenden started his programming and reporting career with industrial maintenance applications in the late 1990s. When SQL Server 2005 was introduced, he quickly became certified in Microsoft’s latest RDBMS technology and has added certifications over the years. He currently holds an MCSE in Business Intelligence. He has applied his knowledge of relational databases, data warehouses, business intelligence and analytics to a variety of projects for BlumShapiro since 2011.

Send Custom Emails Using Azure Functions as Scheduled Tasks in SharePoint Online

Recently, a client of ours was looking to have a daily email sent to each user who has an overdue task, or a task that is about to expire. The email had a specific format: each task needed direct links and had to be grouped by the date it was due. Since it was an on-premises Project Server and SharePoint 2013 farm, it was not too difficult to build and embed the solution into the farm. I took care of this by building a SharePoint timer job, which leveraged a SharePoint search to retrieve all tasks whose due date fell before the next business day. Once deployed and activated, this timer job was automatically scheduled to run every morning, and the SharePoint admins could trigger it manually. Everything worked great.

Another client of ours was looking for a solution exactly like this, except they were strictly SharePoint Online / Project Online. They had no on-premises farm; there were no real servers to speak of. One option would have been to create a PowerShell script or .NET executable to run the code, and have that process run as a Scheduled Task on some server. But again, there were no servers. And even if there had been, what is the point of being in the cloud if you are still stuck with a foot (or process) on the ground?

So, I turned to Microsoft Azure, and that’s where Azure Functions came into play.

Azure Functions

Azure Functions are programs or code snippets that run in the cloud. They can run on schedules or can be triggered by different types of events (HTTP request, item added to Azure Blob Storage, Excel file saved to OneDrive, etc.). Think of this as a Windows Scheduled Task that can be triggered by modern events and activities.

The programs or code snippets can be created and edited within the Azure Portal, making it easy to get up and running with an Azure Function. The languages supported by Azure Functions are more than adequate: PowerShell, C#, JavaScript, F#, Python, PHP, Bash, and Batch.

Note that I could have also used Azure WebJobs to accomplish this, but I felt that Azure Functions had many positives: they are easy for the client to maintain, they scale automatically, the client only pays for the time the code executes, and they support WebHooks and can easily be triggered via an HTTP request.

Send Custom Emails from SharePoint Online

For this solution, I created the Azure Function in Visual Studio pre-compiled with the SharePoint Online client-side object model (CSOM) DLLs. The solution was straightforward, as it would use CSOM to query SharePoint Online’s search service for all overdue tasks and tasks due within the next business day. It would then do some logic to build the email content based on the assigned users, and then send out emails using SendGrid. SendGrid is built into Microsoft Azure, so configuring it was a breeze, and you get 25,000 free emails a month!

Once deployed, I configured the Azure Function to run on schedule (like before), and it can even be triggered by an HTTP request, so putting an HTTP request in a SharePoint site workflow or Microsoft Flow means that any site user would be able to trigger this function as needed.
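For reference, a timer trigger in Azure Functions is configured with a CRON-style schedule (which includes a seconds field) in the function’s function.json file. The binding name and schedule below are just an illustration, firing at 7:00 AM on weekdays:

```json
{
  "bindings": [
    {
      "name": "dailyEmailTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 7 * * 1-5"
    }
  ]
}
```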

Long gone are the days when integration servers lay around the data center waiting for more processes to consume their over-allocated resources. Most servers (virtual machines, really) are now dedicated to a specific application and shouldn’t share their resources with one-off processes.

Azure Functions is a great serverless architecture solution to these problems. Whether you need to send emails, calculate metrics, or analyze big data, Azure Functions can be a solution for you. Learn more about how BlumShapiro can help your organization with issues like this.

About Brent:


Brent Harvey has over 10 years of software development experience with a specific focus on SharePoint, Project Server, C# and web development. Brent is an Architect at BlumShapiro Consulting. Brent is a Microsoft Certified Solutions Expert in SharePoint 2013, Solutions Associate in Windows Server 2012, Specialist in Developing Azure Solutions and Professional Developer in SharePoint 2010.

Modern Mobile Application Development with Xamarin and Azure App Services

To stay relevant in today’s digital revolution, having a mobile application is vital. Your customers and employees almost expect companies to have a mobile app for connecting and sharing information. However, companies often run into trouble when it comes to designing and developing their app. In this article, we will look at why you should develop your mobile app using Xamarin and Azure App Service from Microsoft.

What Do Xamarin and Azure Mobile Apps Have to Offer?

Xamarin is Microsoft’s mobile cross-platform solution for developing apps for iOS, Android, Mac and wearable technology. Xamarin makes it easier to develop apps by using the C# language and the ability to test on hundreds of devices via the Xamarin Test Cloud. Azure App Service is Microsoft’s back-end as a service solution that provides all the necessary features for building flexible and scalable mobile apps. Being able to scale your app will be helpful when your newly created Xamarin app becomes the #1 selling app in the app stores!

One advantage of Xamarin is that you only need to write the code once (C#) and then you can run your app everywhere, allowing your company to reach your customers on any platform. On the Azure Mobile Apps side, a key advantage is that you have one back-end service that can be used by iOS, Android and Windows devices. Working in tandem, these offerings can help make your app a major success.

How Xamarin Can Help Create a Successful App

When it comes to having a modern, cross-platform mobile app, there are a few key features that are needed, and the Xamarin platform, along with some help from Azure Mobile Services, can make your app a successful one.

Mobile Engagement and Notifications

Modern Xamarin apps tailor the user experience and engagement with dynamic, context-aware notifications. This means that users get notifications when they really want them and when they can have the greatest impact. Azure Mobile Engagement provides the API functionality for getting insight into app usage and real-time segmentation, which supplies the contextual awareness that is needed before any notification is sent. The Azure Notification Hub is a service that provides the functionality to send notifications across all platforms. You can then take advantage of Azure Machine Learning to provide predictive analytics that tailor the user notification experience.

Storage

Modern mobile apps are expected to have online and offline data access to cloud drives, services and on-premises systems. The Xamarin platform simplifies how to store and access data on each platform via the ADO.NET framework and the SQLite data provider. To save you from having to learn each platform’s storage APIs and idioms, Azure Mobile Apps lets you choose from SQL Database, Azure Table Storage, MongoDB, DocumentDB, or even connect to other services like Salesforce and Office 365, all through an API provided by Azure.

Authentication

Successful apps these days never ask a user to register or create an account. Instead, modern apps use Azure Mobile Services for single sign-on via OAuth/OAuth2 connections to Facebook, Twitter, Google and Microsoft accounts. Your Xamarin app can take advantage of this via the Azure Mobile Services SDK without having to write any code.

Testing

Modern apps need to work. This means your app can’t display error messages, crash or freeze. The days when users would forgive and wait for an update are long gone. Users have high standards for their apps, and if yours is not up to par, they will not hesitate to delete it. No matter how beautifully your app is designed, if it is slow or crashes, users will be quick to send it to the trash bin. Xamarin Test Cloud, along with Azure Application Insights, provides you with the tools to test your app on hundreds of devices, capture performance data and monitor activity. By taking advantage of these features you can reduce the chance of bugs in the final version of your app by testing and clearing them up early in the process.

Conclusion

Today’s digital age requires that your modern app provide a rich and tailored experience on day one. If you do not have the capital to re-invent the wheel or deal with fragmented systems, Xamarin and Azure Mobile Apps are the perfect combination to use for your app. Here at BlumShapiro, our Xamarin and Azure certified consultants stand ready to help you build your modern app.

About Hector: 

Hector Luciano, Jr., is a Consulting Manager at BlumShapiro, a Microsoft Gold Partner focusing on SharePoint, Office 365, mobile technologies and custom development solutions. Hector is focused on delivering high-value solutions to his customers in mobile and SharePoint.

Using Real Time Data Analytics and Visualization Tools to Drive Your Business Forward

Business leaders need timely information about the operations and profitability of the businesses they manage to help make informed decisions. But when information delivery is delayed, decision makers lose precious time to adjust and respond to changing market conditions, customer preferences, supplier issues or all three. When thinking about any business analytics solution, a critical question to ask is: how frequently can we (or should we) update the underlying data? Often, the first answer from the business stakeholders is “as frequently as possible.” The concept of “real time analytics,” with data being provided up-to-the-minute, is usually quite attractive. But there may be some confusion about what this really means.

While the term real time analytics does refer to data which is frequently changing, it is not the same as simply refreshing data frequently. Traditional analytics packages which take advantage of data marts, data warehouses and data cubes are often collectively referred to as a Decision Support System (DSS). A DSS helps business analysts, management and ownership understand historical trends in their business, perform root cause analysis and enable strategic decisions. Whereas a DSS aggregates and analyzes sales, costs and other transactions, a real time analytics system ingests and processes events. One can imagine a $25 million business recording 10,000 transactions a day. Now imagine that same business recording events on its website: logins, searches, shopping cart adds, shopping cart deletes, product image zoom events. If the business is 100% online, how many events would that be? The answer may astonish you.

Why Real Time Analytics?

DSS solutions answer questions such as “What was our net income last month?”, “What was our net income compared to the same month last year?” or “Which customers were most profitable last month?” Real time analytics answers questions such as “Is the customer experience positive right now?” or “How can we optimize this transaction right now?” In the retail industry, listening to social media channels to hear what customers are saying about their experience in your stores, can drive service level adjustments or pricing promotions. When that analysis is real-time, store managers can adjust that day for optimized profitability. Some examples:

  1. Social media sentiment analysis – addressing customer satisfaction concerns
  2. Eliminating business disruption costs with equipment maintenance analytics
  3. Promotion and marketing optimization with web and mobile analytics
  4. Product recommendations throughout the shopping experience, online or “brick and mortar”
  5. Improved health care services with real time patient health metrics from wearable technology

In today’s world, customers expect world class service. Implicit in that expectation is the assumption that companies with whom they do business “know them”, anticipate their needs and respond to them. That’s easy to say, but harder to execute. Companies who must meet that expectation need technology leaders to be aware of three concepts critical to making real time analytics a real thing.

The first is Internet of Things or IoT. The velocity and volume of data generated by mobile devices, social media, factory floor sensors, etc. is the basis for real time analytics. “Internet of Things” refers to devices or sensors which are connected to the internet, providing data about usage or simply their physical environment (where the device is powered on). Like social media and mobile devices, IoT sensors can generate enormous volumes of data very, very quickly – this is the “big data” phenomenon.

The second is Cloud Computing. The massive scale of IoT and big data can only be achieved with cloud scale data storage and cloud scale data processing. Unless your company’s name is Google, Amazon or Microsoft, you probably cannot keep up. So, to achieve real-time analytics, you must embrace cloud computing.

The third is Intelligent Systems. IBM’s “Watson” computer achieved a significant milestone by out-performing humans on Jeopardy. Since then, companies have been integrating artificial intelligence (AI) into large scale systems. AI in this sense is simply a mathematical model which calculates the probability that data represents something a human would recognize: a supplier disruption, a dissatisfied customer about to cancel their order, an equipment breakdown. Using real time data, machine learning models can recognize events which are about to occur. From there, they can automate a response, or raise an alert to the humans involved in the process. Intelligent systems help humans make nimble adjustments to improve the bottom line.

What technologies will my company need to make this happen?

From a technology perspective, a clear understanding of cloud computing is essential. When evaluating a cloud platform, CIOs should look for breadth of capability and support for multiple frameworks. As a Microsoft Partner, BlumShapiro Consulting works with Microsoft Azure and its Cortana Intelligence platform. This gives our clients cloud scale, low cost and a wide variety of real time and big data processing options.


This diagram describes the Azure resources which comprise Cortana Intelligence. The most relevant resources for real time analytics are:

  1. Event Hubs ingests high-velocity streaming data sent by event providers (i.e., sensors and devices)
  2. Data Lake Store provides low-cost cloud storage with no practical limits
  3. Stream Analytics performs in-flight processing of streaming data
  4. Machine Learning, or AzureML, supports the design, evaluation and integration of predictive models into the real-time pipeline
  5. Cognitive Services are out-of-the-box artificial intelligence services addressing a broad range of common machine intelligence scenarios
  6. Power BI supports streaming datasets made visible in a dashboard context

Four Steps to Get Started with Real Time Analytics

Start with the Eye Candy – If you do not have a dashboard tool which supports real-time data streaming, consider solutions such as Power BI. Even if you are not ready to implement an IoT solution, Power BI makes any social media or customer marketing campaign much more feasible. Power BI can connect to databases, data marts, data warehouses and data cubes, and is valuable as a dashboard and visualization tool for existing DSS systems. Without visualization, it will be very difficult to derive human insights and actions from any kind of data, slow or fast.

Get to the Cloud – Cloud storage costs and cloud processing scale are the only mechanisms by which real time analytics is economically feasible (for most companies). Learn how investing in technologies like Cloud Computing can really help move your business forward.

Embrace Machine Intelligence – To make intelligent systems a reality, you will need to understand machine learning technologies, if only at a high level. Historically, this has meant developing a team of data scientists, many of whom have PhDs in mathematics or statistics, and using open source tools like R or Python. Today, machine learning is much more accessible than it has ever been. AzureML helps fast-track both the evaluation and operationalization of predictive models.

Find the Real-Time Opportunity – As the technology leader in the organization, the CIO will need to work closely with other business leaders to understand where real-time information can increase revenue, decrease costs or both. This may require imagination. Start with the question: what would we like to know faster? If we knew our customer was going to do this sooner, how would we respond? If we knew our equipment was going to fail sooner, how would we respond? If we knew there was an opportunity to sell more, how would we respond?

About Brian: Brian Berry leads the Microsoft Business Intelligence and Data Analytics practice at BlumShapiro. He has over 15 years of experience with information technology (IT), software design and consulting. Brian specializes in identifying business intelligence (BI) and data management solutions for upper mid-market manufacturing, distribution and retail firms in New England. He focuses on technologies which drive value in analytics: data integration, self-service BI, cloud computing and predictive analytics.
