4 Cost-Saving DevOps Tools on Azure

Technology leaders need to pay attention to DevOps. Yes, it’s a funny little name. Wikipedia states that DevOps is a compound of “development” and “operations” before explaining it as “a culture, movement or practice that emphasizes the collaboration and communication of both software developers and other information-technology (IT) professionals while automating the process of software delivery and infrastructure changes.”

Technology professionals know that identifying, tracking and resolving bugs costs money. If you are the one writing the software (and sooner or later, everyone will), the bugs are on your dime. Good testing practices can help minimize bugs and costs. However, sometimes bugs result from deployment practices. Indeed, the best technology operations focus on standardized, automated testing and release management practices. Following DevOps best practices, software teams treat software deliverables the way a manufacturing company treats finished goods: ruthlessly eliminating deviations with automation.

If you have tried and failed to create innovative solutions within your company by writing software, there could be several reasons why. If the requirements were right, the architecture was sound and your developers understood the technology, then examine the process of delivering the software to the users.

Delivering Software Cost Effectively

The concept behind DevOps has gone by other names, including Continuous Integration (CI) and Application Lifecycle Management (ALM). Often, IT departments found ALM complex, or lacked the knowledge required to design a pipeline for software development. But the tools have continued to evolve, and the processes have become simpler. Today, cloud vendors deliver DevOps services which are very hard to dismiss, and Microsoft’s Azure platform is among the very best, providing many tools for standardizing, testing and delivering high quality software.

Here are my four favorites:

Azure Resource Manager (ARM) Templates

Azure Resource Manager templates are JSON documents which describe a complete set of Azure services. These documents can be saved and managed by IT operations personnel. This highlights a key cloud computing value proposition: the cloud offers technology as a standard service, and each service can be encapsulated so it can be brought up and down as needed.

ARM templates can describe Infrastructure-as-a-Service offerings (e.g. virtual machines, networks and storage). This enables dev/test labs to be designed, templated, deployed and un-deployed as needed. Technology teams which must stand up a test environment ahead of an upgrade no longer need to buy infrastructure to support it. Instead, they can define the environment as an ARM template: build the environment in Azure once, extract the template for later use, and then destroy the resources.
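As a sketch of that build-once, extract, destroy cycle, here is how it might look with the Azure SDK for Python (azure-identity and azure-mgmt-resource). The resource names and subscription ID are hypothetical, and the template would normally live in its own JSON file under source control:

```python
# A minimal sketch, assuming the azure-identity and azure-mgmt-resource
# packages; the resource names and subscription ID are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

# The ARM template is the same kind of JSON document described above,
# shown here as a Python dict for brevity.
template = {
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "resources": [{
        "type": "Microsoft.Storage/storageAccounts",
        "apiVersion": "2022-09-01",
        "name": "devtestlabstore001",   # hypothetical; must be globally unique
        "location": "eastus",
        "sku": {"name": "Standard_LRS"},
        "kind": "StorageV2",
    }],
}

client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Stand the test environment up from the template...
client.resource_groups.create_or_update("devtest-lab", {"location": "eastus"})
client.deployments.begin_create_or_update(
    "devtest-lab",
    "lab-deployment",
    {"properties": {"template": template, "mode": "Incremental"}},
).result()

# ...and tear the whole thing down when testing is complete. Only the
# template survives, ready to recreate the environment on demand.
client.resource_groups.begin_delete("devtest-lab").result()
```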

ARM templates can also describe Platform-as-a-Service offerings (e.g. websites, services and databases). This enables the exact same concept, with even better results: in the end, there are no servers to manage or patch, because the underlying infrastructure is standardized. This brings me to deployment slots.

Deployment Slots

A common best practice in delivering software is to have at least one Quality Assurance (QA) environment. This shadow environment should replicate production as closely as possible. However, in the PaaS world, we don’t have control of the underlying infrastructure – that’s great, it’s standardized and we want to keep it that way. But we don’t want to abandon the practice of performing final testing before deploying to production.

With deployment slots, we get the ability to create a number of “environments” for our applications and services, then swap them back and forth as needed. Let’s say you have a new software release which you want to ensure passes some tests before releasing it to the user community. Simply create a slot called “Staging”, deploy the release to it, perform your tests, then swap it into production.
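As a sketch of that workflow, the Azure CLI can drive the whole cycle; here it is wrapped in a little Python, with hypothetical app and resource group names:

```python
# A sketch of the slot workflow driven by the Azure CLI from Python; the
# app and resource group names are hypothetical, and "az" must be logged in.
import subprocess

def az(*args):
    """Run an Azure CLI command, raising if it fails."""
    subprocess.run(["az", *args], check=True)

# Create a "staging" slot alongside the production app.
az("webapp", "deployment", "slot", "create",
   "--name", "contoso-web", "--resource-group", "contoso-rg",
   "--slot", "staging")

# ...deploy the new release to the staging slot and run your tests...

# Swap staging into production. Because it is a swap, the previous
# production build now sits in the staging slot, ready to be restored.
az("webapp", "deployment", "slot", "swap",
   "--name", "contoso-web", "--resource-group", "contoso-rg",
   "--slot", "staging", "--target-slot", "production")
```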


Uh oh – we missed something. We’re human after all. Users are reporting bugs, and they liked it better the way we had it. Swap it back – no harm, no foul.


There are some important things to consider before adding deployment slots to your DevOps pipeline. For example, if your application relies upon a database of some kind, you may need to provision a staging copy for your tests. You also need to be aware that connection strings are among the configuration values which move with the app during a swap, unless they are configured as slot settings.
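For example, assuming the same hypothetical app as above, a connection string can be pinned to its slot with the CLI’s --slot-settings flag, so it stays behind during a swap:

```python
# A sketch of pinning a connection string to a slot; all names and the
# connection string value are hypothetical. Values passed via
# --slot-settings stay with the slot instead of following the app code.
import subprocess

subprocess.run([
    "az", "webapp", "config", "connection-string", "set",
    "--name", "contoso-web", "--resource-group", "contoso-rg",
    "--slot", "staging",
    "--connection-string-type", "SQLAzure",
    "--slot-settings",
    "AppDb=Server=tcp:staging-sql.database.windows.net;Database=AppDb",
], check=True)
```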

Deploy to Azure

I was recently treated to some excellent material on the Cortana Analytics Suite of products. Paying close attention (as I sometimes do), I noticed that the lab environment was prepared for me as an ARM template. I was directed to GitHub (an online public software repository) and told to push the button marked “Deploy to Azure”. When I did, I was brought to http://deploy.azure.com – and the URL included a reference to the GitHub repository I had just visited. The author of the software had placed there an ARM template describing the entire lab environment, and included a few parameters so that I could fill in the information from my Azure subscription. Twenty minutes later, I had Machine Learning, Hadoop/Spark, Data Factory and Power BI resources at my fingertips. Later in the day, we deployed again, this time a simple web app which consumed Advanced Analytics services. When I was finished, I simply deleted the resources – the entire day cost me less than $20 in Azure consumption. Deploying an app has never been easier.

Azure Container Services

No discussion of DevOps would be complete without mentioning Docker. Docker is a platform gaining popularity among developers and IT operations teams because it offers the environmental consistency of virtual machines with much lower overhead. Essentially, Docker runs as a subsystem which hosts containers, and a container plays a role similar to an ARM template: it encapsulates a complete, standardized environment which can be created and destroyed on demand. Azure Container Service builds on this by provisioning a cluster of virtual machines pre-configured with an orchestrator (DC/OS, Docker Swarm or Kubernetes) for running containers at scale.
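To give a feel for why developers like it, here is a minimal illustration using the Docker SDK for Python (docker-py), assuming a local Docker engine is running:

```python
# A minimal illustration using the Docker SDK for Python (docker-py),
# assuming a local Docker engine is running.
import docker

client = docker.from_env()

# Containers start in seconds because they share the host's kernel rather
# than booting a full guest OS the way a virtual machine does.
web = client.containers.run("nginx:latest", detach=True,
                            ports={"80/tcp": 8080})
print(web.short_id, web.status)

# Tear it down just as quickly; the image stays cached for next time.
web.stop()
web.remove()
```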


DevOps Tools on Azure

Linux or Windows, Open Source or Closed, Infrastructure or Platform, TFS or GitHub – none of that matters anymore. No more excuses: Microsoft Azure provides outstanding DevOps tooling for modern application development. If you have not deployed your first application to Azure, let’s talk. We can get you optimized quickly.

About Brian: Brian Berry leads the Microsoft Business Intelligence and Data Analytics practice at BlumShapiro. He has over 15 years of experience with information technology (IT), software design and consulting. Brian specializes in identifying business intelligence (BI) and data management solutions for upper mid-market manufacturing, distribution and retail firms in New England. He focuses on technologies which drive value in analytics: data integration, self-service BI, cloud computing and predictive analytics.


SharePoint Mobile App Review, Tips and Tricks

Today I’d like to discuss the new SharePoint Mobile App, aka the Intranet in your pocket. In 2016, Microsoft released SharePoint 2016, along with a mobile application for the product. If you have not already downloaded the app, I highly suggest you download it, and follow along with the tips provided below. This post will serve as an introduction to the SharePoint Mobile app, highlight some of its capabilities and provide tips to make sure you take advantage of the features offered. So, let’s get started.

SharePoint Sites Feature

It is important to understand that SharePoint is hierarchical, and the sites screen illustrates this concept. The concept was not illustrated well in the old mobile web view of SharePoint, so the progress made with the app has not gone unnoticed. The mobile app provides a clean and simple sites screen and, by default, orders sites by your most recent activity. The sites screen menu provides access to view lists, libraries and sub-sites. Along with seeing the sites that you use, you also have the ability to share sites with others, or mark one as a favorite. This screen brings the best of SharePoint right to your fingertips and is a perfect start.

Embedded Browser View

A view of a site in an embedded browser view.

While navigating between sites, the app will display each site differently: some sites load via an embedded browser in the app, whereas others get a native app screen. The embedded browser view is SharePoint’s mobile web view, and while it is not as clean or user-friendly as a native screen, it still provides the ability to see the site’s content. In the past, the mobile web view would open every site on a separate browser page, causing confusion and a poor user experience, so having the embedded browser view as part of the mobile app navigation is a major improvement.

Quick Tip

If you regularly use a site, I highly recommend pressing the “star” button on the site screen. This will add the site to the “Following” tab for quicker access. Remember to remove sites you no longer have an interest in by tapping the “star” button again.

SharePoint Links Feature

Microsoft knows that SharePoint is a great tool for collaboration, sharing and empowering users. The links screen provides you with easy access to both internal and external resources. If your company has a “Quick Links” section on its intranet, those links should also display in the SharePoint app. If you don’t see any links displayed, contact your administrator and request that they update the “Featured Links” section in Office 365.

 

SharePoint People Feature

The people screen provides you with direct access to your contact list and their profile pages. Creating and managing your contacts is key to getting the most out of SharePoint. If you have never used Microsoft’s contacts capabilities, now is a great time to give them a try. To add new contacts, just navigate to Office 365, select the people link from the quick launch and start adding contacts.

I highly suggest you start by creating contacts for leaders within your organization. If you need help finding these individuals, the search feature will serve as a big help. To view an individual’s page, click on their name in the people screen. This will open a page that clearly displays their contact information, title, a photo, who they work with and recent activity.

Quick Tip

One of the most important features of the people screen is the ability to add notes about your contacts. For example, you can create a note recording when you met a certain individual. This information is visible only to you and is one of the “hidden” features available in Microsoft People.

SharePoint Search Feature

In my opinion, the greatest feature in the new SharePoint mobile app is search. Using the search feature in the app is the easiest way to find the information you are looking for. The app allows you to drill down and search based on specific dimensions, including sites, files, people and recommended items. It is clear to me that Microsoft’s investment in SharePoint search is paying off. Give it a try—go ahead and search for a file either by name or by the content of the file.

See what else SharePoint can do when it comes to workflows and automation of business processes.

Conclusion

Microsoft’s release of this SharePoint app shows their commitment to both the mobile space and SharePoint. Here at BlumShapiro Consulting, we are Gold Certified in Collaboration and Content. We are Partners in Office and Collaboration and are ready to help your business leverage these Microsoft tools. Learn more about SharePoint by looking at our library of posts on the topic here. Contact us to learn more about how SharePoint can help your organization.


About Hector: 

Hector Luciano, Jr., is a Consulting Manager at BlumShapiro, a Microsoft Gold Partner focusing on SharePoint, Office 365, mobile technologies and custom development solutions. Hector is focused on delivering high-value solutions to his customers in mobile and SharePoint.

Do Data Scientists Fear for Their Jobs?

What happened in this last election, November 2016? Rather, what happened to the analysts in this last election? Just about every poll and news report prediction had Hillary Clinton leading by a comfortable margin over Donald Trump. In every election I can recall from years past, the number crunchers have been pretty accurate on their predictions—at least on who would win if not the actual numerical results. However, this turned out not to be the case for the 2016 presidential race.

But this is not the first time this has happened. In 1936, Franklin Delano Roosevelt defeated Alfred Landon, much to the chagrin of The Literary Digest, a magazine that collected two and a half million mail-in surveys—roughly five percent of the voting population at the time. George Gallup, on the other hand, predicted a Roosevelt victory with a mere 3,000 interviews. The difference, according to the article’s author, was that The Literary Digest’s mailing lists were sourced from vehicle registration records. How did this impact the results? In 1936 not everyone could afford a car; therefore, The Literary Digest’s sample was not truly representative of the voting population. This is known as sampling bias, where the very method used to collect the data points exerts its own force on the numbers collected. Gallup’s interviews, by contrast, were more in line with the voting public.

The article cited above also mentions Boston’s ‘Street Bump’ smartphone app “that uses the phone’s accelerometer to detect potholes… as citizens of Boston … drive around, their phones automatically notify City Hall of the need to repair the road surface.” What a great idea! Or was it? The app was only collecting data from people who a) owned a smartphone, b) were willing to download the app, and c) drove regularly. Poorer neighborhoods were pretty much left out of the equation. Again, an example of sample bias.
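To see how strongly sampling bias can skew a result, here is a toy simulation in Python; every number in it is invented purely for illustration. A huge poll drawn only from car owners misses badly, while a far smaller truly random sample lands near the truth:

```python
# A toy simulation of sampling bias; every number here is invented.
import random

random.seed(42)

population = []
for _ in range(1_000_000):
    owns_car = random.random() < 0.40                       # 40% own a car
    # Car owners favor candidate A; non-owners mostly do not.
    votes_a = random.random() < (0.65 if owns_car else 0.40)
    population.append((owns_car, votes_a))

true_share = sum(v for _, v in population) / len(population)

# A huge sample drawn only from car owners (the Literary Digest mistake)...
biased_poll = [v for owns, v in population if owns]
# ...versus a small but truly random sample (the Gallup approach).
random_poll = random.sample([v for _, v in population], 3_000)

print(f"true support for A:   {true_share:.3f}")                          # ~0.50
print(f"car-owners-only poll: {sum(biased_poll)/len(biased_poll):.3f}")   # ~0.65
print(f"small random sample:  {sum(random_poll)/len(random_poll):.3f}")   # ~0.50
```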

The final case, and not to pick on Boston: I recently heard that data scientists analyzing Twitter feeds for positive and negative sentiment had to factor in the term “wicked” as a positive sentiment force, but only for greater Boston. Apparently, that adjective doesn’t mean what the rest of the country assumes it means.

Along with sampling bias, another driving factor in erroneous conclusions from analyzing data is the ‘undocumented confounder.’ Suppose, for example, you wanted to see which coffee people prefer, that from Starbucks or Dunkin’ Donuts. For this ‘experiment’, we’re interested only in the coffee itself, nothing else. So we have each shop prepare several pots with varying additions like ‘cream only’, ‘light and sweet’, ‘black no sugar’, etc. We then take these to a neutral location and do a side-by-side blind taste comparison. From our taste results we draw some conclusions as to which coffee the sample population prefers. But unbeknownst to us, when the individual shops prepared their various samples of coffee, one shop used brown sugar and one used white sugar, or one used half-and-half while the other used heavy cream. The cream and sugar are now both undocumented confounders of the experiment, possibly driving results one way or the other.
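A toy version of that experiment (again, all numbers invented) shows how the confounder, rather than the coffee, can decide the outcome:

```python
# A toy illustration of an undocumented confounder; all numbers invented.
import random

random.seed(7)

def taste_score(shop, sweetened):
    base = 6.0 if shop == "B" else 5.5   # tasters genuinely prefer B's brew
    bonus = 1.5 if sweetened else 0.0    # but sweetness boosts any rating
    return base + bonus + random.gauss(0, 1.0)

# Unbeknownst to the experimenters, shop A sweetened every pot (the
# undocumented confounder) while shop B did not.
shop_a = [taste_score("A", sweetened=True) for _ in range(200)]
shop_b = [taste_score("B", sweetened=False) for _ in range(200)]

print(f"mean rating, shop A: {sum(shop_a)/len(shop_a):.2f}")   # ~7.0 "wins"
print(f"mean rating, shop B: {sum(shop_b)/len(shop_b):.2f}")   # ~6.0
```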

So, back to the elections: how did this year’s political analysts miss the mark? Without knowing their sampling methods, I’m willing to suggest that some form of sample bias or confounder may have played a part. Was it the well-known ‘cell-only problem’ again (households with no land line are less likely to be reached by pollsters)? Did they take into consideration that Trump used Twitter to deliver sound-bite-like messages to his followers, bypassing the mainstream media’s content filters? Some other factor, perhaps as yet unidentified? As technology advances and society’s trends morph over time, so must political polling and data analysis methods.

Pollsters and data scientists are continually refining their methods of collection, compensation factors and models to eliminate any form of sample bias in order to get closer to the ‘truth.’ My guess is that the election analysts will eventually figure out where they went wrong. After all, they’ve got three years to work it out before the next presidential race starts. Heck, they probably started slogging through all the data the day after the election!

One needs to realize that data science is just that, a science, and not something that can simply be stepped into without knowledge of the complexities of the discipline. Attempting to do so without a full understanding of sample bias, undocumented confounders and a host of other factors will lead you down the path to a wrong conclusion, aka ‘failure.’ History has shown that, for ANY science, there are many failed experiments before a breakthrough. Laboratory scientists exercise caution and adhere to strict protocols to keep their work from being ruined by outside contaminants; the same goes for data scientists, who must continually refine their collection methods and models when experiments fail.

So what about the ‘data science’ efforts for YOUR business? Are you trying to predict outcomes based on limited datasets and rudimentary Excel skills, then wondering why you can’t make any sense out of your analysis models? Do you need help identifying and eliminating sample bias, or accounting for those pesky ‘undocumented confounders’? Social media sentiment analysis is a big buzzword these days, with lots of potential for companies to mix it with their own performance metrics. But many just don’t know how to go about it, or are afraid of the cost.

At BlumShapiro Consulting, our team of consultants is constantly looking at the latest trends and technologies associated with data collection and analysis. Some of the same principles associated with election polling can be applied to your organization through predictive analytics and demand planning. Using Microsoft’s Azure framework, we can quickly develop a prototype solution that can help take your organization’s data reporting and predicting to the next level.

About Todd: Todd Chittenden started his programming and reporting career with industrial maintenance applications in the late 1990s. When SQL Server 2005 was introduced, he quickly became certified in Microsoft’s latest RDBMS technology and has added certifications over the years. He currently holds an MCSE in Business Intelligence. He has applied his knowledge of relational databases, data warehouses, business intelligence and analytics to a variety of projects for BlumShapiro since 2011.


Three Steps to High Quality Master Data

Data quality is critical to business, because poor business data leads to poor operations and poor management decisions. For any business to succeed, especially now in this digital-first era, data is “the air your business needs to breathe.” If leadership at your organization is starting to consider what digital transformation means to your business or industry – and how your business needs to evolve to thrive in these changing times – they will likely assess the current state of the business and its technology. One of the most common outcomes is that management observes the business systems are “outdated” and “need to be replaced.” As a result, many businesses resolve to replace legacy systems with modern business systems as part of their digital transformation strategy.

Digital Transformation Starts with Data

More than likely, those legacy systems did a terrible job with your business data. They often permitted numerous incomplete master data records to be entered into the system. Now you have customer records which aren’t really customers. The “Bill-To’s” are “Sold-To’s”, the “Sold-To’s” are “Ship-To’s”, and the data won’t tell you which is which. You might even have international customers with all of their pertinent information in the NOTES section. Each system which shares customer master data with other systems contains just a small piece of the customer, not the complete record.

This may have been the way things were “always done”, or departments made do with the systems available, but now it’s a much larger problem, because in order to transform itself, a business must leverage its data assets. It’s a significant problem when you consider all the data your legacy systems maintain. Parts, assets, locations, vendors, materials, GL accounts: each suffers from different, slightly nuanced data quality problems. Now it hits you: your legacy systems have resulted in legacy data. And as the old saying goes, “garbage in, garbage out.” In order to modernize your systems, you must first get a handle on your data and your data practices.

Data Quality Processes

The data modernization process should begin with Master Data Management (MDM), because MDM can be an effective data quality improvement tool to launch your business’ Digital Transformation journey. Here’s how a data quality process works in MDM.

Data Validation – Master Data Management systems provide the ability to define data quality rules for the master data. You’ll want these rules to be robust — checking for completeness and accuracy. Once defined and applied, these rules highlight the gaps you have in your source data and anticipate problems which will present themselves when that master data is loaded into your shiny new modern business applications.
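As a minimal sketch of what such rules might look like in practice, here is a hypothetical completeness and accuracy check using pandas; the column names, sample records and rules are all invented:

```python
# A minimal sketch of master data validation rules using pandas; the
# column names, sample records and rules are all hypothetical.
import pandas as pd

customers = pd.DataFrame({
    "customer_id": ["C001", "C002", "C003"],
    "name":        ["Acme Corp", "", "Globex"],
    "country":     ["US", "US", None],
    "duns":        ["123456789", "12345", "987654321"],
})

# Completeness and accuracy rules; each yields True for passing records.
rules = {
    "name is required":      customers["name"].str.strip().ne(""),
    "country is required":   customers["country"].notna(),
    "DUNS must be 9 digits": customers["duns"].str.fullmatch(r"\d{9}"),
}

for rule, passed in rules.items():
    ok = passed.fillna(False).astype(bool)
    failing = customers.loc[~ok, "customer_id"].tolist()
    if failing:
        print(f"FAILED '{rule}': {failing}")
```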

Data Standardization – Master data thrives in a standardized world. Whether it is address, ISO, UPC or DUNS standardization, standards assist greatly with the final step in the process.

Matching and Survivorship – If you have master data residing in more than one system, then your data quality process must consider the creation of a “golden record”. The golden record is the best, single representation of the master data, and it must be arrived at by matching similar records from heterogeneous systems and grouping them into clusters. Once these clusters are formed, a golden record emerges which contains the “survivors” from the source data. For example, the data from a CRM system may be the most authoritative source for location information, because service personnel are working in CRM regularly, but the AR system may have the best DUNS credit rating information.
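Here is a deliberately simplified sketch of matching and survivorship; the records and the precedence rules are invented, and real MDM platforms use fuzzy matching and configurable survivorship rules rather than the crude key below:

```python
# A simplified sketch of matching and survivorship; records and
# precedence rules are invented for illustration.
import re

crm = {"name": "ACME Corp.", "address": "100 Main St, Hartford, CT", "duns": None}
ar  = {"name": "Acme Corporation", "address": None, "duns": "123456789"}

def match_key(record):
    """Normalize a name into a crude matching key."""
    key = re.sub(r"[^a-z0-9]", "", record["name"].lower())
    return re.sub(r"(corporation|corp|inc)$", "", key)

# Both records produce the key "acme", so they fall into one cluster.
assert match_key(crm) == match_key(ar)

# Survivorship: CRM is most authoritative for name and location fields,
# while the AR system wins for credit fields such as the DUNS number.
precedence = {"name": [crm, ar], "address": [crm, ar], "duns": [ar, crm]}

golden = {field: next(r[field] for r in sources if r[field] is not None)
          for field, sources in precedence.items()}

print(golden)
# {'name': 'ACME Corp.', 'address': '100 Main St, Hartford, CT',
#  'duns': '123456789'}
```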

Modernize Your Data and Modernize Your Business


These three data quality processes result in a radical transformation in the quality of master data, laying the foundation for the critical steps which follow. Whether or not your digital transformation involves system modernization, your journey requires clean, usable data. Digital transformation can improve your ability to engage with customers, but only if you have a complete view of who your customers are. Digital transformation can empower your employees, but only if your employees have accurate information about the core assets of the business. Digital transformation can help optimize operations, but only if management can make informed, data-driven decisions. Finally, digital transformation can drive product innovation, but only if you know what your products can and cannot currently do.

About Brian: Brian Berry leads the Microsoft Business Intelligence and Data Analytics practice at BlumShapiro. He has over 15 years of experience with information technology (IT), software design and consulting. Brian specializes in identifying business intelligence (BI) and data management solutions for upper mid-market manufacturing, distribution and retail firms in New England. He focuses on technologies which drive value in analytics: data integration, self-service BI, cloud computing and predictive analytics.
