Archive for Brian Berry

4 Cost Saving DevOps Tools on Azure

Technology leaders need to pay attention to DevOps. Yes, it’s a funny little name. Wikipedia states that DevOps is a compound of “development” and “operations” before explaining it as “a culture, movement or practice that emphasizes the collaboration and communication of both software developers and other information-technology (IT) professionals while automating the process of software delivery and infrastructure changes.”

Technology professionals know that identifying, tracking and resolving bugs costs money. If you are the one writing the software (and sooner or later, everyone will be), the bugs are on your dime. Good testing practices can help minimize bugs and costs, but sometimes bugs result from deployment practices. Indeed, the best technology operations focus on standardized, automated testing and release management practices. Under DevOps best practices, software teams treat software deliverables the way a manufacturing company treats finished goods – ruthlessly eliminating deviations with automation.

If you have tried and failed to create innovative solutions within your company by writing software, there could be several reasons why that happened. If you think you got the requirements right, and think the architecture was right, and your software developers understand the technology, then examine the process of delivering the software to the users.

Delivering Software Cost Effectively

The concept behind DevOps has gone by other names, including Continuous Integration (CI) and Application Lifecycle Management (ALM). Often, IT departments found ALM complex, or lacked the knowledge required to design a pipeline for software development. But the tools have continued to evolve, and the processes have simplified. Today, cloud vendors deliver DevOps services which are very hard to dismiss, and among the very best is Microsoft's Azure platform. Microsoft Azure provides many tools for standardizing, testing and delivering high-quality software.

Here are my four favorites:

Azure Resource Manager (ARM) Templates

Azure Resource Manager (ARM) templates are JSON documents which describe a complete set of Azure services. These documents can be saved and managed by IT operations personnel. This highlights a key cloud computing value proposition: the cloud offers technology as a standard service, and each service can be encapsulated to be brought up and down as needed.
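To make this concrete, here is a sketch of what one of these JSON documents looks like, built and serialized in a few lines of Python. The parameter and resource values are illustrative only:

```python
import json

# A minimal ARM template skeleton: the schema, one parameter, and a single
# storage-account resource. All values here are illustrative.
template = {
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "storageAccountName": {"type": "string"}
    },
    "resources": [{
        "type": "Microsoft.Storage/storageAccounts",
        "name": "[parameters('storageAccountName')]",
        "apiVersion": "2015-06-15",
        "location": "[resourceGroup().location]",
        "properties": {"accountType": "Standard_LRS"}
    }]
}

# Serialized, this document can live in source control and be deployed
# (and redeployed) on demand.
arm_json = json.dumps(template, indent=2)
```

Once a document like this is checked in, the environment it describes becomes a repeatable artifact rather than a hand-built snowflake.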

ARM templates can describe Infrastructure-as-a-Service offerings (i.e. virtual machines, networks and storage). This enables dev/test labs to be designed, templated, deployed and undeployed as needed. Technology teams which must provide a test environment for an upgrade no longer need to buy infrastructure to support it. Instead, they can define the environment as an ARM template: Azure allows you to build the environment once, extract the template for later use, and then destroy the resources.

ARM templates can describe Platform-as-a-Service offerings (i.e. Websites, Services, Databases). This enables the exact same concept, with even better results. In the end, you don’t even have any servers to manage or patch: the underlying infrastructure is standardized. This brings me to Deployment Slots.

Deployment Slots

A common best practice in delivering software is to have at least one Quality Assurance (QA) environment. This shadow environment should replicate production as closely as possible. However, in the PaaS world, we don't have control of the underlying infrastructure – that's great, it's standardized and we want to keep it that way. But we don't want to abandon the practice of performing final testing before deploying to production.

With deployment slots, we get the ability to create a number of “environments” for our applications and services, then switch them back and forth as needed. Let’s say you have a new software release which you want to ensure passes some tests before releasing to the user community. Simply create a slot called “Staging” for deployment, perform your tests, then switch to production.
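The mechanics can be modeled in a few lines. This is a toy sketch of the swap semantics, not the Azure API: each slot holds a deployed build, and a swap atomically exchanges what the slot names point at.

```python
# Toy model of deployment slots (not the Azure API): "production" and
# "staging" are named pointers to builds, and a swap exchanges them.
class WebApp:
    def __init__(self, production_build):
        self.slots = {"production": production_build, "staging": None}

    def deploy(self, slot, build):
        self.slots[slot] = build

    def swap(self):
        self.slots["production"], self.slots["staging"] = (
            self.slots["staging"], self.slots["production"])

app = WebApp(production_build="v1.0")
app.deploy("staging", "v1.1")   # push the new release to staging, test it there
app.swap()                      # promote: staging -> production
assert app.slots["production"] == "v1.1"
app.swap()                      # users complain? swap back -- no harm, no foul
assert app.slots["production"] == "v1.0"
```

The key property is that the swap is a pointer exchange, not a redeployment, which is why rolling back is instant.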

Uh oh – we missed something. We’re human after all. Users are reporting bugs and they liked it better the way we had it. Switch it back – no harm no foul.

There are some important things to consider before adding Deployment Slots to your DevOps pipeline. For example, if your application relies upon a database of some kind, you may need to provision a staging copy for your tests. You also need to be aware that connection strings are among the configuration values which can swap with the slot, unless configured to do otherwise.
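That sticky-setting behavior can be sketched too. In this toy model (again, not the Azure API), settings marked as "slot settings" stay with their slot during a swap, while everything else travels with the build:

```python
# Toy model of slot settings: keys listed in sticky_keys stay with their
# slot during a swap; all other settings travel with the app build.
def swap_settings(prod, staging, sticky_keys):
    new_prod, new_staging = dict(staging), dict(prod)
    for key in sticky_keys:
        if key in prod:
            new_prod[key] = prod[key]        # production keeps its own value
        if key in staging:
            new_staging[key] = staging[key]  # staging keeps its own value
    return new_prod, new_staging

prod = {"ConnString": "prod-db", "FeatureFlag": "off"}
staging = {"ConnString": "staging-db", "FeatureFlag": "on"}

# Mark the connection string sticky so production keeps pointing at prod-db
# after the swap, while the feature flag travels with the new build.
new_prod, new_staging = swap_settings(prod, staging, sticky_keys={"ConnString"})
assert new_prod == {"ConnString": "prod-db", "FeatureFlag": "on"}
assert new_staging == {"ConnString": "staging-db", "FeatureFlag": "off"}
```

Forgetting to mark a connection string sticky is exactly how a staging build ends up writing to the production database, so this is worth testing before the first swap.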

Deploy to Azure

I was recently treated to some excellent material on the Cortana Analytics Suite of products. Paying close attention (as I sometimes do), I noticed that the lab environment was prepared for me as an ARM template. I was directed to GitHub (an online public software repository) and told to push the button marked “Deploy to Azure”. When I did, I was brought to http://deploy.azure.com – and the URL included a reference to the GitHub repository I had just visited. The author of the software had placed there an ARM template describing the entire lab environment, and included a few parameters so that I could fill in the information from my Azure subscription. 20 minutes later, I had Machine Learning, Hadoop/Spark, Data Factory and Power BI resources at my fingertips. Later in the day, we deployed again, this time a simple Web app which consumed Advanced Analytics services. When I was finished, I simply deleted the resources – the entire day cost me less than $20 in Azure consumption. Deploying an app has never been easier.

Azure Container Service

No discussion of DevOps would be complete without mentioning Docker. Docker is a platform gaining popularity among developers and IT operations because it offers the consistency of virtual machines with far lower overhead. Essentially, Docker runs as a subsystem which hosts containers, and a container plays a role similar to an ARM template: a standardized, reproducible description of a running environment.

DevOps Tools on Azure

Linux or Windows, open source or closed, infrastructure or platform, TFS or GitHub – none of that matters anymore. No more excuses: Microsoft Azure provides outstanding DevOps tooling for modern application development. If you have not deployed your first application to Azure, let's talk. We can get you optimized quickly.

About Brian: Brian Berry leads the Microsoft Business Intelligence and Data Analytics practice at BlumShapiro. He has over 15 years of experience with information technology (IT), software design and consulting. Brian specializes in identifying business intelligence (BI) and data management solutions for upper mid-market manufacturing, distribution and retail firms in New England. He focuses on technologies which drive value in analytics: data integration, self-service BI, cloud computing and predictive analytics.

Three Steps to High Quality Master Data

Data quality is critical to business, because poor business data leads to poor operations and poor management decisions. For any business to succeed, especially now in this digital-first era, data is “the air your business needs to breathe”.  If leadership at your organization is starting to consider what digital transformation means to your business or industry – and how your business needs to evolve to thrive in these changing times, they will likely assess the current business and technology state. One of the most common outcomes management may observe is that the business systems are “outdated” and “need to be replaced”. As a result, many businesses resolve to replace legacy systems with modern business systems as part of their digital transformation strategy.

Digital Transformation Starts with Data

More than likely, those legacy systems did a terrible job with your business data. They often permitted numerous incomplete master data records to be entered into the system. Now you have customer records which aren't really customers. The “Bill-To's” are “Sold-To's”, the “Sold-To's” are “Ship-To's”, and the data won't tell you which is which. You might even have international customers with all of their pertinent information in the NOTES field. Each system which shares customer master data with other systems contains just a small piece of the customer, not the complete record.

This may have been the way things were “always done,” or departments made do with the systems available, but now it's a much larger problem, because in order to transform itself, a business must leverage its data assets. It's a significant problem when you consider all the data your legacy systems maintain. Parts, assets, locations, vendors, materials, GL accounts: each suffers from different, slightly nuanced data quality problems. Now it hits you: your legacy systems have resulted in legacy data. And as the old saying goes – “garbage in, garbage out.” In order to modernize your systems, you must first get a handle on your data and your data practices.

Data Quality Processes

The data modernization process should begin with Master Data Management (MDM), because MDM can be an effective data quality improvement tool to launch your business’ Digital Transformation journey. Here’s how a data quality process works in MDM.

Data Validation – Master Data Management systems provide the ability to define data quality rules for the master data. You’ll want these rules to be robust — checking for completeness and accuracy. Once defined and applied, these rules highlight the gaps you have in your source data and anticipate problems which will present themselves when that master data is loaded into your shiny new modern business applications.
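Here is a minimal sketch of what such rules might look like in practice. The field names and rules below are hypothetical, standing in for whatever your MDM platform lets you define:

```python
import re

# Hypothetical data quality rules for customer master records:
# completeness checks plus one accuracy check on US postal codes.
def validate_customer(record):
    issues = []
    for field in ("name", "country", "postal_code"):
        if not record.get(field):
            issues.append("missing " + field)            # completeness
    if record.get("country") == "US" and record.get("postal_code"):
        if not re.fullmatch(r"\d{5}(-\d{4})?", record["postal_code"]):
            issues.append("invalid US postal code")      # accuracy
    return issues

records = [
    {"name": "Acme Corp", "country": "US", "postal_code": "06103"},
    {"name": "Contoso", "country": "US", "postal_code": "ABC"},
    {"name": "", "country": "", "postal_code": ""},
]
report = {r["name"] or "<blank>": validate_customer(r) for r in records}
assert report["Acme Corp"] == []
assert "invalid US postal code" in report["Contoso"]
```

Running rules like these against source data before a migration surfaces exactly the gaps that would otherwise break your new business applications on load day.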

Data Standardization – Master data thrives in a standardized world. Whether it is address, ISO, UPC or DUNS standardization, standards assist greatly with the final step in the process.
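A standardization step can be as simple as normalizing case and expanding common abbreviations so that equivalent values compare equal downstream. A hypothetical address standardizer:

```python
# Hypothetical address standardizer: uppercase, trim, and expand common
# street abbreviations so equivalent addresses become identical strings.
ABBREVIATIONS = {"ST": "STREET", "ST.": "STREET", "AVE": "AVENUE", "AVE.": "AVENUE"}

def standardize_address(address):
    words = address.upper().split()
    return " ".join(ABBREVIATIONS.get(w, w) for w in words)

assert standardize_address("29 south main st.") == "29 SOUTH MAIN STREET"
assert standardize_address("29 South Main Street") == "29 SOUTH MAIN STREET"
```

Real MDM platforms ship far richer standardizers (postal services, ISO code tables, DUNS lookups), but the principle is the same: standardized values are what make the matching step possible.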

Matching and Survivorship – If you have master data residing in more than one system, then your data quality process must consider the creation of a “golden record”. The golden record is the best, single representation of the master data, and it must be arrived at by matching similar records from heterogeneous systems and grouping them into clusters. Once these clusters are formed, a golden record emerges which contains the “survivors” from the source data. For example, the data from a CRM system may be the most authoritative source for location information, because service personnel are working in CRM regularly, but the AR system may have the best DUNS credit rating information.
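Survivorship can be sketched as a field-by-field merge driven by source priority. The source systems and priorities below are illustrative, echoing the example above where CRM wins for location data and AR wins for credit data:

```python
# Sketch of survivorship: given a cluster of matched records, build the
# golden record field by field using a per-field source-priority list.
FIELD_PRIORITY = {
    "address": ["CRM", "ERP", "AR"],       # service reps keep CRM addresses fresh
    "duns_rating": ["AR", "ERP", "CRM"],   # credit data is maintained in AR
}

def golden_record(cluster):
    golden = {}
    by_source = {r["source"]: r for r in cluster}
    for field, priority in FIELD_PRIORITY.items():
        for source in priority:
            value = by_source.get(source, {}).get(field)
            if value:                      # first non-empty value survives
                golden[field] = value
                break
    return golden

cluster = [
    {"source": "CRM", "address": "29 SOUTH MAIN STREET", "duns_rating": None},
    {"source": "AR", "address": "PO BOX 12", "duns_rating": "2A3"},
]
assert golden_record(cluster) == {"address": "29 SOUTH MAIN STREET",
                                  "duns_rating": "2A3"}
```

The hard part in production is forming the clusters in the first place (fuzzy matching on standardized values); once clustered, survivorship is largely a matter of agreeing on these priorities with the business.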

Modernize Your Data and Modernize Your Business

These three data quality processes result in a radical transformation in the quality of master data, laying the foundation for the critical steps which follow. Whether or not your digital transformation involves system modernization, your journey requires clean, usable data. Digital transformation can improve your ability to engage with customers, but only if you have a complete view of who your customers are. Digital transformation can empower your employees, but only if your employees have accurate information about the core assets of the business. Digital transformation can help optimize operations, but only if management can make informed, data-driven decisions. Finally, digital transformation can drive product innovation, but only if you know what your products can and cannot currently do.

Intelligent Apps Are Friendly Apps

Whether you are a human or a computer, it pays to be friendly. When you buy something, are you more likely to buy from friendly or unfriendly salespeople? I like to spend time with people, but only if they are friendly. I am more apt to be generous with people who are friendly and am more easily persuaded by friendly people.

With technology, I love to interact with friendly software, or should I say “intelligent apps”. What makes for an intelligent app? Well, they are apps which exhibit a kind of machine intelligence which we associate with human intelligence. Not “super computers,” but computers and software which exhibit the same qualities I enjoy in friendly people. Let me get a little more specific:

  • People whom I have met before usually recognize the sound of my voice. Those who listen to what I say are the ones whom I admit to my inner circle. My friends may disagree with what I say, but I know that they listen and understand me.
  • People whom I have just met make some guesses about my mood and interact with me accordingly. My friends recognize my mood pretty quickly when they converse with me. My close friends always seem to respond to me in ways that are intended to bring me back to a positive frame of mind.
  • Most humans I come across recognize that, when I get to the “heart of the matter,” I am not performing surgery, or dealing with organs in any way. Only a literal minded person, or a super-computer, would come to that conclusion.
  • Finally, I often come across humans who do a great job of sharing knowledge with me. When I ask questions, they provide me with a lot of great information. I enjoy spending time with people who are knowledgeable, yet humble, and try to maintain contact with them professionally.

Of course, computers and software have historically not done any of these things well! It’s no wonder many people may find them infuriating. Our computers and software just haven’t conformed to our perceptions of intelligence – therefore, we don’t perceive them as friendly. But, longstanding ideas about what artificial intelligence (AI) looks like have inspired what are called “Cloud-Based Cognitive Services.” In other words, scientists and engineers have figured out that cloud computing, big data and data sciences have enabled the technologies needed to deliver AI.

Meet Your New Best Friends

I think the thing which is so attractive about “intelligent apps” is that I perceive them as being friendly.  Take Windows Hello, the facial recognition software in Windows 10 which recognizes your face as your login. I much prefer logging onto my Surface Pro 4 at home (which has Windows Hello) than my work laptop (which does not). My face never expires, does not need to be reset, and doesn’t need to be remembered! This is just a fabulous experience; it’s almost as though my tablet “knows me.”

Here is another example of intelligence which makes life easier: natural language processing in Power BI. Before natural language processing, I had to apply filters to my data, click around to find the thing I was looking for and format the graphs and charts on my report. With Power BI, I can simply type “Show Me Last Year's Sales by Territory” and the data appears. And this is just one example: Power BI dashboard authors do not even have to have created a report for this intelligent app to suggest one as a possible answer. When paired with the voice recognition capabilities of Cortana, it may seem that you have a digital assistant with limitless access to the dashboards, reports and data you need to run your business.

Cloud-Based Cognitive Services

Today’s modern applications are intelligent apps, and the hallmark of an intelligent app is human-like artificial intelligence. Most application developers do not have access to the AI algorithms needed to be truly effective. However, the giants of cloud computing have made these capabilities easy to acquire and integrate into your next product, service or business systems.

Microsoft Cognitive Services are a set of cloud Application Programming Interfaces (APIs) which application developers can embed into their modern apps to make them intelligent. There are APIs for visual recognition, speech recognition, text analytics, recommendations and much more. Perhaps you want to create an app which recognizes a face, or a user's voice. Perhaps you want to create an app which interacts with users differently based upon the user's perceived mood. Perhaps you want to make recommendations to customers on your website. It's all possible, and in fact, a lot easier than you might imagine.
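As a hedged sketch of what calling such a service involves: the code below only builds a request body in the style of a text-sentiment API; the exact endpoint, version and headers for your service are assumptions you would confirm in the API documentation, and nothing is actually sent over the network here.

```python
import json

# Build a batch of documents in the shape a text-sentiment API typically
# expects: an id, a language code, and the text to analyze.
def sentiment_request(texts, language="en"):
    return {
        "documents": [
            {"id": str(i + 1), "language": language, "text": t}
            for i, t in enumerate(texts)
        ]
    }

body = sentiment_request(["I love this product!", "The wait was far too long."])
payload = json.dumps(body)

# Sending it would be an HTTP POST of this payload to the service's
# sentiment endpoint, authenticated with your subscription key header.
assert len(body["documents"]) == 2
```

The point is that the hard part (the trained model) lives in the cloud; your app just ships text, images or audio to an API and consumes the scores that come back.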

BlumShapiro Consulting is a Microsoft Advanced Analytics partner, with expertise in building modern intelligent apps. And we are extremely friendly.

Applied Machine Learning: Optimizing Patient Care in Hospitals, Profitably

Why are so many industries exploring Machine Learning as a means of delivering innovation and value? In my view, the technology speaks to a primitive urge – machine learning is like having a crystal ball, telling you what will happen next. For a business, it can convey information about a customer before they introduce themselves. On a personal level, when I consider what I would like to have information about in advance, the first thing that springs to mind is obvious: my health. Am I about to get sick? How can I improve my wellness and overall health? If you are wearing a Fitbit right now, then you probably agree with me.

In my last blog, I shared some real-world examples of how the Hospitality Industry applies Machine Learning. What about health care and hospitals? While hospitals have similar challenges, in that they accommodate guests who stay overnight, the objectives in health care are quite different, and changing rapidly. The Affordable Care Act is driving new business models, incentivizing outcome-based reimbursement as opposed to volume-based reimbursement. Unlike hotels, today's hospitals are interested in ensuring their guests do not have to return, at least not in the short term. They also need to manage costs in a way they have not been incentivized to do in the past. Hospitals across the country are considering how predictive analytics can have a meaningful impact on operations, improving both patient health and the bottom line.

The cost savings opportunity for health care providers is startling. Here are just three examples:

Reducing Hospital Readmissions – in 2014, Medicare fined 2,610 hospitals $428 million for having high hospital readmission rates. Leaving actual fines aside, industry analysts estimate that the overall cost of preventable readmissions approaches $25 billion annually. As a result, hospital systems all over the nation are mobilizing to intervene, using ML to identify risk factors which are highly predictive of readmission. Carolinas Healthcare System, partnering with Microsoft, did just that. Using data from 200,000 patient-discharge records, they created a predictive model to deliver customized discharge planning, saving the hospital system hundreds of thousands of dollars annually. Read the article in Healthcare IT News.
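The idea behind such a model can be sketched with a simple risk score. The features and weights below are invented for illustration; a real model learns them from historical discharge records like the 200,000 used above.

```python
import math

# Illustrative readmission risk model: a linear score over discharge-record
# features, squashed to a probability. Weights are invented for illustration;
# a trained model would learn them from historical discharge data.
WEIGHTS = {
    "prior_admissions": 0.45,   # admissions in the past year
    "chronic_conditions": 0.30,
    "lives_alone": 0.45,        # 1 if no caregiver at home
}
BIAS = -2.0

def readmission_risk(patient):
    z = BIAS + sum(WEIGHTS[k] * patient.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))   # logistic squash to a 0..1 probability

low = readmission_risk({"prior_admissions": 0, "chronic_conditions": 0,
                        "lives_alone": 0})
high = readmission_risk({"prior_admissions": 3, "chronic_conditions": 2,
                         "lives_alone": 1})
assert low < 0.2 < high   # high-risk patients get flagged for intervention
```

Patients whose score crosses a threshold would be routed to the customized discharge planning described above, which is where the model turns into savings.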

Clinical Variation Management – Mercy Hospital is partnering with Ayasdi to find the optimal care path for common surgical procedures. Using knee replacement as an example, the Clinical Variation Management software helps hospital administrators find clusters of patient outcomes, then enables the exploration of those clusters in order to correlate a metric (i.e. length of stay) with a certain regimen or activity. Watch this video to learn how Mercy Hospital saved $50 million by applying Machine Learning to an extremely common procedure.

Improving Population Health – Dartmouth-Hitchcock, a healthcare network affiliated with Dartmouth College, is piloting a remote monitoring system for patients requiring chronic care. 6,000+ patients are permitting the hospital to collect biometric data (i.e. blood pressure, temperature, etc.) so that nurses and health coaches can monitor their vital signs, and machines can predict good days and bad days. Quite the opposite of hospitality: Dartmouth-Hitchcock is trying to keep its guests from needing to check in! Read more about the case study from Microsoft.

Are machines taking over for physicians? No. The patient-physician relationship remains (and I think will always remain) central to the delivery of personal health care. However, it seems clear that the ACA is providing significant rewards to health care providers who manage population risk well. Machines can help here: machine learning can find the risk hiding in the data.

Contact BlumShapiro Consulting to learn more about how Azure Machine Learning can curtail hospital readmissions, identify variations in common clinical procedures and improve patient population health.