The Business Value of Microsoft Azure – Part 2 – Active Directory

This article is part 2 of a series of articles focusing on the Business Value of Microsoft Azure. Microsoft Azure provides a broad set of cloud-based technologies that can enable organizations in many ways. Rather than focusing on the technical aspects of Microsoft Azure (there’s plenty of that content out there), this series focuses on business situations and how Microsoft Azure services can address them.

There are many risks that businesses and governmental entities face when it comes to data loss. Most have taken steps such as encrypting hard drives, enforcing password change policies and limiting the use of consumer-oriented applications like Facebook. However, one of the biggest gaps has emerged as a result of the prevalence of Software as a Service (SaaS) solutions. These cloud-based systems require little to no IT involvement to get up and running, and a consequence of this ease of deployment is that countless new opportunities for compromise emerge.

Let’s take something like a file sharing service. Whether it’s Dropbox, Box, or some other solution, an individual in an organization can quickly set it up with a username and password and begin sharing files inside or outside the organization. In most cases this isn’t done by a nefarious user with malicious intent. Rather, it’s set up to address a specific business need. Perhaps it’s a new product catalog and price list that’s too large to send to distributors via email. While this sounds fine on the surface, let’s fast-forward six months…

Six months after the service went into use, a couple dozen fellow employees are using it, each with an individual username and password. The CIO finally becomes aware of this when one of his employees shares a link with him that takes him to a file in a Box account. He immediately spots a problem – what happens if one of these employees leaves?

  1. We don’t know that they are using the service
  2. We can’t terminate their access
  3. We have no ability to enforce any password complexity or change frequency requirements

Now, there are a variety of solutions to this problem. First, the CIO could block access to Box and prevent users from using the service. OneDrive for Business, part of Office 365, could be implemented as a secure, enterprise alternative. But what if the CIO doesn’t want to take away this service, just gain more control over it? Is there a solution?

Enter Microsoft Azure Active Directory. Microsoft Azure Active Directory provides a variety of services to the enterprise that can help our CIO. First and foremost is the Access Panel portal for Single Sign-On (SSO) access to SaaS applications. This allows the CIO to configure access to Box so that users authenticate against Box with their standard Active Directory credentials. It also means that when an employee leaves or is terminated and their Active Directory account is disabled…so too is their access to Box!
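
To make the deprovisioning point concrete, here is a minimal sketch (not part of the original solution described here) of how an offboarding script might block sign-in for a departing employee using the Microsoft Graph API; once the Azure Active Directory account is disabled, federated SSO into connected SaaS apps like Box stops working as well. The access token acquisition and the user principal name are assumed placeholders.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"

def disable_user(access_token: str, user_principal_name: str) -> None:
    """Disable an Azure AD account so SSO-based access to connected SaaS apps is cut off.

    Assumes `access_token` was already acquired (for example via MSAL) with
    permission to update users, and that SaaS apps such as Box are federated
    through Azure Active Directory for sign-in.
    """
    response = requests.patch(
        f"{GRAPH}/users/{user_principal_name}",
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
        json={"accountEnabled": False},  # blocking sign-in also blocks SSO to Box, etc.
        timeout=30,
    )
    response.raise_for_status()

# Hypothetical offboarding call:
# disable_user(token, "departing.employee@contoso.com")
```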

In addition to the Access Panel, another key feature is Azure Active Directory Cloud App Discovery. This service deploys a small agent to end-user workstations that monitors access to various cloud services. This is a huge benefit to IT organizations because they can:

  • Get a summary view of the total number of cloud applications in use and the number of users using them
  • See the top cloud applications in use within the organization
  • See the top applications per category
  • See usage graphs for applications that can be pivoted on users, requests or volume of data exchanged with the application
  • Drill down into specific applications for targeted information
  • View which users are accessing which apps
  • Easily proceed to integrate an application with Azure Active Directory

There are many other reasons for organizations to look at Azure Active Directory, but this is the first one that comes to mind whenever I think about security risks and simple ways to reduce exposure while still providing end users with access to the productivity applications they want.

As a partner with BlumShapiro Consulting, Michael Pelletier leads our Technology Consulting Practice. He consults with a range of businesses and industries on issues related to technology strategy and direction, enterprise and solution architecture, service oriented architecture and solution delivery.

5 Critical Success Factors for Initiating Master Data Management

If you are an organization looking to improve data quality and business operations through the development of Master Data Management (MDM), confusion about how to implement it can have drastic consequences. While business leaders see and live with the challenges, they are likely unclear about the details, and they turn to technology leadership for guidance. Technology leaders are more apt to see immediately how MDM transforms an organization from one with costly Information Gaps to one which operates more seamlessly through its ability to create Data Enabled Value.

But if you have never implemented MDM, your preconceived notions about data quality can become a critical obstacle to realizing the benefits on a timeline which the business would accept.  Alternatively, perhaps you simply don’t know where to start.  Phase 1 is the right time to introduce your organization to fundamental MDM concepts, setting expectations for what the ownership framework for master data will look like.  To ensure a successful introduction of MDM and Data Quality practices to your organization, follow these steps.

Don’t Create Another Information Silo – the goal of MDM is to break down barriers to clean, high-value data assets. But many technologists are tempted to see the MDM system as “just another database” that can be designed like any other custom database in the organization. MDM is not just another database. It has its own set of rules and best practices for how to design a simple, clean data model which can be managed by the business. Finally, it is not necessary to spend big bucks on an MDM database – Microsoft’s solution (Master Data Services) comes bundled with two editions of SQL Server. Really, don’t build your own.

Don’t confuse CRM with MDM – or any other business system you currently use, for that matter! Many CRM software companies (and some ERP vendors) like to promote MDM capabilities in their software. But another goal of MDM is to improve the overall quality of the master data and create a space for the Data Governance group to manage that data quality. If you do not extract master data from its source, then the governance group must manage the data in a process-oriented system. This is akin to tying one arm behind their back. In the end, governance becomes hindered by the process requirements of a source system. Instead, create a “data jurisdiction” (which is what MDM is), extract the data from sources into that jurisdiction and govern inside that jurisdiction. This brings me to my next point.

Flip the Script – a common frame of reference for data quality is “garbage in, garbage out.” This frame of reference helps technology leaders explain to the business why data assets have historically been ill-suited for future use: the data entered the system in poor form, and now exits the system in poor form. The conclusion many draw from this is that in order to improve data quality, one must enforce better data quality rules at the outset, or “scrub” the data in transit from the source. Wrong answer! Flip the script: bring the data “as-is” from the sources. Permit your data stewards and governors to see the data as it exists in the enterprise. This will lead to a deeper understanding of the very real process “frictions” in play. Then, use an MDM toolset to enrich, match and harmonize the data in the MDM solution itself. By switching the frame of reference, you can accelerate the project plan and get the solution and data into the hands of stakeholders who are empowered to take action.
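
To make “bring it in as-is, then harmonize” concrete, here is a minimal sketch, deliberately independent of any particular MDM product, that lands raw customer records from two sources unchanged and then flags likely duplicates inside the MDM/staging layer with a simple name-similarity rule. The field names and the 0.85 threshold are illustrative assumptions.

```python
from difflib import SequenceMatcher

# Records landed "as-is" from two source systems -- no scrubbing in transit.
crm_customers = [
    {"source": "CRM", "id": "C-101", "name": "Acme Industries, Inc."},
    {"source": "CRM", "id": "C-102", "name": "Globex Corp"},
]
erp_customers = [
    {"source": "ERP", "id": "900441", "name": "ACME INDUSTRIES INC"},
    {"source": "ERP", "id": "900512", "name": "Initech LLC"},
]

def normalize(name: str) -> str:
    """Light normalization applied inside the MDM jurisdiction, not at the source."""
    return "".join(ch for ch in name.lower() if ch.isalnum() or ch.isspace()).strip()

def likely_matches(left, right, threshold=0.85):
    """Pair records whose normalized names are similar enough for steward review."""
    pairs = []
    for a in left:
        for b in right:
            score = SequenceMatcher(None, normalize(a["name"]), normalize(b["name"])).ratio()
            if score >= threshold:
                pairs.append((a, b, round(score, 2)))
    return pairs

for crm_rec, erp_rec, score in likely_matches(crm_customers, erp_customers):
    # In practice these pairs go to data stewards for survivorship decisions.
    print(f"{crm_rec['id']} <-> {erp_rec['id']}  similarity={score}")
```

A real MDM toolset (SQL Server’s Data Quality Services, for example) provides far richer matching and enrichment; the point is simply that the harmonization happens inside the MDM jurisdiction rather than by scrubbing data in transit from the source.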

Formulate the Data Governance Program – Standing up an MDM system is a project, and should be managed as such. Data Governance is an ongoing program, and when the project is successfully concluded, the organization must take ownership of the “Data as an Asset.” Master Data, by its very nature, implies Shared Ownership, and each of the data stakeholders is accustomed to managing their own piece of the larger whole. Differences will inevitably arise. If you don’t know how to get started, work with an experienced MDM delivery team or borrow an existing framework.

Save Operational MDM for Phase 2 – Operational MDM refers to a solution’s capability to distribute and synchronize high-quality data back to the sources of that data. Data cleanup of a source system, such as an ERP system, is a common goal, one which is widely articulated by business and technology leaders. I see the virtue here, and indeed have completed very successful projects where master data flows back to operational systems to drive even more value. However, these types of projects are lengthy, because a Data Bus Architecture must be implemented alongside MDM to route and transmit data back to systems reliably. Further, they tend to neglect a crucial aspect of MDM – all of the Unsanctioned Master Data which resides in spreadsheets, databases or in people’s heads. There is a ton of value created simply by establishing the MDM solution as an authoritative source of high-quality, high-value data assets. Don’t make ERP cleanup part of your initial goal, but do create a data model which will make it easy to do so in a later phase. MDM projects which defer this requirement show value to the business very quickly, and get funding for later projects more easily.

Master Data Management is a powerful remedy for a number of broad Information Gaps in an organization.  The challenge of implementation is understanding what the goals are, managing expectations, and building a Data Governance and Stewardship culture.  Make sure you understand what can be accomplished quickly with MDM, and focus on those goals in the early stages. 

7 Steps to Software Testing Without Pain

Testing code is not something that gets most developers excited, nor is it something a business owner is excited about pouring hours of effort and cost into. The reason is simple: a test produces no real value on its own; it only confirms the value of the feature being tested. Nonetheless, everyone on the team, both the team building the software and the team ready to use it, assumes that the code will be tested, right? In other words, testing is something we know is important and that someone else should do.

Why Test?

Yes, of course, tests need to be conducted for obvious reasons. Testing maintains a high level of quality in the product and ensures that software does what it is expected to do. But equally important is the ability to increase the velocity with which a team can release code to the people who will actually use it. In my experience working with clients, technology teams are frequently perceived as “too slow.” These teams labor over the design of new features requested by the business, and then wring their hands over whether the code is working and whether it impacts any other aspect of the software. In the end, these teams sit on an ever-increasing pile of work whose delivery to the users (the client or customer) slows to a crawl. The root cause? They can’t be sure it works because they don’t have a test plan.

So how can you implement a test plan and get your valuable software out the door?  Follow these steps:

  1. Write Unit Tests – Unit tests are functions written by developers to exercise a component the team has written. Unit tests can be run automatically. When planning for unit tests, it is important to design the system so that your code is separated into testable units. That leads me to the next point.
  2. Know What Needs Unit Tests, and What Doesn’t – Oftentimes I see unit tests that test database functions, such as inserts, updates and selects. You shouldn’t need to test your data access code. Instead, you should be testing the components that perform the logic of your solution, using data provided to them. For example, my team uses Microsoft’s Entity Framework libraries for data access and separates business logic into a set of components that follow a “Unit of Work” pattern. EF itself does not need to be tested; a third party (Microsoft) wrote it. However, the code which runs the Unit of Work, your custom logic, does (a minimal sketch follows this list).
  3. Create a Lab Environment – A lab environment is an environment where you test – call it your test area, or “QA” for Quality Assurance. Make sure you have one and make sure it duplicates the target environment closely. For on-premises software, the cost of provisioning a second environment used only for testing often prohibited teams from creating one, because it doubles the initial hardware investment of the project, doubles the software license investment and doubles the maintenance time required. With cloud computing, Infrastructure as a Service can be leveraged to provision a test environment quickly, then turn the environment off when testing has been completed. This saves money and reduces your attack surface, because you don’t have machines running continually that can be hacked.
  4. Document Your Acceptance Criteria – When you were in school, did you ever ask your teacher, “Will this be on the test?” For me, it has always been easier to focus on a task if I know what is expected of me. By documenting the expected behavior of the software, you’re giving the team the answers to the test in order to help them be successful. If you are a Microsoft shop and you have MSDN licenses for your team, I recommend you check out the test planning tools found in Visual Studio Online. They allow you to document planned tests and success criteria for each feature and work item. Developers working on a feature can see these plans right in Visual Studio, where the work will be performed.
  5. Record Manual Tests, Then Run Them – Microsoft Test Manager, which comes with Visual Studio 2013, has the ability to connect to Visual Studio Online and review, create and edit test plans. One little-known feature is the Test Runner, which can record tests, and even steps within a test, for future runs. This is extremely handy for regression tests, which ensure that new functionality does not break existing functionality. Once tests are recorded, you should find that you have much more confidence in your changes, and testing takes minutes, not days.
  6. Automate Your Builds – When a developer creates a new piece of functionality for the team, they write code to perform the work and then integrate that code into the existing solution. If you are using a source code control system (and I hope you are), this means “checking the code in.” What should happen next? If your intent is to get the feature into the hands of someone who can use it, then what are you waiting for? Build it, test it, deploy it. Visual Studio Online provides a Hosted Build Controller, which is essentially a build service for code checked into projects managed there. With the VSO Build Controller, you can ensure that new code compiles properly, and even run unit tests directly after a successful build to ensure a high level of quality.
  7. Generate Coded UI Tests – A Coded User Interface test allows you to fully automate tests that would otherwise require a person to supervise. As in steps 1 and 4, you have to devise the test and walk through it one more time. This is easy with Visual Studio 2013, which has tools to record actions taken by a tester and then write the test out in code for you. Once the test is recorded in code, it can be run automatically in a lab environment by the Microsoft Test Runner.
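
The tooling above is .NET (Entity Framework, Visual Studio), but the idea in steps 1 and 2 is language-neutral. Below is a minimal sketch in Python: business logic is separated from data access behind a small injected repository, so the unit test exercises only the custom logic and feeds it data directly, with no database involved. The class names and the discount rule are illustrative, not taken from any real project.

```python
import unittest

class OrderService:
    """Custom business logic under test; data access is injected, not built in."""

    def __init__(self, repository):
        self._repository = repository

    def total_for_customer(self, customer_id: str) -> float:
        """Sum order amounts, applying a 10% discount over $1,000 (illustrative rule)."""
        orders = self._repository.orders_for(customer_id)
        total = sum(order["amount"] for order in orders)
        return total * 0.9 if total > 1000 else total

class FakeOrderRepository:
    """Stands in for the real data access layer (Entity Framework, an ORM, etc.)."""

    def __init__(self, orders):
        self._orders = orders

    def orders_for(self, customer_id):
        return [o for o in self._orders if o["customer_id"] == customer_id]

class OrderServiceTests(unittest.TestCase):
    def test_discount_applied_over_threshold(self):
        repo = FakeOrderRepository([
            {"customer_id": "c1", "amount": 800.0},
            {"customer_id": "c1", "amount": 400.0},
            {"customer_id": "c2", "amount": 50.0},
        ])
        self.assertAlmostEqual(OrderService(repo).total_for_customer("c1"), 1080.0)  # 1200 * 0.9

    def test_no_discount_under_threshold(self):
        repo = FakeOrderRepository([{"customer_id": "c1", "amount": 50.0}])
        self.assertAlmostEqual(OrderService(repo).total_for_customer("c1"), 50.0)

if __name__ == "__main__":
    unittest.main()
```

Because the data access layer itself is not under test, a suite like this runs in seconds and can be executed automatically on every build, which is exactly what step 6 calls for.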

Testing does not need to be painful. It requires commitment from the team to writing testable code, a ruthless pursuit of automation and a recognition that software delivers no value until it is tested and deployed. If you are a Microsoft shop, I highly recommend you talk to BlumShapiro about how to improve your testing processes. Visual Studio Online and Visual Studio 2013 provide all of the tools you need to get this process up and running. 

The Business Value of Microsoft Azure – Part 1 – BizTalk Services

This article is part 1 of a series of articles focusing on the Business Value of Microsoft Azure. Microsoft Azure provides a broad set of cloud-based technologies that can enable organizations in many ways. Rather than focusing on the technical aspects of Microsoft Azure (there’s plenty of that content out there), this series focuses on business situations and how Microsoft Azure services can address them.

Within business there are often defining moments that catapult an organization to the next level of profitability and growth. These can come through careful planning and an effective strategy, but can also come about unexpectedly. Take, for example, a distribution organization that fulfills orders for a variety of manufacturers. The organization has a variety of processes by which orders are transmitted to these manufacturers, but most rely on fairly one-off, manual methods because the volume of orders for each company is typically pretty low.

One fateful day a large manufacturer contacts this distributor and offers them some business. This business, however, requires the distributor to accept order information through EDI and handle a jump in typical monthly order volume from 1,500 to 15,000 orders.

As the distributor evaluated this opportunity, they needed to consider the staffing required to fulfill the orders, the warehouse space to house the products and the technology implications. The distributor had a fairly well-run warehouse management system (WMS) and processes which allowed them to bring in relatively unskilled, temporary labor to quickly meet whatever human resource demands emerged. Fortunately, the distributor owned its facility, which had a significant amount of unused square footage. This left only the technology challenge to address.

The distributor did not want to invest significantly in IT staff, servers and other infrastructure that might not be needed in the near future. They were risk-averse and didn’t want to bet the future of the company on this one new client. At the same time, it was a significant opportunity that couldn’t be passed up.

Enter Microsoft Azure BizTalk Services

BizTalk Services provides a robust and extensible solution for trading partner management and Electronic Data Interchange (EDI) processing. The distributor quickly implemented an interface that mapped standard EDI data to an internal XML format that could feed into their on-premises WMS. This integration was achieved using another Microsoft Azure service, the Azure Service Bus, which is a generic, cloud-based messaging system for connecting just about anything (applications, services and devices) wherever they are.
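
As an illustration of the Service Bus piece of that architecture, here is a minimal sketch using the current Python azure-servicebus SDK (the original solution used BizTalk Services for the EDI translation itself). An order that has already been mapped from EDI to the internal XML format is dropped onto a queue, and an on-premises listener drains the queue and hands each order to the WMS. The connection string, queue name and XML payload are placeholders.

```python
# pip install azure-servicebus
from azure.servicebus import ServiceBusClient, ServiceBusMessage

# Placeholders supplied by your Azure Service Bus namespace configuration.
CONNECTION_STR = "<service-bus-connection-string>"
QUEUE_NAME = "wms-orders"

# An order already translated from EDI (e.g. an 850 purchase order) to internal XML.
order_xml = "<Order><Number>15001</Number><Sku>ABC-123</Sku><Qty>12</Qty></Order>"

with ServiceBusClient.from_connection_string(CONNECTION_STR) as client:
    # Cloud side: publish the translated order to the queue.
    with client.get_queue_sender(QUEUE_NAME) as sender:
        sender.send_messages(ServiceBusMessage(order_xml, content_type="application/xml"))

    # On-premises side (normally a separate listener process): drain the queue
    # and hand each order to the warehouse management system.
    with client.get_queue_receiver(QUEUE_NAME, max_wait_time=5) as receiver:
        for message in receiver:
            print("Order ready for WMS import:", str(message))
            receiver.complete_message(message)  # remove it from the queue once processed
```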

By using cloud-based technologies, the distributor was able to take advantage of a market opportunity without a significant investment in new on-premises hardware and software. The cost of the Azure services is based on usage, so if the business were to dry up they wouldn’t have an entire, worthless EDI infrastructure lying around.

As a partner with BlumShapiro Consulting, Michael Pelletier leads our Technology Consulting Practice. He consults with a range of businesses and industries on issues related to technology strategy and direction, enterprise and solution architecture, service oriented architecture and solution delivery.