Archive for November 29, 2016

What to do When There’s No App For That

It finally happened. You have a great idea to streamline a business process or improve customer engagement, only to discover that there’s NO app for that. What do you do? How do you build it? The answer is: it depends. It depends on the technical requirements, the target audience, your budget, and the platforms and devices you want to support.

Traditionally, there are three approaches to building a modern app:

Native Apps

A native app is a mobile application developed in a platform-specific programming language, such as C# for Windows, Java for Android or Objective-C for iOS, to target a specific device. Frameworks and tools like Xamarin allow you to develop native apps for multiple platforms from a single codebase in a single programming language, but such tools are not required to build a native app.

HTML5 Apps

HTML5 apps are applications delivered from the web that look and feel like native mobile applications. They run in the browser, and can be accessed like any other web page (open browser, type in the URL, etc.). A responsive website is an example of an HTML5 app.

Hybrid Apps

As the name implies, hybrid apps are part native app, part HTML5 app. Hybrid apps can be delivered via an app store and are stored on the device much like native apps. However, unlike native apps, hybrid apps are served up through a browser (more specifically, a browser control inside the application) and are developed using web technologies like HTML5 and JavaScript.

Recently a new type of app has entered the mobile ecosystem. These apps can be developed without writing any code, and can be made available to users within your organization. As such, I’ll refer to these apps as Organization Apps.

Organization Apps

Organization apps are internal line-of-business applications published to users within your organization. Apps developed with PowerApps from Microsoft are a great example of this type of app. With PowerApps, users within an organization can connect to business systems like SharePoint, OneDrive and MS Dynamics CRM to create powerful web and mobile applications which can be made available to other users within the organization.

Another great example is Composer 2 from AppGyver, which allows users to connect to business systems like Oracle and Salesforce to create applications for their enterprise.

With so many options for developing mobile applications it can be tough to decide which approach to take. Here is a simple chart covering just some of the many things that should be considered when making the decision on the development approach.

                   Native App            HTML5 App             Hybrid App            Organization App
Cost               High                  Moderate              Moderate              Low
Connectivity       Online/Offline        Mainly Online         Online/Offline       Mainly Online
Distribution       App Store             Web                   App Store            Internal to Organization
Device Access      Yes                   No                    Yes                  No
Development Time   High                  Moderate              Moderate             Low
Developer Skills   C#/Java/Objective-C   HTML/CSS/JavaScript   HTML/CSS/JavaScript  None
Cross Platform     No                    Yes                   Yes                  Yes


How We Can Help

Fully understanding the requirements for the app and how it will be used is critical to the app’s success. Understanding the security and accessibility needs of the app is also crucial. At BlumShapiro we have the expertise to bring your app idea to fruition. Talk to us about your app idea to get started.

About Matt:

As a senior in BlumShapiro’s Technology Consulting Group, Matt has over 7 years of experience with Microsoft .NET software application development, including solutions for web, client/server and mobile platforms.


Develop Workflows and Business Processes Without Developers

Companies are beginning to embrace technology at a higher level. However, many businesses still have processes that rely on a user manually entering information into an Excel spreadsheet. These spreadsheets range from simple lists to workbooks with “complex” calculations that account for crazy exceptions (like adding 2% to the total if the month ends on a Tuesday while raining). These lists usually have one or two gatekeepers who know the calculations by heart, so if those people leave the organization, the process becomes a headache for someone else or, worse, grinds the business to a halt. Bottom line: it can be bad for a company.
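A rule like the “2% on a rainy end-of-month Tuesday” exception above is easy to encode once it is written down. Here is a minimal Python sketch; the function name and the is_raining flag are illustrative assumptions (rain obviously has to come from some outside source):

```python
import calendar
from datetime import date

def adjusted_total(total, year, month, is_raining):
    """Apply the gatekeeper's rule: add 2% to the total
    if the month ends on a Tuesday while it is raining."""
    last_day = calendar.monthrange(year, month)[1]        # number of days in the month
    ends_on_tuesday = date(year, month, last_day).weekday() == 1  # Monday is 0
    if ends_on_tuesday and is_raining:
        return total * 1.02
    return total
```

Once the rule lives in code or in a workflow tool, it is documented and testable, and it is no longer locked inside one person’s head.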

Imagine taking your complex process and developing an application to take it over. This may seem like a steep challenge, but in today’s technological marketplace, there are services and apps that can help users create workflows and apps. Taking advantage of the technology to create these apps and workflows provides a huge benefit by getting the process knowledge and logic out of one person’s hands and into an automated process—making it accessible by anyone. This will also help to document the process and uncover any inefficiencies and deficiencies.

Microsoft, among other companies, is gearing application and workflow development toward power users and away from developers. These products are built with users in mind first, using a drag and drop interface, and most are intuitive enough that little direction is needed to develop these workflows. These users already know the process intimately and don’t need to learn C# or Java or whatever language they have never heard of. However, a technology specialist can still provide insight into workflows that might not have been thought of in the first place.

Workflow Automation Products

Below are a few Software as a Service (SaaS) workflow products that are geared towards power users.

Microsoft Flow

Flow is a drag and drop service used to create automated workflows in Office 365. These workflows can connect different applications and services, both enterprise (Office 365, SharePoint Online, Salesforce, CRM) and social (Twitter, DropBox, MailChimp). For example, you can easily create a Dynamics CRM entry from SharePoint list items; this happens to be a preconfigured template, requiring little effort to implement. Flow also lets you create your own custom workflows using its drag and drop interface.

PowerApps

PowerApps is a service that allows users to build Android, iOS and Windows apps without writing any code. It lets you connect to custom APIs, SharePoint, Excel, etc., and turn that data into an app; for example, you can easily create an app to list and fill out information. Users outside the office can use these apps on their phones, and all of the data stays up-to-date in a SharePoint Online list, Excel, etc. Like Microsoft Flow, PowerApps is hosted in Office 365 and has pre-built templates.

Power BI

Power BI is a service which can be used to build dashboards and data analytics reports using data from different sources within your organization. There are out-of-the-box connectors to programs like Excel, Project Online, Adobe Analytics, Salesforce, CRM and others. Power BI can improve processes in which data lives in different locations and someone has to pull it together manually. For example, say all project financials are located in an Excel spreadsheet over in Finance, yet the actual project costs are tracked by each individual team. To get all of this data into one report, someone would need to pull the data from at least two different sources and merge it into one report. Power BI can automate that by aggregating data from the different sources into one location. In our example, instead of an executive calling on someone to get the numbers, compile them and produce a report, those numbers are displayed in Power BI, which is always up-to-date and can even be drilled into or associated with key performance indicators (KPIs).
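The merge Power BI performs in that example can be pictured with a small sketch. The project names and figures below are invented purely for illustration:

```python
# Finance's spreadsheet: planned budget per project
budget = {"Alpha": 50000, "Beta": 80000}

# Costs tracked separately by each project team
actuals = [("Alpha", 12000), ("Beta", 30000), ("Alpha", 8000)]

# Merge both sources into a single report
report = {}
for project, planned in budget.items():
    spent = sum(cost for name, cost in actuals if name == project)
    report[project] = {"budget": planned,
                       "actual": spent,
                       "remaining": planned - spent}
```

Power BI does this kind of merge with its out-of-the-box connectors and refreshes it automatically, so nobody has to compile the numbers by hand.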

SharePoint Workflows

SharePoint workflows are also designed for a power user to create business processes (workflows). These can run on either SharePoint on-premises or SharePoint Online, and they work well for approvals and processes that reside in SharePoint. External site integration is possible out of the box, but it takes some technical knowledge. Some products fill that gap with a usable drag and drop interface and pre-built connectors; two top vendors are Nintex and K2. Just like Microsoft Flow, users can build workflows by dragging actions onto a canvas.

Which Workflow Tool is the Best?

We recommend that organizations evaluate all software and services that are available to determine which product best suits their unique business needs. They should consider questions such as: Which product integrates best with our existing software? What is the organization’s future software and technology strategy?

How can we help?

All organizations can benefit from streamlining processes or eliminating manual tasks. Workflows are easier to create than ever before; even so, the technical expertise of a consultant can be extremely beneficial. By leveraging our past experience and intimate knowledge of these products, we can determine the best technology for your project, implement the process, and guide and train your users to do this themselves.

About Brent:


Brent Harvey has over 10 years of software development experience with a specific focus on SharePoint, Project Server, C# and web development. Brent is an Architect at BlumShapiro Consulting and a Microsoft Certified Solutions Expert in SharePoint 2013, Solutions Associate in Windows Server 2012, Specialist in Developing Azure Solutions and Professional Developer in SharePoint 2010.


How Much is Your Data Worth?

Data is the new currency in today’s modern businesses. From the largest international conglomerate down to the smallest neighborhood Mom-and-Pop shop, data is EVERYTHING! Without data, you don’t know who to bill for services, or for how much. You don’t know how much inventory you need on hand, or who to buy it from if you run out. Seriously, if you lost all of your data, or even a small but vitally important piece of it, could your company recover? I’m guessing not.

“But,” you say, “We have a disaster recovery site we can switch to!”

If your racks melt down into a pool of heavy metals on the server room floor, then yes, by all means switch over to your disaster recovery site, because molten discs certainly qualify as a “disaster!” Databases hosted on private or public cloud virtual machines are less susceptible, but not immune, to hardware failures. But what about a failure of a lesser nature? What if one of your production databases gets corrupted by a SQL injection hack, cleaned out by a disgruntled employee, or accidentally purged because a developer thought he was working against the DEV environment? Inadvertent changes to data are no respecter of where, or how, such data is stored! And, sorry to say, clustering or other HADR solutions (High Availability/Disaster Recovery, such as SQL Server Always On technology) may not be able to save you in some cases. Suppose some data gets deleted or modified in error. These “changes,” be they accidental or on purpose, may get replicated to the inactive node of the cluster before the issue is discovered. After all, the database system doesn’t know whether it should stop such changes when the command to modify data is issued; how can it tell an “accidental purge” from regular record maintenance? So the system replicates those changes to the failover node, and you end up with TWO copies of an incorrect database instead of one good one and one bad! Worse yet, depending on your replication latency from the primary site to the disaster recovery site, and how quickly you stop the DR site from replicating, THAT may get hosed too if you don’t catch it in time!
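The point that replication faithfully ships mistakes to the failover node can be shown with a toy sketch. This is a conceptual illustration only, not how SQL Server replication is actually implemented:

```python
# Primary and failover node start out in sync
primary = {"orders": [101, 102, 103]}
replica = {"orders": [101, 102, 103]}

def apply_change(change):
    """The engine cannot tell an accidental purge from routine maintenance,
    so every committed change is shipped to the failover node as well."""
    change(primary)
    change(replica)

# Disgruntled employee, or "oops, I thought this was DEV"
apply_change(lambda db: db["orders"].clear())

# Result: TWO copies of an incorrect database instead of one good and one bad
```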

Enter the DATABASE BACKUP AND RESTORE, the subject of this article. Database backups have been around as long as Relational Database Management Systems (RDBMS). In my humble opinion, a product cannot be considered a full-featured RDBMS unless it can perform routine backups and allow for granular restore to a point in time. (Sorry, but Microsoft Excel and Access simply do not qualify.) Being a Microsoft guy, I’m going to zero in on their flagship product, SQL Server, but Oracle, SAP, IBM and many others have similar functionality. (See the Gartner Magic Quadrant for database systems for a quick look at the various vendors, including Microsoft, a clear leader in that quadrant.)

So what is a BACKUP? “Is it not simply a copy of the database?” you say, “I can make file copies of my Excel spreadsheet. Isn’t that the same as a backup?” Let me explain how database backups work and then you can decide the answer to that question.

First of all, you’ll need the system to create a FULL database backup. This is a file generated by the database server system, stored on the file system, the format of which is proprietary to the system. Typically, full backups are taken once per night for a moderately sized database, for example under 100 GB, and should be handled via an automated scheduling service such as SQL Agent.

Next, you’ll need TRANSACTION LOG backups. Log backups, as they are known, record every single change in the database that has occurred since the last full or log backup. A good starting point is scheduling log backups at least every hour, tightening down to every few minutes if the database is extremely active.

Now, to restore a database in the event of a failure, you need to do one very important step: back up the transaction log one last time if you want any hope of restoring to a recent point. To perform the actual restore, you’ll need what is known as the ‘chain of backups’: the most recent full backup and every subsequent log backup. During the restore, you can specify a point in time anywhere from the time of the full backup to the time of the latest log backup, right down to the second or millisecond.
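The ‘chain of backups’ logic can be sketched in a few lines of Python. Timestamps are plain numbers here for simplicity, and a real restore would of course be driven by RESTORE commands in the database engine; this only shows which files the chain must include:

```python
def restore_chain(backups, target):
    """backups: time-ordered list of (timestamp, kind), kind in {'full', 'log'}.
    Returns the most recent FULL backup taken at or before `target`, plus every
    subsequent LOG backup up to and including the first one covering `target`."""
    fulls = [b for b in backups if b[1] == "full" and b[0] <= target]
    if not fulls:
        raise ValueError("no full backup covers the requested point in time")
    base = fulls[-1]
    chain = [base]
    for ts, kind in backups:
        if kind == "log" and ts > base[0]:
            chain.append((ts, kind))
            if ts >= target:
                break  # this log backup contains the requested point
    return chain
```

For a nightly full backup at hour 0 and hourly log backups, restoring to hour 1.5 needs the full backup plus the first two log backups; everything after that point is not part of the chain.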

So we’re all set, right? Almost. The mantra of Database Administrators the world over regarding backups is this: “Your backups are only as good as the last time you tested a RESTORE.” In other words, if you haven’t tested your ability to restore your database to a particular point in time, you can’t be sure you’re doing it right. Case in point: I once saw a backup strategy where the FULL backups were written directly to a tape drive every night; first thing in the morning, the IT guys would dutifully eject the tapes and immediately ship them out to an off-site storage location. How can you restore a database if your backups are not on hand? Case two: the IT guys, not understanding SQL backup functionality and its benefits, used a third-party tool to take database backups but didn’t bother with the logs. After four years of this, they had a log 15 times the size of the database, so big, in fact, that there was no space available to hold its backup. About a year after I got the situation straightened out, with regular FULL and transaction log backups running, the physical server (virtualization was not common practice then) experienced a debilitating hardware failure and the whole system was down for three days. Once running again, the system (a financials software package with over 20,000 tables!) was restored to a point in time right before the failure. Having the daily FULL backups saved the financials system (and the company), and having the log backups saved many people the day’s work that would have been lost had we gone back to the latest FULL backup alone.

So, what’s your data worth? If your data is critical to your business, it is critical that you properly back up the data. Talk to us to learn how we can help with this.

About Todd: Todd Chittenden started his programming and reporting career with industrial maintenance applications in the late 1990s. When SQL Server 2005 was introduced, he quickly became certified in Microsoft’s latest RDBMS technology and has added certifications over the years. He currently holds an MCSE in Business Intelligence. He has applied his knowledge of relational databases, data warehouses, business intelligence and analytics to a variety of projects for BlumShapiro since 2011.


Internet of Things Modern Application Development

Over the past decade, modern application development has shifted from mainframe computing to personal PCs, and now to smartphones and cloud services. Each of these shifts required new software languages, new hardware and new application development solutions. The best illustration came with the 1990s “Internet boom,” which moved application development from mainframes and personal PCs to applications that run in a web browser. That shift also brought new tools such as Visual Studio, new languages like HTML/JavaScript, new architecture patterns such as MVC and new application life cycle processes like Agile/Scrum.

Then came the smartphone. This shift from personal PCs to small mobile devices such as iPhones forced modern application development to support multiple screen resolutions and to operate offline while remaining connected to cloud services. Our next shift is to the Internet of Things (IoT), once again giving new meaning to modern application development. Now applications need to be developed to run on different types of devices like thermostats, doorbells and small Bluetooth sensors. These applications must be secure, cloud ready and able to perform predictive analysis using machine learning. Below are my thoughts on this latest shift in modern application development:

Devices

The IoT shift in modern application development includes a multitude of devices, ranging from televisions to cameras to refrigerators to pretty much any device that plugs into an outlet. One of the more notable products in this space is the Amazon Echo, which uses voice recognition as its main interface and can control your light switches, thermostats and even your music collection. The Echo breaks away from previous modern application development: it uses voice as its interface, is always connected to the cloud and can connect with other IoT devices. This changes everything about how we think of modern application development. No longer is it about supporting multiple device resolutions, but rather about what data can be captured via the latest IoT devices and how that data can be used to improve our lives. This means we need new software tools, new cloud services, new analysis software and new machine learning algorithms.

These applications do not always include fancy user interfaces, as they are often function specific. For example, an IoT device could capture changes in temperature on a farm, take soil sample readings or even capture images and video of the fields. This data can then be sent to cloud services, where it can be analyzed and run through machine learning to produce an easy-to-understand update on the farm. The data from the disparate “things” needs to be collected in a common format to produce actionable insights. Of note, most of the “big data” collected and processed today is machine-to-machine. Cloud services help aggregate and display this data in ways humans can understand, analyze and act on.
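The “common format” idea can be sketched simply: every reading, whatever the device, is normalized into the same record shape before it is aggregated. The device names, metrics and values below are invented for the farm example:

```python
# Readings from disparate devices, already normalized into one record shape
readings = [
    {"device": "soil-1", "metric": "moisture", "value": 0.31},
    {"device": "soil-2", "metric": "moisture", "value": 0.27},
    {"device": "temp-1", "metric": "temperature_c", "value": 18.4},
]

def average(readings, metric):
    """Aggregate one metric across every device reporting it."""
    values = [r["value"] for r in readings if r["metric"] == metric]
    return sum(values) / len(values)
```

Because every record carries the same fields, new device types can be added without changing the aggregation code, which is exactly what cloud ingestion pipelines rely on.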

Cloud Services

Cloud services are at the heart of IoT. Devices are built to perform a simple purpose and leave all the complex user interfacing, analysis and thinking to the cloud. Cloud services such as the Azure IoT Hub provide both the software tooling and the service for a device to talk to the cloud and to other devices. For example, in the manufacturing industry, IoT devices built on the Azure IoT Hub can monitor the production line and equipment use, submitting that data to a cloud service where it can be analyzed to predict equipment maintenance needs.

With this shift to IoT, modern application software is developed to capture data from a range of sensors, submit that data to cloud services and then process it using analytics services, such as business intelligence dashboards, to deliver timely, relevant, role-based information.

Machine Learning

So what is the point of these IoT devices in our homes, cars and workplaces, capturing data and sending it to the cloud? Well, that’s what machine learning is all about. We now need to develop algorithms that can learn from IoT data. For example, home IoT devices using machine learning will learn the normal patterns in your house and only notify you when there is a disruption, such as the lights staying on past a normal pattern or your windows being left open while you are away. Machine learning is one of the most important aspects of IoT; without it, all we would have is raw data in a cloud service with no meaningful way to use it.
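A minimal version of “learn the normal pattern and notify on disruption” is a simple statistical threshold. The numbers below are invented, and real systems use far richer models, but the shape of the idea is the same:

```python
from statistics import mean, stdev

def learn_pattern(history):
    """Learn the 'normal' value from past observations."""
    return mean(history), stdev(history)

def is_disruption(value, mu, sigma, k=3.0):
    """Notify only when a reading is more than k standard deviations from normal."""
    return abs(value - mu) > k * sigma

# Hour at which the lights usually go off, learned over several days
lights_off = [22.5, 23.0, 22.8, 23.2, 22.6]
mu, sigma = learn_pattern(lights_off)
```

A reading of 23.0 (11 p.m.) falls inside the learned pattern and stays silent, while the lights still being on at 3 a.m. trips the notification.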

BlumShapiro Consulting is a Microsoft Advanced Analytics partner, with experience building modern IoT apps. 


About Hector: 

Hector Luciano, Jr. is a Consulting Manager at BlumShapiro, a Microsoft Gold Partner focusing on SharePoint, Office 365, mobile technologies and custom development solutions. Hector is an active member of the SharePoint community and has a breadth of experience with Microsoft .NET technologies. With a focus on software application development, Hector has worked on various projects, including architecting and designing solutions for web, client/server and mobile platforms. He has worked closely with business owners to understand their business processes, then design and build custom solutions. Hector currently holds the Microsoft Certified Solution Developer (MCSD) and Microsoft Certified Professional Developer (MCPD) certifications.