Archive for Microsoft

CC’d By Mistake: Helpful Outlook Tip Series – Ignore Feature

Have you ever been CC’d on emails that you should not have been copied on in the first place? The old way of handling these emails was to simply delete them as they came in. However, in larger email chains there may be numerous replies, and the time spent deleting them can quickly add up, resulting in lost productivity. A faster and easier way to cut down on these emails is to highlight the message in your Outlook inbox and then click the Ignore button in the ribbon under the Home tab.


If it’s your first time using this feature, you will see an additional prompt; just click the Ignore Conversation button to delete the email conversation.

Now, if there are any future replies to that email thread, Outlook will automatically send them to your Deleted Items folder. If you accidentally ignored an email conversation, right-click the email in your Deleted Items folder and click the Ignore button again to restore the conversation to your inbox.

This is one more tip on how to use the features of Office 365 to be more efficient and productive with your time. If you aren’t already using Outlook, we recommend trying Office 365, which gives you access to the latest Office version (Office 2016). Learn more about how BlumShapiro Consulting can help implement Office 365 for your organization.

About David: As a senior consultant with BlumShapiro Consulting, David coaches a range of businesses on the latest Microsoft cloud solutions. He believes in keeping technology simple and easy to use for businesses. David has experience across the lifecycle of technology implementations, from assessments and selections to implementation and project management, and he specializes in the SMB/non-profit market for Microsoft Office 365 and Dynamics CRM. David received a Bachelor of Science in Information Systems Management from Quinnipiac University. Prior to joining BlumShapiro in 2009, he was a Desktop Support Specialist at Aetna in Middletown, CT. He is a Microsoft Certified Professional (MCP) in Office 365 and Dynamics CRM and a Yammer Certified Community Manager.

Stay up to date on the latest technology trends with our eNewsletter Technology Talks. Sign up today! 

On the Leading Edge of New Technology

Being on the leading edge of any technology can be exciting, but it can also be frustrating and even costly. There is an inherent risk in adopting technology that is new: a lack of community support or documentation when something goes wrong is just one of the issues that can arise. However, there are benefits to being an early adopter. For example, working hands-on with a new technology is the best way to understand how it works. As technology consultants, we view it as our job to understand what’s coming so we can advise our clients with a clear eye to the future.

Scenario

A client asked us about alternatives to their current Remote Desktop Services (RDS) implementation, which was being hosted by a third-party vendor. There were a few issues with their existing setup, namely cost and the burden of maintaining multiple logins, and they didn’t have any type of domain or user directory. After exploring a few different RDS deployment scenarios, they ultimately decided to use a preview version of Azure Active Directory Domain Services (Azure AD DS) on Azure virtual machines.

They really liked the idea of using Azure AD DS because of the promised benefits: no servers (on-premises or in the cloud) to maintain, a simplified user interface, and so on. We shared our assessment of the risks and unknowns of using an untested technology, but the client wholeheartedly accepted those risks because there were so many more upsides to using Azure AD DS for their specific setup. So, we set out to implement Remote Desktop Services using Azure Active Directory Domain Services…and we learned a couple of things along the way which we are happy to share with you.

Sometimes the Leading Edge is the Bleeding Edge

The first lesson learned was that with Azure AD DS, you cannot be added as a Domain Admin or Global Admin. Instead, Azure AD DS uses its own security group, called AAD DC Administrators, which you have to create yourself. That’s a good thing to note when dealing with Azure AD DS, and it led us right to our second lesson.

When trying to use the Licensing Manager to add the license server to the Terminal Server License Servers group in AD, a permissions error popped up:

The computer account for license server [ServerName] cannot be added to the Terminal Server License Servers group in Active Directory Domain Services (AD DS) because of insufficient privileges.


Thinking back to that security group, I thought, “I am not a Domain Admin; I cannot be a Domain Admin.” I felt a little helpless. Thankfully, the computer didn’t need to be added to the group, since all of the RDS servers were on the same domain. But I still couldn’t help feeling like something might be amiss later.

As a Microsoft partner, we have top-tier access to Microsoft support, who recommended a few solutions to this issue…each of which ran into the same permissions roadblock.

When the Microsoft support engineer mentioned this was the first he had heard of someone trying this, I thought I must be a pioneer, attempting this while Azure AD DS was still in beta. But one thing was for sure: the Azure AD DS team liked the idea that someone was trying out an RDS implementation with it.

When you work with a beta version, or when you install something without waiting for Service Pack 2 to be released, you are blazing a new trail. There is a thrill in being the first person to try something new, and a long-standing honor in the tech world in being the first to figure something out.

In the end, after another hiccup or two, the rest of the Remote Desktop Services deployment went well, without any additional permission issues. And the result showed us that Remote Desktop Services does work well with Azure Active Directory Domain Services and was able to accomplish the client’s goals.

Once the beta for Azure Active Directory Domain Services is complete, I’m wondering if RDS will be on the list of supported technologies. Then I will feel like a true trailblazer cutting a path for others to follow.

Our experience with Microsoft tools gives us an inside track and an ability to work with these new technologies because we deeply understand the underlying platform. While being on the bleeding edge of technology can be risky, having experts to help guide you, navigate any issues and provide needed support can help mitigate some of those risks. And in the end, the benefits to your organization can outweigh any roadblocks encountered along the way.

About Brent:


Brent Harvey has over 10 years of software development experience, with a specific focus on SharePoint, Project Server, C#, and web development. Brent is an Architect at BlumShapiro Consulting, working on projects across varied industries (banking, manufacturing, health care, etc.). He is a Microsoft Certified Solutions Expert in SharePoint 2013, a Solutions Associate in Windows Server 2012, a Specialist in Developing Azure Solutions, and a Professional Developer in SharePoint 2010.


Adding User Configurations to an Analysis Server Cube

Part 4: Default Configurations with Overrides

In the first three parts of this series, we looked at a few methods of getting user-controlled configuration values into a cube such that a value can be changed simply by changing a value in a table. (I could have easily labeled these as “data-driven solutions,” but I feel that term is grossly overused these days.) Any of the methods outlined previously could suffice if the requirements allowed, but eventually the requirements are going to outgrow the capabilities of the design and you will be left with some rework.

To add some complexity to our requirements, the business users have handed down the following:

  1. A default configuration value
  2. A place to enter override values, but ONLY for the date upon which the new value is to take effect
  3. Ability to easily change the effective date of a particular value
  4. Ability to revert back to the default value on a particular date
  5. And just to make it interesting, the granularity needs to be at the Fiscal Week level, not at the Date level.

In this fourth and final installment of this series, we’ll meet all the requirements above by pulling out some fancy T-SQL. We used the PIVOT clause in the last solution. This one will add the LEAD windowing function.

The method outlined in Part 3 worked to a point, but if the single value was ever changed, it changed for all of time, which falls short of our new requirements. Explained in a little more detail: what we need is a way to enter a starting (default) value (#1), and then any new values. Also, the lazy business users only want to have to put in a new value ONCE and have THAT value become the ‘new default,’ remaining in effect until changed again (#2). Item #3 says they want a way to, in essence, slide the effective date forward or backward in time. They also want a way to revert back to the default, whatever that default happens to be, without even knowing the original default value (#4). And finally, they want it at the Fiscal Week level, not the Date or Day level (#5).

If you missed it in Part 2, we will be using a Configuration table and an abbreviated Date dimension table:
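Here is a minimal sketch of those two tables. (All table and column names in these examples are illustrative; match them to whatever you built in Part 2.)

CREATE TABLE dbo.CubeConfiguration
(
    ConfigurationID   INT            NOT NULL PRIMARY KEY,
    ConfigurationName VARCHAR(50)    NOT NULL,
    DefaultValue      DECIMAL(18, 4) NOT NULL   -- requirement #1: the default value
);

CREATE TABLE dbo.dimDate
(
    DateKey       INT  NOT NULL PRIMARY KEY,    -- e.g. 20131125
    FullDate      DATE NOT NULL,
    FiscalWeekKey INT  NOT NULL                 -- e.g. 201348 (requirement #5: fiscal week grain)
);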

 

Next, and new to this solution, we’ll need an override table that joins the Date Key and Configuration ID as follows:
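Something along these lines will do (again, the names are illustrative; the NULL-able value column becomes important later, when we revert to the default):

CREATE TABLE dbo.CubeConfigurationOverride
(
    DateKey         INT            NOT NULL,    -- the date (week) the override takes effect
    ConfigurationID INT            NOT NULL,
    OverrideValue   DECIMAL(18, 4) NULL,        -- NULL means "revert to the default"
    CONSTRAINT PK_CubeConfigurationOverride PRIMARY KEY (DateKey, ConfigurationID)
);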

As with any table, we’ll need some sample data:
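For example, ten fiscal weeks’ worth of rows in the abbreviated date dimension (the dates and week keys here are made up for illustration):

INSERT INTO dbo.dimDate (DateKey, FullDate, FiscalWeekKey)
VALUES
    (20131021, '2013-10-21', 201343),
    (20131028, '2013-10-28', 201344),
    (20131104, '2013-11-04', 201345),
    (20131111, '2013-11-11', 201346),
    (20131118, '2013-11-18', 201347),
    (20131125, '2013-11-25', 201348),
    (20131202, '2013-12-02', 201349),
    (20131209, '2013-12-09', 201350),
    (20131216, '2013-12-16', 201351),
    (20131223, '2013-12-23', 201352);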

Note that I am only inserting one day per fiscal week when in reality there would be 7.

Again from Part 2, we’ll add the two values to the main configuration table:
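Something like the following, using the illustrative table above (the first configuration’s name and default are placeholders; 1.2345 is the default we’ll come back to later):

INSERT INTO dbo.CubeConfiguration (ConfigurationID, ConfigurationName, DefaultValue)
VALUES
    (1, 'My Static Configuration',   100.0000),  -- placeholder name and default value
    (2, 'My Floating Configuration',   1.2345);  -- default referenced later in this article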

Now add a couple of override values to the override table:
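For example (the override values themselves are illustrative):

INSERT INTO dbo.CubeConfigurationOverride (DateKey, ConfigurationID, OverrideValue)
VALUES
    (20131028, 1, 110.0000),  -- ID 1 changes as of week 201344
    (20131118, 1, 120.0000),  -- ID 1 changes again as of week 201347
    (20131104, 2,   2.3456),  -- ID 2 changes as of week 201345
    (20131111, 2,   3.4567);  -- ID 2 changes again as of week 201346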

For the sample override values above, I have purposely staggered the overrides between IDs 1 and 2 so they do not happen in the same week.

 

To bring it all together, we’ll build a view one block at a time. Start with a SELECT statement that uses the LEAD function as follows:
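One way to write it, using the illustrative schema from above (the override table stores a DateKey, so we join through dimDate to get the fiscal week):

SELECT
    O.ConfigurationID,
    D.FiscalWeekKey,
    O.OverrideValue,
    LEAD(D.FiscalWeekKey) OVER (PARTITION BY O.ConfigurationID
                                ORDER BY D.FiscalWeekKey) AS EndWeek
FROM dbo.CubeConfigurationOverride AS O
INNER JOIN dbo.dimDate AS D
    ON O.DateKey = D.DateKey;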

In the query above, the [EndWeek] column represents the [FiscalWeekKey] of the NEXT override value for that particular [ConfigurationID], based on the PARTITION BY ConfigurationID clause. If that [EndWeek] value happens to be NULL, it means that row is the last entry chronologically for that configuration ID.

We’ll also need a list of distinct Fiscal Weeks from the dimDate table:
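That is a one-liner against the date dimension:

SELECT DISTINCT FiscalWeekKey
FROM dbo.dimDate;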

Finally, wrapping the above two statements in respective common table expressions, we’ll bring them all together. As before, the CROSS JOIN between WEEKS and the configuration table yields the Cartesian product of weeks and configurations; in our case, 10 weeks and 2 configurations yields 20 rows. But also note the ON clause of the LEFT OUTER JOIN to the [OVR] expression: we’re making an equality join on [ConfigurationID] (no surprise there), and the next two lines make sure that each override value is joined to the correct fiscal weeks. If there are 4 weeks during which a particular override value is in effect, it is these two AND conditions in the join that make sure it is applied to all 4.
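Put together, the combined statement looks something like this (still using the illustrative names; the second ISNULL, in the join, treats a NULL [EndWeek] as “in effect until further notice”):

WITH OVR AS
(
    SELECT
        O.ConfigurationID,
        D.FiscalWeekKey,
        O.OverrideValue,
        LEAD(D.FiscalWeekKey) OVER (PARTITION BY O.ConfigurationID
                                    ORDER BY D.FiscalWeekKey) AS EndWeek
    FROM dbo.CubeConfigurationOverride AS O
    INNER JOIN dbo.dimDate AS D
        ON O.DateKey = D.DateKey
),
WEEKS AS
(
    SELECT DISTINCT FiscalWeekKey
    FROM dbo.dimDate
)
SELECT
    W.FiscalWeekKey,
    C.ConfigurationName,
    ISNULL(OVR.OverrideValue, C.DefaultValue) AS ConfigurationValue   -- default when no override value applies
FROM WEEKS AS W
CROSS JOIN dbo.CubeConfiguration AS C                                 -- 10 weeks x 2 configurations = 20 rows
LEFT OUTER JOIN OVR
    ON  C.ConfigurationID = OVR.ConfigurationID
    AND W.FiscalWeekKey  >= OVR.FiscalWeekKey                         -- override is in effect from this week...
    AND W.FiscalWeekKey  <  ISNULL(OVR.EndWeek, 99999999);            -- ...until the next override (if any)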

Also note the first ISNULL( , ) function, which falls back to the default value if the row is missing an override value. To see this in action, add the following row, with a NULL override value, to the override table and re-run the SELECT statement.
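Using the illustrative schema, that row would be:

INSERT INTO dbo.CubeConfigurationOverride (DateKey, ConfigurationID, OverrideValue)
VALUES (20131125, 2, NULL);   -- week 201348: the ISNULL now falls back to the 1.2345 default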

Note that as of Week 201348, Configuration ID 2 (“My Floating Configuration”) reverts back to the default of 1.2345.

 

And lastly, we need to PIVOT the whole mess by adding the appropriate T-SQL clause at the end:
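Keeping the two CTEs exactly as they are, wrap the final SELECT as a derived table and append the PIVOT (the pivoted column names are the illustrative configuration names from earlier):

SELECT FiscalWeekKey, [My Static Configuration], [My Floating Configuration]
FROM
(
    -- the final SELECT from above: week, configuration name, resolved value
    SELECT
        W.FiscalWeekKey,
        C.ConfigurationName,
        ISNULL(OVR.OverrideValue, C.DefaultValue) AS ConfigurationValue
    FROM WEEKS AS W
    CROSS JOIN dbo.CubeConfiguration AS C
    LEFT OUTER JOIN OVR
        ON  C.ConfigurationID = OVR.ConfigurationID
        AND W.FiscalWeekKey  >= OVR.FiscalWeekKey
        AND W.FiscalWeekKey  <  ISNULL(OVR.EndWeek, 99999999)
) AS SRC
PIVOT
(
    MAX(ConfigurationValue)
    FOR ConfigurationName IN ([My Static Configuration], [My Floating Configuration])
) AS P;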

Now, by adding additional entries to the Override table, users can specify when a new value becomes effective. After the entry is added, editing its FiscalWeekKey will change when it becomes effective.

Recap:

Leveraging an existing Date dimension table and adding two simple tables, one for default configuration values and one for overrides, we have a view that meets all the requirements. Also, sliding the effective date of any one configuration override simply involves editing the week it becomes effective; it will remain in effect until it is overridden again by another entry.

To add a third Configuration value to the cube would involve the following actions:

  1. Adding a row to the Configuration table, and any overrides to the Override table.
  2. Editing the PIVOT clause of the view to include the value as a pivoted column.
  3. Refreshing the cube’s Data Source View to include the new column from the view.
  4. Adding the item as a new measure in the Configuration Measure Group.


Bringing Data to Life with Excel Power Map

If you have spent any time building, designing, viewing, or otherwise interacting with data visualizations, sooner or later you are going to come across Charles Joseph Minard’s iconic graph of Napoleon’s ‘Grande Armée’ as it advances on, and retreats from, Moscow in the fall and winter of 1812.

If you are unfamiliar with the graph, the wide yellow band represents the size of the army at various points along the advance, and the black band represents the same during the retreat, which is accompanied by the temperature plot along the bottom. To put the numbers in perspective, the army started at 400,000 men, reached Moscow with 100,000, and wound up in the end with 10,000.

The graph is stunning in and of itself, especially when one considers that Minard drew it in 1869, nearly 150 years ago. I have kept a postcard-sized copy pinned to my cubicle wall for the last 10 years as an example of what can be done with data and the right visualization.

But for all its richness, the graph is still just a two dimensional piece of paper. Playing with Excel Power Map one day, I thought, “What if I got my hands on Minard’s original data? What could I do with it?” The result, which I will walk the reader through creating in this article, is shown below. Even better is the movie clip that can be created from within Excel, which can be seen here.

Now that you have seen the result, I will walk you through how to get there. And of course, it all starts with data. The dataset I pulled off the internet (http://www.cs.uic.edu/~wilkinson/TheGrammarOfGraphics/minard.txt) looked a little sparse when I first plotted it, so I took some liberties with the original data and interpolated many of the points. This was done by taking two adjacent points in time, determining the average army size, latitude, longitude and date between them, and coming up with a third point halfway between the two. If that did not fill out the graph sufficiently, I took the averages again, between the middle point and the two original points, in essence creating ‘quarter points’ or ‘third points’ as needed. Purists may argue that I have destroyed the fidelity of the data; I would argue back that I’m not attempting complex predictive analytics, only trying to ‘pad’ my graph with enough data points that it closely resembles Minard’s original. I played around with simple circles, the size of each representing the size of the army at that particular point, but settled on the bar graph instead.
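If you would rather not compute those midpoints by hand, the same LEAD windowing function from the cube series makes quick work of it. Here is a rough T-SQL sketch against a hypothetical MinardPoints table (the table and its columns are assumptions for illustration, not part of the original dataset):

SELECT
    (P.Latitude  + LEAD(P.Latitude)  OVER (PARTITION BY P.Direction ORDER BY P.PointDate)) / 2.0 AS MidLatitude,
    (P.Longitude + LEAD(P.Longitude) OVER (PARTITION BY P.Direction ORDER BY P.PointDate)) / 2.0 AS MidLongitude,
    (P.Survivors + LEAD(P.Survivors) OVER (PARTITION BY P.Direction ORDER BY P.PointDate)) / 2   AS MidSurvivors,
    DATEADD(DAY,
            DATEDIFF(DAY, P.PointDate,
                     LEAD(P.PointDate) OVER (PARTITION BY P.Direction ORDER BY P.PointDate)) / 2,
            P.PointDate) AS MidDate
FROM dbo.MinardPoints AS P;
-- The last point in each direction of travel has no "next" point, so its midpoint columns
-- come back NULL and can simply be discarded.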

The temperature data was also modified by A) putting all the points on the same latitude, somewhat lower than the army’s path, and B) duplicating it in the Fahrenheit scale for those of us not too familiar with Celsius. Because of the up-and-down nature of the temperature plot, straight-line interpolation would not have been appropriate. My resulting datasets of army size and temperatures can be found in the following Excel document. Also of note: Excel does not recognize dates prior to the year 1900, so all dates in the datasets are moved forward by one century to 1912.

Now to the fun stuff. For this you will need Microsoft Excel 2013 with the Power Map add-in installed (http://www.microsoft.com/en-us/download/details.aspx?id=38395). After installation, activate the add-in by clicking File > Options, selecting the Add-Ins page, selecting “COM Add-ins” in the “Manage” combo box at the bottom, clicking Go, and enabling “Microsoft Power Map for Excel”. Back in your workbook, on the INSERT ribbon, click the Map button.

The first thing to do with any mapping exercise is to set the geographic references. On the right side, under Army Stats, check the boxes for Latitude and Longitude, and click Next (not shown). On the next page, drag the Survivors field to the HEIGHT box, the Direction field to the CATEGORY box, and the Date field to the TIME box, as shown in red below. At this point, after just three mouse clicks and three drag-and-drop operations, you have a fully interactive map (powered by Bing) with browsing controls, as shown in the blue boxes. And we’re just getting started!

Let’s clean some things up. Hover your mouse over the “Tour 1” title and notice the popup. Change the title to “Napoleon’s March.” Click the Layer Manager icon, then the Change Layer Options icon (gear) next to Layer 1. Rename the layer to “Army Statistics” using the edit icon (pencil) next to the layer name (not shown).

Adding the temperature plot is very simple as well. From the Home ribbon, click Add Layer. At this point, you will follow the same steps we did for plotting the army, starting with the geographic references of latitude and longitude, but this time we’ll take data from the Temperature fields, plotting Temperature as the HEIGHT, Scale as the CATEGORY, and Date as the TIME. The only difference is to select a Clustered Column visualization instead of the default Stacked Column.

Of course, to be true to Minard’s original, we can further modify the plot colors to yellow and black using the Layer Settings. I have also reduced the height and opacity of the temperature plots and changed their colors so as to better contrast with the army plot.

The result is good, but unfortunately it still shows 1812 data plotted on modern-day Europe. We need to fix this. In the Tour editor (on the left side of the screen), click the Change Scene Options button. Then, on the right side, click Change Map Type. Select New Custom Map; in the dialog box, click the Browse icon and locate this map of Russia 1812.jpg. After applying the custom map, don’t be discouraged by how your data looks; we’ll need to make a few adjustments. Set the X Min and Max values to 19 and 41, the Y Min and Max to 50 and 60, and the Y Scale to 120. Click Apply to view how the changes affect your map. Make adjustments as necessary until the army plot starts at the Niemen River on the left and ends at Moscow on the right; your final settings may be different depending on your hardware, resolution, etc. It should be noted that the referenced jpg is a conical projection, and Excel does not allow ‘bending’ the plot to coincide with the curved lines of latitude or converging lines of longitude that are clearly visible on the map. Not much we can do about that. It’s the best map of 1812 Russia I could find on the internet.

(Disclaimer: Custom maps may not be available in your current version of Power Map. As of September 2014, the feature was only available to Office 365 clients and not included in the download posted at the start of this article. I don’t know if that has since changed.)

The last thing we’re going to do is capture the playback video and add a soundtrack. To set the playback speed, on the Play Axis toolbar at the bottom, click the settings icon. Slide the Speed control at the bottom until the scene duration reaches just over 60 seconds; alternatively, use the spinner control to fine-tune the duration in seconds. For the soundtrack, what better music than Tchaikovsky’s iconic 1812 Overture, which commemorates Russia’s defeat of Napoleon in this very campaign! An excerpt sound clip can be found below; it is the final 61 seconds of the timeless classic.

From the Home ribbon, click Create Video (the second button from the left). Choose a quality and click Soundtrack Options. It is fairly straightforward from here. For the video at the start of this article, I selected the middle quality (Computers and Tablets, at 720p) and removed the option for looping the soundtrack.

As you can see, using Excel Power Map is easy and can yield exciting visualizations, even with 200-year-old data!

Enjoy!