Microsoft HoloLens

One of the activities I anticipated in coming out to Build 2015 was experiencing the HoloLens.  I saw the announcement in January and watched the related video a number of times leading up to Build.  Of course, the hope was to have this as the developer ‘give-away’, which would be incredible and allow us to build for the device.  And yes, when Alex stated during the keynote, “I have hundreds of these devices,” those around me and I grew very excited at what that might mean.  Then the realization set in that hundreds would not be enough for all, given the thousands of us here.  Alex’s plan for these hundreds of devices was to provide sessions where attendees could experience the devices in a couple of scenarios.  I chose the Hands-on Developer option and within a couple of hours received an email with my acceptance into the Holographic Academy. 

Development Experience

The development experience was a full 4.5-hour immersion into the world of Holographic development.  I had my own workstation and a HoloLens device attached by USB.  The session followed a script designed to get you comfortable with both the tools and the device.  After a short sample app to get used to the HoloLens, we moved on to recreating the ‘Origami’ app.  The development takes place equally in Unity and Visual Studio. 

Unity is a development tool that is very popular in game development.  It provides a 3D area for placing objects, lights, and cameras, and for defining the various actions for these items.  The assets we needed for our application were provided on our workstations, and it was a matter of placing them in the correct location in the object tree.  This part reminded me a little of XAML development in Blend.  Actions for the objects were defined by creating C# classes and dragging them onto the object. 

The C# code is modified using Visual Studio.  The code is there to look for events or periodically get the state of the HoloLens view, which reminds me of game or XNA development: see if my world has changed and do something in response to that change.  The lab provided the code for the classes so we could spend time seeing the implementation on the device.  However, there was flexibility in how we coded certain components, especially in the voice recognition.  The lab code included standard phrases, but we were encouraged to include our own text and make the app respond to that.  This worked quite well without any ‘voice training’ for the device, even with other people around interacting with their own devices.

After the objects are bound and the code modified, Unity builds a Visual Studio solution and places it in a folder via a custom add-in to Unity, which is not publicly available at this time.  The solution is opened in Visual Studio and the application deployed to the USB-connected HoloLens device.  It is important that the HoloLens is held pointing at the location where it will be used while the application launches, as the code specifies placing the object a set number of inches in front of the device.  Once deployed, the USB cable can be removed.  Now untethered, you can view the app in 3D while walking around it, as well as moving closer and farther away.  This type of movement around the visual is accentuated by the spatial sound enabled in the device, making the experience more realistic without any of the nausea or motion sickness sometimes experienced with other devices.  I was also able to see and carry on a conversation with members of the HoloLens Team present in the room. 

The development experience was simplified for this short session, but I think this was enough to get a sense of the techniques needed to make this work.  I am guessing that this type of development is only needed for the immersive, 3D applications.  Universal Applications, which are also Holographic Applications, would just be created in the current development model in Visual Studio.  I would also think that these would appear for placement from some kind of app store interface instead of the deployment scenario described here.  This also gets into the question of the storage capabilities of the device which we did not discuss.  I look forward to getting my hands on a device and seeing what it can do without a script.

OneDrive for Business Development

April was OneDrive for Business month at TriState SharePoint.  ODFB or OD4B, depending on your preference, is the corporate user’s cloud storage solution, available as part of Office 365 SharePoint (Sites) or as a stand-alone product.  This is not to be confused with the OneDrive Personal you get as part of your Microsoft Account. 

For my part of the discussion, I focused on the development hooks available for ODFB.  This blog post describes the information presented at the meeting.  I pulled this information from available resources on MSDN and Microsoft Virtual Academy.  I link to these sources in the content of the blog and don’t repeat the specifics here.  You can check out the links for the details.

I grouped these into three options: Provision, Customize, and Develop.


Provision

Much like a MySite in SharePoint, ODFB is not created by default.  It is only instantiated when a user navigates to the site for the first time.  This can cause a slight delay while it is being created.  In one-off situations, the wait is not terrible.  If you are launching a new site to your user community, you may want to create those sites ahead of time and save unnecessary calls to the service desk.  Scripts are available in CSOM and PowerShell.  Both approaches follow the same pattern. 

  • Connect to the SharePoint instance.
  • Get a collection of users.
  • Connect to the User Profile object.
  • Create the ODFB instances in batches of fewer than 200 users each.  There is a TechNet article describing the full process with the PowerShell code.


Customize

What is the first thing everyone wants to do with their SharePoint environments?  Make them not look like SharePoint.  While you can change the master page, this is not the recommended practice.  However, there are ways to change the appearance and content of the ODFB site.

First, why would you want to do this?  You may want to change the colors used on the ODFB site.  Maybe add a background image or do other things that make the ODFB site look more like the rest of your custom tenant sites.  All of this is possible with the patterns described here.  One other reason is for governance.  There may be content you want to make sure is included with each ODFB site.  This could be Sales Contract templates for the Field Sales Team or just standard policy documentation.  These patterns can also be used to check and confirm that the content remains included on the user’s ODFB site.

There are three design patterns available for enabling these changes: the App Part Model, the Scheduled Job Model, and the Pre-create Model.  All of these patterns are described in more detail in this post.  I’ll briefly summarize here.

App Part Model

You cannot deploy applications directly into the ODFB site like you can on a team site.  One way around this is to create an app part that sits on an Office 365 page that is common to your users, like the home page of your intranet site.  When the home page loads, the app part runs and calls a provider-hosted app, which creates an entry in a queue for the user with the information that needs to change on the ODFB site.  A scheduled job checks the queue for work and performs the customizations described in each entry.  This model is needed because the process can be long running and would not be a good idea to run in the provider-hosted app directly.  There is a great solution included in the Patterns and Practices repository on the OfficeDev GitHub.  It is complex, as it is a complete solution including console apps to apply and reset the customizations.  There is also a Channel 9 video walking through the project.

Scheduled Job

A scheduled job is nothing more than some code, usually an executable like a console application, that gets executed on a recurring basis by some process.  In SharePoint, these would be Timer Jobs, which are not available in Office 365.  As an IT Pro, you might think of these as Windows Task Scheduler jobs that run on a server.  In Azure, the equivalent is a WebJob, which runs within an Azure Website and executes on a defined interval.  Regardless of how you architect it, the executable portion of the scheduled job runs code that queries Office 365 for new ODFB sites.  When it finds them, it applies the customizations defined in the code.  The potential limitation with this approach is that people may get to the site before the customization is applied, as search does take time to populate with the new sites. 

Pre-create Model

I mentioned the provisioning options available in the first part of this post.  It is possible to include the code for customizing the ODFB site at the time it is created.  This provides a simple way of getting the sites created and the customizations applied from the start.  Unlike the other two models, creating sites for users added later and resetting user changes are not easily handled in this model.


Develop

In October 2014, the Office Dev Team RTM’d the Office 365 APIs.  These APIs were designed to make the consumption of Office 365 services easy for the developer and to make them available on every major development platform, including iOS, Android, and PHP.  If you are interested in this type of development, start by going to  It has all the links for training, code samples, and the tool I showed at the end of my talk, API Sandbox.  This site offers an easy way to try out getting data from any of the services exposed directly through the APIs: Calendar, Mail, Contacts, and Files.  There are both client library and REST samples, written in C# and JavaScript.

As for the actual project sample, I stepped through the lab project associated with the Microsoft Virtual Academy course, Deep Dive: Integrate Office 365 APIs in Your Web Apps.  Section 3 focuses on ODFB integration and includes a lab in which you build a web site with a page showing the files in the user’s ODFB.  The code for this lab is available here.  If you clone this repo, move the project to a local folder and open it from there.  The folder names in the repo are too long, and they prevent NuGet from updating the packages.  Even with that move, I still had issues with the OWIN package.  I had to remove it and its dependencies using the Uninstall-Package command and then reinstall them.  This was a little troubling, but it did solve my build issues.

That’s all for the recap.  Ping me on twitter or leave a comment if you have any questions.

Office 365 for Developers – Philly .net Code Camp Session

In 2014, we tried something a little different with Philly .net Code Camp, offering a 2-day camp at the Valley Forge Convention Center.  It was fun, and we learned a lot about running an event of that size.   

This year for our 21st Code Camp on March 20 & 21, we have something old and something new. 

For the old, we are back at the Microsoft offices in Malvern.  This goes back to before I started attending Code Camp, which was when they were at DeVry.  It is the same building, except the Microsoft Technology Center is now there and the rooms all have top-notch presentation equipment.  We still have 60 sessions on Saturday across 12 different tracks with more than 50 speakers, some local, some from out of town, but all delivering great content.

For the new, we are conducting full-day learning sessions.  There are 10 sessions to choose from, and while we ask for a selection when you register, you can still change as long as there is room.  You can see the full list of courses and the rest of the information on the Philly .net site.  There are additional perks for attendees, including a guaranteed spot at Code Camp, which typically reaches capacity quickly.  Registration for these learning sessions is open if you are interested.

I am participating in this longer format and conducting a full day session on Office 365 for Developers.  Office 365 has a ton of great services available and many of them provide access via a rich set of APIs (and if they don’t now, I expect they soon will).  Office 365 also represents a departure in how you typically architect solutions, especially for those coming from the ‘full-trust’ SharePoint development world.  This session will give you everything you need to begin writing your own solutions against Office 365 services.

Here is the agenda for the day. 

  • Introduction to Office 365
  • Setting up the Development Environment
  • Development Guidelines for Office 365
  • Branding and Customizations – Code
  • Office 365 APIs – the new Application Model – Lots of code
  • Staying Up to Date and Wrap up

I am currently not planning for this to be a hands-on lab.  However, you will be able to follow along as I walk through the steps, so bring your laptops. 

For those planning to attend or interested in attending, let me know if this is what you expect.  If you are interested in a topic not mentioned here, add a comment or tweet me.  Same for the agenda.  I expect this to be a small audience, so I can morph the content to focus on your interests. 

Looking forward to seeing you there!

Office 365 Developer Content at Build

On this week’s Office Developer Podcast (Episode #32), Jeremy Thake and Sonya Koptyev discuss the initial plans for Office 365 sessions at Build.  They also mention the Ignite sessions and related activities for attendees with more information to come on both conferences in a future podcast.

Here are some of the highlights from the discussion.  Both conferences will have more than 20 sessions each.  As I am sure you would guess, the focus at Build is more on the developer story: more information about the existing Office 365 APIs, information about the new Office 365 APIs in development, as well as some ‘exciting Graph (API) news’ (we will just have to wait and see what that means).  These sessions will also include information about coding various web technologies and frameworks against the Office 365 APIs.  Of course, there will also be sessions on connecting mobile and other platforms (i.e., iOS and Android) to the Office 365 services.

If you haven’t guessed, I am registered to attend.  If you are attending too, find me on twitter (@rich_ross).  There are quite a few people heading out there from the Philadelphia area, including some of my fellow SharePoint developers interested in learning about the future and developing on Office 365 services.

If you are not going, don’t worry; you can still access the session content.  There will be a live stream for the keynote sessions, with selected sessions included in the live stream for the rest of the day.  The Channel 9 Team does an outstanding job making you feel like a part of the conference no matter where you watch it.  Given previous Build conference experiences, I expect all of the sessions will be available online for download or streaming about 24-48 hours after they occur.

Only 79 more days…hope to see you there.

Office 365 and Kendo UI Map Control

For a recent project, I built an intranet site for a client using SharePoint Online.  One of the requirements for this project was to display the various corporate locations on a map.  The corporate locations are stored in a SharePoint list, so they can be updated by the business once we launch the site.  The list has a custom content type for Corporate Location that includes the standard Site Columns for Work Address, Work Country, Primary Phone, etc.  It also includes two custom columns of type Number for the location’s longitude and latitude. 

Populating the information for these columns requires some additional work.  Most organizations have the address and phone information for their corporate sites readily available, but probably not the exact location of the corporate site in geographical space.  The address of the location can be entered into a map service, like Bing Maps, to return the corresponding latitude and longitude.  I went through the steps for getting this data in a previous blog post on SharePoint 2013 Geolocation, in the section Using Geolocation.  I did the same steps for this project.  Now, I just need a map control.
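As a rough sketch of that lookup, the Bing Maps Locations API returns coordinates nested inside resourceSets.  The response shape below matches the API, but the sample values are made up:

```javascript
// Pull latitude/longitude out of a Bing Maps Locations API response.
// The response shape (resourceSets -> resources -> point.coordinates)
// matches the API; the sample values below are made up.
function extractCoordinates(response) {
    var resource = response.resourceSets[0].resources[0];
    return {
        latitude: resource.point.coordinates[0],
        longitude: resource.point.coordinates[1]
    };
}

// A trimmed sample of what the service returns for an address query.
var sampleResponse = {
    resourceSets: [{
        resources: [{
            name: "Malvern, PA",
            point: { coordinates: [40.0362, -75.5132] }
        }]
    }]
};

var coords = extractCoordinates(sampleResponse);
```

The resulting latitude and longitude values are what get entered into the two Number columns on the list.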

I have used Telerik controls for basic web development and SharePoint on-premises solution development.  In those projects, the DLLs for the controls are deployed to the server in the solution package or even pre-installed on the machines.  This is a SharePoint Online project, so deploying solutions and any necessary DLLs is not an option here.  We need controls based on a JavaScript framework, as these files will download when requested by the web page and run client-side.  In addition to their server-side controls, Telerik also has a suite of JavaScript framework controls called Kendo UI.  The Professional version of these controls includes a map control, which is perfect for our application here. 

So now that we have the data in place and our control selected, let’s wire this up.

Wiring It Up

We need to develop two components for this solution, a custom JavaScript file that runs when the page finishes loading and a snippet of code to place on the page where we need the map.

Here is the overall structure of the JavaScript file.  The file is an Immediately Invoked Function Expression (IIFE) or Self-Executing Function (SEF), depending on your terminology.  It is also encapsulated in the locationMap namespace.  I place this file in _catalogs/masterpage, in a folder named ‘js’.  This means it is only available for that site collection, which was acceptable for my project. 

"use strict";

var locationMap = window.locationMap || {};

(function () {

    function createMap() {
        // map setup, covered below
    }

    $().ready(function () {
        // REST call and data shaping, covered below
    });

})();


The first function is called using the jQuery document.ready function.  This anonymous function makes a REST call to SharePoint to return a collection of items in the list.  The REST call is made in the context of the authenticated user.  This means the user must have READ permissions on the list as there is no way to impersonate or authenticate with other credentials.

This collection needs a little additional processing before it is ready to pass to the map control.  This is because the map control is looking for one object containing an array of our location points, the latitude and longitude.  Our data from SharePoint has these as two separate fields in our list.  Once the data is successfully returned, I use the jQuery .each function to loop through the results and add an array named ‘location’ to each item object, adding the latitude and longitude values as separate items while parsing them to type Float.  The modified data.value collection is then passed to a new object, stored as locationMap.mapData.  Calling the read function on the DataSource loads the data into the object.  Now that the data is parsed and loaded, I call the createMap function.

$().ready(function () {
    var restUrl = "/{site location}/_api/web/lists/GetByTitle('Locations')/items";
    $.getJSON(restUrl, {
        format: 'json'
    }).done(function (data) {
        // 'Latitude'/'Longitude' stand in for the list's internal column names.
        $.each(data.value, function (i, item) {
            item.location = [];

        locationMap.mapData = new{ data: data.value });;
        createMap();
    });
});
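The call above returns every field on each item.  If you only need a few columns, a $select clause keeps the payload small.  Here is a small helper for building such a URL; the 'Latitude' and 'Longitude' column names are hypothetical, so substitute the actual internal names from your list:

```javascript
// Build a SharePoint REST URL that returns only the named columns.
// 'Latitude' and 'Longitude' are hypothetical internal column names;
// substitute the actual internal names from your list.
function buildItemsUrl(siteUrl, listTitle, fields) {
    return siteUrl +
        "/_api/web/lists/GetByTitle('" + listTitle + "')/items" +
        "?$select=" + fields.join(",");
}

var url = buildItemsUrl("/sites/intranet", "Locations",
    ["Title", "Latitude", "Longitude", "WorkCity"]);
// → "/sites/intranet/_api/web/lists/GetByTitle('Locations')/items?$select=Title,Latitude,Longitude,WorkCity"
```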


The second function is createMap.  This sets the options and defines the data source for the map.  I am using a template in this map to display a custom tooltip containing the location information when the user selects a pin on the map.  More on that below, but the map control needs to know which template to use for the tooltip, which is what I do here in setting the marker.tooltip.template option.  For the data binding, set the dataSource property to the locationMap.mapData object, the locationField property to the array we added to each item (‘location’), and finally the titleField to the ‘Title’ property (this is the name of the field returned in our JSON object, which corresponds to the matching SPList internal field name).  The map includes an event for ‘reset’.  This event fires when the map zoom level changes and allows the points to reposition themselves based on the current zoom level of the map.  Without this, the map images will change, but the points will remain in the location of the original zoom level.
function createMap() {
    var myTemplate = kendo.template($("#TemplateHtml").html());

    $("#map").kendoMap({
        layerDefaults: {
            marker: {
                tooltip: {
                    template: myTemplate
                }
            }
        },
        markerDefaults: {
            shape: "pin"
        },
        center: [30.000000, 0.000000],
        zoom: 2,
        layers: [{
            type: "tile",
            urlTemplate: "https://#= subdomain #/#= zoom #/#= x #/#= y #.png",
            subdomains: ["a", "b", "c"],
            attribution: "&copy; <a href=''>OpenStreetMap contributors</a>"
        }, {
            type: "marker",
            dataSource: locationMap.mapData,
            locationField: "location",
            titleField: "Title"
        }],
        reset: function (e) {
            // fires when the zoom level changes so the markers can
            // reposition for the new zoom level
        }
    });
}

Now that the code is in place, I need to modify the page where this map is displayed.  The site has the Publishing feature enabled, so the page is in the Pages document library.  We could include the links to the CSS and JS files in a page template, but for this project I only need this functionality on one page, so it is not worth creating the template.  I added a Script Editor webpart to the page in the location where I need the map.  The following code snippet was then included in the webpart.
<link href= 
      type="text/css" rel="stylesheet" ms-design-css-conversion="no" />
<link href= 
      type="text/css" rel="stylesheet" ms-design-css-conversion="no" />
<link href= 
      type="text/css" rel="stylesheet" ms-design-css-conversion="no" />
<link href= 
      type="text/css" rel="stylesheet" ms-design-css-conversion="no" />

<script id="TemplateHtml" type="text/x-kendo-template">
    <h3><a href="#= marker.dataItem.LocationPageUrl #">#= marker.dataItem.Title #</a></h3>
    <p style="text-align:left">
        #= marker.dataItem.WorkCity #, #= marker.dataItem.WorkState # #= marker.dataItem.WorkZip #<br/>
    </p>
    <p style="text-align:left">
        <b>Phone:</b> #= marker.dataItem.PrimaryNumber #<br/>
        <b>Fax:</b> #= marker.dataItem.WorkFax #
    </p>
</script>

<div class="demo-section k-header kendoMapStyle">
    <div id="map" class="kendoMapStyle"></div>
</div>

<script src= 
        ></script>
<script src= 
        ></script>
<script src="/_catalogs/masterpage/js/map/locationMap.js"></script>


The first section is just the links to the CSS required to render the control.  The next block is the script block for the HTML template I mentioned earlier.  I place the HTML I want rendered in the tooltip window here.  When I made the REST service call, I included more than just the location and title information bound to the map; the address, phone numbers, and a link to the corresponding location page are also available.  To get them to display in the tooltip, I have placeholders in the HTML template.  They take the form #= marker.dataItem.{propertyName} #, where propertyName is the internal name of the field in the SharePoint list.
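To make the placeholder form concrete, here is a toy renderer.  This is not Kendo's actual template engine (which compiles templates to JavaScript); it only illustrates the #= marker.dataItem.{propertyName} # substitution:

```javascript
// Toy illustration of the #= ... # placeholder form. Kendo's real
// template engine compiles templates to JavaScript functions; this
// simplified version only resolves marker.dataItem.{propertyName}.
function renderTooltip(template, dataItem) {
    return template.replace(/#=\s*marker\.dataItem\.(\w+)\s*#/g,
        function (match, prop) {
            return dataItem[prop];
        });
}

var html = renderTooltip(
    "<b>Phone:</b> #= marker.dataItem.PrimaryNumber #",
    { PrimaryNumber: "610-555-0100" }
);
// → "<b>Phone:</b> 610-555-0100"
```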
After the script block is a div where the map is placed.  It is wrapped in a parent div that provides additional styling.  Lastly, the scripts are loaded: jQuery, then Kendo, and then our custom locationMap file.  The jQuery file is downloaded from the Microsoft AJAX CDN.  All of the Kendo files are pulled from a CDN that Telerik provides on CloudFront, so I get SSL support.  This provides faster loading of the page, as these files are not coming from the SharePoint Online environment, and is the recommended practice where possible.

The Results

Here are a couple of images of the resulting output on the page.

Full Page with the control
Same page zoomed into the map points and the tooltip information from the SharePoint list for this location.
Hope you found this useful.  As always, send your questions or comments via twitter or use the commenting system below.

Disclaimer: I do not work for, nor am I paid by, Telerik.  I use their tools because they work for me, they are continually innovating, and they provide great support when I have questions.  I mention them by name here as this is what I use for my projects.  Other controls and frameworks that render based on JavaScript instead of DLLs will likely work as well.

Office 365 and Yammer Administrators

I recently completed a Philly MTC Tech Talk on Yammer Administration.  You can see the video here.  This talk was mainly about the Yammer administrative tools available in an Enterprise version of Yammer.  I did start out with some steps on enabling your Yammer Enterprise instance.  These steps are all listed on the Activation pages of site.

Life will be easier for you if you follow the steps and assign a Global Administrator who does not have a generic username prior to activation.  Yammer only assigns named users from the O365 Global Administrator group as Verified Admins in Yammer.  So the default MOD Administrator account, with the username ‘admin’, is not listed as an Admin in Yammer.  That is why you need to identify and assign one as part of the steps in provisioning the Enterprise instance. 

I have found that you can do this after the fact.  I had a previous demo instance where I did not follow the order exactly, and this resulted in not being able to see the admin menu.  I removed and re-added my named user to the Global Administrator group, and they then had access to the Yammer Admin area.

It is important to note that any Global Administrators you assign in O365 are automatically Verified Administrators in Yammer.  This means they have access to all data in the Yammer environment, public and private.  They can also export data, manage users, and manage integrations.  You might expect the GAs to need access to the latter functions, but the viewing of private data can be a concern for some networks.

Often administrators will have two accounts: their normal domain account and an admin account with the additional permissions an admin requires.  Given the current limitation with assigning administrator permissions to accounts beginning with ‘admin’, you can create Global Administrator accounts in the format ‘admin-username’.  Creating an account this way will not add it as a Verified Administrator in Yammer.  This is not a solution, but for some networks this may be enough.  There does need to be a better solution going forward.

If you need to create Network Admins, this is done on the Admins tab of the Admin area.  Enter the email address of the person, select the appropriate permission, and click the Submit button.  Network Admins can also be promoted to Verified Admin in the same area.  Yammer Admins assigned via the Global Administrator group can only be managed in the O365 Admin area for Users and Groups.  You can see the differences in the screenshot below.


As always, put your questions below or find me on twitter.

Yammer: Accounts and External Networks

Typically, access to Yammer comes from the account associated with your company’s Yammer instance.  This is your company email address, and the domain of that email address is the name of your Yammer home network.  For example, sign in with and your home network is  You can see this in the URL, as it will be

In addition to the home network, you can participate in external networks.  An external network provides the same features as your home network but includes people outside the domain.  These people can be invited by someone from the home network, or if the network is public, a user can request access.

There are a number of great reasons to participate in external networks.  Microsoft and Yammer teams use these heavily to collect feedback, provide support, and share information with the community.  External networks are also a great way to connect with your customers by creating your company’s own separate external networks for project engagements or for events like tradeshows and conferences.  In general, any use case where you want to securely collaborate with people outside of your corporate home network is a good reason to explore external networks.

Memberships to external networks are tied to the account used to request access to the external network.  In the typical case, this is your home network account: you are logged into the home network and then request access to an external network.  The approval is associated with your account, and this works well.

A problem arises if you should leave mycompany.  As part of mycompany’s processes, your domain account is disabled, either automatically through Directory Synchronization if it is configured or, if not, by someone manually starting the disable process for your account in Yammer.  You now no longer have access to the home network.  You also no longer have access to any external networks associated with that account.

You can see this by accessing an external network directly through its URL.  I recently changed companies, and my previous account had access to the Office 365 IT Pro Network.  The direct URL to this network is  Accessing this URL with my previous company’s Yammer credentials gives the following notice.


So as a Yammer user, what can you do?  What I am now doing when I need access to external networks is to make the request using my personal email, which just happens to be associated with my Microsoft Account, although any personal email account seems to work.  This is a good fix for those at a company that is not using Yammer (at least not yet!).  I would also only use this for access to external networks that go beyond employment at any one company, like the Office 365 Technical Network or the Yammer Developer Network.

There is a downside to this if you are using your corporate account to access your company’s Yammer network.  You will need to log out and log in to switch between corporate and external networks, as the browser can only hold one Yammer connection at a time.  You can keep multiple Yammer connections open by using normal and private mode in the same browser or by connecting with two different browser applications, like IE and Chrome.

What I would like to see is a change to the Yammer account so it is based on my Microsoft Account and provides access to the external networks I have associated with it, regardless of where I am employed.  This new Yammer account would also have one home network, tied to my corporate account.  This way, if I change companies, external access remains but the home network is blocked until I enter a new set of credentials for a home network.  Just a thought.

Tweet me or post any questions you have here.

