14 October 2017

Lessons learned from adapting Walk the World from pure HoloLens to Windows Mixed Reality

image

Heads-up (pun intended)

This is not my typical code-with-sample story - this is a war story from the front line of Windows Mixed Reality development: how I got here, what I learned, what mistakes I made, what scars I have to show, and how I won in the end.

The end

On the evening (CET) of Tuesday, October 10, 2017, Kevin Gallo - VP of Windows Developer Platform - announced in London the release of the SDK for the Windows 10 Fall Creators Update and the opening of the Windows Store for apps targeting that release - including Mixed Reality apps. 

Mere hours after that - Thursday had just arrived in the Netherlands - an updated version of Walk the World with added support for Windows Mixed Reality passed certification; it became available for download in the Store on Friday the 13th around 8:30pm CET, and I was able to download, install and verify it was working as I expected. Four days before the actual official rollout of the FCU, including the Mixed Reality portal, I was in the Store. Against all odds, I had not only managed to make my app available, but also to get it in as a launch title.

Achievement unlocked ;)

What happened before

On June 28th, 2017, I was invited to Unity Unite Europe 2017 by Microsoftie Desiree Lockwood, whom I met numerous times in the course of becoming an MVP. Not having to fly 9 hours to meet an old friend but only having to take a short hop on a train, I gladly accepted. On a whim, I decided to bring my HoloLens with Walk the World for HoloLens loaded on it. We had lunch and I demoed the app, showing Mount Rainier about 4 meters high, in a side building. That apparently made quite an impression. Talk quickly moved to the FCU Mixed Reality, and how much work it would be to make my app available for MR as well. In a very uncharacteristic moment of hubris I said "you get me a headset, I will get you this app". I got guidance on how to pitch my app, I followed the instructions, and on July 27th the headset arrived.

Suddenly it was time to make sure I lived up to my big words.

Before

Challenge 1: hardware

My venerable old development PC, dating back to 2011, had a video card that in no way in Hades would be able to drive a headset. I got hold of a second-hand video card, and the PC, running the Creators Update AKA RS2, said it was ready to rock. So I happily enabled a dual boot config, added the Fall Creators Update Insiders preview, and then ran into my first snag.

On the Creators Update the Mixed Reality portal is nothing more than a preview. That preview ran nicely on my old PC, but the upcoming production version apparently would not. Maybe I should have read this part of the Mixed Reality development documentation better. Not only did the GPU not remotely cut it, the CPU was way too old as well. So with a headset on the way, I was looking at this.

After

Fortunately, one of my colleagues is an avid gamer. She and her husband took a look at the specs, and built an amazing PC for me in a matter of days. Its specs are:

  • CPU: AMD Ryzen 7 1700, 3.0 GHz (3.7 GHz Turbo Boost) socket AM4 processor
  • Graphics card: Gigabyte GeForce GTX 1070 G1 Gaming 8GB
  • Motherboard: ASUS PRIME B350-PLUS, socket AM4
  • Memory: Corsair 16 GB DDR4-3000
  • Storage: Crucial MX300 1TB M.2
  • Power supply: Seasonic Focus Plus Gold 650W

This is built into a Fractal Design Core 2500 Tower with an extra Fractal Design Silent Series R3 120mm Case fan. My involvement in the actual creation of this monster was supplying maximum physical dimensions and entering payment details. Software is my shtick, not hardware. But I can tell you this device runs Windows Mixed Reality like a charm, and very stable, too. Thanks Alexandra and Miles!

Lesson 1: RTFM, and then wait till the FM is indeed final before making assumptions.

Lesson 2: don’t skimp on hardware especially when you are aiming for development.

Challenge 2: tools in flux

Developing for Windows Mixed Reality in early August 2017 was a bit of a challenge. Five factors were in play:

  • The Fall Creators Update Insiders preview
  • The Mixed Reality Portal
  • Visual Studio 2017.x (a few updates came out during the timeframe)
  • Unity 2017 (numerous versions)
  • The HoloToolkit (halfway rechristened the Mixed Reality Toolkit) – or actually, the development branch for Mixed Reality.

Only when all five of these stars aligned would things actually work together. Only three of the stars were in Microsoft's control – Unity is of course made by Unity, and the Mixed Reality Toolkit is an open source project only partially driven by Microsoft. Four of them were very much in flux. A new version would come out for one of these stars, and the whole constellation would start to wobble. Fun things I had to deal with were, amongst others:

  • For quite some time, Unity could not generate Unity C# projects, but only 'player' projects, which meant debugging was nearly impossible. It also made for a fun game of second-guessing the compiler, kind of like in the very old days before live debugging (yes kids, that's how long I have been developing software). Effectively, this meant I had to leave the "Unity C# Projects" checkbox unchecked in the build settings, because checking it created something the compiler did not want to build - let alone something deployable.
  • An update in Visual Studio 2017 made it impossible to run Unity generated projects unless you manually edited project.lock.json or downgraded Visual Studio (which, in the end, I did).
  • Apps ran only once, then you had to reset the MR portal. Or only showed a black screen. Next time they ran flawlessly.
  • For a while, I could not start apps from Visual Studio. I could only start them from the start menu. And only from the desktop start menu. Not from the MR start menu.
  • The Mixed Reality version and the HoloLens version of the app got quite far out of sync at one point.

Lesson 3: the bleeding edge is where you suffer pain. But it is also where you get the biggest gain. And this is where the community's biggest strengths come to light.

A tale of two tool sets

I wanted to move forward with Mixed Reality, but at the same time I wanted to maintain the integrity of the HoloLens version. So although the sources I wrote myself remained virtually the same, at one point the versions of Unity, the Mixed Reality Toolkit and even Visual Studio I needed were different. For HoloLens development I used:

  • Visual Studio 2017 15.3.5
  • Unity 2017.1.1f1
  • The Mixed Reality Toolkit master branch.

For Mixed Reality development I used:

  • Visual Studio 2017 15.4.0 preview 5
  • Unity 2017.2.0f1
  • The Mixed Reality Toolkit Dev_Unity_2017.2.0 branch

Why a Visual Studio preview? Well, that particular preview contained the 16299 SDK (as fellow MVP Oren Novotny pointed out in a tweet), and although I did not know for sure that 16299 would indeed be the FCU, I decided to go for it. In the late afternoon (CET) of Sunday, October 8, I built the package, pushed it through the WACK, and submitted it. And as I wrote before, it sneaked through shortly after the Store was declared open, becoming an unplanned Mixed Reality launch title. Unplanned by Microsoft, that is. It was definitely planned by me. ;)

In the meantime, things are still changing - see this screenshot from the HoloDeveloper Slack group, which I highly recommend joining, especially the immersive_hmd_info and mrtoolkit_holotoolkit channels, as these give a lot of up-to-date information on the changes and wobbles in the five-star constellation:

image

Lesson 4: Keep tight track of your tool versions

Lesson 5: Join the HoloDeveloper slack channel (and this means something from a self-proclaimed NOT fan of Slack;) )

A tale of two source trees

As I already mentioned, I needed to use two versions of the Mixed Reality Toolkit. These are distributed in the form of Unity packages, which means they insert themselves into the source of your app, as source. It's not like referencing an assembly or a NuGet package. This had a kind of nasty consequence – if I wanted to move forward and keep my HoloLens app intact for the moment, I had to make a separate branch for Mixed Reality development, which is exactly what I did. So although the sources I wrote for my app are virtually the same, there was a different toolkit in my sources. Wow, did Microsoft mess up this one, right?

No. Not at all. Think with me.

  • I have a master branch that is based upon the master branch of the Mixed Reality Toolkit – this contains my HoloLens app
  • I have an MR branch based upon the Dev_Unity_2017.2.0 branch – this is the Mixed Reality variant. In this branch sits all the intelligence that makes the app work on a HoloLens and an immersive headset.
  • At some point the stars will align such that I can use one version of everything (most notably Unity, which keeps being a wild card in this constellation) to generate an app that will work on all devices. Presumably the Mixed Reality Dev_Unity_2017.2.0 branch will become the master branch. Then I will not merge to my master branch – that branch will simply be deleted. My MR branch will be based upon the latest stuff and will become the source of everything.

Changes in code and Unity objects

Preprocessing directives

In the phase where I could not create Unity C# projects - hence no debuggable projects - it seemed to me that UWP code inside #if UNITY_UWP compiler directives did not get executed in the non-debuggable projects that were the only thing I could generate. Peeking in the HoloToolkit - I beg your pardon - Mixed Reality Toolkit, I saw that all the UNITY_UWP compiler directives were gone and several others were used instead. I tried WINDOWS_UWP and lo and behold - it worked. Wanting to keep backwards compatibility, I changed all the #if UNITY_UWP directives to #if UNITY_UWP || WINDOWS_UWP. I am not really sure it's still necessary - looking in the Visual Studio solution build configuration now, I see both conditionals defined. I decided to leave it there.
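To illustrate the pattern, here is a minimal sketch - not code from the app itself, and the UWP API call is just a placeholder:

using UnityEngine;

public class DirectiveExample : MonoBehaviour
{
    private void Start()
    {
#if UNITY_UWP || WINDOWS_UWP
        // Compiled only in the UWP player build; UWP-only APIs are safe to call here
        var folder = Windows.Storage.ApplicationData.Current.LocalFolder;
        Debug.Log(folder.Path);
#endif
    }
}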

Camera

Next to the tried and trusted HoloLensCamera, there's now the MixedRealityCamera. This also includes support for controllers and stuff. What you need to do is to disable (or remove) the HoloLensCamera and add a MixedRealityCameraParent:

image

This includes the actual camera, the in-app controller display (just like in the Cliff House, and it looks really cool), as well as a default floor - a kind of bluish square that appears at ground level. I think its apparent size is about 7x7 meters, but I did not check. As Walk the World has its own 'floor' - a 6.5x6.5 meter map - I did not need that, so I disabled that portion.

Runtime headset checking - for defining the floor

I am not sure about this one - but when running a HoloLens app, position (0,0,0) is the place where the HoloLens is at the start of the app. That is why Walk the World for HoloLens starts up with a prompt for you to identify the floor. That way, I can determine your height and decide how far below your head I need to place the map to make it appear on the floor. It is simply a matter of sending a raycast, having it intersect with the Spatial Mapping at least 1 meter below the user's head, and going from there. I will blog about this soon. In fact, I had already started doing so, but then this came around.
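As a rough sketch of that floor-finding idea - not the actual app code; the class, method name and the layer number for the spatial mapping mesh are assumptions:

using UnityEngine;

public class FloorFinder : MonoBehaviour
{
    // Assumption: the spatial mapping meshes live on layer 31 in this sketch
    private const int SpatialMappingLayer = 31;

    public bool TryFindFloor(out Vector3 floorPosition)
    {
        floorPosition = Vector3.zero;
        var head = Camera.main.transform.position;
        RaycastHit hitInfo;
        // Cast straight down from the user's head onto the spatial mapping mesh
        if (Physics.Raycast(head, Vector3.down, out hitInfo, 5.0f, 1 << SpatialMappingLayer))
        {
            // Accept only hits at least 1 meter below the head, to skip tables and the like
            if (head.y - hitInfo.point.y >= 1.0f)
            {
                floorPosition = hitInfo.point;
                return true;
            }
        }
        return false;
    }
}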

First of all, we don't have Spatial Mapping in an immersive headset. But by experimenting I found out that (0,0,0) is not the user's head position but apparently the floor directly beneath the headset at startup. This makes life a whole lot easier. I just check

if(Windows.Graphics.Holographic.HolographicDisplay.GetDefault().IsOpaque)

then skip the whole floor finding experience, make the initial map at (0,0,0) and I am done.
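In context, that check might look something like this minimal sketch (the method name is mine, and the conditional compilation is there because the API does not exist in the editor):

private bool IsImmersiveHeadset()
{
#if UNITY_UWP || WINDOWS_UWP
    // An opaque display means an immersive headset; a transparent one means HoloLens
    var display = Windows.Graphics.Holographic.HolographicDisplay.GetDefault();
    return display != null && display.IsOpaque;
#else
    return false;
#endif
}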

Stupid tight loops

In my HoloLens app I got away with calling this on startup.

private bool CheckAllowSomeUrl()
{
    var checkLoader = new WWW("http://someurl");
    while(!checkLoader.isDone);
    return checkLoader.text == "true";
}

This worked, as it was in the class that was used to build the map. In the HoloLens app that class was not used until the user had defined the floor, so it had plenty of time to do its thing. Now, this line was called almost immediately after app startup, the whole thing got stuck in a tight loop, and I only got a black screen.

In the meantime, I have upgraded the Unity version that builds the HoloLens app from 5.6.x to 2017.1.x, and this problem now occurs there as well. Yeah, I know it's stupid. I wrote this quite some time ago, it worked, and I forgot about it. Thou shalt use yield. Now try to pinpoint this while you cannot debug.
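For reference, a minimal sketch of the yield-based alternative, assuming the result can be delivered via a callback instead of a synchronous return value:

using System;
using System.Collections;
using UnityEngine;

public class UrlChecker : MonoBehaviour
{
    public IEnumerator CheckAllowSomeUrl(Action<bool> onResult)
    {
        var checkLoader = new WWW("http://someurl");
        // Yielding the WWW hands control back to Unity until the download is done,
        // instead of blocking the main thread in a tight loop
        yield return checkLoader;
        onResult(checkLoader.text == "true");
    }
}

You would kick this off from a MonoBehaviour with something like StartCoroutine(CheckAllowSomeUrl(allowed => _allowed = allowed)).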

Skybox

A HoloLens app has a black Skybox, as it does not need to generate a virtual environment - its environment is reality. An immersive headset does not have that, so in order to prevent the user from feeling like they are floating in an empty dark space, you have to provide some context. Now Unity has a default Skybox, but according to a Microsoftie who helped me out (but does not want to be named), using the default Skybox is Not A Good Thing and the hallmark of low quality apps. Since I had only ever made HoloLens apps, this never occurred to me. With the aid of the HoloDeveloper Slack channel I selected this package of Skyboxes and picked the HaloSky, which gives a nice half-overcast sky.

Coming from HoloLens, having never had to bother with Skyboxes before, you can spend quite some time looking for how on earth you are supposed to set one. I assume it's all very logical for Unity buffs, but the fact is that you don't have to look in the Scene or the Camera - the most logical places to look, after all - but you have to select Window/Lighting/Settings from the main menu. That will give a popup where you can drag the Skybox material in.

image

You can find this in the documentation, on a page titled "How do I Make a Skybox?" - but since I did not want to make one, just use one, it took me a while to find it. I find this confusing wording rather typical for Unity documentation. The irony is that the page itself is called "HOWTO-UseSkybox.html".

Upgrading can be fun - but not always

At one point I had to upgrade from Unity 5.6.x to 2017.1.x and later 2017.2.x. I have no idea what exactly happened and how, but at some point some settings I had changed from their defaults in some of my components in the Unity editor got reverted to the default values. This was fortunately easy to track down with a diff using TortoiseGit. I also noticed my Store icon got reverted to its default value - no idea why or how, but still.

In the course of upgrading, you will also notice some namespaces have changed in Unity. For instance, everything that used to be in UnityEngine.VR.WSA is now in UnityEngine.XR.WSA. Similar things happened in the Mixed Reality Toolkit. For reasons I don't quite understand, the TextToSpeechManager can now only be called from the main thread. For extra fun, in a later release its name changed to TextToSpeech (sans "Manager") and the method name changed a little too.
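For the UnityEngine.VR/XR namespace change, one way to keep a single code base compiling on both Unity versions is a conditional using - a minimal sketch, assuming the UNITY_2017_2_OR_NEWER version define is available in your Unity release:

using UnityEngine;
#if UNITY_2017_2_OR_NEWER
using UnityEngine.XR.WSA;
#else
using UnityEngine.VR.WSA;
#endif

public class WorldAnchorExample : MonoBehaviour
{
    private void Start()
    {
        // WorldAnchor lives in the WSA namespace on both Unity versions
        var anchor = gameObject.AddComponent<WorldAnchor>();
        Debug.Log(anchor.isLocated);
    }
}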

Submitting for multiple device families

Having only ever submitted either UWP apps supporting all device types, or HoloLens apps that, well, only ran on HoloLens, I was a bit puzzled about how to go about making separate packages for separate device families. I wanted to have an x86-based package for HoloLens, and an x86/x64 package for Windows Mixed Reality. I actually built those on different machines and I also gave them different version numbers.

image

But whatever I tried, I could not get this to work. If I checked both the Holographic and the Windows checkbox, the portal said it would offer both packages on both platforms depending on capabilities. I don't know if that would have caused any problems, but I got a tip from my awesome friend Matteo Pagani that I should dig into the Package.appxmanifest manually.

In my original Package.appxmanifest it said:

<Dependencies>
<TargetDeviceFamily Name="Windows.Universal" MinVersion="10.0.10240.0" 
                    MaxVersionTested="10.0.15063.0" />
</Dependencies>

For my HoloLens app, I changed that into

<Dependencies>
<TargetDeviceFamily Name="Windows.Holographic" MinVersion="10.0.10240.0" 
                    MaxVersionTested="10.0.15063.0" />
</Dependencies>

For my Mixed Reality app, I changed that into

<Dependencies>
<TargetDeviceFamily Name="Windows.Desktop" MinVersion="10.0.16299.0" 
                    MaxVersionTested="10.0.16299.0" />
</Dependencies>

And then I got the results I wanted, and I was absolutely sure the right packages were offered to the right (and capable) devices only.

Setting some more submission options

From the same Microsoftie who pointed me to the Skybox I also got some hints on how to submit a proper Mixed Reality headset app. There were a lot of options I was never even aware of. Under "Properties", for instance, I set this

image

as well as this under "System requirements" (left is minimum hardware, right is recommended hardware)

image

Actually, you should set quite a few more settings concerning the minimum PC specs. Detailed instructions can be found here, including the ones I just discussed ;)

Conclusion

It was a rocky ride but a fun one too. I spent an insane amount of time wrestling with unfinished tools, but seeing my app work on the Mixed Reality headset for the very first time was an adrenaline high I will not forget easily. Even better was the fact I managed to sneak in the back door to get my app in the Store ready for the Fall Creators Update launch - that was a huge victory.

In the end, I did all the work myself, but I could not have gotten there without the help of all the people I already mentioned - not to mention some heroes from the Slack group, particularly Lance McCarthy and Jesse McCulloch, who were always there to get me unstuck.

In hindsight, Mixed Reality development is not harder than HoloLens development. In fact, I'd call it easier, because you are not constrained by device limits, deployment and testing go faster, and the Mixed Reality Toolkit has evolved to a point where things get insanely easy. Nearly all my woes were caused by my stubborn determination to be there right out of the gate, so I had to use tools that still had severe issues. Now that stuff is fleshed out, there's not nearly as much pain. The fun thing is, when all is said and done, HoloLens apps and Mixed Reality apps are very much the same. Microsoft's vision of one platform for 3D apps is really coming true. You can re-use your HoloLens knowledge for Mixed Reality - and vice versa. Which brings us to:

Lesson 6: if you thought HoloLens development was too expensive for you, get yourself a headset and a PC to go with it. It's insanely fun, and a completely new, exciting and nearly empty market is waiting for you!

Enjoy!

02 August 2017

Creating a 3D topographical map in your HoloLens / Windows MR app with the Bing Maps Elevation API

Intro

All right, my friends. It's time for the real deal - the blog post I have been itching to write since I first published the 3D version of Walk the World, some two months ago. It took me a while to extract the minimal understandable code from my rather convoluted app. Everyone who has ever embarked on a trip of 'exploratory programming' (meaning you have an idea of what you want to do, but only some vague clues how) knows how you end up with a repo (and code) full of failed experiments, side tracks, etc. So I had to clean that up a little first. Also, my app does a lot more than just show the map, and those features would obscure the general idea. As a bonus - after creating this blog post I finally actually understand myself how, and most importantly why, the app works :).

So, without further ado, I am going to show you how to display a 3D map in your HoloLens or Windows MR headset. Just like in Walk the World. I will build upon my previous post, in which I showed you how to make a flat slippy map. This time, we are going 3D.

The general idea

As you can read in the previous post, I 'paste' the actual map tiles - mere images - on a Unity3D Plane. A Plane is a so-called mesh, consisting of a grid of 11x11 points that form the vertices. If I could somehow ascertain the actual elevation at those locations, I could move those points up and down and actually get a 3D map. The tile image itself will be stretched along with them. The idea of manipulating the insides of a mesh, which turns out to be very simple, is explained by the awesome Rick Bazarra in the first episode of his must-see "Unity Strikes Back" explanatory video series on YouTube, a follow-up to his Creative Coding with Unity series on Channel 9, which I consider a standard starting point for everyone who wants to get off the ground with Unity3D.

So where do we get those elevations? Enter the awesome Microsoft service called the Bing Maps Elevation API. It seems to be built-to-order for this task. Your first order of business - get yourself a Basic Bing Maps key.

Adding some geo-intelligence to the tile

The Bing Maps Elevation API documentation describes an endpoint GetElevations that allows you to get altitudes in several ways. One of them is a grid of altitudes in a bounding box. That is what we want - our tiles are square. The documentation says the bounding box should be specified as follows:

"A bounding box defined as a set of WGS84 latitudes and longitudes in the following order:
south latitude, west longitude, north latitude, east longitude"

If you envision a tile positioned so that north is up, we are required to calculate the geographical location of the top-right and bottom-left of the tile. The Open Street Maps Wiki provides code for the north-west corner of the tile, i.e. top left. I translated the code to C#...

//http://wiki.openstreetmap.org/wiki/Slippy_map_tilenames#C.23
private WorldCoordinate GetNorthWestLocation(int tileX, int tileY, int zoomLevel)
{
    var p = new WorldCoordinate();
    var n = Math.Pow(2.0, zoomLevel);
    p.Lon = (float)(tileX / n * 360.0 - 180.0);
    var latRad = Math.Atan(Math.Sinh(Math.PI * (1 - 2 * tileY / n)));
    p.Lat = (float) (latRad * 180.0 / Math.PI);
    return p;
}

... and it works fine, but we need the south-west and the north-east points. Well, if you consider how the tiles are stacked, you can easily see that the north-east point is the north-west point of the tile to the right of our current tile, and the south-west point is the north-west point of the tile below our tile. Therefore we can use the north-west points of those adjacent tiles to find the values we actually need - like this:

public WorldCoordinate GetNorthEast()
{
    return GetNorthWestLocation(X+1, Y, ZoomLevel);
}

public WorldCoordinate GetSouthWest()
{
    return GetNorthWestLocation(X, Y+1, ZoomLevel);
}

That was easy, right?

Size matters

Although Microsoft is a US company, it fortunately has an international orientation, so the Bing Maps Elevation API returns no yards, feet, inches, miles, furlongs, stadia, or any other deprecated distance unit – it returns plain old meters. Which is very fortunate, as the Windows Mixed Reality distance unit is – oh joy – meters too. But it returns elevation in real-world values, and while it might be fun to show Kilimanjaro at its real height, it would be a bit too big to fit in my room (or any room, for that matter). Open Street Map is shown at a fixed scale per zoom level – and as a GIS guy, I like the height to be correctly scaled as well, to get that real-life feeling. Once again referring to the Open Street Map Wiki – there is a nice table that shows how many meters a pixel represents at any given zoom level. We will need the size per tile (which is 256 pixels, as I explained in the previous post), so we add the following code that will give you a scale factor for the available zoom levels of Open Street Map:

//http://wiki.openstreetmap.org/wiki/Zoom_levels
private static readonly float[] _zoomScales =
{
    156412f, 78206f, 39103f, 19551f, 9776f, 4888f, 2444f,
    1222f, 610.984f, 305.492f, 152.746f, 76.373f, 38.187f,
    19.093f, 9.547f, 4.773f, 2.387f, 1.193f, 0.596f, 0.298f
};

private const int MapPixelSize = 256;

public float ScaleFactor
{
    get { return _zoomScales[ZoomLevel] * MapPixelSize; }
}

Creating the request

Now we move to MapTile. We add the following code to download the Bing Maps Elevation API values

private string _mapToken = "your-map-token-here";

public bool IsDownloading { get; private set; }

private WWW _downloader;

private void StartLoadElevationDataFromWeb()
{
    if (_tileData == null)
    {
        return;
    }
    var northEast = _tileData.GetNorthEast();
    var southWest = _tileData.GetSouthWest();

    var urlData = string.Format(
    "http://dev.virtualearth.net/REST/v1/Elevation/Bounds?bounds={0},{1},{2},{3}&rows=11&cols=11&key={4}",
     southWest.Lat, southWest.Lon, northEast.Lat, northEast.Lon, _mapToken);
    _downloader = new WWW(urlData);
    IsDownloading = true;
}

This simply queries the TileInfo structure using the new methods we have just created. Notice it then builds the URL, containing the bounds, the hard-coded 11x11 points that are in a Unity Plane, and the key. Then it calls a piece of Unity3D code called "WWW", which is a sort of HttpClient named by someone with a lot of imagination (NOT). And that's it. We add a call to the existing SetTileData method like this:

public void SetTileData(TileInfo tiledata, bool forceReload = false)
{
    if (_tileData == null || !_tileData.Equals(tiledata) || forceReload)
    {
        TileData = tiledata;
        StartLoadElevationDataFromWeb();
    }
}

so that whenever tile data is supplied, it does not only initiate the downloading of the tile, but also the downloading of the 3D data.

Processing the 3D data

Next up is a method ProcessElevationDataFromWeb, which is called from Update (so about 60 times a second). In this method we check whether the MapTile is downloading – and if it's ready, we process the data:

protected override void OnUpdate()
{
    ProcessElevationDataFromWeb();
}

private void ProcessElevationDataFromWeb()
{
    if (TileData == null || _downloader == null)
    {
        return;
    }

    if (IsDownloading && _downloader.isDone)
    {
        IsDownloading = false;
        var elevationData = JsonUtility.FromJson<ElevationResult>(_downloader.text);
        if (elevationData == null)
        {
            return;
        }

        ApplyElevationData(elevationData);
    }
}

An ElevationResult is a class to deserialize the result of a call to the Bing Maps Elevation API into. I entered the result of a manual call into Json2CSharp and got a class structure back – only I changed all properties into public fields so the rather limited Unity JsonUtility, which does not seem to understand the concept of properties, can handle it. I also initialized the lists in the objects from the constructors. It's not very interesting, but if you want a look, go here in the demo project.
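For reference, a rough sketch of what those classes boil down to, assuming the standard shape of a Bing Maps Elevation API response (only the fields used in this post are shown - the real classes sit in the demo project):

using System;
using System.Collections.Generic;

[Serializable]
public class ElevationResult
{
    // Public fields (not properties), as Unity's JsonUtility requires
    public List<ResourceSet> resourceSets = new List<ResourceSet>();
}

[Serializable]
public class ResourceSet
{
    public List<Resource> resources = new List<Resource>();
}

[Serializable]
public class Resource
{
    // One elevation value (in meters) per requested grid point, row by row
    public List<int> elevations = new List<int>();
}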

Applying the 3D data.

So now it’s time to actually move the mesh points up and down - mostly using code I stole from Rick Bazarra, with a few adaptations of my own:

private void ApplyElevationData(ElevationResult elevationData)
{
    var threeDScale = TileData.ScaleFactor;

    var resource = elevationData.resourceSets[0].resources[0];

    var verts = new List<Vector3>();
    var mesh = GetComponent<MeshFilter>().mesh;
    for (var i = 0; i < mesh.vertexCount; i++)
    {
        var newPos = mesh.vertices[i];
        newPos.y = resource.elevations[i] / threeDScale;
        verts.Add(newPos);
    }
    RebuildMesh(mesh, verts);
}

private void RebuildMesh(Mesh mesh, List<Vector3> verts)
{
    mesh.SetVertices(verts);
    mesh.RecalculateNormals();
    mesh.RecalculateBounds();
    DestroyImmediate(gameObject.GetComponent<MeshCollider>());
    var meshCollider = gameObject.AddComponent<MeshCollider>();
    meshCollider.sharedMesh = mesh;
}

First we get the scale factor – that’s simply the value by which the elevation data must be divided to make it match the current zoom level. Next, we get the elevation data itself, which sits two levels down in the ElevationResult. And then we modify the elevation of the mesh points to match the elevations we got. For some reason - and that's why I said the Bing Maps Elevation API looks built-to-order for this task - the points come in at exactly the right order for Unity to process in the mesh.

As I learned from Rick, you cannot modify the points of a mesh; you have to replace them. So we loop through the mesh points and fill a list with points that have their y – the vertical direction – changed to a scaled value of the elevation. Then we call RebuildMesh, which simply replaces the entire mesh's vertices, does some recalculation, and rebuilds the collider, so your gaze cursor will actually play nice with the new mesh. I also noticed that if you don’t do the recalculate stuff, you will end up looking partly through tiles. I am sure people with a deeper understanding of Unity3D will understand why. I just found out that it needs to be done.

Don't press play yet! There are a few tiny things left to do to make the result look good.

Setting the right location and material

First of all, the map is kind of shiny, which was more or less okay-ish for the flat map, but if you turn the map into 3D you will get an over-bright effect. So open up the project in Unity, create a new material “MapMaterial” and apply the properties as displayed below on the left. The color of the material should be #BABABAFF (see left image). When that is done, drag it on top of the MapTile (see right image).

imageimage

Then, the app is still looking at Redmond. While that’s an awesome place, there isn’t much spectacular to see as far as geography is concerned. So we mosey over to the MapBuilder script. There we change the zoom level to 14, the Latitude to 46.78403 and the Longitude to -121.7543

image

It's a little east and quite a bit south of Redmond. In fact, when you press play, you will see a landmark that is very familiar if you live anywhere near the Seattle area or have visited it:

image

famous Mount Rainier, the volcano sitting about 100 km from Seattle, very prominently visible from aircraft - weather permitting. To get this view, I had to fiddle with the view control keys a little after pressing play - if you press play initially you will see Rainier from a lot closer up.

And that, my friends, is how you make a 3D map in your HoloLens. Of almost any place in the world. Want to see Kilimanjaro? Change Latitude to -3.21508 and Longitude to 37.37316. Press play.

image

Niagara Falls? Latitude 43.07306, Longitude -79.07561, and change the zoom level to 17. Rotate the view forward a little with your mouse and pull back. You have to look down. But then, here you go.

image

GIS is cool, 3D GIS is ultra-cool! All that is left is to generate the UWP app and deploy it into your HoloLens or view it in your new Windows Mixed Reality device.

Caveat emptor

Awesome, right? Now there are a few things to consider. In my previous post I said this app was a bandwidth hog, as it downloads 169 tiles per map. In addition, it now also fires 169 requests per map to the Bing Maps Elevation API. For. Every. Single. Map. Every time. Apart from the bandwidth and power consequences, there's another thing to consider. If you go to this page and click "Basic Key", you will see something like this:

image

What it boils down to is this - if your app is anywhere near successful, it will eat through your allotted request limit very fast, you will get a mail from the Bing team kindly informing you of this (been there, got that) - and then suddenly you will have a 2D app again. You will have to buy an enterprise key, and those are not cheap. So I advise you to do some caching - both in the app and, if possible, on an Azure service. I employed a Redis cache to that end.
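As an illustration, here is a minimal sketch of what an in-app cache could look like - the class, key format and method names are mine rather than the actual app's, and the server-side Redis part is not shown:

using System.Collections.Generic;

public class ElevationCache
{
    // Keyed by tile X, Y and zoom level - the three values that uniquely identify a tile
    private readonly Dictionary<string, ElevationResult> _cache =
        new Dictionary<string, ElevationResult>();

    private static string GetKey(TileInfo tile)
    {
        return string.Format("{0}_{1}_{2}", tile.X, tile.Y, tile.ZoomLevel);
    }

    public bool TryGet(TileInfo tile, out ElevationResult result)
    {
        return _cache.TryGetValue(GetKey(tile), out result);
    }

    public void Store(TileInfo tile, ElevationResult result)
    {
        _cache[GetKey(tile)] = result;
    }
}

Checking such a cache before firing the web request, and storing the result once it arrives, would already save a lot of repeat calls when the user revisits an area.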

Furthermore, I explained that I calculate the north-east and south-west points of a tile using the north-west points of the tiles to the right of and below the current tile. If those tiles are not present, because you are at the edge of the map, I have no idea what will happen - but presumably it won't work as expected. You can run into this when you are zoomed out sufficiently in the very south or east of the map. But then you are either at the International Date Line (which runs from the North Pole to the South Pole on the side of the Earth exactly opposite the Greenwich Meridian) or at Antarctica. At the first spot, there's mostly ocean (why else do you think they put it there) and thus no (visible) geography to speak of. As far as Antarctica goes, you'll hit another limitation, for it clearly says in the Bing Maps Elevation API documentation:

"There is a limitation in that latitude coordinates outside of the range of -85 and 85 are not supported."

So beware. Stay away from the Earth's edges. Your app might fall off :).

Some assembly required, batteries not included

Indeed, it does not look exactly like in the videos I showed. Walk the World employs different map tile sets (plural indeed), and there's also all kinds of other stuff my app does - like sticking the map to the floor so that even at high elevations you have a nice overview, reverse geocoding so you can click on the map to see what's there, tracking the user's movements so it can make a new map appear where the user is walking off it - connecting to the old one, zooming in/out centered on the user's location in the map, showing a map of the physical surroundings... there's a lot of math in there. I only showed you the basics. If you need a HoloDeveloper who knows and understands GIS to the core, you know who to contact now :)

Conclusion

Once you know the basics, it's actually pretty easy to create a 3D scaled map of just about anywhere in the world - that is, anywhere the Bing Maps Elevation API is supported. The 3D stuff is actually the easy part - knowing how to calculate tiles and build a slippy map is harder. But in the end, it is always easy when you know what to do. Like I said, I think GIS is a premier field in which Mixed Reality will shine. I am ready for it; I hope you are too. Innovate or die - there are certainly companies I know that could take that advice and get moving.

Get the demo project and get inspired!

Credits

Thanks to René Schulte, the wise man from 51.050409, 13.737262 ;) for pointing me to the Bing Maps Elevation API. And of course to Rick Bazarra, who inspired me so often and actually provided some crucial code in his YouTube training video series.

22 July 2017

Creating a geographical map on the floor in your Hololens / Windows MR app

Intro

If you make an awesome app like Walk the World, of course you are not going to spill the beans on how you built it, right? Well – wrong, because I think sharing knowledge is the basis of any successful technical community, because I like doing it, and last but not least – I feel it is one of the primary things that makes an MVP an MVP. I can’t show you all the details, if only because the app is humongous and badly in need of some overhaul / refactoring, but I can show you some basic principles that will allow you to make your own map. And that’s exactly what I am going to do.

‘Slippy map’?

Getting on my GIS hobby horse here :) Gather around the fire ye youngsters and let this old GIS buff tell you all about it. ;)

Slippy maps are maps made out of pre-rendered square images that together form a continuous-looking map. Well-known examples are Bing Maps and Google Maps, as well as the open source Open Street Map – which we use for this example. The images are usually 256x256 pixels. Slippy maps have a fixed number of discrete zoom levels. For every zoom level there is a number of tiles. Zoom level 0 is typically 1 image showing the whole world. Zoom level 1 is 4 tiles, each showing 1/4th of the world. Zoom level 2 is 16 tiles, each showing 1/16th of the world. You get the idea. Generally speaking, a zoom level has 2^zoomlevel x 2^zoomlevel tiles.

You can see the number of tiles (and the amount of data servers need to store them) go up very quickly. Open Street Map’s maximum zoom level is 19 – which all by itself is 2^19 x 2^19 = 524,288 x 524,288 ≈ 274.9 billion tiles, and that is on top of all the other levels. Google Maps’ maximum zoom level is 21 in some places, and I think Bing Maps goes even further. The number of tiles at level 21 alone would be 4,398,046,511,104 – about 4.4 trillion. And that is not all - they even have multiple layers: map, terrain, satellite. Trillions upon trillions of tiles - that is even too much to swallow for Microsoft and Google, which is why they vary the maximum zoom level depending on the location – in the middle of the ocean there are considerably fewer zoom levels available ;). But still: you need really insane amounts of storage and bandwidth to serve a slippy map of the whole world with some amount of detail, which is why so few parties actually meet this challenge – mostly Google, Microsoft, and Open Street Map.

Anyway - in a slippy map, tiles are defined in rows and columns per zoom level. The takeaway is to realize that a map tile can be identified by three parameters and three parameters only: X, Y and zoom level. And those need to be converted into a unique URL per tile. How these tiles are exactly organized depends on the actual map supplier. And if your starting point is a lat/lon coordinate, you will have to do some extra math. Basically, all you need to know you can find here at the Open Street Map wiki. But I am going to show it in more detail anyway.

Setting up the project

For this project we will use Unity 2017.1.0f3. Do not forget to install the Windows Store (.NET) Target Support as well. When you are done, clone my basic setup from Setting up a HoloLens project with the HoloToolkit - June 2017 edition. Rename the folder to “SlippyMapDemo”, then

  • Open the project in Unity
  • Open the Build Settings window (CTRL+B)
  • Make sure it looks like this:

image 

  • Hit the "Player Settings..." button
  • Change "June2017HoloLensSetup" to "SlippyMapDemo" wherever you see it. Initially you will see only one place, but please expand the "Icon" and "Publishing Settings" panels as well; there are more boxes to fill in.

Upgrading the HoloToolkit

To make a working app on the new Unity version, we will need a new HoloToolkit. So delete everything from Assets/Holotoolkit, but do it from Unity, as this will leave the .gitignore in place.

Select all items, press delete. Then, go to http://holotoolkit.download/ and download the latest HoloToolkit. If you click “Open” while downloading, Unity will automatically start to import it. I would once again suggest de-selecting Holotoolkit-Tests as they are not useful for an application project.

Some basic stuff first

We will need a simple class that can hold a lat/lon coordinate. UWP supplies such classes, but that means we cannot test in the editor and, well, we don’t need all they can do. So we start off with this extremely simple class:

public class WorldCoordinate
{
    public float Lon { get; set; }
    public float Lat { get; set; }

    public override string ToString()
    {
        return string.Format("lat={0},lon={1}", Lat, Lon);
    }
}

Lat/Lon to tile

A map has a location – latitude and longitude – so that is our starting point. We need a way to get the tile on which the desired location lies. A tile is defined by X, Y and zoom level, remember? We start like this:

using System;
using UnityEngine;

public class TileInfo : IEquatable<TileInfo>
{
    public float MapTileSize { get; private set; }

    public TileInfo(WorldCoordinate centerLocation, int zoom, float mapTileSize)
    {
        SetStandardValues(mapTileSize);

        var latrad = centerLocation.Lat * Mathf.Deg2Rad;
        var n = Math.Pow(2, zoom);
        X = (int)((centerLocation.Lon + 180.0)/360.0*n);
        Y = (int)((1.0 - Mathf.Log(Mathf.Tan(latrad) + 1 / Mathf.Cos(latrad)) / Mathf.PI) / 2.0 * n);
        ZoomLevel = zoom;
    }

    private void SetStandardValues(float mapTileSize)
    {
        MapTileSize = mapTileSize;
    }

    public int X { get;  set; }
    public int Y { get;  set; }

    public int ZoomLevel { get; private set; }

}

Via a simple constructor, X and Y are calculated from latitude, longitude and zoom level. The MapTileSize is the apparent physical size of the map tile, which we will need later (it will be 0.5 meters, as we will see).

Wow. That seems like some very high-brow GIS calculation that only someone like me understands, right? ;) Maybe, maybe not, but you can find this formula on the Open Street Map wiki, more specifically, here.

So now we have the center tile, and to calculate the tiles next to it, we simply need another constructor

public TileInfo(int x, int y, int zoom, float mapTileSize)
{
    SetStandardValues(mapTileSize);
    X = x;
    Y = y;
    ZoomLevel = zoom;
}

In a simple loop, as we will also see later, we can find the tiles next to, above and below it, and we can define a MapBuilder class that loops over this information and makes a grid of map tiles. BTW, this class also has some equality logic, but that’s not very exciting in this context, so you can look that up in the demo project.

Tile to URL

As I have explained, a tile can be identified by three numbers: X, Y and zoom level, but to actually show the tile, you need to convert those into a URL. So I defined this interface to make the tile retrieval map-system agnostic:

public interface IMapUrlBuilder
{
    string GetTileUrl(TileInfo tileInfo);
}

and this single class implementing it for Open Street Map:

using UnityEngine;

public class OpenStreetMapTileBuilder : IMapUrlBuilder
{
    private static readonly string[] TilePathPrefixes = { "a", "b", "c" };

    public string GetTileUrl(TileInfo tileInfo)
    {
        return string.Format("http://{0}.tile.openstreetmap.org/{1}/{2}/{3}.png",
                   TilePathPrefixes[Mathf.Abs(tileInfo.X) % 3],
                   tileInfo.ZoomLevel, tileInfo.X, tileInfo.Y);
    }
}

This is not new code. Regular readers (or better – long-time readers) of this blog may have seen it as early as 2010, when I showed how to do this for Windows Phone 7.

Some (very little) re-use

Basically we are going to download images from the web again, using the URL that the IMapUrlBuilder can calculate from the TileData. Downloading and showing images in a HoloLens app – been there, done that, and in fact, that’s what made the idea of Walk the World pop up in my mind in the first place. So I am going to reuse the DynamicTextureDownloader from this post. In the demo project, it sits in HolotoolkitExtensions. We make a simple child class:

using HoloToolkitExtensions.RemoteAssets;
using System.Collections;
using UnityEngine;

public class MapTile : DynamicTextureDownloader
{
    public IMapUrlBuilder MapBuilder { get; set; }

    private TileInfo _tileData;

    public MapTile()
    {
        MapBuilder = MapBuilder != null ? MapBuilder : new OpenStreetMapTileBuilder();
    }

    public void SetTileData(TileInfo tiledata, bool forceReload = false)
    {
        if (_tileData == null || !_tileData.Equals(tiledata) || forceReload)
        {
            TileData = tiledata;
        }
    }

    public TileInfo TileData
    {
        get { return _tileData; }
        private set
        {
            _tileData = value;
            ImageUrl = MapBuilder.GetTileUrl(_tileData);
        }
    }
}

So what happens is this – if you set the tile data using SetTileData, it will ask the MapBuilder to calculate the tile image URL, the result will be assigned to the parent class’ ImageUrl property, and the image will automatically be drawn as a texture on the Plane the script is supposed to be added to.

Creating the MapTile prefab

In HologramCollection, create a Plane and call it MapTile. Change its X and Z scale to 0.05 so the 10x10 meter plane will show as 0.5 meters. Then add the MapTile script to it as a component. Finally, drag the MapTile Plane (with attached script) from the HologramCollection to the Prefabs folder in Assets.

image

If you are done, remove the MapTile from the HologramCollection in the Hierarchy.

A first test

To the HologramCollection we add another empty element called “Map”. Set its Y position to –1.5, so the map will appear well below our viewpoint. To that Map game object we add a first version of our MapBuilder script:

using UnityEngine;

public class MapBuilder : MonoBehaviour
{
    public int ZoomLevel = 12;

    public float MapTileSize = 0.5f;

    public float Latitude = 47.642567f;
    public float Longitude = -122.136919f;

    public GameObject MapTilePrefab;


    void Start()
    {
        ShowMap();
    }

    public void ShowMap()
    {
        var mapTile = Instantiate(MapTilePrefab, transform);
        var tile = mapTile.GetComponent<MapTile>();
       tile.SetTileData(
           new TileInfo(
               new WorldCoordinate{ Lat = Latitude, Lon = Longitude }, 
               ZoomLevel, MapTileSize)
           );
    }
}

This simple script basically creates one tile based upon the information you have provided. Since it does nothing with location, it will appear at 0,0,0 in the Map, which itself is 1.5 meters below your viewpoint. If you were to run the result in the HoloLens, a 0.5x0.5m map tile should appear right around your feet.

Anyway, drag the prefab we created in the previous step onto the Map Tile Prefab property of the MapBuilder …

image

… and press the Unity Play button. You will see nothing at all, but if you rotate the HoloLens Camera 90 degrees over X (basically looking down) while in play mode, a map tile will appear, showing Redmond.

image

Only it will be upside down, thanks to the default way Unity handles Planes - something I still don't understand the reason for. Exit play mode, select the Map game object and change its Y rotation to 180, hit play mode again and rotate the camera once again 90 degrees over X.

image

That’s more like it.

The final step

Yes, there is only one step left. Instead of making one tile, let’s make a grid of tiles.

We add three more fields to the MapBuilder script (note that the grid code below also needs using System.Collections.Generic and System.Linq at the top of the file):

public float MapSize = 12;

private TileInfo _centerTile;
private List<MapTile> _mapTiles;

And then here’s the body of MapBuilder v2

void Start()
{
    _mapTiles = new List<MapTile>();
    ShowMap();
}

public void ShowMap()
{
    _centerTile = new TileInfo(new WorldCoordinate { Lat = Latitude, Lon = Longitude }, 
        ZoomLevel, MapTileSize);
    LoadTiles();
}

private void LoadTiles(bool forceReload = false)
{
    var size = (int)(MapSize / 2);

    var tileIndex = 0;
    for (var x = -size; x <= size; x++)
    {
        for (var y = -size; y <= size; y++)
        {
            var tile = GetOrCreateTile(x, y, tileIndex++);
            tile.SetTileData(
                new TileInfo(_centerTile.X - x, _centerTile.Y + y, ZoomLevel, MapTileSize), 
                forceReload);
            tile.gameObject.name = string.Format("({0},{1}) - {2},{3}", 
                x, y, tile.TileData.X, tile.TileData.Y);
        }
    }
}

private MapTile GetOrCreateTile(int x, int y, int i)
{
    if (_mapTiles.Any() && _mapTiles.Count > i)
    {
        return _mapTiles[i];
    }

    var mapTile = Instantiate(MapTilePrefab, transform);
    mapTile.transform.localPosition = new Vector3(
        MapTileSize * x - MapTileSize / 2, 0, 
        MapTileSize * y + MapTileSize / 2);
    mapTile.transform.localRotation = Quaternion.identity;
    var tile = mapTile.GetComponent<MapTile>();
    _mapTiles.Add(tile);
    return tile;
}

In ShowMap we first calculate the center tile’s data, and then in LoadTiles we simply loop over a square matrix from –(MapSize/2) to +(MapSize/2). And yes indeed, if you define a MapSize of 12 you will actually get a map of 13x13 tiles, because there has to be a center tile. If there is a center tile, the resulting matrix by definition has an odd number of tiles on each side.

In LoadTiles you see TileInfo’s second constructor in action: just X, Y and zoom level. Once you have the center tile, calculating adjacent tiles is extremely easy. GetOrCreateTile does the actual instantiation and positioning in space of the tiles – that’s what we need the MapTileSize for. Note that, for an extra performance gain (and to prevent memory leaks), it keeps the instantiated game objects in memory once they are created, so if you call ShowMap from code after you have changed one of the MapBuilder’s parameters, it will re-use the existing tiles instead of generating new ones. LoadTiles also creates a name for each MapTile, but that’s only for debugging / educational purposes – that way you can see which tiles are actually downloaded in the Hierarchy while in Unity play mode.

If you deploy this into a HoloLens and look down you will see a map of 6.5x6.5 meters flowing over the floor.

image

To make this completely visible I had to move the camera up over 20 meters ;) but you get the drift.

Some words of warning

  • The formula in TileInfo calculates a tile from latitude and longitude. That only guarantees the location will be on that tile; it doesn’t say anything about where on the tile it is. You can see Redmond on the center tile, but it’s not quite in the center of that center tile. It might just as well have been at the top left. The more you zoom out, the more this becomes a factor.
  • This sample shows Open Street Map and Open Street Map only. You will need to build IMapUrlBuilder implementations yourself if you want to use other map providers. Regular readers of this blog know this is very easy to do. But please be aware of the TOS of map providers.
  • Be aware that the MapBuilder with a MapSize setting of 12 downloads 13x13 = 169 tiles from the internet. On every call. This app is quite the bandwidth hog - and probably a power hog as well.

Conclusion and some final thoughts

Building basic maps in Unity for use in your HoloLens / Windows Mixed Reality apps is actually pretty easy. I will admit I don’t understand all the fine details of the math either, but I do know how slippy maps are supposed to work, and once I had translated the formula to C#, the rest was actually not that hard, as I hope to have shown.

Almost all data somehow has a relation to a location, and being able to generate a bird's-eye environment in which you can collaborate with your peers will greatly enhance productivity and the correct interpretation of data. Especially if you add 3D geography and/or buildings to it. It looks like reality, it shows you what’s going on, and you have to do less and less translation of 2D data into 3D. We are 3D creatures, and it’s high time our data jumps off the screen to join us in 3D space.

I personally think GIS and geo-apps are one of the premier fields in which AR/VR/MR will shine – and this will kick off an awesome revolution in the GIS world IMHO. Watch this space. More to come!

Demo project here. Enjoy!

15 July 2017

Styles in Xamarin Forms don't work properly in UWP .NET Native - here is how to fix it

Intro

Xamarin Forms is awesome. If you have learned XAML from WPF, Silverlight, Windows Phone, Universal Windows Apps or UWP, you can jump right in using the XAML you know (or at least something that looks remarkably familiar) and start to make apps that run cross-platform on iOS, Android and UWP. So potentially your app can not only run on phones, but also on Xbox, HoloLens and PCs.

OnPlatform FTW!

One of the coolest things is the OnPlatform construct. For instance, you can have something like this:

<Style TargetType="Label" x:Key="OtherTextStyle" >
    <Setter Property="FontSize">
        <OnPlatform x:Key="FontSize" x:TypeArguments="x:Double" >
            <On Platform="Windows" Value="100"></On>
            <On Platform="Android" Value="30"></On>
            <On Platform="iOS" Value="30"></On>
        </OnPlatform>
    </Setter>
</Style>

This indicates that the label that has this style applied to it should have a font size of 100 on Windows, 30 on Android, and 30 on iOS. In the demo project I have defined some styles in App.xaml, and the net result is that it looks like this on Android (left), iOS (right) and Windows (below).

imageimage

image

The result is not necessarily very beautiful, but if you look in MainPage.xaml you will see everything has a style and no values are hard coded. You can also see that although the Android and iOS apps are mobile apps and the Windows app is essentially an app running on a tablet or a PC (the demarcation line between those is becoming hazier by the day), it will still work out using OnPlatform.

I have used various constructs. Apart from the inline construct as I showed above, there's also this one

<OnPlatform x:Key="ImageSize" x:TypeArguments="x:Double" >
    <On Platform="Windows" Value="150"></On>
    <On Platform="Android" Value="100"></On>
    <On Platform="iOS" Value="90"></On>
</OnPlatform>

<Style TargetType="Image" x:Key="ImageStyle" >
    <Setter Property="HeightRequest" Value="{StaticResource ImageSize}" />
    <Setter Property="WidthRequest" Value="{StaticResource ImageSize}" />
    <Setter Property="VerticalOptions" Value="Center" />
    <Setter Property="HorizontalOptions" Value="Center" />
</Style>

This is a construct I would very much recommend, as it enables you to re-use the ImageSize value for other things, for instance the height of a button in another style. You can also use these doubles directly in XAML, like I did with SomeOtherTextFontSize in the last label in MainPage.xaml:

<?xml version="1.0" encoding="utf-8" ?>
<ContentPage xmlns="http://xamarin.com/schemas/2014/forms"
             xmlns:x="http://schemas.microsoft.com/winfx/2009/xaml"
             x:Class="UWPStyleIssue.MainPage">
    <Grid VerticalOptions="Center" HorizontalOptions="Center" >
        <Grid.RowDefinitions>
            <RowDefinition Height="*"></RowDefinition>
            <RowDefinition Height="*"></RowDefinition>
            <RowDefinition Height="*"></RowDefinition>
            <RowDefinition Height="*"></RowDefinition>
        </Grid.RowDefinitions>
        <Image Grid.Row="0"
            Source=
               "https://media.licdn.com/mpr/mpr/shrinknp_400_400/[abbreviated]jpg"
           Style="{StaticResource ImageStyle}"></Image>
        <Label Text="Welcome to                       Xamarin Forms!" Grid.Row="1"
               Style="{StaticResource TextStyle}"/>
        <Label Text="Yet another line" Grid.Row="2" 
               Style="{StaticResource OtherTextStyle}"/>
        <Label Text="Last Line" Grid.Row="3" 
               FontSize="{StaticResource SomeOtherTextFontSize}"/>
    </Grid>
</ContentPage>

Although I do not recommend this practice - styles are much cleaner - sometimes needs must and this can be handy.

I can hear you think by now: "your point please, kind sir?" (or most likely something less friendly). Well... it works great on Android, as you have seen. It also works great on iOS. And yes, on Windows too...

OnPlatform WTF?

... until you think "let's get this puppy into the Windows Store". As every Windows developer knows, if you compile for the Store, you compile for Release, which kicks off the .NET Native toolchain. This is very easy to spot, as the compilation process takes much longer. The result is not Intermediate Language (IL), but binary code - an exe - which makes UWP apps so much faster than their predecessors. Unfortunately, it also means the release build is an entirely different beast from a debug build, which can have some unexpected side effects. In our application, if you run the Release build, you will end up with this.

image

That is quite some 'side effect'. No margin to pull the first text up, no font size (just default), no image... WTF indeed.

Analysis

Unfortunately I had some issues with another library (FFImageLoading) which took me down the wrong track for quite a while, but after I had fixed that, I noticed that when I changed the styles from OnPlatform to hard-coded values, the styling started to work again - even in .NET Native. So if I did this

<x:Double x:Key="ImageSize">150</x:Double>
<!--<OnPlatform x:Key="ImageSize" x:TypeArguments="x:Double"  >
    <On Platform="Windows" Value="150"></On>
    <On Platform="Android" Value="100"></On>
    <On Platform="iOS" Value="90"></On>
</OnPlatform>-->

at least my image showed up again:

image

With a deadline looming and a ginormous style sheet in my app, I really had no time to make a branch with separate styles for Windows. We had to go to the Store and we had to go now. Time for a cunning plan. I came up with this:

A solution/workaround/hack/fix ... sort of

So it works when the styles contain direct values, not OnPlatform, right...? If you look at App.xaml.cs in the portable project you will see a line in the constructor that's usually not there, and it's commented out:

public App()
{
    InitializeComponent();
    //this.FixUWPStyling();

    MainPage = new UWPStyleIssue.MainPage();
}

If you remove the slashes and run the app again in Release....

image

magic happens. All styles seem to work again. This is because of an extension method in the file ApplicationExtensions, which you will find in the Portable project in the Extensions folder:

public static void FixUWPStyling(this Application app)
{
    if (Device.RuntimePlatform == Device.Windows)
    {
        app.ConvertAllOnPlatformToExplict();
        app.ConvertAllOnDoubleToPlainDouble();
    }
}

The first method, ConvertAllOnPlatformToExplict, does the following:

  • Loop through all the styles
  • Loop through all the setters in a style
  • Check if the setter's property name is either "HeightRequest", "WidthRequest", or "FontSize"
  • If so, extract the Windows value from the OnPlatform object
  • Set the setter's value to a plain double containing the extracted Windows value

It's crude, and it requires just about everything to be in OnPlatform, but it does the trick. I am not going to write it all out in full here - it's not very great code, and you can see it all on GitHub anyway.
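
To give you an idea, though, here is a rough sketch of the approach. This is not the actual GitHub code; it assumes every relevant setter value is an OnPlatform<double> with an On entry whose platform list contains "Windows":

using System;
using System.Linq;
using Xamarin.Forms;

public static class ApplicationExtensions
{
    // Sketch only: walk all styles in the application resources and replace
    // OnPlatform<double> setter values for the size-related properties
    // with a plain double holding the Windows value.
    public static void ConvertAllOnPlatformToExplict(this Application app)
    {
        var sizeProperties = new[] { "HeightRequest", "WidthRequest", "FontSize" };

        foreach (var style in app.Resources.Values.OfType<Style>())
        {
            foreach (var setter in style.Setters)
            {
                if (!sizeProperties.Contains(setter.Property.PropertyName))
                {
                    continue;
                }

                var onPlatform = setter.Value as OnPlatform<double>;
                if (onPlatform == null)
                {
                    continue;
                }

                // Assumption: there is an On entry for the "Windows" platform
                var windowsEntry = onPlatform.Platforms
                    .FirstOrDefault(on => on.Platform.Contains("Windows"));
                if (windowsEntry != null)
                {
                    setter.Value = Convert.ToDouble(windowsEntry.Value);
                }
            }
        }
    }
}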

Then, for good measure, it calls ConvertAllOnDoubleToPlainDouble, which loops through all the doubles defined like

<OnPlatform x:Key="ImageSize" x:TypeArguments="x:Double" >...</OnPlatform>

It extracts the Windows value, removes the OnPlatform entry from the resource dictionary, and adds a new plain double with only the Windows value to the resource dictionary. For some reason, replacing the entry in place is not possible.
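
A sketch of that second method, under the same assumptions and living in the same extension class as the sketch above:

// Sketch only: replace every OnPlatform<double> resource with a plain double
// holding its Windows value. The entry is removed and re-added, because
// replacing it in place did not work.
public static void ConvertAllOnDoubleToPlainDouble(this Application app)
{
    var onPlatformEntries = app.Resources
        .Where(entry => entry.Value is OnPlatform<double>)
        .ToList();

    foreach (var entry in onPlatformEntries)
    {
        var onPlatform = (OnPlatform<double>)entry.Value;
        var windowsEntry = onPlatform.Platforms
            .FirstOrDefault(on => on.Platform.Contains("Windows"));
        if (windowsEntry == null)
        {
            continue;
        }

        app.Resources.Remove(entry.Key);
        app.Resources.Add(entry.Key, Convert.ToDouble(windowsEntry.Value));
    }
}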

Conclusion

There is apparently a bug in the Xamarin Forms .NET Native UWP tooling, which causes OnPlatform values to be totally ignored. With my dirty little trick, you can at least get your styles to work without having to rewrite the whole shebang for Windows or maintain a separate style file for it. Note that this does not fix everything: if you have other value types (like grid heights) you will need to add your own conversion to ConvertAllOnPlatformToExplict. What I have given you was enough to fix my problems, but not all potential issues that may arise from this bug.

I hope this drives Xamarin for UWP adoption forward, and I also hope this helps the good folks in Redmond fix the bug. I've pretty much identified what goes wrong; now they 'only' have to take care of the how ;)

The demo project with the fix can be found here.

12 July 2017

Building a dynamic floating clickable menu for HoloLens/Windows MR

Intro

In the various XAML-based platforms (WPF, UWP, Xamarin) that were created by or are now part of Microsoft, we have the great capability of data binding and templating - essentially telling, for instance, a list 'this is my data, this is how a single item should look, good luck with it', and the UI more or less creates itself. We don't quite have that in Unity projects for Windows Mixed Reality. But still I gave it my best shot when I created a dynamic floating menu for my app Walk the World (only the first few seconds are relevant, the rest is just showing off a view of Machu Picchu).

Starting point

We actually start from the end result of my previous post, as I don't really like to do things twice. So copy that project to another folder, or make a branch - whatever works for you. I called the renamed folder FloatingDynamicMenuDemo. Then proceed as follows:

  • Delete FloatingScreenDemo.* from the project's root
  • Empty the App sub folder - just leave the .gitignore
  • Open the project in Unity
  • Open the Build Settings window (CTRL+B)
  • Hit the "Player Settings..." button
  • Change "FloatingScreenDemo" in "FloatingDynamicMenuDemo" whereever you see it. Initially you will see only one place, but please expand the "Icon" and "Publishing Settings" panels as well, there are more boxes to fill in.
  • Rename the HelpHolder to MenuHolder
  • Remove the Help Text Controller from the MenuHolder.
  • Change the text in the 3DTextPrefab from the Lorem ipsum to "Select a place to see"
  • Change the text's Y position from 0.84 to 0.23 so it will end up at the top of the 'screen'

So now we have a workspace with most of the stuff we need already in it. Time to fill in the gaps.

Building a Menu Item part 1 - graphics

image

So, think templating. We first need to have a template before we can instantiate it. But the only things I can instantiate are game objects. So... we need to make one... a combination of graphics and code. That sounds like - a prefab indeed!

First, we will make a material for the menu items, as this will be easier for debugging. Go to the App/Materials folder, find HelpScreenMaterial, hit CTRL-D, and rename HelpScreenMaterial 1 to MenuItemMaterial. Then, change its color to a kind of green, for instance 00B476FF. Also, change the rendering mode to "Opaque". This is so we can easily see the plane.

Inside the MenuHolder we make a new empty game object. I called it - d'oh - MenuItem. Inside that MenuItem, we first make a 3DTextPrefab, then a Plane. The Plane will of course be humongous again, and very white. So first drag the green MenuItemMaterial on it. Then change its X Rotation to 270 so it will be upright again. Then you have to experiment a little with the X and Z scale until it is more or less the same width as your blue Plane, and a little over one line of text high, as shown to the left. The values I got were X = 0.065 and Z = 0.004, but this depends of course on the font size you take. Make sure there is some extra padding between the left and right edges of the green Plane and the blue Plane.

image

As you can see in the top panel, the text and the menu plane are invisible - they are only visible when viewed dead-on from the camera in the Game view. This is because they are basically at the same distance as the screen. So we need to set the Z of the green Plane to -0.02 - so it appears in front of the blue screen - and the Z of the 3DTextPrefab to -0.04 so it appears in front of the green Plane. You will now see the effect in the Scene pane as well.

Since this is a menu, we want the text to appear from the left. The Anchor is now Middle Center and its Alignment is Center, and that is not desirable. So we have to set Alignment to Left and Anchor to Middle Left, and then we drag the text prefab to the left until it touches the edge of the green Plane. I found an X position value of -0.32.

Now create a folder "Prefabs" in your App folder in the Assets pane, and drag the MenuItem object from there. This will create a Prefab. The text MenuItem in the Hierarchy will turn blue.

image

You can now safely delete the MenuItem from the Hierarchy. Mind you, the Hierarchy. Make sure it stays in Prefabs.

Building a Menu Item part 2 - code

Our 'menu' needs a general data structure to hold its items. So we start with an interface for that:

public interface IMenuItemData
{
    object SelectMessageObject { get; set; }

    string Title { get; set; }

    int MenuId { get; set; }
}

And a default implementation:

public class MenuItemData : IMenuItemData
{
    public object SelectMessageObject { get; set; }

    public string Title { get; set; }

    public int MenuId { get; set; }
}

The SelectMessageObject is the payload - the actual data. The Title contains the text we want to have displayed in the menu, and the MenuId we need so we can distinguish select events coming from multiple menus, should your application have those. For the distribution of events we once again use the Messenger that I introduced before (and have used extensively ever since).

To send a selected object around we need a message class:

public class MenuSelectedMessage
{
    public IMenuItemData MenuItem { get; set; }
}

And then we only need to add this simple MenuItemController, a behaviour that handles when the MenuItem is tapped:

using HoloToolkit.Unity.InputModule;
using HoloToolkitExtensions.Messaging;
using UnityEngine;

public class MenuItemController : MonoBehaviour, IInputClickHandler
{
    private TextMesh _textMesh;
    private IMenuItemData _menuItemData;

    public IMenuItemData MenuItemData
    {
        get { return _menuItemData; }
        set
        {
            if (_menuItemData == value)
            {
                return;
            }
            _menuItemData = value;
            _textMesh = GetComponentInChildren<TextMesh>();
            if (_menuItemData != null && _textMesh != null)
            {
                _textMesh.text = _menuItemData.Title;
            }
        }
    }

    public void OnInputClicked(InputClickedEventData eventData)
    {
        if (MenuItemData != null)
        {
            Messenger.Instance.Broadcast(
                new MenuSelectedMessage { MenuItem = MenuItemData });
            PlayConfirmationSound();
        }
    }

    private AudioSource _audioSource;

    private void PlayConfirmationSound()
    {
        if (_audioSource == null)
        {
            _audioSource = GetComponent<AudioSource>();
        }

        if (_audioSource != null)
        {
            _audioSource.Play();
        }
    }
}

There is a property MenuItemData that accepts an IMenuItemData. If you set it, it retains the value in a private field and also shows the value of Title in a TextMesh component. This behaviour is also an IInputClickHandler, so if the user taps this, the OnInputClicked method is called. Essentially all it does is send off its MenuItemData object - the one that was used to fill the text with a value - to the Messenger. And it tries to play a sound; you should decide for yourself if you want that.

So all we have to do is add this behaviour to the MenuItem prefab, as this is the thing we are going to click on. That way, if you click next to the text but at the correct height (the menu 'row'), it is still selected. So select MenuItem, hit the "Add Component" button and add the Menu Item Controller.

image

Now, if you like, you can add an AudioSource with a special sound that signifies the selection of a menu item. As I have stated before, immediate (audio) feedback is very important in immersive applications. I have not done so here; I usually let the receiver of a MenuSelectedMessage play the notification sound.

Building the menu itself

This is done by a surprisingly small and simple behaviour. All it does is instantiate a number of game objects at certain positions.

using System.Collections.Generic;
using System.Linq;
using UnityEngine;

public class MenuBuilder : MonoBehaviour
{
    public float MaxNumber = 10;

    public float TopMargin = 0.1f;

    public float MenuItemSize = 0.1f;

    private List<GameObject> _createdMenuItems = new List<GameObject>();
    public MenuBuilder()
    {
        _menuItems = new List<IMenuItemData>();
    }
    public GameObject MenuItem;

    private IList<IMenuItemData> _menuItems;

    public IList<IMenuItemData> MenuItems
    {
        get { return _menuItems; }
        set
        {
            _menuItems = value;
            BuildMenuItems();
        }
    }

    private void BuildMenuItems()
    {
        foreach (var menuItem in _createdMenuItems)
        {
            DestroyImmediate(menuItem);
        }
        if (_menuItems == null || !_menuItems.Any())
        {
            return;
        }
        for (var index = 0; index < MenuItems.Count; index++)
        {
            var newMenuItem = MenuItems[index];
            var newGameObject = Instantiate(MenuItem, gameObject.transform);
            newGameObject.transform.localPosition -= 
                new Vector3(0,(MenuItemSize * index) - TopMargin, 0);

            var controller = newGameObject.GetComponent<MenuItemController>();
            controller.MenuItemData = newMenuItem;
            _createdMenuItems.Add(newGameObject);
        }
    }
}

All the important work happens in BuildMenuItems. Any existing items are destroyed first, then we simply loop through the list of menu items - these are IMenuItemData objects. For each one, the game object provided in MenuItem is instantiated inside the current game object, and its vertical position is calculated and set. Then the MenuItemController is fetched from the instantiated game object - the code just assumes it must be there - and the newMenuItem is put in it, so the MenuItem will show the associated text.

So now add the MenuBuilder to the MenuHolder. Then, from Prefabs, drag the MenuItem prefab onto the Menu Item property. Net result:

image

Now let's add an initialization behaviour to actually make stuff appear in the menu. This behaviour has some hard coded data in it, but you can imagine this coming from some Azure data source.

using System.Collections.Generic;
using UnityEngine;

public class VistaMenuController : MonoBehaviour
{
    void Start()
    {
        var builder = GetComponent<MenuBuilder>();

        IList<IMenuItemData> list = new List<IMenuItemData>();
        list.Add(new MenuItemData
        {
            Title = "Mount Everest, Nepal",
            MenuId = 1,
            SelectMessageObject = new WorldCoordinate(27.91282f, 86.94221f)
        });
        list.Add(new MenuItemData
        {
            Title = "Kilomanjaro, Tanzania",
            MenuId = 1,
            SelectMessageObject = new WorldCoordinate(-3.21508f, 37.37316f)
        });
        list.Add(new MenuItemData
        {
            Title = "Mount Rainier, Washington, USA",
            MenuId = 1,
            SelectMessageObject = new WorldCoordinate(46.76566f, -121.7554f)
        });
        list.Add(new MenuItemData
        {
            Title = "Niagra falls (from Canada)",
            MenuId = 1,
            SelectMessageObject = new WorldCoordinate(43.07306f, -79.07561f)
        });
        list.Add(new MenuItemData
        {
            Title = "Mount Robson, British Columbia, Canada",
            MenuId = 1,
            SelectMessageObject = new WorldCoordinate(53.061809f, -119.168358f)
        });
        list.Add(new MenuItemData
        {
            Title = "Athabasca Glacier, Alberta, Canada",
            MenuId = 1,
            SelectMessageObject = new WorldCoordinate(52.18406f, -117.257f)
        });
        list.Add(new MenuItemData
        {
            Title = "Etna, Sicily, Italy",
            MenuId = 1,
            SelectMessageObject = new WorldCoordinate(37.67865f, 14.9964f)
        });

        builder.MenuItems = list;
    }
}

This comes straight from Walk the World - these are 7 of its 10 vistas, each with a location to view it from. Add this behaviour to the MenuHolder as well. Now it's time to run the code and see our menu for the very first time!
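
As an aside: if you would rather not hard code the list, a minimal sketch of loading it from a web endpoint could look like the snippet below. The URL, the JSON shape and the RemoteVistaMenuController name are all made up for illustration; only WWW and JsonUtility are standard Unity.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Hypothetical alternative to the hard coded list: load the vistas from a
// web endpoint. URL and JSON shape are invented for this sketch.
public class RemoteVistaMenuController : MonoBehaviour
{
    public string VistaUrl = "https://example.com/api/vistas";

    IEnumerator Start()
    {
        var www = new WWW(VistaUrl);
        yield return www;

        if (!string.IsNullOrEmpty(www.error))
        {
            Debug.LogError("Could not load vistas: " + www.error);
            yield break;
        }

        // JsonUtility cannot parse a top-level array, hence the wrapper object
        var data = JsonUtility.FromJson<VistaList>(www.text);

        IList<IMenuItemData> list = new List<IMenuItemData>();
        foreach (var vista in data.Vistas)
        {
            list.Add(new MenuItemData
            {
                Title = vista.Title,
                MenuId = 1,
                SelectMessageObject = new WorldCoordinate(vista.Lat, vista.Lon)
            });
        }

        GetComponent<MenuBuilder>().MenuItems = list;
    }
}

[System.Serializable]
public class VistaList
{
    public Vista[] Vistas;
}

[System.Serializable]
public class Vista
{
    public string Title;
    public float Lat;
    public float Lon;
}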

image

Some tweaking and fiddling

If you press the play button in Unity, you will get something like what is displayed to the left. A former British colleague of mine would say something along the lines of "It's not quite what I had in mind". But we can fix this, fortunately.

In the Menu Builder that you have added to the MenuHolder, there are two more properties:

image

The first one is the relative location where the first item should appear, and the second is the size allotted for each menu item. Clearly the first MenuItem is placed too low. The only way to really get this right is by trial and error. The higher you make Top Margin, the higher up the first item moves. A value of 0.18 gives about this, and that seems about right:

image

And 0.041 for Menu Item Size gives this:

image

Which is just what you want - a tiny little space between the menu items. Like I said, just trial and error.
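
If you want a bit less mystery in the trial and error: following the position formula in BuildMenuItems, and assuming the MenuItem prefab's own local Y position is 0, each item ends up at TopMargin minus MenuItemSize times its index. A quick check with the values above (a fragment you could drop into any behaviour):

// y(index) = TopMargin - MenuItemSize * index (assuming prefab local Y = 0)
const float topMargin = 0.18f;
const float menuItemSize = 0.041f;

for (var index = 0; index < 7; index++)
{
    var y = topMargin - menuItemSize * index;
    // Item 0 ends up at 0.180, item 1 at 0.139, ... item 6 at -0.066
    Debug.Log("Item " + index + " ends up at local Y = " + y);
}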

Testing if it works

Once again, a bit lame: a simple behaviour to listen to the menu selection messages:

using HoloToolkitExtensions.Messaging;
using UnityEngine;

public class MenuListener : MonoBehaviour
{
    // Use this for initialization
    void Start()
    {
        Messenger.Instance.AddListener<MenuSelectedMessage>(ProcessMenuMessage);
    }

    private void ProcessMenuMessage(MenuSelectedMessage msg)
    {
        if (msg.MenuItem.MenuId == 1 )
        {
            Debug.Log("Taking you to " + msg.MenuItem.Title);
            Debug.Log(msg.MenuItem.SelectMessageObject.ToString());
        }
    }
}

Add this behaviour to the Managers object, click play, and sure enough if you click menu items, you will see in Unity's debug console:

image

Yes, I know, that's a lame demo - in a real app you would connect something to the message that actually does something: speak out the name, show a dancing popup, whatever. The point is that it works and the messages get out when you click :)

Some final look & feel bits

Yah! We have a more or less working menu, but it looks kind of ugly and not everything works - the close button, for instance. Let's fix the look & feel first. We needed the greenish background of the menu items to properly space and align them, but now we do not need it anymore. So go to the MenuItemMaterial and select a new shader: under HoloToolkit, you will find "Vertex Lit Configurable Transparent".

image

Then go all the way down to "Other" and set "Cull" to "Front". That way, the front face of the plane - the green strip - will be invisible, but still hittable.

If you press play, the menu should now look like this:

image

Getting the button to work

As stated above, the button is not working - and for a very simple reason: in my previous post I showed that it looks for a component in its parent that is a BaseTextScreenController (or a child class of that). There is none.

So let's go back to the VistaMenuController again. The top says

public class VistaMenuController : MonoBehaviour

Let's change that into

public class VistaMenuController : BaseTextScreenController

You will need to add "using HoloToolkitExtensions.Animation;" to the top to get this to work. You will also need to change

void Start()
{

into

public override void Start()
{
    base.Start();

If you now hit "Play" in Unity you will end up with this

image

Right. Nothing at all :). This is because the base Start method (which is the Start method of BaseTextScreenController) actually hides the menu, on the premise that you don't want to see it initially. So we need a way to make it visible. Fortunately, that's very easy. We will just re-use the ShowHelpMessage from the previous post to make this work. Go back one more time to the VistaMenuController's Start method, and add one more statement:

public override void Start()
{
    base.Start();
    Messenger.Instance.AddListener<ShowHelpMessage>(m => Show());
}

If you now press play, you will still see nothing. But if you yell "Show help" at your computer (or press "0" - zero) the menu pops up and comes into view. With, I might add, the "pling" sound that is by now iconic for my apps. And if you click the button, the menu will disappear with the equally iconic "clonk".

Some concluding remarks

Of course, this is still pretty primitive. With the current font size and menu item size, stuff will be happily rendered outside of the actual menu screen if your texts are too long or you have more than 7 menu items. That is because the screen is just a floating backdrop. Scrolling for more items? Nope. Dynamic or manual resizing? Nope. But it is a start, and I have used it with great success.

Let me know if this was valuable to you, and what you used it for. Full demo project on GitHub, as always.