An application that uses Scene Understanding to help people with Parkinson's disease navigate their environment. It displays both visual and auditory cues to guide them through their surroundings.
This project builds on top of Microsoft's Scene Understanding example project. Its original README can be found below.
The original project used Unity 2020.3.12f1, which we upgraded to 2021.3.14f1. The older version may still work with this project, but it is untested, so we recommend using the 2021 version.
- Install Visual Studio 2022 following these instructions. This also means installing the listed workloads, namely:
- .NET desktop development
- Desktop development with C++
- Universal Windows Platform (UWP) development and the following components:
- Windows 10 SDK (10.0.19041.0 or 10.0.18362.0) or Windows 11 SDK
- USB Device Connectivity (required to deploy/debug to HoloLens over USB)
- C++ (v142) Universal Windows Platform tools (required when using Unity)
- Game development with Unity
- Install Unity 2021.3.14f1 or higher following these instructions. Make sure to install the following components:
- Universal Windows Platform Build Support
- Windows Build Support (IL2CPP)
Other setups may work, but this is the configuration we used and can confirm works.
- Clone or download this repository.
- Open the project in Unity.
- Open the `NavMesh-Simple` scene inside `Assets/SceneUnderstanding/Examples/NavMesh/Scenes`.
Now, follow the instructions for running on PC or running on HoloLens 2.
- In the scene, select the `SceneUnderstandingManager` game object and make sure that `Query Scene From Device` is not selected on the `SceneUnderstandingManager` component.
- Click the play button in the top center of the Unity editor. A scene should load and you should be able to navigate around it.
- In the scene, select the `SceneUnderstandingManager` game object and make sure that `Query Scene From Device` is selected on the `SceneUnderstandingManager` component.
- Go to `File > Build Settings` and deselect all scenes except for `NavMesh-Simple`.
- Make sure the platform is set to `Universal Windows Platform` and the architecture is set to `ARM64`. Set the build configuration to `Release`, then click `Build` (an optional editor-script alternative to these build settings is sketched after these steps).
- Once the build completes successfully, open the `.sln` file in Visual Studio.
- In Visual Studio, select `Release` and `ARM64` as the build configuration and architecture, and make sure to select `Device` as the target device. Then, click the `Start` button to deploy the application to your HoloLens 2.
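If you rebuild often, the same build settings can also be applied from an editor script instead of being configured by hand each time. The following is a minimal sketch, assuming the standard Unity `BuildPipeline` and `EditorUserBuildSettings` APIs; the menu entry, output folder, and class name are illustrative and not part of this project, and the scene path is derived from the folder mentioned above.

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEditor.Build.Reporting;

// Sketch of an editor build script mirroring the manual steps above:
// UWP platform, ARM64 architecture, only the NavMesh-Simple scene.
public static class HoloLensBuildSketch
{
    [MenuItem("Build/Build NavMesh-Simple (UWP ARM64)")]
    public static void Build()
    {
        // Matches the "ARM64" architecture selected in Build Settings.
        EditorUserBuildSettings.wsaArchitecture = "ARM64";

        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/SceneUnderstanding/Examples/NavMesh/Scenes/NavMesh-Simple.unity" },
            locationPathName = "Builds/UWP",   // arbitrary output folder for the generated .sln
            target = BuildTarget.WSAPlayer,    // Universal Windows Platform
            options = BuildOptions.None
        };

        BuildReport report = BuildPipeline.BuildPlayer(options);
        UnityEngine.Debug.Log($"Build result: {report.summary.result}");
    }
}
#endif
```

The generated solution in the output folder is then opened and deployed from Visual Studio exactly as described in the last two steps above.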
Our changes are located in the `Assets` folder, with the code in the `Assets/Scripts` folder:

- `Calibration.cs`: Handles the calibration process. Stores the last few positions of the user's head and continuously updates them. Offers a `CalibrateCoroutine` method that is called from `CueManager` during the calibration process.
- `CueManager.cs`: Handles the cues, i.e. enables and disables them. Also handles the calibration process and, in general, the menus the user can interact with.
- `Footprints.cs`: Handles the footprints. Stores the list of footprints and shows and hides them. Continuously checks whether the invisible agent has moved and, if so, adds new footprints.
- `FreezingDetector.cs`: Handles the freezing detection. Stores the last few positions of the user's head and continuously updates them. Continuously checks whether the user's walking speed is below a certain threshold and, if so, sets the `isFreezing` variable to `true`; otherwise, it sets it to `false` (see the sketch after this list).
- `PoseTracking.cs`: Used for understanding the data provided and how to use it for the user calibration.
- `SoundCuesManager.cs`: Plays the sound cues if cues are enabled.
- `TargetSelector.cs`: Handles the target selection. Shows and hides the target crosshair and continuously checks whether the user is looking at a target. If so, it calls the `NewPoint` method from `VisualCuesManager`.
- `VisualCuesManager.cs`: Handles the visual cues. Whereas `Footprints.cs` handles the footprints, this class moves the invisible agent to the target location using the provided NavMesh methods, and resets the footprints before doing so.
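To make the freezing-detection logic described above more concrete, here is a minimal sketch of such a check: keep a short history of head positions and flag freezing when the average speed over that window drops below a threshold. Apart from `isFreezing`, the class and member names, threshold, and window length are assumptions for illustration and do not necessarily match the actual implementation in `FreezingDetector.cs`.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch of a freezing-of-gait check based on head-position history.
public class FreezingDetectorSketch : MonoBehaviour
{
    public Transform head;                    // e.g. the main camera transform
    public float speedThreshold = 0.1f;       // metres per second (assumed value)
    public float windowSeconds = 1.0f;        // how far back to look (assumed value)
    public bool isFreezing { get; private set; }

    private readonly Queue<(float time, Vector3 position)> samples =
        new Queue<(float time, Vector3 position)>();

    private void Start()
    {
        if (head == null)
            head = Camera.main.transform;
    }

    private void Update()
    {
        // Record the newest head position and drop samples older than the window.
        samples.Enqueue((Time.time, head.position));
        while (samples.Count > 0 && Time.time - samples.Peek().time > windowSeconds)
            samples.Dequeue();

        if (samples.Count < 2)
            return;

        // Average speed = distance covered across the window / elapsed time.
        var oldest = samples.Peek();
        float elapsed = Time.time - oldest.time;
        float distance = Vector3.Distance(head.position, oldest.position);
        isFreezing = elapsed > 0f && (distance / elapsed) < speedThreshold;
    }
}
```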
| Supported Unity versions | Built with XR configuration |
| --- | --- |
| Unity 2020.3.12f1 | Windows XR |
A Unity-based sample application that showcases Scene Understanding on HoloLens 2. When this sample is deployed on a HoloLens, it will show the virtual representation of your real environment. For PC deployment, the sample will load a serialized scene (included under Assets\SceneUnderstanding\StandardAssets\SUScenes) and display it. A help menu is presented on launch, which provides information about all the input commands available in the application.
To learn more about Scene Understanding, visit our Scene Understanding and Scene Understanding SDK documentation.
| File/folder | Description |
| --- | --- |
| `Assets` | Unity assets, scenes, prefabs, and scripts. |
| `Packages` | Project manifest and packages list. |
| `ProjectSettings` | Unity asset setting files. |
| `UIElementsSchema` | UIElements schema files from the Unity editor. |
| `.gitignore` | Defines what to ignore at commit time. |
| `LICENSE` | The license for the sample. |
| `README.md` | This README file. |
- Unity 2020.3.12f1 or higher
- Up-to-date version of Unity Hub
- Visual Studio 2017 or 2019 with Universal Windows Platform components
- Windows SDK version 10.0.18362.0 or higher
- Clone or download this sample repository.
- Open Unity Hub, select 'Add' and choose the project folder where you extracted the cloned sample.
- After the project loads, navigate to Windows > Package Manager and check that you have the required packages installed:
- Mixed Reality Scene Understanding
- Mixed Reality WinRT Projections
- If they're missing, download them using the Mixed Reality Feature Tool.
Before trying to build the project, go to File > Build Settings and make sure all sample scenes in the SceneUnderstanding/Examples folder appear in the list.
[!IMPORTANT] The Home-Examples scene is not a SceneUnderstanding Scene per se, but rather a Menu Scene from which you can load the other example scenes. You can load any of the other example scenes using voice commands.
To run this sample on a HoloLens 2:
- Open the SceneUnderstanding Sample Scenes under Assets\SceneUnderstanding\Examples - Scenes are in the Placement, NavMesh and Understanding folders
- Select the SceneUnderstandingManager game object and make sure that Query Scene From Device is selected on the SceneUnderstandingManager Component in all Scenes
- Go to File > Build Settings and select Build > UWP. Once the build completes successfully, a log indicating this will show up in the output console.
- Navigate to the UWP folder under root and open 'Scene Understanding.sln' in Visual Studio.
- Right-click on the 'Scene Understanding (Universal Windows)' project and click on 'Publish' --> 'Create App Packages'.
- Run through the wizard and wait for building and packaging to complete.
- The built app package should be at UWP\AppPackages\Scene Understanding\Scene Understanding_\Scene Understanding_.[appx|msix|appxbundle|msixbundle]
- Deploy the package to a HoloLens 2. Ensure you build your application using ARM64; see the topic Unity 2019.3 and HoloLens for further details.
- Launch the 'Scene Understanding' app from the 'All Apps' list on the HoloLens 2!
To run this sample on a PC:
- Open the SceneUnderstanding Sample Scenes under Assets\SceneUnderstanding\Examples - Scenes are in the Placement, NavMesh and Understanding folders
- Select the SceneUnderstandingManager game object and uncheck the Query Scene From Device checkbox on the SceneUnderstandingManager Component
- Ensure SU Serialized Scene Paths on the Scene Understanding component refers to a serialized Scene Understanding scene; example scenes are provided under the Examples folder
- Click Play in the Editor!
Problem:

Multiple errors occur in `SceneUnderstandingManager.cs` at line 571:

```csharp
System.Numerics.Matrix4x4 converted4x4LocationMatrix = ConvertRightHandedMatrix4x4ToLeftHanded(suObject.GetLocationAsMatrix());
```

```text
error CS7069: Reference to type 'Matrix4x4' claims it is defined in 'System.Numerics', but it could not be found
```
Solution:
- Go to Build Settings > Player Settings > Other Settings > Api Compatibility Level and select .Net 4.x
- This setting might revert when upgrading Unity versions
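Because this setting can silently revert after a Unity upgrade, it can help to pin it from an editor script. The following is a minimal sketch, assuming the standard `PlayerSettings` editor API; `ApiCompatibilityLevel.NET_4_6` is the enum value that corresponds to the ".NET 4.x" option in the Player Settings UI, and the class name is arbitrary.

```csharp
#if UNITY_EDITOR
using UnityEditor;

// Re-applies the API compatibility level for UWP builds so the fix above
// survives Unity upgrades. Runs whenever the editor reloads scripts.
[InitializeOnLoad]
public static class ApiCompatibilityGuard
{
    static ApiCompatibilityGuard()
    {
        // ".NET 4.x" in the Player Settings UI maps to NET_4_6 in the API.
        PlayerSettings.SetApiCompatibilityLevel(BuildTargetGroup.WSA,
                                                ApiCompatibilityLevel.NET_4_6);
    }
}
#endif
```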
- We have stopped using NugetForUnity and we will no longer support that path. If you still want our legacy build with NugetForUnity, please check out our legacy branch: microsoft/MixedReality-SceneUnderstanding-Samples at LegacyNugetBuild (github.com)
- This branch is deprecated and we will not support updates on it.
[!NOTE] When running on your HoloLens, all the interactive commands are voice commands; you're required to speak to interact with the scene. Say "Scene Objects Wireframe", "Load NavMesh", "Toggle Auto Refresh", and so on.
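For reference, voice commands like these are typically wired up in Unity through the `KeywordRecognizer` API from `UnityEngine.Windows.Speech`. The sketch below only illustrates that mechanism; it is not the sample's actual command handling, and the class name and logging are placeholders.

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech;

// Illustrative sketch of registering voice keywords such as "Load NavMesh".
public class VoiceCommandSketch : MonoBehaviour
{
    private KeywordRecognizer recognizer;

    private void Start()
    {
        var keywords = new[] { "Scene Objects Wireframe", "Load NavMesh", "Toggle Auto Refresh" };
        recognizer = new KeywordRecognizer(keywords);
        recognizer.OnPhraseRecognized += args =>
            Debug.Log($"Heard voice command: {args.text}");   // dispatch to the matching action here
        recognizer.Start();
    }

    private void OnDestroy()
    {
        recognizer?.Dispose();
    }
}
```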
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.