
Example Scenes

To simplify development, the project includes example scenes that showcase best practices for each Presence Platform component:


Content Placement

ContentPlacement.unity uses the mesh for content placement. The example takes the Blaster from the game and shows how to use the mesh to place it anywhere in the room.


Mesh Collisions

PhantomCollisions.unity demonstrates using the mesh for physics: with fast collision detection, the Ectoplasma bounces off the mesh, creating a realistic experience.
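
The bouncing behavior could be sketched as follows. This is a hypothetical illustration, not the scene's actual script: the component name and the bouncy-material values are assumptions, and only standard Unity physics APIs are used.

```csharp
using UnityEngine;

// Hypothetical sketch: a fast-moving blob that bounces off the scanned mesh.
[RequireComponent(typeof(Rigidbody), typeof(SphereCollider))]
public class EctoplasmSketch : MonoBehaviour
{
    void Awake()
    {
        var body = GetComponent<Rigidbody>();
        // Continuous collision detection keeps fast projectiles from
        // tunneling through the thin triangles of the scanned mesh.
        body.collisionDetectionMode = CollisionDetectionMode.ContinuousDynamic;

        // A bouncy physic material makes the blob rebound off surfaces.
        var bouncy = new PhysicMaterial
        {
            bounciness = 0.8f,
            bounceCombine = PhysicMaterialCombine.Maximum
        };
        GetComponent<SphereCollider>().material = bouncy;
    }
}
```

The scene mesh itself only needs a MeshCollider for this to work; the rigidbody settings above live on the projectile.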


Air Navigation

AirNavigation.unity shows how to use the scanned mesh as a sensor for an air-navigated character (Phanto).


Mesh Navigation

MeshNavigation.unity shows how to use the mesh for ground navigation, with and without additional bounding box information on furniture (acquired using manual capture of room elements).
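
Ground navigation over a scanned mesh typically means baking a NavMesh at runtime, once the mesh has been delivered by the system. A minimal sketch, assuming Unity's AI Navigation package (the component and method names below are from that package; the script itself is illustrative, not the scene's actual code):

```csharp
using UnityEngine;
using Unity.AI.Navigation; // NavMeshSurface, from the AI Navigation package

// Hypothetical sketch: rebuild the NavMesh over the scanned room mesh at
// runtime so ground characters can navigate it.
public class RuntimeNavMeshSketch : MonoBehaviour
{
    [SerializeField] private NavMeshSurface surface; // spans the scene mesh

    // Call once the scene mesh (and any furniture colliders) have loaded.
    public void Rebuild()
    {
        surface.BuildNavMesh();
    }
}
```

Furniture bounding boxes, when present, can be added as NavMesh obstacles or carved out of the baked surface so characters route around them.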


Scene Visualization

SceneVisualization.unity is a debug scene that presents the scene mesh and, when available, furniture bounding boxes.


Semantic Scene Query

SemanticSceneQuery.unity demonstrates how to use automatically discovered furniture in the scene. Phantoms use the Scene Mesh for spawning, targeting, navigating, and attacking crystals; their thought bubbles enhance immersion, and the detected furniture enables more advanced path planning.


Debug Scene

DebugDrawingScene.unity is a debug scene showcasing developer debug tools.


User In Bounds

UserInBounds.unity demonstrates best practices for handling cases when the user is outside the scene. When leaving the scene bounds, the user is notified and presented with an option to rescan the space. InsideSceneChecker.cs is attached to the camera prefab and notifies the app when the user's head or hands are inside/outside the bounds.
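
The core of such a checker can be sketched as a per-frame containment test that raises an event on state changes. This is a hedged approximation of what a script like InsideSceneChecker.cs might do; the bounds source, field names, and event are illustrative assumptions, not the actual implementation.

```csharp
using System;
using UnityEngine;

// Hypothetical sketch: poll a tracked transform against the room bounds and
// notify the app whenever the user crosses in or out.
public class BoundsCheckSketch : MonoBehaviour
{
    public static event Action<bool> UserInBoundsChanged; // true = inside

    [SerializeField] private Transform head;    // e.g. the center-eye anchor
    [SerializeField] private Bounds roomBounds; // derived from the scanned scene
    private bool wasInside = true;

    void Update()
    {
        bool inside = roomBounds.Contains(head.position);
        if (inside != wasInside)
        {
            wasInside = inside;
            UserInBoundsChanged?.Invoke(inside); // app can prompt a rescan
        }
    }
}
```

The same test can be repeated for each hand anchor, as the description above implies; a real room outline is usually a polygon rather than an axis-aligned box, so the actual check is likely more involved.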


Depth Occlusion

DepthOcclusion.unity demonstrates best practices for dynamic occlusion using the Depth API, which uses real-time depth estimation for occlusions. To mitigate the performance impact, a mixture of soft and hard occlusions was selected for each element in the game. Visit the Depth API open-source repository to learn more and try the new SDK.
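
Choosing between hard and soft occlusion per object typically comes down to enabling one shader variant or the other on each material. A hedged sketch: the keyword names below follow the Depth API samples but should be treated as assumptions; check the repository for the exact shaders and keywords.

```csharp
using UnityEngine;

// Hypothetical sketch: pick hard (cheaper, crisp edges) or soft (smoother,
// costlier) occlusion for this object by toggling shader keywords.
public class OcclusionModeSketch : MonoBehaviour
{
    [SerializeField] private bool useSoftOcclusion = true;

    void Start()
    {
        var material = GetComponent<Renderer>().material;
        material.EnableKeyword(useSoftOcclusion ? "SOFT_OCCLUSION" : "HARD_OCCLUSION");
        material.DisableKeyword(useSoftOcclusion ? "HARD_OCCLUSION" : "SOFT_OCCLUSION");
    }
}
```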


Haptics Demo

HapticsDemo.unity showcases the integration of haptics with dynamic modulation tied to controller interactions and virtual objects: Phanto floats in the middle of the room and triggers a synchronized audio-haptic effect when "poked". Pulling the right controller trigger increases the effect's amplitude, while moving the thumbstick modulates the frequency.
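
The modulation described above can be sketched with the Haptics SDK's clip player. The property and input names below reflect the Haptics SDK for Unity and OVRInput as I understand them, but treat this as an assumption rather than the demo's actual code; see HapticsDemoController.cs for the real implementation.

```csharp
using UnityEngine;
using Oculus.Haptics; // Meta Haptics SDK for Unity

// Hedged sketch: play a haptic clip on the right controller and modulate it
// each frame from the trigger and thumbstick.
public class HapticsModulationSketch : MonoBehaviour
{
    [SerializeField] private HapticClip clip; // authored in Haptics Studio
    private HapticClipPlayer player;

    void Start()
    {
        player = new HapticClipPlayer(clip);
        player.Play(Controller.Right);
    }

    void Update()
    {
        // Trigger pull scales amplitude; thumbstick X shifts frequency.
        player.amplitude = OVRInput.Get(OVRInput.Axis1D.SecondaryIndexTrigger);
        player.frequencyShift = OVRInput.Get(OVRInput.Axis2D.SecondaryThumbstick).x;
    }
}
```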

The haptic assets used in this project were designed with Haptics Studio and integrated using the Haptics SDK for Unity following our Haptic Design Guidelines.

To learn more about the Haptics SDK for Unity and how dynamically modulated haptics were implemented, check out HapticsDemoController.cs for the demo scene or PolterblastTrigger.cs for the Polterblast haptics featured in the main game loop.