| Publisher | Kyrylo Kuzyk |
| --- | --- |
| File size | 5.01 MB |
| Number of files | 23 |
| Latest version | 2.0.39-release.0 |
| Latest release date | 2024-08-18 06:43:12 |
| First release date | 2021-08-12 08:48:14 |
| Supported Unity versions | 2018.4.2 or higher |
AR Foundation Remote 2.0 is a major update to AR Foundation Remote, the most popular debugging tool for AR apps. With new exclusive features, you'll be able to iterate even faster and deliver high-quality AR apps to the market with greater confidence.
Put simply: AR Foundation Remote 2.0 = Unity Remote + AR Foundation + Input System (New) + so much more.
💡 Current workflow with AR Foundation 💡
1. Make a change to your AR project.
2. Build the project and run it on a real AR device.
3. Wait for the build to complete.
4. Wait a little bit more.
5. Test your app on a real device using only Debug.Log().
🔥 Improved workflow with AR Foundation Remote 2.0 🔥
1. Set up the AR Companion app once. The setup takes only a few minutes.
2. Just press play! Run and debug your AR app with full access to scene hierarchy and all object properties right in the Editor!
💡 This plugin is licensed per seat: one license is required for each developer on your team. More Info.
⚡ Features ⚡
• Precisely replicates the behavior of a real AR device in Editor.
• Extensively tested with both ARKit and ARCore.
• Plug-and-play: no additional scene setup is needed; just run your AR scene in the Editor with the AR Companion running (a minor code change may be needed).
• Streams video from the Editor to a real AR device so you can see how your app looks on the device without making a build (see Limitations).
• Multi-touch input remoting: stream multi-touch from an AR device or simulate touch with a mouse in the Editor (see Limitations).
• Test Location Services (GPS), Gyroscope, and Compass right in the Editor (see the sketch after this list).
• Written in pure C# with no third-party libraries. Full source code is available.
• Connect any AR device to a Windows PC or Mac over Wi-Fi: iOS + Windows PC, Android + macOS... any combination you can imagine!
• Compatible with Wikitude SDK Expert Edition.
• Compatible with VisionLib SDK.
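For the sensor bullet above, here is a minimal sketch of what an Editor test can look like, using Unity's standard Input.location, Input.compass, and Input.gyro APIs (the SensorDebugger class name is made up for illustration). With the AR Companion connected, these calls should return live values from the device during Play Mode:

```csharp
using System.Collections;
using UnityEngine;

// Illustrative sketch: reads GPS, compass, and gyroscope through Unity's
// standard sensor APIs while playing in the Editor.
public class SensorDebugger : MonoBehaviour {
    IEnumerator Start() {
        Input.compass.enabled = true;      // enable compass readings
        Input.gyro.enabled = true;         // enable gyroscope readings
        Input.location.Start();            // start Location Services (GPS)

        // Wait until Location Services finishes initializing.
        while (Input.location.status == LocationServiceStatus.Initializing)
            yield return new WaitForSeconds(1f);

        if (Input.location.status != LocationServiceStatus.Running) {
            Debug.LogError("Location Services failed to start.");
            yield break;
        }

        var data = Input.location.lastData;
        Debug.Log($"GPS: {data.latitude}, {data.longitude}");
        Debug.Log($"Compass heading: {Input.compass.trueHeading}");
        Debug.Log($"Gyro attitude: {Input.gyro.attitude}");
    }
}
```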
🎥 Session Recording and Playback 🎥
The Session Recording and Playback feature lets you record AR sessions to a file and play them back in a reproducible environment (see Limitations).
• Record and playback all supported features: face tracking, image tracking, plane tracking, touch input, you name it!
• Fix bugs that occur only under specific conditions. Replaying a previously recorded AR session in a reproducible environment helps you track down and fix bugs even faster!
• Record testing scenarios for your AR app. Your testers don't have to fight over testing devices ever again: record a testing scenario once, then play it back as many times as you want without an AR device.
⚓️ ARCore Cloud Anchors ⚓️
Testing ARCore Cloud Anchors doesn't have to be hard. With a custom fork adapted to work with AR Foundation Remote 2.0, you can run AR projects with ARCore Cloud Anchors right in the Unity Editor (see the sketch after this list).
• Host Cloud Anchors.
• Resolve Cloud Anchors.
• Record an AR session with ARCore Cloud Anchors and play it back in a reproducible environment.
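For illustration, here is a minimal hosting and resolving sketch, assuming the fork keeps Google's ARCore Extensions surface (ARAnchorManager.HostCloudAnchor and ResolveCloudAnchorId). The CloudAnchorExample class is hypothetical and error states are only partially handled:

```csharp
using Google.XR.ARCoreExtensions;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative sketch: host a local anchor as a Cloud Anchor, or resolve a
// previously hosted one by id; both operations complete asynchronously.
public class CloudAnchorExample : MonoBehaviour {
    [SerializeField] ARAnchorManager anchorManager;
    ARCloudAnchor cloudAnchor;

    public void Host(ARAnchor localAnchor) {
        // Uploads the anchor to the ARCore Cloud Anchor service.
        cloudAnchor = anchorManager.HostCloudAnchor(localAnchor);
    }

    public void Resolve(string cloudAnchorId) {
        // Re-creates a previously hosted anchor from its cloud id.
        cloudAnchor = anchorManager.ResolveCloudAnchorId(cloudAnchorId);
    }

    void Update() {
        // Poll until the async host/resolve operation succeeds.
        if (cloudAnchor != null && cloudAnchor.cloudAnchorState == CloudAnchorState.Success)
            Debug.Log($"Cloud Anchor ready: {cloudAnchor.cloudAnchorId}");
    }
}
```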
🕹 Input System (New) support 🕹
Version 2.0 brings Input System (New) support with all the benefits of input events and enhanced touch functionality.
• Input Remoting transmits all input events from your AR device back to the Editor. Test Input System multi-touch right in the Editor (see the sketch after this list)!
• Test Input Actions right in the Editor without making builds.
• Record all Input System events to a file and play them back in a reproducible environment. Again, all supported features can be recorded and played back!
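As an example of what input remoting enables, here is a minimal sketch built on the Input System's EnhancedTouch API (the TouchLogger class name is illustrative). With remoting active, touches from the connected device should appear in Editor Play Mode like local ones:

```csharp
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

// Illustrative sketch: logs Input System (New) multi-touch in the Editor.
public class TouchLogger : MonoBehaviour {
    void OnEnable() {
        // EnhancedTouch must be enabled explicitly; it is off by default.
        EnhancedTouchSupport.Enable();
    }

    void OnDisable() => EnhancedTouchSupport.Disable();

    void Update() {
        foreach (var touch in Touch.activeTouches)
            Debug.Log($"Touch {touch.touchId}: {touch.phase} at {touch.screenPosition}");
    }
}
```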
⚡ Supported AR subsystems ⚡
• ARCore Cloud Anchors: host and resolve ARCore Cloud Anchors.
• Meshing (ARMeshManager): physical environment mesh generation, ARKit mesh classification support.
• Occlusion (AROcclusionManager): ARKit depth/stencil human segmentation, ARKit/ARCore environment occlusion (see Limitations).
• Face Tracking: face mesh, face pose, eye tracking, ARKit Blendshapes.
• Body Tracking: ARKit 2D/3D body tracking, scale estimation.
• Plane Tracking: horizontal and vertical plane detection, boundary vertices, raycast support.
• Image Tracking: supports mutable image libraries and replacing the image library at runtime.
• Depth Tracking (ARPointCloudManager): feature points, raycast support.
• Camera: camera background video (see Limitations), camera position and rotation, facing direction, camera configurations.
• CPU images: camera and occlusion CPU images support (see Limitations).
• Anchors (ARAnchorManager): add/remove anchors and attach anchors to detected planes.
• Session subsystem: Pause/Resume, receive Tracking State, set Tracking Mode.
• Light Estimation: Average Light Intensity, Brightness, and Color Temperature; Main Light Direction, Color, and Intensity; Exposure Duration and Offset; Ambient Spherical Harmonics.
• Raycast subsystem: perform world-based raycasts against detected planes, point clouds, and the depth map.
• Object Tracking: ARKit object detection after scanning with the scanning app (see Limitations).
• ARKit World Map: full ARWorldMap support. Serialize the current world map, then deserialize a saved map and apply it to the current session (see the sketch below).
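To make the ARWorldMap bullet concrete, here is a minimal round-trip sketch using AR Foundation's standard ARKit API (ARKitSessionSubsystem.GetARWorldMapAsync, ARWorldMap.Serialize and TryDeserialize). The WorldMapExample class is illustrative and error handling is kept minimal:

```csharp
using System.Collections;
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARKit;

// Illustrative sketch: serialize the current ARWorldMap, then deserialize it
// and apply it back to the session. Run via StartCoroutine(SaveAndRestore()).
public class WorldMapExample : MonoBehaviour {
    [SerializeField] ARSession session;

    public IEnumerator SaveAndRestore() {
        var subsystem = (ARKitSessionSubsystem)session.subsystem;

        // Request the current world map; the request completes asynchronously.
        var request = subsystem.GetARWorldMapAsync();
        while (!request.status.IsDone())
            yield return null;

        if (request.status.IsError()) {
            Debug.LogError($"World map request failed: {request.status}");
            yield break;
        }

        var worldMap = request.GetWorldMap();
        var bytes = worldMap.Serialize(Allocator.Temp);
        Debug.Log($"Serialized world map: {bytes.Length} bytes");

        // Deserialize the saved bytes and apply the map to the current session.
        if (ARWorldMap.TryDeserialize(bytes, out ARWorldMap restored))
            subsystem.ApplyWorldMap(restored);

        bytes.Dispose();
        worldMap.Dispose();
    }
}
```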