visionOS 2 + Object Tracking + ARKit means we can create visual highlights of real-world objects around us and have those visualizations respond to the proximity of our hands.
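At its core the effect is a distance check between a hand-skeleton joint and a tracked object's anchor. Here is a minimal sketch of that idea (not this project's actual code; the function name and falloff values are mine):

```swift
import ARKit
import simd

// Sketch: map fingertip-to-object distance to a 0...1 highlight intensity,
// given a HandAnchor and ObjectAnchor delivered by their ARKit providers.
func highlightIntensity(for hand: HandAnchor, near object: ObjectAnchor) -> Float {
    guard object.isTracked,
          let fingertip = hand.handSkeleton?.joint(.indexFingerTip),
          fingertip.isTracked else { return 0 }

    // World-space fingertip position: hand transform * joint transform.
    let world = hand.originFromAnchorTransform * fingertip.anchorFromJointTransform
    let fingertipPosition = SIMD3<Float>(world.columns.3.x,
                                         world.columns.3.y,
                                         world.columns.3.z)
    let objectPosition = SIMD3<Float>(object.originFromAnchorTransform.columns.3.x,
                                      object.originFromAnchorTransform.columns.3.y,
                                      object.originFromAnchorTransform.columns.3.z)

    // Full highlight within 5 cm of the object's origin, fading out by 30 cm
    // (arbitrary values chosen for illustration).
    let distance = simd_distance(fingertipPosition, objectPosition)
    return 1 - simd_clamp((distance - 0.05) / 0.25, 0, 1)
}
```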
This project is largely a quick repurposing and combining of Apple's Scene Reconstruction sample project (which utilizes ARKit's `HandTrackingProvider`) and Object Tracking sample project.
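Combining the two mostly means running both providers in a single `ARKitSession`. Roughly (a sketch, assuming hand-tracking authorization has already been granted and `referenceObjects` holds the loaded reference objects; loading is shown further down):

```swift
import ARKit

let session = ARKitSession()
let handTracking = HandTrackingProvider()
let objectTracking = ObjectTrackingProvider(referenceObjects: referenceObjects)

Task {
    do {
        // One session drives both the hand and object anchors.
        try await session.run([handTracking, objectTracking])
        for await update in objectTracking.anchorUpdates {
            // React as tracked objects are added, updated, or removed.
            print(update.anchor.id, update.event)
        }
    } catch {
        print("ARKitSession failed to run: \(error)")
    }
}
```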
The full demo video with sound is here.
Some details about putting together this demo are over here.
- Choose your Apple Developer account under Signing & Capabilities
- Build
I live in Chicago and purchased the cereal and milk at a local Jewel in June 2024 – your local packaging may vary and prevent recognition. The three products used are:
If you want to strip out the three bundled objects and use your own:
- You will need to train on a `.usdz` file to create a `.referenceObject`; I recommend using Apple's Object Capture sample project to create a `.usdz` file of your object
- You will need to use Create ML (version 6 or higher, which comes bundled with Xcode 16) to train a `.referenceObject` from your `.usdz`; for me this process has taken anywhere from 4 to 16 hours per `.referenceObject`
- You will need to bundle your new `.referenceObject` in the Xcode project
- You will need to coordinate the naming of your new `.referenceObject` with the demo's `ObjectType` enum so everything plays nicely together (see the sketch after this list)