After spending ages trying to get this working by setting .allowsHitTesting(true) on the SpriteView and letting its children manage all interaction, feeding it back to the RealityView when needed, I decided it just wasn't possible. RealityKit doesn't really want to play nicely with anything else.

So what I did was create a simple ApplicationModel:

import Combine

public class ApplicationModel: ObservableObject {

    static let shared = ApplicationModel()

    // true while the SpriteKit HUD should receive touches; false while the
    // RealityKit scene should handle gestures instead.
    @Published var hudInControl: Bool = false

}

and then in the ContentView do this:

import SwiftUI
import RealityKit
import SpriteKit

struct ContentView: View {
    
    @Environment(\.mainWindowSize) var mainWindowSize

    @StateObject var appModel : ApplicationModel = .shared

    var body: some View {
        ZStack {
            RealityView { content in
                // On an iOS device that is not the simulator,
                // use the spatial tracking camera.
                #if os(iOS) && !targetEnvironment(simulator)
                content.camera = .spatialTracking
                #endif
                createGameScene(content)
            }.gesture(tapEntityGesture)
            // When this app runs on macOS or iOS simulator,
            // add camera controls that orbit the origin.
            #if os(macOS) || (os(iOS) && targetEnvironment(simulator))
            .realityViewCameraControls(.orbit)
            #endif

            let hudScene = HUDScene(size: mainWindowSize)
            
            SpriteView(scene: hudScene, options: [.allowsTransparency])
            
            // This line decides which layer gets the touches: when
            // appModel.hudInControl is true, the SpriteKit HUD receives the
            // events; when it is false, hit testing on the SpriteView is
            // disabled and gestures fall through to the RealityView instead.
            // (See below for how taps can still be offered to the HUD while
            // RealityKit is in control.)
                .allowsHitTesting(appModel.hudInControl)
        }
    }
}

This gives the app control over whether RealityKit or SpriteKit receives the user interaction events. When the app starts, interaction goes through the RealityKit environment by default.

When the user then triggers something that hands control to the 2D environment, appModel.hudInControl is set to true and it just works.
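
As a minimal sketch of that hand-off (presentOverlay() and dismissOverlay() are hypothetical names; the real trigger depends on the game), the HUD can simply flip the flag when it wants the touches:

extension HUDScene {

    // Called when the game wants the 2D HUD to take over interaction.
    func presentOverlay() {
        ApplicationModel.shared.hudInControl = true
        // ... unhide / animate in the overlay nodes ...
    }

    // Called when the HUD is dismissed; RealityKit gestures work again.
    func dismissOverlay() {
        ApplicationModel.shared.hudInControl = false
        // ... hide the overlay nodes ...
    }
}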

For those situations where I have a HUD-based button that should stay sensitive to taps even when the HUD is not in control, the tapEntityGesture handler offers the tap to the HUD first; if the HUD does not consume it, I then use it as needed within the RealityView.
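
A rough sketch of what that hand-off can look like. It assumes the HUD scene is stored somewhere the gesture can reach it (e.g. as a property rather than a local in body); handleEntityTap(_:), hudConsumesTap(at:) and the "hudButton" node name are hypothetical, and the point conversion assumes the SpriteView and the RealityView share the same frame:

// Sketch only: offer the tap to the HUD first, then fall back to RealityKit.
var tapEntityGesture: some Gesture {
    SpatialTapGesture()
        .targetedToAnyEntity()
        .onEnded { value in
            // Let the HUD claim the tap if an interactive node sits under it.
            if hudScene.hudConsumesTap(at: value.location) {
                return
            }
            // Otherwise handle the tapped RealityKit entity as usual.
            handleEntityTap(value.entity)
        }
}

extension HUDScene {

    // Returns true when an interactive SKNode is under the tapped point.
    func hudConsumesTap(at viewPoint: CGPoint) -> Bool {
        // Scene coordinates have a flipped y-axis relative to view coordinates.
        let scenePoint = convertPoint(fromView: viewPoint)
        return nodes(at: scenePoint).contains { $0.name == "hudButton" }
    }
}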

Posted by: PKCLsoft