Most engines have trouble interacting with 2D menus in VR runtimes such as Oculus, OpenVR, etc. Is there a best practice for Lumberyard in this case?
You can set a UI canvas to render to a texture that you then display on a quad floating in the 3D world. You can still interact with the UI elements.
For an example see the SamplesProject UiIn3dWorld level.
Does that work for your use case?
I created a post of my own, but I noticed that the 3D-world UI canvases are designed to be interacted with via a mouse cursor. Could you recommend a way to interact with a UI rendered to a texture using my controller's raycast hit position?
The raycast can return the entityId of what it hits, but it can't differentiate between individual UI components. Does this mean each button would have to be created as its own texture?
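One alternative to splitting every button into its own texture: since the quad is flat, you can project the raycast hit point onto the quad's edge vectors to get UV coordinates, scale those to canvas pixels, and hit-test the UI element rectangles yourself (or feed the position to whatever pointer-forwarding the engine exposes). A minimal geometric sketch, with illustrative function and element names that are assumptions, not Lumberyard API:

```python
# Hypothetical sketch: map a 3D raycast hit on a flat quad to 2D canvas
# pixel coordinates, then hit-test rectangles for individual UI elements.
# Names here are illustrative; none of this is Lumberyard's actual API.

def hit_to_canvas_px(hit, quad_origin, quad_u, quad_v, canvas_w, canvas_h):
    """Project the hit point onto the quad's edge vectors quad_u/quad_v
    to get UV in [0, 1], then scale UV to canvas pixel coordinates."""
    rel = [h - o for h, o in zip(hit, quad_origin)]
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    u = dot(rel, quad_u) / dot(quad_u, quad_u)
    v = dot(rel, quad_v) / dot(quad_v, quad_v)
    return u * canvas_w, v * canvas_h

def element_under_point(px, py, elements):
    """elements: list of (name, x, y, w, h) rects in canvas pixels.
    Returns the first element containing the point, or None."""
    for name, x, y, w, h in elements:
        if x <= px <= x + w and y <= py <= y + h:
            return name
    return None

# Example: a 2x1 meter quad at the origin showing a 1024x512 canvas.
px, py = hit_to_canvas_px(
    hit=(0.5, 0.25, 0.0),
    quad_origin=(0.0, 0.0, 0.0),
    quad_u=(2.0, 0.0, 0.0),   # quad's horizontal edge vector
    quad_v=(0.0, 1.0, 0.0),   # quad's vertical edge vector
    canvas_w=1024, canvas_h=512)
# (px, py) is now (256.0, 128.0)
buttons = [("PlayButton", 200, 100, 200, 50)]
print(element_under_point(px, py, buttons))  # -> PlayButton
```

This way a single render-to-texture canvas still works: one raycast gives you one entityId for the quad, and the UV math recovers which element under that point was hit.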