Recent Questions - Game Development Stack Exchange

Unity XR Build and Run no longer working

I have connected the Meta Quest 3 to my computer via a USB cable and allowed debugging on the device.

I have opened the default VR template from Unity Hub, without any alterations.

In Unity I select "Build and Run" for the default scene. This gives the following messages:

  • Application installed to device "2G0Y...ZK [Quest 3]".
  • Build completed with a result of 'Succeeded' in 11 seconds (10622 ms).

The problem is that the app does not appear on the Meta Quest at all. The Quest does not respond, and there are no error messages.
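
A first thing worth ruling out (my assumption, not something stated in the question): apps installed over USB are sideloaded, and the Quest lists them under Library → Unknown Sources rather than in the main app grid, so a build can install successfully yet seem to be missing. Running adb devices and adb shell pm list packages from a terminal can confirm that the headset is connected and that the package really was installed.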

For a Unity 3D multiplayer game, how to spawn and despawn multiple GameObjects for a specific client?

I am working on a Unity 3D multiplayer game. There are three GameObjects (a table, a chair, and a pen), and apart from the host I have two clients, a student and a teacher; consider them two roles. I know how to instantiate a prefab and sync it across clients, but if I want to spawn or despawn one or two GameObjects for client 1 and not for client 2, how can I achieve that?

private void Update()
{
    if (!IsOwner) return;

    // Spawn a networked object, replicated to all connected clients.
    if (Input.GetKeyDown(KeyCode.T))
    {
        spawnedObjectTransform = Instantiate(spawnedObjectPrefab);
        spawnedObjectTransform.GetComponent<NetworkObject>().Spawn(true);
    }

    // Despawn it again.
    if (Input.GetKeyDown(KeyCode.Y))
    {
        Destroy(spawnedObjectTransform.gameObject);
    }

    // Simple WASD movement.
    Vector3 moveDir = new Vector3(0, 0, 0);
    if (Input.GetKey(KeyCode.W)) moveDir.z = +1f;
    if (Input.GetKey(KeyCode.S)) moveDir.z = -1f;
    if (Input.GetKey(KeyCode.A)) moveDir.x = -1f;
    if (Input.GetKey(KeyCode.D)) moveDir.x = +1f;

    float moveSpeed = 3f;
    transform.position += moveDir * moveSpeed * Time.deltaTime;
}

Or is there another alternative, such as defining roles for the clients and then setting permissions in the scene based on those roles? If so, how can I do that? (One possible direction is sketched below.)

Thanks in advance!
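
One possible direction (a sketch under the assumption that this is Unity Netcode for GameObjects, which the NetworkObject.Spawn call above suggests; not the asker's own code): NetworkObject exposes per-client visibility through NetworkShow and NetworkHide, which the server can drive from whatever role data it keeps. The class name and prefab field below are hypothetical.

using Unity.Netcode;
using UnityEngine;

public class RoleBasedSpawner : NetworkBehaviour
{
    [SerializeField] private NetworkObject tablePrefab; // hypothetical prefab reference

    // Server-only: spawn the object for everyone, then hide it from one client.
    public void SpawnHiddenFromClient(ulong hiddenClientId)
    {
        if (!IsServer) return;

        NetworkObject instance = Instantiate(tablePrefab);
        instance.Spawn(true);                 // replicate to all connected clients
        instance.NetworkHide(hiddenClientId); // that client despawns its local copy
    }

    // Server-only: make the object visible to that client again later.
    public void RevealToClient(NetworkObject instance, ulong clientId)
    {
        if (IsServer)
        {
            instance.NetworkShow(clientId);
        }
    }
}

Roles could then be as simple as a server-side dictionary from client ID to role, consulted before calling NetworkShow or NetworkHide; note that the host's own client cannot be hidden from.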

Converting a Rendered Canvas from Overlay to World Space while facing camera

I'm working on converting a 3D game to VR. I've made some progress, but the UI has been challenging. From my understanding (and experience), a canvas rendered in world space is the recommended approach in VR.

Here is a video of what I see after converting the overlay canvas to world space. The two arrow sprites follow the hand correctly, but from some angles, we end up looking at them edge-on like this:

Arrows at an angle

But I want them to always face the player's head flat-on like this:

Arrows flat-on

I think I have identified the issue, though my programming is a work in progress. I'm wondering if this could be the problem:

public static float Atan2(float y, float x);

Description
Returns the angle in radians whose Tan is y/x.

Return value is the angle between the x-axis and a 2D vector starting at zero and terminating at (x,y).
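
(For example, Mathf.Atan2(1f, 1f) returns π/4 radians, i.e. 45°, the direction of the vector (1, 1).)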

using UnityEngine;

public class UIManager : MonoBehaviour
{
    public GameObject WindVectorUI;
    public GameObject SailDirectionUI;
    public GameObject ApparentWindUI;

    // Update is called once per frame
    void Update()
    {
        UpdateWindVectorUI();
        UpdateSailDirectionUI();
        UpdateApparentWindUI();
    }

    // Calculates the angle of the wind vector using Mathf.Atan2(), which returns
    // the angle in radians between the x-axis and the vector pointing to (x, y).
    void UpdateWindVectorUI()
    {
        float AngleInRad = Mathf.Atan2(WindManager.instance.CurrentTrueWind.y, WindManager.instance.CurrentTrueWind.x);
        WindVectorUI.transform.rotation = Quaternion.Euler(0, 0, AngleInRad * Mathf.Rad2Deg);
    }

    // Derives the sail indicator's roll from the sail's yaw relative to the boat.
    void UpdateSailDirectionUI()
    {
        SailDirectionUI.transform.rotation = Quaternion.Euler(0, 0, -BoatManager.Player.Sail.transform.localRotation.eulerAngles.y + 90 - BoatManager.Player.transform.localRotation.eulerAngles.y);
    }

    // Same Atan2 mapping, applied to the apparent wind.
    void UpdateApparentWindUI()
    {
        float AngleInRad = Mathf.Atan2(BoatManager.Player.ApparentWind.y, BoatManager.Player.ApparentWind.x);
        ApparentWindUI.transform.rotation = Quaternion.Euler(0, 0, AngleInRad * Mathf.Rad2Deg);
    }
}
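
A note on the suspected cause: Quaternion.Euler(0, 0, angle) sets an absolute world rotation, so the arrows always face world +Z regardless of where the player's head is, which would explain the edge-on views. A common fix is to billboard the element toward the camera. Below is a minimal sketch (FaceCamera is a hypothetical component, and it assumes Camera.main is the HMD camera):

using UnityEngine;

public class FaceCamera : MonoBehaviour
{
    private Camera targetCamera;

    void Start()
    {
        targetCamera = Camera.main; // in VR rigs this is usually the head/eye camera
    }

    void LateUpdate()
    {
        if (targetCamera == null) return;

        // Aim the element's +Z away from the camera so its front stays flat-on
        // to the player's head from any viewing angle.
        Vector3 away = transform.position - targetCamera.transform.position;
        transform.rotation = Quaternion.LookRotation(away);
    }
}

If this runs on the same objects that UIManager rotates, the two scripts will fight; one option is to let FaceCamera own the facing and apply the Atan2-derived angle as a roll on a child object, or multiply it in as Quaternion.Euler(0, 0, AngleInRad * Mathf.Rad2Deg) after the look rotation.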

Correcting Coordinate Misalignment When Casting Rays from Touch Position on RenderTexture in Unity and Meta Quest

I am developing a VR application for Meta Quest using Unity. In this application, the user interacts using a controller, and the user's hand is visually represented in the virtual space.

Objective:

I want to cast a ray into the virtual space corresponding to the point touched by the hand on an object (hereafter referred to as the "panel") that has a RenderTexture showing part of the virtual space. For debugging purposes, a cube with scale (0.1, 0.1, 0.1) is displayed at the hit position for 0.1 seconds. Eventually, I plan to activate a particle system on the cube existing at the hit position.

Additionally, the camera that renders to the RenderTexture is attached to other players and is always in motion.

Current Issue:

Currently, touching the panel does not trigger any actions. Using a previous method, a cube is indeed created when touched, but the coordinates where the ray is cast are significantly misaligned.

Referenced Articles:

How do you get the texture coordinate hit by the mouse on a UI raw image in Unity?

Method to display effects on objects touched via RenderTexture (Japanese)

The second link is from a question I previously asked. With this method, a cube is generated upon touch, but the coordinates where the ray is cast are greatly misaligned.

Code:

Here is the main code attached to the panel. Any suggestions or modifications to correct the issue would be greatly appreciated.

using UnityEngine;
using Photon.Pun;
using UnityEngine.UI;

public class PanelManager : MonoBehaviourPun
{
    public Camera displayRenderCamera; // Camera that renders to the RenderTexture
    private RawImage displayGameObject; // GameObject displaying the RenderTexture
    private Vector3? colliderPoint = null; // Intersection point with the collider

    void Start()
    {
        InitializeCameraAndPanel();
    }

    void Update()
    {
        bool gripHeld = OVRInput.Get(OVRInput.Button.PrimaryHandTrigger, OVRInput.Controller.RTouch);
        bool triggerNotPressed = !OVRInput.Get(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch);

        // Grip held without the trigger is treated as a pointing gesture.
        if (gripHeld && triggerNotPressed && colliderPoint != null)
        {
            InteractWithRenderTexture();
        }

        // Re-resolves the camera and panel every frame; FindObjectsOfType is
        // expensive, so this could run only when players join or leave.
        InitializeCameraAndPanel();
    }

    private void InitializeCameraAndPanel()
    {
        PhotonView[] allPhotonViews = FindObjectsOfType<PhotonView>();

        foreach (PhotonView view in allPhotonViews)
        {
            if (view.Owner != null)
            {
                if (view.Owner.ActorNumber != PhotonNetwork.LocalPlayer.ActorNumber)
                {
                    GameObject camera = view.gameObject.transform.Find("Head/ViewCamera")?.gameObject;
                    if (camera != null)
                    {
                        displayRenderCamera = camera.GetComponent<Camera>();
                        Debug.Log(displayRenderCamera);
                    }
                }
                else if (view.Owner.ActorNumber == PhotonNetwork.LocalPlayer.ActorNumber)
                {
                    GameObject panel = view.gameObject.transform.Find("Panel/Panel")?.gameObject;
                    if (panel != null)
                    {
                        displayGameObject = panel.GetComponent<RawImage>();
                    }
                }
            }
        }
    }

    private void InteractWithRenderTexture()
    {
        if (colliderPoint == null) return;

        Vector3 worldSpaceHitPoint = colliderPoint.Value;

        Vector2 localHitPoint = displayGameObject.rectTransform.InverseTransformPoint(worldSpaceHitPoint);

        var rect = displayGameObject.rectTransform.rect;
        Vector2 textureCoord = localHitPoint - rect.min;
        textureCoord.x *= displayGameObject.uvRect.width / rect.width;
        textureCoord.y *= displayGameObject.uvRect.height / rect.height;
        textureCoord += displayGameObject.uvRect.min;

        Ray ray = displayRenderCamera.ViewportPointToRay(new Vector3(textureCoord.x, textureCoord.y, 0));

        // Debug: Show a red cube at the touch location
        Vector3 point = ray.GetPoint(2.0f);
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.position = point;
        cube.transform.localScale = new Vector3(0.1f, 0.1f, 0.1f);
        cube.GetComponent<Renderer>().material.color = Color.red;
        Destroy(cube, 0.1f);

        if (Physics.Raycast(ray, out var hit, 10.0f))
        {
            if (hit.transform.TryGetComponent<CubeManager>(out var cubeManager))
            {
                cubeManager.StartParticleSystem();
            }
        }
    }

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("rightHand"))
        {
            // Model the panel as a plane through its position with transform.forward
            // as the normal, and record where the hand's collider projects onto it.
            var plane = new Plane(transform.forward, transform.position);

            colliderPoint = plane.ClosestPointOnPlane(other.bounds.center);
        }
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("rightHand"))
        {
            colliderPoint = null;
        }
    }
}

I've revisited the RenderTexture settings to ensure the virtual space is rendered correctly.

How should I modify the code to accurately cast rays based on the touch position?

Thank you for your assistance!
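
Not a definitive fix, but one way to narrow the problem down (CastFromPanelPoint is a hypothetical helper reusing the fields above): compute the normalized rect coordinate with Rect.PointToNormalized rather than the manual rect.min subtraction, and draw the ray so it can be inspected in the Scene view. If the drawn ray already points the wrong way, the error is in the panel-to-viewport mapping (or in OnTriggerEnter's plane, which assumes transform.forward is the panel's normal); if the ray looks right, the problem is on the raycast side.

private void CastFromPanelPoint(Vector3 worldSpaceHitPoint)
{
    RectTransform rt = displayGameObject.rectTransform;

    // Local point inside the RawImage's rect, normalized to the 0..1 range.
    Vector2 localPoint = rt.InverseTransformPoint(worldSpaceHitPoint);
    Vector2 normalized = Rect.PointToNormalized(rt.rect, localPoint);

    // Account for a uvRect that shows only part of the RenderTexture.
    Rect uv = displayGameObject.uvRect;
    Vector2 viewport = uv.min + Vector2.Scale(normalized, uv.size);

    // Cast from the rendering camera and draw the ray for visual inspection.
    Ray ray = displayRenderCamera.ViewportPointToRay(viewport);
    Debug.DrawRay(ray.origin, ray.direction * 10f, Color.green, 1.0f);
}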
