I want to detect a click/touch event on my 2D GameObject.
This is my code:
void Update()
{
    if (Input.touchCount > 0)
    {
        Debug.Log("Touch");
    }
}
The Debug.Log("Touch") message does not show when I click on the screen or on my GameObject.
Short answer: yes, touch may be handled with Input.GetMouseButtonDown().
Input.GetMouseButtonDown(), Input.mousePosition, and associated functions work as a tap on the touch screen (which is kind of odd, but welcome). If you don't have a multi-touch game, this is a good way to keep the in-editor game functioning well while still keeping touch input for devices. (source: Unity Community)
Mouse simulation with touches can be enabled/disabled with the Input.simulateMouseWithTouches option. This option is enabled by default.
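As a minimal sketch of this mouse-simulation approach (the class name is mine; the Input calls are standard Unity API), a tap can be read through the mouse functions like this:

```csharp
using UnityEngine;

// Sketch: reading a tap through mouse simulation.
// On a touch screen, the first touch also fires GetMouseButtonDown(0)
// as long as Input.simulateMouseWithTouches is enabled (it is by default).
public class TapViaMouse : MonoBehaviour
{
    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            // Input.mousePosition holds the tap position in screen coordinates.
            Debug.Log("Tap/click at " + Input.mousePosition);
        }
    }
}
```

This is convenient because the same script works both in the Editor (with a real mouse) and on a device (with touches).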
Though mouse simulation is good for testing, I believe Input.GetTouch() should be used in production code, because it is able to handle simultaneous touches.
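A small sketch of that multi-touch pattern (the class name is mine; Input.touchCount, Input.GetTouch, and TouchPhase are standard Unity API):

```csharp
using UnityEngine;

// Sketch: iterating over all active touches with Input.GetTouch().
// Unlike mouse simulation, this sees every simultaneous touch.
public class MultiTouchLogger : MonoBehaviour
{
    void Update()
    {
        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch touch = Input.GetTouch(i);
            if (touch.phase == TouchPhase.Began)
            {
                // fingerId stays stable for the lifetime of each touch.
                Debug.Log("Touch " + touch.fingerId + " began at " + touch.position);
            }
        }
    }
}
```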
An interesting approach is to add touch handling to the OnMouseUp()/OnMouseDown() events:
// OnTouchDown.cs
// Allows "OnMouseDown()" events to work on the iPhone.
// Attach to the main camera.
using UnityEngine;

public class OnTouchDown : MonoBehaviour {
    void Update () {
        RaycastHit hit;
        for (int i = 0; i < Input.touchCount; ++i) {
            if (Input.GetTouch(i).phase == TouchPhase.Began) {
                // Construct a ray from the current touch coordinates
                Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(i).position);
                if (Physics.Raycast(ray, out hit)) {
                    hit.transform.gameObject.SendMessage("OnMouseDown");
                }
            }
        }
    }
}
(source: Unity Answers)
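One caveat for your case: since the question is about a 2D GameObject, note that Physics.Raycast only hits 3D colliders. For objects with a Collider2D, a variant of the same idea using the Physics2D API might look like this (the class name is mine; ScreenToWorldPoint and Physics2D.OverlapPoint are standard Unity API):

```csharp
using UnityEngine;

// Sketch: 2D version of the script above.
// Physics.Raycast ignores 2D colliders, so test the touch point
// against Collider2D components with Physics2D instead.
// Attach to the main camera; the target object needs a Collider2D.
public class OnTouchDown2D : MonoBehaviour {
    void Update () {
        for (int i = 0; i < Input.touchCount; ++i) {
            if (Input.GetTouch(i).phase == TouchPhase.Began) {
                // Convert the touch position from screen space to world space.
                Vector2 worldPoint = Camera.main.ScreenToWorldPoint(Input.GetTouch(i).position);
                // Find a 2D collider overlapping that world point, if any.
                Collider2D hit = Physics2D.OverlapPoint(worldPoint);
                if (hit != null) {
                    hit.SendMessage("OnMouseDown", SendMessageOptions.DontRequireReceiver);
                }
            }
        }
    }
}
```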
UPD.: There is the Unity Remote mobile app for simulating touch input in Editor mode (works with Unity Editor 4 and Unity Editor 5).