I want to know when a user touches anywhere on the screen of my app.
I have looked into using -(UIResponder *)nextResponder, but unfortunately this will not work, because I also reload a table automatically, and that triggers it as well.
I have also tried a gesture recognizer with the following code, but it only recognizes touches on the view itself, whereas I have many buttons the user will be using to operate the app. I would like to avoid adding a gesture recognizer, or code for this, to every button and segmented control I have on the screen.
UITapGestureRecognizer *tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapOnView:)];
[self.mainView addGestureRecognizer:tap];
- (void)tapOnView:(UITapGestureRecognizer *)sender
{
//do something
}
I have also tried overriding -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event, but this has the same issue as the gesture recognizer.
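For reference, this is roughly what that attempt looked like (a minimal sketch, overridden in the view controller in question):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
    // Only fires for touches that reach this view controller's own view;
    // touches consumed by buttons and other controls never get here.
}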
Is there any way I can achieve this? I was hoping I might be able to recognize the type of event from within nextResponder, so that I could detect whether it came from a button, for example.
EDIT: The reason I am working on this is that my app needs to stay active and the screen cannot be locked (so I have disabled screen locking). To avoid excessive use of power, I need to dim the screen, and then return the brightness to its original level once the app is touched. I need this behaviour to occur on only one of my view controllers.
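To illustrate, the dim/restore part itself is straightforward (a rough sketch; originalBrightness is just a property name I'm using here); what I'm missing is a reliable, app-wide trigger for restoreBrightness:

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Keep the screen awake while this view controller is in use.
    [UIApplication sharedApplication].idleTimerDisabled = YES;
    // Remember the user's brightness so it can be restored later (assumed property).
    self.originalBrightness = [UIScreen mainScreen].brightness;
}

- (void)dimScreen
{
    // Drop to a low level after a period of inactivity.
    [UIScreen mainScreen].brightness = 0.1;
}

- (void)restoreBrightness
{
    // Called whenever the user touches the screen again.
    [UIScreen mainScreen].brightness = self.originalBrightness;
}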
As mentioned by Ian MacDonald, using hitTest:withEvent: is a great solution for detecting user interaction app-wide, including when buttons, text fields, etc. are selected.
My solution was to subclass UIWindow and override the hitTest:withEvent: method:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
// do your stuff here
// return nil if you want to prevent interaction with UI elements
return [super hitTest:point withEvent:event];
}
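For completeness, here is roughly how the whole thing fits together (the class name MyWindow and the notification name are placeholders I'm using for this sketch):

// MyWindow.h / MyWindow.m
#import <UIKit/UIKit.h>

@interface MyWindow : UIWindow
@end

@implementation MyWindow

- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    // Every touch in the app passes through the key window's hit test,
    // including touches that land on buttons, text fields and segmented controls.
    // Note: hitTest:withEvent: can be called more than once per touch,
    // so keep the work done here cheap.
    [[NSNotificationCenter defaultCenter] postNotificationName:@"MyWindowTouched"
                                                        object:nil];
    return [super hitTest:point withEvent:event];
}

@end

Then return an instance of the subclass in the app delegate, and have the one view controller that handles the dimming observe the notification:

// AppDelegate.m, in application:didFinishLaunchingWithOptions:
self.window = [[MyWindow alloc] initWithFrame:[UIScreen mainScreen].bounds];

// In the view controller that dims the screen:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(restoreBrightness)
                                             name:@"MyWindowTouched"
                                           object:nil];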