I'm basically trying to get all touch event data from something like a system overlay, move my sprites around based on that touch data, and then let the OS/homescreen/browser act on the data as it normally would (or vice versa). I have found similar questions, but nothing that leads me anywhere I haven't already been:
Getting the View that is receiving all the touch events
(Implemented, with results below) Creating a system overlay window (always on top)
What I can do:
I can EITHER grab ALL the touch events and act on them by moving my sprites, with the OS/homescreen/browser never seeing any of them, OR ELSE I can let the touch events pass through and only receive an ACTION_OUTSIDE event for my app to act on. (Both setups are sketched below.)
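For reference, the "grab everything" side looks roughly like this for me. This is only a minimal sketch, assuming a Service that holds the SYSTEM_ALERT_WINDOW permission; the service name and the bare View standing in for my sprite view are placeholders:

```java
import android.app.Service;
import android.content.Context;
import android.content.Intent;
import android.graphics.PixelFormat;
import android.os.IBinder;
import android.view.Gravity;
import android.view.MotionEvent;
import android.view.View;
import android.view.WindowManager;

public class GrabEverythingOverlayService extends Service {

    private WindowManager windowManager;
    private View overlay;

    @Override
    public void onCreate() {
        super.onCreate();
        windowManager = (WindowManager) getSystemService(Context.WINDOW_SERVICE);

        // TYPE_SYSTEM_ALERT delivers the full down/move/up gesture to this view,
        // but while the view is touchable nothing underneath sees those events.
        WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                WindowManager.LayoutParams.MATCH_PARENT,
                WindowManager.LayoutParams.MATCH_PARENT,
                WindowManager.LayoutParams.TYPE_SYSTEM_ALERT,
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
                PixelFormat.TRANSLUCENT);
        params.gravity = Gravity.TOP | Gravity.LEFT;

        overlay = new View(this); // stand-in for my real sprite view
        overlay.setOnTouchListener(new View.OnTouchListener() {
            @Override
            public boolean onTouch(View v, MotionEvent event) {
                // The full gesture arrives here -- move sprites, etc.
                // Returning true consumes it, so the homescreen/browser never sees it.
                return true;
            }
        });
        windowManager.addView(overlay, params);
    }

    @Override
    public void onDestroy() {
        super.onDestroy();
        if (overlay != null) {
            windowManager.removeView(overlay);
        }
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}
```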
My unattained goal:
I cannot for the life of me figure out a way to get BOTH working with the data. The only methods I can think of, and haven't been able to implement, are:
- Intercepting the data in my app and passing it on to the OS/homescreen/browser to work with (sketched below).
- Letting the OS/homescreen/browser get the data first and then somehow getting a callback with that information.
- Letting the OS/homescreen/browser get the data and act on it, then polling them for their scroll/location values so I can act on those in my app.
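To make the first idea concrete, this is roughly the kind of re-dispatch I had in mind (the class and method names are just mine for illustration). As far as I can tell it dead-ends because Instrumentation.sendPointerSync() needs the signature-level INJECT_EVENTS permission when the touch is aimed at another app's window, which a normal app can't be granted:

```java
import android.app.Instrumentation;
import android.view.MotionEvent;

// Hypothetical helper for idea 1: snoop the event in my overlay, then forward a copy onward.
public class TouchForwarder {

    private final Instrumentation instrumentation = new Instrumentation();

    // Typically called off the main thread; sendPointerSync() waits for the
    // event to be processed before returning.
    public void forward(MotionEvent event) {
        // Throws SecurityException for windows of other apps unless the caller
        // holds android.permission.INJECT_EVENTS (signature-level), so this
        // doesn't work for forwarding touches to the homescreen/browser.
        instrumentation.sendPointerSync(MotionEvent.obtain(event));
    }
}
```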
I fear this just isn't possible. I think I read somewhere, in documentation I can't find anymore, something like: "It's all or nothing: either your view gets all the events, or none of them."
(To avoid confusion, I don't mean I have two views. I mean I have one view, controlled via an activity/service, overlaying the OS/homescreen/browser. Like a pane of glass, if you will.)
Thank you for any helpful information you can offer; it's very much appreciated!
[UPDATE] Posted my own documentation on the matter below, so as not to be confusing.
Found this documentation that pretty much states that it's not possible to do both: Android : Multi touch and TYPE_SYSTEM_OVERLAY
They discuss workarounds, but I don't think any of them will actually work for exactly what I'm trying to do: both giving the events to the underlying app and being able to snoop on them so I can act on them myself.
To create an overlay view, when setting up the LayoutParams you need to set the type to TYPE_SYSTEM_OVERLAY and use the FLAG_WATCH_OUTSIDE_TOUCH flag. This presents a problem because, as the Android documentation states, "you will not receive the full down/move/up gesture, only the location of the first down as an ACTION_OUTSIDE." In order to receive the full array of touch events you need to use the TYPE_SYSTEM_ALERT type, but that causes the overlay to take over the screen and stop interaction with any other elements.
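For completeness, here's the pass-through setup I'm describing. It's the same Service skeleton as the sketch further up; only the window parameters and the touch listener change, and windowManager/overlay are the same placeholder fields:

```java
WindowManager.LayoutParams params = new WindowManager.LayoutParams(
        WindowManager.LayoutParams.MATCH_PARENT,
        WindowManager.LayoutParams.MATCH_PARENT,
        WindowManager.LayoutParams.TYPE_SYSTEM_OVERLAY,
        WindowManager.LayoutParams.FLAG_WATCH_OUTSIDE_TOUCH,
        PixelFormat.TRANSLUCENT);

overlay.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        // Touches fall straight through to the homescreen/browser underneath.
        // All this overlay ever gets is ACTION_OUTSIDE for the first down of a
        // gesture -- no move/up events follow.
        if (event.getAction() == MotionEvent.ACTION_OUTSIDE) {
            // react to the initial down here
        }
        return false;
    }
});
windowManager.addView(overlay, params);
```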
If anyone wants to disagree, I'd love to hear some good news :-D