On 04/25/2012 08:03 AM, bootchk wrote:
> Chase wrote> "You have to put five or more fingers down on a trackpad to
> start sending events that aren't caught by Unity or X synaptics."
>
> My understanding now: Unity is a window manager (more or less). It is
> also in the so-called stack that processes touch events. It does
> gesture recognition and hides touch events from an application (possibly
> converting sequences of touch events, i.e. gestures, to other event
> types) unless five touches are used.
>
> Is there any way to disable unity gesture recognition (so that all touch
> events go to the application), similarly to the way you can disable
> gesture recognition in the synaptics Xorg input driver?
No. The Unity gesture specification doesn't allow for gesture
configuration, and that's what we are implementing. If you want to
influence this decision, I suggest sending a message to the unity-dev
mailing list.
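(For the synaptics side of the original question: the driver's gesture-related
options can be disabled per-device. A minimal sketch of an xorg.conf.d
InputClass snippet, assuming an xf86-input-synaptics touchpad; see the
synaptics(4) man page for the full option list, which varies by driver version:

```
Section "InputClass"
    Identifier "disable synaptics gestures"
    MatchIsTouchpad "on"
    MatchDriver "synaptics"
    # turn off multi-finger tap gestures
    Option "TapButton2" "0"
    Option "TapButton3" "0"
    # turn off two-finger scrolling gestures
    Option "VertTwoFingerScroll" "0"
    Option "HorizTwoFingerScroll" "0"
EndSection
```

The same options can be toggled at runtime with synclient, e.g.
`synclient TapButton2=0`.)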
> The stack: kernel device driver (e.g. wacom) > Xorg input driver (e.g.
> synaptics) > window manager (e.g. unity) > GUI toolkit (e.g. Qt) >
> application
>
> Shouldn't there be a set of protocols for querying and disabling gesture
> recognition throughout the entire stack? An API in the GUI TK that an
> app uses to configure touch? I suppose that is what you are architecting
> now. I understand it is not easy to do, in terms of coordinating all the
> software pieces and also in terms of providing a consistent user
> interface across platforms.
It's simply a matter of what gestures Unity subscribes to. It could be
configurable through gnome-control-center, for example. Some Unity
options are already provided there.