multitouch + qt doesn't work (eg fingerpaint demo) with wacom serial touchscreen

Bug #901630 reported by Erno Kuusela
This bug affects 2 people
Affects: xf86-input-wacom (Ubuntu)
Status: Confirmed
Importance: Undecided
Assigned to: Unassigned

Bug Description

I found a thread at https://lists.launchpad.net/multi-touch-dev/msg00798.html saying that multitouch now works in Ubuntu's Qt
and mentioning the Qt touch demos. Two-finger pinch-zoom using the touchscreen works in Firefox on this hardware (HP EliteBook 2760p) out of the box, but none of the Qt multitouch demos (eg fingerpaint, pinchzoom) from the qt4-demos package work.

I tried to follow https://wiki.ubuntu.com/Multitouch/Testing but got stuck at https://wiki.ubuntu.com/Multitouch/Testing/CheckingMTDevice -

the only plausible-sounding device out of these...

$ sudo lsinput|grep name
   name : "Sleep Button"
   name : "Lid Switch"
   name : "Power Button"
   name : "AT Translated Set 2 keyboard"
   name : "HP HD Webcam [Fixed]"
   name : "HP WMI hotkeys"
   name : "PS/2 Generic Mouse"
   name : "SynPS/2 Synaptics TouchPad"
   name : "ST LIS3LV02DL Accelerometer"
   name : "Video Bus"
   name : "HDA Intel PCH HDMI/DP,pcm=3"
   name : "HDA Intel PCH Line In at Sep Rea"
   name : "HDA Intel PCH Mic at Ext Right J"
   name : "HDA Intel PCH Line Out at Sep Re"
   name : "HDA Intel PCH HP Out at Ext Righ"

was the Synaptics touchpad at /dev/input/event7, but:

$ mtdev-test /dev/input/event7
error: could not open device
$ sudo mtdev-test /dev/input/event7
error: could not grab the device

Also, xinput_calibrator shows the touchscreen as this, which didn't show up in lsinput:
$ xinput_calibrator --list
Device "Serial Wacom Tablet stylus" id=13
Device "Serial Wacom Tablet eraser" id=15
Device "Serial Wacom Tablet touch" id=16

https://wiki.ubuntu.com/Multitouch/HardwareSupport says that Wacom should work.

ProblemType: Bug
DistroRelease: Ubuntu 11.10
Package: libqt4-core (not installed)
ProcVersionSignature: Ubuntu 3.0.0-12.20-generic 3.0.4
Uname: Linux 3.0.0-12-generic x86_64
ApportVersion: 1.23-0ubuntu4
Architecture: amd64
Date: Thu Dec 8 12:53:16 2011
InstallationMedia: Ubuntu 11.10 "Oneiric Ocelot" - Release amd64 (20111012)
ProcEnviron:
 PATH=(custom, no user)
 LANG=en_US.UTF-8
 SHELL=/bin/bash
SourcePackage: qt4-x11
UpgradeStatus: No upgrade log present (probably fresh install)

Erno Kuusela (erno-iki)
tags: added: multitouch
Revision history for this message
Erno Kuusela (erno-iki) wrote :

Looks like the wacom driver just converts the touch gestures (eg pinch) to scroll wheel events, so it's not Qt's fault.

affects: qt4-x11 (Ubuntu) → xf86-input-wacom (Ubuntu)
summary: - multitouch + qt doesn't work (eg fingerpaint demo)
+ multitouch + qt doesn't work (eg fingerpaint demo) with wacom serial
+ touchscreen
Revision history for this message
Brent Fox (brent-s-fox) wrote :

I have a 2740p, which I believe is just a smaller screen version of the model referenced in this bug report.

I'm using 11.10. When I try to do a two-finger pinch or rotate, the system no longer recognizes any right-mouse clicks on the trackpad device.

https://wiki.ubuntu.com/Multitouch/HardwareSupport suggests using 'setserial' for this device, but I'm not sure how to do that. Any pointers?

Revision history for this message
Chase Douglas (chasedouglas) wrote :

If it's a serial touchscreen, try these instructions from Brian Murray:

http://www.murraytwins.com/blog/?p=103

If it's a usb touchscreen, try uninstalling xserver-xorg-input-wacom.

Hopefully after the above your touchscreen will be using the X evdev input module. It has support for multitouch, which will then work with uTouch for gestures.

Revision history for this message
bootchk (bootch) wrote :

I had a similar issue and yes:

sudo apt-get remove xserver-xorg-input-wacom

changes the behaviour. Previously, Settings > Input Devices > Trackpad showed no devices; now it shows my Wacom CTT-460. Also, mtdev-test still fails as above, but geistest seems to show the device working with pinches. (geisview fails because gtk is not installed.)

Other sources say that the kernel module named "wacom", a *driver*, is distinct from the *Xorg input driver* named "xserver-xorg-input-wacom." Also, the kernel module named "evdev" is distinct from the Xorg input driver named similarly. I don't understand the difference, but /var/log/dmesg still shows a "wacom" driver being used for my device.

I am running Kubuntu 12.04 beta.

My best guess is that it is an architectural issue: client vs. server. I recall that Ubuntu is moving multitouch and gesture recognition to the client side (the windowing system and application, more or less) from the server side (Xorg, the X11 server). In other words, the Xorg input driver named evdev passes raw multitouch events on to the client (Qt) instead of interpreting them???

Revision history for this message
bootchk (bootch) wrote :

According to some sources, multitouch doesn't work on the Ubuntu desktop until Qt 5;
it only works on the Windows desktop or some mobile platforms.

http://qt-project.org/forums/viewthread/12664

Just to be clear, Chase above is correct that uTouch should work, but it's not necessarily part of Qt?
uTouch is for the Ubuntu platform; Qt uses other technology on Windows.

Revision history for this message
Erno Kuusela (erno-iki) wrote :

I got Qt multitouch working on another laptop after a driver fix (see #724831), so Qt isn't the problem.

Revision history for this message
Chase Douglas (chasedouglas) wrote :

Hi bootchk,

Multitouch is supported in Qt in Ubuntu. We added support for it in Ubuntu 11.04 and have maintained it since then. We haven't had a chance to merge it upstream yet.
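
For reference, the application side looks roughly like the sketch below (Qt 4 touch API). This is illustrative only and not taken from the fingerpaint demo source; the class and member names are made up, but a widget along these lines is what should start receiving multi-finger input once the device side works:

#include <QApplication>
#include <QImage>
#include <QPainter>
#include <QTouchEvent>
#include <QWidget>

class TouchPaintWidget : public QWidget
{
public:
    TouchPaintWidget() : image(640, 480, QImage::Format_RGB32)
    {
        // Without this attribute the widget only ever sees synthesized
        // mouse events, never QTouchEvents.
        setAttribute(Qt::WA_AcceptTouchEvents);
        image.fill(qRgb(255, 255, 255));
        resize(image.size());
    }

protected:
    bool event(QEvent *e)
    {
        switch (e->type()) {
        case QEvent::TouchBegin:
        case QEvent::TouchUpdate:
        case QEvent::TouchEnd: {
            QTouchEvent *touch = static_cast<QTouchEvent *>(e);
            QPainter p(&image);
            // One stroke per tracked finger; if the driver only reports a
            // single touch point, you only ever get single-finger painting.
            foreach (const QTouchEvent::TouchPoint &tp, touch->touchPoints())
                p.drawLine(tp.lastPos(), tp.pos());
            update();
            return true; // accept, so Qt keeps delivering touch updates
        }
        default:
            return QWidget::event(e);
        }
    }

    void paintEvent(QPaintEvent *)
    {
        QPainter p(this);
        p.drawImage(0, 0, image);
    }

private:
    QImage image;
};

int main(int argc, char **argv)
{
    QApplication app(argc, argv);
    TouchPaintWidget w;
    w.show();
    return app.exec();
}

The key points are Qt::WA_AcceptTouchEvents and accepting the TouchBegin event; without those, Qt only delivers synthesized mouse events from the primary touch point. Build it against the Ubuntu-packaged Qt, since that is where the multitouch support lives.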

Revision history for this message
bootchk (bootch) wrote :

Chase,

Thanks. How I interpret your last comment is: it is in Ubuntu but not upstream to Qt (whoever that is now, Nokia or other.) As long as I use Ubuntu's version of Qt (whether the source or libraries) it should work.

Also, referring to: https://wiki.ubuntu.com/Multitouch , specifically where it says "2 finger gestures require extra setup". I interpret that to mean: without the setup, the synaptics driver (which is inappropriately named, is really THE driver for most touch HID's, not just those of the Synaptics brand) will hide all one finger and two finger events from further up the stack, i.e. from a Qt application? Possibly that is the problem with my testing.

The whole business of the stack confuses me. My understanding is a driver can translate certain sequences of touch events into single "mouse" events, i.e. recognize multi-finger gestures, in an attempt to be "friendly" to an app. For example, two finger tap in a certain corner translates to something, as you can see in the System Settings panel for a touchpad. For example, the way you must write an app to be "pen-friendly" is to rely on the translated mouse events instead of raw pen events. How does an application control what the driver does, if the application wants raw touch events to do its own gesture recognition? E.g. using Qt custom QGestureRecognizer. Does Qt have any way of knowing that certain "friendly" mouse events were in fact a sequence of touch events translated by a driver? (Not that it needs to, what could Qt or an app do differently if it knew, besides tell the user that the driver might be improperly "setup" for the gestures that Qt or the app was ready to recognize.)
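
(For concreteness, by a custom recognizer I mean something like the rough sketch below, written against the Qt 4 QGestureRecognizer API and working on touch events rather than driver-translated mouse events. The class name and the gesture it pretends to detect are invented:)

#include <QGesture>
#include <QGestureRecognizer>
#include <QTouchEvent>

class TwoFingerTapRecognizer : public QGestureRecognizer
{
public:
    QGesture *create(QObject *target)
    {
        return new QGesture(target);
    }

    Result recognize(QGesture *gesture, QObject *watched, QEvent *event)
    {
        Q_UNUSED(gesture);
        Q_UNUSED(watched);
        switch (event->type()) {
        case QEvent::TouchBegin:
        case QEvent::TouchUpdate:
            return MayBeGesture;
        case QEvent::TouchEnd: {
            QTouchEvent *touch = static_cast<QTouchEvent *>(event);
            // Fire only if exactly two fingers were involved.
            return touch->touchPoints().count() == 2 ? FinishGesture
                                                     : CancelGesture;
        }
        default:
            return Ignore;
        }
    }
};

// Registration gives back a Qt::GestureType which a widget can then pass to
// grabGesture(), alongside setAttribute(Qt::WA_AcceptTouchEvents):
// Qt::GestureType type =
//     QGestureRecognizer::registerRecognizer(new TwoFingerTapRecognizer);

But none of this helps if the driver has already swallowed the two-finger touches before they reach Qt, which is what I am trying to figure out.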

Does Qt QPanGesture require a certain number of fingers, or even require touch at all? For example, would Qt recognize a pan consisting of "mouse" events that were in fact translated from touch events by a touchpad driver?

If there is a better forum for this discussion, please let me know.

Revision history for this message
Chase Douglas (chasedouglas) wrote : Re: [Bug 901630] Re: multitouch + qt doesn't work (eg fingerpaint demo) with wacom serial touchscreen

On 04/24/2012 04:35 PM, bootchk wrote:
> Thanks. How I interpret your last comment is: it is in Ubuntu but not
> upstream to Qt (whoever that is now, Nokia or other.) As long as I use
> Ubuntu's version of Qt (whether the source or libraries) it should work.

Correct.

> Also, referring to: https://wiki.ubuntu.com/Multitouch , specifically
> where it says "2 finger gestures require extra setup". I interpret that
> to mean: without the setup, the synaptics driver (which is
> inappropriately named, is really THE driver for most touch HID's, not
> just those of the Synaptics brand) will hide all one finger and two
> finger events from further up the stack, i.e. from a Qt application?
> Possibly that is the problem with my testing.

It could be. And then three and four finger interactions are grabbed by
Unity. You have to put five or more fingers down on a trackpad to start
sending events that aren't caught by Unity or X synaptics.

> The whole business of the stack confuses me. My understanding is a
> driver can translate certain sequences of touch events into single
> "mouse" events, i.e. recognize multi-finger gestures, in an attempt to
> be "friendly" to an app. For example, two finger tap in a certain
> corner translates to something, as you can see in the System Settings
> panel for a touchpad. For example, the way you must write an app to be
> "pen-friendly" is to rely on the translated mouse events instead of raw
> pen events. How does an application control what the driver does, if
> the application wants raw touch events to do its own gesture
> recognition? E.g. using Qt custom QGestureRecognizer. Does Qt have
> any way of knowing that certain "friendly" mouse events were in fact a
> sequence of touch events translated by a driver? (Not that it needs to,
> what could Qt or an app do differently if it knew, besides tell the user
> that the driver might be improperly "setup" for the gestures that Qt or
> the app was ready to recognize.)

You're getting at the fundamental problem of the traditional touchpad
"gestures" being interpreted in the X server. It worked reasonably well
to this point, but is fundamentally broken now.

We need to rework how touchpad gestures are recognized. They should be
moved into the toolkits. However, that's a large amount of work. We are
looking into our options.

> Does Qt QPanGesture require a certain number of fingers, or even require
> touch at all? For example, would Qt recognize a pan consisting of
> "mouse" events that were in fact translated from touch events by a
> touchpad driver?

The Qt gestures only work with touch events, as far as I know. They will
not work with mouse events.
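
For reference, wiring a gesture up in Qt 4 looks roughly like the sketch below (the class name is invented, and this is a sketch rather than a recommended implementation). The widget grabs the gesture and also opts into touch events; setting Qt::WA_AcceptTouchEvents explicitly should be harmless even if grabGesture already arranges for it:

#include <QDebug>
#include <QGesture>
#include <QGestureEvent>
#include <QPinchGesture>
#include <QWidget>

class PinchAwareWidget : public QWidget
{
public:
    PinchAwareWidget()
    {
        // Gestures are synthesized from touch events, so the widget needs
        // to receive those in addition to grabbing the gesture itself.
        setAttribute(Qt::WA_AcceptTouchEvents);
        grabGesture(Qt::PinchGesture);
    }

protected:
    bool event(QEvent *e)
    {
        if (e->type() == QEvent::Gesture) {
            QGestureEvent *ge = static_cast<QGestureEvent *>(e);
            if (QGesture *g = ge->gesture(Qt::PinchGesture)) {
                QPinchGesture *pinch = static_cast<QPinchGesture *>(g);
                // A real widget would apply pinch->totalScaleFactor() (or the
                // per-update scaleFactor()) to its view transform here.
                qDebug() << "pinch, total scale" << pinch->totalScaleFactor();
            }
            ge->accept();
            return true;
        }
        return QWidget::event(e);
    }
};

If no touch events reach the widget in the first place, as in this bug, no amount of gesture grabbing will help.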

> If there is a better forum for this discussion, please let me know.

Feel free to ask questions and participate in discussions on the
multi-touch-dev mailing list. You can subscribe on the project's page:
http://launchpad.net/~multi-touch-dev

Revision history for this message
bootchk (bootch) wrote :

Chase wrote> "You have to put five or more fingers down on a trackpad to start sending events that aren't caught by Unity or X synaptics."

My understanding now: Unity is a window manager (more or less.) It is also in the so-called stack that processes touch events. It does gesture recognition and hides touch events from an application (possibly converting sequences of touch events, i.e. gestures, to other event types) unless five touches are used.

Is there any way to disable unity gesture recognition (so that all touch events go to the application), similarly to the way you can disable gesture recognition in the synaptics Xorg input driver?

The stack: kernel device driver (e.g. wacom) > Xorg input driver (e.g. synaptics) > window manager (e.g. unity) > GUI toolkit (e.g. Qt) > application

Shouldn't there be a set of protocols for querying and disabling gesture recognition throughout the entire stack? An API in the GUI TK that an app uses to configure touch? I suppose that is what you are architecting now. I understand it is not easy to do, in terms of coordinating all the software pieces and also in terms of providing a consistent user interface across platforms.

Revision history for this message
Chase Douglas (chasedouglas) wrote :

On 04/25/2012 08:03 AM, bootchk wrote:
> Chase wrote> "You have to put five or more fingers down on a trackpad to
> start sending events that aren't caught by Unity or X synaptics."
>
> My understanding now: Unity is a window manager (more or less.) It is
> also in the so-called stack that processes touch events. It does
> gesture recognition and hides touch events from an application (possibly
> converting sequences of touch events, i.e. gestures, to other event
> types) unless five touches are used.
>
> Is there any way to disable unity gesture recognition (so that all touch
> events go to the application), similarly to the way you can disable
> gesture recognition in the synaptics Xorg input driver?

No. The Unity gesture specification doesn't allow for gesture
configuration, and that's what we are implementing. If you want to
influence this decision, I suggest sending a message to the unity-dev
mailing list.

> The stack: kernel device driver (e.g. wacom) > Xorg input driver (e.g.
> synaptics) > window manager (e.g. unity) > GUI toolkit (e.g. Qt) >
> application
>
> Shouldn't there be a set of protocols for querying and disabling gesture
> recognition throughout the entire stack? An API in the GUI TK that an
> app uses to configure touch? I suppose that is what you are architecting
> now. I understand it is not easy to do, in terms of coordinating all the
> software pieces and also in terms of providing a consistent user
> interface across platforms.

It's simply a matter of what gestures Unity subscribes to. It could be
configurable through gnome-control-center, for example. Some Unity
options are already provided there.

Revision history for this message
Erno Kuusela (erno-iki) wrote :

With the N-trig touchscreen (#724831) the Qt fingerpaint demo works fine with 1-5 fingers, so the wacom
touchscreen support should be fixed to handle that as well.

Revision history for this message
bootchk (bootch) wrote :

Erno,

You might be right that there is still a problem somewhere. Unless Unity, as a window-manager X client, is grabbing the Xorg input touch device, then as long as Qt is built to find the device and become another X client of the Xorg input device, there is no reason it shouldn't get touch events regardless of the number of fingers?

But the fault could be in the way I have been testing it. I probably need to read the code, and debug it on my own.

Revision history for this message
Erno Kuusela (erno-iki) wrote :

I finally got around to following through the config hackery described at http://www.murraytwins.com/blog/?p=103
(the trick was to remove the existing wacom .conf; I think I earlier just chmod 000'd it, but Xorg was running as
root so it read it anyway). And it worked, kind of: I got fingerpaint working with 2 fingers, and then Xorg promptly
(about 2 seconds in) crashed. Xorg.log recorded the signal 11, but the backtrace had no symbols.

Changed in xf86-input-wacom (Ubuntu):
status: New → Confirmed
Revision history for this message
Erno Kuusela (erno-iki) wrote :

Marking as confirmed, since Brent Fox already reproduced this a couple of months ago.

Revision history for this message
Erno Kuusela (erno-iki) wrote :

I have now upgraded the laptop to 12.04, and the fingerpaint demo stopped working even though Xorg.0.log shows
it's using evdev.

Revision history for this message
bootchk (bootch) wrote :

Have you thought of using xscope (from http://cgit.freedesktop.org/xorg/app/xscope)?
It would snoop on the wire traffic between the X server (including the Xorg input driver, say synaptics or evdev) and the X client (say Qt and the app).

Possibly the xscope tool is not up to date with the XI2 protocol packets, but even then it might dump them in the log in a raw format without interpreting them.

A binary strategy: if you see touch events between the two, suspect the X client (Qt and whatever touch support Ubuntu added); otherwise suspect the X server and the driver.

You could also snoop between the X server and Compiz/uTouch, which is also an X client. Again, maybe uTouch is improperly grabbing.

Also, check the commits to the synaptics Xorg driver. It seems active.

Revision history for this message
bootchk (bootch) wrote :

I upgraded to 12.04, downloaded the Qt SDK, built the fingerpaint demo, futzed with xorg.conf.d to get it to load the synaptics Xorg input driver instead of the wacom Xorg input driver, and futzed with xinput set-prop etc. to change the driver's settings. No, the fingerpaint demo doesn't work.

I found that "xinput test-xi2 10" ( documented in xinput man page) will display the events from my device. It shows RawTouchUpdate events etc., but apparently no plain TouchEvents ?? when a second finger is moved. The first finger seems to send just plain Motion events?? I futzed some more (with synaptics input driver properties, and also running the demo) and saw some plain TouchBegin etc events, but don't know the conditions needed to reproduce it.

So I will search for the difference between raw and plain touch events, and which flavor Qt needs.
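
My understanding so far: the Raw* events are only delivered to clients that selected them on the root window (that is what test-xi2 prints as RawTouchUpdate), while plain TouchBegin/TouchUpdate/TouchEnd events only go to windows that explicitly selected them and that the touch lands on. For anyone else testing, here is a small sketch of the selection side, assuming XI 2.2 headers and linking against libX11/libXi (the window geometry and device choice are placeholders):

#include <cstdio>
#include <X11/Xlib.h>
#include <X11/extensions/XInput2.h>

int main()
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { std::fprintf(stderr, "cannot open display\n"); return 1; }

    // Touch events require XI 2.2 on both the client and the server.
    int opcode, event, error;
    int major = 2, minor = 2;
    if (!XQueryExtension(dpy, "XInputExtension", &opcode, &event, &error) ||
        XIQueryVersion(dpy, &major, &minor) != Success || minor < 2) {
        std::fprintf(stderr, "XI 2.2 (touch) not available\n");
        return 1;
    }

    Window win = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                     0, 0, 400, 300, 0, 0, 0);
    XMapWindow(dpy, win);

    // Select the ordinary (non-raw) touch events on our own window.
    unsigned char mask[XIMaskLen(XI_LASTEVENT)] = { 0 };
    XISetMask(mask, XI_TouchBegin);
    XISetMask(mask, XI_TouchUpdate);
    XISetMask(mask, XI_TouchEnd);

    XIEventMask evmask;
    evmask.deviceid = XIAllMasterDevices; // or a specific device id, e.g. 10
    evmask.mask_len = sizeof(mask);
    evmask.mask = mask;
    XISelectEvents(dpy, win, &evmask, 1);
    XFlush(dpy);

    for (;;) {
        XEvent ev;
        XNextEvent(dpy, &ev);
        if (ev.xcookie.type == GenericEvent &&
            ev.xcookie.extension == opcode &&
            XGetEventData(dpy, &ev.xcookie)) {
            XIDeviceEvent *de = (XIDeviceEvent *) ev.xcookie.data;
            std::printf("touch evtype %d at %.1f,%.1f\n",
                        de->evtype, de->event_x, de->event_y);
            XFreeEventData(dpy, &ev.xcookie);
        }
    }
}

Compiles with "g++ touch-select.cpp -lX11 -lXi" (the filename is arbitrary). If a window like this gets touch events while fingerpaint does not, the problem is on the Qt side rather than in the driver.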

Revision history for this message
bootchk (bootch) wrote :

OK, plain and raw events are sent when the window belonging to "xinput test-xi2" has the cursor in it, or is in front (or something like that), and the second finger is moved.

Revision history for this message
Chase Douglas (chasedouglas) wrote :

Hi bootchk,

Do you have a touchscreen? If so, you want to be using the evdev driver instead of wacom. If not, then you'll want to use the synaptics driver and open a different bug, since this one is specific to touchscreens.

Revision history for this message
Erno Kuusela (erno-iki) wrote :

bootchk: see comment #7; if you use the Qt SDK you won't get the Ubuntu code (it's not upstream yet). You want
to test with the Ubuntu-provided Qt (the fingerpaint demo is in the qt4-demos package).

Revision history for this message
Cédric Dufour (cdufour-keyword-ubuntu-086000) wrote :

Hello,
Can someone clarify where we stand on this issue with Ubuntu Quantal 12.10?
I've been following the Qt documentation to the letter and got QPinchGesture working in a QWidget... but as soon as grabGesture is called on that widget, all other widgets (tabs, push buttons, line edits, etc.) stop responding to "click" (tap).
Where does the problem lie?
Can it be fixed, or should one forget about having the gesture stuff working alongside evdev (at least as far as 12.10 is concerned)?
Thanks for your answer.
Cheers
