There was code in osgViewer/Viewer.cpp and osgViewer/CompositeViewer.cpp that transformed the Y-coordinates of an event. The code in the composite viewer did, however, miss the touch data of the event. I thought that it should really be the GUIEventAdapter that knows about this, and hence I added
GUIEventAdapter::setMouseYOrientationAndUpdateCoords, which re-computes the coordinates. At first I simply added a boolean to the setMouseYOrientation function:
setMouseYOrientation( MouseYOrientation, bool updateCoords=false );
but then the serializer complained.
This function is called from both the Viewer and the CompositeViewer. We have not tested it from the Viewer, but from visual inspection I cannot see why it would not work.
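For illustration, here is a minimal sketch of what such a re-computation could look like. The member names (_my, _Ymin, _Ymax, _touchData, _touches) are assumptions modelled on the osgGA::GUIEventAdapter interface, not the actual implementation:

    // Hedged sketch: flip the mouse Y into the requested orientation and,
    // crucially, also flip the Y of every attached touch point (the part
    // the old CompositeViewer code missed).
    void GUIEventAdapter::setMouseYOrientationAndUpdateCoords(MouseYOrientation myo)
    {
        if (_mouseYOrientation == myo) return;
        _mouseYOrientation = myo;

        // mirror the mouse y within the window extents
        _my = _Ymax - _my + _Ymin;

        // keep the touch data consistent with the flipped mouse y
        if (_touchData.valid())
        {
            for (TouchData::TouchSet::iterator itr = _touchData->_touches.begin();
                 itr != _touchData->_touches.end();
                 ++itr)
            {
                itr->y = _Ymax - itr->y + _Ymin;
            }
        }
    }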
The other change is in MultiTouchTrackballManipulator::handleMultiTouchDrag, where I have removed the normalisation. The reason is that it normalised into screen coordinates from (0,0) to (1,1). The problem with that is that if you have a pinch event and keep, say, 300 pixels between your fingers, those 300 pixels represent 0.20 of the screen in the horizontal domain but 0.30 of the screen in the vertical domain (e.g. on a 1500x1000 window). A rotation of the pinching fingers will hence result in a zoom, as the normalised distance between them changes.
A consequence of this is that I have changed the pan code to use the same algorithm as the middle-mouse pan.
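To make the aspect-ratio problem concrete, here is a hedged sketch of measuring the pinch distance in raw pixel coordinates; the variable names are illustrative and not taken from handleMultiTouchDrag itself:

    // Distance between the two touch points, measured in pixels, so that a
    // pure rotation of the finger pair leaves the distance unchanged.
    osg::Vec2 pt0(touch0.x, touch0.y), pt1(touch1.x, touch1.y);
    osg::Vec2 prev0(prevTouch0.x, prevTouch0.y), prev1(prevTouch1.x, prevTouch1.y);

    float currentDistance  = (pt1 - pt0).length();
    float previousDistance = (prev1 - prev0).length();

    // ratio > 1: fingers moved apart (zoom in); ratio < 1: zoom out
    float zoomRatio = currentDistance / previousDistance;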
The rest of it is very similar to the previous revision, with some fine-tuning here and there.
"
I added a new tag to p3d called forward_touch_event_to_device and renamed the existing forward_event_to_device to forward_mouse_event_to_device. The new tag transmits touches to the virtual trackpad as touch events. I added the MultitouchTrackball to the p3d app, so zooming and moving a model remotely should now work if you use forward_touch_event_to_device. I kept (and fixed) forward_mouse_event_to_device for backward compatibility, so old presentations work as in previous versions, though of course without the ability to zoom and scale.
forward_touch_event_to_device needs some more testing (e.g. with image-streams and keystone; as far as I know there's no support for touch events there), but for a first version it works nicely.
"
* MultiTouchTrackball: some code cleanup and support for normalized touch-points (see the sketch after this list)
* oscdevice: receiving and sending multi-touch events via the Cursor2D profile from TUIO
* added some documentation
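As a hedged illustration of the consuming side, a handler could walk the touch points of an event and normalize them against the window extents. The accessors below follow the osgGA::GUIEventAdapter touch API as I understand it, so treat the exact names as assumptions:

    #include <osg/Notify>
    #include <osgGA/GUIEventAdapter>

    void dumpNormalizedTouches(const osgGA::GUIEventAdapter& ea)
    {
        if (!ea.isMultiTouchEvent()) return;

        const osgGA::GUIEventAdapter::TouchData* td = ea.getTouchData();
        for (unsigned int i = 0; i < td->getNumTouchPoints(); ++i)
        {
            osgGA::GUIEventAdapter::TouchData::TouchPoint tp = td->get(i);

            // map pixel coordinates into [0,1] using the event's window extents
            float nx = (tp.x - ea.getXmin()) / (ea.getXmax() - ea.getXmin());
            float ny = (tp.y - ea.getYmin()) / (ea.getYmax() - ea.getYmin());

            OSG_NOTICE << "touch " << tp.id << ": " << nx << ", " << ny << std::endl;
        }
    }

Note that the TUIO Cursor2D (2Dcur) profile itself already transmits positions normalized to [0,1], with messages of the form /tuio/2Dcur set <session-id> <x> <y> <vx> <vy> <accel>, so the oscdevice mainly has to map between that range and window coordinates.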
hpux. I have skipped irix this time, as irix is too dead to keep osg building there.
As usual, solaris does not like member templates in STL containers.
Some headers were missing, and there were link problems due to missing libraries.
osgGA. My approach is to bundle all touch-points into one custom data structure which is attached to a GUIEventAdapter.
The current approach simulates a moving mouse for the first touch-point, so basic manipulators do work, sort of.
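A hedged sketch of the producing side, assuming the touch points are attached through an addTouchPoint convenience on GUIEventAdapter (treat the exact signature as an assumption):

    #include <osgGA/GUIEventAdapter>

    // A windowing backend bundling two began-touches into a single event.
    osg::ref_ptr<osgGA::GUIEventAdapter> makeTwoFingerPush(float x0, float y0,
                                                           float x1, float y1)
    {
        osg::ref_ptr<osgGA::GUIEventAdapter> ea = new osgGA::GUIEventAdapter;
        ea->setEventType(osgGA::GUIEventAdapter::PUSH);
        ea->addTouchPoint(0, osgGA::GUIEventAdapter::TOUCH_BEGAN, x0, y0);
        ea->addTouchPoint(1, osgGA::GUIEventAdapter::TOUCH_BEGAN, x1, y1);

        // the first touch point doubles as the simulated mouse position
        ea->setX(x0);
        ea->setY(y0);
        return ea;
    }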
I created a MultiTouchTrackballManipulator class: one touch-point rotates the view, two touch-points pan and zoom the view, as known from the iPhone and similar multi-touch devices. A double-tap (similar to a double-click) resets the manipulator to its home position.
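A minimal usage sketch, assuming a standard osgViewer setup (the model file name is just a placeholder):

    #include <osgDB/ReadFile>
    #include <osgGA/MultiTouchTrackballManipulator>
    #include <osgViewer/Viewer>

    int main(int, char**)
    {
        osgViewer::Viewer viewer;
        viewer.setSceneData(osgDB::readNodeFile("cow.osg"));

        // one finger rotates, two fingers pan/zoom, a double-tap goes home
        viewer.setCameraManipulator(new osgGA::MultiTouchTrackballManipulator);
        return viewer.run();
    }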
The multi-touch trackball implementation is not the best; see it as a first starting point. (There's a demo video at http://vimeo.com/15017377 )