I am working on a Qt application which resembles a hex editor for Mac.
It has a very large amount of data to scroll vertically (up and down), because it shows the entire contents of large files in hex format.
In my application, I'd like to add smooth two-finger scrolling in both directions, up and down, like the two-finger scrolling on a MacBook Air.
It works properly with the mouse wheel, but not with two-finger trackpad scrolling.
If someone has a solution, please help me out. Thanks in advance.
QScroller allows gestures such as click-and-drag to trigger kinetic scrolling.
http://qt-project.org/doc/qt-5/qscroller.html#details
Note this entry in the gesture-type table on that page:
QScroller::TouchGesture (value 0): The gesture recognizer will only trigger on touch events. Specifically it will react on single touch points when using a touch screen and dual touch points when using a touchpad.
So then the example they give would turn into this for you:
QWidget *w = ...;   // the widget you want to scroll, e.g. your hex view
QScroller::grabGesture(w, QScroller::TouchGesture);
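Here is a slightly fuller sketch of the same idea, assuming your hex view derives from QAbstractScrollArea (the function name and the hexView parameter are just illustrative); for scroll areas the gesture is usually grabbed on the viewport rather than on the view itself:

#include <QScroller>
#include <QAbstractScrollArea>

void enableTwoFingerScrolling(QAbstractScrollArea *hexView)
{
    // TouchGesture reacts to single touch points on a touch screen and
    // dual touch points on a touchpad (see the note above).
    QScroller::grabGesture(hexView->viewport(), QScroller::TouchGesture);
}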
There is more about implementing new behaviour for touch screens and touchpads by handling QTouchEvent:
http://qt-project.org/doc/qt-5/qtouchevent.html#details
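If QScroller alone doesn't do what you want, here is a hedged sketch of handling QTouchEvent directly in a custom widget (the HexView class name is illustrative, and the actual scrolling is left out):

#include <QWidget>
#include <QTouchEvent>

class HexView : public QWidget
{
public:
    explicit HexView(QWidget *parent = nullptr) : QWidget(parent)
    {
        // Without this attribute the widget never receives touch events.
        setAttribute(Qt::WA_AcceptTouchEvents);
    }

protected:
    bool event(QEvent *e) override
    {
        if (e->type() == QEvent::TouchUpdate) {
            auto *touch = static_cast<QTouchEvent *>(e);
            if (touch->touchPoints().count() == 2) {
                // Two fingers moving: use the vertical delta of the first point.
                const QTouchEvent::TouchPoint &p = touch->touchPoints().first();
                qreal dy = p.pos().y() - p.lastPos().y();
                Q_UNUSED(dy); // ...scroll the view by dy here...
            }
            return true;
        }
        return QWidget::event(e);
    }
};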
Hope that helps.
Related
I'm working on a Qt class project. We're supposed to develop an application like Microsoft Paint. Now I don't know how to enlarge a selected part of an image. Actually, I don't even know how to "select" an area: you know, just like on the Windows desktop, you press the left mouse button and then move it, and a dashed-line rectangle shows up. I would like to move or zoom in/out on that particular area.
Any help will be appreciated, thanks!
It could be done by using mouse events. Here's an example that might be useful to you: https://doc.qt.io/qt-5/qtwidgets-widgets-scribble-example.html
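As a starting point, here is a minimal sketch (not taken from the Scribble example) of drawing the dashed selection rectangle with QRubberBand in the mouse event handlers; the Canvas class name is illustrative:

#include <QWidget>
#include <QRubberBand>
#include <QMouseEvent>

class Canvas : public QWidget
{
protected:
    void mousePressEvent(QMouseEvent *event) override
    {
        origin = event->pos();
        if (!rubberBand)
            rubberBand = new QRubberBand(QRubberBand::Rectangle, this);
        rubberBand->setGeometry(QRect(origin, QSize()));
        rubberBand->show();
    }

    void mouseMoveEvent(QMouseEvent *event) override
    {
        // Grow the dashed rectangle as the mouse moves.
        rubberBand->setGeometry(QRect(origin, event->pos()).normalized());
    }

    void mouseReleaseEvent(QMouseEvent *event) override
    {
        rubberBand->hide();
        QRect selection = QRect(origin, event->pos()).normalized();
        Q_UNUSED(selection); // ...copy/scale this part of the image here...
    }

private:
    QPoint origin;
    QRubberBand *rubberBand = nullptr;
};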
I would like to make a multi-touch control for my camera.
This camera should:
zoom in/out when pinching
orbit when swiping
pan when swiping with 2 fingers.
Does anybody know of some good examples/tutorials, or can anyone give me some advice?
Thank you so much
The best example I found was the Strategy Game (Tower defense) sample that comes with the Unreal Engine. It demonstrates an independent camera system in C++ that responds to touch gestures.
As a simplified but very similar approach, you may also find my UE4TopDownCamera sample project useful, for a top-down camera with:
spread/pinch or mouse wheel up/down for zoom in/out (implemented as dollying)
swipe with one finger for panning
on/off functionality to lock onto/follow the main character or move the camera freely.
Please note that the gestures are not exactly the ones you described, as my requirements were different.
I'll soon upload a full explanation and a video on GitHub.
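In the meantime, here is a rough sketch (not taken from either sample) of how a two-finger pinch can be detected on a pawn and mapped onto a camera dolly by changing the spring-arm length; the class name and the Fingers, LastPinchDist, CameraBoom and ZoomSpeed members are illustrative and assumed to be declared in the pawn's header:

#include "GameFramework/Pawn.h"
#include "GameFramework/SpringArmComponent.h"
#include "Components/InputComponent.h"

void ATouchCameraPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);
    // IE_Repeat fires while a finger is down and moving.
    PlayerInputComponent->BindTouch(IE_Repeat,   this, &ATouchCameraPawn::OnTouchMoved);
    PlayerInputComponent->BindTouch(IE_Released, this, &ATouchCameraPawn::OnTouchEnded);
}

void ATouchCameraPawn::OnTouchMoved(ETouchIndex::Type FingerIndex, FVector Location)
{
    Fingers.FindOrAdd(FingerIndex) = FVector2D(Location);  // Fingers: TMap<ETouchIndex::Type, FVector2D>
    if (Fingers.Num() >= 2)
    {
        TArray<FVector2D> Points;
        Fingers.GenerateValueArray(Points);
        const float Dist = FVector2D::Distance(Points[0], Points[1]);
        if (LastPinchDist > 0.f)
        {
            // Fingers spreading apart -> shorter arm -> dolly in.
            CameraBoom->TargetArmLength -= (Dist - LastPinchDist) * ZoomSpeed;
        }
        LastPinchDist = Dist;
    }
}

void ATouchCameraPawn::OnTouchEnded(ETouchIndex::Type FingerIndex, FVector Location)
{
    Fingers.Remove(FingerIndex);
    LastPinchDist = -1.f;   // reset so the next pinch starts fresh
}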
I use Visual Studio to implement my Win32 game project, but because I will port it to Android later, in the AppDelegate class I use setFrameSize to make the window screen look like a mobile screen, as below:
glview->setFrameSize(600, 900);
glview->setDesignResolutionSize(320, 480, ResolutionPolicy::FIXED_WIDTH);
But I have a problem when implementing menu items: if I use the setFrameSize function, the menu items do not work when I touch them, because their visual position is different from their real position. If I comment out the setFrameSize call, the menu items work correctly, but then the screen is too big and it is very difficult for me to develop a mobile game. Does anyone know why this problem occurs and how to resolve it? Thanks a lot.
Edit: I use visibleSize to set the position of my menu items. It doesn't seem to do what I want: when I set the position of a sprite to 3/8 of the screen height (from the bottom up), it ends up at 1/2 of the screen height (from the bottom up).
I am developing a tiny app that will run on a BeagleBoard with a 7" touch screen, but I don't have the board yet, so I am developing it as a standard desktop app. Are mouse events equivalent to touch-screen events? I am working with QTableView and I've disabled mouse-drag multiple selection via:
void CTableView::mouseMoveEvent(QMouseEvent* event)
{
    if (this->state() != DragSelectingState)
        QTableView::mouseMoveEvent(event);
}
Will this code also work on a touch screen if the user tries to select multiple cells with their fingers?
Normally this strongly depends on your touch driver. Mostly the touch events will be interpreted as left mouse clicks. Depending on your touch driver, you sometimes have to ensure that it is calibrated the right way (for instance, if the driver needs to know the origin of a touch event to report the right coordinates).
For multi-touch device handling I strongly recommend using QML for your user interface, with a MultiPointTouchArea: http://qt-project.org/doc/qt-5/qml-qtquick-multipointtoucharea.html#details. You can easily connect the QML side to your C++ logic as well.
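If your application is widget-based (like the QTableView above), a minimal sketch of embedding such a QML touch area via QQuickWidget (Qt 5.3+, QT += quickwidgets) could look like this; "TouchArea.qml" is a hypothetical file that would contain the MultiPointTouchArea:

#include <QApplication>
#include <QQuickWidget>
#include <QUrl>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QQuickWidget view;
    view.setResizeMode(QQuickWidget::SizeRootObjectToView);
    view.setSource(QUrl::fromLocalFile("TouchArea.qml"));  // your QML with the MultiPointTouchArea
    view.show();

    return app.exec();
}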
I have an EeePC 900 running Ubuntu Intrepid Ibex. The touchpad has some simple multi-touch gestures built in - scrolling by dragging with two fingers instead of one for example.
How would I detect multi-touch events in an OpenGL/C application?
Is the touchpad on the EeePC 900 capable of handling rotational and scaling gestures?
The MPX example returns with "Only found one master pointer.", and the suggested xinput --create-master "ImPS/2 Logitech Wheel Mouse" isn't recognised by xinput. So is the multi-touch scrolling behaviour built in at a lower level?
(On my Eee 1000)
xev reveals that the two-finger scroll gesture is being turned into clicks of buttons 4 and 5. The three-finger tap is just turned into button 3. I don't think the pad supports any other operations. So it looks like the pad hardware is just generating clicks as though it is a wheel mouse.
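Given that, a minimal sketch of reacting to those button 4/5 presses in a plain Xlib event loop (OpenGL setup omitted, compile with -lX11) would be:

#include <X11/Xlib.h>
#include <cstdio>

int main()
{
    Display *dpy = XOpenDisplay(nullptr);
    Window win = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy),
                                     0, 0, 400, 300, 0, 0, 0);
    XSelectInput(dpy, win, ButtonPressMask);
    XMapWindow(dpy, win);

    for (;;) {
        XEvent ev;
        XNextEvent(dpy, &ev);
        if (ev.type == ButtonPress) {
            if (ev.xbutton.button == Button4)
                std::printf("scroll up (two-finger drag up)\n");
            else if (ev.xbutton.button == Button5)
                std::printf("scroll down (two-finger drag down)\n");
        }
    }
}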
I don't own an EeePC; however, you may want to check out the MPX API for accessing multiple pointers in X. If you're looking for general multi-touch frameworks, there are a few on Google Code.