I am developing a small app that will run on a BeagleBoard with a 7" touch screen, but I don't have the hardware yet, so for now I am developing it as a standard desktop app. Are mouse events equivalent to touch screen events? I am working with a QTableView and have disabled mouse-drag multiple selection via:
void CTableView::mouseMoveEvent(QMouseEvent* event)
{
    // Suppress drag selection: swallow mouse moves while the view is in
    // its drag-selecting state, otherwise defer to the base class.
    if (this->state() != DragSelectingState)
        QTableView::mouseMoveEvent(event);
}
Will this code also work on a touch screen if the user tries to select multiple cells with a finger?
Normally this depends strongly on your touch driver. Usually touch events are interpreted as left mouse clicks. Depending on the driver, you may also have to ensure it is calibrated correctly (for instance, if the driver needs to know the origin of a touch event to report the right coordinates).
For multi-touch device handling I strongly recommend using QML for your user interface, with a MultiPointTouchArea: http://qt-project.org/doc/qt-5/qml-qtquick-multipointtoucharea.html#details . You can easily connect the QML side to your C++ logic as well; a minimal sketch of that connection follows.
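As a minimal sketch of that connection (assuming Qt 5, a hypothetical Backend class, and a main.qml containing the MultiPointTouchArea), you can expose a C++ object to QML as a context property:

#include <QGuiApplication>
#include <QQmlApplicationEngine>
#include <QQmlContext>
#include <QObject>
#include <QDebug>

class Backend : public QObject
{
    Q_OBJECT
public slots:
    // Hypothetical slot for QML to call, e.g. from a MultiPointTouchArea
    // handler: backend.touched(point.x, point.y)
    void touched(qreal x, qreal y) { qDebug() << "touched at" << x << y; }
};

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);
    QQmlApplicationEngine engine;

    Backend backend;
    // Make the C++ object visible to QML under the name "backend".
    engine.rootContext()->setContextProperty("backend", &backend);

    engine.load(QUrl(QStringLiteral("qrc:/main.qml")));
    return app.exec();
}

#include "main.moc"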
I am working on a Qt application which resembles a hex editor for Mac.
It has a very large amount of data to scroll vertically (upward and downward) because it shows all of a large file's data in hex format.
In my application, I'd like to add smooth two-finger scrolling in both directions, up and down, like the two-finger scrolling on a MacBook Air.
It works properly with the mouse wheel but not with trackpad two-finger scrolling.
If someone has a solution, please help me out. Thanks in advance.
QScroller allows gestures such as click-and-drag to perform kinetic scrolling.
http://qt-project.org/doc/qt-5/qscroller.html#details
Note this entry on that page, under the gesture types:
QScroller::TouchGesture — The gesture recognizer will only trigger on touch events. Specifically it will react on single touch points when using a touch screen and dual touch points when using a touchpad.
So then the example they give would turn into this for you:
QWidget *w = ...;
QScroller::grabGesture(w, QScroller::TouchGesture);
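One caveat, assuming your hex view derives from QAbstractScrollArea (as QTableView and most scrolling widgets do): Qt's item-view examples grab the gesture on the viewport rather than on the view itself, so for your case it would look more like this (hexView is a hypothetical pointer to your widget):

QAbstractScrollArea *hexView = ...;
QScroller::grabGesture(hexView->viewport(), QScroller::TouchGesture);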
There is more on implementing custom behaviour for touch screens and touch pads by handling QTouchEvent:
http://qt-project.org/doc/qt-5/qtouchevent.html#details
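A rough sketch of that route (Qt 5; the class name TouchView and the helper handleTouchPoint() are hypothetical): a widget first opts in to receiving touch events, then handles them in event():

#include <QWidget>
#include <QTouchEvent>

class TouchView : public QWidget
{
public:
    explicit TouchView(QWidget *parent = 0) : QWidget(parent)
    {
        // Opt in: without this attribute, touches arrive as synthesized
        // mouse events instead of QTouchEvents.
        setAttribute(Qt::WA_AcceptTouchEvents);
    }

protected:
    bool event(QEvent *e)
    {
        switch (e->type()) {
        case QEvent::TouchBegin:
        case QEvent::TouchUpdate:
        case QEvent::TouchEnd: {
            QTouchEvent *touch = static_cast<QTouchEvent *>(e);
            // Each active finger shows up as one touch point.
            foreach (const QTouchEvent::TouchPoint &p, touch->touchPoints())
                handleTouchPoint(p.pos());  // hypothetical helper
            return true;  // accept so Qt keeps sending updates
        }
        default:
            return QWidget::event(e);
        }
    }

private:
    void handleTouchPoint(const QPointF &pos) { /* application-specific */ }
};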
Hope that helps.
I am working on a desktop project using Visual C++/MFC. There are lots of buttons. The problem is that it must work on a touch screen monitor where no mouse or keyboard is available.
So, will ON_BN_CLICKED work for touch events on a touch screen monitor, or do I have to handle them some other way?
By "Touch" event if you mean screen tap then Yes, they CAN be treated as same.
Windows 7 provides built in support for applications that do not provide any explicit support for touch and ink support to receive input via the Onscreen Keyboard and Writing Pad.
Windows will primarily use touch in much the same mode as a mouse, with screen taps equating to mouse clicks. So, 'ON_BN_CLICKED' will work for screen taps.
That said, you can provide explicit touch support in one of two ways:
Gestures: Windows-provided mapping of distinctive touch sequences into gestures like zoom and pan. MFC further translates these gestures into a simplified set of CWnd virtual methods that can be overridden as required; a sketch of one such override follows the list.
OnGestureZoom(CPoint ptCenter, long lDelta)
OnGesturePan(CPoint ptFrom, CPoint ptTo)
OnGestureRotate(CPoint ptCenter, double dblAngle)
OnGestureTwoFingerTap(CPoint ptCenter)
OnGesturePressAndTap(CPoint ptPress, long lDelta)
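A minimal sketch of overriding one of these (the class CMyView and the member m_zoomFactor are hypothetical):

BOOL CMyView::OnGestureZoom(CPoint ptCenter, long lDelta)
{
    // lDelta reflects the change in distance between the two touch points;
    // scale the view around ptCenter accordingly.
    m_zoomFactor *= 1.0 + static_cast<double>(lDelta) / 100.0;  // hypothetical scaling
    Invalidate();
    return TRUE;  // TRUE means the gesture was handled
}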
Touch Messages: Registering to receive the low-level touch messages which may be coming from multiple touch points simultaneously, and responding to these touch events in the message handler.
virtual BOOL OnTouchInput(CPoint pt, int nInputNumber, int nInputsCount, PTOUCHINPUT pInput);
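And a minimal sketch of that route (again with a hypothetical CWnd-derived CMyView; OnCreate must be wired up with ON_WM_CREATE() in the message map):

int CMyView::OnCreate(LPCREATESTRUCT lpCreateStruct)
{
    if (CWnd::OnCreate(lpCreateStruct) == -1)
        return -1;
    // Ask Windows to deliver raw touch messages to this window instead
    // of synthesizing gestures from them.
    RegisterTouchWindow();
    return 0;
}

BOOL CMyView::OnTouchInput(CPoint pt, int nInputNumber, int nInputsCount, PTOUCHINPUT pInput)
{
    // pt is the touch location; pInput carries the raw TOUCHINPUT data
    // (flags, contact area, touch-point id, and so on).
    if (pInput->dwFlags & TOUCHEVENTF_DOWN)
        TRACE(_T("touch %d of %d down at (%d, %d)\n"),
              nInputNumber + 1, nInputsCount, pt.x, pt.y);
    return TRUE;  // handled
}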
Source: Check this article for details.
I am trying to send mouse events to a (I think SDL/OpenGL) game that doesn't support a gamepad. I know I could just use one of the many Gamepad to Keyboard/Mouse applications available, but I thought it would be fun to write my own. The following code works fine except when the game is running:
// point is a CGPoint that is set earlier on...
CGEventRef event = CGEventCreateMouseEvent(NULL, kCGEventMouseMoved, point, 0);
CGEventSetType(event, kCGEventMouseMoved); // apparently there is an Apple bug that requires this...
CGEventPost(kCGHIDEventTap, event);
CFRelease(event);
With this, I can move the cursor on my desktop, and even in the game's menus; however, in the actual game the only time it registers movement is when I move my physical mouse. Sending keyboard events works fine in the game, so I don't know what the problem is.
I found out what the problem was; I had to use:
CGEventSetIntegerValueField(event, kCGMouseEventDeltaX, dX);
CGEventSetIntegerValueField(event, kCGMouseEventDeltaY, dY);
to set the relative coordinates of the mouse move. I think this is because the game was warping the mouse to the center of the game window, and my application wasn't getting the mouse move events from that.
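Putting the fix together, the posting code looks roughly like this (dX and dY being the per-frame movement values computed from the gamepad):

CGEventRef move = CGEventCreateMouseEvent(NULL, kCGEventMouseMoved, point, 0);
CGEventSetType(move, kCGEventMouseMoved);
// A game that warps the cursor back to its window centre reads these
// relative delta fields rather than the absolute position.
CGEventSetIntegerValueField(move, kCGMouseEventDeltaX, dX);
CGEventSetIntegerValueField(move, kCGMouseEventDeltaY, dY);
CGEventPost(kCGHIDEventTap, move);
CFRelease(move);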
I'm not sure quite how to phrase the question concisely, so if there is a similar question, please point me in the right direction and close this one.
I am currently building a CAD app, the user interacts within the 3D viewports primarily through the mouse and the three keyboard modifiers (alt, shift, ctrl). Shift and control modify the currently selected tool options, and alt operates the camera - much like any other 3D CAD app.
However, I'm currently developing on a GNOME desktop, and its window manager (AFAIK) catches any Alt+right-button mouse drag events and interprets them as a window-drag command, even when the user is not holding the title bar and regardless of the currently highlighted widget.
This is a disaster for me because camera keyboard controls are quite standardised in my target industry. So does anyone know of a way to override this behaviour, preferably from within Qt, and preferably scoped to this one scenario in one particular widget class?
Thank you,
Cam
If you use the Qt::X11BypassWindowManagerHint flag on the window, then the window manager can't steal your keypresses. However, this means you lose the native window frame (including decoration, moving, and resizing), so it is likely you don't want to do this.
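For reference, setting the flag is a one-liner (a sketch assuming a top-level widget named window; note that changing window flags hides the widget, so it has to be shown again):

window->setWindowFlags(window->windowFlags() | Qt::X11BypassWindowManagerHint);
window->show();  // setWindowFlags() hides the window, so re-show it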
Another way: if your users are only on one or two varieties of Linux, add something to the installer which asks the user whether they want to adjust the GNOME (or whatever) key settings, and if so, change them via gconftool-2 (or its equivalent on that desktop).
I'm currently writing a C++ console application that grabs the mouse position at regular intervals and sends it to another visual application, where it is used to drive some 3D graphics in real time. The visual app is closed source and cannot be altered beyond its limited plug-in functionality.
Currently I'm using the GetCursorPos() function, which is easy and fast enough, but I'm running into the issue that all of the data is clipped to the current screen resolution of 1920x1600: all x values are between 0 and 1920 and all y values are between 0 and 1600, no matter how far the mouse is physically moved.
I need to get the mouse position before it's clipped at the edge of the screen, or possibly the deltas which I could use to calculate the current position.
I've seen some references to the Windows mouse-move messages, but I would really rather not implement a window to make this work, and especially not have it be the active window in order to receive those events.
I'm working in a Windows environment, and a language change is not feasible.
I might be wrong, but in Win32 land you don't get mouse move messages when the mouse is at the edge of the screen because, well, the mouse isn't moving. The usual way to get an infinite mouse area is to do the following:
1. Hide the mouse, get exclusive access, and record its position
2. Centre the mouse in the window
3. When the mouse moves, get the delta from the centre of the screen to the current position
4. Centre the mouse in the window again
5. The next mouse move should have a delta of (0,0), so ignore it
6. Go to step 3 until the end of the mouse-move operation
7. Reset the position, show the mouse, and release exclusive access
If you didn't hide the mouse, then you'd see the mouse moving a small distance and then snapping back to the centre position, which looks nasty.
This method does require a message pump for the mouse move messages, so the console application idea probably won't work as-is. Can you create a full-screen invisible window for grabbing the mouse?
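A rough Win32 sketch of the steps above (RecenterCursor and OnMouseDelta are hypothetical helpers; real code would also call ShowCursor(FALSE) and SetCapture() when tracking starts and undo both when it ends):

static POINT g_center;  // screen-space centre, set when tracking starts

static void RecenterCursor(HWND hwnd)
{
    RECT rc;
    GetClientRect(hwnd, &rc);
    g_center.x = (rc.left + rc.right) / 2;
    g_center.y = (rc.top + rc.bottom) / 2;
    ClientToScreen(hwnd, &g_center);  // client-area centre -> screen coords
    SetCursorPos(g_center.x, g_center.y);
}

// Inside the window procedure:
case WM_MOUSEMOVE:
{
    POINT pos;
    GetCursorPos(&pos);               // current screen position
    int dx = pos.x - g_center.x;
    int dy = pos.y - g_center.y;
    if (dx != 0 || dy != 0)           // skip the echo of our own SetCursorPos
    {
        OnMouseDelta(dx, dy);         // hypothetical: hand the delta to the app
        RecenterCursor(hwnd);         // snap back so the next move is relative
    }
    return 0;
}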
Just get the position, move it back to the center, and compute the delta yourself. This is how FPS games do it.
I don't have any direct experience with raw input, which is probably what you need to tap into. According to MSDN, you have to register the device, then set up your window procedure to accept WM_INPUT messages, and then do your calculations based on the raw data.
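A hedged sketch of what that might look like, based on the MSDN description (registration once at startup, then a WM_INPUT handler in the window procedure):

#include <windows.h>

// Register for raw mouse input. RIDEV_INPUTSINK delivers WM_INPUT even
// when the window is in the background, but it still requires a window.
RAWINPUTDEVICE rid;
rid.usUsagePage = 0x01;            // HID generic desktop controls
rid.usUsage     = 0x02;            // mouse
rid.dwFlags     = RIDEV_INPUTSINK;
rid.hwndTarget  = hwnd;            // your (possibly hidden) message window
RegisterRawInputDevices(&rid, 1, sizeof(rid));

// In the window procedure:
case WM_INPUT:
{
    RAWINPUT raw;
    UINT size = sizeof(raw);
    GetRawInputData((HRAWINPUT)lParam, RID_INPUT, &raw, &size,
                    sizeof(RAWINPUTHEADER));
    if (raw.header.dwType == RIM_TYPEMOUSE)
    {
        // lLastX/lLastY are the unclipped relative deltas (for a mouse
        // reporting relative motion, which is the usual case).
        LONG dx = raw.data.mouse.lLastX;
        LONG dy = raw.data.mouse.lLastY;
        // ... accumulate dx/dy here ...
    }
    break;  // let DefWindowProc perform the required cleanup
}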
Here's another relevant link.