I have a problem with the movement of a Box2D sprite body. What I want is: while I hold down the left button, my sprite should keep moving left, and when I release the button, the sprite should stop.
I know the movement logic, but I don't know how to create the left button so that the movement is continuous.
When you press the button, set a Boolean variable to true, and then in your update/tick method just do this:
if (boolVariable) {
    sprite.position = ccp(sprite.position.x - 0.5, sprite.position.y);
}
This will keep moving your sprite to the left for as long as the flag stays true, i.e. until the button is released and you set it back to false.
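For example, a minimal sketch of that pattern in plain C++ (the Vec2/Sprite types and the callback names are stand-ins, not any particular framework's API):

struct Vec2 { float x = 0, y = 0; };
struct Sprite { Vec2 position; };

Sprite sprite;
bool moveLeft = false;                // the Boolean variable from the answer

void onLeftButtonPressed()  { moveLeft = true;  }   // wire to the button's "down" callback
void onLeftButtonReleased() { moveLeft = false; }   // wire to the button's "up" callback

void update(float /*dt*/) {           // called every frame/tick
    if (moveLeft)
        sprite.position.x -= 0.5f;    // keeps moving left while the button is held
}

With a Box2D body you would typically set the body's velocity in the tick (e.g. with b2Body::SetLinearVelocity) rather than shifting the sprite directly, so the physics simulation stays consistent.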
I'm working on a Qt5 application to attempt to make use of raw input data from a touchscreen because touch is not fully supported by my Linux kernel (2.6.32). I wrote a parser for the raw data coming from /dev/input/eventX because even though X doesn't support touch, the events are still there.
I'm able to read the events just fine, and wrote a wrapper class called "TouchPoint" which contains the ID and (x, y) coordinates of each touch point, as well as a boolean indicating whether or not the touch point is currently "active" (i.e. the user currently has that finger on the screen). It's worth noting that I also output the number of touch points currently on the screen, and that value is accurate.
My issue is that I can't seem to figure out how to accurately simulate a mouse click with it. With multi-touch events in Linux, each touch point is assigned a "tracking ID" when the user presses a finger to the screen, and when that finger is lifted, an event setting that slot's tracking ID to -1 is generated. I use this change of tracking ID (from -1 to a valid ID, and back to -1) to decide whether a touch point is "down" or not:
void TouchPoint::setID(int nid) {
    // -1 -> valid ID: the finger just went down.
    if ((id == -1) && (nid >= 0)) down = true;
    // any ID -> -1: the finger was lifted.
    else if (nid == -1) down = false;
    id = nid;
}
So to try and simulate a mouse click, I do this when a tracking ID event is read:
else if (event.code == ABS_MT_TRACKING_ID) {
    bool before = touchPoints[cSlot].isDown(); // cSlot is the slot the current events refer to
    touchPoints[cSlot].setID(event.value);
    bool after = touchPoints[cSlot].isDown();
    if (!before && after) touch(touchPoints[cSlot]); // the point just went down
}
And then the touch method does this:
void MainWindow::touch(TouchPoint tp) {
    if (ui->touchMe->rect().contains(QPoint(tp.getX(), tp.getY()))) {
        on_touchMe_clicked();
    }
}
The button does not respond when I touch it directly. However, if I wildly flail my fingers around the screen, the message box that should appear when the button is pressed sometimes shows up, and when it does, my fingers are always somewhere else on the screen.
Can anyone think of something I might be doing wrong here? Is my method of checking whether the touch point is within the button's bounds wrong? Is my logic for tracking the "down" status of the touch points wrong? Or is it something else entirely?
UPDATE: I just noticed two things by outputting the screen coordinates of both the touch location and the button location.
The digitizer's resolution is larger than the screen resolution, so I needed to scale the X and Y coordinates coming from the raw events (a sketch of that scaling is below).
My coordinates are offset because they are screen coordinates.
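For reference, a rough sketch of what that scaling can look like, assuming the digitizer's axis range is read with the EVIOCGABS ioctl (the names fd and screenW are illustrative):

#include <linux/input.h>
#include <sys/ioctl.h>

// Map a raw digitizer coordinate onto [0, screenSize) using the axis range
// reported by the device.
static int toScreen(int raw, const input_absinfo &axis, int screenSize) {
    return (raw - axis.minimum) * screenSize / (axis.maximum - axis.minimum + 1);
}

// Query the range once, e.g. when opening /dev/input/eventX:
// input_absinfo xInfo{};
// ioctl(fd, EVIOCGABS(ABS_MT_POSITION_X), &xInfo);
// int sx = toScreen(event.value, xInfo, screenW);  // for an ABS_MT_POSITION_X event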
UPDATE 2: I've now dealt with the two issues mentioned in my last update, but I still can't figure out how to accurately simulate a touch/mouse event so I can interact with the interface. I also modified my TouchPoint class to have a boolean flag indicating whether the touch point was just updated; it is set to true when the tracking ID is updated and reset when an EV_SYN event is received. This way I can collect the X and Y position events before creating the event for the application. I tried the following:
Using the QApplication class to post a mouse event with the position of the touch point to the QApplication::desktop()->screen() widget.
Using the QTest class to raise a touchEvent on the QApplication::desktop()->screen() widget, using the press and release methods for the given slot and position of the touch point, and then using the bool event(QEvent*) method to try to catch the event. I also enabled the WA_AcceptTouchEvents attribute on the main window.
Neither of these works. For some reason, when I have the bool event(QEvent*) method in place, the signal emitted from the thread reading /dev/input/eventX no longer triggers the slot in the main window class, and I can't find any other way to simulate the events. Does anyone have any ideas?
You wrote that you sometimes get your message box "clicked" when your fingers are in the "wrong position", so it sounds like you are mixing global screen coordinates and widget-local coordinates. In your MainWindow::touch you check whether a point in global screen coordinates, e.g. (530, 815), is inside the widget's geometry in its local coordinates: QWidget::rect() returns the internal geometry of the button (widget), i.e. a rectangle of the widget's width and height, e.g. (0, 0, 60, 100).
You have to move that rect to the right position in global screen coordinates. QWidget::pos() returns the widget's position relative to its parent, and QWidget::mapToGlobal translates a widget-local position to global screen coordinates. Then your rect becomes something like (500, 850, 60, 100), and you should get a hit and have your slot called.
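For example, a minimal sketch of that check inside MainWindow::touch, reusing the names from the question (and assuming the touch coordinates have already been scaled to screen pixels):

QRect globalRect(ui->touchMe->mapToGlobal(QPoint(0, 0)), ui->touchMe->size());
if (globalRect.contains(QPoint(tp.getX(), tp.getY()))) {
    on_touchMe_clicked();
}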
However, a better approach is to use QApplication::widgetAt to get the widget at a specific screen position and generate a mouse click for it.
You can generate a mouse click for a widget by posting a mouse press followed by a mouse release to it, something like this:
QPoint screenPos(tp.getX(), tp.getY());
QWidget *targetWidget = QApplication::widgetAt(screenPos);
if (!targetWidget) return;  // nothing under the touch point
QPoint localPos = targetWidget->mapFromGlobal(screenPos);
// QApplication::postEvent takes ownership of the events and deletes them after delivery.
QMouseEvent *eventPress = new QMouseEvent(QEvent::MouseButtonPress, localPos, screenPos,
                                          Qt::LeftButton, Qt::LeftButton, Qt::NoModifier);
QApplication::postEvent(targetWidget, eventPress);
// For the release, the "buttons" state no longer includes the left button.
QMouseEvent *eventRelease = new QMouseEvent(QEvent::MouseButtonRelease, localPos, screenPos,
                                            Qt::LeftButton, Qt::NoButton, Qt::NoModifier);
QApplication::postEvent(targetWidget, eventRelease);
I'm programming my first 2D game in Qt.
I have a QWidget where I draw my game (isometric view). When the mouse reaches the border of the widget, the map view scrolls (like in every strategy game...).
And here is my problem: I'm tracking the mouse position with mouseMoveEvent, but it fires only when the mouse actually moves (only when the position changes). So the map scrolls only while I keep moving the mouse at the border; if the mouse stands still, the map stops scrolling (mouseMoveEvent is not triggered). I have no idea how to solve this, and it's annoying when you try to play.
This is my first post here, and I hope I explained my problem clearly :)
Edit (to clarify a little more):
Imagine this: you want to scroll the map, so you move the mouse to the edge of the screen (the QWidget). But the moment you stop moving the mouse, the map stops scrolling too, even though the mouse is still at the edge of the screen. What I want is for the map to keep scrolling while the mouse rests at the edge.
You can create a QPropertyAnimation for the coordinates and start/stop it when the mouse moves to/from the widget's border.
Or you can remember the current state ("changing x by -1 every 100 ms, changing y by 0") and have a QTimer call a slot that does the actual moving; a sketch of this approach follows.
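A minimal sketch of the QTimer approach (MapWidget, the edge margin, and the mapOffset member are illustrative names, not from the question):

#include <QWidget>
#include <QTimer>
#include <QMouseEvent>
#include <QPoint>

class MapWidget : public QWidget {
public:
    explicit MapWidget(QWidget *parent = nullptr) : QWidget(parent) {
        setMouseTracking(true);   // receive move events even when no button is held
        connect(&scrollTimer, &QTimer::timeout, this, &MapWidget::scrollStep);
        scrollTimer.start(100);   // tick "every 100 ms", as in the answer
    }
protected:
    void mouseMoveEvent(QMouseEvent *event) override {
        const int margin = 20;    // width of the edge zone, illustrative
        dx = event->pos().x() < margin ? -1 : (event->pos().x() > width() - margin ? 1 : 0);
        dy = event->pos().y() < margin ? -1 : (event->pos().y() > height() - margin ? 1 : 0);
    }
    void leaveEvent(QEvent *) override { dx = dy = 0; }  // stop when the cursor leaves the widget
private:
    void scrollStep() {
        if (dx || dy) {
            mapOffset += QPoint(dx, dy);  // the remembered state drives the actual scrolling
            update();                     // repaint with the new map offset
        }
    }
    QTimer scrollTimer;
    QPoint mapOffset;
    int dx = 0;
    int dy = 0;
};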
I'm currently developing an iOS game with cocos2d. There is a CCMenuItem on the screen. As the game starts, the player moves forward, and to keep the player at the centre of the screen the CCCamera coordinates change according to the player's position. The problem is that after the camera coordinates have changed, clicking on the menu item no longer responds: if the camera has been moved 10 px to the right, I have to click 10 px to the right of the menu item in order to "click" it.
Does anyone know how to fix this? :(
This is a known side effect of using CCCamera. There's rarely any need to use the camera, though, because you can achieve the same scrolling effect by simply moving the layer in the opposite direction.
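A minimal sketch of the idea in plain C++ (stand-in types, not the cocos2d API; only the arithmetic matters):

// Each tick, instead of moving the camera, position the game layer so the
// player ends up at the centre of the screen. Menu items placed on a
// separate, unmoved layer keep responding at their visible positions.
struct Point { float x, y; };

Point layerPositionFor(const Point &playerWorldPos, const Point &screenCenter) {
    // Shifting the layer by (center - player) is equivalent to moving the
    // camera by (player - center), i.e. in the opposite direction.
    return { screenCenter.x - playerWorldPos.x,
             screenCenter.y - playerWorldPos.y };
}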
I'm learning C++ and SFML right now, trying to create a chess program in which I can drag and drop the pieces around the board. I drag the pieces by checking if the left mouse button is down and if the mouse is over a piece. If these are both true, I change the position of the piece to that of the mouse.
The only problem is that when I drag the pieces really quickly, the mouse button is still down but the cursor is no longer over the piece.
I want to fix this using something like:
sf::Sprite pieceSelected;
sf::Sprite Pawn;
bool selected;
...
if (LeftButtonDown && isMouseOver(Pawn, Input)) {
    pieceSelected = &Pawn;
    selected = true;
}
if (LeftButtonDown && selected)
    pieceSelected.SetPosition(MouseX - (XSizeSprite / 2), MouseY - (YSizeSprite / 2));
else
    selected = false;
App.Draw(Pawn);
I want 'pieceSelected' to be referencing 'Pawn' so that when I'm moving 'pieceSelected' I'm actually moving 'Pawn' at the same time.
EDIT
I fixed this by changing
sf::Sprite pieceSelected;
to
sf::Sprite * pieceSelected;
and
pieceSelected.SetPosition
to
pieceSelected->SetPosition
Right, from the comments I spotted the problem: your drag-and-drop code repeatedly picks up and drops the pawn. That's indeed not the correct solution. Instead, you should only drop the piece on a LeftMouseUp.
What you want is a DragOperation class. You create it when you detect the beginning of a drag operation (mouse down over a pawn). The DragOperation class has an sf::Sprite &pieceSelected member, set of course in its constructor. You should also store the mouse and pawn coordinates where the drag operation started.
While dragging, the responsibility of drawing the selected piece should be moved to the DragOperation class. This allows you to smoothly drag the pawn, in pixel increments, instead of drawing it only in the middle of a square.
When the mouse button is released, check the result and then delete your DragOperation object.
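A rough sketch of what such a DragOperation could look like (using the SFML 1.x style method names from the question; the constructor arguments and method names are just one possible interface):

#include <SFML/Graphics.hpp>

// Owns a single drag gesture: remembers which piece is being dragged and
// where both the piece and the mouse started.
class DragOperation {
public:
    DragOperation(sf::Sprite &piece, float mouseX, float mouseY)
        : pieceSelected(piece),
          startPiecePos(piece.GetPosition()),
          startMouseX(mouseX),
          startMouseY(mouseY) {}

    // Call from the mouse-move handling while the button is held:
    // move the piece by the same amount the mouse has moved, in pixel increments.
    void update(float mouseX, float mouseY) {
        pieceSelected.SetPosition(startPiecePos.x + (mouseX - startMouseX),
                                  startPiecePos.y + (mouseY - startMouseY));
    }

    // Call on LeftMouseUp: keep the new square or restore the start position.
    void drop(bool validMove) {
        if (!validMove)
            pieceSelected.SetPosition(startPiecePos.x, startPiecePos.y);
    }

    // While a drag is active, the DragOperation draws the selected piece.
    void draw(sf::RenderWindow &app) { app.Draw(pieceSelected); }

private:
    sf::Sprite &pieceSelected;   // a reference, so moving it moves the real piece
    sf::Vector2f startPiecePos;
    float startMouseX, startMouseY;
};

You would create it with new when the mouse goes down over a piece, call update() from the mouse-move handling, and delete it after drop() on LeftMouseUp, as described above.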
I want to show a little image at my mouse position.
So I did this:
void AreaScene::mouseMoveEvent(QGraphicsSceneMouseEvent *event) {
    MapData::pEnd.setX(event->scenePos().x());
    MapData::pEnd.setY(event->scenePos().y());
    this->update(0, 0, this->width(), this->height());
}
pEnd is my point.
In drawForeground I did this:
void AreaScene::drawForeground(QPainter *painter, const QRectF &rect) {
    qDebug() << "called";
    if (MapData::tileIndex != -1 && MapData::pEnd.x() != -1) {
        painter->drawPixmap(MapData::pEnd.x(), MapData::pEnd.y(), *MapData::tileImage,
                            ((int)(MapData::tileIndex % (MapData::tileImage->width() / MapData::tileSize.x()))) * MapData::tileSize.y(),
                            ((int)(MapData::tileIndex / (MapData::tileImage->width() / MapData::tileSize.x()))) * MapData::tileSize.x(),
                            MapData::tileSize.x(), MapData::tileSize.y());
    }
}
Note:
The tile index is the position of the sub-rectangle inside the tileImage (a QPixmap).
So I have the point, the image, and the sub-rectangle inside it.
It works while I keep the right or left mouse button pressed (it updates), but I want it to update when I simply move the mouse; as it is, drawForeground is not even called then.
Is there a way to force this update so I can show the little tile on the screen?
The other option (I think) is to change the mouse cursor to the tile image, but I did a little research and didn't find a way to do that.
Thanks ppl
Call setMouseTracking(true); on the QGraphicsView that is displaying the scene. That tells the view to generate mouse move events whenever the mouse moves over it; otherwise the view only generates mouse move events while you click and drag with a mouse button held down.
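For example, a minimal sketch of where that call could go, using the names from the question (the view creation details are assumptions):

QGraphicsView *view = new QGraphicsView;
view->setScene(scene);          // scene is the AreaScene instance
view->setMouseTracking(true);   // generate move events even when no button is pressed
// If move events still don't reach the scene, enabling tracking on the viewport
// widget as well is a common variant: view->viewport()->setMouseTracking(true);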