Qt - How do I get a QGraphicsItem's position relative to the scene - C++

I have this constructor for a class that derives from QGraphicsRectItem:
Tower::Tower(int x, int y, QGraphicsScene *scene) : QGraphicsRectItem(x, y, width, height)
{
    QBrush brush; // color it
    brush.setStyle(Qt::SolidPattern);
    brush.setColor(START_COLOR);
    this->setBrush(brush);
    this->setAcceptHoverEvents(true);
    scene->addItem(this); // add to scene
}
And I have this code for a fire function which should create a bullet at the center of the Rectangle:
void Tower::fire()
{
    Bullet *bullet = new Bullet(scenePos().x() + width / 2, scenePos().y() + height / 2, scene());
}
This is the code for Bullet:
Bullet::Bullet(int x, int y, QGraphicsScene *scene) : QGraphicsRectItem(x, y, width, height)
{
    QBrush brush; // color it
    brush.setStyle(Qt::SolidPattern);
    brush.setColor(color);
    this->setBrush(brush);
    scene->addItem(this);
}
When the function runs it creates a bullet at the beginning of the scene.
What can I do to fix this?
I checked other questions but their answers didn't work for me.

For a QGraphicsRectItem, pos() is not the top-left corner of the rectangle you passed to the constructor: the rectangle lives in the item's local coordinates, and the item's position stays at the default (0, 0) unless you call setPos(). That is why scenePos() keeps returning the scene origin.
If you want the scene position of the center you can do something like this:
QPointF Tower::sceneCenter()
{
    return mapToScene(rect().center());
}
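For example, the question's fire() could then spawn the bullet at that point (a sketch, assuming the same Bullet constructor shown above):
void Tower::fire()
{
    // the tower's center, expressed in scene coordinates
    QPointF center = sceneCenter();
    Bullet *bullet = new Bullet(center.x(), center.y(), scene());
}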

Related

How to scale graphics taking the center of the window as the origin?

QPainter::scale takes the top-left corner of the window as an origin. In order to use the center of the window as an origin, I thought I could first translate the origin of the coordinate system to the center of the window using QPainter::translate and then scale the graphics:
class MainWindow : public QMainWindow
{
    Q_OBJECT
public:
    explicit MainWindow(QWidget *parent = nullptr) :
        QMainWindow(parent) {
        resize(600, 400);
    }
protected:
    void paintEvent(QPaintEvent *) override {
        QPainter painter(this);
        // draw a rectangle
        QRectF rectangle(10.0, 20.0, 80.0, 60.0);
        painter.drawRect(rectangle);
        // translate the origin of the coordinate system to the center of the window
        QPointF offset = rect().center();
        painter.translate(offset);
        // scale the rectangle
        painter.scale(2, 2);
        painter.drawRect(rectangle);
    }
};
The example does not produce the result I expect: the scale is still applied with regard to the top-left corner of the window rather than its center.
How can I fix that?
Following is my solution.
QPainter painter(this);
// draw a rectangle
QRectF rectangle1(10.0, 20.0, 80.0, 60.0);
painter.drawRect(rectangle1);
// scale the rectangle by 2 times
QRectF rectangle2(10.0, 20.0, 80.0 * 2, 60.0 * 2);
// move it to the center of window
QPointF offset = rect().center() - rectangle2.center();
painter.translate(offset);
painter.drawRect(rectangle2);
With this I get the result I want: the scaled rectangle is drawn centered in the window.
Finding the appropriate transformation that should be applied to QPainter is not a simple task since it involves centering one element on another, moving it, etc. The simplest thing is to transform the rectangle as shown below:
void Widget::paintEvent(QPaintEvent *event)
{
QPainter painter(this);
// draw a rectangle
QRectF rectangle(10.0, 20.0, 80.0, 60.0);
painter.drawRect(rectangle);
// scale
rectangle.setSize(2*rectangle.size());
// translate
rectangle.moveCenter(rect().center());
painter.drawRect(rectangle);
}
You're missing one step, namely re-translating the painter back after the scale. In other words, between
painter.scale(2,2);
painter.drawRect(rectangle);
add
painter.translate(-offset);
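Putting it together, the question's paint handler would then look like this (a sketch combining the original code with that extra call, written out of class for brevity):
void MainWindow::paintEvent(QPaintEvent *)
{
    QPainter painter(this);
    QRectF rectangle(10.0, 20.0, 80.0, 60.0);
    painter.drawRect(rectangle);       // original rectangle
    QPointF offset = rect().center();
    painter.translate(offset);         // move the origin to the window center
    painter.scale(2, 2);               // scale about that origin
    painter.translate(-offset);        // move the origin back
    painter.drawRect(rectangle);       // now scaled with respect to the window center
}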

How to center fullscreen window

I have a 16:9 display where I'd like to show a fullscreen SDL window which is in 4:3 mode. SDL_SetWindowFullscreen(window, SDL_WINDOW_FULLSCREEN) sets the window to the left side of the screen and leaves a big black bar on the right.
I'd like to center the window and have black bars on the left and the right side.
It seems that SDL_SetWindowPosition(window, x, y) has no effect on window when it is in fullscreen mode. Can I center the fullscreen window in SDL2?
There are two situations:
(1) display with a renderer and textures, based on the window size.
(2) display with the screen surface, based on pixels.
For (1), here is a simple solution based on setting the renderer's viewport (untested, but it shows the idea):
void SDL_SetRendererViewportRatio_4_3(SDL_Window *window,
                                      SDL_Renderer *renderer,
                                      SDL_Rect *viewport) {
    Uint8 r, g, b, a;
    SDL_GetRenderDrawColor(renderer, &r, &g, &b, &a);
    SDL_SetRenderDrawColor(renderer, 0, 0, 0, 255);
    SDL_RenderClear(renderer);
    SDL_RenderPresent(renderer);
    SDL_SetRenderDrawColor(renderer, r, g, b, a);
    int w, h;
    SDL_GetWindowSize(window, &w, &h);
    if (w * 3 > h * 4) {
        // window is wider than 4:3 -> bars on the left and right
        viewport->w = h * 4 / 3;
        viewport->h = h;
    } else {
        // window is taller than 4:3 -> bars on the top and bottom
        viewport->w = w;
        viewport->h = w * 3 / 4;
    }
    viewport->x = (w - viewport->w) / 2;
    viewport->y = (h - viewport->h) / 2;
    SDL_RenderSetViewport(renderer, viewport);
}
Note that you should call this function whenever the window changes size.
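For example, you could re-apply the viewport in the event loop whenever SDL reports a size change (a sketch; window, renderer, and viewport are assumed to be the same objects passed to the function above):
SDL_Event event;
while (SDL_PollEvent(&event)) {
    if (event.type == SDL_WINDOWEVENT &&
        event.window.event == SDL_WINDOWEVENT_SIZE_CHANGED) {
        SDL_SetRendererViewportRatio_4_3(window, renderer, &viewport);
    }
}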
For (2) I guess you have to calculate the coordinates of the surface yourself and draw the big black bars by hand. It is more involved, so I can't offer an equally simple solution; a rough sketch of the idea is shown below.
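A rough, untested sketch of that idea, assuming a 4:3 source surface named content (hypothetical) and the window's surface obtained with SDL_GetWindowSurface:
SDL_Surface *screen = SDL_GetWindowSurface(window);
// clear the whole screen to black, which leaves the bars black
SDL_FillRect(screen, NULL, SDL_MapRGB(screen->format, 0, 0, 0));
// compute a centered 4:3 destination rectangle, same math as above
SDL_Rect dst;
if (screen->w * 3 > screen->h * 4) {
    dst.w = screen->h * 4 / 3;
    dst.h = screen->h;
} else {
    dst.w = screen->w;
    dst.h = screen->w * 3 / 4;
}
dst.x = (screen->w - dst.w) / 2;
dst.y = (screen->h - dst.h) / 2;
SDL_BlitScaled(content, NULL, screen, &dst);
SDL_UpdateWindowSurface(window);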

An issue when rendering a line depending on mouse position

I have an OpenGL widget where I want to render a line that depends on the mouse position, as follows:
glPushMatrix();
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glColor4f(1, 0, 0, 0.1);
glScalef(a, b, 0);
glBegin(GL_LINES);
glVertex2f(pushedX, pushedY);
glVertex2f(currentX, currentY);
glEnd();
glFlush();
glDisable(GL_BLEND);
glPopMatrix();
where:
pushedX=buttonPressCoordinates.x();
pushedY=buttonPressCoordinates.y();
currentX=mouseCurrentPosition.x();
currentY=mouseCurrentPosition.y();
The rendering works fine: while I move the mouse, the line connecting the press coordinates and the current coordinates is drawn as required.
BUT:
the issue is that even when I press the mouse somewhere on the widget and don't move it, a seemingly random (x, y) is generated and connected by a line to the mouse-press position. Once I start moving the mouse, it works fine.
Please help me fix this bug.
EDIT
The code of assigning the current values of the mouse
void MainWindow::mouseMoveEvent(QMouseEvent *eventMove)
{
    if(eventMove->buttons() & Qt::LeftButton)
    {
        GLWidget *widget = this->findChild<GLWidget *>("glwidget");
        float x = widget->getNormalizedWidth(widget->mapFromGlobal(QCursor::pos()).x());
        float y = widget->getNormalizedHeight(widget->mapFromGlobal(QCursor::pos()).y());
        float y_ = 1.0 - y;
        mouseCurrentPosition.setX(x);
        mouseCurrentPosition.setY(y_);
        widget->setCurrentX(mouseCurrentPosition.x());
        widget->setCurrentY(mouseCurrentPosition.y());
    }
}
Note: mouseCurrentPosition is declared as QPointF mouseCurrentPosition;, and getNormalizedWidth(...) is my own function, which works correctly.
EDIT-2
The mouse click coordinates are updated as follows:
setMouseTracking(true);
m = true;
GLWidget *widget = this->findChild<GLWidget *>("glwidget");
float x = widget->getNormalizedWidth(widget->mapFromGlobal(QCursor::pos()).x());
float y = widget->getNormalizedHeight(widget->mapFromGlobal(QCursor::pos()).y());
float y_ = 1.0 - y;
buttonPressCoordinates.setX(x);
buttonPressCoordinates.setY(y_);
qDebug() << buttonPressCoordinates.x() << buttonPressCoordinates.y();
widget->setQ(true);
widget->setPushedX(buttonPressCoordinates.x());
widget->setPushedY(buttonPressCoordinates.y());
One thing you can try is to update currentX and currentY in the mouseMoveEvent when LeftButton is not pressed:
void MainWindow::mouseMoveEvent(QMouseEvent *eventMove)
{
    GLWidget *widget = this->findChild<GLWidget *>("glwidget");
    float x = widget->getNormalizedWidth(widget->mapFromGlobal(QCursor::pos()).x());
    float y = widget->getNormalizedHeight(widget->mapFromGlobal(QCursor::pos()).y());
    float y_ = 1.0 - y;
    if(eventMove->buttons() & Qt::LeftButton)
    {
        buttonPressCoordinates.setX(x);
        buttonPressCoordinates.setY(y_);
        widget->setPushedX(buttonPressCoordinates.x());
        widget->setPushedY(buttonPressCoordinates.y());
    }
    else
    {
        mouseCurrentPosition.setX(x);
        mouseCurrentPosition.setY(y_);
        widget->setCurrentX(mouseCurrentPosition.x());
        widget->setCurrentY(mouseCurrentPosition.y());
    }
}
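Another option (just a sketch reusing the question's helpers; the setter names are the ones shown above) is to initialize the current position to the press position inside mousePressEvent, so the very first frame never draws with a stale or uninitialized current coordinate:
void MainWindow::mousePressEvent(QMouseEvent *eventPress)
{
    GLWidget *widget = this->findChild<GLWidget *>("glwidget");
    float x = widget->getNormalizedWidth(widget->mapFromGlobal(QCursor::pos()).x());
    float y_ = 1.0 - widget->getNormalizedHeight(widget->mapFromGlobal(QCursor::pos()).y());
    buttonPressCoordinates.setX(x);
    buttonPressCoordinates.setY(y_);
    widget->setPushedX(x);
    widget->setPushedY(y_);
    // start the rubber-band line with zero length: current == pushed
    mouseCurrentPosition.setX(x);
    mouseCurrentPosition.setY(y_);
    widget->setCurrentX(x);
    widget->setCurrentY(y_);
}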

Libgdx - How to draw a filled rectangle in the right place in scene2d?

I am using scene2d. Here is my code:
group.addActor(new Actor() {
    @Override
    public Actor hit(float arg0, float arg1) {
        return null;
    }

    @Override
    public void draw(SpriteBatch batch, float arg1) {
        batch.end();
        shapeRenderer.begin(ShapeType.FilledRectangle);
        shapeRenderer.setColor(Color.RED);
        shapeRenderer.filledRect(0, 0, 300, 20);
        shapeRenderer.end();
        batch.begin();
    }
});
The problem is that it draws this rectangle relative to the screen (x = 0, y = 0), but I need it to be drawn relative to my group. If I draw other entities with:
batch.draw(texture, 0, 0, width, height);
it correctly draws at (x = 0, y = 0) relative to my group (0, 0 pixels from the bottom-left corner of the group).
Any suggestions for how I can implement shape drawing in scene2d? And can someone explain why these two calls behave differently?
ShapeRenderer has its own transform matrix and projection matrix. These are separate to those in the SpriteBatch that the scene2d Stage uses. If you update the ShapeRenderer's matrices to match those that scene2d is using when Actor.draw() is called then you should get the results that you want.
As Rod Hyde mentions, ShapeRenderer has its own transform matrix and projection matrix.
So you would have to get the SpriteBatch's projection Matrix first.
I am not sure if there is an elegant way to do it; I did it like this:
public class myActor extends Actor {
    private ShapeRenderer shapeRenderer;
    private static boolean projectionMatrixSet;

    public myActor() {
        shapeRenderer = new ShapeRenderer();
        projectionMatrixSet = false;
    }

    @Override
    public void draw(SpriteBatch batch, float alpha) {
        batch.end();
        if (!projectionMatrixSet) {
            shapeRenderer.setProjectionMatrix(batch.getProjectionMatrix());
            projectionMatrixSet = true; // only copy the matrix once
        }
        shapeRenderer.begin(ShapeType.Filled);
        shapeRenderer.setColor(Color.RED);
        shapeRenderer.rect(0, 0, 50, 50);
        shapeRenderer.end();
        batch.begin();
    }
}
This is the best solution for me, because the ShapeRenderer approach doesn't react to the camera moving or zooming.
public class Rectangle extends Actor {
    private Texture texture;

    public Rectangle(float x, float y, float width, float height, Color color) {
        createTexture((int) width, (int) height, color);
        setX(x);
        setY(y);
        setWidth(width);
        setHeight(height);
    }

    private void createTexture(int width, int height, Color color) {
        Pixmap pixmap = new Pixmap(width, height, Pixmap.Format.RGBA8888);
        pixmap.setColor(color);
        pixmap.fillRectangle(0, 0, width, height);
        texture = new Texture(pixmap);
        pixmap.dispose();
    }

    @Override
    public void draw(Batch batch, float parentAlpha) {
        Color color = getColor();
        batch.setColor(color.r, color.g, color.b, color.a * parentAlpha);
        batch.draw(texture, getX(), getY(), getWidth(), getHeight());
    }
}
You need to convert the Actor's local coordinates into screen coordinates. Assuming your stage is full-screen, you can just use Actor.localToStageCoordinates:
vec.set(getX(), getY());
this.localToStageCoordinates(/* in/out */ vec);
shapeRenderer.filledRect(vec.x, vec.y, getWidth(), getHeight());
Where vec is a private Vector2 field (you don't want to allocate a new one on each render call).
This also assumes that your ShapeRenderer is set up to map to the full screen (which is the default).
Also, if you switch away from the ShapeRenderer and back to the SpriteBatch, note that the batch is already adjusted to Actor coordinates (and thus you can use getX() and getY() directly with batch.draw(...)).
If you are using the ShapeRenderer, don't forget the setProjectionMatrix() and setTransformMatrix() methods...
A sample that draws a circle inside an Actor's draw method:
@Override
public void draw(Batch batch, float parentAlpha) {
    batch.end();
    if (shapeRenderer == null) {
        shapeRenderer = new ShapeRenderer();
    }
    Gdx.gl.glEnable(GL20.GL_BLEND);
    shapeRenderer.setProjectionMatrix(batch.getProjectionMatrix());
    shapeRenderer.setTransformMatrix(batch.getTransformMatrix());
    shapeRenderer.setColor(mColor.r, mColor.g, mColor.b, mColor.a * parentAlpha);
    shapeRenderer.begin(ShapeRenderer.ShapeType.Filled);
    shapeRenderer.circle(getX() + getWidth() / 2, getY() + getHeight() / 2, Math.min(getWidth(), getHeight()) / 2);
    shapeRenderer.end();
    Gdx.gl.glDisable(GL20.GL_BLEND);
    batch.begin();
}

Rotate rectangle around its center

I need to rotate a rectangle around its center-point and display it in the center of a QWidget. Can you complete this specific code? If possible, could you also dumb down the explanation or provide a link to the simplest explanation?
Please note: I have read the Qt documentation, compiled examples/demos that deal with rotation and I STILL cannot understand it!
void Canvas::paintEvent(QPaintEvent *event)
{
    QPainter paint(this);
    paint.setBrush(Qt::transparent);
    paint.setPen(Qt::black);
    paint.drawLine(this->width()/2, 0, this->width()/2, this->height());
    paint.drawLine(0, this->height()/2, this->width(), this->height()/2);
    paint.setBrush(Qt::white);
    paint.setPen(Qt::blue);
    // Draw a 13x17 rectangle rotated to 45 degrees around its center-point
    // in the center of the canvas.
    paint.drawRect(QRect(0, 0, 13, 17));
}
void paintEvent(QPaintEvent* event){
    QPainter painter(this);
    // xc and yc are the center of the widget's rect.
    qreal xc = width() * 0.5;
    qreal yc = height() * 0.5;
    painter.setPen(Qt::black);
    // draw the cross lines.
    painter.drawLine(xc, rect().top(), xc, rect().bottom());
    painter.drawLine(rect().left(), yc, rect().right(), yc);
    painter.setBrush(Qt::white);
    painter.setPen(Qt::blue);
    // Draw a 13x17 rectangle rotated to 45 degrees around its center-point
    // in the center of the canvas.
    // translate the coordinate system by xc and yc
    painter.translate(xc, yc);
    // then rotate the coordinate system by 45 degrees
    painter.rotate(45);
    // we need to move the rectangle that we draw by rx and ry so it's in the center.
    qreal rx = -(13 * 0.5);
    qreal ry = -(17 * 0.5);
    painter.drawRect(QRect(rx, ry, 13, 17));
}
You are in the painter's coordinate system. When you call drawRect(x, y, 13, 17), its upper-left corner is at (x, y). If you want (x, y) to be the center of your rectangle, then you need to move the rectangle by half its size, hence rx and ry.
You can call resetTransform() to reset the transformations that were made by translate() and rotate().
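For example (a sketch continuing the paintEvent above), after drawing the rotated rectangle you can reset the painter and keep drawing in the original, untransformed coordinate system:
painter.drawRect(QRect(rx, ry, 13, 17)); // the rotated rectangle from above
painter.resetTransform();                // undo translate() and rotate()
painter.drawRect(QRect(0, 0, 13, 17));   // drawn unrotated, back at the widget's top-left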
Simple:
void rotate(QPainter* p, const QRectF& r, qreal angle, bool clock_wise) {
    p->translate(r.center());
    p->rotate(clock_wise ? angle : -angle);
    p->translate(-r.center());
}
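A possible way to use this helper in the question's paintEvent (a sketch: the 13x17 rectangle is first centered in the widget, then the painter is rotated 45 degrees around that rectangle's center):
void Canvas::paintEvent(QPaintEvent *event)
{
    QPainter paint(this);
    paint.drawLine(width()/2, 0, width()/2, height());
    paint.drawLine(0, height()/2, width(), height()/2);
    paint.setBrush(Qt::white);
    paint.setPen(Qt::blue);
    // center the 13x17 rectangle in the widget
    QRectF r(0, 0, 13, 17);
    r.moveCenter(rect().center());
    // rotate the painter 45 degrees clockwise around the rectangle's center
    rotate(&paint, r, 45, true);
    paint.drawRect(r);
}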