Remove all canvas margins & spacing in Qwt - C++

I have a QwtPlot which consists of a single horizontal QwtPlotMultiBarChart. I would like the canvas height, the y-axis length and the bar chart height to all be the same. I tried setting all margins/spacing to zero and switching layout policies.
seriesPlot = new QwtPlot(this);
seriesPlot->plotLayout()->setCanvasMargin(0);
seriesPlot->plotLayout()->setSpacing(0);
seriesPlot->canvas()->setSizePolicy(QSizePolicy::Minimum, QSizePolicy::Minimum);
seriesPlot->canvas()->setContentsMargins(0,0,0,0);
d_barChartItem = new QwtPlotMultiBarChart();
d_barChartItem->setLayoutPolicy( QwtPlotMultiBarChart::ScaleSampleToCanvas );
d_barChartItem->setStyle( QwtPlotMultiBarChart::Stacked );
d_barChartItem->attach( seriesPlot );
populate();
d_barChartItem->setSpacing(0);
d_barChartItem->setMargin( 0 );
QwtPlot::Axis axis2 = QwtPlot::xBottom;
QwtPlot::Axis axis1 = QwtPlot::yLeft;
d_barChartItem->setOrientation( Qt::Horizontal );
seriesPlot->setAxisScale( axis1,-0.2,0.2 );
seriesPlot->setAxisAutoScale( axis2 );
QwtScaleDraw *scaleDraw1 = seriesPlot->axisScaleDraw( axis1 );
scaleDraw1->enableComponent( QwtScaleDraw::Backbone, false );
scaleDraw1->setSpacing(0);
scaleDraw1->enableComponent( QwtScaleDraw::Ticks, false );
scaleDraw1->enableComponent( QwtScaleDraw::Labels, false );
QwtScaleDraw *scaleDraw2 = seriesPlot->axisScaleDraw( axis2 );
scaleDraw2->enableComponent( QwtScaleDraw::Backbone, false );
scaleDraw2->enableComponent( QwtScaleDraw::Ticks, true );
seriesPlot->plotLayout()->setAlignCanvasToScale( axis1, true);
seriesPlot->updateAxes();
seriesPlot->plotLayout()->setCanvasMargin( 0 );
seriesPlot->updateCanvasMargins();
seriesPlot->updateLayout();
seriesPlot->replot();
seriesPlot->setAutoReplot( true );
And here is a picture of the two margins that I would like to remove:

I found the solution:
d_barChartItem->setLayoutHint(0.75);
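For reference (my own note, based on my reading of the Qwt documentation, so treat it as an assumption rather than the author's explanation): with the ScaleSampleToCanvas layout policy used above, layoutHint() is interpreted as a fraction of the canvas, so the bars are sized relative to the canvas rather than to a fixed axis interval or pixel count. A minimal sketch of the two calls together, reusing d_barChartItem and seriesPlot from the question:
// Sketch: with ScaleSampleToCanvas, layoutHint() is taken as a fraction
// of the canvas extent (here 75% per sample).
d_barChartItem->setLayoutPolicy( QwtPlotMultiBarChart::ScaleSampleToCanvas );
d_barChartItem->setLayoutHint( 0.75 );
seriesPlot->replot();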

Related

ANY edit in this threejs sample build ends up with white screen

I was given this sample build project so I could play around with three.js, since as a beginner I was having trouble making the app work. I wanted to test whether I could edit the build project, so I decided to add another cube and reposition it somewhere along the x axis. However, I keep getting a white screen for an unknown reason.
main.js (adding a cube)
import {
BoxBufferGeometry,
Mesh,
MeshBasicMaterial,
PerspectiveCamera,
Scene,
WebGLRenderer
} from 'three';
import { OrbitControls } from 'three/examples/jsm/controls/OrbitControls.js';
let camera, scene, renderer;
class App {
init() {
camera = new PerspectiveCamera( 70, window.innerWidth / window.innerHeight, 1, 1000 );
camera.position.z = 400;
scene = new Scene();
const geometry = new BoxBufferGeometry( 200, 200, 200 );
const material = new MeshBasicMaterial();
const mesh = new Mesh( geometry, material );
scene.add(mesh);
const geometry2 = new BoxBufferGeometry(200, 200, 200);
const material2 = new MeshBasicMaterial();
const mesh2 = Mesh(geometry2, material2);
mesh2.position.y = -3;
scene.add(mesh2);
renderer = new WebGLRenderer( { antialias: true } );
renderer.setPixelRatio( window.devicePixelRatio );
renderer.setSize( window.innerWidth, window.innerHeight );
document.body.appendChild( renderer.domElement );
window.addEventListener( 'resize', onWindowResize, false );
const controls = new OrbitControls( camera, renderer.domElement );
animate();
}
}
function onWindowResize() {
camera.aspect = window.innerWidth / window.innerHeight;
camera.updateProjectionMatrix();
renderer.setSize( window.innerWidth, window.innerHeight );
}
function animate() {
requestAnimationFrame( animate );
renderer.render( scene, camera );
}
export { App }
Correct me if what I did was wrong, but I ran the following to test out the App:
npm install to install the npm modules.
npm run build to update build.js.
I checked the console and the following error was shown:
I also tried just changing the material of the cube using the code below.
main.js (changing material)
import {
BoxBufferGeometry,
Mesh,
MeshLambertMaterial,
PerspectiveCamera,
Scene,
WebGLRenderer
} from 'three';
import { OrbitControls } from 'three/examples/jsm/controls/OrbitControls.js';
let camera, scene, renderer;
class App {
init() {
camera = new PerspectiveCamera( 70, window.innerWidth / window.innerHeight, 1, 1000 );
camera.position.z = 400;
scene = new Scene();
const geometry = new BoxBufferGeometry( 200, 200, 200 );
const material = new MeshLambertMaterial();
const mesh = new Mesh( geometry, material );
scene.add(mesh);
renderer = new WebGLRenderer( { antialias: true } );
renderer.setPixelRatio( window.devicePixelRatio );
renderer.setSize( window.innerWidth, window.innerHeight );
document.body.appendChild( renderer.domElement );
window.addEventListener( 'resize', onWindowResize, false );
const controls = new OrbitControls( camera, renderer.domElement );
animate();
}
}
function onWindowResize() {
camera.aspect = window.innerWidth / window.innerHeight;
camera.updateProjectionMatrix();
renderer.setSize( window.innerWidth, window.innerHeight );
}
function animate() {
requestAnimationFrame( animate );
renderer.render( scene, camera );
}
export { App }
This resulted in just a black screen. I could not even access the console to check the error since nothing shows up when I right click on the screen.
Is there something wrong with how I build the project after editing?

QGraphicsLinearLayout not aligning widgets as expected

I have a QGraphicsScene to which I have added a QGraphicsWidget.
I initially set up the widget this way:
AGraphicsWidget::AGraphicsWidget( QGraphicsItem* parent, QGraphicsScene* scene )
: QGraphicsWidget( parent )
, m_mainLayout( new QGraphicsLinearLayout( Qt::Vertical ) )
, m_titleLabel( new QLabel( "Title" ) )
, m_executionPin( new NodeExecutionPin() )
{
setSizePolicy( QSizePolicy::Expanding, QSizePolicy::Expanding );
setGraphicsItem( this );
setFlag( ItemIsMovable, true );
m_mainLayout->setInstantInvalidatePropagation( true );
auto* labelProxy = scene->addWidget( m_titleLabel );
auto* secondLabel = scene->addWidget( new QLabel( "Subtitle" ) );
auto* button = scene->addWidget( new QPushButton( "Another" ) );
m_mainLayout->addItem( labelProxy );
m_mainLayout->addItem( secondLabel );
m_mainLayout->addItem( button );
setLayout( m_mainLayout );
}
This setup does not do what I expect, which would be a vertically aligned set of widgets; instead I get this:
However, when I add a text edit as the last element, like this:
AGraphicsWidget::AGraphicsWidget( QGraphicsItem* parent, QGraphicsScene* scene )
: QGraphicsWidget( parent )
, m_mainLayout( new QGraphicsLinearLayout( Qt::Vertical ) )
, m_titleLabel( new QLabel( "Title" ) )
, m_executionPin( new NodeExecutionPin() )
{
setSizePolicy( QSizePolicy::Expanding, QSizePolicy::Expanding );
setGraphicsItem( this );
setFlag( ItemIsMovable, true );
m_mainLayout->setInstantInvalidatePropagation( true );
auto* labelProxy = scene->addWidget( m_titleLabel );
auto* secondLabel = scene->addWidget( new QLabel( "Subtitle" ) );
auto* button = scene->addWidget( new QPushButton( "Another" ) );
auto* edit = scene->addWidget( new QTextEdit() );
m_mainLayout->addItem( labelProxy );
m_mainLayout->addItem( secondLabel );
m_mainLayout->addItem( button );
m_mainLayout->addItem( edit );
setLayout( m_mainLayout );
}
It works as expected and I get the nice vertical layout. I get the same misalignment as above if I try to nest any QGraphicsLinearLayouts any further; they overlap each other. What is happening here?
Thanks in advance.

QwtPlotSpectrogram with log scales

I want to put a logarithmic scale next to a spectrogram. I want the displayed image to be the same as for the linear data. The code for the version with linear scales looks like this:
#include <QApplication>
#include <QMainWindow>
#include <qwt_plot.h>
#include <qwt_plot_spectrogram.h>
#include <qwt_matrix_raster_data.h>
#include <qwt_color_map.h>
#include <qwt_scale_engine.h>
#include <memory> // for std::make_unique
int main( int argc, char* argv[] ) {
QApplication app( argc, argv );
QMainWindow wnd;
QVector<double> heat_values( 100 * 100 );
for( int n = 0; n < 100 * 100; ++n ) {
heat_values[n] = ( n % 100 ) + n / 100;
};
QwtPlotSpectrogram heat;
auto heat_data = std::make_unique<QwtMatrixRasterData>();
heat_data->setValueMatrix( heat_values, 100 );
heat_data->setResampleMode(
QwtMatrixRasterData::ResampleMode::NearestNeighbour );
heat_data->setInterval( Qt::XAxis, QwtInterval( 0, 100.0 ) );
heat_data->setInterval( Qt::YAxis, QwtInterval( 0, 100.0 ) );
heat_data->setInterval( Qt::ZAxis, QwtInterval( 0, 200.0 ) );
heat.setDisplayMode( QwtPlotSpectrogram::DisplayMode::ImageMode, true );
heat.setColorMap( new QwtLinearColorMap( Qt::white, Qt::black ) );
heat.setData( heat_data.release() );
QwtPlot p;
p.setAutoDelete( false );
heat.attach( &p );
p.repaint();
wnd.setCentralWidget( &p );
wnd.resize( 400, 300 );
wnd.show();
return QApplication::exec();
}
and produces the expected result.
However, I want the same image but with different scales, for example logarithmic scales from 1 to 101. But after I change the scales like this:
p.setAxisScaleEngine( QwtPlot::yLeft, new QwtLogScaleEngine() );
p.setAxisScale( QwtPlot::yLeft, 1.0, 101.0 );
p.setAxisScaleEngine( QwtPlot::xBottom, new QwtLogScaleEngine() );
p.setAxisScale( QwtPlot::xBottom, 1.0, 101.0 );
then the spectrogram is all messed up.
Does anyone know how to just change the displayed scale?
MSVC 2017, x64, Qwt 6.1.4, Qt 5.12.2
Edit:
I can get halfway there by defining my own RasterData and mapping the coordinates back into bins, but it's still missing the inverse transformation, so the displayed data is a 'log' version of the original.
class RasterData : public QwtRasterData
{
public:
double value( double const x, double const y ) const override {
int const ix = std::min<int>( std::max<int>( 0, x ), m_cols-1 );
int const iy = std::min<int>( std::max<int>( 0, y ), m_cols-1 );
return m_values[iy * m_cols + ix];
}
void setValueMatrix( QVector<double> const& values, int const cols ) {
m_values = values;
m_cols = cols;
}
private:
QVector<double> m_values;
int m_cols;
};
The result then looks like this:
But essentially I want to avoid all of these transformations. I want it to just transform the image data passed in via setValueMatrix into an image using the set color map and stretch that image to fit the plot.
The best way I found to make this work is by deriving from QwtPlotSpectrogram and changing the transformation to linear for the call to draw.
class PlotSpectrogram : public QwtPlotSpectrogram {
public:
void draw(
QPainter* painter,
QwtScaleMap const& xMap,
QwtScaleMap const & yMap,
QRectF const& canvasRect ) const override {
QwtScaleMap xMapLin( xMap );
QwtScaleMap yMapLin( yMap );
auto const xi = data()->interval( Qt::XAxis );
auto const yi = data()->interval( Qt::YAxis );
auto const dx = xMapLin.transform( xMap.s1() );
xMapLin.setScaleInterval( xi.minValue(), xi.maxValue() );
auto const dy = yMapLin.transform( yMap.s2() );
yMapLin.setScaleInterval( yi.minValue(), yi.maxValue() );
xMapLin.setTransformation( new QwtNullTransform() );
yMapLin.setTransformation( new QwtNullTransform() );
QwtPlotSpectrogram::draw(
painter, xMapLin, yMapLin, canvasRect.translated( dx, -dy ) );
}
};
With main altered to a log scale from 20..50 and using PlotSpectrogram:
PlotSpectrogram heat;
auto heat_data = std::make_unique<QwtMatrixRasterData>();
heat_data->setValueMatrix( heat_values, 100 );
heat_data->setInterval( Qt::XAxis, QwtInterval( 0, 100.0 ) );
heat_data->setInterval( Qt::YAxis, QwtInterval( 0, 100.0 ) );
heat_data->setInterval( Qt::ZAxis, QwtInterval( 0, 200.0 ) );
heat.setDisplayMode( QwtPlotSpectrogram::DisplayMode::ImageMode, true );
heat.setColorMap( new QwtLinearColorMap( Qt::white, Qt::black ) );
heat.setData( heat_data.release() );
QwtPlot p;
p.setAxisScaleEngine( QwtPlot::yLeft, new QwtLogScaleEngine() );
p.setAxisScale( QwtPlot::yLeft, 20.0, 50.0 );
p.setAxisScaleEngine( QwtPlot::xBottom, new QwtLogScaleEngine() );
p.setAxisScale( QwtPlot::xBottom, 20.0, 50.0 );
p.setAutoDelete( false );
heat.attach( &p );
I then get the desired output.
QwtMatrixRasterData does not work with non-linear scales! When using QwtRasterData instead, everything works out of the box with any type of scale.
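For illustration, here is a minimal sketch of such a QwtRasterData subclass (my own, not from the thread; it assumes the Qwt 6.1 setInterval()/interval() API and a row-major matrix whose first row corresponds to the minimum of the Y interval). It maps plot coordinates linearly onto matrix indices; since the spectrogram samples value() in plot coordinates and applies the axis transformations itself, it also renders on logarithmic axes:
#include <qwt_raster_data.h>
#include <qwt_interval.h>
#include <QVector>

// Sketch of a matrix-backed QwtRasterData. value() maps plot coordinates
// linearly into the matrix using the intervals set via setInterval().
class MatrixRasterData : public QwtRasterData
{
public:
    MatrixRasterData( const QVector<double>& values, int cols, int rows )
        : m_values( values ), m_cols( cols ), m_rows( rows )
    {
    }

    double value( double x, double y ) const override
    {
        const QwtInterval xi = interval( Qt::XAxis );
        const QwtInterval yi = interval( Qt::YAxis );

        // Normalize the plot coordinate over the interval and pick the bin.
        const int col = qBound( 0,
            static_cast<int>( ( x - xi.minValue() ) / xi.width() * m_cols ), m_cols - 1 );
        const int row = qBound( 0,
            static_cast<int>( ( y - yi.minValue() ) / yi.width() * m_rows ), m_rows - 1 );

        return m_values[ row * m_cols + col ];
    }

private:
    QVector<double> m_values;
    int m_cols;
    int m_rows;
};
It would be used like the QwtMatrixRasterData in the question: set the X/Y/Z intervals with setInterval() and pass it to heat.setData().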

Qt - QPainter::rotate with non-perpendicular angles causes wiggly lines

I'm trying to draw text on a widget at a non-perpendicular angle, i.e. between 0 and 90 degrees. Drawing the text itself is no issue, but the resulting text is very wiggly/unevenly drawn.
In the picture below, I'm drawing two lines of text at a 45-degree angle. The first line of text is many underscores ("_____"), and the second line is "Multithreading". Underscores are drawn here instead of a line just to highlight the issue.
As you can see, the first line obviously shows the text is not evenly drawn. This is more subtle in the second line, but still visible.
Drawing at perpendicular angles (0, 90, 180, etc.) does not cause this effect. Any reason why this is happening?
I'm working on Windows 10 with Qt 5.7.0.
Minimal code example:
void MyWidget::paintEvent( QPaintEvent * /* event */ )
{
QFont font;
font.setPointSize( 16 );
font.setStyleStrategy( QFont::StyleStrategy::PreferAntialias );
setFont( font );
QImage image( size(), QImage::Format_ARGB32_Premultiplied );
QPainter imagePainter( &image );
imagePainter.initFrom( this );
imagePainter.setFont( font() );
imagePainter.setRenderHint( QPainter::Antialiasing, true );
imagePainter.eraseRect( rect() );
// Set the logical origin in the middle of the image
m_window = QRect(
- width() / 2, // x
- height() / 2, // y
width(), // w
height() // h
);
imagePainter.setWindow( m_window );
m_viewport = QRect(
0, // x
0, // y
width(), // w
height() // h
);
imagePainter.setViewport( m_viewport );
draw( imagePainter );
imagePainter.end();
QPainter widgetPainter( this );
widgetPainter.drawImage( 0, 0, image );
}
void MyWidget::draw( QPainter & painter )
{
painter.save();
// Rotate anti-clockwise
painter.rotate( -m_degrees );
painter.drawText( m_window.top(), 0, tr( "Multithreads" ) );
painter.drawText( m_window.top(), 15, tr( "__________" ) );
painter.restore();
}
I found a workaround from this Qt bug ticket. In short, the fix is to draw the text as a QPainterPath rather than as text.
An example of the fix:
// Do this
QPainterPath glyphPath;
glyphPath.addText( x, y, painter.font(), text );
painter.fillPath( glyphPath, painter.pen().color() );
// instead of this
painter.drawText( x, y, text );
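For context, here is a sketch (my own adaptation, reusing the member names from the question, so treat the details as assumptions) of how the draw() method above might look with the QPainterPath workaround applied:
#include <QPainterPath>

void MyWidget::draw( QPainter & painter )
{
    painter.save();

    // Rotate anti-clockwise, as before
    painter.rotate( -m_degrees );

    // Build the glyph outlines as a path and fill it instead of calling drawText()
    QPainterPath glyphPath;
    glyphPath.addText( m_window.top(), 0, painter.font(), tr( "Multithreads" ) );
    glyphPath.addText( m_window.top(), 15, painter.font(), tr( "__________" ) );
    painter.fillPath( glyphPath, painter.pen().color() );

    painter.restore();
}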
EDIT:
The difference can be seen below.
Before:
After:

Zooming in a QGraphicsScene makes my drawings disappear sometimes

I have a QGraphicsScene to which I add a QPixmap composed of 4 images and the borders of each image.
I create a new QPixmap with the total size and then use a QPainter to draw each sub-image in the appropriate place in the bigger pixmap. After one sub-image is done, I immediately draw its borders (this may not be optimal, but for now I don't mind).
Once the "final" pixmap is finished, I insert it directly into the scene with
scene->addPixmap( total )
Here's the code for the pixmap composition:
QPixmap pixFromCube( PanoramicImages* lim ) const
{
const QSize img_size = getImageSize( lim );
const QSize pano_size( img_size.width() * 4, img_size.height() );
QPixmap toret( pano_size );
if( !toret.isNull() ) {
QPainter painter( &toret );
painter.setRenderHint( QPainter::Antialiasing );
int x( 0 );
QPixmap pix = lim->getCamera1Image();
if( !pix.isNull() ) {
painter.drawPixmap( 0, 0, pix.width(), pix.height(), pix );
drawPixBorder( painter, pix.rect() );
}
x += img_size.width();
pix = lim->getCamera2Image();
if( !pix.isNull() ) {
painter.drawPixmap( x, 0, pix.width(), pix.height(), pix );
drawPixBorder( painter, QRectF( x, 0, pix.width(), pix.height() ) );
}
x += img_size.width();
pix = lim->getCamera3Image();
if( !pix.isNull() ) {
painter.drawPixmap( x, 0, pix.width(), pix.height(), pix );
drawPixBorder( painter, QRectF( x, 0, pix.width(), pix.height() ) );
}
x += img_size.width();
pix = lim->getCamera4Image();
if( !pix.isNull() ) {
painter.drawPixmap( x, 0, pix.width(), pix.height(), pix );
drawPixBorder( painter, QRectF( x, 0, pix.width(), pix.height() ) );
}
}
return toret;
}
And
void drawPixBorder( QPainter& painter, const QRectF rect ) const
{
const QBrush oldBrush = painter.brush();
const QPen oldPen = painter.pen();
QColor color( Qt::blue );
if( timer.isActive() ) {
color = Qt::green;
} else {
color = Qt::red;
}
const QBrush brush( color );
QPen pen( brush, 22 );
const QPointF points[ 5 ] = {
rect.topLeft(),
rect.topRight(),
rect.bottomRight(),
rect.bottomLeft(),
rect.topLeft()
};
painter.setBrush( brush );
painter.setPen( pen );
painter.drawPolyline( points, sizeof( points ) / sizeof( points[ 0 ] ) );
painter.setBrush( oldBrush );
painter.setPen( oldPen );
}
Here's the final pixmap when it's loaded for the first time:
And here after a few zoom-outs:
As you can see, at the right some of the borders are missing. When zooming back in to the initial position, the borders are displayed again. If I use a smaller width for the lines (say, 5), the borders disappear sooner.
I've been reading other questions here and in the Qt Forums and tried some suggestions like:
pen.setCosmetic( true );
or
painter.setRenderHint( QPainter::NonCosmeticDefaultPen, false);
or:
painter.setRenderHint( QPainter::Antialiasing );
setting the pen width directly to 0
pen.setWidth( 0 )
and combinations.
None of them prevented the borders from disappearing, and using a bigger width just delays the problem.
Is there a way to always show the borders regardless of the zoom level?
Thanks to @Robert for his help. As he stated in his answer, the solution was to draw directly in the scene instead of drawing in the pixmap and then adding it.
For drawing in the scene, I decided to use a QPainterPath:
int x( 0 );
QPainterPath rectPath;
for( unsigned int i( 0 ); i < 4; ++i ) {
rectPath.addRect( QRectF( x, 0, width, height ) );
x += width;
}
QColor color( Qt::blue );
if( timer.isActive() ) {
color = Qt::green;
} else {
color = Qt::red;
}
scene->addPath( rectPath, QPen( color ) );
It is because the painter you are using to create the pixmap does not know anything about the transformations/scale of the graphics scene... A possible solution would be to draw the rectangles within the scene and not directly to the pixmap.
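As a follow-up (my own addition, not part of the answer above): once the borders are drawn as scene items, a cosmetic pen can be used so that their on-screen width stays constant regardless of the view's zoom level. In this sketch, width, height, timer and scene are taken from the snippets above:
// Sketch: add the borders as a scene path with a cosmetic pen, whose width
// is given in device pixels and ignores the view transformation.
QPainterPath rectPath;
int x = 0;
for ( int i = 0; i < 4; ++i ) {
    rectPath.addRect( QRectF( x, 0, width, height ) );
    x += width;
}

QPen borderPen( timer.isActive() ? QColor( Qt::green ) : QColor( Qt::red ) );
borderPen.setWidth( 2 );
borderPen.setCosmetic( true );   // keep a 2-pixel border at any zoom level

scene->addPath( rectPath, borderPen );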