"Invalid Handle Object" when plotting 2 figures Matlab - c++

I'm having a difficult time understanding the paradigm of MATLAB classes compared to C++. I wrote code the other day that I thought should work. It did not... until I added
< handle
after the classdef.
So I have two classes, landmarks and robot, both of which are called from within the simulation class. Below is the main loop of obj.simulation.animate(), and it works until I try to plot two things at once.
DATA.path is a record of all the places a robot has been on the map, and it's updated every time the position is updated.
When I try to plot it, by uncommenting the two marked lines below, I get this error:
??? Error using ==> set
Invalid handle object.
Error in ==> simulation>simulation.animate at 45
set(l.lm,'XData',obj.landmarks.apparentPositions(:,1),'YData',obj.landmarks.apparentPositions(:,2));
%INITIALIZE GLOBALS
global DATA XX
XX = [obj.robot.x ; obj.robot.y];
DATA.i=1;
DATA.path = XX;
%Setup Plots
fig=figure;
xlabel('meters'), ylabel('meters')
set(fig, 'name', 'Phil''s AWESOME 80''s Robot Simulator')
xymax = obj.landmarks.mapSize*3;
xymin = -(obj.landmarks.mapSize*3);
l.lm=scatter([0],[0],'b+');
%"UNCOMMENT ME"l.pth= plot(0,0,'k.','markersize',2,'erasemode','background'); % vehicle path
axis([xymin xymax xymin xymax]);
%Simulation Loop
for n = 1:720,
%Calculate and Set Heading/Location
XX = [obj.robot.x;obj.robot.y];
store_data(XX);
if n == 120,
DATA.path
end
%Update Position
headingChange = navigate(n);
obj.robot.updatePosition(headingChange);
obj.landmarks.updatePerspective(obj.robot.heading, obj.robot.x, obj.robot.y);
%Animate
%"UNCOMMENT ME" set(l.pth, 'xdata', DATA.path(1,1:DATA.i), 'ydata', DATA.path(2,1:DATA.i));
set(l.lm,'XData',obj.landmarks.apparentPositions(:,1),'YData',obj.landmarks.apparentPositions(:,2));
rectangle('Position',[-2,-2,4,4]);
drawnow
end
This is the classdef for landmarks
classdef landmarks < handle
properties
fixedPositions; %# positions in a fixed coordinate system. [ x, y ]
mapSize; %Map Size. Value is side of square
x;
y;
heading;
headingChange;
end
properties (Dependent)
apparentPositions
end
methods
function obj = landmarks(mapSize, numberOfTrees)
obj.mapSize = mapSize;
obj.fixedPositions = obj.mapSize * rand([numberOfTrees, 2]) .* sign(rand([numberOfTrees, 2]) - 0.5);
end
function apparent = get.apparentPositions(obj)
currentPosition = [obj.x ; obj.y];
apparent = bsxfun(@minus,(obj.fixedPositions)',currentPosition)';
apparent = ([cosd(obj.heading) -sind(obj.heading) ; sind(obj.heading) cosd(obj.heading)] * (apparent)')';
end
function updatePerspective(obj,tempHeading,tempX,tempY)
obj.heading = tempHeading;
obj.x = tempX;
obj.y = tempY;
end
end
end
Here is how I understand things: I created a plot handle l.lm (a scatter series) with about 100 xy points. I can rotate those points by using
set(l.lm,'XData',obj.landmarks.apparentPositions(:,1),'YData',obj.landmarks.apparentPositions(:,2));
When I do that, things work. When I try to plot a second group of XY points, stored in DATA.path, it craps out and I can't figure out why.
I need to plot the robot's path, stored in DATA.path, AND the landmarks' positions. Any ideas on how to do that?
Jonas:
I'm not saying you're wrong, because I don't know the answer, but I have code from another application that plots this way without calling axes('NextPlot','add');
if dtsum==0 & ~isempty(z) % plots related to observations
set(h.xf, 'xdata', XX(4:2:end), 'ydata', XX(5:2:end))
plines= make_laser_lines (z,XX(1:3));
set(h.obs, 'xdata', plines(1,:), 'ydata', plines(2,:))
pfcov= make_feature_covariance_ellipses(XX,PX);
set(h.fcov, 'xdata', pfcov(1,:), 'ydata', pfcov(2,:))
end
drawnow
The above works on the other code, but not mine. I'll try implementing your suggestion and let you know.

When you call plot multiple times on the same figure, the previous plot is by default erased, and the handle to the previous plot points to nothing. Thus the error.
To fix this, you need to set the NextPlot property of the axes to 'add'. You can do this by calling hold on (that's what you'd do if you were plotting from the command line), or you can write
fig=figure;
%# create a set of axes where additional plots will be added on top of each other
%# without erasing
axes('NextPlot','add');
If you want, you can store the axes handle as well, and use plot(ah,x,y,...) to make sure that you plot into the right set of axes and not somewhere strange if you happen to click on a different figure window between the time the figure is opened and the plot command is issued.

Related

How to make a GUI that plots user-entered function(x)?

I'm developing a program that takes a function of x from the user, along with the min and max values of x, and then has to plot that function.
For example:
user-entered function(x): x^2+2x-1
Max value of x: 3
Min value of x: -3
The GUI then has to display a plot of the function over that range (if the entered function is free of errors; otherwise the error is shown to the user).
The entered function may also be a bit more complex, e.g. sin(x), cos(2*x+1), etc.
I'm trying to do this with C++ and Qt, so I'd appreciate any advice on how to do the plotting part with Qt, or a recommendation for something better than Qt that works with C++ and can do the job.
Thanks in advance.
You are going to need a library that interprets mathematical expressions (e.g. muparser). In the code below I did my own math, but you would do that part with such a library. Once you have that handled, you can draw your graphs with QCustomPlot.
Here's a sample to give an idea how you can use QCustomPlot:
/** I copy pasted the code from one of my projects, please ignore function & class namings
*/
/** somewhere in your window constructor, create a QCustomPlot
* widget and add graphs to your QCustomPlot widget.
* example: ui->plt_freq_domain->addGraph();
*/
void ControllerMain::plot_frequency_domain()
{
QVector<double> vec_keys(1024), vec_values(1024);
// fill the x axis keys with consecutive integers [0, 1023] (or any range you want)
// so the graph spans the x-axis -- std::iota is from <numeric>
std::iota(vec_keys.begin(), vec_keys.end(), 0);
// let's say your function is y = x^2: calculate all y values and store them in vec_values
for(int i = 0; i < vec_keys.size(); i++)
{
    vec_values[i] = std::pow(vec_keys[i], 2); // std::pow is from <cmath>
}
ui->plt_freq_domain->graph(0)->setData(vec_keys, vec_values, true);
ui->plt_freq_domain->graph(0)->setPen(QPen(QColor(245, 197, 66)));
ui->plt_freq_domain->yAxis->setLabel("A/m"); // change it with your unit or just keep empty
ui->plt_freq_domain->yAxis->setRange(*std::min_element(vec_values.begin(), vec_values.end()),
*std::max_element(vec_values.begin(), vec_values.end()));
ui->plt_freq_domain->xAxis->setLabel("f"); // change it with your unit or just keep empty
ui->plt_freq_domain->xAxis->setRange(vec_keys.constFirst(), vec_keys.constLast());
ui->plt_freq_domain->replot();
}
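The sample above hard-codes y = x^2; the expression-parsing half would come from the library. Here is a rough sketch of my own (not part of the project above) showing how muparser could evaluate the user-entered text over [min, max] and produce the key/value vectors for setData. The function name evaluateUserFunction and its signature are just illustrative:
#include "muParser.h"   // muparser: expression parsing and evaluation
#include <QString>
#include <QVector>

// Evaluate a user-entered expression such as "x^2+2*x-1" or "sin(x)" at
// sampleCount evenly spaced x values in [xMin, xMax]. Returns false and fills
// errorMessage if the expression does not parse or cannot be evaluated.
bool evaluateUserFunction(const QString &expression, double xMin, double xMax,
                          int sampleCount, QVector<double> &keys,
                          QVector<double> &values, QString &errorMessage)
{
    keys.resize(sampleCount);
    values.resize(sampleCount);
    double x = 0.0;
    try {
        mu::Parser parser;
        parser.DefineVar("x", &x);                 // bind "x" to our local variable
        parser.SetExpr(expression.toStdString());
        const double step = (xMax - xMin) / (sampleCount - 1);
        for (int i = 0; i < sampleCount; ++i) {
            x = xMin + i * step;
            keys[i] = x;
            values[i] = parser.Eval();             // evaluate at the current x
        }
    } catch (mu::Parser::exception_type &e) {
        errorMessage = QString::fromStdString(e.GetMsg());
        return false;
    }
    return true;
}
The resulting keys/values can then be passed to ui->plt_freq_domain->graph(0)->setData(keys, values) just like in the sample, and errorMessage is what you would show to the user when the entered function is invalid.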

How to draw graph for a trajectory which goes left and right in x axis?

I want to draw the trajectory in x and y of a car in a parking lot.
The trajectory in x is not always in the same direction; sometimes the car goes left.
The problem is: sometimes (not always!) the graph will not go left on the x axis. You can see the two different results in this image: https://imgur.com/Z53fNkt
Any idea why?
The image on the left is what I expect; the one on the right uses the same values, but I continued plotting data a little longer.
void TrackingResultsView::setupTrajectoryPlot()
{
QCustomPlot *customPlot = ui->qcp_trajectory;
customPlot->xAxis2->setVisible(true);
customPlot->xAxis2->setLabel("X-Position (pixel)");
customPlot->xAxis2->setRange(0, mModelPtr->frameSize().width());
customPlot->xAxis2->grid()->setVisible(true);
customPlot->xAxis->setRange(0, mModelPtr->frameSize().width());
customPlot->yAxis->setLabel("Y-Position (pixel)");
customPlot->yAxis->setRange(0, mModelPtr->frameSize().height());
customPlot->yAxis->setRangeReversed(true);
customPlot->yAxis2->setVisible(true);
customPlot->yAxis2->setRange(0, mModelPtr->frameSize().height());
customPlot->yAxis2->grid()->setVisible(true);
customPlot->yAxis2->setRangeReversed(true);
customPlot->addGraph(customPlot->xAxis2, customPlot->yAxis);
QVector<QVector<double>> data = createDataMap(mModelPtr->points());
customPlot->graph()->setData(data.at(0), data.at(1), true);
setTheme(customPlot, false);
}
Thank you.
(English is not my first language.)
QCPGraph seems to be meant for sorted data with only one value per key. From the QCustomPlot documentation, it looks like QCPCurve would be a better match for plotting a trajectory (multiple values for the same key).
From the QCPCurve description:
Unlike QCPGraph, plottables of this type may have multiple points with the same key coordinate, so their visual representation can have loops. This is realized by introducing a third coordinate t, which defines the order of the points described by the other two coordinates x and y.
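For reference, a minimal QCPCurve sketch could look like this (my own illustration, assuming a QCustomPlot widget named customPlot and two QVector<double>s xs and ys holding the trajectory points):
// The extra coordinate t (here simply the point index) fixes the drawing order,
// so the x values are free to move left and right without being re-sorted.
QCPCurve *trajectory = new QCPCurve(customPlot->xAxis, customPlot->yAxis);
for (int i = 0; i < xs.size(); ++i)
    trajectory->addData(i, xs.at(i), ys.at(i)); // addData(t, key, value)
customPlot->replot();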
Here is my new code, with Olivier's help. It works!
QCustomPlot *customPlot = ui->qcp_trajectory;
customPlot->xAxis2->setVisible(true);
customPlot->xAxis2->setLabel("X-Position (pixel)");
customPlot->xAxis2->setRange(0, mModelPtr->frameSize().width());
customPlot->xAxis2->grid()->setVisible(true);
customPlot->xAxis->setRange(0, mModelPtr->frameSize().width());
customPlot->yAxis->setLabel("Y-Position (pixel)");
customPlot->yAxis->setRange(0, mModelPtr->frameSize().height());
customPlot->yAxis->setRangeReversed(true);
customPlot->yAxis2->setVisible(true);
customPlot->yAxis2->setRange(0, mModelPtr->frameSize().height());
customPlot->yAxis2->grid()->setVisible(true);
customPlot->yAxis2->setRangeReversed(true);
customPlot->addGraph(customPlot->xAxis2, customPlot->yAxis);
// create empty curve objects:
QCPCurve *trajectory = new QCPCurve(customPlot->xAxis2, customPlot->yAxis);
// generate the curve data points:
const int pointCount = mModelPtr->points().size();
QVector<QCPCurveData> datatrajectory(pointCount);
QVector<QVector<double>> data = createDataMap(mModelPtr->points());
for (int i = 0; i < pointCount ; ++i)
{
datatrajectory[i] = QCPCurveData(i, data.at(0).at(i), data.at(1).at(i));
}
trajectory->data()->set(datatrajectory, true);
setTheme(customPlot, false);

How do I find my mouse point in a scene using SceneKit?

I have set up a scene in SceneKit and have issued a hit-test to select an item. However, I want to be able to move that item along a plane in my scene. I continue to receive mouse drag events, but don't know how to transform those 2D coordinates into a 3D coordinate in the scene.
My case is very simple. The camera is located at 0, 0, 50 and pointed at 0, 0, 0. I just want to drag my object along the z-plane with a z-value of 0.
The hit-test works like a charm, but how do I translate the mouse point from a drag event into a new position in the scene for the 3D object I am dragging?
You don't need to use invisible geometry — Scene Kit can do all the coordinate conversions you need without having to hit test invisible objects. Basically you need to do the same thing you would in a 2D drawing app for moving an object: find the offset between the mouseDown: location and the object position, then for each mouseMoved:, add that offset to the new mouse location to set the object's new position.
Here's an approach you could use...
Hit-test the initial click location as you're already doing. This gets you an SCNHitTestResult object identifying the node you want to move, right?
Check the worldCoordinates property of that hit test result. If the node you want to move is a child of the scene's rootNode, this is the vector you want for finding the offset. (Otherwise you'll need to convert it to the coordinate system of the parent of the node you want to move — see convertPosition:toNode: or convertPosition:fromNode:.)
You're going to need a reference depth for this point so you can compare mouseMoved: locations to it. Use projectPoint: to convert the vector you got in step 2 (a point in the 3D scene) back to screen space — this gets you a 3D vector whose x- and y-coordinates are a screen-space point and whose z-coordinate tells you the depth of that point relative to the clipping planes (0.0 is on the near plane, 1.0 is on the far plane). Hold onto this z-coordinate for use during mouseMoved:.
Subtract the mouse location vector you got in step 2 from the position of the node you want to move. This gets you the offset of the object's position from the mouse click (so that in step 6 you can simply add it to the new mouse location). Hold onto this vector — you'll need it until dragging ends.
On mouseMoved:, construct a new 3D vector from the screen coordinates of the new mouse location and the depth value you got in step 3. Then, convert this vector into scene coordinates using unprojectPoint: — this is the mouse location in your scene's 3D space (equivalent to the one you got from the hit test, but without needing to "hit" scene geometry).
Add the offset you got in step 4 to the new location you got in step 5; this is the new position to move the node to. (Note: for live dragging to look right, you should make sure this position change isn't animated. By default the duration of the current SCNTransaction is zero, so you don't need to worry about this unless you've changed it already.)
(This is sort of off the top of my head, so you should probably double-check the relevant docs and headers. And you might be able to simplify this a bit with some math.)
As an experiment I implemented Mr Bishop's helpful answer. The drag doesn't quite work (the object - a chess piece - jumps off screen) because of differences in the coordinate magnitudes between the mouse click and the 3-D world. I've inserted log outputs here and there among the code.
I asked on the Apple forums if anyone knew the secret sauce to homogenize the coordinates but didn't get a decisive answer. One thing, I had made some experimental changes to Mr Bishop's method and the forum members advised me to return to his technique.
Despite my code's failings, I thought someone might find it a useful starting point. I suspect there are only one or two small problems with the code.
Note that the log of the world transform matrix of the object (chess piece) is not part of the process but one Apple forum member advised me that the matrix often offers a useful 'sanity check' - which indeed it did.
- (NSPoint)
viewPointForEvent: (NSEvent *) event_
{
NSPoint windowPoint = [event_ locationInWindow];
NSPoint viewPoint = [self.view convertPoint: windowPoint
fromView: nil];
return viewPoint;
}
- (SCNHitTestResult *)
hitTestResultForEvent: (NSEvent *) event_
{
NSPoint viewPoint = [self viewPointForEvent: event_];
CGPoint cgPoint = CGPointMake (viewPoint.x, viewPoint.y);
NSArray * points = [(SCNView *) self.view hitTest: cgPoint
options: @{}];
return points.firstObject;
}
- (void)
mouseDown: (NSEvent *) theEvent
{
SCNHitTestResult * result = [self hitTestResultForEvent: theEvent];
SCNVector3 clickWorldCoordinates = result.worldCoordinates;
// log output: clickWorldCoordinates x 208.124578, y -12827.223365, z 3163.659073
SCNVector3 screenCoordinates = [(SCNView *) self.view projectPoint: clickWorldCoordinates];
// log output: screenCoordinates x 245.128906, y 149.335938, z 0.985565
// save the z coordinate for use in mouseDragged
mouseDownClickOnObjectZCoordinate = screenCoordinates.z;
selectedPiece = result.node; // save selected piece for use in mouseDragged
SCNVector3 piecePosition = selectedPiece.position;
// log output: piecePosition x -18.200000, y 6.483060, z 2.350000
offsetOfMouseClickFromPiece.x = clickWorldCoordinates.x - piecePosition.x;
offsetOfMouseClickFromPiece.y = clickWorldCoordinates.y - piecePosition.y;
offsetOfMouseClickFromPiece.z = clickWorldCoordinates.z - piecePosition.z;
// log output: offsetOfMouseClickFromPiece x 226.324578, y -12833.706425, z 3161.309073
}
- (void)
mouseDragged: (NSEvent *) theEvent;
{
NSPoint viewClickPoint = [self viewPointForEvent: theEvent];
SCNVector3 clickCoordinates;
clickCoordinates.x = viewClickPoint.x;
clickCoordinates.y = viewClickPoint.y;
clickCoordinates.z = mouseDownClickOnObjectZCoordinate;
// log output: clickCoordinates x 246.128906, y 0.000000, z 0.985565
// log output: pieceWorldTransform:
// m11 = 242.15889219510001, m12 = -0.000045609300002524833, m13 = -0.00000721691076126, m14 = 0,
// m21 = 0.0000072168760805499971, m22 = -0.000039452697396149999, m23 = 242.15890446329999, m24 = 0,
// m31 = -0.000045609300002524833, m32 = -242.15889219510001, m33 = -0.000039452676995750002, m34 = 0,
// m41 = -4268.2349924762348, m42 = -12724.050221935429, m43 = 4852.6652710104272, m44 = 1)
SCNVector3 newPiecePosition;
newPiecePosition.x = offsetOfMouseClickFromPiece.x + clickCoordinates.x;
newPiecePosition.y = offsetOfMouseClickFromPiece.y + clickCoordinates.y;
newPiecePosition.z = offsetOfMouseClickFromPiece.z + clickCoordinates.z;
// log output: newPiecePosition x 472.453484, y -12833.706425, z 3162.294639
selectedPiece.position = newPiecePosition;
}
I used the code written by Steve and, with a little modification, it worked for me.
On mouseDown I save clickWorldCoordinates in a property called startClickWorldCoordinates.
On mouseDragged I calculate the selectedPiece position in this way:
SCNVector3 worldClickCoordinate = [(SCNView *) self.view unprojectPoint:clickCoordinates];
newPiecePosition.x = selectedPiece.position.x + worldClickCoordinate.x - startClickWorldCoordinates.x;
newPiecePosition.y = selectedPiece.position.y + worldClickCoordinate.y - startClickWorldCoordinates.y;
newPiecePosition.z = selectedPiece.position.z + worldClickCoordinate.z - startClickWorldCoordinates.z;
selectedPiece.position = newPiecePosition;
startClickWorldCoordinates = worldClickCoordinate;

Why isn't my pictureBox moving (Visual C++)?

I am in the early stages of making a basic two-body problem program (for those of you that don't know, a 'two-body problem' is when you have two bodies in space being gravitationally attracted to each other). I have it set up so that on each timer tick the objects (which are pictureBoxes) move in accordance with the direction (in degrees) inputted into a textbox.
Once a couple of if statements make sure that the values in the textboxes are valid, the following runs inside a Button_Press event (the button starts the simulation):
this->SimTick->Enabled=true; //Master timer for simulation
radians1=(int::Parse(DirectionBox1->Text))*(3.14/180); //Converts the degrees entered for the first object into radians for use in trig functions
radians2=(int::Parse(DirectionBox2->Text))*(3.14/180); //Converts the degrees entered for the second object into radians for use in trig functions
Inside the timer_tick event:
this->Object1->Location.X+=(int::Parse(VelocityBox1->Text)*cos(radians1));
this->Object1->Location.Y+=(int::Parse(VelocityBox1->Text)*sin(radians1));
this->StartStop->Text=(radians1.ToString()); //This is just here to check that the math was correct, which it is
I haven't coded C++ in a while, so this might be a really simple mistake, but does anyone have any ideas, or need any more code pasted?
Location is a value type (a System::Drawing::Point), so writing this->Object1->Location.X += ... only modifies a temporary copy and the control never moves; you have to build a new Point and assign it back to the Location property. Besides updating Location properly, you probably want to keep temporary x and y values as floats so that you can accumulate fractional movement.
typedef struct { float x, y ; } floatcoord ;
floatcoord tmpLocation ; // make this a class member so it persists between ticks
tmpLocation.x += int::Parse(VelocityBox1->Text)*cos(radians1) ;
tmpLocation.y += int::Parse(VelocityBox1->Text)*sin(radians1) ;
this->Object1->Location = System::Drawing::Point( (int) floorf( tmpLocation.x), (int) floorf( tmpLocation.y) ); // assign a whole new Point to Location
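To make the whole flow concrete, here is a minimal sketch of how it could be wired together in C++/CLI WinForms. This is my own illustration, not the poster's code: it assumes the control names from the question (Object1, SimTick, VelocityBox1, DirectionBox1, StartStop), the usual using namespace System; from the designer-generated form, and hypothetical float members posX/posY that accumulate the fractional position:
// Hypothetical members inside the form class
float posX, posY;
double radians1;

// Button handler: parse the direction, remember the starting position, start the timer
System::Void StartStop_Click(System::Object^ sender, System::EventArgs^ e) {
    radians1 = int::Parse(DirectionBox1->Text) * (Math::PI / 180.0);
    posX = (float)Object1->Location.X;   // start from the current position
    posY = (float)Object1->Location.Y;
    SimTick->Enabled = true;
}

// Timer tick: accumulate fractional movement in floats, then assign a whole new Point
System::Void SimTick_Tick(System::Object^ sender, System::EventArgs^ e) {
    float v = (float)int::Parse(VelocityBox1->Text);
    posX += v * (float)Math::Cos(radians1);
    posY += v * (float)Math::Sin(radians1);
    Object1->Location = System::Drawing::Point((int)posX, (int)posY);
}
The key line is the last one: because Location returns a copy of a Point value, you build a new Point and assign it back rather than modifying Location.X in place.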

Best way to get the nearest intersection point in a grid

I'm using Cocos2D for iPhone to build a game. I have a grid on the screen drawn with horizontal and vertical lines (I did it with CCDrawNode). As you might guess, there are lots of intersection points there, i.e. the points where horizontal and vertical lines intersect. With every touchBegan-Moved-Ended routine I draw a line, a bolder line in a different color. In the touchesMoved method I need to find the intersection point nearest to the current end point of the line and stick the end of the line to that point. How can I do that? One idea I have in mind is to add all the intersection points to an array when drawing the grid, then iterate through that array and find the closest one. But I don't think this is the best approach. Do you have any better ideas?
Assuming it is a normal grid with evenly spaced lines (e.g. every 10 pixels apart), you are much better off using a formula to tell you where an intersection should be.
E.g. given an end point x/y of 17,23: x(17) / x-spacing(10) = 1.7, which rounds to 2, and 2 * x-spacing = 20. Likewise, y(23) / y-spacing(10) = 2.3, which rounds to 2, and 2 * y-spacing = 20. Thus your nearest intersection is (20, 20).
EDIT: here's a more detailed example, in C# as that's what I use; if I get time I'll write an Objective-C sample.
// defined somewhere and used to draw the grid
private int _spacingX = 10;
private int _spacingY = 10;
public Point GetNearestIntersection(int x, int y)
{
// round off to the nearest vertical/horizontal line number
double tempX = Math.Round((double)x / _spacingX);
double tempY = Math.Round((double)y / _spacingY);
// convert back to pixels
int nearestX = (int)tempX * _spacingX;
int nearestY = (int)tempY * _spacingY;
return new Point(nearestX, nearestY);
}
NOTE: the code above is left quite verbose to help you understand it; you could easily rewrite it to be cleaner.
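Since the question is about a Cocos2D (Objective-C) project and the sample above is C#, here is a rough equivalent of mine written as a plain C++ function that can live in an Objective-C++ file. GridPoint is just a stand-in for whatever point type you actually use (e.g. CGPoint), and the spacing parameters are whatever values you drew the grid with:
#include <cmath>

struct GridPoint { float x; float y; };

GridPoint nearestIntersection(float x, float y, float spacingX, float spacingY)
{
    // round to the nearest line index, then convert back to pixel coordinates
    const float nearestX = std::round(x / spacingX) * spacingX;
    const float nearestY = std::round(y / spacingY) * spacingY;
    return GridPoint{ nearestX, nearestY };
}
In touchesMoved you would call this with the current touch location and use the returned point as the end of the line, with no need to store or search an array of intersections.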