I'm drawing text using QPainter on a QImage, and then saving it to TIFF.
I need to increase the DPI to 300, which should make the text bigger in terms of pixels (for the same point size).
You can try using QImage::setDotsPerMeterY() and QImage::setDotsPerMeterX(). DPI means "dots per inch". 1 inch equals 0.0254 meters. So you should be able to convert to dots per meter (dpm):
int dpm = 300 / 0.0254; // ~300 DPI
image.setDotsPerMeterX(dpm);
image.setDotsPerMeterY(dpm);
It's not going to be exactly 300 DPI (it's actually 299.9994), since the functions only accept integral values. But for all intents and purposes it's good enough (299.9994 vs. 300 is quite close, I'd say).
There are about 39.37 inches in a meter. So:
Setting:
qimage.setDotsPerMeterX(xdpi * 39.37);
qimage.setDotsPerMeterY(ydpi * 39.37);
Getting:
xdpi = qimage.dotsPerMeterX() / 39.37;
ydpi = qimage.dotsPerMeterY() / 39.37;
So I'm playing around with creating a simple game engine in C++. I needed to render some text, so I used this tutorial (http://learnopengl.com/#!In-Practice/Text-Rendering) for guidance. It uses the FreeType 2 library.
Everything works great and the text renders as it should. But now that I'm fleshing out the UI and creating labels, I would like to be able to change the size of the text. I can do so by scaling the text, but I would prefer to be able to do so using pixels.
Here you can see the scaling in action:
GLfloat xpos = x + ch.Bearing.x * scale;
GLfloat ypos = y + linegap + (font.Characters['H'].Bearing.y - ch.Bearing.y) * scale;
GLfloat w = ch.Size.x * scale;
GLfloat h = ch.Size.y * scale;
So in my method renderText I just pass a scale variable and it scales the text. But I would prefer to use pixels, as that is more user-friendly. Is there any way I could do this in FreeType 2, or am I stuck with a scale variable?
Assuming you don't want to regenerate the glyphs at a different resolution, but instead want to specify scale as a unit of pixels instead of a ratio (i.e. you want to say scale = 14 pixels instead of scale = 29%), then you can do the following: Save the height value you passed to FT_Set_Pixel_Sizes (which is 48 in the tutorial). Now if you want a 14-pixel render, just divide 14 by that number (48), so it would be scale = 14.0f / 48.0f. That will give you the scaling needed to render at a 14-pixel scale from a font that was originally generated with a 48-pixel height.
You might want to play with your OpenGL texture filters or mipmapping as well when you do this to improve your results. Additionally, fonts sometimes have low-resolution pixel hinting, which helps them be rendered clearly even at low resolutions; unfortunately this hinting information is lost/not used when you generate a high res texture and then scale it down to a smaller render size, so it might not look as clear as you desire.
I want to make a figure whose marker size depends on the size of the figure. That way, using square markers, no matter what resolution or figure size you choose, all the markers will touch each other, masking the background without overlapping. Here is where I am at:
The marker size is specified in pt^2, with 1 pt = 1/72 inch, the resolution in pixels per inch, and the figure size in pixels (along with the fraction of the figure that the main subplot occupies: 0.8). So, if my graph's limits are lim_min and lim_max, I should be able to get the corresponding marker size using:
marker_size=((fig_size*0.8*72/Resolution)/(lim_max-lim_min))**2
because (fig_size*0.8*72/Resolution) is the size of the subplot in points, and (lim_max-lim_min) is the number of markers I want to fill a line.
And that should do the trick!... Well, it doesn't... At all... The markers are so small they are invisible without zooming in, and I don't get why.
I understand this may not be the best way, or the way you would do it, but I see no reason why it wouldn't work, so I want to understand where I am wrong.
PS : both my main figure and my subplot are squares
Edit :
Okay, so I found the reason for the problem, though not the solution. The problem is the confusion between ppi and dpi. Matplotlib sets the resolution in dpi, which is defined as a unit specific to scanners or printers, depending on the model (?!?).
Needless to say, I am extremely confused about the actual meaning of resolution in matplotlib. It simply makes absolutely no sense to me. Please, someone help. How do I convert this to a meaningful unit? It seems that the matplotlib website is completely silent on the matter.
If you specify the figure size in inches and matplotlib uses a resolution of 72 points per inch (ppi), then for a given number of markers the width of each marker should be size_in_inches * points_per_inch / number_of_markers points (assuming for now that the subplot uses the entire figure). As I see it, dpi is only used to display or save the figure at a size of size_in_inches * dpi pixels.
If I understand your goal correctly, the code below should reproduce the required behavior:
import numpy as np
import pylab as pl

# Figure settings
fig_size_inch = 3
fig_ppi = 72
margin = 0.12
subplot_fraction = 1 - 2*margin

# Plot settings
lim_max = 10
lim_min = 2
n_markers = lim_max - lim_min

# Centers of each marker
xy = np.arange(lim_min + 0.5, lim_max, 1)

# Size of the marker, in points^2
marker_size = (subplot_fraction * fig_size_inch * fig_ppi / n_markers)**2

fig = pl.figure(figsize=(fig_size_inch, fig_size_inch))
fig.subplots_adjust(margin, margin, 1-margin, 1-margin, 0, 0)

# Create n_markers^2 colors
cc = pl.cm.Paired(np.linspace(0, 1, n_markers*n_markers))

# Plot each marker (I could/should have left out the loops...)
for i in range(n_markers):
    for j in range(n_markers):
        ij = i + j*n_markers
        pl.scatter(xy[i], xy[j], s=marker_size, marker='s', color=cc[ij])

pl.xlim(lim_min, lim_max)
pl.ylim(lim_min, lim_max)
This is more or less the same as you wrote (in the calculation of marker_size), except the division by Resolution has been left out.
Result:
Or when setting fig_ppi incorrectly to 60:
I want to capture the coordinate of components in my MFC program.
Now I can perfectly complete this by using GetWindowRect.
However, when I set my Windows DPI scaling to 150% (120 DPI), I get different coordinates from GetWindowRect.
Therefore, I looked for a method to convert the new coordinates to those at the default DPI (96 DPI).
Finally, I found there was some error when I tried:
Rect.top = Rect.top * (DEFAULT_DPIY / CURRENT_DPIY);
Rect.left = Rect.left * (DEFAULT_DPIX / CURRENT_DPIX);
The converted value is very close, but not equal.
Is there any method to convert it without error ?
Your program is subject to DPI virtualization. The right way to deal with this is to make your program high DPI aware but that may well involve more changes than you are prepared to attempt.
If being high DPI aware is not something you wish to tackle, then you can at least make your arithmetic better. Your code uses integer divide. But that's going to be inaccurate. In order to minimise that inaccuracy you should perform the division after the multiplication:
Rect.top = (Rect.top * DEFAULT_DPIY) / CURRENT_DPIY;
Rect.left = (Rect.left * DEFAULT_DPIX) / CURRENT_DPIX;
Of course, the parentheses could be omitted here without changing the meaning, but I think it's nice to be explicit about the ordering of the operations in this case.
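With the sample values from the question (96 and 120 DPI), the difference is stark in pure integer arithmetic, because dividing first truncates 96/120 all the way to zero:

```cpp
// Scale a coordinate from the current DPI back to the default 96 DPI.
int scaleDivideFirst(int coord, int defaultDpi, int currentDpi) {
    return coord * (defaultDpi / currentDpi);  // wrong: 96 / 120 truncates to 0
}

int scaleMultiplyFirst(int coord, int defaultDpi, int currentDpi) {
    return (coord * defaultDpi) / currentDpi;  // right: the intermediate stays exact
}
```

For a coordinate of 250, dividing first yields 0, while multiplying first yields the expected 200 (250 * 96 / 120).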
Another option would be to use MulDiv:
Rect.top = MulDiv(Rect.top, DEFAULT_DPIY, CURRENT_DPIY);
Rect.left = MulDiv(Rect.left, DEFAULT_DPIX, CURRENT_DPIX);
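MulDiv additionally widens the product to 64 bits (avoiding overflow) and rounds the quotient to the nearest integer rather than truncating. A portable sketch of that behavior for positive inputs (not the actual Win32 implementation):

```cpp
#include <cstdint>

// Rough equivalent of Win32's MulDiv for positive inputs: widen to 64 bits,
// then round the quotient to the nearest integer.
int mulDivSketch(int number, int numerator, int denominator) {
    std::int64_t product = static_cast<std::int64_t>(number) * numerator;
    return static_cast<int>((product + denominator / 2) / denominator);
}
```

For example, mulDivSketch(7, 96, 120) returns 6 (7*96/120 = 5.6, rounded), whereas plain truncating division would give 5.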
I've been using HORZRES and VERTRES to print various strings. I had been using XPS to test my printing, but when I switched over to my actual computer I noticed that things weren't printing the same.
How do you get the size of the actual page and print from there?
For example, if I was printing on letter paper (8 1/2 x 11 inches), how could I get a universal measurement that could be used for any printer?
You can use SetMapMode to change the mapping mode.
If you set the mapping mode to, say, MM_LOENGLISH then all drawing will be in units of 1/100 inch. A line drawn with length 100 will then be one inch long on any printer, and you don't need to worry about the printer's resolution.
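The conversion GDI performs under MM_LOENGLISH can be sketched as follows (the function name is illustrative; the DPI value would come from GetDeviceCaps on the real device):

```cpp
// Under MM_LOENGLISH one logical unit is 1/100 inch; on a device with the
// given DPI, a logical length maps to roughly this many device pixels.
int loEnglishToDevicePixels(int logicalUnits, int dpi) {
    return logicalUnits * dpi / 100;
}
```

So a line of logical length 100 covers 600 pixels on a 600 DPI printer and 300 pixels on a 300 DPI printer, but measures one physical inch on both.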
If you want further information about the page you can get other data from GetDeviceCaps:
LOGPIXELSX - horizontal pixels per inch
LOGPIXELSY - vertical pixels per inch
PHYSICALWIDTH - width of the page in device units
PHYSICALHEIGHT - height of the page in device units
Then the width of the page in inches is PHYSICALWIDTH / LOGPIXELSX and the height is PHYSICALHEIGHT / LOGPIXELSY.
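As a quick sanity check with hypothetical values a 600 DPI letter-size printer might report (actual numbers vary by driver):

```cpp
// Page dimension in inches from GetDeviceCaps-style values:
// PHYSICALWIDTH / LOGPIXELSX (or the HEIGHT / Y pair).
double pageSizeInches(int physicalDeviceUnits, int logPixelsPerInch) {
    return static_cast<double>(physicalDeviceUnits) / logPixelsPerInch;
}
```

With PHYSICALWIDTH = 5100 and LOGPIXELSX = 600, the width comes out to 8.5 inches; 6600 / 600 likewise gives 11 inches.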
I am developing an android game in cocos2d. How many different sizes of images do I need to support android devices and tablets?
I have never used that engine, but if by image size you mean device screen size, then you should use a scale.
I took as a base the biggest size I could, 1280x800, the one on my tablet, just to be more precise on tablets too.
I apply the scale in (X, Y) to every image size and to every operation where the screen or screen size is involved, e.g.:
soldierBitmapX.move(movement*scaleX)
soldierBitmapY.move(movement*scaleY)
scaleX and scaleY represent your scale, and movement represents how many pixels your soldier will move.
This is an example to help you understand how to apply the scale. I don't recommend moving your sprites with this exact operation, but keep in mind where you should apply the scale.
You can apply this to every possible screen and your game will fit exactly on all of them. Beware of, for example, QVGA screens, which are more "squared" in comparison with other standards and very small.
EDIT (how to get the scale):
_xMultiplier = (_screenWidth/(1280.0f/100.0f))/100.0f;
_yMultiplier = (_screenHeight/(800.0f/100.0f))/100.0f;
matrix.setScale(_xMultiplier, _yMultiplier);
This is an example of the scale applied to the matrix that we'll use.
Through the ScaleX and ScaleY properties you can easily scale the images. For example, if you take the tablet size as 1280x800, you can scale that sprite and use it; you can also use that image for smaller resolutions, e.g. 320x480.