Is there a UISegmentedControl for cocos2d? I know I can use UISegmentedControl with cocos2d, but I can only attach it to the GL view, and I need to be able to attach it to specific layers.
I am trying to build a Level Editor for my engine and I was wondering how I can achieve multiple viewport windows inside one window, like in Blender, Cinema 4D or Unity, where you have your rendering viewport, scene hierarchy, properties window, etc.
Does the Win32 API have a function to create these viewport windows, or do I have to create another window with CreateWindowW and no title bar?
You could conceivably do this with a single window, but this sort of thing is usually much easier to achieve using a child window (yes, created via CreateWindow(Ex)) for each view and then a parent window that handles positioning those child windows (that is, a splitter-type frame).
You may even end up with a window tree that is separate from the level view, e.g. for a properties list.
It is simply much easier for each child window to only have to handle one thing (show an overhead level view, show a 3D projection, etc.) than to make one window class that does all of these.
There is no native notion of a "viewport" in Win32.
To support this kind of functionality at all, to create even a single viewport, you will need to know how to create a custom control. In Win32 "custom controls" are really just custom child windows. Say you have a custom child window class called "view" that handles rendering using a 3D library in its WM_PAINT handler, etc., then to support multiple viewports you fundamentally have two options:
Make "view" implement the functionality itself. Multiple viewports would not be separate Win32 windows. There would be one Win32 child control painted to look as though it was multiple windows. You would then need to handle all the internal UI interactions you offer the user 100% yourself. Dragging the view splitter bar, etc. The benefit would be that you could then make those interactions however you want, possibly totally nonstandard, and also performance while dragging and performing other interactions would probably be better than the alternative.
Use separate "view" child windows for each viewport. Handle UI interactions via other custom child controls, possibly, e.g. a view splitter control, etc.
Without more focus to the question, that is about as much of an answer as can be given. The key thing to understand is that Win32 is a powerful but low-level API. If you are looking for an application framework that gives you a lot of functionality for free, you should look somewhere else.
I have the following setup for the game:
launcher.exe - starts under Steam on Windows and provides some settings UI for the user.
Then launcher.exe starts the actual game.exe.
The problem is that launcher.exe uses a hardware-accelerated UI (Direct2D/DirectX).
This page https://partner.steamgames.com/doc/features/overlay states:
Your game does not need to do anything special for the overlay to work, it automatically hooks into any game launched from Steam!
But in my case that creates problems: the overlay is created on the wrong window. So launcher.exe (which uses DirectX) gets the overlay, but the window created by game.exe (the real game, which uses DirectX and/or OpenGL) does not.
And the question is: how can I modify the code of my launcher.exe window to prevent the Steam overlay from being created on it "automatically"?
Update, response from Valve's TS:
Sorry, there's no code in place to selectively enable or disable the overlay between launchers and games!
The only "option" is to disable DirectX drawing in the launcher.exe. In this case their injected DLL will not create that thing. But that effectively means no GPU accelerated UI drawing under the Steam... Kind of "640kb is enough for everybody" type of design.
Ideally Steam should send some custom message to the window to ask how and where the window wants that overlay to be rendered. But apparently there is no such thing, or is it?
Just for the context, the launcher looks as this:
I'm looking to make a sprite animation editor. I have the loading of my custom animation file done, but now I need to get the actual UI started. I'm really just stuck on which widgets I would use to actually play my animation. I need to be able to go to a certain frame, play, pause, loop, etc. Once I'm done with the viewing portion, I plan on adding the editing.
I've seen AnimatedSprite in the Qt docs, but that seems to only allow playback of sprites from a single file. In my situation sprites can come from multiple image files and sometimes don't follow a grid the way a sprite cutter would expect.
First of all, you should decide whether you want to use QML or Widgets. AnimatedSprite is a QML-related class; all widget-related classes start with the letter "Q".
If you decide to use Qt Widgets, I would recommend taking a look at the Qt Animation Framework in combination with the Qt Graphics View Framework. Most likely it will not let you do everything you want out of the box, but it should provide you with a rich set of useful tools.
If you need them, here are some examples.
Hope it helps.
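As an illustration (not from the linked examples), here is a minimal Graphics View sketch that steps a QGraphicsPixmapItem through pre-cut frames with a QTimeLine; the frames vector and the one-second loop duration are placeholders:

```cpp
// Minimal sketch: a QGraphicsPixmapItem in a QGraphicsScene, stepped through
// a set of already-cut frames by a QTimeLine.
#include <QApplication>
#include <QGraphicsPixmapItem>
#include <QGraphicsScene>
#include <QGraphicsView>
#include <QPixmap>
#include <QTimeLine>
#include <QVector>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QVector<QPixmap> frames;              // TODO: fill with your cut-out frames

    QGraphicsScene scene;
    QGraphicsPixmapItem *item =
        scene.addPixmap(frames.isEmpty() ? QPixmap() : frames.first());

    QTimeLine timeLine(1000);             // one second per loop (placeholder)
    timeLine.setFrameRange(0, frames.size() - 1);
    timeLine.setLoopCount(0);             // 0 == loop forever
    QObject::connect(&timeLine, &QTimeLine::frameChanged, [&](int frame) {
        if (frame >= 0 && frame < frames.size())
            item->setPixmap(frames[frame]);
    });
    timeLine.start();                     // setPaused(true)/stop() to pause

    QGraphicsView view(&scene);
    view.show();
    return app.exec();
}
```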
Have a look at QMovie. This class may provide all the methods you need, as long as you only want to use it for viewing. The QMovie can be passed to a QLabel to show the animation.
QMovie, however, supports only GIF out of the box (and there is a third-party plugin for APNG files). You would probably have to create your own image format plugin to support your format.
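For the viewing-only case, and assuming a format QMovie can actually read, a minimal sketch of the QMovie/QLabel route; "animation.gif" is a placeholder file name, and jumpToFrame()/setPaused() cover the go-to-frame and pause requirements:

```cpp
// Minimal sketch: play an animation file in a QLabel via QMovie.
#include <QApplication>
#include <QLabel>
#include <QMovie>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QMovie movie("animation.gif");        // placeholder animation file
    QLabel label;
    label.setMovie(&movie);
    label.show();

    movie.start();                        // play
    movie.jumpToFrame(0);                 // go to a specific frame
    // movie.setPaused(true);             // pause
    // movie.setSpeed(200);               // 200% playback speed

    return app.exec();
}
```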
If that's not applicable or too complicated, you will most likely have to create your own custom widget. Have a look at the painter example. Playing an animation is not that hard if you have all the frames: a simple QTimer that changes the image to be drawn at a constant rate should work.
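A minimal sketch of such a custom widget, assuming the frames have already been extracted from the source images into a QVector<QPixmap>; FramePlayer and its default frame rate are made-up names/values:

```cpp
// Minimal sketch: a widget that repaints itself with the next frame each time
// a QTimer fires, giving play/pause/go-to-frame/loop for pre-cut frames.
#include <QPainter>
#include <QPaintEvent>
#include <QPixmap>
#include <QTimer>
#include <QVector>
#include <QWidget>
#include <utility>

class FramePlayer : public QWidget
{
public:
    explicit FramePlayer(QVector<QPixmap> frames, int fps = 12,
                         QWidget *parent = nullptr)
        : QWidget(parent), m_frames(std::move(frames))
    {
        m_timer.setInterval(1000 / fps);
        connect(&m_timer, &QTimer::timeout,
                this, [this] { gotoFrame(m_current + 1); });   // wraps -> loops
    }

    void play()  { m_timer.start(); }
    void pause() { m_timer.stop(); }
    void gotoFrame(int i)
    {
        if (m_frames.isEmpty())
            return;
        m_current = i % m_frames.size();
        update();                                   // schedule a repaint
    }

protected:
    void paintEvent(QPaintEvent *) override
    {
        QPainter p(this);
        if (!m_frames.isEmpty())
            p.drawPixmap(rect(), m_frames[m_current]);   // scaled to the widget
    }

private:
    QVector<QPixmap> m_frames;
    QTimer m_timer;
    int m_current = 0;
};
```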
I am coding a game with cocos2d-x in C++.
In my game scene I will place some instances of my class CircleSprite (which is an extension of Layer in which I create multiple items and add them as children of CircleSprite).
In my scene the user should touch one circle and connect it to another one by moving the finger until another circle is reached. While doing this, a line (a sprite or a drawn line, it doesn't matter) should appear and follow the finger until the chosen circle is reached.
I'm new to cocos2d programming and I'm not a C++ expert... I don't know how to manage the events.
Check this official tutorial
http://www.cocos2d-x.org/wiki/User_Tutorials-Dragging_a_Sprite_Around_the_Screen
Also check the version of cocos2d-x for which this tutorial was written. If yours is a lower version than v2.3, then you just need to override the onTouchesBegan, onTouchesMoved and onTouchesEnded functions of the layer; they are already registered with the touch events.
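For reference, a minimal sketch using the cocos2d-x v3.x touch listener API (class and member names such as ConnectLayer and _drawNode are placeholders); it draws a rubber-band line from the touch start point to the current finger position:

```cpp
// Minimal sketch: a layer that follows the finger with a line using DrawNode
// and a one-by-one touch listener (cocos2d-x v3.x API).
#include "cocos2d.h"
USING_NS_CC;

class ConnectLayer : public Layer
{
public:
    CREATE_FUNC(ConnectLayer);

    virtual bool init() override
    {
        if (!Layer::init())
            return false;

        _drawNode = DrawNode::create();
        addChild(_drawNode);

        auto listener = EventListenerTouchOneByOne::create();
        listener->onTouchBegan = [this](Touch* touch, Event*) {
            _start = touch->getLocation();        // e.g. the first circle's position
            return true;                          // claim the touch
        };
        listener->onTouchMoved = [this](Touch* touch, Event*) {
            _drawNode->clear();                   // redraw the rubber-band line
            _drawNode->drawLine(_start, touch->getLocation(), Color4F::WHITE);
        };
        listener->onTouchEnded = [this](Touch* touch, Event*) {
            // Here you would test whether touch->getLocation() lies inside
            // another CircleSprite and, if so, keep the connection.
        };
        _eventDispatcher->addEventListenerWithSceneGraphPriority(listener, this);
        return true;
    }

private:
    DrawNode* _drawNode = nullptr;
    Vec2 _start;
};
```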
I've been looking around for a while for how to produce buttons using Direct2D and DirectWrite, with no luck. I'm comfortable with shapes, text and all that jazz. However, it suddenly occurred to me that I might be looking at it in the wrong way.
Take the sentence:
you draw your controls and content for your app using the Direct2D and DirectWrite APIs, handling all the input events directly.
I'm now thinking this means that, instead of being able to quickly produce a fully functional button as I would using XAML, I would draw the button, manually check the location of the mouse on click and whether it's within the button boundaries, and then handle the event? And a similar method for hovering, without the click.
Is this the kind of method required when using Direct2D and DirectWrite?
I haven't any experience with DirectX, but in OpenGL I build my buttons from scratch. Assuming you have animated sprites implemented, your buttons are essentially sprites that play certain animations in response to being clicked, hovered over, etc., and which you can register callbacks with. In my 2D engine, I have a class called UiButton, which inherits Sprite, and listens for various UI events. It gets more complicated when you want to handle keyboard navigation (arrow keys + enter to select) as you have to think about how the buttons are connected and which of them has focus at any given moment.
Here is my implementation for reference:
Headers: https://github.com/RobJinman/dodge/tree/master/Dodge/include/dodge/ui
Source: https://github.com/RobJinman/dodge/tree/master/Dodge/src/ui
If you're not prepared to roll your own, Googling "direct2d gui framework" seems to bring up some promising results.
Sorry I can't be of more help.
Yes, to draw a UI button with Direct2D you need to handle everything yourself. Why? Direct2D is a 2D graphics API, not a controls library. You need to draw the layout of your button and handle its messages (such as click, mouse hover, ...). You lose a lot of convenience and it's time-consuming, but the most important thing is: you can control it all yourself!
Direct2D is a graphics library. UI controls such as text selection, text boxes, and buttons are not part of it. However, the benefit of using Direct2D and DirectWrite is that we can implement our own UI controls and have full control over them.
Please also see ID2D1Geometry::FillContainsPoint() for the hit-testing task.
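For illustration, a minimal sketch of that approach (render target, brush and handler names such as g_renderTarget and OnButtonClicked are placeholders, and their creation is omitted): the "button" is just a rectangle drawn in WM_PAINT, and the click is detected by a manual hit test in WM_LBUTTONDOWN.

```cpp
// Minimal sketch: draw the button yourself with Direct2D and hit-test the
// mouse yourself. Factory/render-target/brush creation happens elsewhere.
#include <windows.h>
#include <windowsx.h>
#include <d2d1.h>

ID2D1HwndRenderTarget* g_renderTarget = nullptr;   // created elsewhere
ID2D1SolidColorBrush*  g_brush        = nullptr;   // created elsewhere
D2D1_RECT_F g_buttonRect = D2D1::RectF(20.f, 20.f, 140.f, 60.f);

void OnButtonClicked() { /* hypothetical click handler */ }

bool HitTest(const D2D1_RECT_F& r, float x, float y)
{
    return x >= r.left && x <= r.right && y >= r.top && y <= r.bottom;
}

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_PAINT:
        if (g_renderTarget)
        {
            g_renderTarget->BeginDraw();
            g_renderTarget->Clear(D2D1::ColorF(D2D1::ColorF::White));
            g_renderTarget->FillRectangle(g_buttonRect, g_brush);  // the "button"
            g_renderTarget->EndDraw();
        }
        ValidateRect(hwnd, nullptr);
        return 0;

    case WM_LBUTTONDOWN:
    {
        // Mouse position in client pixels; this sketch assumes 96 DPI so that
        // pixels and DIPs coincide.
        float x = static_cast<float>(GET_X_LPARAM(lParam));
        float y = static_cast<float>(GET_Y_LPARAM(lParam));
        if (HitTest(g_buttonRect, x, y))
            OnButtonClicked();
        return 0;
    }
    }
    return DefWindowProcW(hwnd, msg, wParam, lParam);
}
```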