I want to create something like a plugin or driver for the mouse and extend the right-button mouse menu with movements (gestures). To start with, I want to implement the following feature:
1. The user presses the right button and, while the button is held down, moves the mouse up.
2. A simple image with the gesture scheme is shown: an arrow pointing up and the text "close".
3. When the button is released, the current window is closed.
Something similar existed in the old Opera browser and it was very convenient.
I tried to find this on Google, but I only found how to do it for WPF or inside a browser. I want to make it global for the whole system, with different configs per application: for example, a mouse-up gesture in Explorer would close the window, while in Visual Studio it would fire "Run" (F5).
I want to create this using C#; I also have basic knowledge of C++.
I will be grateful for any help.
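A rough sketch of the core mechanism, assuming a Win32 low-level mouse hook (WH_MOUSE_LL); the same SetWindowsHookEx call can be P/Invoked from C#. The names and the 100-pixel threshold are illustrative, and closing the foreground window stands in for the per-application configuration:

#include <windows.h>

static POINT g_startPt  = { 0, 0 };
static bool  g_tracking = false;

LRESULT CALLBACK MouseProc(int nCode, WPARAM wParam, LPARAM lParam)
{
    if (nCode == HC_ACTION) {
        const MSLLHOOKSTRUCT* m = (const MSLLHOOKSTRUCT*)lParam;
        if (wParam == WM_RBUTTONDOWN) {
            g_startPt  = m->pt;                 // remember where the gesture started
            g_tracking = true;
        } else if (wParam == WM_RBUTTONUP && g_tracking) {
            g_tracking = false;
            if (g_startPt.y - m->pt.y > 100)    // moved up by more than ~100 px
                PostMessage(GetForegroundWindow(), WM_CLOSE, 0, 0);
            // returning a nonzero value here would also suppress the normal context menu
        }
    }
    return CallNextHookEx(NULL, nCode, wParam, lParam);
}

int main()
{
    HHOOK hook = SetWindowsHookEx(WH_MOUSE_LL, MouseProc, GetModuleHandle(NULL), 0);
    MSG msg;                                    // a low-level hook needs a message loop
    while (GetMessage(&msg, NULL, 0, 0) > 0) {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }
    UnhookWindowsHookEx(hook);
    return 0;
}

Per-application behaviour (Explorer closes, Visual Studio gets F5) could then be chosen by inspecting the foreground window's process before deciding what to post; that part is not shown here.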
Related
I have a Qt application with a TreeView into which items can be dragged, both from outside and within the TreeView. The scroll wheel can scroll the view up and down (when it is collapsed enough to show the scroll bar) in all cases except while dragging in an item from the desktop.
Could anyone point me in the right direction, or tell me what questions I should be asking?
Is the problem maybe related to the application not being in focus anymore? Key presses (and mouse-wheel scrolling) are usually sent to the application which is active/in focus.
I once did a private project that reacted to key presses at the OS (Windows) level so it could be controlled while the user was playing a computer game.
I don't have enough reputation to make a comment; I hope this might help you.
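Separate from the focus question: one way to attack this inside the Qt application itself is to take over scrolling during the drag, since wheel events are often not delivered while the modal drag loop is running. A minimal sketch (Qt 5 style API; the 32-pixel margin is arbitrary):

#include <QTreeView>
#include <QDragMoveEvent>
#include <QScrollBar>

class DragScrollTreeView : public QTreeView {
public:
    using QTreeView::QTreeView;

protected:
    void dragMoveEvent(QDragMoveEvent *event) override {
        const int margin = 32;                    // edge zone that triggers scrolling
        const int y = event->pos().y();           // drag position in viewport coordinates
        QScrollBar *bar = verticalScrollBar();
        if (y < margin)
            bar->setValue(bar->value() - bar->singleStep());           // scroll up
        else if (y > viewport()->height() - margin)
            bar->setValue(bar->value() + bar->singleStep());           // scroll down
        QTreeView::dragMoveEvent(event);          // keep the default drop handling
    }
};

QAbstractItemView also has setAutoScroll() / setAutoScrollMargin(), which do something very similar out of the box, so it may be worth checking whether autoScroll is simply disabled on that view.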
I am monitoring the mouse/keyboard/touch-screen interactions of my Taskbar icons from my Win32 app in order to implement custom behaviours for the icon.
What Win32 API could be used instead of GetCursorPos() to get the x, y where the touch screen was tapped (from another process)?
It seems GetCursorPos() only works for the mouse cursor, not when finger events occur.
I'm afraid you can't get touch information from another process, since the touch input handle is valid only within the current process and should not be passed cross-process.
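Within your own process, though, the Windows 8+ pointer messages do carry the finger position. A rough window-procedure sketch (assuming WM_POINTERDOWN reaches your window):

#include <windows.h>

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_POINTERDOWN) {                          // touch, pen or mouse pointer went down
        POINTER_INFO pi = {};
        if (GetPointerInfo(GET_POINTERID_WPARAM(wParam), &pi)) {
            POINT pt = pi.ptPixelLocation;                // position in screen coordinates
            ScreenToClient(hwnd, &pt);                    // convert if client coordinates are needed
        }
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}

For another process's windows (such as the taskbar) the limitation above still applies.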
On Mac there is a window flag/call, WindowTitleHidden plus [nativeWindow setTitlebarAppearsTransparent:YES], which basically makes the title bar "embedded" inside the window itself instead of creating a frame that "holds" the window: the minimize, maximize and close buttons end up on the same row as the app's widgets.
Is there something similar for Windows?
Code: https://github.com/alexandernst/TrueFramelessWindow
AFAIK, you must draw them yourself and then reply to the WM_NCHITTEST message that Windows sends to your form to query where the mouse is positioned. That way you can tell Windows that the mouse is located over, say, a maximize button, although Windows itself didn't draw one there.
The painting can be done with the help of VisualStyleRenderer or ControlPaint.
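In plain Win32 terms (the WinForms equivalent is overriding the form's WndProc), the hit-test reply described above can look roughly like this; the button rectangle and the 24-pixel caption height are made-up values for illustration:

#include <windows.h>
#include <windowsx.h>

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_NCHITTEST) {
        POINT pt = { GET_X_LPARAM(lParam), GET_Y_LPARAM(lParam) };   // screen coordinates
        ScreenToClient(hwnd, &pt);

        RECT closeRect = { 0, 0, 30, 24 };        // wherever you painted your close button
        if (PtInRect(&closeRect, pt))
            return HTCLOSE;                       // Windows now treats that spot as the close button

        if (pt.y < 24)
            return HTCAPTION;                     // the rest of the top strip drags the window
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}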
I hope these questions will point you further:
Winforms: Add a close "x" button in a UserControl
Winforms - WM_NCHITEST message for click on control
I am new to cocos2d-x and I am developing a game in Xcode using cocos2d-x 2.0.4. In my game I created a button using CCControlButton. Now I want to drag my button from one place to another. I tried all the CCControlEvents but it doesn't work. So I want to know: is it possible to drag and drop a button using CCControlButton? I have pasted the code I used to create the button.
button1 = CCControlButton::create(CCScale9Sprite::create("7.png"));
button1->setAdjustBackgroundImage(false);
button1->setPosition( ccp(winwsize/6, winhsize/7) );
button1->addTargetWithActionForControlEvents(this, cccontrol_selector(plus::add), CCControlEventTouchDragOutside);
button1->addTargetWithActionForControlEvents(this, cccontrol_selector(plus::add), CCControlEventTouchDragInside);
this->addChild(button1, 4);
In add() I have the code that enters the next scene. But right now it enters the scene when the button is clicked, whereas I want that to happen when the button is dragged from one position to another. If it is possible to drag a button using CCControlButton, please give me some sample code. Thanks.
The events CCControlEventTouchDragInside and CCControlEventTouchDragOutside fire when the user's touch (their fingertip) enters or leaves your button while it is already or still touching the touchscreen (or while the mouse button is still held down as the pointer moves).
You would need to observe the dragging process yourself: starting with a touch on your button, change the button's position while the user is dragging (to visualize the dragging), and then call plus::add() when the dragging ends in your target area.
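A rough cocos2d-x 2.x sketch of that idea, using the layer's own touch callbacks rather than the control events. m_dragging and m_targetArea are assumed member variables, this->setTouchEnabled(true) must be called in init(), and you may need button1->setTouchEnabled(false) so the button does not swallow the touches:

void plus::ccTouchesBegan(CCSet* pTouches, CCEvent* pEvent)
{
    CCTouch* touch = (CCTouch*)pTouches->anyObject();
    // start dragging only if the touch landed on the button
    m_dragging = button1->boundingBox().containsPoint(touch->getLocation());
}

void plus::ccTouchesMoved(CCSet* pTouches, CCEvent* pEvent)
{
    if (!m_dragging) return;
    CCTouch* touch = (CCTouch*)pTouches->anyObject();
    button1->setPosition(touch->getLocation());      // make the button follow the finger
}

void plus::ccTouchesEnded(CCSet* pTouches, CCEvent* pEvent)
{
    if (!m_dragging) return;
    m_dragging = false;
    CCTouch* touch = (CCTouch*)pTouches->anyObject();
    if (m_targetArea.containsPoint(touch->getLocation()))
        this->add(NULL, 0);                           // drag ended in the target area: go to the next scene
}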
My editor has a few editing modes. I can choose a specific mode using buttons placed on a toolbar, and I want to indicate which mode is currently active: when I press the appropriate button, I want the clicked button to remain pushed. How do I do that in WinAPI? My toolbar uses bitmaps for icons, if that's relevant.
There used to be a way to get something like the look and feel of a toolbar button by using a normal check box with the BS_PUSHLIKE style set, but that got broken a bit with Windows XP because of mouse-hover effects, so it's not widely used any more.
If you want to create your own toolbar, without the help of MFC, there is an MSDN article that covers the creation and management of a toolbar window (actually a dedicated window class as part of the Common Controls Library).
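Not spelled out above, but worth knowing when you get to that article: toolbar buttons created with the TBSTYLE_CHECKGROUP style stay pushed when clicked and are mutually exclusive within their group, which maps directly onto "one active editing mode". A rough fragment (the command IDs are placeholders):

#include <windows.h>
#include <commctrl.h>

#define ID_MODE_SELECT 101   // placeholder command IDs
#define ID_MODE_DRAW   102

void AddModeButtons(HWND hToolbar)
{
    TBBUTTON buttons[2] = {};
    buttons[0].iBitmap   = 0;                        // index into the toolbar's bitmap list
    buttons[0].idCommand = ID_MODE_SELECT;
    buttons[0].fsState   = TBSTATE_ENABLED;
    buttons[0].fsStyle   = TBSTYLE_CHECKGROUP;       // stays pressed, exclusive within the group
    buttons[1].iBitmap   = 1;
    buttons[1].idCommand = ID_MODE_DRAW;
    buttons[1].fsState   = TBSTATE_ENABLED;
    buttons[1].fsStyle   = TBSTYLE_CHECKGROUP;

    SendMessage(hToolbar, TB_BUTTONSTRUCTSIZE, sizeof(TBBUTTON), 0);
    SendMessage(hToolbar, TB_ADDBUTTONS, 2, (LPARAM)buttons);

    // press a button from code, e.g. to restore the last used mode at startup
    SendMessage(hToolbar, TB_CHECKBUTTON, ID_MODE_SELECT, MAKELPARAM(TRUE, 0));
}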