Is there any way I can get the robot view out of Choregraphe? Read body for more specifics - Pepper

My main goal is to create a website that, when you open it, shows the same view as the Robot view in Choregraphe (virtual robot and everything), and lets you send a behavior to the robot so that the robot performs it. But first I am trying to figure out whether I can get the robot view by itself - imagine opening Choregraphe and getting only the robot view (a view of the robot). Let me know if anyone has dealt with this, thanks.

No, the robot view is handled by a native program that draws the view on your PC screen. Apart from capturing the screen, it is not designed to be redirected to a web page.
However, NAO and Pepper robots are compatible with ROS, and you can set up web views showing any robot running ROS. I found this online course that could teach you exactly how to do that.

Related

Why does SwiftUI code run/reload on iPhone and not on iPad?

I have been looking at this code and other similar code for the last week and have a question.
The app (code link below) is a simple five-question quiz app in SwiftUI. In an iPhone 11 simulator the quiz runs perfectly, and when you hit the back navigation link and press the Start Quiz button, the quiz runs again. On an iPad, however, the Start Quiz button does not relaunch the quiz.
I figured that since both are running the same iOS they should behave the same, but they do not. I tried it on my physical iPad with the same result: it runs through the quiz once and will not run a second time. Anybody have any idea why? Thanks for your help!
https://github.com/albypanz94/Quiz-Game-in-SwiftUI.git
The issue is that SwiftUI's navigation model is, by default, based on a split view (UISplitViewController-style), and on the iPad the "Start Quiz" navigation link hands navigation off to the secondary controller. On an iPhone, the secondary controller is collapsed away.
What this means for the game is that on the iPad (and on large iPhones in landscape mode), the game cannot restart.
You would also see anomalies if you ran the game on a large-format iPhone and rotated it midway through the game.
You can change the navigation view style so it does not use the split-view model by adding:
.navigationViewStyle(StackNavigationViewStyle())
after the NavigationView's closing brace.
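For example, here is a minimal sketch of where the modifier goes (ContentView and QuizView are illustrative names, not taken from the linked repo):

import SwiftUI

struct ContentView: View {
    var body: some View {
        NavigationView {
            // Hypothetical start screen; QuizView stands in for the quiz flow.
            NavigationLink("Start Quiz", destination: QuizView())
        }
        // Forces stack-based navigation on iPad, so the link pushes a new screen
        // instead of targeting the split view's secondary column.
        .navigationViewStyle(StackNavigationViewStyle())
    }
}

struct QuizView: View {
    var body: some View {
        Text("Question 1")
    }
}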

C++ WinRT - which part of the gamepad is GameControllerSwitchPosition() for?

I am using the RawGameController API, and going by the article below:
https://learn.microsoft.com/en-us/uwp/api/windows.gaming.input.gamecontrollerswitchposition
which part of the Xbox controller does GameControllerSwitchPosition correspond to?
I have tested an Xbox controller and retrieved the current button and axis states, but the switch reading always comes back as GameControllerSwitchPosition::Center.
Is there any game controller or gamepad that actually has this "SwitchPosition" feature?
Any info or links showing the "switch" mechanism in a picture would be greatly appreciated.
Please advise.
I noticed that GameControllerSwitchPosition is actually referring to the "Point of View Hat" shown in the controller properties panel, as in the picture below:
[Image: controller properties panel showing the switch reported as the Point of View Hat]
The above is true for the PS4 DualShock 4 and some clone controllers.
However, it is not the case for the Xbox One controller, where the hat positions are reported as part of the button section instead.
Hope this helps.
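As a quick way to check this on your own hardware, here is a minimal, hypothetical C++/WinRT console sketch (not taken from the question's code) that takes one reading from the first RawGameController and prints its switch values; on a DualShock 4 the D-pad / POV hat should move the value away from Center:

#include <winrt/Windows.Foundation.Collections.h>
#include <winrt/Windows.Gaming.Input.h>
#include <iostream>

using namespace winrt;
using namespace winrt::Windows::Gaming::Input;

int main()
{
    init_apartment();

    auto controllers = RawGameController::RawGameControllers();
    if (controllers.Size() == 0)
    {
        std::cout << "No raw game controller found\n";
        return 0;
    }
    RawGameController pad = controllers.GetAt(0);

    // Buffers sized to whatever the controller reports.
    com_array<bool> buttons(static_cast<uint32_t>(pad.ButtonCount()));
    com_array<GameControllerSwitchPosition> switches(static_cast<uint32_t>(pad.SwitchCount()));
    com_array<double> axes(static_cast<uint32_t>(pad.AxisCount()));

    // One snapshot; a real program would poll this in its input loop.
    pad.GetCurrentReading(buttons, switches, axes);

    for (uint32_t i = 0; i < switches.size(); ++i)
    {
        // 0 corresponds to GameControllerSwitchPosition::Center.
        std::cout << "switch " << i << " = " << static_cast<int>(switches[i]) << '\n';
    }
}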

How to make an Android app that is overlaid over the home screen

I intend to create an app that will be overlaid on top of the normal "home" Android screen.
I have only seen this done before with the Facebook Messenger app: it puts a floating, draggable icon on the home screen whenever I get a message, and if touched it opens on top of everything else.
I would like to make an app that can show animated graphics similarly floating around on top of the home screen, with user interaction possible.
My most preferred way would be with Adobe AIR. :-) But that might be too much to ask. :-)
My second choice would be to do it through Visual Studio 2015 C++, i.e. the Android NDK. Is that possible, or is the standard Java SDK the only way to go?
Bonus P.S. I haven't said anything about iOS because I assume it is not possible at all on that platform. Or is it?

Popping consecutive view controllers and returning to main view controller (using navigation controller)

I'm following the Firebase chat messenger example in the "Let's Build That App" YouTube videos, and it works fine.
However, I'm testing the integration inside a test application:
My test app has a menu with buttons, and one of them is for the chat, which takes you to a similar interface (login menu and so on - everything beyond it is similar to the example in the tutorial, but you don't need to check it to answer my question).
Main menu button => Login/Register interface => Chat interface
I can't find a way to dismiss the chat interface and return to the main menu of the app; dismiss always returns to the login/register interface and sometimes causes errors. Could you suggest a good solution for this?
tl;dr: How do I dismiss two or more views and return to the main view (main menu) of the app?
P.S.: I'm new to Swift and still struggling with some basic elements, so sorry if the question seems too simple.
Use either popToRootViewController(animated:) to pop to the root view controller, or popToViewController(_:animated:) and provide the specific controller you'd like to pop to.
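For example, here is a minimal sketch assuming the chat screen was pushed onto a UINavigationController whose root is the main menu (ChatViewController is an illustrative name, not from the tutorial):

import UIKit

class ChatViewController: UIViewController {
    @objc func backToMainMenuTapped() {
        // Pops every pushed controller (chat, login/register, ...) and
        // lands back on the navigation stack's root - the main menu.
        navigationController?.popToRootViewController(animated: true)
    }
}

If the main menu is not the root, find it in navigationController?.viewControllers and pass it to popToViewController(_:animated:) instead.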

How to simulate a mouse click without interfering with actual mouse in Python

I've made a simple click bot to automatically play an Android game on my Windows PC. It currently detects when certain things change on the screen, then moves the mouse and clicks the correct button.
Currently I am using the following win32api functions to achieve this:
import win32api, win32con

win32api.SetCursorPos(Position)  # Position is an (x, y) tuple
win32api.mouse_event(win32con.MOUSEEVENTF_LEFTDOWN, 0, 0)
win32api.mouse_event(win32con.MOUSEEVENTF_LEFTUP, 0, 0)
These work great; however, when I use the bot it takes over my computer's mouse and I basically have to let it run. Is there any way I could simulate a click without it actually using my mouse, or could I isolate the bot on one of my other screens and work freely on the other?
There is a library specifically for dealing with user-interaction peripherals: pyautogui.
Here is short, easy-to-follow documentation for performing/simulating mouse click DOWN and UP events:
https://pyautogui.readthedocs.org/en/latest/mouse.html#the-mousedown-and-mouseup-functions
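A minimal sketch of those calls (coordinates are placeholders; note that pyautogui drives the real system cursor, just like win32api):

import pyautogui

# Move to the target and press/release the left mouse button.
pyautogui.moveTo(100, 200)
pyautogui.mouseDown()
pyautogui.mouseUp()

# Or as a single call:
pyautogui.click(x=100, y=200)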