Pepper robot app development - Follow me as long as you see me

I would like to make the Pepper robot follow me for as long as it recognizes me, until I touch its left hand.
I made Pepper recognize me with the Learn Face Choregraphe box, and I can detect a touch on the left hand with the Tactile L.Hand Choregraphe box. The problem I have is with the Move Toward box: it simply reports failure and stops.
Does anyone know how to make this app work? Thanks.

You probably want to use ALTracker, specifically the "Move" or "Navigate" modes, which will make Pepper try to move toward its target; in this case you should track "People" (identified by a ppid, a People Perception ID).
(So if you have been doing this in Choregraphe so far, you will need to define a custom box.)
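For reference, a rough Python sketch of what the custom box's logic could wrap around ALTracker (a sketch only: the IP/port are placeholders, inside a Choregraphe box you would normally create the proxy without an address, and the people id would come from ALPeoplePerception, e.g. the PeoplePerception/PeopleDetected event your face-recognition flow already reacts to):

```python
# Sketch only: the address and the people id below are placeholders.
from naoqi import ALProxy

PEPPER_IP = "192.168.1.10"   # placeholder: your robot's address
PEPPER_PORT = 9559

tracker = ALProxy("ALTracker", PEPPER_IP, PEPPER_PORT)

def start_following(people_id):
    # people_id is the ppid reported by ALPeoplePerception
    tracker.registerTarget("People", [people_id])
    tracker.setMode("Move")      # or "Navigate"; lets the base actually move
    tracker.track("People")      # start following that person

def stop_following():
    # call this when the Tactile L.Hand box fires
    tracker.stopTracker()
    tracker.unregisterAllTargets()
```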

Related

Communicating with a Blackstar ID Core Guitar Amplifier through its USB Port using C++

I currently own a Blackstar ID Core 10w amp. It has a lot of built-in guitar effects such as reverb, delay and modulation, all of which have various depths and levels. By connecting a USB cable from the amp to my computer, I'm able to use Blackstar's Insider software, which lets me save these effect settings and switch to any of them with just a double click. However, the need for a double click makes it impossible to play the guitar and change effects during a song (which is what a pedal is for).
However, I wanted to know if it's possible to use C++ to do something more ambitious than the manufacturer allows: I want to create software that plays a backing track of a song (voice + drums, but no guitar) and lets the user set where during that song the effects should change, and to what. That way, one could play a song from start to end without having to worry about changing effects.
This would also be a school project, so it can't really be a "mouse manager" or anything of that sort. It needs to be something more robust.
FYI, as far as I know, Blackstar does not give us any API we could work with. So I'd like to know if this project is even possible and, if so, where I should start.
Thank you!
This existing project should provide clues for reverse engineering what Insider does so you can reimplement it in C++:
https://github.com/jonathanunderwood/outsider
I feel your pain regarding Blackstar's awful Insider software.
To answer your question of whether this project is even possible: of course it is. The Insider software is obviously able to control the amp via USB; you just have to figure out what its protocol is.
You can use a USB sniffer to see what commands Insider sends to the amp when you perform actions. With enough experimentation you should be able to reverse-engineer the protocol.
It's probably easier than you think. As evidence I offer the Insider software itself, which is not very sophisticated. The settings are probably modeled more or less as a struct.
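For what it's worth, here is a minimal, hypothetical sketch of the replay side in Python with pyusb (the same idea maps directly onto libusb in C++, which is what you would likely use in the end). Every ID and byte below is a placeholder to be filled in from lsusb and from what the sniffer shows Insider sending:

```python
# Hypothetical sketch: replay one command captured from Insider.
# Requires pyusb; all IDs and bytes are placeholders from your own capture.
import usb.core

VENDOR_ID = 0x0000    # placeholder: Blackstar's vendor ID (see lsusb)
PRODUCT_ID = 0x0000   # placeholder: the ID Core's product ID
OUT_ENDPOINT = 0x01   # placeholder: the OUT endpoint seen in the capture

dev = usb.core.find(idVendor=VENDOR_ID, idProduct=PRODUCT_ID)
if dev is None:
    raise RuntimeError("Amp not found - check the IDs with lsusb")

# On Linux the kernel may have claimed the interface already.
if dev.is_kernel_driver_active(0):
    dev.detach_kernel_driver(0)
dev.set_configuration()

# The bytes of a captured "switch preset" packet would go here, verbatim.
captured_packet = bytes([0x00, 0x00, 0x00])  # placeholder payload

dev.write(OUT_ENDPOINT, captured_packet)
```

Once you can replay single commands reliably, scheduling them against a backing track is just a timing problem in your own code.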

Use a Live Card as a Shortcut to my application

I have an application that my users would frequently have to access quickly and then dismiss.
I want to allow the user to pin my app as a live card, but with only the overhead of a low-frequency card. Essentially I want a bookmark/shortcut that lets the user launch my activity, use it, and then quit normally.
When I use high-frequency cards, they make my application hang and freeze. Is there a better way to accomplish this?
Also, is this advisable from a UX/UI design standpoint?
Without more context on what your application does it's hard to know, but I'm hesitant to say this is a good way to go about it.
If it's just a pure shortcut, it's probably a bad idea. Glass already has the "ok glass" menu paradigm, which is supported by both touch and voice input, while a card would only be supported by touch. Further, a bunch of live cards as shortcuts probably doesn't scale well if the user has a number of applications that do the same thing.
On the other hand, if your application shows information on the live card that the user would want to see on a regular basis (think Google Now cards, etc.), then this might be a good idea depending on how it's executed.
Again, it's hard to know without more context. Glass is a different paradigm than phones or desktops, so be careful when importing concepts from those other interaction paradigms.

OpenGL in live cards?

I've been playing with the Glass GDK and Glass 'native' (Java) development in general. I've got an OpenGL app that works well enough on Glass (using standard Android conventions), and I'm looking to port it to the GDK to take advantage of voice triggers and whatnot.
While I can certainly use it easily enough as an Immersion (I think, anyway), what I'd like to do is use it as a Live Card. I'm just not sure how possible or practical this is. The documentation seems to imply that this should be possible with a high-frequency rendering card, but before I dive in I was hoping someone with more experience could weigh in.
(Forgive me if this is obvious -- I'm new to Android, having spent the last few years in iOS/Obj-C land.)
XE16 supports OpenGL in live cards. Use the class GlRenderer: https://developers.google.com/glass/develop/gdk/reference/com/google/android/glass/timeline/GlRenderer
I would look at your app and determine whether you want more user input or not, and whether you want it to live in a specific part of your Timeline or just be launched when the user wants it.
Specifically, since Live Cards live in the Timeline, they will not be able to capture the swipe-backward or swipe-forward gestures, since those navigate the Timeline. See the "When to use Live Cards" section of: https://developers.google.com/glass/develop/gdk/ui/index
If you use an Immersion, however, you will be able to use the swipe-backward and swipe-forward gestures as well as these others: https://developers.google.com/glass/develop/gdk/input/touch This gives you complete control over the UI and touchpad, with the exception that swipe down should exit your Immersion.
The downside is that once the user exits your Immersion, they will need to start it again, most likely with a Voice Trigger, whereas a Live Card can live on as part of your Timeline.
You should be able to do your rendering either in a Surface, which a LiveCard can use, or in whatever View you choose to put in your Activity, which is what an Immersion is. GLSurfaceView, for example, may be what you need, and it internally uses a Surface: http://developer.android.com/guide/topics/graphics/opengl.html Note that you'll want to avoid RemoteViews, but I think you already figured that out.

Google Glass bone conduction transducer as input device?

Does Google Glass's bone conduction transducer also work as an input device, and if so, how can one access its readings?
EDIT: Let me clarify why I ask.
According to Catwig's teardown, "it appears to double as a tactile switch." It's hard to tell from the pictures, so I was wondering how sensitive the switch is and whether it could be used to detect vibrations in the skull. If so, it could be used to improve voice command accuracy by identifying which sounds originate from the wearer.
tl;dr: The speaker on Glass is not a button. Do not press it.
The speaker may appear to double as a tactile switch, but it does not. Depressing it like a button makes a clicking sound, but it does not generate a signal and it's not designed to be pressed like a button.

Kivy: How to make the Kivy Pong game stop after the maximum points are reached?

I found the Pong game made in Kivy and am trying to develop it further, just for my own learning purposes. But it is difficult to find information about Kivy that could help me. For example, at the moment I am trying to make the game stop after one of the players reaches a defined maximum of points. I have no code for this, since I have no idea how to do it. Can somebody point me in the right direction? Is there any source that explains in simple language how the Kivy stuff works? I find the official Kivy tutorials too "professional".
As this is for learning purposes, I won't give a solution, more like directions.
You want something to happen when the score reaches some number of points. Kivy has the concept of properties, and you can see that the scores are stored in NumericProperties. Properties have the nice advantage that you can bind to them, that is, have actions happen when they change (see http://kivy.org/docs/guide/events.html#introduction-to-properties). Or you can simply check the score after it changes (each time a point is scored).
Once you have your event, you can change the game state: either quit (any way to crash the program would do, but it's always better to quit nicely) or reset the scores to 0 (by simply changing the property values). You could even add a widget (maybe a Popup?) that asks the player to start a new game, and only start serving balls again once the user has confirmed. You can add conditions to the serving of balls and change the game's inner workings to achieve such things; just experiment.
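To make the binding idea concrete, here is a minimal sketch, assuming (as in the official Pong tutorial) that each paddle keeps its score in a NumericProperty; MAX_SCORE and the end-of-game behaviour are placeholders for whatever you decide to do:

```python
# Minimal sketch: react to score changes via property binding.
# MAX_SCORE and what happens at game over are placeholders.
from kivy.app import App
from kivy.uix.widget import Widget
from kivy.properties import NumericProperty

MAX_SCORE = 10  # placeholder: the winning score

class PongPaddle(Widget):
    score = NumericProperty(0)

class PongGame(Widget):
    def watch_scores(self, player1, player2):
        # bound callbacks receive (instance, new_value) whenever score changes
        player1.bind(score=self.check_game_over)
        player2.bind(score=self.check_game_over)

    def check_game_over(self, paddle, new_score):
        if new_score >= MAX_SCORE:
            # end the game however you like: reset the scores, show a Popup,
            # or simply close the app as done here.
            App.get_running_app().stop()
```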
Hope this helps.