How to read the temperature from sensors on the motherboard? - c++

I'm trying to get the temperature of the processor.
I have already tried WQL (the WMI class MSAcpi_ThermalZoneTemperature), but apparently it is not implemented on all platforms yet. On most machines the query simply fails, returning one error or another through the HRESULT return value; the temperature itself is never returned.
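For reference, even on machines where the class is implemented, the CurrentTemperature property is reported in tenths of kelvin rather than degrees Celsius, so a successful read still needs a conversion along these lines (a minimal sketch with a hard-coded example value standing in for the WMI result):
#include <cstdio>
int main()
{
    // MSAcpi_ThermalZoneTemperature::CurrentTemperature is in tenths of kelvin,
    // so a raw reading of 3032 corresponds to 30.05 degrees Celsius.
    unsigned long tenthsKelvin = 3032;              // example raw value
    double celsius = tenthsKelvin / 10.0 - 273.15;  // convert to degrees Celsius
    printf("Thermal zone temperature: %.2f C\n", celsius);
    return 0;
}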
The idea is to read this temperature directly from the I/O port. I found a library (NTPort) that provides the functionality of the outp and inp functions and managed to initiate the connection with it; the question, however, is which port to read in order to get the data. The Super I/O chip is an IT8728F. SpeedFan (an application that is able to read the temperature) says in its logs that it reads the data from port 0x290. However, when reading from that port, the data that comes back does not look like a temperature (it always returns 29).
So the next thing I tried was reading from every port in turn, checking whether anything that came back looked like a temperature. However, every port returned data that was either too low or too high to be a real temperature.
int CPU_TEMP;                      // actually the register index being swept, not a temperature
Outp(INDEX, BANK_SET);             // select the register bank
Outp(DATA, Inp(DATA) | 0x01);      // set bit 0 of the selected bank register
for (CPU_TEMP = 0x0; CPU_TEMP < 0x999; CPU_TEMP++)
{
    Outp(INDEX, CPU_TEMP);                    // point the index port at the next register
    printf("CPU temp: %iC\n", Inp(DATA));     // dump whatever the data port returns
}

Maybe your processor really is a stable 29 degrees Celsius; that is a perfectly plausible value for a processor.
Also: is that 0x00000029 or decimal 29? That makes a big difference (0x29 is 41 in decimal).
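If the raw readings really are wrong, it may be because on ITE Super I/O chips such as the IT8728F the byte at the base port is not the temperature itself: the Environment Controller is typically accessed through an index/data register pair at base+5 and base+6, with the temperature sensors at indices 0x29-0x2B (this layout is assumed here from the Linux it87 driver, not verified against the IT8728F datasheet). A hedged sketch using the same Outp/Inp calls from the question:
// Sketch only: register layout borrowed from the Linux it87 driver, not verified on this board.
#include <cstdio>
// Declarations matching the NTPort-style calls used in the question.
extern int Inp(unsigned short port);
extern void Outp(unsigned short port, int value);

const unsigned short EC_BASE = 0x290;        // Environment Controller base reported by SpeedFan
const unsigned short EC_ADDR = EC_BASE + 5;  // index (address) register
const unsigned short EC_DATA = EC_BASE + 6;  // data register

int ReadEcRegister(unsigned char index)
{
    Outp(EC_ADDR, index);   // select the EC register
    return Inp(EC_DATA);    // read its value
}

int main()
{
    // Temperature sensors 1-3 are typically at EC registers 0x29, 0x2A and 0x2B.
    for (unsigned char reg = 0x29; reg <= 0x2B; ++reg)
        printf("Sensor at EC register 0x%02X: %d C\n", reg, ReadEcRegister(reg));
    return 0;
}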

Related

Qt Serial Communication not sending all data

I am writing a Qt application for serial communication with a Qorvo MDEK-1001. All of the built-in serial commands I've had to use work fine except for one: aurs n k, where n and k are integers corresponding to the desired rate of data transmission (e.g. "aurs 1 1\r"). The write function is:
void MainWindow::serialWrite(const QByteArray &command)
{
    if (mdek->isOpen())
    {
        mdek->write(command);
        qDebug() << "Command: " << command;
        //mdek->flush();
    }
}
If I send the command "aurs 1 1\r", it doesn't actually get sent to the device until I send another command, for some reason. So if I subsequently send the "quit" command, the data returned from the device is "aurs 1quit", which registers as an unknown command. Any assistance getting this command to send properly is appreciated.
I've tried a bunch of things (passing the number of bytes to write as the second parameter of write(), using QDataStream, appending individual hex bytes onto a QByteArray and writing that), but nothing has worked. This is the first time I've had to use Qt's serial communication classes, so I've probably missed something obvious.
This is on Manjaro Linux (the same thing happens on Windows 8.1).
Connection settings: 8 data bits, baud 115200, no flow control, no parity, one stop bit.
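For what it's worth, QSerialPort::write() only queues the bytes; they are normally pushed out once control returns to the event loop, which could explain the command leaving together with the next one. Here is a hedged sketch of the same write function with the transmission forced explicitly (the commented-out flush() would have a similar effect); the 100 ms timeout is an arbitrary choice:
void MainWindow::serialWrite(const QByteArray &command)
{
    if (!mdek->isOpen())
        return;

    mdek->write(command);
    qDebug() << "Command:" << command;

    // Don't wait for the event loop: block briefly until the bytes have
    // actually been written to the port (or report why they weren't).
    if (!mdek->waitForBytesWritten(100))
        qDebug() << "Write timed out:" << mdek->errorString();
}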

get the binary data transferred from grpc client

I am new to the gRPC framework, and I have created a sample client and server on my PC (referring to this).
In my client-server application I have implemented a simple RPC:
service NameStudent {
    rpc GetRoll(RollNo) returns (Details) {}
}
The client sends a RollNo and receives his/her details, which are name, age, gender, parent name, and roll no.
message RollNo {
    int32 roll = 1;
}
message Details {
    string name = 1;
    string gender = 2;
    int32 age = 3;
    string parent = 4;
    RollNo rollid = 5;
}
The actual server and client code is an adaptation of the sample code explained here.
Now my server is able to listen on "0.0.0.0:50051" (address:port), and the client is able to send a roll no to "localhost:50051" and receive the details.
I want to see the actual binary data that is transferred between client and server. I have tried using Wireshark, but I don't understand what I am seeing there.
Here is a screenshot of the Wireshark capture.
And here are the details of the highlighted entry from the above screenshot.
I need help understanding Wireshark here, or any other way that can be used to see the binary data.
Wireshark uses the port to determine how to decode the communication, and it doesn't know of any protocol associated with port 50051, so you need to configure it to treat this traffic as HTTP.
Right click on a row and select "Decode As..." in the context menu.
Then set "Current" to "HTTP" or "HTTP2" (HTTP will generally auto-detect HTTP2) and hit "OK".
The HTTP/2 frames should then be decoded, and if you are using a recent version of Wireshark, you may also see the gRPC frames decoded.
The whole idea of gRPC is to HIDE that. Let's say we ignore that and assume you know what you're doing.
Look at https://en.wikipedia.org/wiki/Protocol_Buffers. gRPC uses Protocol Buffers for its data representation, which might give you a hint about the data you're seeing.
Two good starting points for a reverse engineering exercise are:
Start simple: compile a program that sends an integer. Understand it. Sniff it. Then compile a program that sends a string. Try several values. Once you understand it, move on to tackling the problem of understanding how Google sends your structure.
Use known data and make small variations: knowing what 505249... means is easier if you know the data you're sending (for example, send the string "Hello world", then change it to "Hella world" and see what changes in the captured bytes; also check that sending the same data several times produces the same sniffed output). Apply the previous point: start simple, first an empty string, then " ", then "a", then "b", and so on, and only then move on to complex and larger strings. Don't be afraid to start simple.
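One more way to connect the dots, sketched here with the C++ classes generated from the question's .proto (the header name student.pb.h is an assumption): serialize the same message locally and hex-dump it, then look for those bytes inside the HTTP/2 DATA frames of the capture. Keep in mind that on the wire gRPC prepends a 5-byte prefix (a compression flag byte plus a 4-byte length) to each serialized message.
// Hex-dump a locally serialized Details message so it can be compared
// against the payload bytes shown in Wireshark.
#include <cstdio>
#include <string>
#include "student.pb.h"   // assumed name of the protoc-generated header

int main()
{
    Details d;
    d.set_name("Alice");
    d.set_gender("F");
    d.set_age(12);
    d.set_parent("Bob");
    d.mutable_rollid()->set_roll(42);

    std::string bytes;
    d.SerializeToString(&bytes);

    for (unsigned char c : bytes)
        printf("%02x ", c);
    printf("\n(%zu bytes of protobuf payload)\n", bytes.size());
    return 0;
}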

Updating yaw with msg_request_data_stream and MavlinkObserver

I am working on a project in which I am controlling a quadcopter from an Android phone. I have noticed that using the DroneListener to listen for ATTITUDE_UPDATED messages does not update the yaw nearly as fast as I need it to. I would like to get the quadcopter's yaw updated frequently, in (more or less) real time, the way the Mission Planner application does when you plug the quadcopter in via USB.
I have tried using msg_request_data_stream (essentially what Mission Planner uses, but in C#) to request various data streams from the drone, but it has not worked. My assumption is that the response to this request (i.e., the stream) would be received by any registered MavlinkObserver, but the custom MavlinkObserver I have added to my drone has not had its onMavlinkMessageReceived(...) method called a single time. It has not even received heartbeat messages, which (based on my understanding) should be arriving regularly and frequently.
Is there any help you could offer? Is there a better way to get the quadcopter's yaw than waiting for the DroneListener to be notified of events? I know it should be possible with MAVLink (Mission Planner does it), but I haven't had any success with MAVLink messages or with a MavlinkObserver.
Here is my code for requesting a data stream:
msg_request_data_stream message = new msg_request_data_stream();
// Not sure how doubles cast to byte, so cast to an int first (the value is
// always going to be an int anyway; getValue just returns doubles for parameters by default).
message.target_system = (byte) ((int) params.getParameter("SYSID_THISMAV").getValue());
message.req_message_rate = (short) 1;   // requested rate in Hz
message.req_stream_id = (byte) 0;       // 0 = MAV_DATA_STREAM_ALL
message.start_stop = (byte) 1;          // 1 = start sending the stream
// Wrap the message and send it!
ExperimentalApi.sendMavlinkMessage(this, new MavlinkMessageWrapper(message));

Android usb host input bulktransfer fails to read randomly when data available

The following code runs inside a thread and reads input data coming over USB. Approximately every 80 readings it misses one of the packets coming from an STM32 board. The board is programmed to send a data packet to the Android tablet every second.
// Non-working code
while (running) {
    int resp = bulktransfer(mInEp, mBuf, mBuf.length, 1000);
    if (resp > 0) {
        dispatchMessage(mBuf);
    } else if (resp < 0) {
        showsBufferEmptyMessage();
    }
}
I was looking at the Missile Launcher example in Android and other libraries on the internet, and they put a delay of 50 ms between each poll. Doing this solves the missing packet problem.
// Working code
while (running) {
    int resp = bulktransfer(mInEp, mBuf, mBuf.length, 1000);
    if (resp > 0) {
        dispatchMessage(mBuf);
    } else if (resp < 0) {
        showsBufferEmptyMessage();
    }
    try {
        Thread.sleep(50);   // 50 ms pause between polls
    } catch (InterruptedException e) {
    }
}
Does anyone know the reason why the delay works? Most of the libraries on GitHub have this delay and, as I mentioned before, the Google example does too.
I am putting down my results regarding this problem. After all, it seems that the UsbDeviceConnection.bulkTransfer(...) method misbehaves when called continuously. The solution was to use the asynchronous API, the UsbRequest class. Using this approach I could read from the input endpoint without any delay, and no data was lost during the whole stress test. So the direction to take is the asynchronous UsbRequest instead of the synchronous bulkTransfer.

Synchronizing input pins in directshow

I am creating a DirectShow filter whose purpose is to take 3 input pins and create a video which alternately shows video from the first source, the second source and the third source, at a fixed time interval.
So if I have three webcams connected to my filter, I want the final video, for example, to show 5 seconds of the first cam, five seconds of the second cam, and so on...
I have tried two approaches:
Approach one
I use a class TimeManager. This class has a function isItPinsTurn(pinname), which returns true or false depending on whether the pin is supposed to send its sample to the output. To do this, the TimeManager creates a new thread which sleeps for x seconds.
After it has slept, it changes the current active input pin to the next one.
The result is that every x seconds the isItPinsTurn(pinname) function returns true for a different pin. This way each pin only sends output to the output pin when it is its turn, hence I get the desired video with x-second intervals between the input cams.
The problem with this approach
Sleep doesn't seem to work in DirectShow filters. I get a runtime error:
abort() has been called
Approach two
I use the sample's GetMediaTime method and a buffer which keeps track of how much video, in terms of its media time, has already been sent to the output pin. This is best illustrated with code:
void MyFilter::acceptFilterInput(LPCWSTR pinname, IMediaSample* sample)
{
    mylogger->LogDebug("In acceptFilterInput", L"D:\\TEMP\\yc.log");
    if (wcscmp(pinname, this->currentInputPin) == 0)
    {
        outpin->Deliver(sample);
        LONGLONG timestart;
        LONGLONG timeend;
        sample->GetTime(&timestart, &timeend);
        *mediaTimeBuffer += timeend - timestart;
        if (*mediaTimeBuffer > this->MEDIATIME)
        {
            this->SetNextPinActive(pinname);
            *mediaTimeBuffer = 0;
        }
    }
}
When the filter starts, currentInputPin is set to pin0 (the first one). Calls to acceptFilterInput (which is called by each input pin's Receive function) add the media time of the MediaSample (timeend - timestart) to mediaTimeBuffer. If this buffer exceeds MEDIATIME (which can, for example, be 5 seconds), the buffer is reset to zero and the next pin is set active.
Problems with this approach
I am not even sure whether IMediaSample::GetMediaTime returns the data I need, as it seems to return negative numbers, which doesn't make much sense. I didn't find any useful information about the return value of GetMediaTime on the web.
You are expected to block execution (incoming calls to IPin::Receive) on input streams so that the other streams can catch up on their own streaming threads. You typically achieve this either by using wait/synchronization APIs and functions, or by holding references to media samples so that the upstream peer blocks on an empty allocator while waiting for a media sample (buffer) to become available.
Yes, Sleep works well there, although polling is the worst of the possible options.
Approach two does not make sense to me because I don't see any real synchronization in it: there is no blocking of execution, and there is no real "making a pin active". You cannot force data onto an input pin; you can only wait to be called with a new media sample. So you should block accepting data on one input stream/pin until you get data on another.
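To make the turn-taking concrete, here is a minimal, hedged sketch of the blocking pattern using standard C++ threads rather than actual DirectShow classes: each worker loop stands in for one input pin's streaming thread, the WaitForTurn() call is where IPin::Receive would block, and the "delivering" step stands in for forwarding the sample to the output pin. All class and member names are made up for the illustration.
// Minimal sketch of round-robin turn-taking between three "input streams".
#include <chrono>
#include <condition_variable>
#include <cstdio>
#include <mutex>
#include <thread>
#include <vector>

class TurnScheduler {
public:
    explicit TurnScheduler(int streams) : m_streams(streams) {}

    // Block the calling streaming thread until it is this stream's turn.
    void WaitForTurn(int stream) {
        std::unique_lock<std::mutex> lock(m_lock);
        m_cv.wait(lock, [&] { return m_active == stream; });
    }

    // Hand the turn to the next stream (wraps around).
    void PassTurn() {
        {
            std::lock_guard<std::mutex> lock(m_lock);
            m_active = (m_active + 1) % m_streams;
        }
        m_cv.notify_all();
    }

private:
    int m_streams;
    int m_active = 0;   // stream whose samples are currently being delivered
    std::mutex m_lock;
    std::condition_variable m_cv;
};

int main() {
    TurnScheduler scheduler(3);
    std::vector<std::thread> workers;

    for (int stream = 0; stream < 3; ++stream) {
        workers.emplace_back([&, stream] {
            for (int slice = 0; slice < 2; ++slice) {
                scheduler.WaitForTurn(stream);   // where Receive would block
                printf("stream %d delivering its time slice\n", stream);
                std::this_thread::sleep_for(std::chrono::milliseconds(100)); // stand-in for delivering ~5 s of samples
                scheduler.PassTurn();            // let the next input stream run
            }
        });
    }

    for (auto &w : workers) w.join();
    return 0;
}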
Some useful relevant information on multiplexing:
How to make a DirectShow Muxer Filter - Part 1
How to make a DirectShow Muxer Filter - Part 2
GDCL MPEG-4 Multiplexer - available in source, and can multiplex data from 2+ streams