EEPROM protocol misconception - C++
I am currently using a Microchip EEPROM (24CW160) connected to an STM32F4 (11RET) via I2C. The configuration and wiring seem to work, as my logic analyzer shows I2C traffic (with ACKs) and I can send and receive data. After reading the reference manual (especially pages 13 and 18, which have the schematics for the two operations I am doing), I expect the code below to write the data 0, 1, 2, ... sequentially starting at address 0x10, then read the same data back and print it:
while (true) {
    HAL_Delay(1000);
    std::array<uint8_t, 100> arr{};
    int counter = 0;
    for (auto& i : arr) {
        i = counter;
        counter++;
    }
    auto ret1 = HAL_I2C_Mem_Write_DMA(&hi2c1, 0xa0, 0x10, 1, arr.data(), arr.size());
    HAL_Delay(1000);
    std::array<uint8_t, 100> arr2{};
    arr2.fill(1);
    auto ret2 = HAL_I2C_Mem_Read(&hi2c1, 0xa1, 0x10, 1, arr2.data(), arr2.size(), 100);
    printf("arr2:\n");
    for (auto i : arr2) {
        printf("%d,", (int)i);
    }
    printf("\nWrite ret status: %d\nRead ret status: %d\n", ret1, ret2);
}
Instead, what I get on my terminal is:
arr2:
70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,
Write ret status: 0
Read ret status: 0
arr2:
68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,
Write ret status: 0
Read ret status: 0
Notice that the first block of prints differs slightly from the second, and the second keeps repeating (so the first pass of the while loop prints slightly different values than the later passes). I honestly think I have confused myself with the constant parameters I give to HAL_I2C_Mem_Write and HAL_I2C_Mem_Read, and I would like some explanation of those too.
If more information is needed, leave a comment and I will provide all the necessary diagnostics/initializations, etc.
Thanks to the comments I managed to send a byte to whatever address I want. One error noticed in the previous code is the MemAddSize parameter (the fourth parameter of HAL_I2C_Mem_Write): because my EEPROM is 11-bit addressable, I should declare to HAL that the address size is 16 bits, not 8. The 5 extra bits are ignored, but declaring fewer than 11 bits would malfunction.
A second problem was that I was trying to write more than 32 bytes and was reading back only the first 32, since the device does not accept writing more than one 32-byte page at a time (the write wraps around within the page). Here is the modified code, with some comments on the changes:
std::array<uint8_t, 32> arr{0}; // 32 bytes instead of something silly
unsigned counter = 0;
for (auto& i : arr) {
    i = counter;
    counter++;
}
// 0x20 is the start of a page, so all 32 bytes can be written
// I2C_MEMADD_SIZE_16BIT is the value 2, instead of the 1 I had
// The extra 5 address bits are ignored
// For the testing I call the blocking write
auto ret1 = HAL_I2C_Mem_Write(&hi2c1, 0xa0, 0x20, I2C_MEMADD_SIZE_16BIT, arr.data(), arr.size(), 4);
HAL_Delay(4);
std::array<uint8_t, 32> arr2{0};
arr2.fill(1);
auto ret2 = HAL_I2C_Mem_Read(&hi2c1, 0xa1, 0x20, I2C_MEMADD_SIZE_16BIT, arr2.data(), arr2.size(), 4);
HAL_Delay(4);
printf("arr2: ");
for (auto i : arr2)
    printf(" %d,", (int)i);
printf("\nWrite ret status: %d\nRead ret status: %d\n", ret1, ret2);
Related
Reading value with I2C protocol from magnetoscope
I'm still pretty new to all this, so please excuse me if there is something obvious. I have been struggling with the included datasheet for a magnetometer. Everything seems to be working, but when I wave a magnet at it I don't really get any response over serial. Here is some information.

#include <Wire.h>

void setup() {
    Wire.begin();        // join i2c bus (address optional for master)
    Serial.begin(9600);  // start serial communication at 9600bps
}

void loop() {
    int reading = 0;
    int Address = 30;
    Wire.beginTransmission(Address);
    Wire.write(byte(0x03));
    Wire.write(byte(0x04));
    Wire.endTransmission();
    Wire.requestFrom(Address, 2);
    delay(10);
    if (2 <= Wire.available()) {
        reading = Wire.read();
        reading = reading << 8;
        reading |= Wire.read();
        Serial.println(int(reading));
    }
    delay(250); // wait a bit since people have to read the output :)
}

With this code, I receive a number: -5637 -5637 -5637 -5637 -5637. But if I remove the line Wire.write(byte(0x03));, my output does not change. The value from the device is supposed to be expressed as two's complement, so at first I thought I didn't know how to send multiple bytes to the device, but after some research I found that I was doing it right (I think). If I only keep Wire.write(byte(0x03));, I receive "0" as the response, and reading the datasheet I see that response 0 means the command is invalid. I have included the datasheet in this post. Can someone point me in the right direction? The IC I'm using is an LSM303DLHC, and I'm using it from this "shield". Here is the datasheet. The following picture shows the communication on the bus.
I believe the following code does this, which is like Table 11 in the datasheet:

Wire.beginTransmission(Address); // START and write to device address 0x1E
Wire.write(byte(0x03));          // Set the register pointer to sub-address 0x03
Wire.write(byte(0x04));          // Write a value of 0x04 to sub-address 0x03
Wire.endTransmission();          // STOP

Then I suspect the device's register pointer gets automatically incremented from register 0x03 to 0x04, and the rest of your code then reads two bytes from sub-addresses 0x04 and 0x05. You didn't state your intention for your code, but I suspect the above is NOT what you intended. My guess is that you intend to read two bytes from device address 0x1E, sub-addresses 0x03 and 0x04. Is that right? You should be doing an operation like the one described in Table 13:

Wire.beginTransmission(Address); // START and write to device address 0x1E
Wire.write(byte(0x03));          // Set the register pointer to sub-address 0x03
Wire.endTransmission(false);     // send it, but REPEAT-START instead of STOP
Wire.requestFrom(Address, 2);    // read 2 bytes from sub-addresses 0x03 and 0x04
Openframeworks, reading serial data from Arduino
I'm trying to read serial data from an Arduino UNO using an ofSerial object and assign it as an int. I am able to read individual bytes; however, the values I'm receiving in the openFrameworks console are not the same as the values I'm reading in the Arduino serial monitor. I have provided screenshots of the respective consoles. My Arduino code is simply the basic "AnalogReadSerial" example available with the Arduino IDE.

// the setup routine runs once when you press reset:
void setup() {
    // initialize serial communication at 9600 bits per second:
    Serial.begin(9600);
}

// the loop routine runs over and over again forever:
void loop() {
    // read the input on analog pin 0:
    int sensorValue = analogRead(A0);
    // print out the value you read:
    Serial.println(sensorValue);
    delay(1); // delay in between reads for stability
}

My C++ code is mostly copied from the documentation for the ofSerial readByte function.

void serialReader::setup() {
    serial.listDevices();
    vector<ofSerialDeviceInfo> deviceList = serial.getDeviceList();
    serial.setup("COM4", 9600); // open the first device and talk to it at 9600 baud
}

void serialReader::printByteToConsole() {
    int myByte = 0;
    myByte = serial.readByte();
    if (myByte == OF_SERIAL_NO_DATA)
        printf("\nno data was read");
    else if (myByte == OF_SERIAL_ERROR)
        printf("\nan error occurred");
    else
        printf("\nmyByte is %d ", myByte);
}

Any insight into what may be causing this disparity between the readings would be greatly appreciated. Thank you.
Arduino's Serial.println translates the raw bytes into their ASCII equivalents and then sends those bytes followed by carriage-return (13) and linefeed (10) bytes. So the raw byte 12 is sent as 4 total bytes: two bytes representing the ASCII '1' (49) and '2' (50), and then (13) and (10) for the new-line characters. Since openFrameworks does not automatically translate the ASCII values back into raw bytes, you are seeing the ASCII version; the Arduino console shows you the ASCII version as readable text. You can see the translation between ASCII and raw bytes (decimal, aka DEC) here: http://www.asciitable.com/ If you want the two numbers to match on both sides, consider using Serial.write in Arduino to write the raw bytes without ASCII translation and new-line characters.
What variable type to use for parameter LPVOID lpBuffer in C++ functions WriteFile and ReadFile
What variable type should be used for lpBuffer of the C++ ReadFile and WriteFile functions to communicate between a Windows XP based PC and a micro-controller based system? The PC has a WinForms application in VS2010 C++/CLI. The micro-controller firmware is ANSI C. My PC is supposed to transmit command characters (say 'S', 'C', etc.) followed by the command termination character 0xD (hex for decimal 13). The micro-controller based system would respond with 5 to 10 bytes that would be a mix of ASCII characters and hex numbers, e.g. 'V' followed by 0x41 0x72 etc.

PC transmits and micro-controller receives:

1. TxMessage, PP1 and pTx declared as char, keeping nNumberOfBytesToWrite as 2, makes the micro-controller receive 0x53 for 'S' followed by 0xC3 instead of 0xD.
2. TxMessage, PP1 and pTx declared as wchar_t, keeping nNumberOfBytesToWrite as 2, makes the micro-controller receive 0x53 for 'S' only.
3. TxMessage, PP1 and pTx declared as wchar_t, keeping nNumberOfBytesToWrite as 4, makes the micro-controller receive 0x53 for 'S' followed by 0xD correctly.

The third scheme above meets my expected behavior. But here is the confusion: although the PC might be transmitting 4 bytes (for two wchar_t values), the micro-controller receives 2 bytes, 0x53 for 'S', correctly followed by 0xD.

Micro-controller transmits and PC receives: assuming that wchar_t is the right choice for lpBuffer, what should my nNumberOfBytesToRead be for receiving 10 bytes from the micro-controller? ReadFile would expect 20 bytes by virtue of wchar_t, whereas the micro-controller would transmit 10 bytes only. Amazingly, irrespective of declaring (RxMessage, PP2 and pRx) as wchar_t, char or unsigned char, ReadFile receives 10 bytes from the micro-controller (which meets my expected behavior). But the issue is that, transmitting 'A' 10 times from the micro-controller, ReadFile on the PC's end receives junk like 'S', 0x0, 0xd, 0x54, 0x29.
HANDLE hCommPort;
BOOL fSuccess;
array<wchar_t> ^ TxMessage;
array<unsigned char> ^ RxMessage;
TxMessage = gcnew array<wchar_t> (12);
RxMessage = gcnew array<unsigned char> (12);
{
    TxMessage[0] = 'S';  // target cmd
    TxMessage[1] = 0xd;  // cmd termination character
    DWORD dwhandled;
    if (hCommPort != INVALID_HANDLE_VALUE) {
        pin_ptr<wchar_t> pp1 = &TxMessage[0];
        wchar_t *pTx = pp1;
        fSuccess = WriteFile(hCommPort, pTx, 4, &dwhandled, NULL);
        PurgeComm(hCommPort, PURGE_RXABORT|PURGE_TXABORT|PURGE_RXCLEAR|PURGE_TXCLEAR);
        pin_ptr<unsigned char> pp2 = &RxMessage[0];
        unsigned char *pRx = pp2;
        fSuccess = ReadFile(hCommPort, pRx, 10, &dwhandled, NULL);
    } // if IsOpen
    else {
        this->toolStripStatusLabel4->Text = "Port Not Opened";
    }
}
ReadFile/WriteFile do not care about C++ types; they operate in terms of bytes read/written. ReadFile reads the specified number of bytes (or fewer, if there are fewer bytes to read) from a file/device and puts them into the memory pointed to by lpBuffer. WriteFile writes the specified number of bytes to a file/device from the memory pointed to by lpBuffer. The memory buffer for these functions is simply a region of allocated memory whose size is at least as many bytes as you tell those functions in the third parameter.

wchar_t is a multi-byte type; its size can be bigger than one byte. Consequently, your TxMessage[0]='S'; TxMessage[1]=0xd; can actually be filling not two bytes in memory but, say, 4 bytes. For example, it can be 0x0053, 0x000D in wchar_t representation. WriteFile does not care how and what you put into that memory; it reads raw memory and writes it to the device. So if your device expects 0x530D, it might not be getting that, but 0x0053.

Overall, think in bytes. If you need to write the 4 bytes 0x0A0B0C0D to your device, it does not matter HOW you allocated the buffer for this value. It can be a 4-byte unsigned int = 0x0A0B0C0D, it can be char[4] = {0x0A, 0x0B, 0x0C, 0x0D}, it can be short[2] = {0x0A0B, 0x0C0D}; it can be ANY C++ type, including a custom class. But the first 4 bytes of the memory pointed to by the pointer passed to WriteFile should be 0x0A0B0C0D.

Similarly, ReadFile reads the number of bytes you specify. If your device sends you, say, 2 bytes, ReadFile will write the 2 bytes it gets into the memory pointed to by the pointer you pass to it (and it's your responsibility to ensure enough bytes are allocated there). Again, it does not care how you allocated that memory, as long as there are 2 bytes allocated. After that, you can view those two bytes however you want: as char[2], as short, etc.
Using bytes is the natural match; serial ports are pretty fundamentally byte-oriented devices. You could make your transmitting code look like this:

bool SendCommand(HANDLE hCommPort) {
    auto TxMessage = gcnew array<Byte> { 'S', '\r' };
    pin_ptr<Byte> pbytes = &TxMessage[0];
    DWORD bytesSent = 0;
    BOOL fSuccess = WriteFile(hCommPort, pbytes, TxMessage->Length, &bytesSent, NULL);
    return fSuccess && bytesSent == TxMessage->Length;
}

Your receiving code needs to do more work: the number of bytes you get back from ReadFile() is unpredictable, so a protocol is required to indicate when you should stop reading. A fixed-length response is common, or a special last character is very common. That could look like this:

bool ReceiveResponse(HANDLE hCommPort, array<Byte>^ RxMessage, int% RxCount) {
    for (; RxCount < RxMessage->Length; ) {
        DWORD bytesReceived = 0;
        pin_ptr<Byte> pbytes = &RxMessage[RxCount];
        BOOL fSuccess = ReadFile(hCommPort, pbytes, RxMessage->Length - RxCount, &bytesReceived, NULL);
        if (!fSuccess) return false;
        int rxStart = RxCount;
        RxCount += bytesReceived;
        for (int ix = rxStart; ix < RxCount; ++ix) {
            if (RxMessage[ix] == '\r') return true;
        }
    }
    return true;
}

Don't overlook the .NET System::IO::Ports::SerialPort class. It has built-in Encoding support, which makes it easier to work with characters vs. bytes. Your ReceiveResponse() method could collapse to a simple ReadLine() call with the NewLine property set correctly.
Apologies for not being able to respond earlier than this. In fact, the problem was identified and corrected after posting of my last comment. It related to a 9-bit protocol being forced by the micro-controller. The original micro firmware uses a 9-bit protocol to address one of many slaves. For development testing I had temporarily modified it to an 8-bit protocol. Unfortunately, the modifications missed the UART mode register, which remained in 9-bit mode. With a boxed/biased mind I kept debugging.
recv() strings of unknown encoding from sockets in c++
I'm writing some client code which will connect to a server and issue it an ID, a 16-byte string, and in return it will get back the same 16-byte string. I can also get unsolicited requests from this server telling me its ID, and I'll have to send the same thing back to it as well. When I create the string and send it, it works fine: I am able to completely parse the reply and print it out, and it's exactly what I sent. However, for the unsolicited request part, I am unable to read the string that is sent to me. Here is a part of my code:

string my_send_string = "123";
char my_send_buffer[16];
char my_reply[256];
memset(my_send_buffer, '\0', 16);
strncpy(my_send_buffer, my_send_string.c_str(), my_send_string.size());
numBytes = send(sock_fd, my_send_buffer, 16, 0);  // verified that numBytes is indeed 16
numBytes = recv(sock_fd, my_reply, 16, 0);        // verified that numBytes recvd is indeed 16
printf("Original Send Reply: %s\n", my_reply);    // This prints 123, as I expect.
memset(my_reply, '\0', 16);
numBytes = recv(sock_fd, my_reply, 16, 0);        // verified that numBytes is 16
printf("Unsolicited Request:%s\n", my_reply);     // This prints ?a?p??˝س?8? ....

I think I'm decoding the received string incorrectly here; any tips on how I can fix this? In the example above I tried to just send it once, but if I send the string 10 times with different string IDs, I always get back the ID I send, so it's not just a one-time thing.
Pointer in C++ - Need explanation how it works
http://www.codeproject.com/KB/IP/SocketFileTransfer.aspx?artkw=socket%20send%20a%20file

I don't clearly understand this part:

// get the file's size first
cbLeftToReceive = sizeof( dataLength );
do {
    BYTE* bp = (BYTE*)(&dataLength) + sizeof(dataLength) - cbLeftToReceive;
    cbBytesRet = sockClient.Receive( bp, cbLeftToReceive );
    // test for errors and get out if they occurred
    if ( cbBytesRet == SOCKET_ERROR || cbBytesRet == 0 ) {
        int iErr = ::GetLastError();
        TRACE( "GetFileFromRemoteSite returned a socket error while getting file length\n"
               "\tNumber of Bytes received (zero means connection was closed) = %d\n"
               "\tGetLastError = %d\n", cbBytesRet, iErr );
        /* you should handle the error here */
        bRet = FALSE;
        goto PreReturnCleanup;
    }
    // good data was retrieved, so accumulate
    // it with already-received data
    cbLeftToReceive -= cbBytesRet;
} while ( cbLeftToReceive > 0 );

I want to know how it gets the size of the file into dataLength. This line:

BYTE* bp = (BYTE*)(&dataLength) + sizeof(dataLength) - cbLeftToReceive;

Is it right that bp is a byte pointer to dataLength's address? But what does + sizeof(dataLength) - cbLeftToReceive mean? I don't think the file is that small (4 bytes, just one Receive), so how can they receive both dataLength and the data? Does it send dataLength first and the data after?
Oh, the funny pointer arithmetic. The idea is to count from the end, so that when you reach the end you know you're done. In pieces:

1. Find the address of dataLength: (BYTE*)(&dataLength)
2. Skip to the end of dataLength: + sizeof(dataLength)
3. Back up by the number of bytes we still expect to receive: - cbLeftToReceive

This is where we write the bytes we get from the network. As we get bytes from the network, we reduce cbLeftToReceive (cbLeftToReceive -= cbBytesRet;) and continue trying to receive bytes until we are done. So every time through the loop, bp points to where we need to write the next bytes we Receive().

EDIT: So now that we know how many bytes we're going to get, how do we receive them without potentially filling all of RAM with hunks of the data? We get a buffer, repeatedly fill it, and flush that buffer to disk whenever it's not empty. When there's still a lot of data (more than a buffer) left to receive, we try to Receive() a full buffer. When there's less than a full buffer to go, we only request up to the end of the file:

iiGet = (cbLeftToReceive < RECV_BUFFER_SIZE) ? cbLeftToReceive : RECV_BUFFER_SIZE;
iiRecd = sockClient.Receive( recdData, iiGet );

Then we catch and handle errors. If there was no error, we write however many bytes we got and reduce the number of bytes we expect to receive by the number we got:

destFile.Write( recdData, iiRecd ); // Write it
cbLeftToReceive -= iiRecd;

If we're still not done receiving bytes, go back to the top and keep going:

while ( cbLeftToReceive > 0 );

General advice: it's good to practice reading code without paying too much attention to the error-handling and exception-handling code. Typically what's left is much easier to understand.
He/she means that the code sets aside the size of an int at the start of the buffer, where the size of the file will be placed (it will later be read from the socket).