I am trying to read and write data to a sensor via I2C with DMA1 on an STM32 Nucleo-F401 board running FreeRTOS.
My project is written in C++ and uses the STM32 HAL libraries via extern "C".
I can read the sensor data with the polling method, and alongside the I2C a UART is running correctly with DMA2. I have checked that MX_DMA_Init runs before MX_I2C1_Init.
When I try to write to the sensor as a master with DMA (HAL_I2C_Master_Transmit_DMA) while the "i2c1 event interrupt" and "i2c1 error interrupt" are disabled, the parallel FreeRTOS tasks keep running fine, but the HAL_I2C_MasterTxCpltCallback is never triggered.
When I enable the "i2c1 event interrupt" and "i2c1 error interrupt" in CubeMX, two parallel FreeRTOS tasks run once, then no more parallel scheduling happens: HAL_I2C_EV_IRQHandler is called periodically and always leaves through the last else branch, the one with the /* Do nothing */ comment.
" i2c1 event interrupt" and "i2c1 error interrupt" enabled
UART DMA configuration
I2C DMA configuration
Could you please suggest what I could try to get the I2C working with DMA?
I tried enabling and disabling the i2c1 event interrupt from CubeMX, expecting the HAL_I2C_MasterTxCpltCallback callback to be triggered. But only HAL_I2C_EV_IRQHandler was triggered, periodically.
I tried the I2C with the polling method and it worked correctly.
I tried the UART with DMA and it worked correctly too.
I checked the order of MX_DMA_Init and MX_I2C1_Init; the order was correct.
I checked whether any other I2C callback is triggered, but no other I2C callback fires.
I updated the CubeMX F4 package from V1.26.2 to V1.27.1, but found no improvement.
I wrapped all the HAL implementation and callback functions in extern "C". No change occurred.
I switched to an STM32 H723ZG board, where at least the first I2C transfer with DMA happened (first I2C data transfer: https://i.stack.imgur.com/4wsxk.png). But it happens for only one cycle, and then the DMA fails with error code 1, which is the transfer error:
#define HAL_DMA_ERROR_TE (0x00000001U) /*!< Transfer error */
DMA registers: https://i.stack.imgur.com/4SH7Z.png
I have seen that on the H7 series I need to align (and cache-clean) the data given to the DMA before it can be sent, and I tried to apply the fix for this.
My code now looks like this:
#define TX_LENGTH (16)
uint8_t i2cData[TX_LENGTH];
void I2C_Write8(uint8_t ADDR, uint8_t data)
{
    i2cData[0] = ADDR;
    i2cData[1] = data;
    uint8_t MPUADDR = (MPU_ADDR<<1);
    /* Clean D-cache */
    /* Make sure the address is 32-byte aligned and add 32-bytes to length, in case it overlaps cacheline */
    SCB_CleanDCache_by_Addr((uint32_t*)(((uint32_t)i2cData) & ~(uint32_t)0x1F), TX_LENGTH+32);
    HAL_I2C_Master_Transmit_DMA(&i2cHandler, MPUADDR, i2cData, TX_LENGTH);
    //HAL_Delay(100);
}
The I2C Init:
static void MX_I2C2_Init(void)
{
/* USER CODE BEGIN I2C2_Init 0 */
/* USER CODE END I2C2_Init 0 */
/* USER CODE BEGIN I2C2_Init 1 */
/* USER CODE END I2C2_Init 1 */
hi2c2.Instance = I2C2;
hi2c2.Init.Timing = 0x60404E72;
hi2c2.Init.OwnAddress1 = 0;
hi2c2.Init.AddressingMode = I2C_ADDRESSINGMODE_7BIT;
hi2c2.Init.DualAddressMode = I2C_DUALADDRESS_DISABLE;
hi2c2.Init.OwnAddress2 = 0;
hi2c2.Init.OwnAddress2Masks = I2C_OA2_NOMASK;
hi2c2.Init.GeneralCallMode = I2C_GENERALCALL_DISABLE;
hi2c2.Init.NoStretchMode = I2C_NOSTRETCH_DISABLE;
if (HAL_I2C_Init(&hi2c2) != HAL_OK)
{
Error_Handler();
}
/** Configure Analogue filter
*/
if (HAL_I2CEx_ConfigAnalogFilter(&hi2c2, I2C_ANALOGFILTER_ENABLE) != HAL_OK)
{
Error_Handler();
}
/** Configure Digital filter
*/
if (HAL_I2CEx_ConfigDigitalFilter(&hi2c2, 0) != HAL_OK)
{
Error_Handler();
}
/* USER CODE BEGIN I2C2_Init 2 */
/* USER CODE END I2C2_Init 2 */
}
The DMA Init:
static void MX_DMA_Init(void)
{
/* DMA controller clock enable */
__HAL_RCC_DMA1_CLK_ENABLE();
/* DMA interrupt init */
/* DMA1_Stream0_IRQn interrupt configuration */
HAL_NVIC_SetPriority(DMA1_Stream0_IRQn, 5, 0);
HAL_NVIC_EnableIRQ(DMA1_Stream0_IRQn);
/* DMA1_Stream1_IRQn interrupt configuration */
HAL_NVIC_SetPriority(DMA1_Stream1_IRQn, 5, 0);
HAL_NVIC_EnableIRQ(DMA1_Stream1_IRQn);
}
The initialization from the main:
/* USER CODE BEGIN 1 */
/* USER CODE END 1 */
/* MCU Configuration--------------------------------------------------------*/
/* Reset of all peripherals, Initializes the Flash interface and the Systick. */
HAL_Init();
/* USER CODE BEGIN Init */
/* USER CODE END Init */
/* Configure the system clock */
SystemClock_Config();
/* USER CODE BEGIN SysInit */
/* USER CODE END SysInit */
/* Initialize all configured peripherals */
MX_GPIO_Init();
MX_DMA_Init();
MX_USART3_UART_Init();
MX_USB_OTG_HS_USB_Init();
MX_SPI1_Init();
MX_ETH_Init();
MX_I2C2_Init();
/* USER CODE BEGIN 2 */
/* USER CODE END 2 */
/* Init scheduler */
osKernelInitialize();
Could you please suggest what I am doing wrong?
I finally found the solution. Let me summarise the issues I had:
HAL_I2C_Master_Transmit_DMA(i2cHandler, MPUADDR, i2cData, 2);
The data to deliver: the buffer handed to the DMA must stay valid, e.g. a global variable. Otherwise, when I leave the function (or the destructor of the owning object runs) the memory is freed while the transfer is still in progress and the DMA aborts.
The target handle: the i2cHandler in my case must not be a copy of the original structure/object, because the interrupt handlers look up the callback/ISR addresses on the original object, while HAL_I2C_Master_Transmit_DMA updates them on the one you pass in. If they are not the same object you end up looping in HAL_I2C_EV_IRQHandler, because hi2c->XferISR is always NULL.
What to check: whether the I2C handle used in the interrupt handlers is exactly the same object as the one used in your function; a different name is fine as long as it is only a pointer to the original. In my case they were not the same: the one from main was called hi2c1 and the one in my function i2cHandler.
Name from the main: https://i.stack.imgur.com/OhqJT.png
The solution for me was to keep only a pointer to the handle, not a copy of its contents. Copy of the struct: https://i.stack.imgur.com/JgmTU.png
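To make the two points concrete, here is a minimal sketch of the pattern that ended up working for me (the class, header and buffer names are illustrative, not from my real project; MPU_ADDR and hi2c2 are the macro/handle from the code above). The wrapper stores only a pointer to the CubeMX-generated handle, and the TX buffer is a global, so both stay valid while the DMA and the I2C ISRs are still using them:
extern "C" {
#include "i2c.h"   /* assumed CubeMX header that declares hi2c2 */
}

static uint8_t i2cTxBuf[2];                     /* global: still valid after write8() returns */

class SensorI2c
{
public:
    explicit SensorI2c(I2C_HandleTypeDef *handle) : hi2c(handle) {}  /* keep the pointer, never copy *handle */

    HAL_StatusTypeDef write8(uint8_t reg, uint8_t value)
    {
        i2cTxBuf[0] = reg;
        i2cTxBuf[1] = value;
        /* hi2c points at the very same object the I2C/DMA IRQ handlers work on */
        return HAL_I2C_Master_Transmit_DMA(hi2c, MPU_ADDR << 1, i2cTxBuf, 2);
    }

private:
    I2C_HandleTypeDef *hi2c;
};

// Usage: SensorI2c sensor(&hi2c2); sensor.write8(reg, value);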
With the STM32 H7 boards there is additionally a memory placement issue for DMA, which is explained (with the solution) in the article below.
https://community.st.com/s/article/FAQ-DMA-is-not-working-on-STM32H7-devices
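For reference, the usual H7 fix is to place DMA buffers in a RAM region the DMA master can actually reach (for example D2 SRAM for DMA1/DMA2) and keep them out of DTCM, either through the linker script or an attribute. A sketch, assuming a .dma_buffer section exists in your linker script (the section name is my assumption):
/* Put the DMA buffer into a DMA-accessible, 32-byte aligned region. */
__attribute__((section(".dma_buffer"), aligned(32)))
static uint8_t i2cData[TX_LENGTH];
The cache-maintenance call in I2C_Write8() above is still needed, unless the D-cache/MPU is configured so that the region is non-cacheable.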
I am getting my feet wet with STM32 and some basic lower-level things instead of using libraries.
I am running into a problem with a (I would think) very basic program.
All I want to do is use the STM32 (Nucleo-F302R8) to try to detect a DS18B20 sensor on a OneWire bus (Port C, Pin 0).
If it detects it, turn on the on-board LED (Port B, Pin 13); if not, turn it off.
That is it, nothing else for now at least.
I have read the initialization parts of the DS18B20 datasheet and seen how a lot of people do it online, but I do not want to use their libraries; I want to understand how to get it working myself.
However, I have tried to follow their lead and come up with the code below; I can't say why I am having no luck though.
My code needs to set up an output (Port C, Pin 0), pull the OneWire line low for 480 µs, switch to an input, wait 60 µs and then read the line.
If the line is low there is a sensor there; if it is high there is not one (from my understanding).
I have checked the wiring and the sensor with my Arduino and both seem to be working fine with those standard libraries.
Here are the output and input functions:
void ONEWIRE_OUTPUT(GPIO_TypeDef* GPIOx, uint32_t GPIO_Pin)
{
    GPIO_InitTypeDef GPIO_InitStructOUT = {0}; // Holds the pin info.
    GPIO_InitStructOUT.Pin = GPIO_Pin;         // use the parameters instead of hard-coding GPIOC / GPIO_PIN_0
    GPIO_InitStructOUT.Mode = GPIO_MODE_OUTPUT_PP;
    GPIO_InitStructOUT.Speed = GPIO_SPEED_LOW;
    HAL_GPIO_Init(GPIOx, &GPIO_InitStructOUT);
}
void ONEWIRE_INPUT(GPIO_TypeDef* GPIOx, uint32_t GPIO_Pin)
{
    GPIO_InitTypeDef gpio_prms = {0};
    gpio_prms.Pin = GPIO_Pin;
    gpio_prms.Mode = GPIO_MODE_INPUT;
    gpio_prms.Pull = GPIO_PULLUP;
    HAL_GPIO_Init(GPIOx, &gpio_prms);
}
The delay I am using to get down to microseconds is the following:
I found how to set up these timers online and I think they are correct, but I do not have an oscilloscope with me right now to check the output.
void Delay_us(uint16_t delay)
{
    __HAL_TIM_SET_COUNTER(&htim1, 0);               // set the counter value to 0
    while (__HAL_TIM_GET_COUNTER(&htim1) < delay);  // wait for the counter to reach the requested delay
}
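For what it is worth, the DS18B20 reset/presence timing from the datasheet is: hold the bus low for at least 480 µs, release it, wait roughly 70 µs, sample (a present sensor pulls the line low for 60–240 µs), then wait out the rest of the 480 µs presence/recovery slot. A sketch of that sequence using the helpers above (untested, just to show the order of operations):
// Returns 1 if a presence pulse was detected, 0 otherwise.
uint8_t ONEWIRE_Reset(void)
{
    ONEWIRE_OUTPUT(GPIOC, GPIO_PIN_0);
    HAL_GPIO_WritePin(GPIOC, GPIO_PIN_0, GPIO_PIN_RESET);
    Delay_us(480);                         // reset pulse: >= 480 us low

    ONEWIRE_INPUT(GPIOC, GPIO_PIN_0);      // release the bus, pull-up raises it
    Delay_us(70);                          // sensor answers 15-60 us after the release

    uint8_t presence = (HAL_GPIO_ReadPin(GPIOC, GPIO_PIN_0) == GPIO_PIN_RESET);

    Delay_us(410);                         // finish the 480 us presence/recovery slot
    return presence;
}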
My main loop is as follows:
int main(void)
{
/* USER CODE BEGIN 1 */
/* USER CODE END 1 */
/* MCU Configuration--------------------------------------------------------*/
/* Reset of all peripherals, Initializes the Flash interface and the Systick. */
HAL_Init();
/* USER CODE BEGIN Init */
/* USER CODE END Init */
/* Configure the system clock */
SystemClock_Config();
/* USER CODE BEGIN SysInit */
/* USER CODE END SysInit */
/* Initialize all configured peripherals */
MX_GPIO_Init();
MX_USART2_UART_Init();
MX_TIM1_Init();
/* USER CODE BEGIN 2 */
/* USER CODE END 2 */
/* Infinite loop */
/* USER CODE BEGIN WHILE */
HAL_TIM_Base_Start(&htim1); // Starting timer 1
int timer_val;
char uart_buf[64];      // UART text buffer (declared here so the snippet is complete)
int uart_buf_len;
while (1)
{
//Setup as Output
ONEWIRE_OUTPUT(GPIOC, GPIO_PIN_0);
// Hold low for 480uS
HAL_GPIO_WritePin(GPIOC, GPIO_PIN_0, 0);
Delay_us(480);
timer_val = __HAL_TIM_GET_COUNTER(&htim1);
uart_buf_len = sprintf(uart_buf, "%u us -- Should be around 480 us\r\n" , timer_val );
HAL_UART_Transmit(&huart2, (uint8_t *)uart_buf, uart_buf_len, 100);
//release and change to INPUT
ONEWIRE_INPUT(GPIOC, GPIO_PIN_0);
//wait 60uS
Delay_us(60);
//Read, If there is a low - True, else False
timer_val = __HAL_TIM_GET_COUNTER(&htim1);
uart_buf_len = sprintf(uart_buf, "%u us -- Should be around 60 us\r\n" , timer_val );
HAL_UART_Transmit(&huart2, (uint8_t *)uart_buf, uart_buf_len, 100);
//Update LED
if ( HAL_GPIO_ReadPin(GPIOC, GPIO_PIN_0) ) {
HAL_GPIO_WritePin(GPIOB, GPIO_PIN_13, GPIO_PIN_RESET);
}else
{
HAL_GPIO_WritePin(GPIOB, GPIO_PIN_13, GPIO_PIN_SET);
}
HAL_Delay(1000);
// Delay a second
/* USER CODE END WHILE */
/* USER CODE BEGIN 3 */
}
/* USER CODE END 3 */
}
I have a feeling it has to do with how I set up the output and input specifics, but I might be barking up the wrong tree there.
I am pretty sure Push Pull is the best Output mode to use in this instance.
My other thought is that I did in fact get the timer settings wrong, but I have my doubts; I think I have them correct.
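One more thing worth trying on the output-mode point: OneWire is an open-drain bus, so configuring the pin as an open-drain output (only ever driving it low or releasing it) avoids actively driving the line high against the sensor. A sketch, if you want to test it (GPIO_MODE_OUTPUT_OD is the standard HAL open-drain mode; the bus still needs its usual 4.7 kΩ pull-up):
// Open-drain alternative to the push-pull setup above.
void ONEWIRE_OUTPUT_OD(GPIO_TypeDef* GPIOx, uint32_t GPIO_Pin)
{
    GPIO_InitTypeDef init = {0};
    init.Pin   = GPIO_Pin;
    init.Mode  = GPIO_MODE_OUTPUT_OD;   // pin only pulls low; the pull-up raises the line
    init.Pull  = GPIO_NOPULL;           // rely on the external 4.7k pull-up on the bus
    init.Speed = GPIO_SPEED_FREQ_LOW;
    HAL_GPIO_Init(GPIOx, &init);
}
With open-drain you can even stay in output mode and still read the line back with HAL_GPIO_ReadPin, since the input data register keeps following the pin state.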
I have been able to get a very faint glow on an LED when the pin is in output mode, and have it blink for an extremely short time while troubleshooting things.
Any ideas would be welcome. I know this code is not pretty and there are 1000 better ways to do it, but this is purely me messing around and trying to learn things.
It was all going great until I got stuck here.
Lastly, as this is a new topic, any recommendations on where to learn more about STM32 programming would be welcome; I have a few books but will look up any newly recommended ones.
I've been trying to process speech on an STM32F407VE development board for some time now, which makes me wonder whether the ADC is really set up to sample the values at precise instants. I am using the CMSIS FFT functions. But when I try to couple them with the ADC in continuous conversion mode to sample a sine signal, the sampling does not seem to be periodic. I feed it a 1 kHz sine wave from a frequency-test video on the internet, through a plug I took out of some headphones; by the way, I already checked with an oscilloscope that this source works correctly. So... this development board is obviously not a DSP, but its ADC should work correctly for this type of application, right? Here is my code; I obviously made sure the source was emitting a voltage before debugging.
#include "main.h"
#include "arm_math.h"
#include "arm_const_structs.h"
/* Private includes ----------------------------------------------------------*/
/* USER CODE BEGIN Includes */
/* USER CODE END Includes */
/* Private typedef -----------------------------------------------------------*/
/* USER CODE BEGIN PTD */
#define Fs 4096
/* USER CODE END PTD */
/* Private define ------------------------------------------------------------*/
/* USER CODE BEGIN PD */
/* USER CODE END PD */
/* Private macro -------------------------------------------------------------*/
/* USER CODE BEGIN PM */
/* USER CODE END PM */
/* Private variables ---------------------------------------------------------*/
ADC_HandleTypeDef hadc1;
/* USER CODE BEGIN PV */
#define SIGNAL_BUFFER_LENGTH 4096
float signalBuffer[2*SIGNAL_BUFFER_LENGTH];
float fftBuffer[2*SIGNAL_BUFFER_LENGTH];
float magnitudes[SIGNAL_BUFFER_LENGTH];
/* USER CODE END PV */
uint32_t k;
uint32_t cont1,cont2;
uint32_t start;
uint32_t stopi;
uint32_t delta;
float32_t maxValue; /* Max FFT value is stored here */
uint32_t maxIndex;
float frecuencia=10.0;
float32_t Ts;
float tiempo;
/* Private function prototypes -----------------------------------------------*/
void SystemClock_Config(void);
static void MX_GPIO_Init(void);
static void MX_ADC1_Init(void);
/* USER CODE BEGIN PFP */
/* USER CODE END PFP */
/* Private user code ---------------------------------------------------------*/
/* USER CODE BEGIN 0 */
/* USER CODE END 0 */
/**
* @brief  The application entry point.
* @retval int
*/
int main(void)
{
/* USER CODE BEGIN 1 */
/* USER CODE END 1 */
/* MCU Configuration--------------------------------------------------------*/
/* Reset of all peripherals, Initializes the Flash interface and the Systick. */
HAL_Init();
/* USER CODE BEGIN Init */
#define ARM_CM_DEMCR (*(uint32_t*)0xE000EDFC)
#define ARM_CM_DWT_CTRL (*(uint32_t*)0xE0001000)
#define ARM_CM_DWT_CYCCNT (*(uint32_t*)0xE0001004)
if(ARM_CM_DWT_CTRL !=0){
ARM_CM_DEMCR |= 1<<24;
ARM_CM_DWT_CYCCNT =0;
ARM_CM_DWT_CTRL |= 1<<0;
}
/* USER CODE END Init */
/* Configure the system clock */
SystemClock_Config();
/* USER CODE BEGIN SysInit */
/* USER CODE END SysInit */
/* Initialize all configured peripherals */
MX_GPIO_Init();
MX_ADC1_Init();
/* USER CODE BEGIN 2 */
Ts=1.0/(float)Fs;
HAL_ADC_Start(&hadc1);
for(k=0;k<2*SIGNAL_BUFFER_LENGTH;k+=2 )
{
signalBuffer[k]=HAL_ADC_GetValue(&hadc1);
}
k++;
//signalBuffer[0]=0;
//start= ARM_CM_DWT_CYCCNT;
arm_cfft_f32(&arm_cfft_sR_f32_len4096,signalBuffer,0,1);
signalBuffer[0]=0;
arm_cmplx_mag_f32(signalBuffer,magnitudes,4096);
arm_max_f32(magnitudes, 4096, &maxValue, &maxIndex);
//stopi = ARM_CM_DWT_CYCCNT;
//delta=stopi-start;
//tiempo=delta/8.0E07*1000.0;
/* USER CODE END 2 */
/* Infinite loop */
/* USER CODE BEGIN WHILE */
while (1)
{
/* USER CODE END WHILE */
/* USER CODE BEGIN 3 */
}
/* USER CODE END 3 */
}
You are just calling the function to take a single reading over and over in a loop. There is no reason to think that each pass through this loop will take the same amount of time. You need to set the ADC to be triggered from a timer in order to have a reproducible sample rate.
In general the internal ADC is not of suitable quality for audio use. There is an external audio codec fitted to this board; look at the example projects in the STM32CubeF4 package.
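To illustrate the timer-trigger idea, here is a configuration sketch (fragments only, not a drop-in: htim2 is an assumed CubeMX-generated TIM2 handle, a DMA request for ADC1 still has to be added in CubeMX, and the prescaler/period values assume an 84 MHz timer clock, giving 4 kHz rather than exactly 4096 Hz):
/* ADC side (in MX_ADC1_Init): one conversion per TIM2 TRGO instead of continuous mode. */
hadc1.Init.ContinuousConvMode    = DISABLE;
hadc1.Init.ExternalTrigConv      = ADC_EXTERNALTRIGCONV_T2_TRGO;
hadc1.Init.ExternalTrigConvEdge  = ADC_EXTERNALTRIGCONVEDGE_RISING;
hadc1.Init.DMAContinuousRequests = ENABLE;

/* TIM2 side: update event at the sample rate, routed to TRGO. */
htim2.Init.Prescaler = 84 - 1;                     /* 84 MHz / 84 = 1 MHz counter clock */
htim2.Init.Period    = 250 - 1;                    /* 1 MHz / 250 = 4 kHz update rate   */
TIM_MasterConfigTypeDef master = {0};
master.MasterOutputTrigger = TIM_TRGO_UPDATE;
HAL_TIMEx_MasterConfigSynchronization(&htim2, &master);

/* Start: the DMA then fills the buffer at exactly the timer rate. */
static uint16_t adcBuffer[SIGNAL_BUFFER_LENGTH];   /* static: must outlive the transfer */
HAL_TIM_Base_Start(&htim2);
HAL_ADC_Start_DMA(&hadc1, (uint32_t*)adcBuffer, SIGNAL_BUFFER_LENGTH);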
I have built a prototype board with an STM8L, and I want to use and configure it as an SPI slave. I am testing it with a Raspberry Pi as the master.
I use the library provided by ST called the "STM8 Standard Peripherals Library" for this, but the documentation is very poor and doesn't explain how to do this...
I can send data from the Raspberry Pi with no issue and receive it on the STM8, but I can't send any data back to the Raspberry Pi from the STM8 on MISO.
Does anybody know how I can send some data back to the Raspberry Pi master? Where is my mistake?
Here is the main code:
void main(void)
{
// GPIO
GPIO_Init(GPIOA, GPIO_Pin_7, GPIO_Mode_Out_PP_Low_Fast);
CLK_Config();
// Set the MOSI and SCK at high level
GPIO_ExternalPullUpConfig(GPIOB, GPIO_Pin_6 | GPIO_Pin_5, ENABLE);
SPI_DeInit(SPI1);
SPI_Init(SPI1, SPI_FirstBit_LSB, SPI_BaudRatePrescaler_2, SPI_Mode_Slave,
SPI_CPOL_Low, SPI_CPHA_2Edge, SPI_Direction_2Lines_FullDuplex,
SPI_NSS_Hard, (uint8_t)0x07);
SPI_BiDirectionalLineConfig(SPI1, SPI_Direction_Tx);
// Enable SPI
SPI_Cmd(SPI1, ENABLE);
/* Infinite loop */
while (1)
{
while(SPI_GetFlagStatus(SPI1, SPI_FLAG_BSY));
// SPI polling
if(SPI_GetFlagStatus(SPI1, SPI_FLAG_RXNE) == SET) {
while(SPI_GetFlagStatus(SPI1, SPI_FLAG_BSY));
GPIO_ToggleBits(GPIOA, GPIO_Pin_7);
uint8_t data = SPI_ReceiveData(SPI1);
while(SPI_GetFlagStatus(SPI1, SPI_FLAG_BSY));
// I can't send back data here, it doesn't work
SPI_SendData(SPI1, 0xFA);
uint8_t test = SPI1->DR;
GPIO_ResetBits(GPIOA, GPIO_Pin_7);
}
}
}
static void CLK_Config(void)
{
/* Select HSI as system clock source */
CLK_SYSCLKSourceSwitchCmd(ENABLE);
CLK_SYSCLKSourceConfig(CLK_SYSCLKSource_HSI);
/* System clock prescaler: divide by 2 */
CLK_SYSCLKDivConfig(CLK_SYSCLKDiv_2);
while (CLK_GetSYSCLKSource() != CLK_SYSCLKSource_HSI)
{}
/* Enable SPI clock */
CLK_PeripheralClockConfig(CLK_Peripheral_SPI1, ENABLE);
}
And the RPi simple code:
#include <iostream>
#include <wiringPi.h>
#include <wiringPiSPI.h>
using namespace std;
int main()
{
    wiringPiSetup();
    wiringPiSPISetup(0, 50000);
    unsigned char data[2] = {0x5A, 0x00};   // two bytes: the second is a dummy that clocks the slave's reply back
    wiringPiSPIDataRW(0, data, 2);
    std::cout << std::hex << (int)data[0] << " " << (int)data[1] << std::endl;
    return 0;
}
Thank you for your help! :)
Edit: I think the mistake is in the µC code, because the SPI data register still contains the data sent by the master after I read it. I can't change it, even by trying to write directly to the register.
Also: is it normal that the device only has one data register for SPI? How is it supposed to be full duplex if there isn't one register for MOSI (RX) and one for MISO (TX)? I think there is something I don't understand about SPI. I am not very experienced with this serial protocol; I have mainly used I2C before.
SPI requires the master to provide the clock. If you want the slave to send something, your master has to send some dummy data to generate the clock for the slave.
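In practice that means two things (a sketch, not tested on your exact setup): the slave preloads its data register before the master starts the next byte, and the master clocks out an extra dummy byte whose only purpose is to shift the slave's reply back:
/* Master (Raspberry Pi, wiringPi): byte 0 carries the command, byte 1 is a dummy. */
unsigned char buf[2] = { 0x5A, 0x00 };
wiringPiSPIDataRW(0, buf, 2);          /* after the call, buf[1] holds the slave's preloaded reply */

/* Slave (STM8, SPL): load the reply as soon as the command byte has been read. */
uint8_t cmd = SPI_ReceiveData(SPI1);
SPI_SendData(SPI1, 0xFA);              /* shifted out while the master sends its dummy byte */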
I finally found where my mistakes were.
First, I forgot to configure a pull-up resistor on the MISO pin:
// Enable the pull-ups on MOSI, SCK and MISO
GPIO_ExternalPullUpConfig(GPIOB, GPIO_Pin_6 | GPIO_Pin_5 | GPIO_Pin_7, ENABLE);
Next, the SPI configuration was wrong: the RPi was set to MSB first and the STM8 to LSB first, and the phase was on the second edge when it needed to be on the first edge:
SPI_Init(SPI1, SPI_FirstBit_MSB, SPI_BaudRatePrescaler_2, SPI_Mode_Slave,
SPI_CPOL_Low, SPI_CPHA_1Edge, SPI_Direction_2Lines_FullDuplex,
SPI_NSS_Hard, (uint8_t)0x07);
Finally, not a mistake but a sub-optimal way to test: I was sending 0x81 from the master, which is symmetric in binary (0b10000001). I should have sent an asymmetric byte, for example 0x17 (0b00010111), so that bit-order problems would be visible.
And the complete STM8 code:
#include "stm8l15x.h"
#include "stm8l15x_it.h" /* SDCC patch: required by SDCC for interrupts */
static void CLK_Config(void);
void Delay(__IO uint16_t nCount);
void main(void)
{
// GPIO
GPIO_Init(GPIOA, GPIO_Pin_7, GPIO_Mode_Out_PP_Low_Fast);
CLK_Config();
// Enable the pull-ups on MOSI, SCK and MISO (I added the MISO one)
GPIO_ExternalPullUpConfig(GPIOB, GPIO_Pin_6 | GPIO_Pin_5 | GPIO_Pin_7, ENABLE);
SPI_DeInit(SPI1);
SPI_Init(SPI1, SPI_FirstBit_MSB, SPI_BaudRatePrescaler_2, SPI_Mode_Slave,
SPI_CPOL_Low, SPI_CPHA_1Edge, SPI_Direction_2Lines_FullDuplex,
SPI_NSS_Hard, (uint8_t)0x07);
SPI_BiDirectionalLineConfig(SPI1, SPI_Direction_Tx);
// Enable SPI
SPI_Cmd(SPI1, ENABLE);
/* Infinite loop */
while (1)
{
while(SPI_GetFlagStatus(SPI1, SPI_FLAG_BSY));
// SPI polling
if(SPI_GetFlagStatus(SPI1, SPI_FLAG_RXNE) == SET) {
// maybe this line is not necessary, I didn't have the time to test without it yet
while(SPI_GetFlagStatus(SPI1, SPI_FLAG_BSY));
uint8_t data = SPI_ReceiveData(SPI1);
while(SPI_GetFlagStatus(SPI1, SPI_FLAG_RXNE));
if(data==0x82) SPI_SendData(SPI1, 0xCD);
GPIO_ResetBits(GPIOA, GPIO_Pin_7);
}
}
}
/* Private functions ---------------------------------------------------------*/
static void CLK_Config(void)
{
/* Select HSI as system clock source */
CLK_SYSCLKSourceSwitchCmd(ENABLE);
CLK_SYSCLKSourceConfig(CLK_SYSCLKSource_HSI);
/* System clock prescaler: divide by 2 */
CLK_SYSCLKDivConfig(CLK_SYSCLKDiv_2);
while (CLK_GetSYSCLKSource() != CLK_SYSCLKSource_HSI)
{}
/* Enable SPI clock */
CLK_PeripheralClockConfig(CLK_Peripheral_SPI1, ENABLE);
}
void Delay(__IO uint16_t nCount)
{
/* Decrement nCount value */
while (nCount != 0)
{
nCount--;
}
}
/*******************************************************************************/
#ifdef USE_FULL_ASSERT
void assert_failed(uint8_t* file, uint32_t line)
{
/* Infinite loop */
while (1)
{
}
}
#endif
PS:
I am on Linux and the official software tools are not available for my OS, so I used some alternative tools to be able to develop for it.
I think this can be useful for some people, so I add them here:
First, the library would not compile with SDCC, so I used the patch I found here:
https://github.com/gicking/STM8-SPL_SDCC_patch
To upload to the µC, I use stm8flash with an ST-LINK V2:
https://github.com/vdudouyt/stm8flash
I also had some trouble finding the library for the STM8L. Here it is:
https://www.st.com/en/embedded-software/stsw-stm8016.html
PS2:
I understand that it is not easy to answer hardware-related questions. Does anybody know of websites that are more specialized in this kind of question?
I have a TM4C123 processor acting as an I2C master and an ESP8266 as a slave. For the ESP I am using the Arduino IDE with ESP8266 support installed at version 2.5.2, which should support I2C slave mode. However, I can't get it to work. Even with the Arduino slave_receiver example, the slave does not acknowledge (ACK) the master's requests, which I am watching on a scope.
To make sure I use the right address at least once, I implemented an address sweep on the master. And to make sure I am using the right pins on the ESP, I first implemented master mode on the ESP and did a pin sweep against a known I2C slave device. So I am fairly certain neither of these is the problem.
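For anyone who wants to reproduce the address sweep, the usual Wire-based scanner looks like the sketch below (shown in Arduino form for brevity; my real master is the TM4C123, so this is just the equivalent idea):
#include <Wire.h>

void setup() {
  Serial.begin(115200);
  Wire.begin();                          // master mode
  for (uint8_t addr = 1; addr < 127; addr++) {
    Wire.beginTransmission(addr);
    if (Wire.endTransmission() == 0) {   // 0 means a slave ACKed this address
      Serial.print("Device found at 0x");
      Serial.println(addr, HEX);
    }
  }
}

void loop() {}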
I am using the Olimex Mod-wifi board, with SDA Pin 12 and SCL Pin 13 (Schematic here)
Can someone help me on this? Here is my code:
// Wire Slave Receiver
// by devyte
// based on the example by Nicholas Zambetti <http://www.zambetti.com>
// Demonstrates use of the Wire library
// Receives data as an I2C/TWI slave device
// Refer to the "Wire Master Writer" example for use with this
// This example code is in the public domain.
#include <Wire.h>
#define SDA_PIN 12
#define SCL_PIN 13
const int16_t I2C_SLAVE = 0x12;
void setup() {
Serial.begin(115200); // start serial for output
Wire.begin(SDA_PIN, SCL_PIN, I2C_SLAVE); // new syntax: join i2c bus (address required for slave)
Wire.onReceive(receiveEvent); // register event
}
void loop() {
delay(1000);
Serial.println("Loop");
}
// function that executes whenever data is received from master
// this function is registered as an event, see setup()
void receiveEvent(size_t howMany) {
(void) howMany;
while (1 < Wire.available()) { // loop through all but the last
char c = Wire.read(); // receive byte as a character
Serial.print(c); // print the character
}
int x = Wire.read(); // receive byte as an integer
Serial.println(x); // print the integer
}
I had a comparable issue with the ESP8266. Sending data from the Arduino Nano (slave) to the ESP8266 (master) via I²C was no problem. But when I swapped the roles (Arduino Nano = master and ESP8266 = slave) the Wire example didn't work.
My workaround for this issue was to reduce the I²C clock frequency from 100 kHz to about 20 kHz.
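On an Arduino-style master the clock can be lowered right after Wire.begin() (a sketch of the workaround; setClock() is the standard Wire call):
#include <Wire.h>

void setup() {
  Wire.begin();            // join the bus as master
  Wire.setClock(20000);    // drop SCL from the default 100 kHz to about 20 kHz
}

void loop() {}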
I want to use the hardware RX pin of the Arduino as an interrupt pin. If any data is available on the RX pin, an interrupt should be generated and call a callback function that reads the incoming serial data. I don't want my loop() function constantly reading the serial port. I am using this code, but my interrupt is not triggered. I also tried removing digitalPinToInterrupt(), but got no response.
#include <SoftwareSerial.h>
const byte interruptPin = 0;//In arduino MEGA RX 19. TX 18
String msg = "";//Incomming message
#define Line_RX 3 //UART RX
#define Line_TX 2 //UART TX
SoftwareSerial mySerial (Line_TX, Line_RX); //initialize software serial
void setup() {
// put your setup code here, to run once:
Serial.begin(19200);
mySerial.begin(19200);
attachInterrupt(digitalPinToInterrupt(interruptPin), serial_read, HIGH);
}//end setup
void loop() {
// put your main code here, to run repeatedly:
}//end loop
void serial_read(){
char _bite;
sei();//Disable hardware interrupts for a moment
while(Serial.available()>0){
delay(1);//Do not delete this delay
if(Serial.available()>0){
_bite = (char)Serial.read();
msg += _bite;
if(_bite == '\n'){
mySerial.print(msg);//Do what you print your message
msg = "";//Clean message for new one
break;
}//end if
}//end if
}//end while
cli();//re-enabling hardware interrupts
}//end serial_read
sei() ENABLES interrupts, while cli() disables them. Your comments suggest you have them backwards. Perhaps there are other problems, but these instructions are certainly not consistent with your intentions.
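For clarity, the correct pairing is the opposite of the comments in your code (and note that on AVR, interrupts are already disabled while an attachInterrupt handler runs, so inside the handler you usually need neither call):
#include <avr/interrupt.h>

void critical_section_example(void) {
  cli();   // DISABLES interrupts
  // ... touch data shared with an ISR here ...
  sei();   // re-ENABLES interrupts
}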
If you want to get lower-level, consider a pure interrupt-driven design like:
ISR (USART0_UDRE_vect)
{
// Send next byte and increment pointer
UDR0 = *ub_outptr++;
// Pointer wrapping
if (ub_outptr >= UART_buffer + BUFF_SIZE)
ub_outptr = UART_buffer;
// If buffer is empty: disable interrupt
if(--ub_buffcnt == 0)
UCSR0B &= ~(1 << UDRIE0);
}
I know this takes you out of the Arduino library stuff, so this may not be ideal for you. But it works (the example is for sending data, as I have an active project where the microcontroller sends data to an LCD display. Just an example in AVR-GCC C.)
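For completeness, the buffer bookkeeping the ISR above relies on, plus how a transfer gets kicked off, could look like this (the names mirror the ISR; the buffer size is arbitrary):
#include <avr/io.h>
#include <avr/interrupt.h>

#define BUFF_SIZE 64

static uint8_t  UART_buffer[BUFF_SIZE];            // circular TX buffer filled by the main code
static uint8_t *ub_outptr  = UART_buffer;          // next byte the ISR will send
static volatile uint8_t ub_buffcnt = 0;            // bytes still waiting (shared with the ISR)

static void uart_start_tx(void)
{
    UCSR0B |= (1 << UDRIE0);   // enable the UDRE interrupt; the ISR then drains the buffer
}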
Try my NeoHWSerial. It is a modified version of the Arduino core class, HardwareSerial, the class used for Serial, Serial1, etc. It adds the ability to register a callback for each received character.
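If I remember the NeoHWSerial API correctly (this is from memory, so please check the library's README), registering the callback looks roughly like this; the handler runs in the RX interrupt, so it should only stash the byte and leave the real work to loop():
#include <NeoHWSerial.h>

volatile uint8_t lastChar;

static void handleRxChar(uint8_t c) {   // called from the RX ISR for every received byte
  lastChar = c;                         // keep the ISR short; process the data in loop()
}

void setup() {
  NeoSerial.attachInterrupt(handleRxChar);
  NeoSerial.begin(19200);
}

void loop() {
  // consume lastChar / your own ring buffer here
}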