I am currently using the AWS IoT SDK for the Arduino Yun, and I am running the example sketches (specifically the Thermostat simulator).
I modified the loop slightly (just changed the delay at the end) such that it looks like this:
void loop() {
  if (success_connect) {
    // If the desired temperature is set to a higher value, start heating.
    if (desiredTemp - reportedTemp > 0.001) { reportedTemp += 0.1; }
    // If the desired temperature is set to a lower value, start cooling.
    else if (reportedTemp - desiredTemp > 0.001) { reportedTemp -= 0.1; }
    dtostrf(reportedTemp, 4, 1, float_buf);
    float_buf[4] = '\0';
    sprintf_P(JSON_buf, PSTR("{\"state\":{\"reported\":{\"Temp\":%s}}}"), float_buf);
    print_log("shadow update", myClient.shadow_update(AWS_IOT_MY_THING_NAME, JSON_buf, strlen(JSON_buf), NULL, 5));
    if (myClient.yield()) {
      Serial.println("Yield failed.");
    }
    delay(10000); // CHANGED TO 10 SECONDS INSTEAD OF 1
  }
}
For some reason, when the loop is delayed for less than 10 seconds, the shadow update has no problem repeating. But as soon as the delay is 10 seconds or more, I get this error:
Exception in thread Thread-4 (most likely raised during interpr
[ERR] command: shadow update code: -1
I changed CMD_TIME_OUT to a higher value in the aws_iot_config_SDK.h file, but it still seems to have no effect. The reason I'd like the delay to be longer than 10 seconds is that I ideally want a trigger mechanism on the Arduino that would update the shadow.
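One workaround I am considering (just a sketch on my end, assuming the problem is that nothing services the MQTT connection while loop() sleeps): split the long wait into short slices and call myClient.yield() in each slice. The 500 ms slice length and the helper name wait_with_yield are my own, not something from the SDK.

// Sketch only: wait in short slices and keep calling yield() so the
// connection is serviced during the wait; the slice length is arbitrary.
void wait_with_yield(unsigned long total_ms) {
  for (unsigned long waited = 0; waited < total_ms; waited += 500) {
    if (myClient.yield()) {
      Serial.println("Yield failed.");
    }
    delay(500);
  }
}

In loop() I would then call wait_with_yield(10000); instead of delay(10000);.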
I am testing an application's latency during UDP communication on Windows 10.
I send a message every second and receive a response that the remote sends back immediately.
Send thread
It runs once every second.
auto start = std::chrono::system_clock::now();
unsigned int count = 1;
while (destroyFlag.load(std::memory_order_acquire) == false)
{
    if (isReady() == false)
    {
        break;
    }
    /* to do */
    worker_();
    std::this_thread::sleep_until(start + std::chrono::milliseconds(interval_) * count++);
}
worker_()
The send thread calls this; it just sends the message and builds a log string.
socket_.send(address_);
logger_.log("," + std::string("Send") + "\n");
Receiver
When a message arrives, the receiver creates a receive log string and flushes it to a file.
auto& queueData = socket_.getQueue();
while (queueData.size() > 0)
{
    auto str = queueData.dequeue();
    logger_.log(",Receive" + str + "\n");
    logger_.flush();
}
I've been testing it overnight and I can't figure out why I got this result.
(Chart of the measured latency in microseconds; x-axis: Hour_Minute_Second, y-axis: microseconds.)
For a few hours it seemed to work as expected, but after that the measured times gradually drifted, ending up offset as if the clock had moved to a different time zone.
Does anyone know why this is happening?
Using std::chrono::steady_clock fixed it; it made my charts straight.
Another way is to turn off Windows' automatic time synchronization.
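For reference, a minimal sketch of the send loop paced with steady_clock (same structure as the loop above, reusing interval_, worker_ and destroyFlag from the question; the isReady() check is omitted for brevity). steady_clock is monotonic, so NTP adjustments to the wall clock cannot shift it:

// Sketch: only the clock changes; the rest of the loop stays the same.
// steady_clock never jumps when Windows resynchronizes the wall clock.
auto start = std::chrono::steady_clock::now();
unsigned int count = 1;
while (destroyFlag.load(std::memory_order_acquire) == false)
{
    worker_();
    std::this_thread::sleep_until(start + std::chrono::milliseconds(interval_) * count++);
}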
I am trying to get a pulse of 100us to occur 4 times a second through GPIO. The way I am doing this is by having two timer-based interrupts; one that triggers 4 times every second, and another that gets triggered 100us after the first.
Within the interrupt handler of the first timer, the target pin is set high, the second timer is reset, and interrupts on the second timer are enabled. Within the second interrupt handler, the target pin is set low and interrupts are disabled. Here is what my code looks like:
First timer's ISR:
void TIM4_IRQHandler(void)
{
    {
        TIM4->SR = ~(TIM_SR_UIF); // clear UIF flag
        HAL_GPIO_WritePin(GPIOB, GPIO_PIN_14, GPIO_HIGH); // target pin
        endTrigger->restartTimer();
        endTrigger->enableInterrupts();
    }
}
Second Timer's ISR:
void TIM5_IRQHandler(void)
{
    {
        TIM5->SR = ~(TIM_SR_UIF); // clear UIF flag
        HAL_GPIO_WritePin(GPIOB, GPIO_PIN_14, GPIO_LOW); // target pin
        endTrigger->disableInterrupts();
    }
}
Restart timer function:
void Timer::restartTimer() {
    myhTim->CR1 &= ~TIM_CR1_CEN; // disable the timer
    myhTim->CNT = 0;             // reset count
    myhTim->SR = 0;              // clear any interrupt flags
    myhTim->CR1 = TIM_CR1_CEN;   // re-engage timer
}
For whatever reason, the second I write to CR1 I get a hard fault. Any idea why? I am aware that there are other approaches to generating a 100us pulse, but this seemed to be the simplest one for our needs: we aren't going to need the additional timer for anything else, and we will need to sync the pulse to an external piece of hardware semi-frequently.
It turned out the timer interrupt fired immediately after the first timer was initialized. I had to add a line of code to my second IRQ handler so that it only attempts to touch the second timer when the pointer is not a nullptr.
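For illustration, a sketch of that guard applied to the second timer's ISR (assuming endTrigger is a global Timer* that stays nullptr until the second timer object has been constructed; GPIO_LOW is the macro from the question):

void TIM5_IRQHandler(void)
{
    TIM5->SR = ~(TIM_SR_UIF); // clear UIF flag
    HAL_GPIO_WritePin(GPIOB, GPIO_PIN_14, GPIO_LOW); // target pin
    if (endTrigger != nullptr) // the ISR can fire before endTrigger exists
    {
        endTrigger->disableInterrupts();
    }
}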
I am trying to connect my TM4C123GH6PM microcontroller from Texas Instruments to my smartphone and use it to control an alarm clock and LED lights (the LEDs are switched by a transistor, which is driven by a GPIO pin).
I have some experience with coding in C++ and the TM4C123GH6PM, but I am still learning a lot. So please excuse some foolish mistakes I might have made.
I want to connect the ESP8266 with the Microcontroller using UART and the TivaWare Framework.
I have written some code and my UART works correctly (I tested it by sending chars from UART 4 to 3).
According to the ESP8266 AT command set, it should respond to "AT" with "OK". But whenever I send something to the ESP, it responds with exactly what I sent to it. I checked the wiring, and I don't think that's the issue, but please correct me if the wiring is wrong.
ESP -> TM4C123GH6PM:
GND -> GND
VCC -> 3.3V
Tx -> Rx (UART3 / PC6)
Rx -> Tx (UART4 / PC5)
CH_PD -> 3.3V
I also checked the power consumption of the ESP. Everything is powered from the USB port of my laptop, since that helps keep the cable mess down. I monitor the power consumption with this meter (https://www.amazon.de/gp/product/B07C8CM5TG/ref=ppx_yo_dt_b_asin_title_o08_s00?ie=UTF8&psc=1). The ESP draws about 150mA from the computer, but the port can provide a lot more; I checked with some LEDs, and 400mA is not a problem.
Can anyone help me? I have been working on this for over two days now and can't find a solution. Why is the ESP not responding correctly to the AT command? The blue light is on while the code is running.
PS: The attached code also contains the alarm clock and LED control. I included it since it could be part of the problem, but some of it is commented out and most of it is not used.
#include <stdint.h>
#include <stdbool.h>
#include "inc/hw_ints.h"
#include "inc/hw_memmap.h"
#include "inc/hw_types.h"
#include "driverlib/gpio.h"
#include "driverlib/sysctl.h"
#include "driverlib/timer.h"
#include "driverlib/interrupt.h"
#include "driverlib/uart.h"
#include "driverlib/pin_map.h"
#include "driverlib/rom.h"
// stores the time since system start in ms
uint32_t systemTime_ms;
// bools for controlling the alarm clock and LEDs
bool an_aus = false;
bool alarm_clock = false;
void InterruptHandlerTimer0A (void)
{
// Clear the timer interrupt flag to avoid calling it up again directly
TimerIntClear(TIMER0_BASE, TIMER_TIMA_TIMEOUT);
// increase the ms counter by 1 ms
systemTime_ms++;
}
void clockSetup(void)
{
uint32_t timerPeriod;
//configure clock
SysCtlClockSet(SYSCTL_SYSDIV_5|SYSCTL_USE_PLL|SYSCTL_XTAL_16MHZ| SYSCTL_OSC_MAIN);
//activate peripherals for the timer
SysCtlPeripheralEnable(SYSCTL_PERIPH_TIMER0);
// configure timers as 32 bit timers in periodic mode
TimerConfigure(TIMER0_BASE, TIMER_CFG_PERIODIC);
// set the variable timerPeriod to the number of periods to generate a timeout every ms
timerPeriod = (SysCtlClockGet()/1000);
// pass the variable timerPeriod to the TIMER-0-A
TimerLoadSet(TIMER0_BASE, TIMER_A, timerPeriod-1);
// register the InterruptHandlerTimer0A function as an interrupt service routine
TimerIntRegister(TIMER0_BASE, TIMER_A, &(InterruptHandlerTimer0A));
// activate the interrupt on TIMER-0-A
IntEnable(INT_TIMER0A);
// generate an interrupt when TIMER-0-A generates a timeout
TimerIntEnable(TIMER0_BASE, TIMER_TIMA_TIMEOUT);
// all interrupts are activated
IntMasterEnable();
// start the timer
TimerEnable(TIMER0_BASE, TIMER_A);
}
void UART (void)
{
//configure UART 4:
SysCtlPeripheralEnable(SYSCTL_PERIPH_GPIOC);
SysCtlPeripheralEnable(SYSCTL_PERIPH_UART4);
while(!SysCtlPeripheralReady(SYSCTL_PERIPH_UART4));
//GPIO pins for transmitting and receiving
GPIOPinConfigure(GPIO_PC4_U4RX);
GPIOPinConfigure(GPIO_PC5_U4TX);
GPIOPinTypeUART(GPIO_PORTC_BASE, GPIO_PIN_4 | GPIO_PIN_5);
//configure UART: 8 bit, no parity, baud rate 38400
UARTConfigSetExpClk(UART4_BASE, SysCtlClockGet(), 38400, (UART_CONFIG_WLEN_8 | UART_CONFIG_STOP_ONE | UART_CONFIG_PAR_NONE));
//configure UART 3:
SysCtlPeripheralEnable(SYSCTL_PERIPH_UART3);
while(!SysCtlPeripheralReady(SYSCTL_PERIPH_UART3));
GPIOPinConfigure(GPIO_PC6_U3RX);
GPIOPinConfigure(GPIO_PC7_U3TX);
GPIOPinTypeUART(GPIO_PORTC_BASE, GPIO_PIN_6 | GPIO_PIN_7);
UARTConfigSetExpClk(UART3_BASE, SysCtlClockGet(), 38400, (UART_CONFIG_WLEN_8 | UART_CONFIG_STOP_ONE | UART_CONFIG_PAR_NONE));
}
void delay_ms(uint32_t waitTime)
{
// Saves the current system time in ms
uint32_t aktuell = systemTime_ms;
// Wait until the current system time corresponds to the sum of the time at the start of the delay and the waiting time
while(aktuell + waitTime > systemTime_ms);
}
void ex_int_handler(void)
{
// press the button to start timer for alarm clock
alarm_clock = true;
GPIOIntClear(GPIO_PORTF_BASE,GPIO_PIN_4);
}
int main(void)
{
//Peripherals for LED and GPIO
SysCtlPeripheralEnable(SYSCTL_PERIPH_GPIOB);
SysCtlPeripheralEnable(SYSCTL_PERIPH_GPIOF);
//UART
UART();
//Timer
clockSetup();
// button
GPIOPinTypeGPIOInput(GPIO_PORTF_BASE,GPIO_PIN_4);
GPIOPadConfigSet(GPIO_PORTF_BASE,GPIO_PIN_4,GPIO_STRENGTH_2MA,GPIO_PIN_TYPE_STD_WPU);
//OnboardLED
GPIOPinTypeGPIOOutput(GPIO_PORTF_BASE,GPIO_PIN_1);
GPIOPinTypeGPIOOutput(GPIO_PORTF_BASE,GPIO_PIN_3);
//Interrupt Timer
GPIOIntDisable(GPIO_PORTF_BASE,GPIO_PIN_4);
GPIOIntClear(GPIO_PORTF_BASE,GPIO_PIN_4);
GPIOIntTypeSet(GPIO_PORTF_BASE,GPIO_PIN_4,GPIO_FALLING_EDGE);
GPIOIntRegister(GPIO_PORTF_BASE,ex_int_handler);
GPIOIntEnable(GPIO_PORTF_BASE,GPIO_PIN_4);
//Transistor Gate
GPIOPinTypeGPIOOutput(GPIO_PORTB_BASE,GPIO_PIN_0);
//GPIOPadConfigSet(GPIO_PORTB_BASE,GPIO_PIN_0,GPIO_STRENGTH_6MA,GPIO_PIN_TYPE_STD_WPU);
//debugging only: save all the received data from the ESP in an array to look at while debugging
int32_t data[20] = {0};
int32_t j = 0;
//Code for debugging the UART and ESP8266
while(1){
//Checks for Data in the FIFO
while(!UARTCharsAvail(UART4_BASE));
//send AT-command to ESP8266
UARTCharPut(UART4_BASE, 'A');
while(UARTBusy(UART4_BASE));
UARTCharPut(UART4_BASE, 'T');
while(UARTBusy(UART4_BASE));
if(UARTCharsAvail(UART3_BASE))
{
while(UARTCharsAvail(UART3_BASE))
{
//Read data from the FIFO in UART3 -> received from ESP8266
data[j] = UARTCharGet(UART3_BASE);
j++;
}
}
//clear the array when it's full
if (j >= 20)
{
j = 0;
for(int32_t a = 0; a < 20; a++)
{
data[a] = 0;
}
}
}
//code to run the alarm clock and leds
/*
while(1)
{
if (alarm_clock)
{
GPIOPinWrite(GPIO_PORTF_BASE,GPIO_PIN_3,GPIO_PIN_3);
//Wait
delay_ms(30600000);
GPIOPinWrite(GPIO_PORTB_BASE,GPIO_PIN_0,GPIO_PIN_0);
alarm_clock = false;
GPIOPinWrite(GPIO_PORTF_BASE,GPIO_PIN_3,0x00);
//Start Red LED blinking when it is finished
while(1)
{
GPIOPinWrite(GPIO_PORTF_BASE,GPIO_PIN_1,GPIO_PIN_1);
delay_ms(1000);
GPIOPinWrite(GPIO_PORTF_BASE,GPIO_PIN_1,0x00);
delay_ms(1000);
}
}
}
*/
}
According to the ESP8266 AT command set, it should respond to "AT" with "OK". But whenever I send something to the ESP, it responds with exactly what I sent to it.
Modems with AT command sets commonly ship with echo mode turned on, so that when you interact with them manually over a serial port, they first echo back the characters you sent and then send the reply.
So when you automate the process, you send the characters and then wait for the reply, reading until you reach a '\r'. You are indeed reaching a '\r', but it's the one from the echo, and the actual reply follows after it: you send AT, you receive AT back first, and then the OK.
To solve this problem, you should turn echo mode off.
The command to turn off echo is ATE0.
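A sketch of how this could look with the question's setup (transmit on UART4, receive on UART3); send_at is a hypothetical helper, and note that AT commands also have to be terminated with "\r\n", which the code in the question never sends:

//Sketch only: send a complete AT command, terminated with CR+LF,
//on UART4 (which is wired to the ESP's RX in the question).
void send_at(const char *cmd)
{
    while (*cmd)
    {
        UARTCharPut(UART4_BASE, *cmd++);
        while (UARTBusy(UART4_BASE));
    }
    UARTCharPut(UART4_BASE, '\r'); //AT commands end with CR+LF
    UARTCharPut(UART4_BASE, '\n');
    while (UARTBusy(UART4_BASE));
}

Calling send_at("ATE0"); once and then send_at("AT"); should leave just "OK" (plus CR/LF framing) to read back on UART3 with the existing UARTCharGet loop.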
I am trying to use an mbed LPC1768 to interface with a uCAM-III camera module. This requires a specific exchange of bytes between the mbed and the camera to sync correctly. The datasheet specifies that this can take up to 60 attempts to succeed.
I'm using the library found on this page, https://os.mbed.com/users/ms523/notebook/ucam-development/, to help with the syncing process. It doesn't seem to sync correctly, though: I get a response timeout in the Get_Response function. I added some printfs to help with debugging and discovered that it runs the sync attempt once, returns to the start of the loop, executes the first printf statement there, but then stops, and I cannot figure out why.
This is the output I get on the terminal. Even though there is no code between the "Trying to sync time %i" and "Sending sync chars to uCAM" printfs, the second time around the loop it just stops; no more lines are printed after this.
Trying to sync time 0
Sending sync chars to uCAM
Response Timeout
Sync failed - trying again
Trying to sync time 1
int uCam::Sync()
{
    // This will give 60 attempts to sync with the uCam module
    for (int i=0; i<60; i++) {
        printf("\n\rTrying to sync time %i ", i);
        // Send out the sync command
        printf("\n\rSending sync chars to uCAM");
        for (int j=0; j<6; j++) {
            _uCam.putc(SYNC[j]);
        }
        // Check if the response was an ACK
        if (Get_Response(_ACK,SYNC[1])) {
            printf("\n\rReceived ACK");
            // It was an ACK so now get the next response - it should be a sync
            if (Get_Response(_SYNC,0x00)) {
                printf("\n\rGot ACK");
                // We need a small delay (1ms) from receiving the SYNC response and sending an ACK in return
                wait(0.001);
                printf("\n\rSending ACK for ACK");
                for (int k=0; k<6; k++) {
                    _uCam.putc(ACK[k]);
                }
                // Everything is now complete so return true
                printf("\n\rSynced");
                return(1);
            }
        }
        // Wait a while and try again
        printf("\n\rSync failed - trying again");
    }
    printf("\n\rExiting sync function");
    // Something went wrong so return false
    return(0);
}
I'm implementing RS485 on an ARM development board, using a serial port and a GPIO pin for data enable.
I'm setting data enable to high before sending and I want it to be set low after transmission is complete.
It can be simply done by writing:
//fd = open("/dev/ttyO2", ...);
DataEnable.Set(true);
write(fd, data, datalen);
tcdrain(fd); //Wait until all data is sent
DataEnable.Set(false);
I wanted to change from blocking mode to non-blocking and use poll() with the fd, but I don't see any poll event corresponding to 'transmission complete'.
How can I get notified when all data has been sent?
System: linux
Language: c++
Board: BeagleBone Black
I don't think it's possible. You'll either have to run tcdrain in another thread and have it notify the main thread, or use a timeout on poll and then check whether the output has been drained.
You can use the TIOCOUTQ ioctl to get the number of bytes in the output buffer and tune the timeout according to baud rate. That should reduce the amount of polling you need to do to just once or twice. Something like:
enum { writing, draining, idle } write_state;

while (1) {
    int write_event, timeout = -1;
    ...
    if (write_state == writing) {
        poll_fds[poll_len].fd = write_fd;
        poll_fds[poll_len].events = POLLOUT;
        write_event = poll_len++;
    } else if (write_state == draining) {
        int outq;
        ioctl(write_fd, TIOCOUTQ, &outq);
        if (outq == 0) {
            DataEnable.Set(false);
            write_state = idle;
        } else {
            // 10 bits per byte, 1000 milliseconds in a second
            timeout = outq * 10 * 1000 / baud_rate;
            if (timeout < 1) {
                timeout = 1;
            }
        }
    }
    int r = poll(poll_fds, poll_len, timeout);
    ...
    if (write_state == writing && r > 0 && (poll_fds[write_event].revents & POLLOUT)) {
        DataEnable.Set(true); // Gets set even if already set.
        int n = write(write_fd, write_data, write_datalen);
        write_data += n;
        write_datalen -= n;
        if (write_datalen == 0) {
            write_state = draining;
        }
    }
}
Stale thread, but I have been working on RS-485 with a 16550-compatible UART under Linux and find that
tcdrain works, but it adds a delay of 10 to 20 ms; it seems to be implemented by polling.
The value returned by TIOCOUTQ seems to count bytes in the OS buffer, but NOT bytes in the UART FIFO, so it may underestimate the delay required if transmission has already started.
I am currently using CLOCK_MONOTONIC to timestamp each send and calculate when the transmission should be complete, then checking that time before the next send and delaying if necessary. It's ugly, but it seems to work.
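A rough sketch of that bookkeeping (assuming 8N1 framing, i.e. 10 bits on the wire per byte, and a known baud rate; the helper names are mine, not from the original code):

#include <stdint.h>
#include <time.h>

// Sketch only: after each write, estimate when the last bit should leave
// the wire and remember it; wait for that moment before dropping the
// data-enable line or starting the next send.
static struct timespec tx_done; // when the current transmission should finish

void note_send(long bytes_written, long baud_rate)
{
    clock_gettime(CLOCK_MONOTONIC, &tx_done);
    int64_t ns = (int64_t)bytes_written * 10 * 1000000000LL / baud_rate;
    tx_done.tv_sec  += ns / 1000000000LL;
    tx_done.tv_nsec += ns % 1000000000LL;
    if (tx_done.tv_nsec >= 1000000000L) {
        tx_done.tv_sec  += 1;
        tx_done.tv_nsec -= 1000000000L;
    }
}

void wait_until_sent(void)
{
    // Absolute sleep on the monotonic clock; returns immediately if the
    // estimated completion time has already passed.
    clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &tx_done, NULL);
}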