ATtiny85 PWM lower frequency than expected - avr-gcc

I am new to programming MCUs and am trying to use PWM on an ATtiny85. I have looked around online at various tutorials and have managed to get it working with the code below. My problem is that I expect the PWM frequency to be 8 MHz / 256 = 31.25 kHz, but it is actually about 3.9 kHz, which suggests a prescale factor of 8 is in effect somewhere (8 MHz / 8 / 256 ≈ 3.9 kHz). I have looked at the datasheet but still can't figure it out.
#define F_CPU 8000000
#include <avr/io.h>

int main(void)
{
    //Table 11-3 Compare Output Mode, Fast PWM Mode
    //Table 11-5 Waveform Generation Mode Bit Description
    //COM0A1 set, COM0A0 clear: non-inverting output on OC0A
    //(WGM02 lives in TCCR0B, not TCCR0A; leaving it 0 gives Fast PWM with TOP = 0xFF)
    TCCR0A |= (1<<WGM01)|(1<<WGM00)|(1<<COM0A1);
    //Table 11-6 Clock Select Bit Description
    TCCR0B |= (1<<CS00); //pre-scale factor = 1
    OCR0A = 128; //approximately 0.5 duty cycle
    //make the PWM pin (OC0A/PB0) an output
    DDRB |= (1<<DDB0);
    while(1){}
    return 0;
}
I am programming the MCU using a Raspberry Pi with avrdude and avr-gcc as per this instructable: http://www.instructables.com/id/Programming-the-ATtiny85-from-Raspberry-Pi/
Any help or suggestions would be greatly appreciated.
Thanks :)

I discovered that by default the fuses on the ATtiny85 are set to divide the 8 MHz clock by 8, which accounts for the apparent prescale factor of 8 that I encountered. I changed the fuses according to this fuse calculator and it worked perfectly. It is strange that none of the tutorials I read mentioned this, but hopefully my struggles can help someone else who has the same problem.
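For anyone who would rather not touch the fuses: the CKDIV8 fuse only sets the initial value of the system clock prescaler, so the division can also be undone at run time by writing CLKPR with the datasheet's timed sequence. A minimal sketch (avr-libc also wraps this as clock_prescale_set() in <avr/power.h>):

#include <avr/io.h>

// Call early in main(), before configuring the timers.
// CLKPCE must be set first, and the new prescaler value written
// within four clock cycles (ATtiny85 datasheet, "System Clock Prescaler").
static void clock_prescale_to_1(void)
{
    CLKPR = (1 << CLKPCE); // enable prescaler change
    CLKPR = 0;             // prescaler = 1 -> full 8 MHz
}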

Related

Switching the GPIO of a Raspberry Pi at a fixed frequency

I have a 208- to 232-bit-long binary signal that I want to send via a GPIO of the Raspberry Pi.
The delay between the bits needs to be constant.
How can I achieve this?
The simplest solution that came to my mind was this (pseudocode):
send(gpio, message, delay){
    for(int i = 0; i < length(message); i++){
        if (message[i] == 1){
            gpio.high()
        }
        else{
            gpio.low()
        }
        sleep(delay)
    }
}
But the frequency at which I want to send this message is around 40 kHz, so the delay between two bits is only 25 us.
How can I ensure the delay is exactly and consistently that long?
In userspace there is no way to guarantee that anything happens in "real time".
Because of this I decided to use a separate microcontroller that is responsible for the time-critical work and communicates with the Raspberry Pi via I2C or UART.
This way the Raspberry Pi can make high-level decisions and show animations to a user while still being able to send messages over ultrasound.
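For illustration, the time-critical loop on the microcontroller could look roughly like this (an untested sketch; the AVR target, 8 MHz clock, and output pin PB0 are assumptions):

#define F_CPU 8000000UL
#include <avr/io.h>
#include <util/delay.h>

// Clocks out one bit every ~25 us (~40 kHz) on PB0.
// Interrupts should be disabled around the call for constant timing.
void send_bits(const uint8_t *bits, uint16_t n)
{
    for (uint16_t i = 0; i < n; i++) {
        if (bits[i])
            PORTB |= (1 << PB0);
        else
            PORTB &= ~(1 << PB0);
        _delay_us(25); // compile-time-constant delay
    }
}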
Another way I came up with is to create a file similar to a .wav and then use DMA or other techniques, like with audio. But as I haven't tried it, I don't know whether the Pi can sustain this at higher sampling rates.
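The DMA idea is essentially what the pigpio library implements with its waveform API, which times pulses in hardware rather than in userspace. A sketch, assuming pigpio is available and the signal is on GPIO 18:

#include <pigpio.h>

// Sends n bits (n <= 232 here), one every 25 us (~40 kHz),
// on GPIO 18 using pigpio's DMA-timed waveforms.
int send_wave(const int *bits, int n)
{
    gpioPulse_t pulses[232];

    if (gpioInitialise() < 0) return -1;
    gpioSetMode(18, PI_OUTPUT);

    for (int i = 0; i < n; i++) {
        pulses[i].gpioOn  = bits[i] ? (1 << 18) : 0;
        pulses[i].gpioOff = bits[i] ? 0 : (1 << 18);
        pulses[i].usDelay = 25; // 25 us per bit
    }
    gpioWaveClear();
    gpioWaveAddGeneric(n, pulses);
    int wid = gpioWaveCreate();
    if (wid >= 0) gpioWaveTxSend(wid, PI_WAVE_MODE_ONE_SHOT);
    while (gpioWaveTxBusy()) time_sleep(0.001);
    gpioTerminate();
    return 0;
}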

I have a problem lighting an LED on an STM32F373 Discovery microcontroller board

I have a problem lighting the LED on the STM32F373 Discovery board.
I used STM32CubeMX and the HAL library. The program ran, but the LED did not light up. I followed the STM tutorial "Lesson 4. HAL library. STM32 CUBE MX. LEDs and button" (link in Russian).
I set the pins for power, inputs, and outputs:
(screenshot: Discovery pin configuration)
Enabled the RCC -> HSE clock source.
In Clock Configuration I enabled HSE and configured the clock tree as follows:
(screenshot: clock configuration)
Then I added an endless loop that toggles the LEDs:
while (1)
{
    /* USER CODE END WHILE */
    /* USER CODE BEGIN 3 */
    HAL_Delay(5000); // 5 seconds
    HAL_GPIO_TogglePin(GPIOD, GPIO_PIN_8);
    HAL_Delay(5000);
    HAL_GPIO_TogglePin(GPIOD, GPIO_PIN_8);
    HAL_GPIO_TogglePin(GPIOD, GPIO_PIN_9);
    HAL_Delay(5000);
    HAL_GPIO_TogglePin(GPIOD, GPIO_PIN_9);
}
Did I do everything right?
Can you explain why the LED might not light?
The pins of the microcontroller each have their own identifier. Where can I find the pin information? Is the Discovery Device Description the right document for this?
I used the English documentation recommended by the author of the lesson, in the version for my controller: Description of STM32F3 HAL and low-layer drivers (STM32F373xx).
LED pins: PC8, PC9.
You need a second delay with HAL_Delay. Otherwise you toggle the LED, jump back to the beginning of the while loop, and toggle the LED again, so the LED may be switched on for only a few clock cycles, depending on the initial state of the I/O.
while (1)
{
    /* USER CODE END WHILE */
    /* USER CODE BEGIN 3 */
    HAL_Delay(500);
    HAL_GPIO_TogglePin(GPIOD, GPIO_PIN_8);
    HAL_Delay(500);
    HAL_GPIO_TogglePin(GPIOD, GPIO_PIN_8);
}
You need to enable the clock for the GPIO peripheral the LED is connected to before you set up the pins as outputs and try to toggle them.
In RCC->AHBENR there are bits to turn the individual GPIO port clocks on and off; GPIOD is bit 20, so RCC->AHBENR |= (1 << 20); would do. Defines for these bits exist in whichever libraries you're using, so prefer those to the (1 << 20) magic number.
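With the HAL, the idiomatic route is the clock-enable macros rather than writing AHBENR directly. A sketch, assuming the LEDs really are on PC8/PC9 as noted in the question:

#include "stm32f3xx_hal.h"

static void led_gpio_init(void)
{
    __HAL_RCC_GPIOC_CLK_ENABLE();          // gate the clock to port C first

    GPIO_InitTypeDef gpio = {0};
    gpio.Pin   = GPIO_PIN_8 | GPIO_PIN_9;  // the two LED pins
    gpio.Mode  = GPIO_MODE_OUTPUT_PP;      // push-pull output
    gpio.Pull  = GPIO_NOPULL;
    gpio.Speed = GPIO_SPEED_FREQ_LOW;
    HAL_GPIO_Init(GPIOC, &gpio);
}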
EDIT
After your edit: you've added at the bottom that the LEDs are on pins PC8 and PC9, but your code is toggling PD8 and PD9. Check which it should be.
Have you configured the GPIOs as outputs in STM32CubeMX?
Are interrupts enabled? HAL_Delay relies on the SysTick interrupt, so if not, you will notice when debugging that it never returns. Try placing a couple of breakpoints and see whether your while loop actually executes.

ASIOCallbacks::bufferSwitchTimeInfo comes very slowly at 2.8 MHz sample rate with DSD format on Sony PHA-3

I bought a Sony PHA-3 and am trying to write an app that plays DSD in native mode. (I've already succeeded in DoP mode.)
However, when I set the sample rate to 2.8 MHz, I found that the ASIOCallbacks::bufferSwitchTimeInfo callbacks do not come as fast as they should.
It takes nearly 8 seconds to request the 2.8 M samples that should be consumed in 1 second.
The code is only lightly modified from the host sample of asiosdk 2.3, so I'll post just the key parts to keep the question complete.
After ASIOStart, the host sample keeps printing the progress, indicating the time info like this:
fprintf(stdout, "%d ms / %d ms / %d samples %ds", asioDriverInfo.sysRefTime,
        (long)(asioDriverInfo.nanoSeconds / 1000000.0),
        (long)asioDriverInfo.samples,
        (long)(asioDriverInfo.samples / asioDriverInfo.sampleRate));
The final expression tells me how many seconds have elapsed (asioDriverInfo.samples / asioDriverInfo.sampleRate), where asioDriverInfo.sampleRate is 2822400 Hz.
And asioDriverInfo.samples is assigned in ASIOCallbacks::bufferSwitchTimeInfo as below:
if (timeInfo->timeInfo.flags & kSamplePositionValid)
    asioDriverInfo.samples = ASIO64toDouble(timeInfo->timeInfo.samplePosition);
else
    asioDriverInfo.samples = 0;
That is the original code of the sample, so I can easily see that the elapsed time advances very slowly.
I tried raising the sample rate even higher, say 2.8 MHz x 4, and it takes even longer for the time to advance 1 second.
When I tried lowering the sample rate below 2.8 MHz, the API call failed.
I have certainly set the sample format according to the SDK guide:
ASIOIoFormat aif;
memset(&aif, 0, sizeof(aif));
aif.FormatType = kASIODSDFormat;
ASIOSampleRate finalSampleRate = 176400;
if (ASE_SUCCESS == ASIOFuture(kAsioSetIoFormat, &aif)) {
    finalSampleRate = 2822400;
}
In fact, without setting the sample format to DSD, setting the sample rate to 2.8 MHz fails as well.
Finally, I remembered that all the DAWs (Cubase, Reaper, ...) have an option to set the thread priority, so I suspected the callback thread's priority was too low and tried raising it. However, when I checked the thread priority, it was already THREAD_PRIORITY_TIME_CRITICAL:
static double processedSamples = 0;
if (processedSamples == 0)
{
    HANDLE t = GetCurrentThread();
    int p = GetThreadPriority(t); // I get THREAD_PRIORITY_TIME_CRITICAL here
    SetThreadPriority(t, THREAD_PRIORITY_HIGHEST); // so there is no need to raise the priority... (sad)
}
It's the same for the thread priority boost property: it is not disabled (already boosted).
Has anybody written a host ASIO demo and can help me resolve this issue?
Thanks very much in advance.
Issue resolved.
I should call getBufferSize and createBuffers after kAsioSetIoFormat.
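In other words, the buffers must be queried and created after the driver has switched to DSD, so they are sized for the DSD stream rather than the PCM defaults. A sketch of the corrected order (error handling elided; bufferInfos, numChannels, and asioCallbacks stand in for the host sample's own bookkeeping):

// 1. Switch the driver into DSD mode first.
ASIOIoFormat aif;
memset(&aif, 0, sizeof(aif));
aif.FormatType = kASIODSDFormat;
ASIOFuture(kAsioSetIoFormat, &aif);
ASIOSetSampleRate(2822400.0);

// 2. Only then query buffer sizes and create the buffers.
long minSize, maxSize, preferredSize, granularity;
ASIOGetBufferSize(&minSize, &maxSize, &preferredSize, &granularity);
ASIOCreateBuffers(bufferInfos, numChannels, preferredSize, &asioCallbacks);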

Atmega2560 setup PWM and interrupt on positive edge

I am trying to do two things on the same timer: PWM and an interrupt on the positive edge. I can make both work individually, but cannot seem to make them work together. I am using an ATmega2560 on an Arduino board, trying to do it on Timer1, and this is the code that does the PWM:
TCCR1A = 0;
TCCR1B = 0;
// Mode 14: Fast PWM with TOP = ICR1, non-inverting output on OC1A/OC1B
TCCR1A |= (1<<WGM11)|(1<<COM1A1)|(1<<COM1B1);
TCCR1B |= (1<<WGM12)|(1<<WGM13)|(1<<CS10); // prescaler = 1
ICR1 = 29999;  // 16 MHz / 30000 = ~533 Hz
OCR1A = 0;
OCR1B = 0;
ICR1 sets the frequency to about 533 Hz, and OCR1A is the duty cycle; I vary that throughout the rest of my software, as it is meant to control a DC motor. What I want to do next is trigger an interrupt on every positive edge of the 533 Hz signal. I have tried to use TIMSK1 but could not make it work. Would anyone know how to program this? Thanks.
You should post the individual code for the positive-edge detection and the PWM, since you said you can make both work individually. It would make it easier to see what you're doing and why they don't work together, as opposed to giving us nothing to work from. The implementation of PWM and interrupts depends on the environment and your IC, but the general algorithms are the same. It's most likely something minor you are overlooking in your code, so I would include it to get more responses.
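For what it's worth: in Fast PWM mode 14 the overflow flag (TOV1) is set when the counter wraps at TOP, which coincides, to within one timer tick, with the rising edge of the non-inverting outputs. So the overflow interrupt may be what's missing (an untested sketch):

#include <avr/interrupt.h>

// Fires once per PWM period, effectively on the positive edge.
ISR(TIMER1_OVF_vect)
{
    // positive-edge work here; keep it short
}

// Call once, after the TCCR1A/TCCR1B/ICR1 setup shown above.
void enable_edge_interrupt(void)
{
    TIMSK1 |= (1 << TOIE1); // enable the Timer1 overflow interrupt
    sei();                  // global interrupt enable
}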

IRQ 8 isn't working... HW or SW?

First, some context: I program for vintage-computer groups. What I write is specifically for MS-DOS, not Windows, because that's what people are running. My current program is for later systems, not the 8086 line, so the plan was to use IRQ 8. This lets me set the interrupt rate to binary values from 2/second to 8192/second (2, 4, 8, 16, etc.).
Only, for some reason, on the newer old systems (OK, that sounds weird), it doesn't seem to be working. In emulation, and on the 386 system I have access to, it works just fine, but on the P3 system I have (GA-6BXC motherboard with a P3 800 CPU), it just doesn't work.
The code
setting up the interrupt
disable();
oldrtc = getvect(0x70);   //Reads the vector for IRQ 8
setvect(0x70, countdown); //Sets the vector for IRQ 8 to our handler
outportb(0x70, 0x8a);
y = inportb(0x71) & 0xf0;
outportb(0x70, 0x8a);
outportb(0x71, y | _MRATE_); //Adjustable value, set for 64 interrupts per second
outportb(0x70, 0x8b);
y = inportb(0x71);
outportb(0x70, 0x8b);
outportb(0x71, y | 0x40); //Set bit 6: enable the periodic interrupt
enable();
at the end of the interrupt
outportb(0x70, 0x0c);
inportb(0x71);        //Reading register C resets the interrupt
outportb(0xa0, 0x20); //EOI to the slave PIC (IRQ 8 arrives via the slave)
outportb(0x20, 0x20); //EOI to the master PIC (there are 2 PICs on AT machines and later)
When closing the program down
disable();
outportb(0x70, 0x8b);
y = inportb(0x71);
outportb(0x70, 0x8b);
outportb(0x71, y & 0xbf); //Clear bit 6: disable the periodic interrupt
setvect(0x70, oldrtc);
enable();
I don't see anything in the code that could be causing the problem, but it just doesn't make sense. While I don't completely trust the information, MSD does report IRQ 8 as the RTC counter and says it is present and working just fine. Is it possible that later systems have moved the vector? Everything I find says IRQ 8 is vector 0x70, but the interrupt never triggers on my Pentium III system. Is there some way to find out whether the vectors have been changed?
It's been a LONG time since I've done any MS-DOS code, and I don't think I ever worked with this particular interrupt (I'm pretty sure you can just read a memory location to fetch the time, and IRQ 0 can also be used to trigger you at an interval, so maybe that's better). Anyway, given my rustiness, forgive me for kinda link dumping.
http://wiki.osdev.org/Real_Time_Clock — the bottom of that page has someone saying they've had problems on some machines too. RBIL suggests it might be a BIOS thing: http://www.ctyme.com/intr/rb-7797.htm
Without DOS, I'd just capture IRQ 0 itself, remap all of the IRQs to my own interrupt numbers, and change the timing as needed. I've done that somewhat recently! I think that's a bad idea on DOS though; this looks more recommended for that: http://www.ctyme.com/intr/rb-2443.htm
Anyway, I betcha it has to do with the BIOS thing:
"Notes: Many BIOSes turn off the periodic interrupt in the INT 70h handler unless in an event wait (see INT 15/AH=83h,INT 15/AH=86h).. May be masked by setting bit 0 on I/O port A1h "