- Fri Apr 18, 2014 12:23 pm
#170361
If my main program loop keeps track of time using micros(), and an external-interrupt ISR calls delayMicroseconds() for up to 51 milliseconds total (the worst case; this condition is highly unlikely, and yes, I'm aware I'd need to stack the delays to get that long), will micros() in the main loop end up off by that much, or does it still keep accurate time regardless of the interrupt running?
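The delay-stacking mentioned above could be modeled like this. This is plain C++ so the chaining arithmetic can be checked off-board; on the Arduino itself each chunk would be a real delayMicroseconds() call (16383 µs is the documented accuracy limit of that function on AVR, so each chained call stays below it). The function name is made up for illustration:

```cpp
// Model of chaining delayMicroseconds() calls to reach a long total.
// On hardware, each "total += ..." line would instead be a real
// delayMicroseconds() call; here we just total the requested time to
// verify the chaining math.
unsigned long chainedDelayTotal(unsigned long us) {
  unsigned long total = 0;
  while (us > 16000UL) {   // stay under the 16383 us per-call limit on AVR
    total += 16000UL;      // on hardware: delayMicroseconds(16000);
    us -= 16000UL;
  }
  total += us;             // on hardware: delayMicroseconds(us);
  return total;
}
// e.g. reaching 51 ms takes four chained waits: 16000*3 + 3000 us
```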
Essentially, what I'm trying to do is:
(main loop):
Code:
Determine the period between interrupt events, in the range 3 ms to 60 ms, probably via a flag set by the ISR plus micros()
Read some inputs (mostly analog)
Do some math to determine how long an output should be on, based on this information, and make the result available to the ISR
(This on-time will be between 0 and 85% of the current period between interrupts, depending on various factors, but is expected to average around 25%, i.e. between 15 ms and 0.75 ms; in practice the 0.75 ms end is likely to run longer and the 15 ms end shorter.)
Move a stepper motor to a position determined by the math above
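The on-time math in the step above could be sketched like this (host-checkable C++; the function name, the percent interface, and the clamp placement are my own illustration, not from any library):

```cpp
// On-time as a fraction of the measured period, clamped at the 85%
// ceiling mentioned above.
unsigned long onTimeForPeriod(unsigned long periodUs, unsigned int dutyPct) {
  if (dutyPct > 85) dutyPct = 85;      // never exceed 85% of the period
  return (periodUs * dutyPct) / 100UL;
}
// At the expected ~25% duty, a 60 ms period gives a 15 ms on-time and a
// 3 ms period gives 0.75 ms (750 us), matching the ranges above.
```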
(ISR):
Code:
(If needed, set a flag or a volatile variable so the main loop can measure the time between events)
Turn on an output
Wait for the period of time determined by the main-loop math
Turn off the output
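The flag/timestamp handoff from the ISR to the main loop could look like this. This is a host-side model: on the Arduino these three variables would be declared volatile, onEdge() would be the actual ISR reading micros() itself, and all the names are mine:

```cpp
#include <cstdint>

// Shared state between the "ISR" and the main loop. uint32_t matches
// the 32-bit value micros() returns on AVR, so rollover behaves the same.
uint32_t lastEventUs = 0;   // timestamp of the previous interrupt event
uint32_t periodUs    = 0;   // most recent measured period
bool     newEvent    = false;

void onEdge(uint32_t nowUs) {          // stand-in for the real ISR
  periodUs    = nowUs - lastEventUs;   // unsigned math survives micros() rollover
  lastEventUs = nowUs;
  newEvent    = true;                  // main loop polls this and clears it
}
```

On real hardware the main loop would briefly disable interrupts (noInterrupts()/interrupts()) while copying periodUs and clearing newEvent, so the multi-byte read isn't torn by the ISR firing mid-copy.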
Last edited by TrollHammer on Sat Apr 19, 2014 11:37 pm, edited 1 time in total.
Yes, I think strang... er, unconventionally.