- Wed Jul 15, 2020 11:00 pm
#217174
I am seeing really strange behaviour when exchanging delay(1) for delayMicroseconds(1000) in my code. It's hard to see exactly where the time difference comes from without a scope, but I am running an LED matrix, and the average loop() time roughly doubles (883 ms to 1633 ms) when I swap delay(1) for delayMicroseconds(1000), and this really affects the stutter of the display. There is only one delay in the entire code, yet it's as if the change affects the entire timing of the Arduino. All the rest of the code does is bit-bang pins to update shift registers out to the display, along with a bit-banged clock and a display enable.