SparkFun Forums

Everything ARM and LPC
By fll-freak
#167894
I am using an AT91SAM7S256 in a project that uses the DBGU as a UART and the PITC for the timer tick of an RTOS. Both of these peripherals vector to the "System" interrupt on channel 1. I have written a single interrupt service routine that attempts to determine the reason(s) for the interrupt and service the event properly.

In the case of the Programmable Interval Timer Controller, when I get an interrupt I can determine whether it was one of the interrupt sources by reading the status register and checking the PITS bit. This seems to work fine.

But the DBGU peripheral is driving me nuts. To set the stage for my question:

I am using the DBGU in conjunction with the PDC (DMA). Just to keep things simple as I develop the driver, I am only transmitting from the UART and only using the single DMA buffer, not the chained ("Next") capability. To simplify things even more, my foreground only sends out a few bytes every second at 115,200 baud. This prevents any possibility of my foreground task stepping on the DMA buffers before they are done. If I do not enable any interrupts, my transmissions work fine. Hence the basic configuration of the UART (baud rate, etc.) works, and my ability to set the PDC pointer and count registers must be OK as well.
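
In code, that basic polled setup amounts to loading the PDC pointer and counter and enabling the transmit channel. A minimal sketch, with register offsets taken from the SAM7S256 datasheet (the macro names are mine, not Atmel's header names):

#include <stdint.h>

#define DBGU_BASE   0xFFFFF200u
#define DBGU_REG(o) (*(volatile uint32_t *)(DBGU_BASE + (o)))

#define DBGU_SR    DBGU_REG(0x014)   /* status register                   */
#define DBGU_TPR   DBGU_REG(0x108)   /* PDC transmit pointer              */
#define DBGU_TCR   DBGU_REG(0x10C)   /* PDC transmit counter              */
#define DBGU_PTCR  DBGU_REG(0x120)   /* PDC transfer control (write-only) */

#define PDC_TXTEN  (1u << 8)         /* PTCR: enable transmit channel     */
#define DBGU_ENDTX (1u << 4)         /* SR: TCR has counted down to zero  */

/* Start a one-shot PDC transfer; the PDC then feeds the DBGU
   transmitter one byte at a time with no further CPU involvement. */
static void dbgu_dma_send(const uint8_t *buf, uint32_t len)
{
    DBGU_TPR  = (uint32_t)buf;
    DBGU_TCR  = len;        /* writing a nonzero count starts the transfer */
    DBGU_PTCR = PDC_TXTEN;
}

/* With no interrupts enabled, completion can only be polled. */
static int dbgu_dma_done(void)
{
    return (DBGU_SR & DBGU_ENDTX) != 0;
}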

So the next step was to enable the end-of-transmit interrupt (the ENDTX bit in DBGU_IER; DBGU_PTCR.TXTEN only enables the transmit channel and was already set). Now, as long as the PITC is turned off, I get a very nice interrupt at the conclusion of the DMA. I can even use that opportunity to reload the DMA buffers and send out another block of data.
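
Concretely, that step and the reload from the handler look something like this (again with my own macro names for the datasheet offsets):

#include <stdint.h>

#define DBGU_BASE   0xFFFFF200u
#define DBGU_REG(o) (*(volatile uint32_t *)(DBGU_BASE + (o)))
#define DBGU_IER    DBGU_REG(0x008)  /* interrupt enable (write-only) */
#define DBGU_TPR    DBGU_REG(0x108)  /* PDC transmit pointer          */
#define DBGU_TCR    DBGU_REG(0x10C)  /* PDC transmit counter          */
#define DBGU_ENDTX  (1u << 4)        /* TCR has reached zero          */

static void dbgu_enable_endtx(void)
{
    DBGU_IER = DBGU_ENDTX;  /* fire the System interrupt when TCR hits 0 */
}

/* Called from the interrupt handler once ENDTX fires: reloading the
   pointer/counter pair restarts the PDC immediately. */
static void dbgu_reload_tx(const uint8_t *next, uint32_t len)
{
    DBGU_TPR = (uint32_t)next;
    DBGU_TCR = len;
}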

But now I try to have the PITC and the DBGU coexist on the System interrupt. The PITC is set to go off 64 times a second. It expires and generates an interrupt well before I even think about sending out a string over the DBGU. So the System interrupt ISR gets called, and I look to see if the PITC was responsible by checking PIT_SR.PITS. It is, so I increment a variable and read PIT_PIVR to clear it.
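
The PIT half of that shared handler is the easy part; a sketch (PIT addresses from the datasheet, names again mine):

#include <stdint.h>

#define PIT_BASE  0xFFFFFD30u
#define PIT_SR    (*(volatile uint32_t *)(PIT_BASE + 0x04))
#define PIT_PIVR  (*(volatile uint32_t *)(PIT_BASE + 0x08))
#define PIT_PITS  (1u << 0)

volatile uint32_t g_ticks;

/* Shared handler for AIC channel 1 ("System"): PIT, DBGU, RTT, ... */
void system_isr(void)
{
    if (PIT_SR & PIT_PITS) {        /* was the PIT (one of) the source(s)? */
        g_ticks += PIT_PIVR >> 20;  /* PICNT field; the read clears PITS   */
    }
    /* ...and here is the problem: what do we test to decide whether
       the DBGU/PDC also interrupted? */
}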

Now I need to look at the DBGU to see if it happened to generate an interrupt at the same time. And this is where I fall flat on my face.

I can't use the DBGU_SR.ENDTX bit as an indication, as it reports state, not edges. Even if I never enable the ENDTX interrupt or start a DMA transfer, the bit still reads as set, because the transmit counter is sitting at zero.

I can't use the DBGU_PTSR.TXTEN bit either, as that simply indicates that the PDC transmitter is enabled, not that it caused the interrupt.

So what is the magic sauce? What combination of register reads can one use to know whether the DBGU was the reason for a PDC/DMA-based interrupt?
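
The closest thing I know of is the usual AT91 idiom of masking the status register with the interrupt mask register, so only sources that are both pending and enabled count. With a level flag like ENDTX, that only helps if the interrupt is disabled whenever no transfer is in flight. Something like:

#include <stdint.h>

#define DBGU_REG(o) (*(volatile uint32_t *)(0xFFFFF200u + (o)))
#define DBGU_IDR    DBGU_REG(0x00C)  /* interrupt disable (write-only) */
#define DBGU_IMR    DBGU_REG(0x010)  /* interrupt mask (read-only)     */
#define DBGU_SR     DBGU_REG(0x014)  /* status register                */
#define DBGU_ENDTX  (1u << 4)

/* Attribute the interrupt to the DBGU only if ENDTX is both pending
   and enabled. For this to mean anything, the handler must write
   DBGU_ENDTX to DBGU_IDR when it has nothing more to send, since
   ENDTX stays set whenever the counter is at zero. */
static int dbgu_caused_interrupt(void)
{
    return (DBGU_SR & DBGU_IMR & DBGU_ENDTX) != 0;
}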
By UhClem
#167956
My first reaction to an interrupt being triggered without setting a flag, when it is vital to have that flag, is to take a hammer and turn the part back into the silicon dust it deserves to be.

But surely someone has run into this problem before, so a search (if you find the right keywords) should turn up something.

Just how committed are you to using DMA? It seems like overkill for something as low-bandwidth as a debug interface, and perhaps plain old interrupt-driven routines would be better. Surely there are flags set for transmitter-buffer-empty interrupts. If not, see hammer. :-)
By fll-freak
#167961
Hammer! What a good idea. I would never have considered that as a way to solve an interrupt problem!

As far as I can tell, the SAM7 DMA engine will generate an interrupt but does not have a register flag that positively identifies it as the reason. I have a support ticket open with Atmel for confirmation.

The part does have the traditional "transmit register empty" and "receiver ready" interrupts, and that is a possible way out of the problem. But the dang part has NO hardware FIFO on the incoming data. If you don't grab each byte before the next one comes in (about 87 usec per character at 115,200 baud, 8N1), then you are out of luck. This was my reason for using the DMA: I have higher-priority tasks that could take longer than that to complete.
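
That traditional path would look something like this (ring_put() is a made-up buffer routine); the catch is the one-character deadline:

#include <stdint.h>

#define DBGU_REG(o) (*(volatile uint32_t *)(0xFFFFF200u + (o)))
#define DBGU_SR     DBGU_REG(0x014)  /* status register          */
#define DBGU_RHR    DBGU_REG(0x018)  /* receive holding register */
#define DBGU_RXRDY  (1u << 0)        /* SR: a byte is waiting    */

extern void ring_put(uint8_t c);     /* hypothetical ring buffer */

/* Must run within one character time (~87 usec at 115,200 baud) or
   the next byte overwrites RHR and the overrun flag is set. */
static void dbgu_rx_isr(void)
{
    if (DBGU_SR & DBGU_RXRDY)
        ring_put((uint8_t)DBGU_RHR);
}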

I have spent a few hours Googling but have yet to turn up anything. I found a tutorial on using the DMA, but it did not use it in a shared-interrupt environment. It was also a rather brain-dead example that simply echoed bytes back.

I do have a workaround. Rather than running the DMA with an interrupt, I am using a timer to service the UART's DMA every 1 msec. With DMA buffers of 120 bytes, I have plenty of room for all the bytes that could arrive in that window. It does add some latency, but I think I can live with that. At present it seems to work, although I have not put it to the acid test yet.
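
For the record, the workaround looks roughly like this (handle_byte() is a stand-in for my real consumer; the simple rearm leaves a small window where a byte could be lost, which the chained RNPR/RNCR registers would close):

#include <stdint.h>

#define DBGU_REG(o) (*(volatile uint32_t *)(0xFFFFF200u + (o)))
#define DBGU_RPR    DBGU_REG(0x100)  /* PDC receive pointer */
#define DBGU_RCR    DBGU_REG(0x104)  /* PDC receive counter */

#define RXBUF_LEN 120u
static uint8_t  rxbuf[RXBUF_LEN];
static uint32_t consumed;            /* bytes already handed upstream */

extern void handle_byte(uint8_t c);  /* hypothetical consumer */

/* Called every 1 msec from the timer tick. At 115,200 baud roughly
   11-12 bytes arrive per tick, so a 120-byte buffer gives about
   10 msec of slack. */
void dbgu_service_rx(void)
{
    uint32_t filled = RXBUF_LEN - DBGU_RCR;  /* bytes the PDC has stored */

    while (consumed < filled)
        handle_byte(rxbuf[consumed++]);

    if (DBGU_RCR == 0) {                     /* buffer full: rearm it */
        consumed = 0;
        DBGU_RPR = (uint32_t)rxbuf;
        DBGU_RCR = RXBUF_LEN;                /* reception resumes     */
    }
}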