SparkFun Forums 

By Peter Loron
#56196
I have a Futaba NA202MD13AA vacuum fluorescent display. It is a nice 20x2 display module with 2x Oki MSC1162A chips and one MSC1164 chip. They are available very cheaply at Prime Electronics.

These displays were apparently custom made for NCR, and I can find no programming information for the module as a whole, except for what is at this site.

Between that very high-level information and the datasheet on the MSC1162A, I've written some code for my Arduino (ATmega168), but I cannot get the display to work. Nothing shows up. I'm pretty sure the display itself is functional because occasionally when I interrupt the microcontroller, I get some pixels lit on the display...just got lucky with spurious signals.

The module has a 10-pin socket, with the lines duplicated. According to the OVC site linked above, the lines are Vcc, Gnd, Clock, Data, and Enable.

It appears that you need to clock in the raw bits for each column...there is no high-level interface.

Does anybody have experience with writing code to interface with a similar display? I've pulled out what little hair remains, but I cannot figure out what I'm doing wrong!

I'd really appreciate some help from the hive mind.

Thanks.

-Pete
By lou
#56403
Peter Loron wrote: Does anybody have experience with writing code to interface with a similar display? I've pulled out what little hair remains, but I cannot figure out what I'm doing wrong!
Well, yes, actually. Not the same display, but something similar.

The thing is, though, I also cannot figure out what you're doing wrong, since I cannot see what you're doing at all. It could be any of numerous things. I'm assuming you know that you have to continuously refresh the display. You can't just clock in the data once and have it keep going. You have to send out 20 columns of 92 bits per column, and you need to refresh the entire frame over 60 times per second to avoid flicker. I'm not sure what else to throw out there, not having any idea what you're doing, or even what you're doing right.
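
For a rough sense of scale: 20 columns x 92 bits is 1,840 bits per frame, and 60 frames per second works out to around 110,000 bits per second, or a budget of roughly 9us per bit. A 16MHz AVR can bit-bang that, but it gets tight once each bit costs a couple of digitalWrite() calls plus delays.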
Peter Loron wrote:I'm pretty sure the display itself is functional because occasionally when I interrupt the microcontroller, I get some pixels lit on the display...just got lucky with spurious signals.
Interrupt it how? Do the lit pixels bear any relationship to what is supposed to be displayed? Maybe you're blanking the display most of the time, and maybe interrupting the microcontroller (whatever that means in this particular case) unblanks the display long enough for you to see something. This display appears to use the same line for latching and blanking. It really could be as simple as that.

If the ~CL and LS lines on the driver chip are simply tied together to make that line in the module interface (I have no way to know this; I do not have this module, and it is just a guess), that means you'd want to drive the combined line low (because that blanks the display) until you start scanning the display. For each column, you'd lower it to blank the display, shift 92 bits, and raise it again to unblank the display until it is time to refresh the next column. If the display is not wired that way, this is completely wrong. I suppose the easiest way to find out would be to check whether you have continuity between pins 24 and 26 on one of the MSC1162A chips and TP8 at the interface header. If there is no such continuity, or there is continuity between only 2 of those 3 places, there may be an inverter (or transistor version thereof) in there, which is good. It means the latches don't go to waste. Which line that is would speak to what strategy you must employ to do the latching and blanking.
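
In loop form, the shape I have in mind is something like this (a minimal sketch, assuming the combined blank/latch line really behaves that way; the pin numbers and function names are just placeholders):
Code:
// Sketch of one column refresh, assuming a single combined ~CL/LS (blank/latch) line.
// Pin assignments are placeholders, not the module's documented pinout.
const int PIN_BLANK_LATCH = 4;
const int PIN_CLOCK       = 3;
const int PIN_DATA        = 2;

void shiftOutBit(boolean b) {
  digitalWrite(PIN_DATA, b ? HIGH : LOW);
  digitalWrite(PIN_CLOCK, HIGH);   // active clock edge is a guess here
  digitalWrite(PIN_CLOCK, LOW);
}

void refreshColumn(const boolean bits[92]) {
  digitalWrite(PIN_BLANK_LATCH, LOW);    // blank the display while shifting
  for (int i = 0; i < 92; i++) {
    shiftOutBit(bits[i]);
  }
  digitalWrite(PIN_BLANK_LATCH, HIGH);   // latch and unblank until the next column
}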
By Peter Loron
#56420
Thank you for the response. Yes, my code cycles over and over to keep the display refreshed.

The interruption I was referring to occurred when I pulled the USB cable off the Arduino board, removing power. I am sending all 1's for the character bits, and some of the characters on the display were lit up solid...those may be a result of the data I was writing in.

There is no continuity between TP8 and any of those pins on the chips. This post on AVRfreaks has a schematic of the board.

I have watched the clock, enable, and data lines with a logic analyzer, and they are doing what I believe to be the correct things according to the MSC1162A timing diagram. Obviously I'm missing some piece, though.

Below is my code. I hope you can help me figure out what I'm missing or not doing correctly.
Code:
// Arduino VFD Driver Library
// Author: Peter Loron, peterl at standingwave.org
//
// v0.1 - 27 SEP 2008
//
// Developed for the Futaba NA202MD13AA VFD
//
// Information gathered from http://wiki.ovccorp.com/index.php?pagename=Futaba%20NA202MD13AA
//

//define IO pins
int pin_data = 2;
int pin_clock = 3;
int pin_enable = 4;
boolean enableHigh;

void setup() 
{ 
  //setup control pins
  pinMode(pin_data, OUTPUT);
  pinMode(pin_clock, OUTPUT);
  pinMode(pin_enable, OUTPUT);

  digitalWrite(pin_clock, HIGH);
  digitalWrite(pin_enable, LOW);
  digitalWrite(pin_data, LOW);

  pinMode(13, OUTPUT);      //onboard LED as a simple power-on indicator
  digitalWrite(13, HIGH);

  enableHigh = false;
}

///////////////////////////////////////////////////////////////

void loop() 
{ 
  for(int i = 0; i < 20; i++) {
    sendColumn(i);
  }

  delayMicroseconds(100);
}

//writes out one column: 35 bits for each of the two character rows, 2 zero bits, then 20 column-select bits (92 bits total)
void sendColumn(int col) {
  boolean lastBit = false;

  //char 1
  for(int i = 0; i < 35; i++) {
    writeToDisplay(1);
  }

  //char 2
  for(int i = 0; i < 35; i++) {
    writeToDisplay(1);
  }

  //zero bits
  writeToDisplay(0);
  writeToDisplay(0);

  //column select
  for(int i = 0; i < 20; i++) {
    if(i == 19) { 
      lastBit = true; 
    }
    if(col == i) {
      writeToDisplay(1,lastBit);
    } 
    else {
      writeToDisplay(0,lastBit);
    }
  }

}

//writes a bit to the display
void writeToDisplay(int val) {
  writeToDisplay(val,false); 
}

void writeToDisplay(int val, boolean trigger) {

  //start the clock cycle
  digitalWrite(pin_clock, LOW);
  delayMicroseconds(1); 
  
  //drop the enable pin if it is high
  if(enableHigh) {
    digitalWrite(pin_enable, LOW);
    enableHigh = false;
  }

  //write out the data bit
  if(val == 0) {
    digitalWrite(pin_data, LOW);
  } 
  else {
    digitalWrite(pin_data, HIGH);
  }

  delayMicroseconds(1);

  //raise the clock pin
  digitalWrite(pin_clock, HIGH);

  //if this is the last bit, trigger the shift by raising the enable pin?
  if(trigger) {
    digitalWrite(pin_enable, HIGH);
    enableHigh = true;
  }

  digitalWrite(pin_data, LOW);
}
By lou
#56436
Peter Loron wrote:Thank you for the response. Yes, my code cycles over and over to keep the display refreshed.
As it should be. Good.
Peter Loron wrote:The interruption I was referring to occurred when I pulled the USB cable off the Arduino board, removing power. I am sending all 1's for the character bits, and some of the characters on the display were lit up solid...those may be a result of the data I was writing in.
So, in that state, the VFD board is still getting power from somewhere else (you can't get 1A from USB and be in spec). When the AVR chip loses power, the pins may (it's been a while, and some chips do this differently, so you should double-check to make sure I'm not wrong) become high-Z/inputs, thus letting R3 on the VFD board pull the EN line high, which unblanks the display. If you plan on doing that sort of thing (resetting the Arduino while the display is powered), you may want to desolder R3 from the driver board. That would at least keep the display blanked when the AVR is in reset. For the same reason, you should use the watchdog timer to reset the chip should it hang and leave the display unscanned indefinitely. It should be sufficient to have the display refresh code kick the watchdog.
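
The watchdog end of that is only a few lines with the stock avr-libc macros (a sketch, assuming a ~250ms timeout is acceptable; note that some Arduino bootloaders don't handle watchdog resets gracefully, so test it before relying on it):
Code:
#include <avr/wdt.h>

void setup() {
  // ...pin setup as before...
  wdt_enable(WDTO_250MS);   // reset the AVR if the watchdog isn't kicked for ~250ms
}

void loop() {
  // ...refresh all 20 columns here...
  wdt_reset();              // kick the watchdog only after a full refresh pass completes
}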
Peter Loron wrote:There is no continuity between TP8 and any of those pins on the chips. This post on AVRfreaks has a schematic of the board.
Ooh, a schematic. Excellent. Ok, so a low on EN blanks the display and a high-going edge latches. Cool. That's what you need to know. Apparently, the clock line is also inverted. Asking about continuity was a stopgap as I didn't expect you to have a schematic. This is better.

The deal with the mystery shift register and transistorlike device might be some sort of ID readback functionality. If you put the data line into INPUT mode and clock in 8 bits (after EN has been set high), you ought to read back 0x0b, 0x50, 0xf5, or 0xbf, depending on whether it is lsb or msb first and what kind of device Q1 actually is. You probably don't need that functionality, but it would be a good thing to know about were you to want to share the interface/pins with another device. Unless you make some special arrangements regarding the data line, you should never issue clock pulses when EN is high. Under certain circumstances, it would pull the data line low, which could potentially burn out your AVR pin if it's trying to drive it high.
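
If you ever do want to poke at that, the readback would look roughly like this (guesswork layered on guesswork: the active clock edge and bit order are unknown, and the pin names are the ones from your sketch):
Code:
// Hypothetical ID readback. Only valid while EN is high, and only with the
// data pin switched to an input so nothing fights over the line.
byte readDisplayId() {
  digitalWrite(pin_enable, HIGH);
  pinMode(pin_data, INPUT);
  byte id = 0;
  for (int i = 0; i < 8; i++) {
    digitalWrite(pin_clock, LOW);
    digitalWrite(pin_clock, HIGH);
    id = (id << 1) | (digitalRead(pin_data) ? 1 : 0);   // msb first is an assumption
  }
  pinMode(pin_data, OUTPUT);
  digitalWrite(pin_data, LOW);
  return id;
}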
Peter Loron wrote: I have watched the clock, enable, and data lines with a logic analyzer, and they are doing what I believe to be the correct things according to the MSC1162A timing diagram. Obviously I'm missing some piece, though.
Well, for one thing, the clock line is inverted.

So your logic analyzer shows that the EN line is high a decent amount of the time, right? Your code looks like it ought to be. At least around 100us out of each 284us. Though the result might be dim, it should be visible.
Peter Loron wrote: Below is my code. I hope you can help me figure out what I'm missing or not doing correctly.
Ok. Looks good. Except that you aren't accommodating the clock inversion. That shouldn't throw you off by more than a bit, though. I can't see anything that jumps out at me as causing your problem. Of course it would help if I actually had one of those displays and an Arduino.

It's not a model of efficiency or anything, but this is the stage when you're just trying to find something that works. I know that stage. There'll be time for optimization later. For one thing, you might want to use the SPI hardware instead of spending so much time shifting bits. For now, though, the way you have it is easier to understand, and, until you can get it working, that's more important than efficiency.
By Peter Loron
#56505
Ah! That did it! Thanks!

/me bows

Yes, the code is totally unoptimized...just something to get started. Once I get the display happy, I'll improve the efficiency.

I would really appreciate it if you could elaborate on your suggestion of using the SPI hardware instead of the current bit-banging I'm doing. This page on the Arduino site discusses some direct register pokes to more quickly manipulate the pins.

Now I just need to cobble together some character maps for a lookup table.

Thanks again!

-Pete
By lou
#56515
Peter Loron wrote:Ah! That did it! Thanks!

/me bows
Awesome.
Peter Loron wrote:Yes, the code is totally unoptimized...just something to get started. Once I get the display happy, I'll improve the effeciency.
Yup. The code that refreshes the display is quite possibly the best candidate for optimization in the entire system. It will probably run more than a thousand times per second.
Peter Loron wrote:I would really appreciate it if you could elaborate on your suggestion of using the SPI hardware instead of the current bit-banging I'm doing. This page on the Arduino site discusses some direct register pokes to more quickly manipulate the pins.
Step one is to read the SPI section in the '168 datasheet. It's section 18 in my copy. You'll want to use it in master mode. You probably want CPHA=1 and CPOL=0 (SPI Mode 1), but you should verify that. Whether it's lsb-first or msb-first (DORD) is going to be dictated by how the display maps pixels and columns. One way will likely produce a venetian blind effect when lighting progressive bit positions, and the other way will likely produce the expected linear progression. Short of tracing out which driver chip pin goes to which display grid, one is left with experimentation. Fortunately, there are only two possibilities for DORD.
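
In register terms, the setup on the '168 comes out to something like this (a sketch; treat the mode and clock-rate choices as starting points to verify, not gospel):
Code:
#include <avr/io.h>

void spiInit() {
  // MOSI (PB3), SCK (PB5) and SS (PB2) as outputs. SS must not be an input pulled
  // low, or the hardware drops out of master mode.
  DDRB |= (1 << DDB3) | (1 << DDB5) | (1 << DDB2);

  // Enable SPI, master mode, CPOL=0/CPHA=1 (mode 1), slowest clock (f_osc/128) to
  // start with. Add (1 << DORD) if lsb-first turns out to be the right bit order.
  SPCR = (1 << SPE) | (1 << MSTR) | (1 << CPHA) | (1 << SPR1) | (1 << SPR0);
}
Note that using the hardware SPI means the display's data and clock lines move to MOSI and SCK (Arduino digital 11 and 13) instead of the pins in your current sketch.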

Once it is set up, the SPI hardware behaves a lot like the USART. Write a byte to SPDR to send it out of the interface (to your display). Wait for notification that the byte has finished being sent before sending the next one. Depending on how fast you run the SPI and how much work you have to do to get the next byte to send, you might not even have to wait. Still, it's good to check, particularly when getting things working. Hopefully, this is all happening fast enough that you don't have to enable and service the SPI interrupt; you'd just poll the SPIF bit in SPSR.
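
In other words, the whole send path is a couple of lines (the function name is mine, not from any library):
Code:
void spiSend(byte b) {
  SPDR = b;                          // start clocking the byte out
  while (!(SPSR & (1 << SPIF)))      // poll until the transfer-complete flag sets
    ;
}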

How fast you run the SPI is going to depend on things like the specs of the display board and the cabling to it. Start slow (f[OSC]/128) to get things working and then crank it up until you start seeing errors in the communication (and then back off). One of the things that the SPI hardware gets you is bit timing (the reason you have two 1us pauses in your code). The hardware does that for you. You just have to wait for 8 bits to be done so you can hand it the next 8, and you can be preparing the next 8 while you're waiting. Each column refresh is going to consist of around 9 bytes of bitmap data and another 2.5 bytes indicating which column is active.

2.5? Yes. 20 bits. The whole bitstream is 92 bits, which does not yield an integer when divided by 8. The first byte you shift out will contain 4 bits which will be ignored. They will fall off the end of the shift chain. That will yield 96 bits, which is 12 bytes, even. If you want, you can do what I did and use those 4 wasted bits, which still appear in the bitmap data in RAM, for things like character attributes. Though, on this display, they'd probably be more like column attributes. If it helps, you can pretend those other documents say to shift out 4 zero bits before the bitmap data, just as they say to shift out two afterward.
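
Put into code, a column send might look like this (assuming the 4 pad bits lead the stream, as above, and using the spiSend() sketched earlier):
Code:
// One column = 96 clocked bits = 12 bytes, in shift order:
//   4 pad bits, 35 bits (top character row), 35 bits (bottom row),
//   2 zero bits, 20 column-select bits.
void sendColumnSPI(const byte col[12]) {
  digitalWrite(pin_enable, LOW);     // blank while shifting
  for (byte i = 0; i < 12; i++) {
    spiSend(col[i]);
  }
  digitalWrite(pin_enable, HIGH);    // latch and unblank until the next column
}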

For maximum speed, the writing of 12 bytes of data to the SPI should be done step-by-step instead of in a finite state machine or other loop. There's nothing wrong with those mechanisms. We're just optimizing the heck out of the display refresh routine, and the loop tests and branches take time.

Remember, you'll probably be in interrupt space here (refreshing a column in response to timer rollover), so any dawdling (including waiting for SPI transfers to finish) takes time away from the rest of the system. When testing with low SPI speeds, make sure the column refresh timer has a period long enough that the ISR can actually complete. If it is too short, the machine will spend all of its time servicing the interrupt and none of it doing other things, such as updating the bitmap data.
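
The skeleton of that, using Timer1 in CTC mode (the period value is only an example, and it has to be longer than the time it takes to clock out the 12 bytes at whatever SPI speed you've chosen):
Code:
#include <avr/io.h>
#include <avr/interrupt.h>

volatile byte currentColumn = 0;
byte frame[20][12];                  // one 12-byte buffer per column, filled elsewhere

void timerInit() {
  TCCR1A = 0;
  TCCR1B = (1 << WGM12) | (1 << CS11);   // CTC mode, prescaler clk/8
  OCR1A  = 1000;                         // ~500us per column at 16MHz, ~100Hz frame rate
  TIMSK1 = (1 << OCIE1A);                // interrupt on compare match
  sei();
}

ISR(TIMER1_COMPA_vect) {
  sendColumnSPI(frame[currentColumn]);   // as sketched above
  currentColumn = (currentColumn + 1) % 20;
}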

You don't need to worry about MISO unless you want to read back the ID byte (when the EN line of the display is high). If you want to do that, you have to temporarily change MOSI to an input to keep it from fighting with the data output by the display.
Peter Loron wrote:Now I just need to cobble together some character maps for a lookup table.
Oh yeah, that's fun. Well, not really. I usually end up throwing together some sort of Perl script. Such a script tends to take an input file full of # and space characters (graphically laying out the lit and unlit pixels) and make the lists of numbers for me. Choice of language is unimportant, of course. You're probably going to want to create two character sets. One for the top row and one for the bottom row. That's an implementation detail, of course, but it will make things go faster at runtime. One set is shifted a certain number of bits from the other one. This is because each character cell takes 35 bits, which is not a multiple of 8. You could use just one character map and do the shifting every time you print a character in the second row, but it is faster to just have the work done ahead of time in a second map.
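
To make that concrete, a single glyph in such a map might look like this (a 5x7 cell is assumed, since each character gets 35 bits; the actual bit and column order depend on how the display scans, so treat this purely as an illustration):
Code:
// Hypothetical font entry for 'A' on a 5x7 cell, drawn the way a generator
// script would read it:
//
//   .###.
//   #...#
//   #...#
//   #...#
//   #####
//   #...#
//   #...#
//
// Stored one byte per column, bit 0 = top pixel (bit order is a guess):
const byte font_A[5] = { 0x7E, 0x11, 0x11, 0x11, 0x7E };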
Peter Loron wrote:Thanks again!

-Pete
You're welcome.
By Peter Loron
#56526
Ok, thanks for the info! I need to go wrap my head around the whole interrupt thing for a while...

-Pete