- Wed Dec 10, 2008 1:57 am
I have put together a summary of the current status of the project by
asking each of the significant developers what equipment/hardware
approach they are using. Note that there are developers working on both
the TCM8240 (1300x1040) and TCM8230 (640x480) SparkFun cameras. The
'40 appears to have JPEG encoding on board, but so far I think the
developers have had most success using the camera in RGB mode.
(1) KreAture ('40):
* Developed a breakout board, and published it on the forum
* Published the underside, and side pinouts of the '40
* Published header structs for BMP output from RGB data. Adding this
struct as a header to the raw RGB data from the camera will allow
most Windows (and other?) software to read the file as a standard
bitfield BMP without any further processing.
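For reference, a header of the kind KreAture describes can be built like this. The sketch below is a generic 16-bit bitfield (RGB565) BMP header builder, not his actual struct, and assumes little-endian RGB565 pixel data with an even image width (so rows need no 4-byte padding):

```c
#include <stdint.h>
#include <string.h>

#define HDR_SIZE 66  /* 14 (file header) + 40 (info header) + 12 (colour masks) */

static void put16(uint8_t *p, uint16_t v) { p[0] = (uint8_t)v; p[1] = (uint8_t)(v >> 8); }
static void put32(uint8_t *p, uint32_t v) {
    p[0] = (uint8_t)v; p[1] = (uint8_t)(v >> 8);
    p[2] = (uint8_t)(v >> 16); p[3] = (uint8_t)(v >> 24);
}

/* Fill hdr with a BITMAPFILEHEADER + BITMAPINFOHEADER using BI_BITFIELDS
 * and RGB565 masks; prepend this to the raw camera data to get a viewable
 * BMP. Negative height marks the rows as top-down (camera scan order). */
void make_bmp565_header(uint8_t hdr[HDR_SIZE], int32_t w, int32_t h)
{
    uint32_t image_bytes = (uint32_t)w * (uint32_t)h * 2u;
    memset(hdr, 0, HDR_SIZE);
    hdr[0] = 'B'; hdr[1] = 'M';
    put32(hdr + 2,  HDR_SIZE + image_bytes); /* total file size */
    put32(hdr + 10, HDR_SIZE);               /* offset to pixel data */
    put32(hdr + 14, 40);                     /* BITMAPINFOHEADER size */
    put32(hdr + 18, (uint32_t)w);
    put32(hdr + 22, (uint32_t)(-h));         /* negative = top-down rows */
    put16(hdr + 26, 1);                      /* colour planes */
    put16(hdr + 28, 16);                     /* bits per pixel */
    put32(hdr + 30, 3);                      /* BI_BITFIELDS compression */
    put32(hdr + 34, image_bytes);
    put32(hdr + 54, 0xF800);                 /* red mask   (RGB565) */
    put32(hdr + 58, 0x07E0);                 /* green mask */
    put32(hdr + 62, 0x001F);                 /* blue mask  */
}
```

Writing these 66 bytes followed by the raw RGB565 stream gives a file most viewers will open directly.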
* Experimented with JPG output, but is currently using RGB. His
platform has too little memory to handle all the data from an image in
one go, so it captures 2-4 lines of data at a time and then picks up at
the next band in the next frame, until a complete image has been
assembled. Data is continuously sent to the host via serial. If the
scene is stationary this works OK, but it takes a long time per image.
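The banded-capture scheme can be sketched as follows. Sizes and names are illustrative, and get_row() stands in for the real camera interface; in the real setup each band is streamed to the host over serial rather than held in a full-frame buffer:

```c
#include <string.h>

#define WIDTH  128
#define HEIGHT 96
#define LINES_PER_PASS 4   /* how many lines fit in RAM at once */

static unsigned char image[HEIGHT][WIDTH * 2];   /* assembled RGB565 frame */

/* Stand-in for latching one row from the live camera output. */
static void get_row(int row, unsigned char *dst) {
    memset(dst, (unsigned char)row, WIDTH * 2);  /* fake pixel data */
}

/* Grab LINES_PER_PASS rows per camera frame, advancing the band each
 * frame until every row has been captured once. Returns the number of
 * camera frames consumed -- this is why a single still takes so long. */
int capture_banded(void) {
    unsigned char band[LINES_PER_PASS][WIDTH * 2];
    int frames = 0;
    for (int start = 0; start < HEIGHT; start += LINES_PER_PASS) {
        /* one camera frame passes; latch only rows [start, start+4) */
        for (int i = 0; i < LINES_PER_PASS && start + i < HEIGHT; i++)
            get_row(start + i, band[i]);
        for (int i = 0; i < LINES_PER_PASS && start + i < HEIGHT; i++)
            memcpy(image[start + i], band[i], WIDTH * 2);
        frames++;
    }
    return frames;
}
```

At 4 lines per pass a 96-line image needs 24 full camera frames, which only works if the scene holds still.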
* Used an ATmega64 for most tests, but wants to run an AT32 7000 model
on the NGW100 board later. Has access to, and uses, an STK500, STK600
and JTAG ICE mkII, as well as 40Msps, 200Msps and 1Gsps digital scopes.
(2) ma4826 ('40):
* Published a set of I2C register values and matching 352x288 images
* Using a CPLD at 50MHz (XC95288XL 30% used) and a SRAM 256Kx16
(12ns) connected to a PC for the tests.
* Has tried these clock values, with and without PLL:
1) 6.25MHz (50/8)
2) 8.33MHz (50/6)
3) 12.5MHz (50/4)
* Has tried this size:
1) 1280x1024 (the SRAM fits 1280x200)
* Without JPEG everything is OK, but with JPEG he obtains defective images.
(3) buffercam ('40):
* Has shown pictures of his breakout board
* Has shown I2C register values
* Has shown noisy 352x288 images, but the images are looking much better now.
The confetti-like noise in the first image was due to the data
acquisition device (an Agilent Mixed Scope).
* Using ma4826's I2C register values for the most part
* Now can stream 352x288px images at a rate of 2.8 FPS onto a
microSD card continuously. (The 2GB card can hold almost an hour of
pictures at this rate.)
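That storage figure checks out if the frames are stored as raw 16-bit RGB565 (an assumption on my part); a quick sanity check:

```c
/* Sanity check of the "almost an hour on a 2 GB card" claim, assuming raw
 * 16-bit (RGB565) 352x288 frames and ignoring file-system overhead. */
double recording_seconds(double card_bytes, double fps) {
    double frame_bytes = 352.0 * 288.0 * 2.0;   /* 202752 bytes per frame */
    return card_bytes / (frame_bytes * fps);    /* seconds of footage */
}
/* recording_seconds(2e9, 2.8) comes to about 3523 s, i.e. roughly 59 min. */
```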
* Hardware is a PIC32 USB Starter Kit microcontroller running at 80MHz.
The data from the camera feeds into an AL440B-12 512KB FIFO buffer
and is clocked out using the Parallel Master Port on the PIC32. The
FatFs file system is used to write the data to a microSD card in
SPI mode.
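The capture-to-card path buffercam describes is essentially a chunked write loop. The sketch below mocks it on a host, with stdio's fopen/fwrite/fclose standing in for FatFs's f_open/f_write/f_close, and the AL440B FIFO replaced by a plain buffer; names and the sector-sized chunking are illustrative:

```c
#include <stdio.h>

#define FRAME_BYTES (352 * 288 * 2)  /* one raw RGB565 frame */
#define CHUNK 512                    /* drain the FIFO one SD sector at a time */

static unsigned char fifo[FRAME_BYTES];  /* stand-in for the AL440B FIFO */

/* Append one frame to an open file in sector-sized chunks, the way the
 * FIFO contents would be pushed through FatFs's f_write on the PIC32.
 * Returns the number of bytes written. */
long write_frame(FILE *f) {
    long done = 0;
    while (done < FRAME_BYTES) {
        long n = FRAME_BYTES - done;
        if (n > CHUNK) n = CHUNK;
        done += (long)fwrite(fifo + done, 1, (size_t)n, f);
    }
    return done;
}
```

Calling write_frame() once per captured frame on an open file gives the continuous 2.8 FPS stream described above.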
* This is for a senior design project, so there will be MUCH more info
and the full code posted by the end of the week.
* As far as buffercam can tell, there is nothing that limits the setup from
capturing JPEG data except figuring out the correct settings for the
registers. The data captured in JPEG mode seems to have a good JPEG
header but no valid data in the body of the image. (It's just a string of
bytes that repeats over and over.)
(4) Twingy ('30):
* Using an AT91SAM7S64, has shown 128x96 images
* Will publish a summary paper soon
(5) Random ('30):
* Using an ARM STM32F103VBT6, has shown 128x96 images
* Random uses the ARM chip to clock in data using some short assembler
code that reads the data clock, waits for it to drop low and then
copies in the byte of data to RAM. It reads in one line (128 pixels
or 256 bytes) and then triggers a DMA channel to send this data out
to the attached LCD, which takes RGB data in the same format, over
SPI. The DMA channel shunts the data over SPI automatically.
This is repeated for each line, and then again for each image (with
the LCD told to redraw at each image).
* The camera is configured in RGB mode, 128x96, no sync codes. More
specifically, register 0x03 is set to 0x22 and register 0x1E to 0x48.
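Those two writes are ordinary I2C register transactions. A minimal sketch of the byte layout follows; the 7-bit slave address 0x3C is my assumption (check the datasheet), only the register/value pairs come from the post, and the routine that actually clocks the bytes out is left to whatever I2C peripheral or bit-bang code the board provides:

```c
#include <stdint.h>

#define CAM_ADDR 0x3C  /* assumed 7-bit slave address -- verify against datasheet */

/* Build the 3-byte I2C write transaction for one camera register:
 * address byte with the R/W bit clear, then register number, then value.
 * Returns the transaction length. */
int cam_reg_write(uint8_t buf[3], uint8_t reg, uint8_t val) {
    buf[0] = (uint8_t)(CAM_ADDR << 1);  /* write address: R/W = 0 */
    buf[1] = reg;
    buf[2] = val;
    return 3;
}

/* The two writes from the post:
 *   cam_reg_write(buf, 0x03, 0x22);   128x96 RGB mode
 *   cam_reg_write(buf, 0x1E, 0x48);   no sync codes
 */
```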
* The camera is interfaced using a commercially made PCB which
connects the camera to the ARM directly; there is no supporting logic.
The camera's data pins are connected to 8 sequential pins on a port
on the ARM, and the sync lines are connected to random I/Os.
* Is not using a debugger, but uses the "Logic" logic analyser to
look at the data the camera is sending and to figure out timing. Is
programming the chip with a USB-TINY-ISP from Olimex. The camera
is the '30, the smaller version without JPEG and with a max image
size of 640x480 (not that anyone has achieved that yet).
* Uses a hardware interrupt trigger on each VD and HD sync. The VD
interrupt routine enables the HD interrupts, and each of those runs
the assembler that reads in the image data. The interrupts are normal
interrupts fired by an event connected to an interrupt signal.
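Putting the last few bullets together, the interrupt scheme could look roughly like this C-flavoured pseudocode. All pin, buffer, and handler names here are illustrative, not Random's actual code:

```
/* C-flavoured pseudocode -- names are illustrative only. */
volatile uint8_t line[256];          /* one 128-pixel RGB565 line */

void VD_ISR(void) {                  /* frame start */
    enable_HD_interrupt();           /* arm the per-line handler */
    lcd_begin_redraw();              /* tell the LCD to redraw per image */
}

void HD_ISR(void) {                  /* line start */
    for (int i = 0; i < 256; i++) {
        while (DCLK_pin_high())      /* wait for the data clock to drop low */
            ;
        line[i] = read_data_port();  /* 8 data pins on one sequential port */
        while (!DCLK_pin_high())     /* wait out the rest of the clock cycle */
            ;
    }
    start_dma_to_lcd(line, 256);     /* DMA shunts the line out over SPI */
}
```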
* Was originally storing only the first 32 pixels of the first 24 lines
of the image and sending them serially to an OLED screen, but is now
using the SparkFun LCD, which can be sent data considerably faster (it
takes less time to send it the data than it does to receive the data
from the camera).
* The camera is clocked at 6MHz to configure, then 4MHz when receiving
data. It initialises correctly more often than not, but failures do
happen and seem to be pretty much random - possibly a consequence of
the slow clock. When initialisation fails he gets random colour noise
or blocks of solid colour, with no consistent failure mode.
I hope this is helpful to others (like me) who have come late to this project.