SparkFun Forums 

By KreAture
Good question.
I think it's little endian (as in least significant byte first).
I store it in memory so that bmp.DATA[0] points to 'B' and bmp.DATA[1] is 'M'.
That would make my struct little endian, and since the struct matches the MCU's layout, I think the MCU is little endian too.

(I hate endianness!)
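That reasoning can be checked on the host with a minimal sketch (the bmp struct name is from the post above; everything else here is illustrative): the two-byte BMP magic 'B','M' reads back as 0x4D42 when interpreted as a little-endian 16-bit value, so if the first byte in memory is 'B', the machine is little endian.

```c
#include <stdint.h>
#include <string.h>

/* The BMP file magic is the two bytes 'B','M'.  Read as a
 * little-endian uint16 that is 0x4D42 ('M' << 8 | 'B'), so if the
 * struct in RAM matches what goes to disk byte-for-byte, the MCU
 * storing 'B' at offset 0 is little endian too. */
int is_little_endian(void)
{
    uint16_t probe = 0x4D42;       /* 'M' << 8 | 'B' */
    uint8_t first;
    memcpy(&first, &probe, 1);     /* first byte in memory */
    return first == 'B';           /* 'B' first => little endian */
}
```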
By buffercam

I just want to confirm something:
With the original settings that you posted, you are getting a frame rate of about 0.5 fps, correct?

That is, the camera takes about 2 seconds to transmit each image?

By ma4826
buffercam wrote:ma4826,

I just want to confirm something:
With the original settings that you posted, you are getting a frame rate of about 0.5 fps, correct?

That is, the camera takes about 2 seconds to transmit each image?


Best regards,

By kols
I'm gonna use this camera with a micro-SD card and a battery, taking pictures at less than 1 pic per sec. What controllers can handle the image data this sensor creates, and which one is the smallest possible?
By KreAture
So far, I don't think anyone has found a single controller that can do it alone. You will in most cases need extra memory to handle the image data in addition to the storage card to save them permanently.
That means:
1x 48- to 144-pin device as the MCU
1x 48- to 200-pin memory chip
1x micro sd socket
1x camera module
and a way to connect it all.

If anyone has found a way to make this smaller, please tell...
By buffercam
At such a slow transfer rate, I think it might be possible to do without a memory chip. The data just needs to be transferred to SD while it is being received from the camera.

I'm trying to go faster, so I'm using the AL440B-12 512K FIFO.
By KreAture
The transfer rates do not add up though.
The camera is supposed to deliver full, half, and quarter rates of its 15 fps framerate.
The slowest PLL input is supposedly 6 MHz, which should deliver a PLL rate of around 8-9 MHz. According to the app note, a minimum 17 MHz output clock is required for full output at any framerate, but I suspect that means full output at the full framerate.

Without the PLL, the minimum input is 6 MHz, and according to the app note we might reduce the derived output clock by setting R05 [7:6] to 0 for quarter-rate output. I have not verified this.

Further, setting the image size small will halve or even quarter the rate.

I think that, using an AVR32 or similar, it might be possible to receive and save to micro-SD on the fly. This is what I want to try.
It would mean handling the full 17 MHz output rate, though, but that might be possible using the image sensor hardware interface. If the camera also runs in JPEG mode, the data size would be reduced.

Until I have the image sensor interface working right I won't know if it is doable, though. The SD interface is very fast, so that should be no problem (10 to 25 Mbyte/s with DMA handling).
By buffercam
I don't remember the exact number, but the output clock for the settings ma4826 specified was in the kHz range; around 500 kHz, if I remember correctly. Definitely under 1 MHz.

I agree that JPEG can't be captured w/o external memory.

You can get 10 Mbyte/s with SD? I don't think that's possible with SPI. What interface are you using?
EDIT: Some ARM processors apparently have an MCI (SD/MMC) interface that can write to SD quickly, like the NXP LPC2368. I don't have any experience with that, though.
By KreAture
I have written libraries to handle SD cards correctly; that is, with full querying as well as identification and support for SDHC cards.
Normal cards allow a 25 MHz connection speed, 1- or 4-bit.
For 1-bit mode, the theoretical max transfer is 25/8 = 3.125 Mbyte/s.
For 4-bit mode, the theoretical max transfer is 25/2 = 12.5 Mbyte/s.

SDHC cards allow a 50 MHz connection speed, doubling the rates:
that's 6.25 and 25 Mbyte/s.
Since the AVR32 series supports 4-bit mode and can be programmed with custom clock frequencies, I can get 48 MHz on the SD interface and still have a USB-compatible clock rate on the rest of the system (and 144 MHz on the main CPU).
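Those figures are just clock-times-bus-width arithmetic; a one-line helper (illustrative, not from any SD library) reproduces them:

```c
/* Theoretical peak SD bus throughput in bytes/s: one bit per bus
 * line per clock, so clock_hz * bus_bits / 8.  Reproduces the
 * numbers above: 25 MHz x 1-bit = 3.125 Mbyte/s, 25 MHz x 4-bit
 * = 12.5 Mbyte/s, and 50 MHz (SDHC) doubles both. */
unsigned long sd_peak_bytes_per_sec(unsigned long clock_hz, unsigned bus_bits)
{
    return clock_hz / 8UL * bus_bits;
}
```

Real-world rates are lower, of course, since this ignores command overhead and card busy times.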
By buffercam
Wow. Nice.
By Random
I've finally had some success with the '30:

I'm using 15fps, RGB, 128x96 (same resolution as the LCD, which is the $1 SF sell), no sync codes, everything else at default.
My microcontroller is an STM32F103VBT6, a Cortex-M3 ARM. I'd recommend the VET6 over it, though: it has significantly more memory, so you'd be able to store a whole frame (at 128x96) in RAM and write it out at your leisure. As it is, I'm writing out each line after reading it, using 18 MHz SPI with DMA.

The image on the screen updates fast enough that you can't see a scan line and motion is a bit blurred but pretty similar to a mobile phone. Colour reproduction on the LCD is excellent, colours look lifelike.

The camera is clocked at 6MHz initially and then dropped to 4MHz when I start collecting image data (at this resolution DCLK=0.5 EXTCLK so 2MHz data input).

I'm reading the image data with some simple assembler that runs on a normal interrupt firing on each HD rise, using another interrupt that fires on VD rise to enable the HD ones.
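The structure Random describes (an HD interrupt reads one line, then DMA pushes it to the LCD over SPI) can be modelled on the host like this. It's a sketch only: read_byte() and spi_dma_send() stand in for the real port read and DMA start, and the STM32 register details are omitted.

```c
#include <stddef.h>
#include <stdint.h>

#define LINE_PIXELS     128
#define BYTES_PER_PIXEL 2                 /* RGB565 */
#define LINE_BYTES      (LINE_PIXELS * BYTES_PER_PIXEL)

static uint8_t line_buf[LINE_BYTES];

/* Model of the HD-interrupt body: clock in one line of pixel data,
 * then hand the buffer to the SPI DMA channel.  In hardware,
 * read_byte() would wait for DCLK to drop low and grab the port,
 * and spi_dma_send() would kick off the DMA transfer to the LCD. */
static void on_hd_rise(uint8_t (*read_byte)(void),
                       void (*spi_dma_send)(const uint8_t *, size_t))
{
    for (size_t i = 0; i < LINE_BYTES; i++)
        line_buf[i] = read_byte();
    spi_dma_send(line_buf, LINE_BYTES);   /* DMA shunts the line out */
}

/* Host-side stubs so the model can be exercised off-target. */
static size_t bytes_read, bytes_sent;
static uint8_t fake_read_byte(void) { return (uint8_t)bytes_read++; }
static void fake_spi_dma_send(const uint8_t *p, size_t n)
{
    (void)p;
    bytes_sent += n;
}
```

The VD interrupt would then just reset the line counter and enable the HD interrupts, as described above.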
By hughanderson

I have put together a summary of the current status of the project, by
asking each of the significant developers, what equipment/hardware
approach they are using. Note that there are developers working on both
the TCM8240 (1300x1040) and TCM8230 (640x480) sparkfun cameras. The
'40 appears to have JPEG encoding on board, but so far I think the
developers have had most success using the camera in RGB mode.

(1) KreAture ('40):
* Developed a breakout board, and published it on the forum
* Published the underside, and side pinouts of the '40
* Published header structs for BMP output from RGB data. Adding this
struct as a header to the raw RGB data from the camera will allow
most windows (and others?) software to read the file as a standard
bitfield BMP without any further processing.
* Experimented with JPG output, but is currently using RGB. His
platform has too little memory to handle all the data from an image in
one go, so it captures 2-4 lines of data at a time and skips ahead to
the next image until a complete frame is captured. Data is continuously
sent to the host via serial. If the image is stationary this works OK,
but it takes a long time per image.
* Used an ATmega64 for most tests, but wants to run an AT32AP7000
on the NGW100 board later. Has access to, and uses, the STK500, STK600,
and JTAGICE mkII, as well as 40 Msps, 200 Msps and 1 Gsps digital scopes.
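KreAture's BMP-header trick from item (1) can be sketched as a packed BI_BITFIELDS header for RGB565 data. His exact struct wasn't reproduced here, so these field values are an assumption based on the standard BMP format; it also assumes each row's byte count (width x 2) is a multiple of 4, which holds for 352 and 128.

```c
#include <stdint.h>

/* Illustrative 16-bit BI_BITFIELDS BMP header (BITMAPFILEHEADER +
 * BITMAPINFOHEADER + RGB565 channel masks), 66 bytes total when
 * packed.  Prepending this to raw RGB565 camera data yields a file
 * most image viewers can open. */
#pragma pack(push, 1)
typedef struct {
    uint8_t  magic[2];       /* 'B','M' */
    uint32_t file_size;      /* header + pixel data */
    uint32_t reserved;
    uint32_t data_offset;    /* byte offset of pixel array (66) */
    uint32_t header_size;    /* 40 = BITMAPINFOHEADER */
    int32_t  width;
    int32_t  height;         /* negative => top-down row order */
    uint16_t planes;         /* 1 */
    uint16_t bpp;            /* 16 */
    uint32_t compression;    /* 3 = BI_BITFIELDS */
    uint32_t image_size;     /* pixel data bytes */
    int32_t  xppm, yppm;
    uint32_t colors_used, colors_important;
    uint32_t red_mask;       /* 0xF800 for RGB565 */
    uint32_t green_mask;     /* 0x07E0 */
    uint32_t blue_mask;      /* 0x001F */
} bmp565_header;
#pragma pack(pop)

/* Fill the header for a w x h image stored in camera scan order
 * (top row first), hence the negative height. */
void bmp565_fill(bmp565_header *h, int32_t w, int32_t rows)
{
    uint32_t px_bytes = (uint32_t)w * (uint32_t)rows * 2u;
    h->magic[0] = 'B';  h->magic[1] = 'M';
    h->reserved = 0;
    h->data_offset = (uint32_t)sizeof *h;
    h->file_size = h->data_offset + px_bytes;
    h->header_size = 40;
    h->width = w;
    h->height = -rows;               /* top-down */
    h->planes = 1;
    h->bpp = 16;
    h->compression = 3;              /* BI_BITFIELDS */
    h->image_size = px_bytes;
    h->xppm = h->yppm = 2835;        /* ~72 dpi, arbitrary */
    h->colors_used = h->colors_important = 0;
    h->red_mask = 0xF800; h->green_mask = 0x07E0; h->blue_mask = 0x001F;
}
```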

(2) ma4826 ('40):
* Published a set of I2C register values and matching 352x288 images
* Using a CPLD at 50MHz (XC95288XL 30% used) and a SRAM 256Kx16
(12ns) connected to a PC for the tests.
* Has tried with these clock values with and without PLL.
1) 6.25MHz (50/8)
2) 8.33MHz (50/6)
3) 12.5MHz (50/4)
* Has tried with these sizes:
1) 352x288
2) 160x120
3) 320x240
4) 1280x1024 (only 1280x200 fits in the SRAM)
* Without JPEG everything is OK, but with JPEG he obtains defective
JPEG images.

(3) buffercam ('40):
* Has shown pictures of his breakout board
* Has shown I2C register values
* Has shown noisy 352x288 images, but the images are looking much better now.
The confetti-like noise in the first image was due to the data
acquisition device (an Agilent Mixed Scope).
* Using ma4826's I2C register values for the most part
* Now can stream 352x288px images at a rate of 2.8 FPS onto a
microSD card continuously. (The 2GB card can hold almost an hour of
pictures at this rate.)
* Hardware is a PIC32 USB Starter Kit microcontroller running at 80MHz.
The data from the camera feeds into an AL440B-12 512KB FIFO buffer.
The data is clocked out using the Parallel Master Port protocol on the
PIC32. We use the FatFs file system to write the data to a microSD
card using the SPI mode.
* This is for a senior design project, so there will be MUCH more info
and the full code posted by the end of the week.
* As far as buffercam can tell, there is nothing that limits the setup from
capturing JPEG data except figuring out the correct settings for the
registers. The data captured in JPEG mode seems to have a good JPEG
header but no valid data in the body of the image. (It's just a string of
bytes that repeats over and over.)

(4) Twingy ('30)
* Using AT91SAM7S64, has shown 128x96 images
* Will publish a summary paper soon

(5) Random ('30)
* Using ARM STM32 F103 VBT6, has shown 128x96 images
* Random uses the ARM chip to clock in data using some short assembler
code that reads the data clock, waits for it to drop low and then
copies in the byte of data to RAM. It reads in one line (128 pixels
or 256 bytes) and then triggers a DMA channel to send this data out
to the attached LCD, which takes RGB data in the same format, over
SPI. The DMA channel shunts the data over SPI automatically.
This is repeated for each line, and then again for each image (with
the LCD told to redraw at each image).
* The camera is configured in RGB mode, 128x96, no sync codes. More
specifically, register 0x03 is set to 0x22 and register 0x1E to 0x48.
* The camera is interfaced using a PCB made commercially which
connects the camera to the ARM directly, there is no supporting logic.
The camera's data pins are connected to 8 sequential pins on a port
on the ARM, and the sync lines are connected to random I/Os.
* Is not using any debugging but is using the "Logic" logic analyser to
look at the data the camera is sending and figure out timing. Is
programming the thing with a USB-TINY-ISP from Olimex and the
camera is the '30, the smaller version without JPEG and with a max
image size of 640x480 (not that anyone's got that yet).
* Uses a hardware interrupt trigger on each VD and HD sync. The VD
interrupt routine enables the HD interrupts, and each of those runs
the assembler that reads in the image data. The interrupts are normal
interrupts fired by an event connected to an interrupt signal.
* Was originally only storing the first 32 pixels of the first 24 lines
of the image and sending them serially to an OLED screen, but is now
using the SparkFun LCD, which can be sent data considerably faster (it
takes less time to send it the data than it does to receive it from
the camera!)
* The camera is clocked at 6MHz to configure, then 4MHz when receiving
data. Initialisation only succeeds intermittently, though more often
than not; this seems to be pretty much random but might be a
consequence of the slow clock. When initialisation fails he gets random
colour noise or blocks of solid colour, with no consistent failure mode.

I hope this is helpful to others (like me) who have come late to this project.

Regards Hugh
By KreAture
Interesting summary by hughanderson, and I caught something I hadn't noticed: the corrupt/illegal data in JPEG mode might be an error code...
I can't remember where in the docs I read this, but there are some limits to resolution and minimum pll speeds that must be met for jpeg output to work. Might be worth looking into.

I'm heading down that road as soon as I have time to finish my code for the avr32 platform. My plan is to use the image sensor interface as it should be fully compatible with the cmos cam.
By silic0re
Well done Random, that looks like a fantastic project! :)