SparkFun Forums 

By sebmadgwick
#98851
It seems that some files created by a device I am working on are corrupt and cannot be read by my computer. Files are always closed and the drive unmounted before the SD card is removed. Files >100MB in size (created over a 20+ minute period) are almost certain to be corrupt; smaller files are less likely to be. In tests I created over 100 smaller files (<500kB) in rapid succession (closing and unmounting after each) and found around 2-3% were corrupted.

Please can anyone offer suggestions as to what is going wrong? I am out of ideas!
By UhClem
#98878
How old is your computer's MMC/SD card reader?

The differences between the MMC card standard, SD 1.0, and SD 2.0 will confuse older readers, and they will not read the card correctly. If you write with one of these older devices, it will corrupt the data on the card. I had this problem and it took a bit of work (I mentioned it in this thread) to figure out.

Reading and writing data is the same, but the format of the Card Specific Data (CSD) register changed.
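
To give an idea of what changed, here is a rough sketch of how the two CSD layouts are parsed (untested; the field positions are from the SD specifications, and the function name is just for illustration). It assumes a 16-byte csd[] buffer already read back with CMD9 (SEND_CSD):

#include <stdint.h>

/* Returns card capacity in bytes, or 0 for an unknown CSD version.
   csd[0] holds CSD bits 127:120, csd[15] holds bits 7:0. */
uint64_t csd_capacity(const uint8_t csd[16])
{
    switch (csd[0] >> 6) {               /* CSD_STRUCTURE, bits 127:126 */
    case 0: {                            /* CSD v1.0 (MMC, SD 1.x)      */
        uint32_t read_bl_len = csd[5] & 0x0F;                    /* bits 83:80 */
        uint32_t c_size      = ((csd[6] & 0x03) << 10)           /* bits 73:62 */
                             | (csd[7] << 2) | (csd[8] >> 6);
        uint32_t c_size_mult = ((csd[9] & 0x03) << 1)            /* bits 49:47 */
                             | (csd[10] >> 7);
        /* capacity = (C_SIZE+1) * 2^(C_SIZE_MULT+2) * 2^READ_BL_LEN */
        return (uint64_t)(c_size + 1) << (c_size_mult + 2 + read_bl_len);
    }
    case 1: {                            /* CSD v2.0 (SDHC)             */
        uint32_t c_size = ((uint32_t)(csd[7] & 0x3F) << 16)      /* bits 69:48 */
                        | (csd[8] << 8) | csd[9];
        return (uint64_t)(c_size + 1) * 512 * 1024;              /* units of 512KB */
    }
    default:
        return 0;
    }
}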
By sebmadgwick
#98885
Thanks for the suggestion. That does not seem to be the problem (based on the following).

Since my previous post I have done a lot of tests that hopefully provide results that accurately represent the situation:
- Files logged at 256Hz * 87 bytes = 22.3kB/s with 3.685MHz SPI do not corrupt.
- Files logged at 512Hz * 87 bytes = 44.5kB/s with 10MHz SPI do not corrupt.
- Files logged at 1024Hz * 87 bytes = 89kB/s with 10MHz SPI may corrupt.
Other info:
- The problem is on my SanDisk card; my only other card (Samsung) is not subject to this problem.
- The first file seems to corrupt far more often than any others.

Can any suggestions be made with this new information? I am hesitant to blame the SanDisk card itself.
By sebmadgwick
#98907
I have done such a test (199MB file, SFE uSD card reader COM-09433, all cards formatted first):
- Samsung MC2GR256UACY-PA, 37 secs
- SanDisk 1GB (from SFE), 33 secs
- Kingston ND185-002.AD00LF, 28 secs
- Samsung MB-MS2G311, 33 secs
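(That works out to roughly 199MB / 37s ≈ 5.4MB/s at the slowest and 199MB / 28s ≈ 7.1MB/s at the fastest.)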

!!! HOWEVER, this data is contrary to the speeds achieved with my device !!!
Both Samsung cards are capable of sustaining throughputs almost twice that of the SanDisk card.

With 10MHz SPI, the Kingston card causes my device to give up file writing after too many consecutive 'fileWrite()' failures; it only seems to work with my lower option of 3.685MHz. Its practical throughput with my device is very low.

My hypothesis (and I would greatly appreciate feedback on this) is that my device is functioning just fine and that different manufacturers' SD cards have different SPI performance characteristics (as seen with my device), but their SD-protocol performance characteristics (as seen with the PC) are largely the same. The worst-case scenario is that the lower-performing SPI leads to a corrupt file.
This would explain why the Kingston is so fast with a PC but cannot exceed a slow 25kB/s with my device over SPI, and why the Samsung cards achieve a remarkable throughput double that of the other cards (~200kB/s) yet are slower with the PC file transfer.

The only solution may be to recommend Samsung (or similar) SD cards to my device's users(?).
By mac
#99012
markaren1 wrote:Using a PC and card adaptor write a large file to each card, time transfers.

See if there is any obvious difference between card performance.

-Mark
Not relevant, as SD card readers do not use SPI mode for reading/writing.
By rmteo1
#99014
Or look at an MCU with SDIO (such as STM32F103xxx) instead of using SPI.
The SDIO features include the following:
● Full compliance with MultiMediaCard System Specification Version 4.2; card support for three different data bus modes: 1-bit (default), 4-bit and 8-bit
● Full compatibility with previous versions of MultiMediaCards (forward compatibility)
● Full compliance with SD Memory Card Specifications Version 2.0
● Full compliance with SD I/O Card Specification Version 2.0; card support for two different data bus modes: 1-bit (default) and 4-bit
● Full support of the CE-ATA features (full compliance with CE-ATA digital protocol Rev1.1)
● Data transfer up to 48 MHz for the 8-bit mode
● Data and command output enable signals to control external bidirectional drivers
By sebmadgwick
#99182
Thanks for all your help. I have now done plenty more experimenting. Based on this and your comments, I would conclude that for robust file writing with a sustained throughput >25kB/s, the SPI interface is not robust enough and SDIO would be a better choice (though I have not used it).

I should point out that although sustained throughputs >100kB/s were achievable, they potentially caused a corrupt file (with single files approaching 1GB or more). Also, high throughputs like this were only possible with the Samsung cards (vs. SanDisk and Kingston).
By UhClem
#99204
sebmadgwick wrote:Thanks for all your help. I have now done plenty more experimenting. Based on this and your comments, I would conclude that for robust file writing with a sustained throughput >25kB/s, the SPI interface is not robust enough and SDIO would be a better choice (though I have not used it).

I should point out that although sustained throughputs >100kB/s were achievable, they potentially caused a corrupt file (with single files approaching 1GB or more). Also, high throughputs like this were only possible with the Samsung cards (vs. SanDisk and Kingston).
Did you try enabling CRC checks?
By sebmadgwick
#99279
I was hoping that with lower throughputs my problems would disappear. At first they did, but I left my device logging overnight (0.5GB file) and all files on the drive became corrupted! Previous 100MB and 200MB files have been OK.

Do you think that using CRC will solve this problem? Does anyone have any advice before I spend a whole lot of time incorporating CRC into my library, only to perhaps find my problems still haven't disappeared? By the way, I can't use SDIO; I have to get SPI working robustly.
By UhClem
#99281
sebmadgwick wrote:I was hoping that with lower throughputs my problems would disappear. At first they did, but I left my device logging overnight (0.5GB file) and all files on the drive became corrupted! Previous 100MB and 200MB files have been OK.

Do you think that using CRC will solve this problem? Does anyone have any advice before I spend a whole lot of time incorporating CRC into my library, only to perhaps find my problems still haven't disappeared? By the way, I can't use SDIO; I have to get SPI working robustly.
CRC checks may not solve your problem, but they will rule out the actual SPI data link. If you don't get CRC errors then the problem is elsewhere. Since it is highly unlikely that the SD card would mangle the data internally, the problem would then be in your code.
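
For reference, data blocks in SPI mode are protected by CRC16-CCITT (polynomial 0x1021, initial value 0), and CMD59 (CRC_ON_OFF) turns checking on. A minimal bit-by-bit version looks something like this (the function name is mine, for illustration):

#include <stdint.h>
#include <stddef.h>

/* CRC16-CCITT as used for SD data blocks in SPI mode:
   polynomial x^16 + x^12 + x^5 + 1 (0x1021), initial value 0x0000.
   The 16-bit result is appended to the 512-byte block, MSB first. */
uint16_t sd_crc16(const uint8_t *data, size_t len)
{
    uint16_t crc = 0x0000;
    while (len--) {
        crc ^= (uint16_t)(*data++) << 8;
        for (int i = 0; i < 8; i++)
            crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x1021)
                                 : (uint16_t)(crc << 1);
    }
    return crc;
}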

One trick you might try is running your code against a captive file system. I found that testing my FAT code was much easier this way. I used mkfs (Linux) to create a FAT16 file system in a file, then replaced the layer of my code that handled the SD card interface with code that accesses this file.
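
If your FAT layer calls the card through sector-level read/write hooks, the swap can be as small as something like this (hypothetical function names, for illustration; the image file comes from, e.g., 'dd if=/dev/zero of=fat.img bs=1M count=64' followed by 'mkfs.vfat -F 16 fat.img'):

#include <stdio.h>
#include <stdint.h>

#define SECTOR_SIZE 512

/* Stand-in for the SD-card sector layer: same shape of interface the
   FAT library would call, but backed by a file system image file. */
static FILE *img;

int disk_open(const char *path)
{
    img = fopen(path, "r+b");
    return img ? 0 : -1;
}

int disk_read(uint32_t sector, uint8_t *buf)
{
    if (fseek(img, (long)sector * SECTOR_SIZE, SEEK_SET) != 0) return -1;
    return fread(buf, 1, SECTOR_SIZE, img) == SECTOR_SIZE ? 0 : -1;
}

int disk_write(uint32_t sector, const uint8_t *buf)
{
    if (fseek(img, (long)sector * SECTOR_SIZE, SEEK_SET) != 0) return -1;
    return fwrite(buf, 1, SECTOR_SIZE, img) == SECTOR_SIZE ? 0 : -1;
}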
By sebmadgwick
#99305
After a few hours I have given up on getting CRC working. I don't think that a 3.7MHz SPI bus on a professional PCB with tracks <30mm in length is resulting in bit errors. This is supported by the fact that I have written many >100MB files with no content errors.

UhClem, thanks for that tip, but I do not think it would be a productive exercise for me. I do not have a great understanding of FAT. The FAT library I am using is straight out of a book, as I am extremely limited in how much time I can spend on this work.

Do you have any suggestions as to where my system could be going wrong, given that I can write 100MB and 200MB files with no FAT errors or file content errors, but when I created that 500MB file, all files on the drive became corrupt? It may or may not be relevant that larger files are created over a far longer period of time.
By UhClem
#99312
sebmadgwick wrote:After a few hours I have given up on getting CRC working. I don't think that a 3.7MHz SPI bus on a professional PCB with tracks <30mm in length is resulting in bit errors. This is supported by the fact that I have written many >100MB files with no content errors.

UhClem, thanks for that tip, but I do not think it would be a productive exercise for me. I do not have a great understanding of FAT. The FAT library I am using is straight out of a book, as I am extremely limited in how much time I can spend on this work.
You could perform a simple test even without understanding FAT16. Build a large test file system and then write a large file to it using your FAT16 code, large enough that you would have trouble writing it to an SD card. Then check the file to see if it is mangled. If it is, the problem is with the FAT16 code and not the cards.
sebmadgwick wrote:Do you have any suggestions as to where my system could be going wrong, given that I can write 100MB and 200MB files with no FAT errors or file content errors, but when I created that 500MB file, all files on the drive became corrupt? It may or may not be relevant that larger files are created over a far longer period of time.
It has been a year or so since I was deep in the FAT code, so I have forgotten most of the details. But a problem that appears only with large files suggests an overflow of some sort. Since it doesn't fall on an obvious boundary for 16- or 32-bit numbers, it would be subtle, something like overflowing an intermediate operation. Not that I can think of a reason for that at the moment.
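
To illustrate the kind of intermediate overflow I mean (a made-up example, not from your library): on a compiler where int is 16 bits, both uint16_t operands of a multiply are promoted only to int, so the product wraps before it is widened to 32 bits:

#include <stdint.h>

/* Hypothetical cluster-to-sector calculation on a 16-bit-int MCU.
   2000 * 64 = 128000 does not fit in 16 bits, so the first form
   wraps to 62464 before the assignment widens it. */
uint32_t first_sector(uint16_t cluster, uint16_t sectors_per_cluster)
{
    /* Bug: multiply happens in 16-bit int and silently wraps.      */
    /* return cluster * sectors_per_cluster;                        */

    /* Fix: force the multiply to happen in 32 bits.                */
    return (uint32_t)cluster * sectors_per_cluster;
}

On a PC, where int is 32 bits, the same expression happens to work, which is the sort of thing that makes these bugs hard to spot.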

What FAT library are you using?