SparkFun Forums 

Everything ARM and LPC
By TheDirty
#76462
millwood wrote:there could be other reasons: speed, upward mobility, additional peripherals, costs, etc.
Speed, possibly, but there are some pretty fast 8-bit parts, and if you want speed without memory, there are other options.
Upward Mobility to what? You can't go over 32k on the development platform you've chosen without spending big $$'s.
There are no peripherals on these chips that I don't have on 8-bit uCs; the only benefit is using a lot of them at once, which brings us back to the memory problem.
Cost? What? ARM chips aren't cheap.
millwood wrote:so far, I haven't written a program with >8k code (.hex). so it will take me a while to breach that 32k barrier.
So, what happens when you do, and you've committed yourself to a development environment that you can't use unless you pay big $$'s for? SD Card FAT library, GLCD, RTOS, RF/Ethernet Stack... It doesn't take much to get over 32k. If I wasn't running firmware larger than 32k, I'd really question why I was using ARM chips at all.

IAR and Keil may be fabulous, but they are a dead end for hobbyists unless you have $2.5k+ for a single-seat license.
By millwood
#76464
TheDirty wrote:Upward Mobility to what?
to more capable chips, if you DO indeed want to.
Cost? What? ARM chips aren't cheap.
but they can be cheaper than 8-bit chips of comparable performance.

lm3s628 is $4.50/1 at digikey. that's a 60+ mips chip with 8x10bit adc at 1msps, onboard temp sensor, 3 timers, 22 prioritized interrupt channels, tons of gpios, 2 uarts, rtc, mpu, etc.

the cheapest pic32 at digikey is $5+/1.
So, what happens when you do, and you've committed yourself to a development environment that you can't use unless you pay big $$'s for? SD Card FAT library, GLCD, RTOS, RF/Ethernet Stack... It doesn't take much to get over 32k.
I have committed myself to a development tool that costs me exactly $0 right now. and in the future, if I wanted to move up and don't want to fork over the cash, I can go another route, like GCC.
IAR and Keil may be fabulous, but they are a dead end for hobbyists unless you have $2.5k+ for a single-seat license.
they are a dead end for hobbyists who have to develop code over 32k AND are unwilling or unable to pay for the tools.

I would like to see when you or another hobbyist actually developed a 32k or larger package.
By TheDirty
#76473
I can see why they say you like to argue nonsensically.

Converting all your code to GCC is not trivial. Why would you start with a development tool that you can't continue to use?

Your assertion that hobbyists using ARM chips do not go over 32k is frankly ridiculous and silly.

My embedded ethernet project is over 32k and I've just really started with this.
By JJ
#76480
Code reusability is better with gcc as well. Your next project might be on, say, MIPS or Blackfin, and it's nice to be able to repurpose generic code with minimum fuss. It also makes it easier to unit test on an x86 system by mocking lower-level drivers, like a mock SD driver that returns sectors from a disk image.
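Something along these lines (a minimal sketch in C; the sd_init()/sd_read_sector() names and the 512-byte sector size are assumptions for the example, not any particular library's API):

Code:
/* Mock SD driver for host-side unit tests: serves 512-byte sectors
   straight from a disk image file instead of real SPI/SDIO hardware. */
#include <stdio.h>
#include <stdint.h>

#define SECTOR_SIZE 512

static FILE *disk_image;   /* e.g. an image of a real card captured on the PC */

int sd_init(const char *image_path)
{
    disk_image = fopen(image_path, "rb");
    return disk_image ? 0 : -1;
}

int sd_read_sector(uint32_t lba, uint8_t *buf)
{
    if (fseek(disk_image, (long)lba * SECTOR_SIZE, SEEK_SET) != 0)
        return -1;
    return fread(buf, 1, SECTOR_SIZE, disk_image) == SECTOR_SIZE ? 0 : -1;
}

Link the FAT code against this on the PC and the filesystem layer can be exercised with ordinary x86 test programs before it ever touches the target.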
By inventore123
#76490
+1 for GCC
It is a good compiler, it is available for many architectures ranging from desktop PCs to microcontrollers, and eases code portability.
When I moved from the AVR to the ARM it was easier because the compiler was the same.
On the ARM it takes a while to get used to, but it's definitely worth learning how to use it.
And based on some benchmarks I ran, it also has very fast floating-point math on ARM.

And a 32K code limit is a big problem for anybody doing serious microcontroller projects.
My last ARM project was an mp3 player and the code was 175KB, because it had:
- 35KB of bitmap colour images stored to flash (even those alone would break the code limit!)
- self-designed RTOS
- FAT32 read/write library
- part of the C and C++ standard library
- many lines of code for GUI drawing, and application code
By millwood
#76495
TheDirty wrote:I can see why they say you like to argue nonsensically.
you too.
Converting all your code to GCC is not trivial.
it depends on how the code is written. I often port code across mcu platforms and have had no problem with that.
Why would you start with a development tool that you can't continue to use?
because your needs *may* change.
Your assertion that hobbyists using ARM chips do not go over 32k is frankly ridiculous and silly.

your assertion that I asserted that is ridiculous and silly. I was merely saying that it has not been an issue with ME. i cannot speak for others.
My embedded ethernet project is over 32k and I've just really started with this.
then the free versions of keil / iar are not for you. it is just that obvious.

but that doesn't mean that they are not for others. our needs are diverse and what works for me may not work for you and what doesn't work for you may work for me. you will just have to recognize that.
By millwood
#76496
inventore123 wrote:+1 for GCC
when I started with arm, i did some study on the various ides and one thing that worried me then was the code quality of the gcc compiler and its "ever-changing" nature. they seem to be less mature and create (more?) bloated code in the end.
My last ARM project was an mp3 player and the code was 175KB, because it had:
I believe the limitation is on the code section but not on the size of the .hex file (ie code + data). I have compiled a small program with a large bitmap (>32k) with no problem in keil - haven't tried it on a real mcu so i couldn't tell if it actually worked but the compiler didn't complain about it.

edit: i wanted to add that I was wrong about the last point. if the code includes a large piece of data that is linked into the .hex file, the linker in the free demo version will complain about its size. if your code has a large piece of data but does not reference it (as i did), the compiler will not include the data piece in the .hex file and the linker will thus not complain.

hope it clarifies.
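here is a made-up fragment to show what i mean (the array size and the lcd_draw() call are purely hypothetical, not from my actual code):

Code:
/* hypothetical example: a const bitmap bigger than the 32k eval limit */
const unsigned char splash_bmp[40000] = { 0xAA /* ... rest of image data ... */ };

int main(void)
{
    /* with the line below commented out, splash_bmp is never referenced,
       so it is not pulled into the .hex and the eval linker stays happy.
       reference it (uncomment the call) and the image gets linked in,
       the output goes past 32k, and the eval linker refuses to build. */
    /* lcd_draw(splash_bmp); */
    for (;;)
        ;
}

whether an unreferenced array really gets dropped depends on the compiler / linker settings, but that is the behaviour i saw.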
Last edited by millwood on Wed Jul 08, 2009 7:50 am, edited 1 time in total.
By millwood
#76552
also, you can run GCC from within Keil so you get the best of both worlds: free and unlimited GCC compiler and a great gui from Keil.
By stevech
#76556
inventore123 wrote: My last ARM project was an mp3 player and the code was 175KB, because it had:
- 35KB of bitmap colour images stored to flash (even those alone would break the code limit!)
- self-designed RTOS
- FAT32 read/write library
- part of the C and C++ standard library
- many lines of code for GUI drawing, and application code

For reference: My current unfinished project is 4,700 lines of C code and compiles to much less than 32KB of flash in thumb mode, and 30% or so more in ARM mode. This includes the app, a cooperative task scheduler (single stack, multiple finite state machines instead of a preemptive RTOS), a dual serial port buffered interrupt driver, and a periodic timer.

As to RAM use, I am very careful to not declare static RAM buffers willy-nilly as PC people do. Tasks use no RAM if they are dormant.

With debugging on, it's much larger.
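The scheduler itself is nothing exotic. A skeleton of the idea looks roughly like this (an illustrative sketch only; the task names are made up, not my actual code):

Code:
/* Cooperative scheduler skeleton: each "task" is a finite state machine
   that does a small slice of work and returns, so all tasks share one stack. */
#include <stdint.h>

typedef void (*task_fn)(void);

static void uart_task(void)  { /* service the buffered UART driver's FSM */ }
static void timer_task(void) { /* periodic housekeeping driven by a tick flag */ }
static void app_task(void)   { /* application state machine */ }

static const task_fn tasks[] = { uart_task, timer_task, app_task };

int main(void)
{
    for (;;) {
        /* round-robin: a dormant task just checks its state and returns
           immediately, so it costs a few cycles and no stack of its own */
        for (uint32_t i = 0; i < sizeof tasks / sizeof tasks[0]; i++)
            tasks[i]();
    }
}

Each task keeps whatever state it needs in a few static variables and there is only one stack for everything, which is how RAM use stays small.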
By inventore123
#76570
one thing that worried me then was the code quality of the gcc compiler and its "ever-changing" nature. they seem to be less mature and create (more?) bloated code in the end.
It's ever changing because it is constantly improved. That's a good thing, not a bad one.
also, you can run GCC from within Keil so you get the best of both worlds: free and unlimited GCC compiler and a great gui from Keil.
That's good, i didn't know it was possible. But so you were more worried by the GCC's ease of use than by its code quality ;)
For reference: My current unfinished project is 4,700 lines of C code and compiles to much less than 32KB
Well, let's take 175KB and remove the 35KB of images, which leaves 140KB.
Then consider that I have:

- 20,000 lines of C++ code, with classes, templates, virtual functions and exceptions
(this is considering the preemptive RTOS, GUI library, filesystem library and application code)
- malloc/free, new/delete
- printf/sprintf with floating point support
- fopen, fclose, fread, fwrite from the C standard library (these call the low level filesystem library; see the sketch below)
- other C standard library functions
- floating point math
- <vector> <list> <string> <algorithm> <bitset> from C++ standard library

This is all ARM code, no thumb, no interworking.
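With GCC, the usual way to make fopen/fread reach the filesystem library is to supply the low-level newlib syscall stubs yourself. A simplified sketch (the fat32_* names are placeholders, not the real library's functions, and real code also needs _lseek, _fstat, _isatty and proper error handling):

Code:
/* newlib syscall stubs: route the C library's file I/O to the FAT32 code */

/* placeholders for the real filesystem library's entry points */
extern int fat32_open(const char *path, int flags);
extern int fat32_read(int fd, void *buf, unsigned len);
extern int fat32_write(int fd, const void *buf, unsigned len);
extern int fat32_close(int fd);

int _open(const char *path, int flags, int mode)
{
    (void)mode;                      /* permissions are meaningless on FAT */
    return fat32_open(path, flags);
}

int _read(int fd, char *buf, int len)  { return fat32_read(fd, buf, len); }
int _write(int fd, char *buf, int len) { return fat32_write(fd, buf, len); }
int _close(int fd)                     { return fat32_close(fd); }

Once these stubs are in place, the stdio layer just works on top of the filesystem library.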

To me it still seems an acceptable code size.
And by the way, the cpu is an LPC2138 with 512KB of flash, so what's the point in shrinking the code so much? The nice thing about ARMs is that they have lots of flash 8)

stevech, you haven't said what compiler you are using
By millwood
#76574
inventore123 wrote:It's ever changing because it is constantly improved. That's a good thing, not a bad one.
it depends. to me, the compilers / ides are tools i use to achieve my goal. i don't want to constantly learn about my tools.
But so you were more worried by the GCC's ease of use than by its code quality ;)
its code quality is my worry. as to its ease of use, well, gcc itself is just command line and has no ease of use to speak of. it is the ides on top of it that have the concept of ease of use.
stevech, you haven't said what compiler you are using
he seemed to suggest earlier in the thread that he likes iar.
By stevech
#76604
GCC is NOT just a command line compiler. WINARM, WINAVR, YAGARTO are IDEs. Rowley too, since it's GCC based.

But, in terms of support, stability, contractual recourse if there is a show-stopping bug in the tools, and shorter learning curve, a commercial compiler is always prudent. And these factors are why there are companies selling such.

IMO, the code quality from IAR and Keil is probably insignificantly different. Ease of use, documentation, debugger/JTAG support quite similar. ImageCraft has a lower cost commercial compiler. After try-before-buy, I chose IAR and J-Link, with the dongle license scheme. Very pleased.
By millwood
#76605
stevech wrote:GCC is NOT just a command line compiler. WINARM, WINAVR, YAGARTO are IDEs. Rowley too, since it's GCC based.
gcc the compiler / linker / utility / etc. (toolchain) is command-line based.

the IDEs (winarm, yagarto, keil uvision, etc.) provide a gui to the commandline programs.

it is the same set-up with PICC (commandline compiler) and hi-tide (gui) / mplab (gui), etc.
By cfb
#76618
millwood wrote:
stevech wrote:GCC is NOT just a command line compiler. WINARM, WINAVR, YAGARTO are IDEs. Rowley too, since it's GCC based.
gcc the compiler / linker / utility / etc. (toolchain) is command-line based.
The fact that gcc is both a command-line compiler AND can be accessed from an IDE is a bonus. Whereas IDEs are indispensable during the initial coding phase of development, command-line compilers can be very useful for non-interactive batch-builds during the maintenance / release phases, e.g. if you are supporting several different targets with your application.

For that very reason, although our Armaide Oberon-07 development system has a fully integrated compiler and linker, a separate command-line compiler and linker were also developed.
By theatrus
#76621
Not having a batch-capable command-line compiler is a major show stopper.

GCC is in no way scary. GDB is a great debugger. It's just what kool-aid you like swallowing, and I am very much in the UNIX tools arena.