SparkFun Forums 

By Arachnivore
#105733
Can we get the ball rolling on some open neural network stuff? I have a neural net fetish but there aren't many hobbyist tools out there to play with.

Introduction:
The great thing about neural nets is that instead of programming a new algorithm for each system you want to control, you program one algorithm that generates algorithms to solve any given problem. In nature, this approach has proven to be remarkably efficient and robust, even compared to the best custom-designed algorithms produced by decades of research. They're especially handy for modeling non-linear problems.

Here's an over-simplified explanation of how they work:
Each sensor (e.g. each pixel on a CMOS camera chip) has a corresponding "neuron" in the input layer. The number of neurons in a layer corresponds to the number of dimensions that layer maps data onto. So, the input from each sensor is initially assumed to be linearly independent of all the other sensors. Of course, the inputs from the pixels of a camera sensor are not completely independent, so the data entering this sensor layer actually lies on a lower-dimensional manifold. If you feed the output of the sensor layer into a second layer with fewer neurons (fewer dimensions to map the data onto), that second layer is forced to reduce the dimensionality of the data, and in doing so it models the relationships between the inputs.
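To make that concrete, here's a tiny numpy sketch of the idea (Python is just a stand-in here, and the layer sizes and random weights are made up for illustration, not taken from any real design): 64 "pixel" inputs feed a second layer of 16 neurons, so the 64-dimensional input gets squeezed into a 16-dimensional code.

```python
import numpy as np

rng = np.random.default_rng(0)

pixels = rng.random(64)                    # one frame from a hypothetical 8x8 sensor
W = rng.normal(scale=0.1, size=(16, 64))   # weights of the smaller second layer
b = np.zeros(16)                           # biases of the second layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 64 sensor values squeezed down to a 16-dimensional representation
code = sigmoid(W @ pixels + b)
print(code.shape)   # (16,)
```

With sensible training, those 16 numbers end up encoding the correlations between pixels that the paragraph above describes.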

Here's a decent introduction to the field

Here's a great video on the state of the art (circa 2007)

In the above video, Dr. Hinton invites the audience to visit his website where all the Matlab code for his demonstrations is freely available. I believe a good starting point (on the software side) would be to translate this code to Octave so that you don't have to take out a second mortgage for a copy of Matlab.
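For anyone who wants a feel for what that code does before porting it: as I understand the demos, they're built around restricted Boltzmann machines trained with one-step contrastive divergence (CD-1). Here's a rough numpy sketch of a single CD-1 weight update; it's my reading of the method, not a translation of Hinton's files, and the sizes and learning rate are made up (biases omitted to keep it short).

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_visible, n_hidden, lr = 784, 500, 0.1            # illustrative sizes only
W = rng.normal(scale=0.01, size=(n_visible, n_hidden))

def cd1_step(v0):
    """One contrastive-divergence (CD-1) update from a batch of visible vectors."""
    global W
    h0_prob = sigmoid(v0 @ W)                      # up: visible -> hidden
    h0 = (rng.random(h0_prob.shape) < h0_prob).astype(float)   # sample binary hidden states
    v1_prob = sigmoid(h0 @ W.T)                    # down: reconstruct the visibles
    h1_prob = sigmoid(v1_prob @ W)                 # up again from the reconstruction
    # positive phase minus negative phase, averaged over the batch
    W += lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / len(v0)

# Example: one update from a batch of 10 random "images".
cd1_step(rng.random((10, n_visible)))
```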

On the hardware side, I'd really like to see a heterogeneous computing platform for hobbyists. A Cortex-M3 connected to an FPGA with some on-board RAM would be great. Sadly, each new generation of FPGAs seems to have fewer and fewer non-BGA packaged devices.

Altera's Cyclone III with 40K logic cells is the largest such device, but it's a little pricey and doesn't have cascading DSP blocks with accumulators or distributed RAM for registers etc., which kind of cripples it.

Lattice's ECP2 with 20K logic cells has cascading DSP blocks and distributed RAM, but the built-in DDR memory interfaces are only available on BGA-packaged devices. Also, at less than half the price of the Cyclone, it has about 1/4 the embedded RAM and 1/5 the embedded multipliers (though they run about twice as fast).

Finally, there's the 9K logic cell Spartan-6 from Xilinx. It's pretty much the same story as the Lattice part, but with half the resources and slightly nicer DSP blocks. It will probably end up being more cost-effective (more resources per dollar) than the Lattice ECP2, because the ECP2 lags the Spartan-6 by about two process nodes (the ECP2 is made on 90 nm silicon, the Spartan-6 on 45 nm); however, I don't think the Spartan-6 goes into full production until mid-2011.
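For anyone wondering why the DSP blocks matter so much here: evaluating one neuron is essentially a long multiply-accumulate chain, which is exactly what cascaded DSP blocks compute in hardware. Here's a small Python sketch of the operation in Q15 fixed point (the format, sizes, and values are my assumptions, just to show the shape of the computation):

```python
def to_q15(x):
    """Convert a float in [-1, 1) to a signed 16-bit Q15 integer."""
    return max(-32768, min(32767, int(round(x * 32768))))

def neuron_q15(inputs, weights, bias_acc=0):
    """Pre-activation of one neuron: a chain of multiply-accumulates,
    which is what a cascade of FPGA DSP blocks computes in one pass."""
    acc = bias_acc                     # wide accumulator, like the DSP cascade register
    for x, w in zip(inputs, weights):
        acc += x * w                   # each term is one MAC
    # the product of two Q15 numbers is Q30; shift back down and saturate
    return max(-32768, min(32767, acc >> 15))

# 8 inputs x 8 weights, all in Q15.
ins = [to_q15(v) for v in (0.5, -0.25, 0.1, 0.9, -0.7, 0.3, 0.0, 0.6)]
wts = [to_q15(v) for v in (0.2, 0.4, -0.3, 0.1, 0.5, -0.6, 0.7, -0.1)]
print(neuron_q15(ins, wts))
```

Without hardware accumulators or a cascade path, every one of those adds has to be built out of fabric logic, which is where the "cripples it" comment above comes from.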

There, I've said my piece. Anyone else interested in this stuff?
By Cannibal
#105834
During my undergrad (EE) I took a course dealing with neural networks via problems of machine vision/image classification, as well as some much simpler problems.

While they are amazing, I recall the first few examples we did in class with a simple neural net learning to 'solve' something as simple as the XOR table.

Basically, it was surprisingly slow to train even for this, and if we over-trained it, the network began to 'learn' patterns in our test data that had nothing to do with the XOR function at all. Meaning that while they are great when it's very hard to define the given problem with a reasonable set of parameters, they have their own demons. :D
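For anyone who hasn't seen it, the XOR exercise really is a nice illustration of how slow plain backprop can be even on a four-row truth table. Here's a rough numpy sketch (the hidden-layer size, learning rate, and epoch count are arbitrary choices, and depending on the random seed it can still stall in a bad local minimum):

```python
import numpy as np

rng = np.random.default_rng(1)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR truth table

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)     # 2 inputs -> 4 hidden units
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)     # 4 hidden -> 1 output
lr = 1.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(10000):                        # thousands of passes for 4 rows
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)           # backprop through the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)            # ...and through the hidden layer
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(0)

print(np.round(out, 2).ravel())   # should end up close to [0, 1, 1, 0]
```

The over-training problem is the same thing in miniature: keep hammering on a small data set and the weights start encoding its quirks rather than the function you actually care about.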

I wouldn't mind seeing more on the topic, but I don't know how broad the appeal is.
By I2CMASTER
#105917
This sounds interesting and I think I would want one.
By Arachnivore
#106296
Whenever I talk to other engineers about neural nets, they always point out that neural nets are slow or have never been scaled past pet projects in a research lab.

It's true that the field of artificial neural networks (ANNs) has run into several pitfalls, but progress is still being made and I think the potential is obvious. Your brain performs feats that would take a football-stadium-sized cluster of conventional computers to replicate, and it does all this while consuming about 30 watts of power. The emergent algorithms that your brain constructs to model how the world works are incredibly sophisticated and robust, and the system can cope with a great deal of damage. There are cases where surgeons have had to remove an entire hemisphere of a child's brain to prevent a tumor from spreading, and the other hemisphere reorganized itself to take over the tasks of the missing one.

If ANNs were opened up to hobbyists, it might help progress in the field. Besides, it shouldn't be a huge problem if modern ANNs don't scale well past pet projects. Pet projects are the hobbyist's bread and butter, right?

If I'm in the minority here and there simply isn't demand for a hobbyist ANN platform, that would be a shame, but I think the hybrid ARM+FPGA platform would still be a worthwhile product for many applications. Also, I believe heterogeneous computing is going to become more important and mainstream in the near future, so it would be great if there were an affordable platform that let students get some experience with it. SF's FPGA offerings are getting a little long in the tooth; they're charging $80 for a $20 part on a breakout board.
By leon_heller
#106299
Training ANNs can be slow, but once trained they can be very fast. I think they are used a lot in the UK for number plate recognition. When I worked for BAe's Military Aircraft Division I did some work with them, and I supervised a student project at Hull University that used them for gesture recognition with a Dataglove.
By I2CMASTER
#107822
I'm building a BEAM robot using Nv and Nu networks, and an ANN would be great to have even if it was very simple, so it could replace all the individual Nv and Nu neurons! Also, I would be willing to pay $25 for an ANN that simply reads an input, learns from it, and then puts out an output. There are 4 sensors (2 light and 2 touch) and at least 2 motors to drive, so as you can see a very simple ANN would be great for SparkFun!
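For scale, the net being described here is genuinely tiny: 4 sensor inputs straight into 2 motor outputs is only 8 weights, which would fit on just about any micro. Here's a rough sketch of one read-learn-output cycle using a simple delta rule (the sensor ordering, learning rate, and where the "target" motor values come from are all my assumptions; a real BEAM bot would need some reward or teaching signal to supply the targets):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(2, 4))   # 2 motors x 4 sensors = 8 weights
lr = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cycle(sensors, target):
    """One read-learn-output step: sensors = [light_L, light_R, touch_L, touch_R],
    target = desired [motor_L, motor_R], all scaled to 0..1."""
    global W
    motors = sigmoid(W @ sensors)                 # read the input, produce an output
    W += lr * np.outer(target - motors, sensors)  # learn from it (delta rule)
    return motors

# Example: bright light on the left, no touches, teach it to turn toward the light.
print(cycle(np.array([0.9, 0.1, 0.0, 0.0]), np.array([0.2, 0.8])))
```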