I'm looking at porting an algorithm I wrote for Android to an Arduino. The algorithm is already pretty basic, and I still have plenty of ways to optimize it. It boils down to blob detection, and a single color channel would work fine. The most important parts are thresholding and detecting the blobs (which I haven't figured out how to do with the limited resources, but that aside).
Doing the math: most instructions on Atmel chips take a single clock cycle. For a 320×240 image at 30 fps, I get a minimum of 2,304,000 clock cycles per second if one pixel requires one cycle. On an 8 MHz CPU that leaves me at most ~3.5 clock cycles per pixel, and double that at 16 MHz. (I could also skip parts of frames where no blobs were previously found in the vicinity.)
That seems doable, but I'm guessing there's more at play than this. Input seems the most troubling part to me: an analog signal from a camera needs an ADC (which is limited to roughly 10,000 samples per second on most Arduinos), but could I use a comparator instead and do the thresholding in the analog domain? If so, is there documentation on this? Also, would it help to receive the significant bits via interrupt, or would that take just as many clock cycles?
Alternatively I could use a serial camera, but I'd imagine that would cost more clock cycles, and I wouldn't be able to do the thresholding with a comparator.
Can someone verify/comment on my reasoning here? Which methods are feasible?