On the way to autonomous robots, perception is a key capability. Among the perceptual senses, vision is undoubtedly the most important for the richness of the information it can provide. However, it is not easy to identify what is seen from the raw visual input. In this regard, inspired by human perception, we have studied motion as a primary cue. In particular, we present a computational solution for motion detection, object localization, and tracking from images captured by perspective and fisheye cameras. The proposed approach has been validated through an extensive set of experiments and applications on different testbeds in real environments with real and/or virtual targets.