Scratchbot is designed as a rescue bot, meant to go places where there is low visibility. Its defining feature is that it uses "whiskers" to feel for things. We feel this is a little gimmicky. In a low-visibility situation, wouldn't IR, audio, or possibly sonar be more effective? How would it differentiate between different physical obstacles? Are the whiskers really new, or are they just bump sensors? Maybe they have something a little more complicated going on. There was another recent bot that utilized whiskers and compared different tactile profiles to determine what it was touching.
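To give a rough sense of what "comparing tactile profiles" might mean, here's a minimal sketch of one plausible approach: matching a measured whisker deflection trace against a small library of recorded profiles via nearest-neighbor distance. All of the names, data, and the method itself are our own illustrative assumptions, not anything from the actual robot.

```python
import numpy as np

# Hypothetical reference library: object label -> recorded whisker
# deflection trace (amplitude over time). Purely illustrative data.
PROFILES = {
    "soft": np.array([0.0, 0.1, 0.2, 0.2, 0.1, 0.0]),
    "hard": np.array([0.0, 0.5, 0.9, 0.9, 0.5, 0.0]),
    "edge": np.array([0.0, 0.9, 0.1, 0.0, 0.0, 0.0]),
}

def classify_touch(trace):
    """Return the label whose stored profile is closest (by Euclidean
    distance) to the measured whisker deflection trace."""
    return min(PROFILES, key=lambda label: np.linalg.norm(PROFILES[label] - trace))

measured = np.array([0.0, 0.45, 0.85, 0.9, 0.55, 0.05])
print(classify_touch(measured))  # → hard
```

Even a toy comparison like this is more than a bump sensor gives you: a bump sensor only reports contact, while a deflection trace over time carries some information about what was contacted.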