Cyborg, Or Leafy Sensor Array?

Some plants react quickly enough for our senses to notice, such as a Venus flytrap or Mimosa pudica. Most of the time, though, we need at least time-lapse photography to see anything happen, while more exotic sensors can measure things like microscopic pores opening and closing. As with any sensor reading, those measurements can be turned into action through a little trick we call automation. [Harpreet Sareen] and [Pattie Maes] at MIT brought these two ideas together in a way we haven’t seen before, putting a plant in the driver’s seat of a project called Elowan. Details are sparse, but the concept is easy enough to grasp.
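Since the project's own code isn't published in detail, here is a minimal sketch of how that sensor-to-automation loop might look: treat the plant's electrode as a noisy analog signal, and when it spikes above a baseline, steer a two-wheeled base toward the brighter of two light sensors. Every function name, pin-free placeholder, and threshold below is an assumption for illustration, not Elowan's actual firmware.

```python
# Hypothetical sketch of a plant-in-the-loop rover; the read_* and drive
# functions are placeholders you would swap for real ADC and motor code.
import random
import time

SPIKE_THRESHOLD = 0.05  # volts above baseline, purely illustrative


def read_electrode_volts():
    # Placeholder for an ADC read of the plant electrode.
    return 1.0 + random.uniform(-0.02, 0.08)


def read_light(side):
    # Placeholder for a photoresistor reading on the 'left' or 'right' side.
    return random.uniform(0.0, 1.0)


def drive(left, right):
    # Placeholder for a differential-drive motor command.
    print(f"motors: left={left:.1f} right={right:.1f}")


baseline = read_electrode_volts()
for _ in range(20):  # a few iterations instead of an infinite loop
    spike = read_electrode_volts() - baseline
    if spike > SPIKE_THRESHOLD:
        # The plant "spoke up": roll toward the brighter side.
        if read_light("left") > read_light("right"):
            drive(0.2, 0.5)  # arc left
        else:
            drive(0.5, 0.2)  # arc right
    else:
        drive(0.0, 0.0)  # signal is quiet, so stay put
    time.sleep(0.1)
```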

We are not sure if this qualifies as a full-fledged cyborg or if this is a case of a robot using biological sensors. Maybe it all depends on how you frame this mixture of plant and machine. Perhaps it truly is the symbiotic relationship the project claims it to be: the robot would not receive any instructions without the plant, and the plant would get sub-optimal light without the robot. What other ways could plants be integrated into robotics to make a bona fide cyborg?
