Ecoation is changing the way we produce and protect our food. The main reason farmers and greenhouse growers use pesticides is the sheer size of their operations: by the time they learn about an issue, it is already established and spreading, and they frankly have no option but to fight the fire by all means available.
Plants have a wonderful ability to respond to their environment. They communicate with the world and with each other through chemical signals that express how they feel; in other words, they change their smell when they are unhappy or in danger. I studied this system for my PhD more than a decade ago, and there is a whole field of science, chemical ecology, that investigates plants' chemical communication. We asked ourselves: if plants talk about their own health, why can't we just listen to them? Plants can tell us when something is wrong, and we can address the issue at an early stage, fixing it before the problem gets out of hand and forces us to use the strongest weapon in the arsenal: chemical pesticides!
Fast forward a few years, and we built just that. The core capability of Ecoation is pinpointing crop stress at an early stage and helping farmers address the issue locally, in most cases without the use of pesticides. To enable this AI-based sensing capability, we had to develop a mobility solution that could cover the large size of these operations. Farms and greenhouses use various mobility platforms, from tractors to trolley carts. For our first product line, we devised a unit that can be mounted on the different mobility platforms farmers already use. As workers move about the farm doing their primary tasks, the cameras, sensors, and on-board computers capture data from the plants, providing real-time analysis and live alerts when issues are found. This lets growers save on labour, getting three or four tasks done with a single person driving the cart, while digitally archiving the status of the crop.

As part of the technology, we produced the world's first virtual reality and near-live streaming service for agriculture. Within 5–20 minutes of a machine making its way through a plant row, growers, consultants, crop managers, and farm owners can see immersive 8K, 360° views of every square metre of the greenhouse and take a "virtual walk" inside the crop from anywhere in the world. This tremendously changes the way we interact with our crops: senior growers can spend their time making decisions instead of walking up and down the rows inspecting plants, and consultants no longer have to fly to faraway places just to spend a few hours on the ground and make some recommendations. The carbon emission savings of our platform are equivalent to removing 242 cars from the streets for one year.
To push the envelope even further, we designed and built a fully autonomous self-driving vehicle for greenhouses that can enter every plant row, autonomously scan the crop, broadcast issues live, and, most importantly, work after hours throughout the night when workers are not around. This profound advancement earned us the Innovation Award at GreenTech Amsterdam, making us the first Canadian company to win this major, prestigious award in the Netherlands, the de facto home of horticultural innovation. However, our design still needed improvement: our intelligence allowed us to identify crop issues, but human intervention was necessary to fix problems when they arose. So our next step in the design process was to add robotic arms to the platform, advancing our product into a "Find & Fix" solution. The result is a platform that autonomously monitors the rows of the crop, assesses the health of the plants, and, if it finds an area that needs help, treats it by releasing biological control agents (good bugs that eat bad bugs) to maintain healthy, nutritious plants without a drop of pesticides. The future is now, and this small Canadian company is changing the way we produce and protect our food, forever!