Our solution has adapted and advanced with our research on this project while remaining firmly focused on the benefits we can provide to end users.
We have created a system that provides farmers with precise information about the state of their fields using artificial intelligence and drone technology. Based on feedback from our partners that drone surveys would need to be offered to farmers as a service, we have created two separate mobile applications: one for farmers to enter and receive information about their fields, and another for a company offering drone surveys to those farmers.
In our system, the farmer can order a survey (either independently or based on a recommendation from the system), which is then picked up by an associated third-party service provider. This provider uses our second application to receive information on all the surveys required and completed by the company. The application then provides navigation to the field location and automated control of the drone to survey the field, without the need for intensive training. The results are processed on the mobile device, showing an immediate overlay of the troubled areas; once uploaded to our cloud server, the same results are distributed to the farmer.
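The survey lifecycle described here (requested by the farmer, picked up by a provider, flown, processed, and delivered back) can be sketched as a small state machine. This is an illustrative sketch only; the status names and `advance` helper are our own invention for this example, not the actual platform code.

```python
from enum import Enum, auto

class SurveyStatus(Enum):
    """Illustrative stages of a survey order's lifecycle."""
    REQUESTED = auto()   # farmer has ordered a survey
    ASSIGNED = auto()    # picked up by a third-party provider
    IN_FLIGHT = auto()   # drone is surveying the field
    PROCESSED = auto()   # results analysed on device / in the cloud
    DELIVERED = auto()   # results distributed to the farmer

# Allowed transitions: each stage may only move to the next one.
TRANSITIONS = {
    SurveyStatus.REQUESTED: {SurveyStatus.ASSIGNED},
    SurveyStatus.ASSIGNED: {SurveyStatus.IN_FLIGHT},
    SurveyStatus.IN_FLIGHT: {SurveyStatus.PROCESSED},
    SurveyStatus.PROCESSED: {SurveyStatus.DELIVERED},
    SurveyStatus.DELIVERED: set(),
}

def advance(current: SurveyStatus, target: SurveyStatus) -> SurveyStatus:
    """Move a survey to the next stage, rejecting invalid jumps."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"cannot move from {current.name} to {target.name}")
    return target
```

Modelling the lifecycle explicitly like this makes it easy for both applications to agree on which surveys are still outstanding and which are complete.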
The video below shows our mobile applications in action, with the navigation and scouting displayed at an increased speed for brevity.
We have been working hard to optimise our disease segmentation model and embed it in our mobile application. This will allow us to provide the user with the segmentation result even when no internet connection is available.
We have also been working on improving the model to work at lower resolutions, so that we can build a map overlay from the frames extracted from the low-resolution video streamed from the drone during the flight.
As you can see from the screenshots, we have made excellent progress by successfully achieving both objectives, although the current analysis speed is slow. This means the on-device segmentation will be used as a backup to the cloud service while we work on further optimisation.
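The overlay step can be illustrated with a minimal sketch: a low-resolution segmentation mask is upscaled to the frame size and diseased pixels are tinted red. This is a simplified stand-in for what the app does, not its actual implementation; the function name and the red-tint choice are ours for this example.

```python
import numpy as np

def overlay_disease_mask(frame: np.ndarray, mask: np.ndarray,
                         alpha: float = 0.5) -> np.ndarray:
    """Blend a binary disease mask (possibly lower resolution than the
    frame) onto an RGB frame, tinting diseased areas red."""
    h, w = frame.shape[:2]
    mh, mw = mask.shape
    # Nearest-neighbour upscale of the low-resolution mask to frame size.
    ys = np.arange(h) * mh // h
    xs = np.arange(w) * mw // w
    full_mask = mask[ys[:, None], xs[None, :]].astype(bool)

    out = frame.astype(np.float32).copy()
    red = np.array([255.0, 0.0, 0.0])
    # Alpha-blend the red tint over the flagged pixels only.
    out[full_mask] = (1 - alpha) * out[full_mask] + alpha * red
    return out.astype(np.uint8)
```

Because the mask is upscaled rather than the frame downscaled, the overlay stays aligned with the full-resolution imagery shown to the user.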
As our disease segmentation model is currently cloud-based, the drone survey application has to upload the images to the cloud for analysis after they have been downloaded from the drone.
As our aim is to provide immediate feedback, we have developed an efficient on-device classification model that uses frames extracted from the low-resolution video stream sent from the drone during the flight.
This allows us to immediately highlight which areas show disease before the more advanced segmentation results arrive.
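One simple way to turn per-frame classification into area highlights is to split each low-resolution frame into tiles and classify each tile independently. The sketch below assumes this tiling approach; the classifier itself is abstracted behind a caller-supplied function, since our actual model is not shown here.

```python
import numpy as np

def classify_tiles(frame: np.ndarray, tile: int, is_diseased) -> np.ndarray:
    """Split a low-resolution RGB frame into square tiles and classify
    each one, returning a coarse boolean grid of flagged areas.

    `is_diseased` stands in for the on-device classification model: it
    takes a (tile, tile, 3) patch and returns True if disease is likely.
    """
    h, w = frame.shape[:2]
    grid = np.zeros((h // tile, w // tile), dtype=bool)
    for i in range(grid.shape[0]):
        for j in range(grid.shape[1]):
            patch = frame[i * tile:(i + 1) * tile, j * tile:(j + 1) * tile]
            grid[i, j] = is_diseased(patch)
    return grid
```

The resulting grid is cheap to compute in flight and maps directly onto the field, so flagged tiles can be shown immediately while the full segmentation is still pending.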
As part of our shift to providing a mobile application for a drone survey provider, we considered how such a service would work.
Our vision is that businesses will be able to register as service providers with our system, and farmers will then be able to link their accounts to different providers based on their needs.
After a farmer has requested a survey, an employee of the service provider can log in to our drone survey application, receive a list of all the surveys assigned to their company, and view them on a map. Once they select the one they wish to carry out, the application provides GPS navigation to the field, where they can connect the drone and have it automatically survey the field.
To that end, we have added the navigation utility to the application using MapBox.
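MapBox handles the turn-by-turn routing itself, but the underlying calculation of how far the operator is from a field, and in which direction, is standard great-circle geometry. Here is a minimal sketch of that calculation using the haversine formula; the function name and signature are illustrative, not part of our application or of the MapBox API.

```python
import math

def distance_and_bearing(lat1: float, lon1: float,
                         lat2: float, lon2: float) -> tuple:
    """Great-circle distance in metres and initial bearing in degrees
    from the operator's position (lat1, lon1) to the field (lat2, lon2),
    using the haversine formula on a spherical Earth."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)

    # Haversine distance.
    a = math.sin(dphi / 2) ** 2 + \
        math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))

    # Initial bearing, normalised to [0, 360) degrees.
    y = math.sin(dlmb) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlmb)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing
```

For example, one degree of longitude along the equator works out to roughly 111 km due east, which is a handy sanity check for this kind of geodesy code.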
One of the outcomes of our successful project meeting in August was the realisation that smallholder farmers may not be the target for our mobile application, and that we should instead be targeting third-party organisations such as farmers' associations or drone survey companies. These would be able to afford to purchase drones and organise for them to be transported to farmers' fields as needed throughout the season.
This change in focus has resulted in the development of a completely new application, incorporating the automated drone control technology previously worked on but aimed instead at an end user envisaged to be an individual working for such an organisation.
Initial development has focused on providing the user with a list of fields to be surveyed and displaying their locations on a map. Once at the field, the user can initiate the automated survey and see the results along with any recommended actions. The same information will then be passed on to the field's owner directly by the platform.
Future development will focus on providing further utility to the user, such as better search and navigation to fields that need surveying, current and historical comparisons of survey results, and more actions on disease detection.
Development of the mobile application has continued, presenting end users with all the relevant information about their holdings. The following video demonstrates the current state of the application, which enables the user to monitor multiple fields and crops over time: they can see current sensor data, the disease analysis of uploaded images and videos, and the results of drone surveys of their fields.
The application aims to empower the user to improve their crops by linking them to relevant information on how to treat and prevent diseases as they are detected, as well as providing access to general advice on crop maintenance.
We have also included a forum facility to take advantage of the local community, prompting the user to post images that could not be analysed successfully, or that fall outside the scope of our models, for other people to advise on. These posts can then feed back into our data-gathering facilities.
We have developed a real-time crop disease detection tool designed to provide rapid diagnosis of disease in crops. The process uses artificial intelligence and drone imagery to show a visual representation of the disease while providing an estimate of the severity of the problem.
The video demonstrates this technique on a Malaysian rice paddy, detecting bacterial leaf blight (BLB).
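One common way to estimate severity from a segmentation result is to report the fraction of crop pixels flagged as diseased, excluding non-crop pixels such as water and bunds. The sketch below illustrates that idea; the mask encoding and function name are assumptions for this example, not the tool's actual output format.

```python
import numpy as np

def disease_severity(mask: np.ndarray) -> float:
    """Estimate severity as the fraction of crop pixels flagged diseased.

    Assumed encoding: 0 = healthy crop, 1 = diseased crop, and any
    negative value = non-crop (water, bunds) excluded from the ratio.
    """
    crop = mask >= 0
    if not crop.any():
        return 0.0  # no crop pixels visible in this image
    return float((mask[crop] == 1).sum() / crop.sum())
```

A percentage like this is easy to track over repeated surveys, so a worsening outbreak shows up as a rising severity figure for the same field.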
We have started development of the automatic drone control by our mobile application. The following video shows the initial user interface design, allowing the user to instruct the drone to survey the field, followed by automatic waypoint generation and a simulated drone flight.
The drone's location is mapped for the user to see, and the images taken are displayed before being uploaded to the data platform for analysis, with the resulting stitched overlays displayed on the map for the user.
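The waypoint-generation step can be sketched with a classic lawnmower (boustrophedon) pattern over a rectangular bounding box, alternating sweep direction on each pass so the drone never doubles back. This is a simplified illustration under the assumption of a rectangular field; our application works from the field's actual boundary.

```python
def survey_waypoints(lat_min: float, lat_max: float,
                     lon_min: float, lon_max: float,
                     rows: int) -> list:
    """Generate a lawnmower-pattern waypoint list over a rectangular
    field: `rows` parallel sweeps, alternating direction each pass."""
    waypoints = []
    for i in range(rows):
        # Evenly space the sweep lines between lat_min and lat_max.
        lat = lat_min + (lat_max - lat_min) * i / max(rows - 1, 1)
        if i % 2 == 0:
            waypoints.append((lat, lon_min))  # sweep west to east
            waypoints.append((lat, lon_max))
        else:
            waypoints.append((lat, lon_max))  # sweep east to west
            waypoints.append((lat, lon_min))
    return waypoints
```

In practice the row spacing is chosen from the camera's footprint at the flight altitude, so that adjacent sweeps overlap enough for the images to be stitched into a single map overlay.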