DIANNA
Deep Insight And Neural Network Analysis (DIANNA) is the only Explainable AI (XAI) library for scientists that supports the Open Neural Network Exchange (ONNX), the de facto standard model format.
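As a rough illustration, the snippet below sketches how one might ask DIANNA to explain a prediction of an ONNX image classifier. The entry point dianna.explain_image and the "RISE" method name follow the library's documented pattern, but the model file, the input, and the exact keyword names are assumptions and may differ between DIANNA versions; consult the current API documentation.

    # Minimal sketch, not an official example: explain an ONNX image classifier.
    import numpy as np
    import dianna

    model_path = "classifier.onnx"                         # hypothetical ONNX model file
    image = np.random.rand(28, 28, 1).astype(np.float32)   # placeholder input image

    # Produce a per-pixel relevance map for class index 0 using the
    # perturbation-based RISE method. Keyword names are assumptions here.
    relevance = dianna.explain_image(
        model_path,      # DIANNA accepts an ONNX model path or a callable
        image,
        method="RISE",
        labels=[0],      # class indices to explain
    )
    print(relevance.shape)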
XAI4GEO
This project aims to develop explainable AI methods for Earth Observation, building on successful collaborations between the eScience Center and the Faculty ITC of the University of Twente.
UAVs (drones) are increasingly used for development purposes in developing countries. UAV imagery captures more detail, and therefore more objects with strong local relevance. Potential users in developing countries often come with the question: can I use AI to detect X?
In practice, the available training data is often limited to very general classification tasks (e.g. building footprints). Training data for objects such as local water points or evidence of erosion and landslides is not available, and collecting large, consolidated training sets for each such object is infeasible.
One possibility is to use Deep Learning algorithms that require only a few examples. Yet this should be coupled with an explanation, for instance a similar prototype from the training data and/or a textual explanation. Such explainable AI techniques have been developed for computer vision tasks, but they are still new to remote sensing applications, and this project aims to address that gap.
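To make the idea of a prototype-based explanation concrete, the sketch below shows one simple way such an explanation could be produced: the training example most similar to the query in an embedding space is returned as the prototype. The embed function here is a hypothetical placeholder (in practice it would be, e.g., the penultimate layer of a pretrained network); the data is random and only illustrates the mechanics.

    # Minimal sketch of a prototype-style explanation under the assumptions above.
    import numpy as np

    def embed(images: np.ndarray) -> np.ndarray:
        # Placeholder embedding: flatten the images. In practice this would be
        # a feature extractor such as a pretrained CNN backbone.
        return images.reshape(len(images), -1)

    def prototype_explanation(query: np.ndarray, train_images: np.ndarray):
        """Return the index of the training image most similar to the query."""
        q = embed(query[np.newaxis])[0]
        t = embed(train_images)
        # Cosine similarity between the query and every training example.
        sims = (t @ q) / (np.linalg.norm(t, axis=1) * np.linalg.norm(q) + 1e-12)
        return int(np.argmax(sims)), float(np.max(sims))

    # Usage: the most similar labelled example is shown to the user as the
    # prototype that the model's decision resembles most.
    train = np.random.rand(10, 64, 64, 3).astype(np.float32)
    query = np.random.rand(64, 64, 3).astype(np.float32)
    idx, sim = prototype_explanation(query, train)
    print(f"closest prototype: training image {idx} (similarity {sim:.2f})")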