The paper “Rewarding Explainability in Drug Repurposing with Knowledge Graphs”, authored by Susana Nunes (PhD student at LASIGE), Samy Badreddine (Sony AI), and Cátia Pesquita (integrated researcher at LASIGE), was presented at the 34th International Joint Conference on Artificial Intelligence (IJCAI 2025), a top-ranked venue (CORE A*).
The paper introduces REx, a reinforcement learning system designed to improve the explainability of AI-driven drug discovery. By rewarding both prediction accuracy and explanation quality, REx generates interpretable, biologically grounded insights that help validate AI predictions. It identifies explanatory paths in biomedical knowledge graphs that connect drugs with their potential disease targets, ensuring that these explanations are both faithful to the model’s predictions and relevant within established biomedical knowledge. In evaluations across multiple biomedical datasets, REx outperformed existing approaches while producing explanations that better align with scientific reasoning, contributing to more transparent and trustworthy AI models in healthcare research.
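To give a flavour of the general idea of scoring candidate explanatory paths by both faithfulness to a prediction and relevance to established biomedical knowledge, here is a minimal Python sketch. It is an illustrative assumption, not the paper’s actual reward design: the function names, weighting scheme, and toy scoring heuristics are hypothetical.

```python
# Illustrative sketch (not the authors' implementation): scoring a candidate
# explanatory path in a drug-disease knowledge graph with a reward that
# combines prediction faithfulness and biomedical relevance.
# All names, weights, and heuristics below are hypothetical.

from typing import List, Set, Tuple

Triple = Tuple[str, str, str]  # (head entity, relation, tail entity)


def faithfulness_score(path: List[Triple], predicted_score: float) -> float:
    """Hypothetical: how strongly the path supports the model's drug-disease prediction."""
    # Toy heuristic: shorter paths are treated as more faithful to the prediction.
    length_penalty = 1.0 / len(path)
    return predicted_score * length_penalty


def relevance_score(path: List[Triple], trusted_relations: Set[str]) -> float:
    """Hypothetical: fraction of path edges drawn from established biomedical relations."""
    hits = sum(1 for _, relation, _ in path if relation in trusted_relations)
    return hits / len(path)


def path_reward(path: List[Triple], predicted_score: float,
                trusted_relations: Set[str], alpha: float = 0.5) -> float:
    """Combined reward: weighted mix of faithfulness and biomedical relevance."""
    return (alpha * faithfulness_score(path, predicted_score)
            + (1 - alpha) * relevance_score(path, trusted_relations))


if __name__ == "__main__":
    # Toy path linking a drug to a disease through a shared protein target.
    path = [
        ("drug:metformin", "targets", "protein:AMPK"),
        ("protein:AMPK", "associated_with", "disease:type_2_diabetes"),
    ]
    trusted = {"targets", "associated_with"}
    print(path_reward(path, predicted_score=0.9, trusted_relations=trusted))
```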
IJCAI 2025 took place in Montreal, Canada, from August 16 to 22, 2025. The paper is available here.