Abstract
- Ambient Assisted Living (AAL) systems are human-centered and designed to prioritize the needs of elderly individuals, providing assistance in emergencies or unexpected situations. These systems rely on caregivers or designated individuals who can be alerted and provide the necessary help when needed. To ensure effective assistance, it is crucial for caregivers to understand the reasons behind alarm triggers and the nature of the danger. This is where an explainability module comes into play. In this paper, we introduce an explainability module that offers visual explanations for the fall detection module. Our framework generates anchor boxes with the K-means algorithm to optimize object detection and uses YOLOv8 for image inference. Additionally, we employ two well-known XAI (Explainable Artificial Intelligence) algorithms, LIME (Local Interpretable Model-agnostic Explanations) and Grad-CAM (Gradient-weighted Class Activation Mapping), to provide visual explanations. A schematic code sketch of how such a detector can be coupled to LIME is given below.
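The following is a minimal, illustrative sketch, not the authors' exact pipeline: it assumes the `ultralytics` and `lime` packages, hypothetical pretrained weights (`yolov8n.pt`), a hypothetical input frame (`frame.jpg`), and a simplified two-class reading of the detector ("fall" vs. "no fall") so that LIME can query it as a classifier. Grad-CAM would instead attach to a convolutional layer of the detector and is not shown here.

```python
# Sketch: wrap a YOLOv8 detector so LIME can produce a visual explanation
# for a single frame. All file names and the two-class mapping are assumptions.
import numpy as np
from PIL import Image
from ultralytics import YOLO
from lime import lime_image

model = YOLO("yolov8n.pt")  # hypothetical weights; a fall-detection model would be used instead

def fall_probabilities(images: np.ndarray) -> np.ndarray:
    """Map a batch of LIME-perturbed images to [P(no fall), P(fall)] scores."""
    probs = []
    for img in images:
        result = model(img, verbose=False)[0]
        # Assumption: take the highest detection confidence as the "fall" score.
        p_fall = float(result.boxes.conf.max()) if len(result.boxes) else 0.0
        probs.append([1.0 - p_fall, p_fall])
    return np.array(probs)

# Hypothetical input frame from the monitored scene.
image = np.asarray(Image.open("frame.jpg").convert("RGB"))

explainer = lime_image.LimeImageExplainer()
explanation = explainer.explain_instance(
    image,
    fall_probabilities,
    top_labels=1,
    num_samples=200,  # small for speed; more samples give more stable explanations
)
```

The resulting `explanation` highlights the superpixels that most influenced the detector's confidence, which is the kind of visual evidence a caregiver could inspect when an alarm is raised.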