How a Conversational Explainer Makes AI More Responsible
The use of AI in critical decision-making processes is rapidly increasing. Explaining model outcomes in a way every stakeholder can understand is important, but difficult: each stakeholder has unique information needs and a different level of experience. In this white paper, we explain how and why we developed an interactive chatbot.
Users can obtain the exact explanation they need by “asking the ML model” why it made a particular decision, in the same way they might ask a colleague. Let’s look at how a conversational explainer makes AI more responsible.
Conversational XAI in a nutshell
Receive personalized explanations tailored to your specific needs, showing only the information that matters to you
Interact through human dialogue that feels natural, even without a technical background
Practice effective human oversight by enabling stakeholders to make well-informed evaluations
Want to know more about our innovative Conversational XAI chatbot?
Download white paper
What is Conversational XAI?
AI is rapidly evolving and becoming an integral part of our lives. In fact, AI is already used in a wide range of critical decision-making processes across several domains. With this tremendous growth, however, comes the necessity to ensure that AI is used responsibly. As a result, there is rising interest in human-machine interaction, visible in the growing number of articles published on the topic. The upcoming EU AI Act is bolstering this development by mandating effective human oversight for high-risk AI systems.
These factors have led to the development of multiple explainability techniques that translate model logic into something stakeholders can understand. However, stakeholders come from diverse backgrounds and may struggle to interpret these explanations. It is hard to create a single explanation that fits every stakeholder's needs and requirements. In short, this calls for a human-centered approach.
Why Conversational XAI?
Decision-making is a diverse process that varies from domain to domain, team to team, and person to person. Understanding the decision maker's thought process, knowledge, and background ultimately helps deliver the information required to support the decision.
We developed a conversational XAI chatbot in collaboration with Tim Kleinloog (Deeploy), Nilay Aishwarya, and Ujwal Gadiraju (TU Delft) to address these concerns by giving stakeholders the ability to obtain diverse explanations through a human-like dialogue system, boosting their access to different explanations and improving important feedback loops.
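To make the idea concrete, a conversational explainer can be thought of as a routing layer that maps a stakeholder's free-text question to the most suitable explanation type for a given prediction. The sketch below is purely illustrative and is not Deeploy's actual implementation; all function names and the keyword-based routing are hypothetical simplifications of what a production dialogue system would do.

```python
# Illustrative sketch only: a minimal conversational explainer loop.
# Names (route_question, explain) and the keyword routing are hypothetical,
# not Deeploy's actual chatbot logic.

def route_question(question: str) -> str:
    """Map a free-text stakeholder question to an explanation type."""
    q = question.lower()
    if "why not" in q or "instead" in q:
        return "counterfactual"      # e.g. "Why was I rejected instead of approved?"
    if "important" in q or "influence" in q or "why" in q:
        return "feature_importance"  # e.g. "Why did the model decide this?"
    return "general"                 # fall back to a global model summary

def explain(question: str, attributions: dict[str, float]) -> str:
    """Answer a question about one prediction, given per-feature attributions."""
    kind = route_question(question)
    if kind == "feature_importance":
        # Pick the feature with the largest absolute attribution.
        top = max(attributions, key=lambda f: abs(attributions[f]))
        return f"The feature '{top}' had the largest influence on this decision."
    if kind == "counterfactual":
        return "Changing the most influential feature could flip the outcome."
    return "This is a general summary of how the model makes its predictions."

# Example dialogue turn with hypothetical attribution scores:
print(explain("Why was my loan denied?", {"income": -0.6, "age": 0.1}))
```

In a real system, the keyword matcher would be replaced by an intent classifier, and the attribution scores would come from an explainability technique attached to the deployed model; the point of the sketch is only the dialogue-to-explanation routing.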