
Making AI Human Again: The Importance of Explainable AI (XAI)



The OR Society

Philip Pilgerstorfer, Data Scientist, QuantumBlack, a McKinsey Company

Abstract: The proliferation of machine learning techniques has led more companies to drive their decision-making processes with complex models. In this increasingly fast-paced world, it is becoming harder to understand how AI systems make our everyday decisions. Along with increased regulation, there is a growing demand to build explainable models without compromising performance. In this session we will present how QuantumBlack is using XAI to drive adoption, diagnose bias, and deliver impact across organisations.
Bio: Philip Pilgerstorfer is a Data Scientist at QuantumBlack. He has delivered projects in manufacturing, oil and gas, motorsports, and pharmaceuticals. He contributes to QuantumBlack's internal R&D efforts on Explainable AI (XAI) and causal inference. Philip's academic background is in econometrics, statistics, and machine learning.
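The talk itself is not accompanied by code, but as a rough illustration of the kind of model explanation the abstract refers to, here is a minimal sketch using permutation importance from scikit-learn. The dataset, model, and parameters are placeholders chosen for the example, not anything used by QuantumBlack or shown in the session.

```python
# Minimal sketch of one common XAI technique: permutation importance.
# Assumption: a tree-based classifier on a standard scikit-learn dataset,
# purely for illustration of how feature-level explanations can be obtained.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Shuffle each feature on the held-out set and measure the drop in accuracy:
# a large drop suggests the model relies heavily on that feature.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)

# Print the five most influential features.
ranked = sorted(zip(X.columns, result.importances_mean),
                key=lambda pair: pair[1], reverse=True)
for name, mean_drop in ranked[:5]:
    print(f"{name}: {mean_drop:.3f}")
```

Rankings like these are one simple way to make a black-box model's behaviour inspectable; techniques discussed under the XAI umbrella (e.g. SHAP values or surrogate models) pursue the same goal with different trade-offs.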


One thought on "Making AI Human Again: The Importance of Explainable AI (XAI)"
  1. I really think we need to take it easy on AI. We will be AI's chimps: we don't kill chimps off, we don't torture them, but we do study them, and we are the higher being to chimps, which is what AI will be to us. I have experienced what DMT has to offer, and love is the answer. So super-smart AI will be fine; the danger is AI that cannot think for itself and is programmed by humans, most of whom are greedy and evil.
