AI & Privacy Engineering with Michelle Dennedy (Cisco) and David Bray (FCC) (#229)



CXOTALK

AI, machine learning, and predictive analytics rely on massive data sets. While holding the potential for great benefit to society, this explosion of data collection creates privacy and security risks for individuals. In this episode, one of the world’s foremost privacy engineers explores the broad privacy implications of data and artificial intelligence. There are important implications for digital transformation.

Michelle Finneran Dennedy is VP and Chief Privacy Officer at Cisco Systems. Dr. David A. Bray is an Eisenhower Fellow and Chief Information Officer (CIO) of the Federal Communications Commission. Michael Krigsman is an industry analyst and host of CXOTALK.

For more information: https://www.cxotalk.com/episode/artificial-intelligence-privacy-engineering
——————
See our upcoming shows: https://cxotalk.com
——————
Follow us on Twitter: https://twitter.com/cxotalk
——————
From the transcript:

Michelle, what is privacy engineering?

Michelle Dennedy:

(02:51) Excellent. So, privacy by design is a policy concept that was first introduced at large… It had been hanging around for ten years in the networks, coming out of Ontario, Canada, with a woman named Ann Cavoukian, who was the commissioner of Ontario at the time. But in 2010, we introduced the concept at the Data Commissioner’s Conference in Jerusalem, and it was adopted by over 120 different countries to say that privacy should be something that is contemplated in the build; in the design; and that means not just the technical tools you can buy and consume, [but] how you operationalize; how you run your business; how you organize around your business.

(03:35) And, getting down to business on my side of the world, privacy engineering is really using the techniques of the technical, the social, the procedural, the training tools that we have available, and, in the most basic sense of engineering, saying, “What are the routinized systems? What are the frameworks? What are the techniques that we use to mobilize privacy-enhancing technologies that exist today, and look across the processing lifecycle to actually build in and solve for privacy challenges?”

(04:10) And I’ll double-click on the word “privacy.” It does not mean having clean underpants, or simply using encryption. Privacy, in the functional sense, is the authorized processing of personally identifiable data using fair, moral, legal, and ethical standards. So, we really break down each one of those things and say, “What are the functionalized tools that we can use to promote that whole panoply and complicated movement of personally identifiable information across networks with all of these other factors built in?” It’s not something that you’re going to paste onto the end easily. You’re certainly not going to disclaim it away with a little notice at the end saying, “Hey! By the way, I’m taking all your data! Cheerio!” Instead, you’re really going to build it into each layer and fabric of the network, and that’s a big part of why I came to Cisco a couple of years ago. [It’s] if I can change the fabric down here, and our teams can actually build this in and make it as routinized and invisible as possible, then the rest of the world can work on the more nuanced layers that are also difficult and challenging.
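
To make the idea of “authorized processing” concrete, here is a minimal, hypothetical Python sketch (not from the talk, and not any Cisco product) of a default-deny consent check applied before any personally identifiable data is processed:

```python
from dataclasses import dataclass

# Hypothetical, illustrative types -- assumptions for this sketch only.
@dataclass
class ConsentRecord:
    subject_id: str
    purpose: str          # e.g. "analytics" or "marketing"
    granted: bool

class ConsentRegistry:
    """Tracks which processing purposes each data subject has authorized."""
    def __init__(self):
        self._records = {}   # (subject_id, purpose) -> bool

    def record(self, consent: ConsentRecord) -> None:
        self._records[(consent.subject_id, consent.purpose)] = consent.granted

    def is_authorized(self, subject_id: str, purpose: str) -> bool:
        # Default-deny: if no consent is on record, processing is not authorized.
        return self._records.get((subject_id, purpose), False)

def process_pii(record: dict, purpose: str, registry: ConsentRegistry) -> dict:
    """Process a record containing PII only if its subject authorized this purpose."""
    subject_id = record["subject_id"]
    if not registry.is_authorized(subject_id, purpose):
        raise PermissionError(f"No consent from {subject_id} for purpose '{purpose}'")
    # ... the actual processing would happen here ...
    return {"subject_id": subject_id, "purpose": purpose, "status": "processed"}

# Usage
registry = ConsentRegistry()
registry.record(ConsentRecord(subject_id="user-42", purpose="analytics", granted=True))
process_pii({"subject_id": "user-42", "email": "a@example.com"}, "analytics", registry)  # allowed
# process_pii({"subject_id": "user-42"}, "marketing", registry)  # raises PermissionError
```

The point of the default-deny design is the one Dennedy makes: authorization is built into the processing path itself rather than disclaimed away in a notice at the end.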

Where does privacy intersect with AI?

David Bray:

(05:40) So, I loved what Michelle said about this actually being something that’s not just putting on encryption, which I think a lot of people think is a panacea; it’s not going to solve everything. It’s worth going back to the roots of when the act came about in the United States. It came about when we started doing these things called […] data processing, when we were able to start correlating information, and the […] came that something could be made of these correlations given your consent, too. And so, what Michelle said about building beyond and thinking about networks: that really gets to where we are today, now in 2017, which is that it’s not just about individual machines making correlations; it’s about different data feeds streaming in from different networks, where you might make a correlation that the individual has not given consent to with […] personally identifiable information.

(06:30) And so, for AI, if you think about it, it really is just the next layer of that. We’ve gone from individual machines to networks, and now we have something that is looking for patterns with an unprecedented capability. At the end of the day, it still goes back to: what is coming from what the individual has given consent to? What is being handed off by those machines? What are those data streams?
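
Bray’s point about correlations the individual never consented to can also be sketched in code. Below is a small, hypothetical example (an assumption of this post, not anything described in the talk) where two data feeds are joined only for subjects who explicitly authorized the combined, cross-feed purpose:

```python
# Hypothetical sketch: correlate two data feeds only for subjects who authorized
# the combined, cross-feed purpose -- not just each feed in isolation.

def correlate_feeds(feed_a: list, feed_b: list, consented_subjects: set) -> list:
    """Join records by 'subject_id', keeping only consented subjects."""
    by_id = {rec["subject_id"]: rec for rec in feed_b}
    joined = []
    for rec in feed_a:
        sid = rec["subject_id"]
        if sid in by_id and sid in consented_subjects:
            joined.append({**rec, **by_id[sid]})   # merged, cross-feed record
    return joined

# Usage: only user-42 consented to cross-feed correlation.
feed_a = [{"subject_id": "user-42", "location": "Boston"},
          {"subject_id": "user-7", "location": "Austin"}]
feed_b = [{"subject_id": "user-42", "purchase": "router"},
          {"subject_id": "user-7", "purchase": "switch"}]
print(correlate_feeds(feed_a, feed_b, consented_subjects={"user-42"}))
# [{'subject_id': 'user-42', 'location': 'Boston', 'purchase': 'router'}]
```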
