Title: "Compressive Learning meets privacy"
Speaker: Vincent Schellekens
Location: "Shannon" Seminar Room, Place du Levant 3, Maxwell Building, 1st floor
Date / Time (duration): Wednesday 03/04/2019, 11h00 (~45')
Abstract: Compressive Learning (CL) is a framework where a target learning task (e.g., clustering or density fitting) is performed not on the whole dataset of signals but on a heavily compressed representation of it (called the sketch), enabling training with reduced time and memory resources. Because the sketch only keeps track of general tendencies of the dataset (i.e., generalized moments) while discarding individual data records, previous work argued that CL should protect the privacy of the users who contributed to the dataset, but without providing formal arguments to back up this claim. This work aims to formalize that observation.
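As a rough illustration of the idea (not the specific construction presented in the talk), a common CL sketch is an empirical average of complex exponential moments (random Fourier features): one short vector summarizes the whole dataset, and individual records are never stored. The frequency matrix `Omega` and the sizes below are arbitrary choices for the example.

```python
import numpy as np

def compute_sketch(X, Omega):
    """Sketch a dataset as an average of generalized moments.

    X: (n, d) array of n signals; Omega: (d, m) random frequency matrix.
    Returns one m-dimensional complex vector summarizing all of X.
    """
    # Each record contributes exp(i * Omega^T x); averaging over records
    # keeps only aggregate tendencies, discarding individual entries.
    return np.exp(1j * X @ Omega).mean(axis=0)

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))    # toy dataset: 1000 signals in 2 dimensions
Omega = rng.normal(size=(2, 10))  # 10 random frequencies
z = compute_sketch(X, Omega)
print(z.shape)  # (10,) -- constant-size summary, regardless of n
```

Note that the sketch size depends only on the number of frequencies, not on the number of records, which is what enables the reduced time and memory footprint.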
In this talk, I will first give an introduction to privacy for the signal processing community, with a focus on differential privacy: a strong guarantee that has gained a lot of popularity in the privacy community. Then, after a general overview of CL, I will present a strategy that exploits the compression properties of the sketch to provide formal privacy guarantees (i.e., differential privacy). The proposed method thus allows one to perform a learning task while at the same time 1) reducing the required computational resources and 2) protecting the privacy of the dataset contributors.
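To give a flavor of how a sketch can be released with differential privacy (a simplified textbook mechanism, not necessarily the refined strategy of the talk), one can add calibrated Gaussian noise to the sketch: since each record contributes a unit-modulus entry to an average over n records, the sketch has bounded sensitivity, and the standard Gaussian-mechanism calibration applies. All parameter values below are illustrative.

```python
import numpy as np

def private_sketch(z, n, eps, delta, rng):
    """Release an averaged moment sketch with (eps, delta)-differential
    privacy via the Gaussian mechanism (simplified illustration)."""
    m = z.shape[0]
    # Each record's contribution has modulus 1, so replacing one of the
    # n records moves each coordinate by at most 2/n in modulus; the L2
    # sensitivity of the m-dimensional sketch is thus at most 2*sqrt(m)/n.
    sensitivity = 2.0 * np.sqrt(m) / n
    # Standard Gaussian-mechanism noise scale for (eps, delta)-DP.
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    noise = rng.normal(scale=sigma, size=m) + 1j * rng.normal(scale=sigma, size=m)
    return z + noise

rng = np.random.default_rng(1)
# Toy sketch: average of 1000 unit-modulus complex exponentials, m = 5.
z = np.exp(1j * rng.normal(size=(1000, 5))).mean(axis=0)
z_priv = private_sketch(z, n=1000, eps=1.0, delta=1e-5, rng=rng)
print(z_priv.shape)  # (5,) -- same size as the clean sketch
```

The noise scale shrinks as 1/n, so for large datasets the privatized sketch stays close to the clean one, which is the intuition behind combining compression and privacy.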
Last updated May 06, 2019, at 09:21 AM