Circa to appear at NeurIPS '21

Our paper on stochastic ReLU functions for private inference will appear at Neural Information Processing Systems (NeurIPS) 2021! The work is led by EnSuRe alum Zahra Ghodsi, in collaboration with Brandon Reagen and Nandan Jha. The latency of many cryptographic private inference schemes is dominated by ReLUs. Circa introduces a new stochastic ReLU function that occasionally outputs incorrect values (for example, allowing small negative values to pass through to the output) but has up to 5x lower private inference latency. We show that state-of-the-art DNNs are robust to these occasional errors and incur only a small accuracy drop.

Figure: (a) Regular ReLU. (b), (c) Proposed Circa modifications.
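
To build intuition for how a ReLU can trade occasional errors for speed, here is a minimal NumPy sketch of the general idea: small negative inputs sometimes slip past the sign check. This is an illustration only, not the paper's cryptographic protocol, and the `threshold` and `flip_prob` parameters are hypothetical.

```python
import numpy as np

def stochastic_relu(x, threshold=0.05, flip_prob=0.1, rng=None):
    """Toy stochastic ReLU: the sign check occasionally 'misses' on
    small-magnitude negatives, letting them pass through unchanged.
    `threshold` and `flip_prob` are illustrative, not from the paper."""
    rng = np.random.default_rng() if rng is None else rng
    out = np.maximum(x, 0.0)                 # exact ReLU baseline
    small_neg = (x < 0) & (x > -threshold)   # candidates for errors
    flip = rng.random(x.shape) < flip_prob   # random sign-check misses
    return np.where(small_neg & flip, x, out)

x = np.array([-0.2, -0.01, 0.0, 0.3])
print(stochastic_relu(x))  # usually ReLU(x); occasionally -0.01 leaks through
```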

DeepReDuce to appear at ICML '21

DeepReDuce, joint work with Nandan Jha, Zahra Ghodsi, and Brandon Reagen, seeks to eliminate redundant or ineffectual ReLUs from a deep network to support private inference. Compared to the state of the art for private inference, DeepReDuce improves accuracy and reduces ReLU count by up to 3.5% and 3.5×, respectively.
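
As a rough illustration of what removing a ReLU from a network looks like, here is a hypothetical PyTorch sketch that swaps one ReLU for an identity. How DeepReDuce chooses which ReLUs to drop, and the retraining needed to recover accuracy, are the core of the method and are not shown here.

```python
import torch.nn as nn

# A small example network with two ReLUs.
model = nn.Sequential(
    nn.Linear(16, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 10),
)

def drop_relu(model, index):
    """Replace the ReLU at position `index` with Identity,
    eliminating its cost at (private) inference time."""
    assert isinstance(model[index], nn.ReLU)
    model[index] = nn.Identity()

# Suppose, hypothetically, the second ReLU were found redundant:
drop_relu(model, 3)
print(model)  # the second ReLU is now Identity; fine-tune to recover accuracy
```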

Kang defends his PhD work. Congrats, Dr. Liu!

Kang successfully defended his thesis on backdooring attacks and defenses for deep learning. Kang's thesis work resulted in the first defense in the literature against deep neural network backdoors, along with an investigation of security vulnerabilities inherent in ML for design automation and in deep learning-based privacy-preserving tools. Congrats, Dr. Liu! Kang has accepted a faculty position at Huazhong University in China; we're all excited to see what he does next!