Introducing Opacus: A High-speed Library For Training PyTorch Models With Differential Privacy
We are releasing Opacus, a new high-speed library for training PyTorch models with differential privacy (DP) that’s more scalable than existing state-of-the-art methods. Differential privacy is a mathematically rigorous framework for…
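The training approach behind such libraries is DP-SGD: clip each example's gradient to a fixed L2 norm, average, and add Gaussian noise calibrated to that clipping bound. As a rough illustration only (a minimal NumPy sketch of the mechanism, not Opacus's actual API; the function name and parameters here are hypothetical), one update step might look like:

```python
import numpy as np

def dp_sgd_step(per_example_grads, max_grad_norm, noise_multiplier, rng):
    """One DP-SGD update: clip each per-example gradient, average, add noise.

    Hypothetical helper for illustration; Opacus performs this inside its
    privacy engine rather than exposing a function like this.
    """
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale each gradient so its L2 norm is at most max_grad_norm.
        clipped.append(g * min(1.0, max_grad_norm / (norm + 1e-12)))
    mean_grad = np.mean(clipped, axis=0)
    # Gaussian noise proportional to the clipping bound yields the DP guarantee.
    noise = rng.normal(
        0.0,
        noise_multiplier * max_grad_norm / len(per_example_grads),
        size=mean_grad.shape,
    )
    return mean_grad + noise

rng = np.random.default_rng(0)
# Three per-example gradients of very different magnitudes.
grads = [rng.normal(size=4) * scale for scale in (0.5, 2.0, 10.0)]
noisy_grad = dp_sgd_step(grads, max_grad_norm=1.0, noise_multiplier=1.1, rng=rng)
```

Because every example's influence on the update is bounded by `max_grad_norm`, the added noise can mask any single example's contribution, which is the core of the differential privacy guarantee.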