- 27 Jun, 2022 (2 commits)
  - Rasoul Akhavan Mahdavi authored
  - Rasoul Akhavan Mahdavi authored
- 24 Jun, 2022 (4 commits)
  - Xinda Li authored
  - Rasoul Akhavan Mahdavi authored
  - Xinda Li authored
  - Xinda Li authored
- 13 Jun, 2022 (1 commit)
  - Rasoul Akhavan Mahdavi authored
- 26 May, 2022 (1 commit)
  - Rasoul Akhavan Mahdavi authored
- 18 May, 2022 (1 commit)
  - Rasoul Akhavan Mahdavi authored
- 17 May, 2022 (1 commit)
  - Rasoul Akhavan Mahdavi authored
- 16 May, 2022 (2 commits)
  - Rasoul Akhavan Mahdavi authored
  - Rasoul Akhavan Mahdavi authored
- 03 May, 2022 (2 commits)
- 02 May, 2022 (1 commit)
  - Diaa authored
- 28 Apr, 2022 (2 commits)
  - Rasoul Akhavan Mahdavi authored
- 22 Apr, 2022 (1 commit)
  - Brian Knott authored
    Summary: Fixes DPSMPC (Note: this ignores all push blocking failures!)
    Reviewed By: shree-gade
    Differential Revision: D35790682
    fbshipit-source-id: e8667ade54696843a887d80bc4497665208e862d
- 13 Apr, 2022 (1 commit)
  - Tarek Elgamal authored
    Summary: CrypTen uses four internal random number generators to produce random values of specific types. These generators are called "local", "global", "prev", and "next"; the names indicate the group of parties that share a seed for each generator. These seeds are initialized and coordinated by the crypten._setup_prng() function (__init__.py). For testing purposes, this diff allows users to set these seeds deterministically by exposing a crypten.seed() function that runs only in debug mode.
    Reviewed By: knottb
    Differential Revision: D35476423
    fbshipit-source-id: 3007a6a36c52f7463c0bf17f22f5ed07651d455f
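A minimal usage sketch of the deterministic seeding described above, assuming a debug-mode flag at `cfg.debug.debug_mode` and a single-integer signature for `crypten.seed()`; neither detail is confirmed by the summary.

```python
# Hypothetical sketch; the config flag name and the seed() signature are assumptions.
import crypten
from crypten.config import cfg

crypten.init()
cfg.debug.debug_mode = True  # per the summary, seeding only takes effect in debug mode

# Deterministically seed the "local", "global", "prev", and "next" generators
# so that generated shares (and therefore test outputs) are reproducible.
crypten.seed(42)

x = crypten.cryptensor([1.0, 2.0, 3.0])
print(x.get_plain_text())
```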
- 07 Apr, 2022 (1 commit)
  - Brian Knott authored
    Summary: Pull Request resolved: https://github.com/facebookresearch/CrypTen/pull/375
    Adds the following functionality to crypten.optim:
    - Adds `grad_threshold` kwarg to check for and eliminate gradient explosion
    - Adds missing `zero_grad` function to crypten Optimizers
    Reviewed By: shree-gade
    Differential Revision: D35366600
    fbshipit-source-id: 7ec15e5dc3d7f0bba0379ac906411fffc90b3a36
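A hedged sketch of how the two additions might be used together; the toy model, data, threshold value, and single training step are illustrative assumptions rather than CrypTen's own examples.

```python
# Illustrative sketch of the crypten.optim additions described above.
import crypten
import crypten.nn as cnn
import crypten.optim
import torch

crypten.init()

model = cnn.Linear(10, 1).encrypt()
model.train()
criterion = cnn.MSELoss()
# grad_threshold (value assumed here) guards against gradient explosion
optimizer = crypten.optim.SGD(model.parameters(), lr=0.1, grad_threshold=100)

x = crypten.cryptensor(torch.randn(8, 10))
y = crypten.cryptensor(torch.randn(8, 1))

optimizer.zero_grad()            # the newly added zero_grad
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
```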
- 29 Mar, 2022 (3 commits)
  - Brian Knott authored
    Summary: Pull Request resolved: https://github.com/facebookresearch/CrypTen/pull/373
    Removes pytest-runner from setup.py since it is deprecated (see https://pypi.org/project/pytest-runner/). This has caused issues for some OSS users.
    Reviewed By: yuansen23
    Differential Revision: D35153940
    fbshipit-source-id: b91be72e0c38edcd199af4ae2be269a9552cc84c
  - Brian Knott authored
    Summary: The test was failing because the loss forward pass was being compared on both parties. Only the feature_src party should have the correct plaintext value, since it holds the reference model.
    Reviewed By: yuansen23
    Differential Revision: D35186281
    fbshipit-source-id: 4fc4c188942a352c37fe9952edd0df6c494f923f
  - Brian Knott authored
    Summary: Using `//` for torch floor division is deprecated (and prints an annoying warning). There were 2 instances of this in replicated.py; this diff removes them.
    Reviewed By: yuansen23
    Differential Revision: D35188089
    fbshipit-source-id: e897a42345ab2292fa66c424106badf441e58f95
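For context, the non-deprecated replacement for tensor `//` is `torch.div` with an explicit `rounding_mode`; a small standalone example (not the replicated.py code itself):

```python
import torch

a = torch.tensor([7, -7])
b = torch.tensor([2, 2])

# At the time of this commit, `a // b` on tensors emitted a floor_divide
# deprecation warning; torch.div with rounding_mode is the recommended form.
floored = torch.div(a, b, rounding_mode="floor")    # tensor([ 3, -4])
truncated = torch.div(a, b, rounding_mode="trunc")  # tensor([ 3, -3])
print(floored, truncated)
```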
- 22 Mar, 2022 (1 commit)
  - Tarek Elgamal authored
    Summary: X-link: https://github.com/fairinternal/CrypTen/pull/249
    Pull Request resolved: https://github.com/facebookresearch/CrypTen/pull/368
    Updates Tutorial 5 to match the changes in the model modules produced when converting a model from PyTorch to CrypTen. These changes are related to a recent change in ONNX.
    Reviewed By: knottb
    Differential Revision: D35062376
    fbshipit-source-id: b63b68b57917c266039875668f93b862cc91b996
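For reference, a hedged sketch of the PyTorch-to-CrypTen conversion path the tutorial covers; the toy model and input shape are illustrative. The conversion goes through ONNX, which is why ONNX changes affected the resulting crypten.nn modules.

```python
import crypten
import crypten.nn as cnn
import torch

crypten.init()

torch_model = torch.nn.Sequential(torch.nn.Linear(10, 5), torch.nn.ReLU())
dummy_input = torch.empty(1, 10)

# from_pytorch exports the model to ONNX and rebuilds it as a crypten.nn graph.
crypten_model = cnn.from_pytorch(torch_model, dummy_input)
crypten_model.encrypt()
```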
- 21 Mar, 2022 (1 commit)
  - Brian Knott authored
    Summary:
    - Adds `RapporLoss` to `crypten.nn.loss`
    - Modifies `DPSplitModel` to enable application of RAPPOR loss
    - Improves testing for `DPSplitModel`
    - Adds `RapporLoss` to testing in `test_gradients.py`
    - Adds `RapporLoss` testing in `test_privacy_models.py`
    Reviewed By: yuansen23
    Differential Revision: D34826725
    fbshipit-source-id: 281dd01dc4e8222dec17ee969adf6aa57b5171fa
- 07 Mar, 2022 (1 commit)
  - Dmitry Vinnik authored
    Summary: Our mission at Meta Open Source is to empower communities through open source, and we believe that means building a welcoming and safe environment for all. As part of this work, we are adding this banner in support of Ukraine during this crisis.
    Pull Request resolved: https://github.com/facebookresearch/CrypTen/pull/363
    Reviewed By: marksibrahim
    Differential Revision: D34654547
    Pulled By: dmitryvinn-fb
    fbshipit-source-id: adcfdc014b47e0a95ca25ab8d2b82917053e625e
- 24 Feb, 2022 (2 commits)
  - Brian Knott authored
    Summary: Pull Request resolved: https://github.com/facebookresearch/CrypTen/pull/356
    X-link: https://github.com/fairinternal/CrypTen/pull/248
    Adds a `maxsize` kwarg to the `mpc.run_multiprocess` decorator. The decorator uses `multiprocessing.Queue()` to store return values from function evaluations. If the return values are too large, the `Queue` must be initialized with a `maxsize` argument, which specifies the memory allocated for the queue. See https://github.com/facebookresearch/CrypTen/issues/354
    Reviewed By: sayanghosh
    Differential Revision: D34422744
    fbshipit-source-id: 3cddbebcab70c399097d49be9d4a9606ddf75d28
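A hedged usage sketch of the new kwarg; the `maxsize` value and the size of the returned tensor are illustrative.

```python
import crypten
import crypten.mpc as mpc
import torch

@mpc.run_multiprocess(world_size=2, maxsize=10)  # maxsize kwarg added by this diff
def make_large_output():
    # Each party returns a large object; without an appropriately sized Queue
    # the return values could fail to transfer (see issue #354 linked above).
    x_enc = crypten.cryptensor(torch.randn(1000, 1000))
    return x_enc.get_plain_text()

if __name__ == "__main__":
    results = make_large_output()  # one return value per party, or None on failure
```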
  - Brian Knott authored
    Summary: Handles exceptions in the safe class whitelist to prevent crashes during deprecations in PyTorch.
    Reviewed By: shree-gade
    Differential Revision: D34430504
    fbshipit-source-id: 074f4292f4daef14297a95a293aea1dbc4261371
- 15 Feb, 2022 (1 commit)
  - Kurt Mohler authored
    Summary: Pull Request resolved: https://github.com/pytorch/pytorch/pull/72540
    Reviewed By: jbschlosser
    Differential Revision: D34216823
    Pulled By: bdhirsh
    fbshipit-source-id: 1bc9930ab582771ebf02308e035576cd1a0dbe47
- 31 Jan, 2022 (1 commit)
  - Brian Knott authored
    Summary: Fixed config error in example FBLearner workflows. This should stop Cogwheel issues.
    Reviewed By: yuansen23
    Differential Revision: D33747836
    fbshipit-source-id: 6e5696ddf99b34057b682dd04891e25b3ce5b6b7
- 21 Jan, 2022 (1 commit)
  - Brian Knott authored
    Summary: Adds OmegaConf dependency to CircleCI config. After testing, this should be the last component for CircleCI to pass. There may still be some flaky tests.
    Reviewed By: romovpa
    Differential Revision: D33711175
    fbshipit-source-id: f699290fd7ec482bd8cf77b56f964f67244bf66c
- 20 Jan, 2022 (1 commit)
  - Brian Knott authored
    Summary: Eliminates skipped tests since they create warnings in Phabricator. D31599001 attempts to fix this by passing sandcastle tests to remote execution, but the tests don't quite work. In the meantime, I am eliminating these tests to improve signals in Phabricator.
    Reviewed By: sayanghosh
    Differential Revision: D33096392
    fbshipit-source-id: 4a6aaa0ee11c5a2701c5e377c1d35cba1034cf60
- 13 Jan, 2022 (1 commit)
  - Brian Knott authored
    Summary: Fixes eval mode by only performing the plaintext forward pass during eval mode.
    Reviewed By: yuansen23
    Differential Revision: D33551477
    fbshipit-source-id: 908f39dce8b1863f962ecd84c072e4d037458710
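An illustrative (non-CrypTen) sketch of the guard pattern this fix describes: in eval mode, only the plaintext forward pass runs; the class and attribute names are hypothetical.

```python
import torch
import torch.nn as nn

class SplitModelSketch(nn.Module):
    """Hypothetical wrapper; not CrypTen's actual DPSplitModel implementation."""

    def __init__(self, local_model):
        super().__init__()
        self.local_model = local_model

    def forward(self, x):
        if not self.training:              # eval mode: plaintext forward only
            return self.local_model(x)
        return self._protected_forward(x)  # training mode: split/encrypted path

    def _protected_forward(self, x):
        # Placeholder for the encrypted / differentially private split forward.
        return self.local_model(x)

model = SplitModelSketch(nn.Linear(4, 2))
model.eval()
out = model(torch.randn(3, 4))
```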
- 15 Dec, 2021 (1 commit)
  - Brian Knott authored
    Summary: Pull Request resolved: https://github.com/fairinternal/CrypTen/pull/247
    Pull Request resolved: https://github.com/facebookresearch/CrypTen/pull/341
    Pulls from: https://github.com/facebookresearch/CrypTen/pull/335
    Fixes a few nits in `pycon-workshop-2020/training_across_parties.py`:
    - Applies reshape to model inputs if necessary
    - Replaces `crypten.{save/load}` with `crypten.{save/load}_from_party` where appropriate
    Reviewed By: marksibrahim
    Differential Revision: D32884598
    fbshipit-source-id: ef622a51f735028cb081774c8b34c0fdf171b1ba
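For reference, a hedged sketch of the party-aware save/load calls the script now uses; the file path and the `src` party id are illustrative.

```python
import crypten
import torch

crypten.init()

x = torch.randn(4, 3)
# Only the designated src party writes the file...
crypten.save_from_party(x, "/tmp/features.pt", src=0)
# ...and load_from_party reads it on that party and secret-shares it to the others.
x_enc = crypten.load_from_party("/tmp/features.pt", src=0)
print(x_enc.get_plain_text())
```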
- 07 Dec, 2021 (1 commit)
  - Brian Knott authored
    Summary: An edge case with size-1 outputs had a bug where the inverse broadcast would choose the wrong dimension to expand. This diff solves the issue and adds testing for this case.
    Reviewed By: yuansen23
    Differential Revision: D32920552
    fbshipit-source-id: ef3bb150251016d795d0fd62992e31181ddb779e
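For background, "inverse broadcast" here refers to reducing a gradient back to an operand's original shape by summing over the dimensions that broadcasting expanded; a standalone sketch in plain PyTorch (not CrypTen's internal code):

```python
import torch

def inverse_broadcast(grad, shape):
    """Sum `grad` down to `shape`, assuming `shape` broadcasts to grad.shape."""
    # Sum away leading dimensions that broadcasting added.
    while grad.dim() > len(shape):
        grad = grad.sum(dim=0)
    # Sum over dimensions that were expanded from size 1 -- the size-1 case the fix targets.
    for dim, size in enumerate(shape):
        if size == 1 and grad.size(dim) != 1:
            grad = grad.sum(dim=dim, keepdim=True)
    return grad

g = torch.ones(3, 4)
print(inverse_broadcast(g, (1, 4)).shape)  # torch.Size([1, 4])
```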
- 06 Dec, 2021 (1 commit)
  - Brian Knott authored
    Summary: Implements the noisy_dLdP version of DPSplitMPC. WIP: still need to implement the algorithm that computes noisy dLdP from dLdW. The rest seems to be implemented properly.
    Reviewed By: yuansen23
    Differential Revision: D32714613
    fbshipit-source-id: 9d247514a8278c6cb9a3b5069d31d0fbc3bc07da
- 01 Dec, 2021 (1 commit)
  - Brian Knott authored
    Summary: Pull Request resolved: https://github.com/facebookresearch/CrypTen/pull/331
    Pull Request resolved: https://github.com/fairinternal/CrypTen/pull/246
    Diff 2 for caching pre-processed tuples:
    - Implements save / load for the tuple cache
    - Adds save / load to the cache test
    https://github.com/facebookresearch/CrypTen/issues/175
    Reviewed By: yuansen23
    Differential Revision: D32623019
    fbshipit-source-id: e0a581f3805a2b66dad8170f22e0a6f950702a03
- 19 Nov, 2021 (1 commit)
  - Brian Knott authored
    Summary: Pull Request resolved: https://github.com/facebookresearch/CrypTen/pull/326
    Pull Request resolved: https://github.com/fairinternal/CrypTen/pull/245
    Creates a caching system for storing pre-processed tuples. In order to use it for offline tuple generation, the following is still required:
    - implement saving the tuple cache to file, and/or
    - asynchronous / parallelized tuple generation (i.e., in separate threads / processes)
    The implementation works for all tuple providers and implements an abstract class for generic tuple-generation protocols.
    Note: also addresses https://github.com/facebookresearch/CrypTen/issues/175
    Reviewed By: karthikprasad
    Differential Revision: D32283362
    fbshipit-source-id: d7c17fd113092ace9768c482a9bb4e255b06a348
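A conceptual sketch only (not CrypTen's actual tuple cache API), showing the general shape of an offline cache that a tuple provider fills ahead of time and the online phase consumes; the class and method names are hypothetical.

```python
from collections import defaultdict, deque

class TupleCacheSketch:
    """Hypothetical illustration of a pre-processed tuple cache."""

    def __init__(self):
        # Tuples are keyed by what the online phase will request,
        # e.g. (tuple_type, shapes, device).
        self._cache = defaultdict(deque)

    def push(self, key, value):
        self._cache[key].append(value)

    def pop(self, key):
        queue = self._cache[key]
        # Fall back to online generation (None here) when the cache is empty.
        return queue.popleft() if queue else None
```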
- 29 Oct, 2021 (2 commits)
  - Brian Knott authored
    Summary: Implements the RR (randomized response) variant of DP-Split-MPC by flipping labels on the `label_src` party side prior to encryption of the labels.
    Differential Revision: D31865247
    fbshipit-source-id: d8f1ceb85d1701fdf3246862b74613ac4c3f72af
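To illustrate the randomized-response idea in isolation (plain PyTorch, not the CrypTen implementation): each binary label is flipped independently with some probability before it leaves the `label_src` party; the flip probability below is an arbitrary example.

```python
import torch

def randomized_response(labels, flip_prob):
    """Flip each 0/1 label with probability `flip_prob` (illustrative only)."""
    flips = torch.bernoulli(torch.full_like(labels, flip_prob, dtype=torch.float))
    return (labels.float() - flips).abs()  # XOR of label bit and flip bit

labels = torch.tensor([0, 1, 1, 0])
noisy_labels = randomized_response(labels, flip_prob=0.1)
print(noisy_labels)
```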
  - Brian Knott authored
    Summary: Implements DP-Split-MPC algorithm.
    Differential Revision: D30402780
    fbshipit-source-id: beb61af20c72c03f337b7bc6e2693f4f5ff98a37