TF Encrypted is a framework for encrypted machine learning in TensorFlow. It looks and feels like TensorFlow, taking advantage of the ease-of-use of the Keras API while enabling training and prediction over encrypted data via secure multi-party computation and homomorphic encryption. TF Encrypted aims to make privacy-preserving machine learning readily available, without requiring expertise in cryptography, distributed systems, or high performance computing.
See below for more background material, explore the examples, or visit the documentation to learn more about how to use the library.
TF Encrypted is available as a package on PyPI supporting Python 3.5+ and TensorFlow 1.12.0+:
pip install tf-encrypted
Creating a conda environment to run TF Encrypted code can be done using:
conda create -n tfe python=3.6
conda activate tfe
conda install tensorflow notebook
pip install tf-encrypted
Alternatively, installing from source can be done using:
git clone https://github.com/tf-encrypted/tf-encrypted.git
cd tf-encrypted
pip install -e .
make build
The latter is useful on platforms for which the pip package has not yet been compiled, and is also needed for development. Note that this gives you a working basic installation, but a few more steps are required to match the performance and security of the version shipped in the pip package; see the installation instructions.
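Whichever installation route is used, a quick sanity check that the package imports correctly can be run from Python; the `__version__` attributes below are an assumption following the usual Python packaging convention:

import tensorflow as tf
import tf_encrypted as tfe

# print the versions picked up by this environment; `__version__` is
# assumed to be defined, as is conventional for Python packages
print(tf.__version__)
print(tfe.__version__)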
The following is an example of a simple matmul on encrypted data using TF Encrypted:
import tensorflow as tf
import tf_encrypted as tfe

@tfe.local_computation('input-provider')
def provide_input():
    # normal TensorFlow operations can be run locally
    # as part of defining a private input, in this
    # case on the machine of the input provider
    return tf.ones(shape=(5, 10))

# define inputs
w = tfe.define_private_variable(tf.ones(shape=(10, 10)))
x = provide_input()

# define computation
y = tfe.matmul(x, w)

with tfe.Session() as sess:
    # initialize variables
    sess.run(tfe.global_variables_initializer())
    # reveal result
    result = sess.run(y.reveal())
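Since both inputs in this example are all-ones tensors, the revealed result is easy to check by hand: each entry of the 5x10 product is the dot product of two length-10 vectors of ones, i.e. 10. For instance, appending the following after the session block prints the decrypted values:

# `result` holds the decrypted product; every entry of the
# 5x10 matrix equals 10 (up to fixed-point precision)
print(result)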
For more information, check out the documentation or the examples.
Roadmap:
- High-level APIs for combining privacy and machine learning. So far TF Encrypted has focused on its low-level interface, but it is time to figure out what it means for interfaces such as Keras when privacy enters the picture (a rough sketch of what this might look like follows this list).
- Tighter integration with TensorFlow. This includes aligning with the upcoming TensorFlow 2.0 as well as figuring out how TF Encrypted can work closely with related projects such as TF Privacy and TF Federated.
- Support for third-party libraries. While TF Encrypted has its own implementations of secure computation, there are other excellent libraries out there for both secure computation and homomorphic encryption. We want to bring these on board and provide a bridge from TensorFlow.
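As a rough illustration of the first roadmap item, an encrypted Keras-style model might be built along the following lines. This is only a sketch: `tfe.keras.Sequential`, `tfe.keras.layers.Dense`, and `tfe.keras.layers.Activation` are assumptions based on the encrypted Keras work referenced in the blog posts below, and exact names and signatures may differ between versions.

import tf_encrypted as tfe

# sketch of a Keras-style encrypted model; layer names and arguments
# mirror the standard Keras API and are assumptions about the
# tfe.keras namespace rather than a fixed interface
model = tfe.keras.Sequential()
model.add(tfe.keras.layers.Dense(1, batch_input_shape=[1, 10]))
model.add(tfe.keras.layers.Activation('sigmoid'))

Training and prediction would then proceed through the familiar Keras-style methods, with all sensitive values kept encrypted by the underlying secure computation protocol.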
Blog posts:
- Introducing TF Encrypted walks through a simple example showing two data owners jointly training a logistic regression model using TF Encrypted on a vertically split dataset (by Alibaba Gemini Lab)
- Federated Learning with Secure Aggregation in TensorFlow demonstrates using TF Encrypted for secure aggregation of federated learning in pure TensorFlow (by Justin Patriquin at Cape Privacy)
- Encrypted Deep Learning Training and Predictions with TF Encrypted Keras introduces and illustrates the first parts of our encrypted Keras interface (by Yann Dupis at Cape Privacy)
- Growing TF Encrypted outlines the roadmap and motivates TF Encrypted as a community project (by Morten Dahl)
- Experimenting with TF Encrypted walks through a simple example of turning an existing TensorFlow prediction model private (by Morten Dahl and Jason Mancuso at Cape Privacy)
- Secure Computations as Dataflow Programs describes the initial motivation and implementation (by Morten Dahl)
Papers:
- Privacy-Preserving Collaborative Machine Learning on Genomic Data using TensorFlow outlines the iDASH'19 winning solution built on TF Encrypted (by Cheng Hong, et al.)
- Crypto-Oriented Neural Architecture Design uses TF Encrypted to benchmark ML optimizations made to better support the encrypted domain (by Avital Shafran, Gil Segev, Shmuel Peleg, and Yedid Hoshen)
- Private Machine Learning in TensorFlow using Secure Computation further elaborates on the benefits of the approach, outlines the adaptation of a secure computation protocol, and reports on concrete performance numbers (by Morten Dahl, Jason Mancuso, Yann Dupis, et al.)
Presentations:
- Privacy-Preserving Machine Learning with TensorFlow, TF World 2019 (by Jason Mancuso and Yann Dupis at Cape Privacy); see also the slides
- Privacy-Preserving Machine Learning in TensorFlow with TF Encrypted, O'Reilly AI 2019 (by Morten Dahl at Cape Privacy); see also the slides
Other:
- Privacy Preserving Deep Learning – PySyft Versus TF Encrypted makes a quick comparison between PySyft and TF Encrypted, correctly hitting on our goal of being the encryption backend in PySyft for TensorFlow (by Exxact)
- Bridging Microsoft SEAL into TensorFlow takes a first step towards integrating the Microsoft SEAL homomorphic encryption library and describes some of the technical challenges involved (by Justin Patriquin at Cape Privacy)
TF Encrypted is an open source community project developed under the Apache 2 license and maintained by a set of core developers. We welcome contributions from all individuals and organizations, with further information available in our contribution guide. We invite any organizations interested in partnering with us to reach out via email.
Don't hesitate to send a pull request, open an issue, or ask for help! We use ZenHub to plan and track GitHub issues and pull requests.
We appreciate the efforts of all contributors who have helped make TF Encrypted what it is! Below is a small selection of these, generated by sourcerer.io from the most recent stats:
We are very grateful for the significant contributions made by the following organizations!
TF Encrypted is experimental software not currently intended for use in production environments. The focus is on building the underlying primitives and techniques, with some practical security issues postponed for a later stage. However, care is taken to ensure that none of these represent fundamental issues that cannot be fixed as needed.
- Elements of TensorFlow's networking subsystem do not appear to be sufficiently hardened against malicious users. Proxies or other means of access filtering may be sufficient to mitigate this.
Please open an issue, or send an email to [email protected].
Licensed under Apache License, Version 2.0 (see LICENSE or http://www.apache.org/licenses/LICENSE-2.0). Copyright as specified in NOTICE.