[Python] Update tensorflow doc (#25954)
* update tensorflow doc

* indent

* remove whitespace

* add input doc
riteshghorse authored Mar 25, 2023
1 parent 874b687 commit 0557c33
Showing 3 changed files with 31 additions and 17 deletions.
2 changes: 1 addition & 1 deletion CHANGES.md
@@ -67,7 +67,7 @@
* Schema'd PTransforms can now be directly applied to Beam dataframes just like PCollections.
(Note that when doing multiple operations, it may be more efficient to explicitly chain the operations
like `df | (Transform1 | Transform2 | ...)` to avoid excessive conversions.)
* The Go SDK adds new transforms periodic.Impulse and periodic.Sequence that extends support
for slowly updating side input patterns. ([#23106](https://github.com/apache/beam/issues/23106))

## Breaking Changes
23 changes: 15 additions & 8 deletions website/www/site/content/en/documentation/ml/about-ml.md
@@ -112,14 +112,21 @@ that illustrates running Scikit-learn models with Apache Beam.

#### TensorFlow

To use TensorFlow with the RunInference API, you need to do the following:

* Use `tfx_bsl` version 1.10.0 or later.
* Create a model handler using `tfx_bsl.public.beam.run_inference.CreateModelHandler()`.
* Use the model handler with the [`apache_beam.ml.inference.base.RunInference`](/releases/pydoc/current/apache_beam.ml.inference.base.html) transform.

See [this notebook](https://github.com/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_tensorflow.ipynb)
that illustrates running TensorFlow models with Apache Beam and tfx-bsl.
To use TensorFlow with the RunInference API, you have two options:

1. Use the built-in TensorFlow model handlers in the Apache Beam SDK: `TFModelHandlerNumpy` and `TFModelHandlerTensor`.
    * Choose the handler based on the input type of your model: use `TFModelHandlerNumpy` for `numpy` input and `TFModelHandlerTensor` for `tf.Tensor` input.
    * Use TensorFlow 2.7 or later.
    * Pass the path of the model to the TensorFlow `ModelHandler` by using `model_uri=<path_to_trained_model>`.
    * Alternatively, you can pass the path to the saved weights of the trained model and a function to build the model by using `create_model_fn=<function>`, and set `model_type=ModelType.SAVED_WEIGHTS`.

   See [this notebook](https://github.com/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_tensorflow_with_tensorflowhub.ipynb)
   that illustrates running TensorFlow models with the built-in model handlers.
2. Use `tfx_bsl`.
    * Use this approach if your model input is of type `tf.Example`.
    * Use `tfx_bsl` version 1.10.0 or later.
    * Create a model handler using `tfx_bsl.public.beam.run_inference.CreateModelHandler()`.
    * Use the model handler with the [`apache_beam.ml.inference.base.RunInference`](/releases/pydoc/current/apache_beam.ml.inference.base.html) transform.

   See [this notebook](https://github.com/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_tensorflow.ipynb)
   that illustrates running TensorFlow models with Apache Beam and tfx-bsl.

## Custom Inference

@@ -120,14 +120,21 @@ that illustrates running Scikit-learn models with Apache Beam.

### TensorFlow

To use TensorFlow with the RunInference API, you need to do the following:

* Use `tfx_bsl` version 1.10.0 or later.
* Create a model handler using `tfx_bsl.public.beam.run_inference.CreateModelHandler()`.
* Use the model handler with the [`apache_beam.ml.inference.base.RunInference`](/releases/pydoc/current/apache_beam.ml.inference.base.html) transform.

See [this notebook](https://github.com/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_tensorflow.ipynb)
that illustrates running TensorFlow models with Apache Beam and tfx-bsl.
To use TensorFlow with the RunInference API, you have two options:

1. Use the built-in TensorFlow model handlers in the Apache Beam SDK: `TFModelHandlerNumpy` and `TFModelHandlerTensor`.
    * Choose the handler based on the input type of your model: use `TFModelHandlerNumpy` for `numpy` input and `TFModelHandlerTensor` for `tf.Tensor` input.
    * Use TensorFlow 2.7 or later.
    * Pass the path of the model to the TensorFlow `ModelHandler` by using `model_uri=<path_to_trained_model>`.
    * Alternatively, you can pass the path to the saved weights of the trained model and a function to build the model by using `create_model_fn=<function>`, and set `model_type=ModelType.SAVED_WEIGHTS`.

   See [this notebook](https://github.com/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_tensorflow_with_tensorflowhub.ipynb)
   that illustrates running TensorFlow models with the built-in model handlers.
2. Use `tfx_bsl`.
    * Use this approach if your model input is of type `tf.Example`.
    * Use `tfx_bsl` version 1.10.0 or later.
    * Create a model handler using `tfx_bsl.public.beam.run_inference.CreateModelHandler()`.
    * Use the model handler with the [`apache_beam.ml.inference.base.RunInference`](/releases/pydoc/current/apache_beam.ml.inference.base.html) transform.

   See [this notebook](https://github.com/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_tensorflow.ipynb)
   that illustrates running TensorFlow models with Apache Beam and tfx-bsl.

## Use custom models

Expand Down
