Class TPUDistributionStrategy
Defined in tensorflow/contrib/tpu/python/tpu/keras_support.py.
The strategy to run a Keras model on a TPU.
__init__
__init__(
    tpu_cluster_resolver=None,
    using_single_core=False
)
Construct a TPUDistributionStrategy.
Args:
tpu_cluster_resolver: Any instance of TPUClusterResolver. If None, one is created with '' as the master address.
using_single_core: Bool. A debugging option that may be removed once the model replication functionality is mature enough. If False (the default), the system automatically finds the best configuration, in terms of the number of TPU cores, for model replication, typically using all available TPU cores. If set to True, model replication is forced onto a single core, i.e., no replication.
Raises:
Exception: No TPU found on the given worker.
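Example usage (a minimal sketch, assuming TensorFlow 1.x with tf.contrib available; the TPU_NAME environment variable and the toy model are illustrative, and the strategy is passed to the related keras_to_tpu_model helper from the same contrib module):

import os
import tensorflow as tf

# Build a small Keras model; keras_to_tpu_model expects a tf.train optimizer.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='softmax', input_shape=(784,)),
])
model.compile(optimizer=tf.train.GradientDescentOptimizer(0.01),
              loss='categorical_crossentropy')

# Resolve the TPU worker; '' falls back to the local master address.
resolver = tf.contrib.cluster_resolver.TPUClusterResolver(
    tpu=os.environ.get('TPU_NAME', ''))
strategy = tf.contrib.tpu.TPUDistributionStrategy(resolver)

# For debugging, replication can be disabled so the model runs on one core:
# strategy = tf.contrib.tpu.TPUDistributionStrategy(resolver, using_single_core=True)

tpu_model = tf.contrib.tpu.keras_to_tpu_model(model, strategy=strategy)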