
What’s new in TensorFlow 2.12 and Keras 2.12?


March 28, 2023 —


Posted by the TensorFlow & Keras teams

TensorFlow 2.12 and Keras 2.12 have been released! Highlights of this release include the new Keras model saving and exporting format, the keras.utils.FeatureSpace utility, SavedModel fingerprinting, Python 3.11 wheels for TensorFlow and many more.

TensorFlow Core

SavedModel Fingerprinting

Models saved with tf.saved_model.save now come with a fingerprint file containing hashes to uniquely identify the SavedModel. Multiple fingerprints are derived from the model content, allowing you to compare the structure, graph, signatures, and weights across models. Read more about fingerprinting in the RFC and check out the read_fingerprint API and Fingerprint class.
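For illustration, here is a minimal sketch of reading a fingerprint back. It assumes the read_fingerprint API lives under tf.saved_model.experimental in TF 2.12; the module, path, and printed fields are illustrative.

import tensorflow as tf

# Save a trivial module to produce a SavedModel with a fingerprint file.
module = tf.Module()
tf.saved_model.save(module, "/tmp/fingerprinted_model")

# read_fingerprint returns a Fingerprint object holding per-component hashes.
fingerprint = tf.saved_model.experimental.read_fingerprint("/tmp/fingerprinted_model")
print(fingerprint.saved_model_checksum)      # hash of the whole SavedModel
print(fingerprint.graph_def_program_hash)    # hash of the graph structure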

tf.function

tf.function now uses the Python inspect library to consistently mimic the decorated function’s signature. WYSIWYG: decorated and non-decorated behavior is identical, even for complex uses like wrapping (functools.wraps) and partial application (functools.partial).
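As a small sketch of what this means in practice, wrapping a partially-applied function behaves the same with and without the decorator (the function and values below are placeholders):

import functools
import tensorflow as tf

def scale(x, factor):
    return x * factor

# The tf.function mirrors the underlying Python callable's signature,
# so only `x` remains to be passed after partial application.
triple = tf.function(functools.partial(scale, factor=3.0))
print(triple(tf.constant(2.0)))  # tf.Tensor(6.0, shape=(), dtype=float32)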

We now detect incompatible tf.function input types (such as mismatched functools.wraps calls). Additionally, we have improved type constraining logic (input_signature) for better error messages and consistency (e.g. a function with no parameters now automatically has input_signature=[]).
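A brief sketch of constraining inputs with input_signature (the function below is illustrative):

import tensorflow as tf

# Calls that don't match the declared spec fail with a clearer error message.
@tf.function(input_signature=[tf.TensorSpec(shape=[None], dtype=tf.float32)])
def normalize(x):
    return x / tf.reduce_max(x)

print(normalize(tf.constant([1.0, 2.0, 4.0])))
# normalize(tf.constant([[1.0]]))  # rank mismatch -> raises an error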

Additionally, we have added experimental.extension_type.as_dict() to convert tf.experimental.ExtensionTypes to Python dicts.
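A minimal sketch of the new helper, assuming it is exposed as tf.experimental.extension_type.as_dict and using a toy ExtensionType:

import tensorflow as tf

# A small ExtensionType; as_dict converts an instance into a plain Python dict of its fields.
class MaskedTensor(tf.experimental.ExtensionType):
    values: tf.Tensor
    mask: tf.Tensor

mt = MaskedTensor(values=tf.constant([1, 2, 3]), mask=tf.constant([True, False, True]))
print(tf.experimental.extension_type.as_dict(mt))
# {'values': <tf.Tensor ...>, 'mask': <tf.Tensor ...>}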

New model format

The biggest new Keras feature in this release is the new set of model saving and export formats. We’ve completely reworked Keras saving and serialization to cleanly separate two key use cases:

1. Python saving & reloading. This is when you save a Keras model to re-instantiate it later in a Python runtime, exactly as it was. We achieve this with a new file format, called the “Keras v3” format (.keras). You can start using it by calling model.save(“your_model.keras”, save_format=”keras_v3″).
2. Model export for inference in a runtime that might not support Python at all (e.g. the TF Serving runtime). You can create a lightweight (single-file) export via model.export(“your_model”) – and reload it in TF Serving or Python via tf.saved_model.load(“your_model”). By default, this format only preserves a single serving endpoint, the forward pass of the model, available upon reload as .serve(). Further customization is available through the keras.export.ExportArchive class.
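A minimal sketch of the two paths; the model architecture and file names are placeholders:

import tensorflow as tf
from tensorflow import keras

model = keras.Sequential([keras.layers.Dense(1, input_shape=(4,))])

# 1. Python saving & reloading with the new Keras v3 format.
model.save("your_model.keras", save_format="keras_v3")
restored = keras.models.load_model("your_model.keras")
# If the model contains a trusted Python lambda, pass safe_mode=False when loading.

# 2. Lightweight export for inference-only runtimes such as TF Serving.
model.export("your_model")
reloaded = tf.saved_model.load("your_model")
print(reloaded.serve(tf.ones((1, 4))))  # the forward pass, exposed as .serve()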

In the 2.13 release, keras_v3 will become the default for all files with the .keras extension. The format supports non-numerical state such as vocabulary files and lookup tables, and it is easy to save custom layers with exotic state elements (such as a FIFOQueue). The format does not rely on loading arbitrary code through bytecode or pickling, so it is safe by default. This is a big advance for secure ML. Note that due to this safety-first mindset, Python lambdas are disallowed at loading time. If you want to use a lambda, and you trust the source of the model, you can pass safe_mode=False to the loading method.

The legacy formats (“h5” and “Keras SavedModel” format based on TF SavedModel) will stay supported in perpetuity. However, we recommend that you consider adopting the new Keras v3 format for richer Python-side model saving/reloading, and using export() for inference-optimized model export.

FeatureSpace

Another exciting feature is the introduction of the keras.utils.FeatureSpace utility. It enables one-step indexing and preprocessing of structured data – including feature hashing and feature crossing. See the feature space tutorial at https://keras.io/examples/structured_data/structured_data_classification_with_feature_space/ for more information.

Like all Keras APIs, FeatureSpace is built with progressive disclosure of complexity in mind, so it is fully customizable – you can even specify custom feature types that rely on your own preprocessing layers. For instance, if you want to create a feature that encodes a text paragraph, that’s just two lines of code:

from tensorflow.keras import layers, utils

# A custom preprocessing layer that encodes a text paragraph as TF-IDF vectors.
custom_layer = layers.TextVectorization(output_mode="tf_idf")

feature_space = utils.FeatureSpace(
    features={
        "text": utils.FeatureSpace.feature(
            preprocessor=custom_layer, dtype="string", output_mode="float"
        ),
    },
    output_mode="concat",
)
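From there, usage follows the workflow described in the tutorial: adapt the space on a dataset of raw feature dicts, then map it over the data to produce preprocessed features. A hedged sketch; the raw_ds dataset below is hypothetical:

import tensorflow as tf

# Hypothetical dataset of raw feature dicts matching the FeatureSpace above.
raw_ds = tf.data.Dataset.from_tensor_slices(
    {"text": ["a first paragraph", "a second paragraph"]}
)

feature_space.adapt(raw_ds)                   # learns the TF-IDF vocabulary
preprocessed_ds = raw_ds.map(feature_space)   # yields concatenated float features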

These are just the release highlights – there are many more Keras-related improvements included, so be sure to check out the release notes!

tf.data

Warm starting

tf.data has added support for warm-starting input processing. If warm_start=True (on tf.data.experimental.OptimizationOptions), tf.data will preemptively start background threads during iterator creation (instead of waiting for the first call to GetNext). This allows users to improve the latency of the initial GetNext call at the expense of higher memory usage.
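A minimal sketch, assuming warm_start is set through the experimental_optimization options in TF 2.12 (the pipeline itself is a placeholder):

import tensorflow as tf

options = tf.data.Options()
options.experimental_optimization.warm_start = True

dataset = tf.data.Dataset.range(1_000).map(lambda x: x * 2).with_options(options)
iterator = iter(dataset)  # background threads start here, not on the first next()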

Re-randomizing across epochs

tf.data added a new rerandomize_each_iteration argument to tf.data.Dataset.random(), to control whether the sequence of generated random numbers should be re-randomized every epoch, or not (the default behavior). If seed is set and rerandomize_each_iteration=True, random() will produce a different (deterministic) sequence of numbers every epoch. This can be useful when training over a relatively small number of input examples to ensure that the model doesn’t learn the sequence itself.
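A short sketch of the new argument (seed value and take() size are arbitrary):

import tensorflow as tf

# With a fixed seed, each pass over the dataset yields a different but
# still deterministic sequence of random numbers.
ds = tf.data.Dataset.random(seed=42, rerandomize_each_iteration=True).take(3)
print(list(ds.as_numpy_iterator()))  # first epoch
print(list(ds.as_numpy_iterator()))  # second epoch: different values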

Infra Updates

The protobuf Python runtime version was upgraded to 4.21.9. All protobuf *_pb2.py stubs are now generated with protoc 3.21.9. The minimal supported protobuf runtime version is 3.20.3.

We released Python 3.11 wheels for the TensorFlow packages with this release!

We removed Python 3.7 support from this version. Going forward, we will no longer release patches for Python 3.7.

 
