
v2.10.0-rc0

Pre-release
@rtg0795 released this 04 Aug 05:46
· 3 commits to 2.10 since this release

Release 2.10.0-rc0

Major Features and Improvements

  • New ByteSplitter, which tokenizes strings into bytes.
  • New tutorial: Fine-tune BERT with Orbit [will be added to tensorflow.org/text soon].
  • Fixed an issue where dynamic TF Lite tensors were not getting resized correctly.
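The new ByteSplitter splits each input string into its individual byte values, producing ragged output since strings differ in length. A minimal pure-Python sketch of that behavior (the real API is `tf_text.ByteSplitter().split()`, which returns a RaggedTensor of uint8 byte values; this standalone version is only an illustration):

```python
def byte_split(strings):
    """Split each UTF-8 string into its individual byte values,
    mimicking the ragged per-string output of a byte-level splitter."""
    return [list(s.encode("utf-8")) for s in strings]

# Each string becomes a row of byte values; rows may differ in length,
# and multi-byte UTF-8 characters expand to several bytes.
print(byte_split(["hi", "héllo"]))
# → [[104, 105], [104, 195, 169, 108, 108, 111]]
```

Byte-level splitting needs no vocabulary and can never hit an out-of-vocabulary token, which is why it is a common fallback tokenization strategy.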

Bug Fixes and Other Changes

  • Fix typo in the subwords_tokenizer guide for text.WordpieceTokenizer.
  • Fix prepare_tf_dep.sh for macOS.
  • Add cross-links to tensorflow_models.nlp API reference.
  • (Generated change) Update tf.Text versions and/or docs.
  • Update shape inference of kernel template for fast wordpiece and activate the op test.
  • Update configure.sh for Apple Silicon.
  • Export the Trimmer ABC so it is usable as tf_text.Trimmer.
  • Fix TensorFlow checkpoint and trackable imports.
  • Correct the tutorial explanation of the meaning of attention weights.
  • Modernize fine_tune_bert.
  • Lint and update the Fine-tuning a BERT model tutorial.
  • Use a pointer for pointer arithmetic instead of an iterator. Fixes C++17 compilation of regex_split on Windows.
  • Add install_bazel.sh script to make it easy to install the required version of Bazel. (#946)
  • Make install_bazel.sh script executable.
  • Prevent runtime errors caused by invalid regular expressions in regex_split and RegexSplitter.
  • Centralize tensorflow-models docs into a top-level docs/ directory.
  • Remove link to a non-existent section on tf.org.
  • Move fine_tune_bert guide.
  • Fix spelling mistakes in subwords_tokenizer.ipynb.
  • Fix a bug caused by passing an empty tensor into SentencepieceTokenizer's detokenize method.
  • Update the build for Sentencepiece; Darts was not properly declared as a dependency.
  • Improve the Sentencepiece build by adding a missing dependency: str_format.
  • Fix typos and lint the Neural machine translation with attention tutorial.
  • Fix external link formatting and lint the NMT with attention tutorial.
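The regex_split/RegexSplitter hardening above moves the failure for an invalid pattern to construction time instead of letting it surface later as a runtime error. A hedged pure-Python sketch of that validate-then-split idea (tf_text's actual implementation uses RE2 in C++; the names here are illustrative only):

```python
import re

def make_splitter(delim_pattern):
    """Compile the delimiter pattern up front so that an invalid
    regex is reported immediately, not during a later split call."""
    try:
        compiled = re.compile(delim_pattern)
    except re.error as exc:
        raise ValueError(
            f"invalid delimiter regex {delim_pattern!r}: {exc}"
        ) from None
    # Return a function that splits text on the validated delimiter,
    # dropping empty pieces produced by leading/trailing delimiters.
    return lambda text: [piece for piece in compiled.split(text) if piece]

split_on_space = make_splitter(r"\s+")
print(split_on_space("hello  world"))  # → ['hello', 'world']
```

Failing fast at construction keeps a bad pattern from silently riding along into a compiled graph or serving path, where the error would be much harder to trace.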

Thanks to our Contributors

This release contains contributions from many people at Google, as well as:

gadagashwini, mnahinkhan, Steve R. Sun, synandi