tf.keras.optimizers.schedules.ExponentialDecay
A LearningRateSchedule that uses an exponential decay schedule.
Inherits From: LearningRateSchedule
tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate,
    decay_steps,
    decay_rate,
    staircase=False,
    name='ExponentialDecay'
)
When training a model, it is often useful to lower the learning rate as
the training progresses. This schedule applies an exponential decay function
to an optimizer step, given a provided initial learning rate.
The schedule is a 1-arg callable that produces a decayed learning
rate when passed the current optimizer step. This can be useful for changing
the learning rate value across different invocations of optimizer functions.
It is computed as:
def decayed_learning_rate(step):
    return initial_learning_rate * decay_rate ** (step / decay_steps)
If the argument staircase is True, then step / decay_steps is an integer
division and the decayed learning rate follows a staircase function.
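The computation above can be checked with a small standalone sketch in plain Python (no TensorFlow required; the helper name decayed_lr and the hyperparameter values are illustrative, borrowed from the example later on this page):

```python
# Standalone sketch of the decay computation described above.
initial_learning_rate = 0.1
decay_rate = 0.96
decay_steps = 100_000

def decayed_lr(step, staircase=False):
    # staircase=True floors step / decay_steps, so the learning rate
    # only drops at whole multiples of decay_steps.
    exponent = step // decay_steps if staircase else step / decay_steps
    return initial_learning_rate * decay_rate ** exponent

print(decayed_lr(100_000))                 # one full decay period: ~0.096
print(decayed_lr(50_000))                  # smooth decay partway through a period
print(decayed_lr(50_000, staircase=True))  # staircase: still 0.1
```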
You can pass this schedule directly into a keras.optimizers.Optimizer
as the learning rate.
Example: When fitting a Keras model, decay every 100000 steps with a base
of 0.96:
initial_learning_rate = 0.1
lr_schedule = keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate,
    decay_steps=100000,
    decay_rate=0.96,
    staircase=True)

model.compile(optimizer=keras.optimizers.SGD(learning_rate=lr_schedule),
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

model.fit(data, labels, epochs=5)
The learning rate schedule is also serializable and deserializable using
keras.optimizers.schedules.serialize and keras.optimizers.schedules.deserialize.
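A minimal sketch of that serialization round trip, assuming TensorFlow is installed (variable names are illustrative):

```python
import tensorflow as tf

# Build a schedule, serialize it to a plain config structure,
# then rebuild an equivalent schedule from that structure.
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.1,
    decay_steps=100_000,
    decay_rate=0.96)

config = tf.keras.optimizers.schedules.serialize(lr_schedule)
restored = tf.keras.optimizers.schedules.deserialize(config)

# The restored schedule carries the same hyperparameters.
print(restored.get_config() == lr_schedule.get_config())
```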
Args:
    initial_learning_rate: A Python float. The initial learning rate.
    decay_steps: A Python integer. Must be positive. See the decay
        computation above.
    decay_rate: A Python float. The decay rate.
    staircase: Boolean. If True, decay the learning rate at discrete
        intervals.
    name: String. Optional name of the operation. Defaults to
        "ExponentialDecay".

Returns:
    A 1-arg callable learning rate schedule that takes the current optimizer
    step and outputs the decayed learning rate, a scalar tensor of the same
    type as initial_learning_rate.
Methods

from_config

@classmethod
from_config(
    config
)

Instantiates a LearningRateSchedule from its config.

Args:
    config: Output of get_config().

Returns:
    A LearningRateSchedule instance.
get_config

get_config()
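A minimal round trip through get_config and from_config might look like this (assuming TensorFlow is installed; the hyperparameter values are illustrative):

```python
import tensorflow as tf

schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.1,
    decay_steps=1_000,
    decay_rate=0.5)

# get_config() returns a plain dict of constructor arguments...
config = schedule.get_config()

# ...which from_config() uses to build an equivalent schedule.
rebuilt = tf.keras.optimizers.schedules.ExponentialDecay.from_config(config)
print(rebuilt.get_config() == config)
```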
__call__

__call__(
    step
)

Call self as a function.
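Calling the schedule directly shows the decay in action (assuming TensorFlow is installed; the values follow the decay computation earlier on this page):

```python
import tensorflow as tf

lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.1,
    decay_steps=100_000,
    decay_rate=0.96,
    staircase=True)

# Within the first interval the staircase schedule returns the initial
# rate; after decay_steps it has decayed once (0.1 * 0.96).
print(float(lr_schedule(0)))
print(float(lr_schedule(100_000)))
```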
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates. Some content is licensed under the numpy license.
Last updated 2024-06-07 UTC.