# Copyright 2015 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Built-in activation functions."""

from tensorflow.python.keras import backend
from tensorflow.python.keras.layers import advanced_activations
from tensorflow.python.keras.utils.generic_utils import deserialize_keras_object
from tensorflow.python.keras.utils.generic_utils import serialize_keras_object
from tensorflow.python.ops import math_ops
from tensorflow.python.ops import nn
from tensorflow.python.util import dispatch

# b/123041942
# In TF 2.x, if `tf.nn.softmax` is used as an activation function in Keras
# layers, it gets serialized as 'softmax_v2' instead of 'softmax' as the
# internal method name is returned in serialization. This results in errors in
# model exporting and loading as Keras can't find any activation function with
# the name of `softmax_v2`.
# This dict maps the activation function name from its v2 version to its
# canonical name.
_TF_ACTIVATIONS_V2 = {
    'softmax_v2': 'softmax',
}


def softmax(x, axis=-1):
  """Softmax converts a vector of values to a probability distribution.

  The elements of the output vector are in range (0, 1) and sum to 1.

  Each vector is handled independently. The `axis` argument sets which axis
  of the input the function is applied along.

  Softmax is often used as the activation for the last
  layer of a classification network because the result could be interpreted as
  a probability distribution.

  The softmax of each vector x is computed as
  `exp(x) / tf.reduce_sum(exp(x))`.

  The input values are the log-odds of the resulting probability.

  Args:
    x: Input tensor.
    axis: Integer, axis along which the softmax normalization is applied.

  Returns:
    Tensor, output of softmax transformation (all values are non-negative
      and sum to 1).

  Examples:

  **Example 1: standalone usage**

  >>> inputs = tf.random.normal(shape=(32, 10))
  >>> outputs = tf.keras.activations.softmax(inputs)
  >>> tf.reduce_sum(outputs[0, :])  # Each sample in the batch now sums to 1

  **Example 2: usage in a `Dense` layer**

  >>> layer = tf.keras.layers.Dense(32, activation=tf.keras.activations.softmax)
  """
  if x.shape.rank > 1:
    if isinstance(axis, int):
      output = nn.softmax(x, axis=axis)
    else:
      # nn.softmax does not support tuple axis.
      e = math_ops.exp(x - math_ops.reduce_max(x, axis=axis, keepdims=True))
      s = math_ops.reduce_sum(e, axis=axis, keepdims=True)
      output = e / s
  else:
    raise ValueError('Cannot apply softmax to a tensor that is 1D. '
                     'Received input: %s' % (x,))

  # Cache the logits to use for crossentropy loss.
  output._keras_logits = x  # pylint: disable=protected-access
  return output
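
# ---------------------------------------------------------------------------
# Illustrative sketch (added in this writeup; not part of the TensorFlow
# source). It shows what the tuple-axis fallback in `softmax` above computes,
# using only the public TF 2.x API. The helper name `_demo_tuple_axis_softmax`
# is invented here.
def _demo_tuple_axis_softmax():
  import tensorflow as tf
  x = tf.random.normal((2, 3, 4))
  # As in the `else` branch above: subtract the per-slice max for numerical
  # stability, exponentiate, then normalize over both trailing axes at once.
  e = tf.exp(x - tf.reduce_max(x, axis=(1, 2), keepdims=True))
  out = e / tf.reduce_sum(e, axis=(1, 2), keepdims=True)
  # Every (3, 4) slice of `out` now sums to 1.
  return tf.reduce_sum(out, axis=(1, 2))  # ~[1.0, 1.0]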

def elu(x, alpha=1.0):
  """Exponential Linear Unit.

  The exponential linear unit (ELU) with `alpha > 0` is:
  `x` if `x > 0` and
  `alpha * (exp(x) - 1)` if `x < 0`.

  Example Usage:

  >>> import tensorflow as tf
  >>> model = tf.keras.Sequential()
  >>> model.add(tf.keras.layers.Conv2D(32, (3, 3), activation='elu',
  ...           input_shape=(28, 28, 1)))

  Args:
    x: Input tensor.
    alpha: A scalar, slope of negative section. `alpha` controls the value to
      which an ELU saturates for negative net inputs.

  Returns:
    The exponential linear unit (ELU) activation function: `x` if `x > 0` and
    `alpha * (exp(x) - 1)` if `x < 0`.
  """
  return backend.elu(x, alpha)


def selu(x):
  """Scaled Exponential Linear Unit (SELU).

  The Scaled Exponential Linear Unit (SELU) activation function is defined as:

  - `if x > 0: return scale * x`
  - `if x < 0: return scale * alpha * (exp(x) - 1)`

  where `alpha` and `scale` are pre-defined constants
  (`alpha=1.67326324` and `scale=1.05070098`).

  Basically, the SELU activation function multiplies `scale` (> 1) with the
  output of the `tf.keras.activations.elu` function to ensure a slope larger
  than one for positive inputs.

  The values of `alpha` and `scale` are
  chosen so that the mean and variance of the inputs are preserved
  between two consecutive layers as long as the weights are initialized
  correctly (see the `tf.keras.initializers.LecunNormal` initializer)
  and the number of input units is "large enough"
  (see the reference paper for more information).
  """
  return nn.selu(x)
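
# ---------------------------------------------------------------------------
# Illustrative sketch (added in this writeup; not part of the TensorFlow
# source). It checks the closed-form ELU/SELU definitions quoted in the
# docstrings above against the built-in activations. Assumes TensorFlow 2.x;
# the helper name `_demo_elu_selu` is invented here.
def _demo_elu_selu():
  import tensorflow as tf
  x = tf.constant([-2.0, -0.5, 0.0, 0.5, 2.0])
  alpha, scale = 1.67326324, 1.05070098  # the SELU constants quoted above
  # ELU with the default alpha = 1: x for x > 0, exp(x) - 1 otherwise.
  elu_manual = tf.where(x > 0, x, tf.exp(x) - 1.0)
  # SELU is scale * elu(x, alpha) with the fixed constants above.
  selu_manual = scale * tf.where(x > 0, x, alpha * (tf.exp(x) - 1.0))
  # Both should match the built-in activations to float precision.
  tf.debugging.assert_near(elu_manual, tf.keras.activations.elu(x))
  tf.debugging.assert_near(selu_manual, tf.keras.activations.selu(x))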