## How to add a constant to all the initialized weights of a tensor in Tensorflow?

If I wanted to add a value, say epsilon, to all of my initialized weights, what would be the best way to approach this? Normally I do not define an initializer, which causes TensorFlow to default to `glorot_uniform_initializer()` (source: https://www.tensorflow.org/api_docs/python/tf/glorot_uniform_initializer).

Instead of having my values initialized in a uniform way centered around zero, I want them centered around some epsilon. That epsilon value would depend on the number of values in the initial weight matrix.

Do you suggest that I just change the "seed" parameter in the glorot_uniform_initializer? Or might there be a better way to do this?

The `seed` parameter is the random seed; that's not going to help you.

I'd suggest that once you run `sess.run(tf.global_variables_initializer())` or similar, that is, *after* the weights are initialized, you get a handle to the weight tensor of the layer in question and use `tf.assign_add()` to shift the weight values by a constant **ε**.
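The effect of "initialize, then shift by a constant" can be illustrated without TensorFlow. A minimal NumPy sketch, where the layer sizes and epsilon are invented for the example:

```python
import numpy as np

# Hypothetical layer sizes and shift, chosen only for illustration.
fan_in, fan_out = 300, 100
epsilon = 0.05

# Glorot/Xavier uniform draws from U(-limit, +limit) with
# limit = sqrt(6 / (fan_in + fan_out)).
limit = np.sqrt(6.0 / (fan_in + fan_out))
rng = np.random.default_rng(0)
weights = rng.uniform(-limit, limit, size=(fan_in, fan_out))

# The analogue of tf.assign_add(): add epsilon to every weight
# after initialization. The mean moves from ~0 to ~epsilon.
weights += epsilon
```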


You can use a customized initializer:

```python
def add_constant_initializer(base_initializer, epsilon):
    return lambda *a, **kw: base_initializer(*a, **kw) + epsilon

tf.get_variable(
    'MyVar', shape=(...), dtype=...,
    initializer=add_constant_initializer(tf.glorot_uniform_initializer(), epsilon))
```
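The wrapper pattern itself is framework-free: any callable that produces an array can be wrapped this way. A hedged stand-alone sketch in NumPy, where the base initializer and its range are stand-ins, not TensorFlow's:

```python
import numpy as np

def add_constant_initializer(base_initializer, epsilon):
    """Wrap an initializer so every value it produces is shifted by epsilon."""
    return lambda *a, **kw: base_initializer(*a, **kw) + epsilon

# A stand-in base initializer: plain uniform centered around zero.
def base(shape):
    return np.random.default_rng(1).uniform(-0.1, 0.1, size=shape)

init = add_constant_initializer(base, 0.05)
w = init((4, 3))  # each entry equals the base draw plus 0.05
```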


You can use the `random_uniform_initializer` and customize it to your needs. You just need to provide it the `minval` and `maxval` that you want. If you want it to use the same range as a Glorot uniform initializer shifted by a small `epsilon`, then set them to `minval = epsilon - sqrt(6 / (fan_in + fan_out))` and `maxval = epsilon + sqrt(6 / (fan_in + fan_out))`. See the documentation for details.
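In NumPy terms, that shifted range amounts to the following sketch (the fan sizes and epsilon are again made-up values for illustration):

```python
import numpy as np

fan_in, fan_out = 300, 100   # hypothetical layer sizes
epsilon = 0.05               # hypothetical shift

# Glorot-uniform limit, with both bounds shifted by epsilon.
limit = np.sqrt(6.0 / (fan_in + fan_out))
minval, maxval = epsilon - limit, epsilon + limit

rng = np.random.default_rng(42)
weights = rng.uniform(minval, maxval, size=(fan_in, fan_out))
# The distribution is still uniform, but centered around epsilon
# instead of zero -- the same spread as plain Glorot uniform.
```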
