Constant tensors fix particular values that you do not expect to change during training. They can also be used to initialize the values of trainable tensors (called variable tensors).
The most basic way to create a tensor is with the tf.constant() function. The following creates the vector [1, 2, 3] with floating-point values.
import tensorflow as tf

graph = tf.Graph()
with graph.as_default():
    tf_a = tf.constant([1, 2, 3], dtype=tf.float32)

with tf.Session(graph=graph) as sess:
    a = sess.run(tf_a)
    print(a)
[OUTPUT] [ 1. 2. 3.]
We can also pass a NumPy array as the first argument to create a TensorFlow constant holding that array's values.
import numpy as np
import tensorflow as tf

data = np.array([[1, 2, 3], [2, 1, 2], [3, 3, 3], [4, 2, 4]])
graph = tf.Graph()
with graph.as_default():
    tf_a = tf.constant(data, dtype=tf.float32)

with tf.Session(graph=graph) as sess:
    a = sess.run(tf_a)
    print(a)
[OUTPUT]
[[ 1.  2.  3.]
 [ 2.  1.  2.]
 [ 3.  3.  3.]
 [ 4.  2.  4.]]
We can also create a constant tensor by specifying only its shape and a single value to fill every element with.
graph = tf.Graph()
with graph.as_default():
    tf_a = tf.constant(0.5, shape=(3, 5), dtype=tf.float32)

with tf.Session(graph=graph) as sess:
    a = sess.run(tf_a)
    print(a)
[OUTPUT]
[[ 0.5  0.5  0.5  0.5  0.5]
 [ 0.5  0.5  0.5  0.5  0.5]
 [ 0.5  0.5  0.5  0.5  0.5]]
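For intuition, a shape-filled constant like the one above behaves like NumPy's np.full, which builds an array of a given shape with every element set to the same value. A quick sketch (NumPy only, no session needed; the shape and fill value mirror the TensorFlow example):

import numpy as np

# np.full builds the same (3, 5) array of 0.5 that
# tf.constant(0.5, shape=(3, 5)) produces as a tensor.
a = np.full((3, 5), 0.5, dtype=np.float32)
print(a.shape)   # (3, 5)
print(a[0, 0])   # 0.5
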
Random normal values, sampled from a normal distribution with the given standard deviation:
tf_a = tf.random_normal(shape=(3,3), stddev=0.1)
[OUTPUT]
[[-0.08344573  0.06120605  0.11113518]
 [-0.00953743 -0.06543638  0.07574533]
 [ 0.01894631 -0.13772231  0.05138101]]
Random normal values with truncation: tf.truncated_normal() discards and re-draws any sample that falls more than two standard deviations from the mean.
tf_a = tf.truncated_normal(shape=(3,3), stddev=0.1)
[OUTPUT]
[[-0.01948051  0.02700883  0.12265765]
 [ 0.09466425 -0.058675   -0.00613389]
 [ 0.0907902   0.09812441  0.04852368]]
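The difference from tf.random_normal() is that rejection step: samples outside two standard deviations are thrown away and re-drawn. A rough NumPy sketch of the idea (illustrative only, not TensorFlow's exact implementation; the function name here is our own):

import numpy as np

def truncated_normal(shape, stddev=0.1, mean=0.0, seed=0):
    """Sample a normal distribution, re-drawing any value that
    lands more than two standard deviations from the mean."""
    rng = np.random.default_rng(seed)
    out = rng.normal(mean, stddev, size=shape)
    bad = np.abs(out - mean) > 2 * stddev
    while bad.any():
        # Re-draw only the out-of-range entries.
        out[bad] = rng.normal(mean, stddev, size=bad.sum())
        bad = np.abs(out - mean) > 2 * stddev
    return out

a = truncated_normal((3, 3), stddev=0.1)
print(np.abs(a).max() <= 0.2)  # True: every value is within 2 stddevs

This bounds the initial values, which helps avoid the occasional extreme weight that a plain normal draw can produce.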
Random uniform values, drawn from the range [-2, 2):

tf_a = tf.random_uniform([3, 3], -2.0, 2.0)
[OUTPUT]
[[-1.20364857  1.17370081  1.78660917]
 [-0.50856924 -1.77878332  1.99628162]
 [-0.16094685  0.23817921  0.03258276]]
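For comparison, a NumPy analogue of the uniform draw above, with the same shape and bounds (the lower bound is inclusive, the upper bound exclusive):

import numpy as np

rng = np.random.default_rng(0)
a = rng.uniform(-2.0, 2.0, size=(3, 3))
print(a.shape)                                # (3, 3)
print((a >= -2.0).all() and (a < 2.0).all())  # True
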