Max Bartolo


In the context of Machine Learning, it is convenient to think of a tensor as an $n$-dimensional array. Tensor dimensionality is also commonly referred to as its order, degree or rank; formally, the rank is the sum of the tensor's contravariant and covariant indices.

Scalars are $0$th-order tensors. Vectors can be represented as $1$-dimensional arrays and are therefore $1$st-order tensors. In a fixed basis, a standard linear map that maps a vector to a vector is represented by a matrix (a $2$-dimensional array) and is therefore a $2$nd-order tensor. Note that not all multidimensional arrays are necessarily representations of tensors (see holors).
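The correspondence between tensor order and array dimensionality can be seen directly in NumPy, where the `ndim` attribute reports the number of dimensions (a minimal sketch with illustrative values):

```python
import numpy as np

# A scalar is a 0th-order tensor: a 0-dimensional array
s = np.array(3.14)

# A vector is a 1st-order tensor: a 1-dimensional array
v = np.array([1.0, 2.0, 3.0])

# A matrix is a 2nd-order tensor: a 2-dimensional array
M = np.array([[1.0, 2.0], [3.0, 4.0]])

print(s.ndim, v.ndim, M.ndim)  # 0 1 2
```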

For an $n$-dimensional array, elements are identified by $n$ coordinates. For example, a $3$-dimensional tensor $\mathbf{T}$ would have elements identified by $\mathbf{T}_{i, j, k}$.

Tensors are usually given uppercase variable names in a bold sans-serif font (note that the font used here isn’t sans-serif, but ideally it should be).

We can represent a 3-dimensional tensor as a NumPy array in Python:

```python
import numpy as np

# Create a 3-dimensional array filled with 1s
T = np.ones(shape=(5, 6, 7))

# We could also create a 3-dimensional array populated with random samples
# from a uniform distribution over [0, 1), i.e. the range between 0 and 1
# inclusive of 0 and exclusive of 1
T = np.random.random_sample(size=(5, 6, 7))

# np.random.rand is a convenience function for this, where instead of a tuple
# we pass the dimensions as separate integer arguments
T = np.random.rand(5, 6, 7)

print("Element at indices i={}, j={}, k={} is {:.3f}".format(2, 0, 5, T[2, 0, 5]))
print("T has shape {}".format(T.shape))
```

Output:

```
Element at indices i=2, j=0, k=5 is 0.348
T has shape (5, 6, 7)
```

Another useful function is np.random.randn(), which populates the array with samples drawn from the standard normal distribution. The standard normal distribution is the case with zero mean and unit standard deviation (therefore also unit variance), i.e. with parameters $\mu = 0, \sigma = 1$. Like any probability density, it is normalised: the integral of the density is $1$.
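As a quick sanity check, with a large number of samples the empirical mean and standard deviation of np.random.randn() output should be close to the distribution's parameters $\mu = 0$ and $\sigma = 1$ (the array shape and sample count below are arbitrary choices for illustration):

```python
import numpy as np

# Populate a 3-dimensional array with samples from the standard normal distribution.
# Note that, like np.random.rand, randn takes the dimensions as separate integers.
T = np.random.randn(5, 6, 7)

# With many samples, the empirical statistics approach mu = 0 and sigma = 1
samples = np.random.randn(1_000_000)
print(samples.mean())  # roughly 0
print(samples.std())   # roughly 1
```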

Source: https://github.com/maxbartolo/ml-index/blob/master/01-basic-math/01-linear-algebra/04-tensors.ipynb
