A tensor is a mathematical object that generalizes scalars, vectors, and matrices to higher dimensions. Think of it as a hierarchy: scalars are rank-0 tensors with no indices, vectors are rank-1 tensors with one index, matrices are rank-2 tensors with two indices, and higher-order tensors have three or more indices. The rank, or order, of a tensor is the number of indices needed to specify a single element.
Shape complements rank: while rank counts the indices, shape records the size along each dimension. A rank-3 tensor can be pictured as a 3D array of numbers, accessed with three indices, and the pattern continues upward. For example, a 3 by 3 matrix has shape (3, 3), while a 2 by 2 by 2 rank-3 tensor has shape (2, 2, 2).
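A minimal sketch in NumPy makes the correspondence concrete (here ndim plays the role of rank, and shape is the tuple of per-dimension sizes):

```python
import numpy as np

scalar = np.array(5.0)              # rank 0: no indices
vector = np.array([1.0, 2.0, 3.0])  # rank 1: one index
matrix = np.eye(3)                  # rank 2: two indices (3 by 3)
cube   = np.zeros((2, 2, 2))        # rank 3: three indices

for name, t in [("scalar", scalar), ("vector", vector),
                ("matrix", matrix), ("cube", cube)]:
    print(f"{name}: rank={t.ndim}, shape={t.shape}")
# scalar: rank=0, shape=()
# vector: rank=1, shape=(3,)
# matrix: rank=2, shape=(3, 3)
# cube:   rank=3, shape=(2, 2, 2)
```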
Tensor operations follow specific mathematical rules. Addition works element-wise between tensors of the same shape: each pair of corresponding elements is added. Scalar multiplication scales every element by the same factor. Tensor contraction sums over shared indices, generalizing matrix multiplication to higher dimensions. Broadcasting allows operations between tensors of different shapes by implicitly repeating the smaller tensor along missing or size-1 dimensions. In every case, the operation is valid only when the tensors' dimensions align according to these rules.
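The same rules in NumPy (a sketch; np.einsum is one common way to spell out a contraction, with the index string "ij,jk->ik" indicating a sum over the shared index j):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

elementwise_sum = A + B  # addition: same shape, element by element
scaled = 2.0 * A         # scalar multiplication: scales every element

# Contraction: sum over the shared index j, generalizing matrix multiplication.
C = np.einsum("ij,jk->ik", A, B)
assert np.allclose(C, A @ B)

# Broadcasting: the rank-1 tensor [10, 20] is implicitly repeated across
# the rows of A, as if it had shape (2, 2).
row = np.array([10.0, 20.0])
broadcast_sum = A + row
print(broadcast_sum)  # [[11. 22.]
                      #  [13. 24.]]
```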
The defining property of tensors is how they transform under coordinate system changes. When we rotate or transform our coordinate system, tensor components change according to a specific transformation law: one factor of the rotation (or, more generally, the Jacobian) matrix per index. The underlying geometric object, however, remains invariant, and this is what distinguishes true tensors from ordinary multi-dimensional arrays. For example, when we rotate a coordinate system by 45 degrees, a vector's components change, but its magnitude and direction in space remain the same. The transformation law ensures that physical quantities represented by tensors are independent of our choice of coordinate system.
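A quick numerical check of this invariance (a sketch: R is a 45-degree rotation matrix; the components change under the rotation, but the vector's length does not, and a rank-2 tensor picks up one factor of R per index):

```python
import numpy as np

theta = np.pi / 4  # 45-degree rotation
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([1.0, 0.0])
v_rot = R @ v  # rank 1: one factor of R

print(v_rot)  # [0.7071 0.7071] - the components have changed
assert np.isclose(np.linalg.norm(v), np.linalg.norm(v_rot))  # length is invariant

# Rank 2: one factor of R per index, T' = R T R^T.
T = np.diag([2.0, 1.0])
T_rot = R @ T @ R.T
assert np.isclose(np.trace(T), np.trace(T_rot))  # trace is also invariant
```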
Tensors have widespread applications across many fields. In physics, stress tensors describe forces acting on materials, with components representing different types of stress such as compression and shear, and the electromagnetic field tensor unifies electric and magnetic fields into a single mathematical object. In machine learning, tensors (in the multi-dimensional-array sense) represent network weights and high-dimensional data, and image processing relies on them for operations like convolution and filtering. In engineering, elasticity tensors describe how materials deform under stress, while fluid dynamics uses tensors to model complex flow patterns.