
Search results

  2. Tensor (machine learning) - Wikipedia

    en.wikipedia.org/wiki/Tensor_(machine_learning)

    In machine learning, the term tensor informally refers to two different concepts: (i) a way of organizing data and (ii) a multilinear (tensor) transformation. Data may be organized in a multidimensional array (M-way array), informally referred to as a "data tensor"; however, in the strict mathematical sense, a tensor is a multilinear mapping over a set of domain vector spaces to a range vector ...
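    A minimal sketch of the first, informal sense (the array contents here are arbitrary): a "data tensor" is just an M-way array, e.g. in NumPy:

```python
import numpy as np

# A "data tensor" in the informal machine-learning sense: a 3-way array
# holding, say, 2 grayscale images of 4x5 pixels (modes: image, row, column).
data = np.arange(2 * 4 * 5, dtype=float).reshape(2, 4, 5)

print(data.ndim)   # number of modes (ways) of the array -> 3
print(data.shape)  # -> (2, 4, 5)
```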

  3. Tensor software - Wikipedia

    en.wikipedia.org/wiki/Tensor_software

    Xerus [52] is a C++ tensor algebra library for tensors of arbitrary dimensions and tensor decomposition into general tensor networks (focusing on matrix product states). It offers Einstein-notation-like syntax and optimizes the contraction order of any network of tensors at runtime, so that dimensions need not be fixed at compile time.
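    Xerus itself is a C++ library; as a rough analogue only, NumPy's `einsum` accepts the same kind of Einstein-notation index strings and can likewise optimize the contraction order of a tensor network at runtime:

```python
import numpy as np

# Contract the three-tensor network A_ij B_jk C_ki into a scalar; with
# optimize=True, einsum searches for a cheap pairwise contraction order
# at runtime, for whatever dimensions the operands happen to have.
A = np.random.rand(3, 4)
B = np.random.rand(4, 5)
C = np.random.rand(5, 3)

scalar = np.einsum('ij,jk,ki->', A, B, C, optimize=True)

# Same value as the naive chained matrix product followed by a trace.
reference = np.trace(A @ B @ C)
print(np.isclose(scalar, reference))  # -> True
```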

  4. Tensor - Wikipedia

    en.wikipedia.org/wiki/Tensor

    The collection of tensors on a vector space and its dual forms a tensor algebra, which allows products of arbitrary tensors. Simple applications of tensors of order 2, which can be represented as a square matrix, can be solved by clever arrangement of transposed vectors and by applying the rules of matrix multiplication, but the tensor product ...
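    A small sketch of that point: an order-2 tensor represented as a square matrix lets a bilinear form reduce to transposed-vector/matrix products, while the tensor (outer) product of two vectors produces an order-2 object rather than a scalar. The matrix and vectors below are arbitrary illustrations:

```python
import numpy as np

# An order-2 tensor over R^2 represented as a square matrix; applying a
# bilinear form to vectors u, v reduces to u^T M v via matrix multiplication.
M = np.array([[1.0, 2.0],
              [3.0, 4.0]])
u = np.array([1.0, 0.0])
v = np.array([0.0, 1.0])

bilinear_value = u @ M @ v   # u^T M v -> picks out M[0, 1] = 2.0

# The tensor (outer) product of the two vectors, by contrast, yields an
# order-2 tensor, not a scalar.
outer = np.outer(u, v)       # shape (2, 2)
print(bilinear_value, outer.shape)
```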

  5. Multilinear algebra - Wikipedia

    en.wikipedia.org/wiki/Multilinear_algebra

    Multilinear algebra is the study of functions with multiple vector-valued arguments, with the functions being linear maps with respect to each argument. It involves concepts such as matrices, tensors, multivectors, systems of linear equations, higher-dimensional spaces, determinants, inner and outer products, and dual spaces.
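    The defining property (linearity in each argument separately) can be checked numerically; the coefficient array T below is arbitrary:

```python
import numpy as np

# A multilinear map is linear in each vector argument separately. Here
# f(u, v, w) = sum_ijk T_ijk u_i v_j w_k is trilinear: scaling any one
# argument scales the value by the same factor.
T = np.random.rand(2, 2, 2)

def f(u, v, w):
    return np.einsum('ijk,i,j,k->', T, u, v, w)

u, v, w = np.random.rand(2), np.random.rand(2), np.random.rand(2)
print(np.isclose(f(3.0 * u, v, w), 3.0 * f(u, v, w)))  # linear in 1st slot
```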

  6. Mixed tensor - Wikipedia

    en.wikipedia.org/wiki/Mixed_tensor

    As an example, a mixed tensor of type (1, 2) can be obtained by raising an index of a covariant tensor of type (0, 3): T_{αβ}^{λ} = g^{λγ} T_{αβγ}, where T_{αβ}^{λ} is the same tensor as T_{αβ}^{γ}, because T_{αβ}^{λ} δ_λ^{γ} = T_{αβ}^{γ}, with the Kronecker δ acting here like an identity matrix.
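    A numerical sketch of this index raising, assuming (for simplicity) a Euclidean metric, so that the inverse metric is exactly the Kronecker δ:

```python
import numpy as np

# Raising an index of a type-(0, 3) tensor with the inverse metric gives a
# type-(1, 2) tensor: T_{ab}^{l} = g^{lg} T_{abg}.  With a Euclidean metric,
# g^{lg} is the Kronecker delta, so the components are unchanged.
T = np.random.rand(3, 3, 3)   # components T_{abg}
g_inv = np.eye(3)             # Euclidean inverse metric = Kronecker delta

T_mixed = np.einsum('lg,abg->abl', g_inv, T)   # T_{ab}^{l}

print(np.allclose(T_mixed, T))  # -> True: delta acts like an identity matrix
```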

  7. Voigt notation - Wikipedia

    en.wikipedia.org/wiki/Voigt_notation

    In mathematics, Voigt notation or Voigt form in multilinear algebra is a way to represent a symmetric tensor by reducing its order. [1] There are a few variants and associated names for this idea: Mandel notation, Mandel–Voigt notation, and Nye notation are among those found.
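    A minimal sketch of this order reduction for a symmetric 3×3 tensor, assuming the common Voigt index ordering (11, 22, 33, 23, 13, 12):

```python
import numpy as np

# Voigt notation lists only the independent components of a symmetric
# order-2 tensor, reducing it to an order-1 object (a 6-vector).
S = np.array([[1.0, 6.0, 5.0],
              [6.0, 2.0, 4.0],
              [5.0, 4.0, 3.0]])   # symmetric 3x3 tensor

# Ordering assumed here: (11, 22, 33, 23, 13, 12).
voigt = np.array([S[0, 0], S[1, 1], S[2, 2], S[1, 2], S[0, 2], S[0, 1]])
print(voigt)  # -> [1. 2. 3. 4. 5. 6.]
```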

  8. Tensor contraction - Wikipedia

    en.wikipedia.org/wiki/Tensor_contraction

    In multilinear algebra, a tensor contraction is an operation on a tensor that arises from the canonical pairing of a vector space and its dual. In components, it is expressed as a sum of products of scalar components of the tensor(s), obtained by applying the summation convention to a pair of dummy indices that are bound to each other in an expression.
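    A minimal sketch: for a type-(1, 1) tensor (a matrix), contracting its single upper index against its lower index under the summation convention is just the trace:

```python
import numpy as np

# Contracting the two indices of a type-(1, 1) tensor A_i^j: bind them as
# a pair of dummy indices and sum, giving the scalar sum_i A_i^i (the trace).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

contraction = np.einsum('ii->', A)   # summation convention over bound indices
print(contraction)                   # -> 5.0, equal to np.trace(A)
```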

  9. Glossary of tensor theory - Wikipedia

    en.wikipedia.org/wiki/Glossary_of_tensor_theory

    This means that there is no need to distinguish covariant and contravariant components, and furthermore there is no need to distinguish tensors and tensor densities. All Cartesian-tensor indices are written as subscripts. Cartesian tensors achieve considerable computational simplification at the cost of generality and of some theoretical insight.