When.com Web Search

Search results

  2. Laplace's equation - Wikipedia

    en.wikipedia.org/wiki/Laplace's_equation

    In mathematics and physics, Laplace's equation is a second-order partial differential equation named after Pierre-Simon Laplace, who first studied its properties. This is often written as ∇²f = 0 or Δf = 0, where Δ = ∇·∇ = ∇² is the Laplace operator, [note 1] ∇· is the divergence operator (also symbolized "div"), ∇ is the gradient operator (also symbolized "grad"), and f(x, y, z) is a twice-differentiable real-valued function.
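
A quick numerical sketch of this definition (my own illustration, not from the article): a finite-difference approximation of the Laplacian should vanish on a known harmonic function such as f(x, y) = x² − y².

```python
def laplacian_2d(f, x, y, h=1e-4):
    """Five-point finite-difference approximation of ∂²f/∂x² + ∂²f/∂y²."""
    return (f(x + h, y) + f(x - h, y) + f(x, y + h) + f(x, y - h)
            - 4 * f(x, y)) / h**2

# f(x, y) = x² − y² is harmonic: its Laplacian is 2 − 2 = 0 everywhere,
# so the approximation should be numerically close to zero.
f = lambda x, y: x**2 - y**2
print(abs(laplacian_2d(f, 1.3, -0.7)) < 1e-5)  # True
```

The same five-point stencil is the standard building block for solving Laplace's equation on a grid.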

  3. Gradient - Wikipedia

    en.wikipedia.org/wiki/Gradient

    The gradient of the function f(x, y) = −(cos²x + cos²y)², depicted as a projected vector field on the bottom plane. The gradient (or gradient vector field) of a scalar function f(x₁, x₂, x₃, …, xₙ) is denoted ∇f or ∇⃗f, where ∇ denotes the vector differential operator, del.
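
As a sketch (my own example, not from the article), the gradient of the pictured function can be approximated one coordinate at a time with central differences:

```python
import math

def grad(f, point, h=1e-6):
    """Approximate ∇f at `point` by central differences in each coordinate."""
    g = []
    for i in range(len(point)):
        p_plus = list(point); p_plus[i] += h
        p_minus = list(point); p_minus[i] -= h
        g.append((f(*p_plus) - f(*p_minus)) / (2 * h))
    return g

# The function from the figure caption: f(x, y) = -(cos²x + cos²y)².
f = lambda x, y: -(math.cos(x)**2 + math.cos(y)**2)**2
gx, gy = grad(f, (0.5, 1.0))  # components of ∇f at (0.5, 1.0)
```

Analytically, ∂f/∂x = 2(cos²x + cos²y) sin 2x, which the numerical components match to several digits.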

  4. Green's function for the three-variable Laplace equation

    en.wikipedia.org/wiki/Green's_function_for_the...

    Examples of these can be seen to exist in rotational cylindrical coordinates as an integral Laplace transform in the difference of vertical heights whose kernel is given in terms of the order-zero Bessel function of the first kind as 1/|x − x′| = ∫₀^∞ J₀(k√(ρ² + ρ′² − 2ρρ′cos(φ − φ′))) e^(−k(z> − z<)) dk, where z> (z<) is the greater (lesser) of the variables z and z′.

  5. Laplace operator - Wikipedia

    en.wikipedia.org/wiki/Laplace_operator

    In spherical coordinates in N dimensions, with the parametrization x = rθ ∈ R^N with r representing a positive real radius and θ an element of the unit sphere S^(N−1), Δf = ∂²f/∂r² + ((N − 1)/r) ∂f/∂r + (1/r²) Δ_(S^(N−1)) f, where Δ_(S^(N−1)) is the Laplace–Beltrami operator on the (N − 1)-sphere, known as the spherical Laplacian.
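
A small sanity check of the radial part of this formula (my own example, not from the article): for a purely radial function such as f = r², the angular term vanishes, so Δf = ∂²f/∂r² + ((N − 1)/r) ∂f/∂r, which for N = 3 gives 2 + 2(N − 1) = 6.

```python
def radial_laplacian(f, r, N, h=1e-5):
    """Radial part of the N-dimensional spherical Laplacian, by finite differences."""
    d2 = (f(r + h) - 2 * f(r) + f(r - h)) / h**2  # ∂²f/∂r²
    d1 = (f(r + h) - f(r - h)) / (2 * h)          # ∂f/∂r
    return d2 + (N - 1) / r * d1

f = lambda r: r**2
print(radial_laplacian(f, 1.7, 3))  # ≈ 6.0, matching Δ(x² + y² + z²) = 6
```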

  6. Vector calculus identities - Wikipedia

    en.wikipedia.org/wiki/Vector_calculus_identities

    In Cartesian coordinates, the divergence of a continuously differentiable vector field F = Fₓ i + F_y j + F_z k is the scalar-valued function: div F = ∇ · F = (∂/∂x, ∂/∂y, ∂/∂z) · (Fₓ, F_y, F_z) = ∂Fₓ/∂x + ∂F_y/∂y + ∂F_z/∂z. As the name implies, the divergence is a (local) measure of the degree to which vectors in the field diverge.
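
The component-sum definition can be sketched numerically (my own illustration, not from the article), using central differences for each partial derivative:

```python
def divergence(F, x, y, z, h=1e-5):
    """div F = ∂Fx/∂x + ∂Fy/∂y + ∂Fz/∂z for F returning a tuple (Fx, Fy, Fz)."""
    dFx = (F(x + h, y, z)[0] - F(x - h, y, z)[0]) / (2 * h)
    dFy = (F(x, y + h, z)[1] - F(x, y - h, z)[1]) / (2 * h)
    dFz = (F(x, y, z + h)[2] - F(x, y, z - h)[2]) / (2 * h)
    return dFx + dFy + dFz

# The radial field F(x, y, z) = (x, y, z) has divergence 1 + 1 + 1 = 3 everywhere.
F = lambda x, y, z: (x, y, z)
print(divergence(F, 0.2, -1.0, 3.0))  # ≈ 3.0
```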

  7. Gradient theorem - Wikipedia

    en.wikipedia.org/wiki/Gradient_theorem

    The gradient theorem states that if the vector field F is the gradient of some scalar-valued function (i.e., if F is conservative), then F is a path-independent vector field (i.e., the integral of F over any piecewise-differentiable curve depends only on its end points). This theorem has a powerful converse: any path-independent vector field can be expressed as the gradient of a scalar-valued function.
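
Path independence can be checked numerically (my own example, not from the article): take f(x, y) = x²y, so F = ∇f = (2xy, x²), and integrate F · dr along two different curves sharing the same end points.

```python
def line_integral(F, path, n=5000):
    """∫ F · dr along path(t), t in [0, 1], by the midpoint rule."""
    total = 0.0
    dt = 1.0 / n
    for i in range(n):
        t = (i + 0.5) * dt
        x, y = path(t)
        # numerical tangent vector dr/dt
        x2, y2 = path(t + 1e-7)
        x1, y1 = path(t - 1e-7)
        dx, dy = (x2 - x1) / 2e-7, (y2 - y1) / 2e-7
        Fx, Fy = F(x, y)
        total += (Fx * dx + Fy * dy) * dt
    return total

# F = ∇f with f(x, y) = x²y; both paths run from (0, 0) to (1, 1).
F = lambda x, y: (2 * x * y, x * x)
straight = lambda t: (t, t)      # straight segment
curved = lambda t: (t, t**3)     # cubic detour
```

Both integrals come out ≈ f(1, 1) − f(0, 0) = 1, as the theorem predicts.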

  8. Slope - Wikipedia

    en.wikipedia.org/wiki/Slope

    Slope illustrated for y = (3/2)x − 1. Slope of a line in a coordinate system, from f(x) = −12x + 2 to f(x) = 12x + 2. The slope of a line in the plane containing the x and y axes is generally represented by the letter m, [5] and is defined as the change in the y coordinate divided by the corresponding change in the x coordinate, between two distinct points on the line.
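
The definition is a one-liner; as a sketch (my own example), two points on the captioned line y = (3/2)x − 1 recover its slope:

```python
def slope(p1, p2):
    """m = (y2 - y1) / (x2 - x1) for two distinct points with x1 != x2."""
    (x1, y1), (x2, y2) = p1, p2
    return (y2 - y1) / (x2 - x1)

# (0, -1) and (2, 2) both lie on y = (3/2)x - 1:
print(slope((0, -1), (2, 2)))  # 1.5
```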

  9. Gradient method - Wikipedia

    en.wikipedia.org/wiki/Gradient_method

    In optimization, a gradient method is an algorithm for solving problems of the form min_{x ∈ ℝⁿ} f(x), with the search directions defined by the gradient of the function at the current point.
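
The simplest such method, gradient descent with a fixed step size, can be sketched as follows (my own example function, not from the article):

```python
def gradient_descent(grad, x0, step=0.1, iters=200):
    """Minimize f by repeatedly stepping against its gradient: x ← x − step·∇f(x)."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Assumed example: f(x) = (x0 - 3)² + (x1 + 1)², with gradient
# ∇f = (2(x0 - 3), 2(x1 + 1)) and minimum at (3, -1).
grad_f = lambda x: [2 * (x[0] - 3), 2 * (x[1] + 1)]
print(gradient_descent(grad_f, [0.0, 0.0]))  # ≈ [3.0, -1.0]
```

In practice the step size is chosen by a line search or adapted per iteration, but the descent direction remains the (negative) gradient.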