Hamilton's principle states that the true evolution $\mathbf{q}(t)$ of a system described by $N$ generalized coordinates $\mathbf{q} = (q_1, q_2, \ldots, q_N)$ between two specified states $\mathbf{q}_1 = \mathbf{q}(t_1)$ and $\mathbf{q}_2 = \mathbf{q}(t_2)$ at two specified times $t_1$ and $t_2$ is a stationary point (a point where the variation is zero) of the action functional
$$\mathcal{S}[\mathbf{q}] = \int_{t_1}^{t_2} L\big(\mathbf{q}(t), \dot{\mathbf{q}}(t), t\big)\, dt,$$
where $L(\mathbf{q}, \dot{\mathbf{q}}, t)$ is the Lagrangian function for the system.
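A minimal numerical sketch of this stationarity, using an assumed example system (a unit-mass, unit-frequency harmonic oscillator, not taken from the text): the discretized action of the true trajectory changes only at second order when the path is perturbed with the endpoints held fixed.

import numpy as np

# Assumed example: L = (1/2) qdot^2 - (1/2) q^2, boundary values q(0) = 0, q(T) = 1.
T, n = 1.0, 2000
t = np.linspace(0.0, T, n + 1)
dt = t[1] - t[0]

def action(q):
    """Discretized action: sum of L(q_mid, qdot) * dt over the time steps."""
    qdot = np.diff(q) / dt
    qmid = 0.5 * (q[1:] + q[:-1])
    return np.sum(0.5 * qdot**2 - 0.5 * qmid**2) * dt

q_true = np.sin(t) / np.sin(T)      # exact solution satisfying the boundary values
bump = np.sin(np.pi * t / T)        # variation that vanishes at both endpoints

for eps in (-0.05, 0.0, 0.05):
    print(f"eps = {eps:+.2f}   S = {action(q_true + eps * bump):.6f}")
# S(+eps) and S(-eps) exceed S(0) by the same O(eps^2) amount, so the first-order
# variation of the action about the true path vanishes.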
The Hamilton–Jacobi equation reads
$$-\frac{\partial S}{\partial t} = H\!\left(\mathbf{q}, \frac{\partial S}{\partial \mathbf{q}}, t\right)$$
for a system of particles at coordinates $\mathbf{q}$. The function $H$ is the system's Hamiltonian, giving the system's energy. The solution of this equation is the action, $S$, called Hamilton's principal function.
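As a quick worked check (an assumed example, not part of the excerpt): for a single free particle of mass $m$ with Hamiltonian $H = p^2/2m$, Hamilton's principal function and its derivatives are
$$S(q, t; q_0, t_0) = \frac{m (q - q_0)^2}{2 (t - t_0)}, \qquad \frac{\partial S}{\partial q} = \frac{m (q - q_0)}{t - t_0} = p, \qquad -\frac{\partial S}{\partial t} = \frac{m (q - q_0)^2}{2 (t - t_0)^2} = \frac{p^2}{2m} = H,$$
so this $S$ indeed satisfies the Hamilton–Jacobi equation.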
Specifically, the more general form of Hamilton's equation reads
$$\frac{df}{dt} = \{f, H\} + \frac{\partial f}{\partial t},$$
where $f$ is some function of $p$ and $q$, and $H$ is the Hamiltonian. To find the rules for evaluating a Poisson bracket without resorting to differential equations, see Lie algebra; a Poisson bracket is the name for the Lie bracket in a Poisson algebra.
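A small symbolic sketch of this equation for an assumed one-degree-of-freedom Hamiltonian (the unit harmonic oscillator, not taken from the text), with the canonical bracket written out explicitly:

import sympy as sp

q, p = sp.symbols('q p')

def poisson(f, g):
    """Canonical Poisson bracket {f, g} = (df/dq)(dg/dp) - (df/dp)(dg/dq)."""
    return sp.diff(f, q) * sp.diff(g, p) - sp.diff(f, p) * sp.diff(g, q)

H = p**2 / 2 + q**2 / 2     # assumed Hamiltonian (unit mass and frequency)

# df/dt = {f, H} + (explicit time derivative), applied to f = q and f = p:
print(poisson(q, H))        # p   -> dq/dt =  dH/dp
print(poisson(p, H))        # -q  -> dp/dt = -dH/dq
print(poisson(H, H))        # 0   -> H has no explicit time dependence, so it is conserved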
A Hamiltonian system is a dynamical system governed by Hamilton's equations. In physics, this dynamical system describes the evolution of a physical system such as a planetary system or an electron in an electromagnetic field. These systems can be studied in both Hamiltonian mechanics and dynamical systems theory.
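A minimal sketch of such a system (an assumed pendulum, not one of the examples named above), integrated with the symplectic Euler method so that the computed orbit respects the Hamiltonian structure and the energy drift stays bounded:

import numpy as np

# Assumed Hamiltonian: H(q, p) = p**2 / 2 - cos(q)  (pendulum with unit constants).
def dH_dq(q): return np.sin(q)
def dH_dp(p): return p

q, p, dt = 1.0, 0.0, 1e-3
H0 = p**2 / 2 - np.cos(q)
for _ in range(100_000):      # Hamilton's equations: dq/dt = dH/dp, dp/dt = -dH/dq
    p -= dt * dH_dq(q)        # momentum update first (symplectic Euler)
    q += dt * dH_dp(p)
H1 = p**2 / 2 - np.cos(q)
print(f"energy drift after 100 time units: {abs(H1 - H0):.2e}")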
Hamilton's principal function $S = S(\mathbf{q}, t; \mathbf{q}_0, t_0)$ is obtained from the action functional by fixing the initial time $t_0$ and the initial endpoint $\mathbf{q}_0$, while allowing the upper time limit $t$ and the second endpoint $\mathbf{q}$ to vary. Hamilton's principal function satisfies the Hamilton–Jacobi equation, a formulation of classical mechanics.
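One way to see the connection (a standard endpoint-variation argument, summarized here rather than quoted from the excerpt): varying only the upper endpoint of the action gives
$$\frac{\partial S}{\partial q_i} = p_i, \qquad \frac{\partial S}{\partial t} = -H(\mathbf{q}, \mathbf{p}, t), \qquad\text{hence}\qquad \frac{\partial S}{\partial t} + H\!\left(\mathbf{q}, \frac{\partial S}{\partial \mathbf{q}}, t\right) = 0,$$
which is exactly the Hamilton–Jacobi equation.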
The generating function F for this transformation is of the third kind, $F = F_3(p, Q)$. To find $F$ explicitly, use the equation for its derivative from the table above, $P = -\frac{\partial F_3}{\partial Q}$, and substitute the expression for $P$ from the equation referenced earlier, expressed in terms of $p$ and $Q$:
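For reference, the standard relations for a third-kind generating function (a well-known table, restated here since the table itself is not part of the excerpt) are
$$q = -\frac{\partial F_3}{\partial p}, \qquad P = -\frac{\partial F_3}{\partial Q}, \qquad K = H + \frac{\partial F_3}{\partial t};$$
for example, $F_3(p, Q) = -pQ$ gives $q = Q$ and $P = p$, i.e. the identity transformation.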
Inspired by—but distinct from—the Hamiltonian of classical mechanics, the Hamiltonian of optimal control theory was developed by Lev Pontryagin as part of his maximum principle. [2] Pontryagin proved that a necessary condition for solving the optimal control problem is that the control should be chosen so as to optimize the Hamiltonian.
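In one common sign convention (summarized here for context, not quoted from the excerpt), the control Hamiltonian for dynamics $\dot{x} = f(x, u, t)$ and running cost $L(x, u, t)$ is
$$H(x, u, \lambda, t) = \lambda^{\mathsf{T}} f(x, u, t) + L(x, u, t), \qquad \dot{\lambda} = -\frac{\partial H}{\partial x},$$
and the maximum principle requires the optimal control $u^*(t)$ to extremize $H$ over the admissible control set at each instant, with maximization or minimization depending on the sign conventions chosen for $\lambda$ and $L$.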
In physics, Hamilton's principle states that the evolution of a system $\big(q_1(\sigma), \ldots, q_N(\sigma)\big)$ described by $N$ generalized coordinates between two specified states at two specified parameters $\sigma_A$ and $\sigma_B$ is a stationary point (a point where the variation is zero) of the action functional, or
$$\delta \mathcal{S} = \delta \int_{\sigma_A}^{\sigma_B} L\big(q_1(\sigma), \ldots, q_N(\sigma), \dot{q}_1(\sigma), \ldots, \dot{q}_N(\sigma), \sigma\big)\, d\sigma = 0,$$
where $\dot{q}_i = dq_i/d\sigma$ and $L$ is the Lagrangian.
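Requiring $\delta \mathcal{S} = 0$ for all variations that vanish at $\sigma_A$ and $\sigma_B$ yields the Euler–Lagrange equations (the standard consequence of the principle, added here to complete the statement):
$$\frac{d}{d\sigma}\!\left(\frac{\partial L}{\partial \dot{q}_i}\right) - \frac{\partial L}{\partial q_i} = 0, \qquad i = 1, \ldots, N.$$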