In mathematics, the discriminant of an algebraic number field is a numerical invariant that, loosely speaking, measures the size of the (ring of integers of the) algebraic number field. More specifically, it is proportional to the squared volume of the fundamental domain of the ring of integers, and it regulates which primes are ramified.
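As a small worked illustration (a standard fact about quadratic fields, added here and not part of the excerpt above): for $K = \mathbb{Q}(\sqrt{d})$ with $d$ square-free, the discriminant is
\[
d_K = \begin{cases} d & \text{if } d \equiv 1 \pmod 4,\\ 4d & \text{otherwise,}\end{cases}
\]
so $\mathbb{Q}(\sqrt{5})$ has discriminant $5$ while $\mathbb{Q}(\sqrt{2})$ has discriminant $8$, and the primes that ramify in $K$ are exactly the primes dividing $d_K$.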
Since the number of integral ideals of given norm is finite, the finiteness of the class number is an immediate consequence, [1] and further, the ideal class group is generated by the prime ideals of norm at most $M_K$. Minkowski's bound may be used to derive a lower bound for the discriminant of a field $K$ given $n$, $r_1$ and $r_2$.
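A sketch of how this is used in practice (a standard computation, assuming the usual form of the bound $M_K = \sqrt{|d_K|}\,(4/\pi)^{r_2}\,n!/n^n$): for $K = \mathbb{Q}(\sqrt{-5})$ one has $d_K = -20$, $n = 2$, $r_2 = 1$, so
\[
M_K = \sqrt{20}\cdot\frac{4}{\pi}\cdot\frac{2!}{2^2} \approx 2.85 .
\]
The class group is therefore generated by prime ideals of norm at most $2$; the ideal $(2, 1+\sqrt{-5})$ has norm $2$, is not principal, and its square is $(2)$, which gives class number $2$.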
The first class is the discriminant of an algebraic number field, which, in some cases including quadratic fields, is the discriminant of a polynomial defining the field. Discriminants of the second class arise for problems depending on coefficients, when degenerate instances or singularities of the problem are characterized by the vanishing of the discriminant.
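For instance (a standard example, included only as illustration): the discriminant of the quadratic polynomial $ax^2 + bx + c$ is $b^2 - 4ac$, and it vanishes exactly when the polynomial has a repeated root, so the degenerate instances of the root-finding problem are characterized by $b^2 - 4ac = 0$.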
Since the sign of the discriminant of a number field $K$ is $(-1)^{r_2}$, where $r_2$ is the number of conjugate pairs of complex embeddings of $K$ into $\mathbb{C}$, the discriminant of a cubic field will be positive precisely when the field is totally real, and negative if it is a complex cubic field.
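As a concrete check (standard examples, assumed here for illustration): the cubic field defined by $x^3 - x - 1$ has discriminant $-4(-1)^3 - 27(-1)^2 = -23 < 0$ and only one real embedding, so it is a complex cubic field, while $x^3 - 3x - 1$ has discriminant $108 - 27 = 81 > 0$ and defines a totally real cubic field.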
In algebraic number theory, the different ideal (sometimes simply the different) is defined to measure the (possible) lack of duality in the ring of integers of an algebraic number field K, with respect to the field trace. It then encodes the ramification data for prime ideals of the ring of integers. It was introduced by Richard Dedekind in 1882.
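A small illustrative case (standard, and assuming the usual description of the different of a monogenic ring of integers $\mathbb{Z}[\alpha]$ as the ideal generated by $f'(\alpha)$, where $f$ is the minimal polynomial of $\alpha$): for $K = \mathbb{Q}(\sqrt{d})$ with $d \equiv 2, 3 \pmod 4$, the ring of integers is $\mathbb{Z}[\sqrt{d}]$ and the different is $(2\sqrt{d})$, whose norm $4|d|$ equals the absolute value of the discriminant of $K$. In general the norm of the different is $|d_K|$, and a prime ideal ramifies exactly when it divides the different.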
For the real quadratic field $K = \mathbb{Q}(\sqrt{d})$ (with $d$ square-free), the fundamental unit $\varepsilon$ is commonly normalized so that $\varepsilon > 1$ (as a real number). Then it is uniquely characterized as the minimal unit among those that are greater than 1. If $\Delta$ denotes the discriminant of $K$, then the fundamental unit is
\[
\varepsilon = \frac{a + b\sqrt{\Delta}}{2},
\]
where $(a, b)$ is the smallest solution of $x^2 - \Delta y^2 = \pm 4$ in positive integers.
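Two small worked cases (standard facts, added for illustration): for $d = 5$ the discriminant is $\Delta = 5$, the smallest positive solution of $x^2 - 5y^2 = \pm 4$ is $(1, 1)$, and $\varepsilon = (1 + \sqrt{5})/2$, the golden ratio; for $d = 2$ one has $\Delta = 8$, the smallest solution is $(2, 1)$ with $4 - 8 = -4$, and $\varepsilon = (2 + \sqrt{8})/2 = 1 + \sqrt{2}$.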
In mathematics, especially in algebraic number theory, the Hermite–Minkowski theorem states that for any integer N there are only finitely many number fields, i.e., finite field extensions K of the rational numbers Q, such that the absolute value of the discriminant of K/Q is at most N. The theorem is named after Charles Hermite and Hermann Minkowski.
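A related standard fact that illustrates the role of the discriminant bound (assumed here, not stated in the excerpt above): Minkowski's lower bound forces $|d_K| > 1$ for every number field $K \neq \mathbb{Q}$, so $\mathbb{Q}$ is the only number field of discriminant $\pm 1$, and the smallest nontrivial absolute discriminant is $3$, attained by $\mathbb{Q}(\sqrt{-3})$.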
An algebraic number field (or simply number field) is a finite-degree field extension of the field of rational numbers. Here degree means the dimension of the field as a vector space over $\mathbb{Q}$.
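A minimal concrete example (standard, added for illustration): $\mathbb{Q}(\sqrt{2}) = \{a + b\sqrt{2} : a, b \in \mathbb{Q}\}$ is a number field of degree 2, with $\{1, \sqrt{2}\}$ as a basis over $\mathbb{Q}$; its ring of integers is $\mathbb{Z}[\sqrt{2}]$ and its discriminant is $8$.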