Technically, it should be called the principal square root of 2, to distinguish it from the negative number with the same property. Geometrically, the square root of 2 is the length of a diagonal across a square with sides of one unit of length; this follows from the Pythagorean theorem. It was probably the first number known to be irrational. [1]
In the case of two nested square roots, the following theorem completely solves the problem of denesting. [2] If a and c are rational numbers and c is not the square of a rational number, then there are two rational numbers x and y such that √(a + √c) = √x + √y if and only if a² − c is the square of a rational number d.
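The denesting theorem above is constructive: squaring √x + √y gives x + y = a and 4xy = c, so x and y are (a + d)/2 and (a − d)/2. A minimal sketch in Python (function name assumed), using exact rational arithmetic:

```python
from fractions import Fraction
from math import isqrt

def denest(a, c):
    """Attempt to denest sqrt(a + sqrt(c)) into sqrt(x) + sqrt(y).

    Returns (x, y) as Fractions when a^2 - c is the square of a
    rational d (taking x = (a + d)/2, y = (a - d)/2), else None.
    """
    a, c = Fraction(a), Fraction(c)
    disc = a * a - c
    if disc < 0:
        return None
    # disc is a rational square iff its numerator and denominator
    # (in lowest terms) are both perfect squares
    n, m = disc.numerator, disc.denominator
    rn, rm = isqrt(n), isqrt(m)
    if rn * rn != n or rm * rm != m:
        return None
    d = Fraction(rn, rm)
    return (a + d) / 2, (a - d) / 2
```

For example, denest(3, 8) returns (2, 1), expressing √(3 + √8) = √2 + 1, which checks out since (√2 + 1)² = 3 + 2√2 = 3 + √8.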
A method analogous to piecewise linear approximation, but using only arithmetic instead of algebraic equations, uses the multiplication tables in reverse: the square root of a number between 1 and 100 is between 1 and 10, so since 25 is a perfect square (5 × 5) and 36 is a perfect square (6 × 6), the square root of a number greater than or equal to 25 but less than 36 begins with a 5.
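This table-in-reverse lookup can be sketched directly (function name assumed); it finds the digit whose square brackets the input:

```python
def leading_sqrt_digit(n):
    """For an integer 1 <= n < 100, read the multiplication table in
    reverse: find the digit d with d*d <= n < (d+1)*(d+1).
    That digit is the first digit of sqrt(n)."""
    for d in range(1, 10):
        if d * d <= n < (d + 1) * (d + 1):
            return d
```

For instance, leading_sqrt_digit(30) is 5, since 25 ≤ 30 < 36.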
The square root of 2 was likely the first number proved irrational. [27] The golden ratio is another famous quadratic irrational number. The square roots of all natural numbers that are not perfect squares are irrational; a proof may be found in the article on quadratic irrationals.
Quadratic irrational numbers, irrational solutions of a quadratic polynomial ax² + bx + c with integer coefficients a, b, and c, are algebraic numbers. If the quadratic polynomial is monic (a = 1), the roots are further qualified as quadratic integers. Gaussian integers, complex numbers a + bi for which both a and b are integers, are also quadratic integers.
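With integer coefficients, the roots of ax² + bx + c are real quadratic irrationals exactly when the discriminant b² − 4ac is positive and not a perfect square (an integer has a rational square root only if it has an integer one). A small sketch of that test (function name assumed):

```python
from math import isqrt

def has_quadratic_irrational_roots(a, b, c):
    """For integers a, b, c with a != 0: True when the roots of
    a*x^2 + b*x + c are real quadratic irrationals, i.e. when the
    discriminant is positive and not a perfect square."""
    disc = b * b - 4 * a * c
    return disc > 0 and isqrt(disc) ** 2 != disc
```

For the golden ratio's polynomial x² − x − 1, the discriminant is 5, so the test returns True; for x² − 3x + 2 it is 1, a perfect square, so the roots (1 and 2) are rational.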
The square root of 2 was the first such number to be proved irrational. Theodorus of Cyrene proved the irrationality of the square roots of non-square natural numbers up to 17, but stopped there, probably because the algebra he used could not be applied to the square roots of numbers greater than 17. Euclid's Elements, Book 10, is dedicated to the classification of irrational magnitudes.
A more general proof shows that the mth root of an integer N is irrational, unless N is the mth power of an integer n. [7] That is, it is impossible to express the mth root of an integer N as the ratio a ⁄ b of two integers a and b, that share no common prime factor, except in cases in which b = 1.
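The criterion above (the mth root of N is rational iff N is a perfect mth power) is easy to test by computing the integer mth root and raising it back. A minimal sketch (function names assumed), using binary search so only integer arithmetic is needed:

```python
def integer_mth_root(N, m):
    """Floor of the m-th root of a nonnegative integer N, by binary search."""
    lo, hi = 0, max(N, 1)
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if mid ** m <= N:
            lo = mid
        else:
            hi = mid - 1
    return lo

def mth_root_is_rational(N, m):
    """Per the theorem: the m-th root of N is rational iff N is the
    m-th power of an integer."""
    r = integer_mth_root(N, m)
    return r ** m == N
```

So mth_root_is_rational(8, 3) is True (8 = 2³), while mth_root_is_rational(2, 2) is False, recovering the irrationality of √2.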
The set of rational numbers is not complete. For example, the sequence (1; 1.4; 1.41; 1.414; 1.4142; 1.41421; ...), where each term adds a digit of the decimal expansion of the positive square root of 2, is Cauchy but it does not converge to a rational number (in the real numbers, in contrast, it converges to the positive square root of 2).
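The terms of that sequence can be produced exactly: the k-digit truncation of √2 is floor(√2 · 10ᵏ) / 10ᵏ, a rational number. A sketch (function name assumed) using exact integer square roots:

```python
from fractions import Fraction
from math import isqrt

def sqrt2_truncated(k):
    """The k-digit decimal truncation of sqrt(2), as an exact rational:
    floor(sqrt(2) * 10^k) / 10^k."""
    scale = 10 ** k
    # isqrt(2 * scale^2) == floor(sqrt(2) * scale)
    return Fraction(isqrt(2 * scale * scale), scale)

terms = [sqrt2_truncated(k) for k in range(6)]
# 1, 7/5 (= 1.4), 141/100 (= 1.41), ...
# consecutive terms differ by less than 10**-k, so the sequence is
# Cauchy, yet its limit sqrt(2) is not a rational number
```

Every term is rational, and the differences shrink geometrically, which is exactly why the sequence is Cauchy without having a rational limit.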