When.com Web Search

Search results

  1. Residue (complex analysis) - Wikipedia

    en.wikipedia.org/wiki/Residue_(complex_analysis)

    This formula can be very useful in determining the residues for low-order poles. For higher-order poles, the calculations can become unmanageable, and series expansion is usually easier. For essential singularities, no such simple formula exists, and residues must usually be taken directly from series expansions.
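
    As a minimal illustration of the limit formulas this result refers to, the sketch below computes residues at a simple pole and at a double pole with SymPy; the example function f(z) = 1/(z(z-1)^2) and its poles are chosen here purely for illustration.

      # Residues at low-order poles via the limit formulas, cross-checked
      # against SymPy's built-in residue(); f is an arbitrary example function.
      import sympy as sp

      z = sp.symbols('z')
      f = 1 / (z * (z - 1)**2)

      # Simple pole at z = 0: Res = lim_{z->0} z * f(z)
      res_simple = sp.limit(z * f, z, 0)

      # Double pole at z = 1: Res = lim_{z->1} d/dz[(z - 1)**2 * f(z)]
      res_double = sp.limit(sp.diff((z - 1)**2 * f, z), z, 1)

      print(res_simple, sp.residue(f, z, 0))   # 1 1
      print(res_double, sp.residue(f, z, 1))   # -1 -1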

  2. Residue theorem - Wikipedia

    en.wikipedia.org/wiki/Residue_theorem

    In complex analysis, the residue theorem, sometimes called Cauchy's residue theorem, is a powerful tool to evaluate line integrals of analytic functions over closed curves; it can often be used to compute real integrals and infinite series as well.
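
    To make the "compute real integrals" use concrete, here is a small SymPy sketch evaluating a standard example with the residue theorem; the integrand 1/(1 + x^2) is an illustrative choice, not taken from the article.

      # Residue theorem on a textbook real integral: close the contour in the
      # upper half-plane, where 1/(1 + z**2) has its only enclosed pole at z = i.
      import sympy as sp

      x, z = sp.symbols('x z')

      res_at_i = sp.residue(1 / (1 + z**2), z, sp.I)
      via_residues = 2 * sp.pi * sp.I * res_at_i          # 2*pi*i times the enclosed residue

      direct = sp.integrate(1 / (1 + x**2), (x, -sp.oo, sp.oo))
      print(via_residues, direct)                         # pi pi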

  3. Method of mean weighted residuals - Wikipedia

    en.wikipedia.org/wiki/Method_of_mean_weighted...

    The method of mean weighted residuals solves $R(x, u, u_x, \ldots, u_{x^{(n)}}) = 0$ by imposing that the degrees of freedom are such that $(R(x, u, u_x, \ldots, u_{x^{(n)}}), w_i) = 0$ is satisfied, where the inner product $(f, g)$ is the standard function inner product with respect to some weighting function $w(x)$, which is usually determined by the basis function set or chosen arbitrarily according to whichever weighting function is most convenient.
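
    The sketch below applies this recipe (a Galerkin-type weighted residual) to a toy boundary-value problem; the ODE u'' + u + x = 0 with u(0) = u(1) = 0 and the single trial function x(1 - x) are illustrative assumptions, not taken from the article.

      # One-degree-of-freedom weighted-residual (Galerkin) solution of
      #   u'' + u + x = 0,  u(0) = u(1) = 0,
      # with trial/weighting function phi = x*(1 - x).
      import sympy as sp

      x, a = sp.symbols('x a')

      phi = x * (1 - x)                  # satisfies both boundary conditions
      u = a * phi                        # approximate solution, one degree of freedom
      R = sp.diff(u, x, 2) + u + x       # residual of the ODE

      # Impose the inner product (R, phi) = 0 on [0, 1] and solve for a
      a_val = sp.solve(sp.Eq(sp.integrate(R * phi, (x, 0, 1)), 0), a)[0]
      print(a_val)                       # 5/18

      u_exact = sp.sin(x) / sp.sin(1) - x
      mid = sp.Rational(1, 2)
      print(float((a_val * phi).subs(x, mid)), float(u_exact.subs(x, mid)))
      # ~0.0694 vs ~0.0697: already close with a single basis function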

  4. Residual (numerical analysis) - Wikipedia

    en.wikipedia.org/wiki/Residual_(numerical_analysis)

    When one does not know the exact solution, one may look for an approximation with a small residual. Residuals appear in many areas of mathematics, including iterative solvers such as the generalized minimal residual method, which seeks solutions to equations by systematically minimizing the residual.
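
    A small NumPy/SciPy sketch of that idea: measure an approximation x by its residual b - Ax, and let GMRES drive the residual norm down; the test system below is randomly generated for illustration.

      # The residual b - A @ x measures how far an approximation is from solving
      # A x = b; GMRES (scipy.sparse.linalg.gmres) reduces it iteratively.
      import numpy as np
      from scipy.sparse.linalg import gmres

      rng = np.random.default_rng(0)
      n = 50
      A = np.eye(n) + 0.1 * rng.standard_normal((n, n))   # well-conditioned test matrix
      b = rng.standard_normal(n)

      def residual_norm(x):
          return np.linalg.norm(b - A @ x)

      x_guess = b.copy()                     # crude approximation
      x_gmres, info = gmres(A, b)            # default tolerances
      x_direct = np.linalg.solve(A, b)

      print(residual_norm(x_guess))          # large
      print(info, residual_norm(x_gmres))    # 0 and small
      print(residual_norm(x_direct))         # near machine precision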

  5. Errors and residuals - Wikipedia

    en.wikipedia.org/wiki/Errors_and_residuals

    Thus to compare residuals at different inputs, one needs to adjust the residuals by the expected variability of residuals, which is called studentizing. This is particularly important in the case of detecting outliers, where the case in question is somehow different from the others in a dataset. For example, a large residual may be expected in ...
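
    A brief NumPy sketch of the studentizing step described here: divide each raw residual by an estimate of its own standard deviation, using the hat-matrix leverages; the synthetic data and the planted outlier are assumptions for illustration.

      # Internally studentized residuals for a simple linear fit; the data are
      # synthetic and one observation is deliberately made an outlier.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 30
      x = np.linspace(0, 10, n)
      y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=n)
      y[5] += 6.0                              # plant an outlier

      X = np.column_stack([np.ones(n), x])     # design matrix with intercept
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      resid = y - X @ beta

      leverage = np.diag(X @ np.linalg.inv(X.T @ X) @ X.T)   # hat-matrix diagonal
      s2 = resid @ resid / (n - X.shape[1])                   # residual variance estimate
      studentized = resid / np.sqrt(s2 * (1.0 - leverage))

      print(np.argmax(np.abs(studentized)))    # 5: the planted outlier stands out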

  6. Residual sum of squares - Wikipedia

    en.wikipedia.org/wiki/Residual_sum_of_squares

    The general regression model with n observations and k explanators, the first of which is a constant unit vector whose coefficient is the regression intercept, is $y = X\beta + e$, where y is an n × 1 vector of dependent variable observations, each column of the n × k matrix X is a vector of observations on one of the k explanators, $\beta$ is a k × 1 vector of true coefficients, and e is an n × 1 vector of the ...
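
    A short NumPy sketch of this setup: build X with a leading constant column, fit by least squares, and compute the residual sum of squares; the simulated data and coefficient values are illustrative assumptions.

      # The model y = X @ beta + e and its residual sum of squares (RSS);
      # the design matrix and true coefficients are simulated for illustration.
      import numpy as np

      rng = np.random.default_rng(2)
      n, k = 100, 3
      X = np.column_stack([np.ones(n),                        # constant unit vector
                           rng.standard_normal((n, k - 1))])  # remaining explanators
      beta_true = np.array([1.0, 2.0, -0.5])
      y = X @ beta_true + rng.normal(scale=0.3, size=n)

      beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
      rss = np.sum((y - X @ beta_hat) ** 2)                   # residual sum of squares
      print(beta_hat.round(2), rss)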

  7. Non-linear least squares - Wikipedia

    en.wikipedia.org/wiki/Non-linear_least_squares

    Consider a set of $m$ data points $(x_1, y_1), (x_2, y_2), \ldots, (x_m, y_m)$ and a curve (model function) $\hat{y} = f(x, \boldsymbol{\beta})$ that in addition to the variable $x$ also depends on $n$ parameters $\boldsymbol{\beta} = (\beta_1, \beta_2, \ldots, \beta_n)$, with $m \ge n$. It is desired to find the vector of parameters $\boldsymbol{\beta}$ such that the curve fits the given data best in the least squares sense, that is, the sum of squares $S = \sum_{i=1}^{m} r_i^2$ is minimized, where the residuals (in-sample prediction errors) $r_i$ are ...
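
    The sketch below minimizes exactly that sum of squares for a small nonlinear model with scipy.optimize.least_squares; the two-parameter exponential model and the synthetic data are assumptions chosen only to illustrate the setup.

      # Non-linear least squares: minimize S = sum(r_i**2) with
      # r_i = y_i - f(x_i, beta) for the example model f(x, beta) = b0 * exp(b1 * x).
      import numpy as np
      from scipy.optimize import least_squares

      rng = np.random.default_rng(3)
      x = np.linspace(0, 2, 40)
      beta_true = np.array([2.0, -1.3])
      y = beta_true[0] * np.exp(beta_true[1] * x) + rng.normal(scale=0.05, size=x.size)

      def residuals(beta):
          # r_i = y_i - f(x_i, beta), the in-sample prediction errors
          return y - beta[0] * np.exp(beta[1] * x)

      fit = least_squares(residuals, x0=np.array([1.0, -0.5]))
      print(fit.x.round(3))           # close to the true (2.0, -1.3)
      print(np.sum(fit.fun ** 2))     # the minimized sum of squares S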

  8. Partial residual plot - Wikipedia

    en.wikipedia.org/wiki/Partial_residual_plot

    Residuals = residuals from the full model, $\hat{\beta}_i$ = regression coefficient from the i-th independent variable in the full model, $X_i$ = the i-th independent variable. Partial residual plots are widely discussed in the regression diagnostics literature (e.g., see the References section below).
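
    To make these pieces concrete, the sketch below fits a full model, forms the partial residuals (full-model residuals plus the fitted coefficient times X_i), and plots them against X_i; the synthetic data and the use of Matplotlib are assumptions for illustration.

      # Partial residual plot for one explanator: residuals from the full model
      # plus beta_hat_i * X_i, plotted against X_i; the data are synthetic.
      import numpy as np
      import matplotlib.pyplot as plt

      rng = np.random.default_rng(4)
      n = 200
      x1 = rng.standard_normal(n)
      x2 = rng.standard_normal(n)
      y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(scale=0.5, size=n)

      X = np.column_stack([np.ones(n), x1, x2])       # full model with intercept
      beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
      resid = y - X @ beta_hat                        # residuals from the full model

      i = 1                                           # the x1 term
      partial_resid = resid + beta_hat[i] * X[:, i]   # Residuals + beta_hat_i * X_i

      plt.scatter(X[:, i], partial_resid, s=10)
      plt.xlabel("X_1")
      plt.ylabel("partial residual")
      plt.show()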