Numerical Differentiation

Approximating derivatives from data:

Suppose that a variable y depends on another variable x, that is, y = f(x), but we only know the values of f at a finite set of points, for example as data from an experiment or a simulation:

(x1, y1), (x2, y2), . . . , (xn, yn).

Suppose then that we need information about the derivative of f(x). One obvious idea is to approximate f′(xi) by the forward difference:

f′(xi) ≈ (yi+1 − yi)/(xi+1 − xi)

This formula follows directly from the definition of the derivative in calculus. An alternative is to use the backward difference:

f′(xi) ≈ (yi − yi−1)/(xi − xi−1)

Since the errors for the forward and backward differences tend to have opposite signs, it seems likely that averaging the two methods would give a better result than either alone. If the points are evenly spaced, that is, xi+1 − xi = xi − xi−1 = h, then averaging the forward and backward differences leads to a symmetric expression called the central difference:

f′(xi) ≈ (yi+1 − yi−1)/(2h)
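As an illustrative sketch (not part of the original notes), the three formulas can be applied to evenly spaced samples; the test function f(x) = sin(x), the step size, and the helper names below are all assumptions for the example:

```python
import math

def forward_diff(y, i, h):
    # f'(x_i) ~ (y_{i+1} - y_i) / h
    return (y[i + 1] - y[i]) / h

def backward_diff(y, i, h):
    # f'(x_i) ~ (y_i - y_{i-1}) / h
    return (y[i] - y[i - 1]) / h

def central_diff(y, i, h):
    # f'(x_i) ~ (y_{i+1} - y_{i-1}) / (2h)
    return (y[i + 1] - y[i - 1]) / (2 * h)

# Evenly spaced samples of an assumed test function f(x) = sin(x).
h = 0.1
x = [j * h for j in range(11)]
y = [math.sin(xj) for xj in x]

i = 5  # interior point x_i = 0.5; the true derivative is cos(0.5)
print(forward_diff(y, i, h), backward_diff(y, i, h), central_diff(y, i, h))
```

Even at this coarse spacing, the central difference is noticeably closer to cos(0.5) than either one-sided formula.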

Errors of approximation:

We can use Taylor polynomials to derive the accuracy of the forward, backward, and central difference formulas. For example, the usual form of the Taylor polynomial with remainder (sometimes called Taylor's Theorem) is:

f(x + h) = f(x) + h f′(x) + (h²/2) f′′(c)

where c is some (unknown) number between x and x + h. Letting x = xi, x + h = xi+1, and solving for f′(xi) leads to:

f′(xi) = (yi+1 − yi)/h − (h/2) f′′(c)

Notice that the quotient in this equation is exactly the forward difference formula. Thus the error of the forward difference is −(h/2) f′′(c), which means it is O(h). Replacing h by −h in the above computation gives the error for the backward difference formula; it is also O(h).

[Figure: the three difference approximations of y′i]

For the central difference, the error can be found from the third-degree Taylor polynomials with remainder:

f(xi+1) = f(xi) + h f′(xi) + (h²/2) f′′(xi) + (h³/6) f′′′(c1) and
f(xi−1) = f(xi) − h f′(xi) + (h²/2) f′′(xi) − (h³/6) f′′′(c2),

where xi ≤ c1 ≤ xi+1 and xi−1 ≤ c2 ≤ xi. Subtracting these two equations and solving for f′(xi) leads to:

f′(xi) = (yi+1 − yi−1)/(2h) − (h²/12)(f′′′(c1) + f′′′(c2))

This demonstrates that the error for the central difference formula is O(h²). Central differences are therefore significantly better, and it is best to use central differences whenever possible.
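A quick numerical check of these error orders can be sketched as follows; the test function f(x) = eˣ, the point x0 = 1, and the helper name diff_errors are assumptions for illustration. Halving h should roughly halve the forward-difference error but quarter the central-difference error:

```python
import math

def diff_errors(f, dfdx, x0, h):
    """Return (forward error, central error) at x0 for step size h."""
    fwd = (f(x0 + h) - f(x0)) / h
    cen = (f(x0 + h) - f(x0 - h)) / (2 * h)
    exact = dfdx(x0)
    return abs(fwd - exact), abs(cen - exact)

# Assumed test function: f(x) = exp(x), whose derivative is itself.
for h in (0.1, 0.05, 0.025):
    e_fwd, e_cen = diff_errors(math.exp, math.exp, 1.0, h)
    print(f"h={h:<6} forward error={e_fwd:.2e}  central error={e_cen:.2e}")
```

The printed errors shrink by a factor of about 2 for the forward difference and about 4 for the central difference each time h is halved, consistent with O(h) versus O(h²).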

There are also central difference formulas for higher-order derivatives. These each have error of order O(h²):

f′′(xi) ≈ (yi+1 − 2yi + yi−1)/h²,
f′′′(xi) ≈ (yi+2 − 2yi+1 + 2yi−1 − yi−2)/(2h³), and
f′′′′(xi) ≈ (yi+2 − 4yi+1 + 6yi − 4yi−1 + yi−2)/h⁴.
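The second- and fourth-derivative formulas can be sketched in code as well; the test function f(x) = cos(x) and the helper names are assumed for illustration:

```python
import math

def central_second_diff(f, x0, h):
    # f''(x_0) ~ (f(x0+h) - 2 f(x0) + f(x0-h)) / h^2, error O(h^2)
    return (f(x0 + h) - 2 * f(x0) + f(x0 - h)) / h ** 2

def central_fourth_diff(f, x0, h):
    # f''''(x_0) ~ (f(x0+2h) - 4 f(x0+h) + 6 f(x0)
    #               - 4 f(x0-h) + f(x0-2h)) / h^4, error O(h^2)
    return (f(x0 + 2 * h) - 4 * f(x0 + h) + 6 * f(x0)
            - 4 * f(x0 - h) + f(x0 - 2 * h)) / h ** 4

# Assumed example: for f(x) = cos(x), f''(x) = -cos(x) and f''''(x) = cos(x).
print(central_second_diff(math.cos, 0.3, 0.05), -math.cos(0.3))
print(central_fourth_diff(math.cos, 0.3, 0.05), math.cos(0.3))
```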

Partial Derivatives:

Suppose u = u(x, y) is a function of two variables that we only know at grid points (xi, yj). We will use the notation

ui,j= u(xi, yj)

Throughout the rest of these lectures we will assume that the grid points are evenly spaced, with an increment of h in the x direction and k in the y direction. The central difference formulas for the first partial derivatives are:

ux(xi, yj) ≈ (ui+1,j − ui−1,j)/(2h) and
uy(xi, yj) ≈ (ui,j+1 − ui,j−1)/(2k).

The second partial derivatives are:

uxx(xi, yj) ≈ (ui+1,j − 2ui,j + ui−1,j)/h² and
uyy(xi, yj) ≈ (ui,j+1 − 2ui,j + ui,j−1)/k²,

and the mixed partial derivative is:

uxy(xi, yj) ≈ (ui+1,j+1 − ui+1,j−1 − ui−1,j+1 + ui−1,j−1)/(4hk).
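A minimal sketch of these five formulas on a grid, assuming NumPy and an assumed test surface u(x, y) = x²y, for which all five partials (2xy, x², 2y, 0, 2x) are known exactly:

```python
import numpy as np

def grid_partials(u, i, j, h, k):
    """Central-difference partials of u at interior grid point (i, j).
    u[i, j] = u(x_i, y_j); rows index x, columns index y, as in the text.
    Illustrative helper, not from the original notes."""
    ux  = (u[i + 1, j] - u[i - 1, j]) / (2 * h)
    uy  = (u[i, j + 1] - u[i, j - 1]) / (2 * k)
    uxx = (u[i + 1, j] - 2 * u[i, j] + u[i - 1, j]) / h ** 2
    uyy = (u[i, j + 1] - 2 * u[i, j] + u[i, j - 1]) / k ** 2
    uxy = (u[i + 1, j + 1] - u[i + 1, j - 1]
           - u[i - 1, j + 1] + u[i - 1, j - 1]) / (4 * h * k)
    return ux, uy, uxx, uyy, uxy

# Assumed test surface u(x, y) = x^2 * y, sampled on an even grid.
h, k = 0.5, 0.2
x = np.arange(5) * h
y = np.arange(5) * k
u = (x[:, None] ** 2) * y[None, :]

# At i = j = 2: x = 1.0, y = 0.4, so the exact values are
# ux = 0.8, uy = 1.0, uxx = 0.8, uyy = 0.0, uxy = 2.0.
print(grid_partials(u, 2, 2, h, k))
```

Because u is a polynomial of low degree, the central differences here reproduce the exact partial derivatives up to rounding.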

Caution: notice that we have indexed ui,j so that, as a matrix, each row represents the values of u at a certain xi and each column contains values at a certain yj. This arrangement does not coincide with the usual orientation of the xy-plane.

Let's consider an example. Suppose the values of u at the grid points (xi, yj) are recorded in a matrix:

[matrix of u values — not reproduced here]

Suppose the indices begin at 1, with i the index for rows and j the index for columns, and assume that h = .5 and k = .2. Then uy(x2, y4) would be approximated by the central difference:

uy(x2, y4) ≈ (u2,5 − u2,3)/(2k) = (u2,5 − u2,3)/.4

The mixed partial derivative uxy(x2, y4) is approximated by:

uxy(x2, y4) ≈ (u3,5 − u3,3 − u1,5 + u1,3)/(4hk) = (u3,5 − u3,3 − u1,5 + u1,3)/.4
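The original matrix of values is not reproduced here, so the following sketch uses an assumed 5×5 matrix purely to illustrate the indexing; note that the text's 1-based indices become 0-based in Python:

```python
# Assumed 5x5 matrix of u-values (the original data is not available);
# rows index x, columns index y, as in the text.
u = [
    [1.0, 1.2, 1.5, 1.9, 2.4],
    [1.1, 1.4, 1.8, 2.3, 2.9],
    [1.3, 1.7, 2.2, 2.8, 3.5],
    [1.6, 2.1, 2.7, 3.4, 4.2],
    [2.0, 2.6, 3.3, 4.1, 5.0],
]
h, k = 0.5, 0.2

# The text's u(x_2, y_4) is u[1][3] in 0-based Python indexing.
i, j = 1, 3
uy = (u[i][j + 1] - u[i][j - 1]) / (2 * k)                 # u_y(x_2, y_4)
uxy = (u[i + 1][j + 1] - u[i + 1][j - 1]
       - u[i - 1][j + 1] + u[i - 1][j - 1]) / (4 * h * k)  # u_xy(x_2, y_4)
print(uy, uxy)
```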
