
**Numerical Differentiation**:

*Approximating derivatives from data*:

Suppose that a variable y depends on another variable x, that is, y = f(x), but that we only know the values of f at a finite set of points, for example, as data from an experiment or a simulation:

(x_{1}, y_{1}), (x_{2}, y_{2}), . . . , (x_{n}, y_{n}).

Suppose then that we need information about the derivative of f(x). One obvious idea would be to approximate f′(x_{i}) by the Forward Difference:

f′(x_{i}) ≈ y′_{i} = (y_{i+1} − y_{i})/(x_{i+1} − x_{i}).

This formula follows directly from the definition of the derivative in calculus. An alternative would be to use the Backward Difference:

f′(x_{i}) ≈ y′_{i} = (y_{i} − y_{i−1})/(x_{i} − x_{i−1}).

Since the errors for the forward difference and backward difference tend to have opposite signs, it would seem likely that averaging the two methods would give a better result than either alone. If the points are evenly spaced, that is, x_{i+1} − x_{i} = x_{i} − x_{i−1} = h, then averaging the forward and backward differences leads to a symmetric expression called the Central Difference:

f′(x_{i}) ≈ y′_{i} = (y_{i+1} − y_{i−1})/(2h).
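To compare the three approximations numerically, here is a minimal Python sketch. The test function y = sin(x) is an assumption (not part of the original notes), chosen so the exact derivative cos(x) is available for reference:

```python
import math

# Hypothetical test data: y = sin(x) sampled at evenly spaced points,
# chosen so the exact derivative cos(x) is known for comparison.
h = 0.1
x = [n * h for n in range(11)]
y = [math.sin(xn) for xn in x]

i = 5  # an interior index where all three formulas apply
forward = (y[i + 1] - y[i]) / h            # forward difference
backward = (y[i] - y[i - 1]) / h           # backward difference
central = (y[i + 1] - y[i - 1]) / (2 * h)  # central difference

true_deriv = math.cos(x[i])
print(forward - true_deriv, backward - true_deriv, central - true_deriv)
```

As expected, the forward and backward errors have opposite signs, and the central difference error is much smaller than either.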

*Errors of approximation*:

We can use Taylor polynomials to derive the accuracy of the forward, backward, and central difference formulas. For example, the usual form of the Taylor polynomial with remainder (sometimes called Taylor's Theorem) is:

f(x + h) = f(x) + h f′(x) + (h^{2}/2) f′′(c),

where c is some (unknown) number between x and x + h. Letting x = x_{i}, x + h = x_{i+1}, and solving for f′(x_{i}) leads to:

f′(x_{i}) = (y_{i+1} − y_{i})/h − (h/2) f′′(c).

Notice that the quotient in this equation is exactly the forward difference formula. Thus the error of the forward difference is −(h/2)f′′(c), which means it is O(h). Replacing h in the above calculation by −h gives the error for the backward difference formula; it is also O(h).

(Figure: the three difference approximations of y′_{i}.)

For the central difference, the error can be found from the third-degree Taylor polynomials with remainders:

f(x_{i+1}) = f(x_{i}) + h f′(x_{i}) + (h^{2}/2) f′′(x_{i}) + (h^{3}/6) f′′′(c_{1}) and

f(x_{i−1}) = f(x_{i}) − h f′(x_{i}) + (h^{2}/2) f′′(x_{i}) − (h^{3}/6) f′′′(c_{2}),

where x_{i} ≤ c_{1} ≤ x_{i+1} and x_{i−1} ≤ c_{2} ≤ x_{i}. Subtracting these two equations and solving for f′(x_{i}) leads to:

f′(x_{i}) = (y_{i+1} − y_{i−1})/(2h) − (h^{2}/12)(f′′′(c_{1}) + f′′′(c_{2})).

This demonstrates that the error for the central difference formula is O(h^{2}). Central differences are therefore significantly more accurate, and it is best to use them whenever possible.
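The orders of accuracy can be checked numerically: halving h should roughly halve the forward difference error (O(h)) but cut the central difference error by about a factor of four (O(h^{2})). A short sketch, again assuming the test function sin(x):

```python
import math

def fwd_err(h, x=1.0):
    # error of the forward difference for f = sin at the point x
    return abs((math.sin(x + h) - math.sin(x)) / h - math.cos(x))

def cen_err(h, x=1.0):
    # error of the central difference for f = sin at the point x
    return abs((math.sin(x + h) - math.sin(x - h)) / (2 * h) - math.cos(x))

for h in (0.1, 0.05):
    print(h, fwd_err(h), cen_err(h))

fwd_ratio = fwd_err(0.1) / fwd_err(0.05)  # ~2 for an O(h) method
cen_ratio = cen_err(0.1) / cen_err(0.05)  # ~4 for an O(h^2) method
print(fwd_ratio, cen_ratio)
```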

There are also central difference formulas for higher order derivatives. These each have error of order O(h^{2}):

f′′(x_{i}) ≈ (y_{i+1} − 2y_{i} + y_{i−1})/h^{2},

f′′′(x_{i}) ≈ (y_{i+2} − 2y_{i+1} + 2y_{i−1} − y_{i−2})/(2h^{3}), and

f′′′′(x_{i}) ≈ (y_{i+2} − 4y_{i+1} + 6y_{i} − 4y_{i−1} + y_{i−2})/h^{4}.
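As a quick sanity check, the standard second-derivative central difference can be tried on an assumed test function (sin(x), whose exact second derivative is −sin(x)):

```python
import math

# Second-derivative central difference, checked on the assumed test
# function y = sin(x); the exact value is -sin(x).
h = 0.05
x = 1.0
ypp = (math.sin(x + h) - 2 * math.sin(x) + math.sin(x - h)) / h**2
print(ypp, -math.sin(x))
```

The discrepancy is on the order of h^{2}, consistent with the stated O(h^{2}) error.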

*Partial Derivatives*:

Suppose u = u(x, y) is a function of two variables that we only know at grid points (x_{i}, y_{j}). We will use the notation

u_{i,j}= u(x_{i}, y_{j})

Frequently in the rest of the lectures we will assume that the grid points are evenly spaced, with an increment of h in the x direction and k in the y direction. The central difference formulas for the first partial derivatives are then:

u_{x}(x_{i}, y_{j}) ≈ (1/(2h))(u_{i+1,j} − u_{i−1,j}) and

u_{y}(x_{i}, y_{j}) ≈ (1/(2k))(u_{i,j+1} − u_{i,j−1}).

The second partial derivatives are:

u_{xx}(x_{i}, y_{j}) ≈ (1/h^{2})(u_{i+1,j} − 2u_{i,j} + u_{i−1,j}) and

u_{yy}(x_{i}, y_{j}) ≈ (1/k^{2})(u_{i,j+1} − 2u_{i,j} + u_{i,j−1}),

and the mixed partial derivative is:

u_{xy}(x_{i}, y_{j}) ≈ (1/(4hk))(u_{i+1,j+1} − u_{i+1,j−1} − u_{i−1,j+1} + u_{i−1,j−1}).
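These grid formulas can be sketched in a few lines of Python. The test function u(x, y) = x^{2}y and the spacings below are assumptions, chosen so the exact partials (u_x = 2xy, u_y = x^{2}, u_xy = 2x) are known for comparison:

```python
# Central differences for partial derivatives on an evenly spaced grid,
# using the assumed test function u(x, y) = x^2 * y.
h, k = 0.1, 0.2                            # spacings in x and y
xs = [n * h for n in range(5)]
ys = [m * k for m in range(5)]
u = [[x * x * y for y in ys] for x in xs]  # u[i][j] = u(x_i, y_j)

i, j = 2, 2                                # an interior grid point
ux = (u[i + 1][j] - u[i - 1][j]) / (2 * h)
uy = (u[i][j + 1] - u[i][j - 1]) / (2 * k)
uxy = (u[i + 1][j + 1] - u[i + 1][j - 1]
       - u[i - 1][j + 1] + u[i - 1][j - 1]) / (4 * h * k)
print(ux, uy, uxy)
```

For this particular u, which is quadratic in x and linear in y, the central differences happen to be exact up to rounding.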

Caution: Notice that we have indexed u_{i,j} so that, as a matrix, each row represents the values of u at a fixed x_{i} and each column contains values at a fixed y_{j}. This arrangement does not coincide with the usual orientation of the xy-plane.

Let's consider an example. Suppose the values of u at the grid points (x_{i}, y_{j}) are recorded in the matrix:

Suppose the indices begin at 1, with i the index for rows and j the index for columns, and assume that h = .5 and k = .2. Then u_{y}(x_{2}, y_{4}) would be approximated by the central difference:

u_{y}(x_{2}, y_{4}) ≈ (1/(2k))(u_{2,5} − u_{2,3}).

The mixed partial derivative u_{xy}(x_{2}, y_{4}) is approximated by:

u_{xy}(x_{2}, y_{4}) ≈ (1/(4hk))(u_{3,5} − u_{3,3} − u_{1,5} + u_{1,3}).
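Since the matrix of u-values did not survive in this copy of the notes, the indexing can still be illustrated with a hypothetical matrix U (all values made up), translating the 1-based indices above into Python's 0-based indexing:

```python
h, k = 0.5, 0.2  # grid spacings, as in the example above

# Hypothetical matrix of u-values: U[i][j] = u(x_i, y_j), rows indexed
# by x and columns by y. These numbers are invented for illustration.
U = [[0.0, 0.1, 0.3, 0.6, 1.0],
     [0.2, 0.4, 0.7, 1.1, 1.6],
     [0.5, 0.8, 1.2, 1.7, 2.3],
     [0.9, 1.3, 1.8, 2.4, 3.1]]

i, j = 2 - 1, 4 - 1  # the point (x_2, y_4), shifted to 0-based indices
uy = (U[i][j + 1] - U[i][j - 1]) / (2 * k)
uxy = (U[i + 1][j + 1] - U[i + 1][j - 1]
       - U[i - 1][j + 1] + U[i - 1][j - 1]) / (4 * h * k)
print(uy, uxy)
```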
