Numerical Methods Qualification Exam Problems and Solutions (University of Maryland)/January 2000
Problem 1
Let be a function with 3 continuous derivatives. Let be a quadratic polynomial that interpolates at . Let and
Problem 1a
Show that
where depends only on and determine . (Hint: the key to this is to prove that vanishes at some point in . The result can then be obtained by integration.)
Solution 1a
Proof of Hint
Claim: There exists such that
Proof:
The interpolation polynomial may be expressed using divided difference coefficients, i.e.
which implies
In general, a divided difference coefficient may be expressed as a factorial-weighted evaluation of a derivative of the function at an intermediate point, i.e.

f[x_0, \ldots, x_k] = \frac{f^{(k)}(\xi)}{k!}

for some \xi in the smallest interval containing the nodes.
Hence,
which implies
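As an illustrative aside (not part of the exam solution), the mean-value property of divided differences can be checked numerically; the function sin and the nodes below are arbitrary choices:

```python
# Hypothetical sanity check: the second divided difference of f should equal
# f''(xi)/2! for some xi in the span of the nodes.
import math

def divided_differences(xs, ys):
    """Return the Newton divided-difference coefficients f[x0], f[x0,x1], ..."""
    coeffs = list(ys)
    n = len(xs)
    for level in range(1, n):
        # update in place from the right so lower-order values stay available
        for i in range(n - 1, level - 1, -1):
            coeffs[i] = (coeffs[i] - coeffs[i - 1]) / (xs[i] - xs[i - level])
    return coeffs

f = math.sin
xs = [0.1, 0.2, 0.3]
c = divided_differences(xs, [f(x) for x in xs])
# f''(x)/2! = -sin(x)/2, so c[2] must lie between the endpoint values.
lo, hi = sorted(-math.sin(x) / 2 for x in (0.1, 0.3))
assert lo <= c[2] <= hi
```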
Application of Hint
From the hint we know that
which implies
Since the interpolating polynomial is quadratic, its second derivative is constant, i.e.
for all
Therefore,
By the fundamental theorem of calculus,
Therefore,
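The cubic-order error bound can be spot-checked numerically. The sketch below assumes the standard setup (a quadratic interpolant through three equally spaced nodes spanning a width-2h interval); the function exp and all the numbers are illustrative choices, not from the exam:

```python
# Hypothetical check: halving h should shrink the max interpolation error
# by roughly 2^3 = 8, consistent with an O(h^3) bound involving f'''.
import math

def quad_interp_error(f, x0, h, samples=200):
    xs = [x0, x0 + h, x0 + 2 * h]
    ys = [f(x) for x in xs]
    def p(t):  # Lagrange form of the quadratic interpolant
        total = 0.0
        for i in range(3):
            term = ys[i]
            for j in range(3):
                if j != i:
                    term *= (t - xs[j]) / (xs[i] - xs[j])
            total += term
        return total
    pts = [x0 + 2 * h * k / samples for k in range(samples + 1)]
    return max(abs(f(t) - p(t)) for t in pts)

e1 = quad_interp_error(math.exp, 0.0, 0.1)
e2 = quad_interp_error(math.exp, 0.0, 0.05)
ratio = e1 / e2  # roughly 8 for a function with f''' != 0
```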
Problem 1b
Now suppose and has 4 continuous derivatives. In this case show
Solution 1b
Third Derivative of f has Zero
We know that , because . Now, by we can conclude that there exists such that .
Application of Fundamental Theorem of Calculus (Twice)
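The precise statement of 1b is not fully recoverable in this copy. In the usual version of this problem the middle node is the midpoint of the interval and the integral of f − p (Simpson's rule on one panel) gains an extra order of h; the sketch below checks that higher-order behavior under that assumption, with sin and all numbers as illustrative choices:

```python
# Hypothetical check: the single-panel Simpson error is O(h^5) when f has
# four continuous derivatives, so halving h should shrink it by roughly 2^5.
import math

def simpson_error(f, F, a, h):
    """E(h) = integral of f over [a, a+2h] minus the Simpson approximation.
    F must be an antiderivative of f."""
    b = a + 2 * h
    simpson = (h / 3) * (f(a) + 4 * f(a + h) + f(b))
    return (F(b) - F(a)) - simpson

e1 = abs(simpson_error(math.sin, lambda x: -math.cos(x), 0.3, 0.1))
e2 = abs(simpson_error(math.sin, lambda x: -math.cos(x), 0.3, 0.05))
ratio = e1 / e2  # roughly 2^5 = 32 when the fourth derivative is nonzero
```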
Problem 2a
Find such that is a polynomial of degree and this set is orthogonal on with respect to the weight function . (Note: , )
Solution 2a
Apply Gram-Schmidt
To find the orthogonal polynomials, use the Gram-Schmidt method.
Let be a basis of .
Calculate p_0
Choose .
Calculate p_1
From Gram-Schmidt, we have
, where
Therefore
Calculate p_2
Proceeding with Gram-Schmidt, we have
where
Therefore
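The weight function and interval are lost in this copy; purely for illustration, the Gram-Schmidt computation above can be carried out with w(x) = 1 on [-1, 1], which yields the monic Legendre polynomials:

```python
# Illustrative Gram-Schmidt on {1, x, x^2} with the ASSUMED weight w(x) = 1
# on [-1, 1]. Polynomials are coefficient lists [c0, c1, ...] for sum c_k x^k.

def inner(p, q, a=-1.0, b=1.0):
    """Integral of p(x) q(x) over [a, b] for polynomial coefficient lists."""
    prod = [0.0] * (len(p) + len(q) - 1)
    for i, ci in enumerate(p):
        for j, cj in enumerate(q):
            prod[i + j] += ci * cj
    # integrate the product term by term
    return sum(c * (b ** (k + 1) - a ** (k + 1)) / (k + 1)
               for k, c in enumerate(prod))

basis = [[1.0], [0.0, 1.0], [0.0, 0.0, 1.0]]  # 1, x, x^2
ortho = []
for v in basis:
    p = list(v)
    for q in ortho:  # subtract projections onto previously built polynomials
        coef = inner(v, q) / inner(q, q)
        for k, ck in enumerate(q):
            p[k] -= coef * ck
    ortho.append(p)
# ortho is [1], [0, 1], [-1/3, 0, 1]: p0 = 1, p1 = x, p2 = x^2 - 1/3
```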
Problem 2b
Derive the 2-point Gaussian formula
Solution 2b
Find the Nodes
The nodes and are the roots of the degree-2 orthogonal polynomial, i.e.
Applying the quadratic formula yields the roots:
Find the Weights
The approximation is exact for polynomials of degree at most 3 (a 2-point Gaussian rule is exact through degree 2n − 1 = 3). Hence, we have the following system of equations
Solving the system of equations by substitution yields the weights:
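Continuing the same illustrative assumption (w(x) = 1 on [-1, 1], which is not recoverable from this copy): the nodes are the roots of p2(x) = x^2 − 1/3 and both weights come out to 1, and the resulting rule integrates cubics exactly:

```python
# Illustrative 2-point Gauss rule for the ASSUMED weight w(x) = 1 on [-1, 1].
import math

x0, x1 = -1 / math.sqrt(3), 1 / math.sqrt(3)  # roots of x^2 - 1/3
# Exactness on f = 1 and f = x gives w0 + w1 = 2 and w0*x0 + w1*x1 = 0,
# so w0 = w1 = 1 by substitution.
w0 = w1 = 1.0

def gauss2(f):
    return w0 * f(x0) + w1 * f(x1)

# Exact for every cubic, e.g. integral of x^3 + x^2 over [-1, 1] is 2/3.
assert abs(gauss2(lambda x: x**3 + x**2) - 2 / 3) < 1e-12
```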
Problem 3
Let be an nonsingular matrix, and consider the linear system
Problem 3a
Write down the Jacobi iteration for solving as it would be programmed on a computer.
Solution 3a
Choose an initial guess x^(0)
for k = 0, 1, 2, ...
- x^(k+1) = D^(-1) (b - (L + U) x^(k))
- stop when <convergence condition> is met
end
Where A = D + L + U; D is diagonal, and L, U are the strictly lower and upper triangular parts of A, respectively.
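A minimal Python sketch of the loop above (illustrative, not from the exam; the matrix, right-hand side, and tolerance are made-up values). Each row is solved for its diagonal unknown using only the previous iterate:

```python
# Dense Jacobi iteration: x_new[i] = (b[i] - sum_{j != i} A[i][j] x[j]) / A[i][i]

def jacobi(A, b, x0, iters=100, tol=1e-10):
    n = len(b)
    x = list(x0)
    for _ in range(iters):
        x_new = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i))
                 / A[i][i] for i in range(n)]
        # convergence condition: successive iterates agree to within tol
        if max(abs(x_new[i] - x[i]) for i in range(n)) < tol:
            return x_new
        x = x_new
    return x

A = [[4.0, 1.0], [2.0, 5.0]]  # strictly diagonally dominant, so Jacobi converges
b = [6.0, 9.0]
x = jacobi(A, b, [0.0, 0.0])  # converges to the exact solution [7/6, 4/3]
```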
Problem 3b
Suppose has non-zero elements with . How many operations per iteration does the Jacobi iteration take?
Solution 3b
The diagonal entries of are non-zero since otherwise would not exist.
Therefore contains off-diagonal non-zero entries.
The computation during each iteration is given by
Therefore there are multiplies in each iteration.
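The per-iteration operation count can be made concrete with a sparse-storage sketch (the CSR layout here is an illustrative choice, not from the exam): one Jacobi sweep touches each stored nonzero exactly once, so the cost per iteration is proportional to the number of nonzeros.

```python
# One Jacobi sweep with A in CSR form (data/indices/indptr); diag[i] = A[i][i].
# Each stored off-diagonal nonzero contributes exactly one multiply.

def jacobi_sweep_csr(data, indices, indptr, diag, b, x):
    n = len(b)
    x_new = [0.0] * n
    for i in range(n):
        s = 0.0
        for k in range(indptr[i], indptr[i + 1]):  # nonzeros of row i
            j = indices[k]
            if j != i:
                s += data[k] * x[j]  # one multiply per nonzero
        x_new[i] = (b[i] - s) / diag[i]
    return x_new

# Toy example: A = [[4, 1], [2, 5]] stored in CSR; one sweep from x = 0
# gives [b[0]/diag[0], b[1]/diag[1]] = [1.5, 1.8].
x_new = jacobi_sweep_csr([4.0, 1.0, 2.0, 5.0], [0, 1, 0, 1], [0, 2, 4],
                         [4.0, 5.0], [6.0, 9.0], [0.0, 0.0])
```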
Problem 3c
Assume that is strictly diagonally dominant: for
Solution 3c
Let .
Theorem 8.2.1 [SB] states that if and only if the Jacobi iteration converges.
Matrix multiplication and the definitions of give the explicit entrywise value of
for and
Then, using Gerschgorin's Theorem and diagonal dominance, we have the result.
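The Gershgorin step can be illustrated concretely: the Jacobi iteration matrix C = −D^(-1)(L + U) has zero diagonal, so each of its Gershgorin disks is centered at 0 with radius equal to the scaled off-diagonal row sum of A, and strict diagonal dominance makes every radius less than 1. A sketch with a made-up strictly diagonally dominant matrix:

```python
# C[i][j] = -A[i][j]/A[i][i] for j != i and C[i][i] = 0, so the Gershgorin
# radius of row i of C is sum_{j != i} |A[i][j]| / |A[i][i]|.

def gershgorin_radii(A):
    n = len(A)
    return [sum(abs(A[i][j]) for j in range(n) if j != i) / abs(A[i][i])
            for i in range(n)]

# Illustrative strictly diagonally dominant matrix (not from the exam).
A = [[4.0, 1.0, 0.5],
     [1.0, -3.0, 1.0],
     [0.0, 2.0, 5.0]]
radii = gershgorin_radii(A)
# Every eigenvalue of C lies in some disk |z| <= radii[i] < 1, so rho(C) < 1
# and the Jacobi iteration converges.
assert all(r < 1 for r in radii)
```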