Numerical Methods Qualification Exam Problems and Solutions (University of Maryland)/January 2000


Problem 1

Let $f$ be a function with 3 continuous derivatives. Let $q$ be a quadratic polynomial that interpolates $f$ at $x_0<x_1<x_2$. Let $h=\max(x_1-x_0,\,x_2-x_1)$ and

$\max_{x_0\le x\le x_2}|f'''(x)|=K.$

Problem 1a

Show that

$\max_{x_0\le x\le x_2}|f''(x)-q''(x)|\le Ch^{\alpha},$

where $C>0$ depends only on $K$, and determine $\alpha$. (Hint: the key to this is to prove that $f''(x)-q''(x)$ vanishes at some point in $[x_0,x_2]$. The result can then be obtained by integration.)

Solution 1a

Proof of Hint

Claim: There exists $\eta\in(x_0,x_2)$ such that $f''(\eta)-q''(\eta)=0$.


Proof: The interpolation polynomial may be expressed using divided-difference coefficients, i.e.


$q(x)=f[x_0]+f[x_0,x_1](x-x_0)+f[x_0,x_1,x_2](x-x_0)(x-x_1),$


which implies


$q''(x)=2f[x_0,x_1,x_2].$


In general, a divided-difference coefficient may be expressed as a factorial-weighted derivative of $f$ evaluated at some intermediate point, i.e.


$(\ast)\qquad f[x_0,\ldots,x_n]=\frac{f^{(n)}(\xi)}{n!},\qquad \xi\in I[x_0,\ldots,x_n],$

where $I[x_0,\ldots,x_n]$ denotes the smallest interval containing $x_0,\ldots,x_n$.


Hence,


$f[x_0,x_1,x_2]=\frac{f''(\eta)}{2!},\qquad \eta\in[x_0,x_2],$


which implies


$q''(\eta)-f''(\eta)=0.$

Application of Hint

From the hint we know that


$f''(\eta)-q''(\eta)=0,$


which implies


$f''(x)-q''(x)=f''(x)-q''(x)-f''(\eta)+q''(\eta).$


Since $q$ is quadratic, $q''$ is constant, i.e. $q''(x)=q''(\eta)$ for all $x$.


Therefore,


$f''(x)-q''(x)=f''(x)-q''(x)-f''(\eta)+q''(\eta)=f''(x)-f''(\eta).$


By the fundamental theorem of calculus,


$f''(x)-f''(\eta)=\int_\eta^x f'''(t)\,dt\le \|f'''\|_\infty\,|x-\eta|\le 2h\,\|f'''\|_\infty.$


Therefore,


$\max_{x\in[x_0,x_2]}|f''(x)-q''(x)|\le 2h\,\|f'''\|_\infty=2Kh\le Ch^1,$

so $\alpha=1$ with $C=2K$.
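As a numerical sanity check (not part of the exam solution), the sketch below computes $q''=2f[x_0,x_1,x_2]$ directly from the divided difference and measures $\max|f''-q''|$ as $h$ is halved; the test function $f=\sin$, the base point, and the grid resolution are illustrative assumptions.

```python
import numpy as np

def second_deriv_error(f, d2f, x0, x1, x2):
    """Max of |f''(x) - q''(x)| over [x0, x2], where q is the quadratic
    interpolant of f at x0 < x1 < x2.  q'' is the constant 2*f[x0,x1,x2]."""
    dd = ((f(x2) - f(x1)) / (x2 - x1)
          - (f(x1) - f(x0)) / (x1 - x0)) / (x2 - x0)   # f[x0,x1,x2]
    q2 = 2.0 * dd                                      # q''(x), constant in x
    xs = np.linspace(x0, x2, 1001)
    return np.max(np.abs(d2f(xs) - q2))

# illustrative test: f = sin, so K = max|f'''| <= 1
for h in (0.1, 0.05, 0.025):
    err = second_deriv_error(np.sin, lambda x: -np.sin(x), 1.0, 1.0 + h, 1.0 + 2*h)
    print(h, err)   # err shrinks roughly linearly in h (alpha = 1)
```

Halving $h$ should roughly halve the observed maximum, consistent with $\alpha=1$.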

Problem 1b

Now suppose $h=x_2-x_1=x_1-x_0$ and $f$ has 4 continuous derivatives. In this case show


$|f''(x_1)-q''(x_1)|\le Ch^{\beta},$

where $\beta>\alpha$. What is $C$ in terms of the derivatives of $f$?


Solution 1b

Third Derivative of f has Zero

We know that $f[x_0,x_1,x_2,x]=0$, because $q\in P_2$. Now, by $(\ast)$ we can conclude that there exists $\eta^*\in I[x_0,x_1,x_2,x]$ such that $f'''(\eta^*)=0$.

Application of Fundamental Theorem of Calculus (Twice)

Since $q''$ is constant and $f''(\eta)=q''(\eta)$, we have $f''(x_1)-q''(x_1)=f''(x_1)-f''(\eta)$. Hence

$f''(x_1)-f''(\eta)=\int_\eta^{x_1}f^{(3)}(x)\,dx=\int_\eta^{x_1}\Big(f^{(3)}(x)-\underbrace{f^{(3)}(\eta^*)}_{=0}\Big)\,dx=\int_\eta^{x_1}\!\int_{\eta^*}^{x}f^{(4)}(t)\,dt\,dx\le\|f^{(4)}\|_\infty\,\max_x|x-\eta^*|\,|x_1-\eta|\le 2\|f^{(4)}\|_\infty h^2\le Ch^2,$

so $\beta=2$ with $C=2\max|f^{(4)}|$.
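The $O(h^2)$ rate at the midpoint can also be observed numerically. For equally spaced nodes, $q''(x_1)=2f[x_0,x_1,x_2]=\big(f(x_0)-2f(x_1)+f(x_2)\big)/h^2$ is the centered second difference; the sketch below (with the illustrative choice $f=e^x$) halves $h$ and watches the error drop by a factor of about four.

```python
import numpy as np

def midpoint_error(f, d2f, c, h):
    """|f''(x1) - q''(x1)| at the middle node x1 = c of the equally spaced
    nodes c - h, c, c + h; q''(x1) = (f(c-h) - 2 f(c) + f(c+h)) / h^2."""
    q2 = (f(c - h) - 2.0 * f(c) + f(c + h)) / h**2
    return abs(d2f(c) - q2)

# illustrative test: f = exp, so every derivative of f is exp as well
errs = [midpoint_error(np.exp, np.exp, 1.0, h) for h in (0.1, 0.05, 0.025)]
print(errs[0] / errs[1], errs[1] / errs[2])   # both ratios are close to 4
```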

Problem 2a

Find $\{p_0,p_1,p_2\}$ such that $p_i$ is a polynomial of degree $i$ and this set is orthogonal on $[0,\infty)$ with respect to the weight function $w(x)=e^{-x}$. (Note: $\int_0^\infty x^n e^{-x}\,dx=n!$, $0!=1$.)

Solution 2a

Apply Gram Schmidt

To find the orthogonal set $\{p_0,p_1,p_2\}$, use the Gram Schmidt method.

Let $\{v_0=1,\,v_1=x,\,v_2=x^2\}$ be a basis of $P_2$.

Calculate p_0

Choose $p_0=v_0=1$.

Calculate p_1

From Gram Schmidt, we have


$p_1=v_1-\frac{(v_1,p_0)}{(p_0,p_0)}p_0=x-\frac{(v_1,p_0)}{(p_0,p_0)}\cdot 1$, where


$(v_1,p_0)=\int_0^\infty e^{-x}\cdot 1\cdot x\,dx=1$


$(p_0,p_0)=\int_0^\infty e^{-x}\cdot 1\cdot 1\,dx=1$


Therefore $p_1=x-1$.

Calculate p_2

Proceeding with Gram Schmidt, we have

$p_2=v_2-\frac{(v_2,p_1)}{(p_1,p_1)}p_1-\frac{(v_2,p_0)}{(p_0,p_0)}p_0=x^2-\frac{(v_2,p_1)}{(p_1,p_1)}(x-1)-\frac{(v_2,p_0)}{(p_0,p_0)}\cdot 1$, where


$(v_2,p_1)=\int_0^\infty e^{-x}x^2(x-1)\,dx=\underbrace{\int_0^\infty e^{-x}x^3\,dx}_{3!}-\underbrace{\int_0^\infty e^{-x}x^2\,dx}_{2!}=4$


$(p_1,p_1)=\int_0^\infty e^{-x}(x-1)(x-1)\,dx=\int_0^\infty e^{-x}(x^2-2x+1)\,dx=\underbrace{\int_0^\infty e^{-x}x^2\,dx}_{2!}-2\underbrace{\int_0^\infty e^{-x}x\,dx}_{1!}+\underbrace{\int_0^\infty e^{-x}\cdot 1\,dx}_{0!}=1$


$(v_2,p_0)=\int_0^\infty e^{-x}x^2\cdot 1\,dx=2!=2$


$(p_0,p_0)=\int_0^\infty e^{-x}\cdot 1\cdot 1\,dx=1$


Therefore


$p_2=x^2-4(x-1)-2\cdot 1=x^2-4x+2.$
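The orthogonality of the three polynomials can be verified numerically. The sketch below evaluates the inner products with Gauss-Laguerre quadrature, which is exact for the polynomial integrands involved; the node count of 10 is an arbitrary illustrative choice.

```python
import numpy as np

# Gauss-Laguerre quadrature: integrates e^{-x} * p(x) on [0, inf)
# exactly for polynomials p of degree <= 2*10 - 1 = 19
nodes, weights = np.polynomial.laguerre.laggauss(10)

def ip(f, g):
    """Inner product (f, g) = integral over [0, inf) of e^{-x} f(x) g(x) dx."""
    return float(np.sum(weights * f(nodes) * g(nodes)))

p0 = lambda x: np.ones_like(x)
p1 = lambda x: x - 1.0
p2 = lambda x: x**2 - 4.0*x + 2.0

print(ip(p0, p1), ip(p0, p2), ip(p1, p2))   # all three vanish (up to roundoff)
```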

Problem 2b

Derive the 2-point Gaussian formula


$\int_0^\infty f(x)e^{-x}\,dx\approx w_1 f(x_1)+w_2 f(x_2),$

i.e. find the weights and nodes.

Solution 2b

Find the Nodes

The nodes $x_1$ and $x_2$ are the roots of the $n$th orthogonal polynomial; here $n=2$ and $p_2(x)=x^2-4x+2$.


Applying the quadratic formula yields the roots:

$x_1=2-\sqrt{2},\qquad x_2=2+\sqrt{2}.$

Find the Weights

The approximation is exact for polynomials of degree at most $2n-1=3$. Hence, taking $f(x)=1$ and $f(x)=x$, we have the following system of equations:


$\int_0^\infty e^{-x}\,dx=1=w_1\cdot 1+w_2\cdot 1$


$\int_0^\infty x e^{-x}\,dx=1=w_1(2-\sqrt{2})+w_2(2+\sqrt{2})$


Solving the system of equations by substitution yields the weights:


$w_1=\frac{2+\sqrt{2}}{4}$


$w_2=\frac{2-\sqrt{2}}{4}$
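A quick check, not part of the derivation: since the rule must reproduce the moments $\int_0^\infty x^k e^{-x}\,dx=k!$ for $k=0,\dots,3$, the sketch below confirms the nodes and weights.

```python
import math

x1, x2 = 2 - math.sqrt(2), 2 + math.sqrt(2)              # nodes
w1, w2 = (2 + math.sqrt(2)) / 4, (2 - math.sqrt(2)) / 4  # weights

def gauss2(f):
    """Two-point Gauss-Laguerre rule for the weighted integral of f."""
    return w1 * f(x1) + w2 * f(x2)

# exact for polynomials of degree <= 2n - 1 = 3
for k in range(4):
    print(k, gauss2(lambda x: x**k), math.factorial(k))   # rule matches k!
```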

Problem 3

Let A be an n×n nonsingular matrix, and consider the linear system Ax=b


Problem 3a

Write down the Jacobi iteration for solving Ax=b in a way that it would be programmed on a computer

Solution 3a

Choose $x^{(0)}$

for $k=0,1,2,\ldots$
    $x^{(k+1)}=D^{-1}\big(b-(L+U)x^{(k)}\big)$
    <convergence condition>
end

where $A=D+L+U$, $D$ is diagonal, and $L$ and $U$ are the strictly lower and upper triangular parts of $A$, respectively.
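A minimal dense-matrix sketch of the loop above, assuming numpy; the tolerance, iteration cap, and the infinity-norm stopping rule are illustrative choices standing in for <convergence condition>.

```python
import numpy as np

def jacobi(A, b, x0=None, tol=1e-10, max_iter=500):
    """Jacobi iteration x^(k+1) = D^{-1} (b - (L + U) x^(k)), A = D + L + U."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    d = np.diag(A)                 # diagonal of A; Jacobi needs d != 0
    R = A - np.diag(d)             # L + U: the off-diagonal part of A
    x = np.zeros_like(b) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_new = (b - R @ x) / d
        if np.linalg.norm(x_new - x, np.inf) < tol:   # <convergence condition>
            return x_new
        x = x_new
    return x

# strictly diagonally dominant example, so the iteration converges
A = [[4.0, 1.0, 1.0], [1.0, 5.0, 2.0], [0.0, 1.0, 3.0]]
b = [6.0, 8.0, 4.0]
print(jacobi(A, b))   # approximate solution of A x = b
```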

Problem 3b

Suppose $A$ has $m$ non-zero elements with $m\ll n^2$. How many operations per iteration does the Jacobi iteration take?

Solution 3b

The $n$ diagonal entries of $A$ are non-zero, since otherwise $D^{-1}$ would not exist.


Therefore $A$ contains $m-n$ non-zero off-diagonal entries.


The computation during each iteration is given by


$x^{(k+1)}=D^{-1}\underbrace{\big(b-\overbrace{(L+U)x^{(k)}}^{m-n\ \text{multiplies}}\big)}_{n\ \text{multiplies}}$


Therefore there are $(m-n)+n=m$ multiplies in each iteration.
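The count can be made concrete with a sparse-storage sketch (an illustrative format: one (i, j, value) triple per off-diagonal nonzero): one sweep performs exactly one multiply per off-diagonal entry plus one division per row, i.e. m multiplicative operations in total.

```python
def jacobi_sweep_sparse(diag, offdiag, b, x):
    """One Jacobi sweep with A stored sparsely: diag[i] = a_ii, and
    offdiag is a list of (i, j, a_ij) triples for the off-diagonal
    nonzeros.  Returns the new iterate and the multiplicative-op count."""
    s = list(b)                       # will accumulate b - (L+U) x
    ops = 0
    for i, j, a in offdiag:
        s[i] -= a * x[j]              # m - n multiplies in total
        ops += 1
    x_new = [s[i] / diag[i] for i in range(len(b))]
    ops += len(b)                     # n divisions
    return x_new, ops

diag = [4.0, 5.0, 3.0]                               # n = 3 diagonal nonzeros
offdiag = [(0, 1, 1.0), (0, 2, 1.0), (1, 0, 1.0),
           (1, 2, 2.0), (2, 1, 1.0)]                 # m - n = 5 off-diagonal
x, ops = jacobi_sweep_sparse(diag, offdiag, [6.0, 8.0, 4.0], [0.0, 0.0, 0.0])
print(ops)   # m = 8 multiplicative operations
```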

Problem 3c

Assume that $A$ is strictly diagonally dominant: for $i=1,\ldots,n$,

$|a_{ii}|>\sum_{j=1,\,j\ne i}^{n}|a_{ij}|.$


Show that the Jacobi iteration converges for any guess $x^{(0)}$. (Hint: You may use Gerschgorin's theorem without proving it.)

Solution 3c

Let $E:=-D^{-1}(L+U)$ be the Jacobi iteration matrix.


Theorem 8.2.1 [SB] states that $\rho(E)<1$ if and only if the Jacobi iteration converges.


Matrix multiplication and the definitions of $D^{-1}$, $L$, $U$ give the explicit entrywise values of $E$:

$e_{ij}=-\frac{a_{ij}}{a_{ii}}\ \ (j\ne i),\qquad e_{ii}=0.$


Then, using Gerschgorin's theorem and strict diagonal dominance, every eigenvalue $\lambda$ of $E$ satisfies, for some row $i$,

$|\lambda-\underbrace{e_{ii}}_{=0}|\le\sum_{j=1,\,j\ne i}^{n}|e_{ij}|=\sum_{j=1,\,j\ne i}^{n}\frac{|a_{ij}|}{|a_{ii}|}<1,$

so $\rho(E)<1$ and the Jacobi iteration converges.
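For a concrete strictly diagonally dominant matrix (chosen arbitrarily for illustration), the sketch below forms $E=-D^{-1}(L+U)$ and confirms that its spectral radius sits below the largest Gerschgorin row sum, which is itself below 1.

```python
import numpy as np

A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [0.0, 1.0, 3.0]])          # strictly diagonally dominant
d = np.diag(A)
E = -(A - np.diag(d)) / d[:, None]       # e_ij = -a_ij / a_ii, e_ii = 0
rho = max(abs(np.linalg.eigvals(E)))     # spectral radius of E
gersh = np.sum(np.abs(E), axis=1).max()  # largest Gerschgorin row sum
print(rho, gersh)                        # rho <= gersh < 1
```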
