MANE 3351
Lecture 11
Classroom Management
Agenda
- Test 1 is not yet graded
- Lab 4 extended until end of day on Thursday October 9, 2025
- Newton’s Method
Resources
Handouts
Newton's Method
Both the bisection and false-position methods were bracketing approaches to root finding that required two starting points that "bracketed" the root. Today's topic, Newton's method (also called the Newton-Raphson method), requires only one starting point. Newton's method does, however, require knowledge of the derivative.
Geometric Inspiration
Brin (2020)1 demonstrates the geometric inspiration for Newton's method

Newton's Method
- The formula is simply
\(x_{n+1}=x_n-\dfrac{f(x_n)}{f^\prime(x_n)}\)
New Example Problem
- Consider a new function, \(f(x)=e^x+2^{-x}+2\cos(x)-6=0\)
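Before applying Newton's method it is worth confirming a root exists. As a quick sketch (the sample points are an illustrative choice, not from the handout), evaluating \(f\) at a few values shows a sign change:

```python
import math

def f(x):
    return math.exp(x) + 2**(-x) + 2*math.cos(x) - 6

# A sign change between consecutive sample points signals a root between them
for x in (1.0, 1.5, 2.0, 2.5):
    print(f"f({x}) = {f(x):.4f}")
```

The sign change between 1.5 and 2.0 indicates a root in that interval, which also suggests where to start the iteration.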

First Derivative
- Newton's Method requires the first derivative:
\(f^\prime(x)=e^x-2^{-x}\ln 2-2\sin(x)\)
Review of Derivatives
- An excellent table of derivatives is found at Table of Derivatives
- A helpful site is Derivative Calculator
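If SymPy is available, the hand-computed derivative can also be checked symbolically. This is only a verification sketch; SymPy is not otherwise used in the course code:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(x) + 2**(-x) + 2*sp.cos(x) - 6

# Differentiate symbolically; the result should match the hand derivative
f_prime = sp.diff(f, x)
print(f_prime)
```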
Pseudo-code
Brin (2020)1 provides the following pseudo-code

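As a sketch of the standard algorithm (this is a generic rendering with illustrative names and defaults, not a reproduction of Brin's pseudo-code), Newton's method can be written as a reusable Python function:

```python
import math

def newton(f, f_prime, x0, tol=0.0005, max_iter=100):
    """Return an approximate root of f via Newton iteration."""
    for _ in range(max_iter):
        x1 = x0 - f(x0) / f_prime(x0)   # one Newton step
        if abs(x1 - x0) < tol:          # successive iterates agree: done
            return x1
        x0 = x1
    raise RuntimeError("Newton's method did not converge")

# Apply to the lecture's example function
f = lambda x: math.exp(x) + 2**(-x) + 2*math.cos(x) - 6
f_prime = lambda x: math.exp(x) - 2**(-x)*math.log(2) - 2*math.sin(x)
root = newton(f, f_prime, 3.5)
print(root)
```

Packaging the loop as a function makes it easy to reuse with different functions and starting points.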
Vectorizing a Function
- All Python functions considered so far have operated on scalars
- Functions can be created to process a vector of values with a single call; this is called a vectorized function
- Consider plotting the new function defined earlier using a vectorized function
Vectorized Function Code
import math
import numpy as np
import matplotlib.pyplot as plt

def f(x):
    return math.exp(x) + 2**(-x) + 2*math.cos(x) - 6

# convert f(x) into a vectorized function that can be applied to an array
vec_f = np.vectorize(f)

x = np.linspace(0, 5, 101)
f_x = vec_f(x)            # y = f(x) for every element of x

fig, ax = plt.subplots()
ax.plot(x, f_x)
ax.axhline(y=0.0, color='r', linestyle='-')   # mark the x-axis
ax.set(xlabel='x', ylabel='f(x)', title='New Function')
plt.show()
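As a design note, np.vectorize is only a convenience loop over scalar calls. Since NumPy's own functions (np.exp, np.cos, and the ** operator) already operate element-wise on arrays, the same result can be obtained without it. A minimal sketch, equivalent to the code above:

```python
import numpy as np

def f_np(x):
    # np.exp, np.cos, and ** broadcast over arrays element-wise,
    # so no np.vectorize wrapper is needed
    return np.exp(x) + 2.0**(-x) + 2*np.cos(x) - 6

x = np.linspace(0, 5, 101)
f_x = f_np(x)
print(f_x.shape)
```

The array version also runs faster than np.vectorize, which does not speed up the underlying scalar function.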
Newton Raphson Python Code
import math

def f(x):
    return math.exp(x) + 2**(-x) + 2*math.cos(x) - 6

def f_prime(x):
    return math.exp(x) - 2.0**(-x)*math.log(2.0) - 2*math.sin(x)

N = 100          # maximum number of iterations
tol = 0.0005     # tolerance on successive iterates
x_0 = 3.5        # starting point
counter = 0
for j in range(N+1):
    counter = counter + 1
    x = x_0 - f(x_0)/f_prime(x_0)     # one Newton step
    print("x={}".format(x))
    if math.fabs(x - x_0) < tol:
        print("the root is {} with value {}, required {} steps".format(x, f(x), counter))
        break
    x_0 = x
print("completed")
Convergence
- Cheney and Kincaid (2004)3 study the performance of Newton's method
- Assumptions
  - \(f\) has two continuous derivatives, \(f^\prime\) and \(f^{\prime\prime}\)
  - \(r\) is a simple root: \(f^\prime(r)\neq 0\)
- If \(x_0\) is started sufficiently close to \(r\), the iteration converges quadratically to \(r\):
\(|r-x_{n+1}|\leq c|r-x_n|^2\)
- In other words, \(x_{n+1}\) has approximately twice as many correct digits as \(x_n\)!
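The quadratic rate can be seen numerically on the lecture's example function. A sketch, using the step size \(|x_{n+1}-x_n|\) as a proxy for the error (the starting point 1.5 is an illustrative choice near the root):

```python
import math

f = lambda x: math.exp(x) + 2**(-x) + 2*math.cos(x) - 6
f_prime = lambda x: math.exp(x) - 2**(-x)*math.log(2) - 2*math.sin(x)

x = 1.5          # start near the root so the asymptotic rate is visible
steps = []
for _ in range(6):
    x_new = x - f(x)/f_prime(x)
    steps.append(abs(x_new - x))     # |x_{n+1} - x_n| as an error proxy
    x = x_new

for s in steps:
    print(f"{s:.2e}")
```

Each printed step is roughly the square of the previous one, the signature of quadratic convergence.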
Other Comments
Kiusalaas (2013)2 provides the following introduction to the Newton-Raphson Method
The Newton-Raphson algorithm is the best known method of finding roots for a good reason: It is simple and fast. The only drawback of the method is that it uses the derivative \(f^\prime(x)\) of the function as well as the function \(f(x)\) itself. Therefore, the Newton-Raphson method is usable only in problems where \(f^\prime(x)\) can be readily computed.
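When \(f^\prime(x)\) is not readily available, one common workaround (not from the quoted text) is the secant method, which replaces the derivative with a difference quotient through the two most recent iterates. A minimal sketch:

```python
import math

def secant(f, x0, x1, tol=0.0005, max_iter=100):
    """Root finding with the derivative replaced by a difference quotient."""
    for _ in range(max_iter):
        slope = (f(x1) - f(x0)) / (x1 - x0)   # approximates f'(x1)
        x2 = x1 - f(x1)/slope                  # Newton-like step
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    raise RuntimeError("secant method did not converge")

f = lambda x: math.exp(x) + 2**(-x) + 2*math.cos(x) - 6
root = secant(f, 1.5, 2.0)
print(root)
```

The trade-off is a slightly slower (superlinear rather than quadratic) convergence rate in exchange for never needing \(f^\prime\).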
Importance of Good Starting Point
Cheney and Kincaid (2004)3 provide several illustrations of bad starting points and the problems that can occur.
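A classic illustration of this failure mode (a standard textbook example, not necessarily one of Cheney and Kincaid's) is \(f(x)=\arctan(x)\): when the starting point is too far from the root at 0, each Newton step overshoots and the iterates run away:

```python
import math

# Newton's method on f(x) = arctan(x) diverges when |x0| is too large
f = lambda x: math.atan(x)
f_prime = lambda x: 1.0/(1.0 + x*x)

x = 2.0                     # too far from the root at 0
history = [x]
for _ in range(5):
    x = x - f(x)/f_prime(x)
    history.append(x)

for v in history:
    print(v)
```

The magnitudes grow at every step, so the method never returns; this is why a plot or a bracketing check is a good way to choose \(x_0\).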
