Derivatives

In [8]: import numpy as np

import matplotlib.pyplot as plt

In [54]: # Define the function f(x) = 2 + 5*sin(x) + 0.1*sin(30x)
def f_func(x):
    return 2 + 5*np.sin(x) + 0.1*np.sin(30*x)

# Define the noiseless version of the function, g(x) = 2 + 5*sin(x)
def g_func(x):
    return 2 + 5*np.sin(x)

# Analytical derivative of f(x) = d/dx [2 + 5*sin(x) + 0.1*sin(30x)]
def f_adiv(x):
    return 5*np.cos(x) + 3*np.cos(30*x)

# Forward difference method
def forward_diff(func, x, h):
    return (func(x + h) - func(x)) / h

# Central difference method
def central_diff(func, x, h):
    return (func(x + h) - func(x - h)) / (2*h)

## Plotting
fig, ax = plt.subplots()

h = 0.1  # our baseline step size

# Create an array of x values equally spaced with distance h
x = np.arange(-0.25*np.pi, 0.25*np.pi, h)

# Plot the analytical derivative
ax.plot(x, f_adiv(x), color="k", label="Analytical derivative")

# Plot the forward difference for the noiseless function
ax.plot(x, forward_diff(g_func, x, h), color='r', label="Noiseless")

# Plot the forward difference for different h values
for hv in [h/2, h, 2*h]:
    ax.plot(x, forward_diff(f_func, x, hv), label=f"Forward diff, h={hv}")

ax.grid()
ax.legend()
plt.show()

SIMULATED NOISY FUNCTION: QUESTION 1

Small ℎ: a more accurate approximation of the derivative, but it also picks up and amplifies the high-frequency component (the 0.1sin(30x) term), which behaves like noise.

Medium ℎ: a balance between capturing the overall shape and smoothing out the noise.

Large ℎ: the approximation becomes less accurate and smooths the function too much, losing detail in the derivative.

Visualizing the forward difference shows increasing smoothing as ℎ increases, while for small ℎ the estimate is noticeably more sensitive to the high-frequency component; the sketch below quantifies this.
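The following is an added sketch (not part of the original notebook) that quantifies the trade-off, reusing the f_func, f_adiv, and forward_diff defined above: it prints the RMS error of the forward difference against the analytical derivative for a few step sizes.

# Added check of the step-size trade-off (illustrative only), reusing f_func, f_adiv, forward_diff
for hv in [0.05, 0.1, 0.2]:
    xs = np.arange(-0.25*np.pi, 0.25*np.pi, hv)
    err = forward_diff(f_func, xs, hv) - f_adiv(xs)      # deviation from the analytical derivative
    rms = np.sqrt(np.mean(err**2))                       # root-mean-square error over the interval
    print(f"h = {hv:.2f}: RMS error = {rms:.3f}")

Note that the RMS is taken against the full analytical derivative, so the high-frequency term counts as signal here; measuring against g'(x) = 5cos(x) instead would treat it as noise.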

In [55]: # Define the function f(x) = 2 + 5*sin(x) + 0.1*sin(30x)
def f_func(x):
    return 2 + 5*np.sin(x) + 0.1*np.sin(30*x)

# Define the noiseless version of the function, g(x) = 2 + 5*sin(x)
def g_func(x):
    return 2 + 5*np.sin(x)

# Analytical derivative of f(x) = d/dx [2 + 5*sin(x) + 0.1*sin(30x)]
def f_adiv(x):
    return 5*np.cos(x) + 3*np.cos(30*x)

# Forward difference method
def forward_diff(func, x, h):
    return (func(x + h) - func(x)) / h

# Central difference method
def central_diff(func, x, h):
    return (func(x + h) - func(x - h)) / (2*h)

## Plotting
fig, ax = plt.subplots()

h = 0.1  # our baseline step size

# Create an array of x values equally spaced with distance h
x = np.arange(-0.25*np.pi, 0.25*np.pi, h)

# Plot the analytical derivative
ax.plot(x, f_adiv(x), color="k", label="Analytical derivative")

# Plot the central difference for the noiseless function
ax.plot(x, central_diff(g_func, x, h), color='r', label="Noiseless")

# Plot the central difference for different h values
for hv in [h/2, h, 2*h]:
    ax.plot(x, central_diff(f_func, x, hv), label=f"Central diff, h={hv}")

ax.grid()
ax.legend()
plt.show()

SIMULATED NOISY FUNCTION: QUESTION 2

For the same ℎ, the central difference is more accurate than the forward difference and amplifies the noise less. Because it averages over a wider stencil (from x − h to x + h), it also smooths the high-frequency oscillation (the "noise" from the 0.1sin(30x) term) better than the forward difference does.

Implications of noise and step size:

With small ℎ, both methods capture the high-frequency oscillations more precisely, but these oscillations can be interpreted as noise, which may be undesirable in some contexts (e.g., if you are only interested in the large-scale behavior).

Increasing ℎ smooths out these oscillations, but at the cost of losing detail in the overall behavior of the function.

The central difference is more resistant to noise for a given ℎ and is therefore preferable when trying to mitigate the impact of high-frequency oscillations.

In conclusion, if the 0.1sin(30x) term is treated as noise, using a larger ℎ or the central difference method reduces its impact, giving a cleaner approximation of the primary behavior of the function (dominated by 5sin(x)); the sketch below compares the two schemes' errors at the same ℎ.
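As a rough, added check (assuming the f_func, f_adiv, forward_diff, and central_diff defined above), the two schemes can be compared at the same step size; the central difference should come out with the smaller RMS error on the noisy f_func.

# Added comparison of forward vs. central differences at the same h (illustrative only)
hv = 0.1
xs = np.arange(-0.25*np.pi, 0.25*np.pi, hv)
for name, method in [("forward", forward_diff), ("central", central_diff)]:
    err = method(f_func, xs, hv) - f_adiv(xs)            # deviation from the analytical derivative
    print(f"{name} difference, h = {hv}: RMS error = {np.sqrt(np.mean(err**2)):.3f}")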

In [58]: import numpy as np
import matplotlib.pyplot as plt

# Create a grid for x and y
x_grid = np.linspace(-1, 3, 100)
y_grid = np.linspace(-2, 1.5, 100)

X, Y = np.meshgrid(x_grid, y_grid)
phi = X**2 - 2*X + Y**4 - 2*Y**2 + Y

# Calculate partial derivatives using np.roll (forward difference)
h = x_grid[1] - x_grid[0]  # step size in x
# Note: y_grid has a slightly different spacing (3.5/99 vs 4/99); strictly,
# dphi_dy should be divided by the y spacing rather than h.
dphi_dx = np.roll(phi, -1, axis=1) - phi
dphi_dy = np.roll(phi, -1, axis=0) - phi

# Display the results
print("Partial derivative ∂ϕ/∂x (forward difference):")
print(dphi_dx[1:-1, 1:-1] / h)

print("\nPartial derivative ∂ϕ/∂y (forward difference):")
print(dphi_dy[1:-1, 1:-1] / h)

# Visualization
fig, ax = plt.subplots(1, 2, figsize=(12, 5))
im0 = ax[0].matshow(dphi_dx[1:-1, 1:-1] / h, extent=[x_grid.min(), x_grid.max(), y_grid.min(), y_grid.max()], origin='lower')
ax[0].set_title('Partial Derivative ∂ϕ/∂x')
im1 = ax[1].matshow(dphi_dy[1:-1, 1:-1] / h, extent=[x_grid.min(), x_grid.max(), y_grid.min(), y_grid.max()], origin='lower')
ax[1].set_title('Partial Derivative ∂ϕ/∂y')
fig.colorbar(im0, ax=ax[0])
fig.colorbar(im1, ax=ax[1])
plt.show()

Partial derivative ∂ϕ/∂x (forward difference):

[[-3.87878788 -3.7979798  -3.71717172 ...  3.7979798   3.87878788  3.95959596]
 [-3.87878788 -3.7979798  -3.71717172 ...  3.7979798   3.87878788  3.95959596]
 [-3.87878788 -3.7979798  -3.71717172 ...  3.7979798   3.87878788  3.95959596]
 ...
 [-3.87878788 -3.7979798  -3.71717172 ...  3.7979798   3.87878788  3.95959596]
 [-3.87878788 -3.7979798  -3.71717172 ...  3.7979798   3.87878788  3.95959596]
 [-3.87878788 -3.7979798  -3.71717172 ...  3.7979798   3.87878788  3.95959596]]

Partial derivative ∂ϕ/∂y (forward difference):

[[-18.14399712 -18.14399712 -18.14399712 ... -18.14399712 -18.14399712 -18.14399712]
 [-16.8859421  -16.8859421  -16.8859421  ... -16.8859421  -16.8859421  -16.8859421 ]
 [-15.67806188 -15.67806188 -15.67806188 ... -15.67806188 -15.67806188 -15.67806188]
 ...
 [  5.78093687   5.78093687   5.78093687 ...   5.78093687   5.78093687   5.78093687]
 [  6.415618     6.415618     6.415618   ...   6.415618     6.415618     6.415618  ]
 [  7.08827822   7.08827822   7.08827822 ...   7.08827822   7.08827822   7.08827822]]

In [18]: import numpy as np
import matplotlib.pyplot as plt

# Create a grid for x and y
x_grid = np.linspace(-1, 3, 100)
y_grid = np.linspace(-2, 1.5, 100)

X, Y = np.meshgrid(x_grid, y_grid)
phi = X**2 - 2*X + Y**4 - 2*Y**2 + Y

# Calculate partial derivatives using np.roll (forward difference)
h = x_grid[1] - x_grid[0]  # step size in x
dphi_dx = np.roll(phi, -1, axis=1) - phi
dphi_dy = np.roll(phi, -1, axis=0) - phi

# Visualization of Partial Derivatives
fig, ax = plt.subplots(1, 2, figsize=(12, 5))

# Contour plot for ∂ϕ/∂x
contour1 = ax[0].contour(X[1:-1, 1:-1], Y[1:-1, 1:-1], dphi_dx[1:-1, 1:-1] / h, levels=20, cmap='viridis')
ax[0].set_title('Contour of Partial Derivative ∂ϕ/∂x')
ax[0].set_xlabel('x')
ax[0].set_ylabel('y')
fig.colorbar(contour1, ax=ax[0])

# Contour plot for ∂ϕ/∂y
contour2 = ax[1].contour(X[1:-1, 1:-1], Y[1:-1, 1:-1], dphi_dy[1:-1, 1:-1] / h, levels=20, cmap='viridis')
ax[1].set_title('Contour of Partial Derivative ∂ϕ/∂y')
ax[1].set_xlabel('x')
ax[1].set_ylabel('y')
fig.colorbar(contour2, ax=ax[1])

plt.tight_layout()
plt.show()

GRADIENT: QUESTION 1

Here we computed the first partial derivatives using forward differences, revealing the spatial behavior of the scalar field.

∂ϕ/∂x: this derivative measures the rate of change of the scalar field ϕ with respect to x. Positive values indicate that ϕ increases as x increases, while negative values indicate a decrease.

∂ϕ/∂y: similarly, this derivative measures the rate of change with respect to y. The plots show how the field varies in the y direction.

The output visualizes how the scalar field changes in space, identifying regions where the field is increasing or decreasing; a quick consistency check against the analytical derivative follows below.
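As a quick, added sanity check (not part of the original notebook), the forward-difference estimate of ∂ϕ/∂x can be compared with the analytical partial derivative ∂ϕ/∂x = 2x − 2 on the interior points; because ϕ is quadratic in x, the discrepancy should be on the order of the grid spacing h.

# Added verification of the forward-difference gradient against the exact ∂ϕ/∂x = 2x - 2
exact_dx = 2*X - 2
num_dx = dphi_dx / h                     # forward-difference estimate computed above
max_err = np.abs(num_dx[1:-1, 1:-1] - exact_dx[1:-1, 1:-1]).max()
print(f"max |numerical - exact| for ∂ϕ/∂x: {max_err:.4f}")   # expected to be about the grid spacing h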

In [20]: fig, ax = plt.subplots()

x_grid = np.linspace(-1, 3, 100)
y_grid = np.linspace(-2, 1.5, 100)

X, Y = np.meshgrid(x_grid, y_grid)
phi = X**2 - 2*X + Y**4 - 2*Y**2 + Y

# Plot the original field
ms = ax.matshow(phi, vmin=-3, vmax=0.6, origin="lower", alpha=0.3, extent=[x_grid.min(), x_grid.max(), y_grid.min(), y_grid.max()])
cp = ax.contour(X, Y, phi, levels=[-3, -2.4, -1.8, -1, -0.5, 0.6])
ax.set_title('Original Field')
bgcb = fig.colorbar(ms, ax=ax)
fig.colorbar(cp, cax=bgcb.ax)

# Calculate second partial derivatives using central differences
# (axis=1 runs along x, axis=0 along y)
d2phi_dx2 = np.roll(phi, -1, axis=1) - 2 * phi + np.roll(phi, 1, axis=1)
d2phi_dy2 = np.roll(phi, -1, axis=0) - 2 * phi + np.roll(phi, 1, axis=0)

# Laplacian (both terms are divided by h**2 below; strictly, the y term
# should use the slightly different y-grid spacing)
laplacian_phi = d2phi_dx2 + d2phi_dy2

# Display the output
print("\nLaplacian ∇²ϕ(x, y):")
print(laplacian_phi[1:-1, 1:-1] / (h**2))

# Visualization
fig2, ax2 = plt.subplots(figsize=(6, 5))
im = ax2.matshow(laplacian_phi[1:-1, 1:-1] / (h**2), extent=[x_grid.min(), x_grid.max(), y_grid.min(), y_grid.max()], origin='lower')
ax2.set_title('Laplacian ∇²ϕ(x, y)')
fig2.colorbar(im, ax=ax2)
plt.show()

Laplacian ∇²ϕ(x, y):

[[34.40165465 34.40165465 34.40165465 ... 34.40165465 34.40165465 34.40165465]
 [33.13686183 33.13686183 33.13686183 ... 33.13686183 33.13686183 33.13686183]
 [31.89503542 31.89503542 31.89503542 ... 31.89503542 31.89503542 31.89503542]
 ...
 [16.79134224 16.79134224 16.79134224 ... 16.79134224 16.79134224 16.79134224]
 [17.70835804 17.70835804 17.70835804 ... 17.70835804 17.70835804 17.70835804]
 [18.64834025 18.64834025 18.64834025 ... 18.64834025 18.64834025 18.64834025]]


In [17]: # Calculate second partial derivatives using central differences
# (axis=1 runs along x, axis=0 along y)
d2phi_dx2 = np.roll(phi, -1, axis=1) - 2 * phi + np.roll(phi, 1, axis=1)
d2phi_dy2 = np.roll(phi, -1, axis=0) - 2 * phi + np.roll(phi, 1, axis=0)

# Laplacian
laplacian_phi = d2phi_dx2 + d2phi_dy2

# Visualization of the Laplacian
plt.figure(figsize=(6, 5))
contour_laplacian = plt.contour(X[1:-1, 1:-1], Y[1:-1, 1:-1], laplacian_phi[1:-1, 1:-1] / (h**2), levels=20, cmap='plasma')
plt.title('Contour of Laplacian ∇²ϕ(x, y)')
plt.xlabel('x')
plt.ylabel('y')
plt.colorbar(contour_laplacian)
plt.show()

GRADIENT: QUESTION 2

Here we used central differences to calculate the Laplacian, which provides insight into the curvature and overall distribution of the field. The Laplacian ∇²ϕ describes the local curvature of the scalar field: positive values indicate locally convex regions (local minima), while negative values indicate locally concave regions (local maxima). Regions where the Laplacian is zero imply that the function is locally flat. The output helps in understanding how the scalar field behaves spatially, indicating where the field is "accelerating" or "decelerating".

Original Scalar Field: the first plot shows the original scalar field ϕ(x, y) = x^2 − 2x + y^4 − 2y^2 + y, a combination of polynomial terms in x and y. The contours represent levels of constant ϕ across the xy-plane. The regions of interest include areas of high values (peaks) and low values (valleys or depressions).

Laplacian: the second plot visualizes the Laplacian of the field. The Laplacian indicates how ϕ curves in space; positive values mean the field is locally convex (a local minimum), while negative values indicate local concavity (a local maximum). A value of zero suggests flatness in that region, indicating stability in the potential field described by ϕ.

Physical Implications: the original field is relevant in contexts like potential energy surfaces, where identifying local minima and maxima informs stability and equilibrium conditions. The Laplacian is crucial in physical phenomena like heat distribution and fluid dynamics, where it indicates how heat or other quantities diffuse across a medium.
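To close the loop, here is an added sketch (not from the original notebook) comparing the central-difference Laplacian with the analytical result ∇²ϕ = 2 + (12y^2 − 4) = 12y^2 − 2. Unlike the plots above, which divide both second differences by h**2, this version uses each direction's own grid spacing, so the agreement should be tight (the truncation error is O(h^2)).

# Added check of the central-difference Laplacian against the exact ∇²ϕ = 12*y**2 - 2
hx = x_grid[1] - x_grid[0]
hy = y_grid[1] - y_grid[0]
d2x = np.roll(phi, -1, axis=1) - 2*phi + np.roll(phi, 1, axis=1)   # second difference along x
d2y = np.roll(phi, -1, axis=0) - 2*phi + np.roll(phi, 1, axis=0)   # second difference along y
num_lap = d2x / hx**2 + d2y / hy**2                                # use each direction's own spacing
exact_lap = 12*Y**2 - 2
max_err = np.abs(num_lap[1:-1, 1:-1] - exact_lap[1:-1, 1:-1]).max()
print(f"max |numerical - exact| Laplacian: {max_err:.4f}")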
