Iterative Numerical Optimization


📌 Overview

This repository contains Jupyter Notebooks implementing and analyzing numerical algorithms to find the global minimum of a non-convex, oscillatory mathematical function. The project focuses on comparing the computational performance, convergence rates, and stability of two classical iterative methods: Gradient Descent and Newton's Method.

🧮 Problem Statement

The objective is to minimize the following multivariable function:

$$ f(x, y) = \frac{1}{2 + \cos(x + y)} + \frac{(x - y)^2 + (x + y)^2}{20} $$

Function Properties:

  • Oscillatory Nature: The trigonometric term $\frac{1}{2 + \cos(x + y)}$ creates a landscape fraught with local minima and ridges, challenging the algorithms' ability to find the global optimum without getting stuck.
  • Coercivity: The quadratic term $\frac{(x - y)^2 + (x + y)^2}{20}$ makes the function coercive: as $\|(x, y)\| \to \infty$, $f(x, y) \to \infty$ as well, acting as a "funnel" that guarantees a global minimum exists near the origin.
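
As a concrete reference, the objective translates directly into NumPy. The following is a minimal sketch (the notebooks may organize this differently); note that the quadratic term algebraically simplifies to $(x^2 + y^2)/10$:

import numpy as np

def f(x, y):
    """Oscillatory barrier plus a coercive quadratic bowl."""
    return 1.0 / (2.0 + np.cos(x + y)) + ((x - y)**2 + (x + y)**2) / 20.0

# Vectorized evaluation works unchanged, e.g. for contour plots:
X, Y = np.meshgrid(np.linspace(-5.0, 5.0, 200), np.linspace(-5.0, 5.0, 200))
Z = f(X, Y)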

⚙️ Algorithms and Execution Flow

1. Gradient Descent (Steepest Descent)

  • Flow: Uses first-order information (the gradient vector $g = \nabla f(x)$) to descend the function's surface. The position is updated iteratively by moving in the opposite direction of the gradient, scaled by a fixed learning rate (step size) $\alpha = 0.1$.
  • Stopping Criterion: The algorithm halts when the Euclidean norm of the update step falls below the tolerance: $\|x_{new} - x\| < \epsilon$, with $\epsilon = 10^{-6}$ (a minimal loop is sketched below).
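
A minimal loop matching the description above might look like the following. Here grad_f is derived by hand from the stated $f$ (the shared trigonometric term plus $x/5$ and $y/5$ from the quadratic part), and the helper names are illustrative rather than the notebooks' own:

import numpy as np

def grad_f(p):
    x, y = p
    s = np.sin(x + y) / (2.0 + np.cos(x + y))**2  # derivative of the trig term
    return np.array([s + x / 5.0, s + y / 5.0])   # plus the quadratic term's gradient

def gradient_descent(x0, alpha=0.1, eps=1e-6, max_iter=100_000):
    x = np.asarray(x0, dtype=float)
    path = [x.copy()]
    for _ in range(max_iter):
        x_new = x - alpha * grad_f(x)             # step against the gradient
        path.append(x_new.copy())
        if np.linalg.norm(x_new - x) < eps:       # stopping criterion from above
            break
        x = x_new
    return x_new, path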

2. Newton's Method

  • Flow: Incorporates second-order information by evaluating both the gradient $g$ and the Hessian matrix $H = \nabla^2 f(x)$.
  • Performance Insight: Instead of explicitly inverting the Hessian (np.linalg.inv), the Newton step $\Delta$ is obtained by solving the linear system $H \cdot \Delta = g$. This reduces floating-point error propagation, improves numerical stability, and lowers computational cost, a standard practice in numerical linear algebra.
  • Stopping Criterion: Identical to Gradient Descent ($\|x_{new} - x\| < \epsilon$). A minimal Newton loop is sketched after this list.
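
A matching Newton sketch, reusing grad_f from the snippet above. Because the trigonometric term depends only on $x + y$, all four of its second partial derivatives are equal, which makes the Hessian easy to write down by hand; as before, the names are illustrative. Far from the minimum the Hessian can be indefinite, in which case a Newton step may head toward a different stationary point:

def hess_f(p):
    x, y = p
    d = 2.0 + np.cos(x + y)
    # second derivative of 1/(2 + cos(s)) with respect to s = x + y
    t = np.cos(x + y) / d**2 + 2.0 * np.sin(x + y)**2 / d**3
    return np.array([[t + 0.2, t],
                     [t, t + 0.2]])               # 0.2 = quadratic term's curvature

def newton(x0, eps=1e-6, max_iter=1_000):
    x = np.asarray(x0, dtype=float)
    path = [x.copy()]
    for _ in range(max_iter):
        delta = np.linalg.solve(hess_f(x), grad_f(x))  # solve H * delta = g, no inverse
        x_new = x - delta
        path.append(x_new.copy())
        if np.linalg.norm(x_new - x) < eps:
            break
        x = x_new
    return x_new, path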

💻 Key Code Implementation

Below is a breakdown of the core update per iteration, emphasizing numerical stability over a naive transcription of the math:

# Gradient Descent state update
x_new = x - alpha * g

# Newton's Method state update (solve the linear system instead of inverting)
delta = np.linalg.solve(H, g)  # solves H * delta = g without forming H^-1
x_new = x - delta
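
For contrast, here is the naive alternative on a toy 2x2 system (a stand-in, not the project's actual Hessian). Both paths give the same answer here, but np.linalg.solve performs a single LU factorization instead of forming $H^{-1}$ explicitly, which is cheaper and propagates less rounding error on ill-conditioned systems:

import numpy as np

H = np.array([[2.0, 1.0],
              [1.0, 2.0]])            # stand-in symmetric Hessian
g = np.array([1.0, -1.0])             # stand-in gradient

delta_inv = np.linalg.inv(H) @ g      # naive: explicit inversion, then multiply
delta_slv = np.linalg.solve(H, g)     # preferred: LU factorization + substitution
assert np.allclose(delta_inv, delta_slv)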

📥 Inputs and 📤 Outputs

Both algorithms are tested under identical conditions to measure their robustness:

  • Inputs: A variety of starting coordinates $x_0$ (e.g., [-50, 5], [5, -2]), a tolerance ($\epsilon = 10^{-6}$), and an iteration limit.
  • Outputs Generated (a usage sketch follows this list):
    • minimo: The final coordinates of the minimum, $[x^*, y^*]$.
    • valor_minimo: The function value evaluated at that minimum.
    • iteraciones: The total number of iterations performed before the tolerance criterion was met.
    • trayectoria: An array recording each iterate, tracing the step-by-step path across the plane and suitable for overlaying on an np.meshgrid contour plot.
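
Assuming wrappers shaped like the sketches above (the notebooks' actual signatures may differ), collecting these outputs might look like:

x0 = [-50.0, 5.0]                     # one of the starting points listed above
minimo, trayectoria = gradient_descent(x0, alpha=0.1, eps=1e-6)
valor_minimo = f(*minimo)             # function value at the reported minimum
iteraciones = len(trayectoria) - 1    # steps taken before tolerance was met
print(minimo, valor_minimo, iteraciones)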

📊 Convergence Insights

A comparative, data-driven analysis is assembled into Pandas DataFrames at the end of the execution, making the efficiency gap between the two methods explicit (a minimal construction is sketched after the list below):

  • Quadratic convergence: Newton's method frequently converges in orders of magnitude fewer iterations.
  • Linear/sub-linear convergence: Gradient descent struggles in flat valleys, requiring significantly more iterations to trigger the halting criterion.
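
A minimal construction of such a comparison table, again assuming the sketches above; no results are hard-coded, the values come from the runs themselves:

import pandas as pd

x0 = [5.0, -2.0]                      # a starting point from the inputs above
min_gd, tray_gd = gradient_descent(x0)
min_nt, tray_nt = newton(x0)

comparacion = pd.DataFrame(
    {"iteraciones": [len(tray_gd) - 1, len(tray_nt) - 1],
     "valor_minimo": [f(*min_gd), f(*min_nt)]},
    index=["Gradient Descent", "Newton's Method"],
)
print(comparacion)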

🚀 Requirements and Setup

To run the notebooks locally, set up a Python environment with Jupyter and the required libraries:

pip install numpy pandas matplotlib jupyter

Then launch jupyter notebook from the repository root and open Algoritmos.ipynb or optimization.ipynb.
