
Local LLM mathematical assistant with real-time tool integration, symbolic computation, and LaTeX rendering using LM Studio + Python


Advanced Math AI Assistant

Author: Luiz Roberto Meier
Date: August 25, 2025

An intelligent mathematical assistant that seamlessly combines LM Studio's local LLM capabilities with external computational tools. The system automatically decides when to use internal knowledge versus external computation, providing perfect LaTeX rendering in Jupyter notebooks.

Features

  • Seamless Intelligence: Automatically chooses between internal knowledge and external computation
  • Advanced Math Tools: Eigenvalues, differential equations, quantum mechanics, numerical analysis
  • Perfect LaTeX Rendering: Beautiful mathematical expressions in Jupyter
  • Local & Private: Runs entirely on your machine
  • Real-time Tool Integration: Live communication between LLM and computational backend

Demo

Seamless Mathematical Intelligence

Jupyter Interface

Advanced Tool Integration (Flask server running on http://localhost:5000)

Tool Execution

LM Studio with model loaded and API server running on http://localhost:1234


Prerequisites

  • Linux (tested on Linux Mint)
  • Python 3.8+
  • At least 32GB RAM (recommended for 20B model)
  • NVIDIA RTX 4060 or better (RTX 4090 recommended for optimal performance)

Installation

Step 1: Install LM Studio

  1. Download LM Studio 0.3.6+ from lmstudio.ai
  2. Install the AppImage:
    chmod +x LM-Studio-*.AppImage
    ./LM-Studio-*.AppImage

Step 2: Download the Model

In LM Studio:

  1. Go to "Models" tab
  2. Search for openai/gpt-oss-20b
  3. Download the model
  4. Load the model (ensure it shows the tool icon ⚒️)

Step 3: Start LM Studio API Server

  1. Go to "Developer" tab in LM Studio
  2. Click "Start Server"
  3. Note the URL (typically http://localhost:1234)
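Before moving on, you can confirm the API server is reachable by querying LM Studio's OpenAI-compatible `/v1/models` endpoint (a sketch using only the standard library; the helper name is illustrative):

```python
import json
from urllib.request import urlopen
from urllib.error import URLError

def lm_studio_is_up(base_url: str = "http://localhost:1234") -> bool:
    """Return True if the LM Studio API server answers on /v1/models."""
    try:
        with urlopen(f"{base_url}/v1/models", timeout=3) as resp:
            data = json.load(resp)
            # The OpenAI-compatible endpoint lists loaded models under "data".
            print("Models available:", [m.get("id") for m in data.get("data", [])])
            return True
    except (URLError, OSError) as exc:
        print(f"LM Studio server not reachable: {exc}")
        return False
```

If this prints an empty model list, the server is running but no model is loaded yet.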

Step 4: Set Up Python Environment

# Clone repository
git clone https://github.com/luizrobertomeier/Advanced-Math-AI-Assistant
cd Advanced-Math-AI-Assistant

# Create virtual environment
python3 -m venv venv
source venv/bin/activate

# Install dependencies
pip install -r requirements.txt

Step 5: Start Math Tools Server

# In terminal 1 (with venv activated)
python server.py

Server will start at http://127.0.0.1:5000
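A quick way to verify the tools server is actually listening (a minimal sketch; it only checks that the TCP port accepts connections, not that the Flask routes work):

```python
import socket

def port_open(host: str = "127.0.0.1", port: int = 5000, timeout: float = 2.0) -> bool:
    """Return True if something is accepting TCP connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

if port_open():
    print("Math tools server is listening on 127.0.0.1:5000")
else:
    print("Nothing is listening on port 5000 -- is server.py running?")
```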

Step 6: Launch Jupyter Interface

# In terminal 2 (with venv activated)
jupyter notebook

Open Seamless Math Assistant.ipynb and run the first cell.

Usage

Basic Examples

# Simple arithmetic (uses external computation for precision)
assistant.ask("What's 847 * 293?")

# Conceptual questions (uses internal knowledge)
assistant.ask("What are eigenvalues?")

# Advanced computations (uses external tools)
assistant.ask("Find eigenvalues of [[3, 1, 0], [1, 3, 1], [0, 1, 3]]")

# Mixed questions (uses both internal knowledge and external computation)
assistant.ask("Explain eigenvalues and find them for [[2, 1], [1, 2]]")

Available Mathematical Tools

  • Basic Arithmetic: Precise calculations
  • Eigenvalue Analysis: Matrix eigenvalues and eigenvectors
  • Differential Equations: Symbolic ODE system solutions
  • Quantum Mechanics: Harmonic oscillator solutions
  • Numerical Methods: Numerical ODE integration
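As an example of what the eigenvalue tool computes, the same result can be reproduced directly with SymPy (exact, symbolic) or NumPy (floating-point); the actual implementation in server.py may differ:

```python
import numpy as np
import sympy as sp

# Symbolic, exact eigenvalues with SymPy
M = sp.Matrix([[2, 1], [1, 2]])
print(M.eigenvals())  # maps eigenvalue -> algebraic multiplicity: {3: 1, 1: 1}

# Numeric eigenvalues with NumPy
vals = np.linalg.eigvals(np.array([[2.0, 1.0], [1.0, 2.0]]))
print(sorted(vals.real))  # approximately [1.0, 3.0]
```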

How It Works

  1. Question Analysis: LLM analyzes your mathematical question
  2. Decision Making: Automatically chooses internal knowledge vs external tools
  3. Tool Execution: Calls appropriate mathematical tools when needed
  4. Result Integration: Combines computed results with natural language explanations
  5. LaTeX Rendering: Displays beautiful mathematical expressions
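Step 3 above boils down to a small dispatch loop: the LLM's response either carries plain text or a structured tool call, which gets routed to the matching Python function. A minimal sketch (the tool registry and message shape are illustrative; the real system routes calls through the Flask server):

```python
import json

# Registry of callable tools; server.py exposes these over HTTP in the real system.
TOOLS = {
    "multiply": lambda a, b: a * b,
}

def dispatch(llm_message: dict):
    """Execute any tool calls the LLM requested, else return its plain text."""
    calls = llm_message.get("tool_calls") or []
    if not calls:
        return llm_message.get("content")
    results = []
    for call in calls:
        fn = call["function"]
        args = json.loads(fn["arguments"])  # arguments arrive as a JSON string
        results.append(TOOLS[fn["name"]](**args))
    return results

# Simulated LLM response asking for 847 * 293
msg = {"tool_calls": [{"function": {"name": "multiply",
                                    "arguments": '{"a": 847, "b": 293}'}}]}
print(dispatch(msg))  # [248171]
```

The computed results are then fed back to the model so it can phrase the final answer (step 4).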

Troubleshooting

LM Studio Issues

  • Ensure model shows tool icon ⚒️ (indicates function calling support)
  • Check API server is running on port 1234
  • Verify model is openai/gpt-oss-20b

Python Environment Issues

# Recreate environment if needed
rm -rf venv
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt

Tool Server Issues

  • Check server is running on port 5000
  • Ensure no firewall blocking local connections
  • Verify SymPy and NumPy are installed correctly
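A quick sanity check for the math libraries (run inside the activated venv):

```python
import numpy
import sympy

print("NumPy:", numpy.__version__)
print("SymPy:", sympy.__version__)

# One tiny computation per library to confirm they actually work
assert numpy.dot([1, 2], [3, 4]) == 11
x = sympy.Symbol("x")
assert sympy.integrate(x, x) == x**2 / 2
print("SymPy and NumPy are installed correctly")
```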

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • LM Studio for local LLM infrastructure
  • SymPy for symbolic mathematics
  • NumPy/SciPy for numerical computation
  • OpenAI for the GPT-OSS-20B model architecture
