Toward In-Context Teaching

This repository contains code for our paper, Toward In-Context Teaching: Adapting Examples to Students' Misconceptions.

Citation

@inproceedings{ross2024incontext,
    title = "Toward In-Context Teaching: Adapting Examples to Students' Misconceptions",
    author = "Ross, Alexis and Andreas, Jacob",
    booktitle = "ACL 2024",
    year = "2024",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/2405.04495",
}

Installation

  1. Clone the repository.

    git clone https://github.com/alexisjihyeross/adaptive_teaching
    cd adaptive_teaching
  2. Download and install Conda.

  3. Create a Conda environment.

    conda create -n pedagogy python=3.7
  4. Activate the environment.

    conda activate pedagogy
  5. Install the requirements.

    pip3 install -r requirements.txt
  6. Set environment variables.

    Experiments with GPT-based models require setting OpenAI environment variables:

    export OPENAI_API_KEY={KEY}
    export OPENAI_ORGANIZATION={ORG_ID}
    export PYTHONPATH=./
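
    If you do not want to re-export these in every new shell, one option is to keep them in a local file and source it before running experiments (the file name below is illustrative; it is not part of the repository):

    # hypothetical convenience file, not created by the repo
    echo 'export OPENAI_API_KEY={KEY}' > .env.local
    echo 'export OPENAI_ORGANIZATION={ORG_ID}' >> .env.local
    echo 'export PYTHONPATH=./' >> .env.local
    source .env.local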

Synthetic Experiments

  1. Run Experiments

    The scripts in the scripts/ directory evaluate AToM, GPT-4-based teachers, and other baselines against the synthetic learners in the AdapT evaluation framework.

    For example, to run experiments for the functions domain, you could use the following command:

    bash scripts/run_functions.sh

    The code logs to wandb by default. Set the WANDB_PROJECT variable in these scripts to control which wandb project results are logged to.
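
    For example, assuming the script reads WANDB_PROJECT from the environment (if the project name is instead hard-coded, edit the corresponding line in scripts/run_functions.sh), you could run:

    # the project name below is a placeholder, not fixed by the repo
    export WANDB_PROJECT=my-adaptive-teaching-runs
    bash scripts/run_functions.sh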

  2. View Results

    You can use the following command to download results from wandb:

    python src/analyze.py --entity ${ENTITY} --project ${PROJECT}
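
    For example, with placeholder values for your wandb entity and project (substitute your own):

    # both values below are placeholders
    ENTITY=my-wandb-username
    PROJECT=my-adaptive-teaching-runs
    python src/analyze.py --entity ${ENTITY} --project ${PROJECT}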

Human Experiments

The script scripts/run_human.sh runs a server for the human experiments.

By default, it runs the experiments in the paper: 22 experimental conditions (11 target concepts × 2 student types), 5 seeds each, for 3 different teachers (Random, AToM, and GPT-4), i.e. 22 × 5 × 3 = 330 runs in total.

Results are saved locally to results/human/experiments.
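
To sanity-check that runs are being written, you can list the output directory (the exact layout depends on the runs and is not documented here):

    # quick inspection of locally saved results
    ls -R results/human/experiments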
