# Multi-View Reasoning: Consistent Contrastive Learning for Math Reasoning

This repo contains the code and data for our three papers on math reasoning:

  • EMNLP 2022 Findings: Multi-View Reasoning: Consistent Contrastive Learning for Math Word Problem

  • EMNLP 2023 Main: An Expression Tree Decoding Strategy for Mathematical Equation Generation

  • IEEE/ACM Transactions on Audio, Speech, and Language Processing (TASLP): Specialized Mathematical Solving by a Step-By-Step Expression Chain Generation


## 🌿🌿 Requirements

  • `transformers` (`pip3 install transformers`)
  • PyTorch > 1.7.1

## 🌿🌿 Description

  • We provide code for both the Chinese (Math23k) and English (MathQA) datasets
  • We also provide preprocessing code that converts each equation into its pre-order and post-order traversal forms
  • We adopt RoBERTa-base and Chinese-BERT from HuggingFace for the two datasets, so you either need an internet connection or need to copy the HuggingFace weights from elsewhere and load them from a local path (see the sketch after this list)
  • Part of the bottom-up view code is based on Deductive-MWP (https://github.com/allanj/Deductive-MWP.git)
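
For offline use, a minimal loading sketch (the local directory path is a hypothetical example, and `AutoTokenizer`/`AutoModel` stand in for whatever classes the training scripts actually use):

    # Sketch: load pretrained weights from a local directory instead of
    # the HuggingFace Hub, so no internet access is needed at train time.
    from transformers import AutoTokenizer, AutoModel

    LOCAL_DIR = "./pretrained/roberta-base"  # hypothetical path; copy the weights here first

    tokenizer = AutoTokenizer.from_pretrained(LOCAL_DIR)
    encoder = AutoModel.from_pretrained(LOCAL_DIR)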

## 🌿🌿 Usage

  • To reproduce our results, you can either use the dataset we provide, in which every equation has already been processed into its pre-order and post-order forms, or process the original dataset yourself with our code

  • To run the code for Math23k, in the corresponding folder:

    python main_math23k.py
    
  • To preprocess the original dataset for equation augmentation:

    cd preprocess
    python process_math23k.py
    
  • The training code for the two datasets, MathQA and Math23k, differs slightly. First, they use different pretrained models: MathQA is in English, so it employs "roberta-base" rather than a Chinese RoBERTa. Second, MathQA uses a larger set of constants (listed below). Third, MathQA has two extra operators compared to Math23k, '^' and '^_rev' (a sketch of the operator-label semantics follows the lists below).

    constants = ['100.0', '1.0', '2.0', '3.0', '4.0', '10.0', '1000.0', '60.0', '0.5', '3600.0', '12.0', '0.2778', '3.1416', '3.6', '0.25', '5.0', '6.0', '360.0', '52.0', '180.0']
    uni_labels = ['+', '-', '-_rev', '*', '/', '/_rev', '^', '^_rev']
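
The '_rev' labels presumably follow the Deductive-MWP convention, where 'op_rev' applies a non-commutative operator with its two operands swapped (an assumption based on that codebase, not verified against this repo). A minimal sketch:

    # Hedged sketch of the operator-label semantics (assumption based on
    # the Deductive-MWP convention): 'op_rev' swaps the two operands.
    OPS = {
        '+':     lambda a, b: a + b,
        '-':     lambda a, b: a - b,
        '-_rev': lambda a, b: b - a,   # a '-_rev' b == b - a
        '*':     lambda a, b: a * b,
        '/':     lambda a, b: a / b,
        '/_rev': lambda a, b: b / a,
        '^':     lambda a, b: a ** b,
        '^_rev': lambda a, b: b ** a,
    }
    print(OPS['-_rev'](3.0, 5.0))  # 2.0, i.e. 5 - 3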

## 🌿🌿 Details

From the top-down view:

*(Figure: top-down view.)*

From the bottom-up view:

*(Figure: bottom-up view.)*
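
For intuition, the two views roughly correspond to the two traversal orders of the same equation tree: pre-order for the top-down view (operator first) and post-order for the bottom-up view (operands first). A toy sketch (the `Node` class and function names are hypothetical, not the repo's actual preprocessing code):

    # Toy sketch: one equation tree, two traversal orders (hypothetical
    # helper names; not the repo's actual preprocessing code).
    class Node:
        def __init__(self, val, left=None, right=None):
            self.val, self.left, self.right = val, left, right

    def pre_order(n):   # top-down view: operator before operands
        return [] if n is None else [n.val] + pre_order(n.left) + pre_order(n.right)

    def post_order(n):  # bottom-up view: operands before operator
        return [] if n is None else post_order(n.left) + post_order(n.right) + [n.val]

    eq = Node('*', Node('+', Node('3'), Node('5')), Node('2'))  # (3 + 5) * 2
    print(pre_order(eq))   # ['*', '+', '3', '5', '2']
    print(post_order(eq))  # ['3', '5', '+', '2', '*']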

## 🌿🌿 Citation

If you find this work useful, please cite our papers:

@inproceedings{zhang-etal-2022-multi-view,
    title = "Multi-View Reasoning: Consistent Contrastive Learning for Math Word Problem",
    author = "Zhang, Wenqi  and
      Shen, Yongliang  and
      Ma, Yanna  and
      Cheng, Xiaoxia  and
      Tan, Zeqi  and
      Nong, Qingpeng  and
      Lu, Weiming",
    booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2022",
    month = dec,
    year = "2022",
    address = "Abu Dhabi, United Arab Emirates",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2022.findings-emnlp.79",
    doi = "10.18653/v1/2022.findings-emnlp.79"
}

@inproceedings{zhang-etal-2023-expression,
    title = "An Expression Tree Decoding Strategy for Mathematical Equation Generation",
    author = "Zhang, Wenqi  and
      Shen, Yongliang  and
      Nong, Qingpeng  and
      Tan, Zeqi  and
      Ma, Yanna  and
      Lu, Weiming",
    booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing",
    month = dec,
    year = "2023",
    address = "Singapore",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.emnlp-main.29",
    doi = "10.18653/v1/2023.emnlp-main.29",
}

@article{10552332,
  author={Zhang, Wenqi and Shen, Yongliang and Hou, Guiyang and Wang, Kuangyi and Lu, Weiming},
  journal={IEEE/ACM Transactions on Audio, Speech, and Language Processing}, 
  title={Specialized Mathematical Solving by a Step-By-Step Expression Chain Generation}, 
  year={2024},
  volume={32},
  number={},
  pages={3128-3140},
  keywords={Cognition;Mathematical models;Decoding;Task analysis;Annotations;Collaboration;Labeling;Math word problem;math reasoning;expression generation;collaboration;step-by-step},
  doi={10.1109/TASLP.2024.3410028}}
