This project addresses the challenge of deciphering handwritten mathematical symbols. By leveraging convolutional neural networks (CNNs) and transfer learning, we explored the strengths and weaknesses of different preprocessing pipelines and network architectures. Our approach fine-tuned pre-trained models such as Xception and MobileNetV2 on a dataset enhanced through normalization and augmentation: pixel values were standardized, and rotations and reflections were applied to improve robustness and reduce overfitting. Our experiments achieved 88% accuracy on a small test set, highlighting the potential of CNNs for mathematical notation recognition.
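The normalization and augmentation steps described above can be sketched as follows. This is a minimal NumPy illustration, not the project's actual code; the function names `normalize` and `augment` are hypothetical:

```python
import numpy as np

def normalize(img):
    """Standardize uint8 pixel values into the [0, 1] range."""
    return img.astype(np.float32) / 255.0

def augment(img):
    """Generate rotated and reflected variants of a symbol image."""
    variants = [img]
    for k in (1, 2, 3):              # 90-, 180-, and 270-degree rotations
        variants.append(np.rot90(img, k))
    variants.append(np.fliplr(img))  # horizontal reflection
    variants.append(np.flipud(img))  # vertical reflection
    return variants

# Example: a tiny 2x2 "image" expanded into a normalized, augmented batch
img = np.array([[0, 255], [255, 0]], dtype=np.uint8)
batch = [normalize(v) for v in augment(img)]
```

Applying such transforms to each training image multiplies the effective dataset size, which is one common way to reduce overfitting when fine-tuning a pre-trained CNN on a small dataset.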
See our source code here.