Acta Anthropologica Sinica ›› 2026, Vol. 45 ›› Issue (02): 296-309.doi: 10.16359/j.1000-3193/AAS.2025.0024

• Research Articles •

Determining gender based on the images of human barefoot footprints

YAO Li1,2, LUO Zhen1, GE Heng1, MA Xiaoyun3, LIN Zhehan1

  1. College of Forensic Science, Criminal Investigation Police University of China, Shenyang 110035
    2. Key Laboratory of Trace Examination and Identification Technology, Ministry of Public Security, Criminal Investigation Police University of China, Shenyang 110035
    3. Anti-drug Information and Technology Center of the Ministry of Public Security, Beijing 100193
  • Received:2024-05-17 Revised:2025-03-13 Online:2026-04-15 Published:2026-04-17

Abstract:

To explore the correlation between barefoot footprints and human gender, this study proposes a gender analysis model based on a Back Propagation (BP) neural network optimized by a genetic algorithm. The model integrates footprint morphology, biomechanical principles, and traditional footprint analysis theory. A total of 500 barefoot footprint images were collected from healthy subjects aged 20-30 years. The images were preprocessed in MATLAB (denoising, enhancement, binarization, and edge detection), after which 54 features covering grayscale distribution, geometric characteristics, and morphological parameters were extracted. A genetic algorithm was then used to select the optimal feature subset, identifying seven key discriminative features, including heel-to-ball distance (dFT6), barefoot ball width (dS1S2), and barefoot length (dOP).
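The genetic-algorithm feature selection described above can be sketched as follows. This is a minimal illustration, not the study's implementation: the 500 real footprint images are replaced by synthetic data in which only the first 7 of 54 features carry a gender-related mean shift (mimicking features such as dFT6, dS1S2, and dOP), and the fitness function is a simple Fisher-style separability score with a subset-size penalty, both of which are assumptions rather than the paper's criteria.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 200 "footprints" x 54 features, labels 0/1 for
# gender. Only the first 7 features carry a class-mean shift.
X = rng.normal(size=(200, 54))
y = rng.integers(0, 2, size=200)
X[y == 1, :7] += 2.0

def fitness(mask):
    """Fisher-style separability of the selected features, minus a penalty
    proportional to subset size (higher is better)."""
    if mask.sum() == 0:
        return 0.0
    a, b = X[y == 0][:, mask], X[y == 1][:, mask]
    sep = (a.mean(0) - b.mean(0)) ** 2 / (a.var(0) + b.var(0) + 1e-9)
    return sep.sum() - 0.1 * mask.sum()

# Simple GA over binary feature masks: tournament selection,
# uniform crossover, bit-flip mutation.
pop = rng.random((40, 54)) < 0.2
for gen in range(60):
    scores = np.array([fitness(m) for m in pop])
    winners = np.array([max(rng.choice(40, 3, replace=False),
                            key=lambda i: scores[i]) for _ in range(40)])
    parents = pop[winners]
    cross = rng.random((40, 54)) < 0.5
    children = np.where(cross, parents, parents[::-1])  # uniform crossover
    children ^= rng.random((40, 54)) < 0.02             # bit-flip mutation
    pop = children

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected features:", np.flatnonzero(best))
```

On this synthetic data the fitness landscape rewards exactly the informative features: selecting only the first seven scores higher than selecting all 54, so the GA is pushed toward a compact discriminative subset, which is the behavior the study exploits.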

The optimized BP neural network classifier achieved an average accuracy of 84.34% in gender classification, significantly outperforming traditional BP networks (70.10%), logistic regression (75.62%), and support vector machines (78.94%). The analysis revealed that features such as heel-to-ball distance and barefoot ball width exhibited strong sexual dimorphism, consistent with anatomical differences between genders. For instance, males generally displayed larger heel-to-ball distances and wider ball regions than females. In addition, geometric features such as the rectangularity and aspect ratio of the metatarsal-arch-heel region further contributed to classification accuracy, reflecting structural variation in footprint morphology.
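A BP (backpropagation) network of the kind the classifier builds on can be sketched in a few lines. This is a generic one-hidden-layer network trained by plain gradient descent on synthetic stand-ins for the seven selected features, not the authors' optimized architecture; the layer sizes, activations, and learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the 7 selected footprint features: the two classes
# differ in mean, mimicking sexual dimorphism in, e.g., heel-to-ball distance.
n = 400
y = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, 7)) + y[:, None] * 1.5

# One-hidden-layer BP network: 7 -> 10 (tanh) -> 1 (sigmoid),
# trained with gradient descent on binary cross-entropy.
W1 = rng.normal(scale=0.5, size=(7, 10)); b1 = np.zeros(10)
W2 = rng.normal(scale=0.5, size=(10, 1)); b2 = np.zeros(1)
lr = 0.1
for epoch in range(300):
    h = np.tanh(X @ W1 + b1)                  # hidden activations
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # predicted P(class 1)
    d_out = (p - y[:, None]) / n              # dLoss/dlogit for sigmoid + BCE
    dW2 = h.T @ d_out; db2 = d_out.sum(0)
    d_hid = (d_out @ W2.T) * (1 - h ** 2)     # backprop through tanh
    dW1 = X.T @ d_hid; db1 = d_hid.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

acc = ((p > 0.5).ravel() == y).mean()
print(f"training accuracy: {acc:.2%}")
```

In the study, the genetic algorithm additionally tunes the network's parameters and feature subset, which is what lifts accuracy above that of a plain BP network like this one.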

Statistical significance tests (e.g., t-tests and Mann-Whitney U tests) confirmed that the selected features demonstrated significant gender-related differences (p≤0.05), with high effect sizes (Cohen’s d>0.5) for critical parameters. Notably, while some statistical features (e.g., probability density functions) lacked individual significance, their nonlinear interactions within the neural network enhanced overall model performance. This highlights the importance of feature synergy in machine learning-based classification.
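The effect-size and significance checks above can be illustrated with a short computation. The sample values below are synthetic (the means, standard deviations, and millimeter units are invented for illustration, not the study's measurements); Cohen's d uses the standard pooled-variance definition, and Welch's t statistic is shown in place of a full t-test since only the statistic is computed here.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative samples of one footprint feature (e.g., heel-to-ball distance,
# arbitrary mm units) for 100 females and 100 males with a true mean gap.
female = rng.normal(180.0, 8.0, size=100)
male = rng.normal(188.0, 9.0, size=100)

def cohens_d(a, b):
    """Cohen's d with a pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                     / (na + nb - 2))
    return (b.mean() - a.mean()) / pooled

def welch_t(a, b):
    """Welch's t statistic (unequal variances)."""
    return (b.mean() - a.mean()) / np.sqrt(a.var(ddof=1) / len(a)
                                           + b.var(ddof=1) / len(b))

print(f"Cohen's d = {cohens_d(female, male):.2f}")
print(f"Welch t   = {welch_t(female, male):.2f}")
```

With a between-class gap of roughly one within-class standard deviation, d comfortably exceeds the 0.5 threshold the study reports for its critical parameters.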

The study underscores the feasibility of applying artificial neural networks to footprint analysis, particularly in forensic contexts. The genetic algorithm’s ability to optimize feature selection and network parameters improved model robustness and generalization, addressing limitations of traditional methods that rely on linear assumptions or manual feature engineering. However, challenges remain, including sample homogeneity and potential errors in ink-based footprint acquisition. Future work should expand datasets to include diverse age groups, refine feature extraction techniques (e.g., incorporating pressure distribution data), and develop integrated software tools for real-world forensic applications.

In conclusion, this research advances the automation and objectivity of footprint-based gender analysis, offering both academic insights and practical value for criminal investigations. By bridging biomechanics, computer vision, and evolutionary computation, the proposed framework demonstrates the potential of machine learning in decoding complex biological patterns embedded in human footprints.

Key words: footprint image, foot type, gender, genetic algorithms
