PAWPRINT: Whose Footprints Are These?
Identifying Animal Individuals by Their Footprints

ICIP 2025

Inpyo Song, Hyemin Hwang, Jangwon Lee
Sungkyunkwan University, South Korea

🐾 Interactive PAWPRINT Challenge

This interactive matching game (pair each pet with its footprint) demonstrates the core concept behind our PAWPRINT research: identifying individual animals through their unique footprint patterns! 🔬

Abstract

In the United States, as of 2023, pet ownership has reached 66% of households and continues to rise annually. This trend underscores the critical need for effective pet identification and monitoring methods, particularly as nearly 10 million cats and dogs are reported stolen or lost each year. However, traditional methods for finding lost animals, such as GPS tags or ID photos, have limitations: they can be removed, suffer from signal issues, and depend on someone finding and reporting the pet.

To address these limitations, we introduce PAWPRINT and PAWPRINT+, the first publicly available datasets focused on individual-level footprint identification for dogs and cats. Through comprehensive benchmarking of both modern deep neural networks (e.g., CNNs, Transformers) and classical local features, we observe varying advantages and drawbacks depending on substrate complexity and data availability.

These insights suggest future directions for combining learned global representations with local descriptors to enhance reliability across diverse, real-world conditions. As this approach provides a non-invasive alternative to traditional ID tags, we anticipate promising applications in ethical pet management and wildlife conservation efforts.


PAWPRINT Datasets

PAWPRINT and PAWPRINT+ are the first public datasets for identifying individual dogs and cats by their footprints. PAWPRINT contains 933 clear prints from 13 dogs and 7 cats, collected on controlled surfaces (sand and clay). PAWPRINT+ adds 1,662 prints from varied real-world terrains (wood, dirt, asphalt, snow, rock) to test robustness in challenging conditions.

PAWPRINT
📊 933 prints
🐕 13 dogs, 🐈 7 cats
🏖️ Sand & clay only
PAWPRINT+
📊 1,662 prints
🐕 12 dogs
🌍 7 natural surfaces
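
To experiment with either split, here is a minimal PyTorch loading sketch. It assumes a hypothetical folder-per-individual layout (e.g., pawprint/<animal_id>/*.jpg), which may differ from the released structure; adjust the path and transforms to match the actual download.

# Minimal loading sketch. The "pawprint" directory layout is an
# assumption for illustration: one sub-folder per individual animal.
import torch
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),                    # typical input size for ImageNet backbones
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],  # ImageNet statistics
                         std=[0.229, 0.224, 0.225]),
])

# ImageFolder maps each per-individual folder to a class label,
# which matches the individual-identification setup.
dataset = datasets.ImageFolder("pawprint", transform=transform)
loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

images, ids = next(iter(loader))
print(images.shape, ids[:8])                          # e.g. torch.Size([32, 3, 224, 224])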

Key Results

Controlled Environment (PAWPRINT):
• Deep learning models excel: EfficientNet-B1 achieved 95.5% accuracy
• Classical SIFT features: 83.3% accuracy

Challenging Environment (PAWPRINT+):
• Deep models struggle: the best reaches only 35.8% accuracy (ResNet-50 with ArcFace)
• SIFT remains robust: 34.7% accuracy without training
• Family members are often confused with one another (genetics matter!)

💡 Key Insight: Classical features like SIFT are surprisingly competitive in challenging real-world conditions, suggesting hybrid approaches could be promising.
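
To make the training-free point concrete, here is a minimal SIFT matching sketch with OpenCV. The file names are placeholders, and this illustrates generic keypoint matching rather than the exact pipeline evaluated in the paper.

# Minimal SIFT matching sketch (not the paper's exact pipeline).
# The file names are placeholders for two footprint photos.
import cv2

img1 = cv2.imread("print_query.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("print_gallery.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Lowe's ratio test: keep a match only if its best neighbor is clearly
# closer than the second-best one.
matcher = cv2.BFMatcher()
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.75 * n.distance]

# More surviving matches suggests the two prints come from the same paw.
print(f"{len(good)} good matches out of {len(kp1)}/{len(kp2)} keypoints")

No learned weights are involved, which is why this kind of matching can be applied to unseen terrains without retraining.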

Method Comparison

Different approaches focus on different footprint aspects: SIFT detects local keypoints, ResNet highlights overall shape, and Vision Transformers capture global context. This suggests that combining methods could improve robustness; a toy fusion sketch follows the comparison below.

Method    Captures          Characteristic
SIFT      Local keypoints   Robust to terrain changes
ResNet    Overall shape     Best in controlled conditions
ViT       Global context    Attention-based features
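
As a toy sketch of that combination idea (an illustration, not the method from the paper), one could fuse cosine similarity between global embeddings from an ImageNet-pretrained ResNet-50 with a normalized SIFT match score. The fusion weight ALPHA, the helper functions, and the file names are assumptions.

# Hybrid scoring sketch: fuse a global deep embedding with local SIFT matches.
# Illustrates the "combine methods" idea; the weight ALPHA and the helpers
# below are assumptions, not the paper's method.
import cv2
import torch
from PIL import Image
from torchvision import models, transforms

ALPHA = 0.5  # weight between global (deep) and local (SIFT) evidence

# Global branch: ImageNet-pretrained ResNet-50 with the classifier removed.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval()
prep = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

def global_similarity(path_a, path_b):
    # Cosine similarity between backbone embeddings of two footprint photos.
    with torch.no_grad():
        emb = [backbone(prep(Image.open(p).convert("RGB")).unsqueeze(0))
               for p in (path_a, path_b)]
    return torch.nn.functional.cosine_similarity(emb[0], emb[1]).item()

def local_similarity(path_a, path_b):
    # Fraction of SIFT keypoints that survive Lowe's ratio test.
    sift = cv2.SIFT_create()
    imgs = [cv2.imread(p, cv2.IMREAD_GRAYSCALE) for p in (path_a, path_b)]
    (kp1, d1), (kp2, d2) = (sift.detectAndCompute(im, None) for im in imgs)
    good = [m for m, n in cv2.BFMatcher().knnMatch(d1, d2, k=2)
            if m.distance < 0.75 * n.distance]
    return len(good) / max(1, min(len(kp1), len(kp2)))

def hybrid_score(path_a, path_b):
    # Higher score = more likely that both prints belong to the same animal.
    return (ALPHA * global_similarity(path_a, path_b)
            + (1 - ALPHA) * local_similarity(path_a, path_b))

# Example with placeholder file names:
# print(hybrid_score("print_query.jpg", "print_gallery.jpg"))

In practice the fusion weight would likely need tuning per terrain type, since the results above indicate the two cues degrade differently across surfaces.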

🐾 Dataset Download

Help Our Research Grow

If our PAWPRINT datasets contribute to your research, we would be grateful if you could cite our work!

BibTeX

@misc{song2025pawprintfootprintstheseidentifying,
  title={PawPrint: Whose Footprints Are These? Identifying Animal Individuals by Their Footprints}, 
  author={Inpyo Song and Hyemin Hwang and Jangwon Lee},
  year={2025},
  eprint={2505.17445},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2505.17445}, 
}

Acknowledgements

Many thanks to the pets who patiently walked 🐾, sat, and stared for the camera 🐶, and to their owners for making it all happen.