Conference
USTOMB, Doctoral Days in Computer Sciences JDI'24, 1st ed., SIMPA Laboratory, Department of Computer Sciences, University of Science and Technology of Oran - Mohamed Boudiaf USTO-MB, Oran, ALGERIA, 2024 Jun 6
APA
Abdelkader, H., & Hayat, Y. (2024). Human Optical Flow using YOLOv8 and Density Maps. In USTOMB (Ed.), Doctoral Days in Computer Sciences JDI'24 (1st ed.). Oran, ALGERIA: SIMPA Laboratory.
Chicago/Turabian
Abdelkader, Haddag, and Yedjour Hayat. "Human Optical Flow Using YOLOv8 and Density Maps." In Doctoral Days in Computer Sciences JDI'24, edited by USTOMB. 1st ed. Oran, ALGERIA: SIMPA Laboratory, 2024.
MLA
Abdelkader, Haddag, and Yedjour Hayat. "Human Optical Flow Using YOLOv8 and Density Maps." Doctoral Days in Computer Sciences JDI'24, edited by USTOMB, 1st ed., SIMPA Laboratory, 2024.
BibTeX
@conference{haddag2024a,
title = {Human Optical Flow using YOLOv8 and Density Maps},
year = {2024},
month = jun,
day = {6},
address = {Oran, ALGERIA},
edition = {1},
institution = {University of Science and Technology of Oran - Mohamed Boudiaf USTO-MB},
journal = {Doctoral Days in Computer Sciences JDI'24},
publisher = {SIMPA Laboratory},
school = {Department of Computer Sciences},
author = {Abdelkader, Haddag and Hayat, Yedjour},
editor = {USTOMB},
}
Optical flow estimation serves as a cornerstone of diverse domains, spanning video processing and computer vision, and underpins tasks such as human action recognition and pose estimation. Despite its pivotal role, accurately estimating optical flow remains challenging owing to variations in how flow is represented and interpreted across applications and to the limited availability of real-world data. This paper proposes a novel approach focused on precise optical flow estimation tailored to human motion in videos. By harnessing pretrained deep learning models and density map approximations, our method offers multi-task learning capabilities with low compute requirements. Our aim is to furnish a portable, cross-domain solution for real-world applications. Our method exhibits promising results in capturing motion details and subtracting the background while remaining computationally efficient, thereby contributing to advancements in human-centric optical flow research.
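The paper does not include implementation details, but the core idea it names (a density-map approximation built over human detections, used to isolate human motion from a dense flow field) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the person centers are assumed to come from a detector such as YOLOv8, and the Gaussian kernel width `sigma`, the mask threshold `thresh`, and both function names are illustrative choices.

```python
import numpy as np

def density_map(centers, shape, sigma=8.0):
    """Approximate a human density map: one Gaussian blob per person center.
    `centers` is a list of (x, y) points, e.g. box centers from a person
    detector (YOLOv8 is assumed here, but any detector would do)."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]  # pixel coordinate grids
    dm = np.zeros(shape, dtype=np.float64)
    for cx, cy in centers:
        dm += np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))
    return dm

def mask_flow(flow, dm, thresh=0.05):
    """Background subtraction: zero out flow vectors wherever the density
    map says no human is present, keeping only human motion."""
    mask = dm > thresh                 # boolean (h, w) human-region mask
    return flow * mask[..., None]      # broadcast over the 2 flow channels

# Toy usage: one person at (16, 16) in a 32x32 frame with uniform flow.
dm = density_map([(16, 16)], (32, 32))
flow = np.ones((32, 32, 2))            # stand-in for a dense flow field
human_flow = mask_flow(flow, dm)       # nonzero only near the detection
```

In a full pipeline, `flow` would come from an optical-flow estimator and the density map would be refined by a learned model rather than fixed Gaussians; the sketch only shows how a density map can gate a flow field to suppress background motion.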