Abstract
Neuromorphic cameras feature asynchronous event-based pixel-level processing and are particularly useful for object tracking in dynamic environments. Current approaches for feature extraction and optical flow with high-performing hybrid RGB-events vision systems require large computational models and supervised learning, which impose challenges for embedded vision and require annotated datasets. In this work, we propose ED-DCFNet, a small and efficient (< 72k) unsupervised multi-domain learning framework, which extracts events-frames shared features without requiring annotations, with comparable performance. Furthermore, we introduce an open-sourced event and frame-based dataset that captures indoor scenes with various lighting and motion-type conditions in realistic scenarios, which can be used for model building and evaluation. The dataset is available at https://github.com/NBELab/UnsupervisedTracking.
| Original language | English |
|---|---|
| Title of host publication | Proceedings - 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2024 |
| Publisher | IEEE Computer Society |
| Pages | 2191-2199 |
| Number of pages | 9 |
| ISBN (electronic) | 9798350365474 |
| DOIs | |
| Publication status | Published - 2024 |
| Event | 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2024 - Seattle, United States. Duration: 16 Jun 2024 → 22 Jun 2024 |
Publication series
| Name | IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops |
|---|---|
| ISSN (Print) | 2160-7508 |
| ISSN (Electronic) | 2160-7516 |
Conference
| Conference | 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2024 |
|---|---|
| Country/Territory | United States |
| City | Seattle |
| Period | 16/06/24 → 22/06/24 |
Bibliographical note
Publisher Copyright: © 2024 IEEE.