Deep Learning for Video Understanding - Zuxuan Wu
Tue Aug 13, 2024 2:45 pm
PDF | 24.02 MB | English | ISBN: 9783031576799 | Authors: Zuxuan Wu, Yu-Gang Jiang | Year: 2024
About the ebook: Deep Learning for Video Understanding
This book presents deep learning techniques for video understanding. For deep learning basics, the authors cover machine learning pipelines and notation, along with 2D and 3D Convolutional Neural Networks for spatial and temporal feature learning. For action recognition, they introduce classical frameworks for image classification and then elaborate on both image-based and clip-based 2D/3D CNNs. For action detection, they cover sliding-window and proposal-based detection methods, single-stage and two-stage approaches, and spatial and temporal action localization, followed by an introduction to common datasets. For video captioning, they present language-based models and how to perform sequence-to-sequence learning for video captioning. For unsupervised feature learning, they discuss the need to shift from supervised to unsupervised learning and how to design better surrogate training tasks for learning video representations. Finally, the book introduces recent self-supervised pipelines such as contrastive learning and masked image/video modeling with transformers. The book outlines promising directions, with the aim of promoting future research in video understanding with deep learning.
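To give a flavor of the clip-based 3D CNNs the blurb mentions, here is a minimal sketch (not taken from the book) of a tiny spatio-temporal classifier in PyTorch. The layer sizes, the 101-class label space, and the clip resolution are illustrative assumptions only.
[code]
# Minimal sketch of a clip-based 3D CNN for action recognition (illustrative assumptions).
import torch
import torch.nn as nn

class Tiny3DCNN(nn.Module):
    def __init__(self, num_classes: int = 101):  # 101 classes is an assumption (e.g. UCF101-sized)
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(3, 32, kernel_size=3, padding=1),   # joint space-time convolution
            nn.ReLU(inplace=True),
            nn.MaxPool3d(kernel_size=(1, 2, 2)),          # pool spatially, keep temporal length
            nn.Conv3d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool3d(1),                      # global spatio-temporal pooling
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, clip: torch.Tensor) -> torch.Tensor:
        # clip shape: (batch, channels, frames, height, width)
        feats = self.features(clip).flatten(1)
        return self.classifier(feats)

# Example: classify a batch of two 16-frame RGB clips at 112x112 resolution.
model = Tiny3DCNN()
logits = model(torch.randn(2, 3, 16, 112, 112))
print(logits.shape)  # torch.Size([2, 101])
[/code]
Real models in this area (e.g. C3D, I3D, SlowFast) are far deeper; this sketch only shows how a 3D convolution consumes a stack of frames instead of a single image.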
https://rapidgator.net/file/c30ac9a83c2d10685df0a2cf5bd32bad/
https://filestore.me/f7mr0exxo32e
- Multimodal Learning toward Micro-Video Understanding
- Deep Learning In Python - A basic introduction to Deep Learning with Advanced Neur...
- Deep Learning for Data Architects: Unleash the Power of Python's Deep Learning Alg...
- Deep Learning With Tensorflow Second Edition Deep Learning Tensorflow Packt Publis...
- Understanding Video Game Music - Tim Summers