Time: 10:00 AM, Tuesday, May 19, 2015
Venue: Room 402, State Key Lab of CAD&CG, Building B, Library and Information Center, Zijingang Campus, Zhejiang University
Title: Deep learning on recommender system and video analysis
Speaker: Dr. Naiyan Wang
Host: Prof. Deng Cai
Abstract:
In this talk, I will briefly introduce our recent work on applying deep learning to recommender systems and video analysis. In the first part, I will introduce an application of content-based collaborative filtering. In this work, we first utilize an SDAE and a CNN as powerful tools to extract features from content; these features then serve as side information in collaborative filtering. The two parts are seamlessly integrated and trained collaboratively end to end. In the second part, I will focus on the applications of visual tracking and multimedia event detection/recounting (MED). For visual tracking, we propose a novel structured-output objectness CNN to account for the intrinsic properties of the task. First, the output of the CNN is a pixelwise probability map instead of a single label or real number as in a conventional CNN. Second, the objectness CNN is first pretrained to distinguish objects from non-objects, and then fine-tuned during online tracking. For the MED task, we integrate video classification and key-evidence finding into one unified CNN framework. As a result, the CNN can output not only the label for the video but also the supporting spatial and temporal evidence. Finally, I will summarize some research trends.
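To illustrate the side-information idea in the first part, here is a minimal sketch (an assumption for illustration only, not the speaker's actual model): plain matrix factorization in which the item factors are regularized toward content-derived features, the role that SDAE/CNN features play in the described collaborative filtering. All names, sizes, and hyperparameters below are made up.

```python
import numpy as np

# Sketch: matrix factorization with content features as side information.
# F stands in for features a deep model (e.g. SDAE/CNN) would extract;
# here it is random, purely for illustration.

rng = np.random.default_rng(0)
n_users, n_items, k = 20, 15, 4

R = rng.random((n_users, n_items))       # synthetic observed rating matrix
F = rng.standard_normal((n_items, k))    # stand-in for learned content features

U = 0.1 * rng.standard_normal((n_users, k))  # user latent factors
V = F.copy()                                 # item factors start at the content features

lr, lam = 0.02, 0.1
loss0 = np.mean((U @ V.T - R) ** 2)          # initial reconstruction error

for _ in range(300):
    E = U @ V.T - R                          # prediction residual
    U -= lr * (E @ V)
    # item update balances fitting the ratings against staying near F
    V -= lr * (E.T @ U + lam * (V - F))

loss = np.mean((U @ V.T - R) ** 2)
print(f"loss: {loss0:.4f} -> {loss:.4f}")
```

In the actual end-to-end setting described in the abstract, the feature extractor and the factorization are trained jointly rather than with fixed features as here.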
Bio:
Naiyan Wang is currently a final-year PhD candidate in the CSE department at the Hong Kong University of Science and Technology, supervised by Prof. Dit-Yan Yeung. Before that, he received his BS degree from Zhejiang University in 2011 under the supervision of Prof. Zhihua Zhang. His research focuses on applying statistical and computational models to real problems in computer vision and data mining. Currently, he mainly works on sparse representation, matrix factorization, and deep learning. He is especially interested in visual tracking, object detection, image classification, and recommender systems.