
YU Yi
Digital Content and Media Sciences Research Division, Assistant Professor
Degrees: Computer Science, Nara Women’s University
Research Fields: Human and Knowledge Media
Detail: http://researchmap.jp/yiyu/

Introduction

Mining Valuable Knowledge about People's Activities

The advancement of mobile devices has enabled people to readily create and share large volumes of multimedia data anywhere and anytime: they take photos, post comments, record location information, and share them with their friends. On the one hand, by analyzing the social multimedia data generated by an individual user, we can learn what content that user is interested in. On the other hand, diverse data are aggregated over the Internet as more users interact with various online platforms.
In this sense, users themselves act as sensors and form a participatory sensing network that provides insights into society. By analyzing these data, it is possible to learn the characteristics of people's daily lives all over the world. My research therefore focuses on data mining and knowledge discovery, from the individual level to the society level, and on creating intelligent systems and applications that felicitously support people's daily lives. I am working on algorithms that analyze, understand, and model the interesting things around people.

Venue Inference

User-generated data can capture different aspects of a user's daily life and imply user preferences; for example, a user's visit (check-in) at an Italian restaurant could imply that "the user loves Italian food". In addition, many users check in at a venue online to post photos and tips describing that venue. Such multimedia content, reflecting different aspects of a venue, provides a means of participatory sensing, and from it new users can get a rough image of the venue. When a user in Tokyo who posted a photo of a delicious pizza on Flickr later visits Nagoya and searches for a restaurant, a pizza restaurant can be recommended to him based on his preferences and on the experience of other users. Leveraging and analyzing such data therefore plays an increasingly important role in inferring where people love to go and stay.
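As an illustration of this idea, the following Python sketch infers a user's venue-category preferences from past check-ins and uses them to rank candidate venues in a new city. It is a minimal sketch, not the actual system described above; the venues, categories, and scoring rule are hypothetical illustration data.

# A minimal sketch of inferring venue preferences from check-in history.
# Venue names and categories below are hypothetical illustration data.
from collections import Counter

def category_profile(checkins):
    """Count how often a user checks in at each venue category."""
    counts = Counter(category for _venue, category in checkins)
    total = sum(counts.values())
    # Normalize counts into a preference distribution over categories.
    return {cat: n / total for cat, n in counts.items()}

def recommend(profile, candidate_venues, top_k=3):
    """Rank candidate venues in a new city by the user's category preference."""
    scored = [(profile.get(category, 0.0), venue)
              for venue, category in candidate_venues]
    return [venue for _score, venue in sorted(scored, reverse=True)[:top_k]]

# Check-ins posted in Tokyo as (venue, category) pairs -- hypothetical examples.
tokyo_checkins = [("Pizzeria A", "Italian"), ("Trattoria B", "Italian"),
                  ("Ramen C", "Japanese")]
# Candidate restaurants in Nagoya.
nagoya_venues = [("Pizza D", "Italian"), ("Soba E", "Japanese"),
                 ("Curry F", "Indian")]

profile = category_profile(tokyo_checkins)
print(recommend(profile, nagoya_venues, top_k=1))  # -> ['Pizza D']

In a real system, the preference profile would of course be built from richer signals (photo content, tips, ratings) rather than category counts alone.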

Personalized Recommendation

In everyday life people constantly face choices, e.g., which goods to buy, which venues to visit, or which multimedia content to watch, yet they often have little a priori knowledge to base these choices on. This calls for smart mobile services that make recommendations to users, where "smart" means understanding a user and his or her physical and mental state. For example, imagine the following scenario: a mother takes her sons out for outdoor activities and shoots a video while the boys play on the beach and swim in the sea. Later she wants to add music in her own style to make the video more appealing. My research here is to make felicitous recommendations that consider all aspects of the target. Extracting activity data from a user-centric point of view is very interesting, and exploiting such data can be very beneficial for personalized recommendation. To this end, we categorize user activity logs from different data sources using semantic concepts and, on this basis, exploit multimodal data to estimate user preferences.
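The sketch below illustrates that last step under the assumption that activity logs from different sources have already been mapped to sets of semantic concept tags; the concept labels, candidate music tracks, and the cosine-similarity ranking are illustrative choices, not the actual method.

# A minimal sketch: aggregate concept-tagged activity logs into a preference
# vector and rank candidate items by cosine similarity. All data is hypothetical.
import math

def preference_vector(activity_logs):
    """Aggregate concept annotations from heterogeneous logs into one vector."""
    prefs = {}
    for concepts in activity_logs:          # each log is a set of concept tags
        for concept in concepts:
            prefs[concept] = prefs.get(concept, 0) + 1
    return prefs

def cosine(u, v):
    dot = sum(u.get(k, 0) * v.get(k, 0) for k in set(u) | set(v))
    norm = (math.sqrt(sum(x * x for x in u.values()))
            * math.sqrt(sum(x * x for x in v.values())))
    return dot / norm if norm else 0.0

# Logs from different sources (photos, check-ins, videos), already mapped to concepts.
logs = [{"beach", "kids"}, {"sea", "swimming"}, {"beach", "sunset"}]
user = preference_vector(logs)

# Candidate background-music tracks described with the same concept vocabulary.
tracks = {"upbeat summer song": {"beach": 1, "sea": 1},
          "calm rainy-day piece": {"rain": 1, "city": 1}}
ranked = sorted(tracks, key=lambda t: cosine(user, tracks[t]), reverse=True)
print(ranked[0])   # -> 'upbeat summer song'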

The Web has become a repository of diverse and rich knowledge in heterogeneous formats. Educational materials on a single topic may be distributed across different sources on the Internet. To leverage these resources efficiently, it is necessary to organize different media data in a unified way. Moreover, users at different levels have different comprehension capabilities, so it is important to provide each user with the multimedia content that is most understandable to him or her. By analyzing users' patterns of learning activities, I would like to predict the most suitable learning content for individualized instruction.

Social Event Discovery

Fostered by multimedia technology innovation and user engagement, we have witnessed an unprecedented growth of user-generated data. In the physical world, people visit different places in person; at the same time, they like to share their experiences on social networks in the form of text, images, videos, and so on. As a result, a large amount of social multimedia data is generated by users every day and accumulates on the Internet over time. Such multimedia data not only implies users' opinions but also provides many comments about the events taking place at venues. People are thus involved in the participatory sensing of events, and more data leads to a more accurate representation of those events. These data are valuable not only for studying various event patterns, but also for generating more descriptive and explanatory analyses that reveal the relationships between social multimedia data and users' physical activities. We have developed a system called EventBuilder for real-time multimedia event summarization by visualizing social media. Our team participated in the Yahoo-Flickr Event Summarization Grand Challenge and won second place in 2015.
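As a rough illustration of how bursts of social posts can signal an event, the sketch below groups geo-tagged posts into coarse time and location buckets and flags dense buckets as event candidates. It is only a sketch, not the EventBuilder implementation; the grid size, time window, threshold, and example posts are hypothetical.

# A minimal sketch of event discovery from geo-tagged posts: bucket posts by
# hour and by a coarse latitude/longitude grid, then report dense buckets.
from collections import defaultdict

def detect_events(posts, cell=0.01, hours=1, min_posts=3):
    """posts: list of (timestamp_in_hours, lat, lon, text)."""
    buckets = defaultdict(list)
    for t, lat, lon, text in posts:
        key = (int(t // hours), round(lat / cell), round(lon / cell))
        buckets[key].append(text)
    # A bucket with many posts in the same hour and grid cell is an event candidate.
    return [texts for texts in buckets.values() if len(texts) >= min_posts]

posts = [(10.1, 35.681, 139.767, "fireworks starting!"),
         (10.3, 35.681, 139.768, "huge crowd at the station"),
         (10.7, 35.682, 139.767, "amazing fireworks show"),
         (22.0, 35.710, 139.810, "quiet evening walk")]
for event in detect_events(posts):
    print(" | ".join(event))   # a crude textual summary of each detected burst

A full event summarization pipeline would additionally rank and select representative photos, videos, and comments for each detected event, which is the part a system like EventBuilder visualizes.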
