[Seminar] [EE Seminar] October 4: Life-Immersive Mobile Computing: Cases, Opportunities, and Challenges

2018-10-01

Life-Immersive Mobile Computing:

Cases, Opportunities, and Challenges

Speaker: Prof. Youngki Lee, Department of Computer Science and Engineering, Seoul National University

Date & Time: Thursday, October 4, 2018, 17:00 ~ 18:00

Venue: Room 118, Engineering Building 1 (Bldg. 301), Seoul National University



Life-immersive mobile sensing applications have rapidly penetrated every facet of our daily lives, including child education, elderly support, smart transportation, and automated homes. They have evolved to capture and use a rich set of human activities and internal states (such as intention, engagement, attention, and depression) to provide highly usable situational services. For example, an upcoming mobile advertisement application will precisely target customers with buying intent and send promotions exactly when they are free to look at the coupons. Over the years, I have been working on core challenges to (1) innovate mobile/wearable/IoT systems to enable highly enriched and always-available context awareness, (2) design innovative life-immersive sensing applications, and (3) deploy these systems and applications to real target users at scale and make them highly usable. In this talk, I will share critical challenges and opportunities in life-immersive mobile computing with example systems and applications that I have built and deployed. Then, I will introduce two specific systems I have been working on: (1) DeepMon, a mobile deep learning system that enables continuous vision sensing applications, and (2) CommBetter, a real-time face-to-face interaction sensing system.


Education and Career

Assistant Professor, Department of Computer Science and Engineering, Seoul National University.

Assistant Professor, Singapore Management University, 2013-2018.

Ph.D. in Computer Science, KAIST, Korea.

Research Interests

Youngki has broad research interests in building experimental and creative software systems, which involve multi-dimensional considerations across operating systems, applications, and users. More specifically, he has been developing mobile and sensor systems to enable always-available and highly enriched awareness of human behavior, emotion, and surrounding contexts. He has also addressed core technical challenges such as improving energy efficiency, handling concurrency, and improving recognition accuracy. Furthermore, he has been building and deploying innovative mobile sensing applications in various domains such as daily healthcare, childcare, and education, in collaboration with domain experts. He has published a stream of work in top-tier conferences and journals such as ACM MobiSys, ACM SenSys, ACM UbiComp, and IEEE TMC. He has also served on technical program committees and organizing committees for such conferences, including as program co-chair for UbiComp 2018. More details about him are available at