Youngjun is a Lecturer (Assistant Professor) in the Department of Computer Science at University College London (UCL) and a key academic member of UCLIC (UCL Interaction Centre) and GDI-ARC (Global Disability Innovation Academic Research Centre). He also leads the Physiological Computing and Artificial Intelligence group and is a co-founder of KIT-AR (a UCL/SINTEF spinout company).
He explores, builds, and evaluates novel techniques and technologies for the next generation of artificial intelligence-powered physiological computing*, with the aim of boosting disability technology innovation.
He has pioneered mobile thermal imaging-based physiological sensing and the automated detection of affective states (e.g. mental stress). He obtained a PhD in computational physiology and thermography from the Faculty of Brain Sciences at UCL, and also holds an MSc in Robotics and a BSc in ICT (summa cum laude).
From 2011 to 2018, he worked as a senior research scientist at LG Electronics (full-time: 2011-2015; leave of absence: 2015-2018), where he led a variety of industrial research projects and successfully commercialised his novel sensing and machine learning technologies (e.g. a gesture-driven advanced touchscreen for vehicles**).
His research has been funded by EPSRC, Bentley Motors, EC H2020, NTT, ADB and DfID, and he has also secured a series of equipment grants (> £100k). His earlier academic studies (a 4-year BSc, a 2-year MSc, and a 3-year PhD) were fully funded; the primary funders include prestigious scholarship/grant bodies: EC H2020, UCL-ORS, the National Research Foundation of Korea, LG and Samsung.
He has authored more than 70 articles (including 45 granted patents) in areas related to affective and physiological computing, machine learning, human-computer interaction, accessible user interfaces, and multimodal sensing and feedback. Some of these achievements have been featured in outlets for the general public such as BBC News, Phys.Org, Imaging and Machine Vision Europe, Science Daily, and SBS News.
*By physiological computing, he means technology that listens to our bodily functions and psychological needs and adapts its functionality accordingly.
** His key research outcomes in industry were actively promoted to Google, BMW, Porsche, Bentley, Volkswagen, Jaguar, Mercedes-Benz, and others. (One highly successful commercialised product is the Proximity Touch for the 12.3-inch display unit in the Porsche Panamera.)
• Multi-million pound research grant portfolio
• Supervision (as of 2020): 3 PhD students, 5 PDRAs/RAs, and over 10 MSc/MEng students per year on average.
• Teaching: Research Methods & Making Skills (COMP0145, Module Leader), Affective Computing and Human Robot Interaction (COMP0053, Module Contributor), MSc Disability, Design and Innovation (DDI) – Dissertation (COMP0159, Module Leader).
• Board: UCL Grand Challenge of Transformative Technologies (2019 – present), MDPI Sensors Journal Topic Editor Board (2020 – present), Snowdon Scholarship for disabled students, review panel (2019 – present)
• Conference Organising Committee: e.g. ACII 2021 (Special Session Chair), ICMI 2020 (Senior Program Committee), ACII 2019 (Senior Program Committee)
• Reviewer: over 10 journals (e.g. TOCHI, Biomedical Optics Express, JMIR) and over 10 international conference proceedings (e.g. CHI, UIST, ACII, EuroHaptics, ICMI)