Auditory User Interface Design
Human-Computer Interaction
Myounghoon “Philart” Jeon, Mind Music Machine Lab, Center of Cyber-Human Systems, Cognitive Science & Computer Science. CS 1000 – October 13, 2015
Philart’s Personal…
Teaching / Educational Background
Measure Studio
Georgia Institute of Technology (2012)
Institute of Technology (2010)
University in Korea (2004)
Korea (2000)
University in Korea (2007)
HCI Researcher @Daum
Comm., UX/UI Designer & Sound Designer @LG Elec.
Best Papers (HFES, HCII), Ergonomic Design Award, IF Comm. Design Award. Co-work with SS, H/K Motors, Toyota, GE, Panasonic, etc. Venues: HFES, CHI, HCII, MobileHCI, ASSETS, CSUN, ICAD, AutomotiveUI, UbiComp, etc.
Center of Cyber-Human Systems, Institute of Computing and Cybersystems
Human-Centered Design: Designing systems of the users, by the users, and for the users. We are interested in People, Art, Design, Technology, & eXperiences
Expand artists’ emotional expressions and aesthetic dimensions using visualization and sonification in the immersive virtual environment.
– 120 Hz
– Sub-millimeter precision
distributes it to 8 tail nodes, each of which is connected to 3 multivisions; and (2) the sonifier via the scripting language.
Based in Chicago, he creates large geometric pieces.
“Orrico laid face down on a piece of paper, holding graphite pencils in both hands. He pushed off the wall, jetting himself forward on top of the piece. He dragged his graphite pencils along with him; as he writhed his way back to the starting position over and over again, he left behind himself a pictorial history of his motion.” “He knelt on a sheet of paper, striking it with graphite as he swung his arms in a pendular motion, and slowly revolved atop the mat.”
The outcomes of the collaboration and Tony’s works were displayed in the Finnish American Heritage Center in Hancock, MI.
Creativity & Intentionality
Taking drivers’ emotions and affect into account: improve road safety by estimating a driver’s affective states and intervening with dynamic technologies.
Our first system uses the Support-Vector Machines (SVMs) algorithm, which could detect positive, negative, and neutral affective states. Our second system uses the Viola-Jones object detection framework, which could detect more specific affective states, including anger, happiness, and surprise.
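To make the first approach concrete, here is a minimal sketch of training an SVM to label positive, negative, and neutral affective states from numeric feature vectors, using scikit-learn’s `svm.SVC`. This is an illustration only, not the lab’s actual system: the two-dimensional features and the sample data are invented for the example.

```python
# Illustrative SVM affect classifier (not the published pipeline).
# Features are hypothetical stand-ins, e.g. facial measurements.
from sklearn import svm

# Synthetic 2-D feature vectors, grouped by affective state
X = [[0.9, 0.8], [0.8, 0.9],   # positive
     [0.1, 0.1], [0.2, 0.0],   # negative
     [0.5, 0.5], [0.4, 0.6]]   # neutral
y = ["positive", "positive",
     "negative", "negative",
     "neutral", "neutral"]

clf = svm.SVC(kernel="linear")  # linear kernel keeps the sketch simple
clf.fit(X, y)

def detect_affect(features):
    """Return the predicted affective state for one feature vector."""
    return clf.predict([features])[0]
```

A real system would extract these features from camera or physiological data; the classification step itself would look much like `detect_affect`.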
Table 2. Mapping variables for observation states and sonification parameters

Observation States:
– Affective States (AS)
– Driving Behaviors (DB)

Sonification Parameters (SP):
– Musical Parameters (MP)
– Human Factors (HF)
– System Factors (SF)

ObservationStates = AS(sFEX, sFEMG, sEMP, sHR, sRE, sSC, sEEG) × DB(sLD, sSWA, sSP, sPF, sCO)
SonificationParameters = MP(cGE, cKEY, cTE) × HF(cFA, cPR, cEX) × SF(cTI, cDU, cRE, cIN)
SonificationOutputs = f(ObservationStates × SonificationParameters)

Intermittent sonification based on driver affective states and behaviors. Continuous sonification using multistream soundscapes.
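A mapping of this shape — observation states in, musical parameters out — can be sketched as a simple rule table. The rules, thresholds, and parameter values below are hypothetical illustrations, not the published mapping.

```python
# Minimal sketch of an ObservationStates -> SonificationParameters map.
# All rules and numbers are invented for illustration.

def sonification_parameters(affective_state, lane_deviation):
    """Map one observation state (affect + driving behavior)
    to musical parameters (tempo in BPM, mode of the soundscape)."""
    params = {"tempo": 90, "mode": "major"}      # calm baseline soundscape
    if affective_state == "anger":
        params.update(tempo=60, mode="minor")    # slow, soothing cue
    if lane_deviation > 0.5:                     # hypothetical threshold
        params["tempo"] += 30                    # urgent, intermittent cue
    return params
```

The continuous soundscape would be regenerated from `params` on every update, while a large lane deviation adds the intermittent warning layer on top.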
Facilitate social and emotional interaction of children with ASD using physical and musical stimuli.
“How much they questioned the nature of art?” “What they added to the conception of art?”
Research: Robot Acceptance, Human-Robot Team Interaction