RISS (Academic Research Information Service)


KCI-indexed

      2단계 자율주행 중 시각 및 청각 이차 과제 수행이 제어권 전환 능력에 미치는 영향 = Effect of Visual and Auditory Secondary Task on Takeover Performance in Level 2 Automated Driving


      https://www.riss.kr/link?id=A108766592


Additional Information

Multilingual Abstract

Automated vehicles are becoming increasingly popular. Currently commercialized level 2 automated vehicles provide convenience by allowing drivers to perform non-driving related tasks (or secondary tasks) while reducing the burden of driving. However, these vehicles cannot fully respond to all driving situations. Therefore, during level 2 automated driving, drivers must always keep their eyes on the road and take control of the vehicle if the automated features fail to work. If the driver does not react appropriately during the takeover, a traffic accident may result. Thus, drivers must always maintain a high level of takeover performance during automated driving. Since visual secondary tasks (e.g., reading a book) can impair drivers' ability to react, auditory secondary tasks (e.g., listening to audiobooks) are preferred in vehicular environments. However, according to multiple resource theory, auditory secondary tasks may also impair takeover performance. In this study, we investigated how various visual and auditory secondary tasks impact drivers' takeover performance in level 2 automated driving environments. Our results showed that, in addition to visual secondary tasks, auditory secondary tasks can also impair the driver's takeover performance. This suggests that, during level 2 automated driving, it may be unsafe to engage in auditory secondary tasks. Currently commercialized level 2 automated vehicles neither monitor drivers' auditory secondary tasks nor intervene when necessary. Based on our findings, we discuss design implications for monitoring and intervening in drivers' auditory secondary tasks during automated driving.
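The abstract's closing point — that production level 2 systems intervene only on visual distraction, while the study's findings suggest auditory secondary tasks should also trigger intervention — can be illustrated with a minimal sketch. Everything here (the `DriverState` fields, the intervention rule, the task labels) is a hypothetical illustration, not the paper's actual monitoring system.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DriverState:
    eyes_on_road: bool              # e.g., from a camera-based gaze tracker
    secondary_task: Optional[str]   # "visual", "auditory", or None

def should_intervene(state: DriverState) -> bool:
    """Decide whether a driver-monitoring system should intervene.

    Production level 2 systems typically intervene only when the driver's
    eyes leave the road. Per the study's finding, auditory secondary tasks
    also degrade takeover performance, so this sketch flags them as well.
    """
    if not state.eyes_on_road:
        return True  # classic visual-distraction case, already monitored today
    # The case current systems ignore: eyes on road, but attention consumed
    # by an auditory task (e.g., an audiobook).
    return state.secondary_task == "auditory"

if __name__ == "__main__":
    states = [
        DriverState(eyes_on_road=True, secondary_task=None),
        DriverState(eyes_on_road=False, secondary_task="visual"),
        DriverState(eyes_on_road=True, secondary_task="auditory"),
    ]
    print([should_intervene(s) for s in states])  # [False, True, True]
```

The design choice worth noting is that the rule is a pure function of observable driver state, so the same logic could be unit-tested offline and reused across simulator (e.g., CARLA, cited below) and on-road settings.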

References

1 Huynh, S., "iMon: Appearance-based Gaze Tracking System on Mobile Devices" ACM 5 (5): 1-26, 2021

      2 UN Regulation, "Uniform provisions concerning the approval of vehicles with regard to Automated Lane Keeping Systems" UNECE 5-64, 2021

      3 Mok, B., "Tunneled in: Drivers with active secondary tasks need more time to transition from automation" 2840-2844, 2017

      4 Wan, J., "The effects of lead time of take-over request and nondriving tasks on taking-over control of automated vehicles" IEEE 48 (48): 582-591, 2018

5 Akuchie, M., "Tesla's Staged Self-Driving Video Is A Prime Example Of Deceptive Marketing"

6 Klender, J., "Tesla with sleeping driver proves there's still misunderstanding and irresponsibility surrounding autonomy"

      7 SAE International, "Taxonomy and definitions for terms related to driving automation systems for on-road motor vehicles" SAE international 1-5, 2018

      8 Ma, R., "Situation awareness and workload in driving while using adaptive cruise control and a cell phone" Elsevier 35 (35): 939-953, 2005

      9 Shin, H. S., "Real time car driver's condition monitoring system" 951-954, 2010

      10 Wangwiwattana, C., "Pupilnet, measuring task evoked pupillary response using commodity rgb tablet cameras: Comparison to mobile, infrared gaze trackers for inferring cognitive load" ACM 1 (1): 1-26, 2018


      11 The Associated Press, "Patrol: Tesla Autopilot driver was watching movie, crashed"

      12 Kun, A. L., "On the feasibility of using pupil diameter to estimate cognitive load changes for in-vehicle spoken dialogues" Interspeech 3766-3770, 2013

13 Wickens, C. D., "Multiple resources and mental workload" Sage Publications 50 (50): 449-455, 2008

14 Endsley, M. R., "Measurement of situation awareness in dynamic systems" Sage Publications 37 (37): 65-84, 1995

      15 Mehler, B., "MIT AgeLab delayed digit recall task (n-back)" Massachusetts Institute of Technology 17-, 2011

      16 Banks, V. A., "Is partially automated driving a bad idea? Observations from an on-road study" Elsevier 68 : 138-145, 2018

17 Arkonac, S. E., "In-car distractions and automated driving: a preliminary simulator study" ACM 346-351, 2019

      18 Mehler, B., "Impact of incremental increases in cognitive workload on physiological arousal and performance in young adult drivers" Sage Publications 2138 (2138): 6-12, 2009

      19 Radlmayr, J., "How traffic situations and non-driving related tasks affect the take-over quality in highly automated driving" Sage Publications 58 (58): 2063-2067, 2014

      20 Naujoks, F., "From partial and high automation to manual driving : Relationship between non-driving related tasks, drowsiness and take-over performance" Elsevier 121 : 28-42, 2018

      21 Louw, T., "Engaging in NDRTs affects drivers’ responses and glance patterns after silent automation failures" Elsevier 62 : 870-882, 2019

      22 Konstantopoulos, P., "Driver's visual attention as a function of driving experience and visibility, Using a driving simulator to explore drivers’ eye movements in day, night and rain driving" Elsevier 42 (42): 827-834, 2010

      23 Dosovitskiy, A., "CARLA: An open urban driving simulator" PMLR 1-16, 2017

24 Glisky, E. L., "Brain Aging: Models, Methods, and Mechanisms" CRC Press/Taylor & Francis 2007

      25 Mehler, B., "Assessing the Demands of Voice Based In-Vehicle Interfaces - Phase II Experiment 3 - 2015 Toyota Corolla" Massachusetts Institute of Technology 1-62, 2015
