Analysis of the Management Efficiency of Korean Construction Companies Using the DEA-AR/AHP Model
Lee, Kyung-Joo; Park, Jung-Lo; Kim, Jae-Jun. Architectural Institute of Korea, 2012. Journal of the Architectural Institute of Korea, Vol.28 No.6
Management efficiency has become increasingly important for Korean construction companies since the IMF crisis of 1997 and the global financial crisis of 2008. Previous studies analyzed the efficiency of construction companies using the nonparametric DEA (Data Envelopment Analysis) model, but efficiency analysis with the basic DEA model has the problem that inefficient companies can be evaluated as efficient. This study therefore analyzes the efficiency of Korean construction companies using a DEA-AR (Assurance Region)/AHP model. Previous studies and theories were reviewed, variables were selected, and critical variables were identified for the efficiency analysis. The study analyzed the efficiency of construction companies using both the DEA model and the DEA-AR/AHP model and compared the results. The proposed approach removes a drawback of the existing DEA model: because several companies are all measured as efficient, ranking them requires a follow-up process to distinguish their efficiency scores. The DEA-AR/AHP analysis produced a ranking among the construction companies that the existing DEA model had evaluated as equally efficient. This study is expected to provide reasonable evaluation standards for ranking construction companies in management evaluation.
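The core of a DEA-AR model is a linear program solved once per decision-making unit (DMU), in which AHP-derived bounds restrict the ratios of the input/output weights so that no DMU can look efficient under implausible weightings. The sketch below, written under hypothetical input/output data and a single illustrative assurance-region bound, shows the general shape of an input-oriented CCR model with one AR constraint; it is not the paper's dataset or its exact DEA-AR/AHP formulation.

```python
# Minimal sketch: input-oriented CCR (DEA) multiplier model with one
# assurance-region (AR) constraint on the output weights.
# All data and the AR bounds are hypothetical placeholders.
import numpy as np
from scipy.optimize import linprog

# rows = DMUs (construction companies), columns = inputs / outputs
X = np.array([[120.0, 35.0],   # inputs, e.g. capital, employees (hypothetical)
              [ 90.0, 42.0],
              [150.0, 28.0]])
Y = np.array([[200.0, 15.0],   # outputs, e.g. sales, net income (hypothetical)
              [180.0, 20.0],
              [210.0, 10.0]])
n, m = X.shape
_, s = Y.shape

# AHP-derived bound on the output-weight ratio: ar_lo <= u1/u2 <= ar_hi (assumed)
ar_lo, ar_hi = 0.5, 3.0

def ccr_ar_efficiency(o):
    """Efficiency of DMU o; decision vector z = [u (s outputs), v (m inputs)]."""
    # maximize u.T @ Y[o]  ->  minimize -u.T @ Y[o]
    c = np.concatenate([-Y[o], np.zeros(m)])
    # normalization: v.T @ X[o] = 1
    A_eq = [np.concatenate([np.zeros(s), X[o]])]
    b_eq = [1.0]
    # u.T @ Y[j] - v.T @ X[j] <= 0 for every DMU j
    A_ub = [np.concatenate([Y[j], -X[j]]) for j in range(n)]
    b_ub = [0.0] * n
    # assurance region: ar_lo * u2 - u1 <= 0 and u1 - ar_hi * u2 <= 0
    A_ub.append(np.array([-1.0, ar_lo] + [0.0] * m)); b_ub.append(0.0)
    A_ub.append(np.array([1.0, -ar_hi] + [0.0] * m)); b_ub.append(0.0)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun  # efficiency score in (0, 1]

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_ar_efficiency(o):.3f}")
```

Without the last two constraints this reduces to the plain CCR model, which is where several DMUs can tie at a score of 1; tightening the weight ratios with AHP-derived bounds is what allows the tied companies to be separated and ranked.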
Lee, Kyung-Joo; Kim, Gye-Young. Korean Society for Next Generation Computing, 2016. Journal of the Korean Society for Next Generation Computing, Vol.12 No.4
This paper suggests a method for tracking the gaze of a person at a distance of around 2 m using a single pan-tilt-zoom (PTZ) camera. In the suggested method, images are acquired for gaze tracking by switching the camera to the wide-angle mode or the narrow-angle mode depending on the location of the person. A face present in the camera's field of view (FOV) is detected in the wide-angle mode; once the location of the face is calculated, the camera switches to the narrow-angle mode. The images acquired in the narrow-angle mode contain information on the gaze direction of the person at a distance. The method for calculating the gaze direction consists of a head pose estimation step and a gaze direction calculation step. The head pose is estimated using the locations of the eyes and nose in the face. The gaze direction is then obtained by segmenting the pupil with a deformable template and extracting the center of the eyeball from the end points of both eyes and the head pose information. Experiments at varying distances show that the proposed gaze tracking algorithm effectively estimates the gaze direction of a person at a distance.
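To make the gaze direction calculation step concrete, the following sketch estimates a gaze vector from the two eye corners, a pupil center (as would be produced by pupil segmentation), and the head pose angles. The geometry, the assumed eyeball radius, and all landmark values are illustrative assumptions, not the paper's calibrated procedure.

```python
# Minimal geometric sketch of the gaze-direction step: the eyeball center is
# approximated from the two eye corners and the head pose, and the gaze vector
# runs from that center to the detected pupil. All constants are assumptions.
import numpy as np

EYEBALL_RADIUS_PX = 12.0   # assumed eyeball radius in image pixels

def gaze_direction(eye_corner_l, eye_corner_r, pupil_center, head_pose):
    """Estimate a 3D gaze unit vector from 2D eye landmarks and head pose.

    eye_corner_l, eye_corner_r, pupil_center : (x, y) image coordinates
    head_pose : (yaw, pitch) of the face in radians
    """
    eye_corner_l = np.asarray(eye_corner_l, float)
    eye_corner_r = np.asarray(eye_corner_r, float)
    pupil_center = np.asarray(pupil_center, float)
    yaw, pitch = head_pose

    # Midpoint of the eye corners, shifted along the head-pose direction,
    # serves as the projected eyeball center.
    eye_mid = 0.5 * (eye_corner_l + eye_corner_r)
    eyeball_center = eye_mid - EYEBALL_RADIUS_PX * np.array([np.sin(yaw),
                                                             -np.sin(pitch)])

    # Gaze = vector from the eyeball center to the pupil center; the depth
    # component is recovered from the assumed eyeball radius.
    dxy = pupil_center - eyeball_center
    dz = np.sqrt(max(EYEBALL_RADIUS_PX ** 2 - dxy @ dxy, 0.0))
    gaze = np.array([dxy[0], dxy[1], -dz])
    return gaze / np.linalg.norm(gaze)

# Example with made-up landmark positions (pixels) and a slight head turn
print(gaze_direction((100, 120), (130, 120), (118, 121), (0.1, 0.05)))
```

The point of the sketch is the division of labor described in the abstract: the head pose fixes where the eyeball center sits relative to the eye corners, and the segmented pupil position relative to that center determines the gaze direction.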