Bae Seung-ye, POSTECH (Pohang University of Science and Technology) Graduate School of Convergence, 2022, Domestic Master's Thesis
Predicting outfit compatibility means determining whether fashion items look good when worn together. In recent years, several studies have used a visual-semantic space for outfit compatibility prediction and proposed models or formulations to capture high-level information. However, well-chosen textual attributes help form a more accurate visual-semantic space, and providing domain-specific details allows the compatibility model to learn semantically more robust information. This thesis proposes a method for extracting domain-specific fashion attributes using color expertise. The proposed method maps each item's pattern, together with the adjectives corresponding to the closest of Kobayashi's color triplets, to fashion style concepts. It then adjusts the resulting concepts through zero-shot classification with a fine-tuned CLIP model to make them more distinguishable. To validate the proposed method, experiments are conducted on four datasets that differ in which of the extracted concepts are included among the text attributes. The dataset that includes the adjusted fashion style concepts outperforms the prior baseline with a 14% increase in FITB (fill-in-the-blank) accuracy for outfit compatibility prediction. The results show that high-level semantic features are essential for obtaining a unified representation through the visual-semantic space, and verify that our approach is well suited to the fashion domain for outfit diagnosis.
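The core mapping step described above — assigning an item the adjective of its nearest color triplet — can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the triplet entries below are hypothetical stand-ins (Kobayashi's actual Color Image Scale defines 180 three-color combinations), and plain RGB Euclidean distance is assumed since the abstract does not specify a color metric.

```python
import math

# Hypothetical adjective -> color-triplet entries in the spirit of
# Kobayashi's Color Image Scale (the real scale has 180 combinations).
KOBAYASHI_TRIPLETS = {
    "romantic": [(255, 183, 197), (255, 240, 245), (230, 230, 250)],
    "dynamic":  [(255, 0, 0),     (0, 0, 0),       (255, 215, 0)],
    "natural":  [(139, 169, 89),  (222, 184, 135), (245, 245, 220)],
}

def _dist(c1, c2):
    # Euclidean distance in RGB space (an assumed, simple metric).
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))

def closest_style(item_colors):
    """Map an item's dominant colors to the adjective of the nearest triplet."""
    def total_distance(triplet):
        # Each item color is matched to its nearest color in the triplet,
        # and the per-color distances are summed for the whole triplet.
        return sum(min(_dist(c, t) for t in triplet) for c in item_colors)
    return min(KOBAYASHI_TRIPLETS,
               key=lambda adj: total_distance(KOBAYASHI_TRIPLETS[adj]))

# e.g. a pale pink item lands on the "romantic" triplet
print(closest_style([(250, 200, 210), (255, 235, 240), (240, 235, 250)]))
```

In the thesis, the style concept obtained this way is then refined by zero-shot classification with a fine-tuned CLIP model, which is omitted here since it requires model weights.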