Mark A. Runco, Jody J. Illies, Roni Reiter-Palmon 대한사고개발학회 2005 The International Journal of Creativity & Problem Vol.15 No.1
Explicit instructions are often used to enhance performance on tests of divergent thinking. Previous research has not, however, compared explicit instructions which focus on criteria with those that focus on tactics. It is one thing to be instructed to “be original” (one possible criterion) and quite another to be given procedures to find original ideas (e.g., “think of things that will be thought of by no one else”). The research reported in the present article was designed with that in mind. In addition to comparing the different types of instructions, it also compared college students (N = 211) who received instructions which varied in the degree of explicitness of the applicable strategy. Regression analyses indicated that the procedural instructions had a more robust impact on divergent thinking than did the conceptual instructions. This difference was especially clear when the divergent thinking tests were scored for ideational originality. Implications for the educational setting and for established group techniques (e.g., brainstorming) are explored.
Assessing the Accuracy of Judgments of Originality on Three Divergent Thinking Tests
Mark A. Runco, Gayle T. Dow 대한사고개발학회 2004 The International Journal of Creativity & Problem Vol.14 No.2
The componential model of creative thinking includes ideation (divergent thinking), problem finding skills, and evaluative accuracy. This investigation was designed to assess the accuracy of originality judgments across three different kinds of divergent thinking tests, one (Uses) being verbal, one (Pattern Meanings) figural, and one (Consequences) “realistic.” These tasks were compared using three indices of evaluative accuracy. A second objective of this investigation was to examine the discriminant validity of the evaluative accuracy scores. Given that evaluations reflect judgment, it was possible that existing measures of judgment and critical thinking would be strongly related to (and even redundant with) the new measures of evaluative accuracy. Results from the present investigation confirmed that there were differences among various divergent thinking tests in terms of the accuracy of judgments. Additionally, the measures of evaluative accuracy used here were largely unrelated to more traditional measures of judgment and convergent thinking, which supports the discriminant validity of the evaluative measures.