Creative Solutions and their Evaluation: Comparing the Effects of Explanation and Argumentation Tasks on Student Reflections

Andria Andiliou (a), P. Karen Murphy (b)

(a) University of Bristol, United Kingdom
(b) The Pennsylvania State University, USA

Article received 12 February 2014 / revised 24 March 2014 / accepted 15 June 2014 / available online 27 June 2014
Abstract

Creative problem solving, which results in novel and effective ideas or products, is most advanced when learners can analyze, evaluate, and refine their ideas to improve creative solutions. The purpose of this investigation was to examine creative problem solving performance in undergraduate students and to determine the tasks that support critical self-evaluations of creative solutions by comparing alternative types of reflective tasks. Participants (n = 103) first provided demographic information and responded to individual difference measures (i.e., divergent thinking, need for cognition, and beliefs about creative outcomes) and then read a problem scenario in which they assumed the role of a high school teacher who was asked to design a creative college preparatory course. Next, participants completed either an explanation reflective task or an argument-based reflective task. Finally, participants evaluated their proposed course by rating it on characteristics that describe the originality and effectiveness of creative solutions. Findings confirmed the role of divergent thinking as a positive predictor of the originality of a creative solution, whereas need for cognition and academic major were positive predictors of the effectiveness of a creative solution. Participants rated their creative solutions differentially depending on their beliefs and the type of reflective task. Those whose beliefs aligned better with conceptualizations of creative outcomes assessed the originality and effectiveness of their solution more positively. The findings indicate that the argumentation task could potentially promote reflective and critical thinking about a creative solution, as participants who completed the argumentation task evaluated their solution more conservatively.
Keywords: creative problem solving; creativity beliefs;
self-evaluation; reflection; argument diagrams
Corresponding author: Andria Andiliou, Academic Staff Developer, University of Bristol, andria.andiliou@bristol.ac.uk
http://dx.doi.org/10.14786/flr.v2i3.87
1. Introduction
Creative problem solving is manifested in everyday life situations as well as in academic contexts (Diakidoy & Constantinou, 2001). It represents a goal-directed cognitive process that results in the production of original and effective solutions when no obvious solution method is available (Antiliou, 2012).
Consider the example of a mud engineer trying to find an
innovative solution to prevent oil leaks from an underwater
well, the example of a community center manager attempting to
design activities that address the needs of a diverse community,
or a group of preschool children trying to improvise the rules
of a game to accommodate more players. In all cases, the
situations call for novel but still effective solutions.
Many
countries around the world have identified the development of
creative thinking across subject areas as a core student
learning outcome (Diakidoy & Constantinou, 2001) and have
pushed for problem-based approaches that provide students with
opportunities to construct creative solutions to authentic
problems. However,
students are often overwhelmed when they are asked to submit
creative assignments or generate creative solutions. Arguably, one possible
explanation for these negative reactions is that students have
to rely on their creativity beliefs, but they are uncertain
about the characteristics of creative outcomes. Consequently, they are
unsure about how creative their proposed solution is or how to
determine the creativeness of their solution. Unfortunately, the role of beliefs has not been adequately explored with respect to creative problem solving performance. As such, the first goal of the present study was to examine the contribution of cognitive and affective individual difference variables, including an individual's beliefs, to creative performance.
Besides creative performance per se, researchers have explored the self-evaluations of proposed solutions, and research evidence suggests that students' evaluations of self-generated solutions are superficial rather than reflective (Runco & Chand, 1994). In addition,
students tend to engage in case-building about a solution. Instead of critically
judging a solution, students argue and justify their solutions
by discounting potential obstacles and consequences or
curtailing the importance or extensiveness of the problem
(Byrne, Shipman, & Mumford, 2010; Dailey & Mumford, 2006;
Nussbaum, 2008). In
order to promote more critical reasoning and reflective
evaluations in problem solving, researchers examined the
effectiveness of structure supports such as prompts (Chen &
Bradshaw, 2007; Ge & Land, 2003), directions (Ferretti, MacArthur, & Dowdy, 2000; Nussbaum & Sinatra, 2003; Nussbaum & Kardash, 2005), cases (Choi & Lee, 2009; Hernandez-Serrano
& Jonassen, 2003), visual representations (Nussbaum, 2008;
Nussbaum & Schraw, 2007), collaborative reasoning, and
argumentation tools and tasks (Cho & Jonassen, 2002). These types of
structure supports engage students in thinking about other
perspectives, opinions, and approaches to the problem. This is particularly the case when the structure support involves argumentation as a mechanism to elaborate and make explicit the reasoning underlying the problem solving and to foster reflection about a solution (Andriessen, 2006; Oh & Jonassen, 2007).
Argumentation tools, one type of structure support, were found to significantly improve students' argumentation skills and group problem solving, with some marginal effects on individual problem solving (Cho & Jonassen, 2002; Oh & Jonassen, 2007; Uribe, Klein, & Sullivan, 2003). However, past research has not specified how argumentation tools, and specifically argument diagrams, influence the self-evaluation of a solution when the problem calls for creative solutions that are original and effective. Thus,
the second goal of this study was to address this gap in the
literature by investigating the effects of an argument diagram
on the self-evaluation of a creative solution that participants
forwarded to a course design problem.
1.1 Creative Problem Solving
Several models have been proposed to describe creative problem solving, including the Simplex Model of Creative Process (Basadur et al., 1994), the Creative Problem Solving framework (Isaksen et al., 1994), and the Model of Creative Thought (Mumford et al., 1991). In the Simplex Model, the problem solver moves in cycles of ideation and evaluation that occur in different phases of problem solving, during which the learner generates and formulates the problem, solves the problem, and implements the relevant, appropriate, and original ideas (Runco & Chand, 1994). According to the Creative Problem Solving framework, the learner needs to understand the problem, generate solution ideas, and plan for action by developing solutions that could be effectively implemented (Treffinger, 1995). In Mumford et al.'s (1991) model of creative thought, the learner combines and reorganizes categories or concepts to develop new understandings of the problem (i.e., ideas), which are then evaluated and implemented. These models illustrate that creative problem solving evolves across several cognitive subprocesses, initiated by the construction of a problem space and followed by the generation of ideas and the evaluation of a selected solution.
Creative problem solving is a form of ill-structured problem solving that results in the production of original and effective solutions (Antiliou, 2012). Drawing on a review of the empirical literatures of creative and ill-structured problem solving, certain individual difference variables were expected to have an effect on the creativity of a solution with respect to its originality and effectiveness. Specifically, divergent thinking, that is, the ability to generate multiple ideas, was found to be predictive of creative problem solving performance (e.g., Hunter et al., 2008; Reiter-Palmon et al., 1997, 2009). In addition, research evidence indicated that need for cognition, which represents an individual's tendency to engage in and enjoy effortful cognitive endeavours (Cacioppo, Petty, & Kao, 1984), also predicts creative problem solving performance (Butler et al., 2003; Hunter et al., 2008; Osburn & Mumford, 2006).
Students' domain knowledge and their beliefs about the characteristics of creative outcomes can also influence creative problem solving performance. Research evidence suggests that problem solvers' perceptions of a task and their domain knowledge impact the search for relevant information, the representation of the problem, and the evaluation of potential solutions (Jonassen, 1997; Voss et al., 1991). Also, evidence indicates that knowledge of the important concepts and principles of a domain contributes to better performance on ill-structured tasks (Shin, Jonassen, & McGee, 2003; Voss & Post, 1988) and serves as the foundation of creative solutions (Weisberg, 2006).
Evaluation is a component of problem solving; it represents the metacognitive process during which problem solvers reflect on and assess a proposed solution. Evaluation is essential for ill-structured problems that require a creative solution because it is the process by which problem solvers can determine whether a proposed solution meets the creative criteria of originality and effectiveness. According to Voss and her colleagues (1981), argumentation is a means for problem solvers to evaluate a solution more analytically by elaborating and clarifying the solution and identifying its limitations. Researchers have experimented with graphic organizers such as argument diagrams in order to promote critical and reflective thinking in writing tasks. For example, Nussbaum and Schraw (2007) found that argument diagrams supported better integration of arguments and counterarguments in writing tasks. For the present study, our second aim was to determine whether an argument task (i.e., an argument diagram) promotes more reflective and critical self-evaluations of a potentially creative solution in comparison to an explanation task.
1.2 Explanation and Argumentation for Critical Thinking
Explanation and argument tasks have been used to promote and assess understanding, critical thinking, conceptual change, and problem solving (Reznitskaya, Anderson, & Kuo, 2007; Jonassen & Kim, 2010; Nussbaum & Sinatra, 2003; Wiley & Voss, 1999). Explanation is a constructive learning activity during which learners elaborate and clarify an idea by explaining it to themselves, and it has been found to lead to enhanced learning, more accurate self-assessments, and more effective problem solving (Fonseca & Chi, 2011). When learners elaborate, they generate inferences and integrate information with prior knowledge. The self-explanation effect was found to be positive both for learners with low and for learners with high prior knowledge. A possible interpretation of this result is that self-explaining allows individuals with low knowledge to generate inferences that fill their knowledge gaps, whereas it allows learners with high prior knowledge to repair their existing mental models (Chi, 2000; Fonseca & Chi, 2011). Research findings indicate that self-explanation can be a powerful learning strategy due to the underlying cognitive mechanisms that allow learners to identify and remedy knowledge gaps by generating inferences and to develop and repair their knowledge representation models. Thus, when learners move beyond simple knowledge-telling with summaries or paraphrased statements and engage in self-explanation through inference generation and knowledge integration, they seem to gain a deeper understanding. However, even though self-explanation to oneself represents a constructive learning activity, it was found to be somewhat less effective in learning and problem solving tasks when compared to more interactive learning activities such as responding to question prompts, explaining to someone else, and discussing with a partner to generate collaborative explanations. In the present study, within the context of a problem solving task, an explanation prompt was compared with an argument task to examine the degree to which the tasks promote reflective thinking when evaluating a proposed creative solution.
Argumentation was conceptualized by Kuhn (1991) as the cognitive process of formulating and weighing the arguments for and against a course of action, a point of view, or a solution to a problem. Argumentation skills comprise the skills of generating reasons, offering evidence, and providing counterarguments and rebuttals. Three theoretical frameworks, based on rhetorical and dialectical arguments, have been applied to analyze argumentation in educational settings. Rhetorical arguments are put forward to persuade or convince others about a claim or proposition without consideration of alternative positions (Toulmin, 1958). Dialectical arguments are based on the dialogue between supporters of alternative positions during a dialogue game or a discussion (Jonassen & Kim, 2010). Through dialectical argumentation, within an individual or within a group, individuals resolve differences, compromise between multiple opinions, and convince others of the advantages of a position.
Researchers in the learning sciences draw primarily on three theoretical approaches to analyze and evaluate the quality of argumentation: Toulmin's rhetorical argumentation framework, pragma-dialectics (van Eemeren & Grootendorst, 1992), and Walton's (2000) dialogue theory. Toulmin proposed an argument scheme to describe the structure of effective argumentation that includes a sequential set of components: a claim that expresses the position, facts or opinions that serve as data in support of the claim, a warrant as justification, and elaborative elements such as a backing, a qualifier, and a rebuttal to potential counterclaims. Although Toulmin's framework is useful for analyzing rhetorical argumentation to determine the soundness and effectiveness of an individual's line of reasoning (Andriessen, 2006), it has two primary limitations: its complexity (e.g., warrants are often implicit) and its focus on the perspective of one proponent (Leitão, 2003; van Eemeren & Grootendorst, 1999) instead of on argumentation as "a discourse phenomenon" (Andriessen, 2006), especially with reference to educational contexts.
Two theoretical models that are more applicable to the dialectical nature of argumentation as it is manifested in educational contexts are pragma-dialectics (van Eemeren & Grootendorst, 1992) and dialogue theory (Walton, 2000). According to pragma-dialectics, argumentation is a means of resolving differences of opinion through critical discussions that evolve in four stages. First, people present their positions at the confrontation stage; they then assume their roles and agree on procedures at the opening stage, defend and challenge during the argumentation stage, and decide at the concluding stage who has prevailed in the critical discussion. Another collaborative view of argumentative discourse is conceptualized in Walton's (2000) dialogue theory, which holds that argumentation is a goal-directed and interactive dialogical activity during which individuals reason together about arguments to generate one proposed solution. Walton (2000) identified specific forms of dialogue (e.g., information seeking, negotiation, persuasion, inquiry) along with argumentation schemes comprised of critical questions and moves to model and support argumentation. Educators can draw on the schemes suggested in dialogue theory to plan, organize, and evaluate classroom and online discussions, and use argumentation as a vehicle for critical thinking and problem solving. In the present study, an overarching critical question was used to stimulate student argumentation with an imaginary group of stakeholders with the purpose of exploring the potential of a creative solution to an authentic problem.
1.2.1 Supporting Argumentation
Researchers have documented that acquiring the skills to argue effectively is challenging both for adolescents and for young adults (Felton & Kuhn, 2001; Reznitskaya et al., 2001). In order to engage students in argumentation and promote the development of argumentation skills, educational researchers have designed and experimented with argumentation supports in the contexts of reading, writing, and problem solving tasks. Among these argumentation supports are directions, computerized and face-to-face collaborative argumentation, and visual argumentation aids.
Goal directions have been used as a means to promote reflective argumentation in problem solving and writing tasks. For example, Nussbaum and Sinatra (2003) asked undergraduate students who had provided a wrong answer to a physics problem, in which they had to predict the path of a falling object, to counter-argue by providing reasons why a person might hold an opposing position. The researchers found that students who proposed counterarguments had a more integrated understanding of the problem situation and the important underlying concepts.
The effectiveness of directions in supporting argumentation varied based on the goal they conveyed. When students were instructed to persuade an audience instead of explaining their position or solution, evidence indicated that they engaged in case-building: the overall quality of their writing was poorer, with fewer counterarguments but more reasons in justification of their position. Alternatively, more specific directions that guided students to generate complete arguments were more effective in facilitating student argumentation. When Nussbaum and Kardash (2005) gave goal directions that varied in generality (i.e., opinion, reason, counterargue/rebut), they found that the group that received more specific directions to persuade by generating reasons, evidence, counterclaims, and rebuttals produced writing that was of better overall quality and more balanced, and these participants offered more counterarguments and rebuttals. In a follow-up experiment, Nussbaum and Kardash (2005) compared the effects of two types of goal directions (i.e., express an opinion or persuade an audience) and the effects of a two-sided non-refutational text. Undergraduate students who were directed to express an opinion and read the text produced essays of better overall quality, wrote more elaborative arguments, and offered more counterarguments in comparison to those who were directed to persuade, as the text stimulated students' thinking.
Directions to persuade had a significant negative effect on the overall quality of argumentation only for students who did not read the text. Researchers raised caution about persuasion directions, as it is possible that students rely on a misconception that they are more effective in convincing an audience by elaborating on their position than by raising counterarguments (Ferretti, MacArthur, & Dowdy, 2000; Nussbaum & Kardash, 2005). In order to promote more balanced and reflective reasoning, researchers have utilized other structure supports such as collaborative reasoning discussions and visual argumentation aids, including computer-scaffolding tools, outlines, and diagrams.
Researchers have investigated the effect of collaborative argumentation, both computerized and face-to-face, on students' critical reasoning and argumentation within the context of problem solving tasks. In two exemplar studies, researchers examined the effects of argumentation scaffolds and question prompts on the quality of argumentation, group problem solving performance, and transfer to individual problem solving (Cho & Jonassen, 2002; Oh & Jonassen, 2007). Typical argumentation scaffolds included sentence openers that helped students explicate a solution, agree or disagree with a solution, put forward evidence, and elaborate on the solution. In addition, guidance questions functioned as scaffolds (e.g., How can you verify the accuracy or value of your solution?) in the collaborative discussion environments. Researchers found that the argumentation scaffolds improved the quality of the discussion in terms of the number of argument components, including claims on how to solve the problem and evidence to support the solution (Cho & Jonassen, 2002; Oh & Jonassen, 2007). There was also improvement in the overall quality of problem solving subprocesses, including problem definition, selection of relevant information, hypothesis generation and testing, and solution development and evaluation. However, in both studies the researchers did not detect significant transfer effects of argumentation scaffolding on individual problem solving. This suggests that learners may need long-term and more comprehensive opportunities for extended engagement in collaborative problem solving to effectively transfer and apply argumentation skills in individual problem solving.
Collaborative discourse was also effective in facilitating argumentation when combined with instruction on basic argumentation concepts and the reading of multiple texts. In a study of middle school students, Marttunen and Laurinen (2006) found that after reading three texts and participating in pair conversations on the topic of genetically modified organisms, the student-constructed argumentation diagrams were more elaborative and reflective, as they included more themes and arguments. Moreover, Kim (2001) found that incorporating a metacognitive group monitoring activity in collaborative reasoning discussions contributed to more dialogic and reflective student writing. In addition, counterarguments and rebuttals increased in the post-discussion essays, and the essays provided evidence that students were attentive to their reasoning by reflecting on and evaluating their position. In conclusion, guided opportunities in which learners participate in argument-based discourse facilitate the development and internalization of argumentation knowledge and skills, and improve both the quality of the arguments and the peer dialogues.
Visual argumentation aids such as argument diagrams have been utilized by researchers to promote coherent and organized argumentation with well-integrated arguments and counterarguments in support of a final position. In a series of studies, Nussbaum and Schraw (2007) examined the effectiveness of a graphic organizer in guiding more balanced and reflective argumentation. They found that both instruction about the criteria of a good argument and the graphic organizer improved the quality of writing, increased the number of counterarguments, and raised the overall integration score. However, the students who used the graphic organizer preferred to apply refutation as an integration strategy, in comparison with students who received criteria instruction and primarily weighed and synthesized opposing perspectives into a creative position. In a follow-up study that aimed to help students become more metacognitively reflective and explore perspectives on an issue before integrating them into a final position, Nussbaum (2008) modified the graphic organizer into an Argumentation Vee Diagram (AVD). In this study, Nussbaum examined whether an elaborative intervention that utilizes the diagram along with instruction on how to integrate arguments and counterarguments, and discussion of the criteria for evaluating the strength of arguments and counterarguments, results in better argumentation and has a transfer effect. The experimental group improved their writing in terms of integration over three sessions, using the synthesis strategy most frequently, but there was no significant transfer effect to a task in which the diagram was removed. In another study, when fifth graders collaboratively used an argument diagram, they generated more coherent arguments than when they collaborated to list pro-con positions (Schwarz, Neuman, & Biezuner, 2000). Thus, these studies provide evidence that argument diagrams have the potential to stimulate consideration of counterarguments and facilitate more elaborated and coherent argumentation, but more research is needed to determine whether the use of diagrams enhances reflective and critical thinking about complex issues.
Virtual graphic tools have also been used to support student argumentation and engagement in critical discussions. Typically, computerized argumentation diagrams have the capacity to represent both the components of an argument and relations of support and disagreement (Jonassen & Kim, 2010). In their study of the VCRI argumentation tool, Munneke, van Amelsvoort, and Andriessen (2003) examined the role of argumentative diagrams, constructed in advance individually or collaboratively during an electronic discussion, in supporting student interaction with the purpose of writing a collaborative text on genetically modified organisms. The researchers found that diagrams constructed in advance helped students focus their subsequent discussions on argumentation and were used as information sources; collaborative diagrams were also used for note-taking to summarize the discussion. Thus, the diagram helped to maintain focus and functioned as an aid for organizing and maintaining coherence during the discussion. Munneke and colleagues (2003) noted, though, that even though the diagrams stimulated collaborative discussions, argumentation remained one-sided, as most diagrams were very unbalanced. Another study, conducted by Easterday, Aleven, and Scheines (2007), provided further evidence of the effectiveness of argument diagrams as a graphic organizer for argumentation. Learners in this experimental study who analyzed public policy problems using a causal diagram organized their perceptions of the arguments better than students who only read about the problem in a text. However, students who used the diagramming tool learned more about constructing causal arguments, as they were engaged in a more constructive activity while using the tool to formulate their arguments. As Newell and colleagues (2011) argued in their review of studies on teaching and learning argumentation, diagrams, printed or virtual, help learners manage the complexities of argumentation, especially the task of considering alternative perspectives and integrating arguments with counterarguments, but more evidence is needed to determine whether they facilitate more critical and reflective thinking.
1.3 Purpose of the Study
The purpose of the study was to examine creative problem solving performance in undergraduate students and to compare how alternative tasks (i.e., explanation or argumentation) support reflective self-evaluations of creative solutions. Two research questions guided our investigation:

1. How do individual differences in divergent thinking, need for cognition, beliefs about creative outcomes, and academic major impact the creativity of a solution with respect to its (a) originality and (b) effectiveness?

2. To what extent does a reflective task (i.e., an explanation task or an argumentation task) differentially support the students' self-evaluation of their creative solution?
Based on the review of the literature, the following five hypotheses were forwarded:
Hypothesis 1.1.
Students who are strong divergent thinkers and high in need for
cognition will propose creative solutions that are both original
and effective.
Hypothesis 1.2.
Students who conceptualize creative solutions as both original
and effective will develop a solution that is highly effective
and may or may not be as original. These students will also
evaluate their creative solutions more positively.
Hypothesis 1.3.
Students who possess more extensive prior knowledge, based on
their academic major, will develop highly effective solutions.
Hypothesis 2.1. For
students who complete the argumentation task, the effectiveness
of the proposed creative solution will strongly and positively
predict the self-evaluation of the solution with respect to its
effectiveness.
Hypothesis 2.2. For
students who respond to the explanation task, the effectiveness
of their proposed creative solution will be less predictive of
their self-evaluation of the effectiveness of the solution.
2. Method
The purpose of this study was to explore creative problem solving performance in undergraduate students and compare alternative tasks that support reflective self-evaluations of their proposed creative solutions. The study used a single-factor, between-groups design with two comparison groups (i.e., Explanation or Argumentation Task).

For this study, participants were recruited from an undergraduate educational psychology course at a public research university in the United States. The completion rate was 82%, with 103 volunteers completing the study. The sample comprised primarily sophomores (52%) and freshmen (30%); the majority were female (n = 88), and more than half of the participants were education majors (57%), in comparison to 43% non-education students (e.g., communication sciences and disorders, kinesiology). The demographics were comparable to most introductory courses required for teacher certification.
2.2.1 Demographics
Respondents completed a demographic cover page in which they provided background information including their academic major, academic classification, courses they had completed in preparation for the transition to college, and courses they had enrolled in or completed pertaining to curriculum and instruction. Participants also listed and described their teaching experiences.
2.2.2 Divergent Thinking
The two divergent thinking tasks were derived from the tasks in Guilford's Consequences Test A (Christensen, Merrifield, & Guilford, 1953). For each task, students had two minutes to generate as many results as possible for each of these hypothetical scenarios: (a) What would happen if a new invention makes it unnecessary for people to eat? (b) What would happen if a new invention makes it unnecessary for people to sleep? The responses were scored for ideational fluency, operationalized as the number of distinct valid ideas recorded by a respondent, excluding any duplicates or irrelevant ideas due to a misinterpretation of the scenario. On average, participants generated M1 = 6.11 (SD = 2.46) and M2 = 5.52 (SD = 2.27) ideas on the two divergent thinking tasks. Due to their marginal internal consistency (α = .63), the two scores were entered as separate divergent thinking indicators in the data analysis.
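To make the fluency operationalization concrete, the following is a minimal sketch in Python (ours, not the study's scoring procedure, which relied on human scorers); the is_valid predicate is a hypothetical stand-in for the human validity judgment:

```python
# A hypothetical sketch of ideational fluency scoring: count distinct
# valid ideas, dropping duplicates and off-task responses.
def ideational_fluency(ideas, is_valid):
    """ideas: list of free-text responses from one participant.
    is_valid: callable judging whether a normalized idea is on-task;
    in the study this judgment was made by human scorers."""
    distinct = set()
    for idea in ideas:
        normalized = " ".join(idea.lower().split())  # trim/collapse spaces
        if is_valid(normalized):
            distinct.add(normalized)
    return len(distinct)

# Example: four responses with one duplicate and one off-task idea.
responses = ["no farming", "No farming", "more free time", "pizza is tasty"]
print(ideational_fluency(responses, lambda s: s != "pizza is tasty"))  # -> 2
```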
2.2.3 Beliefs Questionnaire
A 28-item Likert scale designed for the purposes of this study was administered to gauge participants' beliefs about creative outcomes with reference to a creative course. Twelve items targeted characteristics of a creative course related to its (a) originality (i.e., innovative, unusual, original, novel, unique, and imaginative) and (b) effectiveness (i.e., successful, affordable, effective, implementable, goal-directed, and feasible). These characteristics are recurring terms that describe creative outcomes in the extant literature on creativity and creative problem solving. The remaining 16 items were distracters. An example of an item on the belief scale is: "Creative high school courses are implementable." Participants rated the belief scale items on a scale ranging from Not Very (0) to Very (5) to indicate how typical each characteristic is of a creative course.
A factor analysis with principal axis factoring (PAF) and a Promax rotation was conducted with the 12 characteristics of creative courses to determine the underlying structure of the belief scale. The Promax rotation was selected because it aids the interpretation of the factor analysis results when the factors are believed to be correlated, as was the case here (r12 = .52).
Three factors were extracted that exceeded the eigenvalue criterion of 1.0 based on the Kaiser-Guttman rule (Guttman, 1954; Kaiser, 1960) and explained 37.066%, 10.688%, and 6.996% of the variance, respectively. However, only two underlying factors were detected in the scree plot. The characteristic affordable was the only one that loaded on the third factor, which explained 6.996% of the variance. This characteristic was the only item that targeted financial aspects of a creative course, which is possibly why it failed to load on the first two factors that represented the effectiveness and originality of a course. Thus, this item was removed and a second factor analysis was conducted with the 11 items. Two factors emerged from the final factor analysis, with eigenvalues of 4.828 and 1.497, which explained 39.984% and 9.917% of the variation in the data, respectively. As evidenced in Table 1, nine items had loadings greater than the Harman criterion value of .40.
The two detected factors represent underlying characteristics of creative courses, with the first factor representing the effectiveness dimension and the second factor representing the originality dimension of a creative course (r12 = .57). Characteristics that underlie the effectiveness of a creative course included successful, effective, innovative, implementable, and feasible. Characteristics that underlie the originality dimension included imaginative, unique, novel, and original. A belief scale with the nine items was formulated with acceptable internal consistency (α = .87). The internal consistencies of the two component subscales were α1 = .84 for effectiveness and α2 = .81 for originality. The composite score for the entire belief scale ranged from 0 to 45, with higher scores indicating beliefs in agreement with current conceptualizations of creative outcomes in the literature. The average belief score was M = 26.89 (SD = 7.29), suggesting that participants' beliefs were in moderate alignment with these conceptualizations.
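For readers who want to reproduce this type of analysis, here is a minimal sketch (not the authors' code) using the Python factor_analyzer package; the data below are random stand-ins for the participants' ratings, and method="principal" is used as an approximation of principal axis factoring:

```python
# A hypothetical sketch of the two-factor PAF/Promax analysis described above.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

items = ["successful", "effective", "innovative", "implementable",
         "feasible", "imaginative", "unique", "novel", "original",
         "unusual", "goal_directed"]
# Stand-in data: in the study this would be the 103 participants'
# 0-5 ratings on the 11 retained belief items.
rng = np.random.default_rng(0)
belief_items = pd.DataFrame(rng.integers(0, 6, size=(103, len(items))),
                            columns=items)

fa = FactorAnalyzer(n_factors=2, method="principal", rotation="promax")
fa.fit(belief_items)

eigenvalues, _ = fa.get_eigenvalues()
loadings = pd.DataFrame(fa.loadings_, index=items,
                        columns=["Effectiveness", "Originality"])
print(eigenvalues[:2])    # the study reported 4.828 and 1.497
print(loadings.round(3))  # compare with Table 1
```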
Table 1
Coefficients for the Factor Analysis with Promax Rotation for the Beliefs Scale

Characteristic         | Effectiveness | Originality
---------------------- | ------------- | -----------
Successful             | .998          | -.153
Effective              | .959          | -.195
Innovative             | .605          | .070
Implementable          | .509          | .181
Feasible               | .440          | .169
Imaginative            | .004          | .812
Unique                 | .081          | .757
Novel                  | .095          | .619
Original               | .233          | .619
Unusual                | -.231         | .367
Goal-directed          | .325          | .307
Eigenvalues            | 4.828         | 1.497
Percentage of Variance | 39.984        | 9.917

Note. Factor loadings > .40 are in boldface.
2.2.4 Need for Cognition Scale
The 18-item abbreviated Need for Cognition Scale (Cacioppo, Petty, & Kao, 1984, p. 306) was administered (α = .79) to assess participants' tendency to engage in and enjoy effortful cognitive endeavours. An example item from the scale is: "I would prefer complex to simple problems." Participants rated the statements on a scale ranging from Not Very Much (0) to Very Much (5). A composite Need for Cognition score was calculated, ranging from 0 to 90. On average, participants manifested moderate to low need for cognition, M = 48.5 (SD = 10.46).
2.2.5 Solution Self-Evaluation Questionnaire
Finally, participants evaluated their creative course on a 16-item Likert scale questionnaire developed for the study, which consisted of two distracter items and 14 items that represented criteria of a creative solution with respect to its originality (i.e., innovative, unusual, original, imaginative, novel, unique, and risky) and effectiveness (i.e., effective, successful, affordable, implementable, goal-directed, feasible, and organized). These items reflected descriptive characteristics of creative outcomes identified in the extant theoretical and empirical literature on creativity and creative problem solving. Participants rated how creative their proposed solution was on the aforementioned characteristics on a scale ranging from Not Very (0) to Very (5).

A factor analysis with principal axis factoring (PAF) and a Promax rotation was conducted with the 14 items after the distracters were removed. Two factors emerged with eigenvalues of 6.086 and 2.452, which explained 40.58% and 14.01% of the variance, respectively. The factor intercorrelation was moderate (r12 = .49). Table 2 summarizes the loadings on the two factors based on the pattern matrix.

Based on the results of the factor analysis, two subscales were formulated: the originality self-evaluation subscale and the effectiveness self-evaluation subscale, with seven items each. Both subscales ranged from 0 to 35 and had acceptable internal consistency indices of α1 = .87 and α2 = .88, respectively. On average, participants evaluated their proposed course solution as low in originality, M = 18.96 (SD = 6.73), and moderate in effectiveness, M = 26.69 (SD = 5.38).
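As an illustration, a subscale reliability check of this kind takes only a few lines with the Python pingouin package; this sketch is ours, and the item data below are random stand-ins for the participants' actual ratings:

```python
# A hypothetical sketch of the internal consistency check for the two
# self-evaluation subscales, using pingouin's Cronbach's alpha.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(0)
# Stand-in 0-5 ratings on the seven items of each subscale.
originality_items = pd.DataFrame(rng.integers(0, 6, size=(103, 7)))
effectiveness_items = pd.DataFrame(rng.integers(0, 6, size=(103, 7)))

alpha_orig, _ = pg.cronbach_alpha(data=originality_items)
alpha_eff, _ = pg.cronbach_alpha(data=effectiveness_items)
print(round(alpha_orig, 2), round(alpha_eff, 2))  # study reported .87 and .88
```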
Table 2
Coefficients for the Exploratory Factor Analysis with Promax Rotation for the Self-Evaluation Scale

Characteristic         | Course Effectiveness | Course Originality
---------------------- | -------------------- | ------------------
Effective              | .81                  | .04
Successful             | .78                  | .10
Affordable             | .74                  | -.30
Organized              | .73                  | .10
Goal-directed          | .69                  | .00
Implementable          | .69                  | -.09
Feasible               | .65                  | .04
Unique                 | .08                  | .86
Imaginative            | -.02                 | .80
Unusual                | -.13                 | .75
Original               | .13                  | .68
Novel                  | .06                  | .68
Risky                  | -.40                 | .60
Innovative             | .34                  | .56
Eigenvalues            | 6.09                 | 2.45
Percentage of Variance | 40.58                | 14.01

Note. Factor loadings > .40 are in boldface.
2.3 Procedure
Participants completed the study through an online survey system (Qualtrics) that randomly assigned them to one of two conditions: the Explanation (n1 = 53) or the Argumentation (n2 = 50) task. The study was self-paced, and students completed it in one sitting. Students first provided demographic information and responded to two counterbalanced divergent thinking tasks. Next, they completed the beliefs questionnaire and the Need for Cognition Scale. Then all participants read the same problem scenario and, in response to it, developed a creative course as a solution to the problem described in the scenario. Next, participants responded to a reflective task (i.e., an explanation or an argumentation task) about their proposed creative course. Finally, all participants evaluated the creativity of their course by rating it on a scale with a set of characteristics that describe the originality and effectiveness of a creative solution.
2.4 Problem Solving Task
The problem scenario was originally developed by Hunter and his colleagues (2008) for a study of undergraduate students' idea generation and problem solving. The specific scenario was selected for two reasons: (a) it had been previously used with undergraduate students and had yielded acceptable interrater agreement scores (0.70-0.80) with respect to the originality and quality scores assigned to the solution, and (b) the embedded problem solving task is ill-structured, as it requires students to extract the important and relevant information from the scenario, identify the parameters and constraints for solving the problem, apply personal beliefs about creative courses and creative teaching, draw on their knowledge and experiences to define the problem, make judgments, and establish criteria for evaluation.

The scenario required participants to assume the role of a high school teacher asked to design a creative college preparatory course for the high school's seniors to better prepare them for college and reduce the college dropout rate among the high school's graduates. The final paragraph in the problem scenario explained the task: "In her description of the requirements for the course, the principal makes one point very clear, the senior prep course needs to be a creative high school course designed to prepare the high school students for college. She emphasized that you need to take a creative approach in designing and teaching the course. The principal has asked you to 1) identify the overall goal of the course and 2) list and describe the specific learning activities that you will include in the course." For the purposes of the study, we modified the problem scenario in two ways. First, any descriptions or conceptualizations of a creative course were removed so that participants would rely on their own beliefs and understandings of a creative course. Still, we emphasized in the scenario that the problem solver needs to take a creative approach in designing and teaching the course. Second, in the final paragraph we identified the two specific tasks that participants had to complete after reading the scenario.
We conducted two pilot studies followed by two focus group discussions in order to gather evidence for the comprehensibility and face validity of the problem scenario and the problem solving task. The pilot study participants were representative of the sample (i.e., non-education and education majors), and in general the students found the directions clear and the scenario understandable. They also acknowledged the authenticity of the problem scenario, as they pointed out that it challenged them to provide a solution to a real-life problem: the fact that high school students are not prepared for the academic, social, and emotional challenges of the transition to college. Pilot study participants said that once they read the problem scenario, they had to pause and reflect on what their needs were when they moved to college and what is important for a student to succeed in college. Overall, the authenticity of the problem scenario and its relevance to students' recent college transition experiences seem to have motivated participants to engage with the task, as they agreed on the importance of designing a high school course to prepare students for college.
2.4.1. Coding
A coding scheme was developed to summarize the responses that participants provided to the problem solving task. Specifically, the scheme was used to code the learning activities participants suggested for their creative high school course. An iterative procedure was followed to develop the coding scheme and establish its validity. The researcher and an independent coder (Coder A) applied a keyword content analysis approach to identify the task-relevant units within a response. A task-relevant unit was defined as any distinct task-relevant statement that captured learning activities that participants generated for their high school course. A learning activity was defined as any learning experience, enactive (i.e., actual doing) or vicarious (i.e., students observe, listen, or are engaged in other ways), designed for the learners to attain an instructional goal such as the acquisition of information, knowledge, skills, abilities, attitudes, and strategies (Antiliou, 2012, p. 66).
The first author began by reading all of the responses to generate an initial set of coding categories summarizing participants' responses to the question prompt. This initial review revealed that participants recorded learning activities as well as assessment activities and other instruction and course design elements, such as materials, educational technology, and learning goals. Next, the researcher provided directions to another colleague (Coder A), who independently developed her own version of the coding scheme. The directions included the problem scenario, the problem solving task, a set of coding guidelines, and an example of a coded response. Coder A then read the entire set of responses to independently generate a second version of the coding scheme.
In the two discussions that followed between the first author and Coder A, the two independent coders analyzed, compared, and synthesized the two alternative coding schemes to generate a merged coding guide that included a coding scheme and a set of guidelines with definitions, assumptions, and decision rules. There was consensus that the coding scheme should capture task-relevant units that identified not only learning and assessment activities but also other instruction/course design elements (e.g., instructional materials, educational technology).
A total of 47 coding categories were included in the coding scheme, summarized under ten overarching categories including discussion-based activities, problem solving activities, experiential learning activities, and reading and writing assignments (see Table 3 for the complete list of categories). The 47 coding categories represented learning or assessment activities, as well as other instruction/course design elements. Six of the 47 coding categories were further divided into additional, more specific subcodes. For example, the category modelling had two subcodes: instructor model and other models. Also, the category expository writing had two subcodes: extended and brief. For coding categories with more specific subcodes, the coding was done by applying the more specific subcode. The coding guide was used for a trial coding (10% of the responses), followed by a discussion to resolve potential differences, refine the scheme, and clarify the decision rules.
Following the development and validation of the coding scheme, the intercoder agreement for the reliability of the coded responses was examined. The researcher and Coder A independently coded another 20% of the responses for this purpose. In each response, the coders (a) identified the total number of task-relevant units and (b) coded the task-relevant units, including the learning activities or other instruction/course design elements. A total of N = 349 valid task-relevant units were recorded by the participants. The intercoder agreement was α = .79 for the total number of task-relevant units and α = .72 for the type of code assigned to a task-relevant unit. Both indices were above the moderate criterion of .70 selected for the conservative Krippendorff's alpha (Kalpha) coefficient of intercoder agreement. In a discussion that followed, the coders first resolved disagreements on the number of task-relevant units and then disagreements on the assigned codes in order to reach consensus.
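For illustration, the Kalpha computation can be reproduced with the Python krippendorff package; the sketch below is ours, and the code assignments in it are made up rather than the study's data:

```python
# A hypothetical sketch of the Krippendorff's alpha (Kalpha) check for
# two coders assigning nominal codes to the same units.
import krippendorff

# Each row is one coder; each column is one task-relevant unit.
# The numeric code IDs below are invented for illustration.
reliability_data = [
    [1, 3, 3, 7, 2, 5, 5, 4],  # researcher
    [1, 3, 4, 7, 2, 5, 5, 4],  # Coder A
]
alpha = krippendorff.alpha(reliability_data=reliability_data,
                           level_of_measurement="nominal")
print(round(alpha, 2))  # the study's acceptance criterion was >= .70
```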
2.4.2. Scoring
The creativity of the course that each participant proposed was operationalized with respect to the average originality and effectiveness of its valid task-relevant units. Each valid task-relevant unit that a participant recorded was scored for its originality and effectiveness. Among the valid task-relevant units, 315 units were classified as learning or assessment activities. Another 34 task-relevant units represented instruction/course design elements, which referred to aspects of instruction or course design including materials, educational technology, the structure of the course, and the learning environment.
Originality: Originality was defined as the rareness of occurrence of a task-relevant unit within the pool of valid units (N = 349) generated by all participants. The originality score (x) assigned to a valid task-relevant unit (i) was the rareness proportion of the specific code within the pool of (a) learning/assessment activity units or (b) instruction/course design elements, depending on the nature of the coded unit. For example, a task-relevant unit with the code instructor modelling appeared 55 times, so its proportion of occurrence within the pool of learning/assessment activities was 55/315 = 0.17 and its rareness of occurrence was 1 - 0.17 = .83. Similarly, a task-relevant unit with the code educational technologies appeared 6 times within the pool of the 34 instruction/course design elements; thus its proportion of occurrence was 6/34 = 0.18 and its rareness of occurrence was 1 - 0.18 = .82.
The average originality score for the solution proposed by a participant (j) was: average originality_j = (Σ_i x_ij) / n_j, where n_j is the number of valid task-relevant units in participant j's solution.
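The rareness computation can be expressed compactly in code. The following is a minimal sketch (ours, not the authors' scoring script) of the per-code rareness and the per-participant average defined above:

```python
# A hypothetical sketch of the rareness-based originality scoring.
from collections import Counter

def rareness_by_code(pool_codes):
    """pool_codes: codes for every unit in one pool (e.g., the 315
    learning/assessment activity units). Returns code -> rareness."""
    counts = Counter(pool_codes)
    total = len(pool_codes)
    return {code: 1 - n / total for code, n in counts.items()}

def average_originality(participant_codes, rareness):
    """Mean rareness over one participant's valid task-relevant units."""
    return sum(rareness[c] for c in participant_codes) / len(participant_codes)

# From the text: "instructor modelling" occurred 55 times among the 315
# activity units, so its rareness is 1 - 55/315, roughly .83.
```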
Effectiveness: The potential effectiveness of a learning activity was defined as the degree to which a learning or assessment activity or other instruction/course design element could contribute to a smooth transition to, and academic success during, the first years of college. An effectiveness rubric was developed to operationalize and score each task-relevant unit (see Appendix). The rubric was developed by drawing on the literatures of instructional design and of college transition and persistence (Eggen & Kauchak, 2010; Goldrick-Rab et al., 2007; Louie, 2007; Pritchard et al., 2007; Roe Clark, 2005). The effectiveness scores ranged from Inadequate (0) to Strong (4). The effectiveness of a learning activity or other task-relevant unit was considered Strong if (a) it targeted important information, knowledge, abilities, skills, or strategies for a smooth transition and academic success in the first years of college and (b) it strongly aligned with (i.e., was directly relevant to) the identified overall goal of the course. Examples of potentially effective activities include those that targeted writing, note taking, and test taking skills, but also coping strategies and interpersonal skills.
In order to establish the reliability of the effectiveness scores, another colleague was trained to serve as a second rater using a subset of the pilot data. Next, the researcher and the second rater independently scored two subsets of the data to reach an acceptable interrater agreement level (α = .82). When all valid task-relevant units had been scored for their potential effectiveness, an average effectiveness score for a solution was estimated by applying the following formula: average effectiveness_j = (Σ_i e_ij) / n_j, where e_ij is the effectiveness score of unit i in participant j's solution and n_j is the number of valid task-relevant units in that solution.
2.5.1 Reflective Task
In both experimental conditions, participants completed one of two alternative post-problem solving tasks. Students in the Explanation condition (n1 = 53) were directed to "provide an explanation of their high school course to the school board members." The explanation task required a short written response to this prompt.
Students in the Argumentation condition (n2 = 50) completed an argument diagram. The argumentation diagram was a modified Argumentation Vee Diagram (Nussbaum, 2008), adapted (a) for the online administration of the study and (b) to constrain participants to use weighing as the integration strategy between the arguments and counterarguments. Figure 1 presents the argumentation diagram administered in this study. The overarching question inquired whether the proposed course is a potentially creative course. The participants generated reasons in favour of their creative course and corresponding potential objections of the school board. They were then directed to reread the reasons and objections and decide for each pair whether the reason or the objection was stronger. Participants could offer up to five pairs of reasons and objections.
Figure 1. The argumentation diagram utilized in the study. [Figure available in the PDF version of the article.]
3. Results
The purpose of the study was to examine problem solving performance and identify reflective tasks that better support students' self-evaluations of their proposed creative solutions. Participants completed a set of individual difference measures online before responding to the problem solving task, in which they assumed the role of a high school teacher who was asked to design a creative college preparatory course for the high school's senior students. The participants identified the overall goal of their high school course and generated specific learning activities for the course. Next, they completed an explanation or argumentation reflective task and rated the creative course in terms of its originality and effectiveness. A summary of the creative solutions that participants forwarded is followed by the descriptive statistics and the corresponding statistical models performed to answer the two research questions.
3.1 The Creative Solutions
Participants designed a creative course to reduce the high college dropout rate among the high school's graduates and better prepare them for the transition to college. Participants listed and described specific learning activities that they would implement in their course. Among the most widely referenced learning activities within the pool of valid task-relevant units generated by the respondents were instructor-led activities, in which the instructor or another more experienced individual (e.g., a guest speaker) was responsible for providing instructional support such as presenting content and sharing experiences. Equally popular were activities based on experiential learning, such as simulations, field trips, and student presentations.
Table 3
Frequency of Occurrence of Overarching Categories within Valid Task-Relevant Units (N = 349)

Overarching Category      | Frequency of Occurrence (f) | Percentage of Occurrence (%)
------------------------- | --------------------------- | ----------------------------
Discussion                | 18                          | 5.16
Warm Up                   | 4                           | 1.15
Instructor Led            | 94                          | 26.93
Problem Solving           | 6                           | 1.72
Experiential Learning     | 93                          | 26.65
Research                  | 13                          | 3.72
Writing Assignments       | 58                          | 16.62
Reading Assignments       | 3                           | 0.86
Classroom Assessment      | 26                          | 7.45
Instruction/Course Design | 34                          | 9.74
Other learning activities included writing assignments (i.e., expository, persuasive, reflective, and organizational aids), reading assignments (e.g., textbooks, articles, or reports), and discussions, including student-centered discussions, debates, and discussions with experts. Moreover, participants identified research activities, for example, searching for information about an academic topic or about potential careers and colleges, and learning activities based on problem solving (e.g., decision making).

Classroom assessments such as formative, summative, and diagnostic assessments were included in participants' proposed learning activities (n1 = 26; 7.45%). Several students (n2 = 34; 9.74%) suggested other instructional or course design elements such as materials and educational technologies. It is possible that these students interpreted the prompt more broadly than intended, providing ideas about how they would organize the course and plan instruction to attain the goals of the creative course.
3.2 Predictors of Creative Solutions
For the first research question, we examined the extent to which individual difference variables including divergent thinking, need for cognition, beliefs about creative outcomes, and academic major impact the creativity of a proposed solution in terms of its average originality and potential effectiveness. Across the sample, the mean average originality score was high, M = 0.90 (SD = 0.09), and the mean average effectiveness score was moderate, M = 3.23 (SD = 0.59).

Descriptive statistics for the three continuous predictors are presented in Table 4. With respect to academic major, 57% of the sample were education majors and 43% non-education majors (e.g., communications, kinesiology, or human development). Participants exhibited moderate divergent thinking ability, generating on average six valid ideas. Considerably more variability was evident in participants' need for cognition, which was moderate to low.
Table 4
Means and Standard Deviations of Predictors of Creative Solutions

Variable                | M     | SD    | Range
----------------------- | ----- | ----- | -----
Divergent Thinking (I)  | 6.11  | 2.42  | -
Divergent Thinking (II) | 5.54  | 2.23  | -
Need for Cognition      | 48.50 | 10.50 | 0-90
Beliefs                 | 26.92 | 7.31  | 0-45
Moreover, the mean score for students' beliefs about creative outcomes was M = 26.89 (SD = 7.29), which indicates that participants' beliefs were somewhat in alignment with conceptualizations in the literature. Participants rated highly the characteristics of creative outcomes pertaining to effectiveness [i.e., feasible, M = 3.98 (SD = 1.24); effective, M = 3.42 (SD = 1.04); and successful, M = 3.23 (SD = 1.12)]. They also acknowledged as important the characteristics describing the originality of a creative course, for example, innovative, M = 3.25 (SD = 1.21), and imaginative, M = 3.02 (SD = 1.05). This result suggests that participants took into consideration the context of schooling and appreciated not only the originality but also the effectiveness of a course as an important quality of a creative course.
Two separate regression models were conducted to determine the predictors of the average originality and effectiveness of a solution, since the correlation between these two outcome variables was non-significant (r=.07, p=.5). Because the residuals violated the assumptions of multiple regression, an ordinal regression model was conducted instead to determine the predictors of average solution originality. Thus, the dependent variable was transformed into an ordinal variable with three levels of average originality (i.e., low, moderate, high) to determine the cumulative odds ratio of proposing a solution of high originality. High average originality (≥.94) was manifested by 52 participants, moderate average originality (.86 ≤ y ≤ .93) by 30 participants, and 21 participants scored low (≤.85) in average originality.
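To make this analytic step concrete, the following is a minimal sketch of how the binning and the ordinal (proportional-odds) regression could be carried out. The data file and column names (originality, dt2, beliefs, major, nfc) are hypothetical; this illustrates the general procedure, not the authors' original analysis.

```python
# Minimal sketch of the ordinal regression described above.
# The data file and column names are hypothetical, not from the study.
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

df = pd.read_csv("creative_solutions.csv")  # hypothetical data file

# Bin the continuous average-originality score into the three ordered
# levels reported in the text: low (<= .85), moderate (.86-.93), high (>= .94).
df["orig_level"] = pd.cut(
    df["originality"],
    bins=[0.0, 0.855, 0.935, 1.0],
    labels=["low", "moderate", "high"],
)

# Cumulative-logit (proportional-odds) model with the four predictors.
model = OrderedModel(
    df["orig_level"],
    df[["dt2", "beliefs", "major", "nfc"]],
    distr="logit",
)
result = model.fit(method="bfgs", disp=False)
print(result.summary())  # thresholds, estimates, and Wald tests as in Table 5
```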
Table 5
Predictors of Solution Originality

Term | Variable | Estimate | Wald | p | Confidence Interval
Threshold | Low | -.828 | 0.47 | .49 | [-3.19, 1.53]
Threshold | Moderate | .572 | 0.23 | .63 | [-1.78, 2.93]
Parameter | Divergent Thinking (II) | 0.20* | 4.82 | .03 | [.02, 0.38]
Parameter | Beliefs | 0.01 | .03 | .47 | [-0.05, 0.02]
Parameter | Academic Major | 0.50 | 0.02 | .90 | [-0.73, 0.83]
Parameter | Need for Cognition | -0.01 | 0.51 | .86 | [-0.05, 0.02]
The initial full ordinal regression model was non-significant. After the non-significant predictors were removed stepwise, the ordinal regression model reached significance (-2LL=67.66, χ2(4)=5.08, p=0.02), with divergent thinking (Task II) as the only significant predictor (Table 5). Divergent thinking positively predicted average solution originality: for each unit increase in divergent thinking, participants' cumulative odds of producing a solution of lower originality (low or moderate rather than high) decreased by a factor of 0.82.
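The factor of 0.82 can be recovered from the reported estimate. Assuming the usual cumulative-logit parameterization in which the predictor term is subtracted from the thresholds (as in SPSS-style ordinal regression; this parameterization is an assumption, not stated in the text), exponentiating the negated coefficient gives the multiplicative change in the cumulative odds of the lower originality categories:

```latex
% Proportional-odds model: logit P(Y <= j) = theta_j - beta * x.
% With \hat{\beta} = 0.20 for divergent thinking (Task II), a one-unit
% increase in x multiplies the cumulative odds of a low- or
% moderate-originality solution by:
\[
  \frac{\operatorname{odds}(Y \le j \mid x + 1)}
       {\operatorname{odds}(Y \le j \mid x)}
  = e^{-\hat{\beta}} = e^{-0.20} \approx 0.82 .
\]
```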
A multiple linear regression with the same individual difference variables as predictors was performed (see Table 6) to determine their effect on the average effectiveness of creative solutions. The full model was significant but explained a modest amount of variation [F(4,97)=3.51, p=0.01, R2=0.13]. Academic major and need for cognition positively predicted the average effectiveness of a creative solution. Solutions proposed by education majors were on average 0.23 points more effective than those proposed by non-education majors (β=0.24, p=.02). In addition, each unit increase in need for cognition was associated with a 0.01-point increase in solution effectiveness (β=0.21, p=.03).
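As a rough plausibility check, the unstandardized coefficients in Table 6 can be combined into the fitted equation. Assuming education majors were coded 1 and non-education majors 0 (a coding not stated explicitly in the text), the predicted effectiveness for an education major with sample-average values on the remaining predictors is:

```latex
% Fitted model from Table 6 (unstandardized coefficients):
%   \hat{y} = 2.72 + 0.03 DT - 0.01 Beliefs + 0.23 Major + 0.01 NFC
% Evaluated at DT = 5.54, Beliefs = 26.92, NFC = 48.50, Major = 1:
\[
  \hat{y} = 2.72 + 0.03(5.54) - 0.01(26.92) + 0.23(1) + 0.01(48.50)
  \approx 3.33 ,
\]
% which is close to the observed mean effectiveness of 3.23, as expected
% given the rounding of the reported coefficients.
```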
Table 6
Predictors of Solution Effectiveness

Variable | B | β | p | Confidence Interval
Constant | 2.72** | - | <.001 | [2.18, 3.27]
Divergent Thinking (II) | 0.03 | 0.12 | .22 | [-0.02, 0.07]
Beliefs | -0.01 | -0.10 | .32 | [-0.02, 0.01]
Academic Major | 0.23 | 0.24 | .02* | [0.04, 0.43]
Need for Cognition | 0.01 | 0.21 | .03* | [0.001, 0.02]
3.3 Creative
Solution
Self-Evaluation
In the second research question we explored the effect of the two alternative reflective tasks, explanation and argumentation, on self-evaluations of the creative solution with respect to its originality and effectiveness. A multivariate multiple regression (MMR) was performed with four predictors as covariates and the type of reflective task (1=Explanation, 2=Argumentation) as the fixed factor in the model. The covariates included beliefs about creative outcomes, academic major, and the average assigned originality and effectiveness scores. The MMR analysis was conducted because the two outcome variables, namely the effectiveness and originality self-evaluations, were significantly and positively correlated (r12=.38, p<.001).
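A minimal sketch of how such a multivariate model could be specified follows, again with hypothetical column names (orig_self and eff_self for the two self-evaluation outcomes); statsmodels' MANOVA reports Hotelling's trace alongside the other multivariate statistics summarized in Table 7. This illustrates the general setup, not the authors' analysis.

```python
# Minimal sketch of the multivariate multiple regression (MMR).
# The data file and column names are hypothetical, not from the study.
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

df = pd.read_csv("creative_solutions.csv")  # hypothetical data file

# Two correlated outcomes regressed on the covariates and the
# reflective-task condition (1 = explanation, 2 = argumentation),
# treated as a categorical factor.
mmr = MANOVA.from_formula(
    "orig_self + eff_self ~ beliefs + avg_orig + avg_eff"
    " + C(major) + C(condition)",
    data=df,
)
print(mmr.mv_test())  # Hotelling's trace, Wilks' lambda, etc. per term
```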
Table 7
Predictors of Creative Solutions Self-Evaluations

Effect | Hotelling's Trace | F | p | Partial η2 | Observed Power
Intercept | .20 | 11.67 | .001 | .20 | .99
Beliefs | .27 | 17.31 | .001* | .27 | 1.00
Average Originality | .03 | 1.34 | .27 | .03 | .28
Average Effectiveness | .001 | .001 | .99 | .001 | .05
Academic Major | .02 | .77 | .47 | .02 | .18
Condition | .11 | 5.77 | .004* | .11 | .86
The MMR model was significant [F(2,94)=11.67, p<.001, η2=.20]. The type of reflective task [F(2,93)=5.77, p=.004, η2=.11] and beliefs about creative outcomes [F(2,93)=17.31, p<.001, η2=.27] had significant effects on the self-evaluations (see Table 7). Specifically, the type of reflective task significantly predicted the effectiveness self-evaluations [F(1,94)=11.23, p<.001, η2=.11]. Participants in the argumentation condition rated their creative course 2.81 points lower [p=.001, 95% CI (1.14, 4.47)] than participants in the explanation condition when all other predictors were held equal. Thus, the argument diagram was a structural support that seems to have promoted more conservative self-evaluations of the proposed creative solution.
Participants' beliefs about the characteristics of creative outcomes were a significant positive predictor of the self-evaluations of both the originality [F(1,94)=14.63, p<.001, η2=.14] and the effectiveness [F(1,94)=28.74, p<.001, η2=.23] of a forwarded creative solution. In fact, participants whose beliefs better aligned with current conceptualizations of creative outcomes rated the creativity of their solution higher, in terms of both its originality and its effectiveness.
4. Discussion
Students across education levels are challenged
to acquire complex cognitive skills including creative
thinking. In this study we examined the
individual difference variables that contribute to creative
performance in problem solving with respect to the originality
and effectiveness of a proposed creative solution. In addition, we
attempted to address a gap in the literature related to the
effect of argumentation tasks on the self-evaluation of creative
solutions.
The major contribution of the present study is the development of the creative solution self-evaluation questionnaire, a reliable rating scale that can be administered by teachers and used by students to evaluate creative solutions, ideas, and products against a set of originality and effectiveness criteria. However, the self-evaluation scale needs further validation to determine whether it yields the same underlying structure for creative outcomes across fields. It is possible that additional criteria must be met for an outcome to be judged as creative in a different field, since influential individuals in each field evaluate ideas based on some consensus about an idea's contribution to the field (Andiliou & Murphy, 2010; Csikszentmihalyi, 1999).
Our investigation also contributes to research efforts to identify the cognitive and affective variables that predict the creative performance of novices. The findings aligned with previous research regarding the predictors of creative performance. Divergent thinking has been reported as a predictor of creative problem solving (Diakidoy & Constantinou, 2001; Hunter et al., 2008; Reiter-Palmon et al., 1997), and this ability to generate varied, distinct responses to a divergent thinking task was found in this study to be the single significant predictor of the originality of a proposed creative solution.
The effectiveness of a creative solution was predicted by affective and cognitive variables, specifically need for cognition and academic major. The findings add to existing evidence showing that individuals with high need for cognition perform more effectively when solving complex problems (Butler et al., 2003; Nair & Ramnarayan, 2000; Osburn & Mumford, 2006). However, it is worrisome that participants reported moderate to low need for cognition, since this cognitive disposition to enjoy effortful and challenging endeavours is a prerequisite for lifelong learning and continued professional development, especially for future educators.
Academic major served as a proxy for prior knowledge, and it positively predicted the effectiveness of the creative solution. In the future, researchers who aim to examine creative problem solving in specific disciplines could administer direct measures of domain knowledge, such as the pedagogical/psychological knowledge (PPK) measure of general pedagogical knowledge (Voss, Kunter, & Baumert, 2011), instead of relying on proxies of prior knowledge, which was a limitation of this study.
In the present study we also aimed to investigate the types of tasks that support more reflective self-evaluations of creative solutions. The findings provide some indication that argumentation tasks facilitate more critical self-evaluations of the effectiveness of creative solutions. Participants who completed the argument diagram rated the effectiveness of their course more conservatively than those who responded to the explanation prompt, possibly because the argument diagram provided a structural support for students to elaborate, reflect more deeply, and critically analyze their proposed solution by considering alternative perspectives held by other stakeholders (Jonassen & Kim, 2010; Nussbaum & Sinatra, 2003; Suthers, 2001). Further, Andriessen (2006), citing Baker, argued that argumentation is a mechanism through which students not only provide explanations but also prepare a justification that explicitly describes their rationale, which fosters better reflection. Munneke and colleagues (2003) also argued that, as a knowledge representation tool, a diagram explicitly presents the structure of argumentation, thus providing an overview and making components and perspectives more visible.
The fact that participants were more conservative about their solutions after completing the argumentation diagram provides some evidence for Voss's (1981) idea that argumentation is a mechanism that allows students not only to elaborate and clarify the solution but also to identify potential limitations, thus becoming more critical of their solution. However, further research based on a think-aloud procedure is needed to provide stronger evidence on how students reflect on their solutions and whether the argument diagram itself promotes more reflective self-evaluations.
Given that in the present study the design of the argument diagram guided students to apply the weighing argument-counterargument integration strategy, a follow-up study with a think-aloud methodology would allow a more authentic assessment of the integration strategies (e.g., synthesis, refutation, and minimization) that students choose to apply in tasks requiring a creative solution that realizes benefits and minimizes disadvantages.
The findings of the study confirm the important role of beliefs as an affective variable that impacts problem solving with regard to the self-evaluations of a proposed creative solution rather than creative performance per se. In fact, participants whose beliefs about the characteristics of creative outcomes aligned better with current conceptualizations in the literature rated both the originality and the effectiveness of their solutions more positively. This finding signals the need for educators to pay more attention to affective dimensions of learning, including students' beliefs, since these beliefs inform critical thinking processes such as the self-evaluation of solutions. Teachers need to provide students with opportunities to explicate, challenge, and enrich their beliefs through classroom discussions, encounters with creative individuals, and exposure to examples of creative work across domains. When practitioners realize that students' beliefs are narrow or naïve, they can also administer rating scales in advance to provide students with criteria for their self-evaluations. The finding also confirms that ontological beliefs about the nature of creative outcomes play an important role in the self-evaluation process. Educational researchers have shown interest in examining how epistemological beliefs impact problem solving performance (Lodewyk, 2007; Muis, 2008; Oh & Jonassen, 2007). Further research could be conducted in other knowledge domains, using approaches such as think-aloud protocols, interviews, and classroom discourse, to provide additional evidence on the role of ontological beliefs in creative problem solving, in which learners have to draw on their creativity beliefs to define the problem and establish criteria to evaluate a potentially creative solution.
"Thinking as argument is implicated in the beliefs that people hold, the judgments they make, and the conclusions they come to; it arises every time a significant decision must be made" (Jonassen & Kim, 2010, p. 439). Drawing on the findings of this study, we encourage educators who aim to facilitate students' critical thinking to use argument-based tasks in the form of diagrams that support students in generating, organizing, and evaluating their arguments and counterarguments, in order to make more reflective evaluations during problem solving.
Keypoints
References
Anderson, L.W., & Krathwohl, D.R.
(Eds.). (2001). A taxonomy of learning, teaching, and
assessment: A revision of Bloom's taxonomy of educational
objectives. New York: Longman.
Andiliou, A., & Murphy, P. K. (2010). Examining variations among researchers' and teachers' conceptualizations of creativity: A review and synthesis of contemporary research. Educational Research Review, 4(3), 201-219.
Andriessen, J. (2006). Arguing to learn. In K. Sawyer (Ed.), Handbook of the learning sciences (pp. 443-459). Cambridge: Cambridge University Press.
Antiliou, A.
(2012). The effect of an
argumentation diagram on the self-evaluation of a creative
solution. (Unpublished doctoral dissertation). The
Pennsylvania State University, University Park, PA.
Basadur, M.,
Runco, M. A., & Vega, L. A. (2000). Understanding how
creative thinking skills, attitudes and behaviors work together:
A causal process model. Journal of Creative Behavior, 34(2),
77-100.
Butler, A. B., Scherer,
L. L., & Reiter-Palmon, R. (2003). Effects of solution
elicitation aids and need for cognition on the generation of
solutions to ill-structured problems. Creativity Research
Journal, 15(2-3), 235-244. doi:10.1207/S15326934CRJ152&3_13
Byrne, C. L., Shipman,
A. S., & Mumford, M. D. (2010). The effects of forecasting
on creative problem-solving: An experimental study. Creativity
Research Journal, 22(2),
119-138.
Cacioppo, J. T., Petty,
R. E., & Kao, C. F. (1984). The efficient assessment of
need for cognition. Journal
of Personality Assessment, 48(3), 306-307.
Chen, C., &
Bradshaw, A. C. (2007). The effect of web-based question
prompts on scaffolding knowledge integration and
ill-structured problem solving. Journal of Research on
Technology in Education, 39(4), 359-375.
Chi, M. T. H. (2000). Self-explaining expository texts: The dual processes of generating inferences and repairing mental models. In R. Glaser (Ed.), Advances in instructional psychology. Hillsdale, NJ: Lawrence Erlbaum Associates.
Cho, K.,
& Jonassen, D. H. (2003). The effects of
argumentation scaffolds on argumentation and problem solving.
Educational Technology Research and Development, 50(3),
5-22.
Christensen, P. R., Merrifield, P. R., & Guilford, J. P. (1953). Consequences Form A-1. Beverly Hills, CA: Sheridan Supply.
Csikszentmihalyi, M. (1999). Implications of a systems perspective for the study of creativity. In R. J. Sternberg (Ed.), Handbook of creativity (pp. 313-335). New York, NY: Cambridge University Press.
Dailey, L.R. & Mumford, M.D.
(2006). Evaluative aspects of creative thought: Errors in
appraising the implications of new ideas. Creativity Research
Journal, 18(3),
367-384.
Diakidoy, I. N., & Constantinou, C.
P. (2001). Creativity in physics: Response fluency and task
specificity. Creativity Research Journal Special Issue:
Commemorating Guilford's 1950 Presidential Address, 13(3-4),
401-410.
Eggen, P. & Kauchak, D. (2010).
Educational Psychology: Windows on Classrooms (8th
ed.). New Jersey: Pearson Education.
Easterday, M.W., Aleven, V., &
Scheines, R. (2007). Tis better to construct or to receive?
Effect of diagrams on analysis of social policy. In R. Luckin,
K. R. Koedinger, & J. Greer (Eds.), Proceedings of the 13th
International Conference on Artificial Intelligence in Education
(pp. 93-100). Amsterdam: IOS.
Felton, M., & Kuhn, D. (2001). The
development of argumentative discourse skill. Discourse Processes, 32(2&3), 135-153.
Ferretti, R. P., MacArthur, C. A.,
& Dowdy, N. S. (2000). The effects of an elaborated goal on
the persuasive writing of students with learning disabilities
and their normally achieving peers. Journal of Educational
Psychology, 92(4),
694-702.
Fonseca, B. A., & Chi, M. T. H. (2011). Instruction based on self-explanation. In R. E. Mayer & P. A. Alexander (Eds.), Handbook of research on learning and instruction. New York, NY: Routledge.
Ge, X., Chen, C., &
Davis, K. A. (2005). Scaffolding novice instructional
designers' problem-solving processes using question prompts in
a web-based learning environment. Journal of Educational
Computing Research, 33(2), 219-248.
Ge, X., & Land, S.
M. (2003). Scaffolding students' problem-solving processes in
an ill-structured task using question prompts and peer
interactions. Educational Technology Research and
Development, 51(1), 21-38. doi:10.1007/BF02504515
Goldrick-Rab, S., Carter, F. D., & Wagner, R. W. (2007). What higher education has to say about the transition to college. Teachers College Record, 109(10), 2444-2481.
Hunter, S. T.,
Bedell-Avers, K. E., Hunsicker, C. M., Mumford, M. D., &
Ligon, G. S. (2008). Applying multiple knowledge structures in
creative thought: Effects on idea generation and
problem-solving. Creativity Research Journal, 20(2),
137-154.
Isaksen, S. G. &
Treffinger, D. J. (1985). Creative Problem Solving:
The Basic Course, Buffalo, NY: Bearly Limited.
Jonassen, D. H. (1997).
Instructional design models for well-structured and
ill-structured problem-solving learning outcomes. Educational Technology:
Research & Development, 45(1), 65-94.
Jonassen, D.H., &
Kim, B. (2010). Arguing to learn and learning to argue: Design
justifications and guidelines. Educational Technology:
Research & Development, 58, 439-457.
Kim, S. (2001). The effects of group
monitoring on transfer of learning in small group discussions.
Unpublished doctoral dissertation, University of Illinois at
Urbana-Champaign.
Kuhn, D. (1991). The
skills of argument. Cambridge, UK: Cambridge University
Press.
Leitão, S. (2003). Evaluating and selecting
counterarguments. Written
Communication, 20, 269-306.
Lodewyk, K. R. (2007).
Relations among epistemological beliefs, academic achievement,
and task performance in secondary school students.
Educational Psychology, 27(3), 307-327.
Marttunen, M., &
Laurinen, L. (2006). Collaborative learning through argument
visualisation in secondary school. In S. N. Hogan (Ed.), Trends
in learning research. (pp. 119-138). Hauppauge, NY, US:
Nova Science Publishers.
Muis, K. R. (2008).
Epistemic profiles and self-regulated learning: Examining
relations in the context of mathematics problem solving. Contemporary Educational
Psychology, 33,
177-208.
Mumford, M. D., Mobley, M. I., Uhlman, C. E., Reiter-Palmon, R., & Doares, L. M. (1991). Process analytic models of creative thought. Creativity Research Journal, 4, 91-122.
Munneke, L., van Amelsvoort, M., & Andriessen, J. (2003). The role of diagrams in collaborative argumentation-based learning. International Journal of Educational Research, 39, 113-131.
Nair, K. U., &
Ramnarayan, S. (2000). Individual differences in need for
cognition and complex problem solving. Journal of Research
in Personality, 34(3), 305-328.
Newell, G. E., Beach, R.,
Smith, J., & VanDerHeide, J. (2011). Teaching and learning
argumentative reading and writing: A review of research. Reading Research Quarterly,
46(3), 273-304.
Nussbaum, E. M. (2008).
Using argumentation vee diagrams (AVDs) for promoting
argument-counterargument integration in reflective writing.
Journal of Educational Psychology, 100(3), 549-565.
Nussbaum, E. M., &
Schraw, G. (2007). Promoting argument-counterargument
integration in students’ writing. Journal of Experimental
Education, 76,
59-92.
Nussbaum, E. M., &
Kardash, C. M. (2005). The effects of goal instructions and
text on the generation of counterarguments during writing. Journal of Educational
Psychology, 97,
157-169.
Nussbaum, E. M., &
Sinatra, G. M. (2003). Argument and conceptual engagement. Contemporary Educational
Psychology, 28, 384-395. doi:10.1016/S0361-476X(02)00038-3
Oh, S., & Jonassen,
D. H. (2007). Scaffolding online argumentation during problem
solving. Journal of Computer Assisted Learning, 23(2),
95-110.
Osburn, H. K., &
Mumford, M. D. (2006). Creativity and planning: Training
interventions to develop creative problem-solving skills.
Creativity Research Journal, 18(2), 173-190.
Pritchard, M. E., Wilson, G., &
Yamnitz, B. (2007). What predicts adjustment among college
students? A longitudinal panel study. Journal of American College
Health, 56(1),
15-21.
Reiter-Palmon, R.,
Illies, M. Y., Cross, L. K., Buboltz, C., & Nimps, T.
(2009). Creativity and domain specificity: The effect of task
type on multiple indexes of creative problem-solving.
Psychology of Aesthetics, Creativity, and the Arts, 3(2),
73-80.
Reiter-Palmon, R.,
Mumford, M. D., O'Connor Boes, J., & Runco, M. A. (1997).
Problem construction and creativity: The role of ability, cue
consistency and active processing. Creativity Research
Journal, 10(1), 9-23.
Reznitskaya, A., Anderson, R. C., McNurlen, B., Nguyen-Jahiel, K., Archodidou, A., & Kim, S. Y. (2001). Influence of oral discussion on written argument. Discourse Processes, 32(2&3), 155-175.
Roe Clark, M. (2005). Negotiating the freshman year: Challenges and strategies among first-year college students. Journal of College Student Development, 46(3), 296.
Runco, M. A., & Chand, I. (1994).
Problem finding, evaluative thinking, and creativity. In M. A.
Runco (Ed.), Problem finding, problem solving, and
creativity. (pp. 40-76). Westport, CT, US: Ablex
Publishing.
Schwarz, B. B., Neuman, Y., & Biezuner, S. (2000). Two wrongs may make a right ... if they argue together! Cognition and Instruction, 18(4), 461-494.
Shin, N., Jonassen, D.
H., & McGee, S. (2003). Predictors of well-structured and
ill-structured problem solving in an astronomy simulation.
Journal of Research in Science Teaching, 40(1), 6-33.
Suthers, D. D.
(2001). Towards a systematic study of representational guidance
for collaborative learning discourse. Journal of Universal
Computer Science, 7(3), 254–277.
Toulmin, S. E. (1958). The uses of argument.
Cambridge, England: Cambridge University Press.
Uribe,
D., Klein, J. D., & Sullivan, H. (2003). The effect of
computer-mediated collaborative learning on solving ill-defined
problems. Educational Technology Research & Development,
51(1), 5-19.
van Eemeren, F., &
Grootendorst, R. (1999). Developments in Argumentation Theory.
In J. Andriessen & P. Coirier (Eds.). Foundations of
argumentative text processing (pp. 43-57). Amsterdam: Amsterdam University Press.
van Eemeren, F. H., & Grootendorst, R. (1992). Argumentation, communication, and fallacies: A pragma-dialectical perspective. Hillsdale, NJ: Lawrence Erlbaum Associates.
Voss, J. F., & Post, T. A. (1988). On the solving of ill-structured problems. In M. T. H. Chi, R. Glaser, & M. J. Farr (Eds.), The nature of expertise (pp. 261-285). Hillsdale, NJ: Lawrence Erlbaum Associates.
Voss, J. F., Wolfe, C.
R., Lawrence, J. A., & Engle, R. A. (1991). From
representation to decision: An analysis of problem solving in
international relations. In R. J. Sternberg, & P. A.
Frensch (Eds.), Complex problem solving: Principles and
mechanisms. (pp. 119-158). Hillsdale, NJ, England:
Lawrence Erlbaum Associates.
Voss, T., Kunter, M.,
& Baumert, J. (2011). Assessing teacher candidates’
general pedagogical and psychological knowledge: Test
construction and validation. Journal of Educational Psychology, 103(4), 952-969.
Walton, D. (2000). The
place of dialogue theory in logic, computer science, and
communication studies. Synthese, 123, 327-346.
Walton, D. N. (1996). Argumentation schemes for presumptive reasoning. Mahwah, NJ: Lawrence Erlbaum Associates.
Weisberg, R. W. (2006). Expertise and reason in creative thinking: Evidence from case studies and the laboratory. In J. C. Kaufman & J. Baer (Eds.), Creativity and reason in cognitive development (pp. 7-42). New York: Cambridge University Press.
Wiley,
J.,
& Voss, J. F. (1999). Constructing arguments from multiple
sources: Tasks that promote understanding and not just memory
for text. Journal of Educational Psychology, 91(2), 301-311.
APPENDIX
Effectiveness Scoring Rubric
Table 8
Effectiveness
Scoring Rubric for the Coded Task-Relevant Units
4 - Strong: The learning activity or other instruction/course design element
- targets important information, knowledge, abilities, skills, and strategies for academic success in college or a smooth transition to college, and
- strongly aligns with the overall goal of the course.

3 - Moderate (weak/strong OR strong/weak): The learning activity or other instruction/course design element either
- targets somewhat important information, knowledge, abilities, skills, and strategies for academic success in college or a smooth transition to college, and strongly aligns with the overall goal of the course; or
- targets important information, knowledge, abilities, skills, and strategies for academic success in college or a smooth transition to college, and weakly aligns with the overall goal of the course.

2 - Weak (weak/weak): The learning activity or other instruction/course design element
- targets somewhat important information, knowledge, abilities, skills, and strategies for academic success in college or a smooth transition to college, and
- weakly aligns with the overall goal of the course.

1 - Insufficient (weak/inadequate OR inadequate/weak): The learning activity or other instruction/course design element either
- targets somewhat important information, knowledge, abilities, skills, and strategies for academic success in college or a smooth transition to college, and does not align with the overall goal of the course; or
- does not target information, knowledge, abilities, skills, and strategies for academic success in college or a smooth transition to college, and weakly aligns with the overall goal of the course.

0 - Inadequate (inadequate/inadequate): The learning activity or other instruction/course design element
- does not target information, knowledge, abilities, skills, and strategies for academic success in college or a smooth transition to college, and
- does not align with the overall goal of the course.
[1] Two pilot studies (n1=19; n2=9) and focus group discussions were conducted to ensure that the modified argumentation diagram was comprehensible and that participants were able to complete it. Please contact the first author for information on the pilot studies.