What is Instructional Design?
Instructional Design is the systematic process of
translating general principles of learning and instruction into plans for
instructional materials and learning.
Instructional Design as a Process:
Instructional
Design is the systematic development of instructional specifications using
learning and instructional theory to ensure the quality of instruction. It is
the entire process of analysis of learning needs and goals and the development
of a delivery system to meet those needs. It includes development of
instructional materials and activities; and tryout and evaluation of all
instruction and learner activities.
Instructional Design as a Discipline:
Instructional
Design is that branch of knowledge concerned with research and theory about
instructional strategies and the process for developing and implementing those
strategies.
Instructional Design as a Science:
Instructional
Design is the science of creating detailed specifications for the development,
implementation, evaluation, and maintenance of situations that facilitate the
learning of both large and small units of subject matter at all levels of
complexity.
Instructional Design as Reality:
Instructional
Design can start at any point in the design process. Often a glimmer of an idea
is developed to give the core of an instructional situation. By the time the
entire process is done, the designer looks back to check that all parts of the
"science" have been taken into account. Then the entire process is written up
as if it had occurred in a systematic fashion.
Instructional Design is the practice of arranging media (communication technology) and content
to help learners and teachers transfer knowledge most effectively. The process
consists broadly of determining the current state of learner understanding,
defining the end goal of instruction, and creating some media-based
"intervention" to assist in the transition. Ideally the process is
informed by pedagogically tested theories of learning and may take place in
student-only, teacher-led or community-based settings. The outcome of this
instruction may be directly observable and scientifically measured or
completely hidden and assumed.
As a field,
Instructional Design is historically and traditionally rooted in cognitive and
behavioural psychology. However, because it is not a regulated field, and
therefore not well understood, the term 'instructional design' has been
co-opted by or confused with a variety of other ideologically-based and / or
professional fields. Instructional Design, for example, is not Graphic Design
although graphic design (from a cognitive perspective) could play an important
role in Instructional Design.
Instructional Design Process
STEP 1: ANALYZE.
- Goal - One of the
keys to successful instructional design is beginning with a clear picture
of your desired end result. In other words, you have to know exactly where
you want to go!
Begin by reviewing the overall goal of your
technology project. Consider the following questions before formulating and
writing your goal statement on the planning form:
- Why are you
doing this project?
- How do you
hope this project will enhance learning for your students?
- What learning
challenge(s) is this project expected to conquer?
- Audience - Another key
to successful instructional planning is having at least a general idea of
the learning characteristics and needs of the students.
Continue your analysis by listing the
probable characteristics of students who will be the target audience for your
project. Consider the following questions to help guide your thinking as you
develop your learner profile:
- What
classification of students generally take this course?
- Are most of
them majors or non-majors in the discipline?
- What have
they struggled with most in the past?
- Why do most
of them take the course (general education, major requirement, elective,
etc.)?
- How much
background knowledge do they typically have on the subject?
- Generally
speaking, what are their attitudes toward the course content?
- What is the
extent of prior experience with the content for most students who take
the course?
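The answers to these audience questions can be kept together as a simple learner profile that travels with the project. The sketch below is illustrative only; the class and field names are assumptions, not part of any standard analysis form.

```python
# A minimal, hypothetical learner-profile structure for recording the
# audience analysis; every field mirrors one of the questions above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class LearnerProfile:
    classification: str                    # who generally takes the course
    mostly_majors: bool                    # majors or non-majors?
    past_struggles: List[str] = field(default_factory=list)
    reasons_for_taking: List[str] = field(default_factory=list)
    background_knowledge: str = "low"      # low / moderate / high
    attitude_toward_content: str = "neutral"
    prior_experience: str = "little"

# Example profile that would feed the design phase
profile = LearnerProfile(
    classification="first-year non-majors",
    mostly_majors=False,
    past_struggles=["abstract terminology", "applying concepts to cases"],
    reasons_for_taking=["general education requirement"],
)
print(profile.past_struggles)
```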
STEP 2: DESIGN AND DEVELOP
- General topics - The first
step in designing your specific learning outcomes is to define the scope
of the project. You began thinking about the scope when you stated the
overall goal. Continue by listing the major topics of information and/or
knowledge you expect students to study.
Before listing the general topics that will
define the scope of your project, consider the following questions:
- What is the
big picture?
- What are the
major topics studied in this class?
- What topics
are listed on the syllabus?
- What are the
general chapter headings in the textbook?
- "Performance-Based"
Learning Outcomes - The terms listed below are essentially
synonymous. They refer to course goals that:
- specify the information and/or
skills to be mastered AND
- specify what students will do
to demonstrate mastery.
- learning
outcomes
- performance-based
outcomes
- learning
objectives
- performance-based
learning outcomes
- course
objectives
- performance-based
objectives
- performance
outcomes
- performance-based
learning objectives
Once
developed, these learning outcomes are included in the course syllabus for two
reasons. First, they clarify for students exactly what they will be expected to
learn. Second, they tell students exactly what they will have to do to earn
grades reflecting various levels of mastery.
When
developing performance-based learning outcomes, it is important to keep the
following distinction in mind:
- activities
designed to help students master information and skills ARE DIFFERENT FROM
- activities
designed to allow students to demonstrate the extent to which they have
mastered the information and skills
Instructional
Design Taxonomies
- Bloom's Cognitive Taxonomy
- evaluation - judge
value of ideas, appraise, predict, assess, select, rate, choose
- synthesis - put
together parts, compose, construct, formulate, manage, prepare, design,
plan
- analysis - dissect
parts, detect relationships, diagram, compare, differentiate, criticize,
debate
- application - use
methods, concepts, principles, apply, practice, demonstrate, illustrate,
operate
- comprehension -
understand information, discuss, explain, restate, report, tell, locate,
express, recognize
- knowledge - recall
information, define, repeat, list, name, label, memorize
- Krathwohl's Affective Taxonomy
- characterizing - incorporates ideas completely into practice, is recognized by consistent use of them
- organizing - commits
to using ideas, incorporates them into activity
- valuing - thinks
about how to take advantage of ideas, able to explain them well
- responding - answers
questions about ideas
- receiving - listens
to ideas
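To put these levels to work when drafting outcomes, the taxonomy can be kept as a small lookup of level names and action verbs. The sketch below is illustrative only; the variable and function names are assumptions.

```python
# Hypothetical lookup of Bloom's cognitive levels and the action verbs
# listed above, used when wording performance-based outcomes.
BLOOM_COGNITIVE = {
    "knowledge":     ["define", "repeat", "list", "name", "label", "memorize"],
    "comprehension": ["discuss", "explain", "restate", "report", "locate"],
    "application":   ["apply", "practice", "demonstrate", "illustrate", "operate"],
    "analysis":      ["diagram", "compare", "differentiate", "criticize", "debate"],
    "synthesis":     ["compose", "construct", "formulate", "design", "plan"],
    "evaluation":    ["appraise", "predict", "assess", "select", "rate", "choose"],
}

def suggest_verbs(level: str) -> list:
    """Return action verbs for wording an outcome at the given Bloom level."""
    return BLOOM_COGNITIVE.get(level.lower(), [])

print(suggest_verbs("analysis"))  # ['diagram', 'compare', ...]
```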
Objectives
of Instructional Design
Learning is an active process in which learners construct
new ideas or concepts based upon their current/past knowledge. The learner
selects and transforms information, constructs hypotheses, and makes decisions,
relying on a cognitive structure to do so. Cognitive structure (i.e., schema,
mental models) provides meaning and organization to experiences and allows the
individual to "go beyond the information given".
Thus,
- Instruction must be
concerned with the experiences and contexts that make the student willing
and able to learn (readiness).
- Instruction must be
structured so that it can be easily grasped by the student (spiral
organization).
- Instruction should be
designed to facilitate extrapolation and/or fill in the gaps (going beyond
the information given).
Foster a learning culture
1. Offer training, within an overall culture that encourages cooperation,
risk-taking, and growth.
2. Get learners' buy-in and commitment in achieving training goals.
Motivate learners.
3. Demonstrate the value of the training to the learners and cultivate their
sense of confidence in their ability to master the objectives.
Make training problem-centered.
4. Draw on authentic needs and contexts; make requirements of learning tasks
similar to important requirements of job tasks.
5. Encourage learners' active construction of meaning, drawing on their
existing knowledge (Resnick, 1983).
6. Teach multiple learning outcomes together (Gagne & Merrill, 1990).
7. Sequence instruction so that learners can immediately benefit from what
they learn by applying it to real-world tasks.
Help learners assume control of their learning.
8. Provide coaching.
9. Provide scaffolding and support in performing complex tasks.
a. Adjust tools (equipment), task, and environment.
b. Provide timely access to information and
expertise.
c. Provide timely access to performance feedback.
d. Utilize group problem-solving methods.
e. Provide help only when the learner is at an
impasse and only enough help for the learner to complete the task.
10. Fade support.
11. Minimize mean time to help (i.e., provide "just-in-time"
training).
12. Encourage learners to reflect on their actions.
13. Encourage exploration.
14. Encourage learners to detect and learn from their errors.
Provide meaningful "practice."
15. Provide opportunities for learners to apply what they've learned in
authentic contexts. If it is not feasible to practice on real tasks, provide
cases or simulations.
16. Personalize practice (Ross & Morrison, 1988).
Designing for Instructional Events
There are nine instructional events and corresponding
cognitive processes:
- Gaining attention
(reception) - show a variety of examples related to the issue to be
covered ...
- Informing learners of the
objective (expectancy) - pose questions, and outline the objectives
...
- Stimulating recall of
prior learning (retrieval) - review summaries, introductions and
issues covered ...
- Presenting the stimulus
(selective perception) - adopt a definition and framework for
learning/understanding
- Providing learning
guidance (semantic encoding) - show case studies and best practices
...
- Eliciting performance
(responding) - get user-students to create outputs based on issues learnt
...
- Providing feedback (reinforcement)
- check all examples as correct/incorrect
- Assessing performance
(retrieval) - provide scores and remediation
- Enhancing retention and
transfer (generalization) - show examples and statements and ask
students to identify issues learnt ...
These events should satisfy or provide the necessary conditions for learning
and serve as the basis for designing instruction and selecting appropriate
media.
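One convenient way to keep the nine events in view while planning a lesson is to treat them as a checklist. The sketch below is a hypothetical planning aid, not a prescribed tool; the names used are assumptions.

```python
# The nine instructional events and their cognitive processes, turned into
# a per-topic lesson-planning checklist (illustrative only).
NINE_EVENTS = [
    ("Gain attention", "reception"),
    ("Inform learners of the objective", "expectancy"),
    ("Stimulate recall of prior learning", "retrieval"),
    ("Present the stimulus", "selective perception"),
    ("Provide learning guidance", "semantic encoding"),
    ("Elicit performance", "responding"),
    ("Provide feedback", "reinforcement"),
    ("Assess performance", "retrieval"),
    ("Enhance retention and transfer", "generalization"),
]

def lesson_checklist(topic: str) -> list:
    """One checklist entry per instructional event for the given topic."""
    return [f"{i}. {event} ({process}) -- planned activity for '{topic}': TBD"
            for i, (event, process) in enumerate(NINE_EVENTS, start=1)]

for line in lesson_checklist("introductory statistics"):
    print(line)
```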
Factors affecting Learning
Instructional
Design is largely affected by how a user learns:
Meaningfulness effect
Highly meaningful words are easier to learn and remember than less meaningful
words. This is true whether meaningfulness is measured by 1) the number of
associations the learner has for the word, 2) the frequency of the word, 3)
familiarity with the sequential order of letters, or 4) the tendency of the
word to elicit clear images. An implication is that retention will be improved
to the extent that the user can make meaning of the material.
Serial position effects
Serial position effects result from the
particular placement of an item within a list. Memory is better for items
placed at the beginning or end of a list than for those in the middle. An exception to
these serial positions is the distinctiveness effect - an item that is
distinctively different from the others will be remembered better, regardless
of serial position.
Practice effects
Active practice or rehearsal improves retention, and
distributed practice is usually more effective than massed practice. The
advantage to distributed practice is especially noticeable for lists, fast
presentation rates or unfamiliar stimulus material. The advantage to
distributed practice apparently occurs because massed practice allows the
learner to associate a word with only a single context, but distributed
practice allows association with many different contexts.
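The practical consequence for designers is to schedule practice in several short, spaced sessions rather than one long block. The sketch below illustrates one possible spacing scheme with expanding intervals; the interval values and names are assumptions, not research-prescribed figures.

```python
# A hypothetical distributed-practice scheduler: each item is revisited at
# expanding intervals so it becomes associated with several different contexts.
from datetime import date, timedelta

def distributed_schedule(items, start, intervals_days=(1, 3, 7, 14)):
    """Return (item, review_date) pairs spread over expanding intervals."""
    schedule = []
    for item in items:
        for gap in intervals_days:
            schedule.append((item, start + timedelta(days=gap)))
    return sorted(schedule, key=lambda pair: pair[1])

for item, when in distributed_schedule(["term A", "term B"], date(2024, 1, 1)):
    print(when, item)
```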
Transfer effects
Transfer effects are effects of prior learning on the
learning of new material. Positive transfer occurs when previous learning makes
new learning easier. Negative transfer occurs when it makes the new learning
more difficult. The more that two tasks have in common, the more likely that
transfer effects occur.
Interference effects.
Interference effects occur when memory for particular material is impaired by
previous or subsequent learning. They arise when trying to remember material
that has previously been learned. Interference effects are always negative.
Organization effects
Organization effects occur when learners chunk or
categorize the input. Free recall of lists is better when learners organize the
items into categories rather than attempt to memorize the list in serial order.
Levels-of-Processing effects
The more deeply a word is processed, the
better it will be remembered. Semantic encoding of content is likely to lead to
better memory. Elaborative encoding improves memory by making sentences more
meaningful.
State-Dependent effects
State- or context-dependent effects occur because learning takes place within a
specific context, and what is learned is, at least initially, most accessible
within that same context. For example, lists are more easily remembered when
the test situation closely resembles the learning situation, apparently because
contextual cues are available to aid information retrieval.
Mnemonic effects
Mnemonics are strategies for elaborating on relatively meaningless input by
associating the input with more meaningful images or semantic context. Four
well-known mnemonic methods are the place method, the link method, the peg
method and the keyword method.
Abstraction effects
Abstraction is the tendency of learners to pay
attention to and remember the gist of a passage rather than the specific words
of a sentence. In general, to the extent that learners assume the goal is
understanding rather than verbatim memory and the extent that the material can
be analyzed into main ideas and supportive detail, learners will tend to
concentrate on the main ideas and to retain these in semantic forms that are
more abstract and generalized than the verbatim sentences included in the
passage.
Levels effect
This effect occurs when the learner perceives that some
parts of the passage are more important than others. Parts that occupy higher
levels in the organization of the passage will be learned better than parts
occupying low levels.
Prior Knowledge effects
Prior knowledge effects will occur to the
extent that the learner can use existing knowledge to establish a context or
construct a schema into which the new information can be assimilated.
Inference effects
Inference effects occur when learners use schemas or
other prior knowledge to make inferences about intended meanings that go beyond
what is explicitly stated in the text. Three kinds of inferences are case
grammar pre-suppositions, conceptual dependency inferences and logical
deductions.
Student misconception effects.
Prior knowledge can lead to misconceptions. Misconceptions may be difficult to
correct because the learner may not be aware that the knowledge is a
misconception. Misconceptions occur when input is filtered through schemas that
are oversimplified, distorted or incorrect.
Text Organization Effects
Text organization refers to the effects that
the degree and type of organization built into a passage have on the degree and
type of information that learners encode and remember. Structural elements such
as advance organizers, previews, logical sequencing, outline formats,
highlighting of main ideas and summaries assist learners in retaining
information. These organization effects facilitate chunking, subsumption of
material into schemas and related processes that enable encoding as an
organized body of meaningful knowledge. In addition, text organization elements
cue learners to which aspects of the material are most important.
Mathemagenic Effects
Mathemagenic effects, a term coined by Rothkopf (1970), refer to various
things that learners do to prepare and assist their own learning. These effects
refer to the active information processing by learners. Mathemagenic activities
include answering adjunct questions or taking notes and can enhance learning.
Tools to Enable Instructional Strategies
| If you selected one of the following strategies ... | ... then the following technology tools can help enable your strategies: |
| --- | --- |
| A. Conversing, Discussing | e-mail, listservs, discussion boards, chat |
| B. Mentoring, Questioning, Supporting a Partner | e-mail; live, synchronous camera(s) for mentor/mentee to discuss; chat room with white board; digital drop boxes for file sharing and written critiques |
| C. Debating | e-mail, discussion boards, web sites that showcase controversies or experts with opinions and theories; use resources as the basis for discussion, such as www.ideachannel.com |
| D. Impersonating, Role Playing | asynchronous tools (i.e., e-mail, discussion boards, chat) or synchronous tools (i.e., Symposium, CU-SeeMe, live net-cams) |
| E. Sharing Data, Analyzing | e-mail, listservs, spreadsheets, data analysis software |
| F. Developing a New Product or Artifact | web page editors for students; e-mail and other communication tools; digital drop boxes for file sharing; server space to post projects online; tools that allow for voting on or attaching comments to students' work for the purpose of recognizing best or improving weak artifacts |
| G. Traveling Virtually, Situating Curriculum in the Context of Expeditions | a significant grant budget may be required to create live expeditions, consisting of technology to upload live broadcasts to satellites and back down to Internet servers with live audio/video streams; alternatively, quests could be videotaped and delivered at a later time via standard Internet video streaming |
| H. Seeking, Collecting, Organizing, Synthesizing Online Information (Research) | web resources, either individual pages related to a course, or entire archives from which students conduct research to identify topics of interest or relevance to assignments |
| I. Exploring Real World Cases and Problems | web-page editors (e.g., Dreamweaver), photo editors (e.g., Photoshop), perhaps video editors (e.g., Premiere) and knowledge of video streaming for the Internet (e.g., Real Producer) |
| J. Accessing Tutorials with Exercises, Quizzes, Questions, Online Drill-and-Practice | for creating virtual exercises, knowledge of multimedia development programs (e.g., Director, Flash) and/or mechanisms for placing them on the Internet (e.g., Shockwave, Java) |
Assess
Instructional Outcomes
Assessing competencies developed as a result of learning is
critical ... this table shows competencies and ways to measure them
| Competencies | Measures |
| --- | --- |
| evaluation | rubrics, critical thinking scales; rate quality of student arguments, predictions, conclusions |
| synthesis | products or artifacts synthesized by students (web pages, reports); rate according to desired criteria: originality, organizational scheme, appropriate use of evidence versus conjecture |
| analysis | debates, critiques, discussions, case analyses; assess student ability to extract relevant variables underlying a problem, issue, or situation |
| application | word problems, experiments; assess student ability to apply principles and theories to solve novel problems |
| comprehension | short answer questions |
| knowledge | multiple choice, true-false, matching |
| characterizing | practical experiences; interview, observe student beyond class, in real settings |
| organizing | projects, cases |
| valuing | discussions |
| responding | problems, questions |
| receiving | problems, questions |
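Once outcomes are tagged with a taxonomy level, the table above can serve as a lookup for choosing an assessment format. The sketch below is illustrative; the mapping simply restates the cognitive rows of the table, and the function name is an assumption.

```python
# Hypothetical lookup pairing each cognitive competency with the measures
# from the table above, so an outcome's level suggests an assessment format.
MEASURES = {
    "knowledge":     ["multiple choice", "true-false", "matching"],
    "comprehension": ["short answer questions"],
    "application":   ["word problems", "experiments"],
    "analysis":      ["debates", "critiques", "case analyses"],
    "synthesis":     ["student products or artifacts rated against criteria"],
    "evaluation":    ["rubrics", "critical thinking scales"],
}

def pick_measures(outcome_level: str) -> list:
    """Suggest assessment formats for an outcome written at a given level."""
    return MEASURES.get(outcome_level.lower(), ["instructor-designed task"])

print(pick_measures("synthesis"))
```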
The Traditional ID Process
The process which has tended to
guide ID is as follows (Main 1993: 38-39).
Analysis
The aim of this stage is to
determine training needs and produce a needs assessment document. Components
include:
- Goal
analysis: reducing abstract desired
outcomes to specific performances that can be measured;
- Performance
analysis: determining the reasons for
and solutions to the differences between present behaviour and desired
outcomes;
- Target
Population analysis: finding out
the relevant characteristics of the potential learners;
- Task
analysis: specifying and determining the
exact nature of the task the students must learn, analysing it into
sub-divisions, and deciding which aspects can be assumed to be in place
prior to the training;
- Media
Selection: finding the best combination
of media to carry out the training as determined in the other components;
- Cost
analysis: determining the cost of the
project, and tailoring the project to meet budgetary constraints.
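The output of this phase is the needs assessment document. As a rough illustration only, the components listed above can be captured in a single structure so nothing is omitted before design begins; the class and field names below are assumptions.

```python
# A hypothetical needs-assessment record, one field per analysis component.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class NeedsAssessment:
    measurable_goals: List[str]          # goal analysis
    performance_gaps: Dict[str, str]     # present behaviour -> desired outcome
    target_population: str               # learner characteristics
    task_breakdown: List[str]            # sub-tasks to be trained
    assumed_prerequisites: List[str]     # aspects assumed to be in place already
    media_mix: List[str]                 # media selection
    estimated_cost: float = 0.0          # cost analysis

doc = NeedsAssessment(
    measurable_goals=["complete an expense report without errors"],
    performance_gaps={"reports returned for correction": "first-time acceptance"},
    target_population="regional salespeople with little exposure to the form",
    task_breakdown=["gather receipts", "fill in the form", "submit for approval"],
    assumed_prerequisites=["basic spreadsheet use"],
    media_mix=["job aid", "short demonstration video"],
)
```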
Design
The aim of the design phase is to
develop a blueprint of how the finished product will look, and to produce a
storyboard and flowchart of the whole structure of the finished product. There are
several key design issues to be resolved at this stage, including:
- Interface
design: developing a consistent,
user-friendly, attractive layout for the basic controls and functions;
- Sequencing:
deciding on the best educational order in which to place the different
lessons and sub-components;
- Lesson
design: developing the strategies to
be followed within each lesson to best put across the teaching point, with
the emphasis being on retaining motivation and maximising retention;
- Learner
Control: deciding how much control the
learner can have over the lesson flow, and identifying key decision points
in the lesson sequence.
Development
This phase involves the programmers,
graphic artists, writers and subject matter experts filling out the
specifications in the blueprint. During this phase, a working model is usually
developed, and this is then formatively evaluated, with the feedback being
integrated into the ongoing development process. The outcome of this phase
should be the full learning programme.
Implementation and Evaluation
The final two phases involve
delivery of the completed programme to the learners, and evaluation of whether
the goals as set out in the needs assessment are met. Strict controls are
maintained in the delivery to facilitate a coherent summative evaluation.
Problems with this Approach
There are many problems with the
traditional approach to ID.
Rowland et al. (1994) distinguish between rational and creative approaches
to design. The former, epitomised in engineering, emphasises the need for
clearly defined concepts and skills, and prescribes a systematic method for
approaching problems. The creative approach, on the other hand, is based on
flexible, creative solutions to situations which are seen as unique. ID has
tended to follow the rational route, but a move to a more creative methodology
is necessary.
Reigeluth (1996) outlines the
paradigmatic shift from Industrial to Information Age thinking. These changes
happening in the world of work mean that the traditional ID approaches are no
longer capable of delivering what is required. Traditional approaches have
facilitated sorting of learners into standardised categories, thereby promoting
conformity and compliance. This is in direct contrast to what is now vital in
the business world, namely customised learning which allows individuals to
develop their own unique potentials and creativity so as to promote initiative,
diversity and flexibility within the organisation.
Gros et al. (1997: 51) criticise traditional
approaches for two reasons. Firstly, ID theory has been either too specific in
its prescriptions for it to be readily applicable to different situations, or
it has been too general, rendering its solutions vague and impractical.
Secondly, ID models are linear in character. This makes the design process
inflexible and less able to accommodate interactive changes, as is the case
with rapid prototyping. What is needed, instead, is a model that promotes an
iterative approach to ID.
Winn (1997: 36-37) points to the
causal basis of ID theory. The linear design process assumes that human
behaviour in instructional situations is predictable. He advances four
arguments against the predictability of human behaviour.
- All
individuals are different.
- Learners'
metacognitive abilities mean that they can choose to use different methods
of learning; this means that it is impossible to predict which method is
best, and what outcomes will be achieved.
- The
learning environment is very important in determining the outcome. The
designer can never predict what all learning contexts will be like, and so
cannot predict the learners' behaviour.
- People
do not think logically. The designer cannot predict the lack of
planfulness of the learner, and so cannot use a linear, predictable plan
to design the learning programme.
Jonassen et al. (1997: 28) criticise the positivist
basis of ID models. This basis in positivism has led to certain fundamental
assumptions by ID about learning situations.
- Learning
situations are closed systems.
- Knowledge
is an object which can be "put into" a learner (i.e. it is the
instructor's responsibility).
- Human
behaviour is predictable.
- Processes
in the educational setting can be understood according to the laws of
linear causality.
- Certain
interventions determine certain outcomes.
These assumptions are being
challenged by a variety of sources from within the scientific community, the
original parent of positivism. In contrast to the assumptions outlined above,
Jonassen et al.
(1997: 28) maintain that the elusive and complex nature of human consciousness
makes it impossible to describe, let alone predict, what will happen in learning
situations. Knowledge is not a static object, but is rather distributed in
society, constantly subject to revision and negotiation. Further, based on
Heisenberg's Uncertainty Principle, they deny that causal relationships can
ever be established, as the act of studying any phenomenon alters its nature.
The best one can achieve is an estimate of probability. They also maintain that
learning systems are open-ended. The number and complexity of the variables
involved mean that any attempt to isolate specific variables is reductionist
and simplistic, and cannot do justice to the "fluctuations and
perturbations" (p. 28) found in the educational setting. Finally, the fact
that over many years of research, there have been no clear findings of
significant differences, indicates that ultimately educational settings are
unpredictable and cannot be approached in a linear fashion.
Alternative Approaches to ID
Jonassen and the New Sciences
In contrast to the positivist
approach criticised above, Jonassen et al. (1997: 29-33) suggest using Hermeneutics, Fuzzy Logic and Chaos
Theory as a basis for ID. They describe each theory and outline the
implications for ID.
Hermeneutics
emphasises the importance of socio-historical context in mediating the meanings
of individuals creating and decoding texts. This means that ID must strive to
introduce gaps of understanding which allow the learner to create his/her own
meanings. Another implication is that learners need to become aware of their
own and others' biases. Exercises must problematise the world of ideas and
values, rather than simplifying and codifying it. As Jonassen et al. (1997) express
it, "Good learners are naturally sceptical learners" (p.30). A third
implication is that other factors outside of the immediate learning situation
play a role in the learner's creation of meaning. Designers need to work in a
manner that allows the flexibility and openness that will enable these
"external" factors a place in the instruction. Finally, the learning
programme should facilitate understanding of different time periods, and other
cultures, so that learners' understanding is not mediated only by their own
unconscious biases.
Fuzzy Logic
is based on the idea that reality can rarely be represented accurately in a
bivalent manner. Rather, it is multivalent, having many varieties and shades
which do not have to belong to mutually exclusive sets. In terms of needs
assessment and design, the implication of this is that behaviour can only be
understood probabilistically, using continua, rather than binary measures.
Also, it means that problem areas, such as student perceptions of the efficacy
of the educational programme, can be incorporated into the design.
Chaos Theory
is useful for non-linear, dynamic situations where Newtonian physics is no
longer applicable, where input and output are not in direct proportion. Chaos
theory is also necessary where there is sensitive dependence on initial
conditions (i.e. where a very small change in the initial situation leads to
great changes later). Chaos theory finds order in the chaos of natural
structures through looking for self-similarity and self-organisation, patterns
that are repeated at different levels of complexity through a structure (e.g. a
fractal). Since the linear, deterministic approach is inapplicable to
educational settings, Chaos theory can offer ID some useful alternatives.
Firstly, designers need to include metacognitive skills in their designs, to
enable learners to deal with the complexity flexibly, rather than hushing it up
through simplification, and thereby crippling the learner who will all too soon
be faced by aspects of reality that do not fit the simplified scheme. Secondly,
ID needs to take account of learners' emotions, and promote self-awareness on
this level, not just the cognitive.
Reigeluth and the Information Age
Reigeluth (1996; 1997) discusses the
implications of the shift into the Information Age for ID theory. The most
important aspect of this whole shift is that instruction needs to be customised
rather than standardised (1997: 45). This implies that the instruction is
learner-centred, and is based on authentic tasks (1996: 14). The teacher needs
to become a facilitator, empowering the learners to construct their own
knowledge, rather than being the sole source of direction and knowledge in the
class. Reigeluth also suggests an alternative to the linear stages of the ID
process. The entire process cannot be known in advance, so designers are
required to do "just-in-time analysis" (p.15), synthesis, evaluation
and change at every stage in the ID process. To fit in with the demands of the
Information age, the designer will also need to become more aware of the
broader social context within which the instruction takes place, and will need
to consult more broadly with stakeholder groups so that a common vision of the
final instruction and the means to develop it is arrived at. The final
implication of this approach, is that learners should become
"user-designers" (p.18), with much of the design happening at the
point of delivery.
This is related to Winn's (1997: 37)
assertion that "the activities of the instructional designer need to take
place at the time the student is working with the instructional material".
He maintains that ID decisions should be made on the fly as a response to
student involvement in the learning process.
Gros et al.: ID for Multimedia
Gros et al. (1997: 51-52) outline the
characteristics of more powerful models of ID that will facilitate multimedia
authoring. They maintain that ID models need to allow a more flexible design
process that includes rapid prototyping, and that there must be a clearer link
between skill and knowledge acquisition. Whereas much ID focuses on cognitive
skills and ignores the multi-perspectival presentation of knowledge, multimedia
authoring tends to emphasise the presentation of knowledge without due regard
for developing cognitive skills. A new model of ID needs to combine the best of
both worlds by using a more constructivist approach, one which starts with
relevant, non-trivial scenarios (derived from a needs analysis) as situations
within which the cognitive skills are developed and practiced.
Elaboration Theory and Hypermedia
Hoffman (1997) makes the link
between Reigeluth's Elaboration Theory (ET) and hypermedia. ET is "a
macro-strategy that focuses on the organisation and sequencing of
subject-matter content" (p.59). The key idea in ET is that within a subject
area there is an epitome,
an overarching, organising concept. This is the first concept to teach, and
then what follows is elaboration of the epitome. Each component of the
elaboration also has its own epitome and sub-concepts. The elaboration of an
epitome could include concepts
(which answer the question "What?"), procedures (which answer the question
"How?"), and theories
(which answer the question "Why?"). Further elaboration of these
could include definition, examples and practice.
The key aspects of Hypermedia are
that it should provide easy access to information within an interactive
environment which can be customised. The web-like linking of ideas that
characterises hypermedia is more akin to the functioning of human cognition
than is the traditional linear structure found in much educational programming.
It is this kind of structure that is proposed by ET also.
The advantages of this kind of model
(ET/Hypermedia) for ID are that modularity and plasticity are possible. A
modular approach makes it possible to easily make changes in response to
learner needs without changing the overall structure. Plasticity is also
possible as a web structure can grow and develop rapidly and easily, and can be
easily customised from the user end, making learner control more feasible.
Conclusion
To sum up this whole discussion, one
can say that ID theory, in that it guides the practice of designers, is
necessary and plays an important role. However, it needs to change in many
respects if it is to fulfil this role adequately. In general, ID theory needs
to move in the direction of flexibility and learner-empowerment if it is to
allow ID to keep up with technological and institutional changes.
It is perhaps fitting to conclude
this paper in the words of Jonassen et al. (1997: 33). They conclude their article thus:
"Like the chiropractor who
realigns your spine, we might become healthier from a realignment of our
theories. If we admit to and attempt to accommodate some of the uncertainty,
indeterminism, and unpredictability that pervade our complex world, we will
develop stronger theories and practices that will have more powerful (if not
predictable) effects on human learning."
Rapid
Instructional Design
Objectives
· Justify the need for just-in-time instructional design strategies
to replace the conventional ISD model for designing instructional packages
today.
· Apply the basic principles of trading off resources between design
and delivery and among the three components of effective instruction to speed
up the instructional design process.
· Apply appropriate shortcuts, combinations, and deletions to the
conventional ISD model to speed up the instructional design process.
· Use templates and shells to speed up the instructional design
process.
· Use appropriate equipment to speed up the instructional design
process.
· Make more effective use of human resources (including
subject-matter experts and trainees) to speed up the instructional design
process.
· Reduce self-doubt and guilt by positively associating cheaper and
faster instructional design with better learning effects.
Two Basic Trade-Offs
Before exploring some
specific strategies, I would like to offer two basic trade-offs to prevent you
from sacrificing the effectiveness of the product for the efficiency of the
process. These trade-offs help you identify specific instructional design
components for cutting corners.
The first of these
trade-offs is between the design and the delivery of instruction. Design
involves all activities undertaken before the actual learner interacts with the
instructional package in a real-world training situation. Delivery is what
happens subsequently. An important principle (and constraint) is that you can
trade off resources allocated to these two phases. For example, if you have a
high resource level for delivery (subject matter experts as instructors, plenty
of instructional time, small groups of learners, and alternative instructional
materials), you can skimp on the design. On the other hand, if you have extremely
limited resources for the delivery of instruction (nonspecialist instructors,
tight learning schedule, and large groups of learners), you need to allocate
extra time and other resources to the design process. The basic idea here is
that you pay now or pay later. Depending on the context, you can (and should)
select the optimum allocation of resources between design and delivery. It
would be inefficient (and misanthropic) for you to produce idiot-proof
instructional packages for all situations without carefully taking into
consideration the resources available for the delivery of instruction.
Just-in-time instructional design requires that you exploit everything
available in the instructional scene.
The second trade-off is
among the three components of an effective instructional package. Irrespective
of your preferred school of psychology, effective instruction has these three
components:
1. Presentation to learners of new information related to the
instructional objectives.
2. Activities by learners that require them to process the
information and to provide a response.
3. Feedback to learners to provide reinforcement for desirable
responses and remediation for undesirable ones.
Whether these three
components are applied at a micro level (as in the case of step-by-step
directions on how to tie a shoe string) or at a macro level (as in the case of
a global case study on cross-cultural sensitivity), they are essential in all
instructional packages. When instructional designers falsely assume that any
one of these three components is sufficient, the result is false economy and
faulty instruction. To provide a few stereotypical (and nongeneralizable)
examples, college professors primarily present information; self-actualization
gurus focus exclusively on processing by learners; and significant others
typically concentrate on giving feedback. The result in all these cases is
incomplete learning.
The important point here
is that you should not ignore any of these three components. The just-in-time
principle is that you can adopt or design these three components independently
of each other--in the initial stages. For example, you can rapidly videotape
the talking head of a subject-matter expert explaining the subtleties of a
complex concept. You can then design appropriate practice activities to
facilitate the mastery of the concepts and to provide suitable feedback in the
form of a model response. As long as you integrate these three components in
the final package, you produce effective instruction faster and cheaper.
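A simple way to honour this integration rule is to track the three components explicitly and treat a unit as unfinished until all three are present. The sketch below is a hypothetical illustration; the class name and fields are assumptions.

```python
# Presentation, activity, and feedback authored independently but integrated
# before the package counts as complete (illustrative only).
from dataclasses import dataclass
from typing import Optional

@dataclass
class InstructionalUnit:
    presentation: Optional[str] = None   # e.g., recorded expert explanation
    activity: Optional[str] = None       # practice requiring a learner response
    feedback: Optional[str] = None       # model response or remediation

    def is_complete(self) -> bool:
        """Effective only when all three components are in place."""
        return all([self.presentation, self.activity, self.feedback])

unit = InstructionalUnit(presentation="video of an SME explaining the concept")
unit.activity = "classify five new cases as examples or nonexamples"
unit.feedback = "model answers with a rationale for each case"
assert unit.is_complete()
```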
A Preview
Here's a checklist that
summarizes the 10 strategies (and their associated guidelines) for rapid
instructional design:
Strategy 1. Speed up the
process.
Guideline 1. Use
shortcuts in various phases of the instructional design process.
Guideline 2. Combine
different phases of the instructional design activities.
Strategy 2. Use a
partial process.
Guideline 3. Skip phases
in the instructional design process that are unnecessary or superfluous.
Guideline 4. Produce a
lean version of the instructional package for immediate use and continuously
improve it after implementation.
Strategy 3. Incorporate
existing Instructional materials.
Guideline 5. Use a
systematic approach to analyze learner and delivery variables to adapt the
content and activities in existing instructional material.
Guideline 6.
Deliberately design generic instructional materials for local finish.
Strategy 4. Incorporate
existing noninstructional materials.
Guideline 7. Use
noninstructional materials to present the basic content. Design suitable
activities and feedback systems to reinforce this content.
Guideline 8. Design
instructional packages around job aids.
Strategy 5. Use
templates.
Guideline 9. Use
templates to specify the content, sequence, activities, and feedback
requirements for different types of learning.
Guideline 10 . Use
standard procedures for designing small-group instructional activities.
Strategy 6. Use
computers and recording devices.
Guideline 11. Use
suitable software packages to speed up various aspects of analysis, design,
writing, illustration, evaluation, and revision.
Guideline 12. Use audio
and videotape recording equipment to save time on analysis and production.
Strategy 7. Involve more
people.
Guideline 13. Use an
emergency team to rapidly work through all phases of systematic instructional
design.
Guideline 14. Use
vertical teams to specialize on different phases of instructional design or
horizontal teams to specialize on different modules of the instructional
package.
Strategy 8. Make
efficient use of subject matter experts.
Guideline 15. Train and
support subject-matter experts to become performance-oriented trainers.
Guideline 16. Change the
role of subject-matter experts.
Strategy 9. Involve
trainees in speeding up instruction.
Guideline 17. Use interactive
techniques to shift instructional design responsibilities to the trainees.
Guideline 18. Use peer
tutoring to maximize mutual learning and teaching.
Strategy 10. Use
performance support systems.
Guideline 19. Facilitate
learning through individualized systems of instruction.
Guideline 20. Use
suitable incentives to reward learning.
Strategy 1. Speed up the
process.
As long as you treat the
conventional ISD model as a flexible framework (and not as compulsory
commandments), you can use it to prevent waste of time. The important point to
remember is to modify the model to suit your needs.
Guideline 1. Use
shortcuts in various phases of the instructional design process.
You can save significant
time and resources by employing shortcuts within the conventional ISD
procedure. These shortcuts are based on the experiences of practitioners or the
findings of researchers. Every phase and step of the instructional design
process can benefit from several of these shortcuts. Here are some examples:
Needs analysis. To confirm or reject an apparent need, use existing records (for
example, reports of employee accidents or copies of customer complaints)
instead of extensive interviews.
Task analysis. To identify various steps of an administrative procedure, check
with the corporate policy manual. Ask employees to describe exceptions and
modifications of this procedure.
Production. Ask a subject matter expert to demonstrate an activity and make a
videotape recording. Use this approach to bypass the elaborate ritual of preparing
a treatment, writing a rough script, formatting a shooting script, preparing a
storyboard, and producing the instructional video.
Expert reviews. Instead of sending out review copies to various experts and
waiting for them to give you feedback, conduct a focus group session. Give copies
of the material to a selected group, and walk them through a structured
discussion. Among other things, this approach saves time by requiring experts
to reconcile differences of opinions and provide you with specific prescriptions.
Evaluation and revision. Test the instructional package individually with four or five
representative learners, making on-the-spot revisions during the tryout
session. Research studies indicate that the improvements resulting from this
procedure are comparable to those from elaborate evaluation with stratified
random samples of several learners, control groups, batteries of pretests and
posttests, and sophisticated statistical analyses.
Guideline 2. Combine
different phases of the instructional design activities.
Most practitioners
realize that the phases and steps in the instructional design process are
merely for convenience and not absolute divisions. For example, you cannot
declare that all your analyses are completed at a specific time and that you
will not do any more analysis later. You can deliberately combine adjacent
steps in the instructional design process to save time. Here are some examples:
Analysis and design. Instead of completing a comprehensive analysis of an entire
course, you can begin writing the course materials, undertaking analyses as
needed. The act of writing the material will help you come up with the right
questions for your analysis.
Analysis and evaluation.
Most valid evaluation
strategies accurately reflect the results of various analyses. For example,
final tests should be based on the task analysis and the final impact of
training should be evaluated against the need analysis. You can save
instructional design time by reporting the results of different analyses in the
form of evaluation blueprints.
Evaluation and design. A standard operating procedure in instructional design is to
specify behavioral objectives and use them as the basis for constructing
criterion tests and designing instructional content. You can bypass the step of
writing instructional objectives, and use criterion test items to provide the
operational definitions of the objectives.
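In practice this means maintaining the criterion items themselves as the working statement of intent. The sketch below illustrates the idea under assumed names; the items shown are invented examples built around the expense-report task mentioned elsewhere in this piece.

```python
# Hypothetical criterion items doubling as operational definitions of the
# objectives they would otherwise restate.
criterion_items = {
    "expense-report-accuracy": "Given a set of receipts, complete the expense "
                               "report form with no arithmetic or coding errors.",
    "policy-exceptions":       "Given three scenarios, identify which require "
                               "manager pre-approval and cite the policy clause.",
}

def as_objective(item_id: str) -> str:
    """Read a criterion item back as the de facto instructional objective."""
    return f"Learners will be able to: {criterion_items[item_id]}"

print(as_objective("policy-exceptions"))
```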
Evaluation and
implementation. In most situations, the
prototype instructional package is an obvious improvement on earlier
instructional attempts. There is generally no need to conduct an contrived
pilot test before actually using the package for training purposes. Unless you
have serious reservations about your instructional design competencies, combine
your field test with the first run of the training program. In addition to
saving time, the data from this approach will be more realistic and useful.
Strategy 2. Use a
partial process.
The conventional ISD
model is too comprehensive for everyday use. You should definitely consider
completing all of its phases and steps when you are designing a comprehensive
mathematics curriculum for high schools. But you don't have to blindly follow
all the steps for creating a simple checklist for three local salespeople on
how to complete their expense reports.
Guideline 3. Skip phases
in the instructional design process that are unnecessary or superfluous.
Most instructional
designers are indoctrinated to feel guilty if they skip any phase or step in
the conventional ISD process. This results in unnecessary waste of time and
other resources. You can improve the efficiency of instructional design by
recognizing and avoiding unnecessary activities. Here are some examples:
Needs analysis. If your client is convinced there is a training need, avoid
challenging the statement and insisting on conducting your own front-end
analysis, needs analysis, performance analysis, and so on. Assume that the
client is intelligent and his or her conclusion is legitimate. After all,
perceptions are as important as reality and you are not going to make friends
with your client by beginning the project with an apparently unnecessary
activity. Stop wasting time and money.
Summative Evaluation. Instructional designers frequently attempt to conduct a final
field test under controlled conditions to validate the cost-effectiveness of
the instructional package. While this is an important and worthwhile
undertaking, ask yourself, "Who cares?" and "So what?" Unless you are working on
your doctoral dissertation, there is no special advantage in collecting data
and writing reports if nobody reads them and no useful improvements result.
Meetings and Report
Writing. An enormous amount of time and
money is spent in having people attend meetings and write reports before,
during, and after instructional design. Significant savings can be achieved by
eliminating all unnecessary meetings, having meetings attended by only the
essential decision makers, increasing the productivity of meetings with
specific agenda and time limits, replacing information-dissemination meetings
with memoranda and voice-mail messages, eliminating all unnecessary reports,
and limiting the essential reports to single pages.
Guideline 4. Produce a
lean version of the instructional package for immediate use and continuously
improve it after implementation.
There is a built-in bias
toward overkill in the conventional ISD process. The obsession--for doing it
right the first time through painstaking analysis and planning, for pleasing
all the people all the time through incorporating everyone's inputs and
feedback, and for attempting perfection through several rounds of testing,
revision, and retesting--violates the Pareto principle. Much time (and other
resources) can be saved by focusing on critical content and key steps and
producing a lean instructional package. Improvements to this core package can
be added gradually after it is implemented. Here are some specific suggestions:
·
Classify content areas
in terms of importance. Separate the nice-to-know elements from the absolutely
essential ones. Ignore (or merely list) the former and spend your time and
resources in designing detailed instruction for the latter.
·
Identify target
subgroups and focus on the majority. Design your package for use by the
subgroup to which most of your learners belong. You can temporarily ignore the
advantaged minority and make some special provisions (such as remedial
tutoring) for the disadvantaged minority.
·
Stop your initial
instructional design at the end of the minimal activities. During the initial
stage, just conduct a task analysis and construct a criterion test. Organize
the test items in an appropriate sequence and use the collection as the initial
instructional package. Use a subject-matter expert to provide the necessary
instruction. Later, gradually replace this instructor with suitable
instructional content and activities.
Strategy 3. Incorporate
existing instructional materials.
Information
technologists estimate that more than half a billion instructional and
educational materials of various types exist in the English language. In spite
of this, whenever instruction is indicated, the tendency is always to create a
brand new training package.
Guideline 5. Use a
systematic approach to analyze learner and delivery variables to adapt the
content and activities in existing instructional material.
The not-invented-here
reaction to existing instructional materials is expensive and time consuming.
Even if an off-the-shelf instructional package does not exactly meet your
requirements, it is usually cheaper and faster to modify the material than to
design a new package from scratch. Even in cases when there are absolutely no
available materials (such as in training for a new computer program), it is
possible to adapt instructional packages that deal with some related product or
procedure.
Here are some specific
suggestions for incorporating an existing instructional material into a new
package:
·
Begin with a quick
analysis of the new problem, task, content, learner, language, and delivery
variables.
·
Check the existing
instructional materials against the results of these analyses.
·
Modify the intents
(goals and objectives), contents, and activities. Specific modifications may
include deleting portions dealing with unnecessary objectives, adding new
objectives and content, simplifying the language, and modifying the activities.
·
For an alternative
approach based on formative evaluation, take the existing instructional
material in its current form and try it out with a representative group of trainees.
Based on the feedback, make appropriate modifications to the materials to
better meet the needs and preferences of the trainees.
Guideline 6.
Deliberately design generic instructional materials for local finish.
Use this just-in-time
technique in large organizations with standard policies and procedures that are
adapted to local conditions, cultures, and resources. The generic version is
rapidly produced at the corporate headquarters and local variations are created
in branch locations. The success of this approach depends on using flexible
design principles to create the original package. Here are some suggestions for
this approach:
· Modularize the instructional package by objectives (rather than by
content) to permit local designers to delete modules or to rearrange them based
on their specific goals.
· Use media that are easy to revise. Printed materials are easier to
modify than multimedia productions. Within print, pages formatted with word
processing software are easier to modify than those that are typeset in the
traditional fashion. Simple illustrations are easier to modify than complicated
artwork.
· Whenever possible, build the training package around a set of job
aids. By modifying the job aids to suit the local needs and constraints you can
rapidly modify the scope and sequence of the instructional package.
· Make the structure and organization of the training package
clearly visible through the use of appropriate headings. Use sectional headings
and page numbers. Provide a detailed table of contents and indexes. These
elements should permit local designers to immediately locate the appropriate
sections for modification.
· Check all your illustrations and people's names and incidents in
examples and exercises to make sure they are culturally neutral (or diverse).
This guideline is especially important if your organization has multinational
locations.
· Include a collection of alternative examples and cases along with
the generic package. Provide keyword indexes to these examples to permit local
designers to choose the most appropriate ones.
· Include a collection of alternative exercises and activities. Use
suitable classification schemes to identify the key features of each
alternative.
Strategy 4. Incorporate
existing noninstructional materials.
If you accept the
three-component division of an instructional package into content, activities,
and feedback, you can integrate several interesting and instructive
noninstructional materials to present the basic content.
Guideline 7. Use
noninstructional materials to present the basic content. Design suitable
activities and feedback systems to reinforce this content.
For example, in training
technical advisors for a hydroelectric project in West Africa, you can have
them read a collection of short stories about life in Ghanaian villages. This
provides the trainees with basic background information about the cultural values of
the people they will be working with. To provide an opportunity to process this
information, you can create an adjunct activity that requires participants to
prepare a list of major cultural differences between them and the villagers. To
provide feedback, you can ask participants to compare their lists with those
provided by cultural anthropologists.
Here are some additional
examples of noninstructional materials being integrated into instructional
packages:
· A course on public speaking uses videotapes of several
professional and amateur speakers. Trainees are provided with a checklist for
evaluating key elements of each speaker's performance as a prelude to
videotaping their own presentations and critiquing them.
· A new employee orientation package includes the annual report of
the corporation and its policy manual. Trainees spend an hour reviewing these
documents and coming up with the correct answers to 20 factual questions.
· An in-house management-training package contains various excerpts
from television sitcoms. The facilitator uses these as examples of different
management styles. Later, trainee teams create their own sitcom segments to
illustrate a new type of manager for the next decade.
· A workshop package on change management uses reprints of articles
from back issues of Popular
Mechanics published in the 1940s. Participants read glowing
reports of technological breakthroughs and figure out the reasons why they did
not live up to their promise. This module introduces them to factors that
enhance and inhibit the adoption of innovations.
Guideline 8. Design
instructional packages around job aids.
Job aids are checklists,
decision tables, worksheets, flowcharts, and other such items that improve the
performance of a person as he or she is performing--without the need for
remembering specific steps or factual information. The telephone directory is
an example of a job aid that improves your ability to call others without
having to memorize random digits. Instructional packages for most procedural
tasks can be designed efficiently by beginning with the design of job aids.
Here is a simple two-step procedure for using this strategy:
· Use your task analysis to identify steps and decisions in the
procedure. Prepare a set of job aids that will enable a nonspecialist to
complete the procedure. Coach a person through these job aids to collect
feedback. Modify the job aids to make them more effective and user-friendly.
· Analyze the job aids to identify basic skills for using them.
Prepare an instructional package to teach trainees how to use the job aids.
Test your package on representative learners and modify it on the basis of
their feedback.
Frequently, the
necessary job aids may already be available (for example, in computer
documentation, equipment troubleshooting manuals, and cookbooks). You can
design an instructional package to teach trainees how to use them.
Strategy 5. Use
templates.
You can use job aids to
simplify the task of instructional design. Templates provide a convenient type
of job aid.
Guideline 9. Use
templates to specify the content, sequence, activities, and feedback
requirements for different types of learning.
Conventional ISD models
place too much emphasis on procedures and not on principles. They prescribe
global tasks such as "prepare the draft version of your instructional material"
and fail to provide guidance in the selection of appropriate instructional
strategies and tactics. Instructional objectives can be classified into
specific types of learning and, although there is no one best strategy for each
type of learning, there are a few preferred strategies based on empirical
principles of learning. Effective and efficient CBT authoring systems
frequently include templates for designing instruction to facilitate a specific
type of learning. Such templates can also be used for non-computer based
instruction. Worksheets, decision tables, and checklists can speed up the
instructional design process at the strategic and tactical levels. Here are
some examples:
· Use some convenient scheme to classify instructional objectives
into such types as factual information, concepts, processes, procedures, and
principles. For each type of information, use a standard format for creating
criterion test items.
· For teaching factual information, use this template: Present the
information in suitable chunks, emphasize logical links, provide mnemonics to
facilitate recall, require trainees to process the information, provide
suitable feedback, review the information, repeat the information in different
configurations, and summarize the information.
· For teaching concepts, use this template: Present clear-cut
examples, present matched nonexamples to emphasize critical features of the
concept, present divergent examples to emphasize variable features, require the
trainee to discriminate among new examples and nonexamples, provide feedback,
and test for the ability to generalize and to discriminate.
· For teaching procedures, use this template: Provide an overview of
the entire procedure, demonstrate each step and identify its critical elements,
coach the trainees as they practice each step, require the trainee to
demonstrate the mastery of each step, integrate all steps, and provide systematic
practice toward fluent application. (A compact sketch of these three templates
follows this list.)
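To make the idea of templates concrete, here is a minimal sketch in Python of how the three templates above could be stored as reusable checklists and pulled up once an objective has been classified. The wording of the steps comes from the list above; the dictionary, the function name, and the sample objective type are hypothetical illustrations, not part of any authoring system.

    # A hypothetical illustration: the three templates above stored as reusable
    # checklists that a designer can retrieve for a classified objective.
    TEMPLATES = {
        "factual information": [
            "Present the information in suitable chunks",
            "Emphasize logical links",
            "Provide mnemonics to facilitate recall",
            "Require trainees to process the information",
            "Provide suitable feedback",
            "Review the information",
            "Repeat the information in different configurations",
            "Summarize the information",
        ],
        "concept": [
            "Present clear-cut examples",
            "Present matched nonexamples to emphasize critical features",
            "Present divergent examples to emphasize variable features",
            "Require discrimination among new examples and nonexamples",
            "Provide feedback",
            "Test for the ability to generalize and to discriminate",
        ],
        "procedure": [
            "Provide an overview of the entire procedure",
            "Demonstrate each step and identify its critical elements",
            "Coach the trainees as they practice each step",
            "Require demonstration of mastery of each step",
            "Integrate all steps",
            "Provide systematic practice toward fluent application",
        ],
    }

    def design_checklist(objective_type: str) -> list[str]:
        """Return the template steps for a classified objective type."""
        return TEMPLATES[objective_type]

    # Example: the checklist a designer would follow for a concept objective.
    for step in design_checklist("concept"):
        print("-", step)

The same dictionary could be extended with templates for processes and principles, or printed out as a worksheet or decision table for designers who never touch a computer.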
Guideline 10. Use
standard procedures for designing small-group instructional activities.
Instructional designers
frequently have difficulties designing experiential activities that involve interaction
among trainees. To simplify and speed up the design of an activities-based
instructional package, you can use several pre-established shells associated
with different types of learning. One efficient tool in this area is the
framegame, a training game deliberately designed to permit the easy
loading of different instructional content. Here are two examples of framegames
from a computer training context:
Blockout Bingo is a framegame for teaching multiple discriminations. In a sample
game, each trainee is provided with a 5 x 5 grid with two sets of 12 numbers
representing function key numbers on the computer keyboard. The facilitator
calls out a task to be completed with an application program. Trainees identify
the square with the correct function key for performing the task. After a
10-second pause, the facilitator gives the correct answer. Trainees who chose
the right square make a big check mark on it. The first trainee to mark five
squares in a straight line wins the game.
Blockout Bingo can be easily loaded with other content: ASCII code numbers for
special characters, hot keys for keyboard commands, locations of menu items,
names of different typefaces, and control keys for glossary items. Obviously,
this activity can be used beyond computer training wherever basic associations
are to be mastered.
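For designers who build such framegames into CBT, here is a minimal Python sketch, assuming a 5 x 5 card, of the only mechanical rule in Blockout Bingo: checking whether a trainee has marked five squares in a straight line. The function name and the grid representation are hypothetical.

    # A hypothetical helper for a 5 x 5 Blockout Bingo card. The card's content
    # (function keys, ASCII codes, hot keys) is whatever you load into the frame;
    # the only mechanical rule is the five-in-a-line check below.
    def has_blockout(marked: set[tuple[int, int]]) -> bool:
        """True if any row, column, or diagonal of the 5 x 5 grid is fully marked.

        `marked` holds (row, col) coordinates, 0-4 each, of checked squares.
        """
        rows = any(all((r, c) in marked for c in range(5)) for r in range(5))
        cols = any(all((r, c) in marked for r in range(5)) for c in range(5))
        diag = all((i, i) in marked for i in range(5))
        anti = all((i, 4 - i) in marked for i in range(5))
        return rows or cols or diag or anti

    # Example: a fully marked middle row wins the game.
    print(has_blockout({(2, c) for c in range(5)}))  # True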
Infer is a framegame for teaching concepts. In a sample game, groups of
trainees are given a handout with acceptable file names in one column and
unacceptable file names in another. The facilitator makes a statement about
file names (for example, "A file name should not mix letters and
numbers") and a selected player decides whether the statement is true,
false, or cannot be judged, based on the examples and nonexamples in the
handout. Other players may challenge the first player's decision. Based on the
correctness of the decision and the challenge, players accumulate points. The
player with the highest score at the end of 15 minutes wins the game.
Infer can be easily loaded with other conceptual topics such as expense
categories, paragraph tags, subroutines, field names, and page layout. The game
can obviously be used with concepts and principles from any subject area.
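The description above does not prescribe a point schedule for Infer, so the following Python sketch uses hypothetical values; it only shows how a player's decision, an optional challenge, and the correct answer interact to produce scores.

    # Hypothetical point values; the logic only illustrates how the decision,
    # an optional challenge, and the correct answer interact.
    def score_round(decision: str, challenged: bool, correct_answer: str) -> tuple[int, int]:
        """Return (points for the deciding player, points for the challenger).

        decision and correct_answer are "true", "false", or "cannot be judged".
        """
        if decision == correct_answer:
            # A correct decision scores; an unsuccessful challenge costs a point.
            return (2, -1) if challenged else (2, 0)
        # A wrong decision scores nothing; a successful challenge is rewarded.
        return (0, 2) if challenged else (0, 0)

    # Example: "A file name should not mix letters and numbers" is actually
    # false, the selected player says "true", and another player challenges.
    print(score_round("true", True, "false"))  # (0, 2)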
Strategy 6. Use
computers and recording devices.
Instructional design
involves the production, revision, and reproduction of various materials. Like
any other production activity it can be speeded up considerably by the use of
high-tech equipment.
Guideline 11. Use
suitable software packages to speed up various aspects of analysis, design,
writing, illustration, evaluation, and revision.
If you are not using
computers to produce your instructional packages, you are at a competitive
disadvantage. Investment in even the simplest computer system can significantly
speed up your production. Here are some examples of how computer software can
be used in different stages of instructional design:
Analysis
· Form design packages (for
example, FormTool or Xerox FormBase) for quickly designing questionnaires and
forms for data collection.
· Flowcharting software (for
example, ABC Flowcharter or EasyFlow) for rapidly preparing flowcharts during
and after task analyses.
· Spreadsheets (for
example, Microsoft Excel or Lotus 1-2-3) and statistical packages (for example,
SPSS) for analyzing, summarizing, and charting quantitative data.
· Personal information management systems (for example, Lotus Agenda) for sorting and analyzing qualitative
data and open-ended comments during analysis.
Design
· Creativity tools (for
example, IdeaFisher or Idea Generator) for designing the instructional package.
· Specially designed expert systems (for example, those found in proprietary CBT authoring systems)
to ensure the use of appropriate instructional strategies and tactics.
· Outliners (as found in Microsoft
Word) and idea
processors (for example, MaxThink) for systematically building
up from analysis data through criterion test items to instructional content.
· Word processors (for
example, Microsoft Word or WordPerfect) for producing initial drafts,
revisions, and for archiving earlier versions.
· Spell checkers and proofreading packages
(for example, Grammatik or Right Writer) for cleaning up the draft version and
for maintaining an appropriate reading level.
· Desktop publishing software
(for example, Ventura Publisher or PageMaker) for rapidly laying out finished
pages.
· Graphic packages (for
example, Corel Draw or Micrografx Designer) for producing charts and
illustrations.
· Presentation packages (for
example, Powerpoint or Persuasion) for rapidly producing slides and
transparencies.
Evaluation and revision
· Specially designed computer software for the initial presentation
of the instructional text and automatic trapping of student responses.
· Various software for data analysis mentioned earlier.
· Groupware (for example, For
Comment) for collaborative review and editing of the instructional material.
· Word processors, desktop publishing packages,
and graphics software
for rapidly revising and resequencing text and illustrations.
Guideline 12. Use audio
and videotape recording equipment to save time on analysis and production.
In recent years,
camcorders, microcassette recorders, and other electronic recording devices
have become cheaper, smaller, lighter, friendlier, and more powerful. They
provide another set of tools for automating and speeding different aspects of
the instructional design process. Here are some examples:
· During task analysis, you can videotape an expert demonstration of
a complex technical task. By replaying, pausing, slowing down, and freezing this
videotape, you can complete a thorough task analysis without wasting the
expert's time.
· You can have an expert videotape his or her demonstration and mail
the tape to you. This saves travel time and money.
· During design, you can record an interview with a subject-matter
expert and edit the tape for presenting the basic instructional content.
· During design, you can record a lecture on video or audiotape and
use it as the quick-and-dirty prototype.
· During evaluation, you can videotape a focus group session and
conduct a leisurely review later to analyze and summarize the feedback.
Strategy 7. Involve more
people.
The usual reaction to
urgent demands is to immediately hire more people. This is a fairly expensive
approach and is not always guaranteed to produce results. Frequently, the time
saved by hiring more workers is less than the time wasted in coordinating
the larger group. However, there are times when many hands could speed up the
instructional design work.
Guideline 13. Use an
emergency team to rapidly work through all phases of systematic instructional
design.
Inspired by a keynote
presentation by Robert Mager at an NSPI Conference, I have experimented with a
SWAT (Specialized
Workers And Tactics) Team approach
to instructional design during emergencies (as in the case of training relief
workers in East Africa). In this approach, a specially-assembled team is given
a specific training objective and all the necessary equipment and support
staff. Here's a brief description of how a marathon instructional design
session works:
· The chief subject-matter expert (CSME) in the team is given a
specific instructional objective, a time limit, and a brief description of the
target population.
· A small group (4 to 7) of representative trainees is taught by the
CSME, who is assisted by assistant SMEs. This instructional session is
videotaped and audiotaped. A group of instructional designers and support staff
members observe the session from behind a one-way mirror.
· As the session progresses, the designers conduct a task analysis
by paralleling the CSME's performance. They also coordinate the preparation of
suitable job aids, handouts, and visuals based on the CSME's performance.
· Immediately after the session, evaluators debrief the trainees
while designers interview the SMEs.
· Within moments of the first session, a new version of the lesson
is presented to a second group of representative trainees. This time, an
instructional designer is in charge of the session and he or she uses the
handouts, job aids, visuals, and videotape segments. An SME assists the
instructional designer whenever content expertise is required. The
instructor/designer makes on-the-spot modifications to the materials, and these
are incorporated in the master set by other team members.
· After appropriate improvements to the instructional package, the
next session is conducted by a representative trainer (who has been observing
the earlier sessions). He or she wears a wireless earphone through which the
team provides coaching advice whenever appropriate. While this session is going
on, the team produces an instructor's manual.
· The SWAT team repeats testing and revision activities a few more
times. Shortly after the last session, the instructional package (including the
instructor's manual) is ready for duplication.
Guideline 14. Use
vertical teams to specialize on different phases of instructional design or
horizontal teams to specialize on different modules of the instructional
package.
To benefit from
additional project team members, you should coordinate and support the team
with full-time managers. In general, you can use either of these two approaches
for organizing the team:
Vertical structure. In this approach, you ask your team members to specialize in
different phases of the instructional design process. For example, you can use
one person to specialize in analysis, another in design, another in evaluation,
and so on. The main advantages of this approach are the efficiency that comes
from each person concentrating on a single task and the freedom of each
specialist from the demands of later phases. The disadvantages include loss of useful
information from one phase to the next and the earlier specialists running out
of things to do during later stages. To achieve the maximum benefit from this
approach, the outputs each person passes to the next person in the process line
should meet prespecified criteria.
Horizontal structure. In this approach, you divide the instructional package into
several self-contained modules and assign the production responsibility for
each to a different team. Each team works on its module from the initial
analysis to the final testing. The advantages of this approach include the
teams accomplishing a complete task and no useful information being lost from
one phase to the next. The disadvantages include lack of objectivity in
evaluating and revising your own package, and possible lack of consistency
among modules produced by different teams. To achieve the maximum benefit from
this approach, all teams should work from the same instructional design model
and to the same standard specifications.
Strategy 8. Make
efficient use of subject matter experts.
It is a dangerous
misconception that if you know what to teach (your subject matter), then you
are ready to train. Equally dangerous is the opposite misconception (created
and maintained by instructional designers) that subject matter experts cannot
train. In reality, you can use subject matter experts to deliver training--and
thereby increase the efficiency of instructional design.
Guideline 15. Train and
support subject-matter experts to become performance-oriented trainers.
Left to their own
devices, most subject-matter experts train the way they were trained--with an
obsession for transmitting all the glorious technical details of everything
they know. You need to shift the SME-Trainer's focus from covering the
curriculum to changing trainees' behavior. Here are some suggestions:
· Involve the SME-trainers in the design of the instructional
package. Explain how the materials are geared toward changing the trainees'
behavior.
· Support the training with handouts, technical manuals, and other
documents. Reassure the trainers that the trainees will have access to accurate
and up-to-date information whenever needed.
· In your train-the-trainer sessions, practice what you preach. For
example, don't lecture on the importance of interactive activities.
· Use behavior modeling. Ask the SMEs to observe an expert trainer
in action. Stress how the trainer focuses on learning rather than on lecturing.
· Involve the SME-trainers in practice teaching roleplays. Videotape
these sessions and provide specific feedback to improve the training behavior.
· Provide lesson plans to structure and support the SME's training
activity. Instead of including a content outline in the lesson plan, list a
series of questions which the trainee should be able to answer at the end of
the lesson.
· Include several interactive activities in the instructional
materials. Train the SME-trainers to facilitate these activities.
Guideline 16. Change the
role of subject-matter experts.
One drastic approach for
shifting the SME-trainers away from lecturing and toward performance
improvement is to give them a different job title. Here are a few suggestions
on how to change the role of the SME-trainer:
· Make the SMEs coaches instead of trainers. Explain that their task
is to improve the job performance of the members of their team.
· Involve the coaches in designing job aids. Later, encourage the
coaches to explain the use of these job aids to the people they coach.
· Train the SMEs on the essential steps of the coaching process.
Include roleplays of coaching situations.
· Stress the importance of guided practice. Provide a detailed list
of practice exercises for use by the coach.
· Stress the importance of giving feedback to the performer. Provide
a job aid to the coaches on how to give specific and timely feedback.
· Make the SMEs consultants instead of trainers. Explain that their
task is to help the employees perform better on their job.
· Train the employees to use the SMEs as internal consultants. Have
them organize individual and small-group consulting sessions.
Strategy 9. Involve
trainees in speeding up instruction.
The trainees themselves
are an important--and ignored--resource in instruction. You can tap this
valuable resource by using appropriate strategies in instructional design and
delivery.
Guideline 17. Use
interactive techniques to shift instructional design responsibilities to the
trainees.
You can shift the
practice component to trainee control and responsibility during the delivery of
instruction. Adjunct gaming, in which games are used to reinforce the
instructional content presented through different methods and media, will help
you do this. Here are some examples of adjunct games:
The Press Conference Game
begins with the trainees brainstorming a list of critical questions related to
a topic they are to study. Teams of trainees edit a collection of these
questions and take turns to interview one or more subject-matter experts for a
specified period of time. At the conclusion of each segment of this press
conference, the other teams prepare and present a brief summary of the major
concepts and the questioning team awards score points. The game is continued
with every team getting a turn to question the experts.
The Team Quiz design uses
subject-matter experts to present relevant information in the form of
10-minute lecturettes. After each lecturette, teams spend 5 minutes
comparing their notes and preparing a set of questions. Later, teams take
turns quizzing each other to win points.
Reading assignments
provide the basic content in the Question
Game. Each trainee prepares 10 cards on the content of the
assignment with a question on one side and the correct answer on the other. The
trainees are divided into groups of four or five. Each group shuffles its cards
and exchanges them with some other group. Using the new cards, trainees in each
group play a game with players taking turns to read the question on the top card
and coming up with an answer within 15 seconds. Other trainees in the group may
challenge the answer. Depending on the original answer, the challenge, and the
correct answer on the back of the card, trainees earn score points. The game ends
when all the cards are used. The trainee with the highest score is the winner.
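The shuffle-and-exchange mechanics of the Question Game can also be captured in a few lines of Python. Everything below is a hypothetical illustration: the card content, the group names, and the turn rotation. The 15-second limit, the challenges, and the scoring are handled by the group itself and appear only as comments.

    import random
    from collections import deque

    def exchange_decks(decks: dict[str, list[tuple[str, str]]]) -> dict[str, list[tuple[str, str]]]:
        """Shuffle every group's deck, then pass each deck to the next group."""
        names = list(decks)
        for cards in decks.values():
            random.shuffle(cards)
        return {names[i]: decks[names[(i + 1) % len(names)]] for i in range(len(names))}

    def run_group(players: list[str], deck: list[tuple[str, str]]) -> None:
        """Rotate through players until the exchanged deck is exhausted."""
        order = deque(players)
        for question, answer in deck:
            reader = order[0]
            order.rotate(-1)
            # The reader answers aloud within 15 seconds; others may challenge;
            # points are awarded against the answer on the back of the card.
            print(f"{reader} draws: {question!r} (answer on back: {answer!r})")

    # Hypothetical content for two small groups.
    decks = {
        "Group 1": [("What is a job aid?", "A checklist, worksheet, or similar memory support")],
        "Group 2": [("What is a framegame?", "A game shell that accepts new instructional content")],
    }
    for group, deck in exchange_decks(decks).items():
        run_group([f"{group}, player {n}" for n in range(1, 5)], deck)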
Guideline 18. Use peer
tutoring to maximize mutual learning and teaching.
The strategy of using
trainees to teach each other has been in use since ancient times. Recent
research studies confirm the instructional effectiveness of this strategy and
the truth inherent in the Latin advice, doce
ut discas (teach in order to learn). From an instructional design
point of view, peer tutoring enables you to spend less time on the design by
utilizing the trainees as valuable resources during delivery. Here are some
suggestions to maximize the instructional benefits from this approach:
To initiate the peer
tutoring process, an initial set of trainees has to be taught the basic skills
and knowledge. This can be done through any suitable medium and method. An
important strategy is to teach different knowledge and skill items to different
trainees so that everyone has to (and is able to) teach and learn from the
others.
Self-managed learning
teams increase the efficiency of
peer learning. You can use the ingenious cooperation-competition blend in which
teams coach and support each other during the collaborative learning periods
and fight for points with contestants from other teams during the competitive
tournament period.
One-on-one tutoring is especially effective with such training objectives as
conversing in a foreign language and mastering a motor skill. By tutoring,
testing, and certifying a few representative trainees and dividing them into
two teams, you can initiate an effective peer learning format: Certified tutors
teach other trainees on an individual basis. When the tutoring is completed,
the trainee is given a performance test by a certified member of the other team. If
successful, the trainee becomes a certified member of the team to which his or
her tutor belongs. This process is continued until all trainees are certified.
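A minimal Python sketch of this tutor-and-certify cycle, assuming two starter teams, follows. The pass rate is a placeholder for the performance test administered by a certified member of the opposite team; the names are hypothetical.

    import random

    def certify_all(team_a: set[str], team_b: set[str], uncertified: set[str]) -> None:
        """Tutor, test, and certify trainees until nobody is left uncertified."""

        def passes_test(trainee: str) -> bool:
            # Placeholder for the performance test administered by a certified
            # member of the opposite team.
            return random.random() < 0.8

        while uncertified:
            for tutor_team in (team_a, team_b):
                if not uncertified:
                    break
                trainee = uncertified.pop()
                # A tutor from tutor_team coaches the trainee one-on-one before
                # the other team tests him or her.
                if passes_test(trainee):
                    tutor_team.add(trainee)   # Joins the tutor's team, now certified.
                else:
                    uncertified.add(trainee)  # Gets more tutoring, then retakes the test.

    team_a, team_b = {"first certified tutor"}, {"second certified tutor"}
    certify_all(team_a, team_b, {"trainee 1", "trainee 2", "trainee 3", "trainee 4"})
    print(team_a, team_b)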
Strategy 10. Use
performance support systems.
Learning, just like
anything else you teach to others, is a performance. You can teach others how
to learn and, equally importantly, you can improve others' learning through
different types of performance support, including physical facilities, tools
and supplies, job aids, and incentive systems. By shifting your focus from
providing training to facilitating learning, you can achieve significant
savings of time and money.
Guideline 19. Facilitate
learning through individualized systems of instruction.
Different people learn
differently and this fact prevents us from being able to design the perfect instructional
package for all trainees. However, by providing instructional alternatives and
flexible structures, you can demonstrate your respect for diversity and save
instructional design time. Here are some suggestions for setting up an
individualized instructional system:
· In most instructional situations, you can provide choices among
instructional materials, methods, and schedules--but not among instructional
objectives. All trainees are required to demonstrate the achievement of the
same set of objectives.
· Conduct appropriate analysis to specify instructional objectives.
Rewrite these objectives in the language of the novice trainee, and include a
rationale in terms of relevance to the workplace. Use a course map diagram to
show alternative paths for mastering these objectives.
· Construct a criterion-referenced test based on these objectives.
Prepare parallel versions of this test so that the trainees can take them
repeatedly.
· Collect all available instructional materials in the relevant
subject area. Include textbooks, manuals, reprints, audiotapes, videotapes, and
CBT courseware.
· Review the materials and prepare a list of resources for each
objective in your list. Make specific references to different materials.
· If some objectives are not covered by any available material,
prepare your own handouts, audiotapes, or videotapes.
· Store copies of the materials in a convenient study room, along
with the necessary media equipment and computers.
· Design an administrative system for tracking, training, and
certifying the trainees (a simple sketch of such a tracking record follows this
list). Prepare a handout explaining how the trainee can work through the system
and get certified.
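As a concrete illustration of the tracking record mentioned in the last suggestion, here is a minimal Python sketch. The objectives, resources, and class name are hypothetical; the point is simply that each objective maps to alternative materials and that certification follows from mastering every objective on a criterion-referenced test.

    # Hypothetical objectives and resources; each objective maps to alternative
    # study materials, and certification follows from mastering every objective.
    RESOURCES = {
        "Explain the safety procedure": ["Manual ch. 3", "Videotape 2", "CBT module A"],
        "Operate the control panel": ["Job aid card 5", "Coaching session", "CBT module B"],
    }

    class TraineeRecord:
        def __init__(self, name: str):
            self.name = name
            self.mastered: set[str] = set()

        def record_test(self, objective: str, passed: bool) -> None:
            """Log one attempt on a criterion test; parallel forms allow retakes."""
            if passed:
                self.mastered.add(objective)

        def resources_remaining(self) -> dict[str, list[str]]:
            """Alternative materials for the objectives not yet mastered."""
            return {obj: mats for obj, mats in RESOURCES.items() if obj not in self.mastered}

        def certified(self) -> bool:
            return set(RESOURCES) <= self.mastered

    record = TraineeRecord("Pat")
    record.record_test("Explain the safety procedure", passed=True)
    print(record.resources_remaining())  # Only the unmastered objective remains.
    print(record.certified())            # False until every objective is mastered.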
Guideline 20. Use
suitable incentives to reward learning.
Because of impossible
deadlines for implementing a mandated course in one of my client
organizations, we were forced to try an innovative approach. We took the money
allotted to the instructional design project and used it to reward employees
who passed the certification test on their own. This approach to instructional
incentives can help you save design time in situations where alternative
instructional resources are available. Here are some suggestions for designing
suitable instructional incentive systems:
· Conduct an incentives analysis to identify various rewards which
motivate members of your target population. Prepare a menu of suitable rewards.
Don't limit yourself to monetary rewards. Try such creative alternatives as
lunch with the president or a prestigious parking space.
· Specify the instructional objectives and procedures for
demonstrating their mastery. List several intermediate tests rather than a
single final test.
· Match each level of mastery with appropriate incentives. Offer
alternative rewards at each level (a simple sketch of such a mapping follows
this list).
· In corporate settings, work with the personnel department to link
the mastery of the instructional objectives to pay increases, promotions, and
other such job-related rewards. Add disincentives for the non-mastery of
important instructional objectives within an appropriate time period.
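Here is a minimal sketch, with entirely hypothetical levels and rewards, of the mastery-to-incentive mapping suggested above; in practice it is simply a lookup table that the personnel department and the trainees can both read.

    # Entirely hypothetical levels and rewards: a lookup table pairing each
    # level of mastery with a menu of alternative incentives.
    INCENTIVES = {
        "module test passed": ["Lunch with the president", "Gift certificate"],
        "intermediate certification": ["Prestigious parking space", "Half-day off"],
        "final certification": ["Pay increase", "Promotion review"],
    }

    def reward_options(level: str) -> list[str]:
        """Return the alternative rewards a trainee may choose from at a level."""
        return INCENTIVES.get(level, [])

    print(reward_options("intermediate certification"))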
Two Concluding Thoughts
Mix and match the
strategies. The just-in-time
strategies discussed above are not mutually exclusive entities. You can use
different combinations to save time and money in your instructional design
project. For example, you may skip an instructional design activity, combine
two others, take a few shortcuts in another, computerize your production, and
deliver your final package within an individualized instructional framework.
Exactly which strategies you select and how you combine them should depend on
the resources and constraints in your instructional design situation.
Faster, cheaper--and
better! When you discuss these
just-in-time strategies with your professional colleagues, you will be accused
of compromising basic principles, returning to the prehistoric period, and
reducing instructional integrity. If you listen to others long enough, you will
begin to feel guilty and doubt your motivation. But remember that the final
criterion for evaluating instruction is how well the trainees learn. My
experiences (which are confirmed by the experiences of my students) actually
suggest that quick-and-dirty instructional packages often result in
higher-quality instruction. Paradoxical though this may sound, a little
reflection reveals the logic. When you do not have time to make a big
production out of instructional design, you are forced to focus on the basics.
You and your team are not tempted into bells, whistles, and other
embellishments. The resulting instructional package is lean and powerful.