
RESEARCH AND EVALUATION

The best way to find things out is not to ask questions at all. If you fire off a question, it is like firing off a gun—bang it goes, and everything takes flight and runs for shelter. But if you sit quite still and pretend not to be looking, all the little facts will come and peck around your feet, situations will venture forth from thickets, and intentions will creep out and sun themselves on a stone; and if you are very patient, you will see and understand a great deal more than a person with a gun does. (Huxley, 1982, p. 20)

This marvelous quote from Huxley's The Flame Trees of Thika illustrates a metaphorical rationale for a major refocusing of procedures for the evaluation of distance education systems. Traditional evaluation models have concentrated on the empirical and quantitative procedures that have been practiced for decades (Fitzpatrick, Sanders, & Worthen, 2004; Stufflebeam & Shinkfield, 2007). More recently, evaluators of distance education programs have begun to propose more qualitative models that include the collection of many non-numerical types of information (Rovai, 2003; Sherry, 2003).

CHAPTER GOAL

The purpose of this chapter is to present approaches for the evaluation of distance education courses, programs, and systems.

CHAPTER OBJECTIVES

After reading and reviewing this chapter, you should be able to

1. Differentiate between research and evaluation.
2. Define evaluation.
3. Explain the six categories of evaluation information: measures of activity, efficiency, outcomes, program aims, policy, and organizations.
4. Describe the AEIOU approach to evaluation and its five components—accountability, effectiveness, impact, organizational context, and unanticipated consequences.

CHAPTER 12

Evaluating Teaching and Learning at a Distance

Because it is easy to think of them as being the same thing, it is important to differentiate between theory-based research and evaluation. Simonson, Schlosser, and Orellana (2011) provided a review of distance education literature, including research on and about distance education. This review summarized distance education research as follows:

■ Distance education is just as effective as traditional education in regard to learner outcomes.
■ Distance education learners generally have more favorable attitudes toward distance education than traditional learners, and distance learners feel they learn as well as nondistant students.
■ The research clearly shows that distance education is an effective method for teaching and learning.

Evaluation, as contrasted with research, is the systematic investigation of the worth or merit of an object. Program evaluation is the systematic investigation of the worth of an ongoing or continuing distance education activity (Yarbrough, Shulha, Hopson, & Caruthers, 2011). Martinez, Liu, Watson, and Bichelmeyer (2006) discussed the importance of evaluating distance education programs. Evaluation of programs is used to identify strengths and weaknesses, as well as the benefits and drawbacks, of teaching and learning online. They asked students, administrators, and instructors to evaluate course management categories such as registration, support services, advising, and sense of community. One important finding of this study was the equivalence of the distance education program to the traditional program (Martinez et al., 2006).

This chapter focuses on approaches to evaluation for the purpose of improving distance education and determining the worth of distance education activities. Rose (2000) identified a number of databases related to the evaluation of distance education courses that are available on the World Wide Web. These online databases provide a repository of up-to-date information about online courses. Additional information related to evaluation and distance education is available in Ruhe and Zumbo (2009), Thompson and Irele (2007), Cyrs and Smith (1990), Fitz-Gibbon and Morris (1987), Fitzpatrick et al. (2004), and Rossi, Lipsey, and Freeman (2003).

EVALUATION AND DISTANCE EDUCATION—FIVE STEPS

Evaluation procedures are becoming of critical interest to trainers and teachers who are adopting distance education (Peak & Berge, 2006). As new distance education systems are planned and implemented, there is considerable concern that the time and effort required to move to distance delivery of instruction should produce a valuable educational experience; thus, evaluation is regularly a part of plans to move from traditional face-to-face instruction to distance education. Kirkpatrick and Kirkpatrick's (2006) evaluation approach, with its four levels of evaluation supplemented by Phillips's (2003) fifth level—return on investment—seems to be the preferred approach of many trainers and some educators.

Kirkpatrick and Kirkpatrick's evaluation approach has traditionally been used to evaluate classroom training and teaching, especially in the private, government, and military sectors. It is a straightforward approach that produces usable information for the trainer. The four levels of the approach are designed to obtain answers to commonly asked questions about training—Did they like it? Did they learn it? Will they use it? Will it matter? (Simonson, 2007).

Level 1—Reactions (Did They Like It?)

As the word reactions implies, evaluation at this level measures how participants in the training program feel about the educational activity. Students are asked what they liked and did not like about the training, sometimes several times during a course or program. Students complete checklists, respond to Likert-type statements, and offer open-ended comments, all to determine whether the training was perceived positively by participants.

Level 2—Learning

At this level, evaluation strategies attempt to determine more than learner satisfaction. Rather, evaluators assess the extent to which learners have advanced in skills, knowledge, or attitude. What and how much did participants learn? What new skills do they possess? And what new and appropriate attitudes have been produced?

Methods include objective testing, team assessment, and self-assessment. Often, pretest-to-posttest change is used as a measure at Level 2.

Level 3—Transfer

At this level, evaluators attempt to determine whether the skills, knowledge, and attitudes learned as a result of training are being transferred to the workplace or to actual learner activities. Evaluation questions deal with the use of new skills or the application of new knowledge to events. Timing of the evaluation at this level is critical, and problematic, since it is difficult to know when transfer actually occurs.

Level 4—Results

Evaluation activities at this level attempt to measure the success of the training or teaching program in terms of increased productivity, improved quality, lower costs, and, for businesses, even higher profits. Trainers are increasingly being asked to demonstrate the direct and indirect impact of training on the success of the organization and to relate training to mission accomplishment. In schools, Level 4 evaluations often look at enrollments in additional courses, learning motivation, and educational achievement.

Level 5—Return on Investment

Increasingly, many training and educational organizations that are adopting distance education are interested in the concept of return on investment—converting training results from eLearning activities into monetary values and comparing those values with the cost of the training program to determine a return on investment. Phillips (2003) describes a five-step process to determine return on investment.

1. First, it is necessary to collect Level 4 data to determine whether there is a change in job or educational performance that is positive and measurable. This assumes that evaluation data were collected for the first four levels of the Kirkpatricks' model.

2. Second, evaluators need to identify the training that contributed to the change in performance. Testing can be used, as can control groups that receive different training, or no training at all.

3. Third, it is necessary to convert the results of training or education into monetary values. This often means a relatively subjective process must be undertaken to quantify outcomes related to the training.

4. Next, the evaluation process requires the determination of the total cost of training. This includes trainer costs, facilities expenses, materials purchased, and other expenses.

5. Fifth, return on investment, or ROI, is determined by comparing the monetary benefits to the costs. In this manner, it is possible to quantify the impact of training, the effectiveness of education, and the value of the instruction.
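To make steps 3 through 5 concrete, the following sketch applies the standard form of Phillips's ROI calculation, in which net program benefits (monetary benefits minus costs) are expressed as a percentage of program costs. The dollar figures and the function name are invented for the illustration; only the formula follows Phillips (2003).

```python
def phillips_roi(monetary_benefits: float, program_costs: float) -> float:
    """ROI as a percentage (Phillips, 2003):
    ROI (%) = (net program benefits / program costs) * 100,
    where net benefits = monetary benefits - program costs."""
    net_benefits = monetary_benefits - program_costs
    return net_benefits / program_costs * 100

# Hypothetical figures: converting Level 4 results to money (step 3)
# yielded $240,000 in benefits; total training costs (step 4) were $150,000.
print(phillips_roi(240_000, 150_000))  # 60.0, i.e., a 60% return
```

In this invented case, every dollar spent on the training returned $1.60 in monetized benefits, a 60% return after costs are recovered.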

The ROI process is time-consuming, requires a skilled evaluation team, and is sometimes criticized because it produces evaluation results that look at what has happened rather than what will happen. Peak and Berge (2006) also recommend that not everything needs to be measured. Rather, leaders should determine what they think is important, and trainers should then evaluate those areas.

While evaluation has always been somewhat important in corporate and military training, and of interest to a lesser extent in education, the recent phenomenal growth of distance education has made many leaders want to know the implications of moving to training and teaching that is not face-to-face. Thus, the Kirkpatricks' and Phillips's evaluation approaches have received increased attention, especially since most evidence clearly demonstrates that distance education works academically to produce required achievement gains. The evidence is clear that students learn just as effectively when they are taught at a distance as when they learn in a traditional classroom (Simonson, 2007). Thus, it can be generalized that traditional training and eLearning work equally well. The question for evaluators then becomes the determination of the advantages, if any, of moving to an eLearning environment. Evaluators are looking at cost savings, time savings, increased motivation and satisfaction, economies of scale, and other nonachievement outcome metrics. Evaluation of eLearning should provide leaders with the evidence they need to support or refute training decisions.

EVALUATION AND THE OPEN UNIVERSITY

Program evaluation at the Open University of Great Britain is the systematic investigation of the merit of a particular distance education program, curriculum, or teaching method, and of how it might be improved compared with alternatives. As part of the Open University's evaluation procedures for distance education (Woodley & Kirkwood, 1986, 2005), two alternative strategies have been merged. The first is the traditional, positivist-empiricist approach to evaluation, which represents an attempt to apply the rules and procedures of the physical sciences to evaluation. The second is a more eclectic view of evaluation that incorporates qualitative and naturalistic techniques for the evaluation of distance education.

The traditional strategy normally includes an experiment that determines the effectiveness of a distance education strategy. The distance education project is structured from its beginning with the requirements of the evaluator in mind. Carefully matched samples are picked, controls are established, and variables are selected for which comparison data will be collected. Next, objective tests of variables are selected or constructed. Data are collected before, during, and always after the instructional event or procedures. Then the evaluator takes the data and prepares the evaluation report, which is submitted weeks or months later.

The primary outcome of this type of evaluation is the comparison of the data collected from the two or more categories of learners. For example, the distant learners are compared with those taught locally, and conclusions about the effectiveness of the distance education activity are made.

This approach represents the traditional process for the evaluation of distance education. Recently, at the Open University and elsewhere, a countermovement has emerged (House, 2010). Advocates of this counterapproach are united in one primary way: They are opposed to the traditional, quantitative procedures for evaluation. Increasingly, evaluation activities are incorporating more naturalistic methodologies with holistic perspectives. This second perspective for evaluation uses focus groups, interviews, observations, and journals to collect evaluation information in order to obtain a rich and colorful understanding of events related to the distance education activity.

From a practical standpoint, most evaluators now use a combination of quantitative and qualitative measures. Certainly, there is a need to quantify and count. Just as certainly, there is a need to understand opinions and hear perspectives.

According to Woodley and Kirkwood (1986, 2005), six categories of evaluation information can be collected about distance education activities:

1. Measures of Activity. These measures are counts of the numbers of events, people, and objects. Administrative records often provide data for activity questions. Activity questions are ones such as:
   ■ How many courses were produced?
   ■ How many students were served?
   ■ How many potential students were turned away?

2. Measures of Efficiency. Measures of efficiency are closely related to measures of activity, and often administrative records can be the source of efficiency information; the sketch following this list illustrates how a few such measures might be computed. Efficiency questions often asked are ones such as:
   ■ How many students successfully completed the course?
   ■ What was the average student's workload?
   ■ How many students enrolled in additional courses?
   ■ How much did the course cost?
   ■ How much tuition was generated?

3. Measures of Outcomes. Measures of adequate learning are usually considered the most important measures of outcomes of distance education activities. Often, interviews with learners are used to supplement course grades in order to find students' perceptions about a distance education activity. Mail surveys are also efficient ways to collect outcome information from distant learners. Other outcome measures include documenting the borrowing and use of courses and course materials by other institutions as an indicator of effectiveness, and the enrollment by students in additional, similar courses as an indicator of a preliminary course's success.

4. Measures of Program Aims. Some distance teaching programs specify their aims in terms of what and whom they intend to teach, and evaluation information is collected to establish the extent to which these aims were met. One common aim of distance education programs is to reach learners who otherwise would not be students. Surveys of learners can be used to collect this type of information.

5. Measures of Policy. Evaluation in the policy area often takes the form of market research. Surveys of prospective students and employers can be used to determine the demand for distance education activities.
   Policy evaluation can also include monitoring. Students can be surveyed to determine if tuition is too high, if appropriate courses are being offered, and if there are impediments to course success, such as the lack of access to computers or the library.
   Sometimes policy evaluation can be used to determine the success of experimental programs, such as those for low achievers or for students who normally are not qualified for a program. The purpose of policy evaluation is to identify procedures that are needed or that need to be changed, and to develop new policies.

6. Measures of Organizations. Sometimes it is important to evaluate a distance education institution in terms of its internal organization and procedures. Evaluators sometimes are asked to monitor the process of course development or program delivery to help an organization be more efficient. This category of evaluation requires on-site visits, interviews, and sometimes the use of journals by key organization leaders.
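As noted under Measures of Efficiency, the sketch below shows how a few activity and efficiency measures might be computed from administrative records. The record layout, field names, and all figures are hypothetical; the chapter prescribes the questions to ask, not any particular data format.

```python
# Hypothetical enrollment records; the fields are illustrative only.
records = [
    {"student": "A", "completed": True,  "tuition": 900},
    {"student": "B", "completed": False, "tuition": 900},
    {"student": "C", "completed": True,  "tuition": 900},
    {"student": "D", "completed": True,  "tuition": 900},
]
course_cost = 2500  # assumed total cost of offering the course

enrolled = len(records)                            # measure of activity
completers = sum(r["completed"] for r in records)  # measure of efficiency
completion_rate = completers / enrolled
tuition_generated = sum(r["tuition"] for r in records)
cost_per_completer = course_cost / completers

print(f"{enrolled} enrolled, {completion_rate:.0%} completed")
print(f"${tuition_generated} tuition generated against ${course_cost} in costs "
      f"(about ${cost_per_completer:.0f} per completer)")
```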

These six categories of evaluation are not used for every distance education activity, although some modest evaluation activity is almost always necessary. It is important that the activities of evaluators be matched to programmatic needs. Woodley and Kirkwood (1986, 2005) have summarized evaluation in distance education as a fairly eclectic process whose procedures should be matched to program needs.

QUALITY SCORECARD AND QUALITY MATTERS

Evaluating Programs and Courses

Two widely used and standardized evaluation instruments are the Sloan Consortium's Quality Scorecard for the Administration of Online Education Programs and the Quality Matters Rubric Standards. These two instruments can be used to evaluate online programs and courses, and they are also effective when courses are designed, as models for effective programs and courses (Quality Matters, 2013; Sloan-C, 2013).

The Scorecard deals with issues such as institutional support, technology support, course development and instructional design, course structure, teaching and learning, social and student engagement, faculty support, student support, and evaluation and assessment. Quality Matters provides a rubric for courses covering the course overview, learning objectives, assessment and measurement, instructional materials, learning interaction and engagement, technology, and learner support and accessibility. Both tools are excellent.

THE AEIOU APPROACH

Fortune and Keith (1992) proposed the AEIOU approach for program evaluation, especially the evaluation of distance education projects. The effectiveness of this approach has been demonstrated through evaluating the activities of the Iowa Distance Education Alliance Star Schools Project (Simonson & Schlosser, 1995a; Sorensen, 1996; Sorensen & Sweeney, 1995, 1996, 1997; Sweeney, 1995), a multiyear, statewide distance education activity. Additionally, the model has been used to evaluate a number of other innovative projects, such as the Iowa Chemistry Education Alliance in 1995, the Iowa General Chemistry Network in 1994, and the DaVinci Project: Interactive Multimedia for Art and Chemistry (Simonson & Schlosser, 1995b). More recently, a major distance education initiative in South Dakota used a modified version of the AEIOU approach (Simonson, 2005).

The AEIOU approach is similar to Woodley and Kirkwood's in that it is an eclectic one that uses quantitative and qualitative methodologies. It has two primary purposes as an evaluation strategy. First, the model provides formative information to the staff about the implementation of their project. Second, it provides summative information about the value of the project and its activities. The AEIOU evaluation process provides a framework for identifying the key questions necessary for effective evaluation. Some evaluation plans use only parts of the framework, whereas other, more comprehensive plans use all components. Some examples of evaluation questions asked in comprehensive distance education projects are presented next.

Component 1—Accountability

Did the Planners Do What They Said They Were Going to Do? This is the first step in determining the effectiveness of the program, project, or course, and it is targeted at determining whether the project's objectives and activities were completed. Evaluation questions typically center on the completion of a specific activity and often are answered "yes" or "no." Additionally, counts of numbers of people, things, and activities are often collected.

Questions such as the following are often asked to determine project accountability:

■ Was the appropriate number of class sessions held?
■ How many students were enrolled?
■ How many copies of program materials were produced, and how many were distributed?

Methods Used: Accountability information is often collected from project administrative records. Project leaders are often asked to provide documentation of the level of completion of each of the project's goals, objectives, and activities. Sometimes evaluators interview project staff to collect accountability data.

Component 2—Effectiveness

How Well Done Was the Program, Project, or Course? This component of the evaluation process attempts to place some value on the activities of the program, course, or project. Effectiveness questions often focus on participant attitudes and knowledge. Obviously, grades, achievement tests, and attitude inventories are measures of effectiveness. Less obvious are other ways to determine quality. Often, raters are asked to review course materials and course presentations to determine their effectiveness, and student course evaluations can be used to collect reactions from distance education participants.

Examples of questions to determine effectiveness include:

■ Were the in-service participants satisfied with their distance education instruction?
■ Did the students learn what they were supposed to learn?
■ Did the teachers feel adequately prepared to teach distant learners?

Methods Used: Standardized measures of achievement and attitude are traditionally used to determine program effectiveness. Surveys of students and faculty can be used to ask questions related to perceptions about the appropriateness of a project or program. Focus groups (Morgan, 1996) also provide valuable information; participants are systematically asked to respond to questions about the program. Finally, journals are sometimes kept by project participants and then analyzed to determine the day-to-day effectiveness of an ongoing program.

Component 3—Impact

Did the Program, Course, or Project Make a Difference? During this phase of the evaluation, questions focus on identifying the changes that resulted from the program's activities and are tied to the stated outcomes of the project or course. In other words, if the project had not happened, what of importance would not have occurred? A key element of the measurement of impact is the collection of longitudinal data. The impact of distance education courses is often determined by following learners' progress in subsequent courses or in the workplace to determine if what was learned in the distance education course was useful.

Determinants of impact are difficult to identify. Often, evaluators use follow-up studies to determine the impressions made on project participants; and sometimes, in distance education programs, learners are followed and questioned by evaluators in subsequent courses and activities.

Questions might include:

■ Did students register for additional distance education courses?
■ Has the use of the distance education system increased?
■ Have policies and procedures related to the use of the distance education system been developed or changed?

Methods Used: Qualitative measures provide the most information to the evaluator interested in program impact. Standardized tests, record data, and surveys are sometimes used. Also, interviews, focus groups, and direct observations are used to identify a program's impact.

Component 4—Organizational Context

What Structures, Policies, or Events in the Organization or Environment Helped or Hindered the Project in Accomplishing Its Goals? This component of evaluation has traditionally not been considered important, even though evaluators have often hinted in their reports at organizational policies that either hindered or helped a program. Recently, however, distance educators have become very interested in organizational policy analysis in order to determine barriers to the successful implementation of distance education systems, especially when those systems are new activities of traditional educational organizations, such as large public universities.

The focus of this component of the evaluation is on identifying those contextual or environmental factors that contributed to, or detracted from, the project or course's ability to conduct activities. Usually these factors are beyond the control of the project's participants. Effective evaluation of organizational context requires the evaluator to be intimately involved with the project or course in order to have a good understanding of the environment in which the project or course operates.

Questions typically addressed in evaluating organizational context include:

■ What factors made it difficult to implement the project or to successfully complete the course?
■ What contributed most to the success or failure of the program, course, project, or the students in the course?
■ What should be done differently to improve things and make the course more effective?

Methods Used: Organizational context evaluation uses interviews of key personnel such as faculty or students, focus groups made up of those impacted by a program, and document analysis that identifies policies and procedures that influence a program or course. Direct participation in program activities by the evaluator is also important. Sometimes evaluators enroll in distance education courses; more often, a student is asked to complete a journal while enrolled in a course. By participating, the evaluator is confronted directly with the organizational context in which a program exists and can comment on this context firsthand.

Component 5—Unanticipated Consequences

What Changes or Consequences of Importance Happened as a Result of the Program, Course, or Project That Were Not Expected? The purpose of this component of the AEIOU approach is to identify unexpected changes, of either a positive or a negative nature, that occurred as a direct or indirect result of the program, course, or project. Effective evaluators have long been interested in reporting anecdotal information about the project or program they were evaluating, but only recently has this category of information become recognized as important, largely because of the positive influence of qualitative procedures on evaluation. Often evaluators, especially internal evaluators who are actively involved in the project or course's implementation, have many opportunities to observe successes and failures during the trial-and-error process of beginning a new program. Unanticipated consequences of developing new or modified programs, especially in the dynamic field of distance education, are a rich source of information about why some projects are successful and others are not. Central to the measurement of unanticipated outcomes is the collection of ex post facto data.

Examples of questions asked include:

■ Have relationships between collaborators or students changed in ways not expected?
■ Have related, complementary projects been developed?
■ Were unexpected linkages developed between groups or participants?
■ Was the distance education system used in unanticipated ways?
■ Did the distance education system have an impact on student learning other than that expected?

Methods Used: Interviews, focus groups, journals, and surveys that ask for narrative information can be used to identify interesting and potentially important consequences of implementing a new program. Often, evaluators must interact with project participants or course students on a regular basis to learn about the little successes and failures that less sensitive procedures overlook. Active and continuous involvement by evaluators permits them to learn about the project as it occurs.

Sweeney (1995) advocates an eclectic approach to evaluation, an approach also supported by Fitzpatrick et al. (2004). The AEIOU model is a dynamic one that permits the evaluator to tailor the process of program evaluation to the specific situation being studied.
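Because comprehensive plans use all five components while modest plans use only a subset, an evaluation team may find it convenient to encode its plan as structured data. The sketch below is purely illustrative: the component names and guiding questions paraphrase this chapter, but the data layout and the sample method lists are assumptions rather than part of the AEIOU model itself.

```python
# A hypothetical encoding of an AEIOU evaluation plan. The five components
# follow Fortune and Keith (1992); the questions and methods shown are
# condensed from this chapter and are illustrative only.
aeiou_plan = {
    "accountability": {
        "question": "Did the planners do what they said they were going to do?",
        "methods": ["administrative records", "staff interviews"],
    },
    "effectiveness": {
        "question": "How well done was the program, project, or course?",
        "methods": ["achievement tests", "attitude surveys", "focus groups"],
    },
    "impact": {
        "question": "Did the program, course, or project make a difference?",
        "methods": ["longitudinal follow-up", "record data"],
    },
    "organizational_context": {
        "question": "What helped or hindered the project in meeting its goals?",
        "methods": ["key-personnel interviews", "document analysis"],
    },
    "unanticipated_consequences": {
        "question": "What important unexpected changes occurred?",
        "methods": ["journals", "open-ended surveys"],
    },
}

# A modest evaluation might select only a subset of the components.
for component in ("accountability", "effectiveness"):
    print(component, "->", aeiou_plan[component]["question"])
```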

PROGRAM EVALUATION: EXAMPLES

South Dakota

South Dakota has a network for distance education that connects every school in the state. Currently, hundreds of classrooms are connected to the Digital Dakota Network (DDN). The DDN was funded using state monies and grants from telecommunications providers, such as QWEST Communications.

Implementation of the DDN was called the Connecting the Schools project. As the network came online and began to be used, it was decided that a comprehensive evaluation effort was needed. Evaluators used the AEIOU approach and collected both quantitative and qualitative information (Simonson, 2005; Simonson & Bauck, 2001).

Quantitative information was collected using a locally developed survey called the Connecting the Schools Questionnaire (CSQ). The CSQ asked respondents to provide four categories of information: demographics, information about personal innovativeness, questions about organizational innovativeness, and questions about distance education.

Demographic information was collected in order to obtain a profile of the teachers in the state, and it included questions about age, years of experience, gender, academic background, and professional involvement. The second part of the CSQ was a modified version of Hurt, Joseph, and Cook's (1977) innovativeness scale (Simonson, 2000); the innovativeness scale is a standardized measure of how innovative a person thinks he or she is. Part three of the CSQ was a modified version of Hurt and Teigen's (1977) Perceived Organizational Innovativeness scale, a standardized measure of a person's perception of his or her employer's organizational innovativeness. The final section of the CSQ asked questions about distance education. These questions were designed to find out how much South Dakota teachers knew about distance education and to determine their general feelings about the impact of distance education on teaching and learning.

The qualitative portion of the CSQ evaluation in South Dakota used focus groups, participant observations, interviews, and site visits. Several questions were at the heart of the qualitative evaluation. First, evaluators tried to determine what educators thought would be the greatest benefits provided by implementing distance education. Second, attempts were made to determine what was preventing individuals from becoming involved in distance education. Next, school superintendents were selected randomly and interviewed to determine their perceptions of the impact of distance education and the Digital Dakota Network on education in their school districts (Calderone, 2003). Finally, questions were asked about the impediments to distance education.

When the quantitative data were combined with the qualitative information, a rich understanding was provided to education leaders about South Dakota's ability to adopt distance education (Learning at a Distance: South Dakota, www.tresystems.com/projects/). Complete results of the evaluation were reported in Simonson (2005). In general, the evaluation of the South Dakota project verified that Rogers's (2003) theory concerning the diffusion of innovations was directly applicable to distance education efforts in South Dakota, and that this theory could effectively serve as a model for promoting the adoption of innovations such as the DDN specifically, and distance education in public schools more generally.

Iowa

Several years ago, it was decided that a three-phase plan should be implemented to establish distance education classrooms throughout the state of Iowa. Recently, hundreds of sites were connected to this distance education infrastructure, which was named the Iowa Communications Network (ICN).

As part of the implementation plan for the ICN, a comprehensive evaluation program was put into action. This program utilized the AEIOU approach and collected data from thousands of sources and individuals. The evaluation approach went through several stages during the 5 years it was used. First, evaluators concentrated on evaluating the construction, connection, and implementation of the ICN's physical infrastructure. Records related to classroom design, construction schedules, and dollars spent were collected and reviewed, and summary results were reported. This related to the accountability component of the AEIOU approach.

Next, those involved in the decision-making process for establishing the network were interviewed and completed surveys. Evaluators used the results to develop reports on the effectiveness of the processes used to construct the ICN. To determine impact, evaluators conducted follow-up investigations of classroom utilization and examined records of how the system was used.

The program evaluators examined many interesting organizational issues, such as who made decisions about where classrooms were located, how funds were obtained and spent, and who controlled access to the system. One interesting outcome was related to the use of the distance education classrooms, which were typically locked. Utilization was related to who held the room key: usage was highest when the room was left unlocked during regular school hours, second highest when the school library media specialist held the key to a locked room, and relatively low when the principal held the key.

Finally, program evaluators identified unanticipated outcomes. One of the most significant was the infusion of several million dollars from federal, state, and local sources to support the development of the network. How these funds were obtained and used added to the importance of the evaluation report.

Once the network was built and a plan for its continued growth was put into place, evaluators shifted their primary focus to the human side of the growth of distance education in the state. Staff development, technical training, curriculum revisions, and school restructuring became the focus of network planners and funding agencies, so program evaluators used the AEIOU model to obtain information about these activities. The approach was used to provide formative information about the development of programs and their impact on teachers and learners, and also to provide information on outcomes, or summative information, to document the successes and failures of various program activities.

A true understanding of the activities of the evaluators of this statewide, multiyear project can only be gained by reviewing the yearly reports they submitted. However, it is important to note that the evaluation plan provided the following information:

Accountability. Evaluators checked records, interviewed staff, and visited classrooms to determine the status of the development of the ICN, both as a physical system and as a tool used by teachers to deliver courses to distant learners. The accountability focus shifted during the project as its activities moved from construction to implementation and finally to maintenance.

Effectiveness. Evaluators conducted interviews and focus groups to determine what impact the availability of the ICN had on classroom education. Surveys were sent and reports were generated that helped education leaders to better understand what role distance education was playing.

Impact. As the network became widely available and the number of courses and activities increased, it became possible to determine the impact of the ICN and distance education events on education in the state. Students were tested and grades were reported. Most of the achievement data showed that learning occurred and good grades were obtained. More important, the availability of new learning experiences grew considerably.

Organizational Context. From the beginning of the ICN project, the role of the state as compared with local educational organizations was a focus of evaluation activities. One outcome was to identify where cooperation between agencies was necessary, such as in scheduling, and where local control, such as in course selection, should be maintained. Project evaluators identified and reported on what the data seemed to indicate were the barriers and the contributors to the effective growth and utilization of the ICN.

Unanticipated Outcomes. During the project, scores of unanticipated outcomes were identified and reported. Among the most interesting were:

■ The movement of the ICN into the role of Internet service provider
■ The role of the ICN in attracting external grants
■ The role of distance education and the ICN in the movement to restructure schools
■ The impact of the ICN on positive attitudes toward technology in education
■ The emerging role of the public television station in Iowa education

There were also many other unanticipated outcomes. The AEIOU approach was useful in helping the state's educators evaluate the role of distance education as an approach and the ICN as an infrastructure. Evaluation played a significant part in the positive implementation and use of this new technology in the state of Iowa.

STUDENT EVALUATION OF DISTANCE EDUCATION COURSES

The purpose of a course evaluation is to fulfill accreditation requirements and to provide a means for reporting course and instructor effectiveness. Standardized course evaluation forms are available that have already been developed and have gone through rigorous psychometric analyses. The literature suggests course and instructor evaluation models that focus on six constructs:

■ Teaching and learning
■ Developing a community of learners
■ The instructor
■ The student
■ Implementation of the course
■ Technology use

Evaluation instruments should possess the psychometric characteristics of standardized measures, meaning they should be valid, reliable, administered in a consistent manner, and accompanied by normative tables so scores can be compared.

Valid instruments measure what they are supposed to measure, in this case the effectiveness of online courses and online teaching. Reliable measures are consistent; in other words, if the measure were administered a second time, the scores should be very similar. Consistent administration of course evaluations ensures that more or less favorable conditions of testing do not influence the results. Finally, scores for any course evaluation are difficult to interpret if there are no comparison data. Often, scores from evaluations of many courses are collected so that the scores for any individual course and instructor can be compared with others; usually, any identifiers for the comparison courses are removed. It is important to remember that course and instructor evaluations are to be used for continuous improvement and to provide input for course revisions.
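Reliability in particular is usually summarized with an internal-consistency statistic. The chapter does not name a specific index; Cronbach's alpha, sketched below for a small invented set of Likert ratings, is one common choice for instruments of this kind.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) matrix of ratings:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Invented ratings: five students rating the seven "Teaching and Learning"
# items of a course evaluation on a 5-point Likert scale.
ratings = np.array([
    [5, 4, 5, 4, 5, 4, 5],
    [4, 4, 4, 3, 4, 4, 4],
    [3, 2, 3, 3, 2, 3, 3],
    [5, 5, 4, 5, 5, 4, 5],
    [2, 3, 2, 2, 3, 2, 2],
])
print(round(cronbach_alpha(ratings), 2))  # values near 1.0 indicate high internal consistency
```

Normative comparison then amounts to placing a course's mean ratings against the distribution of means from other courses, with identifying information removed.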

A sample evaluation instrument to collect students' perceptions about the six constructs, the Online Course Evaluation Instrument (OCEI, pronounced ooh-see), is shown in Figure 12–1.

SUMMARY

As distance education in the United States increases in importance, evaluation will continue to be a critical component of the process of improvement. Certainly, the literature is clear: Eclectic models of evaluation, such as the ones advocated by Woodley and Kirkwood (1986) and Sweeney (1995), are most applicable to distance education program evaluation. Evaluators should use quantitative and qualitative procedures. Distance education programs, and even single courses, should be accountable to their goals, should be at least as effective as alternative approaches, and should have a positive impact. Evaluators should attempt, when possible, to identify the organizational contexts that support effective distance education systems, and unanticipated events should both be shared with interested readers and be used to improve courses.

If you are very patient, you will see and understand. (Huxley, 1982, p. 20)

REFERENCES

Calderone, T. (2003). Superintendents' perception of their role in the diffusion of distance education (Unpublished doctoral dissertation). Nova Southeastern University, Fort Lauderdale, FL.

Cyrs, T., & Smith, F. (1990). Teleclass teaching: A resource guide (2nd ed.). Las Cruces, NM: Center for Educational Development.

Fitz-Gibbon, C., & Morris, L. (1987). How to design a program evaluation. Newbury Park, CA: SAGE.

Fitzpatrick, J., Sanders, J., & Worthen, B. (2004). Program evaluation: Alternative approaches and practical guidelines (3rd ed.). Upper Saddle River, NJ: Pearson/Allyn & Bacon.

Fortune, J., & Keith, P. (1992). Program evaluation for Buchanan County Even Start. Blacksburg, VA: College of Education, Virginia Polytechnic Institute and State University.

House, E. (Ed.). (2010). New directions in educational evaluation. Lewes, England: Falmer Press.

Hurt, H., Joseph, K., & Cook, C. (1977). Scales for the measurement of innovativeness. Human Communications Research, 4(1), 58–65.

FIGURE 12–1 An evaluation instrument

ONLINE COURSE EVALUATION INSTRUMENT (OCEI)

Course Name: __________
Gender: ____ Male ____ Female
Age: ____ Years
Class Level: ____ Undergraduate ____ Master ____ Doctorate
Class Term: ____ Summer ____ Fall ____ Winter
Class Size: ____ 1 to 10 ____ 11 to 20 ____ 21 to 30 ____ 31 to 40 ____ 41 and above
First Experience in an Online Course: ____ Yes ____ No

Please rate each item using the following scale:
5 – Strongly agree
4 – Agree
3 – Neither agree nor disagree
2 – Disagree
1 – Strongly disagree

Teaching and Learning
1. The course has clearly stated objectives _________
2. The course activities are consistent with course objectives _________
3. The course syllabus is an accurate guide to course requirements _________
4. The course materials are a helpful guide to key concepts covered in the class _________
5. The course projects and assignments build understanding of concepts and principles _________
6. The course presents appropriate skills and techniques _________
7. The course is current with developments in the field _________

Developing a Community of Learners
1. Collaborative work is a valuable part of this course _________
2. There is opportunity to learn from other students _________
3. Differing viewpoints and discussions are encouraged in this class _________
4. Mutual respect is a concept practiced in this course _________
5. Each student has an opportunity to contribute to class learning _________

The Instructor
1. The instructor clearly states the methods of evaluation that will be used to assess student work _________
2. The instructor uses a variety of methods to evaluate _________
3. The instructor shows respect for the various points of view represented in this class _________
4. The instructor makes learning interesting and motivates students to learn _________
5. The instructor uses technology in ways that help learning of concepts _________
6. The instructor responds to questions with consideration _________
7. The instructor displays a clear understanding of course topics _________

Hurt, H., & Teigen, C. (1977). The development of a measure of perceived organizational innovativeness. Communication Yearbook I (pp. 377–385). New Brunswick, NJ: International Communications Association.

Huxley, E. (1982). The flame trees of Thika: Memories of an African childhood. London, England: Chatto and Windus.

Kirkpatrick, D. (1994). Evaluating training programs: The four levels. San Francisco, CA: Berrett-Koehler.

Kirkpatrick, D., & Kirkpatrick, J. (2006). Evaluating training programs: The four levels (3rd ed.). San Francisco, CA: Berrett-Koehler.

Martinez, R., Liu, S., Watson, W., & Bichelmeyer, B. (2006). Evaluation of a web-based master's degree program: Lessons learned from an online instructional design and technology program. Quarterly Review of Distance Education, 7(3), 267–283.

Morgan, D. (1996). Focus groups as qualitative research. Newbury Park, CA: SAGE.

Peak, D., & Berge, Z. (2006). Evaluation and eLearning. Turkish Online Journal of Distance Education, 7(1), Article 11.

Phillips, J. (2003). Return on investment (2nd ed.). Burlington, MA: Butterworth-Heinemann.

Quality Matters. (2013). Quality matters rubric standards 2011–2013 edition. Retrieved from https://qualitymatters.org

Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York, NY: Free Press.

Rose, E. (2000). An evaluation of online distance education course databases. DEOSNEWS, 10(11), 1–6. Retrieved from http://www.ed.psu.edu/acsde/deos/deosnews.deosarchives.asp

Rossi, P., Lipsey, M., & Freeman, H. (2003). Evaluation: A systematic approach (7th ed.). Newbury Park, CA: SAGE.

Rovai, A. P. (2003). A practical framework for evaluating online distance education programs. Internet and Higher Education, 6(2), 109–124.

Ruhe, V., & Zumbo, B. (2009). Evaluation in distance education and e-learning. New York, NY: Guilford.

Sherry, A. (2003). Quality and its measurement in distance education. In M. Moore & W. Anderson (Eds.), Handbook of distance education (pp. 435–459). Mahwah, NJ: Erlbaum.

Simonson, M. (2000). Personal innovativeness, perceived organizational innovativeness, and computer anxiety: Updated scales. Quarterly Review of Distance Education, 1(1), 69–76.

Simonson, M. (2005). South Dakota's statewide distance education project. In Z. L. Berge & T. Clark (Eds.), Virtual schools: Planning for success (pp. 183–197). New York, NY: Teachers College Press.

Simonson, M. (2007). Evaluation and distance education. Quarterly Review of Distance Education, 8(3), vii–ix.

Simonson, M., & Bauck, T. (2001). Learning at a distance in South Dakota: Description and evaluation of the diffusion of an innovation. Proceedings of Research and Development Papers presented at the Annual Convention of the Association for Educational Communications and Technology, Atlanta, GA. (ERIC Document Reproduction Service No. ED47103)

Simonson, M., & Schlosser, C. (1995a). More than fiber: Distance education in Iowa. Tech Trends, 40(3), 13–15.

Simonson, M., & Schlosser, C. (1995b). The DaVinci project. Paper presented at the Iowa Computer-Using Educators Conference, Des Moines.

Simonson, M., Schlosser, C., & Orellana, A. (2011). Distance education research: A review of the literature. Journal of Computing in Higher Education, 23(2), 124–142.

Sloan-C. (2013). Quality scorecard for the administration of online programs: A handbook. Retrieved from http://sloanconsortium.org/quality_scorecard_online_program

Sorensen, C. (1996). Final evaluation report: Iowa distance education alliance. Ames, IA: Research Institute for Studies in Education.

Sorensen, C., & Sweeney, J. (1995). ICN Technology Demonstration Evaluation. USDLA Education at a Distance, 9(5), 11, 21.

Sorensen, C., & Sweeney, J. (1996, November). AEIOU: An approach to evaluating the statewide integration of distance education. Paper presented at the annual meeting of the American Evaluation Association, Atlanta, GA.
