Journal of Counseling & Development ■ Spring 2007 ■ Volume 85

Assessment & Diagnosis

© 2007 by the American Counseling Association. All rights reserved.

Program Evaluation: The Accountability Bridge Model for Counselors

Randall L. Astramovich and J. Kelly Coker

The accountability and reform movements in education and the human services professions have pressured counselors to demonstrate outcomes of counseling programs and services. Evaluation models developed for large-scale evaluations are generally impractical for counselors to implement. Counselors require practical models to guide them in planning and conducting counseling program evaluations. The authors present the Accountability Bridge Counseling Program Evaluation Model and discuss its use in evaluating counseling services and programs.

Randall L. Astramovich, Department of Counselor Education, University of Nevada, Las Vegas; J. Kelly Coker, Harbin and Associates Psychotherapy, Fayetteville, North Carolina. J. Kelly Coker is now at the Department of Counselor Education, Capella University. Correspondence concerning this article should be addressed to Randall L. Astramovich, Department of Counselor Education, University of Nevada, Las Vegas, 4505 Maryland Parkway, Box 453066, Las Vegas, NV 89154-3066 (e-mail: Randy.Astramovich@unlv.edu).

Program evaluation in counseling has been a consistent topic of discourse in the profession over the past 20 years (Gysbers, Hughey, Starr, & Lapan, 1992; Hadley & Mitchell, 1995; Loesch, 2001; Wheeler & Loesch, 1981). Considered an applied research discipline, program evaluation refers to a systematic process of collecting and analyzing information about the efficiency, the effectiveness, and the impact of programs and services (Boulmetis & Dutwin, 2000). The field of program evaluation has grown rapidly since the 1950s as public and private sector organizations have sought quality, efficiency, and equity in the delivery of services (Stufflebeam, 2000b). Today, professional program evaluators are recognized as highly skilled specialists with advanced training in statistics, research methodology, and evaluation procedures (Hosie, 1994). Although program evaluation has developed as a distinct academic and professional discipline, human services professionals have frequently adopted program evaluation principles in order to conduct micro-evaluations of local services. From this perspective, program evaluation can be considered as a type of action research geared toward monitoring and improving a particular program or service. Because micro-evaluations are conducted on a smaller scale, they may be planned and implemented by practitioners. Therefore, for the purposes of this article, we consider counseling program evaluation to be the ongoing use of evaluation principles by counselors to assess and improve the effectiveness and impact of their programs and services.

Challenges to Counseling Program Evaluation

Counseling program evaluation has not always been conceptualized from the perspective of practicing counselors. For instance, Benkofski and Heppner (1999) presented guidelines for counseling program evaluation that emphasized the use of independent evaluators rather than counseling practitioners. Furthermore, program evaluation literature has often emphasized evaluation models and principles that were developed for use in large-scale organizational evaluations by professional program evaluators (e.g., Kellaghan & Madaus, 2000; Kettner, Moroney, & Martin, 1999). Such models and practices are not easily implemented by counseling practitioners and may have contributed to the hesitance of counselors to use program evaluation methods. Loesch (2001) argued that the lack of counselor-specific evaluation models has substantially contributed to the dichotomy between research and practice in counseling. Therefore, new paradigms of counseling program evaluation are needed to increase the frequency of practitioner-implemented evaluations.

Much of the literature related to counseling program evaluation has cited counselors' lack of both the ability to systematically evaluate counseling services and the interest in doing so (e.g., Fairchild, 1993; Whiston, 1996). Many reasons have been suggested for counselors' failure to conduct evaluations. An important reason is that conducting an evaluation requires some degree of expertise in research methods, particularly in formulating research questions, collecting relevant data, and selecting appropriate analyses. Yet counselors typically receive little training to prepare them for demonstrating outcomes (Whiston, 1996) and evaluating their services (Hosie, 1994). Consequently, counselor education programs have been criticized for failing to provide appropriate evaluation and research training to new counselors (Borders, 2002; Heppner, Kivlighan, & Wampold, 1999; Sexton, 1999; Sexton, Whiston, Bleuer, & Walz, 1997). Counselors may, therefore, refrain from program evaluation because of a lack of confidence in their ability to effectively collect and analyze data and apply findings to their professional practice (Isaacs, 2003). However, for those counselors with the requisite skills to conduct evaluations, their hesitance may be related to the fear of finding that their services are ineffective (Lusky & Hayes, 2001; Wheeler & Loesch, 1981).

Despite calls for counselors and counseling programs to embrace research and evaluation as an integral part of the provision of counseling services (e.g., Borders & Drury, 1992; Fairchild, 1994; Whiston, 1996), there is virtually no information that documents counselors' interest in and use of counseling program evaluation. Although counselors may place minimal value on research and evaluation activities (Loesch, 2001), strong sociopolitical forces, including the emphasis on managed care in mental health and the school reform movement in public education, often require today's counselors to use evaluation methods to demonstrate the effectiveness and impact of their counseling services.

Program Evaluation and Accountability

Distinguishing between program evaluation and accountability is essential because many professionals use the terms interchangeably and, occasionally, as categories of each other. For instance, Isaacs (2003) viewed program evaluation as a type of accountability that focuses primarily on program effectiveness and improvement. However, from our perspective, counseling program evaluation precedes accountability. As defined by Loesch (2001), counseling program evaluations help practitioners "maximize the efficiency and effectiveness of service delivery through careful and systematic examination of program components, methodologies, and outcomes" (p. 513). Counseling program evaluations, thus, have inherent value in helping practitioners plan, implement, and refine counseling practice regardless of the need to demonstrate accountability. However, when called on to provide evidence of program effectiveness and impact, counselors can effectively draw on information gathered from their own program evaluations.

We, thus, conceptualize counseling accountability as providing specific information to stakeholders and other supervising authorities about the effectiveness and efficiency of counseling services (Studer & Sommers, 2000). In our view, demonstrating accountability forms a bridge between counseling practice and the broader context of the service impact on stakeholders. However, accountability should not be the sole motivation for counseling program evaluation. As emphasized by Loesch (2001), counseling program evaluations should be undertaken to improve counseling services rather than merely to provide a justification for existing programming.

The Need for New Models of Counseling Program Evaluation

We believe that a significant contributor to counselors' disinterest in evaluation involves the lack of practical program evaluation models available to them for this purpose. Furthermore, confusion about the differences between program evaluation and accountability appears to deter counselors from engaging in ongoing program evaluations (Loesch, 2001). Therefore, the development of new, counselor-specific models that clearly conceptualize program evaluation and accountability may provide the necessary impetus to establish program evaluation as a standard of practice in counseling.

Recent examples of counselor-focused evaluation approaches include Lusky and Hayes's (2001) consultation model of counseling program evaluation and Lapan's (2001) framework for planning and evaluating school counseling programs. Gysbers and Henderson (2000) also discussed the role of evaluation in school counseling programs and offered practical strategies and tools that counselors could implement. These approaches have helped maintain a focus on the importance of counseling program evaluation.

The purpose of this article was to build on the emerging counselor-focused literature on program evaluation by providing counselors with a practical model for developing and implementing evaluation-based counseling services. As Whiston (1996) emphasized, counseling practice and research form a continuum rather than being mutually exclusive activities. Although some counselors may identify more strongly with research and others more strongly with practice, both perspectives provide valuable feedback about the impact of counseling on clients served. Indeed, evaluation and feedback are integral parts of the counseling process, and most counselors will identify with the idea of refining their practice by using feedback from numerous sources as a basis.

This article is geared both to practitioners who may have had little prior training in or experience with counseling program evaluations and to counselor educators interested in training students in counseling program evaluation methods. We begin by discussing accountability in counseling and the uses of counseling program evaluation. Next, we present the Accountability Bridge Counseling Program Evaluation Model and discuss the steps involved in its implementation. Finally, we discuss implications and make recommendations for training counselors in evaluation skills.

Accountability in Counseling

Accountability has become a catchword in today's sociopolitical climate. Since the 1960s, local, state, and federal government spending has been more closely scrutinized and the effectiveness of social programs and initiatives more carefully questioned (Houser, 1998; Kirst, 2000). As professionals in the social services field, counselors have not been shielded from the demands to demonstrate successful and cost-effective outcomes, nor have counseling programs. Despite increasing pressure to document effectiveness, some counselors maintain that counseling programs are generally immeasurable (Loesch, 2001). However, given the rising demands for accountability in education and social programs, such an attitude is undoubtedly naïve. In fact, funding of educational programs and social services often hinges on the ability to demonstrate successful outcomes to stakeholders. Because counselors often rely on third-party and government funding, the future of the counseling profession may indeed rest on the ability of practitioners to answer the calls for documentation of effectiveness (Houser, 1998).

School Counseling Accountability

Today's school counselors face increased demands to demonstrate program effectiveness (Adelman, 2002; Borders, 2002; Herr, 2002; House & Hayes, 2002; Lusky & Hayes, 2001). Primarily rooted in the school reform movement, demonstrating accountability is becoming a standard practice among school counselors (Dahir & Stone, 2003; Fairchild & Seeley, 1995; Hughes & James, 2001; Myrick, 2003; Otwell & Mullis, 1997; Vacc & Rhyne-Winkler, 1993). Standards-based education reforms, including the No Child Left Behind (NCLB) Act of 2001, have fueled pressures on local school systems to demonstrate effective educational practices (Albrecht & Joles, 2003; Finn, 2002; Gandal & Vranek, 2001). The NCLB Act of 2001 emphasizes student testing and teacher effectiveness; however, school counselors have also recognized that in the current educational environment, actively evaluating the effectiveness of their school counseling programs is crucial. Although the pressures for accountability have seemingly increased in recent years, Lapan (2001) noted that school counselors have developed results-based systems and used student outcome data for many years. Furthermore, school counselors have historically been connected with school reform, and their roles have often been shaped by educational legislation (Herr, 2002).

Although accountability demands are numerous, school counselors may fail to evaluate their programs because of time constraints, the elusiveness of measuring school counseling outcomes, lack of training in research and evaluation methods, and the fear that evaluation results may discredit school counseling programs (Schmidt, 1995). Because of these factors, when school counselors have attempted to provide accountability, they may have relied on simple tallies of services and programs offered to students. However, as discussed by Fairchild and Seeley (1995), merely documenting the frequency of school counseling services no longer meets the criteria for demonstrating program effectiveness. Although data about service provision may be important, school counselors must engage in ongoing evaluations of their counseling programs in order to assess the outcomes and the impact of their services.

Trevisan (2000) emphasized that school counseling program evaluation may help the school counseling profession by providing accountability data to stakeholders, generating feedback about program effectiveness and program needs, and clarifying the roles and functions of school counselors. As the profession of school counseling evolves, increasing emphasis on leadership and advocacy (Erford, House, & Martin, 2003; House & Sears, 2002) and on comprehensive school counseling programs (American School Counselor Association [ASCA], 2003; Sink & MacDonald, 1998; Trevisan, 2002b) will coincide with ongoing research and program evaluation efforts (Paisley & Borders, 1995; Whiston, 2002; Whiston & Sexton, 1998). ASCA's (2003) revised national standards for school counseling reflect the importance of school counseling accountability and provide direction for practicing school counselors in the evaluation of their comprehensive school counseling programs (Isaacs, 2003). Considering the accountability and outcomes-focused initiatives in today's education environment, school counselors need skills and tools for systematically evaluating the impact of the services they provide (Trevisan, 2001).

Mental Health Counseling Accountability

Like professional school counselors, today's mental health counselors have experienced significant pressures to demonstrate the effectiveness and the efficiency of their counseling services. To secure managed care contracts and receive third-party reimbursements, mental health counselors are increasingly required to keep detailed records about specific interventions and outcomes of counseling sessions (Granello & Hill, 2003; Krousel-Wood, 2000; Sexton, 1996). Despite the financial implications of avoiding such accountability measures, many mental health counselors have fought for autonomy from third-party payers in the provision of counseling services. Mental health counselors often indicate that their ability to provide quality mental health care to clients is hampered by managed care's demands to demonstrate technical proficiency and cost-effective service delivery (Scheid, 2003). Furthermore, mental health counselors often express concerns about their therapeutic decision-making capacity being curtailed by managed care (Granello & Hill, 2003).

Managed care's mandate for accountability in the field of mental health counseling may have resulted, in part, from counselors' failure to initiate their own outcomes assessments (Loesch, 2001). However, the emergence of empirically supported treatments (ESTs) has helped counselors respond to the call for accountability from managed care (Herbert, 2003). Specifically, ESTs draw on evidence-based practices from empirical counseling research to provide counselors with intervention guidelines and treatment manuals for specific client problems. Yet, mental health counselors may resist the use of such approaches, insisting that counseling procedures and outcomes cannot be formally measured and that attempting such evaluations merely reduces time spent providing counseling services (Sanderson, 2003). Today's managed care companies, however, may require counselors to base their practice on specific ESTs in order to receive payment for services. Further complicating the issue is the fact that, as previously noted with other areas of counseling, mental health counselors often receive no training in evaluating the outcomes and impact of their services (Granello & Hill, 2003; Sexton et al., 1997). Ultimately, resistance from mental health counselors to document counseling outcomes may be due to insufficient counselor training in evaluation methods.

Despite the tumultuous history of the pressures brought to bear on mental health practitioners by managed care for accountability, there is a major impetus for shifting toward examining program effectiveness and outcomes in mental health counseling—the benefit of forging a professional identity. Kelly (1996) underscored the need for mental health counselors to be accepted as legitimate mental health providers who are on the same professional level as social workers, psychologists, and psychiatrists. The ability to document outcomes and identify effective treatments is, therefore, critical in furthering the professional identity of mental health counselors within the mental health professions.

Accountability in Other Counseling Specialties

Although most literature on counseling accountability emphasizes school and mental health settings, calls for accountability have also been directed to other counseling specialties. Bishop and Trembley (1987) discussed the accountability pressures faced in college counseling centers. Similar to school counselors and mental health counselors, college counselors and those in authority in college counseling centers have resisted accountability demands placed on them by authorities in higher education. Bishop and Trembley also noted that some counselors have maintained that counseling centers are designed for practice rather than research.

Ultimately, all counseling practitioners, regardless of their specialty area, are faced with the need to demonstrate program effectiveness. Although counselors may be hesitant or unwilling to evaluate the effectiveness of their services because they see little relevance to their individual practice, the future of the counseling profession may well be shaped by the way practitioners respond to accountability demands.

Program Evaluation in Counseling

In recent years, the terms program evaluation and accountability have often been used synonymously in discussions of counseling research and outcomes. However, accountability efforts in counseling generally result from external pressures to demonstrate efficiency and effectiveness. On the other hand, counselor-initiated program evaluations can be used to better inform practice and improve counseling services. We believe that a key shift in the profession would be to have counselors continually evaluate their programs and outcomes not because of external pressures, but from a desire to enhance client services and to advocate for clients and the counseling profession. New perspectives on the role of evaluation of counseling practices may ultimately help program evaluation become a standard of practice in counseling.

Program evaluation models have proliferated in the fields of economics, political science, sociology, psychology, and education (Hosie, 1994) and have been used for improving quality (Ernst & Hiebert, 2002), assessing goal achievement, decision making, determining consumer impact, and examining cost-effectiveness (Madaus & Kellaghan, 2000). Many program evaluation models were developed for use in large-scale organizational evaluations and are, thus, impractical for use by counselors. Furthermore, large-scale program evaluation models are generally based on the assumption that a staff of independent evaluation experts or an assessment team will plan and implement the evaluation. Within the counseling professions, however, financial constraints generally make such independent evaluations of programs unfeasible. Consequently, counselors usually rely on limited resources and their own research skills to carry out an evaluation of program effectiveness. Fortunately, many of the principles and practices of large-scale evaluation models can be adapted for use by counselors.

Given the wide range of program evaluation definitions and approaches, models from human services professions and education appear most relevant for the needs of counselors because these models generally emphasize ongoing evaluation for program improvement (e.g., Stufflebeam, 2000a). Counseling program evaluation may be defined as the ongoing use of evaluation principles by counselors to assess and improve the effectiveness and impact of counseling programs and services. Ongoing counseling program evaluations can provide crucial feedback about the direction and the growth of counseling services and can also meet the accountability required by stakeholders (Boulmetis & Dutwin, 2000; Loesch, 2001; Stufflebeam, 2000b).

Reasons for Evaluating Counseling Programs

Program evaluations may be initiated for various reasons; however, evaluations are intended to generate practical information rather than to be mere academic exercises (Royse, Thyer, Padgett, & Logan, 2001). Counseling program evaluations should, therefore, provide concrete information about the effectiveness, the efficiency, and the impact of services (Boulmetis & Dutwin, 2000). Specifically, counseling program evaluations can yield information that will demonstrate the degree to which clients are being helped. Evaluations may also provide feedback about client satisfaction and can help to distinguish between effective and ineffective approaches for the populations being served (Isaacs, 2003). On a broader scope, program evaluations can help to determine if services are having an influence on larger social problems (Royse et al., 2001). On the contextual level, evaluations can provide information about the use of staff and program resources in the provision of services (Stufflebeam, 2000a).

Accountability to stakeholders has often been a consideration in formulating approaches to counseling program evaluation. For example, Lapan (2001) indicated that program evaluations help counselors to identify effective services that are valued by stakeholders. Thus, by using stakeholder feedback in program planning and then providing valued services, counselors are better prepared to demonstrate the accountability of their programs and practice. Internal accountability may be requested by administrators of local programs to determine if program staff and resources are being used effectively. On the other hand, external accountability may be requested by policy makers and stakeholders with an interest in the effectiveness of provided services (Priest, 2001).

Counseling program evaluations are generally implemented to provide information about local needs; however, in some instances information from local evaluations may have significant implications for the entire counseling profession. As discussed by Whiston (1996), the professional identity of counselors can be enhanced through action research that demonstrates the effectiveness of services. By conceptualizing program evaluations as a type of action research, counselors can consider this effort a contribution to the growing research base in counseling.

Questions That Evaluations May Answer

Counseling program evaluations, like all forms of evaluations, are undertaken to answer questions about the effectiveness of programs and services in meeting specific goals (Berk & Rossi, 1999). Questions about the overall effectiveness and impact of services may be answered, as well as more discrete, problem-specific concerns. Furthermore, questions posed in evaluations help guide the collection and analysis of outcome information and the subsequent reporting of outcomes to stakeholders.

Numerous questions may be explored with evaluations. Powell, Steele, and Douglah (1996) indicated that evaluation questions generally fall into four broad categories: outcomes and impacts, program need, program context, and program operations. The following are some examples of the types of questions that counseling program evaluations may answer:

  •  Are clients being helped?
  •  What methods, interventions, and programs are most helpful for clients?
  •  How satisfied are clients with services received?
  •  What are the long-term effects of counseling programs and services?
  •  What impact do the services and programs have on the larger social system?
  •  What are the most effective uses of program staff?
  •  How well are program objectives being met?

Program evaluations are generally guided by specific questions related to program objectives. Guiding questions help counselors to plan services and gather data specific to the problems under investigation. Depending on program and stakeholder needs, counseling evaluations may be designed to answer many questions simultaneously or they may be focused on specific objectives and outcomes. As part of an ongoing process, the initial cycle of a counseling program evaluation may yield information that can help to define or refine further problems and questions for exploration in the next evaluation cycle.

Ultimately, counseling program evaluations may serve many purposes and may provide answers to a variety of questions. However, if counselors are to implement evaluations, a practical framework for conceptualizing the evaluation process seems essential. Counselors, thus, need a conceptual foundation for guiding the evaluation of their programs and services.

The Accountability Bridge Counseling Program Evaluation Model for Counselors

The Accountability Bridge Counseling Program Evaluation Model (see Figure 1) provides a framework to be used by individual counselors and within counseling programs and counseling agencies to plan and deliver counseling services and to assess their effectiveness and impact. Drawing on concepts from the business evaluation model proposed by Ernst and Hiebert (2002) and the Context, Input, Process, Product Model (CIPP) developed by Stufflebeam (2000a), the Accountability Bridge Counseling Program Evaluation Model organizes counseling evaluation into two recurring cycles that represent a continual refinement of services based on outcomes, stakeholder feedback, and the needs of the populations served. The counseling program evaluation cycle focuses on the provision and outcomes of counseling services, whereas the counseling context evaluation cycle examines the impact of counseling services on stakeholders and uses their feedback, along with the results yielded by needs assessments, to establish and refine the goals of counseling programs. The two cycles are connected by an "accountability" bridge, whereby results from counseling practices are communicated to stakeholders within the context of the larger service system. Providing accountability to stakeholders is, therefore, an integral part of the model. Although it is beyond the scope of this article to discuss each component in depth, a basic review of the framework and principles of the model will help counselors begin to conceptualize the process of planning and implementing counseling program evaluations.
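To make the structure of the model easier to see at a glance, the following minimal sketch (in Python, purely illustrative) represents the two cycles and the accountability bridge as simple data structures. The program cycle stage names come from the four stages described below; the context cycle stages are paraphrased from the description above, and all class and function names are assumptions introduced here, not the authors' terminology.

# Illustrative sketch only: one possible representation of the model's two
# cycles and the accountability bridge. Names are assumptions, not the
# authors' terminology beyond the stage labels described in this article.
from dataclasses import dataclass
from typing import List

@dataclass
class EvaluationCycle:
    name: str
    stages: List[str]

# Counseling program evaluation cycle (stages 1-4 described below).
program_cycle = EvaluationCycle(
    name="Counseling program evaluation cycle",
    stages=[
        "Program planning",
        "Program implementation",
        "Program monitoring and refinement",
        "Outcomes assessment",
    ],
)

# Counseling context evaluation cycle, paraphrased from the description
# above: stakeholder feedback and needs assessments refine program goals.
context_cycle = EvaluationCycle(
    name="Counseling context evaluation cycle",
    stages=[
        "Communicate results to stakeholders",
        "Gather stakeholder feedback and needs assessment data",
        "Establish and refine program goals",
    ],
)

def accountability_bridge(outcome_summary: str) -> str:
    """The bridge: outcomes from the program cycle are reported to
    stakeholders, feeding the context cycle."""
    return f"Report to stakeholders: {outcome_summary}"

print(accountability_bridge("objectives met for the aftercare program"))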

Counseling Program Evaluation Cycle

The counseling program evaluation cycle involves the planning and implementation of counseling practice and culminates with assessing the outcomes of individual and group counseling, guidance services, and counseling programs. Four stages are involved in the counseling program evaluation cycle.

1. Program planning. Although we enter the discussion of the model at the program planning stage, information obtained from the counseling context evaluation cycle is critical in the planning process. Thus, on the basis of input obtained from needs assessments and the subsequent formation of service objectives, counseling programs and services are planned and developed to address the needs of the populations served. Program planning involves identifying specific counseling methods and activities that are appropriate for certain populations as well as determining the availability of needed resources, including staff, facilities, and special materials (Royse et al., 2001).

Lapan (2001) stressed that effective school counseling programs meet objectives by planning results-based interventions that can be measured. Therefore, a key component of the program planning process involves the simultaneous planning of methods for measuring outcomes (Boulmetis & Dutwin, 2000). For instance, during the program planning phase, a community counseling agency that is planning a new substance abuse aftercare program should determine the means of assessing client progress through the program. Furthermore, developing multiple outcome measures can help increase the validity of findings. Gysbers and Henderson (2000) discussed several means for assessing school counseling outcomes, including pretest–posttest instruments, performance indicators, and checklists. Studer and Sommers (2000) indicated that multiple measures, such as assessment instruments, observable data, available school-based data, and client/parent/teacher interviews, could be used in school counseling program evaluation. In mental health and college counseling specialties, similar measures of client and program progress can be used, including standardized assessment tools such as depression and anxiety inventories. Other means of collecting outcome data include surveys, individual and group interviews, observation methods, and document review (Powell et al., 1996). Furthermore, data can be collected over a 1- to 3-year period to determine program effectiveness over longer periods of time (Studer & Sommers, 2000).
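As a minimal illustration of the pretest–posttest comparisons mentioned above, the sketch below computes the mean change on a single hypothetical outcome measure and applies a paired t test. The scores, the measure, and the choice of test are assumptions made for illustration, not recommendations drawn from the sources cited.

# Illustrative sketch: pretest-posttest comparison on one hypothetical
# outcome measure (e.g., an anxiety inventory). The data and the choice of a
# paired t test are assumptions for illustration only.
from statistics import mean
from scipy import stats  # requires SciPy

pretest = [22, 30, 27, 35, 28, 31, 24, 29]   # hypothetical intake scores
posttest = [18, 24, 25, 28, 27, 22, 20, 23]  # hypothetical exit scores

changes = [post - pre for pre, post in zip(pretest, posttest)]
print(f"Mean change: {mean(changes):.1f} points")

# Paired t test: did scores change reliably from pretest to posttest?
result = stats.ttest_rel(posttest, pretest)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")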

A final consideration in the program planning stage involves determining when clients will complete selected measures and assessments. Individuals who will be responsible for gathering and processing the information should be identified as well. For example, in a community agency setting, counselors may take responsibility for collecting data about their own client caseload, whereas a counselor supervisor may collect data from community sources.

2. Program implementation. After programs and services have been planned and outcome measures have been selected, programs and services are initiated. Sometimes referred to as "formative evaluation," the program implementation phase actualizes the delivery of services shaped by input from the counseling context evaluation cycle. During program implementation, counselors may identify differences between the planned programs and the realities of providing the services. Therefore, at this point, decisions may be made to change programs before they are fully operational or to make refinements in programs and services as the need arises.

3. Program monitoring and refinement. Once programs and services have been initiated and are fully operational, counselors may need to make adjustments to their practice based on preliminary results and feedback from clients and other interested parties. Programs and services may, therefore, need to be refined and altered to successfully meet the needs of the clientele served. Monitoring program success helps to ensure the quality of counseling services and maximizes the likelihood of finding positive results during outcomes assessments.

FIGURE 1. Accountability Bridge Counseling Program Evaluation Model

4. Outcomes assessment. As programs and services are completed, outcomes assessments help to determine if objectives have been met. Therefore, during the outcomes assessment phase, final data are collected, and all program data are analyzed to determine the outcomes of interventions and programs. Counseling outcome data should be analyzed and interpreted as soon as possible after being collected (Gysbers & Henderson, 2000). Data analysis approaches differ for quantitative and qualitative data, and counselors with limited research background may need to seek assistance from peers and supervisors with knowledge of analyzing a variety of data sets. Available data analysis computer software can also expedite the analysis and interpretation of data. Such software programs also allow for easy creation of charts and graphs that can play a key role in the dissemination of evaluation results.
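As one hedged example of the software-assisted analysis and charting described above, the sketch below uses the pandas and matplotlib libraries to summarize hypothetical pre/post means for several measures and to save a simple bar chart for an evaluation report. The measures, values, and file name are invented for illustration.

# Illustrative sketch: summarizing hypothetical pre/post outcome means and
# producing a bar chart for an evaluation report. All data are invented.
import pandas as pd
import matplotlib.pyplot as plt

outcomes = pd.DataFrame({
    "measure": ["Anxiety inventory", "Days absent", "Self-efficacy scale"],
    "pre_mean": [28.3, 6.1, 14.2],
    "post_mean": [22.9, 3.2, 18.7],
})

ax = outcomes.plot(x="measure", y=["pre_mean", "post_mean"], kind="bar", rot=0)
ax.set_ylabel("Mean score")
ax.set_title("Program outcomes: pretest vs. posttest means")
plt.tight_layout()
plt.savefig("outcomes_summary.png")  # chart for the stakeholder report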

- Crim101 - Bs en 60947 2 - Unit 1 Discussions (HIS101 & HRM411) - A king is born - Walt disney mission statement 2014 - 200 Words - Unequal tie rod length - Hris dgfp mohfw gov bd - Worksheet 3 currency exchange answers - Sections of a quantitative research report - Aluminium foil melting point - Hay job evaluation system chart