Social Work Evaluation: Enhancing What We Do


Discussion 1: Reporting a Process Evaluation

Just as in needs assessments, interviews and focus groups are common tools for obtaining information about the processes involved in the implementation of programs. A process evaluation should specify its purpose, the questions the evaluation will address, and the methods that social workers will use to conduct it.

Review the many examples of process evaluation results described in Chapter 8 of Dudley, J. R. (2014). Social work evaluation: Enhancing what we do. (2nd ed.) Chicago, IL: Lyceum Books, or in the optional resources. Select an example of a process evaluation that produced valuable information. Compare the description of those results with the Social Work Research Qualitative Groups case study located in this week’s resources.

· Post a description of the process evaluation that you chose and explain why you selected this example.

· Describe the stage of program implementation in which the evaluation occurred, the informants, the questions asked, and the results.

· Based upon your comparison of the case study and the program evaluation report that you chose, improve upon the information presented in the case study by identifying gaps in information.

· Fill in these gaps as if you were the facilitator of the focus group. Clearly identify the purpose of the process evaluation and the questions asked.

References (use 3 or more)

Dudley, J. R. (2014). Social work evaluation: Enhancing what we do (2nd ed.). Chicago, IL: Lyceum Books.

Chapter 8, “Improving How Programs and Practice Work” (pp. 167–207)

Plummer, S.-B., Makris, S., & Brocksen, S. (Eds.). (2014b). Social work case studies: Concentration year. Baltimore, MD: Laureate International Universities Publishing. [Vital Source e-reader].

Read the following section:

“Social Work Research: Qualitative Groups” (pp. 68–69)

Document: Bliss, M. J., & Emshoff, J. G. (2002). Workbook for designing a process evaluation. Georgia Department of Human Resources, Division of Public Health. Retrieved from http://beta.roadsafetyevaluation.com/evaluationguides/info/workbook-for-designing-a-process-evaluation.pdf (PDF)

Example of Process Evaluation
Boyce, C., & Neale, P. (2006). Conducting in-depth interviews: A guide for designing and conducting in-depth interviews for evaluation input. Pathfinder International Tool Series: Monitoring and Evaluation – 2. Retrieved from http://www.cpc.unc.edu/measure/training/materials/data-quality-portuguese/m_e_tool_series_indepth_interviews.pdf

Social Work Research: Qualitative Groups
A focus group was conducted to explore the application of a cross-system collaboration and its effect on service delivery outcomes among social service agencies in a large urban county on the West Coast. The focus group consisted of 10 social workers and was led by a facilitator from the local office of a major community support organization (the organization). Participants in the focus group had diverse experiences working with children, youth, adults, older adults, and families. They represented agencies that addressed child welfare, family services, and community mental health issues. The group included five males and five females from diverse ethnicities.

The focus group was conducted in a conference room at the organization’s headquarters. The organization was interested in exploring options for greater collaboration and less fragmentation of social services in the local area. Participants in the group were recruited from local agencies that were either already receiving or were applying for funding from the organization. The 2-hour focus group was recorded.

The facilitator explained the objective of the focus group and encouraged each participant to share personal experiences and perspectives regarding cross-system collaboration. Eight questions were asked that explored local examples of cross-system collaboration and the strengths and barriers found in using the model. The facilitator tried to achieve maximum participation by reflecting the answers back to the participants and maintaining eye contact.

To analyze the data, the researchers carefully transcribed the entire recorded discussion and coded it with Verbatim Blaster, a qualitative data analysis package from StatPac. This software focuses on content coding and word counting to identify the most salient themes and patterns.
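Content coding and word counting of this kind can be sketched in a few lines. This is only an invented stand-in for what a dedicated qualitative analysis package does; the code words and transcript snippet below are made up for illustration:

```python
import re
from collections import Counter

# Minimal stand-in for content coding and word counting: tally how often
# each analyst-chosen code word appears in a transcript. (Illustrative only;
# the code words and transcript snippet are invented for this example.)
def count_codes(transcript, codes):
    words = Counter(re.findall(r"[a-z']+", transcript.lower()))
    return {code: words[code] for code in codes}

transcript = (
    "Collaboration across agencies reduced duplication. "
    "Funding barriers limited collaboration, but referrals improved."
)
print(count_codes(transcript, ["collaboration", "barriers", "referrals", "funding"]))
# "collaboration" is mentioned twice; the other codes once each
```

A real analysis would code multi-word themes and phrases, not just single words, but the counting principle is the same.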

The focus group was seen by the sponsoring entity as successful because every participant eventually provided feedback to the facilitator about cross-system collaboration. It was also seen as a success because the facilitator remained engaged and nonjudgmental and strived to have each participant share their experiences.

In terms of outcomes, the facilitator said that the feedback obtained was useful in exploring new ways of delivering services and encouraging greater cooperation. As a result of this process, the organization decided to add a component to all agency annual plans and reports that asked them to describe what types of cross-agency collaboration were occurring and what additional efforts were planned.

(Plummer 68-69)

Plummer, S.-B., Makris, S., & Brocksen, S. (2013). Social work case studies: Concentration year. Laureate Publishing. [VitalBook file].

Discussion 2: Social Work Agency Budgeting

Human services organizations cannot work in isolation because of the breadth and depth of social issues they address in their mission to provide services. By partnering with other organizations in the community, human services organizations can expand their service delivery. These community partners can complement the work of the social work agency and help raise additional funds for services. Strategic partnerships are not limited to nonprofit organizations; human services organizations may also work with local businesses. When considering community partners, administrators and leaders should keep an open mind and think about unique partnerships that will benefit the community.

For this Discussion, search for examples in your local community of partnerships between human services organizations and local businesses and/or nonprofits. (You may review the partnership example described at the top of page 306 in Understanding Your Social Agency, 3rd ed.) Consider how the human services organizations, nonprofits, businesses, and community benefit from these partnerships. Also consider limitations to these collaborative endeavors.

· Post a description of examples in your local community of partnerships between human services organizations and local businesses and/or nonprofits that benefit the community.

· Analyze the collaboration to identify aspects that provide benefits that go beyond the initial collaborative effort.

· Explain how these aspects may benefit the human services organization.

· In addition, identify aspects of the collaboration that may lead to challenges, and explain how they may challenge the human services organization.

References (use 3 or more)

Lauffer, A. (2011). Understanding your social agency (3rd ed.). Washington, DC: Sage.

Chapter 9, “Fundraising and Development” (pp. 285–320)

Bowman, W. (2011). Financial capacity and sustainability of ordinary nonprofits. Nonprofit Management & Leadership, 22(1), 37–51.

LeRoux, K. (2009). Managing stakeholder demands: Balancing responsiveness to clients and funding agents in nonprofit social service organizations. Administration & Society, 41(2), 158–184.

Discussion 3: Financial Capacity and Sustainability in Human Services

Receiving funding from a grant or other source of funds is a great accomplishment. Once the funding is received, the human services organization must be able to manage the funds effectively. The organization must also develop a plan to sustain the program after the funding period ends; otherwise, the potential for change from the funded program may be limited. One way to determine an organization’s capacity for fundraising and/or financial management is to assess its strengths and weaknesses in these areas and consider ways to improve. This type of assessment should be undertaken before the organization begins to actively seek funds.

For this Discussion, you will evaluate an aspect of financial management or fundraising efforts in a human services organization with which you are familiar. Refer to the inventory tool on page 319 of Understanding Your Social Agency, 3rd ed., for ideas on how to evaluate an organization’s fundraising efforts.

· Post your brief description of an organization with which you are familiar (e.g., a field placement, a previous employer) and evaluate one area of the organization’s financial management or fundraising that needs improving, and explain why.

· Explain three strategies the organization could implement to improve its financial management or fundraising situation.

· Explain how each strategy supports improvement.

References (use 3 or more)

Lauffer, A. (2011). Understanding your social agency (3rd ed.). Washington, DC: Sage.

Chapter 9, “Fundraising and Development” (pp. 285–320)

Bowman, W. (2011). Financial capacity and sustainability of ordinary nonprofits. Nonprofit Management & Leadership, 22(1), 37–51.

LeRoux, K. (2009). Managing stakeholder demands: Balancing responsiveness to clients and funding agents in nonprofit social service organizations. Administration & Society, 41(2), 158–184.

Workbook for Designing a Process Evaluation

Produced for the

Georgia Department of Human Resources

Division of Public Health

By

Melanie J. Bliss, M.A., and James G. Emshoff, Ph.D.

Department of Psychology, Georgia State University

July 2002

Evaluation Expert Session July 16, 2002 Page 1

What is process evaluation?

Process evaluation uses empirical data to assess the delivery of programs. In contrast to outcome evaluation, which assesses the impact of the program, process evaluation verifies what the program is and whether it is being implemented as designed. Thus, process evaluation asks "what," and outcome evaluation asks "so what?"

When conducting a process evaluation, keep in mind these three questions:

1. What is the program intended to be?
2. What is delivered, in reality?
3. Where are the gaps between program design and delivery?
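Read together, these three questions amount to comparing the program as designed with the program as delivered. A hypothetical sketch (the component names and values are invented, not drawn from any real program):

```python
# Hypothetical sketch of the "gap" question: compare the program as designed
# with the program as delivered. Component names and values are invented.
def find_gaps(intended, delivered):
    """Return components whose delivery differs from (or is missing from) the design."""
    gaps = {}
    for component, plan in intended.items():
        actual = delivered.get(component)
        if actual != plan:
            gaps[component] = {"intended": plan, "delivered": actual}
    return gaps

intended = {"sessions_per_week": 2, "group_size": 10, "home_materials": True}
delivered = {"sessions_per_week": 1, "group_size": 10}
print(find_gaps(intended, delivered))
# sessions_per_week fell short of plan; home_materials was never delivered
```

In practice the "intended" side comes from program design documents and the "delivered" side from the data collection methods described later in this workbook.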

This workbook will serve as a guide for designing your own process evaluation for a program of your choosing. There are many steps involved in the implementation of a process evaluation, and this workbook will attempt to direct you through some of the main stages. It will be helpful to think of a delivery service program that you can use as your example as you complete these activities.

Why is process evaluation important?

1. To determine the extent to which the program is being implemented according to plan
2. To assess and document the degree of fidelity and variability in program implementation, expected or unexpected, planned or unplanned
3. To compare multiple sites with respect to fidelity
4. To provide validity for the relationship between the intervention and the outcomes
5. To provide information on what components of the intervention are responsible for outcomes
6. To understand the relationship between program context (i.e., setting characteristics) and program processes (i.e., levels of implementation)
7. To provide managers feedback on the quality of implementation
8. To refine delivery components
9. To provide program accountability to sponsors, the public, clients, and funders
10. To improve the quality of the program, as the act of evaluating is an intervention


Stages of Process Evaluation

1. Form Collaborative Relationships (p. 3)
2. Determine Program Components (p. 4)
3. Develop Logic Model*
4. Determine Evaluation Questions (p. 6)
5. Determine Methodology (p. 11)
6. Consider a Management Information System (p. 25)
7. Implement Data Collection and Analysis (p. 28)
8. Write Report**

Also included in this workbook:

a. Logic Model Template (p. 30)
b. Pitfalls to avoid (p. 30)
c. References (p. 31)

Evaluation can be an exciting, challenging, and fun experience. Enjoy!

* Previously covered in Evaluation Planning Workshops.
** Will not be covered in this expert session. Please refer to the Evaluation Framework and Evaluation Module of the FHB Best Practice Manual for more details.


Forming collaborative relationships

A strong, collaborative relationship with program delivery staff and management will likely result in the following:

- Feedback regarding evaluation design and implementation
- Ease in conducting the evaluation due to increased cooperation
- Participation in interviews, panel discussions, meetings, etc.
- Increased utilization of findings

Seek to establish a mutually respectful relationship characterized by trust, commitment, and flexibility.

Key points in establishing a collaborative relationship:

- Start early. Introduce yourself and the evaluation team to as many delivery staff and management personnel as early as possible.

- Emphasize that THEY are the experts, and you will be utilizing their knowledge and information to inform your evaluation development and implementation.

- Be respectful of their time both in person and on the telephone. Set up meeting places that are geographically accessible to all parties involved in the evaluation process.

- Remain aware that, even if they have requested the evaluation, it may often appear as an intrusion upon their daily activities. Attempt to be as unobtrusive as possible and request their feedback regarding appropriate times for on-site data collection.

- Involve key policy makers, managers, and staff in a series of meetings throughout the evaluation process. The evaluation should be driven by the questions that are of greatest interest to the stakeholders. Set agendas for meetings and provide an overview of the goals of the meeting before beginning. Obtain their feedback and provide them with updates regarding the evaluation process. You may wish to obtain structured feedback; sample feedback forms appear throughout the workbook.

- Provide feedback regarding evaluation findings to the key policy makers, managers, and staff when and as appropriate. Use visual aids and handouts. Tabulate and summarize information. Make it as interesting as possible.

- Consider establishing a resource or expert "panel" or advisory board: an official group of people willing to be contacted when you need feedback or have questions.


Determining Program Components

Program components are identified by answering the questions who, what, when, where, and how as they pertain to your program.

Who: the program clients/recipients and staff
What: activities, behaviors, materials
When: frequency and length of the contact or intervention
Where: the community context and physical setting
How: strategies for operating the program or intervention

BRIEF EXAMPLE:
Who: elementary school students
What: fire safety intervention
When: 2 times per year
Where: in students’ classrooms
How: group-administered intervention, small-group practice

1. Instruct students what to do in case of fire (stop, drop, and roll).
2. Educate students on calling 911 and have them practice on play telephones.
3. Educate students on how to pull a fire alarm, how to test a home fire alarm, and how to change batteries in a home fire alarm. Have students practice each of these activities.
4. Provide students with written information and have them take it home to share with their parents. Request parental signature to indicate compliance and target a 75% return rate.

Points to keep in mind when determining program components:

Specify activities as behaviors that can be observed

If you have a logic model, use the "activities" column as a starting point

Ensure that each component is separate and distinguishable from others

Include all activities and materials intended for use in the intervention

Identify the aspects of the intervention that may need to be adapted, and those that should always be delivered as designed

Consult with program staff, mission statements, and program materials as needed.
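The 75% return-rate target in the fire-safety example above reduces to simple arithmetic; a minimal check (the form counts are invented for illustration):

```python
# One-line check of the example's 75% parental return-rate target.
# The counts of forms distributed and returned are invented.
def meets_target(returned, distributed, target=0.75):
    return returned / distributed >= target

print(meets_target(24, 30))  # 24/30 = 80%, above the 75% target
```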


Your Program Components

After you have identified your program components, create a logic model that graphically portrays the link between program components and outcomes expected from these components.

Now, write out a succinct list of the components of your program.
WHO:
WHAT:
WHEN:
WHERE:
HOW:


What is a Logic Model?

A logical series of statements that link the problems your program is attempting to address (conditions), how it will address them (activities), and the expected results (immediate and intermediate outcomes, long-term goals).

Benefits of the logic model include:

- helps develop clarity about a project or program
- helps to develop consensus among people
- helps to identify gaps or redundancies in a plan
- helps to identify the core hypothesis
- helps to succinctly communicate what your project or program is about

When do you use a logic model? Use it:
- During any work, to clarify what is being done, why, and with what intended results
- During project or program planning, to make sure that the project or program is logical and complete
- During evaluation planning, to focus the evaluation
- During project or program implementation, as a template for comparing to the actual program and as a filter to determine whether proposed changes fit or not

This information was extracted from the Logic Models: A Multi-Purpose Tool materials developed by Wellsys Corporation for the Evaluation Planning Workshop Training. Please see the Evaluation Planning Workshop materials for more information. Appendix A has a sample template of the tabular format.


Determining Evaluation Questions

As you design your process evaluation, consider what questions you would like to answer. It is only after your questions are specified that you can begin to develop your methodology. Considering the importance and purpose of each question is critical.

BROADLY... What questions do you hope to answer? You may wish to turn the program components that you have just identified into questions assessing:
- Was the component completed as indicated?
- What were the strengths in implementation?
- What were the barriers or challenges in implementation?
- What were the apparent strengths and weaknesses of each step of the intervention?
- Did the recipient understand the intervention?
- Were resources available to sustain project activities?
- What were staff perceptions?
- What were community perceptions?
- What was the nature of the interaction between staff and clients?

These are examples. Check off what is applicable to you, and use the space below to write additional broad, overarching questions that you wish to answer.


SPECIFICALLY... Now, make a list of all the specific questions you wish to answer, and organize your questions categorically. Your list of questions will likely be much longer than your list of program components. This step of developing your evaluation will inform your methodologies and instrument choice. Remember that you must collect information on what the program is intended to be and what it is in reality, so you may need to ask some questions in two formats. For example:

"How many people are intended to complete this intervention per week?"
"How many actually go through the intervention during an average week?"

Consider what specific questions you have. The questions below are only examples! Some may not be appropriate for your evaluation, and you will most likely need to add additional questions. Check off the questions that are applicable to you, and add your own questions in the space provided.

WHO (regarding client):
- Who is the target audience, client, or recipient?
- How many people have participated?
- How many people have dropped out?
- How many people have declined participation?
- What are the demographic characteristics of clients? (Race, ethnicity, national origin, age, gender, sexual orientation, religion, marital status, employment, income sources, education, socio-economic status)
- What factors do the clients have in common?
- What risk factors do clients have?
- Who is eligible for participation?
- How are people referred to the program?
- How are they screened?
- How satisfied are the clients?

YOUR QUESTIONS:


WHO (regarding staff):
- Who delivers the services?
- How are they hired?
- How supportive are staff and management of each other?
- What qualifications do staff have?
- How are staff trained?
- How congruent are staff and recipients with one another?
- What are staff demographics? (See the client demographic list for specifics.)

YOUR QUESTIONS:

WHAT:
- What happens during the intervention? What is being delivered?
- What are the methods of delivery for each service (e.g., one-on-one, group session, didactic instruction, etc.)?
- What are the standard operating procedures?
- What technologies are in use?
- What types of communication techniques are implemented?
- What type of organization delivers the program?
- How many years has the organization existed? How many years has the program been operating?
- What type of reputation does the agency have in the community? What about the program?
- What are the methods of service delivery?
- How is the intervention structured?
- How is confidentiality maintained?

YOUR QUESTIONS:

WHEN:
- When is the intervention conducted?
- How frequently is the intervention conducted? At what intervals?
- At what time of day, week, month, year?
- What is the length and/or duration of each service?


YOUR QUESTIONS:

WHERE:
- Where does the intervention occur?
- What type of facility is used? What is the age and condition of the facility?
- In what part of town is the facility? Is it accessible to the target audience? Does public transportation access the facility? Is parking available?
- Is child care provided on site?

YOUR QUESTIONS:

WHY:
- Why are these activities or strategies implemented and why not others?
- Why has the intervention varied in ability to maintain interest?
- Why are clients not participating?
- Why is the intervention conducted at a certain time or at a certain frequency?

YOUR QUESTIONS:


Validating Your Evaluation Questions

Even though all of your questions may be interesting, it is important to narrow your list to questions that will be particularly helpful to the evaluation and that can be answered given your specific resources, staff, and time.

Go through each of your questions and consider it with respect to the questions below, which may be helpful in streamlining your final list of questions. Revise your worksheet/list of questions until you can answer "yes" to all of these questions. If you cannot answer "yes" to your question, consider omitting the question from your evaluation.

Validation checklist (answer "Yes" or "No" for each):

Will I use the data that will stem from these questions?

Do I know why each question is important and/or valuable?

Is someone interested in each of these questions?

Have I ensured that no questions are omitted that may be important to someone else?

Is the wording of each question sufficiently clear and unambiguous?

Do I have a hypothesis about what the “correct” answer will be for each question?

Is each question specific without inappropriately limiting the scope of the evaluation or probing for a specific response?

Do they constitute a sufficient set of questions to achieve the purpose(s) of the evaluation?

Is it feasible to answer the question, given what I know about the resources for evaluation?

Is each question worth the expense of answering it?

Derived from "A Design Manual" Checklist, page 51.
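The checklist above amounts to keeping only the questions that earn a "yes" on every check, which can be sketched as follows (the questions and their check results are hypothetical):

```python
# Sketch of the validation step: keep only questions whose checks are all "yes".
# The questions and check results below are hypothetical examples.
def validated(questions):
    return [q for q, checks in questions.items() if all(checks.values())]

questions = {
    "How many clients completed intake?": {"will_use_data": True, "feasible": True},
    "What is each staff member's commute?": {"will_use_data": False, "feasible": True},
}
print(validated(questions))  # only the first question survives the checks
```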


Determining Methodology

Process evaluation is characterized by collection of data primarily through two formats:
1) Quantitative, archival, recorded data that may be managed by a computerized tracking or management system, and
2) Qualitative data that may be obtained through a variety of formats, such as surveys or focus groups.

When considering what methods to use, it is critical to have a thorough understanding and knowledge of the questions you want answered. Your questions will inform your choice of methods. After this section on types of methodologies, you will complete an exercise in which you consider what method of data collection is most appropriate for each question.

Do you have a thorough understanding of your questions?

Furthermore, it is essential to consider what data the organization you are evaluating already has. Data may exist in the form of an existing computerized management information system, records, or a tracking system of some other sort. Using this data may provide the best reflection of what is "going on," and it will also save you time, money, and energy because you will not have to devise your own data collection method! However, keep in mind that you may have to adapt this data to meet your own needs - you may need to add or replace fields, records, or variables.

What data does your organization already have? Will you need to adapt it?

If the organization does not already have existing data, consider devising a method for the organizational staff to collect their own data. This process will ultimately be helpful for them so that they can continue to self-evaluate, track their activities, and assess progress and change. It will be helpful for the evaluation process because, again, it will save you time, money, and energy that you can better devote towards other aspects of the evaluation. Management information systems will be described more fully in a later section of this workbook.

Do you have the capacity and resources to devise such a system? (You may need to refer to a later section of this workbook before answering.)


Who should collect the data?

Given all of this, what thoughts do you have on who should collect data for your evaluation? Program staff, evaluation staff, or some combination?

Program staff: May collect data from activities such as attendance, demographics, participation, characteristics of participants, dispositions, etc.; may conduct intake interviews, note changes regarding service delivery, and monitor program implementation.

Advantages: Cost-efficient, accessible, resourceful, available, time-efficient, and increased understanding of the program.

Disadvantages: May exhibit bias and/or social desirability, may use data for critical judgment, may compromise the validity of the program; may put staff in an uncomfortable or inappropriate position; also, if staff collect data, an increased burden and responsibility may be placed upon them outside of their usual or typical job responsibilities. If you utilize staff for data collection, provide frequent reminders as well as messages of gratitude.

Evaluation staff: May collect qualitative information regarding implementation, general characteristics of program participants, and other information that may otherwise be subject to bias or distortion.

Advantages: Data collected in a manner consistent with the overall goals and timeline of the evaluation; prevents bias and inappropriate use of information; promotes overall fidelity and validity of data.

Disadvantages: May be costly and take extensive time; may require additional training on the part of the evaluator; presence of the evaluator in the organization may be intrusive, inconvenient, or burdensome.


When should data be collected?

Conducting the evaluation according to your timeline can be challenging. Consider how much time you have for data collection, and make decisions regarding what to collect and how much based on your timeline. In many cases, outcome evaluation is not considered appropriate until the program has stabilized. However, when conducting a process evaluation, it can be important to start the evaluation at the beginning so that a story may be told regarding how the program was developed, information may be provided on refinements, and program growth and progress may be noted. If you have the luxury of collecting data from the start of the intervention to the end of the intervention, space out data collection as appropriate. If you are evaluating an ongoing intervention that is fairly quick (e.g., an 8-week educational group), you may choose to evaluate one or more "cycles."

How much time do you have to conduct your evaluation? How much time do you have for data collection (as opposed to designing the evaluation, training, organizing and analyzing results, and writing the report)? Is the program you are evaluating time specific? How long does the program or intervention last? At what stages do you think you will most likely collect data?

Soon after a program has begun: Descriptive information on program characteristics that will not change; information requiring baseline information
During the intervention: Ongoing process information such as recruitment, program implementation
After the intervention: Demographics, attendance ratings, satisfaction ratings


Before you consider methods

A list of various methods follows this section. Before choosing what methods are most appropriate for your evaluation, review the following questions. (Some may already be answered in another section of this workbook.)

What questions do I want answered? (see previous section)

Does the organization already have existing data, and if so, what kind?

Does the organization have staff to collect data?

What data can the organization staff collect?

Must I maintain anonymity (participant is not identified at all) or confidentiality (participant is identified but responses remain private)? This consideration pertains to existing archival data as well as original data collection.

How much time do I have to conduct the evaluation?

How much money do I have in my budget?

How many evaluation staff do I have to manage the data collection activities?

Can I (and/or members of my evaluation staff) travel on site?

What time of day is best for collecting data? For example, if you plan to conduct focus groups or interviews, remember that your population may work during the day and need evening times.


Types of methods

A number of different methods exist that can be used to collect process information. Consider each of the following, and check those that you think would be helpful in addressing the specific questions in your evaluation. When "see sample" is indicated, refer to the pages that follow this table.

√ Method Description

Activity, participation, or client tracking log

Brief record completed on site at frequent intervals by participant or deliverer. May use form developed by evaluator if none previously exists. Examples: sign in log, daily records of food consumption, medication management.

Case Studies Collection of in-depth information regarding small number of intervention recipients; use multiple methods of data collection.

Ethnographic analysis

Obtain in-depth information regarding the experience of the recipient by partaking in the intervention, attending meetings, and talking with delivery staff and recipients.

Expert judgment Convene a panel of experts or conduct individual interviews to obtain their understanding of and reaction to program delivery.

Focus groups Small group discussion among program delivery staff or recipients. Focus on their thoughts and opinions regarding their experiences with the intervention.

Meeting minutes (see sample)

Qualitative information regarding agendas, tasks assigned, and coordination and implementation of the intervention as recorded on a consistent basis.

Observation (see sample)

Observe actual delivery in vivo or on video, record findings using check sheet or make qualitative observations.

Open-ended interviews – telephone or in person

Evaluator asks open questions (i.e., who, what, when, where, why, how) to delivery staff or recipients. Use interview protocol without preset response options.

Questionnaire Written survey with structured questions. May administer in individual, group, or mail format. May be anonymous or confidential.

Record review

Obtain indicators from intervention records such as patient files, time sheets, telephone logs, registration forms, student charts, sales records, or records specific to the service delivery.

Structured interviews – telephone or in person

Interviewer asks direct questions using interview protocol with preset response options.


Sample activity log

This is a common process evaluation methodology because it systematically records exactly what is happening during implementation. You may wish to devise a log such as the one below and alter it to meet your specific needs. Consider computerizing such a log for efficiency. Your program may already have existing logs that you can utilize and adapt for your evaluation purposes.

Site: ______________  Recorder: ______________

Code | Service | Date | Location | # People | # Hours | Notes
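A log like this is easy to computerize. The sketch below is a hypothetical illustration (the file name, column names, and sample record are invented, not part of the workbook); it appends one activity record per row to a CSV file using only the Python standard library:

```python
import csv
from pathlib import Path

LOG_FILE = Path("activity_log.csv")  # hypothetical file name
COLUMNS = ["site", "recorder", "code", "service", "date",
           "location", "num_people", "num_hours", "notes"]

def log_activity(record: dict) -> None:
    """Append one activity record; write the header row on first use."""
    new_file = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        if new_file:
            writer.writeheader()
        writer.writerow(record)

# Example entry for one delivered service.
log_activity({
    "site": "Clinic A", "recorder": "J. Smith", "code": "ED1",
    "service": "Nutrition class", "date": "2002-07-16",
    "location": "Room 2", "num_people": 12, "num_hours": 1.5,
    "notes": "Two new participants",
})
```

Because every row shares the same columns, the resulting file can later be tallied directly for process measures (counts of sessions, attendance, hours).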


Meeting Minutes

Taking notes at meetings may provide extensive and invaluable process information that can later be organized and structured into a comprehensive report. Minutes may be taken by program staff or by the evaluator if necessary. You may find it helpful to use a structured form, such as the one below that is derived from Evaluating Collaboratives, University of Wisconsin-Cooperative Extension, 1998.

Meeting Place: __________________  Date: _____________________
Start time: ____________  End time: ____________
Attendance (names):

Agenda topic: _________________________________________________
Discussion: _____________________________________________________
Decision | Related Tasks | Who responsible | Deadline
1.
2.
3.

Agenda topic: _________________________________________________
Discussion: _____________________________________________________
Decision | Related Tasks | Who responsible | Deadline
1.
2.
3.

Sample observation log


Observation may be conducted in various ways, but one of the most common is hand-recording specific details during a short time period. The following shows several rows from an observation log used during an evaluation of school classrooms.

CLASSROOM OBSERVATIONS (School Environment Scale)
Classroom 1: Grade level _________________ (Goal: 30 minutes of observation)
Time began observation: _________  Time ended observation: _________
Subjects taught during observation period: ___________________

PHYSICAL ENVIRONMENT
1. Number of students: ____
2. Number of adults in room: a. Teachers ____ b. Para-pros ____ c. Parents ____ Total: ____
3. Desks/tables: a. Number of desks ____ b. Number of tables for students' use ____ c. Any other furniture (include number) ____ (Note the arrangement of desks/tables/other furniture.)
4. Number of computers, type: ____
5. How are computers being used? ____
6. What is the general classroom setup (walls, windows, mirrors, carpet, rugs, cabinets, curtains, etc.)? ____
7. Other technology (overhead projector, PowerPoint, VCR, etc.): ____
8. Are books and other materials accessible for students? ____
9. Is there adequate space for whole-class instruction? ____
12. What type of lighting is used? ____
13. Are there animals or fish in the room? ____
14. Is there background music playing? ____
15. Rate the classroom condition: Poor / Average / Excellent
16. Are rules/discipline procedures posted? If so, where? ____
17. Is the classroom noisy or quiet? Very Quiet / Very Noisy

Choosing or designing measurement instruments

Consider using a resource panel, advisory panel, or focus group to offer feedback regarding your instrument. This group may be composed of any of the people listed below. You may also wish to consult with one or more of these individuals throughout the development of your overall methodology.

Who should be involved in the design of your instrument(s) and/or provide feedback?
Program service delivery staff / volunteers
Project director
Recipients of the program
Board of directors
Community leader
Collaborating organizations
Experts on the program or service being evaluated
Evaluation experts
_________________________
_________________________
_________________________

Conduct a pilot study: administer the instrument to a group of recipients, then obtain feedback regarding their experience. This is a critical component of the development of your instruments, as it will help ensure clarity of questions and reduce the degree of discomfort or burden that questions or processes (e.g., intakes or computerized data entry) elicit.

How can you ensure that you pilot your methods? When will you do it, and whom will you use as participants in the study?

Ensure that written materials are at an appropriate reading level for the population, and that verbal information uses terminology appropriate for the population. A third- to sixth-grade reading level is often used.

Remember that you are probably collecting data that is program-specific. This may increase the difficulty of finding previously constructed instruments to use for questionnaires, etc. However, instruments used for conducting process evaluations of other programs may provide you with ideas for how to structure your own instruments.


Linking program components and methods (an example)

Now that you have identified your program components, broad questions, specific questions, and possible measures, it is time to link them together. Let's start with your program components. Here is an example of three program components of an intervention.

Program Components and Essential Elements: There are six program components to M2M (Man to Man). There are essential elements in each component that must be present for the program to achieve its intended results and outcomes, and for the program to be identified as a program of the American Cancer Society.

Possible Process Measures

1) Man to Man Self-Help and/or Support Groups The essential elements within this component are:

• Offer information and support to all men with prostate cancer at all points along the cancer care continuum

• Directly, or through collaboration and referral, offer community access to prostate cancer self-help and/or support groups

• Provide recruitment and on-going training and monitoring for M2M leaders and volunteers

• Monitor, track and report program activities

• Descriptions of attempts to schedule and advertise group meetings
• Documented efforts to establish the program
• Documented local needs assessments
• # of meetings held per independent group
• Documented meetings held
• # of people who attended different topics and speakers
• Perceptions of need of survey participants for additional groups and current satisfaction levels
• # of new and # of continuing group members
• Documented sign-up sheets for group meetings
• Documented attempts to contact program dropouts
• # of referrals to other PC groups documented
• # of times corresponding with other PC groups
• # of training sessions for new leaders
• # of continuing education sessions for experienced leaders
• # and types of other on-going support activities for volunteer leaders
• # of volunteers trained as group facilitators
• Perceptions of trained volunteers for readiness to function as group facilitators
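Many of these measures are simple counts and tallies over sign-up sheets or tracking logs. As a hypothetical sketch (the records and field names are invented for illustration, not taken from M2M), such measures can be computed directly from a computerized log:

```python
from collections import Counter

# Hypothetical attendance records drawn from sign-up sheets.
attendance = [
    {"meeting": "2002-06-04", "member": "GH", "new_member": True},
    {"meeting": "2002-06-04", "member": "JK", "new_member": False},
    {"meeting": "2002-07-02", "member": "GH", "new_member": False},
    {"meeting": "2002-07-02", "member": "LM", "new_member": True},
]

# "# of meetings held per independent group"
meetings_held = len({r["meeting"] for r in attendance})

# "# of new and # of continuing group members"
new_members = sum(r["new_member"] for r in attendance)
continuing = len({r["member"] for r in attendance}) - new_members

# Attendance tallied per meeting date.
per_meeting = Counter(r["meeting"] for r in attendance)

print(meetings_held, new_members, continuing, dict(per_meeting))
```

The point of the sketch is that once process data is recorded consistently, most of the measures listed above reduce to one-line aggregations.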


2) One-to-One Contacts The essential elements within this component are:

• Offer one-to-one contact to provide information and support to all men with prostate cancer, including those in the diagnostic process

• Provide recruitment and on-going training and monitoring for M2M leaders and volunteers

• Monitor, track and report program activities

• # of contact pairings

• Frequency and duration of contact pairings

• Types of information shared during contact pairings

• # of volunteers trained

• Perception of readiness by trained volunteers

• Documented attempts for recruiting volunteers

• Documented on-going training activities for volunteers

• Documented support activities

3) Community Education and Awareness

The essential elements within this component are:

• Conduct public awareness activities to inform the public about prostate cancer and M2M

• Monitor, track and report program activities

• # of screenings provided by various health care providers/agencies over assessment period
• Documented ACS staff and volunteer efforts to publicize the availability and importance of PC and screenings, including health fairs, public service announcements, billboard advertising, etc.
• # of addresses to which newsletters are mailed
• Documented efforts to increase newsletter mailing list


Linking YOUR program components, questions, and methods

Consider each of your program components and questions that you have devised in an earlier section of this workbook, and the methods that you checked off on the "types of methods" table. Now ask yourself, how will I use the information I have obtained from this question? And, what method is most appropriate for obtaining this information?

Program Component | Specific questions that go with this component | How will I use this information? | Best method?



Data Collection Plan

Now let's put your data collection activities on one sheet: what you're collecting, how you're doing it, when, your sample, and who will collect it. Laying out your chosen methods, instruments, and data collection techniques in a structured manner will facilitate this process.

Method | Type of data (questions, briefly indicated) | Instrument used | When implemented | Sample | Who collects
E.g.: Patient interviews in health dept clinics | Qualitative - what services they are using, length of visit, why came in, how long wait; some quantitative satisfaction ratings | Interview created by evaluation team and piloted with patients | Oct-Dec; days and hrs randomly selected | 10 interviews in each clinic | Trained interviewers
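The plan sheet can also be kept as a small data structure so that blank cells are easy to spot before data collection begins. This sketch is purely illustrative (the PlanRow class and its field names are invented, not part of the workbook):

```python
from dataclasses import dataclass, fields

@dataclass
class PlanRow:
    """One row of the data collection plan sheet."""
    method: str
    data_type: str        # questions, briefly indicated
    instrument: str
    when_implemented: str
    sample: str
    who_collects: str

    def missing(self) -> list[str]:
        """Names of cells still left blank on this row."""
        return [f.name for f in fields(self) if not getattr(self, f.name).strip()]

# The example row from the table above, with one cell still undecided.
row = PlanRow(
    method="Patient interviews in health dept clinics",
    data_type="Qualitative: services used, length of visit, wait time; "
              "some quantitative satisfaction ratings",
    instrument="Interview created by evaluation team and piloted with patients",
    when_implemented="Oct-Dec; days and hrs randomly selected",
    sample="10 interviews in each clinic",
    who_collects="",  # still to be decided
)
print(row.missing())  # → ['who_collects']
```

A check like this is a cheap way to confirm the plan is complete before fieldwork starts.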


Consider a Management Information System

Process data is frequently collected through a management information system (MIS) that is designed to record characteristics of participants, participation of participants, and characteristics of activities and services provided. An MIS is a computerized record system that enables service providers and evaluators to accumulate and display data quickly and efficiently in various ways.

Will your evaluation be enhanced by periodic data presentations in tables or other structured formats? For example, should the evaluation use a monthly printout of services utilized, or monitor recipient tracking data (such as date, time, and length of service)? YES / NO

Does the agency create monthly (or other periodic) printouts reflecting services rendered or clients served? YES / NO

Will the evaluation be conducted more efficiently if program delivery staff enter data on a consistent basis? YES / NO

Does the agency already have hard copies of files or records that would be better utilized if computerized? YES / NO

Does the agency already have an MIS or a similar computerized database? YES / NO

If the answer to any of these questions is YES, consider using an MIS for your evaluation. If an MIS does not already exist, you may want to design a database in which you can enter information from records obtained by the agency. This process decreases missing data and is generally efficient.

If you do create a database that can be used on an ongoing basis by the agency, consider offering it to the agency for future use.
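To make this concrete, here is a minimal sketch of such a database using Python's built-in sqlite3 module. It is illustrative only: the table names, columns, and sample records are invented, and a real MIS would hold many more of the data types an evaluation needs. It shows how the periodic printouts described above become a simple query:

```python
import sqlite3

# In-memory database for illustration; a real MIS would use a file.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE clients (
    client_id INTEGER PRIMARY KEY,
    name TEXT, age INTEGER, zip_code TEXT       -- client demographics
);
CREATE TABLE contacts (
    contact_id INTEGER PRIMARY KEY,
    client_id INTEGER REFERENCES clients(client_id),
    service TEXT, contact_date TEXT, hours REAL -- client services
);
""")
conn.execute("INSERT INTO clients VALUES (1, 'A. Doe', 54, '30303')")
conn.executemany(
    "INSERT INTO contacts (client_id, service, contact_date, hours) "
    "VALUES (?, ?, ?, ?)",
    [(1, "Support group", "2002-07-01", 1.5),
     (1, "One-to-one contact", "2002-07-08", 1.0)],
)

# Monthly printout: services rendered and total contact hours.
report = conn.execute("""
    SELECT service, COUNT(*) AS n_contacts, SUM(hours) AS total_hours
    FROM contacts GROUP BY service ORDER BY service
""").fetchall()
for row in report:
    print(row)
```

Because service staff enter records as they occur, the same query can be rerun each month with no extra data-collection effort.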


Information to be included in your MIS

Examples include:
Client demographics
Client contacts
Client services
Referrals offered
Client outcomes
Program activities
Staff notes

Jot down the important data you would like included in your MIS.

Managing your MIS

What software do you wish to use to manage your data? What type of data do you have? How much information will you need to enter? How will you ultimately analyze the data? You may wish to create a database directly in the program you will eventually use, such as SPSS. Will you be using laptops?


If so, will you be taking them onsite and directly entering your data into them? How will you download or transfer the information, if applicable? What will the impact be on your audience if you have a laptop?

Tips on using an MIS

If service delivery personnel will be collecting and/or entering information into the MIS for the evaluator's use, it is generally a good idea to provide frequent reminders of the importance of entering the appropriate information in a timely, consistent, and regular manner.

For example, if an MIS depends upon patient data collected during public health officers' daily activities, the officers should be entering data on at least a daily basis. Otherwise, important data is lost and the database will only reflect what was salient enough to be remembered and entered at the end of the week.

Don't forget that this may be burdensome and/or inconvenient for the program staff. Provide them with frequent thank-yous.

Remember that your database is only as good as you make it. It must be organized and arranged so that it is most helpful in answering your questions.

If you are collecting from existing records, at what level is the data currently available? For example, is it state, county, or city information? How is it defined? Consider whether adaptations need to be made or additions need to be included for your evaluation.

Back up your data frequently and in at least one additional format (e.g., zip, disk, server).

Consider file security. Will you be saving data on a network server? You may need to consider password protection.
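One simple way to follow the backup advice above (a hypothetical sketch; the file names are invented) is a small script that copies the data file to a second location under a timestamped name, so no backup overwrites an earlier one:

```python
import shutil
from datetime import datetime
from pathlib import Path

def backup(data_file: str, backup_dir: str) -> Path:
    """Copy data_file into backup_dir under a timestamped name."""
    src = Path(data_file)
    dest_dir = Path(backup_dir)
    dest_dir.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = dest_dir / f"{src.stem}-{stamp}{src.suffix}"
    shutil.copy2(src, dest)  # copy2 preserves timestamps as well as contents
    return dest

# Example: back up an evaluation data file (names are illustrative).
Path("mis_data.csv").write_text("client_id,service\n1,Support group\n")
copy = backup("mis_data.csv", "backups")
print(copy)
```

Pointing backup_dir at a second disk or a network share satisfies the "at least one additional format" rule with no manual effort.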


Allocate time for data entry and checking.

Allow additional time to contemplate the meaning of the data before writing the report.


Implement Data Collection and Analysis

Data collection cannot be fully reviewed in this workbook, but this page offers a few tips regarding the process.

General reminders:

THANK everyone who helps you, directs you, or participates in any way.

Obtain clear directions and give yourself plenty of time, especially if you are traveling a long distance (e.g., several hours away).

Bring all of your own materials - do not expect the program to provide you with writing utensils, paper, a clipboard, etc.

Address each person that you meet with respect and attempt to make your meeting as convenient for their schedule as possible.

Most process evaluation will be in the form of routine record keeping (e.g., MIS). However, you may wish to interview clients and staff. If so:

Ensure that you have sufficient time to train evaluation staff, data collectors, and/or organization staff who will be collecting data. After they have been trained in the data collection materials and procedure, require that they practice the technique, whether it is an interview or entering a sample record in an MIS.

If planning to use a tape recorder during interviews or focus groups, request permission from participants before beginning. You may need to turn the tape recorder off on occasion if doing so will help participants feel more comfortable.

If planning to use laptop computers, attempt to make consistent eye contact and spend time establishing rapport before beginning. Some participants may be uncomfortable with technology, and you may need to provide education regarding the process of data collection and how the information will be utilized.

If planning to hand-write responses, warn the participant that you may move slowly and may need to ask them to repeat themselves. However, prepare for this process by developing shorthand specific to the evaluation. A sample shorthand page follows.


Annual Evaluation Reports

The ultimate aim of all the Branch's evaluation efforts is to increase the intelligent use of information in Branch decision-making in order to improve health outcomes. Because we understand that many evaluation efforts fail because the data are never collected, and that even more fail because the data are collected but never used in decision-making, we have struggled to find a way to institutionalize the use of evaluation results in Branch decision-making. These reports will serve multiple purposes:

The need to complete the report will increase the likelihood that evaluation is done and data are collected.

The need to review reports from lower levels in order to complete one’s own report hopefully will cause managers at all levels to consciously consider, at least once a year, the effectiveness of their activities and how evaluation results suggest that effectiveness can be improved.

The summaries of evaluation findings in the reports should simplify preparation of other reports to funders including the General Assembly.

Each evaluation report forms the basis of the evaluation report at the next level. The contents and length of the report should be determined by what is most helpful to the manager who is receiving it. Rather than simply reporting every possible piece of data, these reports should present summary data, summarize important conclusions, and suggest recommendations based on the evaluation findings.

A program-level annual evaluation report should be ten pages or less; many may be less than five pages. Population team and Branch-level annual evaluation reports may be longer than ten pages, depending on how many findings are being reported. However, reports that go beyond ten pages should also contain a shorter Executive Summary, to ensure that those with the power to make decisions actually read the findings.

The initial reports especially may reflect formative work and consist primarily of updates on the progress of evaluation planning and implementation. This is fine and to be expected. However, within a year or two the reports should begin to include process data, and later actual outcome findings.

This information was extracted from the FHB Evaluation Framework developed by Monica Herk and Rebekah Hudgins.

Page 33

Evaluation Expert Session July 16, 2002

Suggested shorthand - a sample

The list below was derived for a process evaluation regarding charter schools. Note the use of general shorthand as well as shorthand derived specifically for the evaluation.

CS - Charter School
mst - Most
Sch - School
b/c - Because
Tch - Teacher, teach
st - Something
P - Principal
b - Be
VP - Vice Principal
c - See
Admin - Administration, administrators
r - Are
DOE - Dept of Education
w/ - When
BOE - Board of Education
@ - At
Comm - Community
~ - About
Stud - Students, pupils
= - Is, equals, equivalent
Kids - Students, children, teenagers
≠ - Does not equal, is not the same
K - Kindergarten
Sone - Someone
Cl - Class
# - Number
CR - Classroom
$ - Money, finances, financial, funding, expenses, etc.
W - White
+ - Add, added, in addition
B - Black
< - Less than
AA - African American
> - Greater/more than
SES - Socio-economic status
??? - What does this mean? Get more info on, I'm confused…
Lib - Library, librarian
DWA - Don't worry about (e.g. if you wrote something unnecessary)
Caf - Cafeteria
Ψ - Psychology, psychologist
Ch - Charter
∴ - Therefore
Conv - Conversion (school)
∆ - Change, is changing
S-up - Start up school
mm - Movement
App - Application, applied
↑ - Increases, up, promotes
ITBS - Iowa Test of Basic Skills
↓ - Decreases, down, inhibits
LA - Language arts
X - Times (e.g. many x we laugh)
SS - Social Studies
÷ - Divided (we ÷ up the classrooms)
QCC - Quality Core Curriculum
C - With
Pol - Policy, politics
(house symbol) - Home, house
Curr - Curriculum
♥ - Love, adore (e.g. the kids ♥ this)
LP - Lesson plans
(church symbol) - Church, religious activity
Disc - Discipline
O - No, doesn't, not
♀ - Girls, women, female
1/2 - Half (e.g. we took 1/2)
♂ - Boys, men, male
2 - To
F - Father, dad
c/out - Without
P - Parent
2B - To be
M - Mom, mother
e.g. - For example
i.e. - That is

If the person trails off, you missed information

Appendix A

Logic Model Worksheet

Population Team/Program Name __________________________ Date _______________________

If the following CONDITIONS AND ASSUMPTIONS exist...
And if the following ACTIVITIES are implemented to address these conditions and assumptions...
Then these SHORT-TERM OUTCOMES may be achieved...
And these LONG-TERM OUTCOMES may be achieved...
And these LONG-TERM GOALS can be reached...

Page 35

Evaluation Expert Session July 16, 2002

Appendix B

Pitfalls To Avoid

Avoid heightening expectations of delivery staff, program recipients, policy makers, or community members. Make clear that feedback will be provided as appropriate, but may or may not be utilized.

Avoid any implication that you are evaluating the impact or outcome. Stress that you are evaluating "what is happening," not how well any one person is performing or what the outcomes of the intervention are.

Make sure that the right information gets to the right people - it is most likely to be utilized in a constructive and effective manner if you ensure that your final report does not end up on the desk of someone who has little motivation or interest in utilizing your findings.

Ensure that data collection and entry is managed on a consistent basis - avoid developing an evaluation design and then having the contract lapse because staff did not enter the data.

Page 36

Evaluation Expert Session July 16, 2002

Appendix C

References

References used for completion of this workbook and/or that you may find helpful for additional information.

Centers for Disease Control and Prevention. 1995. Evaluating Community Efforts to Prevent Cardiovascular Diseases. Atlanta, GA.
Centers for Disease Control and Prevention. 2001. Introduction to Program Evaluation for Comprehensive Tobacco Control Programs. Atlanta, GA.
Freeman, H. E., Rossi, P. H., Sandefur, G. D. 1993. Workbook for Evaluation: A Systematic Approach. Sage Publications: Newbury Park, CA.
Georgia Policy Council for Children and Families; The Family Connection; Metis Associates, Inc. 1997. Pathways for Assessing Change: Strategies for Community Partners.
Grembowski, D. 2001. The Practice of Health Program Evaluation. Sage Publications: Thousand Oaks, CA.
Hawkins, J. D., Nederhood, B. 1987. Handbook for Evaluating Drug and Alcohol Prevention Programs. U.S. Department of Health and Human Services; Public Health Service; Alcohol, Drug Abuse, and Mental Health Administration: Washington, D.C.
Muraskin, L. D. 1993. Understanding Evaluation: The Way to Better Prevention Programs. Westat, Inc.
National Community AIDS Partnership. 1993. Evaluating HIV/AIDS Prevention Programs in Community-based Organizations. Washington, D.C.
NIMH. Overview of Needs Assessment. Chapter 3: Selecting the needs assessment approach.
Patton, M. Q. 1982. Practical Evaluation. Sage Publications, Inc.: Beverly Hills, CA.
Posavac, E. J., Carey, R. G. 1980. Program Evaluation: Methods and Case Studies. Prentice-Hall, Inc.: Englewood Cliffs, NJ.
Rossi, P. H., Freeman, H. E., Lipsey, M. W. 1999. Evaluation: A Systematic Approach (6th edition). Sage Publications, Inc.: Thousand Oaks, CA.
Scheirer, M. A. 1994. Designing and using process evaluation. In: J. S. Wholey, H. P. Hatry, & K. E. Newcomer (eds.), Handbook of Practical Program Evaluation. Jossey-Bass Publishers: San Francisco.
Taylor-Powell, E., Rossing, B., Geran, J. 1998. Evaluating Collaboratives: Reaching the Potential. Program Development and Evaluation: Madison, WI.
U.S. Department of Health and Human Services; Administration for Children and Families; Office of Community Services. 1994. Evaluation Guidebook: Demonstration Partnership Program Projects.
W.K. Kellogg Foundation. 1998. W. K. Kellogg Foundation Evaluation Handbook.

Websites:
www.cdc.gov/eval/resources
www.eval.org (has online textbooks)
www.wmich.edu/evalctr (has online checklists)
www.preventiondss.org

When conducting literature reviews or searching for additional information, consider using alternative names for "process evaluation," including: formative evaluation, program fidelity, implementation assessment, implementation evaluation, program monitoring.
