Assignment: Developing A Program Evaluation

**Attached are some resources.**

The program I would like to focus on is federal housing programs that help the homeless. Thanks.

To ensure the success of a program evaluation, a social worker must generate a specific, detailed plan. That plan should describe the goal of the evaluation, the information needed, and the methods and analysis to be used. In addition, the plan should identify and address the concerns of stakeholders. A social worker should present information about the plan in a manner that the stakeholders can understand. This will help the social worker receive the support necessary for a successful evaluation.

To prepare for this Assignment, identify a program evaluation you would like to conduct for a program with which you are familiar. Consider the details of the evaluation, including the purpose, specific questions to address, and type of information to collect. Then, consider the stakeholders that would be involved in approving that evaluation. Review the resources for samples of program evaluations.

By Day 7

Submit the following:

  • A 1-page stakeholder analysis that identifies the stakeholders, their roles in the agency, and any concerns they might have about the proposed program evaluation
  • A 2- to 3-page draft of the program evaluation plan to submit to the stakeholders that:
    • Identifies the purpose of the evaluation
    • Describes the questions that will be addressed and the type of information that will be collected
    • Addresses the concerns of the stakeholders that you identified in your Stakeholder Analysis

      Program Evaluation Studies

      TK Logan and David Royse

      A variety of programs have been developed to address social problems such as drug addiction, homelessness, child abuse, domestic violence, illiteracy, and poverty. The goals of these programs may include directly addressing the problem origin or moderating the effects of these problems on individuals, families, and communities. Sometimes programs are developed to prevent something from happening, such as drug use, sexual assault, or crime. These kinds of problems and programs to help people are often what attracts many social workers to the profession; we want to be part of the mechanism through which society provides assistance to those most in need. Despite low wages, bureaucratic red tape, and routinely uncooperative clients, we tirelessly provide services that are invaluable but also at various times may be or become insufficient or inappropriate. But without conducting evaluation, we do not know whether our programs are helping or hurting, that is, whether they only postpone the hunt for real solutions or truly construct new futures for our clients. This chapter provides an overview of program evaluation in general and outlines the primary considerations in designing program evaluations.

      Evaluation can be done informally or formally. We are constantly, as consumers, informally evaluating products, services, and information. For example, we may choose not to return to a store or an agency again if we did not evaluate the experience as pleasant. Similarly, we may mentally take note of unsolicited comments or anecdotes from clients and draw conclusions about a program. Anecdotal and informal approaches such as these generally are not regarded as carrying scientific credibility. One reason is that decision biases play a role in our “informal” evaluation. Specifically, vivid memories or strongly negative or positive anecdotes will be overrepresented in our summaries of how things are evaluated. This is why objective data are necessary to truly understand what is or is not working.

      By contrast, formal evaluations systematically examine data from and about programs and their outcomes so that better decisions can be made about the interventions designed to address the related social problem. Thus, program evaluation involves the use of social research methodologies to appraise and improve the ways in which human services, policies, and programs are conducted. Formal evaluation, by its very nature, is applied research.

      Formal program evaluations attempt to answer the following general question: Does the program work? Program evaluation may also address questions such as the following: Do our clients get better? How does our success rate compare to those of other programs or agencies? Can the same level of success be obtained through less expensive means? What is the experience of the typical client? Should this program be terminated and its funds applied elsewhere?

      Ideally, a thorough program evaluation would address more complex questions in three main areas: (1) Does the program produce the intended outcomes and avoid unintended negative outcomes? (2) For whom does the program work best and under what conditions? and (3) How well was a program model developed in one setting adapted to another setting?

      Evaluation has taken an especially prominent role in practice today because of the focus on evidence-based practice in social programs. Social work, as a profession, has been asked to use evidence-based practice as an ethical obligation (Kessler, Gira, & Poertner, 2005). Evidence-based practice is defined differently, but most definitions include using program evaluation data to help determine best practices in whatever area of social programming is being considered. In other words, evidence-based practice includes using objective indicators of success in addition to practice or more subjective indicators of success.
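To make the idea of an "objective indicator of success" concrete, here is a minimal sketch in Python that compares the share of clients reaching a target outcome in a program group versus a comparison group. The outcome coding, group names, and numbers are illustrative assumptions, not data from any study cited in this chapter.

```python
# A minimal sketch of one objective indicator of success: the proportion of
# clients who reached a target outcome, compared across two groups.
# All values are hypothetical placeholders for illustration only.

def success_rate(outcomes):
    """Return the share of clients coded 1 (outcome achieved) vs. 0 (not)."""
    return sum(outcomes) / len(outcomes) if outcomes else 0.0

# Hypothetical codes: 1 = stably housed at follow-up, 0 = not housed
program_clients = [1, 1, 0, 1, 1, 0, 1, 1]
comparison_group = [0, 1, 0, 0, 1, 0, 1, 0]

print(f"Program success rate:    {success_rate(program_clients):.0%}")
print(f"Comparison success rate: {success_rate(comparison_group):.0%}")
```

A real evaluation would pair such descriptive rates with an appropriate design and statistical test; the snippet only shows how an objective indicator can be computed and compared.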

      Formal program evaluations can be found on just about every topic. For instance, Fraser, Nelson, and Rivard (1997) have examined the effectiveness of family preservation services; Kirby, Korpi, Adivi, and Weissman (1997) have evaluated an AIDS and pregnancy prevention middle school program. Morrow-Howell, Becker-Kemppainen, and Judy (1998) evaluated an intervention designed to reduce the risk of suicide in elderly adult clients of a crisis hotline. Richter, Snider, and Gorey (1997) used a quasi-experimental design to study the effects of a group work intervention on female survivors of childhood sexual abuse. Leukefeld and colleagues (1998) examined the effects of an HIV prevention intervention with injecting drug and crack users. Logan and colleagues (2004) examined the effects of a drug court intervention as well as the costs of drug court compared with the economic benefits of the drug court program.

      Basic Evaluation Considerations

      Before beginning a program evaluation, several issues must be initially considered. These issues are decisions that are critical in determining the evaluation methodology and goals. Although you may not have complete answers to these questions when beginning to plan an evaluation, these questions help in developing the plan and must be answered before an evaluation can be carried out. We can sum up these considerations with the following questions: who, what, where, when, and why.

      First, who will do the evaluation? This seems like a simple question at first glance. However, this particular consideration has major implications for the evaluation results. Program evaluators can be categorized as being either internal or external. An internal evaluator is someone who is a program staff member or regular agency employee, whereas an external evaluator is a professional, on contract, hired for the specific purpose of evaluation. There are advantages and disadvantages to using either type of evaluator. For example, the internal evaluator probably will be very familiar with the staff and the program. This may save a lot of planning time. The disadvantage is that evaluations completed by an internal evaluator may be considered less valid by outside agencies, including the funding source. The external evaluator generally is thought to be less biased in terms of evaluation outcomes because he or she has no personal investment in the program. One disadvantage is that an external evaluator frequently is viewed as an “outsider” by the staff within an agency. This may affect the amount of time necessary to conduct the evaluation or cause problems in the overall evaluation if agency staff are reluctant to cooperate.

       

       


      Second, what resources are available to conduct the evaluation? Hiring an outside evaluator can be expensive, while having a staff person conduct the evaluation may be less expensive. So, in a sense, you may be trading credibility for less cost. In fact, each methodological decision will have a trade-off in credibility, level of information, and resources (including time and money). Also, the amount and level of information as well as the research design will be determined, to some extent, by what resources are available. A comprehensive and rigorous evaluation does take significant resources.

      Third, where will the information come from? If an evaluation can be done using existing data, the cost will be lower than if data must be collected from numerous people such as clients and/or staff across multiple sites. So having some sense of where the data will come from is important.

      Fourth, when is the evaluation information needed? In other words, what is the timeframe for the evaluation? The timeframe will affect costs and design of research methods.

      Fifth, why is the evaluation being conducted? Is the evaluation being conducted at the request of the funding source? Is it being conducted to improve services? Is it being conducted to document the cost-benefit trade-off of the program? If future program funding decisions will depend on the results of the evaluation, then a lot more importance will be attached to it than if a new manager simply wants to know whether clients were satisfied with services. The more that is riding on an evaluation, the more attention will be given to the methodology and the more threatened staff can be, especially if they think that the purpose of the evaluation is to downsize and trim excess employees. In other words, there are many reasons an evaluation is being considered, and these reasons may have implications for the evaluation methodology and implementation.

      Once the issues described above have been considered, more complex questions and trade-offs will be needed in planning the evaluation. Specifically, six main issues guide and shape the design of any program evaluation effort and must be given thoughtful and deliberate consideration.

      1. Defining the goal of the program evaluation

      2. Understanding the level of information needed for the program evaluation

      3. Determining the methods and analysis that need to be used for the program evaluation

      4. Considering issues that might arise and strategies to keep the evaluation on course

      5. Developing results into a useful format for the program stakeholders

      6. Providing practical and useful feedback about the program strengths and weaknesses as well as providing information about next steps

      Defining the Goal of the Program Evaluation


One popular approach to lesson planning at all levels of education is backward design, also called the backward planning approach. The idea behind backward design/backward planning is that you teach and plan toward the end goal, or what you want the students to know or be able to do. This typically keeps the content being taught focused and organized, thus promoting better understanding for the students.

*If you are not familiar with backward design or backward mapping, I encourage you to look up videos on these concepts on YouTube.

This assignment reinforces the learning of backward design and provides an opportunity to develop a planning template for future use.

Part 1: Backward Design Guide

Create a 350- to 525-word guide that explains the use of backward design in lesson planning.

Include the following in your guide:

  • Explain backward design using these terms:
    • Definition of the facets of understanding:
      • Explanation
      • Interpretation
      • Application
      • Perspective
      • Empathy
      • Self-knowledge
  • Describe the 3 stages of backward design:
    • Desired results
    • Assessment evidence
    • Learning plan


Discussion Questions

You must access the following article to answer the questions:

Boyle, D. K., & Thompson, S. A. (2020). CMSRNs’ continuing competence methods and perceived value of certification: A descriptive study. MEDSURG Nursing, 29(4), 229-254. https://chamberlainuniversity.idm.oclc.org/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=145282314&site=eds-live&scope=site

1. Locate the literature review section. Summarize, in your own words, one of the study/literature findings. Be sure to identify which study you are summarizing.

2. Discuss how the authors’ review of the literature (studies) supported the research purpose/problem. Share something that was interesting to you as you read through the literature review section.

3. Describe one strategy that you learned that would help you create a strong literature review/search for evidence. Share your thoughts on the importance of a thorough review of the literature.

Signature Assignment: Training Program Plan

University of Phoenix Material

The Training Program Plan project provides you with the opportunity to demonstrate the knowledge and skills necessary to develop a training program that leads to a positive impact on adult learning. You will design and create a training program plan that includes the following components:

· Needs or gap analysis

· Training program description

· Budget

· Stakeholders and goals

· Training promotional materials

· Program evaluation

Part I – Needs or Gap Analysis

It is recommended that you begin this section of the Training Program Plan in Week 1.

Analyze your identified educational need in your workplace organization by completing a needs or gap analysis. Use a survey or questionnaire, or conduct a focus group to determine needs or gaps.

Write a 350- to 700-word analysis of the educational needs at the organization. Include evidence from your survey, questionnaire, or focus group to support your analysis, and explain how a training program will support a positive impact on adult learning.
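If you gather your needs-assessment data with a Likert-scale survey or questionnaire, a small script like the following can help you rank items by average agreement and surface likely gaps. This is a sketch only; the item wording and responses shown are hypothetical and should be replaced with your own instrument and data.

```python
# A minimal sketch of summarizing needs-assessment survey items to spot gaps.
# Item wording and responses below are hypothetical placeholders.
from statistics import mean

# Responses use a 1-5 Likert scale (1 = strongly disagree ... 5 = strongly agree).
responses = {
    "I have the skills I need for the new intake software": [2, 1, 3, 2, 2],
    "I receive enough feedback on my documentation": [4, 3, 4, 5, 4],
    "I know where to find current housing-policy updates": [2, 2, 1, 3, 2],
}

# Items with the lowest average agreement point to likely training gaps.
for item, scores in sorted(responses.items(), key=lambda kv: mean(kv[1])):
    print(f"{mean(scores):.1f}  {item}")
```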

Part II – Training Program Description

It is recommended that you begin this section of the Training Program Plan in Week 2.

Based on the needs/gap analysis, describe the training program that you will be designing for your workplace organization:

· Define the scope of the training program.

· Describe the intended audience.

· Define the program goals.

· Define the program objectives.

Write a 350-word introduction to the training program explaining how it will produce a positive impact on adult learning.

Format the paper consistent with APA guidelines.

Part III – Budget

It is recommended that you begin this section of the Training Program Plan in Week 3.

Create a budget for the training program you are designing for your workplace organization based on the scope, audience, goals, and objectives described in Part II.

Include cost estimates for all of the following categories in your budget; a simple cost-tally sketch follows the category list.

· Personnel

· External staff (e.g., consultants)

· Materials or Equipment

· Technical support

· Travel

· Facilities
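As a sketch of how the category estimates might be tallied once you have them, the following snippet sums placeholder figures by category and prints a total. Every number is a hypothetical stand-in, not a suggested cost.

```python
# A minimal sketch of tallying a training-program budget by category.
# All figures are hypothetical placeholders; substitute your own estimates.
budget = {
    "Personnel": 6500.00,
    "External staff (consultants)": 2400.00,
    "Materials or equipment": 1200.00,
    "Technical support": 800.00,
    "Travel": 950.00,
    "Facilities": 1500.00,
}

for category, estimate in budget.items():
    print(f"{category:<30} ${estimate:>9,.2f}")
print(f"{'Total':<30} ${sum(budget.values()):>9,.2f}")
```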

Part IV – Stakeholders and Goals

It is recommended that you begin this section of the Training Program Plan in Week 4.

Using the program goals you identified in Part II, create a list of stakeholders aligned to each goal.

Explain the role of each stakeholder in the training program and how you intend to gain their support to produce a positive impact on adult learning.
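One lightweight way to keep this alignment explicit is a simple goal-to-stakeholder mapping, as in the sketch below. The goals, stakeholder names, and roles shown are hypothetical placeholders to be replaced with the goals you defined in Part II.

```python
# A minimal sketch of mapping each program goal to its aligned stakeholders.
# Goals, names, and roles are hypothetical placeholders.
goal_stakeholders = {
    "Improve staff proficiency with the new intake system": [
        ("Program director", "Approves the budget and release time"),
        ("Case managers", "Primary learners; give feedback on content"),
    ],
    "Reduce documentation errors by year end": [
        ("Quality-assurance lead", "Defines the error benchmarks"),
        ("IT support", "Supplies system reports used to measure progress"),
    ],
}

for goal, stakeholders in goal_stakeholders.items():
    print(goal)
    for name, role in stakeholders:
        print(f"  - {name}: {role}")
```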

Part V – Training Promotional Materials

It is recommended that you begin this section of the Training Program Plan in Week 5.

Create promotional materials for the training program you are developing for your workplace organization. Include the following in your promotional materials:

· A marketing message that includes a logo, slogan, and fact sheet

· Benefits of the training program, including instructional practices to produce a positive impact on adult learning

· An explanation of how you will communicate and distribute the promotional materials in your organization

Part VI – Program Evaluation

It is recommended that you begin this section of the Training Program Plan in Week 5.

Create an evaluation for your training program. Include qualitative and quantitative items in your evaluation. Consider using a survey, questionnaire, etc.
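If your evaluation form mixes Likert-rated items with open-ended comments, the short sketch below shows one way to average the quantitative items and set the comments aside for qualitative review. The item names and responses are hypothetical examples, not a prescribed instrument.

```python
# A minimal sketch of scoring a mixed evaluation form: Likert items are
# averaged; open-ended comments are collected for qualitative review.
# Items and responses are hypothetical placeholders.
from statistics import mean

submissions = [
    {"relevance": 5, "pace": 4, "comment": "Examples matched my caseload."},
    {"relevance": 4, "pace": 3, "comment": "More practice time, please."},
    {"relevance": 5, "pace": 5, "comment": ""},
]

# Quantitative: average each rated item (1-5 scale).
for item in ("relevance", "pace"):
    print(f"{item}: {mean(s[item] for s in submissions):.1f} / 5")

# Qualitative: collect non-empty comments for thematic review.
comments = [s["comment"] for s in submissions if s["comment"]]
print("Comments for review:", comments)
```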

Part VII – Implementation

This activity will be delivered in Week 6.

Design a 10- to 15-minute activity based on a training program goal and aligned objective(s). Access the Technology Resources Library to select a presentation tool.

Present the activity in a medium of your choice to the class.


Copyright © 2017 by University of Phoenix. All rights reserved.