Review Capstone Examples

EdD Capstone Rubrics and Checklists by Stage

Revised EdD Rubrics and Checklists: Which forms are required, and when?

The stages and timing of the University Research Review process have not fundamentally changed; only the documents have changed. However, the following outline may help clarify.

Proposal Writing Stage

Student: Complete the proposal and the appropriate Doctoral Study Checklist, identifying the page number where each item is located in the proposal. Use the comment blocks to provide any clarifying information for the reviewers.

Student: Submit the proposal and Doctoral Study Checklist to the Committee Chair.

 

Committee Chair: Review the proposal and the Doctoral Study Checklist, and evaluate the proposal.

· If the proposal is assessed as not ready for committee review based on the criteria in the Doctoral Study Minimum Standards Rubric, the Committee Chair provides feedback to the student using the Checklist and/or the Doctoral Study Minimum Standards Rubric (depending on the nature of the feedback).

· If the proposal is assessed as ready for further review, the Committee Chair forwards the proposal, the Doctoral Study Checklist, and his/her completed Doctoral Study Minimum Standards Rubric to the Committee Member for review.

· Once the Committee Chair and Committee Member agree that the proposal has met the applicable Doctoral Study Minimum Standards Rubric criteria (items 1-8 of the 10 criteria), the proposal is ready for Committee URR review. The Committee Chair then forwards to the Committee URR: 1) the proposal document; 2) the Turnitin Report; 3) the Doctoral Study Checklist completed by the student, with any comments from the Committee Chair and/or Member; and 4) the Doctoral Study Minimum Standards Rubrics completed by each member, indicating that the applicable standards have been met. (It is best to preserve the history of comments to the student for the Committee URR to review as well.)

Doctoral Study Writing Stage

Student: Complete the doctoral study and extend the Doctoral Study Checklist, identifying the page number where each item is located in the final doctoral study. When possible, add directly to the checklist used for the proposal. Use the comment blocks to provide any clarifying information for the reviewers.

Student: Submit the completed doctoral study and Doctoral Study Checklist to the Committee Chair.

Committee Chair: Review the final study and the Doctoral Study Checklist and evaluate the final study.

· If the study document is assessed as not ready for committee review based on the criteria in the Doctoral Study Minimum Standards Rubric, the Committee Chair provides feedback to the student using the Checklist and/or the Doctoral Study Minimum Standards Rubric (depending on the nature of the feedback).

· If the final study is assessed as ready for further review, the Committee Chair forwards the study, the Doctoral Study Checklist, and a completed Doctoral Study Minimum Standards Rubric to the Committee Member for review.

· Once the Committee Chair and Committee Member agree that the final doctoral study has met all 10 Doctoral Study Minimum Standards Rubric criteria, the doctoral study is ready for Committee URR review. The Committee Chair then forwards to the Committee URR: 1) the final doctoral study document; 2) the Turnitin Report; 3) the Doctoral Study Checklist completed by the student, with any comments from the Committee Chair and/or Member; and 4) the Doctoral Study Minimum Standards Rubrics completed by each member, indicating that all standards have been met. (It is best to preserve the history of comments to the student for the Committee URR to review as well.)

Final Study Stage/Post-Oral Defense (pre-CAO review)

Committee Chair and Committee Member:

· Chair forwards the final study document and the final quality rubric from each committee member directly to the URR, while copying doctoralstudy@waldenu.edu.

Committee URR:

· Forwards the review (a brief statement to the chair, the final document, and the final quality rubric, indicating whether it is approved for CAO review) to the Committee Chair, while copying doctoralstudy@waldenu.edu.

Clinical Field Experience A: Listening, Speaking, and Vocabulary Strategies

Allocate at least 5 hours in the field to support this field experience.

Observe at least one PreK-3 classroom with ELLs. Focus your observations on the strategies used to develop listening, speaking, and vocabulary. Discuss with your mentor teacher the strategies they employ to differentiate instruction and assessment to meet the learning needs of all students.

Use any remaining field experience hours to assist the mentor teacher in providing instruction and support to the class.

Write a 250-500-word reflection regarding your observations. Your reflection should include an example of the effective use of a strategy within each of the following domains:

  • Listening
  • Speaking
  • Vocabulary development

How could these strategies be differentiated to accommodate ELLs at various levels of English language proficiency?

APA format is not required, but solid academic writing is expected.

This assignment uses a rubric. Review the rubric prior to beginning the assignment to become familiar with the expectations for successful completion.

You are required to submit this assignment to LopesWrite. A link to the LopesWrite technical support articles is located in Class Resources if you need assistance.

Document the locations and hours you spend in the field on your Clinical Field Experience Verification Form.

Submit the Clinical Field Experience Verification Form to the learning management system in the last topic. Directions for submitting can be found on the College of Education site in the Student Success Center.

Employee Realignment Table

Complete both parts of this assignment.

Part One:

Create a one-page employee realignment table that compares recruitment, retention, replacement, and removal of employees in the organizational change process.

Include the following for each realignment practice:

  • Definition
  • Purpose
  • Desired outcome
  • Implementation strategy

Part Two:

Evaluate your Learning Team’s Organizational Change Process presentation and identify the competencies needed for job performance in the areas of change that your team chose. Relate those competencies to the table you created, which realigns employees through recruitment, retention, replacement, and removal to create organizational renewal.

Write a 275- to 350-word summary describing the areas of your Learning Team’s Organizational Change Process presentation that could benefit from each of the four employee realignment processes (recruitment, retention, replacement, and removal), based on your table and the competencies you identified.

Assessment Article Reflection

Choose one of the assessment articles below and complete a summary reflection. In your own words, explain what you learned from the article. How will the article help guide your instruction in the future? Has the article left you with any questions? Cite the article source in APA format. (The reflection must be one page in length, in 12-point Times New Roman, double spaced.)

For an APA citation style guide, click the link below:

https://lib.usm.edu/help/tutorials/

Articles on Student Assessment

  1. Bundock, K., O’Keeffe, B. V., Stokes, K., & Kladis, K. (2018). Strategies for minimizing variability in progress monitoring of oral reading fluency. Teaching Exceptional Children, 50(5), 273-281.
  2. Lindstrom, E. R., Gesel, S. A., & Lemons, C. J. (2019). Data-based individualization in reading: Tips for successful implementation. Intervention in School and Clinic, 55(2), 113-119.
  3. Powell, S. R., & Stecker, P. M. (2014). Using data-based individualization to intensify mathematics intervention for students with disabilities. Teaching Exceptional Children, 46(4), 31-37.


    Strategies for Minimizing Variability in Progress Monitoring of Oral Reading Fluency

    Kaitlin Bundock, Breda V. O’Keeffe, Kristen Stokes, and Kristin Kladis

    TEACHING Exceptional Children, Vol. 50, No. 5, pp. 273–281. Copyright 2018 The Author(s). DOI: 10.1177/0040059918764097

    https://doi.org/10.1177/0040059918764097


    Mr. Long is a special education teacher in an urban school district. Three times per year, he uses Dynamic Indicators of Basic Early Literacy Skills (DIBELS) Next to assess his students’ oral reading fluency (ORF) skills at their chronological grade level. Mr. Long conducts weekly progress monitoring of all students who score below the expected benchmark score for words read correctly per minute (WCPM). Students are assessed at either their grade level, if they are reading at or above 50 WCPM according to the DIBELS Next progress-monitoring guidelines (Dynamic Measurement Group, 2012), or at their instructional level based on results from a survey level assessment. To conduct the assessments, Mr. Long takes students out of the classroom during various times of the day. Depending on the time of day, Mr. Long uses different locations, including the hallway, a conference room, and an unused classroom. Students are taken individually or in small groups, depending on how far away he must take them for the assessment.

    After a few weeks, Mr. Long notices one of the students, Laine, has inconsistent scores in her data set (see Figure 1). Laine, a third-grade student with a specific learning disability, had scores of 61, 43, 75, and 57 WCPM over 4 weeks. Mr. Long compares Laine’s scores with those of other students in the group and notices the other students’ scores are more consistent. For example, Mason’s scores are 66, 71, 71, and 72 WCPM during the same time period (see Figure 2). Mr. Long consults with the school’s reading specialist and finds out that “high variability” includes a range of 10 or more words read correctly above or below the trend line. Because Mr. Long has graphed Laine’s data with a trend line characterizing the data, he can quickly determine that her data are highly variable. Mr. Long realizes that highly variable data can obscure what Laine’s true progress might be. He sees the need to collect more data to determine if the variability can be reduced before a good decision about changing her intervention can be made.
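Mr. Long’s determination can be reproduced with a short calculation. The sketch below is illustrative only: the article does not say how the trend line should be fitted, so an ordinary least-squares fit is assumed, and the function names are invented for this example.

```python
# Illustrative check of the "10 or more WCPM above/below the trend line" rule,
# using the WCPM scores reported for Laine and Mason in the vignette.
# Assumption: the trend line is an ordinary least-squares fit over weeks 1..n.

def trend_line(scores):
    """Least-squares trend line evaluated at weeks 1..n."""
    n = len(scores)
    xs = list(range(1, n + 1))
    mx, my = sum(xs) / n, sum(scores) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, scores)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return [my + slope * (x - mx) for x in xs]

def is_highly_variable(scores, band=10):
    """High variability: any score 10 or more WCPM above or below the trend line."""
    return any(abs(y - t) >= band for y, t in zip(scores, trend_line(scores)))

laine = [61, 43, 75, 57]  # WCPM over 4 weeks
mason = [66, 71, 71, 72]
print(is_highly_variable(laine))  # True: two of Laine's points fall 15 WCPM from the trend
print(is_highly_variable(mason))  # False: Mason's points all sit within about 2 WCPM of the trend
```

With these numbers, Laine’s trend line runs 56, 58, 60, 62 WCPM, so her scores deviate by 5, -15, 15, and -5 words, which matches the reading specialist’s description of high variability.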

    Curriculum-based measurement (CBM) is useful and effective for monitoring student progress in important skills, such as reading, mathematics, and writing. Research has shown that (a) CBM can be easily implemented and interpreted by teachers (e.g., Fuchs, Deno, & Mirkin, 1984), (b) student outcomes have improved when teachers use CBM to inform instructional decision making (e.g., Fuchs, Fuchs, Hamlett, & Stecker, 1991), (c) reliable and valid measures have been developed that predict important student outcomes (e.g., Fuchs, Fuchs, & Maxwell, 1988; Kim, Petscher, Schatschneider, & Foorman, 2010; Wayman, Wallace, Wiley, Tichá, & Espin, 2007), and (d) CBM can be an integral component of multi-tiered systems for identifying and monitoring students’ academic needs (e.g., Kovaleski, VanDerHeyden, & Shapiro, 2013; M. R. Shinn, 2007). CBM for reading (CBM-R) is an efficient and effective research-based progress-monitoring tool to monitor student growth in reading and to evaluate the effectiveness of targeted instruction (Good et al., 2011; Hosp, Hosp, & Howell, 2016). CBM-R is easy to administer and requires minimal resources, such as time and materials. Furthermore, the feedback teachers receive from administering CBM-R can inform instructional decision making and provide critical data about individual student progress toward reading goals. Given the utility of CBM-R, it is widely used as a key data source for instructional and eligibility decision making (Ardoin, Christ, Morena, Cormier, & Klingbeil, 2013).

    The most commonly used CBM-R is ORF (CBM ORF). CBM ORF is a research-based, standardized assessment of connected text that is administered to individual students. CBM ORF is a good indicator of a student’s current skill level and predictor of future reading performance (Deno, Fuchs, & Marston, 2001; Fuchs, Fuchs, Hosp, & Jenkins, 2001; Kim et al., 2010). CBM ORF requires the student to use a variety of different literacy skills, such as decoding, vocabulary, and comprehension (Hosp et al., 2016). CBM ORF originated in the 1970s, when practitioners randomly selected passages from the curriculum materials used in the classroom (e.g., Deno, 1985; Deno, Marston, Shinn, & Tindal, 1983). This practice increased the utility and validity of the measure for making instructional decisions; however, researchers found that student performance on passages within a grade level varied substantially, decreasing the reliability of these measures (see Hintze & Christ, 2004). Later iterations of CBM ORF included development of passages equated based on readability formulae (e.g., Aimsweb; M. M. Shinn & Shinn, 2002; DIBELS, 6th ed.; Good & Kaminski, 2002). Unfortunately, student performance on these passages continued to be excessively variable within grade levels (e.g., Poncy, Skinner, & Axtell, 2005). Excessive variability makes the data difficult to interpret, and therefore, recommendations for instructional modifications become unclear.

    Figure 1. Laine, third-grade student curriculum-based measurement oral reading fluency, high variability to moderate variability

    Currently, CBM ORF passages have been written using readability formulae for initial equating, then field-tested with students to choose the most equivalent passages to include in published sets (e.g., DIBELS Next; Good et al., 2011; easyCBM; Alonzo, Tindal, Ulmer, & Glasgow, 2006; FastBridge; Christ & Colleagues, 2015). Although some researchers have found persistent variability among these more modern passages (Cummings, Park, & Schaper, 2013), those studies were conducted with higher-performing students than those who are typically included in progress monitoring (e.g., students scoring at or above benchmark at screening). Other researchers found that when passages are implemented as intended, such as to progress monitor students below or well below benchmark, acceptably low levels of variability are seen (O’Keeffe, Bundock, Kladis, Yan, & Nelson, 2017; Tindal, Nese, Stevens, & Alonso, 2016).

    Given the challenges presented by excessive variability, educators should be aware of possible sources of variability and have strategies to prevent and address variability in CBM ORF progress monitoring. These strategies should be followed in addition to the recommendations from the specific publisher of the CBM ORF in use and from general recommendations for implementing and interpreting CBM (e.g., Hosp et al., 2016).

    Indicators of Excessive Variability in CBM ORF Progress Monitoring

    Educators need to determine how much variability is too much when evaluating student progress-monitoring data. Typically, educators evaluate progress-monitoring data using time series graphs, with words read correctly on each measurement occasion graphed over time. When educators use visual analysis to determine if a student is making adequate progress or not, multiple graphical components can affect this decision. For example, the amount of variability and the degree of slope in the data can make evaluation decisions more or less accurate, with higher variability and lower slope making decisions substantially less accurate (Nelson, Van Norman, & Christ, 2017; Ottenbacher, 1990; Van Norman & Christ, 2016). If inaccurate decisions are made based on variable data, students who need a change in intervention may not receive it, whereas students who do not need a change may experience an unneeded change in intervention. For CBM ORF, researchers have suggested that very low variability exists when most (i.e., 2/3) of the data points fall within five correctly read words per minute (five above and five below) of a trend line, and acceptable variability exists when most of the data points fall within 10 correctly read words per minute (10 above and 10 below) of a trend line (Christ, Zopluoglu, Monaghen, & Van Norman, 2013). These values are based on ranges across grade levels (e.g., Christ & Silberglitt, 2007); therefore, students who read more slowly would have lower limits of variability that are acceptable. If available through an electronic database (e.g., AimswebPlus; Pearson, 2017), researchers recommend making these determinations based on confidence intervals, which are generated statistically with the student data (Christ & Silberglitt, 2007). Values that fall outside these ranges can be considered extreme values, which can
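The thresholds above can also be applied directly to graphed data. The following is a minimal sketch of that classification, again assuming an ordinary least-squares trend line (the article does not mandate a fitting method); the function names are hypothetical, and only the 2/3-within-band thresholds come from the article.

```python
# Illustrative classification of CBM ORF variability using the criteria from
# Christ et al. (2013) as summarized above: "very low" when at least 2/3 of
# points fall within 5 WCPM of the trend line, "acceptable" within 10 WCPM.

def trend_line(scores):
    """Least-squares trend line evaluated at measurement occasions 1..n."""
    n = len(scores)
    xs = list(range(1, n + 1))
    mx, my = sum(xs) / n, sum(scores) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, scores)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return [my + slope * (x - mx) for x in xs]

def variability_level(scores):
    """Return 'very low', 'acceptable', or 'excessive' variability."""
    devs = [abs(y - t) for y, t in zip(scores, trend_line(scores))]
    n = len(devs)
    if sum(d <= 5 for d in devs) >= 2 * n / 3:
        return "very low"
    if sum(d <= 10 for d in devs) >= 2 * n / 3:
        return "acceptable"
    return "excessive"

print(variability_level([66, 71, 71, 72]))  # Mason's data: "very low"
print(variability_level([61, 43, 75, 57]))  # Laine's data: "excessive"
```

Note that under these criteria, Laine’s data from the opening vignette would be flagged as excessively variable, consistent with Mr. Long’s decision to collect more data before changing her intervention.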

    Figure 2. Mason, third-grade student curriculum-based measurement oral reading fluency, very low variability