FAQs

2016 Measurable Progress Frequently Asked Questions (FAQs)

This document of frequently asked questions is the fourth version of a living document that will be updated as needed. If you have a question that's not covered below, please email Susanne Bell at susanne@smarterlearninggroup.com. (Last updated December 7, 2016.)

Frequently Asked Questions

Making Measurable Progress for Low-Income Children

 

1. What progress indicators do communities involved in the Campaign for Grade-Level Reading use to measure progress for low-income children?

  • School Readiness — More children from low-income families are ready for school and developmentally on track, or fewer children enter kindergarten with undetected, undiagnosed and untreated conditions or delays that can impede learning.
  • School Attendance — Fewer children from low-income families are chronically absent.
  • Summer Learning — More children from low-income families are maintaining or increasing their reading levels over the summer.
  • Overall Grade-Level Reading — More children from low-income families are reading at or above grade level at the end of first, second and third grade.

2. How do these progress indicators relate to the criteria for 2016 Pacesetter Honors?

The GLR Campaign seeks to award 2016 Pacesetter Honors to communities that meet either or both of the following criteria:

  • Demonstrate population-level, community-wide measurable progress for low-income children in at least one of the above solutions areas within the past five years.
  • Demonstrate exemplary work in one or more aspects of the GLR Campaign's framework for success, scale and sustainability:
    • Aligning, linking, stacking and bundling the most proven and promising strategies, programs and practices
    • Integrating efforts to support parent success and address the health determinants of early school success
    • Driving with data to establish baselines, set targets, track progress, disaggregate for subgroups, create early warning and response systems, tailor strategies and ensure shared accountability
    • Building cross-sector collaboration, community-wide mobilization and a coalition of local funders committed to achieving the result
    • Prioritizing children and families in public housing and reaching those children who are especially vulnerable (children with learning differences; dual language learners; children who are in foster care, who are homeless or whose parents are incarcerated)
    • Utilizing technology to expand reach, mobilize constituencies, improve service delivery and/or streamline operations

3. How do these progress indicators relate to the criteria for the 2017 All-America City Award?

A community that receives the 2017 All-America City Award will need to provide quantitative (numerical) data to demonstrate that it has moved the needle on outcomes for low-income children in at least two of the progress indicators within the past five years. Note: Community-wide, population-level progress is not an expectation for this award.

4. In order to receive 2016 Pacesetter Honors or the 2017 All-America City Award, do I need to provide evidence of progress in outcomes specifically for low-income children?

Yes. In order to be recognized, communities should provide evidence of measurable progress in outcomes for low-income children specifically.

5. How should I present data to demonstrate evidence of measurable progress for low-income children?

You are asked to provide the following: the specific indicator you have used (e.g., "percent/number of entering kindergartners scoring ready"), your baseline number and percent (and year), and the most recent number and percent (and year) for the age or grade level where low-income children have made the most progress (i.e., your best-case example). Please be sure to specify the instrument used to gather your data (such as a particular assessment) and the source of your data (e.g., school system, health department, etc.). You should also indicate whether you are reporting community-wide, population-level data or data for a subset (such as a school or cluster of schools) by responding to the specific question (Section A, #11, 31, 40, 50).
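As an illustration of the format only (every name and figure below is hypothetical and not drawn from any actual community), here is a minimal sketch of one way to organize the requested data points and derive the percent from the underlying counts:

```python
# Hypothetical illustration of the data points requested above
# (indicator, instrument, source, scope, baseline, most recent).
# All names and figures are invented for formatting purposes only.

submission = {
    "indicator": "percent/number of entering kindergartners scoring ready",
    "instrument": "state-adopted Kindergarten Readiness Assessment",  # hypothetical
    "source": "school system",                                        # hypothetical
    "scope": "community-wide, population-level",  # or a subset, e.g., a cluster of schools
    "baseline":    {"year": 2012, "number": 450, "total": 1200},      # hypothetical counts
    "most_recent": {"year": 2016, "number": 560, "total": 1250},      # hypothetical counts
}

def percent(point):
    """Derive the percent to report alongside each number."""
    return round(100 * point["number"] / point["total"], 1)

for label in ("baseline", "most_recent"):
    point = submission[label]
    print(f'{label}: {point["year"]}: {point["number"]} of {point["total"]} ({percent(point)}%)')
```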

6.  Is there an expectation regarding the amount (or degree) of progress that a community should make over its baseline? 

No, we’re looking for measurable progress against your baseline and relative to your locally defined target.
 

7.  Will it suffice to demonstrate measurable progress by submitting data on the same group or cohort of low-income children within the same school year, or over a single summer?

No, we're looking for YEAR-OVER-YEAR measurable progress (that is, over at least two years). So, for example, it is insufficient to show an improvement in kindergarten readiness scores from the beginning of a school year to the end of that same school year for the same cohort of low-income preschool children. It would be sufficient, for example, to show that a greater percentage of low-income preschoolers score ready for kindergarten for the most recent cohort of children for whom you have data, as compared with an earlier cohort of preschoolers for whom you have baseline year data. 

Please note that this year-over-year expectation applies to ALL solutions areas. For example, communities will need to report two summers' worth of data to demonstrate progress in reducing summer learning loss.
 

8.  At what scale does my community need to demonstrate measurable progress in outcomes for low-income children? What if my community only has data for a small group of children enrolled in a single program, single school, etc.?

For 2016 Pacesetter Honors (awarded only to communities that are formally affiliated with the Campaign for Grade-Level Reading Network), communities should be able to demonstrate community-wide, population-level progress for low-income children. This can be demonstrated in either of the following ways: (a) on the basis of the unit of change or catchment area you have defined for your grade-level reading initiative (e.g., if your initiative is countywide, your data would reflect progress on a countywide indicator); or (b) on the basis of an entire neighborhood or system in a particular city (e.g., you could cite progress for all children in Head Start or children from WIC families).

For the 2017 All-America City Award, a community may provide evidence of progress for a subset of low-income children included within the design of your overall community initiative — for example, for a particular grade at the school level (a single classroom is not sufficient, however), or for a program operated by a local nonprofit (such as a Boys & Girls Club or Y). If the reported progress is not at the population level or community wide, you are asked to share your plans for how you will scale up and deepen the impact of your efforts.
 

9.  What if I have lots of data points to report as measurable progress for low-income children in one or more solutions areas (e.g., data for several grade levels, or data on various subgroups of students)?

We ask you to provide your baseline number and percent (and year) and most recent number and percent (and year) only for the particular age group or grade level for which low-income children have made the most progress (that is, your single best-case example) for each solutions area and for overall grade-level reading. Please note that we cannot accept or evaluate tables, charts, graphs or matrices for numerous groups or subsets of children. You should submit ONLY a single baseline number and percent and your single most recent number and percent for a particular age group or grade level. Please see below for additional assistance from the GLR Campaign's technical assistance partners.
 

10.  What if there has been a change in the assessment instrument used to measure progress in a particular solutions area (such as in school readiness) since we first started collecting data? That is, what if the assessment instrument used to establish my baseline in a particular solutions area is different from the instrument now used to collect data on student outcomes?

We acknowledge that over the duration of the GLR Campaign, there will likely be changes in assessment instruments that will result in the establishment of new baselines and benchmarks for measurable progress. In order to verify that measurable progress has been made in a particular solutions area, however, you are asked to use the same assessment instrument to submit a baseline data point and most recent data point. This may mean that you would need to submit a baseline point that was more recently established with the instrument currently in use in your community. In such a case, you would not submit the original baseline point you initially established with the instrument that is no longer in use.


11.  How do I show measurable progress in school readiness for low-income children?

You may submit your baseline and most recent data points in any of the following ways:

A.  You may present your baseline and most recent data reflecting a reduction in the number of low-income children who enter kindergarten with undetected, undiagnosed and untreated delays and conditions that impede learning (for example, health impairments such as vision, hearing or dental issues, or other physical ailments that impede learning; developmental delays and disabilities; and social-emotional challenges).

  • You may present data from a locally-adopted data tracking system that captures the number of children from low-income families who receive a developmental screening and follow-up. This might include the number of families that have responded to the ASQ (Ages & Stages Questionnaires) or another quality screening tool and received follow-up.
  • You may present baseline and most recent data from Head Start reflecting how many children received developmental screenings and how many were referred for services. 

Explanatory Note

Head Start and Early Head Start programs annually collect and report data on a “Program Information Report” (PIR); this is public information, so local programs should be able to share their data. Where Head Start and Early Head Start are operating together, their data are aggregated on a common PIR. They collect data on the following questions (in PIR Section C: Child & Family Services):

  • C.28 Number of all newly enrolled children since last year's PIR was reported
  • C.29 Number of all newly enrolled children who completed required screenings within 45 days for developmental, sensory, and behavioral concerns since last year's PIR was reported
    • Of these, the number identified as needing follow-up assessment or formal evaluation to determine if the child has a disability

Local Head Start and Early Head Start programs provide (and report on separately in the same section of the PIR) services, such as mental health services, for children who need them; they also track and report the number of children with mental health needs who are referred for services in the community.

  • You may also present baseline and most recent data from other sources, such as Medicaid, which reimburses primary care providers (and, under some circumstances, others) for conducting developmental screening. You may also present data maintained by child welfare systems, which are required to provide developmental screening to young children in their care.
  • If the sources from which you draw data on developmental screening also collect data on the follow-up (i.e., referral and linkage to services) that is appropriate when concerns are identified in the screening, we invite you to share baseline and most recent data on the follow-up as well. Data on follow-up are not required, however, if they are unavailable.
  • In states with a Quality Rating and Improvement System (QRIS) in place for early care and education programs, you may present your baseline and most recent data reflecting the number of 3- and 4-year-olds in the community who are enrolled in high-quality early care and education programs that require screening and follow-up for developmental, sensory and behavioral concerns.

B.  You may present your baseline and most recent data from formal assessments of school readiness, including:

  • A state-adopted Kindergarten Entry Assessment/Kindergarten Readiness Assessment
  • A locally-adopted Kindergarten Entry Assessment/Kindergarten Readiness Assessment/Kindergarten Student Entry Profile

PLEASE NOTE: We are looking here for the overall/comprehensive readiness score on a multi-domain (emotional, behavioral, social, physical, cognitive) Kindergarten Entry Assessment, and not simply a subscore on one of the domains of the assessment. (For example, a subscore for letter-naming fluency alone would be considered insufficient evidence of progress.) Also, please note that in order to verify that measurable progress has been made in a particular solutions area, you are asked to submit your baseline data point and most recent data point using the same assessment instrument. Alternatively, if a new assessment instrument has been introduced in your community and a comparability or conversion process has been established to track progress across the old and new instruments, you may also submit that information.

  • In states in which a large percentage of low-income children are enrolled in state-funded pre-K programs, you may present baseline and most recent data reflecting the number of children enrolled in the program who score, at the end of the pre-K year, at a level that indicates readiness for kindergarten.
  • In communities that are scaling up one or more of the following evidence-based parent success tools, supports, programs and interventions that have a positive impact on school readiness, you may provide us with your baseline enrollment and most recent enrollment figures for your best such example:
    • HIPPY
    • Incredible Years
    • Parents as Teachers
    • PALS
    • PEEP
    • Triple P: Positive Parenting Program
    • Other program or tool for which there is an established evidence base (please provide name of program or tool, along with your data)

 For more information or assistance, please contact any of the following:

Ann Rosewater: annrosewater@comcast.net
Yolie Flores: yflores@gradelevelreading.net


12.  How do I show measurable progress in school attendance for low-income children?

The GLR Campaign's school attendance technical assistance team at Attendance Works (www.attendanceworks.org) encourages communities to monitor, for the purposes of showing measurable progress, the percent and number of students who are chronically absent (missing 10 percent or more of the school year, or 18 or more days out of a 180-day school year) over the past two or more years. You are asked to provide two years of data reflecting your best-case example of measurable progress overall for a particular grade level (any of grades K-3) and then specifically for low-income children in that same grade level. This information will help us understand whether you are putting in place practices and policies that are helping improve attendance for all students and whether those efforts are also effectively addressing the needs of your most marginalized populations.

For the purposes of aligning interventions with students' needs based on their rate of absenteeism, Attendance Works encourages communities to monitor the following attendance bands for all students, by grade and by student subgroup, in particular students who qualify for free and reduced-price meals:

  • percent and number of students with satisfactory attendance (missing fewer than 5 percent of the school year, or up to 9 days out of a 180-day school year)
  • percent and number of students with at-risk attendance (missing 6-9 percent of the school year, or 10-17 days out of a 180-day school year)
  • percent and number of students with moderate chronic absence (missing 10-19 percent of the school year, or 18-35 days out of a 180-day school year)
  • percent and number of students with severe chronic absence (missing 20 percent or more of the school year, or 36 or more days out of a 180-day school year)

Attendance Works offers free Excel-based tracking tools that districts can use to calculate chronic absence levels by grade, school and student subpopulation. Visit the Attendance Works website for these tools.
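For communities doing a quick tally of their own data before turning to those tools, the banding arithmetic itself is simple. The sketch below is an illustration only (not an Attendance Works product); it assumes a 180-day school year and restates the thresholds above in code:

```python
# Minimal sketch of the attendance-band arithmetic described above, assuming
# a 180-day school year. Illustration only, not a replacement for the free
# Excel-based tools from Attendance Works.

SCHOOL_YEAR_DAYS = 180

def attendance_band(days_absent, school_year_days=SCHOOL_YEAR_DAYS):
    """Return the attendance band for one student based on days absent."""
    rate = days_absent / school_year_days
    if rate >= 0.20:
        return "severe chronic absence"    # 36 or more days of a 180-day year
    if rate >= 0.10:
        return "moderate chronic absence"  # 18-35 days
    if rate >= 0.05:
        return "at-risk attendance"        # roughly 9-17 days
    return "satisfactory attendance"       # fewer than 9 days

def band_counts(days_absent_by_student):
    """Tally the number and percent of students in each band."""
    counts = {}
    for days in days_absent_by_student:
        band = attendance_band(days)
        counts[band] = counts.get(band, 0) + 1
    total = len(days_absent_by_student)
    return {band: (n, round(100 * n / total, 1)) for band, n in counts.items()}

# Hypothetical example: days absent for a small group of students.
print(band_counts([2, 5, 12, 19, 40, 7, 22]))
```

Disaggregating by grade and by free and reduced-price meal status is then a matter of running the same tally on each subgroup.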

OPTIONAL ADDITIONAL CONSIDERATIONS

While there are many factors that contribute to chronic absence, the GLR Campaign is focusing intensively on health-related problems, because these concerns have a major effect on school attendance for our youngest students in particular. 

Attendance Works recommends first monitoring asthma, which disproportionately affects children from low-income families. We suggest tracking decreases in the rate of chronic absence among students identified as having asthma. We continue to explore how to track decreases in the number of absences related to other health-related concerns and would appreciate hearing about any methods emerging from the work of GLR Network communities.

For more information or assistance, please visit attendanceworks.org.


13.  How do I show measurable progress in summer learning for low-income children? 

The GLR Campaign's summer learning technical assistance partners at the National Summer Learning Association (www.summerlearning.org) suggest the following measure to track progress in summer learning:

More children from low-income families are maintaining or increasing their reading levels over the summer months.

What does this indicator mean?

  • “More children” – We often talk about the summer slide and preventing summer learning loss. However, to keep things clear, we made the wording here positive. You should report increases, not decreases.
  • “low-income families” – The data should be disaggregated to show progress for children from low-income families.
  • “maintaining or increasing” – This refers again to the idea of preventing summer learning loss by making sure that, at a minimum, children are maintaining their reading levels over the summer, if not making gains.
  • “reading levels” – To show measurable progress for summer learning, you should report on reading levels or skills only, as opposed to other measures you might be tracking during the summer, such as program enrollment or number of books read.

The data on children’s reading levels or skills can come from (a) your school district (e.g., spring and fall benchmark assessments) or (b) in-program formal assessments. Common formal assessments for measuring reading levels or skills include, but are not limited to, the following:

  • DIBELS
  • aimsweb Test of Early Literacy
  • STAR Reader
  • San Diego Quick
  • Stanford Diagnostic
  • Scholastic Reading Inventory
  • Lexile Framework
  • Fountas and Pinnell
  • “over the summer months” – You should report data that isolate the learning happening during the summer months and include as little of the school year as possible:

For the baseline summer: report the change from the data point collected just before that summer (point 1) to the data point collected just after it (point 2).

For the most recent summer: report the change from the pre-summer data point (point 3) to the post-summer data point (point 4).

If you use school district data, the measurement window will stretch out a little, since benchmark assessments overlap with the school year. If you use in-program data, the window will pull in a little, since the data are collected during the summer itself. The key is to be as consistent as you can and, if possible, use data from the same time points each year.
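As a minimal sketch (assuming you have matched pre- and post-summer scores on the same assessment for each low-income child, and with all scores below invented for illustration), the indicator can be computed by counting the share of children whose post-summer score is at or above their pre-summer score, for the baseline summer and again for the most recent summer:

```python
# Minimal sketch: percent of low-income children maintaining or increasing
# their reading level over a summer, given matched pre- and post-summer
# scores per child on the same assessment. All data below are hypothetical.

def maintained_or_gained(scores):
    """scores: list of (pre_summer, post_summer) pairs for low-income children.
    Returns the number and percent of children who held or improved."""
    held = sum(1 for pre, post in scores if post >= pre)
    return held, round(100 * held / len(scores), 1)

# Hypothetical matched scores, same instrument and same time points each year.
baseline_summer    = [(412, 405), (398, 401), (430, 430), (385, 378)]
most_recent_summer = [(410, 415), (395, 396), (428, 426), (388, 392)]

for label, scores in (("baseline summer", baseline_summer),
                      ("most recent summer", most_recent_summer)):
    n, pct = maintained_or_gained(scores)
    print(f"{label}: {n} of {len(scores)} children ({pct}%) maintained or gained")
```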

For more information, please visit the NSLA website and the GLR TA Menu for Summer Learning. Want more? Contact glrpartnership@summerlearning.org.


14.  How do I show measurable progress in overall grade-level reading for low-income students?

You should provide a baseline data point (number and percent) and a most recent data point (number and percent) for the end of either first, second or third grade (depending on the designation used in a particular state or community) that reflect the percent of low-income children who are reading proficiently or above at that grade level, or who are reading at or above grade level.