
It’s Time to Stop Overlooking Juvenile Justice Education Policy

Just as juvenile justice education programs are commonly overlooked in mainstream educational equity conversations, they are also left behind in state education policy. The consequences for students are dire.

Juvenile justice education programs, as Bellwether Education Partners defines them, serve students in the court-ordered custody of a local or state agency. Settings can include short-term detention centers, long-term secure facilities, residential treatment centers, or other publicly and privately run facilities. The best estimates tell us that nearly a quarter of a million students were detained or committed to such facilities in 2019, where they had extremely limited access to education opportunities of all kinds, including online learning, differentiated coursework, tutoring, dual-credit courses, career and technical education, and work-based learning.

Our latest report finds that the governance, accountability, and finance policy designs are convoluted, inconsistent, and in some cases entirely absent in juvenile justice education programs. We reviewed state policy in all 50 states, Washington, D.C., and Puerto Rico and uncovered what advocates have long suspected: a mess of dizzying policies, contradictory regulations, and exceedingly complex statutes. Despite the best efforts of well-meaning and devoted educators, these incoherent policies mean that the vast majority of juvenile justice education programs fall short of anything resembling a “school.”

Students in juvenile justice education programs are unlikely to be offered education opportunities aligned with their needs while locked up — and more often than not, they will never enroll in school again when they’re released. 

If state leaders structure policy reforms around coherence within and among these three policies (governance, accountability, and finance), they can meaningfully improve the education provided to students in their care.

Governance

Governance policies define who is responsible for providing (or ensuring the provision of) education services to youth in custody. In at least 26 states, the agency responsible for providing education services in local detention centers is not the same as the agency responsible for education in state-run facilities. In some states, one agency is responsible for providing direct instruction in a juvenile facility, while another agency controls the funding. In California, only youth detained or committed for offenses considered most serious or violent are held at the state’s Department of Juvenile Justice facility, which operates separately from facilities run locally by county boards of education. 

A class-action lawsuit from 2014 shows how inconsistent governance policies can lead to finger pointing and ultimately the abdication of responsibility for student learning. In Contra Costa County, California, the county probation agency was responsible for discipline policy but the county office of education was responsible for educational services. The two entities disagreed on who was responsible for education in restrictive security programs, leaving teachers unable to provide students in solitary confinement with the same modality, quantity, or quality of instruction as their peers. 

Even trying to find and confirm governance policies for our research illustrated the problem: we had to call numerous offices in individual states to cross check competing information. 

Accountability

Accountability policies determine how programs are evaluated and what happens when they aren’t delivering. In traditional districts, agencies use assessment and attendance data, teacher evaluations, school visits, and other data-collection strategies to ensure schools provide a high-quality education. Each education agency then defines the interventions that follow when a program does not meet expectations.

To measure school success, education agencies need to decide on their “measuring stick,” or the kind of data they will evaluate. While traditional education policy conversations still grapple with these questions and acknowledge that there is no one-size-fits-all solution, juvenile justice education programs are light years behind.

Given the governance structures described above, it’s no surprise that juvenile justice education programs interact with many government agencies and are often required to submit data to offices with competing and incompatible goals, requirements, and processes.

Imagine this common reality: Mr. Dewan has students ranging from a 9th-grade to a 12th-grade level in his classroom. Some stay for a few days or weeks, while others stay for a few months — he never has the same group twice. Most of his students arrive without academic transcripts, so he relies on their recollection of past coursework and grades while awaiting prior records from any number of institutions. Over time, some students get shuffled to another facility without notice, while others attend a mandatory court date and never come back. Mr. Dewan doesn’t always know when a student has left the program, so he cannot plan for assessments in advance. The security or probation officers on staff periodically come in and remove a student from Mr. Dewan’s classroom, even when he has no concerns about safety.

Having worked in and with such constraints, we respect how difficult it is to collect data, measure student and school success, and implement effective interventions. That said, a necessary component of any accountability system is defining how programs will be evaluated and what happens when they aren’t delivering for students. Our survey indicates that unlike nearly every other kind of education setting, most states have not defined in statute how juvenile justice education service providers are held accountable. 

Finance

Finance policies explain how states allocate funding to the agencies responsible for operating juvenile justice education programs. The people responsible for overseeing or operating these programs are best positioned to know where funding is needed the most. 

But our research shows that time and time again, the agency in control of finance is not the same as the one held accountable for results, creating a disincentive to allocate the resources necessary for high-quality programming. The greater the disconnect between finance and governance, the greater the chance that funding is not allocated for the right things. 

Beyond defining agency responsibility, there is little transparency about dollar amounts that actually make it to these educational programs. We know very little about how much states allocate for per-pupil funding in juvenile justice education programs. The reality is that students generally arrive at juvenile justice education programs lagging behind academically, in addition to potentially having significant unmet mental, behavioral, and physical health needs. State finance policies must take this reality into consideration and fund juvenile justice education programs accordingly. 

For this population of students, the stakes are too high not to get the fundamentals right. A child in the custody of a state agency is entrusted to the care of the government, creating a heightened moral responsibility (and arguably a legal one) for policymakers to provide that student with the highest-quality educational opportunities.

Read our new report here or view this resource to find out your specific state’s current policies. 

Three Strategies Social Entrepreneurs Can Use to Maximize Impact

Being a social entrepreneur requires an irrational and ambitious belief in the power of one’s work to transform a world in dire need of change. Just look at the scale and degree of change embodied in any social impact organization’s vision and mission statements about the transformation it aspires to create for the communities it serves. 

Education entrepreneurs are no exception. To make the irrational actionable and turn their ambitions into reality, leaders across the sector are increasingly turning to three strategies for impact:

  • Direct Impact: How an organization provides programming directly to its target beneficiaries.
  • Widespread Impact: How an organization builds the capacity of partner organizations to replicate elements of its program model.
  • Systemic Impact: How an organization shifts mindsets, relationships, and power to in turn shift the policies, practices, and resource flows that create stronger conditions for the adoption of an organization’s values, program model, and ultimate vision for change.

These strategies are not mutually exclusive but rather reinforcing and cyclical.

Rooted in promising practices from the education sector, Bellwether’s Pragmatic Playbook for Impact: Direct, Widespread, and Systemic is a practical resource for nonprofit decision-makers to maximize their impact, further equity, and respond to the urgency of this moment. The playbook covers:

  • The design considerations in first developing a Direct Impact model.
  • The reasons more organizations are considering expanding into Widespread Impact.
  • Widespread Impact design decisions and different models organizations can consider in prioritizing breadth versus depth of impact.
  • How organizations maximize fidelity of implementation of more intensive Widespread Impact models.
  • How organizations extend their work into Systemic Impact strategies to create the conditions necessary for their program model to achieve and sustain scale.
  • How organizations balance work across these three impact strategies — including aligning it with their theory of change, building out the organizational capabilities to execute across these strategies, and understanding how these strategies impact financial sustainability.
  • How organizations can measure their Widespread Impact. 
  • Three case studies showing how education nonprofits — including Envision Education/Envision Learning Partners, Saga Education, and uAspire — are effectively implementing Direct, Widespread, and Systemic Impact strategies in the field.

Social change is daunting, and this work isn’t easy. These resources can help education entrepreneurs across the country accelerate their impact as they work tirelessly to improve life outcomes for students.

To learn more, click here.

Celebrating AAPI Heritage Month

Photo by Allison Shelley/The Verbatim Agency for EDUimages.

In honor of Asian American & Pacific Islander (AAPI) Heritage Month, we invited Bellwether team members to reflect on how their heritage shapes their lives and/or the impact and legacy of an AAPI historical hero.

If you are Asian-American or Pacific Islander, how has your heritage impacted your life? Are there any special traditions that you or your family participate in?

Julie Nguyen, design & visual associate

As a first-generation Vietnamese-American, I know how valuable it is for kids to receive opportunities in education. My parents immigrated to the U.S. when they were only 14 and 15 years old, finishing only some high school before transitioning to the workforce. Knowing this has only ignited my fire to achieve great (and beautiful) things in this life. I carry that strength and sacrifice with me, and am grateful for and proud of those who came before me.

Krista Kaput, senior analyst, Policy & Evaluation

My grandma was born and raised in Japan, and met my grandpa when he was stationed in Fukuoka. I was raised to always be proud of my Japanese heritage. Growing up, I was taught how to make sushi and sukiyaki, and I also learned how to do origami. I also had the privilege to visit Tokyo and Kyoto a few years ago, and talking with my grandma about that experience was very special. As I’ve gotten older, we’ve had more honest conversations about her life growing up in Japan during and post-World War II and raising biracial sons in America, which have been pivotal in my life. I am so proud to be her granddaughter.

When you think of Asian-American or Pacific Islander historical heroes (people no longer living today), who is someone you think of? Why is their legacy important to you and important more generally?

Titilayo Tinubu Ali, partner, Policy & Evaluation

Detroit activist, philosopher, and writer Grace Lee Boggs’ legacy offers so much wisdom that I find relevant to our work of dramatically improving education and life outcomes for systematically marginalized youth. She spoke of how those of us who seek transformation have a responsibility to keep growing, learning, and transforming ourselves. Her legacy calls us toward steady accountability and self-reflection so that we never lose sight of doing the internal and interpersonal work of transformation while we are shaping change “out there.”  

She also spoke of how “movements are born of critical connections rather than critical mass.” When challenges in the education sector feel insurmountable, her writings remind me that big changes start small and that there are lessons to learn in communities, classrooms, homes, and the tiniest units of change — even when we’re seeking to shape change at scale.

Understanding Parents Requires More Than a Single Poll Result

In statistics, it’s often said that “all models are wrong, but some are useful.” When it comes to polling parents on K-12 schooling, it’s similarly true that while no single result may be “right,” it can be useful — particularly when considered in the context of other polls.

It’s always important to consider how new polling data points fit into longer-term trends — something that’s exceptionally true in public opinion research. Bellwether’s new Parent Perception Barometer aggregates national polling data to provide a more nuanced perspective on parents’ complex opinions. It’s also a tool to guard against the temptation to put too much emphasis on the most recent poll.

A recent NPR/Ipsos survey about parents’ thoughts on schools provides an excellent reminder of why context matters when considering the results of new polls. This particular survey asked parents how much they agree with the following statement: “My child has fallen behind in school due to the pandemic.” Thirty-two percent of parents agreed with the statement. 

Looking at this data point in isolation, we might infer that two-thirds of parents don’t think the pandemic has negatively impacted their child’s academic progress. But examining this data in the context of other polls changes its interpretation.

Recent polls tracked in the Parent Perception Barometer consistently indicate that a majority of parents have been concerned about their child’s academic progress throughout the pandemic. As of March 2022, data from National Parents Union/Echelon showed 66% of parents worry “a lot” or “some” about their child staying on track in school.

Data visualization courtesy of Bellwether’s Parent Perception Barometer.

Using the barometer, we can more easily identify key differences in the phrasing of the NPR/Ipsos poll that help inform how we interpret its data, along with the results of other polls:

  • Wording matters. A key distinction between the NPR/Ipsos poll and others is the difference between a parent’s “perception of” their child’s academic performance (NPR/Ipsos) and parents’ “general worry or concern about” their child’s academic performance (National Parents Union/Echelon). There are multiple explanations for why these two constructs may produce different results. A parent could be concerned about their child’s academic progress while also believing that their child isn’t falling behind. Cognitive biases may also limit parents’ willingness to tell a pollster that their child has fallen behind in school. Examining the nuances in survey item phrasing can help tease out when different polls are testing similar — or in this case, different — phenomena.
  • Reference points are important. Survey questions often ask about abstract concepts. For example, asking parents whether their children have “fallen behind” or “are off track” may mean different things to different parents. Should “falling behind” in school be interpreted as a comparison to others in their peer group, to the state’s academic standards, or to where the child would have been academically absent a pandemic? Some polls try to define the reference point by asking “compared to a typical school year” or “ready for the next grade,” but others (like the NPR/Ipsos poll) leave more room for interpretation by respondents, which can muddle results.
  • The timing of surveys can influence responses. In addition to what is asked in a survey, when the survey is administered can influence results. In the chart above, there’s a noticeable trend where parents report less concern about their child’s academic progress during the summer, only for those concerns to rebound during the academic year. A USC poll asked parents about how “concerned” or “unconcerned” they are with the amount their child learned this year compared to a typical school year. In a survey administered in April through May 2021, 64% of parents reported being concerned, compared to only 50% in June through July 2021. National Parents Union/Echelon polls illustrate similar declines over the summer in parent worry. This is less relevant for the NPR/Ipsos poll, but is worth considering as new data are released.

Given these considerations, which poll is “right”? The truth is, absent obvious flaws in the survey design — like biased phrasing or leading questions — most polls provide some useful information. When polls ask slightly different questions on a given topic, understanding the relationships between item phrasing and response data can help analysts derive more robust insights. 

Differing results among polls aren’t a flaw, but a feature. Tools like the Parent Perception Barometer separate the signal from the noise in assessing what parents actually think about K-12 schooling.

Tracking Parents’ Complex Perspectives on K-12 Education

Photo courtesy of Allison Shelley for EDUimages

Every policy wonk loves a good poll, and education policy wonks are no exception. Polls give added depth and dimension to an array of current (and shifting) public opinions, attitudes, and needs. But too often, wonks tend to over-index on the latest, flashiest data point as new polls are released — making it difficult to examine the broader context of other polls analyzing similar data points, or to contextualize prior administrations of the same poll.

The recency bias associated with new polling data is a persistent problem in fully understanding how parents think about K-12 education across the country. Contrary to media-driven hype, parents have diverse viewpoints that don’t fit broad narratives offered by pundits. Just as children and circumstances change over time, so do parents’ opinions on what their child needs. And to say that the COVID-19 pandemic brought change to parents and to their children’s educational needs is an understatement — one that underscores the need for a deeper examination of how parents’ views on K-12 education have (or haven’t) changed since March 2020.

Alex Spurrier, Juliet Squire, Andy Rotherham, and I launched the Parent Perception Barometer to help advocates, policymakers, and journalists navigate the nuance of parents’ opinion about K-12 education. The interactive barometer aggregates nationwide polling and other data on parents’ stated and revealed preferences regarding their children’s education. The first wave of polling data indicates that parents are largely satisfied with their child’s education and school, but many have specific concerns about their child’s academic progress as well as their mental health and well-being. As parent opinions aren’t static, the barometer will be updated on a regular basis with the release of new polling data.

There are multiple benefits of aggregating this polling data in the barometer: 

  • First, it allows us to examine emerging or persistent trends in the data. Looking at the same question asked over multiple time periods as well as similar questions asked from different polls separates signal from noise. 
  • Second, it shapes a holistic consideration of a body of relevant data, tempering the pull of recency bias that comes with each new poll’s release. 
  • Third, by analyzing similar poll questions, we can identify data points that may be outliers. For instance, if three polls asking a similar question all indicate that parents strongly favor a particular policy, and a fourth poll indicates otherwise, we may look more closely at that poll’s wording and be more cautious about the types of statements or conclusions we draw.
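The outlier check described in the last bullet can be sketched in a few lines of Python. The poll names, percentages, and the 15-point threshold below are hypothetical, chosen purely for illustration — they are not values from the barometer:

```python
from statistics import median

# Hypothetical poll results: percent of parents favoring a policy.
polls = {
    "Poll A": 68,
    "Poll B": 71,
    "Poll C": 66,
    "Poll D": 42,
}

def flag_outliers(results, threshold=15):
    """Flag any poll whose result deviates from the median of all
    polls by more than `threshold` percentage points."""
    mid = median(results.values())
    return [name for name, pct in results.items()
            if abs(pct - mid) > threshold]

print(flag_outliers(polls))  # → ['Poll D']
```

In this sketch, three polls cluster in the high 60s while the fourth sits 25 points below the median, so only that poll is flagged for a closer look at its question wording — the same judgment call the bullet describes, just made explicit.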

The Parent Perception Barometer provides several ways to support a comprehensive analysis of parents’ perceptions. For those most interested in exploring data on a single topic across multiple sources, the Data Visualization tab provides a high-level summary of recent trends in parents’ stated and revealed preferences. For those looking for more technical background on the polls and data, information about specific polling questions, possible responses, and administration dates can be found within the Additional Detail tab. The barometer also allows users to view and download underlying source data. 

The Parent Perception Barometer is a valuable resource to ground policy and advocacy conversations in a nuanced, contextual understanding of parents’ opinions — bringing clarity and context to the K-12 education debate.