Mary Kadera

Get the lead out

10/20/2024

 
Imagine you live in a small, fictional country. Your government cares about health and the environment, and it's recently issued new regulations about lead contamination. 

Lead, as you likely know, can cause all kinds of human health problems--from headaches to high blood pressure, memory problems, and miscarriage. In young children, lead poisoning causes developmental delays, learning difficulties, seizures, and more.

So, your government is going to require every neighborhood and community to reduce the level of lead in the water, in the soil, in buildings, and in the air. Lead levels will be measured through a variety of tests, starting this year, and in 12 months each neighborhood and community will be publicly rated on how much lead is present. Communities with the highest lead levels will have to enter into a special agreement with the government and cede local control of environmental services, waste management, water treatment, and parks.

What kind of tests will the government use? Well, they haven't developed all of them yet. But by this time next year, they'll be in force and your rating will be published. 

What kinds of funding or assistance will the government provide to help communities reduce lead pollution? Well, they haven't worked that part out yet.

If your community loses some of its local control, what exactly will that mean and how long will it last? Well, the special agreements haven't been drafted yet.

But you'd better get started with that cleanup, because it's going into effect one year from now.

Maybe you live in a wealthy neighborhood. Individually and collectively, you can pay for companies to come in and work on the problem. You don't want your neighborhood to earn a failing grade, right? What would that do to home values?

But maybe you live in a neighborhood that's less well-to-do. The houses are older and smaller, and many likely still contain lead-based paint. You live closer to the municipal waste incinerator, which for many years contaminated the soil, air, and water nearby.  You're going to start with more lead pollution, and your neighborhood has less money to fund its own clean up effort.



Of course you don't live in a small, fictional country, and nobody is testing your yard or your blood for lead levels. But something very similar is happening in the state of Virginia to our public schools.

Recently, the Virginia Board of Education approved a new School Performance and Support Framework to evaluate how well schools are meeting academic expectations. The new Framework replaces the current accreditation system, which critics have charged is not rigorous or transparent enough. The Framework will rate each school as Distinguished, On Track, Off Track, or Needs Intensive Support. The Department of Education will publish these ratings next fall. School divisions with a certain percentage of schools rated "Needs Intensive Support" will be required to enter into a Memorandum of Understanding (MOU) with the state.

Whether or not you believe we need a more rigorous accountability system, you've got to admit (I hope) that we need one that's fully built. 

What assessments will the state use to gauge academic performance? Well, some of them exist today but some haven't been created yet. Even the most familiar, the Standards of Learning, will be changing this year as new standards for Math and English Language Arts roll out. (It's worth noting that Virginia is introducing the new math standards and the new math SOL test in the same school year; normally, teachers get a year to familiarize themselves with new standards, and the related test arrives the following year.)

What funding and resources will the state be providing to schools that are rated Off Track or Needs Intensive Support? We don't know yet.

What will the Department of Education put in the MOU it will require for the lowest performing schools and districts? No one has seen it yet. 

It's a little like not knowing if you're actually going to earn your college degree because the university hasn't figured out its degree requirements--even though you're starting your senior year and you have less than 12 months before you're supposed to graduate. 

Or like the lead testing in my imaginary country. Similar to the less affluent neighborhoods in my example, there are schools and school divisions in the Commonwealth that will be disproportionately affected by the new academic accountability system. 

Under the new system, students who are English Learners will "count" towards a school's overall accountability rating after just three semesters. Federal law requires that school divisions test all EL students after three semesters, but until now in Virginia we've used those tests to check for growth of content knowledge, not mastery of the content. After just three semesters, an EL student's science SOL score may say more about the student's understanding of English than their understanding of the science content being assessed. For this reason, many states that count EL scores after three semesters will test students in their native language; Virginia does not allow this.*

In APS, 27% of our students are English Learners. The state average is 9%. In several of our schools, EL students comprise 50% or more of the student body (Carlin Springs and Randolph are highest at 75% and 59% respectively). What will those schools' ratings look like when compared to schools like Jamestown (2% EL) or Discovery (4% EL)? EL students and families are also often economically disadvantaged. The extra supports that more affluent families tap into, like private tutoring, disability screening, occupational and speech therapies, and more, are beyond the reach of many immigrant families, which exacerbates the gaps across and within school communities.

In Arlington, most of our school funding is locally generated, with the state kicking in just 14% of the school division's revenue. In other, less affluent counties in the Commonwealth, the state contributes two-thirds or more of the school division's operating funds. 

I imagine that if the state does not appropriate additional, dedicated funds to support schools labelled Off Track or Needs Intensive Support, school divisions like Arlington will figure out another way to finance the extra staff and resources we'd need for these schools. But what if you are Lee County in Southwest Virginia, and 70% of your education funding comes from the state? Or the City of Petersburg, which is 63% funded by the state? (It's worth noting that Petersburg schools have been operating under an MOU with the Virginia Department of Education since 2004. The state's involvement for 20 years has not yielded significant improvement.) 

I'm all for accountability (just like I'm all for lead-free neighborhoods). There are parts of the new school accountability framework that I believe could be beneficial. And in our country, state, and community, we do need to act with greater urgency to close chronic opportunity and achievement gaps that can have lifelong consequences for English Learners, students with disabilities, and our Black and Brown students. 

But "getting the lead out," if the solution is not crafted with care, is in the best case blundering and ineffective. In the worst case, it will do real damage to those we ultimately set out to serve.



*The issue of when and how to test English Learners on subject area content knowledge is complex, and (at least in my study of it) does not lend itself to blanket guidelines that dictate how "all" EL students should be assessed. Variables include the language in which instruction was delivered; whether a student is fully literate in their native language, which is not the case for many students whose formal education was limited or interrupted before coming to the US; a student's English language proficiency (e.g., are they at WIDA Level 1 or WIDA Level 3?); and more. It's also not clear to me that a blanket "number of semesters" rule makes sense. Three semesters may well be too soon for some EL students' scores to "count" as a measure of how well a school is equipping students with content knowledge. Eleven semesters (or five and a half years) feels to me like it would be too long in many cases.

I note in WestEd's 2019 independent evaluation of APS's English Learner program that a large percentage (44%) of middle school English Learners had been classified as English Learners for five or more years, and a large percentage (40%) had been English Learners since Kindergarten. Since that time, of course, APS has made significant changes and investments in English Learner instruction as part of its 2019 settlement agreement with the Department of Justice. 

We are all investors

5/10/2024

 
It was time for a new elementary math curriculum in Traverse City, Michigan, and the school district decided to take a pretty unconventional approach to making its selection.

This is no small matter, as purchasing curricula (including print or digital textbooks, workbooks, and other components) can cost millions of dollars, and districts typically only make this investment every five to ten years.

In Traverse City, the curriculum adoption committee had narrowed it down to three new curricula, each of which was backed by research. But here’s where things get interesting: the district then decided to run a year-long pilot study of all three curricula and include a control group of students who would continue to use the existing materials.  Principals, teachers, and district leaders ran the pilot together.

At the end of the year, they found that only two of the curricula produced statistically significant improvements. They could then compare the financial costs of the two products, and they had practical wisdom from teachers who had implemented each product in the classroom to inform the decision about which product to select, what components to purchase, and how to roll it out to the rest of the district.

The associate superintendent overseeing math instruction called it “the best experience of my career.” One school board member shared, “For the first time in my board tenure, I feel that decisions have been rooted in objective information.”

This is one example of Academic Return On Investment (A-ROI), a collection of practices that many school districts are adopting to make more strategic decisions about how to invest their funds and how to evaluate the impact of their programs.

The ABCs of A-ROI

I’ve been learning about A-ROI from sources including the Government Finance Officers Association, the District Management Group, and Education Resource Strategies.

The question at the heart of A-ROI is: What does the most good, for whom, and at what cost?

Districts are using A-ROI to adopt new programs and initiatives, like in the math curriculum example shared above. Often, they run limited pilots before they scale implementation across a whole district.

Districts also use A-ROI to evaluate the return on investments they’ve already made, ensuring that existing initiatives are worth the time, money, and effort being expended.

Because staffing comprises the largest part of any school district’s budget, it’s important to capture the amount of staff time a particular program or initiative requires, as part of its overall cost; this is challenging, but not impossible, to do. There are formulas, tools, and templates available from districts that have already begun this journey.

That said, because A-ROI is intense, districts can’t analyze everything. Often, they choose to focus on the programs that consume the most resources, or where they’ve identified that a number of programs overlap and there might be redundancy.

The “Ugly Christmas Tree” in Boulder Valley, Colorado

A few years ago, Boulder Valley School District was struggling with the same problem that a lot of school districts face: in a well-intentioned effort to support as many students as possible, it had layered one initiative on top of another, creating what one former district leader calls “the ugly Christmas tree” effect: “too many decorations that, while individually well-intended, don’t work well together and weigh down the very thing they were intended to support.”

An initiative inventory confirmed some suspicions: school staff were trying to implement 251 initiatives from 28 teams across nine departments in the central office. Over the next six months, the district worked to glean as much information as it could about
  • the students served by each program
  • its known outcomes
  • its fully loaded costs, including allocation of staff time
  • and its connection to other efforts.
In parallel, district leaders surveyed school principals to gauge the perceived value of each program, and for which students. They also asked principals about the implementation status of each program and whether additional support was needed to implement it effectively.

This didn’t instantly fix the problem—but it gave Boulder Valley a good place to start. The district is using this inventory to create a roadmap for when and how it will conduct more thorough analyses of specific initiatives as a regular part of its ongoing operation.

Five Tips I’ve Learned From Districts Who’ve Done It

1. Be clear at the outset about what “success” looks like. When a new initiative is proposed, specify the outcomes that will be measured, by whom, and when. Make sure everyone knows what data would be considered proof of success later on.

2. Combine evidence-based decision making with cost-benefit analysis. Evidence-based decision making says “Wow! This program delivers great results!” Cost-benefit analysis says, “Yeah, but it costs sixty gazillion dollars per student. What if we could get 70% of that same benefit with a program that costs a little less, and allows us to work on this other instructional need, too?”
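As a toy illustration of that trade-off (all program names, effect sizes, and dollar figures below are invented for this example), a cost-effectiveness comparison might look like:

```python
# Hypothetical A-ROI comparison: both programs "work," but their
# cost-effectiveness differs. Every number here is made up.

programs = [
    {"name": "Program A", "gain_per_student": 10.0, "cost_per_student": 1200.0},
    {"name": "Program B", "gain_per_student": 7.0,  "cost_per_student": 300.0},
]

for p in programs:
    # Gain (e.g., assessment points) purchased per $100 spent per student
    p["gain_per_100_dollars"] = p["gain_per_student"] / p["cost_per_student"] * 100

# Rank by cost-effectiveness rather than raw effect size
ranked = sorted(programs, key=lambda p: p["gain_per_100_dollars"], reverse=True)
for p in ranked:
    print(f'{p["name"]}: {p["gain_per_100_dollars"]:.2f} points per $100')
```

In this made-up case, Program A has the bigger raw effect, but Program B delivers almost three times the gain per dollar, which is exactly the kind of distinction evidence-only decision making misses.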

3. Don’t be afraid of pilot tests. I’ve said in a school board meeting, “We can’t run pilot tests” and here is where I eat my words. We can and probably should. It’s the best way to reduce the risk of a district spending too much of its money and students’ and teachers’ time on an intervention that doesn’t work.

4. Beware the sunk cost fallacy. Only the likely future benefits and costs of a program—not the sunk costs—should be considered when making a decision on whether to invest in a program going forward. I think of this as the “bad boyfriend” cognitive bias. Yeah, you’ve been with him for four years. You’ve invested a lot of effort. But girl, it’s still time to go.

5. Don’t make it just about cutting costs. A-ROI might yield budget savings, but it’s ultimately about making the best instructional decisions for the district’s students. As such, it’s a process a district should run completely separately from its budget development cycle. Some districts also adopt formal policies stating that no employee will lose their job as a result of the findings of an A-ROI analysis. That analysis can lead to any one of the following results:
  • Wow. Great value. Let’s expand this program.
  • Delivering really well for some student populations: let’s use it in more targeted ways.
  • Results aren’t clear: let’s continue to monitor for X period of time.
  • We’ve uncovered this flaw: let’s fix that flaw and reevaluate in X period of time.
  • Let’s abandon this program.
 
In my work as a school board member, I can appreciate how intensive A-ROI would be to implement. But I am triply certain that A-ROI or a discipline very much like it is absolutely essential. We have to exercise this discipline if we are to be good stewards of taxpayer dollars, if we want to avoid overburdening educators with low- or no-value initiatives, and—most important—if we are really committed to providing the best education to our communities’ youngest citizens.

Like. A. Boss.

6/30/2023

 
Jamestown, Randolph, Carlin Springs, and Claremont. At these four APS elementary schools, something really interesting is happening: for two years in a row, these schools have produced double-digit growth in reading.*

APS uses a tool called DIBELS to measure the development of early literacy and early reading skills at the beginning, middle and end of the school year. I got curious about this question: what percentage of K-5 students are in the DIBELS “green zone” of proficiency in September, and how many more (or fewer) are testing at that level by the end of the year?

In eight of our 25 elementary schools, the number of students at or above proficiency increased by at least 10%. And at Jamestown, Randolph, Carlin Springs and Claremont, that happened two years in a row.

These schools had quite different journeys: one of them started with just 28% of its students in the green zone in September 2021, while another was at 73%. One of them is a Spanish immersion school while another is a neighborhood school. One is part of the International Baccalaureate program and another operates with a community school model. At one school 72% of the students are Latino, and at another they comprise only 8% of the student population.

So what do these schools have in common—is there a “secret sauce” that’s driving this growth? In APS as in other districts, school and division leaders look at a number of contributing factors, including what kinds of curricula and instructional practices teachers are using; having a highly skilled and high-performing school staff; access to resources that meet the unique needs of that school’s population; and more. It’s not always easy to identify with absolute certainty what’s working, but it’s really important to try, so that those factors can be sustained and potentially spread to other schools.

Some other things I noticed in the DIBELS reading data:

1.  We have a few schools that experienced significantly more growth in 2022-23 than they did the previous year. What changed? (These schools are Ashlawn, Drew, and Escuela Key. Nice work!)

2.  We have a few schools that showed impressive gains for students with disabilities, while at other schools the percentage of that population in the green zone stayed flat or even decreased.**  This makes me wonder three things:
  • whether there are staffing challenges at certain schools (nationwide, school districts are having trouble recruiting and retaining special education teachers)
  • how to navigate the nuance in special education reading data (for example, some students receive special education services and are also English learners. One student could have an IEP, be an English learner and be identified as gifted.) and
  • how to factor in the fact that some families can afford supplemental private services.

3.  Ten schools showed double-digit growth in the number of Latino students reading proficiently last year, led by impressive 15% growth at Drew. The year before, the growth in proficiency for Latino students at Drew was just 4%. IMO, we should be congratulating the Drew community and asking, “How did you do that?”

4.  Ditto for English learners at Drew: an impressive 16% growth in proficiency over last school year.

5.  Ditto for the 12% growth in the number of Black students proficient in reading at Drew. Oakridge was a close second with 11% growth.


Some of you will take issue with my focus on growth measures. A hypothetical example: 20% growth in a single year may mean that just 10% of students in School XYZ were reading proficiently in September, and by June only 30% are doing so. 30% is still not where we want to be.
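That arithmetic, sketched in code (the school and all numbers are hypothetical), shows why a growth figure and an absolute proficiency level tell two different stories:

```python
# Hypothetical DIBELS-style snapshot for "School XYZ" (numbers invented).
sept_proficient = 0.10  # 10% of students in the green zone in September
june_proficient = 0.30  # 30% in the green zone by June

# Growth here means the change in percentage points, not a relative increase.
growth_points = (june_proficient - sept_proficient) * 100

print(f"Growth: {growth_points:.0f} percentage points")  # impressive growth...
print(f"June level: {june_proficient:.0%}")              # ...but a low absolute level
```

Both numbers are true at once: 20 points of growth in a single year is remarkable, and 30% proficiency is still far from the goal.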

I appreciate this concern and wholeheartedly agree that our work isn’t done until 100% of our students at every school are reading proficiently. But for a variety of reasons, some of them not directly under a school’s control, that journey is longer for some school communities than others. (Consider, for example, that at some sites about 30% of kindergarteners start school with entry-level proficiency on DIBELS, while at other schools more than 70% do so.)

This is why school- and student-level growth measures are so important. They honor the hard work that is happening in certain schools even if they’re further from the finish line. Our commitment should be to support and accelerate that growth as much as possible.



*More precisely: APS uses a research-based tool called DIBELS to measure the development of early literacy and early reading skills. DIBELS yields really important data about individual students’ skills, but it can also show us what’s happening at the school level: for example, in this dashboard showing the percentage of students performing at different levels. For simplicity’s sake, the stoplight colors are useful: green generally equates to “proficient” and “above proficient.” Yellow and red are more concerning.


**Note: I am not a statistician but I do understand the general idea of statistical validity. My nod to this concept is: I did not analyze subpopulations that comprised less than 10% of the overall school population.

Are you ready for the end of average?

9/20/2022

 
You’re 21 years old, married to your high school girlfriend and already a father to two young boys. You dropped out of your high school in small-town Utah midway through your senior year because your principal told you and your parents there was no way you would graduate with a 0.9 GPA. You never really enjoyed or felt successful at school.

To support your family, you’ve worked nearly a dozen minimum-wage jobs and you rely on welfare checks to help keep your kids clothed, housed and fed. Your latest job? Administering enemas to residents in a nursing facility, a job you took because it pays $1 more per hour.

What’s going to happen to you, your wife, your kids?

If you’re Todd Rose, whose story this is, here’s what happens.

Your dad persuades you to get your GED. Your parents and in-laws scrape together money to help you enroll in night classes at the local college. Eventually, you graduate pre-med, earn your doctorate at Harvard, and become a Harvard professor.

At Harvard, Rose founded the Laboratory for the Science of Individuality. In 2016, he combined his personal story and his research in The End of Average--a book that rocked my world. (And no, that’s not hyperbole.) It’s changed the way I think about education.

Rose opens the book with a problem that puzzled the U.S. Air Force in the 1950s: multiple, mysterious accidents that could not be explained by pilot error or mechanical malfunction in the aircraft.

They eventually discovered the cause: the cockpits had been designed using the average range of 10 body measurements (height, thigh circumference, arm length, and so on) from a population of approximately 4,000 pilots. But zero pilots were “average” across all ten measurements. If a cockpit was designed for an average pilot, the cockpit fit no pilot. So the Air Force banned the average and forced jet manufacturers to design “to the edges,” meaning a cockpit that would be adjustable for even the tallest, shortest, thickest and thinnest.
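That finding can be sanity-checked with a toy simulation. This sketch assumes independent, normally distributed measurements (real body measurements are correlated, so treat it as an illustration, not a reproduction of the Air Force study):

```python
import random

random.seed(42)
NUM_PILOTS = 4000
NUM_MEASUREMENTS = 10

# Simulate standardized body measurements (mean 0, sd 1) for each pilot.
pilots = [[random.gauss(0, 1) for _ in range(NUM_MEASUREMENTS)]
          for _ in range(NUM_PILOTS)]

# Call a measurement "average" if it falls in the middle 30% of the
# distribution: roughly |z| < 0.385 for a standard normal.
def is_average(z):
    return abs(z) < 0.385

average_on_all = sum(all(is_average(z) for z in p) for p in pilots)
print(f"Pilots 'average' on all {NUM_MEASUREMENTS} measurements: "
      f"{average_on_all} of {NUM_PILOTS}")
```

With ten independent measurements, the chance of landing in the middle 30% on every single one is about 0.3 to the tenth power, a few in a million, so out of 4,000 simulated pilots essentially nobody is average across the board.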

What does this mean for education?

Think of a classroom or school designed for “the average.” It would likely feature
  • One style and size of student desks
  • Lots of whole-group instruction
  • One way for students to demonstrate what they know—e.g., a multiple-choice end-of-unit test that every student must take
  • Seat time: a standard number of hours all students must log to get a class credit.
  • One-dimensional, high-level reporting against an average: “I am a B student in math because I am above average.”
  • You’re gifted. Or not.
  • Rigid tracking systems where students are sorted based on performance relative to an average (that is, you are “honors track” or “remedial track” in most or all of your classes)
  • Standard operating procedures: all students are expected to eat at an assigned table in the cafeteria, walk silently in a straight line, and take notes in a certain way.

This was Todd Rose’s K-12 school experience (and maybe yours, too). It wasn’t until college, when he discovered an honors program built around inquiry and the Socratic Method, that he felt inspired and challenged. Rose says, “I gradually realized that if I could just figure out how to improve the fit between my environment and myself, I might be able to turn my life around.”

In The End of Average, Rose explores the ways that none of us is really “average.” Instead, he argues, each of us has “jaggedness”— a unique set of strengths and weaknesses that all too often get obscured when we use overly simplistic, one-dimensional measurements.

Here's an example. Which man is bigger?
[Image: the "Bigger Man" graphic]

Here's another example: Which 9th grade English student is smarter?
[Image: the Jagged Learning Profile graphic]

​Rose says, “If we want to know your intelligence, we give you an IQ test that is supposed to tap a range of abilities, but then we merge that into a single score. Imagine two young students have the same IQ score of 110 — the exact same number. One has great spatial abilities but poor working memory, and the other has the exact opposite jaggedness. If we just want to rank them then we could say the students are more or less the same in intelligence because they have the same aggregate scores. But if we wanted to really understand who they are as individuals enough to nurture their potential, we can’t ignore the jaggedness.”

"Right now because we believe in the myth of average, we believe that opportunity means providing equal access to standardized educational experiences,” Rose says in a Harvard interview. “However, since we know that nobody is actually average, it is obvious that equal access to standardized experiences is not nearly enough… it requires equal fit between individuals and their educational environments.”

What would a school or classroom committed to equal fit include?
  • Flexible seating
  • Dynamic grouping of students based on the level of support they need to master a particular skill or topic during that day/week/month
  • Multiple ways for students to demonstrate what they know
  • Multiple styles of instruction: project-based learning; workplace apprenticeships; virtual learning; etc.
  • Self-assessment and reflection: helping students understand their own strengths and weaknesses
  • Multiple categories of giftedness
  • More nuanced assessment and reporting. Instead of “I am a B student in math because I am above average,” a student could say, “I worked on these six math standards this quarter and here’s information about how well I understand each one.”
  • Flexible pacing: students can take the time they need to master a particular concept or skill. As soon as they’re ready for something more challenging, they move on.
  • Advancement based on competency instead of seat time.


School doesn’t have to feel like a 1950s Air Force fighter jet cockpit. Indeed, it can’t. For Rose, this is a social justice issue, it’s an economic imperative, and it’s deeply personal. “I know what it feels like, at least in my context, when you don't fit into the current system. Like the kid who is always feeling … worthless. And I also know what it means to find your fit—to actually find your potential and your calling in life. It leaves me with this sense that from the so-called bottom to the top of our academic system, there's an enormous amount of talent and potential and contributions waiting to be tapped.”


​Images of the Rose family are from the Flip Your Script podcast website.
The "Bigger Man" graphic is from Todd Rose's TEDx talk.
The Jagged Learning Profile graphic is from Masters in Data Science.

grading and getting ahead at school

4/15/2022

 
This week my brother and I have been enjoying some Spring Break vacation time with my parents, and we've been remembering our childhood report cards from Fairfax County Public Schools.

In the (ahem) 1970s, my teachers wrote out report cards by hand on tissuey, airmail-like paper. I received both letter grades and rankings of “O” (Outstanding), “S” (Satisfactory) and “U” (Unsatisfactory) on a whole range of things including cooperation, self-control, carefulness, and the infamous “plays well with others.” I’ve seen other report cards from the same era that included grades for cleanliness, teeth, posture and more.

As a teacher in the 1990s and more recently as a parent, I’ve sometimes wondered whether I’m thinking about grading and assessment and advancement through school in ways that are too limited. Is my own brain stuck thinking about ideas that are outdated? Put another way: Am I worrying about what features we could add to make a better 8-track tape player while other people are checking out virtual- and augmented-reality music concerts and festivals?

Where my brain is.
(Yes, the 1970s again, courtesy of the Columbia Record & Tape Club.)


Where maybe my brain should go.
(The Black Eyed Peas toured in augmented reality in 2018.
Cooler people than me already knew things like this were going on.)

To expand my own thinking, I like to check out what other school districts and states are doing. There’s an approach that I’ve been learning about that I think is really interesting, and maybe you will, too.

I want you to imagine an education system that is designed to allow students to move at their own pace as they’re able to demonstrate mastery. In this system, students aren’t rigidly organized into groups (classes and grade levels) based on their birth dates, nor are they required to clock a certain number of hours in the class to receive credit (in secondary education, this is called the Carnegie Unit).

This system exists, and it’s called competency-based education (CBE). More than 30 states, and additional individual schools and districts, are either exploring or already implementing CBE. New Hampshire was one of the early adopters, having abolished the Carnegie Unit in 2005. In its place, the state mandated that all high schools measure credit according to students’ mastery of material rather than seat-time hours spent in class.

More recently, other states have launched or expanded CBE because of COVID. Vermont, Michigan, Utah and Rhode Island are among the states that have responded to pandemic disruptions in this way. The Hunt Institute writes, “States and districts have an opportunity to rethink the structure of their education system and consider building systems that are flexible, engaging, and equitable during these difficult times. CBE can provide students the opportunity to gain a personalized learning strategy that meets individual student need through an equity lens.”

Sandra Moumoutjis, an administrator affiliated with a lab school network in Pennsylvania, writes: “As we continue our third year of school affected by a global pandemic, we are not the same as we were before. Our normal way of doing school did not prepare us to support students, families, and teachers when everything changed. We are now forced to reckon with the glaring inequalities of our one-size fits all, grade-based, age-based, and time-based traditional school structures.”

Eric Gordon, the head of Cleveland’s school district, told his school board that by replacing the normal time-bound, traditional grade levels, students would be in a better position to catch up, learn what they need and not feel stigmatized by having to repeat a grade. “We’ve got opportunities here to really test, challenge and maybe abandon some of these time-bound structures of education that have never really conformed to what we know about good child development,”  he said.

Here’s what I really like about CBE: it’s a system that is designed to fit the student, rather than expecting students to fit the system. Are you ready for more advanced material and more challenge in one area because you’ve demonstrated mastery? Then you can move on. Need more time in another? That’s OK too. You are not “bad at” a certain subject simply because you aren’t marching in lockstep with your same-age peers in all subjects at all grade levels. (That said—it does require some monitoring and effort to identify and support students who aren’t making “reasonable progress,” which may signal a need for disability screening.)

The National Center for Learning Disabilities has stated, “One advantage of CBE is that it recognizes that all students have strengths and challenges and learn best at their own pace, sometimes with supports. The flexibility and individualization of CBE is also at the heart of effective instruction for students with learning and attention issues and is a core tenet of many special education laws.”



What does it look like in practice?

One practice (already familiar to many Montessori families and educators) is multi-age grouping, or grade bands. For example, instead of Grade 1, a student is enrolled in a “lower elementary” or “upper elementary” group. In its coverage of grade-banding in New Hampshire, Education Week reports: “When provided opportunities for learning within their developmental sweet spot (where they were challenged but not in over their head), students made tremendous progress."

This was reinforced from the perspectives of both students and parents. One parent in Pittsfield, NH, commented, “I was skeptical at the beginning of the year that this room was going to work for Z... He still struggles, but I feel that he has made great improvements both academically and socially this year. I think his confidence is boosted when he is paired with kids that are at his level, and the curriculum is meeting him at his level. I really like the concept of this classroom.”

A few years ago I got to visit Parker Charter Essential, a school outside of Boston that serves middle- and high school-aged students. Parker uses mastery-based progression to move students from Division 1 (roughly grades 7 and 8) through Division 3 (roughly grades 11-12). In Parker’s performance-based promotion system, students usually take four semesters per division, but students can move at a pace that’s appropriate to them, sometimes advancing to the next division more quickly in certain subjects and more slowly in others.

Gateway portfolios “make the case” for promotion to the next level and are featured at public exhibitions of the student’s work. Portfolios typically include multiple examples of high-quality student work products, accompanying feedback and rubrics, and a reflective cover letter. Matt, a senior at the school, told me, “Every student has control over their own learning. I can take as much time to master the curriculum as I need.”

Instead of letter grades, students at Parker receive detailed quarterly narrative progress reports in each class. The guiding question for these reports is, “What can the student do, and under what conditions can she do it?” At the end of the student’s junior year, the staff assembles a final narrative that draws from each of these quarterly progress reports across grades 9-11. The academic dean, the student, and the student’s family all have the opportunity to review the narrative and give feedback. This narrative summary, with accompanying school profile and explanatory notes, constitutes the bulk of the student transcript for college admissions.

CBE students, who most often don’t receive traditional letter grades or a GPA, are not at a disadvantage when it comes to college admissions. For example, 75 colleges and universities in New England, including Harvard, Dartmouth, MIT, Tufts and Bowdoin, have signed on to support CBE. One admissions officer commented, “The context of it is we see transcripts from around the country and around the world. And there are countless variations on transcripts.”



I am pretty intrigued by what I’ve learned so far about this approach to education. I’d be interested to hear your thoughts and reactions, too.

    Author

    Mary Kadera is a school board member in Arlington, VA. Opinions expressed here are entirely her own and do not represent the position of any other individual or organization.

