In an ideal world, writing year-end reports would be easy. We’d be able to provide granular bits of information and share the story of a child’s growth with his or her parents with relative ease. But it’s not an ideal world; it’s a world where we’re strapped for time, and especially at the end of the year when fatigue is high and patience is low, it’s a small miracle when we can even get the report cards done and send them on their merry way.

I, too, have felt the strain of report writing, but I’ve also wanted to do it better. I’ve been fortunate enough to spend the last few years in a school system where we don’t use grades. This is a blessing, in the sense that I can focus more of my time on providing individualized feedback and creating rich learning experiences, but it’s also a challenge, in the sense that I have to be creative about how I demonstrate valid and reliable progress to parents.

You may be thinking that I might be a bit too preoccupied with managing parent expectations. After all, isn’t it about the kids?

It is about the kids. But through my work in a progressive, personalized learning school, I’ve learned that if parents don’t feel like partners, and if they don’t feel like you know their child individually, not only in terms of proficiency but also as a human being, you won’t get anywhere with progressive pedagogy. When you assess in a novel way, use project-based learning, and explore student autonomy, buy-in from parents is absolutely critical, and unfortunately, it will take a bit of extra work on your end. They need to be able to see that their children are learning before they can trust you to spread your wings and try new things with their children.

Start with Competency-Based Assessments

For the better part of six years now, I’ve been using a competency-based approach to assess children, in an effort to provide parents key insights on where their child is thriving and where their child needs a bit of extra work, but also to personalize instructional groupings and enhance my classroom instruction. For simplicity’s sake, I will provide one example of what this looks like in literacy.

Competency-based assessments are not conducted for the purpose of passing or failing; they are conducted with the purpose of better understanding the child. In literacy, there are a number of competency-based assessments that provide educators with a specific level at which each child has demonstrated proficiency. The Developmental Spelling Inventory (DSI), a criterion-referenced, standardized assessment of spelling development, is a prime example of this. Children are asked to spell a variety of words, with patterns increasing in complexity over time. The development of these spelling patterns is well-researched and generalizable for most children. While it’s possible for some children to deviate from the patterns of a developmental assessment such as the DSI, it’s highly unlikely. In fact, it’s so well-regarded because it works for most.

For the purposes of this post, I’m going to set aside those edge cases where kids don’t fit the developmental mold, not because students at the extremes don’t matter, but because this type of report automation is meant to save educators time where saving time makes sense, so they can invest more of their expertise in the edge cases that need a little more TLC. It’s impossible to avoid investing time in those cases, but it is possible to save a great deal of time with students who do follow developmental continua and are progressing as expected.

These patterns that make up a competency-based, standardized assessment then place children within criteria, hence the term criterion-referenced assessment. For example, in the DSI, if a child earns 7 feature points for initial consonants and 7 for ending consonants, but only 4 for short vowels, it’s highly likely that her appropriate instructional level is short vowels. For reference, here are those stages in context:

[Screenshot: the DSI spelling stages in context]

When I was writing my reports before, I’d often feel like I was writing the same thing over and over again, just in different orders and juxtapositions. Oliver was reading at a Level O and spelling at the Early Within-Word Stage, while Olivia was reading at a Level P and at the Inflected Endings Stage. The sentence frames, qualifications, and descriptions of each level were essentially the same; the only things that seemed to change were the respective levels and some of the individualized feedback. Something like:

<<Child’s Name>> is currently reading at <<F&P Level>>. This means that <<Child’s Name>> is able to <<F&P Descriptor>>. This is <<at/above/below>> what we would expect for <<Child’s Name>>’s age. <<Anecdotal Notes>>

It occurred to me at that point that perhaps I could be working a bit smarter with my reports, if they were really this formulaic. After all, the process by which I was collecting data was standardized, and the assessments produced clean categories that yielded actionable, individualized feedback. Perhaps, then, if I organized the data appropriately, I could write some spreadsheet formulas that would take this data and turn it into an individualized, narrative report, one that explains the criteria to parents while feeling more personal, with fields for anecdotal feedback written specifically for that child.
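To give a sense of what that can look like, here is a minimal sketch of such a formula in Google Sheets/Excel syntax. The column layout is hypothetical rather than my actual sheet: column A holds the child’s name, column B the F&P level, column C a short descriptor of that level, and column D the at/above/below judgment.

    =A2&" is currently reading at Level "&B2&". This means that "&A2&" is able to "&C2&". This is "&D2&" what we would expect for "&A2&"'s age."

The descriptor in column C can itself come from a VLOOKUP against a small two-column table of levels and descriptions, so the explanatory language only ever has to be written once per level.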

Minimizing Complexity through Spreadsheets

And that’s exactly what I did. Because the data was all criterion-referenced, if a child fell into a certain developmental range, those criteria would reliably dictate next steps and an instructional level. It wasn’t unique to the child, per se, but it was common to every child who fell within those criteria.

With this in mind, I assumed that the majority of the report would simply be an arrangement of generalizable statements, determined by the criteria delineated from these competency-based assessments. All I had to do was write them.

I started with my data:

[Screenshot: class spelling data, with feature-point scores by column]
Note: All data is real, but names have been changed to protect the innocent.

All of this data was tabulated by noting feature points in each child’s assessment. Based on this data, with a bit of help from conditional formatting and what I know about each child, I was able to place them on the developmental continuum (final column), again not as a measure of passing or failing, but as a means of deciding class-wide and group-specific instructional needs, while also communicating actionable feedback to parents.
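As an aside, the conditional formatting here can be as simple as one custom-formula rule that shades any feature column where a child hasn’t yet reached mastery, which makes the leftmost shaky feature, and therefore the likely instructional point, easy to spot. The range and the mastery threshold of 5 out of 7 feature points below are illustrative assumptions, not part of the assessment itself.

    =B2<5   (applied as a custom-formula conditional formatting rule over the feature-point range, e.g., B2:H24)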

From here, I was able to assign statements describing what this means for learning. Through a series of IF statements, I was able to automatically assign narrative criteria, explaining, with a fair amount of specificity, what the individual child would be working on. You can see the formula below:

[Screenshot: the spreadsheet formula used to assign the narrative criteria]
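The exact formula is in the screenshot above; as a rough sketch of the idea, with abbreviated stage names and placeholder wording, and assuming the developmental stage sits in column I, a nested IF keys off the stage and returns the matching chunk of narrative:

    =IF(I2="Short Vowels", A2&" is working on hearing and recording short vowel sounds in single-syllable words.",
     IF(I2="Digraphs and Blends", A2&" is working on consonant digraphs and blends such as sh, ch, and bl.",
     IF(I2="Within Word", A2&" is working on long vowel patterns within single-syllable words.",
     "See anecdotal notes.")))

A VLOOKUP against a two-column table of stage names and narrative paragraphs does the same job with far less nesting and is easier to maintain as the wording evolves.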

With that in place, Isabella’s narrative, for spelling only, looks like this:

[Screenshot: Isabella’s generated spelling narrative]

By applying this approach to each of the competency-based literacy assessments I use (i.e., Fountas and Pinnell, Core Phonics/Decoding, Orton-Gillingham Sight Words, Letters and Sounds, and CPAA), I was then able to generate a rather descriptive report for this child, providing a context and rationale for each assessment along with information on the individual child’s competencies:

[Screenshot: the full generated literacy report]
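For the curious, stitching the pieces together is the easy part. If each assessment’s narrative lives in its own column (say, spelling in J, reading in K, and phonics in L; again, an illustrative layout), the literacy section of the report is just those pieces joined with blank lines between them:

    =TEXTJOIN(CHAR(10)&CHAR(10), TRUE, J2, K2, L2)

From there, a mail-merge tool can drop the finished paragraph into each child’s report document.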

A Word of Caution

It’s important to remember that we must not reduce children to their assessment scores, but it’s also important that parents have the proper information to truly help their children at home. That’s why I also include a brief anecdotal narrative (not included here) that helps tell each child’s individualized story in literacy and numeracy as well. When this rich data is provided, however, the anecdotal notes can be much briefer while remaining just as colorful as if you’d written the whole report from scratch.

Regardless, this method of reporting focuses mostly on what a child can do and what he or she is ready for, providing actionable feedback and a clear instructional pathway, as opposed to the traditional manner of reducing feedback to letter grades that carry little meaning. Should you choose to go this route, start small and give yourself some time and space to do it gradually. This took me two years to develop, tweak, and turn into a process that works for me. I’m still tweaking it each time, but it has certainly saved my co-teacher and me a great deal of time, while giving parents rich information!
