Why most parents’ kids are from Lake Wobegon

Photo from stage showing Garrison Keillor telling stories
“The Lake Wobegon effect” is a term for the tendency to overestimate one’s abilities relative to others.

Why do parents have such unrealistically high assessments of their children’s academic performance?

That’s a question Michael J. Petrilli asks at EducationNext in a blog post titled “Common Confusion.”

The Common Core was supposed to be paired with tests that showed more accurately the relationship between stated standards and student performance. As those tests have been used, student test scores have gone down (way down, in many cases), but parents’ reports of how their students are doing remain high.

A study released in May reported 90 percent of parents believe their children are performing at “grade level” or higher in their schoolwork.

In New York State, where I live, about a quarter of students fail to graduate on time. Of those who graduate and go on to college, about a third end up taking at least one remedial course. I can tell you from my college teaching experience that if a student can’t pass the test to escape remedial English, that student hasn’t been at grade level for about eight years.

Chart showing 2012 HS graduation rates in New York State and the percentage of graduates who are college and career ready.
New York State’s report of college and career readiness of 2012 cohort of students.

Petrilli suggests giving parents more direct information about their kids’ performance when test results are reported, possibly even offering resources for concerned parents to use.

Peter Greene, on his blog Curmudgucation, takes issue with Petrilli’s comments, which Greene reads as being about grade inflation. Greene argues that if grade inflation exists in K-12 education, it’s allowed to happen because there’s no objective standard for what grade students should really be getting.

I find myself in agreement with both men on certain points: with Petrilli on the need to report test results in ways that make sense to parents, and with Greene on the deleterious effects of the commodification of education.

That said, however, I think there is another factor that could be causing parents’ assessments of their kids’ achievement to be so far off: the teachers could be accurately assessing what students have learned in their classes, and the tests could be accurately assessing how well that learning matched the standards, but the material being taught and the material being tested may be very different.

I don’t have any hard data on whether that is the case, but my observation of such things as topics for Twitter chats and professional development workshops for teachers leads me to believe a great many teachers are focused on directly teaching such things as a growth mindset and grit, dispositions students could instead acquire while engaged in activities that require developing them.

I think today’s educators spend way too much time attempting to teach things they wouldn’t have to teach if they did a really good job teaching their academic content.

I don’t mean stuffing students with facts.

I mean teaching students to read, write, compute, listen, speak, and think in each of their academic subjects, and giving students work that gives them the opportunity to exercise creativity, to be innovative and entrepreneurial, to treat others with respect, and to make the world a better place.

To that end, it might not be a bad thing if parents asked their local school boards what they are doing to make sure teachers are teaching the right things.

Steering students to the appropriate English course

Earlier this week, Matt Reed asked how colleges should determine whether an adult needs developmental coursework.

Although the question is particularly relevant to colleges, since remediation is expensive, high schools are finding that asking a similar question about eighth graders about to enter high school can make a difference in their students’ success.

In my experience teaching first-year college composition, I’ve not seen any placement exam or test score that could be relied on to sort students into the right courses. Some of the problem lies with the measuring devices, but the difference in how teacher A and teacher B approach the same course is equally problematic.

In the comments section, two people offered suggestions that I think have merit.

SFisher suggested:

Why not turn the summer bootcamps into a kind of 3-week trial period, at the end of which a student will be placed into an appropriate level by a faculty member, without the student having to battle the dreaded (In)Accuplacer beast?

Edward White suggested Directed Self-Placement as an alternative measure:

It in essence replaces the usual invalid testing with student counseling and information on what is required to pass particular courses and leaves the final decision (and responsibility) with the student. Most reports about it are highly positive.

Perhaps a combination of the two might be worth considering: a boot-camp trial period in which, as SFisher suggests, students write and get feedback on their writing, and which, as White suggests, gives students enough information about what the various college writing courses entail that they can make their own decisions about what’s appropriate for them.

I find most of my own students expect college writing will demand imaginative writing, detailed descriptions of their personal lives, and what they invariably call “good grammer.” Simply correcting those impressions takes a weight off technically oriented students.

I also find most of my students, whether traditional “college age” or adults, have a pretty good idea of what their writing problems are, and of what other problems affect their writing. They are also typically capable of determining what kind of help would really be helpful to them.

An additional value of the boot camp is that it would give the college data from which to determine the most common problems of students entering its programs. From that data, the college could develop shared resources that all faculty teaching a given course could access, so that official course descriptions and what happens in the classrooms are a reasonable match.

Also from that data, the college could develop support structures—for example, in-person individual and small group tutoring, digital resources, online chats or webinars—that students can elect if their boot camp experience reveals a need for some particular help.

My idea might not work (many of my ideas don’t), but it almost has to be as good as remedial English, doesn’t it?