# SAT Challenge: Bloggers Dumber Than High-School Kids

Visit the Official Blogger SAT Challenge Site

The graph shows a histogram of the scores for the essays entered into the Blogger SAT challenge. It’s really a pretty nice distribution, with an average score of 2.899, a standard deviation of 1.28, and a standard deviation of the mean of 0.123 (so I’d make my students write it as “2.9 +/- 0.1”). The median and mode were both 3.

(Well, OK, we cheated a little on the stats to make life easier– the scores were averaged and then rounded up. If we keep the half-integer scores, the mean drops to 2.7 +/- 0.1, and the distribution looks a little more lopsided. Click here to see it.)

How does this compare to the students who took the SAT for real? The College Board reports the sum of scores from two graders, so the officially released average score was a 7.2 out of 12. Halving that to put it on the same six-point scale we used gives a 3.6 out of 6.
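For the curious, the statistics above are easy to reproduce. The list of scores below is purely hypothetical (the real essay grades aren't reproduced here); the script just shows the recipe: the mean, the sample standard deviation, the standard deviation of the mean (standard error), and the halving that converts the College Board's 12-point sum to our six-point scale.

```python
import math
import statistics

# Hypothetical blogger essay scores on the 1-6 scale (illustrative only).
scores = [3, 2, 4, 1, 3, 5, 3, 2, 4, 3]

mean = statistics.mean(scores)
stdev = statistics.stdev(scores)        # sample standard deviation
sem = stdev / math.sqrt(len(scores))    # standard deviation of the mean

print(f"blogger average: {mean:.1f} +/- {sem:.1f}")

# The College Board reports the sum of two graders' 1-6 scores,
# so halving the 12-point average puts it on the same scale.
sat_average_12pt = 7.2
print(f"SAT average on six-point scale: {sat_average_12pt / 2:.1f}")
```

With the real data in place of the made-up list, this reproduces the "2.9 +/- 0.1" quoted above.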

So, we have scientifically proven that high-school students are better writers than bloggers, right? Well…

(Continued below the fold)

I don’t think I’d really call these results a significant blow against blogger superiority. After all, we were asking people to take twenty minutes to write an SAT-style essay, with no real preparation or practice. We also had a fair number of people more or less blow the whole thing off– one person just pasted the instructions into the essay box, and left it at that (we didn’t publish that one on the site). At least a couple of others wrote flippant one-sentence responses. That probably doesn’t happen on the real SAT.

(Excluding the handful of zero grades from the sample doesn’t raise the score all that much– from 2.9 to 3.0– but there were a lot more blow-off answers than just those four.)

I think these results do support my original point, way back when this whole thing started: it’s a lot harder to write a good short essay on demand than you might think when you have the chance to look at the question at leisure. Even bloggers, who spend a lot of time writing short essays of their own free will, don’t do all that well with a set topic and a tight time limit.

I’ll post a few comments from the graders a little later on, so you can see what the experts really thought…

## 8 thoughts on “SAT Challenge: Bloggers Dumber Than High-School Kids”

1. Like Vietnam+Northern Ireland=Iraq, the SAT essay extols format not content. Social activist recursive summation yields American zero-goal education. Teachers (educators!) cannot each, administrators (managers!) cannot administer, students (victims!) cannot learn, and everybody gets performance bonuses. Vote for bigger school budgets!

Universities increasingly declare the SAT does not predict academic performance. Why would language, mathematics, and reasoning skills plus accumulated objective knowledge confer any boon for diversity admissions wherein drug addiction, teen motherhood, and frank stupidity are overwhelming advantages?

Algebra was born in Muhammad ibn Mūsā al-Khwārizmī’s terrorist treatise Al-Kitab al-Jabr wa-l-Muqabala. Every child learning algebra is a lost battle in the War on Terror and a Homeland Severity threat. Save our children!

2. At least a couple others wrote flippant one-sentence responses. That probably doesn’t happen on the real SAT.

Oh, but it does. Quite a few high school students hate the SAT (or just the essay section) and don’t take it seriously. It’s a pity drawing a picture wasn’t a possible response to the Challenge. =)

These students are in the minority, for sure, but their non-responses are included in the official stats.

3. I know that this is all in good fun and not meant to be rigorous, but I have to wonder how closely these graders’ scores would compare to those of the actual SAT graders.

4. The volunteer graders were real SAT graders, at least some of the time. That was the really amazing thing, to me, about the fact that we got them to volunteer– I was surprised that anyone who did this for real would voluntarily do it for fun…

5. Eric says:

As a high school student steeped in SAT and college rigamarole, I must say — I love you Uncle Al. Lmao.

On the SAT, format seems to be the key to any success:
The introduction scales from a broad, alluring topic sentence down to a three-point thesis. Body 1 begins with a topic sentence that relates directly to the thesis, then continues with a first point that must be supported by evidence and proven by analysis. The same goes for Bodies 2 and 3. The conclusion begins with a restatement of the thesis and must answer the “So what?” and “Why should anybody care?” questions. The essay ends with a hmmmm statement.

For high school purposes, good writing is cold, emotionless, and mathematically flawless. Surely you cannot prove a point with strong appeals to emotion… surely that has never been done. (P.S. cast me into hell for just using the second person).

Oh and an aside, Chad, Mr. Orzel.. whatever, I notice that you seem to be a motivated physics (math?) teacher. As such, you may be interested in the forum and blogs at http://artofproblemsolving.com/. The smartest students in the country are on that forum. I say this just because (collectively) everyone seems to be so concerned with the declining intelligence of our nation’s next generations.

6. Kevin says:

Universities increasingly declare the SAT does not predict academic performance. Why would language, mathematics, and reasoning skills plus accumulated objective knowledge confer any boon for diversity admissions wherein drug addiction, teen motherhood, and frank stupidity are overwhelming advantages?

They may be admitting it now, but the statistics have been there for decades. Generally, that stat refers to the fact that men traditionally do better on the SATs than women, but women have been more successful in college than men for 30 or 40 years.

7. Thomas says:

“I don’t think I’d really call these results a significant blow against blogger superiority.”

I would. Excluding the “blow-off” responses, the blogger participants self-selected to participate in the challenge because they thought themselves good writers. The SAT students, on the other hand, included every high school kid who wanted to go to college, or whose parents wanted him or her to go to college. That population includes many people who know themselves to be bad writers and many who took the test only because they had to.

Our blogger group is likely to be, overall, much better educated. (Their minimum level of education will be about equal to that of the SAT takers, and their maximum level of education will include some college degrees, at least.) Our bloggers had the benefit of computers for word-processing; typing and editing are much easier online than by hand.

Our bloggers lost. They didn’t lose to the average college freshman — they lost to the average wannabe college freshman.

This challenge was well conceived and well executed. Let us not invalidate the experiment by attempting to explain away results that disappoint us.

8. Wray Cummings says:

A participant, I managed a ‘3’ from the expert graders for a couple of paragraphs that nearly half (16/34) of the citizen graders rated a ‘2’. From my vantage either grade is a gift, as the essay I produced bordered on non-responsive. That, and the fact that I internally morphed Booker T. Washington into George Washington Carver (which didn’t skew the result as much as I’d imagined).

It would have been a much stronger answer had I led off with a simple restatement of the question. Score one for form. The real challenge, the only challenge, and my only memory of the exercise was this: 20 minutes.

I’m not sure that timed writing is like timed arithmetic and logic puzzles and the other components of standardized testing. In the math/arithmetic sections timing measures one’s facility with underlying concepts, kinda like musical scales to a musician. The notion of “writing” that I brought to the test means something more like composition than scale facility, note selection over note production.

Maybe this is just a misunderstanding or mislabeling. By timing the SAT writing test they seem to be defining “writing” as “answering essay questions in classroom tests”, a reasonable distinction given the context.

Conceding this change of labels moves the blogger / student comparison into apples & oranges, writing for the world vs writing for the teacher.
