In my opinion

We need to assess student literacy skills

You can’t prove what you don’t measure.

BY NICHOLAS DION + VICKY MALDONADO | DEC 04 2013

While the benefits of strong literacy skills are well established, there is growing concern that Canadians’ reading and writing skills, including those of students attending postsecondary institutions in Ontario, are not meeting expectations. This is especially worrisome given that strong literacy skills are critical to students as they graduate into a highly competitive and increasingly globalized labour market.

The pressing question is whether students entering postsecondary education have the literacy skills required to succeed, and whether these skills are strengthened during a student’s time at university or college. Most institutions gather surprisingly little information about students’ language skills, either upon entry or at graduation. While most would agree that improved reading and writing skills are a key learning outcome of postsecondary education, you can’t prove what you don’t measure. The time has come, especially given the (largely successful) move toward mass higher education, to consider the rigorous and systematic assessment of students’ literacy skills as they enter and exit postsecondary education.

At most institutions, only international students who haven’t completed their high school education in English write language-skills assessments on entry. The assumption is that if you study in English and do well enough in high school to earn your diploma, your language skills are likely strong enough to succeed at university. Yet there are good reasons to question this assumption.

The OECD and Statistics Canada have been assessing Canadians’ literacy skills for the last 20 years, and their findings have been mixed. We know that students with higher literacy scores at age 15 are more likely to attend university, and that increased admission numbers bring students with a broad range of abilities. Without universal standards for curriculum and grading beyond the (relatively vague) guidelines set by the provinces, we also know that an “A” in English from High School B is not equivalent to an “A” in English from High School C.

So how is an institution to gauge a student’s true abilities? It can’t. It makes the best decision possible with the information available and waits to see how things turn out.

Entrance testing would allow universities and colleges to set their own clear, transparent standards – for high schools, for students and for faculty – and to trust in their assessment. It can be an expensive solution; that’s part of the reason universities (with a few exceptions) eliminated mandatory testing in the 1970s and ’80s, in favour of voluntary assistance offered through institutional writing centres. But technologies such as online platforms, which didn’t exist 30 years ago, can now make such an initiative more affordable.

We know that those who have completed postsecondary education generally score highest on the OECD’s assessments. But the picture becomes more complex when we break down the numbers. The OECD ranks literacy skills on a five-level scale, setting level 3 as the minimum required to complete high school and function in the world. As a result, OECD analyses usually present level 3 as the threshold for quality literacy skills. We would argue that this standard is too low to reflect the academic achievements expected of university graduates – those who go on to compete to become engineers, scientists, researchers or public servants.

Many postsecondary students have mastered the basics and can write to be understood, but they are unable to move beyond this functional level. Prose remains inelegant and unsophisticated, document structure is rudimentary, often limited to the “five-paragraph essay” taught in Ontario’s high schools, and critical thought may be lacking. Students can write, but not well enough to be counted among the “literacy elite,” OECD levels 4 and 5. And in the latest round of OECD literacy assessments, only 29 percent of Canadians with a university or college credential scored at levels 4 and 5.

Institutions of higher education do not exist simply to teach applied career skills or discipline-specific knowledge. They also strengthen basic skills, including reading and writing. But our sense of how well universities and colleges are succeeding at this rests on reputation and anecdote, and that won’t do when students can’t find jobs and governments want to cut budgets. Nor will it do when the country’s best institutions are competing against those around the world. This might all sound alarmist; we all know we’re doing fine, right? If so, let’s prove it. But you can’t prove what you don’t measure.

 

COMMENTS

  1. Roger Graves / December 10, 2013 at 17:38

    While I agree that some kind of assessment would be useful for students as they graduate from university, I’d like to point out that assessment, done correctly, is an expensive and extensive project. As part of the 1976 entrance class at the University of Waterloo, I wrote the entrance test on the topic of “how to put on a coat.” This wasn’t a very good test, for a variety of reasons. But one issue it highlights is the need for a test that accurately measures the writing skills students either have at the end of high school or will need to develop in their degree programs (my 1976 test fails on this count). We could have an entrance test that measures whether students can write well at the secondary level, but recent research on the writing assignments students do at university reveals that the genres they are asked to produce at the university level are quite different from what they wrote in high school. So even if they were good high school writers, they will still need to develop as writers in the disciplines they study at university.

    Perhaps a better way for universities to spend money would be to support this development of writing abilities rather than fund a test to find out what they already know: all students need access to resources and support if they are to develop as university writers, regardless of their ability upon entrance. Having students develop a portfolio of their writing would be a more valuable assessment tool because the portfolio reflects more accurately than a test the conditions under which students write at university and beyond.

  2. Andrea Williams / December 12, 2013 at 13:49

    Universal standards and measures of literacy don’t recognize the degree to which higher-level writing competency (i.e., that which goes beyond mere functional literacy) is context-specific. For instance, reading and writing in chemistry requires a markedly different skill set from reading and writing in philosophy. Rather than focusing on generic (and ultimately less meaningful) measures, a more effective approach would be to allocate resources to teaching and assessing writing within specific disciplinary contexts. In other words, a better use of resources would involve identifying the writing skills required of chemists, philosophers, etc. and how best to teach these to our students. Such an approach has the additional benefit of reframing the issue of student literacy from a problem to be solved by someone else (often university admissions officers, writing centres, or English departments) to a shared responsibility of faculty across the disciplines.
