Grade Inflation

Grade inflation is the assignment of a grade to a student who has not yet reached the achievement level represented by that grade. For example, teachers may grade students based on their effort and/or their motivation to learn rather than mastery of content. Also, schools within or between districts may compete for students (and tax dollars or tuition) by promoting a reputation for graduating students of a high caliber — those who have higher grades than students in other geographic areas. In addition, some schools simply sell grades to students willing to purchase them. Regardless of how such grades are bestowed, inflated grades do a disservice to students who feel they have earned them and believe they are prepared for college and/or a position in the workforce.

Keywords: American College Testing (ACT); Assessment; Grade Inflation; Grade Level Promotion; Higher Education; Naïve Selection; Scholastic Aptitude Test (SAT); Secondary Schools; Strategic Selection

Overview

Most parents would like to think that their child's report card is indicative of the child's academic achievement in class — the objective combination of content knowledge and performance since the previous report card. It is possible, however, that report cards reflect a subjective interpretation of one child's progress compared to another's. It is also possible that report cards serve to define a school's rank within a district or to promote a public school's value over a private school's. These are common reasons for the recent trend of grade inflation in secondary schools. Grade inflation is the gap between a student's grade and his actual attainment of the course content that grade is supposed to reflect. In other words, a student who receives a B in 11th-grade history should be able to demonstrate roughly 80% proficiency in the course content when tested. If he can't, the B is inflated relative to his knowledge. Grades are inflated in secondary schools for a variety of reasons.

According to an article in Gifted Child Today, a 2000 report showed that "86% of teachers consider student effort as a factor" when determining grades ("Grade Inflation," 2000). Most schools have a general rubric on which grades are developed, but teachers can incorporate how hard a student has worked into that rubric, sometimes defining who is promoted to the next level based on effort rather than content mastery. Grade promotion based on effort is misleading. It also creates dependence for the student; if he doesn't continue to receive inflated grades, his opportunities for success dwindle because he doesn't know as much as he should. For example, the ACT (American College Testing) exam is a standardized test taken in the junior or senior year of high school and is used by colleges to determine student placement in courses like English and math. ACT scores can determine scholarship eligibility and whether or not students require remediation once enrolled in college:

It is composed of four tests: English, Mathematics, Reading, and Science. A fifth score, the Composite score, is the average of the scores on the four subject tests. The ACT not only measures the knowledge and skills students have acquired during their high school years and their level of achievement as a result of their high school learning and instruction, but also serves as a measure of their preparation to undertake rigorous coursework at the postsecondary level ("Are High School Grades Inflated?" 2005, p. 2).

The Scholastic Aptitude Test (SAT) is also used by colleges to determine a student's scholarship eligibility and what, if any, advanced courses a student may take once enrolled. Depending on the college, SAT or ACT scores may be required for admission. And, while a student can take each exam as many times as he wishes (for a fee) and use the highest score in admissions materials, many schools record each score, keeping a record of multiple attempts to raise it. Even colleges that look more holistically at students' high school experiences (as opposed to weighing standardized test scores so heavily) would expect strong performances on these exams from students who have high averages in high school. Because so many high school averages include grade inflation, student averages do not correlate with standardized test scores:

In 1984, 28% of all students taking the SAT reported A averages; while in 1999, 39% of SAT-taking students reported A averages. Since performance on the SAT has not varied significantly over the past 23 years, researchers have concluded that this increase is a result of grade inflation ("Grade Inflation," 2000).

The grades reported by students taking these tests have increased, yet their test scores have not. Bracey (1994) notes similar results: students increasingly report grades from A- to A+ as representative of their high school averages, yet their SAT scores do not reflect such high levels of content mastery.

Further Insights

Why Inflate Grades?

Naïve & Strategic Selection

From parents' perspective, grade inflation may determine which school their children will attend. For example, if students at School A receive better grades than students at School B, it may be assumed that School A has better teachers, a stronger administration, or more resources that lead to student success than School B. Walsh (2010) explains that when it comes to parents and school selection, the choice process can be naïve or strategic (p. 152). Naïve selection occurs when parents don't pay attention to standardized test scores, when they make note of things like state-of-the-art computer labs (or other high-end resources), or when they simply don't suspect grade inflation (p. 152). Strategic selection, on the other hand, happens when parents send their children to schools specifically because of grade inflation practices. Students who receive high grades are more likely to be accepted into college. Walsh (2010) explains that,

… highly educated parents with high-achieving students may be attracted by high perceived school quality or college-admissions gamesmanship more than average families. If this is so, a grade-inflating school district could not only attract more families but could attract a particular type of family. The peer quality and achievement of the district would be genuinely high, not because of school quality effects but because of selection (Walsh, 2010, p. 153).

In contrast, schools in districts with academically weak students face budgetary constraints if students don't pass from year to year. This may encourage teachers to inflate grades for students who might otherwise drop out. In a study conducted by Lekholm & Cliffordson (2008), students in districts where families had "lower educational backgrounds" received overall grades that were higher than their standardized test scores would predict (pp. 195-196). Again, No Child Left Behind, passed in 2001, places a monetary value on student pass rates, so grade inflation makes sense in economically struggling districts.

Overall, it appears that many school districts benefit from grade inflation. Public schools that compete with private schools or with schools within their own districts have an advantage when they graduate students with high GPAs: students want to attend those schools, and their parents are happy to enroll them there. Also, academically strong students are sought after by colleges, and at schools that inflate grades, colleges might believe they have a strong pool of applicants from which to choose. How long students from this pool remain in college is difficult to predict, but many will face a reality they did not expect once they arrive because they are not prepared. Districts that enroll academically weak students also benefit, as those students are less likely to drop out when they have passing grades. Thus, weaker students may view inflated grades as a reason to persist to graduation.

Buying the Grades

According to Hansen, "Whether due to years of grade inflation in high school … or society's overall disrespect for the immaterial value of education, many students tend to look at academic accomplishment as just another commodity to be purchased" (1998, p. 13). Unfortunately, some students actually do purchase the commodity. Farran (2009) discovered a common practice in Vancouver whereby students attending one high school could pay tuition at another school to take the very same courses yet receive higher grades. The province of British Columbia created a policy to allow students whose high schools did not offer...

Statistics recently released across all British universities show that, over the past decade, the proportion of students gaining a first-class degree has nearly doubled, from 11% in 2003-4 to 19% in 2012-13. The proportion of students attaining a 2.1 has also increased. Research at Lancaster University's School of Management argues that this simply reflects the rising quality of A-level students. Others have suggested that this may be evidence of "dishonesty", as universities chase league table recognition. Who is right?

The dishonesty argument hinges on university autonomy; a university that has degree-awarding powers sets its own standards and could, in theory, manipulate them. In practice, though, there is a long-developed system of checks and balances that sets levels of parity across broad networks of universities.

The external examination system is central to this. A broad range of external accrediting bodies, mostly for the professions, impose specific sets of requirements. And the Quality Assurance Agency (QAA) audits every university on a regular basis, looking in particular at the ways in which appropriate standards are implemented.

Every university that I know takes these assurance systems very seriously; in my university, I read every external examiner report for all our academic programmes. No system is perfect. But the argument that universities are dishonestly manipulating results is both lazy and ill-informed.

So why is the proportion of good degrees going up? Research such as the Lancaster study is important, because we don't know enough about the relationship between students' inherent abilities, the value that is added through the opportunities universities provide and the systems we use to measure these factors. There is no golden age of grading, despite nostalgia for a time when only a few students attained a first.

Back in the day, assessment took the form of an endurance race through five successive days of written papers, and no-one would ever get more than 80%. It was never clear what was actually being assessed, other than the ability to remember a huge amount and write almost continually for five days. The stakes were much lower; since fewer than 10% of school leavers went to any university, the employment advantage was gained by getting in, and getting out with something. People with third-class degrees got jobs, along with a reputation for valuing a good social life.

Today, this age of innocence is long gone. Students are schooled to worry about employment before they start university. With the majority of their contemporaries going on to post-compulsory education, the stakes are immensely higher and students work much harder. So one reason why there is an increasing proportion of graduates with good degrees is simply that they deserve them. Looked at another way, it would be profoundly unfair to pile on all the pressure for attainment while continually moving the finishing line further towards the horizon.

In addition, assessment has become a lot more sophisticated, and appropriately so. There needs to be a sensible balance between formative assessment (coursework) and summative assessment (final examinations). Students today are not at university to be "filled up" with knowledge. Rather, they want the opportunity to develop advanced analytical techniques to make sense of an increasingly complex world, awash with information. This requires experiential learning, melding theory and principles with applications, and this in turn leads to a blend of assessment methods that measure students' abilities more closely. So golden age comparisons are setting very different worlds of learning against each other, as if they are comparable.

This said, the Lancaster study is, in itself, an inadequate explanation. Firstly, it misses the point that only about half of students in British higher education have A-levels. Significant numbers of students enter university from further education colleges, with vocational qualifications. Secondly, it ignores the compelling evidence – lined up by the Sutton Trust and in other studies – that A-level attainment is correlated with socioeconomic status and household income. And, of course, A-level assessment methods have themselves changed dramatically over the past decades. We cannot, then, turn to A-levels for a simple assurance that nothing has changed.

What is needed? We are probably nearing the point where traditional degree classifications will be abandoned. We should rather look for reliable, secure and trusted ways of providing our students with comprehensive transcripts of everything they have done at university. Because employers increasingly demand all this additional information in any case, we need to find ways of providing them with the best possible means of expressing a graduate's full range of capabilities, work and attainment while at university. This would be fairer to our students, more useful to employers and better than an arcane system of degree classification that is outliving its usefulness.

Martin Hall is vice-chancellor of the University of Salford – follow him on Twitter @VCSalford
