
Marking exams by computer could help, but may spawn new problems

22nd September 2012

Exam papers for the primary school leaving certificate will this year be marked by a computer system known as an Optical Mark Reader (OMR), which uses optical mark recognition to read answers shaded on a special machine-scannable sheet. While the Ministry of Education and Vocational Training expects the system to cut the costs of manual marking, the scheme could give rise to problems of its own, in which case it may not be a panacea for exam cheating or leakages. Much depends on the specific circumstances in which the move has been decided.
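For readers curious about the mechanics, the general idea can be sketched in a few lines of Python. This is purely an illustration of how mark-recognition marking works in principle, not NECTA's actual software; the darkness threshold, the four-option layout and the function names are all assumptions made for the example.

# Illustrative sketch of OMR-style marking (not NECTA's actual system).
# Assumes each question's bubbles have already been located on the scanned
# sheet and reduced to per-bubble darkness values between 0.0 and 1.0.

MARK_THRESHOLD = 0.5  # assumed darkness above which a bubble counts as shaded

def detect_answer(darkness_by_option: dict[str, float]) -> str | None:
    """Return the single shaded option, or None if blank or double-marked."""
    shaded = [opt for opt, d in darkness_by_option.items() if d > MARK_THRESHOLD]
    return shaded[0] if len(shaded) == 1 else None  # double marks score zero

def score_sheet(sheet: list[dict[str, float]], key: list[str]) -> int:
    """Count questions where the detected mark matches the answer key."""
    return sum(
        1
        for darkness, correct in zip(sheet, key)
        if detect_answer(darkness) == correct
    )

# Example: a three-question paper with answer key A, C, B.
pupil = [
    {"A": 0.9, "B": 0.1, "C": 0.0, "D": 0.0},  # clear mark on A: correct
    {"A": 0.0, "B": 0.8, "C": 0.7, "D": 0.0},  # two marks: rejected
    {"A": 0.0, "B": 0.9, "C": 0.1, "D": 0.0},  # clear mark on B: correct
]
print(score_sheet(pupil, ["A", "C", "B"]))  # prints 2

The point of the sketch is that the machine only compares marks against a key; everything else, from printing the sheets to securing the key, remains a human process.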

Those touting the system point out that it is used in countries like Kenya and South Africa, and since we use the same multiple-choice exam format it can be adopted here as well. Since the move was announced late last week, some commentators have complained not about the marking technology as such but about the multiple-choice exam setting, intimating that it breeds a culture of clever guesswork instead of reasoning out a question. Some participants in talk shows on local radio and television said, during the teachers' strike effort, that multiple-choice questions permit illiterate pupils to pass to form one.

There is of course something in each of those observations, which raise doubts about the new system, for instance that it is better to assess whether a pupil has reasoned than whether he or she guessed right. The problem that gives rise to the multiple-choice format, and it is clearly felt far and wide, even in a relatively advanced country like South Africa, is that schools usually have large numbers of pupils, often studying in streams, and a 'reasoned' or short-essay exam format would exponentially increase the time and resources needed for the marking exercise. It is unworkable.

At the same time, multiple-choice exams are not pure guesswork in the manner in which one predicts which team will win a league match, as they involve making distinctions of a logical sort, of series and class for instance. A multiple-choice test requires that the pupil reason out why it is not this option but the next or the other one, and if she gets more than half, or slightly less than half, right, it follows that she can follow reasonably well the sort of material her mind was required to process, which is the point of exams. In other words multiple choice is an exam, and it definitely tests reasoning as well.
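The guesswork worry can also be put in numbers. Assuming, purely for illustration, a 50-question paper with four options per question and a pass mark of more than half correct (these figures are assumptions, not the actual exam's), the chance of passing by blind guessing is vanishingly small, as the short calculation below shows.

from math import comb

# Probability of passing purely by guessing, under assumed figures:
# 50 questions, 4 options each (so p = 0.25), pass mark of 26 or more correct.
n, p, pass_mark = 50, 0.25, 26

prob = sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(pass_mark, n + 1))
print(f"Chance of {pass_mark}+ correct by pure guessing: {prob:.1e}")
# Prints about 3.8e-05, i.e. less than one chance in 25,000.

A pupil who scores above half on such a paper has almost certainly done more than guess, which is the editorial's point.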

Where the shoe starts pinching is in the proper use of that system across the whole country, while maintaining the requirement that exam papers reach marking centres on time and that equipment and control staff are in place for the whole exercise. The distances to be covered and the time taken before papers are put to automatic marking (optical recognition of answers) should be no worse than what is expected under manual marking, since delays raise opportunities for misplacement of papers and the like. Still, as the ministry has taken radical steps against cheating cases, the incentive for bad habits may have declined.

In addition, a computer is just a tool in the hands of an individual, and it can be put to the wrong use unless the software itself cannot be manipulated (for instance, by instructions entered on screen and obeyed while an exam paper is being marked). The ministry said the method would cut costs; it did not really claim it would eliminate cheating, as perhaps that is not the function of electronic gadgetry. In the final analysis, cheating is a human failing that no software can prevent.

So there are worries about the glitches and hitches of using fairly advanced software that works on special paper and scanners, which would presumably be available in the usual marking centres, unless perhaps the marking is shifted to district level or even beyond. There are questions about the feasibility of marking exam papers by computer, and complications have arisen even in compiling results: NECTA (the national examination council) had a brush with hostile demonstrators over an Islamic Knowledge paper that was seemingly mishandled. That NECTA blamed software problems casts quite a long shadow.

SOURCE: THE GUARDIAN