2010 Judges Report
This is the sixth year that the Standard Bank IT Challenge has been held, principally for universities in South Africa.
Each of the participating universities used the heats to select one team to represent the university:
• Nelson Mandela Metropolitan University
• Rhodes University
• Stellenbosch University*
• University of Cape Town
• University of KwaZulu-Natal
• University of Limpopo
• University of Pretoria
• University of the Western Cape
• University of the Witwatersrand
In addition there were two teams from within Standard Bank, which did not compete for the prizes, namely:
• CIB IT – made up of local Standard Bank employees
• SBL – made up of employees from the London office
Each team consisted of 4 members, one of whom acted as Team Manager.
In order to create teams reflecting the student population, and to create opportunities for more students, each team was required to include at least one member of a different gender from the rest and at least one member from a historically disadvantaged background. This proved to be a successful innovation and should be refined and encouraged.
This year the Challenge supported three programming languages: Java, C++ and Python. Each team was supplied with a PC and access to a single printer.
The Challenge aims to test both programming ability and the ability to work in a team with limited resources.
The Challenge was judged by four judges: Dr Bruce Merry, Dr Carl Hultquist, Max Rabkin and Peter Waker. Peter Waker is a Fellow of the Computer Society of South Africa and acted as Chief Judge. Dr Merry, Dr Hultquist and Max Rabkin set the problems.
*The university has officially changed its name to “Stellenbosch University” (Afrikaans: “Universiteit Stellenbosch”).
Teams were given 7 problems to solve: 1(a), 1(b), 2(a), 2(b), 3, 4, 5. Problem 5 was an “interactive” problem.
Teams were given Problems 1(a) and 5 at the start of the 6-hour competition. Problem 1(b) was issued as soon as a team had submitted 1(a) successfully. Three teams succeeded in solving 1(a) within 6 minutes.
After an hour, teams were given the remainder of the questions with the exception of 2(b) which was only issued on the successful submission of question 2(a).
A point was awarded for each solution that passed the test data. The time of each successful submission was recorded; each earlier failed submission on that problem added a 20-minute penalty to the team's total elapsed time.
For the interactive Problem 5, each valid submission counted as correct. In addition, the correct solutions were played against one another at the end of the competition period: the winner earned an extra point, while the second- and third-placed teams had 40 minutes and 20 minutes respectively credited to their total elapsed time.
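The scoring scheme described above amounts to a standard two-key ranking: more points rank higher, and ties on points are broken by lower total elapsed time, with 20 minutes added per failed submission on a solved problem. A minimal Python sketch of that rule is below; the team labels, solve times and attempt counts are invented purely for illustration and are not the judges' actual software.

```python
PENALTY_MINUTES = 20

def problem_time(solve_minutes, failed_attempts):
    """Elapsed time charged for one solved problem: the solve time plus
    a 20-minute penalty for each earlier failed submission on it."""
    return solve_minutes + PENALTY_MINUTES * failed_attempts

def rank_teams(teams):
    """teams: iterable of (name, points, total_minutes).
    Sort by points descending, then total elapsed time ascending."""
    return sorted(teams, key=lambda t: (-t[1], t[2]))

# A problem solved after 60 minutes with two earlier failed attempts
# costs 60 + 2 * 20 = 100 minutes.
assert problem_time(60, 2) == 100

# Hypothetical standings: Team C has the most points and wins outright;
# Teams A and B are tied on points, so lower elapsed time decides.
teams = [
    ("Team B", 5, 913),
    ("Team A", 5, 540),
    ("Team C", 7, 560),
]
print([name for name, _, _ in rank_teams(teams)])  # ['Team C', 'Team A', 'Team B']
```

The extra point and time credits for the interactive question's play-off would simply be applied to a team's point total and elapsed time before this ranking is computed.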
A breakdown of the submissions problem by problem shows the following:
A breakdown of the submissions by language shows that 33% of the C++ submissions were correct, compared with 48% of the Java submissions and 46% of the Python submissions.
• There were 90 submitted solutions of which 41 were correct – the highest number of correct solutions for any of the Challenge events so far.
• Only one team (Stellenbosch University) managed to get 7 points.
• Two teams were not able to submit valid solutions to the interactive question.
• For the interactive question, teams achieved the following ranking:
First: Stellenbosch University
Second: University of Cape Town
Third: Nelson Mandela Metropolitan University
The overall university rankings were:

Rank  University                               Points  Time
1     Stellenbosch University                  7
2     University of KwaZulu-Natal              5       09:00
3     University of Cape Town                  5       15:13
4     University of the Witwatersrand          3       05:34
5     Nelson Mandela Metropolitan University   3       05:53
6     University of Pretoria                   3       08:00
8     University of the Western Cape           1       03:33
9     University of Limpopo                    0       00:00
Honourable Mention: Although not officially part of the competition, the local Standard Bank team would have placed fourth had they been eligible. For much of the competition they were in third place.
1. Both the scoreboard and the visualisation of the interactive question worked well and should be carried over to the 2011 competition. The availability of both on the web should be made widely known.
2. More universities should be invited to take part in the heats for 2011.