2012 Judges Report

This is the eighth year that the Standard Bank IT Challenge has been held, principally for universities in South Africa. The Challenge aims to test both programming ability and teamwork under limited resources.

Teams were made up of four members, one of whom acted as Team Manager. To ensure teams reflected the student population, and to create opportunities for more students, teams needed to include members from both genders and at least one member from a historically disadvantaged background. Each team was supplied with one PC only and access to a single printer.

Participating universities were:

University                                Province   Teams
Cape Peninsula University of Technology   WC             8
Nelson Mandela Metropolitan University    EC             7
North-West University                     NW             7
Rhodes University                         EC             6
Stellenbosch University                   WC             8
University of Cape Town                   WC            12
University of KwaZulu-Natal               KZN            9
University of Limpopo                     L              9
University of Pretoria                    G              8
University of the Western Cape            WC             9
University of the Witwatersrand           G              8
Walter Sisulu University                  EC             8

The heats were used to select the nine universities that would be invited to the finals. In addition, participating universities used the heats to select one team to represent the university. The teams competed at their own universities under supervision of a proctor from Standard Bank.

This year the Challenge again supported three programming languages: Java, C++ and Python.
The Challenge was judged by three judges: Julian Kenwood, Max Rabkin and Peter Waker. Peter Waker is a Fellow of the Computer Society of South Africa and acted as Chief Judge. Julian Kenwood and Max Rabkin set the problems.

Final Round: 16 May

The final round was held at the Standard Bank Headquarters in the Johannesburg CBD. Teams from the highest scoring universities attended. As in the past, the names of the teams were intriguing.

University                        Team
North-West University             402 Payment Required
Rhodes University                 Pythoneers
Stellenbosch University           facepalm.jpg
University of Cape Town           150k-UP
University of KwaZulu-Natal       Aslam’s Army
University of Pretoria            Tuks 1
University of the Western Cape    0x41414141
University of the Witwatersrand   Ctrl-Alt-Del

The team from the University of Limpopo had transport problems, and did not participate.

A team from Standard Bank also took part, although they did not compete for the prizes. The team, SBCIB, was made up of local Standard Bank Corporate and Investment Banking employees.

Participants were given seven problems to solve: 1a, 1b, 2, 3a, 3b, 4 and 5 (problem 2 was issued in two parts, 2a and 2b). Problem 5 was an “interactive” problem.

All the problems except 2b and 3b were given at the start of the six-hour competition. Problem 2b was issued as soon as a team had successfully submitted 2a, and 3b as soon as 3a had been successfully submitted.

Teams gained a point for each solution that was correct when tested against the test data. The time at which each problem was solved was recorded, and teams earned time penalties for unsuccessful submissions.

A breakdown of the submissions problem by problem shows the following:

Result          1a   1b    2   3a   3b    4    5   Total
ABNORMAL         4    3    1    6    -    -    -      14
CORRECT          8    4    6    4    1    1    8      32
FORMAT ERROR     1    -    1    2    -    -    -       4
TIME EXCEEDED    -    1    8    5    2    1    -      17
WRONG            4    4    5    5    3    1    -      22
TOTAL           17   12   21   22    6    3    8      89

Three of the teams used Java, four used Python and two used C++.
A breakdown of the submissions by language shows that 54% of the C++ submissions were correct, compared with roughly 28% of the Java submissions and 28% of the Python submissions. However, these statistics may be meaningless: with so few teams, the proportion of correct submissions is determined by the strategy of the teams rather than the suitability of the language.

Result                                  C++   Java   Python   Total
ABNORMAL                                  2      4        8      14
CORRECT                                  14      8       10      32
WRONG                                     8      7        7      22
FORMAT ERROR / TIME EXCEEDED (combined)   2      9       10      21
TOTAL                                    26     28       35      89
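The per-language figures quoted above can be checked directly from the table; a quick recomputation (values copied from the table):

```python
# Recompute the per-language correct-submission rates from the table above.
correct = {"C++": 14, "Java": 8, "Python": 10}
total = {"C++": 26, "Java": 28, "Python": 35}

for lang in ("C++", "Java", "Python"):
    rate = 100 * correct[lang] / total[lang]
    print(f"{lang}: {rate:.1f}% correct")
# C++: 53.8%, Java: 28.6%, Python: 28.6%
```

These are consistent with the roughly 54% and 28% figures quoted in the report.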


• There were 89 submitted solutions of which 32 (36%) were correct – a lower proportion than last year’s 32 out of 69 (46%).
• Only one team managed to solve all the problems.
• One team managed to solve problem 1a in 11 minutes.
• One team did not solve any of the problems.
• Seven teams were not able to submit valid solutions to the interactive question.
• In the inter-team playoff for the interactive question, the SBCIB team managed to beat all the university teams.


The winning team was selected according to the number of problems solved. If two or more teams solved the same number of problems, they were ranked according to the total time taken. Teams earned time penalties for unsuccessful submissions, which explains why some teams appear to have taken longer than the allocated six hours.
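The ranking rule described above can be sketched as follows. This is a hypothetical illustration: the team names, times and the fixed penalty per unsuccessful submission are assumptions, since the report does not state the exact penalty used.

```python
from dataclasses import dataclass

PENALTY_MINUTES = 20  # assumed penalty per unsuccessful submission (not stated in the report)

@dataclass
class Team:
    name: str
    solved: int         # number of problems solved
    solve_minutes: int  # sum of the times at which solved problems were submitted
    failed: int         # unsuccessful submissions

    def total_time(self) -> int:
        # Penalties are added to the raw solving time, which is why
        # totals can exceed the six hours of actual contest time.
        return self.solve_minutes + PENALTY_MINUTES * self.failed

def rank(teams):
    # More problems solved ranks higher; ties are broken by lower total time.
    return sorted(teams, key=lambda t: (-t.solved, t.total_time()))

# Hypothetical example: A and B solve the same number of problems,
# so the tie is broken on penalty-adjusted time.
teams = [Team("B", 5, 580, 4), Team("A", 5, 600, 0), Team("C", 4, 400, 0)]
print([t.name for t in rank(teams)])  # ['A', 'B', 'C']
```

Here B submitted its last correct solution earlier than A, but its four failed submissions push its penalty-adjusted time (580 + 4 × 20 = 660) past A's 600 minutes.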

The winning teams were ranked as follows:

Place   University      Team                   Solved   Time
1       Cape Town       150k-UP                     7   22:40:05
2       Pretoria        Tuks 1                      5   10:56:14
3       North-West      402 Payment Required        5   13:46:33
4       Witwatersrand   Ctrl-Alt-Del                4   07:52:29
5       Stellenbosch    facepalm.jpg                4   06:31:23