5 Conclusions and Scope for Future Work
The developed framework will help the examiner evaluate student practical examinations with minimal human intervention. The use of computerized tools can reduce the limitations of the manual process, and computerized evaluation ensures uniform, speedy, and transparent results. The programs evaluated by the software range from small practice questions to single-file programs. The scope of the current system is limited to the C, C++, and Java programming languages.
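The uniformity claim rests on grading every submission against the same fixed test cases, comparing actual output with expected output. A minimal sketch of such a scoring step is shown below; the helper names (`normalize`, `score`, `run_program`) are illustrative assumptions, not the chapter's actual implementation.

```python
# Sketch of output-based scoring as used by typical automated
# evaluators (hypothetical helpers, not the framework's own code).

def normalize(text):
    """Strip trailing whitespace per line so formatting noise
    does not affect the comparison."""
    return [line.rstrip() for line in text.strip().splitlines()]

def score(test_cases, run_program):
    """test_cases: list of (stdin, expected_stdout) pairs.
    run_program: callable executing the student program on the
    given stdin and returning its stdout (e.g. via a compiler
    and sandboxed run in a real system)."""
    passed = sum(
        normalize(run_program(stdin)) == normalize(expected)
        for stdin, expected in test_cases
    )
    return passed / len(test_cases)

# Example "student program": echoes the sum of two integers.
def student(stdin):
    a, b = map(int, stdin.split())
    return f"{a + b}\n"

print(score([("1 2", "3"), ("5 7", "12")], student))  # 1.0
```

Because every submission runs against the identical test set, two examiners (or two runs) always produce the same score for the same program, which is what manual checking cannot guarantee.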
The framework can be improved to check student programs for viruses by integrating an anti-virus program, and to handle multi-file projects. It can also be extended with multi-agent technology, including a student agent and an examiner agent. The examiner agent could additionally conduct the written objective test that may form part of the practical examination.