An [LTI](http://www.imsglobal.org/activity/learning-tools-interoperability)-based Python autograder similar to [pythonauto](https://github.com/csev/pythonauto), but using [Brython](http://brython.info/) to support Python 3.
This is a very simple project, requiring PHP support on the server side. All dependencies are managed by [Bower](https://bower.io/), so after cloning you should run `bower install`.
Before connecting through an [LMS](https://en.wikipedia.org/wiki/Learning_Management_System), you should edit `index.php` to change the `$oauth_consumer_secret` to something unique for your installation. This is the consumer secret you need to specify in your LMS for it to connect to the AutoMarker.
Using that consumer secret in your LMS, you can then link to `index.php?exercise_id=X`, where `X` is a number from 1 to 10 (or edit `exercises.py` to create different exercises).
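
The format used by `exercises.py` is not reproduced here, so the following is only a hypothetical sketch of the kind of information a new exercise needs: a numeric id, a prompt, and some test cases. The `EXERCISES` name and the dictionary layout are illustrative assumptions, not the repository's actual structure.

```python
# Hypothetical sketch only -- the actual layout of exercises.py may differ.
# An exercise pairs a prompt with (expression, expected value) test cases.
EXERCISES = {
    11: {
        "prompt": "Write a function double(n) that returns n multiplied by 2.",
        "tests": [
            ("double(2)", 4),
            ("double(-3)", -6),
        ],
    },
}
```
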
Students can then click the links to launch the exercises and, once all tests pass, submit their code for grading. By default, grading is client-side only, though you can enable server-side checking by editing `test_code.sh`; note that you should only run this as an unprivileged user, since it allows testing of arbitrary Python code.
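
As a generic illustration of why the unprivileged-user warning matters (this is not the repository's `test_code.sh`, and the helper below is hypothetical): server-side checking ultimately means executing whatever the student submitted, so it is best wrapped in a separate, time-limited process run under an account with no special privileges.

```python
# Generic illustration, not the project's test_code.sh: run an untrusted
# student script in its own process, with a timeout, and collect its output.
import subprocess

def run_student_code(path, timeout=5):
    """Run a student's script and return (exit code, stdout, stderr)."""
    try:
        result = subprocess.run(
            ["python3", path],
            capture_output=True,
            text=True,
            timeout=timeout,
        )
    except subprocess.TimeoutExpired:
        return None, "", "timed out"
    return result.returncode, result.stdout, result.stderr
```
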
You can also visit the exercises outside of an LMS. Testing will still work, though submission of grades is disabled.