diff --git a/readme.md b/readme.md
index 45cfe381078815a0fe91e057c0fe9278e2ab2564..189f08b57237cba6d3b2afff1cf6d58e2ae475bb 100644
--- a/readme.md
+++ b/readme.md
@@ -2,10 +2,12 @@
 
 An [LTI](http://www.imsglobal.org/activity/learning-tools-interoperability)-based Python autograder similar to [pythonauto](https://github.com/csev/pythonauto), but using [Brython](http://brython.info/) to support Python 3.
 
-This is a very simple project, requiring PHP support on the server-side. Once cloned, you should edit `index.php` to change the `$oauth_consumer_secret` to something unique for your installation. This is the consumer secret you need to specify in your [LMS](https://en.wikipedia.org/wiki/Learning_Management_System) for it to connect to the AutoMarker.
+This is a very simple project, requiring PHP support on the server side. All dependencies are managed by [Bower](https://bower.io/), so after cloning you should run `bower install`.
 
-In your LMS, you can then link to `index.php?exercise_id=X`, where X can be 1, 2, 3, 4, 5, 6, 7, 8, 9, or 10 (or edit `exercises.py` to create different exercises).
+Before connecting through an [LMS](https://en.wikipedia.org/wiki/Learning_Management_System), you should edit `index.php` to change the `$oauth_consumer_secret` to something unique for your installation.
 
-Students can then click the links to launch the exercises and, once all tests have been passed, they are able to submit their code for grading. By default, grading is client-side only, though you can enable server-side checking by editing `test_code.sh` - though note that you should only run this as an unprivileged user, since this will run arbitrary Python code.
+Using that consumer secret in your LMS, you can then link to `index.php?exercise_id=X`, where X can be 1, 2, 3, 4, 5, 6, 7, 8, 9, or 10 (or edit `exercises.py` to create different exercises).
 
-You can also visit the exercises outside of an LMS. Testing will still work, though submission of grades is diabled.
\ No newline at end of file
+Students can then click the links to launch the exercises and, once all tests have been passed, they are able to submit their code for grading. By default, grading is client-side only, though you can enable server-side checking by editing `test_code.sh`; note that you should only run this as an unprivileged user, since it allows testing of arbitrary Python code.
+
+You can also visit the exercises outside of an LMS. Testing will still work, though submission of grades is disabled.
\ No newline at end of file
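
For context, the clone-and-install step the revised first paragraph describes amounts to something like the following shell session. The repository URL is a placeholder, and Bower is assumed to be installed (e.g. via `npm install -g bower`):

```sh
# Placeholder clone URL; substitute the actual repository.
git clone https://example.com/automarker.git
cd automarker

# Fetch client-side dependencies (presumably including Brython),
# as declared in the project's bower.json.
bower install
```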
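
The `$oauth_consumer_secret` edit wants a value that is unique and hard to guess. One way to produce such a value, assuming `openssl` is available on the server, is:

```sh
# Generate 24 random bytes as hex. Paste the output into index.php as the
# value of $oauth_consumer_secret, and configure the same string as the
# consumer secret in the LMS's LTI tool settings.
openssl rand -hex 24
```

A launch link configured in the LMS would then point at something like `https://your-host/automarker/index.php?exercise_id=3`, where the host and path are hypothetical and `exercise_id` ranges over the exercises defined in `exercises.py`.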
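
On the server-side checking note: `test_code.sh`'s exact interface isn't shown in this diff, so the argument below is hypothetical; the point is the unprivileged invocation, since the script executes student-submitted Python:

```sh
# Hypothetical invocation: the script's real arguments may differ.
# Running as the unprivileged "nobody" user limits what arbitrary
# submitted code can read or modify on the server.
sudo -u nobody ./test_code.sh submission.py
```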