From 03b2e8bab69daa212d02f79b1e41bd7e8d702b99 Mon Sep 17 00:00:00 2001
From: David Paul <David.Paul@une.edu.au>
Date: Mon, 4 Jul 2016 00:46:50 +0000
Subject: [PATCH] readme.md edited online with Bitbucket

---
 readme.md | 34 ++++++++++++++++++++++++++++++----
 1 file changed, 30 insertions(+), 4 deletions(-)

diff --git a/readme.md b/readme.md
index 45cfe38..189f08b 100644
--- a/readme.md
+++ b/readme.md
@@ -2,10 +2,36 @@
 
 An [LTI](http://www.imsglobal.org/activity/learning-tools-interoperability)-based Python autograder similar to [pythonauto](https://github.com/csev/pythonauto), but using [Brython](http://brython.info/) to support Python 3.
 
-This is a very simple project, requiring PHP support on the server-side. Once cloned, you should edit `index.php` to change the `$oauth_consumer_secret` to something unique for your installation. This is the consumer secret you need to specify in your [LMS](https://en.wikipedia.org/wiki/Learning_Management_System) for it to connect to the AutoMarker.
+This is a very simple project, requiring PHP support on the server side. All dependencies are managed by [Bower](https://bower.io/), so after cloning you should run `bower install`.
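+
+A minimal sketch of that step, assuming Bower is installed globally (the checkout directory name is illustrative):
+
+```bash
+cd automarker    # your local clone of this repository
+bower install    # fetch the client-side dependencies into bower_components/
+```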
 
-In your LMS, you can then link to `index.php?exercise_id=X`, where X can be 1, 2, 3, 4, 5, 6, 7, 8, 9, or 10 (or edit `exercises.py` to create different exercises).
+Before connecting through an [LMS](https://en.wikipedia.org/wiki/Learning_Management_System), you should edit `index.php` to change the `$oauth_consumer_secret` to something unique for your installation.
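+
+For example, the relevant line in `index.php` might look like this (a sketch; the exact surrounding code is assumed):
+
+```php
+// Shared secret the LMS must present when launching the tool;
+// replace the placeholder with a long random value unique to this install.
+$oauth_consumer_secret = 'change-me-to-a-unique-secret';
+```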
 
-Students can then click the links to launch the exercises and, once all tests have been passed, they are able to submit their code for grading. By default, grading is client-side only, though you can enable server-side checking by editing `test_code.sh` - though note that you should only run this as an unprivileged user, since this will run arbitrary Python code.
+Using that consumer secret in your LMS, you can then link to `index.php?exercise_id=X`, where X is any of the built-in exercise numbers 1 through 10 (or edit `exercises.py` to create different exercises).
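+
+For example, `index.php?exercise_id=3` launches the third exercise.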
 
-You can also visit the exercises outside of an LMS. Testing will still work, though submission of grades is diabled.
\ No newline at end of file
+Students can then click the links to launch the exercises and, once all tests have passed, submit their code for grading. By default, grading is client-side only, though you can enable server-side checking by editing `test_code.sh`. Note that you should only run server-side checking as an unprivileged user, since it executes arbitrary Python code.
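+
+If you enable it, one way to drop privileges is to run the script under a dedicated unprivileged account (the `automarker` user and the script argument are illustrative; adapt them to however your server invokes `test_code.sh`):
+
+```bash
+# Run student code under a low-privilege account so it cannot touch server files
+sudo -u automarker ./test_code.sh submission.py
+```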
+
+You can also visit the exercises outside of an LMS. Testing will still work, though submission of grades is disabled.
\ No newline at end of file
-- 
GitLab