As Elaine Bloss noted in her article 'Feedback to large cohorts - what does learning technology have to offer?', the provision of quality feedback can be a particular challenge for those responsible for large cohorts. However supportive of learner-centred and activity-based assignments we may be, the sheer logistics of managing the marking and feedback process may push us towards a more didactic approach to teaching and learning. Of course, the e-learning community is always happy to rise to a challenge, and many are now developing tools to assist in the delivery of problem-based learning activities within large cohorts. So, whilst Elaine's article makes reference to the creation of GroupLog at the University of Bath, I see that UCLA have developed a tool known as Calibrated Peer Review (CPR).
Both GroupLog and CPR are online tools that enable text-based activities to be delivered to large cohorts, whilst enabling students to benefit from viewing the work of their peers. Although the two tools take a similar approach, the latter goes a step further by including peer (and self) evaluation in the mix.
There are two stages to completing an assignment through CPR. In stage 1 the student explores the source material associated with the activity before writing their own response; stage 2 is the evaluation stage. This begins with the appraisal of essays (known as 'calibration texts') written by the tutor specifically for the assignment - a process that validates the student's evaluation skills and ensures they are able to review the work of their peers effectively. Students are then required to review and evaluate three texts written by their classmates, and finally to evaluate their own response to the activity.
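For readers curious about the mechanics, the evaluation stage might be sketched in code along these lines. This is purely a hypothetical illustration, not CPR's actual implementation: the tolerance check, the round-robin allocation of three peer texts, and all names (`passes_calibration`, `assign_peer_reviews`, `CALIBRATION_TOLERANCE`) are my own assumptions about how such a workflow could work.

```python
from dataclasses import dataclass

# Hypothetical: how far a student's score may deviate from the tutor's
# on a calibration text before they are deemed uncalibrated.
CALIBRATION_TOLERANCE = 1

@dataclass
class Submission:
    author: str
    text: str

def passes_calibration(student_scores, tutor_scores,
                       tolerance=CALIBRATION_TOLERANCE):
    """A student is 'calibrated' if each of their scores on the tutor-written
    calibration texts falls within `tolerance` of the tutor's own score."""
    return all(abs(s - t) <= tolerance
               for s, t in zip(student_scores, tutor_scores))

def assign_peer_reviews(submissions, reviews_per_student=3):
    """Allocate each calibrated student three classmates' texts (here a simple
    round-robin), followed by their own response for self-evaluation."""
    n = len(submissions)
    assignments = {}
    for i, sub in enumerate(submissions):
        peers = [submissions[(i + k) % n].author
                 for k in range(1, reviews_per_student + 1)]
        assignments[sub.author] = peers + [sub.author]  # self-evaluation last
    return assignments
```

For example, with four submissions, `assign_peer_reviews` would give the first student the other three texts plus their own - mirroring the three peer reviews and final self-evaluation described above.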
This looks to be an excellent resource, although I cannot see anything on the CPR site about local hosting - a shame, as this is often a consideration for many institutions.