MarkUs Blog

MarkUs Developers Blog About Their Project

Archive for the ‘Course Administration’ Category

Marking Scheme for Tara, Fernando, and Farah


Since Tara, Fernando, and Farah have been focusing on the development of features, they have decided to use a similar evaluation scheme.

Specifically, Tara, Fernando, and Farah will each be graded individually based on the following marking scheme:

1) Feature completion: 30%

  • For simple grade entry, this means:
    • An instructor can create a grade entry form and can assign TAs to enter the marks
    • An instructor and the appropriate TAs can enter grades into the form
    • An instructor can upload/download a grade entry form in CSV format
    • Students can view their grades for a grade entry form
  • For logging, this means:
    • A developer can log a message
    • A developer can specify the desired log level
    • An administrator can specify the desired type of rotation
    • An administrator can specify the desired file names and path of the log files
    • Messages are localized using I18n
  • For the notes system, this means:
    • Students can see nothing related to the notes
    • An instructor or a TA can create new notes on groupings
    • An instructor or a TA can view existing notes on a grouping when creating a new note
    • A user can edit or delete notes created by him/herself
    • An admin can edit or delete any notes
    • An instructor or a TA can see an aggregate view of notes on the new Notes tab
    • The association of notes with groupings should be extended so that notes also work for assignments and students
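
As a rough illustration of the logging capabilities listed above (choosing a log level, rotating log files), here is a minimal sketch using Ruby's standard-library Logger; the file name and rotation limits are invented for the example and are not MarkUs's actual configuration:

```ruby
require 'logger'

# Size-based rotation: keep up to 5 rotated files of roughly 1 MB each.
# (The file name and limits here are illustrative, not MarkUs's settings.)
logger = Logger.new('markus.log', 5, 1_048_576)

# The developer picks a log level; messages below it are discarded.
logger.level = Logger::INFO

logger.debug('dropped: below the INFO threshold')  # filtered out
logger.info('grade entry form created')            # written to markus.log
logger.error('failed to parse uploaded CSV')       # written to markus.log

logger.close
```

Time-based rotation is also available (`Logger.new('markus.log', 'daily')`); in a Rails application these settings would normally live in the environment configuration rather than inline like this.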

2) Feature testing: 20%

  • For simple grade entry, this means:
    • There should be tests for instructors, TAs, and students to ensure the above functionality exists as desired
  • For logging, this means:
    • Messages are logged properly
    • Log files are rotated properly
    • Log levels are differentiated, and messages are sent to the corresponding log file
  • For the notes system, this means:
    • There should be tests for instructors, TAs, and students to ensure that the functionality does or doesn’t exist as desired

3) Documentation: 20%

  • The code should be documented
  • The tasks for this feature should be split up into tickets with clear descriptions
  • There should be a document posted under “MarkUs Component Descriptions” on the DrProject site that explains the current state of the feature, future plans, etc.
  • There should be a short screencast of the feature posted on the DrProject site if applicable

4) Overall Process: 30%

  • Blog posts, status reports, or review requests indicate steady progress throughout the term
  • A thorough design process was followed
  • Demonstrated good programming practice
  • Consulted with other team members about design/implementation decisions and/or helped other team members (e.g. through blog posts, review requests, etc.)
  • For Tara, Overall Process also includes maintenance (i.e. other tickets)

Fernando would also like the following breakdown for these categories:

  1. Feature completion (30%)
    • Logging – 15%
    • Notes System – 15%
  2. Feature testing (20%)
    • Logging – 10%
    • Notes System – 10%
  3. Documentation (20%)
    • Logging – 10%
    • Notes System – 10%

Tara would also like the following breakdown for these categories:

  1. Feature completion (30%)
    • Maintenance – 10%
    • Notes system – 20%
  2. Feature testing (20%)
    • Maintenance – 5%
    • Notes system – 15%
  3. Documentation (20%)
    • Maintenance – 5%
    • Notes System – 15%

Written by Farah Juma

October 30th, 2009 at 2:25 am

Posted in Uncategorized

Proposal: Grading Scheme Severin


Here is a possible way my grade for this course could be evaluated. If some of the criteria are too vague, or this wouldn’t work for other reasons, I’d be happy to receive feedback (particularly from faculty).

  • Team performance (worth 40%). I would evaluate the team grade by peer evaluation. There should be a small questionnaire to be answered by each member of our team. Questions I’d ask: Did anybody in the team help you when you hit a roadblock (if ever)? If yes, who? How often? Did you get feedback when you needed it? For example, did you get good feedback on your design proposal? If you could redo this term working on MarkUs, what would you do differently? Overall, how would you rate the team performance (1-10, where 10 is the highest mark)? How did reviews work out for you? Did they help improve your Rails skills? Did you get good feedback? Individual answers will be sent to Mike, who will then set the final grade for the team performance part. When answering the questionnaire, everybody should corroborate their answers with specific facts.
  • Individual evaluation (worth 50%). For my individual evaluation I would expect that the following criteria would be considered:
    • Given that I have been working on MarkUs before the start of the term, was I approachable and helpful in getting new developers up to speed/solving roadblocks?
    • A focal point for me this term is to write a paper on automated testing. This should provide a solid basis for the future developers who will implement this feature. That is, the expected outcome is a document that specifies requirements, possible problem points, and the advantages/disadvantages of possible scenarios (this accounts for 20% of the grade).
    • Progress on and completion of the tickets assigned to me. This includes complete test suites for submitted code (where possible), with the corresponding reviews posted on markusproject. Code has then been integrated into the main source code branch and tested for regressions. Each chunk of submitted code includes appropriate documentation. Tickets have been used appropriately throughout the development process, and any necessary documentation has been created on the wiki. Standards for testing and code quality are met if the code passes the review process.
    • A significant chunk of my work has been maintenance and support for the MarkUs instances installed at the University of Toronto (fixing and testing bugs in branches/release 0.5; creating and testing patches).
    • General participation and administrative tasks (participated in IRC meetings, provided meaningful feedback, wrote blog posts, took meeting minutes as arranged, collected status reports, etc.).
    • Has the agreed-upon process been followed when working on MarkUs? Did I make meaningful progress working with the team?
  • Writing requirement (worth 10%). Have the assigned writing tasks been completed as requested? How much effort did I put into carrying them out?

What do you think? Please feel free to drop me a comment.

Written by Severin

October 28th, 2009 at 9:45 am

Posted in Uncategorized

How should your work in the course be evaluated?


Greg has asked everyone to design their own evaluation plan.   There are two separate goals here:

  1. Get you thinking about how to define the success of your project.  How will you know if you have succeeded?
  2. Get you thinking about the things that you are doing that could be evaluated.  (More along the lines of a performance evaluation at a job.)

Perhaps the most obvious definition of success is that the feature you were asked to implement is complete and works.  Now define “complete” and “works”.  Think about the components of the feature and what you are doing to give the users/customers confidence that the feature works.

One aspect of “complete” that I would like to highlight is the documentation left behind.  Could a new developer go to our web sites and find information on the design decisions that were made, on the state of each feature in terms of known problems, or on future enhancements?  The tickets should be up to date and clear.  There should be some kind of document that describes the current state and future plans (probably a short one).

Since this is a course and I care about not only the end product, but also the process we used to get there, and the learning experience(*), other things you might think would contribute to your grade include: participation level, willingness to help other students, willingness to participate in reviewing code, demonstrating “good” programming practice, consultation and design process, demonstrating steady progress.  I’m sure you can think of others.

I think it is probably appropriate to have at least 3 different evaluation mechanisms.  Mélanie, Simon, and Gabriel fall under the evaluation scheme of their course, so it is probably appropriate for them to submit primarily that scheme.  Tara, Fernando and Farah have been focusing on the development of their features, and may want to use a similar evaluation scheme.  Severin has been doing more maintenance work and team lead kind of work, so his evaluation scheme should include those components.

A final tip.  Please don’t try to make the evaluation scheme too fine-grained.

* Can you tell I’ve been going to curriculum and teaching evaluation meetings?

Written by Karen Reid

October 23rd, 2009 at 10:43 am

Posted in Uncategorized

Who is collecting punch-lines? Who is taking minutes?


There are 7 weeks left this term. For the remaining weeks, I am creating this blog post so that everybody knows who is collecting punch-lines and who is taking minutes. I hope this will help you keep track of it. The week numbers are the numbers your favorite calendar application shows you (1-52) 🙂

Week 37: punch-lines: Karen, minutes: Mike

Week 38: punch-lines: Mike, minutes: Tara

Week 39: Code Sprint, updates by Farah and Mike

Week 40: punch-lines: Tara, minutes: Fernando

Week 41: punch-lines: Fernando, minutes: Melanie

Week 42: punch-lines: Melanie, minutes: Severin

Week 43: punch-lines: Severin, minutes: Gabriel

Week 44: punch-lines: Gabriel, minutes: Simon

Week 45: punch-lines: Simon, minutes: Farah

Week 46: punch-lines: Tara, minutes: Fernando

Week 47: punch-lines: Melanie, minutes: Severin

Week 48: punch-lines: Gabriel, minutes: Simon

Week 49 (post-mortem): punch-lines: Mike, minutes: Farah

Written by Severin

October 7th, 2009 at 4:13 pm

Posted in Uncategorized