MarkUs Blog

MarkUs Developers Blog About Their Project

Archive for April, 2010

Introducing MarkUs 0.7.0!

with 2 comments

Wow, what a great semester!  The team worked really hard, and should be very proud of themselves.  Besides all of the fixed bugs and the newly written tests, the MarkUs team cranked out the following new features:

  • A more stable, polished, and usable notes system.  Notes can now be applied to students, groups, and assignment submissions
  • An alternative, “flexible” marking scheme type that allows graders to input arbitrary numerical grades, as opposed to the more strict rubric marking scheme
  • A new marks spreadsheet feature that allows instructors and graders to input grades from assignments, quizzes, or tests, in an Excel-like web-based spreadsheet.  Grades can also be uploaded and downloaded via CSV file
  • The table of student submissions can now be bookmarked, and the back-button works correctly
  • Lots of bug fixes and usability improvements

So what are you waiting for?  Get MarkUs 0.7.0 right now!

Contributors:

Benjamin Vialle
Christian Jacques
Nelle Varoquaux
Victoria Mui
Joseph Maté
Robert Burke
Bryan Shen
Brian Xu
Fernando Garces
Farah Juma
Severin Gehwolf
Mike Conley

Written by m_conley

April 28th, 2010 at 11:16 pm

Posted in Uncategorized

In-memory database speeds up tests, but not by as much as expected

without comments

It has been observed that test suites using Machinist are slower than those using transactional fixtures. This is because the bottleneck of test performance lies in database operations: Machinist normally issues many more database queries than transactional fixtures do. To solve the performance issue related to Machinist, we should either speed up database operations or reduce the number of database queries.
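To make the difference concrete, here is a rough sketch (the Section blueprint and its attribute are illustrative, not the real MarkUs blueprints) of why a Machinist-based test hits the database so much more often than a fixture-based one:

require 'machinist/active_record'

# Illustrative blueprint (not the real MarkUs one): the attribute block runs
# for every make() call, and every make() issues at least one INSERT.
Section.blueprint do
  name { "Section #{rand(10_000)}" }
end

# A Machinist-style test builds all of its data through the database at test time:
10.times { Section.make }   # ten INSERTs, plus any INSERTs for associated records

# Transactional fixtures, by contrast, load the fixture rows once for the whole
# run and wrap each test in a transaction that is rolled back afterwards, so an
# individual test triggers very few write queries.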

There is a type of database known as an in-memory database. As its name suggests, it stores data in memory instead of on the file system. Memory access is much faster than file access, so an in-memory database should have a performance advantage over a normal file-based database. By using an in-memory database for testing, database operations become faster and test speed should go up.
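Outside of Rails, an in-memory SQLite3 database is simply a connection whose “file” is the special name :memory:. A minimal sketch using the sqlite3 gem (the table and data are made up for illustration):

require 'sqlite3'

# ":memory:" keeps the whole database in RAM; it disappears when the
# connection (or the process) goes away.
db = SQLite3::Database.new(':memory:')
db.execute('CREATE TABLE sections (id INTEGER PRIMARY KEY, name TEXT)')
db.execute("INSERT INTO sections (name) VALUES ('L0101')")
puts db.get_first_value('SELECT COUNT(*) FROM sections')   # => 1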

We tried in-memory SQLite3 following the instructions at http://www.mathewabonyi.com/articles/2006/11/26/superfast-testing-how-to-in-memory-sqlite3/. I expected in-memory SQLite3 to give an enormous speed boost, but an experiment comparing PostgreSQL vs. in-memory SQLite3 shows that it is no more than 20% faster.

The experiment runs several Ruby test files with the test database set to PostgreSQL and to in-memory SQLite3 respectively. Here is the configuration in config/database.yml that sets the test database to in-memory SQLite3 and to PostgreSQL:



# set up the test db as in-memory SQLite3

test:
  adapter: sqlite3
  database: ":memory:"
  verbosity: quiet




# set up the test db as PostgreSQL

test:
  adapter: postgresql
  encoding: unicode
  database: markus_test
  username: markus
  password: markus


I picked test/assignment_test.rb as the first benchmark because this test suite uses Machinist and performs heavy database operations. On my laptop, test/assignment_test.rb takes around 15.6s with in-memory SQLite3 and around 18.5s with PostgreSQL. In-memory SQLite3 is only about 18% faster for a typical Machinist test.

The other benchmark I use is a temporary source file, performance_test.rb in the root directory of the project:



# markus_root/performance_test.rb

ENV["RAILS_ENV"] = "test"
require File.expand_path(File.dirname(__FILE__) + "/config/environment")
require File.join(File.dirname(__FILE__), "test/blueprints/helper")

# empty every table so the benchmark starts from a clean database
(ActiveRecord::Base.send :subclasses).each { |c| c.delete_all }

t1 = Time.new
5000.times { Section.make }   # create 5000 records through Machinist
t2 = Time.new

puts "time span: #{t2 - t1}s"


It gives 53.6s for PostgreSQL and 45.2s for in-memory SQLite3. In this case, in-memory SQLite3 is about 19% faster.

19% is not as good as expected, is it? Yet this is understandable. Using an in-memory database is not the same as keeping objects within the memory space of the Ruby application itself: every database operation still requires interprocess (or sometimes network) communication between the Ruby application and the database, and that communication is slow in its own right. I suspect this communication accounts for a large fraction of the cost of each database operation, which is why we do not see a dramatic speed-up from in-memory SQLite3.

For this reason, I think reducing the number of database queries is a better way to improve test speed. This is supported by the running time of an earlier version of assignment_test that used transactional fixtures: that test suite runs in 11.6s even with PostgreSQL as the database, compared to 15.6s for the Machinist version on in-memory SQLite3. So my conclusion is that in-memory SQLite3 is faster than file-based databases, but that is not a reason to abandon transactional tests, because transactional tests are much faster than non-transactional ones.
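For comparison, this is roughly what the transactional-fixture style looks like (a hypothetical sketch; the fixture name and assertion are made up, not taken from the real assignment_test):

class AssignmentTest < ActiveSupport::TestCase
  # Each test runs inside a transaction that is rolled back afterwards, so the
  # fixture data only has to be written to the database once per run.
  self.use_transactional_fixtures = true
  fixtures :assignments

  def test_assignment_is_valid
    assignment = assignments(:assignment_1)   # read from preloaded fixtures, no INSERT
    assert assignment.valid?
  end
end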

Written by bryanshen

April 19th, 2010 at 2:32 am

Posted in Uncategorized

New Criterion Design Document – Draft

with 3 comments

Hi all,

Bryan and I have been working closely to refactor the whole project to adopt our new design. In the meantime, we realized that we may not finish the whole refactoring before the end of the term, since it pretty much requires refactoring all of the files (not really all, but quite a lot). We prepared a document presenting our design principles and refactoring plan, both for future reference and to make sure we don’t go extremely wrong. 😛

I wrote this rough draft to present the basic ideas. It might not be detailed enough; however, it does show the critical points of the new design. We will revise it soon.

Please feel free to ask questions and make suggestions regarding the new design.

New Criterion Doc

Written by Brian Xu

April 14th, 2010 at 9:18 pm

Posted in Uncategorized

Screencast: Entering marks for a grade entry form

with 4 comments

Written by Farah Juma

April 4th, 2010 at 10:43 pm

Posted in Uncategorized

Wrapping up simple grade entry

with one comment

This semester has flown by quickly! Here’s a summary of all the things that have been added to the simple grade entry feature this semester:

  • Table view – I worked on some performance issues related to the grades table. I also implemented some new features for the table.
    • Instructors can now enter grades for grade entry forms, and the grades are automatically saved as they are entered. Each student’s total mark is automatically updated as the grades are entered. If a valid grade is entered, the table cell’s background colour changes to blue and then fades back to white. If an error occurs, the cell’s background colour changes to red. Since it is possible for a student to have missed a test, blank marks for questions are considered valid.
    • The grades for all the students no longer appear on a single page. Basic pagination has been implemented. In addition, alphabetical pagination has been implemented so that an instructor can use a drop-down menu to quickly jump to a page containing a particular subset of students based on their last names.
    • Both unit tests and functional tests have been written for these features.
  • Student interface – Students can now see their marks for grade entry forms.
    • A student’s grade entry forms now appear on his/her main page when he/she logs in. For grade entry forms for which marks have been entered and released, a student will be able to see his/her total mark as well as the class average on the main page. The student can click on a particular grade entry form to see the question-by-question breakdown of the marks. It’s possible that a student will have missed a test for some reason. If this is the case, the student’s mark will appear as N/A.
    • Both unit tests and functional tests have been written for this feature.
  • Releasing/unreleasing the marks – Instructors can now specify which marks to release/unrelease.
    • From the Grades page, an instructor can now release or unrelease the marks for particular students or for all the students. Once a student’s marks have been released/unreleased, the marking state column of the grades table gets updated appropriately. Both releasing and unreleasing the marks are operations that are logged for future reference.
    • Both unit tests and functional tests have been written for this feature.
  • Uploading/downloading the marks – Instructors can upload/download the grades as a CSV file.
    • From the Grades page, an instructor can choose to download the entire grades table as a CSV file. Alternatively, an instructor can choose to upload the grades from a CSV file. In the latter case, after attempting to upload the marks, a success/error message appears on the page indicating how many updates were valid/invalid.
    • A review request has been submitted for this feature. Ticket #634 has been created for the unit tests and functional tests for this feature.
  • Machinist tests – This semester, many of us converted existing tests to Machinist.
    • The tests for grade entry forms have been converted to Machinist, and all of the new tests I wrote this semester use Machinist as well (a rough sketch of the style follows this list).
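As a rough illustration (the blueprint, attribute, and association names below are assumptions, not the actual MarkUs code), a converted test builds its grade entry data through Machinist rather than fixture files:

class GradeEntryFormTest < ActiveSupport::TestCase
  def test_building_a_form_with_machinist
    # Build a form and three questions through the (assumed) blueprints;
    # each make() saves a record to the test database.
    form = GradeEntryForm.make
    3.times { GradeEntryItem.make(:grade_entry_form => form, :out_of => 10) }

    assert_equal 3, form.grade_entry_items.size
  end
end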

Current State / Future Work

Simple grade entry has come a long way since I first started working on it back in September! There are a few tickets left over for some UI-related issues. (All tickets for grade entry are tagged with “Simple Grade Entry”. They’re probably a great starting point for anyone new to MarkUs!) In addition, the work that is being done to assign TAs to specific rubric elements can probably be used in a similar fashion to assign TAs to specific questions for grade entry forms. Alternatively, TAs could be given permission to modify the entire table. (This simply requires enabling the table view for TAs as well as instructors.)

Working on the grade entry feature has been a blast! I’ve learned a lot and am really glad I got to be a part of the MarkUs team!


Written by Farah Juma

April 3rd, 2010 at 5:53 pm

Posted in Uncategorized