grading_scripts's People

Contributors

markus93, mmaher22, novinsh

Forkers

mohmmed-ali

grading_scripts's Issues

Code crash in filter_submissions

For some reason the code crashed on a student with ID B**097 for the first homework (B**097_2.ipynb); two digits of the ID are replaced with asterisks for privacy reasons. The traceback is as follows:

Traceback (most recent call last):
  File "main.py", line 151, in <module>
    filtered_submissions = filter_submissions(path, rerun_flag)
  File "main.py", line 28, in filter_submissions
    final_submissions.append(Submission(student_id, str(submissions[s]), trial_no, rerun_flag))
  File "/home/novin/workspace/school/lbd/MachineLearning/Meelis Kull/Grading_Scripts/Submission.py", line 20, in __init__
    data = json.dumps(json.load(f))
  File "/opt/anaconda3/lib/python3.6/json/__init__.py", line 299, in load
    parse_constant=parse_constant, object_pairs_hook=object_pairs_hook, **kw)
  File "/opt/anaconda3/lib/python3.6/json/__init__.py", line 354, in loads
    return _default_decoder.decode(s)
  File "/opt/anaconda3/lib/python3.6/json/decoder.py", line 339, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/opt/anaconda3/lib/python3.6/json/decoder.py", line 357, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

I encountered this error while testing opt=3 for the time spent by students (issue #5). I just removed the student's submission to get past it!
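
The crash comes from Submission.__init__ parsing the notebook with json.load, which raises JSONDecodeError on an empty or otherwise invalid .ipynb file. A minimal sketch of a guard, assuming the loading happens roughly as shown in the traceback (the warning format and the None sentinel are my own):

    import json

    def load_notebook_json(path):
        # Empty or truncated .ipynb files raise json.JSONDecodeError;
        # skip them with a warning instead of crashing the whole run.
        try:
            with open(path, encoding="utf-8") as f:
                return json.dumps(json.load(f))
        except json.JSONDecodeError as e:
            print(f"WARNING: skipping unparsable notebook {path}: {e}")
            return None

filter_submissions could then drop any submission whose data came back as None instead of aborting.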

Set the default grade to 0.0 if the student has possibly not attempted the task

Currently, the default grade is filled in as 1.0, while it could be filled in as 0.0 when a student has not attempted to solve the task (this could be detected automatically by checking the cell the student had to complete). This is important for keeping the cells in the grading spreadsheet empty for tasks where the student made no submission at all, with a dedicated zero for tasks they attempted but failed (the final result is the same, but this way is visually more appealing when reading the sheet!)

A more advanced scenario is to have certain unit tests and grade the task completely automatically (though it is not clear whether this is needed at all).
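
A minimal sketch of the automatic detection, assuming the answer cell still contains an unmodified template marker (the marker text and the cell lookup are assumptions):

    import json

    # Placeholder the homework template puts in the answer cell;
    # this exact marker text is an assumption for illustration.
    TEMPLATE_MARKER = "# YOUR CODE HERE"

    def task_attempted(notebook_path, cell_index):
        # True if the student's answer cell differs from the template.
        with open(notebook_path, encoding="utf-8") as f:
            nb = json.load(f)
        source = "".join(nb["cells"][cell_index]["source"])
        return source.strip() not in ("", TEMPLATE_MARKER)

The default grade could then be 0.0 when task_attempted returns False, while the spreadsheet cell stays empty when there is no submission at all.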

Tag the problematic submissions for timing extraction

Add a note for the students (in the written Excel file) whose timing extraction was not successful. At the moment, those who were ignored or whose timing could not be extracted (due to the absence of a submission or some other issue) are only printed for inspection as the output of option 3 of the script (the timing part).

Summary:

  • mark absent submissions
  • mark ignored submissions, including the reason (see the sketch below)
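
A minimal sketch of the tagging, assuming the results file is written with openpyxl and student IDs sit in the first column (the notes column and the sheet layout are assumptions):

    from openpyxl import load_workbook

    def tag_timing_problems(xlsx_path, problems):
        # problems maps student IDs to a short reason, e.g.
        # {"B12345": "no submission", "B67890": "timing field altered"}
        wb = load_workbook(xlsx_path)
        ws = wb.active
        for row in ws.iter_rows(min_row=2):
            student_id = row[0].value
            if student_id in problems:
                # Assumed layout: the note goes into column D.
                ws.cell(row=row[0].row, column=4, value=problems[student_id])
        wb.save(xlsx_path)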

Make the gathering process more convenient by adding a skip option

Currently we have two options when a grading notebook already exists: 1) overwrite, 2) append.
A third option, 3) skip, is needed for when we have already graded halfway through and there has been some change to conf.json: we need to keep the grades up to that task (so, skip those notebooks) and overwrite/append only the one that was problematic.

Related issues: #3
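
A minimal sketch of the extended prompt (the existing prompt's wording is an assumption; only the third choice is new):

    def ask_gather_mode(notebook_name):
        # Returns "overwrite", "append", or "skip" for an existing notebook.
        choices = {"1": "overwrite", "2": "append", "3": "skip"}
        while True:
            answer = input(f"{notebook_name} exists: 1) overwrite 2) append 3) skip? ")
            if answer in choices:
                return choices[answer]

The gathering loop would then leave a notebook untouched whenever "skip" is returned.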

Make the gathering process safer

If I have already graded some of the students' homework and accidentally run option 1 of the main script, it overwrites the graded notebooks without saving the previous grading progress.

Possibilities:

  • show a warning before overwriting, or back the notebooks up automatically (see the sketch below)
  • an additional option for adding new students' homework without overwriting everything
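
A minimal sketch of the safety net: copy the existing grading notebooks aside before option 1 regenerates them (the directory names are illustrative):

    import shutil
    from datetime import datetime
    from pathlib import Path

    def backup_before_overwrite(grading_dir):
        # Copy every grading notebook into a timestamped backup folder
        # so an accidental overwrite never loses grading progress.
        grading_dir = Path(grading_dir)
        stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
        backup_dir = grading_dir / f"backup-{stamp}"
        backup_dir.mkdir(parents=True, exist_ok=True)
        for nb in grading_dir.glob("*.ipynb"):
            shutil.copy2(nb, backup_dir / nb.name)
        return backup_dir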

Ignore irrelevant homework submissions

Some students have submitted a previous homework. This creates empty cells in the grading notebooks, since the begin and end flags of the current homework cannot be matched.

A better solution is to reject these submissions in the first place by checking the header of the homework, and perhaps also by checking that the flags have no match. This facilitates grading, since there is no need to double-check whether there was a problem with the extraction or the student uploaded the wrong submission.

This is also useful when extracting the timings. Right now, a sanity check on the number of expected timing fields guards against wrong submissions, or submissions that altered the timing field!
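
A minimal sketch of the header check, assuming the first cell of each homework notebook starts with a recognizable title such as "Homework 2" (the title format is an assumption):

    import json

    def is_current_homework(notebook_path, homework_no):
        # True if the notebook's first cell names the expected homework.
        with open(notebook_path, encoding="utf-8") as f:
            nb = json.load(f)
        if not nb.get("cells"):
            return False
        header = "".join(nb["cells"][0]["source"])
        return f"Homework {homework_no}" in header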

Original Homework URL is broken

Since we pass an absolute path for the homework directory in conf.json, and we usually start jupyter-notebook from a directory other than the root (/), the homework URLs are incorrect: the first couple of directories need to be replaced with Jupyter's root path, and /tree/ has to be added to the path for Jupyter's sake.
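
A minimal sketch of the rewrite, turning the absolute path from conf.json into a URL relative to the directory Jupyter was started from (the server address is an assumption):

    from pathlib import Path

    def homework_url(abs_path, jupyter_root, base_url="http://localhost:8888"):
        # /home/me/hw/B12345_2.ipynb with jupyter_root=/home/me becomes
        # http://localhost:8888/tree/hw/B12345_2.ipynb
        rel = Path(abs_path).relative_to(jupyter_root)
        return f"{base_url}/tree/{rel.as_posix()}"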

Confusing students' block

I think something could be done to make the blocks more distinguishable. The fact that a student's ID is placed after their solution is also confusing. A separating line might help.

Grading related subtasks together

We may want to grade related subtasks together, without having a separate notebook for each one, while each subtask still gets a separate column and note in the results spreadsheet.

So we need to add a feature whereby one notebook can carry multiple grades and comments, corresponding to the number of subtasks included in that notebook.
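
A minimal sketch of how the configuration and the resulting columns could look (the key names in the hypothetical conf.json entry are my own):

    # Hypothetical conf.json entry: one notebook, several graded subtasks,
    # each becoming its own grade and comment column in the spreadsheet.
    TASK_CONF = {
        "notebook": "task3.ipynb",
        "subtasks": [
            {"name": "3a", "max_grade": 1.0},
            {"name": "3b", "max_grade": 2.0},
        ],
    }

    def result_columns(task_conf):
        # One grade and one comment column per subtask.
        cols = []
        for sub in task_conf["subtasks"]:
            cols += [f"{sub['name']}_grade", f"{sub['name']}_comment"]
        return cols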

Extracting .zip submissions

Perhaps as a separate script, it would be good to have an automatic way of extracting the contents of zip-file submissions (mainly extracting the .ipynb).

Otherwise this has to be done manually, and with many zipped submissions it becomes time-consuming.
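
A minimal sketch of such a script, pulling only the .ipynb members out of each zipped submission (the directory layout is illustrative):

    import zipfile
    from pathlib import Path

    def extract_ipynb(submissions_dir):
        # Extract every notebook next to its archive, skipping macOS
        # metadata folders that often sneak into student zips.
        for zip_path in Path(submissions_dir).glob("*.zip"):
            with zipfile.ZipFile(zip_path) as zf:
                for member in zf.namelist():
                    if member.endswith(".ipynb") and not member.startswith("__MACOSX"):
                        zf.extract(member, zip_path.parent)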

Make it easier to grade students in different sessions

It would be beneficial if grading could be done in multiple passes, for example in two halves: grading some of the students before the deadline and grading the rest after the deadline has passed.

This is mainly needed to improve practice-session quality, since grading the homework gives a rough estimate of which tasks need more discussion in the practice. Also, since the deadline is close to two of the practice sessions, it is not easy to grade everything before those practices.

At the moment, grading students in different sessions/passes is a bit of a hassle when the results need to be put together.
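
A minimal sketch of putting two passes together with pandas (the file names and the student_id index column are assumptions):

    import pandas as pd

    def merge_passes(first_xlsx, second_xlsx, out_xlsx):
        # Combine two grading sessions; where both passes graded the
        # same student/task, the second pass wins.
        first = pd.read_excel(first_xlsx, index_col="student_id")
        second = pd.read_excel(second_xlsx, index_col="student_id")
        second.combine_first(first).to_excel(out_xlsx)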
