swcarpentry / collaborative-lesson-development

10 Simple Rules paper on collaborative lesson development

Home Page: https://arxiv.org/abs/1707.02662

License: Other

Languages: TeX 99.14%, Makefile 0.86%

collaborative-lesson-development's Introduction

Welcome to Software Carpentry

This is a meta repository to help navigate the numerous repositories and lessons of the Software Carpentry community on GitHub.

What is Software Carpentry?

Software Carpentry is a community of volunteer instructors who teach short workshops and develop lessons that empower researchers of all disciplines to learn about and improve the ways in which they create software and collaborate.

Code of Conduct

We are an open and inviting community with an actively enforced Code of Conduct. We value the involvement of everyone in this community - learners, instructors, hosts, developers, steering committee members and staff. We are committed to creating a friendly and respectful place for learning, teaching and contributing. All participants in our events and communications are expected to show respect and courtesy to others.

Lesson Repositories

| Lesson | Repository | Site |
| --- | --- | --- |
| The Unix Shell | swcarpentry/shell-novice | rendered |
| Version Control with Git | swcarpentry/git-novice | rendered |
| Version Control with Mercurial | swcarpentry/hg-novice | rendered |
| Using Databases and SQL | swcarpentry/sql-novice-survey | rendered |
| Programming with Python | swcarpentry/python-novice-inflammation | rendered |
| Programming with R | swcarpentry/r-novice-inflammation | rendered |
| R for Reproducible Scientific Analysis | swcarpentry/r-novice-gapminder | rendered |
| Programming with MATLAB | swcarpentry/matlab-novice-inflammation | rendered |
| Automation and Make | swcarpentry/make-novice | rendered |
| Instructor Training | carpentries/instructor-training | rendered |

Run a workshop

For a fee, we can help you bring a Software Carpentry workshop to your organization. This fee supports the ongoing work of our community: training instructors in all disciplines and, as a community, supporting and developing the lessons needed to improve researchers' skills and practices with regard to software and data.

Get your organization to become a member

Our member organizations are committed to supporting Software Carpentry so that they can have local instructors trained annually. Members also take an active role in helping to direct and support the sustainment and growth of the community.

Get involved

You can get involved with our community by subscribing to our newsletter, joining our mailing lists, or coming to one of our community events/meetings. Want to contribute to the lessons themselves? A list of our open GitHub issues is a good place to start, and each of our lessons has a CONTRIBUTING.md (example from shell-novice) that details how and what contributions are welcome. We use etherpads to manage much of our work and communications; we have an etherpad of etherpads to help you find what you're looking for.

Lesson template

We maintain a set of CSS and Jekyll scaffolding that can be used to help organize and create lessons in our style (swcarpentry/styles). Instructions for how to contribute are in swcarpentry/lesson-example.

Infrastructure project

We run many workshops every year. To coordinate all of our workshop activities we have been developing an open source tool called AMY. AMY is a Django application that helps us manage our volunteer network and track when and where we have run, and will run, workshops. It is an essential part of our day-to-day operations and is itself an open source project that welcomes contributions.

Related projects

We work closely with the following projects and share community and instructor capacity:

Experimental Lessons

These lessons were created as part of a contract to deliver advanced material. They've been taught a few times, but could benefit from some care and attention from the community. If you're interested in engaging with these lessons, please contact Jonah Duckles.

collaborative-lesson-development's People

Contributors

damienirving, gdevenyi, gvwilson, ianmilligan1, k8hertweck, raynamharris, twitwi


collaborative-lesson-development's Issues

Paper is overly sparse in its framing

The authors could do a better job of 1) motivating the need for this new approach to instructional material design and maintenance and 2) addressing criticisms that more conventional practitioners might have. These concerns could include reservations about the accuracy of open lessons, the coherency of modules made by contributors that likely span a range of experience in a given discipline, the reluctance of academic administrators to have their students taught with course materials developed by third parties, etc.

Double check for jargon

In the interest of attracting varied audiences to this article, I want us to be sure we double (and even triple) check for excessive jargon (related to both programming and education). One term that comes to mind is "forking."

Reviewer Response Letter

I have put a rough draft of the reviewer response up here in markdown format.

https://github.com/swcarpentry/collaborative-lesson-development/blob/master/reviewer-response.md

Revisions, comments, etc. greatly appreciated. The one area where I didn't think we had a full response for them was to this one:

I find Tip #3 a bit difficult to digest. The flow of the narrative becomes a bit cloudy when it turns into a 5-point lesson within the 10-point document. A reference has been made to a different publication, which should help - but a bit more clarity on how this collection of five points become "best practices" is necessary. Perhaps a paragraph that more clearly states the important content the authors extracted from Wiggins and McTighe and how they used them to develop these "best practices" would be easier to understand.

I did some hand-waving but any additional thoughts on that?

Auto-generate PDF or text on website?

Hi!

I'm very fascinated by this paper. I find the guiding principles so compelling.

As changes to this document come in, can the PDF be auto-generated? Or, could we have the text auto-generate onto a website somewhere (maybe using gh-pages)? It'd be useful to read this document in its latest form.

You all rock!!

Add acknowledgements

Making note here because I can't fix it right now: we need to acknowledge folks who have provided suggestions here on GitHub.

re-organize rules to improve flow

I'd like to suggest a reorganization to improve the flow. Rather than use my words, I'll use a revamped figure to illustrate one potential re-ordering. The number by the text corresponds to the current order so that you can see what I think could be moved up or down in the order.

[Figure: 2017-05-17-rules-03, a revamped diagram showing the proposed re-ordering of the rules]

Reorganize the section "Build community around lessons"

I enjoyed reading the d68fede version of the manuscript. Step 2 of "building community around lessons" introduces the challenges of keeping lessons up to date and working online. Reading that section so early in the paper, I felt those two points are a little nuanced and might fit better if merged with step 5, "encourage and empower contributors"; sections 3 and 4 provide more concrete strategies to serve the community.

The title also seems very similar to step 5. How different is building an online community from encouraging contributors?

I'm tempted to think that building an online community is more of a product of reaching the right people who have similar goals and experiences teaching the material in question, and who can give constructive feedback. To find a community like that online, one would probably need to bootstrap off of another established community. For example, 2 years ago I had a conversation with a UW Madison student who mentioned coding bootcamp workshops taught by their local Hacker Within group. So it would be interesting to read suggestions about allies to seek out. Encouraging people to think beyond the need to satisfy their own itch and to find a teaching community could lead into the next point about building modular lessons.

More feedback from community

From @ntmoore:

In the paper, you mention that online Q&A sites "present a chorus of explanations geared at different levels and needs" (line 593 in the tex source)

This idea feels like a reference to the "multiple representations" that experts use to solve Physics (and Chemistry, Math, etc) problems. Sort of relevant reference: https://drive.google.com/file/d/0By53x8SYAF1lOGlCSTVPTHpYaUk/view

For example, in physics, when we talk about the trajectory of a ball (kinematics), we might:
Draw a picture, write an equation, sketch x-v-a stacks (plots), draw a force diagram and sum forces, make a series of energy bar charts, or actually toss a few balls.

Some people in Physics Education refer to this diversity of ways of telling the story of what's happening as using "multiple representations." If it seems relevant, such a reference might be a useful hook for readers with Science/Math Education backgrounds.

In the paper, you seem to be saying that providing a bunch of different explanations is useful, because people have both different needs and different understandings of the tools available. It might be relevant to add that as people become more expert in a domain, their facility with all available representations should increase (indeed, that's probably a working definition of expertise).

So (perhaps), in providing many different representations/answers etc, you are helping learners understand the scope of the field they're being introduced to.

Thanks for sharing.

Learning objectives?

I believe that the brief discussion of learning objectives could be amended. Currently (585977c) learning objectives / concept inventories / etc. are covered first in the section for Rule 1:

Rather than itemizing prior knowledge and learning objectives, it can be helpful to
write learner profiles to clarify the learner’s general background, what they already
know, what they think they want to do, how the material will help them, and any
special needs they might have.

I'm a little bit surprised, since this seems to be recommending against explicit learning objectives, while the section for Rule 4 includes:

An example of a particular lesson development practice is reverse instructional
design [2]. When this is used, lessons are built by identifying learning objectives,
creating summative assessments to determine whether those objectives have been met,
designing formative assessments to gauge learners’ progress and give them a chance to
practice key skills, putting those formative assessments in order, and only then writing
lessons to connect each to the next. This method is effective in its own right, but its
greatest benefit is that it gives everyone a common framework within which to
collaborate.

which is pretty clear on the benefits of such objectives.

I find explicit learning objectives to be beneficial in relation to many of the rules here, and could see them as a common thread throughout much of this paper.

Rule 1: Learning objectives help define the audience. Learners who have already achieved many/all of the objectives are too advanced. Learners who are not prepared to achieve them need additional background from other lessons.

Rule 3: Learning objectives promote modularity. Independent modules have non-overlapping objectives and the material is designed to achieve those objectives (given appropriate learner background).

Rule 4: As described above, learning objectives lend themselves to best practices in lesson development and provide a "common framework" for collaboration.

Rule 7: Learning objectives provide a framework for lesson assessment. Are objectives appropriate? Is the material successful at teaching to the objectives?

Apologies for suggesting the addition of content without suggesting what to replace. If this is a major issue, I could imagine combining/condensing rules 3 and 8, and adding use of learning objectives as one of the 10 rules.

Switch to this repository

Please edit this PR and fill in the [] beside your name when you have forked this repository.

  • Gabriel A. Devenyi
  • Rémi Emonet
  • Rayna Harris
  • Kate Hertweck
  • Damien Irving
  • Ian Milligan
  • Greg Wilson

mention code of conduct

Rule 3 has a paragraph on the Programming Historian ombudsperson, so I thought it would make sense to add a brief mention of the SWC/DC code of conduct.

Figure 2

This diagram could be improved with a few arrow movements and slightly different wording. Active voice is always an efficient way to deliver information, so I kindly suggest that the authors make a bit more fruitful use of it.

For instance, "Lesson materials are outlined and drafted via community input" could be more clearly stated by saying instead: "Outline and draft lesson materials with community input."
"Materials are updated continually" could also be switched into active voice.
"Lesson release..." --> "Release a stable version of the lesson."

Also, the arrow from 'Recruit community...' into 'Lesson materials...' could merge with the arrow going from 'An individual or group...' towards 'Lesson materials...'. That would add clarity.

Figures and other 10 simple rules papers

I can't find where Fig 1 is cited in the manuscript text (although I see where it's placed relative to the text?), but I'm also a total LaTeX noob.

At one point, Rayna had mentioned another figure (Fig 2?) which compared traditional publication to the model we emphasize. This would be a great way to help connect our paper to some of the other 10 Simple Rules papers, such as the one on preprints.

While we're thinking about it, some of the content from the 10 simple rules paper on MOOCs may be relevant?

Management

Great paper. I think it needs something on management. Library Carpentry uses a maintainer model for its lessons, and - for me - the crucial aspect of this role is to ensure:

  1. That the lesson doesn't suffer from bloat (because many authors make for bloaty writing), and
  2. That discussions that have been had and rejected don't get had again (that is, they remember what the closed issues are).

In a way this is a partner to section '4 Teach best practices for lesson development'. That is, teaching best practice is one thing, but it means nothing if no one is there to ensure that best practice is applied. Perhaps it fits in section '5 Encourage and empower contributors' in that we are talking about giving some contributors a sense of ownership, a specific and defined role in managing the ongoing development of a lesson, and credit for that work (which you don't mention at all, I think).

I'm happy to help write this (depending on your timescales!)

Feedback

Greg asked for feedback on this via the Software Carpentry discussion email list. Overall, I think this looks like a useful summary of how to build lessons collaboratively. It's also useful for me to see this now as I think we'll be discussing exactly these issues in a mentoring group in a month or so.

  • References added. I think it would be useful to have a fair number of additional references - I've mentioned some of these below, but pretty much every section could benefit from additional links to the literature.
  • Added. I think it would be worth being more specific about the learner profiles at the end of rule 1 (clarify your audience). It may be useful to point to the software carpentry learner profiles as an example, or to add a reference or two to wherever the idea is best or first described.
  • Added. It may also be useful to define what "done" could look like early in the process of developing a lesson or lessons. This could fit into rule 1 and would help focus lesson developers in a similar way to the learner profiles.
  • We don't actually do this, so it would be hard for us to recommend it. I wonder if it is worth mentioning the utility of recording the decisions that are made when developing a lesson somewhere that future developers can refer to. I'm not convinced that this is something that gets done very well in Software Carpentry's lesson development, but the decisions that lead to Y being taught before X, and to Z not being included in the lesson, are important. Future lesson developers could benefit from knowing why these choices were made, and what the evidence for the choice was. (Maybe an earlier version taught X and Y in the logical order, but assessment and feedback indicated that this didn't work very well. Knowing this would be useful if the order was being reexamined.) I suppose this could fit in Rule 2, 4, 5 or 7.
  • Filed as #22. While the statement at the end of the first paragraph of Rule 4 ("Unfortunately, since most college and university faculty have little or no training in education, this knowledge and expertise is rarely transferred into classroom practice.") is certainly true, it would strengthen the argument if you could point to some data to back this up.
  • I think this is more detail than we can get into in a "10 Rules" paper. The issue of understanding the frame of reference of learners described at the end of Rule 7 could also indicate other issues around, but not directly related to, the lessons being developed (e.g. a lack of other lessons, a lack of opportunity for learners to attend more appropriate workshops, or failure to communicate the objectives or prerequisite material).
  • See #84. I wonder if it is worth mentioning that it could be useful to separate issues highlighted by large scale surveys into those that can be addressed by directly changing the collaboratively developed lesson, and those that are better tackled elsewhere (e.g. by the host organization, or by creating a new lesson to supplement the existing one). This could act as a link between Rule 7 and 8, and links to Rule 10.
  • Finally, I spotted an extra closing double quote after 'beginner' on line 58.

Figures aren't used

Aside from their parenthetical mention at the end of the Introduction, neither of the figures is referred to in the main set of rules. I am not sure if both of these are necessary. Figure two, especially, encompasses a number of rather complex ideas about community building and lesson development that are not directly discussed at any length within the text. The second figure seems the more useful of the two, but in any event, I think both warrant more attention in the text or in captions if they are to be kept.
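
If the figures are kept, citing them explicitly from the body text is straightforward. A minimal LaTeX sketch, assuming the graphicx package and a standard figure environment; the file name, caption, and label below are placeholders, not taken from paper.tex:

    \begin{figure}[ht]
      \centering
      \includegraphics[width=\linewidth]{lesson-lifecycle}   % placeholder graphic name
      \caption{Community input feeds every stage of lesson development.}   % placeholder caption
      \label{fig:lifecycle}
    \end{figure}

    % later, in the running text of one of the rules:
    Figure~\ref{fig:lifecycle} summarizes this cycle.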

Minor issue checklist

  • Line 21 - the authors should distinguish the academic setting from the academic teaching setting, since the academic setting does very well in open source software development.
  • Lines 24-27 seem to be a personal statement against the academic system.
  • It would be useful to provide the URL in the text for rule #3 since online best practices for lesson development circles back to online collaborative lesson development.
  • line 141 - question their 'own' authority
  • In Rule #5, it would be helpful if the authors redirected to Rule #6 when discussing recognizing contributions.
  • Line 251, sustainable lessons 'in all domains'
  • Acknowledgements usually go at the end, not before the 10 rules?
  • pg 2, line 50: who -> whom
  • pg 2, line 53: which -> that
  • pg 3, line 77: has -> have
  • pg 3, line 78: I'd say 'formal training' here. There are many ways to learn to do something. Without this adjustment you risk coming off as dismissive or even derogatory towards faculty, which I don't think is your intent.
  • pg 4, line 117: I'd also revise this bit about programmers 'looking down' on Google Docs and the like. It contributes to the notion that programmers think they are better than others, which is not something that belongs in a manuscript about inclusive lesson development. Recent versions of Google Docs also allow for 'suggesting' mode edits, which could be considered a version of 'pre-merge review', in that the changes can be discussed, accepted, or rejected after they are initially made/proposed.
  • pg 5, line 168: look -> looks
  • pg 5, line 168: it has -> they have

Add brief mention of SWC mentoring subcommittee

I'd like to add a short mention of the Software Carpentry mentoring subcommittee because I think this has been a very successful program for building community. I considered adding this to rule #7 (evaluate lessons) because I think the insight from the mentoring program does help evaluate and improve lessons, but I think mostly it helps build community by bringing together instructors from all over the world and giving them a platform to share their expertise and get feedback.

Christian T. Jacobs <[email protected]> feedback

Hi everyone,

I saw the link to your draft paper on the SWC Newsletter. I found it to be a very interesting read and in good shape. One piece of feedback that you may wish to take on board:

In Section 4 it might be worth mentioning that, in addition to a collaboration tool's steep learning curve, the use of certain file formats and embedded images can further complicate pre-merge review. For example, we found (Git-tracked) IPython Notebook material to work very well when developing an undergrad programming course for geoscientists, but we often had to address time-consuming merge conflicts (both for the lecturers and the students) caused by e.g. re-running a piece of embedded Python code that updated an embedded plot/image, or using a more recent version of Notebook which caused small changes in header info, etc, and then merging/pulling:

"Additionally, in 2013, the IPython Notebook format was adopted (see above). The Git version control system was used in an attempt to apply any updates/corrections to the lecture notes as gracefully as possible, and also gave the students an insight into using version control to manage their work. However, this turned out to be somewhat counterproductive as merge conflicts frequently had to be resolved manually by the teaching assistants, which lowered the confidence the students had in the system they were using and added to the number of commands the students had to remember to download the latest revision of the lecture material. The majority of students in the class struggled to cope with both Linux and Git, which were both completely new to them."
http://dx.doi.org/10.5408/15-101.1 , https://arxiv.org/abs/1505.05425

Perhaps add a note to encourage readers to keep this in mind?

Hope this helps.

Kind regards,
Christian

Digesting tip #3

I find Tip #3 a bit difficult to digest. The flow of the narrative becomes a bit cloudy when it turns into a 5-point lesson within the 10-point document. A reference has been made to a different publication, which should help - but a bit more clarity on how this collection of five points become "best practices" is necessary. Perhaps a paragraph that more clearly states the important content the authors extracted from Wiggins and McTighe and how they used them to develop these "best practices" would be easier to understand.

Figure 1

In Figure 1, the images supporting Tip #7 are not very easy to interpret. I would think of something else here. Perhaps an old-style scale instead of 'great' and 'terrible'? Or something similar.

make pdf fails to build

Building the PDF seems to fail with the following error.

scopatz@artemis ~/collaborative-lesson-development master $ make pdf
pdflatex paper
This is pdfTeX, Version 3.14159265-2.6-1.40.17 (TeX Live 2016/Debian) (preloaded format=pdflatex)
 restricted \write18 enabled.
entering extended mode
(./paper.tex
LaTeX2e <2017/01/01> patch level 3
Babel <3.9r> and hyphenation patterns for 3 language(s) loaded.
(/usr/share/texlive/texmf-dist/tex/latex/base/article.cls
Document Class: article 2014/09/29 v1.4h Standard LaTeX document class
(/usr/share/texlive/texmf-dist/tex/latex/base/size10.clo))
...
Output written on paper.pdf (7 pages, 229939 bytes).
Transcript written on paper.log.
bibtex paper
This is BibTeX, Version 0.99d (TeX Live 2016/Debian)
The top-level auxiliary file: paper.aux
The style file: plos2015.bst
I found no \bibdata command---while reading file paper.aux
Warning--I didn't find a database entry for "hlw"
Warning--I didn't find a database entry for "wiggins-mctighe"
Warning--I didn't find a database entry for "lessons-learned"
Warning--I didn't find a database entry for "instructor-training"
Warning--I didn't find a database entry for "how-to-teach-programming"
Warning--I didn't find a database entry for "git-survey"
Warning--I didn't find a database entry for "shell2015"
Warning--I didn't find a database entry for "shell2017"
Warning--I didn't find a database entry for "choral-explanations"
Warning--I didn't find a database entry for "producing-oss"
(There was 1 error message)
Makefile:13: recipe for target 'pdf' failed
make: *** [pdf] Error 2
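
For reference, the "I found no \bibdata command" message and the string of missing-entry warnings usually mean that paper.tex never names a bibliography database, so BibTeX has nothing to search for keys such as "hlw" or "wiggins-mctighe". A minimal sketch of the usual fix, assuming the references live in a file called refs.bib (the file name is a guess, not taken from the repository):

    % near the end of paper.tex; the log shows plos2015.bst is already being picked up
    \bibliography{refs}   % points BibTeX at refs.bib (assumed name, given without the extension)

Re-running make pdf should then let the bibtex step the Makefile invokes find the database and resolve the citation keys.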

Add clickable refs to rules?

If we add anchors and clickable refs, will the PLOS paper have them? Depends on their final publication process, I guess...
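
A minimal sketch of how clickable cross-references could work, assuming the hyperref package and that each rule is its own \section (the section title shown is one of the paper's rules, but the label name is illustrative):

    \usepackage{hyperref}   % in the preamble: makes \ref targets clickable in the generated PDF

    \section{Encourage and empower contributors}
    \label{rule:empower}    % illustrative label name

    % elsewhere in the text:
    ... as discussed in Rule~\ref{rule:empower} ...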

in defence of git

Interesting article, thanks for sharing. The comment about git being "a famously user-hostile tool" seems a little harsh, I thought. It has a steep learning curve for sure, and some functionality that I struggle to get my head around, but "hostile" is a strong word! The comment feels like it deserves a citation or some explanation at least.
