Goal
Peer review is important for developing collaboration skills. It also lets you see how data scientists can come up with different solutions to the same problem. As you perform your peer review, keep two goals in mind: be thorough and be constructive.
Being thorough means reviewing your peer’s work with a fine-toothed comb. If they have an error or an omission in their repo, I want you to do your best to find it before I do!
At the same time, be constructive. Good peer review is a gift - an opportunity to lift each other up. Respectfully draw your peer’s attention to where and how they can improve their work.
Guidelines for respectful feedback
- Focus on the code, not the person. E.g., “I think this comment needs more explanation” instead of “You didn’t explain this well enough”.
- Try to explain why, not just what. E.g., “If you change this variable from `nutrientMgL` to `nutrient_mg_l`, then your variable naming will be more consistent” instead of “Make this `nutrient_mg_l`”.
- Acknowledge good work! Highlight parts of the code where your peer has successfully met specs.
Instructions
- Create a peer review conga line. Everyone should give feedback to, and receive feedback from, two different people.
- Fork the repo of the student you’re giving feedback to, then clone your fork to your computer (see the command sketch after this list).
- Create a markdown file in your forked repo called `peer-assessment.md`.
- As with the self-assessment, create one header for each category of specs (except Collaboration) and a subheader for each individual spec (a minimal example skeleton appears after this list).
- For each spec, assess your peer’s work as “Not yet” or “Meets spec”. In one to two sentences, explain your assessment.
- For each spec you assess as Not yet, open an Issue on your peer’s GitHub repo describing what you think they could do to meet the spec.
- Commit your changes and push to GitHub.
- Open a pull request. Tag my username (@FlukeAndFeather) in the description of the pull request so I get a notification about it.
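To make the file structure concrete, here is a minimal sketch of what `peer-assessment.md` could look like. The category names, spec numbers, and assessments below are invented for illustration; use the actual categories and specs from the assignment.

```markdown
# Data wrangling

## Spec 1.1

Meets spec. The cleaning script runs without errors and each step is commented.

## Spec 1.2

Not yet. The summary table is missing units; see the Issue I opened on your repo.

# Visualization

## Spec 2.1

Meets spec. Axes are labeled and the caption explains the main pattern.
```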
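If you work from the command line, the clone, commit, and push steps might look like the sketch below. The username, repo name, and commit message are placeholders, and your fork’s default branch may be named something other than `main`.

```sh
# Clone your fork (not your peer's original repo) to your computer.
# "your-username" and "peers-repo" are hypothetical placeholders.
git clone https://github.com/your-username/peers-repo.git
cd peers-repo

# Write your review in peer-assessment.md, then stage, commit, and push.
git add peer-assessment.md
git commit -m "Add peer assessment"
git push origin main   # substitute your default branch name if it isn't main

# Finally, open a pull request on GitHub from your fork back to your peer's
# repo and tag @FlukeAndFeather in the description.
```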