We’re in a very similar boat at my workplace. Additionally, all our coders are essentially at the same level; that doesn’t mean we can’t help each other, but it does mean we may hit a bit of a plateau when it comes to improving code through review…
Generally speaking, the code review process is as follows:
1. Authors upload a working copy of their code to Code Ocean.
2. Code Ocean verifies that the code runs and delivers results.
3. Code Ocean provides Nature editors with a private link (blinded or unblinded) to the code capsule for peer review of the code.
4. Once the code and article are approved by reviewers, Code Ocean mints a DOI and includes a link to the article in the metadata.
5. Nature includes a link to, or embeds, the Code Ocean widget in the article.
6. Nature readers can run the code and reproduce the results associated with an article by simply clicking a button, as well as edit the code or test it with new data and parameters (see the sketch after this list).
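To make that last step concrete: the "click to run" model works when a capsule exposes a single entry-point script whose parameters come in from the command line and whose results go to a fixed output location. Here is a minimal sketch in Python; everything in it (the script name, the `--trim` parameter, the file paths) is hypothetical and illustrative, not Code Ocean's actual interface:

```python
# run.py -- hypothetical entry point for a reproducible capsule.
# Re-running it with different arguments regenerates the results from scratch.
import argparse
import json
import os
import statistics


def main() -> None:
    parser = argparse.ArgumentParser(
        description="Recompute the article's summary statistics from raw data."
    )
    parser.add_argument("--data", default="data/measurements.csv",
                        help="input file with one numeric value per line")
    parser.add_argument("--trim", type=float, default=0.0,
                        help="fraction of extreme values to drop from each end")
    args = parser.parse_args()

    with open(args.data) as f:
        values = sorted(float(line) for line in f if line.strip())

    k = int(len(values) * args.trim)                # symmetric trim from both ends
    trimmed = values[k:len(values) - k] if k else values

    os.makedirs("results", exist_ok=True)
    with open("results/summary.json", "w") as out:  # fixed path the widget can display
        json.dump({"n": len(trimmed), "mean": statistics.mean(trimmed)}, out, indent=2)


if __name__ == "__main__":
    main()
```

The design point is that parameters enter only through the command line, so a reader who edits `--trim` or swaps in a new data file and clicks run exercises exactly the same code path the authors used, which is what makes the result reproduced rather than merely re-displayed.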
Not everyone reading about this stuff knows what “refactoring” or “linters” are. In the summary blog post about this Community Call, we’d like to link those words to definitions. Anyone have a favourite? Otherwise I’ll go with Wikipedia.
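For anyone who lands here before those links are in place, a toy illustration of both terms in Python; the functions are invented for the example:

```python
# Before: a linter such as flake8 would flag the unused import (code F401),
# and the nested conditionals make the intent hard to follow.
import sys  # flagged: imported but never used


def classify(x):
    if x is not None:
        if x >= 0:
            if x == 0:
                return "zero"
            else:
                return "positive"
        else:
            return "negative"
    else:
        return "missing"


# After refactoring: identical behaviour for every input, flatter structure,
# and nothing left for the linter to complain about.
def classify_refactored(x):
    if x is None:
        return "missing"
    if x == 0:
        return "zero"
    return "positive" if x > 0 else "negative"
```

A linter flags suspicious or non-idiomatic code without running it; refactoring restructures code, as above, without changing what it computes, which is exactly the kind of improvement code review tends to surface.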
We’ve published a summary of the Community Call on this topic, written by the speakers, Hao Ye, Melanie Frazier, Julia Stewart Lowndes, Carl Boettiger, and rOpenSci software peer review editor Noam Ross!