# Collaborative Lesson Development Training
:::info
**Dates**: 23-26 January 2024
**Time**: 13:00-17:00 UTC ([check in your local timezone](https://www.timeanddate.com/worldclock/fixedtime.html?msg=Collaborative+Lesson+Development+Training&iso=20240123T13&p1=%3A&ah=4))
**Zoom link**: https://carpentries.zoom.us/j/83880136406?pwd=xKXJvvJXVdKCoSdZ09fHR0KVHtKUG9.1
**Code of Conduct**: https://docs.carpentries.org/topic_folders/policies/code-of-conduct.html
**Curriculum**: https://carpentries.github.io/lesson-development-training/
**Training event page**: https://tobyhodges.github.io/2024-01-23-cldt-online/
**Trainers:**
- Toby Hodges
- Aleksandra Nenadic
- Mike Trizna
:::
[TOC]
## About This CodiMD
We will use this CodiMD to take notes and to share links, exercises, etc. with participants throughout the training.
Participants are encouraged to take shared notes on this page. The Trainers will show you how to use CodiMD at the beginning of the workshop.
## Attending
- Name / pronouns (optional) / affiliation / email address
- Aleks Nenadic, UK Software Sustainability Institute, Manchester, a.nenadic@software.ac.uk (trainer)
- Toby Hodges, The Carpentries, tobyhodges@carpentries.org (trainer and trainee!)
- Sabrina Lopez, MetaDocencia, sabrina.lopez@metadocencia.org (trainee)
- Mike Trizna, Smithsonian Institution, triznam@si.edu (trainer)
- James Munroe, 2i2c, jmunroe@2i2c.org
- Jenny Wong / she/her / 2i2c / jwong@2i2c.org
## Notes
[Notes from Days 1+2](https://codimd.carpentries.org/-y0JfAJZSK2vEB7-jvGeAg?edit)
## Day 3
### Stay on Target
**Objectives**:
After completing this episode, participants should be able to...
- Explain what is meant by the intended and attained curriculum of a lesson.
- Describe the importance of regular assessment while a lesson is being taught.
- Design assessments to identify the misconceptions learners might have during your lesson.
**Questions**:
- How can you measure learners' progress towards your lesson objectives?
- Why is it important to identify misconceptions as early as possible?
- Why should we create assessments before we have written the explanatory content of our lesson?
#### Exercise: misconceptions (5 minutes)
What are the common misconceptions learners can have about the topic of your lesson?
How might you identify that misconception in your learners while they follow your lesson?
Share your answer in the collaborative notes document.
Hint: Try thinking about related or common tools the learners might know and how applying that prior knowledge might lead to a misconception with the topic you are teaching.
- Sabrina:
- Is an image the same as an environment? (Partially an issue with terminology as e.g. 'image' means different things in different domains)
- Jenny:
- James:
- Difference between JupyterHub and JupyterLab
- Why is an image taking time to load when it loaded quickly the day before?
- what is the cost?
- Toby:
- what takes hub administrators by surprise? what misconceptions do they potentially bring to the training? what typically trips people up?
- cloud computing/HPC analogy: people expect that processing data costs money, but are often taken by surprise that storing data also costs money - storage costs are a common misconception, e.g. the stored output of a run will incur an ongoing cost to its owner
#### Copyright / Licensing MCQ Example
An example MCQ to check on people’s misconceptions around licensing and reusing other people’s work could be designed as follows.
MCQ: Which of the following statements are true and which are false?
1. I don’t need permission because I am only using the copyrighted work for educational or non-profit purposes
2. I should always know the licence of any code, data, libraries, pictures or other work that I reuse or redistribute
3. Since I’m planning to give credit to the authors who created the work I reuse, I do not have to worry about or need permission
4. Material I obtain from the Internet is publicly accessible so no explicit permission is required
5. The work I want to use does not have a copyright notice on it, so it’s not protected by copyright and I’m free to use it
#### Exercise: designing a diagnostic exercise (20 minutes)
Create a multiple choice question (MCQ) that could be used in your lesson, to detect the misconception you identified above. As well as the correct answer, include 1-3 answer options that are not obviously incorrect (*plausible distractors*) and have *diagnostic power* i.e. each incorrect answer helps you pinpoint the exact misconception carried by the learner.
Write down what misconception is being tested for each incorrect answer.
Share your MCQ in the collaborative notes document.
##### Sabrina & James
Suppose a user writes a program to analyze a dataset. On their laptop, using test data, the program works and is reasonably fast. When the user tries to run the same program on a real dataset in their JupyterHub environment, it is frustratingly slow.
Why is their interactive computing session so slow?
a. It is caused by a poor internet connection between my home computer and the cloud.
b. Too many other users are using the cloud server at the same time.
c. I am trying to run a computationally intensive task and the cloud computer has poor hardware.
d. I am accessing a very large file that is not local to the cloud server being used. <--
##### Toby & Jenny
Need to be language agnostic
Which of these is found in an image/in a container?
Definition of a container image:
code
runtime
system tools
system libraries
system settings
What could a container image describe? (check all that apply)
- [ ] my Python/R package
- [ ] applications
- [ ] system libraries
- [ ] runtime kernel
- [ ] operating system
- [ ] storage
- [ ] system settings
James: all of these could be in a container image.
[Reference blog post](https://jacobtomlinson.dev/posts/2023/being-intentional-with-container-terminology/).
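As a sketch of how those ingredients map onto an image definition, here is a hypothetical Dockerfile; the base image and package names are illustrative, not taken from any real hub configuration:

```dockerfile
# Hypothetical sketch: each instruction supplies one of the
# "ingredients" listed in the container image definition above.

# Operating system, system libraries, and system tools
FROM ubuntu:22.04

# Runtime, installed via the system package manager
RUN apt-get update && apt-get install -y --no-install-recommends python3 python3-pip

# Application libraries ("my Python/R package" territory)
COPY requirements.txt /tmp/requirements.txt
RUN pip3 install -r /tmp/requirements.txt

# The code itself
COPY analysis/ /opt/analysis/

# System settings baked into the image
# (persistent storage, by contrast, is typically mounted at run time)
ENV DATA_DIR=/data
```

Each instruction adds a layer to the image, which is one way to connect the checklist in the exercise back to something concrete.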
**Key Points**:
- The goal of lesson development is to ensure that the **attained curriculum** matches the **intended curriculum** as closely as possible.
- Assessments are a way to determine whether the objectives you defined for the lesson have been reached.
- **Formative assessment** happens *during teaching* and provides feedback both to an instructor and a learner - about progress and whether learning of new concepts occurred but also about any misunderstandings and misconceptions which can hinder further learning.
- It is important to detect misconceptions as early as possible and formative assessments (such as multiple choice questions) can help us with this.
### Designing Assessments
**Objectives**:
After completing this episode, participants should be able to...
- Choose the format for an exercise based on the outcome it is intended to measure.
- Display exercises and their solutions in a lesson site.
**Questions**:
- Why are exercises so important in a lesson?
- What are some different types of exercises, and when should they be used?
- How should exercises be presented in a lesson website?
**56 Examples of Formative Assessment:**
https://www.edutopia.org/groups/assessment/250941
#### Recommended reading: More Example Assessment Types
* [Exercise Types Chapter from Teaching Tech Together](https://teachtogether.tech/en/index.html#s:exercises)
* [Edutopia’s 56 Examples of Formative Assessment](https://www.edutopia.org/groups/assessment/250941)
* [H5P Examples and Downloads for Interactive Content](https://h5p.org/content-types-and-applications)
Short-term/working memory: very quick to access, but limited in capacity
Long-term memory: slower to access, but essentially unlimited in capacity
Learning involves new information being "loaded" into working memory, then transferred to long-term memory.
Exercises provide learners with an opportunity to practice, which aids transfer from short-term to long-term memory.
Important to provide regular and frequent opportunities for this transfer, to avoid overloading the working memory.
#### Exercise Types
Different types of exercise are more or less appropriate for learners at different levels of expertise: novices will benefit from more structured/scaffolded exercises and worked examples, whereas those with more expertise will find these distracting. Scaffolded exercise types also typically do not provide a way to practice and assess higher-level cognitive skills, e.g. evaluation, composition, and design.
**Greg Wilson's writing on Exercise Types:**
https://teachtogether.tech/en/index.html#s:exercises
#### Exercise: Exercise Types and When to Use Them (15 minutes)
Read about four of the exercise types [in the *Exercise Types* chapter of *Teaching Tech Together*](https://teachtogether.tech/en/index.html#s:exercises) by following the relevant link below.
- [fill-in-the-blanks](https://teachtogether.tech/en/index.html#fill-in-the-blanks)
- [faded examples](https://teachtogether.tech/en/index.html#faded-examples)
- [Parsons problems](https://teachtogether.tech/en/index.html#parsons-problem)
- [minimal fix](https://teachtogether.tech/en/index.html#minimal-fix)
Then, discuss the following questions together:
- What kind of skills would an exercise of this type assess?
Try to identify some action verbs like those we used to write lesson objectives earlier in the workshop. -> [Bloom's Taxonomy](https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/) again
- FITB: recall; understand; apply (in the example given in the book); the group noted that _where_ the blanks are put determines which skills are being assessed
- Faded Examples: not much difference between FITB and Faded Examples, but maybe used for different things. Faded Examples use a process that FITB does not. More guided.
- Would this type of exercise be suited to a novice audience?
Or is it a better fit for intermediate or advanced learners?
- FITB: the example is intended for novices; scaffolding lends itself more to novices, but it could be possible to create exercises of this type for more advanced learners
- Faded examples: also for novices because of the scaffolding, but requires new skill levels at each step
- Would this kind of exercise work well in an in-person workshop setting?
Would it be better suited to self-directed learning and/or a virtual workshop?
- FITB: it should work well in both
- Faded:
**Feedback on the lesson:** make it clearer if the goal is to evaluate the problem type such as "Fill-in-the-Blank" in general or the specific example problem given in Wilson's text
#### Exercise: Assessing an Objective (30 minutes)
Using one of the exercise formats you have learned about so far, design an exercise that will require learners to perform one of the actions described in the objectives you wrote earlier, and that assesses their ability to do so.
These should be assessments of the lower-level objectives defined for individual episodes in the lesson, as opposed to the lesson-level objectives you wrote first.
Trainees working as a team can choose whether to work together on discussing and designing a single exercise to assess a single objective, or to divide their efforts and each focus on an exercise for their own episode.
If you choose to take the latter approach and finish with time to spare, spend the remainder reviewing and providing feedback on one another's assessments.
##### Jenny & Toby
> pull a container to make it available on their hub instance
The key things are knowing what the URLs look like, so that the Hub Champion knows which piece of information to add, and to know where to add it.
Potentially two exercises: one to search and choose an image that meets given criteria, and another afterwards to add that image to the hub (this one would assess the objective above).
> ADD SOME PRETEXT BLURB HERE. Which of the following would you paste into the _Custom Image_ field to add the latest version of the `handbook-authoring-image` image to your hub?
>
> a) quay.io/2i2c/handbook-authoring-image:ad18f6ea575d <- correct
> b) quay.io/2i2c/handbook-authoring-image
> c) docker pull quay.io/2i2c/handbook-authoring-image:ad18f6ea575d
> d) quay.io/2i2c/handbook-authoring-image:bbe4225a7940
Notes about incorrect answers:
b) missing the version tag
c) they forgot to remove the 'docker pull' part
d) the wrong tag
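The diagnostic power of the distractors above rests on learners recognising the structure of a container image reference (registry host, repository path, tag). As an illustration of that structure only, a few lines of shell can pull a reference apart; the helper function name is ours, not part of any real tooling, and references without an explicit tag (like option b) would need extra handling:

```shell
# Hypothetical helper: split an image reference into registry, repository, and tag.
split_image_ref() {
  local ref="$1"
  local tag="${ref##*:}"        # text after the last ':' is the tag
  local path="${ref%:*}"        # everything before the last ':'
  local registry="${path%%/*}"  # first path component is the registry host
  local repository="${path#*/}" # the rest is the repository
  echo "registry=$registry repository=$repository tag=$tag"
}

split_image_ref "quay.io/2i2c/handbook-authoring-image:ad18f6ea575d"
# registry=quay.io repository=2i2c/handbook-authoring-image tag=ad18f6ea575d
```

Option c) fails this structure entirely, because `docker pull` is a command that consumes such a reference rather than part of the reference itself.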
##### Sabrina & James
(Parsons problem assessment on starting a server)
The correct answer:
```
Event log
Server requested
2024-01-25T15:12:36Z [Warning] 0/1 nodes are available: 1 node(s) didn't match Pod's node affinity/selector. preemption: 0/1 nodes are available: 1 Preemption is not helpful for scheduling..
2024-01-25T15:12:40Z [Normal] pod triggered scale-up: [{https://www.googleapis.com/compute/v1/projects/catalystproject-392106/zones/southamerica-east1-c/instanceGroups/gke-latam-cluster-nb-n2-highmem-4-04d116e7-grp 0->1 (max: 100)}]
2024-01-25T15:13:37Z [Normal] Successfully assigned cicada/jupyter-jmunroe to gke-latam-cluster-nb-n2-highmem-4-04d116e7-x5xl
2024-01-25T15:13:41Z [Normal] Pulling image "busybox:1.36.1"
2024-01-25T15:13:46Z [Normal] Successfully pulled image "busybox:1.36.1" in 5.017407272s (5.017421664s including waiting)
2024-01-25T15:13:46Z [Normal] Created container volume-mount-ownership-fix
2024-01-25T15:13:46Z [Normal] Started container volume-mount-ownership-fix
2024-01-25T15:13:50Z [Normal] Pulling image "quay.io/jupyterhub/k8s-network-tools:3.2.1"
2024-01-25T15:13:53Z [Normal] Successfully pulled image "quay.io/jupyterhub/k8s-network-tools:3.2.1" in 3.225258717s (3.225278258s including waiting)
2024-01-25T15:13:53Z [Normal] Created container block-cloud-metadata
2024-01-25T15:13:53Z [Normal] Started container block-cloud-metadata
2024-01-25T15:13:57Z [Normal] Pulling image "rocker/binder:4.3"
2024-01-25T15:15:44Z [Normal] Successfully pulled image "rocker/binder:4.3" in 1m47.137277602s (1m47.137298921s including waiting)
2024-01-25T15:15:44Z [Normal] Created container notebook
2024-01-25T15:15:44Z [Normal] Started container notebook
Server ready at /user/jmunroe/
```
Simplified log
```
Event log
Server requested
# There is no available virtual machine already running with spare capacity
# Start a new virtual machine (node). Done once for a group (32/64) of users
2024-01-25T15:12:36Z [Warning] 0/1 nodes are available: 1 node(s) didn't match Pod's node affinity/selector. preemption: 0/1 nodes are available: 1 Preemption is not helpful for scheduling..
2024-01-25T15:12:40Z [Normal] pod triggered scale-up:
[{https://www.googleapis.com/compute/v1/projects/catalystproject-392106/zones/southamerica-east1-c/instanceGroups/gke-latam-cluster-nb-n2-highmem-4-04d116e7-grp 0->1 (max: 100)}]
2024-01-25T15:13:37Z [Normal] Successfully assigned cicada/jupyter-jmunroe to gke-latam-cluster-nb-n2-highmem-4-04d116e7-x5xl
# Provision the server by setting up the software needed for this new node (only needs to happen once per node)
2024-01-25T15:13:41Z [Normal] Pulling image "busybox:1.36.1"
2024-01-25T15:13:46Z [Normal] Successfully pulled image "busybox:1.36.1" in 5.017407272s (5.017421664s including waiting)
2024-01-25T15:13:50Z [Normal] Pulling image "quay.io/jupyterhub/k8s-network-tools:3.2.1"
2024-01-25T15:13:53Z [Normal] Successfully pulled image "quay.io/jupyterhub/k8s-network-tools:3.2.1" in 3.225258717s (3.225278258s including waiting)
# Download the image environment to this new node (different users may request different image environments, but the system will reuse this image environment on this same node between users)
2024-01-25T15:13:57Z [Normal] Pulling image "rocker/binder:4.3"
2024-01-25T15:15:44Z [Normal] Successfully pulled image "rocker/binder:4.3" in 1m47.137277602s (1m47.137298921s including waiting)
# Launch a new server (with this image environment) for the user
2024-01-25T15:15:44Z [Normal] Created container notebook
2024-01-25T15:15:44Z [Normal] Started container notebook
Server ready at /user/jmunroe/
```
The first user on a node starts a server. Place these steps in the correct order.
- Download the image environment
- Set up the software on the new node
- Launch a new server
- Start a new cloud machine
What steps will be skipped for the second user using the same image environment on the same node?
Answer:
- Start a new cloud machine
- Set up the software on the new node
True or False: An image environment is always downloaded for every user.
Image environment: "rocker/binder:4.3"
Do you expect this to be a Julia, R, or Python environment?
Why is Jupyter spelled "Jupyter" and not "Jupiter"?
-> JUlia PYthon R
#### Nested Fenced Divs for Exercises with Solutions
```markdown
:::::::::::::::::::: challenge
### Challenge Title
What is the solution to this exercise?
::::::::::: solution
This is the solution to this exercise.
::::::::::::::::::::
::::::::::::::::::::::::::::::
```
#### Exercise: Formatting Exercises in a Lesson Site (15 minutes)
Using the approach demonstrated above, format the exercise you designed previously as an exercise in your lesson site.
---
### Example Data and Narrative
**Objectives**:
After completing this episode, participants should be able to...
- Find candidate datasets to use in a lesson.
- Evaluate the suitability of a dataset to be used in a lesson.
- Choose examples that will prepare learners for formative assessments in the lesson.
- Develop a story
**Questions**:
- Why should a lesson tell a story?
- What considerations are there when choosing an example dataset for a lesson?
- Where can I find openly-licensed, published data to use in a lesson?
#### Dataset considerations
- Ethical use (see prompts below)
- License - CC0 Recommended
- Complexity - _Is it easy to understand?_ _Is it sufficiently authentic?_
- Number and types of variables
#### Questions about Ethical Use of Datasets
- Does the data contain personally identifiable information?
- Was the data collected without permission from the groups or individuals included?
- Will the data be upsetting to learners in the workshop?
[CARE Principles for Indigenous Data Governance](https://doi.org/10.5334/dsj-2020-043) - Collective Benefit, Authority to Control, Responsibility, and Ethics
#### Examples of Public Repositories
- [Dryad](https://datadryad.org/)
- [The Data Curation Network’s datasets](https://datacurationnetwork.org/datasets/)
- [The Official Portal for European Data](https://data.europa.eu/)
- [DataONE](https://www.dataone.org/)
- [The Official Portal for Argentina Data](https://www.datos.gob.ar/) - In Spanish
[FREQUENTLY ASKED QUESTIONS by CC](https://creativecommons.org/faq/#can-i-combine-material-under-different-creative-commons-licenses-in-my-work)
#### Exercise: Choosing a Dataset or Narrative (30 minutes)
Referring to the advice given above, find an appropriate dataset or a narrative for your lesson. Identify one or more potential candidates and note down the advantages and disadvantages of each one.
1. Imagine a hub champion, give them a name, etc.
- Describe the needs of the imaginary community, connect that to what the Hub Champion will encounter
- For Catalyst, try to tailor to the general domain needs of the communities (biomedical research)
- Communities will be analysing data on hubs. How do they do that? How do they upload data to a shared drive?
- James: many different ways. Sometimes people go to cloud because data is already there. But exactly where and how it is structured varies from domain to domain. Hesitation in picking any one is alienating those to whom the chosen example is not relevant. Some large, cloud-enabled datasets out there. Is there another existing Carpentries lesson that involves a large dataset, where challenge is accessing it without downloading it?
- Jenny: do people also upload data?
- James: yes, e.g. because they generated data on HPC, lots of reasons
- we can teach things that may not be directly applicable to their hub right now but are still relevant to being a hub champion
- save time for 'bring your own examples' - give exercises to think about this (reflection exercise)
- emphasise generalising skills
- perhaps community events that follow training could be more specific to particular hubs and cases
- biomedical examples to begin with, to cater for the Catalyst community, as an example needs to be chosen to start with, and this one fits the bill (with further considerations for other communities)
- "biomedical" examples are also too wide, Sabrina could inform the lesson develoment team about this (from the point of view of South American part of the community)
- other community is from Africa, they can be surveyed on this too
- this will impact the container image - what other considerations need to be taken into account (images, diagrams, data, hub configuration/authentication, etc.) when thinking of a fictional hub champion/typical learner and a community representative for this course?
- "bring your own examples" can be used for reflection exercises where people would be asked to write in a shared document how they would apply these general skills to their own workflows and this can be used by trainers to collect some feedback on adaptability to other domains
- what example data/artifacts will we need to find/prepare based on chosen hub champion persona
- container images
- hub configuration: resources; authentication
- Are there workflows that are a better fit for the hub environment? Technical constraints? So we could teach/recommend particular types of workflows?
2. Or take a "bring your own hub" approach, where we react to what they tell us about their context.
- this one is much more difficult to define, probably more difficult to teach
Feedback for Day 3: https://forms.gle/nwATXBnEednmP2Fi7
## Day 4
### How to Write a Lesson
**Objectives**:
After completing this episode, participants should be able to...
- Estimate the time required to teach a lesson.
- Summarise the content of a lesson as a set of questions and key points.
- Connect the examples and exercises in a lesson with its learning objectives.
**Questions**:
- Why do we write explanatory content last?
- How can I avoid demotivating learners?
- How can I prioritise what to keep and what to cut when a lesson is too long?
#### Lesson Time Management (10 minutes)
(5 minutes) In the shared notes document, note down your answers to these questions:
- From a design perspective, at what point is a lesson too long?
- What factors influence and constrain the length of a lesson?
- How might you prioritise what to keep if you have to cut lesson content down?
- Jenny:
- A lesson is too long when there are too many exercises that stray from the main learning objectives, or too much extra detail in the connecting text that doesn't get you to your "train station".
- Factors could be accessibility requirements, number of helpers available, spontaneous discussions, technical issues that need attention
- Prioritise the time and knowledge imparted from the instructor since exercises can be finished off async (but also still give enough time to do exercises!). Learning can still take place outside of the lesson.
- James:
- If there are more "tangents" and "asides" than core material
- If the material is not essential to most learners in the target audience
- If the material is better tackled in self-directed learning or long-format reading
- If content could be easily removed in a "short-form" version of the lesson, does it really need to be in the lesson at all?
- If the material is not actually relevant to the learners even if the instructor may have lots of prior effort in that topic (example: drop a Python2 vs Python3 discussion)
- Sabrina:
- when it contains too many exercises that are not strongly related and could be regrouped
- anything out of planned (for better or worse): technical issues, group dynamic
- the main exercises that lead to the learning objectives
- Toby:
- When you find it difficult to draw a single narrative through all of the content; when the lesson strays into topics beyond the originally intended focus; when it is unrealistic to expect people to follow it from the beginning to the end without extended breaks for practice, etc.
- the time required to teach it is the main one; generally speaking, the length of a lesson scales with its number of learning objectives
- review the LOs and be prepared to make hard choices. Who was the writer who said _"kill your darlings"_?
(5 minutes) In the remaining time, your Trainers will lead a discussion based on the responses.
#### Length Considerations
- What is essential to include?
- What can be left out if needed?
- Are there checkpoints where the lesson could end if needed?
- Can important concepts be moved up earlier to ensure they are covered?
A template for notes to take from a pilot workshop of your new lesson: https://codimd.carpentries.org/lesson-pilot-observation-notes-template#
#### 5 Ways to Handle Extraneous Overload - [Renkl 2014](https://www.cambridge.org/core/books/cambridge-handbook-of-multimedia-learning/worked-examples-principle-in-multimedia-learning/8753055D1FB47CF1E2BB897FD44FBEF8)
1. Eliminate extraneous material
2. Insert signals emphasizing the essential material
3. Eliminate redundant printed text
4. Place printed text next to corresponding parts of graphics
5. Eliminate the need to hold essential material in working memory for long periods of time
#### Review Your Text for Demotivations
- dismissive language - e.g. ‘simply’, ‘just’
- use of stereotypes - check learner profiles for stereotypes too
- expert awareness gaps, i.e. places where you may be assuming the learners know more than they actually do
- fluid representations, i.e. using different terms with the same meaning interchangeably
- unexplained or unnecessary jargon/terminology
- unexplained assumptions
- sudden jumps in difficulty/complexity
#### Review Your Text for Accessibility
- Avoiding regional/cultural references and idioms that will not translate across borders/cultures
- Avoiding contractions, e.g. don’t, can’t, won’t, etc.
- Checking that all figures/images have well-written alternative text, including writing alternative text for data visualizations.
- Checking the header hierarchy - no h1 headers in the lesson body, no skipped levels
- Using descriptive link text - no “click here” or “this page”, etc.
- Checking the text and foreground contrast for images
We recommend https://wave.webaim.org/ and the associated browser plugin for checking the accessibility of webpages, including your lessons.
#### Episode Metadata
Objectives, Questions and Key Points can be provided for episodes. (Objectives and Questions are required for the Overview box to build.) All three are fenced div classes:
- `questions`
- `objectives`
- `keypoints`
#### Exercise: Completing episode metadata (10 minutes)
Add key points and questions to your episode.
To check the formatting requirements, see the Introduction Episode example in your lesson or [the Workbench Documentation](https://carpentries.github.io/sandpaper-docs/episodes.html#questions-objectives-keypoints)
```markdown
:::::::::::::::: keypoints

- First key point of the episode
- Second key point of the episode

::::::::::::::::::::::::::
```
#### Reflection Exercise (15 minutes)
We have reached the end of the time you have to work on the episodes of your lesson in this training. This exercise provides you with a chance to look back over everything you have sketched out for your episode and the lesson as a whole and consider what still needs to be done before it can be taught.
You can use this time however you judge will be most beneficial to your preparations for teaching your episode in a trial run.
If you are not sure how to start, consider mapping out the relationships between the objectives of your episode and the examples and exercises via which they will be taught and assessed.
For example,
> The read CSV and inspect demo supports Objective 2 (load a simple CSV data set with Pandas) and will be delivered using participatory live coding.
> The objective will be assessed with an exercise that requires learners to apply the read_csv function to another file and count the rows in the resulting DataFrame object.
- Does any of your planned content not support any learning objectives?
- Is there at least one piece of content planned for each learning objective?
- Is there a formative assessment planned for each learning objective?
- What do you still need to add/work on?
- discuss contextualisation for LatAm
- we need to refine our objectives at lesson and episode level
- we also need to keep track of lesson design in a structured way - we will use https://codimd.carpentries.org/catalyst-lesson-design-notes#
- find out more about target audience
- What can you remove/consider removing?
- How will the narrative and example data you have chosen for your lesson support teaching and assessment?
- What diagram or other visual aids could you add to supplement your text?
**Key Points**:
- The objectives and assessments provide a good outline for an episode and then the text fills in the gaps to support anyone learning or teaching from the lesson.
- It is important to review your lesson for demotivating language, cognitive load, and accessibility.
- To reduce cognitive load and ensure there is enough time for the materials, consider which lesson objectives are not needed and remove the related content and assessments.
PENDING (from the notes):
- User intro to a JupyterHub. The difference between RStudio or JupyterLab 'images/environments' and the overall JupyterHub platform. Cloud computing concepts like machine type, node, autoscaling. When is a hub automatically shut down? Memory limits. Example cloud workflow demonstration.
- Cloud computing and cost management. Using Grafana to measure usage. Options for shared storage in cloud computing.
LOs on site:
Managing Software Images
- explain the value of reproducibility provided by images
- describe how software images are integrated with their hub environment
- evaluate whether an existing image meets users’ software needs
- pull a container image to make it available on their hub instance
Customising Software Images
- customise an image from a template
- ensure the reproducibility of images using repo2docker
- host an image on a container registry
Episode Hub Administration
- Explain the difference between JupyterHub and JupyterLab
- Add users to a hub using GitHub Teams
- Describe what happens when a “server” is started
- Stop and restart a server
---
### How we Operate
**Objectives**:
After completing this episode, participants should be able to...
- Describe the role that feedback plays in the life cycle of a lesson.
- Connect with other members of the community.
**Questions**:
- What are the important milestones in the development of a new lesson?
- How can The Carpentries lesson development community help me complete my lesson?
Carpentries Incubator: https://carpentries-incubator.org/
Carpentries Lab: https://carpentries-lab.org/
#### Lesson Life Cycle
- `pre-alpha`: first draft of lesson is being created
- `alpha`: lesson is being taught by authors
- `beta`: lesson is ready to be taught by other instructors
- `stable`: lesson has been tested by other instructors and improved based on feedback. Major changes and updates are relatively infrequent.
#### Pilot workshops
- _alpha pilot_: a workshop taught by the lesson authors, often one of the first few times the lesson has been taught.
- _beta pilot_: a workshop taught by instructors who have not had previous (major) involvement in developing the lesson.
Questions that can be answered in a pilot workshop:
- _How much time does it take to teach each section of the lesson?_
- _How much time is required for each exercise?_
- _What technical issues were encountered during the lesson?_
- _What questions did learners ask during the workshop?_
- _Which parts of the lesson were confusing for learners?_
- _Which exercises could be improved to provide more information to the instructors?_
[More guidance for organising/teaching pilot workshops](https://docs.carpentries.org/topic_folders/lesson_development/lesson_pilots.html)
## Communications channels
- https://slack-invite.carpentries.org/
  - lesson-dev channel
  - also look for domain-specific channels
- incubator-developers mailing list: https://carpentries.topicbox.com/groups/incubator-developers
- lesson development coworking sessions
**Key Points**:
- Teaching a lesson for the first time is an essential intermediate step in the lesson development process.
- The Carpentries lesson developer community shares their experience on multiple communication channels.
### Preparing to Teach
**Objectives**:
After completing this episode, participants should be able to...
- Summarise lesson content as a teaching plan.
- Add Setup Instructions and Instructor Notes to the lesson site.
- Create a feedback collection plan.
**Questions**:
- What can I do to prepare to teach my lesson for the first time?
- How should I communicate lesson setup instructions to learners?
- What information should be recorded for instructors teaching a lesson?
- How should information be collected as part of the feedback process?
#### How the lesson infrastructure can help you teach
- Use the Instructor View (dropdown at top-right of lesson page)
- displays a schedule table on the lesson landing page ("Summary and Schedule")
- displays estimated timings for episodes at the top of each episode page
- displays inline Instructor Notes
- gives you the 'Extract All Images' and 'Instructor Notes' links in the top bar menu
- The 'View all in one page' option at the end of the sidebar navigation is a good way to open a printable version of the lesson, e.g. if you want a paper copy to make notes on or to teach from
#### Setup Instructions
Instructions for software and data setup are stored in `learners/setup.md`
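As an illustration, a minimal `learners/setup.md` for a hub-administration lesson might look like the sketch below; the tools and links listed are placeholders, not the lesson's actual requirements:

```markdown
---
title: Setup
---

## Software Setup

Before the workshop, please install the following (links are placeholders):

- [Docker](https://docs.docker.com/get-docker/)
- repo2docker: `pip install jupyter-repo2docker`

## Account Setup

- Create a [GitHub](https://github.com/) account and send your username
  to the instructors, so it can be added to the hub before the training.
```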
#### Exercise: Add Setup Instructions (10 minutes)
Add setup instructions (in the `learners/setup.md` file) listing the software/tools/data participants need in order to follow your lesson, with links to instructions for obtaining and installing them.
Rather than producing a separate page in the lesson site, the contents of `learners/setup.md` will be combined with `index.md` to produce the Learner View of the landing page of your lesson.
- quay.io account to push images to?
#### Exercise: Add Instructor Notes (5 minutes)
Add Instructor Notes (in the `instructors/instructor-notes.md` file) with an initial list of points that will help you and other instructors deliver the lesson.
I suspect that "deploying a hub" will need to include creating accounts for our hub administrator learners. This information is passed to the 2i2c engineering team as a list of GitHub account names. In the `setup.md` instructions, learners should verify that they are able to log into a hub using their GitHub account, e.g. by comparing against a screenshot of what they should see at the start of the training. This gives time for account creation and authentication issues to be resolved before the live training starts. 2i2c has not yet built the ability for one hub administrator to upgrade the role of another user to hub administrator without the involvement of 2i2c engineering, but I see that functionality coming eventually. Once it is built, the workshop instructor will be able to do this account management rather than having to depend on 2i2c engineering.
#### Collecting Feedback
- assign someone with responsibility for taking notes during your pilot workshop
- notes template: https://codimd.carpentries.org/lesson-pilot-observation-notes-template#
- when advertising your workshop, tell people that it will be a pilot. This helps to:
- manage expectations
- encourage constructive feedback
- collect feedback constantly, throughout the workshop
- indirectly, through those notes you are taking
- directly, e.g. by asking for "minute card" feedback. here is our template, in case that is helpful to you https://docs.google.com/forms/d/1p7iOV5HNvy4POS4g6eottY8RSfKq4kaoKz1-jIFYTMI/template/preview
- also collect feedback at the end of the workshop
- a template for post-pilot workshop surveys: https://docs.google.com/forms/d/1OGCQBotD2nOJkc7KpFZLhFfb3EBcxEDwHz_3p48qz3U/template/preview
- you can use activities such as "one up, one down" to collect feedback directly (but not anonymously)
#### Lesson trial runs helper page
https://carpentries.github.io/lesson-development-training/trial-runs.html
Day 4 Feedback: https://forms.gle/nwATXBnEednmP2Fi7
Part 1 feedback:
https://forms.gle/nwATXBnEednmP2Fi7
https://docs.google.com/forms/d/e/1FAIpQLSebuabrd0KP-bNK3nWiB8a-cELq__0KRXhxniPipAVfEtO_xw/viewform?fbzx=3039448853671559894
ID for this training: 2024-01-23-cldt-online