Teaching Python

Author(s): Helena Rasche
Editor(s): Bazante Sanders
Tester(s): Donny Vrins
Overview
Questions:
  • What methods can I incorporate to improve student uptake of material within my Python courses?

  • How do I write problems that use these methodologies?

Objectives:
  • Learn about four different problem design ideas and one teaching methodology.

Time estimation: 2 hours
Last modification: Oct 27, 2022
License: Tutorial Content is licensed under the Creative Commons Attribution 4.0 International License. The GTN Framework is licensed under MIT.

Improving student learning and outcomes should always be a goal of teaching. Here we present several strategies to improve student experiences during courses, by focusing on how they approach specific problems and by giving them real-world applicable solutions to those problems.

Agenda

In this tutorial, we will cover:

  1. Course Management Strategies
  2. Problem Strategies
  3. Comparisons to K-12 methodologies

“Live coding”, as espoused by the Carpentries, is a fantastic strategy for communicating material to students while simultaneously ensuring they get hands-on experience. Showing what happens live on the screen is well received by students, provided they can manage to watch what we type and type it themselves at the same time. We know at least that our examples give the correct result, but students never see anything other than correct, working code, and so never have to formulate an internal model of how to write code. They end up copying and pasting without understanding why.

Predicting code behaviour without running it is a key component of a programmer’s work, and much of the time we spend debugging relies on emulating the computer in our head. Without a solid mental model of code behaviour, one cannot predict how code will function in one situation, much less in other or non-standard situations. Planning for code to handle both good and bad inputs requires some creativity, and mentally reasoning about the expected values at various points throughout the execution.

This situation leaves students unprepared for incorrect or buggy code, whether (un)intentionally included in homework assignments or generated by themselves, if they cannot identify where code will fail without executing it.
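To make the point about good and bad inputs concrete, consider a small sketch of our own (the function and its checks are illustrative assumptions, not lesson material):

# Plan for both good and bad inputs by checking expectations explicitly.
def mean(values):
    if not values:  # bad input: an empty list has no mean
        raise ValueError('mean() of an empty list')
    total = 0
    for v in values:
        total = total + float(v)  # bad input: non-numeric values raise here
    return total / len(values)

print(mean([1, 2, 3]))  # good input: prints 2.0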

Augmenting lessons with:

  • Pair programming
  • Tracing: stepping through the internal state
  • Faded examples
  • Compounding examples
  • Debugging intentionally broken examples

will give students enough tools to respond dynamically to failure states, with the informed experience to resolve issues they encounter as programmers.

The student’s mental model of the code underlies everything they do as a programmer, from conception to implementation to debugging to their self-efficacy:

“This study shows that a well-developed and accurate mental model directly affects course performance and also increases self efficacy, the other key element in course performance. Given this double impact, helping students develop good mental models should remain a goal in introductory programming courses.” (Ramalingam et al. 2004)

Being able to think through a program step by step, understanding how the code executes, which variables exist when, and what their values should be, is a foundational skill. This mental modelling allows students to predict the behaviour of a system and, when it diverges from their prediction, to recognise potential bugs.

Course Management Strategies

Pair Programming

Complementary to the other strategies, Pair Programming or “pairing” provides a reinforcement activity where students utilise similar skills. One person, the “driver”, writes and executes the code, while the other, the “navigator”, directs the experience and tells the driver what to write (Williams 2001, Williams and Upchurch 2001). It has become a common learning model in introductory courses due to its benefits to students (Mendes et al. 2005, Mendes et al. 2006, Hannay et al. 2007). This technique has also been shown to be especially beneficial for women in computer science, giving them better chances of success in future programming endeavours (Werner et al. 2004). Adopting this technique is promising, provided you adhere to the principles outlined by Mentz et al. 2008.

These can often be implemented as breakout rooms wherein pairs are assigned a handful of problems to complete. After the breakout rooms end, you can have students summarise their solutions, call on individual pairs for their ideas, and so on.

Problem Strategies

Tracing Code Execution

Input: Code
# Initialise our accumulator
x = 1 + 1
# Loop over our input data
for i in range(10): # 0..9
    # In-loop temporary variable
    tmp = x * 2 + i
    # Update our accumulator
    x = tmp + 1
# Output our result
print(f'The final value is {x}')
Output: Trace
Line | i   | x  | tmp
-----|-----|----|----
  2  | n/a |  2 | n/a
  4  | 0   |  2 | n/a
  6  | 0   |  2 | 4
  8  | 0   |  5 | 4
  4  | 1   |  5 | n/a
  6  | 1   |  5 | 11
  8  | 1   | 12 | 11
  4  | 2   | 12 | n/a
  6  | 2   | 12 | 26
  8  | 2   | 27 | 26
(the trace continues in the same pattern up to i = 9)

While there is no bug in the code above, when a bug is present, having students produce a table like this significantly improves their understanding of code flow and execution (Hertz and Jump 2013). “Tracing” is a valuable and easy-to-complete exercise, and the results can even be checked automatically, which lets the exercise scale well across larger classes.

Here students can also use a debugger like pudb, which follows the execution of a piece of code and shows exactly how it works.
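For instance, students can drop into pudb at a chosen point in the trace example above (a minimal sketch, assuming pudb has been installed, e.g. with pip install pudb):

# Not part of the original lesson: pausing the trace example in pudb.
import pudb

x = 1 + 1
pudb.set_trace()  # execution pauses here and the pudb interface opens
for i in range(10):
    tmp = x * 2 + i
    x = tmp + 1
print(f'The final value is {x}')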

Alternatively, teaching liberal use of the print() function, as opposed to more complicated tools like the above, can give students the tools they need to solve problems on their own. The trace example above was generated by hexylena/auto-python, which can be reused or contributed to if new examples are needed.
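A few print() calls are enough for students to reproduce the trace table themselves (a sketch of our own, not from the generated material):

# Reproduce the trace table with plain print() calls.
x = 1 + 1
print(f'line 2: x={x}')
for i in range(10):
    tmp = x * 2 + i
    print(f'line 6: i={i} x={x} tmp={tmp}')
    x = tmp + 1
    print(f'line 8: i={i} x={x} tmp={tmp}')
print(f'The final value is {x}')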

Faded Examples

When teaching programming, one must constantly be cognisant of the students’ cognitive load. Programming is a complicated task that demands a lot from students, requiring kinds of explicit logical analysis that they may not have engaged in before. Both problem-solving-based learning and worked examples can cause high cognitive load for different audiences, so exploring alternatives is important (Retnowati 2017). Faded examples, such as the one seen below, are exactly such an alternative: starting with a fully worked example, successive components are removed until only a problem description requiring a full solution remains. This leads to fewer unproductive learning events (Renkl et al. 2004).

# Write a function that multiplies two numbers
def multiply(a, b):
    c = a * b
    return c

The initial problem shows students the entire worked solution.

# Write a function that adds two numbers
def add(___):
    ____
    return c

Increased fading: here we explicitly call out the blanks students should fill in, using deliberately syntactically invalid underscores.

# Write a function that subtracts two numbers

Final fading: the entire solution is gone, leaving only the description of what students need to do.
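For the instructor’s reference, one possible solution to the fully faded problem, mirroring the structure of the initial worked example (the function name subtract is our assumption):

# Write a function that subtracts two numbers
def subtract(a, b):
    c = a - b
    return c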

Faded examples, however, come at a higher implementation cost than worked examples (Zamary and Rawson 2018). They require writing the correct worked example and then determining which components to remove. This adds cost during course updates: if examples are changed, the faded versions need to be double-checked by hand to ensure they are still valid, whereas worked examples can be checked more automatically.

Compounding Problems

Compounding problems are a good strategy for homework, as you can ask multiple things of students and provide a gentle ramp-up to increased complexity. Start by designing a small but complex problem, like “write a FASTQ trimmer”, where students need to implement several different subtasks:

  • file processing
  • several utility functions
  • multiple filter stages
  • a single main function which combines all of the above

If done correctly, students have the freedom to move between the individual functions that aren’t dependent on each other, making sure each is correct, before building them up into a final function.
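As an illustration, a hypothetical skeleton for such a homework might look like the following (this is our own sketch, not the actual assignment; all function names and thresholds are assumptions):

# A hypothetical skeleton for a "write a FASTQ trimmer" homework.
def read_fastq(path):
    """File processing: yield (header, sequence, quality) records."""
    with open(path) as handle:
        while True:
            header = handle.readline().strip()
            if not header:
                return
            sequence = handle.readline().strip()
            handle.readline()  # skip the '+' separator line
            quality = handle.readline().strip()
            yield header, sequence, quality

def mean_quality(quality):
    """Utility function: mean Phred score of a quality string (Phred+33)."""
    return sum(ord(c) - 33 for c in quality) / len(quality)

def passes_filters(sequence, quality, min_length=20, min_quality=20):
    """Filter stage: keep reads that are long enough and of high quality."""
    return len(sequence) >= min_length and mean_quality(quality) >= min_quality

def trim_fastq(in_path, out_path):
    """Main function combining all of the above."""
    with open(out_path, 'w') as out:
        for header, sequence, quality in read_fastq(in_path):
            if passes_filters(sequence, quality):
                out.write(f'{header}\n{sequence}\n+\n{quality}\n')

# Usage (hypothetical file names): trim_fastq('reads.fastq', 'trimmed.fastq')

Each helper can be written and tested independently before trim_fastq combines them, which is exactly the freedom described above.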

There are two alternative ways to further design the problem:

  • Provide the problem already broken down into precise, small functions they should implement.
  • Describe the problem and let students determine the optimal way to break it down into small, manageable components.

Which option is preferable depends strongly on how advanced your students are. See the example homework FastQ_trimmer.html and its associated ipynb file.

Debugging

Debugging is the act of identifying and resolving “bugs”, or defects, within code; the term is popularly attributed to my personal hero, Admiral Grace Hopper:

While she was working on a Mark II computer at Harvard University, her associates discovered a moth stuck in a relay and thereby impeding operation, whereupon she remarked that they were “debugging” the system (Wikipedia contributors 2022)

Debugging also functions as a reinforcement method we can use once students have a workable mental model of code execution; such a model is a necessary prerequisite for this activity, and is further developed through debugging (Ramalingam et al. 2004), alongside students’ self-efficacy (Michaeli and Romeike 2019). Debugging activities can take many forms, but most commonly the task is to correct incorrect code, an activity that works best if students are primed with a number of debugging methods (Murphy et al. 2008) such as the “Wolf Fence” (Gauss 1982), commenting out code, or breakpoints.

# Fix me!
for number in range(10):
    # use a if the number is a multiple of 3, otherwise use b
    if Number % 3 == 0:
        message = message + a
    else:
        message = message + "b"
print(message)

The above debugging exercise features code with numerous issues: type confusion, a variable typo, and a failure to initialise a variable. Students can run this example iteratively to figure out where it fails and attempt to fix it.
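For reference, one possible corrected version (our reading of the intended behaviour; other fixes are possible):

message = ''  # initialise the accumulator, which was missing entirely
for number in range(10):
    # use "a" if the number is a multiple of 3, otherwise use "b"
    if number % 3 == 0:  # 'number', not the typo 'Number'
        message = message + 'a'  # the string 'a', not an undefined variable
    else:
        message = message + 'b'
print(message)  # prints: abbabbabba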

Use of more complex debugging tools is not always indicated, as their cognitive complexity may be too much for students.
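By contrast, the “Wolf Fence” approach needs nothing more than print(): place a “fence” partway through the suspect code, determine which side of it the bug is on, and repeat on that half. A minimal, self-contained sketch (our own illustration, not part of the exercise above):

# Wolf Fence debugging: print intermediate state to see which side of
# the 'fence' the bug is on, then move the fence and repeat.
values = ['1', '2', 'three', '4']

total = 0
for v in values:
    print('fence:', repr(v))  # the last value printed reveals the culprit
    total = total + int(v)    # crashes on 'three': not a valid integer
print('total:', total)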

Comparisons to K-12 methodologies

In K-12 teaching, the PRIMM model (Sentance and Waite 2017) has been used to good effect. PRIMM (Predict, Run, Investigate, Modify, Make) starts with the good mental model required to predict code behaviour, uses tracing during investigation and debugging during modification, all building towards students making things themselves.

Here are some examples of how to implement the PRIMM methodology in exercises.
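For instance, a single snippet can serve all five stages (a sketch of our own, assuming a beginner-level class):

# Predict: before running, write down what this prints.
# Run: execute it and compare the output with your prediction.
# Investigate: trace the values of word and result on each iteration.
# Modify: change the code so only the word order is reversed.
# Make: write your own function that reverses each word in any sentence.
sentence = 'never odd or even'
result = ''
for word in sentence.split():
    result = word[::-1] + ' ' + result
print(result.strip())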

Key points
  • Debugging code is an absolutely critical skill that students need to become familiar with early on.

  • There are a lot of strategies you can teach students to give them the necessary skills for this, such as Wolf Fence debugging.

  • Or simply adding print statements so they can trace the code’s execution.

  • Faded examples help students become more independent as they work onwards through problems.

  • Conversely, compounding problems are sometimes a good fit for homework, giving numerous “easy” problems followed by some “stretch” goals that students can reach for, combining their previous work.

  • Pair programming is proven to improve student learning outcomes, when applied correctly.

Frequently Asked Questions

Have questions about this tutorial? Check out the FAQ page for the Contributing to the Galaxy Training Material topic to see if your question is listed there. If not, please ask your question on the GTN Gitter Channel or the Galaxy Help Forum.

References

  1. Gauss, E. J., 1982 The Wolf Fence algorithm for debugging. Communications of the ACM 25: 780. 10.1145/358690.358695
  2. Williams, L., and R. L. Upchurch, 2001 In support of student pair-programming. ACM SIGCSE Bulletin 33: 327–331. 10.1145/366413.364614
  3. Ramalingam, V., D. LaBelle, and S. Wiedenbeck, 2004 Self-efficacy and mental models in learning to program, in Proceedings of the 9th annual SIGCSE conference on Innovation and technology in computer science education - ITiCSE ’04, ACM Press. 10.1145/1007996.1008042
  4. Renkl, A., R. K. Atkinson, and C. S. Große, 2004 How Fading Worked Solution Steps Works – A Cognitive Load Perspective. Instructional Science 32: 59–82. 10.1023/b:truc.0000021815.74806.f6
  5. Werner, L. L., B. Hanks, and C. McDowell, 2004 Pair-programming helps female computer science students. Journal on Educational Resources in Computing (JERIC) 4: 4–es. 10.1145/1060071.1060075
  6. Mendes, E., L. B. Al-Fakhri, and A. Luxton-Reilly, 2005 Investigating pair-programming in a 2nd-year software development and design computer science course, pp. 296–300 in Proceedings of the 10th annual SIGCSE conference on Innovation and technology in computer science education, 10.1145/1151954.1067526
  7. Mendes, E., L. Al-Fakhri, and A. Luxton-Reilly, 2006 A replicated experiment of pair-programming in a 2nd-year software development and design computer science course. ACM SIGCSE Bulletin 38: 108–112. 10.1145/1151954.1067526
  8. Hannay, J. E., D. I. K. Sjoberg, and T. Dyba, 2007 A Systematic Review of Theory Use in Software Engineering Experiments. IEEE Transactions on Software Engineering 33: 87–107. 10.1109/tse.2007.12
  9. Mentz, E., J. L. van der Walt, and L. Goosen, 2008 The effect of incorporating cooperative learning principles in pair programming for student teachers. Computer Science Education 18: 247–260. 10.1080/08993400802461396
  10. Murphy, L., G. Lewandowski, R. McCauley, B. Simon, L. Thomas et al., 2008 Debugging. ACM SIGCSE Bulletin 40: 163–167. 10.1145/1352322.1352191
  11. Hertz, M., and M. Jump, 2013 Trace-based teaching in early programming courses, in Proceeding of the 44th ACM technical symposium on Computer science education - SIGCSE ’13, ACM Press. 10.1145/2445196.2445364
  12. Retnowati, E., 2017 Faded-example as a Tool to Acquire and Automate Mathematics Knowledge. Journal of Physics: Conference Series 824: 012054. 10.1088/1742-6596/824/1/012054
  13. Sentance, S., and J. Waite, 2017 PRIMM, in Proceedings of the 12th Workshop on Primary and Secondary Computing Education, ACM. 10.1145/3137065.3137084
  14. Zamary, A., and K. A. Rawson, 2018 Are Provided Examples or Faded Examples More Effective for Declarative Concept Learning? Educational Psychology Review 30: 1167–1197. 10.1007/s10648-018-9433-y
  15. Michaeli, T., and R. Romeike, 2019 Improving Debugging Skills in the Classroom, in Proceedings of the 14th Workshop in Primary and Secondary Computing Education, ACM. 10.1145/3361721.3361724
  16. Wikipedia contributors, 2022 Debugging — Wikipedia, The Free Encyclopedia. [Online; accessed 9-February-2022]. https://en.wikipedia.org/w/index.php?title=Debugging&oldid=1069955193
  17. Williams, L., 2001 Integrating pair programming into a software development process, in Proceedings 14th Conference on Software Engineering Education and Training. ’In search of a software engineering profession’ (Cat. No.PR01059), IEEE Comput. Soc. 10.1109/csee.2001.913816

Feedback

Did you use this material as an instructor? Feel free to give us feedback on how it went.
Did you use this material as a learner or student? Click the form below to leave feedback.


Citing this Tutorial

  1. Helena Rasche, Teaching Python (Galaxy Training Materials). https://training.galaxyproject.org/training-material/topics/contributing/tutorials/python/tutorial.html Online; accessed TODAY
  2. Batut et al., 2018 Community-Driven Data Analysis Training for Biology Cell Systems 10.1016/j.cels.2018.05.012



@misc{contributing-python,
    author = "Helena Rasche",
    title = "Teaching Python (Galaxy Training Materials)",
    year = "",
    month = "",
    day = "",
    url = "\url{https://training.galaxyproject.org/training-material/topics/contributing/tutorials/python/tutorial.html}",
    note = "[Online; accessed TODAY]"
}
@article{Batut_2018,
    doi = {10.1016/j.cels.2018.05.012},
    url = {https://doi.org/10.1016%2Fj.cels.2018.05.012},
    year = 2018,
    month = {jun},
    publisher = {Elsevier {BV}},
    volume = {6},
    number = {6},
    pages = {752--758.e1},
    author = {B{\'{e}}r{\'{e}}nice Batut and Saskia Hiltemann and Andrea Bagnacani and Dannon Baker and Vivek Bhardwaj and Clemens Blank and Anthony Bretaudeau and Loraine Brillet-Gu{\'{e}}guen and Martin {\v{C}}ech and John Chilton and Dave Clements and Olivia Doppelt-Azeroual and Anika Erxleben and Mallory Ann Freeberg and Simon Gladman and Youri Hoogstrate and Hans-Rudolf Hotz and Torsten Houwaart and Pratik Jagtap and Delphine Larivi{\`{e}}re and Gildas Le Corguill{\'{e}} and Thomas Manke and Fabien Mareuil and Fidel Ram{\'{\i}}rez and Devon Ryan and Florian Christoph Sigloch and Nicola Soranzo and Joachim Wolff and Pavankumar Videm and Markus Wolfien and Aisanjiang Wubuli and Dilmurat Yusuf and James Taylor and Rolf Backofen and Anton Nekrutenko and Björn Grüning},
    title = {Community-Driven Data Analysis Training for Biology},
    journal = {Cell Systems}
}
                   

Congratulations on successfully completing this tutorial!
Developing GTN training material
This tutorial is part of a series to develop GTN training material; feel free to also look at:
  1. Overview of the Galaxy Training Material
  2. Adding auto-generated video to your slides
  3. Adding Quizzes to your Tutorial
  4. Contributing with GitHub via command-line
  5. Contributing with GitHub via its interface
  6. Creating a new tutorial
  7. Creating content in Markdown
  8. Creating Interactive Galaxy Tours
  9. Creating Slides
  10. Design and plan session, course, materials
  11. Generating PDF artefacts of the website
  12. GTN Metadata
  13. Including a new topic
  14. Principles of learning and how they apply to training and teaching
  15. Running the GTN website locally
  16. Running the GTN website online using GitPod
  17. Teaching Python
  18. Tools, Data, and Workflows for tutorials
  19. Updating diffs in admin training