This is how I met the requirements for this competency: Mentored Research Project:
Influence of context on the architecture of student-generated models in an intro-bio course
Assessing teaching and learning: Backward Design - Dr. Diane Ebert-May, Fall 2017
Research Project: To read more about my mentored research project please click here.
6-step outline: Teaching and Learning Goal: We already know that context influences students' epistemological understanding (Gobert et al., 2011; Krell et al., 2012, 2014) and meta-modeling knowledge (Schwarz, 2002) of models and modeling. Additionally, we know from analysis of students' written responses that context influences the way they reason about evolution by natural selection. What we do not know, and want to find out, is whether and how context influences the way students construct models, especially models of evolution by natural selection. Teaching Question: We tested this question in Dr. Long's section of BS162 in FS15. This is the second course in a two-part introductory biology series that is required for most Life Science majors; it is called Organismal and Population Biology.
These were our questions: Q1. How does context affect the architecture of student-generated models? Q2. How much of this variation can be explained by demographic variables, such as prior achievement (GPA)?
Classroom Practice: Dr. Long's classes are active-learning classes in which scientific skills are stressed. As part of regular course work, students construct and evaluate models; collect, represent, and analyse data; and use evidence to build arguments.
The data for this project came from an in-class assessment given as practice for an exam. Students took about 15 minutes to complete it. Prior to grading, the assessment was scanned, and these scans were used for analysis. The scans were not anonymous; however, since the lab I work in is a biology education research lab, we have procedures in place to deidentify the data. All analysis was done using the deidentified data.
We also obtained data from the registrar's office to answer the second question. These data were likewise deidentified prior to analysis.
Assessment technique: We administered four isomorphic prompts of the form: “(Taxon) has (trait). How would biologists explain how a (taxon) with (trait) evolved from an ancestral (taxon) without (trait)?”. This prompt stem has been validated by researchers analysing students' narrative responses (Nehm & Ha, 2011).
Each student provided model-based responses to two prompts (same type of trait, different taxa). We analysed the models for aspects of model architecture such as size (number of structures, arrows, and propositions) and complexity (Web-Causality Index, WCI). This type of data has been used in prior research, including prior research in my lab, to make claims about students' cognitive structures (Dauer et al., 2013; Ifenthaler, Masduki, & Seel, 2011).
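To illustrate the size metrics described above, a student-generated model can be represented as a set of labeled arrows (propositions) connecting structures. The sketch below is an illustrative assumption about the data layout, not the lab's actual analysis code, and it covers only the size metrics; the WCI complexity calculation follows the cited literature (Dauer et al., 2013) and is not reproduced here.

```python
# Illustrative sketch (not the lab's actual analysis code): a student model
# represented as triples (source structure, arrow label, target structure).

def model_size(propositions):
    """Count the size metrics named above: structures, arrows, propositions."""
    # Structures are the unique concepts appearing at either end of an arrow.
    structures = {s for src, _, tgt in propositions for s in (src, tgt)}
    # Each triple is drawn as one arrow; a labeled arrow forms a proposition.
    arrows = len(propositions)
    labeled = sum(1 for _, label, _ in propositions if label)
    return {"structures": len(structures), "arrows": arrows, "propositions": labeled}

# Hypothetical three-concept fragment of a natural-selection model
example = [
    ("variation", "leads to", "differential survival"),
    ("differential survival", "results in", "change in allele frequency"),
]
print(model_size(example))  # {'structures': 3, 'arrows': 2, 'propositions': 2}
```

In this representation an unlabeled arrow still counts toward size but not toward the proposition count, which is one plausible way to keep the three metrics distinct.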
We then analysed the registrar's data to explore the effects of demographic variables (performance and class level) on variation in model architecture.
Summary: Our results indicate that contextual features (here, taxon) elicit differences in model architecture. While the context of the prompt did not significantly affect model size, complexity did vary with context. This could indicate that while students use the same number of concepts to explain natural selection in both humans and cheetahs, their cognitive structures are more connected when they reason about non-human animals.
Middle-achieving students constructed models that were unaffected by context, in terms of both size and complexity. Students with lower GPAs showed the greatest contextual variation in complexity, with the highest mean complexity on the cheetah prompts; their models were small in both contexts. High-achieving students' models were low in both size and complexity.
Conclusion: Scientific argumentation and scientific modeling are two of the skills and practices that have been deemed important for undergraduate biology education. From students' written responses, we had evidence that context affects the way they reason about natural selection. Now we have evidence that it also affects the way they construct models. This has implications for both instruction and assessment.
Using relevant and relatable contexts is important in instruction. From our findings we see that students' models of human evolution are the least complex. Therefore, it might be advisable to use humans as the model organism when talking about natural selection; after all, what could be more relevant and relatable? Additionally, in the interest of efficiency, since it is not possible to teach every concept in every context, it would be good to know which contexts are 'troublesome' and use them in instruction.
Context can act as a barrier to knowledge transfer: students are often unable to see the conceptual linkages between one context and another. When it comes to assessment, we want to be sure that exams actually measure what we set out to measure. Instructors often frame parallel questions, using a context other than the one used during instruction to test conceptual knowledge. However, if the test context does not elicit the same knowledge from the student, it becomes an invalid measure of that knowledge.
The part of this project that surprised me most is that middle achievers seem immune to contextual influences. This group is often overlooked; resources tend to go either to low achievers or to high achievers. It would be very interesting to know what exactly is going on in their minds!
Products: This research will become part of the second chapter of my dissertation.
It has already been presented at the following conferences: Contributed oral presentations:
de Lima, J. 2018. Influence of context on the architecture of student-generated models in an intro-bio course. Michigan AGEP Alliance Fall Conference, MSU, Michigan, USA.
de Lima, J., Long, T. M. 2018. Contextual Differences Influence Model Architecture. Society for the Advancement of Biology Education Research (SABER) National Meeting, University of Minnesota, Minnesota, USA.
de Lima, J. 2018. Influence of context on the architecture of student-generated models in an intro-bio course. Future Academic Scholars in Teaching (FAST) Fellowship Program 12th Annual Symposium, MSU, Michigan, USA.
de Lima, J. 2018. An exploration of contextual influence on model architecture and cognitive structures. Plant Science Graduate Student Research Symposium, MSU, Michigan, USA.
de Lima, J., Long, T. M. Influence of context on the architecture of student-generated models in an intro-bio course. MSU Graduate Teaching and Learning Fellows Poster Symposium, MSU, Michigan, USA.