Presenters
  • Emma Anderson, Research Scientist, Making Sense of Models, MIT STEP Lab
  • Irene Lee, Research Scientist, Making Sense of Models, MIT STEP Lab / Education Development Center (EDC) (https://www.linkedin.com/in/irene-a-lee/)
  • Aditi Wagh, Research Scientist, Making Sense of Models, Massachusetts Institute of Technology
Public Discussion


  • Emma Anderson

    Lead Presenter
    Research Scientist
    May 9, 2022 | 09:12 p.m.
    Thank you for watching our video! My name is Emma Anderson, and I am one of the researchers on Making Sense of Models. My colleagues Irene Lee and Aditi Wagh and I are very interested in hearing from you.

    In our video we speak about how we present students with broken models and ask them to fix them so they better represent the scientific phenomenon being modeled. We would love to hear what you think of this approach. How do you help students think through the validity of a computational scientific model? 

    All of our units link math, science, and CT learning. We use the broken-model activity as a way for students to think through the science phenomena they have learned, understand the math concepts and relationships they have learned, and connect to the underlying CT in the model as they figure out how to fix it. Do you do cross-curricular CT integration? If so, how do you help students make links across different disciplines so that they gain knowledge and skill in both (or all three)? What do you think of our approach?

    Of course, we would love to hear all questions and comments, even those not related to these two areas of focus (model validity and cross-curricular learning).
  • Brian Drayton

    Researcher
    May 10, 2022 | 01:38 p.m.

    I like this approach -- giving students the opportunity (and indeed the need) to think critically in domain-specific terms.  It does seem as though construing "model" principally in terms of computer models may introduce additional levels of complexity that can distract from the more basic challenges of learning how to critique representations vis-à-vis the phenomena being represented.  Are you (is your research) interested in the more general problem of model validation?  

  • Emma Anderson

    Lead Presenter
    Research Scientist
    May 10, 2022 | 01:46 p.m.
    Brian! Thanks for your question. Because our project is interested in the overlap and integration of math, science, and CT, we are specifically focused on computational models and on how students' critical exploration of computational models can create rich learning moments. But we would be happy to learn how others help students explore validity with other types of models.
  • Joan Ferrini-Mundy

    Facilitator
    University President
    May 11, 2022 | 02:12 p.m.

    Very interesting work.  Can you tell us more about any specific learning outcomes you have in mind, in mathematics, scientific process, or computing?  It would seem there might also be gains in confidence that would be good to assess.  Do you have data about what students see that signals the model is broken - and do students see the "same problems" in a given model?

  • Emma Anderson

    Lead Presenter
    Research Scientist
    May 11, 2022 | 04:18 p.m.

    Thanks for your questions! On your first question about specific learning outcomes: we have aligned our units with both the Common Core and NGSS standards for 6th grade. We have also focused on particular CT/CS skills and knowledge, such as critically viewing models to assess validity.

    In terms of the 'broken' models, there are two places where students encounter them. At the end of each unit we have an activity called 'fix-its,' where students are presented with a model and told the intent the modeler had in building it, for example: "the modeler wanted to model what happens when sunlight reflects down on white and black roofs." Students are asked to run the model, figure out whether it is showing what the modeler intended, and, if not, figure out what is wrong and how to fix it.
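    For readers curious what such a 'fix-it' might look like in code, here is a minimal sketch in Python. (The units themselves use block-based computational models; this roof model, its function names, and its numbers are purely illustrative, not the project's actual materials.)

    ```python
    # Illustrative "fix-it" sketch: sunlight rays hit a roof. A black roof
    # should absorb light energy and heat up; a white roof should reflect
    # it. The conditional below is inverted -- exactly the kind of error
    # students are asked to find and fix.

    def run_model(roof_color, n_rays=1000):
        """Count units of heat a roof absorbs from n_rays of sunlight."""
        heat = 0
        for _ in range(n_rays):
            # BUG: the branch is reversed -- "white" here should be "black".
            if roof_color == "white":
                heat += 1   # ray absorbed, roof heats up
            # otherwise the ray is reflected and adds no heat
        return heat

    # Running the broken model shows the white roof heating up while the
    # black roof stays cold -- the opposite of what the modeler intended.
    print(run_model("white"))  # 1000
    print(run_model("black"))  # 0
    ```

    Here the mismatch between the model's behavior and the intended phenomenon is visible from a single run, which mirrors how students can spot an error just by running the simulation and watching the output.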

    Students who were selected to participate in the artifact-based interview (ABI) were also shown broken models and told the intent the modeler had for the model they built. In the ABI we have seen that, usually within a short run of the simulation, most students are able to identify at least one error in the model. Some students are able to figure out how to solve this error, and some are able to point out additional errors.

    So, to answer your second question: most students are able to identify one error in the model, and a few students are able to identify more than one.

    Let me know if this answered your question or if I can clarify or answer more questions. 

  • Aditi Wagh

    Co-Presenter
    Research Scientist
    May 11, 2022 | 09:12 p.m.

    Hi Joan! Thanks for your questions!


    To build on Emma’s reply about learning outcomes: a key learning outcome we are interested in is students’ understanding of mechanism. Each of our curricular units centers on a particular mechanism that is represented in code in the model. For instance, in one unit, students explore probabilistic conditionals to investigate the mechanism of varied reaction to surfaces. Our math and science learning outcomes in each unit are centered on that particular mechanism. In this unit, for instance, students investigate the albedo effect and its impact on rising ocean temperatures in science, and explore probability and chance in math. In this way, each unit focuses on coding, math, and science concepts that are tightly linked (here: probabilistic conditionals; probability; albedo and rising ocean temperatures).
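    As a rough illustration of what a probabilistic conditional tied to albedo might look like, here is a short Python sketch. (The units use block-based models; the function names and albedo values below are my own hypothetical example, not the project's code.)

    ```python
    import random

    def ray_absorbed(albedo, rng):
        """Probabilistic conditional: absorb a ray with probability 1 - albedo.

        A surface's albedo is the fraction of incoming light it reflects,
        so low-albedo (dark) surfaces absorb most incoming rays.
        """
        return rng.random() > albedo

    def simulate(albedo, n_rays=10_000, seed=42):
        """Count how many of n_rays a surface with the given albedo absorbs."""
        rng = random.Random(seed)   # seeded for reproducible runs
        return sum(ray_absorbed(albedo, rng) for _ in range(n_rays))

    # Dark ocean water (low albedo) absorbs far more energy than sea ice
    # (high albedo) -- the link between albedo and rising ocean temperatures.
    water = simulate(albedo=0.1)   # roughly 90% of rays absorbed
    ice = simulate(albedo=0.8)     # roughly 20% of rays absorbed
    print(water, ice)
    ```

    The single conditional in `ray_absorbed` is where the coding concept (a probabilistic conditional), the math concept (probability), and the science concept (albedo) all meet, which is the kind of tight linkage described above.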


    The assessment corresponding to each unit is designed for us to investigate this linked code-math-science mechanistic understanding. We are looking for gains in each independently (code, math and science) as well as gains in their mathematical and scientific understanding of a concept through code (math+code and science+code). Feel free to reach out if you’d like to learn more or look at the assessments (awagh@mit.edu). 


    In our student interviews, we are seeing that students detect that the model is broken in different ways. Many students notice that something is off from the visuals in a simulation run of the model. Some students notice the graph that is dynamically updated as the model runs, and recognize that the trend does not look right. Fewer students look right away at the code and notice that it’s broken. As Emma pointed out, most students are able to identify that something is wrong with the model. 


    And you’re absolutely right about gains in confidence. In interviews, students talk about feeling proud that they could tell the model was wrong and were able to fix it. Field observers from classroom implementations have also noted that students express feelings of confidence around being able to engage in modeling and coding. This is something we're looking to assess in future work. 

     
  • Tichina Ward-Pratt

    Facilitator
    Educator
    May 12, 2022 | 02:32 p.m.

    Thanks for sharing this great curriculum. I appreciate the integration of Science, Math, and CS.

    I am wondering how student choice and experience are embedded into these complex lesson plans. Also, what motivated the creation of this curriculum?

  • Emma Anderson

    Lead Presenter
    Research Scientist
    May 13, 2022 | 04:14 p.m.

    Tichina! 

    The motivation for this curriculum came from a practitioner in the field, the head of a science department, who mentioned in conversation that in his district's middle schools he knew students were learning math in their math classes but were having a really hard time applying it in their science classes. We had also heard and seen from teachers who have incorporated computational modeling into their curriculum that they had to spend extra time reteaching math concepts. This knowledge, that students were learning math but not seeing how it interconnects with other subject areas, led us to create this multidisciplinary set of lessons.

  • Laura Santhanam

    Health Reporter & Coordinating Producer for Polling
    May 12, 2022 | 05:07 p.m.

    Such empowering work! This is a great way to engage students in issues that more closely resemble real-world problems. It's rare that we walk into an error-free situation. This trains them to look for flaws and to use the tools to fix them. It would be great to hear more about how students grow and collaborate in this model.

  • Emma Anderson

    Lead Presenter
    Research Scientist
    May 13, 2022 | 04:15 p.m.

    We too are excited to dig into our data and learn more about how students are engaging with the curriculum.

  • Joi Spencer

    Facilitator
    Interim Dean and Professor of Mathematics Education
    May 12, 2022 | 07:30 p.m.

    I really enjoyed learning about this project. I also love the idea of students trying to figure out what is broken in code so that it can be fixed to better portray a scientific phenomenon. I loved hearing the young student share how empowering it was to find what was broken. In some of my own work, I give students challenging visual puzzles to solve and then ask them to share about the process of finding the solution.

    Can you share about students' persistence, and specifically about their ability to persist productively, even when the task is really challenging and they are stuck? Also, I note that you have a focus both on the coding and on the scientific phenomena. How are these dual learning goals managed? Do students struggle more with the coding or with the scientific concepts? Or perhaps with something different altogether?

    Thanks and I look forward to learning more.

  • Emma Anderson

    Lead Presenter
    Research Scientist
    May 13, 2022 | 04:44 p.m.

    Joi!

    These are really great questions! 

    Through a design-based research process, we have worked very hard to make sure that math, computational thinking, and science are equally weighted in the lessons and activities. By seeing how students engaged with different aspects of the lessons, and by speaking with teachers and our field investigators, we have rewritten our lessons to make sure we are tightly coupling math, CT, and science concepts. For example, in our unit "Its Gettin' Hot in Here!", students learn in math how to write the location of an object in 3D space using the z coordinate. In science, students learn about absorption and reflection of light energy from white and black surfaces, and for CT, students learn about if-then logic. These three concepts are interwoven throughout the lessons to help students engage with all three domains of knowledge and interlink these ideas through their engagement with computational models.

    We are still digging into our data and cannot yet speak to where, or with what, students struggle the most. I can say that in their artifact-based interviews we see students gain confidence and fluency with code and a stronger understanding of science phenomena.

  • Jessica Stovall

    Higher Ed Faculty
    May 14, 2022 | 11:17 p.m.

    I enjoyed watching the video for your project. I am part of a team that has been researching the impacts of using programming to teach mathematical reasoning. Though our current project has focused on 7th and 8th grade math classrooms, we have used our instructional model in science classrooms as well. We have a high school forensics lesson, an 8th grade science lesson about waves, and a high school bioinformatics lesson, developed in conjunction with physics faculty members and local science teachers. When we are training teachers to use our instructional model, we often have them investigate and correct broken code. We have found this builds their confidence in programming before they lead students in the coding activities. It is interesting to see the students in this role, and it will be something for us to consider in the future as well. Thanks for sharing your work!

  • Leah Wiitablake

    Graduate Student
    May 16, 2022 | 05:35 p.m.

    I really like how the student was proud of the fact that they were able to identify the issue with the model themselves. And from one of the replies it sounds like other students said similar things! Building confidence in problem-solving and STEM is fantastic!
