Positioning Youth for Success in Science: Studying the Malleability and Impact of Computational Thinking for Science

Presenters:
  • Eric Greenwald, Senior Research Lead, UC Berkeley, Lawrence Hall of Science
  • Mac Cannady, Research Group Director, UC Berkeley
  • Rena Dorph, Director, The Lawrence Hall of Science, UC Berkeley
Public Discussion


  • Barbara Hopkins

    Science Education Consultant
    May 10, 2022 | 09:24 a.m.

    Hello Eric:

    I will look further into your CT-S measurement framework to see how our work might use it as part of our impact assessment. We deliberately pushed for purposeful measurement and quantitative data analysis in hopes that our general biology courses would include more computation. We learned from work with our biology teachers that Punnett squares were the most common response when asked about computation in biology classes; they just used pre-existing Punnett squares and did not analyze population data. Has your work seen a similar trend in K-12 life sciences?

  • Eric Greenwald

    Lead Presenter
    Senior Research Lead
    May 10, 2022 | 12:31 p.m.

    Thanks for sharing! What you found about Punnett squares resonates--one thing that emerged as we were developing the framework is just how uncommon computational data practices are in K-12 classrooms, despite being pervasive in modern science. Our measure includes items that attend to these kinds of 'data moves' as well as other computational practices common in 21st-century science. Please let us know how we might support your work!
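
    To make the contrast concrete, here's a minimal sketch (a toy Python example of my own, not an item from our measure) of the kind of 'data move' a pre-filled Punnett square skips: simulating a monohybrid cross and comparing the resulting population data to the textbook 3:1 expectation.

        # Toy illustration: generate offspring from two heterozygous (Aa) parents,
        # then compare observed phenotype counts to the expected 3:1 ratio.
        import random
        from collections import Counter

        def cross(parent1, parent2):
            """Draw one allele at random from each parent."""
            return random.choice(parent1) + random.choice(parent2)

        random.seed(42)
        offspring = [cross("Aa", "Aa") for _ in range(1000)]
        observed = Counter("dominant" if "A" in g else "recessive" for g in offspring)
        expected = {"dominant": 750, "recessive": 250}  # 3:1 split of 1000

        print("observed:", dict(observed))
        print("expected:", expected)

    Students can then ask why the observed counts wobble around the expectation, which opens the door to sample size, variability, and other data practices that a static diagram cannot.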

  • Eric Greenwald

    Lead Presenter
    Senior Research Lead
    May 10, 2022 | 01:13 p.m.

    Hi folks, thanks for your interest in our work! Here's a link to our technical report for the measure. The report details the conceptualization, development, and validity evidence for the Computational Thinking for Science (CT-S) survey: we provide an operational definition of CT-S, describe the survey construction process, and present validity evidence to support inferential claims from the survey about a middle school student’s CT-S ability.

     
  • Tichina Ward-Pratt

    Facilitator
    Educator
    May 13, 2022 | 12:44 a.m.

    This is a great project. Thanks for sharing. How were you able to measure the effectiveness of how your framework used computational tools to achieve science goals?

  • Eric Greenwald

    Lead Presenter
    Senior Research Lead
    May 13, 2022 | 04:13 p.m.

    Thanks for the comment! I'm not sure if this is answering your question exactly, but our technical report lays out how we gathered evidence for the validity of the assessment--that it is measuring something meaningful and not already captured in other similar measures--and some background on how we developed the assessment.

    We also have a manuscript under review that provides the theoretical grounding for our assessment framework, which centers activity with computational tools. We define a computational tool as anything that can compute, that is, carry out sequences of arithmetic or logical operations automatically in accordance with a well-defined model (e.g., an algorithm). Both digital and analog artifacts, such as calculators and slide rules respectively, have computational affordances. In short, we see three ways in which computational tools are involved in computational thinking:

     

    • Reflective Use of a Computational Tool: building or modifying a mental model of that computational tool’s functionality through interaction with that tool.
    • Design of a Computational Tool: building or modifying a mental model of an imagined computational tool’s functionality.
    • Evaluation of a Computational Tool: building or modifying a mental model of the affordances and limitations of that computational tool’s functionality.

     

    These definitions are grounded in Activity Theory, which stipulates that cognition occurs through interaction with tools in pursuit of a goal. In other words, each of the above definitions assumes that the cognitive processes happen within a goal-directed activity. In the case of CT-S, the goal-directed activity is necessarily a science activity.
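
    As a toy illustration of those three modes (hypothetical, not drawn from our instrument), consider a few lines of Python as the computational tool: a well-defined model carried out as a sequence of arithmetic operations.

        # A toy computational tool: discrete logistic growth as a well-defined model.
        def population_model(p0, rate, capacity, steps):
            """Project a population forward with a discrete logistic update rule."""
            population = p0
            for _ in range(steps):
                population += rate * population * (1 - population / capacity)
            return population

        # Reflective Use: interact with the tool to build a mental model of it.
        print(population_model(10, 0.4, 500, 25))   # settles near the carrying capacity

        # Evaluation: probe the tool's affordances and limitations.
        print(population_model(10, 3.0, 500, 25))   # high rates yield erratic output

        # Design: imagine or modify the tool itself, e.g., sketching how a
        # harvesting term would change the update rule before writing any code.

    In each case the cognition of interest is the learner's evolving mental model of the tool, exercised within a science goal (here, reasoning about population growth).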

     
  • Barbara Hopkins

    Science Education Consultant
    May 10, 2022 | 02:55 p.m.

    That's exactly what I needed... the Technical Report! Thank you! BTW, if you haven't read "Homo Deus", I highly recommend it!

  • Joan Ferrini-Mundy

    Facilitator
    University President
    May 11, 2022 | 05:37 p.m.

    Fascinating work, to tackle the conceptual and cognitive framing of computational thinking. I am interested in whether the assessment can be used in informal STEM learning environments, whether it already is, and what you are learning. Also, as we increasingly see opportunities for college students to engage in research and projects that require integrating computational thinking with other STEM areas, what are your thoughts on the developmental trajectory of computational thinking in science? Do you have research underway on that front?

  • Eric Greenwald

    Lead Presenter
    Senior Research Lead
    May 11, 2022 | 06:35 p.m.

    Thank you for your interest! We have not yet tested the measure in informal learning settings, but as part of the Activation Lab's broader measurement development effort, that is definitely an area we'll be looking to study--please let us know if you might be interested in partnering or participating in that work! For more background on the instrument and the evidence for its validity that we have been able to gather in classroom settings, please take a look at the technical report.

    As for your question about the trajectory of CT in science, our team has been thinking about that a lot! As an example, here's a recent publication considering the prospects for integrated CT+S learning. We have several projects exploring learning at this intersection and have been working with computer scientists, AI researchers, bioengineers, climate scientists, and others to design ways to support young learners' engagement in contemporary science practices (in formal and informal contexts). In broad terms, we see two trends to consider together: the increasingly integral role of computational and AI-involving practices in science, and the increasingly transdisciplinary nature of the grand challenges that science can help make progress on.

     

  • Rena Dorph

    Co-Presenter
    Director, The Lawrence Hall of Science
    May 12, 2022 | 08:10 p.m.

    Adding to Eric's response--the measure is designed to be used across learning settings, like our other Activation Lab measures. While we recruited study participants for this project from classroom-based settings, we will definitely be using it in informal settings--summer camps, youth programs, afterschool spaces, etc. In fact, this summer we're planning to use it in some of our summer camp programs to learn whether CT-S changes over the course of a 2-week camp program. We'll share what we learn once we have done so.
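
    As a rough sketch of what that pre/post comparison could look like (hypothetical scores, not camp data): a paired comparison of each camper's CT-S scale score before and after the program.

        # Hypothetical pre/post CT-S scale scores for the same eight campers.
        from scipy import stats

        pre  = [0.42, 0.55, 0.38, 0.61, 0.47, 0.50, 0.33, 0.58]
        post = [0.51, 0.60, 0.45, 0.63, 0.52, 0.57, 0.41, 0.62]

        gain = sum(b - a for a, b in zip(pre, post)) / len(pre)
        t, p = stats.ttest_rel(post, pre)   # paired t-test across campers
        print(f"mean gain = {gain:.3f}, t = {t:.2f}, p = {p:.4f}")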

  • Lynn Cominsky

    Higher Ed Faculty
    May 12, 2022 | 01:50 p.m.

    We have developed an integrated CSTEM class for 9th grade students and will definitely look into your framework. For anyone else who is interested in our Learning by Making curriculum, please feel free to contact me and to check out our website: http://lbym.sonoma.edu

  • Joi Spencer

    Facilitator
    Interim Dean and Professor of Mathematics Education
    May 12, 2022 | 05:49 p.m.

    Thank you for sharing this intriguing study. Your study recognizes that the work scientists engage in is often so different from what we see occurring in classrooms. Did you collaborate with practicing scientists to get a sense of the computational work they do in their profession? Do students get to interact with scientists?

    Also, I noticed in your video (close to minute one) that students were posing questions about how they might investigate different phenomena (like using technology to track the sun at the horizon over the course of a year). These questions seemed particularly sophisticated and insightful. Moreover, they demonstrate an understanding of what tool one would need in order to investigate a particular phenomenon. Can you share about the process of shaping students' competencies with asking these kinds of scientific and investigative questions? Thank you for your work.

  • Eric Greenwald

    Lead Presenter
    Senior Research Lead
    May 12, 2022 | 07:53 p.m.

    Thanks for your comment! For our measurement development work, we drew on recent studies that inquire into the computational work practicing scientists engage in (for example, Weintrop et al., 2016) as well as our network of scientists and advisors on this project. Our work with students was primarily in cognitive interviews, to understand what different kinds of computational tasks reveal about youth thinking and to evaluate alignment between what we think an item is measuring and how youth are interpreting the question.

    In terms of the questions--are you referring to the text in the framework cells (1:24)? They are sophisticated questions, for sure! The questions are meant to evoke the kinds of tasks that would clearly engage CT-S; that is, a student answering such a question would be likely to exhibit CT-S in observable ways (e.g., for assessment). As your comment suggests, there remains a huge challenge in supporting students with the competencies to ask these kinds of questions as well as answer them--our hope is that the framework and tool developed through this project serve to articulate the aspects of CT that best position youth for success in science learning, and can inform educators and curriculum developers seeking to advance student learning in those areas.
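
    To give a flavor of the computation behind that sun-tracking question (a back-of-the-envelope sketch using standard textbook approximations, not a project artifact): the point where the sun rises on the horizon can be estimated from the solar declination, which a student could compute and plot across a year.

        # Approximate sunrise azimuth over the year (ignores atmospheric refraction).
        import math

        LATITUDE = 37.9  # degrees north; Berkeley, as an example

        def sunrise_azimuth(day_of_year, latitude_deg):
            """Estimate sunrise azimuth in degrees east of north."""
            # Cooper's approximation for solar declination.
            decl = 23.44 * math.sin(math.radians(360 * (284 + day_of_year) / 365))
            # At the horizon: cos(azimuth) = sin(declination) / cos(latitude).
            cos_az = math.sin(math.radians(decl)) / math.cos(math.radians(latitude_deg))
            return math.degrees(math.acos(cos_az))

        for day, label in [(172, "June solstice"), (265, "September equinox"), (355, "December solstice")]:
            print(f"{label}: sunrise ~{sunrise_azimuth(day, LATITUDE):.0f}° east of north")

    Running this shows the sunrise point swinging roughly 60 degrees along the horizon between the solstices at this latitude--exactly the pattern a student's year-long tracking would reveal.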

  • Jose Felipe Martinez

    Researcher
    May 12, 2022 | 10:00 p.m.

    Thanks for sharing, this is a very interesting project.

    I understand your measurement work centered on developing an instrument to assess the key outcome of interest here (CT learning). On the other hand, the framework in your video explicitly positions pedagogy as a key factor (perhaps THE key?) influencing this outcome.

    I am wondering whether you also tried to measure instructional practices/pedagogical approaches in your project. If you did, I would be very interested in learning how you went about it. If not, what are your thoughts on how we might build a knowledge base (and eventually instruments) to measure instruction related to CT?

    Developing measures of classroom processes is an area very close to my interests and I am currently collaborating with Math and Science educators at UCLA/Center X to develop and pilot an observation rubric to measure CT-instruction. Thanks!

  • Eric Greenwald

    Lead Presenter
    Senior Research Lead
    May 13, 2022 | 12:25 p.m.

    Thanks for your comment! We are also very invested in understanding how learning experiences can be designed to support learning at the intersection of science and computational thinking--the framework you refer to (coming in at about the 0:50 mark) is from a project synergistic with the measurement development work. That project (NSF DRL#1657002) has been important in better understanding how youth engage in computational thinking while pursuing science goals--findings that helped us conceive of appropriate task structures for the measurement work and articulate the CT-S assessment framework (presented at about 1:20).

    Our hope is that the assessment will help clarify the kinds of competencies that best position students for success in science learning--to ground design of learning experiences so that they are likely to advance these competencies and to understand the extent to which learning experiences are successful in doing so. Thus, we see the measure as an important tool to identify and refine promising instructional practices and pedagogical approaches.

  • Adelmo Eloy

    Researcher
    May 13, 2022 | 08:03 a.m.

    Thanks for sharing this work! Also, thanks for sharing the technical report, it is very informative.

    Could you tell us more about the digital tools students have been using to engage with computational modeling? I wonder if designing these tools is also part of the project, or if you have found any that suit your needs.
    By the way, we are designing a block-based programming environment with this same goal, in case you want to check it out (link) :)

  • Eric Greenwald

    Lead Presenter
    Senior Research Lead
    May 13, 2022 | 12:34 p.m.

    Thank you for your comment! The digital tools seen in the video are from a synergistic project (NSF DRL#1657002) that has been studying how youth engage in computational thinking while pursuing science goals. The design of digital tools for computational modeling and computational data practices has been a longstanding effort among the research and design team here at the Lawrence--as you have likely encountered in your work, it's no trivial task to develop a tool or programming environment that is accessible for the task at hand (e.g., building a computational model) and supportive of learning (rather than concealing) the computational competencies underlying the task. I look forward to checking out your work as well!
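
    As a toy example of that tension (my own sketch, not one of the project's tools): even a few lines of model code can either surface or conceal the computational moves involved, and keeping state, iteration, and the update rule visible is what makes them learnable.

        # A simple computational model of the kind students might build:
        # Newton's law of cooling as an explicit state-update loop.
        def simulate_cooling(temp, ambient, k, minutes):
            """Step a hot object's temperature toward ambient, minute by minute."""
            history = [temp]
            for _ in range(minutes):
                temp += -k * (temp - ambient)   # the model's update rule
                history.append(temp)
            return history

        print(simulate_cooling(temp=90.0, ambient=20.0, k=0.1, minutes=10))

    A drag-and-drop widget could produce the same cooling curve, but hiding the loop and the update rule behind it is precisely what keeps those competencies out of reach.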