  • Lei Liu, Senior Research Scientist, Student Reasoning Patterns in NGSS Assessments (SPIN-NGSS), Educational Testing Service
  • Dante Cisterna, Research Developer, Student Reasoning Patterns in NGSS Assessments (SPIN-NGSS), Educational Testing Service (https://www.linkedin.com/in/dante-cisterna-6b076415/)
  • Devon Kinsey, Senior Research Assistant, Student Reasoning Patterns in NGSS Assessments (SPIN-NGSS), Educational Testing Service
  • Yi Qi, Senior Research Project Manager, Student Reasoning Patterns in NGSS Assessments (SPIN-NGSS), Educational Testing Service
Public Discussion

Continue the discussion of this presentation on the Multiplex.

  • Dante Cisterna

    Co-Presenter
    Research Developer
    May 9, 2022 | 03:00 p.m.

    Welcome to our SPIN-NGSS video. This video describes the development of an AI-supported tool that provides feedback to teachers and students on student reasoning patterns in three-dimensional science learning. The video also presents findings from a small research study conducted with middle and high school students.

    Please post your comments and questions about our video and project. 

  • Ekundayo Shittu

    Higher Ed Faculty
    May 10, 2022 | 10:33 a.m.

    Three-dimensional learning... very nice concept. While the use of AI is promising, it is also important to note that there are trustworthiness challenges related to its implementation. How are some of these challenges addressed in this project?

    Kudos for the excellent video... very clearly explained. I voted!

     
  • Lei Liu

    Lead Presenter
    Senior Research Scientist
    May 10, 2022 | 10:41 a.m.

    Thanks very much for the nice comment, Ekundayo! I totally agree about the implementation challenges, which are not addressed in this project. We are preparing a new proposal to tackle the classroom implementation issues. We would also welcome any brilliant ideas from teachers and/or researchers here.

  • Janet Coffey

    Facilitator
    Program Director, Science Learning
    May 13, 2022 | 02:59 p.m.

    To piggyback on Ekundayo's comment - given the biases often inherent in AI, are you tracking in this study how these could become manifest in broader use of this technology? And/or are you expressly looking for evidence that the AI algorithms employed in this tool do not reflect such biases? Your research on ways to optimize technology to support learning is so important. Thank you for sharing (your video was great!).

     

  • Lei Liu

    Lead Presenter
    Senior Research Scientist
    May 16, 2022 | 10:31 a.m.

    Hi Janet,

    Thank you for your great comments!

    We are exploring other funding opportunities to pilot broader use of this technology. I am a hundred percent with you on the importance of addressing potential biases in AI, and we did document a few. For example, we noticed that our current AI algorithms didn't take into account how students' misunderstanding of certain vocabulary might have impacted their reasoning patterns (particularly when the sample of such misunderstandings was not big enough). We have a list of such potential issues and will explore potential solutions (either through AI alone or through AI + human/teacher). This could be a limitation of using existing data in our study.

    Lei

  • Barry Fishman

    Facilitator
    Professor
    May 10, 2022 | 11:10 a.m.

    This is great work, and I like the implications for reducing teacher workload while allowing students to get more frequent feedback. The student reactions to computer feedback are really interesting. As a way of building on the implementation question above, have you thought about providing explicit guidance to teachers/students on how/when to use feedback and what level of trust to put in it?

     
  • Lei Liu

    Lead Presenter
    Senior Research Scientist
    May 10, 2022 | 11:25 a.m.

    Thanks for visiting our video, Barry! Yes, student data from the cognitive interviews helped us identify quite a few taken-for-granted assumptions about students' knowledge of the NGSS. Regarding implementation, explicit guidance and training will be necessary. Currently we are thinking of focusing on what the AI can tell, what the AI (at least in this tool) cannot tell, what evidence highlighted by the AI can be trusted, and what some potential biases are. More importantly, we are also thinking about how teachers may supplement what the AI cannot do (or do well).

  • Cathryn

    May 10, 2022 | 04:54 p.m.

    This is really cool! As a classroom teacher and researcher, I find the concept of using AI for student feedback really interesting. I'm curious what this might look like in the classroom for teachers who aren't supported by a research team. I'm not very familiar with AI development so I'm curious about the versatility of the tool you've already created and the difficulty of adjusting the tool to be used for other assessments or tasks in the science classroom. Thanks!

     
  • Dante Cisterna

    Co-Presenter
    Research Developer
    May 10, 2022 | 06:15 p.m.

    Thanks for your comment, Cathryn. SPIN-NGSS is a small-scale, proof-of-concept study in which we developed the framework for reasoning patterns, the AI annotation model, and the feedback tool. Certainly, we'd like to pursue further research on classroom implementation of the tool, including exploring its potential and challenges. Regarding the versatility of the tool, the item we selected for the AI-based feedback tool is aligned with a particular NGSS performance expectation for middle school and is part of a performance assessment task. Teachers can, for example, embed the tool (and the task) in their instructional planning. Developing some type of support and teacher PD would be needed as well.

  • Andresse St Rose

    Facilitator
    Director of Educational Research and Evaluation
    May 11, 2022 | 06:39 p.m.

    What interesting work! With the acknowledgement that there are some things AI can and cannot do, and that the team is already thinking about how teachers may supplement what the AI does, I am curious: what has the feedback from teachers in the pilot been?

     

     
  • Dante Cisterna

    Co-Presenter
    Research Developer
    May 12, 2022 | 08:46 a.m.

    Hi, Andresse! We only conducted cognitive interviews with students about the tool for this project. We hope to move to the classroom implementation in a new study and collect feedback from teachers. 

  • Xiaoming Zhai

    Researcher
    May 11, 2022 | 09:06 p.m.

    Good Job, Lei!!!

     
  • Lei Liu

    Lead Presenter
    Senior Research Scientist
    May 12, 2022 | 11:36 a.m.

    Thanks, Xiaoming!

  • Sarah Bichler

    Researcher
    May 13, 2022 | 04:23 p.m.

    Hi all, 

    thanks for sharing your work! I appreciate so much how you integrated students in the process to learn from those who are actually using the learning tool! Their feedback shows, I think, that we often underestimate what students know because our prompts/questions might not elicit everything we hoped they would. I am thinking about being in the classroom and students showing me their explanations, asking for feedback. Often all I said was, "why do you think that?", "what is your evidence?", "what data made you decide it is X?" and they're like "oh, I forgot to include..." and they often added as much as two sentences!

    I have so many questions :), feel free to choose any!

    1. What was your strategy for designing the feedback, what kind of feedback did you provide? 

    2. How was the feedback different for students whose response fell into pattern 1 vs 2?

    3. What feedback did students whose response fell into pattern 3 get? What revisions did you see in pattern 3?

    4. What types of revisions did you see? Mostly adding a new idea (data or science idea) or did students "rework" their entire response?

    Looking forward to your responses!

    Sarah

     
  • Lei Liu

    Lead Presenter
    Senior Research Scientist
    May 13, 2022 | 05:40 p.m.

    Great questions, Sarah! I cannot agree more that what students know (and what they don't know) is often underestimated. For example, many students haven't had direct instruction on the three dimensions of the NGSS at all, which required us to revise some terms used in our feedback.

    Also, thanks for sharing some of the feedback examples. It's definitely important to make the feedback relate to the student's response and thinking as much as possible. We may be able to use some of the automated annotation to personalize the feedback based on a student's own response. For example, if a student only talked about a certain concept but didn't cite any data, the feedback could be "What is your evidence for [AI-highlighted concept]?"

    1. What was your strategy for designing the feedback, what kind of feedback did you provide? 

    Our feedback design was based on the student reasoning pattern framework we developed. The feedback focused on the integration of the dimensions of science learning and the depth of students' reasoning. For example, for students who only talked about disciplinary principles, we would provide feedback asking them to use data to support their statements.

    2. How was the feedback different for students whose response fell into pattern 1 vs 2?

    The feedback focuses on the missing dimension: DCI/CCC for pattern 1 vs. SEP for pattern 2.

    3. What feedback did students whose response fell into pattern 3 get? What revisions did you see in pattern 3?

    Pattern 3 responses included multiple dimensions, though not necessarily integrated ones. The feedback would probe the connections between dimensions to strengthen integrated reasoning.

    4. What types of revisions did you see? Mostly adding a new idea (data or science idea) or did students "rework" their entire response?

    Our cog lab data showed that many students made revisions (96%), of which 69% revised their original responses (pattern 1 or 2) into a pattern 3 response. Most students revised based on the feedback to add a new idea addressing the missing dimension.
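    The pattern-to-feedback logic described in these answers could, in principle, be sketched as a simple rule-based layer on top of the AI annotations. The sketch below is purely illustrative: the function name, dictionary structure, and prompt wording are hypothetical and are not the SPIN-NGSS project's actual implementation; it only assumes the pattern definitions given above (pattern 1 missing DCI/CCC, pattern 2 missing SEP, pattern 3 multiple but unintegrated dimensions).

```python
# Hypothetical sketch: map an AI-assigned reasoning pattern to a feedback
# prompt, following the pattern descriptions in this discussion.
# Names and prompt wording are illustrative only, not SPIN-NGSS code.

FEEDBACK_TEMPLATES = {
    # Pattern 1: missing DCI/CCC -> ask for the science idea
    1: "Which science idea or crosscutting concept explains what you observed?",
    # Pattern 2: missing SEP -> ask for evidence/data use
    2: "What data or evidence supports your statement about {concept}?",
    # Pattern 3: dimensions present but not integrated -> probe connections
    3: "How does the data you cited connect to the science idea you mentioned?",
}

def feedback_for(pattern: int, concept: str = "this idea") -> str:
    """Return a feedback prompt for a response classified into a pattern.

    `concept` stands in for an AI-highlighted concept from the student's
    own response, allowing light personalization of the prompt.
    """
    template = FEEDBACK_TEMPLATES.get(pattern)
    if template is None:
        # Fallback for unclassified responses
        return "Tell me more about your reasoning."
    return template.format(concept=concept)
```

    In practice the pattern label and highlighted concept would come from the AI annotation model, and (as noted elsewhere in this thread) a teacher would likely review or supplement such automated prompts.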

    Hope these responses answered your questions to some extent. Thanks again for these great questions!

    Lei

  • Clausell Mathis

    Researcher
    May 16, 2022 | 04:15 p.m.

    Great Presentation!!

     

    This was a very informative video. At the end, it was mentioned that you plan to implement this in classroom settings. Could you give an example of how this would be used in a classroom setting? Meaning, how will this help teachers in real time over the course of an academic year?

     
  • Dante Cisterna

    Co-Presenter
    Research Developer
    May 16, 2022 | 08:47 p.m.

    Thanks, Clausell, for your comment. The item we selected for this study is part of a performance task with several items that can be used for classroom assessment purposes. One potential example of classroom use is working with teachers to embed the performance task in instructional units, so they can align instructional activities and resources with the task. Regarding the automated feedback tool, teachers can potentially use the information provided by a dashboard to group students based on their reasoning patterns and organize a peer-feedback activity in which students revise their initial responses.
