
PAUL HORWITZ

Concord Consortium

Teaching Teamwork

NSF Awards: 1400545

2016 (see original presentation & discussion)

Grades 9-12, Undergraduate

Collaboration is highly valued in the 21st century workplace, but in the classroom it’s often called cheating. When students work together on projects, it is difficult to assess the contribution each student has made. The “Teaching Teamwork” project is measuring how effectively electronics students work in teams. The project addresses the mismatch between the value of teamwork in the modern STEM workplace and the difficulty of teaching students to collaborate while also evaluating them individually.

We log the students’ actions as they work on a simulated circuit consisting of four resistors in series with a voltage source. Using separate computers, linked online, the students work in teams of three. Each student can see and alter one of the resistors, but the fourth resistor, as well as the voltage source, is neither visible nor manipulable by the students. Each student is given the goal of making the voltage drop across her resistor equal to a randomly chosen value. The task is made more difficult by the fact that changes made to any of the resistors affect everyone’s voltage drop, so communication and coordination are critical to good team performance.
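
Concretely, the coupling comes from the series-circuit voltage divider: each student’s drop is the supply voltage times her resistance divided by the total resistance, so every drop depends on every resistor. A minimal sketch of that arithmetic (component values are made up for illustration and are not the project’s actual settings):

```python
# Series voltage divider: every student's drop depends on *all* the resistors.
# Component values are made up for illustration; they are not the project's settings.
E = 12.0                     # supply voltage (hidden from the students)
R_hidden = 100.0             # the fourth, hidden resistor (ohms)
R = [100.0, 100.0, 100.0]    # the three student-controlled resistors (ohms)

def drops(resistors):
    total = R_hidden + sum(resistors)
    return [E * r / total for r in resistors]

print(drops(R))              # [3.0, 3.0, 3.0]
R[1] = 50.0                  # one student halves her resistor...
print(drops(R))              # ...and everyone's drop shifts: [3.43, 1.71, 3.43] (approx.)
```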

Team members can communicate with one another by typing into a chat window. All student actions—from chats to changes in the circuit, measurements, and calculations—are analyzed and used to produce reports that enable instructors to measure not only the performance of a team as a whole, but also the contribution of individual team members.
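
For concreteness, the event log behind such reports might look something like the sketch below. The field names and event types are assumptions for illustration, not the project’s actual logging format.

```python
# Hypothetical event-log records for the kinds of actions described above
# (chat, circuit changes, measurements, calculations). Field names are assumptions.
from dataclasses import dataclass, field
from time import time

@dataclass
class Event:
    student: str    # who acted
    kind: str       # "chat" | "set_resistor" | "measure" | "calculate"
    payload: dict   # event-specific details
    t: float = field(default_factory=time)

log: list[Event] = []
log.append(Event("A", "chat", {"text": "my drop is too high; lowering R1"}))
log.append(Event("A", "set_resistor", {"resistor": "R1", "ohms": 470}))
log.append(Event("B", "measure", {"probe": "R2", "volts": 2.7}))
```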

Discussion from the NSF 2016 STEM For All Video Showcase (16 posts)
  • Paul Horwitz

    Lead Presenter
    Senior Scientist
    May 16, 2016 | 05:04 p.m.

    Hi! My name is Paul Horwitz and I’m the Principal Investigator on the Teaching Teamwork project, funded by the Advanced Technological Education Program at the National Science Foundation. I hope you’ll take the time to view our little video and then use this chat stream to give us your reactions to it. We’re tackling an important but complex problem and I’m anxious to get your feedback.

    Here’s the problem: in many lines of work, particularly in science and engineering jobs, it is increasingly important to be able to function effectively in teams, often with collaborators who may be separated by continents and time zones. But when we try to teach this kind of teamwork we run into the frustrating fact that in school it is often impossible to distinguish between collaboration and cheating! As a result, our students encounter precious few opportunities to work with others in the course of their training.

    The computer offers a possible solution: if we use a simulation to present a problem to students, then we can track their actions, and use that data to evaluate the relative contribution of each member of the team.

    The devil is in the details, of course. We need to come up with algorithms for data analysis, based on a thorough understanding of human cognition and the development of social skills, and then use these to automate the generation of reports that can guide instruction and support the independent assessment of both content knowledge and collaborative problem-solving skills.
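
    As a very rough sketch of what such report generation might start from (the project’s actual algorithms are not described here, and real measures would surely be more sophisticated than raw counts), one could tally each student’s logged actions into separate content and communication indicators:

```python
# Toy per-student tallies from a list of logged events ({"student": ..., "kind": ...}).
# A crude stand-in for the kind of automated report discussed above, not the real thing.
from collections import defaultdict

def summarize(events):
    report = defaultdict(lambda: {"content_actions": 0, "chat_messages": 0, "circuit_changes": 0})
    for e in events:
        who, kind = e["student"], e["kind"]
        if kind in ("measure", "calculate"):
            report[who]["content_actions"] += 1    # rough content-knowledge proxy
        elif kind == "chat":
            report[who]["chat_messages"] += 1      # rough communication proxy
        elif kind == "set_resistor":
            report[who]["circuit_changes"] += 1
    return dict(report)

events = [
    {"student": "A", "kind": "chat"},
    {"student": "A", "kind": "set_resistor"},
    {"student": "B", "kind": "calculate"},
]
print(summarize(events))
```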

    It’s a tall order, but we’re making progress and we’d love to share that with you!

  • Victor van den Bergh

    Facilitator
    Project Manager
    May 16, 2016 | 09:19 p.m.

    Dear Paul and your team,
    This is an interesting project that tackles an incredibly important, complex, and difficult problem. I have done work seeking to measure collaborative behaviors amongst young students working on a project using video recordings and can appreciate the challenges of identifying these types of behaviors (not to mention analyzing those measurements and applying any insights gained to the efficacy of the larger team). I am curious to know, what age ranges are targeted by your project? Also, do you analyze the text data that you gather? Thank you very much for sharing this work!

  • Paul Horwitz

    Lead Presenter
    Senior Scientist
    May 16, 2016 | 09:29 p.m.

    We are working in one technical high school, one two-year community college, and one four-year university. The subjects range in age from 16 to 40 or so, but mostly in their 20s. In addition to the performance data generated as the students attempt to solve the problem, we also give the students a survey and a photo quiz (the “Mind in the Eyes” test). We also solicit information about the students from the instructors and give the students a post-test designed to elicit their reactions to the activity and to provide an independent assessment of their content knowledge.

  • Karen Purcell

    Facilitator
    Project Director
    May 17, 2016 | 03:32 p.m.

    Dear Paul and team,
    Wonderful idea. Thanks for an engaging and informative video.
    Can you expand a bit on how the project will broaden participation in STEM? Do you feel that promoting collaboration is the key to doing this?
    Finally, what are some of the challenges you’ve encountered? Have you found, for instance, that the design of the project may inadvertently create a feeling of competition among the collaborators?

  • Paul Horwitz

    Lead Presenter
    Senior Scientist
    May 17, 2016 | 05:19 p.m.

    We can’t be sure, of course, that our intervention will broaden participation in STEM, nor are we set up to observe such an effect if it exists. Nevertheless, the students who participate in our problem-solving activity seem to enjoy it, and to get better at it, and they are thus more likely to feel competent and able to master STEM content. And such self-efficacy has been shown to increase the likelihood of participation in STEM.

    As for the challenges we have encountered, the major one so far has been our belated recognition that success at the collaborative task we have set need not correlate with good teamwork, or even with content knowledge and understanding! It is possible, in other words (and we have observed this), to achieve the goals we set while neither cooperating with the other members of the team nor applying Ohm’s Law or an understanding of the circuit being manipulated! The interesting thing, however, is that simply by looking at the log data each team produces we can quite easily distinguish the teams that work well together from those whose members remain isolated from one another, even though, measured by their success at the task, the two kinds of teams appear identical.
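
    The thread doesn’t spell out which log features do the discriminating; as one hedged illustration of the kind of indicator that could separate coordinated teams from isolated ones, consider the fraction of a team’s resistor changes that follow a teammate’s chat message within a short window:

```python
# One illustrative coordination indicator (not the project's actual measure):
# the fraction of resistor changes made within `window` seconds of a teammate's chat.
def responsiveness(events, window=30.0):
    changes = [e for e in events if e["kind"] == "set_resistor"]
    chats = [e for e in events if e["kind"] == "chat"]
    if not changes:
        return 0.0
    def prompted(change):
        return any(chat["student"] != change["student"]
                   and 0.0 <= change["t"] - chat["t"] <= window
                   for chat in chats)
    return sum(prompted(c) for c in changes) / len(changes)
```

    On a measure like this, an isolated team would score near zero even if it happened to hit every target voltage.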

  • Karen Purcell

    Facilitator
    Project Director
    May 19, 2016 | 12:40 p.m.

    Fascinating! Thanks so much for sharing.

  • Brett Slezak

    Health and Physical Education Teacher
    May 18, 2016 | 01:07 p.m.

    Paul, I think what you are doing to try to measure collaboration is fantastic. As a teacher, I know I would love to be able to assess students on this to better help cultivate collaboration in my classes. This might be a pretty lofty question for me to ask, but can you see possibilities in the future for how your model could be applied across different curricula?

  • Paul Horwitz

    Lead Presenter
    Senior Scientist
    May 18, 2016 | 04:42 p.m.

    Hi Brett!

    That’s the $1,000,000 question, of course, and our eventual goal. To err on the side of caution, I’ll say only that I think it’s quite likely we will learn from the Teaching Teamwork project, and others like it, how to analyze the actions of people who are trying to collaborate and, even more important, how to structure collaborative activities so as to elicit actions and produce data upon which to make valid inferences about the participants’ ability to work productively in teams. For instance, as I pointed out in an earlier response in this thread, the first problem we posed turned out to have an unexpected, efficient solution algorithm that requires neither content knowledge nor the ability to work in teams! That emphasizes the importance of planning for – and guarding against! – students “gaming the system”: employing gamesmanship (definition: “the art of winning without actually cheating”) rather than the target skills and abilities.
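
    The shortcut the students found isn’t described in this thread, but it is easy to see how one could exist: in a series circuit each drop rises monotonically with its own resistor, so a student can simply nudge her own resistor toward her target without talking to anyone or invoking Ohm’s Law. A toy simulation of that kind of “no-theory, no-talk” strategy (made-up values; an illustration of the sort of shortcut being guarded against, not necessarily the one observed):

```python
# Toy demonstration that blind "nudge toward your own target" play can hit all the goals
# with no communication and no Ohm's-Law reasoning. Values are made up for illustration.
E, R_hidden = 12.0, 100.0
R = [100.0, 100.0, 100.0]        # student-controlled resistors (ohms)
targets = [3.0, 2.0, 4.0]        # per-student target drops (sum < E, so jointly feasible)

def drops(resistors):
    total = R_hidden + sum(resistors)
    return [E * r / total for r in resistors]

for _ in range(5000):            # each simulated "student" looks only at her own meter
    V = drops(R)
    for i in range(3):
        if V[i] < targets[i] - 0.01:
            R[i] *= 1.005        # my drop is too low: raise my resistor slightly
        elif V[i] > targets[i] + 0.01:
            R[i] *= 0.995        # my drop is too high: lower it slightly

print([round(v, 1) for v in drops(R)])   # settles near the targets: [3.0, 2.0, 4.0]
```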

  • Elliot Soloway

    Arthur F. Thurnau Professor
    May 19, 2016 | 12:28 a.m.

    full disclosure: I am a “collaboration” believer!
    Moving right along… I think NSF needs to focus a program on “social learning” and Horwitz’s project is an excellent example of the kind of focused, careful research that is needed…we learn from and we learn with others… in the enterprise, social learning is paramount.. in schools… it is also, but it isn’t recognized… I hope your video makes it into the finals, Paul — that will help highlight not only your work, but the need for MORE of this type of work. Onward!!

  • Paul Horwitz

    Lead Presenter
    Senior Scientist
    May 19, 2016 | 02:43 p.m.

    Thanks for the support, Elliot! It’s my (possibly biased!) impression that collaboration is on its way to becoming the “next big thing” – or one of them, anyway – in education research. We’re kinda hoping that this project will help to speed its arrival.

    Onward, indeed!…

  • Joseph Wilson

    Facilitator
    Managing Director
    May 19, 2016 | 08:14 p.m.

    #teamPaul – I love the experience that students get of working “remotely” yet collaboratively. As someone who has teammates and direct reports in all four continental US time zones, I know how important collaboration is. What other types of tasks or activities (beyond resistance measurement) do you think would give you additional insight into measuring collaboration?

  • Paul Horwitz

    Lead Presenter
    Senior Scientist
    May 20, 2016 | 11:07 a.m.

    We’re working on several – all, for the moment, in the general area of electronics (since that was the focus of the original proposal). One activity that’s about to “go public” involves the use of PIC microcontrollers, the task being to connect a numeric keypad to a seven-digit LED display. Another area we’re investigating involves hooking up logic chips – specifically NOT, AND, and OR gates – to form a circuit with a specific purpose (e.g., an adder). But of course, teamwork (and especially remote teamwork) is an important skill in many areas, and our long-term goals include exploring the applicability of what we are learning on this project to additional STEM domains.
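
    For readers less familiar with the gate-level task: an adder of the sort mentioned can indeed be built from NOT, AND, and OR alone. A one-bit half adder, as a quick illustration (not taken from the project’s actual activity materials):

```python
# One-bit half adder from NOT/AND/OR only, illustrating the kind of circuit mentioned above
# (not taken from the project's activity materials).
def NOT(a): return 1 - a
def AND(a, b): return a & b
def OR(a, b): return a | b

def half_adder(a, b):
    carry = AND(a, b)
    total = AND(OR(a, b), NOT(AND(a, b)))   # XOR built from OR, AND, NOT
    return total, carry

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))  # (sum, carry): 0,0->(0,0) ... 1,1->(0,1)
```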

  • Jill Denner

    Senior Research Scientist
    May 20, 2016 | 01:48 p.m.

    This is a great project, and I agree that fostering effective collaboration is key to broadening participation. But I have found in my research on pair programming that the selection of the pairs has a large influence on how they work together—a mismatch can actually undermine learning and motivation. Do you use any system to form the teams, or do they self-select into groups?

  • Paul Horwitz

    Lead Presenter
    Senior Scientist
    May 20, 2016 | 02:10 p.m.

    We’re aware, of course, of the importance of how the teams are formed, but for the time being we’ve been using random assignment, with an emphasis on evaluating how well the teams work together rather than trying, through careful team formation, to maximize the teams’ performance. A next step in our project, not mentioned in the video, will be to compare both team and individual performance with exogenous measures such as the team members’ scores on the Mind in the Eyes test. Before we can embark on such a study, however, we will need five to ten times as much data as we currently have.

  • Jill Denner

    Senior Research Scientist
    May 20, 2016 | 02:13 p.m.

    That makes sense. I expect that what you learn from the study will help us understand how to form effective teams.

  • Paul Horwitz

    Lead Presenter
    Senior Scientist
    May 20, 2016 | 02:18 p.m.

    That’s certainly the goal. It remains to be seen how far we will get. Frankly, I suspect that for the foreseeable future an experienced teacher is likely to be better at picking out “good” teams than any testing procedure we can come up with. But you never know…

  • Further posting is closed as the event has ended.