
NEWS RELEASE

The Nation’s Report Card Releases Results from Innovative Science Assessment

Students Conduct Experiments with Confidence but Struggle to Explain Results


WASHINGTON—The National Assessment of Educational Progress (NAEP) is leading the way in measuring how well students apply their understanding of science in real-life contexts. The Nation's Report Card Science in Action: Hands-On and Interactive Computer Tasks from the 2009 Science Assessment marks the first time that both types of tasks were included as part of the NAEP science assessment.

Today's results reveal that America's fourth, eighth, and 12th graders can conduct science investigations using limited data sets, but many students lack the ability to explain results. The report shows that students were challenged by parts of investigations requiring more variables to manipulate, strategic decision-making in collecting data, and the explanation of why a certain result was the correct conclusion.

The new interactive computer tasks and updated hands-on tasks, which involve more open-ended scenarios, were administered as part of the 2009 science assessment by the National Center for Education Statistics to a nationally representative sample of more than 2,000 students in each of grades 4, 8, and 12. The findings provide important insights for educators and policymakers who are looking for academic approaches that support careers in science, technology, engineering, and mathematics (STEM) fields, and encourage scientific inquiry.

"Science is fundamental to education because it is through scientific inquiry that students understand how to solve problems and ultimately how to learn," said David Driscoll, chairman of the National Assessment Governing Board, which sets policy for NAEP. "So it's tragic that our students are only grasping the basics and not doing the higher-level analysis and providing written explanations needed to succeed in higher education and compete in a global economy."

Hands-on and interactive computer tasks are used in testing to determine whether students can solve problems as a scientist would. Hands-on tasks require students to perform actual science experiments, while interactive computer tasks require students to solve scientific problems in a computer-based environment, often by simulating a natural or laboratory setting.

"This innovative format allows for a richer analysis than a paper-and-pencil test," Driscoll said. "Interactive computer tasks allow us to more deeply examine students' abilities to solve problems because the tasks generate much more data."

Only 53 percent of 12th graders reported that they were enrolled in a science course, and only 28 percent reported writing a report on a science project at least once a week. Ninety-two percent of fourth graders and 98 percent of eighth graders had teachers who reported doing hands-on science activities with students at least monthly. Thirty-nine percent of fourth graders and 57 percent of eighth graders had teachers who reported having at least a moderate emphasis on developing scientific writing skills.

The assessment measures science skills in a number of ways. Some questions use a model known as "predict-observe-explain" to examine students' ability to combine their science knowledge with real-world investigative skills.

To correctly predict, students had to provide an accurate description of what might happen in a situation. For instance, when asked what kind of sunlight conditions were needed for a sun-loving plant and a shade-tolerant plant, 59 percent of fourth graders showed understanding that different plants have different sunlight needs.

Through the observe phase, students watched what happened as they conducted their experiments. Eighty percent of fourth graders made straightforward observations and tested how fertilizer and sunlight affected plant growth, but only 35 percent could perform a higher-level task that required them to make decisions about the best fertilizer levels for a sun-loving plant.

Students were then asked to explain what they had observed by interpreting data or drawing conclusions. Across all grade levels, a majority of students could observe, but far fewer could predict or explain. In fourth grade, fewer than 50 percent of students could explain why they selected a given fertilizer amount to support plant growth and use evidence to support their answer. At grade 8, 88 percent of students could correctly identify which liquid flowed at the same rate as water at a given temperature, while only 54 percent could support this answer with a written explanation of the evidence.

At twelfth grade, 64 percent of students could recommend the site for a new town based on information provided about water quality, while 75 percent of students could perform a straightforward investigation to test the water samples and accurately tabulate data. But only 11 percent were able to provide a valid recommendation and support their conclusions with details from the data.

More highlights from Science in Action include:

Overall achievement gaps

  • There are gaps in average scores for all tasks between students from low-income families (those eligible for free and reduced-price lunch) and those from higher-income families.
  • There are gaps by race/ethnicity. At all grade levels, white and Asian/Pacific Islander students outscored their black and Hispanic peers.
  • At grades 4 and 12, Hispanic students scored higher than their black peers on interactive computer tasks and hands-on tasks.
  • Female students outscored males on the hands-on tasks, but males scored higher on the traditional paper-and-pencil assessment. There was no gender gap for interactive computer tasks.

Grade 4

  • Seventy-one percent of students could correctly select how volume changes when ice melts into water, but only 15 percent could support this conclusion with evidence from the investigation.
  • Overall, students earned about 42 percent of the total points available from the questions they attempted on the interactive computer tasks.
  • Overall, students earned about 47 percent of the total points available from the questions they attempted on the hands-on tasks.

Grade 8

  • Eighty-four percent of eighth graders could correctly test how much water flowed to different soil samples during a simulated laboratory test.
  • Overall, students earned about 41 percent of the total points available from the questions they attempted on the interactive computer tasks.
  • Overall, students earned about 44 percent of the total points available from the questions they attempted on the hands-on tasks.

Grade 12

  • Fifty-five percent of students could select the correct temperature changes occurring when a warm solid is placed in cool water, but only 27 percent were able to explain how heat was transferred from a warmer to a cooler substance.
  • Overall, students earned about 27 percent of the total points available from the questions they attempted on the interactive computer tasks.
  • Overall, students earned about 40 percent of the total points available from the questions they attempted on the hands-on tasks.

Science in Action: Hands-On and Interactive Computer Tasks from the 2009 Science Assessment is available at www.nationsreportcard.gov. Visit http://www.nagb.org/science-hots-icts/ for more information and materials on recent results. Dive deeper into the tasks by visiting the NAEP interactive website at http://nationsreportcard.gov/science_2009/.



CONTACT:
Stephaan Harris
(202) 357-7504
Stephaan.Harris@ed.gov



# # #

The National Assessment of Educational Progress is the only nationally representative, continuing evaluation of the condition of education in the United States. It has served as a national yardstick of student achievement since 1969. Through the Nation's Report Card, NAEP informs the public about what American students know and can do in various subject areas and compares achievement between states, large urban districts, and various student demographic groups.

The National Assessment Governing Board is an independent, bipartisan board whose members include governors, state legislators, local and state school officials, educators, business representatives and members of the general public. Congress created the 26-member Governing Board in 1988 to oversee and set policy for NAEP.

The National Assessment of Educational Progress (NAEP) is a congressionally authorized project sponsored by the U.S. Department of Education. The National Center for Education Statistics, within the Institute of Education Sciences, administers NAEP. The Commissioner of Education Statistics is responsible by law for carrying out the NAEP project.
