Challenges of Developing the NAEP Technology and Engineering Literacy Framework

There were a number of challenges in developing the 2014 NAEP Technology and Engineering Literacy Framework that were not necessarily encountered in developing other NAEP frameworks. These included: (1) the newness of the endeavor, (2) a diffuse curriculum, (3) varying definitions, (4) measurement constraints, (5) time and resource constraints, (6) designing an entirely computer-based assessment, and (7) predicting future changes in technology. Each of these challenges is discussed below.

Newness of the Endeavor

Technology and engineering literacy is a growing and evolving area. Unlike other NAEP subjects, such as reading or mathematics, there is no existing NAEP framework to draw on. Moreover, the existing item banks in the United States and other countries are very limited (NAE, 2006). The technology and engineering literacy staff and committee members obtained only a limited number of sample items from outside sources, reflecting the immature state of assessing technology and engineering literacy.

Diffuse Curriculum

Unlike science and mathematics, which have a sequential curriculum taught by subject-area specialists in high school or by generalists in elementary school, technology and engineering education as a whole does not have a unified scope and sequence. Some individual courses (for example, science, technology, and society; pre-engineering; and computer modeling) are likely to follow state standards and have a specified curriculum with a scope and sequence, but these individual courses are generally not grouped together under the rubric of technology courses. When they are, the courses under such a heading may vary from place to place. Information and communication technology (ICT) has also been integrated into the curriculum in a variety of ways. While ICT learning is often infused into existing core subjects, it is not always assessed and reported as part of these subjects. In addition, there is not a clear scope and sequence for ICT knowledge and skills, either as a stand-alone curriculum or integrated into core subjects, which may result in an inconsistent application of technology literacy standards across grades, subjects, and states. As mentioned earlier, all teachers have a role in teaching technology, so in most cases teachers are not singled out as technology teachers in the same way that, for example, mathematics or history teachers are identified with those subject areas. The result (and the implication for this assessment) is that the specific technology concepts and practices to which students have been exposed are hit and miss and mostly unknown. Students can say what mathematics or science courses they have taken, but specifying the range of their education in and use of technology and engineering is more ambiguous.

Varying Definitions

One of the most debated issues in developing the framework was the definition of technology and engineering literacy, as different definitions abound. Indeed, even the terms are not agreed upon, as some organizations refer to "technological literacy" while others refer to "technology literacy" or "engineering literacy." In this report, for consistency, the term "technology and engineering literacy" is used throughout, while recognizing that this terminology differs from what is used by some groups. ITEEA, the National Research Council, and ISTE each have definitions of technology and engineering literacy. Meanwhile, the federal No Child Left Behind Act of 2001 (NCLB) required that every student be "technologically literate by the time the student finishes the eighth grade," but the law itself is vague about what technological literacy is. States have therefore had flexibility in determining what technology and engineering literacy means and how it should be assessed.

Many states have adopted a common definition worked out by SETDA in 2002, which states, "Technology literacy is the ability to responsibly use appropriate technology to communicate, solve problems, and access, manage, integrate, evaluate, and create information to improve learning in all subject areas and to acquire lifelong knowledge and skills in the 21st century." The federal American Recovery and Reinvestment Act of 2009 adds real-world consequences to this shared definition by providing grants to state and local agencies and schools based on their ability to meet goals defined by long-range educational technology plans, most of which include this definition.

The NAEP Technology and Engineering Literacy Framework attempts to unify the concepts and skills presented in these and other definitions under one umbrella definition. The definition presented earlier in this chapter is intended only as a means of understanding the results of the NAEP assessment in technology and engineering literacy. While this framework and subsequent NAEP results may be informative to education administrators, policymakers, industry and business leaders, and the general public, the definition of technology and engineering literacy presented here should not be used to interpret results from other assessments used at state and local levels. To further distinguish the 2014 NAEP Technology and Engineering Literacy Assessment from other technology and engineering literacy assessments, the project committees have recommended that the results be reported as three individual scores, each reflecting performance in one of the three main areas of technology and engineering literacy: Technology and Society, Design and Systems, and Information and Communication Technology. An overall composite score will also be reported.

Measurement Constraints

NAEP, like any large-scale assessment in education, the workplace, or clinical practice, is constrained in what it can measure. This has implications for the proper interpretation of NAEP Technology and Engineering Literacy Assessment results. The framework is an assessment framework, not a curriculum framework. Although the two are clearly related, each has a different purpose and a different set of underlying assumptions. A curriculum framework is designed to inform instruction, to guide what is taught, and, often, to guide how it is taught. It represents a wide universe of learning outcomes from which educators pick and choose what and how they teach. An assessment framework is a subset of the achievement universe from which assessment developers must choose to develop sets of items that can be assessed within time and resource constraints. Hence, the content to be assessed by NAEP has been identified as that considered central to technology and engineering literacy.

As a result, some important outcomes of technology and engineering literacy (broadly defined) that are valued by general educators, engineers, teachers of technology, and the business community but that are difficult and time-consuming to measure—such as habits of mind, sustained projects, and collaboration—will be only partially represented in the framework and on the NAEP Technology and Engineering Literacy Assessment. Moreover, the wide range of technology and engineering standards in the guiding national documents that were incorporated into the framework had to be reduced in number to allow some in-depth probing of fundamental knowledge and skills. As a result, the framework and the specifications represent a distillation rather than a complete representation of the original universe of achievement outcomes specified by technology and engineering education documents.

Time and Resource Constraints

Time and resources limit what NAEP can assess. Like most standardized assessments, NAEP is an "on demand" assessment. That is, it is given as a scheduled event outside the normal classroom routine, with uniform conditions for all of the students being assessed. In particular, NAEP has a limited amount of time—in this case, approximately one hour per student—to ascertain what students know and can do. However, standards presented by professional associations and the states contain goals that require an extended amount of time (days, weeks, or months) to assess. To assess student achievement in the kinds of extended activities that are a central feature of these other standards and of many curricula, it would be necessary to know a number of things about the students, including their:

  • Reasoning while framing their goals;
  • Planning for projects and the implementation of the plan;
  • Skills in using technologies to gather, manage, and analyze data and information related to project goals;
  • Capabilities to meet unpredictable challenges that arise during actual, ongoing problem solving and achievement of goals;
  • Lines of argument in deciding how to alter their approaches in the light of new evidence;
  • Engagement with peers and experts in addressing goals and deciding how to achieve them; and
  • Deliberations and reasoning when evaluating progress, trade-offs, and results.

NAEP, like other on-demand assessments, then, cannot be used to draw conclusions about student achievement with respect to the full range of goals of technology education, broadly defined. States, districts, schools, and teachers can supplement NAEP and other standardized assessments to assess the full range of education standards that address technology and engineering literacy. In addition to describing the content and format of an examination, assessment frameworks such as this one signal to the public and to teachers which core elements of a subject are important.

Designing a Computer-Based Assessment

Although some NAEP assessments (the 2009 Science Assessment, for example) have called for interactive computer tasks, so far only the NAEP Writing Assessment has been totally computer-based. The design challenges of creating such an assessment include:

  • Developing the requisite number of tasks and items (test questions), especially since so few tasks and items exist that can serve as samples.
  • Constructing tasks and items that provide whatever prior knowledge is required to answer the question. Since so many contexts are available in which to set items, developers cannot assume that students will have prior knowledge of the specific topics (for example, core subjects such as the humanities or mathematics) or technologies (for example, transportation, health, or electronics) within the context. Items must not require students to have prior knowledge of specific technologies, and the knowledge required about particular technologies must be presented in the item.
  • Determining the features and functions of the computer tools students will use.
  • Determining what aspects of student responses to an item need to be assessed. Are the attempts a student makes while trying out a design or using a simulation important to capture? What about the pathway the student follows or the number of mistakes made before getting a correct answer? Rather than a single question and answer, an item might have several components that are assessed.

In addition to the issues above, there will also be administrative challenges, such as determining which computers students will use to complete the assessment, handling students' differing levels of access to computer technology, and planning contingencies for equipment malfunctions. The framework designers were aware of these factors when developing the framework, but they focused on the design factors, leaving the challenge of determining how best to administer the NAEP Technology and Engineering Literacy Assessment to those involved in the assessment development phase.

Predicting Future Changes in Technology

The framework attempts to strike a balance between what can reasonably be predicted about future technology and engineering literacy education and what students are likely to encounter in their curriculum and instruction now and over the next decade. For example, specific communication technologies in use today (Internet-connected multimedia smartphones and personal digital assistants [PDAs]) would not have been familiar to students a decade ago and may well be obsolete a decade from now.

The framework is intended to be both forward-looking (in terms of what technology content and usage will be of central importance in the future) and reflective (in terms of current technology). Because it is impossible to predict with certainty the shape of educational technology and technology education, the choices made for 2014 should be revisited in response to future developments.

It is a significant challenge to write a framework for the future, and the challenge is especially great for the subject of technology and engineering literacy.