A Rationale for Connected Classrooms

 

© 2016 Dr. Gary L. Ackerman

Even though the third decade of the 21st century is quickly approaching and information technology (IT) is deeply embedded in the lives of our students, many educators are still reluctant to find a role for computers (in all of their variations), digital information, and social media in their classrooms. In this post, I argue there is a place for IT in every classroom, and my argument is based on four “E’s.”

Connected classrooms (Ito et al., 2013) have replaced classrooms in which technology is integrated as the best practice for creating technology-rich learning. In connected classrooms, technology infrastructure is leveraged to facilitate interest-driven and socially motivated studies. In this post I argue educators can positively affect efficiency, effectiveness, efficacy, and equity through connected classrooms.

Why should I create a connected classroom? Efficiency, Effectiveness, Efficacy, Equity

Efficiency

For generations of students, simple declarative and procedural knowledge was the focus of curriculum and instruction. While some educators have minimized the importance of this type of knowledge, there is still a place for it in classrooms. (Elsewhere I have argued that “exercises” should make up a minority of classroom work and be done in parallel with “authentic activities.”)

Through technology-mediated instruction (videos, worked examples, games, and other tools), these lessons can be presented with greater efficiency than they can be in traditional teacher-delivered lessons. The efficiency results from the precision with which content can be delivered (only those who need instruction get it) and the just-in-time or just-when-needed nature of this instruction (students can access it any time and as many times as they need it).

In many ways this displaces the teacher as the dispenser of information in the classroom. The advantage, of course, is that teachers can collaborate on identifying, vetting, and curating the collection of technology tools that provide efficient instruction.

Effectiveness

A recurring theme in the literature on workforce development is that organizations are faced with unpredictable situations, and they need workers who can adapt to these changing circumstances. While generations of educators have sought to give students experience applying what they learn to new situations (the dreaded word problems in math are the most common example), this work takes on renewed importance in the 21st century.

IT allows teachers to introduce scaffolding, social learning, and other active learning methods so that students find greater connection and relevance to the ideas they study. Through these strategies, the lessons are more effectively learned, so students are more likely to be aware of what they know as well as how and when to use it to solve new problems.

Efficacy

The purpose of school is to prepare students to participate fully in a culture’s economic, political, and social life. This is education’s strategic goal, and efficacy is the degree to which an organization achieves its strategic goals.

Today, culture is dominated by digital information and technologies. Giving students experience creating knowledge, evaluating the knowledge created by others, and finding new uses for IT and new types of knowledge are all aspects of the information technology-rich landscape that were unfamiliar in the print-based information landscape for which much pedagogy was designed.

Equity

It is an unfortunate reality that there remains a digital divide in the United States; disadvantaged students have less access to technology tools, and even when they do have access, the tools are more likely to be used for efficient instruction in procedural and declarative knowledge rather than for more effective or efficacious purposes.

While efficient instruction may be a reasonable first step in creating connected classrooms, school and technology leaders must take steps to ensure progress continues so that all students gain access to curriculum focused on increasingly sophisticated and complex problems—problems they identify as relevant—in their schools.

 

References

Ito, M., Gutiérrez, K., Livingstone, S., Penuel, B., Rhodes, J., Salen, K., Schor, J., Sefton-Green, J., & Watkins, S. C. (2013). Connected learning: An agenda for research and design. Irvine, CA: Digital Media and Learning Research Hub. http://dmlhub.net/wp-content/uploads/files/Connected_Learning_report.pdf

 

Being Data-Driven is Nothing to Brag About

© 2016 Dr. Gary L. Ackerman

“Data-driven” has been the mantra of educators for the last generation. This mantra captures the practice of using students’ performance on tests to make instructional decisions. This model can be criticized for several reasons including the dubious reliability and validity of tests, the lack of control over variables, and incomplete and inappropriate analysis. My purpose here, however, is to criticize the “what works” focus that accompanies “data-driven decisions.”

Ostensibly, educators adopt a data-driven stance to create a sense of objectivity; they can reason, “I am taking these actions because ‘it works’ to improve achievement.” The problem with this approach is that identifying “what works” is a superficial endeavor, and it can be used in only very limited circumstances.

While designing physical systems, engineers can apply “what works” methods to improve their systems. Engineers can conceive and plan, build and test, then deploy their systems. At any point in the process they can change definitions of what “it works” means or abandon the project if “it works, but is too expensive” (or if other insurmountable problems arise). Ascertaining “what works” in educational settings is a far less controlled situation. Those who have tried to use others’ lesson plans and found the results disappointing have first-hand experience with this effect.

Understanding “Data-Driven” As a Scientific Endeavor

Humans have created two activities that are data-driven. In basic science, we use data to organize and understand nature so that we can support theories that allow us to predict and explain observations. In applied science, we gather data to understand how well our systems function.

Refining systems with data to build “what works” is the approach used by technologists who work in applied science. Vannevar Bush, a science advisor to President Franklin Roosevelt during and after World War II, placed basic and applied research at opposite ends of a continuum. Basic science was undertaken to make discoveries about the world, and applied science was undertaken to use and control those discoveries to develop tools useful to humans.

If we place data-driven education along this continuum, it must be considered an applied science, as it is undertaken to build systems to instruct children. As it is typically undertaken, there is little attempt to understand why or how “it works,” as answering those questions is in the domain of basic science.

Figure 1. Continuum of basic to applied science as proposed by Vannevar Bush

This is a very dissatisfying situation for educators (both those who claim to be data-driven and those who make no such claim). Fortunately, we can reconcile that dissatisfaction by recognizing that the basic to applied science continuum does not accurately describe the landscape of education.

Use-Based Research

In 1997, Donald Stokes, a professor of public policy at Princeton University, suggested that the understanding basic researchers seek and the use applied researchers seek are different dimensions of the same endeavor, so research is not simply either basic or applied. According to Stokes, the continuum of science should be replaced with the matrix shown in figure 2.

The matrix created by placing the question “Do the researchers seek to develop or refine systems?” along the x-axis and “Do the researchers seek to make new discoveries?” along the y-axis creates four categories into which one can place a science-like activity:

  • Pure research is Bush’s basic research, and it is undertaken to satisfy curiosity, so the researchers are not motivated to create useful systems.
  • Technology development is Bush’s applied research, and researchers seek to develop useful systems, but they do not seek to make generalizations beyond those needed to build their systems.
  • Purposeful hobbies are undertaken for entertainment, and hobbyists are not motivated to share the systems they use or to make discoveries.
  • Use-based research is the label applied to endeavors in which the researchers seek to both develop new systems and make discoveries about the work.
Figure 2. Stokes’ matrix of data activities

Stokes used the term “Pasteur’s quadrant” to capture the nature of work in the use-based research quadrant. He reasoned that Pasteur’s work in microbiology had multiple purposes: as Pasteur developed methods of preventing disease (the technologies he developed), he also sought to discover how and why those technologies worked, and thus he established important details of microbial life.

Replacing Data-Driven Decisions

Educators who choose to adopt a more sophisticated approach to using data to drive decisions can adopt use-based research. This will require that they approach data, its collection, and its analysis in a more sophisticated manner. These educators will face more work, but it is more interesting and more efficacious than the data-driven methods I typically observe. Use-based research necessitates that educators:

  • Begin data projects and analysis with a question. The question cannot be “Which instruction is better?” It must be focused and precise: “Did the students who experienced intervention x perform better on y test?” They must also recognize that these questions can only be answered with large cohorts of students and using statistical methods. Further, these answers (like all answers supported with data) cannot be known with certainty.
  • Seek a theory to explain the results they find in the answers to their questions. While the “data-driven educator” may be satisfied with knowing “what worked,” the educator who uses data as a use-based researcher will seek to elucidate reasons and mechanisms, a theory, for “what worked.” This will leave them better prepared to develop and refine interventions for other settings and cohorts of students. This theory will also allow them to predict other observations that will confirm their theory.
  • Based on their predictions, seek other evidence to support their theories. This evidence cannot be the same measurements. If, for example, we accept the dubious conclusion that SBAC (or PARCC) tests measure college and career readiness, then we should be able to devise other measures of career and college readiness, and the instruction that affects those test scores should be observed in other ways as well.
  • Become more critical of the measures they use (including those they are mandated to use) and better understand that we must be active consumers and evaluators of the data collected about our students and the methods used to analyze it.
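To make the first point above concrete, a focused question such as “Did the students who experienced intervention x perform better on y test?” can be examined with simple descriptive statistics. The sketch below uses entirely hypothetical scores for two cohorts and computes Cohen’s d, a standardized effect size; it is an illustration only, and in practice such comparisons require much larger cohorts and a significance test alongside the effect size.

```python
import math
import statistics

def cohens_d(group_a, group_b):
    """Standardized difference between two group means (pooled SD)."""
    pooled_sd = math.sqrt(
        (statistics.variance(group_a) + statistics.variance(group_b)) / 2
    )
    return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_sd

# Hypothetical test scores for two cohorts: intervention x vs. comparison
intervention = [78, 85, 82, 90, 74, 88, 81, 79, 86, 83]
comparison   = [72, 80, 75, 78, 70, 82, 74, 76, 79, 73]

print(f"Mean (intervention): {statistics.mean(intervention):.1f}")
print(f"Mean (comparison):   {statistics.mean(comparison):.1f}")
print(f"Cohen's d: {cohens_d(intervention, comparison):.2f}")
```

Note that the effect size only describes the magnitude of a difference; it says nothing about *why* the difference occurred, which is exactly the theory-seeking work the second point demands.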

Reference

Stokes, D. E. (1997). Pasteur’s quadrant: basic science and technological innovation. Washington, D.C: Brookings Institution Press.
