A common mistake in education reform is to focus on what seems innovative about a new platform, solution, or content, rather than on the learning process used and how results vary based on use. I was reminded of this when reading a blog post by John Merrow that gives Khan Academy as an example of 'blended learning' (bit.ly/Pl58Ji). I am an advocate of Khan Academy; it's a great platform, a game changer, and my kids use it daily as a key part of the blended learning model adopted by their school. However, the platform is not in itself 'blended learning.' Blended learning is the learner-centric process that includes a blend of interactions between the learner, technology platform(s), and the teacher in a classroom setting.
Yes, Khan Academy is an enabling technology platform with a critical mass of content that can support a range of blended learning methodologies. However, it can also support NON-blended, online-only learning. The 'blend' happens when students and educational "coaches" (including classroom teachers) interact with the system and with each other via the system. The details of the blended model used, and even the decisions and actions made by and for students using Khan Academy, can lead to very successful or very disappointing results.
I suspect this tendency to focus on the platform is largely because we don't know what we don't know. Well-established disciplines such as architecture and software engineering have matured to the point where they recognize "design patterns": engineering approaches proven effective for recurring challenges. Unlike design patterns for software, which are implemented as code, design patterns for learning would describe learner experiences, which may or may not involve technology. We don't yet have a catalog of "design patterns" for learning to support emerging innovations.
It would be fair to say that there are well-established instructor-centric 'design patterns' recognized in the field of curriculum development. Moreover, there are pedagogical 'patterns' applied in the field of instruction. There are also higher-level models, like the Innosight Institute's classification of blended learning models. However, the kinds of interactions now available for individualized and blended learning call for a new discipline that looks at interactions on a deeper level, beyond the constraints of the traditional classroom-only, teacher-student, fixed-time interactions.
Learning happens based on the right mix of interactions for each learner, which may include a blend of technology and human interactions. We know from cognitive science what conditions lead to learning, and even what it takes for a person to become an expert in an area of cognitive development. For example, we know that new learning builds on background knowledge and works best when the new concept or skill is within the learner's zone of proximal development. We also know the value of feedback loops, scaffolding, and practice over time.
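To make that concrete, here is a minimal sketch of how "background knowledge" and a zone-of-proximal-development check might be encoded when choosing a learner's next activity. This is my own illustration, not a reference to any particular platform; the threshold, skill names, and mastery estimates are all hypothetical.

    # Hypothetical sketch: pick skills that are not yet mastered but whose
    # background knowledge (prerequisites) already is.
    MASTERY_THRESHOLD = 0.8  # assumed cut-off for "already knows this"

    # Hypothetical prerequisite graph: skill -> skills it builds on.
    prerequisites = {
        "multi_digit_multiplication": ["single_digit_multiplication", "place_value"],
        "fractions_intro": ["multi_digit_multiplication"],
    }

    # Hypothetical mastery estimates for one learner, 0.0 to 1.0.
    mastery = {
        "single_digit_multiplication": 0.95,
        "place_value": 0.88,
        "multi_digit_multiplication": 0.35,
        "fractions_intro": 0.05,
    }

    def in_zone_of_proximal_development(skill: str) -> bool:
        """A skill is a candidate if it is not yet mastered but all of the
        background knowledge it builds on is."""
        not_yet_mastered = mastery.get(skill, 0.0) < MASTERY_THRESHOLD
        background_ready = all(
            mastery.get(p, 0.0) >= MASTERY_THRESHOLD
            for p in prerequisites.get(skill, [])
        )
        return not_yet_mastered and background_ready

    candidates = [s for s in mastery if in_zone_of_proximal_development(s)]
    print(candidates)  # -> ['multi_digit_multiplication']

The interesting engineering questions live outside this toy example: which interactions (human or technology) produce the mastery evidence, and which feedback loops adjust the plan over time.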
The next breakthrough in education may be to establish and mature a professional specialty in "learning interactions engineering," in developing design patterns for learning interactions, and in optimizing the application of those patterns for individual learners.
Monday, September 24, 2012
Friday, July 20, 2012
Common Education Data Standards (CEDS) Version 3 Standardizing Data to Support Formative Assessment Processes
Last week at STATS-DC I was joined by Nancy Burke, Grafton Public School District (North Dakota), Lee Rabbitt, Newport Public Schools (Rhode Island), and David Weinberger, Yonkers Public Schools (New York) for this presentation about Standardizing Data to Support Formative Assessment Process Use in Schools. My three co-presenters are members of the Common Education Data Standards (CEDS) Stakeholders Group and have been an important part of the CEDS version 3 focus on teaching and learning.
One interesting thing about this presentation was that we used PowerPoint as our "sandbox" for the development work, so the presentation content was developed long before we knew it would be a presentation at STATS-DC. We started by developing a model of the human processes, and had that fully defined before working on the data elements needed to support those processes. We avoided the risk of thinking about formative assessment as a thing, a "kind of test," and focused instead on the process of teaching and learning.
Formative assessment is a process by which teachers and students use data to inform:
– where they need to go,
– where they are, and
– how to close the gap.
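As a rough illustration of the kinds of data elements that process implies (this is my own sketch, not the actual CEDS v3 element definitions), the three questions map naturally onto a goal, evidence of current status, and a next step:

    # Illustrative only; type and field names are hypothetical, not CEDS v3 elements.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class LearningGoal:            # "where they need to go"
        goal_id: str
        description: str
        success_criteria: List[str] = field(default_factory=list)

    @dataclass
    class EvidenceOfLearning:      # "where they are"
        goal_id: str
        source: str                # e.g., observation, quick check, exit ticket
        summary: str

    @dataclass
    class NextStep:                # "how to close the gap"
        goal_id: str
        feedback_to_student: str
        planned_adjustment: str    # adjustment to the next teaching/learning activity

The point of standardizing such elements is not the structure itself but that goals, evidence, and next steps can move between the tools teachers and students actually use.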
Formative assessment content specialists Margaret Heritage and Susan M. Brookhart helped us refine the process model based on research findings and promising practices. Organizations like the Innosight Institute helped us check assumptions about the formative data needed to support various blended learning models. We worked with projects such as the Learning Resource Metadata Initiative (LRMI), Open Badges, the Race to the Top Assessment consortia (RTTA), SLC, EdFi, and the Learning Registry (LR) to ensure these implementation-oriented projects will be compatible with the common vocabulary for education data that will be defined in CEDS v3.
See the slides here.
The Bridge Between Data Standards and Learning Standards—Common Core State Standards (CCSS) and Common Education Data Standards (CEDS)
Last week I co-presented at STATS-DC about the work that is being done to bridge gaps between data standards and learning standards. Maureen Wentworth of the Council of Chief State School Officers (CCSSO) and Greg Grossmeier of Creative Commons joined me to discuss the collaboration between many initiatives, including the Common Core State Standards (CCSS) initiative led by CCSSO and the National Governors Association (NGA), the Common Education Data Standards (CEDS) led by IES/NCES under the U.S. Department of Education, and the Learning Resource Metadata Initiative (LRMI) led by the Association of Education Publishers (AEP) and Creative Commons.
The presentation slides are here.
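To give a concrete (and purely hypothetical) flavor of what this bridge looks like, LRMI-style metadata lets a learning resource point at a Common Core standard identifier. The example below is my own sketch, written as a Python dict in JSON-LD shape; the resource name and URL are invented, and only the general shape follows the LRMI/schema.org AlignmentObject vocabulary.

    # Hypothetical LRMI-style description of a resource aligned to a CCSS standard.
    resource_metadata = {
        "@context": "http://schema.org/",
        "@type": "CreativeWork",
        "name": "Intro to Ratios practice set",        # invented resource
        "learningResourceType": "exercise",
        "educationalAlignment": {
            "@type": "AlignmentObject",
            "alignmentType": "teaches",
            "educationalFramework": "Common Core State Standards",
            "targetName": "CCSS.Math.Content.6.RP.A.1",
            "targetUrl": "http://www.corestandards.org/Math/Content/6/RP/A/1/",  # illustrative
        },
    }

With learning standards identified this way and education data elements defined in CEDS, content, assessment results, and learner records can all reference the same targets.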
Labels: AEP, CCSS, CCSSO, CEDS, Common Education Data Standards, Creative Commons, Greg Grossmeier, IES, LRMI, Maureen Wentworth, NCES, NGA, U.S. ED
Wednesday, May 30, 2012
Data Standards to Support the Formative Assessment Process
Yesterday I had a great meeting with the K-12 stakeholder group directing the development of the Common Education Data Standards. Version 3 of CEDS (a common vocabulary for P-20 data) will include standard data element definitions to support the formative assessment process.
"I am really excited that we are finding ways to include instructionally relevant information into the data standards in ways that are clearly meaningful and important."
I share in David's excitement. The expansion of the data standards will support scientifically proven processes of assessment for learning, using progress data to inform instructional decisions and student learning activities.
It has been a privilege to work with the K-12 stakeholders, content specialists Susan M. Brookhart and Margaret Heritage, and others, on refining the process model and identifying supporting data elements.
“Formative assessment is a process used by teachers and students during instruction that provides feedback to adjust ongoing teaching and learning to improve students’ achievement of intended instructional outcomes.” (FAST SCASS, 2008, October)
David Weinberger, one of the group's LEA representatives from Yonkers Public Schools, commented after the meeting,
"I am really excited that we are finding ways to include instructionally relevant information into the data standards in ways that are clearly meaningful and important."
I share in David's excitement. The expansion of the data standards will support scientifically proven processes of assessment for learning, using progress data to inform instructional decisions and student learning activities.
It has been a privilege to work with the K-12 stakeholders, content specialists Susan M. Brookhart and Margaret Heritage, and others, on refining the process model and identifying supporting data elements.
The process model draft, shown below, is based on case studies of models being used successfully in practice, cognitive science/learning sciences research findings, and engineering/control theory.
The group is identifying supporting data elements and definitions based on concepts from the Formative Assessment for Students and Teachers (FAST) State Collaborative on Assessment and Student Standards (SCASS) (2008). Rather than "reinvent the wheel," the work is coordinated with many other initiatives interested in standard definitions for the types of data used to inform teaching and learning. The group collaborates with initiatives such as the Learning Resource Metadata Initiative (LRMI), Learning Registry (LR), Schools Interoperability Framework Association (SIFA), EdFi, the Mozilla Foundation's Open Badges, and the Shared Learning Collaborative (SLC). We are also working with CCSSO/NGA partners on the components related to the Common Core State Standards (CCSS) and with organizations supporting the RTTT assessment consortia looking at more granular competencies "unpacked" from the standards to measure learner progress.
The emerging standards for data related to the formative assessment process include use cases across traditional classroom-based models, blended learning models, and virtual learning models. The result will be a common vocabulary for describing the data that most directly supports student learning.
Labels: blended learning, CCSS, CCSSO, CEDS, David Weinberger, EdFi, FAST SCASS, feedback, formative assessment, LR, LRMI, Margaret Heritage, NGA, SIFA, SLC, student, Susan M. Brookhart
Friday, March 2, 2012
Un-Blended Blended Learning
In their new monthly column in THE Journal, Michael B. Horn and Heather Staker advise schools to ignore blended learning "best practices."
I would argue that the "best" practices have not yet emerged...although there are some promising practices on track to become proven practices. Horn and Staker are right in that if "best practice" models are defined based on mostly irrelevant aspects of the blended learning model, such as where and when students sit in front of a screen, device-to-student ratios, or how many students sit in a computer lab at a time, then educators can make their own model.
The definition for blended learning leaves room for both good and bad practices:
"Blended learning is any time a student learns at least in part at a supervised brickand-This definition includes models in which there is no connection between the supervised instruction and the self-paced online learning, thus "un-blended" blended learning. The online and offline learning experiences may be disconnected, or they may be tightly aligned, each activity informing the other.
mortar location away from home and at least in part through online delivery
with some element of student control over time, place, path, and/or pace." (Staker, H.; The Rise of K–12 Blended Learning, p.5)
The emerging "best practices" for blended learning will have less to do with seat time and device count, and more to do with what learners do during both offline and online time, and what teachers and online tools do to support learning. The best practices will focus the effort on the right kind of learning activities that motivate the right kind of behaviors and lead to the right kind of outcomes. These are models not to ignore.
Tuesday, January 31, 2012
Common Education Data Standards v2 released!
The Common Education Data Standards v2 release today marks one of many interconnected milestones for 2012 that I think will have far-reaching impact on the U.S. education system. Data standards are not going to revolutionize education...but CEDS is a catalyst. It is serving as a bridge between many other initiatives that collectively have a shot at tipping the scales toward a system of education focused on individual learners rather than groups. This release comes at a time when the "common vocabulary" of common data standards could determine the scale of success for other technology-enabled game-changers such as SLC/SLI, EdFi, P20W data systems, and the innovations being developed and tested in Race to the Top states. Instead of the competing standards of the past, CEDS v2 has carved out common ground. CEDS is just one catalyst for these separate initiatives to pull in a common direction and transform the ecosystem into one in which it is possible to meet the needs of every learner. Here is the official announcement:
The National Center for Education Statistics (NCES) is pleased to announce the Version 2 release of the Common Education Data Standards (CEDS). The CEDS project is a national, collaborative effort to develop voluntary data standards to streamline the exchange, comparison, and understanding of data within and across P-20 (early learning through postsecondary) institutions and sectors. CEDS Version 2 includes a broad scope of elements spanning much of the P-20 spectrum and provides greater context for understanding the standards' interrelationships and practical utility. Specifically, Version 2 of CEDS focuses on elements and modeling in the Early Learning, K12, and Postsecondary sectors and includes domains, entities, elements, option sets, and related use cases.
Version 2 of CEDS can be found at the CEDS website (http://ceds.ed.gov). This website includes three ways to view and interact with CEDS:
1. By Element - Via the CEDS elements page, users can access a searchable catalog of the CEDS "vocabulary";
2. By Relationship - Through the CEDS Data Model, users can explore the relationships that exist among entities and elements;
3. By Comparison - The CEDS Data Alignment Tool allows users to load their organization's data dictionary and compare it, in detail, to CEDS and the data dictionaries of other users.
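As a rough sketch of the idea behind the "By Comparison" view (this is not the CEDS Data Alignment Tool itself, and the element names and definitions below are invented), aligning a local data dictionary against a common vocabulary amounts to seeing which local elements already map and which still need mapping:

    # Illustration only: compare a local data dictionary to a CEDS-like vocabulary.
    ceds_like_vocabulary = {
        "FirstName": "A person's legal first name.",           # made-up definitions
        "Birthdate": "The year, month, and day a person was born.",
        "EnrollmentStatus": "The status of a student's enrollment in a school.",
    }

    local_data_dictionary = {
        "FirstName": "Student first name",
        "DOB": "Date of birth",
        "HomeroomTeacher": "Assigned homeroom teacher",
    }

    matched = sorted(set(local_data_dictionary) & set(ceds_like_vocabulary))
    unmatched = sorted(set(local_data_dictionary) - set(ceds_like_vocabulary))

    print("Already aligned:", matched)    # ['FirstName']
    print("Needs mapping:  ", unmatched)  # ['DOB', 'HomeroomTeacher']

The real tool works at the level of definitions and option sets, not just element names, but the payoff is the same: a shared vocabulary that local systems can map to once and reuse everywhere.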
Educators and policymakers need accurate, timely, and consistent information about students and schools to inform decisionmaking, from planning effective learning experiences to improving schools, reducing costs, and allocating resources, and states need to streamline federal data reporting. When common education data standards are in place, education stakeholders, from early childhood educators through postsecondary administrators, legislators, and researchers, can more efficiently work together toward ensuring student success, using consistent and comparable data throughout all education levels and sectors.
While numerous data standards have been used in the field for decades, no universal language has emerged that can serve basic information needs across all sites, levels, and sectors throughout the P-20 environment. By identifying, compiling, and building consensus around the basic, commonly used elements across P-20, CEDS will meet this critical need.
The standards are being developed by NCES with the assistance of a CEDS Stakeholder Group that includes representatives from states, districts, institutions of higher education, state higher education agencies, early childhood organizations, federal program offices, interoperability standards organizations, and key education associations and non-profit organizations. In a parallel effort, a group of non-government interested parties with shared goals, including CCSSO, SHEEO, DQC, SIF, and PESC, has come together as a Consortium with foundation support to encourage the effort and assist with communications and adoption of the standards.
Wednesday, January 25, 2012
Is the U.S. ready to empower learners with data?
Last week, when I attended the Data Quality Campaign's National Data Summit, I was caught off guard by a subtle but dramatic shift in advocacy for use of data in education.
It was no surprise that the event last week would focus on data use. DQC has shifted its emphasis over the past couple of years from promoting quality in state longitudinal data systems to promoting effective use of the data collected in such systems. Aimee Guidera and the DQC staff deserve credit for moving the agenda forward at each step of the way.
It may not have been obvious to many, but last week's meeting took a next step. I expected to hear about expanded uses of educational data, beyond school accountability, and more about uses of data that more directly impact student learning. I expected to hear about the changing role of state data systems from accountability (reporting up), to supporting local decision-making (reporting down) to the school district, school, and classroom. I expected to hear discussions about how these changing roles require different kinds of data and services provided to teachers. I expected to hear about data for improving instruction and the teaching profession.
What I was surprised about was discussion about giving students access to data to inform their own learning. Of course this is being done and is nothing new. I was not surprised by the concept, only in that it was discussed at a DQC event with the U.S. Secretary of Education, a State Commissioner of Education, and others on the national stage.
It is not a new idea to focus on learning and to empower students to become life-long learners. It is new, however, to discuss on the national stage giving data to students. It is new to have a discussion about the learning that students might do on their own or outside of the traditional public school setting. Long-held unwritten and unspoken turf rules, institutionalized as core to our educational system, have made certain conversations about data use off-limits. For example, "local control" has been an excuse for some states to focus on collecting accountability data, not on data systems that support student learning. And until recently the classroom was off limits, even to other teachers, as the place for the private practice of a teacher's "art". Professional learning communities are changing that.
Education is too often perceived as something teachers do to students. Teachers "educate" students, principals and LEAs "manage" schools, state education agencies "fund, regulate, and hold accountable" schools and districts, and the federal education agency uses funding and federal policy as levers to hold states and school districts accountable. The institutional culture is preserved as long as everyone stays on their own turf. The reality this culture obscures is that student learning is the core purpose for all of these actors.
Because of the institutional turf culture, the national agenda for educational data use has until recently left out the most important participant, the learner. We collect a lot of data about students, but we don't provide students with the data that could optimize their learning experience.
The traditional classroom model of student learning doesn't always promote student ownership, but emerging blended learning models make the student a more active participant. A cultural shift is beginning to change attitudes and behaviors, as U.S. Secretary of Education Arne Duncan said, "Our goal here is not data, our goal is to change behavior and get better outcomes." Better outcomes may require shifts in attitudes from "doing my job" (in the context of established turf) to "doing my part to support student learning."
I've been connected with some "disruptive" innovations already being used to empower students with data, multi-state efforts to develop a "shared learning infrastructure" and teaching and learning systems, and experiments in blended learning that shift the data needs from teacher-centric to learner-centric.
A long-time friend and life-long teacher was once asked by a student, "Why don't you think I'm important enough to let me in on what I'm supposed to be learning, and to tell me how well I'm learning it?" The conversation at the DQC event last week asked a similar question of the data community. While there will continue to be turf issues, as well as positive aspects of local vs. national boundaries, I saw the conversation last week as another milestone toward a system able to meet the needs of all learners.