The Common Core Standards, the national academic standards for K-12 schools in the United States, have now been adopted by 47 of the 50 states, making them the pre-eminent source of what is taught in the vast majority of American public schools.
The Common Core standards don’t just suggest novel technology use as a way to “engage students”; they require learners to make complex decisions about how, when, and why to use technology, something educators must do as well. In the past, tech use, whether limited or gratuitous, was more a matter of preference or available resources than a must-do requirement. With the Common Core, such use is now a matter of law. It would be easy to see the adoption of a set of national standards as bad for technology, mainly because these standards carry connotations of testing, accountability, and other icky edu-stuff that flies in the face of all that the open world of technology represents. But it doesn’t have to be that way. A challenge for teachers will be to rethink how they perceive the role of technology, as hidden biases (both for and against) can sabotage student learning.
While the long-term impact of these standards on learning is obviously unknown, the impetus, and the potential, for meaningful technology integration into learning in U.S. K-12 schools has never been greater.
Read the entire article by Terry Heick on TeachThought: www.teachthought.com/technology/exactly-what-the-common-core-standards-say-about-technology/
Image attribution: flickeringbrad and usnavalcollege