Lois Hetland's Blog

Wanderings, inside and out

Arts Assessment


I’m just reading the document the NEA commissioned on assessment: Improving the Assessment of Student Learning in the Arts – State of the Field and Recommendations. It’s by Sharon A. Herpin, Adrienne Washington, and Jian Li at WestEd — none of whom have credentials listed in the document. Nor is there a copyright date, so though I believe it’s quite recent, there’s no way to tell. This is inexcusable in scholarship, and they should know better.

The story told in the Executive Summary is that, in 2005, the NEA started requiring and reviewing what grant applicants said about their assessment practices, and they found that people weren’t differentiating between program evaluation and assessment of student learning. I’ve seen that commonly, too. So they commissioned WestEd to conduct this study to “examine current trends, promising techniques, and successful practices being used to assess student learning in the arts throughout the country, as well as identify potential areas in which arts assessment could be improved.” This is right-headed, in my view. They identified a field need anecdotally and then got a reputable group to conduct an empirical study. So far, so good.

But the study didn’t find strong models of assessment, so the report switched gears. It describes what’s up now, what literature is good, and what the field needs to do to improve. Again, this sounds useful. But, as they go into the goals in the Executive Summary, I find my first quibble, and it’s a major one. These are what they list as the goals of the current study:

  • Available resources, tools, and documentation related to the assessment of student learning in the arts
  • Current experiences and practices in assessing student knowledge in the arts
  • Current experiences and practices in assessing student skills in the arts
  • Trends in locating and using assessment tools
  • Needs of the field to improve the assessment of student learning in the arts

I’m down with bullets one, four, and five as goals. But do you notice anything that’s important to assess that’s missing? I do.

Bullet 2: Knowledge — ok. But there isn’t really a defined body of knowledge that constitutes what an artist needs to know. Art is the true interdisciplinary discipline — it claims as its content all the content and forms and methods of every discipline. Artists work on science, history, economics, literature, language, mathematics, politics, geography, philosophy, sustainability, social justice and equality, religion, anthropology — name it, there’s an artist who puts that at the center of her or his work. I was just at the Venice Biennale in October 2013, and the national pavilions and Gioni’s curated exhibition both made that abundantly clear. So just what knowledge are they planning to assess? I suspect — we’ll see as I read further — that it’s the history of art, the names of eras and artists and works, and the names and definitions of tools and media and approaches. Good, all good. But so inadequate, and not at ALL the whole of, or even the essence of, what an artist needs to know to practice effectively, or to understand and appreciate the archive of wisdom and intelligence represented by the works artists have created throughout time and place.

Bullet 3: Skills — ok. But, first, it’s not parallel to knowledge. Knowledge is a network of information organized systematically. It should be dense and organized for access so it can be used. But skills? Those are what people can do expertly, yes? My online dictionary says: “the ability to do something well; expertise: difficult work, taking great skill. • a particular ability: the basic skills of cooking.” In what way is skill parallel to knowledge? It’s like comparing apples to a jet plane. What’s parallel to knowledge is Methods — that is, the processes by which experts build and validate knowledge. That’s a definition we use at Project Zero, defined when creating the Dimensions of Understanding in the early 1990s. Skills are part of methods, but methods is bigger: a deeper disciplinary structure.

And it’s THAT that we need to ensure students take away from their primary and secondary education — the deeper disciplinary structures of the disciplines, including the arts. We need to be educating for an appreciation of what art is (Knowledge and Forms), how it works and how it’s done (Methods), what counts as good (Methods), what it’s for (Purposes), and why it matters (Purposes). And by appreciation, I don’t mean merely knowing ABOUT. David Perkins has identified a common failing of education that he calls “about-itis.” No, we need more than learning ABOUT, and we need to demand more and, therefore, assess for more. We need students to be able to use what they know flexibly in novel circumstances. That’s the definition of understanding we use at Project Zero, the “performance view of understanding.” And it’s THAT that we need to be assessing in the arts (and in everything else).

There’s widely shared agreement in the field of assessment that, in order to assess well, you have to be assessing something you intend students to learn. The same is true of research — you have to interpret what you find in light of what you stated you intended to find out. So the research goals are critical to a study’s validity and usefulness. In this case, the goals are inadequate, as I’ve discussed above. And that means that the field is now more confused by whatever’s been found (which I haven’t read yet). Rather than clarify what we really need, it seems the study has set out to state what is needed without considering what that really means. And, as we all know, “we get what we assess,” so if we’re not assessing what the field really is and is for, what are we doing?

We’re setting students and teachers and the educational system up to do Mickey-Mouse work, which demeans the arts and wastes people’s time. It sets the arts up as trivial — and, as Gioni, the Curator of the New Museum in New York and of the 2013 Venice Biennale, states, “they are a matter of life and death.” So that’s malpractice, because we, individually and as a society, desperately need arts education, and that means we desperately need arts assessment. If the assessment is skewed toward assessing something that isn’t really what art is, but, rather, assessing a parody or a flim-flam version of what art is, then the effort serves to derail the quality of arts education and future artistic practice and appreciation.

We have to get the categories right. If we find, as we do, that the arts don’t have quality assessment (and I can’t believe that, given what I know to be the case at MassArt, where ongoing public critique, refinement, and culminating reviews and exhibitions are the norm and where we teach our students to carry out those practices in their arts classrooms when they teach), then it is likely because the field distrusts what’s being done in the name of assessment. So the first order of business, in my view, is to get clear about what we need to assess. Once that’s straight, we can see how it’s been assessed by experts in the field over time and in varied contexts, and how that’s evolving today. It’s THOSE practices that will show us the pathways to assessing student learning in the arts.

So let’s get on it!

Author: lhetland

Lois Hetland, Ed.D., is Professor and Chair of the Art Education Department at the Massachusetts College of Art and Design and Senior Research Affiliate at Project Zero at the Harvard Graduate School of Education. Trained in music and visual arts, she taught elementary and middle school students for 17 years. Currently, she co-leads the Studio Thinking Network, a monthly online conversation among educators who use the Studio Thinking Framework. Previous work includes conducting an assessment initiative at MassArt (2009-2013); serving as Co-Principal Investigator on a National Science Foundation study of potential transfer from visual arts learning to geometric spatial reasoning (2008-2013); conducting research for the co-authored book Studio Thinking 2: The Real Benefits of Visual Arts Education (2013, Teachers College Press, 2nd edition), supported by the Getty and Ahmanson Foundations (2001-2004); serving as Consulting Evaluator for Art21 Educators (2010-2012); serving as Principal Investigator for research and professional development in Alameda County, CA, funded by the US Department of Education (2003-2010); serving as Co-Principal Investigator on the Wallace-funded study Qualities of Quality: Understanding Excellence in Arts Education (2005-2008); and conducting research leading to a set of ten meta-analytic reviews analyzing the effects of arts learning on non-arts outcomes, funded by the Bryant Family Foundation (1997-2000). Contact: lhetland@massart.edu, 617-879-7528 (w)

Comments

  1. Always thoughtful and observant!

