
Feb 22, 2012
liza

TEI 2012 GSC Presentation Reflection

Reflection+Impact

I presented my project from last semester, SnapToTrace, at the TEI 2012 Graduate Student Consortium on Sunday. I must say, it was somewhat of a dissonant experience to present after spending the last six months building a body of research on the foundation of its failings (though also a great exercise). However, it was also validating to hear criticism of this initial approach, and specifically of the delivery of my concept. It allowed me to see the strengths of my thesis as it currently stands and also the weaknesses where I need to focus, specifically on evaluation and assessment. Our GSC mentors happen to have expertise in this area and offered some invaluable advice for moving forward, including assessment “shortcuts” that are less time-intensive and in some ways more applicable:

1) Case studies that are microanalyses of a particular set of behaviors, attitudes, and actions. I don’t think this kind of pinpointed generalization is the best option when working to design inclusively for teachers who come from different backgrounds and are teaching different levels of students. The variables are too many.

2) A post-evaluation only, guided by a coding scheme that examines changes in terminology, attitude, and behavior.

3) Originality of ideas: if one of my main goals is to evaluate how effective this approach is in allowing educators to reframe concepts based on the materials and the context (design, collaboration, creativity, making), then one of my metrics could center on the intersection of different content domains and the ideas generated around those intersections (a rough sketch of what that might look like follows this list).
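To make the third option a little more concrete, here is a minimal sketch of how that intersection metric could be tallied. Everything in it is a hypothetical placeholder: the idea names, the domain tags, and the scoring choice (counting how many generated ideas bridge each pair of content domains) would all come out of the actual coding, not a hard-coded dictionary like this.

```python
from itertools import combinations

# Hypothetical examples: each generated idea is tagged with the content
# domains it draws on. In practice these tags would come from the
# qualitative coding of the generated ideas.
ideas = {
    "light-up circuit quilt": {"electronics", "textiles"},
    "story-driven circuit diagram": {"electronics", "language arts"},
    "resistor color-code arithmetic": {"electronics", "math"},
}

def domain_intersections(ideas):
    """Count how many ideas bridge each pair of content domains."""
    pair_counts = {}
    for domains in ideas.values():
        for pair in combinations(sorted(domains), 2):
            pair_counts[pair] = pair_counts.get(pair, 0) + 1
    return pair_counts

for (a, b), count in sorted(domain_intersections(ideas).items()):
    print(f"{a} x {b}: {count} idea(s)")
```

More ideas spanning more distinct domain pairs would then read as evidence of the reframing I am after, though the real weight of the analysis stays in the qualitative coding itself.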

The other major distinction to make is the difference between evaluating the toolkit and evaluating the learning that emerges from the toolkit. To do this, my initial thought is to implement a short pre-evaluation form and a post-evaluation in the form of a video interview. The latter will use a qualitative coding scheme that is aligned with the initial data gathered in the pre-evaluation. The video evaluation will also gauge the originality of ideas (a sketch of how that pre/post comparison might work is at the end of this post). Here is my light reading for the trek home, from Creativity and Cognition 2009:

Creativity Factor Evaluation: Towards a Standardized Survey Metric for Creativity Support
Erin A. Carroll and Celine Latulipe (UNC Charlotte)
Richard Fung and Michael Terry (University of Waterloo)

I think this paper would actually be super helpful for a lot of us, so I will post it to the minigroup. For instance, flow is referenced frequently as a major foundation of their framework (from a cursory glance, at least).
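In the meantime, here is a minimal sketch of how the pre/post comparison against a coding scheme could work. The coding scheme, terms, and transcripts below are entirely made up; the real scheme would be grounded in the pre-evaluation data, and a human coder would still be doing the substantive work.

```python
from collections import Counter
import re

# Hypothetical coding scheme: each code maps to the terms that count
# as evidence for it. A real scheme would come out of the analysis.
CODES = {
    "design vocabulary": ["prototype", "iterate", "constraint"],
    "making vocabulary": ["circuit", "solder", "fabricate"],
}

def code_frequencies(transcript, codes=CODES):
    """Tally how often each code's terms appear in a transcript."""
    words = Counter(re.findall(r"[a-z']+", transcript.lower()))
    return {code: sum(words[term] for term in terms)
            for code, terms in codes.items()}

# Placeholder pre-form response and post-interview transcript.
pre = "I want my students to build a circuit that lights up."
post = "We could prototype the circuit first, then iterate on each constraint."

before, after = code_frequencies(pre), code_frequencies(post)
for code in CODES:
    print(f"{code}: {before[code]} -> {after[code]}")
```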

More to come soon!