Design educator Jon Kolko details how testing, evaluation, assessment, and feedback are honing the Austin Center for Design
program, describing an iterative and collaborative design process.
Tags:How Testing Improves Design Programs,Design Tips,does student feedback matter,graphic design programs,how schools improve programs,how student feedback influences school programs,austin center for design,capture your flag,career advice,erik michielsen,jon kolko
Erik Michielsen: What roles have evaluation and testing played in building your design graduate program?
Jon Kolko: So we've treated the building of Austin Center for Design as an iterative design exercise, and part of that process is testing it with real people. We treated our first cohort as co-designers, and to their faces I called them co-founders, and I think the majority of them would agree that they're co-founders in the venture. The venture is a non-profit. They don't literally own equity in it, and neither do I, but they own decision-making power. It wasn't all democratic, but there were certainly a lot of things we changed as a result of explicit feedback, implicit feedback, and observational assessment.

Testing means different things in different contexts, but I think it always means trying something, learning from it, and then iterating on it. In this case, we tested the pedagogy: how we were actually going about teaching and learning. We tested the entrepreneurial idea, the notion that when you leave the program, you've started a company. We tested some professors who had never taught before. We tested some course content that had never been used before. And like anything else with testing, we failed a bunch of times, and that's the point. Arguably it's better this year, and arguably it will be better next year.

What's really nice about being a new school is that if you're not dealing with bureaucratic organizations like accrediting bodies, you can change on a dime. That changes when you're dealing with those organizational bodies, and probably in my future I will deal with them, because there's a huge benefit to them. But at least for the time being, it means I can hone the structure of the program (the content is always going to change) until I feel like there's evidence for it being really, really good. As always with evaluation, you take it with a grain of salt.
And so, there are things I just pushed back on as suggested changes, and there are things I completely didn't think of, where students were like, "Hey, we should be doing it like this. Why aren't we doing it like this?" So now we're doing it like that. There is something really, really nice about building a program together with the people who are benefiting from it. I wasn't expecting that at all when I started it. I do think of it as my thing, but I've never felt overly protective of it from outside feedback, and I was not ready for how much benefit I got from that outside feedback, I think is what I'm trying to say.
Capture Your Flag creates a model of success that college graduates and early- to mid-career professionals can follow by interviewing up-and-coming leaders about the formative decisions and experiences shaping their careers.