How I Critique, Support Iteration, and Evaluate Student Work
Across my studio, lab, and thesis courses, I evaluate student work through a process-driven and evidence-based rubric that prioritizes ethical reasoning, conceptual clarity, and meaningful technological synthesis over surface-level polish.
1. Ideation & Research Rigor
Students are assessed on the depth and quality of their research, including theoretical and interdisciplinary sources, observational fieldwork, and both qualitative and quantitative inquiry. In critique, I focus on whether students can clearly articulate what they are investigating, why it matters, and how their research informs design decisions, rather than treating research as a decorative reference layer.
2. Contextualization & Criticality
A key component of evaluation is how well students situate their work within social, cultural, political, and ethical contexts. During critiques, I ask students to identify the assumptions embedded in their concepts and technologies, and to reflect on issues such as equity, accessibility, authorship, labor, and power. Ethical perspective is treated not as an add-on, but as a design constraint that actively shapes form, interaction, and system behavior.
3. Concept–Technology Synthesis
Students are evaluated on how effectively they translate ideas into purposeful technological form. Rather than rewarding technical complexity alone, I assess whether technology is used with clear intent, where interaction design, usability, the choreography of objects or participants, metaphor, storytelling, and sensory experience work together to reinforce the conceptual goals of the project. During critique, I often guide students to simplify, refocus, or reframe their technical approaches in order to strengthen conceptual coherence and experiential clarity.
4. Iteration, Testing, and Reflection
Iteration is central to both assessment and mentorship, and I devote sustained attention to this phase alongside research and writing. I evaluate how students form hypotheses, test assumptions, respond to stakeholder and peer feedback, and revise their work over time, using prototypes as thinking tools rather than final answers, an approach grounded in experiential learning. Critique sessions explicitly address what changed, what failed, and what was learned, and students are expected to document these shifts through reflective writing, developer logs, and design documentation. Progress is measured by learning trajectory and depth of inquiry, not by early perfection or surface-level polish.