Tuesday, October 6, 2015

Notes on seminar 2

Chapter 13. Evaluations.
It's important to evaluate your work, both by yourself and with users, so you get many different viewpoints. This helps you understand the users' requirements, which take time for designers to pin down exactly. Things that limit evaluations are tight schedules, low budgets, or limited access to users.

The DECIDE framework is a good one to use as a checklist.

· Determine the goals
Who wants it and why? Let the goals guide the evaluation.
· Explore the questions
Find the questions relevant to the goals and break them down into further sub-questions. Continually ask: why, why, why?
· Choose the evaluation methods
Finding the right method is important, so you don't choose something that doesn't fit the issue at hand: for example, observing the natural behavior of something, but in a controlled environment. If it's not their natural habitat, their behavior will not be natural.

· Identify the practical issues
It's helpful to do a pilot study first, to surface the largest issues. Other questions to ask yourself: are these the right people to participate, do we have the time and the budget for this, do we have the expertise needed?

· Decide how to deal with the ethical issues
Should the participants be anonymous? How do we protect their information? Have they signed a consent form if we are to use their information? Have we been honest with them?

· Evaluate, analyze, interpret, and present the data.
Decide how reliable your data is: is it consistent, does it make sense? Is the person you interviewed a reliable source? How valid was your evaluation method, now in hindsight? If the evaluation affected the environment, or the environment affected your evaluation, is that relevant? Are we or any of our participants biased? Is that relevant?

It's common not to follow this strictly linearly, but rather to jump back and forth, since reality is rarely this simple.

Chapter 15. Evaluation: Inspections, analytics, and models
This chapter deals with different kinds of evaluations.
Walkthroughs: an expert helps the participant.
Heuristic evaluation: the design is judged against a set of heuristics. Important factors include:
· visibility of system status
· how well the system matches the real world
· how much freedom the user has
· consistency and standards
· error prevention
· efficiency of use
· recognition rather than recall (users should recognize things instead of having to remember them)
· minimalist design
· good help and documentation
You start a heuristic evaluation with a briefing session so the evaluators know what to do. Then comes the evaluation itself, where the expert goes first and the other evaluators follow. You finish with a debriefing session.
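
To make the debriefing step concrete, here is a minimal sketch (in Python, with made-up data) of how each evaluator's findings could be logged against the heuristics above and tallied before the debriefing session. The Finding record, the example problems, and the 0-4 severity scale are illustrative assumptions on my part, not something taken from the chapter.

# Illustrative sketch: the Finding record and the 0-4 severity scale
# are assumptions (Nielsen-style severity ratings), not from the book.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Finding:
    evaluator: str     # who found the problem
    heuristic: str     # which heuristic it violates
    description: str   # what the problem is
    severity: int      # 0 = not a problem ... 4 = usability catastrophe

def summarize(findings):
    # Count problems per heuristic and report the worst severity for each,
    # listing the most frequently violated heuristic first.
    per_heuristic = Counter(f.heuristic for f in findings)
    for heuristic, count in per_heuristic.most_common():
        worst = max(f.severity for f in findings if f.heuristic == heuristic)
        print(f"{heuristic}: {count} problem(s), worst severity {worst}")

findings = [
    Finding("evaluator A", "visibility of system status",
            "no feedback while the page is loading", 3),
    Finding("evaluator B", "error prevention",
            "delete button sits right next to save", 4),
    Finding("evaluator B", "visibility of system status",
            "upload progress is not shown", 2),
]
summarize(findings)

Grouping the findings by heuristic like this makes it easy to see in the debriefing which heuristics the design violates most, and where the worst problems are.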
Cognitive walkthrough: you simulate the user's problem solving at each step.
Pluralistic walkthrough: several evaluators take on different roles.
Questions for us: What model should we use? Shall we follow the DECIDE framework?

