Tuesday, 6 October 2015

Reading reflections for Seminar 2

Chapter 13 - An Evaluation Framework
Chapter 13 is all about DECIDE, a framework that guides you through an evaluation process. DECIDE is an acronym in which every letter stands for a specific step of the evaluation process.

D, Determine goals.
What are the goals of the evaluation? What do you want to find out about your product? (p.626)

E, Explore questions.
Find out which questions need to be answered by your evaluation in order to reach its goal. (p.627)

C, Choose the evaluation method.
Yes, choose the method (or combination of methods) best suited to answering your questions. (p.628)

I, Identify the practical issues.
What practical problems might occur during the evaluation? Can we recruit the users we want for this study, and does the budget cover the things we want to do? (p.630)

D, Decide how to deal with ethical issues.
Are there any ethical considerations you have to keep in mind during the evaluation? (p.633)

E, Evaluate, analyze, interpret, and present the data.
How will the data be analyzed? Statistically? Is there any bias in your evaluation? Etc. (p.640)
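The six steps above can be sketched as a simple checklist to run through when planning an evaluation. This is just my own illustrative sketch in Python; the step names come from the chapter, but the function and its output format are not from the book.

```python
# The DECIDE steps from the chapter, expressed as a reusable checklist.
# Only the step names are from the book; the structure is illustrative.
DECIDE_STEPS = [
    ("D", "Determine the goals"),
    ("E", "Explore the questions"),
    ("C", "Choose the evaluation methods"),
    ("I", "Identify the practical issues"),
    ("D", "Decide how to deal with the ethical issues"),
    ("E", "Evaluate, analyze, interpret, and present the data"),
]

def decide_checklist():
    """Return the steps as numbered lines for an evaluation plan."""
    return [f"{i}. ({letter}) {step}"
            for i, (letter, step) in enumerate(DECIDE_STEPS, start=1)]

for line in decide_checklist():
    print(line)
```

Running it prints the six steps in order, which is handy as a skeleton for an evaluation plan document.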


Chapter 15 - Analytical Evaluation

The theme of this seminar seems to be evaluation, either of a high-level prototype or of a product already on the market that one wants to improve. I found it interesting to read about the different evaluation techniques. Before reading this book I had no idea about the science behind evaluation, and I was positively surprised to find that these methods, involving users, experts, and evaluators, have been around for a while. What I understood from reading these chapters is that you do not need users to evaluate a product. It is, as the book says, of course nice to have the users at hand, but that might not always be possible. That is where evaluation techniques such as heuristic evaluation, cognitive walkthroughs, and predictive models come into the picture. I find it fascinating that with a distinct set of rules, or by following a checklist, you are able to evaluate an application's usability. For example, Nielsen's 10 usability heuristics can be used to evaluate almost any application.

The same heuristics can also be adapted to evaluate websites. I build websites myself from time to time, and I find it intriguing to now have these evaluation tools in my toolset. The problem with evaluation, as the book says, is that the evaluator often gives a subjective evaluation, even though she tries to be objective. Therefore it is important to have multiple evaluators.
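To make this concrete for myself, here is a small sketch of how one evaluator's findings from a heuristic evaluation of a website could be recorded. The ten heuristic names are Nielsen's; the example findings and the 0-4 severity scale (commonly used in heuristic evaluation) are illustrative, not from the book.

```python
# Nielsen's 10 usability heuristics, used here as a checklist for
# recording findings. The findings themselves are made-up examples.
NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

def record_finding(findings, heuristic, note, severity):
    """Append one finding; severity runs 0 (not a problem) to 4 (catastrophe)."""
    assert heuristic in NIELSEN_HEURISTICS and 0 <= severity <= 4
    findings.append({"heuristic": heuristic, "note": note, "severity": severity})

findings = []
record_finding(findings, "Visibility of system status",
               "No loading indicator while the page fetches data", 3)
record_finding(findings, "Error prevention",
               "Form allows submitting an empty email field", 2)

# List the most severe problems first, the way findings from several
# evaluators are typically merged and prioritized.
for f in sorted(findings, key=lambda f: -f["severity"]):
    print(f'[{f["severity"]}] {f["heuristic"]}: {f["note"]}')
```

With multiple evaluators, each would keep their own findings list, and merging them makes the subjectivity problem mentioned above less severe.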

Is it possible to have one ultimate checklist for evaluation that can be applied to all applications? 
Thoughts By Alexander Koski
