Teacher(s)
Language
French
Main themes
· Foundations and definitions of the user experience
· User experience measures
· User experience evaluation methods
· Planning, data analysis and presentation of results
· Integration of the user experience evaluation process into the development of interactive systems
Learning outcomes
At the end of this learning unit, the student is able to:
1. List and define the conceptual elements and metrics of the user experience;
2. Distinguish user experience evaluation methods in terms of purpose (goal), objectives (means to reach the goal), type of collected data, and deliverables;
3. Compare several methods, select the most effective one, and justify the choice;
4. Plan and conduct the evaluation of an interactive system and propose solutions that improve the user experience with that system.
Content
Foundations and definitions of the user experience
User experience measures (illustrated by the scoring sketch after this list)
User experience evaluation methods
Planning, data analysis and presentation of results
Integration of the user experience evaluation process into the development of interactive systems
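To give a concrete flavour of the "User experience measures" topic (this is an illustration only, not prescribed course material), the short Python sketch below shows how one widely used standardized questionnaire, the System Usability Scale (SUS), is scored; SUS is among the instruments discussed in Tullis and Albert's Measuring the User Experience (see Bibliography). The participant responses in the example are hypothetical.

# Minimal illustrative sketch: scoring the System Usability Scale (SUS).
# Assumes 10 items rated 1-5; the participant data below are hypothetical.

def sus_score(responses):
    # Odd-numbered items are positively worded: contribution = response - 1.
    # Even-numbered items are negatively worded: contribution = 5 - response.
    # The sum of contributions (0-40) is scaled by 2.5 to a 0-100 score.
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Hypothetical responses from three participants; SUS results are
# typically summarized by the mean across participants.
participants = [
    [4, 2, 5, 1, 4, 2, 5, 2, 4, 1],
    [3, 3, 4, 2, 3, 2, 4, 3, 3, 2],
    [5, 1, 5, 1, 5, 1, 5, 1, 5, 1],
]
scores = [sus_score(p) for p in participants]
print(scores, sum(scores) / len(scores))

How such measures are planned, analysed, and reported is addressed in the course itself; the sketch only illustrates the scoring step.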
Teaching methods
The pedagogical approach is blended learning, alternating face-to-face classroom sessions with online distance learning via Microsoft Teams. Teaching methods include the flipped classroom and project-based learning:
- Flipped classroom: students study or complete an assignment at home and then meet with teachers and peers in a classroom to ask questions, get extra help or work in groups;
- Project-based learning: students develop a project by combining online learning (e.g. watching tutorials or completing assignments) and face-to-face meetings.
Evaluation methods
Continuous assessment, with no examination in the June session, based on two components: knowledge tests (40%) and a group and/or individual assignment (60%). For the September session, a tailored individual assignment (i.e., covering the failed components) must be submitted on the first day of the session.
The use of artificial intelligence (AI) tools must comply with the guidelines established by the ESPO faculty. It is permitted as a writing aid (e.g., text improvement, translation) and for information retrieval. For the submission of certain assignments, the instructor defines the other authorized uses (e.g., idea exploration, brainstorming, image or text generation).
Other information
All relevant information regarding these modalities and the progress of the activities (calendar, detailed instructions, evaluation criteria, etc.) is presented during the first class session and is available on Moodle.
Some resources (e.g. bibliographic resources, slides, explanatory videos) are in English.
Online resources
Moodle (asynchronous): course slides, bibliographic resources, calendar, models and rubrics, H5P exercises, tests, assignments, workshops with peer assessment, Q&A forum
Microsoft Teams (live): calendar, meetings, documents, discussion, lecture notes
Web links: how-to videos, websites, online software
Bibliography
Javier A. Bargas-Avila and Kasper Hornbæk. 2011. Old wine in new bottles or novel challenges: a critical analysis of empirical studies of user experience. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '11). ACM, New York, NY, USA, 2689-2698. DOI: https://doi.org/10.1145/1978942.1979336
Tiago Silva da Silva, Milene Selbach Silveira, and Frank Maurer. 2015. Usability evaluation practices within agile development. In Proceedings of the 48th Hawaii International Conference on System Sciences (HICSS-48). IEEE, HI, 5133-5142. DOI: https://doi.org/10.1109/HICSS.2015.607
Andrei Garcia, Tiago Silva da Silva, and Milene Selbach Silveira. 2017. Artifacts for Agile User-Centered Design: A Systematic Mapping. In Proceedings of the 50th Hawaii International Conference on System Sciences (HICSS-50). IEEE, HI, 10 pages. DOI: https://doi.org/10.24251/HICSS.2017.706
Margherita Grandi, Fabio Peruzzini, and Marcello Pellicciari. 2017. A reference model to analyse user experience in integrated product-process design. In Transdisciplinary Engineering: A Paradigm Shift: Proceedings of the 24th ISPE Inc. International Conference on Transdisciplinary Engineering, Vol. 5, 243-250, July 2017. IOS Press. DOI: https://doi.org/10.3233/978-1-61499-779-5-243
Carine Lallemand, Guillaume Gronier, and Vincent Koenig. 2015. User experience: A concept without consensus? Exploring practitioners’ perspectives through an international survey. Computers in Human Behavior 43 (2015): 35-48.
Effie L-C. Law, Arnold P. O. S. Vermeeren, Marc Hassenzahl, and Mark Blythe (Eds.). 2007. Towards a UX Manifesto: COST294-MAUSE affiliated workshop. In Proceedings of the 21st British HCI Group Annual Conference on People and Computers: HCI...but not as we know it - Volume 2 (BCS-HCI '07), Vol. 2. BCS Learning & Development Ltd., Swindon, UK, 205-206.
Effie L-C. Law, Nigel Bevan, Georgios Christou, Mark Springett and Marta Lárusdóttir (Eds). 2008. Proceedings of the International Workshop on Meaningful Measures: Valid Useful User Experience Measurement (VUUM). COST294-MAUSE.
Effie Lai-Chong Law, Virpi Roto, Marc Hassenzahl, Arnold P.O.S. Vermeeren, and Joke Kort. 2009. Understanding, scoping and defining user experience: a survey approach. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '09). ACM, New York, NY, USA, 719-728. DOI: https://doi.org/10.1145/1518701.1518813
Thomas Tullis and William Albert. 2013. Measuring the User Experience, Second Edition: Collecting, Analyzing, and Presenting Usability Metrics (2nd ed.). Morgan Kaufmann Publishers Inc., San Francisco, CA, USA.
Arnold P. O. S. Vermeeren, Effie Lai-Chong Law, Virpi Roto, Marianna Obrist, Jettie Hoonhout, and Kaisa Väänänen-Vainio-Mattila. 2010. User experience evaluation methods: current state and development needs. In Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries (NordiCHI '10). ACM, New York, NY, USA, 521-530. DOI: https://doi.org/10.1145/1868914.1868973
Faculty or entity