Evaluate the first dialogue

How do I evaluate dialogue with Table Reading?

Evaluating dialogue is an essential part of the script-writing process, and Table Reading is a highly effective technique for doing so. The exercise lets you assess and refine conversations, working toward dialogue that is more natural, clear, and effective. Building this evaluation into script development ensures a better experience for the end user, which can ultimately have a positive impact on user interaction and satisfaction.

What aspects are considered in the evaluation?

During the evaluation process with Table Reading, several key aspects of the dialogue are analyzed. The idea is to focus on elements such as:

  • Naturalness of the dialogue: Does the conversation feel authentic, as if you were talking to a real person?
  • Clarity: Does the dialogue convey information clearly and unambiguously?
  • Overall experience: Is the interaction smooth and easy for the user?

Evaluating these elements not only improves the dialogue itself but also helps ensure a better user experience.
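
As a concrete illustration, these criteria can be jotted down in a simple scoring checklist after each read-through. The Python sketch below is only one possible format; the field names and the 1-5 scale are assumptions, not part of the lesson.

```python
from dataclasses import dataclass

@dataclass
class TableReadingRubric:
    """Hypothetical checklist for one read-through of a dialogue."""
    naturalness: int   # 1-5: does the conversation feel like talking to a real person?
    clarity: int       # 1-5: is the information conveyed without ambiguity?
    experience: int    # 1-5: is the interaction smooth and easy for the user?
    notes: str = ""    # free-form observations gathered during the session

    def needs_revision(self, threshold: int = 4) -> bool:
        # Flag the dialogue for another editing pass if any criterion scores low.
        return min(self.naturalness, self.clarity, self.experience) < threshold

# Example: a read-through that felt natural but was not entirely clear.
first_pass = TableReadingRubric(naturalness=5, clarity=3, experience=4,
                                notes="Opening question offers many options at once.")
print(first_pass.needs_revision())  # True -> revise the script and read it again
```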

How is a live example conducted?

To better understand the evaluation process, a live Table Reading can be performed. In the case presented, a participant who represents the target user profile is invited to take part in a dialogue session.

  • Selecting the right user: In this example, Jen was chosen because she fits the profile of the target user, Alex.
  • Dialogue recreation: The facilitator and the participant act out a test dialogue. The facilitator reads the system's lines (Moon), while the participant reads the user's lines (Alex).

Example of the dialogue:

  • Moon: "Hi, I'm Moon. I'll help you prepare your mind for sleep. I can accompany you with a meditation, read you a story, or relax you with sleep music. Which one would you like?"
  • Alex: "I would like to meditate."
  • Moon: "Is this your first time meditating?"

This dialogue simulates a guided interaction and assesses the flow and clarity of the conversation.
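
If your team keeps its scripts in code or configuration rather than in a document, the same exchange can be stored as an ordered list of turns and printed as a read-aloud sheet for the session. This Python sketch is only an illustration of that idea; the lesson itself works directly from the written script.

```python
# Each turn pairs a role with its line. The facilitator reads "Moon";
# the participant (here, Jen playing Alex) reads "Alex".
script = [
    ("Moon", "Hi, I'm Moon. I'll help you prepare your mind for sleep. "
             "I can accompany you with a meditation, read you a story, "
             "or relax you with sleep music. Which one would you like?"),
    ("Alex", "I would like to meditate."),
    ("Moon", "Is this your first time meditating?"),
]

# Print a read-aloud sheet for the table-reading session.
for role, line in script:
    print(f"{role}: {line}")
```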

What do you do after the simulation?

At the end of the dialogue simulation, collect feedback on the experience:

  • Discuss the experience: Ask the participant how the experience with the dialogue went. In the example, the participant indicated that the communication was clear and felt like a natural conversation.
  • Identify areas for improvement: Note any specific comments or suggestions the participant offers for improving the dialogue.
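
To keep feedback from several sessions comparable, you might record it in a minimal, consistent structure such as the hypothetical one below; the fields shown are assumptions, not a prescribed format.

```python
# Hypothetical record of one participant's feedback after a table reading.
feedback = {
    "participant": "Jen (reading as Alex)",
    "was_clear": True,          # Did the dialogue convey information clearly?
    "felt_natural": True,       # Did it feel like a real conversation?
    "suggestions": [],          # Specific improvements proposed by the participant
}

if feedback["was_clear"] and feedback["felt_natural"] and not feedback["suggestions"]:
    print("No changes requested; move on to testing with more users.")
```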

What is the next step?

Once feedback has been gathered, it is recommended to run additional tests with:

  1. At least one person who is familiar with the project.
  2. At least one person who is not.

This step ensures that the improvements made are relevant and effective for all types of users.
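
A short checklist, one entry per tester type, is enough to track this follow-up testing; the sketch below uses placeholder tester descriptions.

```python
# Hypothetical follow-up test plan: one tester who knows the project, one who does not.
test_plan = [
    {"tester": "teammate familiar with the project", "completed": False},
    {"tester": "outside tester new to the project", "completed": False},
]

remaining = [t["tester"] for t in test_plan if not t["completed"]]
print("Table-reading sessions still to run:", ", ".join(remaining))
```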

We invite everyone interested in scriptwriting and user experience development to keep exploring and improving their skills in dialogue evaluation, as it is a powerful tool for creating richer and more satisfying interactions.
