The Transcripts tab lets instructors and admins review real conversations between learners and a specific mentorAI—with context and cost metrics.
See summary KPIs (messages per conversation, average rating, estimated cost), search by topic or user, and open any session to read the exact learner inputs and mentor replies.
Use these insights to improve instruction and tune your mentor’s prompts, datasets, and tools.
Instructor · Administrator
Headline metrics for the selected period:
- Avg messages per conversation
- Avg conversation cost
- Avg rating
Totals for conversations, user queries, and assistant responses in the time window.
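If you prefer to slice these figures yourself, they can be reproduced from an export. The snippet below is a minimal sketch assuming a hypothetical chat_history.csv with one row per message and columns named conversation_id, role, rating, and estimated_cost; adjust the names to match the actual Data Reports → Chat History export.

```python
import pandas as pd

# Load an exported chat history file (hypothetical column names;
# adjust to the real Data Reports → Chat History schema).
df = pd.read_csv("chat_history.csv")

# Totals for the time window: conversations, learner queries, mentor responses.
total_conversations = df["conversation_id"].nunique()
total_user_queries = (df["role"] == "user").sum()
total_assistant_responses = (df["role"] == "assistant").sum()

# Headline averages.
avg_messages_per_conversation = len(df) / total_conversations
avg_cost_per_conversation = (
    df.groupby("conversation_id")["estimated_cost"].sum().mean()
)
avg_rating = df["rating"].dropna().mean()  # only messages learners actually rated

print(f"Conversations: {total_conversations}")
print(f"User queries: {total_user_queries}, assistant responses: {total_assistant_responses}")
print(f"Avg messages/conversation: {avg_messages_per_conversation:.1f}")
print(f"Avg conversation cost: ${avg_cost_per_conversation:.4f}")
print(f"Avg rating: {avg_rating:.2f}")
```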
Find transcripts by topic (e.g., “mitosis”) or by user when you need to check on a particular learner.
For each transcript, view:
- User name/username (or Anonymous)
- Mentor name
- LLM model
- Messages exchanged
- Estimated cost
- Timestamp
Read the exact learner questions and the mentor’s responses to evaluate clarity, tone, and accuracy.
Download conversation data from Data Reports → Chat History when you need spreadsheets or BI analysis; the same Chat History page also supports quick in-app viewing.
- In the mentor header, click Analytics, then select the Transcripts tab.
- Scan avg messages/conversation, avg cost, and avg rating to gauge conversation quality and efficiency.
- Use the search bar to locate transcripts about a specific concept or for a specific learner you want to check in on.
- Note totals for conversations, user queries, and assistant responses to understand overall load and activity.
Click any session to view:
- User identity (or Anonymous) and username (if login is required)
- Mentor, LLM model, messages exchanged, estimated cost, and timestamp
- The exact Q&A exchange between learner and mentor
- If you see confusion or low ratings, adjust:
  - Prompts
  - Datasets
  - Tools (e.g., Web Search, Code Interpreter)
- Reach out to specific learners based on what you observe.
- Go to Data Reports → Chat History to download CSVs for deeper analysis or archival.
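As a starting point for that deeper analysis, the sketch below counts the most frequent terms in learner queries from an exported file, which can surface recurring questions worth a targeted review. The column names (role, message_text) are assumptions; match them to the actual export.

```python
import re
from collections import Counter

import pandas as pd

# Hypothetical export from Data Reports → Chat History; adjust column names as needed.
df = pd.read_csv("chat_history.csv")
learner_queries = df.loc[df["role"] == "user", "message_text"].dropna()

# Count word frequencies across learner queries, skipping short/common words.
STOPWORDS = {"the", "a", "an", "and", "or", "is", "are", "to", "of", "in", "for", "what", "how", "why"}
counts = Counter()
for text in learner_queries:
    for word in re.findall(r"[a-zA-Z']+", text.lower()):
        if len(word) > 2 and word not in STOPWORDS:
            counts[word] += 1

# The most frequent terms often point at concepts learners keep asking about.
for word, n in counts.most_common(15):
    print(f"{word}: {n}")
```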
Spot recurring misunderstandings and create targeted reviews, examples, or mini-lessons.
Ensure responses are accurate, on-brand, and student-friendly; refine the System Prompt where needed.
Use user-level transcripts to reach out with resources or office-hour invitations.
After updating prompts, datasets, or tools, compare new transcripts’ ratings, message lengths, and costs.
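One way to check the effect of such a change is to split exported transcripts at the date you updated the mentor and compare the two groups. The sketch below assumes hypothetical timestamp, rating, estimated_cost, and conversation_id columns in the Chat History export; adjust them to the real schema.

```python
import pandas as pd

# Hypothetical Chat History export; adjust column names to the real schema.
df = pd.read_csv("chat_history.csv", parse_dates=["timestamp"])

CHANGE_DATE = pd.Timestamp("2024-09-01")  # when prompts/datasets/tools were updated

# Aggregate each conversation, then label it as before or after the change.
per_conv = df.groupby("conversation_id").agg(
    started=("timestamp", "min"),
    messages=("timestamp", "count"),
    cost=("estimated_cost", "sum"),
    rating=("rating", "mean"),
)
per_conv["period"] = per_conv["started"].lt(CHANGE_DATE).map({True: "before", False: "after"})

# Compare average messages, cost, and rating across the two periods.
print(per_conv.groupby("period")[["messages", "cost", "rating"]].mean().round(3))
```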
Topics that dominate transcripts may indicate where lecture materials or assignments need clarification.
Use Transcripts to move beyond surface metrics—read the conversations themselves, understand learner needs, and continuously improve both your teaching and your mentorAI.