Learn with Facebook was first launched in India in November 2017. It is a career development site focused on both the hard and soft skills people need to advance in today’s digital workforce.
The initiative is part of Facebook’s pledge to train business owners and equip people with the skills needed to succeed in the modern workplace.
The platform has since been launched in Germany and France.
Users are not completing the lessons they start on the platform. We see the biggest drop-off within the first three pages of a lesson.
To create a best-in-class learning experience that allows Facebook to meet its goal of training people, defined as logging in and completing one lesson. The secondary goal is increasing the number of lessons completed and the time spent learning, so we need an experience that both drives people to complete their first lesson and encourages them to continue learning, within the same session and across multiple sessions.
I was part of the team that delivered the enhanced lesson experience for the USA and Brazil launch, responsible for the user research, strategy, design and user testing.
1 month: four one-week sprints
1 User Experience Designer
1 Content Lead
2 Digital Designers
1 Front End Developer
What the data tells us
We undertook a detailed analysis of Germany - DE lessons to understand the performance of the lesson content. This sample set was chosen over the other markets and languages, as sample sizes were low for Germany - EN, France - FR and France - EN.
Our analysis focused on how the audience interacts with lesson content: what engages them and what drives them to exit.
Through these insights, we aim to identify the best-performing lessons and what we can learn from user behaviour to improve lesson performance and increase the overall lesson completion rate.
Current lesson completion rates in Germany - DE
of the 13.1k lessons started were subsequently completed by users (on average).
Data Source: Google Analytics, German
Date Range: 01.06.18 - 08.07.18
We delved deeper into why such a low percentage of lessons are completed by users.
Drivers to exit
We looked to identify and understand where we are currently losing our audience in-lesson.
For lessons, we see a 54.5% decline in page views between the lesson intro page and page one of lesson content.
We have identified the two largest drivers to exit within lessons as:
1. Page one
Across all fifteen lessons, the drop-off between the first and second pages of a lesson averages 18.1%. The decline following page one is greater in low-performing lessons, averaging 21.4% for the three lessons with the lowest completion rates.
2. 5 stacked cards
Whilst users spend considerable time on 5-stacked-card pages, we see a significant decline following these lesson pages, varying from 11.4% up to 37.5%.
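The drop-off figures above come from comparing page views between consecutive lesson pages. A minimal sketch of that calculation is below; the page names and view counts are illustrative placeholders, not the real analytics export.

```python
# Sketch: per-page drop-off from exported page-view counts.
# The numbers below are illustrative, not the actual Google Analytics data.

def drop_offs(page_views):
    """Return the % decline in views between each consecutive lesson page."""
    declines = {}
    for (page_a, views_a), (page_b, views_b) in zip(page_views, page_views[1:]):
        declines[f"{page_a} -> {page_b}"] = round(
            (views_a - views_b) / views_a * 100, 1
        )
    return declines

# Hypothetical funnel: 1,000 views of the intro, 455 of page one, 372 of page two.
views = [("intro", 1000), ("page 1", 455), ("page 2", 372)]
print(drop_offs(views))  # intro -> page 1 shows a 54.5% decline
```

Running this over every page transition in every lesson is what surfaces outliers like the 5-stacked-card pages.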
Using time spent on a page as a measure of a user’s engagement, we isolate what content, and content types, draw users to spend more time viewing lesson content.
The average time on page is higher for pages with videos than for those without.
However, in multiple instances, although users spent longer than average on a page containing a video, they were likely not watching the whole video. For example, one lesson features a 95-second video, yet the average time on that page is just 49 seconds. This suggests our audience do engage with video content, albeit not fully.
3 stacked cards
Whilst we have seen that 5-stacked cards can drive users to exit lesson content, we have also seen our audience spend significantly longer consuming this content than any other on the platform. 3-stacked cards also elicit greater time spent on page, without the correspondingly high user exits.
Aside from the stacked-card views, users spend the longest viewing knowledge check pages. The average time spent on a knowledge check is 54 seconds, 80% longer than the average 30 seconds spent viewing the first lesson page, suggesting users engage with interactive content.
Across all lessons, the drop-off between commencing a knowledge check and completing a lesson is just 12.5%, compared to the 54.5% drop-off at the beginning of the lesson.
Data Source: Google Analytics, German DSS [DE lang]. Date Range: 01.06.18 - 08.07.18
Mapping behaviour against the lesson structure
Looking at the structure of a lesson, we mapped the high-engagement and drop-off points throughout, to assess where the lesson needs to work harder at retaining and engaging users.
What the solution looked like
From the current lesson performance data, we found that the sections users engaged with most were the interactive elements of the lesson.
We therefore looked to implement active participation throughout, with interactions on every page that we believe will be additive to the learning experience.
Revisiting the knowledge check, we amended the format to a scenario-based question/exercise, which helps the user retain the lesson information.
For the user testing sessions, we tested 2 sets of prototypes:
Lesson with active participation throughout - with a simple scenario-based knowledge check
Lesson with active participation throughout - with a complex scenario-based knowledge check
Simple vs Complex
A simple Knowledge Check uses the same Q&A approach we currently have, improved with a business application; the correct answers are always explained, whether the user gets them right or wrong.
A complex Knowledge Check is a specific activity related to the lesson that requires a little more thinking and has a more dynamic interaction. It’s often a sorting or organising game that deals with multiple variants. It is also built around a business application and the correct answers are always explained.
We ran remote, unmoderated user testing to ascertain the viability of the proposed knowledge check concept, and to assess the engagement of the active participation throughout the lesson.
Through usertesting.com, 16 participants joined our user testing sessions: four each from Brazil and Germany, and eight from the USA. All of them completed the session.
Objectives and Hypotheses
We gathered insights around the overall effectiveness and problem areas within the lesson being tested. The goals of the study were to:
1. Assess the overall engagement of the lesson being tested
We believe that adding interactions to pages will enhance the learning experience. We will know this is true if participants reflect on the content whilst answering the question.
The interactions were viewed by users as a differentiating factor and an improvement over other online learning platforms, described as "not just someone teaching you" and as something that "actually got you thinking".
“I find that a lot of online learning just gives you information. I like that this is actually asking questions I can engage with. I feel that makes it more intriguing. It makes me want to learn more and actively get involved.”
- Participant 15
2. Identify obstacles to completing given tasks
We believe a variety of complex and simple interactions will help people move through the lesson. We will know this is true if participants are observed to spend time thinking about, but not struggling with, the new elements.
Participants stated that the complex interactions made them want to continue learning and keep engaging with the lesson. They reflected whilst toying with the most interactive element, the slider. This was mirrored in the complex knowledge check as well.
“I liked the questions that were asked in-between the lesson sections, they made me feel more engaged and made me want to keep going with the lesson.”
- Participant 4
3. Explore the difference between simple and complex knowledge checks.
We believe a business scenario based knowledge check will help people to recall the information they have learnt more readily. We will know this to be true when participants accurately answer knowledge based questions at the end of the experience.
The majority of users didn't comment on the use of a business scenario; only 2 of 16 explicitly used it to frame their thinking.
Users found the explanation of why their answer was right or wrong really useful and engaging. This was noted as a differentiating factor.
“I like that quiz, I like the fact I got questions wrong. It makes you learn, doesn't it.”
- Participant 5 (Simple KC)
Recommendations for all lessons
Involve the learner
Use a variety of playful interactive tasks ahead of the related content, revealing insight and further information. This engages users and draws them through the lesson.
Promote a mastery of the subject, rather than a performance goal. Increase the complexity and depth of the lesson as it progresses, providing a genuine feeling of success and advancement.
Explore the impact of using multiple media types, including video and animation within lessons.
Provide emotionally satisfying content
Explore the use of a selectable or personalised business scenario throughout the lesson, weaving it into the knowledge check to encourage understanding through empathy.
“I definitely see how the company I'm in right now is missing out on some opportunities.”
- Participant 12
Explore the benefits of an enhanced resources and partners section based on the response to the knowledge check, e.g. ‘you got this question wrong, so check out this additional reading’, as well as providing additional reference details for the business scenario to help validate its authenticity.
USA and Brazil launch
Across both markets combined, 62.5% of lessons started are now completed. The new lesson format will be rolled out to the existing market platforms: India, Germany and France.