Opinion
One Year On: Transforming Onboarding for Women and Non-Binary Applicants
Around this time last year, I announced that we were launching our own coding platform for complete beginners, Learn to Code. At the time, I said that using third-party platforms in our onboarding process was discouraging learners, in particular women and non-binary applicants. You can read my announcement post here, where I spell out some of the rationale and - ultimately - the aspirations for the project.
Accountability and reflection are important. It’s easy just to launch things and say, “look at us, we’ve solved a problem”, without any data to back it up. So, one year on, I have some data, and I have to admit straight off the bat … accountability and reflection are pretty easy when your data says, “huh, I guess we did actually solve a problem”.
To recap, in 2022 we were unhappy with the progress of our applicants through these third-party tools, and we had a particular issue with the disproportionate number of women and non-binary people dropping out of our application process while using them. After some analysis, we found that the third-party learning tools we used to onboard people onto the bootcamp had a very different teaching style from the bootcamp itself. Their difficulty levels were extremely uneven: most tasks taught individual building blocks that were trivial and unchallenging, but each module culminated in a piece of work that drew all of those building blocks together, and the difficulty spiked at that point. For many learners, particularly those whose forays into the industry were tentative, these spikes in difficulty caused them to give up immediately and decide it was not the right field for them.
It’s easy to look at a problem like that and treat the difficult challenges as the core issue - after all, these tended to be the points at which people threw in the towel. However, the goal of anyone learning to code is that these sorts of problems should eventually be solvable. So what we focused on instead was fixing the difficulty curve. We followed the principle that new knowledge, once acquired, should be challenged immediately: learning the tools required to solve harder problems in a rote, unchallenging way ultimately makes them difficult to retrieve at the point of need. Tacking to Bjork’s theory (not that one) of “desirable difficulties”, we actually increased the average difficulty of the problems learners faced, while aiming not to stray into territory that felt impossible. We hoped this would yield a notable difference immediately, but if it didn’t, we would have the visibility and control to adjust the content and improve retention.
So - drumroll please - since launching JS Basics one year ago, we have seen a 29% increase in applicants following through onto the course. The increase is even more pronounced among women and non-binary people, who are 74% more likely to join one of our courses now than before we launched.
We were also keen to track the pace of learning, and the eventual learning outcomes, from launch onwards, to ensure that we weren’t accidentally eroding other KPIs. As it happens, since launch, learners take about 30% less time to complete our onboarding journey end-to-end, and achieve better outcomes at the point of the technical test.
For anyone who is naturally sceptical of cherry-picked statistics, I should mention that these increases are against a higher number of applicants overall, and that our tech tests control for cheating (including the methods currently most in vogue, such as LLMs like ChatGPT).
Both the tool we developed and the learning material we deployed on it are the handiwork of a great, diverse team, unified by the objective of improving outcomes for our students and our applicants. I am happy to take on the burden of public accountability, but the credit belongs to them. I, for one, am grateful.
Sam Caine
Chief Operating Officer