Today’s post, on retention, was supposed to be the final one in the “building blocks” series where we bring together the various aspects of great technology products and the questions that can guide our thinking through this process.
But, before spending time on retention, I thought I'd take a detour and share a few notes on the dark side of engagement from Nir Eyal, the author of a book on the subject. He wrote a thought-provoking post in which he called out the dark side of aiming for engagement – unfortunately, making things more engaging also makes them potentially more addictive.
We’ve all experienced this with some of our favorite technology products – they are engaging to the point where we can’t go a day without that hit. His suggestion is to build safeguards into our products. Here are a few examples –
Instead of auto-starting the next episode on Netflix or Amazon Video, the binge-inducing video streaming services could ask users if they’d like to limit the number of hours they watch in a given weekend.
Online games could offer players who cancel their accounts the option of blacklisting their credit cards to prevent future relapses.
Facebook could let users turn off their newsfeeds during certain times of the day.
And rather than making it so fiendishly difficult to figure out how to turn off notifications from particularly addictive apps, Apple and Google could proactively ask certain users if they’d like to turn off or limit these triggers.
We saw real-life examples of these issues recently when a 13-year-old in China jumped off a building after being denied access to Tencent’s very addictive “Honor of Kings” game. Then, a 17-year-old nearly died of a cerebral infarction after 40 hours of continuous play. In response, Tencent introduced time limits for anyone up to 18 years of age. Of course, one would hope that companies won’t wait for a tragedy before creating such safeguards.
These examples aside, there have been studies on the negative effects of social networks on teenagers’ self-worth and body image. And we’ve all likely experienced meals where we wished there was a sign like this.
Nir Eyal concludes his post with this –
Of course, tech companies won’t be able to “cure” addictions, nor should they attempt to do so. Nor should they act paternalistically, turning off access after arbitrarily determining that a user has had enough. Rather, tech companies owe it to their users simply to reach out and ask if they can be helpful, just as a concerned friend might do. If the user indicates they need assistance cutting back, the company should offer a helping hand.
With the data these companies collect, identifying and reaching out to potential addicts is a relatively easy step. A harder one, it seems, is caring enough to do the right thing.
For those of us who are part of a team or company that builds technology products, this responsibility to care is on us.