Dr Andrea Rosales and Professor Jakob Svensson on how tech teams' appeal to youth culture often excludes the input of older workers, with implications for the kinds of products and services they produce.
There are many challenges posed by the current datafication of contemporary connected societies. These include age-related biases embedded in the algorithmic design of digital platforms and concerns about the data on which algorithms base their increasingly automated activities. Both the design of algorithms and the data on which they are based reinforce age discrimination.
It is common knowledge today that tech culture is quite homogeneous in terms of ethnicity and gender, but this is also the case for age. Tech culture has its roots in hacking, a youth-oriented culture that emerged on the campuses of research universities and then moved to the garages of middle-class suburbia. These were spaces where young people – mostly boys – entertained themselves with coding, making toys for themselves, challenging the adult world and dreaming of a different world in which their bodies would be irrelevant.
This emphasis on youth is still present today, with programmers expected to be young, enthusiastic and flexible, and to want to work in a vibrant and dynamic environment (for instance, in workplaces that include video games and table football). According to many job adverts, it is a multicultural environment where programmers are invited to make new friends. This is partly linked to the entrepreneurial side of tech culture, which often implies a youth-oriented office setting: furniture such as bean bags, an emphasis on friendship, and an expectation of late starts, late nights and drinking with your office mates. Indeed, working in tech often suits a young lifestyle and people who have no family responsibilities.
In such a context, those who are perceived as older (ie those above 35) are expected to be geniuses and inspiring leaders: people who have left coding to become managers. Older tech workers who would like to continue programming, or workers who switch careers later in life, may therefore find it difficult to fit into such an environment.
The influence of this youth-oriented culture in tech companies also has consequences for the design of their products and services. Algorithms may end up reproducing the biases of the teams that engineer them. By not considering the values, interests and habits of older users, digital products and services are likely to deprioritise and disregard such users, contributing to discrimination against them.
When it comes to race and gender, previous studies have shown how, for example, face-recognition systems were launched despite producing sexist or racist results. A more diverse programming team could have stopped such bias creeping in. Biased systems are not good for business either, as they lead to products and services that do not match users' expectations. For example, after years of public complaints, IBM decided to leave the face-recognition business. Similarly, Amazon ended up scrapping the recruitment engine it had developed to automate the filtering of job candidates.
This has led many tech companies to actively seek teams that are inclusive in terms of race, gender and sexual orientation. However, they often still reinforce age stereotypes in their job adverts and public communication, frequently in a quest to give their company a youth-oriented feel.
There is therefore a need to continue raising awareness of the discriminatory practices embedded in digital technologies, particularly with regard to age, and to continue fighting for more equality in society.
Dr Andrea Rosales is a researcher at the Internet Interdisciplinary Institute at the Universitat Oberta de Catalunya and Jakob Svensson is a media professor at Malmö University, Sweden. Dr Rosales is first author of this recently published study on ageism in the era of digital platforms.