This week, we’ve come across online articles and social media posts discussing the claim that an AI is now able to accurately predict a person’s death or remaining lifespan. With descriptors such as “Doom calculator” or “Death bot” populating many of these posts, we’ve also noted a lack of information or detail beyond the widely shared headlines. A separate set of articles appears to be promoting a web-based “AI Death Calculator” tool that anyone can try.
The origin of this claim is a paper titled “Using sequences of life-events to predict human lives”, published on 18 December 2023 in Nature Computational Science. Some science-focused platforms reported on the paper, followed shortly by news sites such as The Daily Mail and The Independent. Based on our scan of social media, articles from these news platforms have been the most widely shared. The paper describes the work of a group of researchers from the Technical University of Denmark who created an AI model named Life2vec.
In summary, the researchers used a rich set of health and labour data from 6 million Danes to train a large language model, a type of generative AI. The model processes this vast dataset of human data (including gender, income, education, medical history and specific life events) and predicts the most statistically likely outcomes of an individual’s life. The researchers looked not only at mortality but also at other outcomes such as personality traits and predicted lifetime income. Statistical analysis based on a set of factors is not in itself new, and such analysis can be conducted without the help of AI. However, the methodologies and the volume of training data behind this model have resulted in high-accuracy predictions that make it noteworthy. According to the paper, Life2vec’s predictions are more accurate than those of other existing AI models. The most cited example is that when Life2vec was fed data on a test group (half of whom died between 2016 and 2020), it was able to predict with 78% accuracy who had lived and who had died.
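To make the 78% figure concrete, here is a minimal sketch of how accuracy is computed for a balanced test group like the one described in the paper. The predictions and outcomes below are entirely made up for illustration and are not Life2vec output; the point is only that, on a group where half the people died, random guessing would score around 50%, so 78% is well above chance but far from certainty.

```python
# Illustrative only: what "78% accuracy on a balanced test group" means.
# All numbers below are hypothetical, not data from the Life2vec study.

def accuracy(predictions, outcomes):
    """Fraction of predictions that match the true outcomes."""
    correct = sum(p == o for p, o in zip(predictions, outcomes))
    return correct / len(outcomes)

# A toy test group of 10 people: half died (1), half survived (0),
# mirroring the balanced design described in the paper.
outcomes    = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
predictions = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]  # 8 of 10 match

print(accuracy(predictions, outcomes))  # prints 0.8 for this toy group
# Flipping a coin on a balanced group would score roughly 0.5,
# so a score like 0.78 is meaningfully better than chance,
# while still getting more than 1 in 5 people wrong.
```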
Life2vec’s accuracy is limited to the Danish dataset and has not yet been replicated in other locations, nor has the full algorithm behind the model been published. While the study heralds new and potentially breakthrough technology, it is still far from being available to the public in any form.
Therefore, while reporting on Life2vec has surfaced eye-catching statistics that are technically true, the framing of the study may be misleading and incomplete. Life2vec was able to predict the mortality of a specific group with high accuracy in the context of a specific study; this does not necessarily mean that it can currently do so for any individual. We thus rate the claim that an AI can accurately predict human deaths as somewhat true.
However, articles and posts which leave out the above context allow further misinformation and potential scams to pop up in their wake. Following the initial reporting on Life2vec, we found an article on Yahoo Finance that appears to report on the Life2vec study while also promoting a web-based service that the public can try in order to calculate their own mortality. Possibly capitalising on the lack of detail in many articles about Life2vec (for instance, the fact that the algorithm behind the AI model is not currently public), this article links to a website – deathcalculator.ai – which uses the Life2vec name and invites users to “try it for free” by keying in personal information.
Clicking the “try it for free” link leads to another page of purported examples of the “AI Death Calculator” in action. However, any attempt to try the offered service instead redirects to a different “No Filter NSFW” AI chatbot – suggesting that the Life2vec name is possibly being used to covertly promote a different product, or to capture user information through the sign-up form. We further checked the source code of this website and found that it was only published on 20th December 2023. It is therefore important to be cognisant of shady and potentially dangerous marketing techniques such as this one, which exploit misleading or incomplete reporting. This is also an example of how some news articles which might appear legitimate can nonetheless function as false advertisements. Given the current landscape, saturated with constant headlines about new technologies yet also filled with incomplete information, it is more crucial than ever to actively seek out details and context to protect ourselves from falling prey to bad actors online.