LOVE KNOWLEDGE AND CHALLENGE SCIENCE
Last year, the Danish Broadcasting Corporation DR was reprimanded. The researchers who had participated in the much-maligned TV program "Ellen Imellem" were vindicated by the Press Council because they had been misled: they had been asked to take part in the program on a deceptive basis. The criticism can also be read as a nod to science. And we need that in a world full of fake news and misinformation.
For the past two and a half years, health experts have had their hands full fighting COVID-19 and the spread of conspiracy theories and misinformation about the pandemic.
From fake news stories about rising stillbirths caused by the COVID-19 vaccine to baseless reports linking the shots to infertility in women, the so-called "infodemic" has threatened the fight against the virus. Like the pandemic itself, the "infodemic" spread most rapidly among the world's most marginalized and vulnerable. Here, excess mortality was massive and generally undocumented. Fear, confusion, and distrust of institutions lead people to accept harmful medical myths, researchers say.
What can we, in privileged Denmark, learn from it? And what can we learn from those situations where science misinforms, or when it is clear that scientific experiments, data, or expert analyses are riddled with bias?
My conclusion is: Love knowledge and challenge science! Let's look at a few telling examples:
One of the most profound misconceptions about the human psyche is that men are simple and women are complicated. Researchers in gender psychology trace this belief back to at least the 19th century, when the long-held belief that women were inferior versions of men began to fall out of favour. More than 100 years later, however, this idea still shapes how society perceives women and how biomedical scientists approach research. On May 30, 2019, the New York Times ran an article titled "Fighting the Gender Stereotypes that Distort Biomedical Research," which inspired this piece. The article asks, in part: "Would you take medicine if you knew the drug had only been tested on men and male animals?" According to Rebecca Shansky, a neuroscientist at Northeastern University in Boston, most trials are done on male animals. There is a widespread belief that females are too complicated to study.
We know that men's and women's bodies function differently. Symptoms of the same condition or disease show up differently depending on gender, and medicines can have different side effects. Yet the research that informs diagnoses and treatments is often limited to male subjects. Dr Janine Clayton, director of the Office of Research on Women's Health at the National Institutes of Health in the US, explains: "Much of medical science is based on the belief that male and female physiology differ only in terms of reproductive organs. Because of this, most research has been done on male animals and male cells."
Research is carried out on more than just living beings. Today, massive datasets from all over the world can be linked together. New insights emerge daily with the help of artificial intelligence, and huge breakthroughs happen across disciplines. In addition, AI is changing how scientists, and the rest of us, perceive and act. This is wonderful in many ways, but there is growing concern that the algorithms used by modern AI systems produce discriminatory outputs, presumably because they are trained on data in which societal biases are embedded. This is true for gender, race, disability, and other areas where discrimination is prevalent.
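The mechanism is easy to demonstrate. Below is a minimal sketch in Python, using entirely hypothetical hiring records, of how a naive model that simply learns rates from biased historical data ends up reproducing that bias in its predictions. The data and the function names are illustrative assumptions, not from any real system.

```python
from collections import defaultdict

# Hypothetical historical hiring records: (gender, was_hired).
# The history itself is biased: equally many candidates per group,
# but very different hiring rates.
records = [
    ("M", True), ("M", True), ("M", True),
    ("F", True), ("F", False), ("F", False),
]

# "Training": estimate P(hired | gender) directly from the data.
counts = defaultdict(lambda: [0, 0])  # gender -> [hired, total]
for gender, hired in records:
    counts[gender][1] += 1
    if hired:
        counts[gender][0] += 1

def predict_hire_rate(gender):
    """Predicted hiring probability, learned from the biased history."""
    hired, total = counts[gender]
    return hired / total

# The model mirrors the historical disparity rather than correcting it.
print(predict_hire_rate("M"))  # → 1.0
print(predict_hire_rate("F"))  # → 0.333...
```

The point of the sketch is not the arithmetic but the structure: nothing in the pipeline is malicious, yet because the training data encodes a societal bias, the model's outputs do too. Real systems are vastly more complex, but the failure mode is the same.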
To better serve society, individuals, research and business, fighting algorithmic bias must be a priority. "By 2022, 85% of AI projects will deliver flawed results due to bias in the data, algorithms or the teams responsible for managing them. This is not just a gender inequality problem - it also undermines the utility of AI" according to Gartner, Inc.
It is therefore exciting that Denmark recently became the first country in the world to establish a labelling scheme for IT security and responsible data use in companies. The scheme is backed by a number of organisations, including the Danish Industry Foundation, the Confederation of Danish Industry, the Confederation of Danish Enterprise, SME Denmark and the Danish Consumer Council. To obtain the label, companies must complete a self-assessment based on eight criteria. The answers are validated, and if they are satisfactory, the company is awarded the label.
I am particularly interested in criterion 7.2, which requires the company to ensure that the use of algorithms and AI is factual, both in terms of choice of data (e.g., quality, nature, and origin) and computational logic (inference method). Furthermore, it must ensure that errors, bias, and unlawful discrimination in data sets and models are handled based on professional values and practices.
At the same time as Danish companies must now consider gender parity across several layers of management, they also have the opportunity to apply the same scrutiny to their approach to data. This could allow Denmark to change the age-old gender stereotypes that affect our lives, well-being, and societal development. It will be interesting to see whether companies seize this opportunity, and whether both private and public managers and boards embrace the data bias issue.
Well, now back to scientists and science. I love that Anja C. Andersen said no, and that the Press Council agreed with her. I am proud that a talented female scientist won the Hørup Debate Prize. But science and academic authority must never go unchallenged, especially when it comes to life-threatening discrimination and the perpetuation of inequality in society.
The only way to minimise the forces that spread misinformation, and in some cases seek to undermine our democratic information society, is open and critical debate. As citizens, consumers and parents, we must expect the best from our research institutions, companies and experts. Active participation, questions and debate are needed. The more complex the world becomes, the more we must lean in and take responsibility for the data and information base on which all new knowledge is built. We must love knowledge and challenge science.