Article

How a Humbler Science Becomes a Better Science

Sara Lumbreras, Laura Gismera and Lluis Oviedo
1 Institute for Research in Technology (IIT), Universidad Pontificia Comillas, 28015 Madrid, Spain
2 ICADE, Universidad Pontificia Comillas, 28015 Madrid, Spain
3 Faculty of Theology, Pontifical University Antonianum, 00185 Roma, Italy
* Author to whom correspondence should be addressed.
Religions 2023, 14(1), 64; https://doi.org/10.3390/rel14010064
Submission received: 11 November 2022 / Revised: 23 December 2022 / Accepted: 28 December 2022 / Published: 3 January 2023

Abstract:
Giving humility a key role in scientific practice and communication would improve both its objective social function (that is, the production of knowledge about our world and its application to the improvement of the human condition) and its public acceptance. This article reviews the limits of science arising from systemic, epistemic, methodological, and individual limitations and links them to the phenomena in scientific practice from which they arise. The reflection invites us to consider science from the point of view of its limits, both in situations where consensus is hard to reach and when a consensus has indeed been achieved. Science and technology reflect who we are as individuals and as a society and inherit both our virtues and our weaknesses. Humility is the key to a technoscience that brings us closer to the truth and helps us advance toward improving the human condition. A humbler science becomes a better science.

1. Introduction

The social function of science and its performance depend on its ability to provide an accurate and reliable representation of the world we inhabit and, beyond improving our understanding, to ground technological developments that solve real problems and advance the wellbeing of our species. The social acceptance of science depends not only on this performance but also on its capacity to convince the general public that it does more good than harm and will not disrupt other social and personal spheres. In this manner, the success of science is tied both to its contributions and to its ability to steer clear of hubris and excessive claims.
This reflection invites us to consider science from the point of view of its own limits. This is an essential lesson for every social system: when a system fails to learn its limits, the distinction between what it can and cannot do, dysfunctional models and disruptions follow. This happens not just with science but also with politics, economics, and religion. Humility is usually counted among personal virtues rather than features of a system's constitution and functioning; however, some sense of restraint appears necessary for good functioning at every level.
Once we become aware of the principle of humility, how can we integrate it into scientific practice? Different paths can be followed when exploring humility in the context of science. In this paper, we try to describe how science and scientists can learn about the limits of science and scientific practice in order to better serve the objectives of science: improving our objective understanding of the world and improving the human condition.

2. Humility and Fallibility

Many results that were considered solid in former times are later dismissed. Fallibility belongs to science as one of its best traits: it keeps science always open to admitting its errors and correcting them. In this manner, what is virtuous in science reveals its provisional character, a level of uncertainty in many cases, and the need to assume a humble stance. Paradoxically, the strength of science is based on its weakness; in other words, science shows its best side when it recognizes its fallibility and limits.
These views do not preclude doing science in a way that avoids narrow Popperian criteria and rests instead on evidence and good arguments. The point is that science needs to be open to possible errors, limits, and the need for correction as a condition of its own success. Indeed, that humble sense of fallibility is among the best contributions it can offer to other social systems, like economics, politics, and even religion, which are often tempted by dogmatism and strong certainties closed to error and correction. Science has progressed precisely because it assumes this status and remains always open to checks, correction, and improvement. Its many failures have paved the way for progress as much as its positive achievements have.
Such 180-degree turns (or U-turns) are common in medical recommendations. In 2000, American Academy of Pediatrics guidance recommended that mothers with food allergies avoid nuts during pregnancy and not give them to their children until the age of at least three. In 2008, the recommendation was reversed: it is best to expose children to nuts from 4 months of age (with exclusive breastfeeding recommended until then). The change was prompted by new studies showing that early exposure to nuts reduces the risk of allergy by no less than 80%.
A Mayo Clinic study identified 146 reversals in medical practice between 2000 and 2010 (Prasad et al. 2013). Such erroneous recommendations have a considerable impact on people's lives and health, so it is key to understand their origin and to manage our confidence in present guidelines; that is, to introduce humility with respect to our current scientific knowledge. This awareness poses serious questions and dilemmas for medical practitioners, whose restraint and prudence could discourage effective treatments that have only been provisionally tested. This is not just a moral problem but one that requires continuous attention, testing, replication, and feedback. As we will discuss, humility should not lead to inaction but rather to contemplating the possibility of errors in common knowledge and practice, which calls for continued research to gather sufficient supporting evidence. This paper first presents a concise review of humility as a virtue in the context of science, followed by a compilation of the main issues that call for humility in scientific practice.

3. Social Epistemology and Epistemic Virtue

The objectivity of science is one of the pillars of our societies. We rely on it to determine the best course of action in technical decisions (e.g., which treatment to administer to a patient or how to build a bridge) and to inform policy decisions (such as which economic policy would be most beneficial to a country in a given situation). In recent decades, however, a movement of skepticism about science has emerged, with important social consequences. Climate change denial and anti-vaccine movements are rooted in questions about the relationship between science and power, objectivity, and interests. Insisting on the objectivity of science and its processes is not an effective strategy to bridge this gap. We should rather do the opposite and accept with humility that science is not perfectly objective.
However, this is far less obvious in most contexts. For a vast majority of evidentialists, belief in general, and scientific belief in particular, is based solely on evidence, which can be determined unequivocally (Conee and Feldman 2004). In addition, many thinkers are individualists in this context, arguing that doxastic attitudes are determined purely by individual factors. For example, if a doctor trusts the results of a study published in a journal of his specialty without having seen direct evidence, it is because he has previously received evidence that the journal is trustworthy. Thus, even if only indirectly, all individual beliefs are supported by evidence to which the individual has had access.
This view is opposed by social epistemology, which emphasizes the role of social factors in the formation of beliefs. A string of authors, starting with Foucault (Foucault and Rabinow 1997), stressed the importance of social structures in the formation of knowledge, and especially the relationship between science and power. According to Foucault, it is those who hold power who define what is normal and what is not.
Foucault’s perspective might suggest conscious manipulation, but this is not necessarily the case. His approach is more concerned with the power and social-control interests that permeate all discourses, including scientific ones, even if it would be unfair to blame science for such misuse. Bruno Latour is perhaps more closely associated with a critical view of science and its social conditions beyond sheer scientific interests (Latour 1988). After observing practical work in a laboratory, Latour argued that naive descriptions of the scientific method, where theories are tested by a single experiment, are not consistent with actual laboratory practice. According to him, a typical experiment produces only inconclusive data, which is often attributed to methodological or practical failures. Scientific training consists not in learning the scientific method but in acquiring good intuitions for making subjective decisions about which data to keep and which to discard. These intuitions are sound and far from arbitrary, but they do introduce subjectivity into scientific practice. This vision of a compromised science is implicit in contemporary attitudes of scientific denialism. However, we believe that the mechanisms that connect science and society and make science deviate from objectivity, while real, are mostly unconscious. In the first part of this article, examples of these deviations from objectivity and the phenomena that generate them will be presented.
Finally, according to virtue epistemologists, science must be understood as a human activity that embodies both the strengths and weaknesses of the individuals who perform it (Wood 2009). Thus, science is objective only when the individuals who practice it are. According to Wood, a series of virtues is necessary for scientific practice: from curiosity, which he identifies as the main virtue, to sensitivity, honesty, and objectivity. For virtue epistemologists, it makes no sense to speak of science in the abstract, only of the project in its concrete manifestations. Virtues should not be taken for granted; they require a conscious effort on the part of the scientist. Adopting this perspective, we will speak of good science and bad science in the rest of this article. One of the main virtues that characterize good science should be objectivity, regardless of the subject matter and the approach to truth.
Objectivity must therefore be sought through conscious effort. We emphasize that acknowledging that the result of scientific activity is not necessarily objective does not deny the existence of absolute truth. The perspective we take in this article is not relativistic. It simply recognizes that our shortcomings as human beings pervade all our constructs: only through humility and the acknowledgment of our mistakes do we get as close as possible to good science, a science as objective as our means allow, one that brings us closer to the truth.
The presence of values in the practice of science overlaps with axiological questions about its applications. The simplest version of this view assumes that technoscience is neutral and that only particular applications have humanizing or dehumanizing consequences. Thus, science has been compared to a golem, a creature from Jewish tradition formed from clay that followed the orders its owner wrote on a note placed in its mouth. As Collins and Pinch put it (Collins and Pinch 2012), “science is neither a knight nor an ogre: science is a golem.”

4. Humility and Systemic Limits

The first reason for humility in science stems from its deepest limitation: can science reach everything? Science has developed accurate knowledge of a vast part of our world, yielding very useful applications in many human and social settings. As more precise knowledge has been acquired, we have been able to address many more issues in distinct sectors: the economy, health, education, nutrition, or weaponry. However, the past successes of science can encourage views of science as an unrestrained power, and the ideological position known as ‘scientism’: a belief in the boundless capacity of science to address and fix problems in every human or social dimension, displacing more traditional approaches. In other words, using the Merriam-Webster definition, scientism is “an exaggerated trust in the efficacy of the methods of natural science applied to all areas of investigation,” and, we could dare to say, “to all other areas of human and social life”: more a form of hubris and an ideology than a balanced approach to reality.
This point recalls the concerns that Max Weber expressed in his mature years about the capacities and limits of science. His essay Science as a Vocation (Weber 1946) is very explicit in discouraging any attempt to replace religion with science, or to take science as a new religion, something that could seem predictable after the positivist philosophy of Auguste Comte and the expectation that scientific development would entail an overcoming (Aufhebung) of humanity's former religious stage, with its many limitations and delusions. Even though such great expectations have been dismissed time after time, a light version of science's capacity to address and fix everything, or at least the most demanding and central issues, has survived and finds expression in the boldest dreams of various utopian currents. These attitudes and beliefs can be found in presentations about artificial intelligence, human enhancement, and life extension, among many others. Admittedly, such dreams coexist with more pessimistic views and dystopian expressions in the media, often obsessed with catastrophic futures.
In reality, science is aware of its many limits, as it proves unable to address many issues arising in our time. We have a clear example in the current war in Ukraine: science can provide better weaponry and better instruments of communication and analysis, but it can do almost nothing to end the war or to reach a stable peace; science and its applications could even make things worse and increase levels of destruction exponentially. The same happens in the field of human relationships: science can provide a better analysis of the many factors involved in interactions, but it can do very little to address the loneliness of a specific individual. In a more general sense, science can address only scientific issues, or issues that can be framed in scientific terms, helping us better understand what is going on, but not issues that belong rather to the political, emotional, and family realms.
The distinction between science and its applications needs more development: indeed, some voices suggest that in many fields, like the therapeutic sciences, it makes little sense to distinguish them. However, the usefulness of humility benefits from making this distinction. It is the search for knowledge that should be open to humility, gathering more evidence that could either support the prevalent theory or refute it. Practice, by contrast, should be limited to applying the best possible treatment given our current knowledge. This helps avoid unintended negative consequences of humility. For instance, climate change deniers could justify inaction because “more study is needed to understand the climate.” However, this position would be unethical, as it implies not using the best available knowledge. We must make the decisions that are needed.
In any case, it should be recalled that science does not encompass the entirety of human affairs; many dimensions of our existence transcend its limits (belonging, for instance, to the realm of religion). Positions that argue that the scientific domain is all-encompassing are not themselves scientific, as this is a belief rather than a scientifically refutable fact.

5. Humility and Epistemic Limits

The abundant literature on the limits of science often reminds us of the limited condition of our human cognitive capacities, which are quite fitting for many functions related to survival and reproduction, and less so for understanding the ultimate causes of reality (Barrow 1999; Bolger 2012; Chu 2013; Medawar 1984; Midgley 1992; Yanofsky 2016). This point can be debated: indeed, nobody is sure about the real capacities of the human mind. What is more widely accepted is that several areas of our reality escape scientific grasp, becoming mysteries for our current cognitive capacity.
In one way or another, the reality we inhabit appears much more complex and intriguing now than it did 100 years ago, when faith in science promised an almost complete cognitive dominion over all of reality, including the human and the social. The more we know about several aspects of reality, the more is revealed about dimensions that lie beyond our understanding and that prevent a naïve view of the all-powerful scientific gaze. This point has been made, among others, by the linguist Noam Chomsky (Chomsky 2015), who stated that we need to accept as a ‘truism’ the existence of many intractable mysteries for science, and that science must renounce a program of total intelligibility and focus on the details.
Among the books that have stressed this topic, Marcelo Gleiser's essay The Island of Knowledge: The Limits of Science and the Search for Meaning (Gleiser 2014) is very explicit in stating that science will always be limited and that our knowledge will never reach the deepest layers of reality. Besides the limits due to our cognitive capacities, other limits are linked to the fundamental nature of things, as science itself reveals some intriguing facts: the incompleteness theorem, the uncertainty principle, or the mysteries of the quantum realm. Recent theories, like string theory, the multiverse, and supersymmetry, add complexity and open more questions than they answer. Gleiser's point is that we need science, but we must be aware of its limits for it to be a good guide in our quest for meaning.
Flaws in ambitious scientific programs have been identified in past decades, for instance, in evolutionary psychology and rational choice theories applied to describing human and social behavior. This is the point raised by John Dupré in Human Nature and the Limits of Science (Dupré 2001), who warns against forms of scientism, an exaggerated faith in science, which is, after all, not very scientific and fails to account for the great complexity of human nature and social interactions.
It is important to realize that these arguments are not directed against science but against its bolder advocates, who fail to recognize the difference between what can and cannot be known through scientific means, and how science can be applied within a broader program that allows for a more fruitful use and function in our own cultural context.

6. Humility and Pluralism

These limits arise from difficulties in reaching a consensus. They are linked to the existence of several valid interpretations of the data. Few authors have dared to explore pluralism in depth, but some works have shown that science is embedded in many social and cultural structures and interests, which clearly weigh on the design of research programs, the chosen methods, and the interpretation of the obtained data. For example, Angela Potochnik, in her book Idealization and the Aims of Science (Potochnik 2017), reveals how values and interests determine the programs and models developed in different areas, like climate change. The self-styled Minnesota Group, in their book Scientific Pluralism (Kellert et al. 2006), target what they designate as scientific monism. In their own words, “The ultimate aim of a science is to establish a single, complete, and comprehensive account of the natural world (or the part of the world investigated by the science) based on a single set of fundamental principles” (Kellert et al. 2006, p. x). Against this narrow view, the editors of the quoted book claim the “inalienability of multiplicity in some scientific contexts” (ibid., p. xiii). Their argument appeals to the great and irreducible complexity of our reality at all levels: physical, living, mental, and social, and hence to the impossibility of building unique and closed models able to account for that complexity.
The other source of pluralism that plays down grand scientific ambitions is the plurality of methods and interpretations at all levels: quantum, biological, and even at the level of statistical data analysis. A recent case raised eyebrows in scientific quarters when a crowd-analysis experiment on a shared dataset yielded divergent conclusions once the recruited analysts applied their own tools and methods (Schweinsberg et al. 2021). This is not an isolated case, and it reveals how unavoidable pluralism is even in fields that rely on mathematical analysis, as the sketch below illustrates.
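To see how reasonable methodological choices can diverge, consider a minimal sketch in Python. Everything here (the synthetic data and the two pipelines) is our own illustration under arbitrary assumptions, not the Schweinsberg et al. material; it shows two defensible analyses of the same dataset producing different effect estimates.

```python
# Illustrative sketch: two analysts, one shared (synthetic) dataset.
# Both pipelines are defensible; they still disagree on the estimated effect.
import random
import statistics

random.seed(42)

# Shared dataset: outcome scores for a control and a treatment group.
control = [random.gauss(50, 10) for _ in range(30)]
treatment = [random.gauss(52, 10) for _ in range(29)] + [120.0]  # one extreme value

def effect_keep_all(a, b):
    """Analyst A: use every observation exactly as recorded."""
    return statistics.mean(b) - statistics.mean(a)

def effect_trim_outliers(a, b, z=2.0):
    """Analyst B: drop values more than z standard deviations from the group mean."""
    def trim(xs):
        m, s = statistics.mean(xs), statistics.stdev(xs)
        return [x for x in xs if abs(x - m) <= z * s]
    return statistics.mean(trim(b)) - statistics.mean(trim(a))

print(f"Analyst A (keep all):      {effect_keep_all(control, treatment):+.2f}")
print(f"Analyst B (trim outliers): {effect_trim_outliers(control, treatment):+.2f}")
# Neither choice is wrong, yet the size (and, with noisier data, even the
# sign) of the conclusion depends on which pipeline was chosen.
```

Multiplied across dozens of such micro-decisions, this is exactly the dispersion the crowd-analysis experiment documented.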

7. Humility and the Individual

The origin of the lack of objectivity in science is the lack of objectivity in individuals. We continuously deviate from rationality, both in our individual decisions and as groups. Probably the best-known model of this is the one proposed by Kahneman, Nobel laureate in 2002, which articulates human thought in two different modes (Kahneman 2011): a first system that is fast, intuitive, emotional, and unconscious, and a second system that is slow, conscious, and deliberative. The first system permeates our thinking in all situations and makes it susceptible to cognitive biases, studied among others by behavioral economists such as Kahneman, Thaler, or Tversky (Tversky and Kahneman 1974). Some of these biases are particularly harmful to scientific practice, such as confirmation bias (Nickerson 1998), which might be behind some “180-degree turns”: every time a new recommendation is proposed, confirming partial evidence tends to be overweighted, so the recommendation is assumed to be true before it has been properly evaluated. Later, when more evidence is available, it might have to be modified. In addition, selection bias (Heckman 1990) appears when the individuals or samples selected to participate in an experiment are not representative of the whole population, thus invalidating any conclusions. Survivorship bias is a particular case of the latter, where individuals who have survived a process are preferentially selected. It is one of the most difficult biases to identify and correct and appears in a broad spectrum of experimental designs. For example, when a new medication enters testing, how do we ensure that the subjects participating in the study are representative of the diseased population and not primarily the most difficult cases, which have already received standard treatments without success? The simulation below illustrates the problem.
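As an illustration of this enrollment problem, the following toy simulation (our own construction; all numbers are arbitrary assumptions) shows how a trial that preferentially recruits severe cases understates a drug's effectiveness.

```python
# Toy simulation of selection/survivorship bias in trial enrollment.
# Severe patients (who failed standard care) are more likely to enroll,
# so the measured cure rate underestimates the true population rate.
import random

random.seed(0)

population_outcomes, enrolled_outcomes = [], []
for _ in range(100_000):
    severity = random.random()                        # 0 = mild, 1 = severe
    cured = random.random() < (0.9 - 0.5 * severity)  # drug works worse on severe cases
    population_outcomes.append(cured)
    if random.random() < severity:                    # enrollment correlates with severity
        enrolled_outcomes.append(cured)

print(f"Cure rate, whole diseased population: {sum(population_outcomes)/len(population_outcomes):.1%}")
print(f"Cure rate, trial enrollees only:      {sum(enrolled_outcomes)/len(enrolled_outcomes):.1%}")
# Expected: roughly 65% vs. roughly 57%; the gap is pure selection bias.
```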
Other important phenomena are the bandwagon effect (following the opinion of the majority), authority bias, overconfidence bias, professional deformation, and affinity bias, which makes us more prone to believe the opinions of people who are similar to ourselves. These relate to data bias (Criado 2019), which has only recently been recognized and appears intertwined with other biases. For instance, in the book Invisible Women, Caroline Criado posits that we are conditioned to view the male gender as the default human experience and to ignore or erase female perspectives. She offers a detailed critique, backed by numerous studies, of how male norms permeate society to the detriment of women: data sources and analyses are not gender-balanced. The gap in data about women and the “male default” in data analysis produce devastating consequences in women's lives, such as U.K. doctors misdiagnosing 50% of women having heart attacks. The author's wake-up call to data scientists, social scientists, and policymakers reveals the real-world effects of not collecting and analyzing data about women. This bias is linked to standpoint and to moral limitations, as will be presented below.
All these biases are unconscious; recognizing them can dampen their effects in scientific practice, so educating scientists about them should become a priority. However, accepting the reality of our own cognitive limitations is not easy. It requires intellectual humility, which, according to research, depends greatly on personality traits over which we have no control. Greater humility has been indirectly associated with greater interpersonal warmth, less perceived dominance, and less perceived threat of being controlled (Priebe and Van Tongeren 2021).

8. Humility and Methodological Limitations

This problem is linked to what has become known as “the replication crisis,” which reveals deeper issues, sometimes related to moral standards and assumptions, that are often neglected for practical reasons in scientific research. It can be described as “an ongoing methodological crisis in which the results of many scientific studies are difficult or impossible to reproduce” (Wikipedia entry). This crisis reveals the extent to which current science is not just fallible but flawed. Some of the issues behind the lack of replicability are even deliberate manipulations that take us away from the truth in order to serve the interests of the researcher or their funding agents. The replication crisis is still being studied, but it seems clear that it will result in stronger standards that will make it more difficult for these issues to appear in the future (Shrout and Rodgers 2018; Boulesteix et al. 2020; Bird 2021).
One possible answer is that, in contrast to other social systems, science has been able to spot big mistakes and fix them. The idea is that science is a self-correcting system, able to revise and adjust whatever is revealed as wrong or simply poor in data collection or analysis. But this is just one opinion; another is that this crisis reveals a deep weakness in the current form that scientific research takes: the pressure to publish as soon as possible, to obtain positive outcomes that confirm the proposed hypotheses, and the fierce competition in academia that presses for swift results, applications, and career success. Probably a humbler understanding of scientific activity would help to avoid such problems and provide a better guide for research. Once more, the suspicion is that our empirical and experimental methods are subject to limits that invite us to adopt a humbler attitude when dealing with complex subjects.

9. Humility and Moral Limitations

These limits refer to the use and application of scientific research in areas where new developments could become harmful, or at least very risky, or could support injustice in any form.
It is broadly discussed to what extent ethical concerns should influence the activity of social systems like the economy, politics, and even science. Opinions span a spectrum between minimalists, who claim that each social system needs to work independently and that ethics would be just a nuisance, and maximalists, who claim that every social system needs ad hoc regulations to prevent the worst developments, which could entail great harm and suffering. Science is clearly not excluded from such issues. The climate emergency and sustainability have become demanding tests for many social areas, like economics and politics, but they probably reach every human system, none of which can desert or elude its responsibility before the great risks humanity confronts. The recent war in Ukraine has added new urgency to the need to consider ethical dimensions.
Serious suspicions and concerns arise when reviewing terrible past practices and applications of science used to justify inequity, marginalize populations, support harmful campaigns, or serve very ambiguous interests. A clear example is the book The Bell Curve: Intelligence and Class Structure in American Life (Herrnstein and Murray 2010), which aims to demonstrate through dubious statistical measures that the poorest in America (namely, African Americans) are also the least intelligent. Fortunately, a long list of scientists protested obvious mistakes in the book, although this has not prevented its support among white supremacist groups. Thus, bad science leads us away not only from the truth but also from justice.
Good science must serve good objectives, even if dual use can still result in damaging consequences. The causes of sustainability, human rights, and equality or social justice must be clear and at the forefront of the scientific project. This should not be understood as a weakening of science's independence but as a guide that would also increase public trust in its endeavors.
Cognitive biases, particularly data bias, also play a role in scientific results that become intertwined with morality. A particularly important example is the treatment of ethnic diversity in clinical practice. Recently, the International Federation of Medical Students published a report (International Medical Students Federation 2018) exposing the main problems related to the differential treatment of different ethnic groups. For example, when an African American develops a dark wart, specifically under a toenail, it frequently indicates skin cancer. If the diagnosis is delayed, the cancer can spread and result in amputation or death. Clinical guidelines typically focus on white patients, so these errors, despite being preventable, are all too common.
Prejudices and stereotypes also influence clinical practice. For example, in the United States, lower doses of painkillers may be prescribed to African Americans because of the stereotype that they tolerate pain better (Weisse et al. 2001). Another telling example is episiotomy, a surgical incision of the female perineum (including skin, muscle, and vagina) made to widen the birth canal and accelerate birth. It is performed with scissors or a scalpel and requires subsequent suturing. Episiotomy has been performed routinely to prevent tears. However, the World Health Organization (WHO) has established that this practice not only fails to prevent tears but has negative consequences for women's health: those affected by the scar may suffer incontinence and sexual dysfunction (Chang et al. 2011). Despite the clarity of this evidence and the WHO recommendations, episiotomy remains routine practice in many countries (Clesse et al. 2018). In some cases, it is even combined with the “husband's stitch,” which consists of closing the vagina more than necessary, in the belief that this results in better sexual satisfaction for the husband; this practice also has negative implications for women's health (Dobbeleir et al. 2011). How is it possible that this practice is still prevalent? Its origin was probably rational medicine, the idea of choosing the practice that seems most rational rather than basing decisions on empirical evidence. Authority bias seems to be present in this setting: doctors who are in positions of power and decide protocols are more likely to hold outdated views. Moreover, overconfidence bias makes them overvalue their experience relative to WHO recommendations.
There are examples in engineering as well. One of the main cases is algorithmic bias, where artificial intelligence algorithms can lead to unfair situations that arbitrarily favor some groups over others. It is known, for example, that algorithms that try to predict which convicts will reoffend estimate a systematically higher probability for nonwhite individuals. Examples of algorithmic bias can be found in (Hajian et al. 2016). Fortunately, legal developments have established, for example, the right to refuse to be categorized by an algorithm rather than by a person. Beyond racist algorithms, there was photographic film optimized to show the nuances of white skin (rendering a flat, dark face for people of color) and touch sensors that failed to detect dark skin. A minimal audit of the kind used to expose such disparities is sketched below.
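As a concrete illustration, the short audit below (synthetic data of our own making, not any real recidivism tool) computes false positive rates per group, the kind of disparity such analyses report.

```python
# Minimal fairness audit on synthetic risk-score data: equal-looking
# accuracy can hide very different false positive rates across groups.
import random

random.seed(1)

cases = []  # (group, reoffended, flagged)
for _ in range(10_000):
    group = random.choice("AB")
    reoffended = random.random() < 0.3
    # The model is assumed (for illustration) to flag group B more aggressively.
    flag_prob = 0.8 if reoffended else (0.30 if group == "A" else 0.45)
    cases.append((group, reoffended, random.random() < flag_prob))

def false_positive_rate(group):
    """Share of non-reoffenders in a group wrongly flagged as high risk."""
    innocent = [flagged for g, reoffended, flagged in cases
                if g == group and not reoffended]
    return sum(innocent) / len(innocent)

for g in "AB":
    print(f"Group {g}: false positive rate = {false_positive_rate(g):.1%}")
# Group B's non-reoffenders are flagged roughly 1.5x as often as group A's.
```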
Another instructive example of bad engineering comes from the University of Virginia, where a study analyzed data from more than 45,000 victims of road accidents over 11 years and concluded that women are 47% more likely to suffer severe injury (Bose et al. 2011). The reason seems to be the design of the headrest: it is optimized for the height of the average man, and although it can be adjusted to some degree, the range does not cover most women, for whom the headrest remains too high. Designing safety systems for the average man rather than the average person is unacceptable. It is not even a matter of cost: a sufficiently flexible headrest would not necessarily be more expensive than existing models, and, in any case, its cost is negligible compared to that of the vehicle. Injustice occurs here by ignoring some users (in this case, about half of them).

10. Humility from Understanding Our Place

Standpoint theory (Haraway 1988) explains that any claim, including scientific ones, is made from a certain point of view and carries the social and personal circumstances of the subject. Thus, there is no evil intent in any of the above examples: car manufacturers do not want women to travel less safely, and programmers do not intend to hurt African Americans. However, they think and build science and technology from their own perspective.
In this dynamic, power becomes part of these mechanisms through authority and majorities. As Haraway explained, power grants the privilege of neutrality, the privilege of being considered neutral. In the car example, the neutral driver has been imagined as male, so the headrest produces the illusion of being sufficiently flexible when, in reality, it is not. Thus, this notion of neutrality undermines our ability to produce good science and good engineering and deviates us from objectivity.
Crucially, the opinions of people perceived as neutral are valued as more objective, while other voices are undervalued, their views being attributed to their difference. This testimonial injustice, in Fricker's words (Fricker 2007), combines with authority bias (which already makes some opinions outweigh others) and robs us of the opportunity to introduce new criteria and complementary perspectives that would bring us closer to the truth.
Hermeneutical injustice complements this perspective in a subtler way: the structure of science also defines the concepts that are available for scientific work. Often, these definitions are too focused on the experiences of the groups in power, which are those considered neutral in Haraway's terms.
If every statement is made from a certain standpoint, the only path to objectivity is to include as many points of view as possible.

11. Further Developments: Humility and Responsibility

In this article, we have presented the limitations of the objectivity of science, their main causes, and some epistemological theories that help us understand them. The key idea is that science and technology, as human activities, inherit our strengths and weaknesses. Virtue epistemologists pointed to curiosity as the main epistemic virtue; we complete their perspective by insisting on the value of humility in science. Giving humility a key role in scientific practice and communication would improve both its objective social function (that is, the production of knowledge about our world and its application to the improvement of the human condition) and its public acceptance. As presented above, science has limitations on what it can ultimately grasp and on what its methods can conquer, in addition to those posed by the individuals who participate in the scientific project. Along these lines, Kuhn's view is quite relevant (Kuhn 1962). He approaches the history and philosophy of science through conceptual questions: which types of ideas are conceivable at a given moment, which strategies and intellectual options people had available during a certain period, and the importance of not attributing modern thinking models to historical authors. On this basis, he argues that the evolution of a scientific theory does not arise from simply gathering facts but from a set of intellectual circumstances and possibilities that are subject to change.
Interestingly, Gorichanaz (2022) explores the relationship between intellectual humility (IH) and people's information seeking and use. His results show that modesty and engagement may matter most for information seeking. In an era of populism and fake news, it is important to include intellectual humility among the dimensions in which the public should be educated.
As explained, the replication crisis in the social, behavioral, and life sciences has spurred a reform movement aimed at increasing the credibility of scientific studies. Many of these credibility-enhancing reforms focus, appropriately, on specific research and publication practices. A less often mentioned aspect of credibility is the need for intellectual humility, that is, being transparent about and owning the limitations of our work. Although intellectual humility is presented as a widely accepted scientific norm, Hoekstra and Vazire (2021) argue that current research practice does not incentivize it.
We must be aware of our cognitive weaknesses. Although we try to use our rational capacities, we often make mistakes. We are built to want to be right: we give more weight to evidence that supports our views than to evidence that contradicts them (confirmation bias); we think our knowledge is better and more important than it really is (overconfidence bias, professional deformation); and most of us tend to follow the opinion of the majority (bandwagon effect) rather than seek the truth for ourselves. In addition, we value opinions differently depending on who proposes them: we attach more truth to views that come from authority figures or from people who are similar to ourselves, and we unfairly underestimate those of individuals far from power or from the majority. This leads to phenomena ranging from 180-degree turns in medical recommendations to the widespread practice of outdated techniques disproved by evidence, as well as technological designs that inflict unnecessary damage on certain groups.
We noted that these unconscious biases are systematic (they consistently occur in a wide variety of situations) and involuntary. However, we are able to reduce their impact to some extent. If we are aware of the situations in which we tend to lose objectivity, we can try to re-evaluate our judgments. There are several very interesting projects in this regard. For example, Harvard's “Implicit” project (Greenwald and Krieger 2006) provides free online tests that measure our individual tendencies to value some groups over others or to stereotype certain people. In general, we should remember that we tend to lose objectivity when using the first system of thought, the quick one. As we have stressed, this system is essential from the point of view of survival but can harm us when we are searching for truth. We can learn to detect when we are operating in this system and move consciously and voluntarily to the second. However, this requires a certain level of self-knowledge and, above all, the humility to admit that we are not being as rational as we would like to be.
In addition, the establishment of objective procedures is a way to introduce filters into our individual and collective actions. One particularly clear example is the statistical analysis plan. When analyses are defined prior to a study, we are much less prone to errors such as selection bias or confirmation bias. It would be interesting to extend this practice to other fields. It should be understood as a shield against bad practices that in no way limits the creativity of the scientific enterprise, since it does not reduce the scope of the questions or the proposed hypotheses; it only makes explicit what we will consider a valid answer and what will require further work. A sketch of the mechanism follows.
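A minimal sketch of how such a plan can be enforced in code (our illustration; the plan fields and outcome names are invented for the example): the plan is fingerprinted before the data are seen, and the analysis step rejects any deviation from it.

```python
# Sketch of a pre-registered statistical analysis plan.
# Step 1 happens before data collection; step 2 refuses unplanned analyses.
import hashlib
import json

# Step 1: frozen and timestamped before any data exist.
PLAN = {
    "primary_outcome": "systolic_bp_change",   # invented example outcome
    "test": "two_sample_t",
    "alpha": 0.05,
    "outlier_rule": "exclude |z| > 3",
}
FINGERPRINT = hashlib.sha256(json.dumps(PLAN, sort_keys=True).encode()).hexdigest()

def run_analysis(outcome: str, plan: dict, fingerprint: str) -> None:
    """Step 2: after data collection, allow only what the plan declared."""
    if hashlib.sha256(json.dumps(plan, sort_keys=True).encode()).hexdigest() != fingerprint:
        raise ValueError("Analysis plan was modified after registration.")
    if outcome != plan["primary_outcome"]:
        raise ValueError(f"'{outcome}' was not the pre-registered primary outcome.")
    print(f"Running {plan['test']} on {outcome} at alpha = {plan['alpha']}")

run_analysis("systolic_bp_change", PLAN, FINGERPRINT)    # allowed
# run_analysis("diastolic_bp_change", PLAN, FINGERPRINT) # would raise: not planned
```

Real pre-registration platforms enforce this socially rather than cryptographically, but the logic is the same: the valid answer is declared before the data can tempt us.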
A healthy critical attitude should be better rewarded in the scientific context: replications, studies exploring alternative hypotheses, and studies with negative or inconclusive results are too difficult to publish. This has led some thinkers to propose stricter regulations for academic journals, such as requiring them to publish a certain percentage of articles in these categories. This, coupled with quality science communication (a difficult thing to achieve amid the current crisis of journalism), could convey to the public the idea that science is alive and open and that consensus is a joint achievement, not the imposition of the idea that best serves certain interests.
There are also tools to stimulate a critical attitude, which could be integrated into daily practice in the same way that meetings start with an agenda or minutes of discussions are kept. For example, we can reserve spaces to contemplate alternatives or solutions that escape established dichotomies. The corporate world is already working along these lines; scientific practice could incorporate these techniques. Diversity is a much-discussed resource, but how to guarantee it is not obvious. Sex, race, culture, or type of previous professional experience are sources of diversity capital that can help a team be more objective in its conclusions (Fischer 2005).
Technoscience also needs to acknowledge its social impact responsibly. In this regard, engineering has developed over the last couple of decades a tremendously useful concept: universal design. Universal design proposes that, for a design to be valid, it must include among its potential users the largest possible number of people, ideally the whole population. With this in mind, we need to include not only men and women of different races but also people with disabilities. In the case of the car seat, the paradigm of universal design would lead us to consider the widest possible range of heights and weights, so that anyone could fit comfortably. Universal design is growing fast in web-application design and architecture. If access considerations are introduced in the design phase of a building, the extra costs are minimal. When designing web pages, universal design ensures that even people with vision impairments can use the contents: descriptive labels are created for images, color palettes are chosen to preserve contrast for people with color blindness, and multiple interfaces are offered so that every user can choose the most appropriate channel. This is not an extra cost but rather the opposite: designing within this paradigm leads to a larger customer base and provides a better experience even for those who think they do not have those needs. In the context of web design, it has already been shown that universal design leads to significantly cheaper maintenance, and a series of standards has crystallized (Story 2001). Universal design is a vision of engineering that recognizes that we are all different and all subject to illness and aging, so it makes no sense to design for a single type of person. The contrast check sketched below shows how concrete such requirements can be.
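To show how concrete these requirements are, here is a small utility implementing the WCAG 2.x contrast-ratio formula that underlies the color-contrast advice above; the example colors are our own arbitrary choices.

```python
# WCAG 2.x contrast ratio: (L1 + 0.05) / (L2 + 0.05), where L1 >= L2 are
# the relative luminances of the two colors. Level AA requires >= 4.5:1
# for body text.

def relative_luminance(rgb):
    """Relative luminance of an sRGB color given as 0-255 channel values."""
    def linearize(c8):
        c = c8 / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    lighter, darker = sorted((relative_luminance(color_a),
                              relative_luminance(color_b)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

print(f"Black on white:      {contrast_ratio((0, 0, 0), (255, 255, 255)):.1f}:1")       # 21.0:1, passes
print(f"Light gray on white: {contrast_ratio((170, 170, 170), (255, 255, 255)):.1f}:1")  # ~2.3:1, fails AA
```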

12. Discussion

We need to accept with equanimity that neither technoscience nor its methods are necessarily objective. Science and technology reflect who we are as individuals and as a society and inherit both our virtues and our weaknesses. Humility is the key to a technoscience that brings us closer to the truth, improves technology, and bridges the widening gap between science and society.
We have tried to make our case by resorting to different strategies and deploying our argument at different levels: the epistemological, focusing on the limits of science and the virtue of scientific humility; the structural and social-systemic, which demands restraint to avoid harmful interference and trespassing beyond one's own boundaries; and the ethical, with its demands for prudence, restraint, and inclusivity. These strategies converge in showing clear limits and gaps and in discouraging an understanding and praxis of science that renders it less useful, or even harmful, and further from the expected epistemic quality. Deploying several strategies, although it results in a very broad text, serves to support our point from multiple perspectives: a conscious limiting and restraint, a humble take on science, renders it a more credible and socially valuable endeavor.
Some questions arise regarding such a program. One has already been addressed: the doubts that could arise in doctors and other medical practitioners aware of U-turns in research and therapeutic applications. Our suggestion was to encourage an attitude that combines prudence with steady testing and replication. A second concerns a perverse use of this awareness of incompleteness and uncertainty when science must inform political and other decision-makers. A clear case is research on climate change, where heated discussions have sometimes revealed its incomplete, in-progress condition. Such a provisional status could be perceived as discouraging urgent action to prevent and deter these threats, since our scientific certainty is limited. This issue invites serious discernment: scientists have the duty to provide the most updated and reliable data about current developments and future scenarios based on accurate simulations; politicians and other decision-makers in economics and other fields need to consider these data, even if provisional, in order to prevent catastrophic developments. Obviously, decision-makers cannot wait until scientific views become certain; this is impossible, since those views are always subject to falsification in a Popperian fashion. However, science can provide state-of-the-art data and predictions that, even if temporary and conditional, offer the best available information to guide decisions. Humility must therefore be present when assessing current knowledge and the need for further proof. In this manner, humility counteracts the dogmatism and biases that could lead to flawed decisions. However, when confronted with opposing, dogmatic views (e.g., climate change denialism), humility must not be confused with passivity. This virtue should put available knowledge into context and acknowledge that it is not absolute, but it should also stress when facts have been proven beyond reasonable doubt and when uncertainty becomes unimportant compared to the risks of inaction.
After completing the first draft of this paper, we were invited to connect it to the topic of the “epistemology of testimony” as an approach to belief acquisition that is relevant to our own research (Shieber 2015). Testimony obviously plays a central role not just in building our beliefs but in scientific praxis and the formation of theories. The point is that achieving a high level of certainty in our way of representing reality is the result of cooperation and mutual testing more than of individual discovery, and that such an endeavor depends on the work of others. This framework, which includes humility at its core, could make it possible to integrate science with other sources of wisdom and knowledge, whose testimony and contributions could moderate and enhance scientific progress.

Author Contributions

Conceptualization, S.L. and L.O.; methodology, S.L. and L.O.; investigation, S.L., L.O., L.G.; writing—original draft preparation, S.L.; writing—review and editing, S.L., L.O., L.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Barrow, John D. 1999. Impossibility: The Limits of Science and the Science of Limits. Oxford: Oxford University Press. [Google Scholar]
  2. Bird, Alexander. 2021. Understanding the Replication Crisis as a Base Rate Fallacy. The British Journal for the Philosophy of Science 72: 965–93. [Google Scholar] [CrossRef]
  3. Bolger, Robert. 2012. Kneeling at the Altar of Science: The Mistaken Path of Contemporary Religious Scientism. Eugene: Pickwick. [Google Scholar]
  4. Bose, Dipan, Maria Segui-Gomez, and Jeff R. Crandall. 2011. Vulnerability of female drivers involved in motor vehicle crashes: An analysis of US population at risk. American Journal of Public Health 101: 2368–73. [Google Scholar] [CrossRef] [PubMed]
  5. Boulesteix, Anne-Laure, Sabine Hoffmann, Alethea Charlton, and Heidi Seibold. 2020. A replication crisis in methodological research? Significance 17: 18–21. [Google Scholar] [CrossRef]
  6. Chang, Shiow-Ru, Kuang-Ho Chen, Ho-Hsiung Lin, Yu-Mei Y. Chao, and Yeur-Hur Lai. 2011. Comparison of the effects of episiotomy and no episiotomy on pain, urinary incontinence, and sexual function three months postpartum: A prospective follow-up study. International Journal of Nursing Studies 48: 409–18. [Google Scholar] [CrossRef] [PubMed]
  7. Chomsky, Noam. 2015. What Kind of Creatures Are We? New York: Columbia University Press. [Google Scholar]
  8. Chu, Dominique. 2013. The Science Myth: God, Society, the Self and What We Will Never Know. Winchester and Washington: Iff Books. [Google Scholar]
  9. Clesse, Christophe, Joëlle Lighezzolo-Alnot, Sylvie De Lavergne, Sandrine Hamlin, and Michèle Scheffler. 2018. Statistical trends of episiotomy around the world: Comparative systematic review of changing practices. Health Care for Women International 39: 644–62. [Google Scholar] [CrossRef] [PubMed]
  10. Collins, Harry M., and Trevor Pinch. 2012. The Golem: What You Should Know about Science. Cambridge: Cambridge University Press. [Google Scholar]
  11. Conee, Earl, and Richard Feldman. 2004. Evidentialism: Essays in Epistemology. Oxford: Clarendon Press. [Google Scholar]
  12. Criado, Caroline. 2019. Invisible Women. New York: Abrams Press. [Google Scholar]
  13. Dobbeleir, Julie M. L. C. L., Koenraad Van Landuyt, and Stan J. Monstrey. 2011. Aesthetic surgery of the female genitalia. Seminars in Plastic Surgery 25: 130–41. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  14. Dupré, John. 2001. Human Nature and the Limits of Science. Oxford: Oxford University Press. [Google Scholar]
  15. Fischer, Gerhard. 2005. Distances and diversity: Sources for social creativity. Paper presented at the 5th Conference on Creativity & Cognition, London, UK, April 12–15; pp. 128–36. [Google Scholar]
  16. Foucault, Michel, and Paul Rabinow. 1997. Ethics, Subjectivity and Truth: Essential Works of Foucault 1954–1984. Translated by C. Porter. New York: The New Press, vol. 1. [Google Scholar]
  17. Fricker, Miranda. 2007. Epistemic Injustice: Power and the Ethics of Knowing. Oxford: Oxford University Press. [Google Scholar]
  18. Gleiser, Marcelo. 2014. The Island of Knowledge: The Limits of Science and the Search for Meaning. New York: Basic Books. [Google Scholar]
  19. Gorichanaz, Tim. 2022. Relating information seeking and use to intellectual humility. Journal of the Association for Information Science and Technology 73: 643–54. [Google Scholar] [CrossRef]
  20. Greenwald, Anthony G., and Linda Hamilton Krieger. 2006. Implicit bias: Scientific Foundations. California Law Review 94: 945–67. [Google Scholar] [CrossRef]
  21. Hajian, Sara, Francesco Bonchi, and Carlos Castillo. 2016. Algorithmic bias: From discrimination discovery to fairness-aware data mining. Paper presented at the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, August 13–17; pp. 2125–26. [Google Scholar]
  22. Haraway, Donna. 1988. Situated knowledges: The science question in feminism and the privilege of partial perspective. Feminist Studies 14: 575–99. [Google Scholar] [CrossRef]
  23. Heckman, James. 1990. Varieties of selection bias. The American Economic Review 80: 313. [Google Scholar]
  24. Herrnstein, Richard J., and Charles Murray. 2010. The Bell Curve: Intelligence and Class Structure in American Life. New York: Simon and Schuster. [Google Scholar]
  25. Hoekstra, Rink, and Simine Vazire. 2021. Aspiring to greater intellectual humility in science. Nature Human Behaviour 5: 1602–7. [Google Scholar] [CrossRef] [PubMed]
  26. International Medical Students Federation. 2018. Ethnicity and Health, Denmark. [Google Scholar]
  27. Kahneman, Daniel. 2011. Thinking, Fast and Slow. London: Penguin Books Ltd. [Google Scholar]
  28. Kellert, Stephen H., Helen E. Longino, and C. Kenneth Waters, eds. 2006. Scientific Pluralism. Minneapolis: University of Minnesota Press. [Google Scholar]
  29. Latour, Bruno. 1988. Science in Action: How to Follow Scientists and Engineers through Society. Cambridge: Harvard University Press. [Google Scholar]
  30. Medawar, Peter Brian. 1984. The Limits of Science. Oxford: Oxford University Press. [Google Scholar]
  31. Midgley, Mary. 1992. Science as Salvation: A Modern Myth and its Meaning. London and New York: Routledge. [Google Scholar]
  32. Nickerson, Raymond S. 1998. Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology 2: 175–220. [Google Scholar] [CrossRef]
  33. Potochnik, Angela. 2017. Idealization and the Aims of Science. Chicago: University of Chicago Press. [Google Scholar]
  34. Prasad, Vinay, Andrae Vandross, Caitlin Toomey, Michael Cheung, Jason Rho, Steven Quinn, Satish Jacob Chacko, Durga Borkar, Victor Gall, Senthil Selvaraj, and et al. 2013. A decade of reversal: An analysis of 146 medical practices contradicted. In Mayo Clinic Proceedings. Amsterdam: Elsevier, vol. 88, pp. 790–98. [Google Scholar]
  35. Priebe, Carolyn, and Daryl R. Van Tongeren. 2021. Women pay a steeper price for arrogance: Examining presentation style, gender, and humility. The Journal of Positive Psychology 2021: 1–9. [Google Scholar] [CrossRef]
  36. Schweinsberg, Martin, Michael Feldman, Nicola Staub, Olmo R. van den Akker, Robbie C.M. van Aert, Marcel A.L.M. van Assen, Yang Liu, Tim Althoff, Jeffrey Heer, Alex Kale, and et al. 2021. Same Data, Different Conclusions: Radical Dispersion in Empirical Results When Independent Analysts Operationalize and Test the Same Hypothesis. Organizational Behavior and Human Decision Processes 165: 228–49. [Google Scholar] [CrossRef]
  37. Shieber, Joseph. 2015. Testimony: A Philosophical Introduction. London: Routledge. [Google Scholar]
  38. Shrout, Patrick E., and Joseph L. Rodgers. 2018. Psychology, Science, and Knowledge Construction: Broadening Perspectives from the Replication Crisis. Annual Review of Psychology 69: 487–510. [Google Scholar] [CrossRef] [PubMed]
  39. Story, Molly Follette. 2001. Principles of universal design. In Universal Design Handbook. New York: McGraw-Hill. [Google Scholar]
  40. Tversky, Amos, and Daniel Kahneman. 1974. Judgment under Uncertainty: Heuristics and Biases: Biases in judgments reveal some heuristics of thinking under uncertainty. Science 185: 1124–31. [Google Scholar] [CrossRef] [PubMed]
  41. Weisse, Carol S., Paul C. Sorum, Kafi N. Sanders, and Beth L. Syat. 2001. Do gender and race affect decisions about pain management? Journal of General Internal Medicine 16: 211–17. [Google Scholar] [CrossRef] [PubMed]
  42. Weber, Max. 1946. Science as a Vocation. In Science and the Quest for Reality. London: Palgrave Macmillan, pp. 382–94. [Google Scholar]
  43. Wood, W. Jay. 2009. Epistemology: Becoming Intellectually Virtuous. Westmont: InterVarsity Press. [Google Scholar]
  44. Yanofsky, Noson S. 2016. The Outer Limits of Reason: What Science, Mathematics, and Logic Cannot Tell Us. Cambridge and London: MIT Press. [Google Scholar]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
