The role of scientific knowledge in society

Historical Background

Philosophers who study the social character of scientific knowledge can trace their lineage at least as far as John Stuart Mill. Mill, Charles Sanders Peirce, and Karl Popper all took some type of critical interaction among persons as central to the validation of knowledge claims.

Mill's arguments occur in his well-known political essay On Liberty (Mill 1859) rather than in the context of his logical and methodological writings, but he makes it clear that they are to apply to any kind of knowledge or truth claim.

Mill argues from the fallibility of human knowers to the necessity of unobstructed opportunity for and practice of the critical discussion of ideas. Only such critical discussion can assure us of the justifiability of the true beliefs we do have and can help us avoid the falsity or partiality of beliefs and opinions framed in the context of just one point of view.

Critical interaction maintains the freshness of our reasons and is instrumental in the improvement of both the content and the reasons of our beliefs. The achievement of knowledge, then, is a social or collective, not an individual, matter. Peirce's contribution to the social epistemology of science is commonly taken to be his consensual theory of truth, on which the truth is the opinion that the community of inquirers would ultimately settle upon. Whatever the correct reading of this doctrine, Peirce elsewhere makes it clear that, in his view, truth is both attainable and beyond the reach of any individual.

Peirce puts great stock in instigating doubt and critical interaction as means to knowledge. Thus, whether his theory of truth is consensualist or realist, his view of the practices by which we attain it grants a central place to dialogue and social interaction. Popper is often treated as a precursor of social epistemology because of his emphasis on the importance of criticism in the development of scientific knowledge.

Two concepts of criticism are found in his works (Popper 1963, 1972), and these can be described as the logical and practical senses of falsification. The logical sense of falsification is just the structure of a modus tollens argument, in which a hypothesis is falsified by the demonstration that one of its logical consequences is false.
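
In schematic form (a standard rendering of modus tollens rather than Popper's own notation), if a hypothesis H entails an observational consequence O and O is found to be false, then H is falsified:

\[
(H \rightarrow O) \wedge \neg O \;\vdash\; \neg H
\]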

This is one notion of criticism, but it is a matter of formal relations between statements. The practical sense of falsification refers to the efforts of scientists to demonstrate the inadequacies of one another's theories by demonstrating their observational shortcomings or conceptual inconsistencies.

This is a social activity. For Popper the methodology of science is falsificationist in both its logical and practical senses, and science progresses through the demonstration by falsification of the untenability of theories and hypotheses. Popper's logical falsificationism is part of an effort to demarcate genuine science from pseudoscience, and it has lost its plausibility as a description of scientific methodology as the demarcation project has come under challenge from naturalist and historicist approaches in philosophy of science.

While criticism does play an important role in some current approaches in social epistemology, Popper's own views are more closely approximated by evolutionary epistemology, especially that version that treats cognitive progress as the effect of selection against incorrect theories and hypotheses.

In contrast to Mill's views, for Popper the function of criticism is to eliminate false theories rather than to improve them. The work of Mill, Peirce, and Popper is a resource for philosophers presently exploring the social dimensions of scientific knowledge.

Science, Knowledge, and Society

However, the current debates are framed in the context of developments in both philosophy of science and in history and social studies of science following the collapse of the logical empiricist consensus. The philosophers of the Vienna Circle are conventionally associated with an uncritical form of positivism and with the logical empiricism that replaced American pragmatism in the 1940s and 1950s.

According to some recent scholars, however, they saw natural science as a potent force for progressive social change (Cartwright, Cat, and Chang 1996; Giere and Richardson, eds.).

One development of this point of view leads to scientism, the view that any meaningful question can be answered by the methods of science; another leads to inquiry into what social conditions promote the growth of scientific knowledge.

Logical empiricism, the version of Vienna Circle philosophy that developed in the United States, focused on logical, internal aspects of scientific knowledge and discouraged philosophical inquiry into the social dimensions of science.

These came into prominence again after the publication of Thomas Kuhn's The Structure of Scientific Revolutions (Kuhn 1962). A new generation of sociologists of science, among them Barry Barnes, Steven Shapin, and Harry Collins, took Kuhn's emphasis on the role of non-evidential community factors in scientific change even further than he had and argued that scientific judgment was determined by social factors, such as professional interests and political ideologies (Barnes 1977; Shapin 1982; Collins 1983).

This family of views provoked a counter-response among philosophers. These responses are marked by an effort to acknowledge some social dimensions to scientific knowledge while at the same time maintaining its epistemological legitimacy, which they take to be undermined by the new sociology. Philosophers concerned to defend the rationality of science against sociological misrepresentation include Larry Laudan (1984), James Brown (1989, 1994), Alvin Goldman (1987, 1995), and Susan Haack (1996).

At the same time, features of the organization of scientific inquiry compel philosophers to consider their implications for the normative analysis of scientific practices.

Big Science, Trust, and Authority

The second half of the twentieth century saw the emergence of what has come to be known as Big Science: large teams of researchers with different bodies of expertise working on a common project, the original model being the Manhattan Project. Theoretical and experimental physicists located at various sites across the country, though principally at Los Alamos, New Mexico, worked on sub-problems of the project under the overall direction of J. Robert Oppenheimer.

While academic and military research have since been to some degree separated, much experimental research in physics, especially high energy particle physics, continues to be pursued by large teams of researchers. Research in other areas of science as well, for example the work comprehended under the umbrella of the Human Genome Project, has taken on some of the properties of Big Science, requiring multiple forms of expertise.

In addition to the emergence of Big Science, the transition from small-scale university science, or even amateur science, to institutionalized research with major economic impacts, supported by national funding bodies and connected across international borders, has seemed to call for new ethical and epistemological thinking.

Moreover, the consequent dependence of research on central funding bodies and, increasingly, on private foundations or commercial entities prompts questions about the degree of independence of contemporary scientific knowledge from its social and economic context. John Hardwig (1985) articulated one philosophical dilemma posed by large teams of researchers. Each member or subgroup participating in such a project is required because each has a crucial bit of expertise not possessed by any other member or subgroup.

This may be knowledge of a part of the instrumentation, the ability to perform a certain kind of calculation, or the ability to make a certain kind of measurement or observation.

The other members are not in a position to evaluate the results of other members' work, and hence all must take one another's results on trust. The consequence is an experimental result (for example, the measurement of a property such as the decay rate or spin of a given particle) the evidence for which is not fully understood by any single participant in the experiment. This leads Hardwig to ask two questions, one about the evidential status of testimony, and one about the nature of the knowing subject in these cases.

With respect to the latter, Hardwig says that either the group as a whole, but no single member, knows, or it is possible to know vicariously. Neither of these is palatable to him. Talking about the group or the community knowing smacks of superorganisms and transcendent entities, and Hardwig shrinks from that solution.

Vicarious knowledge, knowing without oneself possessing the evidence for the truth of what one knows, requires, according to Hardwig, too much of a departure from our ordinary concepts of knowledge. The first question is, as Hardwig notes, part of a more general discussion about the epistemic value of testimony.

Much of what passes for common knowledge is acquired from others. We depend on experts to tell us what is wrong or right with our appliances, our cars, our bodies. Indeed, much of what we later come to know depends on what we previously learned as children from our parents and teachers. We acquire knowledge of the world through the institutions of education, journalism, and scientific inquiry. Philosophers disagree about the status of beliefs acquired in this way.

Here is the question: If A knows that p on the basis of evidence e, B has reason to think A trustworthy and B believes p on the basis of A's testimony that p, does B also know that p?
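
Put schematically (a bare formalization offered for perspicuity, with K_X(p) read as "X knows that p"; the notation is not drawn from the authors discussed here), the question is whether the following inference is valid:

\[
K_A(p) \;\wedge\; \text{B has reason to regard A as trustworthy} \;\wedge\; \text{B believes } p \text{ on A's testimony} \;\stackrel{?}{\Rightarrow}\; K_B(p)
\]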

Some philosophers argue, as Locke and Hume seem to have, that only what one has observed oneself could count as a good reason for belief, and that the testimony of another is, therefore, never sufficient warrant for belief. Thus, B does not know simply on the basis of A's testimony. While this result is consistent with traditional philosophical empiricism and rationalism, which emphasized the individual's sense experience or rational apprehension as foundations of knowledge, it does have the consequence that we do not know most of what we think we know.

The Social Dimensions of Scientific Knowledge

A number of philosophers have recently offered alternative analyses focusing on one or another element in the problem. Some argue that testimony by a qualified expert is itself evidential (Schmitt 1988); others that the expert's evidence constitutes good reason for, but is not itself evidential for, the recipient of testimony (Hardwig 1985, 1988); others that what is transmitted in testimony is knowledge and not just propositional content, and thus that the question of the kind of reason a recipient of testimony has is not to the point (Welbourne 1981).

However this dispute is resolved, questions of trust and authority arise in a particularly pointed way in the sciences, and Hardwig's dilemma for the physics experiment is also a specific version of a more general phenomenon. A popular conception of science, fed partly by Popper's falsificationism, is that it is epistemically reliable because the results of experiments and observational studies are checked by independent repetition. In practice, however, only some results are so checked and many are simply accepted on trust.

Not only must positive results be accepted on trust; claims of failure to replicate, as well as other critiques, must be accepted on trust as well.

Thus, just as in the non-scientific world information is accepted on trust, so in science, knowledge grows by depending on the testimony of others. What are the implications of accepting this fact for our conceptions of the reliability of scientific knowledge?

David Hull (1988) argues that because the overall structure of reward and punishment in the sciences is a powerful incentive not to cheat, further epistemological analysis of the sciences is unnecessary. The structure itself guarantees the veridicality of research reports. Yet research reports can go wrong in more than one way: while the advocates of cold fusion were honestly convinced that their experiments had produced the phenomenon, there have also been cases of outright fraud.

Thus, even if the structure of reward and punishment is an incentive not to cheat, it does not guarantee the veridicality of every research report. The reward individual scientists seek is credit. That is, they seek recognition, to have their work cited as important and as necessary to further scientific progress. The scientific community seeks true theories or adequate models. Credit, or recognition, accrues to individuals to the extent they are perceived as having contributed to that community goal.

There is a strong incentive to cheat, to try to obtain credit without necessarily having done the work. Both Alvin Goldman (1995, 1999) and Philip Kitcher (1993) have treated the potential for premature or otherwise improperly interested reporting of results to corrupt the sciences as a question to be answered by means of decision-theoretic models.

The Use of Knowledge in Society

The decision theoretic approach to problems of trust and authority treats both credit and truth as utilities. The challenge then is to devise formulas that show that actions designed to maximize credit also maximize truth.
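
The form of such models can be conveyed with a deliberately simplified sketch; the utility function and weighting parameter below are illustrative assumptions, not Goldman's or Kitcher's actual formalism. Suppose an individual scientist chooses an action a (for instance, when and what to report) so as to maximize

\[
U_i(a) \;=\; \alpha\, C_i(a) \;+\; (1-\alpha)\, T(a),
\]

where C_i(a) is the credit the scientist expects to receive, T(a) is the probability that the community arrives at the truth, and \alpha measures how credit-driven the scientist is. The community's design problem is then to choose a reward structure under which

\[
\arg\max_a C_i(a) \;=\; \arg\max_a T(a),
\]

so that even a purely credit-seeking agent (\alpha = 1) selects the actions that best promote the community's epistemic goal.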

Kitcher, in particular, develops formulas intended to show that even in situations peopled by non-epistemically motivated individuals (that is, individuals motivated more by a desire for credit than by a desire for truth), the reward structure of the community can be organized in such a way as to advance its epistemic goals and foster scientific progress. One consequence of this approach is to treat scientific fraud and value- or interest-infused science as the same problem.

One advantage is that it incorporates the motivation to cheat into the solution to the problem of cheating. But one may wonder how effective this solution really is. Increasingly, we learn of problematic behavior in science-based industries, such as the pharmaceutical industry: results are withheld or distorted, and authorship is manipulated. Hot areas, such as stem cell research and cloning, have been subjected to fraudulent research. Thus, even if the structure of reward and punishment is an in-principle incentive not to cheat, it does not guarantee the reliability of every research report.

Community issues have been addressed under the banners of research ethics and of peer review. Peer review, both of research proposals and of research reports submitted for publication, screens for quality, which includes methodological competence and appropriateness as well as originality and significance, while replication is intended to probe the robustness of results when reported experiments are carried out in different laboratories and with slight changes to experimental conditions. One might think that the only ethical requirements on scientists are to protect their research subjects from harm and, as professional scientists, to seek truth above any other goals. But this presupposes that seeking truth is a sufficient guide to scientific decision-making.

Heather Douglas, in her critical study of the ideal of value-freedom (Douglas 2009), rejects this notion. Douglas draws on her earlier study of inductive risk (Douglas 2000) to press the point that countless methodological decisions required in the course of carrying out a single piece of research are underdetermined by the factual elements of the situation and must be guided by an assessment of the consequences of being wrong.

The Role of Science in Society

Science, on this view, is not value-free, but it can be protected from the deleterious effects of values if scientists take steps to mitigate the influence of inappropriate values.

One step is to distinguish between direct and indirect roles of values; another is the articulation of guidelines for individual scientists. Values play a direct role when they provide direct motivation to accept or reject a theory; they play an indirect role when they play a role in evaluating the consequences of accepting or rejecting a claim, thus influencing what will count as sufficient evidence to accept or reject.
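
The indirect role can be illustrated with a simple expected-loss sketch; the symbols below are illustrative assumptions rather than Douglas's own formalism. Let p be the probability, on the available evidence, that a substance is harmful, let L_FN be the loss incurred by wrongly declaring it safe, and let L_FP be the loss incurred by wrongly declaring it harmful. Treating the evidence as sufficient to act on the claim of harm only when

\[
p \, L_{FN} \;>\; (1-p)\, L_{FP}, \qquad \text{that is, when } p > \frac{L_{FP}}{L_{FP} + L_{FN}},
\]

makes plain that the threshold of "sufficient evidence" is fixed not by the evidence alone but by an assessment of the comparative costs of the two kinds of error.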

The responsibility of scientists is to make sure that values do not play a direct role in their work and to be transparent about the indirect roles of values. Steel and Whyte (2012) examine testing guidelines developed by pharmaceutical companies to point out that the very same decision may be motivated by values playing a direct role or playing an indirect role. Elliott (2011) questions whether only harmful consequences should be considered.