Machines and Human Epistemology
https://doi.org/10.32603/2412-8562-2022-8-2-17-27
Abstract
Introduction. The article analyzes the structure of knowledge in weak artificial intelligence in comparison with the structure of human knowledge. It was written within one of the most in-demand branches of philosophy – the philosophy of artificial intelligence. The novelty of the research lies in applying the analysis of knowledge developed within analytical epistemology to the field of artificial intelligence.
Methodology and sources. The article is written within the analytical tradition in philosophy. We use conceptual analysis to examine the most crucial aspects of knowledge. This method assumes that explaining a complex phenomenon requires analyzing its components; it formed the basis of the analytical discussion of knowledge in the second half of the 20th century. The article also employs comparative analysis.
Results and discussion. Human knowledge is characterized by three essential features – the presence of information-bearing mental states, their reliability, and their factuality. We analyzed to what extent the information-carrying internal states of a weak AI possess these qualities. The authors concluded that such internal states can be considered beliefs under a weak interpretation of the concept, that they can attain a high degree of reliability under certain conditions, and that they can possess factuality if we accept that a weak AI has beliefs.
Conclusion. A weak interpretation of the concept of belief allows us to argue that neural networks are capable of having beliefs. A stricter interpretation of the concept also includes the requirement of understanding meaning; however, no satisfactory theory of the understanding of meaning is currently available. In that case, reliability is the only criterion of knowledge that the functional states of machines can satisfy, and only for certain tasks – which makes the problem of generality especially acute.
About the Authors
T. S. Demin
Russian Federation
Timofei S. Demin – Assistant Lecturer at the Department of Philosophy, Saint Petersburg Electrotechnical University
5F Professor Popov str., St Petersburg 197022
K. G. Frolov
Russian Federation
Konstantin G. Frolov – Cand. Sci. (Philosophy) (2017), Research Officer at the International Laboratory for Logic, Linguistics and Formal Philosophy; Research Officer at the Department of Philosophy
11 Pokrovsky blvd., Moscow 109028
References
1. Lakoff, G. (2004), Women, Fire, and Dangerous Things: What Categories Reveal about the Mind, Transl. by Shatunovskii, I.B., Yazyki slavyanskoi kul'tury, Moscow, RUS.
2. Lakoff, G. and Johnson, M. (2004), Metaphors We Live By, Transl. by Baranov, A.N. and Morozova, A.V., URSS, Moscow, RUS.
3. Searle, J.R. (1980), “Minds, brains, and programs”, Behavioral and Brain Sciences, vol. 3, iss. 3, pp. 417–424. DOI: https://doi.org/10.1017/S0140525X00005756.
4. Chalmers, D.J. (2016), “The singularity: A philosophical analysis”, Science Fiction and Philosophy: From Time Travel to Superintelligence, in Schneider, S. (ed.), Wiley-Blackwell, Chichester, UK, pp. 171–224.
5. Ichikawa, J.J. and Steup, M. (2014), “The Analysis of Knowledge”, The Stanford Encyclopedia of Philosophy, Zalta, E.N. (ed.), available at: https://plato.stanford.edu/archives/spr2014/entries/knowledge-analysis/ (accessed 20.12.2021).
6. Pavese, C. (2021), “Knowledge How”, The Stanford Encyclopedia of Philosophy, Zalta, E.N. (ed.), available at: https://plato.stanford.edu/entries/knowledge-how/ (accessed 20.12.2021).
7. Williamson, T. (2002), Knowledge and Its Limits, Oxford Univ. Press, Oxford, UK. DOI: 10.1093/019925656X.001.0001.
8. Shope, R.K. (1983), The Analysis of Knowledge, Princeton Univ. Press, Princeton, NJ, USA.
9. Zagzebski, L. (2009), On Epistemology, Wadsworth Cengage Learning, Belmont, CA, USA.
10. Martin, A. and Santos, L.R. (2016), “What cognitive representations support primate theory of mind?”, Trends in Cognitive Sciences, vol. 20, iss. 5, pp. 375–382. DOI: 10.1016/j.tics.2016.03.005.
11. Davidson, D. (1982), “Rational animals”, Dialectica, vol. 36, no. 4, pp. 317–327.
12. Dennett, D. (1991), “Real patterns”, The Journal of Philosophy, vol. 88, no. 1, pp. 27–51. DOI: https://doi.org/10.2307/2027085.
13. Frolov, K.G. (2018), “Metaphysics of correspondence: some approaches to the classical theory of truth”, Epistemology & Philosophy of Science, vol. 55, no. 1, pp. 83–98. DOI: 10.5840/eps201855110.
14. Lobanov, S.D. (2011), “About of problem knowledge reliability and question of truth/knowledge dilemma”, Herald of Vyatka State Humanitarian Univ., no. 4 (4), pp. 26–27.
15. McGlynn, A. (2014), Knowledge First?, Palgrave Macmillan, London, UK. DOI: https://doi.org/10.1057/9781137026460.
16. Feldman, R. (1985), “Reliability and justification”, The Monist, vol. 68, iss. 2, pp. 159–174. DOI: 10.5840/monist198568226.
17. Pritchard, D. (2014), “Knowledge and understanding”, Virtue Epistemology Naturalized, in Fairweather, A. (ed.), vol. 366, Springer, Cham, CHE, pp. 315–327. DOI: https://doi.org/10.1007/978-3-319-04672-3_18.
For citations:
Demin T.S., Frolov K.G. Machines and Human Epistemology. Discourse. 2022;8(2):17-27. (In Russ.) https://doi.org/10.32603/2412-8562-2022-8-2-17-27