Tuesday, 24 January 2023

The Decline of Intelligence and the Impoverishment of Language

By Christophe Clavé

Source: https://jornalpurosangue.net/2023/01/21/o-ocaso-da-inteligencia-e-o-empobrecimento-da-linguagem/

The average IQ of the world's population, which had risen steadily from the post-war period until the end of the 1990s, has declined over the past twenty years. This is the reversal of the Flynn effect.

It seems that the level of intelligence, as measured by tests, is falling in the most developed countries. The causes of this phenomenon may be many. One of them could be the impoverishment of language.

Indeed, several studies show a decline in lexical knowledge and an impoverishment of language: not only a reduction in the vocabulary used, but also the loss of the linguistic subtleties that make it possible to elaborate and formulate complex thoughts.

The gradual disappearance of tenses (the subjunctive, the imperfect, compound forms of the future, the past participle) gives rise to a thought almost always in the present, limited to the moment: incapable of projection in time.

The simplification of tutorials and the disappearance of capital letters and punctuation are examples of "mortal blows" to the precision and variety of expression.

Just one example: eliminating the word "signorina/senhorita/mademoiselle" (now obsolete) means not only giving up the aesthetics of a word, but also unintentionally promoting the idea that there are no intermediate stages between a girl and a woman.

Fewer words and fewer conjugated verbs mean less capacity to express emotions and less capacity to work through a thought. Studies have shown that part of the violence in the public and private spheres stems directly from the inability to describe emotions in words.

Without words to construct an argument, complex thought becomes impossible.

The poorer the language, the more thought disappears. History is full of examples, and many books (George Orwell's "1984"; Ray Bradbury's "Fahrenheit 451") recount how all totalitarian regimes have always obstructed thought by reducing the number and the meaning of words.

If there are no thoughts, there is no critical thought. And there is no thought without words. How can one construct hypothetico-deductive reasoning without the conditional? How can one think about the future without a future conjugation? How is it possible to grasp temporality, a succession of elements in time, past or future, and their relative duration, without a language that distinguishes between what could have been, what was, what is, what could be, and what will be after what might have happened has actually happened?

Dear parents and teachers: let us get our children and our students to speak, read and write. Let us teach and practise language in its most diverse forms. Even if it seems complicated. Especially if it is complicated. Because in that effort lies freedom.

Those who insist on simplifying spelling, on ridding language of its "defects", on abolishing genders, tenses, nuances, everything that creates complexity, are the true architects of the impoverishment of the human mind.

There is no freedom without necessity. There is no beauty without the thought of beauty.

Tuesday, 08 March 2022

Edward Dutton: Is IQ Rising or Falling?

by A. Hercynský

Ex: https://deliandiver.org/2022/01/edward-dutton-roste-iq-nebo-klesa.html

Edward Dutton, a Briton by origin who lives in Oulu, Finland, is an associate professor of anthropology, teaches evolutionary psychology at a private university in Łódź, Poland, and is a member of several scientific societies, mainly in the Scandinavian countries. He runs a YouTube channel, Jolly Heretic, where he presents his views on various social phenomena. Dutton is an eccentric, both in his manner of expression and in his habit of introducing each of his videos with a fictional scene in costume. He presents each of his videos as comedy, but their content can be taken perfectly seriously, even if one cannot agree with everything. Dutton's interpretations rest on several premises, one of which is the notion of a steady decline in average IQ since the Industrial Revolution. He discusses this decline in particular in this video.

Edward Dutton argues that average intelligence peaked at the very beginning of the Industrial Revolution, in the 18th century, and has declined steadily ever since. This is supposedly because the least intelligent segments of the population no longer die off but, on the contrary, thrive thanks to modern advances such as medical care, and reproduce faster than the rest of society. Before that, according to Dutton, society functioned in such a way that the most intelligent remained at the top while the less intelligent drifted downward until, at the lowest levels, they fell out of the wheel altogether. As Dutton notes elsewhere, intelligence also correlates with genetic predispositions to various diseases, so it is not merely a matter of external material security. At the extreme, genuinely demented individuals often suffer from other associated genetic diseases, but the same pattern appears at less severe levels.

Dutton developed his views in his book At Our Wits' End: Why We're Becoming Less Intelligent and What it Means for the Future (2017; Sol Noctis, 2020). The book is a popular compendium and meta-analysis of many partial, mostly recent studies, but it contains almost no data analysis or original research of its own; for that, one must turn to the cited sources (chapters 8 and 9 in particular are essential).

It is true that psychology recorded an increase in IQ test scores over the last century. This finding is known as the Flynn effect, after the psychologist to whom it is wrongly attributed. Yet even Wikipedia notes that there are different perspectives on it and that the Flynn effect must be interpreted correctly. Dutton treats the phenomenon roughly as follows: IQ test scores are relative; a value of 100 corresponds to the average and is periodically readjusted. Tests therefore compare subjects within a group (defined by time), but they cannot be used for direct comparison between groups (since each has its baseline set elsewhere). The truth, he says, is that after a difficult conversion into absolute, i.e. comparable, values, a kind of increase can indeed be observed. However, when he analyses which parts of the test show improvement and which do not, he concludes that the increase is driven mainly by practical skills, while the parts of the test that most engage separate general intelligence (usually designated in psychology by the letter g) show no increase. We will discuss later how significant this difference is.
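The point about relative scoring can be made concrete with a small sketch (the cohorts and raw scores below are hypothetical, invented purely for illustration): because each cohort is standardized against its own mean and standard deviation, its average IQ is 100 by construction, so a rise in raw performance between cohorts vanishes from the normed scale.

```python
import statistics

def iq_scores(raw_scores):
    """Standardize raw test scores to an IQ scale (mean 100, SD 15),
    normed against the cohort that took the test."""
    mean = statistics.mean(raw_scores)
    sd = statistics.pstdev(raw_scores)
    return [100 + 15 * (r - mean) / sd for r in raw_scores]

# Hypothetical cohorts: the later one solves more raw items.
cohort_1950 = [18, 20, 22, 24, 26]
cohort_2000 = [28, 30, 32, 34, 36]

# Within its own norms, each cohort averages 100 by construction,
# so the raw-score gain is invisible on the normed scale.
print(round(statistics.mean(iq_scores(cohort_1950))))  # 100
print(round(statistics.mean(iq_scores(cohort_2000))))  # 100

# Only scoring one cohort against the other's norms reveals the shift:
# an average year-2000 raw score (32), placed on the 1950 norms.
mean_1950 = statistics.mean(cohort_1950)
sd_1950 = statistics.pstdev(cohort_1950)
print(round(100 + 15 * (32 - mean_1950) / sd_1950))  # 153
```

This illustrates only the norming arithmetic; converting real historical scores into comparable absolute values is, as the article notes, far more difficult.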

According to Edward Dutton, thanks to modern ways of operating (school tests, computers, etc.), people have reached a state in which they can use their potential to the maximum on similarly designed IQ tests, but the bar of the potential itself, general intelligence, g, has not moved; on the contrary, it is falling. As an example, he gives a test item of the type: which of the listed animals is not a mammal? To answer, the respondent must know the animals in question and know what the word mammal means; moreover, he must think abstractly and analytically. Dutton gives the following example: when researchers put a similar question to a Russian peasant at the beginning of the 20th century, a man whose thinking was essentially pre-modern, he was unable to answer; he was not used to thinking in abstract categories and replied with an unexpected mental construction. Thus, if a respondent is trained in this mode of thinking, which is best suited to passing an IQ test, he will score best on IQ tests, but general intelligence will not have increased for all that. By analogy, one can cite the growth of body height, which, although it too increased over time, mainly owing to nutrition, stopped at a certain point, and no further growth is possible.

The more serious question is what general intelligence g actually is. In his book, Dutton gives a number of examples by which general intelligence can be measured separately, all of which correlate with one another. This is, of course, not an exhaustive list; general intelligence is reflected in one way or another in every free human activity. Let us run through these examples telegraphically:

    - reaction speed, which correlates with intelligence; measurements have been recorded since 1800, and the results have been in constant decline [cf. "slow man"];
    - the capacity for fine colour discrimination (it can be tested at https://www.colorblindnesstest.org/farnsworth-munsell-100-hue-test/, though the result depends on the quality and calibration of the monitor);
    - the use of complex words, perhaps the most interesting example: it has been found that although vocabulary grows considerably with education, it consists of conceptually simpler words, while the use of more complex terms shows a constant decline (see the graph on page 170);
    - the ability to repeat a series of digits backwards, whereas the ability to repeat a series of digits in forward order has improved, which the author describes as a purely practical skill requiring no extra intelligence;
    - spatial orientation;
    - child development: children are less and less able to estimate sizes and weights [its inclusion here may be considered questionable; it might be more valid as evidence of those practical skills];
    - the declining number of brilliant people;
    - creativity, whose decline is, according to the author, measurable.

Edward Dutton found that all the above criteria correlate and that a downward trend in them can be observed from the time of the Industrial Revolution to the present day.

In conclusion, even if some of the observed signs can be contested as arbitrarily or deliberately selected, it is hard to deny that gains in certain practical skills must not be confused with growth in general intelligence. Nor do I doubt that the survival and proliferation of the less intelligent (together with the low birth rate of intelligent people) is an undeniable fact that must necessarily have an effect on average intelligence.

Sunday, 02 December 2018

We Are in the Midst of an Intellectual Regression

This article brings interesting hypotheses to the debate on the decline of IQ in the West. Even if reading on a screen proves inferior to reading on paper in terms of concentration, memorization and comprehension, and even if the Internet can clearly over-inform, we think the cultural relativism invoked by Dimitri Casali is more to blame than the digital revolution itself, whatever caveats may qualify its record.

Because if we look, for example, at China, a nation today as connected (computers, tablets, smartphones) as France was in 2007, it shows on the contrary a general rise in the IQ of its population, even in the autonomous region of Hong Kong, which went massively online at the same time as the West. If this Chinese trend continues, we will have to resign ourselves to looking first at factors such as the encouragement of cultural levelling-down in the West to understand our own downward trend. To be continued, then.


By Charles Sannat
Published on Insolentiae.fr


The historian Dimitri Casali advocates the transmission of knowledge in this article from the Midi Libre.

According to Dimitri Casali, "society is tipping into ignorance. Every day science brings us new discoveries. But according to the historian, never has ignorance gained so much ground."

As evidence he cites the 3 million illiterate people in our country who can neither read nor write. If we add those who have serious difficulty reading and writing French, we reach nearly 10% of the population, some 6.5 million people!

As for bringing France into the century of knowledge, things are off to a rather bad start! He also cites the PISA surveys and the across-the-board decline in IQ.

It is the end of the article that is the most interesting part.

From when, in your view, can this rise of ignorance be dated?

"I see an obvious link with the digital revolution. It started in the 2000s, when we witnessed an information overload that led to ignorance. Fifteen-to-29-year-olds no longer read books. On the other hand, they read more on social networks, blogs... According to a Yale University study, reading on the Internet is not the same: pieces of information are layered on top of one another, whereas reading a book allows you to enter the author's thoughts and to structure information.

That organizes the brain. Other studies point the same way: the French are said to have lost 4 IQ points between 1989 and 2009, a phenomenon also measured in England and the United States. Wikipedia is the finest example of the Internet's perverse effects. Culture has been handed to imbeciles. While in the scientific domain the entries are written by experts, in literature and history they are an aggregate of information levelled by the greatest number. There is no longer any hierarchy of knowledge. We live in the age of cultural relativism. Everything is equal to everything else. Thus Kim Kardashian's page will soon be longer than Montaigne's, and the great Greek poet Homer already has fewer articles than Homer Simpson.

Is there a way to stamp out the rise of this phenomenon?

Of course: we must put general culture and history back at the centre of our concerns. And first of all at school. Yet for some thirty years, general culture has been abandoned. The famous pedagogists of the Rue de Grenelle have replaced the transmission of knowledge with mere technical competencies. The idea is to make a new man, without roots or heritage, a good consumer. Remember that Philippe Meirieu and Bourdieu actually recommended learning to read from household-appliance instruction manuals rather than from the texts of Hugo or Molière... We must break with this rejection of classical French culture, which damages weak minds. And stop believing that we must all be equal in mediocrity."

Charles Sannat
Photo Pixabay

Source Midi Libre

Saturday, 20 October 2018

Jean Piaget & the Superior Psychogenetic Cognition of Europeans

Part I

Everyone has heard about Jean Piaget’s (1896-1980) theory of the cognitive development of children. But no one knows that his theory placed Europeans at the top of the cognitive ladder with most humans stuck at the bottom — unless Europeans taught them how to think.

Piaget is widely recognized as the “greatest child psychologist of the twentieth century.” Unlike many other influential figures, Piaget’s discoveries have withstood the test of time. His argument that human cognition develops stage by stage, from sensorimotor, through preoperational and concrete operations, to formal operations, is generally endorsed in psychology and sociology texts as a “remarkably fruitful” model. This is not to deny that aspects of his theory have been revised and supplemented by new insights. One important criticism is that his fixed sequence of clear-cut stages does not always apprehend the overlapping and uneven process in the development of cognition. But even the strongest critics admit that his observations accurately show that substantial differences do exist between the cognitive processes (linguistic development, mental representations of concrete objects, logical reasoning) of children and adults.

Suppression of Piaget’s Cross Cultural Findings

What the general public does not know, and what the mainstream academic world is suppressing, is that many years of cross-cultural empirical research by Piaget and his followers have demonstrated that the stages of mental development of children and adolescents reflect the stages of cognitive evolution “humankind” has gone through from primitive, ancient, and medieval, to modern societies. The cognitive processes of humanity have not always been the same, but have improved over time. The civilizations of the world can be ranked according to the level of cognitive development of their populations. The peoples of the world differ not only in the content of their values, religious beliefs, and ways of classifying things; they differ in the cognitive processes they employ, their capacity to understand, for example, the relation between objects and concepts, their awareness of objective time, their ability to draw inferences from data, and to project these inferences into the hypothetical realm of the future. Most humans throughout history have been “childlike” in their cognitive capacities; they are not able, for example, to recognize contradictions between belief and experience, or to conceive multiple causes for individual events. Europe began to produce adolescents capable of reaching the stage of formal operational reasoning before any other continent, whereas to this day some nations barely manage to produce adults capable of formal operations.

This aspect of the cross-cultural comparative research conducted by Piaget and his associates has been suppressed. Critics interpreted the lack of formal reasoning among adolescents in many non-Western societies as evidence that his model lacked universal application, rather than as further confirmation that his theory of child development, first developed through extensive research on children in the West, could be applied outside the West. Because many critics erroneously assumed that Piaget's theory was about how all children naturally mature into higher levels of cognition, they took this lack of cognitive development in pre-modern cultures as a demonstration that different cultural contexts produce different modes of cognitive development. Piaget's stages, however, should not be seen as stages that every child goes through as he or she gets older; they are not biologically predetermined maturational stages. While there is a teleological tendency in Piaget's account of cognitive stages, with each of the four stages in a modern environment unfolding naturally as the child ages, this criticism ignored the implications of his cross-cultural studies, which were carried out in his later years and which made it evident that the ability to reach the stage of formal operations depends on the type of science education children receive rather than on a predetermined maturation process.

It can be argued, actually, that Piagetian cross-cultural studies made his theory all the more powerful in offering a precise and orderly account of the cognitive psychological development of humankind in world history from hunting and gathering societies through agrarian societies to modern societies. This was not just a theory about children but a grand theory covering the cognitive experience of all peoples throughout history, from primitive peoples with a preoperational mind, to agrarian peoples with a concrete operational mind, to modern peoples with a formal operational mind. One of the rare followers of this cross-cultural research, the German sociologist Georg W. Oesterdiekhoff, observes that "thousands of empirical studies across all continents and social milieus, from the 1930s to the present" (2015, 85) have been conducted demonstrating that, depending on the level of cultural scientific education, the nations of the world in the course of history can be identified as preoperational (the stage of children from their second to their sixth or seventh year of life), concrete operational (the stage from roughly age seven to twelve) and formal operational (the stage of cognition from age twelve onward).

Adults living in a scientific culture are more rational (and intelligent) than adults living in pre-modern cultures. For example, according to studies conducted in the 1960s and 1970s, even educated adults living in Papua New Guinea did not reach the formal stage. Australian Aborigines who were still living a traditional lifestyle barely developed beyond a preoperational stage in their adult years. Without a population that has mentally developed to the level of formal operations, which entails the capacity to think about abstract relationships and symbols without concrete forms, to grasp syllogistic reasoning, to comprehend algebra and to formulate hypotheses, there can be no modernization.

However, despite all the studies confirming Piaget's powerful theory, from about 1975-1980 a "wave of ideological attacks" was launched across the Western academic world against any notion that the peoples of the Earth could be ranked in terms of their cognitive development. According to Oesterdiekhoff, "nearly all child psychologists of the first two generations of developmental psychology knew about the similarities between children and pre-modern man," but "due to anti-colonialism, student revolt, and damaged self-esteem of the West in consequence of the World Wars this theory as the mainstream spirit of Western sciences and public opinion declined gradually" (2014a, 281). As another author observed in 1989, "any suggestion that the cognitive processes of the older child might possess any similarities to the cognitive processes of some primitive human cultures is regarded as being beneath contempt" (Don LePan, 1989).

I came across Oesterdiekhoff's research after a long search through Piagetian theory. I was wondering what Piaget's stage theory might have to say about the cognitive development of peoples in history. But I could find only treatments of Piaget as a cognitive psychologist of children as such, not as a grand theorist of the cognitive development of humanity across world history, until I came to Oesterdiekhoff's many publications, which draw on both pre-1975 Piagetian research and current research. This research, as Oesterdiekhoff notes, "no longer belong[s] to the center of attention and research interests. Most social scientists have never heard about these researchers and have only a very scanty knowledge of them" (2014a, 280).

Oesterdiekhoff is very blunt and ambitious in his arguments. His claim is that Piagetian theory is "capable of explaining, better than previous approaches, the history of humankind from prehistory through ancient to modern societies, the history of economy, society, culture, religion, philosophy, sciences, morals, and everyday life" (2014a). He believes that the rise of formal operational thinking among Europeans was the decisive factor in the rise of modern science, enlightenment, industrialism, democracy, and humanism in the West. The reason India, China, Japan, and the Middle East did not start the Industrial Revolution "lies in their inability to evolve the stage of formal operations" (2014a).

Primitive and pre-modern peoples cannot be described as having a similar rational disposition as modern peoples because they are at the preoperational and concrete operational stages of cognition. Primitive adults share basic aspects of the preoperational thinking of children no more mature than eight years old. Adults in pre-modern civilizations share the concrete operational thinking of 6-12 year olds.

Children and premodern adults share the same mechanisms and basic understandings of physical dimensions such as length, volume, time, space, weight, area, and geometric qualities. Both groups share the animistic understanding of nature and regard stones, mountains, woods, stars, rivers, winds, clouds, and storms as living beings, their movements and appearances as expressions of their will, intentions, and commitment. Premodern humans often manifest the animistic tendencies of modern children before their sixth year. Fetishism and natural religion of premodern humans reside in children’s mentality before concrete operational stage . . . The biggest parts of ancient religions are based on children’s psychology and animism before the sixth year of life (2016, 301).

It is not that adults in primitive and pre-modern cultures are similar to children in modern cultures in their emotional development, experience, and ability to survive in a hostile environment. It is that the reasoning abilities of adults in pre-modern cultures are undeveloped. As Lucien Lévy-Bruhl (1857-1939) had already observed in Primitive Mentality (1923), a work which was recently released (2018) as part of Forgotten Books, the primitive mind is devoid of abstract concepts, analytical reasoning, and logical consistency. The objective-visible world is not distinguished from the subjective-invisible world. Dreams, divination, incantations, sacrifices, and omens, not inferential reasoning and objective causal relations, are the phantasmagorical doors through which primitives get access to the intentions and plans of the unseen spirits that they believe control all natural events.

The visible world and the unseen world are but one, and the events occurring in the visible world depend at all times upon forces which are not seen . . . A man succumbs to some organic disease, or to snake-bite; he is crushed to death by the fall of a tree, or devoured by a tiger or crocodile: to the primitive mind, his death is due neither to disease nor to snake-venom; it is not the tree or the wild beast or reptile that has killed him. If he has perished, it is undoubtedly because a wizard had “doomed” and “delivered him over”. Both tree and animal are but instruments, and in default of the one, the other would have carried out the sentence. They were, as one might say, interchangeable, at the will of the unseen power employing them (2018, 438).

I have reservations about the extent to which the rise of operational thinking on its own can explain the uniqueness of Western history (as I will explain in Part II), but I agree that without children or adolescents reaching the stage of operational thinking, there can be no modernization. The geographical, economic, or cultural factors that led to the rise of science and the Industrial Revolution are not the matters we should be focusing on. The rise of a "new man" with psychogenetic capacities (psychological processes, personality, and behavior) for formal operational reasoning needs direct attention if we want to understand the rise of modern culture.

Cultural Relativism

But first, it seems odd that Oesterdiekhoff holds two seemingly diametrically opposed outlooks, "cultural relativism and universality of rationality," responsible for the discrediting of Piagetian cross-cultural theory. He does not explain what he means by "universality of rationality." We get a sense that by "cultural relativism" he means the rejection of unreserved confidence in the superiority of Western scientific rationality. Social scientists after the Second World War did become increasingly ambivalent about setting up Western formal thinking as a benchmark by which to judge the cognitive processes and values of other cultures, even though the non-Western world was happily embracing the benefits of Western science and technology.

The pathological degree to which this relativism has affected Western thinking can be witnessed right inside the otherwise hyper-scientific field of cognitive psychology today. Take the well-known textbook Cognitive Psychology (2016) by Robert Sternberg, IBM Professor of Psychology at Yale University. It approaches every subject in a thoroughly scientific and neutral manner, except the moment it touches on intelligence cross-culturally, when it immediately embraces a relativist outlook, informing students that intelligence is "inextricably linked to culture" and that it is impossible to determine whether members of "the Kpelle tribe in Africa" have less intelligent concepts than a PhD cognitive psychologist in the West. Intelligence is "something that a culture creates to define the nature of adaptive performance in that culture and to account for why some people perform better than others on the tasks that the culture happens to value." It is "so difficult," the book says, to "come up with a test that everyone would consider culture-fair — equally appropriate and fair for members of all cultures" (503-04).

If members of different cultures have different ideas of what it means to be intelligent, then the very behaviors that may be considered intelligent in one culture may be considered unintelligent in another (504).

This textbook pays detailed attention to the scientific achievements of Piaget, but portrays him as someone who investigated the “internal maturation processes” of children as such, without considering his cross-cultural findings, which clearly suggest that children in less developed and less scientific environments do not mature to the formal stage. Pretending that such findings do not exist, the book goes on to criticize Piaget for ignoring “evidence of environmental [cultural] influences on children’s performance.”

I am not suggesting that cultural relativism has taken over the Western sciences in the way it has the humanities, sociology, history, and philosophy. But there is no denying that this relativism is being effectively used against any overt presumption by Western scientists that their knowledge is "superior" to the knowledge of African tribes and Indigenous peoples. No cognitive psychologist is allowed to talk about the possible similarities between the minds of children and the minds of adult men in pre-modern cultures.

Cultural Universals

Oesterdiekhoff does not define "universality of rationality," but we can gather from the literature he uses that he is referring to anthropologists who argue that all humans are rationally inclined; primitive and pre-modern peoples are not "illogical" or "irrational." The "actual structures of thought, cognitive processes, are the same in all cultures." What differ are the "superstructural" values, religious beliefs, and ways of classifying things in nature. Primitive peoples, Islamic and Confucian peoples, were quite rational in the way they went about surviving in the natural world, making tools, building cultures, and enforcing customs that were "adaptive" to their social settings and environments. They did not develop science because they had different priorities and beliefs, and were less obsessed with mastering nature and increasing production.

The anthropologist Claude Lévi-Strauss and the sociologist Émile Durkheim were the first to argue that the primitive mind is “logical in the same sense and same fashion as ours” and that the only difference lies in the classification systems and thought content. George Murdock and Donald Brown, in more recent times, came up with the term “cultural universals” (or “human universals [6]”) to refer to patterns, institutions, customs, and beliefs that occur universally in all cultures. These universals demonstrated, according to these anthropologists, that cultures differ far less than one might think from examining levels of technological development alone. Murdock and Brown pointed to strong similarities in the gender roles of all cultures, the common presence of the incest taboo, and similarities in religious and healing rituals, mythologies, marriage rules, use of language, art, dance, and music.

This idea about the universality of rationality and “cultural universals” was subsequently elaborated in a more Darwinian direction by evolutionary psychologists. Evolutionary psychology is generally associated with “Right wing” thinking, in contrast to cultural relativism, which is associated with “Left wing” thinking. Evolutionary psychologists like E. O. Wilson and Steven Pinker hold that these cultural universals are naturally selected, biologically inherited behaviors. They believe that rationality is a naturally inherited disposition among all humans, though they don’t say that the levels of knowledge across cultures are the same. Humans are rational in the way they go about surviving and co-existing with other humans. These universals were selected because they enhanced the adaptability of peoples to their environments and improved the group’s chances of survival. Some additional cultural universals observed in all human cultures are bodily adornment, calendars, cooperative labor, cosmology, courtship, divination, division of labor, dream interpretation, food taboos, funeral rites, gift-giving, greetings, hospitality, inheritance rules, kin groups, magic, penal sanctions, puberty customs, residence rules, soul concepts, and status differentiation.


Evolutionary psychologists are convinced that the existence of cultural universals amounts to a refutation of the currently “fashionable” notion that all human behaviors, including gender differences, are culturally determined. But if the West is very similar to other cultures, why did modern science develop in this civilization, including liberal democratic values? Evolutionary psychologists search for general explanations: the notion of cultural universals meets this criterion, Western uniqueness does not; therefore, they either ignore this question or reduce Western uniqueness to a concatenation of historical factors, varying selective pressures, and geographical good luck. They point to how modern science has been assimilated by multiple cultures, from which point they argue that science is not culturally exceptional to the West but a universal method that produces universal truths “for humanity.”

Can one argue that universalism is a cultural attribute uniquely Western and therefore relative to this culture?

Piagetian Universalism and IQ Convergence

Piagetian theory is also universalist in maintaining that all cultures are now reaching the stage of formal operational thinking. The West merely initiated formal reasoning. More than this, according to Oesterdiefkhoff, this cognitive convergence is happening across all the realms of social life, because changes in the cognitive structures of humans bring simultaneous changes in the way we think about politics and institutional arrangements. The more rational we become, the more we postulate enlightened conceptualizations of government in opposition to authoritarian forms. Drawing on the extension of Piagetian theory to explain the moral development of humans (initiated by Piaget and elaborated by Lawrence Kohlberg), Oesterdiefkhoff writes that once humans reach stage four, they start to grasp “that rule legitimacy should follow only from a correct rule installation, that is, from the choices of the players involved” (2015, 88).

Thus, they regard only rules correctly chosen as obliging rules. Only democratic choices install legitimate rules. Youth on the formal stage surmount therefore the holy understanding of rules by the democratic understanding. They replace an authoritarian understanding of rules, laws, and customs by a democratic one. Thus, they invent democracy in consequence of their cognitive maturation (2015, 88-9).

The emergence of the adolescent stage of formal operations gave birth not only to the new sciences after 1650 but also to philosophers such as Locke, Montesquieu, and Rousseau, who formulated the basic principles of constitutional government, representative institutions, and religious tolerance. Extensive cross-cultural research has shown that “children do not understand tolerance for deviating ideas, liberty rights for individuals, rights of individuals against government and authority, and democratic legitimacy of governments and authorities” (2015, 93). They are much like the adults of premodern societies, or current backward Islamic peoples, who take “laws and customs as unchangeable, eternal, and divine, made by god and not modifiable by human wishes or choices” (2015, 90).

This argument may seem similar to Francis Fukuyama’s thesis that modernizing humans across the world are agreeing that liberal-democratic values best satisfy the longing humans have for a state that recognizes their right to pursue their own happiness within a constitutional order based on equal rules. The difference, a crucial one, is that for Fukuyama the rise of democracy came from the articulation and propagation of new ideas, whereas for Oesterdiefkhoff psychogenetic maturation is a precondition of democratic rule. Adults who were raised in a pre-modern culture and have a concrete operational mind can “never surmount” this stage, no matter how many books they read about the merits of liberal democracy. These adults will lack the appropriate ontogenetic development required for a democratic mind.

The absence of stimuli and forces of modern culture during early childhood in premodern cultures prevents later psychological development from going beyond certain stages . . . Unused developmental opportunities in youth stop the development of the nervous system, thus preventing psychological advantages in later life. This explains why education and enlightenment, persuasion and media programs could not draw adult premodern people out of their adherence to magic, animism, ordeal praxis, ancestor worship, totemism, shamanism, and belief in witches. Such people, moving in adulthood to modern milieus, cannot surmount their anthropological structures and their deepest emotions and convictions (2016, 306-7).

Moreover, according to Oesterdiefkhoff, with the attainment of higher Piagetian stages come higher IQ levels. Psychogenetic differences, not biological genetic differences, are the decisive factor. “All pre-modern peoples stood on intelligence levels of 50 to 70 [IQ points] or on preoperational or concrete operational levels, no matter from what race, culture or continent they have come” (2014b, 380).

Not only the Western nations, but all modernizing nations have raised their scores. The rises in stage progression and IQ scores express the greatest intelligence transformations ever in the history of humankind and stem solely from changes in culture and education. When Africans, Japanese, Chinese and Brazilians have raised their intelligence so dramatically, where is the evidence for huge genetic influences? Huge genetic influences might be assumed if Europeans had always had higher intelligence and if African, Indians, Arabs and Vietnamese had been unable to raise their intelligence to levels superior to that of Europeans 100 years ago. But Latin Americans and Arabs today do have higher IQ scores than Europeans had 100 years ago . . . Where is the leeway for genetic influences to affect national intelligence differences? (Ibid).

 


IQ experts would counter that only psychometric data about levels of heritable general intelligence can explain the rise of formal operational thinking. But even if we agree that a gap in IQ scores between American blacks and American whites has remained despite the Flynn effect [9] and similar levels of education and income, it is very hard to attribute the remarkable increases in IQ identified over the last century to heredity. Oesterdiefkhoff’s argument that “all modernizing nations have raised their IQ scores,” and that operational thinking has been central to this modernization, is a strong one.
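The renorming arithmetic behind this point can be made explicit. The sketch below is illustrative only: it assumes the commonly cited Flynn-effect rate of roughly three IQ points per decade (measured rates vary by country, era, and test battery), and the function name is mine, not taken from any cited study.

```python
# Illustrative arithmetic only: the Flynn effect is commonly estimated at
# roughly 3 IQ points per decade; the exact rate is an assumption here.
FLYNN_POINTS_PER_DECADE = 3.0

def implied_score_on_modern_norms(score_then: float, years_ago: float,
                                  rate: float = FLYNN_POINTS_PER_DECADE) -> float:
    """Re-express a historical average IQ score against present-day norms.

    A population averaging `score_then` on the norms of its own day would
    average roughly `score_then - rate * decades` if scored against
    today's (harder) norms.
    """
    return score_then - rate * (years_ago / 10)

# A population averaging 100 on the norms of a century ago lands near 70
# on today's norms -- the range Oesterdiefkhoff cites for pre-modern peoples:
print(implied_score_on_modern_norms(100, 100))  # -> 70.0
```

This is the sense in which "Latin Americans and Arabs today do have higher IQ scores than Europeans had 100 years ago": the comparison is across norming generations, not within one.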

Formal Reasoning is not a Cultural Universal

The stage of formal operations cannot be said to be a biologically primary ability that humans inherit genetically. Its skills are biologically secondary abilities requiring a particular psycho-cultural context. Formal thinking came to be assimilated by other nations (most successfully in east Asian nations with a high average IQ, but far less so in sub-Saharan nations where to this day witchcraft prevails [10]). The abilities associated with the first two stages (e.g., control over motor actions, walking, mental representation of external stimuli, verbal communication, ability to manipulate concepts) have been acquired universally by all humans since prehistorical times. These are biologically primary qualities that children across cultures accomplish at the ages and in the sequence more or less predicted by Piaget. They can be said to be universal abilities built into human nature, ready to unfold with little educational socialization, and explainable in the context of Darwinian evolutionary psychology. These cognitive abilities can thus be identified as “cultural universals.”

The concrete-operational abilities of stage three (e.g., the “ability to conserve” or to know that the same quantity of a liquid remains when the liquid is poured into a differently shaped container) are either lacking in primitive cultures or emerge at later ages in children than they do in modern cultures. These cognitive abilities may also be described as biologically primary, as skills that unfold naturally as the child matures in interaction with adult members of the society. In modern societies, all individuals with a primary education acquire concrete operational abilities. The aptitudes of this stage can be reasonably identified as universally present in all agrarian cultures.

This is not the case at all with formal operational skills. The skills associated with this stage (inductive logic, hypothesis testing, reasoning about proportions, combinations, probabilities, and correlations) do not come to humans naturally through socialization. There is abundant evidence that even normally intelligent college students with a long background in education have great difficulties distinguishing between the form and content of a syllogism, as well as other types of formal operational skills. Oesterdiefkhoff acknowledges that

Only when human beings are exposed to forces and stimuli typical of modern socialization and culture do they progress further and develop the adolescent stage of formal operations (2016, 307).

But, again, as critics of Piaget have observed, even in modern societies where children inhabit a rationalized environment and adolescents are taught algebra and a variety of formal operational skills, many students with a reasonable IQ find it difficult to think in this way. According to P. Dasen (1994), only one-third of adults ever reach the formal operational stage. Evolutionary psychologists have thus disagreed with the idea that this stage is bound to unfold among most humans as they get older, as long as they get a reasonably modern education. There are many “sub-stages” within this stage, and the upper sub-stages require a lot of schooling and students with a keen interest and intelligence in this type of reasoning. This lack of universality in the learning of formal operational skills has persuaded evolutionary psychologists to make a distinction between the biologically primary abilities of the first three stages and the biologically secondary abilities of stage four. Formal reasoning is principally a “cultural invention” requiring “tedious repetition and external motivation [11]” for students to master it.

If the ability to engage in formal thinking is so particular, a biologically secondary skill in our modern times, would it not require a very particular explanation to account for the origins of this cognitive stage in an ancient world devoid of a modern education? If the rise of “new humans” with a capacity for formal thinking was responsible for the rise of the modern world, and the existence of a modern education is an indispensable requirement in the attainment of this stage among a limited number of students, how did “new humans” grow out of a pre-modern world with a lower average IQ?

In the second part of this article, it will be argued that Europeans reached stage four long before any other people on the planet because Europeans began an unparalleled intellectual tradition of first-person investigations into their conscious states. This is a type of self-reflection in which European man began to ask who he is, how he knows that his statements are truthful, what the best life is, and whether he is deceiving himself in his beliefs and intentions. This is a form of self-knowledge that was announced in the Delphic motto “know thyself.” It would be an error, however, to describe the beginnings of this self-consciousness as a relation to something in oneself (an I or an ego) from which a predicate, or an outside to which the subject relates, is derived. The first-person consciousness of Europeans did not emerge outside the being-in-the-world of the aristocratic community of Indo-Europeans. Europeans began a quest for rationally justified truths, for objective standards of justification, and for the realization of the good life in a reflective self-relation, coupled with socially justified reasons about what is morally appropriate.

References

Brown, Donald (1991). Human Universals. Philadelphia: Temple University Press.

Dasen, P. (1994). “Culture and cognitive development from a Piagetian perspective.” In W. J. Lonner & R. S. Malpass (Eds.), Psychology and culture. Boston: Allyn and Bacon.

Genovese, Jeremy (2003). “Piaget, Pedagogy, and Evolutionary Psychology.” Evolutionary Psychology, Volume 1: 127-137.

LePan, Donald. (1989). The Cognitive Revolution in Western Culture. London: Macmillan Press.

Lévy-Bruhl, Lucien (2018). Primitive Mentality [1923]. Forgotten Books.

Oesterdiekhoff, Georg W. (2014a). “The rise of modern, industrial society. The cognitive developmental approach as key to disclose the most fascinating riddle in history.” The Mankind Quarterly, 54, 3/4, 262-312.

Oesterdiekhoff, Georg W. (2016). “Child and Ancient Man: How to Define Their Commonalities and Differences.” The American Journal of Psychology, Vol. 129, No. 3, pp. 295-312.

Oesterdiekhoff, Georg W. (2012). “Was pre-modern man a child? The quintessence of the psychometric and developmental approaches.” Intelligence 40: 470-478.

Oesterdiekhoff, Georg W (2014b). “Can Childlike Humans Build and Maintain a Modern Industrial Society?” The Mankind Quarterly 54, 3/4, 371-385.

Oesterdiekhoff, Georg W (2015). “Evolution of Democracy. Psychological Stages and Political Developments in World History” Cultura: International Journal of Philosophy of Culture and Axiology 12 (2): 81-102.

Sternberg, Robert J. (2003). Cognitive Psychology. Third Edition. Nelson Thomson Learning.

This article was reproduced from the Council of European Canadians [12] Website.

Article printed from Counter-Currents Publishing: https://www.counter-currents.com

URL to article: https://www.counter-currents.com/2018/10/jean-piaget-the-superior-psychogenetic-cognition-of-europeans-1/

URLs in this post:

[1] Image: https://www.counter-currents.com/wp-content/uploads/2018/10/10-17-18-2.png

[2] here: https://www.counter-currents.com/2018/10/jean-piaget-the-superior-psychogenetic-cognition-of-europeans-part-ii/

[3] Image: https://www.counter-currents.com/wp-content/uploads/2018/10/10-17-18-3.jpg

[4] Forgotten Books: https://www.amazon.ca/Primitive-Mentality-Classic-Reprint-Levy-Bruhl/dp/0282635432

[5] Cognitive Psychology: https://www.amazon.com/Cognitive-Psychology-Robert-J-Sternberg/dp/1305644654/ref=pd_lpo_sbs_14_t_0?_encoding=UTF8&psc=1&refRID=NR395X3ESKS7MW1FY1TV

[6] human universals: http://www.humiliationstudies.org/documents/BrownUniversalsDaedalus.pdf

[9] Flynn effect: https://en.wikipedia.org/wiki/Flynn_effect

[10] sub-Saharan nations where to this day witchcraft prevails: https://www.rt.com/news/435905-malawi-bbc-child-killings/

[11] tedious repetition and external motivation: https://www.psychology-lexicon.com/cms/glossary/35-glossary-b/4346-biologically-secondary-abilities.html

[12] Council of European Canadians: https://www.eurocanadian.ca/2018/10/jean-piaget-superior-psychogenetic-cognition-europeans-part-one.html

Why did the West rise to become the most powerful civilization, the progenitor of modernity, the culture with the most prodigious creators? The answers are plenty. But it may be that a child psychologist, Jean Piaget, has offered the best theoretical framework to explain the difference between the West and the Rest. Part II of this article continues the examination of George Oesterdiefkhoff’s application and elaboration of Piagetian theory in his ranking of the cognitive development of the peoples of the world. It praises the fundamental insights of this elaboration while arguing that the psychogenetic superiority of European children should be traced back to the appearance of new humans in ancient Greek times who started to realize that their consciousness is the highest point on which all else depends.

Oesterdiefkhoff on the Origins of Western Operational Thinking

Why did Europeans reach the fourth stage of formal operations long before any other peoples in the world? When pressed (in an exchange) about the causes of the emergence of stage four, George Oesterdiefkhoff responded that

schooling and other cultural factors must have been more elaborated in early modern Europe than in Asia, antiquity, and medieval times. The trigger to arouse the evolution of formal operations would have been especially the systems of education (2014b, 376).

He then added:

Admittedly, this begs the question about the causes of this alleged fact and necessitates yet another level of causal explanation . . . I rather prefer cultural explanations and think about the possible relevance of the advantages of the Greek/Roman alphabet or Aristotelian logic, phenomena fostering the use of abstraction and logic (2014b, 376).

But this is as far as Oesterdiefkhoff goes in explaining why the ancient Greeks reached the fourth stage first. He prefers, rather, to jump right into the modern era, the seventeenth century, as the century in which formal operational thinking really emerged, from which point he then identifies “the rise of formal operations, the cognitive maturation of people” (in itself) as the “cause” of the rise of modern Europe. He insists that his Piagetian theory “is crucially a causal theory of modernity” (2014b, 375). But no explanation is provided as to the original causes of the rise of formal operational thinking.

If Oesterdiefkhoff’s point is that, without a population in which the children have ontogenetically developed a capacity for formal operations, you can’t have adults engage in formal operational thinking, then I agree that this ontogenetic development is a precondition for a modern society. But we still need an explanation for the rise of “new humans” (to use his own words) capable of formal operational thinking. Does he mean that the Greek/Roman alphabet and Aristotelian logic already contained the seeds of formal reasoning? The alphabet is indeed the most abstract symbolic system of writing in which both consonants and vowels are represented. Can it be denied that Aristotle’s theory of the syllogism is at the level of stage four, considering that this theory teaches that we can abstract altogether from the concrete content of an argument and judge its merits solely in terms of how the terms are formally or logically connected to each other?
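Aristotle's separation of form from content can be made concrete with a toy computation (my own illustration, not a reconstruction of any historical procedure): the syllogism in Barbara, “All M are P; all S are M; therefore all S are P,” comes out valid for every possible assignment of content to its terms.

```python
from itertools import product

# Brute-force check: validity of the form "All M are P; all S are M;
# therefore all S are P" over every assignment of subsets of a small
# universe to the terms S, M, P -- the content of the terms never matters.
UNIVERSE = {0, 1, 2}
SUBSETS = [frozenset(x for x in UNIVERSE if mask >> x & 1) for mask in range(8)]

def barbara_holds(S, M, P):
    """True unless the premises hold while the conclusion fails."""
    premises = M <= P and S <= M   # all M are P; all S are M
    conclusion = S <= P            # all S are P
    return conclusion if premises else True

# Valid for every choice of content, exactly as formal logic predicts:
print(all(barbara_holds(S, M, P) for S, M, P in product(SUBSETS, repeat=3)))  # -> True
```

The same brute-force check exposes counterexamples for invalid forms such as the undistributed middle (“All P are M; all S are M; therefore all S are P”), which is exactly the form-versus-content distinction at issue.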

Oesterdiefkhoff says that the Ionian philosophers (in the sixth century BC) were the first to establish the concrete operational stage and, in the same vein, implies that Aristotle’s philosophy did not rise above this concrete stage. “Aristotle’s physics strongly resembles the animistic physics of children aged 10 before they establish the mechanistic world view.” “The formal operational stage comes into being predominantly with Descartes in the 17th century” (2016, 304). We can agree that it “comes into being predominantly” in this century, but if we also agree that this stage has “many sub-stages” (as Oesterdiefkhoff points out), why can’t we identify Aristotle’s extensive writings on logic, induction and deduction, affirmations and contradictions, syllogisms and modalities, definitions and essences, species, genus, differentia, and the categories as the beginnings of stage four?

Oesterdiefkhoff knows he needs some origins, and admits he is caught in a chicken-egg dilemma. He writes about “a positive feedback loop” in the interrelationship between “the knowledge taught in schools and universities” in modern Europe and the rise of formal reasoning. But instead of “finding the causes for the emergence of formal reasoning in Europe some centuries ago,” he prefers to say that the “highest stage, the stage of formal operations, directly accounts for the rise of modern sciences” (2014a, 269). “The rise of formal operations in the Western world after 1700 is the single cause of the rise of the sciences, industrialism, enlightenment, humanism, and democracy” (2014a, 287).


Aristotle with a bust of Homer

This may be understandable since Oesterdiefkhoff is not a historian. He has, in my view, made a fundamental contribution to the “rise of the West” debate, explaining the direct relevance of Piaget’s theory of cognitive development. None of the participants in this debate care to talk about “cognitive development,” but assume (along with the academic establishment) that all humans across all cultures and throughout history (since we became Homo sapiens in Africa) are equally rational.

Oesterdiefkhoff wants to fit Western history within a stage theory of developmental psychology in which ancient/medieval times are clearly demarcated from modern operational stages. He writes of the “child-like stages” of peoples living in the pre-modern world, including Europeans, and says that the cognitive age of pre-modern adults “typically corresponds to that of children” before the age of 10. “Medieval philosophy, be it Platonic or Aristotelic, regarded nature and reality as living things, ruled by God, and other spiritual forces. It had no concept of physical laws.” “[T]he rise of formal operations became a phenomenon of major importance as late as the 17th century.” “The kernel of Enlightenment philosophy is the surpassing of childlike mental states, of the world of fairy tales, magic, and superstition, as it prevailed in the pre-modern world” (2014a, 292-295).

He qualifies this estimation a bit when he writes that “formal operations…evolved in the intellectual elite of early modern Europe and slowly spread to other milieus.” But his pervading message is that it was only during the 1700s, or even “after the 1700s,” that Europeans came to reach the operational stage. There is no reason to disagree if he means that only the 1700s and after saw sufficient numbers of Europeans maturing into the last stage, making possible a full-scale industrial revolution. But we still need an explanation of the origins of “new humans,” the first humans who matured into the fourth stage.

I understand that many will be tempted to point to social and educational forces as the causes of this initial cognitive transition to operational thinking. They will argue that as literacy was mastered, and as institutes of learning were established, and arithmetic, reading, and other subjects were taught, a major shift occurred in human mental activity. This emphasis on the educational environment is a view often attributed to the Soviet psychologist A. R. Luria (1902-1977). From this claim, it takes only one step to the identification of the “social and economic” mode of production as the “underlying” factor of this cognitive revolution, thus combining Piaget and Marx’s historical materialism. The ancient Greeks developed operational thinking in the new milieu of urban life, growing trade in the Mediterranean, and money exchanges. The flaw in this explanation is that not only were all these new economic ways present in greater abundance in the older and larger civilizations of Mesopotamia and elsewhere, but all these commercial and urban activities only required concrete operational habits of thinking.

The view I will propose in a later section, albeit in a suggestive manner, presupposes what I wrote in The Uniqueness of Western Civilization about the aristocratic culture of Indo-Europeans, and in a number of articles at the Council of European Canadians about the masculine preconditions of individualism, the higher fluidity of the Western mind, the multiple intelligences of Europeans, and the bicameral mind. It is that Oesterdiefkhoff underplays the importance of self-consciousness, the awareness humans have of their own identity as knowers, in contradistinction to everything that is not-I, in the development of cognition. Europeans were the first to reach the fourth stage, long before any other people, because they were a new breed of humans who evolved a uniquely high level of self-awareness: an ability to differentiate clearly between their conscious “I” and the physical world, that is, an awareness of their own minds as distinguished from their appetitive drives, the conventions of the time, and the world of invisible spirits. This introspective awareness of the role of the human mind as the active agent of cognition is what allowed Europeans to reach the fourth stage so early in their history.

It is no accident that the main precursor of the modern concept of mind is the ancient Greek notion of nous. Plato’s identification of three distinct parts of the soul, the rational (nous), the appetitive (epithumia), and the spirited (thymos), can be classified as the first psychological contribution in the Western tradition. Both the appetitive and the spirited parts of the soul are about desires, but the appetitive part is about the biologically determined desires humans have for food, sex, and comfort, whereas the spirited part is about “passion,” the emotions associated with the pursuit of honor and glory and with feelings of anger and fear. Plato anticipated the Cartesian dualist separation of mind and body when he argued that the mind was immaterial and immortal, whereas the body was material and mortal. He also understood that the Indo-Europeans were the most “high-spirited” peoples in the world, once observing that “the Thracians and Scythians and northerners generally” were peoples “with a reputation for a high-spirited character” (Francis M. Cornford, trans., The Republic of Plato, 132). Aristotle added to this observation a distinction between the “high-spirited” but barbaric passions of “those who live in Europe” and the “high-spirited” but “intelligent” virtues of the Hellenic peoples. Aristotle further observed that while the peoples of Asia were intelligent, they were “wanting in spirit and therefore they are always in a state of subjection and slavery.”


I trace this high-spirited character to the uniquely aristocratic culture of the Indo-Europeans. One may be tempted to think that the intelligent-rational virtues of the Greeks could manifest themselves only when the rational part of their soul was brought to bear on their strong thymotic drives. Plato, however, was correct in observing that the rational part would be in unending combat with the demands of the appetites were it not for the intervention of the spirited part, the strong sense of aristocratic pride and honor of the Greeks. The spirited part helped reason subdue the appetitive part and, in the same vein, helped reason channel the high-strung energies of the spirited part away from barbaric and chaotic actions into a will-to-knowledge, a courage to break through the unknown, and thus bring forth the first sub-stages of the formal operational stage.

Before I say more about this explanation, I want to outline why I think Europeans, on their own initiative, not just in ancient Greek and Roman times but again in the High Middle Ages, after the Dark Ages (500 AD to 1100 AD), were the developers of formal operational habits of thinking long before any other people were compelled to adopt these habits under Western pressure.

Ancient Greeks were the First “New Humans”

Relying on Piaget’s criterion that the ability to think in a deductive way without handling concrete objects is a necessary component of the formal operational stage, it is hard to deny that the first clear signs of this stage were evident in Greek culture around the fifth century BC. We learn in Reviel Netz’s The Shaping of Deduction in Greek Mathematics: A Study in Cognitive History (1999) that Greek mathematics produced knowledge of general validity, not only about the particular right triangle ABC of the diagram, for example, but about all right triangles. This formal operational trait, this ability to think about numbers in a purely abstract way, is what makes Greek mathematics historically novel in comparison to all preceding “concrete” operational mathematics. This type of reasoning was very exclusive, to be sure, restricted to a small number of Greeks; it has been estimated that there were at most a thousand mathematicians throughout Greek antiquity, a period lasting a full millennium.

How about scientific accomplishments during the Hellenistic era (323-31 BC)? Oesterdiefkhoff seems aware of Hellenistic science when he writes that “Roman intellectuals no longer understood the superior contributions of the Hellenistic scholars” (2014a, 281). Can one say that the cognitive processes of the Hellenistic elite were at a level under the age of 10 after reading Lucio Russo’s The Forgotten Revolution: How Science Was Born in 300 BC and Why It Had to Be Reborn [7]? Can one really say that the institutionalization of scientific research in the Museum and Library at Alexandria, which contained more than five hundred thousand papyrus rolls and funded one hundred scientists and literary scholars, was not an educational establishment promoting formal operational thinking? We learn from Russo’s book about the conics of Apollonius and the invention of trigonometry by Hipparchus, about Archimedes’s work on hydrostatics and the mechanics of pulleys and levers, the first formal science of weight, about Aristarchus’ heliocentric proposal, and about Eratosthenes and his calculations to determine the circumference of the Earth. The hypothetico-deductive form of Euclid’s Elements is undeniable; it is the way in which circles, right angles, and parallel lines are explicitly defined in terms of a few fundamental abstract entities, such as points, lines, and planes, on the basis of which many other propositions (theorems) are deduced. (Newton, by the way, was still using Euclidean proofs in his Principia). While the Romans did not make major contributions in mathematics and theoretical science, it should be noted that Claudius Ptolemy, while living under Roman rule in Alexandria in the second century AD, wrote highly technical manuals on astronomy and cartography. The Almagest, which postulates a geocentric model, employs pure geometric concepts combined with very accurate observations of the orbits of the planets. 
It postulates epicycles, eccentric circles, and equant points, with the latter being imaginary points in space from which uniform circular motion is measured. Attention should be paid to the “formal-rational” codification and classification of Roman civil law into four main divisions: the law of inheritance, the law of persons, the law of things, and the law of obligations, with each of these subdivided into a variety of kinds of laws, with rational methods specifying how to arrive at the formulation of particular rules. The rules upon which legal decisions were based came to be presented in categories headed by definitions. The most general rules within each of these categories were the principles from which more specific rules were derived. This ordering was in line with a formal operational mode of reasoning, for the rules were presented without reference to the factual settings in which they were developed, and the terminology used in these rules was abstract.

This effort at a rationally consistent system of law was refined and developed through the first centuries AD, culminating in what is known as the Code of Justinian, compiled during his reign (527 to 565 AD), which served as the foundation of the “Papal Revolution” of the years 1050-1150, associated with the rise of Canon Law. By securing the Church’s corporate autonomy and its right to exercise legal authority within its own domain, by analyzing and synthesizing all authoritative statements concerning the nature of law, the various sources of law, and the definitions of and relationships between different kinds of laws, and by encouraging whole new types of laws, this Papal Revolution created a modern legal system.

Medieval Europeans

Oesterdiekhoff acknowledges in passing that ancient Greece saw “seminal forms of democracy . . . for a certain period,” a form of state which actually entails, in his view, the fourth stage of cognitive development. If Greek democracy was short-lived, what about the republican form of government during ancient Roman times [8], and the impact this form of government had on the modern Constitution of the United States [9]? And what of the representative parliaments and estates of medieval Europe [10]? To be sure, ancient Greece and Rome, and the Middle Ages, were far from the formal operational attainment of modern Europe (even if we draw attention to the continuation of witchcraft and magic in Enlightenment Europe [11]).


It is telling, however, that according to Charles Radding’s book, A World Made by Men: Cognition and Society, 400-1200 (1985), new lines of formal operational reasoning were “well established by 1100” in some European circles. I say “telling” because this book (one of only two) directly employs Piaget’s theory to make sense of Europe’s intellectual history. Oesterdiekhoff references this work without paying attention to its argument. From a Europe that employed ordeals of boiling water and glowing iron to decide innocence and guilt, that “looked for direction” to divinely inspired pronouncements from superiors, kings, abbots, or the ancients, and that was rarely concerned with “human intention,” we see after 1100 a growing number of theologians insisting that humans must employ their God-given reasoning powers to determine the truth. Whereas theological disputes before 1100 were settled “by citing authority,” “it was even increasingly the case [after 1100] that the very authority of a text’s author might be denied or disregarded” (p. 204). Using “one’s own judgment” was encouraged, combined with the study of logic as “the science of distinguishing true and false arguments.”

Although Radding is not definitive and barely elaborates key points, he understands that this increase in logical cognition entailed a new awareness of the distinction “between the knower and what is known,” between the I and the not-I. Medievalists actually went ahead of the ancient Greeks. For Plato, an idea existed and was correct if its origins were outside the mind, in the world of immaterial and perfect forms, which he differentiated from the untrue world of physical things. Perfect ideas were independent of the human mind, outside space and time, immutable. These ideas were not the products of human cognition. While the only way the human mind could apprehend these ideas was through intense training in geometrical (formal) reasoning, the aim was to reach a world of godlike forms to which the human mind was subservient.

While Aristotle transformed Plato’s forms into the “essences” of individual things, he believed that universal words existed in individual objects, or that abstract concepts could be equated with the essences of things. It is not that Aristotle did not perceive any dividing line between the supernatural, the world of dreams, and the natural; it is that he was a conceptual realist who believed that the contents of consciousness really existed as the essences of particular objects. Medieval nominalists showed a deeper grasp of the relationships between the mind and the external world by abandoning the notion that Forms (or ideas) represent true reality, the source of the mind’s ideas, and arguing instead that general concepts are mere names, neither the essences of things nor forms standing outside the material world. Only particular objects existed, and the role of cognition was to make true statements about the world of particular things even though ideas are not things but mental tools originated by men.

Nominalism represented a higher level of awareness of the role the human mind plays in cognition and of the distinction between the knower and the world outside. While Plato distinguished reason from the world of sensory phenomena, including natural desires, and, in so doing, identified the faculty of reasoning in its own right, he viewed human (intellectual) activity as dependent or subservient to a world of perfect and purely immaterial forms existing independently of the mind. Moreover, among medieval philosophers we find (in Peter Abelard, for example) a greater emphasis on intention, the view that the intention of humans should be considered in determining the moral worth of an action. Human action should not be attributed to supernatural powers or evil forces entering into human bodies and directing it. Humans have a capacity to think through different courses of actions, and for this reason human actions cannot be understood without a consideration of human intentions.

Radding brings up the emerging “idea of nature as a system of necessary forces” in opposition to the early medieval idea of miraculous events, as well as the “treatment of velocity itself as a quantity . . . comparing motion that follows differently shaped paths,” in the work of Gerard of Brussels in the early 1200s (p. 249). A better example of formal operational concepts would be Nicole Oresme’s (1320-1382) depiction of uniformly accelerated motion, which was not about motion in the real world but an effort to explain how motion increases uniformly over time in a totally abstract way. This view anticipated Galileo’s law of falling bodies. Among the other examples Radding brings in to elucidate this medieval shift to formal operational thinking are the observation that by the reign of Henry II (1133-1189) the idea had taken root that consultation of members of the upper classes should be the norm in the workings of the monarchy, as well as the legal idea that mental competence should be a prerequisite in judging criminal behavior.

The Birth of Expectation in the Early Modern Era


Don LePan’s book, The Cognitive Revolution in Western Culture (1989), agrees with Radding that “there is considerable evidence of at least the beginnings of changes in the cognitive processes occurring among the educated elite in the twelfth century” (p. 45). But he believes that new cognitive processes began to spread in the early Modern period (or the Renaissance) when Europeans developed the capacity of “expectancy,” which he defines as “the ability to form specific notions as to what is likely to happen in a given situation” (p. 75). It is around this sense of expectancy, LePan says, that most of the cognitive processes Piaget identifies with the fourth stage are clearly evident. This sense of expectancy involves a “rational assessment of probabilities,” evaluating “disparate pieces of information” within a chain of events and circumstances as to whether something is likely to transpire in the future or not, drawing inferences from this information, and projecting “these inferences into the hypothetical realm of the future” (pp. 74-75). Before this capacity developed, the sense of future expectation that humans had was of a predetermined sort, or accidental and beyond reason, in which an outcome was believed to happen “regardless of the intervening chain of events” (p. 79) and without an objective assessment of human intentions and events about how the future event will likely happen.

This sense of expectancy involved the emergence of an ability to think in terms of abstract universal time, as contrasted to the commonly held notion of pre-modern peoples that “time moves at variable speeds, depending on the nature and quality of the events”. Among primitives, the recounting of past events, or history, is merely an aggregation of disconnected anecdotes without any sense of chronology and causal relationship and no grammatical distinction between words referring to past events or to present events. The past is conceived similarly to the present. While early Christian historians did have a sense of chronology, a universal history where all events were framed within a temporal sequence, they did not have a framework of abstract and objective time. They were more interested in detecting the plan of God rather than in how humans with intentions made their own history.

Because pre-modern peoples lack a framework of abstract and objective time, the “when” of an event is merely about before or after other events and not about the length of time elapsed between it and other events. Pre-modern peoples are also incapable of distinguishing between travelling the same distance and travelling at the same speed. They lack the habit of thinking of velocity as a quantity distinct from those of distance and time. Without a temporal conception wherein one can think of causes as anterior to the effect, it is not possible to consider historical events in terms of causal relations within a sequence of past, present, and future events.

For these reasons, pre-modern peoples were unable to think in terms of expectations of a hypothetical future, that is, to think about what will happen in the future in terms of multiple chains of causation and the ways in which these causes, sometimes happening simultaneously in different places, may bring about a future effect. LePan is particularly keen in showing that William Shakespeare’s originality was a result of his ability to create complex plots which gave the audience “a continual sense of anticipation . . . by drawing them into [an] unfolding pattern of connections with the past and the future of the story” (p. 175). The curiosity of a pre-modern audience is restricted to what will happen next within a sequence of episodes in which the reader or audience is confident about what is likely to happen, or what the final outcome will be, and in which there is, therefore, no sense of expectation whether it will happen, no concern to envisage the hypothetical possibilities of situations, no weighing of causes and intentions against each other, and no judgment of what the probable outcome of the future will be.


As to what brought this new sense of expectation and the spread of the habits of thought associated with stage four, LePan is inclined to follow A. R. Luria’s argument that the causes of cognitive change are due to social and educational factors. He is rather vague; as society changes, literacy is mastered, the level of education increases, and the cognitive processes change. Which came first, new cognitive processes, new ways of educating children, or new “underlying economic changes”? They “reinforced each other.” LePan carefully distances himself from any claim that Europeans were genetically wired for higher levels of cognition. Even though he rejects the establishment idea that “all peoples think with exactly the same thought processes,” he believes that all humans are equally capable of reaching this stage. Without realizing that Piaget laid the groundwork for Kohlberg’s moral stages, he insists there is no “direct correlation between degrees of rationality and degrees of moral goodness” (p. 15). The book ends with a strange “postscript” about how he has been living with his wife in rural Zimbabwe for the last two years. He says he wishes the primitive and the modern mind could co-exist with each other, praises the cultural “vitality” of this African country, and then concludes with the expectation that “if something like a new Shakespeare is to emerge, it will be from the valleys of the Niger or the Zambezi” (p. 307). The subtitle of the first volume of The Cognitive Revolution in Western Culture is The Birth of Expectation. He did not write a second volume.

The uniqueness of the West frightens academics. They have concocted every imaginable explanation to avoid coming to terms with the fact that Europeans could not have produced so many transformations, innovations, renaissances, original thinkers, and the entire modern world, without having superior intellectual powers and superior creative impulses. The tendency for some decades now has been to ignore the cultural achievements of Europeans, minimize them, or reduce the “rise of the West” to one transformation, the Industrial Revolution, currently seen as the only happening that brought about the “great divergence.” The prevailing interpretation paints these achievements as no better than what transpired in any other primitive culture, and, indeed, far worse inasmuch as the West was different only in its imperialistic habits, obsessive impulse for military competition, and genocidal actions against other races.

I agree with Oesterdiekhoff that the faster cognitive maturation of European peoples “is the decisive phenomenon” in need of an explanation if our aim is to explain the rise of modern-scientific society. I will leave aside the question of whether this is the only factor that needs explanation if our aim is to explain other unique attributes of the West, such as the immense cultural creativity of this civilization. Cultural creativity in the arts presupposes a higher level of cognitive development, but it would be one-sided to reduce all forms of creativity to formal operational habits. Once these cognitive habits are established, formal operations can be performed at the highest level of expertise by individuals who are not creative, but who have a high IQ and a very good education. Computers can be programmed to perform formal operations, but it is hard to say that they are self-conscious beings rather than automata unthinkingly executing prescribed actions. Computers do not understand the meaning of the real world for which they are processing information; they are not “aware” of what they are thinking about, and they have no sense of self, and cannot, therefore, examine their own thoughts, exercise free will, and show a spirited character. Obviously, humans who engage in formal operations are not computers. But if we equate the human intellect with formal operational thinking and identify this capacity as the defining trait of modern culture and Western uniqueness, we are endorsing a computational model of human consciousness.

Self-Consciousness is Uniquely European


Oesterdiekhoff and LePan wanted to generate the origins of formal operational habits by positing the prior presence of proto-formal habits, the alphabet, Aristotelian logic, and literacy; but knowing this was a self-referential explanation, they also brought in educational institutions, implying thereby that these institutions were created by proto-formal thinkers who taught children to learn formal operations, still offering a self-referential account. We need to step outside the world of formal operations to understand its origins. Oesterdiekhoff identifies Descartes as the first thinker to offer a systematic methodology for the pursuit of knowledge based strictly on formal operational principles. It is not a coincidence that Descartes is also known as the first modern philosopher in having postulated self-consciousness as the first principle of his formal-deductive philosophy. Descartes showed himself to be very spirited in daring to doubt and repudiate all authority and everything he had been taught, to arrive at the view that the only secure foundation for knowledge was in self-consciousness. The only secure ground for formal operations was his certainty that he was a thinking being, despite doubting everything else. Everything could be subjected to doubt except his awareness that his own mind is the one authority capable of deciding what is true knowledge, not the external senses and not any external authority.

The Cartesian idea that self-consciousness on its own can self-ground itself would be superseded by future thinkers who correctly set about connecting self-consciousness to an intersubjective social context (a social setting I would identify as singularly European, since no other setting could have generated this Cartesian idea). My point now is that Piaget’s fourth stage, in its modern form, would have been impossible without self-consciousness. Descartes did not invent self-consciousness; ancient Greece saw the beginnings of self-conscious new humans; but he did offer its first modern expression, with more sophisticated expressions to follow. It is worth citing Hegel’s treatment of Descartes in his History of Philosophy:

Actually we now first come to the philosophy of the modern world, and we begin this with Descartes. With him we truly enter upon an independent philosophy, which knows that it emerges independently out of reason . . . Here, we may say, we are at home, and like the mariner after a long voyage over the tempestuous seas, we can finally call out, “Land!” . . . In this new period the essential principle is that of thought, which proceeds solely from itself . . . The universal principle is now to grasp the inner sphere as such, and to set aside the claims of dead externality and authority; the latter is to be viewed as out of place here (Hegel’s Lectures on the History of Philosophy, trans. Haldane & Simpson, vol. III, p. 217).

The key idea is that thought proceeds from itself, out of reason, independently of all external authorities. The biological roots of this declaration of independence by the human, thinking subject are to be found in the natural obsession men have shown across all cultures to affirm the male ego in contradistinction to the enveloping womblike environment. This struggle for male identity is only a sexual precondition, and an always-present one, for the subsequent appearance of self-awareness and the first inklings of human individuality. The first cultural signs of individualism are to be found in pre-historical Indo-European societies uniquely ruled by “high spirited” aristocratic men living in a state of permanent mobility and adversity, for whom the highest value in life was honorable struggle to the death for pure prestige. It was out of this struggle by aristocratic men, who were seeking excellence in warfare that would be worthy of recognition from their aristocratic peers, that the separation and freedom of humans from the undifferentiated world of nature and the undifferentiated world of collectivist-despotic societies was fostered.

Cognitive and evolutionary psychologists, and philosophers of the mind, take it for granted that humans as humans are self-conscious beings, aware of themselves as living. “Consciousness is the greatest invention in the history of life; it has allowed life to become aware of itself [15],” said Stephen Jay Gould. This is true if by self-consciousness we mean the awareness humans have of their first-person inner experiences, pain, feelings, and memories. Human beings are constantly trying “to understand, respond to and manipulate the behavior of other human beings,” and in so doing they learn to read other people’s behavior, their feelings, and interests by self-examining their own thoughts and feelings, imagining what it is like to be in the other person’s shoes. This capacity to reflect on one’s states of mind and emotions in order to understand the behavior of others is a biologically-ingrained trait found in all humans, selected by nature. Nicholas Humphrey, in a very insightful short book, The Inner Eye, identifies this capacity as a form of “social intelligence” that evolved with gorillas and chimps. Consciousness was selected by nature because it enhanced the ability of these primates to survive within social settings characterized by “endless small disputes about social dominance, about who grooms who, about who should have first access to a favourite food, or sleep in the best site” (p. 37). In dealing with these issues, primates “have to think, remember, calculate, and weigh things up inside their heads” (p. 39). They have to learn to read the brains of other gorillas by looking inside their own brains and imagining what it is like to be in the situation of another gorilla.

This social intelligence is very different from, but just as important as, the technical and natural intelligence required for the acquisition of food and protection in a hostile environment. I am not going to rehearse Steven Mithen’s claim [16], which supplements Nicholas Humphrey’s argument, that consciousness can be said to have emerged not when primates learned to predict the social behavior of other members of the group, but when Homo sapiens during the Upper Paleolithic era managed to achieve enough “cognitive fluidity” between the different intelligences: social, linguistic, technical, and natural. Neither will I rehearse Julian Jaynes’ argument that such advanced peoples as the Mesopotamians and Egyptians were still lacking in self-consciousness, without “an interior self,” subservient to powerful gods controlling and arresting the development of their cognitive processes. In the first part of this article [17] I presented Piaget’s scientifically based argument that pre-modern peoples did have “childlike” minds, which made it very difficult for them to rely on their own reasoning powers and to attain independence from the influence of unknown spirits and age-old mandates accepted without reflection.

I will conclude by asserting that it goes against the entire history of actual cognition and actual intellectual developments, as well as the history of science, mathematics, psychology, physics, and chemistry, to be satisfied with the degree of consciousness found in primates, Upper Paleolithic peoples, and all non-Western civilizations, which never reached the stage of formal operations, and which stagnated intellectually after the Bronze Age, and, in the cases of China and the Islamic world, after about 1300 AD. Europeans reached a higher level of consciousness starting in ancient Greek times with their spirited discovery of the faculty of the mind, and their increasing awareness of their own agency as human beings capable of understanding the workings of the world in terms of self-determined or rationally-validated regularities, coupled with their growing awareness that man was the measure of all things, a subject with a spirited will-to-be-conscious of himself as a free subject who takes himself to be the “highest point” on which all else depends, rather than a mere object of nature and mysterious forces. But this self-consciousness was in its infancy in ancient times, and it would take a consideration of German Idealism during the 1800s to attain a full account of how the (self-conscious) I can be shown to lie at the very basis of all knowledge, and beyond this outlook, to develop a philosophical-historical account that demonstrates a full awareness that this self-conscious I was self-generated only within the particular cultural setting of Western Civilization.

References

Brown, Donald (1991). Human Universals. Philadelphia: Temple University Press.

Dasen, P. (1994). “Culture and cognitive development from a Piagetian perspective.” In W. J. Lonner & R. S. Malpass (eds.), Psychology and Culture. Boston: Allyn and Bacon.

Genovese, Jeremy (2003). “Piaget, Pedagogy, and Evolutionary Psychology.” Evolutionary Psychology, Volume 1: 127-137.

Humphrey, Nicholas (2002). The Inner Eye: Social Intelligence in Evolution. Oxford University Press.

LePan, Donald (1989). The Cognitive Revolution in Western Culture. London: Macmillan Press.

Lévy-Bruhl, Lucien (2018). Primitive Mentality [1923]. Forgotten Books.

Oesterdiekhoff, Georg W. (2012). “Was pre-modern man a child? The quintessence of the psychometric and developmental approaches.” Intelligence 40: 470-478.

Oesterdiekhoff, Georg W. (2014a). “The rise of modern, industrial society. The cognitive developmental approach as key to disclose the most fascinating riddle in history.” The Mankind Quarterly, 54, 3/4: 262-312.

Oesterdiekhoff, Georg W. (2014b). “Can Childlike Humans Build and Maintain a Modern Industrial Society?” The Mankind Quarterly, 54, 3/4: 371-385.

Oesterdiekhoff, Georg W. (2015). “Evolution of Democracy. Psychological Stages and Political Developments in World History.” Cultura: International Journal of Philosophy of Culture and Axiology 12 (2): 81-102.

Oesterdiekhoff, Georg W. (2016). “Child and Ancient Man: How to Define Their Commonalities and Differences.” The American Journal of Psychology, Vol. 129, No. 3: 295-312.

Radding, Charles M. (1985). A World Made by Men: Cognition and Society, 400-1200. The University of North Carolina Press.

Sternberg, Robert (2003). Cognitive Psychology, Third Edition. Nelson Thomson Learning.

This article was reproduced from the Council of European Canadians [18] website.

 

Article printed from Counter-Currents Publishing: https://www.counter-currents.com

URL to article: https://www.counter-currents.com/2018/10/jean-piaget-the-superior-psychogenetic-cognition-of-europeans-part-ii/

URLs in this post:

[1] Image: https://www.counter-currents.com/wp-content/uploads/2018/10/10-18-18-5.jpg

[2] here: https://www.counter-currents.com/2018/10/jean-piaget-the-superior-psychogenetic-cognition-of-europeans-1/

[3] Council of European Canadians: https://www.eurocanadian.ca/

[4] wanting in spirit and therefore they are always in a state of subjection and slavery: https://books.google.ca/books?id=EafUAgAAQBAJ&pg=PA2107&lpg=PA2107&dq=%22wanting+in+spirit+and+therefore+they+are+always+in+a+state+of+subjection+and+slavery%22.&source=bl&ots=iKOJROEM1-&sig=A4gpvXRWbsWKn6Qihf6-yVhgfmQ&hl=en&sa=X&ved=2ahUKEwi5kY3exoDeAhUqUt8KHeLRD0cQ6AEwAnoECAgQAQ#v=onepage&q=%22wanting%20in%20spirit%20and%20therefore%20they%20are%20always%20in%20a%20state%20of%20subjection%20and%20slavery%22.&f=false

[5] Image: https://www.counter-currents.com/wp-content/uploads/2018/10/10-18-18-1.jpg

[6] The Shaping of Deduction in Greek Mathematics: A Study in Cognitive History: https://www.cambridge.org/core/books/shaping-of-deduction-in-greek-mathematics/6801E135656F4401979F431F6FF48A28

[7] The Forgotten Revolution: How Science Was Born in 300 BC and Why It Had to Be Reborn: https://www.springer.com/gp/book/9783540200680

[8] republican form of government during ancient Roman times: https://www.amazon.com/Companion-Democracy-Republic-Blackwell-Companions/dp/1444336010

[9] modern Constitution of the United States: https://books.google.ca/books?id=-NO-DAAAQBAJ&printsec=frontcover&dq=republican+government+of+rome&hl=en&sa=X&ved=0ahUKEwj1toaz-s7dAhXKmeAKHWG3A_YQ6AEINTAC#v=onepage&q=republican%20government%20of%20rome&f=false

[10] representative parliaments and estates of medieval Europe: https://www.amazon.com/Parliaments-Estates-Europe-1789-Myers/dp/0500320330

[11] witchcraft and magic in Enlightenment Europe: http://www.oapen.org/search?identifier=341322

[12] Image: https://www.counter-currents.com/wp-content/uploads/2018/10/10-18-18-2.jpg

[13] Image: https://www.counter-currents.com/wp-content/uploads/2018/10/10-18-18-3.jpg

[14] Image: https://www.counter-currents.com/wp-content/uploads/2018/10/10-18-18-4.jpg

[15] Consciousness is the greatest invention in the history of life; it has allowed life to become aware of itself: https://books.google.ca/books?id=W8G8Oji53XsC&pg=PA34&lpg=PA34&dq=%22Consciousness+is+the+greatest+invention+in+the+history+of+life;+it+has+allowed+life+to+become+aware+of+itself%22&source=bl&ots=ywNbrywKqr&sig=6uHrLKln45n4G-LrisQgW9S7hhM&hl=en&sa=X&ved=2ahUKEwjcmZSnyoDeAhVhiOAKHek5B3QQ6AEwAXoECAAQAQ#v=onepage&q=%22Consciousness%20is%20the%20greatest%20invention%20in%20the%20history%20of%20life%3B%20it%20has%20allowed%20life%20to%20become%20aware%20of%20itself%22&f=false

[16] Steven Mithen’s additional claim: https://www.eurocanadian.ca/2018/07/the-higher-cognitive-fluidity-of-white-origins-consciousness.html

[17] the first part of this article: https://www.eurocanadian.ca/2018/10/jean-piaget-superior-psychogenetic-cognition-europeans-part-one.html

 

Wednesday, September 19, 2018

Why Is Intelligence Declining?

Original article by Lance Welton, published August 21, 2017 on the Unz Review
Translated by the blog http://versouvaton.blogspot.fr

Our rulers don’t want you to know.

It is a sure sign that a piece of “controversial science” is in fact empirically accurate when our cultural-Marxist elite tries desperately to find an environmental explanation for it, however improbable. It means the evidence is so overwhelming that it can no longer be denied, so it must somehow be due to human behavior, and in particular to the behavior of privileged humans. If it is due to genetics, then environmental determinism makes no sense; and that is unfair, unthinkably unfair. This is what happened when the evidence for racial differences in IQ became undeniable. And now it is happening with the evidence that, on average, we are becoming less intelligent.


As evidence accumulated that the IQ of Black Americans is about one standard deviation below that of Whites, laughable attempts were made to find an environmental explanation. “IQ tests are unfair to Blacks,” it was insisted, until it was shown that Blacks do better on the less culturally loaded parts of the test, that East Asians outperform Whites, and that the racial differences lie in general intelligence, which is strongly genetic. The next explanation was: “It’s about Black poverty,” until it was shown that poverty is strongly genetic and that Black children adopted by wealthy White couples ended up with roughly the same IQ as their biological parents [Race Differences in Intelligence, by Richard Lynn, 2015, Washington Summit]. Funniest of all, finally, was “stereotype threat”: Blacks do worse on IQ tests because they are victims of stereotypes working against them. But it was then shown that in some cases the reverse is true, that the evidence is highly inconsistent, and that the field suffers from an astonishing publication bias (“An examination of stereotype threat effects on girls’ mathematics performance,” by J. Ganley and others, Developmental Psychology, 2013).

The plague of environmental desperation has now reached a substantial body of evidence that we are becoming less intelligent. Known as the "Woodley effect," after the British psychologist Michael Woodley of Menie, what has happened is now very clear. On numerous robust correlates of intelligence: reaction times, the most genetic parts of IQ tests (which also best measure intelligence), colour discrimination, the use of difficult words, the ability to count backwards, per-capita rates of genius and of major innovations, levels of creativity, and the population frequency of alleles associated with very high educational attainment and high intelligence, Westerners are becoming ever more stupid. Based on slowing reaction times, we lost an average of 15 IQ points between 1900 and 2000: the difference between a primary-school teacher and a university professor [At Our Wits' End: Why We're Becoming Less Intelligent and What It Means for the Future, by Edward Dutton & Michael Woodley of Menie, Imprint Academic, 2018].

The reason proposed by Woodley of Menie and his team is simple. Indeed, it was conjectured as early as the nineteenth century, when figures such as Charles Darwin predicted that there would be "dysgenic effects."

Darwin wrote, as quoted by Lynn:
"We civilised men do our utmost to check the process of elimination; we build asylums for the imbecile, the maimed, and the sick; we institute poor-laws; and our medical men exert their utmost skill to save the life of every one to the last moment. Thus the weak members of civilised societies propagate their kind. No one ... will doubt that this must be highly injurious to the race of man." The Descent of Man, 1871

Until the Industrial Revolution, selection conditions were close to optimal. This meant that the children of the rich, with their superior conditions, were more likely to survive. Indeed, seventeenth-century English wills show that the richer 50% of testators left nearly twice as many surviving children as the poorer 50%. Wealth and educational attainment are predicted by intelligence, which is roughly 80% genetic, so we became steadily more intelligent right up to the breakthrough of genius that was the Industrial Revolution.

This reduced the harshness of the environment for the poor and led to the innovation of contraception, which was used most effectively by the less impulsive, and therefore the more intelligent. Large families became an accident born of stupidity, and intelligence began to decline. The situation has been aggravated by dysgenic immigration from the Third World, by more intelligent women limiting their fertility to pursue careers, and by a lax welfare state that offers incentives, in the form of child benefits, for low-IQ women to reproduce [At Our Wits' End, Dutton & Woodley of Menie].

But spectres haunt this simple, evidence-based theory. One of them is a Welsh scientist, Barbara Demeneix (née Jenkins), who runs a laboratory at the Muséum d'histoire naturelle in Paris. As long as she tells the mainstream media what they want to hear, she has no trouble spreading her message, for example in her book Losing Our Minds: How Environmental Pollution Impairs Human Intelligence and Mental Health, published in 2014 by Oxford University Press, in its successor Toxic Cocktail: How Chemical Pollution Is Poisoning Our Brains [2017], or in a 2017 documentary on the French channel Arte about the IQ decline [Are people becoming dumber?, Sputnik News, 10 December 2017], which interviewed a proponent of the dysgenic theory, cut everything he had to say on the subject, and reported only Demeneix's point of view.

Essentially, Demeneix claims that the decline in intelligence is caused by neurotoxins found in plastics and other industrial materials to which people, and therefore unborn babies, are increasingly exposed. These supposedly act as "endocrine disruptors" that alter gene expression and thereby reduce intelligence.

Woodley and his team built a computer model to test the neurotoxin theory against their own. They traced how far neurotoxin exposure had increased over time and compared it with the influence of immigration and dysgenic fertility over the same period. The latter factors fitted the model of how intelligence has declined over time; accumulated neurotoxins did not fit it at all. In other words, from a scientific point of view, the neurotoxin theory is rather... toxic.
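The model-comparison logic described here can be sketched generically. Everything below is synthetic and purely illustrative (invented trend lines and parameter values, not the study's data); it only shows the general technique of regressing an observed decline on two candidate explanatory series and comparing goodness of fit:

```python
import numpy as np

def r_squared(predictor: np.ndarray, observed: np.ndarray) -> float:
    """Fit observed = a*predictor + b by least squares and return R^2."""
    a, b = np.polyfit(predictor, observed, 1)
    residuals = observed - (a * predictor + b)
    return 1 - residuals.var() / observed.var()

years = np.arange(1900, 2001, 10)
# Hypothetical observed decline: a steady downward trend plus noise.
rng = np.random.default_rng(0)
observed_iq = 100 - 0.15 * (years - 1900) + rng.normal(0, 0.5, years.size)

# Two hypothetical candidate explanations with different time profiles:
dysgenic_index = 0.15 * (years - 1900)                   # rises across the whole century
toxin_index = np.maximum(0, years - 1950).astype(float)  # rises only after 1950

print(f"dysgenic fit R^2: {r_squared(dysgenic_index, observed_iq):.3f}")
print(f"toxin fit R^2:    {r_squared(toxin_index, observed_iq):.3f}")
```

Under these invented inputs, the candidate series whose time profile matches the observed trend fits far better; the real study's conclusion depends, of course, on the actual historical series used.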

The other spectre: a pair of Norwegians who argue that the entire IQ decline reported in the meta-analyses can be explained by environmental factors, because younger Norwegian conscript brothers (conscripts being the basis of these tests) tend to score lower than their older brothers [The Flynn Effect and its Reversal Are Environmentally Caused, by Bernt Bratsberg & Ole Rogeberg, PNAS, 2018].

This method suffers from obvious problems. The meta-analyses of the IQ decline provided only estimates, meaning the decline could be considerably larger. Moreover, the Norwegians did not test whether the decline concerns the genetic aspects of intelligence, which has proved to be the case in countries where that information is available [The Negative Flynn Effect: A Systematic Literature Review, by Edward Dutton and others, Intelligence, 2016]. Real intergenerational decreases in the frequency of alleles associated with very high educational attainment (and thus high IQ) have been found, for example, in Iceland [Selection against variants in the genome associated with educational attainment, by Augustine Kong and others, PNAS, 2017]. Accordingly, the IQ decline is almost certainly linked to genetic aspects in Norway too, which means their result is a kind of false positive, produced by the researchers imposing numerous complex and abstruse controls on their data.

As the evidence for declining intelligence becomes more widely known, we can expect increasingly desperate and elaborate attempts to persuade people that it is all environmental, and that the decline of civilisation can be halted if only we stop using smartphones, drinking alcohol, eating meat, and above all stop thinking bad thoughts...

Anything at all to avoid admitting that some people are genetically more intelligent than others, that these people are remaining childless while less intelligent natives are encouraged to breed, and that people from low-IQ countries are being imported to do the same.

Lance Welton is a pen name for a freelance journalist living in New York.

Translator's note

This very sensitive subject should be balanced against the fact that there are different forms of intelligence. The one discussed in this article is the kind privileged by the construction of our technologistic society, designed by the West for Westerners several centuries ago, at a time when multiculturalism existed only in the brains of a few "philosophers."