Why Intelligence Is Not Enough: Homeostasis, Affect, and the Biological Ground of Consciousness


GÜLTEKİN A.

Scientific Culture, vol. 12, no. 4, pp. 6377-6382, 2026 (Scopus)

Abstract

Recent advances in artificial intelligence have intensified a long-standing philosophical temptation: the identification of intelligent performance with conscious existence. Systems capable of sophisticated reasoning, linguistic fluency, and adaptive learning are increasingly seen as challenging the traditional boundaries between artificial cognition and human mentality. This article resists this conflation by arguing that intelligence and consciousness belong to fundamentally distinct ontological categories. Intelligence concerns functional competence and behavioral success; consciousness concerns subjective experience structured by affect, vulnerability, and intrinsic normativity. Drawing on Antonio Damasio's biologically grounded account of the mind, the article defends the thesis that consciousness is not an emergent property of information processing but a regulatory achievement of living systems engaged in homeostatic self-maintenance. Consciousness, on this view, arises from the organism's ongoing effort to preserve its own existence and is inseparable from affective valuation: what it is like to be conscious cannot be divorced from the fact that things can genuinely go better or worse for the subject who experiences them. The argument is developed through critical engagement with dominant positions in contemporary philosophy of mind. Functionalist approaches, as exemplified by Dennett, explain intelligent behavior at the cost of neutralizing normativity. Property-dualist accounts, such as Chalmers's, preserve phenomenality while detaching it from biological explanation, thereby rendering the systematic link between consciousness and life explanatorily inert. Theories of embodied and extended cognition expand the scope of cognition without adequately accounting for the emergence of subjectivity. In contrast to these alternatives, Damasio's framework offers a naturalistic yet normatively robust account of consciousness, grounding phenomenality in biological vulnerability rather than in computational complexity. Extending this framework to artificial intelligence, the article argues that artificial systems, lacking genuine homeostasis, mortality, and existential stakes, cannot instantiate consciousness regardless of their level of performance. This limitation is ontological rather than technological: consciousness does not scale with intelligence but arises from the precariousness of life. The article concludes by defending ontological sobriety in debates about artificial minds, emphasizing that recognizing the limits of design is a condition of conceptual clarity rather than a failure of philosophical imagination.