From Tool to Partner: The Social Construction and Educational Implications of Students’ Dependence on AI Chatbots


1. Introduction

In the age of generative artificial intelligence (AI), tools such as ChatGPT are no longer confined to the periphery of educational practice; they have entered the very core of students’ learning experiences. For many, interacting with AI is no longer simply about retrieving information but about engaging in dialogues that simulate the presence of a teacher, peer, or mentor. Such transformations raise profound questions about whether students are merely using AI or becoming progressively dependent on it.

The metaphorical shift “from tool to partner” signals a crucial social and pedagogical transition. Students not only rely on AI for knowledge production and task completion, but they also imbue it with social and relational significance. This article, grounded in a social and critical perspective, investigates how students construct AI as a partner in learning, the implications of such dependence for educational practice, and the tensions this creates for autonomy, critical thinking, and institutional norms.

2. Research Background and Literature Review

2.1 Beyond the “Tool Paradigm”

Early perspectives in educational technology framed computers and digital tools as instrumental aids to human learning (Clark, 1994). This paradigm emphasized efficiency, access, and support. With the arrival of large language models (LLMs) such as OpenAI’s GPT series, however, the framing has become insufficient. These models do not merely provide answers; they engage in interaction, simulate reasoning, and adapt to context. Consequently, students’ perceptions shift from seeing AI as a passive device toward understanding it as an interactive partner.

2.2 Social Construction of Technology

Berger and Luckmann’s (1966) theory of the social construction of reality posits that meanings are not inherent in tools but are produced through social practices. Applied to AI in education, this means students’ narratives, discourses, and interactions define whether ChatGPT is categorized as “just a calculator” or “a thinking partner.” Emerging ethnographic studies (Zawacki-Richter et al., 2023) reveal that students’ repeated engagements with chatbots construct norms of partnership, dependence, and trust.

2.3 Critical Pedagogy and Dependence

Freire’s (1970) critical pedagogy warns against forms of education that position learners as passive recipients rather than active creators of knowledge. The “banking model of education” is reconfigured in the AI era: if students increasingly rely on GPT for interpretation, synthesis, and even creativity, the risk of “delegated cognition” arises. Critical perspectives urge educators to question whether AI is liberating learners or inadvertently reproducing dependency.

2.4 Human–AI Partnership in Learning

The human–AI partnership literature emphasizes augmentation, not replacement (Luckin, 2018). Yet, the line between augmentation and dependence is fragile. Studies show that students frequently attribute human qualities—such as empathy, guidance, and companionship—to AI (Lee, 2022). This anthropomorphization accelerates trust and emotional reliance. From a critical standpoint, this raises concerns about epistemic authority: who, or what, is now responsible for “truth” in learning?

2.5 Research Gap

Most current scholarship relies on surveys or controlled experiments rather than longitudinal or field-based evidence of real-world student–AI interactions. Few studies examine how students discursively position AI as a partner and what educational consequences follow. The present paper addresses that gap through a field study of authentic student–GPT dialogues.

3. Research Questions

This study is guided by the following questions:

  1. How do students reconstruct their relationship with GPT, moving from tool-oriented toward partner-oriented interaction?
    This question interrogates the discursive strategies and linguistic markers by which students anthropomorphize or humanize AI.

  2. What forms of dependence emerge from these interactions?
    Here we examine whether dependence is primarily cognitive (outsourcing knowledge), affective (seeking emotional reassurance), or procedural (structuring academic tasks).

  3. What are the educational implications of such dependence?
    This includes impacts on autonomy, critical thinking, assessment validity, and broader social relations within educational institutions.

The rationale for these questions rests on bridging descriptive inquiry (what is happening in practice) with critical reflection (why it matters and what it implies for pedagogy and society).

4. Research Methodology

4.1 Research Design

This study adopts a qualitative field research design. By analyzing authentic student–GPT dialogues in naturalistic contexts (assignments, study sessions, and peer group work), the research aims to uncover discursive practices that signify dependence.

4.2 Data Collection

  • Dialogue Logs: 200 anonymized student–GPT interactions collected over one semester in a university setting (an illustrative redaction pass is sketched after this list).

  • Interviews: Semi-structured interviews with 30 students exploring perceptions of GPT as tool, partner, or mentor.

  • Observations: Classroom observations of GPT-mediated learning tasks.
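
The study does not describe its anonymization procedure. As a minimal sketch, assuming the dialogue logs are plain-text transcripts, a pattern-based redaction pass such as the following could strip obvious identifiers before analysis; every pattern, placeholder, and name here is illustrative rather than drawn from the study:

```python
import re

# Illustrative redaction patterns; a real pipeline would be reviewed manually
# and extended with an institution-specific roster of names and ID formats.
PATTERNS = {
    r"[\w.+-]+@[\w-]+\.[\w.]+": "[EMAIL]",                  # email addresses
    r"\b\d{7,10}\b": "[STUDENT_ID]",                        # assumed ID format
    r"\bProf(essor)?\.?\s+[A-Z][a-z]+\b": "[INSTRUCTOR]",   # instructor mentions
}

def anonymize(transcript: str) -> str:
    """Replace likely identifiers in a dialogue transcript with placeholders."""
    for pattern, placeholder in PATTERNS.items():
        transcript = re.sub(pattern, placeholder, transcript)
    return transcript

print(anonymize("Email me at jane.doe@uni.edu, says Prof. Smith (ID 20231234)."))
# -> "Email me at [EMAIL], says [INSTRUCTOR] (ID [STUDENT_ID])."
```

Pattern-based redaction is only a first pass; transcripts would still need manual review, since identifiers such as nicknames or course codes rarely follow a fixed pattern.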

4.3 Data Analysis

  • Discourse Analysis: Examining how students linguistically construct GPT’s role (e.g., “Can you check my reasoning?” vs. “Teach me this concept”).

  • Thematic Coding: Identifying emergent categories such as “instrumental use,” “collaborative partner,” “emotional support,” and “substitution of autonomy” (a first-pass keyword screen is sketched after this list).

  • Critical Lens: Interpreting findings through Freirean pedagogy and social construction frameworks.
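
Thematic coding of this kind is interpretive work carried out by human coders, so no script can replace it; still, a simple keyword screen can illustrate how utterances might be pre-sorted into the emergent categories before manual review. The cue phrases below are invented for illustration and are not the study’s codebook:

```python
# Hypothetical cue phrases per emergent category; a real codebook is developed
# iteratively by human coders, so treat this only as a first-pass screen.
CODEBOOK = {
    "instrumental use":      ["define", "look up", "give me", "what is"],
    "collaborative partner": ["check my reasoning", "let's", "what do you think"],
    "emotional support":     ["worried", "anxious", "reassure", "non-judgmental"],
}

def pre_code(utterance: str) -> list[str]:
    """Return the candidate categories whose cue phrases appear in an utterance."""
    text = utterance.lower()
    return [code for code, cues in CODEBOOK.items()
            if any(cue in text for cue in cues)]

print(pre_code("Can you check my reasoning on this proof? I'm worried it's wrong."))
# -> ['collaborative partner', 'emotional support']
```

In practice such a screen would only triage excerpts for human coders; the study’s category boundaries emerged from the data rather than from fixed cue lists.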

4.4 Validity and Reliability

Triangulation across logs, interviews, and observations enhances validity. Reflexive memos, written throughout the analysis, ensured transparency, and reliability was strengthened through inter-coder agreement checks during thematic coding.
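
The paper does not name the agreement statistic used; Cohen’s kappa is a common choice for two coders because it corrects raw agreement for chance, κ = (p_o − p_e) / (1 − p_e). A minimal sketch, assuming two coders label the same set of excerpts:

```python
from collections import Counter

def cohens_kappa(coder_a: list[str], coder_b: list[str]) -> float:
    """Cohen's kappa for two coders labeling the same excerpts:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    rate and p_e the agreement expected by chance, estimated from each
    coder's marginal label frequencies."""
    n = len(coder_a)
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Toy example: two coders assign the study's categories to six excerpts.
a = ["instrumental", "partner", "partner", "emotional", "instrumental", "partner"]
b = ["instrumental", "partner", "emotional", "emotional", "instrumental", "partner"]
print(round(cohens_kappa(a, b), 2))  # -> 0.75
```

Values above roughly 0.6 are conventionally read as substantial agreement, though thresholds vary by field.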


5. Findings and Discussion

5.1 From Instrument to Companion

Analysis revealed a continuum: students began using GPT as a fact-checking tool but gradually addressed it as a “teacher” or “study buddy.” For instance, requests shifted from “give me the definition” to “can you explain this in a way I can understand?”

5.2 Types of Dependence

  • Cognitive Dependence: Students outsourced critical reasoning, relying on GPT to summarize or synthesize.

  • Affective Dependence: Students reported that GPT reduced anxiety, describing it as “non-judgmental support.”

  • Procedural Dependence: GPT shaped how students structured essays, revision plans, and even research projects.

5.3 The Social Construction of Partnership

Dependence was not merely functional; it was socially constructed through repeated discourse. By attributing human-like qualities to GPT, students reinforced the perception of partnership.

5.4 Educational Tensions

While GPT enhanced efficiency and reduced stress, critical concerns emerged:

  • Autonomy: Students risked diminished capacity for independent thought.

  • Critical Thinking: Reliance on AI-generated text fostered superficial engagement.

  • Assessment Integrity: The blurred boundary between student-authored and AI-assisted work raised institutional dilemmas.

5.5 Implications for Critical Pedagogy

From a Freirean perspective, GPT risks reintroducing the “banking model,” where students deposit trust in AI’s knowledge rather than co-creating meaning. Yet, if critically integrated, GPT could serve as a dialogic partner that stimulates reflection.


6. Conclusion and Educational Implications

This study demonstrates that students’ engagement with GPT evolves from instrumental use to social partnership, with dependence manifesting cognitively, affectively, and procedurally. Such dependence is socially constructed through discourse and interaction, not merely a technical inevitability.

For education, this shift presents both opportunities and risks. On one hand, GPT reduces barriers, democratizes access to explanations, and alleviates affective pressures. On the other, it risks diminishing autonomy and critical inquiry if uncritically adopted.

Educators and policymakers should:

  1. Acknowledge students’ construction of AI as a partner.

  2. Guide learners to maintain critical distance, balancing use with autonomy.

  3. Reframe curricula and assessment to integrate human–AI collaboration responsibly.

  4. Promote critical digital literacy, equipping students to question AI outputs.

Ultimately, whether GPT becomes a liberating partner or a constraining dependency depends on how education systems critically engage with its presence.


References

  • Berger, P. L., & Luckmann, T. (1966). The Social Construction of Reality. New York: Anchor Books.

  • Clark, R. E. (1994). Media will never influence learning. Educational Technology Research and Development, 42(2), 21–29.

  • Freire, P. (1970). Pedagogy of the Oppressed. New York: Continuum.

  • Lee, M. K. (2022). Trust, anthropomorphism, and the human–AI partnership. Computers in Human Behavior, 127, 107047.

  • Luckin, R. (2018). Machine Learning and Human Intelligence: The Future of Education for the 21st Century. London: UCL Institute of Education Press.

  • Zawacki-Richter, O., Marín, V. I., Bond, M., & Gouverneur, F. (2023). Systematic review of research on ChatGPT in higher education. Educational Technology Research and Development, 71(1), 1–22.