TEACHERS’ CLIL ID ON TECHNOLOGY-INTEGRATED EXTRAMURAL LANGUAGE PRACTICES AND INDIVIDUAL DIFFERENCES:
EUROPEAN AND PAN-EUROPEAN CROSS-COUNTRY RESEARCH
1Fırat University (Turkey)
2University of Durres “Aleksander Moisiu” (Albania)
3Universidad de Cantabria, Santander (Spain)
Received July 2025
Accepted February 2026
Abstract
This study investigates the perceptions of Content and Language Integrated Learning (CLIL) teachers across Austria, Turkey, Spain, and Albania regarding the integration of digital technologies in CLIL contexts. The research addresses a critical gap in the literature, as the intersection of digital literacy and CLIL pedagogy—particularly from teachers’ perspectives—has received limited attention. Employing a cross-national survey (DLTS) within the framework of COST Action #21114, the study collected data from 135 teachers (Austria, n = 45; Spain, n = 36; Albania, n = 31; Türkiye, n = 23). The methodology combined descriptive and inferential statistical analyses, including one-way ANOVA and hierarchical multiple regression, to explore the influence of individual differences on teachers’ perceptions of technology integration. Results indicate that teachers’ perceptions were moderately favorable, reflecting cautious endorsement rather than strong approval. However, significant contextual disparities emerged, notably between countries with more established CLIL frameworks (Spain and Austria) and those with developing practices (Albania and Türkiye). Surprisingly, individual factors such as age, seniority, or CLIL training did not significantly predict teachers’ attitudes towards technology integration. Teachers’ primary challenges included infrastructural inadequacies, limited professional development, and institutional resistance to change. The findings underscore the importance of tailored professional support and policy-driven infrastructure enhancements to maximize the pedagogical potential of digital tools within CLIL education.
Keywords – CLIL, Digital activities, Extramural English, Individual differences, Teachers’ perceptions.
To cite this article:
Orak, S. D., Strati, E., & Pérez-Fernández, L. (2026). Teachers’ CLIL ID on technology-integrated extramural language practices and individual differences: European and pan-European cross-country research. Journal of Technology and Science Education, 16(1), 275–296. https://doi.org/10.3926/jotse.3724
1. Introduction
Content and Language Integrated Learning (CLIL) has emerged as a prominent educational approach in Europe and beyond, fostering simultaneous development of subject content and foreign language skills (Coyle et al., 2010). It represents a shift from traditional language instruction to a more integrative, immersive model of education.
Concurrently with the growth of CLIL, digital technologies have progressively influenced pedagogical methods in multilingual education. In CLIL environments, digital technologies are often regarded as resources that can augment learner engagement, bolster disciplinary literacy, and enable multilingual comprehension. The perceived instructional value of these tools, however, varies across settings and educators. Prior studies indicate that educators’ perceptions of technology integration may be influenced by their professional experience, institutional circumstances, and availability of training, rather than solely by individual traits. This study examines CLIL teachers’ perceptions of technology-integrated CLIL teaching, analysed comparatively across four European contexts, and utilises perceptions as the main empirical construct, even though teacher identity is recognised as a broader theoretical framework in CLIL research. Although research on CLIL pedagogy and educational technology is growing, notable gaps remain at the intersection of technological integration, teacher identity, and extramural language practices. Most research on digital tools in CLIL has examined their potential to support student learning or motivation, with a focus on classroom processes or learning outcomes. Considerably less attention has been paid to how CLIL educators conceptualise and navigate their professional perceptions in relation to digital technologies that facilitate learning beyond the classroom, especially within varied national and governmental frameworks. Furthermore, comparative, cross-national research is scarce, leading to a fragmented understanding of how systemic factors—such as the maturity of CLIL legislation, infrastructural conditions, and professional development opportunities—affect teachers’ involvement with technology.
To address these gaps, the present research examines how CLIL teachers in four European countries—Austria, Spain, Albania, and Turkey—use technology in their extramural activities and how it relates to their professional identity. Variables such as prior training, teaching experience, and contextual factors can shape how teachers perceive and apply technology in their practice, particularly in pedagogically demanding multilingual settings such as CLIL. Utilising data from the Digital Literacies Teacher Survey created under COST Action CLILNetLE (CA21114), the study transcends a tool-centric viewpoint, framing technology use as an identity-mediated pedagogical practice shaped by contextual limitations rather than solely by individual traits. The study offers novel empirical evidence that contests prevalent ideas in CLIL and educational technology literature by illustrating that teachers’ perceptions are influenced more by institutional and systemic factors than by age, seniority, or training. In doing so, it contributes to current discussions on making digital CLIL more sustainable and underscores the need for policy-aligned professional development frameworks that acknowledge teachers’ identities as multilingual, digitally mediated educators.
This study seeks to investigate the perceptions of CLIL teachers regarding the integration of digital technologies across four contexts: Albania, Austria, Spain, and Turkey. It should be noted that the national reports cited in this section serve a contextual and descriptive purpose only. While they were produced within the same COST Action framework, the present study conducts an independent secondary analysis of the DLTS dataset, and none of the findings reported in this article are reproduced from or dependent upon those national reports. These countries were selected to reflect a diverse spectrum of CLIL implementation—from well-established, policy-driven models in Spain and Austria to developing and evolving practices in Turkey and Albania—providing a rich comparative perspective on technology integration in CLIL (Agmaz & Orak, 2024; Ghamarian & Smit, 2024; Hoxha & Strati, 2024; Segura & Bárcena-Toyos, 2024). Specifically, the study aims to explore the types of technologies employed, the influence of demographic and contextual variables on perceptions, and the challenges teachers face in implementing technology within CLIL instruction.
The research is guided by the following questions:
1. What are CLIL teachers’ perceptions of technology-integrated CLIL education in Albania, Austria, Spain, and Türkiye?
2. What types of digital technologies do CLIL teachers report using in their instructional practices across these contexts?
3. Do CLIL teachers’ perceptions of technology-integrated CLIL education differ according to individual and contextual variables (e.g., age, seniority, CLIL training, school location, and country)?
4. What challenges do CLIL teachers perceive in relation to technology-integrated CLIL education?
2. Literature Review
2.1. Teachers’ Identity and CLIL Implementation
Research into Content and Language Integrated Learning (CLIL) has often addressed the notion of teacher identity to elucidate how educators manage the intricate pedagogical challenges of merging subject knowledge with language instruction. In this literature, teacher identity is defined as a dynamic and socially contextualised construct influenced by professional roles, beliefs, institutional expectations, and continuous negotiation within particular educational settings (Alonso-Belmonte & Fernández-Agüero, 2020; Hüttner & Smit, 2021). CLIL educators are frequently situated at the convergence of subject matter proficiency and language instruction, necessitating a balance between content precision and linguistic support. Identity-focused research has emphasised how this dual positioning might affect teachers’ pedagogical approaches, professional self-assurance, and adaptability to curricular innovation.
Recent research has linked teacher identity with educational reform processes, particularly the integration of digital technologies. From this viewpoint, educators’ professional self-perceptions and beliefs about their instructional responsibilities may influence their interpretation of innovation-driven policies and their reactions to institutional demands (Dooly & O’Dowd, 2023). For instance, educators who primarily identify as content specialists may regard technology only as an auxiliary presentation tool, whereas those who prioritise language mediation may be more predisposed to utilise digital technologies for engagement and scaffolding. These findings have enhanced the theoretical comprehension of how professional positioning might affect instructional decision-making in CLIL environments.
Nonetheless, whereas identity-oriented research provides significant conceptual insights, it is crucial to differentiate between theoretical framing and actual measurement. Teacher identity is a complex psychological and sociocultural construct that generally necessitates qualitative or mixed-method approaches for thorough examination, including narrative inquiry, interviews, or longitudinal research. Consequently, identity cannot be presumed to be immediately quantifiable solely through perception‑based survey methods. Understanding this distinction is essential for preserving conceptual and methodological consistency.
This study does not empirically examine teacher identity. This section draws on identity-related scholarship to establish a theoretical framework for comprehending teachers’ perceptions of technology-integrated CLIL education. The perceptions of teachers, as analysed in this study, are regarded as evaluative judgements influenced by professional experience, institutional context, and instructional requirements, rather than as direct indications of identity formation or evolution. This study conceptualises identity as a contextual lens rather than a quantifiable variable, thereby preventing the overextension of its empirical assertions while integrating perception-based findings into wider discourses of CLIL education.
This conceptual framework enables the study to recognise the intricacies of CLIL instruction while adhering to its methodological parameters. Teachers’ perceptions of technology integration are regarded as contextually influenced responses that reflect structural factors, pedagogical expectations, and available resources. Thus, identity-oriented research elucidates the interpretation of perception-based data, ensuring clarity and alignment across theory, research objectives, and analytical methodologies. The next section, grounded in this theoretical framework, examines empirical studies on CLIL teachers’ perceptions regarding the use of technology in instructional practices.
2.2. CLIL Teachers’ Perceptions of Technology Use in CLIL Education
Research on CLIL educators’ perceptions of technology use has consistently underscored both the instructional potential of digital tools and the disparities in their application across various educational settings. Recent studies indicate that educators typically perceive technology as beneficial for content comprehension, student engagement, and multilingual interpretation, while also voicing apprehensions regarding practicality, workload, and institutional limitations (Pérez-Cañado, 2020; Dooly & Sadler, 2022; Sánchez-Gómez et al., 2023). Perception-based assessments are especially significant in CLIL environments, where the cognitive and linguistic requirements of teaching intensify the difficulties of educational innovation.
Under the auspices of COST Action CA21114, various national reports have delivered comprehensive analyses of CLIL educators’ use of technology across diverse European contexts (Agmaz & Orak, 2024; Ghamarian & Smit, 2024; Hoxha & Strati, 2024; Segura & Bárcena-Toyos, 2024). These data illustrate disparities in the frequency and intensity of technology use, with notably elevated levels recorded in Spain and Austria, while more restricted usage is evident in Albania and Türkiye. These studies are utilised in the current study solely to furnish contextual knowledge concerning national CLIL environments and policy frameworks. The current research conducts an independent secondary analysis of the Digital Literacies Teacher Survey (DLTS), utilising data gathered within the same COST Action; nonetheless, none of the empirical conclusions presented are replicated or directly extrapolated from the national reports.
In addition to COST-related research, extensive empirical studies confirm the presence of cross-contextual differences in educators’ perceptions of technology use. Research in well-resourced educational systems frequently indicates elevated teacher confidence and a broader range of digital practices, whereas studies from environments with constrained infrastructure emphasise practical and resource-dependent technology applications (Howard et al., 2021; Tondeur et al., 2021). Recent work, however, cautions against interpreting such variability solely as a consequence of national growth or policy maturity, highlighting instead the significance of institutional culture, professional assistance, and curricular alignment (Schmid, 2023).
The finding that technology use seems more prevalent and pedagogically varied in Spain and Austria compared to Albania and Türkiye is not taken as given a priori in this study. This trend is explicitly analysed in the Findings section through both descriptive and inferential analyses, enabling the study to assess whether observed differences are statistically significant or merely descriptive. Section 2.2 presents a contextual and theoretical underpinning for the subsequent empirical analyses, while avoiding circular reasoning and premature conclusions.
This section emphasises the necessity for empirical examination of cross-national patterns by contextualising teachers’ perceptions of technology use within both COST-informed literature and broader peer-reviewed studies. The study’s basic premise is reinforced: teachers’ perceptions of technology-integrated CLIL teaching should be regarded as contextually mediated assessments, influenced by systemic conditions rather than presumed national hierarchies of innovation.
2.3. Teachers’ Perceptions and Individual Differences
A significant amount of recent research has investigated the relationship between individual and environmental variables and teachers’ opinions of technology use in educational environments. In this literature, perceptions are typically defined as teachers’ evaluative assessments of the utility, practicality, and educational significance of digital technologies in instructional practice. These evaluations are influenced by continuous engagement with institutional circumstances, professional development opportunities, and classroom dynamics, rather than being fixed or intrinsic personal characteristics (Cabero-Almenara et al., 2021; Scherer et al., 2021a). In CLIL environments, where educators must concurrently integrate subject content and language acquisition, perception-based evaluations are especially significant, since technology can either facilitate pedagogical integration or exacerbate existing instructional demands.
Age is one of the most commonly examined variables in the research concerning teachers’ perceptions of technology integration. Previous studies have indicated that younger educators are more likely to express positive opinions of digital technologies, attributing this inclination to their enhanced familiarity with technology gained during their initial teacher training or daily experiences. In CLIL-related research, analogous assumptions have been proposed, with younger educators occasionally characterised as more proficient users of digital resources in multilingual classrooms (Dalton-Puffer et al., 2021a). Nonetheless, recent empirical research increasingly contests the validity of age-based hypotheses. Evidence indicates that when institutional support, infrastructural access, and opportunities for pedagogically significant use are considered, the explanatory power of age significantly decreases (Howard & Mozejko, 2022; Scherer et al., 2021b). These findings suggest that age disparities in teachers’ judgements may signify unequal access to facilitating conditions rather than generational attitudes towards technology.
Teaching experience or seniority, closely associated with age, is another attribute often presumed to influence teachers’ perceptions of digital technologies. Some studies indicate that novice educators are inclined to assess technology more favourably, potentially owing to increased confidence in utilising new tools or a lack of established pedagogical practices. Conversely, seasoned educators are occasionally characterised as taking a more prudent or critical approach, shaped by previous experiences with ephemeral or inadequately supported technology endeavours (Pozo-Sánchez et al., 2022). In CLIL environments, where instructional complexity is elevated, seasoned educators may be especially attuned to the supplementary cognitive, organisational, and temporal challenges linked to technology integration. Consequently, seniority does not seem to have a consistent impact on perceptions; instead, its effects are shaped by teachers’ previous experiences with institutional support and pedagogical coherence.
Professional training constitutes an additional element frequently associated with teachers’ perceptions of technology use. Recent research consistently indicates that teachers express more positive perceptions when training is deemed relevant, practice-oriented, and closely aligned with their teaching requirements (Cabero-Almenara et al., 2022). In CLIL environments, training that explicitly focuses on the integration of content, language, and technology appears particularly influential in fostering favourable assessments. Conversely, when professional development is excessively technical, disjointed, or detached from classroom realities, its influence on teachers’ views is frequently constrained. This discrepancy highlights the necessity of evaluating teachers’ assessments of training quality and relevance, rather than viewing training participation as a mere binary measure.
A burgeoning corpus of research underscores the significance of contextual elements in influencing teachers’ perceptions of technology integration, beyond individual traits. Comparative studies commonly indicate variability among educational environments, typically attributing differences to disparities in infrastructure, regulatory frameworks, and institutional capability (Martín-García et al., 2023). In CLIL research, contexts with a longer history of bilingual education are often presumed to cultivate more positive perspectives on pedagogical innovation. Recent work, however, contests this assumption, indicating that the maturity of national policy does not inherently result in favourable teacher ratings at the classroom level. Rather, conditions at the school level—such as leadership practices, workload expectations, and availability of pedagogical support—seem to have a more direct impact on teachers’ perceptions (Howard & Mozejko, 2022).
The geographical setting of schools, typically categorised as urban or rural, has also been examined in relation to teachers’ perceptions of technology use. Educators in urban or well-resourced institutions may regard digital tools as more practical and advantageous owing to superior infrastructure and technical assistance. Conversely, educators in under-resourced or rural environments may perceive technology integration as fraught with problems, including unstable connectivity and restricted access to devices. Nonetheless, empirical data indicate that these disparities are not deterministic. Targeted institutional support and context-specific implementation techniques can alleviate location-based discrepancies, enabling educators in varied environments to develop comparably cautious or favourable perceptions (Martín-García et al., 2023).
In CLIL-focused research, instructional language serves as an additional contextual aspect influencing teachers’ perceptions of technology use. Educators in programs utilising a second language as the instructional medium may assess digital tools based on their ability to facilitate linguistic scaffolding and enhance comprehension. Digital resources offering multimodal input, visual assistance, or interactive opportunities are frequently regarded as advantageous in linguistically challenging environments. Simultaneously, educators may view technology as augmenting instructional complexity when digital resources are misaligned with students’ language competency levels or curriculum objectives. Thus, perceptions regarding instructional language are intricately linked to pedagogical design and the quality of resources, rather than functioning as separate contextual variables.
The age group of students has also been identified as a factor influencing teachers’ perceptions of technology integration. Educators instructing younger students frequently view digital tools as beneficial for enhancing engagement and understanding, especially when the resources are interactive and visually stimulating. Conversely, educators in secondary or upper-secondary CLIL environments may regard technology integration as more challenging due to curricular constraints, assessment obligations, and the abstract nature of disciplinary content. Empirical evidence concerning the impact of student age on teachers’ perceptions is inconclusive, indicating that age-related effects are mediated by institutional expectations and pedagogical limitations rather than operating directly.
Recent research increasingly converges on the notion that, whereas individual and contextual variables are commonly presumed to affect teachers’ evaluations of technology-integrated instruction, their actual explanatory power is frequently constrained. Systematic reviews and cross-contextual studies consistently indicate that demographic characteristics contribute minimally to the variance in teachers’ evaluations of digital technologies, while structural and systemic conditions exert a more significant influence (Cabero-Almenara et al., 2021; Martín-García et al., 2023). These conditions include institutional culture, policy coherence, leadership endorsement, and the provision of ongoing, contextually relevant professional development.
This shift in focus corresponds with wider criticisms of individualistic methods in educational innovation, which tend to place the onus of change predominantly on teachers, neglecting the limitations they face. Teachers’ perceptions of technology integration should be regarded as contextual evaluative judgements, indicating their assessment of the instructional significance and institutional support of digital technologies within their individual environments (Scherer et al., 2021a). In CLIL situations, where instructional demands are intrinsically intricate, such evaluations may be more susceptible to systemic discrepancies between innovative discourse and classroom realities.
Building on this research, the present study investigates the relationship between specific individual and contextual characteristics and CLIL teachers’ perceptions of technology-integrated CLIL education across several national contexts. By emphasising perceptions as the fundamental construct, the study contributes to the ongoing discourse on the significance of individual traits versus structural conditions in influencing teachers’ responses to digital innovation in multilingual educational contexts.
3. Methodology
3.1. The Study
This article reports part of the findings derived from the work of Working Group 4 (WG4) within the framework of COST Action #21114, titled CLIL Network for Languages in Education: Towards bi- and multilingual disciplinary literacies (CLILNetLE), funded by COST. WG4 aimed to investigate engagement with bi/multilingual disciplinary literacies across educational contexts in European and Pan-European countries. As part of this initiative, two large-scale surveys were developed and administered across 11 European and Pan-European countries: the Digital Literacies Student Survey (DLSS) and the Digital Literacies Teacher Survey (DLTS). The present study focuses exclusively on CLIL teachers’ self-reported perceptions of technology-integrated CLIL education, as measured through the DLTS, with particular attention to CLIL teachers in four national contexts: Albania, Austria, Spain, and Turkey. It is helpful to underline that the current study does not operationalise or measure teacher identity, which is addressed as a theoretical construct in the broader CLIL literature, but instead focuses empirically on perceptions. Accordingly, all analyses and interpretations in this study are restricted to perception-based data derived from teacher self-reports, and no identity-related variables are inferred beyond this scope.
3.2. Participants
The study involved 135 CLIL teachers from four national contexts: Albania (n = 31), Austria (n = 45), Spain (n = 36), and Türkiye (n = 23). The study sought to incorporate CLIL teachers from various institutional contexts within each country; however, participant recruitment relied on school-level distribution of the online survey and voluntary teacher involvement, rather than rigorous statistical random sampling. National coordinators in the COST Action network disseminated the survey link to schools recognised for implementing CLIL, inviting eligible teachers to participate. The final sample, being derived from voluntary participation, should be regarded as convenience-based, characterised by disparate group sizes among countries. Thus, the study does not seek to deliver statistically representative national profiles but instead aims to present a comparative, exploratory analysis of CLIL educators’ perspectives regarding technology-integrated CLIL instruction. Table 1 outlines the general characteristics of the participants’ backgrounds in each country.
Because not all questions were obligatory, the number of responses to an individual item may not equal the number of participants in that country’s sample. As a result, the aggregate replies within certain categories (e.g., gender, mother tongue) do not consistently equal the total number of participants in each country. All analyses utilised available-case data, and missing responses were not substituted.
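To illustrate, the available-case principle described above can be sketched as follows; the records, field names, and values are invented for illustration only and do not reflect the actual DLTS data structure:

```python
# Hypothetical sketch of available-case analysis: respondents may skip
# optional items, and each variable is summarised over whatever responses
# exist for it, without imputation.  Records and values are invented.
responses = [
    {"gender": "female", "mother_tongue": "German"},
    {"gender": "male", "mother_tongue": None},      # optional item skipped
    {"gender": None, "mother_tongue": "Albanian"},  # optional item skipped
]

def available_cases(records, variable):
    """Return only the responses actually given for one variable."""
    return [r[variable] for r in records if r[variable] is not None]

# Counts per variable rest on different denominators, which is why
# category totals need not sum to a country's sample size.
print(len(available_cases(responses, "gender")))        # 2 of 3 respondents
print(len(available_cases(responses, "mother_tongue"))) # 2 of 3 respondents
```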
Data on variables such as mother tongue and the language of instruction in CLIL were gathered to create a comprehensive contextual profile of the participating educators. Nonetheless, owing to the fine-grained distribution of categories and limited sample sizes in various subgroups, these variables were regarded mainly as descriptive markers rather than as reliable predictors in inferential statistical analyses.
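For readers unfamiliar with the inferential procedures named in the abstract (a one-way ANOVA comparing perception scores across country groups, and a hierarchical multiple regression entering individual predictors before contextual ones), a minimal sketch is given below. All data, group means, and predictors are synthetic; this outlines the statistics only and is not the study’s actual analysis code:

```python
import numpy as np

# Synthetic sketch of the two named inferential techniques:
# (1) one-way ANOVA comparing a perception score across four country groups,
# (2) hierarchical OLS regression, entering an individual predictor first
#     and contextual (country) dummies second.  All numbers are invented.

def one_way_anova_F(groups):
    """F = between-group mean square / within-group mean square."""
    all_vals = np.concatenate(groups)
    grand_mean = all_vals.mean()
    k, n = len(groups), len(all_vals)
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def r_squared(X, y):
    """R^2 of an ordinary-least-squares fit with an intercept term."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(0)
# Four country groups of 20 "teachers" with slightly different mean scores.
groups = [rng.normal(loc=m, scale=0.5, size=20) for m in (3.2, 3.4, 2.9, 2.8)]
F = one_way_anova_F(groups)

y = np.concatenate(groups)                    # perception scores
age = rng.normal(40, 8, size=len(y))          # step 1: individual variable
country = np.repeat([0, 1, 2, 3], 20)
dummies = np.eye(4)[country][:, 1:]           # step 2: 3 country dummies
r2_step1 = r_squared(age.reshape(-1, 1), y)
r2_step2 = r_squared(np.column_stack([age, dummies]), y)
delta_r2 = r2_step2 - r2_step1                # added explanatory power
```

Because the step-1 model is nested within the step-2 model, the change in R² (ΔR²) cannot be negative; in a full analysis its significance would be tested with an incremental F test.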
In this study, the primary language of instruction denotes the predominant institutional language utilised in the school for general education, while the school CLIL language specifically pertains to the language employed for teaching CLIL courses. In the Austrian setting, German serves as the principal language of instruction, although English is the predominant CLIL language for a select group of educators.
Regarding the Austrian participants, the majority were German speakers (N=33), followed by English speakers (N=10). The main language of schooling was German (N=35), while English served as the primary CLIL language (N=9). The student age range was diverse, though most fell within the 17–21 range. More than half of the Austrian teachers had specialized CLIL training (N=25), and most schools were located in urban centres (N=31).
As for the Albanian participants, the most common home language was Albanian (N=9), followed by German and Greek. The primary language of schooling was German (N=9), with English being the dominant CLIL language (N=19). Students were distributed across different age groups, with a notable presence in the 13–16 and 17–21 age ranges. Most Albanian teachers had up to 10 years of experience, and nearly two-thirds had specialized CLIL training (N=19). The majority of schools were in urban centres (N=25).
| | | Austria (n = 45) | Albania (n = 31) | Spain (n = 36) | Turkey (n = 23) |
|---|---|---|---|---|---|
| Gender | Female | 26 | 21 | 29 | 12 |
| | Male | 17 | 10 | 6 | 10 |
| | Prefer not to say | 2 | | 1 | 1 |
| Mother Tongue | Spanish | | | 9 | |
| | Albanian | | 9 | | |
| | German | 33 | 8 | | |
| | Catalan | | | 2 | |
| | Macedonian | | 1 | | |
| | English | 10 | 1 | 2 | |
| | Greek | | 4 | | |
| | Italian | | 1 | 10 | |
| | Turkish | | | | 19 |
| | Kurdish | | | | 4 |
| | Others | | 8 | 13 | |
| School CLIL Language | Spanish | | | 8 | |
| | German | 35 | 9 | 1 | |
| | English | 9 | 19 | 14 | 23 |
| | French | | 3 | 1 | |
| | Catalan | | | 10 | |
| | Others | 1 | 2 | 3 | |
| Age Range of Students | 13-16 | 4 | 8 | 15 | 14 |
| | 17-21 | 11 | 11 | | |
| | 9-12 | 4 | 4 | 12 | 4 |
| | 9-12/13-16 | 4 | 1 | 7 | 2 |
| | 13-16/17-21 | 3 | 7 | 2 | 3 |
| Seniority | 0-5 | 20 | 12 | 15 | 8 |
| | 6-10 | 14 | 7 | 12 | 5 |
| | 11-15 | 2 | 5 | 7 | 4 |
| | 16-20 | 3 | 3 | 2 | 4 |
| | 21-25 | 6 | 1 | | 1 |
| | 26-30 | | | | 1 |
| Special Training in CLIL | yes | 25 | 19 | 23 | 13 |
| | no | 20 | 11 | 11 | 10 |
| School Location | urban center | 31 | 25 | 17 | 21 |
| | urban suburbs | 12 | 5 | 13 | 2 |
| | rural | 2 | | 4 | 23 |

Table 1. Participants’ Background Information
For the Spanish participants, the most commonly spoken home languages were Spanish (N=9) and Catalan (N=2), with English and Italian also represented. The main language of schooling was Catalan (N=10), followed by Spanish (N=8). The primary CLIL language was English (N=14), though German and French were also present. The student age range was broad, but most fell within the 13–16 range. A large proportion of Spanish teachers had received CLIL training (N=23), and schools were primarily located in urban centres (N=17).
Finally, among the Turkish participants, Turkish (N=19) and Kurdish (N=4) were the most commonly spoken home languages. English was the dominant CLIL language (N=23), with German and French playing minor roles. Students were mostly within the 13–16 age range (N=14). While fewer Turkish teachers had CLIL training compared to other countries (N=13), a significant proportion had more than a decade of teaching experience. Unlike other countries in the study, most Turkish schools were situated in rural areas (N=23), indicating a distinct educational setting compared to the largely urban environments in Austria, Albania, and Spain.
In a nutshell, although the survey was distributed to identified CLIL schools and teachers within each national context, participation ultimately depended on voluntary response. Consequently, the final sample (N = 135) should be considered a convenience-informed sample, unevenly distributed across countries. Additionally, some demographic items were optional, resulting in missing data for certain variables. Highly granular categories (e.g., mother tongue, CLIL language) were therefore treated descriptively rather than as robust predictors in inferential analyses. Notably, all Turkish participants were employed in rural schools, a contextual characteristic that is interpreted descriptively rather than comparatively.
3.3. Data Collection Instruments and Procedure
Data for the present study were collected through the Digital Literacies Teacher Survey (DLTS), developed within the framework of the COST Action CLILNetLE (#CA21114), under the direction of Working Group 4 (WG4). The DLTS was designed using the Qualtrics platform (Qualtrics, 2024) and aimed to assess CLIL teachers’ perceptions and self-reported use of digital tools in content and language integrated learning environments. The survey focused on several key dimensions, including:
Teachers’ beliefs and perceptions of technology use.
Teachers’ awareness of students’ digital practices.
The survey was originally developed in English and subsequently translated into the local languages of the participating countries to facilitate accessibility and reliability in data collection. The data collection period spanned from March to June 2024. In each country, national survey administrators established contact with schools where CLIL was being implemented. The online survey link was distributed directly to CLIL teachers who had been identified as eligible participants.
The DLTS opened with an overview explaining the study’s purpose and ethical safeguards, followed by a required digital consent process. All participants were informed about the voluntary nature of their involvement prior to beginning the questionnaire. Ethical procedures were carried out in line with the Memorandum of Understanding (MoU, 2022) established for COST Action #21114. Participants gave informed consent, confirming their willingness to take part in the study with full knowledge of its purpose. To ensure confidentiality and anonymity, all identifying information was removed prior to data analysis, safeguarding respondents’ privacy. The research also adhered to the standards set by the General Data Protection Regulation (GDPR), with all data securely stored and accessible only to designated members of the research team. Moreover, participants were assured of their right to withdraw from the study at any point, without explanation or consequences.
Prior to full implementation, the Digital Literacies Teacher Survey (DLTS) was piloted in Albania in 2023 to ensure clarity, functionality, and contextual relevance. Following the pilot, minor revisions were made to optimize the survey for broader deployment.
3.4. Data Analysis Procedure
To determine the appropriate data analysis approach, initial checks were conducted on sample size and data distribution. Regarding data distribution, preliminary checks were performed to assess normality, linearity, multicollinearity, homoscedasticity, and homogeneity of variances, following the guidelines outlined by Pallant (2011). Based on the outcomes of these preliminary tests, parametric techniques were employed for descriptive statistics, alongside one-way ANOVA tests and subsequent post-hoc analyses. Gabriel’s test was chosen for post-hoc comparisons, as the participant distribution across groups was unequal, albeit close. Additionally, a hierarchical multiple regression analysis was conducted to examine whether individual and contextual variables predict CLIL teachers’ self-reported perceptions of technology-integrated CLIL education. All variables included in the regression model were derived from the Digital Literacies Teacher Survey (DLTS) and reflect teacher-level responses; no student-level outcome measures were included in the analysis. Data cleaning was performed in Excel, and analyses were carried out using JASP and SPSS software.
4. Findings
This section presents the findings of the statistical analysis performed to respond to the research questions. All inferential interpretations adhere to the standard significance level of α = .05. Descriptive statistics are presented initially, succeeded by inferential analysis.
The preliminary analysis of the data served to check for normality and homogeneity of variances. First, residual statistics were examined via Cook’s distance to screen for influential outliers ahead of the subsequent analyses; all values fell below the conventional threshold of 1, so outliers were considered under control. Next, the Kolmogorov–Smirnov test and Levene’s test were employed, respectively, as prerequisites for the main hierarchical multiple regression analysis. The Kolmogorov–Smirnov test revealed a significant departure from normality (p = .002). Nonetheless, considering the sample size (N = 135), the examination of variance homogeneity, and the resilience of ANOVA and regression analyses to moderate deviations from normality, parametric tests were maintained (Pallant, 2011), as displayed in Table 2.
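These preliminary checks can be illustrated with a short Python sketch using scipy. This is a stand-in only: the study ran its analyses in SPSS and JASP, and the scores below are synthetic values generated with the study’s group sizes, not the DLTS responses.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical 1-5 perception scores for the four country groups
# (synthetic data; group sizes mirror the study's sample)
groups = {country: rng.normal(3.7, 1.0, n)
          for country, n in [("Austria", 45), ("Spain", 36),
                             ("Albania", 31), ("Turkiye", 23)]}

# Kolmogorov-Smirnov test of the pooled scores against a fitted normal curve
pooled = np.concatenate(list(groups.values()))
ks_stat, ks_p = stats.kstest(pooled, "norm",
                             args=(pooled.mean(), pooled.std(ddof=1)))

# Levene's test for homogeneity of variances across the four groups
lev_stat, lev_p = stats.levene(*groups.values())

print(f"Kolmogorov-Smirnov: D = {ks_stat:.3f}, p = {ks_p:.3f}")
print(f"Levene: W = {lev_stat:.3f}, p = {lev_p:.3f}")
```

As in the study’s workflow, a significant Kolmogorov–Smirnov result with a non-significant Levene result would still permit parametric tests given a sample of this size.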
| | Statistic | df | Sig. |
|---|---|---|---|
| Survey on Teachers’ CLIL ID | .114 | 102 | .002 |
Table 2. Output of Test of Normality Through Kolmogorov-Smirnov
Homogeneity of variances was evaluated using Levene’s test, which revealed no statistically significant differences between groups (p > .05) as presented in Table 3.
| Teachers’ CLIL ID | Levene Statistic | df1 | df2 | Sig. |
|---|---|---|---|---|
| Based on Mean | 2.230 | 2 | 77 | .114 |
| Based on Median | 2.027 | 2 | 77 | .139 |
| Based on Median and with adjusted df | 2.027 | 2 | 75.39 | .139 |
| Based on trimmed mean | 2.213 | 2 | 77 | .116 |
Table 3. Output of Test of Homogeneity of Variance by Levene Statistics
No conclusions concerning homogeneity were drawn from the visual examination of the scatterplot.
Figure 1. Histogram of teachers’ perception scores and scatterplot of standardized residuals for preliminary assumption checking
Scatterplots were employed exclusively for the exploratory examination of residual dispersion, rather than as a formal assessment of variance equality. Based on the preliminary tests for the one-way ANOVA, parametric analysis could proceed for the cross-country comparison of teachers’ perceptions of technology-integrated CLIL education.
4.1. Q1. CLIL Teachers’ Perceptions of Utilizing Technology in CLIL Education in Austria, Türkiye, Spain, and Albania
The first research question requires a cross-country comparison of teachers’ perceptions of technology-integrated CLIL education in order to evaluate their technology-integrated CLIL ID. As a first step, teachers’ perception of the importance of using digital tools to polish up students’ CLIL languages was examined. It is important to highlight that individual perceptual items were optional, leading to differing numbers of valid responses per item. The entire sample comprised 135 participants, with individual item responses varying between 98 and 113. The descriptive statistics presented in Table 4 are derived from available-case data for each item, rather than from listwise-complete cases. Thus, the data are to be regarded as representative of general trends rather than precise population estimates.
| | N | Min. | Max. | Mean | Std. Deviation |
|---|---|---|---|---|---|
| | 113 | 1.00 | 5.00 | 3.55 | 1.29 |
| | 100 | 1.00 | 5.00 | 3.92 | 1.09 |
| | 98 | 1.00 | 5.00 | 4.00 | 1.05 |
| Valid N (listwise) | 135 | | | | |
Table 4. Teachers’ Perception of Technology-Integrated CLIL Activities
Table 4 shows the general picture of the participant teachers’ perceptions of technology-integrated CLIL education. According to the table, teachers’ perceptions range from lukewarm to positive regarding the contribution of technology to students’ disciplinary literacy skills (M = 3.55, SD = 1.29). Furthermore, participant teachers were found to be positive about the multilingual dimension of students’ extramural technology use (M = 3.92, SD = 1.09).
After presenting the descriptive results, we turn next to the inferential statistics, which allow for cross-group, namely cross-country, comparison. Following the Kolmogorov–Smirnov test and the homogeneity of variances reported in Levene’s test from the preliminary analysis, three further assumption checks (multicollinearity, linearity, and homoscedasticity) were required for the hierarchical multiple regression analysis.
Participant teachers were found to consider technology-integrated CLIL education a motivating factor for students, since students are already keen on technology-aided tasks. Table 5 presents the cross-country one-way ANOVA of these perception scores.
| | Sum of Squares | df | Mean Square | F | Sig. |
|---|---|---|---|---|---|
| Between Groups | 22.706 | 12 | 1.892 | 1.226 | .280 |
| Within Groups | 129.665 | 84 | 1.544 | | |
| Total | 152.371 | 135 | | | |
Table 5. Cross-country Analysis Through One-Way ANOVA
Further analysis was conducted to examine such differences in more depth. A one-way ANOVA was performed to assess whether teachers’ perceptions varied significantly among countries. Assumptions were assessed prior to the analysis. The Kolmogorov–Smirnov test revealed a significant deviation from normality (p = .002), whereas Levene’s test confirmed the homogeneity of variances (p > .05). Considering the sample size (N = 135) and ANOVA’s resilience to modest deviations from normality, the analysis was conducted. The one-way ANOVA indicated no statistically significant differences in teachers’ perceptions of technology-integrated CLIL teaching among the four nations, F(12, 84) = 1.226, p = .280 (Table 5). Consequently, while descriptive variations were noted among countries, these variations cannot be deemed statistically significant.
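The omnibus comparison can be sketched with scipy. The scores below are simulated placeholders that reuse the study’s group sizes; the actual test was run in SPSS/JASP on the DLTS data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Synthetic perception scores per country; group sizes mirror the sample,
# but the values themselves are illustrative, not the study's data
austria = rng.normal(3.4, 1.1, 45)
spain = rng.normal(3.8, 1.0, 36)
albania = rng.normal(3.6, 1.2, 31)
turkiye = rng.normal(3.9, 1.1, 23)

# One-way ANOVA: does mean perception differ across the four countries?
f_stat, p_value = stats.f_oneway(austria, spain, albania, turkiye)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
```

A p-value above .05, as in the study (p = .280), would indicate no significant cross-country difference in mean perception scores.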
Exploratory post-hoc comparisons were performed using Gabriel’s test. The results indicated one statistically significant pairwise difference, between Austria and Türkiye (p = .02), whilst all other comparisons were non-significant (p > .05). In light of the lack of a significant omnibus ANOVA effect, this singular finding must be regarded with care and considered exploratory rather than confirmatory.
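Gabriel’s procedure is not available in the common Python statistics libraries, so the sketch below substitutes pairwise Welch t-tests with a Bonferroni correction, a plainer alternative that illustrates the same exploratory pairwise logic. All data are synthetic.

```python
from itertools import combinations

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical country groups (synthetic scores, study's group sizes)
groups = {"Austria": rng.normal(3.3, 1.1, 45), "Spain": rng.normal(3.8, 1.0, 36),
          "Albania": rng.normal(3.6, 1.2, 31), "Turkiye": rng.normal(4.0, 1.1, 23)}

pairs = list(combinations(groups, 2))  # all 6 country pairings
adjusted = {}
for a, b in pairs:
    # Welch's t-test tolerates unequal group sizes and variances
    t, p = stats.ttest_ind(groups[a], groups[b], equal_var=False)
    adjusted[(a, b)] = min(p * len(pairs), 1.0)  # Bonferroni-adjusted p
    print(f"{a} vs {b}: t = {t:.2f}, adjusted p = {adjusted[(a, b)]:.3f}")
```

As in the paper’s reading of Gabriel’s test, a single adjusted p-value below .05 against a non-significant omnibus F should be treated as exploratory, not confirmatory.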
4.1.1. Q1a. Cross-Country Analysis on Digital Tools Utilized by CLIL Teachers
Based on the perception questionnaire results (Table 4), a follow-up questionnaire developed by the COST CA21114 WG4 group was administered to the CLIL teachers on the types of digital tools and the frequency of utilizing them to improve students’ CLIL languages. This questionnaire was employed to explore teachers’ awareness of digital tools and their optimal place in CLIL education.
Table 6 presents descriptive statistics (means and standard deviations) for the frequency of digital tool use reported by CLIL teachers across the four national contexts. Given the exploratory nature of this analysis and the unequal group sizes, results are interpreted descriptively, without inferential comparison.
Across all four countries, digital projectors emerged as the most frequently used technological resource (overall M = 3.69), followed by online research tools (M = 3.16) and online video platforms (M = 2.98). This pattern was consistent across national contexts, indicating a shared reliance on presentation-oriented and information-access technologies in CLIL classrooms. In contrast, interactive and advanced technologies, such as virtual reality, digital gaming, and paid educational applications, were reported as infrequently used across all countries. While some variation in mean scores was observed between contexts, these differences reflect usage tendencies rather than statistically tested differences. Overall, the descriptive results suggest that technology integration in CLIL settings is characterized by the widespread use of basic digital tools, with more innovative or immersive technologies remaining peripheral.
| Digital tool | Mean SP | Mean TR | Mean AL | Mean AU | Average | SD SP | SD TR | SD AL | SD AU |
|---|---|---|---|---|---|---|---|---|---|
| Social media | 1.322 | 2.478 | 1.961 | 1.372 | 1.691 | .701 | 1.201 | 1.148 | .817 |
| Multiplayer gaming | 1.193 | 1.521 | 1.260 | 1.186 | 1.266 | .601 | .994 | .540 | .450 |
| Instant messaging | 1.290 | 3.130 | 2.500 | 1.558 | 1.975 | .824 | 1.057 | 1.414 | .853 |
| Video streaming | 1.766 | 3.130 | 2.250 | 2.907 | 2.533 | .727 | 1.179 | 1.188 | .921 |
| Phone Application | 1.562 | 3.136 | 1.863 | 1.604 | 1.924 | 1.162 | 1.206 | 1.320 | .954 |
| Online Video Streaming | 3.062 | 3.304 | 2.826 | 2.837 | 2.983 | 1.014 | 1.105 | 1.114 | 1.089 |
| Online research | 3.843 | 2.652 | 2.857 | 3.069 | 3.159 | 1.110 | 1.495 | 1.236 | 1.203 |
| VR | 1.290 | 1.500 | 1.681 | 1.357 | 1.427 | .739 | .912 | 1.086 | .726 |
| Online shopping | 1.354 | 1.652 | 1.285 | 1.302 | 1.381 | 1.018 | 1.191 | .643 | .637 |
| Mobile photograph | 2.000 | 1.739 | 1.913 | 1.488 | 1.750 | 1.211 | 1.053 | 1.040 | .797 |
| Digital storytelling | 2.250 | 2.173 | 1.590 | 1.690 | 1.916 | 1.135 | 1.233 | .908 | .896 |
| Online forums | 1.354 | 2.260 | 1.364 | 1.651 | 1.638 | .838 | 1.321 | .657 | .841 |
| Free educational apps | 2.906 | 2.913 | 2.136 | 2.325 | 2.558 | 1.027 | 1.276 | 1.125 | 1.017 |
| Paid educational apps | 1.800 | 2.088 | 1.565 | 1.325 | 1.638 | 1.214 | 1.083 | .992 | .865 |
| Online music streaming | 1.903 | 1.913 | 1.619 | 1.571 | 1.735 | 1.374 | 1.378 | .973 | .887 |
| E-book readers | 1.600 | 2.409 | 1.954 | 1.534 | 1.794 | 1.069 | 1.368 | 1.290 | .908 |
| AI | 1.935 | 2.500 | 1.750 | 2.046 | 2.051 | 1.314 | 1.371 | .966 | .998 |
| E-textbooks | 2.366 | 3.285 | 2.100 | 2.907 | 2.693 | 1.670 | 1.454 | 1.333 | 1.460 |
| Digital projectors | 4.031 | 3.761 | 3.381 | 3.558 | 3.692 | 1.491 | 1.609 | 1.596 | 1.516 |
| Single player games | 1.100 | 1.750 | 1.190 | 1.166 | 1.256 | .547 | 1.208 | .511 | .695 |
| Online courses | 1.612 | 2.263 | 1.476 | 1.690 | 1.725 | 1.022 | 1.326 | .679 | 1.199 |
| Digital reading | 1.466 | 2.111 | 1.666 | 1.404 | 1.585 | 1.074 | 1.450 | 1.238 | .857 |
| Social media | 1.266 | 1.555 | 1.285 | 1.333 | 1.342 | .739 | 1.199 | .643 | .754 |
| Valid N = 135 | | | | | | | | | |
Table 6. Cross-country Comparative Descriptives (Means and Standard Deviations) of Digital Tool Use
| Country (I) | Country (J) | Mean Difference (I-J) | Std. Error | Sig. | 95% CI Lower Bound | 95% CI Upper Bound |
|---|---|---|---|---|---|---|
| Spain | Türkiye | -.783 | .878 | .93 | -3.134 | 1.567 |
| | Albania | .649 | .990 | .98 | -1.968 | 3.268 |
| | Austria | 1.611 | .738 | .17 | -.365 | 3.587 |
| Türkiye | Spain | .783 | .878 | .93 | -1.567 | 3.134 |
| | Albania | 1.433 | 1.056 | .68 | -1.391 | 4.257 |
| | Austria | 2.394 | .824 | .02 | .211 | 4.577 |
| Albania | Spain | -.649 | .990 | .98 | -3.268 | 1.968 |
| | Türkiye | -1.433 | 1.056 | .68 | -4.257 | 1.391 |
| | Austria | .961 | .943 | .87 | -1.488 | 3.411 |
| Austria | Spain | -1.611 | .738 | .17 | -3.587 | .365 |
| | Türkiye | -2.394 | .824 | .02 | -4.577 | -.211 |
| | Albania | -.961 | .943 | .87 | -3.411 | 1.488 |
Table 7. Gabriel’s Post-Hoc Analysis Results of Teachers’ Perception of Technology-Integrated CLIL Education
Subsequent to the one-way ANOVA, Gabriel’s post-hoc test was used to investigate pairwise disparities among countries. The omnibus ANOVA revealed no statistically significant effect (p > .05), hence post‑hoc findings are examined in an exploratory manner. The post-hoc comparisons indicated that the majority of pairwise differences lacked statistical significance (p > .05). A comparison between Austria and Türkiye produced a p-value below the standard significance threshold (p = .02). Nonetheless, due to the non-significant omnibus ANOVA and the exploratory character of the post-hoc analysis, this singular finding should be regarded with care and not considered as definitive proof of systematic cross-country disparities.
The post-hoc findings lack adequate statistical evidence to substantiate continuous or generalisable disparities in teachers’ perceptions of technology-integrated CLIL instruction across the four national settings.
In line with best statistical practice, conclusions regarding group differences are based primarily on the omnibus ANOVA rather than isolated post-hoc results.
4.2. Q2. Individual Differences as Predictors of Teachers’ Perception of Technology-Integrated CLIL Education
Research question 2 focused on examining whether individual differences of the participating teachers (namely gender, mother tongue, age range of students, school location, school CLIL language, residential country, and special CLIL training) predict their perception of technology-integrated CLIL education. To answer this question, we look at the results of the teachers of the four countries of interest together. For this analysis, we employed a hierarchical multiple regression test.
The regression model summary is presented in Table 8. The Durbin–Watson statistic (1.834) indicated no serious autocorrelation of residuals. The Enter regression model explained 8% of the variance in CLIL teachers’ perceptions of technology-integrated CLIL education (R² = .08), while the adjusted R² was .009, indicating that less than 1% of the variance was explained when accounting for the number of predictors included in the model.
These values suggest that the overall explanatory power of the model was weak, indicating that the individual and contextual variables included in the analysis accounted for only a small proportion of variance in teachers’ perceptions. The low R² and adjusted R² values indicate that factors beyond those included in the model—such as institutional culture, policy constraints, or school-level support—are likely to play a more substantial role in shaping teachers’ perceptions of technology-integrated CLIL education.
| Model | Enter Regression Model |
|---|---|
| R | .287 |
| R Square | .08 |
| Adjusted R Square | .009 |
| Std. Error of the Estimate | 3.018 |
| Durbin-Watson | 1.834 |
Table 8. Regression Model Summary for Testing Homoscedasticity
Following the regression analysis, Pearson correlation analyses were performed to examine the bivariate correlations between teachers’ perceptions and individual characteristics (Table 9). All correlation coefficients were weak or negligible (|r| ≤ .26), and none reached statistical significance.
Their effect sizes were likewise negligible and did not demonstrate practical significance. The findings align with the regression results, corroborating the conclusion that individual differences do not significantly account for variation in teachers’ perceptions.
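Bivariate screening of this kind can be reproduced with scipy’s Pearson correlation. The variables below are hypothetical codings invented for illustration; they are not the DLTS variables or data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
n = 96  # illustrative number of complete teacher-level cases
perception = rng.normal(3.7, 1.0, n)              # synthetic perception scores
seniority = rng.integers(1, 6, n).astype(float)   # hypothetical ordinal bands
training = rng.integers(0, 2, n).astype(float)    # 0 = no CLIL training, 1 = trained

results = {}
for name, var in [("seniority", seniority), ("CLIL training", training)]:
    r, p = stats.pearsonr(perception, var)  # bivariate correlation and p-value
    results[name] = (r, p)
    print(f"{name}: r = {r:.3f}, p = {p:.3f}")
```

With |r| below .20, as for most variables in Table 9, the shared variance r² stays under 4%, which is why such coefficients are read as practically negligible.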
| Variable | r | p-value |
|---|---|---|
| Country of residence | −0.257 | .071 |
| Mother tongue | 0.136 | .214 |
| School CLIL language | 0.014 | .881 |
| Age range of students | −0.141 | .198 |
| Seniority | −0.111 | .279 |
| Special CLIL training | 0.009 | .934 |
| School location | 0.023 | .812 |
Table 9. Correlation Between Teachers’ Perception of Technology-Integrated CLIL Education and Independent Variables
A hierarchical multiple regression analysis was performed to assess the predictive power of individual and contextual variables on teachers’ perceptions of technology-integrated CLIL instruction. The dependent variable was the overall perception score of teachers. The model incorporated gender, native language, the age range of students, expertise level, school CLIL language, school location, country of residency, and previous CLIL training as predictors. The regression model lacked statistical significance, F(7, 88) = 1.128, p = .353, accounting for a minimal variance in teachers’ perceptions (R² = .08; adjusted R² = .009).
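The model summary quantities reported here (R², adjusted R², F, Durbin–Watson) can be computed from an ordinary least squares fit. The sketch below uses only numpy on synthetic data with the reported model’s dimensions (96 complete cases implied by the residual df of 88, and 7 predictors); it is an illustration of the arithmetic, not a re-analysis.

```python
import numpy as np

rng = np.random.default_rng(5)
n, k = 96, 7                     # cases and predictors, as in the reported model
X = rng.normal(size=(n, k))      # synthetic coded predictor matrix
y = rng.normal(3.7, 1.0, n)      # synthetic perception scores, unrelated to X by design

Xd = np.column_stack([np.ones(n), X])          # prepend an intercept column
beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)  # ordinary least squares estimates
resid = y - Xd @ beta

ss_res = float(resid @ resid)
ss_tot = float(((y - y.mean()) ** 2).sum())
r2 = 1 - ss_res / ss_tot
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)          # penalizes predictor count
f_stat = (r2 / k) / ((1 - r2) / (n - k - 1))           # omnibus F(k, n-k-1)
dw = float((np.diff(resid) ** 2).sum()) / ss_res       # Durbin-Watson statistic

print(f"R^2 = {r2:.3f}, adj. R^2 = {adj_r2:.3f}, "
      f"F({k}, {n - k - 1}) = {f_stat:.2f}, DW = {dw:.2f}")
```

The adjusted R² formula explains the pattern in Table 8: with seven predictors and a small R² (.08), the adjustment shrinks the explained variance to near zero (.009), and a Durbin–Watson value near 2 indicates no serious residual autocorrelation.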
| Model | | Sum of Squares | df | Mean Square | F | Sig. |
|---|---|---|---|---|---|---|
| Enter | Regression | 71.968 | 7 | 10.281 | 1.128 | .353b |
| | Residual | 802.022 | 88 | 9.114 | | |
| | Total | 873.990 | 135 | | | |
Table 10. Output of the ANOVA for Hierarchical Multiple Regression
The predictors are (Constant), gender, mother tongue, age range of students, school location, school CLIL language, residential country, and special CLIL training. Similar results were reported in the hierarchical multiple regression (Table 11), which confirmed that the independent variables do not significantly predict the dependent variable, F(7, 88) = 1.128, p = .353.
| Variable | B | Std. Error | BCa 95% CI Lower | BCa 95% CI Upper | β | t | Sig. | Tolerance | VIF |
|---|---|---|---|---|---|---|---|---|---|
| (Constant) | 13.70 | 1.692 | 10.337 | 17.062 | | 8.097 | .000 | | |
| Gender | -.473 | .300 | -1.069 | .124 | -.197 | -1.575 | .119 | .666 | 1.50 |
| Mother T. | .011 | .016 | -.022 | .044 | .078 | .681 | .498 | .796 | 1.25 |
| School CLIL Lang. | -.006 | .021 | -.049 | .036 | -.033 | -.303 | .763 | .898 | 1.11 |
| Age Range of Students | -.054 | .232 | -.515 | .406 | -.028 | -.234 | .815 | .754 | 1.32 |
| Seniority | -.243 | .243 | -.726 | .239 | -.106 | -1.002 | .319 | .937 | 1.06 |
| Special Training in CLIL | -.372 | .662 | -1.688 | .944 | -.060 | -.562 | .576 | .914 | 1.09 |
| School Location | .087 | .579 | -1.063 | 1.237 | .016 | .151 | .881 | .909 | 1.10 |
Table 11. Output of the Hierarchical Multiple Regression
Table 11 displays the findings of the hierarchical multiple regression analysis investigating the predictive influence of individual and contextual variables on CLIL educators’ opinions of technology-integrated CLIL instruction. None of the factors in the model attained statistical significance at the customary threshold (p <.05). For instance, gender did not serve as a significant predictor of teachers’ judgements (β = −.19, t(88) = −1.57, p = .119). Likewise, factors including seniority, school location, CLIL training, native language, and the language of instruction in CLIL did not exhibit statistically significant predictive impacts (p > .05).
The findings suggest that the individual and contextual variables analysed do not significantly account for the diversity in CLIL teachers’ perceptions of technology-integrated CLIL education.
Specifically, none of the individual variables significantly predicted teachers’ perception of technology-integrated CLIL education (all p > .05):
Gender: β = −.20, t(88) = −1.57, p = .119
Mother tongue: β = .08, t(88) = .68, p = .498
School CLIL language: β = −.03, t(88) = −.30, p = .763
Age range of students: β = −.03, t(88) = −.23, p = .815
Seniority: β = −.11, t(88) = −1.00, p = .319
Special training in CLIL: β = −.06, t(88) = −.56, p = .576
School location: β = .02, t(88) = .15, p = .881
4.3. Q3. Perceived Challenges
The third research question examined the challenges teachers perceive in technology-integrated extramural CLIL education.
Teachers’ perceived obstacles regarding technology-integrated CLIL instruction were analysed through descriptive statistics. Figure 2 depicts the overall distribution of perceived obstacles, whereas Tables 12 and 13 present detailed mean ratings. Sixty-four percent (64%) of participating educators indicated that integrating technology into the CLIL curriculum posed difficulties. The challenges with the highest ratings were resistance to change in professional settings (M = 9.69, SD = 1.28), insufficient training or professional development (M = 9.05, SD = 2.44), and financial limitations (M = 9.02, SD = 2.53).
Figure 2. Teachers’ Perception of Challenges in Technology-Integrated Extramural CLIL Education
Infrastructure and device access challenges, though notable, received relatively lower mean scores.
Country-specific descriptions indicated contextual disparities in perceived difficulties. Educators in Spain expressed greater apprehension regarding privacy and security, whereas educators in Türkiye and Albania highlighted budgetary limitations. Austrian educators identified school policy constraints as a significant difficulty. The results underscore that systemic and institutional issues, rather than individual traits, influence teachers’ experiences with technology integration.
Table 12 shows that teachers feel challenged by their immediate professional preparedness for integrating technology into both in-class and out-of-class activities in ways that adequately encourage and engage students, as stated in item 5 (M = 9.69, SD = 1.28). Although “inadequate infrastructure” emerged as the least challenging dimension relative to other items, its mean score (M = 7.25, SD = 3.12) indicates that it was still perceived as a considerable challenge by participating teachers.
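The per-item summary statistics in Table 12 (mean, median, mode, SD) can be reproduced as follows. The ratings are synthetic stand-ins skewed toward the scale ceiling to mimic the table’s pattern of medians and modes at 10; the sample size per item is illustrative.

```python
from collections import Counter

import numpy as np

rng = np.random.default_rng(9)
# Hypothetical 1-10 challenge ratings for a single item (synthetic data,
# skewed toward the top of the scale as in Table 12)
ratings = np.clip(np.round(rng.normal(9.0, 2.0, 87)), 1, 10)

mean = ratings.mean()
median = float(np.median(ratings))
mode = Counter(ratings.tolist()).most_common(1)[0][0]  # most frequent rating
sd = ratings.std(ddof=1)  # sample standard deviation
print(f"M = {mean:.2f}, Mdn = {median:.2f}, Mode = {mode:.0f}, SD = {sd:.2f}")
```

A distribution like this, with the median and mode pinned at the maximum, explains why the means in Table 12 sit high while the standard deviations remain sizeable.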
| Challenging Points | Mean | Median | Mode | Std. Deviation |
|---|---|---|---|---|
| 1. Limited access to technology and devices for teachers | 8.44 | 10.00 | 10.00 | 3.01 |
| 2. Limited access to technology and devices for students | 8.27 | 10.00 | 10.00 | 3.12 |
| 3. Inadequate infrastructure | 7.25 | 10.00 | 10.00 | 3.12 |
| 4. Lack of training or professional development | 9.05 | 10.00 | 10.00 | 2.44 |
| 5. Resistance in my professional context to change and adapt to new technologies | 9.69 | 10.00 | 10.00 | 1.28 |
| 6. Time constraints | 7.80 | 10.00 | 10.00 | 3.45 |
| 7. Budget constraints | 9.02 | 10.00 | 10.00 | 2.53 |
| 8. Privacy and security concerns | 7.38 | 10.00 | 10.00 | 3.65 |
| 9. School policy and restrictions (e.g., mobile phones are banned) | 8.61 | 10.00 | 10.00 | 2.79 |
Table 12. Challenging Points in Technology-Integrated CLIL Education
| Challenging Points | SP Mean | TR Mean | AL Mean | AU Mean |
|---|---|---|---|---|
| 1. Limited access to technology and devices for teachers | 9.00 | 9.41 | 9.41 | 7.45 |
| 2. Limited access to technology and devices for students | 8.66 | 9.48 | 9.48 | 7.63 |
| 3. Inadequate infrastructure | 8.42 | 9.32 | 9.32 | 7.77 |
| 4. Lack of training or professional development | 8.95 | 9.80 | 9.80 | 8.45 |
| 5. Resistance in my professional context to change and adapt to new technologies | 9.14 | 9.29 | 9.29 | 9.36 |
| 6. Time constraints | 8.85 | 9.64 | 9.64 | 7.56 |
| 7. Budget constraints | 9.09 | 9.90 | 9.90 | 7.47 |
| 8. Privacy and security concerns | 9.95 | 9.77 | 9.77 | 7.79 |
| 9. School policy and restrictions (e.g., mobile phones are banned) | 9.42 | 9.48 | 9.48 | 9.43 |
Table 13. Country-wise Challenging Points
Descriptive data at the country level reveal contextual differences in perceived obstacles. Educators in Spain expressed significant apprehension regarding privacy and security matters (M = 9.95), whereas budgetary limitations were most prominent among educators in Türkiye and Albania (M = 9.90). These patterns indicate contextual variations and are understood descriptively rather than inferentially.
5. Discussion & Conclusion
The first research question investigated CLIL educators’ perspectives of technology-enhanced CLIL instruction within four national contexts. The results suggest that educators predominantly hold relatively favourable views, especially about the motivational and supportive functions of digital technology in content-language integration. This outcome corresponds with a growing body of recent Scopus-indexed studies indicating cautiously favourable teacher attitudes towards instructional technology in CLIL and multilingual environments (Pérez-Cañado, 2020; Dooly & Sadler, 2022; Castillo‑Cuesta et al., 2024). These studies repeatedly indicate that educators acknowledge the instructional promise of digital tools for improving engagement, structuring disciplinary content, and facilitating multilingual meaning-making, despite unequal implementation.
Hüttner and Smit (2021) report similar findings, asserting that teachers’ favourable perceptions frequently mirror an aspirational alignment with innovation-driven educational discourses rather than fully implemented classroom practices. Sánchez-Gómez et al. (2023) similarly found that educators often advocate for technological integration conceptually while expressing doubt about its pedagogical depth. The modest means and significant variability identified in this study support this interpretation, indicating that teachers’ perceptions are influenced by both perceived advantages and encountered limitations.
The lack of statistically significant cross-country variations contrasts with previous CLIL research highlighting national policy maturity as a crucial factor influencing pedagogical confidence (Dalton-Puffer et al., 2021b; Pérez-Cañado, 2020). Research in regions with established CLIL traditions, like Spain and Austria, has sometimes indicated greater teacher support for novel techniques than in more recent CLIL contexts. Recent comparative studies contest this assumption, revealing that the duration of policy does not inherently lead to educational transformation (Howard et al., 2021; Dooly & O’Dowd, 2023). The current findings endorse this perspective, indicating that systemic similarities—such as curriculum pressures and assessment requirements—may surpass national disparities.
The findings indicate a developing consensus that educators’ views on technology-integrated CLIL are moderately favourable yet structurally limited, highlighting a conflict between pedagogical ambitions and institutional constraints. Instead of indicating resistance, these judgements reflect a pragmatic position influenced by contextual opportunities and constraints.
The second research question examined whether individual and contextual characteristics forecasted CLIL teachers’ perceptions of technology-integrated CLIL education. The regression and correlation analyses indicated that none of the assessed variables significantly predicted teachers’ perceptions, and all bivariate connections were weak or negligible. These findings correspond with recent Scopus-indexed studies that challenge the explanatory efficacy of demographic factors in technology integration research (Schmid, 2023; Howard et al., 2021).
Kim and Graham (2022) provide parallel data, indicating that criteria such as age, teaching experience, and training background explained negligible variance in teachers’ digital pedagogical ideas when institutional factors were taken into consideration. Tondeur et al. (2021) contend that personal attributes are frequently exaggerated in their role in technology adoption, whilst structural factors—such as leadership endorsement and curricular coherence—are insufficiently scrutinised. The low R² and modified R² values identified in this study support this assertion, suggesting that teachers’ perceptions are not significantly elucidated by individual-level characteristics alone.
In contrast, previous research indicated that younger educators or those with formal digital training may exhibit more favourable opinions of technology (Gudmundsdottir & Hatlevik, 2018). Recent extensive and cross-national research increasingly indicates diminished or inconsistent effects of these variables (OECD, 2021; Scherer et al., 2021a). The current findings endorse this transition, indicating that professional experience and training do not in themselves guarantee confidence or favourable perceptions in CLIL environments, which are characterised by inherent pedagogical complexity.
The minimal correlations identified in this study advise against attributing individual responsibility for technological integration. The findings emphasise the necessity of addressing systemic restrictions, such as time, resources, evaluation frameworks, and professional support structures, rather than perceiving teachers’ perceptions as deficiencies needing correction. This viewpoint corresponds with contemporary critical frameworks in digital education, highlighting institutional accountability rather than individual adjustment (Dooly et al., 2024; OECD, 2021).
The third research question addressed the challenges perceived by CLIL teachers in technology-integrated CLIL instruction. The findings indicate that educators recognise significant challenges in all assessed areas, with resistance to change, insufficient professional development, and financial limitations identified as the most prominent concerns. The results align closely with recent Scopus-indexed research identifying institutional and structural challenges as the principal impediments to digital innovation in multilingual education (Alonso-Belmonte & Fernández-Agüero, 2020; Howard et al., 2021).
The prevalence of resistance to change corresponds with studies emphasising the cultural dimensions of educational innovation. Maknun et al. (2024) contend that resistance frequently signals systemic misalignment rather than individual reluctance, especially in situations where digital integration is imposed without sufficient pedagogical support. Dooly and O’Dowd (2023) add that innovation fatigue and policy overload may diminish teachers’ willingness to adopt new practices.
Country-specific trends further demonstrate the contextual nature of perceived difficulties. The salience of privacy and security issues in Spain reflects contemporary European debates around data protection and digital ethics in education (OECD, 2021). The prominence of budgetary limitations in Türkiye and Albania highlights wider inequalities in educational infrastructure, aligning with the conclusions of Sánchez-Gómez et al. (2023). These variations support the claim that digital CLIL challenges are contextually situated rather than universally distributed.
Conversely, research highlighting technological optimism frequently concentrates on small-scale interventions or well-funded experimental initiatives (Castillo-Cuesta et al., 2024). Although such studies indicate promise, the current results underscore the gap between pilot achievements and systemic scalability. This underlines the need to understand technology integration within real institutional contexts rather than in idealised conditions.
This study aimed to investigate CLIL teachers’ perceptions of technology-integrated CLIL instruction in four European contexts and to determine whether these perceptions were shaped by individual traits, contextual factors, or perceived implementation difficulties. The findings collectively point to a coherent and theoretically significant conclusion: CLIL teachers typically hold moderately positive views on technology integration, which neither varied significantly across national contexts nor were substantially influenced by individual or professional background variables. Teachers’ perspectives appear to be shaped predominantly by the systemic and institutional factors that define their daily instructional experiences. In Albania, Austria, Spain, and Türkiye, educators acknowledged the potential of digital technologies to enhance motivation, comprehension, and multilingual participation; however, their endorsement was measured rather than enthusiastic, indicating a pragmatic recognition of contextual limitations.
The lack of statistically significant cross-country variation indicates that disparities in CLIL policy maturity or national educational traditions do not necessarily translate into differing teacher perceptions, supporting recent research that questions deterministic links between policy history and classroom innovation. Moreover, the regression and correlation analyses revealed that variables including age, seniority, CLIL training, school location, and instructional language did not significantly predict teachers’ perceptions, highlighting the limited explanatory power of individual-level factors for understanding technology integration within complex pedagogical frameworks such as CLIL.
This finding is significant because it warns against deficit-focused narratives that attribute responsibility for digital integration mainly to teachers’ skills, attitudes, or demographics, and instead shifts attention to the wider educational ecosystems in which teachers work. The examination of perceived obstacles corroborates this interpretation. Educators across settings reported significant challenges, particularly regarding resistance to change, inadequate professional development, and financial limitations, suggesting that obstacles to technology-integrated CLIL are predominantly institutional rather than individual. While distinct national tendencies emerged, such as heightened concern over privacy and security in Spain and financial constraints in Türkiye and Albania, these variations reflect contextual circumstances rather than fundamentally divergent attitudes towards technology. Notably, even issues rated as “least severe” received relatively high ratings, indicating that technology integration in CLIL is perceived as demanding across contexts, irrespective of country setting.
Theoretically, the study contributes to CLIL and educational technology research by promoting a shift from individualistic explanations to context-sensitive, system-level interpretations of pedagogical innovation. It also underscores the need for prudent statistical interpretation and the value of descriptive and exploratory cross-national analyses when sample sizes and contextual heterogeneity limit inferential generalisation. The findings indicate that effectively implementing technology-integrated CLIL requires more than mere access to digital tools or isolated training programs; it demands cohesive institutional support, ongoing professional development specifically aligned with CLIL pedagogy, and school cultures that foster experimentation without penalising risk-taking.
While this study provides valuable insights into CLIL teachers’ perceptions of technology integration across four countries, its results must be considered within the context of several limitations. The total sample size (N = 135), when divided across four countries, constrains statistical power for country-level comparisons and limits generalisability. Reliance on voluntary participation led to uneven group sizes and missing data for some demographic variables. Consequently, inferential results are interpreted with caution, and country-level variations are treated as descriptive rather than inferential. Future research would be strengthened by larger, more balanced samples and stratified sampling methods to reinforce cross-national comparisons.
Building on the findings of this study, future research could explore several promising directions. One avenue is to investigate the impact of specific types of professional development, especially those combining CLIL methodology with hands-on digital training, on teachers’ classroom practices and student learning. Comparative studies across more diverse geographic contexts could also illuminate global trends in digital CLIL pedagogy. Finally, to obtain a more complete picture of the role of digital tools in multilingual environments, it would be valuable to examine students’ perspectives on technology use in CLIL education, focusing on its impact on motivation, engagement, and learning outcomes.
Beyond sample size and reliance on self-reported data, a further limitation concerns the initial conceptual ambiguity between teacher identity and teacher perceptions. Despite revisions and clarifications, future research would benefit from validated identity-oriented instruments or mixed-method designs to examine identity development in digital CLIL contexts more directly.
Declaration of Conflicting Interests
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
This article is based upon work from COST Action #21114 CLIL Network for Languages in Education: Towards bi- and multilingual disciplinary literacies (CLILNetLE), supported for the data analysis stage by a COST (European Cooperation in Science and Technology) Virtual Mobility Grant. www.cost.eu
Authors' contributions
All authors participated actively in completing the research study and bringing it to publication. The contribution roles are as follows:
Suheyla Demirkol Orak: Conceptualization, Methodology, Investigation, Writing – Review & Editing.
Ekatarina Strati: Conceptualization, Investigation, Resources, Supervision.
Lucila Pérez-Fernández: Conceptualization, Methodology, Writing – Original Draft.
Data availability
The instruments, data, and the published pan-European and country-specific reports are openly available on the Phaidra repository administered by the University of Vienna, at the following link: https://hdl.handle.net/11353/1.0.2082901.
Use of Artificial Intelligence
The authors employed AI-based tools such as Grammarly and QuillBot to assist with language editing during the preparation and revision of this paper, including improvements to grammar, clarity, structure, and academic tone. All content and intellectual contributions were created by the authors. No generative AI technologies were used for autonomous content creation, data analysis, or interpretation. The authors take full responsibility for the accuracy and integrity of the final manuscript.
References
Alonso-Belmonte, I., & Fernández-Agüero, M. (2020). Language awareness and teacher identity in CLIL contexts: Insights from in-service teachers. System, 94, 102302. https://doi.org/10.1016/j.system.2020.102302
Agmaz, R. F., & Orak, S. D. (2024). CLILNetLE WG4 Türkiye Report. COST Action CLILNetLE.
Cabero-Almenara, J., Romero-Tena, R., Barroso-Osuna, J., & Palacios-Rodríguez, A. (2021). Teachers’ digital competence and perception of technological resources in education. Journal of Technology and Science Education, 11(2), 309–325. https://doi.org/10.3926/jotse.1213
Cabero-Almenara, J., Palacios-Rodríguez, A., & Barroso-Osuna, J. (2022). The role of pedagogical training in teachers’ perceptions of digital technologies. Journal of Technology and Science Education, 12(1), 45–60. https://doi.org/10.3926/jotse.1524
Castillo-Cuesta, L., Amonte, J., & Galindo-Merino, M. M. (2024). Digital mediation and pedagogical transformation in multilingual education: Teachers’ perspectives. Teaching and Teacher Education, 134, 104286. https://doi.org/10.1016/j.tate.2023.104286
Coyle, D., Hood, P., & Marsh, D. (2010). CLIL: Content and language integrated learning (Vol. 1). Cambridge University Press. https://doi.org/10.1017/9781009024549
Dalton-Puffer, C., Hüttner, J., & Smit, U. (2021a). From voluntary to obligatory CLIL in upper secondary technical colleges: Teacher and student voices from a diverse landscape. In The psychological experience of integrating language and content: Teacher and learner perspectives (pp. 93–111). Multilingual Matters Ltd. https://doi.org/10.2307/jj.22730489.13
Dalton-Puffer, C., Hüttner, J., & Smit, U. (2021b). Content and language integrated learning: From practice to principles? Journal of Immersion and Content-Based Language Education, 9(2), 181–200. https://doi.org/10.1075/jicb.21005.dal
Dooly, M., & Sadler, R. (2022). Becoming digitally literate teachers: The role of telecollaboration and task design. Language Teaching, 55(3), 356–370. https://doi.org/10.1017/S026144482100015X
Dooly, M., & O’Dowd, R. (2023). Digital transformation in language education: Critical perspectives on policy and practice. Language Teaching, 56(4), 462–476. https://doi.org/10.1017/S0261444822000304
Dooly, M., Vinagre, M., & Rosi-Solé, C. (2024). Teachers’ professional development for digitally mediated multilingual education. System, 118, 103121. https://doi.org/10.1016/j.system.2023.103121
Ghamarian, K., & Smit, U. (2024). CLILNetLE WG4 Austria Report. COST Action CLILNetLE.
Gudmundsdottir, G. B., & Hatlevik, O. E. (2018). Newly qualified teachers’ professional digital competence: Implications for teacher education. European Journal of Teacher Education, 41(2), 214–231. https://doi.org/10.1080/02619768.2017.1416085
Hoxha, M., & Strati, E. (2024). CLILNetLE WG4 Albania Report. COST Action CLILNetLE.
Howard, S. K., Tondeur, J., Siddiq, F., & Scherer, R. (2021). Ready, set, go! Profiling teachers’ readiness for online teaching in secondary education. Computers & Education, 168, 104204. https://doi.org/10.1016/j.compedu.2021.104204
Howard, S. K., & Mozejko, A. (2022). Teachers’ perceptions of digital innovation across international education systems. Humanities and Social Sciences Communications, 9, 312. https://doi.org/10.1057/s41599-022-01234-7
Hüttner, J., & Smit, U. (2021). Negotiating pedagogical roles in CLIL teacher education. Journal of Immersion and Content-Based Language Education, 9(1), 34–57. https://doi.org/10.1075/jicb.20018.hut
Kim, Y., & Graham, S. (2022). Professional development for CLIL teachers: Pedagogical and digital dimensions. Language Teaching Research, 26(4), 623–644. https://doi.org/10.1177/1362168820938825
Maknun, L. L., Zamzani, Z., & Jamilah, J. (2024). Leveraging Technology-Based AFL and AAL within the Framework of English Differentiated Instruction in Indonesia. Teaching English with Technology, 24(2), 47–70. https://doi.org/10.56297/vaca6841/LRDX3699/VSCH7944
Martín-García, A. V., Sánchez-Gómez, M. C., & Mena, J. (2023). Teachers’ perceptions of educational technology: A systematic multidisciplinary review. Multidisciplinary Reviews, 6, e2023021. https://doi.org/10.29327/multi.2023021
MoU (2022). Memorandum of Understanding for the implementation of the COST Action “CA21114–CLIL Network for Languages in Education: Towards bi-and multilingual disciplinary literacies” (CLILNetLE). COST. https://www.cost.eu/actions/CA21114/
OECD (2021). Teachers and school leaders as valued professionals. OECD Publishing. https://doi.org/10.1787/b1a0b52e-en
Qualtrics (2024). Qualtrics [Computer software]. https://www.qualtrics.com/
Pallant, J. (2011). SPSS survival manual: A step by step guide to data analysis using SPSS (4th ed.). Allen & Unwin.
Pérez-Cañado, M. L. (2020). CLIL research: State of the art. Language Teaching, 53(4), 1–35. https://doi.org/10.1017/S0261444819000450
Pozo-Sánchez, S., López-Belmonte, J., Fuentes-Cabrera, A., & López-Núñez, J. A. (2022). Teachers’ perceptions of digital competence and instructional technology use. Journal of Technology and Science Education, 12(3), 682–697. https://doi.org/10.3926/jotse.1705
Sánchez-Gómez, M. C., Martín-García, A. V., & Mena, J. (2023). Digital competence and educational inequalities in multilingual education. Computers & Education, 190, 104582. https://doi.org/10.1016/j.compedu.2022.104582
Scherer, R., Siddiq, F., & Tondeur, J. (2021a). The technology acceptance model (TAM): A meta-analytic structural equation modeling approach. Computers & Education, 164, 104123. https://doi.org/10.1016/j.compedu.2020.104123
Scherer, R., Siddiq, F., & Tondeur, J. (2021b). All the same or different? Revisiting the role of context in teachers’ perceptions of technology integration. Humanities and Social Sciences Communications, 8, 112. https://doi.org/10.1057/s41599-021-00785-3
Schmid, M. S. (2023). The final frontier? Why we have been ignoring second language attrition, and why it is time we stopped. Language Teaching, 56(1), 73-93. https://doi.org/10.1017/S0261444822000301
Segura, M., & Bárcena-Toyos, P. (2024). CLILNetLE WG4 Spain Report. COST Action CLILNetLE.
Tondeur, J., Scherer, R., Siddiq, F., & Baran, E. (2021). A comprehensive framework for teachers’ digital competence. Educational Technology Research and Development, 69(4), 2235–2254. https://doi.org/10.1007/s11423-020-09860-8
This work is licensed under a Creative Commons Attribution 4.0 International License
Journal of Technology and Science Education, 2011-2026
Online ISSN: 2013-6374; Print ISSN: 2014-5349; DL: B-2000-2012
Publisher: OmniaScience