Advances in Consumer Research
Issue 5: 1960-1970
Research Article
Faculty Performance in the Age of Generative AI: The Role of Organizational Support Systems and Task-Technology Fit in Higher Education
1 Center for Innovative Technologies in Education, Laguna State Polytechnic University, Philippines
2 College of Computer Studies, Laguna State Polytechnic University, Philippines
3 College of Arts and Sciences, Laguna State Polytechnic University, Philippines
4 College of Teacher Education, Laguna State Polytechnic University, Philippines
Received: Aug. 20, 2025; Revised: Sept. 20, 2025; Accepted: Oct. 15, 2025; Published: Nov. 22, 2025
Abstract

This study examines the predictors of faculty performance and productivity (FPP) in higher education institutions (HEIs) in the age of generative AI, using the organizational support systems (OSS) framework and the task-technology fit (TTF) model as theoretical lenses. As generative AI tools rapidly transform academic work, understanding how traditional organizational support and technology alignment influence faculty effectiveness becomes increasingly critical. This research investigates current best practices to enhance these areas while considering the emerging AI landscape. A survey questionnaire was administered to 186 faculty members at a state university in the Philippines to measure the constructs of the OSS and TTF perspectives. The results revealed that faculty members perceived high levels of organizational support systems and that task-technology fit was highly practiced among them. Furthermore, the study found moderate to strong associations between the OSS, TTF, and FPP variables. Finally, two OSS variables (communication, and capacity and resources) and two TTF variables (performance impact and utilization) emerged as significant predictors of faculty performance and productivity. The findings of this study provide important insights for HEI administrators and policymakers in navigating the digital transformation of higher education. The results suggest that institutions must strategically invest in enhancing support systems and ensuring that faculty members have access to appropriate technologies—including generative AI tools—along with the training and resources to leverage these innovations effectively for their pedagogical and research tasks.

Keywords
INTRODUCTION

Higher Education Institutions (HEIs) are complex organizations that operate in a dynamic and competitive environment, particularly in an era marked by rapid technological transformation. The emergence of generative artificial intelligence (Gen AI) tools such as ChatGPT, Claude, and other large language models has fundamentally reshaped the academic landscape, creating both opportunities and challenges for teaching, research, and administrative work [1], [2], [3]. Faculty increasingly rely on clear, supportive institutional goals and objectives to navigate this technological shift effectively. Various factors, including organizational support systems and task-technology fit, influence HEI faculty members' performance and productivity. Understanding how organizational support systems and the institutional capacity to integrate emerging technologies, particularly generative AI, shape faculty effectiveness has become imperative for institutional success in the digital age [3], [4].

 

HEIs are increasingly reliant on technology—including generative AI tools—to support their teaching and research activities. Technology adoption, particularly of AI-powered platforms, has become a strategic imperative for many HEIs to enhance their academic programs, increase student engagement and success, and improve faculty productivity and performance [5]. However, technology adoption alone does not guarantee improved outcomes. Faculty members need comprehensive organizational support to navigate and ethically integrate these powerful tools into their pedagogical and scholarly practices. It is essential that faculty members receive adequate support to use the technology effectively [4], [6]. Therefore, understanding the role and influence of technology, including generative AI adoption, in relation to organizational support systems becomes critical for HEIs to maximize the benefits of technology adoption [7].

 

Organizational support systems encompass a wide range of resources and support mechanisms HEIs provide to enhance faculty members' ability to perform their job-related tasks [8], [9]. These resources and support mechanisms include, but are not limited to, access to technology, professional development opportunities (particularly AI literacy training), mentoring, and technical staff [1], [5]. Previous research has shown that organizational support systems are critical in shaping faculty job satisfaction, commitment, and retention in HEIs [10], [11]. These factors, in turn, are positively associated with faculty performance and productivity outcomes [12], [13]. Various studies indicate that organizational support systems positively influence faculty performance and productivity in HEIs [14]. However, the relationship between organizational support systems and faculty performance and productivity in the context of emerging technologies like generative AI remains underexplored. Some studies found a positive relationship, while others found no significant one [15]. Therefore, it is important to investigate further the relationship between organizational support systems and faculty performance and productivity in HEIs, especially as institutions navigate the integration of AI tools into academic workflows.

 

The degree of alignment between the technology utilized to complete a task and its requirements is known as task-technology fit [6]. The concept of task-technology fit is based on the idea that technology is more effective when it aligns with the requirements of the task at hand [16]. In the age of generative AI, this alignment becomes even more critical as faculty must determine which AI tools are appropriate for specific pedagogical or research tasks. Research has consistently shown that task-technology fit can positively and significantly impact the acceptance and use of technology in various contexts [18].

 

HEIs are complex organizations that rely on faculty members to achieve their goals and objectives [19]. Various factors, including organizational support systems and task-technology fit with generative AI tools, influence faculty members' performance and productivity [2], [7]. Faculty members must perform various tasks, including teaching, research, and service, which increasingly involve digital and AI-enabled tools. HEIs have invested significantly in support systems for teaching and institutional software for research [7]. Therefore, it is important to investigate the extent to which organizational support systems and task-technology fit predict faculty performance and productivity in HEIs, particularly as generative AI tools become integral to academic work. Previous research has shown that task-technology fit is positively associated with job satisfaction, performance, and productivity [19]. However, the relationship between organizational support systems and task-technology fit with faculty performance and productivity in HEIs has yet to be extensively examined in the context of AI transformation, which further motivates the present study's focus on how these factors predict faculty performance and productivity [20].

 

This research aims to investigate the extent to which organizational support systems and task-technology fit—including the alignment of generative AI tools with academic tasks—influence faculty performance and productivity in HEIs. The research focuses on three main questions: (1) What is the relationship between organizational support systems and faculty performance and productivity in HEIs? (2) What is the relationship between task-technology fit and faculty performance and productivity in HEIs? (3) What are the key predictors of faculty performance and productivity in HEIs in the era of generative AI adoption? This study is significant because it extends existing research by examining how traditional organizational support mechanisms intersect with emerging AI technologies to influence faculty outcomes. By understanding these dynamics, HEI administrators can develop more effective strategies to support their faculty members. In addition, aligning technology with task requirements—particularly as AI tools proliferate—helps HEIs enhance faculty members' performance and productivity while promoting responsible and effective AI integration in higher education.

METHODS

Research Design and Instrument. This study employed a descriptive quantitative research design well-suited for systematically describing and quantifying a population's characteristics, trends, attitudes, or opinions by studying a population sample [21]. This approach allowed the researchers to gather numerical data and generalize findings from a sample to the broader population of HEI faculty navigating the integration of generative AI in their academic work.

 

The primary data collection instrument was a carefully structured survey questionnaire. This tool was chosen for its efficiency in collecting large amounts of standardized data and its ability to facilitate statistical analysis [22]. The questionnaire was designed and divided into three distinct parts to address different aspects of the research objective in the context of generative AI adoption in higher education.

 

Part 1 focused on the demographic profile of the respondents. This section gathered essential background information such as age, gender, educational attainment, years of teaching experience, academic rank, and department affiliation. Collecting these data gave a comprehensive understanding of the sample's composition and enabled potential subgroup analyses, including examination of variations in generative AI adoption and comfort levels across different faculty demographics.

 

Part 2 was designed to capture the respondents' perceptions of two critical aspects of their work environment: organizational support systems and task-technology fit, with particular attention to AI-related support and technology alignment. This section utilized Likert-scale questions to measure faculty members' attitudes and perceptions. Perceived organizational support covered the organizational structure, processes/procedures, communication, and capacity and resources provided by the institution—including support for generative AI integration, AI literacy training, and access to AI tools for teaching and research. Task-technology fit questions assessed how well the task characteristics, technology characteristics (including generative AI platforms), and individual abilities aligned to support effective use of AI tools in academic contexts. The questions were designed to evaluate the alignment of technology, particularly AI tools, with the specific requirements of teaching, research, and administrative tasks in higher education.

 

The dependent variable, faculty performance and productivity, was measured using a series of self-reported performance indicators such as mastery evaluation of teaching strategies (including AI-enhanced pedagogies), utilization of LMS and generative AI tools for instruction, classroom management, completion of learning outcomes, conduct of faculty-led research integrated with AI tools where appropriate, evaluation criteria, and publication records.

 

Sample. The study employed random sampling procedures to select 186 faculty members from Laguna State Polytechnic University (LSPU) in the Philippines, ensuring every teaching staff member had an equal probability of inclusion. Laguna, long recognized as one of the most progressive provinces in the Philippines, owes this status to its unwavering commitment to education. Its state university has been at the forefront of shaping and developing the country's next generation of leaders and workforce for over seven decades, including preparing faculty and students to effectively leverage emerging technologies like generative AI. The university holds several notable distinctions: recognition as a State University and Colleges (SUC) Level III institution, Level I Institutional Accreditation, and ISO 9001:2015 Certification. These accolades underscore LSPU's commitment to quality education and alignment with the government's goal of ensuring continuous, high-quality learning opportunities for Filipino youth while embracing technological innovation. The 186 faculty respondents represented a diverse demographic profile, including variations in sex, age, educational background, employment status, department, and academic rank, as well as different levels of experience with and attitudes toward generative AI integration. This diverse sample was reflective of the broader faculty population at LSPU and provided rich insights into how organizational support and technology fit influence performance in an era of rapid AI adoption.

 

Table 1. Demographic Profile

Demographic | Category | f | %
Sex | Male | 84 | 45.16
 | Female | 102 | 54.84
Age | 20-25 | 22 | 11.83
 | 26-35 | 83 | 44.62
 | 36-45 | 34 | 18.28
 | 46-55 | 23 | 12.37
 | Above 55 | 24 | 12.90
Educational Background | Bachelor's Degree | - | -
 | With Master's Units | 70 | 37.64
 | Master's Degree | 53 | 28.49
 | With Doctorate Units | 38 | 20.43
 | Doctorate Degree | 25 | 13.44
Employment Status | Full-time | 120 | 64.52
 | Part-time | 66 | 35.48
College/Department | Industrial Technology | 23 | 12.37
 | Engineering | 17 | 9.14
 | Teacher Education | 45 | 24.19
 | Computer Studies | 18 | 9.68
 | Hospitality Management and Tourism | 16 | 8.60
 | Arts and Sciences | 36 | 19.35
 | Criminal Justice Education | 13 | 6.99
 | Business Management and Accountancy | 18 | 9.68
Position | Guest Faculty | 66 | 35.48
 | Instructor | 68 | 36.56
 | Assistant Professor | 33 | 17.74
 | Associate Professor | 15 | 8.06
 | Full Professor | 4 | 2.16
N = 186

Research Procedures. The researchers initiated the approval process by formally communicating with the president of Laguna State Polytechnic University (LSPU). Upon approval, data collection commenced through an online questionnaire hosted on a secure web-based application. The survey link was distributed via Google Forms to deans, who then forwarded it to their faculty members. This approach ensured efficient distribution across the university's various departments and colleges, facilitating broad participation from faculty at different stages of generative AI adoption. Before beginning the survey, respondents were presented with an informed consent form that outlined the study's purpose—including the investigation of how organizational support and task-technology fit influence faculty performance in the context of emerging AI technologies—the nature of participation, and data usage. Respondents were required to acknowledge their understanding and provide consent before proceeding.

 

The form emphasized the importance of honest responses while assuring participants of strict confidentiality and anonymity. Respondents were informed that their participation was voluntary and that they could withdraw at any time without consequences. Consistent with ethical research standards, especially when examining perceptions about emerging technologies like generative AI, access to sensitive information was strictly limited to the primary researcher responsible for data analysis. Appropriate data protection measures, including password protection and encryption, were implemented to safeguard all collected information. Throughout the process, the researcher maintained a commitment to ethical research practices, ensuring the study's integrity and the protection of participants' rights and privacy.

 

The mean, standard deviation, and Pearson correlation coefficient were utilized to measure faculty respondents' perceptions of organizational support systems, task-technology fit (including AI tool alignment), and faculty performance and productivity. Multiple regression analysis was utilized to determine whether the study variables—including perceptions of institutional support for AI integration and alignment of AI tools with academic tasks—were significantly related and to uncover potential predictors of faculty performance and productivity [23]. This approach allowed the researchers to describe the study sample with descriptive statistics, explore the relationships between variables related to traditional organizational support and emerging AI-enabled technologies, and identify factors that might influence faculty outcomes in an era of digital transformation.
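
To illustrate the analytic workflow described above, the following is a minimal Python sketch of how the descriptive statistics and Pearson correlations could be reproduced from the survey data. The file name and column names (e.g., communication, capacity_resources, fpp) are hypothetical placeholders for the instrument's actual composite scores, not the authors' variable labels.

    import pandas as pd

    # Hypothetical composite scores per respondent; actual item and file names differ.
    df = pd.read_csv("faculty_survey.csv")
    constructs = ["org_structure", "processes", "communication",
                  "capacity_resources", "utilization", "performance_impact", "fpp"]

    # Descriptive statistics (mean and standard deviation) for each construct.
    print(df[constructs].agg(["mean", "std"]).round(2))

    # Pearson correlation matrix between the OSS/TTF constructs and FPP.
    print(df[constructs].corr(method="pearson").round(3))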

RESULTS AND DISCUSSION

Table 2. Mean, Standard Deviation, Cronbach's Alpha, and Correlations between OSS and FPP

Organizational Support Systems: OS = Organizational Structure, P/P = Processes/Procedures, Comm = Communication, CaR = Capacity and Resources.

Faculty Performance and Productivity | OS | P/P | Comm | CaR
Mastery (Mas) | 0.589** | 0.637** | 0.658** | 0.653**
Evaluation of Teaching Strategies (ETS) | 0.516** | 0.570** | 0.655** | 0.639**
Utilization of LMS and Gen AI Tools for Instruction (UtLGAI) | 0.333** | 0.408** | 0.432** | 0.388**
Classroom Management (CM) | 0.313** | 0.393** | 0.407** | 0.394**
Evaluation of Learning Outcomes (ELO) | 0.489** | 0.568** | 0.568** | 0.546**
Mean | 4.70 | 4.70 | 4.62 | 4.63
SD | 0.39 | 0.40 | 0.42 | 0.41
Cronbach's Alpha | 0.898 | 0.907 | 0.872 | 0.886

Faculty Performance and Productivity | Mas | ETS | UtLGAI | CM | ELO
Mean | 4.56 | 4.50 | 4.55 | 4.56 | 4.59
SD | 0.50 | 0.50 | 4.46 | 0.48 | 0.44
Cronbach's Alpha | 0.949 | 0.900 | 0.917 | 0.950 | 0.890

Table 2 illustrates respondents' perceptions of the Organizational Support Systems (OSS) and their relationship with Faculty Performance and Productivity (FPP) within the university setting in the era of increasing generative AI integration. The findings suggest that various aspects of OSS have positively affected FPP within the organization, aligning with previous research highlighting the importance of organizational support in enhancing employee performance and productivity [24], [25]. The OSS, including support for emerging technologies like generative AI, demonstrates strong positive perceptions across all dimensions. The high mean scores and strong internal consistency indicate that respondents generally perceive the organizational structure, processes, communication, capacity, and resources—including AI literacy training and access to AI tools—as robust contributors to FPP. In particular, organizational structure correlates moderately to strongly with the FPP indicators: Mastery (Mas) = 0.589, Evaluation of Teaching Strategies (ETS) = 0.516, Utilization of LMS and Gen AI Tools for Instruction (UtLGAI) = 0.333, Classroom Management (CM) = 0.313, and Evaluation of Learning Outcomes (ELO) = 0.489. The high mean scores and good internal consistency across all OSS dimensions [26], [27] emphasize the role of organizational structure in facilitating performance and productivity, particularly as institutions navigate digital transformation.
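
The Cronbach's alpha values reported in Table 2 summarize the internal consistency of each multi-item scale. As a minimal sketch, and assuming the item-level Likert responses are available in a pandas DataFrame with hypothetical column names, such a coefficient can be computed as follows:

    import pandas as pd

    def cronbach_alpha(items: pd.DataFrame) -> float:
        # items: rows = respondents, columns = Likert items of a single scale
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
        total_var = items.sum(axis=1).var(ddof=1)    # variance of scale totals
        return (k / (k - 1)) * (1 - item_var / total_var)

    # Example with hypothetical item columns for the Communication subscale:
    # alpha = cronbach_alpha(df[["comm_1", "comm_2", "comm_3", "comm_4"]])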

 

Organizational support systems regarding processes/procedures (x̄ = 4.70, σ = 0.40, Cα = 0.907) correlate with faculty performance and productivity. The high mean score and excellent internal consistency suggest that respondents view the processes and procedures positively, particularly those supporting the integration of AI tools into teaching and research workflows, and the FPP indicators show moderate relationships with this dimension. This result aligns with the theoretical foundation provided by Davenport [28] on the importance of well-designed processes in enhancing organizational performance. Organizational support systems regarding communication (x̄ = 4.62, σ = 0.42, Cα = 0.872) likewise correlate with faculty performance and productivity. The relatively high mean score and good internal consistency indicate that respondents perceive communication patterns positively, and communication shows moderate to strong relationships with the FPP indicators: Mas = 0.658, ETS = 0.655, UtLGAI = 0.432, CM = 0.407, and ELO = 0.568. These findings support the work of Robbins and Judge [29] on the critical role of communication in organizations, particularly in facilitating awareness and adoption of new technologies like generative AI among faculty.

 

Lastly, organizational support systems regarding capacity and resources (x̄ = 4.63, σ = 0.41, Cα = 0.886) correlate significantly with faculty performance and productivity. The high mean score and good internal consistency suggest that respondents positively view the organization's capacity and resources, including institutional investment in AI infrastructure, training programs, and technical support for AI tool adoption. Capacity and resources show positive relationships with all FPP indicators: Mas = 0.653, ETS = 0.639, UtLGAI = 0.388, CM = 0.394, and ELO = 0.546. This result aligns with Ahmed et al.'s [30], [31] research on the resource-based view of the firm, which posits that organizational resources and capabilities—including technological infrastructure for AI integration—are fundamental to achieving competitive advantage and superior performance outcomes in higher education institutions.

 

Table 3. Mean, Standard Deviation, Cronbach's Alpha, and Correlations between TTF and FPP

Task-Technology Fit: TkC = Task Characteristics, TeC = Technology Characteristics, TtF = Task-Technology Fit, U = Utilization, PI = Performance Impact.

Faculty Performance and Productivity | TkC | TeC | TtF | U | PI
Mastery (Mas) | 0.514** | 0.580** | 0.560** | 0.814** | 0.908**
Evaluation of Teaching Strategies (ETS) | 0.535** | 0.617** | 0.579** | 0.804** | 0.868**
Utilization of LMS and Gen AI Tools for Instruction (UtLGAI) | 0.281** | 0.391** | 0.364** | 0.516** | 0.568**
Classroom Management (CM) | 0.310** | 0.388** | 0.359** | 0.493** | 0.533**
Evaluation of Learning Outcomes (ELO) | 0.421** | 0.496** | 0.475** | 0.674** | 0.785**
Mean | 4.56 | 4.58 | 4.55 | 4.47 | 4.56
SD | 0.45 | 0.44 | 0.48 | 0.53 | 0.45
Cronbach's Alpha | 0.767 | 0.896 | 0.910 | 0.948 | 0.933

Faculty Performance and Productivity | Mas | ETS | UtLGAI | CM | ELO
Mean | 4.56 | 4.50 | 4.55 | 4.56 | 4.59
SD | 0.50 | 0.50 | 4.46 | 0.48 | 0.44
Cronbach's Alpha | 0.949 | 0.900 | 0.917 | 0.950 | 0.890

Table 3 illustrates respondents' perceptions of the Task-Technology Fit (TTF) and its relationship with Faculty Performance and Productivity (FPP) within the university setting, particularly in the context of integrating generative AI tools into academic workflows. The findings suggest that various aspects of TTF have positively correlated with FPP within the organization, aligning with the seminal work of Goodhue and Thompson [15] on the Task-Technology Fit model [33]. This model posits that information technology is more likely to positively impact individual performance when it fits well with the tasks being performed—a principle increasingly relevant as faculty navigate AI-enhanced teaching and research environments.

 

The TTF, in terms of task characteristics (x̄ = 4.56, σ = 0.45; Cα = 0.767), has demonstrated a positive impact on FPP. The high mean score and acceptable internal consistency indicate that respondents positively perceive the task characteristics—including tasks enhanced or transformed by generative AI—as well-suited to their work. The task characteristics exhibit significant correlations with the FPP indicators: Mastery (Mas) = 0.514, Evaluation of Teaching Strategies (ETS) = 0.535, Utilization of LMS and Gen AI Tools for Instruction (UtLGAI) = 0.281, Classroom Management (CM) = 0.310, and Evaluation of Learning Outcomes (ELO) = 0.421. These findings support the work of Zigurs and Buckland [34] on group support systems, who emphasized the importance of matching task characteristics with appropriate technology support, a principle that extends naturally to generative AI tools offering new capabilities for personalized learning, content creation, and research assistance.

 

Similarly, task-technology fit regarding technology characteristics (x̄ = 4.58, σ = 0.44; Cα = 0.896) also correlates with faculty performance and productivity. The findings suggest that respondents view the characteristics of the technology—including generative AI platforms, LMS systems, and digital tools—positively. Technology characteristics correlate significantly with the FPP indicators: Mas = 0.580, ETS = 0.617, UtLGAI = 0.391, CM = 0.388, and ELO = 0.496. These results align with research by Venkatesh and Davis [36] on the Technology Acceptance Model, which highlights the importance of perceived usefulness and ease of use in technology adoption and performance, particularly relevant as faculty evaluate the capabilities and limitations of emerging AI tools [37].

 

The overall task-technology fit (x̄ = 4.55, σ = 0.48; Cα = 0.910) also influences faculty performance and productivity. Respondents perceive a good fit between tasks and technology, including the alignment of generative AI tools with specific pedagogical and research needs. Overall fit shows moderate relationships with the FPP indicators: Mas = 0.560, ETS = 0.579, UtLGAI = 0.364, CM = 0.359, and ELO = 0.475. These findings support the work of Dishaw and Strong [38] and Al-Maatouk et al. [39], who integrated the Task-Technology Fit model with the Technology Acceptance Model to better explain technology utilization and its impact on performance, a perspective that is particularly crucial as institutions navigate the rapid proliferation of AI tools in academic settings.

 

Task-technology fit in terms of utilization (x̄ = 4.47, σ = 0.53; Cα = 0.948) correlates with faculty performance and productivity. The results suggest that respondents report good utilization of the technology, including active integration of generative AI tools into teaching, research, and administrative tasks. Utilization shows moderate to strong positive relationships with the FPP indicators: Mas = 0.814, ETS = 0.804, UtLGAI = 0.516, CM = 0.493, and ELO = 0.674. This finding supports the works of Burton-Jones and Straub [40] and Wu et al. [41], who emphasized the importance of system usage in realizing the benefits of information technology, particularly as generative AI requires active engagement and experimentation to fully leverage its capabilities for enhancing academic outcomes.

 

Lastly, TTF in terms of performance impact (x̄ = 4.56, σ = 0.45; Cα = 0.933) influences faculty performance and productivity. The results indicate that respondents perceive a strong positive impact of the task-technology fit on their performance, particularly when AI tools are appropriately matched to academic tasks. Performance impact shows strong positive relationships with the FPP indicators: Mas = 0.908, ETS = 0.868, UtLGAI = 0.568, CM = 0.533, and ELO = 0.785.

 

These findings align with the work of DeLone and McLean [42] on the Information Systems Success Model [43], highlighting the link between system quality, use, and individual impact—principles that extend naturally to generative AI adoption, where appropriate tool selection and effective implementation directly influence faculty productivity and institutional outcomes.

 

Table 4. Regression Coefficients, Standard Errors, and Model Summary for the Presumed Influence of the Organizational Support Systems on HEI Faculty Performance and Productivity

Model | B | Std. Error | Beta | t | Sig.
Constant | 1.425 | 0.281 |  | 5.070 | 0.000
Communication | 0.394 | 0.098 | 0.395 | 4.000 | 0.000
Capacity and Resources | 0.282 | 0.102 | 0.274 | 2.772 | 0.006
R² = 0.408; Adj. R² = 0.402; F(2, 183) = 63.102

 

A stepwise multiple linear regression examined the relationship between Organizational Support Systems (OSS) and Faculty Performance and Productivity (FPP) within the university setting, particularly as institutions navigate the integration of generative AI into teaching and research. This statistical approach is widely used in empirical research to identify the most significant predictors of a dependent variable from a set of independent variables [44], [45]. In this study, FPP served as the dependent variable, while the various OSS components were treated as independent variables in examining faculty performance in an era of rapid technological transformation.
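
For readers unfamiliar with the procedure, the sketch below shows one common way to approximate forward stepwise selection with ordinary least squares in Python. It is an illustrative approximation with hypothetical variable names, not the exact routine or software used in this study.

    import pandas as pd
    import statsmodels.api as sm

    def forward_stepwise(df, y_col, candidates, alpha=0.05):
        # Add, one at a time, the candidate predictor with the smallest p-value,
        # stopping when no remaining candidate is significant at the alpha level.
        selected, remaining = [], list(candidates)
        while remaining:
            pvals = {}
            for var in remaining:
                X = sm.add_constant(df[selected + [var]])
                pvals[var] = sm.OLS(df[y_col], X).fit().pvalues[var]
            best = min(pvals, key=pvals.get)
            if pvals[best] >= alpha:
                break
            selected.append(best)
            remaining.remove(best)
        return sm.OLS(df[y_col], sm.add_constant(df[selected])).fit()

    # Hypothetical usage with the four OSS composites as candidate predictors:
    # model = forward_stepwise(df, "fpp", ["org_structure", "processes",
    #                                      "communication", "capacity_resources"])
    # print(model.summary())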

 

The multiple regression analysis revealed that two specific aspects of OSS – communication and capacity and resources – contributed significantly to the regression model, F(2, 183) = 63.102, p < .05. This F statistic indicates that the overall model is statistically significant, suggesting that these OSS components have a meaningful impact on FPP outcomes, particularly in facilitating faculty awareness, adoption, and effective use of generative AI tools in their academic work.

 

The model accounted for 40.8% of the variation in faculty performance and productivity (R² = .408). This R-squared value reflects moderate to strong explanatory power, suggesting that these OSS factors—especially those supporting AI integration—explain a substantial portion of the variability in FPP [23]. An R-squared value of this magnitude is substantial in the social sciences, as numerous complex factors typically influence human behavior and performance [46], [47].
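
Written out from the unstandardized coefficients in Table 4, the fitted model is FPP = 1.425 + 0.394 × Communication + 0.282 × Capacity and Resources. As an illustrative reading, and assuming predictor scores on the same five-point scale as FPP, a faculty member rating both communication and capacity and resources at 4.5 would have a predicted FPP of about 1.425 + 0.394(4.5) + 0.282(4.5) ≈ 4.47.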

 

The finding that communication is a major predictor is consistent with earlier studies showing how important good communication is to organizational performance. For example, Neves and Eisenberger [48] found that explicit communication of organizational support enhances employee performance, particularly regarding new technological tools and resources. The importance of capacity and resources similarly supports the resource-based view, which contends that an organization's distinct combination of resources and capabilities—including technological infrastructure for AI adoption—is essential to its success [49], [50]. It is noteworthy that although processes/procedures and organizational structure were included in the OSS construct, they were not found to be significant predictors of faculty performance and productivity; this does not mean they are unimportant, but rather that their unique contribution to explaining FPP variance was not statistically significant when accounting for other components like communication about AI capabilities and resources for AI integration [51].

 

These results imply that colleges should prioritize enhancing their communication systems and ensuring they have the capacity and resources to support faculty members in effectively integrating generative AI into their work. This includes clear communication about available AI tools, their appropriate uses, institutional guidelines for responsible AI use, and ongoing dialogue about AI's role in teaching and research. Rather than focusing solely on creating policies and procedures around AI, institutions should emphasize open communication about AI capabilities, limitations, and best practices, combined with adequate resources such as AI tool subscriptions, technical support, computational infrastructure, and dedicated AI literacy training programs. Since communication emerged as a critical predictor, future research could explore which specific communication channels or strategies are most effective in supporting faculty AI adoption. Establishing causal linkages will require further investigation, such as experimental designs or longitudinal studies that track faculty performance as institutions implement AI-focused communication and resource allocation strategies over time.

 

Table 5. Regression Coefficients, Standard Errors, and Model Summary for the Presumed Influence of the Task-Technology Fit on HEI Faculty Performance and Productivity

Model | B | Std. Error | Beta | t | Sig.
Constant | 1.218 | 0.158 |  | 7.688 | 0.000
Performance Impact | 0.563 | 0.059 | 0.658 | 9.522 | 0.000
Utilization | 0.168 | 0.054 | 0.214 | 3.094 | 0.002
R² = 0.710; Adj. R² = 0.706; F(2, 183) = 223.637

 

A stepwise multiple linear regression analysis investigated the association between Faculty Performance and Productivity (FPP) and Task-Technology Fit (TTF). In this instance, TTF's multiple components were regarded as independent variables, while FPP was the dependent variable. Information systems researchers frequently employ this statistical method to separate the most important predictors of a dependent variable from a group of independent variables, particularly when examining technology adoption and integration contexts such as generative AI implementation [44], [45].

 

The results of the multiple regression analysis showed that the regression model, F(2, 183) = 223.637, p < .05, had a significant effect on FPP, driven specifically by two TTF components: utilization and performance impact. According to Field (2013), an F-statistic of this kind shows that the model is statistically significant, indicating that these TTF components—particularly those related to effective use and impact of generative AI tools—have a meaningful influence on faculty performance and productivity outcomes.

 

The model accounted for 71% of the variation in faculty performance and productivity (R² = .71). This R-squared value suggests a strong explanatory power of the model, indicating that a large portion of the variability in FPP can be explained by these TTF factors, particularly how well faculty utilize and perceive the impact of AI-enhanced technologies [23]. In social sciences and information systems research, an R-squared value of this magnitude is considered exceptionally high, as numerous complex factors typically influence human behavior and performance related to technology adoption and effective use [46], [52].
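
Expressed using the unstandardized coefficients in Table 5, the fitted model is FPP = 1.218 + 0.563 × Performance Impact + 0.168 × Utilization. As an illustrative reading, and again assuming predictor scores on the five-point scale, ratings of 4.5 on both performance impact and utilization yield a predicted FPP of about 1.218 + 0.563(4.5) + 0.168(4.5) ≈ 4.51.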

 

The emergence of performance impact as a significant predictor aligns with the original Task-Technology Fit model [15], which holds that a good fit between task requirements and technology characteristics—including the capabilities of generative AI tools—leads to higher performance impacts. This finding is also consistent with more recent research [32] showing that perceived performance impacts of technology significantly influence user satisfaction and individual performance in educational settings, particularly as faculty actively engage with AI tools in their teaching and research workflows.

 

The significance of utilization as a predictor supports the extended Task-Technology Fit model, which incorporates technology utilization as a crucial factor [38]. This aligns with the work of Burton-Jones and Grange [53], who emphasized the importance of the effective use of information systems in realizing performance benefits. In the context of generative AI adoption, faculty members' active experimentation with and integration of AI tools into their pedagogical and scholarly practices plays a crucial role in their performance [54]. It is noteworthy that while task characteristics and technology characteristics were also part of the TTF construct, they did not emerge as significant predictors in this stepwise regression. This does not necessarily mean they are unimportant, but rather that their unique contribution to explaining FPP variance was not statistically significant when accounting for performance impact and utilization [51].

 

These findings suggest that to enhance faculty performance and productivity, universities should focus on ensuring that the technology provided—including generative AI platforms—has a substantial and observable impact on teaching and research outcomes, and that faculty members are actively encouraged to utilize these technologies effectively. This might involve training programs to improve technology utilization, including hands-on workshops on AI-assisted course design, research applications, and assessment strategies, and continuously evaluating the performance impact of these tools through feedback mechanisms and performance metrics. Such initiatives could provide deeper insights into how faculty members perceive the performance impact of technology, particularly generative AI tools, and what factors facilitate or hinder effective utilization. Additionally, while task and technology characteristics did not emerge as significant predictors when utilization and performance impact were considered, institutions should not neglect ensuring that AI tools align well with faculty tasks and possess appropriate technical characteristics such as user-friendly interfaces, reliability, and relevance to academic needs.

CONCLUSION

The study's findings revealed that faculty members perceived a high level of organizational support systems and that task-technology fit was highly practiced among them, particularly in the context of navigating and integrating generative AI tools into academic work. This suggests a positive organizational climate within the higher education institution studied, where faculty members feel supported and equipped with appropriate technological resources, including access to and training for emerging AI technologies. Additionally, there were strong positive associations between Organizational Support Systems (OSS), Task-Technology Fit (TTF), and Faculty Performance and Productivity (FPP) variables. This interconnectedness highlights the composite factors influencing faculty performance and underscores the importance of a holistic approach to enhancing academic productivity, particularly as institutions navigate the transformative potential of generative AI in teaching, research, and administrative functions.

 

Two OSS variables (communication, and capacity and resources) and two TTF variables (performance impact and utilization) emerged as the most significant predictors of faculty performance and productivity. This finding is particularly significant as it pinpoints specific areas where interventions could yield the most substantial improvements in faculty performance in the era of AI-enhanced education.

 

These findings have far-reaching implications for HEI administrators and policymakers. They suggest that HEIs should implement effective organizational support systems by enhancing communication channels, building institutional capacity, and allocating resources strategically to support faculty work—including transparent and efficient communication protocols about generative AI tools, policies, and best practices, and equitable distribution of and access to essential resources such as AI platform subscriptions, computational infrastructure, and technical support. When communication is clear, transparent, and consistent—particularly regarding available AI tools, appropriate use cases, ethical guidelines, and institutional support mechanisms—faculty members can access appropriate technology tools and resources to perform their tasks efficiently. However, excellent communication alone is insufficient; institutions must also ensure they have the capacity and resources to back up their messages. This includes adequate funding, skilled technical staff to support AI integration, robust IT infrastructure to run AI applications, and the institutional commitment to effect the changes needed to facilitate AI adoption in teaching and research.

 

To this end, HEIs should consider implementing comprehensive training programs and ongoing support mechanisms to enhance faculty members' task-technology fit, particularly around generative AI literacy and responsible use. This could include workshops on how educational technologies and AI tools can be leveraged for pedagogical innovation, one-on-one tech support for troubleshooting AI platform issues, and the creation of peer learning communities where faculty can share best practices in technology utilization, experiment with AI-enhanced teaching strategies, and collaboratively explore ethical considerations around AI use in education.

 

The study also opens avenues for future research. While it establishes strong correlations, further investigation into the causal relationships between these variables could provide even more actionable insights. Additionally, qualitative studies could explore faculty experiences and perceptions of how they integrate AI tools, organizational support systems, and technology into their daily academic work, and the challenges and opportunities they encounter. Identifying key faculty performance indicators beyond self-reported measures and examining task-technology fit across different disciplines, institutional contexts, and stages of AI adoption will be vital. As higher education continues to evolve in an increasingly digital world, and as generative AI tools become more sophisticated and widely adopted, understanding and optimizing these relationships will be crucial for institutional success and faculty well-being.

 

ACKNOWLEDGEMENTS

The authors express their heartfelt appreciation to the faculty members who graciously dedicated their time to take part in this study. Furthermore, the authors would like to acknowledge and thank Laguna State Polytechnic University for generously supporting both the development and dissemination of this research.

REFERENCES
1. H. Stupnisky, A. BrckaLorenz, B. Yuhas, & F. Guay, "Faculty members' motivation for teaching and best practices: Testing a model based on self-determination theory across institution types," Contemporary Educational Psychology, vol. 53, 15-26, 2017. https://doi.org/10.1016/j.cedpsych.2018.01.004
2. Fedorowicz & B. Konsynski, "Organization Support Systems: Bridging Business and Decision Processes," Journal of Management Information Systems, vol. 8 no. 4, 5-25, 1992. https://doi.org/10.1080/07421222.1992.11517936
3. N. Kurtessis, R. Eisenberger, M. T. Ford, L. C. Buffardi, K. A. Stewart, & C. S. Adis, "Perceived organizational support: A meta-analytic evaluation of organizational support theory," Journal of Management, vol. 43 no. 6, 1854-1884, 2017. https://doi.org/10.1177/0149206315575554
4. Y. Alyoussef, "Acceptance of e-learning in higher education: The role of task-technology fit with the information systems success model," Heliyon, vol. 9 no. 3, 2023. https://doi.org/10.1016/j.heliyon.2023.e13751
5. Deslonde & M. Becerra, "The technology acceptance model (TAM): Exploring school counselors' acceptance and use of Naviance," The Professional Counselor, vol. 8 no. 4, 369-382, 2018. http://dx.doi.org/10.15241/vd.8.4.369
6. O. Awa, O. U. Ojiabo, & L. E. Orokor, "Integrated technology-organization-environment (TOE) taxonomies for technology adoption," Journal of Enterprise Information Management, vol. 30 no. 6, 893-921, 2017. https://doi.org/10.1108/JEIM-03-2016-0079
7. Kim & K. S. S. Lee, "Conceptual model to predict Filipino teachers' adoption of ICT-based instruction in class: Using the UTAUT model," Asia Pacific Journal of Education, vol. 42 no. 4, 699-713, 2022. https://doi.org/10.1080/02188791.2020.1776213
8. A. McCoy, "Work Engagement Outcomes in Higher Education: A Systematic Review of Management's Role in Supporting Job and Career Development," Doctoral dissertation, University of Maryland University College, 2019.
9. V. Aliazas, J. F. Panoy, A. L. Del Rosario, & J. Madrideo, "Critical Success Factors of the Flexible Learning Delivery as Organizational Innovation of One State University in the Philippines," International Journal of Educational Management and Development Studies, vol. 2 no. 3, 61-77, 2021. https://doi.org/10.53378/348736
10. Bashir & A. Gani, "Testing the effects of job satisfaction on organizational commitment," Journal of Management Development, vol. 39 no. 4, 525-542, 2020. http://dx.doi.org/10.1108/JMD-07-2018-0210
11. A. Ashraf, "Demographic factors, compensation, job satisfaction and organizational commitment in private university: An analysis using SEM," Journal of Global Responsibility, vol. 11 no. 4, 407-436, 2020. https://doi.org/10.1108/JGR-01-2020-0010
12. H. Stupnisky, N. C. Hall, & R. Pekrun, "The emotions of pretenure faculty: Implications for teaching and research success," The Review of Higher Education, vol. 42 no. 4, 1489-1526, 2019. https://doi.org/10.1353/rhe.2019.0073
13. M. M. Aliazas & E. N. Chua, "Work Culture and Learning Organization Practices in Promoting Work Productivity among Public Elementary School Teachers," International Journal of Educational Management and Development Studies, vol. 2 no. 3, 39-60, 2021. https://doi.org/10.53378/348735
14. B. Bakker, "Job crafting among health care professionals: The role of work engagement," Journal of Nursing Management, vol. 26 no. 3, 321-331, 2018. https://doi.org/10.1111/jonm.12551
15. L. Goodhue & R. L. Thompson, "Task-technology fit and individual performance," MIS Quarterly, vol. 19 no. 2, 213-236, 1995. https://doi.org/10.2307/249689
16. C. Howard & J. C. Rose, "Refining and extending task–technology fit theory: Creation of two task–technology fit scales and empirical clarification of the construct," Information & Management, vol. 56 no. 6, 103134, 2019. https://doi.org/10.1016/j.im.2018.12.002
17. Spies, S. Grobbelaar, & A. Botha, "A scoping review of the application of the task-technology fit theory," in Conference on e-Business, e-Services and e-Society, pp. 397-408, 2020. Cham: Springer International Publishing.
18. Tam & T. Oliveira, "Understanding the impact of m-banking on individual performance: DeLone & McLean and TTF perspective," Computers in Human Behavior, vol. 61, 233-244, 2016. https://doi.org/10.1016/j.chb.2016.03.016
19. M. Cheng, "How does task-technology fit influence cloud-based e-learning continuance and impact?" Education + Training, vol. 61 no. 4, 480-499, 2019. https://doi.org/10.1108/ET-09-2018-0203
20. Halilić & D. Tinjić, "The Impact of Digitalization on Student Academic Performance in Higher Education: Investigating the change in academic performance of university level students after a sudden switch to digital education due to the COVID-19 outbreak," Jönköping International Business School, 2020.
21. S. Goodman & L. Zhang, "Quantitative research methods," in Public Health Research Methods for Partnerships and Practice, pp. 188-219, 2017. Routledge.
22. Pandey & M. M. Pandey, "Research methodology tools and techniques," Bridge Center, 2021.
23. L. Aberson, "Applied power analysis for the behavioral sciences," Routledge, 2019.
24. Rhoades & R. Eisenberger, "Perceived organizational support: A review of the literature," Journal of Applied Psychology, vol. 87 no. 4, 698-714, 2002. https://doi.org/10.1037/0021-9010.87.4.698
25. Eisenberger, G. P. Malone, & W. D. Presson, "Optimizing perceived organizational support to enhance employee engagement," Society for Human Resource Management and Society for Industrial and Organizational Psychology, vol. 2, 3-22, 2016.
26. Mintzberg, "The structuring of organizations," Englewood Cliffs, NJ: Prentice-Hall, 1979.
27. C. Lunenburg, "Organizational structure: Mintzberg's framework," International Journal of Scholarly, Academic, Intellectual Diversity, vol. 14 no. 1, 1-8, 2012.
28. H. Davenport, "Process innovation: Reengineering work through information technology," Harvard Business Press, 2013. https://www.jstor.org/stable/4165128
29. S. P. Robbins & T. A. Judge, "Organizational Behavior," 17th Edition, Pearson Education Limited, Upper Saddle River, 2016.
30. Ahmed, F. M. Khuwaja, N. A. Brohi, I. Othman, & L. Bin, "Organizational factors and organizational performance: A resource-based view and social exchange theory viewpoint," International Journal of Academic Research in Business and Social Sciences, vol. 8 no. 3, 579-599, 2018. http://dx.doi.org/10.6007/IJARBSS/v8-i3/3951
31. T. Hussain & A. Waheed, "Strategic resources and firm performance: An application of the resource-based view," 2019.
32. Wu & X. Chen, "Continuance intention to use MOOCs: Integrating the technology acceptance model (TAM) and task technology fit (TTF) model," Computers in Human Behavior, vol. 67, 221-232, 2019. https://doi.org/10.1016/j.chb.2016.10.028
33. G. R. Samaravickrama & G. D. M. N. Samaradiwakara, "Information communication technologies used in research," in Research Symposium, 2022.
34. Zigurs & B. K. Buckland, "A theory of task/technology fit and group support systems effectiveness," MIS Quarterly, vol. 22 no. 3, 313-334, 1998. https://doi.org/10.2307/249668
35. F. Cascio & R. Montealegre, "How technology is changing work and organizations," Annual Review of Organizational Psychology and Organizational Behavior, vol. 3 no. 1, 349-375, 2016. https://doi.org/10.1146/annurev-orgpsych-041015-062352
36. Venkatesh, M. G. Morris, G. B. Davis, & F. D. Davis, "User acceptance of information technology: Toward a unified view," MIS Quarterly, vol. 27 no. 3, 425-478, 2003. http://dx.doi.org/10.2307/30036540
37. A. Panergayo & J. V. Aliazas, "Google Classroom Adoption as Learning Management System in Senior High School Using Technology Acceptance Model," Jurnal Pendidikan Progresif, vol. 13 no. 2, 871-883, 2023. https://doi.org/10.23960/jpp.v13.i2.202355
38. T. Dishaw & D. M. Strong, "Extending the technology acceptance model with task–technology fit constructs," Information & Management, vol. 36 no. 1, 9-21, 1998.
39. Al-Maatouk, M. S. Othman, A. Aldraiweesh, U. Alturki, W. M. Al-Rahmi, & A. A. Aljeraiwi, "Task-technology fit and technology acceptance model application to structure and evaluate the adoption of social media in academia," IEEE Access, vol. 8, 78427-78440, 2020. http://doi.org/10.1109/ACCESS.2020.2990420
40. Burton-Jones & D. W. Straub, Jr., "Reconceptualizing system usage: An approach and empirical test," Information Systems Research, vol. 17 no. 3, 228-246, 2006. https://doi.org/10.1287/isre.1060.0096
41. P. J. Wu, D. W. Straub, & T. P. Liang, "How information technology governance mechanisms and strategic alignment influence organizational performance," MIS Quarterly, vol. 39 no. 2, 497-518, 2015. https://doi.org/10.25300/MISQ/2015/39.2.10
42. H. DeLone & E. R. McLean, "The DeLone and McLean model of information systems success: A ten-year update," Journal of Management Information Systems, vol. 19 no. 4, 9-30, 2003. http://dx.doi.org/10.1080/07421222.2003.11045748
43. C. Lee, Y. C. Shiue, & C. Y. Chen, "Examining the impacts of organizational culture and top management support of knowledge sharing on the success of software process improvement," Computers in Human Behavior, vol. 54, 462-474, 2016. https://doi.org/10.1016/j.chb.2015.08.030
44. F. Hair, W. C. Black, B. J. Babin, & R. E. Anderson, "Multivariate data analysis," 8th ed., Cengage Learning, 2018.
45. Field, "Discovering statistics using IBM SPSS statistics," 4th ed., Sage, 2013.
46. Moksony, "Small is beautiful: The use and interpretation of R2 in social research," Szociológiai Szemle, Special issue, 130-138, 1990.
47. Neff, "Work and human behavior," Routledge, 2017.
48. Neves & R. Eisenberger, "Management communication and employee performance: The contribution of perceived organizational support," Human Performance, vol. 25 no. 5, 452-464, 2012. https://doi.org/10.1080/08959285.2012.721834
49. Barney, "Firm resources and sustained competitive advantage," Journal of Management, vol. 17 no. 1, 99-120, 1991. https://doi.org/10.1177/014920639101700108
50. Wernerfelt, "Employment, Markets, Contracts, and the Scope of the Firm," January 6, 2015. http://dx.doi.org/10.2139/ssrn.2369048
51. Z. Keith, "Multiple regression and beyond: An introduction to multiple regression and structural equation modeling," 3rd ed., Routledge, 2019.
52. I. Cook & D. D. Woods, "Operating at the sharp end: The complexity of human error," in Human Error in Medicine, pp. 255-310, CRC Press, 2018.
53. Burton-Jones & C. Grange, "From use to effective use: A representation theory perspective," Information Systems Research, vol. 24 no. 3, 632-658, 2013. https://www.jstor.org/stable/42004286
54. Wang & B. Li, "Technostress among university teachers in higher education: A study using multidimensional person-environment misfit theory," Frontiers in Psychology, vol. 10, 1791, 2019. https://doi.org/10.3389/fpsyg.2019.01791