Exploring students’ learning experience in online education: analysis and improvement proposals based on the case of a Spanish open learning university
- Cultural and Regional Perspectives
- Open access
- Published: 27 August 2021
- Volume 69, pages 3367–3389 (2021)
- Pablo Rivera-Vargas ORCID: orcid.org/0000-0002-9564-2596 1 , 2 ,
- Terry Anderson 3 &
- Cristina Alonso Cano 1
Not surprisingly, the number of online universities continues to expand—especially in Covid-19 times. These institutions all offer “online education” through diverse institutional, technological, and pedagogical processes. A fundamental element, however, is the experience of the students and how they adapt to the educational model of the online university in which they are studying. In this article, we present the main results of a case study developed at one of the longest-established and most relevant virtual universities in the international context. We explored and analysed the students' process of adaptation to the educational model, and their perceptions of their interactions with the pedagogical, institutional, and technological elements designed to support their learning. Qualitative and quantitative methods were used to gather and analyse the data. Drawing on a survey of 1715 students and individual interviews with 30 students, the results show positive evaluations regarding the integration and adoption of technological competencies, and also that online education generally serves as a responsive model for the emergent needs of the learner. However, the results also show that students have important concerns regarding the pedagogical and institutional support provided.
Online education has undergone profound transformations in recent times. Its evolution and configuration have gone hand in hand with the changes that societies themselves have experienced under the pervasive effects of the digital and networked society. These changes intensified during 2020 with the pandemic and its consequences.
The research presented in this article is from a case study developed at the Open University of Catalonia (UOC), located in Spain. The UOC was founded as an online university and has 25 years' experience in offering online education. The main objective of the research was to explore and understand the academic and personal trajectories of the students during their educational experiences, with a focus on their interaction with the main pedagogical and technological elements that make up the online education system and their adaptation to the online education model used by the University. Considering this context, the two research questions that we answer in this article are:
What are the most relevant aspects for the student body when evaluating their online educational experience?
How can the educational experiences of virtual students in online universities be improved?
In order to answer these questions, and before presenting the methodological process together with the results and main conclusions, two central dimensions of the work are analysed: first, the possibilities and limits of online learning; and second, a review of the research literature on the nature of online students and the challenges they face.
Although this research was carried out prior to the pandemic, we consider that its results are useful in identifying some of the possibilities and limitations of this form of education, so as to improve the online educational experience of students at traditional, blended and online universities.
Online education: possibilities and limits for learning
The definition of “Online Education” has evolved substantially. Although its definitions and approaches are varied, the present literature defines it as a mode of education essentially carried out in virtual learning environments (VLEs), through the internet and with active use of digital devices (Anderson & Rivera-Vargas, 2020; Bates, 2019; Lee, 2019). Its emergence and consolidation must be understood not only as an evolution of traditional distance education, but also as a modality capable of dealing with the new formative demands of a technologically infused world (Lee, 2019) and a networked and connected society (Selwyn, 2019).
At the same time, the consolidation of the digital society and the recent consequences of, and responses to, the Covid-19 pandemic have reduced the distance between traditional face-to-face education and online education. Although their target audiences are still essentially different, traditional universities have undergone substantial virtualization processes, gradually leaving behind their analog heritage (Rivera-Vargas & Cobo Romaní, 2019). In addition, virtual universities have opted to offer, in their own virtual learning environments, some of the distinctive features of face-to-face education, including in some cases occasional face-to-face gatherings. An example of this is the continuing effort of virtual universities to strengthen the constructivist and collaborative character of their educational programming. Unlike early models of distance education that focussed on independent study (Fallon & Brown, 2016; Moore, 1989) and instructivist pedagogies (Zawacki-Richter & Anderson, 2014), online education now provides a platform, and thus an educational teaching model, with the affordances to create and sustain simultaneous and accessible learning communities anywhere at any time.
Online education has also defended its role as an inclusive educational modality that enables and facilitates access to higher education and the development of digital competences. Research efforts such as Sangrà et al (2012) and Hills (2017) show that the active use of electronic and digital media and devices in online education can facilitate access to education and the development and improvement of its quality. Alongside formal curriculum delivery, Chu (2010) and Anderson & Rivera-Vargas (2020) argue that online education provides and standardizes the technological and digital competencies of students using the same virtual environment. This reduces the potential digital divide across multiple intersectional dimensions including gender, social class, physical disability, geographical location, and age (Chu, 2010).
It is important to highlight that these characteristics of online education have coexisted for years with opposing critical views, some of which question their effectiveness in real contexts rather than their global conceptualization. These critiques highlight, for example, the limitations of mediated human contact and the need for the student to have high levels of personal motivation to be successful (Anderson & Rivera-Vargas, 2020; Kocdar et al., 2018). A problem also arises when the design of a virtual environment and learning activities is limited to the organization and dissemination of electronic resources (for example, posting lectures online) and is not built on an ecological support for active learning (Davis et al., 2018).
In this sense, one of the most researched and most relevant aspects of the online education experience is the effort to increase student motivation in the educational process: through autonomy in learning, through effective use of digital tools, and through an active and interactive relationship between and amongst students and teachers. Guri-Rosenblit and Gros (2011) highlight the potentially horizontal nature of this pedagogical relationship, giving relevance to the support that the student receives from the teaching staff and the institution as a whole. At the same time, Palloff & Pratt (2013), Kocdar et al. (2018) and Pilkington (2018) highlight the importance of helping students achieve autonomy and self-regulation so as to motivate them and thus enrich their educational experience.
In the next section, we delve more deeply into the importance of the student body in the educational model of online university education.
The student in virtual learning environments
The development and large-scale accessibility to digital technologies and resources, together with the need for lifelong learning motivate why many people decide to pursue their university education in virtual environments (Jung, 2011 ; Pilkington, 2018 ). Many of these students cannot access traditional learning centres, with conventional face-to-face models, due to physical or economic constraints. However, they still need to acquire specific knowledge and skills that are applicable to their personal and professional lives (McKnight et al., 2016 ). In addition, it is usually students with professional experience and digital skills, who generally seek an education that allows them to integrate their previous knowledge, with new skills and knowledge (Jung, 2011 ; Sánchez-Gelabert et al., 2020 ) while adapting to their emergent needs in their professional and personal lives.
From a constructivist viewpoint, Anderson (2016) and Vuopala et al (2016) maintain that the learning process with digital tools is, or at least can be, fundamentally collaborative. Students create knowledge through interaction between themselves, the teacher, and their environment, which allows, and indeed forces, them to assume the leading role in their learning process. The demographic characteristics of the student body engaged in online learning are heterogeneous. Jung (2011) and Murphy and Stewart (2017) noted that the majority of the first wave of online education students made contact with computers and digital technologies in late youth or adult life (late twentieth century and early twenty-first century). That is, these students came from a campus-based educational environment where the teacher was the leader of the process, who set the timetable and dictated how knowledge would be acquired.
The following generations of virtual students span a great variety of ages, the majority coming from regulated schooling focused on teacher-led transmission of knowledge, but they are more prone to proactivity (Murphy & Stewart, 2017). Thus, they are more accustomed to collaboration between equals, to being more democratic and more diverse, and to engaging in less hierarchical online relationships. Although there are differences and varying needs among online students according to their culture, the disciplines they choose to study and their age, they show many common characteristics in their identity and performance when learning in these environments (Kocdar et al., 2018; McKnight et al., 2016). Perhaps the most striking commonality, although not surprising, is that the majority enrol for the first time in online education without knowing what it is to be an online student, without knowing what to do, what it entails, or how to perform optimally, and without having received any training (Jung, 2011; Pilkington, 2018). Despite this, most are able to adapt and learn due to the flexible context, the transference of digital skills from social and professional contexts, having individual responsibility for their time, and the ability to access educational resources until the completion of their course (Pilkington, 2018).
When referring to the online student, there is a tendency to highlight those actions that describe their predisposition to participate in online learning environments (Sánchez-Gelabert et al., 2020). This is important if we take into account the multiple transformations that have occurred in the educational field during the last decades (Lee, 2019). In fact, the figure of the student in virtual environments as a learner with a higher level of autonomy not only emerges as a consequence of the development and use of digital technologies in educational contexts, but also from previous efforts aimed at positioning the student as a leading actor in the teaching and learning processes, which as a result strengthens their autonomous learning (Farrell & Brunton, 2020). Palloff & Pratt (2013), for example, outline the profile of the successful online student which, although emergent and mediated through the use of computers, is marked by the abilities and skills to manage the tools and resources of these learning environments. The student also gains skills and competencies that facilitate their autonomous learning. Palloff & Pratt (2013) identify these characteristics of online students:
Ease of sharing their work, points of view, and experiences with others in order to build virtual communities.
Improvement in written communication skills, in order to relate to others online and to develop social capital by exposing personal and communication skills.
Ability to self-motivate and self-discipline, given the flexibility of the courses.
Commitment to the course, investing a significant amount of time and effort.
Adaptability—a critical position in their learning process.
Understanding that reflection is part of the learning process and that learning is a transformative experience.
The use of digital technologies in the teaching and learning process aims to partially remedy the deficiencies of the traditional teaching and learning model used on campus and formerly used in older distance education models (Alqurashi, 2019). In this way, it is critically important to insert strategies that encourage the student to work autonomously, reinforce their self-control, and support leaving behind the conception of the student as passive, dependent, rigid, solitary, and non-reflective (Murphy & Stewart, 2017; Sánchez-Gelabert et al., 2020).
Method and context
Context of the Open University of Catalonia
This case study focuses on the Open University of Catalonia (UOC). This institution was founded in 1995 during the period of the initial internet expansion. The UOC sought to be an academic environment adapted to the characteristics of modern society (UOC, 2009 ). It is recognized as one of the first universities in the world that has supported its teaching and learning model with the integral use of a virtual learning environment (VLE) or a learning management system (LMS) (Grau-Valldosera & Minguillón, 2014 ).
During the 2019–2020 academic year, the UOC had 56500 active students, of whom 40500 were undertaking bachelor's degree studies and 16000 master's degree studies. From its foundation (1995) until December 2020, there have been a total of 89300 bachelor's and master's degree graduates, Footnote 1 spread across 134 countries around the world (UOC, 2020).
According to the institutional report for the 2018–2019 academic year (UOC, 2020 ), the typical student who begins studies at the UOC combines studies with work (81.70%), works in the private sector (67.40%), is studying to progress professionally (61.10%) and opts for the UOC in order to reconcile studies, work, and other responsibilities (50.40%) (UOC, 2020 ). Finally, it should be noted that the UOC has been increasing its international prestige in recent years. The latest version of the university ranking created by the Times Higher Education journal (THE, 2021 ), has placed the UOC in the following quality dimensions: among the top 150 young universities; the second-best Spanish university under 50 years old; the best online university in Ibero-America; and among the top 601–800 global universities. Footnote 2
Approach to the educational model
The UOC's educational model is focused on extending the learning possibilities of the student. To this end, it offers a wide diversity of strategies, resources, and pedagogical work dynamics, based on support of the teaching team and on interaction with classmates, who try to empathize with their needs and lifestyles (UOC, 2020 ). The model supports students learning as they work and communicate on the network (Sánchez-Gelabert et al, 2020 ).
The UOC’s educational model is based on the integration of four fundamental elements (Grau-Valldosera & Minguillón, 2014; UOC, 2009, 2020):
The commitment to a horizontal and collaborative relationship between students and teachers.
The promotion of student autonomy and self-regulation, placing these at the center of their learning process.
Human support of the student, instantiated by three roles:
“Consultant”: providing pedagogical and subject matter support on a course-by-course basis
“Tutor”: providing institutional pastoral support throughout the students’ enrolment Footnote 3
“Technological support department”: providing technical support and accompaniment
Resources and content (spaces, tools and didactic materials with active use of digital technologies).
In addition, the UOC supports its pedagogical commitment with evaluative flexibility and the creation of a learning environment that favours interactivity and cooperation between students, and between students and the university (Sánchez-Gelabert et al., 2020 ). It is a model focused on accommodating the contemporary students as active participants in their learning processes (Kocdar et al., 2018 ). Footnote 4
In this research, a sequential mixed methods approach (qualitative first) has been used, complementing the use of qualitative methods with quantitative information collection and analysis techniques (Goetz & LeCompte, 1988 ). At the analytical level, an interpretative view is assumed, because it emphasizes the concern for the local and for the generation of a knowledge that is relevant and that emerges within these environments. This has also allowed us to observe how the students from an online university interact with educational transformations, and what realities and subjects they recognize (Anderson & Rivera-Vargas, 2020 ). The study focuses on how reality is generated through ordinary actions. We explore how students, beyond the technology itself, create and recreate learning contexts from their interaction with digital technologies (Laux et al., 2016 ). Thus, the research can be considered a multi-method, single case study (Stake, 1995 ) bounded by undergraduate students enrolled in the Open University of Catalonia (Spain).
Information collection tool
Individual active interviews were conducted (Denzin, 2001 ; Holstein & Gubrium, 2020 ) with students, academics and institutional representatives of the UOC. In this type of active interview, the interaction between the interviewer and the interviewee tends to be symmetrical, that is, both having an active role. We used semi-structured interviews with topics and questions derived directly from the objectives of the research (Denzin, 2001 ), but also left room for participants to expand and shape the conversations. Table 1 shows the main topics developed in the interview.
At the quantitative level, based on the statements and discourses obtained in the analysis of the interviews, a questionnaire was designed (Goetz & LeCompte, 1988). This complements and triangulates the information obtained in the qualitative phase, and at the same time provides a more representative reflection of the experiences of those interviewed. To ensure the content validity of the questionnaire, the initial 40 questions were validated through expert judgment. Seven experts in online university education and distance education participated, from Spain (3), Canada (2), the UK (1) and Chile (1). This process was carried out through validation matrices, in which each expert responded individually with a Yes or No to the validity conditions of each question. Of the 40 questions, 35 obtained a quality assessment of over 75%, while the 5 questions evaluated under 75% were dropped from the final evaluation instrument. Thus, the final instrument was made up of 35 questions, which included closed multiple-choice responses and Likert scales. Footnote 5
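The 75% agreement threshold described above can be illustrated with a minimal sketch (not the authors' actual procedure; the question identifiers and votes below are hypothetical). With 7 experts, a question needs at least 6 "Yes" votes to survive, since 6/7 ≈ 85.7% clears the threshold while 5/7 ≈ 71.4% does not:

```python
# Hypothetical validation matrix: question id -> one Yes/No vote per expert.
votes = {
    "Q1": [True] * 7,                 # 7/7 = 100% agreement
    "Q2": [True] * 6 + [False],       # 6/7 ≈ 85.7% agreement
    "Q3": [True] * 5 + [False] * 2,   # 5/7 ≈ 71.4% agreement
}

def approval(vs):
    """Proportion of 'Yes' votes for one question."""
    return sum(vs) / len(vs)

# Keep questions at or above the 75% threshold; drop the rest.
kept = [q for q, vs in votes.items() if approval(vs) >= 0.75]
dropped = [q for q, vs in votes.items() if approval(vs) < 0.75]
# kept -> ["Q1", "Q2"]; dropped -> ["Q3"]
```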
The internal consistency of the instrument was calculated and interpreted using data from a test application. The result, α = 0.60, indicated «good» internal reliability for scales between 0.6 and 0.8 points (Cohen et al., 2007). The test application process that led to the final design of the questionnaire spanned two months. The data from the questionnaire were downloaded and saved in an IBM SPSS 25.0 (2017) spreadsheet, with consideration for the ethical aspects of anonymity and data security compliance. The questionnaire was administered using the online platform of the UOC Office of Planning and Quality.
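For readers unfamiliar with the reliability statistic reported above, the following sketch computes Cronbach's alpha from item-level responses using the standard formula α = k/(k−1) · (1 − Σ item variances / variance of total scores). This is illustrative only (the toy responses are hypothetical, not the study's data):

```python
def cronbach_alpha(items):
    """items: one list of responses per questionnaire item (same respondents in each)."""
    k = len(items)            # number of items
    n = len(items[0])         # number of respondents

    def var(xs):
        # Sample variance (denominator n - 1), computed without external libraries.
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Total score per respondent across all items.
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Hypothetical Likert responses (1-5) from 5 respondents to 3 items:
responses = [
    [4, 3, 5, 2, 4],
    [3, 3, 4, 2, 5],
    [5, 2, 4, 3, 4],
]
alpha = cronbach_alpha(responses)
```

On this toy data alpha comes out around 0.80; the study's instrument scored 0.60, which the authors interpret per Cohen et al. (2007).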
Sample and participants
The research targeted students in undergraduate degree programmes, with current enrolment at the time of the research (2016–2017 academic year), who had had at least two consecutive years of experience at the UOC. The process was divided into two phases. In the first, essentially qualitative, phase, three undergraduate degree programmes were reviewed: computer engineering, psychology, and business administration and management (BAM). This choice was not arbitrary, since these are the oldest programme (Computer Engineering) and those with the highest numbers of students (Psychology and BAM) (UOC, 2009, 2020). In addition, each of these undergraduate degree programmes belongs to a different knowledge area (Table 1). Finally, 30 individual active interviews of students across the three selected degree programmes were carried out (10 students from each). These students were selected based on three criteria:
Had completed at least two full years at the UOC
Had passed all the subjects in which they had enrolled
Volunteered to participate in this investigation. An invitation was sent to all students who met the first two criteria. Each of the 30 interviewees responded with an email confirming their personal interest in participating.
In the second, quantitative phase, the complementary questionnaire (Goetz & LeCompte, 1988) was provided to all 7885 students enrolled in the 15 undergraduate programmes offered by the UOC in 2016. The questionnaire was completed by 1715 students (21.75%), which is consistent with a 3% margin of error at a 99% confidence level.
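The reported error and confidence figures can be checked with a short calculation (a sketch, not the authors' code), applying the standard margin-of-error formula with a finite population correction and the conservative assumption p = 0.5:

```python
import math

N = 7885   # population: students invited
n = 1715   # sample: completed questionnaires
z = 2.576  # z-score for a 99% confidence level
p = 0.5    # most conservative proportion assumption

# Finite population correction, appropriate when the sample is a large
# fraction of the population (here about 22%).
fpc = math.sqrt((N - n) / (N - 1))

margin = z * math.sqrt(p * (1 - p) / n) * fpc
# margin is roughly 0.0275, i.e. within the 3% error the authors report
```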
The analysis of the qualitative information used discourse analysis by grouping and categorizing the responses from the interviewees. We selected this type of analysis from Wetherell & Potter ( 1998 ), because it poses discourse as a social practice, and not just as a set of statements. In the words of Iñiguez & Antaki ( 1994 ) we extracted “a set of linguistic practices that maintain and promote certain social relationships” (1994: 63).
In the qualitative information coding and treatment phase, the transcripts were grouped according to the three degree programmes considered (computer engineering, psychology, and BAM). The coding process was then developed from the interview guideline. Subsequently, the units of meaning created for each degree were grouped into a single frame of group narrations. This work reduced the volume of data, highlighting those collective narratives directly and indirectly linked with the research objectives. By systematically reading the codes, selected citations and their context, we searched for patterns, themes and regularities, as well as contrasts, paradoxes and irregularities (Denzin, 2001; Denzin et al., 2016). From this, we proceeded to relate the codes, grouping and regrouping them until they made sense, in order to create consolidated discourses. The regrouping of narratives generated a new analytical sense, allowing, in this way, new interpretative schemes (Wetherell & Potter, 1998). This work gave rise to four categories:
Assimilation of digital competences
Flexibility and adaptation to the UOC model
Virtual learning environments
Student support
Once the categories were organized, they were analysed according to a combined model, in which the content of the narratives was worked on, also considering their discursive form, recovering analytical resources from the repertoire model of Wetherell & Potter ( 1998 ).
For the analysis of the quantitative information, the data were analysed using SPSS software, applying descriptive analysis techniques. The classification and tabulation were made according to the Knowledge Areas in which the UOC organizes its degree programmes, shown in Table 2.
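The descriptive, per-area tabulation described above can be sketched as follows (illustrative only; the authors worked in SPSS, and the area names and ratings here are hypothetical):

```python
import statistics

# Hypothetical 0-10 questionnaire ratings keyed by knowledge area:
responses = {
    "Arts and Humanities": [8, 9, 7, 8],
    "Information Sciences": [7, 8, 7, 7],
}

# Descriptive statistics per knowledge area: mean and sample standard deviation.
summary = {
    area: (statistics.mean(scores), statistics.stdev(scores))
    for area, scores in responses.items()
}

for area, (mean, sd) in summary.items():
    print(f"{area}: mean={mean:.2f}, sd={sd:.2f}")
```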
Finally, in the last phase, analytical triangulation and discussion was carried out using both the qualitative and quantitative information obtained. The coherence and correlation between both types of information was analyzed, identifying the most significant similarities and differences.
Analysis of results
The results presented below have been organized around the four emerging categories mentioned in the previous section. Each of the four categories was addressed using triangulation of data, including the constructed discourse, quotations from the interviewed students, and descriptive results of the questionnaire, illustrated in tables grouped according to the Knowledge Areas described in Table 1.
Assimilation of digital competences

One of the main results of this research was the finding that the online education experience itself provides powerful and sustainable digital skills to students. This is significant because this was not a goal established in the pedagogical models proposed by the UOC (UOC, 2009, 2020), nor in the initial expectations of this research.
An important part of the student body considers this process of assimilation of skills in digital technologies as an added value to the formative experience, as affirmed by this fifty-two-year-old student of Psychology:
You are using the virtual campus, then an application, and then you have to create one of these, etc. At first it is difficult to understand the mechanics of them, but once you manage to do it for the first time, everything flows and is faster (Psychology student).
Although a significant majority of the students stated that they had some level of digital competence prior to joining the UOC, almost all of the participants interviewed acknowledged that they now use digital technologies more actively and reflectively in other areas of their work and leisure activities. This they attribute to their formative experience using e-Learning tools.
Results obtained from the questionnaire confirm those gathered from the individual interviews. As we see in Fig. 1, students from all areas of knowledge acquired new technological skills.
I consider that studying at the UOC using online education technologies has allowed me to gain new technological competences. According to discipline (%)
As might be expected, we see a trend (Fig. 2) towards greater perceived value of the online education skills acquired among older students. Beyond this, the evidence shows that age differentiates students when it comes to assimilating competences in digital technologies during the online education experience.
I consider that studying at the UOC under the online education modality has allowed me to assimilate new technological competences. According to Age (%)
Complementing this data, we found from students' comments that variables such as experience with digital technologies, gender, age, and degree of study do not negatively affect the academic interaction between classmates, nor their learning performance in VLEs. This is in line with Chu's (2010) argument that the online education experience may reduce the digital divide between ages, genders and disciplines.
However, the main obstacle to interaction with colleagues is more related to the lack of concrete experience in the virtual campus of the UOC, as this student of BAM states.
You can quickly find out who is harder to study with online. If you are a new student, everything is slower (when you have to work in a group), they also fill the forum with questions, send you some personal messages, etc. The key is in the years (level of experience) you have been in the UOC, if you have several, then you do everything right. This goes beyond the age or sex of the people, or other aspects (BAM Student).
Thus, the increased participation in VLEs, in addition to favouring the assimilation of new digital skills, is valued by the student body as an action that tends to reduce the digital divide between students.
Flexibility and adaptation to the model
In this second dimension, students' evaluations of the online educational model proposed by the UOC are presented. At the same time, the compatibility of the model with their own lifestyles is analysed. The flexibility of the learning model (Sangrà et al., 2012), together with its largely asynchronous character (Jung, 2011), allowed students to continue their studies while engaged in often busy professional and personal lives. According to Anderson (2016) and Anderson & Rivera-Vargas (2020), this generates greater commitment and leadership of the student in their own learning process:
Studying from my house and doing other activities at the time of day that suits me best, is fundamental for me. Otherwise, (using a face-to-face model) I couldn’t work or have a personal life (Psychology Student).

Because of my work, I have a free schedule at very unusual times. That’s when I can take advantage of studying and doing evaluative activities. Having this autonomy and leading my learning process, is what, in fact, allows me to be committed to my studies and motivates me to continue (Computer Engineering Student).
Complementing this discourse, the results of the questionnaire reaffirm that students across the UOC's Knowledge Areas consider that the online educational model at the UOC allows them to combine their studies with the usual organization of their time, and also with their personal and professional life (Table 3).
The results also reflect that many virtual students of the UOC opt for online education because their lifestyle prevents them from attending university institutions with face-to-face instructional models:
In order to get a job promotion, for me it was important to finish my studies, but studying in person was impossible. So, I looked for distance education alternatives and the truth is that it works very well for me. So far, I have not had to change my routine in any way (Computer Engineering Student).
As can be seen in Fig. 3 , this sentiment is felt beyond the field of study or knowledge areas.
With the lifestyle I have, it was very difficult for me to study in person. (%)
One of the central aspects of the UOC's educational model is its permanent evaluation component, based on Continuous Evaluation Tests (CET). This flexible evaluative model, which aims to promote the autonomy and leadership of students in their educational process (Sánchez-Gelabert et al., 2020 ; UOC, 2020 ), represents the main reason why students choose to complete their university education in the online education system (Fig. 4 ).
If the continuous assessment model did not exist (through the CET), I could not study at the UOC (%)
However, even though students recognize the flexibility of the UOC's educational model, aspects such as “personal academic organization and planning” are recognized as difficult challenges. This is significant, given the importance of enhancing students' self-regulation and autonomy throughout their educational process (Pilkington, 2018; UOC, 2020). In this sense, beyond the flexibility of the evaluation model (considering evaluation and pedagogical tools), students also find that, due to other commitments in their lifestyle, it is often difficult for them to plan and fulfil their academic responsibilities.
The model is flexible, and it is assumed that you can self-manage everything, but in practice, it does not stop having a very intense lifestyle. I finish each semester overwhelmed and pushing my academic abilities to the limit (BAM Student).
In general, we found that students organize and submit their evaluation work without sufficient time, and without much space for reflection and content review. In the opinion of the interviewees themselves, the consequence for many is that it is not possible to complete the courses with high levels of academic quality.
In this third section, we present students' perceptions and assessment of the structure and design of the virtual campus. The student perception of the virtual campus, where most student interaction takes place, is generally positive. Students recognize it as a friendly environment whose applications and tools provide an effective learning space. One student explains:
Without having much previous experience in these environments (VLEs), the truth is that I have always found the (virtual) campus of the UOC easy to manage. It is quite intuitive. In addition, the fact that you can give it your own design makes it more representative of your own identity (Psychology Student).
Most students had positive opinions of the design, technical performance, and operation of the virtual campus. However, we found (Table 4) that students' assessments tend to vary depending on the area of knowledge and the variable evaluated. For example, students in Law and Political Sciences (7.94) and Arts and Humanities (8.16) give a very good general assessment of the virtual campus, while for students in Computer Science, Multimedia and Telecommunications (7.67) and Information Sciences (7.38), the assessment tends to be less favourable. Table 4 also shows that students generally rate positively access to materials, assessment activities, and communication with consultants, tutors, student services, and peers, with only small differences between knowledge areas.
Generally, we note that the two worst-valued aspects relate to the campus's communication mechanisms, both communication with institutional departments (7.19) and communication between students (7.36). The two best-valued aspects of the virtual campus are access to the pedagogical materials of the subjects (7.95) and access to the continuous evaluation tests (8.00).
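The per-area averages reported here and in Table 4 are simple grouped means of the survey ratings. As a minimal sketch of that computation, with illustrative values rather than the study's actual dataset or variable names:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical survey extract: overall virtual-campus ratings (0-10) paired
# with each respondent's knowledge area. The values are illustrative only;
# they are not the study's data.
responses = [
    ("Arts and Humanities", 8.2),
    ("Arts and Humanities", 8.1),
    ("Law and Political Sciences", 7.9),
    ("Information Sciences", 7.4),
]

# Group ratings by knowledge area, then compute the mean per group,
# mirroring how per-area averages like those in Table 4 are produced.
by_area = defaultdict(list)
for area, rating in responses:
    by_area[area].append(rating)

area_means = {area: round(mean(vals), 2) for area, vals in by_area.items()}
print(area_means)
```

The same grouping logic applies to any of the rated dimensions (materials, assessment activities, communication channels).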
- Student support
Support for students from UOC staff is probably one of the most important aspects of the UOC's educational model (Sánchez-Gelabert et al., 2020). Both the interview and survey results show that although a small majority of students report satisfaction with tutors and consultants, a large portion of students express dissatisfaction with these support roles (see Table 5). When asked about the feedback they receive from the consultants, the interviewees describe it as weak and inefficient, and note how frustrating this is.
In general, you spend a lot of time doing your work, so you expect relatively clear feedback. But nothing, just a grade; you do not know the reasons behind it. It seems to me very insincere (BAM Student).

Corrections should be returned so that you can see how you have done in your work and what criteria were used to evaluate it, because if not, in the end, the only thing you look at is whether you have passed or not (Psychology Student).
In relation to the performance of tutors, many participating students consider that they are not very effective when it comes to managing and solving certain problems:
In the three years that I have been at the UOC, I have only managed to contact my tutor twice, and I must have written more than 20 emails. It is very frustrating, although I confess that at least it is good to know that they exist and that, in case of emergency, you can still count on them. In fact, if this role did not exist, surely people would ask for it (BAM Student).
In any virtual environment, the supportive role of human contacts remains critical to student success (Tait, 2014). As seen in Table 5, this area of support is problematic for many students, as revealed in the questionnaire. For example, if the four indicators of the Likert scale are grouped into just two, "positive" ("Good" and "Very good") and "negative" ("Bad" and "Very bad"), we observe that, except for Arts and Humanities Studies, the results are in general split evenly between positive and negative evaluations.
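Collapsing the four-point Likert scale into two groups is a simple recoding and proportion step. As a minimal sketch of that step, using illustrative responses rather than the study's data:

```python
from collections import Counter

# Hypothetical four-point Likert responses on tutor support. The labels
# follow the scale described in the text; the responses are illustrative.
responses = ["Very good", "Good", "Bad", "Very bad", "Good", "Bad"]

# Collapse the four indicators into two groups, as in the analysis:
# "positive" ("Good", "Very good") and "negative" ("Bad", "Very bad").
grouping = {
    "Very good": "positive",
    "Good": "positive",
    "Bad": "negative",
    "Very bad": "negative",
}
grouped = Counter(grouping[r] for r in responses)

# Share of each group, to check whether evaluations are evenly split.
total = sum(grouped.values())
shares = {group: count / total for group, count in grouped.items()}
print(shares)
```

With this illustrative sample the shares come out evenly split, which is the pattern the questionnaire revealed for most knowledge areas.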
As seen in Table 4, dimensions such as technological management, adaptation to the educational model, and the flexibility of its evaluation model receive generally positive evaluations from students. However, when assessing two of the most relevant actors in the process of student accompaniment and support, as shown in Table 5, the results are not so positive. The only exception is the area of Arts and Humanities Studies, where the assessment of the performance of tutors and consultants is substantively positive. In the five remaining areas of knowledge, there is a symmetry between the sums of the grouped indicators ("Very good" plus "Good" versus "Very bad" plus "Bad"); notably, in Law and Political Science and in Studies in Psychology and Educational Sciences, the sum of the negative evaluations exceeds 50% of the total, for both consultants and tutors.
For me, the tutor’s performance has been very bad. At first, I did not know how the virtual campus worked, I did not know what and how many subjects I had to take. I did not even know if it made much sense to go to college at my age. I wrote several emails asking for his guidance. He never answered me (Psychology Student).
This lack of communication was also identified by students from other areas of knowledge where, although the opinion of the performance of the consultants and tutors is better, we recognized multiple statements that openly criticize their performance.
Since I entered the UOC, I had to do everything by myself. I have never felt that a consultant oriented me well pedagogically, even his feedback tends to be monosyllabic. From the tutor, I have nothing to say, in three years, I still do not know who he is (BAM Student).
These data show that pedagogical support for the student, which represents one of the most significant aspects of online education (Almusharraf & Khahro, 2020; Sánchez-Gelabert et al., 2020; UOC, 2020), is not satisfactory for a large number of students. These assessments of the roles of the consultants and tutors reveal significant weaknesses in the UOC’s educational model.
Discussion and conclusions
In this section, we will answer the two questions that guided the research.
Regarding the first question, “From the perception of the students of the Open University of Catalonia, which are the most relevant aspects for the student body when evaluating their online educational experience?”, the empirical phase of this research allowed us to identify four categories that were relevant to students during their online university education. Next, we contrast the evidence supporting these categories with the literature presented in the first part of this article.
We find that participation in virtual environments had a significant side effect: it provided the opportunity to learn and assimilate digital competences (Hills, 2017). Thus, participation in a virtual environment has benefits that go beyond subject-matter learning objectives. Further, these benefits have become critical for both personal and professional advancement as communications and networking applications become increasingly important in both commercial and social realms. As Anderson and Rivera-Vargas (2020) argue, these benefits are seen across all age groups, suggesting generally positive outcomes.
This outcome highlights that students pursue their university education online because it allows them to make their studies compatible with the normal development of their professional and personal lives (Guri-Rosenblit & Gros, 2011; Pilkington, 2018). This capacity to increase access is well known in the literature and was widely endorsed by students of different ages in this study. We find that one of the major reasons why students opt for online education is the enhanced flexibility of the model compared with attendance at traditional face-to-face universities (Anderson & Rivera-Vargas, 2020). In relation to the mechanisms of learning evaluation, students prefer the continuous evaluation tests (CET) to large summative examinations because they suit their lifestyle, allowing for more flexible time-shifting. However, similar to what Kocdar et al. (2018) reported, students question being overworked in certain subjects, which leaves them with a lack of time for reflection and application of new knowledge.
The virtual campus
In general terms, and similar to the evidence that Sánchez-Gelabert et al. (2020) presented, students made positive evaluations of their experience with the technological platform and technical support, especially in those aspects that are essentially educational, including access to pedagogical materials and to the Continuous Evaluation Tests. However, contrary to what Fallon and Brown (2016) suggested, the lowest scores were obtained in the dimensions that evaluated the virtual contact tools between students and the university (at the pedagogical, institutional, and technical levels) and among the students themselves.
Student support
One of the most central aspects determining the success of the online educational experience is the quality of interactions with human staff (Palloff & Pratt, 2013; Pilkington, 2018; Anderson & Rivera-Vargas, 2020). In the case of the UOC, pedagogical and institutional accompaniment of the student body is meant to continue throughout their educational experience, whereas it is absent or provided on a course-by-course basis in most other open universities (Davis et al., 2018). In this way, this university gives responsibility to both tutors and consultants to provide student support (UOC, 2009). The UOC contends that this has been an authentic and sustainable model, intact since 2010 (UOC, 2020). However, the results show medium to high levels of dissatisfaction with these resources as assessed by the students. This is an important factor, considering the efforts made by this educational institution to promote students' interactive experience during their university studies. There is a strong pedagogical rationale documenting the value of this type of continuous support (Sánchez-Gelabert et al., 2020), but at the UOC the realization of this value is at best uncertain. This study reveals the need to first measure, and then design ways to make this support more effective, particularly when evaluating the performance of tutors and consultants.
Regarding the second question: “From the analysis of the results of this research, how can the educational experience of virtual students in online universities be improved?”
As we have seen, the educational model of the UOC works well in general and is evaluated positively by the student body. However, some aspects require adjustment, or at least revision. Bearing in mind the results of this study, some initiatives are proposed below that online universities could consider to improve the educational experience of their students.
Firstly, the assimilation of digital skills during the online education experience is an aspect recognized and well valued by the student body. These are instrumental skills, necessary to make adequate use of the virtual campus. However, at different moments of the research, students raised the need for their educational experience to further strengthen comprehensive and reflective processes. Given this, the main suggestion in this dimension is to go beyond instrumental learning and promote the assimilation of reflective and critical skills regarding the use of digital technologies. For example, it would be useful to build students' literacy about the potential of the use of data in education, incorporating into the educational process the possibility of working with data and generating new knowledge (both individually and in groups). Self-care strategies could also be strengthened, helping students learn to properly organize the time they spend connected to digital screens. To this end, it could be useful to propose learning activities outside the virtual campus, or directly in communities that are neither virtual nor online.
Secondly, although the evaluation proposal based on CET is one of the strengths of the UOC’s educational model, an important part of the student body claims the need to reduce the time dedicated to performing the multiple tasks it entails. Given this, a commitment to the project-based learning (PBL) model could address this demand of the student body for more time for reflection and for applying the new knowledge assimilated in a subject. There is already a set of PBL, DIY, and maker experiences implemented with substantive success in other online universities (Lasauskiene & Rauduvaite, 2015; Chu et al., 2017). In most cases, these are initiatives that, through projects, have connected the study plan of the subjects with real social problems of interest to the students. It is a strategy that, in addition to being flexible, makes it possible to strengthen the bond of the student body with their own learning trajectory (Miño-Puigcercós et al., 2019).
Thirdly, the results show the need to generate alternative strategies that favor the accompaniment of the student, taking into account their diverse learning needs and socio-cultural characteristics (Tait, 2014). Regarding support from the human actors (tutors and consultants), a significant number of students expressed dissatisfaction with the service levels. At the same time, provision of this service incurs both financial and opportunity costs, in addition to the time commitments required of both students and staff. Thus, the choice of type and level of support is critical. A first step in decision making is to accurately measure the current function of the service within the delivery system (Oregon et al., 2018).
A second step, given the current post-pandemic scenario, is to look at innovative models from other industries in which service is provided by machines (for example, driverless cars, chatbot customer service, self-service banking, and automated tax returns). This growing set of digital tools, used effectively by online universities, may provide pedagogical, institutional, and technical support to students (chatbots, web tools, student access to their own and comparative student data analytics), which, according to Salmon (2013), could favour the experience of the student body in virtual learning environments in addition to extending their expertise and exposure to digital technologies.
Finally, the study provides useful information for the UOC, but also for other virtual universities and for traditional face-to-face universities that are experiencing important changes given the accelerated virtualization caused by the Covid-19 pandemic (Dhawan, 2020). The results can also assist the wider online educational community by examining, in detail, an innovative model for online learning. Thus, each institution can reflect on its own level of support and service to students, and on the costs of providing it.
We will make the data available.
See more details here: https://www.uoc.edu/portal/en/metodologia-online-qualitat/fets-xifres/index.html.
See more details here: https://www.timeshighereducation.com/world-university-rankings/open-university-catalonia.
This role is relatively unique amongst online universities in that the term ‘tutor’ is often used elsewhere to refer to faculty who engage only with the student as they are enrolled in a particular course (similar to consultant role above), not as here, where they are engaged with the student throughout their enrolment at the university (Sánchez-Gelabert et al., 2020 ).
For more information on the UOC's educational model see https://www.uoc.edu/portal/en/metodologia-online-qualitat/model-educatiu/index.html .
See the final version of the questionnaire, before it was digitized and included in the UOC’s institutional platform, and sent to the student body: https://drive.google.com/file/d/1frnPPJsiTBULI92RQgx1jI4bcaAQB4GN/view?usp=sharing .
Almusharraf, N., & Khahro, S. (2020). Students satisfaction with online learning experiences during the COVID-19 pandemic. International Journal of Emerging Technologies in Learning (iJET), 15 (21), 246–267. https://doi.org/10.3991/ijet.v15i21.15647
Alqurashi, E. (2019). Predicting student satisfaction and perceived learning within online learning environments. Distance Education, 40 (1), 133–148. https://doi.org/10.1080/01587919.2018.1553562
Anderson, T. (2016). Theories for learning with emerging technologies. In G. Veletsianos (Ed.), Emergence and innovation in digital learning: Foundations and applications (pp. 35–50). Athabasca University Press.
Anderson, T., & Rivera-Vargas, P. (2020). A critical look at educational technology from a distance education perspective. Digital Education Review . https://doi.org/10.1344/der.2020.37.208-229
Bates, A. W. (2019). Teaching in a digital age . University of British Columbia.
Chu, R. J. C. (2010). How family support and Internet self-efficacy influence the effects of e-Learning among higher aged adults–analyses of gender and age differences. Computers & Education, 55 (1), 255–264. https://doi.org/10.1016/j.compedu.2010.01.011
Chu, S. K. W., Zhang, Y., Chen, K., Chan, C. K., Lee, C. W. Y., Zou, E., & Lau, W. (2017). The effectiveness of wikis for project-based learning in different disciplines in higher education. The Internet and Higher Education, 33 , 49–60. https://doi.org/10.1016/j.iheduc.2017.01.005
Cohen, L., Manion, L., & Morrison, K. (2007). Research methods in education . Routledge. https://doi.org/10.4324/9780203224342
Davis, D., Chen, G., Hauff, C., & Houben, G. J. (2018). Activating learning at scale: A review of innovations in online learning strategies. Computers & Education, 125 , 327–344. https://doi.org/10.1016/j.compedu.2018.05.019
Denzin, N., & Giardina, M. (2016). Qualitative inquiry through a critical lens . Routledge.
Denzin, N. (2001). The reflexive interview and a performative social science. Qualitative Research, 1 (1), 23–46. https://doi.org/10.1177/146879410100100102
Dhawan, S. (2020). Online learning: A panacea in the time of COVID-19 crisis. Journal of Educational Technology Systems, 49 (1), 5–22. https://doi.org/10.1177/0047239520934018
Fallon, C., & Brown, S. (2016). E-learning standards: A guide to purchasing, developing, and deploying standards-conformant e-Learning . CRC Press.
Farrell, O., & Brunton, J. (2020). A balancing act: A window into online student engagement experiences. International Journal of Educational Technology in Higher Education, 17 (25), 1–19. https://doi.org/10.1186/s41239-020-00199-x
Goetz, J.P. & LeCompte, M.D. (1988). Etnografía y diseño cualitativo en investigación educativa. Evaluación del diseño etnográfico [ Ethnography and qualitative design in educational research. Ethnographic design evaluation ]. Morata.
Grau-Valldosera, J., & Minguillón, J. (2014). Rethinking dropout in online higher education: The case of the Universitat Oberta de Catalunya. The International Review of Research in Open and Distributed Learning, 15 (1), 291–308. https://doi.org/10.19173/irrodl.v15i1.1628
Guri-Rosenblit, S., & Gros, B. (2011). e-Learning: Confusing terminology, research gaps and inherent challenges. Journal of Distance Education, 25 (1), 1–17.
Hills, H. (2017). Individual preferences in e-Learning . Routledge.
Holstein, J. A., & Gubrium, J. F. (2020). Interviewing as a form of narrative practice. In D. Silverman (Ed.). Qualitative research (pp. 69–85). Sage.
Íñiguez, L., & Antaki, C. (1994). Discourse analysis in social psychology. Boletín De Psicología, 44 , 57–75.
Jung, I. (2011). The dimensions of e-Learning quality: From the learner’s perspective. Educational Technology Research and Development, 59 (4), 445–464. https://doi.org/10.1007/s11423-010-9171-4
Kocdar, S., Karadeniz, A., Bozkurt, A., & Buyuk, K. (2018). Measuring self-regulation in self-paced open and distance learning environments. The International Review of Research in Open and Distributed Learning, 19 (1), 25–43. https://doi.org/10.19173/irrodl.v19i1.3255
Lasauskiene, J., & Rauduvaite, A. (2015). Project-based learning at university: Teaching experiences of lecturers. Procedia-Social and Behavioral Sciences, 197 , 788–792. https://doi.org/10.1016/j.sbspro.2015.07.182
Laux, D., Luse, A., & Mennecke, B. E. (2016). Collaboration, connectedness, and community: An examination of the factors influencing student persistence in virtual communities. Computers in Human Behavior, 57 , 452–464. https://doi.org/10.1016/j.chb.2015.12.046
Lee, K. (2019). Rewriting a history of open universities. The International Review of Research in Open and Distributed Learning, 20 (4), 21–35. https://doi.org/10.19173/irrodl.v20i3.4070
McKnight, K., O’Malley, K., Ruzic, R., Horsley, M. K., Franey, J. J., & Bassett, K. (2016). Teaching in a digital age: How educators use technology to improve student learning. Journal of Research on Technology in Education, 48 (3), 194–211. https://doi.org/10.1080/15391523.2016.1175856
Miño-Puigcercós, R., Domingo Coscollola, M., & Sancho Gil, J. M. (2019). Transforming the teaching and learning culture in higher education from a DIY perspective. Educación XX1, 22 (1), 139–160. https://doi.org/10.5944/educxx1.20057
Moore, M. (1989). Editorial: Three types of interaction. American Journal of Distance Education, 3 (2), 1–7. https://doi.org/10.1080/08923648909526659
Murphy, C. A., & Stewart, J. C. (2017). On-campus students taking online courses: Factors associated with unsuccessful course completion. The Internet and Higher Education, 34 , 1–9. https://doi.org/10.1016/j.iheduc.2017.03.001
Oregon, E., McCoy, L., & Carmon-Johnson, L. (2018). Case analysis: Exploring the application of using rich media technologies and social presence to decrease attrition in an online graduate program. Journal of Educators Online, 15 (2), 1–13. https://doi.org/10.9743/JEO.2018.15.2.7
Palloff, R., & Pratt, K. (2013). Lessons from the cyberspace classroom. The realities of online teaching (2nd ed.). Jossey-Bass.
Pilkington, C. (2018). A Playful approach to fostering motivation in a distance education computer programming course: Behaviour change and student perceptions. The International Review of Research in Open and Distributed Learning, 19 (3), 282–298. https://doi.org/10.19173/irrodl.v19i3.3664
Rivera-Vargas, P., & Cobo Romaní, C. (2019). La universidad en la sociedad digital: entre la herencia analógica y la socialización del conocimiento [The university in the digital society: between the analogic heritage and the socialization of knowledge]. Revista de Docencia Universitaria, 17 (1), 17–32. https://doi.org/10.4995/redu.2019.11276
Salmon, G. (2013). E-tivities: The key to active online learning . Routledge.
Sánchez-Gelabert, A., Valente, R., & Duart, J. M. (2020). Profiles of online students and the impact of their university experience. The International Review of Research in Open and Distributed Learning, 21 (3), 230–249. https://doi.org/10.19173/irrodl.v21i3.4784
Sangrà, A., Vlachopoulos, D., & Cabrera, N. (2012). Building an inclusive definition of e-Learning: An approach to the conceptual framework. The International Review of Research in Open and Distance Learning, 13 (2), 145–159. https://doi.org/10.19173/irrodl.v13i2.1161
Selwyn, N. (2019). What is digital sociology? Polity Press.
Stake, R. E. (1995). The art of case study research . Sage.
Tait, A. (2014). From place to virtual space: Reconfiguring student support for distance and e-learning in the digital age. Open Praxis, 6 (1), 5–16. https://doi.org/10.5944/openpraxis.6.1.102
Times Higher Education. (2021). World University Ranking. Open university of Catalonia ranking . Times Higher Education. Retrieved from: https://www.timeshighereducation.com/world-university-rankings/open-university-catalonia
UOC. (2009). The UOC educational model: Evolution and perspectives. Universitat Oberta de Catalunya . Retrieved from: http://www.uoc.edu/portal/_resources/ES/documents/innovacio/modelo_educativo.pdf
UOC. (2020). Report of the 2018–2019 academic year. We grow in research, we share knowledge . Universitat Oberta de Catalunya. Retrieved from: https://www.uoc.edu/portal/_resources/ES/documents/memories/1819/memoria-UOC-2018-2019_es.pdf
Vuopala, E., Hyvönen, P., & Järvelä, S. (2016). Interaction forms in successful collaborative learning in virtual learning environments. Active Learning in Higher Education, 17 (1), 25–38. https://doi.org/10.1177/1469787415616730
Wetherell, M., & Potter, J. (1998). Discourse analysis and identification of interpretive repertoires. In A. Gordo, & J. Linaza (Eds.), Psychology, discourse and power: qualitative methodologies, critical perspectives (pp. 63–78). Edvisor.
Zawacki-Richter, O., & Anderson, T. (Eds.). (2014). Online distance education: Towards a research agenda . Athabasca University Press.
National Agency for Research and Development (ANID—Chile).
Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature. This study was supported by National Agency for Research and Development (CONICYT) Grant No.72090564.
Authors and affiliations.
Department of Teaching and Learning and Educational Organization, Universidad de Barcelona, Passeig de la Vall d’Hebron, 171. Edifici de Llevant, 2nd floor, 08035, Barcelona, Spain
Pablo Rivera-Vargas & Cristina Alonso Cano
Facultad de Educación y Ciencias Sociales, Universidad Andrés Bello, Fernández Concha 700, Las Condes, Santiago, Región Metropolitana, Chile
Athabasca University, 10005 93 St. Edmonton, Athabasca, AB T5H1W6, Canada
Correspondence to Pablo Rivera-Vargas .
Conflict of interest.
The authors declare that they have no conflict of interest.
The authors obtained the informed consent of the thirty students interviewed.
Research involving human participants and/or animals
The research has not involved human or animal experiments. Individual interviews have been conducted with thirty students. The interviews have been transcribed, processed, and subsequently anonymized in the analysis and presentation of results phase.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .
About this article
Rivera-Vargas, P., Anderson, T. & Cano, C.A. Exploring students’ learning experience in online education: analysis and improvement proposals based on the case of a Spanish open learning university. Education Tech Research Dev 69 , 3367–3389 (2021). https://doi.org/10.1007/s11423-021-10045-0
Accepted : 15 August 2021
Published : 27 August 2021
Issue Date : December 2021
DOI : https://doi.org/10.1007/s11423-021-10045-0
- Online education
- Autonomous learning
- Digital divide
- Competences in digital technologies
- Find a journal
- Publish with us
- Research article
- Open access
- Published: 02 December 2020
Integrating students’ perspectives about online learning: a hierarchy of factors
- Montgomery Van Wart 1 ,
- Anna Ni 1 ,
- Pamela Medina 1 ,
- Jesus Canelon 1 ,
- Melika Kordrostami 1 ,
- Jing Zhang 1 &
International Journal of Educational Technology in Higher Education volume 17 , Article number: 53 ( 2020 ) Cite this article
This article reports on a large-scale ( n = 987), exploratory factor analysis study incorporating various concepts identified in the literature as critical success factors for online learning from the students’ perspective, and then determines their hierarchical significance. Seven factors--Basic Online Modality, Instructional Support, Teaching Presence, Cognitive Presence, Online Social Comfort, Online Interactive Modality, and Social Presence--were identified as significant and reliable. Regression analysis indicates the minimal factors for enrollment in future classes—when students consider convenience and scheduling—were Basic Online Modality, Cognitive Presence, and Online Social Comfort. Students who accepted or embraced online courses on their own merits wanted a minimum of Basic Online Modality, Teaching Presence, Cognitive Presence, Online Social Comfort, and Social Presence. Students, who preferred face-to-face classes and demanded a comparable experience, valued Online Interactive Modality and Instructional Support more highly. Recommendations for online course design, policy, and future research are provided.
While there are different perspectives of the learning process such as learning achievement and faculty perspectives, students’ perspectives are especially critical since they are ultimately the raison d’être of the educational endeavor (Chickering & Gamson, 1987 ). More pragmatically, students’ perspectives provide invaluable, first-hand insights into their experiences and expectations (Dawson et al., 2019 ). The student perspective is especially important when new teaching approaches are used and when new technologies are being introduced (Arthur, 2009 ; Crews & Butterfield, 2014 ; Van Wart, Ni, Ready, Shayo, & Court, 2020 ). With the renewed interest in “active” education in general (Arruabarrena, Sánchez, Blanco, et al., 2019 ; Kay, MacDonald, & DiGiuseppe, 2019 ; Nouri, 2016 ; Vlachopoulos & Makri, 2017 ) and the flipped classroom approach in particular (Flores, del-Arco, & Silva, 2016 ; Gong, Yang, & Cai, 2020 ; Lundin, et al., 2018 ; Maycock, 2019 ; McGivney-Burelle, 2013 ; O’Flaherty & Phillips, 2015 ; Tucker , 2012 ) along with extraordinary shifts in the technology, the student perspective on online education is profoundly important. What shapes students’ perceptions of quality integrate are their own sense of learning achievement, satisfaction with the support they receive, technical proficiency of the process, intellectual and emotional stimulation, comfort with the process, and sense of learning community. The factors that students perceive as quality online teaching, however, has not been as clear as it might be for at least two reasons.
First, it is important to note that the overall online learning experience for students is also composed of non-teaching factors which we briefly mention. Three such factors are (1) convenience, (2) learner characteristics and readiness, and (3) antecedent conditions that may foster teaching quality but are not directly responsible for it. (1) Convenience is an enormous non-quality factor for students (Artino, 2010 ) which has driven up online demand around the world (Fidalgo, Thormann, Kulyk, et al., 2020 ; Inside Higher Education and Gallup, 2019 ; Legon & Garrett, 2019 ; Ortagus, 2017 ). This is important since satisfaction with online classes is frequently somewhat lower than face-to-face classes (Macon, 2011 ). However, the literature generally supports the relative equivalence of face-to-face and online modes regarding learning achievement criteria (Bernard et al., 2004 ; Nguyen, 2015 ; Ni, 2013 ; Sitzmann, Kraiger, Stewart, & Wisher, 2006 ; see Xu & Jaggars, 2014 for an alternate perspective). These contrasts are exemplified in a recent study of business students, in which online students using a flipped classroom approach outperformed their face-to-face peers, but ironically rated instructor performance lower (Harjoto, 2017 ). (2) Learner characteristics also affect the experience related to self-regulation in an active learning model, comfort with technology, and age, among others,which affect both receptiveness and readiness of online instruction. (Alqurashi, 2016 ; Cohen & Baruth, 2017 ; Kintu, Zhu, & Kagambe, 2017 ; Kuo, Walker, Schroder, & Belland, 2013 ; Ventura & Moscoloni, 2015 ) (3) Finally, numerous antecedent factors may lead to improved instruction, but are not themselves directly perceived by students such as instructor training (Brinkley-Etzkorn, 2018 ), and the sources of faculty motivation (e.g., incentives, recognition, social influence, and voluntariness) (Wingo, Ivankova, & Moss, 2017 ). 
Important as these factors are, mixing them with the perceptions of quality tends to obfuscate the quality factors directly perceived by students.
Second, while student perceptions of quality are used in innumerable studies, our overall understanding still needs to integrate them more holistically. Many studies use student perceptions of the quality and overall effectiveness of individual tools and strategies in online contexts, such as mobile devices (Drew & Mann, 2018), small groups (Choi, Land, & Turgeon, 2005), journals (Nair, Tay, & Koh, 2013), simulations (Vlachopoulos & Makri, 2017), and video (Lange & Costley, 2020). Such studies, however, cannot provide the overall context and comparative importance. Some studies have examined the overall learning experience of students with exploratory lists, but have mixed non-quality factors with quality-of-teaching factors, making it difficult to discern the instructor’s versus contextual roles in quality (e.g., Asoodar, Vaezi, & Izanloo, 2016; Bollinger & Martindale, 2004; Farrell & Brunton, 2020; Hong, 2002; Song, Singleton, Hill, & Koh, 2004; Sun, Tsai, Finger, Chen, & Yeh, 2008). The application of technology adoption studies also falls into this category by essentially aggregating all teaching quality into the single category of performance (Al-Gahtani, 2016; Artino, 2010). Some studies have used high-level teaching-oriented models, primarily the Community of Inquiry model (le Roux & Nagel, 2018), but empirical support has been mixed (Arbaugh et al., 2008), and its elegance (i.e., relying on only three factors) has not provided much insight to practitioners (Anderson, 2016; Cleveland-Innes & Campbell, 2012).
Although the number of empirical studies of student perceptions of quality factors has increased, integration of the studies and the concepts they explore remains fragmented and confusing. It is important to establish, in a single comprehensive study, an empirical view of what students value, and also to know whether there is a hierarchy of factors, ranging from students who are least to most critical of the online learning experience. This research study has two research questions.
The first research question is: What are the significant factors in creating a high-quality online learning experience from students’ perspectives? This is important to know because it should have a significant effect on how instructors design online classes. The goal of this research question is to identify a more articulated and empirically supported set of factors capturing the full range of student expectations.
The second research question is: Is there a priority or hierarchy of factors related to students’ perceptions of online teaching quality that relates to their decisions to enroll in online classes? For example, is it possible to distinguish which factors are critical for enrollment decisions when students are primarily motivated by convenience and scheduling flexibility (a minimum threshold)? Do these factors differ from those of students with a genuine acceptance of the general quality of online courses (a moderate threshold)? And which factors are important to the students who are most critical of online course delivery (the highest threshold)?
This article next reviews the literature on online education quality, focusing on the student perspective and reviews eight factors derived from it. The research methods section discusses the study structure and methods. Demographic data related to the sample are next, followed by the results, discussion, and conclusion.
Online education is much discussed (Prinsloo, 2016 ; Van Wart et al., 2019 ; Zawacki-Richter & Naidu, 2016 ), but its perception is substantially influenced by where you stand and what you value (Otter et al., 2013 ; Tanner, Noser, & Totaro, 2009 ). Accrediting bodies care about meeting technical standards, proof of effectiveness, and consistency (Grandzol & Grandzol, 2006 ). Institutions care about reputation, rigor, student satisfaction, and institutional efficiency (Jung, 2011 ). Faculty care about subject coverage, student participation, faculty satisfaction, and faculty workload (Horvitz, Beach, Anderson, & Xia, 2015 ; Mansbach & Austin, 2018 ). For their part, students care about learning achievement (Marks, Sibley, & Arbaugh, 2005 ; O’Neill & Sai, 2014 ; Shen, Cho, Tsai, & Marra, 2013 ), but also view online education as a function of their enjoyment of classes, instructor capability and responsiveness, and comfort in the learning environment (e.g., Asoodar et al., 2016 ; Sebastianelli, Swift, & Tamimi, 2015 ). It is this last perspective, of students, upon which we focus.
It is important to note that students do not sign up for online classes solely based on perceived quality. Perceptions of quality derive both from notions of what online learning can deliver at its best, in terms of learning achievement and satisfaction/enjoyment, and from perceptions of how likely classes are to live up to those expectations. Students also sign up because of convenience and flexibility, and because of personal notions about the suitability of online learning. Convenience and flexibility are enormous drivers of online registration (Lee, Stringer, & Du, 2017; Mann & Henneberry, 2012). Even when students say they prefer face-to-face classes to online, many enroll in online classes, and re-enroll in the future if the experience meets minimum expectations. This study examines the threshold expectations of students when they are considering taking online classes.
When discussing students’ perceptions of quality, there is little clarity about the actual range of concepts because no integrated empirical studies exist comparing the major factors found throughout the literature. Rather, there are practitioner-generated lists of micro-competencies, such as those of the Quality Matters consortium for higher education (Quality Matters, 2018), or broad frameworks encompassing many aspects of quality beyond teaching (Open and Distant Learning Quality Council, 2012). While checklists are useful for practitioners and accreditation processes, they do not provide robust theoretical bases for scholarly development. Overarching frameworks are heuristically useful, but not for pragmatic purposes or theory building. The most prominent theoretical framework used in the online literature is the Community of Inquiry (CoI) model (Arbaugh et al., 2008; Garrison, Anderson, & Archer, 2003), which divides instruction into teaching, cognitive, and social presence. As with many deductive theories, however, the supporting evidence is mixed (Rourke & Kanuka, 2009), especially regarding the importance of social presence (Annand, 2011; Armellini & De Stefani, 2016). Conceptually, the problem is not so much with the narrow articulation of cognitive or social presence; cognitive presence is how the instructor provides opportunities for students to interact with material in robust, thought-provoking ways, and social presence refers to building a community of learning that incorporates student-to-student interactions. However, teaching presence includes everything else the instructor does: structuring the course, providing lectures, explaining assignments, creating rehearsal opportunities, supplying tests, grading, answering questions, and so on. These challenges become even more prominent in the online context.
While the lecture as a single medium is paramount in face-to-face classes, it fades as the primary vehicle in online classes with increased use of detailed syllabi, electronic announcements, recorded and synchronous lectures, 24/7 communications related to student questions, etc. Amassing the pedagogical and technological elements related to teaching under a single concept provides little insight.
In addition to the CoI model, numerous concepts are suggested in single-factor empirical studies focusing on quality from a student’s perspective, with overlapping conceptualizations and nonstandardized naming conventions. Seven distinct factors are derived here from the literature on student perceptions of online quality: Instructional Support, Teaching Presence, Basic Online Modality, Social Presence, Online Social Comfort, Cognitive Presence, and Interactive Online Modality.
- Instructional support

Instructional Support refers to students’ perceptions of the techniques used by the instructor for input, rehearsal, feedback, and evaluation. Specifically, this entails providing detailed instructions, designed use of multimedia, and a balance between repetitive class features for ease of use and techniques to prevent boredom. Instructional Support is often included as an element of Teaching Presence, but is also labeled “structure” (Lee & Rha, 2009; So & Brush, 2008) and instructor facilitation (Eom, Wen, & Ashill, 2006). A prime example of the difference between face-to-face and online education is the extensive use of the “flipped classroom” (Maycock, 2019; Wang, Huang, & Schunn, 2019), in which students move to rehearsal activities faster and more frequently than in traditional classrooms, with less instructor lecture (Jung, 2011; Martin, Wang, & Sadaf, 2018). Instructional Support has been consistently supported as an element of student perceptions of quality (Espasa & Meneses, 2010).
- Teaching presence
Teaching Presence refers to students’ perceptions of the quality of communication in lectures, directions, and individual feedback, including encouragement (Jaggars & Xu, 2016; Marks et al., 2005). Specifically, instructor communication is clear, focused, and encouraging, and instructor feedback is customized and timely. If Instructional Support is what an instructor plans before the course begins, then Teaching Presence is what the instructor does while the class is conducted and in response to specific circumstances. For example, a course could be well designed but poorly delivered because the instructor is distracted, or a course could be poorly designed but an instructor might make up for the deficit by spending time and energy on elaborate communications and ad hoc teaching techniques. Teaching Presence is especially important to student satisfaction (Sebastianelli et al., 2015; Young, 2006) and is also referred to as instructor presence (Asoodar et al., 2016), learner-instructor interaction (Marks et al., 2005), and staff support (Jung, 2011). As with Instructional Support, it has been consistently supported as an element of student perceptions of quality.
- Basic online modality
Basic Online Modality refers to the competent use of basic online class tools—online grading, navigation methods, online grade book, and the announcements function. It is frequently clumped with instructional quality (Artino, 2010 ), service quality (Mohammadi, 2015 ), instructor expertise in e-teaching (Paechter, Maier, & Macher, 2010 ), and similar terms. As a narrowly defined concept, it is sometimes called technology (Asoodar et al., 2016 ; Bollinger & Martindale, 2004 ; Sun et al., 2008 ). The only empirical study that did not find Basic Online Modality significant, as technology, was Sun et al. ( 2008 ). Because Basic Online Modality is addressed with basic instructor training, some studies assert the importance of training (e.g., Asoodar et al., 2016 ).
- Social presence

Social Presence refers to students’ perceptions of the quality of student-to-student interaction. Social Presence focuses on the quality of shared learning and collaboration among students, such as in threaded discussion responses (Garrison et al., 2003; Kehrwald, 2008). Much emphasized but also challenged in the CoI literature (Rourke & Kanuka, 2009), it has mixed support in the online literature. While some studies found Social Presence or related concepts to be significant (e.g., Asoodar et al., 2016; Bollinger & Martindale, 2004; Eom et al., 2006; Richardson, Maeda, Lv, & Caskurlu, 2017), others found Social Presence insignificant (Joo, Lim, & Kim, 2011; So & Brush, 2008; Sun et al., 2008).
- Online social comfort
Online Social Comfort refers to the instructor’s ability to provide an environment in which anxiety is low, and students feel comfortable interacting even when expressing opposing viewpoints. While numerous studies have examined anxiety (e.g., Liaw & Huang, 2013 ; Otter et al., 2013 ; Sun et al., 2008 ), only one found anxiety insignificant (Asoodar et al., 2016 ); many others have not examined the concept.
- Cognitive presence
Cognitive Presence refers to the engagement of students such that they perceive they are stimulated by the material and the instructor to reflect deeply and critically, and to seek to understand different perspectives (Garrison et al., 2003). The instructor provides instructional materials and facilitates an environment that piques interest, encourages reflection, and enhances inclusiveness of perspectives (Durabi, Arrastia, Nelson, Cornille, & Liang, 2011). Cognitive Presence includes enhancing the applicability of material to students’ potential or current careers. Cognitive Presence is supported as significant in many online studies (e.g., Artino, 2010; Asoodar et al., 2016; Joo et al., 2011; Marks et al., 2005; Sebastianelli et al., 2015; Sun et al., 2008). Further, while many instructors perceive that cognitive presence is diminished in online settings, neuroscientific studies indicate this need not be the case (Takamine, 2017). While numerous studies have not examined Cognitive Presence, this review found no studies that found it insignificant for students.
- Interactive online modality
Interactive Online Modality refers to “high-end” usage of online functionality; that is, the instructor makes good use of interactive online class tools such as video lectures, videoconferencing, and small group discussions. It is often included in concepts such as instructional quality (Artino, 2010; Asoodar et al., 2016; Mohammadi, 2015; Otter et al., 2013; Paechter et al., 2010) or engagement (Clayton, Blumberg, & Anthony, 2018). While individual methods have been investigated (e.g., Durabi et al., 2011), high-end engagement methods as a group have not.
Other independent variables affecting perceptions of quality include age, undergraduate versus graduate status, gender, ethnicity/race, discipline, educational motivation of students, and previous online experience. While age effects have been found to be small or insignificant, more notable effects have been reported at the level of study, with graduate students reporting higher “success” (Macon, 2011) and community college students having greater difficulty with online classes (Legon & Garrett, 2019; Xu & Jaggars, 2014). Ethnicity and race effects have also been small or insignificant. Some situational variations and student preferences can be captured by paying attention to disciplinary differences (Arbaugh, 2005; Macon, 2011). Motivation levels of students have been reported to be significant in completion and achievement, with better students doing equally well across face-to-face and online modes, and weaker students facing greater completion and achievement challenges (Clayton et al., 2018; Lu & Lemonde, 2013).
To examine the various quality factors, we apply a critical success factor methodology, initially introduced to schools of business research in the 1970s. In 1981, Rockhart and Bullen codified an approach embodying principles of critical success factors (CSFs) as a way to identify the information needs of executives, detailing steps for the collection and analysis of data to create a set of organizational CSFs (Rockhart & Bullen, 1981). CSFs describe the underlying or guiding principles which must be incorporated to ensure success.
Utilizing this methodology, CSFs in the context of this paper define key areas of instruction and design essential for an online class to be successful from a student’s perspective. Instructors implicitly know and consider these areas when setting up an online class and designing and directing activities and tasks important to achieving learning goals. CSFs make explicit those things good instructors may intuitively know and (should) do to enhance student learning. When made explicit, CSFs not only confirm the knowledge of successful instructors, but tap their intuition to guide and direct the accomplishment of quality instruction for entire programs. In addition, CSFs are linked with goals and objectives, helping generate a small number of truly important matters an instructor should focus attention on to achieve different thresholds of online success.
After a comprehensive literature review, an instrument was created to measure students’ perceptions of the importance of techniques and indicators leading to quality online classes. Items were designed to capture the major factors in the literature. The instrument was piloted during the 2017–18 academic year with a 397-student sample, facilitating an exploratory factor analysis that led to important preliminary findings (reference withheld for review). Based on the pilot, survey items were added and refined to include seven groups of quality teaching factors, two groups of items related to students’ overall acceptance of online classes, and a variable on their future online class enrollment. Demographic information was gathered to determine the effects of age, year in program, major, distance from the university, number of online classes taken, high school experience with online classes, and communication preferences on students’ levels of acceptance of online classes.
This paper draws evidence from a sample of students enrolled in educational programs at the Jack H. Brown College of Business and Public Administration (JHBC), California State University San Bernardino (CSUSB). The JHBC offers a wide range of online courses for undergraduate and graduate programs. To ensure comparable learning outcomes, online and face-to-face classes in a given subject are similar in size (undergraduate classes are generally capped at 60 and graduate classes at 30) and often taught by the same instructors. Students sometimes have the option to choose between face-to-face and online modes of learning.
A Qualtrics survey link was sent out by 11 instructors to students who were unlikely to be cross-enrolled in classes during the 2018–19 academic year. Approximately 2500 students were contacted, with some instructors providing class time to complete the anonymous survey. All students, whether they had taken an online class or not, were encouraged to respond. Nine hundred eighty-seven students responded, representing a 40% response rate. Although drawn from a single business school, this is a broad sample representing students from several disciplines (management, accounting and finance, marketing, information decision sciences, and public administration) as well as both graduate and undergraduate programs of study.
The sample skews young, with 78% of respondents under 30. The sample has almost no lower-division students (i.e., freshmen and sophomores), 73% upper-division students (i.e., juniors and seniors), and 24% graduate students (master’s level). Only 17% reported having taken a hybrid or online class in high school. There was a wide range of exposure to university-level online courses, with 47% reporting having taken 1 to 4 classes, and 21% reporting no online class experience. Reflecting a Hispanic-serving institution, 54% self-identified as Latino, 18% White, and 13% Asian and Pacific Islander. The five largest majors were accounting & finance (25%), management (21%), master of public administration (16%), marketing (12%), and information decision sciences (10%). Seventy-four percent work full- or part-time. See Table 1 for demographic data.
Measures and procedure
To increase the reliability of evaluation scores, composite evaluation variables were formed after an exploratory factor analysis of individual evaluation items. A principal component method with Quartimin (oblique) rotation was applied to explore the factor structure of student perceptions of online teaching CSFs. Items with importance coefficients (loadings) greater than .30 were retained, a commonly accepted threshold in factor analysis. A least-squares regression analysis was applied to test the significance of the factors’ effects on students’ impressions of online classes.
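The authors do not publish their analysis scripts, and the oblique Quartimin rotation is omitted here for brevity, but the extraction and retention steps described above can be sketched with unrotated principal-component loadings and the .30 cutoff. All data below are simulated and all names are hypothetical:

```python
import numpy as np

def principal_component_loadings(X, n_factors):
    """Unrotated principal-component loadings for an items matrix X
    (rows = respondents, columns = survey items)."""
    R = np.corrcoef(X, rowvar=False)        # item correlation matrix
    eigvals, eigvecs = np.linalg.eigh(R)    # eigh returns ascending order
    top = np.argsort(eigvals)[::-1][:n_factors]
    # A loading is an eigenvector scaled by the sqrt of its eigenvalue.
    return eigvecs[:, top] * np.sqrt(eigvals[top])

def retain_items(loadings, threshold=0.30):
    """Keep items whose largest absolute loading exceeds the cutoff,
    mirroring the paper's .30 retention rule."""
    return np.where(np.abs(loadings).max(axis=1) > threshold)[0]

# Simulated responses: two latent traits drive items 0-2 and items 3-5.
rng = np.random.default_rng(0)
traits = rng.normal(size=(500, 2))
X = np.column_stack(
    [traits[:, 0] + rng.normal(scale=0.5, size=500) for _ in range(3)]
    + [traits[:, 1] + rng.normal(scale=0.5, size=500) for _ in range(3)]
)
loadings = principal_component_loadings(X, n_factors=2)
kept = retain_items(loadings)
```

In practice an oblique rotation such as Quartimin would then be applied to these loadings to make the factors interpretable; third-party packages (e.g., `factor_analyzer`) provide such rotations.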
Exploratory factor constructs
Using a threshold loading of 0.3 for items, 37 items loaded on seven factors. All factors were logically consistent. The first factor, with eight items, was labeled Teaching Presence. Items included providing clear instructions, staying on task, clear deadlines, and customized feedback on strengths and weaknesses. The Teaching Presence items all related to instructor involvement during the course as a director, monitor, and learning facilitator. The second factor, with seven items, aligned with Cognitive Presence. Items included stimulating curiosity, opportunities for reflection, helping students construct explanations posed in online courses, and the applicability of material. The third factor, with six items, aligned with Social Presence, defined as providing student-to-student learning opportunities. Items included getting to know course participants for a sense of belonging, forming impressions of other students, and interacting with others. The fourth factor, with six new items as well as two (“interaction with other students” and “a sense of community in the class”) shared with the third factor, was Instructional Support, which related to the instructor’s role in providing students a cohesive learning experience. Items included providing sufficient rehearsal, structured feedback, techniques for communication, a navigation guide, a detailed syllabus, and coordinating student interaction and creating a sense of online community. This factor also included enthusiasm, which students generally interpreted as evidence of a robustly designed course rather than animation in a traditional lecture. The fifth factor, labeled Basic Online Modality, focused on the basic technological requirements for a functional online course. Three of its items were allowing students to make online submissions, use of online gradebooks, and online grading.
A fourth item was the use of online quizzes, viewed by students as mechanical practice opportunities rather than small tests, and a fifth was navigation, a key component of Basic Online Modality. The sixth factor, which loaded on four items, was labeled Online Social Comfort. Items here included comfort discussing ideas online, comfort disagreeing, developing a sense of collaboration via discussion, and considering online communication an excellent medium for social interaction. The final factor was called Interactive Online Modality because it included items for “richer” communications or interactions, whether one- or two-way. Items included videoconferencing, instructor-generated videos, and small group discussions. Taken together, these seven factors explained 67% of the variance, which is within the acceptable range in social science research for a robust model (Hair, Black, Babin, & Anderson, 2014). See Table 2 for the full list.
To test factor reliability, Cronbach’s alpha was calculated for each variable. All produced values greater than 0.7, the standard threshold for reliability, except system trust, which was therefore dropped. To gauge students’ sense of factor importance, item scores were averaged within each factor. Factor means (lower means indicating higher importance to students) ranged from 1.5 to 2.6 on a 5-point scale. Basic Online Modality was most important, followed by Instructional Support and Teaching Presence. Students deemed Cognitive Presence, Online Social Comfort, and Interactive Online Modality less important. The least important for this sample was Social Presence. Table 3 arrays the critical success factor means, standard deviations, and Cronbach’s alphas.
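The reliability check is straightforward to reproduce. A minimal sketch of Cronbach’s alpha via the standard variance-decomposition formula, using simulated data rather than the study’s:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Four items driven by one trait cohere (alpha near 1);
# four independent noise items do not (alpha near 0).
rng = np.random.default_rng(1)
trait = rng.normal(size=300)
coherent = np.column_stack(
    [trait + rng.normal(scale=0.3, size=300) for _ in range(4)])
noise = rng.normal(size=(300, 4))
```

A factor whose alpha falls below the 0.7 convention, like the “system trust” items here, would be dropped from further analysis.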
To determine whether particular subgroups of respondents viewed factors differently, a series of ANOVAs were conducted using factor means as dependent variables. Six demographic variables were used as independent variables: graduate vs. undergraduate, age, work status, ethnicity, discipline, and past online experience. To determine strength of association of the independent variables to each of the seven CSFs, eta squared was calculated for each ANOVA. Eta squared indicates the proportion of variance in the dependent variable explained by the independent variable. Eta squared values greater than .01, .06, and .14 are conventionally interpreted as small, medium, and large effect sizes, respectively (Green & Salkind, 2003 ). Table 4 summarizes the eta squared values for the ANOVA tests with Eta squared values less than .01 omitted.
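Eta squared for a one-way ANOVA is simply the between-group share of total variance. A small sketch with the conventional cutoffs cited above; the group scores are hypothetical, not the study’s data:

```python
import numpy as np

def eta_squared(groups):
    """One-way ANOVA effect size: SS_between / SS_total."""
    pooled = np.concatenate(groups)
    grand_mean = pooled.mean()
    ss_total = ((pooled - grand_mean) ** 2).sum()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    return ss_between / ss_total

def effect_size_label(eta2):
    """Conventional small/medium/large labels (Green & Salkind, 2003)."""
    if eta2 >= 0.14:
        return "large"
    if eta2 >= 0.06:
        return "medium"
    if eta2 >= 0.01:
        return "small"
    return "negligible"

# Hypothetical factor-importance means for two demographic groups
# (lower scores = more important, as in the survey's scale).
grad = np.array([1.4, 1.6, 1.5, 1.7, 1.5])
undergrad = np.array([1.9, 2.1, 2.0, 2.2, 1.8])
eta2 = eta_squared([grad, undergrad])
```

Note that eta squared is descriptive; the significance of each ANOVA is tested separately with the F statistic.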
While there are no significant differences in factor means among students in different disciplines in the College, all five other independent variables have some small effect on some or all CSFs. Graduate students tend to rate Interactive Online Modality, Instructional Support, Teaching Presence, and Cognitive Presence higher than undergraduates do. Older students place more value on Interactive Online Modality. Full-time working students rate all factors except Online Social Comfort slightly higher than part-timers and non-working students do. Latino and White students rate Basic Online Modality and Instructional Support higher; Asian and Pacific Islander students rate Social Presence higher. Students who have taken more online classes rate all factors higher.
In addition to the factor scores, two variables were constructed to identify the resultant impressions, labeled online experience. Both were logically consistent, with Cronbach’s α greater than 0.75. The first variable, with six items, labeled “online acceptance,” included items such as “I enjoy online learning,” “My overall impression of hybrid/online learning is very good,” and “the instructors of online/hybrid classes are generally responsive.” The second variable, labeled “face-to-face preference,” combined four items, including enjoying, learning, and communicating more in face-to-face classes, as well as perceiving greater fairness and equity there. In addition to these two constructed variables, a one-item variable, “online enrollment,” was also used subsequently in the regression analysis. That question asked: if hybrid/online classes are well taught and available, how much of your course selection going forward would be online?
As noted above, two constructed variables and one item were used as dependent variables for purposes of regression analysis. They were online acceptance, F2F preference, and the selection of online classes. In addition to seven quality-of-teaching factors identified by factor analysis, control variables included level of education (graduate versus undergraduate), age, ethnicity, work status, distance to university, and number of online/hybrid classes taken in the past. See Table 5 .
When eta squared values were examined for the control variables, only one approached a medium effect: graduate versus undergraduate status had a .05 effect on Interactive Online Modality, meaning graduate students were more sensitive to interactive modality than undergraduates. Multiple regression analyses of critical success factors and online impressions were conducted to compare the conditions under which factors were significant. The only consistently significant control factor was the number of online classes taken: the more classes students had taken online, the more inclined they were to take future ones. Level of program, age, ethnicity, and working status did not significantly affect students’ choice or overall acceptance of online classes.
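The regression step can likewise be sketched. Below is ordinary least-squares with an intercept, recovering a known coefficient from simulated factor scores; the variable names and the 0.5 coefficient are illustrative only and do not come from the study:

```python
import numpy as np

def ols(y, X):
    """Ordinary least-squares coefficients; the first returned value is
    the intercept, the rest align with the columns of X."""
    X1 = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta

# Simulated: "online acceptance" rises 0.5 points per unit of a quality
# factor score, plus noise.
rng = np.random.default_rng(2)
factor_score = rng.normal(size=1000)
acceptance = 2.0 + 0.5 * factor_score + rng.normal(scale=0.1, size=1000)
beta = ols(acceptance, factor_score.reshape(-1, 1))
```

In the study itself, the design matrix would also include the seven factor scores together with dummy-coded controls (program level, age, ethnicity, work status, distance, and number of prior online classes).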
The least restrictive condition was online enrollment (Table 6 ). That is, students might not feel online courses were ideal, but because of convenience and scheduling might enroll in them if minimum threshold expectations were met. When considering online enrollment three factors were significant and positive (at the 0.1 level): Basic Online Modality, Cognitive Presence, and Online Social Comfort. These least-demanding students expected classes to have basic technological functionality, provide good opportunities for knowledge acquisition, and provide comfortable interaction in small groups. Students who demand good Instructional Support (e.g., rehearsal opportunities, standardized feedback, clear syllabus) are less likely to enroll.
Online acceptance was more restrictive (see Table 7 ). This variable captured the idea that students not only enrolled in online classes out of necessity, but with an appreciation of the positive attributes of online instruction, which balanced the negative aspects. When this standard was applied, students expected not only Basic Online Modality, Cognitive Presence, and Online Social Comfort, but expected their instructors to be highly engaged virtually as the course progressed (Teaching Presence), and to create strong student-to-student dynamics (Social Presence). Students who rated Instructional Support higher are less accepting of online classes.
Another restrictive condition was catering to the needs of students who preferred face-to-face classes (see Table 8). That is, they preferred face-to-face classes even when online classes were well taught. Unlike students more accepting of, or more likely to enroll in, online classes, this group rates Instructional Support as critical to enrolling, rather than merely a negative factor when absent. Again different from the other two groups, these students demand appropriate interactive mechanisms (Interactive Online Modality) to enable richer communication (e.g., videoconferencing). Student-to-student collaboration (Social Presence) was also significant. This group also rated Cognitive Presence and Online Social Comfort as significant, but only in their absence. That is, these students were most attached to direct interaction with the instructor and other students rather than to specific teaching methods. Interestingly, Basic Online Modality and Teaching Presence were not significant. Our interpretation is that this student group, the most critical of online classes because of the loss of physical interaction, is beyond being concerned with mechanical technical interaction and demands higher levels of interactivity and instructional sophistication.
Discussion and study limitations
Some past studies have used robust empirical methods to identify a single factor, or a small number of factors, related to quality from a student’s perspective, but have not sought to be relatively comprehensive. Others have used a longer series of itemized factors, but have used less robust methods and have not tied those factors back to the literature. This study has used the literature to develop a relatively comprehensive list of items focused on quality teaching in a single rigorous protocol. That is, while a beta test had identified five coherent factors, the current survey made substantial changes that sharpened the focus on quality factors rather than antecedent factors, and better articulated the array of factors often lumped under the mantle of “teaching presence.” In addition, the study has examined these factors based on threshold expectations: from minimal, such as when flexibility is the driving consideration; to modest, such as when students want a “good” online class; to high, when students demand an interactive virtual experience equivalent to face-to-face.
Exploratory factor analysis identified seven factors that were reliable, coherent, and significant under different conditions. In order of students’ overall sense of importance, they are: Basic Online Modality, Instructional Support, Teaching Presence, Cognitive Presence, Online Social Comfort, Interactive Online Modality, and Social Presence. Students are most concerned with the basics of a course first, that is, technological and instructor competence. Next they want engagement and virtual comfort. Social Presence, while valued, is the least critical from this overall perspective.
The factor analysis is quite consistent with the range of factors identified in the literature, pointing to the fact that students can differentiate among aspects of what have been lumped together as larger concepts, such as teaching presence. Essentially, the instructor's role in quality can be divided into her/his command of basic online functionality, good design, and good presence during the class. The instructor's command of basic functionality is paramount. Because so much of an online class must be built in advance, the quality of the class design is rated more highly than the instructor's role in facilitating the class. Taken as a whole, the instructor's role in traditional teaching elements is primary, as we would expect it to be. Cognitive presence, especially the pertinence of the instructional material and its applicability to student interests, has been found significant whenever studied, and was highly rated here as a single factor as well. Finally, the degree to which students feel comfortable with the online environment and enjoy the learner-to-learner aspect has been less supported in empirical studies; it was found significant here, but was rated lowest among the quality factors.
Regression analysis paints a more nuanced picture, one that depends on student focus. It also helps explain some of the heterogeneity of previous studies, depending on what their dependent variables were. If convenience and scheduling are critical and students are less demanding, the minimum requirements are Basic Online Modality, Cognitive Presence, and Online Social Comfort. That is, students expect an instructor who knows how to use an online platform, who delivers useful information, and who provides a comfortable learning environment. That said, they do not expect poor design either. Nor do they expect much in terms of quality teaching presence, learner-to-learner interaction, or interactive teaching.
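As a sketch of how such a regression might look in practice, the snippet below fits an ordinary least squares model predicting willingness to enroll from three factor scores. All variable names, coefficients, and data are invented for illustration; the study's actual models and estimates are not reproduced here.

```python
import numpy as np

# Hypothetical OLS sketch: predict willingness to enroll in a noncritical
# online class from three factor scores. Everything here is synthetic.
rng = np.random.default_rng(0)
n = 400
basic = rng.normal(size=n)      # Basic Online Modality score (invented)
cognitive = rng.normal(size=n)  # Cognitive Presence score (invented)
comfort = rng.normal(size=n)    # Online Social Comfort score (invented)

# Simulated outcome: all three factors contribute positively.
y = 0.5 * basic + 0.3 * cognitive + 0.2 * comfort + 0.1 * rng.normal(size=n)

# Design matrix with an intercept column; solve by least squares.
X = np.column_stack([np.ones(n), basic, cognitive, comfort])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.round(beta[1:], 2))  # slope estimates near the true (0.5, 0.3, 0.2)
```

Running separate models of this form per enrollment condition (noncritical, critical/"good" class, F2F-preferring), as the study describes, is what yields the different significant-factor sets discussed below.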
When students are signing up for critical classes, or when they have both F2F and online options, they hold a higher standard. That is, they not only expect the factors required for enrolling in noncritical classes, but also good Teaching Presence and Social Presence. Students who simply need a class may be willing to teach themselves a bit more, but students who want a good class expect a highly present instructor in terms of responsiveness and immediacy. "Good" classes must not only create a comfortable atmosphere but, in social science classes at least, must also provide strong learner-to-learner interactions. At the time of the research, most students believed that a class can be good without high interactivity via pre-recorded video and videoconferencing. That may, or may not, change over time as the various video media become easier to use, more reliable, and more commonplace.
The most demanding students are those who prefer F2F classes because of learning style preferences, poor past experiences, or both. Such students seem to assume that a worthwhile online class has basic functionality and that the instructor provides a strong presence. They are also critical of the absence of Cognitive Presence and Online Social Comfort. They want strong Instructional Support and Social Presence. But in addition, and uniquely, they expect an Online Interactive Modality that provides the greatest possible verisimilitude to the traditional classroom. More than the other two groups, these students crave human interaction in the learning process, both with the instructor and with other students.
These findings shed light on the possible ramifications of the COVID-19 aftermath. Many universities around the world jumped from relatively low levels of online instruction at the beginning of spring 2020 to nearly 100% by mandate by the end of the spring term. The question becomes: what will happen after the mandate is removed? Will demand return to pre-crisis levels, will it increase modestly, or will it skyrocket? Time will be the best judge, but the findings here suggest that the ability and interest of instructors and institutions to "rise to the occasion" with quality teaching will have as much effect on demand as students becoming more acclimated to online learning. If, in the rush to get classes online, many students experience shoddy basic functional competence, poor instructional design, sporadic teaching presence, and poorly implemented cognitive and social elements, they may be quite willing to return to the traditional classroom. If faculty, and the institutions supporting them, are able to increase the quality of classes despite time pressures, then most students may be interested in more hybrid and fully online classes. If instructors are able to introduce high-quality interactive teaching, nearly the entire student population will be interested in more online classes. Of course students will have a variety of experiences, but this analysis suggests that those instructors, departments, and institutions that put greater effort into the temporary adjustment (and resist it less) will be substantially more likely to see increases in demand beyond the modest national trajectory of the last decade or so.
There are several study limitations. First, the study does not include a sample of non-respondents, who may have a somewhat different profile. Second, the study draws from a single college at a single university; the profile derived here may vary significantly by type of student. Third, some survey statements may have led respondents to rate quality based on their experience rather than to assess the general importance of online course elements; for example, "I felt comfortable participating in the course discussions" could be revised to "comfort in participating in course discussions." The authors judged differences among subgroups (e.g., among majors) to be small and statistically insignificant. However, it is possible that differences between, say, biology and marketing students would be significant, leading the factors to be ordered differently. Emphasis and ordering might also vary at a community college versus a research-oriented university (Gonzalez, 2009).
Availability of data and materials
We will make the data available.
References

Al-Gahtani, S. S. (2016). Empirical investigation of e-learning acceptance and assimilation: A structural equation model. Applied Computing and Informatics, 12, 27–50.
Alqurashi, E. (2016). Self-efficacy in online learning environments: A literature review. Contemporary Issues in Education Research (CIER), 9(1), 45–52.
Anderson, T. (2016). A fourth presence for the Community of Inquiry model? Retrieved from https://virtualcanuck.ca/2016/01/04/a-fourth-presence-for-the-community-of-inquiry-model/ .
Annand, D. (2011). Social presence within the community of inquiry framework. The International Review of Research in Open and Distributed Learning , 12 (5), 40.
Arbaugh, J. B. (2005). How much does “subject matter” matter? A study of disciplinary effects in on-line MBA courses. Academy of Management Learning & Education , 4 (1), 57–73.
Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P., Richardson, J. C., & Swan, K. P. (2008). Developing a community of inquiry instrument: Testing a measure of the Community of Inquiry framework using a multi-institutional sample. Internet and Higher Education , 11 , 133–136.
Armellini, A., & De Stefani, M. (2016). Social presence in the 21st century: An adjustment to the Community of Inquiry framework. British Journal of Educational Technology , 47 (6), 1202–1216.
Arruabarrena, R., Sánchez, A., Blanco, J. M., et al. (2019). Integration of good practices of active methodologies with the reuse of student-generated content. International Journal of Educational Technology in Higher Education , 16 , #10.
Arthur, L. (2009). From performativity to professionalism: Lecturers’ responses to student feedback. Teaching in Higher Education , 14 (4), 441–454.
Artino, A. R. (2010). Online or face-to-face learning? Exploring the personal factors that predict students’ choice of instructional format. Internet and Higher Education , 13 , 272–276.
Asoodar, M., Vaezi, S., & Izanloo, B. (2016). Framework to improve e-learner satisfaction and further strengthen e-learning implementation. Computers in Human Behavior , 63 , 704–716.
Bernard, R. M., et al. (2004). How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research , 74 (3), 379–439.
Bollinger, D., & Martindale, T. (2004). Key factors for determining student satisfaction in online courses. International Journal on E-Learning, 3(1), 61–67.
Brinkley-Etzkorn, K. E. (2018). Learning to teach online: Measuring the influence of faculty development training on teaching effectiveness through a TPACK lens. The Internet and Higher Education , 38 , 28–35.
Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin , 3 , 7.
Choi, I., Land, S. M., & Turgeon, A. J. (2005). Scaffolding peer-questioning strategies to facilitate metacognition during online small group discussion. Instructional Science , 33 , 483–511.
Clayton, K. E., Blumberg, F. C., & Anthony, J. A. (2018). Linkages between course status, perceived course value, and students’ preferences for traditional versus non-traditional learning environments. Computers & Education , 125 , 175–181.
Cleveland-Innes, M., & Campbell, P. (2012). Emotional presence, learning, and the online learning environment. The International Review of Research in Open and Distributed Learning , 13 (4), 269–292.
Cohen, A., & Baruth, O. (2017). Personality, learning, and satisfaction in fully online academic courses. Computers in Human Behavior , 72 , 1–12.
Crews, T., & Butterfield, J. (2014). Data for flipped classroom design: Using student feedback to identify the best components from online and face-to-face classes. Higher Education Studies , 4 (3), 38–47.
Dawson, P., Henderson, M., Mahoney, P., Phillips, M., Ryan, T., Boud, D., & Molloy, E. (2019). What makes for effective feedback: Staff and student perspectives. Assessment & Evaluation in Higher Education , 44 (1), 25–36.
Drew, C., & Mann, A. (2018). Unfitting, uncomfortable, unacademic: A sociological reading of an interactive mobile phone app in university lectures. International Journal of Educational Technology in Higher Education , 15 , #43.
Durabi, A., Arrastia, M., Nelson, D., Cornille, T., & Liang, X. (2011). Cognitive presence in asynchronous online learning: A comparison of four discussion strategies. Journal of Computer Assisted Learning , 27 (3), 216–227.
Eom, S. B., Wen, H. J., & Ashill, N. (2006). The determinants of students’ perceived learning outcomes and satisfaction in university online education: An empirical investigation. Decision Sciences Journal of Innovative Education , 4 (2), 215–235.
Espasa, A., & Meneses, J. (2010). Analysing feedback processes in an online teaching and learning environment: An exploratory study. Higher Education , 59 (3), 277–292.
Farrell, O., & Brunton, J. (2020). A balancing act: A window into online student engagement experiences. International Journal of Educational Technology in Higher Education, 17, #25.
Fidalgo, P., Thormann, J., Kulyk, O., et al. (2020). Students’ perceptions on distance education: A multinational study. International Journal of Educational Technology in Higher Education, 17, #18.
Flores, Ò., del-Arco, I., & Silva, P. (2016). The flipped classroom model at the university: Analysis based on professors’ and students’ assessment in the educational field. International Journal of Educational Technology in Higher Education , 13 , #21.
Garrison, D. R., Anderson, T., & Archer, W. (2003). A theory of critical inquiry in online distance education. Handbook of Distance Education , 1 , 113–127.
Gong, D., Yang, H. H., & Cai, J. (2020). Exploring the key influencing factors on college students’ computational thinking skills through flipped-classroom instruction. International Journal of Educational Technology in Higher Education , 17 , #19.
Gonzalez, C. (2009). Conceptions of, and approaches to, teaching online: A study of lecturers teaching postgraduate distance courses. Higher Education , 57 (3), 299–314.
Grandzol, J. R., & Grandzol, C. J. (2006). Best practices for online business Education. International Review of Research in Open and Distance Learning , 7 (1), 1–18.
Green, S. B., & Salkind, N. J. (2003). Using SPSS: Analyzing and understanding data (3rd ed.). Upper Saddle River: Prentice Hall.
Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2014). Multivariate data analysis: Pearson new international edition . Essex: Pearson Education Limited.
Harjoto, M. A. (2017). Blended versus face-to-face: Evidence from a graduate corporate finance class. Journal of Education for Business , 92 (3), 129–137.
Hong, K.-S. (2002). Relationships between students’ instructional variables with satisfaction and learning from a web-based course. The Internet and Higher Education , 5 , 267–281.
Horvitz, B. S., Beach, A. L., Anderson, M. L., & Xia, J. (2015). Examination of faculty self-efficacy related to online teaching. Innovation Higher Education , 40 , 305–316.
Inside Higher Education and Gallup. (2019). The 2019 survey of faculty attitudes on technology. Author .
Jaggars, S. S., & Xu, D. (2016). How do online course design features influence student performance? Computers and Education , 95 , 270–284.
Joo, Y. J., Lim, K. Y., & Kim, E. K. (2011). Online university students’ satisfaction and persistence: Examining perceived level of presence, usefulness and ease of use as predictor in a structural model. Computers & Education , 57 (2), 1654–1664.
Jung, I. (2011). The dimensions of e-learning quality: From the learner’s perspective. Educational Technology Research and Development , 59 (4), 445–464.
Kay, R., MacDonald, T., & DiGiuseppe, M. (2019). A comparison of lecture-based, active, and flipped classroom teaching approaches in higher education. Journal of Computing in Higher Education , 31 , 449–471.
Kehrwald, B. (2008). Understanding social presence in text-based online learning environments. Distance Education , 29 (1), 89–106.
Kintu, M. J., Zhu, C., & Kagambe, E. (2017). Blended learning effectiveness: The relationship between student characteristics, design features and outcomes. International Journal of Educational Technology in Higher Education , 14 , #7.
Kuo, Y.-C., Walker, A. E., Schroder, K. E., & Belland, B. R. (2013). Interaction, internet self-efficacy, and self-regulated learning as predictors of student satisfaction in online education courses. Internet and Higher Education, 20, 35–50.
Lange, C., & Costley, J. (2020). Improving online video lectures: Learning challenges created by media. International Journal of Educational Technology in Higher Education , 17 , #16.
le Roux, I., & Nagel, L. (2018). Seeking the best blend for deep learning in a flipped classroom – Viewing student perceptions through the Community of Inquiry lens. International Journal of Educational Technology in Higher Education, 15, #16.
Lee, H.-J., & Rha, I. (2009). Influence of structure and interaction on student achievement and satisfaction in web-based distance learning. Educational Technology & Society , 12 (4), 372–382.
Lee, Y., Stringer, D., & Du, J. (2017). What determines students’ preference of online to F2F class? Business Education Innovation Journal , 9 (2), 97–102.
Legon, R., & Garrett, R. (2019). CHLOE 3: Behind the numbers . Published online by Quality Matters and Eduventures. https://www.qualitymatters.org/sites/default/files/research-docs-pdfs/CHLOE-3-Report-2019-Behind-the-Numbers.pdf
Liaw, S.-S., & Huang, H.-M. (2013). Perceived satisfaction, perceived usefulness and interactive learning environments as predictors of self-regulation in e-learning environments. Computers & Education , 60 (1), 14–24.
Lu, F., & Lemonde, M. (2013). A comparison of online versus face-to-face teaching delivery in statistics instruction for undergraduate health science students. Advances in Health Science Education, 18, 963–973.
Lundin, M., Bergviken Rensfeldt, A., Hillman, T., Lantz-Andersson, A., & Peterson, L. (2018). Higher education dominance and siloed knowledge: a systematic review of flipped classroom research. International Journal of Educational Technology in Higher Education , 15 (1).
Macon, D. K. (2011). Student satisfaction with online courses versus traditional courses: A meta-analysis. Dissertation, Northcentral University, CA.
Mann, J., & Henneberry, S. (2012). What characteristics of college students influence their decisions to select online courses? Online Journal of Distance Learning Administration , 15 (5), 1–14.
Mansbach, J., & Austin, A. E. (2018). Nuanced perspectives about online teaching: Mid-career senior faculty voices reflecting on academic work in the digital age. Innovative Higher Education , 43 (4), 257–272.
Marks, R. B., Sibley, S. D., & Arbaugh, J. B. (2005). A structural equation model of predictors for effective online learning. Journal of Management Education , 29 (4), 531–563.
Martin, F., Wang, C., & Sadaf, A. (2018). Student perception of facilitation strategies that enhance instructor presence, connectedness, engagement and learning in online courses. Internet and Higher Education , 37 , 52–65.
Maycock, K. W. (2019). Chalk and talk versus flipped learning: A case study. Journal of Computer Assisted Learning , 35 , 121–126.
McGivney-Burelle, J. (2013). Flipping calculus. PRIMUS: Problems, Resources, and Issues in Mathematics Undergraduate Studies, 23(5), 477–486.
Mohammadi, H. (2015). Investigating users’ perspectives on e-learning: An integration of TAM and IS success model. Computers in Human Behavior , 45 , 359–374.
Nair, S. S., Tay, L. Y., & Koh, J. H. L. (2013). Students’ motivation and teachers’ teaching practices towards the use of blogs for writing of online journals. Educational Media International , 50 (2), 108–119.
Nguyen, T. (2015). The effectiveness of online learning: Beyond no significant difference and future horizons. MERLOT Journal of Online Learning and Teaching , 11 (2), 309–319.
Ni, A. Y. (2013). Comparing the effectiveness of classroom and online learning: Teaching research methods. Journal of Public Affairs Education , 19 (2), 199–215.
Nouri, J. (2016). The flipped classroom: For active, effective and increased learning – Especially for low achievers. International Journal of Educational Technology in Higher Education , 13 , #33.
O’Neill, D. K., & Sai, T. H. (2014). Why not? Examining college students’ reasons for avoiding an online course. Higher Education , 68 (1), 1–14.
O'Flaherty, J., & Phillips, C. (2015). The use of flipped classrooms in higher education: A scoping review. The Internet and Higher Education , 25 , 85–95.
Open & Distant Learning Quality Council (2012). ODLQC standards . England: Author https://www.odlqc.org.uk/odlqc-standards .
Ortagus, J. C. (2017). From the periphery to prominence: An examination of the changing profile of online students in American higher education. Internet and Higher Education , 32 , 47–57.
Otter, R. R., Seipel, S., Graef, T., Alexander, B., Boraiko, C., Gray, J., … Sadler, K. (2013). Comparing student and faculty perceptions of online and traditional courses. Internet and Higher Education , 19 , 27–35.
Paechter, M., Maier, B., & Macher, D. (2010). Online or face-to-face? Students’ experiences and preferences in e-learning. Internet and Higher Education , 13 , 292–329.
Prinsloo, P. (2016). (re)considering distance education: Exploring its relevance, sustainability and value contribution. Distance Education , 37 (2), 139–145.
Quality Matters (2018). Specific review standards from the QM higher education rubric (6th ed.). MD: MarylandOnline.
Richardson, J. C., Maeda, Y., Lv, J., & Caskurlu, S. (2017). Social presence in relation to students’ satisfaction and learning in the online environment: A meta-analysis. Computers in Human Behavior , 71 , 402–417.
Rockhart, J. F., & Bullen, C. V. (1981). A primer on critical success factors . Cambridge: Center for Information Systems Research, Massachusetts Institute of Technology.
Rourke, L., & Kanuka, H. (2009). Learning in Communities of Inquiry: A review of the literature. The Journal of Distance Education / Revue de l'éducation à distance, 23(1), 19–48. Athabasca University Press. Retrieved August 2, 2020 from https://www.learntechlib.org/p/105542/ .
Sebastianelli, R., Swift, C., & Tamimi, N. (2015). Factors affecting perceived learning, satisfaction, and quality in the online MBA: A structural equation modeling approach. Journal of Education for Business , 90 (6), 296–305.
Shen, D., Cho, M.-H., Tsai, C.-L., & Marra, R. (2013). Unpacking online learning experiences: Online learning self-efficacy and learning satisfaction. Internet and Higher Education , 19 , 10–17.
Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effectiveness of web-based and classroom instruction: A meta-analysis. Personnel Psychology , 59 (3), 623–664.
So, H. J., & Brush, T. A. (2008). Student perceptions of collaborative learning, social presence and satisfaction in a blended learning environment: Relationships and critical factors. Computers & Education , 51 (1), 318–336.
Song, L., Singleton, E. S., Hill, J. R., & Koh, M. H. (2004). Improving online learning: Student perceptions of useful and challenging characteristics. The Internet and Higher Education , 7 (1), 59–70.
Sun, P. C., Tsai, R. J., Finger, G., Chen, Y. Y., & Yeh, D. (2008). What drives a successful e-learning? An empirical investigation of the critical factors influencing learner satisfaction. Computers & Education , 50 (4), 1183–1202.
Takamine, K. (2017). Michelle D. Miller: Minds online: Teaching effectively with technology. Higher Education, 73, 789–791.
Tanner, J. R., Noser, T. C., & Totaro, M. W. (2009). Business faculty and undergraduate students’ perceptions of online learning: A comparative study. Journal of Information Systems Education , 20 (1), 29.
Tucker, B. (2012). The flipped classroom. Education Next , 12 (1), 82–83.
Van Wart, M., Ni, A., Ready, D., Shayo, C., & Court, J. (2020). Factors leading to online learner satisfaction. Business Educational Innovation Journal , 12 (1), 15–24.
Van Wart, M., Ni, A., Rose, L., McWeeney, T., & Worrell, R. A. (2019). Literature review and model of online teaching effectiveness integrating concerns for learning achievement, student satisfaction, faculty satisfaction, and institutional results. Pan-Pacific Journal of Business Research, 10(1), 1–22.
Ventura, A. C., & Moscoloni, N. (2015). Learning styles and disciplinary differences: A cross-sectional study of undergraduate students. International Journal of Learning and Teaching , 1 (2), 88–93.
Vlachopoulos, D., & Makri, A. (2017). The effect of games and simulations on higher education: A systematic literature review. International Journal of Educational Technology in Higher Education , 14 , #22.
Wang, Y., Huang, X., & Schunn, C. D. (2019). Redesigning flipped classrooms: A learning model and its effects on student perceptions. Higher Education , 78 , 711–728.
Wingo, N. P., Ivankova, N. V., & Moss, J. A. (2017). Faculty perceptions about teaching online: Exploring the literature using the technology acceptance model as an organizing framework. Online Learning , 21 (1), 15–35.
Xu, D., & Jaggars, S. S. (2014). Performance gaps between online and face-to-face courses: Differences across types of students and academic subject areas. Journal of Higher Education , 85 (5), 633–659.
Young, S. (2006). Student views of effective online teaching in higher education. American Journal of Distance Education , 20 (2), 65–77.
Zawacki-Richter, O., & Naidu, S. (2016). Mapping research trends from 35 years of publications in distance education. Distance Education, 37(3), 245–269.
No external funding.
Authors and affiliations
JHB College of Business and Public Administration, California State University San Bernardino, 5500 University Parkway, San Bernardino, California, 92407, USA
Montgomery Van Wart, Anna Ni, Pamela Medina, Jesus Canelon, Melika Kordrostami, Jing Zhang & Yu Liu
Authors' contributions: Equal. The author(s) read and approved the final manuscript.
Correspondence to Montgomery Van Wart .
We have no competing interests.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .
Reprints and Permissions
About this article
Cite this article.
Van Wart, M., Ni, A., Medina, P. et al. Integrating students’ perspectives about online learning: a hierarchy of factors. Int J Educ Technol High Educ 17 , 53 (2020). https://doi.org/10.1186/s41239-020-00229-8
Received : 29 April 2020
Accepted : 30 July 2020
Published : 02 December 2020
DOI : https://doi.org/10.1186/s41239-020-00229-8
- Online education
- Online teaching
- Student perceptions
- Online quality
- Student presence
- More from M-W
- To save this word, you'll need to log in. Log In
Definition of abstraction
Did you know.
From its roots, abstraction should mean basically "something pulled or drawn away". So abstract art is art that has moved away from painting objects of the ordinary physical world in order to show something beyond it. Theories are often abstractions; so a theory about economics, for instance, may "pull back" to take a broad view that somehow explains all of economics (but maybe doesn't end up explaining any of it very successfully). An abstract of a medical or scientific article is a one-paragraph summary of its contents—that is, the basic findings "pulled out" of the article.
Examples of abstraction in a Sentence
These examples are programmatically compiled from various online sources to illustrate current usage of the word 'abstraction.' Any opinions expressed in the examples do not represent those of Merriam-Webster or its editors. Send us feedback about these examples.
borrowed from Middle French & Late Latin; Middle French, "abduction (of a woman), removal, extraction (of a foreign body from a wound), (in philosophy) process by which the mind is able to form universal representations of the properties of distinct objects," borrowed from Late Latin abstractiōn-, abstractiō , from Latin abstrac- (variant stem of abstrahere "to remove forcibly") + -tiōn-, -tiō , suffix of action nouns — more at abstract entry 1
15th century, in the meaning defined at sense 1a
Dictionary Entries Near abstraction
Cite this Entry
“Abstraction.” Merriam-Webster.com Dictionary , Merriam-Webster, https://www.merriam-webster.com/dictionary/abstraction. Accessed 29 Nov. 2023.
Kids definition of abstraction, more from merriam-webster on abstraction.
Nglish: Translation of abstraction for Spanish Speakers
Britannica English: Translation of abstraction for Arabic Speakers
Britannica.com: Encyclopedia article about abstraction
Subscribe to America's largest dictionary and get thousands more definitions and advanced search—ad free!
Can you solve 4 words at once?
Word of the day.
See Definitions and Examples »
Get Word of the Day daily email!
Games & Quizzes
- Pop culture
- Writing tips
- Daily Crossword
- Word Puzzle
- Word Finder
- Word of the Day
- Synonym of the Day
- Word of the Year
- Language stories
- All featured
- Gender and sexuality
- All pop culture
- Grammar Coach ™
- Writing hub
- Grammar essentials
- Commonly confused
- All writing tips
thought of apart from concrete realities, specific objects, or actual instances: an abstract idea.
expressing a quality or characteristic apart from any specific object or instance, as justice, poverty, and speed.
not applied or practical; theoretical : abstract science.
difficult to understand; abstruse : abstract speculations.
Fine Arts .
of or relating to the formal aspect of art, emphasizing lines, colors, generalized or geometrical forms, etc., especially with reference to their relationship to one another.
Often Abstract . pertaining to the nonrepresentational art styles of the 20th century.
a summary of a text, scientific article, document, speech, etc.; epitome .
something that concentrates in itself the essential qualities of anything more extensive or more general, or of several things; essence.
an idea or term considered apart from some material basis or object.
an abstract work of art.
to make an abstract of; summarize .
to draw or take away; remove .
to divert or draw away the attention of.
to consider as a general quality or characteristic apart from specific objects or instances: to abstract the notions of time, space, and matter.
Idioms about abstract
abstract away from , to omit from consideration.
in the abstract , without reference to a specific object or instance; in theory: beauty in the abstract.
Origin of abstract
Other words from abstract
- ab·stract·er, noun
- ab·stract·ly, adverb
- ab·stract·ness, noun
- non·ab·stract, adjective, noun
- non·ab·stract·ly, adverb
- non·ab·stract·ness, noun
- o·ver·ab·stract, verb (used with object), adjective
- pre·ab·stract, adjective
- su·per·ab·stract, adjective
- su·per·ab·stract·ly, adverb
- su·per·ab·stract·ness, noun
Words Nearby abstract
- abstinence syndrome
- abstinence theory
- abstract algebra
- abstract art
- abstract expressionism
- abstracting journal
Dictionary.com Unabridged Based on the Random House Unabridged Dictionary, © Random House, Inc. 2023
How to use abstract in a sentence
In a pursuit whose meaning and purpose is abstract at the best of times, that’s not nothing.
“Our models can validate thousands of unseen candidates in seconds,” the study’s authors wrote in the abstract to their paper, which appears in the Monthly Notices of the Royal Astronomical Society.
It also makes it more real and concrete, rather than abstract.
The same applies in fields of biology dealing with more abstract concepts of the individual — entities that emerge as distinct patterns within larger schemes of behavior or activity.
The Ising model represents one of the simplest places in this abstract “theory space,” and so serves as a proving ground for developing novel tools for exploring uncharted territory.
These matters are not mere threats to abstract constitutional principles.
Do you think that as we get older our thoughts shift to the more abstract, the music, than the definite, the lyrics?
To listeners, Adnan is a real human while Jay remains an abstract figure.
In the mindset of the Coexist camp, those abstract beliefs have become twisted things, wrapped up with hate.
“There will be flashbacks to that day, but I think it will be a reasonably abstract performance,” Berger said.
This work is now lost, and we know it only by the abstract given by Photius in the passage quoted.
If you are thinking of making an abstract of a particular book, awaken the utmost interest in regard to it before you begin.
Any other work of which an abstract is published will serve the student as well as the above.
Three things are required: To learn how to abstract; To make one, at least, such abstract; and To learn it when made.
He never made any attempt to learn the abstract science of war, and until stirred by danger his character seemed to slumber.
British Dictionary definitions for abstract

adjective
- having no reference to material objects or specific examples; not concrete
- not applied or practical; theoretical
- hard to understand; recondite; abstruse
- denoting art characterized by geometric, formalized, or otherwise nonrepresentational qualities
- defined in terms of its formal properties: an abstract machine
- philosophy (of an idea) functioning for some empiricists as the meaning of a general term: the word ‘man’ does not name all men but the abstract idea of manhood

noun
- a condensed version of a piece of writing, speech, etc; summary
- an abstract term or idea
- an abstract painting, sculpture, etc
- in the abstract without reference to specific circumstances or practical experience

verb
- to think of (a quality or concept) generally without reference to a specific example; regard theoretically
- to form (a general idea) by abstraction
- (ˈæbstrækt) (also intr) to summarize or epitomize
- to remove or extract
- euphemistic to steal

Collins English Dictionary - Complete & Unabridged 2012 Digital Edition © William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012
- Open access
- Published: 23 November 2023
Longitudinal qualitative study of paired mentor-mentee perspectives on the abstract submission process
- Carol A. Mancuso 1 , 2 ,
- Laura Robbins 3 &
- Stephen A. Paget 2 , 4
BMC Medical Education, volume 23, Article number: 898 (2023)
Submitting research abstracts to scientific societies is expected in academic medicine and requires dedicated time and effort. The authors queried mentors and mentees to ascertain what topics and proposed strategies should be included in a new curriculum to enhance the abstract submission process.
Between May 2019 and March 2020, the authors enrolled 14 senior-rank mentors from diverse disciplines at a tertiary musculoskeletal center and their 14 paired mentees (mostly residents and fellows) into a multi-component qualitative study consisting of in-depth interviews, conducted several months before abstract submission, addressing prior experiences, and longitudinal follow-up interviews 1 month before, 1 week before, and 1 week after submission to uncover challenges faced during the actual process and strategies that were effective in overcoming these challenges. Additional contacts occurred through November 2020 to ascertain outcomes of submissions. Mentors and mentees were unaware of each other’s responses. Responses were grouped into categories using grounded theory and a comparative analytic strategy.
At enrollment, participants recounted details from prior abstracts that included experiences with the submission process, such as format, content, and online requirements, and experiences with interpersonal interactions, such as managing coinvestigators’ competing priorities and consulting with statisticians in a timely manner. Benefits of submitting abstracts included advancing mentees’ careers and increasing research methodology rigor. Challenges encountered during the submission process included meeting deadlines before all data were acquired, time away from other responsibilities, and uncertainty about handling changing conclusions as more data accrued. Delayed feedback from coinvestigators and broadening the scope or changing the focus of the abstract compounded the time crunch to meet the submission deadline. At the time of abstract submission, mentor-mentee pairs agreed that major challenges were dealing with collaborators, incomplete data/limited results, and different work styles. The authors developed a proposal for a comprehensive curriculum to include organizational, technical, and interpersonal topics.
This longitudinal qualitative study involving mentor-mentee pairs revealed multiple benefits and challenges associated with submitting research abstracts. These findings provide the foundation for a comprehensive curriculum to enhance this recurring labor-intensive undertaking and cornerstone of academic medicine.
Submitting research abstracts to scientific societies is scholarly work expected of most mentors and mentees in academic training programs [ 1 , 2 , 3 , 4 , 5 , 6 , 7 , 8 ]. Abstracts are highly valued because they are tangible proxies for different types of achievement, including conducting a research study and effectively communicating findings in a written format [ 9 ]. In addition to overseeing the research, faculty mentors guide residents, fellows and other postdoctoral mentees in preparing and then submitting abstracts as co-authors along with content-specific collaborators [ 1 , 2 ].
Abstracts are highly structured and relatively brief with approximately 200–350 words and sometimes tables or figures [ 2 , 9 , 10 , 11 , 12 , 13 ]. Most professional societies have online portals to submit abstracts by stipulated deadlines [ 10 , 11 ]. Once received, abstracts are evaluated by reviewers and the most highly rated are selected for podium or poster presentations at the next society meeting [ 12 , 13 , 14 ]. Attendance and presentation at meetings are desired outcomes, especially for trainees, as they are opportunities to learn, review cutting-edge research, and network with peers and experts [ 5 ].
Abstract preparation is an iterative process that requires dedicated time, rechanneling of effort, collegiality, and patience [ 6 , 14 , 15 ]. Despite being highly valued and requiring sizable human capital from the medical staff, there is no recommended curriculum to teach faculty and trainees to be efficient and successful in submitting abstracts for scientific meetings [ 9 , 14 ]. Most attention to date has focused on preparing abstracts for manuscripts and grants, but these do not address technical and stylistic nuances of standalone abstracts for scientific meetings [ 7 , 14 ].
The goals of this study were to identify perceptions of mentor-mentee pairs regarding challenges encountered during prior abstract submissions and challenges encountered in real time during current abstract submissions. Additional goals were to learn about perceived benefits of submitting abstracts and what mentors and mentees would want in a curriculum aimed at enhancing the abstract submission process. This qualitative study had several components, specifically in-depth interviews at enrollment addressing prior submission experiences, ongoing discussions over several months during the actual submission process to uncover challenges in real time and identify strategies to improve the process, and a final contact to learn about the outcome of the submission.
This study was approved by the IRB at Hospital for Special Surgery and all participants provided verbal informed consent; the IRB approved this form of consent. This institution is a tertiary musculoskeletal center with residency and fellowship programs in diverse disciplines. Residents are assigned blocks of time for research, usually 2 years; fellowships are mostly of 1 year duration.
We identified faculty from these programs with mentorship responsibilities and recruited and interviewed them one-on-one at enrollment about their experiences submitting previous research abstracts to scientific societies. Mentors were asked open-ended questions about the best and worst aspects of submitting abstracts, what they often wish they had done differently, how mentees make the process easier and harder, what are benefits and drawbacks of submitting abstracts, and what should be included in a curriculum about abstract submission. At the conclusion of the interview we asked each mentor to identify a mentee with whom they planned to submit an abstract during the next submission cycle. We also asked them to name the anticipated recipient society and submission deadline. We then recruited these mentees and interviewed them in-person or by telephone and asked the same enrollment questions with the modification of asking how mentors make the process easier and harder. Mentees were not aware of their mentors’ previous responses.
Our methodology was consistent with a theoretical sampling framework with a priority to have representation from different departments to capture variations in abstract preparation strategies and abstract requirements of different professional societies. We did not employ iterative sampling. As interviews proceeded, we also maintained a log of emerging concepts through memo writing which we then probed in subsequent interviews with other participants. The log also was useful for our longitudinal components (described below) where we tailored follow-up questions within participants to responses from their enrollment interviews as well as responses from other participants. This recursive process was consistent with the iterative analyses of grounded theory (described below) [ 16 , 17 ].
We recontacted participants by telephone or email approximately 1 month, and then again 1 week before the submission deadline and asked how preparations were proceeding, what was hard to do, what was unexpected, and did they think they would meet the deadline. Mentors and mentees were not aware of each other’s responses.
We recontacted participants again approximately 1 week after the deadline and asked if they were pleased with the final product, what aspects of the process were most challenging, what would they do differently next time, and what would have been helpful. Finally, we contacted participants again by email several months later to learn about the outcome of the submission, i.e. accepted as a podium or poster presentation, or not accepted, and whether a manuscript also was submitted. One investigator (CAM), who was experienced in qualitative research, conducted the interviews.
Questions were posed in an open-ended fashion and participants could expand on any answers they wished. Information obtained during interviews was written down verbatim in field notes and repeated back to participants for confirmation. A single narrative then was created for each participant composed of all transcribed interviews and email responses.
Participants were asked about academic rank, estimated number of abstracts submitted per year, and approximate previous abstract acceptance rate.
The interviewer assessed participants’ responses according to grounded theory and a descriptive strategy [ 18 , 19 ]. Using open coding, responses were reviewed line-by-line to identify unique concepts. Through an iterative process, concepts were then grouped into larger categories. Based on a comparative analytic strategy, categories were refined to ensure they encompassed distinct features and then were named to capture the phenomena they represented [ 18 ]. Categories were compared for similarities and differences in an iterative process and then grouped into over-arching themes according to the larger common topics they encompassed. Another investigator (LR), also experienced in qualitative methods, independently reviewed all narratives, and corroborated the categories and themes [ 20 ]. Data saturation, or the point when no new concepts were volunteered, was achieved [ 21 ].
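The coding workflow described above (line-by-line open coding, then grouping concepts into larger categories) can be pictured programmatically. The sketch below is a toy illustration only; the codes, category names, and mapping are invented for demonstration and do not come from the study's data:

```python
# Toy illustration of grouping open codes into broader categories,
# in the spirit of the comparative analysis described in the Methods.
# All codes and category names here are invented, not study data.
from collections import defaultdict

CODE_TO_CATEGORY = {
    "missed deadline": "time pressure",
    "late feedback": "time pressure",
    "statistician delay": "dependence on collaborators",
    "co-author disagreement": "dependence on collaborators",
}

def group_codes(codes):
    """Group line-by-line open codes under their assigned category."""
    categories = defaultdict(list)
    for code in codes:
        categories[CODE_TO_CATEGORY.get(code, "uncategorized")].append(code)
    return dict(categories)

print(group_codes(["missed deadline", "late feedback", "co-author disagreement"]))
```

In the actual study this grouping was done iteratively by the investigators, with categories refined and renamed as new concepts emerged, rather than from a fixed mapping as in this sketch.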
We enrolled 14 mentor-mentee pairs (i.e. 28 participants) from May 2019 through March 2020; follow-ups near the time of abstract submission occurred through November 2020, and follow-ups to ascertain abstract outcome and manuscript submission occurred through October 2022. Participants represented 7 specialties, most mentors were professors, and most mentees were residents (Table 1). Based on prior performance, both groups estimated an acceptance rate of ≥ 80% for previous abstracts. All pairs identified a single abstract to be tracked for this study.
Twenty-seven of the 28 participants provided follow-up; one mentee was not contacted because her mentor reported they would not submit an abstract. Twelve of the 14 pairs submitted an abstract as planned; 2 pairs did not submit because of low patient recruitment. Of these 12 abstracts, 3 were accepted for podium presentations, 6 were accepted for poster presentations, and 3 were not accepted. Nine of the 14 pairs ultimately submitted corresponding manuscripts that were accepted.
We developed categories by reviewing enrollment, follow-up, and curriculum comments separately, and then assembled themes from a composite of all categories (Table 2 ). Thus categories and themes emerged at different time points.
Enrollment (perspectives based on prior experiences)
Enrollment interviews focused on experiences from submission of prior abstracts.
Theme: assembling abstract
Individual or team approach
All participants stated they had either a mentor-initiated or mentee-initiated process. Mentor-initiated processes were highly structured and included group meetings to determine what data were ready (‘We discuss what we have that is good and competitive, not premature’), what should be the target society (‘It has to be a match’), and allocation of responsibilities, timelines, and designated draft writer (‘Who does the writing gets to do the presenting’). Mentee-initiated processes were more diffuse (‘I put the abstract together, send it to the mentor, get feedback, repeat, back and forth’).
Deadlines and time crunch
Many participants stated they often had to scramble to meet deadlines. This happened mostly if there was no submission plan, but also occurred when unanticipated results necessitated additional analyses. Not starting early enough and delays in getting feedback compounded the problem. In some cases time crunch was associated with sub-optimal submissions (‘You find yourself with loose ends, you make adjustments that aren’t optimal, you do things you wouldn’t do if there was no deadline, you need to compromise; you do these things just to have abstracts’).
Time away from other pursuits
Both groups reported the process required a lot of time with ‘unsure payoff’ and less time for more consequential pursuits (‘It takes time away from other things, I am already overextended’, ‘As a fellow I really need to get the most out of my time.’) The time required also was a downside because the background volume of work was not relaxed during the process (‘It takes time away from experiments that I’m supposed to do; I have to cram it in’). Time spent on abstracts was considered ‘too much for just a poster’ and was ‘not worth it if not accepted and followed with a manuscript.’ Some mentors also commented it takes time to ‘support a nervous presenter and buffer a tough audience’ if the abstract were accepted. Time spent also was disruptive to personal life (‘Everyone put in extra time at night and weekends’, ‘It definitely took time away from my family’).
Theme: quality of research
Mentors and mentees perceived multiple benefits for their ongoing research, including getting collaborators ‘to focus on the scope and quality of the project, which helps you make big strides’ and ‘gives you a chance to pause and realize what else you could do.’ In addition, abstracts ‘make you look at the data part of the way through and alert you to whether you need to re-direct your research’ (‘It’s a stop, check, ascertain, are we really on the right track? When it is time to publish it is almost too late to do this’).
Challenges due to data volume and quality
Both groups identified situations with insufficient or poor-quality data (‘It’s frustrating if the abstract is weak after all the hard work’, ‘If it’s not strong enough for that society, we have to pull it at the last minute and it becomes a waste of time’). A frequent solution was to submit what was available because ‘it is commonly acknowledged that the abstract is a teaser, it is not final, it is a work in progress, and more data will follow.’ This solution was not embraced by some who noted that ‘abstracts are published’ and then if accepted, presenting contradictory or weak results would be daunting.
Theme: inter-personal interactions
Role of collaborators
Most participants acknowledged that collaborators ‘strengthened the work, fueled the fire’ and ‘made you see perspectives you had not thought of.’ Preparing abstracts often prompted more discussion and interaction among investigators (‘It makes everyone think and progress in the right direction’). However, collaborators hindered the process when they did not provide timely input or write required sections ‘especially for conferences not in the limelight.’ Other challenges were having ‘many people weighing in or wanting different things’ especially if they had different priorities or were ‘not aware of the goals of other mentors.’ Also, ‘if there are multiple co-authors and they disagree, then it is hard to reconcile.’
Dependence on statisticians
Most respondents noted benefits from input from statisticians who increased the scope and rigor of results. However, they also lamented dependence on statisticians because databases had to be provided far in advance (‘They want data a month in advance, this is a real problem’) and turn-around time could be slow (‘It is hard to get results to match your deadlines’). Need for quick feedback had such a marked impact that some participants learned to do their own analyses (‘I do stats myself if straightforward, I did a Masters’) or their departments secured their own support (‘We have our own statistician, she vets all abstracts’, ‘We out-source for stats, we need full time help’).
Impact of mentees
Mentors reported that mentees make the process easier if they have prior experience, plan ahead, and are ‘motivated’, organized, and responsive to comments (‘Follow the email flow and make all edits before sending it back’). Mentees make the process harder if they assemble a cursory draft, wait until the ‘11th hour to ask for help’ (‘Radio silence is hard’), ‘do not consider the audience reading the abstract’, and are reluctant to make conclusions. Mentees also make the process harder if they do not recognize mentors’ time constraints (‘I am incredibly overwhelmed and need more time to review drafts’) and acknowledge mentors’ relationships with colleagues (‘If they don’t send it to co-authors in a timely fashion it makes me look bad with my colleagues’).
Impact of mentors
Mentees reported mentors make the process easier if they start early, prioritize work, clearly state the relevance of the work (‘They know what is impactful’), select societies that ‘are worth sending to’, put the ‘right spin for the audience’, and prod collaborators for timely contributions. Mentors hinder the process if they ‘get lost in details’, and do not provide timely feedback and guidance (‘Saying “no good and re-do” but without clear direction’. ‘If they say “it’s great” too quickly without really looking at it, they are not doing you a favor; mentors do this to boost your confidence but it is counterproductive in the long run.’).
Theme: submission process
Both groups reported that societies make the process more difficult by not stating current areas of interest (‘We would have submitted fewer if we knew what they wanted’) and ‘always adding new requirements.’ Other challenges were ‘busy work’, such as variations in figure and table formatting, and ‘arbitrary’ non-uniform word counts (‘It is tedious to have to vary each abstract’). While most word count comments concerned limits that permitted too few words, changes to the limits were also challenging (‘One society doubled its word count’, ‘They now have word count limits per section, not just total’).
There were challenges with the submission itself because ‘info online is not always the same as actual requirements’ (‘Sometimes I practice with a make-believe abstract just to see what the non-posted requirements are’). Other challenges were prerequisites to upload conflict of interest statements (‘It is painful to get them’) and abrupt unavailability of websites (‘Some interfaces are horrible, finicky, they crash 1 hour before deadlines’).
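Per-section word limits like those described above lend themselves to a quick automated check before pasting a draft into a submission portal. The following sketch is hypothetical; the section names and limits are invented for illustration and are not those of any particular society:

```python
# Hypothetical pre-submission check of per-section word counts.
# Section names and limits below are invented for illustration only.
LIMITS = {"Background": 75, "Methods": 100, "Results": 125, "Conclusion": 50}

def check_word_counts(sections, limits):
    """Return {section: (word_count, within_limit)} for each drafted section."""
    report = {}
    for name, text in sections.items():
        count = len(text.split())
        report[name] = (count, count <= limits.get(name, 0))
    return report

draft = {
    "Background": "Submitting research abstracts is expected in academic medicine.",
    "Methods": "We interviewed mentor-mentee pairs before and after submission.",
}
for name, (count, ok) in check_word_counts(draft, LIMITS).items():
    print(f"{name}: {count} words ({'within limit' if ok else 'over limit'})")
```

A check like this would not catch portal-specific quirks the participants described (undocumented requirements, crashing interfaces), but it removes one source of last-minute rework.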
Theme: impact on careers
Minimal boost to mentors’ careers
Some senior mentors remarked ‘I don’t need abstracts anymore’ and some mid-level mentors wondered ‘does leadership even notice?’ Some abstracts were viewed as perfunctory ‘to demonstrate mentee productivity’ and ‘were scientifically unrewarding.’ Some mentors reported risks in publicly revealing work before it was formally published in manuscripts (‘You don’t want to share it right away, especially if it could be duplicated or reproduced because you could be out-published; so I say instead let’s submit something else as an abstract’). However, some mentors viewed abstracts as a means ‘to present the breadth of what we do’ and ‘keep me in the game.’
Marked advance to mentees’ careers
For mentors, the main benefit was in promoting their mentees’ careers (‘The best part is helping junior colleagues grow and develop’). Abstracts also provide something tangible that, compared to other mentoring deliverables, does not require as much time or effort (‘They are the least hard to mentor’). For mentees, going to national meetings was valuable to ‘get yourself out there’, ‘get involved’, ‘practice presenting’, and network with peers and experts. Presenting abstracts also was considered a way to promote careers (‘Looks good on your CV’, ‘You need people to write letters for promotion’).
Benefits to mentors and mentees from presenting work
Presenting abstracts had unique benefits, such as getting ‘opinions from many experts because you never know what others are thinking about from a manuscript; with a presentation you get feedback immediately.’ Other benefits were finding out what others are doing, ‘getting the absolute latest information’ and getting ideas for future projects. Providing momentum for writing manuscripts was another benefit as was asserting your role in the field (‘You mark your territory’). Some mentors noted that public presentations help recruit future fellows because ‘they see our great work and that trainees are encouraged to participate.’
Theme: psychological impact
Abstracts were associated with psychological stress from multiple sources. Mentees reported stress to meet deadlines (‘In the end I have to rush, I don’t sleep a few nights before abstracts are due’). It also was stressful to prompt collaborators for input (‘It’s hard to get responses from senior co-investigators and then make changes with the little time left’). Dealing with set-in-their way mentors also was stressful (‘I have a hands-on-for-every-detail mentor; but in some ways that’s good because then other co-authors don’t have much to add’). Mentees also reported ‘the worst part is waiting to hear back if accepted.’ Mentors reported it was stressful for mentees if abstracts were rejected or if they ultimately concluded ‘it’s still not good enough, we have to pull it because we have to ensure quality control, it is our reputation.’
Follow-up (perspectives based on current experiences in real time)
Participants were contacted approximately 1 week and 1 month before submission and 1 week after submission to report on the process in real time.
Several mentor-mentee pairs volunteered similar experiences. These included, respectively, comments about results (‘we didn’t find significant differences’/‘our analyses didn’t demonstrate meaningful findings’) and comments about collaborators (‘the hardest part is getting multiple authors to send revisions’/‘the hardest part is the multiple rounds of revisions’). Work style was also mentioned (‘I make many edits, I have to control myself and let it be their voice’/‘stylistic differences occur, but if it makes the mentor happy it’s not a big deal’). Word count was an issue for both (‘now they include the title in the word count’/‘the extra blurb they want creates a word count problem’). Fostering the mentor-mentee relationship was a mutually cited benefit (‘it is meaningful for the relationship with the trainee, you spend a lot of time together, more than in other settings’/‘you get close to your mentor’). COVID-19 markedly impacted research for some pairs (‘we could not get samples, I came up with a couple of projects at the last minute and we pulled it off’/‘because the labs shut down we had to come up with new projects, we managed two abstracts’).
Some topics raised during enrollment were echoed during follow-up: ‘if we had started earlier we could be writing something more solid’, ‘time is tight and things are rushed so I feel a little stressed’, and ‘there is no time for collaborators to comment so we have only a narrow interpretation of findings.’ Additional categories were discerned and grouped according to themes identified in conjunction with enrollment categories.
Submitting far in advance of the meeting
Both groups commented that submitting abstracts far in advance was problematic (‘Deadlines are so far before the meeting that the information can become irrelevant or not interesting anymore’). The main drawback, however, was not having enough data and submitting partial results (‘Everyone knows you will have more data before the conference, so it doesn’t have to be a finished product’, ‘In the end we were forced, we didn’t have all the data but we couldn’t wait until the next meeting’). Some noted this strategy could backfire if additional data were contradictory and the story had to change (‘By the time the meeting rolls around we may have a dilemma, do we present the state when the abstract was submitted or the state at the time of the meeting, the message may have to change’).
Scope of project changed
In some cases participants were surprised with unfolding analyses (‘We hit a snag with the data and are now discussing what to change. Some of our results don’t resemble our assumptions as closely as we would like’). In other cases surprises were favorable (‘Our collaborators did a great job of taking new findings of interest to them and running with them so we got other abstracts’).
Angst about attending the meeting
Once submitted some mentors became concerned ‘if it gets accepted then what? Someone will have to go to the meeting’ and ‘I will have to prepare the trainee for a tough expert audience and a very large room.’ Another concern was ‘this wasn’t rigorous research and if it gets accepted we are going to need to prep a lot more.’
Additional comments pertained to the COVID-19 pandemic, which began in New York City while the study was in progress. For some participants the pandemic limited patient recruitment and laboratory experiments and diverted efforts from research to clinical work. This also impacted ‘co-investigators who did not have as much time as usual to provide feedback.’ Some participants reported the extended submission deadlines were beneficial because they ‘could do more analyses’, ‘confer with collaborators’, ‘generate COVID-related studies’, ‘prepare another abstract’, and ‘add figures that hopefully will increase the chances it will be accepted.’ For some, however, ‘constantly pushing back the deadline was distracting and actually lowered our overall productivity’.
Proposal for a curriculum
Mentors agreed that instruction in preparing abstracts would be helpful otherwise ‘trainees learn on the fly and spend a lot of time trying to figure it out.’ They commented ‘a specific curriculum would be helpful because abstracts for societies are different from other abstracts, they must standalone, they cannot depend on other text like a manuscript or a grant.’ Mentees noted that instruction in preparing abstracts would help them ‘learn the process’ faster and provide tips on how to make the process more efficient. Based on comments offered throughout this study and specific responses about desired topics for instruction, we assembled an outline for a possible curriculum (Table 3 ).
An introduction would summarize benefits of abstracts and the importance of considering the interests of the audience and the society. Doing the work would address making a plan, starting early, and informing collaborators. Content would focus on ensuring the abstract tells a salient story (‘what is the hook, the value of your work, the succinct take home message’). Content also would address ensuring that the analyses are rigorous, and that results are presented advantageously in text, tables and figures. Logistics would emphasize knowing guidelines and interfacing with the submission portal. Additional topics would focus on ways to make the process easier for mentors, mentees, collaborators, and reviewers. Finally, strategies to address special scenarios, such as presenting interim findings, also would be addressed.
The curriculum would be case-based and conveyed with illustrative examples. Editing a draft abstract would be included for hands-on experience.
In our study, mentors and mentees from diverse specialties devoted time and effort to preparing research abstracts for various scientific meetings and had multiple perspectives about the process. During this longitudinal study both groups volunteered knowledge from prior experiences and from challenges they encountered while the process unfolded. These included both technical and interpersonal issues, and exemplified the sizable human capital invested in this educational and scientific endeavor. The abundant information that mentors and mentees volunteered now provides the content for an evidence-based and targeted curriculum to optimize the abstract submission process.
Although not part of specific educational programs, several publications offer well-considered strategies for assembling competitive research abstracts [ 5 , 6 , 7 , 9 , 10 , 11 , 14 , 15 ]. Some of our findings concur with their recommendations, such as addressing salient topics, choosing the right meeting, carefully following instructions, and planning ahead. One publication included several annotated abstracts to effectively illustrate recommended strategies [ 15 ] and another tracked rates of submission over time [ 1 ]. These publications mostly focused on technical suggestions for formatting, section content, and presentation, and devoted less attention to the interpersonal process of abstract preparation. In addition, these previous reports were based on expert opinion and did not use qualitative methods to acquire input from mentors and trainees.
Our study provides the foundation for a curriculum. But is formal instruction really needed for abstract preparation? Don’t trainees eventually learn this on the job? While becoming proficient in writing effective abstracts certainly requires practice that cannot be substituted with instruction, there is a role for providing formal guidance and sharing effective strategies. Fostering effective communication skills is a constant goal of medical education for all endeavors, including abstracts [ 2 , 9 ]. From the point of view of the faculty, our findings highlight the substantial time and effort required of them for this recurring task. A more streamlined and efficient process would allow mentors to devote more time to the scientific significance of the research as opposed to details of formatting and packaging the message. Elements of the curriculum also could be tailored to mentors to optimize their skills in guiding and overseeing this process.
Our study has several limitations. First, participants were from a tertiary care institution where dual submission of research abstracts by mentors and mentees is expected. In addition, mentees had limited dedicated time for research, and this contributed to submission of abstracts before data acquisition was completed. Second, we chose mentors based on designated leadership roles in their training programs, and they then chose the mentees to be partnered with for this study. Third, some participants emphasized certain topics based on their specialty. For example, radiologists, who were often collaborators, were particularly attuned to issues involving collegiality; thus, abstract submission according to specialty warrants further inquiry. Fourth, although abstracts were submitted to multiple societies, they all focused on musculoskeletal medicine. These issues may limit the applicability of our findings, and repeating this work in other academic medical settings would improve generalizability.
In summary, our study was unique in that it focused on abstract preparation for scientific meetings, was based on input from faculty and trainees, and longitudinally tracked the submission process. Using qualitative methods, we ascertained what technical and interpersonal topics are integral to the process. These findings will provide the foundation for a comprehensive curriculum to enhance this recurring labor-intensive endeavor and cornerstone of academic medicine.
The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.
Chen AMH, Sweeney M, Sprague JE, Dowling TC, Durst SW, Eberle MM, Stolte SK, Talbot JN. Stimulating and sustaining scholarly activity at teaching-intensive institutions. Curr Pharm Teach Learn. 2021;13:228–37.
Toledo-Pereyra LH. In the pursuit of scholarly activities. J Invest Surg. 2010;23:335–41.
Philibert I, Lieh-Lai M, Miller R, Potts JR 3rd, Brigham T, Nasca TJ. Scholarly activity in the next accreditation system: moving from structure and process to outcomes. J Grad Med Educ. 2013;5:714–7.
Simpson D, Yarris LM, Carek PJ. Defining the scholarly and scholarship common program requirements. J Grad Med Educ. 2013;5:539–40.
Wood GJ, Morrison RS. Writing abstracts and developing posters for national meetings. J Palliat Med. 2011;14:353–9.
Linder L. Disseminating research and scholarly project: developing a successful abstract. J Pediatr Oncol Nurs. 2012;29:362–6.
Cook DA, Bordage G. Twelve tips on writing abstracts and titles: how to get people to use and cite your work. Med Teach. 2016;38:1100–4.
Becker D, Garth H, Hollander R, Klein F, Klau M. Understanding faculty and trainee needs related to scholarly activity in a large, nonuniversity graduate medical education program. Perm J. 2017;21:16-034.
Ickes MJ, Gambescia SF. Abstract art: how to write competitive conference and journal abstracts. Health Promot Pract. 2011;12:493–6.
Boullata JI, Mancuso CE. A how-to guide in preparing abstracts and poster presentations. Nutr Clin Pract. 2007;22:641–6.
Singh MK. Preparing and presenting effective abstracts and posters in psychiatry. Acad Psychiatry. 2014;38:709–15.
Varpio L, Amiel J, Richards BF. Writing competitive research conference abstracts: AMEE guide no. 108. Med Teach. 2016;38:863–71.
Gambescia SF. A brief on writing a successful abstract. Educ Health. 2013;26:122–5.
Happell B. Hitting the target! A no tears approach to writing an abstract for a conference presentation. Int J Ment Health Nurs. 2007;16:447–52.
Pierson DJ. How to write an abstract that will be accepted for presentation at a national meeting. Respir Care. 2004;49:1206–12.
Glaser BG, Strauss AL. The discovery of grounded theory: strategies for qualitative research. Mill Valley, CA: Sociology Press; 1967.
Corbin JM, Strauss A. Grounded theory research: procedures, canons, and evaluative criteria. Qual Sociol. 1990;13:3–21.
Pawluch D, Neiterman E. What is grounded theory and where does it come from? In: Bourgeault I, Dingwall R, De Vries R, editors. The SAGE handbook of qualitative methods in health research. London: SAGE Publications Ltd; 2010. Chap. 9.
Strauss AL, Corbin JM. Basics of qualitative research: techniques and procedures for developing grounded theory. 2nd ed. Thousand Oaks, CA: Sage Publications; 1998.
Campbell JL, Quincy C, Osserman J, Pedersen OK. Coding in-depth semistructured interviews: problems of unitization and intercoder reliability and agreement. Sociol Methods Res. 2013;42:294–320.
Morse JM. Data were saturated…. Qual Health Res. 2015;25:587–8.
The authors acknowledge the Academy of Medical Educators at the Hospital for Special Surgery for its support.
There was no external funding for this study.
Authors and affiliations
Research Division, Hospital for Special Surgery, 535 East 70th Street, New York, NY, 10021, USA
Carol A. Mancuso
Department of Medicine, Weill Cornell Medical College, 1300 York Ave, New York, NY, 10021, USA
Carol A. Mancuso & Stephen A. Paget
Education Institute, Hospital for Special Surgery, 535 East 70th Street, New York, NY, 10021, USA
Division of Rheumatology, Hospital for Special Surgery, 535 East 70th Street, New York, NY, 10021, USA
Stephen A. Paget
CAM designed the study, interviewed all participants, analyzed the data, interpreted the data, and wrote the manuscript. LR designed the study, analyzed the data, interpreted the data, and edited the manuscript. SAP designed the study, interpreted the data, and provided administrative oversight. All authors read and approved the final manuscript.
Correspondence to Carol A. Mancuso .
Ethics approval and consent to participate
This study was approved by the Institutional Review Board at the Hospital for Special Surgery. All participants provided verbal consent. The IRB at the Hospital for Special Surgery approved this form of consent.
Competing interests
The authors declare no competing interests.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
About this article
Cite this article.
Mancuso, C.A., Robbins, L. & Paget, S.A. Longitudinal qualitative study of paired mentor-mentee perspectives on the abstract submission process. BMC Med Educ 23 , 898 (2023). https://doi.org/10.1186/s12909-023-04869-y
Received : 03 August 2023
Accepted : 14 November 2023
Published : 23 November 2023
- Abstract submission
- Abstract deadlines
- Abstract preparation
- Curriculum development
Data abstraction is the process of hiding certain details and showing only essential information to the user. Abstraction can be achieved with either abstract classes or interfaces (which you will learn more about in the next chapter).
- Abstract class: a restricted class that cannot be used to create objects (to access it, it must be inherited from another class).
- Abstract method: can only be used in an abstract class, and it does not have a body. The body is provided by the subclass (inherited from).
An abstract class can have both abstract and regular methods:
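The code block this sentence introduces did not survive extraction. Below is a minimal sketch in the spirit of the tutorial's `Animal` example, assuming one abstract and one regular method (the specific names and the printed text are illustrative):

```java
// Abstract class: cannot be used to create objects directly.
abstract class Animal {
  // Abstract method: declared without a body; a subclass must provide one.
  public abstract void animalSound();

  // Regular method: has a body and is inherited as-is.
  public void sleep() {
    System.out.println("Zzz");
  }
}

public class Main {
  public static void main(String[] args) {
    // Animal myObj = new Animal(); // compile error: Animal is abstract
  }
}
```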
From the example above, it is not possible to create an object of the Animal class: writing Animal myObj = new Animal(); will generate a compile-time error.
To access the abstract class, it must be inherited from another class. Let's convert the Animal class we used in the Polymorphism chapter to an abstract class:
Remember from the Inheritance chapter that we use the extends keyword to inherit from a class.
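Putting the pieces together, here is a sketch of the converted example; the `Animal`/`Pig` names follow the tutorial's earlier Polymorphism example, and the printed strings are assumptions for illustration:

```java
abstract class Animal {
  public abstract void animalSound(); // no body here; a subclass must implement it
  public void sleep() {
    System.out.println("Zzz");
  }
}

// Pig uses the extends keyword to inherit from Animal
// and supplies the body for the abstract method.
class Pig extends Animal {
  public void animalSound() {
    System.out.println("The pig says: wee wee");
  }
}

public class Main {
  public static void main(String[] args) {
    Pig myPig = new Pig(); // creating an object of the subclass is allowed
    myPig.animalSound();   // subclass implementation
    myPig.sleep();         // inherited regular method
  }
}
```

Running `Main` prints the subclass implementation of `animalSound()` first, then the inherited `sleep()` body.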
Why And When To Use Abstract Classes and Methods?
To achieve security - hide certain details and only show the important details of an object.
Note: Abstraction can also be achieved with Interfaces , which you will learn more about in the next chapter.
RSNA 2023 Science and Education Highlights
Attendees can select from variety of education and science highlights to add to their meeting agenda.
RSNA 2023 offers 300+ educational courses and scientific sessions covering every subspecialty in a variety of formats across all career levels. Here is a sampling of session highlights and education and science courses attendees might want to include in their meeting agenda for the week.
Adding Virtual Access to your registration guarantees that you can attend the meeting at your pace and your convenience, with more than 300 livestreamed and on-demand courses to meet the needs of global attendees. Attendees who register for Virtual Access will be able to view 100% of all available annual meeting programming until April 30, 2024, at noon CT.
Each one-hour science session includes a series of 7-minute presentations of hypothesis-driven research focused on a common theme. Science sessions cover every subspecialty and modality. Science sessions with keynote include an introduction and/or conclusion to the topic by an invited lecturer.
Invited faculty teach courses across all subspecialties in sessions that last one to two hours. Presentations take on several different formats including hands-on, case-based, essentials, hot topic and interactive sessions. Browse the online program and filter by subspecialty and content format to find the course that fits your needs.
Case-based courses offer case scenarios to demonstrate best practices and include self-assessment for participants to test their knowledge. Consider these courses:
- Lung/Mediastinum Case Based Multidisciplinary Review
- Demystifying Postoperative Spine Imaging: Imaging Techniques and Case Based Review
- Critical Musculoskeletal Trauma in the ER- A Case Based Approach
Essentials courses are designed for the general radiologist, trainees and subspecialty radiologists who want to review other areas of focus. Faculty focus on the fundamentals of the given topic. Consider these courses:
- Musculoskeletal Imaging: When Molecular Imaging Helps
- Essentials of GI Imaging
- Revolution in Alzheimer’s Disease Therapy is Finally Here: What does the Radiologist Need to Know
Hot topics courses highlight late-breaking research and new innovations in radiology and offer a variety of viewpoints on a topic. Consider these courses:
- Informatics and Patient Centered Care-Creating the Tools For a Better Radiology Experience
- Hot Topics in Emergency Radiology
- The Future is Here: Artificial Intelligence in Cardiovascular Imaging
Interactive sessions invite audience participation using technology including polling, self-assessment or gamification. Consider these sessions:
- STAT! Emergency Imaging of the Pediatric Patient
- Multi-modality Challenging Breast Cases
- GU Essentials! A Case-Based Audience Participation Session
RSNA Hands-On Labs are six 90-minute sessions where you can learn about and practice a variety of imaging and interventional protocols across several modalities and various body structures.
Practice techniques in the RSNA Hands-On Labs with hands-on training courses focused on Pediatric Musculoskeletal US; Liver Elastography; Musculoskeletal US: Approach to Ultrasound Assessment of the Shoulder with Dynamic Maneuvers; Ultrasound Doppler Hands-On Course of the Carotid System and Abdominal Vasculature; Breast US Biopsy; and Contrast Reaction Management.
Build AI acumen in the RSNA Deep Learning Lab, located in the Learning Center. The Lab features 19 unique sessions covering a range of topics and skills. Bring your own devices for hands-on activities and to explore new tools and resources.
RSNA AI Deep Learning Lab sessions are available in several skill levels including beginner-friendly options. See topics and schedules at Meeting.RSNA.org and add labs to your registration; labs are $100 each.
NIH Grantsmanship Workshop Sunday, Nov. 26, 12 p.m. CT
This workshop introduces participants to the process of preparing a competitive research or training grant application. Designed for junior faculty in academic centers who wish to pursue a career in radiologic research, this didactic workshop will cover elements of a good grant proposal, understanding the review process and planning the proposal. Workshop attendees must be registered for the RSNA annual meeting.
Five speakers present for five minutes each in these fast-paced presentations. The innovative, non-clinical topics were selected by popular vote. Follow the conversation on social media at #RSNAFast5.
Tuesday, Nov. 28, 10:30 a.m. CT
Moderator: Ángel Gómez-Cintrón, MD, MPH
Kara Gaetke-Udager, MD Teaching Future Radiologists: What’s In It For Me?
Anne Williams Darrow, MD MORE: Mentoring, Outreach & Resources for Equity
Cooper Gamble, BS AI Needs to Know What It Doesn’t Know
Saurabh Jha, MD X-Rays on Mount Everest
Jessica Tsai Wen, MD, PhD Is Colorblindness Doing More Harm Than Good in Combatting Racial Health Disparities?
Planning for Your Future: How to Minimize Taxes and Create Your Legacy Tuesday, Nov. 28, 3 p.m. CT, Room S405
Join Lynn M. Gaumer, JD, senior gift planning consultant from The Stelter Company for a one-hour seminar on estate planning, asset management, tax savings and charitable gift giving. You’ll learn more about what you need to create a solid estate plan, how to save on taxes and popular ways to help support the next generation of radiologists through the R&E Foundation. In addition to comprehensive discussion, ample opportunity for Q&A will be included at the end of the session.
Quantitative Imaging Symposium Wednesday, Nov. 29, 2 p.m. CT
Join clinical, research and industry colleagues in Room E253AB to learn more about QIBA activities. This year’s discussion will focus on the opportunities and needs of quantitative imaging in the era of AI.
Case of the Day
Each day a unique and challenging case is available online for meeting participants to submit answers. The correct answer is revealed the following morning. Answers must be submitted by midnight CT on the day the case is available online. Participants who submit the correct answer receive 0.5 AMA PRA Category 1 Credits ™.
RSNA designates this Other activity (blended live and enduring material) for AMA PRA Category 1 Credits™. Physicians should claim only the credit commensurate with the extent of their participation in the activity.
The RSNA 2023 credit claim site is designed for self-service. All registrants, except for ARRT registrants, will be able to review and adjust credit for education sessions attended, complete session evaluations, and print and save their credit certificate or record of attendance at any time.
Virtual Access registrants can maintain extended access beyond the week of the live meeting, until April 30, 2024, noon CT, to view sessions and earn additional on-demand credits. After April 30, 2024, no additional revisions to transcripts will be allowed; however, attendees will maintain access to the credit claiming site through April 30, 2025, in order to print their CME certificate or Record of Attendance.
RSNA 2023 will offer A+/A ARRT credit. In order to record and obtain A+/A continuing education (CE) credits, all ARRT-registered radiologic technologists and radiologist assistants must follow ARRT guidelines, including being present for 50 minutes of every scheduled hour of a live CE activity, checking in and out with your badge from each educational session and evaluating your RSNA 2023 courses online at Meeting.RSNA.org . There will be no ARRT credit for attending virtual or on-demand sessions.
RSNA 2023 does not offer Self-Assessment Modules (SAMs). Members are encouraged to visit RSNA’s Online Learning Center for opportunities to complete SA-CME credit throughout the year.
For More Information
Register for the meeting at RSNA.org/Annual-Meeting .
Review the RSNA 2023 Program at Meeting Central .
Review the RSNA 2023 Technical Exhibits .
Students’ experience of online learning during the COVID‐19 pandemic: A province‐wide survey study
1 Centre for Learning Analytics at Monash, Faculty of Information Technology, Monash University, Clayton VIC, Australia
2 Portfolio of the Deputy Vice‐Chancellor (Education), Monash University, Melbourne VIC, Australia
3 Department of Computer Science, Jinan University, Guangzhou China
4 College of Cyber Security, Jinan University, Guangzhou China
Guanliang Chen

Associated Data
The data is not openly available as it is restricted by the Chinese government.
Online learning is currently adopted by educational institutions worldwide to provide students with ongoing education during the COVID‐19 pandemic. Even though online learning research has been advancing in uncovering student experiences in various settings (i.e., tertiary, adult, and professional education), very little progress has been achieved in understanding the experience of the K‐12 student population, especially when narrowed down to different school‐year segments (i.e., primary and secondary school students). This study explores how students at different stages of their K‐12 education reacted to the mandatory full‐time online learning during the COVID‐19 pandemic. For this purpose, we conducted a province‐wide survey study in which the online learning experience of 1,170,769 Chinese students was collected from the Guangdong Province of China. We performed cross‐tabulation and Chi‐square analysis to compare students’ online learning conditions, experiences, and expectations. Results from this survey study provide evidence that students’ online learning experiences differ significantly across school years. Finally, policy implications were drawn to advise government authorities and schools on improving the delivery of online learning, and potential directions were identified for future research into K‐12 online learning.
What is already known about this topic
- Online learning has been widely adopted during the COVID‐19 pandemic to ensure the continuation of K‐12 education.
- Student success in K‐12 online education is substantially lower than in conventional schools.
- Students experienced various difficulties related to the delivery of online learning.
What this paper adds
- Provide empirical evidence for the online learning experience of students in different school years.
- Identify the different needs of students in primary, middle, and high school.
- Identify the challenges of delivering online learning to students of different ages.
Implications for practice and/or policy
- Authorities and schools need to provide sufficient technical support to students in online learning.
- The delivery of online learning needs to be customised for students in different school years.
The ongoing COVID‐19 pandemic poses significant challenges to the global education system. By July 2020, the UN Educational, Scientific and Cultural Organization (2020) reported nationwide school closure in 111 countries, affecting over 1.07 billion students, which is around 61% of the global student population. Traditional brick‐and‐mortar schools are forced to transform into full‐time virtual schools to provide students with ongoing education (Van Lancker & Parolin, 2020 ). Consequently, students must adapt to the transition from face‐to‐face learning to fully remote online learning, where synchronous video conferences, social media, and asynchronous discussion forums become their primary venues for knowledge construction and peer communication.
For K‐12 students, this sudden transition is problematic, as they often lack prior online learning experience (Barbour & Reeves, 2009). Barbour and LaBonte (2017) estimated that even in countries where online learning is growing rapidly, such as the USA and Canada, less than 10% of the K‐12 student population had prior experience with this format. Maladaptation to online learning could expose inexperienced students to various vulnerabilities, including decrements in academic performance (Molnar et al., 2019), feelings of isolation (Song et al., 2004), and lack of learning motivation (Muilenburg & Berge, 2005). Unfortunately, with confirmed cases continuing to rise each day and new outbreaks occurring on a global scale, full‐time online learning for most students could last longer than anticipated (World Health Organization, 2020). Even after the pandemic, the current mass adoption of online learning could have lasting impacts on the global education system and potentially accelerate and expand the rapid growth of virtual schools on a global scale (Molnar et al., 2019). Thus, understanding students' learning conditions and their experiences of online learning during the COVID‐19 pandemic becomes imperative.
Emerging evidence on students’ online learning experience during the COVID‐19 pandemic has identified several major concerns, including issues with internet connection (Agung et al., 2020; Basuony et al., 2020), problems with IT equipment (Bączek et al., 2021; Niemi & Kousa, 2020), limited collaborative learning opportunities (Bączek et al., 2021; Yates et al., 2020), reduced learning motivation (Basuony et al., 2020; Niemi & Kousa, 2020; Yates et al., 2020), and increased learning burdens (Niemi & Kousa, 2020). Although these findings provide valuable insights into the issues students experienced during online learning, information about students' learning conditions and future expectations is less often reported. Such information could assist educational authorities and institutions to better comprehend students’ difficulties and potentially improve their online learning experience. Additionally, most of these recent studies were limited to higher education, except for Yates et al.’s (2020) and Niemi and Kousa’s (2020) studies on senior high school students. Empirical research targeting the full spectrum of K‐12 students remains scarce. Therefore, to address these gaps, the current paper reports the findings of a large‐scale study that sought to explore K‐12 students’ online learning experience during the COVID‐19 pandemic in a provincial sample of over one million Chinese students. The findings of this study provide policy recommendations to educational institutions and authorities regarding the delivery of K‐12 online education.
Learning conditions and technologies
Having stable access to the internet is critical to students’ experience of online learning. Berge (2005) expressed concern that divides in digital‐readiness and pedagogical approach between countries could influence students’ online learning experience. Digital‐readiness is the availability and adoption of information technologies and infrastructures in a country. Western countries such as the USA (3rd) score significantly higher in digital‐readiness than Asian countries such as China (54th; Cisco, 2019). Students from low digital‐readiness countries could experience additional technology‐related problems. Supporting evidence is emerging in recent studies conducted during the COVID‐19 pandemic. In Egypt's capital city, Basuony et al. (2020) found that only around 13.9% of students experienced issues with their internet connection, whereas more than two‐thirds of students in rural Indonesia reported unstable internet, insufficient internet data, and incompatible learning devices (Agung et al., 2020).
Another influential factor in K‐12 students’ adequate adaptation to online learning is access to appropriate technological devices, especially a desktop or a laptop (Barbour et al., 2018). However, most students are unlikely to satisfy this requirement. Even in higher education, around 76% of students reported having devices incompatible with online learning; only 15% of students used a laptop for online learning, whereas around 85% used a smartphone (Agung et al., 2020). It is very likely that K‐12 students also suffer from this availability issue, as they depend on their parents to provide access to suitable learning devices.
Technical issues surrounding technological devices could also influence students’ experience of online learning. Barbour and Reeves (2009) argue that students need a high level of digital literacy to find and use relevant information and to communicate with others through technological devices. Students lacking this ability could experience difficulties in online learning. Bączek et al. (2021) found that around 54% of medical students experienced technical problems with IT equipment, and this issue was more prevalent among students in lower years of tertiary education. Likewise, Niemi and Kousa (2020) found that students in a Finnish high school experienced more technical problems during the examination period, which involved additional technical applications. These findings are concerning, as young children and adolescents in primary and lower secondary school could be more vulnerable to these technical problems because they are less experienced with the technologies used in online learning (Barbour & LaBonte, 2017). Therefore, it is essential to investigate the learning conditions and related difficulties experienced by students in K‐12 education, as the extent of the effects on them remains underexplored.
Learning experience and interactions
Apart from the aforementioned issues, the extent of interaction and the collaborative learning opportunities available in online learning could also influence students’ experience. The literature on online learning has long emphasised the role of effective interaction in the success of student learning. According to Muirhead and Juwah (2004), interaction is an event that can take the shape of any type of communication between two or more subjects and objects. Specifically, the literature acknowledges three typical forms of interaction (Moore, 1989): (i) student‐content, (ii) student‐student, and (iii) student‐teacher. Anderson (2003) posits, in the well‐known interaction equivalency theorem, that learning experiences will not deteriorate if only one of the three interactions is of high quality, even when the other two are reduced or eliminated. Quality interaction can be accomplished across two dimensions: (i) structure (pedagogical means that guide student interaction with content or other students) and (ii) dialogue (communication between students and teachers and among students). To scale online learning and contain teaching costs, the emphasis is typically on structure (i.e., pedagogy) that can promote effective student‐content and student‐student interaction. The role of technology and media is typically recognised as a way to amplify the effect of pedagogy (Lou et al., 2006). Novel technological innovations, for example learning analytics‐based personalised feedback at scale (Pardo et al., 2019), can also empower teachers to promote their interaction with students.
Online education can lead to a sense of isolation, which can be detrimental to student success (McInnerney & Roberts, 2004). Therefore, integrating social interaction into pedagogy for online learning is essential, especially when students do not actually know each other or have underdeveloped communication and collaboration skills (Garrison et al., 2010; Gašević et al., 2015). Unfortunately, existing evidence suggests that online learning delivery during the COVID‐19 pandemic often lacked interactivity and collaborative experiences (Bączek et al., 2021; Yates et al., 2020). Bączek et al. (2021) found that around half of the medical students reported reduced interaction with teachers, and only 4% of students thought online learning classes were interactive. Likewise, Yates et al.’s (2020) study of high school students revealed that over half of the students preferred in‐class collaboration over online collaboration, as they valued the immediate support and the proximity to teachers and peers of in‐class interaction.
Learning expectations and age differentiation
Although these studies have provided valuable insights and stressed the need for more interactivity in online learning, K‐12 students in different school years could have different expectations for the desired activities in online learning. Piaget's cognitive developmental theory illustrates children's difficulties in understanding abstract and hypothetical concepts (Thomas, 2000). Primary school students encounter many abstract concepts in their STEM education (Uttal & Cohen, 2012). In face‐to‐face learning, teachers provide constant guidance on students’ learning progress and can help them understand difficult concepts. Unfortunately, the level of guidance drops significantly in online learning, and, in most cases, children have to face learning obstacles by themselves (Barbour, 2013). Additionally, lower primary school students may lack the metacognitive skills to use various online learning functions, maintain engagement in synchronous online learning, develop and execute self‐regulated learning plans, and engage in meaningful peer interactions during online learning (Barbour, 2013; Broadbent & Poon, 2015; Huffaker & Calvert, 2003; Wang et al., 2013). Thus, understanding these younger students’ expectations is imperative, as delivering online learning to them in the same way as in a virtual high school could hinder their learning experience. For students with more mature metacognition, expectations of online learning could be substantially different from those of younger students. Niemi and Kousa’s (2020) study of students in a Finnish high school found that students often reported heavy workloads and fatigue during online learning. These issues could cause anxiety and reduce students’ learning motivation, with negative consequences for their emotional well‐being and academic performance (Niemi & Kousa, 2020; Yates et al., 2020), especially for senior students under the pressure of examinations.
Consequently, their expectations of online learning could be orientated toward having additional learning support functions and materials. Likewise, they could also prefer having more opportunities for peer interactions as these interactions are beneficial to their emotional well‐being and learning performance (Gašević et al., 2013 ; Montague & Rinaldi, 2001 ). Therefore, it is imperative to investigate the differences between online learning expectations in students of different school years to suit their needs better.
By building upon the aforementioned relevant works, this study aimed to contribute to the online learning literature with a comprehensive understanding of the online learning experience that K‐12 students had during the COVID‐19 pandemic period in China. Additionally, this study also aimed to provide a thorough discussion of what potential actions can be undertaken to improve online learning delivery. Formally, this study was guided by three research questions (RQs):
RQ1. What learning conditions were experienced by students across 12 years of education during their online learning process in the pandemic period?
RQ2. What benefits and obstacles were perceived by students across 12 years of education when performing online learning?
RQ3. What expectations do students, across 12 years of education, have for future online learning practices?
The total number of K‐12 students in the Guangdong Province of China is around 15 million. In China, students of Year 1–6, Year 7–9, and Year 10–12 are referred to as primary school, middle school, and high school students, respectively. Typically, students in China start primary school at around the age of six. At the end of their high‐school study, students take the National College Entrance Examination (NCEE; also known as Gaokao) to apply for tertiary education. The survey was administered across the whole Guangdong Province, that is, it was available to all 15 million K‐12 students, though completing it was not mandatory. A total of 1,170,769 students completed the survey, which corresponds to a response rate of 7.80%. After removing responses with missing values and responses submitted from the same IP address (duplicates), we had 1,048,575 valid responses, which accounts for about 7% of the K‐12 students in the Guangdong Province. The number of students in different school years is shown in Figure 1. Overall, students were evenly distributed across school years, except for a smaller sample of Year 10–12 students.
Figure 1. The number of students in each school year
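The response cleaning described above (dropping responses with missing values, then removing duplicate submissions from the same IP address) can be sketched with pandas. This is an illustrative sketch only; the column names and toy records below are assumptions, not details from the actual survey export:

```python
import pandas as pd

# Toy stand-in for the raw survey export; column names are hypothetical.
raw = pd.DataFrame({
    "ip": ["1.2.3.4", "1.2.3.4", "5.6.7.8", "9.9.9.9"],
    "year": [3, 3, 8, None],  # school year; None marks a missing value
    "q3_media": ["smartphone", "smartphone", "computer", "tv"],
})

# 1) Drop responses with any missing values.
cleaned = raw.dropna()

# 2) Drop duplicate submissions from the same IP address, keeping the first.
cleaned = cleaned.drop_duplicates(subset="ip", keep="first")

print(len(cleaned))  # 2 valid responses remain
```

The same two filters, applied in this order to the real 1,170,769 responses, would yield the 1,048,575 valid responses reported above.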
The survey was designed collaboratively by multiple relevant parties. Firstly, three educational researchers working in colleges and universities and three educational practitioners working in the Department of Education in Guangdong Province were recruited to co‐design the survey. Then, the initial draft of the survey was sent to 30 teachers from different primary and secondary schools, whose feedback and suggestions were considered to improve the survey. The final survey consisted of a total of 20 questions, which, broadly, can be classified into four categories: demographic, behaviours, experiences, and expectations. Details are available in Appendix.
All K‐12 students in the Guangdong Province were required to undertake full‐time online learning from March 1, 2020, after the outbreak of COVID‐19 in China in January. A province‐level online learning platform was provided to all schools by the government. In addition to this platform, schools could also use third‐party platforms to facilitate teaching activities, for example WeChat and Dingding, which provide services similar to WhatsApp and Zoom. The main change for most teachers was that they had to shift classroom‐based lectures to online lectures with the aid of web‐conferencing tools. Similarly, these teachers also needed to mark homework and hold consultation sessions online.
The Department of Education in the Guangdong Province of China distributed the survey to all K‐12 schools in the province on March 21, 2020 and collected responses on March 26, 2020. Students could access and answer the survey anonymously by either scanning the Quick Response code accompanying the survey or clicking the survey link on their mobile devices. The survey was administered in a completely voluntary manner and no incentives were given to participants. Ethical approval was granted by the Department of Education in the Guangdong Province. Parental approval was not required because the survey was entirely anonymous and facilitated by the regulating authority, which satisfies China's ethical process.
The original survey was in Chinese and was later translated by two bilingual researchers and verified by an external translator certified by the Australian National Accreditation Authority of Translators and Interpreters. The original and translated survey questionnaires are available in the Supporting Information. Given the limited space here and the fact that not every survey item is relevant to the RQs, the following items were chosen to answer the RQs: items Q3 (learning media) and Q11 (learning approaches) for RQ1, items Q13 (perceived obstacles) and Q19 (perceived benefits) for RQ2, and item Q19 (expected learning activities) for RQ3. Cross‐tabulation‐based approaches were used to analyse the collected data. To scrutinise whether the differences displayed by students of different school years were statistically significant, we performed Chi‐square tests and calculated Cramer's V to assess the strength of the association once the Chi‐square test had determined significance.
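The test statistics used throughout the results (Chi‐square with Cramer's V, where df∗ = min(rows, columns) − 1) can be reproduced with standard tools. A minimal sketch using SciPy, with a made‐up 4×2 contingency table; the counts are hypothetical for illustration, as the study's underlying tables are not reproduced here:

```python
import numpy as np
from scipy.stats import chi2_contingency

def cramers_v(table):
    """Chi-square test plus Cramer's V, with df* = min(rows, cols) - 1."""
    table = np.asarray(table)
    chi2, p, dof, _expected = chi2_contingency(table)
    n = table.sum()
    df_star = min(table.shape) - 1
    v = np.sqrt(chi2 / (n * df_star))
    return chi2, p, v

# Hypothetical counts: four school-year bands x smartphone use (yes / no).
table = [[200_000, 62_000],
         [210_000, 55_000],
         [230_000, 30_000],
         [120_000, 10_000]]
chi2, p, v = cramers_v(table)
```

With samples of this size, even tiny associations reach significance, which is why effect sizes (Cramer's V) are reported alongside p values in the sections that follow.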
For the analyses, students were segmented into four categories based on their school years, that is, Year 1–3, Year 4–6, Year 7–9, and Year 10–12, to provide a clear understanding of the different experiences and needs that students have for online learning. This segmentation was based on the educational structure of Chinese schools: elementary school (Year 1–6), middle school (Year 7–9), and high school (Year 10–12). Children in elementary school can further be segmented into junior (Year 1–3) and senior (Year 4–6) students, because senior elementary students in China face heavier workloads than junior students due to the provincial Middle School Entry Examination at the end of Year 6.
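The four‐band segmentation can be expressed as a simple mapping; the function below is an illustrative sketch, not code from the study:

```python
def school_band(year: int) -> str:
    """Map a Chinese school year (1-12) to the four analysis bands."""
    if not 1 <= year <= 12:
        raise ValueError("year must be in 1..12")
    if year <= 3:
        return "Year 1-3"   # junior elementary
    if year <= 6:
        return "Year 4-6"   # senior elementary (Middle School Entry Exam at Year 6)
    if year <= 9:
        return "Year 7-9"   # middle school
    return "Year 10-12"     # high school (facing the NCEE)
```

Applying this mapping to each valid response produces the four groups compared in the analyses that follow.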
The Chi‐square test showed a significant association between school years and students' reported usage of learning media, χ2(55, N = 1,853,952) = 46,675.38, p < 0.001. Cramer's V is 0.07 (df∗ = 5), which indicates a small‐to‐medium effect according to Cohen's (1988) guidelines. Based on Figure 2, we observed that an average of up to 87.39% of students used smartphones to perform online learning, while only 25.43% used computers, which suggests that smartphones, with their widespread availability in China (2020), have been adopted by students for online learning. As for the prevalence of the two media, we noticed that both smartphones (χ2(3, N = 1,048,575) = 9,395.05, p < 0.001, Cramer's V = 0.10 (df∗ = 1)) and computers (χ2(3, N = 1,048,575) = 11,025.58, p < 0.001, Cramer's V = 0.10 (df∗ = 1)) were adopted more by high‐school‐year (Year 7–12) than early‐school‐year students (Year 1–6), both with a small effect size. In addition, apparent discrepancies can be observed in the usage of TV and paper‐based materials across school years: early‐school‐year students reported more TV usage (χ2(3, N = 1,048,575) = 19,505.08, p < 0.001), with a small‐to‐medium effect size, Cramer's V = 0.14 (df∗ = 1), whereas high‐school‐year students (especially Year 10–12) reported more usage of paper‐based materials (χ2(3, N = 1,048,575) = 23,401.64, p < 0.001), with a small‐to‐medium effect size, Cramer's V = 0.15 (df∗ = 1).
Figure 2. Learning media used by students in online learning
School year is also significantly associated with the different learning approaches students used to tackle difficult concepts during online learning, χ2(55, N = 2,383,751) = 58,030.74, p < 0.001. The strength of this association is weak to moderate, as shown by Cramer's V (0.07, df∗ = 5; Cohen, 1988). When encountering problems with difficult concepts, students typically chose to "solve independently by searching online" or "rewatch recorded lectures" instead of consulting their teachers or peers (Figure 3). This is probably because, compared to classroom‐based education, it is less convenient and more challenging for students to seek help from others when learning online. Moreover, compared to high‐school‐year students, early‐school‐year students (Year 1–6) reported much less use of "solve independently by searching online" (χ2(3, N = 1,048,575) = 48,100.15, p < 0.001), with a small‐to‐medium effect size, Cramer's V = 0.21 (df∗ = 1). Also, among the approaches of seeking help from others, significantly more high‐school‐year students preferred "communicating with other students" than early‐school‐year students (χ2(3, N = 1,048,575) = 81,723.37, p < 0.001), with a medium effect size, Cramer's V = 0.28 (df∗ = 1).
Figure 3. Learning approaches used by students in online learning
Perceived benefits and obstacles—RQ2
The association between school years and perceived benefits of online learning is statistically significant, χ2(66, N = 2,716,127) = 29,534.23, p < 0.001, and Cramer's V (0.04, df∗ = 6) indicates a small effect (Cohen, 1988). Unsurprisingly, the benefits brought by the convenience of online learning are widely recognised by students across all school years (Figure 4): up to 75% of students reported that it is "more convenient to review course content" and 54% said that they "can learn anytime and anywhere". We also noticed that about 50% of early‐school‐year students appreciated the "access to courses delivered by famous teachers" and 40%–47% of high‐school‐year students indicated that online learning is "helpful to develop self‐regulation and autonomy".
Figure 4. Perceived benefits of online learning reported by students
The Chi‐square test shows a significant association between school years and students' perceived obstacles in online learning, χ2(77, N = 2,699,003) = 31,987.56, p < 0.001. This association is relatively weak, as shown by Cramer's V (0.04, df∗ = 7; Cohen, 1988). As shown in Figure 5, the biggest obstacle, encountered by up to 73% of students, was "eyestrain caused by long staring at screens". Disengagement caused by nearby disturbance was reported by around 40% of students, especially those in Year 1–3 and Year 10–12. Technology‐wise, about 50% of students experienced poor Internet connection during their learning process, and around 20% of students reported "confusion in setting up the platforms" across all school years.
Figure 5. Perceived obstacles of online learning reported by students
Expectations for future practices of online learning—RQ3
Online learning activities.
The association between school years and students’ expected online learning activities is significant, χ 2 (66, N = 2,416,093) = 38,784.81, p < 0.001. The Cramer's V is 0.05 ( df ∗ = 6) which suggests a small effect (Cohen, 1988 ). As shown in Figure 6 , the most expected activity for future online learning is “real‐time interaction with teachers” (55%), followed by “online group discussion and collaboration” (38%). We also observed that more early‐school‐year students expect reflective activities, such as “regular online practice examinations” ( χ 2 (3, N = 1,048,575) = 11,644.98, p < 0.001), with a small effect size, Cramer's V = 0.11 ( df ∗ = 1). In contrast, more high‐school‐year students expect “intelligent recommendation system …” ( χ 2 (3, N = 1,048,575) = 15,327.00, p < 0.001), with a small effect size, Cramer's V = 0.12 ( df ∗ = 1).
Figure 6. Students' expected online learning activities
Regarding students' learning conditions, substantial differences were observed between students in different school years in the learning media, family dependency, and learning approaches adopted in online learning. The finding of more computer and smartphone usage among high‐school‐year than early‐school‐year students can probably be explained by older students' growing abilities in using these devices, and the educational systems and tools that run on them, which allow them to make better use of these media for online learning. The difference in the use of paper‐based materials may reflect that high‐school‐year students in China have to complete a substantial amount of exercises, assignments, and exam papers to prepare for the National College Entrance Examination (NCEE), whose delivery was not entirely digitised due to the sudden transition to online learning. Meanwhile, high‐school‐year students may also have preferred using paper‐based materials for exam practice because, eventually, they would take the NCEE in paper format. Therefore, these substantial differences in students' usage of learning media should be addressed by customising the delivery method of online learning for different school years.
Beyond these between‐age differences in learning media, the prevalence of smartphones in online learning resonates with Agung et al.'s (2020) finding on the issues surrounding the availability of compatible learning devices. The prevalence of smartphones among K‐12 students is potentially problematic, as the majority of online learning platforms and content is designed for computer‐based learning (Berge, 2005; Molnar et al., 2019), whereas learning with smartphones has its own unique challenges. For example, Gikas and Grant (2013) discovered that students who learn with smartphones experienced frustration with the small screen size, especially when trying to type with the tiny keypad. Another challenge relates to the distraction of various social media applications. Although similar distractions exist in computer‐ and web‐based social media, the level of popularity, especially among the young generation, is much higher for mobile‐based social media (Montag et al., 2018). In particular, the message notification function in smartphones could disengage students from learning activities and lure them towards social media applications (Gikas & Grant, 2013). Given these challenges of learning with smartphones, more research effort should be devoted to analysing students' online learning behaviour in mobile learning settings to accommodate their needs better.
The differences in learning approaches, once again, illustrate that early‐school‐year students have different needs from high‐school‐year students. In particular, the low usage of independent learning methods among early‐school‐year students may reflect their limited ability to engage in independent learning. The differences in help‐seeking behaviours also demonstrate distinctive needs for communication and interaction: early‐school‐year students rely strongly on teachers, while high‐school‐year students, who are equipped with stronger communication abilities, are more inclined to interact with their peers. This finding implies that the design of online learning platforms should take students' different needs into account. Thus, customisation is urgently needed in the delivery of online learning to different school years.
In terms of the perceived benefits and challenges of online learning, our results resonate with several previous findings. In particular, the benefits of convenience are in line with the flexibility advantages of online learning mentioned in prior works (Appana, 2008; Bączek et al., 2021; Barbour, 2013; Basuony et al., 2020; Harvey et al., 2014). Early‐school‐year students' higher appreciation of "access to courses delivered by famous teachers" and lower appreciation of the independent learning skills developed through online learning are also in line with previous literature (Barbour, 2013; Harvey et al., 2014; Oliver et al., 2009). Again, these similar findings may indicate the strong reliance that early‐school‐year students place on teachers, while high‐school‐year students are more capable of adapting to online learning by developing independent learning skills.
Technology‐wise, students' experiences of poor internet connections and confusion in setting up online learning platforms are particularly concerning. The problem of poor internet connections corroborates findings reported in prior studies (Agung et al., 2020; Barbour, 2013; Basuony et al., 2020; Berge, 2005; Rice, 2006), that is, access issues surrounding the digital divide remain one of the main challenges of online learning. In the era of 4G and 5G networks, educational authorities and institutions that deliver online education could fall into the misconception that most students have a stable internet connection at home. The internet issues we observed are particularly vital to students' online learning experience, as most students prefer real‐time communication (Figure 6), which relies heavily on a stable internet connection. Likewise, the finding of students' confusion with technology is also consistent with prior studies (Bączek et al., 2021; Muilenburg & Berge, 2005; Niemi & Kousa, 2020; Song et al., 2004). Students who were unsuccessful in setting up the online learning platforms could experience declines in confidence and enthusiasm for online learning, leading to a subsequently unpleasant learning experience. Therefore, both the readiness of internet infrastructure and students' technical skills remain significant challenges for the mass adoption of online learning.
On the other hand, students' experience of eyestrain from extended screen time provides empirical evidence for Spitzer's (2001) speculation about the potential ergonomic impact of online learning. This negative effect is potentially related to the prevalence of smartphone devices and their limited screen size. This finding not only demonstrates the potential ergonomic issues of smartphone‐based online learning but also resonates with the aforementioned necessity of different platform and content designs for different students.
A problem less mentioned in previous studies of online learning experiences is the disengagement caused by nearby disturbance, especially in Year 1–3 and Year 10–12. It is likely that early‐school‐year students suffered from this problem because of their underdeveloped metacognitive skills for concentrating on online learning without teachers' guidance. As for high‐school‐year students, the reasons behind their disengagement require further investigation. In particular, it would be worthwhile to scrutinise whether this type of disengagement is caused by the substantial amount of coursework they have to undertake and the resulting higher pressure and lower concentration while learning.
Age‐level differences are also apparent in students' expectations of online learning. Our results demonstrated students' need for social interaction with others during online learning, in line with previous findings (Bączek et al., 2021; Harvey et al., 2014; Kuo et al., 2014; Liu & Cavanaugh, 2012; Yates et al., 2020). This need manifested differently across school years, with early‐school‐year students preferring more teacher interaction and learning regulation support. Once again, this finding may imply that early‐school‐year students struggle to engage with online learning without proper guidance from their teachers. High‐school‐year students, in contrast, prefer more peer interaction and recommendations of learning resources. This expectation can probably be explained by the large amount of coursework they face: high‐school‐year students need further guidance to help them better direct their learning efforts. These differences in students' expectations for future practices could guide the customisation of online learning delivery.
As shown in our results, improving the delivery of online learning not only requires the efforts of policymakers but also depends on the actions of teachers and parents. The following sub‐sections provide recommendations for relevant stakeholders and discuss their essential roles in supporting online education.
The majority of students experienced technical problems during online learning, including internet lagging and confusion in setting up the learning platforms. These technological problems could impair students' learning experience (Kauffman, 2015; Muilenburg & Berge, 2005). Educational authorities and schools should always provide thorough guidance and assistance for students who are experiencing technical problems with online learning platforms or other related tools. Early screening and detection could also help schools and teachers direct their efforts more effectively in supporting students with low technology skills (Wilkinson et al., 2010). A potential identification method involves distributing age‐specific surveys that assess students' Information and Communication Technology (ICT) skills at the beginning of online learning. For example, empirically validated ICT surveys are available for both primary (Aesaert et al., 2014) and high school (Claro et al., 2012) students.
For students who experience internet lagging, the delivery of online learning should provide options that require less data and bandwidth. Lecture recording is the existing option but fails to address students' need for real‐time interaction (Clark et al., 2015; Malik & Fatima, 2017). A potential alternative involves providing students with the option to learn with digital or physical textbooks and audio‐conferencing, instead of screen sharing and video‐conferencing. This approach significantly reduces data usage and lowers the bandwidth required for students to engage in smooth online interactions (Cisco, 2018). It also requires little additional effort from teachers, as official textbooks are often available for each school year, so teachers only need to guide students through the materials during audio‐conferencing. Educational authorities can further support this approach by making digital textbooks available to teachers and students, especially those in financial hardship. However, the lack of visual and instructor presence could reduce students' attention, recall of information, and satisfaction in online learning (Wang & Antonenko, 2017). Therefore, further research is required to understand whether the combination of digital or physical textbooks and audio‐conferencing is appropriate for students with internet problems. Alternatively, where the local technological infrastructure is well developed, governments and schools can collaborate with internet providers to issue data and bandwidth vouchers to students who are experiencing internet problems due to financial hardship.
For the future adoption of online learning, policymakers should consider the readiness of the local internet infrastructure. This recommendation is particularly important for developing countries, like Bangladesh, where the majority of students reported a lack of internet infrastructure (Ramij & Sultana, 2020). In such environments, online education may become infeasible, and alternative delivery methods could be more appropriate; for example, the Telesecundaria program provides TV education for rural areas of Mexico (Calderoni, 1998).
Other than technical problems, choosing a suitable online learning platform is also vital for providing students with a better learning experience. Governments and schools should choose an online learning platform that is customised for smartphone‐based learning, as the majority of students could be using smartphones for online learning. This recommendation is highly relevant for situations where students are forced or involuntarily engaged in online learning, like during the COVID‐19 pandemic, as they might not have access to a personal computer (Molnar et al., 2019 ).
Customisation of delivery methods
Customising the delivery of online learning for students in different school years is the theme that appeared consistently across our findings. This customisation process is vital for making online learning an opportunity for students to develop independent learning skills, which could help prepare them for tertiary education and lifelong learning. However, the pedagogical design of K‐12 online learning programs should be differentiated from adult‐orientated programs as these programs are designed for independent learners, which is rarely the case for students in K‐12 education (Barbour & Reeves, 2009 ).
For early‐school‐year students, especially Year 1–3 students, providing sufficient guidance from both teachers and parents should be the priority, as these students often lack the ability to monitor and reflect on their learning progress. In particular, these students would prefer more real‐time interaction with teachers, tutoring from parents, and regular online practice examinations. These forms of guidance could help early‐school‐year students cope with involuntary online learning and potentially enhance their experience in future online learning. It should be noted that early‐school‐year students demonstrated interest in intelligent monitoring and feedback systems for learning. Additional research is required to understand whether these young children are capable of understanding and using learning analytics that relay information on their learning progress. Similarly, future research should also investigate whether young children can communicate effectively through digital tools, as an inability to do so could hinder learning in online group activities. Therefore, the design of online learning for early‐school‐year students should focus less on independent learning and more on ensuring that students learn effectively under the guidance of teachers and parents.
In contrast, group learning and peer interaction are essential for older children and adolescents. The delivery of online learning for these students should focus on providing them with more opportunities to communicate with each other and engage in collaborative learning. Potential methods to achieve this goal involve assigning or encouraging students to form study groups (Lee et al., 2011 ), directing students to use social media for peer communication (Dabbagh & Kitsantas, 2012 ), and providing students with online group assignments (Bickle & Rucker, 2018 ).
Special attention should be paid to students enrolled in high schools. For high‐school‐year students, in particular those in Year 10–12, we also recommend providing sufficient access to paper‐based learning materials, such as revision booklets and practice exam papers, so that they remain familiar with paper‐based examinations. This recommendation applies to any students who engage in online learning but have to take their final examinations in paper format. It is also imperative to help high‐school‐year students who are facing examinations to direct their learning efforts better. Teachers can fulfil this need by sharing useful learning resources on the learning management system, if one is available, or through social media groups. Alternatively, students are interested in intelligent recommendation systems for learning resources, which are emerging in the literature (Corbi & Solans, 2014; Shishehchi et al., 2010). These systems could provide personalised recommendations based on a series of evaluations of learners' knowledge. Although such systems are infeasible in situations where the transition to online learning happens rapidly (i.e., during the COVID‐19 pandemic), policymakers can consider embedding them in future online education.
The current findings are limited to primary and secondary Chinese students who were involuntarily engaged in online learning during the COVID‐19 pandemic. Despite the large sample size, the population may not be representative, as participants were all from a single province. Also, information about the quality of online learning platforms, teaching content, and pedagogical approaches was missing because of the large scale of our study. It is likely that the infrastructure of online learning in China, such as learning platforms, instructional designs, and teachers' knowledge about online pedagogy, was underprepared for the sudden transition. Thus, our findings may not represent the experience of students who voluntarily participated in well‐prepared online learning programs, in particular the virtual school programs in America and Canada (Barbour & LaBonte, 2017; Molnar et al., 2019). Lastly, the survey was only evaluated and validated by teachers, not students. Therefore, students with the lowest reading comprehension levels might have understood the items differently, especially terminologies that involve abstract constructs such as self‐regulation and autonomy in item Q17.
In conclusion, we identified across‐year differences in primary and secondary school students' online learning experiences during the COVID‐19 pandemic. Several recommendations were made for the future practice and research of online learning in the K‐12 student population. First, educational authorities and schools should provide sufficient technical support to help students overcome potential internet and technical problems, and should choose online learning platforms that have been customised for smartphones. Second, online pedagogical design should be customised for students in different school years, in particular by providing sufficient guidance for young children, more online collaborative opportunities for older children and adolescents, and additional learning resources for senior students who are facing final examinations.
CONFLICT OF INTEREST
There is no potential conflict of interest in this study.
DATA AVAILABILITY STATEMENT
The data were collected by the Department of Education of the Guangdong Province, which also has the authority to approve research studies in K‐12 education in the province.
This work is supported by the National Natural Science Foundation of China (62077028, 61877029), the Science and Technology Planning Project of Guangdong (2020B0909030005, 2020B1212030003, 2020ZDZX3013, 2019B1515120010, 2018KTSCX016, 2019A050510024), the Science and Technology Planning Project of Guangzhou (201902010041), and the Fundamental Research Funds for the Central Universities (21617408, 21619404).
Yan, L., Whitelock‐Wainwright, A., Guan, Q., Wen, G., Gašević, D., & Chen, G. (2021). Students' experience of online learning during the COVID‐19 pandemic: A province‐wide survey study. British Journal of Educational Technology, 52, 2038–2057. 10.1111/bjet.13102
- Aesaert, K., Van Nijlen, D., Vanderlinde, R., & van Braak, J. (2014). Direct measures of digital information processing and communication skills in primary education: Using item response theory for the development and validation of an ICT competence scale. Computers & Education, 76, 168–181. 10.1016/j.compedu.2014.03.013
- Agung, A. S. N., Surtikanti, M. W., & Quinones, C. A. (2020). Students' perception of online learning during COVID‐19 pandemic: A case study on the English students of STKIP Pamane Talino. SOSHUM: Jurnal Sosial Dan Humaniora, 10(2), 225–235. 10.31940/soshum.v10i2.1316
- Anderson, T. (2003). Getting the mix right again: An updated and theoretical rationale for interaction. The International Review of Research in Open and Distributed Learning, 4(2). 10.19173/irrodl.v4i2.149
- Appana, S. (2008). A review of benefits and limitations of online learning in the context of the student, the instructor and the tenured faculty. International Journal on E‐learning, 7(1), 5–22.
- Bączek, M., Zagańczyk‐Bączek, M., Szpringer, M., Jaroszyński, A., & Wożakowska‐Kapłon, B. (2021). Students' perception of online learning during the COVID‐19 pandemic: A survey study of Polish medical students. Medicine, 100(7), e24821. 10.1097/MD.0000000000024821
- Barbour, M. K. (2013). The landscape of K‐12 online learning: Examining what is known. Handbook of Distance Education, 3, 574–593.
- Barbour, M., Huerta, L., & Miron, G. (2018). Virtual schools in the US: Case studies of policy, performance and research evidence. In Society for Information Technology & Teacher Education International Conference (pp. 672–677). Association for the Advancement of Computing in Education (AACE).
- Barbour, M. K., & LaBonte, R. (2017). State of the nation: K‐12 e‐learning in Canada, 2017 edition. http://k12sotn.ca/wp‐content/uploads/2018/02/StateNation17.pdf
- Barbour, M. K., & Reeves, T. C. (2009). The reality of virtual schools: A review of the literature. Computers & Education, 52(2), 402–416.
- Basuony, M. A. K., EmadEldeen, R., Farghaly, M., El‐Bassiouny, N., & Mohamed, E. K. A. (2020). The factors affecting student satisfaction with online education during the COVID‐19 pandemic: An empirical study of an emerging Muslim country. Journal of Islamic Marketing. 10.1108/JIMA-09-2020-0301
- Berge, Z. L. (2005). Virtual schools: Planning for success. Teachers College Press, Columbia University.
- Bickle, M. C., & Rucker, R. (2018). Student‐to‐student interaction: Humanizing the online classroom using technology and group assignments. Quarterly Review of Distance Education, 19(1), 1–56.
- Broadbent, J. , & Poon, W. L. (2015). Self‐regulated learning strategies & academic achievement in online higher education learning environments: A systematic review . The Internet and Higher Education , 27 , 1–13. [ Google Scholar ]
- Calderoni, J. (1998). Telesecundaria: Using TV to bring education to rural Mexico (Tech. Rep.). The World Bank. [ Google Scholar ]
- Cisco . (2018). Bandwidth requirements for meetings with cisco Webex and collaboration meeting rooms white paper . http://dwz.date/dpbc [ Google Scholar ]
- Cisco . (2019). Cisco digital readiness 2019 . https://www.cisco.com/c/m/en_us/about/corporate‐social‐responsibility/research‐resources/digital‐readiness‐index.html#/ (Library Catalog: www.cisco.com). [ Google Scholar ]
- Clark, C. , Strudler, N. , & Grove, K. (2015). Comparing asynchronous and synchronous video vs. text based discussions in an online teacher education course . Online Learning , 19 ( 3 ), 48–69. [ Google Scholar ]
- Claro, M. , Preiss, D. D. , San Martín, E. , Jara, I. , Hinostroza, J. E. , Valenzuela, S. , Cortes, F. , & Nussbaum, M. (2012). Assessment of 21st century ICT skills in Chile: Test design and results from high school level students . Computers & Education , 59 ( 3 ), 1042–1053. 10.1016/j.compedu.2012.04.004 [ CrossRef ] [ Google Scholar ]
- Cohen, J. (1988). Statistical power analysis for the behavioral sciences . Routledge Academic. [ Google Scholar ]
- Corbi, A. , & Solans, D. B. (2014). Review of current student‐monitoring techniques used in elearning‐focused recommender systems and learning analytics: The experience API & LIME model case study . IJIMAI , 2 ( 7 ), 44–52. [ Google Scholar ]
- Dabbagh, N. , & Kitsantas, A. (2012). Personal learning environments, social media, and self‐regulated learning: A natural formula for connecting formal and informal learning . The Internet and Higher Education , 15 ( 1 ), 3–8. 10.1016/j.iheduc.2011.06.002 [ CrossRef ] [ Google Scholar ]
- Garrison, D. R. , Cleveland‐Innes, M. , & Fung, T. S. (2010). Exploring causal relationships among teaching, cognitive and social presence: Student perceptions of the community of inquiry framework . The Internet and Higher Education , 13 ( 1–2 ), 31–36. 10.1016/j.iheduc.2009.10.002 [ CrossRef ] [ Google Scholar ]
- Gašević, D. , Adesope, O. , Joksimović, S. , & Kovanović, V. (2015). Externally‐facilitated regulation scaffolding and role assignment to develop cognitive presence in asynchronous online discussions . The Internet and Higher Education , 24 , 53–65. 10.1016/j.iheduc.2014.09.006 [ CrossRef ] [ Google Scholar ]
- Gašević, D. , Zouaq, A. , & Janzen, R. (2013). “Choose your classmates, your GPA is at stake!” The association of cross‐class social ties and academic performance . American Behavioral Scientist , 57 ( 10 ), 1460–1479. [ Google Scholar ]
- Gikas, J. , & Grant, M. M. (2013). Mobile computing devices in higher education: Student perspectives on learning with cellphones, smartphones & social media . The Internet and Higher Education , 19 , 18–26. [ Google Scholar ]
- Harvey, D. , Greer, D. , Basham, J. , & Hu, B. (2014). From the student perspective: Experiences of middle and high school students in online learning . American Journal of Distance Education , 28 ( 1 ), 14–26. 10.1080/08923647.2014.868739 [ CrossRef ] [ Google Scholar ]
- Kauffman, H. (2015). A review of predictive factors of student success in and satisfaction with online learning . Research in Learning Technology , 23 . 10.3402/rlt.v23.26507 [ CrossRef ] [ Google Scholar ]
- Kuo, Y.‐C. , Walker, A. E. , Belland, B. R. , Schroder, K. E. , & Kuo, Y.‐T. (2014). A case study of integrating interwise: Interaction, internet self‐efficacy, and satisfaction in synchronous online learning environments . International Review of Research in Open and Distributed Learning , 15 ( 1 ), 161–181. 10.19173/irrodl.v15i1.1664 [ CrossRef ] [ Google Scholar ]
- Lee, S. J. , Srinivasan, S. , Trail, T. , Lewis, D. , & Lopez, S. (2011). Examining the relationship among student perception of support, course satisfaction, and learning outcomes in online learning . The Internet and Higher Education , 14 ( 3 ), 158–163. 10.1016/j.iheduc.2011.04.001 [ CrossRef ] [ Google Scholar ]
- Liu, F. , & Cavanaugh, C. (2012). Factors influencing student academic performance in online high school algebra . Open Learning: The Journal of Open, Distance and e‐Learning , 27 ( 2 ), 149–167. 10.1080/02680513.2012.678613 [ CrossRef ] [ Google Scholar ]
- Lou, Y. , Bernard, R. M. , & Abrami, P. C. (2006). Media and pedagogy in undergraduate distance education: A theory‐based meta‐analysis of empirical literature . Educational Technology Research and Development , 54 ( 2 ), 141–176. 10.1007/s11423-006-8252-x [ CrossRef ] [ Google Scholar ]
- Malik, M. , & Fatima, G. (2017). E‐learning: Students’ perspectives about asynchronous and synchronous resources at higher education level . Bulletin of Education and Research , 39 ( 2 ), 183–195. [ Google Scholar ]
- McInnerney, J. M. , & Roberts, T. S. (2004). Online learning: Social interaction and the creation of a sense of community . Journal of Educational Technology & Society , 7 ( 3 ), 73–81. [ Google Scholar ]
- Molnar, A. , Miron, G. , Elgeberi, N. , Barbour, M. K. , Huerta, L. , Shafer, S. R. , & Rice, J. K. (2019). Virtual schools in the US 2019 . National Education Policy Center. [ Google Scholar ]
- Montague, M. , & Rinaldi, C. (2001). Classroom dynamics and children at risk: A followup . Learning Disability Quarterly , 24 ( 2 ), 75–83. [ Google Scholar ]
- Montag, C. , Becker, B. , & Gan, C. (2018). The multipurpose application Wechat: A review on recent research . Frontiers in Psychology , 9 , 2247. 10.3389/fpsyg.2018.02247 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
- Moore, M. G. (1989). Editorial: Three types of interaction . American Journal of Distance Education , 3 ( 2 ), 1–7. 10.1080/08923648909526659 [ CrossRef ] [ Google Scholar ]
- Muilenburg, L. Y. , & Berge, Z. L. (2005). Student barriers to online learning: A factor analytic study . Distance Education , 26 ( 1 ), 29–48. 10.1080/01587910500081269 [ CrossRef ] [ Google Scholar ]
- Muirhead, B. , & Juwah, C. (2004). Interactivity in computer‐mediated college and university education: A recent review of the literature . Journal of Educational Technology & Society , 7 ( 1 ), 12–20. [ Google Scholar ]
- Niemi, H. M. , & Kousa, P. (2020). A case study of students’ and teachers’ perceptions in a finnish high school during the COVID pandemic . International Journal of Technology in Education and Science , 4 ( 4 ), 352–369. 10.46328/ijtes.v4i4.167 [ CrossRef ] [ Google Scholar ]
- Oliver, K. , Osborne, J. , & Brady, K. (2009). What are secondary students’ expectations for teachers in virtual school environments? Distance Education , 30 ( 1 ), 23–45. 10.1080/01587910902845923 [ CrossRef ] [ Google Scholar ]
- Pardo, A. , Jovanovic, J. , Dawson, S. , Gašević, D. , & Mirriahi, N. (2019). Using learning analytics to scale the provision of personalised feedback . British Journal of Educational Technology , 50 ( 1 ), 128–138. 10.1111/bjet.12592 [ CrossRef ] [ Google Scholar ]
- Ramij, M. , & Sultana, A. (2020). Preparedness of online classes in developing countries amid covid‐19 outbreak: A perspective from Bangladesh. Afrin, Preparedness of Online Classes in Developing Countries amid COVID‐19 Outbreak: A Perspective from Bangladesh (June 29, 2020) .
- Rice, K. L. (2006). A comprehensive look at distance education in the k–12 context . Journal of Research on Technology in Education , 38 ( 4 ), 425–448. 10.1080/15391523.2006.10782468 [ CrossRef ] [ Google Scholar ]
- Shishehchi, S. , Banihashem, S. Y. , & Zin, N. A. M. (2010). A proposed semantic recommendation system for elearning: A rule and ontology based e‐learning recommendation system. In 2010 international symposium on information technology (Vol. 1, pp. 1–5).
- Song, L. , Singleton, E. S. , Hill, J. R. , & Koh, M. H. (2004). Improving online learning: Student perceptions of useful and challenging characteristics . The Internet and Higher Education , 7 ( 1 ), 59–70. 10.1016/j.iheduc.2003.11.003 [ CrossRef ] [ Google Scholar ]
- Spitzer, D. R. (2001). Don’t forget the high‐touch with the high‐tech in distance learning . Educational Technology , 41 ( 2 ), 51–55. [ Google Scholar ]
- Thomas, R. M. (2000). Comparing theories of child development. Wadsworth/Thomson Learning. United Nations Educational, Scientific and Cultural Organization. (2020, March). Education: From disruption to recovery . https://en.unesco.org/covid19/educationresponse (Library Catalog: en.unesco.org)
- Uttal, D. H. , & Cohen, C. A. (2012). Spatial thinking and stem education: When, why, and how? In Psychology of learning and motivation (Vol. 57 , pp. 147–181). Elsevier. [ Google Scholar ]
- Van Lancker, W. , & Parolin, Z. (2020). Covid‐19, school closures, and child poverty: A social crisis in the making . The Lancet Public Health , 5 ( 5 ), e243–e244. 10.1016/S2468-2667(20)30084-0 [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
- Wang, C.‐H. , Shannon, D. M. , & Ross, M. E. (2013). Students’ characteristics, self‐regulated learning, technology self‐efficacy, and course outcomes in online learning . Distance Education , 34 ( 3 ), 302–323. 10.1080/01587919.2013.835779 [ CrossRef ] [ Google Scholar ]
- Wang, J. , & Antonenko, P. D. (2017). Instructor presence in instructional video: Effects on visual attention, recall, and perceived learning . Computers in Human Behavior , 71 , 79–89. 10.1016/j.chb.2017.01.049 [ CrossRef ] [ Google Scholar ]
- Wilkinson, A. , Roberts, J. , & While, A. E. (2010). Construction of an instrument to measure student information and communication technology skills, experience and attitudes to e‐learning . Computers in Human Behavior , 26 ( 6 ), 1369–1376. 10.1016/j.chb.2010.04.010 [ CrossRef ] [ Google Scholar ]
- World Health Organization . (2020, July). Coronavirus disease 2019 (COVID‐19): Situation Report‐164 (Situation Report No. 164). https://www.who.int/docs/default‐source/coronaviruse/situation‐reports/20200702‐covid‐19‐sitrep‐164.pdf?sfvrsn$=$ac074f58$_$2
- Yates, A. , Starkey, L. , Egerton, B. , & Flueggen, F. (2020). High school students’ experience of online learning during Covid‐19: The influence of technology and pedagogy . Technology, Pedagogy and Education , 9 , 1–15. 10.1080/1475939X.2020.1854337 [ CrossRef ] [ Google Scholar ]