
Promoting Classroom Engagement Through the Use of an Online Student Response System: A Mixed Methods Analysis


Abstract

The use of online student response systems (OSRSs) is increasing within tertiary education providers; however, research investigating their potential to enhance student engagement is limited. The aim of the current study was to examine the impact of an OSRS using an experimental crossover design. Quantitative data measuring student engagement were compared from pre- to post-intervention, and a qualitative analysis was used to further investigate student perceptions of the OSRS. The results from this study suggest that OSRSs may be appropriate tools for increasing student engagement in undergraduate statistics classes. Although engagement scores did not change significantly when students were exposed to the OSRS compared with when they were not, students appreciated the novelty of the OSRS and perceived it to have had a positive impact on their learning experience. Suggestions for how to exploit the advantages of OSRSs and directions for further research are discussed.

1 Introduction

Student engagement has been defined as how involved students are in their learning experience and how connected they feel to their classes, peers, and institutions (Axelson and Flick 2010). Given the positive associations between student engagement and course satisfaction (Swan 2001; Wefald and Downey 2009), persistence (Berger and Milem 1999; Kuh et al. 2008), and academic success (Carini, Kuh, and Klein 2006), initiatives aimed at fostering student engagement in classrooms have become a primary focus for higher education institutions. As more technology becomes integrated into the student learning experience, educators have begun to explore technology-based initiatives as a means of enhancing student engagement through active learning activities.

Previous research investigating the use of technology within higher education has demonstrated that technology can enhance student engagement (D'Inverno, Davis, and White 2003; Poirier and Feldman 2007; Blasco-Arcas et al. 2013). The 2016 Educause Center for Analysis and Research (ECAR) Study of Undergraduate Students and Information Technology (Brooks 2016) found that students believe technology enriches their academic experience and is critical for academic success, indicating that students have welcomed the incorporation of technology-based learning strategies. Furthermore, recent reports have found that students are requesting that classes integrate more technology (Dahlstrom and Bichsel 2014; Brooks 2016). Hence, there appears to be a need for universities to incorporate more technology into their teaching practices to satisfy the needs of today's students. However, despite computers and technology becoming an integral part of delivering education, how best to exploit this potential to enhance student engagement remains largely unknown.

Student response systems (SRSs) are one such technology that has been promoted as having the potential to enhance student engagement in the classroom. SRSs are electronic tools that enable teachers to engage with students during a class by asking questions that can be answered instantaneously by large groups of students. Such systems are increasingly being employed by teachers, and research has found that using these tools during classes can help increase student engagement (Hall et al. 2005; Fies and Marshall 2006; Caldwell 2007; Kaleta and Joosten 2007; Trees and Jackson 2007), partly due to the anonymity they provide students when responding to questions (Freeman, Blayney, and Ginns 2006).

Until recently, SRSs were predominantly employed via handheld devices (e.g., "clickers") that teachers had to set up and distribute to students at the beginning of classes. Students would use the handheld devices to respond to multiple-choice questions displayed via a projector (Kay and LeSage 2009). Free online student response systems (OSRSs) are now available as an alternative to handheld devices. In addition to the financial benefits of OSRSs over clicker systems, these systems can be accessed by students using their own mobile devices, provided that there is Internet access. To access an OSRS, teachers and students simply log on to a website or app on their own devices (e.g., a computer, laptop, tablet, or smartphone) to interact and respond to questions in real time via the Internet.

An example of such an OSRS is Socrative (https://socrative.com/). Socrative enables teachers to create quizzes and other educational exercises that can help guide the focus of a particular lesson as well as generate discussions with students. The software enables teachers to monitor and assess students' responses and progress in real time by giving them immediate feedback. This allows the teacher to identify areas that students are struggling with and adjust the pace or focus of a lesson accordingly. For example, when teaching students how to produce a piece of output for a particular statistical technique, teachers can administer a set of questions via Socrative to gauge their students' confidence in producing the output before moving on to the next activity, or, if the results of the questions suggest it is necessary, dedicate more time to the current activity.

A previous controlled trial using a sample of engineering students reported mixed results for the OSRS Socrative, finding that implementing the OSRS significantly increased student engagement but had no significant effect on student performance (Dabbour 2016). However, much of the previous research investigating OSRSs has been feasibility based. Such studies have reported positive results, with students typically reporting that OSRSs are easy to use and have a positive impact on their learning experience through enhanced levels of engagement and performance (Coca and Slisko 2013; Awedh et al. 2014; Dervan 2014; Mork 2014; Kaya and Balta 2016). Although these results are encouraging, such studies tend to be limited in terms of their generalizability due to the use of post-test, uncontrolled research designs with non-randomized samples.

The use of Socrative has been investigated across a variety of disciplines, including physics (Coca and Slisko 2013), physiology (Rae and O'Malley 2017), science (Wash 2014), sports management (Dervan 2014), computing (Awedh et al. 2014), English language (Kaya and Balta 2016), economics (Piatek 2014), and engineering (Dabbour 2016). Statistics courses are another area that may benefit from Socrative, given its potential positive effect on the student learning experience and considering that course evaluations by students taking statistics units tend to indicate poor engagement (Gladys, Nicholas, and Crispen 2012). To the authors' knowledge, only one study has previously investigated the effect of Socrative specifically for statistics students. Balta and Guvercin (2016) found that the final grades of students enrolled in a statistics class who chose to engage with Socrative-based learning materials prior to their exam were significantly higher than the grades of students who chose not to engage with those materials. Although this result is encouraging, the use of a non-randomized, post-test design means we cannot confirm from this study that Socrative has a beneficial effect, or whether the difference in exam scores was due to the underlying scholastic aptitude or motivation of the students who chose to engage with the OSRS. Hence, there is a need for further research exploring the use of Socrative specifically within statistics classrooms.

It is important that OSRSs are rigorously tested to ensure that they have the intended effect and improve (or at least maintain) the learning experience for students. Hence, there is a need for further, more methodologically rigorous research investigating the benefits of using OSRSs in the classroom. The aim of the current study was to investigate the integration of technology in a higher education setting. Specifically, the study examined the impact of using an OSRS (Socrative) under experimental conditions with a cohort of intermediate undergraduate statistics students. The study used both quantitative and qualitative data to assess the impact of Socrative. First, a quantitative analysis was used to investigate the hypothesis that, when exposed to the OSRS, students would report greater levels of engagement with the curriculum content than when taught without it. In addition, a qualitative thematic analysis was used to further investigate what students did and did not enjoy about using the OSRS.

2 Methodology

The sample comprised on-campus undergraduate students recruited from an intermediate-level statistics unit at a higher education institution located in Melbourne, Australia. The statistics unit is a core unit in the university's psychological sciences courses and introduces students to the fundamental statistical areas of research design and linear models. Specifically, the unit examines how multiple regression and analysis of variance (ANOVA) can be used to analyze data.

The experiment was based on a 2 (Time: Weeks 1–3 vs. Weeks 4–6) by 2 (Pedagogical Approach: standard curriculum vs. standard curriculum plus Socrative) mixed crossover design. Tutorials were conducted on-campus (i.e., in person in computer labs) once a week for the duration of the semester. A typical tutorial involved a tutor conducting a 120-min class with a small group of students. Each of the five tutorial groups (approximately n = 15 per group) was randomly allocated to one of two groups: the intervention group or the control group. The intervention group's tutorial classes utilized Socrative (in addition to the standard curriculum activities) for the first three weeks of the semester (i.e., Weeks 1–3) before engaging with just the standard curriculum (i.e., no Socrative use) for the next three weeks (i.e., Weeks 4–6). The control group's tutorial classes were exposed to Socrative in the opposite order (i.e., these tutorial groups completed the standard curriculum without Socrative in Weeks 1–3 before using Socrative in Weeks 4–6) (see Table 1). Hence, all students were exposed to the standard curriculum for half of the term and to the standard curriculum plus Socrative for the other half. Throughout this time, all students were exposed to the same textbook and learning materials, the same tutor (author three; LT), and the same lecture and tutorial content (see Table 2 for a description of each week's content). The only difference between the groups was the timing of Socrative use in tutorials. An advantage of this type of design is that each participant acts as their own control, thus reducing the influence of confounding covariates.

Table 1 Visual representation of the crossover design.

Table 2 Summary of content covered in tutorial sessions.
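
As an illustration of the allocation step described above, the following is a minimal sketch (with hypothetical tutorial group names and an assumed two/three split across the two crossover orders, which the paper does not specify) of how the five tutorial groups could be randomly assigned:

```python
import random

random.seed(2017)  # fixed seed so the illustration is reproducible

# Five tutorial groups (names are hypothetical), randomly split between
# the two crossover orders summarized in Table 1.
tutorial_groups = ["Tute A", "Tute B", "Tute C", "Tute D", "Tute E"]
intervention = set(random.sample(tutorial_groups, k=2))  # assumed 2/3 split

for group in tutorial_groups:
    if group in intervention:
        order = "Socrative in Weeks 1-3, standard curriculum in Weeks 4-6"
    else:
        order = "standard curriculum in Weeks 1-3, Socrative in Weeks 4-6"
    print(f"{group}: {order}")
```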

When exposed to the standard curriculum, students were presented with a statistical problem at the start of their tutorial sessions. This problem related to the content covered in an earlier lecture. The students would then work alongside their tutor to complete the presented problem. When students were exposed to Socrative, tutorials operated in a similar manner; however, Socrative was used in conjunction with the standard curriculum as a means of evaluating student learning. When students were presented with the initial question to solve, they were also asked to use Socrative to anonymously input their perceived ability to solve that problem (e.g., "Please rate your current knowledge of interpreting correlation output"). Students were instructed to use their computers to log in to Socrative with a "room name" supplied by the tutor via the Student Socrative website (https://b.socrative.com/login/student/). The responses were instantaneously combined and displayed as a bar chart for the entire class to see via a projector (see Figure 1). The session would then proceed as per the standard curriculum, with the students working alongside the tutor to solve the problem. At the end of the tutorial session, the students were asked to complete the same Socrative question from the start of the tutorial, with the anonymous aggregate totals again displayed for the whole class to compare. Figure 1 provides an example of a question given to students during one of the tutorials. In this example, 63% of students reported "Fair" or "Excellent" knowledge of the concept about to be covered in the tutorial, increasing to 94% after the tutorial. Similar results were observed for this type of question in each tutorial in this study. Socrative questions were created prior to each class, and the same questions were used in every class exposed to Socrative during that particular week (see, e.g., the questions in Appendix A).

Fig. 1 Comparing students' perceived knowledge (pre-post) using Socrative.


One week prior to their first tutorial, participants were invited to complete an initial survey measuring academic engagement. Academic engagement was measured using an adapted version of the 9-item Utrecht Work Engagement Scale (UWES-9) (Schaufeli, Bakker, and Salanova 2006). Participants rated each item on a 7-point Likert scale ranging from 1 (Strongly Disagree) to 7 (Strongly Agree). Scores were summed to produce a total engagement score, with higher scores indicating greater academic engagement. The UWES-9 has demonstrated good internal consistency and test-retest reliability (Schaufeli, Bakker, and Salanova 2006; Balducci, Fraccaroli, and Schaufeli 2010) and has been adapted to measure academic engagement in previous studies (Salmela-Aro et al. 2009).
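
As a minimal sketch of this scoring step (assuming hypothetical item columns item1 to item9 in a pandas data frame; the adapted item wordings themselves are not reproduced here):

```python
import pandas as pd

# Hypothetical responses: one row per student, nine UWES-9 items rated on a
# 7-point Likert scale (1 = Strongly Disagree, 7 = Strongly Agree).
items = [f"item{i}" for i in range(1, 10)]
survey = pd.DataFrame(
    [[5, 6, 4, 5, 5, 6, 4, 5, 5],
     [3, 4, 3, 2, 4, 3, 3, 4, 3]],
    columns=items,
)

# Sum the nine items for a total engagement score (possible range 9-63);
# higher totals indicate greater academic engagement.
survey["engagement_total"] = survey[items].sum(axis=1)
print(survey["engagement_total"])
```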

The students then completed three weeks of either the Socrative or non-Socrative tutorial sessions, followed by a post-intervention survey consisting of the same academic engagement measure described above. Participants in the study group were also asked to respond to three open-ended qualitative items on the post-intervention survey: (1) What did you enjoy about using Socrative?; (2) What did you dislike about Socrative?; and (3) If you could change or improve anything about Socrative, what would it be?

All students were required to complete the ethics-approved intervention activities during their respective tutorials; however, participation in the pre-post surveys was voluntary. Any student who did not consent to participate in the research was not required to complete the surveys, and their data were not included in the analysis. In total, eighteen valid survey responses were acquired for analysis. This research was carried out according to protocols approved by the Swinburne University of Technology Human Research Ethics Committee (Project number: 2017/144).

3 Results

3.1 Quantitative Analysis

Descriptive statistics (means and standard deviations) for the individual items can be found in Appendix B. A total score (using listwise deletion to account for missing data) was computed for the two groups; the listwise deletion resulted in n = 5 and n = 8 for the Standard and Socrative groups, respectively. Descriptive statistics comparing the two groups (after listwise deletion) on total academic engagement are shown in Table 3.
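
Continuing the hypothetical pandas layout sketched earlier, listwise deletion simply drops any respondent with a missing item before totals are formed, for example:

```python
import numpy as np
import pandas as pd

items = [f"item{i}" for i in range(1, 10)]
survey = pd.DataFrame(
    [[5, 6, np.nan, 5, 5, 6, 4, 5, 5],   # incomplete respondent: dropped
     [6, 6, 5, 6, 7, 6, 5, 6, 6],        # complete respondent: kept
     [4, 4, 3, 4, 5, 4, 4, 3, 4]],       # complete respondent: kept
    columns=items,
)

# Listwise deletion: retain only respondents who answered every item,
# then compute total academic engagement for those who remain.
complete = survey.dropna(subset=items)
totals = complete[items].sum(axis=1)
print(f"n = {len(totals)} after listwise deletion; totals: {totals.tolist()}")
```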

Table 3 Descriptive statistics for the pre- and post-survey responses for total academic engagement.

For students in the Socrative group, average academic engagement was higher at post-test (M = 43.75, SD = 6.92) than at pretest (M = 40.38, SD = 10.93). However, the results of a paired samples t-test revealed that this difference, –3.38, BCa 95% CI [–6.25, –0.38], was not significant, t(7) = 1.52, p = 0.17. Given this group's relatively small sample size (n = 8 after removing non-responders listwise), a non-parametric method (the Wilcoxon signed rank test) was also used to check consistency. In line with the previous analysis, academic engagement did not differ significantly between the pre- (Mdn = 43) and post- (Mdn = 45) tests, T = 27.50, p = 0.18.
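
For readers wishing to reproduce this style of analysis, the following sketch uses made-up pre/post totals (not the study's raw data) and assumes SciPy 1.7 or later for the BCa bootstrap:

```python
import numpy as np
from scipy import stats

# Illustrative pre/post engagement totals for eight students (made-up values).
pre = np.array([40, 28, 45, 50, 33, 43, 44, 40])
post = np.array([43, 38, 46, 52, 35, 45, 41, 44])

# Paired samples t-test on the pre/post scores.
t_res = stats.ttest_rel(pre, post)
print(f"t({len(pre) - 1}) = {t_res.statistic:.2f}, p = {t_res.pvalue:.2f}")

# BCa bootstrap confidence interval for the mean pre-minus-post difference.
diffs = pre - post
boot = stats.bootstrap((diffs,), np.mean, confidence_level=0.95,
                       method="BCa", random_state=np.random.default_rng(1))
ci = boot.confidence_interval
print(f"mean difference = {diffs.mean():.2f}, "
      f"BCa 95% CI [{ci.low:.2f}, {ci.high:.2f}]")

# Non-parametric check: Wilcoxon signed rank test on the same pairs.
w_res = stats.wilcoxon(pre, post)
print(f"T = {w_res.statistic:.2f}, p = {w_res.pvalue:.2f}")
```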

Academic engagement for students in the non-Socrative group, on average, decreased from pretest (M = 42.60, SD = 7.44) to post-test (M = 41.40, SD = 7.13); however, this difference (1.20, BCa 95% CI [–5.20, 5.40]) was not significant, t(4) = 0.41, p = 0.71. As for the previous group, a Wilcoxon signed rank test was conducted on these data. The results revealed no significant difference in academic engagement between the pre- (Mdn = 40) and post- (Mdn = 41) tests, T = 5, p = 0.50.

Finally, a Mann–Whitney U test was conducted to compare the two groups on the change in academic engagement (computed as the difference between pre and post measurements). The results revealed that the change in average academic engagement did not differ significantly between the two groups, U = 11.50, p = 0.21, BCa 95% CI [–10.89, 3.13].
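
The corresponding between-group comparison could be sketched as follows (again with illustrative change scores rather than the study data):

```python
import numpy as np
from scipy import stats

# Illustrative post-minus-pre change scores for each group (made-up values).
change_socrative = np.array([3, 10, 1, 2, 2, 2, -3, 4])  # n = 8
change_standard = np.array([-2, 1, -4, 3, -4])           # n = 5

# Mann-Whitney U test comparing the change in engagement between groups.
u_res = stats.mannwhitneyu(change_socrative, change_standard,
                           alternative="two-sided")
print(f"U = {u_res.statistic:.2f}, p = {u_res.pvalue:.2f}")
```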

3.2 Qualitative Analysis

The qualitative responses were analyzed using an inductive thematic analysis method. The analysis followed the six-step method described by Braun and Clarke (2006): (1) data familiarization; (2) initial code generation; (3) theme searching; (4) theme revision; (5) theme definition and naming; and (6) reporting. A number of themes emerged from summarizing the qualitative data from each of the three open-ended questions included in the post-intervention survey. Each theme is discussed below with direct quotes from participant responses.

3.2.1 Item 1: What Did You Enjoy About Using Socrative?

Three major themes emerged from summarizing the qualitative data for item 1 via the thematic analysis: (1) novelty, (2) evaluation of competence, and (3) fun.

3.2.1.1. Novelty

Students indicated that using Socrative provided a unique classroom experience. A number of students commented that they found the novelty of Socrative to be interesting, for example: "It was interesting seeing if I understood the content being discussed."

3.2.1.2. Evaluation of Competence

Another theme that was identified was the concept of evaluation of competence in comparison with others. Students expressed that they found it useful to see how others felt about the content being taught in the tutorials. For instance:

“I liked seeing how my feelings about the work compared with other students.”

“I liked it when everyone participated and the poll style questions showed the responses of the whole class. It was nice to see that other people had the same response as me.”

3.2.1.3. Fun

Many students expressed that they enjoyed using Socrative because it was more fun compared to their regular classes. For example:

“It was a fun way the lecturer used to engage with the whole class.”

“It was different to just sitting there and taking down notes—we got to interact.”

3.2.2 Item 2: What Did You Dislike About Socrative?

Two themes emerged from summarizing the qualitative data for item 2 via the thematic analysis: (1) relevancy and (2) none.

3.2.2.1. Relevancy

This theme revealed that some students believed the Socrative questions to be irrelevant to their learning experience. For example, one student commented that “Sometimes it felt like a bit of a waste of time,” while another said that it “Felt like time could have been better spent on something else.”

3.2.2.2. None

One third of respondents (33.3%) commented that there was nothing they disliked about using Socrative.

3.2.3 Item 3: If You Could Change or Improve Anything About Socrative What Would It Be?

The third item asked students to provide suggestions for how Socrative could be improved. Three themes emerged from the responses: (1) revision, (2) access, and (3) instructions.

3.2.3.1. Revision

Some students suggested that Socrative should be used as a revision tool to help revise the material taught in lectures and tutorials. One commented that the program "can be used to quickly review the material learned in previous lessons," while another suggested to "definitely use it as a revision tool…perhaps in tutorials. The tutor can ask a question and then everyone can give in their answers."

3.2.3.2. Access

Other students indicated that they would have preferred to use the program via the app on their smartphones rather than via the web-browser on computers. For example, one student commented that “It would have been nicer if it was in an app,” while another suggested to “Maybe try and turn it into an app rather than using [a] web browser.”

3.2.3.3. Instructions

Finally, two students indicated that more specific instructions were required. These responses did not make it clear whether the students were unable to navigate Socrative or whether they needed better instructions for solving the statistics problems they were presented with.

4 Discussion

The use of OSRSs is increasing within tertiary education providers, with practice typically leading research on these tools. To bridge the gap between research and practice concerning the integration of technology in higher education, this study investigated the use of an OSRS (Socrative). The quantitative results provided insufficient evidence for an increase in academic engagement scores after exposure to Socrative. However, the qualitative feedback received from students on the use of Socrative was positive.

The themes that emerged in response to the open-ended questions suggest that Socrative may be an appropriate tool for integrating technology-based learning strategies that satisfy student requests for more interactive lessons (Dahlstrom and Bichsel 2014; Brooks 2016). Overall, students appreciated the introduction of novel active learning strategies into tutorial classes and perceived Socrative as having a positive impact on engagement by facilitating a fun and unique learning experience. While some in higher education are quick to dismiss such affective reactions to novel teaching approaches as mere fun, it should be remembered that getting students enthused about learning statistics is a significant barrier that statistics educators regularly face. Furthermore, given the challenges of fostering engagement in statistics classes, where students typically hold negative attitudes toward the discipline and perceive the subject as an obstacle to attaining their degree (Perney and Ravid 1991; Gal and Ginsburg 1994), these results suggest that Socrative has the potential to overcome issues specific to statistics units. This is encouraging news for educators looking for novel ways to engage their students while delivering statistics-based content.

The qualitative findings also indicate that Socrative provides a welcome opportunity for students to assess their perceived competence with course content learnt during class, and suggest that students appreciate being able to gauge their level of comprehension in comparison to other students. This feedback was also beneficial for the teacher running the tutorial. The feedback provided by questions similar to the example shown in Figure 1 assisted the teacher in implementing Just-in-Time Teaching (JiTT) by fine-tuning the tutorial activities to better meet the students' needs. Using Socrative in this way has the potential to increase the effectiveness of learning during tutorials.

This suggests that Socrative may be a worthwhile tool, particularly for statistics educators to consider using in online courses, where students are often isolated from one another and unable to make comparisons between themselves and their peers. For example, online courses with a synchronous component (e.g., a weekly "live" online lecture) might use Socrative to run polls or quizzes, much as we did in our on-campus classes, to overcome isolation. Furthermore, the sense of anonymity afforded by SRSs (see Freeman, Blayney, and Ginns 2006), which also applies to OSRSs such as Socrative, may make students more willing to express their opinions and participate in online class discussions. The novelty factor reported by students in the current study may also hold for online students, which could further help with engagement in online classes. Future research should consider testing OSRSs like Socrative in online learning environments to investigate whether these tools are beneficial for online students and educators.

Students also reported that Socrative was easy to use. Given that modern students find browsing and navigating web pages to be a relatively simple task, it was to be expected that no major complications would arise from using a browser-based program. However, some students indicated that they would have preferred to engage in Socrative activities via an app rather than the web browser. Socrative does offer an app that students can use in the same way as the web browser. As such, it is suggested that educators consider enabling students to engage with Socrative via either the web browser or the app.

The results from the qualitative analysis were not all positive, however, with some students questioning the relevance of the Socrative-based activities. This is a common critique noted in previous research investigating student response systems (see Aljaloud et al. 2015). It is a concern because students who find an activity irrelevant may quickly lose interest in the task, and their attitudes toward the subject may become negative (Osborne, Simon, and Collins 2003). However, this problem is not unique to SRSs; students in general desire authentic learning activities that are applicable to the real world (Maina 2004).

Students did offer suggestions for how Socrative could be better used in the classroom, suggesting that it be used as a revision tool. Previous research has indicated that using Socrative as a revision tool has the potential to help students revise and achieve higher exam marks (Balta and Guvercin 2016). Hence, in addition to using Socrative to assess student mastery of the content, educators should consider creating multiple-choice quizzes via Socrative that assess student knowledge relevant to each lesson in preparation for upcoming assessment tasks.

The present study was not without its limitations. First, due to the small sample size, the inferential statistics should be interpreted with caution. Despite the researchers' best efforts to maximize participation (e.g., discussion board posts, e-mails, lecture visits), students did not appear interested in participating in the research itself (i.e., completing the questionnaires), hence the small sample size. This is a common limitation among studies investigating OSRSs (e.g., Coca and Slisko 2013; Awedh et al. 2014; Dervan 2014; Mork 2014; Balta and Guvercin 2016; Dabbour 2016; Kaya and Balta 2016).

A further limitation was the timing of the research. Students were only exposed to Socrative for three weeks of tutorial classes, which may not have been long enough for the OSRS to have an observable impact on engagement. Furthermore, the pre-post surveys were administered within the first month of the teaching period, when engagement levels are typically at their highest (Stewart, Stott, and Nuttall 2011). Further study is needed that investigates the use of Socrative across multiple academic units (for a longer duration and across multiple time points) using large, heterogeneous samples, in order to confirm a beneficial effect of Socrative and to generalize the results to the wider tertiary education sector. In addition to academic engagement, future research should look to quantify the effect of using OSRSs on student learning and course satisfaction.

By taking advantage of new educational technology, teachers can create a more active learning environment that offers students an enhanced learning experience (D'Inverno, Davis, and White 2003; Poirier and Feldman 2007; Blasco-Arcas et al. 2013). Hence, investigating and employing technology-based strategies that promote student engagement is an important initiative, especially for units like statistics that can present educators with difficulties in engaging students in the classroom. Taken together, the quantitative and qualitative results from the current study outline a clear path for future OSRS use in teaching and for research on the effective use of these tools. It is hoped that the results from this study will encourage educators seeking to foster greater student engagement to consider incorporating OSRSs, like Socrative, into their teaching practices. However, before this technology can be widely adopted, further research is needed to confirm the presence and magnitude of the effect of OSRSs on the student learning experience. Further research is also needed to determine whether, and to what degree, any improvements in engagement affect students' performance as well as their attitudes toward and interest in statistics and related career pathways.

Ethical Approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

References

  • Aljaloud, A., Gromik, N., Billingsley, W., and Kwan, P. (2015), “Research Trends in Student Response Systems: A Literature Review,” International Journal of Learning Technology, 10, 313–325. DOI: 10.1504/IJLT.2015.074073.
  • Awedh, M., Mueen, A., Zafar, B., and Manzoor, U. (2014), “Using Socrative and Smartphones for the Support of Collaborative Learning,” International Journal on Integrating Technology in Education, 3, 17–24, DOI: 10.5121/ijite.2014.3402.
  • Axelson, R., and Flick, A. (2010), “Defining Student Engagement,” Change: The Magazine of Higher Learning, 43, 38–43. DOI: 10.1080/00091383.2011.533096.
  • Balducci, C., Fraccaroli, F., and Schaufeli, W. B. (2010), “Psychometric Properties of the Italian Version of the Utrecht Work Engagement Scale (UWES-9): A Cross-Cultural Analysis,” European Journal of Psychological Assessment, 26, 143–149, DOI: 10.1027/1015-5759/a000020.
  • Balta, N., and Guvercin, S. (2016), “Increasing Undergraduate Students’ Exam Performances in Statistics Course Using Software Socrative,” The Turkish Journal of Educational Technology, 2016, 314–321.
  • Berger, J. B., and Milem, J. F. (1999), “The Role of Student Involvement and Perceptions of Integration in a Causal Model of Student Persistence,” Research in Higher Education, 40, 641–664, DOI: 10.1023/A:1018708813711.
  • Blasco-Arcas, L., Buil, I., Hernández-Ortega, B., and Sese, F. J. (2013), “Using Clickers in Class. The Role of Interactivity, Active Collaborative Learning and Engagement in Learning Performance,” Computers & Education, 62, 102–110, DOI: 10.1016/j.compedu.2012.10.019.
  • Braun, V., and Clarke, V. (2006), “Using Thematic Analysis in Psychology,” Qualitative Research in Psychology, 3, 77–101. DOI: 10.1191/1478088706qp063oa.
  • Brooks, D. C. (2016), “ECAR Study of Undergraduate Students and Information Technology, 2016,” Research Report, ECAR, Louisville, CO.
  • Caldwell, J. E. (2007), “Clickers in the Large Classroom: Current Research and Best-Practice Tips,” CBE—Life Sciences Education, 6, 9–20, DOI: 10.1187/cbe.06.
  • Carini, R. M., Kuh, G. D., and Klein, S. P. (2006), “Student Engagement and Student Learning: Testing the Linkages,” Research in Higher Education, 47, 1–32, DOI: 10.1007/s11162-005-8150-9.
  • Coca, D. M., and Slisko, J. (2013), “Software Socrative and Smartphones as Tools for Implementation of Basic Processes of Active Physics Learning in Classroom: An Initial Feasibility Study With Prospective Teachers,” European Journal of Physics Education, 4, 17–24.
  • D’Inverno, R. A., Davis, H. C., and White, S. (2003), “Using a Personal Response System for Promoting Student Interaction,” Teaching Mathematics and Its Application, 22, 163–169, DOI: 10.1093/teamat/22.4.163.
  • Dabbour, E. (2016), “Quantifying the Effects of Using Online Student Response Systems in an Engineering Ethics Course,” Journal of Professional Issues in Engineering Education and Practice, 142, 04015010, DOI: 10.1061/(ASCE)EI.1943-5541.0000260.
  • Dahlstrom, E., and Bichsel, J. (2014), “ECAR Study of Undergraduate Students and Information Technology, 2014,” Research Report, ECAR, Louisville, CO.
  • Dervan, P. (2014), “Increasing In-Class Student Engagement Using Socrative (an Online Student Response System),” The All Ireland Journal of Teaching & Learning in Higher Education, 6, 1801–1813.
  • Fies, C., and Marshall, J. (2006), “Classroom Response Systems: A Review of the Literature,” Journal of Science Education and Technology, 15, 101–109. DOI: 10.1007/s10956-006-0360-1.
  • Freeman, M., Blayney, P., and Ginns, P. (2006), “Anonymity and in Class Learning: The Case for Electronic Response Systems,” Australasian Journal of Educational Technology, 22, 568–580. DOI: 10.14742/ajet.1286.
  • Gal, I., and Ginsburg, L. (1994), “The Role of Beliefs and Attitudes in Learning Statistics: Towards an Assessment Framework,” Journal of Statistics Education, 2, 1–15. DOI: 10.1080/10691898.1994.11910471.
  • Gladys, S., Nicholas, Z., and Crispen, B. (2012), “Undergraduate Students’ Views on Their Learning of Research Methods and Statistics (RMS) Course: Challenges and Alternative Strategies,” International Journal of Social Science Tomorrow, 1, 1–9.
  • Hall, R. H., Thomas, M. L., Collier, H. L., and Hilgers, M. G. (2005), “A Student Response System for Increasing Engagement, Motivation, and Learning in High Enrollment Lectures,” in AMCIS 2005 Proceedings, p. 255.
  • Kaleta, R., and Joosten, T. (2007), “Student Response Systems: A University of Wisconsin System Study of Clickers,” Educause Center for Applied Research Bulletin, 2007, 1–11.
  • Kay, R. H., and LeSage, A. (2009), “Examining the Benefits and Challenges of Using Audience Response Systems: A Review of the Literature,” Computers & Education, 53, 819–827. DOI: 10.1016/j.compedu.2009.05.001.
  • Kaya, A., and Balta, N. (2016), “Taking Advantage of Technologies: Using the Socrative in English Teaching Classes,” International Journal of Social Sciences & Educational Studies, 2, 4–12.
  • Kuh, G. D., Cruce, T. M., Shoup, R., Kinzie, J., and Gonyea, R. M. (2008), “Unmasking the Effects of Student Engagement on First-Year College Grades and Persistence,” The Journal of Higher Education, 79, 540–563. DOI: 10.1080/00221546.2008.11772116.
  • Maina, F. W. (2004), “Authentic Learning: Perspectives From Contemporary Educators,” Journal of Authentic Learning, 1, 1–8.
  • Mork, C.-M. (2014), “Benefits of Using Online Student Response Systems in Japanese EFL Classrooms,” JALT CALL Journal, 10, 127–137.
  • Osborne, J., Simon, S., and Collins, S. (2003), “Attitudes Towards Science: A Review of the Literature and Its Implications,” International Journal of Science Education, 25, 1049–1079, DOI: 10.1080/0950069032000032199.
  • Perney, J., and Ravid, R. (1991). “The Relationship Between Attitudes Towards Statistics, Math Self-Efficacy Concept, Test Anxiety and Graduate Students’ Achievement in an Introductory Statistics Course,” unpublished manuscript, National College of Education, Evanston, IL.
  • Piatek, R. (2014), “Student Response System: Student Activation Towards Better Learning in Large Classes. A Practical Guide,” available at https://samf.ku.dk/pcs/english/forteachers/tlhe/projects/Remi_Piatek_TLHE_Project.pdf.
  • Poirier, C. R., and Feldman, R. S. (2007), “Promoting Active Learning Using Individual Response Technology in Large Introductory Psychology Classes,” Teaching of Psychology, 34, 194–196. DOI: 10.1080/00986280701498665.
  • Rae, M. G., and O’Malley, D. (2017), “Using an Online Student Response System, Socrative, to Facilitate Active Learning of Physiology by First Year Graduate Entry to Medicine Students: A Feasibility Study,” MedEdPublish, 6, 1–17. DOI: 10.15694/mep.2017.000004.
  • Salmela-Aro, K., Kiuru, N., Leskinen, E., and Nurmi, J. E. (2009), “School Burnout Inventory (SBI) Reliability and Validity,” European Journal of Psychological Assessment, 25, 48–57, DOI: 10.1027/1015-5759.25.1.48.
  • Schaufeli, W. B., Bakker, A. B., and Salanova, M. (2006), “The Measurement of Work Engagement With a Short Questionnaire: A Cross-National Study,” Educational and Psychological Measurement, 66, 701–716, DOI: 10.1177/0013164405282471.
  • Stewart, M., Stott, T., and Nuttall, A.-M. (2011), “Student Engagement Patterns Over the Duration of Level 1 and Level 3 Geography Modules: Influences on Student Attendance, Performance and Use of Online Resources,” Journal of Geography in Higher Education, 35, 47–65. DOI: 10.1080/03098265.2010.498880.
  • Swan, K. (2001), “Virtual Interaction: Design Factors Affecting Student Satisfaction and Perceived Learning in Asynchronous Online Course,” Distance Education, 22, 306–331. DOI: 10.1080/0158791010220208.
  • Trees, A. R., and Jackson, M. H. (2007), “The Learning Environment in Clicker Classrooms: Student Processes of Learning and Involvement in Large University-Level Courses Using Student Response Systems,” Learning, Media and Technology, 32, 21–40, DOI: 10.1080/17439880601141179.
  • Wash, P. D. (2014), “Taking Advantage of Mobile Devices: Using Socrative in the Classroom,” Journal of Teaching and Learning with Technology, 3, 99–101, DOI: 10.14434/jotlt.v3n1.5016.
  • Wefald, A. J., and Downey, R. G. (2009), “Construct Dimensionality of Engagement and Its Relation With Satisfaction,” The Journal of Psychology, 143, 91–112. DOI: 10.3200/JRLP.143.1.91-112.

Appendix A:

List of Example Questions Given to Students via Socrative

What is the strength and direction of the correlation?

Is multiple R significant?

Interpret the partial regression coefficient

How confident are you using SPSS to produce regression output?

Appendix B:

Descriptive Statistics for the Pre- and Post-Survey Responses for Individual Academic Engagement Items