Categories
Delivery and assessment of the curriculum Digital technologies to enhance learning and teaching and assessment. Enabling student development and achievement Online learning qualitative quantitative Student engagement in learning Student experience Technology and digital literacies

Distance Learners’ Use of Handheld Technologies

Blog Authors: Walter Patterson; Lynn Brown; Jan Robertson; Joe Wilson; Tracey Howe

Image: Matthew Hurst from New York, NY, USA / CC BY-SA (https://creativecommons.org/licenses/by-sa/2.0)

One only has to walk through the College, down any street, or into any social space to notice that most people are engrossed in some activity on the screen of a handheld device. How can we best utilise this technology for learners at our College? We explored the findings of a recently published paper, Distance Learners’ Use of Handheld Technologies: Mobile Learning Activity, Changing Study Habits, and the ‘Place’ of Anywhere Learning.

Here’s what they did

Undergraduate students enrolled at the UK’s largest distance learning university were surveyed. This included questions about: (a) ownership of technologies; (b) frequency of use of handheld devices (tablet, e-readers, and smartphones) for specified leisure activities and for specified learning activities; (c) locations at which each device is used for study purposes; (d) perceived change in study habits; (e) statements about impact of use on learning; (f) reason for purchase; (g) length of time used; (h) benefits and challenges; and (i) preferences for future use of each technology for learning. Open comment questions were added to probe the types of learning used in distance learning contexts, reasons for use or non-use, and the locations of use. Students were asked separately about their use of tablets, smartphones, and e-readers so potential differences in use could be analysed.

There were 446 responses from 3,000 students, giving a response rate of 14.9%. All age groups, study levels, and disciplines were represented. A wide range of analytical methods was used to analyse the data.

Here’s what they found

Five key findings are:

  1. most students now use handheld devices for study-related learning;
  2. the distribution of study-related learning tasks was similar in all seven study places;
  3. there is a strong, statistically-significant correlation between the number of study places in which handheld devices are used and the number of study task types performed;
  4. two fifths of students using a handheld device for learning have noticed a change in study habits and a benefit to their learning;
  5. multiple regression analysis shows that three variables (number of study places, number of study tasks, and change in study habits) predict both easier access to learning materials and improved quality of learners’ work.
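The regression finding in point 5 can be illustrated with a short sketch. The data below are simulated and the variable names are our own shorthand, not the paper’s; the sketch simply shows how an outcome such as ease of access to learning materials can be regressed on the three predictor variables using ordinary least squares.

```python
import numpy as np

# Hypothetical survey records, one row per respondent. Column names are
# assumptions for illustration -- they are not taken from the paper.
rng = np.random.default_rng(0)
n = 200
places = rng.integers(1, 8, n)        # number of study places (up to seven)
tasks = rng.integers(1, 10, n)        # number of study task types
habit_change = rng.integers(0, 2, n)  # reported change in study habits (0/1)

# Simulated outcome: ease of accessing learning materials (Likert-style score).
ease = 1.0 + 0.4 * places + 0.3 * tasks + 0.8 * habit_change + rng.normal(0, 1, n)

# Multiple regression via ordinary least squares:
# ease ~ intercept + places + tasks + habit_change
X = np.column_stack([np.ones(n), places, tasks, habit_change])
coef, *_ = np.linalg.lstsq(X, ease, rcond=None)
print(dict(zip(["intercept", "places", "tasks", "habit_change"], coef.round(2))))
```

With enough respondents, the fitted coefficients recover the simulated effects, which is the sense in which the three variables act as “predictors” of the outcome.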

The author/s concluded

The study concludes by proposing two new concepts: the flow of places and place of space. These should help direct the framing of future studies into the places, spaces, and mobility of formal and informal seamless learning.

Our Journal Club’s views

Who are the authors of the paper and where do they work? Walter and Joe declared an interest as in the past they had both worked with Prof Sharples (author of over 300 papers and founder of the Association for Mobile Learning).

What do we know about the journal? The International Review of Research in Open and Distributed Learning (IRRODL) is a refereed, open access e-journal that disseminates original research, theory, and best practice in open and distributed learning worldwide. Club members judged it to be very reputable, with all articles subjected to double-blind peer review by a minimum of two subject experts. According to Google Scholar, in 2019 the journal ranked third among educational technology journals and fifth among all education journals.

What about the methodology used? The abstract was viewed by some as concise and accurate, but others thought that it presented an overcomplicated approach to some straightforward research questions. The abstract also lacked an explanation of some important concepts in the paper (e.g. flow of places).

Some felt that the readability of the paper was reduced by its dense page layout. It was also noted that there appeared to be a predisposition towards a particular framework, which may have influenced the approach taken and the analysis.

The introduction is rather lengthy and would be improved by more explicit subheadings. It also introduces complex ideas, some of which are not fully addressed in this paper. It was also unusual for the research questions to appear in the middle of the introduction rather than at the end.

It was also noted that the population for the study is very niche: distance learners, who are highly likely also to be mobile learners. This would not map well onto the type of students who attend CoGC.

The careful definition of what counts as a handheld device was appreciated, and the context and background of the study were well explained.

There was an extended discussion of the response rate (14.9%). A danger to the study is that respondents were self-selecting and may have held definite views one way or the other on mobile learning. There was concern that we knew nothing about the 85% who did not respond (e.g. their traits and attributes).

The over-representation of older students was noted, even though the initial sample was stratified according to key factors (good). The survey design itself was well developed and demonstrated best practice. The choice of categories for place seemed reasonable but had been piloted with only six people. The inclusion of PCs and laptops was welcomed, as it offered a good comparison between truly mobile devices and others, as was the recognition in the survey design that mobiles could be used for other (distracting) activities as well as study-related ones.

No qualitative results were presented (these are to be published separately), which was somewhat disappointing. The presentation of the results could have been more informative had actual sample numbers been included rather than just percentages. The tables and charts allowed a clear understanding of the study outcomes, and the statistical evidence was well presented.

There are strong connections between the number of different study places where mobile devices are used and the number of different tasks for which they are used. The distribution of study-related tasks was fairly even across all the different study places. Some respondents (40%) noted a change in study habits and an improvement to their learning through the use of mobile devices.

Our conclusions are – that this evidence has a low risk of bias.

Implications for our practice

A study of learning designs across the college revealed a predominance of content delivery. There was scope to implement better quality learning designs that drew on varied tasks, many of which could be supported by mobile learning (ABC of Learning Design). The college could move in this direction so that students were encouraged more to participate in mobile learning.

It was commented that changing the tone of voice in some module content (eg personalised voice) had proved to better engage learners – so this could also be incorporated.

Next steps

It was intimated that one response to COVID-19 could be to move more learning to mobile and that staff would be given help to do this in the coming weeks.

View from

What do you think?

References

Cross, S., Sharples, M., Healing, G., & Ellis, J. (2019). Distance Learners’ Use of Handheld Technologies: Mobile Learning Activity, Changing Study Habits, and the ‘Place’ of Anywhere Learning. The International Review of Research in Open and Distributed Learning, 20(2). https://doi.org/10.19173/irrodl.v20i2.4040

Keywords: mobile learning, seamless learning, study space, handheld learning technologies, anywhere learning, distance education.

Our Blog Posts are written by staff at City of Glasgow College to inform and inspire our practice. We meet together at the Journal Club to consider the latest evidence to provide insights on hot topics related to learning and teaching, quality assurance and subject needs. It forms part of our activity for General Teaching Council Scotland registration and Professional Standards for lecturers in Scotland’s Colleges demonstrating that we are a self-critical staff community.

Categories
Assessment and feedback Delivery and assessment of the curriculum Design, development and approval of programmes Digital technologies to enhance learning and teaching and assessment. Enabling student development and achievement qualitative Student engagement in learning Student experience Technology and digital literacies

Experiences of reflection and digital storytelling

Blog Authors: David Cullen, Walter Patterson, John McVeigh, Lynn Brown, Tracey Howe, Lisa Shields

Image: Dave Morris from Oxford, UK / CC BY (https://creativecommons.org/licenses/by/2.0)

City of Glasgow College has many interests, including ESOL teacher development, reflective practice in educators, and the use of technology in assessment evidence. This week’s paper, ESOL pre-service teachers’ experiences and learning in completing a reflection paper and digital storytelling, was chosen for review as it ostensibly covers a number of these topics.

Here’s what they did

The subjects of the qualitative study were 20 students on a post-graduate level Teaching English for Speakers of Other Languages (TESOL) course. In a module on Language and Culture, the students had to complete two assessment tasks: a written assignment and a digital storytelling artefact. The researcher then had the students carry out two further tasks for the purposes of the study: writing a reflection paper and delivering a reflective presentation. The researcher analysed all four sources of data and sought to examine the subjects’ performative approaches to the tasks, and their reactions to the tasks. The researcher also considered the gender and nationality of the subjects in relation to their performance and response.

Here’s what they found

  • there were general commonalities in the subjects’ performance of, and reaction to, the two sets of tasks (assessment and reflection).
  • there was a significant difference in subjects’ responses between the familiar written report and the unfamiliar digital storytelling task.

The author/s concluded

The inclusion of a dual reflective task was of benefit to pre-service TESOL candidates as it enhanced their reflective literacy and their understanding of the course content on Language and Culture. TESOL training courses should consider using this approach.

Our Journal Club’s views

Who are the authors of the paper and where do they work? This study was undertaken by an individual researcher, an Associate Professor of English (TESOL) at Murray State University in the United States. Regarding the individual author, it was not possible to find a list of publications or citations. We also noted that only one other individual contributed to the research activities.

What do we know about the journal? The Australasian Journal of Educational Technology is a bi-monthly peer-reviewed academic journal covering research in educational technology, instructional design, online and e-learning, educational design, multimedia, computer assisted learning, and related areas. It was rated in 2015 as having an Impact factor of 1.171. It is published by the Australasian Society for Computers in Learning in Tertiary Education. Our view is that this is a trustworthy publication.

What about the methodology used? We felt that the literature review was almost completely descriptive, serving only to provide definitions of terminology, and failed to critically evaluate the sources.

Secondly, the researchers failed to identify and declare any potential bias and limitations of their activity. Thirdly, we felt that the writing of the article, while being thorough and detailed in parts, lacked clarity, and was consequently difficult to decode and interpret.

Finally, there was significant and undeclared potential for bias: the researcher was also the course tutor; the students as subjects were potentially eager to teacher-please in their responses; only one other individual was involved in supporting the researcher and that person was also a direct colleague.

Given the high probability of bias and the other concerns outlined above, we have limited confidence in this article and we feel that the exploratory project would have been better served had it been presented as a less formal case study account.

Our conclusions are – that this evidence has a high risk of bias.

Implications for our practice

How can CoGC staff develop “reflection” in our working practice and professional development?

Is this concept included in Staff Integration activities?

Is digital storytelling something we can use in staff development and/or student work?

We have recently had a tussle with a local university over the accreditation of a vocational award. The university insisted that reflection should be assessed via an essay. After several rounds of negotiation, the university has accepted that there are equally valid representations of reflective practice – such as digital storytelling (videos, blogs, e-portfolios). We have been assured on many occasions by the Scottish Qualifications Authority (SQA) that it has moved away from specifying the form in which evidence can be presented – if only all External Validators were of the same mind!

Next steps

The topic of reflective practice should be considered for inclusion in the OneCity event in June and the Education Symposium after summer.

View from

What do you think?

References

Park, H.-R. (2019). ESOL pre-service teachers’ experiences and learning in completing a reflection paper and digital storytelling. Australasian Journal of Educational Technology, 35(4).

SQA 2017 Digital Evidence for Internally Assessed HN and VQ Units: Principles and Guidance

Keywords:

Categories
Delivery and assessment of the curriculum Digital technologies to enhance learning and teaching and assessment. Enabling student development and achievement Online learning qualitative quantitative Student engagement in learning Student experience Technology and digital literacies

Students’ insights on the use of video lectures in online classes

Blog Authors: Walter Patterson, John McVeigh, Jan Robertson, Joe Wilson, Tony Adams, Tracey Howe

Image: Iase.bodh / CC BY-SA (https://creativecommons.org/licenses/by-sa/4.0)

This week our College closed in response to the Covid-19 epidemic sweeping across the UK. We selected this week’s paper, Students’ insights on the use of video lectures (VL) in online classes, to help us explore options for remote learning for our students. The overarching question motivating this research focused on students’ perception of their own learning in courses using VL.

Here’s what they did

This was a mixed-methods study using surveys and a focus group for data collection, combined with a review of previous research on the topic. Selection criteria for participants included: (1) graduate and undergraduate students enrolled in online courses for Business and Education majors in the 2014–15 academic year; (2) online courses that included VL of any type; and (3) instructors’ approval to explore the design of the online courses. Of 493 eligible students (424 undergraduate and 69 graduate), 96 were recruited: 86 undergraduate and 10 graduate.

The online survey consisted of 18 questions focused on five main categories, including: overall experience as an online student, interaction with VL, perceived learning impact, and integration of VL with other course activities. The focus group was administered via a web conferencing system.

Data from the focus group were analysed qualitatively only; basic descriptive statistics and graphical analyses were performed on the quantitative data.

Here’s what they found

Three factors predict students’ satisfaction and their perception of the relevance of VL to their own learning:

  • familiarity with the media,
  • the extent of their experience using video in learning, and
  • educational level or academic status.

The author/s concluded

This study suggests that courses in higher education should consider the inclusion of VL in their course materials because the use of video meets different learners’ preferences, increases students’ engagement with content, enhances students’ perception of better learning experiences through content interaction, and reinforces teaching presence in online courses.

Our Journal Club’s views

Who are the authors of the paper and where do they work? Dr Norma I. Scagnoli is the senior director of eLearning in the College of Business and holds a position of Research Associate Professor in the College of Education at the University of Illinois. The other authors are also based there.

What do we know about the journal? BJET has just published its 50th anniversary edition. The journal is published by Wiley on behalf of the British Educational Research Association (BERA). Impact factor: 2.588 (2018). ISI Journal Citation Reports ranking (2018): 31/243 (Education & Educational Research).

BJET is a long-standing journal, but it is held to be academic in its focus, and its treatment of EdTech is quite different from current ALT publications, which tend to focus on ‘real world’ matters. BJET articles were characterised as abstract, nuanced, and distanced from real-world practice.

What about the methodology used?

The abstract did not offer much explanation of the journal article – it was more of a teaser to read further. At first sight the ‘practitioner notes’ (included in a box) appeared to give a clear explanation of the context, methods, and outcomes of the study, but this was revisited later.

The research questions were easy to find and the methodology was deemed to be appropriate (mixed methods using surveys, focus groups). The inclusion of the survey questions and the focus group questions in the appendix was appreciated.

However, a significant weakness in this paper is that it provides no indication of the context of the undergraduate or postgraduate students who participated. Participants were clearly self-selecting from the purposeful sample, with a high risk of bias to the qualitative data. The fact that only 10 graduate students participated made some of the conclusions and analysis suspect. Also, the inclusion of graduate students made the analysis and results more complex than was required for this study.

It would have been good to have some demographics of the participants in terms of age, experience, and access to technology, so that comparisons could be made with the student cohort at our College.

Nor was it possible to make sense of the data in the light of the particular population sampled. It was agreed that it would be unsafe to make these conclusions generalisable. It was also noted that the use of a particular online tool (Zoom) for the focus group could have excluded some from participating.

The statistical analysis was overcomplicated: a number of different tests had been carried out on the data to uncover where some statistical significance might arise, rather than proposing a hypothesis and using the data to confirm or reject it (which is how statistical analysis should be performed).
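As a sketch of the approach the club favoured, the following tests one pre-specified hypothesis (a difference in mean satisfaction between the 86 undergraduate and 10 graduate respondents) with a simple permutation test, rather than running many exploratory tests. The scores are invented for illustration, and, as the club noted, with only 10 graduate participants even this single test has little power.

```python
import numpy as np

# Hypothetical satisfaction scores (1-5) for the two groups -- illustrative
# data only, not taken from the paper.
rng = np.random.default_rng(1)
undergrad = rng.integers(1, 6, 86).astype(float)
grad = rng.integers(1, 6, 10).astype(float)

# Pre-specified hypothesis: mean satisfaction differs between the groups.
observed = undergrad.mean() - grad.mean()

# Permutation test: repeatedly shuffle group labels and recompute the statistic.
pooled = np.concatenate([undergrad, grad])
n_perm = 10_000
diffs = np.empty(n_perm)
for i in range(n_perm):
    rng.shuffle(pooled)
    diffs[i] = pooled[:86].mean() - pooled[86:].mean()

# Two-sided p-value: fraction of shuffled differences at least as extreme.
p_value = np.mean(np.abs(diffs) >= abs(observed))
print(f"observed difference {observed:.2f}, p = {p_value:.3f}")
```

One pre-registered test needs no multiple-comparison correction; each additional exploratory test inflates the chance of a spurious “significant” result.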

In the discussion of the results it was noted that the percentage figures could be quite misleading and that the absolute numbers should have been included. Also, the choice of dark columns for the very small graduate sample meant that the reader’s eye was drawn to these figures rather than to those of the much larger undergraduate sample. A question was also raised about the meaning of the term “Effect on learning” in the analysis.

It was concluded that any statements made about differences between graduate and undergraduate experiences and satisfaction were untenable because of the small sample size for the former.

Returning to the practitioner notes, it was then realised that some of the statements made in the notes were NOT established evidentially in this paper (even though they were seen to be reasonably correct).

Our conclusions are – that this evidence has a high risk of bias.

Implications for our practice

The paper gives no hint of the practical difficulties of recording and broadcasting video lessons, where artefacts that can be freely used in the classroom under copyright ‘fair dealing’ become a breach of copyright when recorded for open broadcast. There are also issues to do with staff intellectual property rights in video material, and this has been the subject of much negotiation and discussion between institutions and staff unions. The paper offers no advice on picking a way through such issues.

It was noted that the current Covid-19 crisis was driving delivery to online and that this had the potential to change the face of further education for the future.

Next steps

View from

What do you think?

References

Scagnoli, N.I., Choo, J. and Tian, J. (2019), Students’ insights on the use of video lectures in online classes. Br J Educ Technol, 50: 399-414. doi:10.1111/bjet.12572

Keywords:

Categories
Delivery and assessment of the curriculum Design, development and approval of programmes Enabling student development and achievement Political, social and economic drivers influencing educational policy and strategy. Quality assurance Student experience Systematic Review

Work-based learning in technical and vocational education

Photo by Science in HD on Unsplash

Blog Authors: Tony Adams, Fiona Balloch, David Cullen, Ian Hamilton, John McVeigh, Walter Patterson, Jan Robertson, Joe Wilson, Tracey Howe

City of Glasgow College provides over 2000 courses across a diverse range of technical, business, and professional curriculum areas. Our unique Industry Academy model channels our curriculum and staff expertise, along with external industry partner collaboration, to match the needs of students with the needs of employers. As a result, our students graduate with industry-relevant skills and highly valuable qualifications sought after by industry. We were therefore interested in a recently published systematic review of Application Of Work-Based Learning Model In Technical And Vocational Education (TVET).

Here’s what they did

The authors searched five databases (Scopus, PsycINFO, Springer, Google Scholar, and ScienceDirect) using the search terms “Work-Based Learning”, “Conceptual Model of WBL”, “WBL in TVET”, and “Implementation of WBL in TVET”. They included and reviewed 16 research-based articles published from 2000 to 2018.

Here’s what they found

  • The extent of the implementation of WBL in TVET in tertiary institutions including universities is low.
  • Emphasis was given to aspects such as the Student Industrial Work Experience Scheme (SIWES), leaving other aspects (cooperative work experience, job shadowing, youth apprenticeship programmes, internships, etc.) unexploited.
  • Factors affecting the implementation of WBL in TVET included curriculum defects, a poor policy framework, inadequately trained manpower to supervise proper implementation, and the lack of a WBL implementation framework in institutions of learning.

The author/s concluded

WBL is beneficial to students and to attainment of the goals of TVET; however, the several obstacles to its proper implementation undermine the effectiveness of WBL in TVET … However, there is no high-quality evidence with which to provide robust answers to questions about the effectiveness of WBL.

Our Journal Club’s views

Who are the authors of the paper and where do they work? The authors work at the Department of Technical and Engineering Education, School of Education, Universiti Teknologi Malaysia, Johor Bahru, and the Department of Electrical and Electronic Technology Education, SOSE (Technical), Federal College of Education (Technical) Bichi, Kano, Nigeria.

What do we know about the journal? Education, Sustainability and Society (ESS) is a peer-reviewed, open access trans- and interdisciplinary e-journal. Volume 1 was published in 2018.

What about the methodology used? The authors state that this was a systematic review. However, there was no definition of the population of interest beyond ‘schools’ and ‘workplace’. They did not state the definition of ‘work-based learning’ used to include papers, nor did they define any specific measurable outcomes of interest. It is unclear whether papers were independently reviewed and how disagreements were resolved.

The analysis section was weak, giving no information on how themes were established or whether they were defined a priori by the authors. No indication is given of whether quality appraisal took place. The findings are descriptive and lack detailed critique.

The discussion is a summary: it is not reflective and does not draw upon the extensive wider literature. Instead it focuses on issues relating to the authors’ own contexts.

The authors could have benefited from using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement, an evidence-based minimum set of items for reporting systematic reviews and meta-analyses.

Our conclusions are – that this evidence has a high risk of bias.

Implications for our practice

This paper evoked much discussion around this topic. There is literature on this area from North America, Europe, UNESCO, and the World Bank; examples include [Joe to add].

In terms of our own practice we should be cognisant of National – and for World Skills – International occupational standards and we considered whether our staff are up to date and aware of the latest versions and integrate these into curriculum design, delivery and evaluation. This could be facilitated by the engagement of members from industry professional organisations at all phases in the curriculum cycle. Our Industry Academy model would provide an obvious platform for this.

We discussed the validation cycle for rapidly moving areas such as computing [other examples], which is currently five years. Should we be looking at faster cycles, such as 18 months? This could be addressed within the refresh of our student experience strategy. Other organisations have considered graduate skills, and given the dramatic change since the Covid-19 pandemic we also need to consider skills FOR and AT work.

Other areas discussed included the College’s role as a Civic Anchor and how we could benefit the wider community for example linking with Volunteer Scotland.

Next steps

  • Review the integration of new staff from industry into education
  • Review staff expertise and identify opportunities for updating skills
  • Redefine Industry Academies and their operationalisation
  • Deliver new opportunities for professional practice deriving from work placements – building on staff internal capacities.
  • Anticipate demand for Active Blended Learning and Active Distance Learning programmes in preparation for and during work, as part of an ongoing engagement with Industry and Commerce.
  • Finalise the ‘Appointment of visiting and honorary staff’ scheme
  • Engage externals in our governance process e.g. Faculty Boards
  • Engage externals in all aspects of the curriculum process
  • Address some of these issues within the refresh of our student experience strategy.

View from

What do you think?

References

Keywords: systematic review, technical, vocational, work-based, education, learning

Categories
Delivery and assessment of the curriculum Digital technologies to enhance learning and teaching and assessment. Enabling student development and achievement Online learning Student engagement in learning Student experience Theories of learning, teaching and assessment

Learning and Satisfaction in Webinar, Online, and Face-to-Face Instruction: A Meta-Analysis

Blog Authors: Tracey Howe, Anthony Adams, Lisa Shields, Sarah Janette Robertson, Walter Patterson, John McVeigh, David Cullen, Kate Cotter, Joe Wilson.

Here’s what they did

Since the middle of the 1990s, there has been a considerable increase in eLearning resources and educational technologies within higher education and professional training contexts. One such method is the Webinar, with the advantages it confers in terms of flexibility: it negates the need for a ‘classroom space’ and means students can learn from their own homes or from other geographically convenient places. The authors sought to investigate, using meta-analysis, the effectiveness of Webinars in promoting online learning as compared with traditional classroom-based, ‘face to face’ teaching and with asynchronous online learning systems. In addition, they aimed to test Kirkpatrick’s four-level training evaluation model, which assumes that a positive correlation exists between student satisfaction and learning. They wanted to ascertain the levels of satisfaction and learning with respect to Webinars, ‘face to face’ teaching, and asynchronous online learning systems. The predictive validity of a positive association between satisfaction levels and learning had not thus far been established.

A meta-analysis of randomised controlled trials was performed, for which a two-step literature search was conducted based on pre-defined inclusion and exclusion criteria. An electronic search of four databases (ERIC, PsycINFO, PubMed, and Scopus) was undertaken using relevant keywords, followed by screening and removal of duplicates. 403 papers were excluded because they reported qualitative research, were review papers, or did not fully focus on the topics under scrutiny. Both authors read the remaining 51 to ensure eligibility, and from these 2 papers were selected. The second step involved cross-referencing, using backward and forward literature searching, to find potentially eligible papers; a further 3 were selected, giving a total of 5 papers included in the meta-analysis. These were coded independently and in duplicate, using statistical testing to ensure interrater reliability. Various inferential statistical tests were conducted to complete the meta-analysis and address the research questions.
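The core of such a meta-analysis is inverse-variance pooling of per-study effect sizes. The sketch below uses invented effect sizes and variances for five hypothetical trials (not the review’s actual data) to show how a fixed-effect pooled estimate and its confidence interval are computed; the paper’s own analysis may well differ in detail (e.g. random-effects weighting).

```python
import numpy as np

# Hypothetical standardised mean differences (e.g. Hedges' g) and their
# variances for five trials -- illustrative numbers only.
g = np.array([0.12, -0.05, 0.30, 0.08, 0.15])
var = np.array([0.04, 0.06, 0.05, 0.03, 0.07])

# Fixed-effect inverse-variance pooling: weight each study by 1/variance,
# so precise studies contribute more to the pooled estimate.
w = 1.0 / var
pooled = np.sum(w * g) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
print(f"pooled g = {pooled:.3f}, 95% CI [{ci[0]:.3f}, {ci[1]:.3f}]")
```

For these made-up numbers the confidence interval straddles zero, the same shape of result the paper reports: a small pooled difference that is not statistically significant.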

Here’s what they found

With regard to student learning, Webinars were found to be more effective in promoting knowledge than traditional ‘face to face’ teaching and asynchronous online learning. However, the researchers point out that the differences between Webinars and the other groups were minimal and not statistically significant, leading to the conclusion that all 3 modalities tend to be equally effective for student learning.

In terms of student satisfaction, Webinars appear inferior to ‘face to face’ teaching but produce higher satisfaction than asynchronous online instruction. Again, however, the differences were negligible in size, so satisfaction can be assumed to be similar across all 3 modalities. Correlation analysis of the association between student satisfaction and student knowledge showed negative relationships between these 2 variables in all learning modalities. Therefore, Kirkpatrick’s predicted positive causal link between satisfaction and learning could not be confirmed.

The author/s concluded

The researchers concluded that the results of this meta-analysis provide insight into the practical application of e-learning modalities in higher education and professional learning contexts. All 3 modalities were roughly equal in terms of learning and satisfaction outcomes – therefore any one of them may be used without concern for major negative consequences. As traditional ‘face to face’ instruction seems to be slightly superior to online learning environments generally, where flexibility of time or location is not needed, ‘face to face’ classroom education remains an appropriate learning environment for higher education and professional training contexts. But if flexibility is required, Webinars can be used as an alternative with only slightly reduced student satisfaction. Asynchronous learning environments also offer a viable alternative, for example where students are in different time zones. ELearning modalities generally, and Webinars in particular, are useful tools for extending the traditional learning environment and creating a more flexible one for both students and tutors.

Our Journal Club’s views

Who are the authors of the paper and where do they work? For the primary author, Christian Ebner, this was his first published article, perhaps arising from PhD studies or as an ‘early researcher’ publication. Andreas Gegenfurtner has 39 publications, focused on ‘knowledge transfer’ and ‘learning through technology’.

What do we know about the journal? This is an academic, peer-reviewed journal, reportedly the 5th most cited publication in its field, and it is readily available via Open Access.

What about the methodology used? The title of this paper is self-explanatory, encouraging the reader to engage, and the abstract is concise but informative. The 2 stated research questions are clear and focused. Meta-analysis, which allows the pooling of data from primary research studies, is a suitable methodological choice, enabling the researchers to explore a wider data set; this approach sits at the top of the evidence hierarchy. PRISMA guidelines were acknowledged and followed in the execution of this systematic review of the literature. Relevant inclusion and exclusion criteria were clearly identified, as was a comprehensive search of the literature, including retrospective searching. The authors presented all methodological detail needed for their study to be reproduced. Throughout the paper, decision-making has been documented, making this cognitive process transparent to the reader. The inclusion of the PRISMA statement provided a concise but illuminating overview, and the chosen coding scheme was also clearly outlined. Data extraction was fully aligned to the research questions. Interrater reliability testing was employed to reduce bias and minimise subjectivity, and it revealed a high level of consistency between raters. Statistical calculations were made available, as were the Forest plots (which allow the results of the studies to be combined visually), and the results were then faithfully presented in textual form. The researchers also acknowledged the limitations of their study, and the project was funded by an educational establishment.
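As an illustration of the interrater reliability check mentioned above: the paper does not state which statistic was used, but Cohen’s kappa is the usual choice for two raters. The sketch below, with hypothetical ratings rather than the authors’ data, shows how it corrects raw agreement for chance.

```python
# Cohen's kappa: agreement between two raters, corrected for the
# agreement expected by chance. A common way to quantify interrater
# reliability; the ratings below are hypothetical, not the paper's data.

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Proportion of items on which the raters actually agreed:
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Proportion of agreement expected if both rated independently at random:
    expected = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected)

# Hypothetical include/exclude decisions for ten screened papers:
a = ["inc", "inc", "exc", "inc", "exc", "exc", "inc", "exc", "inc", "inc"]
b = ["inc", "inc", "exc", "inc", "exc", "inc", "inc", "exc", "inc", "inc"]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```

By common conventions, kappa values above roughly 0.8 are read as strong agreement, which is the kind of result the authors describe as "a high level of consistency between raters".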

Our conclusions are – that this evidence has a low risk of bias.

Implications for our practice

The trustworthy evidence from this paper could help with decision making about our educational programmes when planning for the academic year 2021-22. It could inform decisions about which levels of students receive which modality of learning: ‘face to face’, Webinar, or asynchronous online systems. Having a relevant evidence base to inform College policy will be advantageous. The motivation of different levels of student could also be considered when reaching these decisions.

This paper offers direction on the use of synchronous Webinars, which enable immediate responses and spontaneity of feedback. These also confer benefits in terms of connectivity and belonging on the part of the student. However, there could be a potential ‘burden’ placed on lecturers when preparing and participating in these live events, and perhaps guidelines are needed on how much timetabled time is devoted to an online synchronous presence.

There needs to be recognition of the ‘digital divide’ among students, in terms of potentially limited internet connectivity and the lack of a suitable environment from which to engage in online learning.

Next steps

To consider the other papers within this series – one Journal Club member to review, read and disseminate relevant information, perhaps via Webinar.

The paper, Olson, J. S., and McCracken, F. E. (2015). Is it worth the effort? The impact of incorporating synchronous lectures into an online course. Online Learn. J. 19, 73–84. doi: 10.24059/olj.v19i2.499 is recommended for Journal club members to read.

To make full use of the support and guidance provided by the College’s Learning and Teaching Academy.

When we return, we may well experience a very different College, and knowing that online learning ‘works’ confers greater flexibility when planning and executing teaching and learning.

What do you think?

References

Ebner, C and Gegenfurtner, A. (2019) Learning and Satisfaction in Webinar, Online, and Face-to-Face Instruction: A Meta-Analysis. Front. Educ. 4:92. doi: 10.3389/feduc.2019.00092

Keywords: Adult learning, computer-mediated communication, distance education and telelearning, distributed learning environments, media in education.

Our Blog Posts are written by staff at City of Glasgow College to inform and inspire our practice. We meet together at the Journal Club to consider the latest evidence to provide insights on hot topics related to learning and teaching, quality assurance and subject needs. It forms part of our activity for General Teaching Council Scotland registration and Professional Standards for lecturers in Scotland’s Colleges demonstrating that we are a self-critical staff community.

Categories
Cohort study Delivery and assessment of the curriculum Digital technologies to enhance learning and teaching and assessment. Enabling student development and achievement Online learning Student engagement in learning Student experience Technology and digital literacies

Digital Literacy in Higher Education: engagement with e-tutorials using blended learning

Blog Authors: Tracey Howe, Anthony Adams, Angus Hynd-Gaw, John McVeigh, Sarah Janette Robertson, David Cullen, Lisa Shields, Claire Roberts.

Here’s what they did: The researchers conducted a case study project aimed at developing interactive digital skills E-tutorials as an integral part of selected undergraduate and postgraduate programmes. Nine interactive E-tutorials were devised collaboratively between instructors and students and were then embedded within the curricula. The authors then sought to evaluate the students’ experience of, perceptions of, and engagement with these E-tutorials, and to explore the respondents’ general attitudes to online learning. This was operationalised using the survey method: a 23-item questionnaire comprising open and closed questions was delivered via Survey Monkey. The survey population consisted of 274 students from undergraduate (1st and 2nd year) and postgraduate programmes; 86 students responded (a response rate of 31%).

Here’s what they found:

Factors affecting user engagement with digital learning were highlighted. These included challenges such as browser incompatibility, uneven sound quality, and internet connectivity issues – all of which disrupted learning.

Students’ perceptions of the role of online learning within their programme were identified: E-tutorials were perceived as a valuable asset for reiterating classroom learning, notably for revision purposes, and as a valuable resource enabling students to learn at their own pace and in their own time. They were accessible, easy to use, and of appropriate duration.

Overall, respondents expressed enjoyment of this form of learning but highlighted a preference for a blended learning approach. Respondents did not want to forego ‘face to face’ teaching within the classroom environment entirely.

The author/s concluded: Interactive digital learning should be strategically embedded within under/postgraduate courses at defined points of the programme.  This would reinforce other forms of learning and skill development.  Appropriate support is required for successful and effective online learning, for example the speedy resolution of any technical glitches, in order to avoid a detrimental online experience.

Our Journal Club’s views

Who are the authors of the paper and where do they work? Claire McGuinness, Assistant Professor, Deputy Head of School and Director of Undergraduate Programmes in the School of Information and Communication Studies, University College Dublin, Dublin, Ireland and Crystal Fulton, Associate Professor, University College Dublin, Dublin, Ireland. Our view is that both are credible researchers and authors.

What do we know about the journal? The Journal of Information Technology Education: Innovations in Practice is an academically peer-reviewed journal (papers published within it have undergone peer review), but it does not appear in the Impact Factor table (it is not part of Thomson Reuters), although it is mentioned in other indices. This does not necessarily suggest that papers within this journal are unimportant. The journal seems to be a vehicle for the publication of ‘early’ work, i.e. new ideas, initial findings, innovations and pilot studies. It is also an international publication and is well established, this paper being drawn from volume 18. We therefore have confidence in the journal itself.

What about the methodology used? The title is attractive (encouraging people to read it) and informative, making the content of the paper self-evident to the reader. The abstract is extremely comprehensive, and quite lengthy in comparison to other papers reviewed; this may be because there are no word count constraints with this journal. It is also well structured through the use of subheadings. The introduction and extensive literature review fully demonstrate that the development and implementation of the E-tutorial project were evidence based, and the objectives of the study are clear and explanatory.

The case study research approach was a pragmatic one, as the study unfolded within a ‘real life’ context. Data were collected using a descriptive survey approach yielding both textual data and descriptive statistics. The questionnaire had undergone multiple iterations and revisions before being distributed, showing that an attempt had been made to fully review and revise it. No detail was provided as to whether an objective reviewer had also been used to verify its reliability and validity. Detail of the analysis of the qualitative data was provided (hand coding followed by a line-by-line constant comparative approach), but here too an independent reviewer could have been employed to verify findings. Ethical considerations and approval were achieved via the appropriate channels. In terms of the data collected, however, full details about the number of respondents to different questions of the survey were not provided, and it was not always clear what the ‘make up’ of the respondents was in relation to their respective courses.

Our conclusions are – that this evidence has a medium/low risk of bias.

Implications for our practice: From a City of Glasgow College perspective we need to consider how much experience and expertise our students have of online, multi-media learning, especially during the current situation.

Next steps: It would be useful to audit students’ digital literacy and online learning skill development to identify the skill base and level of competency they have. Checking the internet availability of our student population is also an important factor to ascertain. This would then provide an evidence based baseline on which to devise and deliver skill development and digital literacy training at the appropriate level.

What do you think?

References: McGuinness, C. and Fulton, C. (2019) Digital Literacy in Higher Education: A Case Study of Student Engagement with E-Tutorials Using Blended Learning. Journal of Information Technology Education: Innovations in Practice, 18, pp. 1–28. https://doi.org/10.28945/4190

Keywords: blended learning, digital literacy, e-learning, e-tutorials, higher education, online learning, online tutorials


Categories
Assessment and feedback Delivery and assessment of the curriculum Digital technologies to enhance learning and teaching and assessment. review Student engagement in learning Technology and digital literacies

Student-generated video creation for assessment?

Blog Authors: Fiona Balloch, Jan Robertson, John McVeigh, Lisa Shields, Joe Wilson, Tracey Howe

Image: Photo by Hermes Rivera on Unsplash

Student-generated video creation assessments are an innovative and emerging form of assessment in higher education. Academic staff may be reluctant to transform assessment practices without robust evidence of the benefits and rationale for doing so and some guidance regarding how to do so successfully. JISC have recently published Future of assessment: five principles, five targets for 2025 which states ‘In a move away from the traditional essay or exam, assessments are building in authenticity by asking students to develop websites, set up online profiles, shoot and edit videos, and use social media.’

We explored the idea, with reference to the article Student-generated video creation for assessment: can it transform assessment within Higher Education? published in the International Journal of Transformative Research, 2018.

Here’s what they did

They searched literature and conducted a thematic analysis related to the use of student-generated video for assessment.

Here’s what they found

For successful use of video creation for assessment:

  • Align the video creation task to both the learning outcomes and the skills development required for graduate capabilities in the relevant industry.
  • Ensure technological support, resources and infrastructure are all in place.
  • Have an intentional change management process to support both staff and students in the transition to a new assessment format.
  • Involve students in generating clear guidance for the assessment and in developing an assessment rubric.

The author/s concluded

Video assessment is beneficial for students’ digital communication skills and an effective and enjoyable method of assessment.

Our Journal Club’s views

Who are the authors of the paper and where do they work? At the time of publication, the authors, Ruth Hawley and Cate Allen, worked at the University of Derby. Our view is that the authors may be biased in favour of video assessment, seeking to provide evidence to support an initiative taking place within their own institution.

What do we know about the journal? The fully refereed International Journal of Transformative Research does not seem to be live yet and will be issued for the first time in Fall 2020. Our view is that the journal does not meet its stated aims: it says that articles should explore transformative impact, but this is not the case in this article.

What about the methodology used? The research could not be easily replicated based on the level of detail provided in the paper, and the findings lack critical analysis. Our view is that this evidence is inconclusive and biased: the paper lacks a rationale for the use of video assessment and offers no guidance on how it can be used effectively.

Our conclusions are – that this evidence has a high risk of bias.

Implications for our practice

  • How can Nautical courses extend their use of asynchronous video assessment with international students?
  • How could the COGC Health suite integrate video assessment into assessments?
  • How could issues such as trolling, and confidence with one’s own image on video be addressed through digital communication skills training?
  • How could YouTube and Flipgrid be used for assessment?
  • It is easier for assessors to view videos asynchronously, at a time of the assessor’s choosing, than to assess a large run of live events one after the other.
  • Training and support are available through the College Learning and Teaching Academy.

Next steps

Create a working group to pursue this topic in the College with a view to group-creation of a paper on this area.

What do you think?

References

Hawley, R. and Allen, C. (2018) Student-generated video creation for assessment: can it transform assessment within Higher Education? International Journal of Transformative Research.

Keywords: assessment, video, student-generated, Higher Education, digital, technology


Categories
Assessment and feedback Delivery and assessment of the curriculum Digital technologies to enhance learning and teaching and assessment. qualitative quantitative Technology and digital literacies

Rubrics in Vocational Education

Blog Authors: Tracey Howe, John McVeigh, David Cullen, Walter Patterson, Ian Hamilton

Image by Cleonard1973 / CC BY-SA (https://creativecommons.org/licenses/by-sa/4.0)

Our College delivers vocational training that frequently uses observation-based assessment. However, we realise that for this to be reliable, fair, and practicable it also needs to demonstrate consistency across assessors (quality assurance) and involve decisions about the range and number of observations of performance required to make a reliable judgement about competence. The notion of using rubrics is being explored, and we looked at the paper ‘Electronic Rubrics Design to Assess Student Competence in Vocational Education’.

Here’s what they did. Using Design Based Research, they aimed to develop an instrument containing a rubric for food and beverage service practice in vocational education that is valid, practical, and effective. The three stages were: 1) identification and analysis of problems; 2) development of a prototype program; 3) testing and implementation of the prototype program.

They explored the needs of 4 food and beverage service lecturers from different universities and 30 culinary education students at the Indonesian Education University. This defined the concept of the evaluation tools to be made, and validity was explored using the views of 2 specialist subject matter experts and 1 assessment expert.

Data were collected via interviews and questionnaires and analysed using descriptive statistics.

Here’s what they found.

  • Food and beverage service lecturers had never created nor applied an assessment rubric.
  • Students on the food and beverage service programme did not know which assessment tools were used by lecturers.
  • The researchers designed a task performance guide that students can use during practical work.
  • The performance criteria for the task and performance assessment (rubric) showed a good degree of validity.

The author/s concluded

The study produced a set of instruments: a performance task guide that students can follow when carrying out food and beverage service lab work, and a performance assessment consisting of an electronic rubric serving as practical competency guidelines. The developed instruments were validated through expert discussions, conducted using the Aiken index coefficient.
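The Aiken index the authors cite (Aiken’s V) is straightforward to compute. A minimal sketch, using hypothetical expert ratings rather than the paper’s own data:

```python
# Aiken's V content-validity coefficient for a single rubric item.
# V ranges from 0 to 1; higher values indicate stronger expert agreement
# that the item is valid. The ratings below are hypothetical.

def aikens_v(ratings, lo, hi):
    """V = sum(r - lo) / (n * (c - 1)) for n raters on a lo..hi scale."""
    n = len(ratings)
    c = hi - lo + 1                    # number of scale categories
    s = sum(r - lo for r in ratings)   # total distance above the scale minimum
    return s / (n * (c - 1))

# Three experts (matching the paper's panel of 2 subject matter experts
# and 1 assessment expert) rating one item on a 1-5 scale:
print(f"V = {aikens_v([5, 4, 4], lo=1, hi=5):.2f}")
```

Computing V per rubric item in this way is what lets the authors claim "a good degree of validation" from a small expert panel.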

Our Journal Club’s views

Who are the authors of the paper and where do they work? All authors work at Universitas Pendidikan Indonesia (UPI), ‘The Education University’.

What do we know about the journal? This paper was published as part of proceedings from the 1st Vocational Education International Conference (VEIC 2019).

What about the methodology used? The main problem with the paper is that English was clearly not the authors’ first language, which resulted in a lack of clarity throughout. The methodology was unclear, and the subsequent analysis, results and conclusions were all difficult to interpret.

Our conclusions are – that this evidence has a high risk of bias.

Implications for our practice

There are a number of individuals and programme teams across our College developing and using electronic rubrics, including in Beauty and Culinary Arts, delivered on ‘Moodle’, our VLE platform. It was felt that rubrics give more standardised feedback to students, helping them understand their performance.

A key area where we could look at this is that of ‘meta skills’ as these are cross disciplinary in nature and could provide core methodology and consistency of approach.

Next steps

  • College staff currently developing or using rubrics could showcase their work at forthcoming internal events and conferences.
  • We could propose a work package on rubrics as part of the current institutional review of assessment and feedback
  • Create a working group of interested individuals
  • Ask OD and COPTE for staff development in this area
  • Look at the Skills Development Scotland meta skills

What do you think?

References

  • Muktiarni, M. et al. (2019) ‘Electronic Rubrics Design to Assess Student Competence in Vocational Education’, in 1st Vocational Education International Conference (VEIC 2019). Atlantis Press, pp. 257–261. doi: 10.2991/assehr.k.191217.042.

Keywords: rubrics, assessment, competence, vocational, college

