Categories
Delivery and assessment of the curriculum Digital technologies to enhance learning and teaching and assessment. Enabling student development and achievement Online learning qualitative quantitative Student engagement in learning Student experience Technology and digital literacies

Distance Learners’ Use of Handheld Technologies

Blog Authors: Walter Patterson; Lynn Brown; Jan Robertson; Joe Wilson; Tracey Howe

Image: Matthew Hurst from New York, NY, USA / CC BY-SA (https://creativecommons.org/licenses/by-sa/2.0)

One only has to walk through the College, or along any street or social space, to notice that the majority of people are engrossed in some activity on the screen of a handheld device. How can we use this technology to best effect for learners at our College? We explored the findings of a recently published paper, Distance Learners’ Use of Handheld Technologies: Mobile Learning Activity, Changing Study Habits, and the ‘Place’ of Anywhere Learning.

Here’s what they did

Undergraduate students enrolled at the UK’s largest distance learning university were surveyed. This included questions about: (a) ownership of technologies; (b) frequency of use of handheld devices (tablet, e-readers, and smartphones) for specified leisure activities and for specified learning activities; (c) locations at which each device is used for study purposes; (d) perceived change in study habits; (e) statements about impact of use on learning; (f) reason for purchase; (g) length of time used; (h) benefits and challenges; and (i) preferences for future use of each technology for learning. Open comment questions were added to probe the types of learning used in distance learning contexts, reasons for use or non-use, and the locations of use. Students were asked separately about their use of tablets, smartphones, and e-readers so potential differences in use could be analysed.

There were 446 responses from 3,000 students, giving a response rate of 14.9%. All age groups, study levels, and disciplines were represented, and a wide range of analytical methods was used to analyse the data.

Here’s what they found

Five key findings are:

  1. most students now use handheld devices for study-related learning;
  2. the distribution of study-related learning tasks was similar across all seven study places;
  3. there is a strong, statistically significant correlation between the number of study places in which handheld devices are used and the number of study task types performed;
  4. two fifths of students using a handheld device for learning have noticed a change in their study habits and a benefit to their learning;
  5. multiple regression analysis shows that three variables (number of study places, number of study tasks, and change in study habits) predict finding it easier to access learning materials and improved quality of learners’ work.
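
A toy calculation can make finding (3) concrete. The sketch below computes a Pearson correlation from first principles in plain Python; the per-student counts are invented for illustration and are not the study's data.

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-student counts (NOT the paper's data): the number of
# study places where a handheld device is used, and the number of distinct
# study-task types performed.
places = [1, 2, 2, 3, 4, 4, 5, 6, 6, 7]
tasks = [2, 3, 4, 5, 5, 7, 8, 9, 11, 12]

print(f"r = {pearson_r(places, tasks):.2f}")  # → r = 0.97
```

On data of this shape, a coefficient close to 1 is what "strong" means in the finding; the paper additionally reports statistical significance, which depends on the sample size as well as on r.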

The author/s concluded

The study concludes by proposing two new concepts: the flow of places and place of space. These should help direct the framing of future studies into the places, spaces, and mobility of formal and informal seamless learning.

Our Journal Club’s views

Who are the authors of the paper and where do they work? Walter and Joe declared an interest, as both had previously worked with Prof Sharples (author of over 300 papers and founder of the Association for Mobile Learning).

What do we know about the journal? The International Review of Research in Open and Distributed Learning (IRRODL) is a refereed, open access e-journal that disseminates original research, theory, and best practice in open and distributed learning worldwide. Club members judged it to be very reputable, with all articles subjected to double-blind peer review by at least two subject experts. According to Google Scholar, in 2019 the journal ranked third among educational technology journals and fifth among all education journals.

What about the methodology used? Some viewed the abstract as concise and accurate; others thought it presented an overcomplicated approach to some straightforward research questions. The abstract also lacked an explanation of some important concepts in the paper (e.g. flow of places).

Some felt that the paper's readability suffered from its dense page layout. It was also noted that the authors appeared predisposed to a particular framework, which may have influenced the approach taken and the analysis.

The introduction is rather lengthy and would be improved by more explicit subheadings. It also introduces complex ideas, some of which are not fully addressed in this paper. It was also thought unusual for the research questions to appear in the middle of the introduction rather than at the end.

It was also noted that the population for the study is very niche, being the specific domain of distance learners, who are highly likely also to be mobile learners. This may not mirror the type of students who attend COGC.

The careful definition of what counts as a handheld device was appreciated, and the context and background of the study were well explained.

There was an extended discussion of the response rate (14.9%). There was a danger that respondents were self-selecting, holding a definite view one way or the other on mobile learning. There was also concern that nothing was known about the 85% who did not respond (e.g. their traits and attributes).

The over-representation of older students was noted, even though the initial sample had been stratified according to key factors (good). The survey design itself was well developed and demonstrated best practice. The choice of categories for place seemed reasonable but had been piloted with only six people. The inclusion of PCs and laptops was welcomed, as it offered a good comparison between truly mobile devices and others, as was the recognition in the survey design that mobiles could be used for other (distracting) activities as well as study-related ones.

No qualitative results were presented, which was somewhat disappointing (these are to be published separately). The presentation of the results could have been more informative had actual sample numbers been included rather than just percentages. That said, the tables and charts allowed a clear understanding of the study outcomes, and the statistical evidence was well presented.

There are strong links between the number of different study places where mobile devices are used and the number of different tasks for which they are used. The distribution of study-related tasks was fairly even across all the different study places. Two fifths (40%) of respondents noted a change in study habits and an improvement to their learning through the use of mobile devices.

Our conclusion is that this evidence has a low risk of bias.

Implications for our practice

A study of learning designs across the College revealed a predominance of content delivery. There is scope to implement better-quality learning designs drawing on varied tasks, many of which could be supported by mobile learning (ABC of Learning Design). The College could move in this direction, encouraging students to participate more in mobile learning.

It was commented that changing the tone of voice in some module content (e.g. a personalised voice) had proved to engage learners better, so this could also be incorporated.

Next steps

It was intimated that one response to COVID-19 could be to move more learning to mobile and that staff would be given help to do this in the coming weeks.


What do you think?

References

Cross, S., Sharples, M., Healing, G., & Ellis, J. (2019). Distance Learners’ Use of Handheld Technologies: Mobile Learning Activity, Changing Study Habits, and the ‘Place’ of Anywhere Learning. The International Review of Research in Open and Distributed Learning, 20(2). https://doi.org/10.19173/irrodl.v20i2.4040

Keywords: mobile learning, seamless learning, study space, handheld learning technologies, anywhere learning, distance education.

Our Blog Posts are written by staff at City of Glasgow College to inform and inspire our practice. We meet together at the Journal Club to consider the latest evidence to provide insights on hot topics related to learning and teaching, quality assurance and subject needs. It forms part of our activity for General Teaching Council Scotland registration and Professional Standards for lecturers in Scotland’s Colleges demonstrating that we are a self-critical staff community.

Categories
Assessment and feedback Delivery and assessment of the curriculum Design, development and approval of programmes Digital technologies to enhance learning and teaching and assessment. Enabling student development and achievement qualitative Student engagement in learning Student experience Technology and digital literacies

Experiences of reflection and digital storytelling

Blog Authors: David Cullen, Walter Patterson, John McVeigh, Lynn Brown, Tracey Howe, Lisa Shields

Image: Dave Morris from Oxford, UK / CC BY (https://creativecommons.org/licenses/by/2.0)

City of Glasgow College has many interests, including ESOL teacher development, reflective practice in educators, and the use of technology in assessment evidence. This week’s paper, ESOL pre-service teachers’ experiences and learning in completing a reflection paper and digital storytelling, was chosen for review as it ostensibly covers a number of these topics.

Here’s what they did

The subjects of the qualitative study were 20 students on a post-graduate level Teaching English for Speakers of Other Languages (TESOL) course. In a module on Language and Culture, the students had to complete two assessment tasks: a written assignment and a digital storytelling artefact. The researcher then had the students carry out two further tasks for the purposes of the study: writing a reflection paper and delivering a reflective presentation. The researcher analysed all four sources of data and sought to examine the subjects’ performative approaches to the tasks, and their reactions to the tasks. The researcher also considered the gender and nationality of the subjects in relation to their performance and response.

Here’s what they found

  • there were general commonalities in the subjects’ performance of, and reaction to, the two sets of tasks (assessment and reflection);
  • there was a significant difference in subjects’ responses between the familiar written report and the unfamiliar digital storytelling task.

The author/s concluded

The inclusion of a dual reflective task was of benefit to pre-service TESOL candidates as it enhanced their reflective literacy and their understanding of the course content on Language and Culture. TESOL training courses should consider using this approach.

Our Journal Club’s views

Who are the authors of the paper and where do they work? This study was undertaken by an individual researcher, an Associate Professor of English (TESOL) at Murray State University in the United States. Regarding the individual author, it was not possible to find a list of publications or citations. We also noted that only one other individual contributed to the research activities.

What do we know about the journal? The Australasian Journal of Educational Technology is a bi-monthly peer-reviewed academic journal covering research in educational technology, instructional design, online and e-learning, educational design, multimedia, computer assisted learning, and related areas. It was rated in 2015 as having an Impact factor of 1.171. It is published by the Australasian Society for Computers in Learning in Tertiary Education. Our view is that this is a trustworthy publication.

What about the methodology used? We felt that the literature review was almost completely descriptive, serving only to provide definitions of terminology, and failed to critically evaluate the sources.

Secondly, the researcher failed to identify and declare any potential bias or limitations of the activity. Thirdly, we felt that the writing of the article, while thorough and detailed in parts, lacked clarity and was consequently difficult to decode and interpret.

Finally, there was significant and undeclared potential for bias: the researcher was also the course tutor; the students, as subjects, were potentially eager to please the teacher in their responses; and only one other individual was involved in supporting the researcher, and that person was also a direct colleague.

Given the high probability of bias and the other concerns outlined above, we have limited confidence in this article and we feel that the exploratory project would have been better served had it been presented as a less formal case study account.

Our conclusion is that this evidence has a high risk of bias.

Implications for our practice

How can CoGC staff develop “reflection” in our working practice and professional development?

Is this concept included in Staff Integration activities?

Is digital storytelling something we can use in staff development and/or student work?

We have recently had a tussle with a local university over the accreditation of a vocational award. The university insisted that reflection should be assessed via an essay. After several rounds of negotiation, the university has accepted that there are equally valid representations of reflective practice – such as digital storytelling (videos, blogs, e-portfolios). We have been assured on many occasions by the Scottish Qualifications Authority (SQA) that it has moved away from specifying the form in which evidence can be presented – if only all External Validators were of the same mind!

Next steps

The topic of reflective practice should be considered for inclusion in the OneCity event in June and the Education Symposium after summer.


What do you think?

References

Park, H.-R. (2019). ESOL pre-service teachers’ experiences and learning in completing a reflection paper and digital storytelling. Australasian Journal of Educational Technology, 35(4).

SQA 2017 Digital Evidence for Internally Assessed HN and VQ Units: Principles and Guidance



Categories
Delivery and assessment of the curriculum Digital technologies to enhance learning and teaching and assessment. Enabling student development and achievement Online learning qualitative quantitative Student engagement in learning Student experience Technology and digital literacies

Students’ insights on the use of video lectures in online classes

Blog Authors: Walter Patterson, John McVeigh, Jan Robertson, Joe Wilson, Tony Adams, Tracey Howe

Image: Iase.bodh / CC BY-SA (https://creativecommons.org/licenses/by-sa/4.0)

This week our College closed in response to the Covid-19 pandemic sweeping across the UK. We selected this week’s paper, Students’ insights on the use of video lectures (VL) in online classes, to help us explore options for remote learning for our students. The overarching question motivating this research focused on students’ perception of their own learning in courses using VL.

Here’s what they did

This was a mixed-methods study using surveys and a focus group as data sources, combined with a review of previous research on the topic. Selection criteria for participants included: (1) graduate and undergraduate students enrolled in online courses for Business and Education majors in the 2014–15 academic year; (2) online courses that included VL of any type; and (3) instructors’ approval to explore the design of the online courses. Of the 493 eligible students (424 undergraduates and 69 graduates), 96 were recruited – 86 undergraduates and 10 graduates.

The online survey consisted of 18 questions that focused on 5 main categories: overall experience as online students, interaction with VL, perceived learning impact and integration of VL with other course activities. The focus group was administered via a web conferencing system.

Data from the focus group were analysed qualitatively only. Basic descriptive statistics and graphical analysis were applied to the quantitative data.

Here’s what they found

Three factors predicted students’ satisfaction and their perception of the relevance of VL to their own learning:

  • familiarity with the medium,
  • the extent of prior experience using video in learning, and
  • educational level or academic status.

The author/s concluded

This study suggests that courses in higher education should consider the inclusion of VL in their course materials because the use of video meets different learners’ preferences, increases students’ engagement with content, enhances students’ perception of better learning experiences through content interaction, and reinforces teaching presence in online courses.

Our Journal Club’s views

Who are the authors of the paper and where do they work? Dr Norma I. Scagnoli is the senior director of eLearning in the College of Business and holds a position of Research Associate Professor in the College of Education at the University of Illinois. The other authors are also based there.

What do we know about the journal? BJET has just published its 50th anniversary edition. The journal is published by Wiley on behalf of the British Educational Research Association (BERA). Impact factor: 2.588 (2018). ISI Journal Citation Reports © Ranking 2018: 31/243 (Education & Educational Research).

BJET is a long-standing journal, but it is held to be academic in its focus, and its treatment of EdTech is quite different from current ALT publications, which tend to focus on ‘real world’ matters. BJET articles were characterised as abstract, nuanced, and distanced from real-world practice.

What about the methodology used?

The abstract did not offer much explanation of the journal article – it read more as a teaser to read further. At first sight the ‘practitioner notes’ (included in a box) appeared to give a clear explanation of the context, methods and outcomes of the study, but this was revisited later.

The research questions were easy to find and the methodology was deemed to be appropriate (mixed methods using surveys, focus groups). The inclusion of the survey questions and the focus group questions in the appendix was appreciated.

However, a significant weakness of this paper is that it gives no indication of the context of the undergraduate or postgraduate students who participated. Participants were clearly self-selecting from the purposeful sample, with a high risk of bias in the qualitative data. The fact that only 10 graduate students participated made some of the conclusions and analysis suspect. Also, the inclusion of graduate students made the analysis and results more complex than this study required.

It would have been good to have some demographics of the participants in terms of age, experience, and access to technology, so that comparisons could be made with the student cohort at our College.

Nor was it possible to make sense of the data in the light of the particular population sampled, and it was agreed that it would be unsafe to treat the conclusions as generalisable. It was also noted that the use of a particular online tool (Zoom) for the focus group could have excluded some from participating.

The statistical analysis was overcomplicated: a number of different tests had been carried out on the data to uncover where statistical significance might arise, rather than proposing a hypothesis and using the data to confirm or reject it (which is how statistical analysis should be performed).
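
The concern here is the familiar multiple-comparisons problem: run enough tests and something will appear "significant" by chance. One standard remedy is to correct the significance threshold for the number of tests performed. The sketch below shows a Bonferroni correction in plain Python; the p-values are invented for illustration, not taken from the paper.

```python
def bonferroni(p_values, alpha=0.05):
    """Bonferroni correction: with m tests, a result is only declared
    significant if its p-value is below alpha / m."""
    threshold = alpha / len(p_values)
    return [p < threshold for p in p_values]

# Ten hypothetical exploratory tests. The 'significant-looking' p = 0.03
# does not survive correction against alpha/10 = 0.005.
p_values = [0.03, 0.21, 0.48, 0.09, 0.76, 0.33, 0.62, 0.15, 0.55, 0.41]
print(bonferroni(p_values))  # all False
```

A pre-specified hypothesis avoids the problem altogether, which is the approach the club felt the authors should have taken.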

In the discussion of the results it was noted that the percentage figures could be quite misleading and that absolute numbers should have been included. Also, the choice of dark columns for the very small graduate sample meant that the reader’s eye was drawn to those figures rather than to those of the much larger undergraduate sample. A question was also raised about the meaning of the term “effect on learning” in the analysis.
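
The point about percentages is easy to demonstrate: with only 10 graduate respondents, a single student shifts the reported figure by ten percentage points, so a bare percentage hides how fragile the number is. A small illustrative sketch (the counts are hypothetical, not taken from the paper):

```python
def report(count, n):
    """Report a survey result as both the absolute count and a percentage."""
    return f"{count}/{n} ({100 * count / n:.0f}%)"

# One extra graduate student is a 10-point swing:
print(report(7, 10))   # → 7/10 (70%)
print(report(8, 10))   # → 8/10 (80%)
# Among the 86 undergraduates, the same swing needs about nine students:
print(report(60, 86))  # → 60/86 (70%)
print(report(69, 86))  # → 69/86 (80%)
```

Reporting counts alongside percentages, as in the function above, would have let readers judge the graduate figures at a glance.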

It was concluded that any statements made about differences between graduate and undergraduate experiences and satisfaction were untenable because of the small sample size for the former.

Returning to the practitioner notes, it was then realised that some of the statements in the Notes were NOT established evidentially in this paper (even though they were seen to be reasonably correct).

Our conclusion is that this evidence has a high risk of bias.

Implications for our practice

The paper gives no hint of the practical difficulties of recording and broadcasting video lessons, where artefacts that can be freely used under copyright ‘fair dealing’ in the classroom become a breach of copyright when recorded for open broadcast. There are also issues around staff intellectual property rights in video material, which have been the subject of much negotiation and discussion between institutions and staff unions. The paper offers no advice on picking a way through such issues.

It was noted that the current Covid-19 crisis was driving delivery to online and that this had the potential to change the face of further education for the future.

Next steps


What do you think?

References

Scagnoli, N.I., Choo, J. and Tian, J. (2019), Students’ insights on the use of video lectures in online classes. Br J Educ Technol, 50: 399-414. doi:10.1111/bjet.12572



Categories
Digital technologies to enhance learning and teaching and assessment. Enabling student development and achievement Online learning review Student engagement in learning Technology and digital literacies

The flipped classroom!

Blog Authors: Tracey Howe, John McVeigh, Jan Robertson, Walter Patterson, Cathy Glover, Anna Close, Fiona Nixon

Image: DuEnLiJu / CC0

In the last few weeks, due to the coronavirus (COVID-19) pandemic, the world of education has had to react to fundamental change as schools, colleges and universities close on a worldwide scale. Our College, like many other institutions, has had to shift from more traditional lecture-based approaches to teaching online.

This week we discussed a recent systematic review of flipped classroom empirical evidence from different fields, addressing the gaps and future trends. In flipped classrooms, students attempt to learn and comprehend the instructional content before attending class, through video-recorded lectures.

Here’s what they did

The authors searched electronic databases, namely ScienceDirect, EBSCOhost Web, Emerald Insight, Wiley Online Library and Springer Link for studies using keywords flipped classroom, flipped learning, flipping the class, inverted classroom and inverted learning in different fields and published in 2017 and 2018.

The authors reviewed and analysed 48 studies that met their criteria using content analysis. They explored positive impacts and challenges of implementing flipped classrooms.

Here’s what they found

Four major themes emerged:

  • students’ academic achievement,
  • learning motivation and/or engagement,
  • self-directed learning skills, and
  • social interaction.

The flipped classroom yielded positive impacts on students’ learning activities such as academic performance, learning motivation and/or engagement, social interaction and self-directed learning skills.

The most significant challenge encountered by instructors was a lack of student motivation to watch the pre-recorded video lectures or to study the content outside class time.

The author/s concluded

The findings suggest that the flipped classroom concept might be effective in pursuing 21st-century learning goals such as greater collaboration, more interaction, greater confidence in communicating ideas and, interestingly, a more democratic and equal learning space.

Our Journal Club’s views

Who are the authors of the paper and where do they work? The authors are affiliated to the Faculties of Education at the University of Hong Kong and the Ocean University of China.

What do we know about the journal? An international journal, On the Horizon, now in volume 28, explores the issues that are emerging as technology changes the nature of education and learning within and among institutions, other organizations, and across geo-political boundaries, as learning increasingly takes place outside of the traditional institutional environment. 

What about the methodology used? This was stated to be a systematic review using content analysis. The authors would have benefited from using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA), an evidence-based minimum set of items for reporting in systematic reviews and meta-analyses.

The inclusion and exclusion criteria were not explicit; studies were only eligible if they were indexed by the Social Sciences Citation Index (SSCI). The SSCI is a multidisciplinary index covering over 3,000 social sciences journals from 1988 to the present. It would have been helpful to have a PRISMA flow diagram and tables of included and excluded studies.

It is not clear what language restrictions, if any, were applied; whether studies were independently reviewed by the authors; what level of agreement there was; or how any disagreements were resolved.

The five summary tables of findings – four of positive impacts based on the themes identified, and one of challenges – were helpful to the reader.

The final section, on gamification, did not appear to be based on the included studies. It seemed to reflect more the personal opinions of the authors, drawing on the work of ‘a professor’ at the same university as most of the authors – who may have been one of the authors of this paper.

Our conclusion is that this evidence has a high risk of bias.

Implications for our practice

Covid-19 may become a catalyst for educational institutions worldwide to search for innovative solutions to the closure of campuses in a relatively short period of time.

It appears that the students who benefit from flipped classroom approaches are those with high self-efficacy and motivation. Rather than offering a blanket change in practice, could we do more to evaluate students’ self-efficacy, motivation, learning styles, digital literacy, and access to technology and connectivity? This would help target interventions to those who need support.

It is our anecdotal experience that gamification often appeals more to a younger cohort of students; this may be due to the format and content. Furthermore, cultural and language context should be taken into consideration.

It may be a cultural issue that many students expect to be talked at rather than taking responsibility for their own learning. This is reflected in evening-class students, who in general appear more motivated to undertake activity outside the classroom.

There is a widespread assumption that all students are able to access and engage with online material. This is not the case. Many students do not have access to wifi, have poor bandwidth and general connectivity issues, lack suitable mobile technology or the skills required to use it, and some face language barriers.

Next steps

  • Convene a virtual group (via MS Teams) of staff who have experience, or wish to gain experience, of using flipped classrooms. This will allow us to pool expertise and resources, identify any staff development needs, and develop a strategy to fulfil them.
  • Look at adding material as part of the new staff integration process.
  • Find new ways to develop students’ self-efficacy and motivation.
  • Consider introducing a ‘learning log’ for all staff to capture innovative practice, challenges and issues during this period of College closure.


What do you think?

References

Liberati, A., Altman, D. G., Tetzlaff, J., Mulrow, C., Gøtzsche, P. C., et al. (2009). The PRISMA Statement for Reporting Systematic Reviews and Meta-Analyses of Studies That Evaluate Health Care Interventions: Explanation and Elaboration. PLoS Medicine, 6(7): e1000100. https://doi.org/10.1371/journal.pmed.1000100

Zainuddin, Z., Haruna, H., Li, X., Zhang, Y., & Chu, S. K. W. (2019). A systematic review of flipped classroom empirical evidence from different fields: What are the gaps and future trends? On the Horizon. https://doi.org/10.1108/OTH-09-2018-0027



Categories
Digital technologies to enhance learning and teaching and assessment. Enabling student development and achievement Online learning Student engagement in learning Student experience Systematic Review Technology and digital literacies

Supporting Self-Regulated Learning in Online Learning Environments and MOOCs: A Systematic Review

Blog Authors: Tracey Howe, Lisa Shields, Sarah Janette Robertson, Walter Patterson, John McVeigh, David Cullen, Kate Cotter, Joe Wilson.

Here’s what they did

MOOCs (Massive Open Online Courses) enable learning to take place anytime and anywhere, and have created more accessible educational opportunities for the ‘masses’. There are, however, discrepancies between enrolment and completion rates in MOOCs, suggesting that learning online presents its own challenges and that learners may require support to succeed. Prior studies suggest that learners find this form of learning challenging because they do not effectively use self-regulated learning (SRL) strategies. Research thus far demonstrates that the provision of SRL strategies is likely to result in greater online academic success. The authors of this systematic review considered the role of SRL in online academic success and the influence of human factors (for example, motivation and experience), and aimed to investigate approaches to support SRL in online environments. Their goal was to use any insights gleaned to inform further research into the development of MOOCs.

The authors reviewed empirical studies in MOOCs and also included studies conducted in other online learning environments, using this as an ‘umbrella term’ to cover all related learning taking place on the internet. They sought to investigate the current provision and type of SRL support in online learning environments. Additionally, they examined the impact of human factors as identified and addressed in the selected studies. Their primary research question was to determine the effectiveness of approaches used to support SRL strategies in online learning environments, and whether these approaches take account of the role of human factors.

The authors followed the 5-step methodology of Khan, Kunz, Kleijnen and Antes (2003), which includes framing the question for the review, identifying and assessing relevant studies, summarising the evidence, and interpreting the findings. Keywords were decided upon and the literature search was conducted over the last 10 years using the databases Scopus, Web of Science, and ERIC. Searches via Google Scholar supplemented this, as did manual searches of reference lists. A total of 398 articles were selected after browsing their titles, and predefined inclusion criteria were applied to screen these results further. The remaining articles were then examined in full, separately, and discussed by 3 of the authors to determine the final selection of 35 studies for inclusion in the review.

Here’s what they found

The studies included in the review had been conducted at varying educational levels; of the 35 reviewed, 23 focused on undergraduate level. The studies also covered a wide range of subjects, e.g. psychology, chemistry, biology and medical sciences. Results were presented in 4 main sections – ‘Prompt’, ‘Feedback’, ‘Integrated support systems’ and ‘Other’ approaches – with a further section examining the human factors investigated.

Prompt – Prompting appears to be an effective approach to supporting SRL strategies and academic success. However, the studies reviewed explored varying modes of prompting intervention, and the prompts differed in range, operationalisation and measurement.

Feedback – Only 2 studies considered feedback alone as an approach to supporting SRL activities, so the authors found it difficult to reach a conclusion as to its efficacy in this context. When feedback and prompting were combined, results were more promising, but the authors recommended that more studies be undertaken to investigate the independent effect of feedback.

Integrated Support Systems – When students use the support systems made available to them, there are positive effects on their SRL strategies. However, these tools are only effective if they are actually, and appropriately, used. The authors point out that human factors could impact the uptake of these support systems, for instance if a student feels overwhelmed by the plethora of support offered.

Other – this section included results relating to approaches that could not be categorised under the preceding headings. These results were variable and not definitive in nature.

Human factors – 12 of the studies examined human factors, and the authors concluded that the effectiveness of an approach to supporting SRL and academic success depends on them. Findings suggest that additional or differential support should be in place to assist learners with different levels of prior knowledge and of cognitive and metacognitive ability. Approaches adapted to the different needs of learners will help them become better at regulating their own learning and thereby achieve greater academic success.

The author/s concluded

The authors concluded that it is important to encourage and assist learners to use the tools and strategies in place to support their SRL, as effective approaches cannot benefit learners who do not use them sufficiently and appropriately. Adaptive support should be provided to meet learners’ diverse learning needs, and the impact of human factors should be acknowledged when considering which approaches to use to support SRL strategies.

Our Journal Club’s views

Who are the authors of the paper and where do they work?

There are 6 named authors (some of whom are PhD students), whose backgrounds are split between educational psychology, focusing on motivational aspects, and technical contributors with a background in Web Information Systems. This instils confidence in the authors, as the topic of the study falls within their particular areas of interest.

What do we know about the journal?

This is a peer-reviewed journal: external experts review articles prior to acceptance for publication. It has an Impact Factor of 1.354, an increase of approximately 7% on last year. It is an established journal, although it did cease publication for 2 years in recent times.

What about the methodology used?

The title provides enough interest to invite one to read further, and the abstract is clear and informative about the paper’s subject matter and what the researchers set out to do. There is not a great deal of information available about this aspect of MOOCs: although it is recognised that there is a high drop-out rate, how to maintain a student’s presence in a MOOC is less well investigated. An abstract would normally give more detail about the authors’ methodology and results, along with some conclusions; these are somewhat missing here, bringing the abstract to an abrupt ending. Detail within the abstract is important, as many people will decide whether to read further, and sometimes whether to purchase the article, depending on its content.

When explaining their methodological process, the authors acknowledge the use of the 5-step process of Khan, Kunz, Kleijnen and Antes (2003), which was devised by experienced reviewers. However, the PRISMA reporting guidance postdates this process and could have been used to inform this systematic review. The researchers’ objectives and methodology were made clear, which gave us sufficient information to believe that the study could be replicated. When searching the literature they used Google Scholar as an adjunct, which was useful as it enabled them to access papers not yet included in the chosen databases; they also reported that they stopped searching Google Scholar after 200 papers were sourced, as no new data became available. They clearly stated the inclusion criteria, and the steps undertaken to conduct the review were fully explained and reported – thereby making their methodology evident.

Within the results section, the authors could have taken a more analytical stance and ‘stood back’ in order to appraise their chosen papers more objectively and succinctly. The approach taken simply amassed the information from the chosen papers and presented it back in summarised form, rather than offering an analytical discourse on the findings; only at the end of each section is there a structured and organised overview of the results. The authors thus described the results of each individual study rather than pooling them together. In addition, quality assessment of each individual paper was not evident, which meant each paper’s findings were given equal weighting, possibly erroneously. Within the general discussion, an attempt is made to relate the results back to the research questions, and this is done successfully. Some time is spent making recommendations for going forward, and the authors demonstrate an awareness of the limitations of their study. The conclusion, however, is abrupt and does not comprehensively reflect the findings; it is almost as if a conclusion had not actually been reached. The research was supported financially by one of the authors’ institutions – thus internally funded, which is not unusual.

Our conclusions are – that this evidence has a moderate risk of bias. There is no evidence of quality assessment of the papers included in the review, so each paper was credited with equal weighting when drawing conclusions.

Implications for our practice

This paper has led us to consider the necessity of using integrated support systems in our online teaching and learning environment, taking note of the individual learning needs of our various students – maturity, prior learning, existing knowledge, skills and experience, level of study, and so forth. Introducing the support intervention of combined prompt and feedback seems a sound path to follow. However, we realise that the impact of the ‘human face’, discussion and debate should not be underestimated. The flexibility of online learning has been reinforced, and this is something we should consider given the changing demography of our student population.

Next steps

It would be useful to ascertain the existing support for SRL within the College and, using that as a baseline, make plans to build on it. A ‘toolkit’ might be an advantageous asset, but developing the right support for SRL is imperative. The categories and headings used in the paper might be something we could use to build and develop our toolkit, and the planned roadshows with curricula teams would be a medium through which to disseminate this information to teaching and support staff. At the moment, supporting students’ learning and participation in MOOCs is not a priority, but supporting them to become self-directed students within the online environment is something to focus on in planning for the next academic year.

What do you think?

References

Wong, J., Baars, M., Davis, D., Van Der Zee, T., Houben, G. and Paas, F. (2019). Supporting Self-Regulated Learning in Online Learning Environments and MOOCs: A Systematic Review. International Journal of Human–Computer Interaction, 35(4–5), 356–373.

Khan, K. S., Kunz, R., Kleijnen, J., & Antes, G. (2003). Five steps to conducting a systematic review. Journal of the Royal Society of Medicine, 96(3), 118–121.

Keywords: learning, environments, review, moocs, online, self-regulated, supporting, systematic.

Our Blog Posts are written by staff at City of Glasgow College to inform and inspire our practice. We meet together at the Journal Club to consider the latest evidence to provide insights on hot topics related to learning and teaching, quality assurance and subject needs. It forms part of our activity for General Teaching Council Scotland registration and Professional Standards for lecturers in Scotland’s Colleges demonstrating that we are a self-critical staff community.

Categories
Cohort study Delivery and assessment of the curriculum Digital technologies to enhance learning and teaching and assessment. Enabling student development and achievement Online learning Student engagement in learning Student experience Technology and digital literacies

Digital Literacy in Higher Education: engagement with e-tutorials using blended learning

Blog Authors: Tracey Howe, Anthony Adams, Angus Hynd-Gaw, John McVeigh, Sarah Janette Robertson, David Cullen, Lisa Shields, Claire Roberts.

Here’s what they did: The researchers conducted a case study project aimed at developing interactive digital skills E-tutorials as an integral part of selected undergraduate and postgraduate programmes. Nine interactive E-tutorials were devised collaboratively between instructors and students and then embedded within the curricula. The authors then sought to evaluate the students’ experience of, perceptions of, and engagement with these E-tutorials, and to explore respondents’ general attitudes to online learning. This was operationalised using the survey method: a 23-item questionnaire comprising open and closed questions was delivered via Survey Monkey. The survey population consisted of 274 students from undergraduate (1st and 2nd year) and postgraduate programmes; 86 students responded (a response rate of 31%).

Here’s what they found:

Factors affecting user engagement with digital learning were highlighted. These included challenges such as browser incompatibility, uneven sound quality, and internet connectivity issues – all of which disrupted learning.

Students’ perceptions of the role of online learning within their programme were identified: E-tutorials were perceived as a valuable asset for reiterating classroom learning, notably for revision purposes, and as a valuable resource enabling students to learn at their own pace and in their own time. They were accessible, easy to use, and of appropriate duration.

Overall, respondents expressed enjoyment of this form of learning but highlighted a preference for a blended learning approach. Respondents did not want to forego ‘face to face’ teaching within the classroom environment entirely.

The author/s concluded: Interactive digital learning should be strategically embedded at defined points of undergraduate and postgraduate programmes. This would reinforce other forms of learning and skill development. Appropriate support is required for successful and effective online learning – for example, the speedy resolution of any technical glitches – in order to avoid a detrimental online experience.

Our Journal Club’s views

Who are the authors of the paper and where do they work? Claire McGuinness, Assistant Professor, Deputy Head of School and Director of Undergraduate Programmes in the School of Information and Communication Studies, University College Dublin, Dublin, Ireland and Crystal Fulton, Associate Professor, University College Dublin, Dublin, Ireland. Our view is that both are credible researchers and authors.

What do we know about the journal? The Journal of Information Technology Education: Innovations in Practice is an academically peer-reviewed journal (thus papers published within it have undergone peer review), but it does not appear in the Impact Factor table (it is not part of Thomson Reuters), although it is mentioned in other indices. This does not necessarily suggest that papers within it are unimportant. The journal seems to be a vehicle for the publication of ‘early’ work, i.e. new ideas, initial findings, innovations and pilot studies. It is also an international publication and is well established, this paper being taken from volume 18. We therefore have confidence in the journal itself.

What about the methodology used? The title is attractive (encouraging people to read on) and informative, making the content of the paper self-evident to the reader. The abstract is extremely comprehensive, and quite lengthy in comparison with other papers reviewed; this may be because the journal imposes no word count constraints. It is also well structured, with the use of subheadings. The introduction and extensive literature review fully demonstrate that the development and implementation of the E-tutorial project were evidence based, and the objectives of the study are clear and explanatory.

The case study research approach used was a pragmatic one, as the study unfolded within a ‘real life’ context. Data were collected using a descriptive survey approach, yielding both textual data and descriptive statistics. The questionnaire had undergone multiple iterations and revisions before being distributed, showing that an attempt had been made to review and revise it fully. No detail was provided as to whether an objective reviewer had also been used to verify its reliability and validity. Detail of the analysis of the qualitative data was provided (hand coding followed by a line-by-line constant comparative approach), but here too an independent reviewer could have been employed to verify the findings. Ethical considerations and approval were achieved via the appropriate channels. In terms of the data collected, however, full details of the number of respondents to different questions of the survey were not provided, and the make-up of the respondents in relation to their respective courses was not always clear.

Our conclusions are – that this evidence has a medium/low risk of bias.

Implications for our practice: From a City of Glasgow College perspective we need to consider how much experience and expertise our students have of online, multi-media learning, especially during the current situation.

Next steps: It would be useful to audit students’ digital literacy and online learning skill development to identify the skill base and level of competency they have. Checking the internet availability of our student population is also an important factor to ascertain. This would then provide an evidence based baseline on which to devise and deliver skill development and digital literacy training at the appropriate level.

What do you think?

References: McGuinness, C. and Fulton, C. (2019). Digital Literacy in Higher Education: A Case Study of Student Engagement with E-Tutorials Using Blended Learning. Journal of Information Technology Education: Innovations in Practice, 18, 1–28. https://doi.org/10.28945/4190

Keywords: blended learning, digital literacy, e-learning, e-tutorials, higher education, online learning, online tutorials

Categories
Dementia Online learning Systematic Review Technology and digital literacies

E-learning as valuable caregivers’ support for people with dementia – A systematic review.

Blog Authors: Tracey Howe, John McVeigh, Lisa Shields, Walter Patterson, David Cullen, Sarah Jannette Robertson, Lynn Brown.

Here’s what they did

This is a systematic review of accessible peer-reviewed papers retrieved from three reputable databases. The focus of the review was to study whether eLearning could be an informal support tool for Informal Caregivers of people living with dementia. It aimed to identify both the benefits and the limitations of this tool.

Here’s what they found

eLearning in its various forms helped Informal Caregivers feel more confident about dementia care. eLearning:
• enhanced their knowledge and skills
• relieved perceived stress
• enhanced feelings of empathy and understanding.

The author/s concluded

The use of eLearning as a support tool for Informal Caregivers may have some potential, but training in its use is required to enable Caregivers to fully utilise eLearning platforms.

Our Journal Club’s views

Who are the authors of the paper and where do they work?

All of the authors are reputable (Associate Professors, a Professor and a Department Head); between them they are responsible for 1500 publications and have been cited over 50,000 times. Our view is that this is a potentially trustworthy publication.

What do we know about the journal?

The journal BMC Health Services Research has been in existence since 2001; it is in the top 25% of medical journals, has an impact factor of 1.932 and is Open Access. Our view is that this is a reputable journal.

What about the methodology used?

The authors performed a systematic literature review based on a focused aim, which could perhaps have been articulated with more clarity. They chose relevant keywords to conduct their search, but the concept of eLearning could have been widened with alternative descriptive search terms; the search terms were therefore mostly relevant, as were the inclusion and exclusion criteria. The choice of focusing only on older adults with dementia was a little misleading, as ‘older adult’ was not defined and dementia also affects younger adults. They also used the ‘backward search’ strategy. However, the results were presented in a rather descriptive, narrative manner rather than being analytical, and there was no evidence that the quality of the literature sources had been critically appraised. There was, however, recognition that the results were consistent with other similar studies. The limitations of the paper include: only a small number of studies relating to the topic of enquiry; wider, more encompassing search terms could have been used; only English-language papers were considered; there was a lack of detailed data from each of the studies reviewed; and no defined outcome measures were identified for the review.

Our conclusions are – that this study has a high risk of bias and even though a paper is published in a highly rated journal it may still contain some flaws.

Implications for our practice

We will consider the organisation and provision of distance, remote and eLearning from the College’s perspective.

Next steps

Review and reflect on the distance learning provided by the College.

Consider the scaffolding required to fully support online learning, both for students and staff.

Raise the profile and awareness of Dementia.

What do you think?

References

Klimova, B., Valis, M., Kuca, K. et al. (2019). E-learning as valuable caregivers’ support for people with dementia – A systematic review. BMC Health Services Research, 19, 781. https://doi.org/10.1186/s12913-019-4641-9

Keywords: eLearning, Dementia, Caregivers, Support tool, benefits, limitations

Categories
Assessment and feedback Delivery and assessment of the curriculum Digital technologies to enhance learning and teaching and assessment. review Student engagement in learning Technology and digital literacies

Student-generated video creation for assessment?

Blog Authors: Fiona Balloch, Jan Robertson, John McVeigh, Lisa Shields, Joe Wilson, Tracey Howe

Image: Photo by Hermes Rivera on Unsplash

Student-generated video creation assessments are an innovative and emerging form of assessment in higher education. Academic staff may be reluctant to transform assessment practices without robust evidence of the benefits, a rationale for doing so, and some guidance on how to do so successfully. JISC have recently published Future of assessment: five principles, five targets for 2025, which states: ‘In a move away from the traditional essay or exam, assessments are building in authenticity by asking students to develop websites, set up online profiles, shoot and edit videos, and use social media.’

We explored the idea, with reference to the article Student-generated video creation for assessment: can it transform assessment within Higher Education? published in the International Journal of Transformative Research, 2018.

Here’s what they did

They searched literature and conducted a thematic analysis related to the use of student-generated video for assessment.

Here’s what they found

For successful use of video creation for assessment:

  • Align the video creation task to both the learning outcomes and the skills development required for graduate capabilities in the relevant industry.
  • Ensure technological support, resources and infrastructure are all in place.
  • Have an intentional change management process to support both staff and students in the transition to a new assessment format.
  • Involve students in generating clear guidance for the assessment and in developing an assessment rubric.

The author/s concluded

Video assessment is beneficial for students’ digital communication skills and an effective and enjoyable method of assessment.

Our Journal Club’s views

Who are the authors of the paper and where do they work? At the time of publication the authors are Ruth Hawley and Cate Allen, who work at the University of Derby. Our view is that the authors may be biased in favour of video assessment, in order to provide evidence to support an initiative taking place within their own institution.

What do we know about the journal? The fully refereed International Journal of Transformative Research does not seem to be live yet and will be issued for the first time in Fall 2020. Our view is that the journal does not meet its stated aims: it says that articles should explore transformative impact, but this is not the case in this article.

What about the methodology used? The research could not easily be replicated from the level of detail provided in the paper. In addition, the findings lack critical analysis; our view is that this evidence is inconclusive and biased. It lacks a rationale for the use of video assessment and guidance on how it can be used effectively.

Our conclusions are – that this evidence has a high risk of bias.

Implications for our practice

  • How can Nautical courses extend their use of asynchronous video assessment with international students?
  • How could the COGC Health suite integrate video assessment into assessments?
  • How could issues such as trolling, and confidence with one’s own image on video be addressed through digital communication skills training?
  • How could YouTube and Flipgrid be used for assessment?
  • It is easier for assessors to view videos asynchronously, at a time of the assessor’s choosing, than to assess a large run of live events one after the other.
  • Training and support are available through the College Learning and Teaching Academy.

Next steps

Create a working group to pursue this topic in the College with a view to group-creation of a paper on this area.

View from

What do you think?

References

Hawley, R. and Allen, C. (2018). Student-generated video creation for assessment: can it transform assessment within Higher Education? International Journal of Transformative Research.

Keywords: assessment, video, student-generated, Higher Education, digital, technology

Categories
Assessment and feedback Delivery and assessment of the curriculum Digital technologies to enhance learning and teaching and assessment. qualitative quantitative Technology and digital literacies

Rubrics in Vocational Education

Blog Authors: Tracey Howe, John McVeigh, David Cullen, Walter Patterson, Ian Hamilton

Image by Cleonard1973 / CC BY-SA (https://creativecommons.org/licenses/by-sa/4.0)

Our College delivers vocational training that frequently uses observation-based assessment. However, we realise that for this to be reliable, fair, and practicable it also needs to demonstrate consistency across assessors (quality assurance) and to involve decisions about the range and number of observations of performance required to make a reliable judgement about competence. The notion of using rubrics is being explored, and we looked at the paper ‘Electronic Rubrics Design to Assess Student Competence in Vocational Education’.

Here’s what they did. Using Design-Based Research, they aimed to develop an instrument containing a rubric for food and beverage service practice in vocational education that is valid, practical, and effective. The three stages were: 1) identification and analysis of problems; 2) development of a prototype program; 3) testing and implementation of the prototype program.

They explored the needs of 4 food and beverage service lecturers from different universities and 30 culinary education students at the Indonesian Education University. This defined the concept of the evaluation tools to be made, and validity was explored using the views of 2 specialist subject matter experts and 1 assessment expert.

Data collection involved interviews and questionnaires, and the data were analysed using descriptive statistics.

Here’s what they found.

  • Food and beverage service lecturers have never created nor applied an assessment rubric.
  • Students on the food and beverage service programme do not know the assessment tools used by lecturers.
  • The researchers designed a task performance guide that students can use in practical work.
  • The performance criteria for the task and the performance assessment (rubric) showed a good degree of validation.

The author/s concluded

The study produced two instruments: a performance task guide for students carrying out food and beverage service lab work, and a performance assessment in the form of an electronic rubric serving as practical competency guidelines. The results of the development were validated through expert discussions using the Aiken index coefficient.
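For readers unfamiliar with it, the Aiken index (Aiken’s V) summarises how far a panel of expert ratings for an item sits above the bottom of the rating scale, with values near 1 indicating strong agreement that the item is valid. A minimal sketch follows; the function name, the 1–5 scale and the example ratings are our own illustration, not taken from the paper:

```python
def aikens_v(ratings, lo=1, hi=5):
    """Aiken's V content-validity coefficient for a single item.

    ratings: the score each expert rater gave, on a scale from lo to hi.
    Returns a value between 0 (all raters at the scale floor) and
    1 (all raters at the scale ceiling).
    """
    n = len(ratings)
    c = hi - lo + 1                     # number of rating categories
    s = sum(r - lo for r in ratings)    # total distance above the scale floor
    return s / (n * (c - 1))

# e.g. three hypothetical experts each rate a rubric criterion 4 out of 5
print(aikens_v([4, 4, 4]))  # 0.75
```

A criterion would typically be retained when its V exceeds a chosen threshold for the given number of raters.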

Our Journal Club’s views

Who are the authors of the paper and where do they work? All authors work at Universitas Pendidikan Indonesia (UPI), The Education University.

What do we know about the journal? This paper was published as part of proceedings from the 1st Vocational Education International Conference (VEIC 2019).

What about the methodology used? The main problem with the paper is that English is clearly not the authors’ first language, which results in a lack of clarity throughout. The methodology is unclear, and the subsequent analysis, results and conclusions are all difficult to interpret.

Our conclusions are – that this evidence has a high risk of bias.

Implications for our practice

There are a number of individuals and programme teams across our College developing and using electronic rubrics, including in Beauty and Culinary Arts, on ‘Moodle’, our VLE platform. It was felt that rubrics give students more standardised feedback, helping them understand their performance.
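The core of an electronic rubric is simple enough to sketch in a few lines: criteria mapped to level descriptors, with the score and feedback derived mechanically from the assessor’s choices. The criteria and descriptors below are invented for illustration and are not taken from the paper or from our Moodle setup:

```python
# Hypothetical rubric: each criterion maps level numbers to descriptors.
RUBRIC = {
    "table setup": {1: "incomplete", 2: "mostly correct", 3: "fully correct"},
    "service sequence": {1: "steps missed", 2: "minor slips", 3: "accurate"},
}

def score(observed_levels):
    """Turn an assessor's per-criterion level choices into a total score
    and standardised feedback lines."""
    total = sum(observed_levels[c] for c in RUBRIC)
    feedback = [f"{c}: {RUBRIC[c][observed_levels[c]]}" for c in RUBRIC]
    return total, feedback

total, feedback = score({"table setup": 3, "service sequence": 2})
print(total)        # 5
print(feedback[1])  # service sequence: minor slips
```

Because every student is marked against the same descriptors, the feedback is standardised in exactly the sense discussed above.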

A key area to explore is that of ‘meta skills’, as these are cross-disciplinary in nature and could provide a core methodology and consistency of approach.

Next steps

  • College staff currently developing or using rubrics could showcase their work at forthcoming internal events and conferences.
  • We could propose a work package on rubrics as part of the current institutional review of assessment and feedback
  • Create a working group of interested individuals
  • Ask OD and COPTE for staff development in this area
  • Look at the Skills Development Scotland meta skills

What do you think?

References

  • Muktiarni, M. et al. (2019) ‘Electronic Rubrics Design to Assess Student Competence in Vocational Education’, in 1st Vocational Education International Conference (VEIC 2019). Atlantis Press, pp. 257–261. doi: 10.2991/assehr.k.191217.042.

Keywords: rubrics, assessment, competence, vocational, college
