Categories
Delivery and assessment of the curriculum; Digital technologies to enhance learning and teaching and assessment; Enabling student development and achievement; Online learning; qualitative; quantitative; Student engagement in learning; Student experience; Technology and digital literacies

Distance Learners’ Use of Handheld Technologies

Blog Authors: Walter Patterson; Lynn Brown; Jan Robertson; Joe Wilson; Tracey Howe

Image: Matthew Hurst from New York, NY, USA / CC BY-SA (https://creativecommons.org/licenses/by-sa/2.0)

One only has to walk through the College, or along any street or through any social space, to notice that the majority of people are engrossed in some activity on the screen of their handheld technology. How can we utilise this technology to best effect for learners at our College? We explored the findings of a recently published paper, Distance Learners’ Use of Handheld Technologies: Mobile Learning Activity, Changing Study Habits, and the ‘Place’ of Anywhere Learning.

Here’s what they did

Undergraduate students enrolled at the UK’s largest distance learning university were surveyed. This included questions about: (a) ownership of technologies; (b) frequency of use of handheld devices (tablet, e-readers, and smartphones) for specified leisure activities and for specified learning activities; (c) locations at which each device is used for study purposes; (d) perceived change in study habits; (e) statements about impact of use on learning; (f) reason for purchase; (g) length of time used; (h) benefits and challenges; and (i) preferences for future use of each technology for learning. Open comment questions were added to probe the types of learning used in distance learning contexts, reasons for use or non-use, and the locations of use. Students were asked separately about their use of tablets, smartphones, and e-readers so potential differences in use could be analysed.

There were 446 responses from 3,000 students, giving a response rate of 14.9%. All age groups, study levels, and disciplines were represented. A wide range of analytical methods was used to analyse the data.

Here’s what they found

Five key findings are:

  1. most students now use handheld devices for study-related learning;
  2. the distribution of study-related learning tasks was similar in all seven study places;
  3. there is a strong, statistically significant correlation between the number of study places in which handheld devices are used and the number of study task types performed;
  4. two fifths of students using a handheld device for learning noticed a change in study habits and a benefit to learning;
  5. multiple regression analysis shows that three variables (number of study places, number of study tasks, and change in study habits) are predictors of finding it easier to access learning materials and of improved quality of learners’ work.
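To make the regression finding concrete, here is a minimal sketch of what such a multiple regression looks like. All variable names and data below are invented for illustration (the paper's actual dataset and coding are not reproduced here); it simply shows three predictors entering an ordinary least squares fit.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical data mirroring the paper's three predictor variables.
n_places = rng.integers(1, 8, n).astype(float)      # study places used
n_tasks = rng.integers(1, 12, n).astype(float)      # study task types
habit_change = rng.integers(0, 2, n).astype(float)  # habits changed? (0/1)

# Simulated outcome: a Likert-style "ease of access" score driven by
# the three predictors plus noise.
ease = (1.0 + 0.3 * n_places + 0.2 * n_tasks + 0.8 * habit_change
        + rng.normal(0, 1.0, n))

# Ordinary least squares via a design matrix with an intercept column.
X = np.column_stack([np.ones(n), n_places, n_tasks, habit_change])
coef, residuals, rank, _ = np.linalg.lstsq(X, ease, rcond=None)
# coef roughly recovers the simulated effects [1.0, 0.3, 0.2, 0.8]
```

With real survey data the outcome and predictors would of course come from the questionnaire responses rather than being simulated.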

The author/s concluded

The study concludes by proposing two new concepts: the flow of places and place of space. These should help direct the framing of future studies into the places, spaces, and mobility of formal and informal seamless learning.

Our Journal Club’s views

Who are the authors of the paper and where do they work? Walter and Joe declared an interest as in the past they had both worked with Prof Sharples (author of over 300 papers and founder of the Association for Mobile Learning).

What do we know about the journal? The International Review of Research in Open and Distributed Learning (IRRODL) is a refereed, open-access e-journal that disseminates original research, theory, and best practice in open and distributed learning worldwide. Club members judged it to be very reputable, with all articles subjected to double-blind peer review by a minimum of two subject experts. According to Google Scholar, in 2019 the journal ranked third among educational technology journals and fifth among all education journals.

What about the methodology used? The abstract was viewed by some to be concise and accurate, but others thought that it presented an overcomplicated approach to some straightforward research questions. The abstract also lacked an explanation of some important concepts in the paper (e.g. flow of places).

Some felt that the readability of the paper was impacted by its layout on the page, with the text looking dense. It was also noted that there appeared to be a predisposition towards adopting a particular framework, and that this may have influenced the approach taken and the analysis.

The introduction is rather lengthy and would be improved by more explicit subheadings in the text. It also introduces complex ideas, some of which are not fully addressed in this paper. It was also thought unusual for research questions to appear in the middle of an introduction rather than at the end.

It was also noted that the population for the study is very niche, since it is the specific domain of distance learners, who are highly likely also to be mobile learners. This does not map well onto the type of students who attend COGC.

The careful definition of what constitutes a handheld device was appreciated, and the context and background of the study were well explained.

There was an extended discussion of the response rate (14.9%). There was a danger that these respondents were self-selecting, holding a definite view one way or the other on the topic of mobile learning. There was concern that we knew nothing about the 85% who did not respond (e.g. their traits and attributes).

The over-representation of older students was noted, even though the initial sample was stratified according to key factors (good). The survey design itself was well developed and demonstrated best practice. The choice of categories for place seemed reasonable but had only been piloted with six people. The inclusion of PCs and laptops was welcomed, as it offered a good comparison between truly mobile devices and others, as was the recognition in the survey design that mobiles could be used for other (distracting) activities as well as study-related ones.

No qualitative results were presented (they are to be published separately), which was somewhat disappointing. The presentation of the results could have been more informative with the inclusion of actual sample numbers rather than just percentages. The tables and charts allowed a clear understanding of the study outcomes, and the statistical evidence was well presented.

There are strong connections between the number of different study places where mobile devices are used and the number of different tasks for which they are used. The distribution of study-related tasks was fairly even across all the different study places. Some (40%) of respondents noted a change in study habits and improvement to learning through the use of mobile devices.

Our conclusion is that this evidence has a low risk of bias.

Implications for our practice

A study of learning designs across the college revealed a predominance of content delivery. There was scope to implement better quality learning designs that drew on varied tasks, many of which could be supported by mobile learning (ABC of Learning Design). The college could move in this direction so that students were encouraged more to participate in mobile learning.

It was commented that changing the tone of voice in some module content (e.g. a personalised voice) had proved to engage learners better, so this could also be incorporated.

Next steps

It was intimated that one response to COVID-19 could be to move more learning to mobile and that staff would be given help to do this in the coming weeks.

View from

What do you think?

References

Cross, S., Sharples, M., Healing, G., & Ellis, J. (2019). Distance Learners’ Use of Handheld Technologies: Mobile Learning Activity, Changing Study Habits, and the ‘Place’ of Anywhere Learning. The International Review of Research in Open and Distributed Learning, 20(2). https://doi.org/10.19173/irrodl.v20i2.4040

Keywords: mobile learning, seamless learning, study space, handheld learning technologies, anywhere learning, distance education.

Our Blog Posts are written by staff at City of Glasgow College to inform and inspire our practice. We meet together at the Journal Club to consider the latest evidence to provide insights on hot topics related to learning and teaching, quality assurance and subject needs. It forms part of our activity for General Teaching Council Scotland registration and Professional Standards for lecturers in Scotland’s Colleges demonstrating that we are a self-critical staff community.

Categories
Delivery and assessment of the curriculum; Digital technologies to enhance learning and teaching and assessment; Enabling student development and achievement; Online learning; qualitative; quantitative; Student engagement in learning; Student experience; Technology and digital literacies

Students’ insights on the use of video lectures in online classes

Blog Authors: Walter Patterson, John McVeigh, Jan Robertson, Joe Wilson, Tony Adams, Tracey Howe

Image: Iase.bodh / CC BY-SA (https://creativecommons.org/licenses/by-sa/4.0)

This week our College closed in response to the Covid-19 epidemic sweeping across the UK. We selected this week’s paper, Students’ insights on the use of video lectures (VL) in online classes, to help us explore options for remote learning for our students. The overarching question motivating this research focused on students’ perception of their own learning in courses using VL.

Here’s what they did

This was a mixed-methods study using a survey and a focus group as sources of data, combined with a review of previous research on the topic. Selection criteria for participants included: (1) graduate and undergraduate students enrolled in online courses for Business and Education majors in the 2014–15 academic year; (2) online courses that included VL of any type; and (3) instructors’ approval to explore the design of the online courses. Of the 493 eligible students (424 undergraduates and 69 graduates), 96 were recruited: 10 graduates and 86 undergraduates.

The online survey consisted of 18 questions that focused on 5 main categories: overall experience as online students, interaction with VL, perceived learning impact and integration of VL with other course activities. The focus group was administered via a web conferencing system.

Data from the focus group were analysed qualitatively only. Basic descriptive statistics and graphical analysis were performed on the quantitative data.

Here’s what they found

Three factors predict students’ satisfaction and their perception of the relevance of VL to their own learning:

  • familiarity with the media,
  • the extent of their experience of using video in learning, and
  • educational level or academic status.

The author/s concluded

This study suggests that courses in higher education should consider the inclusion of VL in their course materials because the use of video meets different learners’ preferences, increases students’ engagement with content, enhances students’ perception of better learning experiences through content interaction, and reinforces teaching presence in online courses.

Our Journal Club’s views

Who are the authors of the paper and where do they work? Dr Norma I. Scagnoli is the senior director of eLearning in the College of Business and holds a position of Research Associate Professor in the College of Education at the University of Illinois. The other authors are also based there.

What do we know about the journal? BJET has just published its 50th anniversary edition. The journal is published by Wiley on behalf of the British Educational Research Association (BERA). Impact factor: 2.588 (2018). ISI Journal Citation Reports ranking (2018): 31/243 (Education & Educational Research).

BJET is a long-standing journal, but it is held to be academic in its focus, and its treatment of EdTech is quite different from current ALT publications, which tend to focus on ‘real world’ matters. BJET articles were characterised as abstract, nuanced, and distanced from real-world practice.

What about the methodology used?

The abstract did not offer much of an explanation of the journal article; it was more of a teaser to read further. At first sight the ‘practitioner notes’ (included in a box) appeared to give a clear explanation of the context, methods and outcomes of the study, but this was revisited later.

The research questions were easy to find and the methodology was deemed to be appropriate (mixed methods using surveys, focus groups). The inclusion of the survey questions and the focus group questions in the appendix was appreciated.

However, a significant weakness of this paper is that it provides no indication of the context of the undergraduate or postgraduate students who participated. Participants were clearly self-selecting from the purposeful sample, with a high risk of bias in the qualitative data. The fact that only 10 graduate students participated made some of the conclusions and analysis suspect. Also, the inclusion of graduate students made the analysis and results more complex than was required for this study.

It would have been good to have some demographics of the participants, in terms of age, experience, and access to technology, so that some comparisons could be made with the student cohort at our College.

Nor was it possible to make sense of the data in the light of the particular population sampled. It was agreed that it would be unsafe to make these conclusions generalisable. It was also noted that the use of a particular online tool (Zoom) for the focus group could have excluded some from participating.

The statistical analysis was over-complicated: a number of different tests had been carried out on the data to uncover where some statistical significance might arise, rather than proposing a hypothesis and using the data to confirm or reject it (which is how statistical analysis should be performed).
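The risk of running many tests can be illustrated with a small simulation, entirely invented here and not taken from the paper: testing pure noise twenty times at α = 0.05 will tend to produce about one spurious "significant" result by chance alone, which is why a correction such as Bonferroni, or a single pre-stated hypothesis, is needed.

```python
import math
import random

random.seed(42)

def z_test_p(sample):
    """Two-sided p-value for H0: mean == 0, with known sd == 1."""
    n = len(sample)
    z = (sum(sample) / n) * math.sqrt(n)
    return math.erfc(abs(z) / math.sqrt(2))

# Twenty independent "studies" of pure noise (no real effect anywhere).
trials = [[random.gauss(0, 1) for _ in range(30)] for _ in range(20)]
p_values = [z_test_p(t) for t in trials]

# Uncorrected: count results that look significant at alpha = 0.05.
false_positives = sum(p < 0.05 for p in p_values)

# Bonferroni: divide alpha by the number of tests to control the
# family-wise error rate; this prunes most chance findings.
bonferroni_hits = sum(p < 0.05 / len(p_values) for p in p_values)
```

Any result that survives only without the correction is exactly the kind of "uncovered" significance the club was wary of.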

In the discussion of the results it was noted that the percentage figures could be quite misleading and that the absolute numbers should have been included. Also, the choice of dark columns for the very small graduate sample meant that the reader’s eye was drawn to these figures rather than to those of the much larger sample of undergraduate students. A question was also raised about the meaning of the term “Effect on learning” in the analysis.

It was concluded that any statements made about differences between graduate and undergraduate experiences and satisfaction were untenable because of the small sample size for the former.

Returning to the practitioner notes, it was then realised that some of the statements made in the Notes were NOT established evidentially in this paper (even though they were seen to be reasonably correct).

Our conclusion is that this evidence has a high risk of bias.

Implications for our practice

The paper gives no hint of the practical difficulties of recording and broadcasting video lessons, where artefacts that can be freely used under copyright ‘fair dealing’ in the classroom become a breach of copyright when recorded for open broadcast. There are also issues around staff intellectual property rights in video material, which has been the subject of much negotiation and discussion between institutions and staff unions. The paper offers no advice on picking a way through such issues.

It was noted that the current Covid-19 crisis was driving delivery to online and that this had the potential to change the face of further education for the future.

Next steps

View from

What do you think?

References

Scagnoli, N. I., Choo, J., & Tian, J. (2019). Students’ insights on the use of video lectures in online classes. British Journal of Educational Technology, 50, 399–414. https://doi.org/10.1111/bjet.12572

Keywords:


Categories
Enabling student development and achievement; Lecturers competencies; qualitative; Quality assurance; quantitative

Lecturers Competence in Teaching and Learning

Blog Authors: David Cullen, Jan Robertson, John McVeigh, Tracey Howe

Image: Freepik (https://www.flaticon.com/authors/freepik) / CC BY-SA (https://creativecommons.org/licenses/by-sa/3.0)

The UK Quality Code for Higher Education states that ‘staff have an appropriate level of competence for teaching and supporting learning.’ The College’s Faculty of Nautical and STEM provides a full range of marine operations courses, from mandatory training for Merchant Navy Officers to a range of short courses. We were therefore interested in this recent paper, The Analysis of Lecturers Competence in Teaching and Learning Process of Cadets At Makassar Marine Polytechnic.

Here’s what they did

The study aimed to determine the competence of lecturers in the learning process of cadets at Makassar Marine Polytechnic (PIP Makassar). The research method was a survey describing existing conditions, using a questionnaire given to nautical, technical and management cadets. The four competency aspects studied were:

  • Pedagogic: using the right methods and media to create a good learning environment while providing guidance and motivation to cadets.
  • Lecturer personality: providing a good example, carrying out duty and authority in front of cadets, remaining polite when speaking, and dressing neatly.
  • Professionalism: managing the class and delivering training.
  • Social: communicating and interacting with parents/guardians of cadets as well as establishing a rapport with peers.

The study identified its main problem as the competence of lecturers in the teaching and learning process of cadets at PIP Makassar, investigated through a survey of a sample of 96 nautical, technical and management cadets.

The population of the study was 135 cadets, of which 65 were majoring in nautical studies, 55 in technical studies and 15 in management. Sampling with an error rate of 5% gave 96 cadets. The author collected data via questionnaire, interview and document review. The data were analysed using quantitative descriptive analysis with the frequency distribution formula:

P = f/n × 100, where P = percentage, f = frequency of respondents’ answers, and n = number of respondents

A scale was then used to measure the competence of lecturers.
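The percentage formula above can be sketched in a few lines. The response labels and counts below are invented for illustration; only the formula itself comes from the paper.

```python
from collections import Counter

def percentage_distribution(answers):
    """P = f/n * 100 for each answer option, per the paper's formula
    (P = percentage, f = frequency of an answer, n = respondents)."""
    n = len(answers)
    freq = Counter(answers)
    return {option: f / n * 100 for option, f in freq.items()}

# Hypothetical responses from 96 cadets rating one competency item.
responses = ["good"] * 52 + ["very good"] * 30 + ["fair"] * 14
dist = percentage_distribution(responses)  # e.g. "good" -> 54.17%
```

Each item's percentages would then be mapped onto the competence scale the author describes.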

Here’s what they found

  • Through the analysis of the responses, the author reported that the four aspects of competencies of lecturers measured were good or very good.
  • Interviews with three cadets, one from each of the majors identified, indicated that lecturers were able to understand students as individuals.
  • Lecturers were also deemed to have paid special attention to the biological, intellectual and psychological differences of their students in order to better understand them.

The author/s concluded

That the lecturers of PIP Makassar had met the required national education competency standards.

Our Journal Club’s views

Who are the authors of the paper and where do they work?

At the time of publication Endang Lestari worked in the Nautical Department of Politeknik Ilmu Pelayaran Makassar, South Sulawesi, Indonesia. As the author was a lecturer in the department, the study’s objectivity and independence were challenged, and a potential conflict of interest was raised regarding the anonymity of questionnaires in which cadets were responding about a lecturer.

What do we know about the journal?

The paper appeared in Advances in Economics, Business and Management Research, volume 75, the proceedings of the 1st International Conference on Materials Engineering and Management – Management Section (ICMEMm 2018). It was a conference paper submission, and it is unclear whether it was peer reviewed.

The literature review section was not linked to the aim of the study and offered no critical evaluation of the literature that contributed to the paper.

What about the methodology used?

The paper was more of a practitioner article than an academic study (“this is what we did and this is what we found”), with no further discourse. We thought that the rationale for the study was not clearly stated.

The absence of the questionnaire content left us to speculate about the number of questions asked, their weighting, the response format, and whether the questions were simplistically framed to produce the conclusion wanted rather than discovered.

There was no breakdown of the respondents’ data, such as gender, age, questions answered or subject major.

This evidence was very limited, with no general recommendations for further study. No ethical considerations were reported for the sample population or the interviewees (cadets and staff in the institute), and this introduced a high risk of bias. The body of the paper was poorly presented, with poor syntax, poor reporting, and a poor structure and style sandwiched between the abstract and the conclusion.

No strengths, weaknesses, self-critique or recommendations were forthcoming in the subsequent analysis of the responses.

Our conclusion is that this evidence has a high risk of bias.

Implications for our practice

A topical piece that unfortunately promised more than it delivered. It highlighted the importance of social rapport with students as a competence and led the discussion to the process of recruiting lecturers. Student rapport is not guaranteed in a candidate who has the required academic and industry experience; in fact, how lecturers’ interpersonal skills contribute to the team could be a better measure of social competence.

Next steps

There is the potential for the College to conduct a similar study, learning from the inadequacies of this one, and then carry out a compare-and-contrast critical evaluation. A future study could investigate the academic and vocational competencies required in delivering vocational courses alongside the demands of GTCS registration and the Professional Standards for Lecturers in Scotland’s Colleges.

View from

Joe to get a relevant external commentator

What do you think?

References

Keywords: competencies, professionalism, lecturer, cadets, learning process, questionnaire


Categories
Assessment and feedback; Delivery and assessment of the curriculum; Digital technologies to enhance learning and teaching and assessment; qualitative; quantitative; Technology and digital literacies

Rubrics in Vocational Education

Blog Authors: Tracey Howe, John McVeigh, David Cullen, Walter Patterson, Ian Hamilton

Image by Cleonard1973 / CC BY-SA (https://creativecommons.org/licenses/by-sa/4.0)

Our College delivers vocational training that frequently uses observation-based assessment. However, we realise that for this to be reliable, fair, and practicable it also needs to demonstrate consistency across assessors (quality assurance) and involve decisions about the range and number of observations of performance required to make a reliable judgement about competence. The notion of using rubrics is being explored, and we looked at the paper ‘Electronic Rubrics Design to Assess Student Competence in Vocational Education’.

Here’s what they did

Using Design-Based Research, the authors aimed to develop an instrument containing a rubric for food and beverage service practice in vocational education that is valid, practical, and effective. The three stages were: 1) identification and analysis of problems; 2) development of a prototype program; 3) testing and implementation of the prototype program.

They explored the needs of four food and beverage service lecturers from different universities and 30 culinary education students at the Indonesian Education University. This defined the concept of the evaluation tools to be made, and validity was explored using the views of two subject-matter experts and one assessment expert.

Data collection involved interviews and questionnaires, and the data were analysed using descriptive statistics.

Here’s what they found

  • food and beverage service lecturers had never created nor applied an assessment rubric;
  • students on the food and beverage service programme did not know the assessment tools used by lecturers;
  • the researchers designed a task performance guide that students can use during practical work;
  • the performance criteria for the task and the performance assessment (rubric) showed a good degree of validity.

The author/s concluded

The study produced a performance-task guide for student assignments, to support students in carrying out lab work, and a performance assessment consisting of an electronic rubric to serve as practical competency guidelines. The results of the development were validated through expert discussions, using the Aiken index coefficient.
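For readers unfamiliar with the Aiken index, here is a small sketch of Aiken's V content-validity coefficient as conventionally defined; the ratings below are invented, as the paper's actual scale and scores are not reproduced here.

```python
def aiken_v(ratings, lo=1, hi=5):
    """Aiken's V content-validity coefficient for one item.

    ratings: scores from the expert raters on a lo..hi scale.
    V ranges from 0 (no agreement on validity) to 1 (perfect).
    """
    n = len(ratings)
    c = hi - lo + 1  # number of rating categories
    s = sum(r - lo for r in ratings)  # summed distances from the floor
    return s / (n * (c - 1))

# Illustrative ratings from a three-person panel (e.g. two subject-matter
# experts and one assessment expert) on a 1-5 scale.
v = aiken_v([5, 4, 5])  # -> 0.9167 (rounded)
```

Items scoring above an agreed threshold would be retained in the rubric; low-scoring items would be revised or dropped.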

Our Journal Club’s views

Who are the authors of the paper and where do they work? All authors work at Universitas Pendidikan Indonesia (UPI), The Education University.

What do we know about the journal? This paper was published as part of proceedings from the 1st Vocational Education International Conference (VEIC 2019).

What about the methodology used? The main problem with the paper was that English was clearly not the authors’ first language. This resulted in a lack of clarity and understanding throughout. The methodology was unclear, and all subsequent analysis, results and conclusions were difficult to interpret.

Our conclusion is that this evidence has a high risk of bias.

Implications for our practice

There are a number of individuals and programme teams across our College developing and using electronic rubrics, including in Beauty and Culinary Arts, where they are used on ‘Moodle’, our VLE platform. It was felt that rubrics give students more standardised feedback, helping them understand their performance.

A key area where we could apply this is ‘meta skills’, as these are cross-disciplinary in nature and could provide a core methodology and consistency of approach.

Next steps

  • College staff currently developing or using rubrics could showcase their work at forthcoming internal events and conferences.
  • We could propose a work package on rubrics as part of the current institutional review of assessment and feedback
  • Create a working group of interested individuals
  • Ask OD and COPTE for staff development in this area
  • Look at the Skills Development Scotland meta skills

View from

What do you think?

References

  • Muktiarni, M. et al. (2019) ‘Electronic Rubrics Design to Assess Student Competence in Vocational Education’, in 1st Vocational Education International Conference (VEIC 2019). Atlantis Press, pp. 257–261. doi: 10.2991/assehr.k.191217.042.

Keywords: rubrics, assessment, competence, vocational, college

Our Blog Posts are written by staff at City of Glasgow College to inform and inspire our practice. We meet together at the Journal Club to consider the latest evidence to provide insights on hot topics related to learning and teaching, quality assurance and subject needs. It forms part of our activity for General Teaching Council Scotland registration and Professional Standards for lecturers in Scotland’s Colleges demonstrating that we are a self-critical staff community.
