There is nothing more practical
than a good theory (Kurt Lewin)
Hiltraut Paridon: Evidence-based teaching and learning
Paul Swuste & Frank van Dijk: How to assess the quality of academic safety and health education?
Evidence-based teaching and learning
Professor of Medical Education, Carl Remigius Fresenius Education Group
Evidence-based practice has been a topic in the health and safety professions for many years. It usually involves occupational medicine, industrial hygiene, or safety science. But what about the topic of "teaching and learning"? And what is actually meant here by "evidence-based practice"?
Paridon & Krause (2022) described the components of an educational decision; they are shown in the graphic. Teachers gain experience over time, e.g., with different teaching methods, but also with how to deal with different learners. For example, they develop certain ideas about how best to encourage quiet learners to participate in the classroom or, conversely, how those who always push to the fore can learn to be considerate of the quieter ones as well. This expertise of teachers is called internal evidence. Internal evidence likewise includes the goals and perceptions of the learners, who bring different motivational backgrounds with them and whose behavior is already shaped by their learning experiences at school. Framework conditions such as equipment and legal and curricular requirements must also be considered in evidence-based practice. Beyond these areas, however, it is essential for an evidence-based approach to take scientific findings into account and implement them. Here we are dealing with the results of research on teaching, and thus primarily with pedagogical-psychological questions. This is referred to as external evidence. The focus is not on the subjects taught, but on the science of teaching and learning.
The design of education, such as the choice of social forms or teaching and learning strategies, can thus be (more or less) evidence-based. The prerequisite for such teaching is appropriate knowledge of current research results, especially in the field of pedagogy and psychology, and the willingness to take note of them and implement them in one's own teaching and learning behavior.
Teachers should therefore be able to regularly take note of findings from science and research on pedagogical-psychological questions. Furthermore, they must be qualified to implement these scientific results in their own teaching and to enable learners accordingly, e.g., to use appropriate learning strategies. Since it cannot be assumed that research findings automatically translate into concrete teaching and learning activities, it would be desirable to develop a further training programme for interested teachers, who would then act as multipliers. By illustrating scientific findings with concrete implementation scenarios that show how findings can be prepared for practical settings, they could enable other teachers to engage in evidence-based teaching and learning.
Paridon, H. & Krause, A. (2022). Evidenzbasiert Lehren und Lernen: Einstellungen von Lehrkräften zu einer evidenzbasierten Unterrichtspraxis. PADUA, 17, 1-7.
An interview with Hiltraut Paridon
1. You distinguish between internal evidence, based on the experiential knowledge of trainers and teachers, and external, theory- and research-based evidence. Now, integrating safety and health into education and training is primarily about raising awareness of safety and health issues on the one hand and, on the other, making safety and health principles guide action at both the individual and the organisational level. Why do I need theory and empirical research here? Isn't it always the experiential knowledge of trainers and teachers that is decisive for planning and implementing learning processes? Isn't it always the expertise of the head teacher on site that is decisive for the successful development of the educational institution?
Of course, experience is important, but it should not be relied on exclusively. This applies both to the learning processes and to the content conveyed. With regard to the content, trainers should know the current state of knowledge on safety and health - for example, which work materials are hazardous to health or which ergonomic aids are particularly suitable for averting health risks. With regard to the learning processes, there is a risk that we as trainers use teaching and learning processes that lead to a certain learning success, but fail to recognise that other processes would have been far more successful. An example: there is the learning-styles myth, which states that people learn particularly well when the learning material is presented in a certain sensory modality. In fact, research shows that it is best to present the learning material in different modalities so that it can be linked in a variety of ways in the brain. If you present the learning material in only one modality at a time, the learners can of course still learn something, and so they believe they are successful with it. However, they do not see that they could have learned better had they used multiple modalities.
2. How do you explain the gap between a basically positive attitude towards evidence-based training and teaching and the actual implementation, in our case the consideration of pedagogical-psychological research results on teaching and learning by OSH trainers? What are the biggest barriers to the implementation of pedagogical-psychological research findings on teaching and learning by OSH trainers or by teachers and headmasters who want to make their school healthier and safer?
There are various reasons for this gap. On the one hand, it is very time-consuming to stay up to date with the latest research: you have to do a lot of literature searching and reading, and current research literature sometimes costs a lot of money. In addition, not all trainers are able to evaluate the relevant literature - i.e., to judge whether a study is methodologically sound. Scientific work in engineering, for example, differs from that in educational science. Reading the research literature from the educational sciences requires a certain level of expertise, which not all trainers have. And then, of course, there is the implementation problem: even if I, as a trainer, know what research has found, I do not yet know how to implement those findings in my own teaching.
3. What added value do you see in integrating scientific findings into the actions of trainers and teachers?
It is about conveying insights in the best possible way so as to achieve sustainable learning, including the corresponding knowledge for action. The participants save time, and it is good for trainers to know that their training is effective.
4. In your article you focus on the pedagogical decision. That is a strongly cognitivist approach. Where do emotion and motivation come in here - factors that have to do with a change in attitude, with developing a relationship to the world and to oneself: I open up the world for myself and am opened up by it at the same time (see Hartmut Rosa)?
I would like to give a two-part answer to this:
1. From my point of view, motivation, emotion and attitude are included in several ways: first in the individual ideas, goals and expectations of the learners, second in the experiential knowledge of the teachers, and third in the research results, as a great deal of research exists on these topics - for example, on how emotions influence learning or how attitudes arise.
2. To state it clearly: I am referring to knowledge-based - and in this sense cognitive - content and skills, not to discussions of values and considerations of normative aspects. Incidentally, I think the latter are just as important - if, for example, a discussion is about how important safety and health are to us at work, then as the discussion leader I need knowledge about how discussions can be conducted well, and the ability to do so. A pedagogical decision in the sense I have described is not relevant for such an event, since values are the focus and it is not a teaching event but a discussion event.
5. What are the prerequisites for research-based practice, i.e. for the implementation / application of pedagogical research results? What promotes implementation, what hinders implementation? Using the current example: How can it be made easier for teachers and trainers to integrate research results on digital teaching and learning, but also on the effects of digitalisation on their own health and safety, into their pedagogical practice?
First of all, current scientific findings would have to be summarised and prepared in a way that is tailored to the target group. In the next step, these findings would have to be translated into possible application scenarios - i.e., the question: what do the research results mean for my own event, and what do I have to do to implement the findings? The research literature does not always provide direct recommendations for action, and experience has shown that many trainers have difficulty developing appropriate ideas. There are now a number of relevant publications in the area of university didactics - in the area of safety and health there is certainly still room for improvement. There are numerous free publications and handouts on many topics relevant to occupational safety - from asbestos to vibration. There is less on the topic of “evidence-based teaching and learning”.
6. What possibilities do you see to improve the evidence-based action of trainers and teachers?
In my opinion, it is important to first create awareness that, in addition to the technical content, the implementation of the training event itself should be evidence-based. This is where publications like this help. Otherwise, I think further training is useful in which the relevant knowledge and skills are imparted.
7. You recommend offering seminars that support interested trainers and teachers in implementing educational-psychological research results in their practice of teaching and learning. What exactly could such a seminar look like? What competences should teachers or trainers acquire in such a seminar? In your view, what role does the digitalisation of teaching and learning play here, both in terms of methods and content?
I would imagine several parts. One part deals with how research in educational science actually works - what different research designs there are and how they affect the validity of study results. A further part would convey the current scientific findings in a way tailored to the target group. The third part is about implementation: possible application scenarios could be developed here. The first two parts could be delivered digitally, as they largely involve knowledge transfer; this can usually be implemented relatively easily online, although it is important to involve the participants using various tools. Since interactions and ideas in the digital space always remain limited, face-to-face events are recommended for the third part. This is currently being discussed under the term “serendipity” – that is, making happy and unexpected discoveries by chance. Serendipity can be very helpful when looking for ideas for possible implementations, and it is more likely to arise when we are present together.
8. In your view, is evidence-based practice a prerequisite for creating a culture of prevention, for example a school culture in which safety and health are an integral part of pedagogical practice?
In my view, a culture of prevention refers to values and normative aspects. Ultimately, it is about the question "What world do we want to live in, and what are we willing to do for it?" Questions about values cannot be answered based on evidence, or only to a limited extent. A school's decision to live and work safely and healthily would always mean a restriction for individuals and their freedom of action (e.g., the decision that fast food will no longer be sold in the school cafeteria). Whether, or to what extent, a society wants that is a value decision. Here research results can help to predict the possible consequences of different options.
Thank you very much!
The interview was conducted by Dr Ulrike Bollmann
How to assess the quality of academic safety and health education?
Associate Professor emeritus, Safety Science Group, Delft University of Technology, The Netherlands
Frank van Dijk
Professor emeritus, Coronel Institute of Occupational Health, University of Amsterdam, The Netherlands. Learning and Developing Occupational Health Foundation (LDOH)
Assessing the quality of academic safety and health education poses some major challenges. At present, these courses are evaluated in one way or another, but unfortunately publications on this topic are rare. Presumably a lack of tradition, a lack of funds, a lack of rewards after publication, and financial constraints play a role. For this paper we would like to focus on another reason: a lack of consensus on what to evaluate. Evaluation may be limited to the responses of trainees, focus on the educational goals of the course, or focus on the facilities provided by the course management, to name just a few options. This contribution will stress the importance of assessing the quality of safety and health education, with special reference to the transfer of academic knowledge and skills to the trainees' companies and to the workers under their care.
What is quality?
A possible definition of quality of education is: ‘Quality of safety and health education is the degree to which organisations, providing these educational courses, will reach educational goals which are consistent with current professional and academic knowledge, and trainees will apply this knowledge for workers asking advice or care, and for the companies or organisations’. This definition implies that educational goals, ‘learning objectives’ or ‘learning outcomes’ should be set beforehand.
Assessing quality by output levels
Literature on quality of education frequently refers to different levels of quality assessment, which are simple and easy to understand:
Level 1 Reaction: Do trainees like the programme? The trainees' evaluation is based on the assumption that a satisfied student will learn more and better than one who is not satisfied. Most educational programmes use this perspective for their course evaluation; a 2011 survey on post-academic education in safety and health in Europe supports this conclusion.
Level 2 Learning: Do trainees understand facts, principles, theories, models, approaches presented? Classroom activities as individual performance, quizzes, discussions and written tests are evaluation techniques to assess actual learning. Many programmes have some sort of examination, either at the end of the programme or several times during the programme.
Level 3 Behaviour: Do students apply the models, tools and approaches of the programme in their jobs? An evaluation may include a pre- and post-education survey, preferably some time after finishing the education, say six months.
Level 4 Results, or impact: Are workers, companies or organisations safer or healthier as a result of the activities of the students or postgraduates who successfully finished their education? Such an evaluation implies one or more measurements of safety or health, or of decisive intermediary risk or behaviour factors (absence or reduction of toxic materials, use of protective equipment).
Assessing quality by adapted levels
The output levels presented may suggest a relation between the levels (see Figure 1a). But trainees often lack the necessary understanding and knowledge of the topic presented and will primarily judge the (attractive) form of the programme rather than its content. Measuring the reaction of trainees therefore does not evaluate learning. Some teachers communicate very well without offering much content, or even while teaching unreliable content.
On the level of learning, most examinations test knowledge. In only a few examinations or evaluations are skills and attitudes assessed. In occupational medicine, skills in medical interviewing (e.g., occupational history taking) and in physical examinations can be tested. The attitudes of medical students towards the specialist field of occupational medicine can be evaluated before and after education with a validated questionnaire. A complication is that evaluation tools are, in practice, mostly restricted to 'internal tools' which only monitor the reactions of trainees and individual teachers.
An assessment of the trainees’ behaviour on their jobs is rarely performed by educational organisations.
Like behaviour, results or impact are rarely assessed to measure the quality of the education given. Accident and incident frequencies are indicators for safety; of course, studies should be aware of biased safety outcomes. Using accidents as the only indicator can be unreliable, as this indicator is subject to all sorts of variations. Accident processes, or more specifically accident scenarios, and the quality of measures to prevent accidents might be better indicators. Another example is the incidence of occupational and work-related diseases before and after education, e.g., skin diseases, musculoskeletal complaints and diseases, or burnout.
Assessing quality by content and process levels
The quality levels commonly used are mainly output and outcome oriented. They lack an evaluation of the content and processes offered in the course. Content levels refer to the relevance of the learning content, to the state-of-the-art of the knowledge provided by the course, and to the quality of the teachers. A course organiser should have an overview of the subject areas taught in order to select teachers who have up-to-date knowledge of the subject areas most relevant to the trainees and who can give the trainees feedback on their learning. Process levels can be divided into 1/ the relevance and quality of the selected educational activities and learning materials: do these conform to the learning objectives, and are they complete and valid; and 2/ the quality of the learning by the trainees themselves, such as interactive learning and learning by doing. Are all trainees involved in active learning?
The relevance of the original quality levels has been questioned. For instance, a positive reaction of trainees does not imply learning in the sense that the trainees have understood the principles, models, essential facts, theories and techniques taught. Therefore, charts b and c in Figure 1 do not have an arrow between reaction and learning.
Also, the relationship between learning and behaviour on the job is not obvious. Therefore, in the literature from the 1990s onwards, more emphasis has been placed on the concept of transfer of education (Figure 1c). Transfer of education is the degree to which trainees effectively apply the knowledge, skills and attitudes gained in education in their jobs. Apart from the personal motivation of the trainee, supporting factors in the trainee's company or organisation will also affect this transfer.
Transfer to job settings
Learned behaviour should be in accordance with the actual job conditions of the trainee. Education should therefore be connected to the practical settings of the trainees, including teaching awareness of the conditions needed for acceptance of interventions and accounting for possible resistance to change. Incorporating the working environment into education, or vice versa, has proven effective. Unfortunately, much of the education fails to transfer to job settings. Transfer of learning must be evaluated by assessing whether the educational goals or learning objectives have been met. A well-known problem in academic undergraduate education is the lack of actual job experience. Various techniques are used to compensate for this, such as role playing, using representative situations from the daily life of the students, virtual reality, site visits and internships. For secondary and higher education in safety and health, but also in academic education, the focus of the transfer process can be more on practical aspects.
Transfer in (post)academic education differs from non-academic higher education in safety and health, due to its goal of teaching trainees not only ‘facts’ but also critical reflection. A postgraduate safety and health expert is a direct advisor of the chief executive officer (CEO) of a company or organisation. He or she should provide functional leadership to risk management processes, implement proactive safety and health management with colleagues, and be responsible for the quality of safety and health advice, having access to relevant and reliable expertise and sources. He or she should be independent, understand cross-border influences, and be able to analyse problems and provide solutions in new situations. Critical reflection implies a willingness to discuss divergent points of view on the topic concerned.
This requires an overview of models, metaphors and theories of safety and occupational health science in order to be able to analyse problems encountered at a meta-level.
Dominance of quality levels ‘reaction’ and ‘learning’
Behaviour and results (impact) are interdependent, since people tend to continue behaviours that are perceived to be effective even when this is not the case. Evaluating impact is difficult, and sometimes even impossible. For example, a pre-post study design, such as a comparison of safety records one year before and one year after the education, may show a decrease in figures; yet a causal relationship between the education and the accident figures remains questionable, due to statistical variability and various forms of bias. In an interrupted time series design, a series of measurements is performed before and after, followed by a trend analysis. Another possibility to evaluate results is, for example, to focus on good functional relationships between middle managers and frontline workers. There are further caveats to the levels proposed: many studies that have evaluated education report different effects at different levels. Because of the difficulties in assessing levels three and four, often due to the disinterest of the organisations in which course trainees work, the evaluation of education remains mostly limited to the first two levels, reaction and learning.
As noted in the introduction, the topic of the quality of safety and health training and education - whether undergraduate, higher or postgraduate education - has been neglected in the literature. Publishing in professional or scientific journals takes effort, and success is more likely when support is provided by, for instance, educational departments. We believe that exchanging experience through publications can shed new light on the assessment of the quality of safety and health education.
Paul Swuste, associate Professor emeritus, Safety Science Group, Delft University of Technology, The Netherlands
Frank van Dijk, Professor emeritus, Coronel Institute of Occupational Health, University of Amsterdam, The Netherlands. Learning and Developing Occupational Health Foundation (LDOH)
This contribution is based upon research, including an extensive list of references, and published as:
1. Swuste P, Dijk F van (2018). Evaluation of quality of academic safety, health and environment education. ENETOSH Factsheet 4. https://www.enetosh.net/files/enetosh_files/Publications/Enetosh_Facts_2018_04.pdf
2. Swuste P, Galera A, Wassenhove W van, Carretero-Gómez J, Arezes P, Kivisto-Rahnasto J, Forteza F, Motet G, Reyniers K, Bergmans A, Wenham D, Broeke C van der (2021). Quality assessment of postgraduate safety education programs, current developments with examples of ten (post)graduate safety courses in Europe. Safety Science 141: 105338.