Assessment of the Autonomous Learning Competence in Engineering Degree Courses at the Universitat Politècnica de Catalunya

Noelia Olmedo-Torre*, Oscar Farrerons Vidal

Escola d’Enginyeria de Barcelona Est. Universitat Politècnica de Catalunya·BarcelonaTech (Spain)

*Corresponding author

Received October 2016

Accepted March 2017

Abstract

We present a strategy for the acquisition and assessment of the autonomous learning competence, applied in the Graphic Expression in Engineering (EG) subject taught during the first quarter at the Escola Universitària d’Enginyeria Tècnica Industrial de Barcelona (EUETIB). The strategy combines the puzzle technique in the classroom with multiple-choice questionnaires on the virtual campus. The results show that this strategy enables the learning objectives of the subject to be acquired in a continuous manner.

 

 

Keywords – Autonomous learning, Puzzle, Questionnaires, Test, Engineering degrees.

 


 

1. Introduction

Subjects in the Universitat Politècnica de Catalunya·BarcelonaTech (UPC) Industrial Engineering degree courses include objectives related to both specific competencies and key skills (Torra, de Corral, Martínez, Gallego, Portet & Pérez, 2010). The UPC has defined seven key skills of its own, one of which is autonomous learning (UPC. Consell Social, 2008). Improvements in computer-based data processing have led to the inclusion of new assessment strategies in the classroom, among them multiple-choice questionnaires (Cano García, 2008). These strategies measure what students do and what they know how to do by themselves; although they provide very useful information, they should also help students learn more and/or better (formative assessment) (Kallas & Ornat, 2014; Pastor, 2011).

Planning of the new degree courses should take into account three vital factors:

  • learning centred on the student,

  • the achievement of goals based on planning and skills, and

  • the assessment and monitoring of teaching activities using the European Credit Transfer System (ECTS) (López Pastor, 2012).

In accordance with this approach, the academic staff design learning activities based on the formative objectives to be reached, guide students through the learning process and, finally, apply an assessment strategy that enables the competencies to be acquired (Martínez, Amante & Cadenato, 2012). For their part, students carry out the planned activities and participate in and build their own learning process as well as its evaluation (self-assessment and peer assessment). Appropriate programming of the activities enables the formative objectives set out in the curriculum to be achieved.

According to the methodological and assessment guidelines for teachers at the UPC, every degree course should address the key skills, among them autonomous learning. This competence is defined as the students’ generic ability to detect deficiencies in their own knowledge and to overcome them by means of critical reflection and the selection of the best way of extending this knowledge.

Each key skill at the UPC is defined through three levels of attainment, with formative objectives defined for each level. For autonomous learning, the three levels help us to design activities, and each corresponds to a particular period of the degree course (UPC, 2009). They are as follows:

  • Level 1, Directed: Carry out assigned tasks within the time estimated for their completion, working with the recommended sources of information in accordance with the guidelines set out by the teacher.

  • Level 2, Guided: Carry out the tasks assigned on the basis of the guidance provided by the teaching staff; decide on the amount of time required for the completion of each task, as well as including personal contributions and extending the recommended sources of information.

  • Level 3, Autonomous: Apply the knowledge acquired in order to undertake a task in terms of its suitability and importance; decide on the way to carry it out and the time required to complete it, as well as selecting the most appropriate sources of information.

The subject Graphic Expression in Engineering (Expressió Gràfica a l’Enginyeria, EG) is taught during the first quarter of all the industrial engineering degree courses at the UPC Escola Universitària d’Enginyeria Tècnica Industrial de Barcelona (EUETIB). The autonomous learning competence is assigned to this subject at Level 1. Students are therefore assessed on the time taken to solve each problem, on whether they have solved it correctly, and on whether they have given any thought to the applicability of the content.

This paper deals with the strategy employed in this subject for the acquisition and assessment of the autonomous learning competence. The strategy combines the puzzle technique in the classroom with multiple-choice questionnaires used as an online tool for the individual assessment of knowledge of standardization, technical drawing and spatial geometry.



2. Methodology

The puzzle technique is characterized by creating positive interdependence between students, since it distributes the learning tasks among all of them and encourages individual responsibility for justifying not only their own contribution to the task but also that of the other members of the group (Martínez & Cadenato, 2010; Aronson, Blaney, Stephan, Sikes & Snapp, 1978; Rodríguez, Fargas, Llumà, Jorba & Salán, 2012).

Each member of the Core Group studies the objectives assigned for the topic autonomously over the course of a week. Later, in the classroom and before giving their explanation, each student hands in a written summary or outline (portfolio) to the other members of the group, so that everyone builds up a compendium of the topics studied throughout the course. In this classroom session, each member of the group has 10 minutes to explain the contents of the assigned objectives to their companions, using their summary as support; in this way the members of the Core Group learn about the other parts of the topic from the explanations given by their groupmates.

This technique is used for autonomous learning of the theoretical content of the subject and is applied in 3 groups (M22, M51 and M61), each group consisting of 30 students. The activity has a Core Group structure with three members identified by the letters A, B and C. Different specific objectives are assigned to each student in the group. Figure 1 provides an example of the specific objectives of the first assessment session for the subject.

 

Figure 1. Specific objectives for each student in session 1

These objectives cover the entire topic to be addressed in the session; the corresponding explanations are given in class one week later in the group setting.

Table 1 details the specific objectives for each student in session 1, indicating the subject and competence addressed, a description of the topic and the bibliographical sources.



| Subject | Code | Competence | Description | DAP (Bibliography) | Book (Bibliography) |
|---|---|---|---|---|---|
| Normative concepts | 0.2-1 | Knowledge | Define objectives of standardization | Views.pdf (page 1-5) | Topic 1 |
| Normative concepts | 0.2-2 | Knowledge | Enumerate most important standardization entities | Views.pdf (page 1-5) | Topic 1 |
| Normative formats | 0.3-1 | Knowledge | Enumerate basic standardized formats | Views.pdf (page 6-10) | Topic 1 |
| Normative formats | 0.3-2 | Knowledge | Describe relation between basic standardized formats | Views.pdf (page 6-10) | Topic 2 |
| Normative formats | 0.3-4 | Knowledge | Enumerate worksheet | Views.pdf (page 6-10) | Topic 2 |
| Normative formats | 0.3-5 | Knowledge | Enumerate compulsory elements of the title block | Views.pdf (page 6-10) | Topic 2 |
| Normative formats | 0.3-6 | Knowledge | Enumerate complementary elements of a title block | Views.pdf (page 6-10) | Topic 2 |
| Normative scales | 0.4-1 | Knowledge | Define the concept of scale | Views.pdf (page 11-12) | Topic 2 |
| Normative scales | 0.4-2 | Knowledge | Enumerate the basic standardized scales | Views.pdf (page 11-12) | Topic 2 |
| Normative views | 0.5-1 | Knowledge | Enumerate general rules of view selection | Views.pdf (page 13) | Topic 3 |
| Normative views | 0.5-1.1 | Knowledge | Enumerate all the main views in orthographic projection | Views.pdf (page 14-19) | Topic 3 |

Table 1. Specific objectives detailed in session 1



The individual portfolio provides assurance that progress is tracked weekly. At the head of the first page of each summary, the teacher indicates the following: “Time allotted by the teacher – Time taken by each student = (+/-) Difference”. These values enable the work time allotted to each student to be adjusted in order for the task to be carried out autonomously. These individual portfolios are collected on completion of the course and form part of the formative assessment (Gilbuena, Sherrett, Gummer, Audrey & Koretsky, 2015).

At the end of the activity, each student is required to complete a multiple-choice questionnaire on the UPC virtual campus (Atenea) (Del Canto Rodrigo et al., 2010); this questionnaire covers all the objectives addressed in the group, and students may use their summaries as support.

Each questionnaire consists of 6 multiple-choice questions selected at random from a question bank; the order of the answer options is randomized for each student, and the maximum time allowed for submitting the completed questionnaire is 10 minutes.

Each question has four possible answers, only one of which is correct, and wrong answers are penalized: each correct answer is worth 0.5 points and each incorrect answer deducts 0.3 points, so the highest possible mark is 3 and the lowest is -1.5 (recorded as 0 in the final mark). Students may leave an answer blank, which neither adds to nor subtracts from the mark. At the end of the assessment test, each participant may consult the number of correct and incorrect replies as well as the final mark.
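As a minimal illustration of this scoring rule (not part of the tooling used in the study), the sketch below computes the raw and recorded mark for a single questionnaire; the function names are our own, while the clamping of negative totals to zero follows the rule stated above.

```python
def questionnaire_score(correct: int, wrong: int, blank: int = 0) -> float:
    """Raw score for one 6-question test: +0.5 per correct answer,
    -0.3 per wrong answer; blank answers neither add nor subtract."""
    assert correct + wrong + blank == 6, "each questionnaire has exactly 6 questions"
    return 0.5 * correct - 0.3 * wrong


def recorded_mark(raw: float) -> float:
    """Negative raw scores (down to -1.5) count as 0 in the final mark."""
    return max(raw, 0.0)


# Example: 4 correct, 1 wrong, 1 blank -> raw score 1.7, recorded mark 1.7
print(recorded_mark(questionnaire_score(correct=4, wrong=1, blank=1)))
```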

Figure 2 provides an example of some of the questions in the assessment questionnaire.

 

Figure 2. Assessment questionnaire

There is a total of 10 assessment questionnaire sessions, which account for 28% of the final mark for the subject, as may be seen in the following formula:

Final mark = 0.5×1stP + 2×2ndP + 1.5×3rdP + 0.6×C + 0.6×EE + 2×PROJ + 2.8×EP

Where:

1stP = 1st Partial
2ndP = 2nd Partial
3rdP = 3rd Partial
C = Sketch book
EE = Tutorials
PROJ = Project
EP = Questionnaires

In addition to the final mark for the subject, each student receives a final mark for his or her autonomous learning competence, which is added to their academic record. This mark, like all the others, is updated regularly throughout the course so that students can keep track of their progress. It is a weighted sum of two components of the final grade: the questionnaires (EP) and the subject project (PROJ), which is carried out in groups of 3 or 4 students:

Autonomous learning mark = 0.85×PROJ + 0.15×EP
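As an illustration of how these two weightings combine, the following sketch applies the formulas above; the variable names are assumed for readability and the code is not taken from the study, which assumes all components are expressed on a common scale.

```python
def final_mark(p1: float, p2: float, p3: float, c: float,
               ee: float, proj: float, ep: float) -> float:
    """Final mark for the subject; the weights sum to 10, so the
    questionnaires (EP) contribute 2.8/10 = 28% of the total."""
    return 0.5 * p1 + 2 * p2 + 1.5 * p3 + 0.6 * c + 0.6 * ee + 2 * proj + 2.8 * ep


def autonomous_learning_mark(proj: float, ep: float) -> float:
    """Mark for the autonomous learning competence (project 85%, questionnaires 15%)."""
    return 0.85 * proj + 0.15 * ep
```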



3. Results and discussion

The percentage of students who took the assessment tests during the second quarter of 2015 in the three groups making up the study was 96% (86 participants out of a total of 90). Table 2 shows the weeks of the course in which the questionnaires were administered, the numerical code of each questionnaire and the topics assessed, the total number of responses per group, and the averages (out of 3) for the three groups in the study.

2015, 2nd Quarter

| Week | Questionnaire | Topics assessed | Responses per group (M22+M51+M61) | Average M22 | Average M51 | Average M61 |
|------|---------------|-----------------|-----------------------------------|-------------|-------------|-------------|
| 2 | EP 21 | Norms, lines, scales and views | 30+29+29 = 88 | 2.0 | 1.7 | 1.4 |
| 3 | EP 31 | Dimensions | 30+30+29 = 89 | 1.5 | 1.1 | 1.1 |
| 4 | EP 41 | Cuts and cross sections | 30+29+29 = 88 | 1.5 | 1.5 | 1.3 |
| 6 | EP 61 | Threaded elements | 29+28+28 = 85 | 1.7 | 1.5 | 1.9 |
| 7 | EP 71 | Conicity, surface finishes and dimensional tolerances | 30+28+29 = 87 | 2.4 | 1.9 | 1.8 |
| 8 | EP 81 | Geometric tolerances and settings | 30+28+28 = 86 | 1.6 | 2.0 | 2.2 |
| 9 | EP 91 | Standardized elements | 29+28+29 = 86 | 1.9 | 1.6 | 2.5 |
| 11 | EP 111 | Spatial geometry | 30+29+28 = 87 | 0.8 | 0.9 | 1.1 |
| 12 | EP 121 | Metrics and geometric synthesis | 30+27+28 = 85 | 1.2 | 1.2 | 2.0 |
| 13 | EP 131 | Surfaces | 30+25+28 = 83 | 1.5 | 1.2 | 2.1 |
| Averages for all topics per group | | | | 1.61 | 1.46 | 1.74 |

Table 2. Group assessment averages (3 is the highest mark)

The results in Table 2 show that the strategy enables most of the learning objectives of the subject to be acquired, as can be observed in the per-group averages across all the topics. It can also be seen that the topics in which students obtained the best results in all groups correspond to EP71 (Conicity, surface finishes and dimensional tolerances) and EP91 (Standardized elements), whereas EP111 (Spatial geometry) is the topic yielding the lowest results.

Table 3 compares the averages of the final grades in the three groups with the averages of the marks for autonomous learning in the same groups. It can be seen that the latter are higher than the final overall grades for the subject.



| Group | Final grade EG | Final autonomous learning mark |
|-------|----------------|--------------------------------|
| M51 | 5.6 | 7.1 |
| M61 | 6.5 | 7.7 |
| M22 | 6.1 | 7.4 |

Table 3. Marks for the three groups (final subject grades and final autonomous learning marks)

An anonymous online survey was carried out at the end of the course with the three groups in the study, using Google Drive® forms, in order to analyze the degree of satisfaction with the subject. The survey was of the SEEQ type (Students' Evaluation of Educational Quality) (Corral, Almajano & Domingo, 2008; Marsh & Roche, 1970), a highly effective educational assessment tool whose data serve both to improve the process (formative assessment) and to validate the quality of teaching and learning (summative assessment) (Valero-García & Díaz, 2005).

The survey was conducted with all the students (a total of 90) belonging to groups M22, M51 and M61. The aspects regarded as most closely related to the degree of acquisition and assessment of the autonomous learning competence were extracted from the survey, such as enthusiasm, interaction with the work group and the appraisal of the assessment methods, among others.

Students were asked to respond in the following terms: Very much in agreement (5), In agreement (4), Neutral (3), In disagreement (2) and Very much in disagreement (1). Table 4 shows the questions posed, the weighted averages and the standard deviation of the responses.

The survey results were processed with the IBM SPSS v19 Solutions for Education® statistical software package. The variables obtained were analyzed using descriptive statistics (frequencies) in order to determine the percentages for each variable, and contingency tables were used to analyze significant correlations in the cross-tabulations of variables that were of interest for the study.
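The study used SPSS; purely as an illustration of this kind of analysis (frequency percentages and a contingency-table test), the sketch below runs the equivalent steps in Python on hypothetical Likert responses, since the original survey data are not reproduced here.

```python
# Illustrative only: hypothetical Likert responses (1-5) for two survey items,
# analogous to the frequency and contingency-table analysis run in SPSS.
import pandas as pd
from scipy.stats import chi2_contingency

survey = pd.DataFrame({
    "attended_and_participated": [5, 4, 5, 3, 4, 5, 4, 5, 2, 4],
    "assessment_was_fair":       [4, 4, 5, 3, 3, 5, 4, 4, 2, 4],
})

# Descriptive statistics: percentage of respondents at each response level
print(survey["attended_and_participated"].value_counts(normalize=True).mul(100).round(1))

# Contingency table (cross-tabulation) and chi-square test of independence
table = pd.crosstab(survey["attended_and_participated"], survey["assessment_was_fair"])
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}, dof = {dof}")
```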



| SEEQ survey item | Weighted average (out of 5) | Standard deviation |
|------------------|-----------------------------|--------------------|
| Enthusiasm: I have attended the lectures regularly and have participated actively in the sessions of group work. | 4.7 | 0.67 |
| Interaction with the group: The work groups are a good way of studying, learning about the subject and sharing knowledge and ideas. | 3.9 | 1.05 |
| Interaction with the group: We have attended every session with the necessary material prepared. | 4.4 | 0.74 |
| Exams: The assessment methods for this subject are appropriate and fair. | 3.6 | 0.93 |
| Exams: The exam contents and other work assessed match both the contents of the course and the emphasis placed by teachers on each topic. | 3.8 | 0.84 |
| Exams: I think the teachers have assessed my work fairly. | 4.0 | 0.82 |
| Exams: The approach to course assessment has helped me to learn the academic content better. | 3.6 | 0.8 |
| Exams: The assessments correspond to the subject objectives as they were set out at the beginning. | 4.0 | 0.7 |

Table 4. SEEQ survey questions, weighted averages and standard deviations

It can be observed that the two most highly valued aspects are enthusiasm, followed by attendance at the sessions with the material prepared (interaction with the group), both of which are highly appreciated features of the learning process.

The survey shows that 75% of the students stated that they participated actively in the work sessions and attended the lectures regularly; this figure rises to 98% when the responses "in agreement" and "very much in agreement" are considered together (Figure 3). Likewise, 39% of the students agreed that the work groups were a good way to study, to learn about the subject and to share knowledge and ideas, a percentage that rises to 71% if the students who responded that they were very much in agreement are also taken into account (Figure 4).



 
 

Figure 3. I attended the lectures regularly and participated actively in the work group sessions

Figure 4. The work groups were a good way to study, to learn about the subject and to share knowledge and ideas



It is worth pointing out that, as shown in Figure 5, 64% of the students believe that the content of the exams and the work assessed corresponds to the contents of the course, while 61% are in agreement or very much in agreement that the assessment methods of the subject are both fair and appropriate (Figure 6).

It is also evident from the survey that half the students agree that the approach to assessment is appropriate, although 33% do not express a clearly defined position on this point (neutral). In addition, 55% of the students think that the assessments are in accordance with the objectives set out for the subject at the beginning, a figure that rises to 78% when the responses "in agreement" and "very much in agreement" are considered together.



 
 

Figure 5. The content of the exams and the work assessed corresponds to the contents of the course and to the emphasis placed on each topic by the teachers

Figure 6. The methods of assessment for this subject are fair and appropriate



Furthermore, the results show that significant correlations exist between attendance and active participation in the classes, the perception that the teachers evaluated the students' work fairly, and the fact that the students attended the sessions with the material already prepared. The contingency analysis also reveals a significant correlation between the perceived fairness of the assessment methods and the approach adopted for the assessment of the course, both of which were conducive to the teaching-learning process.



4. Conclusions

The use of the puzzle technique in the classroom and the virtual campus multiple-choice questionnaires as a strategy for the acquisition and assessment of autonomous learning enables the learning objectives of the subject to be fulfilled in a continuous manner.

Most of the students in the study participated actively in the work sessions and are of the opinion that the work groups are an effective means of attaining the objectives established at the outset in the different topics of the subject.

As a proposal for improvement, the different questionnaires will be checked in order to identify the questions yielding the lowest results, and the possible reasons for this poor performance will be analyzed in order to improve the learning process so that the objectives can be attained, whether by means of classroom-based activities or by distance learning.

This proposal for improvement will form part of a final project of the Postgraduate Course in Sciences, Technology, Engineering and Mathematics (Posgrau d’Ensenyament Universitari en Ciències, Tecnologia, Enginyeria i Matemàtiques-STEM, Institut de Ciències de l’Educació - ICE UPC http://www.ice.upc.edu).



Acknowledgments

We would like to thank all the students who took part in this survey on the degree of satisfaction with the EG subject, as well as the Grup d'Avaluació de la Pràctica Acadèmica (GRAPA), the ICE and the UPC for their help in the preparation of this article.



References

Aronson, E., Blaney, N., Stephan, C., Sikes, J., & Snapp, M. (1978). The Jigsaw Classroom. Beverly Hills, CA: Sage Publications.

Cano García, E. (2008). La evaluación por competencias en la educación superior. Profesorado: Revista de Currículum Y Formación Del Profesorado, 12(3), 1-16.

Corral, I., Almajano, M., & Domingo, J. (2008). La encuesta SEEQ como instrumento de mejora continuada. Universitat Politècnica de Catalunya. Retrieved from:  https://upcommons.upc.edu/bitstream/handle/2117/16014/280.pdf

Del Canto Rodrigo, P., Gallego Fernández, M.I., López Canalda, J.M., Mora Serrano, F.J., Reyes Muñoz, M.A., Rodríguez Luna, E., et al. (2010). Cómo usamos Moodle en nuestras asignaturas adaptadas al EEES. IEEE-RITA (Revista Iberoamericana de Tecnologías del Aprendizaje), 5(3), 75-86. Retrieved from: http://hdl.handle.net/2117/8916

Gilbuena, D.M., Sherrett, B.U., Gummer, E.S., Audrey, B., & Koretsky, M.D. (2015). Feedback on Professional Skills as Enculturation into Communities of Practice. Journal of Engineering Education, 104(1), 7-34. https://doi.org/10.1002/jee.20061

Kallas, Z., & Ornat, C. (2014). Technological or Traditional Tools for Documents’ Correction? A Case Study in Higher education. Journal of Technology and Science Education, 2(2), 86-93. https://doi.org/10.3926/jotse.48

López Pastor, V.M. (2012). Evaluación formativa y compartida en la universidad: Clarificación de conceptos y propuestas de intervención desde la Red Interuniversitaria de Evaluación Formativa. Psychology, Society & Education, 4(c), 117-130.

Marsh, H., & Roche, L. (1970). SEEQ Students’ Evaluation of Educational Quality: Multiple Dimensions of University Teacher Self-concept. Instructional Science, 8(5), 439-469.

Martínez, M., Amante, B., & Cadenato, A. (2012). Competency assessment in engineering courses at the Universitat Politècnica de Catalunya in Spain. World Transactions on Engineering and Technology Education, 10(1), 46-52.

Martínez, M., & Cadenato, A. (2010). El puzzle como actividad de evaluación en el aula en grupos numerosos. In XVIII CIDUI.

Pastor, V.M.L. (2011). Best Practices in Academic Assessment in Higher Education. Journal of Technology and Science Education, 1(2), 25-39. https://doi.org/10.3926/jotse.2011.20

Rodríguez, D., Fargas, G., Llumà, J., Jorba, J., & Salán, M.N. (2012). Learning Experiences of the GidMAT-RIMA Group with Materials Engineering Students in Autonomous Learning and Working in Teams Generic Skills. Procedia - Social and Behavioral Sciences, 46, 4369-4373. https://doi.org/10.1016/j.sbspro.2012.06.256

Torra, I., de Corral, I., Martínez, M., Gallego, I., Portet, E., & Pérez, M.J. (2010). Proceso de integración y evaluación de competencias genéricas en la Universitat Politècnica de Catalunya. REDU. Revista de Docencia Universitaria, 8(1), 201-224.

UPC. (2009). Resumen sobre las competencias genéricas a implantar en los planes de estudio de grado de la UPC. Institut de Ciències de l’Educació - ICE-UPC. Retrieved from: https://www.upc.edu/ice/ca/innovacio-docent/publicacions_ice/arxius/resum-en-sobre-las-competencias-genericas

UPC. Consell Social. (2008). Marc per al disseny y la implantació dels plans d’estudis de grau a la UPC. Document CG 16 / 4, 2008. Acord 38/2008, de 9 d’abril, del Consell de Govern, Barcelona: UPC.

Valero-García, M., & Díaz, L.M. (2005). Autoevaluación y co-evaluación: Estrategias para facilitar la evaluación continuada. Congreso Español de Docencia en Informática.





This work is licensed under a Creative Commons Attribution 4.0 International License

Journal of Technology and Science Education, 2011-2024

Online ISSN: 2013-6374; Print ISSN: 2014-5349; DL: B-2000-2012

Publisher: OmniaScience