Built Environment
Architecture
Building
Planning and Landscape Design
National Student Survey
Contents

Foreword
1 How to use this report
2 Architecture
2.1 Comparison with all subjects combined
2.2 Comparison with STEM combined
2.3 Relationships between aspects of the student experience
2.4 Impact of aspects of the student experience on overall satisfaction
2.5 Range of institutional results for overall satisfaction
3 Building
3.1 Comparison with all subjects combined
3.2 Comparison with STEM combined
3.3 Relationships between aspects of the student experience
3.4 Impact of aspects of the student experience on overall satisfaction
3.5 Range of institutional results for overall satisfaction
3.11 Comparison with selected items from the Postgraduate Taught Experience Survey
4 Planning and Landscape Design
4.1 Comparison with all subjects combined
4.2 Comparison with STEM combined
4.3 Relationships between aspects of the student experience
4.4 Impact of aspects of the student experience on overall satisfaction
4.5 Range of institutional results for overall satisfaction
4.6 Comparison by nation
Appendix A: Brief description of analyses
Appendix B: Full list of subjects covered in this report
Appendix C: Information about the NSS
Appendix D: NSS items
Foreword
The National Student Survey (NSS) and its role
The National Student Survey (NSS) is an annual survey of all higher education students in their final year of study in England, Wales and Northern Ireland, and for some institutions in Scotland. It was introduced in 2005, and is an important source of information for anyone interested in the quality of an undergraduate degree programme. It is administered by Ipsos MORI on behalf of HEFCE, and aims to "gather feedback on the quality of students' courses in order to contribute to public accountability as well as to help inform the choices of future applicants to higher education".1
This report covers the wide-reaching discipline of Built Environment, which the Higher Education Academy (HEA) defines to cover Architecture, Architectural Technology, Construction, Facilities Management, Housing, Landscape Architecture, Spatial Planning, Surveying, Real Estate, Transport and Urban Design. It is one of a series of 28 NSS discipline-based reports that has been initiated, compiled and written by the HEA and their survey team. This report offers a high-level analysis of the discipline of Built Environment and aims to provide the higher education sector with a better understanding of the experience of this student community. Its general findings can be used as a first step into further qualitative investigation, which can lead ultimately to a genuine quality enhancement of the students' learning experience. The importance of using the NSS scores only as an instigator for further investigation, however, must be stressed; the true worth of the NSS is only apparent when the data it provides are used as a benchmark, and/or to compare with like disciplines and institutions across the sector.
The analysis given in this report covers student responses for the following JACS codes: Architecture – K100; Building – K200; and Landscape Design and Planning (Landscape Design – K300, Planning – K400, and others in Architecture and Planning – K900).
The NSS asks participants to rate their level of agreement with 22 positive statements on a five-point scale (in addition to 'not applicable'). The statements are grouped into six areas plus an overall satisfaction statement: teaching; assessment and feedback; academic support; organisation and management; learning resources; and personal development. In addition, the survey invites students to provide free-text comments about particular aspects of their experience.
This report focuses on the quantitative data for the subjects to enable 'like with like' comparisons as far as possible. It is also useful to compare results for departments/faculties with similar students (while at the same time recognising the breadth of subjects included in Built Environment), and this has been approximated by comparing with mission groups or institution types. It is important to note that not all differences will be reliable or statistically significant, and caution must always be taken when interpreting or relying on small differences. These reports are designed to help subject communities to use the information about the student experience for the continued enhancement of learning and teaching within the identified areas.
Highlighted features in the Built Environment discipline
It is interesting to note that within all three areas of the Built Environment discipline covered in this report,

The Planning and Landscape Design subject group in this report has the highest proportion of female students (48%) compared with the other two Built Environment groupings. Initial results from this report indicate that male students are significantly more satisfied than their female counterparts when studying Architecture, an area that perhaps requires further analysis.
Overall, the results for the Built Environment subjects show lower levels of satisfaction compared to the experience of all other students responding to the NSS. Furthermore, Architecture and Building report lower levels of satisfaction with their experience compared with the experience of all other students in the wider subject area of STEM responding to the NSS. The report clearly shows differences between the different cognate degrees, as would be expected, and the diversity between subjects needs further longitudinal exploration.

Overall, the survey suggests that strengths and weaknesses coexist across all subject areas and probably impact on the 75-81% overall satisfaction with the quality of the course. The strengths are to be commended in the context of the changes experienced across the sector, and the weaker response areas provide the basis for further consideration in relation to the variables and practices of the local context and the changing economic environment within which HE operates.
We are pleased to recommend this report to you covering the NSS from 2010-11. The data as presented in this report aim to make it easy to compare any local results with comparable degrees elsewhere and to start to evaluate your own students' experience. However, these comparisons must be viewed as only one piece of the jigsaw to understand and ultimately improve the student experience. We would stress that results from the NSS and findings from this report for the Built Environment discipline must be included in any quality assurance discussions and assimilated not in isolation but together with other sources of information. These should include formal reviews and assessments such as accreditation visit reports, programme reviews, module evaluations, and university-wide reviews, as well as informal reviews through student-staff panels, focus groups and other mechanisms where student views are expressed.

Going forward, we would also stress that it is critical for any proposed changes resulting from such an integrated systematic review to be discussed with students prior to implementation. The best long-term improvements in the quality of the student experience can only come through integrated student-staff initiatives, when everyone is engaged with the process.
Jane Kettle and Aled Williams
Discipline Leads for Built Environment at the Higher Education Academy
May 2012
1 How to use this report
This report presents data from the 2011 National Student Survey (NSS) for specific subjects, aggregated across all institutions. By providing information about how subjects are reflected nationally in the NSS, the charts and tables are designed to help departments, faculties and institutions to contextualise and understand their own results.

This report includes NSS data for the following subjects, as classified in the Joint Academic Coding System (JACS) (see Appendix B for a more detailed list):

Architecture (K100)
Building (K200)
Planning and Landscape Design (K300, K400, K900)

Note about students studying multiple subjects
Unless otherwise indicated, all students studying a subject at 50% FPE (full-person equivalent) or more will have their responses allocated to that subject. Students studying two subjects at 50% FPE may therefore have their responses allocated twice. In addition, students studying multiple courses, all at less than 50% FPE, will be excluded from the data. These decisions have been taken to ensure that a response is allocated to a subject when the student has had a significant experience of that subject.
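A minimal sketch of this allocation rule, assuming a simple mapping from JACS codes to a student's FPE proportions (the function name and input format are illustrative, not the actual NSS processing):

```python
def allocate_response(subject_fpe):
    """Return the subjects to which a student's NSS response is allocated.

    subject_fpe maps JACS codes to the student's full-person-equivalent (FPE)
    proportion in each subject, e.g. {"K100": 0.5, "K200": 0.5}.
    """
    # A response counts once for every subject studied at 50% FPE or more, so a
    # 50/50 joint student is counted twice, and a student with no subject at
    # 50% FPE or more is excluded altogether.
    return [code for code, fpe in subject_fpe.items() if fpe >= 0.5]

print(allocate_response({"K100": 1.0}))               # ['K100']
print(allocate_response({"K100": 0.5, "K200": 0.5}))  # ['K100', 'K200'] (counted twice)
print(allocate_response({"K100": 0.4, "K400": 0.3, "K900": 0.3}))  # [] (excluded)
```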
When used with an awareness of the limitations, NSS data can play a useful role in supporting improvements in learning and teaching. By allowing comparisons and benchmarking, the data can highlight areas that would reward further investigation, either as areas of apparent success or challenge. NSS results can be a useful starting point for discussions about learning and teaching, either with colleagues, senior managers, student representatives or students themselves. It is also advisable to triangulate the data with quantitative and qualitative information from other sources in order to effectively target, design and evaluate enhancement activities.

This report presents a high-level picture of the discipline through the lens of NSS data, broken down and analysed in a number of different ways. It does not provide a detailed picture of students' learning experiences, nor does it dictate specific areas for intervention. However, it can be used in conjunction with local NSS data to gain an overview of the views of a group of students, which can provide an excellent starting point for further investigation and discussion.

As with all uses of quantitative data, caution should be exercised when interpreting small differences between respondent groups. Small differences may be due to random variations in response, demographic characteristics of the respondents, method of response and many other factors, and small numerical differences should therefore not be over-interpreted. A brief description of the analyses used in this report is given in Appendix A.
Significance levels are included in the tables, but for ease of use significance levels of 0.05 or lower have been highlighted in bold – this is the level at which results are standardly taken to be significant, and suggests that there is a 95% or greater probability that the patterns found in the survey sample are reflective of the final-year undergraduate population as a whole. Unless otherwise stated, where differences are significant (at the 0.05 level) the higher score is in bold text. Where there are more than two scores being compared and the significance level is 0.05 or lower, the significance level itself is in bold text, and indicates that there is at least one significant difference between two of the scores.
It should be noted, however, that significance testing assumes that the survey has been conducted using a random sample, or a design that approximates this. In fact, the NSS attempts to survey the whole final-year undergraduate population and, while all surveys may experience non-response bias, it can be more difficult to correct for this in a 'census' type survey. A review by Paula Surridge for the HEA described tests for non-response bias that found no significant effect,2 and the overall profile of NSS respondents is broadly representative of the wider student body. However, it is not possible to say whether each subgroup explored in this report (such as part-time students, or the results for HEI 'mission groups') is similarly representative. For this reason, the significance levels included in this report should only be taken as indications of confidence in the survey results, and we recommend that caution be exercised when interpreting, using or relying on small differences. Similarly, the error bars placed around institutional scores may, if anything, be too narrow where non-response bias is substantial.
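To illustrate the kind of comparison these significance levels summarise, the sketch below applies a two-proportion z-test to hypothetical agreement counts. It is a simplified stand-in for the procedures described in Appendix A, not a reproduction of them, and the counts are invented.

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts: 2,400 of 3,043 Architecture respondents agreed with an
# item, against 205,000 of 260,000 respondents in all other subjects.
agree = [2400, 205000]
nobs = [3043, 260000]

z_stat, p_value = proportions_ztest(count=agree, nobs=nobs)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# Following the report's convention, a p-value of 0.05 or lower would be
# highlighted in bold and treated as statistically significant.
```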
In order to present the data in a more complete manner, tables rather than charts have been used for the majority of this report.3 Because there are different response rates for each item in the NSS, no single number of responses can be included for each group in a table. Instead, the range between the lowest and the highest number of responses is shown.
The percentage values included in the tables correspond to the proportion of students who agreed with the relevant statement (survey item), i.e. selected either 'definitely agree' or 'mostly agree'. The number of responses to each item includes all of the responses (including those who disagreed).
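The calculation behind these percentages is straightforward; a minimal sketch, assuming per-respondent answer labels like those below, is:

```python
import pandas as pd

# Invented responses to a single NSS item for a handful of students.
responses = pd.Series([
    "definitely agree", "mostly agree", "neither agree nor disagree",
    "mostly disagree", "definitely agree", "mostly agree",
])

agreed = responses.isin(["definitely agree", "mostly agree"])
pct_agree = 100 * agreed.mean()   # denominator = all responses to the item
print(f"{pct_agree:.1f}% agree ({agreed.sum()} of {len(responses)} responses)")
```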
This report contains high-level analyses involving institutional and demographic characteristics. Other than The Open University, no institutions are identified anywhere in the report – in the section on part-time students, the OU's results have been separated out as they constitute such a large proportion of the part-time student responses. No group smaller than 23 students is reported, and every care has been taken to ensure that no student can be identified either directly or through implication.
The analyses included in this report were carried out by Mrs Gosia Turner. The HEA acknowledges the assistance of the Higher Education Funding Council for England (HEFCE) in providing the NSS dataset used in this report.
2 The National Student Survey three years on: What have we learned? (Surridge, 2009)
3 The data contained in the tables can be used to create charts, if desired, by copying the entire table into a Microsoft Word document, and then copying the required data from that document into a Microsoft Excel spreadsheet.
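As an alternative to the Word-and-Excel route in footnote 3, the same figures can be charted directly; the sketch below plots the Q22 row from table 2.1 using matplotlib, a choice of tool that is ours rather than the report's.

```python
import matplotlib.pyplot as plt

groups = ["All subjects\n(excl. Architecture)", "Architecture"]
pct_agree_q22 = [83.1, 78.1]  # Q22 "Overall, I am satisfied with the quality of the course"

plt.bar(groups, pct_agree_q22)
plt.ylabel("% agree")
plt.ylim(0, 100)
plt.title("NSS 2011, Q22 overall satisfaction")
plt.show()
```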
2 Architecture
There are 3043 students in the NSS dataset who study Architecture at 50% FPE or more. 40.1% of students who responded are women, 82.1% are from the UK, and 96.2% study full-time.
2.1 Comparison with all subjects combined
This table compares the experience of students across the UK responding to the NSS in Architecture with the experience of all other students responding to the NSS.

These percentages, in this table and all other tables in the report, correspond to the proportion of students who agreed with the relevant statement, i.e. selected either 'definitely agree' or 'mostly agree'. The number of responses to each item includes all of the responses (including those who disagreed).
All subjects (excluding Architecture)  Architecture  Sig.
Q3 Staff are enthusiastic about what they are teaching 85.4% 84.5% .262
Q5 The criteria used in marking have been made clear in advance 73.2% 61.1% .000
Q9 Feedback on my work has helped me clarify things I did not understand 61.4% 63.9% .004
Q10 I have received sufficient advice and support with my studies 75.0% 72.6% .011
Q11 I have been able to contact staff when I needed to 83.0% 78.1% .000
Q12 Good advice was available when I needed to make study choices 72.1% 70.2% .000
Q13 The timetable works effectively as far as my activities are concerned 78.5% 71.9% .000
Q14 Any changes in the course or teaching have been communicated effectively
Q15 The course is well organised and is running smoothly 72.6% 59.9% .000
Q16 The library resources and services are good enough for my needs 81.0% 83.4% .001
Q17 I have been able to access general IT resources when I needed to 83.4% 81.2% .000
Q18 I have been able to access specialised equipment, facilities, or rooms when I needed to
Q19 The course has helped me to present myself with confidence 79.0% 77.0% .000
Q21 As a result of the course, I feel confident in tackling unfamiliar problems 79.2% 79.8% .204
Q22 Overall, I am satisfied with the quality of the course 83.1% 78.1% .000
Number of responses to each item (range lowest – highest) 237817 - 261279  2963 - 3043
2.2 Comparison with STEM combined
This table compares the experience of students across the UK responding to the NSS in Architecture with the experience of all other students in the wider subject area of STEM responding to the NSS.

All subjects in STEM (excluding Architecture)  Architecture  Sig.
Q3 Staff are enthusiastic about what they are teaching 83.7% 84.5% .311
Q5 The criteria used in marking have been made clear in advance 72.6% 61.1% .000
Q9 Feedback on my work has helped me clarify things I did not understand 58.6% 63.9% .000
Q10 I have received sufficient advice and support with my studies 75.4% 72.6% .001
Q11 I have been able to contact staff when I needed to 84.4% 78.1% .000
Q12 Good advice was available when I needed to make study choices 72.3% 70.2% .001
Q13 The timetable works effectively as far as my activities are concerned 80.0% 71.9% .000
Q14 Any changes in the course or teaching have been communicated effectively
Q15 The course is well organised and is running smoothly 76.3% 59.9% .000
Q16 The library resources and services are good enough for my needs 83.3% 83.4% .561
Q17 I have been able to access general IT resources when I needed to 84.6% 81.2% .000
Q18 I have been able to access specialised equipment, facilities, or rooms when I needed to
Q19 The course has helped me to present myself with confidence 76.0% 77.0% .000
Q21 As a result of the course, I feel confident in tackling unfamiliar problems 78.1% 79.8% .029
Q22 Overall, I am satisfied with the quality of the course 83.7% 78.1% .000
Number of responses to each item (range lowest – highest) 64464 - 69602  2963 - 3043
2.3 Relationships between aspects of the student experience
Twenty-one items in the NSS are grouped into six scales, each measuring a different aspect of the student experience (see Appendix D), while item 22 examines overall satisfaction. This table shows the extent to which these different scales are correlated with one another. In other words, it gives an indication of the strength of the relationship between different aspects of the student experience. Values nearer 1 indicate a stronger relationship. However, because this analysis shows correlations rather than causal relationships, it is not possible to conclude that improving one aspect of the student experience will automatically lead to improvements in another aspect, even where the relationship appears strong.
[Correlation matrix: Q22 (Overall, I am satisfied with the quality of the course) against the Quality of Learning and Teaching, Assessment and Feedback, Academic Support, Organisation and Management, Learning Resources and Personal Development scales.]
All correlations are statistically significant at the 0.01 level. The strongest relationship appears to be between overall satisfaction and quality of learning and teaching.
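For readers who wish to run the same kind of analysis on local data, the sketch below computes pairwise Pearson correlations between scale scores; the column names and values are invented, and the real scale construction may differ.

```python
import pandas as pd

# One row per respondent; each scale score is the mean of its items on the
# 1-5 agreement scale, and q22_overall is the overall satisfaction item.
df = pd.DataFrame({
    "teaching":     [4.2, 3.8, 4.5, 2.9, 4.0],
    "assessment":   [3.5, 3.1, 4.0, 2.5, 3.6],
    "organisation": [3.9, 3.2, 4.4, 2.2, 3.8],
    "q22_overall":  [4.0, 3.5, 4.6, 2.4, 3.9],
})

# Values nearer 1 indicate a stronger linear relationship, but say nothing
# about causation.
print(df.corr(method="pearson").round(2))
```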
2.4 Impact of aspects of the student experience on overall satisfaction
The different aspects of the student experience, as measured by the six item scales in the NSS, are likely to impact upon students' overall satisfaction with their course, as measured by question 22. To test this, a multiple regression has been performed, examining the extent to which the results for different item scales explain or predict overall satisfaction. In the table below, the larger the standardised coefficient, the greater the influence of that aspect of the student experience on overall satisfaction.

All scales combined explain 69% (Adjusted R2 = 0.694) of the variability of the overall satisfaction item. This is a strong effect, but it nevertheless suggests the existence of other factors affecting the overall experience that are not measured by the NSS.
Unstandardised Coefficients (B, Std. Error)  Standardised Coefficients (Beta)  t  Sig.
Quality of Learning and Teaching scale .450 .022 .328 20.538 .000
Organisation and Management scale .168 .014 .167 11.654 .000
This analysis shows that the quality of learning and teaching is the most important factor affecting the overall experience, while learning resources have the weakest impact.
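A rough sketch of how standardised coefficients and the adjusted R² could be obtained follows; the file name and column names are hypothetical, and this is not the report's actual code.

```python
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("nss_architecture_scales.csv")   # assumed: one row per respondent
scales = ["teaching", "assessment", "support",
          "organisation", "resources", "development"]
cols = scales + ["q22_overall"]

z = (df[cols] - df[cols].mean()) / df[cols].std()  # z-score every measure

model = sm.OLS(z["q22_overall"], sm.add_constant(z[scales])).fit()
print(model.params.round(3))                       # standardised coefficients (betas)
print(f"Adjusted R^2 = {model.rsquared_adj:.3f}")  # about 0.69 for Architecture
```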
2.5 Range of institutional results for overall satisfaction
The points on the graph represent the % agree for overall satisfaction (item 22) among those answering the survey for this subject. The error bars represent 95% confidence intervals – in other words, there is a 95% probability that the actual % agree for all students taking this subject at an institution, rather than just those who responded to the survey, lies within this range. This is important because it is a significant limitation on any rank ordering of institutions based on NSS scores. Institutions with 22 students or fewer were removed from the graph. Institutions have been anonymised, and the numbers on the x-axis do not correspond to the numbers on the x-axis in other graphs in this report.

In general, this analysis is intended to give an indication of the range of overall satisfaction across institutions offering this subject.
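To make the error bars concrete, the sketch below computes a 95% confidence interval for one hypothetical institution's % agree on item 22; a Wilson interval is used as one reasonable choice, since the report does not state its exact method here.

```python
from statsmodels.stats.proportion import proportion_confint

agreed, respondents = 41, 52   # invented institution: 41 of 52 respondents agreed
low, high = proportion_confint(agreed, respondents, alpha=0.05, method="wilson")
print(f"% agree = {100 * agreed / respondents:.1f}%, "
      f"95% CI = [{100 * low:.1f}%, {100 * high:.1f}%]")
# Overlapping intervals across institutions are the reason rank orderings based
# on these scores should be treated with caution.
```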
2.6 Comparison by nation

Q6 Assessment arrangements and marking have been fair 62.7% 57.5% 74.4% 63.2% .165
Q7 Feedback on my work has been prompt 62.2% 44.0% 68.6% 58.8% .000
Q8 I have received detailed comments on my work 66.3% 54.4% 76.7% 63.2% .003
Q9 Feedback on my work has helped me clarify things I did not understand
Q10 I have received sufficient advice and support with my studies
Q11 I have been able to contact staff when I needed to 78.2% 72.5% 84.9% 80.1% .211
Q12 Good advice was available when I needed to make study choices
Q13 The timetable works effectively as far as my activities are concerned
Q18 I have been able to access specialised equipment, facilities, or rooms when I needed to
Q19 The course has helped me to present myself with confidence
Q20 My communication skills have improved 84.0% 83.3% 83.5% 83.8% .518
Q21 As a result of the course, I feel confident in tackling unfamiliar problems
Q22 Overall, I am satisfied with the quality of the course 78.2% 73.6% 82.6% 79.4% .528
Where there are statistically significant differences for an item, this is highlighted in bold in the 'Sig.' column.
2.7 Comparison by institution type
This analysis categorises the results for the subject according to the institution's 'mission group'. Mission group membership is correct for the time the survey took place (Spring 2011).
Russell Group  1994 Group  Million+  University Alliance*  GuildHE**  Sig.
Q1 Staff are good at explaining things 85.1% 97.6% 81.3% 80.3% 89.0% .001
Q2 Staff have made the subject interesting 89.0% 95.3% 80.5% 83.7% 81.3% .000
Q3 Staff are enthusiastic about what they are teaching 88.7% 98.8% 81.5% 84.9% 85.3% .001
Q4 The course is intellectually stimulating 90.2% 97.6% 83.7% 85.0% 78.7% .000
Q5 The criteria used in marking have been made clear in advance
Q6 Assessment arrangements and marking have been fair 62.3% 80.0% 69.3% 62.0% 69.9% .002
Q7 Feedback on my work has been prompt 62.0% 80.0% 59.1% 64.1% 57.0% .015
Q8 I have received detailed comments on my work 62.8% 71.8% 66.7% 70.7% 69.1% .002
Q9 Feedback on my work has helped me clarify things I did not understand
Q10 I have received sufficient advice and support with my studies
Q11 I have been able to contact staff when I needed to 83.5% 97.6% 71.7% 77.3% 84.6% .000
Q12 Good advice was available when I needed to make study choices
Q13 The timetable works effectively as far as my activities are concerned 72.9% 83.5% 72.0% 74.9% 77.9% .379
Q14 Any changes in the course or teaching have been communicated effectively 71.8% 84.7% 61.2% 66.8% 73.3% .000
Q15 The course is well organised and is running smoothly 65.0% 85.9% 56.2% 61.6% 69.1% .000
Q16 The library resources and services are good enough for my needs
Q17 I have been able to access general IT resources when I needed to
Q18 I have been able to access specialised equipment, facilities, or rooms when I needed to 73.6% 77.4% 61.3% 77.5% 73.5% .000
Q19 The course has helped me to present myself with confidence 80.5% 95.3% 73.7% 77.4% 83.7% .000
Q20 My communication skills have improved 88.3% 95.3% 82.3% 84.6% 84.4% .011
Q21 As a result of the course, I feel confident in tackling unfamiliar problems 85.1% 97.6% 78.1% 79.9% 81.5% .001
Q22 Overall, I am satisfied with the quality of the course 84.0% 94.1% 76.0% 78.0% 79.4% .001
Number of responses to each item (range lowest – highest) 615 -
The following table shows comparisons between broader institution types.

Pre-1992  Post-1992  FEC  Sig.
Q3 Staff are enthusiastic about what they are teaching 85.6% 84.0% 81.5% .464
Q4 The course is intellectually stimulating 89.0% 83.7% 77.7% .000
Q5 The criteria used in marking have been made clear in advance 57.1% 64.0% 52.9% .001
Q6 Assessment arrangements and marking have been fair 59.7% 64.9% 56.7% .004
Q8 I have received detailed comments on my work 61.0% 68.3% 61.8% .000
Q9 Feedback on my work has helped me clarify things I did not understand 60.3% 66.6% 53.5% .000
Q10 I have received sufficient advice and support with my studies 74.4% 71.7% 69.4% .035
Q11 I have been able to contact staff when I needed to 82.5% 75.4% 79.6% .001
Q12 Good advice was available when I needed to make study choices 70.6% 70.6% 61.5% .000
Q13 The timetable works effectively as far as my activities are concerned 72.6% 72.2% 63.1% .140
Q14 Any changes in the course or teaching have been communicated effectively
Q15 The course is well organised and is running smoothly 62.5% 59.2% 51.0% .002
Q16 The library resources and services are good enough for my needs 83.4% 84.0% 76.4% .151
Q17 I have been able to access general IT resources when I needed to 85.6% 79.8% 72.0% .000
Q18 I have been able to access specialised equipment, facilities, or rooms when I needed to
Q19 The course has helped me to present myself with confidence 79.8% 75.8% 75.6% .004
Q21 As a result of the course, I feel confident in tackling unfamiliar problems 83.7% 78.2% 75.5% .005
Q22 Overall, I am satisfied with the quality of the course 81.4% 77.1% 67.5% .001
Number of responses to each item (range lowest – highest) 983 -
2.8 Comparison by full-time/part-time
Full-time  Part-time  Sig.
Q3 Staff are enthusiastic about what they are teaching 84.6% 80.2% .355
Q5 The criteria used in marking have been made clear in advance 61.1% 61.2% .744
Q6 Assessment arrangements and marking have been fair 62.8% 61.2% .878
Q9 Feedback on my work has helped me clarify things I did not understand 64.4% 51.3% .003
Q10 I have received sufficient advice and support with my studies 72.7% 70.4% .061
Q11 I have been able to contact staff when I needed to 78.3% 73.3% .421
Q12 Good advice was available when I needed to make study choices 70.3% 66.4% .180
Q13 The timetable works effectively as far as my activities are concerned 72.1% 65.5% .029
Q14 Any changes in the course or teaching have been communicated effectively 65.8% 57.4% .174
Q15 The course is well organised and is running smoothly 60.2% 52.6% .181
Q16 The library resources and services are good enough for my needs 83.4% 84.5% .219
Q17 I have been able to access general IT resources when I needed to 81.3% 79.1% .740
Q18 I have been able to access specialised equipment, facilities, or rooms when I needed to 71.7% 64.0% .203
Q19 The course has helped me to present myself with confidence 77.6% 63.8% .001
Q21 As a result of the course, I feel confident in tackling unfamiliar problems 80.3% 67.0% .001
Q22 Overall, I am satisfied with the quality of the course 78.5% 67.2% .010
Number of responses to each item (range lowest – highest) 2853 - 2927  110 - 116
2.9 Comparison by gender
Female  Male  Sig.
Q3 Staff are enthusiastic about what they are teaching 84.7% 84.3% .912
Q5 The criteria used in marking have been made clear in advance 59.4% 62.3% .204
Q6 Assessment arrangements and marking have been fair 57.8% 66.0% .000
Q9 Feedback on my work has helped me clarify things I did not understand 62.2% 65.1% .246
Q10 I have received sufficient advice and support with my studies 69.1% 74.9% .001
Q11 I have been able to contact staff when I needed to 76.4% 79.2% .158
Q12 Good advice was available when I needed to make study choices 66.9% 72.3% .005
Q13 The timetable works effectively as far as my activities are concerned 71.7% 72.0% .894
Q14 Any changes in the course or teaching have been communicated effectively 64.0% 66.4% .356
Q15 The course is well organised and is running smoothly 56.8% 61.9% .011
Q16 The library resources and services are good enough for my needs 78.4% 86.8% .000
Q17 I have been able to access general IT resources when I needed to 78.9% 82.8% .028
Q18 I have been able to access specialised equipment, facilities, or rooms when I needed to 67.7% 73.9% .000
Q19 The course has helped me to present myself with confidence 73.4% 79.5% .000
Q21 As a result of the course, I feel confident in tackling unfamiliar problems 76.6% 82.0% .002
Q22 Overall, I am satisfied with the quality of the course 75.0% 80.2% .001
Number of responses to each item (range lowest – highest) 1185 - 1219  1778 - 1824
2.10 Comparison by domicile
The following analysis breaks down the NSS results for the subject by students' place of residence. Students are allocated to one category only, so those based in the UK are not included in the EU category for the purpose of this analysis.
UK  EU  Non-EU  Sig.
Q3 Staff are enthusiastic about what they are teaching 85.0% 83.6% 80.4% .101
Q5 The criteria used in marking have been made clear in advance 61.1% 60.1% 62.2% .146
Q6 Assessment arrangements and marking have been fair 62.9% 60.6% 64.0% .928
Q8 I have received detailed comments on my work 66.1% 61.0% 66.2% .023
Q9 Feedback on my work has helped me clarify things I did not understand 64.2% 63.2% 61.8% .536
Q10 I have received sufficient advice and support with my studies 74.2% 67.9% 62.3% .000
Q11 I have been able to contact staff when I needed to 78.7% 76.2% 74.2% .223
Q12 Good advice was available when I needed to make study choices 71.3% 63.3% 66.3% .010
Q13 The timetable works effectively as far as my activities are concerned 72.7% 67.9% 68.8% .267
Q14 Any changes in the course or teaching have been communicated effectively 65.0% 64.6% 70.3% .003
Q15 The course is well organised and is running smoothly 59.7% 57.1% 63.8% .317
Q16 The library resources and services are good enough for my needs 84.2% 82.5% 76.8% .030
Q17 I have been able to access general IT resources when I needed to 81.2% 81.7% 81.2% .016
Q18 I have been able to access specialised equipment, facilities, or rooms when I needed to 71.4% 68.4% 74.6% .357
Q19 The course has helped me to present myself with confidence 78.7% 68.9% 69.6% .000
Q21 As a result of the course, I feel confident in tackling unfamiliar problems 80.7% 77.2% 74.2% .073
Q22 Overall, I am satisfied with the quality of the course 78.5% 76.2% 76.1% .178
Number of responses to each item (range lowest – highest) 2431 - 2498  259 - 269  272 - 276
Where there are statistically significant differences for an item, this is highlighted in bold in the 'Sig.' column.
2.11 Comparison with selected items from the Postgraduate Taught Experience Survey
The national Postgraduate Taught Experience Survey (PTES) is run annually by the Higher Education Academy in conjunction with institutions. This table shows comparisons between data from NSS items and data from relevant items in PTES. The PTES data are from the 2011 administration of the survey, and the full report can be accessed on the HEA's website. There are relevant items in PTES for all NSS items except items 10, 11, 12 and 22. For NSS items 7 and 16 there are multiple relevant items in PTES. Unless otherwise stated, the relevant item wording in PTES is either identical to the NSS item, or contains only insignificant differences. The relevant PTES item numbers are in square brackets.

Please note that whereas the NSS is compulsory for HE providers in England, Wales and Northern Ireland, PTES is voluntary. 80 institutions took part in PTES 2011, as opposed to the 253 institutions that took part in NSS 2011. Differences in results between PTES and the NSS may, therefore, reflect differences between the institutions taking part. Nonetheless, PTES includes many of the same questions as found in the NSS as well as some that add further information to the NSS-type questions. The results below compare the experience of those final-year undergraduates studying Architecture across the UK answering the NSS with the experience of taught postgraduates studying Architecture, Building and Planning.
Please also note that no tests for significance have been undertaken for this table; the differences between results for NSS and PTES items are provided for interest only, and should only be taken as indicative.
NSS (Architecture)  PTES (Architecture, Building and Planning)
Q3 Staff are enthusiastic about what they are teaching [PTES Q4c] 84.5% 81.5%
Q5 The criteria used in marking have been made clear in advance [PTES Q11a] 61.1% 68.8%
Q6 Assessment arrangements and marking have been fair [PTES Q11b] 62.8% 68.2%
I received feedback in time to allow me to improve my next assignment [PTES Q11d – no direct NSS equivalent] N/A 53.0%
Q8 I have received detailed comments on my work [PTES Q11e] 65.7% 66.6%
Q9 Feedback on my work has helped me clarify things I did not understand [PTES Q11f] 63.9% 56.8%
Q13* The timetable works efficiently as far as my activities are concerned [PTES Q14a] 71.9% 73.0%
Q14 Any changes in the course or teaching have been communicated effectively [PTES Q14b] 65.5% 70.6%
Q15 The course is well organised and is running smoothly [PTES Q14c] 59.9% 67.3%
Q16 The library resources and services are good enough for my needs [PTES Q16a] 83.4% 73.2%
The library resources and services are easily accessible [PTES Q16b – no NSS equivalent] N/A 77.9%
I am satisfied with the quality of learning materials available to me (print, online material, DVDs etc.) [PTES Q16f – no NSS equivalent] N/A 70.7%
Q17 I have been able to access general IT resources when I needed to [PTES Q16c] 81.2% 75.4%
Q18 I have been able to access specialised equipment, facilities or rooms when I needed to [PTES
Q19 The course has helped me to present myself with confidence [PTES Q17d] 77.0% 66.1%
Q21 As a result of my course, I feel confident in tackling unfamiliar problems [PTES Q17f] 79.8% 69.4%
Number of responses to each item (range lowest – highest) 3043  780 - 1027
* PTES Q14a, the equivalent of the NSS item Q13, is slightly differently worded: 'The timetable fits well with my other commitments'.
3 Building
There are 3669 students in the NSS dataset who study Building at 50% FPE or more. 16.1% of students who responded are women, 94.8% are from the UK, and 64.7% study full-time.
3.1 Comparison with all subjects combined
This table compares the experience of students across the UK responding to the NSS in Building with the experience of all other students responding to the NSS.

These percentages, in this table and all other tables in the report, correspond to the proportion of students who agreed with the relevant statement, i.e. selected either 'definitely agree' or 'mostly agree'. The number of responses to each item includes all of the responses (including those who disagreed).

All subjects (excluding Building)  Building  Sig.
Q3 Staff are enthusiastic about what they are teaching 85.5% 75.5% .000
Q5 The criteria used in marking have been made clear in advance 73.1% 70.0% .000
Q9 Feedback on my work has helped me clarify things I did not understand 61.6% 47.7% .000
Q10 I have received sufficient advice and support with my studies 75.0% 69.9% .000
Q11 I have been able to contact staff when I needed to 83.0% 75.8% .000
Q12 Good advice was available when I needed to make study choices 72.2% 65.9% .000
Q13 The timetable works effectively as far as my activities are concerned 78.5% 72.4% .000
Q14 Any changes in the course or teaching have been communicated effectively
Q15 The course is well organised and is running smoothly 72.6% 63.4% .000
Q16 The library resources and services are good enough for my needs 80.9% 84.7% .000
Q17 I have been able to access general IT resources when I needed to 83.3% 84.7% .054
Q18 I have been able to access specialised equipment, facilities, or rooms when I needed to
Q19 The course has helped me to present myself with confidence 79.1% 73.7% .000
Q21 As a result of the course, I feel confident in tackling unfamiliar problems 79.3% 73.6% .000
Q22 Overall, I am satisfied with the quality of the course 83.2% 75.6% .000
Number of responses to each item (range lowest – highest) 237393 - 260658  3429 - 3665
3.2 Comparison with STEM combined
This table compares the experience of students across the UK responding to the NSS in Building with the experience of all other students in the wider subject area of STEM responding to the NSS.

All subjects in STEM (excluding Building)  Building  Sig.
Q3 Staff are enthusiastic about what they are teaching 84.2% 75.5% .000
Q5 The criteria used in marking have been made clear in advance 72.2% 70.0% .000
Q9 Feedback on my work has helped me clarify things I did not understand 59.4% 47.7% .000
Q10 I have received sufficient advice and support with my studies 75.5% 69.9% .000
Q11 I have been able to contact staff when I needed to 84.6% 75.8% .000
Q12 Good advice was available when I needed to make study choices 72.6% 65.9% .000
Q13 The timetable works effectively as far as my activities are concerned 80.1% 72.4% .000
Q14 Any changes in the course or teaching have been communicated effectively
Q15 The course is well organised and is running smoothly 76.3% 63.4% .000
Q16 The library resources and services are good enough for my needs 83.2% 84.7% .043
Q17 I have been able to access general IT resources when I needed to 84.4% 84.7% .327
Q18 I have been able to access specialised equipment, facilities, or rooms when I needed to
Q19 The course has helped me to present myself with confidence 76.1% 73.7% .000
Q21 As a result of the course, I feel confident in tackling unfamiliar problems 78.4% 73.6% .000
Q22 Overall, I am satisfied with the quality of the course 83.8% 75.6% .000
Number of responses to each item (range lowest – highest) 64040 - 68981  3429 - 3665
3.3 Relationships between aspects of the student experience
Twenty-one items in the NSS are grouped into six scales, each measuring a different aspect of the student experience (see Appendix D), while item 22 examines overall satisfaction. This table shows the extent to which these different scales are correlated with one another. In other words, it gives an indication of the strength of the relationship between different aspects of the student experience. Values nearer 1 indicate a stronger relationship. However, because this analysis shows correlations rather than causal relationships, it is not possible to conclude that improving one aspect of the student experience will automatically lead to improvements in another aspect, even where the relationship appears strong.
[Correlation matrix: Q22 (Overall, I am satisfied with the quality of the course) against the Quality of Learning and Teaching, Assessment and Feedback, Academic Support, Organisation and Management, Learning Resources and Personal Development scales.]
All correlations are statistically significant at the 0.01 level. The strongest relationship appears to be between overall satisfaction and quality of learning and teaching.
3.4 Impact of aspects of the student experience on overall satisfaction
The different aspects of the student experience, as measured by the six item scales in the NSS, are likely to impact upon students' overall satisfaction with their course, as measured by question 22. To test this, a multiple regression has been performed, examining the extent to which the results for different item scales explain or predict overall satisfaction. In the table below, the larger the standardised coefficient, the greater the influence of that aspect of the student experience on overall satisfaction.

All scales combined explain 65% (Adjusted R2 = 0.648) of the variability of the overall satisfaction item. This is a strong effect, but it nevertheless suggests the existence of other factors affecting the overall experience that are not measured by the NSS.
Unstandardised Coefficients (B, Std. Error)  Standardised Coefficients (Beta)  t  Sig.
Quality of Learning and Teaching scale .477 .021 .346 22.477 .000
Organisation and Management scale .202 .014 .189 14.015 .000
This analysis shows that the quality of learning and teaching is the most important factor affecting the overall experience, while assessment and feedback has the weakest impact. The Learning Resources scale is not statistically significant.
3.5 Range of institutional results for overall satisfaction
The points on the graph represent the % agree for overall satisfaction (item 22) among those answering the survey for this subject. The error bars represent 95% confidence intervals – in other words, there is a 95% probability that the actual % agree for all students taking this subject at an institution, rather than just those who responded to the survey, lies within this range. This is important because it is a significant limitation on any rank ordering of institutions based on NSS scores. Institutions with 22 students or fewer were removed from the graph. Institutions have been anonymised, and the numbers on the x-axis do not correspond to the numbers on the x-axis in other graphs in this report.

In general, this analysis is intended to give an indication of the range of overall satisfaction across institutions offering this subject.
3.6 Comparison by nation

Q3 Staff are enthusiastic about what they are teaching 76.1% 75.0% 72.0% 67.8% .085
Q4 The course is intellectually stimulating 73.3% 71.6% 67.0% 66.1% .000
Q5 The criteria used in marking have been made clear in advance
Q6 Assessment arrangements and marking have been fair 67.0% 76.0% 69.0% 59.9% .045
Q7 Feedback on my work has been prompt 49.3% 44.2% 31.0% 28.6% .000
Q8 I have received detailed comments on my work 56.0% 40.4% 48.0% 23.0% .000
Q9 Feedback on my work has helped me clarify things I did not understand
Q10 I have received sufficient advice and support with my studies
Q11 I have been able to contact staff when I needed to 75.3% 80.3% 82.0% 77.6% .223
Q12 Good advice was available when I needed to make study choices
Q13 The timetable works effectively as far as my activities are concerned
Q18 I have been able to access specialised equipment, facilities, or rooms when I needed to
Q19 The course has helped me to present myself with confidence
Q20 My communication skills have improved 74.7% 78.3% 69.7% 73.8% .358
Q21 As a result of the course, I feel confident in tackling unfamiliar problems
Q22 Overall, I am satisfied with the quality of the course 76.2% 78.7% 63.0% 69.4% .011
Where there are statistically significant differences for an item, this is highlighted in bold in the 'Sig.' column.
3.7 Comparison by institution type
This analysis categorises the results for the subject according to the institution's 'mission group'. Mission group membership is correct for the time the survey took place (Spring 2011).
Russell Group  1994 Group  Million+  University Alliance*  Sig.
Q1 Staff are good at explaining things 85.1% 89.2% 80.3% 78.1% .005
Q2 Staff have made the subject interesting 68.1% 77.3% 68.7% 68.9% .013
Q3 Staff are enthusiastic about what they are teaching 80.4% 75.4% 75.2% 73.8% .288
Q4 The course is intellectually stimulating 78.7% 80.1% 72.7% 72.4% .043
Q5 The criteria used in marking have been made clear in advance
Q6 Assessment arrangements and marking have been fair 87.2% 75.6% 66.6% 66.1% .006
Q7 Feedback on my work has been prompt 55.3% 56.3% 47.9% 48.0% .165
Q8 I have received detailed comments on my work 63.8% 53.4% 52.8% 55.0% .272
Q9 Feedback on my work has helped me clarify things I did not understand
Q10 I have received sufficient advice and support with my studies
Q11 I have been able to contact staff when I needed to 87.2% 84.1% 75.9% 73.8% .002
Q12 Good advice was available when I needed to make study choices
Q13 The timetable works effectively as far as my activities are concerned 85.1% 79.0% 73.3% 71.4% .134
Q14 Any changes in the course or teaching have been communicated effectively 78.7% 76.0% 70.5% 64.1% .000
Q15 The course is well organised and is running smoothly 78.3% 75.0% 65.7% 58.3% .000
Q16 The library resources and services are good enough for my needs
Q17 I have been able to access general IT resources when I needed to
Q18 I have been able to access specialised equipment, facilities, or rooms when I needed to 76.7% 77.1% 75.7% 76.1% .326
Q19 The course has helped me to present myself with confidence 72.3% 83.0% 73.7% 72.8% .042
Q20 My communication skills have improved 76.6% 85.2% 74.8% 72.3% .001
Q21 As a result of the course, I feel confident in tackling unfamiliar problems 78.7% 85.2% 73.9% 71.9% .002
Q22 Overall, I am satisfied with the quality of the course 83.0% 84.1% 76.5% 73.0% .005
Number of responses to each item (range lowest – highest) 43 - 47  166 - 176  877 - 936  1483 - 1591
*Excluding Bucks New University (included in Million+)
Where there are statistically significant differences for an item, this is highlighted in bold in the 'Sig.' column. The GuildHE institutions were excluded owing to insufficient student population in this subject.
The following table shows comparisons between broader institution types.

Pre-1992  Post-1992  FEC  Sig.
Q2 Staff have made the subject interesting 67.5% 68.5% 73.8% .221
Q3 Staff are enthusiastic about what they are teaching 75.3% 75.3% 77.8% .728
Q4 The course is intellectually stimulating 72.8% 72.1% 76.3% .137
Q5 The criteria used in marking have been made clear in advance 67.6% 70.8% 69.1% .573
Q6 Assessment arrangements and marking have been fair 64.4% 66.9% 74.3% .015
Q8 I have received detailed comments on my work 40.7% 54.0% 71.7% .000
Q9 Feedback on my work has helped me clarify things I did not understand 42.5% 47.0% 62.6% .000
Q10 I have received sufficient advice and support with my studies 68.6% 69.7% 73.3% .190
Q11 I have been able to contact staff when I needed to 76.5% 75.3% 78.0% .490
Q12 Good advice was available when I needed to make study choices 64.7% 65.2% 72.6% .060
Q13 The timetable works effectively as far as my activities are concerned 71.5% 73.5% 66.6% .079
Q14 Any changes in the course or teaching have been communicated effectively
Q15 The course is well organised and is running smoothly 65.6% 65.0% 48.9% .000
Q16 The library resources and services are good enough for my needs 78.6% 86.9% 81.6% .000
Q17 I have been able to access general IT resources when I needed to 83.8% 85.4% 81.7% .002
Q18 I have been able to access specialised equipment, facilities, or rooms when I needed to
Q19 The course has helped me to present myself with confidence 74.2% 73.5% 73.7% .942
Q20 My communication skills have improved 77.7% 74.2% 72.7% .105
Q21 As a result of the course, I feel confident in tackling unfamiliar problems 76.4% 72.8% 74.3% .119
Q22 Overall, I am satisfied with the quality of the course 77.9% 75.8% 70.2% .065
Number of responses to each item (range lowest – highest) 688 - 725  2386 -