7.1 Identify the right people and their role and scope of authority
7.2 Arrange for SDA/GIDE to be on your approved supplier list (if necessary)
Given that the key services will be provided to each institution by SDA/GIDE, you may need to arrange for SDA/GIDE to be on your approved supplier list.
SDA/GIDE will also provide each institution with a full proposal and commercial and legal terms, and will notify the lead operational contact if any additional actions are required to ensure the work conforms to the relevant ethical guidelines. Additionally, if universities have their own research guidelines and protocols, they should contact SDA/GIDE, who will ensure the survey activities conform to those guidelines and protocols.
8.1 Professional Service Unit List
The professional service unit list is the list of units that will be identified in the survey for participants to evaluate. You can define the list in any way you like, but the individual units should be recognisable by the participants and make sense as a coherent set of people or activities that can be evaluated together.
While it is tempting to break down the various services into small units so that any issues can have their source narrowly identified, this can actually undermine the validity of the survey. Depending on the nature of what they do, a very small unit which only interacts with a small proportion of staff might receive fewer than 20 evaluations, making interpretation more difficult. Also, presenting participants with a very long list of units, each of which would require separate evaluation, may lead participants to evaluate only a sub-set of the units they interact with.
Experience thus far has suggested that 40-60 units is probably a sensible range for a large institution.
Consideration should also be given to how the units should be presented in the survey. Rather than a long list of units, it is probably better to present an organised list based on their natural hierarchy. Thus, when participants are selecting which units they will evaluate, they will see a general heading (Finance, perhaps) and then a list of financially related units (Management Accounts, Purchasing, etc). This will probably fall naturally out of your internal structures but, as long as the structure makes sense to the participants, it does not have to.
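As an illustration only, the sketch below shows one way such a grouped list might be drafted and reviewed before it is supplied to SDA/GIDE. The headings and unit names are hypothetical examples, not a prescribed structure.

```python
# A minimal sketch (not part of the PSQS materials) of how a unit list might be
# grouped under general headings for presentation in the survey.
# All headings and unit names below are illustrative examples only.
unit_groups = {
    "Finance": [
        "Management Accounts",
        "Purchasing",
        "Payroll and Pensions",
    ],
    "Estates": [
        "Maintenance",
        "Cleaning and Portering",
    ],
    "Human Resources": [
        "Recruitment",
        "Staff Development",
    ],
}

# Print the grouped list in the order participants would see it.
for heading, units in unit_groups.items():
    print(heading)
    for unit in units:
        print(f"  - {unit}")
```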
There are some units whose main purpose is not a service to staff (eg student service units) but this is not a good reason to exclude them from the survey. It would be rare for such units never to deal with staff and it would send a rather odd message if some units were excluded from the survey. However, the core purpose of such units needs to be taken into account when any results are reviewed.
8.2 Professional Service Unit Descriptions
The service unit descriptions serve three purposes:
To serve these purposes most effectively, it is important to keep the descriptions brief. Ideally, they should be between about five and seven words. This is because respondents only skim the descriptions for key words, to confirm and/or differentiate between units. Studies of web reading show that people can spot key words very quickly in phrases of five to seven words. When the descriptions get much longer than seven words, respondents need to focus more closely on reading the descriptions rather than completing the survey. The longer the descriptions, the more likely the respondent is to curtail or fail to complete the survey, lowering response rates and the value of the exercise as a whole.
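As a rough illustration, the sketch below shows how draft descriptions could be checked against this length guidance before they are circulated; the seven-word limit and the example descriptions are assumptions made for the purpose of the example.

```python
# A minimal sketch for checking draft unit descriptions against the
# "about five to seven words" guidance. The descriptions are illustrative only.
MAX_WORDS = 7  # assumption: treat anything longer than seven words as too long

draft_descriptions = {
    "Management Accounts": "Budgeting, forecasting and financial reporting for departments",
    "Purchasing": "Buys goods and services and manages supplier contracts on behalf of all departments",
}

for unit, description in draft_descriptions.items():
    word_count = len(description.split())
    if word_count > MAX_WORDS:
        print(f"{unit}: {word_count} words - consider shortening")
    else:
        print(f"{unit}: {word_count} words - OK")
```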
There is also value beyond the PSQS in the effort to summarise the purpose or remit of each unit in a phrase of no more than about ten words, in that the phrase can be reused in other contexts, such as the unit’s (internal) website, e-mail signatures, or other forms of internal communication. Over time, this gradually builds a stronger shared understanding of which unit is responsible for which activities, and links it to the PSQS.
We’d also recommend that a single person reviews and revises all descriptions so that they are written and presented in a single, consistent and coherent style, which will be better understood by respondents than the idiosyncratic writing styles of different heads of units.
Finally, the longer the unit descriptions, the longer the page (or the more pages) on which respondents select the units whose service quality they will assess. Evidence from earlier years shows that the longer that page, or the more pages required to select units, the lower the proportion of respondents who complete the survey. In terms of the unit descriptions, less is more.
An example partial list of units and descriptions is provided as Appendix B in the associated PDF.
TIP! We recommend that a staff member with an enterprise-wide remit or experience – typically in a Strategic Planning unit – drafts the unit descriptions against the guidance above and circulates them to heads of the relevant units to confirm or revise, rather than asking each head of unit or service to volunteer a description from a ‘blank sheet’. This practice sets out good examples of the length of description expected and the standard of precision required, and expedites the completion of the list.
It is also important to tell heads of units that subsequent (or repeated) revisions once the list has been submitted to SDA/GIDE, especially for minor punctuation and capitalisation errors which should have been identified prior to submission, will delay setup of the survey and beyond a certain point will incur additional costs for the institution.
8.3 Setting the Survey Period
To obtain the best results, it is important to give careful consideration to the timing and duration of the survey. For example, be sure the survey period doesn’t overlap fully with:
8.4 Terminology and Language
You should ensure the terminology used in the survey is aligned with that used within your university.
8.5 Paper Surveys (for staff without computer access)
In addition to an on-line survey you may wish to run a paper alternative for staff who do not have ready access to a computer (cleaners, grounds staff, etc). In deciding whether or not to do this you will need to bear in mind that:
Experience to date is that providing a paper version to a particular staff group yields a response rate of about half that of the online version and incurs considerable cost. As an alternative, it is worth considering some form of kiosk for use by such staff.
8.6 The Web Survey – Introductory and Closing Text
The introductory and closing text of the web survey should say something about the purpose of the survey within your institution, effectively setting the tone, the principles (eg not seeking redress for personal issues, responding as an individual and not a representative of a unit), an atmosphere of constructive collegiality, etc. It may also allude to other intended uses of the results, for example as part of service reviews, to target support and improvement, or to improve staff satisfaction.
8.7 Testing the Web Survey
8.8 Email Addresses
It is important to check in advance that the necessary all-staff email lists are available and can be used without any limits on the number of emails that can be sent at any one time.
8.9 Internal Communications
Participating institutions may benefit from enlisting the support of their internal communications team, for example to develop and execute an internal communications plan.
A good communications support plan would cover the full range of communication channels and media in order to maintain a sufficiently high visibility throughout the survey and to compete effectively with the many other activities which demand staff members’ attention and may divert or distract them from doing the survey.
A good communication plan will also follow through the entirety of the PSQS cycle internally. For example, it should cover:
9.1 Invitation Email
You will need to decide who the invitation email should come from (probably the Vice-Chancellor) and seek their approval for the wording.
The text of the email, in addition to including the link to the survey and any words of exhortation you wish to include, should give a contact for technical queries with the survey (an in-house contact, rather than the company running the survey) and for queries about the nature and purpose of the survey.
You will also need a reminder email to be used after the survey has opened.
9.2 Monitoring Response Rates
SDA/GIDE will provide your Operational Lead with access to an online response rate monitoring facility, which lets you see responses broken down in several ways: by frequency (responses per day, total responses), by business unit, by respondent category and by question.
Most importantly, this lets you see when response rates plateau, especially when that plateau falls short of the desired overall response rate. At such points you may choose to prompt a new wave of responses, for example by sending invitations via another channel, or by targeting units with lower respondent numbers or proportions.
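The monitoring facility itself provides these figures; purely as an illustration of the reasoning, the sketch below shows one way a plateau in daily responses might be spotted from such data. The daily figures, window and threshold are all assumptions, not values used by SDA/GIDE.

```python
# A minimal, illustrative sketch of spotting a plateau in daily responses:
# if the last few days each add very little, responses have levelled off.
daily_responses = [140, 95, 60, 33, 18, 9, 6, 5]  # responses per day (made-up figures)

WINDOW = 3       # assumption: look at the last three days
THRESHOLD = 10   # assumption: fewer than 10 new responses a day counts as a plateau

recent = daily_responses[-WINDOW:]
if all(day < THRESHOLD for day in recent):
    print("Responses have plateaued - consider a reminder or a new wave of invitations.")
else:
    print("Responses are still coming in at a reasonable rate.")
```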
If you want to monitor responses in terms of the proportion of potential respondents in a unit (whether academic or administrative), you will need to provide SDA/GIDE with staff numbers for each department so that the percentage response rate can be calculated.
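The calculation involved is straightforward. As an illustration, the sketch below assumes you hold a staff headcount and a running response count for each unit; all figures shown are made up.

```python
# A minimal sketch of the percentage response rate calculation per unit,
# assuming a staff headcount and a response count are held for each unit.
# All figures below are illustrative, not real data.
staff_numbers = {"Finance": 120, "Estates": 300, "Library": 85}
responses_so_far = {"Finance": 54, "Estates": 90, "Library": 60}

for unit, headcount in staff_numbers.items():
    responses = responses_so_far.get(unit, 0)
    rate = 100 * responses / headcount
    print(f"{unit}: {responses}/{headcount} responses ({rate:.0f}%)")
    # Units with a low percentage might be targeted with a further reminder.
```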
9.3 Achieving a Good Response Rate
Apart from the email invitation to all staff to participate, there are a number of additional activities we recommend to optimise the likelihood of a good (high) response rate.
10.1 Within each participating university
Apart from the initial dialogue with SDA/GIDE (for example about timing, unit names, custom terminology, reporting options, etc.), each university has to undertake the following enabling activities, roughly on the schedule set out below.
(The indicative dates have been retained from previous years to demonstrate the approximate flow of the process. This year they will be variable, to accommodate the requirements of universities as they work to recover from the upheavals created by the COVID-19 pandemic, and will depend in each instance on the schedule determined and agreed between each institution and SDA/GIDE, as well as with the other participating institutions.)

N.B. Some coordination across participants is important so that benchmarking results are available reasonably close to when the analysis of each institution's own results is completed.
| Activity or Milestone | Approximate date |
|---|---|
| Provide names of the units to be assessed and a concise description of the purpose of each to SDA/GIDE. Decide whether a paper version is required for any staff. Previous participants: provide information on units whose names have changed or any reorganisations, and confirm which of the current units should be linked back to those in previous surveys (for year-on-year analyses). | Mid-April to Early May |
| Optionally, provide the URL or text of a ‘thank you’ page respondents see when they click the final submit button. | April/May |
| Provide details of which service units should go into which benchmark groups (liaise with the Survey Lead at the University of Nottingham). | April/May |
| Check and ‘sign off’ the online survey prior to the launch date. | April/May |
| Ensure procedures and the necessary information are in place (including the content of the email) for emailing staff to invite them to complete the survey. | April/May |
| Send the email to staff inviting them to complete the survey and undertake any other promotional and awareness-raising activities. Optionally distribute paper surveys. | May to June |
| Send email reminders to staff if necessary. | June |
| Confirm which of the analysis and reporting options your institution would like. | June |
| Final ‘sign-off’ of any outputs (Excel files and written reports) provided. | September to October |
10.2 Undertaken by SDA/GIDE
| Activity or Milestone | Approximate date |
|---|---|
| Develop and test online surveys for each participating institution and provide the final URL (link) to the survey. | March to May |
| Monitor response rates and provide weekly updates to each institution. | May to June |
| Close the surveys on the specified dates, download the response data, clean and check the data, and prepare the file for analysis. | June |
| Data analysis and provision of results in Excel format. | July to September |
| Production of summary reports (if required) on individual institutions’ findings and benchmarked findings. | September to October |