PSQS: Participant Guide 2022 Part 2: Methods and Practice

A Resource for current and prospective participants in the UK multi-university 'Professional Services Quality Survey'

Methods and Practice

7. Before You Begin

7.1 Identify the right people and their role and scope of authority

  1. Identify the person to serve as the Lead Business Contact. This person would typically manage the following activities and tasks:
    • Confirm and own the institution’s readiness to participate.
    • Ensure relevant activities have an appropriate level of visibility with the university Senior Management Team and with university staff in general.
    • Provide a point of escalation for any queries or issues that may arise.
    • Identify the person to serve as the Lead Operational Contact.
    • Determine whether and how to engage further with any other participant organisation directly after the survey results are available, for example to share information bilaterally on practice in a given area, or to understand the causes of high service quality and good practice.
  2. Identify the person to serve as the Lead Operational Contact. This person would typically manage the following activities and tasks:
    • Determine the ideal time period during which the survey will run.
    • Lead on determining and internally confirming the right service unit names and descriptions and communicating these correctly to SDA/GIDE.
    • Serve as the nominated SDA/GIDE contact for the duration of the project.
    • Determine and confirm on behalf of the institution any customisation, e.g. of language and terminology, logos, introductory and thank you messages, etc.
    • Liaise with the University of Nottingham contact, who will advise on identification and agreement of benchmarking alignment.
    • Determine on behalf of the institution the reporting needs and requirements.
    • Facilitate all commercial arrangements (POs, invoicing, payment processing, etc.).
  3. Agree which decisions each person above can take independently and which decisions may need to be escalated to the senior management team (or equivalent) owner. The kinds of decisions that may need to be escalated include:
    • The granularity or number of units to be included, as this can be controversial.
    • Which of the reporting options provided will be taken up.
    • How widely results are circulated within each university, and how.
    • The responses and actions that might be expected of heads of services.

7.2 Arrange for SDA/GIDE to be on your approved supplier list (if necessary)

Given that the key services will be provided to each institution by SDA/GIDE, you may need to arrange for SDA/GIDE to be on your approved supplier list.

SDA/GIDE will also provide each institution with a full proposal and commercial and legal terms, for example notifying the Lead Operational Contact if any additional actions are required to ensure the work conforms to the relevant ethical guidelines. Additionally, if universities have their own research guidelines and protocols, they should contact SDA/GIDE, who will ensure their activities conform to any such guidelines and protocols.

8. Setup

8.1 Professional Service Unit List

The professional service unit list is the list of units that will be identified in the survey for participants to evaluate. You can define the list in any way you like, but the individual units should be recognisable by the participants and make sense as a coherent set of people or activities that can be evaluated together.

While it is tempting to break down the various services into small units so that the source of any issues can be narrowly identified, this can actually undermine the validity of the survey. Depending on the nature of what it does, a very small unit which interacts with only a small proportion of staff might receive fewer than 20 evaluations, making interpretation more difficult. Also, presenting participants with a very long list of units, each of which would require separate evaluation, may lead participants to evaluate only a sub-set of the units they interact with.

Experience thus far has suggested that 40-60 units is probably a sensible range for a large institution.

Consideration should also be given to how the units should be presented in the survey. Rather than a long list of units, it is probably better to present an organised list based on their natural hierarchy. Thus, when participants are selecting which units they will evaluate, they will see a general heading (Finance, perhaps) and then a list of financially related units (Management Accounts, Purchasing, etc). This will probably fall naturally out of your internal structures but, as long as the structure makes sense to the participants, it does not have to.
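For illustration only (the headings, unit names and descriptions below are invented, and SDA/GIDE does not require any particular file format), a grouped unit list might be drafted as a simple nested structure before being transcribed into whatever format is agreed:

```python
# Purely illustrative: hypothetical headings, units and short descriptions.
# The nesting mirrors how units can be presented under a general heading
# (e.g. Finance) when respondents select which units to evaluate.
unit_list = {
    "Finance": {
        "Management Accounts": "Management reporting, budgeting and forecasting",
        "Purchasing": "Buying goods and services for the university",
    },
    "Marketing and Communications": {
        "Marketing": "Promoting the university's courses and services",
        "Market Research": "Research into applicant and student markets",
    },
}

# A quick check that the overall list stays within the suggested 40-60 unit range.
total_units = sum(len(units) for units in unit_list.values())
print(f"{total_units} units under {len(unit_list)} headings")
```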

There are some units whose main purpose is not a service to staff (e.g. student service units), but this is not a good reason to exclude them from the survey. It would be rare for such units never to deal with staff, and it would send a rather odd message if some units were excluded from the survey. However, the core purpose of such units needs to be taken into account when any results are reviewed.

8.2 Professional Service Unit Descriptions

The service unit descriptions serve three purposes:

  • Confirm. Quickly give the respondent confidence that the unit whose service quality they intend to assess is in fact the one that provided the service they have in mind.
  • Differentiate. Allow the respondent to distinguish between the purposes of units whose names or remits are easily confused, for example ‘Marketing’ versus ‘Market Research’.
  • Educate. Let respondents quickly see the full set of purposes or capabilities of units whose service they intend to assess, as well as those of other units.

To serve these purposes most effectively, it is important to keep the descriptions brief. Ideally, they should be between about five and seven words. This is because respondents only skim the descriptions for key words, to confirm and/or differentiate between units. Studies of web reading show that people can spot key words very quickly in phrases of five to seven words. When the descriptions get much longer than seven words, respondents need to focus more closely on reading the descriptions rather than completing the survey. The longer the descriptions, the more likely respondents are to curtail or fail to complete the survey, lowering response rates and the value of the exercise as a whole.

There is also value beyond the PSQS in summarising the purpose or remit of each unit in a phrase of no more than about ten words, in that the phrase can be reused in other contexts, such as the unit’s (internal) website, e-mail signatures, or other forms of internal communication. Over time, this gradually builds up a stronger shared knowledge of which unit is responsible for which activities, and links it to the PSQS.

We’d also recommend that a single person reviews and revises all descriptions so that they are written and presented in a single, consistent and coherent style, which respondents will understand better than the idiosyncratic writing styles of different heads of units.

Finally, the longer the unit descriptions, the longer (or more numerous) the pages on which respondents select the units whose service quality they will assess. Evidence from earlier years shows that the longer that page, or the more pages required to select units, the lower the proportion of respondents who complete the survey. In terms of unit descriptions, less is more.

An example partial list of units and descriptions is provided as Appendix B in the associated PDF.

TIP! We recommend that a staff member with an enterprise-wide remit or experience – typically in a Strategic Planning unit – drafts the unit descriptions against the guidance above and circulates them to heads of the relevant units to confirm or revise, rather than asking heads of units to volunteer descriptions, which in effect means every head of unit or service faces a ‘blank sheet’. This practice sets out good examples of the expected length of description and the standard of precision required, and expedites completion of the list.

It is also important to tell heads of units that subsequent (or repeated) revisions once the list has been submitted to SDA/GIDE, especially for minor punctuation and capitalisation errors which should have been identified prior to submission, will delay setup of the survey and beyond a certain point will incur additional costs for the institution.
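As a small, hypothetical sketch of how the person consolidating the list might catch over-long descriptions and minor punctuation inconsistencies before submission to SDA/GIDE (the unit names, descriptions and the ten-word threshold below are illustrative assumptions, not requirements):

```python
# Hypothetical pre-submission check of unit descriptions.
# Flags descriptions longer than ~10 words and simple punctuation issues
# that would otherwise prompt revisions after submission.
descriptions = {
    "Management Accounts": "Management reporting, budgeting and forecasting",
    "Purchasing": "Buying goods and services for the university",
    "Market Research": "Research into applicant, student and stakeholder markets to support planning, strategy and decision making",
}

for unit, text in descriptions.items():
    issues = []
    if len(text.split()) > 10:
        issues.append(f"{len(text.split())} words - consider shortening")
    if text != text.strip():
        issues.append("leading/trailing whitespace")
    if text.endswith("."):
        issues.append("trailing full stop - check punctuation is consistent")
    for issue in issues:
        print(f"{unit}: {issue}")
```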

8.3 Setting the Survey Period

To obtain the best results, it is important to give careful consideration to the timing and duration of the survey. For example, be sure the survey period doesn’t overlap fully with:

  • Dates when large numbers of staff are expected to be on holiday
  • Other staff surveys
  • Marking periods

8.4 Terminology and Language

You should ensure the terminology used in the survey is aligned with that of your university.

  • Confirm the correct term for this category of service units, e.g. ‘Professional Services’, ‘Central Services’, ‘Support Services’, etc.
  • Confirm that references to staff members (‘employees’, ‘colleagues’, etc) reflect your organisation’s practices.
  • You can title the survey internally as you wish: there is no need for participating institutions to use the ‘PSQS’ term internally.

8.5 Paper Surveys (for staff without computer access)

In addition to an online survey you may wish to run a paper alternative for staff who do not have ready access to a computer (cleaners, grounds staff, etc.). In deciding whether or not to do this you will need to bear in mind that:

  • The cost and effort of printing and distributing questionnaires and entering responses into a database will not be trivial.
  • You will probably have to restrict the number of evaluations included in the paper version so as to keep the size of the paper version reasonable.
  • You are likely to achieve a much lower response rate for paper surveys.

Experience to date is that providing a paper version to a particular staff group yields a response rate of about half that of the online version and incurs considerable cost. As an alternative, it is worth considering some form of kiosk for use by such staff.

8.6 The Web Survey – Introductory and Closing Text

The introductory and closing text of the web survey should say something about the purpose of the survey within your institution, effectively setting the tone, the principles (e.g. not seeking redress for personal issues; responding as an individual, not as a representative of a unit) and an atmosphere of constructive collegiality. It may also allude to other intended uses of the results, for example as part of service reviews, to target support and improvement, or to improve staff satisfaction.

8.7 Testing the Web Survey

  • SDA/GIDE undertakes a range of testing to ensure the survey is ready to be deployed for each institution. It is each university’s responsibility to confirm their internal readiness to run and support the survey.
  • Make sure the purpose is clear and understood by test subjects. You may want to involve a small number of staff to review the introductory and final text of the survey, the unit list and descriptions, and the invitations to participate in the survey.
  • You may want to involve your IT unit to ensure that a large number of internal messages will not be intercepted by institutional or individual spam filters.

8.8 Email Addresses

It is important to check in advance that the necessary all-staff email lists are available and can be used without any limits on the number of emails sent at any one time.

8.9 Internal Communications

Participants may find benefit in enlisting the support of their internal communications team, for example to develop and execute an internal communications plan.

A good communications support plan would cover the full range of communication channels and media in order to maintain a sufficiently high visibility throughout the survey and to compete effectively with the many other activities which demand staff members’ attention and may divert or distract them from doing the survey.

A good communication plan will also follow through the entirety of the PSQS cycle internally. For example it should cover:

  • advance publicity that the survey is coming
  • how University staff will be informed of the results becoming available and how to gain access to them
  • any highlights from the results, for example acknowledging areas of outstanding service
  • what actions are being taken based on the results
  • acknowledging the value of the contribution people have made by participating, and creating a positive attitude towards participation in the next year’s survey.

9. Running the Survey

9.1 Invitation Email

You will need to decide who the invitation email should come from (probably the Vice-Chancellor) and seek their approval for the wording.

The text of the email, in addition to including the link to the survey and any words of exhortation you wish to include, should give a contact for technical queries with the survey (an in-house contact, rather than the company running the survey) and for queries about the nature and purpose of the survey.

You will also need a reminder email to be used after the survey has opened.

9.2 Monitoring Response Rates

SDA/GIDE will provide your Operational Lead with access to an online response rate monitoring facility, which lets you see responses broken down in several ways: by frequency (responses per day, total responses), by business unit, by respondent category and by question.

Most importantly, this lets you see when response rates plateau, especially when that plateau is short of a desired overall response rate. At such points you may choose to prompt a new wave of respondents, for example by invitations via another channel, or targeted at units with lower respondent numbers or proportions.

If you want to monitor responses in terms of the proportion of potential respondents in a unit (whether academic or administrative), you will need to provide SDA/GIDE with staff numbers for each department so that the percentage response rate can be calculated.
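As a minimal illustration of the calculation involved (the unit names and figures here are invented), the percentage response rate for a unit is simply the responses received divided by the staff headcount supplied to SDA/GIDE:

```python
# Minimal sketch with invented figures: percentage response rate per unit.
headcount = {"Chemistry": 180, "Library": 95, "Estates": 240}
responses = {"Chemistry": 54, "Library": 61, "Estates": 48}

for unit, staff in headcount.items():
    rate = 100 * responses.get(unit, 0) / staff
    print(f"{unit}: {responses.get(unit, 0)}/{staff} = {rate:.0f}%")
```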

9.3 Achieving a Good Response Rate

Apart from the email invitation to all staff to participate, there are a number of additional activities we recommend to optimise the likelihood of a good (high) response rate.

  • Advance notification of heads of units: Make sure heads of units have advance notification of the purposes, benefits, launch date and duration of your university’s run of the PSQS. They can then use regularly scheduled internal department and/or team meetings to raise awareness. They can also find opportunities to make their interest in and support for the survey explicit, so that upon receiving the invitation as many staff as possible know what it is, why it is being done, and why their participation matters.
  • Active monitoring of response rates: As indicated above, there are means of actively monitoring response rates and intervening periodically to spur a surge in responses. A passive approach, where a low response rate is only noticed once the survey has closed, leaves little scope to remedy any issues, and a low response rate may cast doubt on the validity of the results, especially for units with a very low number of responses.

10. Key Activities and Milestones

10.1 Within each participating university

Apart from the initial dialogue with SDA/GIDE, for example about timing, unit names, custom terminology, reporting options, etc., each university has to undertake the following enabling activities, roughly on the schedule set out below.

(The indicative dates have been retained from previous years to demonstrate the approximate flow of the process. This year they will be variable to accommodate the requirements of universities as they work to recover from the upheavals created by the COVID-19 pandemic and will depend in each instance on the schedule determined and agreed between each institution and SDA/GIDE, as well as with the other participating institutions.)

N.B. Some coordination across participants is important so that benchmarking results are available reasonably close to when the analysis of each institution’s own results is completed.

Activities or milestones, with approximate dates:

  • Provide the names of the units to be assessed, and a concise description of the purpose of each, to SDA/GIDE. (Mid-April to early May)
  • Decide whether a paper version is required for any staff. (Mid-April to early May)
  • Previous participants: provide information on units whose names have changed or any reorganisations, and confirm which of the current units should be linked back to those in previous surveys (for year-on-year analyses). (Mid-April to early May)
  • Optionally, provide the URL or text of a ‘thank you’ page respondents see when they click the final submit button. (April/May)
  • Provide details of which service units should go into which benchmark groups (liaise with the Survey Lead at the University of Nottingham). (April/May)
  • Check and ‘sign off’ the online survey prior to the launch date. (April/May)
  • Ensure procedures and the necessary information are in place (including the content of the email) for emailing staff to invite them to complete the survey. (April/May)
  • Send the email to staff inviting them to complete the survey, and undertake any other promotional and awareness-raising activities. Optionally distribute paper surveys. (May to June)
  • Send email reminders to staff if necessary. (June)
  • Confirm which of the analysis and reporting options your institution would like. (June)
  • Give final ‘sign-off’ to any outputs (Excel files and written reports) provided. (September to October)

10.2 Undertaken by SDA/GIDE

Activities or milestones, with approximate dates:

  • Develop and test online surveys for each participating institution; provide the final URL (link) to the survey. (March to May)
  • Monitor response rates and provide weekly updates to each institution. (May to June)
  • Close the surveys on the specified dates, download the response data, clean and check the data and prepare the file for analysis. (June)
  • Data analysis and provision of results in Excel format. (July to September)
  • Production of summary reports (if required) on individual institutions’ findings and benchmarked findings. (September to October)