Find out more about how the survey is managed and our resources for line managers.

Hearing your views

The staff survey is an important way to hear the views of staff and measure employee engagement at Sussex. Some of the survey results tell us how we are performing against our People Strategy aims, while other insights gathered contribute to shaping the future of the University.

The Staff Survey Working Group, currently chaired by Professor Robin Banerjee, is responsible for survey planning and strategy. The Group includes a range of academic and Professional Services staff members.

The survey is delivered by our survey partner, People Insight, who are employee engagement specialists with extensive experience in running employee surveys with other organisations across the UK and internationally. They are also higher education sector specialists, working with many UK universities, which means they have robust, reliable external benchmarking data.

Survey themes and how engagement is measured

The survey questions cover key aspects of employee experience and employee engagement, with particular relevance to the higher education sector. Themes include wellbeing, development, leadership and inclusion, amongst others.

Employee engagement is measured with three benchmarkable questions:

  • I am proud to work for the University of Sussex
  • I care about the future of the University of Sussex
  • I would still like to be working at the University of Sussex in two years’ time
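For illustration only, here is a minimal sketch of how an engagement score could be derived from answers to these three questions. It assumes the favourable-score approach described in the pre-survey workshop below (the percentage of respondents answering agree or strongly agree); the actual calculation is carried out by People Insight.

    # Illustrative sketch only: People Insight perform the real calculation.
    # Assumption (from the workshop below): the engagement score is the
    # percentage of "favourable" answers (agree / strongly agree) given to
    # the three engagement questions on a five-point Likert scale.

    FAVOURABLE = {"agree", "strongly agree"}

    def engagement_score(answers):
        """answers: a list of Likert responses to the three engagement questions."""
        if not answers:
            return 0.0
        favourable = sum(1 for a in answers if a.lower() in FAVOURABLE)
        return 100 * favourable / len(answers)

    # Example: nine answers (three respondents x three questions), seven favourable.
    sample = ["strongly agree"] * 3 + ["agree"] * 4 + ["neither agree nor disagree", "disagree"]
    print(f"{engagement_score(sample):.0f}%")  # 78%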

Survey dashboard users

A small number of colleagues, representing each of the Faculties, Schools and Divisions across the University, have access to online results data for their specific School or Division via an online dashboard hosted by People Insight. Dashboard users can drill down further into the data by theme or question, supporting the development of local action plans. View a dashboard users list [PDF 39.82KB].

Taking part in the survey

Find out more about taking part in the survey.

  • Who can take part in the survey?

    All staff members who are on our payroll, with a direct University of Sussex contract, will be invited to complete the survey. 

    The survey is open to staff who joined the University on or before 1 September 2024. If you joined on or after 2 September 2024, you will not receive a link to complete the survey because of deadlines in the survey setup process ahead of launch on 30 September; instead, you will be able to have your say between surveys, and further details will be available later in the year.

    If you joined the University on or before 1 September and have not received your survey link, contact our survey provider People Insight at: support@peopleinsight.co.uk.

  • How to complete the survey

    The survey link will be emailed to you by People Insight from survey-invite@peopleinsight.co.uk and it will take around 15 minutes to complete on either your computer or mobile phone. You can also scan the QR code on posters around campus which will take you to a survey login page - simply enter your employee ID to complete the survey.

    Answering the questions is really easy: you select from the multiple-choice boxes to show to what extent you agree or disagree with a number of statements. Some questions have yes/no/prefer not to say answers, and you will also have the opportunity to write free-text comments on issues you want us to know about.

  • Amending your answers

    If you have submitted your answers and would like to amend them, please contact People Insight at: support@peopleinsight.co.uk and let them know. They can either reset the whole survey for you or reset it without deleting any data, so you can work back through from the start of the survey and make your amendments. Amendments cannot be made after the survey closing date.

  • Survey accessibility

    The survey has been developed by People Insight following best practices in accessibility and web standards compliance. Its usability has been tested using the most common screen readers, such as JAWS, NVDA and macOS VoiceOver. However, due to the wide range of available screen readers, their frequent changes and updates, differing interpretations of standards and the different methods available for browsing a web page within those readers, People Insight is unable to guarantee error-free operation in all available readers and versions.

    If you are having difficulty accessing the survey, please get in contact with People Insight at support@peopleinsight.co.uk and they can provide alternative options.

Support for leaders and managers

Leaders and managers have an important role to play in engaging their teams in the staff survey by encouraging people to take part and communicating and acting upon the survey results.

Our staff survey line managers' guide [PDF 118.67KB] explains more about the survey, why the role of the manager is so vital and how leaders can communicate effectively about the survey.

You can also view staff survey FAQs [PDF 155.45KB] which are useful for line managers when updating their teams. 

Watch a replay of our pre-survey workshop

The one-hour workshop introduces our survey partner, People Insight, and explains the ‘why’ behind the staff survey. It includes an overview of the question design process, survey timelines and useful insight into communicating the survey to teams.

View the pre-survey workshop slides [PDF 3.00MB].

  • Video transcript

    Sarah Engineer: Thank you so much for joining us today. Running these types of sessions is a new development for us in delivering the staff survey, and we know from speaking with People Insight, and from research, that involving managers, leaders and other key staff such as HR business partners, dashboard users and ODI consultants is hugely important to the success of a big staff survey such as the one we're going to be talking about today.

    So just for a little bit of clarity, today is not about the question or survey design. All of that work has been completed by the Steering Group and People Insight, and we're now into the testing phase of the survey. People Insight are going to go into a little bit of detail around the question design a little later. So today is much more about you being involved in our staff survey and the various roles that you play, from promoting engagement in your teams and why it's important, all the way through to interpreting the results, and then of course supporting what happens next and the creation of meaningful action plans. So if I could have the next slide, please, Lisa. Thank you.

    So in terms of a little bit of context: many of you will know that we've previously run pulse surveys, and those were every six months. But after speaking with staff members, that wasn't the right approach for us at Sussex. We were surveying too frequently, and it made it really difficult to properly analyse results and then create and start working towards meaningful action plans. So it was agreed that we would move to larger surveys and would be surveying staff every two years. That ties in quite nicely: our last pulse survey was in September to October 2022, so you'll hopefully know that we're working towards the launch of our next survey at the end of the month.

    So what does the survey do? This survey is our platform for staff to explore their experience of working at Sussex, and we do that through questions that fall under a number of themes. Those include wellbeing, development, inclusion, how we're managed, leadership and many more. It also relates to our People Strategy, which I know several staff fed into and helped shape through a number of focus groups, so it's also a way to see how we're progressing against the aims within that.

    And why is employee engagement important? It's the connection that we feel with our jobs, our leaders and our organisation, so it really does reflect whether we enjoy our roles and whether we feel secure, developed, respected and recognised for the work that we do. As leaders, we all play a role in supporting and promoting that engagement, so we can gain valuable insights that will help us continue what's going well and take action when it is needed. If I could have the next slide, please, Lisa. Thank you.

    So before I hand over to People Insight: I have previously mentioned the staff survey Steering Group, and I just wanted to introduce you to the members, who have been working really closely together over the summer months to get us to this point where we can go live with our survey at the end of this month. It's important to add that Sasha has also been really actively involved. She's been reviewing questions, comms and branding, and is in full support of the survey, and you'll be hearing from Sasha over the coming weeks; she'll be doing her own piece to promote engagement. I'm now going to pass over to People Insight, who are going to talk through the rest of the workshop, but myself and other team members will stay on the call and be able to answer any questions. There will be various touch points throughout this session this afternoon for you to ask questions. Alternatively, if you want to pop something in the chat, then I'll do my best to answer that as we go along. So Lisa, over to you.

    Lisa Hughes: Thank you very much, Sarah. Welcome, everybody. I can only see one screen when I'm looking at the slides, so if hands do go up, Emily, Sarah, if you can, just let me know, as we do encourage as much participation as possible. So, I am Lisa Hughes. I'm a senior consultant at People Insight. I've been here just over two years, but most of my career has been in organisational psychology, leadership and talent development. I have worked in the university sector: I worked for the University of Surrey, in both Professional Services and at the Business School. So that's a little bit of background about me. I've been working very closely with the Steering Group to design the questions and to look at more of a strategic listening approach: listening is always on, but how do we take that information and use it to really improve the employee experience? I'm going to now ask my colleague Emily to just introduce herself.

    Emily Hopewell: Yes, thank you. Hi, everyone. My name is Emily Hopewell. I'm one of the client services executives at People Insight, so I work very closely with the project team at Sussex, helping to set up the survey and make sure it's user-friendly, and also helping with any technical issues as the survey goes live. So if there are any queries, people can reach out to our mailbox and I'll manage that as well.

    Lisa Hughes: Yeah, thank you very much, Emily. So we're going to jump into a few things now. Firstly, this session is a little bit under 60 minutes, and we want to make it as interactive as possible. However, just to let you know, this is just one workshop of a few you will have in the future: we'll be doing post-survey workshops, there'll be some training sessions, and so on. So this is just about getting everyone curious and making sure we're all on the same page. Please use the chat, ask your questions, and we will look to answer as many of them as we can, and we will have some Q&A at different points. The content we're sharing will all be available on your portal, so you can look at the slides in a bit more detail afterwards if you so wish.

    So firstly, building on what Sarah said, you are taking what I would say is now a best-practice approach to listening. Especially in the university sector, pulsing was trialled quite a lot, particularly through the Covid period, and although it was giving us a measure at a moment in time, it wasn't actually giving us enough information to really look at the employee experience holistically. So it's really great that we are now taking this approach, because it's going to give us a really good baseline. But to be really clear, this is not a satisfaction survey. When you see the questions as the survey comes into your inbox, it is very much around engagement: the statements and questions are much more about how we can improve in these areas. And it's really giving you the tools and the insight to then think about how we communicate that to our teams and how we can build some local activity, and organisational activity as well. So it's not a pulse; it's a full organisational survey, and it looks at all of the factors that make great organisations. Satisfaction surveys were very much a moment in time, used much more for measuring and reporting. You can still report from this survey, but it's much more about continuous improvement.

    So who are we? Just a little bit of context: People Insight is 20 years young, as I like to call us, originally designed as an employee research company, always using employee data and employee insight. But 20 years ago that was very much large reports, printed out, sent out, probably taking three to six months to go through, and then filed until the next one; very much a census approach. In the last ten years that has changed with the use of technology and the ability to aggregate quickly and be agile with data and insights. Through that, we have created our own models and tested them with data, and the research from those has created a lot of the questions you will see in your own survey. So we work with hundreds of different organisations, not just HE, but I will talk a little bit more about who we work with in a moment. We don't just do surveys: we do focus groups, we do 360s, and I work with people on strategy and leadership development. So we do look at the whole organisation and see how we can support you by using evidence-based data to really improve the employee experience.

    We are ISO accredited and we're very proud of that. One of our values at People Insight is data integrity, so it's very important that we protect your data: not only making sure where it's kept and looked after, but also making sure there is never anything identifying, whether when you complete the survey or when we build the dashboards and you have your reports. So it's a confidential survey. There is some data that is given to us to make sure we can aggregate the data in the right way, but no individual should ever be able to be identified through our data. It is really important to get that message over to everybody. We know trust in surveys like this can be quite limited, but we work tirelessly to make sure that confidentiality is an important part of the whole project, so you can rest assured. When we have been asked for raw data, or asked in an RFP to provide more data, we do not work with those clients, because it is so important that people can trust that their information will be protected. Hopefully that gets the message over, but we do have quite a lot of information on our website about how we do that.

    So, higher education. We're actually up to around 70 HEIs now, of all different sizes, and we have been working with the HE sector for about six years at People Insight. Our HEI director, Jane Tidswell, has been working in the higher education sector for about 15 years; she was originally a school teacher, and she just has a passion for higher education. So we have built a great portfolio, and we're the leading provider of staff surveys within the HEI sector, which we're really proud of. We invest a lot of time and effort in making sure that we have really good, strong benchmarks. These benchmarks will give you really strong comparisons, whether that's HE overall, post-92, pre-92, academic, Professional Services or Russell Group if you so wish, and regional: we have Scotland, we have London. We're also starting to look at more department- or faculty-led comparisons, so we're looking at HESA and at how we can build some benchmarks there as well, if you wish to have them. Our benchmarks are built purely from our own data, we never buy any in, and the benchmark is created over a three-year rolling period. Yours will be completely live until we build your dashboard, which is normally about 24 hours after your survey closes; the dashboard is then built and checked. So we're really proud of that, and we work tirelessly to make sure we can give you trend reports and look at what's happening in the sector in regards to the employee experience, so we can all share best practice across those different universities, and we give all of you that comparison through our dashboard.

    So Emily is going to show you the dashboard and get you all quite excited and curious about that in a little while. Not to teach anyone to suck eggs, but I think we're already clear that we're all employees, and when we know that we can share our views and feel like we've been heard and responded to, we feel like we're actually part of something rather than it being done to us. That's why employee listening really matters. I'm a fellow of the CIPD, which is basically an HR industry network, and they do an annual review of great workplaces; out of their seven criteria, listening and voice is one of the seven. So we know it's important, but it's how we value it and how we use it to really improve the experience. It is evidence-based data and can complement other pieces of evidence you have in your business, whether that's hygiene HR factors like sickness and absence, or attraction, through to making some of those solutions at a local level that we listen to as well. And we're giving feedback instantly by using our technology.

    This is definitely a survey with positive intent; it's a catalyst for positive change. Long gone are the days when this sort of tool was used to beat people over the head. It's very much something to improve the experience, but also to give us a baseline, break down assumptions and give us evidence to say why we are doing what we're doing within the people agenda. Also, because you'll be using our dashboards for a strategic piece of work, if you start using the action planning tool, which we will talk a lot more about in the post-survey workshops, we'll be able to start to do some correlation between activity, the baseline and the next survey, to show what has improved and why it may have improved, and start to share that across the business, which again just builds collegiality and collaboration. And because this tool is more transparent while protecting confidentiality, leaders are able to see a lot more data insight than they ever have been able to, at an aggregated level. We'll talk a little bit about that in a moment.

    So, why does it matter? It isn't just an HR exercise. HR are the facilitators, whether that's the OD team or the People team, but voice and employee engagement is all of our jobs, from when we see people applying at Sussex to when they leave. We should consider how that experience has been for them, and we will be able to see some demographic data in your dashboard, which will start to look at age group or tenure, and that really does start to tell us a bit of the story. When working for an organisation, I think most of us want to wake up motivated to get to work, feel valued, be confident to be ourselves and suggest ideas, find work fulfilling, and feel involved and proud of where we work. And this doesn't just happen on its own; it is facilitated. The psychologically safe environment that leaders create does encourage or discourage this. There's a lot of research on why people leave organisations, and there are three key areas. One is the line manager, or the most recent manager. Two is that sometimes they hit the glass ceiling within their career. Three often comes down to recognition more than reward: are they being recognised for the work they do, and are they having the opportunities for career or learning and development? So all those key motivators about why we come to work are important to understand in anybody's individual employee life cycle.

    So, just some stats for the people who like data in the room. The statistics link to a report called Engage for Success, which looked at some universities a little while ago. I won't talk to each of these, but I think most of us will know that when people are happier in their workplace, more engaged, more motivated and have all the resources to do what they need to do, there's generally a better customer and student experience, fewer sick days and lower turnover, but more importantly better productivity and more innovation, which creates a good energy and vitality in the organisation, with people feeling like they're working together rather than against each other. I think we've all known when we've worked in those really great environments and when they're not so great, and some of the indicators that might be driving that. So this is what I mentioned earlier in regards to the data from ORC. They conducted a survey and looked at five universities, based on 12,500 university employees, and they could see a correlation between upper-quartile engagement of staff and satisfaction scores from students. So what we're doing is starting some work looking at NSS scores and employee engagement, and starting to see over time, if you improve engagement for your employees, how that improves the student experience.

    So, just to get into a bit more of the detail, how do we measure engagement? Various survey suppliers will do this slightly differently, but generally there's always a focal point. For your survey we have this focal point, and we'll be able to benchmark it against the comparisons I mentioned earlier. This benchmark and focal point will look at pride, at caring about working at Sussex, and at whether you would still like to be working here in two years' time. As I've said, we can filter this in different ways, but that generally gives us that sense of affiliation and advocacy, and you will have an engagement score that you can then measure over time, as you do on each survey. We tend to have anything from three to six questions within the engagement focus, and it really depends on where you want to sharpen the saw and understand more detail. So we're in a good place with this focal point at this point.

    So that's quite a lot thrown at you initially. Are there any questions about People Insight, or anything like that, at the moment? Emily, if you see any hands come up or if there's anything in the chat.

    Emily Hopewell: I can't see anything at the moment. Nothing at the moment.

    Lisa Hughes: Nothing in? Okay, that's fine. And if anything does pop up, just pop it in the chat and we will come back to it towards the end. Okay, so let's get into your survey, which I already mentioned. We do a very rigorous job around question design, especially when we're doing a holistic survey like yours, a really good organisational survey, because we want to get a very strong baseline and we don't want any gaps in that. So we're always looking at various different factors. Within your survey you've got 11 factors, which I'll share with you in a moment. What sits behind them is 38 Likert, or quantitative, questions, and they are measured on a five-point Likert scale running from strongly agree to strongly disagree. We'll show you on the dashboard how that insight is then aggregated up so it's easier to understand. There are two open-text questions, which look at what you believe is working well and what else you would like to see improve, and that really helps us to get into the granular detail of understanding some potential solutions, but also some of the celebration. We look at this very much from a balanced approach: organisations have some great successes and some great celebration, especially within the HE sector, but we also need to understand some of the other areas we might focus on to improve, to make the employee experience good for everybody.

    We do have what seems like a large number of demographic questions, 18 of them. But the way we have positioned and built this survey on this occasion is that the questions regarding bullying and harassment, and questions linked to your performance reviews, sit in our demographic section on a slightly different page, so they don't impact your first home-page insights, but you'll have all that detail as well. Ultimately, we do a lot of research around the experience of doing a survey. The last thing we want is survey fatigue: we've done a lot of research showing that once people get to around 70 questions, even if they're very easy to answer, people start to just press neutral a lot because they're exhausted, or they stop doing surveys. So we always focus on making sure the survey takes no more than 15 to 18 minutes, to give the best experience. We look at questions that are inclusive: no trick questions, no double-barrelled questions, no leading questions. Also, the criteria for our question design with Sussex were very much about what could be benchmarked against the external comparison, and whether there is any historical data we can look at and benchmark over time, because that would give us more intelligence about what has been working well or not so well. So there were lots of different criteria we were looking at when we were doing the question design. We approached it with a basic model that looks at all the organisational factors, and then we listened to everybody in the room about what they want and what they believe we need to understand, and we started to build from there. Then it gets down to a lot of the terminology, and we also want to keep the survey small, so it was really a process of refining and refining; I can honestly say there have probably been a few hundred hours put into question design. But I would like to say the committee and the working group have been absolutely wonderful in their input and engagement in that process and in being part of it, so thank you to everybody who was in that group.

    So these are the survey themes and factors as they have been labelled. That's slightly different to our PEARL model, but we do like to personalise the language so it's more aligned to the university. We've got questions linked to my role, which is very important in understanding how people feel intrinsically about their roles within the university; wellbeing and balance; diversity and inclusion; senior leadership; and then line management. We have been really clear in the survey who the senior leadership is and who the line management is, because we want to make sure that everyone responding understands who they're talking about or who they're giving feedback about. Then learning and development; innovation and contribution; civic duty, which was lovely to see and is one we're starting to see a little bit more within HE question sets; engagement, the three questions I've already spoken about; and the ADR process and bullying and harassment, which sit in your demographic section. So it's a really well-rounded set of questions, and there are obviously questions that connect to things like recognition that sit behind some of those labels.

    So, a little bit more detail. You're already aware of this, and you can start to share it with any of your colleagues, because you are going to be our influencers as well in regards to people really responding and doing the survey. Our response rate is critical to qualify the insights, so we encourage all of our universities to have a really strong engagement and communication campaign, to make sure the leaders know what's going on and are involved, and to make sure we've got great sponsorship. Otherwise we have to do a lot more work to understand whether the insight we're seeing really reflects the majority; otherwise we might be acting on the wrong thing. So if I can ask anything of anybody today, it's just that you get as many people as you can to do the survey, in the right way: it's not mandated, it's voluntary, but we really want to hear everybody's views. We did pre-load some organisational data, and that may include some demographic data as well; I know Emily and Sarah are working behind the scenes to make sure of that. Responses come straight to us at People Insight, and your survey email will come from us as well, so it will be People Insight. All responses come back to us; the organisation doesn't see any of that. We hold all that data, so it's all confidential. For this survey we have a minimum reporting threshold of ten, so you will not be able to see any group of fewer than ten, and again Emily will show how this looks in the dashboard and how we make sure you can't do that. That is, again, to protect people. In some other sectors the threshold is five, but within the HE sector ten is normal, because we want to build trust, especially in the first couple of years when there might be some nervousness that people might be identified.

    So, our technology is really easy to use. The survey has been built to be fully accessible, and it works on various pieces of technology, whether that's a phone, an iPad or a laptop. We also have QR codes that people can scan with their own phones and then put in an ID that links to them, because we want to make it as inclusive as possible, so that people who are on the ground, like the Facilities team, are actively engaged with as well and have the opportunity to complete the survey. As I said, I think it's called two-way accessibility: we're always making sure that people can access it in the right way. Once you've completed the survey, you can see it all at the end, so you can go in and change any answers if you wish to. It's really flexible and really agile. And what we will be able to do, because we're using technology, is get real-time response rates, which will really help to encourage people to take part. Again, this is about positive intent and celebrating that we've got lots of people taking part. For those who aren't, can we do anything else to engage with them to take part? Do we have to do a bit more communication? Throughout the process, People Insight, Emily, myself and the team are here to answer and field any questions, so they can come direct to us if they have any question about the survey questions, confidentiality or any of those fears. We will be there. We just encourage everyone to be a positive advocate, because that's how we build response.

    So this is what the survey looks like. In the background, lots of communication, engagement and branding work has been happening. As I've mentioned, the survey will come from us, and you will be able to see it's coming from People Insight. IT will have been involved to make sure that it's not going to spam or junk; if you should notice that, notify us straight away, as there's a lot of testing that happens to prevent it, but I do know within the sector there's sometimes the odd email. So we're on hand to support with that. You cannot forward your survey email to members of your team. This is really important: they won't be able to complete it, as everyone has their own survey email, and that's really important for making sure the reported insight is correct. So if someone forwarded one on to you and you can't get into it, that's the reason why.

    We've got dedicated support. The other thing to support all of you in this process is that we're going to have some reminders that will come through while the survey is open. These reminders will only be sent to people who haven't completed the survey, and they're really a nudge up until the closure of the survey. We do naturally see an increase in response every time we send a comms nudge, and we don't do them every day; we create the plan alongside the comms team to make sure it correlates. Again, the live response rates will be available and can help us to engage with any departments or teams that maybe are not as engaged as we would like them to be. So it's time to get involved and get some great response rates if possible. There has been an aspiration set of a 65% response rate, which is a very good response rate to aim for within the sector: we have a spread of anything between 37% and about 82% at the moment, but generally it sits around 68%. So 65% for your first organisational engagement survey is a great aspiration, and we'd love to see us meet that goal.

    So, key milestones to consider: the comms campaign has already started; the survey goes live on 30 September; there'll be some reminders issued; and then the survey closes on 20 October. Then we'll have some overall presentations, and dashboards will be released from 13 November onwards. After that there'll be all sorts of support in regards to dashboard users, the survey and so on: drop-in sessions and coaching sessions, and looking at how we then play this back to your colleagues and your teams and how we get them involved in potentially some local prioritisation, while the organisation looks at some of the areas which might be bigger and might need more resourcing or investment, or are already in play but might need to be reprioritised depending on what the data is telling us.

    So that's a little bit around the timeline. What I'd love to do now is hand over to Emily to show you what you'll get with your dashboard. Not everyone will be a dashboard user, but we just want to show you what the output looks like, which I think is quite exciting. So Emily, can I hand this slide over to you to talk through? And then I'll stop sharing so you can show a demo.

    Emily Hopewell: Thank you. I think there was a question in the chat as well, but I think Sarah's answered that, so that should be fine, but we'll show you that anonymity threshold in the dashboard as well. So yes, once we've closed the survey, shortly after, we'll start building the dashboard. If you are being invited to the dashboard, you'll get an invitation like the image that you're seeing on the screen. It's really simple and easy to create a password and set yourself up, and you'll already be given your restricted access. So if you just manage one whole Faculty, or maybe a few departments within a Faculty or Division, you'll already have access to that specific area, so you won't have to go looking for your particular results.

    So I'm now going to go ahead and share my screen to show you what the dashboard might look like. Obviously this is dummy data, but it gives you a rough idea. Once you're in the dashboard, again you'll be looking at your particular results, but you'll be able to see your participation rate up here as well as your engagement score. That's based on those three questions that Lisa went over earlier. We base all the scores in the dashboard on the favourable scores, that is, the answers you want people to hopefully be giving, so agree and strongly agree in most cases. You'll be able to see the breakout on individual questions as well: that nice blue colour is the agree and strongly agree; then we've got the light grey, neither agree nor disagree, which is also an important group; as well as the dark grey bars, which are people disagreeing with the statements. So your engagement score is made up of these three questions.

    If you scroll down, you'll be able to see the theme headlines. As we mentioned, we've got these various themes that we're looking at, such as senior leadership, line management and my role, and you'll be able to see all of the theme breakouts here, which is really useful for pinpointing general areas that you might want to focus on. If I just go into one, you're able to drill down and look at each theme. Obviously these questions may be slightly different to the ones that we have in our engagement theme, but you are able to look in a bit more detail here, and there's also some information on the spread. Depending on your access level, if you've got access to a whole Faculty, for example, you will be able to compare between departments and certain areas within that Faculty, and by other demographics as well. But let's go back onto the homepage. You also have some key drivers. Our system looks for correlations between the engagement questions and all the other questions that we ask in the survey, mainly because the engagement index is about general sentiment, so pride, advocacy and care and that sort of thing, which you can't really act upon through any one means. So your key drivers are showing you what's most strongly impacting engagement, based on correlational analysis, and again, this is specific to the area that you're looking at. We also have the ability to filter, so any of that pre-loaded data or other demographics will be located in here. Again, if you wanted to just look at a particular area of what you oversee, you're able to do so here, and it will also give you a comparison, showing you how that particular area is doing compared to Sussex as a whole. You can add multiple filters on top of one another, but as we mentioned, we do have the anonymity threshold of ten, so if I went to select this, for example, it won't let me view that data, just to protect that person's or those people's confidentiality. Of course, if you do have teams that are quite small, you will be able to look at their results at the level above, and they'll obviously be included in the overall results as well.

    Lisa mentioned our benchmarking as well. We've got this section here which allows you to look at your external benchmarks as well as your historical surveys, so any of the pulse surveys that we've previously run with you, as well as the Professional Services benchmark, the overall benchmark picture and so on. If I just turn two of these on as an example, I can switch on which ones I want to look at and then it will populate those here, and again, that's based on the favourable scores. So where it says plus six here, it means that in this organisation 81% agreed that the purpose of the organisation makes them feel good about their work, and that's six percentage points higher than the all-sector benchmark. So it allows you to get some really nice context on how you're doing, because you can see a score and think it's positive or negative, but having that context is really important. The historical comparison is like-for-like as well: if you're looking at a specific department, we'll have done some mapping in the background, so it is comparing that department's historical results, which gives you much more specific data. You're able to export all of the different reports, and there are lots of different reports in here. The scorecard lists all the questions that we ask in the survey, and you can reorder them how you like, so if you want to look at them from least to most favourable score, or look at how you're doing historically from lowest to most improvement, you can do that, and you can export as a PDF or as an Excel file. The PDF takes the table that you're looking at, whereas the Excel moves all the data over into a spreadsheet.

    But what you will have that is probably even more useful is the ideck. That's one of the reports here that I really wanted to show you. It basically creates a PowerPoint deck based on all the different reports and key bits of information in the dashboard, and again, you can apply filters to it, so if you wanted to just pull off a report for a specific area, you can select that there. It will create a custom presentation for you, which obviously saves you a lot of time if you need to deliver your results locally: you don't have to spend lots of time pulling things off the dashboard, because you've already got this slide deck here that you can use. When you export it, you can select which pages you want, so if you don't want to include some of these in the download, you can turn them off and on. And once you've downloaded it, you obviously can't edit the images here, because we don't want you changing any of the scores, but you can add slides in. So if you want to add a bit more, some prompts, maybe some interpretation to share, you can also do that, which is really useful.

    If I just go back to the home page, the last thing that I really wanted to show you, to give you a bit of a flavour of the dashboard, is the action planning tool. You'll notice that against any of the questions there's this little action tick box, so here you're actually able to log an action. If there's a particular thing that you know you're going to work on within your area, you can put that right in here; I'll just put in this example. There are lots of inspirations as well, which we'll work on closely with the project team at Sussex to make sure they're really personalised to Sussex, and you'll be able to see all those examples there, because it can sometimes be difficult coming up with actions, especially if you've not really done this kind of thing before. You can add links and things, set a due date for yourself, as I've just done there, and add files as well, so everything's really centralised. And once that's in there, you can go into this action planning section, where you're able to drag and drop these along, monitor how you're doing and edit them if you need to. So this keeps everything in one place: you don't need to go offline and use spreadsheets and things to log any actions that you're taking off the back of the survey. It's all in here, next to your results, so it's really useful. I think that's everything I wanted to show, but if there are any questions around any of that, we may as well run through them on the call.

    Lisa Hughes: Well done, Emily, that was a really good show and tell. Just to let you know, that is more of a quick demo to get people enthused about what they're going to see at the other end. You do have a help centre, the content hub, and there's a little chat at the bottom called Finn, so this tool can be used 24/7. It's very flexible and agile; I haven't broken it yet, so don't be fearful of it, and you can always remove things, such as actions, if you're just testing it. Once you do get access to your dashboard, we will do post-survey workshops on actions and action planning to help. But again, it's giving you some really good high-level insights, and just building on what Emily said, if you are into the detail, you can then download Excel spreadsheets, see all the correlation scores and see why one thing might be more of a priority than another, just based on the data. Remember, this is just the data coming in here: it doesn't know causation, and it doesn't know context. So although the key drivers are great at helping you prioritise, sometimes we need to go back to our teams and really challenge whether these are the right things we should be working on based on what the data says. And again, that's why response rates are so important, to make sure we're qualifying the insights. So thank you very much, Emily. I know this is the first instalment of what will be coming in the future.

    So, just to finish off before we get to any Q&A: the outcomes, the insights, are really important to us at People Insight, and we support you with them so it doesn't just become data. The dashboard, as you can see, is very intuitive and dynamic. It will give you your information by your Faculty, by your Division, by department, so that information will change compared to the university as a whole; it can really personalise your experience. The dashboard also shouldn't be extra work: it actually becomes much more of an efficient tool, and if we use it well, we can then do some really clever reporting and correlation, as I mentioned before. Once the results are in, what we're going to do is take some time to help you understand them. I know some of you are very well seasoned in this, but there are various new people coming in and taking on roles, so it's about really seeking to understand what the data and insights are telling us, giving you time to process it, maybe buddying up with someone or talking to us about what it's actually saying. Sometimes there are things where you just don't understand why it is like this: you've been working on this area for so long, and it seems to be going backwards. So we can certainly start to unlayer that and have a look.

    Once we've done that understanding, we'll really talk to you about how we can start to share that with your teams and departments in the right way, because what we want to do is build confidence within the people who've shared their views, and maybe the people who haven't at that point, that when we get this feedback we do take it seriously and we do continue to listen. We don't just take that data and that's it; we actually want to understand a bit more, whether that's through workshops, focus groups or just in a team meeting, and using the ideck is when that becomes really efficient. Those are the first couple of stages, and then we really move into things like prioritising and planning. When we talk about act and action it sounds big and heavy, but a lot of the time, once we understand what's happening in those factors, there are already objectives or actions in play; it's really just mapping them over and asking whether we are communicating them in the right way, through the right channels, or whether they are not reaching people in the right way. That's often what comes up: it's not actually that nothing is happening, it's often just not landing in the right way. Or maybe there is a gap, and sometimes there is, and this gives us an opportunity to look at what we do with that gap. Then, ultimately, because this is long-term strategic work, it's about how we sustain listening, how we keep people engaged, and how we help you as leaders continue to move this forward, so that when we get to the next survey there are no surprises. What we want to do is get to a place where this is much more of a proactive tool rather than a reactive tool. So really, just letting you know we're partnering with you all the way. Sussex has invested in this and, like some of our other universities, is taking this to the next level. I think it's always been a serious piece of work, but taking it to the next level and making sure that the leaders and managers who have access to these dashboards are well informed, understand and know what to do is a critical part of this strategy. So I think that's all the slides done. Any initial thoughts, ideas? Any other questions? I'll stop sharing. Sarah, do you have anything else you want to say?

    Sarah Engineer: You know, there have been a few bits in the chat, and I think I've gone through and answered most of them. There was a question around whether we've got bullying and harassment questions in the survey. I think we've got, how many have we got? They're all under the demographic questions. Are you able to just talk to those briefly? Because I know that we've changed them: they're now going to be a yes/no response. So yes, we do have the bullying and harassment questions. I think it's: have you been bullied or harassed in the last 12 months? And then there is: have you witnessed any bullying or harassment? And instead of having them on the Likert scale, as I think we did in the pulse survey, I think this year we're just having a yes/no response to those questions. Is that right, Emily? Have you got them? I don't have them to hand. Yeah.

    Emily Hopewell: So I think previously, yeah, they were strongly agree to strongly disagree, which isn't really the best scale for these kinds of questions; yes/no is better. However, there will be a piece of diagnostic work done to make sure we can look at any historical data for those, because I know there's a lot of activity that's happened, so we want to measure whether that has worked or not. So absolutely.

    Sarah Engineer: And I think there is also the follow-up question of whether you have reported it, for example through Report and Support: yes, no, prefer not to say. So we are including those questions.

    Lisa Hughes: Anything else from anybody? How are we feeling? Any feedback initially? Are you excited to get all this insight from your colleagues? There's a few smiles. Possibly. Marina, I hope I said your name right.

    Marina Pedreira-Vilarino: Hi, yes, thank you, Lisa, and thank you for the helpful presentation. A couple of quick questions, really. One relates to when you mentioned that it would be very clear who people are being asked about. There was a list of constituents across the university, and I think it had line managers as opposed to senior leaders, and I know this has caused some confusion in the past. When you say it will be clear, do you mean it will be clearer than it is right now in that list?

    Lisa Hughes: Yes, it will. In the survey, there's a narrative before they answer the question about who they're actually sharing their voice and opinion about, because we often have a few challenges around that. So we want to make sure we're helping all your colleagues to answer that in the right way. So yes, absolutely.

    Marina Pedreira-Vilarino: Thank you. That's really helpful. And then my second question is, you mentioned that we would be able to benchmark data historically. Is this from now on, or will we have access to previous data so that we are able to benchmark against our previous teams' data?

    Lisa Hughes: Yes. So from 2022, and previous to that, there are some questions. Remember, they were just pulse surveys, so you won't have historical data for all of the questions, because we haven't asked them all before, but where we have, you will have some historical data. We always caveat that depending on the timeline, because the workforce changes and so on, but we do try to do the mapping in the background to make sure it's relative to the right team, and a lot of work goes into that. When it's a bit closer, like within 18 months, I think it's a stronger historical comparison, but I think it's still useful, especially if there's been some work and activity in that place, new leaders coming in, new managers and things like that; we can definitely see whether there's been some change. And again, with the bullying and harassment questions and some other very specific questions that were asked at Sussex previously, we want to make sure that we don't lose the value of asking those questions in a pulse, where possible. As you can imagine, our question list starts to grow, and we need to make sure it doesn't become a big census where no one wants to look at the insights because it's too complicated. So we feel like we're in a really strong, balanced place, but all of that is still factored in. I hope that helps. Yes, great questions.

    Marina Pedreira-Vilarino: Thank you very much.

    Lisa Hughes: No worries. Elizabeth. Yes.

    Elizabeth Rendon-Morales: Thank you. Hello everyone, and thank you, Lisa, for the presentation, and to Emily also for the explanation. It is my second year having this experience with the pulse survey, so this is really good, and for the whole staff I recognise that it is very important. But my question relates to the narrative of the survey. Are you going to have a narrative or a question explaining to staff what is going to happen after people complete the survey? Because, for example, many staff say they don't want to answer the survey because they don't know whether it is really going to lead to action after they complete the information. That's why it would be very important to mention what steps will be taken with this information and how it is going to be shared after the completion date. So I just wanted to ask if there is any idea of a standard timeline.

    Lisa Hughes: Yeah. I mean, that's where we see the best results, and that's absolutely where the driver and purpose of this survey is quite different to what's happened before. I know Kerry and Sarah have been working on the communication, and the communication will start to come out. But then also, when the emails come out, there's an initial email inviting you to do the survey and explaining it, and then there's a bit more of a detailed one when you go into the survey about the whys, the whats and how it will be used. So this, talking to you as leaders, is just one part, and the tiniest part so far, of doing this; there's going to be lots of other engagement. Sarah, would you like to add to that?

    Sarah Engineer: Yeah, brilliant, thank you, Lisa. Just to add to that: we are also trying to be a lot more transparent about the timelines and what happens next, post-survey. We're certainly working on some updated web pages at the moment, and I think Lisa had one of the slides where we broke down the timeline. Again, we know from when we've run surveys before that it has felt like: I've completed the survey, and then I'm waiting quite a long time before I see any results. So we've been working very closely, like I said, with Kerry and Sasha, and just generally, to look at that time frame. The survey will close on the 20th, and then on the 13th of November we will have a first presentation of results to our University Leadership Forum. After that, the following week, hopefully from the 18th of November, results will start to be shared out locally with Schools and Divisions. So we have worked very hard to make sure that the turnaround time and that visibility are a lot clearer than maybe we've had previously.

    Lisa Hughes: I was trying to put that up then. Oh, yes, thank you. There we go. And I think also, you know, best laid plans and all that. We're looking at much more engagement, but it also depends on the results. Let's be honest, we're not going to hide anything, because the results are the results, but we need to think about how they impact and how they land. With the majority of universities I work with, and I work with quite a few, it's always very balanced, and we all probably already know some of the areas that are going to come up and pop up; it's about what we do with that next. But I can honestly say Sussex is really invested in this this time. We went through a process to be your partner again before this happened, so they went out to market to make sure they have the best possible survey provider; thankfully, you picked us. So this is being taken very seriously to make sure that we've got the right approach.

    But Elizabeth, and I think this goes to all of you: if you're not seeing it, if you're not hearing it, or you don't feel like it's landing, talk to us, because that's where we can react and be more agile. Once we start to see the responses come in, because you'll have a live response tracker, and I think between Kerry and Sarah they'll be sharing that, there's always a sigh of relief for us in the first hour, isn't there, Emily, that people are taking part. But we can really start to help you, rather than going in and saying you haven't completed your survey: it's more about why it isn't landing and what else we need to do, maybe run a session or get one of the sponsors in to go and talk to that team or department just to make them feel reassured. So it helps us to be more flexible, more agile, and move into that space. Okay, we have one minute to go. As Sarah said, the slides will be available and lots of communication will start to come out. But rest assured, we're here to partner with you, so if you have any questions after this session, confidentially if you want, you can just reach out to me and Emily and we will look to answer those. We want to make sure you have everything you need, and that you feel comfortable, when you're talking to colleagues in your teams, that yes, they should take part, because it's going to make a difference. Okay, well, if there's nothing else, we will leave you to your early evening now, and we look forward to sharing the results with you once they're in. Brilliant. Thanks, Sarah. Thanks, Emily. Thanks, everybody. Thank you. Bye.
