By Björn Kjellander
Director of Accreditation and Quality
Jönköping International Business School – CUME
Many schools struggle to develop structures for systematic reporting to accreditation agencies. With several national and international accreditations demanding cyclical documentation based on different standards and criteria, showing evidence of quality assurance is often intensive for both faculty and administration. The reporting work typically intensifies when deadlines and external review visits are imminent, but once the visit is over, many schools fail to align accreditation demands with the organization’s information, data, and communication channels and responsibilities.
There are of course companies out there who focus on helping schools build workflow processes and take the step from working in Excel to a database solution. This step has become even more crucial in the wake of GDPR legislation, under which maintaining faculty data in Excel sheets that are inaccessible to the individual faculty member is a major no-no.
Our school had previous experience with different kinds of database solutions. These systems had been adopted and implemented without a proper mapping and analysis of our needs. Therefore, before beginning a process to migrate to a new solution, we decided this time to take a very critical look at our current operations, systems, and processes, to understand where we were and where we wanted to be. This became the first step in a GAP analysis, in which we later specifically defined where we wanted to go.
Our GAP methodology built on lessons learned by doing, which in the end made the process and result effective ones – even though at times we could not see the forest for the trees. At the outset, there were plans to run this as a full-scale project, with elaborate Gantt charts, involving a broad set of internal groups. However, we reached a point in the planning where we decided to trim the number of groups involved and focus on the groups and individuals who could help us unearth the existing applications, practices, and procedures at the school and at the university, as well as their interconnections and respective system needs.
Therefore, when we began drafting a solution specification, we primarily involved university colleagues from IT and the library (the latter often very skilled in database management). Periodically, we also used significant internal school stakeholders as a sounding board. With a focused project team came also a focus on the goal: what is it that we are trying to define, and what do we need to be able to do? In addition, what are the critical objectives to make this happen, and what is the hierarchy among these objectives?
We also tried to clearly define what a good solution would look like. A system feature high on our list was the solution’s ability to produce reports for the various needs of our international accreditors. It had proved quite time-consuming to develop and maintain data according to each reviewer’s specifications; it also meant that we had to ask the same source several times for similar but still different data. We also needed a hierarchy of importance: what would we like a solution to fix short-term, that is, better process, focus, and other relatively low-hanging fruit? But already at this stage we also needed to envision our medium-term goals for the solution, to ensure that we optimized not only for our current activities but for our future ones as well.
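The idea of maintaining one canonical dataset and deriving each accreditor's report from it, rather than collecting similar data several times, can be illustrated with a minimal sketch. All field names, record shapes, and "accreditor" formats below are invented for illustration; they are not our actual system.

```python
# Hypothetical sketch: one canonical faculty dataset feeding two
# differently shaped accreditor reports, so each source is asked once.

faculty = [
    {"name": "A. Andersson", "highest_degree": "PhD",
     "publications_5yr": 6, "teaching_hours": 120},
    {"name": "B. Berg", "highest_degree": "MSc",
     "publications_5yr": 1, "teaching_hours": 300},
]

def report_accreditor_a(records):
    """One accreditor might ask for counts of faculty qualifications."""
    counts = {}
    for r in records:
        counts[r["highest_degree"]] = counts.get(r["highest_degree"], 0) + 1
    return counts

def report_accreditor_b(records):
    """Another might ask for per-person recent research activity."""
    return [(r["name"], r["publications_5yr"]) for r in records]
```

The point is not the code itself but the design choice: the data is entered and maintained once, and each reviewer's specification becomes just another view of the same records.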
Something we had learnt the hard way was that our local circumstances and system specificities did not always fit an external solution’s adaptability. For example, to avoid double work, we wanted a database that could connect and sync with other university databases, such as our national research repository system. In addition, all such imports needed to comply with our data policies.
Sometimes things do go wrong with data imports. It usually starts with some of the tables not making sense, and then you have to back-track through an audit trail to where the problem originated. We spent so much time on this with our previous solutions, only to find that the quality chain of data upload, validation, and auditing did not always make sense. Therefore, our GAP analysis was very clear on this matter: avoid excessive manual work, and insist on high-quality data.
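The quality chain described above – upload, validation, auditing – can be sketched in a few lines. This is a minimal, hypothetical illustration, not our actual system: the field names and validation rules are invented. The point is that rows are checked at import time and rejections are logged with their reasons, so problems are traceable at the source instead of surfacing later as tables that do not make sense.

```python
# Hypothetical sketch: validate rows on import and keep an audit
# trail of rejected rows, so bad data is caught at upload time.

def validate_row(row):
    """Return a list of validation errors for one imported row."""
    errors = []
    if not row.get("name"):
        errors.append("missing name")
    if not isinstance(row.get("publications"), int) or row["publications"] < 0:
        errors.append("publications must be a non-negative integer")
    return errors

def import_rows(rows):
    """Split incoming rows into accepted data and an audit log."""
    accepted, audit_log = [], []
    for i, row in enumerate(rows):
        errors = validate_row(row)
        if errors:
            # Record the source row and the reasons it was rejected.
            audit_log.append({"row": i, "source": row, "errors": errors})
        else:
            accepted.append(row)
    return accepted, audit_log
```

With a log like this, back-tracking a broken table becomes a lookup rather than detective work.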
Overall, we needed to be in control of the solution and make it into ours, so that the different system screens and tabs reflected how we were organized.
At the end of this process we had a fairly long list of features we wanted to see in a database system. The final additions were ‘customer care’ and a clear implementation process. So much time had been spent trying to figure out our previous system ourselves, looking through various online videos and sending half-desperate e-mails.
When we were done with the GAP items, we again compared them with the system we had in place and then asked detailed questions of other main providers of database solutions at various conferences. The whole GAP process gave us a far better understanding of our needs and specificities, but it also led us to involve local experts in our own university organization, which gave a more comprehensive, multi-angled perspective on what we needed.
What we also realized was that accreditation processes were in fact useful for streamlining our internal processes and embedding our quality requirements in our day-to-day management. Our GAP analysis also made us realize that we needed to change the mentality of some of our stakeholders. A system is only as good as the data fed into it, and the data do not magically appear out of thin air. One solution was to import from our existing systems, and that made us realize that we had gaps in our own data. Finally, a system is a good start, but we also had to think about how to accompany our different users through change management.