By Teri Hern, University of Western Ontario

In 2013, the School of Graduate and Postdoctoral Studies (SGPS) at the University of Western Ontario released our own application system built in PeopleSoft. This article is a brief outline of the process we followed and some of our successes and challenges.


From 2002, SGPS relied on a third-party application service. This service “pushed” submitted applications into our student information system (PeopleSoft) and our document management system. We also had a homegrown reference system that emailed referees and allowed them to submit an electronic reference. Other paper documents, such as transcripts, resumes, and writing samples, were sent directly to the program.

Once the program made a decision, the application was forwarded to SGPS for processing.

Although we were able to make the system work, it was far from ideal. Some of the significant challenges we faced were:

  • Building services to accommodate the third-party application, including a reference system and a web portal for programs to communicate admission decisions to SGPS.
  • No complete application file existed in one place. References lived in the reference system, some paper documents were scanned into the applicant’s electronic file, and others remained in the paper file in the graduate program office.
  • Limited access to data. Because application documents were sent directly to each program, we never had a clear picture of how many completed applications we received. We also had no access to in-progress applications, as the third-party provider sent us only completed and paid-for applications.
  • No ability to customize. Graduate programs needed a way to provide specific information to their applicants and to require documents that other programs did not. We were directing applicants to program pages where they could find a list of required documents for their graduate application.

SGPS and the graduate programs were becoming increasingly frustrated with the limitations of the third-party application service, so it was decided that SGPS would create an application using the tools available in PeopleSoft.

Beginning the process

We knew it was important for all stakeholders to feel involved in the project. We began by holding town hall meetings and meeting with departments to find out what they wanted in an application. We also assured stakeholders that we would continue to involve them throughout the design and implementation of the application.

We also saw this as an opportunity not just to build a new application, but to re-examine our admissions process entirely. We wanted to map out the ideal admissions process and build the technology to support it, rather than building a process to fit the technology. While reviewing our procedures, we realized some were outdated and others were getting in the way of progress.

For example, our office required that applicants provide an official hard-copy transcript before we would review their application. We knew that if we wanted a system where applicants could complete their entire application online, we would have to update this policy. We decided to allow applicants to upload a scanned copy of their transcript or academic record, which we use to assess the application. Candidates who are offered admission then have a condition on their offer stating that they must provide an official transcript before they can register.

We also decided to change the flow of applications. As mentioned, SGPS used to process applications only once a program indicated it wished to offer a student admission. This meant that during busy times there was often a long queue of applications waiting to be processed by SGPS, which left applicants waiting for an official offer. It turned out this model was in place because SGPS had once reviewed applications for final approval before a student could be admitted. Since that requirement had been dropped, and programs now had the final say on admission decisions, we opted for a new workflow in which SGPS assesses all incoming applications (assigning an average, basis of admission, admission conditions, etc.) so that once a program makes an admission decision, the applicant is notified immediately.


Once we had mapped out the process, we turned our attention to the application itself. Using input from the programs, we set out four design goals:

  • Create the simplest path to submission.
  • Push all necessary information to the applicant within the application.
  • Build a customizable user interface based on program needs.
  • Ensure a submitted application is a complete application.


The development team consisted of three full-time functional staff members and two full-time developers.

Managing Resources

  • Work was divided into six development packages, which maximized resource time: one package was being developed while another was being tested.
  • Functional and technical resources remained consistently engaged in development and unit testing.

Unit Testing

  • Each work package had a dedicated schedule for unit testing.
  • Testing determined whether the package did what it was supposed to do.
  • Each unit-testing phase lasted one to three weeks, depending on the complexity of the package.

Integrated Testing

Two months were allocated to integrated testing: exercising the completed application system with different scenarios to ensure all the pieces worked together and worked with other areas of the student system. Integrated testing often involves other areas on campus; in our case, the Registrar’s Office, Undergraduate Admissions, and the security team took part. It is therefore important to schedule time with these units in advance so that testing is not held up.

  • 34 distinct user scenarios were created
  • 113 unique errors/bugs were logged


We launched our application in October 2013. SGPS now supports a full-time admissions team, with seasonal coverage during the peak season (December–March). We also created a helpline and an email address for applicant inquiries. All application fees are now collected directly: SGPS keeps a portion and distributes the rest to the individual graduate programs. The fee SGPS retains goes toward the cost of maintaining the admissions team.

We wanted to find ways to evaluate our success. One of the first things we noticed was that applications were being completed earlier than in the past, which we attributed to no longer waiting for hard-copy transcripts or other documents to complete an applicant’s file. The new process also means we are processing applications faster.

We also collected all the emails and logged phone calls to the helpline over a one-year period. We could then categorize the emails and look for improvements. For example, we found that almost 30% of emails concerned references, so we improved the instructions in that section of the application right away and looked at ways to improve the functionality itself in the future.
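The kind of tallying this involved can be sketched in a few lines of Python. The categories and counts below are illustrative placeholders, not our actual data; only the roughly 30% share for reference-related emails reflects what we found.

```python
from collections import Counter

# Hypothetical categorized helpline messages; the categories and counts
# are invented for illustration (references at ~30% matches our finding).
messages = (
    ["references"] * 30
    + ["transcripts"] * 25
    + ["fees"] * 20
    + ["technical"] * 15
    + ["other"] * 10
)

counts = Counter(messages)
total = sum(counts.values())

# Share of total inquiries per category, largest first.
shares = {category: n / total for category, n in counts.most_common()}

for category, share in shares.items():
    print(f"{category}: {share:.0%}")
```

Ranking categories by share like this is what pointed us at the reference section as the first thing to fix.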

Lessons Learned

  • Don’t rush changes into the project. Initially, we were so eager to please the graduate community that we pushed changes into the application quickly. This often created unintended consequences, either from not thinking a change through or not testing it enough.
  • Don’t underestimate the support needed for an application that is open 24/7. We have to coordinate carefully with the Registrar’s Office whenever the student system needs to go down for updates, as people apply from around the world at all hours of the day. We also need a plan to support applicants during the winter holidays.
  • Offer as much training as possible through as many avenues as possible (in person, printed materials, webinars, etc.).
  • Plan for a regular schedule of updates to your service.

Overall we are proud of the successes we have achieved with our application system. We continue to make enhancements and changes annually.

I welcome anyone with questions or feedback to please email me at