ExamenTool
About the Project
Also built during my internship at New Designers, ExamenTool was commissioned by The Hague University of Applied Sciences to solve a very real and recurring problem. Invigilators were using a manually updated Excel sheet to verify student registrations at exam halls — a sheet that had to be refreshed every 8 days and was prone to errors. Students were being turned away from exams they had registered for. The university had had enough, and came to us to build something better.
I worked alongside my colleague Sara as UX Intern, and together we designed and built the system from the ground up using Next.js, Tailwind CSS, Supabase, and TinaCMS.
Design Process
Unlike projects where the problem is fuzzy, here it was crystal clear from day one — the Excel sheet had to go. That clarity was a gift. It meant we could move fast, stay focused, and pour most of our energy into execution rather than discovery. But even with a clear problem, we still needed to understand the people living with it before we could design a solution worth building.
Research
Our research was deliberately focused. We conducted desk research to benchmark existing solutions and understand what a reliable exam management system looks like in practice. We then sat down with the people who actually used the Excel sheet day-to-day — invigilators and administrators — and walked through their workflow step by step.
What we found confirmed what the university already suspected: the manual update cycle was the root of everything. But the interviews also revealed something important about roles — invigilators and administrators had very different needs and levels of access, which had to be reflected in the system we designed.
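That role split translates naturally into a small access model. A minimal TypeScript sketch of the idea — the role names come from our research, but the permission strings and function names here are illustrative, not from the actual codebase:

```typescript
// Hypothetical role model reflecting the two user groups we interviewed.
type Role = "invigilator" | "administrator";

// Illustrative permission sets; the real system's capabilities may differ.
const permissions: Record<Role, string[]> = {
  invigilator: ["scan:verify", "students:read"],
  administrator: ["scan:verify", "students:read", "students:write", "sessions:manage"],
};

// True if the given role is allowed to perform the action.
function can(role: Role, action: string): boolean {
  return permissions[role].includes(action);
}
```

The point of a model like this is that invigilators get exactly what exam-day verification requires, while write access to student data stays with administrators.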
Students were being turned away from exams they had registered for. That's the problem we were solving.
Ideation
Knowing we needed "some kind of system" is a starting point, not a plan. Before a single screen could be designed, we had to define exactly what we were building — every function, every role, every flow. That meant ideation.
We ran sessions using all the classic methods — How Might We, mind mapping, user journey mapping, and more — to move from vague brief to concrete product definition. We mapped out the full feature tree: what administrators needed, what invigilators needed, how data would flow between them, and what the system had to handle at the moment it mattered most — the exam hall door.
By the end of ideation, we had a clear picture of the product we were going to build. That clarity is what made the execution phase as focused as it was.
Prototyping
The heart of the system was the scanning interface — the tool invigilators would use to verify registrations at the exam hall door. Alongside the scanner, we built out the full student list view with filtering and sorting options, so invigilators could quickly find a student by name, registration number, or status. We also designed the administrator interface — a separate set of tools for managing exam sessions, updating student data, and overseeing the system — with clear role separation between admin and invigilator access built in from the start.
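The list view's core logic fits in a few lines. A hypothetical TypeScript version — field names and the `filterStudents` helper are assumptions for illustration, not the production code:

```typescript
// Illustrative student record; real field names may differ.
interface Student {
  name: string;
  registrationNumber: string;
  status: "registered" | "checked-in" | "not-registered";
}

// Case-insensitive match on name or registration number,
// with an optional status filter, sorted alphabetically by name.
function filterStudents(
  students: Student[],
  query: string,
  status?: Student["status"],
): Student[] {
  const q = query.trim().toLowerCase();
  return students
    .filter(
      (s) =>
        (q === "" ||
          s.name.toLowerCase().includes(q) ||
          s.registrationNumber.toLowerCase().includes(q)) &&
        (status === undefined || s.status === status),
    )
    .sort((a, b) => a.name.localeCompare(b.name));
}
```

Keeping search and status filtering in one pure function like this makes the list view trivial to test and to wire into a React component.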
Every screen went through multiple rounds of iteration. We tested internally, ran demos with the client, and kept refining until every flow felt airtight.
Testing
Testing happened in layers. We started internally within the New Designers team, stress-testing every flow and catching edge cases before anything reached a real user. From there, we moved to the invigilators themselves — the people who would be using the scanning interface under pressure on exam day — and iterated based on their feedback.
The final and most telling test came when we put the system in front of real students at real exams. It worked. The scans were fast, the data was accurate, and the process that had previously caused so much friction ran smoothly. Seeing it hold up in a live environment — with actual students, actual stakes — was the validation that mattered most.
Reflection
ExamenTool taught me what it feels like to design something where the stakes are genuinely high. This wasn't a nice-to-have — it was a system that determined whether students could sit their exams. That responsibility sharpened my focus in a way that's hard to replicate on lower-stakes projects.
Working closely with Sara on every part of the system also reinforced something I'd started to learn at THiNKFeST: great outcomes come from great collaboration. Two designers thinking through the same problem from different angles consistently produced better results than either of us would have alone.
The app is currently awaiting final approval, but every test and demo has gone well. Knowing that something we built from scratch is about to replace a broken process — and make exam day a little less stressful for students — is a good feeling.
