"Obsessively focusing on 'what isn’t right' can take a design from 'nearly there' to 'there' and beyond." - Naz Hamid
One of the many things that made working at Google so enjoyable was the direct link between me, the UX designer, and the development team. Whenever a part of the frontend was close to completion, an engineer would tap me on the shoulder to take a look at their most recent creation. We discussed the challenges and merits of my design suggestions, worked out the final solution, and put it into code right away. Doing this on an ongoing basis ensured that a proper and coherent UX was carried through every part of the software - as it was being created. To me, this is the ideal way of implementing Quality Assurance.
If you are not lucky enough to work directly with an in-house development team, chances are you may find yourself in a waterfall environment, which means you'll have to review the output from the development team in a dedicated QA sprint.
Here is my checklist for conducting this sort of review:
✅ Compare to the design
Rather than looking at the implementation side by side with the design - or, worse, in isolation - I recommend overlaying it with your designs.
Export screenshots of every page from the development environment. A Chrome extension like FireShot lets you capture an entire page at once.
Overlay it with your screen designs in Sketch, Photoshop, or whatever software they were created in, and turn the opacity to 50%.
Look for flaws. Where the specifications have been replicated faithfully, the two layers overlap perfectly and nothing appears to change. Discrepancies show up immediately as a ‘ghosting’ effect on misaligned elements.
Annotate designs with actions. Don’t get hung up on every flaw you see right away; instead, mark each one according to the action it requires. I use coloured circles in order of priority, for example:
Green: Lowest. You may choose to ignore flaws such as padding that’s off by a tiny amount when faced with many higher-priority items.
Orange: Medium. Styles look off, e.g. a button has the wrong height or colour.
Red: Highest. Content or structure show discrepancies.
Blue: Investigate. Sometimes the implementation doesn’t match the design but actually works better, or turns out more accurate or consistent. That means I made an error or didn’t fully consider an element or its behaviour. These are the items I take away and review again through a UX lens to make sure the best alternative solution has been found.
Make a list. Once you have identified all discrepancies, collate them in an actionable document (a ticket, a comment in a Google Doc … anything where your engineering counterpart can keep track of them). But before you call a big meeting to fix everything, carry on with the next step.
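The overlay check itself is easy to script. Below is a minimal pure-Python sketch, assuming both screenshots have already been decoded into equally sized lists of RGB tuples (e.g. via an image library; the decoding is out of scope here). Blending at 50% opacity reproduces the overlay, and a per-pixel diff is the programmatic equivalent of spotting the ‘ghosting’:

```python
def blend(design, implementation, alpha=0.5):
    """Blend two equally sized pixel buffers (lists of RGB tuples).

    alpha=0.5 reproduces the 50% opacity overlay described above.
    """
    return [tuple(round((1 - alpha) * d + alpha * i) for d, i in zip(dp, ip))
            for dp, ip in zip(design, implementation)]


def mismatched_pixels(design, implementation, tolerance=8):
    """Count pixels whose channels differ by more than `tolerance` (0-255).

    A non-zero count means the layers don't overlap perfectly somewhere.
    """
    return sum(1 for dp, ip in zip(design, implementation)
               if any(abs(d - i) > tolerance for d, i in zip(dp, ip)))
```

The tolerance guards against harmless differences from anti-aliasing or screenshot compression; tune it to taste.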
✅ Check against the style guide
Due to the nature of CSS and pattern libraries, if an element is wrong in one place, it is likely off in others too. So when you find a flaw, check it against the style guide to see whether it is an isolated incident. There is no point in creating a ticket for every instance of the same flaw when, really, it only needs to be fixed in one location.
Conversely, go through the style guide and make sure that every element it covers appears, correctly implemented, in the designed screens. This helps ensure that future features and pages are developed correctly too.
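A rough way to automate the coverage half of this check is to compare the class names the style guide defines against those the pages actually use. A simplistic regex-based sketch (real CSS parsing is more involved, and the selector and markup patterns matched here are assumptions about your codebase):

```python
import re


def unused_styleguide_classes(styleguide_css, page_html):
    """Return classes the style guide defines but the page markup never uses.

    Naive on purpose: matches `.class-name` selectors in the CSS and
    class="..." attributes in the HTML; it won't handle every real-world case.
    """
    defined = set(re.findall(r'\.([\w-]+)', styleguide_css))
    used = set()
    for attr in re.findall(r'class="([^"]+)"', page_html):
        used.update(attr.split())
    return sorted(defined - used)
```

Anything this returns is a style-guide element you should hunt for in the screens before signing off.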
✅ Play with responsiveness
Once the designs are in decent shape, it’s time to check how they behave responsively. Don’t be lazy: do this on actual mobile devices. Swiping and tapping on a phone or tablet is a very different experience from clicking around a resized browser window. If you are working on a web product, use different browsers and phone makes (at least iPhone and Android), as the browser controls affect the experience too. If you find mistakes, log them in the same actionable document you used for the visual design flaws.
✅ Review the logic
Once you have carried out the previous steps and made sure that all the content is in the right places, you have a good basis to check the logical steps users will go through when interacting with your product.
To do this, make a list of the jobs to be done or user stories you want to test and go through them one by one, comparing the flow of events to whichever asset you used to specify them to begin with - flow charts and user stories being likely candidates. Especially when working with off-site development teams, links that get you places they shouldn’t are a common occurrence.
Note that this is not interchangeable with user testing: here you are verifying the correct implementation of the UX, not the UX itself. User testing should have happened much earlier in the development process.
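Parts of this flow review can be scripted for web products. A small stdlib sketch that extracts every link from a page and flags destinations the specification doesn’t allow - the `expected` set here is a hypothetical stand-in for whatever your flow chart or user stories say a page may link to:

```python
from html.parser import HTMLParser


class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag in a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)


def unexpected_links(page_html, expected):
    """Return hrefs that take users places the spec says they shouldn't go."""
    parser = LinkCollector()
    parser.feed(page_html)
    return [href for href in parser.links if href not in expected]
```

Run it over each page in a flow and anything it returns is a link worth walking through by hand.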
✅ Monitor performance
Don't forget the invisible. Wait times and performance issues matter hugely to the overall experience, so always test the performance of your product. On a web project, you can use Chrome DevTools to do this:
Open the page you want to test in Chrome.
Right-click (Control+click on a Mac) anywhere on the page and choose “Inspect” from the context menu.
In the DevTools panel that opens, click the “Network” tab.
Once this is open, reload the page (⌘+R on a Mac, Ctrl+R on Windows).
You will now see a list of every resource the page loads with its corresponding load time, as well as a waterfall chart of the overall page load. Use this to find out which elements are slowing the page down.
Some common causes of poor site performance are oversized assets such as images or videos, slow scripts, blocking API calls, and slow backend processes. Try to fix whatever causes delays with your team or, if you can’t, apply techniques that shorten the perceived wait time.
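If you want a number to paste into the ticket rather than eyeballing the waterfall chart, you can also time a fetch yourself. A minimal stdlib sketch (DevTools remains the better tool for per-resource breakdowns; this only measures the raw fetch of a single URL):

```python
import time
import urllib.request


def time_request(url):
    """Fetch `url` and return (elapsed_seconds, body_size_in_bytes)."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as response:
        body = response.read()
    return time.perf_counter() - start, len(body)
```

Run it a few times and against several pages - a single measurement is noisy, but a page that is consistently slow will show up quickly.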
✅ Bonus: Check the movements
Strictly speaking, this is part of the visual QA described at the very beginning, and if your product doesn’t rely heavily on animations, you’ll probably skip it. Still, it is always worth a look: every animation designed and implemented for the first time sets the standard for the ones to come.
Depending on the complexity of your animation, you might get away with watching it a few times and signing it off, or you can go all out and replicate the overlay method I described for static screens:
Create a screen recording of the live animation.
Overlay it with the design in an animation or video editing program such as Adobe After Effects or Premiere.
Turn the opacity to 50%.
Click through it frame by frame to uncover mismatches.
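The frame-by-frame step can also be scripted once both recordings are decoded into raw frames (e.g. with a video library; the decoding is out of scope here). A minimal sketch, assuming each frame is a list of RGB tuples and both recordings run at the same frame rate:

```python
def first_mismatched_frame(design_frames, live_frames, tolerance=8):
    """Compare two recordings frame by frame.

    Returns the index of the first frame where any pixel channel differs
    by more than `tolerance` (0-255 scale), or None if they match -
    i.e. exactly what clicking through frame by frame would reveal.
    """
    for index, (design, live) in enumerate(zip(design_frames, live_frames)):
        for dp, lp in zip(design, live):
            if any(abs(d - l) > tolerance for d, l in zip(dp, lp)):
                return index
    return None
```

Knowing *which* frame drifts first tells you whether the easing, the duration, or the end state is off.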
Now it’s your turn. How do you make sure that your vision makes it in front of your users? Please share your tips in the comments below.