When you're an experienced front-end developer, it takes just one look and a few mouse clicks to judge whether the site in your browser is correctly implemented. Working on these problems day after day, you write dozens of lines of HTML/CSS, checking the outcome each time.
Over time some develop an intuition and utilise it effectively during the manual test phase of an interface implementation review. But what about people who have no such experience? What if you're a tester without a front-end development background?
Once, when confronted with such a situation, we decided it was time to write down our front-end knowledge and outline a dedicated testing process so that anybody on the team could participate. So why not share it with everyone?
FE & QA
The Front-end Quality Assurance process focuses on proving that the interface implementation is both accurate and faithful to the design, while remaining flexible and behaving correctly in different environments and while serving diverse content.
The Quality Assurance review is based on two lists: the criteria list and the components list. The criteria list covers the aspects that should be addressed while testing every element of the interface. The components list enumerates, in hierarchical order, every component defined as part of the interface - from the most primitive and atomic, through more complex and coupled ones, up to whole views and layouts used within the application.
Every step of this process originated as a manual activity that heavily depends on in-browser development tools (think Firebug or the inspector). With the right tools and approach many of these steps can be automated, but a human is still the key actor in this process.
The following criteria are reviewed:
- design accuracy
- responsiveness to viewport
- responsiveness to content
- components behavior
- 3rd party technologies
Every item on the list should be considered a control step for the reviewed part of the interface. Whether a given item belongs on the checklist depends strictly on the project and its features. For the same reason, the lists below should serve as guides and you're welcome to extend them to your needs.
Design accuracy
First and most important is the visual comparison of the implemented component to its design. It's not only about whether it looks similar to the design - it's about whether it matches it in every detail.
The most common properties to review are:
- sizes (width, height, line height)
- font family
- font size
- font weight
- spacing (margin, padding)
- list styling (ul, ol)
- number and date formats
Design accuracy should be assured on all supported browsers and all supported platforms. That includes ensuring that any differences resulting from platform and browser engine rendering capabilities are within a margin of tolerance.
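Parts of this comparison can be scripted. Below is a minimal sketch of the idea; `compareStyles` and the design spec object are our illustrative names, not an existing API. In the browser the observed values would come from `getComputedStyle(element)`; the check itself is a plain function.

```javascript
// Sketch: compare observed CSS values against a design spec.
// `compareStyles` and `designSpec` are illustrative names, not a real library API.
function compareStyles(designSpec, observed) {
  const mismatches = [];
  for (const [prop, expected] of Object.entries(designSpec)) {
    if (observed[prop] !== expected) {
      mismatches.push({ prop, expected, actual: observed[prop] });
    }
  }
  return mismatches;
}

// In a browser, `observed` would be collected with getComputedStyle(element);
// here a hand-written sample shows the shape of the result.
const spec = { 'font-size': '14px', 'font-weight': '700', 'line-height': '20px' };
const observed = { 'font-size': '14px', 'font-weight': '400', 'line-height': '20px' };
console.log(compareStyles(spec, observed));
// one mismatch: font-weight expected '700', actual '400'
```

A report of property-level mismatches like this is easier to act on than "it looks off".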
Responsiveness to viewport
If the application supports responsiveness for different viewports of client devices, design accuracy should be reviewed for every supported viewport configuration (compared against the corresponding design of the alternative state) and platform.
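The combinations multiply quickly, so it helps to enumerate them explicitly. A small sketch, assuming example breakpoints and platforms (your project's real matrix will differ):

```javascript
// Sketch: enumerate every viewport/platform combination that needs review.
// The breakpoints and platforms here are examples, not project requirements.
const viewports = [
  { name: 'mobile', width: 375 },
  { name: 'tablet', width: 768 },
  { name: 'desktop', width: 1280 },
];
const platforms = ['Windows/Chrome', 'macOS/Safari', 'Android/Chrome'];

function reviewMatrix(viewports, platforms) {
  const cases = [];
  for (const v of viewports) {
    for (const p of platforms) {
      cases.push(`${v.name} (${v.width}px) on ${p}`);
    }
  }
  return cases;
}

console.log(reviewMatrix(viewports, platforms).length); // 9 combinations
```

Even three viewports and three platforms already produce nine review cases per component, which is a good argument for automating the screenshot step later.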
Responsiveness to content
Every component that contains variable content is checked against multiple combinations of possible content states. Whether it's the most common case, the minimal volume or the maximum complexity - the process assures the flexibility and reliability of components. Since a change of component state usually affects the content with a different look, additional text or indicators, states should be tested alongside the other content variations.
List of tested aspects:
- component states
- text/number length
- text wrapping
- text/number align
- image align
- image floating
- content overflow and overlapping
The easiest way is to prepare a review of the same component filled with diverse content, or switched to certain states, in the form of a dedicated list view. For that purpose we create and maintain a component library dedicated to the application - each change to component code can be immediately reviewed across the different possible scenarios. If you cannot afford to develop a dedicated interface library, just use the inspector to manipulate the component in the tested case and observe how it responds to modifications of content or styles.
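Generating the content variations themselves can also be scripted. A minimal sketch (the function name and the particular variation set are ours; extend both per component):

```javascript
// Sketch: generate edge-case content for reviewing a single component.
// `contentVariations` is an illustrative helper, not part of any library.
function contentVariations(label) {
  return {
    minimal: label.slice(0, 1),                    // shortest sensible value
    common: label,                                 // typical content
    long: label.repeat(20),                        // forces wrapping/overflow
    unbroken: label.replace(/\s/g, '').repeat(20), // no word-break opportunities
  };
}

const cases = contentVariations('Order history');
console.log(Object.keys(cases)); // ['minimal', 'common', 'long', 'unbroken']
```

Rendering each variant side by side in the component library view makes overflow and wrapping regressions visible at a glance.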
Components behavior
Most of the used components are implementations of proven interface patterns, and as such each of them often brings a number of component-specific behaviors which should be tested.
Example list of components and their expected behaviors:
- easy access
- scales with an increasing number of items
- highlights the current state
- scales with an increasing number of items
- distinguishes root levels from sub-levels
- distinguishes the header from the content
- properly formats and aligns different types of data
- fixed header
- does not conflict with the rest of the layout
- stays on the top level of all components (it is not covered by any other UI component)
- always visible
- synchronized with the related data set
- scales with an increasing amount of presented data
- labels are readable
- form fields
- error handling is supported
- clicking a label focuses the corresponding input
- aligns and positions itself within the view
- can be accessed (it is not covered by any other UI component)
- in a popup/blocking state it is not passable and requires an action
This list is just an example. Fill it with the patterns used in your app and extend it with every UI element you think should appear here. Remember that this is not only a guide for testers - it's also part of the specification for the interface.
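Some of these behaviors can be checked mechanically. For example, "clicking a label focuses the corresponding input" reduces to verifying that every label's `for` attribute targets an existing input id. A sketch with plain objects standing in for parsed DOM nodes (in the browser you would collect them with `querySelectorAll`):

```javascript
// Sketch: find labels whose `for` attribute does not target an existing
// input id, i.e. labels that will not focus anything when clicked.
// The field descriptions are plain objects standing in for DOM nodes.
function unmatchedLabels(labels, inputs) {
  const ids = new Set(inputs.map((i) => i.id));
  return labels.filter((l) => !l.for || !ids.has(l.for)).map((l) => l.text);
}

const labels = [
  { text: 'Email', for: 'email' },
  { text: 'Password', for: 'pass' }, // typo: input id is 'password'
  { text: 'Remember me' },           // missing `for` entirely
];
const inputs = [{ id: 'email' }, { id: 'password' }];

console.log(unmatchedLabels(labels, inputs)); // ['Password', 'Remember me']
```

Checks like this turn a behavior item from the list into a pass/fail result rather than a judgment call.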
The review assures that no action taken in the interface blocks the application, and that it remains usable throughout. Changing views, switching states, sorting data or animating components should not degrade the user experience with slow animations or long view processing or rendering (assuming these are not caused by high latency of the API response).
Evaluated interface aspects:
- live updates
- data synchronization (timing)
- data-heavy views/lists/tables
- document rendering
Be especially careful when reviewing data-heavy lists or tables. On low-performance machines or mobile devices, the additional burden of re-rendering the interface can make it unresponsive or even unusable.
3rd party technologies
Some applications require 3rd party technologies or extensions like Flash or Silverlight, whether for core functionality or as a fallback/polyfill for an unavailable browser API. Tests should check that components based on those technologies work properly when the technology is installed in the browser environment, and that its absence is gracefully handled by either fallback logic or a proper message to the user.
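The fallback decision itself is worth isolating so it can be tested directly. A sketch, where `capabilities` stands in for real feature detection (e.g. checking `navigator.plugins` or the presence of a browser API) and the player names are illustrative:

```javascript
// Sketch: pick a component implementation based on available technologies.
// `capabilities` stands in for real feature detection in the browser.
function pickPlayer(capabilities) {
  if (capabilities.html5Video) return 'html5';
  if (capabilities.flash) return 'flash';
  return 'message'; // graceful degradation: tell the user what is missing
}

console.log(pickPlayer({ html5Video: true, flash: false }));  // 'html5'
console.log(pickPlayer({ html5Video: false, flash: true }));  // 'flash'
console.log(pickPlayer({ html5Video: false, flash: false })); // 'message'
```

With the decision extracted, the review only needs to verify each of the three branches once, instead of re-testing the whole component in every environment.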
The components list consists of five levels:
- Basic elements
- Individual components
- Coupled components
- Views
- Layouts
The list should be based on the design of the application interface, and can take the form of a component library, documentation or a table. During the review process each component should be checked against every criterion from the previous list.
Basic elements are the most primitive (and often atomic) parts of the interface, like a button, text field, hyperlink or list. All of these elements are defined within the HTML language, and browsers provide a default look and behavior for them. The list should contain every basic element that alters the default styles and/or behavior.
Every recurring part of the interface that requires complex logic or uses more than one basic element should be considered a component. Because a component usually consists of multiple elements, the review should verify that the code defining its new look and behavior does not alter the base look and behavior of those elements. Individual components should also be independent - they should not influence the surrounding elements of the interface.
This is the next level of component complexity - components built on top of other components, whether using them unchanged yet requiring their presence, or altering their properties for a combined purpose. The concerns are similar to the previous level - the review should assure that the additional code expanding or using existing individual components does not affect their original features when they are used outside the coupled component.
Usually built from basic elements and components, views form the core content of the application and drive user actions. This step of the review assures that every view consists of the right elements and components, in the right places, with working interface logic. A view mockup, layout or design should be used as a visual guide.
The last and mostly optional level is the application layout. Usually an application has one layout for all views. In some cases different user roles or visual themes introduce different layouts, which customize or alter the default composition of views or the look and behavior of components. Each layout should be reviewed against the corresponding designs or color palette maps.
The presented process can be partially automated with an appropriate tool setup, resulting in visual regression tests. Depending on the tools used, up to the first four points of the criteria list can be included in automated testing.
When a new element or component is created, it should be presented in the component library with as many variants as needed to cover the necessary cases (possible states, possible content), and then added to the list of components involved in the tests. The test process captures images of the current look of all listed components and compares them with the source images, reporting every mismatch. A mismatch occurs when an existing component's look has changed, or when there is no source image. Each mismatch is reviewed and either marked as requiring a fix (an accidental, unwanted change) or marked as intended (a deliberate component change, or a completely new component). When a change is intended, the current image becomes the new source image.
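The mismatch decision described above can be sketched as a small function. Real setups compare screenshots pixel by pixel with an image-diff library; here flat arrays of pixel values stand in for images, and the `tolerance` threshold is an assumption you would tune per project:

```javascript
// Sketch of the visual-regression mismatch decision. Flat arrays of pixel
// values stand in for screenshot images; `tolerance` is a tunable threshold.
function classify(sourceImage, currentImage, tolerance = 0) {
  if (!sourceImage) return 'new'; // no source image -> mismatch to review
  let diff = 0;
  for (let i = 0; i < currentImage.length; i++) {
    if (sourceImage[i] !== currentImage[i]) diff++;
  }
  const ratio = diff / currentImage.length;
  return ratio > tolerance ? 'mismatch' : 'match';
}

console.log(classify(null, [0, 0, 0]));        // 'new'
console.log(classify([0, 0, 0], [0, 0, 0]));   // 'match'
console.log(classify([0, 0, 0], [0, 255, 0])); // 'mismatch'
```

The 'new' and 'mismatch' results both go to a human reviewer, who decides whether the current image becomes the new source.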
If you wonder what tools could be used or how to set up such a process - check our blog soon. Another post dedicated to visual regression tests is on its way.
What's in it for me
Despite appearances, testing a front-end implementation is quite a complex task. The numerous possible looks, states and behaviors require patience, accuracy, spatial imagination and broad knowledge of interface quirks.
If you think the presented process is repetitive, time-consuming and tiring - you may be right, but the same applies to any testing process. And that's the price of quality.