Whenever you are testing an AR experience that your creator has shared, it is important to test it thoroughly to ensure it functions properly across all kinds of people, environments, lighting conditions and more.

Different AR experiences will require different considerations when it comes to testing.

In this article we cover testing best practices for AR experiences of all kinds on the following platforms: Facebook, Instagram, Snapchat, TikTok, WebAR, and mobile apps.

Face filters

When testing face filters, it is important to ensure that the experience works on a diverse group of people.

This includes elements such as:

  • Skin tone

  • Hair colour and length

  • Gender

  • Glasses and sunglasses

  • Head and facial coverings

  • Jewellery

Environmental conditions, such as:

  • Lighting conditions. E.g. sunny outdoors, gloomy outdoors, dark indoors.

  • Backgrounds. E.g. flat colour backgrounds and busy backgrounds.

Devices, such as:

  • One newer Android device with a large screen and one older Android device with a smaller screen.

  • One iPhone newer than the iPhone X and one older than the iPhone X.

  • One Samsung phone newer than the S9 and one older than the S9.
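
To keep track of which of these device tiers have actually been covered, the checklist above can be expressed as a small data structure. This is an illustrative sketch only; the profile names simply mirror the bullets above, and you should add whichever specific models you test on.

```javascript
// Sketch: the device tiers above as a coverage checklist, so nothing
// gets skipped by relying on memory alone.
const deviceMatrix = [
  { platform: "Android", profile: "newer device, large screen", passed: null },
  { platform: "Android", profile: "older device, smaller screen", passed: null },
  { platform: "iPhone", profile: "newer than iPhone X", passed: null },
  { platform: "iPhone", profile: "older than iPhone X", passed: null },
  { platform: "Samsung", profile: "newer than S9", passed: null },
  { platform: "Samsung", profile: "older than S9", passed: null },
];

// List the entries still awaiting a pass/fail verdict.
function remaining(matrix) {
  return matrix
    .filter((d) => d.passed === null)
    .map((d) => `${d.platform}: ${d.profile}`);
}

deviceMatrix[0].passed = true; // e.g. after testing on the first Android
console.log(remaining(deviceMatrix).length); // 5
```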

Interactions, such as:

  • Gestures. E.g. open mouth, and any other gestures relevant to your experience.

  • Other interactions. E.g. ‘tap to change’ on Instagram, custom toggles, navigation controls, buttons, sliders, pickers (on Instagram, switching between variants within a single effect), the capture button, and the share button.

  • Head movements. E.g. how the effect tracks on the face as the head moves.

Audio:

  • Any audio that is supposed to play in the background.

  • Triggers. E.g. any audio that is supposed to be triggered by a gesture or interaction. Note: if your effect is audio-reactive, make sure no background music is playing while testing.

Video:

  • Example testing videos. Check Lens Studio and Spark AR example videos for inspiration.

Accessibility:

  • Usability considerations. E.g. physical constraints and practicalities around the experience, and disabilities such as hearing impairments. Consider what you could provide for those users, such as subtitles.

Augmented world/portals

When testing world effects and portals, please avoid moving the phone around too much, as rapid movement can disrupt surface tracking. Additionally, it is important to ensure that the experience works in different kinds of environments.

Environmental conditions, such as:

  • Lighting conditions. E.g. sunny outdoors, gloomy outdoors, dark indoors.

  • Backgrounds. E.g. flat colour backgrounds and busy backgrounds.

  • Note: when testing, please ensure there are no objects on the floor or surface.

Environments, such as:

  • Different floor textures. E.g. beach, grass, concrete, road.

  • Different environments. E.g. an indoor living room, an office space, an outdoor beach.

Devices, such as:

  • One newer Android device with a large screen and one older Android device with a smaller screen.

  • One iPhone newer than the iPhone X and one older than the iPhone X.

  • One Samsung phone newer than the S9 and one older than the S9.

Interactions, such as:

  • Object interactions (if any). E.g. object rotations, taps, moving the object, and pinching to scale.

  • Object placements. E.g. ensuring that dragging and rotating work intuitively and that the object isn’t centred around the wrong point.

  • Other interactions. E.g. ‘tap to change’ on Instagram, custom toggles, navigation controls, buttons, sliders, pickers (on Instagram, switching between variants within a single effect), the capture button, and the share button.

Audio:

  • Any audio that is supposed to play in the background.

  • Triggers. E.g. any audio that is supposed to be triggered by a gesture or interaction. Note: if your effect is audio-reactive, make sure no background music is playing while testing.

Video:

  • Example testing videos. Check Lens Studio and Spark AR example videos for inspiration.

Accessibility:

  • Usability considerations. E.g. physical constraints and practicalities around the experience, and disabilities such as hearing impairments. Consider what you could provide for those users, such as subtitles.

Image tracking

When testing image tracking, please print out the target image rather than displaying it on a screen. Additionally, it is important to ensure that the experience works in different kinds of environments.

Orientations:

  • Please test the experience in both portrait and landscape orientations. If using portrait, ensure that the entire target image fits within the device’s camera view.

Devices, such as:

  • One newer Android device with a large screen and one older Android device with a smaller screen.

  • One iPhone newer than the iPhone X and one older than the iPhone X.

  • One Samsung phone newer than the S9 and one older than the S9.

Interactions, such as:

  • Touch hotspots, swipes and off-screen prompts.

  • Other interactions. E.g. ‘tap to change’ on Instagram, custom toggles, navigation controls, buttons, sliders, pickers (on Instagram, switching between variants within a single effect), the capture button, and the share button.

  • Please ensure all animations are working as expected.

Audio:

  • Any audio that is supposed to play in the background.

  • Triggers. E.g. any audio that is supposed to be triggered by a gesture or interaction. Note: if your effect is audio-reactive, make sure no background music is playing while testing.

Video:

  • Example testing videos. Check Lens Studio and Spark AR example videos for inspiration.

Accessibility:

  • Usability considerations. E.g. physical constraints and practicalities around the experience, and disabilities such as hearing impairments. Consider what you could provide for those users, such as subtitles.

To find out more about what makes a good image target, read our article here.

Mini-games

When testing mini-games, it is important to ensure that the gaming elements are working as expected and that the experience works on a diverse group of people.

This includes elements such as:

  • Skin tone

  • Hair colour and length

  • Gender

  • Glasses and sunglasses

  • Head and facial coverings

  • Jewellery

Environmental conditions, such as:

  • Lighting conditions. E.g. sunny outdoors, gloomy outdoors, dark indoors.

  • Backgrounds. E.g. flat colour backgrounds and busy backgrounds.

Devices, such as:

  • One newer Android device with a large screen and one older Android device with a smaller screen.

  • One iPhone newer than the iPhone X and one older than the iPhone X.

  • One Samsung phone newer than the S9 and one older than the S9.

Interactions, such as:

  • Gestures. E.g. open mouth, and any other gestures relevant to your experience.

  • Other interactions. E.g. ‘tap to change’ on Instagram, custom toggles, navigation controls, buttons, sliders, pickers (on Instagram, switching between variants within a single effect), the capture button, and the share button.

  • Head movements. E.g. how the effect tracks on the face as the head moves.

Gaming interactions, such as:

  • Game logic. E.g. does the game behave as expected?

  • Persistence. E.g. if you use a high score, does it save correctly? Does the score remain in place when the player leaves and returns to the mini-game?
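
The persistence check above boils down to one rule: a new score should only replace the saved high score when it beats it, and the saved value must survive the player leaving and returning. The sketch below illustrates that logic only; because each platform exposes its own persistent storage API (Lens Studio and Meta Spark differ here), a generic hypothetical `get`/`set` store stands in for whichever one your platform provides.

```javascript
// Illustrative sketch only: `store` is a stand-in for the platform's
// persistent key-value storage, injected so the logic is testable anywhere.
function createHighScoreTracker(store) {
  const KEY = "highScore"; // hypothetical storage key

  return {
    // Record a finished game's score; keep it only if it beats the saved best.
    submit(score) {
      const best = store.get(KEY) ?? 0;
      if (score > best) store.set(KEY, score);
      return Math.max(score, best);
    },
    // What a returning player should see when the mini-game reopens.
    best() {
      return store.get(KEY) ?? 0;
    },
  };
}

// In-memory stand-in for persistent storage, for local testing of the logic.
const memoryStore = (() => {
  const data = new Map();
  return { get: (k) => data.get(k), set: (k, v) => data.set(k, v) };
})();

const tracker = createHighScoreTracker(memoryStore);
tracker.submit(120);
tracker.submit(80);          // a lower score must not overwrite the best
console.log(tracker.best()); // 120
```

When testing on-device, the equivalent checks are: beat the high score and confirm the display updates, score lower and confirm it does not, then close and reopen the effect and confirm the saved value is still shown.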

Audio:

  • Any audio that is supposed to play in the background.

  • Triggers. E.g. any audio that is supposed to be triggered by a gesture or interaction. Note: if your effect is audio-reactive, make sure no background music is playing while testing.

Video:

  • Example testing videos. Check Lens Studio and Spark AR example videos for inspiration.

Accessibility:

  • Usability considerations. E.g. physical constraints and practicalities around the experience, and disabilities such as hearing impairments. Consider what you could provide for those users, such as subtitles.
