Adding Test Cases
On the Tests screen you can add test cases. A test case indicates the expected variable values for an Apollo event attribute when the Apollo event is triggered on a specific URL. This information is helpful to anyone, or any automated Apollo system, validating analytics tags.
An Apollo event can have multiple test cases. For example, say you want to ensure that the Product Discount Amount % variable is set to expected values when a Product Viewed event beacon fires. You can provide a URL for a product you know is full price and note that when a Product Viewed event occurs on this URL, the Product Discount Amount % attribute should be “0”. You can also provide a URL for a product you know is on sale for 30% off and note that when a Product Viewed event occurs on this URL, the attribute should be “30”.
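To make the shape of these two test cases concrete, here is a minimal sketch of how they could be represented as data. The object shape, field names, and product URLs are illustrative assumptions, not Apollo's actual data model:

```typescript
// Hypothetical representation of the two Product Viewed test cases described above.
// Field names and URLs are illustrative only; Apollo's internal model may differ.
interface TestCase {
  name: string;          // reflects the scenario being tested
  event: string;         // the Apollo event the test case belongs to
  startingUrl: string;   // full URL where the scenario can be replicated
  expectedValues: Record<string, string>; // attribute name -> expected value
}

const productViewedTests: TestCase[] = [
  {
    name: "Full Price Product at Product Viewed",
    event: "Product Viewed",
    startingUrl: "https://www.mysite.com/products/full-price-example",
    expectedValues: { "Product Discount Amount %": "0" },
  },
  {
    name: "30% off Product at Product Viewed",
    event: "Product Viewed",
    startingUrl: "https://www.mysite.com/products/on-sale-example",
    expectedValues: { "Product Discount Amount %": "30" },
  },
];
```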
Creating Test Cases
To create a test case, you must have an Apollo event and attribute added. Navigate to the Tests screen and click “Create New Test”.
A drawer will expand where you should select the event for which you would like to create the test.
Next, enter a name for your test case. The name should reflect the scenario that will be tested. For the Product Viewed examples above, you might name the test cases “Full Price Product at Product Viewed” and “30% off Product at Product Viewed”.
Next, add the test case starting URL. The starting URL should be the full URL where the test scenario can be replicated, for example “https://www.mysite.com/products/p467586”.
Next, select the attributes for which you would like to set expected values.
Finally, you can add a description, where you should note the user action that triggers the Apollo event.
Test Case Expected Values
Now you can designate the expected attribute values for your test cases. Note that you can only designate an expected value for variables whose attributes are enabled for the event. Under 'Set Attribute Expectations', select the attributes for which you would like to set expectations. Any test cases you have created will appear below each attribute. Then enter the expected value. The values should be entered exactly as you expect to see them, so be intentional with casing, spaces, special characters, etc.
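Because the validation compares the literal string, even small differences cause a mismatch. The sketch below illustrates the kind of exact match this implies; the comparison logic is only an illustration, not Apollo's actual validation code:

```typescript
// Illustrative only: exact string comparison of an expected value against
// the value captured at event time. Not Apollo's validation code.
function matchesExpectation(expected: string, actual: string): boolean {
  return expected === actual; // casing, whitespace, and special characters all count
}

matchesExpectation("30", "30");     // true
matchesExpectation("30", "30%");    // false: extra character
matchesExpectation("Sale", "sale"); // false: casing differs
matchesExpectation("0", " 0");      // false: leading space
```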
You can remove an expected value you have added using the Trash Can icon. This won’t remove the entire test case, just the expected value you entered, and you can then enter a new expected value.
Click 'Save' when you are done.
Test Case URLs
Note that each Test Case you create has an associated URL you must navigate to in order to trigger the test. This URL is the Test Case Starting URL you entered when creating the test case, with two query string parameters appended (see the sketch below):
Qax_tc - The test case ID, which is created by Apollo when the test case is deployed using Apollo.
Qax_v - The test case version, which is updated every time the test cases are deployed using Apollo.
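As a rough illustration of how the final Test Case URL is composed from these pieces, here is a minimal sketch. The parameter names are taken from the list above (their exact casing in a live URL may differ), and the ID and version values are placeholders, since Apollo generates the real values at deployment:

```typescript
// Illustrative sketch: appending the two test case parameters to the starting URL.
// The id and version values here are placeholders, not real Apollo values.
function buildTestCaseUrl(startingUrl: string, testCaseId: string, version: string): string {
  const url = new URL(startingUrl);
  url.searchParams.set("Qax_tc", testCaseId); // test case ID assigned by Apollo at deployment
  url.searchParams.set("Qax_v", version);     // test case version, updated on each deployment
  return url.toString();
}

// e.g. https://www.mysite.com/products/p467586?Qax_tc=PLACEHOLDER_ID&Qax_v=1
buildTestCaseUrl("https://www.mysite.com/products/p467586", "PLACEHOLDER_ID", "1");
```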
You can view the Test Case URL in the Testing Report, which is downloadable from the Documentation screen.
Note that the Test Case URL(s) will not be available in the Testing Report until a Deployment to Tag Manager that includes the Test Case is performed.
Test Cases in the Testing Report Documentation
Test cases you create are connected to downstream Apollo documentation and systems. If you export the Testing Report you’ll see the Test Cases, the Test Case Instructions (which are filled from the Test Case Description you entered), and the Test Case Expected Values.
Apollo's Solution for Automated Validation Using Test Cases
This feature is currently only available for Adobe Analytics AppMeasurement implementation types. You can view your implementation type within Property Settings.
Once your implementation is deployed to Tag Manager, every time a Test Case’s Apollo Event is pushed to the data layer from the Test Case URL, the Apollo QAX Launch extension will perform automated validation of the expected variable values. Apollo QAX logs the output from a failed test in the browser console.
This output is helpful in a development environment, while tags are being validated. Apollo QAX can also validate tags in your production environment and help aggregate common issues. To read more about this functionality, see here.
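The sketch below outlines the flow the documentation describes: when the test case's event is pushed to the data layer on the Test Case URL, the captured attribute values are compared against the expected values and failures are logged. This is a conceptual illustration only; it is not the Apollo QAX extension's code, and the console message format is invented:

```typescript
// Conceptual sketch only: NOT the Apollo QAX extension's implementation.
type DataLayerEvent = { event: string; attributes: Record<string, string> };

function validateTestCase(
  dlEvent: DataLayerEvent,
  expected: Record<string, string>, // expected values entered in Apollo
): void {
  for (const [attribute, expectedValue] of Object.entries(expected)) {
    const actualValue = dlEvent.attributes[attribute];
    if (actualValue !== expectedValue) {
      // Apollo QAX logs failed tests to the console; this message format is
      // invented for illustration.
      console.error(
        `Test failed for "${attribute}": expected "${expectedValue}", got "${actualValue}"`,
      );
    }
  }
}
```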