A/B testing: Also known as split testing. Companies create two versions of the same newsletter or web page with a slight difference between them and test which of the two elicits the better response in the marketplace. At its simplest, A/B testing means randomly showing each visitor one version of a page – version (A) or version (B) – and tracking differences in behavior based on which version they saw. Version (A) is normally your existing design (the “control” in statistics lingo); version (B) is the “challenger,” with one copy or design element changed. In a “50/50 A/B split test,” you flip a coin to decide which version of the page to show. A classic example is comparing the conversions that result from serving version (A) versus version (B), where the two versions display different headlines. A/B tests are commonly applied to ad copy and to landing page copy or designs to determine which version drives the more desired result. See also Multivariate Testing.
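The coin-flip assignment and per-version conversion tracking described above can be sketched in a few lines of Python. This is a minimal illustrative simulation, not a real testing platform: the function names (`assign_version`, `run_split_test`) and the conversion probabilities are hypothetical, chosen only to show how a 50/50 split and the resulting conversion rates are computed.

```python
import random

def assign_version(rng: random.Random) -> str:
    """50/50 split: flip a coin to decide which page version a visitor sees."""
    return "A" if rng.random() < 0.5 else "B"

def run_split_test(n_visitors: int, conv_rates: dict, seed: int = 42) -> dict:
    """Simulate a split test.

    conv_rates maps each version to a hypothetical "true" conversion
    probability; returns the observed conversion rate per version.
    """
    rng = random.Random(seed)
    shown = {"A": 0, "B": 0}       # visitors who saw each version
    converted = {"A": 0, "B": 0}   # visitors who converted on each version
    for _ in range(n_visitors):
        version = assign_version(rng)
        shown[version] += 1
        # Does this visitor convert, given the version's assumed rate?
        if rng.random() < conv_rates[version]:
            converted[version] += 1
    return {v: converted[v] / shown[v] for v in shown}

# Control (A) vs. challenger (B), with assumed rates of 5% and 7%
observed = run_split_test(10_000, {"A": 0.05, "B": 0.07})
```

In practice the "conversion" would be a real tracked event (a click, signup, or purchase), and a statistical significance test would be run on the two observed rates before declaring a winner.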