We use a standard lepton+jets selection with only minor modifications. The differences from the standard lepton+jets selection are the $H_T$ cut, which rejects a large fraction of our non-$W$ background while maintaining efficiency for top, and the addition of a sample with three tight jets plus at least one loose jet. Other lepton+jets analyses do not cut on $H_T$ and require four or more tight jets.
We normalize our background with Method II. Our signal Monte Carlo is a POWHEG sample generated with the luminosity profile of the full Run II dataset.
The production angle, $\cos \theta_t$, is the angle between the proton momentum and the top quark momentum, as measured in the top-antitop center-of-mass frame. It is related to the top-antitop rapidity difference, $\Delta y$, used in previous $A_\text{FB}$ measurements.
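For reference, in the $t\bar{t}$ rest frame the two quantities are connected by a standard kinematic identity (with $\beta_t$ the top-quark speed in that frame), $$\Delta y = y_t - y_{\bar{t}} = 2 \tanh^{-1}\!\left( \beta_t \cos\theta_t \right),$$ so the sign of $\Delta y$ always agrees with the sign of $\cos\theta_t$.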
Most validation was already accomplished in the previous $A_{FB}$ analysis [1]. We present just a few variables, including the variable of interest, $\cos \theta_t$, and show that the data are well modeled in our full signal sample, a side-band ("anti-tag") sample, and several sub-regions of the signal sample.
The Legendre polynomials $P_\ell (x)$ form a complete set of orthogonal polynomials on the interval $[-1, 1]$. They appear frequently in electromagnetism and quantum mechanics, and are closely related to the theory of angular momentum.
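With one common normalization convention (quoted here only for illustration; the analysis defines its own overall normalization), the orthogonality relation $\int_{-1}^{1} P_\ell(x) P_m(x)\,dx = \tfrac{2}{2\ell+1}\,\delta_{\ell m}$ lets a distribution in $\cos\theta_t$ be expanded as $$\frac{1}{\sigma}\frac{d\sigma}{d\cos\theta_t} = \sum_{\ell} a_\ell\, P_\ell(\cos\theta_t), \qquad a_\ell = \frac{2\ell+1}{2}\int_{-1}^{1} P_\ell(\cos\theta_t)\,\frac{1}{\sigma}\frac{d\sigma}{d\cos\theta_t}\,d\cos\theta_t .$$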
The 1959 paper of Jacob and Wick details the general theory of $2 \to 2$ scattering of particles with mass and spin. The angular dependence is entirely described by Legendre polynomials, and the Legendre moments are determined from a sum over total angular momentum states, incoming and outgoing helicities, and the Clebsch-Gordan coefficients. In the end, a diagram that proceeds with total angular momentum $J$, interfering with a diagram that proceeds with total angular momentum $J'$, contributes to the Legendre moments with $\left\lvert J - J' \right\rvert \leq \ell \leq J + J'$.
The only Standard Model diagram for $q\bar{q} \to t\bar{t}$ at leading order contains an intermediate state of a single gluon, so that diagram proceeds only with total angular momentum $J=1$ (the spin of the gluon). Thus, leading-order Standard Model production of $t\bar{t}$ has nonzero Legendre moments only in $a_0$ and $a_2$ ($a_1$ is zero because the final state is a parity eigenstate).
Beyond tree level, additional diagrams with multi-particle intermediate states contribute. These diagrams can proceed with any total angular momentum, and so contribute non-zero Legendre moments at all $\ell$. We expect a roughly exponential decrease in the magnitude of the Legendre moments as $\ell$ increases, due to the decreasing amplitude for initial states with large total angular momentum.
A model containing an axigluon can produce a non-zero $a_1$ at tree level because the final state is no longer a parity eigenstate. Models with a flavor-changing $Z'$ exchanged in the $t$-channel produce large non-zero moments at all $\ell$ because there is no single-particle intermediate state, similar to the NLO SM box diagram.
We use a calculation by Bernreuther et al. as our Standard Model benchmark. This calculation includes both QCD and electroweak effects on $t \bar{t}$ production at next-to-leading order, and has been performed with three different scale settings.
We also compare our data to several Monte Carlo calculations. These include an NLO SM calculation from POWHEG, a LO SM calculation from PYTHIA, and two new-physics models from MadGraph. Octet A is a representative $s$-channel model, containing an axigluon with a mass of 2 TeV. $Z'$ 200 is a $t$-channel model, containing a flavor-changing $Z'$ with a mass of 200 GeV.
We start by evaluating the Legendre moments of the data and of the background model. We also estimate the uncertainties on these as covariance matrices, to account for the correlations among the measured moments. We will correct the difference between the data and background moments (the "background-subtracted" moments) to the parton level. We also compare the Legendre series built from the raw moments to a histogram of the raw data. The series curve follows the histogram closely, demonstrating that the Legendre moments are an equally faithful representation of our data.
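As an illustration of this step only, the sketch below (Python, using the moment convention quoted earlier, an unweighted toy sample in place of the data, and function names of our own choosing, not the analysis code) extracts the moments and their statistical covariance, then rebuilds the Legendre series for comparison with a histogram of the raw values:

    import numpy as np
    from numpy.polynomial import legendre

    def legendre_moments(cos_theta, ell_max=8):
        # a_ell = (2*ell + 1)/2 * <P_ell(cos_theta)>, estimated from per-event values
        n = len(cos_theta)
        # P[l, i] = P_l evaluated at event i
        P = np.array([legendre.legval(cos_theta, np.eye(ell_max + 1)[l])
                      for l in range(ell_max + 1)])
        w = (2.0 * np.arange(ell_max + 1) + 1.0) / 2.0
        moments = w * P.mean(axis=1)
        # statistical covariance of the sample means, with the (2l+1)/2 factors propagated
        cov = np.outer(w, w) * np.cov(P) / n
        return moments, cov

    # Toy stand-in for the measured cos(theta_t) values
    rng = np.random.default_rng(0)
    cos_theta = rng.uniform(-1.0, 1.0, 10_000)
    a, cov_stat = legendre_moments(cos_theta, ell_max=4)

    # Rebuild the Legendre series and a density histogram for comparison
    x = np.linspace(-1.0, 1.0, 201)
    series = legendre.legval(x, a)              # sum_l a_l P_l(x)
    hist, edges = np.histogram(cos_theta, bins=20, range=(-1, 1), density=True)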
We correct the background-subtracted data moments to the parton level and evaluate all of our systematic uncertainties. We intend to present moments up through $\ell = 8$ in a table, but only up through $\ell = 4$ in the principal plot of the analysis.
We consider many sources of systematic uncertainty. To evaluate the effect of a given source, we vary the corresponding parameter, then re-do the background subtraction and parton-level correction, obtaining a varied set of parton-level moments, $a_\ell^{\text{varied}}$. We compare the varied moments with the nominal moments and obtain a covariance matrix (with 100% correlation in the effect on each moment), $$\sigma_{\ell m} = \left(a_\ell^{\text{varied}} - a_\ell^{\text{nominal}}\right)\left(a_m^{\text{varied}} - a_m^{\text{nominal}}\right).$$ In this manner, we obtain a covariance matrix describing the effect of each source of systematic uncertainty. We sum all of these and add them to the covariance matrix describing the statistical uncertainty, giving a covariance matrix that fully describes the uncertainty on the measured Legendre moments.
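A minimal sketch of how these pieces can be combined, assuming one varied moment set per systematic source; the moment values, source names, and statistical covariance below are purely illustrative:

    import numpy as np

    def systematic_covariance(a_varied, a_nominal):
        # One source: sigma_lm = (a_l^varied - a_l^nominal) * (a_m^varied - a_m^nominal)
        d = np.asarray(a_varied) - np.asarray(a_nominal)
        return np.outer(d, d)

    # Illustrative inputs: nominal moments, one varied moment set per source,
    # and a statistical covariance matrix from the moment extraction
    a_nominal = np.array([0.50, 0.10, 0.16, 0.02, 0.01])
    variations = {
        "jet_energy_scale": np.array([0.50, 0.12, 0.15, 0.02, 0.01]),
        "background_shape": np.array([0.50, 0.09, 0.17, 0.03, 0.01]),
    }
    cov_stat = 1.0e-4 * np.eye(len(a_nominal))

    cov_total = cov_stat + sum(systematic_covariance(v, a_nominal)
                               for v in variations.values())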
The covariance matrix may be inverted to form a $\chi^2$ statistic that quantifies the agreement of any prediction with our measurement. We also show the correlation matrix, $\mathrm{cov}_{ij}/\sqrt{\mathrm{cov}_{ii}\,\mathrm{cov}_{jj}}$.
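Continuing the illustrative sketch above, the $\chi^2$ of a prediction against the measured moments and the correlation matrix both follow directly from the total covariance matrix:

    import numpy as np

    def chi2(a_measured, a_predicted, cov_total):
        # chi^2 = d^T C^{-1} d for the difference between a prediction and the measurement
        d = np.asarray(a_measured) - np.asarray(a_predicted)
        return d @ np.linalg.solve(cov_total, d)

    def correlation_matrix(cov):
        # corr_ij = cov_ij / sqrt(cov_ii * cov_jj)
        s = np.sqrt(np.diag(cov))
        return cov / np.outer(s, s)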