2 Statistical analysis of the reweighting
To reweight an observable we have to compute
$$
\frac{\int dU\, e^{-S[U]}\, O(U)\, r(U)}{\int dU\, e^{-S[U]}\, r(U)}
= \frac{\dfrac{\int dU\, e^{-S[U]}\, O(U)\, r(U)}{\int dU\, e^{-S[U]}}}{\dfrac{\int dU\, e^{-S[U]}\, r(U)}{\int dU\, e^{-S[U]}}}
= \frac{\langle O\, r\rangle}{\langle r\rangle}\,.
$$
The reweighting factor $r$ is computed stochastically as
$$
r(U) = \langle r(U,\phi)\rangle_\phi = \det\!\big(D(\mu_f)/D(\mu_i)\big) = \int d\phi\, e^{w(U,\phi)}\,,
\qquad
w(U,\phi) = \phi^\dagger\big(1 - D(\mu_i)\,D^{-1}(\mu_f)\big)\phi\,,
$$
with $\phi$ a Gaussian noise distributed as $P(\phi)\propto e^{-\phi^\dagger\phi}$.
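As an illustration of this stochastic estimator, the following minimal numpy sketch checks the identity $\langle e^{\phi^\dagger(1-M)\phi}\rangle_\phi = 1/\det M$ on a small random matrix $M$ standing in for $D(\mu_i)D^{-1}(\mu_f)$. The matrix size, the choice of a Hermitian $M$ close to the identity, and the number of noise vectors are illustrative assumptions, not part of the method above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for D(mu_i) D^{-1}(mu_f): a small Hermitian matrix
# close to the identity, so the noise average converges quickly.
n = 6
H = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
M = np.eye(n) + 0.05 * (H + H.conj().T)

# Exact reweighting factor r = det(D(mu_f)/D(mu_i)) = 1/det(M).
r_exact = 1.0 / np.linalg.det(M)

# Stochastic estimate r = < exp(w) >_phi with w = phi^dag (1 - M) phi and
# P(phi) ~ exp(-phi^dag phi)  (complex Gaussian, Re/Im parts of variance 1/2).
n_noise = 100_000
phi = (rng.normal(size=(n_noise, n)) + 1j * rng.normal(size=(n_noise, n))) / np.sqrt(2)
w = np.einsum('ki,ij,kj->k', phi.conj(), np.eye(n) - M, phi)
r_stoch = np.exp(w).mean()

print("exact:", r_exact.real, " stochastic:", r_stoch.real)
```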
Since $\det\!\big(D(\mu_f)/D(\mu_i)\big)$ may be complex, we compute
$$
\big|\det\!\big(D(\mu_f)/D(\mu_i)\big)\big|
= \sqrt{\det\!\big(D^{-1}(\mu_i)\,D(\mu_f)\,D(-\mu_f)\,D^{-1}(-\mu_i)\big)}
= \left(\int d\phi\, e^{\phi^\dagger\big(1 - D(-\mu_i)\,D^{-1}(-\mu_f)\,D^{-1}(\mu_f)\,D(\mu_i)\big)\phi}\right)^{\!1/2}\,,
$$
thus the observable to compute is
$$
\frac{\big\langle O\,\langle r\rangle_\phi\big\rangle_U}{\big\langle\langle r\rangle_\phi\big\rangle_U}\,.
$$
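A similarly hedged sketch for the square-root factor: it uses the generic identity $|\det M|^{-1} = \big(\det(M^\dagger M)\big)^{-1/2}$ with $M$ again a stand-in for $D(\mu_i)D^{-1}(\mu_f)$, so the matrix $K = M^\dagger M$ has the same determinant as the operator combination in the exponent above once $\gamma_5$-hermiticity relates $D(-\mu)$ to $D^\dagger(\mu)$; a random test matrix has no such structure, and all sizes and counts are again illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for D(mu_i) D^{-1}(mu_f), now a generic (non-Hermitian)
# complex matrix close to the identity, so its determinant picks up a phase.
n = 6
M = np.eye(n) + 0.05 * (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))

# |det(D(mu_f)/D(mu_i))| = 1/|det(M)|
abs_r_exact = 1.0 / abs(np.linalg.det(M))

# Stochastic estimate: ( < exp(phi^dag (1 - K) phi) >_phi )^{1/2} with
# K = M^dag M, whose determinant matches the operator combination in the
# exponent above (up to gamma_5-hermiticity of the Dirac operator).
K = M.conj().T @ M
n_noise = 100_000
phi = (rng.normal(size=(n_noise, n)) + 1j * rng.normal(size=(n_noise, n))) / np.sqrt(2)
w = np.einsum('ki,ij,kj->k', phi.conj(), np.eye(n) - K, phi)
abs_r_stoch = np.sqrt(np.exp(w).mean().real)

print("exact:", abs_r_exact, " stochastic:", abs_r_stoch)
```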
Ignoring the error on $r$, we can define
$$
\bar r(U_i) = \frac{\sum_j r(U_i,\phi_{ij})}{\frac{1}{N_U}\sum_{i_1,j} r(U_{i_1},\phi_{i_1 j})}\,,
$$
where the sums in the numerator and in the denominator can be computed as in Eqs. (2.1) and (2.2). We then bin the data to account for autocorrelations,
$$
\tilde O_b = \frac{1}{N_b}\sum_{i=bN_b}^{bN_b+N_b} O(U_i)\,\bar r(U_i)\,,
$$
and construct the jackknife samples as
$$
\mathrm{Jack}(O)_j = \frac{1}{N-1}\sum_{a\neq j}\tilde O_a\,.
$$
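A compact sketch of this simpler analysis, assuming the stochastic estimates $r(U_i,\phi_{ij})$ fit in double precision (if they do not, one works with the exponents as described below); the function name, array layout, bin size, and the standard jackknife error formula are illustrative choices, not prescriptions from the text.

```python
import numpy as np

def naive_reweighted_jackknife(O, r, N_b):
    """Reweighted observable ignoring the error on r (illustrative sketch).

    O   : (N_U,)      observable O(U_i) per configuration
    r   : (N_U, N_j)  stochastic estimates r(U_i, phi_ij)
    N_b : bin size used to absorb autocorrelations
    """
    # bar r(U_i): noise average, normalised by the ensemble average of r
    r_bar = r.mean(axis=1)
    r_bar /= r_bar.mean()

    # bin O(U_i) * bar r(U_i)
    n_bins = len(O) // N_b
    O_tilde = (O * r_bar)[: n_bins * N_b].reshape(n_bins, N_b).mean(axis=1)

    # leave-one-bin-out jackknife samples and the usual jackknife error
    jack = (O_tilde.sum() - O_tilde) / (n_bins - 1)
    mean = jack.mean()
    err = np.sqrt((n_bins - 1) * ((jack - mean) ** 2).mean())
    return mean, err
```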
To fully propagate the error we instead first bin the data to account for autocorrelations,
$$
[Or]_b = \frac{1}{N_b}\sum_{i=bN_b}^{bN_b+N_b} O(U_i)\,\frac{1}{N_j}\sum_j r(U_i,\phi_{ij})\,,
\qquad
[r]_b = \frac{1}{N_b}\sum_{i=bN_b}^{bN_b+N_b} \frac{1}{N_j}\sum_j r(U_i,\phi_{ij})\,.
$$
We cannot compute $r(U_i,\phi_{ij})$ in double precision, but we can compute its exponent and the exponent of a sum,
$$
w(U_i) = \log\!\Big(\frac{1}{N_j}\sum_j r(U_i,\phi_{ij})\Big)
= w(U_i,\phi_{i0}) + \log\!\Big(\frac{1}{N_j}\sum_j e^{w(U_i,\phi_{ij}) - w(U_i,\phi_{i0})}\Big)\,. \tag{2.1}
$$
For the case of the OS reweighting we need to take into account the square-root factor,
$$
w(U_i) = \frac{1}{2}\log\!\Big(\frac{1}{N_j}\sum_j r(U_i,\phi_{ij})\Big)
= \frac{1}{2} w(U_i,\phi_{i0}) + \frac{1}{2}\log\!\Big(\frac{1}{N_j}\sum_j e^{w(U_i,\phi_{ij}) - w(U_i,\phi_{i0})}\Big)\,. \tag{2.2}
$$
Then we compute the exponents of the binned quantities, $P_b = \log([Or]_b)$ and $w_b = \log([r]_b)$:
$$
P_b = w(U_{bN_b}) + \log\!\Big(\frac{1}{N_b}\sum_{i=bN_b}^{bN_b+N_b} O(U_i)\, e^{w(U_i)-w(U_{bN_b})}\Big)\,,
\qquad
w_b = w(U_{bN_b}) + \log\!\Big(\frac{1}{N_b}\sum_{i=bN_b}^{bN_b+N_b} e^{w(U_i)-w(U_{bN_b})}\Big)\,.
$$
Afterwards we can compute the jackknife samples of the ratio,
$$
\frac{\mathrm{Jack}(Or)_j}{\mathrm{Jack}(r)_j}
= \frac{\sum_{a\neq j}[Or]_a}{\sum_{b\neq j}[r]_b}
= \frac{\sum_{a\neq j} e^{P_a}}{\sum_{b\neq j} e^{w_b}}
= \sum_{a\neq j}\frac{1}{\sum_{b\neq j} e^{w_b - P_a}}\,.
$$
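A minimal sketch of this full propagation, working only with the exponents $w(U_i,\phi_{ij})$. Here `scipy.special.logsumexp` stands in for the shift by $w(U_i,\phi_{i0})$ in Eqs. (2.1)-(2.2) (it subtracts the maximum rather than the first sample, which is the same stabilisation idea), and the function name, the `os_sqrt` flag, and the array layout are assumptions made for illustration.

```python
import numpy as np
from scipy.special import logsumexp

def reweighted_jackknife(O, w, N_b, os_sqrt=False):
    """Jackknife of <O r>/<r> propagating the noise error (illustrative sketch).

    O       : (N_U,)      observable O(U_i) per configuration
    w       : (N_U, N_j)  exponents w(U_i, phi_ij) of the reweighting factor
    N_b     : bin size
    os_sqrt : apply the 1/2 factor of the OS (square-root) reweighting, Eq. (2.2)
    """
    N_U, N_j = w.shape
    # Eqs. (2.1)/(2.2): w(U_i) = log( (1/N_j) sum_j e^{w(U_i,phi_ij)} ),
    # evaluated stably in the log domain.
    w_i = logsumexp(w, axis=1) - np.log(N_j)
    if os_sqrt:
        w_i *= 0.5

    # binned exponents: w_b = log([r]_b), P_b = log|[Or]_b| together with its sign
    n_bins = N_U // N_b
    w_i = w_i[: n_bins * N_b].reshape(n_bins, N_b)
    O_i = np.asarray(O)[: n_bins * N_b].reshape(n_bins, N_b)
    w_b = logsumexp(w_i, axis=1) - np.log(N_b)
    P_b, s_b = logsumexp(w_i, b=O_i, axis=1, return_sign=True)
    P_b -= np.log(N_b)

    # jackknife of the ratio  sum_{a!=j} e^{P_a} / sum_{b!=j} e^{w_b}
    jack = np.empty(n_bins)
    for j in range(n_bins):
        keep = np.arange(n_bins) != j
        num, s = logsumexp(P_b[keep], b=s_b[keep], return_sign=True)
        jack[j] = s * np.exp(num - logsumexp(w_b[keep]))

    mean = jack.mean()
    err = np.sqrt((n_bins - 1) * ((jack - mean) ** 2).mean())
    return mean, err
```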
2.1 Comparison
[Figure: $\langle r \rangle$]