References
Abrahamsen, P. 1997. “A Review of Gaussian Random Fields and Correlation Functions.” Norsk Regnesentral/Norwegian Computing Center Oslo, https://www.nr.no/directdownload/917_Rapport.pdf.
Adler, RJ. 2010. The Geometry of Random Fields. SIAM.
Akima, H, A Gebhardt, T Petzoldt, and M Maechler. 2016. akima: Interpolation of Irregularly and Regularly Spaced Data. https://CRAN.R-project.org/package=akima.
Allaire, JJ, Y Xie, J McPherson, J Luraschi, K Ushey, A Atkins, H Wickham, J Cheng, and W Chang. 2018. rmarkdown: Dynamic Documents for R. https://CRAN.R-project.org/package=rmarkdown.
Ambikasaran, S, D Foreman-Mackey, L Greengard, DW Hogg, and M O’Neil. 2015. “Fast Direct Methods for Gaussian Processes.” IEEE Transactions on Pattern Analysis and Machine Intelligence 38 (2): 252–65.
Anagnostopoulos, C, and RB Gramacy. 2013. “Information-Theoretic Data Discarding for Dynamic Trees on Data Streams.” Entropy 15 (12): 5510–35.
Andrianakis, I, and PG Challenor. 2012. “The Effect of the Nugget on Gaussian Process Emulators of Computer Models.” Computational Statistics & Data Analysis 56 (12): 4215–28.
Ankenman, B, BL Nelson, and J Staum. 2010. “Stochastic Kriging for Simulation Metamodeling.” Operations Research 58 (2): 371–82.
Audet, C, and JE Dennis Jr. 2006. “Mesh Adaptive Direct Search Algorithms for Constrained Optimization.” SIAM Journal on Optimization 17 (1): 188–217.
Azzimonti, D, D Ginsbourger, C Chevalier, J Bect, and Y Richet. 2016. “Adaptive Design of Experiments for Conservative Estimation of Excursion Sets.” Preprint on ArXiv:1611.07256.
Ba, S. 2015. SLHD: Maximin-Distance (Sliced) Latin Hypercube Designs. https://CRAN.R-project.org/package=SLHD.
Ba, S, and VR Joseph. 2012. “Composite Gaussian Process Models for Emulating Expensive Functions.” The Annals of Applied Statistics 6 (4): 1838–60.
Ba, S, and VR Joseph. 2018. MaxPro: Maximum Projection Designs. https://CRAN.R-project.org/package=MaxPro.
Ba, S, WR Myers, and WA Brenneman. 2015. “Optimal Sliced Latin Hypercube Designs.” Technometrics 57 (4): 479–87.
Balaprakash, P, RB Gramacy, and SM Wild. 2013. “Active-Learning-Based Surrogate Models for Empirical Performance Tuning.” In 2013 IEEE International Conference on Cluster Computing (CLUSTER), 1–8. IEEE.
Balaprakash, P, K Rupp, A Mametjanov, RB Gramacy, PD Hovland, and SM Wild. 2013. “Empirical Performance Modeling of GPU Kernels Using Active Learning.” In ParCo, 646–55. Citeseer.
Banerjee, S, BP Carlin, and AE Gelfand. 2004. Hierarchical Modeling and Analysis for Spatial Data. Chapman & Hall/CRC.
Banerjee, S, AE Gelfand, AO Finley, and H Sang. 2008. “Gaussian Predictive Process Models for Large Spatial Data Sets.” Journal of the Royal Statistical Society: Series B (Statistical Methodology) 70 (4): 825–48.
Bastos, LS, and A O’Hagan. 2009. “Diagnostics for Gaussian Process Emulators.” Technometrics 51 (4): 425–38.
Bates, D, and M Maechler. 2019. Matrix: Sparse and Dense Matrix Classes and Methods. https://CRAN.R-project.org/package=Matrix.
Bect, J, F Bachoc, and D Ginsbourger. 2016. “A Supermartingale Approach to Gaussian Process Based Sequential Design of Experiments.” Preprint on ArXiv:1608.01118.
Berger, JO, JM Bernardo, and D Sun. 2009. “The Formal Definition of Reference Priors.” The Annals of Statistics 37 (2): 905–38.
Berger, JO, V De Oliveira, and B Sansó. 2001. “Objective Bayesian Analysis of Spatially Correlated Data.” Journal of the American Statistical Association 96 (456): 1361–74.
Bertsekas, DP. 2014. Constrained Optimization and Lagrange Multiplier Methods. Academic Press.
Bichon, BJ, MS Eldred, LP Swiler, S Mahadevan, and JM McFarland. 2008. “Efficient Global Reliability Analysis for Nonlinear Implicit Performance Functions.” AIAA Journal 46 (10): 2459–68.
Binois, M, and RB Gramacy. 2018. “hetGP: Heteroskedastic Gaussian Process Modeling and Sequential Design in R.” Available as a Vignette in the R Package.
Binois, M, and RB Gramacy. 2019. hetGP: Heteroskedastic Gaussian Process Modeling and Design Under Replication. https://CRAN.R-project.org/package=hetGP.
Binois, M, RB Gramacy, and M Ludkovski. 2018. “Practical Heteroscedastic Gaussian Process Modeling for Large Simulation Experiments.” Journal of Computational and Graphical Statistics 27 (4): 808–21. https://doi.org/10.1080/10618600.2018.1458625.
Binois, M, J Huang, RB Gramacy, and M Ludkovski. 2019. “Replication or Exploration? Sequential Design for Stochastic Simulation Experiments.” Technometrics 61 (1): 7–23. https://doi.org/10.1080/00401706.2018.1469433.
Bisgaard, S, and B Ankenman. 1996. “Standard Errors for the Eigenvalues in Second-Order Response Surface Models.” Technometrics 38 (3): 238–46.
Bogunovic, I, J Scarlett, A Krause, and V Cevher. 2016. “Truncated Variance Reduction: A Unified Approach to Bayesian Optimization and Level-Set Estimation.” In Advances in Neural Information Processing Systems, 1507–15.
Bompard, M, J Peter, and JA Desideri. 2010. “Surrogate Models Based on Function and Derivative Values for Aerodynamic Global Optimization.” In V European Conference on Computational Fluid Dynamics ECCOMAS CFD 2010.
Booker, AJ, JE Dennis, PD Frank, DB Serafini, V Torczon, and MW Trosset. 1999. “A Rigorous Framework for Optimization of Expensive Functions by Surrogates.” Structural Optimization 17 (1): 1–13.
Bornn, L, G Shaddick, and JV Zidek. 2012. “Modeling Nonstationary Processes Through Dimension Expansion.” Journal of the American Statistical Association 107 (497): 281–89.
Boukouvalas, A, and D Cornford. 2009. “Learning Heteroscedastic Gaussian Processes for Complex Datasets.” Aston University, Neural Computing Research Group.
Box, GEP, and NR Draper. 1987. Empirical Model-Building and Response Surfaces. John Wiley & Sons.
Box, GEP, and NR Draper. 2007. Response Surfaces, Mixtures, and Ridge Analyses. Vol. 649. John Wiley & Sons.
Breiman, L. 2001. “Random Forests.” Machine Learning 45 (1): 5–32.
Breiman, L, A Cutler, A Liaw, and M Wiener. 2018. randomForest: Breiman and Cutler’s Random Forests for Classification and Regression. https://CRAN.R-project.org/package=randomForest.
Breiman, L, J Friedman, R Olshen, and C Stone. 1984. Classification and Regression Trees. Wadsworth Int.
Broderick, T, and RB Gramacy. 2011. “Classification and Categorical Inputs with Treed Gaussian Process Models.” Journal of Classification 28 (2): 244–70.
Brynjarsdóttir, J, and A O’Hagan. 2014. “Learning about Physical Parameters: The Importance of Model Discrepancy.” Inverse Problems 30 (11): 114007.
Buchanan, M. 2013. Forecast: What Physics, Meteorology, and the Natural Sciences Can Teach Us about Economics. Bloomsbury Publishing USA.
Bull, AD. 2011. “Convergence Rates of Efficient Global Optimization Algorithms.” Journal of Machine Learning Research 12 (Oct): 2879–2904.
Burnaev, E, and M Panov. 2015. “Adaptive Design of Experiments Based on Gaussian Processes.” In Statistical Learning and Data Sciences, 116–25. Springer-Verlag.
Byrd, RH, P Lu, J Nocedal, and C Zhu. 1995. “A Limited Memory Algorithm for Bound Constrained Optimization.” SIAM Journal on Scientific Computing 16 (5): 1190–1208.
Carlin, BP, and NG Polson. 1991. “Inference for Nonconjugate Bayesian Models Using the Gibbs Sampler.” Canadian Journal of Statistics 19 (4): 399–405.
Carmassi, M. 2018. CaliCo: Code Calibration in a Bayesian Framework. https://CRAN.R-project.org/package=CaliCo.
Carvalho, CM, MS Johannes, HF Lopes, and NG Polson. 2010. “Particle Learning and Smoothing.” Statistical Science 25 (1): 88–106.
Chang, W, J Cheng, JJ Allaire, Y Xie, and J McPherson. 2017. shiny: Web Application Framework for R. https://CRAN.R-project.org/package=shiny.
Chen, T, T He, M Benesty, V Khotilovich, Y Tang, H Cho, K Chen, et al. 2019. xgboost: Extreme Gradient Boosting. https://CRAN.R-project.org/package=xgboost.
Chevalier, C, D Ginsbourger, J Bect, and I Molchanov. 2013. “Estimating and Quantifying Uncertainties on Level Sets Using the Vorob’ev Expectation and Deviation with Gaussian Process Models.” In mODa 10 - Advances in Model-Oriented Design and Analysis, edited by Dariusz Ucinski, Anthony C. Atkinson, and Maciej Patan, 35–43. Contributions to Statistics. Springer-Verlag. https://doi.org/10.1007/978-3-319-00218-7_5.
Chevalier, C, D Ginsbourger, and X Emery. 2014. “Corrected Kriging Update Formulae for Batch-Sequential Data Assimilation.” In Mathematics of Planet Earth, 119–22. Springer-Verlag.
Chevalier, C, V Picheny, and D Ginsbourger. 2014. “KrigInv: An Efficient and User-Friendly Implementation of Batch-Sequential Inversion Strategies Based on Kriging.” Computational Statistics & Data Analysis 71: 1021–34.
Chevalier, C, V Picheny, D Ginsbourger, and D Azzimonti. 2018. KrigInv: Kriging-Based Inversion for Deterministic and Noisy Computer Experiments. https://CRAN.R-project.org/package=KrigInv.
Chipman, HA, EI George, and RE McCulloch. 2002. “Bayesian Treed Models.” Machine Learning 48 (1-3): 299–320.
Chipman, HA, EI George, and RE McCulloch. 2010. “BART: Bayesian Additive Regression Trees.” The Annals of Applied Statistics 4 (1): 266–98.
Chipman, HA, P Ranjan, and W Wang. 2012. “Sequential Design for Computer Experiments with a Flexible Bayesian Additive Model.” Canadian Journal of Statistics 40 (4): 663–78.
Chipman, H, EI George, RB Gramacy, and R McCulloch. 2013. “Bayesian Treed Response Surface Models.” Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery 3 (4): 298–305.
Chipman, Hugh A, Edward I George, and Robert E McCulloch. 1998. “Bayesian CART Model Search.” Journal of the American Statistical Association 93 (443): 935–48.
Chipman, Hugh, and Robert McCulloch. 2016. BayesTree: Bayesian Additive Regression Trees. https://CRAN.R-project.org/package=BayesTree.
Chung, M, M Binois, J Bardsley, RB Gramacy, DJ Moquin, AP Smith, and AM Smith. 2019. “Parameter and Uncertainty Estimation for Dynamical Systems Using Surrogate Stochastic Processes.” SIAM Journal on Scientific Computing 41 (4): A2212–38.
Cioffi–Revilla, Claudio. 2014. Introduction to Computational Social Science. New York, NY: Springer.
Cohn, DA. 1994. “Neural Network Exploration Using Optimal Experiment Design.” In Advances in Neural Information Processing Systems, 679–86.
Conn, AR, K Scheinberg, and LN Vicente. 2009. Introduction to Derivative-Free Optimization. Philadelphia, PA, USA: Society for Industrial and Applied Mathematics.
Cox, DD, JS Park, and CE Singer. 2001. “A Statistical Method for Tuning a Computer Code to a Data Base.” Computational Statistics & Data Analysis 37 (1): 77–92.
Craig, PS, M Goldstein, AH Seheult, and JA Smith. 1996. “Bayes Linear Strategies for Matching Hydrocarbon Reservoir History.” Bayesian Statistics 5: 69–95.
Cressie, N. 1992. Statistics for Spatial Data. Wiley Online Library.
Cressie, N, and G Johannesson. 2008. “Fixed Rank Kriging for Very Large Spatial Data Sets.” Journal of the Royal Statistical Society: Series B (Statistical Methodology) 70 (1): 209–26.
Currin, C, TJ Mitchell, M Morris, and D Ylvisaker. 1991. “Bayesian Prediction of Deterministic Functions, with Applications to the Design and Analysis of Computer Experiments.” Journal of the American Statistical Association 86 (416): 953–63.
Da Veiga, S, F Wahl, and F Gamboa. 2009. “Local Polynomial Estimation for Sensitivity Analysis on Models with Correlated Inputs.” Technometrics 51 (4): 452–63.
Damianou, A, and N Lawrence. 2013. “Deep Gaussian Processes.” In Artificial Intelligence and Statistics, 207–15.
Dancik, GM. 2018. mlegp: Maximum Likelihood Estimates of Gaussian Processes. https://CRAN.R-project.org/package=mlegp.
Datta, A, S Banerjee, AO Finley, and AE Gelfand. 2016. “Hierarchical Nearest-Neighbor Gaussian Process Models for Large Geostatistical Datasets.” Journal of the American Statistical Association 111 (514): 800–812.
de Micheaux, P. 2017. CompQuadForm: Distribution Function of Quadratic Forms in Normal Variables. https://CRAN.R-project.org/package=CompQuadForm.
Denison, DGT, BK Mallick, and AFM Smith. 1998. “A Bayesian CART Algorithm.” Biometrika 85 (2): 363–77.
Deville, Y, D Ginsbourger, and O Roustant. 2018. kergp: Gaussian Process Laboratory. https://CRAN.R-project.org/package=kergp.
DiDonato, AR, and AH Morris Jr. 1986. “Computation of the Incomplete Gamma Function Ratios and Their Inverse.” ACM Transactions on Mathematical Software 12 (4): 377–93.
Drake, RP, FW Doss, RG McClarren, ML Adams, N Amato, D Bingham, CC Chou, C DiStefano, K Fidkowski, and B Fryxell. 2011. “Radiative Effects in Radiative Shocks in Shock Tubes.” High Energy Density Physics 7 (3): 130–40.
Draper, NR. 1963. “Ridge Analysis of Response Surfaces.” Technometrics 5 (4): 469–79.
Duchesne, P, and PL de Micheaux. 2010. “Computing the Distribution of Quadratic Forms: Further Comparisons Between the Liu–Tang–Zhang Approximation and Exact Methods.” Computational Statistics & Data Analysis 54 (4): 858–62.
Dunlop, MM, MA Girolami, AM Stuart, and AL Teckentrup. 2018. “How Deep Are Deep Gaussian Processes?” The Journal of Machine Learning Research 19 (1): 2100–2145.
Dupuy, D, C Helbert, and J Franco. 2015. “DiceDesign and DiceEval: Two R Packages for Design and Analysis of Computer Experiments.” Journal of Statistical Software 65 (11): 1–38. http://www.jstatsoft.org/v65/i11/.
Durrande, N, D Ginsbourger, and O Roustant. 2012. “Additive Covariance Kernels for High-Dimensional Gaussian Process Modeling.” In Annales de La Faculté Des Sciences de Toulouse: Mathématiques, 21 (3):481–99.
Eidsvik, J, BA Shaby, BJ Reich, M Wheeler, and J Niemi. 2014. “Estimation and Prediction in Spatial Models with Block Composite Likelihoods.” Journal of Computational and Graphical Statistics 23 (2): 295–315.
Emery, X. 2009. “The Kriging Update Equations and Their Application to the Selection of Neighboring Data.” Computational Geosciences 13 (3): 269–80.
Erickson, Collin B, Bruce E Ankenman, and Susan M Sanchez. 2018. “Comparison of Gaussian Process Modeling Software.” European Journal of Operational Research 266 (1): 179–92.
Fadikar, A, D Higdon, J Chen, B Lewis, S Venkatramanan, and M Marathe. 2018. “Calibrating a Stochastic, Agent-Based Model Using Quantile-Based Emulation.” SIAM/ASA Journal on Uncertainty Quantification 6 (4): 1685–1706.
Finley, A, and S Banerjee. 2019. spBayes: Univariate and Multivariate Spatial-Temporal Modeling. https://CRAN.R-project.org/package=spBayes.
Finley, AO, H Sang, S Banerjee, and AE Gelfand. 2009. “Improving the Performance of Predictive Process Modeling for Large Datasets.” Computational Statistics & Data Analysis 53 (8): 2873–84.
Forrester, AIJ, A Sobester, and AJ Keane. 2008. Engineering Design via Surrogate Modelling: A Practical Guide. John Wiley & Sons.
Franck, CT, and RB Gramacy. 2020. “Assessing Bayes Factor Surfaces Using Interactive Visualization and Computer Surrogate Modeling.” To Appear in The American Statistician.
Franco, J, D Dupuy, O Roustant, P Kiener, G Damblin, and B Iooss. 2018. DiceDesign: Designs of Computer Experiments. https://CRAN.R-project.org/package=DiceDesign.
Franey, M, P Ranjan, and H Chipman. 2012. “A Short Note on Gaussian Process Modeling for Large Datasets Using Graphics Processing Units.” Preprint on ArXiv:1203.1269.
Friedman, JH. 1991. “Multivariate Adaptive Regression Splines.” The Annals of Statistics 19 (1): 1–67.
Furrer, R, MG Genton, and D Nychka. 2006. “Covariance Tapering for Interpolation of Large Spatial Datasets.” Journal of Computational and Graphical Statistics 15 (3): 502–23.
Gardner, J, G Pleiss, KQ Weinberger, D Bindel, and AG Wilson. 2018. “GPyTorch: Blackbox Matrix-Matrix Gaussian Process Inference with GPU Acceleration.” In Advances in Neural Information Processing Systems, 7576–86.
Gattiker, J, K Myers, B Williams, D Higdon, M Carzolio, and A Hoegh. 2016. “Gaussian Process-Based Sensitivity Analysis and Bayesian Model Calibration with GPMSA.” Handbook of Uncertainty Quantification, 1–41.
Gauthier, B, and L Pronzato. 2014. “Spectral Approximation of the IMSE Criterion for Optimal Designs in Kernel-Based Interpolation Models.” SIAM/ASA Journal on Uncertainty Quantification 2 (1): 805–25.
Genz, A, F Bretz, T Miwa, X Mi, and T Hothorn. 2018. mvtnorm: Multivariate Normal and \(t\) Distributions. https://CRAN.R-project.org/package=mvtnorm.
Ginsbourger, David, and Rodolphe Le Riche. 2010. “Towards Gaussian Process-Based Optimization with Finite Time Horizon.” In mODa 9–Advances in Model-Oriented Design and Analysis, 89–96. Springer-Verlag.
Ginsbourger, D, R Le Riche, and L Carraro. 2010. “Kriging Is Well-Suited to Parallelize Optimization.” In Computational Intelligence in Expensive Optimization Problems, 131–62. Springer-Verlag.
Gneiting, T, and AE Raftery. 2007. “Strictly Proper Scoring Rules, Prediction, and Estimation.” Journal of the American Statistical Association 102 (477): 359–78.
Goh, J, D Bingham, JP Holloway, MJ Grosskopf, CC Kuranz, and E Rutter. 2013. “Prediction and Computer Model Calibration Using Outputs from Multifidelity Simulators.” Technometrics 55 (4): 501–12.
Goldberg, PW, CKI Williams, and CMB Bishop. 1998. “Regression with Input-Dependent Noise: A Gaussian Process Treatment.” In Advances in Neural Information Processing Systems, 10:493–99. Cambridge, MA: MIT Press.
Gonzalez, J, M Osborne, and N Lawrence. 2016. “GLASSES: Relieving the Myopia of Bayesian Optimisation.” In Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, 790–99.
Goovaerts, P. 1997. Geostatistics for Natural Resources Evaluation. Oxford University Press on Demand.
Gorodetsky, A, and YM Marzouk. 2016. “Mercer Kernels and Integrated Variance Experimental Design: Connections Between Gaussian Process Regression and Polynomial Approximation.” SIAM/ASA Journal on Uncertainty Quantification 4 (1): 796–828.
Gramacy, RB. 2005. “Bayesian Treed Gaussian Process Models.” PhD thesis, University of California, Santa Cruz.
Gramacy, RB. 2007. “tgp: An R Package for Bayesian Nonstationary, Semiparametric Nonlinear Regression and Design by Treed Gaussian Process Models.” Journal of Statistical Software 19 (9): 6.
Gramacy, RB. 2014. plgp: Particle Learning of Gaussian Processes. https://CRAN.R-project.org/package=plgp.
Gramacy, RB. 2016. “laGP: Large-Scale Spatial Modeling via Local Approximate Gaussian Processes in R.” Journal of Statistical Software 72 (1): 1–46.
Gramacy, RB. 2018a. “A Shiny Update to an Old Experiment Game.” The American Statistician, 1–6.
Gramacy, RB. 2018b. monomvn: Estimation for Multivariate Normal and Student-\(t\) Data with Monotone Missingness. https://CRAN.R-project.org/package=monomvn.
Gramacy, RB, and DW Apley. 2015. “Local Gaussian Process Approximation for Large Computer Experiments.” Journal of Computational and Graphical Statistics 24 (2): 561–78.
Gramacy, RB, D Bingham, JP Holloway, MJ Grosskopf, CC Kuranz, E Rutter, M Trantham, and R Drake. 2015. “Calibrating a Large Computer Experiment Simulating Radiative Shock Hydrodynamics.” The Annals of Applied Statistics 9 (3): 1141–68.
Gramacy, RB, GA Gray, S Le Digabel, HKH Lee, P Ranjan, G Wells, and SM Wild. 2016. “Modeling an Augmented Lagrangian for Blackbox Constrained Optimization.” Technometrics 58 (1): 1–11.
Gramacy, RB, and B Haaland. 2016. “Speeding up Neighborhood Search in Local Gaussian Process Prediction.” Technometrics 58 (3): 294–303.
Gramacy, RB, and Herbert KH Lee. 2011. “Optimization Under Unknown Constraints.” In Bayesian Statistics. Vol. 9. Oxford University Press.
Gramacy, RB, and HKH Lee. 2008a. “Bayesian Treed Gaussian Process Models with an Application to Computer Modeling.” Journal of the American Statistical Association 103 (483): 1119–30.
Gramacy, RB, and HKH Lee. 2008b. “Gaussian Processes and Limiting Linear Models.” Computational Statistics & Data Analysis 53 (1): 123–36.
Gramacy, RB, and HKH Lee. 2009. “Adaptive Design and Analysis of Supercomputer Experiments.” Technometrics 51 (2): 130–45.
Gramacy, RB, and HKH Lee. 2012. “Cases for the Nugget in Modeling Computer Experiments.” Statistics and Computing 22 (3): 713–22.
Gramacy, RB, and H Lian. 2012. “Gaussian Process Single-Index Models as Emulators for Computer Experiments.” Technometrics 54 (1): 30–41.
Gramacy, RB, and M Ludkovski. 2015. “Sequential Design for Optimal Stopping Problems.” SIAM Journal on Financial Mathematics 6 (1): 748–75.
Gramacy, RB, J Niemi, and RM Weiss. 2014. “Massively Parallel Approximate Gaussian Process Regression.” SIAM/ASA Journal on Uncertainty Quantification 2 (1): 564–84.
Gramacy, RB, and E Pantaleo. 2010. “Shrinkage Regression for Multivariate Inference with Missing Data, and an Application to Portfolio Balancing.” Bayesian Analysis 5 (2): 237–62. https://doi.org/10.1214/10-BA602.
Gramacy, RB, and NG Polson. 2011. “Particle Learning of Gaussian Process Models for Sequential Design and Optimization.” Journal of Computational and Graphical Statistics 20 (1): 102–18.
Gramacy, RB, R Samworth, and R King. 2010. “Importance Tempering.” Statistics and Computing 20 (1): 1–7.
Gramacy, RB, and F Sun. 2018. laGP: Local Approximate Gaussian Process Regression. http://bobby.gramacy.com/r_packages/laGP.
Gramacy, RB, and MA Taddy. 2010. “Categorical Inputs, Sensitivity Analysis, Optimization and Importance Tempering with tgp Version 2, an R Package for Treed Gaussian Process Models.” Journal of Statistical Software 33 (6): 1–48.
Gramacy, RB, and MA Taddy. 2016. tgp: Bayesian Treed Gaussian Process Models. https://CRAN.R-project.org/package=tgp.
Gramacy, RB, MA Taddy, and C Anagnostopoulos. 2017. dynaTree: Dynamic Trees for Learning and Design. https://CRAN.R-project.org/package=dynaTree.
Gramacy, RB, MA Taddy, and SM Wild. 2013. “Variable Selection and Sensitivity Analysis Using Dynamic Trees, with an Application to Computer Code Performance Tuning.” The Annals of Applied Statistics 7 (1): 51–80.
Greenwell, B, B Boehmke, and C Cunningham. 2019. gbm: Generalized Boosted Regression Models. https://CRAN.R-project.org/package=gbm.
Gu, M. 2019. “Jointly Robust Prior for Gaussian Stochastic Process in Emulation, Calibration and Variable Selection.” Bayesian Analysis 14 (3): 857–85.
Gu, M, J Palomo, and J Berger. 2018. RobustGaSP: Robust Gaussian Stochastic Process Emulation. https://CRAN.R-project.org/package=RobustGaSP.
Gu, M, and L Wang. 2018. “Scaled Gaussian Stochastic Process for Computer Model Calibration and Prediction.” SIAM/ASA Journal on Uncertainty Quantification 6 (4): 1555–83.
Guinness, J, and M Katzfuss. 2019. GpGp: Fast Gaussian Process Computation Using Vecchia’s Approximation. https://CRAN.R-project.org/package=GpGp.
Gul, E. 2016. “Designs for Computer Experiments and Uncertainty Quantification.” PhD thesis, Georgia Institute of Technology.
Haaland, B, and PZG Qian. 2011. “Accurate Emulators for Large-Scale Computer Experiments.” The Annals of Statistics 39 (6): 2974–3002.
Hankin, RKS. 2013. BACCO: Bayesian Analysis of Computer Code Output. https://CRAN.R-project.org/package=BACCO.
Hankin, RKS. 2019. emulator: Bayesian Emulation of Computer Programs. https://CRAN.R-project.org/package=emulator.
Hastie, T, R Tibshirani, and JH Friedman. 2009. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. New York, NY: Springer.
Heaton, MJ, A Datta, AO Finley, R Furrer, J Guinness, R Guhaniyogi, F Gerber, et al. 2018. “A Case Study Competition Among Methods for Analyzing Large Spatial Data.” Journal of Agricultural, Biological and Environmental Statistics, 1–28.
Herbei, Radu, and L Mark Berliner. 2014. “Estimating Ocean Circulation: An MCMC Approach with Approximated Likelihoods via the Bernoulli Factory.” Journal of the American Statistical Association 109 (507): 944–54.
Hernández-Lobato, JM, MA Gelbart, MW Hoffman, RP Adams, and Z Ghahramani. 2015. “Predictive Entropy Search for Bayesian Optimization with Unknown Constraints.” In Proceedings of the 32nd International Conference on Machine Learning.
Higdon, D. 2002. “Space and Space-Time Modeling Using Process Convolutions.” In Quantitative Methods for Current Environmental Issues, 37–56. New York, NY: Springer.
Higdon, D, J Gattiker, B Williams, and M Rightley. 2008. “Computer Model Calibration Using High-Dimensional Output.” Journal of the American Statistical Association 103 (482): 570–83.
Higdon, D, M Kennedy, JC Cavendish, JA Cafeo, and RD Ryne. 2004. “Combining Field Data and Computer Simulations for Calibration and Prediction.” SIAM Journal on Scientific Computing 26 (2): 448–66.
Hoff, PD. 2009. A First Course in Bayesian Statistical Methods. Vol. 580. New York, NY: Springer.
Hong, LJ, and BL Nelson. 2006. “Discrete Optimization via Simulation Using COMPASS.” Operations Research 54 (1): 115–29.
Hu, R, and M Ludkovski. 2017. “Sequential Design for Ranking Response Surfaces.” SIAM/ASA Journal on Uncertainty Quantification 5 (1): 212–39.
Huan, X, and YM Marzouk. 2016. “Sequential Bayesian Optimal Experimental Design via Approximate Dynamic Programming.” Preprint on ArXiv:1604.08320.
Huang, D, TT Allen, WI Notz, and N Zeng. 2006. “Global Optimization of Stochastic Black-Box Systems via Sequential Kriging Meta-Models.” Journal of Global Optimization 34 (3): 441–66.
Huang, J, RB Gramacy, M Binois, and M Librashi. 2020. “On-Site Surrogates for Large-Scale Calibration.” To Appear in Applied Stochastic Models in Business and Industry.
Jacquier, E, NG Polson, and PE Rossi. 2004. “Bayesian Analysis of Stochastic Volatility Models with Fat-Tails and Correlated Errors.” Journal of Econometrics 122: 185–212.
Jalali, H, I Van Nieuwenhuyse, and V Picheny. 2017. “Comparison of Kriging-Based Algorithms for Simulation Optimization with Heterogeneous Noise.” European Journal of Operational Research 261 (1): 279–301.
Johnson, LR. 2008. “Microcolony and Biofilm Formation as a Survival Strategy for Bacteria.” Journal of Theoretical Biology 251: 24–34.
Johnson, LR, RB Gramacy, J Cohen, E Mordecai, C Murdock, J Rohr, SJ Ryan, AM Stewart-Ibarra, and D Weikel. 2018. “Phenomenological Forecasting of Disease Incidence Using Heteroskedastic Gaussian Processes: A Dengue Case Study.” The Annals of Applied Statistics 12 (1): 27–66.
Johnson, ME, LM Moore, and D Ylvisaker. 1990. “Minimax and Maximin Distance Designs.” Journal of Statistical Planning and Inference 26 (2): 131–48.
Jones, DR, M Schonlau, and WJ Welch. 1998. “Efficient Global Optimization of Expensive Black-Box Functions.” Journal of Global Optimization 13 (4): 455–92.
Joseph, VR. 2006. “Limit Kriging.” Technometrics 48 (4): 458–66.
Joseph, VR, En Gul, and S Ba. 2015. “Maximum Projection Designs for Computer Experiments.” Biometrika 102 (2): 371–80.
Journel, AG, and CJ Huijbregts. 1978. Mining Geostatistics. Vol. 600. Academic Press.
Kamiński, B. 2015. “A Method for the Updating of Stochastic Kriging Metamodels.” European Journal of Operational Research 247 (3): 859–66.
Kannan, A, and SM Wild. 2012. “Benefits of Deeper Analysis in Simulation-Based Groundwater Optimization Problems.” In Proceedings of the XIX International Conference on Computational Methods in Water Resources (CMWR 2012), 4(5):10. Springer–Verlag.
Karatzoglou, A, A Smola, and K Hornik. 2018. kernlab: Kernel-Based Machine Learning Lab. https://CRAN.R-project.org/package=kernlab.
Kass, RE, BP Carlin, A Gelman, and RM Neal. 1998. “Markov Chain Monte Carlo in Practice: A Roundtable Discussion.” The American Statistician 52 (2): 93–100.
Katzfuss, M. 2017. “A Multi-Resolution Approximation for Massive Spatial Datasets.” Journal of the American Statistical Association 112 (517): 201–14.
Katzfuss, M, and J Guinness. 2018. “A General Framework for Vecchia Approximations of Gaussian Processes.” Preprint on ArXiv:1708.06302.
Kaufman, CG, D Bingham, S Habib, K Heitmann, and JA Frieman. 2011. “Efficient Emulators of Computer Experiments Using Compactly Supported Correlation Functions, with an Application to Cosmology.” The Annals of Applied Statistics 5 (4): 2470–92.
Kennedy, MC, and A O’Hagan. 2001. “Bayesian Calibration of Computer Models.” Journal of the Royal Statistical Society: Series B (Statistical Methodology) 63 (3): 425–64.
Kersting, K, C Plagemann, P Pfaff, and W Burgard. 2007. “Most Likely Heteroscedastic Gaussian Process Regression.” In Proceedings of the International Conference on Machine Learning, 393–400. New York, NY: ACM.
Kim, H, BK Mallick, and CC Holmes. 2005. “Analyzing Nonstationary Spatial Data Using Piecewise Gaussian Processes.” Journal of the American Statistical Association 100 (470): 653–68.
Konomi, BA, G Karagiannis, K Lai, and G Lin. 2017. “Bayesian Treed Calibration: An Application to Carbon Capture with AX Sorbent.” Journal of the American Statistical Association 112 (517): 37–53.
Lam, R, K Willcox, and DH Wolpert. 2016. “Bayesian Optimization with a Finite Budget: An Approximate Dynamic Programming Approach.” In Advances in Neural Information Processing Systems, 883–91.
Lanckriet, GRG, N Cristianini, P Bartlett, LE Ghaoui, and MI Jordan. 2004. “Learning the Kernel Matrix with Semidefinite Programming.” Journal of Machine Learning Research 5 (Jan): 27–72.
Larson, J, M Menickelly, and SM Wild. 2019. “Derivative-Free Optimization Methods.” Preprint on ArXiv:1904.11585.
Law, A. 2015. Simulation Modeling and Analysis. 5th ed. McGraw-Hill.
Lazaro–Gredilla, M, and M Titsias. 2011. “Variational Heteroscedastic Gaussian Process Regression.” In Proceedings of the International Conference on Machine Learning, 841–48. New York, NY: ACM.
Le Gratiet, L, and C Cannamela. 2015. “Cokriging-Based Sequential Design Strategies Using Fast Cross-Validation Techniques for Multi-Fidelity Computer Codes.” Technometrics 57 (3): 418–27.
Le Gratiet, L, and J Garnier. 2014. “Recursive Co-Kriging Model for Design of Computer Experiments with Multiple Levels of Fidelity.” International Journal for Uncertainty Quantification 4 (5).
Leatherman, ER, TJ Santner, and AM Dean. 2017. “Computer Experiment Designs for Accurate Prediction.” Statistics and Computing, 1–13.
Lee, Herbert K. H. 2007. “Chocolate Chip Cookies as a Teaching Aid.” The American Statistician 61 (4): 351–55.
Lee, HKH, RB Gramacy, C Linkletter, and G Gray. 2011. “Optimization Subject to Hidden Constraints via Statistical Emulation.” Pacific Journal of Optimization 7 (3): 467–78.
Lee, MR, and AB Owen. 2015. “Single Nugget Kriging.” Stanford University.
Leisch, F, K Hornik, and BD Ripley. 2017. mda: Mixture and Flexible Discriminant Analysis. https://CRAN.R-project.org/package=mda.
Letham, Benjamin, Brian Karrer, Guilherme Ottoni, and Eytan Bakshy. 2019. “Constrained Bayesian Optimization with Noisy Experiments.” Bayesian Analysis 14 (2): 495–519.
Lin, CD, and B Tang. 2015. “Latin Hypercubes and Space-Filling Designs.” Handbook of Design and Analysis of Experiments, 593–625.
Liu, F, MJ Bayarri, and JO Berger. 2009. “Modularization in Bayesian Analysis, with Emphasis on Analysis of Computer Models.” Bayesian Analysis 4 (1): 119–50.
Liu, Y. 2014. “Recent Advances in Computer Experiment Modeling.” PhD thesis, Rutgers University.
Lyu, X, M Binois, and M Ludkovski. 2018. “Evaluating Gaussian Process Metamodels and Sequential Designs for Noisy Level Set Estimation.” Preprint on ArXiv:1807.06712.
MacDonald, B, H Chipman, and P Ranjan. 2019. GPfit: Gaussian Processes Modeling. https://CRAN.R-project.org/package=GPfit.
MacKay, DJC. 1992. “Information-Based Objective Functions for Active Data Selection.” Neural Computation 4 (4): 590–604.
Marrel, A, B Iooss, B Laurent, and O Roustant. 2009. “Calculations of Sobol Indices for the Gaussian Process Metamodel.” Reliability Engineering & System Safety 94 (3): 742–51.
Matheron, G. 1963. “Principles of Geostatistics.” Economic Geology 58 (8): 1246–66.
Matott, LS, K Leung, and J Sim. 2011. “Application of MATLAB and Python Optimizers to Two Case Studies Involving Groundwater Flow and Contaminant Transport Modeling.” Computers & Geosciences 37 (11): 1894–99.
Matott, LS, AJ Rabideau, and JR Craig. 2006. “Pump-and-Treat Optimization Using Analytic Element Method Flow Models.” Advances in Water Resources 29 (5): 760–75.
Mayer, AS, CT Kelley, and CT Miller. 2002. “Optimal Design for Problems Involving Flow and Transport Phenomena in Saturated Subsurface Systems.” Advances in Water Resources 25 (8-12): 1233–56.
McClarren, RG, D Ryu, RP Drake, M Grosskopf, D Bingham, C Chou, B Fryxell, B Van der Holst, JP Holloway, and CC Kuranz. 2011. “A Physics Informed Emulator for Laser-Driven Radiating Shock Simulations.” Reliability Engineering & System Safety 96 (9): 1194–1207.
McCulloch, R, R Sparapani, RB Gramacy, C Spanbauer, and M Pratola. 2019. BART: Bayesian Additive Regression Trees. https://CRAN.R-project.org/package=BART.
McKeague, Ian W, Geoff Nicholls, Kevin Speer, and Radu Herbei. 2005. “Statistical Inversion of South Atlantic Circulation in an Abyssal Neutral Density Layer.” Journal of Marine Research 63 (4): 683–704.
Mead, R, and KH Freeman. 1973. “An Experiment Game.” Journal of the Royal Statistical Society. Series C (Applied Statistics) 22 (1): 1–6.
Mehta, PM, A Walker, E Lawrence, R Linares, D Higdon, and J Koller. 2014. “Modeling Satellite Drag Coefficients with Response Surfaces.” Advances in Space Research 54 (8): 1590–1607.
Milborrow, S. 2019. earth: Multivariate Adaptive Regression Splines. https://CRAN.R-project.org/package=earth.
Mitchell, TJ, and DS Scott. 1987. “A Computer Program for the Design of Group Testing Experiments.” Communications in Statistics – Theory and Methods 16 (10): 2943–55.
Mitchell, T, J Sacks, and D Ylvisaker. 1994. “Asymptotic Bayes Criteria for Nonparametric Response Surface Design.” The Annals of Statistics 22 (2): 634–51.
Mockus, J, V Tiesis, and A Zilinskas. 1978. “The Application of Bayesian Methods for Seeking the Extremum.” Towards Global Optimization 2: 117–29.
Morgan, JN, and JA Sonquist. 1963. “Problems in the Analysis of Survey Data, and a Proposal.” Journal of the American Statistical Association 58 (302): 415–34.
Morris, M. 2010. Design of Experiments: An Introduction Based on Linear Models. Chapman & Hall/CRC.
Morris, MD, and TJ Mitchell. 1995. “Exploratory Designs for Computational Experiments.” Journal of Statistical Planning and Inference 43 (3): 381–402.
Morris, MD, TJ Mitchell, and D Ylvisaker. 1993. “Bayesian Design and Analysis of Computer Experiments: Use of Derivatives in Surface Prediction.” Technometrics 35 (3): 243–55.
Morris, RD, A Kottas, MA Taddy, R Furfaro, and BD Ganapol. 2008. “A Statistical Framework for the Sensitivity Analysis of Radiative Transfer Models.” IEEE Transactions on Geoscience and Remote Sensing 46 (12): 4062–74.
Myers, RH, DC Montgomery, and CM Anderson–Cook. 2016. Response Surface Methodology: Process and Product Optimization Using Designed Experiments. John Wiley & Sons.
Neal, R. 1998. “Regression and Classification Using Gaussian Process Priors.” Bayesian Statistics 6: 475.
Nelder, JA. 1966. “Inverse Polynomials, a Useful Group of Multi-Factor Response Functions.” Biometrics 22: 128–41.
Nelder, JA, and R Mead. 1965. “A Simplex Method for Function Minimization.” The Computer Journal 7 (4): 308–13.
Ng, SH, and J Yin. 2012. “Bayesian Kriging Analysis and Design for Stochastic Systems.” ACM Transactions on Modeling and Computer Simulation (TOMACS) 22 (3): article no. 17.
Nocedal, J, and S Wright. 2006. Numerical Optimization. New York, NY: Springer Science & Business Media.
Nychka, D, R Furrer, J Paige, and S Sain. 2019. fields: Tools for Spatial Data. https://CRAN.R-project.org/package=fields.
Oakley, JE, and A O’Hagan. 2004. “Probabilistic Sensitivity Analysis of Complex Models: A Bayesian Approach.” Journal of the Royal Statistical Society: Series B (Statistical Methodology) 66 (3): 751–69.
Opsomer, J. D., D. Ruppert, M. P. Wand, U. Holst, and O. Hössjer. 1999. “Kriging with Nonparametric Variance Function Estimation.” Biometrics 55: 704–10.
Owen, AB. 2014. “Sobol Indices and Shapley Value.” SIAM/ASA Journal on Uncertainty Quantification 2 (1): 245–51.
Paciorek, CJ. 2003. “Nonstationary Gaussian Processes for Regression and Spatial Modelling.” PhD thesis, Carnegie Mellon University.
Paciorek, CJ, and MJ Schervish. 2006. “Spatial Modelling Using a New Class of Nonstationary Covariance Functions.” Environmetrics: The Official Journal of the International Environmetrics Society 17 (5): 483–506.
Pamadi, B, P Covell, P Tartabini, and K Murphy. 2004. “Aerodynamic Characteristics and Glide-Back Performance of Langley Glide-Back Booster.” In 22nd Applied Aerodynamics Conference and Exhibit, 5382.
Pamadi, B, P Tartabini, and B Starr. 2004. “Ascent, Stage Separation and Glideback Performance of a Partially Reusable Small Launch Vehicle.” In 42nd AIAA Aerospace Sciences Meeting and Exhibit, 876.
Park, C, and DW Apley. 2018. “Patchwork Kriging for Large-Scale Gaussian Process Regression.” The Journal of Machine Learning Research 19 (1): 269–311.
Park, T, and G Casella. 2008. “The Bayesian Lasso.” Journal of the American Statistical Association 103 (482): 681–86.
Pav, SE. 2017. sadists: Some Additional Distributions. https://CRAN.R-project.org/package=sadists.
Peng, CY, and CFJ Wu. 2014. “On the Choice of Nugget in Kriging Modeling for Deterministic Computer Experiments.” Journal of Computational and Graphical Statistics 23 (1): 151–68.
Picheny, V, and D Ginsbourger. 2013. “A Nonstationary Space-Time Gaussian Process Model for Partially Converged Simulations.” SIAM/ASA Journal on Uncertainty Quantification 1: 57–78.
Picheny, V, D Ginsbourger, and T Krityakierne. 2016. “Comment: Some Enhancements over the Augmented Lagrangian Approach.” Technometrics 58 (1): 17–21.
Picheny, V, D Ginsbourger, Y Richet, and G Caplin. 2013. “Quantile-Based Optimization of Noisy Computer Experiments with Tunable Precision.” Technometrics 55 (1): 2–13.
Picheny, V, D Ginsbourger, and O Roustant. 2016. DiceOptim: Kriging-Based Optimization for Computer Experiments. https://CRAN.R-project.org/package=DiceOptim.
Picheny, V, D Ginsbourger, O Roustant, RT Haftka, and N Kim. 2010. “Adaptive Designs of Experiments for Accurate Approximation of a Target Region.” Journal of Mechanical Design 132 (7): 071008.
Picheny, V, RB Gramacy, S Wild, and S Le Digabel. 2016. “Bayesian Optimization Under Mixed Constraints with a Slack-Variable Augmented Lagrangian.” In Advances in Neural Information Processing Systems, 1435–43.
Picheny, V, T Wagner, and D Ginsbourger. 2012. “A Benchmark of Kriging-Based Infill Criteria for Noisy Optimization.” Structural and Multidisciplinary Optimization, 1–20.
Picone, JM, AE Hedin, DP Drob, and AC Aikin. 2002. “NRLMSISE-00 Empirical Model of the Atmosphere: Statistical Comparisons and Scientific Issues.” Journal of Geophysical Research: Space Physics 107 (A12): SIA–15.
Plumlee, M. 2017. “Bayesian Calibration of Inexact Computer Models.” Journal of the American Statistical Association 112 (519): 1274–85.
Plumlee, M, and R Tuo. 2014. “Building Accurate Emulators for Stochastic Simulations via Quantile Kriging.” Technometrics 56 (4): 466–73.
Pratola, Matthew T. 2016. “Efficient Metropolis–Hastings Proposal Mechanisms for Bayesian Regression Tree Models.” Bayesian Analysis 11 (3): 885–911.
Pratola, M, H Chipman, EI George, and R McCulloch. 2017. “Heteroscedastic BART Using Multiplicative Regression Trees.” Preprint on arXiv:1709.07542.
Pratola, MT, O Harari, D Bingham, and GE Flowers. 2017. “Design and Analysis of Experiments on Nonconvex Regions.” Technometrics, 1–12.
Qian, PZG. 2009. “Nested Latin Hypercube Designs.” Biometrika 96 (4): 957–70.
Qian, PZG, H Wu, and CFJ Wu. 2008. “Gaussian Process Models for Computer Experiments with Qualitative and Quantitative Factors.” Technometrics 50 (3): 383–96.
Quadrianto, N, K Kersting, M Reid, T Caetano, and W Buntine. 2009. “Kernel Conditional Quantile Estimation via Reduction Revisited.” In Proceedings of the 9th IEEE International Conference on Data Mining, 938–43.
R Core Team. 2019. R: A Language and Environment for Statistical Computing. Vienna, Austria: R Foundation for Statistical Computing. https://www.R-project.org/.
Racine, JS, and Z Nie. 2018. crs: Categorical Regression Splines. https://CRAN.R-project.org/package=crs.
Ranjan, P, D Bingham, and G Michailidis. 2008. “Sequential Experiment Design for Contour Estimation from Complex Computer Codes.” Technometrics 50 (4): 527–41.
Rasmussen, CE, and CKI Williams. 2006. Gaussian Processes for Machine Learning. Cambridge, MA: MIT Press.
Raymer, D. 2012. Aircraft Design: A Conceptual Approach. American Institute of Aeronautics and Astronautics, Inc.
Richardson, S, and PJ Green. 1997. “On Bayesian Analysis of Mixtures with an Unknown Number of Components (with Discussion).” Journal of the Royal Statistical Society: Series B (Statistical Methodology) 59 (4): 731–92.
Ridgeway, G. 2007. “Generalized Boosted Models: A Guide to the gbm Package.” R package vignette: https://cran.r-project.org/web/packages/gbm/vignettes/gbm.pdf.
Ripley, BD. 2015. spatial: Functions for Kriging and Point Pattern Analysis. https://CRAN.R-project.org/package=spatial.
Rogers, S, M Aftosmis, S Pandya, N Chaderjian, E Tejnil, and J Ahmad. 2003. “Automated CFD Parameter Studies on Distributed Parallel Computers.” In 16th AIAA Computational Fluid Dynamics Conference, 4229.
Roustant, O, D Ginsbourger, and Y Deville. 2018. DiceKriging: Kriging Methods for Computer Experiments. https://CRAN.R-project.org/package=DiceKriging.
Rullière, D, N Durrande, F Bachoc, and C Chevalier. 2018. “Nested Kriging Predictions for Datasets with a Large Number of Observations.” Statistics and Computing 28 (4): 849–67.
Rushdi, A, LP Swiler, ET Phipps, M D’Elia, and MS Ebeida. 2017. “VPS: Voronoi Piecewise Surrogate Models for High-Dimensional Data Fitting.” International Journal for Uncertainty Quantification 7 (1).
Sacks, J, WJ Welch, TJ Mitchell, and HP Wynn. 1989. “Design and Analysis of Computer Experiments.” Statistical Science 4 (4): 409–23.
Saltelli, A. 2002. “Making Best Use of Model Evaluations to Compute Sensitivity Indices.” Computer Physics Communications 145 (2): 280–97.
Saltelli, A, K Chan, and M Scott. 2000. Sensitivity Analysis. New York, NY: John Wiley & Sons.
Saltelli, A, and S Tarantola. 2002. “On the Relative Importance of Input Factors in Mathematical Models: Safety Assessment for Nuclear Waste Disposal.” Journal of the American Statistical Association 97 (459): 702–9.
Sampson, PD, and P Guttorp. 1992. “Nonparametric Estimation of Nonstationary Spatial Covariance Structure.” Journal of the American Statistical Association 87 (417): 108–19.
Sang, H, and JZ Huang. 2012. “A Full Scale Approximation of Covariance Functions for Large Spatial Data Sets.” Journal of the Royal Statistical Society: Series B (Statistical Methodology) 74 (1): 111–32.
Santner, TJ, BJ Williams, and W Notz. 2003. The Design and Analysis of Computer Experiments, First Edition. Springer–Verlag.
Santner, TJ, BJ Williams, and W Notz. 2018. The Design and Analysis of Computer Experiments, Second Edition. New York, NY: Springer–Verlag.
Sasena, MJ. 2002. “Flexibility and Efficiency Enhancements for Constrained Global Design Optimization with Kriging Approximations.” PhD thesis, University of Michigan Ann Arbor.
Schmidt, AM, and A O’Hagan. 2003. “Bayesian Inference for Non-Stationary Spatial Covariance Structure via Spatial Deformations.” Journal of the Royal Statistical Society: Series B (Statistical Methodology) 65 (3): 743–58.
Schmidt, SR, and RG Launsby. 1989. Understanding Industrial Designed Experiments. Air Academy Press.
Schonlau, M. 1997. “Computer Experiments and Global Optimization.” PhD thesis, University of Waterloo.
Schonlau, M, WJ Welch, and DR Jones. 1998. “Global Versus Local Search in Constrained Optimization of Computer Models.” Lecture Notes-Monograph Series, 11–25.
Seo, S, M Wallat, T Graepel, and K Obermayer. 2000. “Gaussian Process Regression: Active Data Selection and Test Point Rejection.” In Mustererkennung 2000, 27–34. New York, NY: Springer–Verlag.
Sheather, S. 2009. A Modern Approach to Regression with R. New York, NY: Springer Science & Business Media.
Sherlock, C, P Fearnhead, and GO Roberts. 2010. “The Random Walk Metropolis: Linking Theory and Practice Through a Case Study.” Statistical Science 25 (2): 172–90.
Shewry, MC, and HP Wynn. 1987. “Maximum Entropy Sampling.” Journal of Applied Statistics 14 (2): 165–70.
Snelson, E, and Z Ghahramani. 2006. “Sparse Gaussian Processes Using Pseudo-Inputs.” In Advances in Neural Information Processing Systems, 1257–64.
Snoek, J, H Larochelle, and RP Adams. 2012. “Practical Bayesian Optimization of Machine Learning Algorithms.” In Advances in Neural Information Processing Systems, 2951–59.
Sobol, IM. 1993. “Sensitivity Analysis for Non-Linear Mathematical Models.” Mathematical Modelling and Computational Experiment 1: 407–14.
Sobol, IM. 2001. “Global Sensitivity Indices for Nonlinear Mathematical Models and Their Monte Carlo Estimates.” Mathematics and Computers in Simulation 55 (1-3): 271–80.
Song, E, BL Nelson, and J Staum. 2016. “Shapley Effects for Global Sensitivity Analysis: Theory and Computation.” SIAM/ASA Journal on Uncertainty Quantification 4 (1): 1060–83.
Srinivas, N, A Krause, SM Kakade, and M Seeger. 2009. “Gaussian Process Optimization in the Bandit Setting: No Regret and Experimental Design.” Preprint on ArXiv:0912.3995.
Stein, ML. 2012. Interpolation of Spatial Data: Some Theory for Kriging. New York, NY: Springer Science & Business Media.
Stein, ML, Z Chi, and LJ Welty. 2004. “Approximating Likelihoods for Large Spatial Data Sets.” Journal of the Royal Statistical Society: Series B (Statistical Methodology) 66 (2): 275–96.
Storlie, CB, and JC Helton. 2008. “Multiple Predictor Smoothing Methods for Sensitivity Analysis: Description of Techniques.” Reliability Engineering & System Safety 93 (1): 28–54.
Storlie, CB, LP Swiler, JC Helton, and CJ Sallaberry. 2009. “Implementation and Evaluation of Nonparametric Regression Procedures for Sensitivity Analysis of Computationally Demanding Models.” Reliability Engineering & System Safety 94 (11): 1735–63.
Stroud, JR, ML Stein, and S Lysen. 2017. “Bayesian and Maximum Likelihood Estimation for Gaussian Processes on an Incomplete Lattice.” Journal of Computational and Graphical Statistics 26 (1): 108–20.
Sun, F, and RB Gramacy. 2019. maximin: Space-Filling Design Under Maximin Distance.
Sun, F, RB Gramacy, B Haaland, E Lawrence, and A Walker. 2019. “Emulating Satellite Drag from Large Simulation Experiments.” SIAM/ASA Journal on Uncertainty Quantification 7 (2): 720–59.
Sun, F, RB Gramacy, B Haaland, S Lu, and Y Hwang. 2019. “Synthesizing Simulation and Field Data of Solar Irradiance.” Statistical Analysis and Data Mining 12 (4): 311–24.
Sung, CL, RB Gramacy, and B Haaland. 2018. “Exploiting Variance Reduction Potential in Local Gaussian Process Search.” Statistica Sinica 28: 577–600.
Surjanovic, S, and D Bingham. 2013. “Virtual Library of Simulation Experiments: Test Functions and Datasets.” http://www.sfu.ca/~ssurjano.
Taddy, MA, RB Gramacy, and NG Polson. 2011. “Dynamic Trees for Learning and Design.” Journal of the American Statistical Association 106 (493): 109–23.
Taddy, MA, HKH Lee, GA Gray, and JD Griffin. 2009. “Bayesian Guided Pattern Search for Robust Local Optimization.” Technometrics 51 (4): 389–401.
Tan, MHY. 2013. “Minimax Designs for Finite Design Regions.” Technometrics 55 (3): 346–58.
Ter Meer, J, H Van Duijne, R Nieuwenhuis, and H Rijnaarts. 2007. “Prevention and Reduction of Groundwater Pollution at Contaminated Megasites: Integrated Management Strategy and Its Application on Megasite Cases.” In Groundwater Science and Policy, 403–20. The Royal Society of Chemistry.
Thompson, WR. 1933. “On the Likelihood That One Unknown Probability Exceeds Another in View of the Evidence of Two Samples.” Biometrika 25 (3/4): 285–94.
Tuo, R, and CF Wu. 2016. “A Theoretical Framework for Calibration in Computer Models: Parametrization, Estimation and Convergence Properties.” SIAM/ASA Journal on Uncertainty Quantification 4 (1): 767–95.
Tuo, R, and CFJ Wu. 2015. “Efficient Calibration for Imperfect Computer Models.” The Annals of Statistics 43 (6): 2331–52.
Ubaru, S, J Chen, and Y Saad. 2017. “Fast Estimation of \(\mathrm{tr}(f(A))\) via Stochastic Lanczos Quadrature.” SIAM Journal on Matrix Analysis and Applications 38 (4): 1075–99.
Vanhatalo, J, J Riihimäki, J Hartikainen, P Jylänki, V Tolvanen, and A Vehtari. 2012. “Bayesian Modeling with Gaussian Processes Using the GPstuff Toolbox.” Preprint on ArXiv:1206.5754.
Vapnik, Vladimir. 2013. The Nature of Statistical Learning Theory. New York, NY: Springer Science & Business Media.
Vecchia, AV. 1988. “Estimation and Model Identification for Continuous Spatial Processes.” Journal of the Royal Statistical Society: Series B (Methodological) 50 (2): 297–312.
Ver Hoef, J., and RP Barry. 1998. “Constructing and Fitting Models for Cokriging and Multivariate Spatial Prediction.” Journal of Statistical Planning and Inference 69: 275–94.
Vernon, I, M Goldstein, and RG Bower. 2010. “Galaxy Formation: A Bayesian Uncertainty Analysis.” Bayesian Analysis 5 (4): 619–69.
Wang, KA, G Pleiss, JR Gardner, S Tyree, KQ Weinberger, and AG Wilson. 2019. “Exact Gaussian Processes on a Million Data Points.” Preprint on ArXiv:1903.08114.
Wang, W, and X Chen. 2016. “The Effects of Estimation of Heteroscedasticity on Stochastic Kriging.” In Proceedings of the 2016 Winter Simulation Conference, 326–37. IEEE Press.
Welch, WJ, RJ Buck, J Sacks, HP Wynn, TJ Mitchell, and MD Morris. 1992. “Screening, Predicting, and Computer Experiments.” Technometrics 34 (1): 15–25.
Wendland, H. 2004. Scattered Data Approximation. Cambridge, England: Cambridge University Press.
Williamson, D, M Goldstein, L Allison, A Blaker, P Challenor, L Jackson, and K Yamazaki. 2013. “History Matching for Exploring and Reducing Climate Model Parameter Space Using Observations and a Large Perturbed Physics Ensemble.” Climate Dynamics 41 (7-8): 1703–29.
Worley, BA. 1987. “Deterministic Uncertainty Analysis.” Oak Ridge National Lab., TN (USA).
Wu, CFJ, and MS Hamada. 2011. Experiments: Planning, Analysis, and Optimization. Vol. 552. John Wiley & Sons.
Wu, J, M Poloczek, AG Wilson, and P Frazier. 2017. “Bayesian Optimization with Gradients.” In Advances in Neural Information Processing Systems, 5267–78.
Wu, Yuhong, Håkon Tjelmeland, and Mike West. 2007. “Bayesian CART: Prior Specification and Posterior Simulation.” Journal of Computational and Graphical Statistics 16 (1): 44–66.
Xie, J, PI Frazier, and S Chick. 2012. “Assemble to Order Simulator.” http://simopt.org/wiki/index.php?title=Assemble_to_Order&oldid=447.
Xie, Y. 2015. Dynamic Documents with R and knitr. 2nd ed. Boca Raton, Florida: Chapman & Hall/CRC. http://yihui.name/knitr/.
Xie, Y. 2016. bookdown: Authoring Books and Technical Documents with R Markdown. Chapman & Hall/CRC.
Xie, Y. 2018a. bookdown: Authoring Books and Technical Documents with R Markdown. https://CRAN.R-project.org/package=bookdown.
Xie, Y. 2018b. knitr: A General-Purpose Package for Dynamic Report Generation in R. https://CRAN.R-project.org/package=knitr.
Zhang, B, DA Cole, and RB Gramacy. 2020. “Distance-Distributed Design for Gaussian Process Surrogates.” To Appear in Technometrics.
Zhang, Y, S Tao, W Chen, and DW Apley. 2018. “A Latent Variable Approach to Gaussian Process Modeling with Qualitative and Quantitative Factors.” Preprint on ArXiv:1806.07504.
Zhao, Y, Y Amemiya, and Y Hung. 2018. “Efficient Gaussian Process Modeling Using Experimental Design-Based Subagging.” Statistica Sinica 28 (3): 1459–79.
Zhou, Q, PZG Qian, and S Zhou. 2011. “A Simple Approach to Emulation for Computer Models with Qualitative and Quantitative Factors.” Technometrics 53 (3): 266–73.