\[ %% % Add your macros here; they'll be included in pdf and html output. %% \newcommand{\R}{\mathbb{R}} % reals \newcommand{\E}{\mathbb{E}} % expectation \renewcommand{\P}{\mathbb{P}} % probability \DeclareMathOperator{\logit}{logit} \DeclareMathOperator{\logistic}{logistic} \DeclareMathOperator{\sd}{sd} \DeclareMathOperator{\var}{var} \DeclareMathOperator{\cov}{cov} \DeclareMathOperator{\Normal}{Normal} \DeclareMathOperator{\Poisson}{Poisson} \DeclareMathOperator{\Beta}{Beta} \DeclareMathOperator{\Binom}{Binomial} \DeclareMathOperator{\Gam}{Gamma} \DeclareMathOperator{\Exp}{Exponential} \DeclareMathOperator{\Cauchy}{Cauchy} \DeclareMathOperator{\Unif}{Unif} \DeclareMathOperator{\Dirichlet}{Dirichlet} \newcommand{\given}{\;\vert\;} \]

Metric data: regression and relatives

Peter Ralph

12 February 2018 – Advanced Biological Statistics

Overview

Summary

Last Thursday, we looked at:

  1. Using non-Normal noise distributions (e.g., Cauchy) to make models robust to outliers and, more generally, to model overdispersed data.

  2. Interrogating hyperparameters to learn what we want (e.g., percent variation explained).

Today

  1. Finish up robust ANOVA.

  2. Problem #1: “too much” noise (i.e., non-Normal noise).

  3. Problem #2: too many variables.

Robust regression

Some data

Standard linear regression
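
The output below is from an ordinary lm() fit, wrapped in system.time(); a minimal sketch, assuming the simulated data are in vectors x and y (the object name slr is illustrative):

system.time( slr <- lm(y ~ x) )
summary(slr)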

##    user  system elapsed 
##   0.005   0.000   0.004
## 
## Call:
## lm(formula = y ~ x)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -127.829   -0.307    0.234    0.691   46.293 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)   0.8164     0.7202   1.133    0.258    
## x             2.9323     0.2582  11.356   <2e-16 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 10.18 on 198 degrees of freedom
## Multiple R-squared:  0.3944, Adjusted R-squared:  0.3913 
## F-statistic:   129 on 1 and 198 DF,  p-value: < 2.2e-16

with Stan
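
The Stan code for this fit isn't shown here; below is a minimal sketch of one way to write a robust regression with Cauchy noise (as in last Thursday's summary), matching the parameter names b0, b1, and sigma in the output that follows. The model string, the object names, and the data-passing step are illustrative, and the model actually fit may differ.

robust_block <- "
data {
    int N;
    vector[N] x;
    vector[N] y;
}
parameters {
    real b0;                // intercept
    real b1;                // slope
    real<lower=0> sigma;    // scale of the noise
}
model {
    // Cauchy noise has heavy tails, so outliers don't dominate the fit
    y ~ cauchy(b0 + b1 * x, sigma);
}
"
# robust_fit <- rstan::stan(model_code = robust_block,
#                           data = list(N = length(x), x = x, y = y))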

##    user  system elapsed 
##  36.170   0.947  42.192
## $summary
##               mean      se_mean         sd         2.5%          25%
## b0       1.0492653 0.0011858160 0.04901458    0.9556078    1.0153481
## b1       3.0175214 0.0003716323 0.01579547    2.9876462    3.0068720
## sigma    0.4887785 0.0011772173 0.05008859    0.3999234    0.4549012
## lp__  -143.9498680 0.0440483247 1.29259423 -147.3354709 -144.5436828
##                50%         75%       97.5%     n_eff      Rhat
## b0       1.0491830    1.081127    1.144147 1708.5037 0.9996744
## b1       3.0172806    3.027901    3.048448 1806.5022 0.9985911
## sigma    0.4863373    0.519511    0.598097 1810.3575 1.0001622
## lp__  -143.6147451 -143.008105 -142.518558  861.1239 1.0039008
## 
## $c_summary
## , , chains = chain:1
## 
##          stats
## parameter         mean         sd         2.5%          25%          50%
##     b0       1.0516070 0.04808494    0.9561784    1.0172640    1.0535676
##     b1       3.0168313 0.01636800    2.9872796    3.0052616    3.0159255
##     sigma    0.4872553 0.04932650    0.3966417    0.4543649    0.4835046
##     lp__  -143.9923917 1.28438229 -147.4284683 -144.6003711 -143.6344760
##          stats
## parameter          75%       97.5%
##     b0       1.0846034    1.142480
##     b1       3.0277263    3.050457
##     sigma    0.5192919    0.591551
##     lp__  -143.0595455 -142.523462
## 
## , , chains = chain:2
## 
##          stats
## parameter         mean         sd         2.5%          25%          50%
##     b0       1.0467663 0.04860953    0.9543881    1.0131220    1.0468940
##     b1       3.0178001 0.01599980    2.9886282    3.0064272    3.0173143
##     sigma    0.4910389 0.05262914    0.4006050    0.4540697    0.4899721
##     lp__  -143.9991601 1.36601484 -147.4543730 -144.5897508 -143.6740284
##          stats
## parameter          75%        97.5%
##     b0       1.0773398    1.1438271
##     b1       3.0283606    3.0478673
##     sigma    0.5229438    0.6013051
##     lp__  -143.0507556 -142.5151027
## 
## , , chains = chain:3
## 
##          stats
## parameter         mean         sd         2.5%          25%          50%
##     b0       1.0497458 0.05014097    0.9571373    1.0153348    1.0480027
##     b1       3.0181037 0.01486087    2.9896974    3.0076207    3.0184036
##     sigma    0.4879835 0.04999519    0.3955632    0.4555168    0.4857678
##     lp__  -143.9280711 1.22779789 -146.8053849 -144.4929896 -143.6277472
##          stats
## parameter          75%        97.5%
##     b0       1.0803476    1.1503617
##     b1       3.0276672    3.0465796
##     sigma    0.5195939    0.5972677
##     lp__  -142.9810258 -142.5383894
## 
## , , chains = chain:4
## 
##          stats
## parameter         mean         sd         2.5%          25%         50%
##     b0       1.0489423 0.04922303    0.9552911    1.0157737    1.048428
##     b1       3.0173506 0.01593153    2.9864034    3.0071933    3.017419
##     sigma    0.4888364 0.04837375    0.4060966    0.4562453    0.484854
##     lp__  -143.8798493 1.28859894 -147.1772748 -144.5032436 -143.479043
##          stats
## parameter          75%        97.5%
##     b0       1.0795552    1.1438732
##     b1       3.0276787    3.0476952
##     sigma    0.5157548    0.5962235
##     lp__  -142.9418798 -142.5069584

Compare the results.

Make a table and/or a graph of estimates and confidence intervals obtained by (a) ordinary linear regression and (b) robust regression as we have done here.
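
For example, a sketch using the (hypothetical) slr and robust_fit objects from the sketches above:

# ordinary linear regression: estimates and 95% confidence intervals
cbind(estimate = coef(slr), confint(slr))

# robust regression: posterior medians and 95% credible intervals
post <- rstan::extract(robust_fit)
t(sapply(post[c("b0", "b1", "sigma")],
         quantile, probs = c(0.5, 0.025, 0.975)))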

Problem #2: too many variables

Example data

from Efron, Hastie, Johnstone, & Tibshirani
diabetes                 package:lars                  R Documentation

Blood and other measurements in diabetics

Description:

     The ‘diabetes’ data frame has 442 rows and 3 columns. These are
     the data used in the Efron et al "Least Angle Regression" paper.

Format:

     This data frame contains the following columns:

     x a matrix with 10 columns

     y a numeric vector

     x2 a matrix with 64 columns

The dataset has

  • 442 diabetes patients
  • 10 main variables: age, sex, body mass index, average blood pressure (map), and six blood serum measurements (tc, ldl, hdl, tch, ltg, glu)
  • 45 interactions, e.g. age:ldl
  • 9 quadratic effects, e.g. age^2 (sex is binary, so it gets no quadratic term)
  • a measure of disease progression taken one year later, y

Crossvalidation plan

  1. Put aside 20% of the data for testing.

  2. Refit the model to the remaining 80% (the training data).

  3. Predict the test data; compute \[\begin{aligned} S = \sqrt{\frac{1}{M} \sum_{k=1}^M (\hat y_k - y_k)^2} , \end{aligned}\] where \(M\) is the number of test observations (see the sketch after this list).

  4. Repeat for the other four 20%s.

  5. Compare.
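
In R, the whole plan might look like this sketch (assuming the response and all 64 predictors are in a data frame d, with the response named y; the fitting step is shown with lm() but could be any of the models below):

K <- 5
fold <- sample(rep(1:K, length.out = nrow(d)))   # assign each row to one of 5 folds
S <- numeric(K)
for (k in 1:K) {
    test <- d[fold == k, ]
    training <- d[fold != k, ]
    fit <- lm(y ~ ., data = training)        # refit to the training 80%
    pred <- predict(fit, newdata = test)     # predict the held-out 20%
    S[k] <- sqrt(mean((pred - test$y)^2))    # root-mean-square prediction error
}
S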

Crossvalidation

First let’s split the data into testing and training just once:
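
A minimal version of that split (the data frame name d and the object test_d are illustrative; training_d matches the name used in the lm() call below):

test_rows <- sample.int(nrow(d), size = floor(0.2 * nrow(d)))   # a random 20%
test_d <- d[test_rows, ]
training_d <- d[-test_rows, ]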

Ordinary linear regression

## 
## Call:
## lm(formula = y ~ ., data = training_d)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -116.452  -31.222   -2.942   30.048  115.597 
## 
## Coefficients:
##               Estimate Std. Error t value Pr(>|t|)    
## (Intercept)    150.404      2.925  51.421  < 2e-16 ***
## age            131.031     77.503   1.691 0.092046 .  
## sex           -286.088     77.795  -3.677 0.000284 ***
## bmi            587.855     99.149   5.929 9.22e-09 ***
## map            323.091     84.736   3.813 0.000170 ***
## tc           16796.166  63143.128   0.266 0.790440    
## ldl         -15061.254  55494.787  -0.271 0.786290    
## hdl          -6254.463  23596.136  -0.265 0.791162    
## tch            330.715    316.398   1.045 0.296835    
## ltg          -4856.914  20761.268  -0.234 0.815207    
## glu             19.128     79.886   0.239 0.810945    
## `age^2`        123.345     78.097   1.579 0.115410    
## `bmi^2`        -57.665     91.584  -0.630 0.529454    
## `map^2`       -161.860    102.020  -1.587 0.113777    
## `tc^2`       11301.512   7685.114   1.471 0.142563    
## `ldl^2`       7536.397   5812.171   1.297 0.195848    
## `hdl^2`       2298.167   1803.074   1.275 0.203545    
## `tch^2`        756.037    717.721   1.053 0.293100    
## `ltg^2`       1549.779   1936.210   0.800 0.424167    
## `glu^2`         49.718    107.963   0.461 0.645518    
## `age:sex`      149.665     89.058   1.681 0.093999 .  
## `age:bmi`      -51.026     89.830  -0.568 0.570480    
## `age:map`       73.426     88.382   0.831 0.406825    
## `age:tc`      -147.148    718.847  -0.205 0.837960    
## `age:ldl`     -215.127    576.917  -0.373 0.709521    
## `age:hdl`      349.106    337.121   1.036 0.301331    
## `age:tch`      450.274    258.548   1.742 0.082718 .  
## `age:ltg`       31.703    276.056   0.115 0.908655    
## `age:glu`       65.267     92.163   0.708 0.479443    
## `sex:bmi`      -14.043     91.191  -0.154 0.877726    
## `sex:map`      165.153     89.773   1.840 0.066907 .  
## `sex:tc`       -15.990    685.587  -0.023 0.981410    
## `sex:ldl`      125.005    550.648   0.227 0.820583    
## `sex:hdl`     -108.670    328.604  -0.331 0.741123    
## `sex:tch`     -314.485    253.384  -1.241 0.215623    
## `sex:ltg`       97.473    272.504   0.358 0.720849    
## `sex:glu`       24.914     83.625   0.298 0.765983    
## `bmi:map`      326.318    101.658   3.210 0.001487 ** 
## `bmi:tc`        21.507    738.126   0.029 0.976776    
## `bmi:ldl`       92.412    616.938   0.150 0.881040    
## `bmi:hdl`     -101.632    364.919  -0.279 0.780836    
## `bmi:tch`     -162.323    250.337  -0.648 0.517262    
## `bmi:ltg`       70.844    283.351   0.250 0.802760    
## `bmi:glu`       11.216    102.970   0.109 0.913341    
## `map:tc`      1003.122    781.272   1.284 0.200248    
## `map:ldl`     -916.970    661.618  -1.386 0.166897    
## `map:hdl`     -303.164    358.515  -0.846 0.398514    
## `map:tch`       83.430    240.818   0.346 0.729277    
## `map:ltg`     -455.173    322.944  -1.409 0.159845    
## `map:glu`      -65.997    112.875  -0.585 0.559242    
## `tc:ldl`    -17718.758  12810.745  -1.383 0.167763    
## `tc:hdl`     -6298.383   4188.098  -1.504 0.133773    
## `tc:tch`     -2574.374   1995.618  -1.290 0.198142    
## `tc:ltg`    -10009.299  13661.300  -0.733 0.464387    
## `tc:glu`      -497.859    705.502  -0.706 0.480991    
## `ldl:hdl`     5004.483   3442.436   1.454 0.147165    
## `ldl:tch`     1731.892   1615.223   1.072 0.284567    
## `ldl:ltg`     8124.594  11369.412   0.715 0.475468    
## `ldl:glu`      265.993    594.756   0.447 0.655064    
## `hdl:tch`     1076.355   1217.662   0.884 0.377503    
## `hdl:ltg`     3597.256   4794.265   0.750 0.453708    
## `hdl:glu`      560.737    366.606   1.530 0.127294    
## `tch:ltg`      212.466    727.798   0.292 0.770563    
## `tch:glu`      596.191    278.394   2.142 0.033119 *  
## `ltg:glu`      172.802    328.491   0.526 0.599283    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 51.82 on 272 degrees of freedom
## Multiple R-squared:  0.6421, Adjusted R-squared:  0.5579 
## F-statistic: 7.625 on 64 and 272 DF,  p-value: < 2.2e-16

With ordinary linear regression, we got a root-mean-square prediction error of 64.60 on the test data, compared to a root-mean-square error of 46.56 on the training data.
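
Those two numbers come directly from the fit; a sketch, with the lm fit stored in a (hypothetical) object ols_fit:

# prediction error on the held-out test data
sqrt(mean((predict(ols_fit, newdata = test_d) - test_d$y)^2))
# error on the training data
sqrt(mean(resid(ols_fit)^2))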

This suggests there’s some overfitting going on.

plot of chunk plot_ols

A sparsifying prior

We have a lot of predictors: 64 of them. A good guess is that only a few are really useful. So, we can put a sparsifying prior on the coefficients, i.e., the \(\beta\)s in \[\begin{aligned} y = \beta_0 + \beta_1 x_1 + \cdots + \beta_n x_n + \epsilon . \end{aligned}\]

Sparseness and scale mixtures

Encouraging sparseness

Suppose we do regression with a large number of predictor variables.

The resulting coefficients are sparse if most are zero.

The idea is to “encourage” all the coefficients to be zero, unless they really want to be nonzero, in which case we let them be whatever they want.

This tends to discourage overfitting.

To do this, we want a prior which is strongly peaked at zero but flat away from zero (“spike-and-slab”).

Compare the Normal

\[\begin{aligned} X \sim \Normal(0,1) \end{aligned}\]

to the “exponential scale mixture of Normals”,

\[\begin{aligned} X &\sim \Normal(0,\sigma) \\ \sigma &\sim \Exp(1) . \end{aligned}\]

plot of chunk scale_mixtures
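
A quick simulation shows the difference (a sketch; compare the middle and the tails):

n <- 1e5
sigma <- rexp(n, rate = 1)                 # random scale
x_mix <- rnorm(n, mean = 0, sd = sigma)    # exponential scale mixture of Normals
x_norm <- rnorm(n)                         # plain standard Normal

# the mixture has more mass very near zero and heavier tails
quantile(abs(x_mix),  c(0.5, 0.99))
quantile(abs(x_norm), c(0.5, 0.99))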

Why use a scale mixture?

  1. Lets the data choose the appropriate scale of variation.

  2. Weakly encourages \(\sigma\) to be small, so that as much variation as possible is explained by signal rather than noise.

  3. Gets you a prior that is more peaked at zero and flatter otherwise.

Implementation

Note that

\[\begin{aligned} \beta &\sim \Normal(0,\sigma) \\ \sigma &\sim \Exp(1) . \end{aligned}\]

is equivalent to

\[\begin{aligned} \beta &= \sigma \gamma \\ \gamma &\sim \Normal(0,1) \\ \sigma &\sim \Exp(1) . \end{aligned}\]

parameters {
    real beta;
    real<lower=0> sigma;
}
model {
    beta ~ normal(0, sigma);
}

is equivalent to

parameters {
    real gamma;
    real<lower=0> sigma;
}
transformed parameters {
    real beta;
    beta = gamma * sigma;
}
model {
    gamma ~ normal(0, 1);   // equivalent: beta = gamma * sigma ~ normal(0, sigma)
}

The second version is better for Stan.

Why is it better?

parameters {
    real beta;
    real<lower=0> sigma;
}
model {
    beta ~ normal(0, sigma);
}

In the first, the posterior scale of beta is proportional to sigma, so the optimal step size for the sampler depends on sigma: no single step size works well across both small and large values of sigma. In the second, the sampled parameters gamma and sigma have scales that don't depend on each other.

plot of chunk sigma_phase

A strongly sparsifying prior

The “horseshoe”:

\[\begin{aligned} \beta_j &\sim \Normal(0, \lambda_j) \\ \lambda_j &\sim \Cauchy(0, \tau) \\ \tau &\sim \Unif(0, 1) \end{aligned}\]

parameters {
    vector[p] d_beta;             // standard Normal components of beta
    vector[p] d_lambda;           // per-coefficient ("local") scales
    real<lower=0, upper=1> tau;   // global scale
}
transformed parameters {
    vector[p] beta;
    // beta[j] = d_beta[j] * lambda[j], with lambda[j] = tau * d_lambda[j] ~ Cauchy(0, tau)
    beta = d_beta .* d_lambda * tau;
}
model {
    d_beta ~ normal(0, 1);
    d_lambda ~ cauchy(0, 1);
    // tau is implicitly uniform(0, 1), thanks to its declared bounds
}

The Cauchy as a scale mixture

Inverse square-root gamma scaling

It turns out that if

\[\begin{aligned} \beta &\sim \Normal(0, 1/\sqrt{\lambda}) \\ \lambda &\sim \Gam(1/2, 1/2) \end{aligned}\]

then

\[\begin{aligned} \beta &\sim \Cauchy(0, 1). \end{aligned}\]

What black magic is this??

  1. It says so here.

  2. If you like to do integrals, you can check mathematically.

  3. Or, you can check with simulation.
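
For instance, a simulation sketch:

n <- 1e5
lambda <- rgamma(n, shape = 1/2, rate = 1/2)      # lambda ~ Gamma(1/2, 1/2)
beta <- rnorm(n, mean = 0, sd = 1/sqrt(lambda))   # beta ~ Normal(0, 1/sqrt(lambda))

# the quantiles should line up with a standard Cauchy (middle of the range shown)
qqplot(beta, rcauchy(n), xlim = c(-20, 20), ylim = c(-20, 20))
abline(0, 1, col = "red")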

Using the horseshoe

What’s an appropriate noise distribution?

plot of chunk show_y

Aside: quantile-quantile plots

The idea is to plot the quantiles of each distribution against each other.

If these are datasets, this means just plotting their sorted values against each other.

plot of chunk qq
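
For two samples of the same size, that's one line of base R (a sketch, with made-up data):

a <- rnorm(500)        # one dataset
b <- rt(500, df = 3)   # another, with heavier tails
plot(sort(a), sort(b), xlab = "quantiles of a", ylab = "quantiles of b")
abline(0, 1)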

Regression with a horseshoe prior

Uses a reparameterization of the Cauchy as a scale mixture of normals.
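
A minimal sketch of what that can look like, with the model written as an R string for rstan (the variable names and the data-passing step are illustrative, not necessarily the model fit below): writing \(\lambda_j = \gamma_j / \sqrt{g_j}\) with \(\gamma_j \sim \Normal(0,1)\) and \(g_j \sim \Gam(1/2,1/2)\) makes \(\lambda_j \sim \Cauchy(0,1)\), so no cauchy() statement is needed.

horseshoe_block <- "
data {
    int N;
    int p;
    matrix[N, p] x;
    vector[N] y;
}
parameters {
    real b0;
    real<lower=0> sigma;
    vector[p] d_beta;             // standard Normal part of beta
    vector[p] d_gamma;            // Normal part of the Cauchy scale mixture
    vector<lower=0>[p] d_g;       // Gamma(1/2, 1/2) part of the mixture
    real<lower=0, upper=1> tau;   // global scale
}
transformed parameters {
    vector[p] beta;
    // lambda[j] = d_gamma[j] / sqrt(d_g[j]) is Cauchy(0, 1), so
    // beta[j] = tau * lambda[j] * d_beta[j] gets the horseshoe prior above
    beta = d_beta .* d_gamma .* inv_sqrt(d_g) * tau;
}
model {
    d_beta ~ normal(0, 1);
    d_gamma ~ normal(0, 1);
    d_g ~ gamma(0.5, 0.5);
    y ~ normal(b0 + x * beta, sigma);
}
"
# hs_fit <- rstan::stan(model_code = horseshoe_block,
#                       data = list(N = nrow(x), p = ncol(x), x = x, y = y))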

Note the data have already been normalized, with the exception of \(y\):

##        y              age                  sex           
##  Min.   : 25.0   Min.   :-0.1072256   Min.   :-0.044642  
##  1st Qu.: 88.0   1st Qu.:-0.0345749   1st Qu.:-0.044642  
##  Median :142.0   Median : 0.0053831   Median :-0.044642  
##  Mean   :154.3   Mean   : 0.0000151   Mean   : 0.000615  
##  3rd Qu.:215.0   3rd Qu.: 0.0380759   3rd Qu.: 0.050680  
##  Max.   :346.0   Max.   : 0.1107267   Max.   : 0.050680  
##       bmi                 map                   tc           
##  Min.   :-0.090275   Min.   :-0.1123996   Min.   :-0.126781  
##  1st Qu.:-0.032073   1st Qu.:-0.0366565   1st Qu.:-0.033216  
##  Median :-0.005128   Median :-0.0056706   Median :-0.000193  
##  Mean   : 0.002986   Mean   :-0.0002015   Mean   : 0.001795  
##  3rd Qu.: 0.035829   3rd Qu.: 0.0333486   3rd Qu.: 0.032830  
##  Max.   : 0.170555   Max.   : 0.1079441   Max.   : 0.153914  
##       ldl                 hdl                 tch           
##  Min.   :-0.115613   Min.   :-0.102307   Min.   :-0.076395  
##  1st Qu.:-0.028558   1st Qu.:-0.032356   1st Qu.:-0.039493  
##  Median :-0.002880   Median :-0.006584   Median :-0.002592  
##  Mean   : 0.002058   Mean   :-0.002635   Mean   : 0.002334  
##  3rd Qu.: 0.034698   3rd Qu.: 0.026550   3rd Qu.: 0.034309  
##  Max.   : 0.155887   Max.   : 0.177497   Max.   : 0.155345  
##       ltg                  glu                 age^2         
##  Min.   :-0.1043648   Min.   :-0.1377672   Min.   :-0.04130  
##  1st Qu.:-0.0307512   1st Qu.:-0.0300725   1st Qu.:-0.03651  
##  Median : 0.0002715   Median :-0.0010777   Median :-0.01950  
##  Mean   : 0.0029586   Mean   : 0.0004587   Mean   :-0.00153  
##  3rd Qu.: 0.0336568   3rd Qu.: 0.0279170   3rd Qu.: 0.01646  
##  Max.   : 0.1335990   Max.   : 0.1356118   Max.   : 0.18276  
##      bmi^2               map^2                tc^2           
##  Min.   :-0.032976   Min.   :-0.039369   Min.   :-0.0319463  
##  1st Qu.:-0.029542   1st Qu.:-0.034253   1st Qu.:-0.0291980  
##  Median :-0.016958   Median :-0.018662   Median :-0.0163678  
##  Mean   : 0.001593   Mean   :-0.001993   Mean   : 0.0005684  
##  3rd Qu.: 0.015956   3rd Qu.: 0.015789   3rd Qu.: 0.0090080  
##  Max.   : 0.391017   Max.   : 0.180473   Max.   : 0.3025598  
##      ldl^2                hdl^2               tch^2           
##  Min.   :-0.0296059   Min.   :-0.027654   Min.   :-0.0305374  
##  1st Qu.:-0.0261629   1st Qu.:-0.025283   1st Qu.:-0.0301802  
##  Median :-0.0178284   Median :-0.014861   Median :-0.0094855  
##  Mean   :-0.0003397   Mean   :-0.002452   Mean   :-0.0006151  
##  3rd Qu.: 0.0073940   3rd Qu.: 0.005809   3rd Qu.:-0.0094855  
##  Max.   : 0.2883956   Max.   : 0.357531   Max.   : 0.2952011  
##      ltg^2                glu^2               age:sex         
##  Min.   :-0.0349354   Min.   :-0.0319022   Min.   :-0.120695  
##  1st Qu.:-0.0304543   1st Qu.:-0.0293459   1st Qu.:-0.030888  
##  Median :-0.0191324   Median :-0.0191600   Median : 0.001365  
##  Mean   :-0.0007457   Mean   :-0.0001022   Mean   : 0.001096  
##  3rd Qu.: 0.0113077   3rd Qu.: 0.0106420   3rd Qu.: 0.036802  
##  Max.   : 0.2406822   Max.   : 0.2358489   Max.   : 0.111614  
##     age:bmi              age:map              age:tc         
##  Min.   :-0.1748118   Min.   :-0.166429   Min.   :-0.136218  
##  1st Qu.:-0.0192374   1st Qu.:-0.021593   1st Qu.:-0.020113  
##  Median :-0.0065030   Median :-0.007572   Median :-0.008896  
##  Mean   : 0.0005207   Mean   :-0.001270   Mean   :-0.001146  
##  3rd Qu.: 0.0184220   3rd Qu.: 0.022763   3rd Qu.: 0.017210  
##  Max.   : 0.1702584   Max.   : 0.145881   Max.   : 0.199426  
##     age:ldl             age:hdl             age:tch          
##  Min.   :-0.155477   Min.   :-0.130730   Min.   :-0.2012887  
##  1st Qu.:-0.019358   1st Qu.:-0.022842   1st Qu.:-0.0153943  
##  Median :-0.006878   Median : 0.001632   Median :-0.0080380  
##  Mean   :-0.001112   Mean   :-0.000360   Mean   :-0.0008048  
##  3rd Qu.: 0.014055   3rd Qu.: 0.018063   3rd Qu.: 0.0196049  
##  Max.   : 0.206119   Max.   : 0.206328   Max.   : 0.1672338  
##     age:ltg              age:glu             sex:bmi          
##  Min.   :-0.1600509   Min.   :-0.119778   Min.   :-0.1565731  
##  1st Qu.:-0.0214797   1st Qu.:-0.022366   1st Qu.:-0.0317632  
##  Median :-0.0083190   Median :-0.010909   Median :-0.0016333  
##  Mean   : 0.0001587   Mean   :-0.001671   Mean   : 0.0004026  
##  3rd Qu.: 0.0172821   3rd Qu.: 0.012698   3rd Qu.: 0.0343105  
##  Max.   : 0.1795324   Max.   : 0.184369   Max.   : 0.1791461  
##     sex:map              sex:tc             sex:ldl          
##  Min.   :-0.113623   Min.   :-0.136130   Min.   :-1.297e-01  
##  1st Qu.:-0.030841   1st Qu.:-0.029601   1st Qu.:-3.037e-02  
##  Median : 0.007036   Median :-0.001490   Median : 2.988e-03  
##  Mean   : 0.001291   Mean   : 0.000314   Mean   : 4.536e-05  
##  3rd Qu.: 0.037201   3rd Qu.: 0.033216   3rd Qu.: 3.224e-02  
##  Max.   : 0.107407   Max.   : 0.161566   Max.   : 1.602e-01  
##     sex:hdl              sex:tch             sex:ltg         
##  Min.   :-1.622e-01   Min.   :-0.159990   Min.   :-0.133887  
##  1st Qu.:-3.080e-02   1st Qu.:-0.019548   1st Qu.:-0.027870  
##  Median : 2.594e-05   Median : 0.013582   Median : 0.008752  
##  Mean   :-7.790e-04   Mean   : 0.001308   Mean   : 0.002488  
##  3rd Qu.: 3.021e-02   3rd Qu.: 0.022402   3rd Qu.: 0.032245  
##  Max.   : 1.748e-01   Max.   : 0.157698   Max.   : 0.136614  
##     sex:glu             bmi:map               bmi:tc          
##  Min.   :-0.136467   Min.   :-0.1672668   Min.   :-0.2932071  
##  1st Qu.:-0.029386   1st Qu.:-0.0219508   1st Qu.:-0.0193874  
##  Median :-0.002274   Median :-0.0104349   Median :-0.0085277  
##  Mean   : 0.001033   Mean   : 0.0003255   Mean   : 0.0005746  
##  3rd Qu.: 0.033874   3rd Qu.: 0.0168374   3rd Qu.: 0.0179394  
##  Max.   : 0.137802   Max.   : 0.2284830   Max.   : 0.2269828  
##     bmi:ldl             bmi:hdl             bmi:tch          
##  Min.   :-0.286577   Min.   :-0.268316   Min.   :-0.1318179  
##  1st Qu.:-0.022307   1st Qu.:-0.015669   1st Qu.:-0.0237279  
##  Median :-0.008503   Median : 0.012965   Median :-0.0170881  
##  Mean   :-0.000038   Mean   : 0.001334   Mean   : 0.0003022  
##  3rd Qu.: 0.013517   3rd Qu.: 0.025325   3rd Qu.: 0.0203124  
##  Max.   : 0.232069   Max.   : 0.146477   Max.   : 0.2764597  
##     bmi:ltg              bmi:glu               map:tc         
##  Min.   :-1.851e-01   Min.   :-0.1543918   Min.   :-0.203608  
##  1st Qu.:-2.464e-02   1st Qu.:-0.0240990   1st Qu.:-0.021147  
##  Median :-1.414e-02   Median :-0.0144630   Median :-0.008035  
##  Mean   : 8.025e-05   Mean   : 0.0001888   Mean   :-0.001411  
##  3rd Qu.: 2.215e-02   3rd Qu.: 0.0170453   3rd Qu.: 0.019085  
##  Max.   : 2.189e-01   Max.   : 0.2246097   Max.   : 0.189747  
##     map:ldl             map:hdl             map:tch         
##  Min.   :-0.203862   Min.   :-0.217314   Min.   :-0.185154  
##  1st Qu.:-0.020688   1st Qu.:-0.018023   1st Qu.:-0.017297  
##  Median :-0.006742   Median : 0.005007   Median :-0.010037  
##  Mean   :-0.002099   Mean   : 0.001038   Mean   :-0.000926  
##  3rd Qu.: 0.017189   3rd Qu.: 0.018608   3rd Qu.: 0.021630  
##  Max.   : 0.191608   Max.   : 0.389867   Max.   : 0.195708  
##     map:ltg              map:glu              tc:ldl          
##  Min.   :-1.459e-01   Min.   :-0.143393   Min.   :-0.0476094  
##  1st Qu.:-2.391e-02   1st Qu.:-0.023048   1st Qu.:-0.0273934  
##  Median :-1.015e-02   Median :-0.011865   Median :-0.0167630  
##  Mean   :-1.827e-05   Mean   :-0.001108   Mean   : 0.0000648  
##  3rd Qu.: 2.148e-02   3rd Qu.: 0.009924   3rd Qu.: 0.0086523  
##  Max.   : 1.655e-01   Max.   : 0.279359   Max.   : 0.3086895  
##      tc:hdl               tc:tch               tc:ltg          
##  Min.   :-0.1281815   Min.   :-0.1223554   Min.   :-0.1048969  
##  1st Qu.:-0.0177190   1st Qu.:-0.0219653   1st Qu.:-0.0258567  
##  Median :-0.0028607   Median :-0.0164289   Median :-0.0144205  
##  Mean   : 0.0006946   Mean   :-0.0004093   Mean   : 0.0009462  
##  3rd Qu.: 0.0134764   3rd Qu.: 0.0109094   3rd Qu.: 0.0169747  
##  Max.   : 0.2076833   Max.   : 0.2288045   Max.   : 0.2231252  
##      tc:glu             ldl:hdl             ldl:tch          
##  Min.   :-0.100555   Min.   :-0.253886   Min.   :-0.1115079  
##  1st Qu.:-0.021634   1st Qu.:-0.013653   1st Qu.:-0.0236124  
##  Median :-0.012260   Median : 0.006305   Median :-0.0140355  
##  Mean   :-0.001407   Mean   : 0.001842   Mean   :-0.0008025  
##  3rd Qu.: 0.011721   3rd Qu.: 0.019763   3rd Qu.: 0.0102002  
##  Max.   : 0.237370   Max.   : 0.160773   Max.   : 0.1998614  
##     ldl:ltg              ldl:glu             hdl:tch         
##  Min.   :-1.826e-01   Min.   :-0.151734   Min.   :-0.234890  
##  1st Qu.:-2.081e-02   1st Qu.:-0.022529   1st Qu.:-0.010455  
##  Median :-8.185e-03   Median :-0.009924   Median : 0.017148  
##  Mean   : 4.830e-06   Mean   :-0.001870   Mean   : 0.001743  
##  3rd Qu.: 1.902e-02   3rd Qu.: 0.011544   3rd Qu.: 0.031343  
##  Max.   : 2.034e-01   Max.   : 0.227999   Max.   : 0.080445  
##     hdl:ltg             hdl:glu              tch:ltg         
##  Min.   :-0.254685   Min.   :-0.2232546   Min.   :-0.160745  
##  1st Qu.:-0.015445   1st Qu.:-0.0138933   1st Qu.:-0.027375  
##  Median : 0.012004   Median : 0.0086822   Median :-0.014089  
##  Mean   : 0.002202   Mean   : 0.0003819   Mean   :-0.001075  
##  3rd Qu.: 0.022550   3rd Qu.: 0.0215139   3rd Qu.: 0.012681  
##  Max.   : 0.163067   Max.   : 0.1725039   Max.   : 0.375845  
##     tch:glu              ltg:glu          
##  Min.   :-0.1176911   Min.   :-0.0887652  
##  1st Qu.:-0.0204024   1st Qu.:-0.0234429  
##  Median :-0.0155472   Median :-0.0130175  
##  Mean   :-0.0006656   Mean   : 0.0002682  
##  3rd Qu.: 0.0176564   3rd Qu.: 0.0138425  
##  Max.   : 0.3181041   Max.   : 0.3381838
## Warning: There were 9 divergent transitions after warmup. Increasing adapt_delta above 0.999 may help. See
## http://mc-stan.org/misc/warnings.html#divergent-transitions-after-warmup
## Warning: Examine the pairs() plot to diagnose sampling problems
## $summary
##                  mean      se_mean         sd         2.5%           25%
## b0        0.096800745 0.0007377281 0.03299221  0.031390174  0.0752586406
## sigma     0.604787108 0.0005570903 0.02491383  0.558250917  0.5872577691
## beta[1]   0.148057745 0.0109305497 0.41805937 -0.495635089 -0.0471093191
## beta[2]  -1.204385646 0.0345979818 0.98862695 -3.222027862 -1.9429444764
## beta[3]   7.350051339 0.0278128675 0.93474095  5.584185129  6.7157763171
## beta[4]   2.277303479 0.0423373558 1.15843418 -0.006683978  1.5352704427
## beta[5]  -0.513431566 0.0269555581 0.90868693 -2.901725648 -0.8055820020
## beta[6]  -0.121777929 0.0175443428 0.64054242 -1.596394583 -0.2652538749
## beta[7]  -1.103833945 0.0437622103 1.18226429 -3.820279964 -1.9136968720
## beta[8]   0.403766066 0.0264164146 0.91425363 -0.695986725 -0.0388721449
## beta[9]   6.233609645 0.0258510753 1.06610839  4.135042677  5.4998642175
## beta[10]  0.103711901 0.0113641703 0.39802961 -0.588260927 -0.0658665380
## beta[11]  0.367509644 0.0156857828 0.59610529 -0.378129452 -0.0017942749
## beta[12]  0.001984490 0.0121072660 0.39711273 -0.887628567 -0.1350136052
## beta[13] -0.181940927 0.0131277693 0.49545925 -1.608175689 -0.2860604088
## beta[14] -0.040084678 0.0142996674 0.49783406 -1.293891620 -0.1691176191
## beta[15] -0.122228457 0.0154341999 0.53788348 -1.443505820 -0.2602545503
## beta[16] -0.052335111 0.0169673578 0.53698638 -1.322121648 -0.1718361918
## beta[17]  0.411706077 0.0251864935 0.85767494 -0.601311783 -0.0305926867
## beta[18] -0.191205994 0.0134165129 0.50235797 -1.515284030 -0.3322063927
## beta[19]  0.253340032 0.0140910943 0.51135941 -0.458717777 -0.0241955844
## beta[20]  1.501949075 0.0273019943 0.98412280 -0.050165848  0.7390433187
## beta[21]  0.146875761 0.0114853253 0.43243965 -0.568196688 -0.0477025255
## beta[22]  0.413263977 0.0155014383 0.61056288 -0.320046382  0.0008245203
## beta[23] -0.065179497 0.0162340684 0.50594530 -1.213694542 -0.1936016029
## beta[24] -0.220367603 0.0149009118 0.55336005 -1.693395563 -0.3474773073
## beta[25]  0.134296902 0.0158579709 0.49794682 -0.670275192 -0.0717375598
## beta[26]  0.040186360 0.0151931244 0.52433815 -0.915157458 -0.1415451888
## beta[27]  0.434596874 0.0183564106 0.70255957 -0.353124960 -0.0074949738
## beta[28]  0.263892746 0.0138721854 0.53338117 -0.448253336 -0.0204744858
## beta[29]  0.067378889 0.0105923209 0.39503427 -0.721663625 -0.0801207739
## beta[30]  0.444734542 0.0165891506 0.66647791 -0.340316319 -0.0002432803
## beta[31] -0.099177806 0.0136434371 0.46901285 -1.243031042 -0.2564471319
## beta[32] -0.182373498 0.0136708855 0.51892805 -1.559542747 -0.3128892214
## beta[33]  0.131832611 0.0121984753 0.43653538 -0.616826932 -0.0567908880
## beta[34] -0.138428391 0.0137140764 0.50427947 -1.456616191 -0.2733217779
## beta[35]  0.010278834 0.0117435899 0.40540415 -0.813716792 -0.1387939885
## beta[36]  0.067932904 0.0106321280 0.37893229 -0.666240995 -0.0867538553
## beta[37]  1.685579427 0.0297282807 1.07141740 -0.064677669  0.8395771235
## beta[38] -0.021351244 0.0116995739 0.42473452 -0.959904519 -0.1462526653
## beta[39] -0.037146326 0.0129950160 0.41757569 -1.013351236 -0.1514916025
## beta[40] -0.019595910 0.0131165807 0.46542959 -1.070985581 -0.1713730456
## beta[41]  0.106274175 0.0139569995 0.49472332 -0.765726254 -0.0945270510
## beta[42]  0.115106253 0.0128018687 0.45648034 -0.726731733 -0.0682009663
## beta[43]  0.087326880 0.0113271341 0.39662659 -0.691810592 -0.0734540022
## beta[44] -0.024818917 0.0108920146 0.39290099 -0.989385713 -0.1387048587
## beta[45] -0.053951606 0.0116731363 0.41097421 -1.034393415 -0.1807392216
## beta[46]  0.181412548 0.0132146618 0.50308674 -0.507646511 -0.0572035075
## beta[47] -0.168847411 0.0139043202 0.50524901 -1.584479382 -0.3082805076
## beta[48]  0.030921049 0.0124374895 0.44037088 -0.932989982 -0.1302554820
## beta[49] -0.085249214 0.0125876062 0.43595250 -1.159881317 -0.2069066339
## beta[50]  0.012144693 0.0146928229 0.50207431 -1.057582894 -0.1475289241
## beta[51]  0.161262994 0.0151984694 0.54397108 -0.646935081 -0.0627006090
## beta[52] -0.283173868 0.0266882100 0.85466081 -2.719617102 -0.3888031142
## beta[53] -0.096545864 0.0150783510 0.52487838 -1.449141128 -0.2178617656
## beta[54]  0.192424623 0.0164115603 0.56626549 -0.733332957 -0.0452525746
## beta[55]  0.004703693 0.0139243482 0.49267419 -1.105409972 -0.1335667006
## beta[56]  0.160798800 0.0221464484 0.71542572 -0.989116068 -0.0646225923
## beta[57]  0.240024963 0.0171541011 0.59712377 -0.585365509 -0.0437433551
## beta[58]  0.296298625 0.0177218461 0.61143468 -0.474826327 -0.0265672831
## beta[59] -0.174561340 0.0182424229 0.60911994 -1.820805684 -0.2902358278
## beta[60]  0.358482879 0.0160736686 0.61774534 -0.399340993 -0.0081089410
## beta[61]  0.137998478 0.0166204573 0.56127480 -0.788739862 -0.0749583847
## beta[62] -0.353522241 0.0211709625 0.74083703 -2.437839055 -0.4824600899
## beta[63]  0.457375386 0.0243986751 0.81962720 -0.473226102 -0.0108862796
## beta[64]  0.053008482 0.0137594696 0.45296975 -0.874008732 -0.1047452558
##                    50%          75%     97.5%     n_eff      Rhat
## b0        9.650898e-02  0.119195144 0.1602656 2000.0000 0.9989130
## sigma     6.042950e-01  0.621087362 0.6534551 2000.0000 1.0015059
## beta[1]   3.134983e-02  0.273859073 1.2800457 1462.8235 0.9996546
## beta[2]  -1.163392e+00 -0.287647608 0.1479235  816.5135 1.0053306
## beta[3]   7.352664e+00  7.983749393 9.1787197 1129.5124 1.0040270
## beta[4]   2.340058e+00  3.104664136 4.4102073  748.6783 1.0069451
## beta[5]  -1.725640e-01  0.006100276 0.5269065 1136.4018 1.0008742
## beta[6]  -3.204547e-02  0.072503731 0.9395877 1332.9737 0.9993283
## beta[7]  -8.105861e-01 -0.069179772 0.2841098  729.8450 1.0038576
## beta[8]   7.447353e-02  0.534730711 3.0826922 1197.8037 0.9993356
## beta[9]   6.219964e+00  6.960795008 8.2320850 1700.7696 1.0014602
## beta[10]  2.629784e-02  0.224387227 1.0765856 1226.7479 1.0011508
## beta[11]  1.452309e-01  0.616118365 1.9422449 1444.2207 0.9987717
## beta[12]  1.087336e-03  0.133924232 0.8733495 1075.8097 1.0023349
## beta[13] -3.779729e-02  0.051601975 0.5445642 1424.4067 1.0034937
## beta[14] -6.476186e-03  0.120986507 0.9596036 1212.0411 1.0004960
## beta[15] -2.843583e-02  0.083924146 0.8281380 1214.5298 1.0000143
## beta[16] -3.406762e-03  0.123044577 0.9741382 1001.6087 1.0017051
## beta[17]  9.171008e-02  0.612252362 2.8556339 1159.6048 1.0025822
## beta[18] -4.614337e-02  0.042787442 0.5230953 1401.9977 1.0040739
## beta[19]  9.219333e-02  0.445871101 1.6099054 1316.9311 1.0000307
## beta[20]  1.485492e+00  2.234409378 3.4358149 1299.3012 0.9995558
## beta[21]  4.087306e-02  0.289595077 1.2545295 1417.6349 1.0006187
## beta[22]  1.887150e-01  0.714947719 2.0820662 1551.3750 0.9989734
## beta[23] -8.366392e-03  0.097026483 0.8334770  971.2978 1.0020796
## beta[24] -5.514247e-02  0.036864042 0.5088934 1379.0815 1.0028113
## beta[25]  2.872789e-02  0.239378652 1.4446361  985.9859 1.0067042
## beta[26] -8.631173e-05  0.158585436 1.2165624 1191.0465 1.0011136
## beta[27]  1.630007e-01  0.687886233 2.3323283 1464.8420 1.0024640
## beta[28]  9.647147e-02  0.427911911 1.7129741 1478.3783 0.9988235
## beta[29]  1.072545e-02  0.168685272 1.0628734 1390.8724 0.9991848
## beta[30]  1.862698e-01  0.760926082 2.2079361 1614.0732 1.0004663
## beta[31] -2.286213e-02  0.081146382 0.7771172 1181.7399 0.9999779
## beta[32] -4.283305e-02  0.049705924 0.5286878 1440.8574 0.9998730
## beta[33]  3.265539e-02  0.260274667 1.3284102 1280.6422 1.0008032
## beta[34] -2.718756e-02  0.075372541 0.7106186 1352.1020 0.9998384
## beta[35]  2.742249e-03  0.142047344 1.0014172 1191.7211 1.0005057
## beta[36]  1.299972e-02  0.187374643 1.0668694 1270.2314 1.0000822
## beta[37]  1.726574e+00  2.409029716 3.8307750 1298.9063 1.0005679
## beta[38] -7.607924e-03  0.115931255 0.9259206 1317.9382 1.0032408
## beta[39] -4.191319e-03  0.118217399 0.7783973 1032.5635 0.9998153
## beta[40] -2.638830e-03  0.137968722 0.9092323 1259.1188 0.9996931
## beta[41]  1.031600e-02  0.209939740 1.4856652 1256.4368 0.9993350
## beta[42]  2.583178e-02  0.243750560 1.2698513 1271.4446 0.9997268
## beta[43]  1.877351e-02  0.216946070 1.1038814 1226.0934 1.0005459
## beta[44] -5.555789e-04  0.122203549 0.7687124 1301.2173 0.9992789
## beta[45] -7.937788e-03  0.093216662 0.7493574 1239.5214 0.9997529
## beta[46]  4.220337e-02  0.297640899 1.6459243 1449.3512 1.0006258
## beta[47] -4.822399e-02  0.058969288 0.6748588 1320.4179 1.0011824
## beta[48]  5.947299e-03  0.183195867 1.0929164 1253.6368 1.0000397
## beta[49] -1.507508e-02  0.078328658 0.7439193 1199.4774 1.0020309
## beta[50]  3.813026e-04  0.151179088 1.1751295 1167.6845 0.9990124
## beta[51]  3.418521e-02  0.286090071 1.5798943 1281.0082 1.0031372
## beta[52] -3.638296e-02  0.070640687 0.7710908 1025.5307 1.0051598
## beta[53] -1.667603e-02  0.084334764 0.8696586 1211.7406 1.0003921
## beta[54]  4.381272e-02  0.339458843 1.7786208 1190.5303 1.0032486
## beta[55]  1.787067e-03  0.158021814 0.9713906 1251.9007 1.0019401
## beta[56]  3.041852e-02  0.249548457 1.9181104 1043.5684 1.0017113
## beta[57]  5.797979e-02  0.379843638 1.9615755 1211.6936 1.0014872
## beta[58]  8.023542e-02  0.459054373 2.0266783 1190.3707 1.0000956
## beta[59] -2.763325e-02  0.061804646 0.6862684 1114.9120 1.0003219
## beta[60]  1.280235e-01  0.572198633 2.0368150 1477.0287 1.0005896
## beta[61]  2.532298e-02  0.231892281 1.5782635 1140.4208 1.0001674
## beta[62] -7.529499e-02  0.021590673 0.4673873 1224.5152 1.0006860
## beta[63]  1.456097e-01  0.709721936 2.6199230 1128.4966 0.9993906
## beta[64]  9.284657e-03  0.180310526 1.1213950 1083.7647 1.0037735
## 
## $c_summary
## , , chains = chain:1
## 
##           stats
## parameter          mean         sd        2.5%          25%          50%
##   b0        0.096043255 0.03370987  0.02704341  0.071600437  0.094657779
##   sigma     0.604995610 0.02431772  0.56137593  0.587244673  0.604577602
##   beta[1]   0.145789362 0.38379211 -0.44378226 -0.044905826  0.025365573
##   beta[2]  -1.213246860 1.02416541 -3.20583848 -1.972631618 -1.144522454
##   beta[3]   7.383114557 0.89067475  5.68530972  6.837834951  7.385963400
##   beta[4]   2.299721401 1.11963428  0.03363081  1.579095144  2.271788133
##   beta[5]  -0.580647928 1.00231965 -3.23420358 -0.841057690 -0.215168288
##   beta[6]  -0.098255415 0.68015749 -1.48381404 -0.279858548 -0.023253103
##   beta[7]  -1.092493810 1.20086939 -4.03484086 -1.959633416 -0.762059622
##   beta[8]   0.407550528 0.94386794 -0.72007422 -0.048295192  0.068958820
##   beta[9]   6.250616833 1.08760811  4.06526228  5.497082285  6.318552138
##   beta[10]  0.087649196 0.43497841 -0.75270211 -0.093529399  0.020623200
##   beta[11]  0.374764001 0.60392956 -0.35286356 -0.008540096  0.145230932
##   beta[12]  0.015217682 0.40311001 -0.78426696 -0.116007899  0.001208401
##   beta[13] -0.162735515 0.52366082 -1.58524437 -0.235723737 -0.029633699
##   beta[14] -0.057855841 0.50637843 -1.25387506 -0.189734678 -0.011722606
##   beta[15] -0.092815115 0.48983139 -1.33523878 -0.222322009 -0.033642443
##   beta[16] -0.047676432 0.56945858 -1.44617051 -0.195068933 -0.004425763
##   beta[17]  0.421119148 0.91397336 -0.60818250 -0.036557733  0.087885563
##   beta[18] -0.209400462 0.56412756 -1.72435289 -0.375092616 -0.056872354
##   beta[19]  0.291009023 0.54540990 -0.49316429 -0.008072179  0.145946500
##   beta[20]  1.480571579 0.93695015 -0.03785704  0.761562059  1.475263995
##   beta[21]  0.164947444 0.45569576 -0.48509914 -0.063726104  0.032609242
##   beta[22]  0.401285227 0.58238643 -0.29648939  0.003210501  0.180548201
##   beta[23] -0.096383578 0.55104037 -1.38108413 -0.224872718 -0.012540336
##   beta[24] -0.176140186 0.50208690 -1.52635509 -0.268228336 -0.035128796
##   beta[25]  0.158423045 0.54024027 -0.59045018 -0.073992155  0.030904220
##   beta[26]  0.048991185 0.51585360 -0.82856444 -0.148188722 -0.005089784
##   beta[27]  0.467735996 0.70880500 -0.36294259  0.003366895  0.200763451
##   beta[28]  0.271277297 0.53185116 -0.39314028 -0.026739412  0.083399555
##   beta[29]  0.079982196 0.37760203 -0.52230749 -0.087615414  0.011931552
##   beta[30]  0.453388249 0.67475414 -0.30199073 -0.003246616  0.184021906
##   beta[31] -0.074141651 0.45556556 -1.12642425 -0.225858835 -0.016261385
##   beta[32] -0.185007548 0.48715362 -1.53759170 -0.302864944 -0.049595576
##   beta[33]  0.126140815 0.41856396 -0.57710370 -0.055925581  0.037962213
##   beta[34] -0.159575725 0.51732249 -1.52744405 -0.305762970 -0.037583731
##   beta[35] -0.008822837 0.42014013 -0.83275032 -0.170599779  0.003853561
##   beta[36]  0.069236429 0.40893326 -0.71764897 -0.100443649  0.013684134
##   beta[37]  1.679011830 1.09299819 -0.08963171  0.860737739  1.739556611
##   beta[38] -0.010093177 0.41372296 -0.89468786 -0.136736795 -0.008382645
##   beta[39] -0.037644441 0.40900583 -0.94788634 -0.144069908 -0.006006124
##   beta[40] -0.047426340 0.44121591 -1.03219435 -0.218090876 -0.012570691
##   beta[41]  0.101708612 0.48688025 -0.81965501 -0.084556619  0.024019229
##   beta[42]  0.082137517 0.44098361 -0.77701065 -0.122295220  0.011803594
##   beta[43]  0.070230622 0.38517888 -0.75336667 -0.080456197  0.011742135
##   beta[44] -0.021158858 0.37969429 -0.86271383 -0.150766507 -0.013124197
##   beta[45] -0.054081436 0.41310332 -0.99458620 -0.177353020 -0.005557781
##   beta[46]  0.209001589 0.53084922 -0.49510306 -0.028420177  0.067402154
##   beta[47] -0.170899308 0.49680324 -1.71947343 -0.274307573 -0.053944957
##   beta[48]  0.007022844 0.46820479 -1.07489551 -0.155484150 -0.001134159
##   beta[49] -0.096618793 0.42614588 -1.17799906 -0.214623730 -0.021212001
##   beta[50]  0.002787401 0.43129729 -0.87667587 -0.142250734  0.002122639
##   beta[51]  0.237378907 0.61209647 -0.53907690 -0.039252469  0.051416806
##   beta[52] -0.340299396 0.90638824 -3.08998249 -0.374818478 -0.037001423
##   beta[53] -0.115331416 0.53946233 -1.37617555 -0.205443530 -0.014712446
##   beta[54]  0.257991186 0.61432989 -0.59498359 -0.030995874  0.065448389
##   beta[55]  0.001265655 0.49389986 -1.23850895 -0.138063714  0.005329492
##   beta[56]  0.194952775 0.80126225 -0.89886795 -0.076858506  0.034767077
##   beta[57]  0.228892684 0.53157477 -0.52533825 -0.019034366  0.071198762
##   beta[58]  0.294074519 0.60949856 -0.47411970 -0.031089560  0.097103431
##   beta[59] -0.172214433 0.59225720 -1.85522252 -0.297799394 -0.027192721
##   beta[60]  0.348923422 0.64578679 -0.38416557 -0.014724533  0.102109376
##   beta[61]  0.095391621 0.53852654 -0.83790129 -0.081176267  0.021695397
##   beta[62] -0.337433537 0.74185922 -2.42750151 -0.413622941 -0.080953328
##   beta[63]  0.428202539 0.78937195 -0.51685731 -0.006522981  0.156366412
##   beta[64]  0.043233199 0.43081561 -0.82878595 -0.108388983  0.001814935
##           stats
## parameter           75%     97.5%
##   b0        0.120946240 0.1592675
##   sigma     0.620797781 0.6531221
##   beta[1]   0.258304054 1.2295684
##   beta[2]  -0.230859629 0.1511064
##   beta[3]   8.027439434 8.9647040
##   beta[4]   3.099262159 4.3408438
##   beta[5]  -0.005857782 0.4713624
##   beta[6]   0.074341678 0.9720115
##   beta[7]  -0.059292749 0.3068976
##   beta[8]   0.511876508 3.1604980
##   beta[9]   6.975133962 8.1675404
##   beta[10]  0.209583240 1.1526731
##   beta[11]  0.576222208 1.9393828
##   beta[12]  0.140293985 0.9718253
##   beta[13]  0.070944984 0.6201343
##   beta[14]  0.126220593 0.9075197
##   beta[15]  0.086361562 0.8826077
##   beta[16]  0.140554040 1.0864658
##   beta[17]  0.615549228 3.0699077
##   beta[18]  0.057648022 0.6195850
##   beta[19]  0.469665128 1.7438884
##   beta[20]  2.211829189 3.2404234
##   beta[21]  0.332877799 1.3337647
##   beta[22]  0.703329562 1.9064928
##   beta[23]  0.090081614 0.8335657
##   beta[24]  0.055339697 0.5309573
##   beta[25]  0.279325441 1.5847465
##   beta[26]  0.137548937 1.1966511
##   beta[27]  0.727220908 2.1907324
##   beta[28]  0.404205501 1.7721020
##   beta[29]  0.177450605 1.0011747
##   beta[30]  0.763871848 2.1859661
##   beta[31]  0.079431073 0.8256627
##   beta[32]  0.029789087 0.5157569
##   beta[33]  0.221507905 1.4723614
##   beta[34]  0.063839229 0.7699274
##   beta[35]  0.127799864 1.0294184
##   beta[36]  0.189143111 1.1665577
##   beta[37]  2.363913452 3.9202462
##   beta[38]  0.131798986 0.8573691
##   beta[39]  0.120162082 0.6819696
##   beta[40]  0.108664434 0.8023568
##   beta[41]  0.225805437 1.3118542
##   beta[42]  0.247151717 1.1920543
##   beta[43]  0.211060764 1.0205250
##   beta[44]  0.116910924 0.8034665
##   beta[45]  0.099004045 0.7200431
##   beta[46]  0.320150859 1.7889741
##   beta[47]  0.046675151 0.5365188
##   beta[48]  0.179542137 1.0146447
##   beta[49]  0.070369599 0.7618872
##   beta[50]  0.139873025 0.9305689
##   beta[51]  0.406381707 1.8634864
##   beta[52]  0.046064250 0.5536887
##   beta[53]  0.076721234 0.8135630
##   beta[54]  0.394356783 2.0493617
##   beta[55]  0.178487059 0.9308485
##   beta[56]  0.291127179 1.9597541
##   beta[57]  0.363313450 1.7437266
##   beta[58]  0.468983842 2.0547741
##   beta[59]  0.052215441 0.6566924
##   beta[60]  0.532104748 2.0636872
##   beta[61]  0.164378094 1.4186008
##   beta[62]  0.037565978 0.4129595
##   beta[63]  0.687495839 2.5811196
##   beta[64]  0.145885256 1.1007628
## 
## , , chains = chain:2
## 
##           stats
## parameter          mean         sd         2.5%           25%
##   b0        0.097863082 0.03312878  0.030301076  0.0766758056
##   sigma     0.604446666 0.02514548  0.555941084  0.5867280327
##   beta[1]   0.145224768 0.44810068 -0.521348357 -0.0489199128
##   beta[2]  -1.228857390 0.96624664 -3.255502882 -1.9437359819
##   beta[3]   7.326688099 0.97618114  5.560967321  6.6956470919
##   beta[4]   2.369088616 1.16371700 -0.007685079  1.6599659349
##   beta[5]  -0.500126293 0.85001519 -2.674698386 -0.7315312826
##   beta[6]  -0.108593749 0.54146286 -1.504603244 -0.2264235566
##   beta[7]  -1.098690314 1.22010311 -3.839107976 -1.9692545066
##   beta[8]   0.397948759 0.82973689 -0.587011982 -0.0258467559
##   beta[9]   6.173098755 1.09628725  3.963181456  5.4049318572
##   beta[10]  0.091542782 0.32790878 -0.467914546 -0.0676025806
##   beta[11]  0.375615292 0.59970145 -0.348440360 -0.0008684619
##   beta[12]  0.021002674 0.42582100 -0.906318896 -0.1221650491
##   beta[13] -0.141306698 0.42409526 -1.339290160 -0.2179355619
##   beta[14] -0.024069266 0.50307215 -1.373880738 -0.1688233868
##   beta[15] -0.146818166 0.51928873 -1.470019817 -0.2801568056
##   beta[16] -0.082473932 0.57541816 -1.384300214 -0.1608664583
##   beta[17]  0.439368247 0.87305397 -0.398334553 -0.0243020254
##   beta[18] -0.182210563 0.48076384 -1.472295702 -0.3047635530
##   beta[19]  0.270567484 0.49331017 -0.447393996 -0.0129180856
##   beta[20]  1.511949745 1.01060906 -0.080138814  0.7260842849
##   beta[21]  0.123455588 0.39579640 -0.657494413 -0.0327905413
##   beta[22]  0.411182059 0.60222592 -0.268690533  0.0008245203
##   beta[23] -0.034793639 0.47429448 -1.082294691 -0.1779979705
##   beta[24] -0.230155309 0.52318147 -1.704197543 -0.3786497098
##   beta[25]  0.139733717 0.51479790 -0.749578330 -0.0793635169
##   beta[26]  0.054962756 0.54697231 -0.886755415 -0.1276546732
##   beta[27]  0.421200515 0.66881961 -0.325690494 -0.0084643828
##   beta[28]  0.275000112 0.54234936 -0.431237753 -0.0228721005
##   beta[29]  0.052195761 0.41684319 -0.910779797 -0.0797980837
##   beta[30]  0.456438284 0.66810710 -0.334186020 -0.0001619620
##   beta[31] -0.134671086 0.43320674 -1.344577051 -0.2858425489
##   beta[32] -0.147004244 0.50457162 -1.302384066 -0.2684600916
##   beta[33]  0.125895336 0.45489764 -0.542556546 -0.0813635225
##   beta[34] -0.141423201 0.47790001 -1.460700477 -0.2537832701
##   beta[35]  0.026045310 0.44211248 -0.797592834 -0.1316845733
##   beta[36]  0.081913647 0.38579107 -0.554386825 -0.0894192342
##   beta[37]  1.648748077 0.99616916 -0.007438141  0.8170893082
##   beta[38] -0.008723044 0.44328298 -0.878743447 -0.1606750589
##   beta[39] -0.067986086 0.44864984 -1.311085556 -0.1815960524
##   beta[40] -0.024761878 0.47054117 -1.172615998 -0.1643637791
##   beta[41]  0.099636470 0.50789584 -0.865429147 -0.1034534098
##   beta[42]  0.139389103 0.46764841 -0.703923959 -0.0412191249
##   beta[43]  0.116948511 0.39736112 -0.486803180 -0.0564683595
##   beta[44] -0.047376088 0.42221231 -1.135213997 -0.1479052555
##   beta[45] -0.071254300 0.41215730 -1.004786522 -0.2079147483
##   beta[46]  0.152430892 0.45274040 -0.462527041 -0.0613096289
##   beta[47] -0.164580246 0.47516385 -1.438511929 -0.3185415674
##   beta[48]  0.012933709 0.41032691 -0.874436677 -0.1423919027
##   beta[49] -0.122738844 0.46462516 -1.277434484 -0.2599500682
##   beta[50]  0.009368875 0.48999158 -1.194052113 -0.1501792650
##   beta[51]  0.138115357 0.59665861 -0.706862560 -0.0786821302
##   beta[52] -0.273143198 0.76698653 -2.555352280 -0.4045591640
##   beta[53] -0.072259477 0.45727339 -1.059576330 -0.2145653947
##   beta[54]  0.193174239 0.57622013 -0.719424697 -0.0568049111
##   beta[55] -0.039015096 0.52522080 -1.108259691 -0.1804298711
##   beta[56]  0.164443130 0.60274342 -0.918509895 -0.0592996035
##   beta[57]  0.229808622 0.55051989 -0.521391319 -0.0490464572
##   beta[58]  0.283122213 0.60125244 -0.496050966 -0.0173523194
##   beta[59] -0.158270233 0.62993344 -1.801824142 -0.2870117835
##   beta[60]  0.380413184 0.62292087 -0.363521498  0.0005595968
##   beta[61]  0.184097507 0.58841029 -0.862488620 -0.0517607249
##   beta[62] -0.358028832 0.75950023 -2.314449591 -0.4930923867
##   beta[63]  0.477105484 0.84568947 -0.432172033 -0.0086127927
##   beta[64]  0.060067502 0.49340066 -0.862049773 -0.1111779175
##           stats
## parameter            50%         75%     97.5%
##   b0        0.0983291598  0.12057281 0.1638375
##   sigma     0.6058074455  0.62023563 0.6528083
##   beta[1]   0.0330896764  0.28679687 1.2271710
##   beta[2]  -1.2649638751 -0.31378045 0.1844630
##   beta[3]   7.3222455999  7.96840764 9.3989267
##   beta[4]   2.4716333694  3.14891789 4.4737615
##   beta[5]  -0.1759553447  0.01219901 0.4387424
##   beta[6]  -0.0246219837  0.08599405 0.6392261
##   beta[7]  -0.6766445144 -0.04518592 0.2878568
##   beta[8]   0.0780424066  0.56178593 2.8452628
##   beta[9]   6.1565684096  6.95043715 8.1709283
##   beta[10]  0.0283568765  0.22361960 0.8869880
##   beta[11]  0.1394076255  0.64224521 1.9797657
##   beta[12]  0.0046606300  0.18568125 1.0442417
##   beta[13] -0.0324995527  0.04559562 0.5509822
##   beta[14] -0.0027478321  0.14663774 1.0369085
##   beta[15] -0.0301099791  0.06861713 0.7360260
##   beta[16] -0.0036485069  0.09511069 0.7283029
##   beta[17]  0.1097650338  0.61552687 2.8564969
##   beta[18] -0.0450689752  0.03627951 0.4503278
##   beta[19]  0.1283815248  0.51050874 1.4486375
##   beta[20]  1.4863667267  2.27621887 3.4864057
##   beta[21]  0.0404310887  0.27833208 1.0357377
##   beta[22]  0.1808239018  0.72054760 2.1219887
##   beta[23] -0.0058255261  0.11215290 0.8817749
##   beta[24] -0.0608787538  0.02731006 0.4768778
##   beta[25]  0.0299145891  0.22256099 1.4304001
##   beta[26]  0.0060528258  0.19335475 1.3266900
##   beta[27]  0.1471225824  0.70106992 2.2864997
##   beta[28]  0.0894365245  0.43255922 1.7486068
##   beta[29]  0.0121204010  0.17537740 0.9790295
##   beta[30]  0.2304610628  0.74516051 2.1836862
##   beta[31] -0.0524931437  0.05289075 0.5132697
##   beta[32] -0.0260860390  0.06558419 0.5738092
##   beta[33]  0.0128475786  0.26782374 1.2546919
##   beta[34] -0.0268133556  0.08628742 0.6120409
##   beta[35]  0.0018990015  0.17882371 1.1060702
##   beta[36]  0.0051763384  0.18523929 1.0931118
##   beta[37]  1.6676555210  2.38875474 3.5749428
##   beta[38] -0.0069549244  0.11412554 1.0281651
##   beta[39] -0.0067119544  0.11660333 0.6635317
##   beta[40]  0.0016308233  0.13386008 0.9181422
##   beta[41]  0.0038553444  0.23418117 1.3052699
##   beta[42]  0.0357398440  0.23368250 1.3704825
##   beta[43]  0.0254481199  0.24535671 1.0750118
##   beta[44]  0.0001171708  0.13638823 0.7131980
##   beta[45] -0.0111523704  0.07625630 0.7463435
##   beta[46]  0.0195149453  0.26144475 1.3161496
##   beta[47] -0.0534358814  0.03792003 0.6694155
##   beta[48] -0.0029900703  0.15103644 0.8956869
##   beta[49] -0.0242921450  0.07183460 0.6575819
##   beta[50]  0.0003891385  0.16091299 1.1106848
##   beta[51]  0.0155896151  0.24794619 1.4668260
##   beta[52] -0.0393924906  0.08120025 0.7612110
##   beta[53] -0.0138703041  0.07971401 0.8105731
##   beta[54]  0.0270620290  0.31515369 1.8282145
##   beta[55] -0.0008713000  0.11475097 0.8804458
##   beta[56]  0.0570717737  0.24639028 1.7748681
##   beta[57]  0.0680547977  0.38831859 1.7256219
##   beta[58]  0.0760302331  0.45456092 2.0058215
##   beta[59] -0.0342535752  0.08533285 0.7676128
##   beta[60]  0.1582884043  0.58068543 2.0439788
##   beta[61]  0.0437676762  0.29437119 1.7798463
##   beta[62] -0.0671238878  0.02446569 0.4579827
##   beta[63]  0.1732226319  0.70368220 2.7428420
##   beta[64]  0.0094120070  0.19128402 1.2757906
## 
## , , chains = chain:3
## 
##           stats
## parameter          mean         sd         2.5%           25%
##   b0        0.096750534 0.03393401  0.032734310  0.0750657059
##   sigma     0.603340785 0.02411071  0.560005071  0.5871170313
##   beta[1]   0.166424556 0.44436377 -0.497341038 -0.0428261233
##   beta[2]  -1.307380419 0.98384448 -3.234348192 -2.0753758828
##   beta[3]   7.276798792 0.95436819  5.402051700  6.6069217864
##   beta[4]   2.317862239 1.16544213  0.002362751  1.5475810719
##   beta[5]  -0.555434965 0.90810727 -2.779299728 -0.9294002335
##   beta[6]  -0.123978102 0.64817610 -1.601150717 -0.2551240681
##   beta[7]  -1.153597681 1.12990145 -3.732901945 -1.8338402637
##   beta[8]   0.405613452 0.95167082 -0.707048962 -0.0577375905
##   beta[9]   6.316772749 1.02025437  4.389750560  5.5491065995
##   beta[10]  0.089587622 0.42505001 -0.696855570 -0.0750111675
##   beta[11]  0.368951755 0.61289867 -0.411411559 -0.0036770500
##   beta[12] -0.004166531 0.41410205 -0.986809073 -0.1476432059
##   beta[13] -0.252375685 0.52841382 -1.767235934 -0.4165123005
##   beta[14] -0.063122787 0.51862142 -1.350489072 -0.1884960507
##   beta[15] -0.145588384 0.60368517 -1.793984252 -0.2847139528
##   beta[16] -0.061834063 0.54995945 -1.511261383 -0.1819369550
##   beta[17]  0.477039553 0.91827533 -0.687477937 -0.0140982411
##   beta[18] -0.192275205 0.46704027 -1.461210724 -0.3425033710
##   beta[19]  0.223313102 0.51801059 -0.461913064 -0.0489740158
##   beta[20]  1.539444759 0.95225076 -0.015418008  0.8187438363
##   beta[21]  0.150815612 0.43534160 -0.511677280 -0.0406610756
##   beta[22]  0.417337861 0.61438570 -0.386328652 -0.0002418128
##   beta[23] -0.073070636 0.56280431 -1.380663279 -0.2198837696
##   beta[24] -0.270848069 0.66106431 -1.961499805 -0.4598077856
##   beta[25]  0.146050195 0.51237448 -0.705110620 -0.0747546041
##   beta[26]  0.051830884 0.56742778 -0.867117737 -0.1456944564
##   beta[27]  0.441320025 0.74038137 -0.364561032 -0.0107363626
##   beta[28]  0.260132945 0.52610070 -0.454144392 -0.0096880845
##   beta[29]  0.078251509 0.40364275 -0.707829894 -0.0817613783
##   beta[30]  0.434301858 0.66192337 -0.364293179 -0.0027183069
##   beta[31] -0.101254942 0.50178960 -1.242418108 -0.2790069605
##   beta[32] -0.199264540 0.53859729 -1.596819282 -0.3172858933
##   beta[33]  0.123931629 0.42672182 -0.670324991 -0.0543525801
##   beta[34] -0.143359266 0.55224435 -1.504804216 -0.2630534246
##   beta[35]  0.018000943 0.37647353 -0.765695043 -0.1305583148
##   beta[36]  0.058538779 0.37291533 -0.699947351 -0.0791027699
##   beta[37]  1.718524618 1.10304516 -0.094070684  0.8732578204
##   beta[38] -0.038073623 0.45322946 -1.118445977 -0.1401499835
##   beta[39] -0.038549186 0.43775311 -0.894615272 -0.1677839305
##   beta[40]  0.012205662 0.50945393 -1.118533194 -0.1423813028
##   beta[41]  0.082499607 0.48266142 -0.735477423 -0.1013413103
##   beta[42]  0.116672381 0.45178608 -0.723743946 -0.0538085353
##   beta[43]  0.082220233 0.39838762 -0.677579095 -0.0833229395
##   beta[44] -0.031745801 0.38735955 -1.013312637 -0.1486430887
##   beta[45] -0.026869508 0.39439592 -1.009089140 -0.1322340255
##   beta[46]  0.206927337 0.51713982 -0.499194519 -0.0575688024
##   beta[47] -0.170979233 0.53696919 -1.563814405 -0.3216526402
##   beta[48]  0.053620778 0.43478535 -0.863117607 -0.0955547399
##   beta[49] -0.093603162 0.44553412 -1.177801634 -0.2266249441
##   beta[50]  0.040535593 0.58921209 -1.032330854 -0.1497235926
##   beta[51]  0.140524161 0.47893979 -0.560259255 -0.0673875104
##   beta[52] -0.333009301 0.94415157 -2.813643845 -0.4627338203
##   beta[53] -0.122781410 0.54969896 -1.608262264 -0.2444734195
##   beta[54]  0.169270656 0.52910992 -0.599203773 -0.0510934259
##   beta[55]  0.003981441 0.49774158 -0.949051605 -0.1259188240
##   beta[56]  0.197579840 0.80044163 -0.899788420 -0.0618682011
##   beta[57]  0.288552711 0.69319319 -0.636844246 -0.0426753144
##   beta[58]  0.340650278 0.67350705 -0.495150503 -0.0272980993
##   beta[59] -0.150048751 0.54313799 -1.378883844 -0.2619190614
##   beta[60]  0.382176436 0.61040963 -0.376088028 -0.0017401266
##   beta[61]  0.149377983 0.56810294 -0.726145460 -0.0953938835
##   beta[62] -0.354454218 0.70511360 -2.399070399 -0.5171962304
##   beta[63]  0.497274135 0.86921490 -0.449846244 -0.0146039333
##   beta[64]  0.098332817 0.43578217 -0.752845136 -0.0787644129
##           stats
## parameter            50%          75%      97.5%
##   b0        0.0970338646  0.119486820 0.16006756
##   sigma     0.6023666549  0.617888780 0.65203278
##   beta[1]   0.0376975690  0.284696529 1.33986904
##   beta[2]  -1.3140532259 -0.416726824 0.06024406
##   beta[3]   7.2737082172  7.926729470 9.19209667
##   beta[4]   2.3717962681  3.215698788 4.43507938
##   beta[5]  -0.1947770047 -0.000215417 0.46168192
##   beta[6]  -0.0448951482  0.080193804 1.08342694
##   beta[7]  -0.9903898449 -0.137769462 0.17994152
##   beta[8]   0.0638767802  0.540289125 3.25314491
##   beta[9]   6.3067155948  7.034400439 8.32270825
##   beta[10]  0.0125431218  0.215503829 1.07346854
##   beta[11]  0.1735123442  0.626857475 1.94590295
##   beta[12] -0.0019824543  0.117262092 0.88909872
##   beta[13] -0.0858388598  0.034166885 0.47015861
##   beta[14] -0.0064761860  0.105199959 1.03103622
##   beta[15] -0.0190764657  0.085362834 0.70044552
##   beta[16] -0.0049008157  0.116871743 1.02739354
##   beta[17]  0.1146910150  0.739773011 3.18470762
##   beta[18] -0.0461477346  0.035886435 0.46509369
##   beta[19]  0.0397056785  0.391724026 1.62956348
##   beta[20]  1.5780864812  2.236841086 3.37919389
##   beta[21]  0.0524984106  0.277994818 1.31421071
##   beta[22]  0.1926241905  0.716556606 1.97437199
##   beta[23] -0.0077132298  0.101970458 0.78065344
##   beta[24] -0.0606261314  0.039093031 0.46203758
##   beta[25]  0.0261918747  0.245007705 1.51659310
##   beta[26]  0.0029116906  0.191492952 1.19011905
##   beta[27]  0.1423930498  0.713920229 2.51126968
##   beta[28]  0.1246797861  0.441369551 1.62926102
##   beta[29]  0.0050639156  0.159883913 1.17957731
##   beta[30]  0.1809641497  0.778032535 2.19543034
##   beta[31] -0.0210915686  0.085595633 0.81814290
##   beta[32] -0.0284671888  0.057978243 0.50108207
##   beta[33]  0.0254593958  0.266646710 1.13764383
##   beta[34] -0.0233136775  0.070738072 0.68730491
##   beta[35]  0.0020386478  0.151708098 0.95860096
##   beta[36]  0.0187721747  0.186990901 1.04379734
##   beta[37]  1.7428021418  2.424314784 4.00030275
##   beta[38] -0.0065737173  0.103778210 0.87843187
##   beta[39] -0.0032032428  0.108137646 0.82834883
##   beta[40]  0.0008359924  0.169449635 0.98221100
##   beta[41]  0.0051336273  0.171871414 1.45986260
##   beta[42]  0.0298321850  0.235036228 1.23931087
##   beta[43]  0.0187767589  0.199318120 1.05959201
##   beta[44]  0.0003183083  0.110557379 0.77894323
##   beta[45] -0.0049235595  0.133791544 0.70479219
##   beta[46]  0.0625556049  0.338126696 1.61054760
##   beta[47] -0.0399062083  0.087039514 0.75092893
##   beta[48]  0.0204702134  0.208265300 1.09603805
##   beta[49] -0.0169452582  0.082673635 0.72320316
##   beta[50]  0.0017584081  0.168141510 1.33736026
##   beta[51]  0.0312291180  0.239131917 1.55429295
##   beta[52] -0.0534667171  0.065356581 0.72449662
##   beta[53] -0.0233950831  0.078165615 0.85146683
##   beta[54]  0.0561823834  0.348329987 1.32936274
##   beta[55]  0.0014247424  0.157731894 0.99454269
##   beta[56]  0.0259180805  0.296598251 1.95406242
##   beta[57]  0.0697551982  0.465413747 2.19454240
##   beta[58]  0.1014221756  0.484742828 2.20963405
##   beta[59] -0.0280347403  0.056140704 0.64872808
##   beta[60]  0.1482180438  0.592202421 2.06558317
##   beta[61]  0.0202579758  0.236361216 1.90136110
##   beta[62] -0.0887302682  0.018240668 0.48304053
##   beta[63]  0.1445586561  0.828770180 2.73270332
##   beta[64]  0.0229665494  0.242535694 1.13856099
## 
## , , chains = chain:4
## 
##           stats
## parameter          mean         sd        2.5%          25%           50%
##   b0        0.096546109 0.03119814  0.03395806  0.077447745  0.0965095621
##   sigma     0.606365370 0.02601545  0.55888247  0.588350196  0.6051786083
##   beta[1]   0.134792293 0.39251495 -0.45534817 -0.050204199  0.0316981567
##   beta[2]  -1.068057917 0.96694432 -3.06514838 -1.796898313 -0.9238968042
##   beta[3]   7.413603908 0.91219501  5.61078760  6.756476080  7.4423340418
##   beta[4]   2.122541661 1.17272372 -0.03214448  1.301502627  2.2044262070
##   beta[5]  -0.417517079 0.86040240 -2.74420720 -0.686475160 -0.1053228717
##   beta[6]  -0.156284450 0.68248812 -1.78088896 -0.288960938 -0.0388863655
##   beta[7]  -1.070553973 1.17823428 -3.75289738 -1.892006598 -0.7661059973
##   beta[8]   0.403951524 0.92918512 -0.85355128 -0.025696427  0.1019137322
##   beta[9]   6.193950243 1.05596438  4.13845647  5.536986032  6.0964358343
##   beta[10]  0.146068004 0.39352310 -0.38785335 -0.037186990  0.0347360379
##   beta[11]  0.350707529 0.56840243 -0.33621216  0.001517974  0.1288057578
##   beta[12] -0.024115866 0.33938481 -0.88596422 -0.146188716 -0.0034642372
##   beta[13] -0.171345812 0.49293933 -1.68908165 -0.276011152 -0.0230133392
##   beta[14] -0.015290820 0.46114522 -1.04555045 -0.137679703 -0.0048112318
##   beta[15] -0.103692162 0.53161903 -1.39657129 -0.249554098 -0.0310001471
##   beta[16] -0.017356015 0.44165397 -1.02940990 -0.159484394 -0.0003919004
##   beta[17]  0.309297358 0.70029027 -0.62504006 -0.053299569  0.0659460915
##   beta[18] -0.180937746 0.49290100 -1.41614007 -0.283951594 -0.0371937834
##   beta[19]  0.228470518 0.48489347 -0.41688172 -0.032366918  0.0695854104
##   beta[20]  1.475830216 1.03497593 -0.05190627  0.660631041  1.4488697016
##   beta[21]  0.148284401 0.44091990 -0.58074430 -0.055827953  0.0411691746
##   beta[22]  0.423250762 0.64327663 -0.35024286 -0.001763475  0.1995558016
##   beta[23] -0.056470135 0.42198689 -1.08751179 -0.176357958 -0.0102009410
##   beta[24] -0.204326846 0.50890765 -1.56174360 -0.337839685 -0.0664825137
##   beta[25]  0.092980653 0.41403994 -0.62819505 -0.062420306  0.0252204292
##   beta[26]  0.004960615 0.46091351 -0.97960300 -0.134688447 -0.0044246808
##   beta[27]  0.408130959 0.69094943 -0.39912827 -0.010513680  0.1206067636
##   beta[28]  0.249160630 0.53431447 -0.48870831 -0.018321634  0.0794747011
##   beta[29]  0.059086089 0.38118232 -0.68374425 -0.069460437  0.0164495672
##   beta[30]  0.434809778 0.66273590 -0.30493028  0.001435845  0.1645989511
##   beta[31] -0.086643546 0.48183164 -1.22932722 -0.209075325 -0.0107216054
##   beta[32] -0.198217659 0.54310971 -1.66071143 -0.352966969 -0.0685537658
##   beta[33]  0.151362663 0.44572024 -0.56298294 -0.044021376  0.0501036917
##   beta[34] -0.109355374 0.46519544 -1.29350785 -0.233817836 -0.0201705784
##   beta[35]  0.005891920 0.37946143 -0.80753683 -0.108664324  0.0028116083
##   beta[36]  0.062042759 0.34608416 -0.64403134 -0.086753855  0.0157197062
##   beta[37]  1.696033184 1.09194149 -0.05466979  0.817392231  1.7656743500
##   beta[38] -0.028515133 0.38597976 -0.88543400 -0.145004447 -0.0096725940
##   beta[39] -0.004405590 0.36918367 -0.82951958 -0.126668548 -0.0019244500
##   beta[40] -0.018401085 0.43630236 -1.04895133 -0.171113159 -0.0042289876
##   beta[41]  0.141252010 0.50064578 -0.58262850 -0.083481042  0.0121680857
##   beta[42]  0.122226011 0.46448308 -0.72043695 -0.060445905  0.0330657894
##   beta[43]  0.079908154 0.40493419 -0.63175059 -0.088765612  0.0210775567
##   beta[44]  0.001005078 0.38038779 -0.92394620 -0.107286841  0.0006368025
##   beta[45] -0.063601182 0.42356851 -1.15345668 -0.197176905 -0.0158965326
##   beta[46]  0.157290374 0.50681227 -0.57848394 -0.085276009  0.0200104162
##   beta[47] -0.168930859 0.51155047 -1.49594730 -0.322138103 -0.0481591655
##   beta[48]  0.050106865 0.44549763 -0.77501573 -0.105762944  0.0100948284
##   beta[49] -0.028036058 0.40061927 -0.99261573 -0.144641557 -0.0041327906
##   beta[50] -0.004113098 0.48516183 -1.07529346 -0.141861334 -0.0040897514
##   beta[51]  0.129033550 0.46640788 -0.70388789 -0.074359914  0.0312289306
##   beta[52] -0.186243579 0.78081247 -2.10210458 -0.326639629 -0.0211603294
##   beta[53] -0.075811151 0.54707675 -1.50713381 -0.206610231 -0.0131417243
##   beta[54]  0.149262412 0.53707912 -0.78685277 -0.049896044  0.0371059986
##   beta[55]  0.052582773 0.44789010 -0.84649523 -0.092785314  0.0035208611
##   beta[56]  0.086219455 0.62950334 -1.17749690 -0.065058696  0.0154939968
##   beta[57]  0.212845834 0.59906723 -0.59113470 -0.056482944  0.0376803732
##   beta[58]  0.267347491 0.55503383 -0.39874925 -0.024715881  0.0671812083
##   beta[59] -0.217711945 0.66409051 -1.97110764 -0.337879094 -0.0206872799
##   beta[60]  0.322418474 0.59044164 -0.47294447 -0.015716642  0.1174859852
##   beta[61]  0.123126799 0.54656507 -0.72153136 -0.075345447  0.0191426785
##   beta[62] -0.364172375 0.75755037 -2.62379808 -0.462862511 -0.0734422675
##   beta[63]  0.426919386 0.77044999 -0.49978769 -0.009193722  0.1232542847
##   beta[64]  0.010400408 0.44607663 -1.00581239 -0.124430837  0.0026485640
##           stats
## parameter          75%     97.5%
##   b0        0.11532454 0.1588412
##   sigma     0.62389415 0.6575069
##   beta[1]   0.25869810 1.2543498
##   beta[2]  -0.21673321 0.1826845
##   beta[3]   8.00662542 9.1519377
##   beta[4]   2.95206989 4.2689175
##   beta[5]   0.02177994 0.7387188
##   beta[6]   0.04271197 1.0711325
##   beta[7]  -0.04496668 0.3521680
##   beta[8]   0.52942373 3.0966434
##   beta[9]   6.88928881 8.2305780
##   beta[10]  0.23593774 1.1671557
##   beta[11]  0.58719401 1.7988301
##   beta[12]  0.11731477 0.6330601
##   beta[13]  0.05410160 0.4392717
##   beta[14]  0.12154803 0.8854676
##   beta[15]  0.09184914 0.9041607
##   beta[16]  0.11411800 0.9584540
##   beta[17]  0.54107493 2.2854979
##   beta[18]  0.05210473 0.5497280
##   beta[19]  0.39956793 1.4887159
##   beta[20]  2.22781676 3.5487697
##   beta[21]  0.28843009 1.2147244
##   beta[22]  0.71977556 2.1977597
##   beta[23]  0.07704070 0.6800647
##   beta[24]  0.02649505 0.5950662
##   beta[25]  0.21609099 1.1731736
##   beta[26]  0.10335974 1.0320278
##   beta[27]  0.57313594 2.2421853
##   beta[28]  0.42014957 1.5453318
##   beta[29]  0.15370075 1.0275464
##   beta[30]  0.72749763 2.2662753
##   beta[31]  0.10491340 0.8203729
##   beta[32]  0.03127546 0.5296128
##   beta[33]  0.30977704 1.3396136
##   beta[34]  0.08152119 0.7557737
##   beta[35]  0.12989660 0.8501700
##   beta[36]  0.18022421 0.9007207
##   beta[37]  2.46585702 3.8206777
##   beta[38]  0.11057769 0.7658582
##   beta[39]  0.12440654 0.8293644
##   beta[40]  0.13673095 0.8512379
##   beta[41]  0.22061268 1.5640561
##   beta[42]  0.25720122 1.2235647
##   beta[43]  0.20427252 1.2053780
##   beta[44]  0.12814593 0.8949197
##   beta[45]  0.06978069 0.8546703
##   beta[46]  0.25262860 1.5940772
##   beta[47]  0.06957541 0.7047576
##   beta[48]  0.16732765 1.3135822
##   beta[49]  0.09213477 0.8457497
##   beta[50]  0.13966205 1.0919625
##   beta[51]  0.29034475 1.3679455
##   beta[52]  0.08463103 1.0607000
##   beta[53]  0.11819276 0.9025530
##   beta[54]  0.30770524 1.4930241
##   beta[55]  0.18967722 1.0539308
##   beta[56]  0.18212825 1.3860326
##   beta[57]  0.34173677 1.9991118
##   beta[58]  0.41579681 1.8002621
##   beta[59]  0.06126964 0.5687689
##   beta[60]  0.56726528 1.8606980
##   beta[61]  0.21117149 1.4984123
##   beta[62]  0.01255208 0.5103501
##   beta[63]  0.64219280 2.5380032
##   beta[64]  0.16556018 0.9009418

First compare the resulting regression parameters to OLS values.

(plot of chunk compare_betas: Stan posterior estimates plotted against the OLS estimates)
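
For example, the comparison above could be made with something like the following sketch. The object names `sparse_fit` (the stanfit) and `ols_fit` (the `lm()` fit) are assumptions for illustration, not necessarily the names used in this document:

```r
# Hypothetical sketch: posterior medians of the regression coefficients
# from the sparse Stan fit, plotted against the OLS estimates.
library(rstan)
samples   <- rstan::extract(sparse_fit)      # `sparse_fit`: assumed name of the stanfit
stan_beta <- apply(samples$beta, 2, median)  # posterior median for each predictor
ols_beta  <- coef(ols_fit)[-1]               # `ols_fit`: assumed name of the lm() fit; drop the intercept
plot(ols_beta, stan_beta,
     xlab = "OLS estimate", ylab = "Stan posterior median")
abline(0, 1, lty = 2)                        # 1:1 line for reference
```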

The coefficient estimates from OLS are weird: many are enormous in magnitude (the table is sorted by the size of the OLS estimate).

##                      ols           stan
## (Intercept)    150.40351  150.441968698
## `tc:ldl`    -17718.75773    0.033353836
## tc           16796.16643  -15.094758817
## ldl         -15061.25410   -2.803126530
## `tc^2`       11301.51242   -0.566494009
## `tc:ltg`    -10009.29895   -1.458708736
## `ldl:ltg`     8124.59404    5.071689135
## `ldl^2`       7536.39662   -2.487378411
## `tc:hdl`     -6298.38254    2.990296176
## hdl          -6254.46319  -70.904720898
## `ldl:hdl`     5004.48290    0.156320821
## ltg          -4856.91364  544.081396337
## `hdl:ltg`     3597.25613   11.198654641
## `tc:tch`     -2574.37368   -3.182541198
## `hdl^2`       2298.16669   -0.298001012
## `ldl:tch`     1731.89234    2.660811733
## `ltg^2`       1549.77906   -4.036317748
## `hdl:tch`     1076.35481   -2.417174388
## `map:tc`      1003.12227   -0.048598374
## `map:ldl`     -916.96996   -0.694345338
## `tch^2`        756.03682    8.022192132
## `tch:glu`      596.19136   12.736974731
## bmi            587.85483  643.162520011
## `hdl:glu`      560.73667    2.215087063
## `tc:glu`      -497.85900    3.832447892
## `map:ltg`     -455.17300    0.520230473
## `age:tch`      450.27383   -0.007549981
## `age:hdl`      349.10626    2.512926502
## tch            330.71500    6.514453051
## `bmi:map`      326.31803  151.029322377
## map            323.09057  204.692791124
## `sex:tch`     -314.48509   -2.378188538
## `map:hdl`     -303.16364    3.691671900
## sex           -286.08810 -101.765851787
## `ldl:glu`      265.99329    7.018465292
## `age:ldl`     -215.12650   -4.823499031
## `tch:ltg`      212.46622   -6.586308746
## `ltg:glu`      172.80190    0.812160488
## `sex:map`      165.15332   16.293653062
## `bmi:tch`     -162.32294    0.902375176
## `map^2`       -161.86028   -3.306257826
## `age:sex`      149.66531  129.941006661
## `age:tc`      -147.14797   -0.731836716
## age            131.03118    2.742276426
## `sex:ldl`      125.00529   -3.746752079
## `age^2`        123.34475   12.703843426
## `sex:hdl`     -108.67040    2.856477713
## `bmi:hdl`     -101.63216   -0.230827419
## `sex:ltg`       97.47348    0.239873862
## `bmi:ldl`       92.41196   -0.366628908
## `map:tch`       83.43006   -4.218316446
## `age:map`       73.42593   16.507539257
## `bmi:ltg`       70.84372    2.259593599
## `map:glu`      -65.99667   -1.318668568
## `age:glu`       65.26740    8.438687365
## `bmi^2`        -57.66550    0.095112982
## `age:bmi`      -51.02631    3.575305840
## `glu^2`         49.71817    8.064464238
## `age:ltg`       31.70304   14.258226696
## `sex:glu`       24.91435    1.137130088
## `bmi:tc`        21.50722   -0.665490953
## glu             19.12792    2.300361215
## `sex:tc`       -15.98984   -1.999827883
## `sex:bmi`      -14.04317    0.938191184
## `bmi:glu`       11.21618    1.642182837

And they are quite different from what Stan gets (the same comparison, re-sorted by the magnitude of the Stan estimate):

##                      ols           stan
## (Intercept)    150.40351  150.441968698
## bmi            587.85483  643.162520011
## ltg          -4856.91364  544.081396337
## map            323.09057  204.692791124
## `bmi:map`      326.31803  151.029322377
## `age:sex`      149.66531  129.941006661
## sex           -286.08810 -101.765851787
## hdl          -6254.46319  -70.904720898
## `age:map`       73.42593   16.507539257
## `sex:map`      165.15332   16.293653062
## tc           16796.16643  -15.094758817
## `age:ltg`       31.70304   14.258226696
## `tch:glu`      596.19136   12.736974731
## `age^2`        123.34475   12.703843426
## `hdl:ltg`     3597.25613   11.198654641
## `age:glu`       65.26740    8.438687365
## `glu^2`         49.71817    8.064464238
## `tch^2`        756.03682    8.022192132
## `ldl:glu`      265.99329    7.018465292
## `tch:ltg`      212.46622   -6.586308746
## tch            330.71500    6.514453051
## `ldl:ltg`     8124.59404    5.071689135
## `age:ldl`     -215.12650   -4.823499031
## `map:tch`       83.43006   -4.218316446
## `ltg^2`       1549.77906   -4.036317748
## `tc:glu`      -497.85900    3.832447892
## `sex:ldl`      125.00529   -3.746752079
## `map:hdl`     -303.16364    3.691671900
## `age:bmi`      -51.02631    3.575305840
## `map^2`       -161.86028   -3.306257826
## `tc:tch`     -2574.37368   -3.182541198
## `tc:hdl`     -6298.38254    2.990296176
## `sex:hdl`     -108.67040    2.856477713
## ldl         -15061.25410   -2.803126530
## age            131.03118    2.742276426
## `ldl:tch`     1731.89234    2.660811733
## `age:hdl`      349.10626    2.512926502
## `ldl^2`       7536.39662   -2.487378411
## `hdl:tch`     1076.35481   -2.417174388
## `sex:tch`     -314.48509   -2.378188538
## glu             19.12792    2.300361215
## `bmi:ltg`       70.84372    2.259593599
## `hdl:glu`      560.73667    2.215087063
## `sex:tc`       -15.98984   -1.999827883
## `bmi:glu`       11.21618    1.642182837
## `tc:ltg`    -10009.29895   -1.458708736
## `map:glu`      -65.99667   -1.318668568
## `sex:glu`       24.91435    1.137130088
## `sex:bmi`      -14.04317    0.938191184
## `bmi:tch`     -162.32294    0.902375176
## `ltg:glu`      172.80190    0.812160488
## `age:tc`      -147.14797   -0.731836716
## `map:ldl`     -916.96996   -0.694345338
## `bmi:tc`        21.50722   -0.665490953
## `tc^2`       11301.51242   -0.566494009
## `map:ltg`     -455.17300    0.520230473
## `bmi:ldl`       92.41196   -0.366628908
## `hdl^2`       2298.16669   -0.298001012
## `sex:ltg`       97.47348    0.239873862
## `bmi:hdl`     -101.63216   -0.230827419
## `ldl:hdl`     5004.48290    0.156320821
## `bmi^2`        -57.66550    0.095112982
## `map:tc`      1003.12227   -0.048598374
## `tc:ldl`    -17718.75773    0.033353836
## `age:tch`      450.27383   -0.007549981
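
The side-by-side tables above could be assembled along these lines (again a hypothetical sketch, reusing the names from the earlier sketch and ignoring any centering or scaling of the response used in the actual analysis):

```r
# Hypothetical sketch: OLS and Stan (posterior median) estimates side by side,
# sorted by the magnitude of the Stan estimate.
coef_tab <- cbind(ols  = coef(ols_fit),
                  stan = c(median(samples$b0), stan_beta))
coef_tab[order(abs(coef_tab[, "stan"]), decreasing = TRUE), ]
```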

Now let’s look at out-of-sample prediction error, using the posterior median coefficient estimates:

(plot of chunk pred_stan: out-of-sample prediction error)
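
A minimal sketch of that error computation, assuming `x_train`/`y_train` and `x_test`/`y_test` hold the training and held-out model matrices and responses (hypothetical names):

```r
# Hypothetical sketch: root-mean-square prediction error of the sparse model,
# using the posterior median intercept and coefficients.
b0_hat     <- median(samples$b0)
pred_test  <- b0_hat + x_test  %*% stan_beta
pred_train <- b0_hat + x_train %*% stan_beta
c(test  = sqrt(mean((y_test  - pred_test)^2)),
  train = sqrt(mean((y_train - pred_train)^2)))
```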

Conclusions?

  1. Our “sparse” model is certainly sparser (most coefficients are shrunk close to zero), and arguably more interpretable.

  2. It has a root-mean-square prediction error of about 55.7 on the test data and 52.1 on the training data.

  3. Out of sample, this is substantially better than ordinary linear regression, which had a root-mean-square prediction error of about 64.6 on the test data, even though OLS fit the training data more closely (about 46.6): the unpenalized model overfits.

The sparse model is both more interpretable and more generalizable.