The bnns package provides an efficient and user-friendly implementation of Bayesian Neural Networks (BNNs) for regression, binary classification, and multiclass classification problems. By integrating Bayesian inference, bnns allows for uncertainty quantification in predictions and robust parameter estimation.

This vignette covers:

1. Installing and loading the package
2. Preparing data
3. Fitting a BNN model
4. Summarizing the model
5. Making predictions
6. Model evaluation
7. Customizing priors
To install the package, use the following commands:

```r
# Install from CRAN (if available)
# install.packages("bnns")

# Or install the development version from GitHub
# devtools::install_github("swarnendu-stat/bnns")
```
Load the package in your R session:
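```r
library(bnns)
```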
The bnns package expects data in the form of matrices for predictors and a vector for responses. Here's an example of generating synthetic data:

```r
# Generate training data
set.seed(123)
df <- data.frame(x1 = runif(10), x2 = runif(10), y = rnorm(10))
```
For binary or multiclass classification:
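A minimal sketch of the same idea, converting a continuous response into binary and categorical labels; the cut points used here are arbitrary and purely for illustration:

```r
set.seed(123)
df <- data.frame(x1 = runif(10), x2 = runif(10), y = rnorm(10))

# Binary response: 1 if y is above its median, 0 otherwise
df_bin <- transform(df, y = as.integer(y > median(y)))

# Multiclass response: cut y into three ordered categories (a factor)
df_cat <- transform(df, y = cut(y, breaks = 3, labels = c("low", "mid", "high")))
```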
Fit a Bayesian Neural Network using the bnns() function. Specify the network architecture using arguments such as the number of hidden layers (L), nodes per layer (nodes), and activation functions (act_fn).
```r
model_reg <- bnns(
  y ~ -1 + x1 + x2,
  data = df,
  L = 1,          # Number of hidden layers
  nodes = 2,      # Nodes per layer
  act_fn = 3,     # Activation function: 3 = ReLU
  out_act_fn = 1, # Output activation function: 1 = identity (for regression)
  iter = 1e1,     # Very low iteration count for illustration; use at least 1e3 for meaningful inference
  warmup = 5,     # Very low warmup for illustration; use at least 2e2 for meaningful inference
  chains = 1
)
```
Use the summary() function to view details of the fitted model, including the network architecture, posterior distributions, and predictive performance.
```r
summary(model_reg)
#> Call:
#> bnns.default(formula = y ~ -1 + x1 + x2, data = df, L = 1, nodes = 2,
#>     act_fn = 3, out_act_fn = 1, iter = 10, warmup = 5, chains = 1)
#>
#> Data Summary:
#> Number of observations: 10
#> Number of features: 2
#>
#> Network Architecture:
#> Number of hidden layers: 1
#> Nodes per layer: 2
#> Activation functions: 3
#> Output activation function: 1
#>
#> Posterior Summary (Key Parameters):
#>                mean    se_mean        sd       2.5%        25%        50%
#> w_out[1] -0.2682487 0.19315006 0.3610847 -0.6939969 -0.5144810 -0.2992260
#> w_out[2] -0.4426392 0.10830231 0.2024660 -0.7055733 -0.5981513 -0.3399174
#> b_out     0.5266503 0.08137059 0.1521184  0.3630533  0.3630533  0.6022671
#> sigma     1.1249599 0.26761385 0.5002912  0.8186562  0.8849170  0.9576394
#>                  75%      97.5%   n_eff      Rhat
#> w_out[1]  0.09320342 0.09320342 3.49485 0.9075369
#> w_out[2] -0.27880903 -0.27880903 3.49485 3.7425590
#> b_out     0.62355410 0.67554666 3.49485 0.8134989
#> sigma     0.95763938 1.90774292 3.49485 0.9124295
#>
#> Model Fit Information:
#> Iterations: 10
#> Warmup: 5
#> Thinning: 1
#> Chains: 1
#>
#> Predictive Performance:
#> RMSE (training): 0.8624338
#> MAE (training): 0.6994271
#>
#> Notes:
#> Check convergence diagnostics for parameters with high R-hat values.
```
The predict() function generates predictions for new data. The format of the predictions depends on the output activation function.
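A minimal sketch for the regression model fitted above; the test data frame here is invented for illustration, and the assumption (to be checked against the package documentation) is that predict() returns a matrix of posterior predictive draws with one row per observation:

```r
# Hypothetical test data with the same predictors as the training set
set.seed(456)
test_df <- data.frame(x1 = runif(5), x2 = runif(5))

# Posterior predictive draws; rows are assumed to correspond to rows of test_df
pred_reg <- predict(model_reg, newdata = test_df)

# Point predictions and 95% credible intervals from the draws
point_est <- rowMeans(pred_reg)
intervals <- apply(pred_reg, 1, quantile, probs = c(0.025, 0.975))
```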
The bnns package includes utility functions like measure_cont, measure_bin, and measure_cat for evaluating model performance.
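A sketch of evaluating the regression fit on its training data; the argument names obs and pred are assumed from the package documentation, as is predict() defaulting to the training data when newdata is omitted:

```r
# Posterior predictive draws for the training data (assumed default of predict())
fitted_draws <- predict(model_reg)

# Compare posterior mean predictions to the observed responses;
# measure_cont is assumed to return continuous-outcome metrics such as RMSE and MAE
metrics <- measure_cont(obs = df$y, pred = rowMeans(fitted_draws))
metrics
```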
Customized priors can be used for the weights as well as the sigma parameter (for regression). Here we show an example using a Cauchy prior for the weights in the multiclass classification case.
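A sketch under stated assumptions: the data and category labels are invented; prior_weights is assumed to accept a named-list distribution specification, and out_act_fn = 3 is assumed to select the softmax output for multiclass classification. Check both against the package documentation.

```r
# Invented multiclass data: a factor response with three levels
set.seed(123)
df_cat <- data.frame(x1 = runif(20), x2 = runif(20))
df_cat$y <- factor(sample(c("a", "b", "c"), 20, replace = TRUE))

model_cat <- bnns(
  y ~ -1 + x1 + x2,
  data = df_cat,
  L = 1, nodes = 2, act_fn = 3,
  out_act_fn = 3,  # assumed: 3 = softmax, for multiclass classification
  iter = 1e1, warmup = 5, chains = 1,  # illustration only; increase for real inference
  # Cauchy prior on the weights; this list structure is an assumption
  # based on the package documentation
  prior_weights = list(dist = "cauchy", params = list(mu = 0, sigma = 2.5))
)
```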
For more details, consult the source code on GitHub.