SubspaceInference.jl

SubspaceInference.jl is a Julia package that implements the subspace inference method for Deep Neural Networks (DNNs) and ordinary differential equations (ODEs).

Subspace Inference using PCA or Diffusion Map

SubspaceInference.subspace_inferenceMethod
subspace_inference(model, cost, data, opt; σ_z = 1.0, σ_m = 1.0, σ_p = 1.0,
    itr = 1000, T = 25, c = 1, M = 20, print_freq = 1, alg = :rwmh,
    backend = :forwarddiff, method = :subspace)

Generates uncertainty estimates for machine learning models by drawing samples from the subspace with a Metropolis–Hastings (MH) sampler.

Input Arguments

  • model : Machine learning model. Eg: Chain(Dense(10,2)). The model must be built with Chain from Flux
  • cost : Cost function. Eg: L(x, y) = Flux.Losses.mse(m(x), y)
  • data : Inputs and outputs. Eg: X = rand(10,100); Y = rand(2,100); data = DataLoader(X,Y);
  • opt : Optimizer. Eg: opt = ADAM(0.1)

Keyword Arguments

  • callback : Callback function during training. Eg: callback() = @show(L(X,Y))
  • σ_z : Standard deviation of subspace
  • σ_m : Standard deviation of likelihood model
  • σ_p : Standard deviation of prior
  • itr : Number of sampling iterations
  • T : Number of steps for the subspace calculation. Eg: T = 1
  • c : Moment update frequency. Eg: c = 1
  • M : Maximum number of columns in the deviation matrix. Eg: M = 3
  • alg : Sampling Algorithm. Eg: :rwmh
  • backend : Differentiation backend. Eg: :forwarddiff
  • method : Subspace construction method. Eg: :subspace
  • print_freq : Loss printing frequency

Output

  • chn : Chain of samples carrying the uncertainty information
  • lp : Log probabilities of all samples
  • W_swa : Mean weights (the stochastic weight average)
source
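Putting the arguments above together, a minimal usage sketch follows. It assumes Flux and SubspaceInference are installed; the toy data, model, and keyword values are illustrative, not recommended settings.

```julia
using Flux, SubspaceInference
using Flux.Data: DataLoader

# Toy regression data: 10 features, 100 samples, 2 outputs
X = rand(10, 100)
Y = rand(2, 100)
data = DataLoader(X, Y)

# The model must be a Flux Chain
m = Chain(Dense(10, 2))

# Cost function closing over the model
L(x, y) = Flux.Losses.mse(m(x), y)

opt = ADAM(0.1)

# Sample from a PCA subspace (M = 3 columns) with random-walk MH
chn, lp, W_swa = subspace_inference(m, L, data, opt;
    itr = 100, T = 10, c = 1, M = 3,
    alg = :rwmh, method = :subspace)
```

The three return values correspond to the Output list above: the sample chain, the per-sample log probabilities, and the mean weights.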

Autoencoder based Subspace Inference

SubspaceInference.autoencoder_inferenceMethod
autoencoder_inference(model, cost, data, opt, encoder, decoder;
    σ_z = 1.0, σ_m = 1.0, σ_p = 1.0, itr = 1000, T = 25, c = 1, M = 20,
    print_freq = 1, alg = :hmc, backend = :forwarddiff)

Generates uncertainty estimates for machine learning or neural ODE models using auto-encoders to construct the subspace.

Input Arguments

  • model : Machine learning model. Eg: Chain(Dense(10,2)). The model must be built with Chain from Flux
  • cost : Cost function. Eg: L(x, y) = Flux.Losses.mse(m(x), y)
  • data : Inputs and outputs. Eg: X = rand(10,100); Y = rand(2,100); data = DataLoader(X,Y);
  • opt : Optimizer. Eg: opt = ADAM(0.1)
  • encoder : Encoder to generate subspace from NN or Neural ODE parameters
  • decoder : Decoder to generate NN or Neural ODE parameters from subspace

Keyword Arguments

  • callback : Callback function during training. Eg: callback() = @show(L(X,Y))
  • σ_z : Standard deviation of subspace
  • σ_m : Standard deviation of likelihood model
  • σ_p : Standard deviation of prior
  • itr : Number of sampling iterations
  • T : Number of steps for the subspace calculation. Eg: T = 1
  • c : Moment update frequency. Eg: c = 1
  • M : Maximum number of columns in the deviation matrix. Eg: M = 3
  • alg : Sampling Algorithm. Eg: :rwmh
  • backend : Differentiation backend. Eg: :forwarddiff
  • print_freq : Loss printing frequency

Output

  • chn : Chain of samples carrying the uncertainty information
  • lp : Log probabilities of all samples
source
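A minimal sketch of the auto-encoder variant is below. The encoder and decoder architectures are illustrative assumptions (single dense layers mapping between the flattened parameter vector and a 3-dimensional subspace); in practice any Flux models with matching input/output sizes should fit the documented roles.

```julia
using Flux, SubspaceInference
using Flux.Data: DataLoader

# Toy regression data: 10 features, 100 samples, 2 outputs
X = rand(10, 100)
Y = rand(2, 100)
data = DataLoader(X, Y)

m = Chain(Dense(10, 2))
L(x, y) = Flux.Losses.mse(m(x), y)
opt = ADAM(0.1)

# n_θ = number of flattened model parameters; 3 is an
# illustrative subspace dimension
n_θ = sum(length, Flux.params(m))
encoder = Chain(Dense(n_θ, 3))   # parameters → subspace
decoder = Chain(Dense(3, n_θ))   # subspace → parameters

# HMC sampling in the learned subspace
chn, lp = autoencoder_inference(m, L, data, opt, encoder, decoder;
    itr = 100, T = 10, alg = :hmc)
```

Unlike subspace_inference, the Output list above names only the chain and log probabilities, so the call is shown with two return values.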