SubspaceInference.jl
The subspace inference method for deep neural networks (DNNs) and ordinary differential equations (ODEs) is implemented in Julia as the package SubspaceInference.jl.
Subspace Inference using PCA or Diffusion Map
SubspaceInference.subspace_inference — Method

    subspace_inference(model, cost, data, opt;
        σ_z = 1.0, σ_m = 1.0, σ_p = 1.0,
        itr = 1000, T = 25, c = 1, M = 20, print_freq = 1,
        alg = :rwmh, backend = :forwarddiff, method = :subspace)

Generates uncertainty estimates for machine learning models by sampling from the subspace with an MH sampler.
Input Arguments
- model: Machine learning model, e.g. Chain(Dense(10,2)). The model should be created with Flux's Chain.
- cost: Cost function, e.g. L(x, y) = Flux.Losses.mse(m(x), y).
- data: Inputs and outputs, e.g. X = rand(10,100); Y = rand(2,100); data = DataLoader(X,Y).
- opt: Optimizer, e.g. opt = ADAM(0.1).
Keyword Arguments
- callback: Callback function called during training, e.g. callback() = @show(L(X, Y)).
- σ_z: Standard deviation of the subspace.
- σ_m: Standard deviation of the likelihood model.
- σ_p: Standard deviation of the prior.
- itr: Number of sampling iterations.
- T: Number of steps for the subspace calculation, e.g. T = 1.
- c: Moment update frequency, e.g. c = 1.
- M: Maximum number of columns in the deviation matrix, e.g. M = 3.
- alg: Sampling algorithm, e.g. :rwmh.
- backend: Differentiation backend, e.g. :forwarddiff.
- method: Subspace construction method, e.g. :subspace.
- print_freq: Frequency at which the loss is printed.
Output
- chn: Chain of samples carrying the uncertainty information.
- lp: Log probabilities of all samples.
- W_swa: Mean weights.
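Assembling the argument examples above, a minimal end-to-end call might look like the following sketch. It is illustrative only: the toy model, data sizes, and keyword values are taken from the examples in this reference, and the exact DataLoader constructor may differ between Flux versions.

```julia
using Flux, SubspaceInference
using Flux.Data: DataLoader

# Toy model and cost, as in the argument examples above
m = Chain(Dense(10, 2))
L(x, y) = Flux.Losses.mse(m(x), y)

# Random inputs and outputs wrapped in a DataLoader
X = rand(10, 100); Y = rand(2, 100)
data = DataLoader(X, Y)
opt = ADAM(0.1)

# Sample the PCA subspace with the random-walk MH sampler;
# returns the sample chain, log probabilities, and mean weights
chn, lp, W_swa = subspace_inference(m, L, data, opt;
    itr = 100, T = 10, c = 1, M = 3,
    alg = :rwmh, backend = :forwarddiff, method = :subspace)
```

The chain chn can then be pushed through the model to obtain predictive uncertainty bands.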
Autoencoder based Subspace Inference
SubspaceInference.autoencoder_inference — Method

    autoencoder_inference(model, cost, data, opt, encoder, decoder;
        σ_z = 1.0, σ_m = 1.0, σ_p = 1.0,
        itr = 1000, T = 25, c = 1, M = 20, print_freq = 1,
        alg = :hmc, backend = :forwarddiff)

Generates uncertainty estimates for machine learning or neural ODE models using autoencoders.
Input Arguments
- model: Machine learning model, e.g. Chain(Dense(10,2)). The model should be created with Flux's Chain.
- cost: Cost function, e.g. L(x, y) = Flux.Losses.mse(m(x), y).
- data: Inputs and outputs, e.g. X = rand(10,100); Y = rand(2,100); data = DataLoader(X,Y).
- opt: Optimizer, e.g. opt = ADAM(0.1).
- encoder: Encoder that generates the subspace from the NN or neural ODE parameters.
- decoder: Decoder that generates the NN or neural ODE parameters from the subspace.
Keyword Arguments
- callback: Callback function called during training, e.g. callback() = @show(L(X, Y)).
- σ_z: Standard deviation of the subspace.
- σ_m: Standard deviation of the likelihood model.
- σ_p: Standard deviation of the prior.
- itr: Number of sampling iterations.
- T: Number of steps for the subspace calculation, e.g. T = 1.
- c: Moment update frequency, e.g. c = 1.
- M: Maximum number of columns in the deviation matrix, e.g. M = 3.
- alg: Sampling algorithm, e.g. :rwmh.
- backend: Differentiation backend, e.g. :forwarddiff.
- print_freq: Frequency at which the loss is printed.
Output
- chn: Chain of samples carrying the uncertainty information.
- lp: Log probabilities of all samples.
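A hypothetical call with a learned subspace might look like the sketch below. The encoder and decoder shapes are assumptions for illustration: here the encoder maps the flattened parameter vector to a 5-dimensional subspace and the decoder maps it back, but the dimensions required by the package may differ.

```julia
using Flux, SubspaceInference
using Flux.Data: DataLoader

# Toy model, cost, and data as in the argument examples above
m = Chain(Dense(10, 2))
L(x, y) = Flux.Losses.mse(m(x), y)
X = rand(10, 100); Y = rand(2, 100)
data = DataLoader(X, Y)
opt = ADAM(0.1)

# Hypothetical autoencoder over the flattened parameter vector:
# n_params weights compressed to a 5-dimensional subspace
n_params = sum(length, Flux.params(m))
encoder = Chain(Dense(n_params, 5))
decoder = Chain(Dense(5, n_params))

# Sample the learned subspace with HMC
chn, lp = autoencoder_inference(m, L, data, opt, encoder, decoder;
    itr = 100, alg = :hmc, backend = :forwarddiff)
```

In practice the autoencoder would first be trained on snapshots of the model parameters so that the decoder produces plausible weight vectors.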