- Added a new argument `penalty.factor.init` to support a customized penalty factor applied to each coefficient in the initial estimation step. This is useful for incorporating prior information about variable weights, for example, emphasizing specific clinical variables. We thank Xin Wang from the University of Michigan for this feedback [#4].
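A minimal sketch of passing such a penalty factor, assuming the single-step `aenet()` interface and the package's Gaussian data simulator (the argument values here are illustrative):

```r
library("msaenet")

# simulate data: the first 10 coefficients are truly non-zero
dat <- msaenet.sim.gaussian(
  n = 150, p = 500, rho = 0.6,
  coef = rep(1, 10), snr = 2, p.train = 0.7, seed = 42
)

# down-weight the penalty on the first five (e.g. clinical) variables
# so they are less likely to be dropped in the initial estimation step
pf <- rep(1, 500)
pf[1:5] <- 0.1

fit <- aenet(dat$x.tr, dat$y.tr, penalty.factor.init = pf, seed = 42)
```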
- Added a new plot type `type = "dotplot"` in `plot.msaenet()`. This plot offers a direct visualization of the model coefficients at the optimal step.
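For example, assuming `fit` is an object fitted by one of the package's model functions:

```r
# dot plot of the coefficients at the optimal step
plot(fit, type = "dotplot")
```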
- Improved `plot.msaenet()` for extra flexibility: it is now possible to set important properties of the label appearance, such as position, offset, font size, and axis titles, via new arguments.
- Improved the initialization when `init = "ridge"` by using the ridge estimation implementation from `glmnet`. As a benefit, we now have a more aligned baseline for comparing elastic-net based models with MCP-net/SCAD-net based models when `init = "ridge"`.
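A sketch of the aligned comparison this enables, assuming `dat` holds simulated training data and the single-step interfaces `aenet()` (elastic-net) and `amnet()` (MCP-net):

```r
# both models now start from the same glmnet-based ridge initialization
fit.enet <- aenet(dat$x.tr, dat$y.tr, init = "ridge", seed = 42)
fit.mnet <- amnet(dat$x.tr, dat$y.tr, init = "ridge", seed = 42)
```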
- Added a new argument `tune.nsteps` to control the criterion used for selecting the optimal model among all steps (i.e., the optimal step), separately from the criterion used for selecting the optimal model in each step.
- Added a new argument `ebic.gamma.nsteps` to control the EBIC tuning parameter, if `"ebic"` is specified by `tune.nsteps`.
- Renamed the argument `gamma` (the scaling factor for adaptive weights) to `scale` to avoid possible confusion.
- Changed the default value of `gammas` to 3.7 for SCAD-net and 3 for MCP-net.
- Changed the default `family` in all model types.
- Added `msaenet.sim.binomial()`, `msaenet.sim.poisson()`, and `msaenet.sim.cox()` to generate simulation data for logistic, Poisson, and Cox regression models.
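For instance, the Cox simulator can be called like its Gaussian counterpart (the exact defaults may differ; values here are illustrative):

```r
# simulate survival data with the first 10 coefficients non-zero
dat.cox <- msaenet.sim.cox(
  n = 150, p = 500, rho = 0.6,
  coef = rep(1, 10), snr = 2, p.train = 0.7, seed = 42
)
# dat.cox$x.tr / dat.cox$y.tr hold the training design matrix and
# the survival outcome, for fitting a model with family = "cox"
```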
- Added `msaenet.fn()` for computing the number of false negative selections in msaenet models.
- Added `msaenet.mse()` for computing the mean squared error (MSE).
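Both metrics can be sketched as follows, assuming `fit` was trained on simulated data `dat` in which the first 10 variables are truly non-zero:

```r
# false negatives: true non-zero variables the model failed to select
msaenet.fn(fit, true.idx = 1:10)

# mean squared prediction error on the held-out test set
msaenet.mse(fit, dat$x.te, dat$y.te)
```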
- Improved the performance of `msaenet.sim.gaussian()` with more vectorization when generating correlation matrices.
- Added the arguments `max.iter` and `epsilon` for MCP-net and SCAD-net related functions to allow finer control over the convergence criterion. By default, `max.iter = 10000` and `epsilon = 1e-4`.
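For example, a stricter setting than the defaults might look like this (assuming the MCP-net function `amnet()` and training data `dat` as above):

```r
# trade longer runtime for a tighter convergence criterion
fit.mnet <- amnet(dat$x.tr, dat$y.tr,
                  max.iter = 50000, epsilon = 1e-6, seed = 42)
```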
- Added `msaenet.nzv.all()` for displaying the indices of non-zero variables in all adaptive estimation steps.
- Added a `coef` method for extracting model coefficients.
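The two accessors can be used together to inspect a fitted model, assuming `fit` is a multi-step adaptive estimation fit:

```r
# indices of non-zero variables in every adaptive estimation step
msaenet.nzv.all(fit)

# coefficients of the model at the optimal step
coef(fit)
```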