Dataset with both the annotations and the class probabilities estimated in the previous step, used for obtaining the soft frequency matrices.
Obtains the likelihood for each example given a class (grouping keys)
Annotations with logistic prediction information for EStep
Combinations of annotators and classes, so that the frequency calculation takes all combinations into account.
Dataset with the soft frequency of (annotator, c) in the annotations dataset. Represents the denominator in the corresponding element of the precision matrices.
Dataset with the soft frequency of (annotator, c, k) in the annotations dataset. Represents the numerator in the corresponding element of the precision matrices.
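Together these two soft frequencies yield the precision (confusion) matrices: the smoothed ratio of the (annotator, c, k) count to the (annotator, c) count. A minimal sketch in Python (the library itself is Scala/Spark; all names here are illustrative, and `prior` stands for the Dirichlet smoothing hyperparameter):

```python
from collections import defaultdict

def confusion_matrices(annotations, posteriors, n_classes, prior=1.0):
    """Soft confusion matrices pi[j][c][k] = P(annotator j says k | true class c).

    annotations: iterable of (example, annotator, label) triples.
    posteriors:  posteriors[example][c] = P(true class = c), from the E step.
    prior:       symmetric Dirichlet hyperparameter used for smoothing.
    """
    num = defaultdict(float)  # soft frequency of (annotator, c, k): the numerator
    den = defaultdict(float)  # soft frequency of (annotator, c): the denominator
    annotators = set()
    for ex, j, k in annotations:
        annotators.add(j)
        for c in range(n_classes):
            num[(j, c, k)] += posteriors[ex][c]
            den[(j, c)] += posteriors[ex][c]
    return {
        j: [[(num[(j, c, k)] + prior) / (den[(j, c)] + n_classes * prior)
             for k in range(n_classes)]
            for c in range(n_classes)]
        for j in annotators
    }
```

Each row of each annotator's matrix sums to one, so it can be read directly as a conditional distribution over the annotator's answers.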
Obtains the soft frequency of appearance of the key (j,c)
EStep estimation point with information about annotation probability and the logistic prediction.
Obtains the soft frequency of appearance of the key (j,c,k)
Likelihood estimation point with the annotation likelihood as well as the true class estimation from the E step.
Logistic Annotator params for the LogisticParams aggregator
Logistic prediction for the full multiclass problem
Obtains the soft frequency of appearance of the key (j,c)
Aggregation of annotator parameters for each example in the one vs all approach for logistic regression.
Logistic prediction for the one vs all approach
Mu estimate with logistic params for the example
Normalizer for logistic predictions
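In the one-vs-all approach each class gets its own sigmoid score, so the scores do not sum to one and must be renormalized into a probability vector. A short sketch of that normalizer (illustrative Python; the library works on Spark Datasets):

```python
import math

def sigmoid(z):
    """Standard logistic sigmoid."""
    return 1.0 / (1.0 + math.exp(-z))

def normalized_prediction(scores):
    """Turn per-class one-vs-all sigmoid outputs into a probability vector."""
    probs = [sigmoid(s) for s in scores]
    total = sum(probs)  # the normalizer for the logistic predictions
    return [p / total for p in probs]
```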
Computes the gradient for the SGD algorithm
Partial object that carries data from one step to another.
Computes the updater for the SGD algorithm. Adds the regularization priors.
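For one binary (one-vs-all) subproblem, the gradient and a descent update with a weight prior might look like the following sketch (plain Python with full-batch descent and an L2 penalty standing in for the Gaussian weight prior; the library's actual SGD updater may differ):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def gradient(w, X, targets):
    """Gradient of the negative log-likelihood for logistic regression.

    X: list of feature vectors; targets: soft labels in [0, 1]
    (the mu estimates from the E step play the role of targets).
    """
    d = len(w)
    g = [0.0] * d
    for x, t in zip(X, targets):
        err = sigmoid(sum(wi * xi for wi, xi in zip(w, x))) - t
        for i in range(d):
            g[i] += err * x[i]
    return g

def update(w, g, learning_rate, weight_prior):
    """One descent step; weight_prior adds the regularization term."""
    return [wi - learning_rate * (gi + weight_prior * wi)
            for wi, gi in zip(w, g)]
```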
Applies the learning algorithm
the dataset with feature vectors.
the dataset with the annotations.
maximum number of iterations for the GradientDescent algorithm
threshold on the log-likelihood change for the gradient descent algorithm
learning rate for the gradient descent algorithm
prior (Dirichlet distribution hyperparameters) for the estimation of the probability that an annotator assigns a class given the true class
prior for the weights of the logistic regression model
Computes the negative likelihood of a point (loss)
Computes the logistic function for a data point
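These two helpers correspond to the standard logistic-regression point loss. A sketch (illustrative Python; the soft label `t` comes from the E step):

```python
import math

def logistic(w, x):
    """Logistic function for a data point: P(y = 1 | x, w)."""
    return 1.0 / (1.0 + math.exp(-sum(wi * xi for wi, xi in zip(w, x))))

def point_loss(w, x, t):
    """Negative log-likelihood (loss) of one point with soft label t in [0, 1]."""
    p = logistic(w, x)
    return -(t * math.log(p) + (1.0 - t) * math.log(1.0 - p))
```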
E Step of the EM algorithm.
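The E step combines the logistic prediction (acting as a prior over the true class of an example) with the annotator likelihoods from the confusion matrices to obtain a posterior per example. A sketch for a single example (names illustrative):

```python
def e_step(prior_probs, labels, pi):
    """Posterior over the true class of one example.

    prior_probs: logistic prediction, prior_probs[c] = P(class c | x).
    labels:      list of (annotator, label) pairs for this example.
    pi:          pi[j][c][k] = P(annotator j answers k | true class c).
    """
    n_classes = len(prior_probs)
    post = []
    for c in range(n_classes):
        p = prior_probs[c]
        for j, k in labels:
            p *= pi[j][c][k]  # multiply in each annotation's likelihood
        post.append(p)
    z = sum(post)  # normalize to a distribution
    return [p / z for p in post]
```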
Initializes the parameters. The first ground truth estimation is done using the majority voting algorithm.
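That initial estimate can be obtained by counting votes per example and normalizing. A minimal sketch (illustrative; the library performs this over Spark Datasets):

```python
from collections import Counter

def majority_voting(annotations, n_classes):
    """First ground-truth estimate: the distribution of votes per example."""
    votes = {}
    for ex, _annotator, label in annotations:
        votes.setdefault(ex, Counter())[label] += 1
    return {
        ex: [counts[c] / sum(counts.values()) for c in range(n_classes)]
        for ex, counts in votes.items()
    }
```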
Obtains the likelihood of the partial model.
M Step of the EM algorithm.
Matrix multiplication (TODO: improve using libraries)
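The TODO refers to a hand-rolled multiplication along these lines, which a linear-algebra library (e.g. Breeze, on the Scala side) would replace:

```python
def mat_mul(a, b):
    """Naive matrix multiplication: a is n x m, b is m x p."""
    n, m, p = len(a), len(b), len(b[0])
    return [[sum(a[i][k] * b[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]
```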
Step of the iterative algorithm
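One step of the iterative algorithm chains the E step, the M step, and the likelihood computation; the outer loop repeats it until the log-likelihood change falls below the threshold or the maximum number of iterations is reached. A schematic driver, with a hypothetical `step` function standing in for the pieces described above:

```python
def em_loop(step, init, max_iterations, threshold):
    """Run `step` (one E+M iteration returning (state, log_likelihood))
    until the likelihood improvement drops below `threshold`."""
    state, prev_ll = step(init)
    for _ in range(max_iterations - 1):
        state, ll = step(state)
        if abs(ll - prev_ll) < threshold:
            break  # converged: likelihood variability is below the threshold
        prev_ll = ll
    return state
```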
Provides functions for transforming an annotation dataset into a standard label dataset using the Raykar algorithm for multiclass problems.
This algorithm only works with com.enriquegrodrigo.spark.crowd.types.MulticlassAnnotation annotation datasets.
Raykar, Vikas C., et al. "Learning from crowds." Journal of Machine Learning Research 11.Apr (2010): 1297-1322.