Tags

tag description
2-dimensional classification or regression on two variables
algo non-trivial algorithm implemented
analysis some extra analysis done, not just giving the test results
bagging bagging/bootstrapping used
baseline baseline solution
bernoulli Bernoulli (naive Bayes) model used
better-than-no-model-baseline significantly better than stupid, no-model baseline (e.g. returning the majority class)
bigram bigrams considered
bilstm BiLSTM model used
bpe text segmented into BPE subword units
c++ written (partially or fully) in C++
casemarker special handling of letter case
character-level character-level model or features used
char-n-grams character n-grams
chi-square chi-square test used
cnn Convolutional Neural Network
crm-114 CRM-114 used
decision-tree decision tree used
existing some existing solution used
fasttext fastText used
feature-engineering used more advanced pre-processing, feature engineering etc.
frage FRAGE used
graph an extra graph/plot included
hashing-trick hashing trick used
haskell written (partially or fully) in Haskell
improvement existing solution modified and improved as measured by the main metric
inverted inverted
java written (partially or fully) in Java
kenlm KenLM used
k-means k-means or its variant used
knn k nearest neighbors
knowledge-based some external source of knowledge used
left-to-right model working only from left to right
lemmatization lemmatization used
linear-regression linear regression used
lisp written (partially or fully) in Lisp
lm a language model used
logistic-regression logistic regression used
lstm LSTM network
marian Marian NMT used
mert MERT (or equivalent) for Moses
moses Moses MT
multidimensional classifier or regression on many variables
multinomial multinomial (naive Bayes) model used
naive-bayes Naive Bayes Classifier used
neural-network neural network used
new-leader significantly better than the current top result
n-grams n-grams used
no-model-baseline significantly better than stupid, no-model baseline (e.g. returning the majority class)
non-zero non-zero value for the metric
null-model null model baseline
perl written (partially or fully) in Perl
probabilities return probabilities not just classes
python written (partially or fully) in Python 2/3
r written (partially or fully) in R
random-forest Random Forest used
ready-made Machine Learning framework/library/toolkit used, algorithm was not implemented by the submitter
regexp handcrafted regular expressions used
regularization some regularization used
right-to-left model working from right to left
rnn Recurrent Neural Network
ruby written (partially or fully) in Ruby
rule-based rule-based solution
scala written (partially or fully) in Scala
scikit-learn scikit-learn used
self-made algorithm implemented by the submitter, no framework used
simple simple solution
stemming stemming used
stupid simple, stupid rule-based solution
temporal temporal information taken into account
tf term frequency used
tf-idf tf-idf used
torch (py)torch used
transformer Transformer model used
trigram trigrams considered
umz-2019-challenge see https://eduwiki.wmi.amu.edu.pl/pms/19umz#Dodatkowe_punkty_za_wygranie_wyzwa.2BAUQ-
unigram only unigrams considered
vowpal-wabbit Vowpal Wabbit used
word2vec Word2Vec used
word-level word-level model or features used
wordnet some WordNet used
xgboost XGBoost used