1. What is a classifier? (check all that are correct)
A classifier is a program that is used to form groups (classes) of students and create their timetable at various types of schools.
A classifier is a data-driven decision model: based on so-called training data (e.g. historical data) and evidence, the classifier
aims to make predictions/decisions (usually in the form of whether or not an instance belongs to a particular group).
A classifier is a model that is able to solve various recognition tasks.
A classifier is a machine learning approach to group instances of a dataset into clusters, i.e., to find groups of instances so that
similar instances are in the same group and different instances are in different groups.
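
A minimal sketch of the "learn from training data, then decide" idea behind the second option, assuming scikit-learn is available (the toy numbers are made up):

    # Fit a classifier on training data, then predict the group of a new instance.
    from sklearn.neighbors import KNeighborsClassifier

    X_train = [[170, 60], [180, 85], [160, 50], [175, 80]]  # hypothetical training data
    y_train = [0, 1, 0, 1]                                  # known group labels

    clf = KNeighborsClassifier(n_neighbors=3)
    clf.fit(X_train, y_train)        # learn from the historical data
    print(clf.predict([[172, 70]]))  # decide which group the new instance belongs to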

2. What is a hub in the context of hubness-aware machine learning? (check all that are correct)
An instance that appears frequently as one of the k-nearest neighbours of other instances.
An instance with high in-degree in the nearest neighbour graph.
An important web service with many users from various geographical locations and/or with different cultural background.
A Facebook-user having at least 5000 friends.
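
A hedged note: the first two options describe the same quantity, the k-occurrence count N_k(x), i.e. the in-degree of x in the k-nearest-neighbour graph. A minimal numpy sketch on hypothetical toy data (all names are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 50))  # 200 instances, 50 dimensions
    k = 5

    # Pairwise Euclidean distances; exclude self-matches via the diagonal.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)

    # Indices of each instance's k nearest neighbours.
    knn = np.argsort(D, axis=1)[:, :k]

    # N_k(x): how often x occurs among the k nearest neighbours of other instances.
    Nk = np.bincount(knn.ravel(), minlength=len(X))
    print("biggest hubs:", np.argsort(Nk)[-3:])  # instances with the highest in-degree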

3. What is hubness? (check all that are correct)
Hubness refers to the hierarchical structure of clusters in biomedical networks.
Hubness is a property of the k-Means clustering algorithm that makes it prefer clusters of globular shape.
Hubness is the curse of dimensionality.
With hubness, we mean that the distribution of k-nearest neighbour occurrences is remarkably skewed, i.e., it is substantially different from a uniform or a (symmetric) Gaussian distribution.
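
Continuing the sketch under question 2: the skewness named in the last option is the third standardized moment of the N_k distribution, and a clearly positive value signals hubness. Raising the dimensionality of the toy data (cf. question 4) typically increases it.

    # Skewness (third standardized moment) of the k-occurrence distribution.
    S = np.mean(((Nk - Nk.mean()) / Nk.std()) ** 3)
    print(f"skewness of N_k: {S:.2f}")  # substantially > 0 indicates hubness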

4. The presence of hubs is related to ... (check all that are correct)
... the dimensionality of the data.
... the number of instances in the dataset.
... the domain and recognition task associated with the data.
... the compression method used when transmitting the data from one computer to another on the internet.

5. Hubness-aware classifiers aim at ... (check all that are correct)
... reducing the dimensionality of the data.
... speeding up the classification via instance selection (i.e. selection of good hubs).
... increasing the accuracy by reducing the influence of bad hubs.
... identifying support vectors and building decision trees efficiently.

6. What is a good neighbour in the context of hubness-aware machine learning? (check all that are correct)
A good neighbour helps you with shopping and drinks coffee with you in the afternoon.
A good neighbour is discreet and therefore does not tell your secrets to other persons.
A good neighbour of instance x is another instance that (i) is one of the k-nearest neighbours of x and
(ii) has the same class label as x.
All the instances of the dataset are good neighbours of each other.
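
Sticking with the toy data from question 2, good and bad occurrences split N_k(x) by label agreement; y is a hypothetical label vector added for illustration:

    # Count good (same label) and bad (different label) k-occurrences.
    y = rng.integers(0, 2, size=len(X))  # made-up binary class labels

    GNk = np.zeros(len(X), dtype=int)
    BNk = np.zeros(len(X), dtype=int)
    for i, neighbours in enumerate(knn):
        for j in neighbours:
            if y[j] == y[i]:
                GNk[j] += 1  # j is a good neighbour of i
            else:
                BNk[j] += 1  # j is a bad neighbour of i
    assert np.all(GNk + BNk == Nk)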

7. Consider the weighted k-nearest neighbour classifier with k = 4.
We aim to classify instance x'. The four nearest neighbours of x' have the following weights and class labels:
No. | Weight | Label
----+--------+------
 1  |  0.15  |   +
 2  |  0.20  |   +
 3  |  0.40  |   -
 4  |  0.10  |   +
The weighted nearest neighbour classifier will ... (check all that are correct)
... classify this instance as negative because the third nearest neighbour has the highest weight and its class label is negative.
... classify this instance as positive because three out of the four neighbours belong to the positive class while one of the neighbours belongs to the negative class.
... output an error because the sum of all the weights is not equal to one.
... classify this instance as positive because the sum of the weights for the positive class is higher than the sum of the weights for the negative class.
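
The decision in question 7 reduces to summing the weights per class and comparing the sums; the weights do not need to sum to one. A quick check:

    weights = [0.15, 0.20, 0.40, 0.10]
    labels  = ['+', '+', '-', '+']

    pos = sum(w for w, c in zip(weights, labels) if c == '+')  # 0.45
    neg = sum(w for w, c in zip(weights, labels) if c == '-')  # 0.40
    print('+' if pos > neg else '-')  # '+': 0.45 > 0.40, so positive wins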

8. In the case of HWKNN, the weights of the instances are determined based on ... (check all that are correct)
... their normalized bad hubness scores.
... the skewness (third standardized moment) of the distribution of neighbour occurrences.
... the selected instances (good hubs).
... the a priori probability of the classes.
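
A sketch of one common hw-kNN weighting (following Radovanović et al.; the lecture's exact normalization may differ): standardize the bad-occurrence counts BNk from the sketch under question 6, then let the voting weight decay exponentially with bad hubness.

    # Standardized ("normalized") bad hubness and the resulting instance weights.
    h_b = (BNk - BNk.mean()) / (BNk.std() + 1e-12)  # guard against division by zero
    w = np.exp(-h_b)  # frequent bad neighbours get small voting weights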

9. In the case of HFNN ... (check all that are correct)
... neighbours vote with their own class labels (just as in the case of HWKNN).
... fuzzy labels are derived from the neighbour occurrence profiles and these new fuzzy labels are used while classifying new instances.
... new fuzzy instances (so called pseudo-instances) are created.
... the classifier has many more hyperparameters than in the case of HWKNN.
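
A sketch of the h-FNN idea, assuming the fuzzy labels are smoothed relative class-conditional occurrence counts (the smoothing constant lam is hypothetical; the lecture's exact scheme may differ):

    # N_{k,c}(x): how often x occurs as a neighbour of instances of class c.
    n_classes = 2
    Nkc = np.zeros((len(X), n_classes))
    for i, neighbours in enumerate(knn):
        for j in neighbours:
            Nkc[j, y[i]] += 1

    # Fuzzy label of x: its smoothed occurrence profile across the classes.
    lam = 1.0
    u = (Nkc + lam) / (Nkc.sum(axis=1, keepdims=True) + n_classes * lam)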

10. Regression is ... (check all that are correct)
... prediction on a continuous (numeric) scale.
... just another name for classification.
... only possible if neural networks are available.
... a prominent machine learning technique.
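
For contrast with classification, a minimal (unweighted) kNN regression sketch on the toy data from question 2: the prediction is the average of the neighbours' numeric target values.

    # kNN regression: average the neighbours' continuous targets.
    t = X[:, 0] ** 2 + rng.normal(scale=0.1, size=len(X))  # made-up numeric target
    t_pred = t[knn].mean(axis=1)  # one prediction per instance from its k neighbours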

11. The following hubness-aware regression techniques have been presented in the lecture: (check all that are correct)
EWKNN
ECKNN
EWCKNN
EGKNN

12. Based on the results reported in the scientific literature, hubness-aware machine learning techniques may work well ... (check all that are correct)
... to classify noisy data.
... to classify time series.
... for clustering.
... for regression.