Over the years, numerous approaches have been applied in LTR settings. Broadly, most of these approaches can be classified into four categories:
- Label Aggregation
- Constraint Optimization
- Model Fusion/Aggregation
- Lexicographic
Now, let us define each of these approaches.
Label Aggregation
In this method, we combine all objective functions to form a single objective function, which is then minimized. Once we convert the multi-objective problem into a single objective by combining the labels, we solve the given LTR problem as a standard single-objective problem. Let's say, for a given query $q$ and products $P$, we have two different labels $y^{(1)}$ and $y^{(2)}$ (for example, a relevance label and an engagement label). We can aggregate them into a single label:

$$y = \alpha\, y^{(1)} + (1 - \alpha)\, y^{(2)}$$

Note: Here, $\alpha \in [0, 1]$ controls the relative importance of the two labels.

In general, given $k$ objectives $f_1, \ldots, f_k$, the ranking problem looks like below:

Minimise

$$\sum_{i=1}^{k} w_i\, f_i(x)$$

Subject to

$$w_i \geq 0, \quad \sum_{i=1}^{k} w_i = 1$$
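As an illustration, here is a minimal sketch of label aggregation in Python. The label names, values, and $\alpha = 0.7$ are hypothetical choices for the example, not part of any specific system.

```python
import numpy as np

def aggregate_labels(y_relevance, y_engagement, alpha=0.7):
    """Combine two per-product label vectors into one label via a convex combination.

    alpha weights the relevance label; (1 - alpha) weights the engagement label.
    Both labels are min-max normalized first so they live on the same scale.
    """
    def _normalize(y):
        y = np.asarray(y, dtype=float)
        span = y.max() - y.min()
        return (y - y.min()) / span if span > 0 else np.zeros_like(y)

    return alpha * _normalize(y_relevance) + (1.0 - alpha) * _normalize(y_engagement)

# Hypothetical labels for the products returned for one query.
y_relevance = np.array([3, 2, 0, 1])            # graded relevance judgments
y_engagement = np.array([0.9, 0.1, 0.4, 0.2])   # e.g., click-through rates

y_combined = aggregate_labels(y_relevance, y_engagement, alpha=0.7)
# y_combined can now be fed to any standard single-objective LTR trainer.
```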
Advantages
- It gives a clear interpretation of the multi-objective function and generalizes easily to any number of objectives.
- It allows multiple parameters to be set to reflect preferences.
Disadvantages
- It tries to optimize for each objective function, which can be computationally expensive.
- The setting of the weight parameters $w_i$ is not intuitively clear when only one solution point is desired.
Constraint Optimization
This method optimizes the single most important objective, while the remaining objectives are converted into constraints with user-specified upper bounds (the ε-constraint method).
In general, given $k$ objectives, the ranking problem is as follows:

Minimise

$$f_j(x)$$

Subject to

$$f_i(x) \leq \epsilon_i, \quad i = 1, \ldots, k; \ i \neq j$$

where $f_j$ is the most important objective and $\epsilon_i$ is the upper bound allowed for each of the other objectives.
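A minimal sketch of this idea using scipy.optimize.minimize is shown below. The two quadratic functions are toy stand-ins for real ranking losses, and the bound $\epsilon$ is a hypothetical value.

```python
import numpy as np
from scipy.optimize import minimize

# Two hypothetical objectives over model parameters w:
# f1 = primary (e.g., relevance) loss, f2 = secondary (e.g., revenue) loss.
def f1(w):
    return np.sum((w - np.array([1.0, 2.0])) ** 2)

def f2(w):
    return np.sum((w - np.array([3.0, 0.0])) ** 2)

epsilon = 5.0  # upper bound allowed on the secondary objective

# Minimize the primary objective while keeping f2(w) <= epsilon.
result = minimize(
    f1,
    x0=np.zeros(2),
    constraints=[{"type": "ineq", "fun": lambda w: epsilon - f2(w)}],
)
print(result.x, f1(result.x), f2(result.x))
```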
Advantages
- It focuses on a single objective with limits on others.
- It always provides a weakly Pareto optimal point, assuming that the formulation gives a solution.
- It is not necessary to normalize the objective functions.
Disadvantages
- The optimization problem may be infeasible if the bounds on the objective functions are not appropriate.
Model Fusion/Aggregation
This method is an aggregation of multiple independently trained ranking models. The final ranking score is obtained by a convex combination of the scores of the individual models. As we saw earlier, if we have two objectives, say $f_1$ and $f_2$, we train one model per objective and combine their scores as $s(x) = w\, s_1(x) + (1 - w)\, s_2(x)$.

In general, given $k$ objectives, the ranking function looks as follows:

$$s(x) = \sum_{i=1}^{k} w_i\, s_i(x)$$

where $s_i(x)$ is the score of the model trained for the $i$-th objective, $w_i \geq 0$, and $\sum_{i=1}^{k} w_i = 1$.
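Here is a minimal sketch of score fusion, assuming hypothetical score vectors from a relevance model and a revenue model for the same set of products.

```python
import numpy as np

def fuse_scores(model_scores, weights):
    """Convex combination of per-model score vectors.

    model_scores: list of arrays, one score per product from each model.
    weights: non-negative weights summing to 1, one per model.
    """
    weights = np.asarray(weights, dtype=float)
    assert np.all(weights >= 0) and np.isclose(weights.sum(), 1.0)
    return sum(w * np.asarray(s, dtype=float) for w, s in zip(weights, model_scores))

# Hypothetical scores for four products from two independently trained models.
relevance_scores = np.array([0.9, 0.7, 0.3, 0.1])
revenue_scores = np.array([0.2, 0.8, 0.6, 0.4])

final_scores = fuse_scores([relevance_scores, revenue_scores], weights=[0.6, 0.4])
ranking = np.argsort(-final_scores)  # product indices, best first
```

Because the fusion happens after each model has been trained, the weights can be tuned (or re-tuned) without retraining any model.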
Advantages
- It is applied as a post-ranking step, so the weighting parameters are easy to tweak without retraining.
- Learning for one single objective will not be affected by other objectives (decoupled objectives).
Disadvantages
- It’s difficult to find the optimal weight for the final ranking.
Lexicographic
When we have more than one objective, we order the objective functions according to their importance and optimize them one at a time. As mentioned earlier, if we have two objective functions, say $f_1$ (more important) and $f_2$ (less important), we first minimize $f_1$, and then minimize $f_2$ while constraining $f_1$ to stay at its optimal value.
In general, given $k$ objectives, the rank function looks like below (one problem is solved per objective, $i = 1, \ldots, k$):

Minimise

$$f_i(x)$$

Subject to

$$f_j(x) \leq f_j(x_j^*), \quad j = 1, \ldots, i - 1; \quad i > 1$$

Here, we rank the objective functions based on their importance, $f_1$ being the most important, and $x_j^*$ denotes the optimum found when the $j$-th objective was minimized.
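A minimal sketch of lexicographic optimization with scipy is shown below, using two toy quadratic objectives in decreasing order of importance; a small tolerance keeps each earlier objective at (approximately) its optimum in later stages.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical objectives, listed in decreasing order of importance.
objectives = [
    lambda w: np.sum((w - np.array([1.0, 2.0])) ** 2),  # most important
    lambda w: np.sum((w - np.array([3.0, 0.0])) ** 2),  # least important
]

w = np.zeros(2)
constraints = []
for f in objectives:
    # Minimize the current objective while earlier objectives keep their optima.
    result = minimize(f, x0=w, constraints=constraints)
    w = result.x
    best = f(w)
    # Freeze this objective (with a small tolerance) for the next stage.
    constraints = constraints + [
        {"type": "ineq", "fun": lambda v, f=f, best=best: best + 1e-6 - f(v)}
    ]

print(w)  # solution after all objectives have been processed in order
```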
Advantages
- Preferences are specified in a simple, unambiguous way: objectives are ordered by importance, with no weights to tune.
- It does not require that the objective functions be normalized.
Disadvantages
- It requires that additional constraints be imposed.
- It is computationally heavy when there are many objectives, since one optimization problem must be solved per objective.