
## Decision-maker’s profiles and provided preferential information

First, we define the different decision-makers’ profiles that we may encounter. Then, we list the different kinds of preferential information they can provide. Finally, we offer some critical views on the credibility we should grant to this information.

### Decision-makers’ profiles

Three decision-makers’ profiles are usually considered, according to their knowledge of the considered decision domain and their habits in making such decisions:

– The naive decision-maker: He does not have any particular knowledge about the considered decision domain and is not used to making such decisions.

– The novice decision-maker: He has a fairly good knowledge of the decision domain but is confronted with an unusual decision.

– The expert decision-maker: He is an expert in the decision domain and is used to making such decisions [Sha88].

A naive decision-maker is willing to make a decision in an unusual domain in which he is not knowledgeable. Although he is hardly considered in the literature, he is the most common decision-maker in day-to-day situations. For instance, consider someone willing to buy his very first computer, who has no idea which model he should consider a good one. The naive decision-maker has no particular knowledge about the decision domain and is able neither to properly configure any parametric decision method nor to express reliable preferential information. Most of the time, he may randomly select one option or focus on the very few criteria he understands (for instance, he takes the first computer, the cheapest, the prettiest, or the one he saw on television). In the best case, his choice will be strongly supported by the recommendations of some experts: a computer magazine comparing different models and selecting the best ones according to different profiles of users (i.e. a normative approach), or the advice of a sales assistant (i.e. a delegated decision).

The novice decision-maker is an expert in, or has at least a good knowledge of, the considered decision domain, but he faces an unusual decision. A simple example is a man fond of cars who wants to buy a new one after many years. He is particularly at ease talking about the domain and can easily and justifiably compare two different alternatives, but his understanding of the different points of view is mainly implicit, so that he is uncomfortable determining the impact of each criterion on the global evaluation. For instance, considering the car example again, such a novice decision-maker will probably be able to state the overall relation between some pairs of cars, but will not automatically express the relative importance of the criterion Number of seats compared to the criterion Engine power.
Notice that, when dealing with a novice decision-maker, a constructive approach seems appropriate, as it highlights his preferences and makes them explicit.

#### Quality of the expressed preferential information

To better describe the “quality” of the expressed information, we should make a clear distinction between precise information and accurate information:

Definition 2.1 (Preciseness) A piece of information, given by the decision-maker, is said to be precise when it constrains the value of one parameter, or the ratio between some parameters, to a reduced interval (the interval may be reduced to a unique value). Imprecise information is then a less restrictive constraint.

Definition 2.2 (Accuracy) A piece of information is said to be accurate when it can be considered in total accordance with the decision-maker’s mind. On the contrary, inaccurate information goes against the decision-maker’s thoughts.

An example of precise information is the association of a criterion weight with a unique value (e.g. “the weight associated with criterion i is 0.2”), or the statement that two criteria must be associated with the same weight. An example of accurate information is the clear statement by the decision-maker that one criterion is more important than another, without any precision on their relative importance degree. Notice that this last piece of information is accurate but also imprecise. Also notice that information can be precise yet inaccurate: for instance, when a decision-maker is asked to give a precise value for a criterion weight, he may not be totally confident about the expressed value.
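Definition 2.1 can be illustrated by viewing each piece of information as an interval constraint on a parameter. The sketch below is only illustrative; the tolerance threshold and the function name are our own assumptions, not part of the formal definitions.

```python
def is_precise(interval, tol=0.05):
    """Information constraining a parameter to an interval is all the
    more precise as the interval is narrow; an exact value is the
    limit case of zero width. The tolerance is a hypothetical cutoff
    chosen here purely for illustration."""
    lo, hi = interval
    return hi - lo <= tol

# Precise: "the weight associated with criterion i is 0.2".
precise_info = (0.2, 0.2)
# Imprecise: "the weight of criterion i lies between 0.1 and 0.5".
imprecise_info = (0.1, 0.5)
```

Accuracy, by contrast, cannot be checked mechanically: it is a matter of agreement with the decision-maker’s mind, so a wide interval may be perfectly accurate while a pinpoint value is not.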

In a quite intuitive manner, one can conceive that the more precise the information is, the more questionable its accuracy becomes. In that case, when modeling a decision problem, it is appropriate to consider that only an expert decision-maker is comfortable expressing precise and accurate preferential information, due to his experience in the domain in which he usually makes this particular decision. For a novice decision-maker, the expression of precise preferential information on the parameters may appear quite arbitrary. Indeed, he may be able to provide an accurate partial preorder on the criteria (for instance, when comparing some cars, the fact that the color is less important than the security), but he probably cannot express the exact relative importance between two criteria (for instance, that the security is three times more important than the color in the decision). Asking for such precise, but inaccurate, input-oriented information may result in the setting of a method that does not reflect the decision-maker’s expectations. In consequence, it seems more advisable to focus on less precise information, but with incontestable accuracy (i.e. a stronger support from the decision-maker).

**Setting up an iterative preference elicitation process**

In [Mou05], Mousseau defines the preference elicitation process as a “process that goes through an interaction between the decision-maker and the analyst (or a software) and leads the decision-maker to express preference information within the framework of a selected mcap”. Preference elicitation is the part of the decision aid process that allows the construction of the evaluation model. Notice that it requires the explicit use of an aggregation procedure. Hence, the mcap has to be selected before the preference elicitation process and should not be modified, nor questioned, during the process, unless the preference elicitation is restarted. In this section, we show how it is possible to implement an iterative process for preference elicitation, based on a constructive approach, by first defining the elicitation process in a formal way and then studying the behavior of the decision-maker in such a process.

This section sums up to a large extent the work of Mousseau on preference elicitation. For a more detailed discussion, we refer to [Mou03, Mou05].

**Principles of an iterative preference elicitation approach**

Let us consider an mcap P, as well as a set A of alternatives evaluated on a coherent family F of criteria. An aggregation approach consists in inferring, from a set of input-oriented preference information only, a compatible set of parameters for P that allows the construction of an evaluation model, i.e. the construction of a binary preference relation between the alternatives. The framework of such an aggregation approach has been defined as follows by Mousseau [Mou03]:

– Defining the set A of alternatives.

– Defining the coherent family F of criteria.

– Selecting a multicriteria aggregation procedure P.

– Setting values for the parameters of P.

– Constructing the global preferences by application of P.

– Analysing the sensitivity of the preference relation in order to express recommendations.
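The core of steps 4 and 5 can be sketched in a few lines. As a minimal illustration only, we take the weighted sum as the aggregation procedure P; the alternative names and evaluations below are hypothetical, and a real mcap would typically be an outranking or value-function procedure as discussed in chapter 1.

```python
def weighted_sum(evaluation, weights):
    # One possible aggregation procedure P: the weighted sum of the
    # evaluations of an alternative on the criteria.
    return sum(w * v for w, v in zip(weights, evaluation))

def global_preferences(alternatives, weights, aggregate=weighted_sum):
    """Step 5: construct the binary preference relation over A by
    applying P with the parameter values set in step 4. Here x is
    preferred to y when its aggregated score is strictly higher."""
    scores = {a: aggregate(evals, weights) for a, evals in alternatives.items()}
    return {(x, y) for x in scores for y in scores
            if x != y and scores[x] > scores[y]}

# Hypothetical set A, evaluated on two criteria scaled to [0, 1].
A = {"x": (0.7, 0.9), "y": (0.8, 0.4)}
# Step 4: parameter values (the criteria weights) set by the analyst.
relation = global_preferences(A, weights=(0.4, 0.6))
```

Step 6 would then study how this relation changes when the weights vary, which is precisely where the robustness concerns of section 2.3 arise.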

Let us notice that this framework does not necessarily define a sequential process, as we may observe some steps back, in order to refine, for instance, the sets of alternatives or criteria, or to test different values for the parameters. In such a case, the analyst must ask the decision-maker a large number of questions for a correct tuning of the parameters. In order to determine the trade-off between the criteria, for instance, he may ask questions like “How much do we have to increase the evaluation xi of an alternative x on criterion i in order to compensate a loss of 1 unit on the evaluation xj on criterion j?”. The ahp method [Saa80] proposes to determine the parameters by asking for the relative importance between the criteria (for instance, criterion i is 3 times more important than criterion j). It is commonly stated that this approach requires a sensitivity analysis of its results, due to the possible impreciseness of the parameters and their effective impact on the mcap result. We will discuss this point in section 2.3.
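To make the ahp idea concrete, the sketch below derives priority weights from a pairwise comparison matrix using the row geometric-mean method, a common approximation of ahp’s principal-eigenvector computation [Saa80]. The judgement values in the matrix are hypothetical, and note that such precise ratio judgements are exactly the kind of input whose accuracy the previous section questions.

```python
import math

def ahp_weights(M):
    """Approximate AHP priority weights from a pairwise comparison
    matrix M, where M[i][j] states how many times criterion i is
    judged more important than criterion j (so M[j][i] = 1/M[i][j]).
    Uses the row geometric-mean method and normalises to sum to 1."""
    n = len(M)
    geo_means = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(geo_means)
    return [g / total for g in geo_means]

# Hypothetical judgements: criterion 0 is 3 times as important as
# criterion 1 and 5 times as important as criterion 2.
M = [[1,     3,     5],
     [1 / 3, 1,     2],
     [1 / 5, 1 / 2, 1]]
w = ahp_weights(M)
```

Because the decision-maker may not be fully confident in ratios like “3 times more important”, the resulting weights inherit that impreciseness, which is why a sensitivity analysis of the mcap result is commonly required.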

**Table of contents:**

Introduction

**I An overview of robust multiple criteria decision aid**

**1 Multicriteria decision aid**

1.1 The decision aiding approach

1.1.1 Making a decision

1.1.2 Aiding the decision

1.1.3 Involving the decision-maker

1.2 Modeling the decision aid process

1.2.1 Defining the fundamental decision objects

1.2.2 Formulating the decision aid problem

1.2.3 Modeling and exploiting preferences

1.3 The main formal multicriteria decision philosophies

1.3.1 Designing and exploiting an overall value function

1.3.2 Designing and exploiting an outranking relation

**2 Preference elicitation processes**

2.1 Decision-makers’ profiles and preferential information

2.1.1 Decision-makers’ profiles

2.1.2 Expressing some preferential information

2.1.3 Quality of the expressed preferential information

2.2 Setting up an iterative preference elicitation process

2.2.1 Principles of an iterative preference elicitation approach

2.2.2 An overview of disaggregation approaches

2.2.3 Knowing the potential pitfalls

2.3 Analysing the robustness of preferential results

2.3.1 Sensitivity analysis within multiattribute value theory

2.3.2 Credibility level cutting technique

2.3.3 Dealing with imprecise but accurate information

2.3.4 Stability of outranking relations

**II On the stability of median-cut outranking relations **

**3 Stability of the median-cut outranking digraph**

3.1 Preliminary definitions

3.1.1 Construction of a weighted outranking relation

3.1.2 Weights preorder

3.1.3 Defining the preferable relation

3.2 Defining the stability of valued outranking relations

3.2.1 Stability

3.2.2 Extensible stability

3.2.3 !-stability

3.3 Additional properties

3.3.1 Limitation of the stability

3.3.2 Stability of the preferable relation

3.3.3 Stability within the context of the sorting problem

3.3.4 Checking the stability property with missing evaluations

3.3.5 Properties on the discrimination of the preorder

**4 Stable elicitation of criteria weights**

4.1 Stability constraints

4.1.1 Auxiliary variables and constraints

4.1.2 Modeling of the stability constraints

4.1.3 Constraint relaxation using slack variables

4.2 Taking into account decision-maker’s preferences

4.2.1 Types of preferential information

4.2.2 Preferences on alternatives

4.2.3 Preferences on criteria

4.3 Mathematical programs

4.3.1 Control algorithm (acon)

4.3.2 milp with real relaxed stability constraints (stab1)

4.3.3 milp with boolean relaxed stability constraints (stab2)

**5 Elicitation of weights and other parameters **

5.1 Elicitation of weights and thresholds

5.1.1 Modeling of the constraints on the thresholds

5.1.2 Additional preferential information

5.1.3 The complete models

5.2 Elicitation of weights and categories profiles

5.2.1 Modeling of the constraints on the profiles

5.2.2 Ensuring a stable assignment of an alternative

5.2.3 The complete models

5.2.4 Stable assignment of the other alternatives

**III A progressive method for a robust parameters elicitation **

**6 Empirical validation of the algorithms**

6.1 Parameters elicitation from a complete set of information

6.1.1 Elicitation of criteria weights

6.1.2 Elicitation of criteria weights and discrimination thresholds

6.1.3 Elicitation of criteria weights and categories profiles

6.2 Iterative recovering of the median-cut outranking relation

6.2.1 Iterative elicitation of criteria weights

6.2.2 Iterative elicitation of both criteria weights and thresholds

6.3 Impact of the stability constraints on the preference modeling

6.3.1 Impact on the weights preorder

6.3.2 Impact on the preference discriminating thresholds

**7 rewat: Robust elicitation of the weights and thresholds**

7.1 Designing a robust elicitation protocol

7.1.1 Stage i: Initialising the outranking preference model

7.1.2 Stage ii: Validating the criteria weights preorder

7.1.3 Stage iii: Tuning the numerical values of the criteria weights

7.2 Tools for supporting a robust elicitation

7.2.1 Dynamic pairwise performance comparison table

7.2.2 Display of the elicited weights preorder

**8 Case study: Applying for a Ph.D. thesis **

8.1 Applying the rewat process

8.1.1 Stage i: Initialising the outranking preference model

8.1.2 Stage ii: Validating the criteria weights preorder

8.1.3 Stage iii: Tuning the numerical values of the criteria weights

8.2 Critical review of the case study

8.2.1 Encountered difficulties

8.2.2 Perspectives for future methodological enhancement

Summary of the main achievements

Perspectives

Concluding remarks

**A Annex **

A.1 Mathematical proof of Proposition 3.8

A.2 Complete mathematical models

A.2.1 acon

A.2.2 stab1

A.2.3 stab2

A.2.4 acon’

A.2.5 stab’1

A.2.6 stab’2

A.2.7 acon?

A.2.8 stab?1

A.2.9 stab?2

A.3 Data for the case study

A.4 Local concordance values for the case study

A.4.1 Initial local concordance values

A.4.2 After the comparison of alternatives sw2 and ca2

A.4.3 After the comparison of alternatives nl2 and us2

A.4.4 After the comparison of alternatives nl1 and us1

A.4.5 After the comparison of alternatives ca2 and fr2

A.4.6 After the comparison of alternatives fr2 and en2

A.4.7 Validation of the new preorder >w2

A.4.8 Validation of the preorder and the thresholds

**Bibliography**