
7CCSMAMS: Agents & Multi-Agent Systems

Argumentation

1

Some slides adapted from slides made available by M. Wooldridge, S. Parsons & T. Payne http://www.cs.ox.ac.uk/people/michael.wooldridge/pubs/imas/IMAS2e


Motivating example: Ambulance dispatch decisions
When an emergency call is received, there is a decision about what to dispatch and with what priority
Different people have different facts, expertise and opinions
How can agents reconcile diverse information to find a coherent, justified decision?
2

Evaluation of a trust-modulated argumentation-based
interactive decision-making tool. Sklar et al. 2016


Today
Introduction to argumentation
Constructing arguments
Evaluating arguments
Argument dialogues

3


What is an argument?
An argument is a reason to believe/do/desire something
Agents can use argumentation for their internal reasoning
Arguments can be exchanged to resolve some difference of opinion
Agents can use argumentation to structure their interactions
Proper arguments involve eliciting reasons for statements and modelling how other arguments relate to these reasons.
There is theory and evidence suggesting that the function of human reasoning is to produce arguments that convince others, and so to improve communication.

4


Argumentation theory
Argumentation theory asks questions such as:
What counts as an argument?
How do arguments relate to one another?
What counts as a winning argument?
How to structure exchanges of arguments?
How to choose which argument to present?
How to recognise arguments?
5


Nature of arguments
There might be several arguments for and against something.
“Jaws is my favourite film and it’s on at the cinema tonight, we should go”
“I need to stay in and work tonight, we shouldn’t go to the cinema”

Arguments are defeasible.
“Tweety is a bird, birds can usually fly, so Tweety can fly”
“Tweety is a penguin, penguins can’t fly, so Tweety can’t fly”

6


Structure of arguments
An argument can be seen as a set of premises offered in support of a claim
7
Information I about person P should be published (claim)
        because
        P has political responsibilities (premise)
        and
        I is in the national interest (premise)
        and
        if a person has political responsibilities and info about that person is in the national interest, then that info should be published (premise)


Argumentation
Argumentation is the process whereby arguments are constructed, exchanged and evaluated in light of their interactions with other arguments.
Today we look at two approaches to using argumentation in a multi-agent system
Abstract argumentation theory lets us determine consistent sets of knowledge from argument graphs, where nodes are arguments (premises supporting claims) and edges are attacks (inconsistencies between arguments)
Argument-based dialogue provides principles for structuring argumentative dialogue over knowledge for persuasion, negotiation, deliberation, etc.
8


Constructing arguments
Given a database of (possibly inconsistent) formulae:

Define what constitutes an argument.
In general, an argument is of the form (premises, claim) where:
(i) premises ⊆ database
(ii) claim is a formula that follows from the premises
Identify what constitutes an attack, i.e. when one argument is in conflict with another
9


Socrates is a man
If you are a man, then you are mortal
The grass is wet
If the hose is on, then it’s not the case that if the grass is wet then it’s been raining
The hose is on
If the grass is wet, then it’s been raining
10


Socrates is a man
If you are a man, then you are mortal
The grass is wet
If the grass is wet, then it’s been raining
The hose is on
If the hose is on, then it’s not the case that if the grass is wet then it’s been raining
Socrates is mortal

It’s been raining

It’s not the case that if the grass is wet then it’s been raining

11


Socrates is a man
If you are a man, then you are mortal
The grass is wet
If the grass is wet, then it’s been raining
The hose is on
If the hose is on, then it’s not the case that if the grass is wet then it’s been raining
Socrates is mortal

It’s been raining

It’s not the case that if the grass is wet then it’s been raining

12


Arguments in classical logic
Elements of argumentation in classical logic
Database Δ is a set of propositional logic formulae
Argument: a pair (Φ, φ) where
Φ is a consistent subset of Δ
Φ entails φ in classical logic (Φ ⊢ φ)
Φ is a set, minimal under set inclusion, entailing φ
Φ is called the premises of the argument
φ is called the claim of the argument
Examples of defining attacks:
(Φ1, φ1) rebuts (Φ2, φ2) iff φ1 ≡ ¬φ2
(Φ1, φ1) undercuts (Φ2, φ2) iff φ1 ≡ ¬ψ for some ψ ∈ Φ2
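These conditions (consistency, entailment, minimality) can be checked mechanically for a small propositional database by brute force over truth assignments. A minimal sketch, with illustrative formula names and helper functions that are not from the lecture:

```python
from itertools import combinations, product

# Illustrative database Δ over atoms a, b; each formula is a
# (name, evaluation-function) pair.
atoms = ["a", "b"]
db = [
    ("a",      lambda m: m["a"]),
    ("a -> b", lambda m: (not m["a"]) or m["b"]),
    ("~b",     lambda m: not m["b"]),
]

# All truth assignments over the atoms.
models = [dict(zip(atoms, vals))
          for vals in product([True, False], repeat=len(atoms))]

def consistent(premises):
    # Some model satisfies every premise.
    return any(all(f(m) for _, f in premises) for m in models)

def entails(premises, claim):
    # Every model of the premises satisfies the claim.
    return all(claim(m) for m in models if all(f(m) for _, f in premises))

def is_argument(premises, claim):
    # Consistent, entails the claim, and minimal under set inclusion.
    if not (consistent(premises) and entails(premises, claim)):
        return False
    return not any(entails(list(sub), claim)
                   for r in range(len(premises))
                   for sub in combinations(premises, r))

# {a, a -> b} entails a, but the subset {a} already does, so it fails minimality.
print(is_argument(db[:2], lambda m: m["a"]))   # False (not minimal)
print(is_argument(db[:2], lambda m: m["b"]))   # True
```

The truth-table approach is exponential in the number of atoms, which is fine for classroom-sized databases.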


Exercise
Database Δ = { a, a → b, ¬b }
Is ({a, a → b}, a) an argument?
What are all the possible arguments?
Draw a directed graph showing where:
Each argument is a graph node
There is a graph edge/arc wherever one argument attacks another argument by undercutting or rebuttal
14


Abstract argumentation
In abstract argumentation, we ignore the structure of arguments but retain the graph of attack relations between them
An abstract argument framework can be defined formally as (A, R) where
A is a set of arguments
R ⊆ A × A represents attacks between arguments
If (A1, A2) ∈ R then argument A1 attacks A2


[Figure: argument graph with nodes A–E]
({A, B, C, D, E}, {(C, A), (D, A), (A, D), (A, B), (B, A), (E, B), (B, E)})
16


[Figure: argument graph with nodes A–E]
We have a set of conflicting arguments.

Which subset can we rationally present as coherent?
17


[Figure: argument graph with nodes A–E]
1. An argument is In if and only if all arguments that attack it are Out.
2. An argument is Out if and only if there is at least one argument that attacks it and is In.

We can leave arguments unlabelled (Undecided).
18
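The two labelling rules can be computed as a least fixpoint: repeatedly apply them until nothing changes, and whatever remains unlabelled is Undecided. A minimal sketch (function names are illustrative), run on the example framework ({A, B, C, D, E}, {(C, A), (D, A), (A, D), (A, B), (B, A), (E, B), (B, E)}) shown earlier:

```python
# Least-fixpoint computation of the In/Out/Undecided labelling rules.
def grounded_labelling(args, attacks):
    # Map each argument to the set of its attackers.
    attackers = {a: {x for (x, y) in attacks if y == a} for a in args}
    label = {a: "Undec" for a in args}
    changed = True
    while changed:
        changed = False
        for a in sorted(args):
            if label[a] != "Undec":
                continue
            # Rule 1: In iff all attackers are Out (vacuously true if unattacked).
            if all(label[b] == "Out" for b in attackers[a]):
                label[a] = "In"; changed = True
            # Rule 2: Out iff some attacker is In.
            elif any(label[b] == "In" for b in attackers[a]):
                label[a] = "Out"; changed = True
    return label

A = {"A", "B", "C", "D", "E"}
R = {("C", "A"), ("D", "A"), ("A", "D"), ("A", "B"),
     ("B", "A"), ("E", "B"), ("B", "E")}
lab = grounded_labelling(A, R)
print(sorted(lab.items()))
# C is unattacked, so In; C attacks A, so A is Out; D's only attacker A
# is Out, so D is In; B and E attack each other and stay Undecided.
```

This fixpoint is exactly the grounded labelling introduced a few slides later: the unique minimal way of applying the two rules.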



[Figure: three possible labellings of the graph with arguments A–E]
1. An argument is In if and only if all arguments that attack it are Out.
2. An argument is Out if and only if there is at least one argument that attacks it and is In.
27


[Figure: the three labellings of the graph A–E, shown side by side]
Grounded semantics: minimise the node labelling (label as few arguments In/Out as possible)
Preferred semantics: maximise the node labelling (label as many arguments In/Out as possible)
28
1. An argument is In if and only if all arguments that attack it are Out.
2. An argument is Out if and only if there is at least one argument that attacks it and is In.


[Figure: argument graph with nodes a–j]
Exercise
What is the labelling under grounded semantics?
What are the labellings under preferred semantics?
29


Preferred credulous and preferred sceptical semantics
An argument is acceptable under the preferred credulous semantics if it is labelled IN in at least one of the preferred labellings

An argument is acceptable under the preferred sceptical semantics if it is labelled IN in all of the preferred labellings
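For small graphs, preferred labellings can be enumerated by brute force: generate every labelling in which both rules hold at every argument (a complete labelling), then keep those whose set of In arguments is maximal. A sketch under those assumptions, with illustrative function names:

```python
from itertools import product

def complete_labellings(args, attacks):
    # Enumerate all labellings satisfying both rules everywhere
    # (3^n candidates, fine for classroom-sized graphs).
    args = sorted(args)
    attackers = {a: {x for (x, y) in attacks if y == a} for a in args}
    result = []
    for combo in product(["In", "Out", "Undec"], repeat=len(args)):
        lab = dict(zip(args, combo))
        ok = True
        for a in args:
            all_out = all(lab[b] == "Out" for b in attackers[a])
            some_in = any(lab[b] == "In" for b in attackers[a])
            if lab[a] == "In" and not all_out: ok = False      # rule 1
            if lab[a] == "Out" and not some_in: ok = False     # rule 2
            if lab[a] == "Undec" and (all_out or some_in): ok = False
        if ok:
            result.append(lab)
    return result

def preferred_labellings(args, attacks):
    labs = complete_labellings(args, attacks)
    ins = [frozenset(a for a, l in lab.items() if l == "In") for lab in labs]
    # Keep labellings whose In-set is not a proper subset of another's.
    return [lab for lab, i in zip(labs, ins) if not any(i < j for j in ins)]

args = {"A", "B", "C", "D", "E"}
attacks = {("C", "A"), ("D", "A"), ("A", "D"), ("A", "B"),
           ("B", "A"), ("E", "B"), ("B", "E")}
prefs = preferred_labellings(args, attacks)
print(len(prefs))  # two preferred labellings for this graph
```

On this graph the two preferred In-sets are {B, C, D} and {C, D, E}, so C and D are sceptically acceptable while B and E are only credulously acceptable.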

30


[Figure: two labellings of the graph with arguments A–E]
Preferred semantics: maximise node labelling
Which arguments in the graphs labelled below are acceptable…
…under the preferred credulous semantics?
…under the preferred sceptical semantics?
Exercise
31


32
Given the patient is really ill, we should give them ExpDrug, which will make the patient better fast, achieving our goal of making the patient better and promoting patient health.
Given that we are low on cash, we should not give the patient ExpDrug, which would make the patient better but put us over budget, achieving our goal of making the patient better but demoting hospital economy.
Subjective defeat relations
We can consider that an attack from argument A to argument B succeeds in defeating B if and only if A is preferred to B.
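This preference-modulated notion of defeat can be sketched in a few lines. All names below are illustrative (the two ExpDrug arguments are tagged by the value each promotes), and the numeric ranking stands in for whatever preference ordering the agent holds:

```python
# An attack (a, b) becomes a defeat only if b is not strictly preferred to a.
def defeats(attacks, prefer):
    """prefer(x, y) == True means argument x is strictly preferred to y."""
    return {(a, b) for (a, b) in attacks if not prefer(b, a)}

# Hypothetical example: the two ExpDrug arguments attack each other.
attacks = {("health", "economy"), ("economy", "health")}

# Value patient health above hospital economy.
rank = {"health": 2, "economy": 1}
prefer = lambda x, y: rank[x] > rank[y]

print(defeats(attacks, prefer))
# Only the attack from the health-based argument survives as a defeat,
# so "give ExpDrug" wins; reversing the ranking reverses the outcome.
```

Evaluating the defeat graph (rather than the raw attack graph) with the usual semantics then yields the decision that matches the agent's values.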


33
Subjective defeat relations
Value patient health above hospital economy


34
Subjective defeat relations
Value hospital economy above patient health


35
[Figure: arguments B (“BBC says rain”) and C (“CNN says sun”) attacking each other]
Arguments attacking attack relations
See Modgil 2009


36
[Figure: B (“BBC says rain”) and C (“CNN says sun”) attack each other; T (“Trust BBC more than CNN”) attacks one of these attacks]
Arguments attacking attack relations


37
[Figure: as before, plus S (“Stats say CNN better than BBC”); T and S attack attacks rather than arguments]
Arguments attacking attack relations


38
[Figure: B (“BBC says rain”), C (“CNN says sun”), T (“Trust BBC more than CNN”), S (“Stats say CNN better than BBC”), R (“Stats more rational than trust”); each higher-level argument attacks an attack at the level below]
Arguments attacking attack relations


Defeasibility of knowledge
Most knowledge is defeasible
open in principle to revision, valid objection, forfeiture, or annulment
capable of being attacked or rendered void (law)
having a presupposition in its favour but open to revision if countervailing evidence becomes known (philosophy)

Discussion
What knowledge is defeasible and what is not?
39


Non-monotonic reasoning
An agent needs to be able to reason with its knowledge, drawing plausible conclusions
But it must be able to revise its conclusions if it learns they were unwarranted
This is called non-monotonic reasoning
In monotonic reasoning, the set of conclusions never gets smaller (e.g. classical logic)
In non-monotonic reasoning, it is possible that the set of conclusions gets smaller, i.e. we can retract conclusions
40


Non-monotonic reasoning example
Birds normally fly.
Penguins don’t fly.

Tweety is a bird. Tweety flies.
Tweety is a penguin. Tweety does not fly.

41


Non-monotonic reasoning example
First attempt to capture this with classical logic:

bird(X) → flies(X)
bird(X) ∧ penguin(X) → ¬flies(X)
bird(tweety)
penguin(tweety)

⊢ flies(tweety)
⊢ ¬flies(tweety)

42
Inconsistent
In classical logic, we can infer anything from an inconsistent knowledge base!
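The inconsistency can be confirmed mechanically with a truth table. A small sketch (the formula encodings are illustrative):

```python
from itertools import product

# The first formalisation, encoded as evaluation functions over a model.
formulas = [
    lambda m: (not m["bird"]) or m["flies"],                          # bird -> flies
    lambda m: (not (m["bird"] and m["penguin"])) or (not m["flies"]), # bird ∧ penguin -> ¬flies
    lambda m: m["bird"],                                              # bird(tweety)
    lambda m: m["penguin"],                                           # penguin(tweety)
]

models = [dict(zip(["bird", "penguin", "flies"], vals))
          for vals in product([True, False], repeat=3)]

satisfiable = any(all(f(m) for f in formulas) for m in models)
print(satisfiable)  # False: no truth assignment satisfies all four formulas
```

Once tweety is both a bird and a penguin, the first rule forces flies(tweety) and the second forces ¬flies(tweety), so no model exists.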


Non-monotonic reasoning example
Second attempt to capture this with classical logic:

bird(X) ^ penguin(X) -> flies(X)
bird(X) ^ penguin(X) -> flies(X)

bird(tweety)
penguin(tweety)
flies(tweety)

bird(tweety)

43
Doesn’t imply flies(tweety)


Defeasible logic reasoning with argumentation
What about with argumentation?

bird(X) → flies(X)
penguin(X) → ¬(bird(X) → flies(X))
penguin(X) → ¬flies(X)
bird(tweety)

44
⟨ {bird(tweety), bird(tweety) → flies(tweety)}, flies(tweety) ⟩


Defeasible logic reasoning with argumentation
What about with argumentation?

bird(X) → flies(X)
penguin(X) → ¬(bird(X) → flies(X))
penguin(X) → ¬flies(X)
bird(tweety)
penguin(tweety)

45
⟨ {bird(tweety), bird(tweety) → flies(tweety)}, flies(tweety) ⟩
⟨ {penguin(tweety), penguin(tweety) → ¬flies(tweety)}, ¬flies(tweety) ⟩
⟨ {penguin(tweety), penguin(tweety) → ¬(bird(tweety) → flies(tweety))}, ¬(bird(tweety) → flies(tweety)) ⟩
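Putting these three arguments into an attack graph and applying the In/Out labelling rules from earlier resolves the conflict. A self-contained sketch (argument names A1–A3 are illustrative; the attack directions follow the rebut/undercut definitions given previously):

```python
# A1 = <{bird(t), bird(t) -> flies(t)}, flies(t)>
# A2 = <{penguin(t), penguin(t) -> ~flies(t)}, ~flies(t)>
# A3 = <{penguin(t), penguin(t) -> ~(bird(t) -> flies(t))}, ~(bird(t) -> flies(t))>
args = {"A1", "A2", "A3"}
attacks = {("A1", "A2"), ("A2", "A1"),  # mutual rebuttal on flies(tweety)
           ("A3", "A1")}                # A3 undercuts A1's premise bird(t) -> flies(t)

def grounded(args, attacks):
    # Least fixpoint of the two labelling rules.
    attackers = {a: {x for (x, y) in attacks if y == a} for a in args}
    label = {a: "Undec" for a in args}
    changed = True
    while changed:
        changed = False
        for a in sorted(args):
            if label[a] != "Undec":
                continue
            if all(label[b] == "Out" for b in attackers[a]):
                label[a] = "In"; changed = True
            elif any(label[b] == "In" for b in attackers[a]):
                label[a] = "Out"; changed = True
    return label

print(sorted(grounded(args, attacks).items()))
# A3 is unattacked, so In; it defeats A1 (Out); A2's only attacker A1
# is Out, so A2 is In: we conclude that Tweety does not fly.
```

This is the argumentative reading of the non-monotonic example: adding penguin(tweety) adds attackers, which retracts the earlier conclusion flies(tweety) without making the knowledge base useless.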


Why argue?
Arguments allow different preferences to be considered
“It is more important to ensure the patient’s health than it is to protect the hospital’s budget.”
Arguments are intuitive to people
As technology becomes more autonomous, as we delegate more complex and more important tasks, it becomes crucial that:
machines are able to justify their choices and explain their reasoning in a way that humans can understand
humans are able to engage with and input into machines’ reasoning and decision making
46


Argument dialogues
Like auctions and negotiation, argument dialogues are a mechanism that agents can use to reach agreement.
By exchanging arguments, agents are able to
share their reasons for holding particular positions
and perhaps change each other’s position on a topic
Walton and Krabbe classified types of dialogue in terms of:
the initial situation from which the dialogue arises
the main goal of the dialogue, to which all the participating agents subscribe
the personal aims of each individual agent
47


48
Type | Initial situation | Main goal | Participant's aims
Persuasion | Conflicting points of view | Resolution of such conflicts | Persuade the other
Inquiry | General ignorance | Growth of knowledge and agreement | Find (or destroy) a 'proof'
Deliberation | Need for action | Reach a decision | Influence outcome
Negotiation | Conflict of interests and need for cooperation | Making a deal | Get the best out of it for oneself
Info-seeking | Personal ignorance | Spreading knowledge and revealing positions | Gain, pass on, show, or hide personal knowledge

D. Walton and E. Krabbe, Commitment in Dialogue, SUNY Press, 1995


Elements of a dialogue system
A dialogue system can be defined by
Moves that the agents can make, e.g. assert, accept, claim, question, challenge
Protocol: rules on what moves can be made at each point in the dialogue, e.g. you can’t assert an argument that attacks something you’ve asserted earlier, if you can’t attack an argument then you must accept it
Strategy: private to an agent, determines the move to make, e.g. if you find something agreeable then agree, if there’s a choice between an assert and accept, then assert
Rules stating when the dialogue terminates and its outcome, e.g. terminate when both agents run out of moves, agent x persuades agent y if x was the last to move
49
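These four elements can be made concrete with a toy dialogue system. The protocol below is an assumption for illustration, not from the lecture: each move must attack the previous argument, no argument may be repeated, and an agent that cannot move loses (so the last mover wins):

```python
# Toy persuasion dialogue over an attack relation.
def dialogue(attacks, start, max_turns=20):
    attackers = {}
    for x, y in attacks:
        attackers.setdefault(y, set()).add(x)
    moved = [start]          # proponent opens with the topic argument
    used = {start}
    current = start
    for _ in range(max_turns):
        # Protocol: the next move must attack the last argument moved,
        # and repeated arguments are forbidden.
        options = attackers.get(current, set()) - used
        if not options:
            break            # termination rule: no legal move remains
        current = min(options)  # trivial strategy: deterministic pick
        used.add(current)
        moved.append(current)
    return moved  # outcome rule: the agent who moved last wins

# Hypothetical attack relation: B and C both attack A, A attacks back at B.
R = {("C", "A"), ("B", "A"), ("A", "B")}
print(dialogue(R, "A"))
# The proponent asserts A, the opponent counters with B, and the
# proponent cannot reuse A, so the opponent wins.
```

Real dialogue systems differ mainly in richer move types (question, challenge, concede), protocols with commitments, and non-trivial strategies, but they share this move/protocol/termination skeleton.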


Designing a dialogue system
The mechanism designer specifies a particular dialogue system (moves, protocol, termination rules) for a particular type of dialogue, aiming for particular properties, such as
The dialogue is guaranteed to terminate
If agent i persuades agent j to accept argument A, then A must be acceptable in the argument graph constructed from the arguments moved in the dialogue
The best strategy to use in a dialogue is an open question
50


Argumentation-based negotiation
Dialogue between a buyer and seller of cars:
Seller  (Offer: Renault)
Buyer (Reject: Renault)
Seller  (Why?)
Buyer (Argue: Renault is French, and French cars are unsafe)
Seller (Argue: Renaults are not unsafe, as they have been awarded safest car in Europe by the EU)
Buyer (Accept)
Exchange of arguments provides for agreements that would not be reached in simple propose / accept / reject protocols


Argument dialogue example: organ donation
52
P. Tolchinsky et al. Increasing human-organ transplant availability: Argumentation-based agent deliberation. IEEE Intelligent Systems. 2006.


Discussion
Autonomous vehicles in a convoy. Where may we use the following?
BDI reasoning Multi-agent simulation
Negotiation Temporal logic
Argumentation Auctions
Trust assessment Prescriptive norms
Voting Agent communication languages


Discussion
Homes selling electricity into a network. Where may we use the following?
BDI reasoning Multi-agent simulation
Negotiation Temporal logic
Argumentation Auctions
Trust assessment Prescriptive norms
Voting Agent communication languages


Revision class next week
Next week’s class will focus on practicing the whole module’s materials with a mock exam paper
You will be able to work in groups next week (but not in the real exam!)
To get the most from the revision class, bring along your class notes to consult, unless you are already well revised and want to test yourself in exam-like conditions
55


[Figure for the organ donation example: deliberation between an organ donor agent, recipient agents 1 and 2, and a mediator agent]
“Since no contraindications are known, the lung is viable for recipient 1”
“But the donor smoked, so contraindication”
“But the donor had no smoking-related disease, so no contraindication”
