# Constraint Handling — Objective Function


The objective function defines the cost of a solution:

minimise $totalDistance(x) = \sum_{i=1}^{N-1} D_{x_i, x_{i+1}} + D_{x_N, x_1}$


where $D_{j,k}$ is the distance of the path between cities $j$ and $k$.

[Optional] Solutions must satisfy certain constraints.
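The tour-length objective above can be sketched as follows. The distance matrix `D` and the use of 0-based city indices are illustrative assumptions; the slides label cities from 1.

```python
# A minimal sketch of the tour-length objective: sum the consecutive
# legs of the tour, then add the leg from the last city back to the first.
def total_distance(x, D):
    """Length of the closed tour x under distance matrix D."""
    n = len(x)
    return sum(D[x[i]][x[i + 1]] for i in range(n - 1)) + D[x[n - 1]][x[0]]
```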

Traveling Salesman Problem Formulation

Design variables represent a candidate solution.

• The design variable is a sequence x of N cities, where xi ∈ {1,⋯, N}, ∀i ∈ {1,⋯, N}.

The N cities to be visited are represented by values {1,…,N}.

The search space is all possible sequences of N cities, where cities are in {1,…,N}.

Each city must appear once and only once:

$$\forall i \in \{1,\dots,N\}: \quad h_i(x) = \left( \sum_{j=1}^{N} \mathbb{1}(x_j = i) \right) - 1 = 0, \qquad \mathbb{1}(x_j = i) = \begin{cases} 1, & \text{if } x_j = i \\ 0, & \text{if } x_j \neq i \end{cases}$$
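The equality constraints $h_i$ amount to a simple feasibility test, sketched below under the slides' 1-based city labelling (`n` is the number of cities):

```python
from collections import Counter

# Feasibility check implied by the constraints h_i: every city in
# {1, ..., n} must appear exactly once in the sequence x.
def is_feasible(x, n):
    counts = Counter(x)
    # h_i(x) = 0 for all i  <=>  each city counted exactly once,
    # and the sequence has no room for extras (length exactly n).
    return len(x) == n and all(counts[i] == 1 for i in range(1, n + 1))
```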

Designing Objective Functions to Deal With Constraints

The original objective function of a problem can be modified to deal with constraints: a penalty can be added for infeasible solutions, increasing their cost.

Designing Objective Functions to Deal With Constraints

E.g.: assume that the representation is a list of any size, and that our initialisation procedure is uniformly at random with replacement. Both a feasible sequence such as $(1, 2, 3, 4)$ and an infeasible one such as $(1, 2, 2, 3, 4)$ can then be generated.

Objective function:

minimise $totalDistance(x) = \sum_{i=1}^{size(x)-1} D_{x_i, x_{i+1}} + D_{x_{size(x)}, x_1}$

How to modify the objective function to deal with the constraint that each city must appear once and only once?

Objective function:

minimise $totalDistance(x) = \sum_{i=1}^{size(x)-1} D_{x_i, x_{i+1}} + D_{x_{size(x)}, x_1} + n_m P + n_d P$

where $n_m$ is the number of cities missing, $n_d$ is the number of duplications of cities, and $P$ is a large positive constant. For example, for $x = (1, 2, 2, 3, 4)$ with $N = 4$, $n_m = 0$ and $n_d = 1$, so the penalty $P$ is added once.
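A sketch of this penalised objective, under the same illustrative assumptions as before (a distance matrix `D`, cities labelled `1..n` as in the slides, and a default value for `P` chosen only for demonstration):

```python
from collections import Counter

# Penalised tour-length objective: the plain tour length plus one
# penalty P per missing city (n_m) and one per duplicated city (n_d).
def penalised_total_distance(x, D, n, P=1_000_000):
    counts = Counter(x)
    n_m = sum(1 for city in range(1, n + 1) if counts[city] == 0)  # missing cities
    n_d = sum(c - 1 for c in counts.values() if c > 1)             # duplications
    dist = sum(D[x[i] - 1][x[i + 1] - 1] for i in range(len(x) - 1))
    dist += D[x[-1] - 1][x[0] - 1]  # close the tour
    return dist + n_m * P + n_d * P
```

A feasible tour incurs no penalty, while each missing or duplicated city adds `P` to the cost, so any feasible tour is cheaper than any infeasible one provided `P` exceeds the largest possible tour length.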

Generalising The Strategy

Minimise $f(x)$

Subject to

$g_i(x) \le 0, \quad i = 1,\dots,m$

$h_j(x) = 0, \quad j = 1,\dots,n$

Instead, minimise $f(x) + penalty(x)$, where

$$penalty(x) = \begin{cases} 0 & \text{if } x \text{ is feasible} \\ P \times \left[ g_a(x)^2 + g_b(x)^2 + \dots + h_{a'}(x)^2 + h_{b'}(x)^2 + \dots \right] & \text{otherwise} \end{cases}$$

Only the violated constraints are summed here, and $P$ is a large positive constant.

Equivalently, using indicator variables:

$$penalty(x) = P \times \left[ v_{g_1} g_1(x)^2 + v_{g_2} g_2(x)^2 + \dots + v_{g_m} g_m(x)^2 + v_{h_1} h_1(x)^2 + v_{h_2} h_2(x)^2 + \dots + v_{h_n} h_n(x)^2 \right]$$

where $P$ is a large positive constant, and $v_{g_i}$ and $v_{h_j}$ are 1 if the corresponding constraint is violated and 0 otherwise.
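The generalised penalty can be sketched as below. The representation of constraints as two lists of functions, and the default `P`, are illustrative choices, not part of the slides:

```python
# General penalty: only violated constraints contribute, each as its
# squared violation, and the sum is scaled by a large constant P.
def penalty(x, ineq, eq, P=1_000_000):
    """ineq: functions g requiring g(x) <= 0; eq: functions h requiring h(x) == 0."""
    total = sum(g(x) ** 2 for g in ineq if g(x) > 0)    # v_g = 1 iff g(x) > 0
    total += sum(h(x) ** 2 for h in eq if h(x) != 0)    # v_h = 1 iff h(x) != 0
    return P * total

# Wrap an objective f so the search minimises f(x) + penalty(x).
def penalised(f, ineq, eq, P=1_000_000):
    return lambda x: f(x) + penalty(x, ineq, eq, P)
```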

Dealing with Constraints Based on Objective Functions

Advantage: easier to design.

Disadvantage: the algorithm has to search for feasible solutions.

Completeness

If we use a strategy to deal with constraints that never enables any infeasible solution to be generated, algorithms such as Hill Climbing and Simulated Annealing are complete.

Otherwise:

Hill Climbing: not complete if the objective function has local optima.

Simulated Annealing: not guaranteed to find a feasible solution within a reasonable amount of time.

We need to design strategies to deal with the constraints. Examples of strategies:

• Representation, initialisation and neighbourhood operators.

• Objective function.

Next: example applications.
