Problem description
The code below raises an error at this line inside the for loop:

model += x[int((str((i*2)-1)+str(j)))] + x[int((str(i*2)+str(j)))] <= 1

I want to declare the variables like a 2D array (x11, x12, x13, ...), but I get this error:
KeyError                                  Traceback (most recent call last)
<ipython-input-95-19b3a6e81910> in <module>()
     19 for i in range (1,(Box//2)+1):
     20     for j in range (1,Pallet+1):
---> 21         model += x[int((str((i*2)-1)+str(j)))] + x[int((str(i*2)+str(j)))] <= 1
     22
     23

KeyError: 11
I have checked every option I can think of and everything looks fine to me. Can someone please help?
'''
from pulp import *
Box = 6
Pallet = 3
Variable_range = Box*Pallet
from pulp import LpMaximize, LpProblem, LpStatus, lpSum, LpVariable
# Define the model
model = LpProblem(name="Container Loading", sense=LpMaximize)
# Define the decision variables
for i in range(1, Box+1):
    for j in range(1, Pallet+1):
        x = {int((str(i)+str(j))): LpVariable(name=f"x{i}_{j}", lowBound=0, upBound=1, cat='Integer')}
        print(x)
# Add constraints
for i in range(1, (Box//2)+1):
    for j in range(1, Pallet+1):
        model += x[int((str((i*2)-1)+str(j)))] + x[int((str(i*2)+str(j)))] <= 1  # error at this line
# Set the objective
model += lpSum(x.values())
# Solve the optimization problem
status = model.solve()
# Get the results
print(f"status: {model.status}, {LpStatus[model.status]}")
print(f"objective: {model.objective.value()}")
for var in x.values():
    print(f"{var.name}: {var.value()}")
for name, constraint in model.constraints.items():
    print(f"{name}: {constraint.value()}")
'''
Solution
This comes from a misunderstanding of the loop. In:
# Define the decision variables
for i in range(1, Box+1):
    for j in range(1, Pallet+1):
        x = {int((str(i)+str(j))): LpVariable(name=f"x{i}_{j}", lowBound=0, upBound=1, cat='Integer')}
        print(x)
you rebind x on every iteration, so at the end x contains only the single entry from the last iteration (key 63). That is why the lookup for key 11 fails with a KeyError. You can see this by moving the print statement to after the loop.
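A minimal standalone sketch of the difference, with plain strings standing in for the LpVariable objects:

```python
# Rebinding: x is replaced by a fresh one-entry dict on every iteration.
for i in range(1, 7):
    for j in range(1, 4):
        x = {int(str(i) + str(j)): f"x{i}_{j}"}
print(len(x), list(x))   # → 1 [63]  (only the last entry survives)

# Accumulating: create the dict once, then add entries inside the loop.
x = {}
for i in range(1, 7):
    for j in range(1, 4):
        x[int(str(i) + str(j))] = f"x{i}_{j}"
print(len(x), 11 in x)   # → 18 True (all entries kept; key 11 now exists)
```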
Better still, build the whole dictionary with a comprehension:
# Define the decision variables
x = {int((str(i)+str(j))): LpVariable(name=f"x{i}_{j}", cat='Integer')
     for i in range(1, Box+1) for j in range(1, Pallet+1)}
print(x)
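One caveat with these concatenated-string keys: they collide as soon as an index has two digits, which is a reason to prefer tuple keys, as the second answer below does:

```python
# With concatenated strings, (i=1, j=11) and (i=11, j=1) share one key:
print(int(str(1) + str(11)))   # → 111
print(int(str(11) + str(1)))   # → 111
# Tuple keys stay distinct:
print((1, 11) == (11, 1))      # → False
```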
Try the code below. I changed the style used to define the decision variables, and it no longer throws an error.
from pulp import *
Box = 6
Pallet = 3
Variable_range = Box*Pallet
x = {}
from pulp import LpMaximize, LpProblem, LpStatus, lpSum, LpVariable
# Define the model
model = LpProblem(name="Container Loading", sense=LpMaximize)
# Define the decision variables, keyed by (box, pallet) tuples
for i in range(1, Box+1):
    for j in range(1, Pallet+1):
        x[(i, j)] = LpVariable('x' + str(i) + '_' + str(j), cat=LpBinary)
        print(x[(i, j)])
# Add constraints: boxes 2k-1 and 2k cannot share pallet j
for i in range(1, (Box//2)+1):
    for j in range(1, Pallet+1):
        model += x[(i*2-1, j)] + x[(i*2, j)] <= 1
# Set the objective
model += lpSum(x.values())
# Solve the optimization problem
status = model.solve()
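A quick pulp-free check that the constraint loop pairs consecutive boxes (1,2), (3,4), (5,6) on each pallet:

```python
Box = 6
# Same index arithmetic as the constraint loop above.
pairs = [(i*2 - 1, i*2) for i in range(1, (Box//2) + 1)]
print(pairs)  # → [(1, 2), (3, 4), (5, 6)]
```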