Based on the above network model and assumptions, our objective is to derive an optimal association scheduling method that maximizes the throughput of a target vehicle. In this section, we show that computing the offline optimal schedule can be cast as a linear programming problem, and thus the optimal solution can be obtained in polynomial time. In addition to the optimal solution that maximizes the throughput, our framework can be extended to obtain the optimality of association that minimizes the frequency of handoffs occurring in vehicles (see Appendix).
4.1 Objective function
The objective of the association scheduling is to maximize the total number of transmitted bits during the time intervals I = {1,2,…,m} for the given available APs and effective bit rate information. Computing the optimal association scheduling is equivalent to finding the optimal values of x_{ij}, i ∈ I and j ∈ J_i, by optimizing the tradeoff between the throughput gain from a handoff and the resulting reassociation overhead c.
Therefore, we formulate the offline optimal association control scheduling as a combinatorial optimization problem that finds the optimal values of x_{ij} (i ∈ I and j ∈ J_i) with the following objective function:
\max \sum_{i=1}^{m} \left( s_i \cdot \left( \sum_{j \in J_i} r_{ij} x_{ij} \right) - c \cdot \left( \sum_{j \in J_i \cap J_{i-1}} r_{ij} x_{ij} \cdot (x_{ij} - x_{i-1,j}) + \sum_{j \in J_i \setminus J_{i-1}} r_{ij} x_{ij} \right) \right)
(1)
subject to
\sum_{j \in J_i} x_{ij} \le 1, \quad \forall i \in I,
(2)
x_{ij} \in \{0, 1\}.
(3)
The objective function given in Equation 1 consists of two terms. The first term denotes the total amount of transmitted bits. The reassociation overhead is reflected by the second term. Note that the reassociation overhead should be counted only when a handoff occurs. In other words, if the AP j used in interval i-1 continues to be used in interval i (i.e., x_{i-1,j} = x_{ij} = 1), then there is no reassociation overhead. On the contrary, if the mobile user selects a new AP j' that was not used in interval i-1 (i.e., x_{i-1,j'} = 0 and x_{ij'} = 1, yielding x_{ij'}(x_{ij'} - x_{i-1,j'}) = 1), the vehicle cannot transmit data during c seconds due to the handoff overhead. Thus, we express the reassociation overhead as
c \cdot \left( \sum_{j \in J_i \cap J_{i-1}} r_{ij} \cdot x_{ij} \cdot (x_{ij} - x_{i-1,j}) + \sum_{j \in J_i \setminus J_{i-1}} r_{ij} \cdot x_{ij} \right).
(4)
Also, Equation 2 is imposed because, according to the system model given above, the vehicle can associate with at most one AP among the available APs in each interval.
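To make the bookkeeping in Equation 1 concrete, the function below evaluates the objective for a candidate association sequence on a small hypothetical instance (all rates, durations, AP sets, and the overhead value are illustrative, not from the paper): the overhead c·r_{ij} is charged whenever the AP used in interval i differs from the one used in interval i-1.

```python
# Hypothetical toy instance; all numbers are illustrative, not from the paper.
J = [{0}, {0, 1}, {1}]          # available AP sets J_i per interval
s = [10.0, 10.0, 10.0]          # interval durations s_i (seconds)
r = {(0, 0): 2.0, (1, 0): 1.0, (1, 1): 3.0, (2, 1): 2.5}  # effective rates r_ij
c = 4.0                         # reassociation overhead c (seconds)

def objective(schedule):
    """Value of Equation 1 for a schedule (one AP index per interval, None = idle):
    total transmitted bits minus c * r_ij for every (re)association."""
    total, prev = 0.0, None
    for i, ap in enumerate(schedule):
        if ap is None:                      # no association in this interval
            prev = None
            continue
        total += s[i] * r[i, ap]            # bits sent via AP `ap` in interval i
        if ap != prev:                      # handoff: radio is idle for c seconds
            total -= c * r[i, ap]
        prev = ap
    return total

print(objective([0, 1, 1]))   # one handoff, then stay on AP 1 -> 55.0
print(objective([0, 0, 1]))   # ride AP 0 longer, handoff later -> 37.0
```

Here the earlier handoff to the faster AP wins despite paying the overhead sooner, which is exactly the tradeoff the scheduling problem optimizes.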
4.2 Computing the optimal scheduling problem
The objective function given in Equation 1 defines a nonlinear integer programming problem with binary decision variables x_{ij}. Although the optimal solution lies in the finite set of all feasible combinations and is thus obtainable by brute-force enumeration [21], the computational complexity of such enumeration is too high to be practical.
To address this challenging issue, we transform the above nonlinear function into a linear function by exploiting the properties of the binary decision variable.
Since the decision variable x_{ij} is binary, it holds that x_{ij}^2 = x_{ij}, and we rewrite the nonlinear component in the objective function of Equation 1 as
x_{ij} \cdot (x_{ij} - x_{i-1,j}) \;\Rightarrow\; x_{ij}^2 - x_{ij} \cdot x_{i-1,j} \;\Rightarrow\; x_{ij} - x_{ij} \cdot x_{i-1,j}.
(5)
Then, by setting z_{ij} = x_{ij} \cdot x_{i-1,j}, we view z_{ij} as another binary variable. This yields two additional constraints that reflect the same handoff costs:
\left\{\begin{array}{ll} z_{ij} \le x_{ij}, & \quad \forall i \in I,\; j \in J_i \cap J_{i-1}, \\ z_{ij} \le x_{i-1,j}, & \quad \forall i \in I,\; j \in J_i \cap J_{i-1}. \end{array}\right.
(6)
Thus, the optimization problem is rewritten as
\max \sum_{i=1}^{m} \left( s_i \cdot \left( \sum_{j \in J_i} r_{ij} x_{ij} \right) - c \cdot \left( \sum_{j \in J_i \cap J_{i-1}} r_{ij} (x_{ij} - z_{ij}) + \sum_{j \in J_i \setminus J_{i-1}} r_{ij} x_{ij} \right) \right)
(7)
subject to
\begin{array}{ll} \sum_{j \in J_i} x_{ij} \le 1, & \quad \forall i \in I, \\ z_{ij} \le x_{ij}, & \quad \forall i \in I,\; j \in J_i \cap J_{i-1}, \\ z_{ij} \le x_{i-1,j}, & \quad \forall i \in I,\; j \in J_i \cap J_{i-1}, \\ x_{ij},\, z_{ij} \in \{0, 1\}. \end{array}
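As an illustrative sketch (the instance data below is hypothetical and the variable indexing is our own), the linearized program can be handed to an off-the-shelf solver. The snippet builds the objective and constraints of Equation 7 for a small example and solves the LP relaxation with SciPy's `linprog`; for this instance, the relaxed optimum coincides with the best integer schedule.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy instance; all numbers are illustrative, not from the paper.
J = [{0}, {0, 1}, {1}]          # available AP sets J_i
s = [10.0, 10.0, 10.0]          # interval durations s_i (seconds)
r = {(0, 0): 2.0, (1, 0): 1.0, (1, 1): 3.0, (2, 1): 2.5}  # effective rates r_ij
c = 4.0                         # reassociation overhead c (seconds)

m = len(J)
# Index the decision variables: one x_ij per reachable (interval, AP) pair,
# one z_ij per AP available in two consecutive intervals.
x_idx = {(i, j): k for k, (i, j) in
         enumerate([(i, j) for i in range(m) for j in sorted(J[i])])}
z_pairs = [(i, j) for i in range(1, m) for j in sorted(J[i] & J[i - 1])]
z_idx = {(i, j): len(x_idx) + k for k, (i, j) in enumerate(z_pairs)}
n = len(x_idx) + len(z_idx)

# Objective of Equation 7, negated because linprog minimizes:
# every chosen AP pays the overhead c*r_ij unless z_ij = 1 refunds it.
cost = np.zeros(n)
for (i, j), k in x_idx.items():
    cost[k] = -(s[i] - c) * r[i, j]
for (i, j), k in z_idx.items():
    cost[k] = -c * r[i, j]

# Constraints A_ub @ v <= b_ub, matching Equations 8a-8c.
rows, b_ub = [], []
for i in range(m):                      # (8a) at most one AP per interval
    row = np.zeros(n)
    for j in J[i]:
        row[x_idx[i, j]] = 1.0
    rows.append(row); b_ub.append(1.0)
for (i, j) in z_pairs:                  # (8b) z_ij <= x_ij, (8c) z_ij <= x_{i-1,j}
    for xcol in (x_idx[i, j], x_idx[i - 1, j]):
        row = np.zeros(n)
        row[z_idx[i, j]] = 1.0
        row[xcol] = -1.0
        rows.append(row); b_ub.append(0.0)

res = linprog(cost, A_ub=np.array(rows), b_ub=np.array(b_ub), bounds=[(0, 1)] * n)
print("LP relaxation optimum:", -res.fun)
```

That the relaxation already attains the integer optimum here is no accident; it is exactly what the total unimodularity argument below establishes in general.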
However, it is still nontrivial to solve the above integer program (IP) because of the integrality constraints. We will discuss the property of the objective function and show that its solution can be obtained in polynomial time.
Let us consider an IP max \{c^{T}x : Ax \le b, x \in \mathbb{Z}^{n}\} and its linear programming (LP) relaxation max \{c^{T}x : Ax \le b, x \in \mathbb{R}^{n}\}. A linear program in real variables is said to be integral if it has at least one optimal solution that is integral. Likewise, a polyhedron P = {x : Ax ≤ b} is said to be integral if the linear program over P has an optimal solution x^{∗} with integer coordinates. According to LP theory [22], an IP over an integral polyhedron can be solved in polynomial time by relaxing the integrality constraints, since the resulting LP has an integral optimal solution.
One common way of proving that a polyhedron is integral is to show that its constraint matrix is totally unimodular (TU) [21], where a matrix is said to be totally unimodular if the determinant of each square submatrix is 0, +1, or -1. This leads to the following proposition.
Proposition 1.
If a matrix A is totally unimodular and b is an integral vector, then the polyhedron P = {x : Ax ≤ b} is integral.
Proof
See [21].
We now show that the constraint matrix of the problem given in Equation 7 is TU. At this point, however, it is not easy to verify that this matrix is TU directly from the definition. Thus, we use a more general sufficient condition, given by the following theorem.
Theorem 1.
A matrix A is totally unimodular if

(i) a_{pq} \in \{0, +1, -1\} for all p, q, and

(ii) for any subset M of the rows of A, there exists a partition (M_1, M_2) of M such that each column q satisfies

\left| \sum_{p \in M_1} a_{pq} - \sum_{p \in M_2} a_{pq} \right| \le 1.
Proof.
See [22].
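Condition (ii) of Theorem 1 (which resembles the Ghouila-Houri characterization of TU matrices) can be checked by brute force on small examples: enumerate every row subset and search for a signing of its rows that keeps each column sum within ±1. The sketch below is exponential in the number of rows and intended purely for toy matrices:

```python
import numpy as np
from itertools import combinations, product

def ghouila_houri_condition(A):
    """Brute-force check of condition (ii) in Theorem 1: for every subset M of
    rows, some partition (M1, M2) keeps |sum_{M1} a_pq - sum_{M2} a_pq| <= 1
    in every column. Exponential in the row count; for tiny matrices only."""
    A = np.asarray(A)
    for size in range(1, A.shape[0] + 1):
        for M in combinations(range(A.shape[0]), size):
            # sign +1 puts a row in M1, sign -1 puts it in M2.
            if not any(
                np.all(np.abs(sum(sign * A[row] for sign, row in zip(signs, M))) <= 1)
                for signs in product((1, -1), repeat=size)
            ):
                return False
    return True

print(ghouila_houri_condition([[1, 1, 0], [0, 1, 1], [0, 0, 1]]))  # -> True
print(ghouila_houri_condition([[1, 1], [-1, 1]]))                  # -> False
```

The first matrix has the consecutive-ones (interval) structure, a classic TU family; the second contains a 2x2 submatrix with determinant 2 and therefore admits no valid partition of its two rows.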
With the more general condition in Theorem 1, we will show that the constraint matrix of the linear integer problem given in Equation 7 is totally unimodular. We first rewrite the constraints of this problem in standard form as
\begin{array}{ll}\sum_{j \in J_i} x_{ij} & \le 1,\end{array}
(8a)
\begin{array}{ll}-x_{ij} + z_{ij} & \le 0,\end{array}
(8b)
\begin{array}{ll}-x_{i-1,j} + z_{ij} & \le 0.\end{array}
(8c)
One can then easily express the above constraints in the form Ax ≤ b, where A denotes the constraint matrix, defined as A \triangleq \left[a_{pq}\right], \mathbf{x} \triangleq (x, z)^{T}, and b \triangleq \left(b_q\right) such that b_q ≥ 0. We then have the following theorem.
Theorem 2.
Let A be the constraint matrix of the linear integer problem given in Equation 7 (i.e., in the form Ax ≤ b). Then, matrix A is totally unimodular.
Proof.
We first introduce a partitioning method that obtains a partition (M_1, M_2) for an arbitrary subset M of the rows of A. Let B_i be a subset of the rows of A that includes all the rows corresponding to the coefficients of x_{ij} in the constraints given in Equation 8. Let Q_i^1, Q_i^2, and Q_i^3 denote the submatrices of A containing all the rows corresponding to constraints 8a, 8b, and 8c for interval i, respectively. Then, B_i is given as B_i = Q_i^1 \cup Q_i^2 \cup Q_{i+1}^3. For an arbitrary subset M, Algorithm 1 generates a partition (M_1, M_2) by iteratively testing every row in B_i \cap M and assigning the row to M_1 or M_2 for i ∈ I. It is straightforward to show that this partitioning rule obtains a partition (M_1, M_2) of M correctly.
The basic principle of assigning each tested row is as follows. For two different rows p and p' that both correspond to interval i, we perform the partition according to the two elements a_{pq} and a_{p'q} in the same column q. If both a_{pq} and a_{p'q} are +1 (or both -1), we assign the rows p and p' to different partitions, M_1 and M_2, respectively. If both a_{pq} and a_{p'q} are nonzero but their signs differ, i.e., (a_{pq}, a_{p'q}) = {+1, -1} or {-1, +1}, then we assign the rows to the same partition, M_1 or M_2. Since B_i \cap M and B_{i-1} \cap M can have nonzero values in the same column due to the third constraint (8c), we take into account the rows already assigned to M_1 and M_2 for interval i-1 when assigning the rows of B_i \cap M for interval i. Thus, we assign each row of B_i \cap M to M_1 or M_2 depending on whether a nonzero value in the same column has already been assigned to M_1 (or M_2). The partitioning rule is described in Algorithm 1.
For a given arbitrary subset M of the rows of A = [a_{pq}], we can obtain its partition (M_1, M_2) by Algorithm 1. Since condition (i) of Theorem 1 is already satisfied in Equation 8, we only need to show that the partition (M_1, M_2) satisfies condition (ii) of Theorem 1.
From Equation 8, we can easily see that the number of nonzero entries a_{pq} in column q over all p ∈ M is at most three. Therefore, the number of nonzero entries in the same column across M_1 and M_2 is at most three. Let N_q denote the number of nonzero entries in column q, counted over M_1 and M_2.

(i) N_q = 0, 1: nothing to prove.

(ii) N_q = 2: there are three subcases for (a_{pq}, a_{p'q}) where p ≠ p':

(ii-1) (a_{pq}, a_{p'q}) = \{+1, +1\}: In this case, p \in Q_i^2 and p' \in Q_{i+1}^3 (or p' \in Q_i^2 and p \in Q_{i+1}^3) for some i. This happens only if B_i \cap M \neq \emptyset, by the definition of B_i. Hence, this belongs to case 1-2 in Algorithm 1, and the two rows p and p' were assigned to different partitions. Thus, \left|\sum_{p \in M_1} a_{pq} - \sum_{p \in M_2} a_{pq}\right| \le 1 holds.

(ii-2) (a_{pq}, a_{p'q}) = \{+1, -1\} (or \{-1, +1\}): Since p \in Q_i^1 for some i, this belongs to case 1-1 in Algorithm 1, and the two rows p and p' in B_i \cap M are assigned to the same partition, M_1 or M_2. Thus, \left|\sum_{p \in M_1} a_{pq} - \sum_{p \in M_2} a_{pq}\right| \le 1 holds.

(ii-3) (a_{pq}, a_{p'q}) = \{-1, -1\}: This can only happen for a column with respect to z_{ij} for some i (i.e., rows spanning B_{i-1} and B_i). Then, the rows in M_1 (or M_2) and in B_i \cap M belong to case 1-2 in Algorithm 1; thus, the two rows p and p' are assigned to different partitions. Thus, \left|\sum_{p \in M_1} a_{pq} - \sum_{p \in M_2} a_{pq}\right| \le 1 holds.

(iii) N_q = 3: In this case, among the three nonzero entries a_{pq} with p ∈ M, two of them are -1 and the other one is +1. The partition (M_1, M_2) is performed according to case 1-1 in Algorithm 1, and the summation of +1, -1, and -1 is -1. Thus, \left|\sum_{p \in M_1} a_{pq} - \sum_{p \in M_2} a_{pq}\right| \le 1 holds.
In all cases, the nonzero entries in each column under the partition (M_1, M_2) obtained from Algorithm 1 satisfy condition (ii) of Theorem 1. Therefore, the constraint matrix A is totally unimodular.
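As a numerical sanity check of Theorem 2, the constraint matrix of Equations 8a to 8c can be assembled for a tiny instance and tested directly against the definition of TU (every square submatrix determinant in {-1, 0, +1}). The two-interval, one-shared-AP instance below is hypothetical and chosen only to keep the enumeration small:

```python
import numpy as np
from itertools import combinations

# Two intervals sharing one AP; variables ordered as (x_00, x_10, z_10).
# Rows follow Equations 8a-8c:
A = np.array([
    [ 1,  0, 0],   # (8a), interval 0: x_00 <= 1
    [ 0,  1, 0],   # (8a), interval 1: x_10 <= 1
    [ 0, -1, 1],   # (8b): -x_10 + z_10 <= 0
    [-1,  0, 1],   # (8c): -x_00 + z_10 <= 0
])

def all_subdets_in_unit_range(M):
    """True iff every square submatrix of M has determinant in {-1, 0, +1},
    i.e., M is totally unimodular by definition. Exponential; toy sizes only."""
    p, q = M.shape
    for k in range(1, min(p, q) + 1):
        for rows in combinations(range(p), k):
            for cols in combinations(range(q), k):
                d = round(np.linalg.det(M[np.ix_(rows, cols)].astype(float)))
                if d not in (-1, 0, 1):
                    return False
    return True

print(all_subdets_in_unit_range(A))   # -> True
```

The same check on a non-TU matrix such as [[1, 1], [-1, 1]] (a submatrix of determinant 2) returns False, so the test is not vacuous.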
Solving an IP over an integral polyhedron P can be done in polynomial time. Therefore, from the results of Theorems 1 and 2, we can obtain the solution of the integer program given in Equation 7 in polynomial time.
Remark
In addition to the optimal scheduling problem that maximizes throughput, our framework can be extended to obtain the optimality of association that minimizes the frequency of handoffs occurring in vehicles. In the Appendix, we formulate the optimal handoff minimization scheduling problem as a nonlinear integer program and show that this problem also satisfies the TU property, and thus is solvable in polynomial time.