Secure transmission of correlated sources over broadcast channels with ultra-low latency

Abstract

The transmission of correlated sources over broadcast channels is considered in this paper. Each receiver has access to correlated source side information, and each source at the sender must be kept secret from the unintended receiver. This communication model generalizes Tuncel’s source over a broadcast channel and Villard et al.’s source over a wiretap channel. An outer bound on the secure transmission region of arbitrarily correlated sources with equivocation-rate levels is derived under ultra-low latency and used to prove capacity results for several classes of sources and channels.

Introduction

The communication of two correlated sources S1 and S2 over a broadcast channel (BC) p(y1,y2|x) with correlated side information (SI) \({\bar {S}_{1}}\) and \({\bar {S}_{2}}\) at the receivers is considered [1–5]. In addition, each source should be kept as secret as possible from the unintended receiver, where secrecy is measured by the equivocation rate [6–9]. We refer to this model as the discrete memoryless BC-SI with two confidential sources (DM-BCCS-SI). The DM-BCCS-SI model is shown in Fig. 1 and covers various practical applications in distributed video compression, peer-to-peer data distribution systems, and wireless sensor networks. This paper investigates the reliability and security of the DM-BCCS-SI [10–13]. In general, four fundamental issues need to be solved: (i) How can distributed source codes decrease the transmission load while increasing secrecy rates? (ii) How can the capacity of BCs with arbitrarily correlated sources be found? (iii) How should a coding strategy for secure transmission be designed? (iv) How can a source-channel code be built to derive the optimal bounds or to make the source-channel separation theorem hold?

Fig. 1
figure1

System model for the DM-BCCS-SI

Although there have been results on source-channel coding for BCs, we have a limited understanding of general source-channel matching conditions for reliable transmission, let alone for secure transmission [14–19]. In 2006, Tuncel [20] found the optimal source-channel rate for broadcasting a common source to multiple receivers. In 2013, Villard et al. [21] investigated source-channel coding for secure transmission of a source over a 2-receiver wiretap channel with arbitrarily correlated side information at both receivers. In Tuncel’s and Villard et al.’s works, the source and channel variables are statistically independent, and in some special cases the source-channel separation theorem is proved to hold. In general, however, separation may be suboptimal for broadcasting arbitrarily correlated sources. So far, the best-known sufficient conditions for reliable transmission of arbitrarily correlated sources over a BC, first introduced by Han and Costa in [22], rely on the joint distribution of source and channel variables. On the other hand, necessary conditions were provided by Kramer et al. [23]. Recently, we studied broadcast channels with confidential sources (BCCS) without side information [24], which generalizes the Han-Costa model to the secure setting by requiring each source to be kept secret from the unintended recipient. In this paper, we are devoted to establishing sufficient and necessary conditions for secure transmission over the DM-BCCS-SI in Fig. 1.

Shannon showed that his inner bound is indeed the capacity region of the “restricted” two-way channel, in which the channel inputs of the users depend only on the messages (not on the previous channel outputs). Several improved outer bounds using “dependence-balance bounds” were proposed by Hekstra and Willems. In this paper, we consider both an inner bound and an outer bound. The source pair (S1,S2) is said to be admissible with secrecy level (ES1,ES2) for this BC-SI if for any λ, 0<λ<1, and for all sufficiently large m and n, there is a code with length-m source sequences and length-n codewords such that

$$ \left\{ \begin{array}{l} P_{e1}^{(m)} \le \lambda,\quad P_{e2}^{(m)} \le \lambda,\\ {E_{S1}} \le \frac{1}{m} H(S_{1}^{m}|Y_{2}^{n}\bar{S}_{2}^{m}) + \lambda,\;{E_{S2}} \le \frac{1}{m} H(S_{2}^{m}|Y_{1}^{n}\bar{S}_{1}^{m}) + \lambda \end{array} \right\} $$
(1)

where \(P_{e1}^{(m)}\) and \(P_{e2}^{(m)}\) are the respective error probabilities at receivers 1 and 2, and \(\frac {1}{m}H(S_{1}^{m}|Y_{2}^{n}\bar {S}_{2}^{m})\) is the equivocation rate, which measures the uncertainty about S1 at receiver 2 given the sequences \(Y_{2}^{n}\) and \(\bar {S}_{2}^{m}\); a similar interpretation applies to \(\frac {1}{m}H(S_{2}^{m}|Y_{1}^{n}\bar {S}_{1}^{m})\) at receiver 1. The set of all admissible sources with equivocation-rate levels (S1,S2,ES1,ES2) satisfying condition (1) is called the secure transmission region.
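To make the equivocation rate in (1) concrete, the following sketch computes the single-letter (m = n = 1) conditional entropy H(S1 | Y2, S̄2) for a small joint distribution. The pmf and its 0.8 agreement probability are invented for illustration, not taken from the paper.

```python
import itertools, math

# Invented toy triple: Y2 and Sbar2 each agree with a uniform bit S1
# with probability 0.8, independently given S1.
p = {}
for s1, y2, sb2 in itertools.product(range(2), repeat=3):
    p[(s1, y2, sb2)] = 0.5 * (0.8 if y2 == s1 else 0.2) \
                           * (0.8 if sb2 == s1 else 0.2)

def H_cond(p):
    """H(S1 | Y2, Sbar2) for a joint pmf keyed by (s1, y2, sbar2)."""
    q = {}                                   # marginal of (y2, sbar2)
    for (s1, y2, sb2), v in p.items():
        q[(y2, sb2)] = q.get((y2, sb2), 0.0) + v
    return -sum(v * math.log2(v / q[(y2, sb2)])
                for (s1, y2, sb2), v in p.items() if v > 0)

print(round(H_cond(p), 4))
```

The resulting value lies strictly between 0 (full knowledge) and H(S1) = 1 bit (perfect secrecy), which is exactly the range the equivocation-rate levels quantify.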

In this paper, we establish outer and inner bounds on the secure transmission region of the DM-BCCS-SI, which consists of the set of admissible sources together with a range of secrecy levels. Furthermore, the proposed outer bound is shown to be tight in the following three settings: (i) Joint source-channel coding, whose code distribution relies on the joint probability of source and channel variables. (ii) Separate source-channel coding, whose code distribution is a statistically independent product of source and channel distributions; such codes are not necessarily optimal for the source or the channel and are referred to as operational separation in [20, 25]. (iii) Informational separation, which refers to classical separation in the Shannon sense; that is, comparing the optimal source coding rate region with the channel capacity region is sufficient to determine the optimal secure transmission region.

An outer bound

Let K=f(S)=g(T) be the common variable in the sense of Gács and Körner (and also Witsenhausen), and consider auxiliary random variables W,U,V that satisfy the Markov chain

$$S \to TWUV \to X \to YZ $$

Consider a general outer bound for the DM-BCCS-SI. Assume the common variable K=a(S1)=b(S2) of S1 and S2 in the G-K sense. The source code length m may differ from the channel code length n (see Fig. 1). The auxiliary random variables \(\left ({\tilde K}, {\tilde S_{1}},{\tilde S_{2}}, {\tilde {\bar {S}}_{1}},{\tilde {\bar {S}}_{2}} \right)\) have the same probability distributions as \(\left ({K^{m}},S_{1}^{m},S_{2}^{m},\bar {S}_{1}^{m},\bar {S}_{2}^{m}\right)\), respectively (in accordance with [23, Theorem 1]).
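The Gács-Körner common part K = a(S1) = b(S2) can be computed mechanically: it is the connected-component label of the bipartite graph whose edges are the support of p(s1,s2). The sketch below is an illustrative implementation on an invented support set; the example pmf support is our own assumption.

```python
from collections import defaultdict

def gacs_korner(support):
    """Return the maps a: s1 -> K and b: s2 -> K (component labels)."""
    adj = defaultdict(set)
    for s1, s2 in support:
        adj[('s1', s1)].add(('s2', s2))
        adj[('s2', s2)].add(('s1', s1))
    label, k = {}, 0
    for node in adj:
        if node in label:
            continue
        stack = [node]
        while stack:                 # flood-fill one connected component
            u = stack.pop()
            if u in label:
                continue
            label[u] = k
            stack.extend(adj[u])
        k += 1
    a = {s: c for (t, s), c in label.items() if t == 's1'}
    b = {s: c for (t, s), c in label.items() if t == 's2'}
    return a, b

# Two blocks {0,1}x{0,1} and {2}x{2}: K only distinguishes the blocks.
a, b = gacs_korner({(0, 0), (0, 1), (1, 0), (2, 2)})
print(a, b)
```

Within each component the maps agree, so K is indeed a deterministic function of S1 alone and of S2 alone.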

Theorem 1

(Outer bound) An admissible source pair (S1,S2) with secrecy level (ES1,ES2) for the DM-BCCS-SI satisfies the following bounds

$$ H\left(K|{\bar{S}_{1}}\right)/R \le I\left(\tilde K;{Y_{1}}|{\tilde {\bar{S}}_{1}}{U_{1}}\right) $$
(2)
$$ H\left(K|{\bar{S}_{2}}\right)/R \le I\left(\tilde K;{Y_{2}}|{\tilde {\bar{S}}_{2}}{U_{2}}\right) $$
(3)
$$ H\left({S_{1}}|{\bar{S}_{1}}\right)/R \le I\left({\tilde S_{1}};{Y_{1}}|{\tilde {\bar{S}}_{1}}{U_{1}}\right) $$
(4)
$$ H\left({S_{2}}|{\bar{S}_{2}}\right)/R \le I\left({\tilde S_{2}};{Y_{2}}|{\tilde {\bar{S}}_{2}}{U_{2}}\right) $$
(5)
$$ \begin{array}{l} H\left({S_{1}}{S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right)/R\\ \le I\left({{\tilde{S}}_{1}};{Y_{1}}|{{\tilde{S}}_{2}}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right) + I\left({{\tilde{S}}_{2}}{{\tilde{\bar{S}}}_{1}}{U_{1}};{Y_{2}}|{{\tilde {\bar{S}}}_{2}}{U_{2}}\right) \end{array} $$
(6)
$$ \begin{array}{l} H\left({S_{1}}{S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right)/R\\ \le I\left({{\tilde{S}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{2}};{Y_{1}}|{{\tilde {\bar{S}}}_{1}}{U_{1}}\right) + I\left({{\tilde{S}}_{2}};{Y_{2}}|{{\tilde{S}}_{1}}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right) \end{array} $$
(7)
$$ \begin{array}{l} \left[H\left({S_{1}}|{{\bar{S}}_{1}}\right) \,+\, H\left({S_{2}}|{{\bar{S}}_{2}}\right) \,-\, I\left({S_{1}};{S_{2}}|{{\bar{S}}_{1}}K\right) \,-\, I\left({S_{2}};{{\bar{S}}_{1}}K|{{\bar{S}}_{2}}\right)\right]/R\\ \le \left[ \begin{array}{l} I\left(\tilde K{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}};{Y_{1}}\right) + I\left({{\tilde{S}}_{1}};{Y_{1}}|{{\tilde{S}}_{2}}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right)\\ + I\left({{\tilde{S}}_{2}};{Y_{2}}|\tilde K{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right) \end{array} \right] \end{array} $$
(8)
$$ \begin{array}{l} \left[H\left({S_{1}}|{{\bar{S}}_{1}}\right) \,+\, H\left({S_{2}}|{{\bar{S}}_{2}}\right) \,-\, I\left({S_{1}};{S_{2}}|{{\bar{S}}_{2}}K\right) \,-\, I\left({S_{1}};{{\bar{S}}_{2}}K|{{\bar{S}}_{1}}\right)\right]/R\\ \le \left[ \begin{array}{l} I\left(\tilde K{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}};{Y_{2}}\right) + I\left({{\tilde{S}}_{2}};{Y_{2}}|{{\tilde{S}}_{1}}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right)\\ + I\left({{\tilde{S}}_{1}};{Y_{1}}|\tilde K{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right) \end{array} \right] \end{array} $$
(9)
$$ {\begin{aligned} {E_{S1}} \le \left\{ \begin{array}{l} I\left({S_{1}};{{\bar{S}}_{1}}|{S_{2}}\right) - I\left({S_{1}};{{\bar{S}}_{2}}|{S_{2}}\right) + I\left(S_{1};\bar{S}_{2}|S_{2}\bar{S}_{1}\right)\\ + R\left[I\left({{\tilde{S}}_{1}};{Y_{1}}|{{\tilde{S}}_{2}}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right) - I\left({{\tilde{S}}_{1}};{Y_{2}}|{{\tilde{S}}_{2}}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right)\right],\\ I\left({S_{1}};{{\bar{S}}_{1}}|K\right) - I\left({S_{1}};{{\bar{S}}_{2}}|K\right) + I\left({S_{1}};{{\bar{S}}_{2}}|K{{\bar{S}}_{1}}\right)\\ + R\left[I\left({{\tilde{S}}_{1}};{Y_{1}}|\tilde K{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right) - I\left({{\tilde{S}}_{1}};{Y_{2}}|\tilde K{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right)\right],\\ H\left({S_{1}}|{S_{2}}{{\bar{S}}_{2}}\right) \end{array} \right\} \end{aligned}} $$
(10)
$$ {\begin{aligned} {E_{S2}} \le \left\{ \begin{array}{l} I\left({S_{2}};{{\bar{S}}_{2}}|{S_{1}}\right) - I\left({S_{2}};{{\bar{S}}_{1}}|{S_{1}}\right) + I\left(S_{2};\bar{S}_{1}|S_{1}\bar{S}_{2}\right)\\ + R\left[I\left({{\tilde{S}}_{2}};{Y_{2}}|{{\tilde{S}}_{1}}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right) - I\left({{\tilde{S}}_{2}};{Y_{1}}|{{\tilde{S}}_{1}}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right)\right],\\ I\left({S_{2}};{{\bar{S}}_{2}}|K\right) - I\left({S_{2}};{{\bar{S}}_{1}}|K\right) + I\left({S_{2}};{{\bar{S}}_{1}}|K{{\bar{S}}_{2}}\right)\\ + R\left[I\left({{\tilde{S}}_{2}};{Y_{2}}|\tilde K{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right) - I\left({{\tilde{S}}_{2}};{Y_{1}}|\tilde K{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right)\right],\\ H\left({S_{2}}|{S_{1}}{{\bar{S}}_{1}}\right) \end{array} \right\} \end{aligned}} $$
(11)

where R=n/m, and the joint distribution factorizes as

$$ \begin{aligned} &p\left(\tilde k{\tilde s_{1}}{\tilde s_{2}}{\tilde {\bar{s}}_{1}}{\tilde {\bar{s}}_{2}}{u_{1}}{u_{2}}x{y_{1}}{y_{2}}\right)= p\left(\tilde k{\tilde s_{1}}{\tilde s_{2}}{\tilde {\bar{s}}_{1}}{\tilde {\bar{s}}_{2}}{u_{1}}{u_{2}}\right)\\&p\left(x|\tilde k{\tilde s_{1}}{\tilde s_{2}}{u_{1}}{u_{2}}\right)p({y_{1}}{y_{2}}|x) \end{aligned} $$
(12)

Remarks 1

Without the side information \({\bar {S}_{1}},{\bar {S}_{2}},{\tilde {\bar {S}}_{1}},{\tilde {\bar {S}}_{2}}\), the bounds (2)-(11) are reduced to the bounds given in [24, Theorem 2].

Proof of Theorem 1

Fano’s inequality gives

$$ H\left({K^{m}}|Y_{1}^{n}\bar{S}_{1}^{m}\right) \le H\left(S_{1}^{m}|Y_{1}^{n}\bar{S}_{1}^{m}\right) \le P_{e1}^{(m)} \cdot m{\log_{2}}|{S_{1}}| + 1 $$
(13)
$$ H\left({K^{m}}|{Y_{2}}^{n}\bar{S}_{2}^{m}\right) \le H\left({S_{2}}^{m}|{Y_{2}}^{n}\bar{S}_{2}^{m}\right) \le P_{e2}^{(m)} \cdot m{\log_{2}}|{S_{2}}| + 1 $$
(14)
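As a quick sanity check of the Fano-type bounds (13)-(14), the invented single-letter example below (an erasure-style observation, our own construction with m = 1 and |S| = 3) verifies numerically that H(S|Y) stays below Pe·log2|S| + 1.

```python
import math

# Invented toy: S uniform on {0,1,2}; Y = S with prob 1 - eps,
# otherwise Y is an erasure symbol 'e'.
eps = 0.1
pSY = {}
for s in range(3):
    pSY[(s, s)] = (1 / 3) * (1 - eps)    # symbol received intact
    pSY[(s, 'e')] = (1 / 3) * eps        # symbol erased

pY = {}
for (s, y), v in pSY.items():
    pY[y] = pY.get(y, 0.0) + v

H_S_given_Y = -sum(v * math.log2(v / pY[y])
                   for (s, y), v in pSY.items() if v > 0)
Pe = eps * 2 / 3          # decoder: output y, guess S = 0 on erasure
bound = Pe * math.log2(3) + 1
print(round(H_S_given_Y, 4), "<=", round(bound, 4))
```

Here H(S|Y) = eps·log2 3 ≈ 0.158 bits, comfortably below the Fano bound, as the inequality requires.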

Let \({\delta _{1}} = P_{e1}^{(m)}{\log _{2}}|{S_{1}}| + 1/m\) and \({\delta _{2}} = P_{e2}^{(m)}{\log _{2}}|{S_{2}}| + 1/m\), and define the auxiliary random variables

$$ {U_{1i}} = {Y_{1}}^{i - 1},\quad {U_{2i}} = Y_{2i + 1}^{n} $$
(15)

which satisfy (12).

At first, we consider the entropy bounds of a single source S1 and have the facts

$$ mH({S_{1}}) = H\left(S_{1}^{m}\right) $$
(16)
$$ \begin{array}{l} m[H({S_{1}}) - {\delta_{1}}] \le H\left(S_{1}^{m}\right) - H\left(S_{1}^{m}|Y_{1}^{n}\bar{S}_{1}^{m}\right)\\ = I\left(S_{1}^{m};Y_{1}^{n}\bar{S}_{1}^{m}\right) = I\left(S_{1}^{m};\bar{S}_{1}^{m}\right) + I\left(S_{1}^{m};Y_{1}^{n}|\bar{S}_{1}^{m}\right)\\ = mI\left({S_{1}};\bar{S}_{1}\right) + {\sum\nolimits}_{i = 1}^{n} {I\left(S_{1}^{m};{Y_{1i}}|\bar{S}_{1}^{m}{Y_{1}}^{i - 1}\right)} \\ = mI\left({S_{1}};\bar{S}_{1}\right) + {\sum\nolimits}_{i = 1}^{n} {I\left({{\tilde{S}}_{1}};{Y_{1i}}|\tilde {\bar{S}}_{1}{U_{1i}}\right)} \\ = mI\left({S_{1}};{{\bar{S}}_{1}}\right) + nI\left({{\tilde{S}}_{1}};{Y_{1}}|\tilde {\bar{S}}_{1}{U_{1}}\right) \end{array} $$
(17)

where (16) follows from the memoryless property of the source and the inequality in (17) follows from Fano’s inequality (13). Consequently, we have

$$ m[H\left({S_{1}}|\bar{S}_{1}\right) - {\delta_{1}}] \le nI\left({\tilde S_{1}};{Y_{1}}|\tilde {\bar{S}}_{1}{U_{1}}\right) $$
(18)

Next, we consider the entropy bounds of two sources S1S2.

$$ \begin{array}{l} m[H\left({S_{1}}\right) + H\left({S_{2}}\right) - {\delta_{1}} - {\delta_{2}}]\\ \le\! I\left(S_{1}^{m};Y_{1}^{n}\bar{S}_{1}^{m}\right) + I\left(S_{2}^{m};Y_{2}^{n}\bar{S}_{2}^{m}\right)\\ =\! m[I\left({S_{1}};{\bar{S}_{1}}\right) \,+\, I\left({S_{2}};{\bar{S}_{2}}\right)] \!+ I\left(S_{1}^{m};Y_{1}^{n}|\bar{S}_{1}^{m}\right) \!+ I\left(S_{2}^{m};Y_{2}^{n}|\bar{S}_{2}^{m}\right) \end{array} $$
(19)
$$ \begin{array}{l} \le m[I\left({S_{1}};{{\bar{S}}_{1}}\right) + I\left({S_{2}};{{\bar{S}}_{2}}\right)] + I\left(S_{1}^{m};S_{2}^{m}\bar{S}_{2}^{m}Y_{1}^{n}|\bar{S}_{1}^{m}\right)\\ + I\left(S_{2}^{m};\bar{S}_{1}^{m}Y_{2}^{n}|\bar{S}_{2}^{m}\right)\\ = m\left[ {I\left({S_{1}};{{\bar{S}}_{1}}\right) \,+\, I\left({S_{2}};{{\bar{S}}_{2}}\right) \,+\, I\left({S_{1}};{S_{2}}{{\bar{S}}_{2}}|{{\bar{S}}_{1}}\right) + I\left(S_{2};\bar{S}_{1}|\bar{S}_{2}\right)} \right]\\ + I\left(S_{1}^{m};Y_{1}^{n}|S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right) + I\left(S_{2}^{m};Y_{2}^{n}|\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right) \end{array} $$
(20)

For any random variables W, \(Y^{n}\), \(Z^{n}\), we have

$$ I\left(W;Z^{n}\right) = \sum\limits_{i = 1}^{n} \left[ I\left(WY^{i-1};Z^{n}_{i} \right)-I\left(WY^{i};Z^{n}_{i+1}\right)\right], $$
(21)

where \(Y^{0}\) and \(Z^{n}_{n+1}\) are constants. Hence, the last two terms in (20) are bounded as follows

$$ \begin{array}{l} I\left(S_{1}^{m};Y_{1}^{n}|S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right) + I\left(S_{2}^{m};Y_{2}^{n}|\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right)\\ \le {\sum\nolimits}_{i = 1}^{n} {\left[ \begin{array}{l} I\left(S_{1}^{m};{Y_{1i}}|S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2i + 1}^{n}\right)\\ + I\left(S_{2}^{m}\bar{S}_{1}^{m}Y_{1}^{i - 1};{Y_{2i}}|\bar{S}_{2}^{m}Y_{2i + 1}^{n}\right) \end{array} \right]} \end{array} $$
(22)
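Identity (21) is purely algebraic (the sum telescopes to its first term), so it can be spot-checked numerically. The sketch below draws a random joint pmf for (W, Y², Z²) over binary alphabets, an invented example with an arbitrary seed, and confirms both sides agree.

```python
import itertools, math, random

# Invented random joint pmf over atoms (w, y1, y2, z1, z2).
random.seed(0)
n = 2
atoms = list(itertools.product(range(2), repeat=1 + 2 * n))
raw = [random.random() for _ in atoms]
tot = sum(raw)
p = {a: r / tot for a, r in zip(atoms, raw)}

def I(idx_a, idx_b):
    """Mutual information between coordinate groups idx_a and idx_b."""
    pa, pb, pab = {}, {}, {}
    for a, v in p.items():
        ka = tuple(a[i] for i in idx_a)
        kb = tuple(a[i] for i in idx_b)
        pa[ka] = pa.get(ka, 0.0) + v
        pb[kb] = pb.get(kb, 0.0) + v
        pab[(ka, kb)] = pab.get((ka, kb), 0.0) + v
    return sum(v * math.log2(v / (pa[ka] * pb[kb]))
               for (ka, kb), v in pab.items() if v > 0)

W, Y, Z = [0], [1, 2], [3, 4]                # coordinate indices
lhs = I(W, Z)                                # I(W; Z^n)
rhs = sum(I(W + Y[:i], Z[i:]) - I(W + Y[:i + 1], Z[i + 1:])
          for i in range(n))                 # telescoping sum in (21)
print(round(lhs, 6), round(rhs, 6))
```

Each subtracted term cancels the next added term, leaving I(W;Z^n) minus a vanishing final term, so the two printed values coincide up to floating-point error.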

Substituting (15) into (22), we have

$$ \begin{array}{l} m\left[H\left({S_{1}}{S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right) - {\delta_{1}} - {\delta_{2}}\right]\\ \le n\left[I\left(\tilde S_{1};Y_{1}|\tilde S_{2}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right) + I\left(\tilde S_{2}{{\tilde {\bar{S}}}_{1}}{U_{1}};Y_{2}|{{\tilde {\bar{S}}}_{2}}{U_{2}}\right)\right] \end{array} $$
(23)

Next, we consider another outer bound of (19)

$$ {\begin{aligned} \begin{array}{l} m\left[I\left({S_{1}};{{\bar{S}}_{1}}\right) + I\left({S_{2}};{{\bar{S}}_{2}}\right)\right] + I\left(S_{1}^{m};Y_{1}^{n}|\bar{S}_{1}^{m}\right) + I\left(S_{2}^{m};Y_{2}^{n}|\bar{S}_{2}^{m}\right)\\ \le m[I\left({S_{1}};{{\bar{S}}_{1}}\right) + I\left({S_{2}};{{\bar{S}}_{2}}\right)] + I\left(S_{1}^{m}{K^{m}};Y_{1}^{n}|\bar{S}_{1}^{m}\right)\\ + I\left(S_{2}^{m};{K^{m}}Y_{2}^{n}|\bar{S}_{2}^{m}\right)\\ \le m\left[I\left({S_{1}};{{\bar{S}}_{1}}\right) + I\left({S_{2}};{{\bar{S}}_{2}}\right)\right] + I\left({K^{m}};Y_{1}^{n}|\bar{S}_{1}^{m}\right) + I\left(S_{1}^{m};S_{2}^{m}|{K^{m}}\bar{S}_{1}^{m}\right)\\ + I\left(S_{1}^{m};Y_{1}^{n}|{K^{m}}S_{2}^{m}\bar{S}_{1}^{m}\right) + I\left(S_{2}^{m};Y_{1}^{n}|{K^{m}}\bar{S}_{1}^{m}\right) - I\left(S_{2}^{m};Y_{1}^{n}|{K^{m}}\bar{S}_{1}^{m}\right)\\ + H\left({K^{m}}|\bar{S}_{2}^{m}\right) + I\left(S_{2}^{m};\bar{S}_{1}^{m}Y_{2}^{n}|{K^{m}}\bar{S}_{2}^{m}\right)\\ \le m\left[I\left({S_{1}};{{\bar{S}}_{1}}\right) + I\left({S_{2}};{{\bar{S}}_{2}}\right) + I\left({S_{1}};{S_{2}}|{{\bar{S}}_{1}}K\right) + I\left({S_{2}};{{\bar{S}}_{1}}K|{{\bar{S}}_{2}}\right)\right]\\ + I\left({K^{m}}S_{1}^{m}S_{2}^{m};Y_{1}^{n}|\bar{S}_{1}^{m}\right) - I\left(S_{2}^{m};Y_{1}^{n}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right)\\ + I\left(S_{2}^{m};Y_{2}^{n}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right) \end{array} \end{aligned}} $$
(24)

For any random variables W, \(Y^{n}\), \(Z^{n}\), the Csiszár sum identity gives

$$ \sum\limits_{i = 1}^{n} I\left(Z_{i};Y^{i-1}|WZ^{n}_{i+1} \right)= \sum\limits_{i = 1}^{n} I\left(Y_{i};Z^{n}_{i+1}|WY^{i-1}\right). $$
(25)
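Like (21), the Csiszár sum identity underlying (25) can be verified numerically. The sketch below checks it on a randomly drawn joint pmf for (W, Y², Z²) over binary alphabets (an invented example), computing conditional mutual informations from joint entropies.

```python
import itertools, math, random

# Invented random joint pmf over atoms (w, y1, y2, z1, z2).
random.seed(1)
n = 2
atoms = list(itertools.product(range(2), repeat=1 + 2 * n))
raw = [random.random() for _ in atoms]
tot = sum(raw)
p = {a: r / tot for a, r in zip(atoms, raw)}

def H(idx):
    """Joint entropy of the coordinate group idx."""
    m = {}
    for a, v in p.items():
        k = tuple(a[i] for i in idx)
        m[k] = m.get(k, 0.0) + v
    return -sum(v * math.log2(v) for v in m.values() if v > 0)

def I(a, b, c):
    """Conditional mutual information I(A; B | C) via joint entropies."""
    return H(a + c) + H(b + c) - H(c) - H(a + b + c)

W, Y, Z = [0], [1, 2], [3, 4]
lhs = sum(I(Z[i + 1:], [Y[i]], W + Y[:i]) for i in range(n))
rhs = sum(I(Y[:i], [Z[i]], W + Z[i + 1:]) for i in range(n))
print(round(lhs, 6), round(rhs, 6))
```

Writing each conditional mutual information as a combination of four joint entropies keeps the check to a single marginalization routine.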

As a result, the last three terms in (24) are bounded as follows

$$ {\begin{aligned} \begin{array}{l} \left[ \begin{array}{l} I\left({K^{m}}S_{1}^{m}S_{2}^{m};Y_{1}^{n}|\bar{S}_{1}^{m}\right) + I\left(S_{2}^{m};Y_{2}^{n}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right)\\ - I\left(S_{2}^{m};Y_{1}^{n}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right) \end{array} \right] \le \\ \sum\limits_{i = 1}^{n} {\left[I\left({K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2i + 1}^{n};Y_{1i}\right)+ I\left(S_{1}^{m};Y_{1i}|{K^{m}}S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2i + 1}^{n}\right)\right.}\\ \left.+ I\left(S_{2}^{m};Y_{2i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right)\right] \end{array} \end{aligned}} $$
(26)

Combining (24) and (26), we obtain

$$ \begin{array}{l} m\left[ \begin{array}{l} H({S_{1}}|{{\bar{S}}_{1}}) + H({S_{2}}|{{\bar{S}}_{2}}) - I({S_{1}};{S_{2}}|{{\bar{S}}_{1}}K)\\ - I(S_{2};{{\bar{S}}_{1}}K|{{\bar{S}}_{2}}) - {\delta_{1}} - {\delta_{2}} \end{array} \right]\\ \le n\left[ \begin{array}{l} I(\tilde K{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}};{Y_{1}}) + I({{\tilde{S}}_{1}};{Y_{1}}|{{\tilde{S}}_{2}}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}})\\ + I({{\tilde{S}}_{2}};{Y_{2}}|\tilde K{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}) \end{array} \right] \end{array} $$
(27)

We now consider the equivocation-rate bounds

$$ {\begin{aligned} \begin{array}{l} m{E_{S1}} \le H\left(S_{1}^{m}|Y_{2}^{n}\bar{S}_{2}^{m}\right)\\ = H\left(S_{1}^{m}|Y_{2}^{n}\bar{S}_{2}^{m}S_{2}^{m}\right) + I\left(S_{1}^{m};S_{2}^{m}|Y_{2}^{n}\bar{S}_{2}^{m}\right)\\ \le H\left(S_{1}^{m}|Y_{2}^{n}\bar{S}_{2}^{m}S_{2}^{m}\right) + H\left(S_{2}^{m}|Y_{2}^{n}\bar{S}_{2}^{m}\right)\\ \le H\left(S_{1}^{m}|S_{2}^{m}\right) - I\left(S_{1}^{m};Y_{2}^{n}\bar{S}_{2}^{m}|S_{2}^{m}\right) + m{\delta_{2}}\\ = I\left(S_{1}^{m};Y_{1}^{n}\bar{S}_{1}^{m}|S_{2}^{m}\right) + H\left(S_{1}^{m}|Y_{1}^{n}S_{2}^{m}\bar{S}_{1}^{m}\right) - I\left(S_{1}^{m};Y_{2}^{n}\bar{S}_{2}^{m}|S_{2}^{m}\right) + m{\delta_{2}}\\ \le I\left(S_{1}^{m};\bar{S}_{1}^{m}|S_{2}^{m}\right) + I\left(S_{1}^{m};Y_{1}^{n}|S_{2}^{m}\bar{S}_{1}^{m}\right) - I\left(S_{1}^{m};\bar{S}_{2}^{m}|S_{2}^{m}\right)\\ - I\left(S_{1}^{m};Y_{2}^{n}|S_{2}^{m}\bar{S}_{2}^{m}\right) + m\left({\delta_{1}} + {\delta_{2}}\right)\\ \le m\left[I\left({S_{1}};{{\bar{S}}_{1}}|{S_{2}}\right) - I\left({S_{1}};{{\bar{S}}_{2}}|{S_{2}}\right) + {\delta_{1}} + {\delta_{2}}\right]\\ + I\left(S_{1}^{m};\bar{S}_{2}^{m}Y_{1}^{n}|S_{2}^{m}\bar{S}_{1}^{m}\right) - I\left(S_{1}^{m};Y_{2}^{n}|S_{2}^{m}\bar{S}_{2}^{m}\right)\\ = m\left[I\left({S_{1}};{{\bar{S}}_{1}}|{S_{2}}\right) - I\left({S_{1}};{{\bar{S}}_{2}}|{S_{2}}\right) + {\delta_{1}} + {\delta_{2}}\right] + I\left(S_{1}^{m};Y_{1}^{n}\bar{S}_{2}^{m}|S_{2}^{m}\bar{S}_{1}^{m}\right)\\ - I\left(S_{1}^{m};\bar{S}_{1}^{m}Y_{2}^{n}|S_{2}^{m}\bar{S}_{2}^{m}\right) + I\left(S_{1}^{m};\bar{S}_{1}^{m}|S_{2}^{m}\bar{S}_{2}^{m}Y_{2}^{n}\right)\\ = m\left[I\left({S_{1}};{{\bar{S}}_{1}}|{S_{2}}\right) - I\left({S_{1}};{{\bar{S}}_{2}}|{S_{2}}\right) + {\delta_{1}} + {\delta_{2}}\right] + I\left(S_{1}^{m};\bar{S}_{2}^{m}|S_{2}^{m}\bar{S}_{1}^{m}\right)\\ + I\left(S_{1}^{m};Y_{1}^{n}|S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right) - I\left(S_{1}^{m};Y_{2}^{n}|S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right)\\ - 
I\left(S_{1}^{m};\bar{S}_{1}^{m}|S_{2}^{m}\bar{S}_{2}^{m}\right) + I\left(S_{1}^{m};\bar{S}_{1}^{m}|S_{2}^{m}\bar{S}_{2}^{m}Y_{2}^{n}\right)\\ \le m\left[I\left({S_{1}};{{\bar{S}}_{1}}|{S_{2}}\right) - I\left({S_{1}};{{\bar{S}}_{2}}|{S_{2}}\right) + I\left({S_{1}};{{\bar{S}}_{2}}|{S_{2}}{{\bar{S}}_{1}}\right) + {\delta_{1}} + {\delta_{2}}\right]\\ + I\left(S_{1}^{m};Y_{1}^{n}|S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right) - I\left(S_{1}^{m};Y_{2}^{n}|S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right)\\ = m\left[I\left({S_{1}};{{\bar{S}}_{1}}|{S_{2}}\right) - I\left({S_{1}};{{\bar{S}}_{2}}|{S_{2}}\right) + I\left({S_{1}};{{\bar{S}}_{2}}|{S_{2}}{{\bar{S}}_{1}}\right) + {\delta_{1}} + {\delta_{2}}\right] + \\ \sum\limits_{i = 1}^{n} {\left[ {I\left(S_{1}^{m};Y_{1i}|S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right) - I\left(S_{1}^{m};Y_{2i}|S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right)} \right]} \end{array} \end{aligned}} $$
(28)

Therefore, we have

$$ {\begin{aligned} \begin{array}{l} m{E_{S1}} \le m\left[I\left({S_{1}};{\bar{S}_{1}}|{S_{2}}\right) - I\left({S_{1}};{\bar{S}_{2}}|{S_{2}}\right) + I\left({S_{1}};{\bar{S}_{2}}|{S_{2}}{\bar{S}_{1}}\right) + {\delta_{1}} + {\delta_{2}}\right]\\ + n\left[I\left({\tilde S_{1}};{Y_{1}}|{\tilde S_{2}}{\tilde {\bar{S}}_{1}}{\tilde {\bar{S}}_{2}}{U_{1}}{U_{2}}\right) - I\left({\tilde S_{1}};{Y_{2}}|{\tilde S_{2}}{\tilde {\bar{S}}_{1}}{\tilde {\bar{S}}_{2}}{U_{1}}{U_{2}}\right)\right] \end{array} \end{aligned}} $$
(29)

We consider another case

$$ {\begin{aligned} \begin{array}{l} m{E_{S1}} \le H\left(S_{1}^{m}|Y_{2}^{n}\bar{S}_{2}^{m}\right)\\ = H\left(S_{1}^{m}|{K^{m}}Y_{2}^{n}\bar{S}_{2}^{m}\right) + I\left(S_{1}^{m};{K^{m}}|Y_{2}^{n}\bar{S}_{2}^{m}\right)\\ \le H\left(S_{1}^{m}|{K^{m}}\right) - I\left(S_{1}^{m};Y_{2}^{n}\bar{S}_{2}^{m}|{K^{m}}\right) + m{\delta_{2}}\\ \le I\left(S_{1}^{m};Y_{1}^{n}\bar{S}_{1}^{m}|{K^{m}}\right) - I\left(S_{1}^{m};Y_{2}^{n}\bar{S}_{2}^{m}|{K^{m}}\right) + m\left({\delta_{1}} + {\delta_{2}}\right)\\ = I\left(S_{1}^{m};\bar{S}_{1}^{m}|{K^{m}}\right) - I\left(S_{1}^{m};\bar{S}_{2}^{m}|{K^{m}}\right)\\ + I\left(S_{1}^{m};Y_{1}^{n}|{K^{m}}\bar{S}_{1}^{m}\right) - I\left(S_{1}^{m};Y_{2}^{n}|{K^{m}}\bar{S}_{2}^{m}\right) + m\left({\delta_{1}} + {\delta_{2}}\right)\\ \le m\left[I\left({S_{1}};{{\bar{S}}_{1}}|K\right) - I\left({S_{1}};{{\bar{S}}_{2}}|K\right) + {\delta_{1}} + {\delta_{2}}\right]\\ + I\left(S_{1}^{m};\bar{S}_{2}^{m}Y_{1}^{n}|{K^{m}}\bar{S}_{1}^{m}\right) - I\left(S_{1}^{m};Y_{2}^{n}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right)\\ = m\left[I\left({S_{1}};{{\bar{S}}_{1}}|K\right) - I\left({S_{1}};{{\bar{S}}_{2}}|K\right) + I\left({S_{1}};{{\bar{S}}_{2}}|K{{\bar{S}}_{1}}\right) + {\delta_{1}} + {\delta_{2}}\right]\\ + I\left(S_{1}^{m};Y_{1}^{n}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right) - I\left(S_{1}^{m};Y_{2}^{n}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right)\\ = m\left[I\left({S_{1}};{{\bar{S}}_{1}}|K\right) - I\left({S_{1}};{{\bar{S}}_{2}}|K\right) + I\left({S_{1}};{{\bar{S}}_{2}}|K{{\bar{S}}_{1}}\right) + {\delta_{1}} + {\delta_{2}}\right] + \\ \sum\limits_{i = 1}^{n} {\left[I\left(S_{1}^{m};Y_{1i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2i + 1}^{n}\right) - I\left(S_{1}^{m};{Y_{2i}}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2i + 1}^{n}\right)\right]} \end{array} \end{aligned}} $$

Therefore, we have

$$ {\begin{aligned} \begin{array}{l} m{E_{S1}} \le m\left[I\left({S_{1}};{\bar{S}_{1}}|K\right) - I\left({S_{1}};{\bar{S}_{2}}|K\right) + I\left({S_{1}};{\bar{S}_{2}}|K{\bar{S}_{1}}\right) + {\delta_{1}} + {\delta_{2}}\right]\\ + n\left[I\left({\tilde S_{1}};{Y_{1}}|\tilde K{\tilde {\bar{S}}_{1}}{\tilde {\bar{S}}_{2}}{U_{1}}{U_{2}}\right) - I\left({\tilde S_{1}};{Y_{2}}|\tilde K{\tilde {\bar{S}}_{1}}{\tilde {\bar{S}}_{2}}{U_{1}}{U_{2}}\right)\right] \end{array} \end{aligned}} $$
(30)

We also have the following chain of inequalities

$$ \begin{array}{l} m{E_{S1}} \le H\left(S_{1}^{m}|Y_{2}^{n}\bar{S}_{2}^{m}\right)\\ = H\left(S_{1}^{m}|S_{2}^{m}\bar{S}_{2}^{m}Y_{2}^{n}\right) + I\left(S_{1}^{m};S_{2}^{m}|Y_{2}^{n}\bar{S}_{2}^{m}\right)\\ \le H\left(S_{1}^{m}|S_{2}^{m}\bar{S}_{2}^{m}\right) + m{\delta_{2}}\\ = mH\left({S_{1}}|{S_{2}}{\bar{S}_{2}}\right) + m{\delta_{2}} \end{array} $$
(31)

According to (18), we get (4), and similarly get (2), (3), and (5). According to (23), (27), (29), (30), and (31), we get (6), (8), and (10) and symmetrically get (7), (9), and (11).

An inner bound

Theorem 2

(Inner bound) A source pair (S1,S2) with secrecy level (ES1,ES2) is admissible for the DM-BCCS-SI if

$$ H\left({S_{1}}\right) < I\left({U_{0}}{U_{1}}{S_{1}};{Y_{1}}{\bar{S}_{1}}\right) - I\left({U_{0}}{U_{1}};{S_{2}}|{S_{1}}\right) $$
(32)
$$ H\left({S_{2}}\right) < I\left({U_{0}}{U_{2}}{S_{2}};{Y_{2}}{\bar{S}_{2}}\right) - I\left({U_{0}}{U_{2}};{S_{1}}|{S_{2}}\right) $$
(33)
$$ \begin{array}{l} H\left({S_{1}}{S_{2}}\right) < I\left({U_{0}}{U_{1}}{S_{1}};{Y_{1}}{{\bar{S}}_{1}}\right) + I\left({U_{2}}{S_{2}};{Y_{2}}{{\bar{S}}_{2}}|K{U_{0}}\right)\\ - I\left({U_{1}}{S_{1}};{U_{2}}{S_{2}}|K{U_{0}}\right) \end{array} $$
(34)
$$ \begin{array}{l} H\left({S_{1}}{S_{2}}\right) < I\left({U_{1}}{S_{1}};{Y_{1}}{{\bar{S}}_{1}}|K{U_{0}}\right) + I\left({U_{0}}{U_{2}}{S_{2}};{Y_{2}}{{\bar{S}}_{2}}\right)\\ - I\left({U_{1}}{S_{1}};{U_{2}}{S_{2}}|K{U_{0}}\right) \end{array} $$
(35)
$$ \begin{array}{l} H\left({S_{1}}{S_{2}}\right) < I\left({U_{0}}{U_{1}}{S_{1}};{Y_{1}}{{\bar{S}}_{1}}\right) + I\left({U_{0}}{U_{2}}{S_{2}};{Y_{2}}{{\bar{S}}_{2}}\right)\\ - I\left({U_{1}}{S_{1}};{U_{2}}{S_{2}}|K{U_{0}}\right) - I\left({S_{1}}{S_{2}};K{U_{0}}\right) \end{array} $$
(36)
$$ {E_{S1}} < H\left({S_{1}}|{S_{2}}{\bar{S}_{2}}{U_{0}}{U_{2}}{Y_{2}}\right) $$
(37)
$$ {E_{S2}} < H\left({S_{2}}|{S_{1}}{\bar{S}_{1}}{U_{0}}{U_{1}}{Y_{1}}\right) $$
(38)
$$ {E_{S1}} < I\left({S_{1}}{U_{1}};{Y_{1}}{\bar{S}_{1}}|K{U_{0}}\right) - I\left({S_{1}}{U_{1}};{S_{2}}{\bar{S}_{2}}{U_{2}}{Y_{2}}|K{U_{0}}\right) $$
(39)
$$ {E_{S2}} < I\left({S_{2}}{U_{2}};{Y_{2}}{\bar{S}_{2}}|K{U_{0}}\right) - I\left({S_{2}}{U_{2}};{S_{1}}{\bar{S}_{1}}{U_{1}}{Y_{1}}|K{U_{0}}\right) $$
(40)

for all the distributions

$$ \begin{array}{l} p\left({s_{1}}{s_{2}}{{\bar{s}}_{1}}{{\bar{s}}_{2}}{u_{0}}{u_{1}}{u_{2}}x{y_{1}}{y_{2}}\right)\\ = p\left({s_{1}}{s_{2}}{{\bar{s}}_{1}}{{\bar{s}}_{2}}\right)p\left({u_{0}}{u_{1}}{u_{2}}x|{s_{1}}{s_{2}}\right)p\left({y_{1}}{y_{2}}|x\right) \end{array} $$
(41)

Remarks 2

The proof of Theorem 2 uses joint source-channel coding. We choose R=1 (m=n) so as to apply joint typical decoding to the source and channel sequences, the same method used in [22–24, 26]. In addition, the received sequences \(Y_{1}^{n}\) and \(\bar {S}_{1}^{n}\) can be combined into one sequence, so that the proof of Theorem 2 follows that of [24, Theorem 1]. Owing to limited space, we omit the proof of the inner bound here.

Remarks 3

Theorems 1 and 2 extend Villard et al.’s secure transmission of a source over a wiretap channel [21] to that of two correlated sources over a BC. Inequalities (32)-(36) and (2)-(9) are, respectively, the inner and outer bounds for reliable transmission without security constraints; these bounds extend Tuncel’s [20] and Kang et al.’s results [26] to arbitrarily correlated sources and extend Timo et al.’s result [27] for noiseless networks to noisy BCs. If \({\bar {S}_{1}}\) and \({\bar {S}_{2}}\) are constants, Theorem 2 reduces to [24, Theorem 1].

Special cases

We here consider three classes of the DM-BCCS-SI: joint source-channel coding, a single source passing through BCs with degraded SI, and independent sources given SI. Furthermore, we assume R=1, i.e., n=m. In this case, the capacity theorem proofs in the first two subsections follow from Theorems 1 and 2 and are not given here.

Joint source-channel coding

Markov sources and degraded SI

Assume deterministic side information at the receivers of the DM-BCCS-SI.

Theorem 3

\(\left ({S_{1}},{S_{2}},K,{\bar {S}_{1}},{\bar {S}_{2}}\right)\) forms the Markov chains

$$ {S_{1}} \to K \to {S_{2}},\;{S_{1}} \to {\bar{S}_{1}} \to {\bar{S}_{2}},\;{S_{2}} \to {\bar{S}_{2}} \to {\bar{S}_{1}} $$
(42)

and deterministic functions

$$ {\bar{S}_{1}} = {F_{1}}\left({S_{1}}\right),\;{\bar{S}_{2}} = {F_{2}}\left({S_{2}}\right) $$
(43)

(S1,S2) with secrecy level (ES1,ES2) is admissible for the semi-deterministic DM-BCCS-SI, i.e., y1=f(x), if

$$ H\left(K|{\bar{S}_{1}}{\bar{S}_{2}}\right) < \min \{ I\left({U_{0}};{Y_{1}}\right),I\left({U_{0}};{Y_{2}}\right)\} $$
(44)
$$ H\left({S_{1}}|{\bar{S}_{1}}\right) < H\left({Y_{1}}\right) $$
(45)
$$ H\left({S_{2}}|{\bar{S}_{2}}\right) < I\left({U_{0}}{U_{2}};{Y_{2}}\right) $$
(46)
$$ \begin{array}{l} H\left({S_{1}}|{{\bar{S}}_{1}}\right) + H\left({S_{2}}|{{\bar{S}}_{2}}\right) - I\left({S_{1}};{S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right)\\ < I\left({U_{0}};{Y_{1}}\right) + I\left({U_{2}};{Y_{2}}|{U_{0}}\right) + H\left({Y_{1}}|{U_{0}}{U_{2}}\right) \end{array} $$
(47)
$$ \begin{array}{l} H\left({S_{1}}|{{\bar{S}}_{1}}\right) + H\left({S_{2}}|{{\bar{S}}_{2}}\right) - I\left({S_{1}};{S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right)\\ < I\left({U_{0}}{U_{2}};{Y_{2}}\right) + H\left({Y_{1}}|{U_{0}}{U_{2}}\right) \end{array} $$
(48)
$$ {E_{S1}} < \min \{ H\left({\bar{S}_{1}}|{S_{2}}\right) + H\left({Y_{1}}|{Y_{2}}{U_{0}}{U_{2}}\right),\;H\left({S_{1}}|{S_{2}}\right)\} $$
(49)
$$ \begin{array}{l} {E_{S2}} < \min \{ H\left({{\bar{S}}_{2}}|{S_{1}}\right) + I\left({U_{2}};{Y_{2}}|{U_{0}}\right) - I\left({U_{2}};{Y_{1}}|{U_{0}}\right)\!,\;\\ H\left({S_{2}}|{S_{1}}\right)\} \end{array} $$
(50)

for some distribution

$$p\left({s_{1}}{s_{2}}{\bar{s}_{1}}{\bar{s}_{2}}\right)p\left({u_{0}}{u_{2}}|{s_{1}}{s_{2}}\right)p\left(x|{u_{0}}{u_{2}}\right)p\left({y_{1}}{y_{2}}|x\right). $$
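The hypotheses of Theorem 3 are easy to check mechanically for a candidate source. The sketch below builds an invented pmf whose G-K common part is K = ⌊s/2⌋ (the block index) and verifies the chain S1 → K → S2 from (42); the construction and numerical tolerance are our own assumptions, and the deterministic SI maps of (43) could, for instance, be taken as F1 = F2 = K itself.

```python
import itertools

# Invented toy source: symbols {0,1,2,3} split into blocks {0,1}, {2,3};
# (s1, s2) is uniform over same-block pairs, so K = s // 2 is common.
p = {}
for s1, s2 in itertools.product(range(4), repeat=2):
    if s1 // 2 == s2 // 2:           # same block => positive probability
        p[(s1, s2)] = 0.125          # 8 support pairs, uniform

K = lambda s: s // 2                 # common variable: a(S1) = b(S2)

def markov_S1_K_S2(p, K):
    """True iff p(s1, s2) = p(s1) * p(s2 | K(s1)), i.e. S1 -> K -> S2."""
    pk, p1, p2k = {}, {}, {}
    for (s1, s2), v in p.items():
        k = K(s1)
        pk[k] = pk.get(k, 0.0) + v
        p1[s1] = p1.get(s1, 0.0) + v
        p2k[(s2, k)] = p2k.get((s2, k), 0.0) + v
    return all(abs(p.get((s1, s2), 0.0)
                   - p1.get(s1, 0.0) * p2k.get((s2, K(s1)), 0.0) / pk[K(s1)])
               < 1e-12
               for s1, s2 in itertools.product(range(4), repeat=2))

print(markov_S1_K_S2(p, K))
```

Perturbing any single entry of the pmf breaks the factorization and the check returns False, which makes the routine a simple filter for sources satisfying (42).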

Less-noisy BCs with partially degraded SI

Theorem 4

Consider the class of less-noisy DM-BCCS-SI defined by I(U;Y1)≥I(U;Y2) for all Markov chains \(U \to X \to {Y_{1}}{Y_{2}}\), and suppose \(\left ({S_{1}},{S_{2}},{\bar {S}_{1}},{\bar {S}_{2}}\right)\) forms the Markov chains

$$ {\bar{S}_{2}} \to {\bar{S}_{1}} \to {S_{1}}{S_{2}},\;\bar{S}_{1} \to \bar{S}_{2} \to S_{2} $$
(51)

(S1,S2) with secrecy level (ES1,ES2) is admissible if

$$ H({S_{2}}|{\bar{S}_{2}}) < I(U;{Y_{2}}) $$
(52)
$$ H\left({S_{1}}{S_{2}}|{\bar{S}_{1}}\right) < I(X;{Y_{1}}|U) + I(U;{Y_{2}}) $$
(53)
$$ \begin{array}{l} {E_{S1}} < \min \left\{ I(X;{Y_{1}}|U) - I(X;{Y_{2}}|U) + I\left({S_{1}};{{\bar{S}}_{1}}|{S_{2}}\right)\right.\\ - I\left.\left({S_{1}};{{\bar{S}}_{2}}|{S_{2}}\right),\;H\left({S_{1}}|{S_{2}}{{\bar{S}}_{2}}\right)\right\} \end{array} $$
(54)
$$ {E_{S2}} = 0 $$
(55)

for some distribution \(p\left ({s_{1}}{s_{2}}{\bar {s}_{1}}{\bar {s}_{2}}\right)p\left (u|{s_{1}}{s_{2}}\right)p\left (x|u\right)p\left ({y_{1}}{y_{2}}|x\right)\).

Remarks 4

Without side information and security constraints, Theorems 3 and 4 reduce to [23, Theorems 3 and 4], first discussed by Kramer et al. It should be noted that operational separation also holds for these two cases; that is, an independent distribution of source and channel variables is also sufficient for the optimal results.

Operational separation: a single source passing through BCs with degraded SI

A single source S transmission over BC with side information \({\bar {S}_{1}}\) and \({\bar {S}_{2}}\) at both receivers is considered.

Theorem 5

(i) S is reliably transmitted if

$$ H(S|{\bar{S}_{1}}) < \min \{ I(X;{Y_{1}}),I(X;{Y_{1}}|U) + I(U;{Y_{2}})\} $$
(56)
$$ H\left(S|{\bar{S}_{2}}\right) < I(U;{Y_{2}}) $$
(57)

(ii) Under security constraints, where Receiver 1 is the legitimate user and Receiver 2 can be seen as an eavesdropper (i.e., a wiretap channel), S with secrecy level ES is admissible if

$$ H\left(S|{\bar{S}_{1}}\right) < I(U;{Y_{1}}) $$
(58)
$$ \begin{array}{l} {E_{S}} < \min \{ I\left(S;{{\bar{S}}_{1}}\right) - I\left(S;{{\bar{S}}_{2}}\right) + I\left(U;{Y_{1}}|Q\right)\\ - I\left(U;{Y_{2}}|Q\right),H\left(S|{{\bar{S}}_{2}}\right)\} \end{array} $$
(59)

for some distribution \(p\left ({\bar {S}_{1}}{\bar {S}_{2}}|s\right)p\left (qu\right)p\left (x|u\right)p\left ({y_{1}}{y_{2}}|x\right)\), and \(\left (S,{\bar {S}_{1}},{\bar {S}_{2}}\right)\) satisfies the Markov chain \(S \to {\bar {S}_{1}} \to {\bar {S}_{2}}\).

For K=2 receivers, source-channel rate R is achievable using separate source and channel coders if and only if

$$ \left(H(S|{\bar{S}_{1}}), H(S|{\bar{S}_{2}}) \right)\in RC^{dm}, $$
(60)

where \(RC^{dm} = \{(R{R_{1}},R{R_{2}}):({R_{1}},{R_{2}}) \in C^{dm}\}\) is the capacity region \(C^{dm}\) scaled by the source-channel rate R.

Remarks 5

Using operational separation, the source variables \((S,{\bar {S}_{1}},{\bar {S}_{2}})\) are independent of the channel variables (U,X,Y1,Y2). The reliability bounds (56)-(57) are a special case of [20, Theorem 5]. The secrecy bounds (58)-(59) extend Merhav's bounds for the degraded wiretap channel [28].

Informational separation: independent sources given SI

Theorem 6

Consider a semi-deterministic DM-BCCS-SI, i.e., \({y_{1}} = f(x)\), where \(\left ({S_{1}},{S_{2}},{\bar {S}_{1}},{\bar {S}_{2}}\right)\) forms the Markov chains

$$ {S_{1}} \to {\bar{S}_{1}} \to {S_{2}},\;{S_{1}} \to {\bar{S}_{2}} \to {S_{2}} $$
(61)
$$ {S_{1}} \to {\bar{S}_{1}} \to {\bar{S}_{2}},\;{S_{2}} \to {\bar{S}_{2}} \to {\bar{S}_{1}}\; $$
(62)

(i) (S1,S2) is reliably transmitted if

$$ H\left({S_{1}}|{\bar{S}_{1}}\right) < H({Y_{1}}) $$
(63)
$$ H\left({S_{2}}|{\bar{S}_{2}}\right) < I(U;{Y_{2}}) $$
(64)
$$ H\left({S_{1}}|{\bar{S}_{1}}\right) + H\left({S_{2}}|{\bar{S}_{2}}\right) < H({Y_{1}}|U) + I(U;{Y_{2}}) $$
(65)

(ii) The secrecy level pair (ES1,ES2) is admissible if

$$ \begin{array}{l} {\!\!\!E_{S1}} \!<\! \min\left \{ I\left({S_{1}};{{\bar{S}}_{1}}|{S_{2}}\right) \,-\, I\left({S_{1}};{{\bar{S}}_{2}}|{S_{2}}\right) \,+\, H\left({Y_{1}}|{Y_{2}}{U_{0}}{U_{2}}\right)\right.\!\!,\;\\ \left.\!\!\!\!H\left({S_{1}}|{{\bar{S}}_{2}}\right)\right\} \end{array} $$
(66)
$$ \begin{array}{l} {E_{S2}} < \min \left\{ I\left({S_{2}};{{\bar{S}}_{2}}|K\right) - I\left({S_{2}};{{\bar{S}}_{1}}|K\right)\right.\\ +\left. I\left({U_{1}};{Y_{2}}|{U_{0}}\right) - I\left({U_{1}};{Y_{1}}|{U_{0}}\right),\;H\left({S_{2}}|{{\bar{S}}_{1}}\right)\right\} \end{array} $$
(67)

for some distribution \(p\left ({s_{1}}{s_{2}}{\bar {S}_{1}}{\bar {S}_{2}}\right)p({u_{0}}{u_{1}}{u_{2}}x)p({y_{1}}{y_{2}}|x)\).

Remarks 6

The proof of Theorem 6 is given in Appendix A; it is based on stand-alone source and channel codes, applying Slepian-Wolf source coding followed by Marton's BC coding. Theorem 6 thus shows that source-channel separation in the informational sense is optimal for this class.

Conclusion

In this paper, we studied the problem of sending a pair of correlated sources over a broadcast channel with correlated side information at the receivers, where each source should be kept secret from the unintended receiver. Due to the lack of a general source-channel separation theorem for broadcast channels, optimal performance sometimes requires joint source-channel coding, as in Theorem 2. We also established a general outer bound and analyzed three classes of sources and channels for which this outer bound is tight; joint source-channel coding, operational separation, and informational separation, respectively, are proved to be optimal for these classes.

Appendix A

Proof of Theorem 6

We outline the proofs of the reliable-transmission bounds for (S1,S2) and of the bounds on the equivocation-rate pair (ES1,ES2). We start with the direct part of Case (i). Let (R1,R2) satisfy the bounds

$$ H\left({S_{1}}|{\bar{S}_{1}}\right) < {R_{1}} < I\left({U_{1}};{Y_{1}}\right) $$
(68)
$$ H\left({S_{2}}|{\bar{S}_{2}}\right) < {R_{2}} < I\left({U_{2}};{Y_{2}}\right) $$
(69)
$$ \begin{array}{l} H\left({S_{1}}|{{\bar{S}}_{1}}\right) + H\left({S_{2}}|{{\bar{S}}_{2}}\right) < {R_{1}} + {R_{2}} < \\ I\left({U_{1}};{Y_{1}}\right) + I\left({U_{2}};{Y_{2}}\right) - I\left({U_{1}};{U_{2}}\right) \end{array} $$
(70)

for some distribution p(x|u1,u2). The right-hand sides of (68)-(70) can be seen as Marton's BC bound, and the left-hand sides of (68)-(70) can be seen as the Slepian-Wolf bound for distributed source coding.
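As a numerical aid, feasibility of the sandwich bounds (68)-(70) can be checked directly once the entropy and mutual-information terms have been evaluated. A minimal sketch with purely illustrative numbers (not derived from any particular source or channel):

```python
def rates_exist(h1, h2, i1, i2, i12, eps=1e-12):
    """Return True iff some (R1, R2) satisfies the sandwich bounds
    h1 < R1 < i1, h2 < R2 < i2, h1 + h2 < R1 + R2 < i1 + i2 - i12.
    Picking R1, R2 slightly above h1, h2 works whenever these hold."""
    return (h1 < i1 - eps and
            h2 < i2 - eps and
            h1 + h2 < i1 + i2 - i12 - eps)

# illustrative values in bits: H(S1|S1_bar), H(S2|S2_bar),
# I(U1;Y1), I(U2;Y2), I(U1;U2)
feasible = rates_exist(h1=0.4, h2=0.5, i1=0.8, i2=0.9, i12=0.3)
infeasible = rates_exist(h1=0.4, h2=0.5, i1=0.8, i2=0.9, i12=0.9)
```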

Code generation: Consider a distribution p(u1,u2) and a function x(u1,u2). Let \({\bar R_{1}} \ge {R_{1}}\), \({\bar R_{2}} \ge {R_{2}}\). Randomly and independently assign an index \({m_{1}}(s_{1}^{m})\) to each sequence \(s_{1}^{m} \in {{\mathcal {S}}}_{1}^{m}\) according to a uniform pmf over \(\left [1:{2^{n{R_{1}}}}\right ]\). The sequences with the same index m1 form a bin \({\mathcal {B}}_{1}(m_{1})\). Similarly, assign an index \({m_{2}}\left (s_{2}^{m}\right) \in \left [1:{2^{n{R_{2}}}}\right ]\) to each sequence \(s_{2}^{m} \in {{\mathcal {S}}}_{2}^{m}\); the sequences with the same index m2 form a bin \({\mathcal {B}}_{2}(m_{2})\).

For each m1, generate a subcodebook \({\mathcal {C}}_{1}(m_{1})\) consisting of \({2^{n({{\bar R}_{1}} - {R_{1}})}}\) independent sequences \(u_{1}^{n}({l_{1}})\), \({l_{1}} \in \left [({m_{1}} - 1){2^{n({{\bar R}_{1}} - {R_{1}})}} + 1 :{m_{1}}{2^{n({{\bar R}_{1}} - {R_{1}})}}\right ]\). Similarly, for each m2, generate a subcodebook \({\mathcal {C}}_{2}(m_{2})\) consisting of \({2^{n({{\bar R}_{2}} - {R_{2}})}}\) independent sequences \(u_{2}^{n}({l_{2}})\), \({l_{2}} \in \left [({m_{2}} - 1){2^{n({{\bar R}_{2}} - {R_{2}})}} + 1 :{m_{2}}{2^{n({{\bar R}_{2}} - {R_{2}})}}\right ]\). For each pair (m1,m2), find an index pair (l1,l2) such that

$${\begin{aligned} u_{1}^{n}\left({l_{1}}\right) \in {\mathcal{C}}_{1}\left(m_{1}\right), u_{2}^{n}\left({l_{2}}\right) \in {\mathcal{C}}_{2}\left(m_{2}\right), \left(u_{1}^{n}\left({l_{1}}\right),u_{2}^{n}\left({l_{2}}\right)\right) \in T_{\varepsilon}^{\left(n\right)}\left({U_{1}}{U_{2}}\right) \end{aligned}} $$

and generate a channel codeword \({x^{n}}\left (u_{1}^{n}({l_{1}}),u_{2}^{n}({l_{2}})\right)\).

Encoding: Use the above separate source and channel codes for encoding. The source encoder finds the bin indices m1 and m2 of \(s_{1}^{m}\) and \(s_{2}^{m}\), respectively, using the Slepian-Wolf source code, and forwards them to the channel encoder. The channel encoder transmits the codeword xn(m1,m2) corresponding to the source bin indices using Marton's code.

Decoding: We use separate source and channel decoders. Upon receiving \(y_{1}^{n}\), Channel-Decoder 1 tries to find the unique index m1 such that the corresponding channel codeword satisfies \(\left (u_{1}^{n}({l_{1}}),y_{1}^{n}\right) \in T_{\varepsilon }^{(n)}\) and \(u_{1}^{n}({l_{1}}) \in {\mathcal{C}}_{1}({m_{1}})\). If such an m1 exists and is unique, set \({\hat {m}_{1}} = {m_{1}}\); otherwise, declare an error. Similarly, Channel-Decoder 2 tries to find the unique m2 such that \(\left (u_{2}^{n}({l_{2}}),y_{2}^{n}\right) \in T_{\varepsilon }^{(n)}\) and \(u_{2}^{n}({l_{2}}) \in {\mathcal{C}}_{2}({m_{2}})\), and then sets \({\hat {m}_{2}} = {m_{2}}\).

Then, \({\hat {m}_{1}}\) and \({\hat {m}_{2}}\) are provided to Source-Decoders 1 and 2, respectively. Upon observing \(\bar {S}_{1}^{m}\) and \(\bar {S}_{2}^{m}\), Source-Decoders 1 and 2 find the unique \(\left (s_{1}^{m},s_{2}^{m}\right)\) such that \(s_{1}^{m} \in {{\mathcal {B}}}_{1}\left ({\hat {m}_{1}}\right)\), \(\left (s_{1}^{m},\bar {S}_{1}^{m}\right) \in T_{\varepsilon }^{\left (m\right)}\) and \(s_{2}^{m} \in {{\mathcal {B}}}_{2}\left ({\hat {m}_{2}}\right)\), \(\left (s_{2}^{m},\bar {S}_{2}^{m}\right) \in T_{\varepsilon }^{\left (m\right)}\), and thus set \(\left (\hat {s}_{1}^{m},\hat {s}_{2}^{m}\right) = \left (s_{1}^{m},s_{2}^{m}\right)\).
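The binning encoder and side-information decoder described above can be illustrated with a toy sketch. All parameters below (block length, bin count, and the Hamming-distance test standing in for joint typicality) are illustrative assumptions, not the asymptotic construction:

```python
import random
from collections import defaultdict

random.seed(0)
m = 8              # source block length (toy value)
num_bins = 64      # 2^{mR} bins for an illustrative bin rate R

# enumerate all binary length-m sequences and bin them uniformly at random
sequences = [tuple((x >> i) & 1 for i in range(m)) for x in range(2 ** m)]
bin_of = {s: random.randrange(num_bins) for s in sequences}
bins = defaultdict(list)
for s, b in bin_of.items():
    bins[b].append(s)

def encode(s1):
    """Slepian-Wolf encoder: send only the bin index of the observed block."""
    return bin_of[s1]

def decode(bin_index, side_info, max_flips=1):
    """Keep bin members within Hamming distance max_flips of the side
    information (a crude stand-in for joint typicality); declare success
    only if exactly one candidate survives."""
    candidates = [s for s in bins[bin_index]
                  if sum(a != b for a, b in zip(s, side_info)) <= max_flips]
    return candidates[0] if len(candidates) == 1 else None

s1 = (0, 1, 1, 0, 0, 1, 0, 1)
side = (0, 1, 1, 0, 0, 1, 0, 0)    # correlated SI: differs in one position
decoded = decode(encode(s1), side)  # either s1 or None (binning collision)
```

With more bins (a higher bin rate), collisions inside a bin become rarer, mirroring the requirement that the bin rate exceed the conditional entropy of the source given the side information.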

Error analysis: Assume \(\left (s_{1}^{m},s_{2}^{m}\right)\) is sent by the encoder, with corresponding bin indices (m1,m2), and let \(\left ({\hat {m}_{1}},{\hat {m}_{2}}\right)\) denote the decoded indices at the receivers. The average probability of decoding error can be bounded as

$$ {\begin{aligned} \begin{array}{l} P_{e}^{\left(m\right)} \overset{\Delta}{=} \sum\limits_{s_{1}^{m},s_{2}^{m}} {P\left\{ \left(s_{1}^{m},s_{2}^{m}\right) \ne \left(\hat{s}_{1}^{m},\hat{s}_{2}^{m}\right)|\left(S_{1}^{m},S_{2}^{m}\right) = \left(s_{1}^{m},s_{2}^{m}\right)\right\} p\left(s_{1}^{m},s_{2}^{m}\right)} \\ \le {\sum\nolimits}_{s_{1}^{m}} {P\left\{ s_{1}^{m} \ne \hat{s}_{1}^{m}|{m_{1}} = {{\hat{m}}_{1}},S_{1}^{m} = s_{1}^{m}\right\} p\left(s_{1}^{m}\right)} \\ + {\sum\nolimits}_{s_{2}^{m}} {P\left\{ s_{2}^{m} \ne \hat{s}_{2}^{m}|{m_{2}} = {{\hat{m}}_{2}},S_{2}^{m} = s_{2}^{m}\right\} p\left(s_{2}^{m}\right)} \\ + {\sum\nolimits}_{s_{1}^{m}} {P\left\{ {m_{1}} \ne {{\hat{m}}_{1}}|S_{1}^{m} = s_{1}^{m}\right\} p\left(s_{1}^{m}\right)} \\ + {\sum\nolimits}_{s_{2}^{m}} {P\left\{ {m_{2}} \ne {{\hat{m}}_{2}}|S_{2}^{m} = s_{2}^{m}\right\} p\left(s_{2}^{m}\right)} \end{array} \end{aligned}} $$
(71)

The first two terms in (71) vanish for large m when applying the Slepian-Wolf source code, and the last two terms in (71) vanish for large n when applying Marton's code for a semi-deterministic BC. Hence, \(P_{e}^{(m)} \to 0\).

We next prove the converse for Case (i). Consider (4)-(6); let \(\left (\tilde K,{\tilde {\bar {S}}_{1}},{\tilde {\bar {S}}_{2}},{U_{1}},{U_{2}}\right) = {U_{0}}\), \({\tilde S_{1}} = {Y_{1}}\), \({\tilde S_{2}} = {U_{2}}\), and U=U0U2, to obtain

$$ H\left({S_{1}}|{\bar{S}_{1}}\right) \le I({Y_{1}}{U_{0}};{Y_{1}}) \le H({Y_{1}}) $$
(72)
$$ H\left({S_{2}}|{\bar{S}_{2}}\right) \le I({U_{0}}{U_{2}};{Y_{2}}) = I(U;{Y_{2}}) $$
(73)
$$ \begin{array}{l} I\left({{\tilde{S}}_{1}};{Y_{1}}|{{\tilde{S}}_{2}}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right) + I\left({{\tilde{S}}_{2}}{{\tilde {\bar{S}}}_{1}}{U_{1}};{Y_{2}}|{{\tilde {\bar{S}}}_{2}}{U_{2}}\right)\\ \le I\left({Y_{1}};{Y_{1}}|{U_{0}}{U_{2}}\right) + I\left({U_{0}}{U_{2}};{Y_{2}}\right) = H\left({Y_{1}}|U\right) + I\left(U;{Y_{2}}\right) \end{array} $$
(74)

Considering the Markov chains (61) and (62), we have

$$ H\left({S_{1}}{S_{2}}|{\bar{S}_{1}}{\bar{S}_{2}}\right) = H\left({S_{1}}|{\bar{S}_{1}}\right) + H\left({S_{2}}|{\bar{S}_{2}}\right) $$
(75)

Then we get (63)-(65).
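Equality (75) can also be checked numerically. A minimal sketch, assuming one concrete distribution satisfying the Markov chains (61)-(62), namely independent pairs (S1, S̄1) and (S2, S̄2) with illustrative pmfs:

```python
import itertools
import math

# p(s1, s1_bar) and p(s2, s2_bar); the two pairs are taken independent,
# which makes every Markov chain in (61)-(62) hold
p1 = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
p2 = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

# joint p(s1, s2, s1_bar, s2_bar) under independence of the two pairs
joint = {(a, c, b, d): p1[(a, b)] * p2[(c, d)]
         for (a, b), (c, d) in itertools.product(p1, p2)}

def H(pmf):
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(pmf, idxs):
    out = {}
    for x, p in pmf.items():
        key = tuple(x[i] for i in idxs)
        out[key] = out.get(key, 0.0) + p
    return out

# H(S1 S2 | S1_bar S2_bar) = H(S1 S2 S1_bar S2_bar) - H(S1_bar S2_bar)
lhs = H(joint) - H(marginal(joint, (2, 3)))
# H(S1 | S1_bar) + H(S2 | S2_bar)
rhs = (H(p1) - H(marginal(p1, (1,)))) + (H(p2) - H(marginal(p2, (1,))))
```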

We next consider Case (ii). Independence of source and channel implies that the DM-BCCS-SI can be viewed as a parallel broadcast channel: in addition to the real BC p(y1,y2|x), there is a virtual BC with input (S1,S2) and two outputs \({\bar {S}_{1}}\) and \({\bar {S}_{2}}\). For the real BC, the inner bound on (ES1,ES2) follows from [10, Theorem 4], so we only give the converse proof here. Let \(\left (\tilde K,{\tilde {\bar {S}}_{1}},{\tilde {\bar {S}_{2}}},{U_{1}},{U_{2}}\right) = {U_{0}}\), \({\tilde S_{1}} = {Y_{1}}\), \({\tilde S_{2}} = {U_{2}}\); then inequalities (66)-(67) follow easily from the first term in (10) and the second term in (11), respectively, and the fact (61), that is

$$ {E_{S1}} \le H\left({S_{1}}|{S_{2}}{\bar{S}_{2}}\right) = H\left({S_{1}}|{\bar{S}_{2}}\right) $$
(76)
$$ {E_{S2}} \le H\left({S_{2}}|{S_{1}}{\bar{S}_{1}}\right) = H\left({S_{2}}|{\bar{S}_{1}}\right) $$
(77)

Appendix B

Proof of inequalities (25) and (22)

The proof of inequality (25) uses a procedure similar to that in [23, (60)-(65)]; we give the proof in detail:

$$ {\begin{aligned} \begin{array}{l} I\left(S_{1}^{m};Y_{1}^{n}|S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right) + I\left(S_{2}^{m};Y_{2}^{n}|\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right)\\ \mathop = \limits^{\left(a\right)} \sum\limits_{i = 1}^{n} {\left[ \begin{array}{l} I\left(S_{1}^{m};Y_{1i}|S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}\right) + I\left(S_{2}^{m}Y_{1}^{i - 1};Y_{2,i}^{n}|\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right)\\ - I\left(S_{2}^{m}Y_{1}^{i};Y_{2,i + 1}^{n}|\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right) \end{array} \right]} \\ \end{array} \end{aligned}} $$
$$ {\begin{aligned} \begin{array}{l} = \sum\limits_{i = 1}^{n} {\left[ \begin{array}{l} I\left(S_{1}^{m};Y_{1i}|S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}\right) + I\left(S_{2}^{m}Y_{1}^{i - 1};Y_{2,i}Y_{2,i + 1}^{n}|\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right)\\ - I\left(S_{2}^{m}Y_{1}^{i - 1}Y_{1i};Y_{2,i + 1}^{n}|\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right) \end{array} \right]} \\ = \sum\limits_{i = 1}^{n} {\left[ \begin{array}{l} I\left(S_{1}^{m};Y_{1i}|S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}\right) + I\left(S_{2}^{m}Y_{1}^{i - 1};Y_{2,i}|\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{2,i + 1}^{n}\right)\\ + I\left(S_{2}^{m}Y_{1}^{i - 1};Y_{2,i + 1}^{n}|\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right) - I\left(S_{2}^{m}Y_{1}^{i - 1};Y_{2,i + 1}^{n}|\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right)\\ - I\left(Y_{1i};Y_{2,i + 1}^{n}|S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}\right) \end{array} \right]} \\ = \sum\limits_{i = 1}^{n} {\left[ \begin{array}{l} I\left(S_{1}^{m};Y_{1i}|S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}\right) + I\left(S_{2}^{m}Y_{1}^{i - 1};Y_{2,i}|\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{2,i + 1}^{n}\right)\\ - I\left(Y_{1i};Y_{2,i + 1}^{n}|S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}\right) \end{array} \right]} \\ = \sum\limits_{i = 1}^{n} {\left[ \begin{array}{l} - H\left(Y_{1i}|S_{1}^{m}S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}\right) + I\left(S_{2}^{m}Y_{1}^{i - 1};Y_{2,i}|\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{2,i + 1}^{n}\right)\\ + H\left(Y_{1i}|S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right) \end{array} \right]} \\ \le \sum\limits_{i = 1}^{n} {\left[\! 
\begin{array}{l} - H\left(Y_{1i}|S_{1}^{m}S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\!\right) \,+\, I\!\left(S_{2}^{m}\bar{S}_{1}^{m}Y_{1}^{i - 1};Y_{2,i}|\bar{S}_{2}^{m}Y_{2,i + 1}^{n}\right)\\ + H\left(Y_{1i}|S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right) \end{array} \!\right]}\\ = \sum\limits_{i = 1}^{n} {\left[ {I\left(S_{1}^{m};Y_{1i}|S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right) + I\left(S_{2}^{m}\bar{S}_{1}^{m}Y_{1}^{i - 1};Y_{2,i}|\bar{S}_{2}^{m}Y_{2,i + 1}^{n}\right)} \right]} \end{array} \end{aligned}} $$
(78)

where step (a) follows from Lemma 1.

Lemma 1

[23] For any random variables W,Yn,Zn, we have

$$ I\left(W;{Z^{n}}\right) \,=\, {\sum\nolimits}_{i = 1}^{n} {\left[I\left(W{Y^{i - 1}};Z_{i}^{n}\right) - I\left(W{Y^{i}};Z_{i + 1}^{n}\right) \right]} $$
(79)

Therefore, we get (25).
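The cancellation of the pure-channel-output terms in derivations such as (78) rests on the Csiszár sum identity, which is closely related to Lemma 1: \(\sum _{i} I(Z_{i+1}^{n};Y_{i}|Y^{i-1}) = \sum _{i} I(Y^{i-1};Z_{i}|Z_{i+1}^{n})\). A numerical sanity check for n = 3 (where the identity reduces to a four-variable statement) on a randomly drawn joint pmf:

```python
import itertools
import math
import random

random.seed(2)
# random joint pmf over (y1, y2, z2, z3); z1 and y3 drop out for n = 3
atoms = list(itertools.product([0, 1], repeat=4))
w = [random.random() for _ in atoms]
joint = {a: x / sum(w) for a, x in zip(atoms, w)}

def H(idxs):
    marg = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idxs)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marg.values() if p > 0)

def I(a, b, c=()):
    """Conditional mutual information I(A;B|C) over index groups."""
    a, b, c = tuple(a), tuple(b), tuple(c)
    return H(a + c) + H(b + c) - H(a + b + c) - H(c)

Y1, Y2, Z2, Z3 = 0, 1, 2, 3
# sum_i I(Z_{i+1}^3; Y_i | Y^{i-1}): the i = 3 term is zero
lhs = I((Z2, Z3), (Y1,)) + I((Z3,), (Y2,), (Y1,))
# sum_i I(Y^{i-1}; Z_i | Z_{i+1}^3): the i = 1 term is zero
rhs = I((Y1,), (Z2,), (Z3,)) + I((Y1, Y2), (Z3,))
```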

Considering (78) and (20), we have

$$\begin{array}{l} m\left[ \begin{array}{l} H\left({S_{1}}|{{\bar{S}}_{1}}\right) + H\left({S_{2}}|{{\bar{S}}_{2}}\right) - I\left(S_{1};S_{2}|\bar{S}_{1}\bar{S}_{2}\right)\\ - I\left(S_{1};\bar{S}_{2}|\bar{S}_{1}\right) - I\left(S_{2};\bar{S}_{1}|\bar{S}_{2}\right) - {\delta_{1}} - {\delta_{2}} \end{array} \right]\\ \le n\left[I\left(\tilde S_{1};Y_{1}|\tilde S_{2}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right) + I\left(\tilde S_{2}{{\tilde {\bar{S}}}_{1}}{U_{1}};Y_{2}|{{\tilde {\bar{S}}}_{2}}{U_{2}}\right)\right] \end{array} $$

and

$$ \begin{array}{l} H\left({S_{1}}|{{\bar{S}}_{1}}\right) + H\left({S_{2}}|{{\bar{S}}_{2}}\right) - I\left({S_{1}};{S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right) - I\left({S_{1}};{{\bar{S}}_{2}}|{{\bar{S}}_{1}}\right)\\ - I\left({S_{2}};{{\bar{S}}_{1}}|{{\bar{S}}_{2}}\right)\\ = H\left({S_{1}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right) + H\left({S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right) - I\left({S_{1}};{S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right)\\ =\! H\left({S_{1}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right) \,+\, H\left({S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right) \,-\, H\left({S_{1}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right) \,+\, H\left({S_{1}}|{S_{2}}{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right)\\ = H\left({S_{1}}{S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right) \end{array} $$

Therefore, we get (22). That is

$$\begin{array}{l} m\left[H\left({S_{1}}{S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right) - {\delta_{1}} - {\delta_{2}}\right]\\ \le n\left[I\left(\tilde S_{1};Y_{1}|\tilde S_{2}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right) + I\left(\tilde S_{2}{{\tilde {\bar{S}}}_{1}}{U_{1}};Y_{2}|{{\tilde {\bar{S}}}_{2}}{U_{2}}\right)\right] \end{array} $$

Proof of inequality (24)

The proof of inequality (24) uses a procedure similar to that in [23, (73)-(85)]; we give the proof in detail. The last two terms on the left-hand side of (23) can be bounded as:

$$ {\begin{aligned} \begin{array}{l} I\left(S_{2}^{m};Y_{2}^{n}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right) - I\left(S_{2}^{m};Y_{1}^{n}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}\right)\\ = \sum\limits_{i = 1}^{n} {\left[ {I\left(S_{2}^{m};Y_{2i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{2,i + 1}^{n}\right) - I\left(S_{2}^{m};Y_{1i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}\right)} \right]}\\ = \sum\limits_{i = 1}^{n} {\left[ \begin{array}{l} I\left(S_{2}^{m}Y_{1}^{i - 1};Y_{2i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{2,i + 1}^{n}\right) - I\left(Y_{1}^{i - 1};Y_{2i}|{K^{m}}S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{2,i + 1}^{n}\right)\\ - I\left(S_{2}^{m}Y_{2,i + 1}^{n};Y_{1i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}\right) + I\left(Y_{2,i + 1}^{n};Y_{1i}|{K^{m}}S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}\right) \end{array} \right]} \\ = \sum\limits_{i = 1}^{n} {\left[ {I\left(S_{2}^{m}Y_{1}^{i - 1};Y_{2i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{2,i + 1}^{n}\right) - I\left(S_{2}^{m}Y_{2,i + 1}^{n};Y_{1i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}\right)} \right]} \\ = \sum\limits_{i = 1}^{n} {\left[ \begin{array}{l} I\left(S_{2}^{m};Y_{2i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right) + I\left(Y_{1}^{i - 1};Y_{2i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{2,i + 1}^{n}\right)\\ - I\left(S_{2}^{m};Y_{1i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right) - I\left(Y_{2,i + 1}^{n};Y_{1i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}\right) \end{array} \right]} \\ = \sum\limits_{i = 1}^{n} {\left[ {I\left(S_{2}^{m};Y_{2i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right) - I\left(S_{2}^{m};Y_{1i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right)} \right]} \end{array} \end{aligned}} $$
(80)

The first term in the left-hand side of (23) can be bounded as

$$ \begin{array}{l} I\left({K^{m}}S_{1}^{m}S_{2}^{m};Y_{1}^{n}|\bar{S}_{1}^{m}\right) \le \sum\limits_{i = 1}^{n} {I\left({K^{m}}S_{1}^{m}S_{2}^{m};Y_{1i}|\bar{S}_{1}^{m}Y_{1}^{i - 1}\right)} \\ \le \sum\limits_{i = 1}^{n} {I\left({K^{m}}S_{1}^{m}S_{2}^{m}\bar{S}_{1}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n};Y_{1i}\right)} \end{array} $$
(81)

Adding (80) and (81), we have

$$ {\begin{aligned} \begin{array}{l} \sum\limits_{i = 1}^{n} {\left[ \begin{array}{l} I\left({K^{m}}S_{1}^{m}S_{2}^{m}\bar{S}_{1}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n};Y_{1i}\right)\\ + I\left(S_{2}^{m};Y_{2i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right) - I\left(S_{2}^{m};Y_{1i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right) \end{array} \right]} \\ \le \sum\limits_{i = 1}^{n} {\left[ \begin{array}{l} I\left({K^{m}}S_{1}^{m}S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n};Y_{1i}\right)\\ + I\left(S_{2}^{m};Y_{2i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right) - I\left(S_{2}^{m};Y_{1i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right) \end{array} \right]} \\ = \sum\limits_{i = 1}^{n} {\left[ \begin{array}{l} I\left({K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n};Y_{1i}\right) + I\left(S_{1}^{m}S_{2}^{m};Y_{1i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right)\\ + I\left(S_{2}^{m};Y_{2i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right) - I\left(S_{2}^{m};Y_{1i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right) \end{array} \right]}\\ = \sum\limits_{i = 1}^{n} {\left[ \begin{array}{l} I\left({K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n};Y_{1i}\right)\\ + I\left(S_{1}^{m};Y_{1i}|{K^{m}}S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right) + I\left(S_{2}^{m};Y_{1i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right)\\ + I\left(S_{2}^{m};Y_{2i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right) - I\left(S_{2}^{m};Y_{1i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right) \end{array} \!\right]} \\ = \sum\limits_{i = 1}^{n} {\left[ \begin{array}{l} I\left({K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n};Y_{1i}\right)\\ + 
I\left(S_{1}^{m};Y_{1i}|{K^{m}}S_{2}^{m}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\right) + I\left(S_{2}^{m};Y_{2i}|{K^{m}}\bar{S}_{1}^{m}\bar{S}_{2}^{m}Y_{1}^{i - 1}Y_{2,i + 1}^{n}\!\right) \end{array} \right]} \end{array} \end{aligned}} $$
(82)

Therefore, we get (24).

Proof of Theorem 3

Inner bound (admissibility):

Consider the case where \({S_{1}} \to K \to {S_{2}}\) forms a Markov chain and the SI is deterministic, \({\bar {S}_{1}} = {F_{1}}({S_{1}})\) and \({\bar {S}_{2}} = {F_{2}}({S_{2}})\). (S1,S2) can be partitioned into five parts W0,W1,W2,W3,W4, where (W3,W4), with entropies \(\left (nH\left ({\bar {S}_{1}}\right),nH\left ({\bar {S}_{2}}\right)\right)\), can be recovered directly from \({\bar {S}_{1}}\) and \({\bar {S}_{2}}\) respectively, and W0,W1,W2 are three independent messages with entropies nR0,nR1,nR2, respectively, satisfying

$$ H\left(K|{\bar{S}_{1}}{\bar{S}_{2}}\right) \le {R_{0}},H\left({S_{1}}|{\bar{S}_{1}}\right) \le {R_{0}} + {R_{1}},H\left({S_{2}}|{\bar{S}_{2}}\right) \le {R_{0}} + {R_{2}} $$
(83)
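Since the SI is deterministic here, \({\bar {S}_{1}} = {F_{1}}({S_{1}})\) gives \(H({S_{1}}|{\bar {S}_{1}}) = H({S_{1}}) - H({\bar {S}_{1}})\), so the bin rate R0+R1 in (83) only needs to cover the part of S1 not already revealed by \({\bar {S}_{1}}\). A small numerical illustration, with an assumed pmf and function F1 (parity) chosen for the example:

```python
import math
from collections import defaultdict

# illustrative source pmf and deterministic side information
p_s1 = {0: 0.5, 1: 0.25, 2: 0.125, 3: 0.125}

def F1(s):
    return s % 2  # side information: the parity of S1

def H(pmf):
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

# pmf of the side information S1_bar = F1(S1)
p_si = defaultdict(float)
for s, p in p_s1.items():
    p_si[F1(s)] += p

# H(S1|S1_bar) = H(S1, S1_bar) - H(S1_bar) = H(S1) - H(S1_bar),
# because S1_bar is a deterministic function of S1
h_cond = H(p_s1) - H(p_si)
```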

On the other hand, W0,W1,W2 are encoded by Xu et al.'s secure coding scheme before being transmitted over the BC, which yields Xu et al.'s rate-equivocation region [29, Theorem 1]. Comparing the source coding rate region (83) with Xu et al.'s rate-equivocation region, we obtain the inner bound

$$ H\left(K|{\bar{S}_{1}}{\bar{S}_{2}}\right) < \min\{ I\left({U_{0}};{Y_{1}}\right),I\left({U_{0}};{Y_{2}}\right)\} $$
(84)
$$ H\left({S_{1}}|{\bar{S}_{1}}\right) < I\left({U_{0}}{U_{1}};{Y_{1}}\right) $$
(85)
$$ H\left({S_{2}}|{\bar{S}_{2}}\right) < I\left({U_{0}}{U_{2}};{Y_{2}}\right) $$
(86)
$$ \begin{array}{l} H\left({S_{1}}|{{\bar{S}}_{1}}\right) + H\left({S_{2}}|{{\bar{S}}_{2}}\right) - I\left({S_{1}};{S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right)\\ < I\left({U_{0}}{U_{1}};{Y_{1}}\right) + I\left({U_{2}};{Y_{2}}|{U_{0}}\right) - I\left({U_{1}};{U_{2}}|{U_{0}}\right) \end{array} $$
(87)
$$ \begin{array}{l} H\left({S_{1}}|{{\bar{S}}_{1}}\right) + H\left({S_{2}}|{{\bar{S}}_{2}}\right) - I\left({S_{1}};{S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right)\\ < I\left({U_{1}};{Y_{1}}|{U_{0}}\right) + I\left({U_{0}}{U_{2}};{Y_{2}}\right) - I\left({U_{1}};{U_{2}}|{U_{0}}\right) \end{array} $$
(88)
$$ \begin{array}{l} {E_{S1}} \!<\! \min \{ H\left({{\bar{S}}_{1}}|{S_{2}}\right) \,+\, I\left({U_{1}};{Y_{1}}|{U_{0}}\right) \,-\, I\left({U_{1}};{U_{2}}{Y_{2}}|{U_{0}}\right)\!,\;\\ H\left({S_{1}}|{S_{2}}\right)\} \end{array} $$
(89)
$$ \begin{array}{l} {E_{S2}} < \min \left\{ H\left({{\bar{S}}_{2}}|{S_{1}}\right) + I\left({U_{2}};{Y_{2}}|{U_{0}}\right) - I\left({U_{2}};{U_{1}}{Y_{1}}|{U_{0}}\right)\right.,\;\\ \left.H\left({S_{2}}|{S_{1}}\right){\vphantom{{\bar{S}}_{2}}}\right\} \end{array} $$
(90)

Next, consider the semi-deterministic BC and (84)-(90); let U1=Y1 to obtain (44)-(50). Furthermore, for any distribution

$$p\left({s_{1}}{s_{2}}{\bar{S}_{1}}{\bar{S}_{2}}\right)p({u_{0}}{u_{2}}|{s_{1}}{s_{2}})p(x|{u_{0}}{u_{2}})p({y_{1}}{y_{2}}|x) $$

we can achieve the right-hand sides of (84)-(90) due to the conditions

$${S_{1}} \to K \to {S_{2}}, {\bar{S}_{1}} = {F_{1}}({S_{1}}), {\bar{S}_{2}} = {F_{2}}({S_{2}}) $$

Outer bound: Consider (2)-(11) and choose \(\tilde K{\tilde {\bar {S}}_{1}}{\tilde {\bar {S}}_{2}}{U_{1}}{U_{2}} = {U_{0}}\), \({\tilde S_{1}} = {U_{1}}\), \({\tilde S_{2}} = {U_{2}}\). For (44), consider (2) and (3), and the facts

$$ H\left(K|{\bar{S}_{1}}{\bar{S}_{2}}\right) \le H\left(K|{\bar{S}_{1}}\right) \le I\left(\tilde K;{Y_{1}}|{\tilde {\bar{S}}_{1}}{U_{1}}\right) \le I\left({U_{0}};{Y_{1}}\right) $$
(91)
$$ H\left(K|{\bar{S}_{1}}{\bar{S}_{2}}\right) \le H\left(K|{\bar{S}_{2}}\right) \le I\left(\tilde K;{Y_{2}}|{\tilde {\bar{S}}_{2}}{U_{2}}\right) \le I\left({U_{0}};{Y_{2}}\right) $$
(92)

For (45) and (46), consider (4) and (5), and the facts

$$ H\left({S_{1}}|{\bar{S}_{1}}\right) \le I\left({\tilde S_{1}};{Y_{1}}|{\tilde {\bar{S}}_{1}}{U_{1}}\right) \le H\left({Y_{1}}\right) $$
(93)
$$ H\left({S_{2}}|{\bar{S}_{2}}\right) \le I\left({\tilde S_{2}};{Y_{2}}|{\tilde {\bar{S}}_{2}}{U_{2}}\right) \le I\left({U_{0}}{U_{2}};{Y_{2}}\right) $$
(94)

For (47), consider (8) and the Markov chains \({S_{1}} \to K \to {S_{2}}\), \({S_{1}} \to {\bar {S}_{1}} \to {\bar {S}_{2}}\), \({S_{2}} \to {\bar {S}_{2}} \to {\bar {S}_{1}}\), and the facts

$$ \left\{ \begin{array}{l} H\left(K\right) = I\left({S_{1}};{S_{2}}\right),\;\\ I\left({S_{1}};{S_{2}}|{{\bar{S}}_{1}}K\right) = 0,\\ I\left({S_{2}};{{\bar{S}}_{1}}|{{\bar{S}}_{2}}K\right) = 0 \end{array} \right\} $$
(95)

and

$$ \begin{array}{l} H\left(K|{{\bar{S}}_{2}}\right) = I\left({S_{1}};{S_{2}}|{{\bar{S}}_{2}}\right)\\ = I\left({S_{1}}{{\bar{S}}_{1}};{S_{2}}|{{\bar{S}}_{2}}\right) - I\left({S_{2}};{{\bar{S}}_{1}}|{{\bar{S}}_{2}}{S_{1}}\right)\\ = I\left({S_{1}}{{\bar{S}}_{1}};{S_{2}}|{{\bar{S}}_{2}}\right)\\ = I\left({{\bar{S}}_{1}};{S_{2}}|{{\bar{S}}_{2}}\right) + I\left({S_{1}};{S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right)\\ = I\left({S_{1}};{S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right) \end{array} $$
(96)

Considering (8), we have

$$ \begin{array}{l} H\left({S_{1}}|{{\bar{S}}_{1}}\right) + H\left({S_{2}}|{{\bar{S}}_{2}}\right) - I\left({S_{1}};{S_{2}}|{{\bar{S}}_{1}}K\right) - I\left({S_{2}};{{\bar{S}}_{1}}K|{{\bar{S}}_{2}}\right)\\ = H\left({S_{1}}|{{\bar{S}}_{1}}\right) + H\left({S_{2}}|{{\bar{S}}_{2}}\right) - I\left({S_{2}};K|{{\bar{S}}_{2}}\right) - I\left({S_{2}};{{\bar{S}}_{1}}|{{\bar{S}}_{2}}K\right)\\ = H\left({S_{1}}|{{\bar{S}}_{1}}\right) + H\left({S_{2}}|{{\bar{S}}_{2}}\right) - H\left(K|{{\bar{S}}_{2}}\right)\\ = H\left({S_{1}}|{{\bar{S}}_{1}}\right) + H\left({S_{2}}|{{\bar{S}}_{2}}\right) - I\left({S_{1}};{S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right)\\ \le I\left(\tilde K{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}};{Y_{1}}\right) + I\left({{\tilde{S}}_{1}};{Y_{1}}|{{\tilde{S}}_{2}}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right)\\ + I\left({{\tilde{S}}_{2}};{Y_{2}}|\tilde K{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right)\\ \le I\left({U_{0}};{Y_{1}}\right) + H\left({Y_{1}}|{U_{0}}{U_{2}}\right) + I\left({U_{2}};{Y_{2}}|{U_{0}}\right) \end{array} $$
(97)

For (48), consider (6) and \({S_{1}} \to {\bar {S}_{1}} \to {\bar {S}_{2}}\), \({S_{2}} \to {\bar {S}_{2}} \to {\bar {S}_{1}}\), and the facts

$$ I\left({S_{1}};{\bar{S}_{2}}|{\bar{S}_{1}}\right) = 0,\quad I\left({S_{2}};{\bar{S}_{1}}|{\bar{S}_{2}}\right) = 0 $$
(98)

and

$$ \begin{array}{l} H\left({S_{1}}{S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right)\\ = H\left({S_{1}}|{{\bar{S}}_{1}}\right) + H\left({S_{2}}|{{\bar{S}}_{2}}\right) - I\left({S_{1}};{S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right)\\ - I\left({S_{1}};{{\bar{S}}_{2}}|{{\bar{S}}_{1}}\right) - I\left({S_{2}};{{\bar{S}}_{1}}|{{\bar{S}}_{2}}\right)\\ = H\left({S_{1}}|{{\bar{S}}_{1}}\right) + H\left({S_{2}}|{{\bar{S}}_{2}}\right) - I\left({S_{1}};{S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right) \end{array} $$
(99)

Therefore, we have

$$ \begin{array}{l} H\left({S_{1}}|{{\bar{S}}_{1}}\right) + H\left({S_{2}}|{{\bar{S}}_{2}}\right) - I\left({S_{1}};{S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right)\\ \le I\left({{\tilde{S}}_{1}};{Y_{1}}|{{\tilde{S}}_{2}}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right) + I\left({{\tilde{S}}_{2}}{{\tilde {\bar{S}}}_{1}}{U_{1}};{Y_{2}}|{{\tilde {\bar{S}}}_{2}}{U_{2}}\right)\\ \le H\left({Y_{1}}|{U_{0}}{U_{2}}\right) + I\left({U_{0}}{U_{2}};{Y_{2}}\right) \end{array} $$
(100)

For the first term in (49), consider the first term in (10), and

$$I\left({S_{1}};{\bar{S}_{2}}|{S_{2}}\right) = 0, I\left({S_{1}};{\bar{S}_{1}}|{S_{2}}\right) = H\left({\bar{S}_{1}}|K\right). $$

So we have

$$ \begin{array}{l} {E_{S1}} \le I\left({S_{1}};{{\bar{S}}_{1}}|{S_{2}}\right) - I\left({S_{1}};{{\bar{S}}_{2}}|{S_{2}}\right) + I\left(S_{1};\bar{S}_{2}|S_{2}\bar{S}_{1}\right)\\ + I\left({{\tilde{S}}_{1}};{Y_{1}}|{{\tilde{S}}_{2}}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right) - I\left({{\tilde{S}}_{1}};{Y_{2}}|{{\tilde{S}}_{2}}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right)\\ = H\left({{\bar{S}}_{1}}|{S_{2}}\right) + I\left({U_{1}};{Y_{1}}|{U_{0}}{U_{2}}\right) - I\left({U_{1}};{Y_{2}}|{U_{0}}{U_{2}}\right)\\ \le H\left({{\bar{S}}_{1}}|{S_{2}}\right) + I\left({U_{1}};{Y_{1}}{Y_{2}}|{U_{0}}{U_{2}}\right) - I\left({U_{1}};{Y_{2}}|{U_{0}}{U_{2}}\right)\\ = H\left({{\bar{S}}_{1}}|{S_{2}}\right) + I\left({U_{1}};{Y_{1}}|{Y_{2}}{U_{0}}{U_{2}}\right)\\ \le H\left({\bar{S}_{1}}|{S_{2}}\right) + H\left({Y_{1}}|{Y_{2}}{U_{0}}{U_{2}}\right) \end{array} $$
(101)

For the first term in (50), consider the second term in (11), and we have

$$ \begin{array}{l} {E_{S2}} \le I\left({S_{2}};{{\bar{S}}_{2}}|K\right) - I\left({S_{2}};{{\bar{S}}_{1}}|K\right) + I\left({S_{2}};{{\bar{S}}_{1}}|K{{\bar{S}}_{2}}\right)\\ + I\left({{\tilde{S}}_{2}};{Y_{2}}|\tilde K{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right) - I\left({{\tilde{S}}_{2}};{Y_{1}}|\tilde K{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right)\\ = H\left({\bar{S}_{2}}|K\right) + I\left({U_{2}};{Y_{2}}|{U_{0}}\right) - I\left({U_{2}};{Y_{1}}|{U_{0}}\right)\\ = H\left({\bar{S}_{2}}|{S_{1}}\right) + I\left({U_{2}};{Y_{2}}|{U_{0}}\right) - I\left({U_{2}};{Y_{1}}|{U_{0}}\right) \end{array} $$
(102)

The second terms in (49) and (50) follow from the facts

$$ {E_{S1}} \le H\left({S_{1}}|{S_{2}}{\bar{S}_{2}}\right) \le H\left({S_{1}}|{S_{2}}\right) $$
(103)
$$ {E_{S2}} \le H\left({S_{2}}|{S_{1}}{\bar{S}_{1}}\right) \le H\left({S_{2}}|{S_{1}}\right) $$
(104)
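The conditioning step behind (103)-(104) (conditioning cannot increase entropy) can be sanity-checked on an arbitrary joint pmf; the random distribution below is an illustrative assumption:

```python
import itertools
import math
import random

random.seed(3)
# random joint pmf over (s1, s2, s2_bar)
atoms = list(itertools.product([0, 1], repeat=3))
w = [random.random() for _ in atoms]
joint = {a: x / sum(w) for a, x in zip(atoms, w)}

def H(idxs):
    marg = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idxs)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marg.values() if p > 0)

S1, S2, S2BAR = 0, 1, 2
h_both = H((S1, S2, S2BAR)) - H((S2, S2BAR))  # H(S1 | S2, S2_bar)
h_s2 = H((S1, S2)) - H((S2,))                 # H(S1 | S2)
```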

Proof of Theorem 4

Inner bound:

Consider (30)-(38) in Theorem 2 with \({U_{0}} = ({S_{2}},U)\), \({U_{1}} = X\), and U2 a constant, where (S1,S2) is independent of (U,U1,U2) and U and X satisfy the Markov chain \(U \to X \to {Y_{1}}{Y_{2}}\). We obtain (52)-(55).

Specifically, consider (30) and the facts

$$\begin{array}{l} H\left({S_{1}}\right) < I\left({U_{0}}{U_{1}}{S_{1}};{Y_{1}}{{\bar{S}}_{1}}\right) - I\left({U_{0}}{U_{1}};{S_{2}}|{S_{1}}\right)\\ = I\left(UX{S_{1}}{S_{2}};{Y_{1}}{{\bar{S}}_{1}}\right) - I\left(U{S_{2}}X;{S_{2}}|{S_{1}}\right)\\ = I\left(X;{Y_{1}}\right) + I\left({S_{1}}{S_{2}};{{\bar{S}}_{1}}\right) - I\left({S_{2}};{S_{2}}|{S_{1}}\right) \end{array} $$

and

$$\begin{array}{l} H\left({S_{1}}\right) - I\left({S_{1}}{S_{2}};{{\bar{S}}_{1}}\right) + I\left({S_{2}};{S_{2}}|{S_{1}}\right)\\ = H\left({S_{1}}\right) - I\left({S_{1}}{S_{2}};{{\bar{S}}_{1}}\right) + H\left({S_{2}}|{S_{1}}\right)\\ = H\left({S_{1}}{S_{2}}\right) - I\left({S_{1}}{S_{2}};{{\bar{S}}_{1}}\right)\\ = H\left({S_{1}}{S_{2}}|{{\bar{S}}_{1}}\right) \end{array} $$

Therefore

$$ H\left({S_{1}}{S_{2}}|{\bar{S}_{1}}\right) < I\left(X;{Y_{1}}\right) $$
(105)

Consider (31) and the fact

$$\begin{array}{l} H\left({S_{2}}\right) < I\left({U_{0}}{U_{2}}{S_{2}};{Y_{2}}{{\bar{S}}_{2}}\right) - I\left({U_{0}}{U_{2}};{S_{1}}|{S_{2}}\right)\\ = I\left(U{S_{2}};{Y_{2}}{{\bar{S}}_{2}}\right) - I\left(U{S_{2}};{S_{1}}|{S_{2}}\right)\\ = I\left(U{S_{2}};{Y_{2}}{{\bar{S}}_{2}}\right)\\ = I\left({S_{2}};{{\bar{S}}_{2}}\right) + I\left(U;{Y_{2}}\right) \end{array} $$

Therefore

$$ H\left({S_{2}}|{\bar{S}_{2}}\right) < I\left(U;{Y_{2}}\right) $$
(106)

Consider (32) and the fact

$${\begin{aligned} \begin{array}{l} H\left({S_{1}}{S_{2}}\right)\\ < I\left({U_{0}}{U_{1}}{S_{1}};{Y_{1}}{{\bar{S}}_{1}}\right) + I\left({U_{2}}{S_{2}};{Y_{2}}{{\bar{S}}_{2}}|K{U_{0}}\right) - I\left({U_{1}}{S_{1}};{U_{2}}{S_{2}}|K{U_{0}}\right)\\ = I\left(UX{S_{1}}{S_{2}};{Y_{1}}{{\bar{S}}_{1}}\right) + I\left({S_{2}};{Y_{2}}{{\bar{S}}_{2}}|K{S_{2}}U\right) - I\left(X{S_{1}};{S_{2}}|K{S_{2}}U\right)\\ = I\left(X;{Y_{1}}\right) + I\left({S_{1}}{S_{2}};{{\bar{S}}_{1}}\right) + I\left({S_{2}};{{\bar{S}}_{2}}|{S_{2}}\right) - I\left({S_{1}};{S_{2}}|{S_{2}}\right)\\ = I\left(X;{Y_{1}}\right) + I\left({S_{1}}{S_{2}};{{\bar{S}}_{1}}\right) \end{array} \end{aligned}} $$

Therefore

$$ H\left({S_{1}}{S_{2}}|{\bar{S}_{1}}\right) < I\left(X;{Y_{1}}\right) $$
(107)

Consider (33) and the facts

$${\begin{aligned} \begin{array}{l} H\left({S_{1}}{S_{2}}\right)\\ < I\left({U_{1}}{S_{1}};{Y_{1}}{{\bar{S}}_{1}}|K{U_{0}}\right) + I\left({U_{0}}{U_{2}}{S_{2}};{Y_{2}}{{\bar{S}}_{2}}\right) - I\left({U_{1}}{S_{1}};{U_{2}}{S_{2}}|K{U_{0}}\right)\\ = I\left(X{S_{1}};{Y_{1}}{{\bar{S}}_{1}}|K{S_{2}}U\right) + I\left(U{S_{2}};{Y_{2}}{{\bar{S}}_{2}}\right) - I\left(X{S_{1}};{S_{2}}|K{S_{2}}U\right)\\ = I\left(X;{Y_{1}}|U\right) + I\left({S_{1}};{{\bar{S}}_{1}}|{S_{2}}\right) + I\left(U;{Y_{2}}\right) + I\left({S_{2}};{{\bar{S}}_{2}}\right) - I\left({S_{1}};{S_{2}}|{S_{2}}\right)\\ = I\left(X;{Y_{1}}|U\right) + I\left(U;{Y_{2}}\right) + I\left({S_{1}};{\bar{S}_{1}}|{S_{2}}\right) + I\left({S_{2}};{\bar{S}_{2}}\right) \end{array} \end{aligned}} $$

and

$$\begin{array}{l} H\left({S_{1}}{S_{2}}\right) - I\left({S_{1}};{{\bar{S}}_{1}}|{S_{2}}\right) - I\left({S_{2}};{{\bar{S}}_{2}}\right)\\ =\! H\left({S_{1}}{S_{2}}\right) \,-\, H\left({S_{1}}|{S_{2}}\right) \,+\, H\left({S_{1}}|{S_{2}}{{\bar{S}}_{1}}\right) \,-\, H\left({S_{2}}\right) \,+\, H\left({S_{2}}|{{\bar{S}}_{2}}\right)\\ = H\left({S_{1}}|{S_{2}}{\bar{S}_{1}}\right) + H\left({S_{2}}|{\bar{S}_{2}}\right) \end{array} $$

Therefore

$$ H\left({S_{1}}|{S_{2}}{\bar{S}_{1}}\right) + H\left({S_{2}}|{\bar{S}_{2}}\right) < I\left(X;{Y_{1}}|U\right) + I\left(U;{Y_{2}}\right) $$
(108)
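The entropy algebra preceding (108), namely H(S1S2) − I(S1;S̄1|S2) − I(S2;S̄2) = H(S1|S2S̄1) + H(S2|S̄2), is an identity that holds for an arbitrary joint distribution, with no Markov assumption. A hedged numerical sketch (names, alphabet sizes, and the seed are illustrative assumptions, not part of the proof):

```python
import numpy as np

# Arbitrary joint pmf over (S1, S2, Sbar1, Sbar2); no structure is imposed.
rng = np.random.default_rng(0)
p = rng.random((2, 3, 2, 3))
p /= p.sum()

def H(pmf, axes):
    """Entropy (bits) of the marginal of pmf on the given axes."""
    other = tuple(i for i in range(pmf.ndim) if i not in axes)
    m = (pmf.sum(axis=other) if other else pmf).ravel()
    m = m[m > 1e-15]
    return float(-(m * np.log2(m)).sum())

# Axes: 0 = S1, 1 = S2, 2 = Sbar1, 3 = Sbar2.
i_s1_sb1_given_s2 = (H(p, (0, 1)) - H(p, (1,))) - (H(p, (0, 1, 2)) - H(p, (1, 2)))
i_s2_sb2 = H(p, (1,)) - (H(p, (1, 3)) - H(p, (3,)))

# H(S1 S2) - I(S1;Sbar1|S2) - I(S2;Sbar2)  vs  H(S1|S2,Sbar1) + H(S2|Sbar2)
lhs = H(p, (0, 1)) - i_s1_sb1_given_s2 - i_s2_sb2
rhs = (H(p, (0, 1, 2)) - H(p, (1, 2))) + (H(p, (1, 3)) - H(p, (3,)))
assert abs(lhs - rhs) < 1e-9
```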

Consider (34) and the fact

$$ \begin{aligned} \begin{array}{l} H\left({S_{1}}{S_{2}}\right)\\ <\! I\!\left({U_{0}}{U_{1}}{S_{1}};{Y_{1}}{{\bar{S}}_{1}}\right) \,+\, I\!\left(\!{U_{0}}{U_{2}}{S_{2}};{Y_{2}}{{\bar{S}}_{2}}\!\right) \,-\, I\!\left({U_{1}}{S_{1}};{U_{2}}{S_{2}}|K{U_{0}}\!\right)\\ - I\left({S_{1}}{S_{2}};K{U_{0}}\right)\\ = I\left(UX{S_{1}}{S_{2}};{Y_{1}}{{\bar{S}}_{1}}\right) + I\left(U{S_{2}};{Y_{2}}{{\bar{S}}_{2}}\right) - I\left(X{S_{1}};{S_{2}}|K{S_{2}}U\right)\\ - I\left({S_{1}}{S_{2}};K{S_{2}}U\right)\\ = I\left(X;{Y_{1}}\right) + I\left({S_{1}}{S_{2}};{{\bar{S}}_{1}}\right) + I\left(U;{Y_{2}}\right) + I\left({S_{2}};{{\bar{S}}_{2}}\right)\\ - I\left({S_{1}};{S_{2}}|{S_{2}}\right) - I\left({S_{1}}{S_{2}};{S_{2}}\right)\\ = I\left(X;{Y_{1}}\right) + I\left(U;{Y_{2}}\right) + I\left({S_{1}}{S_{2}};{{\bar{S}}_{1}}\right) + I\left({S_{2}};{{\bar{S}}_{2}}\right) - H\left({S_{2}}\right)\\ = I\left(X;{Y_{1}}\right) + I\left(U;{Y_{2}}\right) + I\left({S_{1}}{S_{2}};{{\bar{S}}_{1}}\right) - H\left({S_{2}}|{{\bar{S}}_{2}}\right) \end{array} \end{aligned} $$

Therefore

$$ H\left({S_{1}}{S_{2}}|{\bar{S}_{1}}\right) + H\left({S_{2}}|{\bar{S}_{2}}\right) < I\left(X;{Y_{1}}\right) + I\left(U;{Y_{2}}\right) $$
(109)

From (102) and (103), the bound (106) is redundant.

Since \({\bar {S}_{2}} - {\bar {S}_{1}} - {S_{1}}{S_{2}}\), \(\bar {S}_{1} - \bar {S}_{2} - S_{2}\) and the facts

$$\begin{array}{l} H\left({S_{1}}{S_{2}}|{\bar{S}_{1}}{\bar{S}_{2}}\right) = H\left({S_{1}}{S_{2}}|{\bar{S}_{1}}\right)\\ H\left({S_{1}}{S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right) = H\left({S_{2}}|{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right) + H\left({S_{1}}|{S_{2}}{{\bar{S}}_{1}}{{\bar{S}}_{2}}\right)\\ = H\left({S_{2}}|{{\bar{S}}_{2}}\right) + H\left({S_{1}}|{S_{2}}{{\bar{S}}_{1}}\right) \end{array} $$

Therefore

$$H\left({S_{1}}{S_{2}}|{\bar{S}_{1}}\right) = H\left({S_{2}}|{\bar{S}_{2}}\right) + H\left({S_{1}}|{S_{2}}{\bar{S}_{1}}\right) $$

Thus, the bound (105) is equivalent to (51). Due to the less noisy condition I(U;Y1)≥I(U;Y2), the bound (102) is redundant. Hence, we obtain the bounds (49) and (51).

From (37), we have

$$ \begin{array}{l} {E_{S1}} < I\left({S_{1}}{U_{1}};{Y_{1}}{{\bar{S}}_{1}}|K{U_{0}}\right) - I\left({S_{1}}{U_{1}};{S_{2}}{{\bar{S}}_{2}}{U_{2}}{Y_{2}}|K{U_{0}}\right)\\ = I\left({S_{1}}X;{Y_{1}}{{\bar{S}}_{1}}|KU{S_{2}}\right) - I\left({S_{1}}X;{S_{2}}{{\bar{S}}_{2}}{Y_{2}}|KU{S_{2}}\right)\\ = I\left(X;{Y_{1}}|U\right) \!- I\left(X;{Y_{2}}|U\right) + I\left({S_{1}};{{\bar{S}}_{1}}|{S_{2}}\right) \!- I\left({S_{1}};{S_{2}}{{\bar{S}}_{2}}|{S_{2}}\right)\\ = I\left(X;{Y_{1}}|U\right) \!- I\left(X;{Y_{2}}|U\right) + I\left({S_{1}};{\bar{S}_{1}}|{S_{2}}\right) \!- I\left({S_{1}};{\bar{S}_{2}}|{S_{2}}\right) \end{array} $$
(110)

From (35) and the independence of the source and channel variables, we have

$$ {E_{S1}} < H\left({S_{1}}|{S_{2}}{\bar{S}_{2}}{U_{0}}{U_{2}}{Y_{2}}\right) = H\left({S_{1}}|{S_{2}}{\bar{S}_{2}}\right) $$
(111)

Combining (110) and (111), we get (52).

From (38), we have

$$ \begin{array}{l} {E_{S2}} < I\left({S_{2}}{U_{2}};{Y_{2}}{{\bar{S}}_{2}}|K{U_{0}}\right) - I\left({S_{2}}{U_{2}};{S_{1}}{{\bar{S}}_{1}}{U_{1}}{Y_{1}}|K{U_{0}}\right)\\ = I\left({S_{2}};{Y_{2}}{{\bar{S}}_{2}}|K{S_{2}}U\right) - I\left({S_{2}};{S_{1}}{{\bar{S}}_{1}}X{Y_{1}}|K{S_{2}}U\right)\\ = I\left({S_{2}};{{\bar{S}}_{2}}|{S_{2}}\right) - I\left({S_{2}};{S_{1}}{{\bar{S}}_{1}}|{S_{2}}\right)\\ = 0 \end{array} $$
(112)

Furthermore, the above holds for any distribution

$$p\left({s_{1}}{s_{2}}{\bar{s}_{1}}{\bar{s}_{2}}\right)p\left(u|{s_{1}}{s_{2}}\right)p\left(x|u\right)p\left({y_{1}}{y_{2}}|x\right). $$

Hence, the inner bound of Theorem 4 is achieved.

Outer bound:

According to Theorem 1, we choose

$$ {\tilde S_{2}}\tilde K{\tilde {\bar{S}}_{1}}{\tilde {\bar{S}}_{2}}{U_{1}}{U_{2}} = U, {\tilde S_{1}} = X $$
(113)

satisfying

$$U \to X \to {Y_{1}}{Y_{2}} $$

From (5), we have

$$ H\left({S_{2}}|{\bar{S}_{2}}\right) \le I\left({\tilde S_{2}};{Y_{2}}|{\tilde {\bar{S}}_{2}}{U_{2}}\right) \le I\left(U;{Y_{2}}\right) $$
(114)

Consider (6) and the fact

$$ \begin{array}{l} I\left({{\tilde{S}}_{1}};{Y_{1}}|{{\tilde{S}}_{2}}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right) + I\left({{\tilde{S}}_{2}}{{\tilde {\bar{S}}}_{1}}{U_{1}};{Y_{2}}|{{\tilde {\bar{S}}}_{2}}{U_{2}}\right)\\ \le I\left(X;{Y_{1}}|U\right) + I\left(U;{Y_{2}}\right) \end{array} $$
(115)

We have

$$ H\left({S_{1}}{S_{2}}|{\bar{S}_{1}}\right) < I\left(X;{Y_{1}}|U\right) + I\left(U;{Y_{2}}\right) $$
(116)

Therefore, we get (50) and (51).

Considering the first term in (10), we have

$$ \begin{array}{l} {E_{S1}} < I\left({S_{1}};{{\bar{S}}_{1}}|{S_{2}}\right) - I\left({S_{1}};{{\bar{S}}_{2}}|{S_{2}}\right) + I\left(S_{1};\bar{S}_{2}|S_{2}\bar{S}_{1}\right)\\ + I\left({{\tilde{S}}_{1}};{Y_{1}}|{{\tilde{S}}_{2}}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right) - I\left({{\tilde{S}}_{1}};{Y_{2}}|{{\tilde{S}}_{2}}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right)\\ < I\left({S_{1}};{{\bar{S}}_{1}}|{S_{2}}\right) - I\left({S_{1}};{{\bar{S}}_{2}}|{S_{2}}\right) + I\left(X;{Y_{1}}|U\right) - I\left(X;{Y_{2}}|U\right) \end{array} $$
(117)

Considering the second term in (11), the less noisy condition, and the facts

$$\begin{array}{l} {E_{S2}} \le I\left({S_{2}};{{\bar{S}}_{2}}|{S_{1}}\right) - I\left({S_{2}};{{\bar{S}}_{1}}|{S_{1}}\right) + I\left(S_{2};\bar{S}_{1}|S_{1}\bar{S}_{2}\right)\\ + I\left({{\tilde{S}}_{2}};{Y_{2}}|{{\tilde{S}}_{1}}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right) - I\left({{\tilde{S}}_{2}};{Y_{1}}|{{\tilde{S}}_{1}}{{\tilde {\bar{S}}}_{1}}{{\tilde {\bar{S}}}_{2}}{U_{1}}{U_{2}}\right) \end{array} $$
$$ I\left({\tilde S_{2}};{Y_{2}}|{\tilde S_{1}}{\tilde {\bar{S}}_{1}}{\tilde {\bar{S}}_{2}}{U_{1}}{U_{2}}\right) \le I\left({\tilde S_{2}};{Y_{1}}|{\tilde S_{1}}{\tilde {\bar{S}}_{1}}{\tilde {\bar{S}}_{2}}{U_{1}}{U_{2}}\right) $$
(118)
$$ I\left(S_{2};\bar{S}_{1}|S_{1}\bar{S}_{2}\right) = 0 $$
(119)
$$ I\left({S_{2}};{\bar{S}_{2}}|{S_{1}}\right) \le I\left({S_{2}};{\bar{S}_{1}}|{S_{1}}\right) $$
(120)

where (118) follows from the less noisy condition, and (119)-(120) follow from \(\bar {S}_{1} - \bar {S}_{2} - S_{2}\), \({\bar {S}_{2}} - {\bar {S}_{1}} - {S_{1}}{S_{2}}\), and the fact

$$H\left({S_{2}}|{S_{1}}{\bar{S}_{2}}\right) = H\left({S_{2}}|{S_{1}}{\bar{S}_{1}}\right). $$

On the other hand, ES2≥0; therefore, we have

$$ {E_{S2}} = 0 $$
(121)
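Inequality (120) is a conditional data-processing consequence of the Markov chain \({\bar {S}_{2}} - {\bar {S}_{1}} - {S_{1}}{S_{2}}\): conditioned on S1, the chain S2 → S̄1 → S̄2 holds. As a hedged numerical sketch (not part of the proof; the factorization used, alphabet sizes, seed, and names are illustrative assumptions), one can build a joint pmf with exactly this factorization and check (120):

```python
import numpy as np

rng = np.random.default_rng(1)

# Joint pmf p(s1, s2, sb1, sb2) = p(s1, s2) p(sb1|s1, s2) p(sb2|sb1),
# which enforces the Markov chain Sbar2 - Sbar1 - S1 S2.
p12 = rng.random((2, 3)); p12 /= p12.sum()
ch1 = rng.random((2, 3, 4)); ch1 /= ch1.sum(axis=2, keepdims=True)  # p(sb1|s1,s2)
ch2 = rng.random((4, 3)); ch2 /= ch2.sum(axis=1, keepdims=True)     # p(sb2|sb1)
p = p12[:, :, None, None] * ch1[:, :, :, None] * ch2[None, None, :, :]

def H(pmf, axes):
    """Entropy (bits) of the marginal of pmf on the given axes."""
    other = tuple(i for i in range(pmf.ndim) if i not in axes)
    m = (pmf.sum(axis=other) if other else pmf).ravel()
    m = m[m > 1e-15]
    return float(-(m * np.log2(m)).sum())

def I_cond(pmf, a, b, c):
    """Conditional mutual information I(A;B|C) = H(A|C) - H(A|B,C)."""
    return (H(pmf, a + c) - H(pmf, c)) - (H(pmf, a + b + c) - H(pmf, b + c))

# Axes: 0 = S1, 1 = S2, 2 = Sbar1, 3 = Sbar2.
i_s2_sb2_given_s1 = I_cond(p, (1,), (3,), (0,))  # I(S2; Sbar2 | S1)
i_s2_sb1_given_s1 = I_cond(p, (1,), (2,), (0,))  # I(S2; Sbar1 | S1)
assert i_s2_sb2_given_s1 <= i_s2_sb1_given_s1 + 1e-12
```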

Proof of Theorem 5

Assume the distribution \(p\left ({\bar {s}_{1}}{\bar {s}_{2}}|s\right)p\left ({y_{1}}{y_{2}}|x\right)p\left (x|u\right)\).

Case (i):

Inner bound:

In Theorem 2, we choose S1=S2=K=S, U0=U, U1=X, and U2 a constant, with U and X satisfying the Markov chain U → X → Y1Y2.

Consider (30) and the fact

$$ \begin{array}{l} H\left(S\right) < I\left(XS;{Y_{1}}{\bar{S}_{1}}\right)= I\left(X;{Y_{1}}\right) + I\left(S;{\bar{S}_{1}}\right)\\ H\left(S|{\bar{S}_{1}}\right) < I\left(X;{Y_{1}}\right) \end{array} $$
(122)

Consider (31) and the fact

$$H\left(S\right) < I\left(US;{Y_{2}}{\bar{S}_{2}}\right) $$

We have

$$ H\left(S|{\bar{S}_{2}}\right) < I\left(U;{Y_{2}}\right) $$
(123)

Consider (33) and the fact

$$\begin{array}{l} H\left(S\right) < I\left(XS;{Y_{1}}{\bar{S}_{1}}|SU\right) + I\left(US;{Y_{2}}{\bar{S}_{2}}\right)\\ = I\left(X;{Y_{1}}|U\right) + I\left(U;{Y_{2}}\right) + I\left(S;{{\bar{S}}_{2}}\right)\\ \le I\left(X;{Y_{1}}|U\right) + I\left(U;{Y_{2}}\right) + I\left(S;{{\bar{S}}_{1}}\right) \end{array} $$

where the last step follows from the Markov chain

$$S \to {\bar{S}_{1}} \to {\bar{S}_{2}} $$

Therefore, we have

$$ H\left(S|{\bar{S}_{1}}\right) < I\left(X;{Y_{1}}|U\right) + I\left(U;{Y_{2}}\right) $$
(124)
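The last step in the chain preceding (124) uses the degradedness S → S̄1 → S̄2, under which I(S;S̄2) ≤ I(S;S̄1) by the data-processing inequality. A hedged numerical sketch with explicitly degraded side-information channels (alphabet sizes, seed, and names are illustrative assumptions, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(2)

# p(s, sb1, sb2) = p(s) p(sb1|s) p(sb2|sb1): the chain S -> Sbar1 -> Sbar2.
ps = rng.random(4); ps /= ps.sum()
ch1 = rng.random((4, 3)); ch1 /= ch1.sum(axis=1, keepdims=True)  # p(sb1|s)
ch2 = rng.random((3, 2)); ch2 /= ch2.sum(axis=1, keepdims=True)  # p(sb2|sb1)
p = ps[:, None, None] * ch1[:, :, None] * ch2[None, :, :]

def H(pmf, axes):
    """Entropy (bits) of the marginal of pmf on the given axes."""
    other = tuple(i for i in range(pmf.ndim) if i not in axes)
    m = (pmf.sum(axis=other) if other else pmf).ravel()
    m = m[m > 1e-15]
    return float(-(m * np.log2(m)).sum())

# Axes: 0 = S, 1 = Sbar1, 2 = Sbar2.  Data processing: I(S;Sbar2) <= I(S;Sbar1),
# equivalently H(S|Sbar1) <= H(S|Sbar2).
i_s_sb1 = H(p, (0,)) + H(p, (1,)) - H(p, (0, 1))
i_s_sb2 = H(p, (0,)) + H(p, (2,)) - H(p, (0, 2))
assert i_s_sb2 <= i_s_sb1 + 1e-12
```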

Outer bound:

Considering (4), (5), and (8), choose \(\tilde K{\tilde {\bar {S}}_{1}}{\tilde {\bar {S}}_{2}}{U_{1}}{U_{2}}\)= a constant, \({\tilde S_{1}} = X\), and \({\tilde S_{2}} = U\); we have the facts

$$ H\left(S|{\bar{S}_{1}}\right) \le I\left({\tilde S_{1}};{Y_{1}}|{\tilde {\bar{S}}_{1}}{U_{1}}\right) \le I\left(X;{Y_{1}}\right) $$
(125)
$$ H\left(S|{\bar{S}_{2}}\right) \le I\left({\tilde S_{2}};{Y_{2}}|{\tilde {\bar{S}}_{2}}{U_{2}}\right) \le I\left(U;{Y_{2}}\right) $$
(126)
$$ H\left(S|{\bar{S}_{1}}\right) \le I\left(X;{Y_{1}}|U\right) + I\left(U;{Y_{2}}\right) $$
(127)

Case (ii):

The proof of (56) follows from (54) in Case (i).

Inner bound:

Considering (35) and (37), we choose S1=S, S2=K=U2=Null, U1=U, and U0=Q; we obtain (57).

Outer bound:

From (10), we have (57). Specifically, assume

$${\tilde S_{2}}\tilde K{\tilde {\bar{S}}_{1}}{\tilde {\bar{S}}_{2}}{U_{1}}{U_{2}} = Q, {\tilde S_{1}} = U $$

Consider the first term in (10), we have the facts

$$\begin{array}{l} I\left({S_{1}};{{\bar{S}}_{1}}|{S_{2}}\right) - I\left({S_{1}};{{\bar{S}}_{2}}|{S_{2}}\right) + I\left(S_{1};\bar{S}_{2}|S_{2}\bar{S}_{1}\right)\\ + I\left({{\tilde{S}}_{1}};{Y_{1}}|{{\tilde{S}}_{2}}Q\right) - I\left({{\tilde{S}}_{1}};{Y_{2}}|{{\tilde{S}}_{2}}Q\right)\\ = I\left(S;{{\bar{S}}_{1}}\right) - I\left(S;{{\bar{S}}_{2}}\right) + I\left(S;\bar{S}_{2}|\bar{S}_{1}\right)\\ + I\left(U;{Y_{1}}|Q\right) - I\left({{\tilde{S}}_{1}};{Y_{2}}|Q\right)\\ = I\left(S;{{\bar{S}}_{1}}\right) - I\left(S;{{\bar{S}}_{2}}\right) + I\left(U;{Y_{1}}|Q\right) - I\left({{\tilde{S}}_{1}};{Y_{2}}|Q\right) \end{array} $$

where the last step follows from the Markov chain

$$S \to {\bar{S}_{1}} \to {\bar{S}_{2}}. $$

Availability of data and materials

Data sharing not applicable to this article as no datasets were generated or analyzed during the current study.

Abbreviations

BC: broadcast channel

BCCS: broadcast channel with confidential sources

DM: discrete memoryless

SI: side information

References

1. C. Li, H. J. Yang, F. Sun, J. M. Cioffi, L. Yang, Multiuser overhearing for cooperative two-way multiantenna relays. IEEE Trans. Veh. Technol. 65(5), 3796–3802 (2016).

2. W. Wu, B. Wang, Y. Zeng, H. Zhang, Z. Yang, Z. Deng, Robust secure beamforming for wireless powered full-duplex systems with self-energy recycling. IEEE Trans. Veh. Technol. 66(11), 10055–10069 (2017).

3. F. Zhou, Z. Chu, H. Sun, R. Q. Hu, L. Hanzo, Artificial noise aided secure cognitive beamforming for cooperative MISO-NOMA using SWIPT. IEEE J. Sel. Areas Commun. 36(4), 918–931 (2018).

4. C. Li, S. Zhang, P. Liu, F. Sun, J. M. Cioffi, L. Yang, Overhearing protocol design exploiting inter-cell interference in cooperative green networks. IEEE Trans. Veh. Technol. 65(1), 441–446 (2016).

5. Z. Chu, et al., Resource allocation for secure wireless powered integrated multicast and unicast services with full duplex self-energy recycling. IEEE Trans. Wirel. Commun. 18(1), 620–636 (2019).

6. L. Wei, R. Q. Hu, Y. Qian, G. Wu, Key elements to enable millimeter wave communications for 5G wireless systems. IEEE Wirel. Commun. 21(6), 136–143 (2014).

7. C. Li, P. Liu, C. Zou, F. Sun, J. M. Cioffi, L. Yang, Spectral-efficient cellular communications with coexistent one- and two-hop transmissions. IEEE Trans. Veh. Technol. 65(8), 6765–6772 (2016).

8. Z. Chu, F. Zhou, Z. Zhu, R. Q. Hu, P. Xiao, Wireless powered sensor networks for Internet of Things: maximum throughput and optimal power allocation. IEEE Internet Things J. 5(1), 310–321 (2018).

9. R. Q. Hu, Y. Qian, An energy efficient and spectrum efficient wireless heterogeneous network framework for 5G systems. IEEE Commun. Mag. 52(5), 94–101 (2014).

10. C. Li, F. Sun, J. M. Cioffi, L. Yang, Energy efficient MIMO relay transmissions via joint power allocations. IEEE Trans. Circ. Syst. 61(7), 531–535 (2014).

11. W. Wu, F. Zhou, P. Li, P. Deng, B. Wang, V. C. M. Leung, in 2019 International Conference on Communications. Energy-efficient secure NOMA-enabled mobile edge computing networks (Shanghai, China, 2019).

12. C. Li, H. J. Yang, F. Sun, J. M. Cioffi, L. Yang, Adaptive overhearing in two-way multi-antenna relay channels. IEEE Signal Process. Lett. 23(1), 117–120 (2016).

13. Q. Li, R. Q. Hu, Y. Qian, G. Hu, Cooperative communications for wireless networks: techniques and applications in LTE-advanced systems. IEEE Wirel. Commun. 19(2), 22–29 (2012).

14. F. Zhou, Y. Wu, Y. Liang, Z. Li, Y. Wang, K.-K. Wong, State of the art, taxonomy, and open issues on NOMA in cognitive radio networks. IEEE Wirel. Commun. 25(2), 100–108 (2018).

15. C. Li, P. Liu, C. Zou, F. Sun, J. M. Cioffi, L. Yang, Spectral-efficient cellular communications with coexistent one- and two-hop transmissions. IEEE Trans. Veh. Technol. 65(8), 6765–6772 (2016).

16. Z. Zhu, Z. Chu, N. Wang, S. Huang, Z. Wang, I. Lee, Beamforming and power splitting designs for AN-aided secure multi-user MIMO SWIPT systems. IEEE Trans. Inf. Forensics Secur. 12(12), 2861–2874 (2017).

17. C. Li, F. Sun, J. M. Cioffi, L. Yang, Energy efficient MIMO relay transmissions via joint power allocations. IEEE Trans. Circ. Syst. 61(7), 531–535 (2014).

18. Z. Zhu, Z. Chu, F. Zhou, H. Niu, Z. Wang, I. Lee, Secure beamforming designs for secrecy MIMO SWIPT systems. IEEE Wirel. Commun. Lett. 7(3), 424–427 (2018).

19. D. Wan, M. Wen, F. Ji, Y. Liu, Y. Huang, Cooperative NOMA systems with partial channel state information over Nakagami-m fading channels. IEEE Trans. Commun. 66(3), 947–958 (2018).

20. E. Tuncel, Slepian-Wolf coding over broadcast channels. IEEE Trans. Inf. Theory 52(4), 1469–1482 (2006).

21. J. Villard, P. Piantanida, S. Shamai (Shitz), Secure transmission of sources over noisy channels with side information at the receivers. IEEE Trans. Inf. Theory 60(1), 713–739 (2014).

22. T. S. Han, M. H. M. Costa, Broadcast channels with arbitrarily correlated sources. IEEE Trans. Inf. Theory 33(5), 641–650 (1987).

23. G. Kramer, Y. Liang, S. Shamai (Shitz), in 2009 Information Theory and Applications Workshop. Outer bounds on the admissible source region for broadcast channels with dependent sources (San Diego, 2009), pp. 169–172.

24. F. Lang, Z. Deng, B. Wang, in 2014 IEEE Information Theory Workshop. Secure communication of correlated sources over broadcast channels (Hobart, Australia, 2014), pp. 416–420.

25. D. Gunduz, E. Erkip, A. Goldsmith, H. V. Poor, Source and channel coding for correlated sources over multiuser channels. IEEE Trans. Inf. Theory 55(9), 3927–3944 (2009).

26. W. Kang, G. Kramer, in 2008 IEEE International Symposium on Information Theory. Broadcast channel with degraded source random variables and receiver side information (Toronto, 2008), pp. 1711–1715.

27. R. Timo, A. Grant, T. Chan, G. Kramer, in 2008 IEEE International Symposium on Information Theory. Source coding for a simple network with receiver side information (Toronto, 2008), pp. 2307–2311.

28. N. Merhav, Shannon's secrecy system with informed receivers and its application to systematic coding for wiretapped channels. IEEE Trans. Inf. Theory 54(6), 2723–2734 (2008).

29. J. Xu, Y. Cao, B. Chen, Capacity bounds for broadcast channels with confidential messages. IEEE Trans. Inf. Theory 55(10), 4529–4542 (2009).


Acknowledgments

This paper was supported by the National Natural Science Foundation of China (No. 61271232, 61372126), the Open research fund of National Mobile Communications Research Laboratory, Southeast University (No. 2012D05), and the Priority Academic Program Development of Jiangsu Province (Smart Grid and Control Technology).

Funding

This paper was supported in part by the school-enterprise cooperation projects of Zhejiang Province (No. FG2016049), the Science Planning Foundation of the Department of Education of Zhejiang Province (No. 2016SCG184), a subject of the Department of Education of Zhejiang Province (No. Y201636730), a key research project of Wenzhou Vocational and Technical College (No. WZ 2016008), and the Scientific Research Plan Project of the Hubei Provincial Education Department (Project No. Q20161606).

Author information

HC is the main author of the current paper. HC contributed to the development of the ideas, design of the study, theory, result analysis, and article writing. PZ conceived and designed the experiments and undertook revision works of the paper. All authors read and approved the final manuscript.

Correspondence to Ping Zhu.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License(http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article


Keywords

  • Broadcast channel
  • Information-theoretic security
  • Correlated sources
  • Ultra-low latency
  • Side information