TY - GEN
T1 - Binary causal-adversary channels
AU - Langberg, M.
AU - Jaggi, S.
AU - Dey, B. K.
N1 - Copyright:
Copyright 2013 Elsevier B.V., All rights reserved.
PY - 2009
Y1 - 2009
N2 - In this work we consider the communication of information in the presence of a causal adversarial jammer. In the setting under study, a sender wishes to communicate a message to a receiver by transmitting a codeword x = (x1, ⋯, xn) bit-by-bit over a communication channel. The adversarial jammer can view the transmitted bits xi one at a time, and can change up to a p-fraction of them. However, the decisions of the jammer must be made in an online or causal manner. Namely, for each bit xi the jammer's decision on whether to corrupt it or not (and on how to change it) must depend only on xj for j ≤ i. This is in contrast to the "classical" adversarial jammer, which may base its decisions on its complete knowledge of x. We present a non-trivial upper bound on the amount of information that can be communicated. We show that the achievable rate can be asymptotically no greater than min{1 - H(p), (1 - 4p)+}. Here H(·) is the binary entropy function, and (1 - 4p)+ equals 1 - 4p for p ≤ 0.25, and 0 otherwise.
AB - In this work we consider the communication of information in the presence of a causal adversarial jammer. In the setting under study, a sender wishes to communicate a message to a receiver by transmitting a codeword x = (x1, ⋯, xn) bit-by-bit over a communication channel. The adversarial jammer can view the transmitted bits xi one at a time, and can change up to a p-fraction of them. However, the decisions of the jammer must be made in an online or causal manner. Namely, for each bit xi the jammer's decision on whether to corrupt it or not (and on how to change it) must depend only on xj for j ≤ i. This is in contrast to the "classical" adversarial jammer, which may base its decisions on its complete knowledge of x. We present a non-trivial upper bound on the amount of information that can be communicated. We show that the achievable rate can be asymptotically no greater than min{1 - H(p), (1 - 4p)+}. Here H(·) is the binary entropy function, and (1 - 4p)+ equals 1 - 4p for p ≤ 0.25, and 0 otherwise.
UR - http://www.scopus.com/inward/record.url?scp=70449476512&partnerID=8YFLogxK
U2 - 10.1109/ISIT.2009.5205859
DO - 10.1109/ISIT.2009.5205859
M3 - Conference contribution
AN - SCOPUS:70449476512
SN - 9781424443130
T3 - IEEE International Symposium on Information Theory - Proceedings
SP - 2723
EP - 2727
BT - 2009 IEEE International Symposium on Information Theory, ISIT 2009
T2 - 2009 IEEE International Symposium on Information Theory, ISIT 2009
Y2 - 28 June 2009 through 3 July 2009
ER -