# The transmission channel


Ideally, the transmission channel should deliver to the destination exactly the symbols emitted by the source. In practice, however, noise and other transmission impairments in the channel alter the source symbols, resulting in a different symbol alphabet Y at the destination. Consider the simple model of an error-prone communication channel shown in Figure XXX, which illustrates the forward transitions of the channel. Here X and Y represent the alphabets of symbols transmitted and received, respectively, during a unit of time over the channel. Let P(yj|xi) denote the conditional probability of receiving the output symbol yj given the input xi, where xi ∈ X and yj ∈ Y.

### Forward transition probabilities of a noisy channel

If the system is intended to deliver yi when xi is transmitted, then the error probabilities are given by P(yj|xi) for j ≠ i. For the channel, the mutual information, I(xi;yj), measures the amount of information transferred when xi is transmitted and yj is received. I(xi;yj) is defined as follows:

I(x_i;y_j) = \log_2 \frac{P(x_i|y_j)}{P(x_i)}
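As a concrete illustration, consider a hypothetical binary symmetric channel with crossover probability 0.1 and equiprobable inputs (both values are assumptions for the example, not from the text). The per-symbol mutual information can then be computed directly from the definition:

```python
import math

# Hypothetical binary symmetric channel: crossover probability 0.1,
# equiprobable inputs P(x0) = P(x1) = 0.5.
p = 0.1
P_x = [0.5, 0.5]
# Forward transition probabilities P(y_j | x_i).
P_y_given_x = [[1 - p, p],
               [p, 1 - p]]

# Joint distribution P(x_i, y_j) and output marginal P(y_j).
P_xy = [[P_x[i] * P_y_given_x[i][j] for j in range(2)] for i in range(2)]
P_y = [sum(P_xy[i][j] for i in range(2)) for j in range(2)]

def mutual_info_symbol(i, j):
    """I(x_i; y_j) = log2( P(x_i | y_j) / P(x_i) ), in bits."""
    P_x_given_y = P_xy[i][j] / P_y[j]
    return math.log2(P_x_given_y / P_x[i])

print(mutual_info_symbol(0, 0))  # information gained when x0 is sent and y0 received
print(mutual_info_symbol(0, 1))  # negative: receiving y1 points away from x0
```

A positive value means that receiving yj makes xi more likely than it was a priori; a negative value means yj is evidence against xi.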

Most transmission channels fall between perfect transfer (i.e. each yj uniquely identifies a particular xi) and zero transfer (i.e. yj is totally unrelated to xi). To analyse the general case, the average mutual information is defined as:

I(X;Y) = \sum_{i}\sum_{j} P(x_i,y_j) \log_2 \frac{P(x_i|y_j)}{P(x_i)} = H(X) - H(X|Y)

H(X) denotes the entropy of the source X and H(X|Y) denotes the conditional entropy of the input signal X given the output signal Y. Equation 5.2 states that the average information transferred per symbol equals the source entropy minus this conditional entropy.
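To check that the two forms of the average mutual information agree, the sketch below (again assuming a hypothetical binary symmetric channel with crossover probability 0.1 and equiprobable inputs) evaluates I(X;Y) both as H(X) − H(X|Y) and as the double sum over the joint distribution:

```python
import math

p = 0.1  # crossover probability of the assumed binary symmetric channel
P_x = [0.5, 0.5]
P_y_given_x = [[1 - p, p], [p, 1 - p]]
P_xy = [[P_x[i] * P_y_given_x[i][j] for j in range(2)] for i in range(2)]
P_y = [sum(P_xy[i][j] for i in range(2)) for j in range(2)]

def H(dist):
    """Shannon entropy of a probability distribution, in bits."""
    return -sum(q * math.log2(q) for q in dist if q > 0)

# H(X|Y) = sum_j P(y_j) * H(X | Y = y_j)
H_X_given_Y = sum(
    P_y[j] * H([P_xy[i][j] / P_y[j] for i in range(2)])
    for j in range(2)
)
I_diff = H(P_x) - H_X_given_Y

# Direct double sum: sum_ij P(x_i, y_j) log2( P(x_i | y_j) / P(x_i) )
I_sum = sum(
    P_xy[i][j] * math.log2(P_xy[i][j] / (P_x[i] * P_y[j]))
    for i in range(2) for j in range(2)
)
print(I_diff, I_sum)  # both forms agree (about 0.531 bits for p = 0.1)
```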

The solution to the rate-distortion problem is obtained by minimising the rate-distortion function given below:

R(D) = \min_{P(y_j|x_i):\, E[d(X,Y)] \le D} I(X;Y)

where R is the information rate and D is an average distortion. I(X;Y) describes the average mutual information between an original source X (where the source selects symbols from an alphabet X) and the reconstructed data Y. Equation 5.1 says that, for a given maximum average distortion Dmax, the rate-distortion function R(D) defines the lower bound for the transmission bit-rate. The minimisation is taken over all conditional distributions P(yj|xi) for which the joint distribution P(yj;xi) satisfies the expected distortion constraint.
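One standard way to evaluate points on the R(D) curve numerically is the Blahut-Arimoto algorithm. The sketch below is a minimal implementation for a discrete source; the slope parameter s and the iteration count are illustrative assumptions. As a check it uses the binary uniform source with Hamming distortion, whose rate-distortion function has the known closed form R(D) = 1 − H2(D):

```python
import math

def blahut_arimoto_rd(p_x, dist, s, iters=200):
    """One point on the R(D) curve via Blahut-Arimoto, parameterised by the
    Lagrange slope s < 0. Returns (D, R) with R in bits."""
    nx, ny = len(p_x), len(dist[0])
    q = [1.0 / ny] * ny  # output marginal, initialised uniform
    for _ in range(iters):
        # Optimal test channel for the current q: P(y|x) ~ q(y) * exp(s * d(x,y))
        P = []
        for i in range(nx):
            w = [q[j] * math.exp(s * dist[i][j]) for j in range(ny)]
            z = sum(w)
            P.append([wj / z for wj in w])
        # Update the output marginal: q(y) = sum_x p(x) P(y|x)
        q = [sum(p_x[i] * P[i][j] for i in range(nx)) for j in range(ny)]
    D = sum(p_x[i] * P[i][j] * dist[i][j] for i in range(nx) for j in range(ny))
    R = sum(p_x[i] * P[i][j] * math.log2(P[i][j] / q[j])
            for i in range(nx) for j in range(ny) if P[i][j] > 0)
    return D, R

# Binary uniform source, Hamming distortion: closed form R(D) = 1 - H2(D).
D, R = blahut_arimoto_rd([0.5, 0.5], [[0, 1], [1, 0]], s=-2.0)
H2 = lambda q: -q * math.log2(q) - (1 - q) * math.log2(1 - q)
print(D, R, 1 - H2(D))  # computed R matches the closed form 1 - H2(D)
```

Sweeping s over negative values traces out the whole R(D) curve, from zero distortion (s -> -inf) to zero rate.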

The conditional probability P(y|x) is considered an inherent and fixed property of the communication channel, determined by the characteristics of the noise in the channel. The joint probability distribution of X and Y is then entirely determined by the nature of the channel and the distribution f(x) of messages to be transmitted over it. Under these constraints, the objective is to maximise the rate of information communicated over the noisy channel. The appropriate measure for this is the mutual information, and the theoretical upper bound of the mutual information is known as the channel capacity, given by:

C = \max_{f} I(X;Y)
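For small alphabets, the maximisation over input distributions f can be carried out numerically. The sketch below (again assuming a hypothetical binary symmetric channel with crossover probability 0.1) grid-searches over input distributions and recovers the known closed form C = 1 − H2(p), achieved by the uniform input:

```python
import math

def mutual_information(p_x, P_y_given_x):
    """I(X;Y) in bits for input distribution p_x and channel matrix P(y|x)."""
    ny = len(P_y_given_x[0])
    p_y = [sum(p_x[i] * P_y_given_x[i][j] for i in range(len(p_x)))
           for j in range(ny)]
    I = 0.0
    for i, px in enumerate(p_x):
        for j in range(ny):
            pyx = P_y_given_x[i][j]
            if px > 0 and pyx > 0:
                I += px * pyx * math.log2(pyx / p_y[j])
    return I

# Assumed binary symmetric channel with crossover 0.1; grid search over f(x).
p = 0.1
channel = [[1 - p, p], [p, 1 - p]]
C, a_star = max((mutual_information([a, 1 - a], channel), a)
                for a in [k / 1000 for k in range(1, 1000)])
H2 = lambda q: -q * math.log2(q) - (1 - q) * math.log2(1 - q)
print(C, 1 - H2(p), a_star)  # C matches 1 - H2(0.1), achieved at a = 0.5
```

For channels without a closed-form capacity, the same maximisation is usually done with the Blahut-Arimoto capacity algorithm rather than a grid search.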

Channel capacity has the following property related to transmitting information at rate R, where R is generally measured in bits per message or symbol. For a communication system with information rate R < C and any coding error e > 0, it is always possible to transmit with an arbitrarily small error, such that the maximal probability of error is less than the acceptable level e. Conversely, for any rate R > C, it is not possible to transmit with an arbitrarily small block error. The objective of channel coding is to find nearly optimal codes that can be used to transmit data over an error-prone channel with an acceptable error rate, close to the channel capacity. However, in most practical video communication systems, the quality of the transmitted video varies due to bandwidth limitations. Thus, the maximum perceptual quality under the rate constraint can be achieved by solving the following:

\min_{P(y_j|x_i) \in F} D

The set F defines the solution space of conditional distributions P(yj|xi) for which the joint distribution P(yj;xi) satisfies the expected rate constraint.