In any of the examples, the probabilistic encoder may be implemented using an encoder deep neural network (DNN), and the encoder DNN may be trained to satisfy: a first target of maximizing a likelihood of a set of information recovered at a corresponding decoder DNN, and a second target of minimizing an upper bound on mutual information to be within the predetermined physical channel capacity limit.
In any of the examples, the encoder DNN and the decoder DNN may be trained together.
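The two-target objective described above can be illustrated with a short sketch. The patent text does not specify the exact loss function, so the example below assumes a variational-information-bottleneck-style objective, in which a reconstruction term (minimizing it maximizes the likelihood of the recovered information) is combined with a KL divergence term that serves as a variational upper bound on the mutual information between the input and the latent code. The function name `vib_loss` and the weight `beta` are illustrative, not from the source.

```python
import numpy as np

def vib_loss(x, x_hat, mu, logvar, beta=1e-3):
    """Sketch of a two-term training objective (an assumed form, not the
    patent's exact loss):
    - reconstruction term: mean squared error, a Gaussian negative
      log-likelihood proxy; minimizing it maximizes the likelihood of the
      set of recovered information at the decoder DNN;
    - rate term: KL(q(z|x) || N(0, I)), a variational upper bound on the
      mutual information I(X; Z), weighted by beta so the rate can be kept
      within a channel-capacity budget during joint training.
    """
    recon = np.mean((x - x_hat) ** 2)
    # Closed-form KL between the encoder's diagonal Gaussian q(z|x)
    # (parameterized by mu, logvar) and a standard normal prior.
    kl = 0.5 * np.mean(np.sum(np.exp(logvar) + mu ** 2 - 1.0 - logvar, axis=1))
    return recon + beta * kl, recon, kl
```

In a joint training loop, gradients of this combined loss would flow through both the decoder and the encoder DNN, which is consistent with training the two networks together.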
In any of the examples, the compression ratio provided by the trained encoder DNN and the decoder DNN may have been determined by performing training on a plurality of candidate encoder and decoder DNN pairs, each candidate pair providing a respective different compression ratio, and selecting the candidate pair, and its associated compression ratio, that minimizes the upper bound on mutual information.
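The candidate-selection step can be sketched as a simple search over trained pairs. The dictionary fields `ratio` and `mi_upper_bound` below are hypothetical placeholders for each candidate pair's compression ratio and its measured mutual-information upper bound after training; the source does not name these quantities.

```python
def select_candidate(candidates, capacity_limit):
    """Pick the candidate encoder/decoder DNN pair whose mutual-information
    upper bound is smallest, among those that fit within the physical
    channel capacity limit. Each candidate is a dict with hypothetical
    fields 'ratio' (its compression ratio) and 'mi_upper_bound' (its
    rate measured after training)."""
    feasible = [c for c in candidates if c["mi_upper_bound"] <= capacity_limit]
    if not feasible:
        raise ValueError("no candidate pair fits the channel capacity limit")
    return min(feasible, key=lambda c: c["mi_upper_bound"])
```

In practice each candidate's bound would come from training that pair to convergence; here the values are assumed to be already measured.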
In any of the examples, the apparatus may also include: a historical database storing at least one previously transmitted feature; and the transmitter may be configured to transmit a reduced set of features over the transmission channel, the reduced set of features omitting any feature that is unchanged compared to the at least one previously transmitted feature.
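The historical-database mechanism above amounts to differential transmission: only features that changed since a previous transmission are sent. A minimal sketch, assuming features are keyed values (the key/value representation is an assumption for illustration):

```python
def reduce_features(features, history):
    """Build the reduced set of features to transmit: omit any feature
    whose value is unchanged compared to the historical database, then
    record the newly transmitted values so later transmissions can be
    reduced against them. `features` and `history` are plain dicts here;
    the actual feature representation is not specified in the source."""
    reduced = {k: v for k, v in features.items() if history.get(k) != v}
    history.update(reduced)  # historical database now reflects this transmission
    return reduced
```

On the first call the history is empty, so every feature is transmitted; subsequent calls transmit only the features whose values differ from the stored ones.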