IRE Transactions on Information Theory, Volume 2
Volume 2, Number 1, March 1956
- Douglas G. Lampard: The probability distribution for the filtered output of a multiplier whose inputs are correlated, stationary, Gaussian time-series. 4-11
- A. B. Lees: Interpolation and extrapolation of sampled data. 12-17
- Nelson M. Blachman: Report on the third London Symposium on Information Theory. 17-23
- Herbert Sherman: Some optimal signals for time measurement. 24-28
- Gerald P. Dinneen, Irving S. Reed: An analysis of signal detection and location by digital methods. 29-38
- B. J. DuWaldt: Inverse probability in angular-tracking radars. 38-42
- A. Jeffrey: The linear, input-controlled, variable-pass network. 42-44
- B. E. Keiser: Rebuttal to comments by A. Jeffrey. 43-44
Volume 2, Number 2, June 1956
- Kent R. Johnson: Optimum, linear, discrete filtering of signals containing a nonrandom component. 49-55
- Edward L. O'Neill: Spatial filtering in optics. 56-65
- Mischa Schwartz: Effects of signal fluctuation on the detection of pulse signals in noise. 66-71
- Kenneth S. Miller, Lotfi A. Zadeh: Solution of an integral equation occurring in the theories of prediction and detection. 72-75
- Marvin Blum: Generalization of the class of nonrandom inputs of the Zadeh-Ragazzini prediction model. 76-81
- J. A. McFadden: The correlation function of a sine wave plus noise after extreme clippings. 82-83
- David S. Slepian: A note on two binary signaling alphabets. 84-86
- Seymour Stein, J. E. Storer: Generating a Gaussian sample. 87-90
- Paul E. Green Jr.: A bibliography of Soviet literature on noise, correlation, and information theory. 91-94
- Satio Okada: On the information invariant (Abstr.). 95
- S. Bagno: Comments on "In which fields do we graze?" by L. de Rosa. 96
- Nelson M. Blachman: Comments on "In which fields do we graze?" by L. de Rosa. 96-97
- M. Hoberman: Comments on "In which fields do we graze?" by L. de Rosa. 96
Volume 2, Number 3, September 1956
- Claude E. Shannon: The zero error capacity of a noisy channel. 8-19
- David A. Huffman: A linear circuit viewpoint on error-correcting codes. 20-28
- Sheldon S. L. Chang: Theory of information feedback systems. 29-40
- H. P. Kramer, Max V. Mathews: A linear coding for transmitting a set of correlated signals. 41-46
- Marcel Paul Schützenberger: On an application of semi groups methods to some problems in coding. 47-60
- Allen Newell, Herbert A. Simon: The logic theory machine-A complex information processing system. 61-79
- Nathaniel Rochester, John H. Holland, L. H. Haibt, William L. Duda: Tests on a cell assembly theory of the action of the brain, using a large digital computer. 80-93
- William F. Schreiber: The measurement of third order probability distributions of television signals. 94-105
- Victor H. Yngve: Gap analysis and syntax. 106-112
- Noam Chomsky: Three models for the description of language. 113-124
- George C. Sziklai: Some studies in the speed of visual perception. 125-128
- George A. Miller: Human memory and the storage of information. 129-137
- John A. Swets, Theodore G. Birdsall: The human use of information-III: Decision-making in signal detection and recognition situations involving multiple alternatives. 138-165
- A. V. Balakrishnan, Rudolf F. Drenick: On optimum non-linear extraction and coding filters. 166-172
- Richard C. Booton: Final-value systems with Gaussian inputs. 173-175
- Marvin Blum: An extension of the minimum mean square prediction theory for sampled input signals. 176-184
- John L. Kelly: A new interpretation of information rate. 185-189
- Benoit Mandelbrot: An outline of a purely phenomenological theory of statistical thermodynamics-I: Canonical ensembles. 190-203
- William M. Siebert: A radar detection philosophy. 204-221
Volume 2, Number 4, December 1956
- Andrei N. Kolmogorov: On the Shannon theory of information transmission in the case of continuous signals. 102-108
- V. I. Siforov: On noise stability of a system with error-correcting codes. 109-115
- Brockway McMillan: Two inequalities implied by unique decipherability. 115-116
- Peter Elias, Amiel Feinstein, Claude E. Shannon: A note on the maximum flow through a network. 117-119
- L. Lorne Campbell: Rectification of two signals in random noise. 119-124
- Robert Price: Optimum detection of random signals in noise, with application to scatter-multipath communication-I. 125-135
- Mischa Schwartz: A coincidence procedure for signal detection. 135-139
- David L. Jagerman, Lawrence J. Fogel: Some general aspects of the sampling theorem. 139-146
- J. A. McFadden: The axis-crossing intervals of random functions. 146-150
- Arthur Glovazky: Determination of redundancies in a set of patterns. 151-153
- Kent R. Johnson: Correction to "Optimum, linear, discrete filtering of signals containing a nonrandom component". 154