
Machine Theory

Relative Information: Theories and Applications by Professor Guy Jumarie (auth.)


For four decades, information theory has been viewed almost exclusively as a theory based upon the Shannon measure of uncertainty and information, usually referred to as Shannon entropy. Since the publication of Shannon's seminal paper in 1948, the theory has grown extremely rapidly and has been applied with varied success in almost all areas of human endeavor. At this time, the Shannon information theory is a well established and developed body of knowledge. Among its most significant recent contributions have been the use of the complementary principles of minimum and maximum entropy in dealing with a variety of fundamental systems problems such as predictive systems modelling, pattern recognition, image reconstruction, and the like. Since its inception in 1948, the Shannon theory has been viewed as a restricted information theory. It has often been argued that the theory is capable of dealing only with syntactic aspects of information, but not with its semantic and pragmatic aspects. This restriction was considered a virtue by some experts and a vice by others. More recently, however, various arguments have been made that the theory can be appropriately modified to account for semantic aspects of information as well. Some of the most convincing arguments in this regard are included in Fred Dretske's Knowledge & Flow of Information (The M.I.T. Press, Cambridge, Mass., 1981) and in this book by Guy Jumarie.
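As a quick illustration of the measure the description refers to: Shannon entropy assigns to a discrete distribution p_1, ..., p_n the uncertainty H = -Σ_i p_i log p_i. A minimal Python sketch (ours, not the book's):

```python
import math

def shannon_entropy(p, base=2.0):
    """Shannon entropy H(p) = -sum_i p_i log(p_i); terms with p_i = 0 contribute 0."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin is maximally uncertain
print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits: a biased coin is more predictable
```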



Best machine theory books

Digital and Discrete Geometry: Theory and Algorithms

This book offers comprehensive coverage of modern methods for geometric problems in the computing sciences. It also covers current topics in data sciences, including geometric processing, manifold learning, Google search, cloud data, and R-trees for wireless networks and big data. The author investigates digital geometry and its related constructive methods in discrete geometry, presenting detailed methods and algorithms.

Artificial Intelligence and Symbolic Computation: 12th International Conference, AISC 2014, Seville, Spain, December 11-13, 2014. Proceedings

This book constitutes the refereed proceedings of the 12th International Conference on Artificial Intelligence and Symbolic Computation, AISC 2014, held in Seville, Spain, in December 2014. The 15 full papers presented together with 2 invited papers were carefully reviewed and selected from 22 submissions.

Statistical Language and Speech Processing: Third International Conference, SLSP 2015, Budapest, Hungary, November 24-26, 2015, Proceedings

This book constitutes the refereed proceedings of the Third International Conference on Statistical Language and Speech Processing, SLSP 2015, held in Budapest, Hungary, in November 2015. The 26 full papers presented together with invited talks were carefully reviewed and selected from 71 submissions.

Additional info for Relative Information: Theories and Applications

Sample text

iii) Let E' ⊂ E denote a non-empty subset of E with N' elements, and let N'_k denote the number of elements in E' ∩ E_k. We then pose the following problem: an element of E is chosen at random, and we know that it occurs in E'; given this "information", how do we define our new uncertainty about Y? To this end, we shall proceed as follows, considering first the uncertainty about Z given Y. This last entropy is merely a conditional entropy in the Shannon sense, namely, one has

$$H = -\sum_i q_i \ln q_i, \qquad q_i := \frac{N_i'}{N'}.$$

Clearly, q_i is the probability that the considered element belongs to E_i given that it is an element of E'.
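The construction in this excerpt is direct to mirror in code. A minimal Python sketch (our illustration under the excerpt's definitions, not the author's code): take a partition {E_k} of E and a subset E', form q_i = N_i'/N', and sum -q_i ln q_i:

```python
import math

def uncertainty_given_subset(partition, E_prime):
    """-sum_i q_i ln q_i with q_i := N_i'/N', where N_i' = |E' ∩ E_i| and N' = |E'|."""
    N_prime = len(E_prime)
    H = 0.0
    for E_i in partition:
        N_i_prime = len(E_prime & E_i)
        if N_i_prime > 0:
            q_i = N_i_prime / N_prime
            H -= q_i * math.log(q_i)
    return H

# E = {1, ..., 6} split into evens and odds; we learn the element lies in E' = {1, 2, 3}.
partition = [{2, 4, 6}, {1, 3, 5}]
print(uncertainty_given_subset(partition, {1, 2, 3}))  # entropy of (1/3, 2/3) ≈ 0.637 nats
```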

(2.1) represents the amount of information contributed by α about β, or likewise the amount of information contained in α about β. □

Important Remark. Basically, the entropy is a measure of uncertainty, while the information is a difference in uncertainty, that is to say a difference in entropies. By (2.3) we are led to consider H(β) as being the amount of information contained in β about β itself. It follows that some authors refer to H(β) as an uncertainty or an information, and very often as an information only.
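To make the remark concrete, here is a toy Python computation (numbers invented for the example; it compares the prior entropy of β with the entropy remaining after one observed outcome of α, whereas the Shannon transinformation averages the latter over α):

```python
import math

def H(p):
    """Shannon entropy in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

prior = [0.25, 0.25, 0.25, 0.25]      # uncertainty about β before α is observed: 2 bits
posterior = [0.70, 0.10, 0.10, 0.10]  # uncertainty remaining after one outcome of α

print(H(prior) - H(posterior))  # ≈ 0.64 bits: information received about β
```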

..., y_n with the probabilities q_1, q_2, ..., q_n. Again consider the random variable X as in Sect. 3 and define the probability distribution {r_ij, 1 ≤ i ≤ m, 1 ≤ j ≤ n} of the pair (X, Y); in other words, (X, Y) takes the value (x_i, y_j) with the probability r_ij for every (i, j). In this framework, one has p(Y = y_j | X = x_i) = q_{j/i}, and all the equations above apply directly.

5 A Few Properties of Discrete Entropy

In this section, we summarize those mathematical properties of H(α), H(αβ) and H(β/α) which we shall need in the following.
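Among the properties such a summary typically covers is the additivity relation H(αβ) = H(α) + H(β/α). The Python sketch below (an invented joint distribution {r_ij}, not an example from the book) builds p_i and q_{j/i} = r_ij/p_i and checks the relation numerically:

```python
import math

def H(probs):
    """Shannon entropy (nats) of a sequence of probabilities."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Invented joint distribution {r_ij} of the pair (X, Y); m = 2 rows, n = 3 columns.
r = [[0.2, 0.2, 0.1],
     [0.1, 0.1, 0.3]]
m, n = len(r), len(r[0])

p = [sum(r[i]) for i in range(m)]                              # p_i = Pr(X = x_i)
cond = [[r[i][j] / p[i] for j in range(n)] for i in range(m)]  # q_{j/i} = r_ij / p_i

H_XY = H([r[i][j] for i in range(m) for j in range(n)])        # H(αβ)
H_X = H(p)                                                     # H(α)
H_Y_given_X = sum(p[i] * H(cond[i]) for i in range(m))         # H(β/α)

print(H_XY, H_X + H_Y_given_X)  # both ≈ 1.696: H(αβ) = H(α) + H(β/α)
```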

