Reddit reviews: An Introduction to Information Theory: Symbols, Signals and Noise (Dover Books on Mathematics)

We found 10 Reddit comments about An Introduction to Information Theory: Symbols, Signals and Noise (Dover Books on Mathematics). Here are the top ones, ranked by their Reddit score.


10 Reddit comments about An Introduction to Information Theory: Symbols, Signals and Noise (Dover Books on Mathematics):

u/acetv · 14 points · r/math

You are in a very special position right now where many interesting fields of mathematics are suddenly accessible to you. There are many directions you could head. If your experience is limited to calculus, some of these may look very strange indeed, and perhaps that is enticing. That was certainly the case for me.

Here are a few subject areas in which you may be interested. I'll link you to Dover books on the topics, which are always cheap and generally good.

  • The Nature and Power of Mathematics, Donald M. Davis. This book seems to be a survey of some history of mathematics and various modern topics. Check out the table of contents to get an idea. You'll notice a few of the subjects in the list below. It seems like this would be a good buy if you want to taste a few different subjects to see what pleases your palate.

  • Introduction to Graph Theory, Richard J. Trudeau. Check out the Wikipedia entry on graph theory and the one defining graphs to get an idea what the field is about and some history. The reviews on Amazon for this book lead me to believe it would be a perfect match for an interested high school student.

  • Game Theory: A Nontechnical Introduction, Morton D. Davis. Game theory is a very interesting field with broad applications--check out the wiki. This book seems to be written at a level where you would find it very accessible. The actual field uses some heavy math but this seems to give a good introduction.

  • An Introduction to Information Theory, John R. Pierce. This is a light-on-the-maths introduction to a relatively young field of mathematics/computer science which concerns itself with the problems of storing and communicating data. Check out the wiki for some background.

  • Lady Luck: The Theory of Probability, Warren Weaver. This book seems to be a good introduction to probability and covers a lot of important ideas, especially in the later chapters. Seems to be a good match to a high school level.

  • Elementary Number Theory, Underwood Dudley. Number theory is a rich field concerned with properties of numbers. Check out its Wikipedia entry. I own this book and am reading through it like a novel--I love it! The exposition is so clear and thorough you'd think you were sitting in a lecture with a great professor, and the exercises are incredible. The author asks questions in such a way that, after answering them, you can't help but generalize your answers to larger problems. This book really teaches you to think mathematically.

  • A Book of Abstract Algebra, Charles C. Pinter. Abstract algebra formalizes and generalizes the basic rules you know about algebra: commutativity, associativity, inverses of numbers, the distributive law, etc. It turns out that considering these concepts from an abstract standpoint leads to complex structures with very interesting properties. The field is HUGE and seems to bleed into every other field of mathematics in one way or another, revealing its power. I also own this book and it is similarly awesome. The exposition sets you up to expect the definitions before they are given, so the material really does proceed naturally.

  • Introduction to Analysis, Maxwell Rosenlicht. Analysis is essentially the foundations and expansion of calculus. It is an amazing subject which no math student should ignore. Its study generally requires a great deal of time and effort; some students would benefit more from a guided class than from self-study.

  • Principles of Statistics, M. G. Bulmer. In a few words, statistics is the marriage between probability and analysis (calculus). The wiki article explains the context and interpretation of the subject but doesn't seem to give much information on what the math involved is like. This book seems like it would be best read after you are familiar with probability, say from Weaver's book linked above.

  • I have to second sellphone's recommendation of Naive Set Theory by Paul Halmos. It's one of my favorite math books and gives an amazing introduction to the field. It's short and to the point--almost a haiku on the subject.

  • Continued Fractions, A. Ya. Khinchin. Take a look at the wiki for continued fractions. The book is definitely terse at times but it is rewarding; Khinchin is a master of the subject. One review states that "although the book is rich with insight and information, Khinchin stays one nautical mile ahead of the reader at all times." Another review recommends Carl D. Olds' book on the subject as a better introduction.

    Basically, don't limit yourself to the track you see before you. Explore and enjoy.
u/jnazario · 6 points · r/compsci

i recently read Pierce's An Introduction to Information Theory and was pleased. while it's not recent, it's a good intro, if that's what you're looking for. also it's a Dover edition, so it's priced very low.

u/mjedm6 · 3 points · r/math

They may not be the best books for complete self-learning, but I have a whole bookshelf of the small introductory topic books published by Dover - books like An Introduction to Graph Theory, Number Theory, An Introduction to Information Theory, etc. The books are very cheap, usually $4-$14. The books are written in various ways; for instance, the Number Theory book is highly proof- and problem-based if I remember correctly, whereas the Information Theory book is more of a straightforward natural-language summary of work by Claude Shannon et al. I still find them all great value and great to blast through in a weekend to brush up on a new topic. I'd pair each one with a real learning text with problem sets etc., and read the Dover book first quickly, since it introduces the reader to any unfamiliar terminology that may be needed before jumping into other step-by-step learning texts.

u/c3534l · 2 points · r/learnmath

From the ground up, I dunno. But I looked through my amazon order history for the past 10 years and I can say that I personally enjoyed reading the following math books:

An Introduction to Graph Theory

Introduction to Topology

Coding the Matrix: Linear Algebra through Applications to Computer Science

A Book of Abstract Algebra

An Introduction to Information Theory

u/trashacount12345 · 2 points · r/neuro

If you're interested in the more computers-and-signal-processing side of neuroscience, you'll need a bunch of math. If you're interested, check out this book (http://www.amazon.com/An-Introduction-Information-Theory-Symbols/dp/0486240614/ref=sr_1_1?ie=UTF8&qid=1371748392&sr=8-1&keywords=information+theory). I read it after going to college, so it may have a smidge of calculus in there, but at least the beginning (which is all the interesting stuff) is simple enough to not need it.

Information theory is one of those math topics that makes you rethink more than just math, hence the recommendation.

u/JoinXorDie · 1 point · r/mathbooks

The Dover information theory book is also the same price.

Now I'm hunting for other Dover books I may want. Thanks!

u/DiscoUnderpants · 1 point · r/atheism

Hmmm, for a biologist I am not really that sure. It is a branch of mathematics, mostly statistical... Shannon's original book/expanded paper is available here and I have read this, but they are very engineery if you know what I mean.

But I think that kind of illustrates the point... creationist types seem to see some profound truth in information theory... but it is really a field developed to deal with the transmission of messages in a general way, and with errors in that transmission. The entire reason for it is to come up with ways to minimise errors in message transmission, and that is all EEs care about... that they get the same message at the receiver that was sent by the transmitter.
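To make that error-minimising point concrete, here is a toy sketch (my own illustration, not anything from Shannon's paper) of the simplest scheme an EE would reach for: a 3x repetition code, where each bit is sent three times and the receiver takes a majority vote.

```python
from collections import Counter

def encode(bits, n=3):
    # Repeat each bit n times before transmission.
    return [b for bit in bits for b in [bit] * n]

def decode(received, n=3):
    # Majority vote over each block of n received copies.
    return [Counter(received[i:i + n]).most_common(1)[0][0]
            for i in range(0, len(received), n)]

msg = [1, 0, 1, 1]
sent = encode(msg)    # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
sent[1] ^= 1          # channel noise flips one copy
assert decode(sent) == msg  # the single error is voted away
```

Real codes (Hamming, Reed-Solomon, etc.) do this far more efficiently, but the goal is the same: the receiver recovers the transmitter's message despite noise.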

To expand this to biology, I suppose you could roughly say that the message is the genome being transmitted down the generations. But the way errors are dealt with is entirely different... selection takes care of the "bad" messages. And evolution, as far as I know, does not have the goal of replicating the same information... but rather the opposite.

NINJA EDIT: Claude E. Shannon should be a lot better known for his contribution to humanity.

u/TheCaterpillar · 1 point · r/wikipedia

For a very good read on entropy that requires only high school math check out:
http://www.amazon.com/Introduction-Information-Theory-John-Pierce/dp/0486240614
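The entropy the book builds up to really does need only high school math. As a quick illustration (a minimal sketch of my own, not taken from Pierce), Shannon entropy is just a weighted sum of logs over a probability distribution:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)), in bits per symbol.

    Terms with p == 0 contribute nothing, so they are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit per toss
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.469 bits, less surprise
```

A certain outcome carries zero bits; a fair coin carries the maximum one bit per toss. That asymmetry is the whole engine of data compression.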

u/dajigo · 1 point · r/hardware

There's an excellent, inexpensive book I love on the topic: An Introduction to Information Theory: Symbols, Signals and Noise. I strongly recommend it.

Edit: You don't need to be an expert in maths to get the gist of the book, the exposition is great.

u/FieldLine · 1 point · r/PurplePillDebate

>And that's something that puzzles me. All these AI guys searching for the Secret of Consciousness^TM, and no one ever stopping to ask, hey, what if there is no secret?

To be fair, there isn't much active research being done among computer scientists in private industry to search for the Secret of Consciousness^TM (as far as I know). While it's cool to dream about futuristic robots and computers that can pass the Turing Test, there isn't much money to be found by directly developing a generalized AI, and academics for the most part don't produce shit in the way of practical science. So private industry focuses on generalizing algorithms only to the extent of performing a specific task, while guys like me hope that eventually these tasks will become general enough that we can arbitrarily decide that we've created a strong AI.

>What if there is no strong AI barrier other than computational limits of modern computers?

>Every time we figure out how the brain does something, we find brute-force computation rather than sophisticated algorithms.

Once you move into the realm of practically unlimited computational power you can dispense with algorithms altogether. But where's the elegance in that?

It reminds me of a funny historical note from the development of Fourier analysis. While mathematicians were trying to prove the convergence of Fourier series, dragging the Dirichlet kernel around in all their proofs, the engineers were perfectly happy using the Dirac delta approximation. Thirty years later, when the mathematicians finally came up with a formal proof that let them use the Dirac delta approximation as well and were like "look how awesome this is", the engineers were like "duh, where have you guys been?"

Point is, there's no elegance in saying that the Secret of Consciousness^TM lies in a brute force approach. That would be admitting that our brains, as awesome as they are, are just glorified roombas.

I'm just an EE guy who likes algorithms on the side, but as a real computer scientist you can probably answer this better than I can - if we disregard computational efficiency, can't every algorithm be explicitly programmed simply using an if/else control flow?
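(To make my own question concrete: for any function over a finite input domain the answer is trivially yes, since you can flatten the whole truth table into raw branches. A toy sketch, hard-coding XOR as a pure decision tree with no arithmetic at all:)

```python
def xor_table(a, b):
    # Exhaustive if/else enumeration of a 2-input boolean function.
    # No operators, no lookup structures: just hard-coded branches.
    if a == 0:
        if b == 0:
            return 0
        else:
            return 1
    else:
        if b == 0:
            return 1
        else:
            return 0

# Agrees with the built-in XOR on every input pair.
assert all(xor_table(a, b) == (a ^ b) for a in (0, 1) for b in (0, 1))
```

(The catch, of course, is that the branch count grows exponentially with input size, which is exactly the brute-force-vs-elegance tradeoff above.)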

For what it's worth, I'm fairly certain that you are correct that the brain isn't wired efficiently, but has the luxury of getting away with it because it has billions of neurons offering a huge advantage in raw computational potential. Not that this statement is worth much, as it is just a gut feeling that I can't back up.

>Every time we get computers to solve a heuristic problem, we cease to think it as belonging to the field of "artificial intelligence".

Disagree. One can draw a distinction between what is and what isn't a learning algorithm, but I would consider a roomba to be a rudimentary artificial intelligence. But that comes down to personal preference and a lack of precise language.

>Would we stop thinking of ourselves as conscious, self-moving minds? Would we dispense with the notion of "free will"? Or would we merely re-arrange the notion of identity?

You can ask these same questions despite the fact that we don't know what's running under the hood of the consciousness machine. If you believe in science then you believe that everything in nature has some kind of logical explanation without the hocus-pocus of religion and morality, even if modern science isn't there yet.

The only question that remains is where it all began. But that's a discussion for another day.