Feynman Lectures on Computation

Richard Phillips Feynman, Anthony J. G. Hey, Robin W. Allen


Adaptations of lecture notes from Feynman's 1983-1986 courses on computation at the California Institute of Technology.


Mentioned in questions and answers.

Information theory comes into play wherever encoding and decoding are present, for example in compression (multimedia) and cryptography.

In information theory we encounter terms like "entropy", "self-information", and "mutual information", and the entire subject is built on them. They sound like nothing more than abstractions; frankly, they don't really make much sense to me.
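One way to make these terms concrete is to just compute them. A minimal sketch in Python (my own illustration, not from any of the recommended books): self-information of an outcome with probability p is -log2(p) bits, and entropy is the average self-information over a distribution.

```python
import math

def self_information(p):
    """Surprise of one outcome with probability p, in bits: -log2(p)."""
    return -math.log2(p)

def entropy(probs):
    """Shannon entropy: the expected self-information of a distribution."""
    return sum(p * self_information(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so it carries less information
# per toss -- which is exactly why its outcomes compress better.
print(entropy([0.9, 0.1]))   # about 0.469 bits
```

The compression connection: no lossless code can use fewer bits per symbol, on average, than the source's entropy.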

Is there any book/material/explanation (if you can) which explains these things in a practical way?

EDIT:

An Introduction to Information Theory: Symbols, Signals and Noise by John Robinson Pierce is The Book that explains it the way I want (practically). It's very good. I've started reading it.

I was going to recommend Feynman for pop-sci purposes, but on reflection I think it might be a good choice for easing into a serious study as well. You can't really know this stuff without getting the math, but Feynman is so evocative that he sneaks the math in without scaring the horses.

Feynman Lectures on Computation

Covers rather more ground than just information theory, but good stuff and pleasant to read. (Besides, I am obligated to pull for Team Physics. Rah! Rah! Rhee!)

I browsed some books about quantum computers, and besides quantum physics and mathematics they use some concepts from computer science (for example, the Turing machine). So, if I want to study quantum computing, what should I know from computer science? Is it useful to read SICP, for example?

Chapters 2 and 3 of Nielsen and Chuang should give you the background you need.

The Feynman Lectures on Computation provides an easy-to-understand introduction to CS for physicists.

Beyond that, you can read some of Kitaev's Arxiv papers to see whether you're a genius.

SICP may not be directly applicable, but it may very well be the best programming book ever written, so it's always useful!

I want to go backwards and learn more about how compilers, processors and memory operate on my programs. I am also interested in the physics on which all of this depends. Any good references or books would be appreciated...

Pick up a book on "Computer Organization" or "Computer Architecture" on Amazon. This is what we used when I was in college. It's not too thick, and will give you the basics, from the gate level all of the way up to how memory is organized and programs are written. If, after this, you want to look deeper into the physics, then you'll want to pick up a book on semiconductor physics. (But if I were you I'd just start by looking up "logic gate", "diode", and "transistor" on wikipedia!)
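To give a flavor of what "from the gate level all the way up" means: every gate can be built from NAND alone, and one-bit addition falls out of a couple of gates. A toy sketch (my own illustration, not from any particular textbook):

```python
# Model each gate as a function on bits (0 or 1).
def NAND(a, b):
    return 1 - (a & b)

# All other gates can be derived from NAND alone.
def NOT(a):    return NAND(a, a)
def AND(a, b): return NOT(NAND(a, b))
def OR(a, b):  return NAND(NOT(a), NOT(b))
def XOR(a, b): return AND(OR(a, b), NAND(a, b))

def half_adder(a, b):
    """One-bit addition: returns (sum bit, carry bit)."""
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```

Chaining half adders into full adders, and full adders into a multi-bit adder, is the first step computer-organization books take from gates toward a working ALU.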

Feynman has a nice bit on the physics of computation, which addresses the second part of your question.

My first suggestion was going to be Code, which has been suggested already. A better, but harder, book on the subject of processors is Computer Organization and Design by Hennessy and Patterson. You might look for an older edition on Amazon or Half.com; they'll be a lot cheaper and have basically the same information.

These will both teach you the basics of how a processor works, assembly language, and so on. This will help you understand how your program will be executed and thus what sort of performance bottlenecks might exist based on your design.