This is the thoroughly revised and updated second edition of the hugely successful The Art of Electronics. Widely accepted as the authoritative text and reference on electronic circuit design, both analog and digital, this book revolutionized the teaching of electronics by emphasizing the methods actually used by circuit designers -- a combination of some basic laws, rules of thumb, and a large bag of tricks. The result is a largely nonmathematical treatment that encourages circuit intuition, brainstorming, and simplified calculations of circuit values and performance. The new Art of Electronics retains the feeling of informality and easy access that helped make the first edition so successful and popular. It is an ideal first textbook on electronics for scientists and engineers and an indispensable reference for anyone, professional or amateur, who works with electronic circuits.
I've been working in C and CPython for the past 3 to 5 years. Consider that my base of knowledge here.
If I were to issue an assembly instruction such as
MOV AL, 61h to a processor that supported it, what exactly is inside the processor that interprets this code and dispatches it as voltage signals? How would such a simple instruction likely be carried out?
Assembly even feels like a high level language when I try to think of the multitude of steps contained in
MOV AL, 61h or even
XOR EAX, EBX.
EDIT: I read a few comments asking why I put this as embedded when the x86-family is not common in embedded systems. Welcome to my own ignorance. Now I figure that if I'm ignorant about this, there are likely others ignorant of it as well.
It was difficult for me to pick a favorite answer considering the effort you all put into your answers, but I felt compelled to make a decision. No hurt feelings, fellas.
I often find that the more I learn about computers the less I realize I actually know. Thank you for opening my mind to microcode and transistor logic!
EDIT #2: Thanks to this thread, I have just comprehended why
XOR EAX, EAX is faster than
MOV EAX, 0h. :)
This is a big question, and at most universities there's an entire semester-long class to answer it. So, rather than give you some terribly butchered summary in this little box, instead I'll direct you to the textbook that has the whole truth: Computer Organization and Design: The Hardware/Software Interface by Patterson and Hennessy.
This is a question that requires more than an answer on StackOverflow to explain.
To learn about this all the way from the most basic electronic components up to basic machine code, read The Art of Electronics, by Horowitz and Hill. To learn more about computer architecture, read Computer Organization and Design by Patterson and Hennessy. If you want to get into more advanced topics, read Computer Architecture: A Quantitative Approach, by Hennessy and Patterson.
By the way, The Art of Electronics also has a companion lab manual. If you have the time and resources available, I would highly recommend doing the labs; I actually took the classes taught by Tom Hayes, in which we built a variety of analog and digital circuits, culminating in building a computer from a 68k chip, some RAM, some PLDs, and some discrete components. You would enter machine code directly into RAM using a hexadecimal keypad; it was a blast, and a great way to get hands on experience at the very lowest levels of a computer.
I've been a developer all my life, and my brain works in ways that make sense to a developer.
I'm interested in creating tangible, physical items using electronic circuits. I'm finding the following problems with much of the material I find:
I can learn all about the physical nature of capacitors, resistors, etc., but I'm lacking the insightful connections that would let me create my own higher-order device, such as a radio.
A lot of the things I take for granted as a programmer seem difficult in electronics. For example, it's not immediately obvious how I would create a for loop electronically. I don't know how to create a circuit that can create or use a data signal (essentially, a struct. Example: "Current weather: wind=10 knots, temperature=30, humidity=20%"). I want to protect against a remote signal not being detected by a sensor.
What are some great resources for a developer to learn about electronic circuits?
Horowitz and Hill is a great one for hobbyists.
I came the opposite direction, from an EE background I got into programming and went back to school to get a CS degree. I recommend starting out with something that combines the two in order to make the transition a little smoother. There are tons of hobbyist books like Making Things Talk, and Hardware Hacking Projects that make this easier. I also recommend the Evil Genius series of books on electronics and robotics circuits.
Charles Petzold's Code: The Hidden Language of Computer Hardware and Software is another book that does a great job of tying hardware and software concepts together. I can't recommend it highly enough, although it may be taking an opposite approach than you're after. It starts with simple switches and transistors and builds up to show how they're combined to make a programmable circuit.
What are typical means by which a random number can be generated in an embedded system? Can you offer advantages and disadvantages for each method, and/or some factors that might make you choose one method over another?
One way to do it would be to create a Pseudo Random Bit Sequence, just a train of zeros and ones, and read the bottom bits as a number.
A PRBS can be generated by tapping bits off a shift register, doing some logic on them, and using that logic to produce the next bit shifted in. Seed the shift register with any nonzero number. The math tells you which bits you need to tap off of to generate a maximal-length sequence (i.e., 2^N - 1 numbers for an N-bit shift register). There are tables out there for 2-tap, 3-tap, and 4-tap implementations. You can find them if you search on "maximal length shift register sequences" or "linear feedback shift register".
Horowitz and Hill devote a good part of a chapter to this. Most of the math surrounds the nature of the PRBS itself, not the number you generate from the bit sequence. There are some papers out there on the best ways to get a number out of the bit sequence and on improving correlation by playing around with masking the bits you use to generate the random number, e.g., Horan and Guinee, "Correlation Analysis of Random Number Sequences based on Pseudo Random Binary Sequence Generation," in Proc. of IEEE ISOC ITW2005 on Coding and Complexity, ed. M. J. Dinneen, co-chairs U. Speidel and D. Taylor, pp. 82-85.
An advantage would be that this can be achieved simply by bitshifting and simple bit logic operations. A one-liner would do it. Another advantage is that the math is pretty well understood. A disadvantage is that this is only pseudorandom, not random. Also, I don't know much about random numbers, and there might be better ways to do this that I simply don't know about.
How much energy you expend on this depends on how random you need the number to be. If I were running a gambling site and needed random numbers to generate deals, I wouldn't depend on a pseudorandom bit sequence. In those cases, I would probably look into analog noise techniques, maybe Johnson noise across a big honking resistor or some junction noise on a PN junction, then amplify that and sample it. The advantage is that if you get it right, you have a pretty good random number. The disadvantages are that sometimes you want a pseudorandom number, where you can exactly reproduce a sequence by storing a seed. Also, this uses hardware, which someone must pay for, instead of a line or two of code, which is cheap, and it uses A/D conversion, which is yet another peripheral to tie up. Lastly, if you do it wrong -- say, a mistake lets 60 Hz hum overwhelm your white noise -- you can get a pretty lousy random number.
I'm looking for a laymen's introduction to computer hardware and organization. Here are some of the topics I would like to cover.
Brief intro to electronics.
Gates and state machines, intro to register transfer and timing.
Basic CPU design. Control.
Microprogrammed CPU design.
Memory hierarchy: registers, cache, RAM
Virtual memory organization.
Disk storage systems.
Internal busses: front-side, memory, PCI
Internal busses for storage: IDE, SATA, SCSI
External busses: USB and FireWire
Display systems and GPUs
I would prefer free resources online, but if nothing is available, a book is fine as well. I have no background with hardware, so an introductory text would be wonderful. Also, I'm sorry if this isn't directly about programming, but I don't know where else to ask.
The Art of Electronics by Horowitz and Hill is a great one for hobbyists on electronics.
For computer architecture, Computer Organization and Design: The Hardware/Software Interface.
For RTL design, VHDL for Programmable Logic.
I would recommend the book "Code" by Charles Petzold. It covers a lot of how the low level of a computer works from a layman's perspective. Not everything on your list is included, but it will give you a good start.
Tanenbaum's Structured Computer Organization was my intro into the 'levels' of computers. It's quite logical, approaching each level built on the previous.
I've often thought of doing a similar one, stretching from quantum physics through classical physics, electronics, integrated circuits, microcode, machine code, compilers, interpreters, VMs and so on, but I fear that would be about as possible as Knuth's 12-volume series. I hope he has a child to carry on the work :-).
As mentioned already Code: The Hidden Language of Computer Hardware and Software is a great book that covers the fundamentals.