In this unit we learn about the atomic bits of computation.  Many approaches have been tried over the years (that history is a course in itself), but all modern systems are composed of subsystems built from logic blocks described by Boolean algebra.  We will start with simple static logic circuits, and you will learn to use Logisim and other tools to build complex logic elements from nested modules that can be realized directly as physical circuits or programmed into an FPGA.

Simple Gates

Abstractly, a gate maps some number of inputs to a single output, and sometimes we generalize to compute multiple outputs from a set of inputs.  Starting from the simplest (refer to the Simple Gates circuit in the examples), we have the 1-input functions, of which there are exactly four, or 2<sup>2</sup>; then the 2-input functions, of which there are 16, including the three examples.  In general there are 2<sup>2<sup>N</sup></sup> possible functions of N inputs, one for each way of filling in the 2<sup>N</sup>-row truth table.  In any given circuit technology, we only have to implement a small number of gates and we can build the rest from them.
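
As a quick sanity check on that count, here is a minimal Python sketch (the function name is just for illustration, not from any tool we use) that enumerates every possible truth table over N inputs and confirms the 2<sup>2<sup>N</sup></sup> count.

```python
from itertools import product

def count_boolean_functions(n):
    """Count every possible truth table over n binary inputs."""
    rows = list(product([0, 1], repeat=n))            # the 2**n input combinations
    tables = list(product([0, 1], repeat=len(rows)))  # one output bit per row
    return len(tables)

for n in (1, 2, 3):
    print(n, "inputs:", count_boolean_functions(n), "functions")  # 4, 16, 256
```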

A one-input gate is either just a wire (no change), or it is NOT, where the output is the opposite of the input.  Well, let's back up a second: there are actually 4 functions of one input, but two of them are degenerate in the sense of really being zero-input functions, namely always true (1) and always false (0).

Also, sometimes a technology will have a fanout limit, but we can get around that by putting two NOT gates back to back, which in effect re-amplifies the signal so it can drive more loads.  There could also be a multistage amplifier between the microscopic parts of a chip and the pads that lead out to the board and the rest of the world.  But let's not worry about that while thinking about logic.  We can rely on standards and modules to interface with the analog and digital protocols that will integrate our logic elements with other systems and networks.

Let that be the first lesson: sometimes even one bit of logic is complicated, and entire systems can exchange data through multi-layered protocols.  But as pure logic, the two non-degenerate one-input functions have these tables:

Identity: 0 -> 0, 1 -> 1 and NOT: 0 -> 1, 1 -> 0

Then on two inputs, the truth table has four rows, so there are 2<sup>4</sup> = 16 possible functions.  Let's see how that works.  For one bit in, the table has two rows, one for each input value, so the function is described by two output bits, one per row.  Reading the output for input 0 first and input 1 second, 00 and 11 are the constant (always false and always true) functions, 01 is the identity, and 10 is NOT; we have to know the order of the bits to know which is which.  If the sixteen two-input functions are each described by four bits, one for each combination of values of A and B, then again some of them reduce to zero- or one-input gates, because they ignore one or both inputs (in effect, an input that isn't connected to anything).  In the list below, the four bits give the outputs for AB = 00, 01, 10, 11 in that order, and the sixteen functions are listed in Gray code order, so each code differs from its neighbor by a single bit.  You can examine these in the simulator by grounding or pulling up an input.
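
Here is a minimal sketch of that grounding/pull-up experiment in Python (the function is just an illustration, not simulator code): tying one input of a two-input AND to a constant collapses it to a one-input or constant function.

```python
def AND(a, b):
    return a & b

# Tie B high (pull up): AND collapses to the identity function of A.
for a in (0, 1):
    print("B=1:", a, "->", AND(a, 1))   # 0 -> 0, 1 -> 1

# Tie B low (ground): AND collapses to the constant-0 function.
for a in (0, 1):
    print("B=0:", a, "->", AND(a, 0))   # 0 -> 0, 1 -> 0
```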

A table form for one input would list the four functions from above on four lines, one per function, where each binary digit gives the output for a particular input value.  The first column here is the output when the input is zero, and the second when it is one.  So we have the four functions of one input:

  • 00 Always 0 (False)
  • 01 Identity (same as input)
  • 10 NOT
  • 11 Always 1 (True)
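
A small Python sketch of the same idea (the dictionary name and lambdas are just for illustration): each one-input function is identified by its two output bits, and evaluating the function on inputs 0 and 1 reproduces its code.

```python
# Each one-input function is named by its two output bits: (f(0), f(1)).
one_input = {
    "00": lambda a: 0,      # always 0
    "01": lambda a: a,      # identity
    "10": lambda a: 1 - a,  # NOT
    "11": lambda a: 1,      # always 1
}

for code, f in one_input.items():
    outputs = "".join(str(f(a)) for a in (0, 1))
    print(code, outputs)    # the outputs reproduce the code
```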

So, let's look at all sixteen (each digit gives the output for AB = 00, 01, 10, 11):

  • 0000 Always False
  • 0001 A AND B
  • 0011 A
  • 0010 A AND NOT B
  • 0110 A XOR B = NOT (A EQ B)
  • 0111 A OR B
  • 0101 B
  • 0100 NOT A AND B
  • 1100 NOT A
  • 1101 NOT (A AND NOT B) = NOT A OR B
  • 1111 Always True
  • 1110 NOT (A AND B) -- Called NAND
  • 1010 NOT B
  • 1011 NOT (NOT A AND B) = A OR NOT B
  • 1001 A EQ B = NOT (A XOR B)
  • 1000 NOT (A OR B) -- Called NOR
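
You can check every line of the list with a short Python sketch (names and lambdas here are just for illustration): rebuild each four-bit code from its formula, reading A,B in the order 00, 01, 10, 11.

```python
# Rebuild the four-bit codes from the formulas, using the column
# order AB = 00, 01, 10, 11 from the list above.
functions = {
    "Always False":  lambda a, b: 0,
    "A AND B":       lambda a, b: a & b,
    "A":             lambda a, b: a,
    "A AND NOT B":   lambda a, b: a & (1 - b),
    "A XOR B":       lambda a, b: a ^ b,
    "A OR B":        lambda a, b: a | b,
    "B":             lambda a, b: b,
    "NOT A AND B":   lambda a, b: (1 - a) & b,
    "NOT A":         lambda a, b: 1 - a,
    "NOT A OR B":    lambda a, b: (1 - a) | b,
    "Always True":   lambda a, b: 1,
    "NAND":          lambda a, b: 1 - (a & b),
    "NOT B":         lambda a, b: 1 - b,
    "A OR NOT B":    lambda a, b: a | (1 - b),
    "A EQ B":        lambda a, b: 1 - (a ^ b),
    "NOR":           lambda a, b: 1 - (a | b),
}

for name, f in functions.items():
    code = "".join(str(f(a, b)) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)])
    print(code, name)
```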

From Logic to Gates

Logic as above is the abstraction, the math you want your digital circuit to compute.  A fact passed over above is that gates are not technically restricted to binary logic, but as a practical matter they are.  We build almost everything with binary, synchronous logic (we'll get into what synchronous means when we talk about timing and clocks later).  So the math is all Boolean algebra, as in the brief introduction above.

You won't need to know much about how logic functions are implemented as gates (transistors) at the circuit level unless you are a specialist working for a silicon manufacturer. There are many interesting things to learn about all of that, but these days if you are designing custom logic for a project, you are either part of a big design team producing custom chips, or you are designing with some form of programmable array logic. In these technologies you can get thousands and even millions of simple gates and medium-scale modules, either as a standalone chip or on a system-on-chip alongside a processor, memory, and other standard interface logic.

You should be able to build and test any of the simple logic functions in any gate technology. You can easily show that any more complex logic function can be made from a sufficient set of one- and two-input gates. NAND or NOR on its own is sufficient, as is either AND or OR paired with NOT. Different gate implementations may present optimization opportunities that are somewhat specific to the technology, but in general you can compute any N-bit to M-bit logic function with any sufficient set of gates, and there are straightforward ways to optimize for space and speed. Because of this, much of the work is automated in the form of hardware compilers that can target either a custom chip or "standard" FPGA chips and systems.
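
To make the universality claim concrete, here is a minimal Python sketch (function names are just for illustration) that builds NOT, AND, OR, and XOR from a NAND primitive alone and checks them exhaustively against the built-in operators.

```python
def NAND(a, b):
    return 1 - (a & b)

# Everything else built only from NAND.
def NOT(a):     return NAND(a, a)
def AND(a, b):  return NOT(NAND(a, b))
def OR(a, b):   return NAND(NOT(a), NOT(b))
def XOR(a, b):  return AND(OR(a, b), NAND(a, b))

for a in (0, 1):
    for b in (0, 1):
        assert AND(a, b) == (a & b)
        assert OR(a, b) == (a | b)
        assert XOR(a, b) == (a ^ b)
print("NAND-only constructions match the built-in operators")
```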

If you study the list of functions, noting that the two forms on many of the lines are applications of De Morgan's laws, you can see that some involve no inputs, some just one, and of those, only NOT needs a circuit other than a simple wire to an input or a constant source.
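
Since only four input combinations exist, De Morgan's laws (which relate those paired forms) can be checked exhaustively; here is a quick Python sketch of that check.

```python
# De Morgan's laws, checked over all four input combinations.
for a in (0, 1):
    for b in (0, 1):
        assert (1 - (a & b)) == ((1 - a) | (1 - b))   # NOT(A AND B) = NOT A OR NOT B
        assert (1 - (a | b)) == ((1 - a) & (1 - b))   # NOT(A OR B)  = NOT A AND NOT B
print("De Morgan's laws hold for all inputs")
```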

There is a lot more, of course.  Next you'll need to learn about flip-flops, simple circuits that hold state, and that will lead to clocks, timing, and synchronous systems.  At the edges, these blend into analog functions that implement an interface between synchronous logic and a storage or transmission medium.

If there is something not already covered, do what I do: Google it. Propose your own exercises and real problems, document your work, identify and collect resources, and write lessons and knowledge dumps.  Use this site as a home to build a shared knowledge base.