This course provides an introduction to basic computer structure, instruction set architecture, assembly-language programming, input/output considerations, processor design based on digital logic, and memory technology and memory system design principles. The primary intent is to provide a foundation for subsequent courses on hardware/software interfacing for microprocessor-based systems, computer system architecture, and digital systems engineering. A secondary intent is to provide an appreciation of the low-level representation of software compiled from high-level languages into machine instructions. The practical aspects of the course are illustrated with the 32-bit Altera Nios II instruction set architecture and soft processor for implementation in field-programmable logic chips, but the principles that are conveyed using this example are largely applicable to any instruction set architecture and processor implementation. This course builds on and supplements knowledge from other courses on digital logic, circuits and electronics, and software/programming.

Course Learning Outcomes (CLOs)
  • Understand the basic functional units in a typical computer system.
  • Understand binary number representation including two's-complement signed values, hexadecimal number representation, and character representation using the ASCII encoding.
  • Understand the common basic features of a processor, including the program counter register, instruction register, general-purpose registers, and memory interface.
  • Describe the essential steps for fetching, decoding, and executing instructions with respect to the common basic features of a processor.
  • Understand memory organization concepts, including endian-ness and byte addressability.
  • Understand assembly-language instruction syntax including mnemonics, operand specification, and addressing modes.
  • Understand subroutine linkage, the use of the processor stack, and subroutine nesting.
  • Design and implement assembly-language programs by interpreting specifications, partitioning the computation into tasks, defining data structures, organizing subroutines, preparing pseudocode specifications for subroutines, and translating the pseudocode into assembly language.
  • Understand basic input/output interfaces and assembly-language implementation of program-controlled input/output operations.
  • Understand the partition of instruction processing into basic steps that correspond to stages of execution in hardware.
  • Understand the design of the control portion of a processor, and its datapath consisting of the general-purpose register file, the arithmetic/logic unit, the memory interface, internal registers between stages, and multiplexers for selecting inputs to components.
  • Understand the common organizational and operational characteristics of semiconductor-based memory components.
  • Distinguish between different memory implementation technologies such as SRAM, DRAM, and EEPROM (including Flash type).
  • Understand hierarchical memory design with caches, main memory, and secondary storage.
  • Understand cache organization, mapping functions, and the role of temporal and spatial locality in program execution in dictating performance with caches.
  • Design caches from given specifications on total capacity, block size, and mapping function.
  • Understand the concept of virtual memory, relevant hardware support for it, and its basic operation.
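
Two of the outcomes above, two's-complement signed representation and memory endian-ness, can be illustrated concretely. The following is a brief sketch (in Python rather than Nios II assembly, purely for illustration; the function name `twos_complement` is our own) showing the unsigned bit pattern that encodes a signed value, and the byte order produced when a 32-bit word is stored little-endian versus big-endian.

```python
import struct

def twos_complement(value, bits):
    """Return the unsigned bit pattern that encodes `value`
    in two's complement using the given number of bits."""
    return value & ((1 << bits) - 1)

# In 8 bits, -1 is encoded as 0xFF and -128 as 0x80.
print(hex(twos_complement(-1, 8)))    # 0xff
print(hex(twos_complement(-128, 8)))  # 0x80

# Endian-ness: the same 32-bit value 0x12345678 stored in memory.
little = struct.pack('<I', 0x12345678)  # least-significant byte first
big = struct.pack('>I', 0x12345678)     # most-significant byte first
print(little.hex())  # 78563412
print(big.hex())     # 12345678
```

The Nios II processor covered in the course is little-endian, corresponding to the first of the two byte orders shown.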
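
The cache-design outcome above amounts to splitting a byte address into tag, index, and offset fields from a given total capacity, block size, and mapping function. As a hedged sketch (the function `cache_fields` and its parameters are illustrative, not part of the course materials; sizes are assumed to be powers of two):

```python
def cache_fields(address, capacity_bytes, block_bytes, ways=1):
    """Split a byte address into (tag, index, offset) for a
    set-associative cache; ways=1 gives a direct-mapped cache."""
    num_sets = capacity_bytes // (block_bytes * ways)
    offset_bits = block_bytes.bit_length() - 1   # log2(block size)
    index_bits = num_sets.bit_length() - 1       # log2(number of sets)
    offset = address & (block_bytes - 1)
    index = (address >> offset_bits) & (num_sets - 1)
    tag = address >> (offset_bits + index_bits)
    return tag, index, offset

# A 16 KiB direct-mapped cache with 16-byte blocks has 1024 sets,
# hence 4 offset bits and 10 index bits.
print(cache_fields(0x12345, 16 * 1024, 16))  # (4, 564, 5)
```

A fully associative cache is the degenerate case with a single set, where the index field disappears and the entire remaining address becomes the tag.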

Credit Breakdown

Lecture: 3
Lab: 0.5
Tutorial: 0.5

Academic Unit Breakdown

Mathematics: 0
Natural Sciences: 0
Complementary Studies: 0
Engineering Science: 26
Engineering Design: 22

PREREQUISITE(S): APSC 142, ELEC 271 or MTHE 217 (MATH 217), or permission of the instructor