Research work on Computer Architecture
The Von Neumann Computer architecture and its importance:
The Von Neumann computer architecture was first described in a 1946 paper by Burks, Goldstine and John von Neumann. The computer comprises a CPU, memory and I/O devices, with a single set of buses connecting the CPU to memory. In this architecture, the program is stored in memory, and the processor fetches and executes one instruction at a time. While the processor is reading data values from memory, it cannot fetch the next instruction until the data read is complete. This situation, in which data and program instructions must be fetched sequentially over the same bus, is referred to as the Von Neumann bottleneck. The bottleneck limits memory bandwidth and consequently processor speed, since execution is controlled by a program counter.
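The fetch-execute cycle described above can be sketched in a few lines of Python. This is a minimal illustration with a hypothetical three-instruction machine (LOAD, ADD, HALT), not a real instruction set; note that the program, at the low addresses, and the data, at the high addresses, share the same memory, and that every instruction and every operand requires its own trip over the single memory bus.

```python
# Minimal sketch of the Von Neumann fetch-execute cycle with a
# hypothetical instruction set. Program and data share one memory.
memory = [
    ("LOAD", 4),   # address 0: load the value at address 4 into the accumulator
    ("ADD", 5),    # address 1: add the value at address 5
    ("HALT", None),
    None,          # unused
    10,            # data word at address 4
    32,            # data word at address 5
]

pc = 0             # program counter: controls execution sequentially
acc = 0            # accumulator
while True:
    op, arg = memory[pc]       # fetch: one memory access per instruction
    pc += 1
    if op == "LOAD":
        acc = memory[arg]      # a second memory access for the data operand
    elif op == "ADD":
        acc += memory[arg]
    elif op == "HALT":
        break

print(acc)
```

Because the instruction fetch and the operand read both go through the same `memory` list, the sketch makes the bottleneck visible: nothing overlaps, every access is sequential.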
The Von Neumann architecture is important because it is the model on which most modern computers are based; such computers are therefore referred to as Von Neumann machines. However, to reduce the Von Neumann bottleneck and increase speeds, memory systems are designed with caches capable of burst-mode access.
The system bus and why it is needed:
The system bus is a data communications channel between the major components in a computer, including the CPU. It comprises three groups of wiring: the control bus, the data bus and the address bus. The control bus carries signals relating to the coordination and control of activities in the computer, which are sent from the CPU’s control unit. The data bus is a bi-directional channel allowing data to flow between the various components such as the processor, memory and peripherals. The address bus carries signals from the processor to memory identifying the locations the CPU is writing to or reading from. The width of the address bus determines how many memory locations the CPU can address: an n-bit address bus can select 2^n distinct addresses.
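The relationship between address bus width and addressable memory can be checked with a one-line calculation. The function name here is illustrative, not part of any standard API:

```python
# An n-bit address bus carries 2**n distinct address patterns, so it can
# select 2**n memory locations (byte addresses, assuming byte-addressable
# memory).
def addressable_bytes(bus_width_bits):
    """Number of distinct byte addresses an n-bit address bus can select."""
    return 2 ** bus_width_bits

print(addressable_bytes(16))  # 65536 bytes, i.e. 64 KB
print(addressable_bytes(32))  # 4294967296 bytes, i.e. 4 GB
```

This is why, for example, a classic 32-bit processor with a 32-bit address bus is limited to 4 GB of directly addressable memory.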
The system bus is needed because it connects the processor, memory and peripherals, allowing all of these devices to communicate. Its speed is of paramount importance to the performance of a computer system, since it determines how fast these components exchange data.
Boolean operators in computer-based calculations:
Boolean logic is a branch of algebra used to build true/false statements and expressions. The Boolean operators AND, OR, and NOT are used in computer computations together with comparison operators to compare values, return true or false results, and form conditions for control structures. The AND operator returns true only if both of the conditions being compared are true. OR returns true if at least one of the conditions A and B is true, and returns false only if both are false. The NOT operator works on a single expression A: it returns true if A is false, and false if A is true.
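The behaviour of the three operators can be confirmed by enumerating every input combination, producing the standard truth tables. A short Python sketch:

```python
# Truth tables for AND, OR and NOT, enumerated over every combination
# of the inputs A and B.
and_table = {(A, B): A and B for A in (False, True) for B in (False, True)}
or_table  = {(A, B): A or B  for A in (False, True) for B in (False, True)}
not_table = {A: not A for A in (False, True)}

print("A      B      A AND B  A OR B")
for (A, B) in and_table:
    print(f"{A!s:<6} {B!s:<6} {and_table[(A, B)]!s:<8} {or_table[(A, B)]}")
```

Reading the output confirms the definitions above: AND is true only on the (True, True) row, while OR is false only on the (False, False) row.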
Computers use binary operations, and logic is therefore usually expressed in Boolean terms: true statements return a value of 1 and false statements return 0. However, computer calculations are more complex than simple binary comparisons, so processors perform complex calculations by linking multiple Boolean statements together. These complex Boolean expressions can be implemented as networks of logic gates.
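As a sketch of how linked Boolean statements become useful circuits, consider a one-bit half adder, a standard textbook example (not taken from the sources above). Its sum bit is XOR and its carry bit is AND, and XOR itself can be composed from the three basic operators:

```python
# A one-bit half adder built by linking Boolean statements together.
# XOR is composed from AND, OR and NOT; the carry bit is a plain AND.
def xor(a, b):
    return (a or b) and not (a and b)

def half_adder(a, b):
    """Add two one-bit values; returns (sum bit, carry bit)."""
    return xor(a, b), a and b

print(half_adder(True, True))   # 1 + 1 = binary 10: sum False, carry True
print(half_adder(True, False))  # 1 + 0 = binary 01: sum True, carry False
```

Chaining such adders together (a "ripple-carry" arrangement) is one way processors build multi-bit arithmetic out of simple gates.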
Memory and Storage:
Computer memory is used for the storage of data and instructions required during data processing and output presentation. Storage may be required instantly, for a limited time, or for extended periods. Computer memory is organized to achieve high performance levels at minimal costs.
The smallest unit of memory representation in a computer is the bit, i.e. one binary digit. The computer handles data as combinations of bits, and a group of 8 bits forms a byte. One byte is the smallest unit of data handled by a computer, and it can store 2^8 (256) bit combinations. One kilobyte (KB) comprises 2^10 (1,024) bytes; increasing the exponent by 10 to 2^20, 2^30, and 2^40 bytes gives the other memory units: one megabyte (MB), one gigabyte (GB), and one terabyte (TB) respectively.
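The power-of-two units described above can be tabulated directly:

```python
# Memory units as powers of two, as described in the text.
BITS_PER_BYTE = 8
print(2 ** BITS_PER_BYTE)    # 256 distinct values storable in one byte

units = {"KB": 2 ** 10, "MB": 2 ** 20, "GB": 2 ** 30, "TB": 2 ** 40}
for name, size in units.items():
    print(f"1 {name} = {size} bytes")
```

Note that each unit is 1,024 times the previous one, not 1,000, which is why a "1 TB" disk marketed in decimal units reports slightly less capacity in these binary units.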
Memory is characterized on the basis of access time and capacity. Access time is the interval between a read/write request and the availability of the data; capacity is the amount of data (in bits) that the memory can hold. Ideal memory has a large capacity and fast (low) access times. The memory hierarchy is divided into internal and external memory, and is organized to balance speed against capacity. Internal memory comprises the CPU registers, cache and primary memory; its general features are fast access, high cost, limited capacity, and temporary storage. Primary memory is further categorized into Random Access Memory (RAM) and Read Only Memory (ROM): RAM stores information temporarily during processing, while ROM holds fixed information such as the BIOS and system settings. CPU registers have the lowest access times, followed by cache memory, primary memory and secondary memory.
Secondary (external) memory has the largest storage capacity, is non-volatile, has relatively slow access times, stores information for long-term use, and is the cheapest type of memory. Secondary storage devices include optical disks (e.g. DVDs), magnetic disks such as hard disks and diskettes, and magnetic tapes and tape drives.
References:
Boolean Expressions. (n.d.). Working with Expressions, 8 of 11. Retrieved February 16, 2014, from http://docs.oracle.com/cd/A97630_01/olap.920/a95298/express8.htm
Goel, A. (2010). Chapter 3: Computer Memory. Computer Fundamentals (pp. 39–41). New Delhi: Dorling Kindersley (India).
System Bus. (n.d.). Microprocessor Tutorial. Retrieved February 16, 2014, from http://www.eastaughs.fsnet.co.uk/cpu/structure-bus.htm
Von-Neumann Architectures. (n.d.). Retrieved February 16, 2014, from http://ecomputernotes.com/fundamental/introduction-to-computer/explain-about-the-von-neumann-architectures