How does a computer work?
@luneliza (197)
December 30, 2007 1:09pm CST
I vaguely know the theory. What I can't figure out is how those long lines of binary code can be converted into what we see and read on the screen. What does actually happen? What secret alchemy turns a string of ones and zeros into bits of information?
@mlproxima (69)
• India
31 Dec 07
Whenever you give an input, it is first encoded into binary (1's and 0's). Those long strings are then split into fixed-size chunks called words: on a 32-bit system each chunk is 32 bits, on a 64-bit system it is 64 bits (and you need an OS built for that word size to go along with it). The words are fed through the millions of logic gates inside the processor, which interpret them and again give the output as small binary codes. These codes are sent in and out of the processor at very high speed; picture the electrical signals that represent the 1's and 0's travelling through the circuitry at close to the speed of light. Once the data comes back after interpretation, it is decoded again, first into the machine's own language and then into the output you see.

That is the software-side explanation. If you're puzzled about the hardware: every piece of data is converted by the processor itself, and each conversion it can perform is represented by its own distinct pattern of ones and zeros.
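To see that encoding step for yourself, here is a rough C sketch (just an illustration, assuming an ASCII text input, not part of how any real system does it internally) that prints the bit pattern the computer actually stores for each character:

    #include <stdio.h>

    /* Print the bits of one byte, most significant bit first. */
    static void print_bits(unsigned char byte) {
        for (int i = 7; i >= 0; i--)
            putchar((byte >> i) & 1 ? '1' : '0');
    }

    int main(void) {
        const char *word = "Hi";
        for (const char *p = word; *p; p++) {
            printf("'%c' = ", *p);
            print_bits((unsigned char)*p);
            putchar('\n');
        }
        return 0;
    }
    /* Prints:
     * 'H' = 01001000
     * 'i' = 01101001
     */

Every character, number, and pixel you see gets stored as a pattern like this; the screen output is just those patterns decoded back again.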
Say you input 1+1. Each 1 is converted to 0001 (on a 4-bit system) by the processor and stored in memory, and the processor gives each stored value an address (the address itself is also binary). Even the + sign is given a binary value and stored in memory too. The + code is then compared with the software's binary codes, which identify it as addition; another set of binary codes tells the processor to add, so it retrieves the stored data, produces the result in binary, and stores it in memory at another address. The software then identifies that output as 2, and the 2 is displayed on the screen.

Now, there are billions of these binary codes going in and out of the processor. As I said, if you imagine each 1 and 0 as an electrical pulse, you might picture some 3*10^8 (300 million) 32-bit codes going in and out of the processor per second, and with a dual core you can roughly double that. This is just an example, a mental picture to help you understand. It gets very complicated if you look at the whole thing with all the minute details added, but this is what years of hard work and research have given us.
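And here is the 1+1 example done the way the gates do it, a rough C sketch of my own (an illustration of the idea, not how any particular chip is wired): a full adder built only from XOR, AND and OR, rippled across 4 bits like the 4-bit example above:

    #include <stdio.h>

    /* One full adder: two input bits plus a carry-in produce a
     * sum bit and a carry-out, using only XOR, AND and OR gates. */
    static unsigned full_adder(unsigned a, unsigned b, unsigned cin,
                               unsigned *cout) {
        unsigned sum = a ^ b ^ cin;            /* XOR gates */
        *cout = (a & b) | (cin & (a ^ b));     /* AND and OR gates */
        return sum;
    }

    /* Add two 4-bit numbers by rippling the carry through four adders. */
    static unsigned add4(unsigned x, unsigned y) {
        unsigned result = 0, carry = 0;
        for (int i = 0; i < 4; i++)
            result |= full_adder((x >> i) & 1, (y >> i) & 1,
                                 carry, &carry) << i;
        return result & 0xF;
    }

    /* Print a value as 4 binary digits. */
    static void print4(unsigned v) {
        for (int i = 3; i >= 0; i--)
            putchar((v >> i) & 1 ? '1' : '0');
    }

    int main(void) {
        unsigned sum = add4(1, 1);   /* 0001 + 0001 */
        print4(1); printf(" + "); print4(1); printf(" = ");
        print4(sum); printf("  (decimal %u)\n", sum);
        return 0;
    }
    /* Prints: 0001 + 0001 = 0010  (decimal 2) */

Real processors use faster adder circuits than this simple ripple-carry design, but the principle is the same: the sum comes from XOR gates and the carry from AND/OR gates, millions of times over.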