

That is an excellent question, and one which would take years, and several PhDs, to fully explain. I can offer you a simplistic answer, but to fully understand you will have to do MUCH more research.

At the lowest level, the letter A and the number 65 are in fact stored using the same sequence of 0s and 1s. The programmer says "I want a number stored at such and such a location," and the computer goes and looks for it. The way the computer knows what it's looking for is that the programmer tells it what it's looking for. This means that letters can be displayed as numbers, and vice versa; the computer only decides what a value is when it grabs it from memory.

Might I suggest some free online classes from MIT on the subject here.
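A quick way to see "same bits, two readings" for yourself (a minimal sketch in Python, used here purely as an illustration; the original answer names no language):

```python
# The bit pattern 01000001 sitting in memory is just bits.
# Whether it is "the number 65" or "the letter A" depends entirely
# on how the program chooses to read it.
bits = 0b01000001

print(bits)        # read as a number: 65
print(chr(bits))   # read as a character via the ASCII table: 'A'
print(ord('A'))    # and back the other way: 65
```

Nothing in memory tags the value as a letter or a number; `chr` and `ord` are just two different interpretations of the same eight bits.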
Please note that I don't know much about computer science, so please explain everything in simple terms.

Computers don't actually translate anything to binary; it's all binary from the start, and the computer never knows anything other than binary.

The character A stored in memory would be 01000001, and the computer doesn't see that as anything but a binary number. When we ask the computer to display that number as a character on the screen, it will look up the graphical representation for it in a font definition to find some other binary numbers to send to the screen hardware.

For example, if the computer was an eight-bit Atari, it would find eight binary values to represent the character A on the screen; a blank row, for instance, would be 00000000. As you can see, the binary values would then translate to dark and bright pixels when the graphics hardware drew the character on the screen.

Similarly, whatever we do with the numbers in the computer, it's all ways of moving binary values around, doing calculations on binary values, and translating them to other binary values.

If you for example take the character code for A and want to display it as a decimal number, the computer would calculate that the decimal representation of the number is the digits 6 (110) and 5 (101), translate those to the character 6 (00110110) and the character 5 (00110101), and then translate those into their graphical representation.
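The "display 65 as decimal digits" step above can be sketched in a few lines of Python (the language is my choice for illustration, not anything the answer specifies):

```python
# Turning the character code of 'A' into the characters "6" and "5".
code = ord('A')     # 65, stored in memory as the bits 01000001
digits = str(code)  # "65": now two *characters*, '6' and '5'

for ch in digits:
    # Each digit character has its own code, which the font
    # definition then maps to a pixel pattern on screen.
    print(ch, format(ord(ch), '08b'))
# '6' is stored as 00110110 and '5' as 00110101,
# exactly the values quoted in the answer.
```

Note the two layers: the number 65 itself, and the pair of digit characters used to show it, are completely different binary values.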

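The Atari font lookup described above can also be sketched. The glyph below is made up for illustration (the answer only preserves the blank row 00000000; this is not the real Atari font data):

```python
# A hypothetical 8x8 bitmap for the letter 'A' (invented example,
# not actual Atari ROM font data). Each row is one binary value;
# 1 bits become bright pixels, 0 bits dark ones.
glyph = [
    0b00011000,
    0b00111100,
    0b01100110,
    0b01100110,
    0b01111110,
    0b01100110,
    0b01100110,
    0b00000000,  # the blank row mentioned in the answer
]

for row in glyph:
    print(format(row, '08b').replace('0', '.').replace('1', '#'))
```

Running it draws a rough letter A out of `#` characters, which is essentially what the graphics hardware does with pixels.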
I know how computers translate numbers to binary. But what I don't understand is that I've heard that computers translate everything (words, instructions, ...) to binary. How is this possible?

Could you show me some examples? Like, how does a computer translate the letter "A" to binary?

And when computers see a binary code, how can they know if that long string of 0s and 1s represents a number, a word, or an instruction?

Let's say that a computer programmer encoded the letter "Z" so that it translates to this binary string: 11011001111011010111. So when the computer encounters this binary string, it will translate it to the letter "Z". But what happens when we ask this computer "what is the product of 709 by 1259?" So how would it make a difference between "Z" and "892631"?
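The asker's own numbers make a nice concrete demo. A short Python sketch (Python is my choice for illustration; the encoding of "Z" below uses real ASCII, not the asker's hypothetical 20-bit string):

```python
# The arithmetic from the question: 709 * 1259 really is 892631.
product = 709 * 1259
print(product, format(product, 'b'))  # the product and its binary form

# In ASCII, 'Z' is the single byte 01011010 (decimal 90):
print(format(ord('Z'), '08b'))

# The bits never say "I am a letter" or "I am a number" -- the
# program does. The same byte, read two different ways:
data = bytes([90])
print(int.from_bytes(data, 'big'))  # read as a number: 90
print(data.decode('ascii'))         # read as text: 'Z'
```

So the computer tells "Z" apart from 892631 only because the program (and ultimately the programmer) decides which interpretation to apply to a given chunk of memory.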
