Programming code tells the computer to do something, but what makes the code actually mean anything to the computer?

36 Answers

Anonymous 0 Comments

The compiler translates code from the words we write into numbers the computer understands.

Anonymous 0 Comments

Programs are usually written in a “programming language” that is easy for people to learn and use. Then there is “machine language”, basically a string of 1s and 0s arranged in complex patterns that a computer can understand.

In between the person and the computer is a special program called the “compiler”. It takes your programming language and turns it into machine language. It’s like an interpreter: if I need to talk to somebody who speaks Russian, but I only speak English, I have to find someone who speaks both to translate what I’m saying.

Why don’t programmers all just learn machine language? Well, it’s really, really hard, and it takes a long time to say anything. So the smartest ones who can speak it come up with a programming language that the rest of us can understand. Then the first things they do are write a compiler and publish a dictionary and rules of their language.
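If you want to watch this translation happen, Python ships with a disassembler that shows the instruction codes its own compiler produces. (Python compiles to bytecode for a virtual machine rather than raw CPU machine code, but the idea is the same.) A runnable example:

    import dis

    # Ask Python's compiler to translate a tiny expression,
    # then print the instruction codes it produced.
    code = compile("a + b", "<example>", "eval")
    dis.dis(code)  # prints instructions like LOAD_NAME for a and b,
                   # then an add opcode; exact names vary by Python version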

Source: am programmer.

EDIT: If you’re also curious about what the 1s and 0s mean to the computer, check out some of the excellent engineering comments.

Anonymous 0 Comments

I used to ask this question all the time, and the answer was just “the compiler”. How does the compiler work? No one could answer that one.
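For what it’s worth, a compiler is itself just another program: it reads your text and writes out instruction codes, usually in stages (break the text into tokens, figure out the structure, emit instructions). Here is a minimal runnable sketch in Python; the instruction names and the toy machine are invented for illustration, not taken from any real CPU:

    # Toy compiler: turns "3 + 4 - 2" into instruction codes,
    # then a toy machine executes them.
    def compile_expr(source):
        tokens = source.split()                # stage 1: split text into tokens
        program = [("PUSH", int(tokens[0]))]   # stage 2: emit instructions
        for op, num in zip(tokens[1::2], tokens[2::2]):
            program.append(("PUSH", int(num)))
            program.append(("ADD",) if op == "+" else ("SUB",))
        return program

    def run(program):                          # the "machine": executes codes one by one
        stack = []
        for instr in program:
            if instr[0] == "PUSH":
                stack.append(instr[1])
            else:
                b, a = stack.pop(), stack.pop()
                stack.append(a + b if instr[0] == "ADD" else a - b)
        return stack.pop()

    print(compile_expr("3 + 4 - 2"))      # [('PUSH', 3), ('PUSH', 4), ('ADD',), ...]
    print(run(compile_expr("3 + 4 - 2"))) # 5

Real compilers add parsing, type checking, and optimization between those stages, but the overall shape is the same.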

Anonymous 0 Comments

If you want to dive deep into this topic, pick up Code: The Hidden Language of Computer Hardware and Software by Charles Petzold. It starts with basically wires and electricity and explains how to build a computer from scratch.

Anonymous 0 Comments

I think what OP is asking is how a computer takes binary machine code and knows to run it. Is there code built into the processor that accepts the machine code as “fuel”? What decides which code runs at what times (queue, priority, etc.)?

Anonymous 0 Comments

A computer, at its most basic, is just a machine with an on/off switch: 1 is the command to turn it on and 0 is the command to turn it off.

Modern computers are a complex assembly of bajillions of such machines, which can do a lot of different stuff given the right on/off configuration.

The most direct way for humans to configure a computer to do their bidding is to feed it the right on/off configuration as a string of 1s and 0s. This is what is known as machine language.

But due to the sheer complexity and huge number of machines involved in a modern computer, this is neither efficient nor human-friendly, so we developed a shortcut.

We mapped a bunch of frequently used 0/1 configurations to words we can recognize. So, for example, 0100110101010 010110010 1010001011 1010111010 might be the configuration needed to display a simple apple on the screen, based on which pixels should be turned on and which should be turned off.

We are going to need that a lot, and we don’t want to waste time typing that long string of 1s and 0s each time, so we automate it by saving the whole string as “draw apple”. Nicely short, understandable, and easy to remember. This process of translating human words into 0s and 1s is called compiling. So the next time we want an apple displayed, we just type “draw apple” and the compiler automatically transforms it into the right bunch of 0s and 1s and sets the computer to the desired state.

Over time we managed to assemble a vast collection of such commands, enabling us to produce more and more complex configurations in a relatively short and understandable way. This is the programming code we end up with.
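Here is that “save the string under a name” idea as a runnable Python sketch. The command names and bit patterns are made up, echoing the apple example above:

    # Made-up dictionary mapping human words to saved 0/1 configurations.
    commands = {
        "draw apple": "0100110101010 010110010 1010001011 1010111010",
        "clear screen": "0000000000000 000000000 0000000000 0000000000",
    }

    def compile_words(source):
        # Swap each recognized command for its saved string of 1s and 0s.
        return " ".join(commands[line] for line in source.splitlines())

    print(compile_words("draw apple"))  # the long 0/1 string from above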

Anonymous 0 Comments

At the lowest level, the CPU chip has a few storage slots called registers (each typically holds one number whose binary length equals the number of bits the CPU is designed to handle), and a bunch of simple operations it knows how to do involving the registers. The operations are somewhat arbitrarily assigned number codes. So, for example, your CPU might have operations like:

001: Add a number n to register X and store it in register X

002: Put a number n in register X

003: Copy register X to register Y

004: Read memory location z to register X

005: Decrement register X

006: Skip the next instruction if register X is zero

And so on. These commands are based on operations that can be performed using hardware logic, such as AND, OR, and XOR, and by turning dedicated circuits in the CPU on and off. To run a program, we just feed the CPU a list of these operation codes and the data to use in the operations. For example, a small program might be:

002 008

005

001 005

Which, based on the instructions earlier, would put the number 8 in the X register, then decrement it to 7, then add 5. After the program is complete, if we were to read the value of the X register, it should be 12.

Because the CPU operations are usually very simple, we use higher-level languages and compilers to convert easier-to-use commands into these very simple instructions.
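If it helps to see that spelled out, here is the example program above run by a small Python sketch of the fetch-decode-execute cycle, using the opcodes listed in this answer. A real CPU does this in hardware, not software:

    # Toy interpreter for the opcodes listed above.
    def run(program):
        x = 0                       # register X
        pc = 0                      # program counter: position in the list
        while pc < len(program):
            op = program[pc]        # fetch the operation code
            if op == 2:             # 002: put a number n in register X
                x = program[pc + 1]
                pc += 2
            elif op == 5:           # 005: decrement register X
                x -= 1
                pc += 1
            elif op == 1:           # 001: add a number n to register X
                x += program[pc + 1]
                pc += 2
            else:
                raise ValueError(f"unknown opcode {op}")
        return x

    print(run([2, 8, 5, 1, 5]))     # the example program: prints 12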

I also recommend Ben Eater’s 8-bit computer videos if you want to understand it more deeply.

Anonymous 0 Comments

At its heart, you can think of a computer like a programmable calculator. It can add, subtract, tell whether two numbers are the same, bigger, or smaller, and do different calculations depending on the result (if bigger, do A; if smaller, do B; and the calculations continue). More complicated operations can be built from these simple ones. We humans have devised ways of describing behavior and translating it into those calculations. This is basically a programming language.

It would be tough for a five-year-old, but not impossible, to just read one of these instructions for the computer and figure out what it does. Some of them are pretty much the same as you’d see in code: “A = B + C” could be “add A, B, C”, where A is the result and B and C are the values to add. This is then turned into a number that the computer understands.
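To make “turned into a number” concrete: one common scheme packs the opcode and register numbers into fixed bit fields of a single instruction word. A small Python sketch; the field widths and the opcode value here are invented for illustration, not from any real CPU:

    # Invented encoding: 6-bit opcode, then three 5-bit register fields.
    OP_ADD = 0b000001

    def encode_add(dest, src1, src2):
        return (OP_ADD << 15) | (dest << 10) | (src1 << 5) | src2

    word = encode_add(1, 2, 3)   # "add A, B, C" with A=r1, B=r2, C=r3
    print(bin(word))             # 0b1000010001000011, one number the CPU can decode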

How the computer understands this number isn’t exactly easy for me to explain. It’s quite complicated, even though the math is often very easy. The principle isn’t that different from what was described above, though. We’ve figured out how to use electricity to do basic math, and that number I mentioned is turned into the math. I’m too tired to explain from there, but that’s essentially how it works.

Anonymous 0 Comments

At the human end we have high-level languages, which are easy for us to understand and write but which need to be interpreted or compiled to machine code before the computer can run them. Interpreted languages are slower but more immediate, meaning you can run them straight from the source code (classically BASIC, more recently Python), whereas compiled languages are faster but need to go through a compilation step to be turned into executable code.

The machine code is a sequence of instructions that the CPU understands, and it is typically different for each type of CPU. For example, an Intel processor can’t run the same code as an ARM processor, even though the source code can be the same: the compiler turns that source into binary machine code for the specific type of CPU. At the CPU level itself there are registers (small on-chip memory slots) and instruction decoders, which execute the binary code.

A recent wrinkle is the Java Virtual Machine. Java source code is compiled into a special bytecode that the JVM understands, and the JVM then issues CPU-specific code. So while Java source code is compiled, it is compiled to an intermediate form designed to run on the JVM; you just need a JVM built for your specific CPU. This means languages like Java and their ilk can produce code that runs almost as fast as natively compiled code but will run on any CPU that has their virtual machine.
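Incidentally, Python’s standard interpreter works on the same principle: source is compiled to bytecode for a virtual machine, and that bytecode is CPU-independent while the interpreter itself is built for your specific CPU. A quick runnable demonstration:

    # CPython compiles source to virtual-machine bytecode.
    code = compile("x * 2 + 1", "<example>", "eval")
    print(code.co_code.hex())     # raw bytecode bytes: CPU-independent
    print(eval(code, {"x": 20}))  # the CPU-specific interpreter runs it: 41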