Programming Language: From Human to Machine (ENG)

by Eduardo Prachedes

6 min read · Nov 18, 2023

Introduction

Whether through Alan Turing's "Turing Machine" or John von Neumann's "Von Neumann architecture", computers emerged decades ago and have evolved continually ever since. Nowadays, hardly anyone goes without a computer by choice. Screens are increasingly prevalent in daily life, shaping social interactions. Services such as Facebook, Instagram, X (Twitter), iFood, and Netflix have reshaped urban lifestyles worldwide.

Despite this pervasive presence in everyone's lives, few people seek to understand what happens behind the scenes. Few wonder how Facebook Shops knows so precisely what to recommend to encourage consumption, or why Netflix is called a streaming platform and how it was built to run on iOS, Android, PCs, and TVs.

The common response to these questions is always, "It's all algorithms." But this doesn't really explain anything; it's just a way for non-technical people to come to terms with the hidden world of programming. The following text will therefore attempt, briefly and non-technically, to explain how the dialogue between humans and machines takes place during programming.

How Computers Work

First, it’s important to consider that a computer is not just a Personal Computer (PC).

“An electronic device capable of performing work steps, such as receiving, storing, logically and/or arithmetically processing data, with the main objective of solving problems based on algorithmic solutions.”

Algoritmos: Lógica para desenvolvimento de programação de computadores(PT/BR)

https://www.amazon.com.br/Algoritmos-Desenvolvimento-Programa-C3-A7-C3-A3o-Computadores-Atualizada-dp-8536531452/dp/8536531452/ref=dp_ob_image_bk

Therefore, a computer can be anything that follows these basic processes of data input and output: a cell phone, tablet, PC, TV, or Kindle. To better understand the interaction between the user and the device, it's essential to look at how a computer is put together.

The Organization of a Computer

The organization of a computer aims to process both logical and arithmetic operations. Thus, a set of steps must be interconnected for the system to function.

The steps usually begin with user input (keyboard, mouse, touch). That input can either be stored in secondary memory (a hard drive) or loaded into RAM for immediate processing. Everything concludes with the output of data: after processing, the machine provides a response (on screens, in files).

In other words, all these steps are programmed to be followed, leading to the execution of operations with a final response. For example, when clicking on a file, this input is processed, and the visual response is the opening of the file on the device.

Units of Measure

What many people don't realize is how the computer processes all this information. It's not magic: a completely precise combination of binary digits (bits) is used, where "1" represents the activation and "0" the deactivation of a given feature or internal circuit. All information flowing through the system undergoes this constant switching for everything to function correctly. This is how a computer's components communicate with one another: through streams of "zeros" and "ones".

Image source: https://wiki.brazilfw.com.br/lib/exe/detail.php?id=bit_byte&media=wiki%3Abit-byte-01.png

The bit (binary digit) is the smallest unit of data a computer manipulates; it can take one of two values, 0 or 1. A byte is a set of 8 bits, allowing 2⁸ = 256 distinct symbols to be defined, distributed among numeric, alphabetic, punctuation, and graphic characters, usually following the ASCII table.

“Taking the numerical value 2 as the internal operating base of an electronic computer (the bit) and raising this value to the exponent 8, representing the number of bits in a byte (2⁸), yields the value 256, which is the maximum number of characters that can be used in an electronic computer as defined by the ASCII table.”

Algoritmos: Lógica para desenvolvimento de programação de computadores

https://www.treinaweb.com.br/blog/uma-introducao-a-ascii-e-unicode

Therefore, the bit is the unit the computer uses to switch its circuits on and off so that everything runs in order from data input to output. A byte is a set of bits that allows external data to be represented inside the computer; for example, the character "2" occupies one byte (8 bits) in the machine. The units don't stop there: there are also kilobytes, megabytes, and gigabytes, but they are not the focus at the moment.
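The relationship between bits, bytes, and the ASCII table can be illustrated in a few lines of Python (one of the high-level languages mentioned later in this text); a minimal sketch:

```python
# One byte = 8 bits, so it can encode 2**8 = 256 distinct values.
print(2 ** 8)  # 256

# The character "2" is stored as one byte; its ASCII code is 50.
print(ord("2"))  # 50

# That code, written out in binary, is the pattern of 0s and 1s the machine sees.
print(format(ord("2"), "08b"))  # 00110010

# Encoding a string in ASCII shows how many bytes it occupies.
print(len("Hello".encode("ascii")))  # 5
```

Each character of "Hello" takes exactly one byte in ASCII, which is why the last line prints 5.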

Human-Machine Interaction

As mentioned earlier, the machine operates through binary numbers, and a combination of "zeros" and "ones" means nothing to us. Computers, however, understand only this. So how does the relationship between a programmer and a machine work? Only through an intermediary, and that intermediary is the programming language.

The programming language is crucial for the computer to function; it’s the “conversation” between a human and the computer. Only through it does the computer “understand” commands proposed through “humanized” instructions.

Generally, programming languages fall into two types: low-level and high-level. Low-level languages communicate more directly with the machine itself; Assembly is the classic example.

https://marcosemanuelss.medium.com/a-linguagem-assembly-18831eb23c6

High-level languages allow easier communication with the computer because they are expressed in a way closer to human language, using English words, such as the classics C, Java, Python, and Pascal.

// "Hello World!" in Pascal

program Hello;
begin
  writeln('Hello World!')
end.

Therefore, the programming language allows instructions written by humans to be interpreted and executed by the machine, either in a language closer to the machine and hence harder to read (low-level) or in one closer to humans and thus easier to read (high-level).
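To make this contrast concrete, Python's standard dis module can display the lower-level instructions its interpreter actually executes for a high-level expression; a small sketch (the exact instruction names vary between Python versions):

```python
import dis

def add():
    # One readable high-level line...
    return 2 + 2

# ...is translated into several lower-level interpreter instructions.
dis.dis(add)

# Each instruction has a mnemonic name, much like assembly mnemonics.
print([instr.opname for instr in dis.Bytecode(add)])
```

This is not machine code itself, but it shows the same principle: the friendly, human-readable form is translated into simpler step-by-step instructions before anything runs.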

Algorithms

To conclude, it’s time to apply the concepts of system operation and programming language, and this is done through algorithms.

“Algorithms are sets of finite and organized steps that, when executed, solve a specific problem.”

Algoritmos: Lógica para desenvolvimento de programação de computadores

Image source: https://www.researchgate.net/figure/Graph-of-an-algorithm-for-bread-baking_fig6_343538069

An algorithm is anything that follows a certain number of steps to reach a result. The simple act of baking bread is an algorithm, a systematic process, and the same happens inside the computer. Through programming logic, it's possible to dictate steps for the computer to follow and produce an output from the input data, such as calculating 2 + 2 or writing someone's name.
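The idea of an algorithm as a finite, ordered set of steps can be sketched in Python (the function and its steps here are illustrative, not from the book quoted above):

```python
def add(a, b):
    # Step 1: receive the input data (a and b).
    # Step 2: process it (here, a simple arithmetic operation).
    result = a + b
    # Step 3: return the output.
    return result

# Input -> processing -> output, exactly like the 2 + 2 example in the text.
print(add(2, 2))  # 4
```

However trivial, this follows the definition exactly: a finite number of organized steps that, when executed, solve a specific problem.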

Conclusion

Given all this information, it's now possible to understand that the functioning of a computer is based on data input, which is processed by algorithms through the interaction facilitated by a programming language. This is the essence of this intermediary: without it, humans could not build interactive applications on a device. Without it, there is no conversation between two different natures, no translation of external data, and, in practice, no computing as we know it.

Ciência Compulsiva

All the texts posted on this platform were originally posted on the ‘Compulsive Science’ blog. Follow our posts on the official website and listen to our podcast! (currently only in Portuguese).

This text was translated from Portuguese to English and is therefore subject to grammatical and translation mistakes. Please report any mistakes.
