What is Programming?
May 23, 2023
DataSciencePursuit
This page covers definitions of some programming terms and gives background information on programming.

What is Programming?
Simply put, programming is communicating with a computer. For a more specific definition:
Programming is the act of giving a computer commands or instructions for it to complete a task or solve a problem. These written commands are code, and the finished product is a program.
Think about driving rules, for example, how to turn right or left at an intersection. These are logical and well-thought-out instructions on how to drive safely and efficiently. Self-driving cars can be programmed with these instructions to teach them the basics of driving.
The role of programming is to help us leverage the power of computers, which is essential when working with a lot of data. Computers can work around the clock and perform large, complex, time-consuming, or repetitive tasks faster than we could. They will also do exactly what you tell them to.
Programs in everyday life
- Calculators are programmed to allow the user to perform calculations.
- Modern traffic lights are programmed to manage traffic automatically. For example, they use data from sensors to infer how much traffic is around and adjust the lights accordingly.
- Any apps (applications) on your electronic devices are made using programming.
Programming in data science
Since the world is becoming more digital and there is a vast amount of data available, programming is used at almost every step in the data science process.
Below is a list of tasks data scientists use programming for:
- Extracting data from websites (web scraping):
- There is a lot of data on the internet. Extracting that data can make a big impact on your project.
- My former boss used web scraping and automation to buy a book that kept getting sold out.
- Analyzing data:
- Computers can help data scientists find patterns quickly. This can be done through visualizations like graphs.
- Machine learning:
- As previously mentioned, machine learning models are programs that can be used to identify insights and make predictions from data. They are an important part of a data scientist's job.
- Automating and scheduling tasks:
- We schedule tasks and models to run automatically, freeing up time to focus on more current projects.
- I have used this outside of work. I was losing track of time watching Netflix on my laptop. Setting alarms did not work; I would hit snooze. So, I scheduled my laptop to restart when I needed to go to bed. And continue to restart every 5 minutes. This was more inconvenient than the alarm, and for a while, it got me to bed on time.
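To make the web-scraping idea concrete, here is a minimal sketch that pulls the links out of an HTML snippet using Python's built-in html.parser module. In real scraping the HTML would be downloaded from a website; the page content and the /books/... links here are made up so the example is self-contained.

```python
from html.parser import HTMLParser

# A tiny parser that collects every link (href attribute) it sees.
class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# Hard-coded snippet standing in for a downloaded web page.
page = '<html><body><a href="/books/1">Book 1</a> <a href="/books/2">Book 2</a></body></html>'

collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['/books/1', '/books/2']
```

Real projects usually reach for dedicated scraping libraries, but the idea is the same: a program reads the page's text and picks out the pieces of data you care about.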
What is a programming language?
To give computers instructions, we must be able to speak their language.
A programming language is a tool one can use to write commands to a computer. Like most written languages, it consists of symbols or characters and rules known as syntax.
I like the use of the word language here. It is a reminder that programming is communication with the computer to get it to do something for us.
Types of programming languages
Well, now we get more technical.
Machine language is the only language machines can directly understand and execute. It is a combination of zeros and ones (binary). These are like switches to the computer: ones represent on, zeros represent off. Each combination means something to the computer. For example, the binary code for the letter p is 01110000. This is difficult for most people to understand. Machine language and any other languages close to it are examples of low-level programming languages.
Most programmers, at least in data science, use high-level programming languages. Python and R are examples, and these are some of the most used programming languages for statistical analysis. These two languages are relatively easy to use and understand. The code looks like a mix of math and English-like words. For example, the letter p will just be written as is, "p". I believe that coding (writing code) in Python and R is for everyone. It may be a bit harder for non-English speakers, but there are only a few words you need to know. The most important thing is understanding the meaning of your code.
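You can check the letter-p example directly in Python: the built-in ord function gives a character's numeric code, and format renders that number in binary.

```python
# The character "p" has numeric code 112, which is 01110000 in binary.
code = ord("p")
binary = format(code, "08b")  # 8 binary digits, zero-padded
print(code, binary)  # 112 01110000
```

In a high-level language you simply write "p"; the translation down to those zeros and ones is handled for you.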
Optional: Wait a minute, computers can only understand machine code, so how are these languages executed? High-level languages rely on programs that are essentially translators. One kind is a compiler, which translates high-level code into machine code before the program runs. The other is an interpreter, which executes code as it goes, often line by line. Python and R both use interpreters; in both, the source code is first converted to an intermediate form called byte code, which the interpreter then executes.
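If you are curious, Python lets you peek at this intermediate byte code with the standard dis (disassembler) module. The little add function here is just a toy example.

```python
import dis

def add(a, b):
    return a + b

# Disassemble the function to see the byte code the interpreter executes.
# The output lists low-level instructions (for example LOAD_FAST);
# the exact instruction names vary between Python versions.
dis.dis(add)
```

You never need to read byte code to program in Python, but seeing it makes the "translator" idea less mysterious.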
Common coding errors
Most errors you will encounter in programming are user errors. The next two sections cover two common kinds.
Syntax
Syntax means the rules that dictate the correct way to write code. These include the proper format (arrangement of characters) and all characters needed to form valid code. Each programming language has its own syntax. Programming syntax is like syntax in grammar, with one crucial difference: if you use incorrect grammar, people may still understand what you are saying, but if you write code with incorrect programming syntax, a computer will not know what you are asking it to do. When you use incorrect syntax in programming, you get a syntax error.
You may have seen a syntax error while using a calculator. The syntax for addition is: a number + another number. If you enter 1 + into a calculator, the words "syntax error" or "invalid format" may appear on the screen. Enter a second number, and you will have the correct syntax.
In Python or R, code is run (executed) line by line. The computer stops execution when it finds an error and reports the line where the error occurred. You must fix or skip over the part with the error to get the remaining lines to execute.
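The calculator mistake above has a direct Python equivalent: the incomplete expression 1 + raises a SyntaxError. In this sketch the error is triggered through Python's built-in compile function and caught deliberately, so the program can report it and keep running.

```python
# "1 +" is incomplete, so Python refuses to accept it as valid code.
try:
    compile("1 +", "<example>", "eval")
except SyntaxError as err:
    print("SyntaxError:", err.msg)
```

If you typed 1 + directly into the Python interpreter, you would get the same SyntaxError, along with the line where it occurred.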
You will likely get a lot of syntax errors, especially when you start coding. Even experienced programmers do, so do not get discouraged. You can always search the internet for the syntax you need help remembering. The more you code, the more syntax you will remember.
Semantics
As mentioned before, computers will do exactly what you tell them to do. Semantics is what your commands mean, and therefore what your computer will do. A semantic error occurs when you give a computer the wrong instructions, usually due to incorrect logic or something missing from the code. Your computer will not know that the instructions are incorrect, so you won't get an error message. The program will run successfully, but the result will be incorrect. Semantic errors are among the more difficult errors to detect.
An example is a program that is supposed to calculate the total cost of something, but the programmer forgets to add sales tax. It would be nice if some shopping sites had that semantic error. Other examples could be as simple as adding when you are supposed to subtract. Or dividing when you are supposed to multiply.
To detect semantic errors, you must check that your results are what you expected. If not, go back and look at your logic and anything you may have missed in the code. Doing this step by step as you write your code is the least overwhelming way to detect and correct semantic errors.
You will encounter other errors, most of which will give you an error message; my advice is to google any error you get. The internet can likely help you figure things out.
Computer limitations
Some people believe that computers perform magic. On the contrary, people with more knowledge of how computers work would say that computers are dumb. What does this mean when they allow us to do such amazing things? Keep the following in mind when learning to program:
- As mentioned before, computers will only do what you tell them to. Therefore, a programmer must be able to give good instructions. That involves logic, precision, creativity, and problem-solving. The instructions also need to be detailed, with no steps left out. Computers will not give you what you want if your instructions are incomplete, incorrect, unordered, or missing any steps. So, as you learn to program, keep this in mind.
- Computers have no common sense. As mentioned previously, in machine learning, you teach machines using data. There is an expression, "garbage in, garbage out". If that data is incorrect, the computer cannot tell, even when the error is obvious, like a negative age. It will simply learn the wrong thing. It is your job to make sure that the data is correct before teaching it to the computer.
- They cannot tell you what they do not know or, rather, what you have not told them. Sometimes when your code does not work, you may have forgotten to tell your computer something.
As the end users of programs, it makes sense that we believe computers can do anything. But as future programmers, we need to realize that someone instructed them to do that. Even when we look at some applications, we can see the limitations of computers. They can only do what the programmer instructed, nothing beyond that.
I drove to meet some friends one day using Google Maps. The fastest route was through the city park. When I got to the park entrance, the road was blocked for a marathon. No matter what I tried, Google Maps gave me the same route. I could not find the option to tell it that the road was inaccessible (it seems it wasn't programmed that way). I decided to manually find another route. When I was far enough, it then recalculated. Google Maps is great, and I still give it five stars. This example illustrates that computers are not magical; they will only do what their programmers have instructed them.
In summary, computers are just tools to help you complete a task. How to get that task done correctly is the human's sole responsibility. As with a calculator, you must enter the correct expression to get what you want. Having unrealistic expectations of your computer can lead to disappointment or frustration. When learning to program, please pay attention to how computers work, so you can better command/instruct them.
Conclusion
Programming is the act of communicating with a computer using programming languages. As a person learning programming, it is best to have no expectations for computers. That way, you can learn programming with curiosity and an open mind. This is necessary to truly understand how computers work so that you can come up with effective instructions.