**How Do Coders Use Algebra? It’s Called Information Theory**

**Written by wyncode on 28th May 2014, 11:58 AM**

![solving algebraic equation](/uploads/2014/05/informationalgebra001-741x940.jpg)

*Solving an algebraic equation – Photo by [Al Ibrahim via Flickr](https://www.flickr.com/photos/crazysphinx/3965530823/) (Creative Commons)*

The science, technology, engineering, and mathematics fields, known collectively as STEM, are very hot right now.

What unites all of these fields is not just human inquiry into scientific problems, but a common language: mathematics. That is why anyone interested in computer programming and coding will need to learn to understand and manipulate mathematical logic.

The branch of mathematics most commonly used in computer programming is algebra. Algebra is the manipulation of formulas and equations to solve for unknown variables. An algebra problem may be as simple as x + 3 = 5 or far more complex, full of interdependent elements.
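That simple equation can even be solved mechanically. Here is a quick Python sketch (the `solve_linear` helper is my own illustration, not something from the article): rearranging x + 3 = 5 into x = 5 - 3 is the same move a program makes when it solves any linear equation.

```python
# Solve x + 3 = 5 by rearranging terms: x = 5 - 3.
x = 5 - 3
print(x)  # 2

# The same idea generalizes: any linear equation a*x + b = c
# rearranges to x = (c - b) / a, as long as a is not zero.
def solve_linear(a, b, c):
    """Solve a*x + b = c for x."""
    if a == 0:
        raise ValueError("no unique solution when a == 0")
    return (c - b) / a

print(solve_linear(1, 3, 5))  # 2.0
```

The point is not the arithmetic but the habit: algebra teaches you to isolate the unknown, which is exactly what a programmer does when debugging or deriving a formula in code.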

A solid understanding of algebra means being able to define the relationships between objects, solve problems with limited information, and develop the analytical skills that support decision making.

Information algebra owes much to the work of mathematician Claude Shannon. Shannon worked in the complex field of applied mathematics called information theory, which he founded with a paper he published in 1948. According to Wikipedia, “He is also credited with founding both digital computer and digital circuit design theory in 1937.” Shannon also worked on cryptography for the United States government during World War II.

Information theory is a branch of applied mathematics that arose from a question Shannon and other theorists were asking: how much information can a signal processing operation carry, whether that means storing data on a computer or transmitting messages over telecommunication lines?

Information theory is a complex subject, but a simple example captures the idea. A coin toss has only two possible outcomes, so describing its result takes very little information. A roll of a six-sided die has six possible outcomes, so pinning down its result takes more. For Shannon, information is precisely the reduction of uncertainty, measured so that decisions can be made.
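Shannon made this measurable with a quantity called entropy, the average number of bits needed to describe an outcome. A minimal sketch in Python (the `entropy` helper is my own illustration, not from the article):

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: the average number of yes/no
    questions needed to pin down one outcome."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin has two equally likely outcomes: exactly 1 bit.
coin = entropy([0.5, 0.5])
print(coin)  # 1.0

# A fair six-sided die has six equally likely outcomes:
# log2(6), roughly 2.585 bits -- more uncertainty to resolve.
die = entropy([1/6] * 6)
print(round(die, 3))  # 2.585
```

This is the precise sense in which the die "requires more" than the coin: resolving its outcome removes more uncertainty, and that removal is what Shannon called information.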

One might wonder what any of this has to do with computer coding. Coding is, of course, the language used to issue commands to a computer.

If coding is a language of commands, information algebra can be thought of as the mathematical calculation behind those commands. By manipulating information algebra, a coder is able to lay out their code in a logical, rules-based manner.
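One everyday form of this rules-based manipulation is Boolean algebra, the rules that govern how conditions in code combine. A small sketch (my own example, not from the article) of De Morgan's law, a rule coders use to rearrange conditions the way an algebra student rearranges x + 3 = 5:

```python
# De Morgan's law: not (A and B) is equivalent to (not A) or (not B).
# Checking every combination of inputs confirms the rule holds,
# which is why a coder can safely rewrite one form as the other.
for a in (True, False):
    for b in (True, False):
        assert (not (a and b)) == ((not a) or (not b))

print("De Morgan's law holds for all inputs")
```

Rewriting `not (logged_in and is_admin)` as `not logged_in or not is_admin` is a purely algebraic step, and knowing the rule lets a programmer make it with confidence.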

It is the intertwining of these concepts that highlights how important mathematics is to computing. By understanding algebraic concepts, a programmer can analyze the variables in front of them, deduce the best course of action, and solve for a result with their code.