When you hear the word computer, maybe you think of something like a beefy gaming desktop with flashing lights, or maybe you think of a slim and sleek laptop. These fancy devices aren't what people had in mind when computers were first created. To put it simply, a computer is a device that stores and processes data by performing calculations. Before we had actual computer devices, the term computer was used to refer to someone who actually did the calculations. You're probably thinking that's crazy talk. A computer lets me check social media, browse the Internet, design graphics, how can it possibly just perform calculations? Well, friends, in this course, we'll be learning how computer calculations are baked into applications, social media, games, et cetera, all the things that you use every day. But to kick things off, we'll learn about the journey computers took from the earliest known forms of computing into the devices that you know and love today.

In the world of technology, and if I'm getting really philosophical, in life, it's important to know where we've been in order to understand where we are and where we're going. Historical context can help you understand why things work the way they do today. Have you ever wondered why the alphabet isn't laid out in order on your keyboard? The keyboard layout that most of the world uses today is the QWERTY layout, distinguished by the Q, W, E, R, T, and Y keys in the top row of the keyboard. The most common letters that you type aren't found on the home row, where your fingers rest the most. But why? There are many stories that claim to answer this question. Some say the layout was developed to slow down typists so they wouldn't jam old mechanical typewriters. Others claim it was meant to solve a problem for early telegraph operators. One thing is for sure: the keyboard layout that millions of people use today isn't the most efficient one. Different keyboard layouts have even been created to try and make typing more efficient. Now that we're starting to live in a mobile-centric world with our smartphones, the landscape for keyboards may change completely. My typing fingers are crossed.

In the technology industry, having a little context can go a long way toward making sense of the concepts you'll encounter. By the end of this lesson, you'll be able to identify some of the major advances in the early history of computers.

Do you know what an abacus is? It looks like a wooden toy that a child would play with, but it's actually one of the earliest known computers. It was invented around 500 BC to count large numbers. While we have calculators like the old reliable TI-89s or the ones built into our computers, abacuses are actually still used today. Over the centuries, humans built more advanced counting tools, but they still required a human to manually perform the calculations.

The first major step forward was the invention of the mechanical calculator in the 17th century by Blaise Pascal. This device used a series of gears and levers to perform calculations for the user automatically. While it was limited to addition, subtraction, multiplication, and division of pretty small numbers, it paved the way for more complex machines.

The fundamental operations of the mechanical calculator were later applied to the textile industry. Before we had streamlined manufacturing, looms were used to weave yarn into fabric. If you wanted to design patterns on your fabric, that took an incredible amount of manual work.
In the 1800s, a man by the name of Joseph Jacquard invented a programmable loom. These looms took a sequence of cards with holes punched in them. When the loom encountered a hole, it would hook the thread underneath it. If it didn't encounter a hole, the hook wouldn't thread anything. Eventually, this wove a design pattern into the fabric. These cards were known as punch cards. And while Mr. Jacquard reinvented the textile industry, he probably didn't realize that his invention would shape the world of computing, and the world itself, today. Pretty epic, Mr. Jacquard, pretty epic.

Let's fast forward a few decades and meet a man by the name of Charles Babbage. Babbage was a gifted engineer who developed a series of machines that are now known as the greatest breakthrough on our way to the modern computer. He built what was called a Difference Engine. It was a very sophisticated version of some of the mechanical calculators we were just talking about. It could perform fairly complicated mathematical operations, but not much else. Babbage's follow-up to the Difference Engine was a machine he called the Analytical Engine. He was inspired by Jacquard's use of punch cards to perform calculations automatically instead of entering them by hand. Babbage used punch cards in his Analytical Engine to let people predefine a series of calculations they wanted to perform. As impressive as this achievement was, the Analytical Engine was still just a very advanced mechanical calculator.

It took the powerful insights of a mathematician named Ada Lovelace to realize the true potential of the Analytical Engine. She was the first person to recognize that the machine could be used for more than pure calculations. She developed the first algorithm for the engine; it was the very first example of computer programming. An algorithm is just a series of steps that solves a specific problem. Because of Lovelace's discovery that algorithms could be programmed into the Analytical Engine, it became the very first general-purpose computing machine in history, and a great example that women have had some of the most valuable minds in technology since the 1800s.

We've covered a lot of ground already, learning about how primitive counting devices like the abacus evolved into huge, complex devices like the Analytical Engine, proof that there was life before social media. In the next video, we'll learn about how these mechanical machines made the leap into modern computing.
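One last aside before you go. If the idea of punch cards as a predefined series of steps feels abstract, here's a minimal sketch in modern Python, purely my own illustration with made-up card patterns (nothing from Jacquard's or Babbage's era), showing how each row of holes can be read as yes/no instructions that tell every hook whether to lift its thread. In that sense, a stack of cards really is an algorithm: a series of steps carried out in order.

```python
# A minimal sketch, assuming a toy 4-hook loom and invented card patterns.
# 1 = hole punched (the hook lifts its thread), 0 = no hole (the hook stays put).
punch_cards = [
    [1, 0, 0, 1],
    [0, 1, 1, 0],
    [1, 1, 0, 0],
]

def weave(cards):
    """Follow the cards one by one: a predefined series of steps, i.e. an algorithm."""
    for pass_number, card in enumerate(cards, start=1):
        # Turn each yes/no instruction into a visible bit of "fabric".
        row = "".join("X" if hole else "." for hole in card)
        print(f"pass {pass_number}: {row}")

weave(punch_cards)
```

Swap in a different stack of cards and the very same loom produces a completely different pattern, which is exactly why programmable machines, and later Lovelace's insight about general-purpose computing, were such a leap.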