
Trying to cram all of computer science into one post and failing - Part one

1. It all starts with a bit

It's very common to hear in your CS 101 class, 'Everything is 1's and 0's in the computer world'. Let me explain. There is something called ASCII - you can google it. It's just a number assigned to most of the characters (i.e. letters and punctuation) that we use. Ok, so what? If you put a bunch of 1's and 0's together, you can represent any number. Kind of like Morse code. Don't worry if you don't know it. The idea is just that there exists a mapping between a set of 1's and 0's and the normal numbers that we use. It goes something like this if we take 2 bits:

00 is 0
01 is 1
10 is 2
11 is 3

With 3 bits we can represent 8 numbers, with 4 - 16, and with 32 bits you can represent more than 4.2 billion numbers, and so on. Watch the movie The Matrix. It takes this idea to the next level. So if a bunch of bits (a single 1 or 0 is called a bit) can be mapped to a number, and a number can be mapped to a character (using ASCII), then any text you like can be represented as nothing but 1's and 0's.
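
To make the two mappings concrete, here's a short sketch in Python (my own choice of language - the post itself doesn't show any code) walking from a bit string to a number to a character:

```python
# Step 1: a bunch of bits. Here they're written as a string of 1's and 0's.
bits = "01000001"

# Step 2: bits -> number. Interpret the string as a base-2 number.
number = int(bits, 2)
print(number)        # 65

# Step 3: number -> character, using the ASCII mapping.
character = chr(number)
print(character)     # 'A'

# And the counting claim from above: n bits can represent 2**n numbers.
print(2 ** 2)        # 4 values: 00, 01, 10, 11
print(2 ** 3)        # 8
print(2 ** 32)       # 4294967296 -- more than 4.2 billion
```

Run it and you'll see the bit string 01000001 is the number 65, which ASCII says is the letter 'A'.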