1. Bits, Bytes, and Size
Next time you complain about the pitiful memory capacity of your old 8GB iPod Touch, it’s worth remembering what makes up eight whole gigabytes. Computer science grads will know that in every gigabyte there are 1024 megabytes, 1024 kilobytes in a megabyte, and 1024 bytes in a kilobyte. Breaking it down to the lowest level, you’ve got 8 bits in a byte. Why does that matter? Because on a flash drive, each bit of data is stored in a floating gate, a tiny transistor structure that can record itself as either a ‘1’ or a ‘0’. (Want to be impressed even further? Each floating gate actually relies on quantum mechanics – electron tunnelling – to work.) That means that an 8GB iPod Touch – the one you were laughing at a minute ago for being puny – has, according to my back-of-the-napkin maths, 68,719,476,736 individual gates arrayed inside that svelte aluminum body. Mighty clever engineering indeed.
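The back-of-the-napkin maths is easy to check yourself. This little sketch assumes one floating-gate cell per bit (single-level-cell flash – real drives may pack more bits into each cell):

```python
# Counting the bits in an 8GB iPod Touch, assuming one floating-gate
# cell per bit (a simplifying assumption; modern flash can store
# multiple bits per cell).
GB = 1024 ** 3                    # bytes in a gigabyte (binary)

capacity_bytes = 8 * GB           # 8,589,934,592 bytes
capacity_bits = capacity_bytes * 8

print(f"{capacity_bytes:,} bytes")
print(f"{capacity_bits:,} bits (one gate each)")
```

Run it and you get 68,719,476,736 bits – roughly 68.7 billion gates in your pocket.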
2. Everything you see or hear on the internet is actually on your computer
All your computer-whizz friends probably delight in telling you that having a ‘library’ of videos is so 2008, that no-one torrents anymore, it’s all Netflix and iPlayer and ‘The Cloud’, whatever that means. But you might want to remind them: every time you stream a video or the week’s latest Top 40 off the web, it’s actually, technically, playing off your computer. See, every internet media file has to be copied, at least in part, to your machine first. Ever wondered what that white buffering bar on YouTube or Netflix means? It’s the amount of video that’s been copied to the local cache, a.k.a. the amount you can still watch if your internet decides to up and die.
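That buffering behaviour can be sketched in a few lines. This is an illustrative toy, not any real player’s internals – chunks arrive from the network into a local cache, and playback can only consume what’s already been copied down:

```python
from collections import deque

# A toy model of a streaming buffer. Names and chunks are illustrative.
class StreamBuffer:
    def __init__(self):
        self.cache = deque()      # chunks downloaded but not yet played

    def on_chunk_received(self, chunk):
        self.cache.append(chunk)  # the white bar grows

    def play_next(self):
        if not self.cache:
            return None           # cache empty: playback stalls
        return self.cache.popleft()

buf = StreamBuffer()
for part in (b"intro", b"verse", b"chorus"):
    buf.on_chunk_received(part)   # three chunks 'downloaded'

print(buf.play_next())            # plays from the local cache, not the web
```

If the network dies, `on_chunk_received` stops being called – and once the cache drains, `play_next` has nothing left to give you.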
3. The distance data travels
A quick experiment for you: open Wikipedia in a new tab. With one click, you’ve just fetched a bunch of data from servers in Ashburn, Virginia, about 6000km away. Your request has travelled from your computer, through a local Wi-Fi router or a modem, up to a local data centre, from there onwards (under the Atlantic Ocean, if you’re in the UK), all the way to Virginia, and back again – in around a tenth of a second, depending on how good your internet connection is. By comparison, it takes your body around 0.15 seconds for a signal to pass from your fingers, up your spinal cord to the brain, and back down again.
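You can sanity-check that tenth of a second with some rough physics. Assuming the ~6000km figure above and light travelling at about two-thirds of its vacuum speed in optical fibre (the real route is longer, and every router hop adds delay):

```python
# Best-case round-trip time over ~6000km of optical fibre.
# Assumes a straight-line route and no processing delay at routers,
# so the real figure will always be higher.
SPEED_OF_LIGHT_KM_S = 299_792           # km/s in a vacuum
FIBRE_SPEED = SPEED_OF_LIGHT_KM_S * 2 / 3   # light slows in glass

one_way_s = 6000 / FIBRE_SPEED
round_trip_s = 2 * one_way_s

print(f"best-case round trip: {round_trip_s * 1000:.0f} ms")
```

That works out to roughly 60 milliseconds just for the light to make the journey – which is why a real-world 100ms round trip is actually rather respectable.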
4. Counting Starts at Zero
At a base level, every computer’s just a really big, complicated calculator. But thanks to the way its intrinsic circuitry works – with lots of little logic gates that are either ‘on’ or ‘off’ – everything it does happens in binary, where things are either a 1 or a 0, with no shades of gray in between. This translates up to a neat bit of programming trivia: in the computer science world, all counting (with the rather notable exceptions of languages like Fortran) starts at zero, not one. It actually makes a lot more sense – ever thought about why the 20th century refers to the 1900s? It’s because the calendar has no year zero: the very first century ran from AD 1 to 100, so the count started at one. If it had instead been called the 0th century (covering years 0–99), the 1900s would be the 19th century, and we’d probably have far fewer confused school children the world over.
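Zero-based counting is easy to see in practice. In Python – like C, Java, and most of their descendants – the ‘first’ element of a list lives at index 0:

```python
# Zero-based indexing: the first of N elements is at index 0,
# and the last is at index N - 1.
decades = ["1900s", "1910s", "1920s"]

print(decades[0])         # the *first* decade lives at index zero
print(len(decades) - 1)   # the index of the last element: 2, not 3
```

Same logic as the centuries: if the 1900s were the ‘0th’ slot, the numbering and the labels would finally line up.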
5. The work that goes into a Ctrl+C, Ctrl+V
One rather under-appreciated fact about solid state drives (SSDs), regarded as the gold standard for fast, reliable storage, is the amount of copying they have to do. When you want to copy some data from one place to another, it’s not just a matter of shuffling the data from one part of the drive to another. Because of the complicated way an SSD works, over-writing a block of old data with some shiny new data isn’t as simple as just writing the new stuff in with a bigger, thicker Sharpie. Rather, the storage drive has to do some complicated shuffling around. In practice, this can mean that writing a tiny 4KB file can require the drive to read 2MB (over 500 times more data than the 4KB file you’re trying to write), store that temporarily, erase a whole ton of blocks, then re-write all the data. It’s rather labour-intensive, so think before you juggle your files around next time.
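This effect has a name – write amplification – and the numbers above make it easy to quantify. Here’s the arithmetic, using the article’s example sizes (real block and page sizes vary by drive):

```python
# Write amplification: how much the SSD physically rewrites per
# logical write. Sizes here are the article's example, not a spec.
KB = 1024

logical_write = 4 * KB         # the file you asked the SSD to store
physical_write = 2 * KB * KB   # the 2MB the drive read, erased, rewrote

amplification = physical_write / logical_write
print(f"write amplification: {amplification:.0f}x")
```

A 512x amplification is a worst case, but it shows why SSD controllers go to such lengths – wear levelling, TRIM, garbage collection – to avoid rewriting blocks unnecessarily.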
6. Code isn’t as clean as you think
The majority of us put faith in bits of technology we don’t quite understand – be it committing your life to a 747, or your dirty pics to Snapchat’s auto-delete. When you do, you generally tend to assume that the code’s been scrupulously examined by teams of caffeine-fuelled programmers, with most of the niggling little bugs found and fixed. The truth seems to be quite the opposite. One Quora user pointed out that buried within the source code for Java – one of the internet’s fundamental bits of code – is this gem:

/**
 * This method returns the Nth bit that is set in the bit array. The
 * current position is cached in the following 4 variables and will
 * help speed up a sequence of next() call in an index iterator. This
 * method is a mess, but it is fast and it works, so don’t f*ck with it.
 */
private int _pos = Integer.MAX_VALUE;

It just goes to show that even programmers rush things to get home for the next installment of Game of Thrones sometimes.