We need to understand what computer science really is. And I dare say we can distill computer science into just this picture.
Computer science is about problem solving. High school courses typically paint a misleading picture that it's entirely about programming, with people heads-down in a computer lab working fairly antisocially on code. But the reality is that it's all about solving problems, and very often solving problems collaboratively, whether in person or by leveraging code, programs that others have written in the past.
And what does it mean to solve a problem? Well, you need inputs. So there's a problem you're trying to solve. That is the input. And you want output. You want the solution to that problem. And the sort of secret sauce of computer science is going to be everything in this proverbial black box in the middle, where you begin to understand exactly what you can do with that.
But in order to start solving problems, we first need to decide, as a group, how we're going to represent these problems. And what might a problem be? Well, in this room, there's a whole bunch of people. If I wanted to take attendance or count the number of people in this room, I'd need to start keeping track of how many people I see. But how do I represent the number of people I see? Well, I can do it sort of old school: I can take out a piece of chalk or whatnot and say, all right, I see 1, 2, 3, 4, 5. I can use little stylistic conventions like that to save space and remind myself. 6, 7, 8, 9, 10, and so forth. Or I can, of course, just do that on my own hand: 1, 2, 3, 4, 5, and so forth.

But how high can I count on just one hand? 5, you would think, but that's only because we haven't thought hard enough about this problem. It turns out that with just these five fingers, let alone these five more, I can actually count much higher, because the system I've been using so far, hash marks on a board or raised fingers, really just uses each mark or finger to represent one. But what if I took into account the order of my fingers and permuted them, so to speak, so that it's patterns of fingers that represent the number of people in the room, and not just the mere presence of a finger going up or down?

In other words, this can remain zero. This could still be one. But what if two is not just this, the obvious gesture, but just this, raising only my second finger? What if, then, three is this? So we have 0, 1, 2, 3. That's going to lead us to four somewhat offensively. But if we jump ahead to five, I might now permute this finger and this finger up. And if I want to represent six, I could do this. And now seven.
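The finger patterns being demonstrated can be sketched in code (a purely illustrative sketch: each finger is treated as one binary digit, down = 0, up = 1, and three fingers are enough to reproduce the patterns 0 through 7 walked through above):

```python
# Each finger is one binary digit (bit): down = 0, up = 1.
# Reading three fingers as bits reproduces the patterns 0 through 7.
for n in range(8):
    bits = format(n, "03b")                          # e.g. 6 -> "110"
    hand = bits.replace("0", "_").replace("1", "|")  # crude finger diagram
    print(n, bits, hand)
```

The `_`/`|` diagram is just a stand-in for lowered and raised fingers; the key idea is that order matters, so each of the eight bit strings is a distinct pattern.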
In other words, I've already expressed so many more patterns on my hand, and if we keep doing this, I can represent, painfully perhaps, 32 different patterns, and therefore count 32 different people, on this one hand alone. Or rather, I can count from 0 all the way up to 31, since one of those 32 patterns represents zero.

So what's the relationship, and how did we even get here? Well, it turns out that computers are kind of simplistic, much like our hands here. At the end of the day, your computer is plugged into the wall or it's got a battery, so it either has or does not have electricity. That is the physical resource that drives these things and our phones and all of technology today. So there either is electricity or there isn't, and that maps nicely to no finger or yes finger.

And indeed, computers, as you probably know, only speak what language? What alphabet, so to speak? Yeah. Binary. Bi meaning two. And that refers to the fact that in binary, computers have only two digits: zero and one. We humans, of course, have 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, and then we can combine those to count even higher. But computers only have 0 and 1, and that's it. Because at the end of the day, there's a direct mapping between power being off and a zero, or power being on and a one, with some electrons or whatever flowing from your battery or from the wall. So this is why computers tend to speak only binary: it just maps really cleanly to what it is that's powering them in the first place. But how is this actually useful?
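The relationship hinted at here is exponential: n fingers (that is, n bits) yield 2**n distinct patterns. A minimal sketch of that arithmetic, along with the decimal-to-binary conversion underlying it:

```python
def patterns(num_fingers):
    """Number of distinct up/down patterns with the given number of fingers.

    Each finger is one bit, so n fingers give 2**n patterns.
    """
    return 2 ** num_fingers

assert patterns(5) == 32      # one hand: the values 0 through 31
assert patterns(10) == 1024   # both hands: the values 0 through 1023

# Converting between decimal and binary, the computer's two-digit alphabet:
assert format(31, "b") == "11111"  # all five fingers up
assert int("11111", 2) == 31
```

This is why one hand tops out at 31 rather than 5: the pattern of fingers, not the count of raised fingers, carries the number.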
TO BE CONTINUED...
Python vs C++ & Java
"I got told today that Python isn't a "real" programming language. C++ or Java are."
Reddit users: Python is used in space, on a fucking microcontroller, by NASA.
If anyone considers himself a professional and is laughing at Python, he or she forgets that half of the Linux system is Python and Perl scripts. And since 90% of internet servers run on this OS... you get my point.
Tell that Python "isn't a programming language" to places like JPMorgan Chase and Bank of America, who are using it to replace the vast majority of their old legacy Java/C++ systems.
"Workflow Automation System (WAS), an application designed to manage NASA and other third-party projects." Made in Python.
The list is quite long, but admittedly a lot of it is specific projects at businesses and manufacturers that everyday people aren't familiar with.
But regardless, Python is used everywhere in the professional world. Not for everything, but for a lot.
It doesn't feel like I've figured anything difficult out or created a novel solution to something (which is what I assume real coding must feel like).
Not to say that's a bad thing, but to me Python often feels like using other people's work and calling it your own. Thoughts?
To be honest this is a good thing. Removing some of the cognitive load will make developers more efficient.
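As an illustrative sketch of that point (hypothetical example, not from the thread), compare doing the bookkeeping yourself with leaning on other people's well-tested work in the standard library:

```python
from collections import Counter

words = "the quick brown fox jumps over the lazy dog the end".split()

# By hand: manage the dictionary bookkeeping yourself.
counts = {}
for w in words:
    counts[w] = counts.get(w, 0) + 1

# Leaning on others' work: the standard library does it in one call.
assert counts == dict(Counter(words))
print(Counter(words).most_common(1))  # [('the', 3)]
```

Both versions are "real coding"; the second just spends the saved cognitive load elsewhere, which is the efficiency argument being made.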