Important skills to pick up during your IT studies
The two most important skills to develop - critical thinking and efficient learning
TL;DR - what this article covers
- Programming paradigms
- Focus on fundamentals over languages/frameworks
- Data structures and algorithms
- Algorithm complexity
- Discrete mathematics
- Computational complexity theory
- vi, vim, nano
- Git
A lot of the content covered during my degree was, to put it lightly, outdated. Judging by the style of the presentation slides and the concepts discussed, it was clear that the teaching materials were first created at least 15 years ago. Is that a bad thing though? Not exactly.
For starters, fundamental computing theory has not changed much in the last few decades, because the underlying mathematical concepts and logic remain fairly constant. Furthermore, even if your chosen course does not teach you the most "popular" languages (I spent my first year studying C and Java), you will still benefit from understanding ideas like good coding practices, programming paradigms and design patterns.
Slight tangent - I put "popular" in quotes because I strongly believe that rather than trying to find the best language to learn, you are much better off thinking about the industry or area of software development you would like to get into. For example, if you are aiming to work in web application development, you might consider learning TypeScript. If, on the other hand, you see yourself working in the investment industry, Python and C++ might be your weapons of choice (Python for analytics, C++ for highly efficient, fast-executing components). Another great example of this is [Ada](https://en.wikipedia.org/wiki/Ada_(programming_language)), a language that has become a standard in applications where runtime errors can have severe consequences (avionics, space, military).
Let's talk specifics then
The following list of computational concepts might not always be directly useful in day-to-day software development. Instead, it will help you think more deeply about programming challenges and, as a result, build better software. Make sure that over the course of your studies you become familiar with these.
Data structures - how information is stored so that it can later be retrieved or manipulated. Imagine you went shopping for spices and put every purchased spice into a single bag. It might be convenient and fast to pour your salt, turmeric, cumin and chilli powder into the same packet, but good luck separating them when you get home. The same goes for data - every single word you are reading right now is neatly stored in your device's memory, in a way that lets it be easily retrieved, moved and eventually deleted. None of this would be possible without appropriate data structures.
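To make the spice-bag analogy a bit more concrete, here is a minimal Python sketch (the spice names and amounts are purely illustrative) of how the choice of data structure changes the cost of finding something: an unordered list must be scanned item by item, while a dict (hash table) jumps straight to the entry.

```python
# The "one big bag" approach: an unordered list.
# Checking membership means scanning the whole list - O(n).
spice_bag = ["salt", "turmeric", "cumin", "chilli powder"]
print("cumin" in spice_bag)  # True, but found by linear scan

# Labelled jars: a dict mapping spice name to grams on hand.
# Lookup hashes the key directly to its slot - O(1) on average.
spice_rack = {"salt": 500, "turmeric": 80, "cumin": 120, "chilli powder": 60}
print(spice_rack["cumin"])  # 120, found without scanning
```

At four items the difference is invisible; at millions of items it is the difference between an instant answer and a noticeable pause.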
Going back to our grocery example, let's say you bought a bulk bag of avocados and are now trying to sort them from greenest to ripest. This lets you eat them in the most efficient order, without wasting any or having to bite into unripe fruit. What is the most elegant and fastest way to sort them? To be fair, it probably does not matter, because sorting 10 avocados is unlikely to take any significant amount of time. But what if you had 10 billion avocados? With scale, a lot of problems become very tricky, and finding the most efficient way to execute these kinds of operations becomes of utmost importance. This is where algorithms and algorithm analysis come in.
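As a toy sketch (modelling ripeness as a plain number is an assumption made purely for illustration), Python's built-in sort handles the avocado problem in O(n log n) time - and the gap between that and a naive O(n²) sort is exactly the kind of thing algorithm analysis quantifies:

```python
import random

# Toy model: each avocado's ripeness as a number, 0 = rock hard, 100 = ripe.
avocados = [random.randint(0, 100) for _ in range(10)]

# Python's built-in sort (Timsort) runs in O(n log n).
# At n = 10 the cost is trivial; at n = 10 billion, O(n log n) is roughly
# 3e11 comparisons, while a naive O(n^2) sort would need on the order of 1e20.
by_ripeness = sorted(avocados)

# Greenest first, ripest last.
assert all(a <= b for a, b in zip(by_ripeness, by_ripeness[1:]))
```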
Have you noticed how data structures and algorithms naturally go hand in hand?
Moving on - discrete mathematics is another area you will definitely benefit from, not only in IT but in other aspects of your life too. It ties back into data structures because it covers topics like set theory and graphs. It also covers arguments (series of propositions that, chained together, build a logical case for a conclusion) and other aspects of logic.
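Set theory in particular maps directly onto everyday programming - Python's built-in set type implements the operations straight out of a discrete maths textbook (the student names below are made up for illustration):

```python
# Two made-up enrolment lists, represented as sets.
algorithms = {"ana", "bo", "cem"}
discrete_math = {"bo", "cem", "dana"}

both = algorithms & discrete_math       # intersection: students in both courses
either = algorithms | discrete_math     # union: students in at least one
only_algo = algorithms - discrete_math  # difference: in algorithms only

print(sorted(both))  # ['bo', 'cem']
```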
One area of computing science that really showed me the benefits of learning theoretical concepts, and helped me understand real-life problems on a deeper level (and consequently design better solutions), is computational complexity theory. This area of computing will enable you to go beyond a simple-ish "let's throw this fancy algorithm at the problem", and instead understand the problem's complexity - i.e. whether it can be solved at all, and whether it can be solved in a reasonable amount of time. An example of such a problem, one that logistics companies deal with every day, is the Travelling Salesman Problem.
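To get a feel for why complexity matters here, consider a brute-force sketch of the Travelling Salesman Problem (the distance matrix below is made up for illustration). It checks every possible tour - (n-1)! of them - so it is hopeless beyond a handful of cities, and that factorial blow-up is precisely what complexity theory formalizes:

```python
from itertools import permutations

# Made-up symmetric distances between 4 cities, purely for illustration.
dist = [
    [0, 10, 15, 20],
    [10, 0, 35, 25],
    [15, 35, 0, 30],
    [20, 25, 30, 0],
]

def shortest_tour(dist):
    """Brute-force TSP: try every ordering of cities 1..n-1.

    There are (n-1)! candidate tours - fine for 4 cities, but
    already ~1.2e17 tours at just 20 cities.
    """
    n = len(dist)
    best_len, best_tour = float("inf"), None
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)  # start and finish at city 0
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if length < best_len:
            best_len, best_tour = length, tour
    return best_len, best_tour

print(shortest_tour(dist))  # shortest tour for this toy matrix has length 80
```

No known algorithm solves the general case efficiently; real logistics systems rely on approximations and heuristics instead, which is exactly the kind of trade-off this theory teaches you to reason about.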
This is an advanced topic, which in my case was taught in third year. Read the Wikipedia article on it, and definitely take a deeper dive once you feel more experienced and better equipped. A great book on the subject is Introduction to the Theory of Computation by Michael Sipser.
Let's talk about specific technologies and concepts that are not directly related to programming but are nevertheless very important to learn about. Git - in simple terms, a way to keep track of your code, keep every saved version of it, and collaborate with others (think of it as Google Docs for code). Make sure to spend some time and go beyond what might be covered in your course. Git is an extremely important tool to understand, and unfortunately I see too many IT professionals, with many years of experience, who don't know Git beyond a simple commit and push. Used right, it keeps the full history of changes, which becomes crucial when crap hits the fan and you need to diagnose issues. Seriously, make sure you learn Git! Here is a great website for learning Git in practice - spend some time there once you know the fundamental theory of version control: https://learngitbranching.js.org/
vim, vi and/or nano are old-school text editors that I highly recommend getting some experience with. During your time working with the command line interface (bash terminals and such), you will probably end up using one of them, so start learning them as soon as you can. And yes, there will be plenty of opportunities to use the command line - interacting with virtual machines, cloud computing instances, remote devices, and many more. vi is approximately 50 years old at this point, and nano is around 25. Once again, even though certain ideas and technologies in IT are relatively old, they are in no way obsolete and should not be disregarded (BTW, good luck trying to exit Vim :P)
While this is in no way an exhaustive list, it is hopefully a good starting point for identifying the topics that will help you build a solid knowledge base and a successful career.