I'm nearing completion of my cybersecurity master's degree. It's fully online and takes around 20-40 hours of work per week. There's no thesis, but I'm in a project/capstone course right now.
I don't have a computer science undergrad, so for me the motivation was both the potential for advancement and the chance to study the field I've been working in for the past decade.
School has some clear advantages:
* It standardizes taxonomies, frameworks, and methods. You'll be able to talk with people who are steeped in areas like distributed systems or other deep fields, recognize the terminology, and use the correct references.
* Things like ethics, establishing trust, and assessing the reliability of data are hard to learn "on the job" except when you're directly confronted with them. These topics are readily taught in a classroom setting.
* The efficiency of classroom learning is hard to beat. A well-designed lecture and reading guide present exactly the right content to convey a topic efficiently. Obviously, this depends on the lecturer being good at their job.
Frankly, some of the arguments here against higher education are alarming. Your classes should not be about acquiring tool-specific skills. You should have learned logic, algorithms, and how DAGs work in your undergrad or through self-teaching. Mastery of Apache Airflow is your own business, but it's much more straightforward with those fundamentals in place.
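To make that distinction concrete, here's a minimal sketch in plain Python (the task names are hypothetical, and it's not tied to any particular tool): once you understand that a DAG is just nodes with acyclic dependencies, and that a topological ordering gives you a valid execution order, a scheduler like Airflow is the same idea at industrial scale.

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# A toy pipeline expressed as a DAG: each task maps to the set of
# tasks it depends on. (These task names are made up for illustration.)
pipeline = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"extract"},
    "load": {"transform", "validate"},
}

# A topological sort yields an execution order that respects every
# dependency -- the core concept behind tools like Airflow.
order = list(TopologicalSorter(pipeline).static_order())
print(order)  # one valid order: ['extract', 'transform', 'validate', 'load']
```

Learn that concept once and every DAG-based tool becomes a matter of reading documentation rather than starting from scratch.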
Looking back over my IT career now that I've taken college courses, I've come to regret my time as a younger administrator. I was given too many permissions, and it's only through luck and coincidence that I never caused turmoil. Without any formal training, I had access to enormous amounts of data and plenty of potential to cause problems. I can't see a world where that kind of malpractice is allowed to continue. In the same way that we ask forklift operators to guarantee the safety of those around them by passing a test, we should be asking our IT workers to prove their basic abilities before handing over domain administrator logins.