This reminds me of a few (<10) years back when a high school student needed some C programming assistance.
I then found out that they were being taught to use Borland Turbo C.
I suspect the case for this is similar: some person who once upon a time learnt to program, then never practiced nor developed their skills, somehow got a job teaching programming, and is now applying the ancient tools they were taught somewhere around the time some fish decided to walk on land, because it's all they know.
Borland C and Pascal were great for learning programming in the mid-90s because of one crucial feature: an immensely good built-in help reference.
Whenever you wanted to know the syntax of some command, you typed it, pressed Ctrl-F1, and were presented with the docs, complete with a working example. It was insanely great. Remember, Stack Overflow didn't exist then.
QuickBASIC works the same way. Rust doesn't really build its help documentation for "built-in" access in the programming environment (everything is web-based, even "local" docs launch in a web browser) but examples are a first-class feature; they're reused as test cases.
Even though the Rust docs aren't built for this purpose first, in IntelliJ or CLion I can still press Ctrl + Q on a method and see the full method documentation with examples. I think what IntelliJ does is simply parse the doc comment for that method and then render the markdown contained in it in the little popup. The standard docs are built from those comments as well.
Well, JavaDoc is based on a mix of HTML and a custom tag language, whereas Kotlin moved to Markdown plus some similar tags, because HTML is a pain in the rear to write compared to Markdown and can't express everything you want concisely.
Maybe you mean Rust's documentation is rendered to HTML. In which case, sure, but so what? Good docs can be rendered to non-HTML formats too, typically for IDEs to display quickly without needing to load a full browser.
With Go, the ability to instantly read docs for anything (even just pulling docs for a single function) in the terminal is extremely helpful and far superior to an HTML-only approach. "go doc net/http.Client" pulls up the docs for the http client type.
It's one of the features I miss most when I code Rust, and its absence is very detrimental to the coding experience, as pulling up a browser is a massive context switch.
Well, it may be on everyone's machine (just type 'which emacs') but that hardly makes it the most commonly used IDE, which I think is what's hard to stomach.
Not long ago, Firefox supported directly opening HTML pages within compressed files (through the jar: protocol). If that still existed, it would be an option to distribute the Rust HTML docs within a compressed .jar file and let the browser read directly from it.
If the old hardware aphorism is "Never trust a computer larger than you can lift", then the software counterpart to that must be "Never trust a compiler that doesn't fit on a single floppy". <g>
"Never trust a compiler that doesn't fit on a single floppy"
I'm coining that second expression right here on Hacker News, today, 1/22/2020!
I'm fairly certain no one has said it prior...
But if they have, then my humble apologies, and in such case I'm un-coining it... <g>
Stack Overflow has dumbed people down significantly because of the mental laziness it engenders.
In a time before the internet, it used to be that books, and occasionally magazines, were the initial entry points to understanding software, hardware and electronics. Then BBSes came along, those secret islands of knowledge and trade.
I learned Pascal and assembly on Turbo Pascal with trial-and-error and lots of context-sensitive help. These days, development is mostly fragmented and half-working. Progress is neither linear, increasing nor assured, and much is lost, reinvented not necessarily as well and lost again.
Also, eventually coding will decline or bifurcate more because general AI will allow non-programmers to ask a machine to do Star Trek-like self-programming. Essentially all office jobs are vulnerable to elimination as mechanical machine operators.
Speaking as someone who learned in the Turbo Pascal / Turbo C era, StackOverflow has moved the state of the art on hugely, and people's dependency on it is more a symptom of the environments they are working in than anything else.
> Progress is neither linear, increasing nor assured, and much is lost, reinvented not necessarily as well and lost again.
The big difference is that we are now networked. The MSDOS environment of that era was not a moving target and had no security considerations. The world is very different now; we do so much programming in the browser because it's the one single cross-platform zero-friction environment we have. But those reasons also make it a battleground between platform monopolists.
There is also so much more software. Part of the problem that Stackoverflow solves is dealing with this. You can't be an expert on everything, there isn't enough time and it moves too fast. So you need a quick solution to incidental problems so you can get back to the "core" problem.
Interesting code and fairly readable; I liked the simple scripting system, and there were a few comments and examples, and yes - double buffering of the Mode 13h screen (though the memory allocation seemed weird, but maybe there was a good reason for that). Certainly better written code than what I was writing at 16, mainly in BASIC and assembler at the time (was a fan of the Lee Adams series of 3D programming books).
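For anyone who never saw the trick: a minimal sketch of the Mode 13h double-buffering pattern, assuming a Borland-style DOS compiler (MK_FP, farmalloc, _fmemcpy and int86 are Borland-isms; this won't build with a modern compiler):

    #include <dos.h>     /* int86, MK_FP, union REGS - Borland-specific */
    #include <alloc.h>   /* farmalloc/farfree - Borland-specific */
    #include <mem.h>     /* _fmemcpy - Borland-specific */
    #include <conio.h>   /* getch - Borland-specific */

    #define W 320
    #define H 200

    int main(void)
    {
        unsigned char far *vga  = (unsigned char far *)MK_FP(0xA000, 0);
        unsigned char far *back = (unsigned char far *)farmalloc((unsigned long)W * H);
        union REGS r;
        unsigned x, y;

        if (back == NULL) return 1;
        r.x.ax = 0x0013; int86(0x10, &r, &r);     /* INT 10h: 320x200x256 */

        for (y = 0; y < H; y++)                   /* draw off-screen first */
            for (x = 0; x < W; x++)
                back[y * W + x] = (unsigned char)(x ^ y);

        _fmemcpy(vga, back, (unsigned)W * H);     /* blit whole frame: no flicker */

        getch();
        r.x.ax = 0x0003; int86(0x10, &r, &r);     /* back to text mode */
        farfree(back);
        return 0;
    }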
Affordable C compilers (for a teenager at least) were not really a thing at the time (IIRC, even the DOS stuff, if you owned such a machine which I did not, still cost a couple hundred dollars around 1989-90-ish). I didn't manage to get a C compiler until the mid-90s when I picked up a copy of Turbo C for DOS, but I still think I paid $100 for the privilege.
If you know another way I can deploy code instantly on everything from PCs to phones to televisions without even having to sign anything, pay anything, or get approval, I'm all ears.
Since the early days of my career, people have kept predicting the demise of the programmer. I remember when CASE tools, and then Microsoft with VB, were signs of the end. Roll forward 30+ years in IT, and complexity keeps going up, and programmers are still needed.
Maybe I'm just old, but I'll probably be retired before AI replaces any programmers.
From what I see, the quality of the software in Corporate America reassures me that programmers will have jobs for many decades to come.
I don't really blame StackOverflow here. We are living in an age where there are exponentially more people attempting to program than there used to be, and with lots of different motivations. Many of those people just aren't capable of or interested in being experts. The problem as I see it isn't that this situation exists, but that those workers still command such high salaries.
We aren't nearly as close as you suggest to the level of code generation that would replace even the bottom quarter of the labor pool, but if that day ever comes I believe it will be really profitable for the top 20-30%.
One of the issues is people in general don't pay for dev tools anymore, and writing documentation is laborious. Borland could pay documentation authors to write extensive help and examples because they charged money, and sold a ton of licenses. Stack Overflow is simply crowd-sourcing the same thing.
'Eventually' is likely to be a very, very long time. I doubt we'll have self-programming systems like that even in my children's lifetime. It will likely take AGI to do it. There's the classic tale of programmers back in the 90s quitting their jobs because they thought Visual Basic would let non-technical bureaucrats write their own software, so there would be no need for specialist software engineers anymore.
One of the world's top self driving car engineers reckons we might never get Level 5.
I'm not an AI denier. Eventually we will get AGI, I think it's inevitable. It's just that the time horizon is so far off it's not really worth worrying about. We don't even have a clue how to go about designing its general operational parameters or architecture yet.
> One of the world's top self driving car engineers reckons we might never get Level 5.
I have some minor-level experience with self-driving car software, algorithms, etc. The majority of this is from various MOOCs I have participated in over the years, as well as research papers, books, and other things I have read and consumed. In short, I am not an expert, but I am not unfamiliar with the technology, either.
I personally think we'll never see actual widespread usage of self-driving vehicles outside of a very few narrow and carefully controlled (and regulated) cases. At least not, I suspect, within my lifetime (and honestly, I'll be lucky to get another 30 years or so).
My reasoning is that people will only trust perfection when it comes to riding in a self-driving vehicle. They will only be willing to use one if they can be assured that it will Never Crash, or be crashed into. They have no problem driving a car themselves, or being surrounded by other people driving cars. They have no problem with those crashing and even killing people - maybe even themselves (though they tell themselves fairy tales of it-will-never-happen-to-me to soothe over the reality). But introduce a machine into the equation...
...and that machine has to be Perfect. It cannot make any mistakes. It must avoid issues and be Safe 100% of the time, no exceptions.
In other words, people want the impossible from a machine, but will give utmost allowances to themselves and others as "humans".
I think a lot of this has to do with assignment of blame. When they crash or are crashed into - there is someone to assign blame to; themselves, the other driver, etc. Someone they can yell at, figuratively or literally.
A self-driving car? No one to yell at. No one to assign blame. Nothing that will feel bad for its error or failure to avoid something.
People can't handle that. They don't want a self-driving vehicle that has a safety factor of say, "seven 9s" - it has to be 100% safe or nothing. Because even if it makes a mistake only once out of a million miles of driving, that is still not safe enough. They want the unobtainable - a perfect machine, a machine that will never fail. Nothing like that can or will ever exist (basic laws of thermodynamics prevent it, for one thing).
Even though they themselves - even the most professional of professional drivers - can't come close to approaching this level. It's both madness and understandable at the same time.
> My reasoning is that people will only trust perfection when it comes to riding in a self-driving vehicle.
From what I've seen I think your model of human behavior is flawed. Aircraft have been mostly fly-by-wire for decades and software flaws have led to some crashes (most recently those of the 737 Max), but that hasn't stopped people from flying in planes.
People have already been killed by self-driving cars, but that hasn't stopped the testing programs. Tesla automation (though far from full self-driving) has led to a number of fatal accidents, yet people still buy Teslas and I haven't heard of any public outcry to ban them from the roads.
I too am skeptical that we'll be seeing full self-driving cars anytime soon, but not for the reason you give.
I think the real issue is that AI-like technology has a 90-10 problem. It's relatively easy to get to 90% of human level performance but that last 10% is much much harder - unpredictably harder. Unfortunately self-driving cars are an area where 90% isn't good enough, even 99% isn't good enough. You probably need to get to something like 5 or 6 9's to be acceptable for broad general use and no AI-like technology has yet gotten anywhere close to that.
EDIT: I didn't mean to imply that the 737 Max is fly-by-wire, but the MCAS system that caused the crashes is a comparable technology.
The usual practice is for autopilots to be switched off when the plane is within 200 feet of the ground. (Or a similar distance.) I think a self driving car would be a much easier sell if it never came within 200 feet of anything it could hit.
The way I see it, the manufacturers of the cars will take the blame and pay compensation. They will probably take out insurance for each car they sell to cover any of these claims, with a premium that depends on the accident rate, added to the price of the car. This would be better even if the accident rate were the same as the current human one. I think we will see self-driving cars of limited capability become commonplace in the near future because of the extra safety they offer us, especially from blame if an accident does occur. Also, much of the complexity of developing self-driving systems comes from trying to adapt them to current-day roads; if we changed the roads by adding dedicated lanes and stationary sensors to help self-driving cars, we could make accidents very rare.
Stack Overflow just makes it easy to look up trivia. The internet in general makes it easy to look up answers to simple problems.
That leaves more time for the interesting problems. I think that makes programmers generally smarter than before. They start from a higher base and can climb more quickly.
Also, you’re overestimating the potential trajectory of current approaches to AI.
But StackOverflow helps only with basic to common intermediate problems. The most upvoted answers often resemble the most common beginner problems. So it indeed very much replaces the old paper/online help documentation, is often better written and more readily available. On the other hand for more difficult problems it's common to not get an answer. In fact the problem might be so specific that it's even against the SO rules.
> coding will decline or bifurcate more because general AI will allow non-programmers to ask a machine
I was expecting this development with the rise of React, since at least the visualization part has become completely declarative. It's not rocket science to automatically transform that; in fact, being an XML dialect, there are a lot of tools available for this. But still, even in 2020 people still make money writing HTML/CSS, either plain or within a CMS.
In fact even a reverse development happened in some regard. In the 90s it was quite common for people to write CRUD apps running in Access, FoxPro etc. These tools - like Delphi which might be closest to that - practically disappeared.
The analytical load is still there and cannot be abstracted away. The decline of Access etc. happened because many aspects couldn't be mapped. Not to speak of all the nitty-gritty details needed to make applications run safely and maintainably. There's still a very long road ahead to automate all that, especially in a way that consumes only a reasonable amount of CPU, memory and storage...
> Stack Overflow has dumbed people down significantly because of the mental laziness it engenders.
I don't think SO has dumbed people down. It's just that the average level has tanked because of the influx of people into tech. In the 1980s the average computer person was highly skilled, educated and intelligent, because getting access to even the simplest computer required solid dedication and investment. Now tech is ubiquitous and easily accessible, and lots of people with the most varied backgrounds jump in. You'd be amazed how many 'developers' fail to solve the FizzBuzz test. We actually DO test applicants with it, and the results are extremely appalling. I am unable to understand how a person who calls himself a developer and has experience can fail this, but I have learned to accept it as an observed phenomenon.
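For context, the test is on the order of this (a minimal C solution; any working variant passes):

    #include <stdio.h>

    int main(void)
    {
        int i;
        for (i = 1; i <= 100; i++) {
            if (i % 15 == 0)
                puts("FizzBuzz");    /* divisible by both 3 and 5 */
            else if (i % 3 == 0)
                puts("Fizz");
            else if (i % 5 == 0)
                puts("Buzz");
            else
                printf("%d\n", i);
        }
        return 0;
    }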
I have been thinking that the frustrations I've been forming with writing software were due to an ever-lowering barrier of entry. I never thought about mental laziness as a factor, but that explains the frustrations so much more clearly.
* The DOM is a 20-year-old standard that is the primary interface to any markup language and the primary interface when working in the browser, yet it is still somehow an arcane mystery of lost knowledge - https://news.ycombinator.com/item?id=21069571
* Excessive sensitivity not found in other industries. I have heard that several psychologists are predicting my children's generation will be the most offensive generation as a counter to the extreme sensitivity (fragility) - https://news.ycombinator.com/item?id=20668380
* More selective bias, online code tests. Now when I see online code tests without extremely specific grading criteria I instantly terminate the conversation. - https://news.ycombinator.com/item?id=20605329
Sure... when general AI becomes available, flexible, and has solid usability for something like this. Which will be about the time that faster than light stardrives become common and our meals are materialized out of thin air on demand.
Despite perceptions, the last few years' "AI boom" is mostly marketing. Advances in processor power have made deep learning algorithms practical outside lab settings, but we're not even close to a "generic AI" that can think as well as even a dog can. That's decades away at the earliest.
Right now, we don't even understand how our own brains work.
I would argue that StackOverflow itself is not a problem—it is instead the laziness of many in the "newer" generations of programmers.
StackOverflow has many great explanations, which are no different than reading a chapter in a book (it's just a book that is extremely relevant to your problem).
The problem is that lazy people demand code dumps, and lazy people provide code dumps instead of explanations. This then becomes a feedback loop as people get used to this degree of laziness.
This is especially rampant for Web topics, and to some extent it's also a counter-argument to the Borland help pages with examples: when presented with a fully written solution, people will blindly copy it and only do the minimum amount of work to make it run.
I think this is accurate. I'm pretty technical but with no formal education on programming. Currently I can only tinker with programming in my free time. There are a lot of huge gaps in my knowledge and it sucks sometimes, but at least Stack Overflow allows me to solve certain problems more easily, so I can get on with doing fun stuff.
Regardless of trade, I never approve of this mentality, as the "shit" rarely gets done responsibly or sufficiently; that cannot happen without fully understanding at least the chosen implementation.
Perfectionism is not required, and can get in the way of professionalism. This is about a minimum amount of responsibility.
The mistake is assuming Delphi is bad and stupid. Delphi is MUCH better than C and C++.
Miles ahead.
Pascal is what we need, but C is what we deserve. Billions of dollars wasted on it.
What Delphi does have going against it is the massive mismanagement by its owners, and that translates to a lack of incoming talent.
Note: I'm a moderator in a Delphi forum and have used it for years. I'm now in Rust, and it's great, and yet:
- Some of the problems that Rust fixes? Delphi fixed them decades ago.
- Delphi still compiles so fast. All the stupid C-based langs are turtles, LLVM included. It's a sad joke that people use C/C++/Rust "for performance" and the compilers themselves do NOT PERFORM. Period.
- All the other current langs on earth, all of them, still fail at building a GUI easily.
- Only recently, with Go and now Rust, have people rediscovered the joy of easy deployment. Delphi had it decades ago.
Of course Delphi is not perfect, and we the users know it. Sadly, in this industry worse is better, and without a path to improve the lang you are at the mercy of the owners. But like Smalltalk, FoxPro, HyperCard and others, Delphi is a testament to what a decent lang/environment could be.
That the community at large still inflicts massive amounts of pain and billions in wasted effort on itself with C/C++/JS, and still refuses to learn? That is something I will never understand.
P.D.2: If this sounds like a rant? Yes. My super-duper machine is still compiling Rust and I'm frustrated, because this post reminds me that this pain could have been avoided...
The problem I had with Delphi was Embarcadero's pricing. At the time I was looking into it there was no express license or anything similar, so it would've been a few hundred dollars just to try and learn. Contrast that with Visual Studio Express, which was free and allowed me to learn C#. There was a significant cost to learning Delphi. Since then, however, I discovered Free Pascal and its ability to support multiple platforms and targets. Granted, I abandoned my journey of learning Object Pascal a long time ago.
I fondly remember using Delphi to create native Windows GUIs. It was really great for this.
The reason I stopped using it was that I discovered that there are many more program types than Windows GUIs - and there, C++ (or sometimes C) is really the best. High performance lock-free lists? Support for obscure network protocols? Containers with custom allocators? Delphi does not really work there.
It's interesting to note that you can find "high performance lock-free lists" in many langs, but nearly none have good GUI builders. Good GUI builders are harder, and your "high performance lock-free lists" are easy; also, you can use C/C++ libraries from Delphi too...
And with Delphi, the language and ecosystem are almost synonymous. Sadly. If the compiler had been free from the start and the IDE had cost money, they might have had a much stronger offering.
What's the problem with it? Programming snobbery? Isn't it still C programming after all? I don't understand people bashing Borland after all these years! Didn't you know that the accomplished Anders Hejlsberg was behind Borland's Turbo Pascal and Delphi before he was hired by Microsoft to develop C# and TypeScript? So, what's the/your problem!?
I'm thinking of a graduate turning up for a modern programming position and opining:
> I know Delphi!
I would argue you can make good courses in any language, and it would seem more convenient to teach in something that has better transferability into industry.
Also the response is likely aggravated by this quote from the source article:
> a curriculum that has taken heavy damage from the adoption of Java and C# in the early 2K’s.
One problem might be that it conditions learners to the peculiarities of programming for DOS and its archaic memory model, which aren't of much relevance to modern systems.
Turbo C was a great C compiler. Embarcadero still gives away two (!) free versions of its C/C++ compilers today: the command line tools, and the entire IDE and toolchain in its Community edition.
Depends on what sort of stuff they learned - remember that those students knew nothing about C so they'd be learning very simple stuff.
As an example, the free version of Turbo C++ 1.01 that is available through the Borland/Embarcadero museum (or was at some point; I don't know if it still is) has ~34 introductory examples (referenced via their guide). Outside of a couple, they use only basic stdio and stdlib stuff; the 2-3 exceptions use the non-standard conio.h header, which AFAIK is Borland-specific, though Watcom/OpenWatcom and Visual C++ also seem to provide the functions used. None use DOS-specific functionality (be it far pointers or whatever), and they could compile with any modern compiler.
I see your point, but the rationale might be completely different. Most programmers nowadays focus on web programming. The main reasons for this are partly technical (easy deployment, easier upgrades, multi-platform) and partly political (subscription model, complete control of user data).
Now, a minority of us believe that desktop apps are really worth fighting for. They are fast, they give user control over their data, they can work offline, and the subscription model is optional rather than built in. Some in the mainstream would like this to be killed. Some companies like Apple approve it as long as they get their cut. But most simply don't care. If you don't want a future where everything you use is controlled by someone else, then the Turkish decision is not meaningless.
You missed the point: Borland Turbo C was a proprietary compiler and IDE that had its last release in 1988 (replaced by Borland Turbo C++, which died sometime in the 90's). It has no relevance in this day and age.
If you teach C, at least teach C99 using a contemporary compiler—C11 was available when this all occurred. At no point did I suggest that a different language or paradigm should have been taught.
Had I been that minister, I would absolutely do something similar to what Turkey did.
If the goal is to get more students to become programming literate, then the last thing I would want is to pick a random flavor-of-the-day language, with some frameworks and some "best practices" debated to oblivion on HN or Reddit. I would want all schools to have a pretty standard system, with identical software that works identically, because the goal of that software is to act as pen and paper in a class. It needs to get out of the way.
Imagine the conversation had they picked C++ and Linux. First, there's going to be a multi-year debate over whether we should be using Ubuntu, CentOS, or Arch, until some pointy-headed expert says "The future is in CoreOS!" just before everyone decides to standardize on Fedora. After that there would be a fight about versions, because everyone knows we should always run the latest. Except for that group of people over there, who refuse to run the latest and want to run the proven. Oh, and they do not want to do C++; they hate C++ and say it should be Rust, and since it is a Linux system and shadow IT is pretty easy on it, they are going to teach Rust! Or mostly argue about teaching Rust... etc... etc... etc.
Here's a thing - if 10% of students get out of the Delphi environment and start messing with GCC for any reason, then the program has already succeeded in achieving its goal.
> and is now trying to apply the ancient tools they were taught somewhere around the time when some fish decided to walk on land because its all they know.
A little bit too harsh. As the article points out Delphi is currently #12 on the TIOBE Index [0]. Above Go for example.
Absolutely too harsh. Remember, Pascal was designed to be a learning and teaching language - that is the purpose it was made for.
And Delphi itself was a standout leader in technology, a huge inspiration for C#, and still a key technology used by many people today. It's just not well known. Not well known != ancient.
As I recall, C# was developed by one of the primary designers of Object Pascal, Anders Hejlsberg, after he was poached away by Microsoft. (https://en.wikipedia.org/wiki/Anders_Hejlsberg)
And to think: they actually didn't get a job teaching programming; they got involved in politics and were appointed "Minister of Technology" for their "loyalty" to the Head of State. That's what Turkey is dealing with right now.
5 or 6 years ago, at a conference about cloud computing, our ex-minister claimed that it is dangerous to work with computers since they can be challenging for the mind. Let that sink in.
I wouldn't call Delphi ancient. It's not like they haven't been updating it. Delphi can target Windows, macOS, Android, iOS, and Linux. It's pretty good:
On the positive side, Turbo C didn't include telemetry and didn't have to download "plugins" from the internet.
Delphi is a modern development environment. I haven't used it recently, but it was incredible in the 90s/00s and I know it kept adding features and platforms. AFAIK Beyond Compare is built with Delphi.
You missed one important point about Turbo C: the debugger. It has an amazing UI even by today's standards. If you need an all-in-one tool to teach programming (especially focused on the concepts), Turbo C is still a good choice.
I wonder if anyone has come up with a Turbo C-like UI "theme" for, e.g., Emacs. Free Pascal uses a Turbo-like UI by default, and a free reimplementation known as RHIDE is already (IIRC) part of the FreeDOS distribution, so recreating the UI itself shouldn't be too hard. It might be a fun way to boost adoption of these terminal-based tools, since the ergonomics and UX are in fact quite reminiscent of modern IDEs.
That's a nice start, but it lacks some of the most distinctive features. Even being limited to a single full-view buffer is not very Turbo-like. And I'm not seeing much IDE-like integration.
I remember back then (2002-ish), pretty much all CS labs had Borland Turbo C.
The funny thing was that the output prompt would disappear after printing things, so to hold it there, people would add getch(). This was so common that at one point people treated getch() as something you always added at the end of main(), like a return statement.
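The ritual looked roughly like this (conio.h and getch() are Borland extensions, not standard C):

    #include <stdio.h>
    #include <conio.h>    /* Borland-specific: getch() */

    int main(void)
    {
        printf("Hello, world!\n");
        getch();    /* the cargo-cult line: hold the output screen open */
        return 0;
    }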
What does it matter which programming language they learn in high school? It's not supposed to be the language you use for work. It's supposed to teach you to program.
If anything it should be mandatory to learn something else. It might be the only non-work programming language they will ever learn. Cradle to grave Python / Java?
Would Common Lisp / Haskell be considered a bad / good / ancient choice?
The student has to learn with something that can teach them the fundamentals quickly, is ideally transferable to most fields, and is hopefully something they can use freely in their own time.
Python is great for this. You can have kids building whatever in class and they can go home and use the same tools and make videogames or visual novels or whatever else strikes their fancy, and when they get hired still use the same tools.
My main issue with haskell is that it's likely not very transferable. That isn't to say you shouldn't be able to use it or teach it - maybe you can teach something unique with it you can't teach with other languages, but that needs a bit of explanation beyond just "well I prefer it".
The education should serve the students, not tickle the fancy of the teacher.
> My main issue with haskell is that it's likely not very transferable.
I thought this for a long time, having taught myself Haskell in university. I never regretted it, but I'll admit I never thought of it as a practical choice either.
Then I ran into Rust, and found I was somewhat mistaken. ;-)
technically it does not matter. The trouble is students HATE learning something they think is a waste of time. I've found it impossible to convince them that the first language doesn't matter.
If they just learned anything they'd progress at roughly the same rate, but when their motivation is impacted it makes a bigger difference.
I have fond memories of learning Common Lisp in university. They might be fond only because it's something I decided myself, but SLIME was a revelation. However...
Common Lisp lacks a static type checker, or any decent type system. These days I consider that a catastrophically bad design choice, and I can't justify teaching it as anyone's first language. It's also, well, it's design by committee.
Scheme might be better -- at least then you'd be learning about call/cc. If you aren't learning a language purely for work, then I think learning it should teach you something new. Scheme, Rust, Haskell, even Java/Kotlin if you've never used a decent IDE -- all of those can teach you something. CL sort of can't, anymore, though for a long time it could.
Or someone marketed the software to Turkey as a great way to increase their IT skill set as a country. Borland was awesome back in the day (at least in my opinion), but what Embarcadero has done with the software in recent years is a travesty. If they're going to insist on Pascal, they should go with Lazarus.
I've been toying with the idea of running a few classes to help early and mid-career adults learn to program. Although it's been so long since I first learned that I worry I'd struggle to get the basics across.
Does anyone have a recommendation for a good, up-to-date, open source introduction to programming? Ideally something designed for an experienced programmer (i.e. me) to guide people through.
My kids never showed much interest in programming despite my encouragement, but this year our high school changed its programming curriculum to follow Harvard's CS50 model:
The first half of the course is mostly in C and then switches to Python, and I really think it is a very well designed course that gets students doing some interesting stuff very quickly while building a solid foundation for future learning. There is an online version of the course, and students can watch all of the videos and lectures. I believe there is also a teacher training program for the curriculum.
I was surprised when my son came home and started asking me questions about malloc and free, but he has learned a lot in a very short period of time.
In my experience, it's close to impossible (or maybe even totally impossible) to get the basics across. I have been trying with many non-programmers for years and I have followed many people who tried to become programmers.
At this point I think it's probably a matter of "you either have it, or you don't". All of the people who succeeded "already knew" how to program, before I even showed them, or they never managed to learn.
And my experience here is even with people who actually did manage to secure a programming job. In my opinion they still don't know the basics - they blindly follow "procedures" and mostly do copy/paste programming, without being able to grok the idea and why these things work the way they do...
The truth is that what many of us consider to be the basics - big O notation, B-trees, and so on - are not really needed by the average programmer. They might be useful (I get use out of them) but they're not necessary.
Likewise, getting into intimate discussions of file encodings or strings in memory just isn't going to be retained by someone whose major interaction with a computer is Facebook.
Focus on practical tasks, but take time to explain what code is doing in plain English. If you've done rubber duck debugging, you're aiming for that level of simplicity in your explanation. Avoid technical words, use simple metaphors, and generally avoid the history lessons. Once you establish a baseline of skill in the developing programmer, swap the roles - have them explain their code to you.
I'm not even talking about big O or algorithms & data structures.
I'm talking about the basic concepts of function, iteration, abstraction, indirection etc.
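To pin down the terms, a toy C example where all four show up in a dozen lines:

    #include <stdio.h>

    /* Abstraction: "sum an array" hidden behind a named function.
       Indirection: the function receives a pointer to the data, not a copy. */
    static int sum(const int *values, int count)
    {
        int total = 0;
        int i;
        for (i = 0; i < count; i++)    /* iteration: walk the elements */
            total += values[i];        /* indirection: pointer + index */
        return total;
    }

    int main(void)
    {
        int data[] = { 3, 1, 4, 1, 5 };
        printf("%d\n", sum(data, 5));  /* function call: prints 14 */
        return 0;
    }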
Yeah those are definitely things that new programmers need to learn, but they usually need to learn them by doing things the hard way a few times first.
"yeah, that's a lot of copy and pasting isn't it? Want a neat trick to save yourself some effort?"
I have as well. I'm happy to report that out of the last 5 non-programmers or very-new programmers I've trained, 3 are senior engineers and 2 are on-level engineers, and I'd gladly work with any of them.
I like the "X by Example" books from O'Reilly in whatever your desired language is. Not exactly open source but quick to learn practical programming which is probably what you're after with adults.
Sure you can learn a lot but I have no idea why you would choose Turbo C in 2020 over say a Raspberry Pi with GCC. Turbo C (discontinued in 1990) is going to expose a bunch of irrelevant things today, as even most microcontrollers are 32-bit:
- the painful 16-bit near/far memory model
- a single-task operating system (DOS) that has long been obsolete
In going from zero to programmer those aren't bad things, they are a simplified learning model.
Would you advocate pilots stop learning to fly in single engine prop planes and instead jump right into the cockpit of an Airbus because that's what they will encounter in the workforce?
And why are doctors practicing on dead people when clearly they can't possibly help them or cure anything that way...
A 16-bit space is not at all simplified, it's far more complex. Because of that, it is a terrible choice. It's a much more difficult memory model to learn in. In a flat 32-bit address space, a pointer is a pointer and it just works. It's much easier to conceptualize and learn on. Have you ever programmed for a bank switched memory computer? It's a nightmare. It's extra mental gymnastics pushed on the developer that have been irrelevant for a long time.
I don't think your example is a valid comparison. It's more like asking whether all teenagers need to learn to drive a horse-drawn carriage before they learn on a modern car.
It does, though: if you mix near and far pointers, you are going to get crashes. Pointer equality and comparison are non-trivial, whereas on a modern 32-bit CPU you don't have any such distinction.
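To make that concrete: in real mode the linear address is segment * 16 + offset, so distinct segment:offset pairs alias the same byte, and a naive field-by-field far-pointer comparison gets it wrong. A portable-C simulation of the arithmetic:

    #include <stdio.h>

    /* 8086 real-mode translation: linear = (segment << 4) + offset. */
    static unsigned long linear(unsigned seg, unsigned off)
    {
        return ((unsigned long)seg << 4) + off;
    }

    int main(void)
    {
        /* Two different "far pointers"... */
        printf("0x%05lX\n", linear(0x0040, 0x0010));  /* 0x00410 */
        printf("0x%05lX\n", linear(0x0041, 0x0000));  /* 0x00410 - same byte */
        /* ...address the same physical byte, yet comparing the
           (segment, offset) fields would call them different. */
        return 0;
    }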
A classic, time-tested pedagogical approach that makes it easy for a student to reason about what the hardware is doing to execute their code... or a flavour-of-the-month JS “framework” that is impossible to reason about and really will be obsolete before the end of term? Hmm tricky.
Isn't this a bit like saying the last release of Ford's Model T was back in the 1930's? It is now called C++ Builder, and it is still regularly released. https://www.embarcadero.com/products/cbuilder
Yes, if when arghwhat said "Borland Turbo C" they meant to describe a product with neither Borland, Turbo, nor C in its name, that product might indeed have a different release date :)
Totally agree. I've been making the same argument elsewhere in this thread. People are disagreeing, saying things were simpler back then. I don't know how anyone can have fond memories of a 16-bit CPU with segments or bank switching, compare it to a modern flat model, and think it was simpler.
> I don't know how anyone can have fond memories of a 16-bit CPU with segments or bank switching compared to a modern flat model and think it was simpler.
"Fond memories" is easy to explain if that's what you grew up with. As for simpler, let me play devil's advocate for a bit: our "modern flat model" looks simple until you find out it's not really "flat". The 8086 model is basically "(segment << 4) + offset", while the "modern flat model" is actually a multi-level table lookup.
Yeah I guess that explains the fond memories part.
Some 32-bit parts are totally flat, like low-end ARM parts with SRAM and no MMU. Compare that to an 8-bit or 16-bit PIC microcontroller where you need to bank switch to have a usable amount of memory for your application, and it's heaven.
But yeah, I see what you are saying. Still, pulling the wool over someone's eyes about virtual-to-physical memory and TLBs doesn't seem as bad to me as making them jump through the distinction between pointer types, but maybe I'm in the minority on that.