Hacker News

Why weren't lambda grads able to get good jobs? I think that was the main controversy. Was it the instruction or what went wrong there?


Most of the people I saw get successful were people doing, like... extra stuff outside of the classes. They were taking what they learned and applying it. Others organized groups of people, essentially forming a small support network of people in a similar position. I completely understand that not everyone has the time for that.

I think two things happened over Lambda's life span that affected the quality of the teaching. The first was the initial pool they advertised to. I think I saw the post on Hacker News before everywhere else. I wonder if, by being more selective with already-technical people at the beginning and then widening the net over time to include more people, you could get an initial big-numbers push followed by more OK numbers.

The instructors I met were all great. I think a lot of that flak was directed more at later cohorts or at alternative programs outside the main full-stack web track.


Did lambda school have any sort of assessment before accepting students? This really resonates with my observations. The people who got the most out of software bootcamps were those that had tinkered with video game modding, WordPress, etc. and had done some self-learning of coding before going to a bootcamp. In other words, people who knew they liked hacking on tech and had a basic ability to code. They approached the bootcamp with the mindset that they already knew they had technical interest and ability, and wanted direction on building marketable skills.

Do you think greater selectivity with applicants would increase the success rate? I had thought the ISA model would incentivize this, but when I learned Lambda was selling ISAs, that seemed to remove the incentive.


This is generally true but not always. As you can imagine we have loads of data on this.

If you only selected students who had been tinkering with writing code for 10 years, certainly you'd be successful in doing so, but you'd also eliminate ~90% of those who we have seen become software engineers.

The only way we've found that does it well is to have people actually start writing code and see if they enjoy it. That's why we now have multiple free classes, have a free dropout period once you're in the school, and even have a three-week free trial of the school itself.

The notion that financing ISAs removed the incentives isn't really accurate.

First, most of the time ISAs are financed it's in the form of a loan you have to pay back with interest with an ISA and its repayments as collateral, or it's a sale to a neutral SPV with recourse in the case repayments don't hit a certain threshold.

In the rare instances (we've never done this) where schools have been able to sell ISAs outright, it's been at extreme discounts or based on discounted predicted likelihoods of future revenue, and if those ISAs don't repay, the buyers bail and the school trying to sell them is out of business.


Edit: It's too late to edit my comment, but I noticed an error: we have sold ISAs with minimal recourse, not at enrollment but at the point of _graduation_; we would sell half at an extreme discount at graduation (based on likelihood of being hired) and keep half on our books.


They had a pre-bootcamp with a free course that taught the basics. Some people cheated through that, though, and didn't quite process that... you can't really pull that off through the whole program.

I totally saw people who had never coded before succeed, I'm not sure I could pick out in an interview who would do well and who wouldn't. It's a marathon not a sprint.

For the selectivity thing... I'm really not sure. The selling-ISAs thing was a bit weird when I learned about it. I honestly kinda wonder about the actuarial calcs of it all.


Honestly I was a little surprised at the cheating too. I think I was a little naive in the beginning, assuming that if you actually wanted a job you would understand that you would have to be able to write code. But some folks are in a school mentality: if you get a grade/diploma you're good, regardless of whether you understand the things required to get there.

Having tried a number of different ways to do admissions, I can assure you doing interviews is possibly the worst.

As far as ISAs go, it all comes out in the wash. If you create a pool of ISAs and students don't get hired you may have more ISAs but the average ISA is worth less, so the only thing that matters is whether each individual student gets hired. There's no financial wizardry that can let you sell $1 for $2 in the long-run.


Yeah, that's fair enough. I don't think the people cheating ever got far enough to actually affect job stats. It was pretty easy to suss out who actually knew enough to keep going. I don't fault Lambda for that; it's just the reality of any educational goal line.

For the interview thing, that's just what you were doing at the time. At the rate you were iterating, I'm sure that there's a better process now.


> I don't think the people cheating ever got far enough to actually affect job stats.

Yeah, some do a remarkably good job of cheating and would get into hiring stats for sure, but I think we've almost entirely cut that out now.


Funny how some people think that "programming through Stack Overflow" will keep working after you progress to a certain level.

School mentality (and I might add: grifter mentality as well)


I’m Senior Staff and still have to use Stack Overflow regularly!


Sure, not a problem. Most programmers do, probably.

However, I've seen students use Stack Overflow and other resources as a source of "program stamps":

- They enter their problem in an Internet search engine.

- Click on the first result with code in it.

- "Stamp it" on their own code: I.e., copy/paste it.

- Use the editor/compiler to find any issues by trial and error, and fix them until it doesn't complain anymore.

They might try a simple example or two to see if it works. And that's it.

There's not much reflection on their final solution, or on the bits they copy/paste. They don't seem to understand their solution, nor care about that or their problem solving process. They put in effort, they expect a passing grade.


> Most of the people I saw get successful were people doing like... extra stuff outside of the classes. They were taking what they learned and applying it.

Is this a knock on Lambda School, though?

You can go to MIT but if you never apply your learnings you won't be a good engineer.


Going through the motions of a CS degree is probably sufficient to land a decent job if you're getting good grades, not cheating, and you get an internship (or similar).

At least from what I've heard of bootcamps, you need to really go above and beyond to have similar chances with that route.


Well there are many many students at schools with CS programs who never get internships (because they don't apply or have an empty resume).


> Most of the people I saw get successful were people doing like... extra stuff outside of the classes.

Ironically this is not that different from my university experience.


Not ironically at all! If someone expects to attend programming classes (or any classes for that matter), not do any extra work, and as a result become a programmer, that someone will have a rude awakening. Or as one of my university professors said, "Attempting to learn this subject by just listening to these lectures is equivalent to attempting to become a gymnast by watching the Olympics on TV."


Mine too actually.


"They were taking what they learned and applying it."

Am I missing something? Surely this is exactly what you should be doing?


If you go on the HN Algolia, there are a lot of posts from students and people involved with Lambda over the years that might paint a clearer picture of what kind of dysfunction was taking place at the company.


I don't think the assertion that "lambda grads weren't able to get good jobs" is true.

Obviously not every single student has been hired, but our hiring rates have always been pretty good (they're better now than they were in the past), and thousands of BloomTech (we had to change our name because of a trademark lawsuit) grads have increased their lifetime earnings by billions of dollars, and work at nearly every major company you can think of.


What are the hiring rates and the average and median improvements in salary?


You can see our 2021 audited outcomes report, with all of the data, here: https://www.bloomtech.com/reports/outcomes-report. (Note: the 2021 outcomes report is very recent, as you have to let students graduate, give them time to get placed, etc.)

Some of it I'm thrilled about, some of it shows us where we have more work to do (or need to do better in admissions - candidly, it's always a difficult balance between giving folks a chance and certainty about those folks' outcomes).

High level:

90% of those who are job seeking got hired.

Our median hired grad increased their income by $27,500 (and that's just their first job - obviously software/data science salaries shoot up quickly after a first job).

About half of our students have degrees, and half do not.


This comment got me interested, so I dived deeper into the report [0].

Learners are divided into three groups: graduated (59%), still enrolled (5%), and withdrawn (36%). Graduated learners are further divided into two groups: job seeking (63%: ~37% of all learners) and non-job seeking (37%: ~22% of all learners). Here's the definition of “non-job seeking”:

“A BloomTech graduate who has been unresponsive to outreach, has explicitly indicated they are not pursuing a technical role, or has explicitly indicated they have paused their job search.”

When we apply the base rate to the 90% rate, we conclude that 33% of those who attend the program (learners) got hired.

[0] https://www.bloomtech.com/reports/outcomes-report
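The funnel arithmetic above can be sketched in a few lines. This is a back-of-envelope check using the rounded percentages quoted in the thread, not the report's exact counts (which is presumably why the thread's per-graduate figure of 56% differs slightly from the ~57% these rounded inputs give):

```python
# Funnel math from the rates quoted in the thread (rounded inputs, not exact report counts).
graduated = 0.59    # share of all learners who graduated
job_seeking = 0.63  # share of graduates classified as "job seeking"
hired = 0.90        # share of job-seeking graduates who got hired

# Hired as a share of ALL learners who enrolled:
overall = graduated * job_seeking * hired
print(f"{overall:.0%}")   # → 33%

# Hired as a share of graduates only (ignoring withdrawn / still enrolled):
of_grads = job_seeking * hired
print(f"{of_grads:.0%}")  # → 57%
```

The point of the base-rate multiplication is that each quoted percentage conditions on the previous stage, so the headline 90% only applies to the ~37% of all learners who both graduated and were counted as job seeking.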


I see, that’s … extremely unimpressive, especially if you take the median salary increase from above. And I think maybe we should take everything else this guy is saying as potentially dishonest. Not including people who stopped trying to get a job is just an absurd way to do this calculation. Imagine a clinical trial that just ignores everyone who discontinues due to adverse events. These stats seem borderline predatory.


It's largely unregulated. Institutions formally registered as colleges and universities have more stringent disclosure requirements.


It's extremely, extremely regulated!


That controversy has been rightly following Lambda around for a few years now.


It's amazing: if you pre-filter all the non-successful outcomes, the success rate rises tremendously...


We should do a better job of getting more granular on that piece, because it really does matter, but the above isn't the right way to do that math to answer the question prospective students have, and is misleading in the opposite direction. The outcomes report is directed at prospective students who want to understand what will happen to them if they attend the school and look for a job.

You have to remember that (for this outcomes report) nearly every student uses an ISA under which no one is required to pay us unless/until they get a job using the skills they learned. There are a number of people who attend never intending to switch careers, a (large) number who ghost us the day after graduation, and a (large) number who get a job but don't tell us until we get tax returns (so we learn they were hired only after this outcomes report).

Our team works their asses off to work with these students, and is doing everything they possibly can. Slacks, calls, texts, emails, some of which are auto-generated from me personally, and in some cases even physical mail, to try to get them to work with us. If they respond _in any way at all_ with anything other than something that equates to, "I don't want a tech job" they are job-seeking in the outcomes report. We have built tooling to make applying to jobs easier, we find jobs that you should apply to for you, have an outreach generator where our team will write emails to hiring managers for you, and more recently even what we call "job search takeover" where we work with students on resume/portfolio/job criteria in advance, and we will actually do all of the work to fill up your calendar with interviews.

Students who look for a job in any way whatsoever get hired at a very high rate. In my view, if you're a prospective student, that's the information you actually want to understand. The fact that there are a number of students (most of whom are using ISAs) who never intend to look for a job or don't look for a job is a fair indictment of our business model, but not a fair indictment of the quality of the school or the likelihood of getting hired.

So how should we treat that in an outcomes report? If you're a prospective learner do you want to know about the hiring rate of the people who ghost us or don't intend to look for a job, or do you want to know the hiring rate of people who map to the profile of what you expect to do?

If anyone has ideas of a better way to slice that data to convey the best information to a prospective learner, I would love to hear it.



I think presenting this data using the "funnel visualisation" is the most informative way of doing it.


I wouldn't count the withdrawn and still enrolled in that calculation, though.

If you go by graduated learners, it's 56%.


The gold is always under the shit.


Hey, so I’m not your marketer, but my 2 cents is to revamp that page to read more like a “proof of performance” report, i.e. remove the fluff.

The selling on the entire page makes it feel untrustworthy, which is the exact opposite of the intent.


There were for sure people I knew who got good positions in companies. I don't mean to imply otherwise.


I know someone who went through the Lambda/BloomTech bootcamp. Although they weren't able to transition to a new role, the experience helped them start to code in their job at the time. And iirc, because he didn't land a swe role, he didn't have to pay Lambda based on the ISA.


Or why were they? What percentage number is to be considered a success here? I don't have the faintest idea.



