
> Most jobs are now hoops after hoops, not taking into consideration your particular profile or the contributions you can make.

This is really key. I have applied for jobs before and then gotten questions like: "what's your experience with C++ or advanced graph algorithms?" Except that none of that shows up in my profile or resume. But they act surprised and completely shrug off how a decade of software and other relevant experience is suddenly invalidated. As in, a person who has used and learned a dozen plus languages but only tacitly used C++ suddenly will be a complete invalid when trying to write in C++? Another company advertised that Python experience wasn't needed, but then the first phone interview peppered me with low-level Python implementation questions. Why even bother to interview me? It's a waste of everyone's time.

What it boils down to is that companies have zero idea how to hire. And they have zero idea how to mentor and train, basically for the exact same reasons they don't know how to hire.

While tough, it's often a good thing for the applicant as a natural filter. If someone can't hire well, it's not a good place to work. Sometimes it is, but it's relatively rare.



As a literal graph theorist, I cannot tell you how frustrating it is that (a) nobody seems to understand my work, except that (b) interviewers use it as a shibboleth to exclude people from jobs that will never need high-performance graph algorithms. Mind you, I never get called for these interviews because I don't use react angles or something, but if they did, I'd crush the interview and fall asleep at my desk once they started giving me work.


It is fun if you ever find yourself in this situation, because you can play the uno reverse card on the interviewer: ask clarifying questions full of impenetrable jargon and look for rising panic (can I assume the graph contains a Hamiltonian circuit? etc., etc.)


Nah, when you do that, Murphy's law says that the interviewer will be the only person in the world working on extending nonstandard analysis to spectral hypergraph theory, and my attempt to snow them will reveal that I only have surface level understanding of the jargon I just emitted.


That, or they're the super egotistical/arrogant type but too dumb to know you know more than them, and they assume somehow you're the person who doesn't know what they're talking about.

Although in that case bullet dodged.


Reminds me of the (apocryphal) joke about James Gosling (who invented Java in 1994):

interviewer in 1999: we are looking for someone with 10 years of java experience.

james: I invented java 5 years ago.


well that guy's not getting hired!


You are joking but this may be true.

I once was interviewing for an interesting job and the topic was general knowledge of a system made of A, B and C (all were similar). The interviewer did not know much but insisted on some very deep details about B. I told him more than was available in the docs already and at some point I said I would need to peek in the code to say more.

He told me that this was too difficult for me, because only people who were part of the team that designed it would understand.

I told him that I wrote the A part almost entirely by myself, so it wouldn't be too difficult to catch up with B.

I did not get the job (ultimately a happy ending) and was told that I did not know enough on A (the part I wrote).


I am joking, but not really; basically it's my belief that any place asking for 10 years of a 5-year-old technology is going to be really sensitive about anyone with an "attitude".


Fun fact: I once interviewed at a place where the tech lead interviewing me had confused the terms pass by reference and pass by value. That is to say, he understood that in JavaScript objects were passed by reference and what that meant for the practical effects of assigning object a to b, but he thought the technical term for this was pass by value, and that the technical term for things that were passed by value was pass by reference (so according to him strings were passed by reference and objects were passed by value). And no explanation of what a reference is, how pass by reference works, and why it made sense to call it pass by reference could penetrate.


Just a pedantic detail: strings in JavaScript are passed by reference, they are just immutable.


I just went down the rabbit hole of reading this post and the entire thread. As someone who has been looking for a junior job, it's probably the most depressing thing I've ever read. I've been on the market for over 6 months, I've sent countless resumes out and tried various techniques, but I'm not even getting a nibble.


I guess technically it's passing a reference to the string, right? So if I say a = "stringA" there is a reference to "stringA" and that is assigned to a. If I then say a = "stringAA" there is another reference created for "stringAA" and assigned to a, while "stringA" is sitting around somewhere waiting to be garbage collected in a few milliseconds. That's way complicated to think about, and I'm not sure I haven't messed it up.

Easier to just say pass by value and forget about it. OR make all your variables consts and then it don't matter.


No, that's not correct. Value and reference assignment behave the same way for = (well, reference is hiding the fact that it's not the literal string/object but a reference to it; a number is just the number).

Where it matters is in passing arguments to a function call. If you pass 42, it’s not mutable so incrementing, or doing anything, will not modify the original variable you passed-in. For a reference, using = will assign a new value (not change the original) but modifying the referenced object like, say a.b = 5 WILL change the original object.

It’s not really “pass by reference” that a C/C++ developer would understand but it seems to be the term that has stuck.
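To make the distinction above concrete, here's a minimal runnable sketch (plain JavaScript; the names are made up for illustration):

```javascript
// Reassigning a parameter rebinds only the local name:
function reassign(x) {
  x = 99; // the caller's variable is untouched
}

// Mutating through a reference IS visible to the caller:
function mutate(obj) {
  obj.b = 5; // follows the copied reference to the shared object
}

let n = 42;
reassign(n);
console.log(n); // 42 -- unchanged

const a = { b: 1 };
reassign(a);
console.log(a.b); // 1 -- reassigning the parameter didn't change it

mutate(a);
console.log(a.b); // 5 -- mutation through the reference did
```

So `=` inside the callee never changes what the caller sees, but `a.b = 5` does, which is exactly the behavior described above.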


>= (well, reference is hiding the fact that it’s not the literal string/object but a reference to it, a number is just the number).

>For a reference, using = will assign a new value (not change the original)

What I wrote was regarding only strings, so I'm not understanding; it seems you are saying the same thing I said? But maybe I'm wrong about how the actual strings are stored.


Sorry to get a bit nerdy here, but in JS, neither pass by value nor pass by reference make sense as it’s not defined by the spec and much less followed by the implementations. Strings can be pointers to buffers or ropes, numbers can be values (NaN-boxed or otherwise) or pointers depending on a number of conditions, it all depends. However, from what’s observable in the language, all variables are pass by value. There’s no way to pass anything at all by reference, primitive or not, i.e. you can modify a field of an object you were passed but you can’t change the object.


A hashtable of all strings made in the program.


So the naming is super confusing in these cases and the best way to get out of it is say "the references are passed by value", but... technically he was right. In JS everything's passed by value. It doesn't matter that those values are references. Pass by ref would mean that "function foo(a) {a='123'}; b=''; foo(b)" would change the value of `b`.

Every popular language which allows pass-by-reference makes those places very explicit (like ref in c++ and c#)
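Spelling that inline snippet out as a runnable sketch (plain JavaScript, hypothetical names):

```javascript
// True pass-by-reference would let foo rebind the caller's variable.
// In JavaScript it cannot: the reference itself is copied by value.
function foo(a) {
  a = '123'; // rebinds only the local copy
}

let b = '';
foo(b);
console.log(b); // '' -- b is unchanged, so this is pass by value

// Same for objects: the callee can't replace the caller's object.
function swap(o) {
  o = { replaced: true };
}

const obj = { replaced: false };
swap(obj);
console.log(obj.replaced); // false
```

In a language with real by-reference parameters (e.g. C++ `&` or C# `ref`), the equivalent of `foo(b)` would leave `b` set to `'123'`.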


>but... technically he was right. In JS everything's passed by value. It doesn't matter that those values are references.

Yes, technically I know this, but even so he was technically not right, because he still said there was pass by reference and pass by value in JavaScript; it's just that his description of what happens in pass by value is what is normally described as "pass by reference", and his description of what happens in pass by reference is what is normally described as "pass by value".

I think we can agree that, given that he used both terms and mixed up their meanings, he was not "technically right".

On edit: meaning, if he had said "we pass everything by value in JS, but some values are references; what happens then?" he would have been right. But when he said we pass objects by value and primitives by reference, what did those two terms mean to him? He accepted the description of what happens when an object's reference is passed as correct, but insisted that was called pass by value. And he accepted as correct the description of what happens when a variable has a string assigned, that variable is assigned to another variable, and the first variable is then changed, including the ability to change the value of variable A without thereby changing the value of variable B, but insisted that this process is called pass by reference. I intuited through this conversation that he was unfortunately not "technically correct".


Should've been more clear: it was only a response to the part about objects being passed by value, which is correct. Yeah, he was obviously confused about the other parts.


In that case you won't do well in the interview because "bad attitude" or "lack of soft skills".


Well, that's just it. Too many interviewers use it as a platform to flex how awesome they are. The proper response is to ask, at the end, a few probing questions about where they get to apply such skills day to day.


Unrelated, but have you perhaps done anything with nonstandard analysis on graphs, or in spectral hypergraph theory? Most uses of NSA on graphs require infinite graphs; how does that work when the spectrum might not be defined?


Hahaha! I only crammed some dense jargon into a sentence to give the air of expertise... it's a bit of a trick, finding a combination of math terms that doesn't refer to an actual field of study.


Hahaha, I know what you mean.

It's been a while since I've looked at NSA on graphs, but it's an interesting field of study. For something of a taste, an alternative proof of Kőnig's lemma might look like:

- Start with a locally finite, connected, infinite graph G.

- Choose any nonstandard extension G* of G.

- By the transfer principle (basically just logical compactness), there exist hyperpaths [0] of unbounded (hypernatural) length in G*. Pick one, P*.

- Restricting P* to G you obtain some path P, which is the infinite path you're looking for.

I settled into industry instead, but that's the sort of thing I'd like to study if I ever go back for a PhD, hence the interest in those sorts of ideas applying to spectral theory.

[0] The "transfer" of a path isn't actually necessarily a connected path in the usual sense, but it's indexed by the hypernaturals, and each well-ordered countable segment is connected. I'm skipping the entire intro that makes those operations make sense.


Dammit, I had hoped I'd nerdsniped you... but the nerdsniper is you and now I'm curious!


Well, you did nerdsnipe me too :) I haven't looked at this in a while, and my curiosity is re-ignited.

The most basic style of proof in a lot of nonstandard analysis is (1) lift your problem to a nonstandard space, (2) prove something interesting in that space, (3) hopefully project something interesting back down to the problem you actually care about.

E.g., in nonstandard real analysis you can look at a real-valued function like f(x) = x^2, pick any epsilon z, and compute the hyperreal analogue (f(x+z)-f(x))/z = 2x + z. This is within some infinitesimal of 2x, so you use some machinery you've built up to conclude the derivative of the real-valued function is 2x.
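Written out in full (a sketch; here z is a positive infinitesimal as above, and st denotes the standard-part operation):

```latex
\frac{f(x+z) - f(x)}{z}
  = \frac{(x+z)^2 - x^2}{z}
  = \frac{2xz + z^2}{z}
  = 2x + z \approx 2x,
\qquad
f'(x) = \operatorname{st}(2x + z) = 2x .
```

The "machinery" mentioned above is exactly the standard-part map, which projects the hyperreal result back down to the real number 2x.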

The graph lemma above had a similar flow. Create G*, find something interesting, project it back down to G, finish the proof.

That's certainly not the only proof style. Nonstandard topology combines basically all the normal compactness theorems into one, for example, and that's a bit more intricate.

Even such crude techniques can bear fruit quickly though. Menger's theorem was proven in the early 1900s, and only extended to infinite graphs in the late 1900s. That 3-step proof process with nonstandard graphs makes it a bog-standard freshman exercise for locally finite infinite graphs, and only a bit more involved for the full generality originally proven by Erdos and friends.

I don't have any deep insights beyond that. The Springer GTM series has a nice intro to nonstandard analysis (not actually graduate-level IMO, a mid/advanced undergrad could use it, which is a nice level for that sort of book), building it up in a way that you could probably work with nonstandard extensions of other infinite structures (like graphs) without needing many/any other resources, especially if you've done much with building models of other theories via set structures.


> (3) hopefully project something interesting back down to the problem you actually care about.

Indeed, and this is the step that standard mathematicians tend to balk at.


Murphy's law is about things going wrong. But nothing can go wrong when encountering someone who knows more about something than you. You only stand to gain.


You gotta be careful. Some interviewers, especially the ones who are going to be peers, or worse, a peer of the hiring manager, might have mixed incentives to avoid hiring someone who could show them up.

I feel that happened to me once when I interviewed for a Java job at a stodgy health insurer. The interviewer tried to test my Java, and it quickly became obvious he was really very much a Java beginner and that I could run circles around him, correcting his misconceptions. I was polite about it but naive, and it quickly became obvious he was offended; he gave inaccurate feedback.

Another job, one of my rounds was with a peer of the hiring manager, and he did not ask me anything really beyond introductions, and then he lied and claimed he had asked me several technical questions and I'd failed them, which did not happen. I got that job anyway and accepted the offer, which was a mistake.

So actually, you probably don't have to be careful, because this is a good way to avoid a bad job. Unless you're desperate and need to feed the kids or something. Then feel out the interviewer, and do well, but not _too_ well. Don't make the interviewer feel stupid. Save that for after you've been working with them a while and have built up social capital in the company.


> can I assume the graph contains a Hamiltonian circuit?

Many interviewers will likely ask you: what is a Hamiltonian circuit and can you think of a solution that doesn’t contain a Hamiltonian circuit?


OP should start interviewing just to record this exact scenario - then share it here for the sweet, sweet schadenfreude.


Sadly people with power are immune to shame.


I suspect that is one of the reasons they are in power. Shame is what keeps us plebeians in place.



I love this book, but what is your point? :)

Exploiting shame is a valid strategy, and defencelessness is a weakness? People are feeling shame because they're inexperienced in power relations? Power relations are fun if you view them as games?


> Shame is what keeps us plebeians in place.

>> The first step in becoming a top player is the realization that playing to win means doing whatever most increases your chances of winning. That is true by definition of playing to win. The game knows no rules of “honor” or of “cheapness.” The game only knows winning and losing.


We are more populous. Shame we didn't organize/unionize when the time was right.


We did, but then they managed to convince us that the unions weren't on our side. But there's not much stopping us from organising ourselves again.


But what pushes us forward? Maybe fear?


historically, hunger and fear.


An African swallow or a European swallow?


Those who don't get the reference should immediately turn in their "I'm a nerd" tee-shirts.


People who quote Monty Python aren’t nerds, they’re just old.


Maybe so, but at least my kids would get that reference.


At least you have kids… We only have an empty hole in the ground covered by a sheet of tarpaulin, but it’s a house to us!


A hole!! You were lucky, we slept naked in the middle of Death Valley and died twice a day before going to school.


I'd dispute that in the UK at least, I think most people of a nerdy disposition tend to at least be aware of the films.


So true! Nerds would quote Adams or Gibson.


nobody expects the spanish inquisition...


One has to know these things when you're a king, you know?


Not if you’re the Burger King :)


Darn, I just bought it too.

Unfortunately (after "cheating" and looking it up) Monty Python was a bit ahead of my time. Or at least outside of my community circle.


or they're just younger than you


Or they are one of today’s lucky 10,000.

https://xkcd.com/1053/


bro it's just a reference to a not-so-funny Monty Python skit. not a big deal by any stretch.


He’s not your “bro” little mister, he’s clearly your elder!


sorry sis but i don't care if he's old or not. monty python is boring and if you gatekeep people using "jokes" from it, you should reconsider your life.


Reminds me of the time an interviewer tried to get me to walk through an efficient solution to elevators, so I just proved it was equivalent to travelling salesman.


The answer to this metaShibboleth is only in an Adams space. There are 42 of them, but they must be specified.


In general, people are completely uninterested in experience that they don't understand, I've found. They don't want to even ask about it because it would showcase that they, gasp, don't know something that you do.


> In general, people are completely uninterested in experience that they don't understand...

It depends on the interviewer. I have colleagues who are risk averse. They want to stick with the tried and true. I, on the other hand, am a bit of a risk taker. If you told me about something that I knew nothing about, and it was a legitimate way to improve things, you will have peaked my curiosity. I would immediately want to know more.

Also, it helps if the hiring person is an experienced dev. In my org, managers do not participate in the hiring of developers, other than background checks and verifying references.


One of the other things that I was thinking of was the notion of humility and curiosity. For me personally, I like to brainstorm and improv a bit and then shrink down to a proper design or method. This type of process is extremely difficult to communicate in an interview if the interviewer is either not curious or doesn't possess humility or both.


The idiom uses "piqued".


For all intensive porpoises it's the same word. Just spelt different.


Damn, I read that and thought "something is wrong with it, but at least they didn't write 'peeked'". Thanks for reminding me of the correct one.


True. 9/10 of the interviewers I have met only focus on exact experience by matching keywords, and they won't be able to identify superior candidates with slightly different experience. The reason is simply time and effort.

The upside of this is that being able to position yourself in a hot niche will get you tons of interviews without even applying. The downside is that careers become extremely path dependent, which is a bit scary.


Hell, a lot of the times I don't even have keywords anywhere on my resume or profiles, and yet I get interviewed and then asked about said missing keywords. It's bizarre.


Or they wouldn't know if your answers could even be trusted; they need to be able to validate your answers.


At UC Santa Cruz I had Gerhard Ringel[1] as a professor for graph theory, and talked with him outside of class as well. Spent a lot of time going through his book.

Some concepts I've used, but being asked random leetcode graph questions, meh. The current interview paradigm needs to change. It'd be fine to be told a couple of days beforehand that it'll cover x and y, rather than being dropped into a random problem. That is not what our industry is like.

Gerhard was an amazing teacher and taught me a lot. Sadly I’ve had to use it little.

[1] https://en.m.wikipedia.org/wiki/Gerhard_Ringel


I LOL’ed at “because I don't use react angles or something”


> As in, a person who has used and learned a dozen plus languages but only tacitly used C++ suddenly will be a complete invalid when trying to write in C++?

I've switched languages professionally at least 5 times, used probably 5 more for extended periods of time, and wrote a decent chunk of C++ "back in the day". I'd say C++ is the least suitable for "learn on the job" approach out of any language I can think of (I'm lumping C in there): so many footguns all over the place and very little to guide you to the right path.

They are at fault for even starting the conversation without making it obvious that it's a hard requirement.


I generally agree with you, but I think it depends on the team. If the team is just "using" C++ but aren't good software developers, then yea, I totally agree that having a non-C++ expert join the team is going to be a rough ride for everyone. But if the team's software architecture and coding practices are solid, which probably means they use a subset of C++'s vast feature set in a very clear way, then one probably could jump in just fine.

In a way, them only accepting C++ experts probably means they're either doing something actually very complex with regards to C++ itself or their code quality is a shitstorm.

> They are at fault for even starting the conversation without making it an obvious deal breaker.

That is definitely my feeling. My resume is quite clear about my experience and tools.


The problem with C++ is you just don't know what you don't know. But you know there is a lot of it. A good framework certainly helps but it doesn't solve this basic problem.


I think by the time you’ve learned 12 different languages you’ll realize when something is hard enough that you need to take a step back and read some stuff first before diving in.


Nice that you mentioned it. Just a few weeks ago I didn't even know abstract syntax trees (ASTs) existed, and I had exactly that experience when building some stuff that works with them.


"But if the team's software architecture and coding practices are solid, which probably means they use a subset of C++'s vast feature set in a very clear way"

So... C? =P

Sorry. But my point is that I think there are really very, very few C++ places that could say their code is described by your statement. Not helped by the fact that I think there are really very, very few C++ places at this point in the first place.


> So... C? =P

Without the string handling API, the always-unsafe casts, the global state hidden in its standard library, the complete lack of automatic memory management... Most of the bugs I run into in badly written C++ code turn up in places where someone had the bright idea to go C without good reason.


There still are places slowly enhancing their C codebases with C++.


There are a small number of high-end software firms doing this. "Slow enhancement" generally translates to "maintenance". The exceptions are a few prominent mega-caps.


> In a way, them only accepting C++ experts probably means they're either doing something actually very complex with regards to C++ itself or their code quality is a shitstorm.

If you aren't doing something complex then you don't need C++ today; just use Rust.


I once jumped into a C++ low level dev role having not written C++ in 20yrs (as a teen writing video games).

I think the benefit of experience like the OP's is that you have a general understanding of the scope of languages and complexity, and of how to seek out answers.

I found that focusing on writing modern (as modern as was allowed) code using the most up to date patterns and methodologies meant that my code was generally better than peers that had been hacking C++ for 10yrs but developed bad habits or simply preferred riskier styles of coding.

I don't think C++ is special in being "hard". In fact, the language is so flexible that you can use one of a myriad paradigms that allow for fairly hassle-free coding.

The complexity is usually around the constraints of what you're coding because if you're writing it in C++ it's probably meant to be tiny, fast and memory efficient. That also implies that the problem itself is one that lends itself better to reasoning around as opposed to 42 levels of inheritance in a Java system.

I don't think _every_ developer could switch to C++ but if one of your say 5 languages is unmanaged then it's not rocket science making the switch.


Same. I wrote C++ "professionally" for ~5ish years out of my 25 year career and would only consider myself a novice in the language.


Been doing C for forty years and feel the same.


Beginner: I have so much to learn ...

Intermediate: I know everything!

Expert: I have so much to learn ...


> I'd say C++ is the least suitable for "learn on the job" approach out of any language I can think of

Anecdote: I've got a couple languages and decades under my belt, and a very simple C/C++ Arduino project is making me doubt my sanity. Ex:

    Serial.printf("%s", String("Hello world...")); // Emits 4 nonsense bytes... But shorter literals work fine.
________________________

EDIT: To save readers some time--I didn't intend to nerd-snipe [0] y'all, honest!--the answer has been found, and it is probably:

    Serial.printf("%s", String("Hello world...").c_str() );
The shorter payloads like "Hello World" happen to give the desired results only as an accident of optimization code.

[0] https://xkcd.com/356/


Your problem is that whoever designed that Serial class is incompetent and should be shot and their body defiled in unspeakable ways.

The C-language varargs concept and the C++ object model are not compatible. C++ provides varargs only for backwards compatibility with legacy C libraries. New APIs written in C++ should not be using it because passing a C++ object through it is undefined behaviour. No amount of rationalizing (as all the comments downthread are doing) is of any value: undefined behavior is undefined.


Yet GCC will still apparently let you call this function without so much as a warning. This should be one of the easiest mistakes to find statically.

Clang at least errors out here.

C++ is a mess.


Well, if the designer of the class used the right annotations GCC will issue a warning if warnings are enabled. This is third-party code and nothing to do with GCC per se and the language itself does not require any kind of compile-time warning for undefined behaviour. Then again, most people seem to disable or ignore the warnings the toolchain gives them. You can make anything idiot-proof but there's always a bigger idiot.

It sounds more like a crappy library designed by people who don't know their craft. It's easier to blame the tools.


It’s Arduino so it’s C++ through the looking glass. Some unusual choices about core classes and syntactic sugar via a bit of preprocessing, plus the opaque error messages and core dumps we all know and love, but on a tiny device with a serial connection.


Is it even possible to core dump on AVR, with no storage and MMU?


Seg fault, I guess? It’s been a while.


AVR has no protection and no segments. I mean really, it seems to be true that the new generation of programmers has zero understanding of hardware; nothing personal.


Nothing personal taken. If I were something more than a hobbyist wrt hardware, then I might.

I was having this conversation with someone the other day regarding GenAI. The expectations for understanding the lowest-level concerns have changed generationally. Today's hardware wizard might prize their old-school understanding of fundamentals but would probably be rubbish in 1948.


It is completely legal to pass a C++ object through varargs.


Will gcc catch that? The GCC compiler knows about "printf" as a special case. But "Serial.printf" may not have a definition with the annotation GCC needs to check parameters.


also avr-gcc is several major versions behind, isn't it?


Since it's using the String class, this is likely not being compiled by avr-gcc. Or at least I hope OP isn't being so masochistic as to try to use String on a Mega328.


you're right, it turns out it's an esp32


I imagine you got that code snippet (or a similar example) from somewhere, but to me the fairly obvious problem is the friction between the C and C++ worlds. %s is for C-style strings, and I have to imagine that printf function lives in the C world. The String("Hello world…") is an object in the C++ world, so expect weird behavior when you try to combine them. As you say in your edit, SSO will make this even weirder.


Random 4 bytes sounds like reading a pointer as something else, shouldn't be too hard to debug.


If you're really asking about that code snippet...

I don't know that library, but have you tried this?

  Serial.printf("%s", "Hello longer world");


Oh, that works, but the Arduino String library had some features I wanted. Its docs have an explicit example of putting a literal (an even longer one) into the constructor, so... Mysteries!

My default expectation is that it is a footgun that I do not understand.

The alternative is that I've stumbled across a very rare bug in some popular libraries, or else my hardware is cursed in a very specific and reproducible way.


Looking at [0], I see a "c_str()" method. Maybe give that a shot?

  Serial.printf("%s", String("Hello world...").c_str());
I actually don't see a "Serial.printf(...)" method in these docs [1], but I do see it here [2]. I'm not sure I'm looking at that right docs for your library though.

[0] https://www.arduino.cc/reference/en/language/variables/data-...

[1] https://www.arduino.cc/reference/en/language/functions/commu...

[2] https://docs.particle.io/reference/device-os/api/serial/prin...


Just to answer the mystery, it seems the foot-gun is that smaller String()s appear to work "by accident" due to an optimization, and I have to call a method on the String object before passing it onwards. [0]

> I actually don't see a "Serial.printf(...)" method

I think it's coming from the ESP32-specific libraries. Some cursory searching didn't find the spot, but it may be some magical preprocessor directive stuff.

[0] https://github.com/espressif/arduino-esp32/blob/7a82915de215...


Maybe what's happening is the `printf` function is interpreting the memory address of the `String` object as if it were a pointer to a char array. This leads to it printing seemingly random bytes (which are actually parts of the `String` object's internal data structure) instead of the string content you expect.


This sounds kinda plausible.

IIRC, some string implementations have separate implementations for very short strings vs. longer ones. Similar thing for vectors.

I also see some size-related logic in the (current?) String implementation [0] [1].

String is a class with no virtual methods, and its first data member is a pointer to the underlying buffer [2]. So if Stream.printf unintentionally type-puns String to char*, it might work out okay?

[0] https://github.com/arduino/ArduinoCore-API/blob/master/api/S...

[1] https://github.com/arduino/ArduinoCore-API/blob/master/api/S...

[2] https://github.com/arduino/ArduinoCore-API/blob/cb3ab4c90d71...


Although sizeof(String) > sizeof(char*), so any subsequent arguments to printf(...) would probably be screwed up.


Thanks, yeah, that seems to match this SO answer [0] which refers to some "small-string optimization" in the ESP32 libraries [1].

So basically the foot-gun is that smaller String()s appear to work "by accident".

[0] https://arduino.stackexchange.com/a/90332

[1] https://github.com/espressif/arduino-esp32/blob/7a82915de215...


It gets worse.

You're only printing 4 bytes because your string is sufficiently short, and the next byte it's reading is 0, since your capacity is small.

If your string were about 17 million bytes long (0x0101'0101 == 16843009 is the first value that causes problems), then your address, capacity, and size would all likely be nonzero, in which case your Arduino would just keep printing bytes until it lucked upon a zero, or overran its buffer so badly that it started trying to read different segments of memory and eventually segfaulted.


> your Arduino would just keep printing bytes until it lucked upon a zero, or overran its buffer so badly that it started trying to read different segments of memory and eventually segfaulted.

I don't think I've ever used an Arduino core on something that had an MMU. It would happily keep printing all the way to the end of memory, and either stop there because it's zeros after, or wrap back to the start, depending on the board. I have written code to dump out every byte in memory over serial before, just for kicks.


This is more of a C question than a C++ one.

In a C++ library you would probably make the format function type-safe. But both Clang and GCC have annotations you can add to functions that take format strings, so the arguments are type-checked.


eh, forget all this coding minutiae, i dabbled with arduino here and there.

in my experience it's often not the code. your serial/USB connection runs the bytes back through the serial monitor (typically the IDE on the PC). The IDE has config settings like baud rate and encoding that you have to set. If that config is messed up, your printf output comes out as garbage bytes. Make sure your baud rate is correct.


> "what's your experience with C++ or advanced graph algorithms?"

I once got an interview to help fix an Elixir project. They were having issues with the Websocket module of Phoenix, called Phoenix.Socket. It's a 1000-line piece of code within a much larger framework.

The person who interviewed me brushed aside the fact that I had worked 6+ years full time with Elixir and had almost 20 years in this career, and just wanted to know, in three different ways, "what is your experience with Phoenix.Socket?", "how many times have you used Phoenix.Socket?", "how many years would you say you've used Phoenix.Socket for?". That was literally the only metric they used to judge a candidate. Experience and knowledge don't matter, just how well you fit their ridiculous expectations.

It'd be like focusing on your experience with "iostream" in C++. Incidentally, only a bona fide liar will somehow tick all the boxes and stupid requirements.

This is the state of tech recruiting in the past 5 years. It is mind boggling.


A while back, this kind of experience in an interview was a gift, as in: you don't want to work at that place, and they just told you so clearly.

But now maybe things are getting more real and you just need to eat.


It’s been that way for 20+ years. It happens when companies don’t let the experts do the interview, or try to filter beforehand using HR or other cheaper labor.


In my experience, C++ programmers have the smallest range of programming language knowledge. Basically, their whole world is C++. They have a hammer, and everything looks like a nail. "But performance!" they declare ... when you are writing a single function to reverse a string: "Can I use Python?" In my career and domain, I have never once seen a graph algorithm used, yet 5% of posts of HN are all about them. I guess I am an idiot.

Another thing about C++ interviews: frequently, their difficulty far exceeds anything they are using in their own code base. That can be hugely embarrassing after you slog through a brutal few rounds of C++ interviews and land that offer. A few weeks later, you are rolling up your sleeves, waiting for the first git clone to complete. Then you feast your eyes on the puddle of dog doo that they call uber-C++-2027 or whatever they claim. "Oh." Such defeat.


there are two types of self-proclaimed C++ wizards:

- the one who accepts that the language has infinite complexity and takes a lifetime to master, which makes it ill-suited for most real-world tasks, because there are domain-specific tools with the same performance that are easier to use

- the one who accepts that the language has infinite complexity and thus can serve any purpose, so there's no point in working with anything else; just keep improving the C++ subskills relevant to the task at hand

some folks of the second kind have a hard time accepting that mastering C++ is not, in fact, the only way to build better software.


I got my first job (25 years ago or so) after reading a C++ book and doing the assignments. To get the job I had to take a C++ test made by the most experienced developer there. The code I eventually wrote used none of the tested language features.

These days I often work with a mix of C, C++, Python and Rust. I try to avoid C++ because people writing C++ (IMHO) are not writing code that is easy to understand. They usually write code which is efficient but hard to understand, even when efficiency is not part of the requirements.


Does that company or recruiter hire H1Bs? In order to get H1B approval, you have to make the case that "we interviewed x,000 people and we just can't find any qualified applicants!". I'm starting to suspect that the industry has learned to set salaries low and churn through enough applicants to reduce costs. One way to churn them is to do a phone screen and find a quick way to legally get rid of them. Then once you get the type of applicant you want - one who happens to work for 20% less and never complains because his residency is tied to his employer - simply don't ask them the question.


This has been going on for decades.

1. Find H1B candidate you want to hire.

2. Write job requirements matching that candidate’s experience so well it’s very unlikely for anyone else to meet those requirements.

3. Advertise position to meet legal requirements and reject any candidates not exactly matching requirements.

4. Hire H1B candidate.


Note that #3 is one of the "let's see how far we can bend that definition" things too. You will occasionally see these listed in local newspaper classified postings ( https://imgur.com/W76Jdbn is one such example).


The worst, most bad-faith case I've seen of "advertising" an H1B job was of the job posting printed out, taped to the back of an interior office door, and covered by the recruiter's hanging overcoat.


This worked until 2014 or so.

Later it became completely impractical for serious candidates, because the H1B lottery got so flooded by companies mass-hiring Indian candidates that no serious company will spend time on hiring you, given they have only a 20-30% chance of actually being able to get you into the US.

So the only H1B hiring left is mass hiring, where the company does not care whether you or someone else gets through the lottery, as long as 20-30% of their candidates make it through.


This is sometimes done by asking for five years of experience in a library that has only existed for three. Don’t bother applying at companies like that if you’re not on a visa.


[reference needed] - I read a study from a few years ago suggesting that H1B Visa hires on average get paid more than hires from inside the US.


As it's an actual requirement to publish the salary of an H1B employee as part of LCA disclosures, the data is collated every year; it's no secret what anyone on an H1B earns (if you are on an H1B you can often find your own individual record in here):

https://h1bdata.info/

My own take: it's complicated. From my own direct experience, there is a world of difference between how large companies (the Microsofts/Apples/Metas) typically use the H1B and how, say, a medium-sized Fortune 500 company IT department might use it. There are absolutely ways to abuse the process, and the types of jobs being worked on an H1B vary enormously in type, quality and salary - just explore the data above. The larger companies pretty much always hire on the same terms as any other American. At smaller companies, it can be a lottery.

Salary isn't the only means of exploiting an H1B hire either, to be clear. There are lots of other tricks, like slow-walking immigration processes and exploiting the fact that it is harder for an H1B employee to change jobs than a citizen. Being skipped over for a promotion to avoid paperwork issues with the LCA is something I've seen first hand.


Thank you for this link :)


H1B should not exist in an oversaturated job market like this one. Most H1Bs don't do anything so specific that the company truly couldn't find someone in the US to replace them. Most of them are writing web services, not creating the next AI, distributed ledger, etc.


For employers they’re desirable: H1Bs take a lot of abuse and rarely leave because they’re tied to the job. And they’re paid a lot less, so the company saves $$$.


Is that including lower job titles for the same jobs, education, and years of experience?

H1Bs sometimes get skipped over for promotions specifically because it would mess with the paperwork, which across an industry would definitely skew these statistics.


That’s a requirement of an H1B, in fact.


It’s a nominal requirement, but a little research shows how you can work around it.


> While tough, it's often a good thing for the applicant as a natural filter. If someone can't hire well, it's not a good place to work.

But for people like the guy who wrote that article, eviction eventually becomes a problem. And so many companies can't hire well right now that in a market with declining openings he might not be able to wait for a company that can hire well.


That is definitely true. And a lot of the jobs are jobs the person would do well in, but the employers don't bother to see it. I know there are jobs I would have done extremely well in, but the companies were just black boxes. They sit around being unproductive while they wait for someone to tick some arbitrary checkboxes. It'd be like trying to hire a farm hand but instantly rejecting them because they had only driven a different manufacturer's tractor.

As another anecdote, I applied for a job whose project was much simpler than several of the things I had done in past jobs. It was a job I know I could almost do blindfolded, so to speak. But they would literally not even speak to me because I was missing a certification (a useless one, not a real certification like professional engineer or architect) that they were for whatever reason requiring. I even mentioned to the recruiter that I had had the certification but let it lapse because there was no reason to keep paying for it, and that I knew several people holding the certification who knew the language and area less than me. Didn't matter.


> And a lot of the jobs are jobs that the person would do well in, but the employers don't bother to see it. I know there are jobs that I would have done extremely well in, but the companies were just black boxes.

Oh yes. One sticks in my mind. All of these details were present in my resume.

Job Ad:

> Expert level health insurance system knowledge and experience interfacing between EHR providers and partners.

Me: Director of Product for a claim benefits management software company (i.e. the software that insurers use to run their business, cut checks to providers, process premiums, calculate deductibles, etc., repricing, the whole nine yards). Also have worked extensively with EPIC, Cerner, ESOsuite. Have also worked in platform development, third party API integrations.

I get that it was probably the boilerplate, but stung that twenty minutes later:

"We apologize, but we are looking for someone whose skillset and experience better aligns with our requirements for this position."

Huh. I may well not have been the ideal candidate, but not sure how much more closely my experience could have been "aligned".


The software job market for the past ~1.5 years has basically been an industrial belt-sander for candidates' self-esteem.


Generally recruitment tools have 3-4 options to pick from, all of them carefully worded so that the reply to the candidate cannot be used against the company.

In your case it looks to me like you were over-qualified. That's a real thing, since it generally means there will be salary issues later in the process, or you will be too senior compared with the people you'd work with (and leave).

It's frustrating, but they did you a favor. Keep it up; something will turn up.


It's extra frustrating because after months of not working, I'm getting rejected for "too much experience" when applying to a lower-level job, getting "not enough" experience for senior-level jobs, etc...


The big problem seems to be that whatever hiring filters are being used for jobs now are completely broken. You can have a CV and history that exceeds the requirements of the job (based on the description) by miles, yet end up either ghosted or given a simple form based rejection letter.

Meanwhile you can apply to a role where you meet maybe one of the requirements, and then wind up with an interview. It's completely backwards, and makes me suspect that whatever system is being used to filter out applications simply doesn't work. That either the recruiter sorting them or the AI system being paid for is somehow doing worse than random chance.


> Meanwhile you can apply to a role where you meet maybe one of the requirements, and then wind up with an interview.

I've seen this pattern too, where I realize that one keyword on my resume got them to contact me.

Do they want spam?

Because this is how they get bored unemployed engineers to automate filling in hundreds of applications a day stochastically littered with keywords, potentially making the overall problem worse.


To be fair, the awfulness of job boards and the current hiring system is probably inspiring a lot of the same automation, even without this insanity. When your choices are 'fill in dozens of applications a week and hope one pays off' and 'spend a long time filling in one every day/two days, only to find the company doesn't even read it', then it becomes extremely easy to just treat it like a soulless box ticking exercise.


I was recently turned down in the first interview, with an HR head. The reason: I said I can sometimes be perceived as blunt in an argument, as an answer to the classic "tell me one of your shortcomings".

I'm not saying it can't be a valid reason. I guess I just don't get their values.

I'm convinced I would have been a pretty good match technically, but never got the chance to show it.

Italy, opening had max €50k budget.

I have 17 yoe and obviously it was not my first rodeo. Maybe I finally learned that sincerity and transparency are a dumbass move.


I am also typically overly transparent and honest about my existing on-the-ground experience, even though I also have a lot of experience learning new things. It doesn't always work out, as people are often scared of transparency. They often get the feeling that there's something "hiding" behind it, which is quite unexpected. The same people will view absolutely opaque people as honest and upfront. Such is the human condition.


I never know what to reply to those "name a bad thing about you" questions. Being honest seems like a terrible idea, but then what would an acceptable "bad thing" be to lie about? Do I say the cliche of "I work too hard" that nobody believes?


Pick a minor flaw, and immediately start expounding on what you do to mitigate it. For example, "I can't always keep all of a complex thing in my head, so I've learned to do X, Y and Z to keep up -- Z is a really cool tool, actually, have you heard of it? I've found that, in my career as a..."


That’s right, answer a bullshit question with a bullshit answer. Because this is merely an exchange in a larger game.

All these people complaining that they got turned down for being “transparent” actually failed a real test: given a toy situation between people that doesn’t involve code, can you figure out what to do to accomplish your goal?

And if someone starts listing their top defects, the answer is “no.” Could be due to nerves, sure. But for a mid-career professional to not have known what to do to get past HR is a failure in problem-solving. Luckily it is solvable with the right mindset.


I'm the one above complaining for being "transparent".

You're right, I failed that game.

I'm fully aware I suck at playing the de-facto state of professionalism, or lack thereof, in the job market.

Fuck me for trying to be an adult trying to adult with other adults, I guess.


The job market can remain irrational far longer than most people can stay solvent.


Or the job market remains agile while people become stateless (or homeless in extreme cases).


Add to that the ageism, and it is a real bummer trying to jump into jobs where the required technology isn't the very last thing on your CV, regardless of experience in previous years or open source projects proving otherwise.

I have long learned that the only way to switch technologies is via consultancies, and being lucky enough to land on those projects where the team is sold as having "experience".


In my country, last year it was definitely a candidate's market: I had recruiters reaching out all the time and I got the first job I applied for. This month I've been applying for jobs and not getting interviews.

From the conversations I've had, it seems recruiters want someone who is an exact skill match for the job. They don't care what else you have done or how many years you have under your belt; it's gotta be the exact list the employer wants.

I'm now optimising my resume (CV) for the job. I summarise the stuff that I think recruiters / employers don't care about.

The other thing I've noticed now is that when a recruiter reaches out quite often that role is not listed publicly anywhere. So your profile on the job systems - linkedin and elsewhere - better look real good or you won't get a call.


I've helped hire three different developers where I work now, and been a part of countless interviews. I've found it much more beneficial to look for people who think like programmers than know any given language. Unless you're talking really specific, deep stuff in a given language, the syntax and whatnot are trainable. What you can't really train people to do is take a large task that we want our software to accomplish, and break that up into pieces or steps that can be built. Nor can you teach the basic pragmatic techniques that go into things like using objects and classes.

We hired on someone who had barely touched Swift as he'd been out of the iOS environment for many a year, and even before that had never done a ton of app development, but he had solid fundamentals in other languages so I went to bat for him and got him hired. Not even 4 months later he's a top contributor on our team.


Spot on. Too many developers think that their main skill is recall of language/platform/tool-specific niche arcana, and while it's true that sometimes having that reduces friction, it's rarely what actually drives things forward.

Arcana are concrete and relatively easy to test for, though, so my theory is that it's a bit like the story of looking for the keys by the lamppost because that's where the light is, even if you dropped them somewhere else.


I feel like this is a pretty common opinion among HN comments, and yet the vast majority of interviews follow the known-bad pattern instead.

Are HN commenters simply too rare to make a dent in the larger hiring landscape, or are they not walking the talk?


(Didn't see this reply and it's an interesting question so excuse some necro)

For my case above, this dev's experience with iOS was so minimal he didn't even have it on his resume; he listed himself solely as an Android developer (but like most places we develop for both, so it was useful experience regardless). I have a strong feeling most HN folks would absolutely interview like I do, but the problem is the interview is the last step of an otherwise highly bureaucratic process that is more or less entirely devoid of technically-minded people. Even the recruiter that got me my job many years ago, bless em, I love where I work, but even that recruiter didn't know shit. They found me because I specialized in a lot of the things my employer was after, and that sounds alright, but it was based solely on the keywords: Swift, Objective-C, etc. A recruiter, for example, won't understand that someone fluent in Objective-C, while they're going to have an adjustment period, could probably competently write C, C#, or C++ as well with some help and training.


If the internet is to be believed the average software engineer changes jobs every couple of years.

If that's true, it makes some sense for a company to want to hire only people whose skills exactly match the specific thing they are hired to work on. If they think the new hire is not going to be around long term, why put resources into teaching them new skills?


Oh hey, I've heard this joke before -one manager says to another "But what if they leave after we train them?"

The other manager asks back "What if we don't train them, and they stay?"

Not investing in people (and jobs) is a two-way street; there's always someone young and naive enough to think hard work and investment will be rewarded, and most companies have been around long enough to have set the assumption that "No it fucking doesn't".

The new guys need the most investment. Companies hiring are actively teaching them to be jaded by not investing in their employees.


that's a chicken-and-egg problem. People change jobs frequently mostly for two reasons: 1. higher pay, 2. to get away from bad management or a bad work environment (same thing really).

If it's the kind of place that doesn't help train new skills, that falls under "bad management". Employees could collectively try to be the better party first in fixing this, but most modern history shows that most employers, given an inch, will take a mile: they will generally pay the least amount possible, expect one-sided loyalty, and overall get away with everything they can until either regulation or market forces force them to change.


>it makes some sense for a company to want to only hire people whose skills exactly match the specific thing they are hiring them to work on

The good ol' "10 years of experience in Swift" approach... Though that joke is so old now that it probably is possible to legitimately have that.

That approach would make sense if the requirements seemed possible to begin with, and if the salary were enough to attract that kind of niche talent. You're basically asking for a consultant at an employee's salary at that point.


> The good ol' "10 years of experience in Swift" approach... Though that joke is so old now that it probably is possible to legitimately have that.

Just about. Swift was released on June 2, 2014.


"If someone can't hire well, it's not a good place to work."

But nobody hires well, so where can I work?


hijacking this comment to apologise for missing your reply elsewhere until it was too late to respond.

(Won't escalate the hijack by fully replying here, but TLDR: yeah - no policy can fully remove bias, but we lean heavily on diverse peer review to at least catch it, and we seek and adapt to feedback)


I was approached through a recruiter for a job with a 100k-150k pay range, but I didn't get to interview because I had "too much experience" and they wanted someone with 2-5 years experience in the 120-130k range. I had experience with everything on the job description, even the nice-to-haves section. Make it make sense.


Late to this thread, but on the face of it I'd assume they didn't want to pay more than that range for a few years, if at all. Sometimes when they say "too much experience", what they really mean is that they want a senior developer or platform architect on a junior's salary.


As a lifetime contractor, I felt this all the time. So I spoke to the boss of one of the recruitment agencies and said, hey. Isn’t this a job I’d be good at? Recruiting? Because I actually know what the hell I’m on about?

And he said, sure, and I’d hire you! But you’d be bored to death within a week because every day is exactly the same.


The older I get, the less I understand the "you'd be bored" excuse for not hiring someone. Is that boss really jumping out of bed every morning, excited to talk to more candidates of varying quality and move them into other, mostly boring, companies? And to have boring meetings about what metrics to hit?

Restraint and overcoming boredom is one of the basic essences of growing up. You'd think management would understand this.


I just think he knew that I wouldn’t last.


Spot on.

I have been interviewed for X (as they do), got hired, and ended up doing a completely different Y in almost all of the jobs I've had. Yet most of the job interviews I've had felt like they were looking for a very specific thing, and if I don't have exactly that, I'm no good.


It is funny - I tried to hire people, just posted the minimum requirements and salary, and got no response.

The other day I had a tech interview, and during a screen call to update the Python version on OSX one of the commands was "reboot" and, oh well... the interview was over.


It depends on what dialect of C++ they are using, but I believe you would need at least 3 months of hard studying to go from zero to functioning C++ programmer.

I’ve been using C++ for over a decade and I still don’t feel like I know enough about the language. It’s a bottomless pit.


"Low level python" is an amusing statement


> low-level Python implementation

CPython, the reference implementation, is written in C, with raw pointers everywhere, manual reference counting, etc. Python C extensions are certainly low-level too. Knowing these details is important for writing good code in some domains, and for getting as much performance out of Python as you can.


In C++ in particular... Yes. A lack of recent C++ experience is a real hindrance.

The spec is more than double the length of the King James Bible, and depending on what firm you're working with, various parts of the language are either banned or best practice. These days I don't generally find that familiarity with any other language (including the C++ of 10 years ago) translates to fluency with modern C++.

Of course, I solve this problem by avoiding the language like the plague, because why do I want to write code in a language that I'm going to have to work that hard to find developers to extend and maintain?


Having experience both applying for jobs and running (some) job interviews, I have to say that the delusions some candidates have shown really surprised me. When I read comments online about application processes, I wonder what our most delusional candidates might say about our extremely fair, one-hour-max interview process.

What I mean by delusional is that many candidates seem not to put in the work to research what we do, how we do it, etc. They then apply for a position that fits their own profile, not for the position we offered. Some even had a different company's name/address and the wrong date in their application letter. Combined with a display of extreme self-confidence, you can only wonder.

My number one tip for anyone applying: although it feels like interviews are about you, the way you should look at them is as you solving a problem the people hiring you want solved. Your task is to know who they want and tell them why it is you, despite certain obvious flaws you can anticipate them seeing, be it your age, your lack of big names on the CV, etc. But the key is that this is not a game where the higher-leveled person automatically wins.

That means if they are looking for someone to clean up their rubbish database, and you come across as The mega database expert who only picks the toughest battles and gets bored if he can't find a bigger mountain to climb, chances are a less qualified but more stubborn, more patient person will get the job, as they are more likely to be the lasting solution.

So the key questions for you should always be: "Who, exactly are they looking for?" and "Am I willing to be that person under the conditions of the job?". The first question is a research question for you before and during the interview, the latter a question you should keep in your mind throughout the whole process.

This is also why to me it is kinda a red flag if a candidate has no questions, because usually they should have some if they're wondering whether they are the right person for the job and whether the job is right for them. Especially since that is an interview cliché already.


I agree with all of this except:

> This is also why to me it is kinda a red flag if a candidate has no questions, because usually they should have if they wonder if they are the right person for the job and if the job is the right one for them. Especially since that is an interview-cliché already.

…which seems like a non sequitur to me. I only apply if I think I'm capable of the role. If my research ahead of the interview is positive, and the interview process seems well handled, I may not have any questions. Although, I admit, I sometimes thought of some later, which I emailed if important.

But possibly where I interview worst is in refusing to exaggerate or puff myself up. I've come to wonder if I didn't get some of the roles because there weren't any lies (such as "extreme self confidence") for them to reduce to a guesstimate of reality.


this is very solid advice - I completely agree with you


> If someone can't hire well, it's not a good place to work

This is just objectively false: most companies suck at something, and in all likelihood you will never have to deal with the hiring process again once you are in. It’s one of the things you care about least.

The idea that a company has to be perfect for you to join is unrealistic


I've found it very difficult to mentor people working remotely.


I found a job as a developer after about 10 months of applying for 100 jobs per month. It was very weird this time round. Especially weird considering that my resume is exceptional (lots of experience; corporate, startup, back end, front end, full stack, distributed systems, open source). The list of technologies that I'm competent with is long and, for most of them, I've got open source projects to prove my claims. Air tight situation.

This time round, employers had all kinds of weird esoteric requirements which made no sense.

For example, I applied for a job where I matched every single technology in their stack (and it was a long list); I even had proven experience with 'Web Components', which was still niche in a professional setting (most jobs are still React, Vue, etc...). Anyway, they ended up rejecting me with the explanation: "The head of engineering is very particular; even though this position is Node.js/JavaScript, he likes to hire candidates who have a background in C#." There was no mention of this in the job advert! Besides, I did use C# full time for 2 months during a summer break at university, which I mentioned after she seemingly invented this requirement, but she responded "He likes candidates who started with C#."

There is no way they would find anyone who meets all of those requirements and also happens to know this completely unrelated technology.

Anyway, after 1 year, out of 1000+ applications I submitted, I got about 5 phone screens with recruiters and 2 interviews with actual company insiders.

The first company Founder I interviewed with seemed keen to hire me at first and kept leading me on; but they always waited for me to ask them about the next stage in the process before actually proceeding... They kept half-ass ghosting me until the last phase and then I was like "They can't be serious about hiring me" and I stopped asking about the next phase.

My sister works in HR and she told me that I only needed to match about 60% of tech requirements to get a job. It didn't correspond at all with my observations of reality...

Putting two and two together, it seems like they were giving all these high-paying jobs to beginners and tossing experienced candidates' resumes in the trash... Probably the HR leads were not even seeing the resumes of experienced candidates coming in.

It all seems kind of conspiratorial if you ask me.

Another weird thing that happened (going 2 years back and which led me to being unemployed) is that the startup I was working for was getting very few job applications when I applied. They were super keen to hire me and I got a big salary, share package, everything... Then 6 months in, candidate applications started pouring in by the thousands... After sifting through thousands of applications, they managed to hire an absolute weapon; this guy was not only sharp, but he was churning out maybe 1K lines of code per day. Really impressive. I had not seen anyone code that fast before. He built features really quickly. I was doing peer reviews for the whole team, which was hard work and essential at the time, especially with code being churned out at that rate! Anyway I was laid off because the founders didn't see the value of my PR code reviews.

They didn't understand how important it was to have someone looking over this superhuman code churning to keep complexity under control. Multiple times, I saved the front end from memory leaks and rogue setInterval/setTimeout calls that were unnecessary or not cleaned up properly (among many other issues). Sigh. I feel like this situation would have been a startup founder's dream; surely combining the massive development speed with the safety/de-risking I was adding was worth the tiny sub-% equity I was set to receive 4 months later... Sigh.


There is something off here.

- 1000+ applications, 2 interviews. Holy batman. That's what, 4+ applications a (work)day for a year straight? How do you even find that many positions?

- “Air tight”, god-tier CV.

- Only doing “PR”s.

- Handling “setInterval”s.


same... also the assumption that "churning out 1k lines of code per day" is automatically good... sounds more like churning out 1k points of complexity every day


I didn't say it was entirely a good thing, hence the importance of the PR review process. As I said, I found numerous bugs, sometimes nasty ones that were hard to identify/fix, before they got merged. That engineer was building features really fast, however. Maybe it wasn't 1K lines per day every day, but he definitely did hit that mark on some days.

He was surprisingly skilled considering the volume of code and he had a solid understanding of a lot of advanced concepts and nuance so I know he wasn't blindly using LLMs. He did implement features really quickly and bug density was quite low overall.

I'm sure he could have implemented those features using fewer lines of code, but as the team lead, what can I say to a highly motivated 25 year old who is churning out new features faster than the rest of the team combined? Motivated people aren't typically very receptive to generic feedback like "This is great but you should try to reduce complexity"...

Of course, I could have provided slightly more detailed feedback, but that would mean getting into my personal coding philosophy, which didn't quite align with the broader practices of the company (a startup) at the time. There were a lot of things the company was doing that are standard (most companies do the same) but that I don't agree with and that would sound controversial. I could provide strong arguments for my positions, but humans are flawed, and carefully thought out, nuanced arguments that go against conventional thinking tend to fall on deaf ears... You can only rock the boat so much.

Also, you don't want to de-motivate a highly productive person. Even if they're productive only in one narrow dimension. With me looking over his code, we could keep complexity under control at a maintainable level. Keep in mind, we were a startup in a competitive, growth sector. So developing features quickly was quite important and throwing away entire features to pivot was considered an acceptable risk.


Yes, and doing that requires a solid, non-stop, consistent 3 working LoC per minute for six hours straight. I don't think a human is capable of processing that kind of volume unless it is pure boilerplate.

I can see LLMs playing a role here...


> This time round, employers had all kinds of weird esoteric requirements which made no sense.

I have often seen jobs that sounded like a director-level manager mixed with a principal engineer mixed with a principal scientist, all in one job with the title "senior engineer". It's just bewildering who these people are that they're looking for, with requirements that aren't even in the "nice to have" category.

> It all seems kind of conspiratorial if you ask me.

I agree that something seems very, very strange in this market over the past year or two. What it is, I don't know, but there is so much that doesn't make sense. It really makes me question whether anyone knows what's going on, in the sense that there are just a ton of companies out there that are near incompetent. Whatever is causing it, I think it's actually very bad for the U.S. economy. It seems like no one wants to do anything that's off the so-called rails. No risk. Do whatever everyone else is doing. Only hire people who have no desire to do anything except what you're already doing.


> It seems like no one wants to do anything that's off the so-called rails. No risk. Do whatever everyone else is doing.

Fundamentally, that's what high interest rates will do: there's no appetite for risk, and there are no cheap loans to spend on moonshots that might not go anywhere.


<< if anyone really knows what's going on

I have long maintained that real knowledge is a well guarded secret.

<< in the sense that there are just a ton of companies out there that are near incompetent.

I want to say no, but I am old enough now to be able to compare to some of the previous positions and I can't help but wonder how much of all this is just posturing and bluffing ( fake it till you make it kinda deal ). I accept it was always there to an extent, but the current project I am a part of makes me question a lot.


Some companies do know how to hire and how to train. It's rare, but not zero.


Conversely, my company, which interviewed me by having me write ‘for’ loops in all 12 or so languages listed on my CV during the technical interview, is a pretty decent place to work.

I cannot tell you how bemused I was after leaving that interview. I'd done months of degrading, stupid 'tell us some trivia about JS' interviews, and here was someone who (apparently) just wanted to judge whether the information on my CV was true.


> Conversely, my company, which interviewed me by having me write ‘for’ loops in all 12 or so languages listed on my CV during the technical interview, is a pretty decent place to work.

This is funny. I’ve interviewed so many candidates who think keyword stuffing their resume has no consequences, but then you get them in the interview and discover they couldn’t actually tell you much about the languages or frameworks they listed other than very high level descriptions.

When I do resume reviews for people I try to push back on some of the keyword stuffing, but there’s a lot of resistance. There’s a popular idea that resumes are just for tricking the ATS and nothing more, but then these people get in front of someone who actually wants to discuss their resume and it gets ugly fast.


> I’ve interviewed so many candidates who think keyword stuffing

They do that because HR looks for it.

>frameworks they listed other than very high level descriptions.

Would you remember the details of a framework you worked with 10 years back? Especially given that in those 10 years you went through another 15 frameworks?


> They do that because HR looks for it.

You've described a beautiful symbiosis.

Half the people in the interviewing process see keyword stuffing as pointless bullshit, and only care if you can reverse a linked list in a language of your choice.

Half of them only care about keyword stuffing, and will send your resume in the shredder if you don't do it.

And the third half see anyone trying to avoid the nitpicks of the second half as a person to expose as a fraud.


The problem is that HR is the first line of entry, and if you're not keyword stuffing every possible technology under the sun, they'll pass on you, because they don't know that being a Python expert is fine for a Ruby job. So you have to put both down, lest the HR screen say "well, they didn't specify Ruby, they don't make it through!"


Yeah, I've actually had this happen before. I keyword stuffed my CV, and when I got to the interview stage, the actual tech lead quizzed me on it. I just replied honestly that while I had passing knowledge, I didn't actually know those things in detail, and I did it just to get the interview. I think they appreciated the honesty, or at least could relate to it. We tested the stuff I actually knew, and I got the job in the end.

It probably helped that I was only bullshitting the fringe cloud stuff; the everyday language/framework skills were all true.


My own HR people tell me it happens lol

I don't get to see the fat stack of resumes they get, only the stuff HR has already approved and passed on to me to further review. I asked them about their process, and they basically prioritize anyone who has the exact keywords on the job listing, and they more or less straight up ignore anyone who doesn't.

We had a lot of frontenders that HR filtered out because they only had React on their resume but not Vue or Svelte (both of which are much simpler than React, so even a complete Vue/Svelte novice would do fine if they were good at React). Similarly, when we were looking to fill a Ruby (NOT Ruby on Rails) dev position, they filtered out people who didn't have specifically Ruby and instead had `Ruby on Rails`.

You can't even blame them, half of the stuff in the job listings sounds like complete mumbo-jumbo to anyone not directly involved with the stuff. After they told me about the Rails thing I sat down with the HR team and gave them a high-level overview on this stuff, but it still happens from time to time, especially when the job requirements/description changes every once in a while. Plus, the edge cases here (is someone good at Python OK to let through for the Ruby job? What if they have an Elixir background? Or just JS one? etc.) are basically infinite, and you can't educate them on every single possible variety of technologies out there.

The only real system would be to have the devs evaluate every resume that comes in, but that comes with its own set of problems, too.


I'm actually somewhat anti-AI given my domain, but this sounds like exactly the kind of issue an LLM could be trained in a few weeks to handle for the hiring manager. Not the braindead ATS keyword matching that's happening as of late; actual language processing.

LLMs were pretty much made to sift through bulk data and identify similar concepts, so it's kind of funny that in this AI gold rush I haven't heard of a proper example. The application and impact are obvious, and it's not like resume grokking is HR's only job, so there'd be minimal displacement.


I will never forget the candidate around 2010 who came in claiming PostScript. Well, our interviewer started asking about stacks and clipping paths, and it quickly emerged that the candidate knew how to click Print > To File.


I've done this before:

Me: you've listed Java as "Expert", perhaps you could discuss about <favorite footgun in project>

Interviewee: sweats for a bit, finally admits only used Java factory factory and copied lots of code

Me: ok great, let's move on. You've listed Rust as "Intermediate", let's...

Interviewee interrupts, admits they know how to spell Rust and watched a couple of videos of crabs


I've had a candidate that had Apache Spark on the resume, turns out he was sitting in a room with other people who were writing Spark jobs.


Wow, do you use Spark in production? I can't claim to have much experience with it, I just played around with it a loooong time ago, but I was impressed. I thought it went out of use along with Hadoop and HBase, glad to hear this is not the case. Are you hiring? :-D


Out of use? Databricks is like a 40-billion-dollar company providing basically a "managed Spark" platform.


I had a guy tell me he had experience with Apache's mod_rewrite module and, when asked what its purpose was, he said verbatim "It gives you, like, more commands and stuff."

Other questions were met with equally disastrous responses.


Oh man, that would be a nightmare. I mean, I probably could do it for most of the languages I have used, but I would sweat during the interview. Worse would be "show how long a string is in all these languages". I always confuse that syntax when I initially switch between languages.


I mean, it’s not like my interviewer knew all those languages, so I guess the fact the results were plausible was enough.


[flagged]


Yeah, cause everyone who is already writing C++ is really good at it and good at not writing bugs.


No, but a person who hasn't written C++ will write way more bugs, because so much of the stuff they learned in other languages creates extremely severe bugs in C++ rather than compiler errors. I can understand anyone who wants to avoid dealing with the learning mistakes that every new C++ programmer has to go through.

In C++, code that looks right and clean to a non-C++ developer, and that passes tests, can still crash the entire program in production with no stack trace available. Or you can accidentally copy a big collection on every function call (since C++ passes by value by default, even for large collections) in a hot path, which would also bring down production, or massively inflate your bills if you scale dynamically.

The same doesn't apply to most other languages, where experience transfers much better, and it's unlikely that an experienced programmer will introduce novice bugs even in a language they have never programmed in before.


C++ experience is like bash or Perl experience. You can write something, often not even that bad, but you will be much slower and there will be a lot of cliffs from which you will fall.

This is not to boast or to sanctify the language. After 15 years of commercial experience with it, I just hate it and feel like it is a lot of useless knowledge, which I regret. It can be useful and there are always trade-offs, but I still hate it.


Even great C++ developers write shitty C++ code. It’s truly a “let’s take all the warning labels off” language.


Nah. Many people who have been writing C++ poorly for decades never learn to do it better. This is less "even people who are great are bad" and more "no amount of experience guarantees that you get great". There are plenty of ways to write the language that aren't as error-prone, and most development of new language features since at least 2011 has been about creating more of them. The "problem" is that the committee values backwards compatibility, and most pedagogy for the language, in both academia and industry, is at best outdated and often just teaches bad ideas outright. So while there are enormous, robust codebases out there that avoid them entirely, the footgun-laden idioms still exist in the language and are valid code. And some old dude who's been doing this for 30 years will still sit in a meeting and tell me that he's been writing pointer arithmetic in deep nested loops for 20 years and it should work fine, after insisting on joining a review process (one that mostly added no new information) halfway through a rewrite meant to fix show-stopper bugs in a legacy codebase, which made the whole process about 10x more annoying and take at least twice as long.


> some old dude who's been doing this for 30 years will still sit in a meeting and tell me that he's been writing pointer arithmetic in deep nested loops for 20 years and it should work fine

Pointer arithmetic in nested loops is just regular C code, though; that is perfectly fine. The problem comes when you start to mix C++ data types and templates with pointer arithmetic; that is when the footguns become completely unmanageable.

However, if your team doesn't have other people with experience writing such code, it is best to avoid it anyway; if you decided not to write C-style code in C++, he should respect that. But he is right that it is a perfectly fine thing to do: writing C in C++ is basically just C with namespaces and scoping.


Sure, it's doable, but in this case it was the source of the issue, along with some obfuscation of where and when those particular sections of code were executed, via shenanigans with encapsulation and inheritance. (I respect people who can manage their pointers well, but I genuinely dislike a lot of OO practices even when they're done "well".) People write exceptional code in C despite, and sometimes because of, it not protecting you from writing it wrong. I respect the view that some people don't need their programming languages to protect them from these problems. In this case, though, protection helped quite a lot, especially since the application was essentially reimplementing vector processing, getting the memory management wrong, and doing nothing to justify forgoing the insane degree of optimization that's gone into std::vector.


>pointer arithmetic in deep nested loops

What's wrong with that? Pointer arithmetic is just indexing with different syntax.


Nothing, if you do it right. Indexing also goes wrong if you do it wrong. Both make sense as abstractions, but both are less safe than certain other abstractions.

I see lots of good code in the Linux kernel that uses explicit indices and pointer arithmetic. It's written by people who know what they're doing with it, and, perhaps more importantly, it has hundreds of maintainers gating updates with direct review and thousands of open-source enthusiasts who might notice a problem.

However, many modern languages, including C++, can do a lot of the things you might do in this explicit way through other means that are considerably harder to screw up, and often drastically more efficient to boot, which was the case here. I think the guy I was interacting with wasn't even the one who wrote the code; he just had a vague sense that really good coders could write code this way, so it must be a good way to write code. My point isn't that it's inherently bad or doomed to fail, just that it's the kind of pattern people mean when they complain about older languages without as many guardrails.

This particular code was causing a memory leak that eventually crashed the device in some, but not all, of the contexts that called it. Not because it was pointer arithmetic, but it had stayed broken and gotten more broken because the ways in which it was wrong were harder to notice and suss out in that form than in what I eventually worked it into. My fix used newer idioms (mostly lambda capture, turning some of the implicit polymorphism accomplished by casting pointers into explicit templated functions, and moving things into vectors to isolate where things were running off the ends of arrays; nothing super fancy, but newer than this guy, who hadn't worked in C++ in a decade, had seen). We were all stressed out, but it was definitely annoying to get notes on interim commits from a guy who was not on the project and did not understand the code, trying to flex seniority by insisting that doing stuff in a way he didn't understand was wrong and bad. To his credit, after some of the stress had passed he did try to learn some of the functional-first idioms I had used to fix the issue (once he saw that this was indeed what had happened), but it still left a sour taste in my mouth. I really dislike most interactions that are deeply concerned with status.

I've written a lot of C/C++, and I've seen cases where it makes sense, sometimes where it's even better, to use the kinds of patterns that motivated language design choices like explicit pointers, casting, and calling new to allocate class instances; places where those things are still necessary, or efficient, or just make sense. But they are also the kinds of things people complain about when they say the language lets you make mistakes that many other languages, especially newer ones, don't. I often write in subsets of C++ that avoid them, and converting code I'm not ultra familiar with to equivalent functionality using different idioms, just to see, is often a productive way of finding and eliminating difficult bugs. This doesn't mean no one can write good nested loops with pointer arithmetic driving the iteration, but it's a place to look for errors, because being good at that is harder than writing a range-based for loop.


Nothing compared to C.


C++ has all the footguns of C, but adds tons of implicit function calls that create bugs for half of the things you'd think they could be used for, and because you never see that implicit code, it is really hard to debug why an implicit call was or wasn't made when you expected the opposite.

So no, C isn't even close to the danger of C++: in C, every copy, every cleanup, every alloc is explicitly written out. Explicit is much safer than implicit, unless the implicit behavior is rock solid, as in garbage-collected languages or very strict languages like Rust.


>in C every copy, every destructor, every alloc, is explicitly declared everywhere. Explicit is much safer than implicit

Unless a programmer forgets to call the 'destructor' on some code path in C's explicit error-handling maze.


Agreed. Explicit is only better than implicit when you have a mechanism that can reliably identify instances where you forgot to do something.

Actually for resource management I'd go so far as to say that implicit is better than explicit in the vast majority of cases. Now if only the rules that determine what happens during initialization in C++ weren't so horribly convoluted.


No, it's because your experience with a 'dozen plus languages' is mostly not transferable to C++.


The way I see it, what most companies hiring C++ programmers really need are good systems programmers. However, being an expert in the advanced and/or esoteric features of the shitshow that is C++ is neither necessary nor sufficient for being a good systems programmer.


>What it boils down to is that companies have zero idea how to hire.

Yeah, it certainly doesn't have anything to do with hordes of people with make-believe degrees in everything.


> And they have zero idea how to mentor and train

From my experience, they have zero _interest_ in mentoring or training. For C++, there are ways other than professional work to get experience. Despite the hate it gets here, there are large, solid C++ codebases to peruse and learn from. Taking the initiative to learn a language prior to a phone screen or interview goes a long way. If a C++ job is what you're after, it might be worth investing the time to use it on a non-toy project and narrowing your search focus.



