A candidate who is at an elevated risk of being let go with severance, which is the position most new hires are in, is probably putting their aspirations on hold while you figure out whether they are the right fit. Severance is typically enough to keep an employee above water for a time while they find a new position that fits them better, but it certainly isn't anything people can build dreams on top of.

I think there is a common point of view that wants to assume that recruiting a good fit is something you determine based on a set of inputs, and then commit to. If you have to reassess at a later time, and especially if you have to reverse your decision, you've failed at recruiting, and in a way that was deterministically preventable. Experience suggests that people who believe this fall into one of two camps: either they attach an ethical weight to the employee/employer relationship that means you have to weigh the cost to your business against the cost to your sense of self-worth; or they have gotten lucky enough up to this point to meet/interview/hire people who have not misrepresented themselves or otherwise projected an image that they would be much more valuable than they proved to be.

Having been involved in a number of instances over the last two years that have exposed me to the randomness of recruitment and hiring, even using all of the hacks people employ to remove the error, I'm honestly a little surprised that people can hold any imperatives about recruitment. The whole thing seems at best stochastic, and its errors unpreventable.

I think the reason people are so reluctant to man up is that it means they failed at something we believe they shouldn't fail at. I would argue they've failed at something we all fail at, and that accepting this would make the whole process better for everyone.


Actually, I think "hire them on a temporary basis" is more egomaniacal than obsessing over job interview questions. For someone to work for you as a temp, they have to leave their current full time job and put their benefits in jeopardy. When you hire someone, you should be ready to commit.

In my experience, when someone is hired on at a new company, they are on a trial basis for a certain number of days anyway. This may seem less risky than being explicitly labelled a temporary-to-hire worker, but they could still find themselves and their benefits in jeopardy if, within that trial period, either party decides this is not as good a fit as the recruitment process suggested it would be.

I think that underneath the intuitive reaction we have to "temporary worker" vs. "permanent worker on a trial basis", the two are actually very alike from an individual risk perspective. Either way, in 30/60/90 days, you could find yourself unemployed and still in the same bind. There may be benefits consequences in the direction of the temporary worker, especially if you are temped through an agency. On the other hand, that agency might find you another role if the one you are in doesn't work out.

There may be a different discussion about whether having this trial period is right, or ethical, or good business, or whatever, regardless of whatever label gets put on it. My stance is that a good process will inevitably make very bad decisions from time to time, and it's not always the best idea to force those bad decisions to be irreversible.

Maybe my experience with the occasional mistakes of what I've seen as otherwise good processes has made me a little less hard-line about this.

EDIT: fixed my quoting so it italicizes correctly.


Is it worth making the world a worse place just to get a measly thousand bucks?

I think it depends on whether the person set to net a thousand bucks would agree with you on whether censoring a mascot makes the world a worse place, or whether they think censoring the mascot brings the world, such as it is, into relief.

It may not be that most people care about the daemon turned devil. But that can be all the more incentive to just do what the vocal minority wants: no one else really cares, and that apathy does as much to make the world a worse place as the person who actively insists on censorship. So why make the effort, when the population is disinterested?

I agree with you, but I think it is important to realize that there are varying degrees of both willfulness and jadedness that can make an argument like "you're selling the world out" fall flat. As often as I see people fail to resist others' calls for censorship, apathy and jadedness are far more often the cause than genuine agreement that something ought to be censored.

Maybe finding a way to convince these apathetic ones that the world isn't beyond redemption would go a long way toward persuading people that these things are worth fighting against, but I am still trying to find a reliable and repeatable way to do that. I will always be the seemingly naive optimist, I suppose.


Seattle, WA

Wetpaint

We’re working on a platform of new services and tools aimed at a revolutionary new way of doing publishing: systems that can spot breaking news, predict the amount of traffic a piece of content will drive, and figure out where, when, and how best to distribute it.

We are looking for a software developer and a test engineer. Details here: http://www.wetpaint.com/page/jobs

Feel free to contact me about either of these, or anything else on that page: [email protected]


The best jobs and the best candidates for jobs will both be placed privately.

I realize that the plural of anecdote is not data, and that you aren't arguing the converse. But I have a couple of data points from the last few years that make me believe the implication doesn't necessarily hold in either direction.

In the direction of placed privately -> best candidate: a company I worked for some time ago had a developer who was placed privately. Specifically, the developer already knew another one of the developers at the company, and came with a glowing recommendation. This person was hired somewhat before I was brought on, and as a result was well-entrenched by the time I arrived. This would not have been a problem if this person weren't one of the worst developers I have ever worked with. I wish I could provide more evidence, but it would probably result in both a breach of NDA and enough detail that someone on here might know who I was talking about. I've known other people who have had similar experiences of incompetence brought in by some insider repaying a favor for work done earlier. Suffice it to say that there is plenty of offal out there that finds its way inside companies, assuming roles it ought never to have been employed in.

In the direction of best candidate -> placed privately: there is always the story of paul on here joining Google. I'm sure plenty of other stories have been recounted of talented people throwing resumes over the transom and managing to get someone's attention on the other side. For my own anecdote, one of the best sysadmins I've known in my life found his way into the company I was working at on the basis of a cold resume submission. Nobody knew him; he was just another name in a pile of resumes. But it was a pile of resumes that a couple of people with a reasonable degree of cluefulness were given to read through, and this sysadmin stood out even on paper. Had we punted him in favor of someone privately placed, I doubt we'd have been in a better place.

I realize that you are plenty talented, and I am not casting aspersions your way. But there are plenty of hires that are the product of favors upon favors, where the biggest favor the middle party could have done is never to have introduced the company to the candidate in the first place. That's the trouble with favors, though: it is difficult to refuse to make the connection, because saying no to a friend carries a heavy social cost.

I'll admit I might be a bit more sensitive to this now because I'm at a small company. A bad hire who was brought in on a favor could send the place into financial pain with maybe a day or two's worth of misguided exuberance. I don't know whether someone internally earning social capital for themselves, with that as a potential expense, is a worthwhile trade-off. I'd rather keep the roadblocks up regardless of how the candidate is sourced.

TL;DR: Sure, listen to and harvest from your connections, but realize that they have their own interests which may not necessarily align with yours or your corporation's, and vet the people you find through them accordingly. The candidate you find or are pressured to hire through the grapevine or a favor may not be nearly as talented as patio11.


Great comment, and from my perspective a view that is perfectly complementary to Joel's post. It could very well be that the same people who are firing off hundreds of resumes are also hitting up dozens of their connections to try to find work. As you mention, hiring even one of these people into a role for which they are not qualified, whether through resumes or private placement, could really damage a small company.

When people make a statement like "xx% of all hires are made through private placement," my first thought isn't "wow, networks are such an effective hiring mechanism!" Rather, it's that our other alternatives are frequently so lacking that we rely on "known-unknowns" in the absence of a better method.

Networks are both a strength and a limitation. When you rely on them for hiring, you might be able to attract the best candidate from within your set of connections, but if there are 100 people outside your network who are objectively better, then you will have missed out. Unfortunately, limitations in our current tools make it very difficult to determine whether those 100 other people are actually out there and, if they are, how to find them.


In economics the reverse hockey stick growth is known as diminishing marginal returns. This simply means the market values the earliest years of experience as much more beneficial to a developer's skill than the later ones.

I wonder which part of the market it is that champions valuing developers this way; is it the companies that value the additional experience less, or is it the developers who discount the value that more experience adds? They both ultimately agree, but I wonder which side is making the harder compromise to get to that agreement.

I can't help but see this graph, though, and get the impression that after a certain number of years, people just stop caring so much. It might be that they grow apathetic. It might also be that the asymptote is the level at which salary stops mattering, and people focus on other things. I'd kind of believe that; you can be pretty comfortable most places at $100-120k/yr, and might start worrying more about things that go beyond mere comfort.
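
To put rough numbers on that shape: the "reverse hockey stick" is essentially an exponential approach to a ceiling. Here's a toy model of the curve, sketched in Go; the base, ceiling, and rate constants are invented for illustration and aren't fit to anyone's actual data:

    // Toy salary curve: an exponential approach to a ceiling, so the
    // earliest years add far more than the later ones. All constants
    // are made up for illustration only.
    package main

    import (
        "fmt"
        "math"
    )

    func salary(years float64) float64 {
        base, ceiling, rate := 60000.0, 120000.0, 0.35
        return ceiling - (ceiling-base)*math.Exp(-rate*years)
    }

    func main() {
        for _, y := range []float64{0, 2, 5, 10, 20} {
            fmt.Printf("%2.0f years: $%.0f\n", y, salary(y))
        }
    }

With these made-up constants, the first five years add about $50k while years ten through twenty add less than $2k, which is exactly the flattening-toward-an-asymptote the graph shows.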


There is an interesting subtext in this article, which is to suggest that you take more risks as you get closer to the asymptote of salary growth, because it's the way to avoid diminishing returns for the additional skills you've learned between hitting the level of "experienced" and now.

This is the opposite of the advice that makes up common wisdom, which is to take risks when you are young, inexperienced, and without responsibilities, and then settle into the comfortable long-term job with small raises once you get past that phase. I'm not surprised by this; if technical skills compound at all, it would seem that a developer's value would trend exponential and not logarithmic. That is, so long as they don't weary and stagnate.

But then, there is also value for the inexperienced in a startup, in that you'll learn a ton, and be given a lot of responsibility and autonomy. It might be that common wisdom is only half-broken.

Granted, there are opportunity costs involved. You might be in a position where you have to trade capitalistic striving for a paycheck, as a ribbonfarm article posted here a while back put it. It's as much being aware of those opportunity costs, though, as it is knowing whether or not you are in a position to need to accept them.


Not quite. I think this is conventional wisdom:

If you take a risk and co-found your own startup with 0 years of experience, then even if you tank 2 years later, you are likely to wind up close to the 5 years of experience point of the hockey stick when applying for a job.

Basically, the startup gives you a small chance of huge upside, and on the downside, you become a normal W-2 salaried employee but begin at a point farther along than someone who played it safe the entire time.

Whether this happens in practice is probably highly dependent on whether the risk taker demonstrated some goodness during the failed startup.


If you take a risk and co-found your own startup with 0 years of experience, then even if you tank 2 years later, you are likely to wind up close to the 5 years of experience point of the hockey stick when applying for a job.

That's probably closer to the conventional wisdom around HN, but it is a far cry from conventional wisdom amongst the masses. The wisdom amongst the masses is that if you co-found or join a startup, you are playing roulette with your financial security.

But the article's point -- which I agree with -- is that even if you accelerate this process of getting to that 5 year threshold, you are doing yourself a disservice fiscally by settling in at the plateau unless you have a good reason forcing you into settling.

You're likely still learning more, and becoming more valuable, but it becomes much more difficult to extract that value through salary. So it becomes beneficial to join companies where salary makes up only a part of the total compensation package.


I certainly don't think that. If you're incompetent you can get lucky, have a moderately successful failure, and still know 4/5ths of fuckall. And you may then become that most dangerous of individuals, the ignorant 'expert'.


Skills don't really compound; it's not like interest, where you gain 10%/year on your existing skill set.

We forget stuff, and things we learn don't always enhance/build on things we already know.
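
For the sake of the arithmetic, here's what that hypothetical 10%/year compounding would actually imply, in a throwaway Go snippet (the rate is just the figure from the comment above, not a claim about real skill growth):

    // If skill compounded like interest at 10%/year, a twenty-year
    // developer would be worth about 6.7x a new one. The flattening
    // salary curve under discussion implies nothing of the sort.
    package main

    import (
        "fmt"
        "math"
    )

    func main() {
        for _, year := range []int{0, 5, 10, 15, 20} {
            fmt.Printf("year %2d: %.2fx\n", year, math.Pow(1.10, float64(year)))
        }
    }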


what happens if the global logistics infrastructure breaks down because of war/rising oil prices/political turmoil etc?

What happens if some local infrastructure breaks down in a way that brings the local community to a standstill until things get fixed?

A good example of this: a major highway bridge becomes impassable in a way that is not trivial to repair. The lack of this bridge becomes a huge expense and drain on quality of life in the region. You can either rely on self-sufficiency to get the bridge rebuilt, or you can rely on the efficiencies of volume producers of the raw materials, machinery, and whatever else you'll need to build it back up.

Self-sufficiency, in the sense that you exchange with few or no other people for your own wants and needs, has huge costs as well; but because most of us aren't self-sufficient in that sense, we tend to gaze romantically in that direction and remember only the good that comes out of being self-sufficient. The trade-offs are not that obvious or one-sided.


What happens if some local infrastructure breaks down in a way that brings the local community to a standstill until things get fixed?

In a non-networked world, the problem is limited to the local community. In a networked world, everyone is harmed.

It's pretty easy to see that there are tradeoffs to be made between reliability and efficiency. As we become dependent on products which are centrally produced (very efficiently) and widely distributed, we all become vulnerable when a disaster strikes the producers.

I personally believe that the efficiency gains outweigh the reliability losses, but that doesn't mean the tradeoffs don't exist.


As we become dependent on products which are centrally produced (very efficiently) and widely distributed, we all become vulnerable when a disaster strikes the producers.

True, and I tend to underestimate just how centralized the production of a lot of things is, since what I do isn't really based that heavily on geographic location.

My counter to that, though, is that there are usually at least a couple of producers of most products, and that sometimes -- though not always -- they are geographically separated enough that the risk is somewhat mitigated. Other times, geographic constraints more or less mandate that every supplier be located within a small number of miles of one another; suppliers of natural resources (coal, oil, etc.) come immediately to mind.


If I shop at Wal-Mart, the money leaves my local economy forever and basically just goes to shareholders and overseas manufacturers.

This is only the case if no one who gets money, either directly or indirectly, from Wal-Mart wants to buy any of the stuff your local economy was making. Think of all the people all around the world who get money from Wal-Mart. If none of these people want to buy anything you and your fellow townsmen make, then you are absolutely right that the money leaves your local economy forever.

But this points to a deeper pathology. You want something people outside your community provide (through Wal-Mart or whatever), but you refuse to give back an equal amount of value for what you extract. Because of this expectation, you rail against anything non-local. This is how trade deficits get started, and the solution to them is not to scream for a reemergence of isolationist economic policy, which is essentially your concept of a self-circulating local economy. The solution is to realize that other people want you to do things for them when they do things for you, and to get over this quaint notion of historic Main Street town life if you really want the things that a big-box-style economy provides. You can't have both unless you want to go broke.

I also think local merchants are more likely to sell goods/services that are better for their local economy than a big box retailer that is completely disconnected from the local economy.

Can you back this up, or are you waxing sentimental?

As far as the flow of capital is concerned, a big box retailer provides a lot for a local economy because people spend a lot of money there. They're winning as far as cash flow goes, which means they are giving people what they want, even if they whine about getting it.

If nothing else, I think this shows that local economies aren't as special and unique as people claim they are; it's like the Lake Wobegon Effect for retail. Your local economy is not that special, your locality is not that special, and your fight to remain isolated in all the ways you think matter while becoming interconnected with other communities in all the ways you are willing to tolerate is going to backfire, because you come across expecting more from others than they are allowed to expect from you. If you want that money back, then do something that makes people want to give it back to you. It's as simple as that.


I don't understand why some people on HN are so needlessly condescending: Can you back this up, or are you waxing sentimental?

Why would I even want to discuss something with someone with such an unnecessarily rude and snarky attitude?

Yes, I can back this up: go into your local pizza shop/taco stand/burger joint/Italian restaurant/seafood shack, get a pizza/burrito/burger/pasta/fish, and compare the ingredients to Pizza Hut's. One is made out of a bunch of "edible food-like substances" (to quote Michael Pollan); the other is made with actual food.

Quote:

Wal-Mart sells cheaply and uses fewer workers partly because it is a technological and organizational innovator, but its success depends even more on its relentless pressure on workers and suppliers, and its extraordinary market power: it is by far the dominant retailer of many goods. The corporation is likely to control 35 percent of all U.S. food and drug sales by 2007.

Wal-Mart also shifts many of its costs to taxpayers (or other businesses that indirectly pay costs of Wal-Mart’s underinsured employees). A recent study by Good Jobs First, an organization that monitors economic development policies, found that state and local governments had given at least $1 billion in subsidies to stores and distribution centers. Wal-Mart also pays so little that many of its workers rely on state healthcare subsidies, food stamps, housing vouchers and other public aid. According to a recent study by the University of California at Berkeley Center for Labor Research and Education, California alone spends $10 billion annually to subsidize Wal-Mart and similar low-wage employers. Congressional Democratic staff calculates that federal taxpayers pay $2,103 per year in subsidies for the average Wal-Mart worker.

Source: http://www.inthesetimes.com/article/the_wal_mart_effect/


One is made out of a bunch of "edible food-like substances" (to quote Michael Pollan); the other is made with actual food.

Because once we poison the well, only then can we have a fruitful discussion about this. Let's not drop this to the level of throwing about "frankenfood" type labels; those convince for entirely bogus reasons.

And I have had some absolutely, asininely terrible food from local restaurants that would make a chain look worthy of Michelin stars by comparison. I'd argue that's what you trade away with chains: you don't get the variance that allows for the highs, but you are also somewhat insulated from that same variance handing you some really dismal lows. Sometimes, predictable but mediocre is more desirable than variable and possibly outstanding or complete crap.

Wal-Mart also pays so little that many of its workers rely on state healthcare subsidies, food stamps, housing vouchers and other public aid.

FWIW, I think the subsidies given to Wal-Mart are a bunch of crap. If you dislike those, talk to your government; don't damn the recipient for taking what they've been given.

But Wal-Mart isn't entirely to be damned for the rest of this either. The presumption underlying all of this is that the people who are employed by Wal-Mart have no other options. Is this true? Yes, I know; of course it's true. But, is it true?

Either way, things are rather bleak. Either people aren't exercising their options to their own advantage, or they are trapped in a position with absolutely no bargaining power.

But we won't solve this by killing Wal-Mart; these problems are merely exposed by a big box store forcing the economics we've been ignoring for a long time for the sake of social harmony and idealism. Products were sold at marked up prices to support marked up sales staff, and we look back and argue all of that was to preserve a milieu that is considered an unquestionably good thing.

The same thing in the past has happened to a ton of other professions. In every case, we moved on. In a way, all of these subsidies prevent us from doing so.

I wonder if the reality is much more mundane and non-idealistic; we didn't care much either way, but didn't know until Wal-Mart and the like came along that it could be done cheaper.

As a footnote, I'm not pro-Wal-Mart. I'm just someone who is alternately interested in and confused by people's reactions to economic forces.


Are you sure you know where the local restaurant's food comes from? It's most likely from SYSCO: http://www.sysco.com/products/productpage_search.asp?product...

Not every local restaurant sources local organic farm-to-plate food.


Well, my girlfriend is a food analyst for one of SYSCO's rivals, so I tend to have a bit more knowledge here than the average person. Yes, I agree that a lot of local restaurants don't use local produce, but it's still better than going to basically any chain restaurant.


I understand the point you are trying to make in posting this, and it is a good point to make.

However, in the article, Norvig himself points out the worthiness of learning several languages, and doesn't claim that you have to get on a ten-year treadmill for any individual one, or even a subset. My impression was that Norvig was really targeting the people for whom whatever-language-in-a-fortnight-and-a-half would be their first exposure to programming at all.

As you gain more languages, provided you are judicious in choosing them, each new one should come more quickly than the last.

Does a month per language seem a bit rapid? Yeah, it kind of does, depending on the time spent in study and tooling. The Pragmatic Programmer guys suggest learning a new language every year; that seems wholly attainable. In reality, the author studied a language to what he considered a sufficient level in a quarter. That also doesn't seem unreasonable, depending on the level of mastery desired. Could be that both perspectives have more than a grain of truth.


I like to learn new languages, but I keep ending up at this:

What matters more professionally is frameworks, SDKs, libraries, etc.

For example, I'm not a big fan of C# or static typing, but Unity's C# (and UnityScript/BooLang) framework is too good to be ignored.


> What matters more professionally is frameworks, SDKs, libraries, etc.

It depends. Frameworks and SDKs cannot overcome shortcomings of your language and/or language implementation (and often you don't have the luxury to switch gears, because there is only one reference implementation).

When I studied Erlang's support for concurrency, I realized how hard it would have been trying to do that in another language. Yes, you could somewhat manage to replicate that, but:

- your language implementation could lack microthreads, thus impacting how concurrency would scale;

- you may need to follow some protocol in using the framework, or you could fall into one of many pitfalls that aren't there in the language you're using as a model.
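
To make the comparison concrete, here is a rough sketch of the message-passing style in Go, whose goroutines and channels are probably the closest mainstream analogue to Erlang's processes and mailboxes. Note everything it leaves out: no process isolation, no links, no supervision trees, so the "protocol you must follow" is entirely on the programmer:

    // Erlang-style message passing approximated with goroutines
    // (cheap, runtime-multiplexed threads) and a channel as a mailbox.
    package main

    import "fmt"

    type msg struct {
        n     int
        reply chan int
    }

    func worker(mailbox <-chan msg) {
        for m := range mailbox {
            m.reply <- m.n * m.n // answer each message with n squared
        }
    }

    func main() {
        mailbox := make(chan msg)
        for i := 0; i < 4; i++ {
            go worker(mailbox) // spawning thousands of these stays cheap
        }
        reply := make(chan int)
        for i := 1; i <= 5; i++ {
            mailbox <- msg{n: i, reply: reply}
            fmt.Println(<-reply)
        }
    }

Running it prints the five squares in order; the interesting part is that the same pattern scales to tens of thousands of workers, which is roughly the microthread point above.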

