Hacker News: DSMan195276's comments

Yeah this definitely falls into the category of "I use them so they feel natural", there's nothing amazing about those names.

The underlying problem is that you now run into so many named things (utilities, libraries, programs, etc.) in a day and they all have to differentiate themselves somehow. You can't name every crypto library `libcrypto` for obvious reasons.


Fine. Name it sodium-crypto.a or sodium.crypto.a or whatever. The author's complaint does hold water.

You can, but then the names get needlessly long and one of the things we generally like (especially for command-line programs) is names that are short and easy to type. If we're going to make this argument then why not call the unix tools `concatenate`, `difference`, `stream-editor`, etc. Those are way better names in terms of telling you what they do, but from a usability standpoint they stink to type out.

Libraries and programs also have a habit of gradually changing what exactly they're about and used for. Changing their name at that point doesn't usually make sense, so you'll still end up with long names that don't actually match what they do. Imagine if we were typing out `tape-archive` to make tarballs; it's a historically accurate name but gives you no hint about how people actually use it today. The name remains only because `tar` is pretty generic and there's too much inertia to change it. Honestly I'd say `cat` is the same; it's pretty rare that I see someone actually use it to concatenate multiple files rather than dump a single file to stdout.

The author is missing the fact that stuff like `libsodium` is named no differently from all the other stuff he mentioned. If he used libsodium often he may just as well have listed it among the well-named tools, due to its relation to salt, and would instead be complaining about some other library that he doesn't know much about or doesn't use often. I _understand_ why he's annoyed, but my point is that it's nothing new and he's just noticing it now.


Short names are a relic of the age of teletypes, when you had to repeatedly type things out. That hasn't been the case for at least three decades. Most good shell+terminal combinations support autocomplete; even the verbose PowerShell becomes fairly easy to use with shell history and autocomplete, which, incidentally, it does very well.

If you are repeatedly typing library names, something is wrong with your workflow.

Niklaus Wirth showed us a way out of the teletype world with the Oberon text/command interface, later aped clumsily by Plan 9, but we seem to be stuck firmly in the teletype world, mainly because of Un*x.


libeay

`eay` is just the initials of the original author, so basically the same thing as `awk`.

> The author's complaint does hold water.

Ironically, much like sodium itself, a substance of which the author seemingly possesses too much.


Without looking it up, is it sodium for "salt"? That's about as tethered to the actual use (salting hashes being a common crypto practice) as any of the names in the root comment.

I think the problem is that if they're in the road their liability and required smarts go up a lot. Right now it sounds like they're at least partially relying on being the largest thing on the "road" and everyone else will naturally get out of their way.

In this case the problem is that the fed is the one who runs the TSA and created the Real ID rules, but the states are the ones that actually issue the IDs meeting those rules. The fed couldn't force the states to implement the rules and the states didn't want to spend money on something they didn't really care about.

Of course, they didn't really care about it because it's mostly just security theater, and thus the fed was never going to start turning people away simply for not having a compliant ID (which is still true). If there were more compelling reasons why everybody needs a Real ID, states would have put more effort into getting everybody one.

There's also the separate issue that the Real ID rules are questionable and it's not always easy for someone to get a Real ID even if they want one.


It's a single number in that if you take an IQ test once you get one number, but that doesn't mean you'll get that exact number every time you take the test. Even ignoring more complex questions about IQ tests, your score will vary depending on simple things like how tired you are when you take it, so in practice there's real variance between sittings and you do not always get the same number.


Well, based on the paragraphs in the README it's not actually being updated anymore; it only reflects SteamOS as of August, and the author quit running their process to update it.


What prevents a farmer from simply switching back to the non-GMO seeds if the GMO option goes up in price? Or even ignoring that, switching to a different cheaper GMO seed from a different company?

I think that's the piece I and others are missing: isn't it ultimately a question of which seeds will make the farmer the most money? If a particular GMO seed suddenly becomes so expensive that either non-GMO or other GMO seeds are more cost-effective, why can't they just start using those instead?


Not really - if the market price for a crop is such that it depends on the greater volume which can be produced by GMO seeds, switching to non-GMO seeds becomes uneconomic.

Let's say GMO crops give you a grain yield of 1 ton/acre and non-GMO crops give you 0.5 ton/acre, with the market price set at, say, $100/ton. Switching back cuts their earnings in half in the best case, all other inputs remaining the same.

Now if the GMO-seeds are controlled by a foreign entity, your entire agri output becomes dependent on that foreign entity not behaving badly. Whichever nation that controls the entity who owns the GMO-seed now has leverage over you.

So no, it isn't as simple as "switch back to using non-GMO seeds". This has to be carefully considered before adopting GMO-seeds.


"Bugfixes" doesn't mean the code actually got better, it just means someone attempted to fix a bug. I've seen plenty of people make code worse and more buggy by trying to fix a bug, and also plenty of old "maintained" code that still has tons of bugs because it started from the wrong foundation and everyone kept bolting on fixes around the bad part.


One of the frustrating truths about software is that it can be terrible and riddled with bugs, but if you just keep patching enough bugs and use it the same way every time, it eventually becomes reliable software ... as long as the user never does anything new and no one pokes the source with a stick.

I much prefer the alternative where it's written in a manner where you can almost prove it's bug free by comprehensively unit testing the parts.


It's a difference of whether the function arguments are declared or not. If you declare a `void foo()`, and then call `foo((float)f)`, the `foo()` function is actually passed a `double` as the first argument rather than a `float`. If you instead change the declaration to `void foo(float)` then it gets passed as a `float`.

Ex: https://godbolt.org/z/TKjz3Tqqr


> I'd go read the original PR and the discussion that took place.

Until your company switches code repos multiple times and all the PR history is gone or hard/impossible to track down.

I will say, I don't usually make people clean up their commits and also usually recommend squashing PRs for any teams that aren't comfortable with `git`. When people do take the time to make a sensible commit history (when a PR warrants more than one commit) it makes looking back through their code history to understand what was going on 1000% easier. It also forces people to actually look over all of their changes, which is something I find a lot of people don't bother to do and their code quality suffers a lot as a result.
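As a sketch of one workflow for producing that sensible history, git's `--fixup` and `--autosquash` make the cleanup nearly automatic. A toy, self-contained session (all file names, contents, and commit messages here are invented):

```shell
#!/bin/sh
# Build a throwaway repo, make a "typo" commit, then use --fixup plus
# a non-interactive autosquash rebase to fold the fix into the commit
# it logically belongs to, leaving a clean two-commit history.
set -e
repo=$(mktemp -d); cd "$repo"
git init -q
git config user.email "demo@example.com"
git config user.name "demo"

echo "int main(void) { retrun 0; }" > app.c
git add app.c; git commit -qm "add app.c"

echo "done" > notes.txt
git add notes.txt; git commit -qm "add notes"

# Oops: app.c had a typo. Fix it and mark the new commit as a fixup
# of the commit that introduced app.c.
echo "int main(void) { return 0; }" > app.c
git add app.c
git commit -q --fixup=HEAD~1

# Replay the branch with fixups squashed in place; ':' as the
# sequence editor accepts the generated todo list unmodified.
GIT_SEQUENCE_EDITOR=: git rebase -i --autosquash --root
count=$(git rev-list --count HEAD)
echo "commits after autosquash: $count"
```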


It also enables bisect to work properly.

Bisecting on squashed commits is not usually helpful. You can still narrow down to the offending commit that introduced the error but… it’s somewhere in +1200/-325 lines of change. Good luck.


If your PR is +1000 lines of code, you already made a mistake at the requirements and planning stage (like many teams do).


This sounds unattainable to me. For code bases in the 2 million or more lines range, something as simple as refactoring the name of a poorly named base class can hit 5000 lines. It also was not a mistake with the original name it had, but you'd still like to change it to make it more readable given the evolution of the codebase. You would not split that up into multiple commits because that would be a mess and it would not compile unless done in one commit.

How is this a mistake?


Such PRs shouldn't be the norm but the exception. What happens far more often is that such refactorings land alongside other changes in the same PR. In the high-functioning teams I've worked on, this is usually done as a separate PR/change, because they're aware of the complexity that mixing a refactoring with scope-related changes adds, and refactoring shouldn't be in the scope of the original change.


I don't agree; for example, my team includes yarn.lock in the commit, which adds quite a few lines to the PR.


Or finding what bug was reintroduced in a +13/-14398


FWIW licensing is definitely part of why some 'obvious' stuff is still missing, Nintendo doesn't own the rights to games that they didn't develop themselves (generally speaking).

Ex. We'll probably never see the first six FF games on Switch Online, Square Enix is just unlikely to agree to that for a variety of reasons.


Which is rather surprising to me. I don't know what the contracts between Nintendo and developers say but I would have expected "rights to publish or distribute in perpetuity" would have been in there as part of the deal for making official carts.


That's probably true for later generations, but in the NES (and maybe SNES) era? Undoubtedly, they didn't have the foresight to write that into the contracts.

In the early days of television, many broadcasters were prohibited by contract from retaining any copies of a performance, because no value was seen in reusing them and there was no other reason to grant those rights. Also see shows like WKRP in Cincinnati, where music was only licensed for the original broadcast (and perhaps direct repeats in syndication); the licenses did not cover release on home video, so the music had to be replaced.


Nintendo already tried this kind of power grab, more or less, and it failed.

The NES didn't have software lockout in Japan. Most third-party Famicom[0] games were manufactured and sold by their publishers, with little or no control from Nintendo. Nintendo's way to wrestle back control over their platform was the Famicom Disk System, a disk drive add-on that was intended to work around the NES's 32k ROM limit (and associated costs of ROM) with cheaper disks that could hold up to 64k of loadable data per side.

The key was that the FDS had two lockout features that Famicom cartridges didn't:

- FDS disks had an imprint of the Nintendo logo at the bottom that meshed with plates in the disk drive. If your disk did not say Nintendo on it, it would not mount in the drive and the game would not play.

- FDS disks were rewritable, and Nintendo planned to sell games at special vending machines that would write you a fresh disk. If you weren't selling your game through Nintendo, your game wouldn't be on these vending machines.

So if you wanted your game on FDS, you needed to sign a distribution agreement with Nintendo. I'm told the terms were rather draconian. Most developers just... put larger ROMs and better enhancement chips on their third-party Famicom cartridges. This continued until someone figured out how to copy FDS games using just the RAM adapter carts and Nintendo gave up on the FDS concept entirely.

To be clear, Nintendo did have software lockout in the US, and you did have to license your game to Nintendo and have them sell copies of it using their hardware. This caused a lot of problems for NES releases of Famicom games that had custom chips in them (e.g. Contra). But even then, these were not perpetual licenses.

A perpetuality requirement would have killed any and all licensed games stone dead. If you're making a movie tie-in cash grab game on NES, you don't want to have to license that movie in perpetuity just because Nintendo demands a perpetual sublicense for a game with an expected shelf-life of about a year. Hell, not even Nintendo licensed Mike Tyson for Punch-Out perpetually.

[0] "Famicom" is short for "Family Computer", the Japanese version of the NES.


> I would have expected "rights to publish or distribute in perpetuity"

I think that would have been unlikely to occur to Nintendo's lawyers, since in the NES era, publishers required the developer's co-operation to provide masters targeting any additional platforms. This was before intergenerational emulation and internet distribution became widespread, and Nintendo would have had a sunset date for NES title sales.


I imagine we won't see MGS: Twin Snakes for the same reason

