
> Every PR/commit merged to master should be a clean logical unit.

The issue is one of review scaling. I wrote a blog post about this a while ago[0], but the gist of it is that those clean logical units are often too small for meaningful high-level reviews of more complex work.

With complex features or refactorings, you're often in a situation where those clean logical units allow reviewers to do a good low-level review (checking for logic corner cases, style issues, etc.), but they don't allow a high-level review of how all the pieces of the feature work together.

IMHO the most open-source-process-friendly solution to the issue is to review patch series, where you can review the series as a whole for the big picture, but also dig into individual commits for the details. Building such a patch series requires an approach like the one described in the article.
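
As a rough sketch of what building and sending such a series can look like in practice (assuming a git + mailing-list workflow; the branch name and list address are made up):

  # shape the work into clean, self-contained commits
  git rebase -i origin/master

  # one patch file per commit, plus a cover letter for the big picture
  git format-patch --cover-letter origin/master..my-feature

  # send the whole series to the list for review
  git send-email --to=some-project-devel@example.org *.patch

On forges, the rough equivalent is a multi-commit pull/merge request that reviewers can read commit by commit or as a whole.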

(In closed source environments, you may get a good enough approximation of the result with a separate, disciplined software design process.)

[0] http://nhaehnle.blogspot.com/2020/06/they-want-to-be-small-t...


My understanding is that they're not necessarily asking for the commits to be squashed, they're asking for them to be submitted together as a patch series.

Patch series are an important part of a healthy review workflow because they allow both a micro and a macro view of changes. I have written about this before: http://nhaehnle.blogspot.com/2020/06/they-want-to-be-small-t...


One related aspect that the article doesn't go into too much is that there is a tension in the size of the unit of code review: there are reasons for reviewing big chunks at once, but also reasons for reviewing individual changes that are as small as possible. I've gone into more detail on this in the past.[0]

Stacked diffs make that possible because you can review either individual commits or an entire stack at once.

The irony is that this is largely the way that Linux kernel development works -- and the Linux kernel was the first user of Git! Most projects that later adopted Git seem never to have learned this lesson that was there from the beginning, and have been reinventing wheels ever since.

[0] http://nhaehnle.blogspot.com/2020/06/they-want-to-be-small-t...


Great post!


Put uncommitted changes into a WIP commit (or multiple WIP commits), then fetch & rebase.

Putting stuff in commits ASAP locally is good practice anyway, since it guarantees you will never accidentally lose your work.
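
A minimal sketch, assuming the upstream branch is origin/master:

  git add -A
  git commit -m "WIP"          # snapshot everything; nothing can be lost now
  git fetch origin
  git rebase origin/master     # replay local commits, WIP included, on top

  # later, before publishing: uncommit the WIP snapshot and keep working
  git reset HEAD^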


So that's exactly what I'm saying: you need 3 operations, which will completely change the state of your repo folder.

With svn, it was common for me to have a working copy with something like 50 modified files.

Currently working on 5 files "v,w,x,y,z".

I noticed a simple typo in file "c":

modify "c", commit "c"

3 pending modifications in "f,g,h".

commit "f"

commit "g"

commit "h"

Someone asks me to correct something else in "c".

svn up, make the correction, and commit "c".

then "svn up", just updating everything without messing with current work on "v,w,x,y,z".

(no need to stash, or rebase, or anything as no one else touched them anyway)

And let's suppose that I just want the new version of file "k" without modifying anything else in my current working folder:

svn up "k"

The good old days were as easy as that... even if git has its own advantages for some other cases.
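
For comparison, the closest git equivalents I know of for those last two steps would be something like this (assuming origin/master is the upstream branch):

  # update everything, carrying local work along
  git stash && git pull --rebase && git stash pop

  # or: update only file "k", leaving the rest of the working copy alone
  git fetch origin
  git restore --source=origin/master -- k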


We have all been part of a massive real world experiment that shows a pretty obvious correlation between the introduction of vaccines and the reduction (and even elimination) of associated diseases.

Saying that only a relatively small number of people have done the work to prove the effectiveness of vaccination is really disingenuous.

(Also, there's a pretty big difference between doubting an n=50 sociological study and deciding not to believe the very public historical record of global disease patterns.)


Do you trust the statistics that show how effective vaccines are and that the downsides are small?

Great! Me too! I wasn't involved directly in that work, but I trust that it hasn't been manipulated.

This is what I mean: I still trust these institutions (in part because I'm part of the technical/scientific community and understand how it works). Many people appear to have lost trust in these institutions.


This article is terrible.

> Devalue Venezuela’s currency, the bolivar, by a whopping 95%. The new currency will be renamed the "sovereign bolivar."

> Instead of an exchange rate of 250,000 bolivars per US dollar, it will increase to around 6 million.

This is not a devaluation by 95%. The math doesn't add up.

> The petro is valued by the Venezuelan government at around $60, or 3,600 sovereign bolivars.

So 60 sovereign bolivars are one USD? Previously it said 6 million sovereign bolivars are one USD.

> To make things more complicated, the new sovereign dollar will also be re-denominated, which will remove about five zeros from its unit measurement.

Oh, okay. I assume they mean sovereign bolivar, not dollar (did nobody proofread this?). That makes the above point make more sense.

> At the same time, President Maduro also announced a huge 3,000% increase to the minimum wage.

> So in the new re-denominated currency, a person on the minimum wage will receive around 1,800 sovereign bolivars a month, instead of 1.8 million.

Is that a 3,000% increase in bolivars? In real terms? And again, I can't see the math adding up, even taking the supposed 95% devaluation into account.
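
Just as a rough check, assuming the 1.8 million figure is in old bolivars: 1.8 million old bolivars is 1,800,000 / 100,000 = 18 sovereign bolivars, and going from 18 to 1,800 sovereign bolivars is a factor of 100, i.e. an increase of 9,900%, not 3,000%.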

And I haven't gone into any of the cryptocurrency stuff...

There's probably a lot of stupidity going on in this new Venezuelan policy, but the article makes a total mess of it on top of that. It's context-free reporting by somebody who doesn't seem to care to understand what they're reporting, and the numbers just don't fit together.

The best I can make of it is:

1. The Venezuelan government still refuses to let the Bolivar float freely, which means that black market currency exchanges will continue to operate.

2. The Venezuelan government is changing the official exchange rate of Bolivar to USD.

3. The Venezuelan government is replacing the bolivar by the sovereign bolivar, where 1 sovereign bolivar = 100,000 bolivar.

4. They're increasing the minimum wage by an effective factor of 100 in domestic currency. (The value of the minimum wage internationally will change by a different amount -- and not 3,000% -- because of point 2, but any numbers you're getting out of the article are likely moot anyway because of black-market exchanges.)

5. They're introducing some weird cryptocurrency gimmick that isn't explained properly.

Edit: And here's a Reuters article with different numbers for the minimum wage (stating a 3 million baseline as opposed to 1.8 million): https://www.reuters.com/article/us-venezuela-economy/venezue...


From what I understand, another disadvantage of monolithic 3D is the yield implications.

When you fabricate several 2D chips and then integrate them, you can test those 2D chips for defects separately before the integration, which should give you better yields overall.
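
As a made-up illustration: if each layer independently yields 90%, a monolithic 4-layer stack only works when all four layers are good, i.e. about 0.9^4 ≈ 66% of the time, whereas testing the 2D dies first lets you discard the bad ones and only stack known-good dies.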


Yes, that's a major problem as well since it's an exponential issue.


You should look into integer linear programs. They are a much more useful "DSL" for discrete optimization. You get a lot of insight into flow problems from studying their LPs, for example, and it's very easy and efficient to solve flow problems with additional constraints using ILPs. Also, the state of the art for solving TSP and related hard path-finding problems uses techniques from integer linear programming.
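
A small sketch of what that looks like (the side constraint here is just an example I'm making up): max-flow on a graph with edge capacities c_e can be written as

  maximize    sum of f_e over edges e leaving the source
  subject to  0 <= f_e <= c_e           for every edge e
              flow in = flow out        for every node other than source and sink
              f_e integer               for every edge e

and an extra requirement like "edges a and b must not both carry flow" is just a couple more rows: introduce 0/1 variables y_a, y_b with f_a <= c_a * y_a, f_b <= c_b * y_b, and y_a + y_b <= 1.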


You are correct. (I am very familiar with LP as well)


There's no general answer to your question; it depends on the problem. You can find parameter-preserving reductions between some problems, but this isn't always the case.

Also, instead of looking at fixed parameter tractability, it often makes more sense to look at approximation algorithms (if your goal is to optimize something, rather than getting a strict Yes/No answer).


Integer programming, not linear programming. Linear programming (without the integrality constraint) is in P, so you cannot use it to solve general SAT problems, which are NP-complete (unless a major and highly surprising theoretical breakthrough is found).

The free SAT solvers are very good, and much better than commercial IP solvers at solving problems that are a natural fit for SAT. (Obviously, you can encode any IP as a SAT formula and vice versa, and the IP solvers are better at solving the problems where you actually have meaningful arithmetic.)
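
For the encoding direction, a rough sketch: with 0/1 variables, each clause becomes a single linear inequality, e.g.

  (x OR NOT y OR z)   ->   x + (1 - y) + z >= 1,   with x, y, z in {0, 1}

and a CNF formula is just the collection of those inequalities; going the other way, bounded integers get encoded in binary and the arithmetic as boolean circuits, which is where SAT solvers tend to lose their edge.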


Thanks for clearing that up!

