jfkebwjsbx's comments | Hacker News

GP's point is that a match assumes an equality test, and it performs that test against a single object.

In many languages there is no way to express an arbitrary if condition as a match.
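
For illustration, a minimal C sketch (using switch as the closest analogue of a match; the values and names are made up): the switch tests one expression for equality against constant cases, while the if combines arbitrary predicates over several values, which a plain pattern cannot express.

    #include <stdio.h>

    int main(void) {
        int status = 404;
        int retries = 3;

        /* switch: equality tests of a single expression against constants */
        switch (status) {
        case 200: puts("ok");        break;
        case 404: puts("not found"); break;
        default:  puts("other");     break;
        }

        /* if: an arbitrary predicate over several values; there is no
           switch (or match) pattern equivalent to this condition */
        if (status >= 500 && retries > 0)
            puts("retrying");

        return 0;
    }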


It is the normal Git flow. GitHub just put a web UI on top of it.


The Linux kernel manages to do it with versions many years behind.


No. The Linux Kernel uses release branches. This thread is specifically about rejecting the use of release branches.


Releases are created off master. Only when something is supported long-term is a branch created, and only fixes are backported to it; no development happens there.

What the GP suggested, I assume, is one master-like branch for every deployed version. Otherwise it makes no sense.


The point remains. The Linux branching workflow is vastly more complicated than what GP advocates.


The major reason for the transition is higher margins, plain and simple.

For customers, both average users and developers, it will be a pain with little to be gained.


Unlikely: having full control of your roadmap is a huge strategic advantage. Profits and revenue are nice, but if Apple were interested in that, they could dual-source x86 from AMD and drive costs down.

You don’t think companies like Oculus are envious of Apple’s flexibility from not having to rely on Qualcomm for their mobile SoCs? It’s not just about profit margin.


> Profits and revenue are nice, but if Apple were interested in that, they could dual-source x86 from AMD and drive costs down.

If Apple could do that and play it to their advantage, they would have done so a long time ago.

Higher margins and profits are the drivers in the end. Strategic control is just one means they use to achieve that. It is a publicly traded company, after all.

> You don’t think companies like Oculus are envious of Apple’s flexibility from not having to rely on Qualcomm for their mobile SoCs?

I doubt Oculus cares, given their goal. There are pros and cons to vertically integrating everything into one company.


No doubt profit margins are a big factor in their thinking.

However, performance/watt matters too.


Genuine question: unless you run a datacenter with thousands of CPUs, does it really matter?

Apple has zero presence in data centers.

I read people here writing "double the battery life" without any source, but even if that were the case: I own a laptop that does 2 hours on battery. I use it to run models on a discrete GPU, so power efficiency goes out of the window anyway; it's really not achievable.

The other one can handle average workloads for 12 hours and weighs a bit more than a kilo. If it were smaller or lighter it would be a much worse laptop than it is (if it's too light it has no stability and you constantly fight to keep it steady on your desk).

Who needs more?


> Genuine question: unless you run a datacenter with thousands of CPUs, does it really matter?

I think it does. Other than double the battery life (which I wouldn't really need, but my Dad who travels a lot would absolutely love), the big thing is thermals (which were specifically mentioned in the keynote).

The biggest constraint on laptop performance is thermal throttling. That's why gaming laptops have huge bulky fans. A current MacBook has pretty decent performance in short bursts, but if you run something (say, a compiler) at full throttle for a few minutes, it gets significantly throttled.

Better thermal performance (which is directly tied to power usage) could well be the key to unlocking close-to-desktop performance in a laptop form factor, which could be a pretty big win for the MacBook Pro market.


Source?

VS is not even 64-bit yet...


It is the fault of both.

Interns are not stupid and so they have to carry the burden of their mistakes too.


No, absolutely not. As an engineer you develop systems and processes that don’t allow such major mistakes.

You can’t fault people for making simple mistakes or you’ll end up with an organization where nothing gets done.


Every single human holding responsibility at any level gets blamed all the time. There is nothing wrong with that, nor with making mistakes; it is a fact of life.

Engineering processes are orthogonal to that.


Apple does not make LLVM on their own; they contribute to it like many other companies do.


Clang (not LLVM) was made by Apple [1, 2]. They specifically hired the lead developer of LLVM to build it because they were not happy with GCC. At this point many other companies have also contributed, but Clang is first and foremost developed as Apple's compiler. It is a safe bet that they will optimize it for new Apple hardware.

[1] http://llvm.org/devmtg/2007-05/09-Naroff-CFE.pdf

[2] https://lists.llvm.org/pipermail/cfe-dev/2007-July/000000.ht...


> Clang (not LLVM) was made by Apple [1, 2]. They specifically hired the lead developer of LLVM to build it

No, one cannot claim "Apple makes Clang" (which was your original claim) just because they funded the initial effort many years ago. It is not their product and they do not control it. It is like saying Blender is made by the company that developed its initial version.

The LLVM Foundation (a legal entity) is the actual owner.

> because they were not happy with GCC

Many companies are not happy with GCC due to the license, which is why so many companies work on LLVM.

> Clang is first and foremost developed as Apple's compiler.

False. The Clang version that Apple includes with macOS is not even close to the latest Clang.

> It is a safe bet that they will optimize it for new Apple hardware.

False. Optimizing is mainly the job of LLVM, not Clang.

Further, most optimization passes are independent of architecture. Codegen targets Intel, AMD, ARM, etc. hardware in general, not Apple’s in particular.


It is different enough that cppreference has a separate Apple-specific Clang column.


The problem is that most of the games you mention are not considered educational games by the average parent or by institutions.

Even young generations, with casual gamer parents, can hold positions like "videogames are banned at home until our son is 8".


> It's the PR firm's job to have the "media connections" you so desperately crave.

Any meaningful connection is going to be too expensive for someone like GP, I am afraid.

> Tech people are notoriously bad at public relations

Please avoid (false) generalizations.


> Any meaningful connection is going to be too expensive for someone like GP, I am afraid.

This is a false generalization. You don't have to choose a big-name New York PR agency. There are thousands of other shops that are quite affordable. Some are just one or two people, and can be surprisingly effective if you pick the right one.


The PR firm is not what I said is expensive; what is expensive is whatever deal you reach with the contacts they provide.

In other words, even if the PR firm is small, cheap, and one of the best, it cannot really influence what its contacts offer. It may have some leverage, but the price is the price...


> much harder to do it

What is harder? It is a single command to install most dependencies.

Using a C dependency is super easy from most languages, including C itself, of course.
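
As a minimal sketch, assuming zlib as the dependency (installed through the system package manager and linked with -lz):

    /* main.c: using an installed C library is an include plus a link flag */
    #include <stdio.h>
    #include <zlib.h>

    int main(void) {
        /* call into the library to show it is actually linked */
        printf("using zlib %s\n", zlibVersion());
        return 0;
    }

Build it with something like cc main.c -lz; from most other languages it is little more than an FFI declaration or a ready-made binding package.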

> binary size and compile times are smaller at the expense of actual time spent writing code.

Nobody writing C professionally is doing that for those reasons unless the library is trivial.

