Hacker News | new | past | comments | ask | show | jobs | submit | diegocg's comments

There are certainly other use cases. git bisect was enormously useful when it was introduced for finding Linux kernel regressions. In these cases you might not even be able to have tests (e.g. a driver needs to be tested against real hardware, hardware the developer who introduced the bug may not have), and as a user you don't have a clue about the code. Before git bisect, you had to report the bug and hope that some dev would help you via email, perhaps by providing a patch with print debug statements to gather information. With git bisect, all of a sudden a normal user was able to bisect the kernel by himself and point to the concrete commit (and dev) that broke things. That, plus a fine-grained commit history, entirely changed how bugs were found and fixed.


> With git bisect, all of a sudden a normal user was able to bisect the kernel by himself and point to the concrete commit (and dev) that broke things.

Huh. Thanks for pointing that out. I definitely would never have thought about the use case of "Only the end user has specific hardware which can pinpoint the bug."


This is why operating systems are hard. It's not the architecture or the algorithms.


At this point it would seem that the cause of the current outages goes beyond the original DNS issue.


As someone who uses D and has been doing things like what you see in the post for a long time, I wonder why other languages would pay attention to these tricks and steal them when they have been completely ignored forever when done in D. Perhaps Zig will make these features more popular, but I'm skeptical.


I was trying to implement this trick in D using a basic enum, but couldn't find a solution that works at compile time, like in Zig. Could you show how to do that?


  import std.meta : AliasSeq;

  // Stubs so the example compiles on its own
  void handleA()  {}
  void handleB()  {}
  void handleAB() {}
  void handleC()  {}

  enum E { a, b, c }

  void handle(E e)
  {
      // Need label to break out of 'static foreach'
      Lswitch: final switch (e)
      {
          static foreach (ab; AliasSeq!(E.a, E.b))
          {
              case ab:
                  handleAB();
                  // No comptime switch in D
                  static if (ab == E.a)
                      handleA();
                  else static if (ab == E.b)
                      handleB();
                  else
                      static assert(false, "unreachable");
                  break Lswitch;
          }
          case E.c:
              handleC();
              break;
      }
  }


Thanks! That is indeed equivalent to the Zig code... but it feels a bit pointless to do that in D, I think?

I could have done this and been just as safe, though perhaps it misses the point of the article:

    enum U { A, B, C }

    // Stubs so the example compiles on its own
    void handleA()  {}
    void handleB()  {}
    void handleAB() {}
    void handleC()  {}

    void handle(U e)
    {
      with (U)
        final switch (e) {
        case A, B:
          handleAB();
          if (e == A) handleA(); else handleB();
          break;
        case C:
          handleC();
          break;
        }
    }


This makes me sad. AI is a product, and being mentally healthy these days should imply having the emotional ability to be aware of that. If you become emotionally attached to a commercial product, you should seek help.


I wonder if it would even be helpful, because they avoid the increasing amount of AI content.


This is what I was thinking. Eventually most new material could be AI produced (including a lot of slop).


I would say that the lesson here is that cross-vendor replication is more important than intra-vendor replication. It is clear that technology can (largely) avoid data loss, but there will always be humans in charge.


Nitpick: true replication is high availability, not disaster recovery (i.e. not a backup).

If the wrong data gets deleted and the deletion gets replicated, you now simply have two copies of bad data.


Yep, for me it confirms all the reasons why I think python is slow and not a good language for anything that goes beyond a script. I work with it every day, and I have learned that I can't even trust tooling such as mypy because it's full of corner cases; it turns out that not having a clear type design in a language is not something that can be fundamentally fixed by external tools. Tests are the only thing that can make me trust code written in this language.


> Yep, for me it confirms all the reasons why I think python is slow

Yes, that is literally the explicit point of the talk. The first myth of the article was “python is not slow”.


Reminds me of all the fiber-optic infrastructure that was built during the dot-com bubble.


But fiber optics are different: they stay good for a very long time, and you just need to add better transceivers at the ends to upgrade them. It's questionable whether the current GPUs will even be worth powering up once 2nm ones are out, and we also don't know how long they last; electromigration might start killing them in 5 years.


GPUs might become obsolete after the AI bubble bursts, but tons of supporting infrastructure, including non-GPU servers, will be on the market for pennies.


Yes, this bubble is much better than the crypto/NFT one. Same as the mentioned railroad rush: capex burned, but lots of stuff was built and left after the burst.


The problem is that even when you give them context, they just hallucinate at another level. I have tried that example of asking about events in my area; they are absolutely awful at it.


Don't worry Mark, we will also get AI-based ad blockers.


browsers like Brave.

