Hacker News | post-it's comments

However,

> Well, there’s nothing special about the timing of my visit. I’m just travelling—you know, Europe on five dollars a day—and I’m observing the Wall because it happens to be here.

It's relatively unlikely that you'd visit the Berlin Wall shortly after it's erected or shortly before it falls, and quite likely that you'd visit it somewhere in the middle.


No, it's exactly as likely that I'll visit it at any one time in its lifetime. Sure, if we divide its lifetime into 4 quadrants, it's more likely I'm in quadrants 2-3 than in either 1 or 4 alone. But this is sleight of hand: it's still exactly as likely that I'm in quadrants 2-3 as in quadrants (1 or 4) - or, in other words, it's as likely that I'm at one of the ends of the lifetime as it is that I'm in the middle.
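The arithmetic here can be checked with a quick simulation (a toy sketch, assuming the visit falls at a uniformly random instant in the Wall's lifetime, normalized to [0, 1)):

```python
import random

# Sample uniformly random visit times over the Wall's lifetime.
random.seed(0)
N = 100_000
visits = [random.random() for _ in range(N)]

# Quadrants 2-3 (the middle half) vs. quadrants 1 and 4 (the two ends).
middle = sum(0.25 <= t < 0.75 for t in visits) / N
ends = sum(t < 0.25 or t >= 0.75 for t in visits) / N
print(middle, ends)  # both come out ~0.5
```

Under a uniform distribution the middle half and the two end quarters each cover half the lifetime, so the two probabilities match, which is the comment's point.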


I think most people that enter `1/2` in a spreadsheet do indeed mean `January 2nd` and not `0.5`. In the wider world of people using spreadsheets, dates are certainly more common than fractions.


You're right about that, but maybe it should just treat `1/2` as `1/2` and only convert it if it makes sense for the current operation. If I type 1/2, then I want 1/2, not Feb 1, or Jan 2, or 01/02/2025, or 2025-02-01 - unless, again, I have explicitly specified that this cell is a date and that this is the format I want it in.


In Sweden we don't use that numbering scheme; instead we use Day/Month/Year (which makes more sense, as it goes from smallest to largest).


I think it's just what you're used to. If counting from smallest to largest was inherently better, then a dozen would look like 21, not 12. Little vs Big Endian, I suppose.


The issue with Month/Day/Year is that you lose consistency. Both Year/Month/Day and Day/Month/Year are more logical.


It would be interesting to know if your Excel correctly interprets 1/2 as 1st February based on your international settings.


Spoiler: it does.
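The ambiguity the thread is circling can be sketched in a few lines of Python (a toy illustration of the two readings, not how Excel actually implements its locale handling):

```python
from datetime import datetime

raw = "1/2"

# Month-first (US-style) reading: January 2nd.
us = datetime.strptime(raw, "%m/%d")
# Day-first (European-style) reading: 1st of February.
eu = datetime.strptime(raw, "%d/%m")

print((us.month, us.day), (eu.month, eu.day))  # (1, 2) (2, 1)
```

The same literal text yields two different dates depending on which convention the parser assumes, which is exactly why locale settings matter here.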


The feds aren't just going to let the land sit unbothered. They're going to sell it off and/or allow resource extraction. Nothing a non-profit can prevent.


It's just a matter of phrasing. "Hi, I wanted to give you a heads up that XYZ needs doing, and I'll be doing it on Wednesday. Let me know if that doesn't work."


I think what they're saying is that Cursor makes money because it's a good editor in general that integrates AI well, not just because of the fact that it uses AI.

If you just slap a ChatGPT backend onto your product, your competitors will do it too and you gain nothing without some additional innovation.


Cursor without AI is just VSCode. They came up with an AI-native code crafting experience that no one else had thought of before, and if you asked me how they did it, I wouldn't be able to answer you.


(1) That's what the original author is saying. Their valuation is possibly incorrect.

(2) On the other hand, Cursor's value is essentially gluing the two things together. If your data is already in the castle (e.g. my codebase and historical context of building it over time is now in Cursor's instance of Claude) then the software is very sticky and I likely wouldn't switch to my own instance of Claude. The author also addresses this noting that "how data flows in and out" has value, which Cursor does.


But how defensible is that in the market?


> You can’t build a moat with AI

> The AI Code Editor - Built to make you extraordinarily productive, Cursor is the best way to code with AI.

Cursor is literally a VS Code fork + AI.

> unless you’re building or hosting foundation models, saying that AI is a differentiating factor is sort of like saying that your choice of database is what differentiated your SaaS product: No one cares.

Cursor is doing exactly what they say "no one cares" about.

It's bad writing (and thinking).


but that's just what Cursor actually is???? strip the chatgpt integration and it is just vscode


Yeah, I think this really exemplifies the "everyone more specialized than me doesn't get the bigger picture, and everyone less specialized than me is wasting their time" trope. Developers who don't want to deal with the nitty gritty in one area are dealing with it in another area. Everyone has 24 hours in a day.


The difference between a good developer and a bad one is understanding the stack. Not necessarily being an expert, but I spend a lot of time debugging random issues, and it could be DNS, or a file locking issue, or the network, or an API, or parsing EDI, whatever. Most recently I found a bug in software that had to do with how Windows runs 32-bit code on a 64-bit system. I've never used Windows professionally, and I have only had Unix machines since I got a free Ubuntu CD. Yet I figured it out in like 20 minutes by exploring the differences between the paths when running in the two scenarios. Idk, maybe I'm a genius, I don't think so, but I was able to solve the problem because I know just barely enough about enough things to poke shit and break them or make them light up. Compare that to a dev on my team who needed help writing a series of command line prompts to do a simple bit of textual adjustment and pipe some data around.

I'm not even a good developer. But I know enough to chime in on calls and provide useful and generally 'wizarding' knowledge. Like a detective with a good hunch.

But yeah just autocomplete everything lol


It's great that you were able to debug that. It may have come at an opportunity cost of being able to solve some more specialized problem within your domain.

In my job I develop a React Native app. I also need to have a decent understanding of iOS and Android native code. If I run into a bug related to how iOS runs 32 bit vs 64 bit software? Not my problem, we'll open a ticket with Apple and block the ticket in our system.


I guess I never have enough leverage to order Apple to fix stuff; I'm like water and gravity. It's just a random example though, and I agree you do give up a lot by being a generalist. However, most of us don't do really new or hard problems. It's a lot of spaghetti.


I don't think of it as spaghetti but as messy plumbing.


I don't disagree with you, but I do think it's important to acknowledge that this approach requires someone else to do it. If you're at a big company where there are tons of specialists, then perhaps this is just fine because there is someone available to do it for you. If you find yourself in a different situation, however, where you don't have that other specialist, you could end up significantly blocked for a period of time. If whatever you're working on is not important and can afford to be blocked, then again no problem, but I've been in many situations where what I was doing absolutely had to work and had to work on a timetable. If I had to offload the work to someone else because I wasn't capable, it would have meant disaster.


> we'll open a ticket with Apple and block the ticket in our system.

Wouldn't it be annoying to be blocked on Apple rather than shipping on your schedule?


If we're blocked on Apple, so is everyone else. A key consideration in shipping high-level software is to avoid using niche features that the vendor might ignore if they're broken.


This is a year old, does anyone have an article with updates?



Compared to other CVEs, the description for this one looks very different.

Does anyone know why it reads so apologetic?


No updates, because it's just a dude freaking out about his incredible jump to conclusions. The bug was fixed and everyone kept living their lives.


Also, in Canada, it's basically impossible to protect your IP as an individual due to the astronomical cost and lack of options to recover that cost. So copyright will never incentivize my creations, or those of any small creator.


Yeah, but it doesn't look like there's been much progress on the robotics front.


Until reasoner AI 10x's research and solves it far ahead of schedule.

You don't even need robotics, just a good multi-modal reasoner model with domain knowledge that you can attach to a headset, Manna-style [0]. The only thing that makes blue collar work different from any minimum wage labor is the System 2 reasoning required, and that'll get solved.

[0] https://marshallbrain.com/manna1


It sounds like you've never used a welding torch, installed a kitchen sink, or done similar blue collar work. These jobs will never be replaced by robots, or by a non-trained person wearing a headset.


> It sounds like you've never used a welding torch, installed a kitchen sink, or done similar blue collar work. These jobs will never be replaced by robots, or by a non-trained person wearing a headset.

Why do you think they will never be replaced by robots?


Not the person who said it and I wouldn't say "never"...

But I will say that until we have a robot that can fold laundry, we won't have a robot that can go into your crawlspace and replace a chunk of crusty old galvanized pipe with copper or pex.

Robots have excelled, so far, in controlled environments. Dealing with the chaos of plumbing in a building that has been "improved" by different people over the course of a century is the opposite of that.


We do have robots that can fold laundry (in a regular laundry room, and supposedly trained with a generalist policy that can learn other tasks).

https://www.youtube.com/watch?v=YyXCMhnb_lU


One thing is, as a sibling post commented, that the complexity of such jobs is staggering from a robotics point of view.

The other thing is that the salary of a plumber or welder is in the range of $20/hr to $40/hr. Can you make a general-purpose, agile robot function at a total cost of ownership that's substantially lower than that?


Also, you know, muscle memory. The idea that you could slap a headset on a rando and just walk them through doing a trade is ludicrous. It's a great teaching tool, but you can't use it to replace a trade worker real-time.


You could set up a DMZ if you're concerned. But even if someone gets on my LAN, it's not the end of the world. They could send goatse to my Chromecast, I guess.

