
The Steam Machine is smaller than any case that would be considered mainstream in the small form factor community, at least to my knowledge. The FormD T1 is around 10L for example, and would look almost comically large compared to the Steam Machine.

And enthusiast cases like this are often quite expensive and not easy to get. Then you need to think about thermals, and find hardware that actually fits.

You can approach it from another angle and treat it more like a NUC and get a SoC, but then you're probably not going to get close in terms of gaming performance.

So long story short: I disagree that it would be straightforward to build something like this on your own, at the same price point.


How are you declaring it not possible "at the same price point" when the price of this isn't even announced?


At the expected price point, this Steam Machine can't cost much more than, say, a PS5 or a regular PC with comparable specs and still make sense.

We're more or less waiting to see if / how much Valve is willing to subsidize the price, with the expectation of recouping it through software sales.


Are we looking at different charts? What I see right now on a 5 day view is:

- Nvidia -11%

- Palantir -16%

- Oracle -11%

- Meta -5%

With some very quick and extremely cursory napkin maths I do get a figure in the 800 billion range, which the original article mentioned. I guess the linked article rounded it up to make it more sensational.
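For transparency, here's roughly how that napkin math looks written out (the market caps are my own ballpark figures from memory, so treat them as loose assumptions rather than exact numbers):

    # Rough sketch: 5-day drop times approximate market cap, in billions of USD.
    # The caps below are ballpark assumptions, not precise figures.
    caps = {"Nvidia": 4500, "Palantir": 400, "Oracle": 600, "Meta": 1500}
    drops = {"Nvidia": 0.11, "Palantir": 0.16, "Oracle": 0.11, "Meta": 0.05}
    lost = sum(caps[name] * drops[name] for name in caps)
    # Comes out around 700 billion with these inputs - same ballpark as the
    # article's ~800 billion, given how coarse the numbers are.
    print(f"roughly {lost:.0f} billion wiped out")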


> I have seen what people are capable of doing when their tools get out of the way, and they are free to just create. This is how world class athletes, musicians, artists, writers, and of course programmers take what is in their mind and translate it into reality.

I think this is a fallacy. If you approach the question of how these people achieve the things they do with a bias towards tooling then you'll come to the conclusion that it plays a big role in their success.

In reality, many of these folks start with a very strong drive to achieve something and then the rest sort of follows. If you want to be a world class musician, start practicing an instrument. Ideally fall in love with music. The rigorous and meticulous practice routine comes later.

In other words: you can have the world's best tooling that gets out of the way, but you're still as unmotivated to do anything as before.

I think it's a cool idea and it sounds like a fun and creative endeavor. I don't want to talk it down. But I also wouldn't want folks to get the, in my opinion, misguided impression that "tooling -> success" is the correct order.


I'd go further. Some of the world's best tooling is only usable by the world's best users. Examples abound. The best drivers are in vehicles that I would almost certainly crash. Our best pilots are in planes whose controls I literally don't understand. (That is true for the cars, too, oddly.)

A really good guitar is easy to miss notes on. Precisely because good guitarists don't typically miss.

Now, I think you could reword a little bit. The best performers know their tools very well. So well that the tools often just "disappear" for them. This does require some level of quality in the tool. But it requires a ton of effort from the performer, too.


As you get better at something you become more opinionated about what you need your tools to do. You demand more specific and tailored things from your tools, and so you start to lean towards things that are more adjustable.

There is also the case that once your entire livelihood depends on something, consistency and muscle memory matter a lot. Lots of world-class athletes, drivers, and performers probably use tools that closely resemble the tools they learned and trained with their whole lives so they would probably seem kinda anachronistic to a newcomer.


You made me think of this quote: "If you want to build a ship, don't drum up the men to gather wood, divide the work, and give orders. Instead, teach them to yearn for the vast and endless sea." - Antoine de Saint-Exupéry


"if you want to build 100 ships, refer to the former"


> If you approach the question of how these people achieve the things they do with a bias towards tooling then you'll come to the conclusion that it plays a big role in their success.

I think the point of the author in your quoted text is that you want to avoid the tools getting in your way. If you're a writer, you become successful by writing good stuff. That's harder to do if your OS crashes and you have to click through a bunch of menus while you're writing. That's the reason so many bloggers adopted markdown 10-15 years ago - writing in plain text meant the tools got out of their way. It's not about the tools making you more productive, it's about using tools that don't make you less productive.


I think of it a bit differently: if the tool is getting in the way, this will hamper the effectiveness and raise the barrier for skilled individuals to do their best. Yeah, the absolute top-tier max-talent people can do well regardless, but if the tools are better quality and more "out of the way", this allows a greater pool of people to do their absolute best, with less friction.


> if the tools are better quality and more "out of the way", this allows a greater pool of people to do their absolute best, with less friction.

I think YuukiRey's point is that this is not true. The bottleneck for people to do their absolute best is almost never tool-induced friction, until you've already built a strong pre-existing skillbase. Overwhelmingly it's motivation, interest, time, energy, etc.

In theory tools can help with this. In practice, the pursuit of tooling usually ends up being a distraction. This is how you end up with the (overly derogatory) idea of GAS, "Gear Acquisition Syndrome." The equivalent of this for digital things is e.g. the writer who spends money and time trying to find the perfect text editor paired with the perfect keyboard paired with the perfect monitor etc., instead of just writing. There are of course exceptions where tooling really is the main unlocking feature, but those are few and far between.

In fact what I get from YuukiRey is the opposite of this:

> Yeah, the absolute top-tier max-talent people can do well regardless

Rather it's that the best tooling only really makes sense for top-tier people, because for almost everyone else the tooling is not the bottleneck.


It’s a poor carpenter who blames his tools


The Russian equivalent of this idiom sounds like "a skillful surgeon helps a bad dancer". Even though it sounds confusing, what it really means is "a bad dancer blames his balls".


It’s a poorer carpenter who uses a can of beans as a hammer. Pros are responsible for choosing appropriate tools.


Right, I get that. That's their opinion, and I was expressing a differing opinion (that's why I said "I think of it a bit differently" lol)

I have recorded hundreds of songs using digital audio production software since ~1999. Switching to Logic Pro unlocked the opportunity for me to work WAY more effectively than with the shareware tracker I was using before (and Fruity Loops after that), in fact allowing production techniques that are literally impossible with a tracker. Not just large-scale features, but minutiae in how the interface works, "intuitiveness", ease of access like a single key press to enter a certain editing mode, things like that.

When I am working with my mind and trying to be creative, every millisecond spent thinking about stupid UI quirks/peculiarities takes away from the part that actually matters: creating.

If the UI is obtuse, and I can't figure out how to employ a certain technique, the tool is hampering my progress. Conversely, a thoughtful feature in a tool can boost productivity and boost the success rate of reaching a "flow state"[0]. One example of this: there's a common technique to record multiple takes of the same segment of instrumental performance or vocals, and then layer those multiple recordings together to give more dynamism and depth to the sound. Infected Mushroom uses this technique a lot[1]. In Logic Pro 10 or so, they added built-in support for doing this, making it super easy to quickly/successively record multiple takes[2]. I don't know what other DAWs did this at the time, but if you're just learning, this is a really nice production process surfaced in a very low-friction way. Otherwise you are making a new track, recording, trying to line it up properly, making a new track, recording again, lining it up again, etc. It's also not even obvious that this is something you could do, but because Logic made it an actual built-in feature, its very existence also acts as a form of "tutorial" if someone is just exploring the UI or reading the documentation.

So, yeah, as a pretty amateur studio producer at the time, the "best tooling" allowed my skills to improve by a gigantic margin, compared to the slow progress I was making with inferior tools before that. I can't agree for a second that tooling doesn't matter or only matters for people at the top of their game.

[0] https://en.wikipedia.org/wiki/Flow_(psychology)

[1] https://www.prosoundweb.com/exclusive-interview-production-t...

[2] https://support.apple.com/en-ca/guide/logicpro/lgcpb19806af/...


> Right, I get that. That's their opinion, and I was expressing a differing opinion (that's why I said "I think of it a bit differently" lol)

That's fair. I was mainly pointing out that "a bit differently" is significantly underselling it. You are basically of the opposite point of view.


Maybe it's a fallacy and maybe it isn't. But I often hear people say "I don't use tool X because it doesn't actually increase my productivity". X is emacs or debuggers or profilers or Linux or version control or code comments or whatever. And after observing such people work over time I decided that most of them are just trying to justify their laziness. YMMV.


Emacs is in a different category from all of those, since it gives the dev more control rather than more abstraction.


I think I agree that best tooling is not a sufficient condition for success.

But I don't see where the author is committed to such a thesis in the quote you provide.

As far as I can see, they are not even committed to best tooling being a necessary condition for success.


"Give me six hours to chop down a tree and I will spend the first four sharpening the ax."

"If I had only one hour to save the world, I would spend fifty-five minutes defining the problem, and only five minutes finding the solution."


Indeed. When you have superior skill, inferior tooling can be a constraint. But a superior tool will not compensate for inferior skill.


Agreed. You can have all the best tools and you still aren't guaranteed "success"


I think this is a bit of an oversimplification; I see art and technology as more like a dance where it's unclear who's leading whom.

E.g., quick high-level examples: the invention of photography led to Impressionism and to Andy Warhol's entire oeuvre. Today one of the most talked-about artists is Beeple (technology-forward in distribution medium, tooling, and even practice techniques [e.g., "dailies"]).

Music is probably where this is the most profound. Take the trajectories of Miles Davis and the Beatles: both started their careers with a fledgling recording industry and ended them recording in sophisticated studios, using instruments and studio techniques that didn't exist a mere 5-10 years earlier.

In electronic music this is even more pronounced; e.g., Native Instruments' Massive synth facilitating dubstep is a nice clean example, but if you've followed electronic music overall the connection is obvious. E.g., what dates most pre-2000s era music is that today musicians use DAWs, whereas before it was less powerful hardware devices that had a more specific sound (and other arrangement and composition limitations).

This actually feeds into one of the points you made: being successful at art (or anything really) has a lot to do with how excited and motivated you are to pursue it. It's easier to be excited if you feel like you're exploring new territory, ripe with untapped potential, and that has historically often been unlocked by technology. Whereas if you keep comparing your solos to John Coltrane while you're learning the saxophone, that's going to be demoralizing, and you'll feel like you'll never get there, so why bother trying. There are also diminishing returns, e.g., that musical territory has been so thoroughly explored by now that the ROI on developing that specific skill (playing jazz at that level) has been reduced, because so much of the artistic space has already been covered.

If you tie that all back to the art itself, I'd assume that today we already have saxophone soloists who are more technically skilled than John Coltrane, e.g., the music theory is better understood, and we've had decades of iteration to improve practice techniques (there are tons of books and studies on this subject now). But you can't replicate the sheer excitement that those musicians must have felt as they unlocked new musical possibilities by iterating on music theory (a form of technology), and on recording as a new medium to share and learn from.

To be clear, most of what you've said I'd agree with, but I'd add more nuance, like: leverage technology to make the act of creation as exciting for you as possible, but remember that the main goal of the excitement is to keep yourself practicing and improving. And also look for untapped potential. A specific example that's relevant today: I think GPU-based rendering is still under-explored. Beeple has been able to leverage this in his art, but I think the big barrier to entry (probably ~$10,000+ for hardware/software over the course of a career) means there's untapped potential there.

E.g., Daft Punk on well-trodden territory due to the accessibility of technology https://pitchfork.com/features/lists-and-guides/9277-the-yea...:

> Technology has made music accessible in a philosophically interesting way, which is great. But on the other hand, when everybody has the ability to make magic, it's like there's no more magic—if the audience can just do it themselves, why are they going to bother?


Seems particularly funny in an article about Emacs, a piece of software that lets you get in situations where some portion of your "just create" time becomes "managing my custom emacs, please don't break".


One way to look at it is to approach it as a creative practice. A good part of any practice is devoted to developing technique.

Some are just fine with a standardized but unoptimized tool while others are fascinated by building their own high-flying TUI. The journey is the destination. If all you create is a config file, it still counts.


I find it a bit strange when people write about themselves in third person on their own website (see footer and about).

Anyway, the article seems very Amazon-centric since I have no idea what an L6 or an L7 is. I get that they're career ladder steps but that's it.

And having testimonials about yourself on your own website…

The whole website feels like I clicked on an Ad for a person.


If I’m understanding right, he has a side hustle as a public speaker. So the website is an ad for him.


L6 is a senior engineer - typically effecting change and setting direction within a development team (or small group of related teams)

L7 is a principal engineer - typically effecting change at org level which will impact many teams


Amazon skipping “staff” and jumping straight to “principal” seems like such title inflation


On the contrary, it seems like title deflation, as Amazon principal engineers typically work at a higher level than staff at most other orgs (at least I remember a Microsoft principal being basically at Amazon L5-6 level).


Amazon L5 is SDE2. I am not sure how you can equate a Microsoft Principal to Amazon L5. Getting to L6 at Amazon is very easy these days due to title inflation. Managers also know how to rig the system to gather the data points for promotion. There was a time when the Amazon promotion bar was high and an Amazon SDE3 was considered the same as a Microsoft Principal. But things have changed now. A fresher needs only 2 promotions to get to L6. Some are getting there in 2-3 years. So Amazon L6 does not have the value that it used to have a decade ago. At Microsoft a fresher will need 6 promotions to reach the Principal level. People are reaching principal levels early, but not in 2 years.


Note that these are Amazon's definitions - at many companies L5 is Senior SWE and L6 is usually a lead or a step towards staff engineer.


… wait until you see their HN submission history


[flagged]


Ha, this is true. I thought the post was quite useful beyond getting the author’s name out there.

It can get much more navel-gazing than this, such as the posts by 24yos who insist they finally figured out what life is about after working at a startup for 3 years. :)


That's why I flagged it and encourage you to flag it too. Let's keep this self-aggrandizing slop off of HN.


From the guidelines: Please don't complain that a submission is inappropriate. If a story is spam or off-topic, flag it. Don't feed egregious comments by replying; flag them instead. If you flag, please don't also comment that you did.


I didn't do any tests myself but based on what I read you still pay a hefty performance penalty when using e.g., a 5090 on Linux.


That's expected. Always use AMD for Linux gaming.

There is some WIP to address it for Nvidia, but it requires new Vulkan features.

See: https://indico.freedesktop.org/event/10/contributions/402/at...


It’s an example of AMD catering to the AI crowd to somewhat refute your claim that they are clueless.

Not exactly a gigantic mental leap.


I think it actually reinforces the point. They know how to cater to the AI crowd in terms of hardware but still drop the ball at the software level.


And every person I met today had a parrot on their shoulder. Doesn't really mean it applies to the general public (here meaning most developers out there).

I'd say <1% of all developers worldwide have even heard of Nix.


It is used in production much, much more widely than Haskell is, though it remains far from the most common way to do builds or deployments.


This is a really insightful post. I created a Vim color scheme that uses even fewer colors than his, but I didn't realize that you might want to express nesting through varying lightness levels. I also didn't realize that using HSLuv and making all lightness uniform might actually hurt the scheme.
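For anyone curious what "varying lightness levels" might look like in practice, here's a quick sketch (it assumes the Python hsluv package; the hue, saturation, and lightness step are made-up illustrative values, not taken from any actual scheme):

    # Sketch: one color per nesting level, varying only HSLuv lightness so that
    # deeper levels get progressively darker while hue and saturation stay fixed.
    from hsluv import hsluv_to_hex  # pip install hsluv

    BASE_HUE = 250.0        # arbitrary hue, purely for illustration
    BASE_SATURATION = 60.0  # HSLuv saturation, 0-100

    def nesting_colors(levels, start_lightness=85.0, step=8.0):
        """Return one hex color per nesting depth, each one a bit darker."""
        return [
            hsluv_to_hex([BASE_HUE, BASE_SATURATION, start_lightness - depth * step])
            for depth in range(levels)
        ]

    print(nesting_colors(4))  # four hex strings, lightest (outermost) first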


I can highly recommend a recent series of podcasts by The Economist on precisely this topic https://www.economist.com/podcasts/2025/02/06/1-pigs-in-a-ba...


No, it's not out of date. It's very much the reality. Every new tool is just one more thing added on top. When I have to do something in a JS/TS repo at work it's always a surprise which epoch of JS hype stuff I find. Today I fix ESLint warnings, tomorrow it's Biome errors, then I need to figure out how to override dependencies in pnpm, but oh no, there's some bug in Bun now. Did I forget the 10249120491412e12 config options of Jest? Ah no wait, this one is Vitest.

For NextJS, do you remember the runtime used for middlewares? What was this swc thing again?

It never ends. Every year new things are added and they never really replace anything, it's just one more thing to learn and maintain.

If every technology causes exactly 1 issue per week then you quickly spend 50% of your time fixing bugs that have absolutely zero to do with what your company is actually doing.

---- EDIT

And it doesn't even stop at issues. Every one of those technologies regularly goes through breaking changes. In the process, plugins are deprecated, new ones have completely different APIs. You want to upgrade one thing for a security fix, then you're forced to upgrade 10 other things and it spirals out of control and you've spent entire work days just sifting through change logs to change `excludeFile` to `excludedFile` to `includeGlob` to `fileFilter` to `streamBouncer` to I don't know what.


If your complaint is that there are too many problems in too many different tools, then you sound like the perfect target for a UNIFIED tool that abstracts over others.

Because of Vite, there was a total of ZERO work from my side involved in changing from Rollup to Rolldown, or from babel to Esbuild to SWC.

The Rust/Go/uv model is the way to go. This is ONE step in that direction.


Let's assume Vite+ ends up working super well. Then projects using it could very well end up being a delight to work with. But that's a big IF. They'd have to resist the urge to integrate with the many other parts of JS and basically say no to a lot of requests.

But many projects won't adopt it. There are so many competitors all with their own little ecosystems. So in the end, I'll still have to fix all the issues I fix right now PLUS the issues that Vite+ will add on its own.

The only chance I see for something like this actually working is if something like Node/NPM decided to add a default formatter, linter, and so on.


My complaint is that there is too much tool churn in the JS space specifically.

I haven't experienced nearly as much brittle build and dev tooling with other ecosystems, PHP or Python for example. Sure, they have their warts and problems and their fair amount of churn. But the sheer amount of JS tool and framework churn I experienced over the last few years was insane.

It might have cooled down somewhat by now, but I'm burned out. So reading about more churn to fix the churn just rubbed me the wrong way.


I very much feel what you're saying. I could spew out quite a few of these sediment layers from our projects as well - lerna comes to mind, for example. We still have that lerna monorepo that somehow still needs to chug along. Just the thought of having to touch that CI pipeline gives me PTSD, something something EFILTER and I don't know what it was any more, yarn workspace, lerna.json: conventionalCommits yadda yadda.

And as I wrote in another reply: of course other technologies are not without issues and have their churn and warts and problems, but the sheer amount of JS hype and tool and framework churn I experienced over the last few years was insane.

It might have cooled down somewhat by now, but I'm burned out. So reading about more churn to fix the churn just rubbed me the wrong way.


Greybeard (35) here.

When these cynical takes were crafted, Angular, AngularJS, Aurelia, Backbone, Ember, Knockout, React and Vue were all competing for mindshare, with new members joining/leaving that group every month (anyone remember OJ.js and Google FOAM?), being compiled by Traceur, 6to5, the Google Closure Compiler and others from (Iced) CoffeeScript, TypeScript, ES6, AtScript, LiveScript and ClojureScript. We had two fucking major package registries (npm and bower) for literally no reason and we'd use both in every project. We had like 4 ways of doing modules.

Today the stack has stabilized around React and Vue, with a couple of perennial challengers like Svelte in the background. Vite and Webpack have been the two main build toolchains for years now. We discarded all of those languages except for TypeScript (and new ES features if you want them, but there are fewer changes every year). There are a couple of package management tools, but they're cross-compatible-ish and all pull from the same registry.

So does the fact that it’s not NEARLY as bad as it was in 2015 mean that people in 2025 aren’t allowed to complain? Yes. Yes it does.


Okay, fair point, the various *Scripts predate my entry into the developer workforce somewhat.

But just in the repos I deal with at work: yarn, npm, pnpm, Bun, NextJS, Biome, Prettier, ESLint, Vite, Vitest, Jest, Turbopack and esbuild. At least those are the things I remember right now. They all have their idiosyncrasies and issues. They all require learning slightly different configs. Nothing is compatible with anything else. I can't take one library and `npm link` it into another because their toolchains are completely different. Then you have the whole topic of exporting TS vs. JS, different module types, module resolution, custom module resolution rules to accommodate how some outdated plugin somewhere works.

And this really is just the tip of the iceberg.

I wish these folks the best and I hope I'm wrong and that in a few years all of this is replaced by `vite lint` and `vite build`. But my past experience tells me that I'll simply add one more word to this list.


My bet is the same -- this one looks like it will stay, or at the very least will set the trend of consolidation under one or two vertically integrated toolchains. Vite itself is just better than all the previous things by a big margin, and it promises that the whole zoo of auxiliary tools will either perish or be nicely integrated into it.


NextJS and Vite don't work together. Same with npm, pnpm & yarn. If your projects have different sets of tools, then your projects should have a README file so people can manage working on them.


Slightly older here although I don't see how that matters.

No, it doesn't mean people in 2025 can't complain. Hype based noise that gets in the way of people doing good work should be decried even if someone else somewhere (or somewhen) has it bad too.

This isn't a competition of badness.

Gates open. Come on in!


Meh, you're just describing software. Especially complex client-side software build chains.

Opening up iOS or macOS app source code I haven't touched in years in the latest Xcode I just downloaded is a lot like that. There is anything from Swift errors to API changes to my build plist being invalid. And if I use any third-party tools, they probably don't work until I visit each one's latest readme.

And that's without even insisting on using the latest unstable tech (Bun, Biome, NextJS) like you did in your comment, where you would expect that experience.


