
Better speech transcription is cool, but that feels kinda contrived. Phone calls exist, so do voice messages sent via texting apps, and professional drivers can also just wait a bit to send messages if they really must be in text form; they're on the job, but if it's really that urgent they can pull over.

They can also use paper maps instead of GPS.

I get that title length is limited, but the "Trump Says" in the title is a pretty significant detail. He "says" things all the time.

That's true, but to play devil's advocate for a second, just because he says something doesn't make it wrong or bad. Banning Wall Street from buying single-family homes is a great thing that I completely support, and I don't really care which president makes it happen.

It also doesn't mean he can actually do it. There's no obvious mechanism by which this can be enforced without a law from Congress, and it's not entirely clear such a law would be Constitutional. They'd have to base it on the federal government's power to regulate interstate commerce, if they're going to base it on anything, which presupposes an interstate market for shelter, and there's a reasonable argument to be made that maybe that's not a thing; housing is a local concern, and home prices in Topeka don't impact me, a buyer in Boston, if I want to live in Boston.

The policy would be great! You aren't playing devil's advocate for what I said. I wasn't talking about the merits of the policy at all. Him saying it just doesn't have any connection to whether the policy will ever exist. The headline without that detail is wrongly implying certainty that isn't warranted.

The point is that leaving off “Trump says” makes it sound like something that will actually happen.

What baffles me is why people still take it all seriously; we've had well over a decade to examine his patterns of behavior, and the takeaway is that fully 99.999% of his utterances are worthless. In the rare case that his promises are turned into some shambling semblance of reality, there's always plenty of warning; in the case of Venezuela and Maduro, for example, you had significant troop movements for months.

Unfortunately, by treating his every utterance as requiring attention, he gets what he wants, the media gets clicks, bloggers get clicks, and people get to use it as part of an eternal argument over "what comes next".

People here at least should be more adept at recognizing and responding to patterns.


Importantly, that philosophy relies on the result having merit, and working cohesively on its own terms, even if it's not your preference. Like, if I go to a restaurant that refuses me sugar for my tea, it better be darn good tea.

> Like, if I go to a restaurant that refuses me sugar for my tea, it better be darn good tea.

But if you demand sugar in your tea it doesn't matter how good the tea is, right? You are not going to like that restaurant.

> Importantly, that philosophy relies on the result having merit, and working cohesively on its own terms, even if it's not your preference

I am too dumb to understand what this means.


I might prefer my tea with sugar, given the choice, but I'll still be satisfied if the tea is very good (this metaphor assumes I don't demand sugar, but merely prefer sugar). I might prefer that Apple products work differently, but if they work well, I'll tolerate that they don't work exactly how I want. In either case, I'm willing to adapt my preferences a bit to an expertly made product.

> I thought that was supposed to be Apple’s thing. “We decide how to make it and you decide to buy it or not.”

This was Apple; your customization options were limited, but things were well designed and cohesive. If you were willing to adapt to their design paradigms, you'd benefit from their expertise, and also have to put in less effort tweaking. Plus you could pick up any random new Apple product and be up to speed immediately.

But to extend the metaphor, if the tea sucks, I'll stop going to that restaurant. If Apple makes their UIs both immutable and bad, I'll use something else.


> but things were well designed and cohesive.

This is an opinion, though. macOS did do certain things better than Windows, but it also did a lot of things markedly worse. Mac market share never overtook Windows; on merit, it was considered the worse product. You or I might think it was a decent system at some point, but the evidence is really just anecdotal.

I agree with the parent comment: Apple's "thing" was their financial skill, not their manufacturing or internal processes. Once the IIc left the mainstream, people stopped appreciating the craftsmanship that went into making the Apple computer. It was (smartly) co-opted by flashy marketing and droves of meaningless press releases, documented as the "reality distortion field" even as far back as the 1980s.


That certainly applies to the biggest subs, but there are usually still high-quality subs for most topics.

Small subs are more diverse and accommodating IME. Worse than the popular ones, though, are flaired-only subs. They are so heavily moderated that posting feels like an exercise in guess-the-unspoken-rules.

What about the War Powers Act are you talking about? It just limits (or purports to limit) the situations where the president can use the military without a declaration of war. Even if we were suddenly actually attacked (not just Venezuelan forces fighting back), it wouldn't give any path to "no more democracy".

The President can now tell "his" DOJ to indict someone in another country (like its leader), and use that to 'legally' justify an attack on said country to grab the person.

Ironically, the current administration thinks that American courts can hold any country's president accountable for crimes, except the American president.


There is a path to no more democracy, and being at war makes that path a lot easier.

I'm not American; my 'war powers act' statement wasn't pointing at specific legislation, it was a hand-wave to the past.

It rhymes.


I think fragmentation is the wrong way to look at it; they're all basically compatible at the end of the day. It's more like an endless list of people who want to min-max.

Any reasonably popular distro will have enough other users that you can find resources for fixing hitches. The deciding factor that made me go with EndeavourOS was that their website had cool pictures of space on it. If you don't already care then the criteria don't need to be any deeper than that.

Once you use it enough to develop opinions, the huge list of options will thin itself out.


I have a related anecdote.

When I worked at Amazon on the Kindle Special Offers team (ads on your eink Kindle while it was sleeping), the first implementation of auto-generated ads was by someone who didn't know that properly converting RGB to grayscale was a smidge more complicated than just averaging the RGB channels. So for ~6 months in 2015ish, you may have seen a bunch of ads that looked pretty rough. I think I just needed to add a flag to the FFmpeg call to get it to convert RGB to luminance before mapping it to the 4-bit grayscale needed.
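
For the curious, the difference is roughly the following. This is just a minimal Python/NumPy sketch of the idea, not the actual FFmpeg pipeline we shipped; it uses the classic BT.601 weights and ignores gamma, which a fully correct conversion would also account for:

    import numpy as np

    def naive_gray(rgb):          # what the first implementation did
        return rgb.mean(axis=-1)  # plain average of R, G, B

    def luma_gray(rgb):           # weighted sum approximating perceived brightness
        return rgb @ np.array([0.299, 0.587, 0.114])  # BT.601 luma weights

    def to_4bit(gray):            # map 0-255 gray onto the eink panel's 16 levels
        return np.round(gray / 255 * 15).astype(np.uint8)

    pixels = np.array([[255, 0, 0], [0, 255, 0], [0, 0, 255]], dtype=float)
    print(to_4bit(naive_gray(pixels)))  # [5 5 5]: pure R, G and B all come out identical
    print(to_4bit(luma_gray(pixels)))   # [4 9 2]: green reads bright, blue reads dark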


I wouldn't worry about it too much; looking at ads is always a shitty experience, correctly grayscaled or not.


True, though in the case of the Kindle they're not really intrusive (only appearing when it's off) and the price to remove them is pretty reasonable ($10 to remove them forever IIRC).

As far as ads go, that's not bad IMO.


The price of an ad-free original Kindle experience was $409. The $10 is on top of the price the user paid for the device.


Let's not distort the past. The ads were introduced a few years later with the Kindle Keyboard, which launched with an MSRP of $140 for the base model, or $115 with ads. That was a substantial discount on a product which was already cheap when it was released.

All for ads which are only visible when you aren't using the device anyway. Don't like them? Then buy other devices, pay to have them removed, get a cover to hide them, or just store it with the screen facing down when you aren't using it.


Yes, and here in Europe they were introduced even later, with the Kindle 4 IIRC.


Sure, and piss doesn't taste quite as bad as shit, yet I still don't want it in my food.

I don't think Kindle ads were available in my region in 2015, because I don't remember seeing them back then, but you were lucky to get to fix this classic mistake :-)

I remember trying out some of the home-made methods while implementing a creative work section for a school assignment. It's surprising how "flat" the basic average looks until you actually respect the coefficients (usually some flavor of 0.21R + 0.72G + 0.07B). I bet it's even more apparent on a 4-bit display.


I remember using some photo editing software (Aperture, I think) that would let you customize the coefficients, and there were even named presets for different coefficient sets. Ultimately you can pick any coefficients you want, and only your eyes can judge how nice they are.


>Ultimately you can pick any coefficients you want, and only your eyes can judge how nice they are.

I went to a photoshop conference. There was a session on converting color to black and white. Basically at the end the presenter said you try a bunch of ways and pick the one that looks best.

(people there were really looking for the “one true way”)

I shot a lot of black and white film in college for our paper. One of my obsolete skills was thinking about how an image would look in black and white while shooting, though I never understood the people who could look at a scene and decide to use a red filter.


This is actually a real bother to me with digital — I can never get a digital photo to follow the same B&W sensitivity curve as I had with film so I can never digitally reproduce what I “saw” when I took the photo.


Film still exists, and the hardware is cheap now!

I am shooting a lot of 120-format Ilford HP5+ these days. It's a different pace, a different way of thinking about the craft.


> I shot a lot of black and white film in college for our paper. One of my obsolete skills was thinking about how an image would look in black and white while shooting, though I never understood the people who could look at a scene and decide to use a red filter.

Dark skies and dramatic clouds!

https://i.ibb.co/0RQmbBhJ/05.jpg

(shot on Rollei Superpan with a red filter and developed at home)


If you really want that old school NTSC look: 0.3R + 0.59G + 0.11B

These are the coefficients I use regularly.


Interesting that the "NTSC" look you describe is essentially a rounded version of the coefficients quoted in the comment mentioning ppm2pgm. I don't know the lineage of the values you used, of course, but I found it interesting nonetheless. I imagine we'll never know, but it would be cool to be able to trace the path that led to their formula, as well as the path to you arriving at yours.


The NTSC color coefficients are the grandfather of all luminance coefficients.

It had to be precisely defined because of the requirements of backwards-compatible color transmission (YIQ is the common abbreviation for the NTSC color space, I being ~reddish and Q being ~blueish). Basically they treated B&W (technically monochrome) pictures the way B&W film and video tubes treated them: great in green, average in red, and poor in blue.

A bit unrelated: pre-color transition, the makeup used was actually slightly greenish too (which shows up nicely in monochrome).


To the "the grandfather of all luminance coefficients" ... https://www.earlytelevision.org/pdf/ntsc_signal_specificatio... from 1953.

Page 5 has:

    Eq' = 0.41 (Eb' - Ey') + 0.48 (Er' - Ey')
    Ei' = -0.27(Eb' - Ey') + 0.74 (Er' - Ey')
    Ey' = 0.30Er' + 0.59Eg' + 0.11Eb'
The last equation gives those coefficients.


I was actually researching why PAL YUV has the same(-ish) coefficients, while forgetting that PAL is essentially a refinement of the NTSC color standard (PAL stands for phase-alternating line, which solved many of the color drift issues NTSC had early in its life).


It is the choice of the 3 primary colors and of the white point which determines the coefficients.
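
For the curious, here is a quick illustrative sketch of that derivation in Python/NumPy, using the Rec. 709 primaries and D65 white as the example; the luma coefficients fall out as the proportions of each primary needed to mix the white point:

    import numpy as np

    # Rec. 709 chromaticities (x, y) and the D65 white point
    primaries = {"R": (0.64, 0.33), "G": (0.30, 0.60), "B": (0.15, 0.06)}
    white = (0.3127, 0.3290)

    def xyz(x, y):  # chromaticity -> XYZ column, normalized to Y = 1
        return np.array([x / y, 1.0, (1 - x - y) / y])

    M = np.column_stack([xyz(*primaries[c]) for c in "RGB"])
    scales = np.linalg.solve(M, xyz(*white))  # amount of each primary in white

    print(scales.round(4))  # [0.2126 0.7152 0.0722], the Rec. 709 luma weights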

PAL and SECAM use different color primaries than the original NTSC, and a different white, which led to different coefficients.

However, the original color primaries and white used by NTSC became obsolete very quickly, so they no longer corresponded with what the TV sets could actually reproduce.

Eventually even for NTSC a set of primary colors was used that was close to that of PAL/SECAM, which was much later standardized by SMPTE in 1987. The NTSC broadcast signal continued to use the original formula, for backwards compatibility, but the equipment processed the colors according to the updated primaries.

In 1990, Rec. 709 standardized a set of primaries intermediate between those of PAL/SECAM and of SMPTE, which was later also adopted by sRGB.


Worse, "NTSC" is not a single standard, Japan deviated it too much that the primaries are defined by their own ARIB (notably ~9000 K white point).

... okay, technically PAL and SECAM vary too, but only in audio (analogue Zweikanalton versus digital NICAM), bandwidth placement (channel plan and the relative placement of the audio and video signals), and teletext standard (French Antiope versus Britain's Teletext and Fastext).


(this is just a rant)

Honestly, the weird 16-235 (in 8-bit) color range and 60000/1001 fps limitations stem from the original NTSC standard, which is rather frustrating nowadays considering that neither the Japanese NTSC adaptation nor the European standards have them. Both the HDVS and HD-MAC standards define these precisely (exactly 60 fps for HDVS and a 0-255 color range for HD-MAC*), but America being America...

* I know that HD-MAC is analog(ue), but it has an explicit digital step for transmission and it uses the whole 8 bits for the conversion!
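
For anyone unfamiliar with what that range means in practice, a trivial illustrative sketch (8-bit "studio swing", where black is coded as 16 and white as 235):

    # full-range (0-255) luma squeezed into the legacy 16-235 studio range,
    # leaving footroom/headroom inherited from the analogue days
    def full_to_limited(y):
        return round(16 + 219 * y / 255)

    print(full_to_limited(0), full_to_limited(255))  # 16 235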


Y'all are a gold mine. Thank you. I only knew it from my forays into computer graphics and making things look right on (now older) LCD TVs.

I pulled it from some old academic papers about why you can't just max(uv.rgb) to do greyscale, nor can you just do float val = uv.r.

This further gets funky when we have BGR vs RGB and have to swizzle the bytes beforehand.
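
Something like this, say (a toy NumPy sketch; OpenCV, for instance, hands you pixels in BGR order):

    import numpy as np

    weights = np.array([0.299, 0.587, 0.114])  # expects R, G, B order

    bgr = np.array([[255.0, 0.0, 0.0]])        # BGR buffer: this pixel is pure blue
    print(bgr @ weights)                       # [76.245] wrong: blue weighted as red
    print(bgr[:, ::-1] @ weights)              # [29.07]  right: swizzle to RGB first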

Thanks for adding clarity and history to where those weights came from, why they exist at all, and the decision tree that got us there.

People don’t realize how many man hours went into those early decisions.


> People don’t realize how many man hours went into those early decisions.

In my "trying to hunt down the earliest reference for the coefficients" I came across "Television standards and practice; selected papers from the Proceedings of the National television system committee and its panels" at https://archive.org/details/televisionstanda00natirich/mode/... which you may enjoy. The "problem" in trying to find the NTSC color values is that the collection of papers is from 1943... and color TV didn't become available until the 50s (there is some mention of color but I couldn't find it) - most of the questions of color are phrased with "should".


This is why I love graphics and game engines. It's this focal point of computer science, art, color theory, physics, practical implications for other systems around the globe, and humanities.

I kept a journal as a teenager when I started and later digitized it when I was in my 20s. The biggest influence was SIGGRAPH papers that are now available online, such as "Color Gamut Transform Pairs" (https://www.researchgate.net/publication/233784968_Color_Gam...).

I bought all the GPU Gems books, all the ShaderX books (shout out to Wolfgang Engel, his books helped me tremendously), and all the GPU pro books. Most of these are available online now but I had sagging bookshelves full of this stuff in my 20s.

Now in my late 40s, I live like an old Japanese man, with minimalism and very little clutter. All my readings are digital, iPad-consumable. All my work is online: cloud-based, or a VDI or ssh session away. I still enjoy learning, but I feel like, because I don't have a prestigious degree in the subject, it's better to let others teach it. I'm just glad I was able to build something with that knowledge and release it into the world.


Cool. I could have been clearer in my post; as I understand it actual NTSC circuitry used different coefficients for RGBx and RGBy values, and I didn't take time to look up the official standard. My specific pondering was based on an assumption that neither the ppm2pgm formula nor the parent's "NTSC" formula were exact equivalents to NTSC, and my "ADHD" thoughts wondered about the provenance of how each poster came to use their respective approximations. While I write this, I realize that my actual ponderings are less interesting than the responses generated because of them, so thanks everyone for your insightful responses.


There are no stupid questions, only stupid answers. It's questions that help us understand, and knowledge is power.


I'm sure it has its roots in Amiga or TV broadcasting. ppm2pgm is old school too, so we all tended to use the same defaults.

Like q3_sqrt


Yep, used in the early MacOS color picker as well when displaying greyscale from RGB values. The three weights (which of course add to 1.0) clearly show a preference for the green channel for luminosity (as was discussed in the article).


I'm not sure I understand your complaint. The "expected result" is either of the last two images (depending on your preference), and one of the main points of the post is to challenge the notion of "ground truth" in the first place.


Not a complaint, but both the final images have poor contrast, lighting, saturation and colour balance, making them a disappointing target for an explanation of how these elements are produced from raw sensor data.

But anyway, I enjoyed the article.


That’s because it requires much more sophisticated processing to produce pleasing results. The article is showing you the absolute basic steps in the processing pipeline and also that you don’t really want an image that is ‘unprocessed’ to that extent (because it looks gross).


No, the last image is the "camera" version of it, though it's not clear if he means the realtime processing before snapping the picture or the postprocessing that happens right after. Anyway, we have no way to understand how far the basic-processed raw picture is from a pleasing or normal-looking result because a) the lighting is so bad and artificial that we have no idea how "normal" should look; b) the subject is unpleasant and the quality "gross" in any case.


The bottle in question seems to be glass, so many of those questions aren't really relevant. Glass doesn't degrade much from UV light, or at all from biological activity, whether on land or under 7 miles of ocean. Glass is denser than water, so it sank.


Because it's an LLM spambot, it "saw" a couple of keywords and wrote a comment that's vaguely relevant to the article at hand. Do help with kicking it out by flagging its comments.


Ugh, I thought at first you might just be being mean, but a quick look at its other comments 100% confirms it. I don't understand — what's even the point of such comment slop? I mean, on Reddit it's for karma and selling accounts or whatever. But here on HN?


> AI placeholders during development as it can majorly speed up/unblock

Zero-effort placeholders have existed for decades without GenAI, and were better at the job. The ideal placeholder gives an idea of what needs to go there, while also being obvious that it needs to be replaced. This [1] is an example of an ideal placeholder, and it was made without GenAI. It's bad, and that's good!

[1] https://www.reddit.com/r/totalwar/comments/1l9j2kz/new_amazi...

A GenAI placeholder fails at both halves of what a placeholder needs to do. There's no benefit for a placeholder to be good enough to fly under the radar unless you want it to be able to sneak through.


it's not better, as they fundamentally fail to capture the atmosphere and look of a scene

this means that for some use cases (early QA, 3D design tweaks before the final graphic is available, etc.) they are fully useless

it's both viable and strongly preferable to track placeholders in some consistent way unrelated to their looks (e.g. have a bool property associated with each placeholder). Or else you might overlook some rarely seen corner-case textures when doing the final cleanup

so no, placeholders don't need to be obvious at all, and as mentioned, them looking out of place can be an issue for some usages. Having something resembling the final design is better _iff_ it's cheap to do.

so no, they aren't failing, they are succeeding, if you have proper tooling and don't rely on a crutch like "I will surely notice them because they look bad"

