
This fully reflects my own Android experience (https://play.google.com/store/apps/developer?id=Paul+Lutus) -- writing Android apps is by no means a write-and-forget experience. As time goes by, more of my apps are dropped from the platform because of my unwillingness to drop everything and rewrite code for each new Android version.

My original intent was to put my free, open-source apps on the platform, much as I had done before Android existed. But no -- Android doesn't work that way.

My best-known Android app is SSHelper (https://arachnoid.com/android/SSHelper/), a Secure Shell server meant for file transfers. It still works perfectly, but it was dropped from the platform some time ago.

TankCalc (https://arachnoid.com/android/TankCalcAndroid/), same story. It's a well-known multi-platform app that tank farm managers use to profile storage tanks. It still works, but it was dropped from the platform.

And not just mine. Many other free, first-rate Android apps -- Termux (https://termux.dev/) comes to mind -- have been driven off the platform by Google's onerous demands and commercial focus.

It's as though a wall is going up between people who like programming and people who like money.


My time with Atkinson came before the Macintosh, before Hypercard. As a company Apple was struggling and we were preparing for what, in retrospect, was the really terrible Apple III. It was a less optimistic time -- after the Apple II and before the Macintosh.

A digression: the roster of Apple-related pancreatic cancer victims is getting longer -- Jef Raskin (2005), Steve Jobs (2011), now Bill Atkinson (2025). The overall pancreatic cancer occurrence rate is 14 per 100,000, so such a cluster is surprising within a small group, but the scientist in me wants to argue that it's just a coincidence, signifying nothing.

Maybe it's the stress of seeing how quickly one's projects become historical footnotes, erased by later events. And maybe it's irrational to expect anything else.


If there were a link, I would be thinking about all the Superfund sites in Silicon Valley, pondering the manufacture of the Apple II, demographic clustering, or whether there was an unusually strong smoking culture at young Apple Computer, rather than some unique mental stress of the job.

Steve Jobs had a pancreatic neuroendocrine tumor, which is not the traditional form of pancreatic cancer people usually talk about. It is far less aggressive and completely treatable -- in fact, almost 100% curable, since Jobs had it diagnosed at such an early stage.

"Good writing" nearly always collides with something else, for example a writer paid by the word. Or a writer granted too little time to compose prose, as opposed to merely creating it.

A shorter exposition is nearly always (a) better, and (b) more work. I'm reminded of Mark Twain's remark, “I didn't have time to write a short letter, so I wrote a long one instead.”

A classic writing book, now nearly forgotten -- "Strunk & White"/"The Elements of Style" (https://en.wikipedia.org/wiki/The_Elements_of_Style) -- famously exhorts authors to "Make every word count."

An underlying cause is that people don't read enough before presuming to write. This results in malapropisms like "reign him in", an example I see almost daily now. (A monarch reigns over a kingdom; a cowboy reins in a horse.) Examples abound; this is just a common one.

Even worse, I now see automatic grammar checkers making ungrammatical "corrections" (incorrections?) like replacing "its" with "it's," or the reverse, but in the wrong circumstances.

But my all-time greatest annoyance is constructions like "Similar effect to ...", which in nearly all cases ought to be "Effect similar to ..." -- with copious variations, all wrong. Online searches reveal that, in many such cases, the wrong form prevails over the right one.

Someone may object that language is an art form without fixed rules, and that seems right. But even granting that truth, many popular word sequences sound like fingernails on a chalkboard.


Elements of Style is one of the most assigned textbooks around. It's far from "forgotten."


> I’m really struggling with promotion. How do you get people to notice your product?

Start by spelling all the words correctly. If this were a video it wouldn't matter, but text-based appeals ... wait for it ... require competently crafted text.

> I feel like I know less than an elementary school student.

Not likely, but do avoid focusing on esoterica when the most basic elements need work.


This strikes me as a "safe" psychology study, one virtually certain to produce a publishable outcome.

Surprisingly, the linked technical article, which was paid for with tax dollars, is paywalled -- isn't that practice supposed to end?

Without being able to read the article, I'll go out on a limb and guess that the article's data were collected by interviewing people, asking about their drinking habits. This is a very unreliable method compared to measuring people's blood alcohol levels -- granted that the latter design would be prohibitively expensive.

Anecdotal studies are notoriously unreliable. A young researcher once performed an interview-based study that showed married people live longer than single people. On reviewing the paper, an older, more experienced scientist suggested that public records would cost less and produce better results. The young scientist tried again, using actuarial data, and the original conclusion was falsified: married people don't live longer, it just seems longer.


> Doesn't that suggest that the field of theoretical computer science (or theoretical AI, if you will) is suspect?

Consider the story of Charles Darwin, who knew evolution existed, but who was so afraid of public criticism that he delayed publishing his findings so long that he nearly lost his priority to Wallace.

For contrast, consider the story of Alfred Wegener, who aggressively promoted his idea of (what was later called) plate tectonics, but who was roundly criticized for his radical idea. By the time plate tectonics was tested and proven, Wegener was long gone.

These examples suggest that, in science, it's not the claims you make, it's the claims you prove with evidence.


My interactions with Steve Jobs came earlier, when he wasn't quasi-mythical, but was already a PITA. A typical interaction with Steve Jobs in 1976:

"Hi! Are you Steve Wozniak?"

"No, I'm Steve Jobs."

"Okay ... umm ... where is Steve Wozniak?"

I suspect people's preference for those who were actually building things, over selling them, may have twisted SJ's character ... I mean, more twisted than it already was.

Ironically, two people I worked with in the early Apple days -- Steve Jobs, enough already said, and Jef Raskin, who designed the first incarnation of the Macintosh -- both died of pancreatic cancer.

I actually miss Jef. We lived together for a while, as I was finishing Apple Writer and my frequent commutes from Oregon were becoming impractical.

Here's a Jef Raskin story I think almost no one knows. Jef resolved to design an electric car. He packed a bunch of 12-volt car batteries into a relatively small, lightweight car and, after removing the ICE, rigged an electric motor in its place.

On the first test drive, Jef tried to descend a hill, only to discover that the car's brakes, which until then had gotten an assist from the ICE, were nowhere near adequate to stop the suddenly much heavier car. Very scary, briefly out of control, but no harm done.


Tangentially, there remains a test electric car gathering {r,d}ust in one of Google's parking lots, from the early years, that I believe "belonged" to Sergey. IIRC it's at 37.417743, -122.082186

I wonder if they'll ever move it out, put it in a museum or something.


Apparently still there, but mostly hidden under a tree (as seen by Google Maps). In a spectacular irony, Google has no street view of their own parking lot.

I suppose one could periodically check for the presence of this artifact, and if it were to suddenly vanish, that would suggest that Google has decided to build another electric car. It is, after all, legacy IP, best hidden away.


So the mythical Apple car project actually goes way back :)


> Could this be upstreamed into the language's API?

If a language is in use, and if people have written code that generates pseudorandom sequences based on an initial seed, then no -- bad idea. Some programs rely on the same pseudorandom sequence for a given seed, and may fail otherwise.
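(A minimal Python sketch, purely illustrative and not tied to any particular language's actual library, of the kind of reliance that would break if the float-generation scheme were changed upstream:)

```python
import random

def floats_current(seed, n):
    """Existing scheme: 53 random bits scaled into [0, 1)."""
    rng = random.Random(seed)
    return [rng.getrandbits(53) / (1 << 53) for _ in range(n)]

def floats_upstreamed(seed, n):
    """Hypothetical replacement scheme: 60 bits instead of 53."""
    rng = random.Random(seed)
    return [rng.getrandbits(60) / (1 << 60) for _ in range(n)]

# Same seed, same underlying generator, different float values --
# any program that recorded the old sequence now disagrees with itself.
print(floats_current(2024, 3))
print(floats_upstreamed(2024, 3))
assert floats_current(2024, 3) != floats_upstreamed(2024, 3)
```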


That really depends on whether the language's standard library API specifies that implementations will use a particular random number generator, or merely specifies that the RNG will have certain properties. If you're going to depend on reproducible, portable RNG behavior, you need to get those random numbers from an API that guarantees a particular algorithm.
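(Again, just a Python illustration of that distinction: CPython's random module documents Mersenne Twister as its core generator, so an explicitly seeded instance is a defensible source of a reproducible stream, whereas an API that only promises a property -- "strong random bytes" -- is not.)

```python
import os
import random

# Documented algorithm (Mersenne Twister in CPython) plus an explicit seed:
# a stream you can reasonably expect to reproduce later.
rng = random.Random(12345)
reproducible = [rng.random() for _ in range(5)]

# Only a property ("cryptographically strong bytes"), no algorithm, no seed:
# fine for many purposes, useless if you need the same sequence twice.
unreproducible = os.urandom(8)
```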


"Perfect Random [sic] Floating-Point Numbers" ???

I had hoped that, somewhere in the article, the author would say, "In this article, the term 'random' is shorthand for 'pseudorandom'." But no.

Programming students might read the article and come away with the idea that a deterministic algorithm can generate random numbers.

This is like the sometimes-heard claim that "Our new error-detecting algorithm discovers whether a submitted program contains errors that might make it stop." Same problem -- wrong as written, but no qualifiers.


But this article's method converts bits from another source of randomness; it doesn't care where the random bits came from.

It will be just as happy with dice rolls, the PRNG from Commodore 64 BASIC, the built-in random numbers from a modern Intel CPU, or "random" bits harvested by sampling keyboard input; it just takes the "random" bits and makes floating-point values.

So there's no reason to say it must be pseudorandom. In most practical applications it would be, but that doesn't seem essential, AFAICT.
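(A sketch of that agnosticism in Python -- and deliberately the naive 53-bit mapping, not the article's "perfect" scheme: any callable that hands back k random bits can feed the same conversion, whatever produced the bits.)

```python
import random
import secrets

def unit_float(bit_source, bits=53):
    """Map `bits` random bits from any source onto [0, 1)."""
    return bit_source(bits) / (1 << bits)

# Pseudorandom bits from a seeded generator...
prng = random.Random(7)
print(unit_float(prng.getrandbits))

# ...or bits from the operating system's entropy source: same conversion.
print(unit_float(secrets.randbits))
```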


Something interesting happened while reading...

(Sorry, non-native English speaker; originally in German:) "Wait a moment, are you saying that, in all likelihood, there are no random numbers a computer could generate at all, since 'bits' can only ever take the value '0' or '1' (and these days across several layers, but that's another story...)?"

let me try to explain it a little...

3 * (A = >0) adding... 1 * (A = 1) (..another topic, yes...)

...but what was wanted is "NONE" (of it...), or in other words 'maybe random'??

??

:confused:


IMHO a better model would take into account both ends of the temperature spectrum. There are days in the southwest so hot that one wouldn't think of going outside -- just as though it were rainy and/or cold.

In fact, because of climate change, on days when the so-called "wet-bulb temperature" gets to 35°C (95°F), people who dare to go outside will simply die. That day may arrive sooner than people think.

Imagine this: Phoenix, AZ, a day with a wet-bulb temperature at or above 35°C. Everyone is cowering inside near their air conditioners. Then the power fails. This might also happen sooner than people expect.
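(For anyone who wants to put numbers on that, here's a rough Python sketch using Stull's 2011 empirical wet-bulb approximation -- an estimate valid only near sea-level pressure, not a safety tool.)

```python
import math

def wet_bulb_stull(temp_c, rh_percent):
    """Stull (2011) empirical wet-bulb temperature estimate (°C) at ~1 atm."""
    T, RH = temp_c, rh_percent
    return (T * math.atan(0.151977 * math.sqrt(RH + 8.313659))
            + math.atan(T + RH)
            - math.atan(RH - 1.676331)
            + 0.00391838 * RH ** 1.5 * math.atan(0.023101 * RH)
            - 4.686035)

# 46°C (115°F) at 35% relative humidity -> roughly 32°C wet-bulb,
# already within a few degrees of the 35°C survivability threshold.
print(round(wet_bulb_stull(46.0, 35.0), 1))
```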

