I've been an Eclipse guy since college (12 years ago) and haven't seen any need to change. I used vim once for a small Ruby project (a couple of days) and came away thinking "I don't see how, even after years of practicing the keystrokes, someone can be more productive in vim than in a modern editor." And went right back to Eclipse.
Your vision of the future assumes that folks will remain complacent about how much they trust their doctors, health insurance companies, medical device companies and hospitals with their minute-to-minute activity.
To rule out root causes. If the data showed increases in serotonin, decreases in stress hormones, and improved sleep quality, and the patient still replies "bad" to the question "How do you feel?", then obviously a serotonin imbalance is not the problem (which is all SSRIs are designed to address).
If the patient still feels bad and the data shows no serotonin change, then the doctor can conclude that maybe serotonin is still the problem, but Zoloft isn't the right drug to fix it (or the dosage was wrong).
In other words, more data = better ability to get a real answer.
It is absolutely crazy to think about how NASA got to the moon in freakin' 1969. Ten years ago, we didn't have FB, YouTube, widespread WiFi, flatscreen TVs, smartphones, LTE... That we landed a spaceship on the moon 4.5 times that long ago (45 years), when the state of the art was a photocopy machine, is truly mindblowing.
What's even more mind-blowing is the amount of time we've wasted by pumping money, time, and resources into useless wars and spying infrastructure. Imagine what mankind could have accomplished if WW2 had truly been the last war and all that energy could have gone into getting off this planet. I'm sure we would have had "real" hoverboards by now...
And there were people alive at the time who remembered the airplane being invented. Today, we can tell astounding stories to our snapchatting kids of the time when men walked on the moon.
I was lucky enough to once see a Mitsubishi "Zero" fighter in Japan being spun up and taxiing about. I later found out from the company that owns it that they personally knew the pilot who flew it in WWII.
That pilot had started his career on biplanes and ended it (in a crash, unfortunately) flying F-4 Phantom jets (commonly called third-generation jets).
In my mind these eras of flight have been so... separated that I could not really imagine one man flying such a wide range of technology (and of course, he saw the moon landings).
Yeah, the delta in technology that bookends a life is getting bigger fast.
My Dad told me that his first airplane ride was as a kid, when a barnstormer put on a show at his town in Utah and the pilot/showman took him up in his Jenny.
The kicker is that Dad told me this when he, Mom, and I were at flight level 350 about halfway to Hawaii for their 61st wedding anniversary, flying in a Boeing 757.
From an iffy ragwing biplane to a routine miracle world-spanning machine in his lifetime. Wonder if I'll get to see the equivalent of 757 service to Mars before I shuffle off...
I have just finished reading the Command and Control book about nuclear weapons development in the US. The differences in technology are even more striking than in the space race. Until quite recently, nuclear weapons had less security than the lock screen on a smartphone, and the early warning and firing systems had less communication reliability than sending an email. So much of the modern world seems to be derived in significant part from the cold war arms race.
Consider 1969--much less going back to when many of the design decisions were made: no electronic calculators, just barely color television, mostly rotary dial telephones. (And by "no" here, I mean nothing approaching mainstream.) Moore's Law had only recently been coined. No PCs, of course. No Internet in any meaningful sense.
All of what you say is true; however, it's not as bad as you might think. Once upon a time there were these magical devices called "mainframes" [1], and in the 1960s they were quite powerful (for that era).
Most engineering work was done in FORTRAN, and it ran very efficiently on the hardware. There were (usually) no CPU-cycle-sucking GUIs to slow down the computers.
As a high school student in the early 1970s I was privileged to take a summer course at the Goddard Institute for Space Studies [2] where they had an IBM 360/95 mainframe [3] for the scientists to take turns using (job entry by punched cards, job output by paper printout).
It's been so many years, and it was only casually explained to me, but I think NASA used three other 360/95 mainframes (the IBM top of the line at the time) located at the Goddard Space Flight Center [4] in Greenbelt MD to track the Apollo missions. I think these ran the same program more-or-less in triplicate (but there was no hardware for syncing). I think NASA also had an IBM 7094 [5] running an independently written program as a backup in case something went wrong with the S/360 computers.
Trust me, these computers were very very capable for the time. It's not like the primitive computer onboard the Apollo LEM. Mainframes were quite up to the task.
I've programmed on an IBM 360 in FORTRAN :-) so I knew they had computers. NASA has always been one of the big US government buyers of computer technology. The technology they had to work with on the ground was certainly relatively more sophisticated than what they could fit into the spacecraft and expect to run reliably.
But color TV had been widely available for about a decade. In fact, I watched the 1969 moon landings on my parents' 1961 color Zenith. (They had a rotary phone well into the 80s, however.)
My mother had one of the early SCM Marchant electronic calculators in her lab: http://www.oldcalculatormuseum.com/scm240sr.html. I don't remember the exact date, but it looks like it would have been late 1960s as well. I remember how laughably slow it was doing square roots; you could watch it going through some sort of successive approximation algorithm.
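For the curious, here's a minimal sketch (in Python, purely illustrative; I have no idea what scheme the Marchant actually used, and real calculators often used digit-by-digit methods instead) of the classic successive-approximation idea for square roots, the Babylonian/Newton iteration:

    # Babylonian (Newton) iteration: repeatedly average the guess
    # with a/guess until the guess stops improving. Each pass is the
    # kind of visible "refinement step" you could watch on a slow machine.
    def slow_sqrt(a, tol=1e-10):
        x = a if a > 1 else 1.0      # crude initial guess
        while abs(x * x - a) > tol:
            x = (x + a / x) / 2.0    # successive approximation step
        return x

    print(slow_sqrt(2))  # ~1.41421356...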
"By the late 1960s they did the technical ability, not to mention the requisite madness, to send three guys to the moon and back. They did not have the technology to fake it on video." https://www.youtube.com/watch?v=sGXTF6bs1IU
Great video. For others wondering: it debunks conspiracy theories about faking the moon landing by arguing that it was technically possible to fly to the moon but not to fake it. It goes into the logistics of how a moon landing would have been faked (film capacity and the technical capabilities of cameras in the '60s).
I think the moral is that you work with what you've got. A lot of modern technology makes us more efficient in terms of personnel numbers, but if headcount isn't constrained, it's not that much more efficient overall. Like, if you can't easily change drawings because you don't have a CAD program, that's a pain, but the solution is to hire dozens of drafters to redraw them.
What's more crazy to me is that 50 years later, we have not built an air-breathing aircraft that surpasses the SR-71 in speed or altitude. We clearly have the capacity to achieve great feats of technological ingenuity, but that doesn't mean those feats get done.
> I find Evans' analysis of mobile a bit hyperbolic.
Spot on. I've been a follower of his for some time now, and while he's obviously smart and insightful, he does sometimes veer into hyperbole bordering on know-it-all snark. There's nothing wrong with that per se, other than that the audience might not take it as seriously as an argument made more rationally.
I wonder to what degree public perception of a company's brand (think HP, Microsoft, Ebay vs Google, Apple) helps or hinders their ability to get a product off the ground. It reminds me of the old Shakespearean "What's in a name?" question.
One would like to believe that the product, if good enough, will always win out, but that's probably not the case, especially if it relies on an ecosystem to develop around it to be fully viable.
If brand reputation matters that much, then an interesting question is whether or not startups have an advantage over large corps with bad reputations. Is it better to be StartuppyMcstartup, which nobody's ever heard of, or Microsoft?
I think it can hinder it quite a bit, especially for something like this. The UX design for blending physical and virtual environments has to be nearly perfect. HP's track record with consumer-facing software does not inspire confidence that they can pull this off. I hope to be surprised.
I think this will have a huge impact on the success of the product. I have an HP laptop right now, and all the HP-made software is terrible: it doesn't run well, crashes constantly, and doesn't follow the UX conventions of any Microsoft OS (in fact, parts would feel more at home on OS X than Windows). If they can get their act together and field a team that actually puts out usable software, that would go a long way toward making this a viable product.
If Apple had released this product, it would sell like crazy. I think the fact that HP made it will prevent it from getting traction, but it would be nice to be proven wrong.
I understand your sentiment, but can't quite agree.
Apple are innovating. I like the way all my iDevices are becoming 'as one'. It's not quite Mark Weiser's vision yet, but it's compelling and builds on Apple's core value proposition: non-fragmentation.
Sprout is a 'gilding the lily' kind of innovation. It's impressive (in my humble opinion) and could open up a new type of product if there is sufficient demand for HP to continue. But it feels more like a marketing-led shot in the dark than HP building on their strengths in the touchscreen PC space.
--
Anecdotal cul-de-sac:
My assumptions about innovation were overturned on a college summer project (EE). Having been told to 'innovate', I produced a thing + bells + whistles. My comparatively low marks and my tutor's comments showed me (rightly) that the innovation should have focused more on the 'thing'. The rest was not so important.
I now think of it as an Overton window in the product lifecycle (I'm sure there's a proper term for this, but I don't know it).
'Just because it can be done, doesn't mean it should.'
> Anyway, Sprout looks very interesting and promising.
Interesting, I agree completely. Promising? I dunno about that. Given the price, I'd be afraid to buy it, given the high likelihood that HP forgets it exists in 6 months. Unique hardware like this requires serious development effort to utilize it properly, and if this doesn't get traction—and at almost $2,000, I think it's unlikely to get any traction—then nobody's going to bother writing that software. It's a chicken-and-egg problem; without sales, there won't be much custom software, and without that custom software, there's no reason for most to buy it.
I admit the second thing I thought of, after finally figuring out what this thing does, was: how are they going to sell this for $50, and what will be the equivalent of the $75 ink cartridge you're required to buy every three months?
The first thing I thought of was that whenever you see something like this, it's to goose the stock price. Generic investor types will fall for anything, as generations of AT&T and IBM advertisements have shown. I checked finance.google.com, and this must be either very fresh news or more than a week old.
When we all first joined Facebook, we were hit with an exciting torrent of old friends we'd lost touch with, especially old hookups/romances that suddenly became possible again. This was like a pathogen entering a fresh population of potential hosts.
Then, over the past 6 years, we explored all of those new possible relationships and took them to their conclusion. But now there's nothing left; like the pathogen killing off all its hosts and having nowhere else to go.
In order to survive, the population has to create new hosts faster than they're being killed, i.e., Facebook would have to generate new connections for us faster than we can explore them. I don't think it comes anywhere close.
In this sense, it definitely has biological underpinnings.
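A toy simulation of the dynamic I'm describing (Python; every number here is invented purely for illustration): a big initial pool of rediscoverable connections gets burned through quickly, and a small trickle of genuinely new connections can't sustain the early level of activity:

    # Toy "pathogen burning through hosts" model. All parameters invented.
    pool = 200.0         # rediscoverable old connections at signup
    consume_rate = 0.4   # fraction of the remaining pool explored per year
    new_per_year = 10.0  # genuinely new connections appearing per year

    for year in range(1, 11):
        explored = pool * consume_rate          # "infections" this year
        pool = pool - explored + new_per_year   # pool shrinks toward a floor
        print(f"year {year}: activity {explored:.0f}, pool {pool:.0f}")

    # Activity collapses from ~80/year toward the steady state where
    # consumption matches replenishment: a pool of new_per_year / consume_rate
    # = 25, i.e. activity of just 10/year -- the post-epidemic baseline.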
>In order to survive, the population has to create new hosts faster than they're being killed
I don't think that's true. Why do there need to be new connections for you to go through? What you just described is the equivalent of setting up the contacts list on a new phone from scratch: you get the phone and re-add all the people you know or knew, because you want to stay in touch with them.
Once that is done, does that contact list of yours also "die out" because there is nothing new to add on a regular basis? I don't think so. You just finished building your initial contacts. And you know what happens then? You start to use it. On a regular basis. It is a tool you set up and once it is set up you use it. Simple as that. And that is the same way you can use Facebook. Once you added all the people you want to stay in contact with you can do just that.
It all boils down to how people actually use Facebook. Do they use it like a tool, as I just described? Or do they merely play the game of "who can collect the most 'friends'"?
I use it as a tool. I have my real friends on there. I don't have 300+ people on there just so when I post something I can marvel at the multitude of "likes" to get some kind of gratification through it. And as long as all those people continue to use Facebook I can continue to keep in touch with them this way. Even past 2017 or whatever date these people have calculated.
One reason AIDS has been so successful as a pathogen is that it takes a long time to kill the host. The infection has plenty of time to spread.
Ebola, on the other hand, never spreads very far because it kills so quickly, as described above.
I wonder if we'll see a "slow-kill" social service emerge in the next few years; one that slows your roll, prevents you from blitzing through the experience all at once.
I was wondering about this myself. I've been developing a Chrome extension on and off for the past year and did a moderate amount of reading into Chrome, Chromium, and how user data is handled. Seemed OK to me. Wireshark doesn't show anything particularly alarming, either. Can anyone else confirm or deny?