
Somebody should talk to the OP's mother about cryonics.


Why? They haven't even proven human revival is possible. Until they do, it is nothing more than vaporware, if not an outright scam. And this statement: "We believe that the damage caused by current cryopreservation is limited and can someday be repaired in the future." I don't really care what you believe when selling a scientific solution. What is proven? Why do you believe that? Good marketing? Actual data showing the damage is limited?

But hey, by the time they fail to prove it correct, you'll be dead and have no recourse or care. Thanks for the money.


If you're waiting for revival to be shown first, then you're missing the point. The whole idea is to halt decay while there are problems that can't be solved yet. It's an ambulance ride to the future.


It's a little late for that, unless you plan on performing a séance.


I did read part of the article but didn't finish, because it is too depressing to read about someone murdering a family member out of ignorance. Thanks for letting me know how it ended.


There are other religions that will make the same promises for less money.


Fuck off.


We ban accounts that do this, so please don't do this on HN, regardless of how wrong or provocative another comment may have been.

https://news.ycombinator.com/newsguidelines.html

https://news.ycombinator.com/newswelcome.html


Leave provocative statements as-is and censor succinct and appropriate responses? You should reconsider what effect that policy has, given the context.

Goodbye.

(Bug report: there appears to be no way to delete accounts.)


You're assuming machines can't be creative.


While they may one day be 'creative', I feel creativity will be the last bastion of human capability beyond AI's reach. Luckily, the interesting thing about creative endeavors is that they are often a unique synthesis of many things. AI may create wonderful art, music, entertainment, etc., but this does not mean things created by humans won't still be valuable to other humans. Things would just be created in parallel and in communion with AI. That being said, the percentage of the population that can 'create' for income/profit may be very small. Hopefully by that point, we will create for the intrinsic value of creating and sharing, and not for money.


They can only have the perspective that was possible from the program that created them -- ultimately they're constrained by the original programmer (artist). That set of possibilities might be very large, but it's still limited.

Because art is expression, and expression is fundamental to humanity, machines won't be able to push art in the same way that humans can.


They can, but it's going to take a good long while before they have the same kind of access to their physical environment to experiment like humans can. Just think about the steps involved in figuring out that bean water can be used in meringues, for instance: not just the mental ones, but the physical ones as well.


I think there is a high likelihood that machine creativity or thought will be different from human creativity or thought.


Probably yes, but people will also keep valuing stuff created by other people.


The LTS stability doesn't really start until the .1 release.


Certainly. Also weather balloons that were in fact doing remote monitoring of Soviet nuclear weapons testing (see Project Mogul, run out of Roswell, NM).


They are in fact teenagers by modern definitions. Back then, you were a fully responsible adult shortly after puberty. Many of the greats did their best work before 20.


Who did their best work before 20? I looked up a bunch of Greek philosophers, and it doesn't seem to apply to any of them. Socrates was in the military during his early life, and Aristotle did his most famous work late in life.


One possibility is that GP is thinking that "life expectancy was 25, so 20 means you're an old geezer on the deathbed." Alternatively, it may also be a reference to the extreme youth of Alexander the Great, who became king of Macedon at age 20 and proceeded to put down a Greek rebellion and then conquer Thrace, Persia, Egypt, Bactria, cross the Hindu Kush, conquer bits of India (then unknown to the Greeks) and was only stopped from going further when his army said "it's been long enough, can we go home?" He died 3 years later, at age 32. Obviously, he is very much an outlier.


Measures of life expectancy are misleading in this regard.

Much of low life expectancy comes from very high infant mortality -- deaths before age 5 or 10. Once you've passed that threshold, your odds of survival increase markedly. Perhaps slightly lower than today's first-world standards, but hale-and-hearty at 15 doesn't mean you're going to fall over dead at 25.

You also want to look at life expectancy at any given age. And I distantly recall that the number of (I can't remember specifically which) Greeks or Romans surviving to age 80+ compared favourably to modern times.


Yeah, Alexander was a super hero.

I remember reading stories where he charged ahead of the rest of the army, scaled enemy walls, chased down royal bodyguards, and so on. He was bloody crazy. A guy with his (lack of) brains should not have made it past 16, let alone to 32.


A real-life example of refuge in audacity?


I miss common sense.


Sometimes you want to read the article.


I did. It was full of shit. These are the ramblings of someone who's never worked with a mature framework where, as always, there will be lots of legacy stuff lying around just begging to be fixed.

But no, fixing stuff is hard. Let's write bitchy blog posts and blame everyone else.


Demurrage money.


You do cryptographic compute. You're about the only HPC application that benefits from AMD's architecture. Most other scientific compute and deep learning applications have horrible performance on AMD compared with NVIDIA (usually for software reasons that are in principle fixable, but it is what it is).


There are a lot of applications beyond crypto that depend on integer performance, even in graphics, not just scientific/engineering compute.


Sure, but if you're not integer-performance bound, which most graphics work won't be, it doesn't usually matter.


Yet a lot of stuff beyond crypto depends solely on integer performance plus memory bandwidth, graphics included. 2D image processing is better done in integer or fixed point.
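
To make that concrete, here's a toy sketch (my own illustration, not anything from this thread) of a Q16.16 fixed-point multiply and an 8-bit pixel blend, the kind of inner loop that runs entirely on integer units:

    /* Hypothetical example: Q16.16 fixed-point pixel blending in C. */
    #include <stdint.h>

    typedef int32_t fix16;          /* Q16.16 fixed point */
    #define FIX_ONE (1 << 16)       /* 1.0 in Q16.16 */

    /* Widen to 64 bits so the product can't overflow, then shift back. */
    static inline fix16 fix_mul(fix16 a, fix16 b) {
        return (fix16)(((int64_t)a * b) >> 16);
    }

    /* Linear blend of two 8-bit pixels with weight t in [0, FIX_ONE].
       All intermediates fit in 32 bits: 65536 * 255 < 2^31. */
    static inline uint8_t blend(uint8_t p, uint8_t q, fix16 t) {
        int32_t r = (FIX_ONE - t) * p + t * q;
        return (uint8_t)(r >> 16);
    }

No floating-point unit is touched anywhere, which is the point: throughput is bounded by integer ALUs and memory bandwidth.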


In HPC?


HPC included. A whole lot of simulations fit well into fixed point.
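
For what it's worth, a toy sketch (again my own, hypothetical) of a forward-Euler step for exponential decay (dx/dt = -k*x) done entirely in Q16.16:

    /* Hypothetical example: a fixed-point simulation step in C. */
    #include <stdint.h>

    typedef int32_t fix16;  /* Q16.16 fixed point */

    /* Widen to 64 bits so the product can't overflow, then shift back. */
    static inline fix16 fix_mul(fix16 a, fix16 b) {
        return (fix16)(((int64_t)a * b) >> 16);
    }

    /* x(t + dt) = x(t) - k * x(t) * dt, all values in Q16.16. */
    static inline fix16 decay_step(fix16 x, fix16 k, fix16 dt) {
        return x - fix_mul(fix_mul(k, x), dt);
    }

As long as the state variables stay within the format's range, the whole loop is integer adds, multiplies, and shifts.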


If there were blank white columns on either side of text in the middle, you wouldn't have complained.


That only works until people start doing it. Then it retroactively stops working.

The only way to prevent your face from being recognized is by not having your face in a photo to begin with.

