The belief that charitable acts are lessened when others learn about them is possibly the most harmful widespread opinion out there today.
Your comment makes it less likely that others will broadcast their charitable giving, resulting in fewer people being prompted to give and less giving in the world overall; all for no gain to yourself or anyone else. You want "a bit sad"? Dwell on that for a while.
There's a "Curb Your Enthusiasm" episode where Larry David's character donates to a museum with his name on the donation. There's no self deception for him; he wants to do something good AND he wants the credit. And that's totally fine! The character is self aware enough to know that part of his motivation is selfish.
My point is that by not acknowledging the selfish part of "wanting the credit", the author comes off as not self aware [or disingenuous, but let's give them the benefit of the doubt]. That's what I meant by cheapening it for them.
If the good is done and lives are improved, does it matter if someone felt good about their act?
Re: "cheapening", who exactly is ranking the charitable act and why does their perspective matter? I was raised with that ideology but eventually abandoned it.
Nowadays, I care a lot more that something helpful gets done than about the signaling game of admitting our desire for recognition or hiding our involvement in the good deed, because in my experience that mindset prevents a lot of helpful giving.
You sound like an idealist. You expect people to do good without getting anything out of it. The fact that they seek recognition reduces the value of their gesture in your eyes.
I don't follow. Does broadcasting this on the internet reduce the number of lives saved? The point of altruism is to save lives, so as long as those are maximized, I don't see any "cheapening" happening.
"But when you give to the needy, do not let your left hand know what your right hand is doing"
A lot of charity is not for the sake of helping someone, but for the sake of making oneself look noble, getting publicity, or earning money (like those "buy X and we will donate Y" promos).
Personally I don't have such bad feelings about that, since it's very often a win-win situation: those in need eventually get something. I can tolerate a little bit of hypocrisy in that case.
The reason I started giving to AMF was because a friend of mine posted about their giving to AMF. I would have been less likely to do so if I hadn't seen that post, so broadcasting probably led to more lives saved. Another friend of mine said he was also motivated to give by seeing the very same post.
Anyway, I didn't claim that broadcasting is positive, or has a point. I simply responded to the claim that broadcasting "cheapens" altruism.
It lessens it for the author because they're looking for validation from someone, in some way or another, the same way the breakfast people do. If you're content being someone who needs validation from other people for your actions, then sure, it doesn't lessen it for them.
I bought two of the limited-edition hardback books -- one for myself, and one to gift a friend. I'm very happy that Derek informed me that the proceeds are being donated to charity. Heck, I probably would've purchased more as gifts had I known about his plans ahead of time.
Maybe he had other reasons to post? Maybe it attracts people toward him who think similarly? Maybe it's to showcase how he thinks about maximizing his contributions, or his thought process in general? Maybe it's to make his customers even happier that the money went to a good cause? Maybe it's to popularize his way of thinking about money and gratitude? Pinning "solely feeling the need to broadcast to the Internet" on Derek is a tough assumption to make. Assuming other people's motives may say more about the one doing the assuming.
Nah, there's something unique about Rust zealots in this respect.
nicoburns' comment is completely off topic, and it's extremely obnoxious to see this kind of thing over and over again from the Rust community. And I even like the language!
I wish I could upvote this a thousand times. At the end of the day, it's all ego-driven development. Nobody wants to think that their job has already been solved and that what's left is mundane and boring [engineering-wise]. And so the business ends up using a distributed log to handle a data set that could fit on a big thumb drive.
Yes, but there at least is some genuine innovation going on. The backend frameworks are just churn for solving the same boring and lucrative business problems in new, fashionable ways.
Obviously you should do what you feel comfortable with, but now is a good time to go to the doctor and get that checked out. The numbers are pretty low and stable for the moment. And FWIW, I had surgery last week, so I'm not giving advice I wouldn't follow myself.
So what? In the history of Python, there were literally zero bugs caused by print being a statement. Breaking backwards compatibility for shit like that is amateur hour.
edit: And just to be clear, I'm fine with making print a function for all new code. There were ways that the python community could have accomplished that [and the unicode switch] without breaking the entire language.
Being a statement is less of an issue than the special behavior of the trailing comma, the '>>' token, and the inability to augment behavior with named arguments. The lexer shouldn't have to deal with that sort of stuff.
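For anyone who hasn't touched Python 2 in a while, here's a rough sketch of the quirks in question (the 2.x lines are left as comments, since they're syntax errors on 3.x; the rest runs under Python 3):

    import sys

    # Python 2 statement forms (valid only under a 2.x interpreter):
    #   print "no newline",          # trailing comma suppressed the newline
    #   print >>sys.stderr, "oops"   # '>>' redirected output to another file object

    # Python 3 function form: the same behavior becomes ordinary keyword arguments
    print("no newline", end="")
    print("oops", file=sys.stderr)
    print("a", "b", "c", sep=", ")   # a custom separator was impossible with the statement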
Backwards compatibility is something you only ditch for extremely good reasons, for instance to deal with a security issue that cannot be handled in any other way. Breaking it for reasons of aesthetics is just plain dumb. The Python ecosystem is still suffering from these major mistakes many years later, and Python's image as a reliable and trustworthy tool suffered right along with it.

Such changes are fine pre-1.0, but once you ask people to base their production systems on it, you maintain backwards compatibility even when it costs you, even if that means holding your nose occasionally. And my suggested fix would at least have allowed the old syntax to live happily alongside the new one, which given the price (two stupid brackets) could easily have been dealt with.

Focusing on purity over the responsibility towards people maintaining hundreds of millions of lines of code in aggregate (if not another order of magnitude more) means you don't have your priorities straight.
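For what it's worth, the compatibility bridge that actually shipped went in the opposite direction: Python 2.6+ let a module opt in to the new function syntax, rather than Python 3 tolerating the old statement. A minimal sketch:

    # Python 2.6+ per-module opt-in to the new form; also valid Python 3
    from __future__ import print_function

    print("hello", "world", sep=", ")   # works on both 2.6+ and 3.x
    # print "hello, world"              # ...but the old statement is now a SyntaxError here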
Imagine we'd drop HTTPS for a new protocol because it fixes the 'Referer' spelling error.
The analogy here is: "Imagine making a new, incompatible protocol and NOT fixing the 'Referer' spelling error, because the old protocol had that spelling error."
Python 3 is the version with breaking changes from 2.x, and since stuff was being broken anyway, they decided to fix the print statement as well.
Python 3 wasn't invented to fix the print statement.
> The lexer shouldn't have to deal with that sort of stuff.
The problem with that is that the entire python ecosystem had to deal with that instead...
I'm going way past rant territory at this point, but there's a reason that Microsoft and Amazon are worth a trillion dollars apiece, and it's not because of beautiful and elegant APIs.
So, you're saying that if Python abandoned quality in favour of aggressive sales and marketing in the channel and cut-throat monopolistic practices against competing programming languages, then its core programmers would rise to the ranks of American social elites?
Python 3 literally broke "hello, world". For some vague notion of aesthetics. Or for some even more obscure "inside baseball" reason that 99% of Python users couldn't care less about.
The following still compiles and runs, on the compiler that comes with the latest OpenBSD:
    main()
    {
        printf("hello, world\n");
    }
Somehow the C language has managed to survive and thrive without breaking the canonical example program.
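For contrast, the Python side of the "hello, world" claim (the 2.x line is left as a comment, since it's a SyntaxError on 3.x):

    # Python 2's canonical hello world:
    #   print "hello, world"
    # On Python 3 that exact line is a SyntaxError; it has to be spelled as a call:
    print("hello, world")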
It survived and thrived because it did not break the canonical example program. If they had broken printf in, say, 1995, the situation would be entirely different today. Oh, and they did get it right the first time by making it a function.
It really opened my eyes to what AD(H)D is like for someone actually suffering from it and speaking his mind without mincing words, and to how hyperactivity is sometimes overemphasized in the diagnosis, so some people get the impression that if you're not hyper it doesn't apply.
And for me specifically this was so ironic because ADHD was the one disorder that my parents insisted I never had, so I just ended up believing them the entire time I was with them.
But at least now I can think for myself and revisit the possibility for once without being shut down.
And they had already been wrong once before, about another disorder they insisted I didn't have and that, surprise, I actually did have. That diagnosis is the only reason I was hired into a position specifically seeking people with said disorder, which is what lets me pay my own rent.
And it certainly has something to do with anxiety.
I might have had panic attacks at one point, back when the pain was new and I wasn't sure if I was in immediate danger, spurred on by late-night WebMD binging. Even after I sort of got used to it and it became "normal," it still stresses me out, just not in an "am I going to die in 5 minutes" sense.
I very well could be wrong, but that's almost completely beside the real point [as is the self-diagnosis regarding ADHD].
The amount of anxiety you're experiencing in your day-to-day life is grossly abnormal. I would suggest not waiting another day to start seeking out the help [therapy or working with a psychiatrist] you deserve. I also suggest you research hallucinogenic medications as something that might relieve your existential anxiety: https://www.hopkinsmedicine.org/news/media/releases/hallucin...
For the Linux kernel? Absolutely. Rust is way too unstable right now for such an important piece of software. Give it 20 years and a standard, then we can talk.
Based on what? The relevant folks and decision makers are already talking, so it doesn’t sound like they think there’s a need to wait 20 years to at least start the conversation.
Writing some optional, non-essential piece != rewriting a piece that will require taking Rust as a dependency for the rest of time.
The kernel folks just did a lot of work to remove the strict dependency on GCC. To think any of the big players [Red Hat et al] want to turn around and make their shining star dependent on a language that's been 1.0 for less than 5 years is just not realistic.