Hacker News | austenallred's comments

> deep learning vs crypto is a clear divide of rotators vs wordcels.

"deep learning vs. crypto is a clear divide of math people vs creative people."

> the former offends theorycel aesthetic sensibilities but empirically works to produce absurd miracles

deep learning doesn't seem like it should work to people who are entrenched in theory, but somehow it produces great results.

> the latter is an insane series of nerd traps and sky high abstraction ladders yet mostly scams

crypto is full of interesting technical challenges but mostly produces scams.


I haven't felt quite that out of touch in a while; thanks for the translation.


It's like glancing at Chaucer and wondering how that ever became the English language, but in reverse.


File a chargeback on the credit cards


Each chargeback on a transaction costs Facebook about $20 even if the credit card dispute does not end up being resolved in your favor. I recommend doing this (in addition to getting legal advice or taking legal action) if you want to penalize Facebook as much as possible for the inconvenience they caused you.


It'll penalize OP way, way, way more and cause them way more inconvenience: the cost of getting a lawyer, potentially ending up on shared chargeback blacklists, wasted time, etc.

I feel for the OP, but this won't stick to the man. It'll stick to the OP.


I think the fear of a "chargeback blacklist" is overstated here. Anyone can easily apply for another credit card, add an authorized user to their existing card, or use any card masking service to change their payment information. Chargebacks exist exactly for situations like this. I've never had issues with any shared blacklist after filing chargebacks against shoddy businesses. You must be ready to cut off contact with the business you are filing against, in case they refuse to do business with you in the future, but that's not a problem here because it looks like Facebook has already cut the OP off.

You don't need a lawyer to go to small claims court. OP can get free legal advice through a local self-help program or an inexpensive legal consultation through a local bar association program if they want to, but paying for legal assistance is certainly not mandatory.


You don’t need a lawyer to issue a chargeback.

Agreed about the FB blacklist though. Sometimes I think I should set up a bunch of shell LLCs for my consumer accounts.


1. You can add more

2. New RAM is not like old RAM.

Macs are dramatically more optimized than they used to be.

I have a MacBook Air (M1) with 16GB of RAM and it runs more smoothly than older systems that had twice that much.

8GB of RAM today feels like what 32GB of RAM used to. 8GB of RAM can handle very process-heavy tasks, like... running Chrome ;)


> 1. You can add more

Tell that to Apple so they can do that out of the box.

> 8GB of RAM today feels like what 32GB of RAM used to

No. I think you're confusing memory speed with capacity. Making RAM faster doesn't mean it can magically store more.


> Making RAM faster doesn't mean it can magically store more.

For real. This myth that "Unified RAM" doesn't need as much capacity as "regular RAM" needs to stop being perpetuated. Intel-based Macs already had memory compression and SSD swap.
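As a toy illustration of what memory compression can and can't do (this assumes nothing about Apple's actual compressor; it's just zlib on a low-entropy page):

```python
import zlib

# A zeroed 4 KiB page: the kind of low-entropy page memory
# compressors handle best. Real working sets compress far less,
# and already-dense data barely at all.
page = b"\x00" * 4096
compressed = zlib.compress(page)

print(f"{len(page)} bytes -> {len(compressed)} bytes")

# The round trip is lossless, so the OS can keep more pages
# resident at the cost of CPU time to decompress on access.
assert zlib.decompress(compressed) == page
```

Compression only stretches capacity for compressible working sets; it conjures nothing for data that is already dense, which is exactly why it can't substitute for having more RAM.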

My M1 MBA with 16GB of RAM was definitely limited by the amount of RAM a number of times throughout its life, and my 24GB M2 MBA has a much better balance due to the additional RAM.

8GB is fine for someone who doesn't do anything but basic web browsing and word documents, but I'm not comfortable recommending 8GB of RAM to anyone who intends on doing more than that. I'm honestly a bit uncomfortable with 16GB these days, but it is tremendously nicer than 8GB.

If Apple hadn't just lowered prices by $100 on Mac mini, I would say that 16GB should be the minimum, but for $599... I think 8GB is probably fine for what you're getting.


If it's running an SSD, at least the swap speed won't be as crippling. Assuming it uses any.


Everything you said is true; I'll even add that macOS is probably the best functioning Desktop OS under low memory conditions!

However, the parent is right: it's disgusting how much memory capacity is needed to run a basic environment.

Maybe having exceptionally fast machines with limited ram will cause people to actually think about their resource usage.


> the answer to bloated apps

RAM is dirt cheap these days. Bloated apps are bad, but 8GB really is simply a joke that punishes many people for the sake of product categories, at least until RAM is indistinguishable from storage.


Not all RAM is dirt cheap. RAM in DDR4 sockets is dirt cheap.


8GB of general-purpose RAM, regardless of form factor, is dirt cheap. In fact, not having the sockets might make it cheaper than socketed RAM in volume, since there are fewer parts in total.


Except, with Apple's new kit here, the RAM isn't simply some external chip soldered to the main board, it's actually on-die with the CPU silicon (and everything else in that silicon: GPU, memory controllers, etc).

So yes, arguably there are fewer parts (just one), but in the event of e.g. some bad RAM during manufacture, it's far more costly to throw out the chip containing that bad RAM.


No. It is not possible to make DRAM on the same silicon process as high-performance CPU logic. It is a myth that Apple Silicon includes the RAM on its die. Apple uses external LPDDR packages, just like everyone else, which you can clearly see in this photograph of the mac mini's CPU module: https://valkyrie.cdn.ifixit.com/media/2021/01/28102657/m1_ch...


Thanks for the correction, appreciated.


I don't think this is correct.

https://www.macobserver.com/analysis/understanding-apples-un... is an article which describes the architecture. It's not on-die, it's next to the CPU, basically as close as it could physically possibly be without being on the die.

Here's a photo of an M1 CPU:

https://www.techinreal.com/wp-content/uploads/2021/05/2feddf...

Those chips on the right side are LPDDR4x chips (which you can verify by googling the part numbers visible on them). They are "off-the-shelf" so to speak, not custom on-die memory.


Thanks for the correction, appreciated.


> best functioning Desktop OS under low memory conditions

No, it's not. GNOME or KDE can have 2x if not 3x the memory overhead.

> with limited ram will cause people to actually think about their resource usage

Why? Memory is cheaper than designing new CPUs. 8GB in the base model is just a way for Apple to upsell upgrades and improve margins through market segmentation. If they shipped 16GB in the base model they'd have to cut prices or lose customers. It's as simple as that.


GNOME and KDE are desktop environments, not operating systems.

Linux (probably the operating system you mean) performs terribly when there is no available memory. Solaris ran GNOME and also performed well when memory was low.

Why force people to build software more efficiently? Because I have a desktop with 64GB of RAM and it is doing the same thing my desktop from 2008 was doing, only with a few more pixels, a few fewer animations, and a whole bunch of people thinking that hardware can pick up the slack for development companies externalising their costs (since devs cost money, right!?).

well. hardware costs money for us. fix your software.


> Linux (probably the operating system you mean) performs terribly when there is no available memory.

I assumed you were talking about a system with a low amount of memory, rather than one where almost all available memory is already in use. You might be right about the second case (though why would anyone buy a new Linux PC with just 8GB of memory? Linux laptop OEMs charge the same for an upgrade to 64GB as Apple does for 16GB).

Why do you need to buy the latest macbook with a very fast CPU? Well you wouldn't, if software was more efficient. It's exactly the same argument as with memory.

> doing the same thing my desktop from 2008

Similarly to how your desktop from 2008 was doing the "same" job as a Windows 98 machine? Except it's not really doing the same job; the expectations consumers have of software have changed dramatically over the last 15 years.

> well. hardware costs money for us. fix your software.

Yes, it does cost money, because Apple charges extremely high margins on memory upgrades. If you could upgrade your own RAM, or if Apple's upgrades carried the same margin as the base model itself, an additional 8GB wouldn't cost more than an extra $50 (maybe $100 at most, if the memory Apple uses is especially 'advanced'; AFAIK it's not).


Send messages to people, write some code, do some online shopping. I don't see how much has changed except battery life and a few more pixels.

8GB of RAM costs less than 16, 32, or 64GB; that's just how it goes.

Now, I don't want to make it sound like I'm an Apple fanboy, but iPhones have less RAM than Android phones, and for some time this had a material impact on size and battery life. The reason they got away with it is that, in order to make software for the iPhone, you had to deal with what you were given.

The situation today is: “everyone has 16GiB of RAM, why would I spend company resources prematurely optimising?”, and since every fucking company has this same mentality it leads to Slack, Teams, Asana, Jira, fucking everything basically, each using more RAM than my first computers had disk space.

The pace of hardware progress has slowed; for most of the last decade you could barely buy a laptop with more than 16GiB of memory, and I had to buy some godforsaken workstation laptop to get 32G in 2017. It is not OK to externalise this cost onto people, and some downward pressure is needed.

I’m not really defending Apple like you think I am, I’m saying 8G is enough for doing most things, but our apps have become bloated as fuck and make us think that 8G is nothing.

My last Linux laptop used 300MiB (not including filesystem caches) for everything, including mail, chat, and development (though that burst during compilation); until I opened Discord, Teams, a web browser with all of its integrated product suite, or Slack.

Then I was up to 12 or even 15GiB of resident memory.
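For anyone who wants to reproduce that kind of measurement, here's a rough Linux-only sketch (a hypothetical helper, not what the commenter actually ran): summing VmRSS across /proc excludes the page cache, matching the caveat above, but counts shared pages once per process, so treat it as an upper bound.

```python
import glob

def total_rss_mib() -> float:
    """Sum VmRSS (resident set size) across all processes on Linux.

    Excludes filesystem caches, but counts shared pages once per
    process, so the result overestimates true memory use.
    """
    total_kib = 0
    for path in glob.glob("/proc/[0-9]*/status"):
        try:
            with open(path) as f:
                for line in f:
                    if line.startswith("VmRSS:"):
                        total_kib += int(line.split()[1])  # value is in kB
                        break
        except OSError:
            continue  # process exited while we were scanning
    return total_kib / 1024

if __name__ == "__main__":
    print(f"Total resident memory: {total_rss_mib():.0f} MiB")
```

Tools like `smem`, which report PSS (proportional set size), divide shared pages fairly across processes and give a tighter figure.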

I won't apologise for that by forcing my hardware vendors to give me more RAM at a lower price.

The better performance of the CPU helps everything, battery life (race to idle) included.

More RAM helps people who don't close tabs; and people making software that does not even attempt to constrain its resources.


> Send messages to people, write some code, do some online shopping

> I’m saying 8G is enough for doing most things,

All of that could technically be accomplished on a Windows 98 machine with 128MB of RAM, couldn't it?

Why shouldn't 4GB be more than enough for most things (maybe even 2GB? That would have been a huge hard drive several decades ago)? I mean, I do agree with your main point, but the cost of an additional 8GB at this point is not really significant compared to the cost of the entire (~$1000+) device, which alone is IMHO a pretty good argument to ship 16GB in the base model.

> The state of hardware progress has slowed, for most of the last decade you could barely buy a laptop with more than 16GiB of memory

Through most of that decade you could upgrade most laptops yourself; even MacBooks, until ~2012. I had a 17" MBP from 2011 and had no issues installing 16GB of RAM myself back then.

But yeah, I agree that there is no good reason for Slack, Discord, etc. to use 1GB+ of memory. The web apps use a bit less, I think, and they do pretty much everything (including notifications); Safari and Firefox also seem to be much more memory efficient than Chromium.

> The better performance of the CPU helps everything, battery life (race to idle) included.

> More RAM helps people who don't close tabs; and people making software that does not even attempt to constrain its resources.

Well, you have different preferences than some other people. Also, there are perfectly legitimate reasons to need more than 8/16GB of RAM besides having more open tabs (what's wrong with wanting to open more tabs, though?) or using Electron apps.

Also, I can both agree with you that software could and should be more efficient, and think that Apple charging this much for memory and storage upgrades is objectively outrageous. They can only get away with it because they purposefully made their hardware non-upgradeable, and because people who use macOS simply have no choice but to pay that much.


It's not about preferences, really.

As a developer myself, I can justify the extra cost of RAM; I can make use of it (IDEs, for example, need a lot of RAM), but the company I work for should really pay for this. With that in mind, $200 is almost nothing.

Overall, I think we agree, mostly I’m absolutely pissed off about runaway hardware requirements for running basic software, leaving no room for me to run my specialist tools; even with top of the line laptops. (leading to me buying an absolutely overjacked desktop, which apparently is not enough soon?)

As mentioned, most of the 2010s I ran with the most RAM you could reasonably get in a laptop, but still felt the slowness because of these “productivity” programs which are often completely proprietary.

My main argument here is that I don't think we should all be running 256GiB of RAM, but it feels like the consensus is that “we need more RAM”, and that continues to be an argument because “we can't do much with 16G”.

I say we agree, because where you ask “why isn't 4GB enough?”, I am asking “when is it ever enough?”


> $200 is almost nothing.

What I'm trying to say is that even with decent margins an extra 8GB does not cost anywhere near $200. Apple is just price gouging their customers because they have a monopoly on the macOS market.

> I am saying “when is it ever enough?”

I'd say: as long as it's relatively cheap. Most people buying a $700-1000 machine would be willing to spend $50 for 8GB of RAM; that's pretty reasonable. So the "minimum" amount should be based on what's generally affordable to 80-90% of consumers buying new hardware. Anyone buying a laptop (or a screenless device based on laptop components, like the mini) in the price range I mentioned could afford 16GB of memory if they could install it themselves, or if Apple sold upgrades with a margin similar to the one it puts on the base device (and not the 200% they charge now).

Anyone who is using an old device should be able to upgrade its memory without having to buy a new one. The fact that Apple is selling computers that could be obsolete in a couple of years is deplorable from the perspective of the environment (though obviously great for Apple's shareholders).

That developers are writing inefficient software doesn't really justify this in any way.


Or

- Spend less on stuff than they assumed is necessary in a hand-wavy calculation


Yeah like healthcare, emergency funds, basic entertainment, education, food that isn't going to ruin your body, the ability to take medical leave / time off so you don't burn yourself out.

You know.. the hand-wavy stuff.


The stuff isn’t hand-wavy; the “this is a living wage” number is.


For me half of the point is that I can just dump everything into one place and not categorize or think about it


A16Z didn't invest in FTX


No, but they've been all over crypto:

https://a16zcrypto.com/


Yes, but crypto didn’t steal billions of dollars. Individual companies did.

A lot of people are still quite reasonably excited about Bitcoin and crypto.


A lot of people are un-reasonably excited about Bitcoin and crypto, in large part because VC backed companies are pushing the story that there is money to be made there.

A lot of those people will lose their savings and/or their shirts because they are not made aware of the fact that those companies are on the sell side. You can try your utmost to separate that from 'Bitcoin and crypto', but to your average victim that doesn't really mean much.

I've used crypto: I think I understand the main limitations, mined a single bitcoin back when that was still possible on a GPU, and have both received and made donations using bitcoin. In the end I see it as having limited usefulness, mostly in cases where people are unbankable for a variety of reasons, but I do not buy the hype.



The juxtaposition of the headline with the items above it from the present is interesting.


If I am interpreting the parent's comment correctly, he's referring to TAs (we called them "team leads") not instructors. Based on the context of the comment I'm assuming that this was a time when we had a TA for every 8 students on top of a layering of instructors for each cohort, and indeed the TAs would do a lot of the 1:1 interaction. There has never been a time we didn't have qualified instructors, though I would readily admit the quality of our instructors has improved over time as we got better at instructional design and hiring.


I love the idea of CIRR, but it is largely a failed institution. Their measures have changed dramatically over the years (at the last CIRR event anyone at BloomTech attended, it was decided that anyone who adds anything new on LinkedIn could be considered "hired," even if it was a portfolio project or self-employment), and they are applied very differently from school to school, resulting in every major school I know of ceasing to work with them.

For example, we used them for our first outcomes report and paid extra to have them "verify" our outcomes report, but they literally never opened the Google Drive file we sent them.

I think it was a great idea set up by well-meaning people, but the self-governing aspect and the comparisons it created ended up producing weird incentives that resulted in it falling apart.

The review sites are perhaps marginally better, but the positivity of reviews is almost 100% correlated with how hard schools work to farm positive reviews, and their business model is selling leads to the schools, so the incentive isn't for objectivity there either.

Honestly the best way, though it requires more work, is to find a handful of recent grads on LinkedIn and ask them about their experience.


Ah, OK. Now I wish it wasn't too late to edit my original post :)

The stats on the CIRR site across schools did always seem a little... odd to me, with differences in outcomes too big to believe at times. Sounds like I would have found the same thing if I looked at any individual school over time, as the rules and practices changed.


No, the fact that it was a trademark lawsuit is a matter of public record. https://dockets.justia.com/docket/california/candce/4:2019cv...


