The whole question sounds wrong from the beginning. It is not 'the internet' that is addictive, but certain services that use the internet to deliver their product.
I do not think it is questionable that, for example, Facebook or online games like World of Warcraft have a huge potential to be addictive, but this is not a problem of the internet itself.
You would not blame drug addiction on transport infrastructure and call for its regulation just because it is involved in bringing addicts and their dealers together.
The article is using layperson terminology and is in fact talking about "certain services", not the IP protocol. I think this is apparent from the article.
There's a lot of head-in-the-sand, knee-jerk appeal to individual responsibility here, which is understandable on a forum populated by the people perpetuating the most egregious examples of the behaviour the author is lamenting. These responses ignore the author's point - which, I'm assuming, is their point.
Instead of just making some abstract appeal to 'individual responsibility', how about actually engaging with what they're saying?
"A handful of corporations determine the basic shape of the web that most of us use every day. Many of those companies make money by capturing users’ attention, and turning it into pageviews and clicks. They’ve staked their futures on methods to cultivate habits in users, in order to win as much of that attention as possible."
This seems like a salient point that is worthy of proper discussion and analysis. I've recently been reading a great book [1] about it that is forcing me to reconsider a lot of positions I took for granted, as one of those building the web, complicit in a whole bunch of this madness. I highly recommend looking into this more, as, in my opinion, we're becoming victims of our medium in the most McLuhan-esque sense.
Ok, let's discuss the ideas presented in the article. The two main forms of regulation suggested near the end of the piece are letting users control the amount of content/notifications they consume, and flagging users who exhibit signs of addiction.
Most sites do offer options to control the number of notifications, etc., but most people just stick with the defaults anyway. Even if there were an option for "Disable infinite scroll for me because I can't control my procrastination", how many people do you really think would use it?
Likewise, if Zynga et al. did implement some kind of flag for users who play too much, do you think it would make a difference? If someone is willing to invest 5 hours clicking a Farmville button, then a popup that says "We noticed that you spent the whole day clicking a button, don't you have something better to do?" probably isn't going to be epiphanic for them. Casinos and some gambling websites are required to do this now, and it hasn't cured gambling addiction yet. Likewise, adding a perfunctory "Please drink responsibly" to the end of beer commercials has had only a small effect.
What could work? In some parts of Asia, security officers will forcibly remove people from cybercafes if they're showing signs of addiction. But most people in the West use home internet, which would make physical enforcement cost-prohibitive. Laws could be passed that outright ban certain types of games. But a politician would look pretty silly taking a stand against Mafia Wars on the floor of a legislative body, especially given recent world events.
There might be a great solution, maybe offered in your book recommendation (thanks for that, btw). But it's hard to see what can practically be done about this problem, other than people exerting more self control in a changing world.
For me, the question is bigger than harm reduction and efforts at it; in fact, the book builds a whole chapter around the example of slot machines and the concept of 'autism as a design principle'. It's about the design decisions made, and the way we accept design that is clearly optimised to exploit human weakness. More broadly, it's about the ethical framework one chooses for making decisions and setting broader policy - which is really what the book takes aim at, and which your comment reinforces: "But a politician would look pretty silly taking a stand against Mafia Wars on the floor of a legislative body."
That statement is only true within our current mode of thinking about these things. This book, and all the other writing on the economics of attention, is showing me that there's another way to think about this; just because the standards we currently accept are normative doesn't mean they're exhaustive - and certainly not perpetual.
"This seems like a salient point that is worthy of proper discussion and analysis."
It may be salient but is also rather pedestrian. Yes, companies vie for attention and favourable habits. See also the whole non-online world: branding, advertisement, episodic entertainment, subscription services, etc. etc. etc.
I don't think regulation is the most effective tool on the Internet, but the knee-jerk rejections of the very concept miss one crucial thing: addictive services are not something people can resist by willpower alone. Why?
Because they're optimized to be as addictive as possible by malicious people who get paid boatloads of money for being good at it. If people can resist them, they'll be improved until they again can't be resisted.
Right now, in many conference rooms worldwide, there are professionals sitting and talking about how to make their websites more addictive and life-sucking. Of course they don't call it that. They use euphemisms like "user engagement", "retention", "user satisfaction", "conversion funnels", and picture things in terms of company growth. The addictive Web didn't happen all by itself. It was engineered this way on purpose, by people who don't care about others.
So whatever regulation there is, I wish it would be something that allows us to strike back, not just attempts to shield users. If you add heroin to food to make it addictive, you will find yourself in front of a judge who will send you to jail. The same should hold if you make media addictive on purpose, to the detriment of the consumers' health.
We can't forget in those discussions that we're fighting living, sentient opponents, who are going to try and counter every coping mechanism we develop - and who are better at this game than we are.
>>addictive services are not something you can make people resist by willpower
Yes, you're right, but to your first point, drugs are addictive too and the War on Drugs (regulation) has widely been seen as a major failure.
I'll admit I didn't read the article, but I get a little annoyed when I see articles that reach thousands of people calling for regulation because one journalist/writer found some problem they deem a societal problem that can be best solved by regulation. For many, many of our problems, there are literally thousands of options to try before resorting to regulation. The problem with regulation is that there is no guarantee it is the best solution, but by its very nature, it kills the possibility of better solutions being tried later on.
> drugs are addictive too and the War on Drugs (regulation) has widely been seen as a major failure.
That's why I'm not jumping on the regulations bandwagon. We had enough such examples around banning addictive substances to prove that regulations can hurt people much more than they're helping.
You're right that regulation isn't the best default solution. Regulations also have a lot of momentum; once enacted, they are very hard to remove.
This actually reminds me of pg's insight from his very relevant essay[0]. He argues that society develops antibodies against addictive things, and that the law actually follows - not precedes - social customs that change to protect us. But he also points out that technology seems to be outpacing the rate at which society can adapt itself.
It will be interesting to see where to draw the line here. Clearly, some of the highly addictive Internet platforms actually contain useful information - the StackExchange network, some parts of Reddit, HN… So you would have to make exceptions for things that are likely a net professional or personal benefit to the user to get such a regulation right. I think this will be hard to quantify, as opposed to identifying heroin in a product.
Perhaps nudges are a viable alternative approach to this problem. Perhaps simple banners like "Procrastination can kill." on such websites are sufficient. There is a great book on this topic by Richard H. Thaler and Cass R. Sunstein.
The primary difference between SE, HN, even Reddit and addictive places is that the former aren't designed around getting you addicted to the service. They have it as a side effect, but we're talking of a similar side effect that makes one read books or talk with other people.
I'd like to see more focus on separating places that are incidentally "addictive" from those that are addictive on purpose.
I read an article[1] a while ago on evaluating political proposals: when proposing a rule, imagine the rule under the control of politicians you actually know, running in electoral systems with voters and interest groups that actually exist. In other words, deciding that HN and Reddit obviously aren't addictive in the wrong way, but Candy Crush obviously is, is fine when it's you and me working it out -- but what is your confidence that a Federal Internet Addiction Board with members named by presidents Donald Trump and Hillary Clinton[2], and confirmed by (the intellectual heirs of) senator Ted Stevens[3], would reach a similar conclusion?
This seems to want to incriminate someone's thoughts, based only on some other people's optional responses.
Can you imagine a whole legal framework for divining and arguing "design" intent? "No, your honour, it is designed for maximum entertainment and repeated play value, it says so right here in our design manuals."
I can't. But I'm increasingly noticing that the biggest problems with our existing legal framework happen when it can't follow the perpetrator into the intent-land. There are many cases where it's blindingly obvious that something was done with ill intent, but the legal system can't process it (and trying to make it would only open it up for more abuse). I don't know how we should handle those cases but I know we need to figure it out, because they're the ones that are becoming the most harmful.
I mean that people are finding a sweet spot to act maliciously while still staying on the legal side. We all know what they're doing, but we can't make it illegal without hurting innocent people.
Consider advertising. We can often tell when a vendor is bullshitting their customers, but if they do it well, the law has no way of reaching such a person. Trying to extend the reach of the law to cover such cases would create opportunities for other malicious actors to silence people who disagree with them. So bad people are free to thrive at the intersection of "in bad faith" and "still technically legal".
That may be so, but it sounded like you were making a much stronger claim two messages up - as though intent made activity more harmful to the victim (than if no intent were present). That worries me because that way lies thought policing.
Of course intent doesn't change the harm done to the victim, but I believe it definitely changes the way we should look at the perpetrator.
In a perfect world, the only thing that would matter in a justice system would be intent - it would determine whether one is to be punished or left alone. The issue of harm to the victim could be covered by insurance. But such a system would require something that could determine intent and relay it truthfully. We don't have such an instrument, so we're stuck with a system that considers everything but intent in order not to get gamed too easily.
Again, what I think we need to figure out is not how to just limit harm done to the victim (which, of course, is not dependent on intent), but how to push back against the perpetrators who are clearly acting in bad faith but stay on the legal side. Just like we don't only instruct people to keep their valuables safe - we also go and catch thieves.
> In a perfect world, the only thing that would matter in a justice system would be intent - it would determine whether one's to be punished or left alone. The issue of harm to the victim could be covered by insurance.
Not really: insurance for harm to victims deals with harms for which the justice system (particularly, the civil justice system) holds the insured liable. If the justice system doesn't deal with the harms, there is no liability to insure. It doesn't make sense to talk about insurance for harms to victims as a primary solution independent of the justice system.
"how to push back against the perpetrators who are clearly acting in bad faith but stay on the legal side"
But by definition of being on the legal side, such people are not even perpetrators.
Perhaps "bad-faith-crimes" are something the legal system shouldn't get involved in (never mind that it cannot competently, as we agree). Among other reasons, the bad-faithness definition itself becomes easy to game (see "*ism!" accusations).
Well, HN is definitely not an example of what I'm complaining about. Hell, it's designed to filter people out! HN's addictive powers come from the simple fact talking to interesting people and reading trivia is addictive. I'm not sure we should be focusing on those.
No. It's mostly similar to HN IMO, it's just more inclusive. The content that is being created for Reddit, or generally for social media, would be a perfect example though. It tends to be designed for quick dopamine hits, and this is the other source of Reddit's, or Facebook's addictiveness.
If you find yourself explicitly thinking about virality of your content, you have to ask yourself why exactly do you want to unleash a memetic weapon on people.
Many answers here say "No" because it talks about regulation. But the article is an interesting study: What website features could be disabled because they're systematically addictive? The article suggests:
> 1. To forbid infinite scroll,
> 2. Sites should be required to flag users who display especially compulsive behaviours,
> 3. Certain sites or browsers would be required to include tools that let users monitor themselves – how long they’ve been on a site,...
I'm an addict of 9Gag. I'm also, to a lesser degree, a compulsive reader of HN, fb, commitstrip, my sales dashboard, support tickets and emails. I'm not sure, however, what the solution would be. Actually, it all started because the compilation time (60-90s) is just long enough to lose focus from programming and let my mind wander onto other topics (and if it were not fb, I'd find something else, like my bank account).
During compilation time, do you think it's possible to let your mind stay idle, so as to avoid distraction and compulsive internet use?
I'm not sure whether we should use "addiction" and "lack of discipline" interchangeably. Ever since I was small, I've associated the former word with changes in brain chemistry that cannot be overcome with a regular dose of self-discipline (e.g. heroin addiction). If we can break it behaviorally, then isn't it a compulsion disorder of some sort?
Addiction is generally defined as not being able to stop an activity that rewards you in some way, despite the consequences. With drugs like heroin it's important to note the difference between chemical dependence and addiction. Chemical dependence can certainly cause addiction, but plenty of heroin addicts are (citation needed, but I'm pretty sure) staying addicts because they want the high - the same principle by which people get addicted to gambling, or to weed (a drug that doesn't cause any physical dependence) - not because they are trying to keep withdrawal symptoms away. Of course, plenty of addicts do at some point want to quit, and at that point the dependence can keep them addicted.
Arguably, the line between "addiction" and "lack of discipline" is subjective and arbitrary. It's possible to be a heroin addict, decide to never do it again, and... just do it. I have no personal experience, but I know it must be hell to try to do that; it is absolutely possible, though. And former addicts, who no longer have any traces of the drug in their body, who have got past all the withdrawal... often relapse, for what you could call "lack of discipline". Equally, many people are addicted to gambling in ways that leave them losing all their savings, or addicted to weed in a way that they get nothing done in life because they're getting stoned all the time, or...
For me, the line would be drawn based on how hard it would be to give it up, and how much damage is being done. If you're just annoyed that you spend too much time browsing the web, it probably shouldn't be called addiction. If you spend 95% of your time in the office browsing the web, and are conscious that this is causing major problems for you, but still can't bring yourself to stop... addiction seems fitting.
People tend to talk about "dependence" and not "addiction", even for things that have a strong physical addiction, e.g. alcohol.
An alcohol dependence means the person is preoccupied by alcohol, has a tolerance for alcohol, has a craving for alcohol, and will continue to use alcohol even though they know it's causing them harm.
That definition can be applied to other things, especially some Internet things.
I don't think a solid line can be drawn between compulsion disorders and addiction. Both are reinforced by the reward mechanisms in the brain, and both can have (psycho)somatic effects that are hard to overcome with willpower alone.
I would even guess that the lesser somatic addictions (e.g. cigarettes) are easier to overcome with willpower alone than compulsion disorders, but I have no actual experience or numbers to back that up.
You need to add "habits" to the equation as well. All of those things exhibit the same properties. We differentiate them by a mix of how many problems it causes for the person, for the person's surroundings, and various moral preconceptions about what one should do. Biochemical changes and responsiveness to raw willpower aren't really useful metrics, because both habits and addictions affect the brain, and the whole point of a good habit is to make something that's resistant to willpower.
I at first read this as "Medication is a good start" and I'm relieved to see on second reading it wasn't that.
I suffer from the same problem, though: the 1-2 minute compilation times are just long enough that I constantly distract myself, opening and closing HN probably hundreds of times a day. Often the content is identical to the previous time I opened it, but the occasional nugget is enough that my brain clearly gets something from just opening HN, even to close it immediately - it has become ingrained in my working life.
I'm interested to hear how you use meditation to break that?
Don't we need something to keep our brains active during compilation? It's rather the opposite: black out Facebook/email as soon as compilation is over. The problem isn't being on fb; the problem is being there too long and decreasing productivity.
I really like this article - it's full of good points and it's definitely a huge real problem - but I do think better tools for users are the better (and easier) route to fixing this.
Personally, I find that my biggest problem beating this is that distraction becomes totally automatic; I open Facebook without even thinking about it. I have a reasonably good level of willpower when I'm actively making a decision, but as soon as I'm a bit bored or distracted or doing something I don't really want to do, suddenly I'm on my third Buzzfeed article that somebody posted on Facebook, and half an hour has disappeared down the drain. Painful.
It's the constant compulsive checking of Facebook that's the real problem, imo, rather than staying on it too long. I'm not too sure how regulation is really going to help you pull that back, even with the ideas proposed here.
Excellent point about the existing tools not being accessible to the non-tech crowd, though, even though the problem is universal; that's definitely been my impression of the current audience. Not sure why, though; any idea what they're doing wrong? What do we need to do to get everybody using Build Focus/RescueTime/ForestApp/etc., so your average person stops losing time to Facebook too, instead of just the productivity hackers?
[Disclaimer: I'm actually the creator of Build Focus (http://www.buildfocus.io), so I do have a little skin in this game]
My take on drugs is that kids should not use them. Not because I say so, or because some of them are dangerous, but because I hope they really don't need to self-medicate a broken psyche.
So the best way to beat addiction is not to ban alcohol from yourself; it's finding some meaning for your life before you turn into a chronic alcoholic.
...and here I am, browsing HN because I'm absolutely terrified to make a few phone calls. Not because of the calls themselves, but because I don't know what will happen afterwards.
I only skimmed over the article as I find the beginning already horribly patronizing. Why don't we deregulate drugs and treat drug addicts like we treat internet addicts, by offering help and counseling? Find the holes in their lives and try to fill them with non-self-destructive habits.
> Should we blame Michael S for wasting hours of his life hitting a small button? We could.
Yep, we could, and we should. I'm all for personal responsibility, and this feels like setting up a massive excuse for someone wasting their day on Facebook or Reddit. Unlike the pigeons in the story, you're not locked in a box, and you are free to not use the internet if you so wish.
Yes, you are also free to choose your food at the supermarket, but there are still regulations in place to prevent selling of potentially harmful stuff. Although I agree, the author has somewhat missed the point here as Facebook != Internet.
That's a great example, actually: there are regulations that prevent actively, unexpectedly harmful stuff. There are no regulations preventing stuff that tastes great, is easy to eat but not entirely healthy.
We should take the same approach to the internet: prohibit the actively harmful (malware, scams) but not "addictive" content.
Actually, the most addictive content would get banned under this example too - as it should be (or rather, addictive content delivery methods should be). There are no regulations preventing stuff that tastes great, but there are regulations preventing companies from adding addictive substances to food.
I think there might be some regulations that apply to your criteria; for example, in at least some European countries, sugary foods are controlled by taxation. I mean, something like the idea of banning infinite scroll that someone mentioned earlier might be worth considering.
This sounds like yet another case of someone confusing Facebook (plus a few related businesses) with the internet. Regulate the endpoints, if you must, instead of blaming the internet for letting you talk to that endpoint.
Who are the self-selected few in substance or gun regulation?
"When implemented correctly" is a very strong condition; I suppose my start-up would be worth billions and my country would ask only 10% in taxes if they were implemented correctly.
In the case of illegal drugs (assuming that's what you mean with substance), the "self-selected few" are the traffickers and producers (including middle-men): they fully control pricing, availability and quality.
In the case of gun regulation, it's more or less the same, but there's another component: the items themselves are a means of asserting power, so the regulation has to account for the entire product lifecycle more than with other areas (yes, drugs can also be used to assert power, but in those cases it's the supplier who has power over the user, and with guns it's the user who has power over a third party).
I included "when implemented correctly" only to avoid having to discuss regulatory failures such as regulatory capture (e.g. FCC) or rubber-stamping (e.g. FISA) as fundamental arguments against regulation. We don't use Volkswagen's recent troubles as arguments against cars in general, yet somehow that line of reasoning is valid when discussing regulation.
In the drug case people are not self-selected, because anyone can easily become a trafficker himself.
Same for the gun regulation case.
However, as soon as there is a legal structure that exerts regulation, it's very hard, basically impossible, to enter the regulatory institution. This staticness is one of the reasons regulatory institutions fail after some time - quis custodiet ipsos custodes?
But that's exactly what I meant with self-selection: anyone can choose to become a power figure in an unregulated market. How did you interpret self-selection?
Yes, regulatory bodies require maintenance and vigilance both to remain effective and to maintain an accessible market. Neither is a fundamental argument against regulation.
In the case of Twitter they probably don't know how often you're checking it. It's a stream that's constantly pushing updates to your computer whenever you have a client open. If you open it and put it the background and only check it a few times a day that looks the same as if you open it and keep it in the foreground checking every tweet.
[Novel-reading is] "one of the more pernicious habits to which a young lady can become devoted. When the habit is once thoroughly fixed, it becomes as inveterate as the use of liquor or opium. The novel-devotee is as much a slave as the opium-eater or the inebriate."
Dr. John Harvey Kellogg, 1882. (Yes, the cornflakes-guy).
https://books.google.co.uk/books?id=1ocgQy4hQA8C&lpg=PA37&ot...
There exist well-known literary mechanisms for making a book "addictive", and those mechanisms are used by unscrupulous authors and publishers to profit from getting people hooked on their books and their endless sequels.
Make an argument that also includes regulating certain plot-devices in pulp-fiction novels (that doesn't make you throw up a little in your mouth), then we'll talk.
1. I spent half an hour the other day helping someone get Websense running again. Some places do regulate internet use.
2. I remember from many years ago the sense of disgust at knowing that I had wasted hours of a perfect Colorado afternoon in watching an old mediocre movie on TV. That, I think, would have been before even ARPANET.
Perhaps we should consider regulating false advertising (in and out of the Internet) before trying such strange measures as "forbidding infinite scroll". Perhaps such regulation should include the form of false advertising we call click-bait (of which this article is, perhaps, a borderline case).
Why do you consider this article click-bait? The content-to-link ratio of that page is extremely low. I agree that the HN link title is a bit sensationalist, but I'm a bit puzzled why the article itself would be bait-y.
Perhaps click-bait was too strong a word. Reading the article, I got the feeling the author crafted both the title and the content to stir up feelings on both ends of the opinion spectrum.
But the parallel between food and web in terms of consumption doesn't stand. The currency for food is money and organic food is more expensive, while the currency for web is generally time spent, and so the junk web is more expensive.
And no, we should not regulate either. Just promote the more valuable and healthier options. By the way, besides tools such as Freedom which only reduce the intake of junk web, what services do you think would fit in the category of organic web?
Can someone please fix the link title? The article is very interesting, but I almost skipped it because of the badly constructed sentence. The only reason I opened it was to check whether the article itself had that headline...
(edit: actually, the target page title is grammatically correct)