Discussion about this post

Shreeharsh Kelkar

I am mostly with Purser here. Adults who want to waste time should be left to their devices (pun intended). Children are a different matter, and while what they do at home is best left to parents, letting schools experiment with phone bans would be great (this would be less about mental health and more about helping students concentrate; phones are very distracting!). And yes, the phone problem can only be solved by social norms, not legislation.

Also, abolishing Section 230 would make social media much less vibrant (everyone hates it, but they also like it). We know that from what happened when Congress carved trafficking out of Section 230 through FOSTA-SESTA: Craigslist shut down its Personals section. Maybe you think that's not such a big deal, but a full-scale repeal of 230 would mean that most smaller social media sites go out of business. Or they would do something like what PornHub did when credit card companies turned on it after it was revealed that there were cyberbullying videos on the site: take down all videos that were not from a "verified" publisher. I wish Wertheim had engaged more with this. I'm fine if he says, "So we'll get less social media and maybe less 'controversial' video content? Big deal. That's a price worth paying." But he seems to think that taking Section 230 down would leave the content as it is, especially political content (and let's face it, when people are scrolling, most of them are not scrolling political content; they are scrolling fun things).

I wonder if the right analogy here might not be Big Tobacco but Big Food. Big Food is delicious, and while it causes "harm," it is not the same thing as cigarettes: the harm is more diffuse, harder to pin down. It's also much harder to regulate because you are not talking about one specific product but a whole bunch of things (just like social media!).

Last but not least, one thing both Purser and Wertheim are missing is the possibility of regulating actual posters rather than the platforms. For instance, Drew Margolin has a nice paper (https://ijoc.org/index.php/ijoc/article/view/22369) where he argues that platforms could do something as simple as implementing a checkbox asking posters to rate the importance of their posts and how much they stand by them. These ratings can be weighted into recommendation systems, and posters can also be sanctioned by platforms (and open themselves up to liability) if they post content they stand by that turns out to be objectionable. The precise mechanics can be worked out, but it strikes me that asking posters to take more responsibility is the kind of thing that might restore order to social media while still keeping some of its vibrancy. Whether it reduces scrolling is another matter.
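To make the mechanism concrete, here is a minimal sketch of how such self-ratings might feed into a ranking function. The field names, weights, and sanction rule are my own illustrative assumptions, not anything specified in Margolin's paper:

```python
# Hypothetical sketch: fold a poster's self-declared "importance" and
# "I stand by this" ratings into a recommendation score. The weights
# and the penalty rule below are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Post:
    base_score: float      # the platform's usual engagement-based score
    importance: float      # poster's self-rated importance, 0.0 to 1.0
    stand_by: float        # how strongly the poster stands by it, 0.0 to 1.0
    flagged: bool = False  # moderators later judged the content objectionable

def ranked_score(post: Post) -> float:
    """Boost posts the poster vouches for; demote vouched-for posts
    that are later flagged (the poster took responsibility for them)."""
    score = post.base_score * (1.0 + 0.5 * post.importance)
    if post.flagged and post.stand_by > 0.5:
        # The poster stood by content that was flagged: apply a steep
        # penalty (and, in Margolin's framing, possibly a sanction).
        return score * 0.1
    return score * (1.0 + 0.25 * post.stand_by)

# Example: a vouched-for post gets a boost until it is flagged.
post = Post(base_score=10.0, importance=0.8, stand_by=0.9)
print(ranked_score(post))   # boosted score
post.flagged = True
print(ranked_score(post))   # heavily demoted
```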

connecticutyimby

If tech companies have to know that their recommendations are causing harm before they can be sued, won't they respond by refusing to look into whether their algorithms are causing harm?

That way they can ensure they are protected by not knowing that their product causes harm. It seems extremely hard to assess whether an algorithm is causing harm without the company's express cooperation. I am concerned that the bad press from the "Facebook Files" has caused Facebook to stop assessing the damage its algorithms are doing.
