Facebook, Section 230, and how we all might be affected

Oh!

Adherent
Joined
Oct 1, 2020
Messages
298
I just read a short article from the BBC website. It interviews Sarah Miller from Biden's transition team. So, no doubt, she is speaking for Biden.


It is noteworthy because it mentions Section 230 specifically, and it goes over comments from Biden about his attitude towards Facebook. We should also keep in mind the opposition to FB from many Republicans too (but for other reasons). The combined will to exact revenge upon FB (and Twitter), albeit for differing reasons, is an unfortunate coincidence. Many from both sides would like Section 230 to go (I'd suggest, in particular, Section 230(c)(1) for the Democrats, and Section 230(c)(2) for the Republicans).

Wikipedia:

Section 230, as passed, has two primary parts both listed under §230(c) as the "Good Samaritan" portion of the law. Section 230(c)(1), as identified above, defines that an information service provider shall not be treated as a "publisher or speaker" of information from another provider. Section 230(c)(2) provides immunity from civil liabilities for information service providers that remove or restrict content from their services they deem "obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected", as long as they act "in good faith" in this action.

The BBC article also mentions antitrust problems with Facebook and other large Tech companies - another thing I've railed against here at TAZ. I am in favor of this.

I am concerned that Zuckerberg will cause problems for the whole platform/forum/blog ecosystem. I have no problem with FB paying a price for its general irresponsibility, but I hope it is not exacted upon the rest of us too.
 

Joeychgo

TAZ Administrator
Joined
Feb 28, 2004
Messages
6,990
I think it will be ok for us in the end. I suspect it will be easier to go after companies like FB for their size and monopolistic nature than to substantially change sec 230. They might add in some kind of takedown provisions or something. But in the end, we can't really control what people want to say any more than a bar can control the conversations of its patrons.
 

Nev_Dull

Anachronism
Joined
Apr 27, 2010
Messages
2,292
I agree completely that section 230 should go. It's played a big part in many of social media sites' wrongdoings. All sites need to be accountable for the type of content they contain (and in many cases, promote). FB is the poster child for what happens when you give sites that shield.

I'm not as onboard with breaking up big companies. It's easy to accuse them of buying up the competition, but is that a bad thing? That competitor chose to sell. It's a different story when a company forces competitors to sell or actively suppresses them. Further, if companies like Facebook had been accountable for the content they host, they likely would not have been able to grow so big, so fast, because a lot of their resources would have been focused on the content side.
 

Oh!

Adherent
Joined
Oct 1, 2020
Messages
298
I think it will be ok for us in the end. I suspect it will be easier to go after companies like FB for their size and monopolistic nature than to substantially change sec 230. They might add in some kind of takedown provisions or something. But in the end, we can't really control what people want to say any more than a bar can control the conversations of its patrons.
I too expect (or, at least, hope) that Section 230 remains unchanged and that it will be the antitrust issues which are tackled. After all, laws already exist to deal with this, and it is clear that FB and Google are stifling competition. This is the real problem.
I agree completely that section 230 should go. It's played a big part in many of social media sites' wrongdoings. All sites need to be accountable for the type of content they contain (and in many cases, promote). FB is the poster child for what happens when you give sites that shield.

I'm not as onboard with breaking up big companies. It's easy to accuse them of buying up the competition, but is that a bad thing? That competitor chose to sell. It's a different story when a company forces competitors to sell or actively suppresses them. Further, if companies like Facebook had been accountable for the content they host, they likely would not have been able to grow so big, so fast, because a lot of their resources would have been focused on the content side.
I fundamentally disagree with you there. Section 230 protects all US individuals and US-registered companies from being sued into bankruptcy because of the actions of their users. If 230 were to go, it would affect far more than just FB, Twitter, Reddit and YouTube. The whole ecosystem would be affected. Yes, things do need to change (with respect to social responsibility from the large tech companies), but doing away with 230 would break the Internet (at least in the shorter term). I suspect that what would happen in those circumstances is that the big social media companies would move their businesses away from the US, perhaps to countries which are more problematic for the US and all of us.

Monopolies, duopolies, near monopolies and companies colluding to distort and restrict the market are a problem for everyone. That's the issue which must be tackled. Yes, 230 might be tweaked, but be very careful, because, for the most part, 230 works and works well. If 230 goes, I expect that the majority of the most problematic content will simply switch to smaller venues (because FB et al. will not take the risk). And then, without 230, smalltime social media operators (including forum owners) will be at the sharp end of all manner of lawsuits. Revoking 230 would affect us all, and seriously so.
 

Nev_Dull

Anachronism
Joined
Apr 27, 2010
Messages
2,292
And then, without 230, smalltime social media operators (including forum owners) will be at the sharp end of all manner of lawsuits. Revoking 230 would affect us all, and seriously so.
I think that's a bit of an alarmist reaction. There's no evidence to indicate any such outcome. I can see there being an initial flurry of lawsuit threats if the law goes away, but almost none of them would come to anything. Any lawsuit has to have a legitimate basis in law. Even in America, I don't think you can successfully sue someone because you feel offended by something you read on their website or forum. And if someone can, a better way forward is to lobby for changes to the laws around lawsuits, not to put up silos to hide from them.

I live in one of the many countries where our sites are held accountable for the content they host. We are seen as publishers of content. Strangely enough, blogs, forums, etc. are created and exist without being sued into oblivion. The few examples of those that have run into legal problems are all sites that have actively engaged in spreading hate and misinformation (white supremacy, Holocaust denial, and the like). The rest of us just do what we all should be doing: running our sites and moderating against harmful and unwanted content.
 

Oh!

Adherent
Joined
Oct 1, 2020
Messages
298
I think that's a bit of an alarmist reaction. There's no evidence to indicate any such outcome. I can see there being an initial flurry of lawsuit threats if the law goes away, but almost none of them would come to anything. Any lawsuit has to have a legitimate basis in law. Even in America, I don't think you can successfully sue someone because you feel offended by something you read on their website or forum. And if someone can, a better way forward is to lobby for changes to the laws around lawsuits, not to put up silos to hide from them.
History is the evidence. The CDA was passed in 1996 because it was already apparent by then that there was a problem. The Internet was tiny compared to now, and social media was just email lists and Usenet. None of the large platforms could exist without Section 230 (or similar protections, if they relocated the registration of their business). The landscape has totally shifted since the 90s. Discussion would continue, and I expect that most of the most toxic content we typically have now (at larger platforms) would shift to smaller ones. Ordinary forums would be forced into monitoring content much more closely (if they bothered to continue at all). Some alternative platforms might attempt to launch, but hosts too would be liable, so they would probably avoid them as well.

Prior to the Internet, case law was clear that a liability line was drawn between publishers of content and distributors of content; publishers would be expected to have awareness of material it was publishing and thus should be held liable for any illegal content it published, while distributors would likely not be aware and thus would be immune. This was established in Smith v. California (1959), where the Supreme Court ruled that putting liability on the provider (a book store in this case) would have "a collateral effect of inhibiting the freedom of expression, by making the individual the more reluctant to exercise it."[21]

In the early 1990s, the Internet became more widely adopted and created means for users to engage in forums and other user-generated content. While this helped to expand the use of the Internet, it also resulted in a number of legal cases putting service providers at fault for the content generated by their users. This concern was raised by legal challenges against CompuServe and Prodigy, early service providers at this time.[22] CompuServe stated they would not attempt to regulate what users posted on their services, while Prodigy had employed a team of moderators to validate content. Both faced legal challenges related to content posted by their users. In Cubby, Inc. v. CompuServe Inc., CompuServe was found not to be at fault as, by its stance of allowing all content to go unmoderated, it was a distributor and thus not liable for libelous content posted by users. However, Stratton Oakmont, Inc. v. Prodigy Services Co. found that as Prodigy had taken an editorial role with regard to customer content, it was a publisher and legally responsible for libel committed by customers.[23]

Service providers made their Congresspersons aware of these cases, believing that if upheld across the nation, it would stifle the growth of the Internet. United States Representative Christopher Cox (R-CA) had read an article about the two cases and felt the decisions were backwards. "It struck me that if that rule was going to take hold then the internet would become the Wild West and nobody would have any incentive to keep the internet civil," Cox stated.[24]

At the time, Congress was preparing the Communications Decency Act (CDA), part of the omnibus Telecommunications Act of 1996, which was designed to make knowingly sending indecent or obscene material to minors a criminal offense. A version of the CDA had passed through the Senate, pushed by Senator J. James Exon (D-NE).[25] A grassroots effort in the tech industry reacted to try to convince the House of Representatives to challenge Exon's bill. Based on the Stratton Oakmont decision, Congress recognized that requiring service providers to block indecent content would make them be treated as publishers in the context of the First Amendment, and thus become liable for other illegal content, such as libel, not set out in the existing CDA.[22] Cox and fellow Representative Ron Wyden (D-OR) wrote the House bill's section 509, titled the Internet Freedom and Family Empowerment Act, designed to override the decision from Stratton Oakmont, so that service providers could moderate content as necessary and did not have to act as a wholly neutral conduit. The new section was added while the CDA was in conference within the House.

The overall Telecommunications Act, with both Exon's CDA and Cox/Wyden's provision, passed both Houses by near-unanimous votes and was signed into law by President Bill Clinton in February 1996.[26] Cox/Wyden's provision became Section 509 of the Telecommunications Act of 1996, and became law as a new Section 230 of the Communications Act of 1934. The anti-indecency portion of the CDA was immediately challenged on passage, resulting in the 1997 Supreme Court case Reno v. American Civil Liberties Union, which ruled that all of the anti-indecency sections of the CDA were unconstitutional, but left Section 230 as law.[27]
By all means take my views on this with a very large pinch of salt. But I suggest you read what the EFF has to say about this matter first.

I live in one of the many countries where our sites are held accountable for the content they host. We are seen as publishers of content. Strangely enough, blogs, forums, etc. are created and exist without being sued into oblivion. The few examples of those that have run into legal problems are all sites that have actively engaged in spreading hate and misinformation. (white supremacy, holocaust deniers, and the like). The rest of us just do what we all should be doing; running our sites and moderating against harmful and unwanted content.
Canada is far less litigious than the US. And this does not take into account the huge numbers of problem accounts which would be forced to leave FB et al. for pastures new. Smaller platforms would not be able to manage the huge influx of (automated) problem content - no chance. The (partial) closing of the huge platforms would not be an opportunity for smaller forums - it would just shift most of the liability to those least able to withstand it.
 

Nev_Dull

Anachronism
Joined
Apr 27, 2010
Messages
2,292
By all means take my views on this with a very large pinch of salt. But I suggest that read what EFF has to say about this matter first.
It's an interesting article, but it also takes me back to the issue of Parler. Why should it not enjoy the protection of 230? I know the arguments made here are that it violates the rules of the hosting company, therefore it must go. But then what of Twitter or Facebook? They have content that many find objectionable or even illegal. I'm pretty sure that would violate the rules of the network carriers those platforms use. Shouldn't we be calling for the carriers to shut them down for the same reasons as Parler?

If you are going to have a law like 230, it must be applied equally. (And no, I'm not so naive as to think any law is always applied equally.) If you can choose to ignore it in favour of the hosting company's or carrier's terms, it isn't much protection at all.
 

DigNap15

Fan
Joined
Sep 14, 2019
Messages
590
I agree completely that section 230 should go. It's played a big part in many of social media sites' wrongdoings. All sites need to be accountable for the type of content they contain (and in many cases, promote). FB is the poster child for what happens when you give sites that shield.

I'm not as onboard with breaking up big companies. It's easy to accuse them of buying up the competition, but is that a bad thing? That competitor chose to sell. It's a different story when a company forces competitors to sell or actively suppresses them. Further, if companies like facebook had been accountable for the content they host, they likely would not have been able to grow so big, so fast, because a lot of their resources would have been focussed on the content side.
Remember, when Facebook started it was just a way of finding your old friends and keeping in touch with your family.
Now it's a place to post memes.
 

DigNap15

Fan
Joined
Sep 14, 2019
Messages
590
I too expect (or, at least, hope) that Section 230 remains unchanged and that it will be the antitrust issues which are tackled. After all, laws already exist to deal with this, and it is clear that FB and Google are stifling competition. This is the real problem.

I fundamentally disagree with you there. Section 230 protects all US individuals and US-registered companies from being sued into bankruptcy because of the actions of their users. If 230 were to go, it would affect far more than just FB, Twitter, Reddit and YouTube. The whole ecosystem would be affected. Yes, things do need to change (with respect to social responsibility from the large tech companies), but doing away with 230 would break the Internet (at least in the shorter term). I suspect that what would happen in those circumstances is that the big social media companies would move their businesses away from the US, perhaps to countries which are more problematic for the US and all of us.

Monopolies, duopolies, near monopolies and companies colluding to distort and restrict the market are a problem for everyone. That's the issue which must be tackled. Yes, 230 might be tweaked, but be very careful, because, for the most part, 230 works and works well. If 230 goes, I expect that the majority of the most problematic content will simply switch to smaller venues (because FB et al. will not take the risk). And then, without 230, smalltime social media operators (including forum owners) will be at the sharp end of all manner of lawsuits. Revoking 230 would affect us all, and seriously so.
Re your last paragraph:
I have a general/politics forum in a small country.
We have a law that says if a complaint is received, we have to give the member 48 hours to respond and either let their post stand and justify it or remove it.
So far, no one has complained.
That may change if free speech / hate speech laws are changed.

We have our own strict anti-hate-speech rules.
 

Oh!

Adherent
Joined
Oct 1, 2020
Messages
298
It's an interesting article, but it also takes me back to the issue of Parler. Why should it not enjoy the protection of 230? I know the arguments made here are that it violates the rules of the hosting company, therefore it must go. But then what of Twitter or Facebook? They have content that many find objectionable or even illegal. I'm pretty sure that would violate the rules of the network carriers those platforms use. Shouldn't we be calling for the carriers to shut them down for the same reasons as Parler?

If you are going to have a law like 230, it must be applied equally. (And no, I'm not so naive as to think any law is always applied equally.) If you can choose to ignore it in favour of the hosting company's or carrier's terms, it isn't much protection at all.
Hi Nev_Dull,

I think you have misunderstood 230. Parler absolutely can enjoy the benefits of 230. Indeed, it has done so by not being prosecuted or sued for the content posted by its members. The issue of Amazon ending its business relationship with Parler is allowed under US commerce law and, more deeply, by the freedom of expression doctrines protected by the US Constitution. Section 230 also makes explicit that companies can remove content and members (moderate), so long as they act in 'good faith', without fear of being sued by members/users. This is what is covered by Section 230(c)(2) (the Good Samaritan clause). Amazon dropping Parler is protected by Section 230(c)(2), whereas Parler (and Amazon) are protected from being held liable for member-posted content by Section 230(c)(1).

We all might pick holes in Section 230, based upon our particular points of view (and political leanings). But, if we think through what 230 does, who and what are protected, and what would happen if it did not exist, it becomes apparent that it is actually a well-crafted piece of legislation and has stood up well for 25 years. There have been only one or two minor amendments during all this time (if memory serves, through case law rather than primary legislation). 230 was introduced because of some troubling (successful) lawsuits which would have seriously curtailed the ability of the Internet to develop (in the US).
 

Oh!

Adherent
Joined
Oct 1, 2020
Messages
298
Re your last paragraph:
I have a general/politics forum in a small country.
We have a law that says if a complaint is received, we have to give the member 48 hours to respond and either let their post stand and justify it or remove it.
So far, no one has complained.
That may change if free speech / hate speech laws are changed.

We have our own strict anti-hate-speech rules.
Well, that's one approach. But for many forums, managing such claims would likely be burdensome. Though, I suppose, it could be automated quite easily.

However, this could be a charter for people to abuse the system and have content removed. You are reliant upon the member responding in a very timely manner. And, presumably, you have to be very careful to remove the content (if you do not receive a response or denial from the content creator), else you will be on the hook. So, no taking a couple of days off, ever!

The content could be completely correct, balanced and well within what any reasonable person might consider 'fair comment'. Or, even if it is not, perhaps it is a scathing critique of a piece of art, a movie, a sportsperson, etc. Should such content be at risk of removal because of an anonymous complaint? What happens if the content receives a thousand complaints, all claiming to have legitimate cause to have the comment corrected or removed? Must the account holder consider each complaint (since I assume there is no way to positively identify who is a legitimate interested party)? Please respond if I have misunderstood anything about how this system operates.

This is somewhat akin to DMCA takedown notices. Or maybe it is even more directly analogous than I assumed above, except that instead of the issue being a claim of copyright infringement, it is more about accuracy. Even if this is the case, I think the DMCA system is broken. Yes, claims of infringement must be asserted as made in good faith, but there is no way to positively check that a claim meets this standard (or the true identity of the claimant). Really, this needs to change, and there need to be real and meaningful consequences for bad-faith claims (false DMCA claims are regularly made in the attempt to have critical content removed). Claimants need to face serious financial consequences for bad-faith claims. As things stand, they do not. I suspect the same is true for the system you describe.

Anyway, that's my very long-winded way of explaining that I think the system you describe is open to abuse, and likely will be abused (a certainty on larger platforms). I guess there are no large platforms registered in your country, and this legislation likely would form part of the decision of where to register the business.
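For illustration only, here is a minimal sketch of how the 48-hour notice-and-respond workflow described above might be automated. All names, fields and the decision rules are hypothetical (my reading of the system as described, not any real platform's or jurisdiction's API):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# The statutory response window described above (hypothetical constant).
RESPONSE_WINDOW = timedelta(hours=48)

@dataclass
class Complaint:
    """A complaint filed against a single post (illustrative fields only)."""
    post_id: str
    filed_at: datetime
    author_responded: bool = False   # did the post's author reply in time?
    author_justified: bool = False   # did they choose to justify the post?

def resolve(complaint: Complaint, now: datetime) -> str:
    """Return 'pending', 'keep', or 'remove' for the complained-about post."""
    if complaint.author_responded:
        # Author replied: the post stands only if they justified it.
        return "keep" if complaint.author_justified else "remove"
    if now - complaint.filed_at < RESPONSE_WINDOW:
        return "pending"   # still inside the 48-hour window
    return "remove"        # no response in time: take the post down

if __name__ == "__main__":
    filed = datetime(2021, 1, 10, 12, 0, tzinfo=timezone.utc)
    c = Complaint("post-42", filed)
    print(resolve(c, filed + timedelta(hours=24)))   # pending
    print(resolve(c, filed + timedelta(hours=49)))   # remove
```

Even this toy version makes the abuse problem visible: nothing here verifies the complainant's identity or good faith, so the default-to-remove branch is exactly the lever a bad-faith complainer would pull.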
 

Tracy Perry

Opinionated asshat
Joined
May 25, 2013
Messages
5,047
Any lawsuit has to have a legitimate basis in law.
You forgot the "to prevail" part. You don't have to have a legitimate basis to file; the one being filed on then has the burden of the cost of defending against it, even if only to prove that the lawsuit has no basis. There would still be that cost involved.
Hell, look at all the lawsuits that were filed about the U.S. Presidential election, like the "Kraken" one. No basis in fact, but the lawsuits still had to be "defended" against before they got dismissed.
 

mysiteguy

Migration Expert
Joined
Feb 20, 2007
Messages
3,239
It's an interesting article, but it also takes me back to the issue of Parler. Why should it not enjoy the protection of 230? I know the arguments made here are that it violates the rules of the hosting company, therefore it must go. But then what of Twitter or Facebook? They have content that many find objectionable or even illegal. I'm pretty sure that would violate the rules of the network carriers those platforms use. Shouldn't we be calling for the carriers to shut them down for the same reasons as Parler?

If you are going to have a law like 230, it must be applied equally. (And no, I'm not so naive as to think any law is always applied equally.) If you can choose to ignore it in favour of the hosting company's or carrier's terms, it isn't much protection at all.

I believe you're making an apples and oranges comparison.

One is the application of the law by the government.
The other is policy, set by a company.

The decisions of companies to stop providing services have nothing to do with the equal application of 230. It's the government that enforces 230, not the companies, and vice versa.
 

Nev_Dull

Anachronism
Joined
Apr 27, 2010
Messages
2,292
You forgot the "to prevail" part. You don't have to have a legitimate basis to file; the one being filed on then has the burden of the cost of defending against it, even if only to prove that the lawsuit has no basis. There would still be that cost involved.
Hell, look at all the lawsuits that were filed about the U.S. Presidential election, like the "Kraken" one. No basis in fact, but the lawsuits still had to be "defended" against before they got dismissed.
You make an interesting point. I was assuming part of your system includes some checks and balances that prevent most spurious lawsuits from being heard at all. Perhaps that's why we don't see people suing each other for everything in my country.
 

Nev_Dull

Anachronism
Joined
Apr 27, 2010
Messages
2,292
I believe you're making an apples and oranges comparison.

One is the application of the law by the government.
The other is policy, set by a company.

The decisions of companies to stop providing services have nothing to do with the equal application of 230. It's the government that enforces 230, not the companies, and vice versa.
I understand that. I probably didn't explain it well enough, but I was more referring to the differences in some threads on here. In one case, people were defending some social media sites as simply platforms that shouldn't be held responsible for content (citing 230), while in another, people were arguing for the shutting down of Parler by its hosting company, citing breach of terms. I was trying to point out that we (as forum owners) should be looking at all cases from similar perspectives, not trying to find different approaches based on our personal opinions of the sites or their content.
 

Nev_Dull

Anachronism
Joined
Apr 27, 2010
Messages
2,292
We all might pick holes in Section 230, based upon our particular points of views (and political leanings). But, if we think through what 230 does, who and what are protected, and what would happen if it did not exist, it becomes apparent that it is actually a well crafted piece of legislation and has stood up well for 25 years. There have been only one or two minor amendments to the legislation (if memory serves, I think through case law rather than primary legislation) during all this time. 230 was introduced because of some troubling (successful) lawsuits which would have seriously curtailed the ability of the Internet to develop (in the US).
While I can't say I'm entirely convinced, I do bow to your greater understanding of your laws. You've made some good points throughout, though I still see some areas of confusion and conflict between the law and corporate policy, and over which takes precedence and why. Overall, I have to say I find our way of dealing with the matter more straightforward. However, that could simply be a case of the devil you know.
 

Oh!

Adherent
Joined
Oct 1, 2020
Messages
298
You make an interesting point. I was assuming part of the your system includes some checks and balances that prevent most spurious lawsuits from being heard at all. Perhaps that's why we don't see people suing each other for everything in my country.
Some US states do have such laws, but there is no federal law. These are the kinds of imbalances which need addressing. You are probably familiar with the term 'anti-SLAPP law' (I believe the same term is used in Canada; similar laws exist in the US, to a more limited degree).

 

Oh!

Adherent
Joined
Oct 1, 2020
Messages
298
While I can't say I'm entirely convinced, I do bow to your greater understanding of your laws. You've made some good points throughout, though I still see some areas of confusion and conflict between the law, corporate policy and which takes precedence and why. Overall, I have to say I find our way of dealing with the matter more straightforward. However that could simply be a case of the the devil you know.
Hi Nev,

Actually, I am not a US citizen, but I did live there for a number of years. It is more that I have taken an interest because of the project with which I am heavily involved (I've posted about it in other threads; a forum-focused or group-based platform). Since IANAL, it is entirely possible that I have not been entirely accurate or have even got some things plain wrong. So, some pushback is very helpful - it tends to drive me to recheck my facts and better educate myself on these matters.

And in any case, some of this is me purely expressing opinion, so reading alternative views is very useful for my goals with the new platform. These are tricky subjects. I too have been very critical of the big players (at least in some respects), but arriving at alternative and practical solutions is much trickier than providing critiques.
 

DigNap15

Fan
Joined
Sep 14, 2019
Messages
590
For years forum owners have talked about content on their forums - who is responsible, etc.
We might have removed some content, or banned a member.

But now in 2021 it has become very serious.
Big Tech are abusing their power and terms of service to force people into accepting their narrative and point of view.
E.g. deplatforming Parler and now Telegram is, to my mind, an abuse of their power because of their near monopoly.

Who knows what will come next.
We are supposed to live in a free world.
But we are being led into a communist style of world.
 

Oh!

Adherent
Joined
Oct 1, 2020
Messages
298
For years forum owners have talked about content on their forums - who is responsible, etc.
We might have removed some content, or banned a member.

But now in 2021 it has become very serious.
Big Tech are abusing their power and terms of service to force people into accepting their narrative and point of view.
E.g. deplatforming Parler and now Telegram is, to my mind, an abuse of their power because of their near monopoly.

Who knows what will come next.
We are supposed to live in a free world.
But we are being led into a communist style of world.
That's the only real problem. The principle that private companies should be free to decide with whom they do business is the correct one. Unless they are colluding with other businesses or being prejudicial towards protected groups, that's fine and as it should be. But this has little to nothing to do with Amazon kicking Parler off its hosting services for serially breaching its terms of use.

Parler is free to move to another host. There's no real issue there (though Amazon is becoming too dominant within hosting, and this will need to be addressed at some point soon; its dominance within other fields, and retail marketing in general, must be addressed now). The likes of Facebook are obviously in breach of antitrust laws, so this needs addressing ASAP too. But none of this is relevant to the principle at stake.
 