Mark Zuckerberg could never have imagined that his quaint social communication brainchild would grow into the humongous monster it is today. It was originally a means for college classmates to keep in touch – more or less a campus cyber magazine. But like some of those California wildfires, it grew into an out-of-control communication behemoth.

For sure, it has a lot of good points.  Like the Internet itself, Facebook enabled worldwide communication.  You can communicate with a person in Turkistan as easily as with the person next door.  You can keep up – or get re-acquainted – with friends from the past.  Or you can strike up interesting chats with people you never knew – making new friends or new adversaries.

Facebook redefined the word “friend.”  For most of human history, that meant someone you liked, trusted … and actually knew.  I now have thousands of “friends” – a few I even know.

But as Facebook grew, it changed from that benign communication platform to a monstrous and disturbing disrupter of human civility and comity.

Facebook started to reveal our information so that commercial enterprises could reach out to us.  It became an important platform for advertisers.  And it became a vehicle for small-donor fundraising.

It also became a platform for political dialogue and debate – including the spread of nutty conspiracy theories.  Conspiracy theories have always been part of our civic and political dialogue, but Facebook spread them farther and faster.

Eventually, the question became: How do we control this cyber-Frankenstein’s monster?  Or better yet, who should control it?  Zuckerberg’s creation ran smack dab into the First Amendment.

The central question is this: Is Facebook merely a platform on which free speech flourishes – even offensive and inaccurate speech?  Or is Facebook essentially the publisher of everything that appears – and thereby shares liability?

Justice Oliver Wendell Holmes observed that freedom of speech does not give a person the right to falsely yell fire in a crowded theater.  Let’s apply that principle.

If someone did yell fire in a crowded theater – and caused a deadly panic – should the theater owner be held liable?

If you slander, incite a riot, scream in front of someone’s home late at night, criminally conspire or yell fire in a crowded theater, we have laws that address those instances.  The same should apply to the social platforms.  If someone is vulgar, but does not break the law, let it be. But if they say things that do break the law, let law enforcement and the judicial system deal with them individually – just as would be the case if they committed slander or incited a riot in some other venue.

The difference between a social platform and a newspaper or book publisher is that the latter makes the decision on what to print – and what not to print.  I do not have any ability to put my thoughts in the New York Times on my own.  They must accept my submission.  Of course, they never do – but that is another story.  Since folks can, of their own volition, express their opinions or share information on Facebook, the platform should not have ANY liability.

The problem for Zuckerberg – and the other oligarchs of the Internet – is that they stepped into the quicksand of liability by claiming the right – or responsibility – of deciding what can and cannot appear on their platforms.  It was the slippery slope that caused them to expand their censorship over a broader range of opinion.

Facebook knocked Trump off their platform because they said he was inciting violence.

But Facebook is not a law enforcement agency.  It is significant that while they made the armchair legal judgment, Trump has never been indicted or convicted of inciting a riot.  So, that makes Facebook’s action purely political.  And therein lies the problem.

In a futile effort to dodge the issue of censorship, Zuckerberg set up a so-called independent panel – the Oversight Board – to decide when to boot folks off the platform.  Since it is HIS creation, it is not independent.  It merely shifts the corporate opinion away from the boss.  No matter what mechanism Facebook uses to make such decisions, the proverbial buck still stops at Zuckerberg’s desk.

The solution for Zuckerberg should be simple.  Make the platform available to everyone without censorship – and let the law take care of those who break it.

But it is not so simple.

One of the things upon which Republicans and Democrats agree is that Facebook needs to change.  Republicans tend to favor the maximum free speech approach outlined above.  Democrats, however, want to impose stricter rules that would require Facebook to censor a broad range of speech – not just the illegal, but whatever the left sees as misleading or offensive – to them.

The fix for Republicans is more free speech; the fix for Democrats is more censorship.  Because the opposition to Facebook’s policies comes from such diametrically opposed viewpoints, there may never be a congressional majority for any one solution.

On one point, however, there does seem to be agreement between Republicans and Democrats.  They see Facebook as too big and too powerful, with too much market share.

There is also concern about very specific monopolistic actions – most notably the acquisition of competitors, such as WhatsApp.  The need to take antitrust action against the Internet giants could create common cause between such philosophically opposed legislators as Senator Elizabeth Warren on the left and Senator Rand Paul on the right.

It appears that the new Internet platforms will follow the path of almost all previous new technologies.  They start out relatively unrestricted, but over time the tightening grip of regulation frames their activities.  Think of the automobile.  There were very few rules of the road in the beginning.  Now, the car is among the most regulated technologies in the world.

It is impossible to predict how the Facebook of the future will operate, but you can bet that it will operate far differently than it does today.

So, there ‘tis.