In 2020, it has become painfully clear that the Internet has a dark side. Russian bots corrupt elections. Fake news and misinformation corrode our faith in society. Trolls band together, spread rumors, and can ruin lives. Fake news is easy to generate but difficult for people to spot. Furthermore, online discourse is frequently a “race to the bottom,” with online discussions quickly degenerating into shouting matches and name-calling. Part of the problem is the anonymity offered by the Internet: it fails to discourage behavior that we would find abhorrent in person. But anonymity is only part of the problem. What is the solution?
I’ve been considering this problem for several years, and I believe that a combination of technologies and operating procedures, applied synergistically, could go a long way toward promoting civil discourse and protecting truth. This would be a net benefit to society as a whole. For those who wish to explore further, here are a few patents on which I am a co-inventor: US Patent 9,639,841, US Patent 10,033,537, and US Patent 10,686,609.
1) Digital signatures. Users can protect their online content (posts, videos, resumes) with hashes and digital signatures – either as a single protected item, or as a collection of individual items such as text sentences, images, and videos. If they did so, their content could still be viewed and copied, but it would be protected against modification. If the vast majority of users signed their work, those who did not would stand out like a sore thumb. Those with honorable intent should be happy to sign their work (“I support this message”); those with dishonorable intent, less so.
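The sign-then-verify workflow can be sketched in a few lines. This is a minimal illustration only: a real deployment would use a public-key scheme (such as Ed25519) so that anyone can verify a signature without holding the signing key; here an HMAC over SHA-256 stands in so the example is self-contained, and the key material is hypothetical.

```python
import hashlib
import hmac

# Hypothetical key material; a real scheme would keep a private signing key
# and publish the matching public verification key.
SIGNING_KEY = b"author-private-key"

def sign_post(content: str, key: bytes = SIGNING_KEY) -> str:
    """Return a hex signature binding the author's key to the content."""
    return hmac.new(key, content.encode("utf-8"), hashlib.sha256).hexdigest()

def verify_post(content: str, signature: str, key: bytes = SIGNING_KEY) -> bool:
    """True only if the content is unmodified since signing."""
    expected = sign_post(content, key)
    return hmac.compare_digest(expected, signature)

post = "I support this message."
sig = sign_post(post)
assert verify_post(post, sig)            # intact content verifies
assert not verify_post(post + "!", sig)  # any modification breaks the signature
```

The key property is the last line: even a one-character edit produces a different digest, so tampering is mechanically detectable by any viewer.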
2) Viewing and editing tools that take account of digital signatures and are adapted to handle them. The presence of a digital signature might serve as a high-level filter for scoring and display. If a user is prepared to “stand behind” his/her post, the content of the post might be viewed as likely to have higher integrity, compared to a post that is not signed or authenticated by any known user. This might lead to a greater number of views for signed work — thus providing an incentive for individuals to actually sign their work. In the extreme, unsigned work might default to being hidden from view (although this might be a tailorable viewing option on a site, or in a user’s web browser).
Would digital authentication allow users to respond to, and even paraphrase, a digitally signed post? The answer is yes. One can always copy a signed post along with its signature, thereby demonstrating fidelity to the original source material. One could also copy a portion — a paraphrase. While this would “break” the original signature, a user can still choose to do this. If the editing is reasonable, a link to the original signed content could be included to demonstrate good faith. The original user could also respond with an “I approve this message” endorsement, and a second signature. On the other hand, malicious or improper copying and paraphrasing, as well as citing text “out of context,” would be easy to spot and flag based on the original signed content.
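The good-faith paraphrase could be modeled as a small record: the paraphrased text plus a content-derived link back to the signed original. The structure below is a hypothetical sketch, using a SHA-256 content hash as the citation link; in practice the link could equally be the original post's signature.

```python
import hashlib

def content_id(content: str) -> str:
    """A hash of the signed original, usable as a stable citation link."""
    return hashlib.sha256(content.encode("utf-8")).hexdigest()

original = "The committee approved the budget after three hours of debate."
paraphrase = {
    "text": "In summary: the budget passed after lengthy debate.",
    "cites": content_id(original),  # link back to the signed source
}

# A reader (or an automated tool) can confirm the citation matches the
# claimed original...
assert paraphrase["cites"] == content_id(original)
# ...while a tampered "original" would fail to match the cited id.
assert paraphrase["cites"] != content_id(original + " (edited)")
```

Because the link is derived from the original content itself, an out-of-context quotation can always be checked against the full signed source it claims to summarize.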
3) Veracity Server. A “veracity server” (or several such servers) could keep track of, and cross-index, all signed content. Signed comments against an original post, pro and con, could be referenced to the original post based on the digital signature (which is effectively unique) and all related posts. This would allow a user to quickly assemble all relevant discussion associated with a post.
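The cross-indexing role of a veracity server can be sketched as an in-memory index keyed by signature. All class and method names below are hypothetical illustrations, not an implementation from the patents; a production service would add persistence, signature validation, and access control.

```python
from collections import defaultdict

class VeracityServer:
    """Minimal sketch: index signed posts and their responses by signature."""

    def __init__(self):
        self.posts = {}                     # signature -> content
        self.responses = defaultdict(list)  # original sig -> [(stance, sig)]

    def register(self, signature: str, content: str) -> None:
        """Record a signed post under its (effectively unique) signature."""
        self.posts[signature] = content

    def respond(self, original_sig: str, stance: str, response_sig: str) -> None:
        """Cross-reference a pro or con response to an original post."""
        self.responses[original_sig].append((stance, response_sig))

    def thread(self, signature: str) -> dict:
        """Assemble the original post plus all related pro/con commentary."""
        return {
            "original": self.posts.get(signature),
            "responses": [
                (stance, self.posts.get(sig))
                for stance, sig in self.responses[signature]
            ],
        }

server = VeracityServer()
server.register("sig-a", "Original claim about topic X.")
server.register("sig-b", "Supporting evidence for X.")
server.register("sig-c", "A rebuttal of X.")
server.respond("sig-a", "pro", "sig-b")
server.respond("sig-a", "con", "sig-c")
assert len(server.thread("sig-a")["responses"]) == 2
```

One lookup by signature returns the whole discussion — which is exactly what lets a user quickly assemble all relevant commentary on a post.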
4) Portable Diplomas. Cryptographically secure certificates of status (“diplomas”) can be awarded by a web site to reward positive behavior (e.g., truthful, insightful, experienced, funny, etc.). These would be “portable” in the sense that a user could display his/her certificates on a third-party site, demonstrating a proven track record of good behavior even if he/she has never participated on the third-party site before. As I describe below, these portable certificates would have economic value to the awardee, and users would have an incentive to act honorably in order to compete for these certificates.
Users would compete to gain status and earn portable diplomas, which cannot be transferred to others or counterfeited, but can be displayed anywhere (including other sites) to enhance their scoring on a new post, burnish a professional reputation, and support other online activity (e.g., in the gig economy). Users gain status on a particular web site by being truthful, insightful, experienced, or funny, or by doing well on one or several assigned tasks. Once a user receives a certificate authenticated by the site, he/she may display the certificate to other sites. Other web sites can evaluate the offered certificate(s) based on their own (local) criteria, adjusting the score of a newly-arriving post accordingly. This “pre-mediation” can be applied to simple commentary as well as more economically obvious applications such as resume scoring and the gig economy.
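The issue-then-accept flow for a portable diploma might look like the sketch below. The names and key material are hypothetical, and HMAC-SHA256 again stands in for a real public-key certificate scheme, in which the issuing site would sign with a private key and any third-party site could verify with the site's published public key.

```python
import hashlib
import hmac

# Hypothetical issuing-site key; a real scheme would use the site's key pair
# so third parties can verify without sharing a secret.
SITE_KEY = b"issuing-site-key"

def issue_diploma(user: str, award: str, key: bytes = SITE_KEY) -> dict:
    """The issuing site binds (user, award) together under its signature."""
    payload = f"{user}|{award}".encode("utf-8")
    return {"user": user, "award": award,
            "sig": hmac.new(key, payload, hashlib.sha256).hexdigest()}

def accept_diploma(diploma: dict, key: bytes = SITE_KEY) -> bool:
    """A third-party site checks the certificate before boosting a user's score."""
    payload = f"{diploma['user']}|{diploma['award']}".encode("utf-8")
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, diploma["sig"])

cert = issue_diploma("alice", "insightful-commenter")
assert accept_diploma(cert)              # genuine certificate accepted
forged = dict(cert, user="mallory")      # cannot be transferred to another user
assert not accept_diploma(forged)
```

Binding the user's identity into the signed payload is what makes the diploma non-transferable: swapping in a different user name invalidates the signature.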
Separately, users can choose to digitally sign their online posts/content, on any site, in order to guard against modification. Fake news, deep fakes, or any other modification will break the signature, and a broken signature flags the modification as suspect. But one can still “paraphrase” while citing (pointing to) an original signed work, to demonstrate a valid summary. So flexibility of online discourse is maintained. The veracity server cross-references all signed work based on the digital signatures involved, allowing any user to see the full back-and-forth of an online discussion. Scoring metrics allow the server, as well as ordinary users and human mediators, to evaluate the quality of the commentary and the weight of evidence offered pro and con.
The net effect is that users are incentivized to act honorably in order to earn diplomas that can enhance their online activity. Deep fakes are harder to generate and easier to recognize. False personas and trolls are more difficult to set up, because it takes time to gather certificates demonstrating that you are a real human and have been recognized by the community. Furthermore, false personas and trolls, once created, become useless more quickly when they start to behave poorly. This shifts the cost-benefit ratio for bad actors, increasing costs and reducing the potential “bad benefits” of malicious activity. Users, by working with the online analysis tools, learn the value of reliable evidence, and become better at spotting fake news and misinformation on their own. This is a net benefit to society as a whole. Furthermore, the metrics for online analysis can be tailored to the needs of individual websites, allowing for implementation flexibility.
Dedicated websites can increase their appeal to users by offering worthwhile certificates of status. They can also enhance the quality of discourse on their platforms by scoring posts in accordance with the certificates offered by new and returning users. Aggregating servers can attract users by providing indexing services for authenticated content, and tools to manipulate and evaluate that content. Finally, companies that market web browsers can attract new users by providing tools to evaluate the provenance of online material, and weigh the possibly conflicting evidence related to that material.
Today, fact-checking and identifying fake news is frequently considered an unfortunate cost of business. It does not have to be that way. By providing economic incentives for companies hosting websites, aggregating servers and web browser products, we can create a “virtuous cycle” where courteous and honorable behavior is rewarded, and Adam Smith’s invisible hand is mobilized as an ally in the arms race against online shaming, gratuitous invective, and fake news.
Dr. Heppe has over 40 years of experience in the area of telecommunications, electrical engineering, radio/radar, unmanned aircraft, and GPS/GNSS with General Electric Space Division, Stanford Telecommunications, Telenergy, Insitu/Boeing, and independent consultancies.