Monday, September 14, 2020

Assigned Beliefs

Scott Adams made the claim that we do not choose our beliefs; they are assigned to us.  One of the difficulties of adulthood is that peer pressure increases, not diminishes, as we choose our peers more, and hence they influence us. He notes that on Google, if you want to ask the question "Does Joe Biden have dementia?" you have to type in the whole thing, whereas on Bing and DuckDuckGo it pops up after typing "Does J..." As there is no plausible explanation for how Google's algorithm could naturally be so different, it is an obvious manipulation - and thus election interference, especially as there are likely a hundred similar examples if there is this one.

But even the conservative press won't pick it up and assign it to us, so even conservatives will let this slide.  My guess is that it is too hard to tell the story quickly and dramatically enough to make it a sound-bite, so they move on to things they think they can sell better.  Even though this is bigger. Reporting by omission, while presenting yourself as an at least somewhat neutral information source, is very dangerous, yes.

22 comments:

Harold Boxty said...

I think Adams said some stuff that was contradictory in that podcast. I heard the bit you mentioned, but later in that podcast, or the day before, he kept complaining that he couldn't find an article showing what would have happened if we had widespread testing earlier on, because the answer might change people's opinions. That didn't make sense if our opinions are assigned. Adams didn't mention the paper test strips that he seems hung up on today.

I think in the same podcast he suggested that if your opponent suggests something silly, like putting up statues of felons, then you should agree and amplify their idea (if all reasoning fails, iirc). This evening I watched a Vox Day Darkstream where someone asked why Trump is so supportive of a vaccine. Vox responded that it's because Trump knows there will never be a vaccine for COVID-19. It blew my mind.

Assistant Village Idiot said...

I only listened to the first few minutes of that podcast. I catch his show once every couple of weeks, maybe. He has an ability to follow a point step by step, but not notice that there might have been other possibilities along the way. He's fun that way, but worrisome, too.

Google messing with us is wrong. I can see why it might be a big deal. But in the end, what if it's not? What if it is wrong but has little effect?

Sam L. said...

This is why I do not Google. Can't trust it.

Zachriel said...

Assistant Village Idiot: He notes that on Google, if you want to ask the question "Does Joe Biden have dementia?" you have to type in the whole thing, whereas on Bing and DuckDuckGo it pops up after typing "Does J..."

Possibly because the claim that Joe Biden has dementia is false. Here he is today:

https://www.c-span.org/video/?475777-1/joe-biden-holds-veterans-roundtable-tampa-florida&vod
(start at about the 21 minute mark)

It's actually quite strange that Bing and DuckDuckGo, out of the billions of possible strings, promote "Does J" to "Does Joe Biden have dementia".

Christopher B said...

You'd think if the answer was an unambiguous 'no' the question could still be asked.

Assistant Village Idiot said...

Why would it be strange? If people are suspicious of it, a lot of folks would type it and the algorithm would pick that up, true or false. Auto-complete comes up with a lot of ridiculous, untrue things.

What is strange is when something that people are clearly going to be typing a lot - for good reasons or bad - doesn't register, even though the other similarly-constructed search engines have it quickly. I don't see any explanation other than intentional erasure that fits the facts. That is clearly evil. Whether it actually has any effect on the general public, as Adams claims it simply must, remains unproven.

QAnon makes so much noise about pedophilia that anything related to some government officials and that subject must have been picked up by the search engines at this point, by sheer volume alone. I don't believe a word of it. Pedophiles do seek niches, and there may even be some self-protective "rings" somewhere in the federal government, just as there might be anywhere where anyone might seek powerful protection and/or access to children. But it can't be to the extent they claim, not by two orders of magnitude. Still, I would expect search engines to follow the volume unless someone made an effort to erase it.

I will note that you have again changed the subject to something you like better, as is common for you, Zachriel. Whether it is 10 degrees or 45 degrees, you are unfailing in this. I imagine it is quite automatic at this point, avoiding difficult information quite seamlessly. You may not be able to stop doing it, but I will continue to note it.

Zachriel said...

Christopher B: You'd think if the answer was an unambiguous 'no' the question could still be asked.

It can be asked.

Assistant Village Idiot: Auto-complete comes up with a lot of ridiculous, untrue things.

That's a common problem with search. When people try to find relevant information they are often directed to false information because lies often propagate more rapidly than the truth. Consider that on some search engines just typing in "Does J" pushes you to the false claim that Joe Biden has dementia.

Assistant Village Idiot: I don't see any explanation other than intentional erasure that fits the facts.

Google Autocomplete uses a neural net whose decisions consider not just a simple notion of popularity, but relevance. It also considers your personal search history and your geographical location.

Assistant Village Idiot: QAnon makes so much noise about pedophilia that anything related to some government officials and that subject must have been picked up by the search engines at this point, by sheer volume alone.

Google lists those sites. We're only considering Google Autocomplete.

Assistant Village Idiot: Still, I would expect search engines to follow the volume unless someone made an effort to erase it.

Or the artificial intelligence considers other factors as well as volume.

Assistant Village Idiot: I will note that you have again changed the subject to something you like better ...

Huh? The subject in the original post was Google Autocomplete, which considers relevance as well as volume. You provided the example. False information has lower relevance.

Assistant Village Idiot said...

Someone has to decide it is false information.

The testimony of the other search engines refutes you.

PenGun said...

Google's algorithm is just far more sophisticated.

From what you have said, it's likely the data on people searching a thing was simply a count of how many. That is a popularity contest, and a little thought might reveal reasons not to go that way.

Zachriel said...

Assistant Village Idiot: Someone has to decide it is false information.

We provided evidence concerning Biden’s mental capacity above.

More generally, the validity of a claim is evaluated by consilience. Pseudosciences and other forms of quackery tend to be isolated, fractured, and frequently contradicted by consilience. Consider a flat-Earth claim: While the world-view may be internally consistent, it is contradicted by physics, astronomy, satellite photography, astronauts, even by Eratosthenes. When a neural net with access to billions of web pages evaluates the claim, it may readily determine it lacks consilience, and is probably bogus.

The point is that it doesn’t require nefarious intent, or intentionality at all, to push down false claims.

Assistant Village Idiot said...

PG - There is zero evidence their algorithm is "far more sophisticated." You are just making that up.

Zachriel - "...it may readily determine it lacks consilience, and is probably bogus." There is zero evidence for this. You are just making it up.

Deevs said...

Wow, Zachriel, I was unaware I could use Google's autocomplete function to establish the veracity of a claim. I've just learned that Donald Trump isn't evil, a dictator, or even a narcissist. Man, I've got some acquaintances who are quite worried about all those things. Glad I can tell them they can rest easy now.

Uh oh. It takes up to "Is the earth fl" before you get the autocomplete for "Is the earth flat" in Google. I suppose that makes flat earth theories more credible than Joe Biden having dementia or Trump being a narcissist. Did not see that coming.

Kidding aside, I tried the same for a number of political figures with similar results. Oddly, "Jeffrey Epstein didn't kill himself" gets you nothing while "Epstein didn't kill himself" gets you suggestions of merchandise bearing the slogan.

So, yeah, I'm not buying that the algorithm was set up and just happened to work out that it wouldn't suggest something like "Is Donald Trump a narcissist" or "Does Joe Biden have dementia". Someone is putting their thumb on the scale, but I can't say it's to help or hurt any given political ideology as much as Google trying to not ruffle any feathers. Personally, I'd prefer Google not try to be the arbiters of which questions are okay to ask, though.

I think the main takeaway here, though, is that typing "Is Hitler" autocompletes to "Is Hitler my dad." I want to know who's asking that question.

Christopher B said...

Biden's always been a lyin' plagiarizin' perverted hair-sniffin' gaffe machine so dementia might actually be a best case.

Zachriel said...

Assistant Village Idiot: There is zero evidence their algorithm is "far more sophisticated.

Google doesn't publish their algorithms. However, they have divulged some details, and analyses by systems engineers have revealed still more.

Google started with an algorithm called PageRank which measures the number of links to a page. This method can be gamed, however, such as by buying links and link farming, initially addressed with Google Panda. Today, Google uses artificial intelligence, which learns how to read natural language, and evaluates 200 different parameters, including popularity, link quality, domain history, and recency.

One of the most important factors is webpage "authority." When Google scans a page, it doesn't just look at individual words, but attempts to discern the underlying meaning of a page through an algorithm called Google Hummingbird. Authority is dependent on relevance of the meaning.

Consider when the New York Times publishes an article. Thousands of websites will typically publish original commentary. Other sites might just copy a bit of text and append a bunch of unrelated content. The former have more relevance, so add to the authority of the New York Times page, while the latter do not, even though they may suggest simple popularity. If a site, such as the Washington Post, which also inspires thousands of pages of relevant commentary, links to the New York Times, then it adds a significant amount of authority to the New York Times.

Authority is a web, just like links form a web. Authority builds on authority, consilience, just as irrelevance builds on irrelevance. Is the system perfect? Of course not. And it can still be gamed. However, the use of advanced artificial intelligence by Google is not in dispute, and is sufficient to explain anomalies in Google Autocomplete without the need to ascribe nefarious intention.
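[Editor's note: the link-counting idea behind the original PageRank described above can be sketched as a toy power iteration. This is purely an illustration of the published concept, not Google's actual code, and the example graph is invented.]

```python
# Toy power-iteration PageRank, illustrative only (not Google's code).
# links[i] is the list of pages that page i links to.
def pagerank(links, damping=0.85, iters=50):
    n = len(links)
    rank = [1.0 / n] * n  # start with uniform rank
    for _ in range(iters):
        new = [(1.0 - damping) / n] * n  # "random surfer" teleport share
        for i, outs in enumerate(links):
            if outs:
                share = damping * rank[i] / len(outs)
                for j in outs:
                    new[j] += share  # each outbound link passes equal rank
            else:
                for j in range(n):  # dangling page: spread rank evenly
                    new[j] += damping * rank[i] / n
        rank = new
    return rank

# Page 1 is linked to by both other pages, so it earns the highest rank;
# page 0 has no inbound links at all and ends up lowest.
ranks = pagerank([[1], [2], [1]])
```

The point of the sketch is the one Zachriel makes: rank flows along links, so a page's score depends on who points at it, not merely on raw mention counts, and that web of endorsement is what link farming tried to game.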

Zachriel said...

Deevs: I think the main takeaway here, though, is that typing "Is Hitler" autocompletes to "Is Hitler my dad." I want to know who's asking that question.

Heh. Good example. There's a book out "Hitler, my father," so the commercial and recency effect may predominate. It's doubtful, however, that Google is intentionally bringing up "Is Hitler my dad," rather than it just being a fluke of the algorithm, like Trump saying "herd mentality".

Zachriel said...

Christopher B: Biden's always been ...

If you search Bing for "Does", it suggests "Does Joe Biden have dementia". The top results are instructive.

Summit News
The Black Sphere
Chicago Sun Times
News Observer
NOQ Report

Only the Chicago Sun Times is anything that could be considered authoritative, and that article is an opinion piece by a local columnist, so the page has lower search ranking authority.

We provided evidence above that Biden does not have dementia. He is old and has always been subject to verbal stumbles, if that is what you mean. We won't address the balance of your comment except to say that it fits with the search returns listed herein.

Zachriel said...

By the way, Assistant Village Idiot adds significant relevant content, and we do appreciate your efforts. We always read your posts with interest.

PenGun said...

I have hung out in places where very smart people liked to discuss things. It's gone now, but alt/sysadmin/rec was a good place to learn from the people who built the internet. They were so smart I think I only posted a couple of times; one watched, and tried to understand what they were talking about. As well, any dumb thing you said would get you eviscerated. ;)

I learned about spiders and their ecology there. ;)

Sam L. said...

Ah, Zachriel. I've noticed his absence for quite some time. It's been long enough that I have forgotten where he used to comment.

Assistant Village Idiot said...

Whether Biden actually has dementia has been irrelevant since the beginning of this conversation. Positing that Google has magic beans, and describing a difference between their procedures and the others' that could not move the dial more than a couple of percent, does not change that. It isn't that hard to sit and imagine: what would the search engine look for? How would it tell whether a source that mentioned Biden and dementia was pro- or anti-? Even highlighting a few sources as super-authoritative cannot get you past the sheer volume, without destroying the entire mechanism of a search engine and its immediacy, because even the favored sites are asking the question, even if only to strike it down.

Zachriel said...

Assistant Village Idiot: Whether Biden actually has dementia has been irrelevant since the beginning of this conversation.

It matters in terms of how the claim is expressed around the web.

Assistant Village Idiot: It isn't that hard to sit and imagine

We don't have to imagine. We posted Bing's results for "Does Joe Biden have dementia," and you can see that the webpages it finds to be most relevant are not authoritative in either sense of the word. Google Autocomplete downgrades such results. Bing Autosuggest may emphasize popularity and recency more.

Assistant Village Idiot: What would the search engine look for? How would it tell whether a source that mentioned Biden and dementia was pro- or anti-?

That's not the criterion; the criterion is ranking authority (among many factors). In general, ranking authority is related to veracity. While lies can travel quickly and disseminate widely, authoritative sources are more conservative, and are more likely to filter out lies. Some media sources even have what are known as "reporters" who actually try to directly verify facts.

Of course it's possible that Google intentionally downgrades "Does Joe Biden have dementia" and intentionally upgrades "Is Hitler my dad?" But it's more likely an effect of the algorithms used.

bs king said...

Late to the party, but here's something:

Google will not autocomplete "Does Joe Biden have dementia" or "Joe Biden dementia". Bing did not autocomplete it for me either, maybe because I never use Bing. In Google, there were 8.1 million results for the first phrase and 7.5 million results for the second.

Then I decided to try the same phrases with Donald Trump instead. Neither autocompleted for me in either search engine, but this time there were 24.2 million results for "Does Donald Trump have dementia" and 30.7 million for "Donald Trump dementia".

I note that in DuckDuckGo I have to type out the whole "Donald Trump dementia" phrase, but the Joe Biden phrase autocompletes. This was the only discrepancy I found between the three search engines.