Over the last couple of weeks, we’ve seen the legislative and legal tides shift for both Google and Facebook in a number of their biggest markets; see the perspectives and forum member commentary below.
Members also discuss an upward trend in bot traffic and a new “muting” capability on Twitter.
The latest chapter in the battle over net neutrality surfaced on Threadwatch.
Back story: In May, the Federal Communications Commission (FCC) began to remove regulations on Internet Service Providers (ISPs) instituted during the Obama Administration. Collectively, these regulations are colloquially referred to as the Net Neutrality rules.
More broadly, net neutrality is the principle that ISPs cannot slow down or block websites, nor charge apps and websites extra fees. Web platforms staged a day of protest on July 12. Below we provide the perspectives of the Federal Communications Commission, web platforms, and WebmasterWorld members.
Points of view surrounding the latest rollback of FCC Net Neutrality Rules:
From the FCC’s point of view, Alina Selyukh at NPR states that “The rules, passed in 2015, had placed cable and telecom companies under the strictest-ever oversight of the agency”. Selyukh further reports, quoting FCC Chairman Ajit Pai: “The utility-style regulations known as Title II were and are like the proverbial sledgehammer being wielded against the flea. Except that here, there was no flea.” Pai adds that the new proposal would “return to the Clinton-era light-touch framework.”
From the point of view of web platforms, their call-to-action site to save net neutrality asserts, “The FCC wants to destroy net neutrality and give big cable companies control over what we see and do online. If they get their way, they’ll allow widespread throttling, blocking, censorship, and extra fees.”
WebmasterWorld members chimed in with a variety of interesting points of view on both sides of the issue.
Kufu comments:
“Net neutrality is not as simple as allowing all traffic to be equal. There may be legitimate reasons for giving some traffic priority over others; one such example that I have heard mentioned (by John C. Dvorak) is the need for priority access for remote robotic surgical procedures which will have to have as little delay as possible.
This is a more complex question to answer, and none of the answers by the major players address actual quandaries, but rather each entity’s self-interest.”
Engine commented on throttling:
“I wonder if there’s going to be a complete change from white to black. Throttling, as was suggested in some sectors, will just annoy people as, often, they just don’t care. I doubt Google will do any throttling, although they could with YouTube.”
Also surfaced on Threadwatch: The News Media Alliance, which represents news publishers, asserts that Google and Facebook have too much power, and it seeks the ability to bargain collectively against them.
Jim Rutenberg of The New York Times sums up the perspective of news publishers:
“Google and Facebook continue to gobble up the digital advertising market, siphoning away revenue that once paid for the quality journalism that Google and Facebook offer for free. […] And for all of Google’s and Facebook’s efforts to support journalism by helping news organizations find new revenue streams – and survive in the new world that these sites helped create – they are, at the end of the day, the royals of the court. Quality news providers are the supplicants and the serfs.”
Rutenberg goes on to state:
“This week, a group of news organizations will begin an effort to win the right to negotiate collectively with the big online platforms and will ask for a limited antitrust exemption from Congress in order to do so.”
As reported by Harper Neidig of The Hill, a Google spokesperson stated the following in response:
“We want to help news publishers succeed as they transition to digital.
“In recent years we’ve built numerous specialized products and technologies, developed specifically to help distribute, fund, and support newspapers. This is a priority and we remain deeply committed to helping publishers with both their challenges, and their opportunities.”
Over on cre8asiteforums, member bobbb shares recent news reporting that Germany now legally requires social media firms to control hate speech, and that failure to remove illegal speech carries penalties.
Bobbb adds an interesting point on enforcement: this will likely be done algorithmically, which will result in users trying to find ways to manipulate the system in order to distribute illegal speech.
Bobbb also states:
“Like the EU fine of $2.7B, I see this being eventually watered down.”
Keyplyr states that bot traffic is on an uptrend, sharing the following breakdown from a sample of 10k daily pageloads:
- 28% Search Engine & other good bots
- 10% Scrapers & Downloaders
- 5% Hacking tools & scripts
- 1-3% Automated link spam
- 12% Other impersonators
Keyplyr adds that the problem with bot traffic is that analytics and site-reporting software is easily fooled by these bots and may count their visits as human traffic.
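One common way webmasters cross-check their analytics numbers is to look at the raw server access logs instead, flagging hits whose user-agent string contains a known bot marker. The sketch below illustrates the idea with a few hypothetical combined-log-format lines and an illustrative, non-exhaustive marker list; it is not a complete bot-detection solution, since the more “humanistic” bots members describe deliberately spoof browser user agents.

```python
import re

# Hypothetical sample log lines; in practice these would be read from the
# server's access log file.
LOG_LINES = [
    '66.249.66.1 - - [12/Jul/2017:06:25:24 +0000] "GET / HTTP/1.1" 200 5123 '
    '"-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.7 - - [12/Jul/2017:06:25:30 +0000] "GET /page HTTP/1.1" 200 2048 '
    '"-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"',
    '198.51.100.9 - - [12/Jul/2017:06:25:41 +0000] "GET /feed HTTP/1.1" 200 1024 '
    '"-" "python-requests/2.18.1"',
]

# Illustrative substrings that often indicate automated traffic.
BOT_MARKERS = ("bot", "crawl", "spider", "curl", "wget", "python-requests")

def looks_like_bot(log_line: str) -> bool:
    """Return True if the user-agent (last quoted field) contains a bot marker."""
    quoted_fields = re.findall(r'"([^"]*)"', log_line)
    if not quoted_fields:
        return False
    user_agent = quoted_fields[-1].lower()
    return any(marker in user_agent for marker in BOT_MARKERS)

bot_hits = sum(looks_like_bot(line) for line in LOG_LINES)
print(f"{bot_hits} of {len(LOG_LINES)} hits look automated")  # 2 of 3
```

Because this runs server-side on the raw logs, it is not fooled by bots that never execute the JavaScript that client-side analytics tools rely on, which is one reason log-based counts tend to run higher.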
Member wilderness adds:
“I’ve always believed the bot traffic (good & bad) is closer to 70%, and nothing has happened in the last decade to make me believe otherwise!”
Tangor jumps in with thoughts on controlling the issue:
“I’m with wilderness (as to the absolute noise on the web these days!). Aggressive bot thumping is a chore, but whitelisting is so much easier and bandwidth conserving. As webmasters we pick our poison and go from there.”
Iamlosts adds an interesting perspective comparing Google Analytics and Piwik data:
“I also am in agreement with those who believe the percentage of bots to humans is actually higher than most reports. Late 2011 early 2012 was when I first read reports of bots surpassing human web traffic. Those same reporters had bots peaking above 60% in 2013 and falling back to high 40s low 50s since. The common reasoning for the drop was Google and Penguin. However, in that same time period I was seeing an increase in much more humanistic bots, especially since 2015. I’ve been running tests on my sites with Piwik every year since 2014 (and colleagues have run similar with Google Analytics even longer) that consistently show them misidentifying 40-50% of otherwise identified bot traffic as human.”
Other interesting comments addressed specific categories of bots:
“If you use Chrome to visit your website or any website you might as well count it as 2 hits. One Google bot and 2 you. If you look at your log files you will see this. It has been years since I looked at this but working off a bad memory I remember it as the content bot.”
“I am seeing over 50% bots. And this is after discounting all of Amazon’s ‘cloud’ AWS bots, as I block ALL of their IP blocks, all of them.”
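Blocking a whole provider’s IP blocks, as this member describes, amounts to checking each client address against a list of CIDR ranges. A minimal sketch with Python’s standard `ipaddress` module is below; the two CIDR blocks are illustrative placeholders only, since in practice one would load the provider’s published ranges (AWS, for example, publishes a machine-readable `ip-ranges.json` file).

```python
import ipaddress

# Illustrative placeholder ranges, NOT an actual provider blocklist; real
# deployments would load the provider's published CIDR ranges instead.
BLOCKED_NETWORKS = [
    ipaddress.ip_network("52.0.0.0/11"),
    ipaddress.ip_network("54.160.0.0/12"),
]

def is_blocked(client_ip: str) -> bool:
    """Return True if the client IP falls inside any blocked CIDR range."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in network for network in BLOCKED_NETWORKS)

print(is_blocked("52.23.45.67"))   # inside 52.0.0.0/11 -> True
print(is_blocked("203.0.113.5"))   # not in any listed range -> False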
Twitter announced new functionality to help users curate their feeds via muting. Users will now have the capability to mute the following types of accounts:
- Accounts that are new (that you don’t follow).
- Accounts that don’t follow you (that you don’t follow).
- Accounts you don’t follow.
- Accounts with a default profile photo (that you don’t follow).
- Accounts without a confirmed email address (that you don’t follow).
- Accounts without a confirmed phone number (that you don’t follow).
In its announcement, Twitter did not give any specific reasoning for the new feature.
In the WebmasterWorld discussion surrounding the announcement, JS_Harris suggests this may be the wrong approach: an unintended consequence may be muting people with differing opinions, creating a sort of personalization bubble for users. Waitwhiterabbit comments that the feature has precedent, as similar functionality exists for the Facebook feed, and that part of the intent may be spam control.
Join the discussions above to read insightful comments and contribute your thoughts!