At Pubcon Las Vegas 2016, Gary Illyes sat down with author Eric Enge for a casual conversation about a not-so-casual topic. Gary answered direct questions about the Penguin 4.0 integration into Google’s core algorithm and how real-time processing will affect ranking changes going forward. Gary also took many questions from engaged viewers on a wide variety of topics. The following are highlights of the topics discussed in the morning segment.
Real Time Updates in Penguin 4.0
Gary clarified the concept of “real time” as it applies to Penguin’s new processing power. Although Penguin 4.0 identifies problematic links faster than ever, Google may still look deeper into certain links and a site’s overall link profile. Your website’s historical link profile still plays the major role in your ranking and is not necessarily altered or discounted by the new real-time capabilities. Sites with a history of poor or spammy links won’t be saved by Google’s new real-time link processing. New links will be taken into consideration almost instantly, but actual ranking changes will occur gradually over time. The calculation process has become more advanced, but faster processing doesn’t equate to erratic or frequent changes. With the increase in link processing comes a greater need for webmasters to disavow poor links, and the disavow tool is an important way to do so.
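A disavow file is a plain text file uploaded through Google Search Console. A minimal sketch of the documented format (the domains and URLs below are placeholders, not real examples):

```
# Links from this domain were part of a paid link scheme
domain:spammy-link-network.example

# Individual URLs we asked to have removed but could not
http://low-quality-directory.example/our-listing.html
```

Lines beginning with # are comments; the domain: prefix disavows every link from that domain, while a bare URL disavows only that page.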
Negative SEO Calculations by Penguin 4.0
Questions about negative SEO were raised once again. And, once again, there was a general tone of “What negative SEO?” It was explained that poor links will still be evaluated as poor links and great links will still hold their value. Reviewing your site’s link profile is an important exercise in managing your website. Cases of negative SEO are so sparse that almost no one at Pubcon Las Vegas 2016 had a great example with an accompanying prescription.
Machine Learning isn’t the Only Solution
Machine learning is great for some tasks but is not the solution for all algorithms. Manually built algorithms often outperform machine learning components. Gary used the analogy of the multi-tool pocket knife: you wouldn’t use a multi-tool for burrowing into concrete. Likewise, machine learning isn’t the right tool for every operation. Many components of search algorithms are more efficient when they are built by humans.
The Future of Search Engine Optimization
A smaller but very engaged group of Pubcon Las Vegas 2016 attendees posed several questions to Gary Illyes about topics affecting the future of website rankings. In no particular order, these are some of the responses given by Gary on issues facing websites moving forward.
No Index Tags on Key Pages
Gary Illyes mentioned in his discussion with Eric Enge Wednesday morning that he sends out around 20 emails a week about noindex tags found on key web pages. Oftentimes websites carry the noindex tag, which tells Google not to include a page in its search index, on some of their most important pages.
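The tag in question is the standard robots meta tag. A minimal sketch of what an accidental noindex looks like:

```html
<!-- Placed in the <head> of a page. If this is left on a key page
     by mistake, Google will drop that page from its search results. -->
<meta name="robots" content="noindex">
```

Auditing templates for this one line is a quick check; a noindex left over from a staging site is a common way important pages disappear from search.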
Server Error Messages
404 errors and other server errors are still an important issue for credibility and “crawlability”. Gary recommended that webmasters check Search Console for crawl errors of all types. Persistent 404 errors will damage your site’s reliability. 301 redirects and custom 404 pages need to be in place to correct potential dead ends in user browsing on your sites.
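As a sketch of both fixes, here is what they might look like in an .htaccess file, assuming an Apache server (the paths are placeholders; nginx and IIS use different syntax):

```apache
# Permanently redirect a removed page to its replacement (301).
Redirect 301 /old-page.html /new-page.html

# Serve a custom 404 page instead of the bare server default.
ErrorDocument 404 /custom-404.html
```

The 301 tells Google the move is permanent so ranking signals can be consolidated on the new URL, while the custom 404 page keeps users who hit a dead end on your site.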
Is your site easily crawled by Google? Does your code allow Googlebot to easily follow your content? It would seem that an older topic in the world of SEO is still at the forefront of the minds of Google engineers, because crawl issues can ultimately affect user experience and lead to lower engagement. Having pages cached on the server as simple HTML remains an important way to increase speed and consumption.
Look into the Future of Mobile Page Delivery – Progressive Web Apps and Google AMP Project
Gary Illyes cautioned digital marketers to look to the future. How can we optimize the experience for the mobile user? Mobile’s hottest issue right now is speed. Gary mentioned two important new technologies in serving faster mobile pages — progressive web apps and the Google AMP project. Google is focusing a lot of resources on supporting optimized page delivery technologies.
About Google AMP
AMP stands for Accelerated Mobile Pages, a Google-backed project designed as an open standard for any publisher to have pages load quickly on mobile devices.
On Feb. 24, 2016, Google officially integrated AMP listings into its mobile search results. Pages making use of AMP coding appear within special places in the search results and/or with a special “AMP” designation.
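An AMP page is ordinary HTML with a few required additions. A minimal sketch (the canonical URL is a placeholder, and the lengthy required amp-boilerplate CSS is omitted here for brevity; see the AMP Project site for the full required markup):

```html
<!doctype html>
<html ⚡>
<head>
  <meta charset="utf-8">
  <!-- The AMP runtime, loaded from Google's CDN -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <!-- Points to the regular (non-AMP) version of the page -->
  <link rel="canonical" href="https://example.com/article.html">
  <meta name="viewport" content="width=device-width,minimum-scale=1">
  <!-- Required amp-boilerplate <style> block omitted for brevity -->
</head>
<body>
  <h1>Article headline</h1>
</body>
</html>
```

The ⚡ attribute on the html element marks the document as AMP, and the canonical link lets Google connect the fast AMP copy to the original page.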
For more about AMP, see the AMP Project site.
What are Progressive Web Apps?
Progressive Web Applications take advantage of new technologies to bring the best of mobile sites and native applications to users. They’re reliable, fast, and engaging.
With the updates that have been rolling out as part of Penguin 4.0, many of the core issues in SEO are being revisited in more advanced ways. While these are topics that webmasters have dealt with in the past, Penguin 4.0 ties these factors together more tightly and creates more efficient valuation processes that should more quickly reward websites managed in accordance with Google’s guidelines.