Marx Communications

Summary of AMA Interview with John Mueller, SEO & Webmaster Trends Analyst at Google (2018)

Keith Peterson
Jul 16, 2021

SEO AMA: Is there any benefit to implementing paywall markup on a website with a loginwall? How do bots/spammers affect trends and search volume, and how does Google filter out those effects?

Q: Can we get confirmation that these links will not be discounted by Google's Mobile First Indexing, or should webmasters prioritize important links being visible on load?

A: Just purely for usability reasons I'd always make important UI elements visible. Technically, we do understand that you sometimes have to make some compromises on mobile layouts (the important part is that it's loaded on the page, and not loaded on interaction with the page), but practically speaking, what use is search ranking if users run away when they land on your pages and don't convert? :-)

Q: How do bots/spammers affect trends and search volume? What do you do to filter out those effects?

A: The internet is filled with bots; we have to deal with it, just like you have to deal with it (filter, block, etc.) - it can get pretty involved, complicated, expensive... Heck, you might even be pointing some of those at our services, for all I know :-/. For the most part, it seems like our teams are getting the balance right for filtering & blocking, though sometimes I see "captcha" reports from people doing fairly reasonable (though complicated) queries, which I think we should allow more easily.

Q: If you are creating a site that is heavy in SVG animation, and the only real difference any crawler can detect is the h1 and meta data used on the page (for the purposes of the site, I don't think we need text content on there), how would you go about making it more search-friendly?

A: We're not going to "interpret" the images there, so we'd have to make do with the visible text on the page, and further context (eg, in the form of links to that page) to understand how it fits in. With puns, especially without a lot of explanation, that's going to make ranking really, really hard (how do you know to search for a specific pun without already knowing it?). What I'd recommend there is to find a way to get more content onto the page. A simple way to do that could be to let people comment. UGC brings some overhead in terms of maintenance & abuse, but could also be a reasonable way to get more text for your pages.

Q: Crawl map visualizations: how close are they to what Google actually sees when crawling, parsing, and indexing a website? Nowadays, a few SEO tools are putting their focus on crawl maps, and I am wondering if that method of presenting nodes and edges is too simplified and old-school (related to old PageRank patents), or whether the maps are useful and help us understand internal architecture. What are your thoughts? Thanks

A: FWIW I haven't seen crawl map visualizations in a long time internally. It might be that some teams make them, but you might have heard that the web is pretty big, and mapping the whole thing would be pretty hard :-). I love the maps folks have put together, the recent one for paginated content was really neat. I wouldn't see this as something that's equivalent to what Google would do, but rather as a tool for you to better understand your site, and to figure out where you can improve crawling of it. Getting crawling optimized is the foundation of technical SEO, and can be REALLY HARD for bigger sites, but it also makes everything much smoother in the end.

Q: When might we expect the API to be updated for the new Search Console?

A: Oy, I hear you and ask regularly as well :-). We said it wouldn't be far away, so I keep pushing :-)

Q: When are you going to bring the extended data set to the existing API version? What new features are planned for the new API version?

A: As I understand it, the first step is to expand the Search Analytics API to include the longer timeframe. Past that I'd like to have everything in APIs, but unfortunately we have not managed to clone the team, so it might take a bit of time. Also, I'd like to have (as a proxy for all the requests we see) a TON of things, so we have to prioritize our time somehow :-). At any rate, I really love what some products are doing with the existing API data, from the Google Sheets script to archive data to fancy tools like Botify & Screaming Frog. Seeing how others use the API data to help folks make better websites (not just to create more weekly reports that nobody reads) is extremely helpful in motivating the team to spend more time on the API.

Q: Is it necessary to set an x-default for your alternate hreflang tags? Or is it enough to set an alternate lang for every language/site?

A: No, you don't need x-default. You don't need rel-alternate-hreflang. However, it can be really useful on international websites, especially where you have multiple countries with the same language. It doesn't change rankings, but helps to get the "right" URL swapped in for the user.
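For illustration, hreflang annotations for a site with English and German versions, plus an x-default fallback for everyone else, might look like this (the URLs are hypothetical):

```html
<!-- In the <head> of every language version of the page: -->
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
<link rel="alternate" hreflang="de" href="https://example.com/de/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

As the answer notes, the x-default line is optional; it only tells Google which URL to show when no language version matches the user.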

Q: We see a difference in indexed pages between the old and the new Search Console (https://ibb.co/kZShvS, https://ibb.co/gZTY27). Is this a bug, or does it have another reason?

A: If it's just a matter of multiple languages, we can often guess the language of the query and the better-fitting pages within your site. Eg, if you search for "blue shoes" we'll take the English page (it's pretty obvious), and for "blaue schuhe" we can get the German one. However, if someone searches for your brand, then the language of the query isn't quite clear. Similarly, if you have pages in the same language for different countries, then hreflang can help us there.

Q: Is Google trying to move away from link based authority or will they always play some part in the overall algorithm?

A: I still click on links a lot when browsing, so I guess there's that :). SEOs often focus on links too much; it doesn't always help their sites in the long run, and it's probably not the most efficient way to promote a website.

Q: Will not updating content make a site lose its ranking?

A: Updating content doesn't automatically make it rank better. Some pages change all the time, others stay pretty much the same for years, all of them can be very relevant to users.

Q: Does Google Search rely on data sources not produced by the crawl for organic-ranking purposes? What might some of those be?

A: We use a lot of signals(tm). I'm not sure what you're thinking of, but for example, we use the knowledge graph (which comes from various places, including Wikipedia) to try to understand entities on a page. This isn't so much a matter of crawling, but rather trying to map what we've seen to other concepts we've seen, and to try to find out how they could be relevant to people's queries.

Q: How is keyword cannibalization seen by Google? People believe that having multiple pages about the same topic confuses search engines and that hurts their chances of ranking.

A: We just rank the content as we get it. If you have a bunch of pages with roughly the same content, it's going to compete with each other, kinda like a bunch of kids wanting to be first in line, and ultimately someone else slips in ahead of them :). Personally, I prefer fewer, stronger pages over lots of weaker ones - don't water your site's value down.

Q: I've been seeing a lot of inconsistencies between different markets of Google; e.g., Google.co.za operates differently from Google.co.uk, etc. Will all markets eventually be the same? Are the smaller markets like Google.co.za a bit delayed? Will they ever be on the same level as the larger ones? I'm asking in terms of the Google algorithm and ranking factors / penalties.

A: For the most part, we try not to have separate algorithms per country or language. It doesn't scale if we have to do that. It makes much more sense to spend a bit more time on making something that works across the whole of the web. That doesn't mean that you don't see local differences, but often that's just a reflection of the local content which we see.

The main exception here is fancy, newer search features that rely on some amount of local hands-on work. That can include things like rich results (aka rich snippets aka rich cards), which sometimes don't get launched everywhere when we start, and sometimes certain kinds of one-boxes that rely on local content & local policies being aligned before launch. I'd love for us to launch everything everywhere at the same time (living in Switzerland, which seems to be the last place anyone - not just Google - wants to launch stuff, grr), but I realize that it's better to get early feedback and to iterate faster than it would be to wait until everything's worked out and then launch something globally that doesn't really work the way we wanted.

Q: I am trying to run a split test on my site by showing a new version of a page to 10% of traffic and the original page to 90%. For Googlebot, do we split at runtime in the same proportion, or push all of Googlebot to the original design so as not to confuse it?

A: We need to write up some A/B testing guidelines, this came up a lot recently :). Ideally you would treat Googlebot the same as any other user-group that you deal with in your testing. You shouldn't special-case Googlebot on its own, that would be considered cloaking. Practically speaking, sometimes it does fall into one bucket (eg, if you test by locale, or by user-agent hash, or whatnot), that's fine too. For us, A/B testing should have a limited lifetime, and the pages should be equivalent (eg, you wouldn't have "A" be an insurance affiliate, and the "B" a cartoon series). Depending on how you do A/B testing, Googlebot might see the other version, so if you have separate URLs, make sure to set the canonical to your primary URLs. Also, FWIW Googlebot doesn't store & replay cookies, so make sure you have a fallback for users without cookies.
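The "treat Googlebot the same as any other user-group" advice above amounts to bucketing every visitor deterministically, with no user-agent special-casing. A minimal sketch of such a split (the function name, the SHA-256 choice, and the cookie/ID scheme are illustrative, not anything Google prescribes):

```python
import hashlib

def assign_bucket(visitor_id: str, test_share: float = 0.10) -> str:
    """Deterministically assign a visitor to the 'B' (new) variant for
    test_share of traffic, based on a hash of a stable visitor ID.
    Every visitor - crawlers included - goes through the same logic,
    so Googlebot is never special-cased (which would be cloaking).
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    # Map the first 8 hex digits (32 bits) to a fraction in [0, 1).
    fraction = int(digest[:8], 16) / 0x1_0000_0000
    return "B" if fraction < test_share else "A"
```

Because the assignment is a pure function of the ID, the same visitor always sees the same variant, and over many visitors roughly 10% land in "B". Since Googlebot doesn't store and replay cookies, a visitor without an ID should simply get the default "A" page.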

Q: The historical standard for paraphrase or quote attribution arose from an industry where the content was first purchased and then shared (ie: libraries) - content creators were paid before attribution. On the web, the content is shared without purchase but with Google Search, attribution led to visits, offering a different form of payment (often ads). But with voice search, the likelihood of attribution yielding a visit is very low. How does Google intend to encourage healthy content production on the web given the disruption to the attribution funding model which has powered content creation since, well, the beginning of the printed word?

A: I'll let you in on a secret: the health of the web-ecosystem is something that pretty much everyone on the search side thinks about, and when we (those who talk more with publishers) have stories to tell about how the ecosystem is doing, they listen up. Google's not in the web for the short-term, we want to make sure that the web can succeed together for the long-run, and that only works if various parties work together. Sure, things will evolve over time, the web is fast-moving, and if your website only makes sense if nothing elsewhere changes, then that will be rough. As I see the SEO/web-ecosystem, there are a lot of people who are willing to put in the elbow-grease to make a difference, to make changes, to bring their part of the world a step further, and not just cement existing systems more into place. (Personally, it's pretty awesome to see :-))

Keith Peterson | I'm an expert IT marketing professional with over 9 years of experience in various Digital Marketing channels such as SEO (search engine optimization), SEM (search engine marketing), SMO (social media optimization), ORM (online reputation management), PPC (Google Adwords, Bing Adwords), Lead Generation, Adwords campaign management, Blogging (Corporate and Personal), and so on. Web development and design are unquestionably another of my passions. In fast-paced, high-pressure environments, I excel as an SEO Executive, SEO Analyst, SR SEO Analyst, team leader, and digital marketing strategist, efficiently managing multiple projects, prioritizing and meeting tight deadlines, analyzing and solving problems.

