What are some of the most effective technical SEO quick wins in 2022? That’s what we’re discussing on episode 22 of the Knowledge Panel, with Dixon Jones joined by Kara McClure from Mindshare, Sara Taher from PDFTron and Wilhemina Gilbertson-Davis from B-digitalUK.
Sign up below to watch future episodes live…
Want to Read Instead? Here is the Transcript.
Dixon: Hello again and welcome to the Knowledge Panel Show, Episode 22, and today is “Technical SEO” and quick wins. As you can see from my background here, I’m actually doing the show from North Wales, from Snowdonia, instead of my normal location, but nevertheless it’s good to see you all. Hope that the internet connection holds up in Snowdonia. We’ve got a fantastic panel today. Unfortunately, Sara couldn’t make it but I’m not too worried – well, I’m sorry for Sara, she’s been called away for an emergency – but Wilhemina and Kara, we were chatting before the show, and I think we’ve got some absolutely incredible experience within the world of technical SEO, so I think we’re going to be in pretty safe hands today. So Wilhemina and Kara, welcome to the Knowledge Panel Show! Why don’t we start off by you guys introducing yourselves. Kara, why don’t you go first – tell us who you are and where you come from, as the show panel host says.
Kara: Cool, thank you, and firstly thank you for having me, very honored to be here. So my name is Kara McClure, I’m currently at an agency called Mindshare as an SEO account director, leading on a few big clients, Apple being one of my main clients. I think that’s everything.
Dixon: So I don’t think we can top that, you know – SEO for Apple, that’s pretty cool. Wilhemina, can you beat it?
Wilhemina: I absolutely can’t beat that. I’m Amina, an SEO specialist at British American Tobacco and I work across the UK and Ireland accounts.
Dixon: It’s not exactly a small company either, so both of you have got some incredible challenges, both of you, different kinds of technical challenges, I would imagine. We were talking about age restrictions and things like that before we came on the show and you were talking about doing technical SEO in Python and things, Kara, so maybe we’ll come on to some of those ideas but before I do, I want to just bring in my producer, make sure that I have talked about all the things I need to do before we start, say hello and thanks for coming and stopping me from making too many errors, David.
David: The only one thing I want to say is you’re probably listening to the show on Apple podcast on Spotify, come and join us live next time, if that’s you. Just go to theknowledgepanelshow.com, sign up for the next show and hopefully we can see you interacting as part of the live chat for the next show and I’ll tell you more about what the next episode is towards the end of this one.
Dixon: Brilliant. Let’s get into the show and start with question one, which is: if people haven’t got time to be around for the whole show, and we wanted one technical SEO quick win for people to take away, what would you go for? So, one idea that people can take away as a technical SEO quick win – and I’ll go with Kara again.
Kara: There’s so much. I guess I would say the main thing is probably quite an old-school, basic one: make sure your Google Search Console is configured in the best way possible, because that actually gives you a lot of insight that people kind of overlook sometimes, and I think if there’s anything technically wrong with your site, Search Console is going to flag it up to you and you’re going to be able to get a bit under the hood and be able to fix certain things. So I would say make sure your Google Search Console is configured correctly.
Dixon: So is that really just mostly about setting up the domains? Because you used to be able to have all the different domain versions – I remember from my Majestic days, you had https majestic.com and then en.majestic.com and jp.majestic.com and every single version under the sun, most of which did nothing, but I had to put them all in. Is that still the case or have they fixed that now?
Kara: It’s still the case to a certain extent, so you still have to configure whether you sit on http or https. Hopefully everyone should be on https now for obvious reasons, so that side is fine, but in terms of the configuration I was more referring to making sure it’s linked to Google Analytics, making sure it’s linked to any other analytics tools that you have, making sure the Core Web Vitals stuff is organized in the way that you need it to be, so if anything is flagged up then you understand it, because Search Console will give you, I guess, a good visualization of how search engines are looking at you – predominantly Google, right, which is what everybody cares about. So I just think if anything technical needs to be flagged, before you even get to the fixing, Search Console is one of my go-to’s all the time.
Dixon: OK, good tip. Wilhemina, what’s your one tip for the top of the show?
Wilhemina: One tip… it’s quite a hard one.
Dixon: I think we seem to have lost a connection a little bit there. Is that me or is that Wilhemina?
David: I think it’s Wilhemina.
Kara: Yeah, I’m still here.
Dixon: We seem to have lost Wilhemina. We’ll carry on, you and I, Kara – we’re now down to two.
Kara: No pressure.
Dixon: Don’t worry about it, we’ll sort it out in post-production, don’t panic. Let’s carry on with Search Console then. Because there’s also Bing Webmaster Tools as well, and you get a lot of people saying: oh, you should check out Bing as well as Search Console, and then I don’t suppose many people do. I mean, do you? Does Apple?
Kara: I do use Bing Webmaster Tools – not necessarily for Apple, but I have used it in the past. I’m not sure how much people take note of the market share, particularly because a lot of laptops that you buy now, and even some smartphones, come loaded with certain browsers, and within those browsers – I know for Microsoft, for a time they were shipping everyone’s laptops loaded with Bing, and obviously if you’re loaded with Bing then people are probably just going to use that, because people are lazy, right? So people shouldn’t necessarily overlook the other search engines. Obviously their market share is minuscule compared to Google, but there still is a little bit of market share, so I think Bing Webmaster Tools is another good one. I haven’t used it recently, if I’m being perfectly honest, but I have used it in the past and it does give you some good insights as well, because with technical SEO I think the main thing is understanding what is happening behind the scenes, and Webmaster Tools or Search Console is going to tell you that.
Dixon: Excellent. Wilhemina, sorry, you seem to have a crash there, are you back?
Wilhemina: Yeah, mid speech.
Dixon: Nevermind, let’s go again, just jump in and tell us your tip.
Wilhemina: There are so many, but I would say: don’t underestimate the importance of the sitemap – that would be my tip.
Dixon: I guess on a large site that becomes even more important. I can just about hear you there – so have you got any particular tools you like to use to generate sitemaps, or are you lucky enough to have a CMS that does it all properly?
Dixon: I can hardly hear you.
Wilhemina: It’s frozen for me.
Kara: I think you’re still there.
Dixon: Yeah, you’re still here, but I’ll pass it over to Kara for a bit. Maybe if you can turn off your video then we’ll get it all right for the podcast and stuff – that might save some bandwidth and perhaps we can go on from there. Kara, do you guys use sitemaps a lot?
Kara: To be honest, when you asked the question, I was toying between Search Console or sitemaps, but I think I’ll still go with Search Console from the essence of seeing what’s under the hood, essentially. With sitemaps, I think the good thing is, again, you’re still telling search engines what’s going on – you’re telling them every single page that you want them to look at, you’re telling them what you want them to index, that kind of thing. And I know we mentioned WordPress before we jumped on live, and the one interesting thing I found with WordPress – I haven’t used it recently, but from what I understand you have to use plugins for WordPress to be able to actually generate sitemaps, whereas obviously if you build on a different kind of CMS you can auto-generate an XML sitemap, for example, or an HTML sitemap, and I think that’s probably one of the downfalls of WordPress, not auto-generating an XML sitemap, because I think it’s quite important. So I was actually auditing a site the other day and they just didn’t have a sitemap – you just couldn’t find a sitemap anywhere, and it’s quite a large website – and then I came to find that they were built on WordPress, and that’s obviously not great for SEO.
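To make the auto-generation point concrete, here is a minimal sketch of building an XML sitemap programmatically with Python’s standard library – the kind of thing a CMS or a WordPress plugin does for you. The URLs here are placeholders:

```python
from datetime import date
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemaps.org-compliant XML string for the given URLs."""
    ET.register_namespace("", SITEMAP_NS)  # emit un-prefixed tags
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for loc in urls:
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = loc
        # lastmod tells crawlers when the page last changed
        ET.SubElement(url, f"{{{SITEMAP_NS}}}lastmod").text = date.today().isoformat()
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    "https://example.com/",
    "https://example.com/products/",
])
```

A real generator would pull the URL list and modification dates out of the CMS database rather than hard-coding them, but the output format is exactly this.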
Dixon: OK, but I guess I’m kind of pushing back, because there’s only two of us now so we’re just gonna have to disagree. So many people use WordPress that I would imagine Google’s ability to crawl WordPress is incredibly good. What advantages does a sitemap have for Google? Surely it can get through the content pretty easily. Because one of the things about WordPress and plugins, as another SEO will tell me, is every time you put in another plugin, you put a nail in the coffin of SEO for WordPress, because it just sort of slows the site down – although I suppose a sitemap plugin wouldn’t necessarily slow a site down very much, because it’s just creating an extra page. But even so, what are the advantages of having the sitemap, do you think?
Kara: I guess WordPress is a bit of an anomaly within the field, but for me personally the advantage of having a sitemap is, again, it has a date, it has a timestamp – you’re literally updating search engines every time you’ve made a change. So, using Apple as an example, whenever they launch new products on their site, which they do often – as anyone who’s an Apple fan would know, it’s particularly around September or March when they’re releasing new products – we kind of auto-push and submit all the new sitemaps so all of those URLs can be indexed straight away, and if in fact Apple is doing that, then you kind of understand how important it is in the grand scheme of things. For me, the technical SEO standpoint is always that you’re constantly communicating with the search engine bots, and that’s the best way to do it from a technical aspect of making sure things are being indexed. But in saying that, you do make a good point in that obviously Google are good at understanding and crawling WordPress sites, so then I guess it’s more about how well technically built your website is on WordPress – you may not necessarily have to rely on a sitemap. But I guess if you have it, it’s a nice-to-have.
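For what it’s worth, when a large site submits several sitemaps like that, they are usually tied together in a sitemap index file following the sitemaps.org protocol; a minimal example, with invented file names and domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each child sitemap can be updated and re-fetched independently -->
  <sitemap>
    <loc>https://example.com/sitemap-products.xml</loc>
    <lastmod>2022-03-08</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-categories.xml</loc>
  </sitemap>
</sitemapindex>
```

Only the index needs to be submitted in Search Console; the individual sitemaps are discovered through it.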
Dixon: So if you put up a sitemap and you miss out a bunch of URLs, does Google just then not bother with those, or is it still going to find them through crawl discovery and add that into the mix?
Kara: Yeah, I guess it’s not the fact that they wouldn’t bother with it, but it may potentially take them longer to find them. I guess it depends on what signals you’re sending. If you’re sending those offsite signals and on-page signals, then I’m pretty sure that Google will probably notice the URL at some point. Obviously depending on your crawl budget, how often your site is crawled – it all depends on a lot of those different things as well. So I wouldn’t necessarily say if your URLs aren’t in the sitemap they won’t be noticed at some point, but for me I think technical SEO is all about being able to be in control, and when you have things like Search Console set up properly and sitemaps, you’re able to control a bit more. Of course, Google are always going to do what they want to do because, let’s be honest, Google don’t like SEOs, but anything that’s going to allow you to be in control, I would say, just get that under your belt so you can try and influence as much as possible – because the other standpoint is making sure you’ve got strong optimization onsite and offsite, to make sure that your URL is noticed if it hasn’t been submitted through the sitemap.
Dixon: So in days gone by you could use priorities in sitemaps quite a lot – I don’t think that’s honoured now – so sitemaps are really all about getting discovered and getting indexed in the first place. There’s nothing in a sitemap these days that can tell a search engine a priority, or your preference of one page over another. You used to be able to give a number between zero and one in the sitemap as the priority, but that was a long time ago, and it seemed to never do anything anyway, which is probably why they took it away. We’ll try one more time with Wilhemina – how are you, Wilhemina? If you’re online and fine, give us a tip and just chat away; if we fail again, don’t worry about it.
Wilhemina: I don’t know why my internet is working against me today or in this moment actually, but I’m back on, I can hear everyone and I think you can hear me as well.
Dixon: Yeah, but give us a tip. Oh no, not sitemaps, we’ve just done sitemaps while you were off, so we can move on. If you’ve got another thought then we’ll jump on to that one though.
Wilhemina: No, we can move on, I’m happy to move on.
Dixon: That’s all right then. OK, so let’s go on to 404s. Well it kind of moves on from… we’ve lost Wilhemina, I think we’ll probably go without Wilhemina for now.
David: She should be still there, actually, but I just asked her to turn off her camera.
Dixon: OK, that’s great. So let’s go on to the consequences of setting up new sitemaps. Presumably if you’re gonna put up a new sitemap, you’re also gonna, at that time, take some content down, so you’re gonna start ending up with 404s on pages – or, if you’ve been cleverer about it, 301s. But I suppose with something like Apple, if you’ve got new versions of software or new versions of products, does Apple prefer to remove URLs for redundant products or redundant content, or do they like to redirect it – whether that’s Apple’s view or your view. What’s your view, Kara?
Kara: I guess I’ll give my view on this one just because I don’t want to get too into the detail of Apple, because…
Dixon: Sure, I’ll stop querying Apple too much now.
Kara: I can’t share all of Apple’s secrets, but in a general sense, I think again with the sitemaps, I guess it depends how you manage it – whether it’s auto-generated or whether you manually manage it – and I guess it also depends on how massive your site is. What I will say, which I think is quite a good tip, is depending on how big your website is, you should have separate sitemaps for separate parts of your website. So if you understand that you’ve got loads of products, have a product sitemap; if you’ve got loads of categories, have a category sitemap – try to segment it out. So if you know there is going to be an instance where you may decide to take a whole load of products off your site, for example, you can refer straight to the product sitemap and edit that accordingly. So I would say if it’s manageable to edit the sitemap, then yes, of course remove them from the sitemap so Google can in turn remove them from the index. But personally, how I generally manage 404s is by redirects, because again with the sitemaps, depending on how things are set up – especially if you’re an agency and you’ve got to go through developers – you may not have access to the sitemap, you might have access to the CMS, you may not have access to certain things. I think redirects is probably still one of the stronger signals to send to Google to say: hey, this URL used to live here, but actually it lives over here now. And obviously there’s the difference between a 301 and a 302 – so, to use an example, a lot of my clients have been retail, like e-commerce, and they have sales a lot of the time, so a page may go down temporarily for a sale, and we’d implement a 302; but if we know that a page is being permanently moved for whatever reason, then we’d use a 301.
But in terms of 404s, more often than not you’re going to implement a 301, because it’s an error page, and if you know it’s not coming back then you just send that signal. And what does generally happen is, with a lot of the authority – depending how long the page has been a 404 for – you’ll be able to, I guess, mitigate as much loss as possible. So, for example, if the page has been a 404 for two days and you implement a redirect, you’re probably not really going to lose out; if it’s been a 404 for like a month, there’s probably going to be a lot more that you would have lost. So I would say implement redirects as quickly as possible when something becomes a 404, manage that process, and have a clear, logical step of understanding whether you want to put a 301 or a 302, and things like that.
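As a sketch of what that logic looks like in practice, here is how the 301/302 split might be expressed in an nginx config; the paths and domain are invented for illustration:

```nginx
# Permanent move: the product lives at a new URL for good,
# so a 301 passes link signals across to the new page
location = /products/old-widget {
    return 301 https://example.com/products/new-widget;
}

# Temporary: the sale landing page is offline but coming back,
# so a 302 keeps the original URL in the index
location = /sale/spring {
    return 302 https://example.com/sale;
}
```

The same mapping could equally live in an Apache .htaccess file, a CMS redirect manager, or an edge worker; the status code is what carries the permanent-versus-temporary signal.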
Dixon: And is Google Search Console giving you messages when you get 404s?
Kara: Yeah, it does. I guess the only thing – well, unless someone can tell me how – is that you can’t really set up alerts within Search Console and get notifications, so you have to go in there. But then it gives you quite a good understanding of server errors, even down to, if you do manage your sitemap, whether you’ve submitted URLs within the sitemap that are throwing up errors – Search Console will tell you that. If you have 500 errors, it will tell you that. It gives you a lot, and that’s why I say Search Console is good. I think it’s just a good kind of bible to have for technical quick wins, because the other side is running loads of crawls and all of that, which you can do as well, but depending on your site, your server size, your server logs, that could slow things down, whereas Search Console is just always plugged in, so it’s an easier thing.
Dixon: Wilhemina, are you there to talk about redirects, how you approach redirects?
Wilhemina: I am here, if you can hear me. In terms of redirects and how we use them – like Kara was saying, it’s about when to use them. So 301s are permanent redirects – something to use if that’s where something is going to permanently live. Again, it depends on where you want to use it and how you want to use it, so it’s about context.
Dixon: I’ve heard from a Googler – I’m not going to say which one – that they’re now treating 301s and 302s as the same. Firstly, do you think they do, and secondly, do you think they should?
Wilhemina: I don’t think so, no, I don’t think they should be treated as the same thing. I feel like they should be kept separate, the way they are now. That also kind of helps, because the whole point of this is to figure out where that page is going – so if it’s permanent, you use one, and if it’s not permanent, you use the other. Keeping those two things separate gives people that choice, rather than combining them both together.
Dixon: Yeah, I agree. Kara, do you think they should be different – that they should always be different, that there’s a really good reason for them to mean different things?
Kara: What do you mean in terms of they should or shouldn’t be different?
Dixon: Well, the Googler was saying that they effectively will treat a redirect as a redirect, regardless of whether it’s temporary or permanent, for their search engine, and I was surprised to hear that – it was a little while back. I was surprised to hear that, and I was surprised that somebody else from Google didn’t come down and make it perfectly clear that these are different things. There’s a good reason for having a temporarily-out-of-stock 302 and a permanently-out-of-stock 301 – there are good reasons for having different commands – and I’m just surprised that Google treats them the same. I suppose from their point of view, all they need to know is whether to show it in the search results or not.
Kara: Yeah. To be honest, I have heard murmurs of that – in terms of it doesn’t actually really matter, a redirect is a redirect – but I’ve always been of the notion that it’s still important to set the status code, because there are other redirect codes you can have as well – there’s a 308 that people don’t really speak about. So me personally, I would still use the logic and the process that I have, which is either a 301 or a 302. I think the balance is: if you don’t know when a URL is coming back, just put a 301; if you do know that it’s temporary, then put a 302. But I have heard murmurs that it doesn’t necessarily differentiate – there’s nothing to really prove that at the moment. So, I guess as with most things in SEO – I know you don’t want to hear this – it depends. It does depend, because there’s no black or white, right or wrong. Obviously there are a lot of things that we all do that are similar – things we know not to do within SEO and things we know to do – but in that sense there are a lot of grey areas around a lot of things, because Google are never really going to give us a definitive answer. Like I said, they don’t like SEOs, right, because they don’t make money out of us. We are the ones working out how they do what they do so we can manipulate it for our clients, or if we’re in-house, or whatever. So I still stand firm that I think you should state the status code, but I do understand where people are coming from when they say it doesn’t necessarily matter.
Dixon: Fair enough. All right, let’s move on to duplicate content then. Wilhemina, is that something that you guys have to tackle a lot, or do you not have that problem – two URLs for the same content? I mean, large websites must have it just in parameters alone; it must be appearing all over the place. How do you handle or mitigate duplicate content, Wilhemina? You’re on mute. You’re either on mute or dead. Offline. Oh, there we go, there you go.
Wilhemina: So thankfully duplicate content isn’t something that we encounter a lot. We try to make sure that all our content is different. In the cases where I have experienced it – I don’t know how to really mitigate it other than to make sure that, even if we do need to have the content in more than one place, it comes from different places, from different sources, if that makes sense?
Dixon: OK. And I guess also, Kara, do you guys use canonicals at all, in content for example?
Kara: Yeah. Interestingly, one of my other clients has got an issue at the moment with duplicate content, because they’re basically acquiring a lot of content from another website and they don’t have the resource to rewrite it, so they’re essentially copying all the content from another website, a totally different domain. And one of the things I was discussing is, usually I would suggest canonicals, but me personally, I’ve never had experience of implementing a canonical to a different domain – it’s always been within the same domain. So because this is two separate websites, my first reaction is to implement canonicals to let Google know that this is the more authoritative page that you should be looking at, but I’m not sure how Google are gonna feel about that because it’s two separate domains. So canonicals is kind of my go-to, but within duplicate content – and this is probably going to sound quite funny – my main thing about how to manage duplicate content is: just don’t have it. It’s not very wise to have it, it’s just not best practice, it’s not logical – why would you have it, you know… I always kind of refer to SEO as a book sometimes. When you read books, you don’t have two pages that are exactly the same, because that messes up the story – why would you have that? So how I manage duplicate content is to tell my clients: don’t do it. But if they do do it, then canonicals would be one option, and then the other is… sometimes they have duplicate content and they’re not really sure why they have it, so sometimes you can actually edit the content to differentiate it. So I recommend that as well. But yeah, I think canonicals are going to be the best way, or you can… I’ve done content audits before where I’ve seen: OK, you may have great content, but how many pages do you really need? Let’s actually do an audit and maybe archive some of these pages, or redirect some of these pages, get rid of some of them, and then re-optimize some of the other content as well.
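For reference, a canonical is just a link element in the head of the duplicating page pointing at the preferred version; a cross-domain canonical uses exactly the same syntax, though Google documents it as a hint rather than a directive. The URLs here are placeholders:

```html
<!-- Placed in the <head> of the duplicate page - it can point at a different domain -->
<head>
  <link rel="canonical" href="https://example.com/original-article/" />
</head>
```

Because it is only a hint, Google may still pick a different canonical if other signals (redirects, internal links, sitemaps) disagree with it.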
Dixon: I see duplicate content happening a lot, particularly on WordPress sites where people have put in a post and associated it with more than one category, and that automatically creates two URLs which essentially have the same content. There are ways around that within WordPress – I’m sure you can use plugins to stop that from happening, or the best thing to do is not to put one article in two categories. But also recently I’ve been seeing – probably not something that large organizations would like – people doing those kinds of redirects on the edge, on Cloudflare and these kinds of things. So if all of a sudden an old category URL turns up, it’ll redirect actually at the DNS level, which I’m sure would scare the heck out of some organizations, but it’s quite a powerful way to do SEO.
Kara: Yeah, there is that. I guess it depends on the level of duplication and, like you said, it would scare people – it depends on how aggressive you want to go with the duplicate content, and that’s why I say my advice is always: just don’t have it, because you alleviate a lot of issues. I get it – from some standpoints duplicate content is sometimes unavoidable. Again, working on some large sites – and I guess this is a bit of a separate issue – there’ll be loads of duplicate meta descriptions, for example, because if they sell a mass amount of products and the only differentiation is the color or something, they’re not really going to write a different meta description for every single URL just because they’ve got it in blue, red, black, green and yellow.
Dixon: So are you lamenting Google Search Console dropping the parameter tool? Somewhere in Google Search Console you used to be able to say: please ignore any parameters that say color equals whatever, and they’ve dropped that. Is that something you lament, or do you think they get it right all the time?
Kara: To be honest, with that I rely more on robots.txt for things that we want to be ignored or disallowed in the search results, and I guess it depends on how the parameters are rendered, because some parameter URLs just aren’t rendered in search – they’re quite ugly URLs with loads of different characters and that kind of thing. So I guess it depends: if you know for a fact that your parameter URLs are being indexed, or they’re being rendered in a particular way where people are going to land on them, then I would say fix it, but also disallow it in the robots.txt, right? Because with Google doing what they did in Search Console – to be honest, I wasn’t really convinced of how accurate that function in Search Console was anyway. For me, I like to have things a bit more succinct and a bit more black and white, and I think robots is the better way to go.
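One way to sanity-check a disallow rule like that before shipping it is Python’s built-in robots.txt parser. Note that it only does simple path-prefix matching – it does not support Google’s * and $ wildcard extensions – so this sketch blocks a parameter-heavy section by prefix, and the paths are made up:

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt blocking a faceted/parameter section of the site
rules = """\
User-agent: *
Disallow: /products/filter/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Filtered/parameter URLs under the blocked prefix are disallowed...
blocked = parser.can_fetch("*", "https://example.com/products/filter/?color=blue")
# ...while the plain category page is still crawlable
allowed = parser.can_fetch("*", "https://example.com/products/")
```

For rules that rely on Google-style wildcards (e.g. `Disallow: /*?color=`), you would need to test against a parser that implements those extensions, such as Search Console’s own tester.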
Dixon: I find Google ignore so many bits of robots.txt though, and it was never an official standard really, was it? So it’s kind of optional for search engines, really, so it’s not ideal – but we’re SEOs, where nothing’s ideal really, as you say. It is what it is.
Kara: Exactly. It is what it is, and with that, going back to my point before – not what you want to hear, but it depends. It’s more so because, even down to meta descriptions, Google are now deciding what meta description they want to show, so you can implement a whole bunch of meta descriptions, but actually Google may decide to go to the URL that you’ve written that meta description for and take a chunk of your on-page text, because they think that’s the best meta description.
Dixon: And they’re doing that a lot – like 60% of the time, or 40 to 60% of the time; I’ve seen a couple of different surveys and things, so it’s huge. But when I saw a study looking at the changes they made, most of the changes were sensible ones, so they were helping the user at the far end. Still, it was a shock to the SEO community when they started changing your content on the fly, which, I guess, would have been an interesting one to put back to Wilhemina, because that might affect her – she’s in a regulated industry. So are there going to be occasions where a search engine may inadvertently cause somebody, especially somebody in a regulated industry, to be breaking the law, because they legally have to say something in the description, for example?
Kara: That’s interesting, actually, because I had that issue with one of my clients – there’s a restriction they have from a dimensions point of view within particular markets. They have to state the dimensions of the products, and because of Google doing whatever they wanted to do, we would put it in the meta description but it wouldn’t always show, because Google decided to take something else. But obviously I can’t speak for Wilhemina – I’m assuming within her industry, whatever needs to be put out there, they would have on site anyway, so there would be some hope that if Google are pulling from on-page, then it would pull the right thing, or at least pull something. And that’s why I think, being within SEO – and I’m not sure about the latest update, the Google MUM update, which is all very wishy-washy as it always is – with the more and more updates that come, from a technical standpoint it’s about just making sure that your site is technically sound and as accessible as possible, because even though Google are never going to give you the clear answer, that’s exactly what they care about. They care about the user. With all of these updates that are coming out – you know, the update before was about product reviews – there’s so much that they’re trying to do just to serve the user and give the user everything they need without the person who’s googling working too much. With anything to do with technical, you just have to make sure your site’s built correctly and you’re able to understand everything that’s going on under the hood, so things are accessible for the user – and then I guess that kind of crosses over into CRO, but that’s a whole separate conversation.
Dixon: Yeah, but I think that comes to the heart of the philosophy of SEO as a potential problem for a search engine in the end, because – hear me out here, because this is the old man talking – it’s like they’ve given us so many hoops to jump through: making sure that all your images are well tagged, that all these things are done right, that you’ve got schema on the page, or you’ve got things in bullet points and you write in the right style. And what’s starting to happen is that in most markets, pages that are not optimized, that were never designed to be optimized, are never going to reach the top of the SERPs. But the interesting thing for me is that the real authority on any particular subject doesn’t give a flying fig about the SERPs – they’re just going to write about life on Mars or whatever their specialist topic is – and if Google doesn’t understand the author of that content, or doesn’t know them, they’ll never get to show that to the people. So they’re actually shooting themselves in the foot by telling everybody to do CRO and UX, because it’s only the ones that do it that show up in the SERPs, and therefore you actually suppress the genuinely authoritative content. Is that the old man talking, or do you think I’ve got something?
Kara: I’m never going to call you “old man”, it’s offensive, but no, I wouldn’t say it’s crazy what you’re saying. I think again it goes back to that whole point of “it depends”, depending on what you do. A lot of people ask me why you would even do SEO, because it’s such a long-term channel, there are so many factors that can impact things, and Google can literally pull the rug from under your feet at any given moment. But for me, especially from a technical standpoint, think of technical SEO as like a house: technical SEO is the foundation that people don’t see when they come to your house, the brickwork, the electrics, the plumbing, all the stuff that keeps your house structurally sound, while everyone just cares about the wallpaper, the flooring, the sofa, the TV or whatever. But it’s about understanding that if they are being told to do CRO, being told to do stuff like that, I guess it’s similar to how a builder or an architect would be told how to draw up a plan for a house, because you’re told to make it accessible. At the end of the day, if it’s an e-commerce website, or you make revenue from it, you want people to convert, so it’s one thing to make your website discoverable and make sure it can be indexed and on the first page of the SERPs, although there are studies now suggesting even the second page is the place to be, but who knows.
Dixon: Really? OK.
Kara: Yeah, well, as in people are getting more… I think searchers are getting a little bit more inquisitive with their searches. Google’s changing the setup so much now: there’s so much above the fold, there are map results, featured snippet results, People Also Ask results, video results, social media results, knowledge panels, there’s so much that can come into play. I don’t think there are any stats out about it yet, but there probably will be, because we’ve been having a lot of discussions about how far people scroll now. You know there was that time when Google trialled continuous scroll, where you didn’t even have to click through to a new page, you just kept scrolling, you didn’t know what page you were on. You could be on page four but you were just scrolling and scrolling and scrolling. So I think CRO definitely goes hand in hand with it, and I remember doing a project for Selfridges on conversion, working with UX a lot more, and I was like, guys, we should probably work a lot more closely together, because we’re the ones focused on bringing all the traffic to your site and you’re the ones focused on converting people, but if the two aren’t matching then it’s not going to work. I can drive a million visits to your site in a week, but if your navigation is poor then no one’s going to want to convert.
Dixon: Then no one converts from there. And I think you might have something by saying that people are digging deeper into the search results, because I’ve been surprised recently by a couple of non-SEOs who said: “Oh, I never click on ads, I go straight past the ads.” Because Google has put so many ads up there now, you’ve got to consciously scroll down to see anything except ads a lot of the time, and that puts people in the habit of digging into the results a little bit more, so you may well be correct on that. I’d be interested to see those studies. If we find one, we’ll put it in the comments on the various podcasts if we can, or at least post it on The Knowledge Panel Show’s website. I just want to finish up, I think. I wanted to talk a little bit about crawlers. I don’t think we’ve got much time, but actually maybe we will, because we might have to do some editing because of Wilhemina’s technical challenges. So do you use crawlers much yourself, Screaming Frog or Sitebulb or Oncrawl or those kinds of things, and if so, what do you use and what do you like about them?
Kara: It’s probably gonna sound really sad, but I love crawlers, because crawlers tell me what I need to know. Screaming Frog I guess is my main day-to-day tool, the one I would just run a crawl with to check something, and I think I’ve got to the point of using the more advanced features of Screaming Frog that probably most people don’t look at: building regexes, building XPaths, extracting a lot of different things that you need from it. So Screaming Frog is definitely a go-to for me. I also use Deepcrawl. Deepcrawl have actually advanced quite a lot compared to when I first used the tool when I stepped into the industry; it’s definitely come a long way.
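The custom extraction Kara describes (Screaming Frog’s regex and XPath extractors) can be approximated in plain Python. As a minimal sketch, this pulls the text of every H1 out of a page using only the standard-library HTML parser; the sample markup is made up for illustration:

```python
from html.parser import HTMLParser

class H1Extractor(HTMLParser):
    """Collect the text content of every <h1> on a page."""
    def __init__(self):
        super().__init__()
        self.in_h1 = False
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.in_h1 = True
            self.headings.append("")  # start a new heading buffer

    def handle_endtag(self, tag):
        if tag == "h1":
            self.in_h1 = False

    def handle_data(self, data):
        if self.in_h1:
            self.headings[-1] += data  # accumulate text inside the current <h1>

# Hypothetical page markup, standing in for a fetched HTML document.
html = "<html><body><h1>Main heading</h1><p>Copy</p><h1>Second heading</h1></body></html>"
parser = H1Extractor()
parser.feed(html)
print(parser.headings)  # ['Main heading', 'Second heading']
```

In practice a crawler would loop this over every fetched URL; the point is just that the same “extract one field from every page” idea behind Screaming Frog’s custom extraction is a few lines of code.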
Dixon: And they’re really good for large sites, aren’t they? Not a (unintelligible) Screaming Frog for a large site.
Kara: Yeah, so if I want to do a big, massive crawl on a huge site, I would set the crawl off in Deepcrawl and let it run overnight. If I’m just trying to find something really quickly, I’ll just grab it from Screaming Frog. Those are the two tools that I would use. And, slightly off topic, I think log file analysis is definitely a big thing that people don’t look into from a technical aspect, and for that Botify is probably one of the best tools I’ve seen.
Dixon: What do you find in log files that you don’t find in web crawlers? A lot of those 500 responses, presumably?
Kara: Yeah, a lot of 500 responses. It’s interesting, because log file analysis gives you a more detailed view of how search bots are actually looking at every single thing that happens when they crawl your website and your pages, and it kind of deciphers that and puts it into its own little database. With log file analysis you’re literally getting deeper under the hood of understanding how your website is being treated from a bot perspective, and you’re able to manage the 500s. Even Search Console and tools like that will flag up 500 errors, but log file analysis is a lot more real time: with Search Console you might only see those 500 errors flagged up a day or two later, whereas log file analysis is a lot more reactive.
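The kind of check Kara describes, finding which status codes search bots are actually hitting, can be sketched in a few lines of Python. This assumes access logs in the common combined format; the regex, the sample log lines and the bot token matching are illustrative, and a real setup would adjust them to the server’s log format:

```python
import re
from collections import Counter

# Matches the status code and user agent in a combined-format access log line
# (an assumption about the log layout; adapt the pattern to your server).
LOG_PATTERN = re.compile(r'" (\d{3}) \d+ "[^"]*" "([^"]*)"')

def count_bot_status_codes(log_lines, bot_token="Googlebot"):
    """Count HTTP status codes for requests whose user agent mentions bot_token."""
    counts = Counter()
    for line in log_lines:
        match = LOG_PATTERN.search(line)
        if match and bot_token in match.group(2):
            counts[match.group(1)] += 1
    return counts

# Made-up sample lines: two Googlebot hits (one of them a 500) and one browser hit.
sample = [
    '1.2.3.4 - - [01/May/2022:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '1.2.3.4 - - [01/May/2022:10:00:01 +0000] "GET /broken HTTP/1.1" 500 0 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '5.6.7.8 - - [01/May/2022:10:00:02 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]

print(count_bot_status_codes(sample))  # only the two Googlebot requests are counted
```

Run over yesterday’s log file instead of waiting for Search Console, this is exactly the “more reactive” view of bot 500s that Kara mentions, which is also why the big commercial tools exist: on large sites these logs run to gigabytes.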
Dixon: The problem with log file analysis for some of the clients that you have, you know, these Apples and Selfridges, is that they’re really big files. There’s an awful lot of data in those files, so it’s probably not so easy to go and have a look at last month’s log files.
Kara: No, the log file analysis that I have done has been on smaller sites for sure, but it’s good for giving that more detailed overview of what is happening, because it’s important. From a crawler perspective I’d say Deepcrawl and Screaming Frog are definitely good for understanding what’s happening. There are some Chrome plugins that people can use as well; obviously they’re not exactly crawler plugins, but they give you a top-line overview if you want to check something. And also Chrome Developer Tools: it’s not really a crawler, but it does give you more insight into the HTML, you can edit stuff, you can really understand what you’re seeing, and you can change the device and see how things look from a responsive view. So yeah, there are a lot of good tools out there.
Dixon: So I wanted to finish up by asking a question about how you assess, how you want to and how you actually do assess, the effects of the changes that you make. What kind of feedback loops do you have at your disposal, so you can say, right, okay, we’ve made this change on the server, or we’ve added this script here? Do you always have time to assess the change, or do you, a lot of the time, just say, well, we’ve fixed that, let’s move on?
Kara: No, I do a lot of impact analysis, and I guess because some of the work that I do is a lot more technical now, if there’s been a technical release that we fed into particularly, and we said that if you do this release then it’s going to give an SEO benefit of x or whatever, I would give it maybe one to three months, depending on what the release is and what impact we’re expecting. So, I’m trying to think of one that was done recently. If we’ve made a massive update to, say, meta descriptions, we might look at click-through rate. For every release you have to understand what you think the impact should be, and that’s my first port of call. But I would definitely say that with anything I do, there needs to be some proper impact analysis, otherwise it’s a bit pointless: why are we fixing things?
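The before/after click-through-rate check Kara mentions can be sketched very simply. The figures below are made up, standing in for Search Console-style totals for the month before and the month after a release:

```python
def ctr(clicks, impressions):
    """Click-through rate as a fraction; 0 if there were no impressions."""
    return clicks / impressions if impressions else 0.0

# Hypothetical monthly totals either side of a meta-description release.
before = {"clicks": 1200, "impressions": 40000}
after = {"clicks": 1500, "impressions": 41000}

ctr_before = ctr(before["clicks"], before["impressions"])
ctr_after = ctr(after["clicks"], after["impressions"])
relative_change = (ctr_after - ctr_before) / ctr_before

print(f"CTR before: {ctr_before:.2%}, after: {ctr_after:.2%}, change: {relative_change:+.1%}")
```

The one-to-three-month window Kara gives herself matters here: a single week’s numbers are noisy, and as Dixon points out next, other releases and Google updates land in the same window, so a change like this is evidence rather than proof.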
Dixon: The difficult thing with that, though, is that it depends on the cadence with which you do releases, and of course the cadence with which Google does updates, because you can make a change and then one to three months later you’ve potentially made a dozen other changes and Google’s done three updates. So does that nullify a lot of the impact analysis, or is it just par for the course and you’ve got to go with it anyway because there’s no other choice?
Kara: I guess, I mean, if anything’s broken you’ve got to fix it, right? Sometimes there are fixes that just need fixing, and you’re not necessarily going to see an impact, but if you don’t fix them the consequences will be more detrimental. Say you’ve got loads of broken pages: fixing them may not necessarily give a massive boost to performance, but you know it’s just bad practice to have broken pages. But if there’s a fix with a massive impact, let’s take page speed for example. Core Web Vitals are always going to be a thing, page speed is always going to be a massive thing, especially with mobile-first indexing and all the focus on mobile, so page speed is a project that I’m constantly working on, and I’m constantly reviewing: okay, cool, we’ve done this, now what’s the load time on this page, how many seconds have we shaved off, and things like that. So I think you’ve just got to do it case by case and understand which of the fixes you’re doing are going to have an impact. With all the fixes that I put forward, personally, we always set a bit of a KPI, an understanding of what we’re trying to achieve, because there’s no point in suggesting a fix if we don’t have a KPI.
Dixon: So you’ve got something to measure after the event, that’s brilliant. Honestly I know we’ve had some technical challenges on this one but this has been thoroughly interesting and I’d love to talk more and hopefully you don’t mind if I reach out on LinkedIn as well because you’re a very interesting person and if we’re ever at a conference I’d love to buy you a beer or a wine or whatever. David, what’s going on next time on The Knowledge Panel Show?
David: Next time we’re actually back to Mondays, we were broadcasting live on a Wednesday this time because it was bank holiday Monday in the UK and probably many other places around the world as well. So next time it’s going to be Monday the 16th of May. We’re going to be talking about SEO and Spanish versus SEO in English and we’ve got four great guests booked for that one already: Adelina Bordea, Filipa Gaspar, Gemma Fontané and Montserrat Cano. So just go to theknowledgepanelshow.com, sign up to watch that one live – if you can join us live that will be superb.
Dixon: OK, and I haven’t spent any time talking about our sponsor, InLinks, so hi InLinks, thanks for being on the show. It’s alright, they can’t sack me, I’m the CEO. Guys, see you on the next episode, and thank you very much for coming along, and Kara, again, thank you very much.
Kara: No problem, thank you for having me.
Transcript edited on 11th September 2022.