SEO
What SEO tools could you not live without and why?
Any tool that helps you do keyword research, check backlinks, understand your keyword rankings, see where your traffic is coming from, how each user interacts with your website, and track conversion numbers is very valuable. Any of the very popular ones like Majestic, Semrush, Moz, or Ahrefs can help you do that, and some do certain things slightly better than others, so you have to pick and choose depending on your preferences. I personally like Majestic and Ahrefs.
What is more valuable, long-tail or short-tail keywords?
Both are valuable, but long-tail keywords tend to attract a different group of searchers, so they may be in a different stage of the buyer's journey than people who use short, specific-intent keywords.
What is canonical Issue? How to resolve this issue?
A canonical issue arises when the same page can be accessed from multiple URLs. This can lead to duplicate content within the same website. For example, in e-commerce the same page can be accessed through various filters like color, size, and price. To resolve this issue, a canonical tag (a `<link rel="canonical">` element) is placed on each duplicate version of the page, pointing to the original.
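A minimal sketch of the tag described above, placed in the `<head>` of each duplicate version (the example.com URLs are placeholders):

```html
<!-- On every filtered/duplicate URL, e.g.
     https://www.example.com/shoes?color=red&size=9,
     point back to the preferred version of the page: -->
<link rel="canonical" href="https://www.example.com/shoes" />
```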
Migrations
Combining websites for exponential value. Quick disclaimer: I've never had to do that. But what I do know is that when you have multiple sites and you combine them, the result isn't always what you anticipate. I had a buddy who had a few websites he thought he could turn into an authority site. He mentioned that it wasn't easy to do and the results weren't very predictable. Maybe he missed a thing or two, but I believe combining websites is a very delicate process that has to be done properly.
What is the best way to get a page indexed in Google?
Content
Tell me about one website where you did your best SEO work so far. The discussion then covers the implemented SEO strategy, traffic growth, leads generated, and number of conversions.
Plumbing websites
What is the function of the robots.txt file?
This file tells crawlers which pages to crawl and which ones not to. Google has a budget for how many pages it can crawl, so it's important to let it crawl your most valuable pages. It can negatively affect your rankings if it takes too long to crawl your pages.
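A small example of the file described above, served at the site root; the paths and sitemap URL are placeholders:

```
# robots.txt at https://www.example.com/robots.txt
User-agent: *
Disallow: /cart/
Disallow: /search
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```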
Can we use rel=canonical tag for cross domain?
Yes, you can use the rel=canonical tag across domains too. Google announced support for the cross-domain canonical tag in 2009. This type of canonical tag is generally used for syndicated content.
Is there any limit for robots.txt file?
Yes, Googlebot reads only the first 500 KB of a robots.txt file.
How do you see what pages on your site Google has indexed, and why is this information important?
You can see which pages Google has indexed with a `site:` search (e.g. site:example.com) or via the index coverage report in Search Console; the URL inspection tool shows the status of an individual page and also lets you request indexing. This information is important because pages that aren't indexed can't rank, so it tells you whether Google can actually see your most valuable content.
How to prevent crawler from indexing a webpage?
You can ask crawlers not to index your web page in two ways. The first is to block that particular page in the robots.txt file (though this only prevents crawling, and a blocked page can still be indexed if it's linked from elsewhere). The second, more reliable way is to use the NOINDEX attribute on the page with a meta robots tag.
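Both methods described above can be sketched as follows (paths are placeholders):

```html
<!-- Method 2 (reliable): in the <head> of the page to keep out of the index -->
<meta name="robots" content="noindex">

<!-- Method 1: in robots.txt — blocks crawling only
User-agent: *
Disallow: /private-page/
-->
```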
Indexation
canonicalization, robots.txt, meta-tags
Content Optimization
Keyword-targeted, search-engine-friendly URL; optimized title tag; optimized meta description; H1 tag; target keyword in the first paragraph; optimized H2 tags; mobile-first design layout; social shares; inbound/outbound links; images and video; page speed.
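The on-page factors listed above can be sketched as a page skeleton; all keyword, URL, and copy values are placeholders:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta name="viewport" content="width=device-width, initial-scale=1"> <!-- mobile-first -->
  <title>Target Keyword | Brand</title>
  <meta name="description" content="Compelling summary containing the target keyword.">
</head>
<body>
  <h1>Target Keyword as the Page Heading</h1>
  <p>The first paragraph mentions the target keyword naturally.</p>
  <h2>Supporting Subtopic</h2>
  <img src="compressed-photo.webp" alt="Descriptive alt text">
  <a href="/related-page/">internal link to related content</a>
</body>
</html>
```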
How has hummingbird changed the landscape of search?
Hummingbird (rolled out in 2013) moved Google from matching individual keywords to interpreting the meaning and intent behind whole queries, especially conversational and long-tail searches, so content that answers topics in natural language now matters more than exact-match keyword usage.
How have you utilized structured data to earn featured snippets?
Adding schema.org markup (FAQ, HowTo, product, etc.) helps Google understand and extract the content; combined with concise, well-formatted on-page answers, it improves the chances of winning rich results and featured snippets.
What is Google Sandbox
Google sandbox is basically an imaginary place where Google keeps new websites before giving them high ranking positions in the SERPs. The reason for putting a new website in the sandbox is to check whether the website is genuinely well optimized or is using black-hat tactics to rank well in search results. A domain can stay in the sandbox anywhere from less than one month to eight months.
How do you check the crawl rate of a site and why is this important
Google Search Console shows you how often your pages have been crawled and whether any issues were found. It's important because that's how you find out what's going on with the structure and the technical side of your website. Making frequent site updates, sharing new content with other blogs, and creating new internal links will encourage the bots to crawl your site.
What are the five most important on-page optimization factors?
H1 tags, title, alt text, keyword density, internal links
What is the difference between HTML and XML sitemaps?
An HTML sitemap is generally created for a better user experience, so users can easily navigate to all internal pages from a single page; it also gives information about the website's structure. An XML sitemap is created for search engines so that Googlebot can easily crawl, index, and discover new pages. An XML sitemap also offers extra options, such as adding additional information about pages (images, video, news, etc.) and setting a change frequency for each page (daily, weekly, monthly).
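A minimal XML sitemap of the kind described above, following the sitemaps.org protocol (the URL and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/shoes/</loc>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```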
On a scale of one to ten, how important is site speed to the optimization process?
If the website takes too long to load, you'll end up with a huge bounce rate, which isn't good because Google takes note of that and pushes sites with much quicker speeds in front of users, who don't want to wait to get answers to their queries.
Character limit for meta tag description
160
Character limit for title tag ?
60 characters for a title tag
Status codes
A 200 status code tells Google there's something worth indexing here. 3xx codes are redirects, 4xx codes are client errors (like 404 Not Found), and 5xx codes are server errors.
What are 301 and 302 redirects? When should I use these redirection methods?
A 301 redirect, also called a permanent redirect, informs crawlers that your page content has moved permanently to another page. A 301 passes the link juice completely to the new page. Permanent redirection is used when a domain is moved to a new CMS or when the page URL structure is changed. A 302 is commonly referred to as a temporary redirect. It doesn't pass the link juice to the new page; it informs the search engine crawler that your content is only offline temporarily. Temporary redirection is widely used in e-commerce for category pages where a product is sold out or out of stock.
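On an Apache server, both redirect types described above could be sketched in an .htaccess file like this (paths and domain are placeholders):

```apache
# 301: the page has moved permanently; link equity follows
Redirect 301 /old-page/ https://www.example.com/new-page/

# 302: the page is only offline temporarily
Redirect 302 /summer-sale/ https://www.example.com/holding-page/
```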
Define duplicate content and its relation to search engines.
Duplicate content is an exact copy of an original piece of content. Google does not like duplicate content and tends to penalize websites that have it, or at the very least gives the keywords within that piece of content no credibility. So if you rank with a 5,000-word article that's a duplicate of an original piece of content, you'll lose that ranking in no time and may pick up penalties as a result. Google will also be looking at your website closely for bad links.
What is EMD Update of Google?
EMD stands for Exact Match Domain. It is a Google filter that prevents low-quality sites from ranking well simply because they have commercial keywords in their domain name.
What are the five most important off-page optimization factors?
External links, social signals, web 2.0's, guest posting, directory submissions
How to target any specific country audience for your business?
To target people in a specific country, you should first have a domain with that country's TLD (top-level domain); for example, to target Canada you would want a domain like http://www.example.ca. Your website's IP should also be from the same region, so host the website within that country. The next important step is the geographical setting in your webmaster tools account: go to the Search Traffic section, choose International Targeting, and under the Country section select the country audience you want to target.
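Another common way to signal country targeting, complementary to the ccTLD approach above, is hreflang annotations in the page `<head>` (the URLs are placeholders):

```html
<link rel="alternate" hreflang="en-ca" href="https://www.example.ca/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
<!-- x-default is the fallback for users who match no listed locale -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```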
List some tools that you use for your daily SEO activity?
Google Analytics, Google Webmaster Tools, Open Site Explorer, Alexa, Ahrefs, Screaming Frog, Xenu
How quickly after making changes to a page should you expect to see an impact in search?
It really depends on the website and a few other factors, like how competitive the keywords are and how many authority sites are on the first page; typically 30–90 days.
Structured Data
It's how you mark up your content as a specific set of items and sub-items so the search engine understands it better and is able to pull information from the markup and show it in rich snippets. So if a user searches for a neck pillow, they can see the item right away and decide whether that's what they're looking for; they're more likely to click on the item and go over to the website to check out that specific subcategory. There's also a sitelinks search box where users can search the website for specific information. They can find out about events, locations, and variations of the specific thing they searched for; if they search for a biology course, there can be anchor links under the meta description for anatomy and physiology. That by itself can increase visitors. You can use Google's Structured Data Markup Helper to structure website content, use the structured data testing tool to validate your markup for the pages you want, and then use Google Search Console to track the progress.
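The markup described above is typically written as a JSON-LD block in the page `<head>`; here is a sketch for the neck-pillow example, with all product values as placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Memory Foam Neck Pillow",
  "image": "https://www.example.com/img/neck-pillow.jpg",
  "description": "Ergonomic travel pillow.",
  "offers": {
    "@type": "Offer",
    "price": "24.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```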
What is the single best way to find out what your customers are looking for?
Keyword intent
Crawling
Log file analysis allows you to understand exactly how search engines crawl your website as every request made on the hosting web server is saved. Is your crawl budget spent efficiently? What accessibility errors were met during crawl? Where are the areas of crawl deficiency? Log analysis can also help you determine whether your site architecture is optimized, if you have site performance issues, and so on. Bot crawl volume can show you if you have been crawled by a specific search engine. For instance, if you want to get found in China but Baidu is not crawling you, that is an issue. XML sitemaps help crawlers crawl your website and because crawlers usually have a budget and won't crawl every page on your website, the sitemap serves as a guide for the crawlers to know which pages are important.
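A minimal sketch of the kind of log-file analysis described above, assuming combined-format access-log lines (the sample lines, paths, and IPs are made up):

```python
import re
from collections import Counter

# Minimal combined-log-format parser: pull the requested path, status
# code, and user agent out of each line.
LOG_RE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def bot_hits(log_lines, bot_name="Googlebot"):
    """Count how often a given bot requested each path."""
    hits = Counter()
    for line in log_lines:
        m = LOG_RE.search(line)
        if m and bot_name in m.group("agent"):
            hits[m.group("path")] += 1
    return hits

sample = [
    '66.249.66.1 - - [10/Oct/2024:13:55:36 +0000] "GET /shoes/ HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Oct/2024:13:55:40 +0000] "GET /cart/ HTTP/1.1" 404 321 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [10/Oct/2024:13:56:02 +0000] "GET /shoes/ HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]

print(bot_hits(sample))
```

In practice you would feed this the real server log and compare the paths Googlebot actually requests against the pages you want crawled, which surfaces wasted crawl budget (e.g. bots spending requests on /cart/).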
How much do broken and redirecting links impact your optimization efforts?
Quite a bit, because they can reduce traffic if users aren't able to access the page they're looking for, so doing a 301 redirect is important. Common causes include making a typo when you create a link; deleting an image, video, file, or an entire web page; renaming or moving a page and forgetting to update your internal links; linking to content (images, videos, PDFs) that has been deleted or moved; and changing domain names or moving the site to a new URL.
Mobile
Progressive Web Apps are a version of your website with pre-cached content, so users can access your website without having to download an app and can navigate it regardless of how good their connection is. This assurance is increased by the use of responsive design, which allows PWAs to be used regardless of device form factor. It's huge for SEO and can increase conversion rates by 15%. Quick and smooth. A major benefit of using AMP is that you can improve load times on specific pages without having to redo your entire site: simply create an AMP template for a given content type and use a rel="amphtml" tag on the non-AMP version of the content to ensure the AMP version is accessible to search engines. The AMP version then points back to the non-AMP version with a canonical tag.
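The AMP cross-linking described above looks like this in practice (the article URLs are placeholders):

```html
<!-- On the regular (non-AMP) page: -->
<link rel="amphtml" href="https://www.example.com/article/amp/">

<!-- On the AMP version, pointing back: -->
<link rel="canonical" href="https://www.example.com/article/">
```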
Site structure
Slugs need to be well sectioned under a parent page so that everything is a lot more organized. It's like going into the grocery store to buy some dairy and finding produce in that section: it doesn't belong, and you can't silo very well that way. Building the right internal links also helps search engines understand how different pieces of content are related to each other. Having a URL structure that's scalable is also important, so keeping it simple and easy for users to understand matters, especially if there are multiple locations.
How to remove Canonical Issue ?
The best and most effective way to resolve a canonical issue is with a permanent 301 redirect. This can be implemented in a number of ways; the server your website is hosted on determines which method you use to implement the redirect.
What is the importance of the title, description, and keyword meta tags?
Search engine crawlers, just like users, need to understand what your page is about, and to do that they look at the title, description, and keyword meta tags to get a sense of the page. (Note that Google has ignored the keywords meta tag for ranking since 2009, so the title and description carry the weight.)
How to increase the page load speed?
To improve the page load time, follow these steps: use external CSS for web pages; avoid large images on the page; compress images before using them on the page; and remove irrelevant code from the page.
How to remove a page from Google Indexing?
To remove a web page from Google's index, you can use the Remove URLs option in webmaster tools. Before submitting the page there, first remove it from your website so that it returns a 404 or 410.
What makes a URL SEO friendly?
URLs are SEO-friendly if the user can accurately guess what the piece of content is about, if they contain some keywords, and if hyphens are used to separate the keywords, because Google prefers them.
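A quick before/after illustration of the point above (the domain and paths are placeholders):

```
Hard to read:  https://www.example.com/p?id=829&cat=17
SEO-friendly:  https://www.example.com/running-shoes/mens-trail-shoes/
```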
What's the ideal speed to load a website
Under half a second is ideal according to Google, and they don't want it to be more than two seconds.
On a scale of one to ten, how important is validated HTML and CSS to optimization?
Very important, because that's how you're able to optimize the structure and overall appearance of the website. It's also how search engines are able to crawl your site and get a deeper understanding of the layout and the content on the website.
