General SEO - ME


navigation

- To search engines, this looks like every page on your site is linking to those top-level pages. Internal linking shows the importance of different pages within your site, so pages in the universal _______________ will rank better than pages further down in the __________al structure.
- Every link in the universal ______________ will have a backlink from every page on your site. These links should point to your highest-level, most important pages.

302

A __________ redirect is a temporary redirect. It passes 0% of link authority and, in most cases, should not be used, as the HTTP/1.1 specification changed the meaning of a 302 from "Moved Temporarily" to "Found". When a __________ is used instead of a 301, search engines might continue to index the old URL and disregard the new one as a duplicate. Link popularity might be divided between the two URLs, hurting search rankings. Google will treat a 302 as a 301 if they think the webmaster has made an error, but you can never be too careful.

URL redirect

A ___________ is a function of a web server that sends a user from one URL to another. Redirects commonly take the form of an automated redirect that uses one of a series of status codes defined within the HTTP protocol. Redirects can be applied to a single page, a subfolder or an entire domain.

site search

A ____________________ that is relevant, optimized in the right places (URL, title, headings) and has unique content is no longer a poor-quality page, and now has potential to be targeted as a valuable page. As Maile Ohye wrote: "If your site design surfaces category pages similarly to search result pages, adding valuable content to the page makes the content more helpful to the searcher (and no longer just search results)". In this sense, some internal search result pages can be indexed if doing so serves business purposes.

over-indexation, under-indexation

A domain can have many thin and duplicate content pages appearing in the SERPs (_______________________), while not having the actual unique landing pages indexed (______________________).

XML sitemaps

According to case studies (details below), Google finds over 70% of the important URLs through ____________ and only about 30% of the URLs through crawling the interlinking structure (often referred to as "Discovery").

RSS feeds

Adding ____________ can help a site's ranking, but in a less direct way than the unique content available on your pages.

404

Additional navigation options should be added to the ______ page to ensure the user does not leave the site and can find the information they're looking for. A search box or a link to the homepage will also be of more value to the user than a simple ______ page.

sledgehammer

Always remember that robots.txt is a sledgehammer and is not subtle. There are often other tools at your disposal that can do a better job of influencing how search engines crawl, such as the parameter handling tools within Google and Bing Webmaster Tools, the meta robots tag, and the x-robots-tag response header.

XML sitemap

An ___________________ is a document that helps Google and other search engines better understand your website while crawling it. It will contain a list of URLs on your website, but to get more out of it, you should also include other information such as the last modified time, how often the page is updated, and the URL's priority.
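
A minimal sketch of such a file, following the sitemaps.org protocol (the URL and values are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2016-10-18</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
    </urlset>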

trailing slash

Another well-known issue with URLs is the ______________. There is no evidence to suggest that having or not having a trailing slash is better for SEO, but one has to be chosen over the other for consistency. The following instruction, added to the .htaccess file, will remove any trailing slashes:
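
A commonly used rule for this; a sketch assuming an Apache server with mod_rewrite enabled, to be adapted before use:

    RewriteEngine On
    # Leave real directories untouched
    RewriteCond %{REQUEST_FILENAME} !-d
    # 301-redirect any URL ending in a slash to the slashless version
    RewriteRule ^(.*)/$ /$1 [R=301,L]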

TLDs

Apart from the most commonly used .com and .net, there are also many other __________ that are used across the board, e.g. .info, .edu, .biz, etc. A list of TLDs can be found on this Wikipedia page. It is recommended to use an appropriate TLD for the relevant site. For most business sites that are targeting global audiences, .com is obviously the safe choice. However, for sites that are targeting a specific country such as the UK, it might work out better if .co.uk is picked, since Google UK may favour content from within the region.

duplicate

As a general rule, the robots.txt file should never be used to handle _______________ content. There are better ways. Rel canonical is your best friend here.

caption tags

As human beings, our attention is naturally drawn to images, and our eyes almost instinctively look at the text associated with those pictures. For that reason, this is a good opportunity to emphasise some points or even make some suggestions. Although this practice might not reward the website as a ranking factor alone, it is surely a smart way to keep users on the page for longer, which is valued by Google (time on site and user engagement). What is the tactic called?

suppressed listing

Blocked content can still appear in search results, leading to a poor user experience in some cases. When Googlebot is blocked from a particular URL, it has no way of accessing the content. When a link appears to that content, the URL is often displayed in the index without snippet or title information. It becomes a so-called "__________________________" in organic search.

iFrames

Can Google Crawl ________________ on Webpages?
- Previously, ____________ could not be crawled by Google or other search engines. Although users could read the content, the fact that it was not crawled by search engines obviously had a negative effect on the site's SEO.
- In some instances, web crawlers were able to enter an _____________ but would become stuck and could not get out of the ____________ to crawl the rest of the website.
- Google is now capable of crawling ____________ on webpages, as long as the ______________ are SEO friendly. However, it is suspected that Google will ignore _______________ that point to a different domain rather than the top-level domain.

directives

Disallow statements within the robots.txt file are hard _______________, not hints, and should be thought of as such. _________________ here are akin to using a sledgehammer.

URL readability by human beings

Due to the limits of current technologies, it is still a very valid practice to improve readability for both human beings and search engines. The following illustration, created by Rand Fishkin from MOZ, gives a clear explanation of the importance of a readable URL structure:

about us

E-commerce sites, transactional sites, and online services sites need a strong '____________' section because users often wonder who's behind a Web-based service, how it's funded, and whether it's credible. If you order from an e-commerce site, can you trust the company to ship the package? Will it accept a return if the product arrives in poor condition? If you register on a site, will it sell your personal information to anyone who can pay, and thus expose you to endless spam about everything?

CSS files

Ensure ______________ are not blocked in robots.txt. For similar reasons, javascript assets that assist in rendering rich content should also be omitted from disallow statements, as these can cause problems with snippet previews.

404

Ensure that any _____ pages do not return a 200 status code, as this tells the search engines the page has rendered correctly.
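
One quick way to verify this is a HEAD request from Python (a sketch; the URL is a placeholder): a correctly configured server should answer a missing page with 404, not 200.

    # Assumes: pip install requests
    import requests

    # HEAD request to a URL that should not exist
    r = requests.head("https://www.example.com/this-page-does-not-exist")
    print(r.status_code)  # should print 404, not 200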

JavaScript

Essentially, the benefit of being able to crawl _______________ effectively is that pages will hopefully rank better in Google due to a better flow of link juice from navigation menus, and improved relevancy and quality of content from crawlers being able to view text and images effectively. However, the use of excessive _______________ is currently rampant, and often a browser has to make a significant number of additional requests and spend time downloading this ______________. Now that the Googlebot has to do this too, many sites' load times in the eyes of Google are likely to increase, which may in turn have negative impacts on rankings. This may even result in higher crawler abandonment rates: since the Googlebot won't be capable of processing extremely complex ___________________, it might give up on indexing pages that use a lot of __________________ to produce an unconventional user experience.

resized

Every webmaster should already know that all images used on their site should ideally be hosted on their site too. With that said, what many webmasters might not know is that images should be ___________ to the actual need before being uploaded and used on webpages. Although many are used to modifying the parameters to adjust the size of a photo, they may have overlooked the fact that large image files quietly take up storage space and put extra burdens on visitors' browsers and bandwidth to load and resize the images. Search engines have stated that fast-loading sites are rewarded for better user experience, so reading text content while waiting and guessing what the images look like is certainly not a good experience.

404

Finding _______ pages: You can find ______s using Screaming Frog or Google Search Console. In Search Console, you'll navigate to Crawl and select Crawl Errors. At that point you'll select the 'Not found' tab to find the list of ______s Google has identified. Click on one of these URLs and you get a pop-up where you can select the 'Linked from' tab. What you're looking for are instances where your own site appears in the 'Linked from' section. On larger sites it can be easy to spot a bug that produces these types of errors by just checking a handful of these URLs.

hyphens

For domain names that consist of two or more words, you may want to consider using hyphens to separate the words and make the name easier to read. However, many spammy websites use domain names with hyphens, which means your domain name with ____________ might get frowned upon. Generally speaking, it's better not to use more than one hyphen, or in other words, to use only one or two words in your domain name.

Python and Ruby

For those who are eager to program their own scraper, the programming languages ___________________________________ offer a wide range of libraries that can be used to download full pages or just certain elements. Scrapers built with Python or Ruby are practically the fastest and most versatile scrapers; however, learning to build or even install these tools might take some time.
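
As a minimal illustration (not a specific tool referenced above), a Python sketch using the widely available requests and BeautifulSoup libraries; the URL is a placeholder:

    # Assumes: pip install requests beautifulsoup4
    import requests
    from bs4 import BeautifulSoup

    response = requests.get("https://www.example.com/")
    soup = BeautifulSoup(response.text, "html.parser")

    # Extract the page title and every outgoing link URL
    print(soup.title.string if soup.title else "No <title> found")
    for anchor in soup.find_all("a", href=True):
        print(anchor["href"])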

subfolder

In the experiment carried out during 2011-13, MOZ saw its blog section receive a much better ranking status after being moved to a ________________ from a subdomain. It is recommended that subfolders are the safe choice, whilst subdomains should be chosen only when absolutely necessary.

JavaScript

Google can execute some ______________ to find content, but Google has limitations on what it can do and what it can understand. The best practice remains the same: put the content you want Google to crawl and index in basic HTML. You can use jQuery tabs to put the content in one file instead of AJAX tabs that spread the content across several files. In short, make it easy for Google to access your content. However, this all depends on a case-by-case basis and an understanding of client needs and abilities. Google stated in 2014: "You should allow Googlebot access to the JavaScript, CSS, and image files that your pages use. This provides you optimal rendering and indexing for your site. Disallowing crawling of Javascript or CSS files in your site's robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings." A final point to mention is that Google's algorithms change all the time, and web languages and features come and go in popularity. It was only recently that Flash was used all over the internet, and now sites that use Flash are penalised by Google due to security vulnerabilities etc. Therefore, if a site is built entirely in JavaScript and Google suddenly backtracks on their embracing of it, the site will essentially have to be rebuilt from scratch, rather than simply having features removed.

Web 'spider' or 'crawler' or 'robot'

Google's ______ constantly crawl webpages across the internet in order to build up a picture of how the internet is interconnected. It's these links, and the value that they pass, that make them valuable to SEO.

Site Search

How to Handle _____________ Pages
- The right approach for site search pages is usually a combination of tools.
- Robots.txt is the most emphatic and easiest method; however, if you already have tens of thousands of site search pages indexed, you'll want to use meta robots noindex.
- Parameter handling can be very effective too, provided your URL query strings are encoded as a series of field-value pairs.
- To avoid wasting crawl budget, it is worth changing the internal search form from the GET method to the POST method, so that search crawlers can't "type" in the search box, as sketched below.
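
A sketch of what that change looks like in the form markup (the action path and field name are illustrative):

    <!-- GET form: crawlers can discover and request /search?q=... URLs -->
    <form action="/search" method="get">
      <input type="text" name="q">
    </form>

    <!-- POST form: the query is no longer exposed as a crawlable URL -->
    <form action="/search" method="post">
      <input type="text" name="q">
    </form>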

301

If a page receives important links, gets a substantive volume of visitor traffic, or has an obvious URL that visitors or links are intended to reach, a _____redirect should be employed to the most relevant page possible. Ensure you don't redirect every single 404 page to the homepage but to the most appropriate page for the user.

Deep Crawl

If one needs to crawl hundreds of thousands (or even millions) of links and has a budget for it, _______________ will be able to offer a range of options to do so. Similarly to Screaming Frog, different crawl settings (exclusion and inclusion criteria, crawl speed, crawled content etc.) can be set beforehand. It also offers a range of reports that help to understand how the website is built up.

link

If the image is sourced from elsewhere, it might be a good gesture to credit the source with a link, which some readers may find very helpful. On the other hand, it might be a good linking opportunity to find out who has 'borrowed' the image without accreditation. Therefore it might be worth a few minutes to drop them a line requesting a _________.

RSS

If you add _______ feeds that have been aggregated or customized, they will contain a flow of keywords that you will not necessarily rank for, but if they are related semantically (by meaning) to other words and phrases on your site, they will be seen as relevant. Additionally, the freshness of information is also a measure of how well your pages will rank.

XML Sitemap Tree

If you have many sitemaps, you can use a sitemap index file as a way to submit them all at once. The XML format of a sitemap index file is very similar to the XML format of a sitemap file. The sitemap index file uses the following XML tags:
- sitemapindex: the parent tag surrounding the file.
- sitemap: the parent tag for each sitemap listed in the file (a child of sitemapindex).
- loc: the location of the sitemap (a child of sitemap).
- lastmod: the last modified date of the sitemap (optional).
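
A minimal sketch of a sitemap index using those tags (URLs and dates are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>https://www.example.com/sitemap-products.xml</loc>
        <lastmod>2016-10-18</lastmod>
      </sitemap>
      <sitemap>
        <loc>https://www.example.com/sitemap-blog.xml</loc>
      </sitemap>
    </sitemapindex>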

70

If you keep your titles under _____ characters, you can expect at least 95% of your titles to display properly. If the title is too long, engines will show an ellipsis ("...") to indicate that a title tag has been cut off. That said, length is not a hard and fast rule.

10%

If you're tracking internal site search, what search terms do visitors use once they're on your site? On average, only _____ of visitors use site search. So, it's safe to assume that most people only use site search if they have a hard time finding what they want with your navigation. What terms are visitors searching for? Do you have that page? Is it hidden?

.html , .php

If your URLs end with unnecessary extensions such as ________ or _____, these can also be removed via .htaccess by adding the following instruction to the file:

    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteCond %{REQUEST_FILENAME}\.html -f
    RewriteRule ^(.*)$ $1.html

canonicalisation

In general, duplicated pages should either be tagged with _____________________ or redirected to the main page to keep. In the case of redirection, do bear in mind to avoid multiple hops of redirection (URL 1 -> URL 2 -> URL 3).
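
The canonical tag goes in the <head> section of the duplicate page and points at the version to keep (the URL is a placeholder):

    <link rel="canonical" href="https://www.example.com/main-page/" />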

'affiliate link' + 'nofollow'

In order for a link to pass SEO benefit / Domain Authority / SEO 'Juice', to another domain, the link cannot be a shortened ___________ or a ___________ link.

case-sensitivity

It has been noted that ___________________ in URLs may not cause trouble if the servers are Microsoft/IIS-based. However, it's definitely not the case for sites hosted on Linux/UNIX-based servers. Ideally, webmasters may want to see all URLs that use the wrong case automatically redirected to the target URL.

TLDs

It is also a common practice for many international companies to use country-specific _________ with hreflang tags as part of their international online marketing strategies. This may be particularly important when it comes to other search engines. For instance, it is well known that Baidu is not as multilingual-friendly as Google, therefore a separate TLD such as .com.cn or .cn is generally recommended for the Chinese search market. Additionally, a sitemap that indicates alternate language pages may prove to be helpful too.

302

Keep in mind that not every broken or _________-redirecting page has quality links, and reinstating or redirecting these might yield diminishing returns for rankings. This lost link equity analysis should be an integral part of backlink profiling, or at least it should consider only external links that reach some sort of quality standard.

extension

Limited use of regular expressions is supported. This means that you can use wildcards to block all content with a specific extension, for example the following directive, which will block Powerpoints:

    Disallow: *.ppt$

404

Link equity from links with a ________error will not be passed to the rest of the site as search bots see these as a "dead end".

URL

Match ______ to title if possible. This practice not only improves the relevancy of the content, but also helps to bring clarity to the content in the eyes of searchers and search engines. In such a case the content is more likely to meet the searcher's expectations. As mentioned earlier, if a shorter URL is preferred, webmasters may wish to consider omitting some stop words, e.g. and, or, but, the, of, etc., provided that doing so does not cause major misleading issues.

160

Meta description limits vary in length depending on the search result. Sometimes Google allows 2 lines, sometimes 3 or more. It is recommended to still keep meta descriptions within _____ characters to reduce the likelihood that they are truncated (18/10/2016).

hyphens and underscores

Most modern search engines are now treating ___________________ very similarly. It is still common to see spaces being used in URLs, but they are rendered as %20, which is not ideal for readability.

Image

Most webmasters understand the importance of having HTML/XML sitemaps for their sites, but some of them might not know that an _________ sitemap can be created.

product

Navigation doesn't stop at category pages. On a ___________/conversion page, you want your visitors to be able to go back to the navigation page they were on before they got to this product page (vertical linking) and hop around to similar product pages (horizontal linking).

equity

No equity will be passed through URLs blocked by robots.txt. Keep this in mind when dealing with duplicate content.

noindex

One important note: while robots.txt will create these undesirable suppressed listings, use of meta robots _______________ will keep URLs from appearing in the index entirely, even when links appear to the URLs (astute readers will note this is because meta noindex URLs are crawled).

301

Pages with lost link equity should be reinstated or _________-redirected to preserve the ranking authority they gained.

keyword-rich

Past research has also shown that many searchers tend to read the URL before clicking. Similarly to a keyword-rich domain, a ______________________ URL may get copied and pasted regularly without any anchor text; in such a case the URL itself acts as the anchor text, so the relevant keywords in the URL will make a positive impact on the site's ranking. Please note this is by no means a suggestion to stuff keywords into URLs, which is likely to be penalised by search engines if detected.

Penguin penalty

The ___________ is a Google algorithm to uncover spammy backlink profiles; it devalues the impact of spammy backlinks on rankings and penalizes websites employing low-quality backlinks by de-ranking them.

site map

People rarely use site maps. According to Nielsen-Norman research, only 7% of users turned to the _________ when asked to learn about a site's structure. Site maps are essentially a secondary navigation feature.

YouTube + Vimeo Video

Pros:
- No cost for hosting and maintenance
- Great brand exposure on PC, mobile and smart TV
- User-generated-content friendly
- Content may rank well for being on YouTube
- Supportive, cost-effective and ever-evolving creator-friendly features (playlists, analytics)
Cons:
- Unplanned and uncontrollable advertising injected by the hosting sites
- Uncontrollable related videos (including videos from competitors)
- Limits on file sizes and formats
- Content ranking may not translate to own-domain ranking, as the video mainly helps the video-sharing platforms
- Might be pushed to consider a paid service (YouTube Partner Program)

Self-hosting Video

Pros:
- Full control of the physical video files
- Full ownership of the video content
- Full control of video display settings
- Customisable embed code
Cons:
- High cost of hosting infrastructure
- Greater management effort
- More technical challenges
- High cost of in-house tech support

Really Simple Syndication

RSS stands for 'Really Simple Syndication'. It is an XML-based data format which is used to syndicate (i.e. distribute) the contents of a web page, especially dynamic web pages. RSS feeds are the syndicated content itself. Technically speaking, an RSS feed is an XML file which contains channel information (i.e. website information) and one or more items from a website. These items are generally the news items of a website or blog.
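
A minimal sketch of such a file in RSS 2.0 format (titles and URLs are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <rss version="2.0">
      <channel>
        <title>Example Blog</title>
        <link>https://www.example.com/</link>
        <description>Channel information about the website</description>
        <item>
          <title>An example news item</title>
          <link>https://www.example.com/news/example-item</link>
        </item>
      </channel>
    </rss>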

302

Search bots encountering _______ temporary redirects will ignore these and move on, as they will assume that the page will be reinstated.

intentional

So for your website, you'll want to be very intentional about what items you place in these spots. Think about what is most important for your typical visitor.

duplicate content

Sometimes _____________ or even entirely duplicated pages are inevitable. A way to get around this is canonicalization, which in effect hides one version from Google. An example of this is Forward 3D client Ralph Lauren, who hosts global and local pages under the same domain. Ralph Lauren canonicalises their international URLs to one singular page, thus only indexing one.

site map

The 404 page is a perfect opportunity to provide help to someone who might be lost. Obviously, if the customer is on the error (404) page, then he/she clicked on a bad link or guessed at a URL and got it wrong. Placing the ______________ on your 404 page allows the user to say, "Oh! Here is a link to the page I was looking for!"

Manual

The ____________ Actions page in Google Webmaster Tools lists actions and links to steps you can take to address the problem. If your site is impacted by a manual spam action, you will also be notified in the Message Center in Webmaster Tools.

HTML site map

The ______________ is effective for accessibility (in W3C terms), navigation, and internal linking purposes. But there are also plenty of considerations to make when thinking about the usability experience of a website. Imagine that a customer comes to the site, and that this same customer is also colour-blind. Maybe the layout of colours and design makes it difficult to navigate the website. Maybe your design simply sucks and is hard to use! In either case, the HTML sitemap provides an alternative to typical navigation on the website.

relevance

The images used on a page should be a good addition to the text content; on the flip side, the text around the photo should add ___________________ to the image.

50-60

The limit on URL length is 2083 characters; however, a shorter URL (___________ characters) is highly recommended. A shorter URL is convenient for reading, sharing on social media, embedding on blogs and forums, copying and pasting into emails, etc. In the case of slashes or folders, fewer is generally better.

'prospecting'

The process of evaluating a domain or website for SEO potential is known as __________. Certain measurements are used to assess the SEO potential of a particular domain, the most commonly used measurement being 'Domain Authority', or 'DA'.

duplicate

The three biggest issues with _________________ content are:
- Search engines don't know which version(s) to include in or exclude from their indices.
- Search engines don't know whether to direct the link metrics (trust, authority, anchor text, link juice, etc.) to one page, or keep them separated between multiple versions.
- Search engines don't know which version(s) to rank for query results.
This therefore wastes crawl budget and also weakens any good link metrics coming in to any of your pages.

Robots tag

The value of the name attribute must specify a robot's name (e.g. googlebot) or simply "robots" to include all of them. The value of the content attribute must be one or more valid directives. Among the few existing directives, here are the ones primarily used for SEO:
- all: There are no restrictions for indexing or serving. Equivalent to index, follow. Note: this directive is the default value and has no effect if explicitly listed.
- noindex: Do not show this page in search results and do not show a "Cached" link in search results.
- nofollow: Do not follow the links on this page.
- none: Equivalent to noindex, nofollow.
Note that the absence of the _________________ or an empty value of the "content" attribute is also equivalent to the default index, follow value.
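
Implemented as a meta tag in the <head> of a page, that looks like this (a minimal sketch; the directive values are just examples):

    <!-- Applies to all crawlers -->
    <meta name="robots" content="noindex, nofollow">

    <!-- Applies to Googlebot only -->
    <meta name="googlebot" content="noindex">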

subdomains

There are a few approaches when it comes to language-specific websites (Google's support page). Some webmasters prefer to choose ___________________, i.e. fr.example.com being the French version. Some may want to stick to subfolders named after language codes, i.e. /es/, /nl/, etc.

subset

There is a well-documented Allow directive for robots.txt. This can be quite useful, for example if you want to disallow URLs based on a matched pattern, but allow a ____________ of those URLs. The example given by Google is:

    User-agent: *
    Allow: /*?$
    Disallow: /*?

Out of stock product pages

These tactics are good for...
1) Redirect to the deleted product's category page
2) A 301 redirect coupled with a noindex/follow meta tag on the search results page
3) Manually redirect to a similar product
4) Redirect based on relevancy value
5) A 404 page
6) Permanently delete the expired product's pages, content and URLs
7) Reuse URLs
8) In the case of temporarily out-of-stock items

Robots tags

They are page-level signals and can be implemented in a meta tag (robots meta tag) or in an HTTP response header (X-Robots-Tag).

Googlebot

This example of directives applies differently to all user agents and to ________________, respectively:

    User-Agent: *
    Disallow: /

    User-Agent: Googlebot
    Disallow: /cgi-bin/

URL rewriting

URL rewriting is the technique used to "translate" a URL like the last one into something the server can understand. Dynamic elements are sometimes added to the URL by the CMS, which can make it longer, messier and less memorable.

disallowing

Use care when _________________ content. Use of the following syntax will block the directory /folder-of-stuff/ and everything located within it (including subsequent folders and assets):

    Disallow: /folder-of-stuff/

robots.txt

Using _______________ to disallow URLs will not prevent them from being displayed in Google's search results.

PageRank

Using either method (meta noindex or robots.txt disallow) creates a wall that prevents the passing of link equity and anchor text. It is effectively a __________________ dead end.

navigation

What are your top exit pages? If they're locations or external contact information, that's probably something a lot of your visitors are looking for. You should include that in your top ______________________.

navigational

What pages on your site get the most traffic? If those are the pages that you want to get the most traffic, keep those in mind as you build your __________________structure to make sure they're easy for visitors to find. If they aren't particularly high conversion pages, what's a similar page that you can steer those visitors to?

rel="nofollow"

When _________________ is added to a link (and assuming bots are respecting the annotation), crawlers won't follow this particular link. Therefore, there is no weight, equity, PageRank, etc. flowing across this link. For this reason, the attribute should be used on some external links and especially on paid links.
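
In the source code, the attribute sits alongside the href (reusing the example link from elsewhere in this set):

    <a href="http://www.example.com" rel="nofollow">The example text we're using as a link.</a>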

Googlebot

When ___________________ is specified as a user agent, all preceding rules are ignored and the subsequent rules are followed. For example, this Disallow directive applies to all user agents:

    User-Agent: *
    Disallow: /

Google Image SERP

When conducting public-facing activities such as creating a website, it's always a relief to push the legal risks out of the way. Despite it being common practice among unscrupulous webmasters, sourcing images from the __________________________ page without careful scrutiny might incur legal risks in violation of internet copyright. https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/305165/c-notice-201401.pdf If found to be using unlicensed images, webmasters may receive removal requests. They may even face legal action if they fail to comply.

Image

When uploading image files, you may also want to fill in some keyword phrases naturally associated with the image. Do avoid keyword-stuffing if you do not wish to be seen as over-optimising (http://singlegrain.com/penguin-recovery-plan-eliminating-over-optimization/). By doing so correctly, not only does the website get a boost in overall SEO performance (more keyword ranking opportunities in the SERPs), but the images themselves also have a better chance of showing up in Google ______________ search results.

navigation

When visitors come to your site, you want them to figure out how to navigate quickly. Most websites have a ________ strip at the top of every page that stays the same, with links to the major sections of the site.

Screaming Frog

While many different site crawlers exist, during most audits or on-page analyses it is recommended to start exploring a site by running a crawl with _______________________. __________________ is a versatile tool that allows the user to analyse only a set of important pages while excluding less important pages such as "email to a friend" URLs or "write a review" pages. It also makes it possible to ignore certain parameters during a crawl, such as session IDs or tracking codes.

target keywords

While you are editing the images, take a few minutes to assign a new filename with ___________________________. This will further improve the site from an SEO perspective and save some effort in future SEO and file-management work.

manual penalty

Whilst Google relies on algorithms to evaluate and constantly improve search quality, they can also take ________ actions on sites that use spammy techniques, such as demoting them or removing them from Google's search results altogether.

Search Console

Why can't we just compare the 'site:' search results with Search Console index numbers? While comparing the site: search numbers with Search Console index stats (or crawl numbers) is generally accepted as best practice, this isn't actually correct (or it is often misleading). The site search and Search Console comparison will have almost no value, while the crawl number will almost always be much lower, but this won't help understand what the 'extra' part is in the SERPs. The problem is that while a site might be under-indexed in unique content pages, it might also be over-indexed in duplicate pages, meaning that the final number will be somewhere in between.

404

Why should one take care of 404 pages? Most _____ errors don't have a direct impact on SEO, but they can eat away at your link equity and user experience over time. One of the major issues with _____s is that they stop the flow of authority. The reason is that if authority passed through a ________ page, one could redirect that authority to pages not expressly 'endorsed' by that link. Also, if _________s passed authority, a site would be exposed to negative SEO, where someone could link toxic domains to malformed URLs on one's site. This means webmasters need to review their _________ (broken) pages carefully and only handle the ones that are relevant to the site's current content.

Screaming Frog

With _________________________ it becomes easy to understand how a site is organised, and it also helps to decide how to handle all of those extra indexed pages (or thin content pages, or broken pages etc.). It lets the analyst learn when canonical link elements might be set up incorrectly, which links lead to redirects and broken pages, which page titles, meta descriptions and headings are under-optimised, and so on.

sitemaps

XML sitemaps give Google information about all the pages on your site, while RSS/Atom feeds let Google know what has been most recently updated on your site. Google also adds that "submitting _____________ or feeds does not guarantee the indexing of those URLs."

video, image, and mobile

You can use a sitemap to provide Google with metadata about specific types of content on your pages, including ________, _________, and __________content.

300

______ - Redirection: The request has been received, but an additional step needs to be performed to complete it.

200

______ - Success: Request was received and processed successfully.

410

______ Gone: The requested resource is no longer available at the server and no forwarding address is known. If the server does not know, or has no facility to determine, whether or not the condition is permanent, the status code 404 (Not Found) should be used instead of ______ (Gone).

400

_______ - Client Error: Request was made by the client, but the page is not valid.

HTML

________ Sitemaps are essentially one or more pages that list just the pages and info a user needs to be concerned with (i.e. the most important landing pages).

500

_________ - Server Error: Valid request was made by the client, but the server failed to complete the request.

100s

__________ - Informational: Request has been received and the process is continuing.

Black hat tactics

__________ tactics may include:
- Submitting company information to irrelevant directories.
- Cramming articles with keywords.
- Using invisible text to insert keywords into pages.
- Using 'link networks' or 'link wheels': websites that continually spam links to one another.
- Guest posting for the purposes of inserting follow links.
- Creating blogs which use lots of duplicate content obtained from other websites.
- Using generic 'keyword specific' anchor text rather than 'branded' anchor text, for example:

Robots tags

_______________ are directives that allow webmasters to control how search engines crawl and index the content of a website.

iFrame

_______________ stands for inline frame, and it is an HTML document that is embedded within another HTML document. Most of the time, the _________________ HTML element is utilised to efficiently insert content from another source, such as advertisements. ________________ can be given their own scroll bar, which is completely independent of the parent HTML document's scroll bar.
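
A minimal example of the element (the source URL and dimensions are placeholders):

    <iframe src="https://www.example.com/advert.html" width="300" height="250" scrolling="yes"></iframe>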

404 error codes

________________ have a negative influence on SEO as they cause sites to lose link equity from broken links. They are a major point to consider when performing a technical audit of a site, and should be redirected to a correct page.

X-Robots-Tags

________________ serve the same purpose as meta robots tags; they are simply a different way to implement robots directives. They are particularly useful in the absence of an HTML document (e.g. for PDFs or images) or if a directive has to be implemented site-wide.
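
For example, a sketch of an Apache configuration that adds the header to every PDF on the site (assumes mod_headers is enabled; adapt before use):

    # Serve every PDF with a noindex, nofollow X-Robots-Tag header
    <FilesMatch "\.pdf$">
      Header set X-Robots-Tag "noindex, nofollow"
    </FilesMatch>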

iFrames

________________ are pieces of code embedded into another HTML document for display on a website.

Robots meta tags

_________________ have two attribute-value pairs, name and content.

Internal site

_________________ search also allows you to see exactly what your users are looking for on your site - data that cannot be given through search engines due to the 'not provided' problem on Google Analytics.

Internal site

_________________ search is what a user will use to find products when they are actually on your site already.

External 404

_________________: occur when someone else is linking to a broken page on your site. Even here, there is a small difference, since there can be times when the content has legitimately been removed and other times when someone is linking improperly. Webmasters should look periodically for pages that are externally linked and redirect the ones that would pass link authority. This process is often called "Lost Link Equity".

301 Moved Permanently

__________________: The requested resource has been assigned a new permanent URI, and any future references to this resource should use one of the returned URLs. The _____ redirect, as it is commonly called by SEOs, can be recognised by a change of URL in the browser's navigation bar. This solution should be utilised any time one URL needs to be redirected to another. ______ redirects have proved to consolidate a large part (but not all) of the link equity directed to a page, and are thus the preferred solution when handling legacy pages, old or broken pages, or pages that create unnecessary duplication across the site.
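
For a single moved page, a minimal .htaccess sketch (assumes Apache with mod_alias; the paths and URL are placeholders):

    # Permanently redirect one path to its new location
    Redirect 301 /old-page/ https://www.example.com/new-page/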

Duplication

__________________ issues can cause confusion for Google or other search engines, which in turn will affect you or your client's site. According to their Content Guidelines, Google defines duplicate content as generally referring to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. Mostly, this is not deceptive in origin.

503 Service Unavailable

__________________: The server is currently unable to handle the request due to temporary overloading or maintenance of the server. The __________ should be used whenever there is a temporary outage (for example, if the server has to come down for a short period for maintenance).

404 File Not Found

___________________ (Broken pages): The server has not found anything matching the Request-URI. No indication is given of whether the condition is temporary or permanent. This should occur any time the server can't find a matching page request. It often occurs on e-commerce websites due to product pages going out of stock and being removed from the website. If there were any links to these pages, for example from fashion blogs, those links would return a _____ error and the site would not effectively receive the link equity.

Robots meta tags

___________________ must be inserted in the <head> section of the HTML page.

Web scraping

___________________ or web data scraping is a technique used to extract data from web documents like HTML (simple webpage) and XML files. Data scraping can help you a lot in competitive analysis, as well as in pulling data from your client's website, such as titles, keywords, links and content categories. By scraping organic search results you can quickly find out your SEO competitors for a long list of search terms. You can determine the title tags and the keywords they are targeting.

Internal 404s

___________________: occur when the site itself links to another 'not found' page on the same site. These are normally very bad news. Google considers a high number of internal 404 pages a low-quality-content signal that implies the site isn't maintained properly. Google doesn't want their users having a poor experience, so they might steer folks away from a site they know has a high probability of ending in a dead end.

Drop down menus

____________________ must be in the HTML; otherwise, search engines probably can't read them. You can use CSS or JavaScript to hide the drop-downs once the page is loaded. Allow visitors to navigate the site without drop-downs: if you don't, they won't be accessible to tablet users, to anyone trying to get by without a mouse on Windows 8, or to anyone who has a hard time keeping drop-downs activated as they move their mouse around. Don't list too many links. Remember, everything in the universal navigation is a link off every page on your site.

HTTP Status codes

______________________ are three-digit numbers returned by servers that indicate the status of a page. Each three-digit status code begins with one of five digits, 1 through 5, and falls into one of the following categories:

Outgoing 404

_______________________: occur when a link from your site to another site breaks and returns a 404. This can cause problems at scale: if only a minor number of outgoing links are broken, Google is unlikely to penalise the site for it; if a significant number (10-20% or more) of links are broken, search engines might decide to penalise the page(s) that contain these outgoing URLs. It's still a good idea, from a user experience perspective, to find those outgoing 404s in your content and remove or fix the links.

universal navigation

___________________ for websites, specifically dealing with how to build a universal navigation, how to divide key pages/products into categories, and how to order navigation, including implementing interlinking, drop-downs and sidebars.

RSS feed

_________________ contents in themselves will not enable much better ranking, as their content is not unique; however, the increase in traffic that is generated as a result of your site being the place to go for information will help considerably. The more time a user spends at your site, the higher it will rank too. Google keeps track of all these metrics and uses them to measure your site and how much it appeals to the people who go there.

Redirects

________________ are important to SEO as they help maintain a healthy site structure, funnel link authority and guide the user to relevant information. Essentially, they keep the site clean and instruct the browser which page to load and where that page is located. For a larger site, redirects are invaluable, as search engine spiders have a crawl limit and redirects can instruct the spider about which areas of the site to ignore. Without _______________, users may be reading out-of-date information, accessing pages that may have been moved or deleted, or losing any link value those pages may have contained.

A link detox

_______________ is the process of going through your previous backlinks and cleaning up any suspicious, harmful or potentially harmful links. This process should be done regularly to avoid being hit by a link penalty; however, it can also be done after being hit, to prove to Google that you are trying to make amends for previous "mistakes".

robots.txt

___________ files are files used by webmasters to give instructions about their site to web robots, e.g. Google's crawlers.

HTACCESS

__________ stands for hypertext access and is a file that should be placed in your root folder, i.e. the same folder as your index file on the server.

301

_______ redirects are recommended when redirecting sites or pages. This is because 302 redirects do not instruct search engine crawlers that a page or site has permanently moved.

'Nofollow' links

are links that have the 'rel=nofollow' attribute applied to them in the source code of a website. This attribute tells Google not to use them as indicators of website quality. In other words, they will not count for the purposes of SEO and will not improve a website's search engine results rankings.

Hypertext

is text with hyperlinks, which for SEO purposes is known as 'anchor text'. For example, all of the navigational menu options on the left side of this webpage are hyperlinks. A hyperlink (when applied to a string of text), looks like this in the source code of a webpage: <a href="http://www.example.com">The example text we're using as a link.</a>

Link building

is the practice of obtaining hyperlinks (or 'backlinks', or simply 'links') from one website domain, to another target domain. In most cases the target domain for link building will be either your own (if you own a website or blog), or your client's. In the case of a webpage, a link is a reference that a user can click on to directly follow through to a different webpage (or a point on the same page).

<a href=" ">

points to the web page or location which you'd like to link to, while the plain text (or anchor text) appears as a hyperlink. The </a> closes the link tag. Therefore, on a web browser, this sentence will actually look something like this: The example text we're using as a link.

