100 Updated Search Engine Optimization Interview Questions
Q – 1 What keyword density does Google recommend before it counts as keyword stuffing?
Ans- Google does not publish an official figure. A keyword density of around 5% is the number most often cited, and anything far beyond that risks being treated as keyword stuffing and penalized.
Q – 2 Search engines do not index some common words (such as "or", "and", "when", and "in") within a webpage. What are these common words called?
Ans- They are called stop words.
Q – 3 If you enter help site:www.globalguideline.com in the Google search box, what will Google search for?
Ans- Google will search for pages about help within the site www.globalguideline.com, because the site: operator restricts results to that site.
Q – 4 What is the Google Sandbox?
Ans- There are two common explanations:
– The Sandbox is a filter applied to new sites. A new site is put in the sandbox and kept there for some time until the search engine starts treating it as a normal site.
– The Sandbox is a filter applied to new inbound links to new sites. The fundamental difference from the first explanation is that the filter is based not on the age of the site but on the age of its inbound links. In other words, Google treats the site normally but refuses to acknowledge any inbound links to it until they have existed for several months.
Since inbound links are one of the main ranking factors, ignoring them is equivalent to the site being absent from search results. It is difficult to say which of these explanations is true; quite possibly both are.
Q – 5 Google looks down upon paid links for enhancing page rank. If a website sells links, what actions does Google recommend to avoid being penalized?
Ans-
► a. The text of the paid links should state the words “paid text link” for Google to identify it as a paid link
► b. Only Paid text links to non-profit websites should be accepted
► c. Paid links should be disclosed through the “rel=nofollow” attribute in the hyperlink
► d. Paid links should be disclosed through the “index=nofollow” attribute in the hyperlink
c. Paid links should be disclosed through the “rel=nofollow” attribute in the hyperlink
Q – 6 How are site maps important for the Search engine optimization process?
Ans-
► a. Site maps help the search engine editorial staff to go through a website, hence ensuring quicker placement
► b. Google gives credit to the websites having site maps. The GoogleBot looks for the keyword or title “Site Map” on the home page of a website.
► c. Site maps help the search engine spider pick up more pages from the website
► d. None of the above
c. Site maps help the search engine spider pick up more pages from the website
Q – 7 Which of the following website design guidelines have been recommended by Google?
Ans-
► a. Having a clear hierarchy and text links
► b. Every page should be reachable from at least one static text link
► c. If the site map is larger than 100 or so links, you should break the site map into separate pages
► d. Keeping the links on a given page to a reasonable number (fewer than 100)
► e. Use less than 30 images or graphics per page
a, b, c and d. All four are recommended in Google's webmaster guidelines.
Q – 8 All major search engines are case sensitive.
Ans- ► a. True
► b. False
b. False
Q – 9 What is the most likely time period required for getting a Google page rank?
Ans-
► A. 1 week
► B. 3 weeks
► C. 2 months
► D More than 3 months
C. 2 months
Q – 10 Which of the following options describes the correct meaning of Mouse Trapping?
Ans-
► a. The technique of monitoring the movement of the mouse on the webpage
► b. The technique of monitoring the area on which an advertisement was clicked
► c. The web browser trick, which attempts to redirect visitors away from major websites through a spy ware program
► d. The web browser trick, which attempts to keep a visitor captive on a website
d. The web browser trick, which attempts to keep a visitor captive on a website
Q – 11 A Hallway Page is used to:
Ans-
► A. Attract visitors from the search engines straight onto the Hallway Page
► B. Organizes the Doorway Pages
► C. Helps people navigate to different Doorway Pages
► D. Enables search engine bots to index the Doorway Pages
D. Enables search engine bots to index the Doorway Pages
Q – 12 Which of the following methods can help you get around the Google Sandbox?
Ans-
► a. Buying an old Website and getting it ranked
► b. Buying a Google AdWords PPC campaign
► c. Placing the website on a subdomain of a ranked website and then 301-redirecting the site after it has been indexed
► d. Getting a DMOZ listing
c. Placing the website on a subdomain of a ranked website and then 301-redirecting the site after it has been indexed
Q – 13 Google gives priority to themed in-bound links.
Ans-
► a. True
► b. False
a. True
Q – 14 Which of the following facts about Alexa are correct?
Ans-
► a. Alexa provides free data on relative website visitor traffic
► b. Alexa and Quantcast provide information on visitor household incomes
► c. Alexa is biased towards US based traffic
► d. Quantcast only tracks people who have installed the Quantcast toolbar
a. Alexa provides free data on relative website visitor traffic
Q – 15 Cloaking is a controversial SEO technique. What does it involve?
Ans-
► a. Increasing the keyword density on the web pages
► b. Offering a different set of web pages to the search engines
► c. Hiding the keywords within the webpage
► d. Creating multiple pages and hiding them from the website visitors
b. Offering a different set of web pages to the search engines
Q – 16 If you search for the term “iq test” in the Word Tracker keyword suggestion tool, will it return the number of independent searches for the term “iq”?
Ans-
► a. Yes
► b. No
b. No
Q – 17 Which of the following are examples of agents?
Ans-
► A. Internet Explorer
► B. Search engine spiders
► C. Opera
► D. SQL Server database attached to a website
A, B and C. Browsers such as Internet Explorer and Opera are user agents, as are search engine spiders.
Q – 18 What is Keyword Density?
Ans-
► a. The number of times the keyword is used / (DIVIDED BY) the total word count on page – (MINUS) the total words in HTML on the page
► b. Combination of the number of times a keyword or a keyword phrase, in proportion with other words, appears on a Web page
► c. The number of times the keyword is used in the page description
► d. The number of times the keyword is used in the page title
► e. The number of times the keyword is used / (DIVIDED BY) the total word count on the page
b. Combination of the number of times a keyword or a keyword phrase, in proportion with other words, appears on a Web page
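The simple formula in option e (occurrences divided by total word count) can be sketched in Python. This is a minimal illustration: the whitespace-based word split and the sample text are simplifying assumptions, not a real SEO tool.

```python
def keyword_density(text, keyword):
    """Percentage of words on the page that match the keyword.
    Simplified: words are split on whitespace and punctuation is ignored."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = words.count(keyword.lower())
    return 100.0 * hits / len(words)

# Hypothetical page copy, for illustration only
print(keyword_density("seo tips for seo beginners", "seo"))  # 40.0
```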
Q – 19 The following robots Meta tag directs the search engine bots:
<meta name="robots" content="noindex, nofollow">
Ans-
► a. Not to index the homepage and not to follow the links in the page
► b. Not to index the page and not to follow the links in the page
► c. To index the page and not to follow the links in the page
► d. Not to index the page but to follow the links in the page
b. Not to index the page and not to follow the links in the page
Q – 20 Which of the following can be termed as appropriate Keyword Density?
Ans-
► A. 0.01-0.1%
► B. 3-4%
► C. 7-10%
► D. none of the above
D. none of the above
Q – 21 Which of the following search engines or directories provides the main search results for AOL?
Ans-
► a. Lycos
► b. DMOZ
► c. Google
► d. Yahoo
► e. Windows Live
c. Google
Q – 22 Which of the following search engines offers a popular list of the top 50 most searched keywords?
Ans-
► a. Google
► b. Yahoo
► c. AOL
► d. Lycos
d. Lycos
Q – 23 What is the illegal act of copying a page by unauthorized parties in order to siphon off traffic to another site called?
Ans-
► a. Traffic jacking
► b. Visitor jacking
► c. View jacking
► d. Page jacking
d. Page jacking
Q – 24 Which of the following options is correct regarding the Keyword Effectiveness Index (KEI) of a particular keyword?
Ans-
► a. It is directly proportional to the popularity of the keyword
► b. It is inversely proportional to the competition for the keyword
► c. It is directly proportional to the chances of the keyword ranking on the first page of the Google search results
a and b. KEI rises with the keyword's popularity and falls with the competition for it (KEI = popularity² ÷ competition).
Q – 25 Which of the following statements about RSS are correct?
Ans-
► a. It is a form of XML
► b. It stands for Real-time streamlined syndication
► c. It is a good way of displaying static information
► d. It is a Microsoft technology
a. It is a form of XML
Q – 26 What will the following robots.txt file do?
User-agent: Googlebot
Disallow: /*?

User-agent: Scooter
Disallow:
Ans-
► a. It will allow Google to crawl any of the dynamically generated pages. It will also allow the AltaVista scooter bot to access every page
► b. It will disallow Google from crawling any of the dynamically generated pages. It will also disallow the AltaVista scooter bot from accessing any page
► c. It will disallow Google from crawling any of the dynamically generated pages. It will allow the AltaVista scooter bot to access every page
► d. None of the above
c. It will disallow Google from crawling any of the dynamically generated pages. It will allow the AltaVista scooter bot to access every page
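The Googlebot rule /*? blocks any URL containing a query string, while Scooter's empty Disallow blocks nothing. A rough Python sketch of Google-style wildcard matching shows why; this is a simplification of a real robots.txt parser, and the sample paths are hypothetical.

```python
import re

def rule_to_regex(rule):
    # Google-style robots.txt patterns: '*' matches any run of characters,
    # '$' anchors the end of the URL; everything else is literal.
    pattern = re.escape(rule).replace(r"\*", ".*").replace(r"\$", "$")
    return re.compile("^" + pattern)

def is_disallowed(path, disallow_rules):
    return any(rule_to_regex(r).match(path) for r in disallow_rules)

googlebot = ["/*?"]  # from the example robots.txt
scooter = []         # an empty Disallow line permits everything

print(is_disallowed("/page.php?id=3", googlebot))  # True: dynamic URL blocked
print(is_disallowed("/about.html", googlebot))     # False: static page allowed
print(is_disallowed("/page.php?id=3", scooter))    # False: Scooter crawls it
```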
Q – 27 What is the name of the search engine technology due to which a query for the word actor will also show search results for related words such as actress, acting or act?
Ans-
► a. Spreading
► b. Dilating
► c. RSD (real-time synonym detection)
► d. Stemming
► e. Branching
d. Stemming
Q – 28 Which of the following statements about FFA pages are true?
Ans-
► a. They are greatly beneficial to SEO
► b. They are also called link farms
► c. They are paid listings
► d. They contain numerous inbound links
b. They are also called link farms
Q – 29 What does the 302-server response code signify?
Ans-
► a. It signifies conflict; too many people wanted the same file at the same time
► b. The page has been permanently removed
► c. The method you are using to access the file is not allowed
► d. The page has temporarily moved
► e. What you requested is just too big to process
d. The page has temporarily moved
Q – 30 Which of the following statements is correct with regard to natural links?
Ans-
► a. They are two-way links (reciprocal links)
► b. They are from authority websites
► c. They are voluntary in nature
► d. They are from .edu or .gov extension websites
c. They are voluntary in nature
Q – 31 What is the term for Optimization strategies that are in an unknown area of reputability/validity?
Ans-
► a. Red hat techniques
► b. Silver hat techniques
► c. Grey hat techniques
► d. Shady hat techniques
c. Grey hat techniques
Q – 32 What does the term Keyword Prominence refer to?
Ans-
► a. It refers to the fact that the importance of choosing high traffic keywords leads to the best return on investment
► b. It refers to the importance attached to getting the right keyword density
► c. It refers to the fact that the keywords placed in important parts of a webpage are given priority by the search engines
► d. It refers to the fact that the keywords in bold font are given priority by the search engines
c. It refers to the fact that the keywords placed in important parts of a webpage are given priority by the search engines.
Q – 33 Are RSS/Atom feeds returned in Google's search results?
Ans-
► a. Yes
► b. No
b. No
Q – 34 What term is commonly used to describe the shuffling of positions in search engine results in between major updates?
Ans-
► a. Waves
► b. Flux
► c. Shuffling
► d. Swaying
b. Flux
Q – 35 What is Anchor Text?
Ans-
► a. It is the main body of text on a particular web page
► b. It is the text within the left or top panel of a web page
► c. It is the visible text that is hyperlinked to another page
► d. It is the most prominent text on the page that the search engines use to assign a title to the page
c. It is the visible text that is hyperlinked to another page
Q – 36 If you enter help site:www.globalguideline.com/seo in the Google search box, what will Google search for?
Ans-
► a. It will open up the Google help pages applicable to www.globalguideline.com/seo
► b. It will find pages about help within www.globalguideline.com/seo
► c. It will only find page titles about help within www.globalguideline.com/seo
► d. It will direct you to the request page for re-indexing of www.globalguideline.com/seo
b. It will find pages about help within www.globalguideline.com/seo
Q – 37 What does the 301-server response code signify?
Ans-
► A. Not Modified
► B. Moved Permanently
► C. Syntax error in the request
► D. Payment is required
► E. The request must be authorized before it can take place
B. Moved Permanently
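The practical difference between this 301 and the 302 of Q-29 can be sketched with Python's standard http module; the helper function is a minimal illustration, not a real crawler's logic.

```python
from http import HTTPStatus

def is_permanent_redirect(code):
    # 301 (Moved Permanently) tells crawlers to index the new URL in place
    # of the old one; 302 (Found) marks the move as temporary, so the old
    # URL stays indexed.
    return code == HTTPStatus.MOVED_PERMANENTLY

print(HTTPStatus.MOVED_PERMANENTLY.value)  # 301
print(HTTPStatus.FOUND.value)              # 302
print(is_permanent_redirect(301))          # True
print(is_permanent_redirect(302))          # False
```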
Q – 38 Which of the following factors have an impact on the Google Page Rank?
Ans-
► a. The total number of inbound links to a page of a web site
► b. The subject matter of the site providing the inbound link to a page of a web site
► c. The text used to describe the inbound link to a page of a web site
► d. The number of outbound links on the page that contains the inbound link to a page of a web site
d. The number of outbound links on the page that contains the inbound link to a page of a web site
Q – 39 10 people do a web search. In response, they see links to a variety of web pages. Three of the 10 people choose one particular link. That link then has a __________ click through rate.
Ans-
► A. less than 30%
► B. 30 percent
► C. more than 30%
B. 30 percent
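Click-through rate is simply clicks divided by impressions, expressed as a percentage; a minimal Python sketch of the arithmetic:

```python
def click_through_rate(clicks, impressions):
    """CTR as a percentage: clicks divided by impressions."""
    return 100.0 * clicks / impressions

# 3 of the 10 searchers clicked the link
print(click_through_rate(3, 10))  # 30.0
```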
Q – 40 If a website's search engine saturation with respect to a particular search engine is 20%, what does it mean?
Ans-
► A. 20% of the web pages of the website have been indexed by the search engine
► B. Only 20% of the pages of the website will be indexed by the search engine
► C. 20% of the website's pages will never be indexed
► D. The website ranks in the first 20% of all websites indexed by the search engine for its most important search terms
A. 20% of the web pages of the website have been indexed by the search engine
Q – 41 I have some long prose pages, and I wonder at what point they should be broken into separate pages. I hate the scroll, but I also hate little chunks per page. These will not be shortened; they are the length they are. So, what is the optimal number of words per separate HTML document?
Ans- I have not seen any studies specific to this, although I have some observations. I assume you mean for readability and usability, and not for SEO. For a multipage article, we found at GlobalGuideLine.com that page views dropped off dramatically after 4 or 5 pages.
When we made an article longer, fewer people read pages 6 or higher. Also, page length is related to page size, and without feedback you need to make sure your pages load in at most 8 to 10 seconds. That is about 30 to 34K total. With a 10K banner and a logo say, that is a maximum of 20K.
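The 8-to-10-second figure maps onto the 30-34K budget if you assume an effective dial-up throughput of roughly 4 KB/s (a 56kbps modem with overhead); that throughput figure and the banner/logo sizes below are assumptions for illustration.

```python
def max_page_kb(target_seconds, throughput_kb_per_s=4.0):
    # Largest page (in KB) that loads within the target time at the
    # assumed ~4 KB/s effective dial-up throughput.
    return target_seconds * throughput_kb_per_s

print(max_page_kb(8))                     # 32.0 -> roughly the 30-34K budget
html_budget = max_page_kb(8.5) - 10 - 4   # minus a 10K banner and a 4K logo
print(html_budget)                        # 20.0 -> the "maximum of 20K" above
```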
Q – 42 Do you feel that information architecture (in this case I mean the categorization of web pages for find ability) can have an effect on site optimization? I suppose I’m asking if things like intuitive URLs and labels can reduce the need for extra context on a page. How would you separate site optimization and usability/IA?
Ans- Yes, there is a tradeoff for some techniques between IA and web site optimization (WSO), and with Search Engine Optimization (SEO). Good IA has a logical hierarchy and clear, unambiguous labels. Some WSO techniques use short abbreviated names and URLs to achieve savings, which can preclude descriptive terms.
SEO also can conflict with IA and WSO, with some sites sacrificing logical hierarchy to create keyword-filled directory and file names. Balancing these three disciplines is an art in itself. For high traffic pages like home pages, I favor speed over IA and SEO.
In the book, I discuss mapping techniques that you can use to have the best of both worlds. The book gives you the tools you can use to optimize your content. How far you go is up to you.
Q – 43 I could create a stunningly beautiful entirely graphical page, or a simple page with no graphics, or something in between. How do I determine the safe point; the point where a page is acceptably attractive, authoritative and/or creates the right impression, and yet loads fast enough to serve my customers?
Ans- It depends on the type of site. For informational sites like GlobalGuideLine.com or news-related sites, the graphics should be kept to a minimum. With the advent of widespread support of CSS, you can now create many pleasing effects without graphics.
I cite a study in the book on this ratio. For shorter delays, users prefer documents that include graphics; for longer delays, users prefer text-only documents.
Q – 44 There is software that assists and, in some cases, automates the accessibility process. Is there any software that does the same for web site optimization?
Ans- Yes, there are a number of products, mainly for optimizing HTML and JavaScript. SpaceAgent from Insider Software, VSE Web Site Turbo from VSE Online, and of course automated graphical tools. I test and demonstrate many of these products in the book. To convert to CSS-based layouts, and to do it right, you’ve got to do it manually.
Q – 45 What major web sites do the best job of optimization?
Ans- Yahoo.com has the most highly optimized home page I have seen. They use URL abbreviation to save nearly 30% off their home page HTML. View source to see what I mean. But even Yahoo has bloated up; they have nearly 300 links on their front page. WebReference.com, of course 🙂 I like most anything from Zeldman and company: very clean and CSS-based. Though there's always room for improvement.
Q – 46 What is the most common problem to be solved when optimizing web sites?
Ans- Too many HTTP requests. This is due to the overuse of images and external files. We’re also seeing a trend of too many external JavaScript and CSS files in the HEAD. This delays the display of your content as they must load first.
Q – 47 What are the downsides of stripping out every non-printable space, tab, and line break from an HTML document, so the entire code essentially resides on a single line? I thought Netscape 4 had trouble with very long HTML lines?
Ans- I don’t advocate making your entire HTML page into one single line. Some editors can choke on long lines, older versions of the Oracle info server can choke on long lines, and if you email your pages (as we do at webref), some email programs can flag a virus in longer lines.
So I advocate a max of 255 character lines to be safe, or a max of 2000 character lines to avoid problems with Oracle. Also, removing whitespace can break some JavaScript code, and make your code hard to read.
You can avoid these problems by keeping unoptimized versions for any edits, and punctuating your JavaScript statements with semicolons.
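One quick way to enforce the 255-character ceiling after whitespace stripping is to re-wrap the markup at spaces. A sketch using Python's standard textwrap module follows; it assumes breaking at spaces does not alter rendering, which holds for ordinary HTML text, and the minified sample is synthetic.

```python
import textwrap

# Hypothetical minified markup collapsed onto one long line
minified = "<p>" + "keyword " * 100 + "</p>"
safe = textwrap.fill(minified, width=255)

# After re-wrapping, no line exceeds the 255-character safety limit
print(max(len(line) for line in safe.splitlines()))
```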
Q – 48 Does precompilation of loop limits in Web programming languages like ColdFusion, Java, and JavaScript help?
Ans- Yes, this is also called coding motion out of loops, and is one of Bentley’s 27 rules for code tuning. Also, using local variables is much faster. Many of these refactorings are covered in Chapter 10, “Optimizing JavaScript for Execution Speed.”
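The question names ColdFusion, Java, and JavaScript, but the refactoring is language-independent; here is the same "code motion out of loops" idea sketched in Python (the scale_slow/scale_fast names are mine, for illustration only).

```python
def scale_slow(values, offset):
    out = []
    for v in values:
        # the loop-invariant expression (offset * 2 + 1) is recomputed
        # on every iteration
        out.append(v + (offset * 2 + 1))
    return out

def scale_fast(values, offset):
    k = offset * 2 + 1  # hoisted out of the loop and bound to a local variable
    return [v + k for v in values]

print(scale_fast([1, 2, 3], 1))                                # [4, 5, 6]
print(scale_slow([1, 2, 3], 1) == scale_fast([1, 2, 3], 1))    # True
```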
Q – 49 How can the weblog be optimized to load better but also show first-timers that work is being done?
Ans- Many of the weblogs whose source I view have lots of embedded formatting, like font tags and complex CSS classes. Many weblogs are also, by their very nature, verbose 🙂 I'd advocate using higher-level type selectors in CSS and contextual selectors, and being brief in decks, pointing to longer stories for those who want to read more. Writing succinct headlines is also important.
In general, cut your prose as much as possible, especially on high-traffic pages. Users don’t read as fast on the screen. On the web, users are information foraging, trying to maximize the value of their time. They flit about like hummingbirds, looking for nuggets that interest them. One study I read showed that on average, users spend about 1 second per page, and rarely stay more than 10 seconds. Once they find an article they want, they’ll stay longer.
Q – 50 Can a page load too quickly?
Ans- As far as I’m concerned, no. But according to the response time research that I read, and mention briefly in the book, you can have response times that are too fast, and this increases errors. But on the web, that is unlikely to happen anytime soon.
Q – 51 Have you seen UIE's research on users' perceptions of download speed? Doesn't that really debunk the notion that code/graphic optimization improves usability? Isn't it really all about scent of information and users "feeling" like they are consistently making progress?
Ans- Yes, I have read that. That claim is poorly supported elsewhere; I spend the first chapter showing why response times are important, summarizing key research into HCI and response times. However, there are factors that can affect how we perceive delays, like feedback and task complexity.
Attainability is another interesting area of research, with users adjusting “subjective time bases” based on the pace of particular systems. If Domino’s usually delivers in under 30 minutes, and then one day took an hour, you’d certainly notice it.
Q – 52 What is the ROI [return on investment] for SEO activities? Can you give any examples? Another related question might be “How do you build a business case for doing SEO?”
Ans- This is akin to asking what is the ROI for usability. Speed is a key component of usability. Small improvements in speed can take critical pages below typical attention thresholds, and dramatically lower bail-out rates and abandoned shopping carts. I talk about this in the book, but compression alone can save 30-50% in size and bandwidth costs.
Webmasters who have employed compression and optimization typically save 30 to 50% off their bandwidth costs, retain more customers, and see improved conversion rates.
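The 30-50% saving from compression is easy to demonstrate with Python's standard gzip module. Note the sample HTML below is synthetic and highly repetitive, so its ratio is better than a typical page's; it is an illustration of the mechanism, not a benchmark.

```python
import gzip

html = (b"<html><body>"
        + b"<p>A paragraph of page copy.</p>" * 100
        + b"</body></html>")
compressed = gzip.compress(html)
saving = 100.0 * (1 - len(compressed) / len(html))

# Repetitive markup compresses dramatically
print(f"{len(html)} -> {len(compressed)} bytes ({saving:.0f}% smaller)")
```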
Q – 53 Have new web technologies like XHTML, CSS, Flash, and XML changed the SEO game or the ways search engines work?
Ans- Yes, CSS has made it possible to transform table-based layouts into CSS-based layouts. Typically this reduces page size by 25 to 50%. The ratio of content to markup improves dramatically.
Q – 54 If you only had an hour (in one shot) to spend on WSO [web site optimization] for a given site each year, what would you do for that hour, and why?
Ans- I'd pick the low-hanging fruit: eliminate excess (graphics, multimedia), cut your prose in half, and optimize the rest. (You could also install mod_gzip, etc. in less than an hour.) The main thing is to make sure that your home page loads quickly.