The description meta tag, instead, is more relevant, even though it doesn’t specifically raise the rank. If a description meta tag is found, search engines embed its content in the result page instead of generating their own description. If the description is attractive enough, your page is more likely to be clicked. A description is ideally around 200 characters and should read well and be informative.
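As a quick illustration, here’s a minimal sketch of setting the description programmatically through the Page.MetaDescription property that ASP.NET 4 adds to the Page class (the text is made up for the example):

protected void Page_Load(object sender, EventArgs e)
{
    // Emitted as <meta name="description" content="..." /> in the page head.
    // Keep it around 200 characters, readable and informative.
    Page.MetaDescription = "A concise, attractive summary of what this " +
        "page offers, written for humans scanning search results.";
}

You can also set it declaratively through the MetaDescription attribute of the @Page directive.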
Search engines don’t like many things that often populate Web pages. They don’t like duplicated URLs, for example. If two or more URLs return the same content, search engines tend to lower the page ranking. This happens even with subdomains, such as www.yourserver.com and yourserver.com. Without a permanent redirect configured at the Internet Information Services (IIS) level, your home page will suffer.
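If you can’t touch the IIS configuration, you can enforce a canonical host name from within the application. Here’s a minimal sketch in Global.asax, assuming www.yourserver.com is the canonical host; it uses the permanent-redirect method covered later in this section:

protected void Application_BeginRequest(object sender, EventArgs e)
{
    Uri url = Request.Url;

    // Collapse yourserver.com onto www.yourserver.com with an HTTP 301,
    // so engines see a single URL for the same content.
    if (url.Host.Equals("yourserver.com", StringComparison.OrdinalIgnoreCase))
        Response.RedirectPermanent("http://www.yourserver.com" + url.PathAndQuery);
}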
Search engines don’t like query strings, hidden fields, Flash/Silverlight components, or rich JavaScript content. All these things make the page harder to analyze. Search engines, instead, love plain anchor tags, title attributes, and alt attributes—plain HTML.
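For example, a crawler-friendly link and image are just plain markup (the URLs and text here are hypothetical):

<a href="/products/laptops" title="Browse the laptop catalog">Laptops</a>
<img src="/images/laptop.png" alt="A 15-inch laptop" />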
If not properly handled, redirects are also problematic because they can lead to duplicated URLs. Classic redirects performed through Response.Redirect result in an HTTP 302 status code. As developers, we tend to forget that HTTP 302 indicates a temporary redirect, which tells engines that the page being moved will eventually return to its original location. If this never happens, engines keep storing two locations for the same content. A permanent redirect is HTTP 301, which in ASP.NET 4 is enforced by a new method—Response.RedirectPermanent.
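The difference is a one-line change. In this sketch, the two calls are alternatives, not a sequence, and the target URL is made up:

// Temporary: returns HTTP 302; engines keep the old URL on file.
Response.Redirect("/catalog/laptops.aspx");

// Permanent: returns HTTP 301; engines replace the old URL with the new one.
Response.RedirectPermanent("/catalog/laptops.aspx");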
Query strings should be avoided too. Ideally, URLs should be extensionless and represent a meaningful path within the content of the page. URL rewriting is an ASP.NET technique that can help in this regard. In ASP.NET 4, however, routing is a type of URL rewriting that offers a richer programming model and the same (if not higher) degree of effectiveness. (See Chapter 4.)
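Here’s a minimal sketch of registering such a route in Global.asax using the MapPageRoute method from System.Web.Routing; the route name, URL pattern, and page are hypothetical:

void Application_Start(object sender, EventArgs e)
{
    // Maps an extensionless, meaningful URL to a physical Web Forms page.
    RouteTable.Routes.MapPageRoute(
        "ProductsByCategory",      // route name
        "products/{category}",     // URL pattern: no extension, no query string
        "~/ProductsView.aspx");    // page that actually serves the request
}

A request for /products/laptops then reaches ProductsView.aspx, which can read the category through the page’s RouteData property.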
SEO and ASP.NET
Although ASP.NET 4 puts some effort into making it easier for you to improve SEO, a few structural aspects of ASP.NET are not specifically optimized for search engines. I don’t necessarily mean this as a criticism of ASP.NET Web Forms as a platform. After all, ASP.NET Web Forms was designed a decade ago, when we all lived in a totally different world and chased different priorities than we do today. In this regard, ASP.NET MVC is a better (because newer) platform that is natively optimized for search engines.
So my point here is: don’t be fooled if you read that ASP.NET 4 improves SEO. It simply gives you some new tools for implementing features (permanent redirection, meta description, routing) that were harder, though not impossible, to achieve before.
Let’s briefly review some structural SEO-related issues of ASP.NET.
The postback mechanism carried out via JavaScript code is like smoke in the eyes of search engines. Every time you use link buttons or the built-in paging and sorting capabilities of data-bound controls, you put your page at risk of not being ranked properly. Search engines don’t follow JavaScript, and they ignore cookies. Because the session ID is stored in a cookie and engines ignore cookies, some of the page content might remain undiscovered. What about a cookieless approach for sessions, then? (We’ll cover this feature in Chapter 17.) It would be even worse, because the session ID would then be embedded in the URL, producing a different URL for the same content on each visit and re-creating the duplicated-URL problem discussed earlier.
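To see the postback problem concretely, compare the markup the two controls emit; this is a sketch with made-up IDs and URLs:

<!-- Renders as <a href="javascript:__doPostBack(...)">: crawlers won't follow it. -->
<asp:LinkButton runat="server" ID="NextPage" Text="Next" OnClick="NextPage_Click" />

<!-- Renders as a plain <a href="..."> tag that crawlers happily follow;
     ToolTip becomes the title attribute. -->
<asp:HyperLink runat="server" ID="NextPageLink" Text="Next"
     NavigateUrl="~/products/page/2" ToolTip="Next page of products" />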