23 Comprehensive SEO Tips To Increase Website Traffic

High visitor numbers are important for any form of making money on the Internet. But how do you get more visitors to your site? This article gives 23 detailed SEO tips that help you achieve better positions in the search engines or open up new visitor channels.

SEO Secrets and How Anyone Can Do It the Easy Way


Avoid duplicate content

Duplicate content often leads to a devaluation in the search engines. The most common cause of duplicate content is that a site is accessible both with www. and without www. and delivers the same content under both hostnames (e.g. http://www.mrheather.com and mrheather.com) without one of the two variants being redirected to the other via a 301 redirect header.

Duplicate content often leads to worse positions in the SERPs, because two pages have to share the backlinks that would normally all go to a single page, and because unique content is rated higher by Google. Try to avoid duplicate content whenever possible. You can achieve this via .htaccess (the best solution) or via the canonical tag.

The following code in the .htaccess configuration file ensures a permanent redirect from http://www.domain.tld to domain.tld:

  RewriteEngine On
  RewriteCond %{HTTP_HOST} !^domain\.tld$ [NC]
  RewriteRule ^(.*)$ http://domain.tld/$1 [L,R=301]

Another way to avoid duplicate content is the canonical tag. The canonical tag is supported by both Google and Bing. It is inserted in the <head> section of the HTML code as follows. The following canonical tag would be used, for example, for the URL www.domain.com/product.html:
<link rel="canonical"
href="http://www.domain.com/product.html" />
</head>

Create options for user-generated content

“Content is king” is one of the ultimate SEO truths. Therefore, wherever possible, encourage your visitors to create content on your site (user-generated content). The following site areas are particularly suitable for user-generated content:

  • Forums
  • Question / answer fields
  • User Reviews
  • Profile Pages
  • Bug reports
  • Comment function (e.g. blog)

But remember that you should check third-party content editorially and remove any spam posts. To do this, use e.g. spam shields (here using the example of TYPO3).

Structure your texts hierarchically with headings

Structuring web texts with headings is not only important for good legibility, it also has a positive impact on search engine optimization. Search engines can classify the individual text segments semantically and pull keywords from the headlines. Another positive aspect is that a good outline automatically improves the logical structure of a text.
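
To illustrate how easily search engines can read an outline out of well-structured markup, here is a minimal Python sketch (standard library only, the sample markup is made up for the example) that collects the h1-h6 headings of a page:

  from html.parser import HTMLParser

  class HeadingOutline(HTMLParser):
      """Collect h1-h6 headings, roughly the outline a crawler can extract."""
      def __init__(self):
          super().__init__()
          self.current = None
          self.outline = []
      def handle_starttag(self, tag, attrs):
          if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
              self.current = tag
      def handle_endtag(self, tag):
          if tag == self.current:
              self.current = None
      def handle_data(self, data):
          if self.current and data.strip():
              self.outline.append((self.current, data.strip()))

  parser = HeadingOutline()
  parser.feed("<h1>SEO Tips</h1><p>Some text ...</p><h2>Avoid duplicate content</h2>")
  print(parser.outline)  # [('h1', 'SEO Tips'), ('h2', 'Avoid duplicate content')]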

Provide new and current content regularly

Frequently updated pages are indexed more often by Google, get better positions in the SERPs and, given adequate quality, earn more trust than comparable sites. Often these sites are also rewarded with a higher PageRank. Therefore, if possible, integrate a news section, a blog, an article directory or a forum into your website in order to provide fresh content regularly.


Use long tail keywords for competitive search terms

With long tail search terms such as “SEO Berlin” you can usually achieve more. This is partly because long tail search terms are searched for frequently (specific queries deliver better results for the user), and partly because the number of competitors is lower, so you can more easily win good positions with long tail keywords.

Optimize your keyword density

With keyword density, it is important that it is neither too high nor too low. Too high a keyword density (above about 5%) can easily be seen as keyword spamming and be penalized by the search engines with bad SERP positions. Too low a keyword density (less than 0.7%), however, does not get the best out of a text and may prevent a good ranking, even if all other factors are optimal.

When choosing the keyword density, also keep in mind that a text becomes more boring the more often the same keywords occur in quick succession. A keyword density of 1%, for example, means that every 100th word is the keyword. The optimal keyword density, considering both readability and optimization, lies between 1.5% and 3%. With our Keyword Density Checker you can have your keyword density checked automatically.
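
If you want to check the arithmetic yourself instead of using a tool, keyword density is simply the share of all words that are the keyword. A minimal Python sketch (a hypothetical helper that handles single-word keywords only, not the checker mentioned above):

  import re

  def keyword_density(text, keyword):
      """Share of words in `text` that equal the keyword, in percent."""
      words = re.findall(r"\w+", text.lower())
      if not words:
          return 0.0
      hits = sum(1 for word in words if word == keyword.lower())
      return 100.0 * hits / len(words)

  sample = "SEO tips: good SEO needs readable texts, not keyword stuffing."
  print(keyword_density(sample, "SEO"))  # 2 of 10 words -> 20.0 (far too high)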

Track your keyword positions in the SERPs

The best optimization measures are useless if you cannot monitor their success or failure, because otherwise you never really know what a measure has achieved or whether you might have achieved more by other means. Monitoring your keyword positions in the SERPs (Search Engine Result Pages) helps you to keep track of your SEO measures and to identify the strategies that bring you the most benefit. To monitor your keyword positions in Google, the free program Free Monitor for Google from CleverStat is suitable, for example.

Find optimal keywords through keyword research

Make sure to use optimal keywords for your website. Comprehensive keyword research is usually unavoidable here, mainly because search volumes have to be estimated. Synonyms of a word can sometimes achieve higher search volumes or have less competition. With the Google AdWords Keyword Tool you can estimate search volumes and find the optimal keywords for you.

Use keywords in URL, Title and Meta Description

Keywords in the URL, in the title or in the meta description are displayed in bold in the Google search results when they match the search query. Your pages thereby gain greater attention among searchers. In addition, the likelihood of a click and thus the click-through rate (CTR) increases.

Whether you use speaking URLs or pass your keywords as a variable makes no difference; dynamic URLs can now be indexed very well by search engines. At the same time, make sure that titles and meta descriptions are interesting in order to keep the click-through rate (CTR) as high as possible.

Backlinks, links and linking

Redirect backlinks that point to nonexistent pages

Over the lifetime of a site, content or pages are removed every now and then, the page structure is revised or URL paths change. This often means that previously existing backlinks now point to non-existent pages (status code 404 – Not Found). These backlinks are thus virtually lost.

So when you change URL paths, make sure you always redirect the old paths to the new paths with a 301 (moved permanently) redirect (e.g. via .htaccess). If pages have been removed completely, it is advisable to redirect to similar pages or, if necessary, to the home page.

However, do this only when there actually are backlinks pointing to these pages (check e.g. in Google Webmaster Tools). Regularly scour the not-found (404) errors in Google Webmaster Tools to detect pages that are no longer accessible. With the following entry in the .htaccess you can redirect backlinks from old (no longer existing) pages to new pages:

  Redirect 301 /alte-seite.html /neue-seite.html

Practice continuous organic link building

The number and quality of your backlinks are crucial for your positions in the search engines. It is therefore important that you do continuous link building in order to constantly expand your backlinks.

Organic backlink building refers to a process in which the backlinks grow in a natural way. For example, it is better to build 100 backlinks of varying strength and quality every month than to build 1200 backlinks once a year that, for example, all have a PageRank of 3.

Building backlinks is not as difficult as it looks at first. As an introduction, read our article 10 Sources of Free Backlinks to build a solid backlink foundation in advance.

Link to authorities

In your articles, link to topically relevant articles or URLs from authority websites. Authorities on the Internet can be recognized by one or more of the following indications:

  • they have a large number of backlinks
  • there are a large number of blog articles about them
  • they have a high Pagerank (PR5 and higher)
  • they are of an advanced age (5 years and older)
  • they have high activity in social media portals
  • they achieve very good positions for highly competitive keywords
  • their new content is ranked well within a very short time

Authorities enjoy greater credibility than other pages. Seitenreport, for example, is one such authority. By linking to authority websites you show the search engines that your article should be classified as high quality. Through the link you effectively receive a part of the credibility of the authority website.

Optimize the distribution of your Link Juice

The link juice is distributed in equal shares to all internal and external follow and nofollow links present on a page. Make sure that you steer the link juice optimally towards the pages that are supposed to rank highly. For example, if most of your internal links point to an insignificant page, that brings you relatively little from an SEO perspective. With the Link Juice Checker you can analyze your link juice and find optimization opportunities.
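
A small worked example of the distribution described above: if a page's link value is split in equal shares across all of its outgoing links, every nofollow link still consumes a share, which is then simply lost. A minimal Python sketch under that assumption (the "juice units" are made-up numbers for illustration):

  def juice_per_followed_link(page_value, follow_links, nofollow_links):
      """Share of the page's link value that each followed link passes on,
      assuming the value is split equally over ALL outgoing links."""
      total_links = follow_links + nofollow_links
      if total_links == 0:
          return 0.0
      return page_value / total_links

  # A page worth 10 juice units with 8 follow and 2 nofollow links passes
  # 1.0 unit per followed link; the 2 nofollow shares simply evaporate.
  print(juice_per_followed_link(10, 8, 2))  # 1.0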


Pay attention to security aspects of your website

Every day, hundreds of sites are hacked. From an SEO point of view, what matters is not only that visitors leave a hacked site immediately and criminals get access to stored data, but above all that hacked sites are often flagged at Google with a security warning: “This site may harm your computer”.

Such a warning means that almost no one visits the site anymore. The following basic security practices are therefore almost indispensable when operating a website, also from an SEO point of view:

  • Regular CMS core updates (TYPO3, Joomla, etc.)
  • Regular extension updates
  • Regular inspection of the PHP error logs
  • Only put your own PHP code online if you know how to write secure PHP code (forms, etc.)
  • Maintain PHP code regularly (even for good programmers, on average at least 1 error sneaks in per 1000 lines of code)
  • Subject the code you use to penetration tests
  • Create a contingency plan and keep it accessible (who must be notified, what happens to customer data, what needs to be done, how do we get the page back online as quickly as possible, etc.)

Server administrators naturally have to deal with the matter in more depth; they should note at least the following points:

  • Regular updates of important components (Apache, PHP, MySQL, etc.)
  • Subscribe to important security bulletins
  • Regular application of patches
  • Securing the server (mod_security, mod_evasive, etc.)
  • Minimizing information disclosure (turn off the server signature)
  • Regular checking of important log files (access.log, error.log, user logins, system messages, etc.)
  • Regular penetration testing (checking the e-mail server, checking the Apache server, playing through frequent attack scenarios, etc.)
  • Automatic copying of log files to external drives (e.g. via online transmission) in order to prevent subsequent manipulation of the log files
  • If necessary, live monitoring of accesses (e.g. via apachetop) to counter DDoS attacks
  • Set up uptime monitoring and monitoring of all major services in order to be informed directly in case of damage
  • Draw up contingency plans and keep them accessible

If you are using WordPress, also read the article 9 SEO Security Tips for WordPress as a supplement. You should not ignore the security of your own PC either, because an attack on your website can also take place via your PC (spying out FTP logins, stealing code, etc.). Regular information about PC security can be found, among others, at secuteach.


Optimize the performance of your pages

The loading time of a website is an important ranking factor at Google that can decide between front and back positions in the search results. In its Web Performance Best Practices, Google names 6 optimization areas with which page loading can be sped up:

  • Optimize caching (keep content available offline)
  • Minimize request-response round trips
  • Avoid data overhead (e.g. use GZIP)
  • Minimize traffic (responses, downloads, etc.)
  • Optimize browser rendering
  • Optimize pages for mobile devices

For analyzing and implementing the different aspects, the Google Page Speed browser plug-in (part of the Seitenreport analysis), among others, is suitable. In addition, there are a number of server-side optimization options:

  • PHP caching (e.g. via eAccelerator)
  • MySQL server tuning + caching
  • Faster server CPU
  • Use the latest PHP and MySQL versions
  • Prevent PHP errors
  • Use current CMS versions
  • Server location close to the target group (check with traceroute)
  • Optimize connection and response time

Also measure the loading time of your website regularly with webpagetest.org.
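
For a quick, rough check in between, you can also time the HTML download yourself. A minimal Python sketch (standard library only; unlike webpagetest.org it measures only the HTML document, not images, CSS, JavaScript or rendering, and the URL is a placeholder):

  import time
  import urllib.request

  def response_time(url, timeout=30):
      """Seconds until the HTML of `url` has been downloaded (one sample)."""
      start = time.perf_counter()
      urllib.request.urlopen(url, timeout=timeout).read()
      return time.perf_counter() - start

  samples = [response_time("http://www.domain.tld/") for _ in range(5)]
  print(min(samples), sum(samples) / len(samples))  # best and average sample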

Marketing and Social Media

Place high-quality content in Google News

Articles placed in Google News can draw very high visitor flows to your site. However, the quality guidelines for included sites are quite high, and the positive processing of an application usually takes quite a long time. Anyone who regularly publishes high-quality (self-written) news on their website should register their website with Google News. Google itself has published the tutorial “Getting Started” for this. The decisive question should always be whether your own content adds value to Google News.

Use Social Media

Social media portals offer you the opportunity to reach a very wide audience with good content or interesting tools in a short time. In particular, the ease of sharing (“Like”, “Retweet”, “+1”, guerrilla marketing, etc.) can make social media a meaningful pillar of your SEO. You should know the following social media portals and use them regularly to disseminate your articles and content:

  • Facebook
  • Twitter
  • Google+


Keep an eye on the bounce rate of your pages

The bounce rate, as defined by the Web Analytics Association (WAA), indicates how many visitors leave a page again after only one page view. Depending on the analytics software, short visits of 5-10 seconds are also counted towards the bounce rate.

You can monitor the bounce rate of your pages very well with Google Analytics, among other tools. The bounce rate is usually higher the more visitors find their way to your site through search engines (because they may, for example, search for a keyword for which the result does not fit 100% and bounce).

Keep a particular eye on pages with a high bounce rate and try to optimize them (thematically better keywords, better web design, higher usability, better texts, more suspense and interest for the visitors, etc.). Any bounce rate above 50% should be looked at in more detail.
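
As a small worked example of the definition above (single-page visits plus, depending on the analytics software, very short visits, divided by all visits), here is a minimal Python sketch with made-up numbers:

  def bounce_rate(single_page_visits, short_visits, total_visits):
      """Bounce rate in percent; `short_visits` covers the 5-10 second
      visits that some analytics packages also count as bounces."""
      if total_visits == 0:
          return 0.0
      return 100.0 * (single_page_visits + short_visits) / total_visits

  print(bounce_rate(420, 30, 1000))  # 45.0 -> still below the 50% warning mark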

Improve your availability (uptime)

Uptime refers to the percentage of time during which a site is reachable. Its counterpart is the downtime, the time during which a website is unreachable. Unreachability can be location-specific: a website may, for example, be easily reachable from German Internet connections while calling it up from France causes problems.

When measuring uptime, it is therefore important not only to call up the site yourself in the browser, but to use external tools that can access a website from different locations. An ideal tool for measuring uptime is Pingdom website monitoring, which is free in the standard version.

Optionally, Pingdom can notify you directly via email in the event of any downtime so that countermeasures can be taken quickly. Pay attention to a high uptime in your SEO, because too many outages can adversely affect your positions in the search engines (high downtime = lower quality of service). The uptime should be at least 99.5%.
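
To make the metric tangible: uptime is simply the share of checks that succeed. A minimal Python sketch of such a poll (standard library only; a real monitor such as Pingdom checks from several locations, as noted above, and the URL is a placeholder):

  import time
  import urllib.request

  def measure_uptime(url, checks=10, interval=60, timeout=10):
      """Poll `url` and return the percentage of successful responses."""
      ok = 0
      for _ in range(checks):
          try:
              urllib.request.urlopen(url, timeout=timeout)
              ok += 1
          except Exception:
              pass  # timeout, DNS failure, connection refused, HTTP error, ...
          time.sleep(interval)
      return 100.0 * ok / checks

  print(measure_uptime("http://www.domain.tld/", checks=5, interval=30))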

Use geo tags for regional offers

Local offers are displayed in the Google search above the regular search results. By embedding geo coordinates (location reference) you can therefore greatly increase your visitor numbers, particularly for regional searches. The following example shows geo meta tags for a site from Dresden:
<meta name="geo.region" content="DE-SN" />
<meta name="geo.placename" content="Dresden" />
<meta name="geo.position" content="51.03983;13.748498" />
<meta name="ICBM" content="51.03983, 13.748498" />
<meta name="DC.title" content="SG Dynamo Dresden" />

Use microformats to stand out better in the SERPs

Microformats (also called rich snippets) enhance search results with additional information and help search engines classify information semantically. For example, an author photo can be displayed next to an article, or the rating of an article can be incorporated into its entry in the search engine results. So far, Google supports microformats for the following content types:

  • User reviews
  • People
  • Products
  • Companies and organizations
  • Recipes
  • Events
  • Music

Complete documentation on the Google microformats can be found here.

Using microformats is quite simple. Existing HTML markup is simply extended with the additions itemscope and itemprop(erty)="abc":
<!-- Example microdata markup
for the content type "Person" -->
<div itemscope
itemtype="http://data-vocabulary.org/Person">
Name: <span itemprop="name">Fritz Licorice</span>
Username: <span itemprop="nickname">Lafritz</span>
Web site: <a href="http://www.domain.tld"
itemprop="url">www.domain.tld</a>
Occupation: <span itemprop="title">Pixelschieber</span>
<span itemprop="affiliation">Gold Ducks GmbH</span>
</div>

Check your links regularly with a Link Checker

Links that lead nowhere not only anger visitors but also let existing link juice vanish into nirvana. Check your websites occasionally with a link checker. In the CMS TYPO3, a link checker has been an integral part of the core since version 4.5. Users of other CMSs can instead use the W3C Link Checker to validate the links on their site.
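
As an illustration of what such a link checker does, here is a minimal Python sketch (standard library only) that collects the links of a single page and reports those that cannot be fetched; it is a simplified stand-in, not a reimplementation of the TYPO3 or W3C tools, and the URL is a placeholder:

  from html.parser import HTMLParser
  from urllib.parse import urljoin
  import urllib.request

  class LinkCollector(HTMLParser):
      """Collect the href targets of all <a> tags on a page."""
      def __init__(self):
          super().__init__()
          self.links = []
      def handle_starttag(self, tag, attrs):
          if tag == "a":
              for name, value in attrs:
                  if name == "href" and value:
                      self.links.append(value)

  def find_broken_links(page_url):
      html = urllib.request.urlopen(page_url, timeout=10).read().decode("utf-8", "replace")
      collector = LinkCollector()
      collector.feed(html)
      broken = []
      for href in collector.links:
          target = urljoin(page_url, href)
          if not target.startswith("http"):
              continue  # skip mailto:, javascript:, pure anchors, etc.
          try:
              urllib.request.urlopen(target, timeout=10)
          except Exception:
              broken.append(target)
      return broken

  print(find_broken_links("http://www.domain.tld/"))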

Avoid a bad neighborhood

If the IP address of your site sits in a bad neighborhood, this can be an indication to search engines that your content is not of particularly high quality. Use the Seitenreport analysis “IP Neighborhood” (included in the analysis repertoire of premium memberships) to see which other websites share the same IP address with your website. The analysis automatically checks whether the domains found appear in a blacklist, which is a strong indication of a bad neighborhood (they are marked in red). The following events may indicate that a site is considered a bad neighborhood (see also the sketch after the list below):

  • Security warning about the website in the Google search
  • The site has been entered in one or more blacklists (e.g. malware blacklist, abuse blacklist, etc.)
  • The site contains only obvious spam content
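
As a first step (the sketch mentioned above) you can at least look up which IP address your domain resolves to; listing the other sites hosted on the same IP then requires a reverse-IP lookup service. A minimal Python sketch with a placeholder domain:

  import socket

  # Sites that share this IP address are your "neighborhood".
  print(socket.gethostbyname("www.domain.tld"))  # placeholder domain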


SEO today is no longer simply SEO. Many aspects from other web, usability, programming, security and marketing areas have begun to play into search engine optimization. This trend is set to continue, because search engines are weighting these other factors ever more strongly. It is therefore essential to engage with the whole web field if you want to do search engine optimization for the long term.

Of course, not all of these points are best practices for every search engine optimizer and website operator. I have compiled them in the hope that they will help beginners and perhaps provide professionals with new ideas.
