Tuesday, 12 July 2011

Three Essential Apps for Search Engine Optimization Pros

One of the most difficult and fascinating aspects of successful blogging is search engine optimization. Many individuals first enter the blogging world expecting to become successful bloggers by simply writing quality posts and hosting them on a webpage. However, as many of us know, blogging has become far more involved than simply having some minimal tech knowledge and a writing interest. Bloggers who wish to be really seen within the blogosphere must develop ways to get their articles and webpage ranked highly on the major search engines throughout the web. There are several ways in which search engine rankings are determined and it can be difficult to keep up with them at times. These three applications can help bloggers perfect and keep track of their search engine analytics.


Pokeseo


This app is available for the iPhone and is heralded as the "first" SEO reporting application for the iPhone. Pokeseo allows you to check your Google PageRank and reported backlinks for your domain or a competitor's domain at any time, from any place. Pokeseo has a built-in email support system that allows you to email an SEO statistics report to any individual when needed. This can be particularly helpful for businesses or marketing companies trying to keep track of their site's web analytics and share them with other team members. Pokeseo has a very intuitive and beautiful interface that is very easy to use. The simple interface enables users who are new to the blogging and SEO world to stay on top of and understand important web analytics for their website or sites.


Analytics App


With dozens upon dozens of web apps available for search engine optimization and web analytics reports, this app is the closest thing to logging into your Google Analytics account. The Analytics App offers over 55 different reports, the ability to track multiple domains, custom date views, and the ability to inspect your AdWords campaign traffic. The application has a sleek, clean interface that is easy to navigate and simple to learn. In addition to offering nearly everything Google Analytics does, it has a few added features you cannot find anywhere else. The "Today" report and the "Yesterday" report are features you can only find in this app, and they can help you stay up to date on your site's web analytics.
 


SEM Calculator


This iPhone app is free on iTunes and allows you to perform numerous calculations based on common search engine marketing equations. This app is slightly more involved than the others that made this list, so the SEM Calculator requires a somewhat deeper knowledge of SEO methods and analytics. With various calculators, including CPM, CPC, CPA, CTR, and more, the SEM Calculator app is useful for many different types of users. A calculation cheat sheet included in the newer version of this app helps users understand the relevant formulas. So, if you are an SEO beginner this app may not be for you quite yet, but it can be extremely helpful once you gain a greater understanding of what SEO is and how web analytics work.
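For reference, the metrics behind these calculators follow standard definitions: CPM is cost per thousand impressions, CPC is cost per click, CPA is cost per acquisition, and CTR is clicks divided by impressions. Here is a minimal Python sketch of those formulas; the example figures are made up, and the app's own calculators may round or label things differently.

def cpm(cost, impressions):
    # Cost per thousand impressions.
    return cost / impressions * 1000.0

def cpc(cost, clicks):
    # Cost per click.
    return cost / clicks

def cpa(cost, acquisitions):
    # Cost per acquisition (sale, signup, etc.).
    return cost / acquisitions

def ctr(clicks, impressions):
    # Click-through rate, as a percentage.
    return clicks / impressions * 100.0

# Example: a $50 campaign with 20,000 impressions, 300 clicks and 12 sales.
print(cpm(50.0, 20000))   # 2.5   -> $2.50 per thousand impressions
print(cpc(50.0, 300))     # ~0.17 -> about $0.17 per click
print(cpa(50.0, 12))      # ~4.17 -> about $4.17 per sale
print(ctr(300.0, 20000))  # 1.5   -> a 1.5% click-through rate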


 


About Author


Alvina Lopez is a freelance writer and blog junkie who blogs about accredited online colleges. She welcomes your comments at her email address: alvina.lopez@gmail.com.

Thursday, 23 June 2011

My website got banned by Google [Part 1]

Hello! I am writing a post after a long time for several reasons, one of which is that I was busy migrating my server to another company because of poor support. In the meantime, one of my friends, Muhammad Faisal (Karachi), contacted me and told me something that made me worry.


He told me that his website was not getting any search engine traffic and had lost about 75% of its visitors. This was very surprising to me, as he had worked hard to get his website indexed in the search engines and had earned a very good position in the SERPs (his website was in the top three results). I checked his website in Google using a site: search on his URL and found nothing in the results, and when I then searched for phrases related to his website, the result was the same.


I thought that this might be a temporary ban from Google and that the website would soon be indexed again, but that hope turned out to be nothing but a nightmare. I then started digging into the reasons for this and found a lot of things that can cause the problem. If you want to keep your website from being banned by the search engines, review your website and follow the instructions below.


1. Duplicate Content or Websites


I don't know why people copy someone else's data to get their website indexed. People seem to wake up in the morning, create a website, and then start copying data from other websites. This is the main reason for getting banned by Google, as Google has powerful algorithms to catch this kind of fraud and can easily distinguish between original content and a copy.


If Google finds multiple web pages have the same content they may penalize each website for this. Of course, someone may have copied your content and Google banned you even though it was your original content that was taken. Make sure no other site is using your content. You can do this by performing a Google search using some of your text with quotation marks (") around it. If you do find someone is using your original copy visit here to learn more about copyright infringement: http://www.google.com/dmca.html.


2. Robots and Meta Tags


Yes, robots.txt and meta tags can also keep your website from being indexed in search engines; robots.txt can be treated as the oxygen pipe for your website. So it is better to check your robots.txt file, if you have one. Also check whether text like the following appears in your pages:


<meta name="ROBOTS" content="NOINDEX">. If you find something similar to this then your website is blocked to be accessed by search engines.


You can allow or deny access to search engine bots via robots.txt. Here is an example of each.


The lines below allow all robots to crawl your website:


------------------------------


User-agent: *
Disallow:


-----------------------------


This example keeps all robots out:


----------------------------


User-agent: *
Disallow: /


---------------------------
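If you are unsure how your rules are interpreted, you can also test them programmatically. Below is a minimal sketch using Python's standard-library robots.txt parser; the domain, page, and "Googlebot" user agent are placeholders to replace with your own values.

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()  # downloads and parses the robots.txt file

if rp.can_fetch("Googlebot", "http://www.example.com/some-page.html"):
    print("Googlebot is allowed to crawl this page.")
else:
    print("Googlebot is blocked -- check your robots.txt rules.")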


3. Cloaking


Google says: "The term "cloaking" is used to describe a website that returns altered web pages to search engines crawling the site. In other words, the web server is programmed to return different content to Google than it returns to regular users, usually in an attempt to distort search engine rankings. This can mislead users about what they'll find when they click on a search result. To preserve the accuracy and quality of our search results, Google may permanently ban from our index any sites or site authors that engage in cloaking to distort their search rankings."


This is one of the biggest frauds committed by webmasters. In this case, the page shown to the user is very different from the one presented to the search engine. It is usually done to rank in a higher position in the search engines and to achieve a high PR. Google has the technology to easily detect this kind of fraudulent activity, so keep your website clean of cloaking.
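One rough way to check your own site (for example, after inheriting someone else's code or installing a suspicious theme or plugin) is to request the same page with a normal browser user agent and with a Googlebot-style user agent and compare the responses. The sketch below only compares response sizes, and the URL and user-agent strings are examples; a big difference is a prompt for manual investigation, not proof of cloaking.

import urllib.request

def fetch(url, user_agent):
    # Request the page while presenting the given user-agent string.
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

url = "http://www.example.com/"
browser_html = fetch(url, "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
bot_html = fetch(url, "Googlebot/2.1 (+http://www.google.com/bot.html)")

print("Browser response: %d bytes, Googlebot response: %d bytes" %
      (len(browser_html), len(bot_html)))
diff = abs(len(browser_html) - len(bot_html))
if diff > 0.2 * max(len(browser_html), len(bot_html)):
    print("The responses differ a lot -- the site may be cloaking.")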


4. Hidden Text and or Links


Search engines hate this kind of fraud, and your website can be banned permanently for it. It is not used frequently, but it does still appear. Webmasters hide the text of a webpage in different ways, such as by applying a white colour to text on a white background or by telling the CSS file not to display the text, but as we know, the search engines are more intelligent than we are.
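A quick way to audit your own pages is to search the HTML source for the obvious hiding tricks. The sketch below flags a few common inline-style patterns; it will not catch everything (external stylesheets and off-screen positioning still need a manual review), and the URL is a placeholder.

import re
import urllib.request

SUSPICIOUS = [
    r'display\s*:\s*none',
    r'visibility\s*:\s*hidden',
    r'font-size\s*:\s*0',
    r'color\s*:\s*#fff(?:fff)?\b',   # white text (suspicious on a white page)
]

html = urllib.request.urlopen("http://www.example.com/").read().decode("utf-8", "ignore")
for pattern in SUSPICIOUS:
    hits = re.findall(pattern, html, flags=re.IGNORECASE)
    if hits:
        print("Found %d match(es) for %r -- review them manually." % (len(hits), pattern))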


5. Keyword Spamming


Keyword density must not be overdone; many webmasters want to rank higher very quickly and adopt this technique. A webpage should contain a reasonable number of uses of the keywords in a phrase. If a search engine finds that certain words are repeated excessively on a page, your rank will be reduced and you may even get banned.
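If you want a number to look at, keyword density is simply the share of all words on the page taken up by your keyword. Here is a minimal sketch with a deliberately over-stuffed example sentence; treat any particular percentage target as a rule of thumb, not an official Google limit.

import re

def keyword_density(text, keyword):
    # Percentage of words on the page that are the given (single-word) keyword.
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

sample = "We sell the best gifts. Our gifts are great gifts for everyone."
print("%.1f%%" % keyword_density(sample, "gifts"))  # 3 of 12 words -> 25.0%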


 


This is the first part of the post; I can't write the second part right now because I am busy with exams. As soon as I am free, I will write the second part as well. Your comments are welcome.

Tuesday, 17 May 2011

Finding a Job in Web Design

Finding a job in website design may not be as difficult as one might think. There are several ways a qualified designer can work. He can get a job working for a web design company, begin his own business or work freelance—promoting his services online to design for other people while remaining his own boss.

When applying for a job, the applicant needs to establish his qualifications rather than committing to a specific job title. The Human Resources person will see which position would best suit those specific qualifications. Many web-related jobs within a company don't state that in the title: copywriter, copy editor, producer, information architect, graphic designer, program manager, layout artist, and digital developer are some options.

Concentrate on the job description scanning for web technology terms such as Web or HTML. Any job for an online company will have something to do with the web field and get the novice in the door. Being a copy editor may not be the goal, but starting small with a good company can get the employee where he ultimately wants to go.

Non-corporate settings can also provide work in web design. Being a freelance web writer, establishing one’s own website, writing website reviews, or becoming a web designer for others on a freelance basis can provide a good income. However a freelance web designer must set himself apart. He should cater to a very specific industry, knowing their insider terms and jargon well in order to fill their niche most effectively. He may offer a variety of services—webhosting, design, writing, editing, and specialty art—to meet that industry’s needs.

A freelance web designer may want to have an onboarding life events system to keep his business productive. It will provide personalized employee portals for task management and electronic forms. Onboarding apps also support employee life events such as promotions, medical leave, and employment termination. As an entrepreneur the freelancer must also be able to manage time, business orders, and finances for himself and his employees. Some onboarding programs provide assistance in these areas also. Managing contracts and other legal aspects must be taken into account when one establishes a freelance business for himself and his employees. Tax payments and licensing must also be considered. It is no small job to work for one’s self.

Before accepting any employment in the field, get the education and certification needed to be prepared for all aspects of web design. Today any job is somewhat difficult to get. Having the specifically required skills will place one applicant far ahead of another.


Danielle is a writer with a passion for web design, social media and freelance business.  In her free time she enjoys playing with her two English bulldogs. Read her blog about technology, computers, and gadgets at TechNected.com.

Friday, 6 May 2011

Google banned my website

Usually there is no warning for being banned or penalized by Google except a steady drop in sales and visitors to your site. Many site owners and search engine optimization firms have little to no idea why they were removed and are left scratching their heads as to how to get back in. There are many reasons why a site can be banned; here are a few of the more common ones. If your site has been banned, contact your SEO company or give Big Oak a call to help you get back on the right track to high Google rankings.


1. Robots and Meta Tags


The first and simplest explanation may be that your robots.txt file has been changed to prevent search engines from entering your site. Or your meta tags could be directing the search engine robots to exclude your site. While this is highly unlikely, it is best to rule it out. So check your robots.txt file (if you have one) and your meta tags. Unless you want your site hidden, you should never see this in your meta tags: <meta name="ROBOTS" content="NOINDEX">. If you see this, you are blocking your site from Google.


You can also ban your own site by having a robots.txt with the wrong code. Two examples of robots.txt code are below.


This example allows all robots to visit all files because the wildcard "*" specifies all robots.


User-agent: *
Disallow:


This example keeps all robots out:


User-agent: *
Disallow: /


Read more about this at: http://en.wikipedia.org/wiki/Robots.txt
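If you would rather not eyeball every template by hand, a few lines of code can check a page for a robots meta tag. Here is a minimal sketch using only Python's standard library; the URL is a placeholder for one of your own pages.

from html.parser import HTMLParser
import urllib.request

class RobotsMetaFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        # Flag any <meta name="robots" content="... noindex ..."> tag.
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        content = (attrs.get("content") or "").lower()
        if name == "robots" and "noindex" in content:
            self.noindex = True

page = urllib.request.urlopen("http://www.example.com/").read().decode("utf-8", "ignore")
finder = RobotsMetaFinder()
finder.feed(page)
print("noindex found -- this page is blocked!" if finder.noindex
      else "No noindex meta tag on this page.")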


2. Cloaking (A Big Google No-No)


Straight from Google's website: "The term "cloaking" is used to describe a website that returns altered web pages to search engines crawling the site. In other words, the web server is programmed to return different content to Google than it returns to regular users, usually in an attempt to distort search engine rankings. This can mislead users about what they'll find when they click on a search result. To preserve the accuracy and quality of our search results, Google may permanently ban from our index any sites or site authors that engage in cloaking to distort their search rankings."


If your website or web pages are set up to display different information to a search engine spider than to a real person, then you are cloaking. Cloaking delivers one version of a page to an Internet user and a different version to a search engine. The cloaked page is packed with the keywords and terms that the site wants to rank highly for, so, in essence, they are cheating. There are legitimate reasons for serving different content as well, such as targeted advertising, but if you are trying to manipulate your rankings you should put an end to this immediately.
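For recognition purposes only (not something to deploy), server-side cloaking usually boils down to branching on the visitor's user agent. The hypothetical sketch below shows the pattern to look for in an inherited codebase; if you find anything along these lines, remove it.

def render_page(request_user_agent):
    # Cloaking pattern: branch on the crawler's user agent -- do NOT do this.
    if "googlebot" in request_user_agent.lower():
        # Keyword-stuffed version served only to the crawler.
        return "<html>cheap widgets cheap widgets cheap widgets ...</html>"
    # Normal version served to human visitors.
    return "<html>Welcome to our widget store!</html>"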


3. Duplicate Content or Websites


If Google finds multiple web pages have the same content they may penalize each website for this. Of course, someone may have copied your content and Google banned you even though it was your original content that was taken. Make sure no other site is using your content. You can do this by performing a Google search using some of your text with quotation marks (") around it. If you do find someone is using your original copy visit here to learn more about copyright infringement: http://www.google.com/dmca.html.


You can check here to see if your site has been duplicated unbeknownst to you: http://www.copyscape.com
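If you already suspect a specific page of copying yours, you can also quantify the overlap yourself. This is a rough sketch using Python's standard-library difflib; the two URLs are placeholders, and the crude tag stripping is good enough for a ballpark similarity figure, not for legal evidence.

import difflib
import re
import urllib.request

def page_text(url):
    # Download a page and strip the HTML tags very roughly.
    html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
    return re.sub(r"<[^>]+>", " ", html)

mine = page_text("http://www.example.com/my-article.html")
theirs = page_text("http://www.example.org/suspicious-copy.html")

ratio = difflib.SequenceMatcher(None, mine, theirs).ratio()
print("Similarity: %.0f%%" % (ratio * 100))
if ratio > 0.8:
    print("The pages are nearly identical -- consider a DMCA complaint.")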


4. Hidden Text and or Links


How can text be hidden? Well, there are a variety of ways, some sneakier than others, but it boils down to this: text or a link is considered hidden if it is invisible to the website visitor but can be seen by search engine spiders. This used to be done quite often, for example by making your text white on a white background or using cascading style sheets (CSS) to hide your text, but search engines can easily spot this today, so it is best to avoid it altogether.


5. Keyword Spam and Keyword Stuffing


Ever seen a web page with a very awkwardly written first paragraph where a certain word is repeated ad nauseam? Here's an example:


"We sell the best father's day gifts for father's day. If you like to celebrate father's day we can help with the best father's day gifts for father's day."


Care to guess which keywords are being targeted? This is keyword spamming or stuffing, but it is just the tip of the SEO iceberg. That is only the content on the page; there is probably keyword stuffing happening in the code as well: in the meta tags, invisible text, alt tags, title tags, comment tags, etc. If the word or phrase is repeated too often, Google can place a filter to reduce the site's rankings or simply ban the site. Keyword density can be tricky but, as a general rule, Big Oak shoots for 1% to 5% of all text on a page to be our targeted keywords. Ultimately you must write for the reader, not the search engine. Be sure the keywords flow naturally.


6. Doorway Pages


Defining a doorway page can be difficult, so here is our definition of the kind of page that could potentially get your site banned in Google: pages that are created in order to attract search engine spiders and be ranked highly for their targeted keywords. Real visitors find this page and then continue to the "real" website from there. Hence the name "doorway page". These pages aren't in the navigation most of the time. If you come across a page where much of the information is duplicated from other pages on the site but it differs only in terms of keywords, it is most likely a doorway page.


As you can see, this can be a gray area. Some pages on a website may focus on a particular subject and be innocent of trying to lure search engine spiders only for high rankings. Err on the side of caution and make sure the page is useful and part of your site's navigation.


7. Redirect Pages


Sneaky redirect pages are set up in groups of anywhere from five to hundreds of pages. They all target similar and related keywords or phrases. Usually, the only links on these pages are links to other pages in the same family, creating a false sense of related linking.


These pages don't necessarily contain content that any human would be interested in. These pages may show up high in Search Engine Results Pages (SERPS), but when you click on one of these pages from the SERPS, you will be redirected to another page. In other words, the page you click to see is not the page you actually get to read.


The redirect can be automatic, done with a meta refresh command, or triggered through other means such as the mouse moving while on the redirect page.
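The meta refresh case is the easiest one to spot automatically. Below is a small sketch that checks a page for it; the URL is a placeholder, and JavaScript or mouse-event redirects still require reading the page source by hand.

import re
import urllib.request

resp = urllib.request.urlopen("http://www.example.com/landing-page.html")
page = resp.read().decode("utf-8", "ignore")

# Look for a <meta http-equiv="refresh" ...> tag anywhere in the page.
match = re.search(r'<meta[^>]*http-equiv\s*=\s*["\']?refresh["\']?[^>]*>',
                  page, flags=re.IGNORECASE)
if match:
    print("Meta refresh found:", match.group(0))
else:
    print("No meta refresh on this page.")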


8. Buying Links


While buying links may not get you banned, it can certainly hurt your page rank. Google has slowly been catching on to this fad and has measures in place to put your site in limbo for 6-8 months (known as the "sandbox effect") so you can't instantly benefit from buying links to your website. Many sites that sell links are being devalued by Google, making an investment in this strategy a waste of money and time. Ultimately, stay away from buying links to increase your ranking.


9. Linking to Bad Neighborhoods


Link campaigns are a good thing when done correctly; we would say they are a necessity in today's SEO world. But linking to bad neighborhoods is a sure way to lose your rank in Google. If you aren't careful about who you are linking to, you can easily disappear overnight. Basically, while you may be ethical and do everything right, linking to someone who isn't can be considered guilt by association. Always verify your links to other sites. Make sure they have page rank in Google and are indexed by Google. Try searching for their URL to see if they are indexed. Avoid linking to any sites that use spamming techniques to increase their search engine rankings. Regularly checking outbound links from your site and removing any offenders is a good idea; a small sketch for pulling a list of your outbound links follows the list below.


A few site types to avoid:



  • Free-for-all link farms

  • Adult sites

  • Gambling sites
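As promised above, here is a minimal sketch that lists the external domains a page links to, so you can review them by hand (are they indexed in Google? do they look spammy?). The URL and the example.com domain check are placeholders to replace with your own site.

from html.parser import HTMLParser
from urllib.parse import urlparse
import urllib.request

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        # Collect the host of every absolute link that points off-site.
        if tag == "a":
            href = dict(attrs).get("href") or ""
            host = urlparse(href).netloc
            if host and "example.com" not in host:  # skip internal links
                self.hosts.add(host)

page = urllib.request.urlopen("http://www.example.com/").read().decode("utf-8", "ignore")
collector = LinkCollector()
collector.feed(page)
for host in sorted(collector.hosts):
    print(host)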


10. Code swapping


Code swapping means optimizing a page for a top ranking, then swapping another page into its place once that top ranking is achieved.


What does Google say?


"Don't deceive your users, or present different content to search engines than you display to users," Google says, and they list some bullet points on avoiding being banned.



  • Avoid hidden text or hidden links.

  • Don't employ cloaking or sneaky redirects.

  • Don't send automated queries to Google.

  • Don't load pages with irrelevant words.

  • Don't create multiple pages, subdomains, or domains with substantially duplicate content.

  • Avoid "doorway" pages created just for search engines, or other "cookie cutter" approaches such as affiliate programs with little or no original content.


Google also states:


"Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you. Another useful test is to ask, 'Does this help my users? Would I do this if search engines didn't exist?'"


While creating a page without a thought to search engines is probably going a little too far, optimizing your site for an organic search, as long as it conforms to their standards, is perfectly acceptable.


How to get back into Google


The old link I used to recommend, http://www.google.com/support/bin/request.py, no longer shows the option for "I'm a webmaster inquiring about my website" that allowed you to request reinstatement.


However, logging in to Google Sitemaps now shows a direct link at the bottom of your main account page to "Submit a reinclusion request", which takes you here - https://www.google.com/webmasters/sitemaps/reinclusion?hl=en


This means you will have to register your site there to do so.


From there you get to check boxes that let you admit guilt, acknowledge modification, agree not to do it again, and even a box to explain yourself.


You don't have to contact Google but it can't hurt. They will eventually spider your site again and see that you have cleaned up your website. You may have to wait a few months for Google to re-index your site so be patient and don't tinker with your website too much unless dictated by your site's products or content needs.


The worst case scenario is to start a new site. Sometimes this can be necessary but only in the most extreme cases.

Wednesday, 27 April 2011

Post Optimization Tips to Get a Higher SERP Position



Let's talk about on-page optimization, especially of post content. As everybody says, content is king and SEO is the queen. That is the critical area of our site. Blog post optimization gives your posts the best chance of showing up in Google for a predetermined keyword phrase.
I am using an SEO plugin, and it gave me the inspiration to share what it suggests. There is a technique to optimizing blog posts for a better position in search engine results; you can't just write a new post, publish it, and expect a good SERP position. There are at least 10 things in its suggestions. What are they? Here is the complete list (a small checking sketch follows it).
1.    Use Enough Keyword Density.
Keyword density is the percentage of times a keyword or phrase appears on a web page compared to the total number of words on the page. Try to use it between 2 – 6%.
2.    Add Keyword Phrase in Post Title.
Always include the primary keyword phrase in the page title. Keep the title less than 70 characters in length.
3.    Add an H1 Tag Containing the Keyword Phrase.
Most blog CMSs automatically assign an H1 tag to the blog post's title. But if you are not sure that your theme does this, add an H1 tag with the primary keyword phrase in it to the post content.
4.    Highlight Your Primary Keyword with boldface or strong tags near the top of your content.
5.    Add an image with keyword phrase in alt text.
Always make sure to have at least one image on every blog post. Also, include the primary keyword phrase in the image file name and the image alt text.
6.    Add Keyword phrase to first sentence.
From the very first sentence, you should use the keywords that lead into your post content.
7.    Add Keyword phrase to last sentence.
Most of your keywords should appear somewhere in the post so that it gets indexed, but keyword usage must stay reasonable; otherwise you may be penalized by search engines.
8.    Inbound or Internal Links.
It is recommended to link the primary keyword phrase to a page on your website domain with the intent of helping that page rank for that word.
9.    Use more words.
Try to use enough words in your post content so that more of them get indexed by search engines like Google. It is commonly observed that longer posts get indexed for a wider range of keywords.
10.    Use nofollow links for external links.
As you know, this is a DoFollow blog and it receives almost 98% spam comments. So don't use a "DoFollow" link for an external link unless the link is good, important, or closely related to your post. I suggest you use "NoFollow" for all external links in your blog posts.
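As mentioned above, here is a rough sketch that checks a saved post against a few of these points (title length, keyword in the title, presence of an H1, image alt text, and an approximate keyword density). The file name, the keyword, and the 2-6% density window are all assumptions to adjust; the remaining points, such as bolding and internal links, are easier to check by eye.

import re

keyword = "keyword research"                       # hypothetical primary keyword phrase
html = open("post.html", encoding="utf-8").read()  # hypothetical saved copy of the post

text = re.sub(r"<[^>]+>", " ", html).lower()       # crude tag stripping
words = re.findall(r"[a-z0-9']+", text)

title_match = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
title = title_match.group(1).strip() if title_match else ""

# Approximate density: occurrences of the phrase times its word count over total words.
phrase_hits = len(re.findall(re.escape(keyword.lower()), text))
density = 100.0 * phrase_hits * len(keyword.split()) / max(len(words), 1)

checks = [
    ("title is 70 characters or less", 0 < len(title) <= 70),
    ("keyword phrase in title",        keyword.lower() in title.lower()),
    ("post has an H1 tag",             re.search(r"<h1[\s>]", html, re.I) is not None),
    ("an image has alt text",          re.search(r'<img[^>]+alt="[^"]+"', html, re.I) is not None),
    ("keyword density roughly 2-6%",   2.0 <= density <= 6.0),
]

for name, ok in checks:
    print(("PASS  " if ok else "FAIL  ") + name)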

Wednesday, 6 April 2011

High PR DoFollow Forums: Create Free Backlinks and Website Traffic for SEO

All of us know the importance of backlinks, which work as oxygen for a website or a blog. DoFollow links play an important role in link building, which is necessary for a website to achieve a high rank.


I have already shared a list of DoFollow blogs for SEOers so that they can get some benefit from it for their own blogs. This time I am sharing a list of high PR DoFollow forums that give you an opportunity to promote your links in signatures; every post you make on these forums will give you a free backlink.


Register at these forums, post something, and set your website or blog link as your signature. Your website will get a boost of link juice. If you like this post, please share it.


Dofollow forums list:



  1. forums.ukwebmasterworld.com

  2. forums.seochat.com

  3. cre8asiteforums.com/forums

  4. submitexpress.com/bbs

  5. highrankings.com/forum

  6. forums.digitalpoint.com/

  7. v7n.com/forums

  8. affiliateseeking.com/forums/

  9. sitepoint.com/forums/

  10. forums.seroundtable.com/

  11. forums.teneric.co.uk

  12. websitebabble.com/

  13. webmasterforums.biz/

  14. siteownersforums.com/

  15. webmasterforums.com

  16. webmaster-talk.com/

  17. webmasterforumsonline.com/

  18. ukseoforums.com

  19. webforumz.com

  20. australianwebmaster.com

  21. webmastershelp.com

  22. webcosmoforums.com

  23. daniweb.com/forums/

  24. zymic.com/forum/

  25. googlecommunity.com/forum/

Sunday, 3 April 2011

Keyword research can make you a millionaire

I've been following Satrap's tips and pointers on making money online for quite a while, and I figured it is about time I contribute something of genuine value to the blog (and the blogger) whose input helped me reach several significant conclusions on internet marketing.


We all know that there are many ways to make money online – different markets and niches within them, revenue models, advertising and affiliation options and the list goes on. The variety is staggering, and while I could delve a bit deeper into this I’d rather leave it to Satrap and focus on what I believe to be the common denominator to practically all online marketing methods.


No matter which method of monetization you go for, traffic generation is crucial to your success. Search engines send incredible multitudes of people in virtually every direction, and by “direction” I mean search query. It would be unwise (the understatement of the century, if you ask me) to ignore search engine traffic in your online money making endeavor, and the first step of tapping into that surge of potential conversions is keyword research.


Explore your niche and closely related niches as well. Compile a comprehensive list of relevant keywords and conduct in-depth keyword research, isolating the keywords that are both popular (searched for) and relatively easy to rank for given the existing competition. It is widely known that short tail keywords are usually very popular, but at the same time they tend to be the toughest to rank for due to their high competition levels. That's why I always recommend looking for those longer tail keywords that are both more specific (and, as a result, generally closer to conversion) and relatively easy to rank highly for. Low search engine rankings mean your site does not receive the exposure it needs in order to attract search engine traffic. Think about it - which would be better: appearing on the 15th SERP page for a highly relevant and competitive query that has an average daily pull of 10000 searches, or ranking first on 7 different, yet relevant, long tail queries, each pulling an average of 2000-3000 unique searches? You get the idea.
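To make that comparison concrete, here is a back-of-the-envelope sketch. The click-through rates in it are purely hypothetical assumptions (real CTRs vary a great deal by query and position), so treat the output as an illustration of the argument rather than data.

head_term_searches = 10000       # daily searches for the competitive head term
page_15_ctr = 0.0001             # assumed: almost nobody reaches results page 15

long_tail_queries = 7
searches_per_long_tail = 2500    # middle of the 2000-3000 range above
position_1_ctr = 0.30            # assumed CTR for a #1 organic result

head_visits = head_term_searches * page_15_ctr
long_tail_visits = long_tail_queries * searches_per_long_tail * position_1_ctr

print("Head term, page 15: about %.0f visits per day" % head_visits)              # ~1
print("Seven #1 long-tail rankings: about %.0f visits per day" % long_tail_visits)  # ~5250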


It takes time and some creative thinking, but properly conducted keyword research, incorporating the right keyword research tool, is worth the investment in the long run.