Tuesday, September 30, 2008

Accessibility an Important Factor in SEO

How crucial is web site accessibility for your online business?

Two complementary aspects
of online business are important: SEO and Web Accessibility.

1-- SEO with offsite and onsite techniques
2-- Site Accessibility

Neither of these aspects can be ignored when you are serving customers online.

Why is SEO one of the most important aspects of Web Marketing? It is done with the aim of achieving higher traffic. If SEO is done properly, traffic increases and a large number of users try to access the site. Unfortunately, when pages are served slowly or not at all, users see "Internal Server Error", "Not Found", "Page cannot be displayed" and other errors.

This is when you need to worry about accessibility. Users not being able to reach your site damages your brand. In addition, if web site services, including support information and functionality, are inaccessible, the risk of losing direct customers is higher.

The results of ignoring the second aspect, accessibility:

  • Waste of optimization efforts
  • Risks to fail online
  • Lost customers
  • Lower brand reputation
  • Traffic decrease
  • Customer complaints
You get all of these if your SEO is not combined with good uptime and performance.

Site accessibility means quick load time and high uptime.

Continuously improving site performance lets you deliver a consistent quality of service, so users are not driven away by a poor experience and do return.

How to improve: keep a constant eye on your web site's performance.

Web Monitoring

HTTP monitors check a web site's response headers and response time, and notify you if the site is slow or unreachable. Bad responses include the 4xx client errors; for instance, 404 means the page was not found.
You should also control each outgoing HTTP request from your computer and download only the necessary Internet resources, and monitor critical applications from your client location to ensure connectivity for better business and customer satisfaction.
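A minimal HTTP check of this kind can be sketched in Python with only the standard library. The timeout and the "slow" threshold below are illustrative assumptions, not values from any particular monitoring product:

```python
import time
import urllib.request
import urllib.error

def check_http(url, timeout=10):
    """Fetch a URL and report (status_code, elapsed_ms).
    A status of 0 means the site was unreachable."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            status = resp.status
    except urllib.error.HTTPError as e:
        status = e.code            # e.g. 404 Not Found, 500 Internal Server Error
    except (urllib.error.URLError, OSError):
        status = 0                 # DNS failure, refused connection, timeout
    elapsed_ms = (time.monotonic() - start) * 1000
    return status, elapsed_ms

def is_healthy(status, elapsed_ms, slow_threshold_ms=2000):
    """A check passes only if the server answered 2xx fast enough."""
    return 200 <= status < 300 and elapsed_ms < slow_threshold_ms
```

A real monitoring service runs such checks on a schedule from several locations and sends a notification when `is_healthy` returns False.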

Server monitors check web server performance and availability: CPU, memory and processes. By monitoring the key processes on your servers you can determine the CPU and memory consumption of each process. If CPU consumption is high and/or some processes take up a lot of server resources, every application on the server starts loading slowly, giving the web site's visitors a bad experience. This monitoring reveals problems long before they escalate into an outage.
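The CPU side of such a server check can be sketched with the standard library on a Unix host. The threshold of 1.0 load per core is my illustrative assumption; real monitors use tuned, per-server thresholds:

```python
import os

def cpu_pressure():
    """1-minute load average divided by CPU count (Unix-only).
    A value above 1.0 means processes are queuing for the CPU."""
    load1, _, _ = os.getloadavg()
    return load1 / (os.cpu_count() or 1)

def load_warning(load_per_cpu, limit=1.0):
    """Return a warning string when the CPU is saturated, else None
    (the limit of 1.0 is an illustrative threshold)."""
    if load_per_cpu > limit:
        return f"CPU overloaded: load per core is {load_per_cpu:.2f}"
    return None
```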

With proactive monitoring you'll know about problems with your site's functionality before your customers do.

SEO efforts are geared to attracting customers to your site. Monitoring helps to provide excellent customer experience and satisfaction. Both together ensure good service online.

Now watch the video to see how to increase your revenue potential with Web Monitoring.

Author: Kristina Frangulyan

Monday, September 29, 2008

Preventing GoDaddy from Charging a Customer $6,579.51

Adam Fendelman wrote an article in The Huffington Post about GoDaddy charging him $6,579.51. After four days of investigation by GoDaddy's security department the reason was found. The server wasn't "hacked"; the problem came from the Drupal open source software, which had deposited thousands of temporary files into his GoDaddy hosting account. After back-and-forth negotiations, the situation was finally resolved by GoDaddy promising "significant changes" to help prevent this issue for other people in the future.

GoDaddy wanted the customer to monitor this proactively by logging into his account, digging deep within its tools and checking the one that reports disk space usage.

Two other ways to prevent the issue are addressed below.

1) Using a third-party proactive server monitoring service would have revealed the issue much earlier, preventing the customer from paying $969 in the first place, from spending long, unproductive hours investigating the situation, and from ending up frustrated and unhappy with the provider.

2) To prevent all of these problems, including client complaints and refunds, GoDaddy could set up simple server monitoring of the disk space cap and an automated email notifying the client of the critical risk of exceeding the quota.
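A disk-quota check of this kind is a few lines of standard-library Python. The 80% threshold and the mount point are illustrative assumptions; a host would plug in its own quota values:

```python
import shutil

def disk_usage_pct(path="/"):
    """Percentage of disk space used at the given mount point."""
    usage = shutil.disk_usage(path)
    return 100 * usage.used / usage.total

def quota_warning(used_pct, quota_pct=80.0):
    """Return a notification message once usage crosses the quota
    threshold, or None if everything is fine (threshold is illustrative)."""
    if used_pct >= quota_pct:
        return f"Warning: disk usage at {used_pct:.1f}%, quota threshold {quota_pct:.0f}%"
    return None
```

Running this from cron and emailing the returned message would have surfaced Drupal's runaway temporary files long before the bill arrived.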

Friday, September 19, 2008

IBM Server in Top 10, Apache 40% More Popular than Microsoft IIS

The IBM HTTP Server is among the top 10 when analyzing web server uptime statistics in Mon.itor.Us monitoring data. The analysis is based on over 10 million daily checks across 70K HTTP tests performed by monitoring servers in 3 geographic locations.

Three criteria were considered for the evaluation:

Popularity - number of HTTP tests set for websites per server;
Uptime - average percentage of OKs for the HTTP checks per server;
Response time - average page load time of websites per server.
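The uptime criterion (average percentage of OKs) reduces to a one-line computation; representing each check as a boolean is my illustration, not Mon.itor.Us's data format:

```python
def average_uptime(check_results):
    """Average uptime as the percentage of successful ('OK') checks.
    check_results is a list of booleans, one per HTTP check."""
    if not check_results:
        return 0.0
    return 100 * sum(check_results) / len(check_results)
```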

Apache is still the most popular, with 33,902 tests set in the Mon.itor.Us network, followed by Microsoft-IIS, which in August showed 40% lower popularity.

GFE and IBM Server are seen in the two performance graphs.

Websites hosted on AkamaiGHost servers show the quickest average response, but only 4 sites are monitored; gws has 50 sites and Rapidsite 28. The IBM server has 184 sites in the Mon.itor.Us monitoring data.

The AOL server closes the list, showing ~99.5% uptime for the websites monitored on the free website and server monitoring service.

Friday, September 12, 2008

Mind the Uptime to Increase Web Site Crawl Rate

Ann Smarty analyzed 10 ways to make web sites important and valuable for search engines. "Note that you can’t force Googlebot to visit you more often - what you can do is to invite it to come", she said.
Server uptime and page load time are listed among the 10. Nowadays search engines, including Google, try to make their search techniques work like humans, with likes and dislikes similar to human reactions. If a site is slow or regularly unavailable, search engines assign less value to its pages.

These quality guidelines push webmasters to spend their time and energy on a much better user experience.

Monday, September 8, 2008

About HTTP Response Codes and SEO

HTTP response codes can be considered one of the most technical aspects of SEO, yet they are hardly taken care of, states the ZealousWeb Technologies blog. SEO is a way to get your website noticed on the web, and it consists of online and offline techniques. Content is an online SEO technique and is sometimes referred to as King. The review on ZealousWeb, referring to that statement, says: "...but it doesn’t matter how well you write your copy or optimize your pages - if you can’t be indexed, you can’t be found."

The author listed all the known response codes and their meanings. Why, for example, is getting an HTTP 404 response code bad for SEO?
The reaction of the search engine spiders depends on the response codes they get back from the server. A badly configured server sending back the wrong response codes can stop a site from ever being indexed. Here are a couple of examples from the post:
  • The server always returns the response code 404 - Some badly programmed scripts that give sites “search engine friendly URLs” return 404 values instead of 200. In this case, the search engines won’t index these pages at all.
  • The server never returns 404, even when a page is not found – If you type a wrong URL, then ideally you should get a page telling you the content cannot be found. But if this page doesn't return a 404 code, the spider will assume the page is OK. So whenever you remove a page, or content on your site expires, the page will still be indexed in the search engines, but with the “Sorry, this page cannot be found” text instead of the original content. This page could compete with your other pages in the search results, and it creates unnecessary duplicate content throughout your listings.
  • The server redirects pages using 302, not 301 – Suppose a special campaign has a short URL like “/discountoffer/” that redirects to another page on your site like “/rates/”. It should use a 301 redirect to tell the search engines that the real page is /rates/, not /discountoffer/. Using a 302 will confuse the search engines, as you are saying “The real page is /discountoffer/ and /rates/ is just a temporary page”. This makes it hard for /rates/ to be listed properly in the search results.
Response codes can have a quite drastic impact on your search results, the author says.
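These misconfigurations can be detected from outside the server. Here is a sketch in standard-library Python: redirects are disabled so the raw 301/302 is visible, and a small rule function flags the three cases above. The rule wording and the `page_exists` flag are my illustration, not part of the original post:

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so we can see the raw 301/302."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def raw_status(url):
    """Return the first status code the server sends for a URL."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url, timeout=10).status
    except urllib.error.HTTPError as e:
        return e.code    # 3xx/4xx/5xx land here once redirects are disabled

def seo_issue(status, page_exists):
    """Flag the misconfigurations described above (a simplified sketch)."""
    if page_exists and status == 404:
        return "existing page returns 404: it will never be indexed"
    if not page_exists and status == 200:
        return "missing page returns 200: 'soft 404' duplicate content"
    if status == 302:
        return "temporary redirect: use 301 so the target page gets indexed"
    return None
```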

Server monitoring will help you discover these issues and take corrective action.

Wednesday, September 3, 2008

Webmaster Unlimited addresses the need for monitoring multiple hosts per server

A new post has been published on the Monitis Blog about the possibility of monitoring hundreds of hosts per server without limits. Webmasters are likely to be interested in a cheap, fixed offer of $298/year. It adds up to significant savings if you consider monitoring 200 URLs or more a month with 1-60 minute intervals and notification thresholds.

Monday, September 1, 2008

Uptime Trends in Social Sites

Social networking is one of the most productive ways to promote your site, interact with others and gain some valuable experience. And if you are a sociable and interesting person, this will be a real benefit for you. Day by day these social sites are experiencing tremendous audience growth.

When you find something that fascinates you, it can help broaden your knowledge. You can then share it with other people or friends on the web. For me, I simply enjoy my time on these sites, getting useful information about everything in the world. By the way, I'm new to this sphere, but I must say I like it.
I did a small study on the uptime and performance of the social media sites I was using a lot during my work. I noticed that sometimes they run slowly or are not accessible, so I decided to monitor them using mon.itor.us (a free website monitoring tool). I added the following sites to the monitoring (LinkedIn, Simpy, Tagged, BlinkList, Reddit, Furl, Wists, Connotea, 2collab, Sphinn, LiveJournal, Technorati), but will talk about the ones I use frequently.
I put the sites on monitoring on August 5, 2008 and collected results for 3 weeks.
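As a sketch of how per-site averages like the ones below could be computed (assuming each monitoring sample records a success flag and a response time; this data format is my illustration, not Mon.itor.Us's):

```python
def summarize_checks(checks):
    """Summarize three weeks of monitoring samples for one site.
    checks: list of (ok: bool, response_ms: float) tuples."""
    ups = [ok for ok, _ in checks]
    times = [ms for ok, ms in checks if ok]   # only successful checks have a time
    return {
        "uptime_pct": round(100 * sum(ups) / len(ups), 2),
        "avg_response_ms": round(sum(times) / len(times), 1) if times else None,
    }
```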

Here are the results...

Digg has 382,792,751 - Inbound Links, 22,578,945 - Compete Monthly Visitors
It is one of the best social bookmarking sites, where you can share content you like with a broader audience of Digg users.
And you can easily send out a "shout" to all of your friends, which will in turn bring you some additional diggs to help push your post to the top.
Being popular on Digg means Dependable traffic increases for the website you submitted.
Now let us look at Digg's numbers:
1) Average uptime 99.88%
2) Average response time 1028.4

StumbleUpon 19,050,177 - Inbound Links, 1,313,586 - Compete Monthly Visitors
This is one of my favorite sites; I like stumbling upon things and I'm keeping my thumbs up.
With StumbleUpon I connect with friends and share my discoveries: videos, photos and more, based on my interests.
You meet people that have similar interests. Its community is lovely and valuable. I like StumbleUpon's features, especially how it tracks my visitors and shows my friends' latest updates in “What's New.” There are lots of things to do on this site, and I'm just beginning to surf the internet with StumbleUpon.
This month StumbleUpon's numbers:
1) Average uptime 99.97%
2) Average response time 278.4

On August 26 Facebook hit 100 million.
Now 100 million people around the world use Facebook to keep up with friends, upload photos, and share links and videos.
Facebook is a good and trusted place for me to share my pictures, videos and interesting posts, and simply interact with my friends.
Among these 100 million people, I enjoy my visit to the site every time. I nearly forgot to mention the applications: there are so many fun applications, and many remind me of my childhood games.
1) Average uptime 99.69%
2) Average response time 397.6

Delicious has 171,593,051 Inbound Links and 1,699,128 Compete Monthly Visitors, and offers a combined view of everyone's bookmarks with a given tag.
For SEO specialists and Web 2.0 users it is essential to bookmark in Delicious. It gives you the opportunity to subscribe to particular tags of users. So if you are interested in only part of a user's bookmarks, you can subscribe directly to that tag's feed.
1) Average uptime 99.74%
2) Average response time 173.6

Propeller has 997,000 - Inbound Links and 1,454,912 - Compete Monthly Visitors
It is a social news portal site where you can submit your story and many people will view your story if it is interesting and unique. There are also cool groups which you can join.
But... I liked the old version more. Google used to index my posts very quickly, which I don't see now, even after several days. And I also don't like Propeller's new design.
Anyway, here are Propeller's uptime and response time:
1) Average uptime 99.29%
2) Average response time 174.64

MySpace is a popular social networking website with more than 100 million users and 230,000 new users per day.
I have my profile there; it offers an interactive, user-submitted network of friends, personal profiles, blogs, groups, photos, music and videos. But the trend there is more toward socializing than business usage, in comparison with Facebook or StumbleUpon.
1) Average uptime 99.74%
2) Average response time 276.3

Author: Kristina Frangulyan
My Profiles:

My social sites are benchmarked in Mon.itor.Us Benchmarks.

I also included screenshots from Slide.com on Social Sites' daily uptime results, and video on my monitoring dashboard from YouTube.
