Posts

Social Media - 10 Mistakes People Make On Twitter

Let us look at the most common mistakes people make while creating and using a Twitter account.

1. Creating multiple accounts, which often divides your followers. Instead, create one Twitter handle (id) for your account and manage it thoroughly.
2. Ignoring the biography. Customize it, and be specific in branding your products or services.
3. Following large numbers of people indiscriminately. Instead, follow people in your space who tweet useful information.
4. Talking too much about yourself. Instead, talk to your consumers and value them; talk about your products and services.
5. Ignoring the hashtag (#). A hashtag actually gives you visibility, so make full use of it and, wherever possible, include a relevant hashtag in every tweet.
6. Tweeting too much, so that your tweets get lost in space and don't get the attention they deserve. It's better to stick to a maximum of 5 tweets per…

SEO - 4 Reasons Why a Website Isn’t Working to Its Potential

In this post, we will try to understand why a website may not be working to its potential. When a website is running at 100% efficiency but sales or traffic go down, business teams start blaming SEO. However, one must understand that search engine optimization is a process that requires a lot of effort, and there are many areas which need to be maintained properly in order to make things work. SEO is about delivering inbound traffic to a website. Once the website starts receiving traffic, it depends on the brand, the website, and the product offering to convert the visitor or make a sale. Let us have a look at the reasons why traffic might not be converting:

1. Poor Website Layout
Visitors are unable to find relevant information on the web page, or the layout of the website is poor and doesn’t flow properly. Under such circumstances, visitors are more likely to move on to some other website where they can find the information they are looking for.

2. An Obsolete Web Design…

SEO - On-page/site SEO Myths and Facts

We are all aware that search engines update their algorithms on a regular basis to refine search and deliver better search results. Therefore, it is recommended to optimize the website as per search engine guidelines and algorithms. On-page/site optimization is the most important part of SEO; in other words, SEO begins with on-page/site optimization. On-page/site factors play a major role in enhancing a website's rankings in SERPs. But many of us are not aware of the myths doing the rounds in this field. Hence, in this post, we will look at the prevalent myths and facts about on-page/site optimization.

Website Title: The title of a website/webpage is a vital component of on-site optimization.
Myth: Many SEO specialists believe that stuffing keywords into the title will help fetch higher ranks for the website. This SEO technique was effective until search engines started recognizing SEO spammers and spammed webpages/websites.
Fact: …
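To make the title myth concrete, here is a rough sketch of the difference; the product names and store brand are purely hypothetical:

```html
<!-- Keyword-stuffed title: the pattern search engines now treat as spammy -->
<title>cheap shoes, buy shoes, shoes online, best shoes, shoes sale, discount shoes</title>

<!-- A concise, descriptive title: one primary phrase plus the brand -->
<title>Men's Running Shoes | ExampleStore</title>
```

The second form still carries the target keyword, but reads naturally to both users and crawlers.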

Cross Domain URL Selection

First, let us understand what is meant by cross-domain URL selection: when a search engine finds matching/identical pieces of content on different URLs/websites, it runs an algorithm to select a primary URL among them and removes the others from search results.

What does this mean? Google relies on its algorithm to identify the original, highest-quality URL for the content and to devalue the other URLs carrying the same content, removing them from the search results. The idea behind this is to cut down on duplicate search results and to reduce copycats, scrapers, and other unethical content websites. Through this, users benefit by getting better, more diversified search results, not just 10 different URLs saying the same thing.

What did Google say about it? There are numerous possible causes of cross-domain URL selection, which may include:
- Duplicate content
- 301 redirects
- Configuration mistakes
- Incorrect canonicalization
- Misconfigure…
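On the canonicalization point above: when the duplication is intentional (for example, syndicated content), one common way to tell search engines which URL you consider primary is a canonical link element on the duplicate page. A minimal sketch, with a placeholder domain:

```html
<!-- Placed in the <head> of the duplicate/syndicated page -->
<link rel="canonical" href="https://www.example.com/original-article" />
```

Getting this tag wrong (or omitting it) is exactly the kind of configuration mistake that can cause Google to pick a URL you did not intend.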

SEO - Importance of Google Webmaster Tools

Let me ask you a few questions about your website. Do you know what is happening on your website? Do you know your website statistics? Are there any areas the bots cannot go through? Do search engines (specifically Google) see the website the way you want them to? If you do not have the answers to these questions, then it's time to learn about and take advantage of Google Webmaster Tools.

Google Webmaster Tools gives information about Overview, Diagnostics, Statistics, Links, Sitemaps, and Tools. It helps you understand the performance of any website.

Top queries: Find the top queries that drive traffic to the website and where the website is included in the top search results. This lets you learn how users are finding your website and what the possible search queries are. It also shows impressions and CTR (Click-Through Rate) for your top search queries.

Indexing information: Find out how the website is indexed and which of the web pages are included in the…

SEO - Add Images to XML Sitemap

Sitemaps are an invaluable resource for search engines, and so are the images placed on your website. Through Sitemaps, you can now give Google information about the important images present on your website using a simple piece of markup. Adding images to your Sitemap is quite simple: just follow the instructions in the Webmaster Tools Help Center. Google indexes billions of images and sees hundreds of millions of image-related queries every day. It's high time you took advantage of this traffic; all you have to do is update your Sitemap file with information about the images on your site.
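A minimal sketch of what such a Sitemap entry looks like, using Google's sitemap-image namespace (the URLs below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- The page that contains the image -->
    <loc>http://www.example.com/sample.html</loc>
    <!-- One image:image entry per important image on the page -->
    <image:image>
      <image:loc>http://www.example.com/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

Each `<url>` entry can list several `<image:image>` blocks, one for every image you want Google to know about on that page.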

SEO - 3 Important Steps to Improve Crawlability

Let us have a look at the 3 important steps that can help improve the crawlability of any website.

1. Make a robots.txt file
A robots.txt file will prevent the robot from crawling:
- web pages with sensitive material,
- web pages that you don't want to be found through any search engine,
- web pages that are not important or could have a negative effect on rankings.
It helps to keep the bot away from anything that's not good for the website's search engine rankings. Yep, just tell it not to go here or there, and it's all done. A robots.txt file can be created through Google's Webmaster Tools.

2. Make different paths to reach a page
Strong interlinking of the web pages is required; it enhances the way bots can find any web page. Strong interconnection of web pages helps increase the crawl frequency.

3. Fix broken links
A broken link is one that has some element incorrect or missing from the link's HTML code…
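Step 1 above can be sketched as a small robots.txt file; the directory paths here are hypothetical examples, and the rules only keep out crawlers that honor the file:

```text
# Applies to all crawlers
User-agent: *

# Keep bots out of sensitive or low-value areas (example paths)
Disallow: /admin/
Disallow: /private/

# Everything not listed above remains crawlable
```

The file lives at the root of the domain (e.g. at the path /robots.txt), and each Disallow line blocks one URL path prefix for the crawlers matched by the User-agent line.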