Friday, January 17, 2014

Karma - Creative Landing Page Blogger Templates





Karma is an advanced landing page template for Blogger that provides lots of incredible features. Because the theme is so feature-rich, it may be a little difficult for the average user to configure. It is one of the best Blogger templates we have released to date.


What will you get?


XML template file + a link to a video tutorial on how to install this template


Image Slider

This template comes with a highly customizable image slider. We used FlexSlider as a base and then added our own CSS to give it a different look.


Show Off Your Product

This theme can definitely help increase your product sales; you can use many awesome fonts and icons to show off your product's features.


Key Features




  • Image Slider: You can add any number of images to the slider; all you have to do is replace the existing image URLs with your own.
  • Animation: Elements animate as you scroll, which makes the theme more attractive.
  • Blog: We have added a separate blog tab to this theme, which gives it a professional look.
  • Lots of Ready-Made Icons: We have added the full Font Awesome icon set to this theme; watch the video above to learn how to use it in your template.

  • Video Tutorial: How To Set Up This Template

    If you like this Blogger template, you can support us by clicking the ads in the sidebar or above the post. Thanks!

    Free download. Like and share if it's useful to you. Thanks!

    Thursday, January 16, 2014

    Bpress - Magazine Professional Blogger Template





    BPress is a responsive, highly user-friendly Blogger template designed for premium magazine-style Blogger blogs.

    What will you get?


    Theme + documentation

    Responsive Test


    Easy Custom Install





    Key Features

    • Responsive Template Design
    • Auto Blog Post Summaries
    • Auto Image Crop With Thumbnails (Home Page and Label Pages Only)
    • CSS- and HTML-Based Menu With Dropdowns
    • Auto Date Display at Header Right Side
    • Tabbed Widgets Ready (Powered by jQuery)
    • Image Slider (Powered by jQuery)
    • Simple, User-Friendly Search Box
    • Auto Pagination Added to Post Footer
    • Simple, User-Friendly Social Icons Placed at Header Right Side
    • Social Share Widget (Placed in Post Footer)
    • Custom 404 Error Page With Search Box (Official 404 Page)
    • Related Posts Widget (Placed in Post Footer)
    • Ad Banner Ready
    • Meta Keywords and Description Support (Official Meta Description Support)
    • Modern Recent Posts by Label
    • Quick Message Option (Using the Official Contact Form)
    • 4-Column Footer Area
    • Flickr Image Gallery
    • Official "Follow by Email" Widget

    If you like this Blogger template, you can support us by clicking the ads in the sidebar or above the post. Thanks!

      Free download. Like and share if it's useful to you. Thanks!



    Monday, January 13, 2014

    Google Analytics: Understanding and Lowering Bounce Rates



    Understanding and Lowering Bounce and Exit Rates

    Google Analytics provides valuable intelligence into how visitors find, interact with and leave your website. This intelligence is central to improving both user experience and the profitability of your website. Google Analytics provides many useful metrics that help you do this and two of the most useful are bounce rate and exit rate.
    The difference between a bounce and an exit can be confusing, especially if you are new to analytics. The goal of this article, then, is to demystify the two and explain why they are important. It also acts as a guide to interpreting bounce and exit data and how to lower them in order to improve the performance of your website.

    Making An Entrance That Counts

    Before you can understand and calculate bounce rate you need to know a little about entrance pages, also referred to as landing pages and entry pages. Google defines an entrance page as:
    • Entrances: This metric identifies the number of entrances to your site. It will always be equal to the number of visits when applied over your entire website. Thus, this metric is most useful when combined with particular content pages, at which point, it will indicate the number of times a particular page served as an entrance to your site. Source
    In short, an entrance page is the first page a visitor lands on when visiting a website. Entrances are, as we will see, a key factor in calculating bounce rate.

    What Is A Bounce?

    A bounce is a single-page visit. A bounce occurs when a visitor enters and exits a website without viewing any page other than the entrance page.

    What Is Bounce Rate?

    If, for example, 100 visitors enter your site via page "A" and 20 of them leave without clicking through to any other page, page "A" would have a bounce rate of 20%.
    Fig 1: Site Wide Averages
    Some of the reports Google Analytics generates will give site wide averages. The screen grab above has been taken from the Top Content report which can be found by clicking the Content tab in your Google Analytics dashboard.
    The first thing you might notice is that when you add the average bounce rate and the average exit rate together, the result is greater than 100%. If bounce rate and exit rate are measures of how many people leave your site, how can the total be greater than 100%? The answer is that it can't.
    You might be fooled into thinking that bounce rate is calculated as a percentage of pageviews, a logical thought since pageviews figure in the report. However, when added together, bounces and exits would again be greater than the total pageviews.

    Bounce rate is not based on the number of visitors or the number of pageviews; it's based on entrances.

    Will The Real Bounce Rate Please Step Forward

    Fig 2: Sitewide Entrances And Bounces
    To get at the real numbers that contribute to bounce rate you need to dig a little deeper. The screen grab above has been taken from the Top Landing Pages report which can also be found by clicking the Content tab in your Google Analytics dashboard.
    As you work your way down the report you can also view bounce rates for individual pages.
    Fig 3: Bounce Rate At Page Level
    The Top Landing Pages report helps to identify pages with high bounce rates that might require further investigation.
    You can clearly see from Figure 3 how bounce rate is calculated for a single page: (283 bounces / 303 entrances) * 100 ≈ 93.3993%, which Analytics has rounded to 93.40%. As interesting as this is, it tells us nothing about what is driving the bounce rate or what steps, if any, are required to lower it.
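The calculation is easy to reproduce. A minimal Python sketch, using the figures from the report above:

```python
def bounce_rate(bounces, entrances):
    """Bounce rate is a percentage of entrances, not of pageviews or visits."""
    if entrances == 0:
        return 0.0
    return bounces / entrances * 100

# Figures from the Top Landing Pages report above
print(round(bounce_rate(283, 303), 2))  # prints 93.4
```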

    Bounce Rate Through The Looking Glass

    Pages that fail to meet visitor expectations, don't provide clear navigation, talk about features rather than benefits, or offer content that isn't actionable all increase bounce rate. Not all visitors to your site are using desktop machines with ultra-fast connections, and many will abandon your site if a page takes too long to load. If you have been over-zealously linking to your site, links from pages that are not closely related can also increase bounce rate. These are all things you can test for and fix, at least to a degree.

    Missing Timestamps And The Pages Time Forgot

    Google Analytics reports the time visitors spend on pages by comparing timestamps. When a visitor lands on a page a timestamp is created which records the precise time they arrived.
    If a visitor arrives at page “A” at 13.45 and clicks through and lands on page “B” at 13.47 two timestamps will be created. By subtracting the time the visitor lands on page “A” from the time they land on page “B” you arrive at time spent on page “A”:
    13.47 – 13.45 = 2 minutes spent on page “A”.
    If at 13.50 the visitor leaves your site completely no timestamp is created and there is no way to tell how long the visitor spent on page “B”.
    Why was no timestamp created? If the page was outside the scope of your Analytics account (on another domain, for example), the timestamp can't be accessed by your account. Therefore, the time spent on that page can't be determined for that page view.
    Similarly, the time spent on a page by visitor who enters a site and bounces without visiting any other page cannot be measured either.
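The timestamp arithmetic described above can be sketched as follows (the times are the ones from the example; this is an illustration, not how Analytics actually stores its data):

```python
from datetime import datetime

# Timestamps recorded as the visitor lands on each page
landed_on_a = datetime(2014, 1, 13, 13, 45)
landed_on_b = datetime(2014, 1, 13, 13, 47)

# Time on page "A" = timestamp for "B" minus timestamp for "A"
time_on_a = landed_on_b - landed_on_a
print(time_on_a)  # prints 0:02:00

# The visitor leaves the site at 13:50, but no timestamp is recorded
# for leaving, so the time spent on page "B" cannot be determined.
```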

    Cookies, Sessions And Timeouts

    Google Analytics uses cookies to track the activity of visitors to your pages and report those activities back to its server. Cookies enable Google to distinguish the activities of each visitor individually and track the sequential page visits made by the same user during their time (session) on your website. This information is then reported back to you when you log into your Google Analytics account.
    Every bounce or exit is the result of a session timeout. In Google Analytics, a session times out after 30 minutes of browser inactivity. If a visitor navigates to another website, the session will still continue for a maximum of 30 minutes before a bounce or exit is registered. As long as the visitor returns before the session times out and clicks through to another page of your website, the visit will not be counted as either a bounce or an exit.
    • Each and every visit to your site culminates in a session timeout.
    • A session that times out after a single page view is classed as a bounce.
    • A session that times out after multiple page views is classed as an exit.
    Have a look at the tabs open in your browser right now - how many have been open for more than 29 minutes without any activity? Even though the page is still open in your browser, some of the sessions associated with individual pages might have already timed out causing a bounce or exit. Similarly closing your browser, hitting the back button or disconnecting from the internet will all cause a session timeout that will likely be recorded as an exit or a bounce in someone’s Analytics.
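The bullet rules above amount to a tiny classifier. A sketch of that logic (my own summary, not Google's implementation):

```python
def classify_timed_out_session(pageviews):
    """Classify a session that has timed out, per the rules above:
    one pageview means a bounce; multiple pageviews mean an exit."""
    if pageviews < 1:
        raise ValueError("a session has at least one pageview")
    return "bounce" if pageviews == 1 else "exit"

print(classify_timed_out_session(1))  # prints bounce
print(classify_timed_out_session(4))  # prints exit
```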
    The next article in this series, Bounce Rate, Dwell Time And Panda, delves deeper into bounce rate and how dwell time influences post-Panda search rankings.

    Bounce Rate, Dwell Time And The Panda Update


    In a previous article about bounce rate, Understanding and Lowering Bounce Rates, I laid out the basics of how bounce rate is calculated and how to find the bounce rate of specific pages. This time we are going to dig a little deeper and discuss why Panda has amplified the importance of lowering bounce rate.

    First I want to make a couple of things clear. It’s highly unlikely that search engines use bounce rate directly when scoring or ranking webpages. Nor is a high bounce rate a definite signal of low quality or a failure to meet visitor expectations or needs.

    Something To Dwell On

    A high bounce rate could be the result of a page that does exactly what it sets out to do, or one that completely fails. News sites, sites that provide "dip in" resources or tutorials, article sites like Ezine Articles, and content farms including HubPages tend to have a naturally high bounce rate. These are sites that can satisfy visitor needs with a single page visit.
    If search engines don’t use bounce rate and it isn’t necessarily a signal that indicates poor user experience, why is it so important to lower it?
    There is a new metric, one that Google Analytics won’t reveal to you and it’s a big contributor to sites being Panda-lized. It’s a Key Performance Indicator like no other and it’s here to stay. It’s called Dwell Time, and by lowering bounce rate and increasing time on site you can keep your site safe and out of harm’s way.

    What Is Dwell Time?

    Dwell Time is a measurement of how long a visitor spends between entering your site and leaving. On the face of it, it sounds similar to bounces and exits. The problem with both these metrics is that they don’t report how long someone was on a page prior to bouncing or the time spent on the last page of a visit prior to an exit.
    There is a strong correlation between Dwell Time and engagement. Dwell Time has been used for some time in the calculation of AdWords Quality Score; a short dwell time is a strong signal that tells search engines that the landing page lacks relevance and quality.
    The result of a short Dwell Time within AdWords is increased ad costs, in organic results it means rankings will tank.
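Conceptually, dwell time is just the gap between two events a search engine can observe from its own results pages: the click through to your page and the return to the results. A sketch with made-up times:

```python
from datetime import datetime

# Hypothetical events observed from the search results page
clicked_through = datetime(2014, 1, 13, 10, 0, 0)
returned_to_serp = datetime(2014, 1, 13, 10, 0, 8)

dwell_seconds = (returned_to_serp - clicked_through).total_seconds()
print(dwell_seconds)  # prints 8.0

# A return within seconds suggests the page failed to engage the visitor;
# no return at all, or a long gap, suggests the visit was satisfying.
```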
    At this time there are no search engine algorithms that can accurately distinguish high-quality content from the humdrum. Search engines are getting better in areas such as discerning natural language patterns; however, true qualitative assessment is still a long way away.
    Panda has seen Dwell Metrics evolve beyond measuring the length of time consumers spend engaging with online ads to a measurement of engagement between all web users and every kind of content whether it be informational or transactional.
    • While it may feel like you’ve poured your heart and soul into creating the content on the website, quality is in the eye of the visitor, and short page dwell times can indicate the content is not capturing the visitor’s interest. Something about the content is not grabbing their attention. 

      Source: Duane Forrester, Sr. Product Manager at Bing

    Bouncing Back To Full Health

    Hopefully, analytics packages will evolve to let us see Dwell Time. In the meantime any measure that can be employed to improve engagement and increase the time visitors spend interacting with our content is essential.
    The next article in this series will look at techniques to lower bounce rate, increase user experience and extend user interaction time.

    What Is Anchor Text And Why Is Anchor Text So Important?


    What is anchor text? Why is anchor text important? Didn’t I tell you already?

    During my time online I have written literally thousands of articles that relate to Search Engine Optimization in one way or another. These articles regularly use technical or industry-specific terminology that is commonplace to me. There has, however, been an assumption on my part that my intended readers are fully conversant with the jargon I use when writing. One example is "Anchor Text".

    Everyone’s Getting Hyper

    HTML is an acronym of HyperText Markup Language. HyperText Markup Language is used in conjunction with CSS to create webpages and tell Web browsers how to display them. Links, or Hyperlinks to give them their Sunday name, connect one piece of HyperText (a web page, for example) to another piece of HyperText.

    Anchors Away!

    Hyperlinks connect one web page to another and make the web navigable. Anchor text is the visible clickable text of a hyperlink that is often differentiated from other text by being underlined, a different color, or bold. Anchor text also provides useful information to users and search engines about the page being linked to.
    Code Example:
    <a href="http://seo-bloggertemplates.blogspot.com/">Free Download Blogger Templates</a>

    Cracking The Code

    The above example shows how a link is coded in html. It opens and closes with an anchor tag, which is represented in HTML as, <a> for the opening anchor tag, and </a> for the closing anchor tag. An anchor tag marks the beginning (<a>) and the end (</a>) of a hypertext link.
    The opening <a> tag contains a href attribute - href is an acronym for Hypertext Reference. (Although there is no explanation found in the W3C Recommendations as to the true meaning of href, Hypertext Reference is the most widely accepted interpretation.) The href attribute tells your browser which web page to open when the link is clicked on.
    Next comes the anchor text itself; in this case it lets people know that clicking on the link will take them to a page of free Blogger templates (Free Download Blogger Templates).
    Finally, we have the closing anchor tag (</a>). Between the <a> and </a> tags we have told the browser what page to open and told the user what to expect once the page opens.
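To see those pieces programmatically, here is a small sketch using Python's standard html.parser module to pull the href and the anchor text out of the example link above:

```python
from html.parser import HTMLParser

class AnchorExtractor(HTMLParser):
    """Collects the href attribute and visible anchor text of an <a> element."""
    def __init__(self):
        super().__init__()
        self.href = None
        self.anchor_text = ""
        self._inside_anchor = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._inside_anchor = True
            self.href = dict(attrs).get("href")

    def handle_endtag(self, tag):
        if tag == "a":
            self._inside_anchor = False

    def handle_data(self, data):
        if self._inside_anchor:
            self.anchor_text += data

link = '<a href="http://seo-bloggertemplates.blogspot.com/">Free Download Blogger Templates</a>'
parser = AnchorExtractor()
parser.feed(link)
print(parser.href)         # prints http://seo-bloggertemplates.blogspot.com/
print(parser.anchor_text)  # prints Free Download Blogger Templates
```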

    Why Is Anchor Text So Important?

    One reason, which I have already touched on, is that anchor text helps people understand where the link will take them and what to expect when they get there. You probably found your way to this page via a link. Whether that was from my profile page, a link from another of my articles, through HubPages internal navigation or from a search engine, the anchor text was most likely an important factor in getting you here. In this sense a link, and its anchor text, can serve to pre-sell your content to your target readers and significantly increase page views.
    From a search engine optimization point of view, links and their associated anchor text are extremely influential in gaining rankings. Just as it does for people, anchor text tells search engines what pages are about, and it is therefore a major SEO consideration.
    You've probably read time and time again that search engines treat a link as a vote for the page they point to. That's perfectly true, however, it's the associated anchor text that gives meaning and context to the link.
    Bad Anchor Text Example
    To read my article about understanding and lowering bounce rate Click Here.
    A Better Anchor Text Example
    Read my article about understanding and lowering bounce rates.
    In the bad anchor text example the anchor text was set to "Click Here". Humans might "get it", but from a search engine's point of view the page would simply be a good resource to send people searching for "Click Here". This type of linking has become known as mystery meat navigation because of its lack of meaning and context. Incidentally, Adobe has held the number one spot for "Click Here" for as long as I can remember.
    In the better anchor text example both search engines and humans can easily determine what the target page is about. Search engines rely on the anchor text they find in your site navigation and external links in order to categorize and score your pages correctly. Although link building is a very broad subject and beyond the scope of this short article, a great place to start is ensuring that the anchor text used in your website's navigation is clear, concise and intuitive.

    On Page SEO Part 2: An Introduction To Signals of Quality


    In the previous tutorial we looked at some basic on-page factors, including the alt attribute. It was suggested that every img tag should have an alt attribute even if the image it refers to is entirely decorative. These changes might at first seem a bit pedantic; however, they make for better accessibility and standards-compliant HTML.


    Ensuring pages are accessible and standards-compliant can mean a lot of work for webmasters trying to rectify things after a site has gone live, especially if every page contains multiple HTML errors. So is it worth all the bother? The simple fact is that accessible sites are generally more search-engine friendly and can be viewed on a wider selection of devices and browsers.
    Making sure that every piece of HTML on every page validates and meets current accessibility standards is a signal that a business cares about every single visitor to its website. Spammers using 'throwaway domains' are more likely to shy away from this type of work because of the labor, time and expense involved.
    Signals of quality are rarely about relevance. For example, it's easy to understand why allowing a page to go live as an 'untitled document' would harm relevancy; it's not so obvious why including a telephone number would increase search engine rankings.
    There is a distinct difference between quality and relevance, and search engines must necessarily balance both aspects in order to deliver the best results. The task of identifying quality is becoming increasingly important due to the amount of low-quality content being uploaded to the web every day.

    Bayesian Filters

    Bayesian filtering is used by most modern mail clients as a means to weed out spam from legitimate email. Search engines use it to categorize documents, and Google uses it to deliver relevant AdSense ads. How do Bayesian filters work? The process starts with a list of sites that have been classified as high quality and another list classified as low quality. The filter looks at both and analyzes the characteristics common to each type of site.
    Once the filter has been seeded and the initial analysis completed, it can be used to analyze every page on the web. The clever thing about Bayesian filters is that they continue to spot new characteristics and get smarter over time. Before we delve into any great detail on how Bayesian filters work, here are a couple of quotes from Matt Cutts regarding signals of quality that clearly show Google is addressing the problems caused by low-quality, mass-generated content.
    “Within Google, we have seen a lot of feedback from people saying, Yeah, there’s not as much web spam, but there is this sort of low-quality, mass-generated content . . . where it’s a bunch of people being paid a very small amount of money. So we have started projects within the search quality group to sort of spot stuff that’s higher quality and rank it higher, you know, and that’s the flip side of having stuff that’s lower-quality not rank as high.”
    “You definitely want to write algorithms that will find the signals of good sites. You know, the sorts of things like original content rather than just scraping someone, or rephrasing what someone else has said. And if you can find enough of those signals—and there are definitely a lot of them out there—then you can say, OK, find the people who break the story, or who produce the original content, or who produce the impact on the Web, and try to rank those a little higher. . . .”
    There has been mention of signals of quality in Google patents, and some specifics have been discussed by Google engineers, so hopefully the days of article mills and article spinners are numbered.

    How Bayesian Filtering Works

    Although it is known that search engines use Bayesian filtering, the exact algorithms are of course proprietary and unlikely to be made public; however, the actions of Bayesian filters are well understood. So let's start by looking at how Bayesian filtering works.
    To begin, a large sample or white list of known good documents (authoritative, highly trusted pages) and a large sample of known bad documents (pages from splogs, scraper sites, etc.) are analyzed and the characteristics of each page compared. When a large corpus of documents is compared programmatically, patterns or 'signals' emerge that were hitherto invisible. These signals can then be used to provide a numeric value (or percentage likelihood) of whether the characteristics of other pages lean towards those of the original sample of good documents or those of the original sample of bad documents.
    A simple example would be to compare the words in the good documents to those in the bad documents. If it is discovered that many low-quality pages use terms like 'buy cheap Viagra' or have a section on each page for 'sponsored links', then other pages that do the same might be of low quality too. Conversely, if it is discovered that high-quality pages often contain a link to a privacy policy or display a contact telephone number, then other pages that do the same might also be high quality.
    As the process continues, more signals are uncovered. In this way the filter learns to recognize other traits and whether they are good or bad. There are likely to be many signals of quality measured, each one adding to or subtracting from an overall score of a page's quality.
    This means that SEOs, web designers and webmasters need to adopt a holistic approach that takes into account information architecture, relevancy, accessibility, usability, quality, hosting and user experience.
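To make the idea concrete, here is a toy naive Bayes scorer. The signal names, probabilities and prior are entirely illustrative (not Google's actual figures); they just show how per-signal likelihoods combine into one quality score:

```python
# P(signal present | good page), P(signal present | bad page) - made-up values
SIGNAL_LIKELIHOODS = {
    "privacy_policy_link":  (0.80, 0.10),
    "contact_phone_number": (0.60, 0.05),
    "buy_cheap_viagra":     (0.01, 0.70),
}

def probability_good(page_signals, prior_good=0.5):
    """Naive Bayes estimate that a page is high quality given its signals."""
    p_good, p_bad = prior_good, 1.0 - prior_good
    for signal, present in page_signals.items():
        p_if_good, p_if_bad = SIGNAL_LIKELIHOODS[signal]
        p_good *= p_if_good if present else 1.0 - p_if_good
        p_bad *= p_if_bad if present else 1.0 - p_if_bad
    return p_good / (p_good + p_bad)

trusted = {"privacy_policy_link": True, "contact_phone_number": True, "buy_cheap_viagra": False}
spammy = {"privacy_policy_link": False, "contact_phone_number": False, "buy_cheap_viagra": True}
print(round(probability_good(trusted), 3))  # close to 1.0
print(round(probability_good(spammy), 3))   # close to 0.0
```

Each new signal the filter learns simply becomes another entry in the likelihood table, which is why such filters get smarter as the corpus grows.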

    The Link Structure of The Web

    Although links will be covered in future tutorials, it makes sense to discuss some of the implications of recent changes in the link structure of the web now. Once upon a time, reciprocal links were all that were needed to achieve top search engine rankings. Because reciprocal links were easy to acquire and made it easy to promote sites of lesser quality so that they outranked quality sites, search engines stepped in and devalued reciprocal links along with PageRank.
    One-way links were now the way to go, so a new market in selling one-way links emerged. Search engines again viewed this as a way to game the system, and paid links, if detected, were devalued so that they passed no value whatsoever. The nofollow attribute was implemented so that, among other reasons, links could be sold without penalty. The nofollow attribute has also been adopted for other reasons and is used on millions of blogs and some of the most popular social sites.
    URL shortening is also popular and, again, is used by some of the most popular sites on the web. The upshot of all this is that although the web continues to grow, the ability of many millions of pages to link out and cast a vote for other pages has been removed. Of course, you still get the traffic, which can be substantial if you make the front page of Digg. Because the link graph of the entire web is essentially in recession, search engines are again reevaluating the way they calculate rankings, and quality has many discernible signals.

    The Need To Discern Quality

    According to a study carried out by WebmasterWorld, the top 15 doorway domains are a haven for spam. The study analyzed popular search terms and discovered that more than 50% of the results were spam; 77% of the results from blogspot.com were found to be spam. The following list shows the level of spam found on the top 15 doorway domains:
    Doorway domain (spam %):
    • sitegr.com: 100%
    • blog.hix.com: 100%
    • blogstudio.com: 99%
    • torospace.com: 95%
    • home.aol.com: 95%
    • blogsharing.com: 93%
    • hometown.aol.de: 91%
    • usaid.gov: 85%
    • hometown.aol.com: 84%
    • maxpages.com: 81%
    • oas.org: 78%
    • blogspot.com: 77%
    • xoomer.alice.it: 77%
    • netscape.com: 74%
    • freewebs.com: 52%
    The study shows that, on the keywords tested, some of these blogs are used exclusively by spammers, while others had a very high percentage of spam. The reason for this is that these sites provide free blog space, which is a magnet for spammers who need to generate links to low-quality splogs or scraper sites quickly.
    The next list compares the percentage of spam sites by top-level domain (TLD):
    TLD (spam %):
    • .info: 68%
    • .biz: 53%
    • .net: 12%
    • .org: 11%
    • .com: 4%


    This research highlights the incredible amount of spam that exists on the web, but it would be unfair to penalize every .info domain, for example, just because a high percentage of .info domains are used by spammers.
    Conversely, it would be unwise to trust every .com even though, in general, they seem to be comparatively spam-free. To discern quality, many signals have to be considered, covering every aspect of a website.
    The next tutorial in this series will look at on-page signals of quality and why quality score is the new PageRank.