Five Essentials for Local SEO Success

Google has been working hard over the past year to refine its search results based on location. Its Pigeon update in 2014 was the first time most webmasters became aware of the importance of local search, and since then most local businesses have been trying to improve their position in local rankings. But aside from creating keyword-based web content, what other steps can you take to improve your website's position? Keep reading for some ideas.

The Old SEO Methods That Can Be Put to an End

If you are aware of Google's Penguin and Panda updates, you should retire the following old SEO methods and update your approach to modern techniques.

  • Article Submissions: This is an age-old method for creating links with little effort. Instead of building links this way, build positive relationships with bloggers, which can create opportunities to write guest posts for quality sites.
  • Link Exchanges and Reciprocal Links: Swapping links with other sites is easy, but these cheap, useless links will do your site no favors. Instead, when you come across quality content on another site, link to it; if you produce quality content yourself, you can expect to be linked to in return.
  • Thin Content Creation: Since great content attracts great links, content creation deserves your full attention. If you outsource your writing, make sure it goes to qualified writers who can produce worthwhile content for your site. Be clear about your purpose: are you writing only for search engines, or do you want users to benefit from the content?

The Worth of NoFollow Links for SEO, Traffic and Leads

It is a truth universally acknowledged among SEO and inbound marketing professionals that not all inbound links have equal worth in the eyes of Google's search algorithm. One inbound link might do little to help a website's search ranking, while another might bring in more than its share of desirable web traffic, helping to generate fresh leads and web authority.
Inbound links (also known as backlinks) are links that direct customers and search engines to your site. They can be divided into two main groups that tell Google how to treat them: 'DoFollow' and 'NoFollow'.
So what’s the difference?

DoFollow Or NoFollow – I Don’t Follow

'DoFollow' and 'NoFollow' are tags that may be added to a link to instruct search engines to treat it a certain way. All links are automatically treated as 'DoFollow' unless otherwise specified. 'DoFollow' links are the bread and butter of SEO: as the name suggests, they tell Google's bots to follow and index the link, incorporating it into the algorithm and passing the site at the other end a little bit of authority. These links can benefit a website's credibility, since being linked to by varied sources implies relative importance in the Internet's vast hive.

The 'NoFollow' attribute, as its name also suggests, instructs search bots not to follow the link to which it is assigned. It was introduced by Google in 2005 in an effort to stop spammers from manipulating search engine rankings by creating masses of inbound links to their websites in blog comments and other public online forums. 'NoFollow' tells Google explicitly not to incorporate the associated link into its rankings. For this reason, 'NoFollow' links are often seen as worthless, but this is not necessarily true.
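
To see the distinction in practice, here is a minimal Python sketch (assuming the third-party requests and beautifulsoup4 packages; the URL is a placeholder) that sorts a page's links into the two groups. A link counts as 'NoFollow' only when its rel attribute contains the nofollow token; everything else defaults to 'DoFollow'.

```python
# Minimal sketch: classify a page's links as DoFollow or NoFollow.
import requests
from bs4 import BeautifulSoup

def classify_links(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    dofollow, nofollow = [], []
    for a in soup.find_all("a", href=True):
        # BeautifulSoup parses rel as a list of tokens, e.g. ["nofollow"]
        if "nofollow" in (a.get("rel") or []):
            nofollow.append(a["href"])
        else:
            dofollow.append(a["href"])  # DoFollow is the default
    return dofollow, nofollow

do, no = classify_links("https://example.com")  # placeholder URL
print(f"DoFollow: {len(do)}, NoFollow: {len(no)}")
```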

So NoFollow Links Aren’t Worthless?

In two words: not entirely. Although Google claims that its search algorithm does not follow these links at all, independent studies have suggested that search bots do follow NoFollow links, but only index the destination page if it is already indexed for other reasons (such as having other DoFollow links pointing to the same page).

Ever since Google released the 'Penguin' update, which introduced measures to tackle common ways its search algorithm was abused to artificially inflate website rankings, it has been public knowledge that link diversity is essential. Not only do NoFollow links diversify your portfolio of inbound links, they also generate traffic, enhancing authority, reach and influence as well as the opportunity to convert leads.
Author: Jordan Kantey from Archetype Copywriting

Using ScrapeBox for best practices in SEO

One of the most frowned-upon tools in the SEO world is ScrapeBox. It is often dismissed as a spam tool, but in my opinion it is not: whether it is used for spam depends entirely on the person using it. At the end of the day, it is a tool that collects data (among many other features), and like any tool it can be used to do "good" things or things that are "not so good".

As you know if you work in SEO, the last few months have changed a great deal about how we build links to our websites. We now need to focus more on the quality of each link than on the quantity, and as the months pass this will become even more important: every link we place can benefit or harm our link building strategy.

In the following guide I'll tell you how I use ScrapeBox and the benefits it has brought me.

ScrapeBox is a tool for scraping Google, automating tasks and building your own footprints (if you don't know what footprints are, there is a full article on them linked here). Many professionals think ScrapeBox is a tool for mass link building and see it as a spam tool. I don't blame them, since many people use this kind of software for black hat practices that I would never recommend.

ScrapeBox is not a spam tool; in fact I use it in my day-to-day white hat tasks, as you will see throughout this guide.

A quick tour of the tool's interface

Before listing everything this tool can do for you day to day, I'm going to show you its interface and briefly explain each of its sections:

Harvester section: here you enter the footprints you want to use to find blogs, domains and other resources. You can also select the "Custom Footprint" option to customize further and get more out of it.

Keywords section: this is where you insert the keywords to combine with the custom footprint.

For example, if we want to find SEO-related blogs hosted on wordpress.com, we would simply enter the following line: site:wordpress.com "SEO". ScrapeBox would then return all the results it finds for this footprint and keyword.

Harvested URLs: in this window you will find all the results obtained from the harvesting described in the previous section.

Select Engines & Proxies: here you select the search engines and insert the proxies you want to use to start scraping information. If you are going to use the tool for link detox work or competitive analysis, I recommend also selecting search engines like Bing and Yahoo so you can compare more information afterwards.

Comment Poster: this option lets you post comments on the sites you previously harvested with the appropriate footprint. I don't recommend using it that way, as it is black hat and such practices will end up getting your project penalized. You can, however, use it to ping your links and get them indexed faster.

Start your link building strategy with the Keyword Scraper option

Let's start with the good stuff: one of my favorite features. The Keyword Scraper option helps you gather numerous related keywords, which you can then add to your list and use to scrape more information. It suggests a host of related keywords, much like Google's own keyword suggestions.

To use this option, open the Keyword Scraper from the Keywords section, enter your seed keywords on the left side, and click the "Scrape" button. Once the task runs, ScrapeBox will start showing you related keywords.

To avoid duplicate or unwanted keywords, look for the button called "Remove": there you can remove duplicates, drop keywords that do not contain the word you want, and use several other very useful options to finish shaping your keyword list.
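
What Keyword Scraper automates can be approximated with Google's suggest endpoint. A rough Python sketch follows; note that this endpoint is unofficial and undocumented, so treat it as an assumption that may change or be rate-limited.

```python
# Rough sketch of what Keyword Scraper does: pull related keywords
# from Google's suggest endpoint (unofficial and undocumented).
import requests

def suggest(seed):
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": seed},
        timeout=10,
    )
    # Response shape: [seed, [suggestion1, suggestion2, ...]]
    return resp.json()[1]

keywords = set()
for seed in ["seo", "link building"]:  # illustrative seed keywords
    keywords.update(suggest(seed))

# The "Remove" options, by hand: the set already deduplicates, and we
# drop suggestions that don't contain the word we care about.
print(sorted(k for k in keywords if "seo" in k))
```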

ScrapeBox: the complete guide to good practice. Now that you've seen some of the most important features of this software, you will see the practical benefits that daily use of this tool provides.

1. Scrape URLs with ScrapeBox's options

This is definitely one of the best scrapers you can find on the internet, if not the best. Once you have your list of selected keywords (remember to choose keywords that relate directly or indirectly to your niche and have significant search volume), enter them in the Keywords section I described above and move on to step 2.

2. Use the Footprints.

Footprints are commands inserted into Google to make searches more specific. To help you understand, let's look at an example:

Imagine you want to find all the URLs Google has indexed that contain the keyword "SEO". To do this we use an operator that narrows the search down to exactly that; it would read: inurl:SEO.

As you can see, footprints are very useful in day-to-day SEO, yet few people use them. I encourage you to start using them and get the most out of your work.

Here are the three search operators you should know:

site: Lists the URLs indexed for a specific domain, or links to a specific domain.
inurl: Shows URLs that contain a previously defined text fragment.
intitle: Shows URLs that contain previously defined text in the title tag.

As additional information, here are some of the advanced search operators that Google itself recommends.
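
To make the combination of footprints and keywords concrete, here is a small Python sketch that generates the query strings you would paste into ScrapeBox's Harvester; the templates and keywords are purely illustrative.

```python
# Build footprint queries by combining operator templates with keywords.
templates = [
    'site:wordpress.com "{kw}"',
    'inurl:{kw}',
    'intitle:"{kw}" "guest post"',
]
keywords = ["seo", "analytics", "marketing"]  # illustrative niche terms

queries = [t.format(kw=kw) for t in templates for kw in keywords]
for q in queries:
    print(q)  # e.g. site:wordpress.com "seo"
```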

Now that you know what search operators or footprints are and what they are for, you will surely know what to do when faced with needs like the following:

  • I need to get links from blogs and bloggers related to my topic.
  • I need to get links from websites where my product and related products can be mentioned.
  • I need to know which platforms I can share my products on.

3. Find good sites for guest posting with ScrapeBox.

Suppose you have a web analytics blog and want to promote your professional services as an analyst and attract customers through it. Some of the actions you should take to achieve your objectives are:

Post specific web analytics content on your blog regularly.
Post on related blogs with good visibility that accept guest posts.
Be present on the best business platforms related to your niche.

I propose a three-month strategy in which you aim to get 30 reviews, with a mix of links (dofollow/nofollow), on authority sites related to your niche.

To do this, use the following combination of footprints:

"Powered by WordPress" + "Google Analytics" + site:.com

Let me explain each part of the input:

"Powered by WordPress": I include this footprint because I have found that most people with a blog related to online marketing run it on the WordPress platform.

"Google Analytics": I use this footprint because I only want back sites that talk about Google Analytics and also run on WordPress.

site:.com: I use this advanced search operator to select only domains ending in .com.

If you want to take it a step further, you can even use the following footprint:

"Powered by WordPress" + "Marketing" + intext:"guest post"

As you can see, this search is more general. It ensures that every result is a WordPress blog that talks about marketing and will very likely have a guest post section or have already published guest articles.

Now that you understand a little about how footprints work, just open ScrapeBox and put everything you've learned into practice.

4. Check the quality of the links found.

Once you've done your keyword research and ScrapeBox has produced your first list of URLs, you should start checking some basic data for each of the links. You should also use the Moz APIs.

STEP 1 >> Work with domains: the strategy I proposed involves guest articles, relationships and authority pages. In this case you should forget about individual articles and focus on the entire domain. To do this, go to the "Trim to Root" option under "Manage Lists" to cut every URL down to its bare domain.

STEP 2 >> Eliminate duplicates: after the previous step, click "Remove/Filter" >> "Remove Duplicate URLs" to eliminate duplicate domains.
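
Together, these two steps amount to a simple transformation. Here is a minimal Python sketch of the same idea (the URLs are hypothetical examples):

```python
# "Trim to Root" + "Remove Duplicate URLs" in miniature: reduce each
# harvested URL to its bare domain, then deduplicate with a set.
from urllib.parse import urlparse

harvested = [
    "https://example.com/blog/guest-post-guidelines",
    "https://example.com/about",
    "http://another-blog.net/2014/05/seo-tips/",
]

roots = {f"{urlparse(u).scheme}://{urlparse(u).netloc}/" for u in harvested}
for root in sorted(roots):
    print(root)  # one line per unique domain
```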

STEP 3 >> Check PageRank: check the PageRank of each URL to make sure this indicator is above 2 or 3, which means the domain has at least a minimum of authority. You can see this in the "Get Domain PageRank" section.

STEP 4 >> Check the Domain Authority (DA): do this with the "Page Authority Addon". Besides domain authority, it also gives you the page authority, MozRank and external link information.
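
The addon pulls these figures from Moz's API. As a sketch of doing the same by hand, the following assumes Moz's v2 URL-metrics endpoint and your own API credentials; the field names follow Moz's published documentation, but verify them against the current docs before relying on this.

```python
# Sketch: query Moz's URL Metrics API (v2) for DA/PA directly.
import requests

MOZ_ACCESS_ID = "your-access-id"    # placeholder credentials
MOZ_SECRET_KEY = "your-secret-key"

def url_metrics(targets):
    resp = requests.post(
        "https://lsapi.seomoz.com/v2/url_metrics",
        json={"targets": targets},
        auth=(MOZ_ACCESS_ID, MOZ_SECRET_KEY),  # HTTP basic auth
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["results"]

for row in url_metrics(["example.com", "another-blog.net"]):
    print(row["page"], "DA:", row["domain_authority"],
          "PA:", row["page_authority"])
```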

STEP 5 >> Get emails from your list of URLs: this is an option I really like about ScrapeBox. With "Grab Emails locally from list" you get a list of the email addresses found on each of those domains, so you can contact their owners to arrange reviews, guest articles and other synergies. Definitely an option that saves hours of work.

5. Eliminate duplicates using this tool

This is another very useful ScrapeBox feature. With this option you can detect duplicate URLs when you're doing toxic link detection work.

To audit links properly, especially on a site penalized for its links, you must be thorough and base the work on several different sources:

  • You can do this with the Link Detox tool.
  • Combine it with manual work using Ahrefs.
  • Combine it with manual work based on data from Open Site Explorer.

After doing this you will have three separate files of link data. One of the main objectives is to combine these three files into a single one and eliminate duplicate links and repeated domains.

To do this you will use the "DupeRemove" addon. Once it is installed and running:

Go to "Select source files to merge" and choose the three files you want to combine. After selecting the files containing the link data, click "Merge Files" to merge them into one.

Finally, to remove duplicate domains or URLs, click "Select Source File" and choose where you want to export the deduplicated URLs or domains.
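
What DupeRemove automates is easy to picture in code. A minimal Python sketch follows; the filenames are illustrative stand-ins for your three exports.

```python
# Merge several exported link files and write one deduplicated list.
def merge_and_dedupe(sources, output):
    seen = set()
    with open(output, "w", encoding="utf-8") as out:
        for path in sources:
            with open(path, encoding="utf-8") as f:
                for line in f:
                    url = line.strip()
                    if url and url not in seen:  # skip blanks and repeats
                        seen.add(url)
                        out.write(url + "\n")
    return len(seen)

total = merge_and_dedupe(
    ["link_detox.txt", "ahrefs.txt", "open_site_explorer.txt"],
    "merged_unique.txt",
)
print(f"{total} unique URLs written")
```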

6. Detect “META” tags for each URL on your list

ScrapeBox also lets you scrape the titles and descriptions from a list of URLs. You'll find this under "Grab/Check", as the "Grab harvested meta info from URL list" option.

This gives you a quick view of the title, description and keywords for each URL in the target list you imported.

This is useful if you export the information to an Excel file and use it to check how many pages use an exact-match keyword in the title, or show some other over-optimization that Google frowns upon.

Once you detect this over-optimization, all you have to do is rewrite those titles as more natural ones.
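
For a sense of what the option does, here is a minimal Python sketch (requests + beautifulsoup4; the URL is a placeholder) that fetches each page's title and meta description:

```python
# Fetch the title and meta description for each URL in a list.
import requests
from bs4 import BeautifulSoup

def grab_meta(urls):
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text,
                             "html.parser")
        title = soup.title.string.strip() if soup.title and soup.title.string else ""
        tag = soup.find("meta", attrs={"name": "description"})
        desc = tag["content"] if tag and tag.has_attr("content") else ""
        yield url, title, desc

for url, title, desc in grab_meta(["https://example.com"]):
    print(url, "|", title, "|", desc[:60])
```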

7. Check the validity of each of your internal and external links.

Another check you should run on a website, whether during an audit or simply when testing different aspects of the site, is the status of its links, both internal and external.

To perform this test, use the "Alive Check" addon, which makes the task much easier.

To detect any problems with your links you just have to configure it properly: add response codes 200 and 301 as valid and check the "Follow Relocation" box.

The program will then mark as broken every link that returns a response code other than 200 or 301.
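
The same check is easy to sketch in Python; this version flags anything that doesn't answer with a 200 or 301 (the URLs are placeholders):

```python
# Minimal "alive check": flag URLs whose status code is not 200 or 301.
import requests

ALIVE = {200, 301}

def alive_check(urls):
    for url in urls:
        try:
            status = requests.head(url, allow_redirects=False,
                                   timeout=10).status_code
        except requests.RequestException:
            status = None  # connection error, DNS failure, timeout...
        yield url, status, status in ALIVE

checks = alive_check(["https://example.com",
                      "https://example.com/missing-page"])
for url, status, ok in checks:
    print(url, status, "OK" if ok else "BROKEN")
```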

8. Check whether the URL changes in your project are being indexed by Google.

If you are making major changes to your project that affect its internal pages, you'll want to make sure Google re-indexes every new URL you've added and every change you've implemented.

To verify all this I recommend combining Screaming Frog with ScrapeBox. Here's how…

STEP 1: first, crawl your project with Screaming Frog using its basic configuration options.

STEP 2: once Screaming Frog has finished, export the data to a CSV file, then open it and copy the URLs into ScrapeBox. Another valid option is to convert the CSV into a .txt file and then import it.

STEP 3: after importing the list of URLs, just click the "Check Indexed" button and select the "Google Indexed" option.

If you've followed these three steps correctly, the tool will return a pop-up with full information on which URLs are indexed and which are not.

I recommend exporting the list of non-indexed URLs so you can analyze them in more detail.
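
The CSV-to-.txt conversion in STEP 2 is simple to script. A sketch follows; the filenames are illustrative, and the "Address" column name should be checked against your Screaming Frog export's header row.

```python
# Turn a Screaming Frog CSV export into a plain URL list for ScrapeBox.
import csv

with open("screaming_frog_export.csv", newline="", encoding="utf-8") as src, \
     open("urls_for_scrapebox.txt", "w", encoding="utf-8") as dst:
    for row in csv.DictReader(src):
        url = (row.get("Address") or "").strip()  # the URL column
        if url.startswith("http"):
            dst.write(url + "\n")
```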

9. Check the outbound links of the URLs you've obtained.

As a final option that I really like: ScrapeBox can also find the outbound links of each of the URLs you obtained when searching for sites to place your reviews, as discussed in point 3 of this article.

For this there is an addon called "Outbound Link Checker". It shows the outgoing links of each of the URLs found after searching for places to insert your links or reviews.

It returns some very useful data. Once you have this information you can also check the authority of those URLs and decide for yourself whether they are worthwhile or not.
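
A bare-bones version of the same idea in Python: collect a page's links and keep only those pointing off-domain (the page URL is a placeholder).

```python
# Count outbound (external) links on a page.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

def outbound_links(url):
    host = urlparse(url).netloc
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    links = {urljoin(url, a["href"]) for a in soup.find_all("a", href=True)}
    # Keep only links whose host differs from the page's own host
    return [l for l in links if urlparse(l).netloc not in ("", host)]

page = "https://example.com"
ext = outbound_links(page)
print(page, "->", len(ext), "outbound links")
```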

ScrapeBox: my conclusions

Even though many professionals (and "not so professionals") consider ScrapeBox a spam tool, it is actually a very powerful tool that will save you time on your daily tasks and on more advanced ones. As I've said throughout the article, you can use it for good or for other kinds of practices; it all depends on the person at the controls.

The next time you hear an industry professional badmouth ScrapeBox, consider that perhaps they have never used the tool, or are unaware of the many features I've shown throughout this article.

If you liked this article and want to learn more advanced options for this software, I recommend consulting Chuiso, the only person I know who knows this tool by heart and who has helped me discover a host of advanced options.

Finally, I'd like to know what you use ScrapeBox for and what benefits it has brought to your day-to-day work.

Does Google want to equate SEO with SEM?

I don't know if you've noticed, but Google has changed the format of the ads in its search results. With this change, Google seems to want to equate SEO with SEM, since the visual difference is no longer as big as it used to be. Look at the comparison:

Before: [screenshot of the old ad format]

Now: [screenshot of the new ad format]

There are now only two small differences between ads and organic results: the word "Ad" in yellow, and the small bar separating the ads from the organic listings. What is Google's aim with this change? Easy: to earn more money. If before people didn't click on ad links, either because the background was yellow or because they knew it was an ad and preferred the first organic results (which are relevant), with this change users are beginning to be unable to tell the two types of results apart.

This may be just the beginning: in the future Google may remove the word "Ad", or the separator bar may disappear, or more and more Google AdWords results may appear while organic results are pushed further down… So far I have only seen up to 3 ads in a search, so the 4th position is the first one held by an organic result.

Many more changes may come, and Google holds the key to all of them. Consider that SEO as we know it depends on Google: if Google sees companies making money selling SEO services while it stops earning, maybe we are beginning to live through the decline of SEO. I hope I'm wrong, because many people make their living from it, and I think we would all lose the essence of online marketing.

It will become increasingly important to invest in SEM to be well positioned, just as Facebook did with its algorithm change, forcing companies to promote their posts to reach more users. What do you think of the change? Do you agree?

If Google dies, does SEO die too? #RipGoogle

This question came to me from my previous post: Does Google want to equate SEO with SEM? I don't think what I'm asking will come to pass, at least in the short to medium term, but in the business world it is well known that one day you're up and the next you're broke. It could happen; it's unlikely, but taller towers have fallen. Anyway, let me answer the question:

If Google dies, does SEO die too?

To answer this question we must ask another. You know that to position a website in Google naturally you follow a set of rules (algorithms) in order to appear in the top positions; therefore, I ask this:

Is Google SEO?

You know that SEO is the natural positioning of a website in a search engine, and Google is the most used search engine on the internet, with around 90% of the market worldwide. So you could say that yes, SEO is Google. If you notice, it is Google that sets the rules of the game, and you and I have to follow them to keep our websites well positioned.

Whenever Google's algorithm changes, SEO changes; therefore, I would say that Google controls SEO today. Why? Because of the monopoly it has achieved in its field and its position as absolute leader. People simply search on Google, and they see other search engines as look-alikes they won't switch to any time soon.

That said, back to the question in the title. If Google disappears, does SEO disappear too? I say yes. Being first in Google (SEO) means following a strategy that gets you to that goal: quality content, a fully optimized website, presence on social networks, good backlinks, etc. But who set these guidelines and/or obligations? In case you didn't know: Google.

Therefore, I would say that SEO is a "quality indicator" that a search engine computes for each website, which determines your site's position. Today, the party that sets the guidelines for positioning a website in search engines is the search engine itself, and the king here is Google, so it is in charge of SEO today. That's why, if the most famous search engine on the internet dies, SEO will die with it.

Tips for Google Panda 4.0

It's been almost a month since I last wrote on the blog because I've been focusing on the new design, and Google doesn't like websites whose content isn't updated, so I'm back at it (with the intention of posting 2-3 times weekly). Around the same time, Matt Cutts announced the Google Panda 4.0 update on his Twitter account. Here are some tips to avoid being penalized by Google Panda, which will also benefit your users. Note that none of this is new; there has been no major change in SEO, just an algorithm update that further reinforces the following:

A website/blog for your users, not for SEO. Quality + quantity

Sometimes we make the mistake of caring more about SEO than about the content itself. It is essential to add value to your site and provide quality content to users; Panda really relies on users to judge whether content is good or not, because otherwise it would be impossible to evaluate every website on the internet. Users are the ones who stay longer on the site, share content, interact with it, and so on, and you can only get that by offering quality content that prompts users to take those actions. Surely you've heard that content is king: well, yes, it is.

Besides quality, another aspect to take into account is quantity. Each page should have at least 400 words, blending text with videos or images.
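
If you want a rough, automated check of that 400-word guideline, here is a small Python sketch (requests + beautifulsoup4; the URL is a placeholder and the threshold mirrors the figure above) that counts a page's visible words:

```python
# Rough count of a page's visible words (scripts and styles stripped).
import requests
from bs4 import BeautifulSoup

def visible_word_count(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()  # remove non-visible text
    return len(soup.get_text(separator=" ").split())

url = "https://example.com"
words = visible_word_count(url)
print(url, words, "words" + (" (below 400!)" if words < 400 else ""))
```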

Being original

You have to be original, creating new content rather than duplicating it. Google's bear (Panda) does not like copy-and-paste; in fact, it penalizes it. It's of little use to copy content to your site even if you cite the source: the search engine does not want duplicate content, so be original.

Social networks are fundamental

As I said before, one signal Google uses to judge whether content is good is how widely it has been shared on the various social networks, and I would say especially on Google+ with its famous +1. That's why you should be active on social networks and share your content so your links get shared more successfully. It is essential to have share and follow buttons on your site.

Some people confuse followers with shares: it's of little use to have 10,000 fans but only 10 shares. Seek shares more than fans; so far it has not been proven that having more fans gives you a better position in Google, and my opinion is that it has no effect at all.

On this blog I have already talked about the importance of social media in SEO.

These are three small but essential tips for reaching the top positions in Google. Remember who the boss is: Google sets the rules of SEO and of the game, and we have to comply.

Stop dwelling on PageRank

Matt Cutts already said it would be very unlikely to see a PageRank update before 2014; there was one in December 2013, and nine months later it remains the last. So you need to stop dwelling on PageRank and start looking at other indicators, because it is outdated. The best alternatives to PageRank are the values offered by SEOMoz: Page Authority (PA) and Domain Authority (DA).

Page Authority (PA)

Page Authority, better known as PA, is a metric calculated by SEOMoz that indicates the strength of a particular page, not the domain itself (the domain has DA for that). It basically works like PageRank. Many people think PR is associated with the domain, but that is false: it is associated with each individual page of a website. I invite you to test this yourself.

Within the same domain, one page can have a PA of 80 and another a PA of 1; for example, this post has only just been published. This can help you shape your link building strategy, because the higher the PA, the better the results that link will give you.

Many factors determine the PA of a page, including MozRank, MozTrust, and the quantity and quality of links. The algorithm produces a number from 1 to 100 and is updated frequently, so you can watch this indicator change quickly, which is why it is more reliable than PageRank. From my point of view it is the metric you should pay the most attention to.

Domain Authority (DA)

Domain Authority, known as DA, is the metric created by SEOMoz that measures the authority, trust or credibility of a web domain. As with PA, the algorithm gives a score between 0 and 100.

This indicator is calculated by combining SEOMoz's other metrics: total number of links, MozRank, MozTrust, etc., with more than 40 signals included in the calculation. As with PA, DA is updated often, so you can regularly check whether your SEO efforts are working or not.

According to the creators of these two metrics, it is much easier to raise a score of 20-30 than one of 70-80. From 20 upwards, a score is considered good.

High DA and PA values do not guarantee visitors, but they are indicators that help us improve or stay on course with our SEO strategy. That's why using these metrics can help a lot, especially because they are updated frequently.

How can I find out my PA and DA?

You have several options for checking the PA and DA of a website. The first and most convenient is to install the MozBar in your browser, which gives you the information instantly:

– MozBar for Firefox
– MozBar for Chrome

If you'd rather not install a bar in your browser, you can check manually with Open Site Explorer, where you will also get extra information that can help you.

Loading time influences SEO. Chapter 5

Something very important for positioning, not just for SEO but also to avoid losing visitors and exhausting your users' patience, is the loading time of your website.

Sometimes you come across pages that are very slow, and after a short while you end up leaving. If it is annoying for a user, imagine how it is for Google… You need to make things as easy as possible for the robots, or rather the search engine spiders, that crawl your website. If it takes too long to load, the spiders will move straight on to another site without indexing any of your content.

How do I analyze the speed of my site?

It's simple: just enter your URL at the following address: Google PageSpeed Insights.

Why do I recommend this tool and not another? Because this application is provided by Google itself, and our goal is to position our website in its search engine, so we want to make its job of indexing all our pages easier. If we follow the steps it gives us, we can achieve a better load time both for Google and for our users.

After entering the URL, in about 20 seconds you get a complete analysis of your website. The analysis gives you a number between 0 and 100; ideally you want a score above 80, which means the site is working correctly.

For an optimal site the ideal is to score above 90, though I can tell you now that this is very difficult and not all the factors depend on you. Getting a 100 in PageSpeed is very, very complicated.
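
The tool also runs as a public API, so you can script the same check. Here is a sketch against Google's PageSpeed Insights API v5; the strategy parameter and score field follow the published API, but verify them against the current documentation.

```python
# Fetch a PageSpeed performance score via the PageSpeed Insights API v5.
import requests

def pagespeed_score(url):
    resp = requests.get(
        "https://www.googleapis.com/pagespeedonline/v5/runPagespeed",
        params={"url": url, "strategy": "desktop"},
        timeout=60,
    )
    resp.raise_for_status()
    data = resp.json()
    # Lighthouse reports performance on a 0-1 scale; rescale to 0-100
    score = data["lighthouseResult"]["categories"]["performance"]["score"]
    return round(score * 100)

print(pagespeed_score("https://example.com"))  # placeholder URL
```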

In addition, it analyzes all the problems and gives advice on fixing them to improve your score. First try to correct the errors shown in red, which are the most important, then focus on those highlighted in yellow.

Also note that next to each item you need to (or might consider) fixing, a link appears where Google gives you advice. The advice is in English, but it can serve as a guide for further searching or be enough on its own to correct the problems.

Another site I find very useful for analyzing any website is GTmetrix. It also points out the problems affecting load time and the solutions you need to apply to make your site faster.

To conclude: you need your website to be fast, not only for your users but also for the search engines. Remember that the latter are the ones that can help you get more visits, so make every effort to make their job easier. If you don't feel up to doing it yourself, you can hire a freelance developer who, for around €50, can optimize your website and make it faster, clearing out all the unnecessary code that makes your site load slowly.
