Harmful SEO Methods to Avoid

Search engine optimisation is critical to the success of an online business. And as the battle to get higher in the rankings continues to get more competitive, you need to know what you should and should not be doing. In the long run, you will be much better off optimising your website the right way, without resorting to unethical “black hat” methods that could get you banned at any moment. In this article, we’ll take a look at some of the SEO methods to avoid.

Read More

Five Essentials for Local SEO Success

Google has been working hard over the past year to refine its search results based on location. Its Pigeon update in 2014 was the first time most webmasters became aware of the importance of local search, and since then most local businesses have been trying to improve their position in local search rankings. But aside from creating keyword-based web content, what other steps can you take to improve your website's position? Keep reading for some ideas.



Read More


3 ways to interpret your SERP ranking report

Keeping track of keyword rankings in Google is part and parcel of SEO, and it can be rewarding and frustrating in almost equal measure. Sometimes, however, you can stare at ranking reports for so long that you miss the main message, simply because you're used to seeing the same reports over and over again.

  1. The most relevant themes on which to focus

Examining the types of phrases your website ranks for can reveal a lot about the concepts that are most relevant to you. If a large number of relevant ranking keywords revolve around a certain theme, this can be interpreted to mean your site, brand or business is considered an authority on that theme. If one subject prevails across many keywords, that represents either an opportunity or an area of weakness.
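One lightweight way to surface the prevailing themes in a ranking report is simply to count how often each meaningful word appears across your ranking keywords. Below is a minimal sketch in Python; the keyword list and stopword set are hypothetical placeholders, not data from any real report.

```python
from collections import Counter

STOPWORDS = {"for", "the", "a", "in", "of", "to", "and", "how"}

def theme_counts(ranking_keywords):
    """Count how often each non-trivial word appears across ranking keywords."""
    counts = Counter()
    for phrase in ranking_keywords:
        for word in phrase.lower().split():
            if word not in STOPWORDS:
                counts[word] += 1
    return counts

# Hypothetical keywords pulled from a ranking report
keywords = [
    "local seo tips",
    "seo audit checklist",
    "local citation building",
    "seo for small business",
]
print(theme_counts(keywords).most_common(2))
# [('seo', 3), ('local', 2)]
```

Here "seo" and "local" dominate, suggesting those are the themes the site is seen as an authority on — or, if they are themes you care about but they barely appear, an area of weakness.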

Read More


The Old SEO Methods That Can Be Put to an End

If you are aware of the Google Penguin and Panda updates, you should think about retiring the old SEO methods below and updating your approach to modern techniques.

  • Article Submissions: an age-old method for creating links with little effort. Instead of mass-submitting articles, build genuine relationships with bloggers; this creates opportunities to write guest posts for quality sites.
  • Link Exchanges and Reciprocal Links: swapping links with other sites is easy to do, but these cheap, useless links will do your site no favours. If you come across quality content on another site, simply link to it — and if you consistently produce quality content yourself, you can expect to be linked to in return.
  • Thin Content Creation: since great content attracts great links, content creation deserves your full attention. If you outsource your content writing, make sure it goes to qualified writers who can produce worthwhile material for your site. Be clear about your purpose: are you writing only for search engines, or do you want users to actually benefit from the content?

Read More

Nofollow Link

The Worth of NoFollow Links for SEO, Traffic and Leads

It is a truth that has been universally acknowledged by many SEO and inbound marketing professionals that not all inbound links have equal worth in the eyes of Google’s search algorithm. One inbound link might do little to help a website’s search ranking, while another might bring in more than its share of desirable web traffic, thus helping to generate fresh leads and web authority.
Inbound links (also known as backlinks) are links that direct customers and search engines to your site. They can be divided into two main groups that tell Google how to treat them: 'DoFollow' and 'NoFollow'.
So what’s the difference?


DoFollow Or NoFollow – I Don’t Follow

Both ‘DoFollow’ and ‘NoFollow’ links are tags that may be added to a link to instruct search engines to treat it a certain way. All links are automatically treated as ‘DoFollow’ unless otherwise specified. ‘DoFollow’ links are the bread and butter of SEO – these (as the name suggests) tell Google’s bots to follow and index links, incorporating them into its algorithm and offering the site at the other end a little bit of authority. These links can benefit a website’s credibility as being linked to by varied sources implies relative importance in the Internet’s vast hive.

The ‘NoFollow’ attribute, as its name also suggests, instructs search bots not to follow the link to which it is assigned. This was introduced by Google in 2005, in an effort to stop spammers from manipulating search engine rankings by creating masses of inbound links to their websites in blog comments and other public online forums. ‘NoFollow’ tells Google explicitly to not incorporate the associated link into its rankings. For this reason, ‘NoFollow’ links are often seen as worthless, but this is not necessarily true.
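In practice the attribute is just `rel="nofollow"` on an anchor tag, so you can audit a page's links programmatically. Here is a minimal sketch using Python's standard `html.parser` to split a page's links into follow/nofollow buckets; the sample HTML and URLs are hypothetical.

```python
from html.parser import HTMLParser

class LinkClassifier(HTMLParser):
    """Collect hrefs, splitting them by the rel="nofollow" attribute."""
    def __init__(self):
        super().__init__()
        self.dofollow, self.nofollow = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        # rel may hold several space-separated tokens, e.g. "nofollow noopener"
        rel = (attrs.get("rel") or "").lower().split()
        (self.nofollow if "nofollow" in rel else self.dofollow).append(href)

html = (
    '<a href="https://example.com/partner">editorial link</a>'
    '<a href="https://example.com/comment" rel="nofollow">blog comment link</a>'
)
parser = LinkClassifier()
parser.feed(html)
print(parser.dofollow)   # links Google is free to count
print(parser.nofollow)   # links Google is told not to count
```

Note the default case: any link without an explicit `rel="nofollow"` lands in the DoFollow bucket, matching the behaviour described above.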

So NoFollow Links Aren’t Worthless?

In two words: not entirely. Although Google claims that their search algorithm does not follow these links at all, contrary studies have stated that search bots do follow NoFollow links, but only index the destination page if it is already indexed for other reasons (such as having other DoFollow links referring to the same page).

Ever since Google released the ‘Penguin’ update, which introduced measures to tackle common ways its search algorithm was abused to artificially enhance website rankings, it has been public knowledge that link diversity is essential. Not only do NoFollow links diversify your portfolio of inbound links, but they will also generate traffic, thus enhancing authority, reach and influence as well as the opportunity to convert leads.
Author: Jordan Kantey from Archetype Copywriting

Read More

Using ScrapeBox for best practices in SEO

One of the most frowned-upon tools in the world of SEO is ScrapeBox. It is often classified as a spam tool, but in my opinion it is not. Whether it is spam or not depends entirely on the person using it; at the end of the day, it is a tool that collects data (among many other features), and like any tool it can be used to do "good" things or "not so good" things.

As you know if you work in SEO, the last few months have changed many things about how we build links to our websites. We now need to focus more on the quality of each link than on the quantity, and as the months pass this will become even more important: every link we place can benefit or harm our link building strategy.

In the following guide I'll show you how I use ScrapeBox and the benefits it has brought me.


ScrapeBox is a tool for scraping, automating tasks and building your own Google footprints (if you don't know what footprints are, there is a full article on them linked here). Many professionals think ScrapeBox is a tool for harvesting massive numbers of links and see it as a spam tool. I don't blame them, since many people use this kind of software for black hat practices that I would never recommend.

ScrapeBox is not a spam tool; in fact I use ScrapeBox for white hat tasks in my day-to-day work, as you will see throughout this guide.

A tour of the tool's interface


Before listing all the possibilities this tool offers in your day-to-day work, I'm going to show you its interface and briefly explain each section:

Harvester section: here you enter the footprints you want to use to find blogs, domains and other resources. You can also select the "Custom Footprint" option to customise further and get more out of it.

Keywords section: this is where you insert the keywords to search for in combination with your custom footprint.

For example, if we wanted to find blogs related to SEO running under a particular domain, all we would have to do is enter the following line: site: "SEO". ScrapeBox would then return all the results it finds for this footprint and keyword.

Harvested URLs: in this window you will find all the results you have obtained through the process described in section 2.

Select Engines & Proxies: here you select the search engines and insert the proxies you want to use to start scraping information. If you are going to use this tool for link detox work or competitor analysis, I recommend also selecting search engines such as Bing and Yahoo so you can compare more information afterwards.

Comment Poster: this option lets you post comments on the sites you previously harvested with the appropriate footprint. I would not recommend using it that way, as it is black hat and such practices would end up getting your project penalised. You can, however, use it to ping your links and get them indexed faster.

Start your link building strategy with the Keyword Scraper option


Let's start with the good stuff: one of the features I like most. The Keyword Scraper option helps you gather numerous related keywords, which you can then add to your list and use to scrape even more information. This option suggests a host of related keywords, much like Google's own search suggestions.

To use this wonderful option, simply select Scrape >> Keyword Scraper from section 2 (Keywords), enter the keywords you have chosen on the left side and click the "Scrape" button. Once the task completes you will see ScrapeBox start to show you related keywords.

To remove duplicate or unwanted keywords, there is a button called Remove: here you can strip out duplicates, keywords that do not contain the word you want, and use a few other very useful options to finish refining your keyword list.
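The two filters just described — drop duplicates and drop keywords that don't contain your seed word — are easy to reason about as a few lines of Python. This is a minimal sketch, not ScrapeBox's actual implementation, and the scraped keyword list is hypothetical.

```python
def clean_keywords(keywords, must_contain):
    """Drop duplicates (keeping first occurrence) and keywords missing the seed word."""
    seen = set()
    cleaned = []
    for kw in keywords:
        kw_norm = kw.strip().lower()
        if kw_norm in seen or must_contain.lower() not in kw_norm:
            continue
        seen.add(kw_norm)
        cleaned.append(kw_norm)
    return cleaned

# Hypothetical output of a keyword-scraping run
scraped = ["SEO tools", "seo tools", "link building", "best SEO plugins"]
print(clean_keywords(scraped, "seo"))
# ['seo tools', 'best seo plugins']
```

Normalising to lowercase before comparing is what lets "SEO tools" and "seo tools" collapse into a single entry.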


ScrapeBox: the complete guide to good practice. Now that you've seen some of the most important features of this software, let's look at the practical benefits I get from using this tool every day.

1. Scrape URLs with ScrapeBox's options

This is definitely one of the best scrapers you can find on the internet, if not the best. Once you have your list of selected keywords (remember to choose keywords that relate directly or indirectly to your niche and have significant search volume), enter them in the Keywords section I described above and follow step 2.

2. Use the Footprints.

Footprints are commands inserted into Google to make more specific searches. To help you understand this better, let's look at an example:

Imagine you want to find all the URLs Google has indexed that contain the keyword "SEO". For such a specific search we would employ a search operator, in this case: "inurl: SEO".

As you can see, footprints are very useful in day-to-day SEO, yet few people use them. I encourage you to start using them and get the most out of your actions.

Here are the 3 search operators you should know:

Site: lists the URLs indexed for a specific domain, or links to a specific domain.
Inurl: displays the URLs that contain a specific, previously defined text fragment.
Intitle: displays the URLs whose meta title tag contains the previously defined text.

As additional information, Google itself documents further advanced search operators.
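If you maintain long lists of footprints, it can help to generate the operator queries programmatically before pasting them into the Harvester. Below is a minimal sketch that combines one of the operators above with a list of values; the domains are hypothetical examples.

```python
def build_queries(operator, values, keyword=None):
    """Combine a Google search operator with a list of values (and an optional keyword)."""
    queries = []
    for value in values:
        q = f"{operator}:{value}"
        if keyword:
            q += f' "{keyword}"'
        queries.append(q)
    return queries

# site: queries restricted to hypothetical domains, scoped to one keyword
print(build_queries("site", ["example.com", "example.org"], keyword="SEO"))
# ['site:example.com "SEO"', 'site:example.org "SEO"']

print(build_queries("inurl", ["SEO"]))
# ['inurl:SEO']
```

One query per line is exactly the format a footprint list expects, so the output can be written straight to a text file and imported.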

Now that you know what search operators (footprints) are and what they are for, you will know what to do when you face needs like the following:

  • I need to get links from blogs and bloggers related to my topic.
  • I need to get links from related websites where my product can be mentioned.
  • I need to find platforms where I can share my products.

3. Find good sites for guest posting with ScrapeBox.


Suppose you have a web analytics blog and want to promote your professional services as an analyst and attract customers to your blog. Some of the actions you should take to achieve these objectives are:

Post specific content about web analytics on your blog regularly.
Publish guest articles on related blogs that have visibility and accept guest posts.
Be present on the best business platforms related to your niche.

I propose a three-month strategy in which you aim to get 30 reviews, with their corresponding links (do-follow / no-follow), on authority sites related to your niche.

To do this, use the following combination of footprints:

“Powered by WordPress” + “Google Analytics” and site: .com

Let me explain the input data in detail:

“Powered by WordPress”: I include this footprint because I have found that most people with a blog related to online marketing run it on the WordPress platform.

“Google Analytics”: I use this footprint because I only want it to return sites that specifically talk about Google Analytics and that also run on WordPress. site:.com: I use this advanced search operator to select only the domains that end in .com.

If you want to refine things even further, you can use the following footprint:

“Powered by WordPress” + “Marketing” and intext: “guest post”

As you can see, this search is more general. It ensures that any results found are WordPress blogs that talk about marketing, and they will most likely have a guest post section or have already published guest articles.

Now that you understand roughly how footprints work, just open ScrapeBox and put everything you've learned into practice.

4. Check the quality of the links found.

Once you've done your keyword research and ScrapeBox has produced your first list of URLs, you should start checking some basic data on each of the links. You should also use the Moz APIs.

STEP 1 >> Work with domains: the strategy I outlined involves guest articles, relationships and authority pages. In this case you should stop looking at individual articles and focus on the entire domain. To do this, go to the "Trim to Root" option under "Manage Lists" to cut every URL down to its root domain.


STEP 2 >> Remove duplicates: after the previous step, click "Remove / Filter" >> "Remove Duplicate URLs" to eliminate duplicate domains.
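Steps 1 and 2 together — trim every URL to its root domain, then drop repeats — can be sketched in a few lines with Python's standard `urllib.parse`; the harvested URLs below are hypothetical.

```python
from urllib.parse import urlparse

def trim_to_root(urls):
    """Cut each URL down to its root domain ("Trim to Root"), then deduplicate
    while preserving the original order ("Remove Duplicate URLs")."""
    seen = set()
    roots = []
    for url in urls:
        root = urlparse(url).netloc.lower()
        if root and root not in seen:
            seen.add(root)
            roots.append(root)
    return roots

harvested = [
    "https://example.com/blog/guest-post-guidelines",
    "https://example.com/about",
    "http://another-blog.net/write-for-us",
]
print(trim_to_root(harvested))
# ['example.com', 'another-blog.net']
```

Deduplicating on the domain rather than the full URL is what collapses the two `example.com` articles into a single outreach target.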

STEP 3 >> Check PageRank: check the PageRank of each URL to make sure this indicator is above 2 or 3, which means the domain has at least minimal authority. You can see this in the "Get Domain PageRank" section.

STEP 4 >> Check the Domain Authority (DA): do this with the "Page Authority Addon" option. Besides domain authority, it also gives you the page authority, the MozRank and the external links.

STEP 5 >> Get emails from your list of URLs: one of the options I like most about ScrapeBox. With "Grab Emails locally from list", you get a list of emails for the owners of those domains, so you can contact them to arrange reviews, guest articles and other collaborations. Definitely an option that saves you hours of work.

5. Remove duplicates with this tool

Another very useful feature of ScrapeBox: with this option you can detect duplicate URLs when you are doing toxic link detection work.

When doing a link audit, especially on a site penalised for its links, you must be thorough and base the work on several different sources:

  • You can do it with the Link Detox tool.
  • Combine it with manual work using Ahrefs.
  • Combine it with manual work based on Open Site Explorer data.

As you can see, doing this job leaves you with three separate files of link data. One of the main objectives is to combine these 3 files into a single one and eliminate duplicate links or repeated domains.

To do this, use the "DupeRemove" addon. Once it is installed and running you will see a window like this:


Now go to "Select source files to merge" and select the 3 files you want to combine into one. After selecting the files containing the link information, click the "Merge Files" option to merge them.

Finally, to remove duplicate domains or URLs, click "Select Source File" and choose where you want to export the deduplicated URLs or domains.
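What DupeRemove does here — merge several link files, then strip repeated lines — can also be reproduced with a short script if you prefer to work outside the tool. This is a minimal sketch; the file names in the comment are hypothetical.

```python
def merge_link_files(paths, output_path):
    """Merge several text files of URLs into one, removing duplicate lines
    while preserving first-seen order."""
    seen = set()
    merged = []
    for path in paths:
        with open(path, encoding="utf-8") as fh:
            for line in fh:
                url = line.strip()
                if url and url not in seen:
                    seen.add(url)
                    merged.append(url)
    with open(output_path, "w", encoding="utf-8") as fh:
        fh.write("\n".join(merged) + "\n")
    return merged

# e.g. merge_link_files(["link_detox.txt", "ahrefs.txt", "ose.txt"], "all_links.txt")
```

Because the same backlink usually shows up in more than one of the three exports, the deduplicated output is typically much shorter than the inputs combined.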

6. Grab the “META” tags of each URL in your list

ScrapeBox also lets you scrape the titles and descriptions of a list of URLs. To do this, choose the "Grab harvested meta info from URL list" option, found under "Grab / Check".

This gives you a quick view of the title, description and keywords of each URL in the list you imported.


This is useful if you export the information to an Excel file and use it to check how many pages use an exact-match keyword in the title, or any other over-optimisation that Google frowns upon.

Once you detect such over-optimisation, all you have to do is change those titles to more natural ones.
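The grabbing itself is ordinary HTML parsing. As a minimal sketch of what "grab meta info" means for a single page, here is a parser built on Python's standard `html.parser`; the sample page is hypothetical.

```python
from html.parser import HTMLParser

class MetaGrabber(HTMLParser):
    """Extract the <title> text and the meta description from one HTML document."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

page = ('<html><head><title>SEO Guide</title>'
        '<meta name="description" content="A guide to natural titles."></head></html>')
grabber = MetaGrabber()
grabber.feed(page)
print(grabber.title, "|", grabber.description)
# SEO Guide | A guide to natural titles.
```

Run this over every URL in your list and you can, for example, count how many titles start with the exact same keyword — the over-optimisation signal discussed above.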

7. Check the validity of each of your internal and external links.


Another check you should run on a website, whether you are carrying out an audit or simply testing different aspects of the site, is the status of its links, both internal and external.

To perform this test, use the "ScrapeBox Alive Checker" addon. This option makes the task much easier.

To detect any problems with your links you just have to configure it properly: simply add response codes 200 and 301 as valid, and tick the "Follow Relocation" box.


The program will then mark as broken any link that returns a response code other than 200 or 301.
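The pass/fail rule just described (accept 200 and 301, flag everything else) is simple enough to express as a small function. This sketch works on already-fetched (url, status code) pairs — the pairs shown are hypothetical, and actually fetching the statuses over the network is left out.

```python
ALIVE_CODES = {200, 301}  # the codes the Alive Checker is configured to accept

def split_alive(results, alive_codes=ALIVE_CODES):
    """Partition (url, status_code) pairs into alive and broken lists."""
    alive, broken = [], []
    for url, code in results:
        (alive if code in alive_codes else broken).append(url)
    return alive, broken

# Hypothetical results of checking a link list
checked = [
    ("https://example.com/", 200),
    ("https://example.com/old-page", 301),
    ("https://example.com/missing", 404),
]
alive, broken = split_alive(checked)
print(alive)   # ['https://example.com/', 'https://example.com/old-page']
print(broken)  # ['https://example.com/missing']
```

Treating 301 as "alive" matches the "Follow Relocation" setting: a permanent redirect still leads somewhere useful, so it should not be reported as an error.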

8. Check whether the URL changes in your project are being indexed by Google.

If you are making major changes to your project that affect the total number of internal pages, you will want to make sure that Google re-indexes all the new URLs you have added and all the changes you have implemented.

To verify all this, I recommend combining the Screaming Frog tool with ScrapeBox. Here is how…

STEP 1: The first thing to do is crawl your project with Screaming Frog, using the basic settings shown in the following images:


STEP 2: Once Screaming Frog has completed the process, export the data to a CSV file, then open it and copy it into ScrapeBox. Another valid option is to convert that CSV into a .txt file and then import it as you see in the image below:

STEP 3: After importing the list of URLs, just click the "Check Indexed" button and select the "Google Indexed" option.

If you have followed these 3 steps correctly, the tool will return a pop-up with all the information on indexed and non-indexed URLs.

I recommend exporting the list of non-indexed URLs so you can analyse it in more detail.

9. Check the outbound links of your harvested URL list.


As a final option that I really like: with ScrapeBox you can also find the outbound links of each of the URLs you obtained when searching for sites to place your reviews, as discussed in point 3 of this article.

For this there is an addon called "Outbound Link Checker". This option shows the outgoing links of each of the URLs obtained after searching for places to insert your links or reviews.


As you can see in the picture, it returns some useful data. Once you have this information you can also check the authority of these URLs and decide for yourself whether they are worthwhile.
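At its core, an outbound link check just means collecting the anchors on a page whose host differs from the page's own. Here is a minimal sketch with Python's standard `html.parser` and `urllib.parse`; the host name and sample links are hypothetical.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundCounter(HTMLParser):
    """Collect links pointing outside a given host."""
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host.lower()
        self.outbound = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        host = urlparse(href).netloc.lower()
        # Relative links have no host and are internal by definition
        if host and host != self.own_host:
            self.outbound.append(href)

page = (
    '<a href="/about">internal</a>'
    '<a href="https://my-review-site.com/me">my review link</a>'
    '<a href="https://example.com/contact">internal absolute</a>'
)
counter = OutboundCounter("example.com")
counter.feed(page)
print(counter.outbound)
# ['https://my-review-site.com/me']
```

Running this over each harvested URL tells you how many other sites each page already links out to — a page with hundreds of outbound links will pass far less value to yours.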

ScrapeBox. My conclusions

As you can see, even though many professionals (and "not so professionals") consider ScrapeBox a spam tool, it is actually a very powerful tool that will save you time on your daily tasks and on more advanced ones. As I mentioned throughout the article, you can use it to do good or to do other kinds of practices; it all depends on the person behind the tool.

So the next time you hear an industry professional badmouth ScrapeBox, consider that perhaps they have never used the tool, or are unaware of many of its features, like the ones I have just shown you throughout this article.

If you liked this article and want to learn about more advanced options of this software, I recommend you look up Chuiso, the only person I know who knows this tool by heart and who has helped me discover a host of advanced options.

Finally, I would like to know what you use ScrapeBox for and what benefits it has brought to your daily work.

Read More

Create a daily post. April 2014

Today the month of April begins, and with it a daily post* on MkJunior. What I want to achieve is to build a habit by trying to publish a post every day until June. But what benefits does creating a post every day bring?

The first and foremost is to enrich this blog little by little, creating new content and new posts to help you understand and improve your SEO + social media + web analytics expertise. I started this blog in January of this year, and to date I have created 18 posts. Since the founding of the site, these entries have generated the following figures:


Total views: 1,182
Pageviews: 2,748
Duration: 2 minutes and 20 seconds

These are good numbers to start with and I'm happy, but I want to give the blog a push. It has been 3 months and now is when the game really starts :) — although it actually started from the first day of this blog, as you can see in the Steps in SEO chapters.

The second goal is to signal to Google that this blog is updated and generates quality content ;) which I hope will bring more visits and, above all, more points in my SEO strategy. Bear in mind that a website that does not generate content and is not updated is not important to Google, so it penalises you with lost positions, no matter if you have 1,000 links to your page or an otherwise optimal SEO strategy. Of course, I'm talking long term: Google does not demand that you update your website or blog every day, but after 1, 2 or 3 years without updates or an SEO strategy you will lose positions — and remember, positions are far easier to lose than to regain.

That is why I decided to create a post every day in April, adding new SEO chapters, writing my first web analytics posts, and generating more content on online marketing and social networking.

When the month ends, we'll see whether I gained more views or not; I'll share the results with you.

* Post = one post on each weekday, i.e. 22 posts in April.

This blog is validly registered with the Paperblog service under the pseudonym ignaciomk.

Read More

Does Google want to equate SEO with SEM?

I don't know if you have noticed, but Google has changed the format of the ads in its search results. With this change, Google seems to want to make SEM look like SEO, as the visual difference is no longer as big as it used to be. Look at the comparison:



There are only two small differences between ads and organic results: the word "Ad" in yellow and the small bar separating the ads from the organic listings. What is Google's aim with this change? Very simple: to earn more money. If before people didn't click on ad links — either because the background was yellow or because they knew it was an ad and preferred the first organic results (which are relevant) — with this change the two types of results are becoming hard to tell apart.

This may just be the beginning: in the future Google may remove the word "Ad" altogether, or the separator bar may disappear, or it may show more and more AdWords results and push organic results further down… So far I have only seen up to 3 ads in a single search, which means the 4th result is the first organic position.

Many changes may come, and Google holds the key to all of them. Consider that SEO as we know it depends on Google: if Google sees companies making money selling SEO services while it stops making money itself, perhaps we are entering the declining stage of SEO. I hope I'm wrong, because many people make their living from it, and I think we would all lose the essence of online marketing.

It will be increasingly important to invest in SEM to be well positioned, just as happened with Facebook's algorithm change, which forced companies to promote their posts to reach more users. What do you think of the change? Do you agree?

Read More

If Google dies, does SEO die too? #RipGoogle


This question came to me after writing the previous post: Does Google want to equate SEO with SEM? I don't think it will happen, at least not in the short to medium term, but in the business world it is well known that one day you're up and the next you're broke. It is unlikely, but taller towers have fallen. Anyway, let me answer the question:

If Google dies, does SEO die too?

To answer this question we must first ask another. You know that to rank a website in Google naturally you must follow a set of rules (algorithms) to appear in the top positions; therefore, I ask this:

Is Google SEO?

You know that SEO is the natural positioning of a website in a search engine, and Google is the most used search engine on the Internet, with around 90% market share worldwide. So you could say that yes, SEO is Google. If you look closely, it is Google that sets the rules of the game, and you and I follow them to keep our websites well positioned.

Whenever there is a change in Google's algorithm, SEO changes; therefore, I would say that Google controls SEO today. Why? Because of the monopoly it has achieved in its field and its status as absolute leader. People simply search on Google, see other search engines as mere look-alikes, and will not switch quickly.

However, that said, back to the question in the title of the article. If Google disappears, does SEO disappear too? I say yes. Being first in Google (SEO) means following a strategy that gets you to that goal: quality content, a 100% optimised website, presence on social networks, optimal backlinks, etc. But who set these guidelines and/or obligations? In case you didn't know: Google.

Therefore, I would say that SEO is a "quality indicator" that a search engine creates for each website, which determines your site's position in that search engine. Today, the one who sets the guidelines for positioning a website in search engines is the search engine itself, and the king here is Google, so it is in charge of SEO today. That is why, if the most famous search engine on the Internet dies, SEO will die with it.

Read More