I wanted to talk to you today, for a little while, about advertorials, native advertising, and editorial content. So let's start with the easiest stuff: editorial content. That's the meat and potatoes of whatever you're writing. If you're a blogger, it's the primary stuff you're writing about. If you're a newspaper, it's the news articles that you publish, online or in print. I think people have a pretty good sense of what editorial content is. So how about advertorial content or native advertising? Well, it's advertising...
At a very high level, the goal of your reconsideration request is, number one, to tell Google you've stopped whatever violations of the quality guidelines were happening: paid links, cloaking, scraping, doorways. Whatever it was, you need to make a clear and compelling case that it has actually stopped, that the behavior is no longer going on, and that you've cured as much as possible...
Google is willing to take manual action to remove spam. Suppose we only wrote algorithms to detect spam, and someone searches for their own name and finds off-topic porn. They're really unhappy about that, and they'll write to Google and let us know. And if we write back and say, "Well, we hope in six to nine months to have an algorithm that catches this off-topic porn," that's not a really satisfactory answer for the guy who has off-topic porn showing up for his name...
Cloaking is essentially showing different content to users than to Googlebot. Imagine that you have a web server and a user comes and asks for a page. You give him some sort of page. And now, let's have Googlebot come and ask for a page as well, and you give Googlebot a page. In the vast majority of situations, the same content goes to Googlebot and to users. Cloaking is when you show different content to users than to Googlebot. And it's definitely high risk; that's a violation of our quality guidelines...
URL shorteners are like any other redirects. If we crawl a page and we see a 301 redirect, then it will pass PageRank to the final destination. Using custom URL shorteners shouldn't affect SEO. The PageRank will flow through, the anchor text will flow through, and so I wouldn't necessarily worry about that at all. Now, just to let you know, if you look at, for example, Twitter's web pages, many of those links are nofollowed, so the links on the web page might not necessarily flow PageRank...
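As a quick sanity check on a custom shortener, you can confirm that it issues a 301 rather than a 302. Here's a minimal sketch using Python's requests library; the shortened URL is a made-up placeholder:

```python
import requests

# Placeholder shortened URL; substitute one from your own shortener.
short_url = "http://example-shortener.com/abc123"

# Don't follow the redirect automatically; we want to inspect the first hop.
response = requests.head(short_url, allow_redirects=False)

print(response.status_code)          # 301 = permanent redirect, the kind that passes PageRank
print(response.headers["Location"])  # where the shortener points
```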
When registering a new domain name to compete in some particular niche, you can take a couple of different strategies: either something really brandable that people will remember, or something strictly keyword-rich. People have reasonable disagreements about whether it's better to shoot for a keyword-laden domain or a domain that doesn't necessarily have the keywords in it but is a little more brandable.
Whenever you're talking about banner ads, the vast majority are sold via various advertising networks, which already block out bots. They don't want bots crawling their banner ads or messing with their impression counts, click counts, or anything like that. If you're using a standard banner ad package, most of the time those redirects will go through things that are blocked by robots.txt or that somehow are not crawlable by search engines.
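If you want to verify that for a particular ad network, you can check its robots.txt the same way a polite crawler would. A small sketch using Python's standard urllib.robotparser; the ad-network domain and click URL are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical ad-network redirect URL.
ad_click_url = "http://ads.example-network.com/click?banner=42"

rp = RobotFileParser()
rp.set_url("http://ads.example-network.com/robots.txt")
rp.read()  # fetch and parse the ad network's robots.txt

# False means Googlebot is disallowed from crawling the redirect,
# so the banner link can't flow anything through search engines.
print(rp.can_fetch("Googlebot", ad_click_url))
```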
I could get into a lot of really interesting stuff about how to crawl the web. If you really want to capture a signal, the Nyquist rate says you need to sample at twice its highest frequency. But the fact is: a web page can change at any time. So the whole idea, the conception of being able to crawl the entire web and have a perfect copy at every instant, is a little bit flawed, because at any given time we can only go and fetch a certain finite number of pages.
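For reference, the sampling criterion he's alluding to: to capture every change in a signal whose highest frequency is f_max, you need a sampling rate f_s of at least twice that:

```latex
f_s \geq 2 f_{\max}
```

Applied to crawling, a page that changed hourly would in principle need to be fetched every half hour to guarantee no version is missed, which is why a perfect copy of the whole web at every instant isn't feasible.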
Well, this is something where search engines can change their policy over time, because we might see the web evolving, or we might see webmasters having issues, those sorts of things. I can tell you about my experience of moving from mattcutts.com to dullest.com and then from dullest.com back to mattcutts.com. When I decided to move back, I used a 301 redirect. And it took a period of several weeks, because remember, 301s happen at a page level...
A lot of websites can have invalid code but actually render just fine, because a lot of modern browsers do a good job dealing with bad code. It's not so much that the code has to be absolutely perfect; it's whether the page is going to render well for the user in general. It makes sense to still list these websites. Plus, people make a lot of mistakes because they're using different HTML editors, or maybe they're doing some hand coding...
The original reason for recommending 100 links per page was pretty simple. At one point, Google would only index 101 kilobytes of a page. So we needed some heuristic to say: "don't make a page so incredibly long that we'll truncate it and not index the words at the end". So we said, "101 kilobytes, 100 links". That's a pretty rough measure, but if you're getting much beyond that, then that's a little unusual. But that 100-links-per-page guideline dates back eight or nine years.
Let me back up a little bit and talk about algorithmic versus manual. We have confirmed that Google's webspam team is willing to take action manually, for example, if we get a spam report, off-topic porn, things like that. But, of course, we also take that data and try to use it to improve our algorithms. So the engineers write classifiers for content spam and keyword stuffing and cloaking and sneaky JavaScript redirects. If your site is affected by an algorithm...
I can think of at least a couple, and I'll highlight those and then fill it out with some other misconceptions. Customer complaint sites aren't necessarily contributing to search rankings. The biggest one, the one that made me want to bang my head against the wall, was the idea that if you do negative things toward your customers so that they complain on customer complaint sites, then those links automatically count. It's a dangerous idea that if you abuse your customers...
Let's talk about that in a little more detail because sometimes people file lawsuits; you know, Alice has alice.com and Bob has bob.com, and maybe Bob takes Alice's name and puts it in the keywords meta tags, and then Alice sees that, and she gets really angry and she, you know, talks to Bob and maybe sues Bob and all that sort of stuff... Well, how much should Alice be worried about, you know, Bob using that one term "Alice" in the keywords meta tag?
Meta geo tags are not something that we look at very closely at all. We look at the IP address, and we look at the TLD, whether it's a gTLD or a ccTLD, a country code TLD (.fr, .de). There's also something in Google's Webmaster Tools where you can say, "my site is not just a .com that's about the entire world; it's a British .com, or it's a .com that really pertains to New Zealand or Australia". So you can highlight and say, OK, this .com is really about Germany...
Who you link to can affect your reputation. If you're linking to spammy sites, sites that we consider junky, scuzzy, spammy, whatever, that can affect your site's reputation. Certainly, if you're selling links within your blogroll, that can be very high risk. But just because your friend dropped from PR5 to PR0, it doesn't necessarily mean that it was the blogroll, and it doesn't necessarily mean that we thought he was selling links, you know. It could be a temporary thing with...
Whenever we look at whether a directory is useful to users, we say: "OK, what is the value-add of that directory? Do they go out and find their entries on their own, or do they only wait for people to come to them? How much do they charge, and what's the editorial service being provided for that charge?" If a directory takes $50, and every person who applies automatically gets in for that $50, there is not as much editorial oversight as something like the Yahoo! directory...
It does happen. And most of the time when it happens, it's because that's not original content. If you get an affiliate feed, which might or might not have images, and you have the same content on your e-commerce product page as four hundred other sites, then it's really hard to distinguish yourself. You have to ask, "Where is my value-add? What does my affiliate site, or my site that doesn't have original content, offer compared to these other..."
The fact is we do. We don't do it for all penalties because there are some bad guys, some spammers, and blackhats who you don't want to clue in. But for things like hidden text, if we think your site is hacked or even vulnerable to hacking, we've even talked about sending messages to people who we think have old Google Analytics packages. But we do send a large number of people warnings about their site to sort of say “Hey, heads up, we've had to penalize your...”
It does help a little bit to have keywords in the URL, but it doesn't help so much that you should go stuffing a ton of keywords in. Truthfully, I wouldn't really obsess about it at that level of detail. If there's a convenient way that's good for users to include four or five keywords, that might be worthwhile, but I wouldn't obsess about it to the level of how deep the URL is in the path or how am I...
In the early days of Google, it was funny, because people would rank in different countries based only on the TLD. So .fr meant that you were French, but that's all that we knew. Back in the 2000-2001 time frame, we started to look at where the server is located, its IP address, to say, "well, it doesn't end in .fr, but it is located in France according to its IP address, so maybe this is really useful for French users". That's the primary way you have an impact on Google's rankings.
I wouldn't say it's unethical, because it's stuff on your own website; you're allowed to control how the PageRank flows around within your site. I would say that it's not the first thing I would work on. I would work on getting more links and having higher-quality content; those are always the sorts of things you want to do first. But if you have a certain budget of PageRank, you certainly can sculpt your PageRank. I wouldn't necessarily do it with a nofollow tag, although you can...
Don’t worry about that very much, not very much at all in fact. Danny Sullivan had asked about this recently because there were some registrars that were sending around emails saying: "Did you know that Google gives you a bonus in ranking if you register your site for three or more years?" And just to clarify – that is not based on anything that we've said. The only thing that might be coming from us is that we did file a patent that basically said we can use historical data in our...
Google always has to trade off the balance between authority and topicality. If somebody types in "Viagra", which is one of the most spammed terms in the world, you want something that is about Viagra. You do not just want something that has a lot of authority, like Newsweek or Time, talking about something else in an article with one mention of Viagra, where they say, "Oh, this is something like Viagra," just as a throwaway phrase. Content has to be about what...
No, I wouldn't worry about that. In fact, it can be a very good idea to switch your site to HTTPS. HTTPS, or SSL, is a secure version of HTTP that encrypts things between your browser and the web server. So that keeps your boss, or your ISP, or your government from snooping on whatever is happening on that connection, unless they can do some crazy Mission Impossible, man-in-the-middle attack. And those are relatively rare. But let's get back to the question...
The conventional wisdom a few years ago was that meta tags mattered a whole lot. You really had to tweak them and spend a lot of time getting your keywords right: did you have a space or a comma between your keywords, and all that kind of stuff. And we've mostly evolved past that. We do use the meta description tag, but not the meta keywords tag. The pendulum might have swung a little bit too far in the other direction, because a lot of people sometimes say...
To the best of my knowledge, there aren't any guidelines as far as having too many tracking cookies. If you do have so many tracking pixels that it starts to slow down the user, or if you do so much tracking that you get written up in the Wall Street Journal and people really don't like you, then that might have an effect on your business. Ranking wise, I don't think that there's any impact from having cookies. Cookies can be used for a lot of different reasons, sometimes for tracking...
The information that you get from the Google toolbar is updated about three or four times a year. And the reason why we don't provide it every single day is that we don't want webmasters to get obsessed with the green in the Google toolbar and not pay attention to the things that deserve it: titles, accessibility, good content, and all those kinds of things. A lot of people, if you show them just the PageRank and update it every day, they're just going to focus on...
I worked on the initial version of SafeSearch for text. So let's concentrate on that. I don't want to give away anything that spammers could use, but I can talk about way back in 2000 how SafeSearch worked, so you can get an idea. And the idea is roughly what you would expect, which is: we look for certain words, and we give them certain weight. And if you have enough words with enough weight, then we sort of say – “OK, this looks like it might be a sort of porn or...”
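To make that concrete, here is a toy sketch of a weighted-word classifier in the spirit of that description. The words, weights, and threshold are all invented for illustration; they are not anything SafeSearch actually uses:

```python
# Toy weighted-word classifier: sum the weights of matched words and
# compare against a threshold. All values here are purely illustrative.
WEIGHTS = {"porn": 5, "xxx": 4, "adult": 1, "nude": 3}
THRESHOLD = 6

def looks_like_porn(text: str) -> bool:
    score = sum(WEIGHTS.get(word, 0) for word in text.lower().split())
    return score >= THRESHOLD

print(looks_like_porn("family photos from the beach"))  # False (score 0)
print(looks_like_porn("xxx adult nude pics"))           # True (4 + 1 + 3 = 8)
```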
No. You don't get any benefit in Google's organic rankings if you buy AdWords. There's no boost. There's nothing going on in the algorithm there, where if you buy AdWords you will rank higher in Google's organic search results. So don't count on that. Instead, try to make compelling, great content – the sort of thing that attracts links because it's really excellent. Don't count on buying AdWords to cause your organic rankings to go up.
Customer service, or user support, is a really interesting problem. Because normally, if you're a company, you're only supporting the people who buy your product. So even if you're a hugely successful company, you have a relatively small group of people that you have to support. With Google, we're literally supporting anybody on the web who wants to use our free web service. And so there's, at last estimate, something like 2 billion people on the web. A lot of them use...
You probably shouldn't be worrying about it at this level, but let's go back to the original formulation of PageRank as published by Larry way back in the day. The way you compute PageRank is: you take all the incoming PageRank (once you have PageRank at a page, a certain amount of it evaporates or decays, but don't worry about that; it just makes sure everything converges), and then you say, "given the number of outlinks on this page..."
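Written out, the original formulation he's describing looks roughly like this, where d is the damping factor (the "decay"), N is the number of pages, B_u is the set of pages linking to page u, and L(v) is the number of outlinks on page v:

```latex
PR(u) = \frac{1 - d}{N} + d \sum_{v \in B_u} \frac{PR(v)}{L(v)}
```

The sum is the part the question is really about: each page divides its PageRank evenly across its outlinks, so the more outlinks a page has, the less each individual link passes along.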
They're definitely not penalized in the sense that they don't receive any manual action that would make them rank lower. And we do try to compensate when there's good-quality content and people still make mistakes. All the time people ask, "why don't I get a bonus for having W3C code that validates really well?" There's a lot of great content that doesn't validate but is still really good. Just because somebody dots every i, crosses every t, and gets all of their HTML structure right...
The answer to this question has changed over time as we try to write new algorithms and find different ways of nailing down where content was originally written and where it first appeared (that is, the first time or the first place on the web that we saw the content appear). If you happen to write something and publish it, and we crawl it and see all that content, and then it shows up two years later somewhere else, well, it's more likely that the source is where we first...
The answer is no, at least not for Google. That's not a ranking factor right now. But I would recommend that you look into making sure that all of your content works, just for your user's sake. Because whenever a user's browsing, different browsers will do different things. Internet Explorer might pop up a warning. Chrome might have a little x through it that's red. And sometimes, that can make people a little more stressed, even if it's something that's relatively safe...
I wouldn't recommend going to those lengths. We'll typically crawl almost any filename extension. There are a few, like maybe .0 or .exe or something like that, that we would definitely not recommend. But taking a URL with .php or .html and adding a .free or .cheap on the end is probably going a little bit too far. It's not something that I would really recommend...
In theory, 404s could be transient: a page could be missing and then come back later. Technically, if you really want to signal that a page is completely gone and will never come back, there's an HTTP status code called 410. But at least as of the last time we checked, back in 2007, we actually treated those the same way. But to get to the meat of your question, why does it take so long: the answer is that webmasters can do kind of interesting things...
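If you do want to send a 410 for pages that are gone for good, here's a minimal sketch using Python's standard http.server; the set of retired paths is hypothetical:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical set of pages that are gone forever.
GONE_FOREVER = {"/old-product", "/retired-page"}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in GONE_FOREVER:
            self.send_response(410)  # Gone: removed permanently
        else:
            self.send_response(404)  # Not Found: possibly transient
        self.end_headers()

HTTPServer(("", 8000), Handler).serve_forever()
```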
I have to admit you're swimming upstream here because if you have a .jp domain and you want to target Finland, you're really going against a lot of expectations and conventions that people use on the net. So one thing to think about is whether it's possible to get a generic TLD that you could then use for other countries. Certainly you can try to segment stuff into different subdomains or subdirectories...
At least the last time I checked, which was a while ago, it is not used as a direct signal within our ranking. So it's not one of the over 200 different signals that we use to assess the quality of a page. But I think it would be fair to think about using it as a signal. For example, we noticed a while ago that if you look at the PageRank of a page, how reputable we think a particular page or site is, the ability to spell correlates relatively well with that. The sites that have lower PageRank, or very low PageRank, tend not to spell as well, which is a pretty interesting effect...
Trust is sort of a catch-all term that we use. So PageRank is the most well-known type of trust. It's looking at links and how important those links are. So if you have a lot of very high quality links, then you tend to earn a lot of trust with Google. There are other signals. There are over 200 different signals that we use in our ranking. But you can kind of break them down into this notion of sort of trust and how well you match a particular query...
So one thing, before we even get into the topic of DMOZ, is that it's hard to tell sometimes why a site is ranking. Historically, Google has had the link: operator, which returns the backlinks, or some sub-sample of the backlinks, to people. But we don't show every single backlink that we know of in response to link:, because we show that more on the Webmaster Tools side. So you can see your own backlinks, but we don't give a full list of all your backlinks to the people who would compete with you...
That's a really fun question for a couple of reasons. So you can think about PDFs specifically, and there's not that much to do in terms of optimization. For one thing, I'd make sure that it's actually text, because you can have a PDF that's primarily composed of images. We might be able to OCR those over time, but really, if you have text in that document, it's a lot easier for us to index. You also want to make sure that you choose good titles...
Well, the fact is, we're looking at using toolbar data, and only toolbar data from people who have opted in. That's looking at real-world load times from real people. For example, if you're in the United States, we might ask: how long does it take to load this particular page? And if we're looking at that and it takes a long time, sometimes it's not necessarily your site; it could be the network connectivity. But it's a good thing to bear in mind: the data is coming from all these different users, who can have dial-up lines...
So you show up, and suddenly your traffic has taken a big drop. One thing I would do very early on is a site:mydomain.com query to figure out: are you completely absent from Google, or do parts of your site show up? That's also a really good way to find out whether you are partially indexed. Or if you don't see a snippet, then maybe you had a robots.txt that blocked us from crawling...
You would not believe how common it is. I don't mind telling you that Donald Trump has had a website hacked. Al Gore has had a website hacked. This sort of stuff can happen to anybody. So let's talk about some of the free tools and resources that Google provides, as well as exists on the web, to help clean this stuff up. OK. So, first and foremost, there's something called the safe browsing diagnostic page...
The answer is no. We don't consider SEO to be spam. Now a few really tech savvy people might get angry at that. So let me explain in a little more detail. SEO stands for Search Engine Optimization. And essentially it just means trying to make sure that your pages are well represented within search engines. And there's plenty of white hat, great quality stuff that you can do as a search engine optimizer...
So first, let me give a little bit of history about why, whenever we see an underscore in a URL, we join the words around it rather than treating it as a separator. So what do I mean? Well, if you have red-dash-widget in a URL, we view that dash as a separator, so we index the word red and we index the word widget, and those are separate. Whereas if you were to have war of 1812 with underscores, so war_of_1812, instead of separating on the underscores we actually glom all those together...
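As a loose analogy, most programming languages treat the underscore as a word character while a dash is a separator, so you can see the same glomming behavior with a simple regex in Python:

```python
import re

# A dash is a separator: two words come out.
print(re.findall(r"\w+", "red-widget"))   # ['red', 'widget']

# An underscore counts as a word character: one glommed token.
print(re.findall(r"\w+", "war_of_1812"))  # ['war_of_1812']
```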
Of course, it's definitely the case that if there are low-quality links, we will tend to write algorithms that don't necessarily trust those links. But if I had to take a guess, here would be my guess: a lot of people think that PageRank corresponds to popularity, and that's not really true. Let me give you a very simple example to illustrate that. The Iowa Real Estate Commission Board probably gets a fair number of links...
So you might be moving from old domain to new domain. Let's talk through some of the possible things to be thinking about and some of the stuff that you might want to make sure you avoid as far as problems. OK. Here's what we've got. We've got a site, maybe with a subdirectory or a subdomain. In fact, we've got two or three sites here. And you've got a new domain. Right now, it's just a parked domain...
Today I wanted to talk a little bit about how to move your stuff from one web host to another web host. So maybe you didn't like the deal that you were getting with your current guy, and you're switching to a new web host. Your domain name remains the same. So maybe it's mattcutts.com or example.com. Your domain name is the same. But you're moving to a new IP address because you're moving to a new web host...
The most common case for a 301 redirect is you're moving from one site to another site. And if you're doing that, you can put 301 redirects to go to the root of the new domain. But that's kind of a waste. If somebody is looking for a very specific page, and they end up going to the root page of the new domain, that's really not as useful. So what we recommend doing is doing a 301 redirect from the old page location to the new page location on the new site...
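A minimal sketch of that page-to-page mapping, using Python's standard http.server; the domain and paths are made up:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical old-path -> new-URL mapping: every old page gets its own target.
REDIRECTS = {
    "/widgets.html": "https://new-domain.example/products/widgets",
    "/about.html":   "https://new-domain.example/about",
}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Fall back to the new root only for pages with no specific mapping.
        new_url = REDIRECTS.get(self.path, "https://new-domain.example/")
        self.send_response(301)  # permanent redirect
        self.send_header("Location", new_url)
        self.end_headers()

HTTPServer(("", 8000), Handler).serve_forever()
```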
I'm browsing through Techmeme: McDonald's hacked, customer data stolen; Gawker hacked; and so on. I have no idea how all these hacks work. So one thing we've been working on, that's on our radar, is trying to help hacked sites. We've been spending a fair amount of time on that. We're increasing messaging. The other big thing on our radar is probably communication...
We do use Twitter and Facebook links in ranking, as we always have, in our web search rankings. But in addition, we're also trying to figure out a little bit about the reputation of an author or a creator on Twitter or Facebook. Let me just give you a little bit of background on that. I filmed a video back in May 2010 where I said that we didn't use that as a signal, and at the time, we did not. But now, we're taping this in December 2010, and we are using that as a signal...
My guess is most people do not want to tell the search engines, "Oh no, please, you're sending me too much traffic. Go away, I don't want you to show me to a lot more users and visitors." So it honestly is not a request that we've heard that often; I don't know that I remember anyone ever asking for that before. So we have to prioritize in terms of engineering resources...
It's a fair question. I think one thing you can do is send people to the page that we have on Google about search engine optimizers and how to find a good one, because I think we say directly that no one can guarantee a number one ranking on Google. So you can say, right from the horse's mouth, right from Google: no one can guarantee a number one ranking. The fact is that if you're, say, a plumber, you probably don't want to rank number one for plumber...
I even have a little bit of a problem with how you're phrasing the second question, "if I publish uncopyrighted content," because, at least in the United States, whenever somebody writes something, it's copyrighted. You don't have to go to the Copyright Office and officially get something that says, "Here it is, here's my proof that it's copyrighted." Within the United States, as soon as you write it, you have basically copyrighted it...
Well, there are a few reasons why this might happen. First off, maybe you've got personalized search, and maybe you've clicked on things or not clicked on things, so you're getting different search results than a lot of other people might be seeing. The second thing is that it can depend on the country. So maybe for the query [bank], you're number one in Australia but not number one in America...
The PageRank matters if you want your Twitter page to rank, because PageRank is one of the more than 200 different signals that we use in our rankings. So if you want to do a search like [Matt Cutts] and have twitter.com/mattcutts show up, then the PageRank does matter for that Twitter profile page. It generally doesn't matter for the outgoing links on that page, because those tend to be nofollowed...
I think the answer to the question is yes. In general, if you've had a domain for 7 months and you search for that domain, maybe even the exact URL like www.example.com and it doesn't show up in Google at all, it might be safe to conclude that the domain does have trust issues with Google. Now maybe it's a really esoteric country code. We're having trouble finding links to it. But if you know of links pointing to the domain and we still don't show it...
In one sense, yes: the best answer is to do the search, scroll through, and see if you see yourself. That's the short answer. The long answer is you really shouldn't be doing that; that's the wrong attitude with which to approach SEO. Take a different look at it: rather than concentrating on one or two specific trophy phrases, you should be thinking about the long tail and all the different phrases that can bring people to your website...
That is completely false. There is absolutely no difference between 2004 domains, 2005 domains, and 2006 domains. All domains acquire reputation in the same way. There is literally no value in getting a pre-2004 domain, or a pre-Google-IPO domain, or whatever conspiracy theory you want to spew out. Please, we have enough crazy SEO theories going on without new ones. So there's no difference whatsoever...
The answer is, usually they do, but not in all situations. So let me give you a couple of situations where it could differ. Personalization is based on who we think you are, so whether you're logged in or not logged in might make a difference; maybe you're logged in on one browser and not logged in on the other. Another thing that might make a difference is that different browsers sometimes support different functionality...
I think I have actually been relatively consistent over time, which is to say, at best, it's usually a waste of your time; it's a second-order effect. You're much better off spending your time creating new content that will get links, rather than worrying about PageRank sculpting within your own site. So let me say it a little more clearly: I would not use nofollow on internal links...
The answer to that question is no, and if I could reach through the screen and hit the Esc key and Ctrl-C and Break, I would, because that's cloaking. You never want to do something completely different for Googlebot than you'd do for regular users; that's the definition of cloaking. If you have something in your code saying, "if user agent equals Googlebot, or if the IP address is Google's, do something different"...
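To be clear about what that looks like, here is the kind of check you should never have in your code. This is the anti-pattern, sketched in Python only so you can recognize it:

```python
# DON'T DO THIS: serving different content to Googlebot is the definition of cloaking.
def render_page_for_crawlers() -> str:
    return "<html>keyword-stuffed page shown only to Googlebot</html>"

def render_page_for_users() -> str:
    return "<html>the page real visitors see</html>"

def handle_request(user_agent: str) -> str:
    if "Googlebot" in user_agent:        # the check Matt is warning against
        return render_page_for_crawlers()
    return render_page_for_users()

print(handle_request("Mozilla/5.0 (compatible; Googlebot/2.1)"))
```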
The fact is that since WordPress is so popular, and so widespread, it is subject to a lot more attempts by hackers, especially people that have figured out that there are old versions of WordPress that are a little easier to exploit. So the very first thing that I do, is I try to make sure that I always have my server patched up-to-date...
An addon domain might be something that your web host offers you, where it's basically related; maybe it's all part of the package deal where you can get matt-cutts.com. And typically, an addon domain might have a connotation of being a separate site. Suppose, for example, I have mattcutts.com, and maybe I also want to register matt-cutts.com. My personal advice would be, rather than developing those as separate sites...
I think it probably is not all that much, and I'll give you a reason or two for that. Number one, take a look at how many people follow, I don't know, me or Danny Sullivan on Twitter, or how many people subscribe to my blog, and the numbers that you end up with are about 50,000. So 50,000 people follow Danny, about 50,000 people follow me, and about 50,000 people read my blog. So what that's telling me is that whether you look at any popular blog...
The answer is no. Webspam does not use Google Analytics, and a while ago I went and checked that search quality in general does not use Google Analytics in ranking. So you can use Google Analytics or not use it; it won't affect your ranking within Google search results. Now, just to cover one tiny corner case, in case you're very curious, which is: what if my site was very, very slow...
A lot of webmasters have been talking about it, so I'm glad I have a chance to address it. This is something that WebmasterWorld called "Mayday", and it sort of happened April 28 through May 3rd-ish, right around the first of May, so that's why they've been calling it Mayday. It is an algorithmic change. It does affect long-tail searches more than head searches...
I can see at least a couple possible answers to that. The answer that I would give you first and foremost is that I always worry whenever you give someone the option to say: "I'm relevant to country A, Country B, Country C, Country D", all the way down to Country Z, that at some point, they say: "You know what? Yeah, I'm relevant to all those countries. I'm relevant to Chad, I'm relevant to Chile. Tag, you know, show me for every single country"...
Well, that depends. Do you want good search engine rankings in the Web 2.0 world? Because I think you do. Site maps are not just good for users; even on-page HTML site maps can be a fantastic way to distribute PageRank throughout your site. One of the best ways to test how crawlable your site is, is to click around within your site and see if you can reach every page...
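One way to approximate that "click around and see if you can reach every page" test is a tiny breadth-first crawl of your own site. A rough sketch; requests is a third-party library, the start URL is a placeholder, and the href extraction is deliberately naive:

```python
import re
from urllib.parse import urljoin

import requests

start = "https://example.com/"  # your site's root page
seen, queue = {start}, [start]

while queue:
    url = queue.pop(0)
    html = requests.get(url).text
    # Very naive link extraction; good enough for a quick reachability check.
    for href in re.findall(r'href="([^"]+)"', html):
        link = urljoin(url, href)
        if link.startswith(start) and link not in seen:
            seen.add(link)
            queue.append(link)

print(f"Reached {len(seen)} pages by following links from the root.")
```

Any page that should exist but never shows up in that count is a page a crawler can't reach by links alone, which is exactly where an HTML site map helps.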
Well, let’s talk about when you are logged in, because that’s a much easier case to talk about. If we know where you are located, whether you're using a mobile phone, or whether you're using a web browser, we can return you better results. If you type in pizza, or yoga instructor, or plumber, you’d much rather see the ones that are close by than something that’s located in San Francisco if you are in New York, or vice versa...
Let me take a step back and sort of rephrase the question. I think it's saying, look, if everybody has really bad SEO and they can't rank the way that they want then they have to buy ads and isn't that a good thing for Google? I think that's kind of a short-term view of things and Google tries to take the long-term. For example, we never showed pop-up ads on Google even though it might have meant a little bit more money...
Typically, you might have a blog or something like that, and then over in the right-hand sidebar, or over on the left sidebar, you'll see a whole bunch of different tags that have been used to mark the different blog posts. And sometimes the tags are a little bigger if they've been used more, or a little smaller if they haven't. So what is a tag cloud? Well, typically, it's really just a list of links...
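Since a tag cloud really is just a list of links with font sizes scaled by how often each tag is used, the sizing logic can be as small as this sketch (the tag counts are invented):

```python
# Hypothetical tag usage counts.
counts = {"seo": 40, "google": 25, "video": 10, "misc": 2}

MIN_PX, MAX_PX = 12, 32
lo, hi = min(counts.values()), max(counts.values())
span = (hi - lo) or 1  # avoid division by zero if all counts are equal

for tag, n in counts.items():
    # Linearly scale each tag's font size between MIN_PX and MAX_PX.
    size = MIN_PX + (MAX_PX - MIN_PX) * (n - lo) / span
    print(f'<a href="/tag/{tag}" style="font-size:{size:.0f}px">{tag}</a>')
```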
There is a very short answer, which I'll give in a minute. But I'll tell you a little story first. At PubCon a few years ago, which is a WebmasterWorld conference, somebody said, "Do you use Google Analytics in webspam?" And I said, "Not only do we not use it in webspam, I promise you my team will never go and ask the Analytics team to use their data"...
Historically, Google has been good at several different things. But some of the stuff that we're especially good at is: comprehensiveness; relevance; freshness, meaning how fast we can index information; the speed with which we're able to return search results; and the overall user experience. I think we're going to keep drilling down on all of those areas...
That was a very good question, and we haven't really done a ton of blog posts about how we compute breadcrumbs. So I sent an email to the people who had done the blog post and asked them about that. The rough answer that I got back is that you should have a set of delimited links on your site that accurately reflect your site's hierarchy. The thing that I would add as color to that is that it's still early days for breadcrumbs...
My standard answer, and I'm talking about web search rankings, has always been that Google basically treats links the same, you know. We use the fact that you have PageRank, so we know how reputable a site is; you don't just look at the number of links to a site, you look at how reputable those links are. It doesn't really matter whether links come from a .gov or a .edu, and that applies to Twitter or Facebook as well...
Relevance is the most important thing. If you have two sites that are equally relevant, with the same backlinks, and everything else is the same, you'd probably prefer the one that's a little bit faster. So page speed can, in theory, be an interesting factor to try out in scoring different websites. Relevance is the primary component. We have over 200 signals in our scoring to try to return the most relevant, the most useful, and the most accurate search results that we can find...
We don't know what's happening on the side of your web server. Your web server could be running Perl. It could be running PHP. It could be running Python. It could be running Ruby on Rails. All we see is what the web server returns. Your web server could be running code that goes and talks to Amazon's cloud, or Appspot, or anywhere else in the cloud, but we wouldn't even know that. We don't even know whether a page is dynamically created or statically created...
By definition, if somebody links to you and then you link to them, that's a reciprocal link. Personally, and this is just general advice, you know, if you link back every single time that somebody links to you, then it's almost like you're tooting your own horn. Sometimes it's nice to find people just talking about your site without you showing up to sort of shoehorn in on the conversation or point people to it...
A lot of people misunderstand the difference between cloaking and geo- or IP-based delivery. IP-based delivery is delivering content based on IP address, which makes sense, right? Cloaking is showing different content to users than is shown to Googlebot. Cloaking is a type of IP delivery, but not all IP delivery is cloaking. Specifically, a type of IP delivery that is not cloaking is showing different content to users from different countries...
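The safe pattern is to key the decision only on the visitor's country, and to give Googlebot exactly what any other visitor from the same IP range would get. A sketch; country_for_ip is a hypothetical stand-in for whatever geo-IP lookup you actually use:

```python
# Geo-based delivery that is NOT cloaking: the choice depends only on the
# visitor's IP-derived country, never on whether the visitor is Googlebot.
PAGES = {"FR": "page_fr.html", "DE": "page_de.html"}
DEFAULT_PAGE = "page_en.html"

def country_for_ip(ip: str) -> str:
    # Hypothetical stand-in for a real geo-IP database lookup.
    return "FR" if ip.startswith("82.") else "US"

def page_for_request(ip: str) -> str:
    # Googlebot crawling from a US IP simply gets the same page
    # as any other visitor from a US IP.
    return PAGES.get(country_for_ip(ip), DEFAULT_PAGE)

print(page_for_request("82.64.1.1"))    # page_fr.html
print(page_for_request("66.249.66.1"))  # page_en.html, same as any US visitor
```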
If you make sure that the pages on your site link back to you, you know, the articles have links to yourself, then if someone scrapes you, they might end up linking to you. And to the extent that the scraper or spammer is successful, those links will help you along. So there are some people who really, really hate scrapers, and try to crack down on them, and try to get every single one deleted or kicked off their web host...
Google has gotten better and better at crawling JavaScript, to the point where URLs that you might have put in JavaScript, thinking they were never going to be crawled, could possibly be crawled and indexed now. It turns out that the vast majority of people who do JavaScript sorts of links are typically ad networks, and we handle all the common cases where, you know, there's JavaScript being used to serve up ads...
I don't know whether Google links never appear in Google Webmaster Tools. I think the Google blog has me on its blogroll or something, so I've seen some links from Google. People hold Google to a very high standard. You know, if we have search results that are not blocked by robots.txt, people find out, notice, and let us know, and, you know, blog about it, and it drives a lot of attention...
Well, my first guess, without knowing any specifics, is that maybe this one got links from a very reputable site, so it has more PageRank. We still do look at links, and we still do look at PageRank. If this is the one that happened to get written up in a very reputable sort of location, then it could be that this one has more PageRank, and that's why it outranks the other...
Now, don't go hating on Twitter or trying to, you know, bust heads. Twitter has many, many great uses. It's great for breaking real-time news, and it's fantastic for asking your friends things. Google, on the other hand, tries to return really reliable, really reputable information. So if you're sorting by date, Twitter is fantastic. If you want an answer to a question that's been around for a while, Google's great for that...
It's a fact. Amit Singhal has talked about it in the New York Times: we believe there are some queries that deserve freshness. "QDF" was how he described it in the New York Times. That is fact, not fiction. Query Deserves Freshness is a part of Google's algorithm designed for queries where the best results are changing very quickly...
Let me give you a little bit of history here. It actually turns out that we used to not use the meta description at all; we would only use the snippet appropriate to the specific search query. Only in recent years have we added the behavior where, if you have a meta description, we will sometimes choose that meta description over a snippet from within the page. In fact, it's moving in the other direction...
That would be a really big undertaking. I think it would be fun. You know, we've joked around the pool table about "Wouldn't it be great if we crawled the web, found all the images, and ran OCR on all the images on the web?" That really would be a lot of work. I think it would be a fun idea, but I'm not sure that you should count on that in the short term from Google...
I'll give you the same answer regardless of whether it's real estate or any other industry. I think there are a couple of things to bear in mind. Start with a small niche. Don't just say, OK, I'm going to rank number one for [blank] real estate, or whatever your trophy phrase is. It's probably better to concentrate on individual neighborhoods or individual markets; going for the whole consumer market is a little bit too big a thing to grasp at once...
Frankly, I wouldn't worry about it. We see tables, we see CSS, and we have to handle both, so we try to score them well, you know, no matter what kind of layout mechanism you use. I would use what's best for you. A lot of people these days tend to like CSS because it's easy to change your site. It's easy to change the layout. It's very modular...
Well, it is unclear whether you're asking about how many levels deep it is in terms of directories or how far it is from the root page. One way you can make sure that Google reaches those pages is link from your root page, your main page, directly to the deep pages that you want to get crawled. So, we tend not to think about how many directories deep a page is but do look at how much PageRank a page has.
Let me take the mention of tinyurl.com and substitute URL shorteners in general. The answer is: whenever we crawl one of these links, if they do a 301, yes, we do follow those and flow the PageRank as we normally would with a 301 from any other site. Danny Sullivan did a really great piece about URL-shortening services, where he took the top URL-shortening services...
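To see the whole chain a shortener produces, the requests library records every hop in response.history; the short URL below is a placeholder:

```python
import requests

# Placeholder shortened URL.
response = requests.get("http://example-shortener.com/abc123", allow_redirects=True)

# Each hop in the chain, oldest first, then the final destination.
for hop in response.history:
    print(hop.status_code, hop.url)  # 301 hops flow PageRank like any other 301
print(response.status_code, response.url)
```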