
Manish703777's Posts



Programming / Capitalising On The Ultimate Form Of Duplicate Content by Manish703777: 5:35pm On Sep 09, 2021
The first time I ever accessed the Internet was from my mother's work computer in late 1995. I was eleven years old and her homepage was set to Yahoo. I can't really remember what it looked like, but Googling (oh, I hate the irony too) "Yahoo in 1995" produced a post by John Battelle with a magnificent screen cap of the portal in the mid-90s. This was thirteen years ago (so, over half my lifetime), and my memory might not be serving me very well, but I'm fairly sure that the first thing I ever searched for was song lyrics. Probably to a very bad 1995 song. My father wanted to try it next and he searched for the lyrics to "Flower of Scotland." That, I remember.
Today, searching for lyrics is a horrendous task. Most top-ranked lyrics websites look like MySpace threw up on GeoCities and, if I dare to click on a result, inundate my computer with pop-up advertising. Earlier today, I actually stumbled on an instance of a robotic voice congratulating me for having won two iPod nanos. To get a coherent result and not be presented with the "Are You Stupid?" test, you have to memorise which sites are worthwhile to click on.

How do search engines really determine which sites should rank well for song lyrics-related material? This niche seems to be relatively competitive, with advertising being the business model of choice. The first big problem is certainly duplicate content. This is an especially important question when it comes to lyrics because of people's tendency to take a sample of a song they've heard and search for it without knowing the song's name. If there are thousands of instances of the same song present online, how does a site make sure its version is ranked?

The suggestions Google shows for searches beginning with "lyrics" are a good place to start when analysing what search engines value for these types of searches.
Currently popular music obviously dominates. Choosing the search "lyrics to take a bow," you'll see that Google presents results for a currently popular song of that name by Rihanna, as well as a 2007 track by Leona Lewis and a fourteen-year-old song by Madonna. Edit: two YouTube videos have made it into the mix in the last twenty-four hours, knocking out the Leona Lewis song.
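If you want to poke at these suggestions yourself, here's a rough sketch in Python. It leans on Google's unofficial suggest endpoint (suggestqueries.google.com), which is undocumented and could change or be throttled at any time, so treat it as an exploratory hack rather than a supported API.

# Pull Google's suggestions for a "lyrics"-style seed query.
# Assumes the unofficial suggest endpoint, which may change without notice.
import json
import urllib.parse
import urllib.request

def google_suggestions(seed):
    # client=firefox returns a simple JSON array: [query, [suggestions, ...]]
    url = ("https://suggestqueries.google.com/complete/search?client=firefox&q="
           + urllib.parse.quote(seed))
    with urllib.request.urlopen(url) as resp:
        charset = resp.headers.get_content_charset() or "utf-8"
        return json.loads(resp.read().decode(charset))[1]

for suggestion in google_suggestions("lyrics to take a bow"):
    print(suggestion)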

Programming / How It's Feasible To Manually Review All Domains by Manish703777: 12:59pm On Sep 03, 2021
After watching Nate Buggia speak about Live's Webmaster Tools a few weeks ago, I was struck by his statistic about the number of domains on the web. He suggested that there are 78 million domains. There's certainly room for disagreement about this number (don't forget, Google claims one trillion web pages), but I bet he's in the right ballpark. If that's right, could we manually review all of them?
Sure, 78 million domains is big. But not that big. A few months ago, while investigating spam, Danny reviewed 500 fairly randomly chosen domains in a matter of hours. And I think he did a great job of it, too. That's a good foundation, but could we scale that up and review millions of domains?

I see a few challenges here. Probably the biggest is simply getting that list of Live's 78 million domains. Next, you're going to need a lot of manual reviewers. But if you're Live (or some other search engine), you've already got that list and a large contract labor force. Too bad for the rest of us.
I suppose if you're clever you might be able to do this through Alexa's Web Information Service and Amazon Mechanical Turk. Looking at the Mechanical Turk pricing, it seems you could pay one cent for every domain reviewed (or maybe one cent per block of a few dozen domains). So we're probably talking about tens or hundreds of thousands of dollars. But that's pocket change for Google. And Google has plenty of remote offices with lots of search quality engineers. In fact, they say, "Google makes use of evaluators in many countries and languages. These evaluators are carefully trained and are asked to evaluate the quality of search results in several different ways."
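The back-of-envelope cost works out like this (a minimal sketch; the per-HIT rates are the assumptions from the paragraph above, not real Mechanical Turk pricing):

# Rough Mechanical Turk cost for reviewing 78 million domains.
# All figures are assumptions from the post, not actual Turk rates.
DOMAINS = 78_000_000
REWARD_PER_HIT = 0.01  # one cent per task

one_domain_per_hit = DOMAINS * REWARD_PER_HIT            # $780,000
block_of_36_per_hit = (DOMAINS / 36) * REWARD_PER_HIT    # about $21,700

print(f"One domain per HIT: ${one_domain_per_hit:,.0f}")
print(f"36 domains per HIT: ${block_of_36_per_hit:,.0f}")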

So let's say a single person can review 1,000 domains in a single day. And let's say you've got 1,000 reviewers working on this problem. That tells me that 78 days later you've got all the relevant domains on the internet reviewed. That's less than 10% of Google's workforce, and less than 2% of Microsoft's workforce. Of course, you could do it with fewer people if you pre-filtered some of those domains or took more time to do it. If Google, Yahoo!, and Live haven't already done this... well, I can't imagine that they haven't done at least part of it by now.
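The same arithmetic, spelled out (the numbers come straight from the paragraph above):

# Reviewer-throughput estimate: 1,000 reviewers at 1,000 domains per day each.
DOMAINS = 78_000_000
REVIEWERS = 1_000
DOMAINS_PER_REVIEWER_PER_DAY = 1_000

days = DOMAINS / (REVIEWERS * DOMAINS_PER_REVIEWER_PER_DAY)
print(f"{days:.0f} days to review every domain")  # 78 days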

Programming / Together, We Can Create An Accurate Formula For Predicting Keyword Search Volume by Manish703777: 12:52pm On Sep 03, 2021
An astute reader emailed me (Preston, you're a genius) to suggest that, rather than attempting to collect PPC stats on our own, we tap into the vast resources of the Internet marketing community - surely, together, we will be far stronger than any one firm could be on its own.
Here's the goal - to find a formula whereby we can measure the efficacy and accuracy of keyword research sources like Overture, KeywordDiscovery, Hitwise's Keyword Intelligence, Google's AdWords Estimator, MSN's adCenter, and others. There are just a few simple steps to follow:
1. Gather a large list of hundreds, hopefully thousands, of keyword terms and their actual search numbers at Google - this can be done by getting data from advertisers who were always listed on the 1st page of results for the term through AdWords (using the impression numbers for "exact match").
2. Gather a list of the estimated numbers reported by all of the major keyword research services for those same terms and phrases.
3. Distill the data, tossing out those phrases that might cause issues or those reports that wouldn't have a huge impact, and keep only the best, most useful ones.
4. Compare the two sources (AdWords impressions and the estimator tools) and find out fascinating information like - which tool is most accurate? Is there a formula we could create that would predict (based on all the estimation sources) within 10%, 20% or 30% of actual traffic?
To my mind, this formula would be priceless to search marketers. Once you can accurately predict search traffic (even if it's off by 20-30%), you've got a system that doesn't require a massive spend at Google to predict actual searches. Even if the data isn't conclusive, the results will be incredibly enlightening.
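As a concrete illustration of step 4, here's a minimal sketch of the comparison. The phrases, impression counts, and tool names are entirely made up; the point is just to show how a median percentage error and a simple correction factor could be computed per tool once participants' data is in hand.

# Compare each tool's estimates against actual AdWords exact-match impressions.
# All data below is hypothetical, purely for illustration.
from statistics import median

# phrase -> actual monthly exact-match impressions (hypothetical)
actual = {"take a bow lyrics": 120_000, "flower of scotland lyrics": 8_500}

# tool -> {phrase -> estimated monthly searches} (hypothetical)
estimates = {
    "ToolA": {"take a bow lyrics": 95_000, "flower of scotland lyrics": 11_000},
    "ToolB": {"take a bow lyrics": 210_000, "flower of scotland lyrics": 6_000},
}

for tool, est in estimates.items():
    errors = [abs(est[p] - actual[p]) / actual[p] for p in actual]
    scale = median(actual[p] / est[p] for p in actual)  # simple correction factor
    print(f"{tool}: median error {median(errors):.0%}, "
          f"multiply its estimates by {scale:.2f} to approximate actual traffic")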
We have a few clients (and some botched test data) that we can provide for the project, but we really need your help to make it possible. My promise is that we'll collect the data, report it all honestly (but anonymously - not revealing any of your keywords or those of clients) and build a tool that will use this formula. We'll also provide the exact formula publicly, so that anyone can use the same information to build their own tools, use it in their internal calculations, etc.
How can you help?
1. If you are willing to participate, please email Rebecca@SEOmoz.org. We'll figure out all the necessary details, whittle down what's required, and request screenshots to help make the data collection happen.
2. If you are a PPC expert and can help us to analyze the data and issue the requirements, please email Rebecca as well; we'll need some help from the best.
3. If you're a blogger, you could ask your readers to do likewise (you don't need to link here, you can just ask them to email Rebecca; I don't want it to look like I'm begging for link love)
4. If you participate at forums like DigitalPoint, Sitepoint, WebProWorld, SearchEngineWatch, HighRankings, etc. you could post about this and ask for volunteers (again, no need to link).
5. If you work for Google, you could just make the AdWords estimator accurate and save all the trouble (we'll even send you a lovely present).
What do you say? I think that together, we could make something that will be truly valuable for years to come (and, as the data becomes less accurate, we can refresh it).
Am I off my rocker here, or is this an idea that has some merit? Any suggestions, additional ideas, important points to consider?
p.s. Rebecca - I'm sorry to flood your inbox like this





