Trevor got this interesting email today regarding a domain we purchased a few months ago. The email is shockingly deceptive:
Attn Bailey, Trevor
This letter is to inform you that it’s time to send in your search engine registration for COSMETICSURGEONNORTHCAROLINA.COM.
Failure to complete your search engine registration by Jan 26, 2011 may result in the cancellation of this offer (making it difficult for your customers to locate you using search engines on the web).
Your registration includes search engine submission for COSMETICSURGEONNORTHCAROLINA.COM for 1 year. You are under no obligation to pay the amount stated above unless you accept this offer by Jan 26, 2011. This notice is not an invoice. It is a courtesy reminder to register COSMETICSURGEONNORTHCAROLINA.COM for search engine listing so that your customers can locate you on the web.
The email points to this link, which is clearly an invoice.
The domain is registered to:
200 Park Avenue South
I'm guessing this is a faked registration, as the number format appears to be invalid. Fortunately, Google provides some good advice on how to steer clear of these scams. From the Official Google Blog:
How to identify scams and other schemes
In general, if it looks too good to be true, it probably is. Here are some pointers on what to look out for:
Before you fill out a form or give someone a credit card, do a web search to see what other people are saying about the company and its practices.
Be wary of companies that ask for upfront charges for services that Google actually offers for free. Check out our business solutions page before writing a check.
Always read the fine print. Watch out for get-rich-quick schemes that charge a very low initial fee before sneaking in large recurring charges on your credit card or bank account.
Google never guarantees top placement in search results or AdWords — beware of companies that claim to guarantee rankings, allege a special relationship with Google, or advertise a “priority submit” to Google. There is no priority submit for Google. In fact, the only way to submit a site to Google directly is through our Add URL page or through the Sitemaps program — you can do these tasks yourself at no cost whatsoever.
I’ve been following the domaining industry for a few months now. You know these people. They’re the ones who invented ‘what you need, when you need it’: the low-value “parked page” stuffed with low-quality pay-per-click links, or the “mini-site” with three pages of cursorily researched content and a whole lot of AdSense.
While building a portfolio of names and holding them for sale is often seen as a grand investment strategy, it’s actually a very weak one.
Do write copy with local language. You know the sort of thing I mean: is it “pop” or “soda”? Matching your customers’ word choice ensures you match their search terms.
Don’t stuff pages with zip codes and city names. It’s spammy, and it frequently chases no-volume searches. One ranking for a popular search term in Mesa beats ten low-volume rankings for that term in 85215.
Do use secondary means to imply your locality. Make sure a phone number, especially a local one, appears in text. License numbers are a good excuse to mention local authorities. Reference local codes, or charities you support in the area. It helps with semantic analysis: these words go with your address, reinforcing your relevance for the area.
Don’t get too wrapped up in trying to handle out-of-area leads. Some firms believe that if they rank for every city, they can become a firm that sells business to others. Good luck unless it’s a full commitment: finding shops out of town can be a hassle, and you can end up spending all your time running the side business.
Do claim your business on local sites. Aside from adding link value, it ensures the listings are under your control so you can monitor reviews and spam.
Don’t buy any services from local sites. Most of them are just selling Google traffic, and you can outrank ‘em and get users directly.
Do promote cross-media. Search is big, but some businesses still benefit from brick-and-mortar messages or social-network activity that put the service in front of a visitor at their moment of need. The more urgent your service is, the less non-search branding you need to build, but there’s still merit in being the brand a visitor recalls from other media.
Don’t go for excessively labor-intensive promotions. Give the visitor a coupon; don’t make him Like you on Facebook before you’ll cut him a deal. It becomes analogous to the shops that demand 20-page forms for their discount-club cards.
A few days ago, a Google patent application was granted, detailing a
“System and method for modulating search relevancy using pointer activity monitoring”
according to the patent title and abstract. Reading on, it explains that the data the patent suggests collecting is the mouse location on the page and the hover duration. What could this mean for SEO?
The simple answer is that there’s a new factor influencing rankings. The patent calls it the “client attention coefficient.” That wording suggests it will have a direct effect on how “relevancy” is calculated for all Google searches. Any time a search engine changes how it ranks sites, the change is reflected in the rankings. That may sound obvious, but it’s something every good SEO thinks about when changes start happening. Should Google incorporate this mouse-tracking idea into its search engine, it could produce some interesting results, both good and bad. One thing we know is that we’d have to start paying more attention to how our indexed pages appear on SERPs (Search Engine Results Pages).
When Google builds a SERP for a search query, it takes the titles and descriptions of the results and serves them up as a vertically aligned list, with higher-ranking pages at the top. The typical searcher begins scanning the page with their eyes and sometimes follows with the mouse pointer. Referring back to the patent, this shouldn’t have a direct effect, because the patent proposes a timer or “threshold value” that would filter out times when a cursor “temporarily passes through [these] regions.” However, this doesn’t change the fact that the results at the top are more likely to get mouse-pointer attention. Depending on how much weight Google assigns to this new metric, it could strengthen the barrier to entry for new rankings even more.
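To make the threshold idea concrete, here is a minimal sketch of how dwell time per result might be credited only when it exceeds a cutoff. Everything here is my own illustration, not the patent's actual method: the 500 ms threshold, the sample format, and the function name are all invented for the example.

```python
# Hypothetical sketch of the patent's "threshold value" idea: given a time-ordered
# stream of (timestamp_ms, result_id) cursor samples, credit a result with
# attention only when the cursor dwells on it longer than the threshold.

DWELL_THRESHOLD_MS = 500  # invented for illustration; the patent names no value

def attention_per_result(samples):
    """samples: list of (timestamp_ms, result_id or None), in time order.
    Returns a dict of result_id -> total dwell milliseconds above threshold."""
    totals = {}
    i = 0
    while i < len(samples):
        t_start, region = samples[i]
        # Advance to the last consecutive sample in the same region.
        j = i
        while j + 1 < len(samples) and samples[j + 1][1] == region:
            j += 1
        # The dwell ends when the cursor first appears in a different region.
        t_end = samples[j + 1][0] if j + 1 < len(samples) else samples[j][0]
        dwell = t_end - t_start
        # Short passes through a region are filtered out, per the patent's timer.
        if region is not None and dwell >= DWELL_THRESHOLD_MS:
            totals[region] = totals.get(region, 0) + dwell
        i = j + 1
    return totals
```

With this filtering, a cursor that merely sweeps down the list on its way to a lower result contributes nothing, while a genuine pause over a snippet is counted in full.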
The “client attention coefficient” might also accidentally favor indexed pages with longer titles and descriptions. The two search results below illustrate an example case.
A result that shows up on a SERP looking like the first result might not hold a visitor’s attention as long as the second. Another advantage the second has over the first is that it simply occupies more space on the page, so it will grab more mouse time. But Google’s engineers aren’t dumb; I bet they’ve already thought up a mitigation, though there’s no perfect way around it. Some artificiality will leak into the organic rankings.
We won’t know how effective it is at improving results until Google actually implements it, if they ever do. They may never implement it, hopefully out of respect for our privacy. With luck, we can stop short of Google looking through visitors’ webcams and tracking eye movement across the page. Anyone want to file that one now, before Google does?
There are two major types of analytics systems: client-side and server-side. Client-side analytics relies on an event fired by the user’s browser to record a page view; Google Analytics is the most common client-side system. Client-side packages are beneficial because they can also track non-page-loading events, such as interacting with a form or a video. Because they run in the user’s browser, they can also harvest user data like screen resolution and connection performance. A server-side system, like AWStats, looks at server logs to determine the volume of pages requested. Server-side analytics are good for tracking special cases, like lost pages that need redirects, the traffic of search engine spiders, and mobile users, but they have limited insight into conventional PC users.
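To illustrate the server-side approach, here is a rough sketch of the kind of tally a log-based package performs. The log format (Apache "combined" style) and the crude user-agent bot list are my assumptions for the example, not how AWStats actually works:

```python
import re
from collections import Counter

# Sketch of server-side analytics: tally successful page requests from web
# server access logs, separating search-engine spiders from human visitors.
LOG_LINE = re.compile(
    r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)
BOT_HINTS = ("Googlebot", "bingbot", "Slurp")  # crude detection by user agent

def tally(log_lines):
    """Return (pages, bots): Counters of path -> request count for 200s."""
    pages, bots = Counter(), Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if not m or m.group("status") != "200":
            continue  # skip unparseable lines and errors/redirects
        bucket = bots if any(b in m.group("agent") for b in BOT_HINTS) else pages
        bucket[m.group("path")] += 1
    return pages, bots
```

Note what this view gives you for free: spider traffic and 404s are right there in the log, which is exactly the "special cases" strength described above, while screen resolutions and in-page events never appear.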
It’s important to recognize that client-side and server-side analytics never match exactly. Since a client-side system cannot record traffic from robots, or from certain limited users (like mobile browsers with no image or script support), it tends to undercount by a few percent. Server-side systems often misclassify users based on browser headers; many obscure browsers emulate IE or Safari.
Moreover, disputes occur even within a category. Does a visitor who opens a single page in a second tab count as a bounce? Is a user who clicks a Bing ad driven by Bing or by the Microsoft ad network? Analytics vendors make many judgment calls. An important guideline is to use analytics data for month-to-month comparisons within a single vendor’s system. Google Analytics for June compares sensibly with Google Analytics for May, but trying to reference it against AWStats for June leads to confusion and bad decisions. Occasionally, a disparity between packages can reveal unusual user behavior, such as a denial-of-service attack seen by the server-side system but not by Google Analytics, but it’s more often statistical noise.
At Web-Op, we’ve been building sites for local real estate agents for years. In many ways, the market is still fairly weak in the SEO space. Many firms rely on cheap ‘iframe’ displays of listings, so they end up with a site that Google sees as having no real content.
However, even innovations in data-import technology, like TransparentRETS and dsIDXPress, which let you import MLS data in bulk into a familiar, easy-to-install backend, are not a cure-all for top rankings.
Mesa, AZ (Web-Op) February 4, 2010 – Web-Optimize, LLC, a leader in internet marketing and software development, announced the release of a new real-time marketing package specially tailored for Twitter and other social media.
New internally developed software allows Web-Op to promote site and brand recognition on Twitter and other social media. Fused with current online marketing strategies, it provides the ultimate in online media presence. This new marketing strategy covers all online sectors, including organic search, pay-per-click, social media, and the management of bad press. Providing an optimum balance between automation and human involvement is our goal.
Social media such as Twitter are packed full of opportunity. The problem is that it would cost far too much attention to hand-manage a marketing campaign. The common solution is to say, “Follow us on Twitter!” and periodically post updates to a Twitter account. This isn’t effective. Our solution provides an incentive for interested users to opt in through keyword targeting and friendly conversation. The video shows a more detailed view.
Another strategy is improving organic search and indexing. Organic search rankings can be frustrating for everyone. The keys to improvement are a steady, strategic linking campaign and careful monitoring of changes over both long and short periods. Our time-proven solution for search provides sustainable long-term growth. Malicious or slanderous online press can be buried with higher rankings and the promotion of positive press.
Our strategy also includes pay-per-click campaigns managed in detail with constant tracking and split testing to maximize gains and performance.
Through our re-tooled reporting system businesses can see growth as it happens in an online statistics monitor as well as detailed monthly reporting from the experts. Our engineers have worked hard on streamlining the process to allow for such transparency.
Web-Optimize, LLC is a leader in internet marketing campaign management and software development for both new and established businesses. Our industry expertise and forward-looking strategies help businesses grow and gain positive recognition. We deliver hand-crafted solutions to businesses to maximize return on investment. Unlike traditional SEOs, our services are well documented and transparent, with solid results. Web-Optimize is based in Mesa, AZ. For additional information, please visit web-optimize.com or call 1 (866) 937-7082.
Everyone has a shopping cart on their site. Odds are, it’s been 5 or 10 years since the first time you bought something online. You’d think the kinks would have been ironed out by now. However, year after year, new website owners continue to make the same mistakes. Before you unpack that ASP.NET Storefront or Zen-Cart archive, why not take a moment to plan a strategy for a cart that searches and sells well?
If you’re starting a web presence from scratch, there’s a significant chance you’re about to waste $5,000.
Many businesses are keen on the concept of the “premium” domain name, in particular, short, generic names. Why not be “loan.com” instead of “SmithMortgageCompany.com”, or “roses.com” for your nursery? Even long after the domain market peaked with the multi-million dollar sales of names like business.com, people are still paying four, five, and six figure prices for attractive-sounding names.
The problem is that, like many Internet-based profit plans, it’s based on dated logic.