Monthly Archives: December 2011

How To Type Special Characters for Facebook, Twitter, Google+


Actually, you can use these special characters in anything that renders HTML. Facebook and Twitter do seem to accept all of these. You can either copy them from here and paste them into your tweets or Facebook posts, or use the long method of manually typing them into your HTML.

[table id=1 /]

These are HTML special characters. There are many more, but these are the ones most bloggers and social media enthusiasts seem to like to use.
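As a quick illustration (these few examples are mine, not the full table above), here is how some common entities are written in HTML and what browsers render them as:

```html
<!-- A few common HTML character entities. Browsers render the
     entity code on the left as the symbol shown on the right. -->
<p>&hearts;  renders as ♥</p>
<p>&copy;    renders as ©</p>
<p>&trade;   renders as ™</p>
<p>&frac12;  renders as ½</p>
<p>&rarr;    renders as →</p>
```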

Apple to Use Screens from Sharp for Lower Power, Thinner Profile?

Thin … there’s something the world needs! Not being a regular iPad user I’m not sure about the longevity of a battery charge, but I’m guessing anything longer than the current battery life is always a plus, right?

The screens are indium gallium zinc oxide flat panels, referred to by the acronym IGZO. In addition to thinning the profile and making better use of the battery by increasing electron mobility, the screens will present a 330 dpi display for sharp, no pun intended, HD.

More on this from:

YouTube’s TestTube – Test Drive Upcoming Tools

Did you know you can look at the tools, and toys, being developed by YouTube’s development engineers? Well, now you do. Just go to TestTube, YouTube’s development playground, and take a look around.

One of the more interesting apps is YouTube Leanback plus the YouTube Remote Android app: “Leanback makes watching videos on YouTube as effortless as watching TV. You can even use your Android phone to control the Leanback experience!”

The only downside is that they seem to leave the tools they have shut down on the page – like the YouTube 3D creator.

Blogger Liability Lawsuits, and Judgments, on the Rise

Montana real estate agent Crystal Cox was on the losing end of a decision about some of her blog content. In fact, she didn’t win … at all. Her liability for the words she wrote about a bankruptcy custodian? Two point five million dollars. Let me put that in plain English: $2,500,000. And she’s a real estate agent, which means that’s probably about 100 years of income.

Bloggers beware. We’re not journalists, at least according to these court cases – $47,000,000 in defamation damages awarded – and we’re not protected by the same laws. I’m not going to go through all the details here, but I have a tendency to disagree about whether or not we are journalists, BUT I also don’t think journalists should be able to write or report just anything they want because they have shield protection.

SEO Wars: table vs div with CSS

It’s a common question, and the answers are highly skewed by viewpoint. However, there is an answer, and it’s here. This may be “over your head,” but you’ll also want to know the answer if you ever buy web development or SEO for your site and the SEO analyst makes a recommendation like, “you need to have your site redesigned because your webmaster used ________” (tables or CSS).

CSS is newer than tables, ergo some people find it sexier … why we even use “sexy” to describe a level of coolness I don’t know. Be that as it may, CSS is more “Web 2.0” – kind of like Starbucks is cool because it’s not the corner gas station, or the iPhone is cool because a few cool people said it’s cool. Cadillac makes an awesome car, incomparable in many ways, but Lexus is perceived as cooler. Why? Maybe it’s marketing.

Long before there was CSS there were tables, and tables were the way to lay out web pages. I have opened the backend of some sites and seen some incredibly hilarious table layouts, with tables nested in tables inside of cells inside of rows and columns. In fact, I have nested two deep myself.

Tables can be sized by hard numbers or by percentages, and they display consistently across browsers. Tables, rows, columns, and cells can be named, given an ID, and controlled with CSS. Tables can have a border, no border, or a border style, and can be manipulated in other ways. You can even create a table template to be used to change an entire site but …
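As a minimal sketch of that point (the ID and style values here are made up for illustration), a table given an ID and styled with CSS might look like:

```html
<style>
  /* "layout" and these values are hypothetical examples */
  #layout { width: 80%; border: 1px solid #ccc; }
  #layout td { padding: 8px; vertical-align: top; }
</style>
<table id="layout">
  <tr>
    <td>Sidebar</td>
    <td>Main content</td>
  </tr>
</table>
```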

SEO Wars: Craigslist vs. Backpage

When looking for links that matter, that is, links that get traction and pass-through value, it is important to know where you are posting and whether that site:

  • Allows search engines to index its pages
  • Allows search engines to follow links
  • Passes PR value, and if so, how much
  • Relates to your topic

Craigslist's robots.txt file

A few years ago I created a site – The24 – where I invited 25 real estate bloggers to participate. It took just 3 months to reach a PR3, which, to me, was absolutely amazing. In fact, we saw search results in the 100s in the first 90 days. Call it a lucky storm if you want, but in reality it was because so many PR4 and PR5 sites linked to “The24: America’s 24 Chosen Real Estate Bloggers,” and they were linking outbound with good content. There was no Facebook, Twitter, or LinkedIn back then. The moral is there were high PR links pointing to The24. (It was taken down for two reasons, one of which was repeated hack attacks and denial-of-service attacks that I was, unfortunately, too busy to fight.)

When planning your strategy to build PageRank, you should treat it differently than when you are going for links on clean (non-blackhat) sites or going for community-relevant content. Click-throughs from organic search depend on search engine placement, which comes from multiple factors including PR and search relevance. Click-throughs from referring pages depend on neither search relevance nor PageRank; they depend on traffic and topic relevance. So, let’s look at two online ad sites that both have a high volume of traffic, and at how they interact with the search engine robots.

First is Craigslist

If you have not heard of Craigslist, welcome to Earth. Craigslist is an online ad site created in 1995 just for San Francisco. It has since gone global and now boasts a whopping 64,000,000 monthly unique visitors, 80,000,000 pages indexed, 113,000,000 backlinks, and a PR of 7. Those are some powerful numbers, and ones everyone in SEO and online marketing would like to take advantage of. The pass-through value alone could be enormous, but aside from that, capturing even a fraction of a percent of that monthly traffic is huge, too. Thus the battle.

Craigslist does allow robots but greatly limits what they can crawl. You can see from the image that the robots.txt disallows a few directories. The impact really shows once you know what those directories are. Let’s look at those triple-letter directories to see exactly what they contain:

  • ccc = all community ads
  • hhh = all housing ads
  • sss = all for sale ads
  • bbb = all service ads
  • ggg = all gigs
  • jjj = all jobs

So what else is there, you’re wondering? Why, personals, of course. Even the “res” directory – resumes – is blocked, so that’s not patrolled either. Now you’re wondering, “why even bother with Craigslist?”
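Based on the directory list above, the relevant part of Craigslist’s robots.txt would have looked something like this (reconstructed for illustration – check the live file for the actual current rules):

```
User-agent: *
Disallow: /ccc   # community ads
Disallow: /hhh   # housing ads
Disallow: /sss   # for sale ads
Disallow: /bbb   # service ads
Disallow: /ggg   # gigs
Disallow: /jjj   # jobs
Disallow: /res   # resumes
```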

Let me introduce you to our friend RSS. Because Craigslist is nice enough to make its RSS feed available by search category, many websites use that data to stuff their content, republishing CL content on their own sites. A site publishing the RSS feed may not block indexing, and you just may get links from search engines picking up your keywords on those sites. The downside is that CL uses the nofollow tag even in its RSS feed. However, if the content is indexed by Google et al., you’re going to get an increased likelihood of click-throughs from those visitors. And it does happen.

Still, for CL, the best way is to write compelling titles and post in relevant categories.

What about this Backpage thing?

Backpage's robots.txt file

Backpage is a unique approach to the same format as CL, started by the Village Voice in partnership with a dozen or so newspapers nationwide. BP handles search engines and its syndication feed a little differently. The RSS feed on BP, for example, strips all HTML from the content and delivers a text-only version. If you type your links like <a href="">CLICK HERE</a>, the feed will completely strip it down to just “CLICK HERE”. So on BP you want to create your links with the URL itself as the anchor text, so that when the feed strips the markup, the visible URL remains.

How to use forms to gather information on WordPress

People have paid me good money to do exactly what I’m about to show you how to do … for free. Of course, if you don’t have time you can still pay me money – or if you want it done right. Just kidding; I’m sure all of the readers here are highly skilled (“tech savvy,” they call it in the real estate biz) and fully capable of doing it on your own. Seriously.

It’s not rocket science!

If you have WordPress you can do this yourself in just about 8 to 10 minutes after watching the video. There is a slight learning curve, but it’s really about as simple as creating your first Word document was. The plugin we are going to be looking at can be installed through your Dashboard and is called Contact Form 7.

There is also an extension to Contact Form 7 we are going to use, called Contact Form 7 to Database. Go ahead and install both of those before watching the short video and you will be up and flying in no time. And, as always, if you can’t or don’t want to do it yourself, hire me!

How to track Twitter conversations between 2 people

The other day I needed to find an exact tweet in a short exchange between the Atlanta Falcons’ social media manager and me. Since I had failed to favorite the tweet I needed, expecting the other party to have responded earlier, I was left to use searching, scrolling, and other messy approaches.

Then I remembered Bettween, a service I had used a few months ago. It is possible I learned about Bettween from Laura Fitton or from and you can always find great Twitter tools there as well.

Bettween is a free service that allows you to enter two Twitter handles and track the conversation between those people. In fact, you may even create a custom widget to embed in your blog or website with an ongoing live stream based on your search results. (I did not cover this in the video, but the image to the right shows the results from the second search I did in the video.)

How to exclude yourself from Google Analytics

When you have a website that receives only a couple of hundred “visits” per day and you account for 20 of those, you are going to greatly skew your Google Analytics results. Excluding yourself from the stats is really quite simple and can be easily accomplished with the filter tool inside your Google Analytics account.

The following video will show you how to easily exclude your own visits from your Google Analytics.

How to create a clickable image map online

You may never find a need for a clickable image map. In case you don’t know what they are, I would describe them thus: a single image with multiple URL links, defined by areas created by establishing regions of coordinates within the image and assigning a URL value to each. Or, in plainer English, “an image that has a bunch of links to a bunch of different pages.”

Image maps have been around for a long time. In fact they pre-date the web. For HTML purposes, however, they are just as old as the web. Uses for them vary so we’ll simply discuss why I chose to use one today and let you come along for the ride in the video.
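For reference, the raw HTML behind a clickable image map is fairly simple (the image name, coordinates, and URLs below are placeholders, not from an actual site):

```html
<!-- usemap points at the map by name; coords are x1,y1,x2,y2
     for rectangular areas. All values here are placeholders. -->
<img src="banner.png" usemap="#contributors" alt="Contributor banner">
<map name="contributors">
  <area shape="rect" coords="0,0,100,50"   href="http://example.com/blogger-one" alt="Blogger One">
  <area shape="rect" coords="100,0,200,50" href="http://example.com/blogger-two" alt="Blogger Two">
</map>
```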

I wanted a banner to allow the creation of links to all of the regular contributors’ sites. Instead of having multiple single images, I simply wanted one image with multiple links. While there are many ways of accomplishing this task, there is one way, for sure, that requires no downloads or special skills and has an extremely small learning curve.