Actually, you can use these special characters in anything that renders HTML. Facebook and Twitter both seem to accept all of these. You can either copy them from here and paste them into your tweets or Facebook posts, or use the long method of manually typing them into your HTML.
[table id=1 /]
These are HTML special characters. There are many more, but these are the ones most bloggers and social media enthusiasts seem to like to use.
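If you prefer typing them by hand, each character is an HTML entity: an ampersand, a name, and a semicolon. Here is a small illustrative sample (not the full table above), using a handful of standard entities:

```html
<!-- A few commonly used HTML entities and what they render as -->
<p>
  &hearts;  <!-- heart -->
  &copy;    <!-- copyright symbol -->
  &trade;   <!-- trademark symbol -->
  &mdash;   <!-- em dash -->
  &laquo; &raquo;  <!-- left/right angle quotes -->
  &frac12;  <!-- one-half fraction -->
</p>
```

Paste the rendered characters into a tweet or Facebook post, or type the entity codes directly into a blog post's HTML editor.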
Thin … there’s something the world needs! Not being a regular iPad user I’m not sure about the longevity of a battery charge, but I’m guessing anything longer than the current battery life is always a plus, right?
The screens are indium gallium zinc oxide flat panels, referred to by the acronym IGZO. In addition to thinning the profile and making better use of battery resources by increasing electron mobility, the screens will present a 330 dpi display for sharp (no pun intended) HD.
One of the more interesting apps is YouTube Leanback plus the YouTube Remote Android app: “Leanback makes watching videos on YouTube as effortless as watching TV. You can even use your Android phone to control the Leanback experience!”
The only downside is they seem to want to leave the ones they have shut down still on the page – like the YouTube 3D creator.
Montana real estate agent Crystal Cox was on the losing end of a decision about some of her blog content. In fact she didn’t win … at all. Her liability for the words she wrote about a bankruptcy custodian? Two point five million dollars. Let me put that in plain English: $2,500,000. And she’s a real estate agent, which means that’s probably about 100 years of income.
Bloggers beware. We’re not journalists, at least according to these court cases – $47,000,000 in defamation damages awarded – and we’re not protected by the same laws. I’m not going to go through all the details here, but I tend to disagree about whether or not we are journalists. BUT I also don’t think journalists should be able to write or report anything they want just because they have shield protection.
It’s a common question, and the answers are highly skewed by viewpoint. However, there is an answer, and it’s here. This may be “over your head,” but you’ll also want to know the answer if you ever buy web development or SEO for your site and the SEO analyst makes a recommendation like, “you need to have your site redesigned because your webmaster used ________” (tables or CSS).
CSS is newer than tables, ergo some people find it sexier … why we even use “sexy” to measure coolness I don’t know. Be that as it may, CSS is more “Web 2.0” – kind of like Starbucks is cool because it’s not the corner gas station, or the iPhone is cool because a few cool people said it’s cool. Cadillac makes an awesome car, incomparable in many ways, but Lexus is perceived as cooler. Why? Maybe it’s marketing.
Long before there was CSS there were tables, and tables were the way to lay out web pages. I have opened the backend of some sites and seen incredibly hilarious table layouts, with tables nested in tables inside of cells inside of rows and columns. In fact I have nested two deep myself.
Tables can be sized by hard numbers or by percentages, and all browsers render them the same way. Tables, rows, columns, and cells can be named, given an ID, and controlled with CSS. Tables can have a border, no border, or a border style, and can be manipulated in other ways. You can create a template to be used to change an entire site but …
CSS can be changed across the site without having to go into every page and edit a hard-coded layout. Components can be loaded in fixed places or floated. Each element can have its own design, borders, backgrounds, and fonts … not unlike table cells. Some browsers munge the CSS, though, and without proper care different size displays can also munge the layout.
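To make the comparison concrete, here is a minimal sketch of the same two-column layout done both ways. The widths and ids are just illustrative:

```html
<!-- Two-column layout the old way: a table -->
<table width="100%">
  <tr>
    <td width="30%">Sidebar</td>
    <td width="70%">Main content</td>
  </tr>
</table>

<!-- The same layout with CSS floats -->
<style>
  #sidebar { float: left; width: 30%; }
  #main    { float: left; width: 70%; }
</style>
<div id="sidebar">Sidebar</div>
<div id="main">Main content</div>
```

Change the stylesheet once and every page using those ids updates; with the table version you edit every page that carries the layout.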
But what about SEO?
I’m so glad you asked. It’s an often-asked question: which is better for SEO, tables or CSS? Look no further, the authority speaks:
When looking for links that matter – that is, links that get traction and pass-through value – it is important to know where you are posting and how that site:
Allows search engines to index pages
Allows search engines to follow links
Passes PR value, and if so, how much
Relates to your topic
A few years ago I created a site called BlogX3.com where I invited 25 real estate bloggers to participate. It took just 3 months to get to a PR3 which, to me, was absolutely amazing. In fact we saw search results in the hundreds in the first 90 days. Call it a lucky storm if you want, but in reality it was because so many PR4 and PR5 sites linked to “The24: America’s 24 Chosen Real Estate Bloggers,” and The24 was linking outbound with good content. There was no Facebook, Twitter or LinkedIn back then. The moral is there were high-PR links pointing to The24. (It was taken down for two reasons, one of which was repeated hack attacks and denial-of-service attacks, and I was too busy to fight them, unfortunately.)
When planning your strategy to build PageRank, you should treat it differently than when you are going for links on clean (non-blackhat) sites or for community-relevant content. Click-throughs from organic search depend on search engine placement, which comes from multiple factors including PR and search relevance. Click-throughs from referring pages depend on neither search relevance nor PageRank, but they do depend on traffic and topic relevance. So let’s look at two online ad sites which both have a high volume of traffic and how they interact with the search engine robots.
First is Craigslist
If you have not heard of Craigslist, welcome to Earth. Craigslist is an online ad site created in 1995 just for San Francisco. It has since gone global and now boasts a whopping 64,000,000 monthly unique visitors, 80,000,000 pages indexed, 113,000,000 backlinks and a PR of 7. Those are some powerful numbers – numbers everyone in SEO and online marketing would like to take advantage of. The pass-through value alone could be enormous, but aside from that, capturing even a fraction of a percent of that monthly traffic is huge, too. Thus the battle.
Craigslist does allow robots but greatly limits their reach. You can see from the image that its robots.txt disallows a few directories. It’s once you know what those directories are that the impact is really seen. Let’s look at those triple-letter directories to see exactly what they contain:
ccc = all community ads
hhh = all housing ads
sss = all for sale ads
bbb = all service ads
ggg = all gigs
jjj = all jobs
So what else is there, you’re wondering? Why, personals, of course. Even “res” – resumes – isn’t blocked either. Now you’re wondering, “why even bother with Craigslist?”
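The robots.txt image isn’t reproduced in text here, but based on the directories listed above, the relevant lines look something like this (a reconstruction, not a verbatim copy of Craigslist’s file):

```text
User-agent: *
Disallow: /ccc    # community ads
Disallow: /hhh    # housing ads
Disallow: /sss    # for sale ads
Disallow: /bbb    # service ads
Disallow: /ggg    # gigs
Disallow: /jjj    # jobs
```

Anything not matched by a `Disallow` line – personals, resumes – is left open to crawlers.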
Let me introduce you to our friend RSS. Because Craigslist is nice enough to make its RSS feed available by search category, there are many websites which use that data to stuff their content and republish CL listings on their own sites. The site publishing the RSS feed may not block indexing, and you just may get links from search engines picking up your keywords from these sites. The downside is that CL uses the nofollow tag even in its RSS feed. However, if the content is indexed by Google et al., you’re going to get an increased likelihood of click-throughs from those visitors. And it does happen.
Still, for CL, the best way is to write compelling titles and post in relevant categories.
What about this Backpage thing?
This is a unique approach to the same format as CL, started by the Village Voice in partnership with a dozen or so newspapers nationwide. BP handles search engines and its syndication feed a little differently. The RSS feed on BP, for example, strips all HTML from the content and delivers a text-only version. If you type your links like <a href="http://icobb.com">CLICK HERE</a>, the feed will strip that down to just “CLICK HERE”. So on BP you want to create your links as <a href="http://icobb.com">http://icobb.com</a>, which the feed will strip down to just “http://icobb.com”.
Backpage does not allow robots. Take a look at its robots.txt in the second image here on this page. If you know how to read a robots.txt file, you will see BP simply denies robots permission to access all directories on the service. So while BP does not use the nofollow tag, it does use the robots exclusion protocol site-wide, meaning robots are not supposed to index (store) any data they find on any page on Backpage.
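A site-wide exclusion like that takes only two lines – this is the standard deny-all pattern any robots.txt can use:

```text
User-agent: *
Disallow: /
```

`User-agent: *` addresses every crawler, and `Disallow: /` tells them to stay out of every path from the root down.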
At first glance you would think CL is more friendly to robots than BP, but that’s not necessarily the case. However, CL has much higher monthly unique traffic than BP. In fact BP has 136,000,000 backlinks with 20,000,000 pages indexed and a PR of 5, but only shows around 3,000,000 unique monthly visitors. If you have used both, you have likely had more success with CL simply because of that huge gap – about 60,000,000 monthly unique visitors – between the two.
Summing it all up
If you are counting on either Craigslist or Backpage to bump your SEO and PR, you’re going to be let down. However, if you are a steady marketer with good word skills, you can expect to jump your traffic by using both in tandem. They both allow a wide range of HTML tags so you can “pretty it up” if you like, but we’ve found through long-term A/B testing the difference in click-through is insignificant. Visitor-to-prospect conversion, which we have not measured, may be higher with design touches, but that’s for you to discover … and share!
Call in the professionals
Let’s face it – you probably need to be doing what it is you do. Fortunately there are people who can handle the rest for a relatively nominal fee (or more, if you just like to pay higher prices). Sometimes marketing campaigns deserve CL and BP in the arsenal – mostly when the marketer does not want to budget for PPC or other paid advertising. We certainly use them for lead generation services and they have served us well. When you are ready to start a low-cost, month-to-month online advertising and social mention campaign, I’m sure we can accommodate your needs with an easy-to-swallow, bite-sized but scalable solution. Call me at 678-439-8683 or just contact me here online.
People have paid me good money to do exactly what I’m about to show you how to do … for free. Of course if you don’t have time you can still pay me – or if you want it done right. Just kidding; I’m sure all of the readers here are highly skilled (“tech savvy,” they call it in the real estate biz) and fully capable of doing it on your own. Seriously.
It’s not rocket science!
If you have WordPress you can do this yourself in about 8 to 10 minutes after watching the video. There is a slight learning curve, but it’s really just about as simple as creating your first Word document was. The plugin we are going to be looking at can be installed through your Dashboard and is called Contact Form 7.
There is also an extension to Contact Forms 7 we are going to use called Contact Form 7 to Database. Go ahead and install both of those before watching the short video and you will be up and flying in no time. And, as always, if you can’t or don’t want to do it yourself, hire me!
There are several ways to accomplish everything I show you how to do here so if you want to recommend another for the readers please feel free.
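For orientation, a Contact Form 7 form is defined with a simple tag syntax right in the Dashboard. The default template looks roughly like this (the field names are CF7’s stock ones, and the id in the shortcode below is just a placeholder for whatever your install generates):

```text
<label>Your Name (required)
    [text* your-name]</label>

<label>Your Email (required)
    [email* your-email]</label>

<label>Your Message
    [textarea your-message]</label>

[submit "Send"]
```

Once saved, CF7 gives you a shortcode along the lines of `[contact-form-7 id="1234" title="Contact form 1"]` to paste into any page or post, and the Contact Form 7 to Database extension then logs each submission for you.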
The other day I needed to find an exact tweet in a short exchange between the Atlanta Falcons’ social media manager and me. Since I had failed to favorite the tweet I needed – expecting the other party to have responded earlier – I was left to use searching and scrolling and other messy approaches.
Then I remembered Bettween, a service I had used a few months ago. It is possible I learned about Bettween from Laura Fitton or from OneForty.com and you can always find great Twitter tools there as well.
Bettween is a free service that allows you to enter two Twitter handles and track the conversation between those people. In fact you may even create a custom widget to embed in your blog or website with an ongoing live stream based on your search results. (I did not cover this in the video, but the image to the right shows the results from the second search I did in the video.)
Using the system is very straightforward. Results may take some time to compile, so be patient. In fact the search I did between Megan Berry and DKNY did not complete until about 4 minutes after I had finished the video.
There are definitely other ways to do this but this is just a quick look at one method. Feel free to leave others in the comments section and let others know about what you have found. Perhaps there are better ways or just other ways – either way – we want to know about them!
When you have a website that receives only a couple of hundred “visits” per day and you account for 20 of those, you are going to greatly skew your Google Analytics results. Excluding yourself from being counted in the stats is really quite simple and can be easily accomplished using the filter tool inside your Google Analytics account.
The following video will show you how to easily exclude your own visits from your Google Analytics.
The problem with not excluding your own activity from your Google Analytics results is that if you account for 10% or more of your site traffic – and many of you do – your results are going to be off by 10% or more. It will affect your time on site, pages viewed, bounce rate, and other statistics as well.
You can accomplish this task in less than 4 minutes even if you have never done it before. Once you create your new filter it can easily be applied to all of your sites linked with that same account.
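For reference, the filter you create in the video boils down to a few settings like these (the IP address here is a documentation placeholder – substitute your own, escaping the dots since the pattern field takes a regular expression):

```text
Filter Name:    Exclude my own visits
Filter Type:    Custom filter > Exclude
Filter Field:   Visitor IP Address
Filter Pattern: ^203\.0\.113\.42$
```

If your ISP rotates your IP address, you may need a broader pattern covering the address range, or a different exclusion approach entirely.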
You may never find a need for a clickable image map. In case you don’t know what they are, I would describe them thus: a single image with multiple URL links, defined by establishing regions of coordinates within the image and assigning a URL value to each of them. Or in plainer English, “an image that has a bunch of links to a bunch of different pages.”
Image maps have been around for a long time. In fact they pre-date the web. For HTML purposes, however, they are just as old as the web. Uses for them vary so we’ll simply discuss why I chose to use one today and let you come along for the ride in the video.
I wanted a banner for http://socialmediaedge.com/itunes-frame.php to allow links to all of the regular contributors’ sites. Instead of having multiple single images I simply wanted one image with multiple links. While there are many ways of accomplishing this task, there is one way, for sure, that requires no downloads or special skills and has an extremely small learning curve.
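The underlying HTML is straightforward once a tool (or you, by hand) works out the coordinates. A minimal sketch – the filename, coordinates, and second URL here are made up for illustration:

```html
<!-- One image, multiple links: the usemap attribute ties the image to the map -->
<img src="contributors-banner.png" alt="Regular contributors" usemap="#contributors" />

<map name="contributors">
  <!-- coords for a rect are x1,y1,x2,y2 in pixels from the image's top-left -->
  <area shape="rect" coords="0,0,150,60"   href="http://icobb.com"    alt="Contributor one" />
  <area shape="rect" coords="150,0,300,60" href="http://example.com" alt="Contributor two" />
</map>
```

Clicking inside the first rectangle follows the first link, the second rectangle the second link, and so on for as many areas as you define.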
I apologize for the audio and video getting a little out of sync near the end. I just didn’t have time to re-record the video. It’s not too terribly bad but it’s annoying to me since I made it!