How to Use Mod-Rewrite to Simplify URL Rewriting in Apache - A Basic Guide to the Mod-Rewrite Module

Friday, November 2, 2012

URL rewriting is the process of manipulating a URL, or link, sent to a web server so that the link is dynamically modified at the server to include additional parameters and information, along with a server-initiated redirection. The web server performs all these manipulations on the fly, so the browser is kept out of the loop regarding the change made to the URL and the redirection.
URL rewriting can benefit your websites and web-based applications by providing better security, better visibility and friendliness with search engines, and a site structure that is easier to maintain through future changes.
In this article we will look at how to implement URL rewriting on an Apache-based web server using the mod_rewrite module.
What is mod_rewrite?
Mod_rewrite is one of the most favored modules for the Apache web server, and many developers and administrators would vote it the best thing to happen to Apache. The module has so many tricks up its sleeve that it is often called the Swiss Army Knife of Apache modules. Apart from providing simple URL rewriting for an Apache-based website, it offers better URL protection, better search engine visibility, protection against bandwidth thieves by stopping hot-linking, hassle-free restructuring, and the friendliest of URLs for your visitors. Because of this versatility, the module can feel a bit daunting to master, but a thorough understanding of the basics can make you a master of the craft of URL rewriting.
Let's Begin! - A look at everything you need in your test environment to get mod_rewrite alive and kicking.
First and foremost, you should have a properly configured Apache web server on your test machine. Mod_rewrite is usually installed along with Apache, but in case it is missing - which can happen on a Linux machine where the module was not compiled in during installation - you will have to get it installed. To use mod_rewrite you must configure the module to be loaded dynamically by Apache. On a shared server you will have to contact your web hosting company to get the module installed and loaded.
On your local machine you can find out whether the module is installed by looking in Apache's modules directory. Check for a file named mod_rewrite.so; if it is there, the module can be loaded into the Apache server dynamically. By default this module is not loaded when Apache starts, and you need to tell Apache to enable it by making a change in the web server's configuration file, which is explained below.
How to Enable mod_rewrite on Apache?
You can make the mod_rewrite module load dynamically into the Apache web server using the LoadModule directive in the httpd.conf file. Open this file in a text editor and find a line similar to the one given below.
#LoadModule rewrite_module modules/mod_rewrite.so
Uncomment this line by removing the # and save the httpd.conf file. Restart your Apache server, and if all went well the mod_rewrite module will now be enabled on your web server.
Lets Rewrite our first URL using mod_rewrite
OK, now the mod_rewrite module is enabled on your server. Let's have a look at how to make it work for us.
In order to activate rewriting for a directory you have to add a single line to your .htaccess file. The .htaccess files are configuration files containing Apache directives, and they provide distributed, directory-level configuration for a website. Create a .htaccess file in your web server's test directory - or any other directory where you want URL rewriting active - and add the line given below to it.
RewriteEngine on
Now we have the rewrite engine turned on and Apache is ready to rewrite URLs for you. Let's look at a sample rewrite instruction that redirects a request for first.html to second.html at the server level. Add the line given below to your .htaccess file, after the RewriteEngine directive we added before.
RewriteRule ^first.html$ second.html
I will explain what we have done here in the next section, but if all went well, any request for first.html made on your server will be served from second.html. This is one of the simplest forms of URL rewriting.
A point to note here is that the redirect is kept totally hidden from the client, which differs from a classic HTTP redirect. The client, or browser, is given the impression that the content of second.html is being fetched from first.html. This lets websites generate URLs on the fly without the client's awareness, and it is what makes URL rewriting so powerful.
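If you do want the client to see the new address, mod_rewrite can also issue a real HTTP redirect with the [R] flag instead of an internal one. A minimal sketch, reusing the file names from the example above:

```apache
RewriteEngine on

# Internal rewrite: the browser still shows first.html in its address bar
RewriteRule ^first.html$ second.html

# External redirect: the browser is told to fetch /second.html
# and the address bar changes (302 by default, 301 if specified)
RewriteRule ^first.html$ /second.html [R=301,L]
```

Only one of the two rules would be used in practice; they are shown together here to contrast the internal and external behavior.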
Basics of mod_rewrite module
Now we know that mod_rewrite can be enabled for an entire website or a specific directory using a .htaccess file, and we have written a basic rewrite directive in the previous example. Here I will explain what exactly we did in that first sample rewrite.
The mod_rewrite module provides a set of configuration directives for URL rewriting, and the RewriteRule directive - which we saw in the previous sample - is the most important one. The rewrite engine uses pattern-matching substitutions to make its translations, which means a good grasp of regular expressions will help you a lot.
Note: Regular expressions are too vast a topic to fit into the scope of this article. I will try to write another article on that topic someday.
1. The RewriteRule Directive
The general syntax of the RewriteRule is very straightforward.
RewriteRule Pattern Substitution [Flags]
The Pattern part is the pattern that the rewrite engine looks for in the incoming URL. So in our first sample, ^first.html$ is the Pattern. The pattern is written as a regular expression.
The Substitution is the replacement or translation that is to be done on the caught pattern in the URL. In our sample second.html is the Substitution part.
Flags are optional; they make the rewrite engine perform certain other tasks apart from the substitution on the URL string. Flags, if present, are defined within square brackets and separated by commas.
Let's take a look at a more complex rewrite rule. Consider a dynamic URL of the form /articles.php?category=somename&id=123.
Now we will convert that URL into a search-engine and user-friendly URL of the form /articles/somename/123.
Create a page called articles.php with the following code:
<?php
// articles.php - prints the two GET parameters passed to it
$category = htmlspecialchars($_GET['category']);
$id = (int) $_GET['id'];
echo "Category : " . $category . "<br>";
echo "ID : " . $id;
This page simply prints the two GET variables passed to it on the webpage.
Open the .htaccess file and write in the below given Rule.
RewriteEngine on
RewriteRule ^articles/(\w+)/([0-9]+)$ /articles.php?category=$1&id=$2
The pattern ^articles/(\w+)/([0-9]+)$ can be broken down as:
^articles/ - checks that the request starts with 'articles/'
(\w+)/ - checks that this part is a single word followed by a forward slash. The parentheses are used for capturing the parameter values we need in the substituted URL. Each parenthesized pattern is stored in a special variable that can be back-referenced in the substitution part as $1, $2, and so on for each pair of parentheses.
([0-9]+)$ - checks for digits at the end of the URL.
Try requesting a URL of the form /articles/somecategory/123 on your test server.
The URL rewrite rule you have written will kick in, and you will see the result as if the URL requested were /articles.php?category=somecategory&id=123.
Now you can build on this sample to create more complex URL rewriting rules. By using URL rewriting in the above example we have achieved a search-engine and user-friendly URL, which is also more resistant to casual, script-kiddie-style injection attacks.
What does the Flags parameter of RewriteRule directive do?
RewriteRule flags give us a way to control how mod_rewrite handles each rule. The flags are defined inside a single set of square brackets, separated by commas, and there are about 15 flags to choose from. They range from flags that control how rules are interpreted to more complex ones that send specific HTTP headers back to the client when the pattern matches.
Let's look at some of the basic flags.
  • [NC] flag (nocase) - makes mod_rewrite treat the pattern in a case-insensitive manner.
  • [F] flag (forbidden) - makes Apache send a Forbidden HTTP response - status 403 - back to the client.
  • [R] flag (redirect) - makes mod_rewrite issue a formal HTTP redirect instead of an internal Apache redirect. You can use this flag to inform the client about the redirection. It sends a Moved Temporarily response - status 302 - by default, but the flag takes an extra parameter with which you can change the response code. If you wish to send a 301 - Moved Permanently - the flag can be written as [R=301].
  • [G] flag (gone) - makes Apache respond with HTTP status 410 - Gone.
  • [L] flag (last) - makes mod_rewrite stop processing succeeding rules if the current rule matches.
  • [N] flag (next) - makes the rewrite engine stop processing and loop back to the start of the rule list. A point to note is that the URL used for pattern matching will then be the rewritten one. This flag can create an endless loop, so extreme care should be taken when using it.
There are other flags too, but they are too involved to explain within the scope of this article; you can find more information on them in the mod_rewrite manual.
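Flags can be combined by separating them with commas inside one set of brackets. As a small sketch based on our earlier sample rule:

```apache
RewriteEngine on

# [NC,L]: matches first.html, FIRST.HTML, First.Html, etc.,
# and stops processing further rules once this one applies
RewriteRule ^first.html$ second.html [NC,L]
```

Combining [NC] with [L] like this is a common pattern: the match becomes case-insensitive, and a successful match short-circuits the rest of the rule list.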
2. The RewriteCond Directive
This directive gives you the additional power of conditional checking on a range of parameters. Combined with RewriteRule, it lets you rewrite URLs based on whether conditions succeed. RewriteCond directives are like the if() statement in your programming language, except here they decide whether a RewriteRule directive's substitution should take place. Things like preventing hot-linking, or checking that the client meets certain criteria before rewriting the URL, can be achieved with this directive.
The general syntax of the RewriteCond is:
RewriteCond string-to-test condition-pattern
The string-to-test part of RewriteCond has access to a large set of variables - HTTP header variables, request variables, server variables, time variables, and so on - so you can do a lot of complex conditional checking while writing directives. You can use any of these variables as the string to test by writing it in the %{VARIABLE} format. If you want to use the HTTP_REFERER variable, for example, it is written as %{HTTP_REFERER}.
The condition part can be a simple string or a very complex regular expression as your imagination is the only limit with this module.
Lets take a look at an example for conditional rewriting using RewriteCond directive:
RewriteCond %{HTTP_USER_AGENT} ^Mozilla/4(.*)MSIE
RewriteRule ^index.html$ /index.IE.html [L]
RewriteCond %{HTTP_USER_AGENT} ^Mozilla/5(.*)Gecko
RewriteRule ^index.html$ /index.netscape.html [L]
RewriteRule ^index.html$ /index.other.html [L]
This example uses HTTP_USER_AGENT as the test string in the RewriteCond directives. It uses the HTTP_USER_AGENT header to identify the visiting user's browser, matches it against a set of known patterns, and serves a different page based on the result. The first RewriteCond checks HTTP_USER_AGENT for a match on the ^Mozilla/4(.*)MSIE pattern, which occurs when the visitor is using Internet Explorer. The RewriteRule just under that statement then kicks in and rewrites the URL to serve the IE-specific page.
Similarly, a check is made for Gecko-based browsers in the second RewriteCond, and its RewriteRule substitutes index.netscape.html when a positive match is made on the ^Mozilla/5(.*)Gecko pattern. The third RewriteRule is there to catch all other browsers: if both the first and second RewriteCond fail, the last RewriteRule is applied. A point to note in the above example is the use of the [L] flag with all the RewriteRule directives; it prevents the rules from cascading once a RewriteRule has been applied.
Two flags that further control how the RewriteCond directive behaves are [NC] - case-insensitive matching - and [OR] - chaining multiple RewriteCond directives with a logical OR.
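As a sketch of the hot-linking protection mentioned earlier, here is one common way to combine RewriteCond with the [NC] and [F] flags (example.com is a placeholder for your own domain):

```apache
RewriteEngine on

# Allow requests with an empty referer (direct visits, some proxies)
RewriteCond %{HTTP_REFERER} !^$
# Allow requests coming from our own pages, case-insensitively
RewriteCond %{HTTP_REFERER} !^http://(www\.)?example\.com/ [NC]
# Anyone else asking for our images gets a 403 Forbidden
RewriteRule \.(gif|jpe?g|png)$ - [F]
```

Both conditions must fail to match the allowed referers before the rule fires, so only requests from foreign sites are blocked.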
By using these two directives - RewriteRule and RewriteCond - you can implement a lot of powerful URL Rewriting functionality on your website.
Other mod_rewrite Directives
  1. RewriteBase Directive - This directive solves the problem of RewriteRule creating non-existent URLs when the physical file system structure on the web server differs from the structure of the website's URLs. Setting the directive as follows resolves this: RewriteBase /
  2. RewriteMap Directive- This directive is very powerful as it allows you to map unique values to a set of other replacement values from a table and to use it in the substitution to generate on the fly URLs. This can be especially useful for huge e-commerce or CMS kind of applications where you need to replace each section name or category name in the URL with a corresponding id taken from a database.
  3. RewriteLog Directive - This directive sets the log file that the rewrite engine uses to record the actions it takes while processing client requests. The syntax is: RewriteLog /path/to/logfile. This directive should be defined in the httpd.conf file, as it is applied on a per-server basis.
  4. RewriteLogLevel Directive - This directive tells the mod_rewrite module how much information about its internal processing should be logged. It takes values from 0 to 9, where 0 means no logging and 9 means everything is logged. A high logging level can make Apache run slowly, so a level above 2 is advisable only for debugging. The directive is applied using the following syntax: RewriteLogLevel levelnumber
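As a sketch of how RewriteMap could drive the e-commerce scenario described above (the map file path, map name, and field names here are hypothetical):

```apache
# In httpd.conf - RewriteMap must be defined at server or
# virtual-host level, not in a .htaccess file
RewriteMap categorymap txt:/etc/apache2/categorymap.txt

# categorymap.txt would contain lookup lines such as:
#   shoes   17
#   hats    42

# A rule can then translate the category name in the URL
# into the stored id via the map
RewriteRule ^shop/([^/]+)/$ /shop.php?category=${categorymap:$1} [L]
```

With this in place, a friendly URL like /shop/shoes/ would be served internally as /shop.php?category=17, with the name-to-id table kept in one editable file.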

Conclusion
In this article we have taken only a brief look at the power of the mod_rewrite module. It only scratches the surface, but I hope it is enough to get you started with this module on your own web server.

Get 25 Facebook Fans to Register a Short URL For Your Facebook Page

Wednesday, September 12, 2012
For the past two months, millions of Facebook users and Page owners have been able to register short vanity URLs. From the start, Facebook stated that once registered, the URL can't be changed. Recently, however, the rules were updated and users can now change their short URLs. This only applies to profiles; Pages have to settle for the URL they have already grabbed.
Currently you need more than 25 fans to qualify for a short URL, but the limit might be raised soon. Originally it stood at 1000 fans for several weeks... then it was suddenly lowered to 0... then 25... and within hours it was raised to 100 fans, where it stayed for two months. At the beginning of September it was lowered to 25 fans again. But there's no telling how long this will last: Facebook is still testing for the most profitable number, at which Page owners are most willing to pay for ads to get visitors.
So you had better hurry up and register your special URL before the minimum requirement goes back to 100 fans.
How to register a short Facebook URL
Once you have acquired at least 26 fans, you can register a short URL. By default you are given some username suggestions based on the name of your Page. If you administer many Pages, you can choose a username for each of them. "Username" is the name Facebook has given to these short URLs; some call them vanity URLs. A username may contain alphanumeric characters (a-z, 0-9) or a period (.), and the minimum length is 5 characters. Some words, like "screw", are censored and can't be used. You also can't register generic words like "pizza."
You should get an URL that is closely related to your brand name. If you don't, it might be reclaimed later. Some trademark owners have prevented their name from being used. If you manage to get a trademarked URL, it can be taken away at any time by the real owner.
In order to register the URL you need to verify your account with a code that will be sent to your cell phone. If you have many Pages you only need one cell phone and one verification.
If you want to change your already registered URL to something else, then go to Settings... Account Settings... and click to change the username. If a user changes his short URL, the old one will instantly become available for anyone to register. If on the other hand your account is deleted, the old username will not become available. Unlike profiles, Pages can't change their short URLs. Hopefully this will be allowed in the future.
If you are an application developer, you can register a short URL for your application too, though application URLs use a different address format than the one used for Pages and profiles.
If a Page or application profile has many administrators, the first one to register gets to choose it and after that it can't be changed. So make sure your team is on the "same page" with your thoughts.

Free URL Redirection - Features

Tuesday, September 11, 2012
In theory, there is a large number of features that each free URL redirection service could provide. In practice, free short URL services usually don't provide them all, but only some of them.
I'd like to tell you about the most important features of free URL redirection services. The more features from the list below a service provides, the better it is.
- No ads at all is a very important feature: it means the free URL redirection provider does not inject any forced ads into users' websites. Don't confuse it with "no banner and popup ads", as there are many other kinds of ads that are neither banners nor pop-ups, e.g. pop-under or exit ads. Be careful here :) Some providers may place a very small frame at the bottom of the page with a link back to their site. This is perfectly acceptable, as they have to be able to attract new members!
- Free Domain Name - it means the free url provided looks very professional, like a real paid domain name.
- Free URL Cloaking, also known as free URL masking, is used to mask your real website address with the free short URL provided. Your short URL will always stay in the location bar of your visitors' browsers, so nobody ever knows you are using URL redirection.
- Free Email Forwarding means you get a branded email address with your free short URL, like webmaster@yourshort.url, and emails sent to this address are automatically redirected to the email address you choose.
- Free Path Forwarding further hides the fact that you are using a short URL service. With path forwarding turned on, you can access the files and subdirectories of your website through your free short URL; e.g. www.yourshort.url/forums/ will appear in your visitors' location bar while actually pointing to your real, long address.
- Free Subdomains feature allows you to create subdomains like forums.yourshort.url and point them to different websites and/or to different folders of your hosting account (in other words, you are allowed to create subdomain path forwards of your free short url)
- Meta Tags Support. Meta tags are tags located in the head of HTML pages. The most important are the Title, Description and Keywords tags, and they matter greatly for getting your website indexed by search engines.
- Free Website Statistics. Some free URL redirection services provide website statistics for free - number of visitors, referrals, webpage hits, etc. In this case you will not have to use a third-party statistics service (counter) to monitor your website visitors.
- Dynamic IP support. This feature is useful if you would like to run a web server from your home computer but, unfortunately, have a dynamic IP address.
- Full DNS Support is a very rare feature among free URL redirection providers; it means you are allowed to modify all DNS records (A, NS, MX) for the free short URLs provided.
There are also many less important features provided by free URL redirection providers, like the ability to use your free short URL with and without WWW, support for other less important meta tags, etc., plus some very rare features like POP3 boxes, included guestbooks, chats, etc.

The Power of Search Engine Friendly URLs

Monday, September 10, 2012
I recently invested quite some time in generating search engine friendly URLs for several of my websites to increase my rankings and have more pages indexed. I can highly recommend looking into this if your own website does not have se-friendly URLs. Google especially (the most important search engine nowadays) can be very picky about URLs that are not se-friendly.
Example (two URLs that bring you to exactly the same page):
A long, dynamic URL - something like /index.php?page=article&id=1234&ref=4 - is not se-friendly, and search engines will eventually ignore the page behind it or rank it much lower in search results. Visitors will have difficulty remembering such a URL. These kinds of URLs usually come from dynamic, database-driven websites, where each page is created when requested. Look at a typical forum URL and you get the idea.
A short, descriptive URL - something like /articles/build-better-urls.html - is se-friendly, and search engines will spider the page behind it easily. It can be enriched with keywords to increase search engine ranking. Overall, such a URL is easy to spider and easy for a visitor to remember.
For one of my own sites I was able to increase the number of pages indexed from 36 to over 150 pages - just by making the URLs search engine friendly. The additional pages were ignored by the search engines because they could not read the URLs properly. The domain used in my example went from 20 pages to 80 within 2 weeks and should go to over 120 pages indexed (by Google) with the next Google update.
How do you make your URLs search engine friendly?
Your web host/web server needs to support the Apache web server module "mod_rewrite". This module allows URLs to be rewritten in a certain way. By using a ".htaccess" file you can give the web server the necessary commands to work with se-friendly URLs.
How does this really work?
In general, you are faking nice, clean-looking URLs, leading search engines and visitors to believe that the URLs of your website are se-friendly.
SE-friendly URLs work in 2 steps. 1) Your site needs to display the se-friendly URLs. 2) mod_rewrite and .htaccess 'translate' each se-friendly URL and redirect the traffic to the ugly-looking, se-unfriendly URL in the background (invisible to anyone). You will need to set up the .htaccess file with rules describing how you would like the URL to look and what it translates to (a certain ugly-looking dynamic URL).
The code that generates the URLs dynamically needs to be adjusted to match the rules in your .htaccess file. You upload the code changes and the .htaccess file, and off you go.
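The steps above can be sketched in a .htaccess file. This is a minimal example assuming a hypothetical product.php script that expects an id parameter:

```apache
RewriteEngine on

# /product/123.html (the clean URL shown to visitors and
# search engines) is served by /product.php?id=123 behind
# the scenes, invisibly to the client
RewriteRule ^product/([0-9]+)\.html$ /product.php?id=$1 [L]
```

Your page templates would then link to /product/123.html everywhere, while the actual dynamic script stays untouched.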
Can every website be modified?
Most websites with dynamic URLs can be modified if the server environment meets the requirements. Each website needs to be looked at separately to get the best results.
The learning curve for creating se-friendly URLs can be quite steep, but spend the time and resources on it - the results can be overwhelming.
Chris Puetz is a successful small business owner (Net Services USA LLC) and international author.

Kick-Start Your SEO Campaign With a Perfect URL

Sunday, September 9, 2012
The URL of a website no doubt holds a lot of significance. It is a very important on-page factor that can impact your SEO campaign considerably.
Unfortunately, some website owners ignore URL issues in on-page optimization and face the consequences. If, however, one starts being careful about URL creation, it can have a positive impact on an SEO campaign and contribute to its success. In this article we will discuss the common mistakes made in the creation of URLs and how they affect SEO efforts.
Most Common URL Mistakes which can badly affect your SEO campaign:
Insufficient Keywords
If we talk about keywords in URLs, we are surely going to get into a debate, as some believe that keywords are essential for a proper URL and some believe they are not. However, when it is all about SEO, Google rules, and Google says that keywords in the URL do help your SEO efforts. I agree with that. A recent Google update confirmed that inner pages are ranking much more frequently for certain strong keywords than the homepage. Therefore, it is always favorable to give enough weight to your website's URLs with the targeted search terms.
Too Much of Everything (keywords) Is Bad
As discussed above, keywords in the URL are recommended, but keyword stuffing is bad: if your URL has too many keywords, you are going to ruin things rather than improve them. There are no firm data sets demonstrating the negative impact of too many keywords in the URL, but Google did mention that their algorithm looks out for this. If you stuff your URL with too many keywords, you are hampering it.
Improper Semantic Structure
Always use a well-crafted semantic structure for your website's URLs, as it helps search engine spiders a lot. A proper semantic structure is beneficial in that Google can pull it into a SERP and display it in place of a confusing URL string.
Various content management systems allow you to bypass dynamic characters in URLs. Take WordPress, for example: instead of using the page ID in the website's URL, you can instruct WordPress to use an SEO-friendly permalink pattern.
Dynamic AJAX Content
At times users complain that only their top category pages are visible in search engines, i.e., no long-tail queries return their sub-categories. Do you know what the reason can be? It's the use of AJAX to generate these pages with dynamic content via the hash parameter, which becomes a hurdle to the indexing of the URLs.
However, back in 2009 Google announced changes allowing these pages to be indexed: an exclamation mark token ("!") has to be added after the hash ("#") within an AJAX URL.
Duplicate HTTPs Pages
Another issue that often goes unnoticed is that of http and https pages rendering the same content. This commonly happens when a user enters a secured https area and then leaves it for a non-secured http page, but the navigation holds on to the https prefix because of relative URL links. This leads to a situation where https is rendered on all pages of the site thereafter. To fight this situation we need to take two steps.
Step 1 - All navigation to non-secured pages has to use http rather than https; this is possible by hard-coding the links or by removing relative URLs from secured pages.
Step 2 - In the non-secured areas, the https versions should be 301 redirected to the correct http versions.
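Step 2 can be sketched with Apache's mod_rewrite, assuming the server sets the standard HTTPS environment variable:

```apache
RewriteEngine on

# If the request arrived over HTTPS, permanently redirect
# the visitor to the same host and path over plain HTTP
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```

In practice this block would be limited to the non-secured areas of the site (for example, placed in their directory's .htaccess file), so the genuinely secured pages keep using https.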
Category IDs
Category IDs are used by many sites within their URLs and are generated most of the time by their CMS. All in all, long strings of numbers, letters and symbols in a URL are meaningless to both a human visitor and a search engine spider. To maximize the site's SEO impact, include keywords and logical semantics within the URL: these IDs should be turned into pertinent, descriptive text.
Most CMS platforms have this "pretty URL" ability built in. If not, all you need to do is map each of the IDs to a relevant handle, such as a product's name or category.
Session IDs
Certain ecommerce sites track the activities of visitors, such as adding products to shopping baskets, by appending session IDs to the end of URLs. These IDs are essential for interaction with user-specific functionality, but they can obviously cause duplicate content issues: since each ID is unique to every visitor, many duplicate website pages are created.
Solution - remove session IDs from the URL string and place a session cookie instead. The cookie works much as the ID did, but since it is stored on the user's machine, it won't affect the URL.
Rendering of Index File
At times a website renders both the root directory URL and the root appended with the index file (index.php, index.aspx, index.html, etc.). Search engines treat these as two separate pages, so both get indexed, resulting in duplicate content.
However, this is easy to rectify. Just like the trailing-slash fix, a 301 redirect rule has to be established to point one to the other. For the best usability, I would redirect the version that includes the index page to the root directory URL without it.
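One possible mod_rewrite sketch of that 301 rule (assuming Apache; the index file extensions listed are just examples):

```apache
RewriteEngine on

# Redirect /index.php, /index.html, /blog/index.html, etc.
# permanently to the bare directory URL, so only one
# version of each page ever gets indexed
RewriteRule ^(.*/)?index\.(php|html|aspx)$ /$1 [R=301,L]
```

The optional (.*/)? capture makes the rule work for subdirectories as well as the site root, where $1 is simply empty.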
The URL is an important factor in search ranking. Avoid these mistakes, make your URLs search engine friendly, and kick-start your SEO campaign.
Link Building Works is a link building company offering one-way link building and link building services for higher search engine rankings, increased visibility, high Google page rank and quality traffic at highly affordable link building packages.

How to Shorten URLs - Make Your Links More Attractive and Useful

Saturday, September 8, 2012

Web applications typically employ long strings of characters in their URLs. These descriptive character sets often represent command syntax, paths, session data, and hierarchies in a general system of information. The result is awkward URLs that are hard to remember and hard to fit within the limits imposed by other web apps, including sites that cater to microbloggers. URL shorteners mitigate this issue through redirection from a long URL to a shorter, more concise one.
If you want to add a long URL link to a blog post, blog comment, or forum comment, you might end up with a link that is cut off, and a cut-off link will not work. You may have the same problem with microblogs such as Twitter, which limits messages to 140 characters, so copying and pasting anything other than a short URL may be useless. In that case you will need access to a simple URL shortening script.
Shorter URLs are more convenient to use for posting in forums, blogs, social networking sites, and other web applications. These abbreviated URLs are also better suited for e-mails, as you can send a URL without having to wrap or break it. You can also hide affiliate links, or customize and turn these into unique and personal addresses with keywords. Exceedingly lengthy URLs turned brief by a simple short URL maker are more meaningful, and enhance the recall factor of the URL and site.
A URL shortener offers Internet users a number of ways to abbreviate URLs. With a simple URL shortening script, you just paste the link and click a button to condense the URL. URL shortening PHP scripts let you add URLs straight to Twitter, observe the number of clicks on the abbreviated URLs, and more. For example, you can use a URL shortener script, which merely redirects from the longer, fully qualified URL to the shorter one, to also track hits. This can be a very effective tool for testing the effectiveness of various online marketing campaigns. For example, you can post one version of the shortened URL on one site in an ad campaign and another version on a different site, then observe the hit counts to determine which was more effective.
One last idea: you could use a URL shortening script to create link bait for your site, by offering a free toolbar or Firefox plugin that uses your script as a way of driving traffic to your site and earning links. As you can clearly see, the uses for a URL shortening script are plenty.

Security Risks of URL (Link) Shortening Services

Friday, September 7, 2012 1 comments

We all share URLs (website links) with each other through emails, blogs, social media sites, bookmarking websites and word of mouth, and we rarely, if ever, think about the potential security risk this simple act can raise. In this article I will outline some of the potential risks involved in sharing shortened links supplied by URL Shortening Services.
What is URL Shortening?
The idea behind URL shortening or link shortening is very simple: take a long URL and map it to a much shorter one. This is what URL shortening services do. They provide a shortened URL which is then mapped to the original long URL.
For example, an original destination link 220 characters long would typically be reduced to 25 characters; this is the link you would pass to your friends. When your friends click the short URL, the shortening service website maps this short URL to the original and redirects the user.
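Under the hood, that mapping step can be sketched with Apache's mod_rewrite, the module this blog usually covers. This is a minimal, hypothetical sketch, not how any particular service actually works: the map file path, the /s/ prefix, and the codes inside shortlinks.txt are all illustrative assumptions.

```apache
# Hypothetical httpd.conf sketch of a shortener's mapping step.
# shortlinks.txt is an assumed lookup file with lines like:
#   abc123 http://www.example.com/a/very/long/path?with=many&parameters
RewriteEngine On
RewriteMap shortlinks txt:/etc/apache2/shortlinks.txt

# /s/abc123 -> whatever long URL abc123 maps to, as a 301 redirect
RewriteRule ^/s/([a-zA-Z0-9]+)$ ${shortlinks:$1} [R=301,L]
```

Note that the RewriteMap directive is only valid in the main server or virtual host configuration, not in .htaccess files.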
Social Media Sites
With social media sites such as Twitter, the sharing of links is problematic at best because of the 140 character limit imposed on message length. This problem has given rise to the proliferation of URL Shortening Services, and while these services do a fantastic job on the whole, there are risks involved in trusting a third party to redirect your links.
There are over 100 URL shortening services online, the majority of which are free. A more complete list of these services can be found on my website.
Security Risk 1 - Link Manipulation
What is the link destination? There is usually no way of knowing your final destination until you click the link. The true target is obscured.
Admittedly this risk applies to all link cloaking technologies but usually when you receive a dubious link via email for example, you can either view the plain text version of the link or hover your mouse over the URL to see the destination address and assess its validity.
With all shortened URLs this first line of defence has been removed: you don't know where clicking the link will take you. Email phishing scams use URL shortening services for this very reason.
Security Risk 2 - Ineffective Spam Filters
Because the original URL is not available spam filter systems cannot make a judgement about the validity of the URL. Plus, with the shortening services being freely available and taking only seconds to use, keeping abreast of this problem is almost impossible.
Many shortening services take spam complaints very seriously and disable spam URLs immediately. Some services actively scan registered URLs for blacklisted websites and disable the shortened URLs, but, no sooner is one removed than another takes its place.
Even the Safe Browsing features of web browsers such as Firefox and Google Chrome, which warn users of malware or phishing sites, are no match for shortened URLs. No warning will be issued; instead, users are sent directly to the potentially dangerous web page.
Security Risk 3 - Compromised Shortening Service
A considerable number of the URL shortening services I've visited have not been very secure. Several let me drop down to their directory structure just by typing an invalid URL. Poor security leaves sites open to risk, hacking such a website would allow popular shortened URLs to be redirected to phishing or malware sites.
Security Risk 4 - Privacy Issues
Link shortening services are in a position to track users' behaviour across many domains, creating potential privacy issues.
Security Solutions - Transparency
Some URL shortening services are actively trying to solve security issues by adding a "see before you click" functionality to their short URLs.
  • Any tinyURL shortened URL can be prefixed with the text "preview" to show the destination address.
  • A BudURL shortened link can be previewed by adding a "?" to the end of the URL.
  • Some services provide a popup window to display the destination webpage when your mouse is hovered over the short URL.
Unless you are sure about the links you are clicking and you know the link comes from a reliable source (this applies to all links, really), be very wary of the destination page you are about to reach. We've all become very flippant with unknown links, especially if they come from our friends. Be careful out there.

Google AdWords - Display URL For Affiliates

Thursday, September 6, 2012 1 comments

Since an increasing percentage of the users of the AdWords Forum are new affiliates, this article is meant to cover a common misunderstanding leading to problems with the Display URL. Basically, when you are creating an ad you have to complete two URL fields: the Display URL field and the Destination URL field. The Display URL is what is shown to a Google Search user on the search results page, while the Destination URL determines where the user will be taken on clicking your sponsored link (another name for an advert).
After a few days of study, most new affiliates seem to be knowledgeable about the requirement that the Display URL should match the Destination URL. What does the word *match* mean in this context? It means that the domain names associated with the 2 URLs should be identical. The usual problem for affiliates is generally associated with the fact that most of them are registered with one of the major affiliate networks, such as ClickBank, Commission Junction (CJ) or ShareASale. As a rule, these networks provide their affiliates with affiliate links (or HopLink at ClickBank) that don't match the website URL of the online merchant. In fact, these affiliate links are tracking URLs, serving statistical and reporting purposes, that redirect to the particular webpage of the online merchant.
And that's the root of the problem. When your Destination URL is a tracking URL the known rule is simply not true. The Display URL is not supposed to match the Destination URL when it's a tracking URL. In such cases, the Display URL has to match the URL of your Landing Page which is the webpage the user will be taken to on clicking on your advert displayed on the user's search result page. Of course, the Landing Page is, in most cases, a page of the website of the online merchant.
* Display URL: http://… (merchant's domain) ...shown to user
* Destination URL: http://… (aka Tracking URL) ...redirects to product page
ShareASale is one of the popular affiliate networks and the above URLs were the details of an old experimental advert, not in use any more. HomeLivingStyle used to be an online merchant participating in the network. The Destination URL was the affiliate link provided by the network. In fact, it was a tracking URL so that the network was able to keep records of clicks the online merchant received from the affiliate link via the network. It redirected to a product page of the website of the online merchant, namely to a page selling some sorts of electric fireplaces on the HomeLivingStyle website.
The striking thing in the above URL configuration is that the Display URL does not match the Destination URL - because it shouldn't. When a user clicked on the advert he was taken to:
* Landing Page: http://… (merchant's product page)
which was a working page of the online merchant in those days. It was what we call the Landing Page. Remember that the Display URL is supposed to match the Landing Page URL when your affiliate link is a tracking URL. In such cases, the Destination URL is different from the Landing Page: the former only serves tracking purposes, while the latter is an actual page on the merchant's website.
B Lakatos is one of the Top Contributors selected by Google at the official AdWords Help forum. In cooperation with a few other Top Contributors, B Lakatos runs a website that helps people looking for practical assistance with AdWords. A real must have for newbies and beginners.

Dealing with Dynamic URLs (SEO)

Wednesday, September 5, 2012 0 comments
This is a slightly advanced topic on SEO.
What are Dynamic URLs?
Nowadays, a lot of sites are dynamic, that is, their site draws information from somewhere (usually a database) and outputs this information into your browser. Think of them as webpages created "on the fly". The biggest advantage of this is that you can create a lot of dynamic pages easily and consistently without too much additional coding. Just use a template, then output different information into it. For example, if you were to browse this site, you would realize that the pages are displayed with the same layout but with different content. Dynamic pages can save webmasters a lot of time, effort, and mistakes as well.
Thus, a site with a lot of pages and content will usually be dynamic, because it would be too tedious for the webmaster to manually edit and recreate the pages and the internal links. They are usually built with server-side technologies like PHP or ASP, with a database like MySQL as the backend storage.
So now we know what a dynamic site is. But what about dynamic URLs? Dynamic sites usually have dynamic URLs. Let me give you an example: [] . From this URL you can sort of see that it is pointing to some category (through a search function) in some directory of a dynamic site. The special characters (e.g. "?", "=", "&" and so on) are sometimes called STOP characters, for obvious reasons. These are the characters you should be careful about.
Why don't Search Engines like dynamic URLs?
Simple: dynamic URLs can cause problems for Search Engine indexing, specifically duplicate content. Search Engine spiders are just not eager to index dynamic URLs because of this.
- A dynamic page may create more pages, sometimes trapping the search engine bots in an endless loop and, worse, creating tons and tons of pages to index which are practically useless to Search Engine users.
- Search Engines like unique content, not duplicate or near-duplicate content. Dynamic sites may create pages which are similar (or very similar) in content with just different URLs. It's like having one page with a lot of different URLs pointing to it. Think of the poor soul who is searching for something, only to find a lot of links all pointing to the same page!
- Session ids ("sid=") can cause problems with duplicate content as well. Session ids are usually placed in the URL to tell the web server who the user is. Think of them as cookies in your URL; a lot of websites use them. You must have been to sites which require you to log in, some of which use session ids. A different session id each time (and thus a different URL) may point to the same page, and this creates duplicate content for the Search Engine to deal with. Also, URLs with session ids expire, and if the server is strict this leaves a dead link. If a Search Engine indexes this dead link, the next time someone clicks on it they get an error page, which everyone (you, me, and the Search Engines) hates. Although Google DOES index URLs with session ids, they are still best avoided.
Although Search Engines do read and index dynamic URLs (even those with session ids in them), they are best avoided. Your purpose as a webmaster is to HELP Search Engines index your site; don't make it difficult for them with dynamic URLs. In the past, a lot of Search Engines disregarded everything in the URL after the stop characters, which resulted in truncated URLs: [] might be truncated to [], leaving you with many copies of the same URL. Think of the problem this would create for the Search Engines and for you as a webmaster (the Search Engine indexes only one page!). Of course, that's in the past, but it doesn't mean dynamic URLs are no longer problematic. I certainly won't blame the Search Engines for not indexing dynamic URLs, given the problems they cause both for Search Engine indexes and for Search Engine users.
What can I do?
Basically, you need to change your dynamic URLs to static URLs, or at least reduce the number of STOP characters in these dynamic URLs. Static URLs look like [], for example; you can see they don't have the STOP characters in them, and they look neat and juicy to Search Engine spiders.
- If your site is small, do you really need a dynamic site? If you have only 10 or 20 pages, probably not. Yes, it may take some effort, but using a static site can help Search Engine bots index your site faster and more efficiently. And best of all, you have full control over your layout and scripts, and the site is probably much easier to modify and loads faster as well.
- Use a URL rewrite script or feature. If your web server is Apache (a very popular web server), it has a feature called mod_rewrite which displays dynamic URLs as static-looking ones and can be configured in the .htaccess file. However, this requires some expertise and knowledge of Apache and mod_rewrite. Those on MS IIS servers also have a choice: there are modules which can create static URLs from dynamic ones. This method is the most popular and most effective way to get rid of dynamic URLs. Ask your programmer for help.
- Use robots.txt to block Search Engine bots from performing certain actions, like search and so on. These actions usually generate dynamic URLs and can create duplicate content, because they may point to a page which can also be accessed by another URL. For example, you may have a link which shows your visitors a random page, but that page can also be accessed by a different link. Two different URLs to the same page equals duplicate content, so just block Search Engine spiders from following the random link. However, you must be careful not to block the bots from indexing important parts of your site when you do this. Yes, this method will not completely remove the dynamic URL problem, but it will minimize the duplicate content problem.
- Reduce the number of STOP characters and variables in your URLs if possible. Ask your programmer to use as few of these characters as possible when coding.
- Use cookies instead of session ids. Although some users may disable cookies, most won't. However, you have to ensure that your webpage allows requests made without cookies, because search engines don't accept cookies. If your code requires the user to have a cookie, then your site may not get indexed, as search engines can't navigate it.
- Use sitemaps. Placing a sitemap on your website helps Search Engines index your site more efficiently than the other internal links (which are dynamic) can. And submit it to Search Engines if possible. Oh, and make sure the link to your sitemap is static, by the way!
- Cloaking. Ask your programmer to write code especially for Search Engines, so that if a search engine bot requests a page instead of an actual human visitor, it is redirected to another page with a static URL. This way, you won't have to worry about session ids causing you and the Search Engines problems. Note that Search Engines do not like cloaking. But if your purpose is to prevent session id problems, and the cloaked page is the same as what your visitors see (except without the session ids in the URL), I think Search Engines would not mind.
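To make the mod_rewrite suggestion above concrete, here is a minimal, hypothetical .htaccess sketch. The script name show.php and its parameters are illustrative assumptions, not a real site's layout:

```apache
# Hypothetical .htaccess sketch: serve the static-looking URL
# /articles/2012/my-article through the dynamic script show.php.
RewriteEngine On
RewriteRule ^articles/([0-9]{4})/([a-z-]+)/?$ show.php?year=$1&slug=$2 [L]
```

The visitor and the search engine spider only ever see the clean URL; the rewrite to the query-string form happens internally on the server.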
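The robots.txt suggestion above can be sketched like this. The paths are hypothetical, and note that the wildcard in the last line is an extension honoured by major crawlers such as Google rather than part of the original robots.txt standard:

```
# Hypothetical robots.txt sketch: keep crawlers out of dynamic
# search results and session-id URLs, but allow everything else.
User-agent: *
Disallow: /search.php
Disallow: /random.php
Disallow: /*?sid=
```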
The advantage of changing to static URLs is that you can add keywords to them, like []. You can see that such a URL is full of keywords: "cars", "toyota" and "service center". Some major Search Engines place relevance on URLs, and that includes Google.

Why Use A URL Redirect?

Tuesday, September 4, 2012 0 comments
The Benefits of URL Redirects and Sub Domains
Do you have a website with a long and difficult to remember URL? More than likely you do, because you did not realize the importance of a short, easy to remember URL when you purchased your first one. Then, once your website was up, you assumed there was nothing you could do about your URL, so you left it as it was, never suspecting there was a way to keep your URL and website the same yet have a shorter address that is easy to remember and helps visitors return to your site time and again. The answer is simply URL redirects and sub domain services. Basically, a URL redirect or sub domain changes your hard to remember URL into a short, easy one, making it simpler for visitors to return and for new guests to find you. You are probably wondering where you can get this type of service. The answer is at []
At [] you will get all of the benefits you need from a URL redirect service. First of all, you will enjoy free unlimited domains and redirections, as well as free email and a spam blocker. There are also features like creating Meta tags, checking backlinks, and tools that will help you grow your site just the way you want. In addition, you can pay a small monthly fee to have all ads removed from your redirects. All that you want and more is available at [], so if you are interested in free sub domains or URL redirects then you need to check out this website.
Why Use URL Redirects and Sub Domains?
There are many reasons why you might want to use URL redirects and sub domains that are not simply to help people remember your website. Some of them include moving or changing your URL, advertising the wrong URL, as well as other things that might occur along the way that would require you to change your URL. In addition a URL redirect helps website visitors find what they are looking for, which is very important for the website owner.
When you move or change your URL it is very important to use a URL redirect or sub domain, because if you don't, the bookmarks that individuals use to reach your website will no longer work and your customers will not be able to find the information they were looking for. This does not benefit you at all; in fact it can lose you a lot of business. So, if you change your URL or move to a new server and need to change your URL, make sure you redirect, so you can keep your customers and they can still find the information they are looking for.
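On an Apache server, the redirect described above can be sketched in one line of .htaccess. The paths and domain here are illustrative only:

```apache
# Hypothetical sketch: send visitors (and search engines) from the
# old address to the new one with a permanent (301) redirect.
Redirect 301 /old-page.html http://www.example.com/new-page.html
```

The 301 status code tells browsers and search engines alike that the move is permanent, so existing bookmarks and index entries follow the page to its new home.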
Another benefit is that if you accidentally advertise the wrong URL, you can easily redirect it and have individuals find the right website that way. This can be very useful for keeping up with visitors to your website who might otherwise be lost if your website is not redirected.
Also, you still want individuals using the search engines to be able to find your site as well. When you redirect your site the search engines will use the URLs they already have and simply redirect the results for you. This really helps individuals who are looking for particular information find just what they are looking for.
[] Can Redirect For You
If you really know computers then you should have no problems redirecting your URLs yourself. However, the majority of computer users either do not have this type of knowledge or they simply do not have the time. Because of this using a service like that on [] is really a great option, not to mention free with many features and benefits. Also, it is really easy to get started.
Simply visit the website at the URL above and register for the URL redirect service. Of course, you will need to check that the redirect URL you want is available; as long as it is, the website will show you where to go from there. If not, you will need to find a new URL that has not already been taken. You need an original URL so that your site can be redirected. Really, having your URL redirected takes little time and effort when you use []. The service is free, but your new website redirect will appear with some advertisements. If you do not want the advertisements, you simply pay a small monthly fee to have your URL redirect ad free! It is as simple as that.
Don't lose all of your hard work and customers because you need to change your URL for whatever reason. Instead use [] to redirect your website to another URL and keep up with all of your past customers as well as your links that are already established on the web. This is really a great service and one that will benefit you and your website so check out the services from ly2 today and have your website redirected!

Short URLs: The Pros and Cons - Should You Be Using Short URLs on Twitter, Facebook, and the Web?

Monday, September 3, 2012 0 comments

Some explanation first. Short URLs are long URLs that have been shortened to make them fit more easily in a text message, Facebook update, or Tweet. In and of themselves there is nothing wrong with Short URLs, but you should consider the Pros and Cons before using them.
There are lots of reasons why you may want or need to use a shortened URL. If you do a lot of Tweets on Twitter you will almost certainly need short URLs when you refer to web sites; recently Twitter initiated an automatic URL shortening function on its site. The reason is Twitter's limit of 140 characters per Tweet, basically what you would have available if you were using a cell phone for texting. If you want to put a URL to a web page in a Tweet and include a bit of explanation, you can easily go over those 140 characters. This is where shortening a URL is handy: it gives you more room for comment. Also, some longer URLs are themselves more than 140 characters in length, especially URLs from sites using dynamic content to build pages, such as shopping sites and news services. Fortunately Twitter has information and links to URL shortening services to help you do this (more on these services later).
As an example, a typical product link on Amazon or another online sales site can contain 100 or more characters, with 150 not being unheard of. A URL of 120 characters would leave you with only 20 characters for comment on Twitter, and even if you aren't using Twitter it leaves you with a very hard to remember URL.
The same URL shortened using a Short URL service on the web could contain 20 characters or less.
Many URL shortening services allow you to create custom shortcuts for your links so that you can retain some name recognition. As long as no one else is already using it, you can choose any shortcut name you want. Typically the shortcut then becomes the last part of the link, following the name of the URL shortening service.
Another reason you may want to shorten a URL is to make it more user friendly and memorable. In the examples above it would be much easier to remember the shortened URL instead of the long one.
So, those are two very good reasons to use a URL shortening service; now let's take a look at some of the downsides. You need to weigh the pros and cons and consider your audience before you use a shortened URL. The biggest downside to URL shortening services is that you are no longer showing the actual URL. For instance, in the examples above the long URL is obviously going where you say it is, because the actual path is displayed in the URL, but with the shortened URL it is not clear where the link points.
This can lead to concerns with your audience, since they cannot immediately tell where the link is pointing. It could be pointing to the product as you claim, or it could be pointing to a malicious web site instead. This is why I recommend against clicking on shortened URLs from people or sites you don't know; it is a security risk. If your audience is unsure of who you are, you may lose clicks by using a shortened URL.
Another drawback is the lack of name recognition. If you directed your buyers to your home web site instead of using a long URL from an online store you could use a much shorter URL naturally and keep your web site name visible at the same time.
But what if you need to sell an online store title or some other item with the same long URL problem? A great solution is to set up an affiliate web site for your sales, with pages on your site drawn from the affiliate program or other affiliate service. Your own URL, which you would use in your advertising, would lead to your site, and the links on your pages would use the long URLs.
So, now for my recommendations:
• If your audience knows you then go ahead and use shortened URLs if needed
• If you want the name recognition then try to set up relatively short URLs on your own site
• You may need to use shortened URLs if you use a service like Twitter
• If your audience does not know you but you still need shortened URLs, again you should try to set it up through your own site
• And finally, if your audience does not know you and you can't use your own web site then you can try using a service which has a preview feature, allowing your viewers to see the page before they go to it
To find services that create short URLs, do a search on Google for "short URL"; you will find several free services that do a great job.

URL Rewriting Simplified - An Academic View on URL Rewriting

Saturday, September 1, 2012 0 comments
This article explains the technical aspects of URL Rewriting. It is a starter guide for beginners who need to know the basics of URL Rewriting.
What is URL Rewriting? - An explanation of the process called URL Rewriting.
URL Rewriting is the process of manipulating a URL, or link, which is sent to a web server, in such a way that the link is dynamically modified at the server to include additional parameters and information, along with a server-initiated redirection. The web server performs all these manipulations on the fly, so the browser is kept out of the loop regarding the change made to the URL and the redirection. The client is thus given the impression that the content is fetched from the original location mentioned in the URL it requested.
The Rewriting Engine scans the URLs and makes the needed restructuring on the URL based on pre-defined conditions and rules.
Let's illustrate this with a real life example.
Consider the following URLs.
Both of the above URLs carry a query string, a common sight in almost all dynamically driven websites. These links are supposed to show the user the details of a product with an id of 345, or show products with a category name of "cars", from the website's database. Such URLs have many limitations: the links are cryptic for humans, they are not very search engine friendly, and the query string is prone to easy manipulation from the client side. So let's check out how these URLs can be written in a more friendly way.
Both of the above links can be called friendly URLs, since they are devoid of cryptic parameters and are much easier on human eyes. On the server side these two clean URLs can be transformed back into application-specific URLs using various URL Rewriting methods. The rewriting module at the server converts those links back to their original form by recreating the query strings, supplying them with the values sent from the client through the clean URL, and then performing an internal redirect to the new URL. One thing to note at this point is that the redirection initiated by the URL Rewrite module is totally different from a normal HTTP Redirect. When an HTTP Redirect happens, the browser is made to fetch the new URL, whereas a rewritten URL is fetched without the knowledge of the client side. The transfer to the new URL happens behind the scenes as far as the requesting browser is concerned.
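In Apache this internal transformation is done with mod_rewrite. The following is a minimal, hypothetical .htaccess sketch built around the article's id 345 and "cars" examples; the script names product.php and category.php are illustrative assumptions:

```apache
# Hypothetical .htaccess sketch of the rewrite described above.
RewriteEngine On

# /product/345  ->  product.php?id=345
RewriteRule ^product/([0-9]+)/?$ product.php?id=$1 [L]

# /category/cars  ->  category.php?name=cars
RewriteRule ^category/([a-z]+)/?$ category.php?name=$1 [L]
```

Because neither rule carries the [R] (redirect) flag, Apache performs the substitution internally: the browser requests the clean URL and never learns that product.php served the page.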

Why URL Rewriting is used?
  1. Making URLs User Friendly - Consider a content management system used as a platform to publish recipes you have written. In such a scenario there will normally be only a single server-side page or script tasked with serving clients the recipe details, based on an identifier supplied for each recipe. Links to recipes will look like these: websiteurl/showrecipe.php?recipeid=210 and websiteurl/showrecipe.php?recipeid=430. These URLs have a major drawback: they are not friendly from a user's perspective. They lack usability since they are not short, they are not easy to memorize, and they do not depict the structure of our recipe library. In such a scenario we can utilize URL Rewriting to bring a bit of user friendliness to our links.
  2. Making Spiders Happy by Providing Search Engine Friendly URLs - Think back to our CMS scenario for the Recipe Collection. Even if we have a collection of 500 recipes, there will be only a single page responsible for serving them. This page serves the content dynamically, and that causes some problems with Search Engine Optimization. Most search engines are not happy with the idea of indexing dynamic links with obscure characters in them, and this can result in improper indexing of a website; your website won't be indexed correctly by those search engines. Since most search engines give more importance to static links and index sites with static links faster, we can create search engine friendly links for our dynamic website using URL Rewriting.
  3. Keeping the Link Structure of a Website Permanent and Keeping Link Rot in Check - Website restructuring can often cause a lot of havoc with incoming links. It may cause links from external sources, like other websites, bookmarks, etc., to point to non-existing resources. When there are links out on the net pointing towards resources which are no longer where they used to be on your website, the result is broken links. This is called Link Rot, and URL Rewriting can be used to prevent it.
  4. Restricting Hot Linking to Keep Bandwidth Thieves at Bay - URL Rewriting can be used to prevent other websites from hot linking content hosted on your server. Hot linking is the use of a media object, usually but not exclusively an image file, on a website by directly embedding the content in the page while the actual resource resides on another server. This wastes bandwidth on the original server each time the website piggybacking on the content is displayed. We can utilize URL Rewriting to restrict hot linking by checking the HTTP Referer before serving the content.
  5. Preventing External Parties from Identifying the Technology Powering a Website at a Casual Glance - Even though this is not a foolproof way to prevent someone from footprinting a website, URL Rewriting prevents a person from identifying the server-side technologies which drive a site just by looking at a URL. The extension part of page names can be removed, and the URLs rewritten on the server side.
  6. Considering Other Security Aspects - Links are made more secure and a bit more resilient against casual query string manipulation and other malicious injection techniques. The URL Rewrite technique can also be used to make on-the-fly conditional checks before serving a page, and these checks can serve different outputs to clients based on environmental parameters like the IP address from which the request originates, the User Agent making the request, etc.
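Several of the uses above can be sketched in one hypothetical .htaccess fragment. Every domain and parameter here is an illustrative assumption; only showrecipe.php and recipeid come from the scenario described above:

```apache
RewriteEngine On

# Use 1: friendly recipe URLs, /recipe/210 -> showrecipe.php?recipeid=210
RewriteRule ^recipe/([0-9]+)/?$ showrecipe.php?recipeid=$1 [L]

# Use 4: restrict hot linking of images. Serve 403 Forbidden when a
# Referer header is present but does not belong to this site.
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.(gif|jpe?g|png)$ - [F]

# Use 5: hide the technology, let /about serve about.php
RewriteRule ^about$ about.php [L]
```

The hot-linking rules deliberately allow requests with an empty Referer, since some browsers and proxies strip the header; blocking those as well would break images for legitimate visitors.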

How can I implement URL Rewriting?
URL Rewriting can be implemented using various methods depending on the web server. On IIS this can be done by creating an ISAPI filter, or by using an HTTP handler with the System.Web.HttpContext class available in ASP.NET. On Apache, URL Rewriting can be achieved using the very powerful but fear-invoking mod_rewrite module.
I am planning a follow-up article in the form of a starters guide for using mod-rewrite module on Apache.
A research-loving computer teacher interested in Paranormal Science, Martial Arts, Military History and Security, Pramod S Nair is now the C.T.O of Humming Bird Informatics, an Internet-based business solutions provider.