There really is no reason to prevent links into your website, other than to appease Compliance and Legal teams who are covering all options. Some sites have gone down under high traffic loads after being listed on Boing Boing or Slashdot (not mine - not yet anyway), and that could be why IT teams have got into a fluster and legal teams have tried to apply impossible T&Cs. I’m not talking about content theft here - hotlinking images from your site or scraping content is a different matter.
According to the W3C Technical Architecture Group, “any attempt to forbid the practice of deep linking is based on a misunderstanding of the technology, and threatens to undermine the functioning of the Web as a whole”. That’s pretty serious - but no compliance/legal team is going to listen to that when their job is potentially on the line.
A couple of good resources that got me writing this short post:
The insane world of “No linking Policy” – what happened to the interNET? (was from stateofsearch.com but site now dead)
Links and Law: Myths
Edit - Jan 2015
As a postscript to this article - yes, you would now want to prevent bad links coming into your website. It is possible to attack someone’s SEO ranking by setting up loads of backlinks from bad websites, thus reducing the site’s search rankings. This is called negative SEO and is an increasingly common tactic.
I still hold, at this moment, that requiring permission for deep linking is fundamentally wrong though.
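As an aside on dealing with negative SEO: the usual countermeasure (not something the original article covers, so treat this as a hedged sketch) is Google’s disavow tool in Search Console, which accepts a plain-text file listing links you want ignored. The domains below are made up for illustration:

```text
# disavow.txt - uploaded via Google Search Console's "disavow links" tool.
# Lines starting with # are comments.

# Ignore every backlink from an entire domain (hypothetical spam site):
domain:spammy-link-farm.example.com

# Ignore a single specific page:
http://bad-neighbourhood.example.net/links/page1.html
```

One file per site; re-uploading replaces the previous list, so keep the full set of bad links in each upload.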
Moving a site to new hosting can be fraught with issues. Here is one technique that I find useful when moving a website to a new hosting server.
A plaster model kit of a classic French wooden goods shed.
Completion of the SRB001 Freelance DEUTZ Style 0-4-0DM chassis, including the running gear and motor, with the bodywork to follow.
Stuart Brewer is a fellow GDNGRS member and one of the most accomplished model makers I know - this is his first kit, and these are my attempts at building it.
Whitesands Quay’s first outing gets it out of the house and sets some deadlines for me to work to.