I'm looking into building a content site with possibly thousands of different entries, accessible by index and by search.
What measures can I take to prevent malicious crawlers from ripping off all the data from my site? I'm less worried about SEO, although I wouldn't want to block legitimate crawlers altogether.
For example, I thought about randomly changing small bits of the HTML structure used to display my data, but I guess it wouldn't really be effective.
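To make the idea concrete, here's a rough sketch of what I had in mind (Python on the server side; `WRAPPERS` and `render_entry` are just names I made up for illustration):

```python
import random

# Rough sketch: wrap each entry in a randomly chosen tag/class so the
# markup differs between requests. A scraper keying on exact CSS
# selectors would break; one keying on the text itself would not.
WRAPPERS = [("div", "entry"), ("section", "item"), ("article", "record")]

def render_entry(title: str, body: str) -> str:
    tag, cls = random.choice(WRAPPERS)
    # A random suffix on the class name defeats hard-coded selectors.
    suffix = random.randint(0, 999)
    return (
        f'<{tag} class="{cls}-{suffix}">'
        f"<h2>{title}</h2><p>{body}</p>"
        f"</{tag}>"
    )

print(render_entry("First entry", "Some content."))
```

But since the visible text is unchanged, any scraper that parses content rather than structure would sail right through this.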
Any site that is visible to human eyes is, in theory, potentially rippable. If you even try to be accessible then, by definition, your content must be machine readable (how else would a screen reader be able to deliver it?).
Your best bet is to look into watermarking your content, so that at least if it does get ripped you can point to the watermarks and claim ownership.
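As one simple example of the idea, you could hide an identifier in zero-width characters sprinkled through the text. It's trivial for a determined ripper to strip, but it catches the lazy copy-pasters, and you can point to it as evidence. A minimal sketch (the function names here are mine, not from any library):

```python
# Encode a site ID as invisible zero-width characters hidden in the
# text: invisible to readers, recoverable from a ripped copy.
ZW0, ZW1 = "\u200b", "\u200c"  # zero-width space / zero-width non-joiner

def embed_watermark(text: str, site_id: str) -> str:
    # Turn the ID into a bit string, then into zero-width characters.
    bits = "".join(f"{ord(c):08b}" for c in site_id)
    mark = "".join(ZW1 if b == "1" else ZW0 for b in bits)
    # Tuck the invisible marker in after the first word.
    head, _, tail = text.partition(" ")
    return f"{head}{mark} {tail}" if tail else text + mark

def extract_watermark(text: str) -> str:
    # Collect the zero-width characters back into bits, then bytes.
    bits = "".join("1" if c == ZW1 else "0" for c in text if c in (ZW0, ZW1))
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits) - 7, 8))

marked = embed_watermark("Hello world, this is my content.", "mysite")
assert extract_watermark(marked) == "mysite"
```

Per-user variations of the watermark (e.g. encoding the requesting account or IP) would even let you trace *which* visitor ripped the content, not just that it's yours.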