I'm trying to implement breadcrumb functionality on an ASP.NET C# site (built in Visual Studio) using the SiteMapPath control.
The company I work for has inherited the site and we're primarily PHP developers, so forgive the ignorance.
Originally, when I dropped the SiteMapPath in from the Toolbox, I got an error message saying there was no web.sitemap file found. I then created one using an application that supposedly does the job for ASP.NET sites.
The error message we get now tells us that you're not allowed to have the same URL twice in the XML structure. This seems pretty ridiculous, as many pages will have the same links.
Some research has told me to append a unique but meaningless query string to each duplicate URL in the XML. This also seems a bit ridiculous, and a total hack, especially with a site containing potentially hundreds of repeated URLs.
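From what I can tell, that would mean web.sitemap entries along these lines (the page names here are just made up for illustration):

```xml
<!-- Hypothetical example of the query-string hack: the same page is
     listed twice, made "unique" with a meaningless parameter. -->
<siteMapNode url="~/Contact.aspx?dup=1" title="Contact Us" />
<siteMapNode url="~/Contact.aspx?dup=2" title="Contact Us" />
```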
Can anyone shed a little light on this, or maybe even suggest a totally different approach?
Thanks so much!
Basically, the default site map provider (`System.Web.XmlSiteMapProvider`) requires all URLs to be unique so that it can easily resolve the currently selected node via the `SiteMap.CurrentNode` property.
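For context, the SiteMapPath control builds its breadcrumb trail from `SiteMap.CurrentNode`, which the provider resolves by looking up the current request's URL; if two nodes shared one URL, that lookup would be ambiguous. You can see the lookup in action from any page's code-behind:

```csharp
// SiteMap.CurrentNode asks the provider to map the current request's
// URL to exactly one node in web.sitemap; duplicate URLs would make
// this mapping ambiguous, hence the uniqueness requirement.
SiteMapNode current = SiteMap.CurrentNode;
string title = (current != null) ? current.Title : "(no matching node)";
```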
This is a bit frustrating, and it results in people tacking on bogus query strings, as you've noted. For the simple case with only a few duplicates, that's usually acceptable.
You can, however, implement your own site map provider; see Implementing ASP.NET Site-Map Providers on MSDN. By doing this, you can apply your own logic to process the sitemap file and get the behavior you want.
A custom sitemap provider would probably be the cleanest approach in this case.
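To give you an idea of the shape of it, here's a rough sketch of a provider that tolerates duplicate URLs by keying every node on a generated unique key rather than on its URL. The class name, the `~/Menu.sitemap` path, and the "first occurrence wins" lookup rule are all assumptions for the example; the file is assumed to use the same `<siteMapNode>` schema as web.sitemap:

```csharp
using System;
using System.Collections.Generic;
using System.Collections.Specialized;
using System.Web;
using System.Web.Hosting;
using System.Xml;

// Sketch only: tolerates duplicate URLs by giving every node a
// generated unique key. The class name and sitemap path are assumptions.
public class DuplicateUrlSiteMapProvider : SiteMapProvider
{
    private SiteMapNode _root;
    private int _counter;
    private readonly Dictionary<SiteMapNode, SiteMapNode> _parents =
        new Dictionary<SiteMapNode, SiteMapNode>();
    private readonly Dictionary<SiteMapNode, SiteMapNodeCollection> _children =
        new Dictionary<SiteMapNode, SiteMapNodeCollection>();
    // First node registered for a URL wins the breadcrumb lookup.
    private readonly Dictionary<string, SiteMapNode> _byUrl =
        new Dictionary<string, SiteMapNode>(StringComparer.OrdinalIgnoreCase);

    public override void Initialize(string name, NameValueCollection attributes)
    {
        base.Initialize(name, attributes);
        XmlDocument doc = new XmlDocument();
        doc.Load(HostingEnvironment.MapPath("~/Menu.sitemap"));
        foreach (XmlNode child in doc.DocumentElement.ChildNodes)
        {
            XmlElement element = child as XmlElement;
            if (element != null && element.LocalName == "siteMapNode")
            {
                _root = BuildNode(element, null); // single root, like web.sitemap
                break;
            }
        }
    }

    private SiteMapNode BuildNode(XmlElement element, SiteMapNode parent)
    {
        // The key is "node0", "node1", ... so two nodes may share a URL.
        SiteMapNode node = new SiteMapNode(this, "node" + (_counter++),
            element.GetAttribute("url"), element.GetAttribute("title"),
            element.GetAttribute("description"));

        _parents[node] = parent;
        _children[node] = new SiteMapNodeCollection();
        if (parent != null)
            _children[parent].Add(node);

        string url = node.Url;
        if (!string.IsNullOrEmpty(url))
        {
            if (url.StartsWith("~"))
                url = VirtualPathUtility.ToAbsolute(url);
            if (!_byUrl.ContainsKey(url))
                _byUrl[url] = node;
        }

        foreach (XmlNode child in element.ChildNodes)
        {
            XmlElement childElement = child as XmlElement;
            if (childElement != null && childElement.LocalName == "siteMapNode")
                BuildNode(childElement, node);
        }
        return node;
    }

    // Called (indirectly) by SiteMap.CurrentNode with the request URL.
    public override SiteMapNode FindSiteMapNode(string rawUrl)
    {
        if (string.IsNullOrEmpty(rawUrl))
            return null;
        int query = rawUrl.IndexOf('?');
        if (query >= 0)
            rawUrl = rawUrl.Substring(0, query); // ignore query strings
        if (rawUrl.StartsWith("~"))
            rawUrl = VirtualPathUtility.ToAbsolute(rawUrl);
        SiteMapNode node;
        _byUrl.TryGetValue(rawUrl, out node);
        return node;
    }

    public override SiteMapNodeCollection GetChildNodes(SiteMapNode node)
    {
        SiteMapNodeCollection children;
        return _children.TryGetValue(node, out children)
            ? children
            : new SiteMapNodeCollection();
    }

    public override SiteMapNode GetParentNode(SiteMapNode node)
    {
        SiteMapNode parent;
        _parents.TryGetValue(node, out parent);
        return parent;
    }

    protected override SiteMapNode GetRootNodeCore()
    {
        return _root;
    }
}
```

You'd then register it in web.config so the SiteMapPath control picks it up:

```xml
<!-- web.config: point the default provider at the custom class -->
<system.web>
  <siteMap defaultProvider="DuplicateUrlSiteMapProvider">
    <providers>
      <add name="DuplicateUrlSiteMapProvider"
           type="DuplicateUrlSiteMapProvider" />
    </providers>
  </siteMap>
</system.web>
```

The interesting design decision is how `FindSiteMapNode` disambiguates duplicates. This sketch just returns the first node registered for a URL, but you could inspect the query string, session state, or the referrer to pick the correct branch of the tree, which is exactly the kind of logic the default XML provider can't express.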