Archive for the ‘ Search ’ Category

Getting SharePoint Search To Work With Anonymous User Access

So, I recently started working on getting SharePoint running as an externally facing site. The more I work with it, the more workarounds I end up using.

Getting SharePoint search working was interesting. First of all, since the site is externally facing, our configuration requires claims-based authentication as well as Forms Based Authentication (FBA).

The first problem I ran into was getting it to work at all on our externally facing site. To do so, I needed to do a few things:

Continue reading

Crawling DotNetNuke with SharePoint Search

I recently encountered a problem that had us all stumped: we could not get SharePoint Search to crawl DotNetNuke successfully.

The problem came down to several things:

  1. SharePoint automatically removes the trailing ‘default.aspx’ from folder URLs when crawling, and you cannot turn this off
  2. We had DotNetNuke set to use ‘Friendly URLs’, which means it generates each page’s URL dynamically
  3. Because DotNetNuke generates the URL, there is no actual file path behind it, so SharePoint finds nothing once it strips off the ‘default.aspx’ (see the example below)
  4. Active Directory authentication in DotNetNuke was preventing the SharePoint crawler from being granted access to the pages it was trying to crawl.
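To make this concrete, here is roughly the shape of the URLs involved. The host name and tab ID are made up for illustration, and the exact friendly form depends on how the DotNetNuke friendly-URL provider is configured:

    Friendly URL served by DotNetNuke:   http://www.example.com/AboutUs/tabid/42/Default.aspx
    What the crawler requests instead:   http://www.example.com/AboutUs/tabid/42/
    ‘Unfriendly’ equivalent:             http://www.example.com/Default.aspx?tabid=42

The friendly form is generated on the fly with no physical folder behind it, so once the crawler drops ‘Default.aspx’ there is nothing at the resulting path. The query-string form always resolves, because Default.aspx really does exist at the site root.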
Unfortunately, I couldn’t do the obvious thing and turn off friendly URLs, because thousands of users had already bookmarked and saved pages, and it would have appeared as if many pages were down.
So, to solve this I did three things:
  1. Wrote a console application that generates a file containing the ‘unfriendly’ URLs from DotNetNuke (URLs that use the query string to specify the page) and then resaves the web.config file to force an application pool restart (a sketch of this generator follows the list). During this window, Active Directory authentication is also turned off on the DotNetNuke site for about 30 minutes so the SharePoint crawler has access to the pages.
  2. Created a crawl rule in SharePoint Search that includes ‘complex URLs’ (URLs containing a question mark) and pointed it directly at the generated file.
  3. Set a timed job on the server to generate a fresh list of ‘unfriendly’ URLs and crawl the DotNetNuke URL file while Active Directory authentication is turned off.
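For illustration, here is a minimal sketch of the generator from step 1. Ours was a .NET console application; this Python version just shows the idea. The connection string, file paths, and the assumption that every DotNetNuke page has a row in the standard Tabs table (keyed by TabID) are placeholders, not our exact code:

    import os
    from datetime import datetime

    import pyodbc  # assumes the SQL Server ODBC driver is installed

    # All of these values are hypothetical placeholders.
    CONNECTION_STRING = (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=dbserver;DATABASE=DotNetNuke;Trusted_Connection=yes;"
    )
    SITE_ROOT = "http://www.example.com"       # DNN site root
    URL_FILE = r"C:\crawl\dnn-urls.html"       # file the crawl rule points at
    WEB_CONFIG = r"C:\inetpub\dnn\web.config"  # the DNN site's web.config

    def generate_url_file():
        """Write one 'unfriendly' (query-string) URL per DotNetNuke page
        into a simple HTML page of links the crawler can follow."""
        conn = pyodbc.connect(CONNECTION_STRING)
        tab_ids = [row.TabID for row in
                   conn.execute("SELECT TabID FROM Tabs WHERE IsDeleted = 0")]
        conn.close()
        with open(URL_FILE, "w") as f:
            f.write("<html><body>\n")
            for tab_id in tab_ids:
                f.write(f'<a href="{SITE_ROOT}/Default.aspx?tabid={tab_id}">'
                        f'tab {tab_id}</a><br/>\n')
            f.write("</body></html>\n")

    def touch_web_config():
        """Update web.config's last-modified time so IIS recycles the
        application pool and picks up the authentication change."""
        now = datetime.now().timestamp()
        os.utime(WEB_CONFIG, (now, now))

    if __name__ == "__main__":
        generate_url_file()
        touch_web_config()

The timed job in step 3 is just this script on a schedule (Windows Task Scheduler works fine), and the crawl rule in step 2 needs the ‘crawl complex URLs’ option enabled so the query-string URLs are not skipped.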
So, all in all, we got this to work. As you can see, though, the solution wasn’t easy.