Crawling engine needs its own index?

Hello

I tried to dig through the documentation, but I couldn't find any information about the crawling engine in Episerver Find. What we need to know is this: if we, for example, have a site with normal Episerver content but also want to crawl, let's say, two more sites for content, can we then include all of that content in one index? Or is one crawled website = one index, so that we would then need to search across several indexes? I sort of assumed that we could crawl as many websites as we want and just bundle everything into one index.

Please enlighten me! :)

#197731
Oct 11, 2018 17:10

A single Episerver instance points to a single Find index (one index per CMS instance).

That Find index can have crawlers associated with it (multiple crawlers per index).

Multiple websites can be crawled, and the results will be indexed in the same index that the crawlers have been set up on.

NB: the items in the Find index that result from a crawl will contain the full HTML of the crawled pages. To my knowledge there is no HTML parsing.
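
For reference, once the crawlers are attached to the same index, CMS content and crawled pages can be returned by a single query. Below is a minimal sketch using the Find .NET client's UnifiedSearch, assuming the default SearchClient setup and that the crawled content is surfaced through unified search like other ISearchContent; the class name and query text are illustrative only, and exact namespaces may vary between Find versions.

```csharp
using EPiServer.Find;
using EPiServer.Find.Framework;
using EPiServer.Find.UnifiedSearch;

public class CrawledContentSearchExample
{
    public void Search(string queryText)
    {
        // One query against the single Find index: the result set can
        // include both indexed CMS pages and pages added by the crawlers.
        UnifiedSearchResults results = SearchClient.Instance
            .UnifiedSearchFor(queryText)
            .Take(20)
            .GetResult();

        foreach (UnifiedSearchHit hit in results)
        {
            // Title, Url and Excerpt are populated for crawled pages
            // as well as for CMS content.
            System.Console.WriteLine($"{hit.Title} - {hit.Url}");
        }
    }
}
```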

#197807
Oct 15, 2018 4:50

Thanks for your answer Marcus, just what I was looking for!

#197813
Oct 15, 2018 7:51