For details on using the Aspire Content Source Management page, see Admin UI.
To specify exactly which shared folder to crawl, create a new "Content Source".
In the General tab in the Content Source Configuration window, specify basic information for the content source:
After selecting a schedule type, specify its details, if applicable:
Info |
---|
You can add more schedules by clicking the Add New option, and you can rearrange the order of the schedules. |
Info |
---|
To disable the content source, clear the "Enable" checkbox. This is useful if the folder will be under maintenance and no crawls should run during that period. |
Note |
---|
Real Time and Cache Groups crawls are available depending on the connector. |
In the Connector tab, specify the connection information to crawl the File System.
For example, on Windows: D:\folder\folder1\
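The folder you point the connector at must exist and be readable by the server running the crawl. A quick way to sanity-check a candidate path before configuring the content source (a minimal sketch, independent of Aspire; the path in the comment is a placeholder):

```python
import os

def check_crawl_root(path):
    """Return the number of readable files under the crawl root, or raise."""
    if not os.path.isdir(path):
        raise FileNotFoundError(f"Crawl root not found or not a directory: {path}")
    readable = 0
    for dirpath, _dirnames, filenames in os.walk(path):
        for name in filenames:
            if os.access(os.path.join(dirpath, name), os.R_OK):
                readable += 1
    return readable

# Example (placeholder path):
# print(check_crawl_root(r"D:\folder\folder1"))
```

If this raises or returns 0 for a folder you know has content, fix the share permissions before starting the crawl.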
In the Workflow tab, specify the workflow steps for the jobs that come out of the crawl. Drag and drop rules to determine which steps an item should follow after being crawled.
These rules could be where to publish the document or transformations needed on the data before sending it to a search engine. See Workflow for more information.
For this tutorial, drag and drop the Publish To File rule found under the Publishers tab to the onPublish Workflow tree.
Now that the content source is set up, the crawl can be initiated.
If there are errors, you will get a clickable "Error" flag that will take you to a detailed error message page.
If you only want to process content updates from the File System (documents that are added, modified, or removed), click the "Incremental" button instead of the "Full" button.
If this is the first time that the connector has crawled, the action of the "Incremental" button depends on the exact method of change discovery.
It may perform the same action as a "Full" crawl (crawling everything), or it may not crawl anything. Thereafter, the Incremental button will only crawl updates.
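One common change-discovery method is snapshot comparison: the crawler records each file's path and modification time on one crawl, then diffs the next directory listing against that snapshot to classify additions, updates, and deletions. The sketch below illustrates the idea only; it is not Aspire's actual implementation:

```python
def diff_snapshots(previous, current):
    """Classify changes between two {path: mtime} snapshots.

    Returns (added, updated, removed) lists of paths.
    """
    added = [p for p in current if p not in previous]
    removed = [p for p in previous if p not in current]
    updated = [p for p in current
               if p in previous and current[p] != previous[p]]
    return added, updated, removed

# With an empty previous snapshot (i.e. a first crawl), every file is
# classified as "added" -- which is why a first incremental crawl can
# behave like a full crawl.
```

After the first snapshot exists, only the changed paths need to be re-processed, which is what makes incremental crawls cheaper than full ones.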
Info |
---|
Statistics are reset for every crawl. |