The crawler and source library are fully accessible through our API (web service) and require no local installation. Managing sources through the user interface requires no technical expertise.
CyberWatcher has developed several crawlers for retrieving information, each designed to meet specific requirements. Through an online dashboard, the editor can adjust retrieval according to the type of source, the website's link structure, and the content elements on each website. The patented content-element crawler filters out irrelevant content and spam.
When websites are added to the source library, the source editors tag each source with metadata to keep the library well structured. This ensures high-quality input.
On the user side, this means that the user can select sources based on any of the included metadata parameters, providing an initial filtering of the content.
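As a minimal sketch of this metadata-based selection, the snippet below filters a small source library by metadata parameters. The record fields ("country", "media_type") and the helper name are illustrative assumptions, not CyberWatcher's actual schema or API:

```python
# Hypothetical sketch: selecting sources from a metadata-tagged library.
# Field names and values are assumptions for illustration only.

def select_sources(sources, **criteria):
    """Return the sources whose metadata matches every given criterion."""
    return [s for s in sources
            if all(s.get(key) == value for key, value in criteria.items())]

library = [
    {"url": "https://example-news.no", "country": "NO", "media_type": "news"},
    {"url": "https://example-blog.se", "country": "SE", "media_type": "blog"},
    {"url": "https://example-press.no", "country": "NO", "media_type": "press"},
]

# Initial filtering: only Norwegian news sources.
norwegian_news = select_sources(library, country="NO", media_type="news")
```

In a hosted setup, the same kind of filtering would typically be expressed as query parameters on a web-service call rather than run locally; the logic is the same.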
CyberWatcher also offers single-source monitoring, allowing the user to receive content from one specific website, or even a specific section of it.