Evaluation of Web Crawler
Evaluate the proposed web crawler design based on the non-functional requirements.
Fulfilling requirements
Let’s evaluate how our design meets the non-functional requirements of the proposed system.
Scalability
Scaling horizontally is vital for our system. Therefore, the proposed design incorporates the following choices to meet the scalability requirements:
- The system scales to handle the ever-increasing number of URLs: the required resources, including schedulers, web crawler workers, HTML fetchers, extractors, and blob stores, are added or removed on demand.
- In the case of a distributed URL frontier, the system uses consistent hashing to distribute hostnames among the crawling workers, each of which runs on its own server. Because only the hostnames adjacent to a changed position on the hash ring are remapped, adding or removing a crawler server doesn't require redistributing the entire hostname space.
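The consistent-hashing scheme described above can be sketched as follows. This is a minimal illustration, not the course's implementation: the class name, the use of MD5, and the number of virtual nodes per worker are all assumptions made for the example.

```python
import bisect
import hashlib

class ConsistentHashRing:
    """Illustrative sketch: maps hostnames to crawler workers on a hash ring."""

    def __init__(self, replicas=100):
        self.replicas = replicas  # virtual nodes per worker, to spread load evenly
        self.ring = []            # sorted hash positions on the ring
        self.nodes = {}           # hash position -> worker name

    def _hash(self, key):
        # Any stable hash works; MD5 is used here only for illustration.
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def add_worker(self, worker):
        for i in range(self.replicas):
            h = self._hash(f"{worker}#{i}")
            bisect.insort(self.ring, h)
            self.nodes[h] = worker

    def remove_worker(self, worker):
        for i in range(self.replicas):
            h = self._hash(f"{worker}#{i}")
            self.ring.remove(h)
            del self.nodes[h]

    def worker_for(self, hostname):
        # The worker at the first ring position clockwise from the hostname's hash.
        h = self._hash(hostname)
        idx = bisect.bisect(self.ring, h) % len(self.ring)
        return self.nodes[self.ring[idx]]
```

Removing a worker only reassigns the hostnames that were mapped to it; every other hostname keeps its current worker, which is why crawler servers can join or leave the pool cheaply.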
Extensibility and modularity
So far, our design has focused on only one type of communication protocol: HTTP. But as per ...